Oct 14 02:39:45 localhost kernel: Linux version 5.14.0-284.11.1.el9_2.x86_64 (mockbuild@x86-vm-09.build.eng.bos.redhat.com) (gcc (GCC) 11.3.1 20221121 (Red Hat 11.3.1-4), GNU ld version 2.35.2-37.el9) #1 SMP PREEMPT_DYNAMIC Wed Apr 12 10:45:03 EDT 2023
Oct 14 02:39:45 localhost kernel: The list of certified hardware and cloud instances for Red Hat Enterprise Linux 9 can be viewed at the Red Hat Ecosystem Catalog, https://catalog.redhat.com.
Oct 14 02:39:45 localhost kernel: Command line: BOOT_IMAGE=(hd0,gpt3)/vmlinuz-5.14.0-284.11.1.el9_2.x86_64 root=UUID=a3dd82de-ffc6-4652-88b9-80e003b8f20a console=tty0 console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-4G:192M,4G-64G:256M,64G-:512M
Oct 14 02:39:45 localhost kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Oct 14 02:39:45 localhost kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Oct 14 02:39:45 localhost kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Oct 14 02:39:45 localhost kernel: x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256
Oct 14 02:39:45 localhost kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'standard' format.
Oct 14 02:39:45 localhost kernel: signal: max sigframe size: 1776
Oct 14 02:39:45 localhost kernel: BIOS-provided physical RAM map:
Oct 14 02:39:45 localhost kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable
Oct 14 02:39:45 localhost kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved
Oct 14 02:39:45 localhost kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved
Oct 14 02:39:45 localhost kernel: BIOS-e820: [mem 0x0000000000100000-0x00000000bffdafff] usable
Oct 14 02:39:45 localhost kernel: BIOS-e820: [mem 0x00000000bffdb000-0x00000000bfffffff] reserved
Oct 14 02:39:45 localhost kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Oct 14 02:39:45 localhost kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved
Oct 14 02:39:45 localhost kernel: BIOS-e820: [mem 0x0000000100000000-0x000000043fffffff] usable
Oct 14 02:39:45 localhost kernel: NX (Execute Disable) protection: active
Oct 14 02:39:45 localhost kernel: SMBIOS 2.8 present.
Oct 14 02:39:45 localhost kernel: DMI: OpenStack Foundation OpenStack Nova, BIOS 1.15.0-1 04/01/2014
Oct 14 02:39:45 localhost kernel: Hypervisor detected: KVM
Oct 14 02:39:45 localhost kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Oct 14 02:39:45 localhost kernel: kvm-clock: using sched offset of 2834669562 cycles
Oct 14 02:39:45 localhost kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Oct 14 02:39:45 localhost kernel: tsc: Detected 2799.998 MHz processor
Oct 14 02:39:45 localhost kernel: last_pfn = 0x440000 max_arch_pfn = 0x400000000
Oct 14 02:39:45 localhost kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT
Oct 14 02:39:45 localhost kernel: last_pfn = 0xbffdb max_arch_pfn = 0x400000000
Oct 14 02:39:45 localhost kernel: found SMP MP-table at [mem 0x000f5ae0-0x000f5aef]
Oct 14 02:39:45 localhost kernel: Using GB pages for direct mapping
Oct 14 02:39:45 localhost kernel: RAMDISK: [mem 0x2eef4000-0x33771fff]
Oct 14 02:39:45 localhost kernel: ACPI: Early table checksum verification disabled
Oct 14 02:39:45 localhost kernel: ACPI: RSDP 0x00000000000F5AA0 000014 (v00 BOCHS )
Oct 14 02:39:45 localhost kernel: ACPI: RSDT 0x00000000BFFE16BD 000030 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Oct 14 02:39:45 localhost kernel: ACPI: FACP 0x00000000BFFE1571 000074 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Oct 14 02:39:45 localhost kernel: ACPI: DSDT 0x00000000BFFDFC80 0018F1 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Oct 14 02:39:45 localhost kernel: ACPI: FACS 0x00000000BFFDFC40 000040
Oct 14 02:39:45 localhost kernel: ACPI: APIC 0x00000000BFFE15E5 0000B0 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Oct 14 02:39:45 localhost kernel: ACPI: WAET 0x00000000BFFE1695 000028 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Oct 14 02:39:45 localhost kernel: ACPI: Reserving FACP table memory at [mem 0xbffe1571-0xbffe15e4]
Oct 14 02:39:45 localhost kernel: ACPI: Reserving DSDT table memory at [mem 0xbffdfc80-0xbffe1570]
Oct 14 02:39:45 localhost kernel: ACPI: Reserving FACS table memory at [mem 0xbffdfc40-0xbffdfc7f]
Oct 14 02:39:45 localhost kernel: ACPI: Reserving APIC table memory at [mem 0xbffe15e5-0xbffe1694]
Oct 14 02:39:45 localhost kernel: ACPI: Reserving WAET table memory at [mem 0xbffe1695-0xbffe16bc]
Oct 14 02:39:45 localhost kernel: No NUMA configuration found
Oct 14 02:39:45 localhost kernel: Faking a node at [mem 0x0000000000000000-0x000000043fffffff]
Oct 14 02:39:45 localhost kernel: NODE_DATA(0) allocated [mem 0x43ffd5000-0x43fffffff]
Oct 14 02:39:45 localhost kernel: Reserving 256MB of memory at 2800MB for crashkernel (System RAM: 16383MB)
Oct 14 02:39:45 localhost kernel: Zone ranges:
Oct 14 02:39:45 localhost kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff]
Oct 14 02:39:45 localhost kernel: DMA32 [mem 0x0000000001000000-0x00000000ffffffff]
Oct 14 02:39:45 localhost kernel: Normal [mem 0x0000000100000000-0x000000043fffffff]
Oct 14 02:39:45 localhost kernel: Device empty
Oct 14 02:39:45 localhost kernel: Movable zone start for each node
Oct 14 02:39:45 localhost kernel: Early memory node ranges
Oct 14 02:39:45 localhost kernel: node 0: [mem 0x0000000000001000-0x000000000009efff]
Oct 14 02:39:45 localhost kernel: node 0: [mem 0x0000000000100000-0x00000000bffdafff]
Oct 14 02:39:45 localhost kernel: node 0: [mem 0x0000000100000000-0x000000043fffffff]
Oct 14 02:39:45 localhost kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000043fffffff]
Oct 14 02:39:45 localhost kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Oct 14 02:39:45 localhost kernel: On node 0, zone DMA: 97 pages in unavailable ranges
Oct 14 02:39:45 localhost kernel: On node 0, zone Normal: 37 pages in unavailable ranges
Oct 14 02:39:45 localhost kernel: ACPI: PM-Timer IO Port: 0x608
Oct 14 02:39:45 localhost kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Oct 14 02:39:45 localhost kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Oct 14 02:39:45 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Oct 14 02:39:45 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Oct 14 02:39:45 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Oct 14 02:39:45 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Oct 14 02:39:45 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Oct 14 02:39:45 localhost kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Oct 14 02:39:45 localhost kernel: TSC deadline timer available
Oct 14 02:39:45 localhost kernel: smpboot: Allowing 8 CPUs, 0 hotplug CPUs
Oct 14 02:39:45 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x00000000-0x00000fff]
Oct 14 02:39:45 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x0009f000-0x0009ffff]
Oct 14 02:39:45 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x000a0000-0x000effff]
Oct 14 02:39:45 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x000f0000-0x000fffff]
Oct 14 02:39:45 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xbffdb000-0xbfffffff]
Oct 14 02:39:45 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xc0000000-0xfeffbfff]
Oct 14 02:39:45 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xfeffc000-0xfeffffff]
Oct 14 02:39:45 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xff000000-0xfffbffff]
Oct 14 02:39:45 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xfffc0000-0xffffffff]
Oct 14 02:39:45 localhost kernel: [mem 0xc0000000-0xfeffbfff] available for PCI devices
Oct 14 02:39:45 localhost kernel: Booting paravirtualized kernel on KVM
Oct 14 02:39:45 localhost kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Oct 14 02:39:45 localhost kernel: setup_percpu: NR_CPUS:8192 nr_cpumask_bits:8 nr_cpu_ids:8 nr_node_ids:1
Oct 14 02:39:45 localhost kernel: percpu: Embedded 55 pages/cpu s188416 r8192 d28672 u262144
Oct 14 02:39:45 localhost kernel: kvm-guest: PV spinlocks disabled, no host support
Oct 14 02:39:45 localhost kernel: Fallback order for Node 0: 0
Oct 14 02:39:45 localhost kernel: Built 1 zonelists, mobility grouping on. Total pages: 4128475
Oct 14 02:39:45 localhost kernel: Policy zone: Normal
Oct 14 02:39:45 localhost kernel: Kernel command line: BOOT_IMAGE=(hd0,gpt3)/vmlinuz-5.14.0-284.11.1.el9_2.x86_64 root=UUID=a3dd82de-ffc6-4652-88b9-80e003b8f20a console=tty0 console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-4G:192M,4G-64G:256M,64G-:512M
Oct 14 02:39:45 localhost kernel: Unknown kernel command line parameters "BOOT_IMAGE=(hd0,gpt3)/vmlinuz-5.14.0-284.11.1.el9_2.x86_64", will be passed to user space.
Oct 14 02:39:45 localhost kernel: Dentry cache hash table entries: 2097152 (order: 12, 16777216 bytes, linear)
Oct 14 02:39:45 localhost kernel: Inode-cache hash table entries: 1048576 (order: 11, 8388608 bytes, linear)
Oct 14 02:39:45 localhost kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Oct 14 02:39:45 localhost kernel: software IO TLB: area num 8.
Oct 14 02:39:45 localhost kernel: Memory: 2873456K/16776676K available (14342K kernel code, 5536K rwdata, 10180K rodata, 2792K init, 7524K bss, 741260K reserved, 0K cma-reserved)
Oct 14 02:39:45 localhost kernel: random: get_random_u64 called from kmem_cache_open+0x1e/0x210 with crng_init=0
Oct 14 02:39:45 localhost kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=8, Nodes=1
Oct 14 02:39:45 localhost kernel: ftrace: allocating 44803 entries in 176 pages
Oct 14 02:39:45 localhost kernel: ftrace: allocated 176 pages with 3 groups
Oct 14 02:39:45 localhost kernel: Dynamic Preempt: voluntary
Oct 14 02:39:45 localhost kernel: rcu: Preemptible hierarchical RCU implementation.
Oct 14 02:39:45 localhost kernel: rcu: #011RCU restricting CPUs from NR_CPUS=8192 to nr_cpu_ids=8.
Oct 14 02:39:45 localhost kernel: #011Trampoline variant of Tasks RCU enabled.
Oct 14 02:39:45 localhost kernel: #011Rude variant of Tasks RCU enabled.
Oct 14 02:39:45 localhost kernel: #011Tracing variant of Tasks RCU enabled.
Oct 14 02:39:45 localhost kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Oct 14 02:39:45 localhost kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=8
Oct 14 02:39:45 localhost kernel: NR_IRQS: 524544, nr_irqs: 488, preallocated irqs: 16
Oct 14 02:39:45 localhost kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Oct 14 02:39:45 localhost kernel: kfence: initialized - using 2097152 bytes for 255 objects at 0x(____ptrval____)-0x(____ptrval____)
Oct 14 02:39:45 localhost kernel: random: crng init done (trusting CPU's manufacturer)
Oct 14 02:39:45 localhost kernel: Console: colour VGA+ 80x25
Oct 14 02:39:45 localhost kernel: printk: console [tty0] enabled
Oct 14 02:39:45 localhost kernel: printk: console [ttyS0] enabled
Oct 14 02:39:45 localhost kernel: ACPI: Core revision 20211217
Oct 14 02:39:45 localhost kernel: APIC: Switch to symmetric I/O mode setup
Oct 14 02:39:45 localhost kernel: x2apic enabled
Oct 14 02:39:45 localhost kernel: Switched APIC routing to physical x2apic.
Oct 14 02:39:45 localhost kernel: tsc: Marking TSC unstable due to TSCs unsynchronized
Oct 14 02:39:45 localhost kernel: Calibrating delay loop (skipped) preset value.. 5599.99 BogoMIPS (lpj=2799998)
Oct 14 02:39:45 localhost kernel: pid_max: default: 32768 minimum: 301
Oct 14 02:39:45 localhost kernel: LSM: Security Framework initializing
Oct 14 02:39:45 localhost kernel: Yama: becoming mindful.
Oct 14 02:39:45 localhost kernel: SELinux: Initializing.
Oct 14 02:39:45 localhost kernel: LSM support for eBPF active
Oct 14 02:39:45 localhost kernel: Mount-cache hash table entries: 32768 (order: 6, 262144 bytes, linear)
Oct 14 02:39:45 localhost kernel: Mountpoint-cache hash table entries: 32768 (order: 6, 262144 bytes, linear)
Oct 14 02:39:45 localhost kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated
Oct 14 02:39:45 localhost kernel: Last level iTLB entries: 4KB 512, 2MB 255, 4MB 127
Oct 14 02:39:45 localhost kernel: Last level dTLB entries: 4KB 512, 2MB 255, 4MB 127, 1GB 0
Oct 14 02:39:45 localhost kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Oct 14 02:39:45 localhost kernel: Spectre V2 : Mitigation: Retpolines
Oct 14 02:39:45 localhost kernel: Spectre V2 : Spectre v2 / SpectreRSB mitigation: Filling RSB on context switch
Oct 14 02:39:45 localhost kernel: Spectre V2 : Spectre v2 / SpectreRSB : Filling RSB on VMEXIT
Oct 14 02:39:45 localhost kernel: Spectre V2 : Enabling Speculation Barrier for firmware calls
Oct 14 02:39:45 localhost kernel: RETBleed: Mitigation: untrained return thunk
Oct 14 02:39:45 localhost kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Oct 14 02:39:45 localhost kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Oct 14 02:39:45 localhost kernel: Freeing SMP alternatives memory: 36K
Oct 14 02:39:45 localhost kernel: smpboot: CPU0: AMD EPYC-Rome Processor (family: 0x17, model: 0x31, stepping: 0x0)
Oct 14 02:39:45 localhost kernel: cblist_init_generic: Setting adjustable number of callback queues.
Oct 14 02:39:45 localhost kernel: cblist_init_generic: Setting shift to 3 and lim to 1.
Oct 14 02:39:45 localhost kernel: cblist_init_generic: Setting shift to 3 and lim to 1.
Oct 14 02:39:45 localhost kernel: cblist_init_generic: Setting shift to 3 and lim to 1.
Oct 14 02:39:45 localhost kernel: Performance Events: Fam17h+ core perfctr, AMD PMU driver.
Oct 14 02:39:45 localhost kernel: ... version: 0
Oct 14 02:39:45 localhost kernel: ... bit width: 48
Oct 14 02:39:45 localhost kernel: ... generic registers: 6
Oct 14 02:39:45 localhost kernel: ... value mask: 0000ffffffffffff
Oct 14 02:39:45 localhost kernel: ... max period: 00007fffffffffff
Oct 14 02:39:45 localhost kernel: ... fixed-purpose events: 0
Oct 14 02:39:45 localhost kernel: ... event mask: 000000000000003f
Oct 14 02:39:45 localhost kernel: rcu: Hierarchical SRCU implementation.
Oct 14 02:39:45 localhost kernel: rcu: #011Max phase no-delay instances is 400.
Oct 14 02:39:45 localhost kernel: smp: Bringing up secondary CPUs ...
Oct 14 02:39:45 localhost kernel: x86: Booting SMP configuration:
Oct 14 02:39:45 localhost kernel: .... node #0, CPUs: #1 #2 #3 #4 #5 #6 #7
Oct 14 02:39:45 localhost kernel: smp: Brought up 1 node, 8 CPUs
Oct 14 02:39:45 localhost kernel: smpboot: Max logical packages: 8
Oct 14 02:39:45 localhost kernel: smpboot: Total of 8 processors activated (44799.96 BogoMIPS)
Oct 14 02:39:45 localhost kernel: node 0 deferred pages initialised in 20ms
Oct 14 02:39:45 localhost kernel: devtmpfs: initialized
Oct 14 02:39:45 localhost kernel: x86/mm: Memory block size: 128MB
Oct 14 02:39:45 localhost kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Oct 14 02:39:45 localhost kernel: futex hash table entries: 2048 (order: 5, 131072 bytes, linear)
Oct 14 02:39:45 localhost kernel: pinctrl core: initialized pinctrl subsystem
Oct 14 02:39:45 localhost kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Oct 14 02:39:45 localhost kernel: DMA: preallocated 2048 KiB GFP_KERNEL pool for atomic allocations
Oct 14 02:39:45 localhost kernel: DMA: preallocated 2048 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Oct 14 02:39:45 localhost kernel: DMA: preallocated 2048 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Oct 14 02:39:45 localhost kernel: audit: initializing netlink subsys (disabled)
Oct 14 02:39:45 localhost kernel: audit: type=2000 audit(1760423984.136:1): state=initialized audit_enabled=0 res=1
Oct 14 02:39:45 localhost kernel: thermal_sys: Registered thermal governor 'fair_share'
Oct 14 02:39:45 localhost kernel: thermal_sys: Registered thermal governor 'step_wise'
Oct 14 02:39:45 localhost kernel: thermal_sys: Registered thermal governor 'user_space'
Oct 14 02:39:45 localhost kernel: cpuidle: using governor menu
Oct 14 02:39:45 localhost kernel: HugeTLB: can optimize 4095 vmemmap pages for hugepages-1048576kB
Oct 14 02:39:45 localhost kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Oct 14 02:39:45 localhost kernel: PCI: Using configuration type 1 for base access
Oct 14 02:39:45 localhost kernel: PCI: Using configuration type 1 for extended access
Oct 14 02:39:45 localhost kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Oct 14 02:39:45 localhost kernel: HugeTLB: can optimize 7 vmemmap pages for hugepages-2048kB
Oct 14 02:39:45 localhost kernel: HugeTLB registered 1.00 GiB page size, pre-allocated 0 pages
Oct 14 02:39:45 localhost kernel: HugeTLB registered 2.00 MiB page size, pre-allocated 0 pages
Oct 14 02:39:45 localhost kernel: cryptd: max_cpu_qlen set to 1000
Oct 14 02:39:45 localhost kernel: ACPI: Added _OSI(Module Device)
Oct 14 02:39:45 localhost kernel: ACPI: Added _OSI(Processor Device)
Oct 14 02:39:45 localhost kernel: ACPI: Added _OSI(3.0 _SCP Extensions)
Oct 14 02:39:45 localhost kernel: ACPI: Added _OSI(Processor Aggregator Device)
Oct 14 02:39:45 localhost kernel: ACPI: Added _OSI(Linux-Dell-Video)
Oct 14 02:39:45 localhost kernel: ACPI: Added _OSI(Linux-Lenovo-NV-HDMI-Audio)
Oct 14 02:39:45 localhost kernel: ACPI: Added _OSI(Linux-HPI-Hybrid-Graphics)
Oct 14 02:39:45 localhost kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Oct 14 02:39:45 localhost kernel: ACPI: Interpreter enabled
Oct 14 02:39:45 localhost kernel: ACPI: PM: (supports S0 S3 S4 S5)
Oct 14 02:39:45 localhost kernel: ACPI: Using IOAPIC for interrupt routing
Oct 14 02:39:45 localhost kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Oct 14 02:39:45 localhost kernel: PCI: Using E820 reservations for host bridge windows
Oct 14 02:39:45 localhost kernel: ACPI: Enabled 2 GPEs in block 00 to 0F
Oct 14 02:39:45 localhost kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Oct 14 02:39:45 localhost kernel: acpi PNP0A03:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI EDR HPX-Type3]
Oct 14 02:39:45 localhost kernel: acpiphp: Slot [3] registered
Oct 14 02:39:45 localhost kernel: acpiphp: Slot [4] registered
Oct 14 02:39:45 localhost kernel: acpiphp: Slot [5] registered
Oct 14 02:39:45 localhost kernel: acpiphp: Slot [6] registered
Oct 14 02:39:45 localhost kernel: acpiphp: Slot [7] registered
Oct 14 02:39:45 localhost kernel: acpiphp: Slot [8] registered
Oct 14 02:39:45 localhost kernel: acpiphp: Slot [9] registered
Oct 14 02:39:45 localhost kernel: acpiphp: Slot [10] registered
Oct 14 02:39:45 localhost kernel: acpiphp: Slot [11] registered
Oct 14 02:39:45 localhost kernel: acpiphp: Slot [12] registered
Oct 14 02:39:45 localhost kernel: acpiphp: Slot [13] registered
Oct 14 02:39:45 localhost kernel: acpiphp: Slot [14] registered
Oct 14 02:39:45 localhost kernel: acpiphp: Slot [15] registered
Oct 14 02:39:45 localhost kernel: acpiphp: Slot [16] registered
Oct 14 02:39:45 localhost kernel: acpiphp: Slot [17] registered
Oct 14 02:39:45 localhost kernel: acpiphp: Slot [18] registered
Oct 14 02:39:45 localhost kernel: acpiphp: Slot [19] registered
Oct 14 02:39:45 localhost kernel: acpiphp: Slot [20] registered
Oct 14 02:39:45 localhost kernel: acpiphp: Slot [21] registered
Oct 14 02:39:45 localhost kernel: acpiphp: Slot [22] registered
Oct 14 02:39:45 localhost kernel: acpiphp: Slot [23] registered
Oct 14 02:39:45 localhost kernel: acpiphp: Slot [24] registered
Oct 14 02:39:45 localhost kernel: acpiphp: Slot [25] registered
Oct 14 02:39:45 localhost kernel: acpiphp: Slot [26] registered
Oct 14 02:39:45 localhost kernel: acpiphp: Slot [27] registered
Oct 14 02:39:45 localhost kernel: acpiphp: Slot [28] registered
Oct 14 02:39:45 localhost kernel: acpiphp: Slot [29] registered
Oct 14 02:39:45 localhost kernel: acpiphp: Slot [30] registered
Oct 14 02:39:45 localhost kernel: acpiphp: Slot [31] registered
Oct 14 02:39:45 localhost kernel: PCI host bridge to bus 0000:00
Oct 14 02:39:45 localhost kernel: pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window]
Oct 14 02:39:45 localhost kernel: pci_bus 0000:00: root bus resource [io 0x0d00-0xffff window]
Oct 14 02:39:45 localhost kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Oct 14 02:39:45 localhost kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window]
Oct 14 02:39:45 localhost kernel: pci_bus 0000:00: root bus resource [mem 0x440000000-0x4bfffffff window]
Oct 14 02:39:45 localhost kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Oct 14 02:39:45 localhost kernel: pci 0000:00:00.0: [8086:1237] type 00 class 0x060000
Oct 14 02:39:45 localhost kernel: pci 0000:00:01.0: [8086:7000] type 00 class 0x060100
Oct 14 02:39:45 localhost kernel: pci 0000:00:01.1: [8086:7010] type 00 class 0x010180
Oct 14 02:39:45 localhost kernel: pci 0000:00:01.1: reg 0x20: [io 0xc140-0xc14f]
Oct 14 02:39:45 localhost kernel: pci 0000:00:01.1: legacy IDE quirk: reg 0x10: [io 0x01f0-0x01f7]
Oct 14 02:39:45 localhost kernel: pci 0000:00:01.1: legacy IDE quirk: reg 0x14: [io 0x03f6]
Oct 14 02:39:45 localhost kernel: pci 0000:00:01.1: legacy IDE quirk: reg 0x18: [io 0x0170-0x0177]
Oct 14 02:39:45 localhost kernel: pci 0000:00:01.1: legacy IDE quirk: reg 0x1c: [io 0x0376]
Oct 14 02:39:45 localhost kernel: pci 0000:00:01.2: [8086:7020] type 00 class 0x0c0300
Oct 14 02:39:45 localhost kernel: pci 0000:00:01.2: reg 0x20: [io 0xc100-0xc11f]
Oct 14 02:39:45 localhost kernel: pci 0000:00:01.3: [8086:7113] type 00 class 0x068000
Oct 14 02:39:45 localhost kernel: pci 0000:00:01.3: quirk: [io 0x0600-0x063f] claimed by PIIX4 ACPI
Oct 14 02:39:45 localhost kernel: pci 0000:00:01.3: quirk: [io 0x0700-0x070f] claimed by PIIX4 SMB
Oct 14 02:39:45 localhost kernel: pci 0000:00:02.0: [1af4:1050] type 00 class 0x030000
Oct 14 02:39:45 localhost kernel: pci 0000:00:02.0: reg 0x10: [mem 0xfe000000-0xfe7fffff pref]
Oct 14 02:39:45 localhost kernel: pci 0000:00:02.0: reg 0x18: [mem 0xfe800000-0xfe803fff 64bit pref]
Oct 14 02:39:45 localhost kernel: pci 0000:00:02.0: reg 0x20: [mem 0xfeb90000-0xfeb90fff]
Oct 14 02:39:45 localhost kernel: pci 0000:00:02.0: reg 0x30: [mem 0xfeb80000-0xfeb8ffff pref]
Oct 14 02:39:45 localhost kernel: pci 0000:00:02.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Oct 14 02:39:45 localhost kernel: pci 0000:00:03.0: [1af4:1000] type 00 class 0x020000
Oct 14 02:39:45 localhost kernel: pci 0000:00:03.0: reg 0x10: [io 0xc080-0xc0bf]
Oct 14 02:39:45 localhost kernel: pci 0000:00:03.0: reg 0x14: [mem 0xfeb91000-0xfeb91fff]
Oct 14 02:39:45 localhost kernel: pci 0000:00:03.0: reg 0x20: [mem 0xfe804000-0xfe807fff 64bit pref]
Oct 14 02:39:45 localhost kernel: pci 0000:00:03.0: reg 0x30: [mem 0xfeb00000-0xfeb7ffff pref]
Oct 14 02:39:45 localhost kernel: pci 0000:00:04.0: [1af4:1001] type 00 class 0x010000
Oct 14 02:39:45 localhost kernel: pci 0000:00:04.0: reg 0x10: [io 0xc000-0xc07f]
Oct 14 02:39:45 localhost kernel: pci 0000:00:04.0: reg 0x14: [mem 0xfeb92000-0xfeb92fff]
Oct 14 02:39:45 localhost kernel: pci 0000:00:04.0: reg 0x20: [mem 0xfe808000-0xfe80bfff 64bit pref]
Oct 14 02:39:45 localhost kernel: pci 0000:00:05.0: [1af4:1002] type 00 class 0x00ff00
Oct 14 02:39:45 localhost kernel: pci 0000:00:05.0: reg 0x10: [io 0xc0c0-0xc0ff]
Oct 14 02:39:45 localhost kernel: pci 0000:00:05.0: reg 0x20: [mem 0xfe80c000-0xfe80ffff 64bit pref]
Oct 14 02:39:45 localhost kernel: pci 0000:00:06.0: [1af4:1005] type 00 class 0x00ff00
Oct 14 02:39:45 localhost kernel: pci 0000:00:06.0: reg 0x10: [io 0xc120-0xc13f]
Oct 14 02:39:45 localhost kernel: pci 0000:00:06.0: reg 0x20: [mem 0xfe810000-0xfe813fff 64bit pref]
Oct 14 02:39:45 localhost kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
Oct 14 02:39:45 localhost kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
Oct 14 02:39:45 localhost kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Oct 14 02:39:45 localhost kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
Oct 14 02:39:45 localhost kernel: ACPI: PCI: Interrupt link LNKS configured for IRQ 9
Oct 14 02:39:45 localhost kernel: iommu: Default domain type: Translated
Oct 14 02:39:45 localhost kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Oct 14 02:39:45 localhost kernel: SCSI subsystem initialized
Oct 14 02:39:45 localhost kernel: ACPI: bus type USB registered
Oct 14 02:39:45 localhost kernel: usbcore: registered new interface driver usbfs
Oct 14 02:39:45 localhost kernel: usbcore: registered new interface driver hub
Oct 14 02:39:45 localhost kernel: usbcore: registered new device driver usb
Oct 14 02:39:45 localhost kernel: pps_core: LinuxPPS API ver. 1 registered
Oct 14 02:39:45 localhost kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti
Oct 14 02:39:45 localhost kernel: PTP clock support registered
Oct 14 02:39:45 localhost kernel: EDAC MC: Ver: 3.0.0
Oct 14 02:39:45 localhost kernel: NetLabel: Initializing
Oct 14 02:39:45 localhost kernel: NetLabel: domain hash size = 128
Oct 14 02:39:45 localhost kernel: NetLabel: protocols = UNLABELED CIPSOv4 CALIPSO
Oct 14 02:39:45 localhost kernel: NetLabel: unlabeled traffic allowed by default
Oct 14 02:39:45 localhost kernel: PCI: Using ACPI for IRQ routing
Oct 14 02:39:45 localhost kernel: pci 0000:00:02.0: vgaarb: setting as boot VGA device
Oct 14 02:39:45 localhost kernel: pci 0000:00:02.0: vgaarb: bridge control possible
Oct 14 02:39:45 localhost kernel: pci 0000:00:02.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Oct 14 02:39:45 localhost kernel: vgaarb: loaded
Oct 14 02:39:45 localhost kernel: clocksource: Switched to clocksource kvm-clock
Oct 14 02:39:45 localhost kernel: VFS: Disk quotas dquot_6.6.0
Oct 14 02:39:45 localhost kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Oct 14 02:39:45 localhost kernel: pnp: PnP ACPI init
Oct 14 02:39:45 localhost kernel: pnp: PnP ACPI: found 5 devices
Oct 14 02:39:45 localhost kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Oct 14 02:39:45 localhost kernel: NET: Registered PF_INET protocol family
Oct 14 02:39:45 localhost kernel: IP idents hash table entries: 262144 (order: 9, 2097152 bytes, linear)
Oct 14 02:39:45 localhost kernel: tcp_listen_portaddr_hash hash table entries: 8192 (order: 5, 131072 bytes, linear)
Oct 14 02:39:45 localhost kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Oct 14 02:39:45 localhost kernel: TCP established hash table entries: 131072 (order: 8, 1048576 bytes, linear)
Oct 14 02:39:45 localhost kernel: TCP bind hash table entries: 65536 (order: 8, 1048576 bytes, linear)
Oct 14 02:39:45 localhost kernel: TCP: Hash tables configured (established 131072 bind 65536)
Oct 14 02:39:45 localhost kernel: MPTCP token hash table entries: 16384 (order: 6, 393216 bytes, linear)
Oct 14 02:39:45 localhost kernel: UDP hash table entries: 8192 (order: 6, 262144 bytes, linear)
Oct 14 02:39:45 localhost kernel: UDP-Lite hash table entries: 8192 (order: 6, 262144 bytes, linear)
Oct 14 02:39:45 localhost kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Oct 14 02:39:45 localhost kernel: NET: Registered PF_XDP protocol family
Oct 14 02:39:45 localhost kernel: pci_bus 0000:00: resource 4 [io 0x0000-0x0cf7 window]
Oct 14 02:39:45 localhost kernel: pci_bus 0000:00: resource 5 [io 0x0d00-0xffff window]
Oct 14 02:39:45 localhost kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
Oct 14 02:39:45 localhost kernel: pci_bus 0000:00: resource 7 [mem 0xc0000000-0xfebfffff window]
Oct 14 02:39:45 localhost kernel: pci_bus 0000:00: resource 8 [mem 0x440000000-0x4bfffffff window]
Oct 14 02:39:45 localhost kernel: pci 0000:00:01.0: PIIX3: Enabling Passive Release
Oct 14 02:39:45 localhost kernel: pci 0000:00:00.0: Limiting direct PCI/PCI transfers
Oct 14 02:39:45 localhost kernel: ACPI: \_SB_.LNKD: Enabled at IRQ 11
Oct 14 02:39:45 localhost kernel: pci 0000:00:01.2: quirk_usb_early_handoff+0x0/0x140 took 28123 usecs
Oct 14 02:39:45 localhost kernel: PCI: CLS 0 bytes, default 64
Oct 14 02:39:45 localhost kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB)
Oct 14 02:39:45 localhost kernel: Trying to unpack rootfs image as initramfs...
Oct 14 02:39:45 localhost kernel: software IO TLB: mapped [mem 0x00000000ab000000-0x00000000af000000] (64MB)
Oct 14 02:39:45 localhost kernel: ACPI: bus type thunderbolt registered
Oct 14 02:39:45 localhost kernel: Initialise system trusted keyrings
Oct 14 02:39:45 localhost kernel: Key type blacklist registered
Oct 14 02:39:45 localhost kernel: workingset: timestamp_bits=36 max_order=22 bucket_order=0
Oct 14 02:39:45 localhost kernel: zbud: loaded
Oct 14 02:39:45 localhost kernel: integrity: Platform Keyring initialized
Oct 14 02:39:45 localhost kernel: NET: Registered PF_ALG protocol family
Oct 14 02:39:45 localhost kernel: xor: automatically using best checksumming function avx
Oct 14 02:39:45 localhost kernel: Key type asymmetric registered
Oct 14 02:39:45 localhost kernel: Asymmetric key parser 'x509' registered
Oct 14 02:39:45 localhost kernel: Running certificate verification selftests
Oct 14 02:39:45 localhost kernel: Loaded X.509 cert 'Certificate verification self-testing key: f58703bb33ce1b73ee02eccdee5b8817518fe3db'
Oct 14 02:39:45 localhost kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 246)
Oct 14 02:39:45 localhost kernel: io scheduler mq-deadline registered
Oct 14 02:39:45 localhost kernel: io scheduler kyber registered
Oct 14 02:39:45 localhost kernel: io scheduler bfq registered
Oct 14 02:39:45 localhost kernel: atomic64_test: passed for x86-64 platform with CX8 and with SSE
Oct 14 02:39:45 localhost kernel: shpchp: Standard Hot Plug PCI Controller Driver version: 0.4
Oct 14 02:39:45 localhost kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input0
Oct 14 02:39:45 localhost kernel: ACPI: button: Power Button [PWRF]
Oct 14 02:39:45 localhost kernel: ACPI: \_SB_.LNKB: Enabled at IRQ 10
Oct 14 02:39:45 localhost kernel: ACPI: \_SB_.LNKC: Enabled at IRQ 11
Oct 14 02:39:45 localhost kernel: ACPI: \_SB_.LNKA: Enabled at IRQ 10
Oct 14 02:39:45 localhost kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Oct 14 02:39:45 localhost kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A
Oct 14 02:39:45 localhost kernel: Non-volatile memory driver v1.3
Oct 14 02:39:45 localhost kernel: rdac: device handler registered
Oct 14 02:39:45 localhost kernel: hp_sw: device handler registered
Oct 14 02:39:45 localhost kernel: emc: device handler registered
Oct 14 02:39:45 localhost kernel: alua: device handler registered
Oct 14 02:39:45 localhost kernel: libphy: Fixed MDIO Bus: probed
Oct 14 02:39:45 localhost kernel: ehci_hcd: USB 2.0 'Enhanced' Host Controller (EHCI) Driver
Oct 14 02:39:45 localhost kernel: ehci-pci: EHCI PCI platform driver
Oct 14 02:39:45 localhost kernel: ohci_hcd: USB 1.1 'Open' Host Controller (OHCI) Driver
Oct 14 02:39:45 localhost kernel: ohci-pci: OHCI PCI platform driver
Oct 14 02:39:45 localhost kernel: uhci_hcd: USB Universal Host Controller Interface driver
Oct 14 02:39:45 localhost kernel: uhci_hcd 0000:00:01.2: UHCI Host Controller
Oct 14 02:39:45 localhost kernel: uhci_hcd 0000:00:01.2: new USB bus registered, assigned bus number 1
Oct 14 02:39:45 localhost kernel: uhci_hcd 0000:00:01.2: detected 2 ports
Oct 14 02:39:45 localhost kernel: uhci_hcd 0000:00:01.2: irq 11, io port 0x0000c100
Oct 14 02:39:45 localhost kernel: usb usb1: New USB device found, idVendor=1d6b, idProduct=0001, bcdDevice= 5.14
Oct 14 02:39:45 localhost kernel: usb usb1: New USB device strings: Mfr=3, Product=2, SerialNumber=1
Oct 14 02:39:45 localhost kernel: usb usb1: Product: UHCI Host Controller
Oct 14 02:39:45 localhost kernel: usb usb1: Manufacturer: Linux 5.14.0-284.11.1.el9_2.x86_64 uhci_hcd
Oct 14 02:39:45 localhost kernel: usb usb1: SerialNumber: 0000:00:01.2
Oct 14 02:39:45 localhost kernel: hub 1-0:1.0: USB hub found
Oct 14 02:39:45 localhost kernel: hub 1-0:1.0: 2 ports detected
Oct 14 02:39:45 localhost kernel: usbcore: registered new interface driver usbserial_generic
Oct 14 02:39:45 localhost kernel: usbserial: USB Serial support registered for generic
Oct 14 02:39:45 localhost kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12
Oct 14 02:39:45 localhost kernel: serio: i8042 KBD port at 0x60,0x64 irq 1
Oct 14 02:39:45 localhost kernel: serio: i8042 AUX port at 0x60,0x64 irq 12
Oct 14 02:39:45 localhost kernel: mousedev: PS/2 mouse device common for all mice
Oct 14 02:39:45 localhost kernel: rtc_cmos 00:04: RTC can wake from S4
Oct 14 02:39:45 localhost kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input1
Oct 14 02:39:45 localhost kernel: rtc_cmos 00:04: registered as rtc0
Oct 14 02:39:45 localhost kernel: rtc_cmos 00:04: setting system clock to 2025-10-14T06:39:44 UTC (1760423984)
Oct 14 02:39:45 localhost kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input4
Oct 14 02:39:45 localhost kernel: rtc_cmos 00:04: alarms up to one day, y3k, 242 bytes nvram
Oct 14 02:39:45 localhost kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input3
Oct 14 02:39:45 localhost kernel: hid: raw HID events driver (C) Jiri Kosina
Oct 14 02:39:45 localhost kernel: usbcore: registered new interface driver usbhid
Oct 14 02:39:45 localhost kernel: usbhid: USB HID core driver
Oct 14 02:39:45 localhost kernel: drop_monitor: Initializing network drop monitor service
Oct 14 02:39:45 localhost kernel: Initializing XFRM netlink socket
Oct 14 02:39:45 localhost kernel: NET: Registered PF_INET6 protocol family
Oct 14 02:39:45 localhost kernel: Segment Routing with IPv6
Oct 14 02:39:45 localhost kernel: NET: Registered PF_PACKET protocol family
Oct 14 02:39:45 localhost kernel: mpls_gso: MPLS GSO support
Oct 14 02:39:45 localhost kernel: IPI shorthand broadcast: enabled
Oct 14 02:39:45 localhost kernel: AVX2 version of gcm_enc/dec engaged.
Oct 14 02:39:45 localhost kernel: AES CTR mode by8 optimization enabled
Oct 14 02:39:45 localhost kernel: sched_clock: Marking stable (736731841, 178171774)->(1044071002, -129167387)
Oct 14 02:39:45 localhost kernel: registered taskstats version 1
Oct 14 02:39:45 localhost kernel: Loading compiled-in X.509 certificates
Oct 14 02:39:45 localhost kernel: Loaded X.509 cert 'Red Hat Enterprise Linux kernel signing key: aaec4b640ef162b54684864066c7d4ffd428cd72'
Oct 14 02:39:45 localhost kernel: Loaded X.509 cert 'Red Hat Enterprise Linux Driver Update Program (key 3): bf57f3e87362bc7229d9f465321773dfd1f77a80'
Oct 14 02:39:45 localhost kernel: Loaded X.509 cert 'Red Hat Enterprise Linux kpatch signing key: 4d38fd864ebe18c5f0b72e3852e2014c3a676fc8'
Oct 14 02:39:45 localhost kernel: zswap: loaded using pool lzo/zbud
Oct 14 02:39:45 localhost kernel: page_owner is disabled
Oct 14 02:39:45 localhost kernel: Key type big_key registered
Oct 14 02:39:45 localhost kernel: usb 1-1: new full-speed USB device number 2 using uhci_hcd
Oct 14 02:39:45 localhost kernel: Freeing initrd memory: 74232K
Oct 14 02:39:45 localhost kernel: Key type encrypted registered
Oct 14 02:39:45 localhost kernel: ima: No TPM chip found, activating TPM-bypass!
Oct 14 02:39:45 localhost kernel: Loading compiled-in module X.509 certificates
Oct 14 02:39:45 localhost kernel: Loaded X.509 cert 'Red Hat Enterprise Linux kernel signing key: aaec4b640ef162b54684864066c7d4ffd428cd72'
Oct 14 02:39:45 localhost kernel: ima: Allocated hash algorithm: sha256
Oct 14 02:39:45 localhost kernel: ima: No architecture policies found
Oct 14 02:39:45 localhost kernel: evm: Initialising EVM extended attributes:
Oct 14 02:39:45 localhost kernel: evm: security.selinux
Oct 14 02:39:45 localhost kernel: evm: security.SMACK64 (disabled)
Oct 14 02:39:45 localhost kernel: evm: security.SMACK64EXEC (disabled)
Oct 14 02:39:45 localhost kernel: evm: security.SMACK64TRANSMUTE (disabled)
Oct 14 02:39:45 localhost kernel: evm: security.SMACK64MMAP (disabled)
Oct 14 02:39:45 localhost kernel: evm: security.apparmor (disabled)
Oct 14 02:39:45 localhost kernel: evm: security.ima
Oct 14 02:39:45 localhost kernel: evm: security.capability
Oct 14 02:39:45 localhost kernel: evm: HMAC attrs: 0x1
Oct 14 02:39:45 localhost kernel: usb 1-1: New USB device found, idVendor=0627, idProduct=0001, bcdDevice= 0.00
Oct 14 02:39:45 localhost kernel: usb 1-1: New USB device strings: Mfr=1, Product=3, SerialNumber=10
Oct 14 02:39:45 localhost kernel: usb 1-1: Product: QEMU USB Tablet
Oct 14 02:39:45 localhost kernel: usb 1-1: Manufacturer: QEMU
Oct 14 02:39:45 localhost kernel: usb 1-1: SerialNumber: 28754-0000:00:01.2-1
Oct 14 02:39:45 localhost kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:01.2/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input5
Oct 14 02:39:45 localhost kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:00:01.2-1/input0
Oct 14 02:39:45 localhost kernel: Freeing unused decrypted memory: 2036K
Oct 14 02:39:45 localhost kernel: Freeing unused kernel image (initmem) memory: 2792K
Oct 14 02:39:45 localhost kernel: Write protecting the kernel read-only data: 26624k
Oct 14 02:39:45 localhost kernel: Freeing unused kernel image (text/rodata gap) memory: 2040K
Oct 14 02:39:45 localhost kernel: Freeing unused kernel image (rodata/data gap) memory: 60K
Oct 14 02:39:45 localhost kernel: x86/mm: Checked W+X mappings: passed, no W+X pages found.
Oct 14 02:39:45 localhost kernel: Run /init as init process
Oct 14 02:39:45 localhost systemd[1]: systemd 252-13.el9_2 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Oct 14 02:39:45 localhost systemd[1]: Detected virtualization kvm.
Oct 14 02:39:45 localhost systemd[1]: Detected architecture x86-64.
Oct 14 02:39:45 localhost systemd[1]: Running in initrd.
Oct 14 02:39:45 localhost systemd[1]: No hostname configured, using default hostname.
Oct 14 02:39:45 localhost systemd[1]: Hostname set to .
Oct 14 02:39:45 localhost systemd[1]: Initializing machine ID from VM UUID.
Oct 14 02:39:45 localhost systemd[1]: Queued start job for default target Initrd Default Target.
Oct 14 02:39:45 localhost systemd[1]: Started Dispatch Password Requests to Console Directory Watch.
Oct 14 02:39:45 localhost systemd[1]: Reached target Local Encrypted Volumes.
Oct 14 02:39:45 localhost systemd[1]: Reached target Initrd /usr File System.
Oct 14 02:39:45 localhost systemd[1]: Reached target Local File Systems.
Oct 14 02:39:45 localhost systemd[1]: Reached target Path Units.
Oct 14 02:39:45 localhost systemd[1]: Reached target Slice Units.
Oct 14 02:39:45 localhost systemd[1]: Reached target Swaps.
Oct 14 02:39:45 localhost systemd[1]: Reached target Timer Units.
Oct 14 02:39:45 localhost systemd[1]: Listening on D-Bus System Message Bus Socket.
Oct 14 02:39:45 localhost systemd[1]: Listening on Journal Socket (/dev/log).
Oct 14 02:39:45 localhost systemd[1]: Listening on Journal Socket.
Oct 14 02:39:45 localhost systemd[1]: Listening on udev Control Socket.
Oct 14 02:39:45 localhost systemd[1]: Listening on udev Kernel Socket.
Oct 14 02:39:45 localhost systemd[1]: Reached target Socket Units.
Oct 14 02:39:45 localhost systemd[1]: Starting Create List of Static Device Nodes...
Oct 14 02:39:45 localhost systemd[1]: Starting Journal Service...
Oct 14 02:39:45 localhost systemd[1]: Starting Load Kernel Modules...
Oct 14 02:39:45 localhost systemd[1]: Starting Create System Users...
Oct 14 02:39:45 localhost systemd[1]: Starting Setup Virtual Console...
Oct 14 02:39:45 localhost systemd[1]: Finished Create List of Static Device Nodes.
Oct 14 02:39:45 localhost systemd[1]: Finished Load Kernel Modules.
Oct 14 02:39:45 localhost systemd[1]: Starting Apply Kernel Variables...
Oct 14 02:39:45 localhost systemd-journald[282]: Journal started
Oct 14 02:39:45 localhost systemd-journald[282]: Runtime Journal (/run/log/journal/1e17686ee9d94f56ae5be175ec048439) is 8.0M, max 314.7M, 306.7M free.
Oct 14 02:39:45 localhost systemd-modules-load[283]: Module 'msr' is built in
Oct 14 02:39:45 localhost systemd[1]: Started Journal Service.
Oct 14 02:39:45 localhost systemd[1]: Finished Setup Virtual Console.
Oct 14 02:39:45 localhost systemd[1]: Finished Apply Kernel Variables.
Oct 14 02:39:45 localhost systemd[1]: dracut ask for additional cmdline parameters was skipped because no trigger condition checks were met.
Oct 14 02:39:45 localhost systemd[1]: Starting dracut cmdline hook...
Oct 14 02:39:45 localhost systemd-sysusers[284]: Creating group 'sgx' with GID 997.
Oct 14 02:39:45 localhost systemd-sysusers[284]: Creating group 'users' with GID 100.
Oct 14 02:39:45 localhost systemd-sysusers[284]: Creating group 'dbus' with GID 81.
Oct 14 02:39:45 localhost systemd-sysusers[284]: Creating user 'dbus' (System Message Bus) with UID 81 and GID 81.
Oct 14 02:39:45 localhost systemd[1]: Finished Create System Users.
Oct 14 02:39:45 localhost systemd[1]: Starting Create Static Device Nodes in /dev...
Oct 14 02:39:45 localhost systemd[1]: Starting Create Volatile Files and Directories...
Oct 14 02:39:45 localhost dracut-cmdline[289]: dracut-9.2 (Plow) dracut-057-21.git20230214.el9
Oct 14 02:39:45 localhost dracut-cmdline[289]: Using kernel command line parameters: BOOT_IMAGE=(hd0,gpt3)/vmlinuz-5.14.0-284.11.1.el9_2.x86_64 root=UUID=a3dd82de-ffc6-4652-88b9-80e003b8f20a console=tty0 console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-4G:192M,4G-64G:256M,64G-:512M
Oct 14 02:39:45 localhost systemd[1]: Finished Create Static Device Nodes in /dev.
Oct 14 02:39:45 localhost systemd[1]: Finished Create Volatile Files and Directories.
Oct 14 02:39:45 localhost systemd[1]: Finished dracut cmdline hook.
Oct 14 02:39:45 localhost systemd[1]: Starting dracut pre-udev hook...
Oct 14 02:39:45 localhost kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Oct 14 02:39:45 localhost kernel: device-mapper: uevent: version 1.0.3
Oct 14 02:39:45 localhost kernel: device-mapper: ioctl: 4.47.0-ioctl (2022-07-28) initialised: dm-devel@redhat.com
Oct 14 02:39:45 localhost kernel: RPC: Registered named UNIX socket transport module.
Oct 14 02:39:45 localhost kernel: RPC: Registered udp transport module.
Oct 14 02:39:45 localhost kernel: RPC: Registered tcp transport module.
Oct 14 02:39:45 localhost kernel: RPC: Registered tcp NFSv4.1 backchannel transport module.
Oct 14 02:39:45 localhost rpc.statd[406]: Version 2.5.4 starting
Oct 14 02:39:45 localhost rpc.statd[406]: Initializing NSM state
Oct 14 02:39:45 localhost rpc.idmapd[411]: Setting log level to 0
Oct 14 02:39:45 localhost systemd[1]: Finished dracut pre-udev hook.
Oct 14 02:39:45 localhost systemd[1]: Starting Rule-based Manager for Device Events and Files...
Oct 14 02:39:45 localhost systemd-udevd[424]: Using default interface naming scheme 'rhel-9.0'.
Oct 14 02:39:45 localhost systemd[1]: Started Rule-based Manager for Device Events and Files.
Oct 14 02:39:45 localhost systemd[1]: Starting dracut pre-trigger hook...
Oct 14 02:39:45 localhost systemd[1]: Finished dracut pre-trigger hook.
Oct 14 02:39:45 localhost systemd[1]: Starting Coldplug All udev Devices...
Oct 14 02:39:45 localhost systemd[1]: Finished Coldplug All udev Devices.
Oct 14 02:39:45 localhost systemd[1]: Reached target System Initialization.
Oct 14 02:39:45 localhost systemd[1]: Reached target Basic System.
Oct 14 02:39:45 localhost systemd[1]: nm-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Oct 14 02:39:45 localhost systemd[1]: Reached target Network.
Oct 14 02:39:45 localhost systemd[1]: nm-wait-online-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Oct 14 02:39:45 localhost systemd[1]: Starting dracut initqueue hook...
Oct 14 02:39:45 localhost kernel: virtio_blk virtio2: [vda] 838860800 512-byte logical blocks (429 GB/400 GiB)
Oct 14 02:39:45 localhost kernel: GPT:Primary header thinks Alt. header is not at the end of the disk.
Oct 14 02:39:45 localhost kernel: GPT:20971519 != 838860799
Oct 14 02:39:45 localhost kernel: GPT:Alternate GPT header not at the end of the disk.
Oct 14 02:39:45 localhost kernel: GPT:20971519 != 838860799
Oct 14 02:39:45 localhost kernel: GPT: Use GNU Parted to correct GPT errors.
Oct 14 02:39:45 localhost kernel: vda: vda1 vda2 vda3 vda4
Oct 14 02:39:45 localhost systemd-udevd[445]: Network interface NamePolicy= disabled on kernel command line.
Oct 14 02:39:45 localhost kernel: scsi host0: ata_piix
Oct 14 02:39:45 localhost kernel: scsi host1: ata_piix
Oct 14 02:39:45 localhost kernel: ata1: PATA max MWDMA2 cmd 0x1f0 ctl 0x3f6 bmdma 0xc140 irq 14
Oct 14 02:39:45 localhost kernel: ata2: PATA max MWDMA2 cmd 0x170 ctl 0x376 bmdma 0xc148 irq 15
Oct 14 02:39:45 localhost systemd[1]: Found device /dev/disk/by-uuid/a3dd82de-ffc6-4652-88b9-80e003b8f20a.
Oct 14 02:39:45 localhost systemd[1]: Reached target Initrd Root Device.
Oct 14 02:39:46 localhost kernel: ata1: found unknown device (class 0)
Oct 14 02:39:46 localhost kernel: ata1.00: ATAPI: QEMU DVD-ROM, 2.5+, max UDMA/100
Oct 14 02:39:46 localhost kernel: scsi 0:0:0:0: CD-ROM QEMU QEMU DVD-ROM 2.5+ PQ: 0 ANSI: 5
Oct 14 02:39:46 localhost kernel: scsi 0:0:0:0: Attached scsi generic sg0 type 5
Oct 14 02:39:46 localhost kernel: sr 0:0:0:0: [sr0] scsi3-mmc drive: 4x/4x cd/rw xa/form2 tray
Oct 14 02:39:46 localhost kernel: cdrom: Uniform CD-ROM driver Revision: 3.20
Oct 14 02:39:46 localhost systemd[1]: Finished dracut initqueue hook.
Oct 14 02:39:46 localhost systemd[1]: Reached target Preparation for Remote File Systems.
Oct 14 02:39:46 localhost systemd[1]: Reached target Remote Encrypted Volumes.
Oct 14 02:39:46 localhost systemd[1]: Reached target Remote File Systems.
Oct 14 02:39:46 localhost systemd[1]: Starting dracut pre-mount hook...
Oct 14 02:39:46 localhost systemd[1]: Finished dracut pre-mount hook.
Oct 14 02:39:46 localhost systemd[1]: Starting File System Check on /dev/disk/by-uuid/a3dd82de-ffc6-4652-88b9-80e003b8f20a...
Oct 14 02:39:46 localhost systemd-fsck[512]: /usr/sbin/fsck.xfs: XFS file system.
Oct 14 02:39:46 localhost systemd[1]: Finished File System Check on /dev/disk/by-uuid/a3dd82de-ffc6-4652-88b9-80e003b8f20a.
Oct 14 02:39:46 localhost systemd[1]: Mounting /sysroot...
Oct 14 02:39:46 localhost kernel: SGI XFS with ACLs, security attributes, scrub, quota, no debug enabled
Oct 14 02:39:46 localhost kernel: XFS (vda4): Mounting V5 Filesystem
Oct 14 02:39:46 localhost kernel: XFS (vda4): Ending clean mount
Oct 14 02:39:46 localhost systemd[1]: Mounted /sysroot.
Oct 14 02:39:46 localhost systemd[1]: Reached target Initrd Root File System.
Oct 14 02:39:46 localhost systemd[1]: Starting Mountpoints Configured in the Real Root...
Oct 14 02:39:46 localhost systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Oct 14 02:39:46 localhost systemd[1]: Finished Mountpoints Configured in the Real Root.
Oct 14 02:39:46 localhost systemd[1]: Reached target Initrd File Systems.
Oct 14 02:39:46 localhost systemd[1]: Reached target Initrd Default Target.
Oct 14 02:39:46 localhost systemd[1]: Starting dracut mount hook...
Oct 14 02:39:46 localhost systemd[1]: Finished dracut mount hook.
Oct 14 02:39:46 localhost systemd[1]: Starting dracut pre-pivot and cleanup hook...
Oct 14 02:39:46 localhost rpc.idmapd[411]: exiting on signal 15
Oct 14 02:39:46 localhost systemd[1]: var-lib-nfs-rpc_pipefs.mount: Deactivated successfully.
Oct 14 02:39:46 localhost systemd[1]: Finished dracut pre-pivot and cleanup hook.
Oct 14 02:39:46 localhost systemd[1]: Starting Cleaning Up and Shutting Down Daemons...
Oct 14 02:39:46 localhost systemd[1]: Stopped target Network.
Oct 14 02:39:46 localhost systemd[1]: Stopped target Remote Encrypted Volumes.
Oct 14 02:39:46 localhost systemd[1]: Stopped target Timer Units.
Oct 14 02:39:46 localhost systemd[1]: dbus.socket: Deactivated successfully.
Oct 14 02:39:46 localhost systemd[1]: Closed D-Bus System Message Bus Socket.
Oct 14 02:39:46 localhost systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Oct 14 02:39:46 localhost systemd[1]: Stopped dracut pre-pivot and cleanup hook.
Oct 14 02:39:46 localhost systemd[1]: Stopped target Initrd Default Target.
Oct 14 02:39:46 localhost systemd[1]: Stopped target Basic System.
Oct 14 02:39:46 localhost systemd[1]: Stopped target Initrd Root Device.
Oct 14 02:39:46 localhost systemd[1]: Stopped target Initrd /usr File System.
Oct 14 02:39:46 localhost systemd[1]: Stopped target Path Units.
Oct 14 02:39:46 localhost systemd[1]: Stopped target Remote File Systems.
Oct 14 02:39:46 localhost systemd[1]: Stopped target Preparation for Remote File Systems.
Oct 14 02:39:46 localhost systemd[1]: Stopped target Slice Units.
Oct 14 02:39:46 localhost systemd[1]: Stopped target Socket Units.
Oct 14 02:39:46 localhost systemd[1]: Stopped target System Initialization.
Oct 14 02:39:46 localhost systemd[1]: Stopped target Local File Systems.
Oct 14 02:39:46 localhost systemd[1]: Stopped target Swaps.
Oct 14 02:39:46 localhost systemd[1]: dracut-mount.service: Deactivated successfully.
Oct 14 02:39:46 localhost systemd[1]: Stopped dracut mount hook.
Oct 14 02:39:46 localhost systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Oct 14 02:39:46 localhost systemd[1]: Stopped dracut pre-mount hook.
Oct 14 02:39:46 localhost systemd[1]: Stopped target Local Encrypted Volumes.
Oct 14 02:39:46 localhost systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Oct 14 02:39:46 localhost systemd[1]: Stopped Dispatch Password Requests to Console Directory Watch.
Oct 14 02:39:46 localhost systemd[1]: dracut-initqueue.service: Deactivated successfully.
Oct 14 02:39:46 localhost systemd[1]: Stopped dracut initqueue hook.
Oct 14 02:39:46 localhost systemd[1]: systemd-sysctl.service: Deactivated successfully.
Oct 14 02:39:46 localhost systemd[1]: Stopped Apply Kernel Variables.
Oct 14 02:39:46 localhost systemd[1]: systemd-modules-load.service: Deactivated successfully.
Oct 14 02:39:46 localhost systemd[1]: Stopped Load Kernel Modules.
Oct 14 02:39:46 localhost systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Oct 14 02:39:46 localhost systemd[1]: Stopped Create Volatile Files and Directories.
Oct 14 02:39:46 localhost systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Oct 14 02:39:46 localhost systemd[1]: Stopped Coldplug All udev Devices.
Oct 14 02:39:46 localhost systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Oct 14 02:39:46 localhost systemd[1]: Stopped dracut pre-trigger hook.
Oct 14 02:39:46 localhost systemd[1]: Stopping Rule-based Manager for Device Events and Files...
Oct 14 02:39:46 localhost systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Oct 14 02:39:46 localhost systemd[1]: Stopped Setup Virtual Console.
Oct 14 02:39:46 localhost systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
Oct 14 02:39:46 localhost systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Oct 14 02:39:46 localhost systemd[1]: initrd-cleanup.service: Deactivated successfully.
Oct 14 02:39:46 localhost systemd[1]: Finished Cleaning Up and Shutting Down Daemons.
Oct 14 02:39:46 localhost systemd[1]: systemd-udevd.service: Deactivated successfully.
Oct 14 02:39:46 localhost systemd[1]: Stopped Rule-based Manager for Device Events and Files.
Oct 14 02:39:46 localhost systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Oct 14 02:39:46 localhost systemd[1]: Closed udev Control Socket.
Oct 14 02:39:46 localhost systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Oct 14 02:39:46 localhost systemd[1]: Closed udev Kernel Socket.
Oct 14 02:39:46 localhost systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Oct 14 02:39:46 localhost systemd[1]: Stopped dracut pre-udev hook.
Oct 14 02:39:46 localhost systemd[1]: dracut-cmdline.service: Deactivated successfully.
Oct 14 02:39:46 localhost systemd[1]: Stopped dracut cmdline hook.
Oct 14 02:39:46 localhost systemd[1]: Starting Cleanup udev Database...
Oct 14 02:39:46 localhost systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Oct 14 02:39:46 localhost systemd[1]: Stopped Create Static Device Nodes in /dev.
Oct 14 02:39:46 localhost systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Oct 14 02:39:46 localhost systemd[1]: Stopped Create List of Static Device Nodes.
Oct 14 02:39:46 localhost systemd[1]: systemd-sysusers.service: Deactivated successfully.
Oct 14 02:39:46 localhost systemd[1]: Stopped Create System Users.
Oct 14 02:39:46 localhost systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Oct 14 02:39:46 localhost systemd[1]: Finished Cleanup udev Database.
Oct 14 02:39:46 localhost systemd[1]: Reached target Switch Root.
Oct 14 02:39:46 localhost systemd[1]: Starting Switch Root...
Oct 14 02:39:46 localhost systemd[1]: Switching root.
Oct 14 02:39:46 localhost systemd-journald[282]: Journal stopped
Oct 14 02:39:47 localhost systemd-journald[282]: Received SIGTERM from PID 1 (systemd).
Oct 14 02:39:47 localhost kernel: audit: type=1404 audit(1760423987.078:2): enforcing=1 old_enforcing=0 auid=4294967295 ses=4294967295 enabled=1 old-enabled=1 lsm=selinux res=1
Oct 14 02:39:47 localhost kernel: SELinux: policy capability network_peer_controls=1
Oct 14 02:39:47 localhost kernel: SELinux: policy capability open_perms=1
Oct 14 02:39:47 localhost kernel: SELinux: policy capability extended_socket_class=1
Oct 14 02:39:47 localhost kernel: SELinux: policy capability always_check_network=0
Oct 14 02:39:47 localhost kernel: SELinux: policy capability cgroup_seclabel=1
Oct 14 02:39:47 localhost kernel: SELinux: policy capability nnp_nosuid_transition=1
Oct 14 02:39:47 localhost kernel: SELinux: policy capability genfs_seclabel_symlinks=1
Oct 14 02:39:47 localhost kernel: audit: type=1403 audit(1760423987.210:3): auid=4294967295 ses=4294967295 lsm=selinux res=1
Oct 14 02:39:47 localhost systemd[1]: Successfully loaded SELinux policy in 135.147ms.
Oct 14 02:39:47 localhost systemd[1]: Relabelled /dev, /dev/shm, /run, /sys/fs/cgroup in 33.292ms.
Oct 14 02:39:47 localhost systemd[1]: systemd 252-13.el9_2 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Oct 14 02:39:47 localhost systemd[1]: Detected virtualization kvm.
Oct 14 02:39:47 localhost systemd[1]: Detected architecture x86-64.
Oct 14 02:39:47 localhost systemd-rc-local-generator[582]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 02:39:47 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 02:39:47 localhost systemd[1]: initrd-switch-root.service: Deactivated successfully.
Oct 14 02:39:47 localhost systemd[1]: Stopped Switch Root.
Oct 14 02:39:47 localhost systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Oct 14 02:39:47 localhost systemd[1]: Created slice Slice /system/getty.
Oct 14 02:39:47 localhost systemd[1]: Created slice Slice /system/modprobe.
Oct 14 02:39:47 localhost systemd[1]: Created slice Slice /system/serial-getty.
Oct 14 02:39:47 localhost systemd[1]: Created slice Slice /system/sshd-keygen.
Oct 14 02:39:47 localhost systemd[1]: Created slice Slice /system/systemd-fsck.
Oct 14 02:39:47 localhost systemd[1]: Created slice User and Session Slice.
Oct 14 02:39:47 localhost systemd[1]: Started Dispatch Password Requests to Console Directory Watch.
Oct 14 02:39:47 localhost systemd[1]: Started Forward Password Requests to Wall Directory Watch.
Oct 14 02:39:47 localhost systemd[1]: Set up automount Arbitrary Executable File Formats File System Automount Point.
Oct 14 02:39:47 localhost systemd[1]: Reached target Local Encrypted Volumes.
Oct 14 02:39:47 localhost systemd[1]: Stopped target Switch Root.
Oct 14 02:39:47 localhost systemd[1]: Stopped target Initrd File Systems.
Oct 14 02:39:47 localhost systemd[1]: Stopped target Initrd Root File System.
Oct 14 02:39:47 localhost systemd[1]: Reached target Local Integrity Protected Volumes.
Oct 14 02:39:47 localhost systemd[1]: Reached target Path Units.
Oct 14 02:39:47 localhost systemd[1]: Reached target rpc_pipefs.target.
Oct 14 02:39:47 localhost systemd[1]: Reached target Slice Units.
Oct 14 02:39:47 localhost systemd[1]: Reached target Swaps.
Oct 14 02:39:47 localhost systemd[1]: Reached target Local Verity Protected Volumes.
Oct 14 02:39:47 localhost systemd[1]: Listening on RPCbind Server Activation Socket.
Oct 14 02:39:47 localhost systemd[1]: Reached target RPC Port Mapper.
Oct 14 02:39:47 localhost systemd[1]: Listening on Process Core Dump Socket.
Oct 14 02:39:47 localhost systemd[1]: Listening on initctl Compatibility Named Pipe.
Oct 14 02:39:47 localhost systemd[1]: Listening on udev Control Socket.
Oct 14 02:39:47 localhost systemd[1]: Listening on udev Kernel Socket.
Oct 14 02:39:47 localhost systemd[1]: Mounting Huge Pages File System...
Oct 14 02:39:47 localhost systemd[1]: Mounting POSIX Message Queue File System...
Oct 14 02:39:47 localhost systemd[1]: Mounting Kernel Debug File System...
Oct 14 02:39:47 localhost systemd[1]: Mounting Kernel Trace File System...
Oct 14 02:39:47 localhost systemd[1]: Kernel Module supporting RPCSEC_GSS was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Oct 14 02:39:47 localhost systemd[1]: Starting Create List of Static Device Nodes...
Oct 14 02:39:47 localhost systemd[1]: Starting Load Kernel Module configfs...
Oct 14 02:39:47 localhost systemd[1]: Starting Load Kernel Module drm...
Oct 14 02:39:47 localhost systemd[1]: Starting Load Kernel Module fuse...
Oct 14 02:39:47 localhost systemd[1]: Starting Read and set NIS domainname from /etc/sysconfig/network...
Oct 14 02:39:47 localhost systemd[1]: systemd-fsck-root.service: Deactivated successfully.
Oct 14 02:39:47 localhost systemd[1]: Stopped File System Check on Root Device.
Oct 14 02:39:47 localhost systemd[1]: Stopped Journal Service.
Oct 14 02:39:47 localhost systemd[1]: Starting Journal Service...
Oct 14 02:39:47 localhost systemd[1]: Starting Load Kernel Modules...
Oct 14 02:39:47 localhost systemd[1]: Starting Generate network units from Kernel command line...
Oct 14 02:39:47 localhost kernel: fuse: init (API version 7.36)
Oct 14 02:39:47 localhost systemd[1]: Starting Remount Root and Kernel File Systems...
Oct 14 02:39:47 localhost systemd[1]: Repartition Root Disk was skipped because no trigger condition checks were met.
Oct 14 02:39:47 localhost systemd[1]: Starting Coldplug All udev Devices...
Oct 14 02:39:47 localhost systemd-journald[618]: Journal started
Oct 14 02:39:47 localhost systemd-journald[618]: Runtime Journal (/run/log/journal/8e1d5208cffec42b50976967e1d1cfd0) is 8.0M, max 314.7M, 306.7M free.
Oct 14 02:39:47 localhost systemd[1]: Queued start job for default target Multi-User System.
Oct 14 02:39:47 localhost systemd[1]: systemd-journald.service: Deactivated successfully.
Oct 14 02:39:47 localhost systemd-modules-load[619]: Module 'msr' is built in
Oct 14 02:39:47 localhost systemd[1]: Started Journal Service.
Oct 14 02:39:47 localhost kernel: xfs filesystem being remounted at / supports timestamps until 2038 (0x7fffffff)
Oct 14 02:39:47 localhost systemd[1]: Mounted Huge Pages File System.
Oct 14 02:39:47 localhost systemd[1]: Mounted POSIX Message Queue File System.
Oct 14 02:39:47 localhost systemd[1]: Mounted Kernel Debug File System.
Oct 14 02:39:47 localhost systemd[1]: Mounted Kernel Trace File System.
Oct 14 02:39:47 localhost systemd[1]: Finished Create List of Static Device Nodes.
Oct 14 02:39:47 localhost kernel: ACPI: bus type drm_connector registered
Oct 14 02:39:47 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Oct 14 02:39:47 localhost systemd[1]: Finished Load Kernel Module configfs.
Oct 14 02:39:47 localhost systemd[1]: modprobe@drm.service: Deactivated successfully.
Oct 14 02:39:47 localhost systemd[1]: Finished Load Kernel Module drm.
Oct 14 02:39:47 localhost systemd[1]: modprobe@fuse.service: Deactivated successfully.
Oct 14 02:39:47 localhost systemd[1]: Finished Load Kernel Module fuse.
Oct 14 02:39:47 localhost systemd[1]: Finished Read and set NIS domainname from /etc/sysconfig/network.
Oct 14 02:39:48 localhost systemd[1]: Finished Load Kernel Modules.
Oct 14 02:39:48 localhost systemd[1]: Finished Generate network units from Kernel command line.
Oct 14 02:39:48 localhost systemd[1]: Finished Remount Root and Kernel File Systems.
Oct 14 02:39:48 localhost systemd[1]: Mounting FUSE Control File System...
Oct 14 02:39:48 localhost systemd[1]: Mounting Kernel Configuration File System...
Oct 14 02:39:48 localhost systemd[1]: First Boot Wizard was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Oct 14 02:39:48 localhost systemd[1]: Starting Rebuild Hardware Database...
Oct 14 02:39:48 localhost systemd[1]: Starting Flush Journal to Persistent Storage...
Oct 14 02:39:48 localhost systemd[1]: Starting Load/Save Random Seed...
Oct 14 02:39:48 localhost systemd[1]: Starting Apply Kernel Variables...
Oct 14 02:39:48 localhost systemd-journald[618]: Runtime Journal (/run/log/journal/8e1d5208cffec42b50976967e1d1cfd0) is 8.0M, max 314.7M, 306.7M free.
Oct 14 02:39:48 localhost systemd-journald[618]: Received client request to flush runtime journal.
Oct 14 02:39:48 localhost systemd[1]: Starting Create System Users...
Oct 14 02:39:48 localhost systemd[1]: Mounted FUSE Control File System.
Oct 14 02:39:48 localhost systemd[1]: Finished Coldplug All udev Devices.
Oct 14 02:39:48 localhost systemd[1]: Mounted Kernel Configuration File System.
Oct 14 02:39:48 localhost systemd[1]: Finished Flush Journal to Persistent Storage.
Oct 14 02:39:48 localhost systemd[1]: Finished Apply Kernel Variables.
Oct 14 02:39:48 localhost systemd[1]: Finished Load/Save Random Seed.
Oct 14 02:39:48 localhost systemd[1]: First Boot Complete was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Oct 14 02:39:48 localhost systemd-sysusers[631]: Creating group 'sgx' with GID 989.
Oct 14 02:39:48 localhost systemd-sysusers[631]: Creating group 'systemd-oom' with GID 988.
Oct 14 02:39:48 localhost systemd-sysusers[631]: Creating user 'systemd-oom' (systemd Userspace OOM Killer) with UID 988 and GID 988.
Oct 14 02:39:48 localhost systemd[1]: Finished Create System Users.
Oct 14 02:39:48 localhost systemd[1]: Starting Create Static Device Nodes in /dev...
Oct 14 02:39:48 localhost systemd[1]: Finished Create Static Device Nodes in /dev.
Oct 14 02:39:48 localhost systemd[1]: Reached target Preparation for Local File Systems.
Oct 14 02:39:48 localhost systemd[1]: Set up automount EFI System Partition Automount.
Oct 14 02:39:48 localhost systemd[1]: Finished Rebuild Hardware Database.
Oct 14 02:39:48 localhost systemd[1]: Starting Rule-based Manager for Device Events and Files...
Oct 14 02:39:48 localhost systemd-udevd[635]: Using default interface naming scheme 'rhel-9.0'.
Oct 14 02:39:48 localhost systemd[1]: Started Rule-based Manager for Device Events and Files.
Oct 14 02:39:48 localhost systemd[1]: Starting Load Kernel Module configfs...
Oct 14 02:39:48 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Oct 14 02:39:48 localhost systemd[1]: Finished Load Kernel Module configfs.
Oct 14 02:39:48 localhost systemd[1]: Condition check resulted in /dev/ttyS0 being skipped.
Oct 14 02:39:48 localhost systemd-udevd[638]: Network interface NamePolicy= disabled on kernel command line.
Oct 14 02:39:48 localhost systemd[1]: Condition check resulted in /dev/disk/by-uuid/b141154b-6a70-437a-a97f-d160c9ba37eb being skipped.
Oct 14 02:39:48 localhost systemd[1]: Condition check resulted in /dev/disk/by-uuid/7B77-95E7 being skipped.
Oct 14 02:39:48 localhost systemd[1]: Starting File System Check on /dev/disk/by-uuid/7B77-95E7...
Oct 14 02:39:48 localhost systemd-fsck[679]: fsck.fat 4.2 (2021-01-31)
Oct 14 02:39:48 localhost systemd-fsck[679]: /dev/vda2: 12 files, 1782/51145 clusters
Oct 14 02:39:48 localhost kernel: piix4_smbus 0000:00:01.3: SMBus Host Controller at 0x700, revision 0
Oct 14 02:39:48 localhost systemd[1]: Finished File System Check on /dev/disk/by-uuid/7B77-95E7.
Oct 14 02:39:48 localhost kernel: input: PC Speaker as /devices/platform/pcspkr/input/input6
Oct 14 02:39:48 localhost kernel: [drm] pci: virtio-vga detected at 0000:00:02.0
Oct 14 02:39:48 localhost kernel: virtio-pci 0000:00:02.0: vgaarb: deactivate vga console
Oct 14 02:39:48 localhost kernel: Console: switching to colour dummy device 80x25
Oct 14 02:39:48 localhost kernel: [drm] features: -virgl +edid -resource_blob -host_visible
Oct 14 02:39:48 localhost kernel: [drm] features: -context_init
Oct 14 02:39:48 localhost kernel: [drm] number of scanouts: 1
Oct 14 02:39:48 localhost kernel: [drm] number of cap sets: 0
Oct 14 02:39:48 localhost kernel: [drm] Initialized virtio_gpu 0.1.0 0 for virtio0 on minor 0
Oct 14 02:39:48 localhost kernel: virtio_gpu virtio0: [drm] drm_plane_enable_fb_damage_clips() not called
Oct 14 02:39:48 localhost kernel: Console: switching to colour frame buffer device 128x48
Oct 14 02:39:48 localhost kernel: virtio_gpu virtio0: [drm] fb0: virtio_gpudrmfb frame buffer device
Oct 14 02:39:48 localhost kernel: SVM: TSC scaling supported
Oct 14 02:39:48 localhost kernel: kvm: Nested Virtualization enabled
Oct 14 02:39:48 localhost kernel: SVM: kvm: Nested Paging enabled
Oct 14 02:39:48 localhost kernel: SVM: LBR virtualization supported
Oct 14 02:39:48 localhost systemd[1]: Mounting /boot...
Oct 14 02:39:48 localhost kernel: XFS (vda3): Mounting V5 Filesystem
Oct 14 02:39:49 localhost kernel: XFS (vda3): Ending clean mount
Oct 14 02:39:49 localhost kernel: xfs filesystem being mounted at /boot supports timestamps until 2038 (0x7fffffff)
Oct 14 02:39:49 localhost systemd[1]: Mounted /boot.
Oct 14 02:39:49 localhost systemd[1]: Mounting /boot/efi...
Oct 14 02:39:49 localhost systemd[1]: Mounted /boot/efi.
Oct 14 02:39:49 localhost systemd[1]: Reached target Local File Systems.
Oct 14 02:39:49 localhost systemd[1]: Starting Rebuild Dynamic Linker Cache...
Oct 14 02:39:49 localhost systemd[1]: Mark the need to relabel after reboot was skipped because of an unmet condition check (ConditionSecurity=!selinux).
Oct 14 02:39:49 localhost systemd[1]: Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Oct 14 02:39:49 localhost systemd[1]: Store a System Token in an EFI Variable was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/LoaderFeatures-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Oct 14 02:39:49 localhost systemd[1]: Starting Automatic Boot Loader Update...
Oct 14 02:39:49 localhost systemd[1]: Commit a transient machine-id on disk was skipped because of an unmet condition check (ConditionPathIsMountPoint=/etc/machine-id).
Oct 14 02:39:49 localhost systemd[1]: Starting Create Volatile Files and Directories...
Oct 14 02:39:49 localhost systemd[1]: efi.automount: Got automount request for /efi, triggered by 707 (bootctl)
Oct 14 02:39:49 localhost systemd[1]: Starting File System Check on /dev/vda2...
Oct 14 02:39:49 localhost systemd[1]: Finished File System Check on /dev/vda2.
Oct 14 02:39:49 localhost systemd[1]: Mounting EFI System Partition Automount...
Oct 14 02:39:49 localhost systemd[1]: Mounted EFI System Partition Automount.
Oct 14 02:39:49 localhost systemd[1]: Finished Automatic Boot Loader Update.
Oct 14 02:39:49 localhost systemd[1]: Finished Create Volatile Files and Directories.
Oct 14 02:39:49 localhost systemd[1]: Starting Security Auditing Service...
Oct 14 02:39:49 localhost systemd[1]: Starting RPC Bind...
Oct 14 02:39:49 localhost systemd[1]: Starting Rebuild Journal Catalog...
Oct 14 02:39:49 localhost auditd[722]: audit dispatcher initialized with q_depth=1200 and 1 active plugins
Oct 14 02:39:49 localhost auditd[722]: Init complete, auditd 3.0.7 listening for events (startup state enable)
Oct 14 02:39:49 localhost systemd[1]: Finished Rebuild Journal Catalog.
Oct 14 02:39:49 localhost systemd[1]: Started RPC Bind.
Oct 14 02:39:49 localhost augenrules[727]: /sbin/augenrules: No change
Oct 14 02:39:49 localhost augenrules[741]: No rules
Oct 14 02:39:49 localhost augenrules[741]: enabled 1
Oct 14 02:39:49 localhost augenrules[741]: failure 1
Oct 14 02:39:49 localhost augenrules[741]: pid 722
Oct 14 02:39:49 localhost augenrules[741]: rate_limit 0
Oct 14 02:39:49 localhost augenrules[741]: backlog_limit 8192
Oct 14 02:39:49 localhost augenrules[741]: lost 0
Oct 14 02:39:49 localhost augenrules[741]: backlog 2
Oct 14 02:39:49 localhost augenrules[741]: backlog_wait_time 60000
Oct 14 02:39:49 localhost augenrules[741]: backlog_wait_time_actual 0
Oct 14 02:39:49 localhost augenrules[741]: enabled 1
Oct 14 02:39:49 localhost augenrules[741]: failure 1
Oct 14 02:39:49 localhost augenrules[741]: pid 722
Oct 14 02:39:49 localhost augenrules[741]: rate_limit 0
Oct 14 02:39:49 localhost augenrules[741]: backlog_limit 8192
Oct 14 02:39:49 localhost augenrules[741]: lost 0
Oct 14 02:39:49 localhost augenrules[741]: backlog 0
Oct 14 02:39:49 localhost augenrules[741]: backlog_wait_time 60000
Oct 14 02:39:49 localhost augenrules[741]: backlog_wait_time_actual 0
Oct 14 02:39:49 localhost augenrules[741]: enabled 1
Oct 14 02:39:49 localhost augenrules[741]: failure 1
Oct 14 02:39:49 localhost augenrules[741]: pid 722
Oct 14 02:39:49 localhost augenrules[741]: rate_limit 0
Oct 14 02:39:49 localhost augenrules[741]: backlog_limit 8192
Oct 14 02:39:49 localhost augenrules[741]: lost 0
Oct 14 02:39:49 localhost augenrules[741]: backlog 0
Oct 14 02:39:49 localhost augenrules[741]: backlog_wait_time 60000
Oct 14 02:39:49 localhost augenrules[741]: backlog_wait_time_actual 0
Oct 14 02:39:49 localhost systemd[1]: Started Security Auditing Service.
Oct 14 02:39:49 localhost systemd[1]: Starting Record System Boot/Shutdown in UTMP...
Oct 14 02:39:49 localhost systemd[1]: Finished Record System Boot/Shutdown in UTMP.
Oct 14 02:39:49 localhost systemd[1]: Finished Rebuild Dynamic Linker Cache.
Oct 14 02:39:49 localhost systemd[1]: Starting Update is Completed...
Oct 14 02:39:49 localhost systemd[1]: Finished Update is Completed.
Oct 14 02:39:49 localhost systemd[1]: Reached target System Initialization.
Oct 14 02:39:49 localhost systemd[1]: Started dnf makecache --timer.
Oct 14 02:39:49 localhost systemd[1]: Started Daily rotation of log files.
Oct 14 02:39:49 localhost systemd[1]: Started Daily Cleanup of Temporary Directories.
Oct 14 02:39:49 localhost systemd[1]: Reached target Timer Units.
Oct 14 02:39:49 localhost systemd[1]: Listening on D-Bus System Message Bus Socket.
Oct 14 02:39:49 localhost systemd[1]: Listening on SSSD Kerberos Cache Manager responder socket.
Oct 14 02:39:49 localhost systemd[1]: Reached target Socket Units.
Oct 14 02:39:49 localhost systemd[1]: Starting Initial cloud-init job (pre-networking)...
Oct 14 02:39:49 localhost systemd[1]: Starting D-Bus System Message Bus...
Oct 14 02:39:49 localhost systemd[1]: TPM2 PCR Barrier (Initialization) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Oct 14 02:39:49 localhost systemd[1]: Started D-Bus System Message Bus.
Oct 14 02:39:49 localhost systemd[1]: Reached target Basic System.
Oct 14 02:39:49 localhost systemd[1]: Starting NTP client/server...
Oct 14 02:39:49 localhost journal[751]: Ready
Oct 14 02:39:49 localhost systemd[1]: Starting Restore /run/initramfs on shutdown...
Oct 14 02:39:49 localhost systemd[1]: Started irqbalance daemon.
Oct 14 02:39:49 localhost systemd[1]: Load CPU microcode update was skipped because of an unmet condition check (ConditionPathExists=/sys/devices/system/cpu/microcode/reload).
Oct 14 02:39:49 localhost systemd[1]: Starting System Logging Service...
Oct 14 02:39:49 localhost systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Oct 14 02:39:49 localhost systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Oct 14 02:39:49 localhost systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Oct 14 02:39:49 localhost systemd[1]: Reached target sshd-keygen.target.
Oct 14 02:39:49 localhost systemd[1]: System Security Services Daemon was skipped because no trigger condition checks were met.
Oct 14 02:39:49 localhost systemd[1]: Reached target User and Group Name Lookups.
Oct 14 02:39:49 localhost systemd[1]: Starting User Login Management...
Oct 14 02:39:49 localhost systemd[1]: Finished Restore /run/initramfs on shutdown.
Oct 14 02:39:49 localhost rsyslogd[759]: [origin software="rsyslogd" swVersion="8.2102.0-111.el9" x-pid="759" x-info="https://www.rsyslog.com"] start
Oct 14 02:39:49 localhost rsyslogd[759]: imjournal: No statefile exists, /var/lib/rsyslog/imjournal.state will be created (ignore if this is first run): No such file or directory [v8.2102.0-111.el9 try https://www.rsyslog.com/e/2040 ]
Oct 14 02:39:49 localhost systemd[1]: Started System Logging Service.
Oct 14 02:39:49 localhost chronyd[766]: chronyd version 4.3 starting (+CMDMON +NTP +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +ASYNCDNS +NTS +SECHASH +IPV6 +DEBUG)
Oct 14 02:39:49 localhost chronyd[766]: Using right/UTC timezone to obtain leap second data
Oct 14 02:39:49 localhost chronyd[766]: Loaded seccomp filter (level 2)
Oct 14 02:39:49 localhost systemd[1]: Started NTP client/server.
Oct 14 02:39:49 localhost systemd-logind[760]: New seat seat0.
Oct 14 02:39:49 localhost systemd-logind[760]: Watching system buttons on /dev/input/event0 (Power Button)
Oct 14 02:39:49 localhost systemd-logind[760]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Oct 14 02:39:49 localhost systemd[1]: Started User Login Management.
Oct 14 02:39:49 localhost rsyslogd[759]: imjournal: journal files changed, reloading... [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Oct 14 02:39:50 localhost cloud-init[770]: Cloud-init v. 22.1-9.el9 running 'init-local' at Tue, 14 Oct 2025 06:39:50 +0000. Up 6.21 seconds.
Oct 14 02:39:50 localhost systemd[1]: run-cloud\x2dinit-tmp-tmp4ttalpm2.mount: Deactivated successfully.
Oct 14 02:39:50 localhost systemd[1]: Starting Hostname Service...
Oct 14 02:39:50 localhost systemd[1]: Started Hostname Service.
Oct 14 02:39:50 localhost systemd-hostnamed[784]: Hostname set to (static)
Oct 14 02:39:50 localhost systemd[1]: Finished Initial cloud-init job (pre-networking).
Oct 14 02:39:50 localhost systemd[1]: Reached target Preparation for Network.
Oct 14 02:39:50 localhost systemd[1]: Starting Network Manager...
Oct 14 02:39:50 localhost NetworkManager[789]: [1760423990.6128] NetworkManager (version 1.42.2-1.el9) is starting... (boot:6bfe016f-8d50-419d-b828-da4460617f42)
Oct 14 02:39:50 localhost NetworkManager[789]: [1760423990.6136] Read config: /etc/NetworkManager/NetworkManager.conf (run: 15-carrier-timeout.conf)
Oct 14 02:39:50 localhost systemd[1]: Started Network Manager.
Oct 14 02:39:50 localhost NetworkManager[789]: [1760423990.6179] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Oct 14 02:39:50 localhost systemd[1]: Reached target Network.
Oct 14 02:39:50 localhost systemd[1]: Starting Network Manager Wait Online...
Oct 14 02:39:50 localhost systemd[1]: Starting GSSAPI Proxy Daemon...
Oct 14 02:39:50 localhost NetworkManager[789]: [1760423990.6271] manager[0x5555fc016020]: monitoring kernel firmware directory '/lib/firmware'.
Oct 14 02:39:50 localhost systemd[1]: Starting Enable periodic update of entitlement certificates....
Oct 14 02:39:50 localhost systemd[1]: Starting Dynamic System Tuning Daemon...
Oct 14 02:39:50 localhost systemd[1]: Started Enable periodic update of entitlement certificates..
Oct 14 02:39:50 localhost NetworkManager[789]: [1760423990.6364] hostname: hostname: using hostnamed
Oct 14 02:39:50 localhost NetworkManager[789]: [1760423990.6364] hostname: static hostname changed from (none) to "np0005486733.novalocal"
Oct 14 02:39:50 localhost NetworkManager[789]: [1760423990.6374] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Oct 14 02:39:50 localhost systemd[1]: Started GSSAPI Proxy Daemon.
Oct 14 02:39:50 localhost systemd[1]: RPC security service for NFS client and server was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Oct 14 02:39:50 localhost systemd[1]: Reached target NFS client services.
Oct 14 02:39:50 localhost systemd[1]: Reached target Preparation for Remote File Systems.
Oct 14 02:39:50 localhost systemd[1]: Reached target Remote File Systems.
Oct 14 02:39:50 localhost systemd[1]: TPM2 PCR Barrier (User) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Oct 14 02:39:50 localhost NetworkManager[789]: [1760423990.6563] manager[0x5555fc016020]: rfkill: Wi-Fi hardware radio set enabled
Oct 14 02:39:50 localhost NetworkManager[789]: [1760423990.6565] manager[0x5555fc016020]: rfkill: WWAN hardware radio set enabled
Oct 14 02:39:50 localhost systemd[1]: Listening on Load/Save RF Kill Switch Status /dev/rfkill Watch.
Oct 14 02:39:50 localhost NetworkManager[789]: [1760423990.6627] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.42.2-1.el9/libnm-device-plugin-team.so)
Oct 14 02:39:50 localhost NetworkManager[789]: [1760423990.6628] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Oct 14 02:39:50 localhost NetworkManager[789]: [1760423990.6638] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Oct 14 02:39:50 localhost NetworkManager[789]: [1760423990.6639] manager: Networking is enabled by state file
Oct 14 02:39:50 localhost NetworkManager[789]: [1760423990.6675] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.42.2-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Oct 14 02:39:50 localhost NetworkManager[789]: [1760423990.6675] settings: Loaded settings plugin: keyfile (internal)
Oct 14 02:39:50 localhost NetworkManager[789]: [1760423990.6695] dhcp: init: Using DHCP client 'internal'
Oct 14 02:39:50 localhost NetworkManager[789]: [1760423990.6697] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Oct 14 02:39:50 localhost NetworkManager[789]: [1760423990.6707] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', sys-iface-state: 'external')
Oct 14 02:39:50 localhost NetworkManager[789]: [1760423990.6710] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', sys-iface-state: 'external')
Oct 14 02:39:50 localhost systemd[1]: Starting Network Manager Script Dispatcher Service...
Oct 14 02:39:50 localhost NetworkManager[789]: [1760423990.6715] device (lo): Activation: starting connection 'lo' (eaf496c7-1f18-48c4-bdf3-53eb5b9ead4a)
Oct 14 02:39:50 localhost NetworkManager[789]: [1760423990.6721] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Oct 14 02:39:50 localhost NetworkManager[789]: [1760423990.6722] device (eth0): state change: unmanaged -> unavailable (reason 'managed', sys-iface-state: 'external')
Oct 14 02:39:50 localhost NetworkManager[789]: [1760423990.6760] device (lo): state change: disconnected -> prepare (reason 'none', sys-iface-state: 'external')
Oct 14 02:39:50 localhost NetworkManager[789]: [1760423990.6762] device (lo): state change: prepare -> config (reason 'none', sys-iface-state: 'external')
Oct 14 02:39:50 localhost NetworkManager[789]: [1760423990.6764] device (lo): state change: config -> ip-config (reason 'none', sys-iface-state: 'external')
Oct 14 02:39:50 localhost NetworkManager[789]: [1760423990.6765] device (eth0): carrier: link connected
Oct 14 02:39:50 localhost NetworkManager[789]: [1760423990.6768] device (lo): state change: ip-config -> ip-check (reason 'none', sys-iface-state: 'external')
Oct 14 02:39:50 localhost NetworkManager[789]: [1760423990.6772] device (eth0): state change: unavailable -> disconnected (reason 'carrier-changed', sys-iface-state: 'managed')
Oct 14 02:39:50 localhost NetworkManager[789]: [1760423990.6779] policy: auto-activating connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Oct 14 02:39:50 localhost NetworkManager[789]: [1760423990.6782] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Oct 14 02:39:50 localhost NetworkManager[789]: [1760423990.6782] device (eth0): state change: disconnected -> prepare (reason 'none', sys-iface-state: 'managed')
Oct 14 02:39:50 localhost NetworkManager[789]: [1760423990.6784] manager: NetworkManager state is now CONNECTING
Oct 14 02:39:50 localhost NetworkManager[789]: [1760423990.6786] device (eth0): state change: prepare -> config (reason 'none', sys-iface-state: 'managed')
Oct 14 02:39:50 localhost NetworkManager[789]: [1760423990.6794] device (eth0): state change: config -> ip-config (reason 'none', sys-iface-state: 'managed')
Oct 14 02:39:50 localhost NetworkManager[789]: [1760423990.6797] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Oct 14 02:39:50 localhost systemd[1]: Started Network Manager Script Dispatcher Service.
Oct 14 02:39:50 localhost NetworkManager[789]: [1760423990.6865] device (lo): state change: ip-check -> secondaries (reason 'none', sys-iface-state: 'external')
Oct 14 02:39:50 localhost NetworkManager[789]: [1760423990.6867] device (lo): state change: secondaries -> activated (reason 'none', sys-iface-state: 'external')
Oct 14 02:39:50 localhost NetworkManager[789]: [1760423990.6870] device (lo): Activation: successful, device activated.
Oct 14 02:39:50 localhost NetworkManager[789]: [1760423990.7128] dhcp4 (eth0): state changed new lease, address=38.102.83.143
Oct 14 02:39:50 localhost NetworkManager[789]: [1760423990.7132] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Oct 14 02:39:50 localhost NetworkManager[789]: [1760423990.7152] device (eth0): state change: ip-config -> ip-check (reason 'none', sys-iface-state: 'managed')
Oct 14 02:39:50 localhost NetworkManager[789]: [1760423990.7172] device (eth0): state change: ip-check -> secondaries (reason 'none', sys-iface-state: 'managed')
Oct 14 02:39:50 localhost NetworkManager[789]: [1760423990.7173] device (eth0): state change: secondaries -> activated (reason 'none', sys-iface-state: 'managed')
Oct 14 02:39:50 localhost NetworkManager[789]: [1760423990.7176] manager: NetworkManager state is now CONNECTED_SITE
Oct 14 02:39:50 localhost NetworkManager[789]: [1760423990.7179] device (eth0): Activation: successful, device activated.
Oct 14 02:39:50 localhost NetworkManager[789]: [1760423990.7183] manager: NetworkManager state is now CONNECTED_GLOBAL
Oct 14 02:39:50 localhost NetworkManager[789]: [1760423990.7186] manager: startup complete
Oct 14 02:39:50 localhost systemd[1]: Finished Network Manager Wait Online.
Oct 14 02:39:50 localhost systemd[1]: Starting Initial cloud-init job (metadata service crawler)...
Oct 14 02:39:50 localhost cloud-init[917]: Cloud-init v. 22.1-9.el9 running 'init' at Tue, 14 Oct 2025 06:39:50 +0000. Up 7.13 seconds.
Oct 14 02:39:50 localhost cloud-init[917]: ci-info: +++++++++++++++++++++++++++++++++++++++Net device info+++++++++++++++++++++++++++++++++++++++
Oct 14 02:39:50 localhost cloud-init[917]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Oct 14 02:39:50 localhost cloud-init[917]: ci-info: | Device | Up | Address | Mask | Scope | Hw-Address |
Oct 14 02:39:50 localhost cloud-init[917]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Oct 14 02:39:50 localhost cloud-init[917]: ci-info: | eth0 | True | 38.102.83.143 | 255.255.255.0 | global | fa:16:3e:99:78:0b |
Oct 14 02:39:50 localhost cloud-init[917]: ci-info: | eth0 | True | fe80::f816:3eff:fe99:780b/64 | . | link | fa:16:3e:99:78:0b |
Oct 14 02:39:50 localhost cloud-init[917]: ci-info: | lo | True | 127.0.0.1 | 255.0.0.0 | host | . |
Oct 14 02:39:50 localhost cloud-init[917]: ci-info: | lo | True | ::1/128 | . | host | . |
Oct 14 02:39:50 localhost cloud-init[917]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Oct 14 02:39:50 localhost cloud-init[917]: ci-info: +++++++++++++++++++++++++++++++++Route IPv4 info+++++++++++++++++++++++++++++++++
Oct 14 02:39:50 localhost cloud-init[917]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Oct 14 02:39:50 localhost cloud-init[917]: ci-info: | Route | Destination | Gateway | Genmask | Interface | Flags |
Oct 14 02:39:50 localhost cloud-init[917]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Oct 14 02:39:50 localhost cloud-init[917]: ci-info: | 0 | 0.0.0.0 | 38.102.83.1 | 0.0.0.0 | eth0 | UG |
Oct 14 02:39:50 localhost cloud-init[917]: ci-info: | 1 | 38.102.83.0 | 0.0.0.0 | 255.255.255.0 | eth0 | U |
Oct 14 02:39:50 localhost cloud-init[917]: ci-info: | 2 | 169.254.169.254 | 38.102.83.126 | 255.255.255.255 | eth0 | UGH |
Oct 14 02:39:50 localhost cloud-init[917]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Oct 14 02:39:50 localhost cloud-init[917]: ci-info: +++++++++++++++++++Route IPv6 info+++++++++++++++++++
Oct 14 02:39:50 localhost cloud-init[917]: ci-info: +-------+-------------+---------+-----------+-------+
Oct 14 02:39:50 localhost cloud-init[917]: ci-info: | Route | Destination | Gateway | Interface | Flags |
Oct 14 02:39:50 localhost cloud-init[917]: ci-info: +-------+-------------+---------+-----------+-------+
Oct 14 02:39:50 localhost cloud-init[917]: ci-info: | 1 | fe80::/64 | :: | eth0 | U |
Oct 14 02:39:50 localhost cloud-init[917]: ci-info: | 3 | multicast | :: | eth0 | U |
Oct 14 02:39:51 localhost cloud-init[917]: ci-info: +-------+-------------+---------+-----------+-------+
Oct 14 02:39:51 localhost systemd[1]: Starting Authorization Manager...
Oct 14 02:39:51 localhost polkitd[1036]: Started polkitd version 0.117
Oct 14 02:39:51 localhost systemd[1]: Started Dynamic System Tuning Daemon.
Oct 14 02:39:51 localhost systemd[1]: Started Authorization Manager.
Oct 14 02:39:54 localhost cloud-init[917]: Generating public/private rsa key pair.
Oct 14 02:39:54 localhost cloud-init[917]: Your identification has been saved in /etc/ssh/ssh_host_rsa_key
Oct 14 02:39:54 localhost cloud-init[917]: Your public key has been saved in /etc/ssh/ssh_host_rsa_key.pub
Oct 14 02:39:54 localhost cloud-init[917]: The key fingerprint is:
Oct 14 02:39:54 localhost cloud-init[917]: SHA256:8/GLzF0xYEGc6JSjK+IUey0g69YWcAZKIs11gKQFDnw root@np0005486733.novalocal
Oct 14 02:39:54 localhost cloud-init[917]: The key's randomart image is:
Oct 14 02:39:54 localhost cloud-init[917]: +---[RSA 3072]----+
Oct 14 02:39:54 localhost cloud-init[917]: |+=ooo.. =o. |
Oct 14 02:39:54 localhost cloud-init[917]: |===E . = o. |
Oct 14 02:39:54 localhost cloud-init[917]: |=o.. + .o |
Oct 14 02:39:54 localhost cloud-init[917]: |. o = . .. . |
Oct 14 02:39:54 localhost cloud-init[917]: | * + .S.. o |
Oct 14 02:39:54 localhost cloud-init[917]: | . = + oo o o |
Oct 14 02:39:54 localhost cloud-init[917]: | . + + o . . . |
Oct 14 02:39:54 localhost cloud-init[917]: | o + o o o |
Oct 14 02:39:54 localhost cloud-init[917]: | . . + o |
Oct 14 02:39:54 localhost cloud-init[917]: +----[SHA256]-----+
Oct 14 02:39:54 localhost cloud-init[917]: Generating public/private ecdsa key pair.
Oct 14 02:39:54 localhost cloud-init[917]: Your identification has been saved in /etc/ssh/ssh_host_ecdsa_key
Oct 14 02:39:54 localhost cloud-init[917]: Your public key has been saved in /etc/ssh/ssh_host_ecdsa_key.pub
Oct 14 02:39:54 localhost cloud-init[917]: The key fingerprint is:
Oct 14 02:39:54 localhost cloud-init[917]: SHA256:h0gIXM7QidxqeFMecfsVh7ZPlMhGG1lnqbeldwfendA root@np0005486733.novalocal
Oct 14 02:39:54 localhost cloud-init[917]: The key's randomart image is:
Oct 14 02:39:54 localhost cloud-init[917]: +---[ECDSA 256]---+
Oct 14 02:39:54 localhost cloud-init[917]: | oo=oo. o+=o.o. |
Oct 14 02:39:54 localhost cloud-init[917]: | +=*o . B=oo. |
Oct 14 02:39:54 localhost cloud-init[917]: | . ++.o ooo .. |
Oct 14 02:39:54 localhost cloud-init[917]: |. = .. o o. o.oE.|
Oct 14 02:39:54 localhost cloud-init[917]: | o . . S .o o.*o|
Oct 14 02:39:54 localhost cloud-init[917]: | . . +.*|
Oct 14 02:39:54 localhost cloud-init[917]: | .o|
Oct 14 02:39:54 localhost cloud-init[917]: | |
Oct 14 02:39:54 localhost cloud-init[917]: | |
Oct 14 02:39:54 localhost cloud-init[917]: +----[SHA256]-----+
Oct 14 02:39:54 localhost cloud-init[917]: Generating public/private ed25519 key pair.
Oct 14 02:39:54 localhost cloud-init[917]: Your identification has been saved in /etc/ssh/ssh_host_ed25519_key
Oct 14 02:39:54 localhost cloud-init[917]: Your public key has been saved in /etc/ssh/ssh_host_ed25519_key.pub
Oct 14 02:39:54 localhost cloud-init[917]: The key fingerprint is:
Oct 14 02:39:54 localhost cloud-init[917]: SHA256:67ZIubzHjDHfDY1Bcp/TsE4bb4n1AzWNxqAiBdt6tLM root@np0005486733.novalocal
Oct 14 02:39:54 localhost cloud-init[917]: The key's randomart image is:
Oct 14 02:39:54 localhost cloud-init[917]: +--[ED25519 256]--+
Oct 14 02:39:54 localhost cloud-init[917]: | ... .o ..|
Oct 14 02:39:54 localhost cloud-init[917]: | +. o.. +o.|
Oct 14 02:39:54 localhost cloud-init[917]: | o ++.. *. .|
Oct 14 02:39:54 localhost cloud-init[917]: | + o. B.o |
Oct 14 02:39:54 localhost cloud-init[917]: | . S * B.o |
Oct 14 02:39:54 localhost cloud-init[917]: | oo +o = +..|
Oct 14 02:39:54 localhost cloud-init[917]: | oBE. o . .|
Oct 14 02:39:54 localhost cloud-init[917]: | o.+* . . |
Oct 14 02:39:54 localhost cloud-init[917]: | =+o. |
Oct 14 02:39:54 localhost cloud-init[917]: +----[SHA256]-----+
Oct 14 02:39:54 localhost systemd[1]: Finished Initial cloud-init job (metadata service crawler).
Oct 14 02:39:54 localhost systemd[1]: Reached target Cloud-config availability.
Oct 14 02:39:54 localhost systemd[1]: Reached target Network is Online.
Oct 14 02:39:54 localhost systemd[1]: Starting Apply the settings specified in cloud-config...
Oct 14 02:39:54 localhost systemd[1]: Run Insights Client at boot was skipped because of an unmet condition check (ConditionPathExists=/etc/insights-client/.run_insights_client_next_boot).
Oct 14 02:39:54 localhost systemd[1]: Starting Crash recovery kernel arming...
Oct 14 02:39:54 localhost systemd[1]: Starting Notify NFS peers of a restart...
Oct 14 02:39:54 localhost systemd[1]: Starting OpenSSH server daemon...
Oct 14 02:39:54 localhost sm-notify[1132]: Version 2.5.4 starting
Oct 14 02:39:54 localhost systemd[1]: Starting Permit User Sessions...
Oct 14 02:39:54 localhost systemd[1]: Started Notify NFS peers of a restart.
Oct 14 02:39:54 localhost sshd[1133]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 02:39:54 localhost systemd[1]: Started OpenSSH server daemon.
Oct 14 02:39:54 localhost systemd[1]: Finished Permit User Sessions.
Oct 14 02:39:54 localhost systemd[1]: Started Command Scheduler.
Oct 14 02:39:54 localhost systemd[1]: Started Getty on tty1.
Oct 14 02:39:54 localhost systemd[1]: Started Serial Getty on ttyS0.
Oct 14 02:39:54 localhost systemd[1]: Reached target Login Prompts.
Oct 14 02:39:54 localhost systemd[1]: Reached target Multi-User System.
Oct 14 02:39:54 localhost systemd[1]: Starting Record Runlevel Change in UTMP...
Oct 14 02:39:54 localhost systemd[1]: systemd-update-utmp-runlevel.service: Deactivated successfully.
Oct 14 02:39:54 localhost systemd[1]: Finished Record Runlevel Change in UTMP.
Oct 14 02:39:54 localhost kdumpctl[1139]: kdump: No kdump initial ramdisk found.
Oct 14 02:39:54 localhost kdumpctl[1139]: kdump: Rebuilding /boot/initramfs-5.14.0-284.11.1.el9_2.x86_64kdump.img
Oct 14 02:39:55 localhost cloud-init[1250]: Cloud-init v. 22.1-9.el9 running 'modules:config' at Tue, 14 Oct 2025 06:39:54 +0000. Up 11.18 seconds.
Oct 14 02:39:55 localhost systemd[1]: Finished Apply the settings specified in cloud-config.
Oct 14 02:39:55 localhost systemd[1]: Starting Execute cloud user/final scripts...
Oct 14 02:39:55 localhost dracut[1418]: dracut-057-21.git20230214.el9
Oct 14 02:39:55 localhost cloud-init[1433]: Cloud-init v. 22.1-9.el9 running 'modules:final' at Tue, 14 Oct 2025 06:39:55 +0000. Up 11.55 seconds.
Oct 14 02:39:55 localhost cloud-init[1437]: #############################################################
Oct 14 02:39:55 localhost cloud-init[1438]: -----BEGIN SSH HOST KEY FINGERPRINTS-----
Oct 14 02:39:55 localhost cloud-init[1440]: 256 SHA256:h0gIXM7QidxqeFMecfsVh7ZPlMhGG1lnqbeldwfendA root@np0005486733.novalocal (ECDSA)
Oct 14 02:39:55 localhost cloud-init[1442]: 256 SHA256:67ZIubzHjDHfDY1Bcp/TsE4bb4n1AzWNxqAiBdt6tLM root@np0005486733.novalocal (ED25519)
Oct 14 02:39:55 localhost cloud-init[1446]: 3072 SHA256:8/GLzF0xYEGc6JSjK+IUey0g69YWcAZKIs11gKQFDnw root@np0005486733.novalocal (RSA)
Oct 14 02:39:55 localhost cloud-init[1447]: -----END SSH HOST KEY FINGERPRINTS-----
Oct 14 02:39:55 localhost cloud-init[1449]: #############################################################
Oct 14 02:39:55 localhost dracut[1421]: Executing: /usr/bin/dracut --add kdumpbase --quiet --hostonly --hostonly-cmdline --hostonly-i18n --hostonly-mode strict --hostonly-nics -o "plymouth resume ifcfg earlykdump" --mount "/dev/disk/by-uuid/a3dd82de-ffc6-4652-88b9-80e003b8f20a /sysroot xfs rw,relatime,seclabel,attr2,inode64,logbufs=8,logbsize=32k,noquota" --squash-compressor zstd --no-hostonly-default-device -f /boot/initramfs-5.14.0-284.11.1.el9_2.x86_64kdump.img 5.14.0-284.11.1.el9_2.x86_64
Oct 14 02:39:55 localhost cloud-init[1433]: Cloud-init v. 22.1-9.el9 finished at Tue, 14 Oct 2025 06:39:55 +0000. Datasource DataSourceConfigDrive [net,ver=2][source=/dev/sr0]. Up 11.75 seconds
Oct 14 02:39:55 localhost chronyd[766]: Selected source 162.159.200.123 (2.rhel.pool.ntp.org)
Oct 14 02:39:55 localhost chronyd[766]: System clock TAI offset set to 37 seconds
Oct 14 02:39:55 localhost systemd[1]: Reloading Network Manager...
Oct 14 02:39:55 localhost NetworkManager[789]: [1760423995.6702] audit: op="reload" arg="0" pid=1519 uid=0 result="success"
Oct 14 02:39:55 localhost NetworkManager[789]: [1760423995.6709] config: signal: SIGHUP (no changes from disk)
Oct 14 02:39:55 localhost systemd[1]: Reloaded Network Manager.
Oct 14 02:39:55 localhost systemd[1]: Finished Execute cloud user/final scripts.
Oct 14 02:39:55 localhost systemd[1]: Reached target Cloud-init target.
Oct 14 02:39:55 localhost dracut[1421]: dracut module 'systemd-networkd' will not be installed, because command 'networkctl' could not be found!
Oct 14 02:39:55 localhost dracut[1421]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd' could not be found!
Oct 14 02:39:55 localhost dracut[1421]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd-wait-online' could not be found!
Oct 14 02:39:55 localhost dracut[1421]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Oct 14 02:39:55 localhost dracut[1421]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Oct 14 02:39:55 localhost sshd[1564]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 02:39:55 localhost dracut[1421]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Oct 14 02:39:55 localhost dracut[1421]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Oct 14 02:39:55 localhost sshd[1580]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 02:39:55 localhost dracut[1421]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Oct 14 02:39:55 localhost sshd[1594]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 02:39:55 localhost dracut[1421]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Oct 14 02:39:55 localhost dracut[1421]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Oct 14 02:39:55 localhost sshd[1604]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 02:39:55 localhost dracut[1421]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Oct 14 02:39:55 localhost dracut[1421]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Oct 14 02:39:55 localhost sshd[1611]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 02:39:55 localhost dracut[1421]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Oct 14 02:39:55 localhost dracut[1421]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Oct 14 02:39:55 localhost sshd[1623]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 02:39:55 localhost dracut[1421]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Oct 14 02:39:55 localhost dracut[1421]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Oct 14 02:39:55 localhost sshd[1638]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 02:39:55 localhost dracut[1421]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Oct 14 02:39:55 localhost dracut[1421]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Oct 14 02:39:55 localhost dracut[1421]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Oct 14 02:39:55 localhost sshd[1658]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 02:39:55 localhost dracut[1421]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Oct 14 02:39:55 localhost dracut[1421]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Oct 14 02:39:55 localhost sshd[1673]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 02:39:55 localhost dracut[1421]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Oct 14 02:39:55 localhost dracut[1421]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Oct 14 02:39:55 localhost dracut[1421]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Oct 14 02:39:56 localhost dracut[1421]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Oct 14 02:39:56 localhost dracut[1421]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Oct 14 02:39:56 localhost dracut[1421]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Oct 14 02:39:56 localhost dracut[1421]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Oct 14 02:39:56 localhost dracut[1421]: dracut module 'biosdevname' will not be installed, because command 'biosdevname' could not be found!
Oct 14 02:39:56 localhost dracut[1421]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Oct 14 02:39:56 localhost dracut[1421]: memstrack is not available
Oct 14 02:39:56 localhost dracut[1421]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Oct 14 02:39:56 localhost dracut[1421]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Oct 14 02:39:56 localhost dracut[1421]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Oct 14 02:39:56 localhost dracut[1421]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Oct 14 02:39:56 localhost dracut[1421]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Oct 14 02:39:56 localhost dracut[1421]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Oct 14 02:39:56 localhost dracut[1421]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Oct 14 02:39:56 localhost dracut[1421]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Oct 14 02:39:56 localhost dracut[1421]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Oct 14 02:39:56 localhost dracut[1421]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Oct 14 02:39:56 localhost dracut[1421]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Oct 14 02:39:56 localhost dracut[1421]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Oct 14 02:39:56 localhost dracut[1421]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Oct 14 02:39:56 localhost dracut[1421]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Oct 14 02:39:56 localhost dracut[1421]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Oct 14 02:39:56 localhost dracut[1421]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Oct 14 02:39:56 localhost dracut[1421]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Oct 14 02:39:56 localhost dracut[1421]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Oct 14 02:39:56 localhost dracut[1421]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Oct 14 02:39:56 localhost dracut[1421]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Oct 14 02:39:56 localhost dracut[1421]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Oct 14 02:39:56 localhost dracut[1421]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Oct 14 02:39:56 localhost dracut[1421]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Oct 14 02:39:56 localhost dracut[1421]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Oct 14 02:39:56 localhost dracut[1421]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Oct 14 02:39:56 localhost dracut[1421]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Oct 14 02:39:56 localhost dracut[1421]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Oct 14 02:39:56 localhost dracut[1421]: memstrack is not available
Oct 14 02:39:56 localhost dracut[1421]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Oct 14 02:39:56 localhost dracut[1421]: *** Including module: systemd ***
Oct 14 02:39:56 localhost dracut[1421]: *** Including module: systemd-initrd ***
Oct 14 02:39:57 localhost dracut[1421]: *** Including module: i18n ***
Oct 14 02:39:57 localhost dracut[1421]: No KEYMAP configured.
Oct 14 02:39:57 localhost dracut[1421]: *** Including module: drm ***
Oct 14 02:39:57 localhost dracut[1421]: *** Including module: prefixdevname ***
Oct 14 02:39:57 localhost dracut[1421]: *** Including module: kernel-modules ***
Oct 14 02:39:58 localhost dracut[1421]: *** Including module: kernel-modules-extra ***
Oct 14 02:39:58 localhost dracut[1421]: *** Including module: qemu ***
Oct 14 02:39:58 localhost dracut[1421]: *** Including module: fstab-sys ***
Oct 14 02:39:58 localhost dracut[1421]: *** Including module: rootfs-block ***
Oct 14 02:39:58 localhost dracut[1421]: *** Including module: terminfo ***
Oct 14 02:39:58 localhost dracut[1421]: *** Including module: udev-rules ***
Oct 14 02:39:58 localhost dracut[1421]: Skipping udev rule: 91-permissions.rules
Oct 14 02:39:58 localhost dracut[1421]: Skipping udev rule: 80-drivers-modprobe.rules
Oct 14 02:39:58 localhost dracut[1421]: *** Including module: virtiofs ***
Oct 14 02:39:58 localhost dracut[1421]: *** Including module: dracut-systemd ***
Oct 14 02:39:58 localhost dracut[1421]: *** Including module: usrmount ***
Oct 14 02:39:58 localhost dracut[1421]: *** Including module: base ***
Oct 14 02:39:59 localhost dracut[1421]: *** Including module: fs-lib ***
Oct 14 02:39:59 localhost dracut[1421]: *** Including module: kdumpbase ***
Oct 14 02:39:59 localhost dracut[1421]: *** Including module: microcode_ctl-fw_dir_override ***
Oct 14 02:39:59 localhost dracut[1421]: microcode_ctl module: mangling fw_dir
Oct 14 02:39:59 localhost dracut[1421]: microcode_ctl: processing data directory "/usr/share/microcode_ctl/ucode_with_caveats/intel"...
Oct 14 02:39:59 localhost dracut[1421]: microcode_ctl: configuration "intel" is ignored
Oct 14 02:39:59 localhost dracut[1421]: microcode_ctl: processing data directory "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-2d-07"...
Oct 14 02:39:59 localhost dracut[1421]: microcode_ctl: configuration "intel-06-2d-07" is ignored
Oct 14 02:39:59 localhost dracut[1421]: microcode_ctl: processing data directory "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4e-03"...
Oct 14 02:39:59 localhost dracut[1421]: microcode_ctl: configuration "intel-06-4e-03" is ignored
Oct 14 02:39:59 localhost dracut[1421]: microcode_ctl: processing data directory "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4f-01"...
Oct 14 02:39:59 localhost dracut[1421]: microcode_ctl: configuration "intel-06-4f-01" is ignored
Oct 14 02:39:59 localhost dracut[1421]: microcode_ctl: processing data directory "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-55-04"...
Oct 14 02:39:59 localhost dracut[1421]: microcode_ctl: configuration "intel-06-55-04" is ignored
Oct 14 02:39:59 localhost dracut[1421]: microcode_ctl: processing data directory "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-5e-03"...
Oct 14 02:39:59 localhost dracut[1421]: microcode_ctl: configuration "intel-06-5e-03" is ignored
Oct 14 02:39:59 localhost dracut[1421]: microcode_ctl: processing data directory "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8c-01"...
Oct 14 02:39:59 localhost dracut[1421]: microcode_ctl: configuration "intel-06-8c-01" is ignored
Oct 14 02:39:59 localhost dracut[1421]: microcode_ctl: processing data directory "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-0xca"...
Oct 14 02:39:59 localhost dracut[1421]: microcode_ctl: configuration "intel-06-8e-9e-0x-0xca" is ignored
Oct 14 02:39:59 localhost dracut[1421]: microcode_ctl: processing data directory "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-dell"...
Oct 14 02:39:59 localhost dracut[1421]: microcode_ctl: configuration "intel-06-8e-9e-0x-dell" is ignored
Oct 14 02:39:59 localhost dracut[1421]: microcode_ctl: final fw_dir: "/lib/firmware/updates/5.14.0-284.11.1.el9_2.x86_64 /lib/firmware/updates /lib/firmware/5.14.0-284.11.1.el9_2.x86_64 /lib/firmware"
Oct 14 02:39:59 localhost dracut[1421]: *** Including module: shutdown ***
Oct 14 02:39:59 localhost dracut[1421]: *** Including module: squash ***
Oct 14 02:39:59 localhost dracut[1421]: *** Including modules done ***
Oct 14 02:39:59 localhost dracut[1421]: *** Installing kernel module dependencies ***
Oct 14 02:40:00 localhost dracut[1421]: *** Installing kernel module dependencies done ***
Oct 14 02:40:00 localhost dracut[1421]: *** Resolving executable dependencies ***
Oct 14 02:40:00 localhost systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Oct 14 02:40:01 localhost dracut[1421]: *** Resolving executable dependencies done ***
Oct 14 02:40:01 localhost dracut[1421]: *** Hardlinking files ***
Oct 14 02:40:02 localhost dracut[1421]: Mode: real
Oct 14 02:40:02 localhost dracut[1421]: Files: 1099
Oct 14 02:40:02 localhost dracut[1421]: Linked: 3 files
Oct 14 02:40:02 localhost dracut[1421]: Compared: 0 xattrs
Oct 14 02:40:02 localhost dracut[1421]: Compared: 373 files
Oct 14 02:40:02 localhost dracut[1421]: Saved: 61.04 KiB
Oct 14 02:40:02 localhost dracut[1421]: Duration: 0.049830 seconds
Oct 14 02:40:02 localhost dracut[1421]: *** Hardlinking files done ***
Oct 14 02:40:02 localhost dracut[1421]: Could not find 'strip'. Not stripping the initramfs.
Oct 14 02:40:02 localhost dracut[1421]: *** Generating early-microcode cpio image ***
Oct 14 02:40:02 localhost dracut[1421]: *** Constructing AuthenticAMD.bin ***
Oct 14 02:40:02 localhost dracut[1421]: *** Store current command line parameters ***
Oct 14 02:40:02 localhost dracut[1421]: Stored kernel commandline:
Oct 14 02:40:02 localhost dracut[1421]: No dracut internal kernel commandline stored in the initramfs
Oct 14 02:40:02 localhost dracut[1421]: *** Install squash loader ***
Oct 14 02:40:02 localhost dracut[1421]: *** Squashing the files inside the initramfs ***
Oct 14 02:40:03 localhost dracut[1421]: *** Squashing the files inside the initramfs done ***
Oct 14 02:40:03 localhost dracut[1421]: *** Creating image file '/boot/initramfs-5.14.0-284.11.1.el9_2.x86_64kdump.img' ***
Oct 14 02:40:04 localhost dracut[1421]: *** Creating initramfs image file '/boot/initramfs-5.14.0-284.11.1.el9_2.x86_64kdump.img' done ***
Oct 14 02:40:04 localhost kdumpctl[1139]: kdump: kexec: loaded kdump kernel
Oct 14 02:40:04 localhost kdumpctl[1139]: kdump: Starting kdump: [OK]
Oct 14 02:40:04 localhost systemd[1]: Finished Crash recovery kernel arming.
Oct 14 02:40:04 localhost systemd[1]: Startup finished in 1.194s (kernel) + 2.066s (initrd) + 17.509s (userspace) = 20.770s.
Oct 14 02:40:20 localhost systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Oct 14 02:42:30 localhost systemd[1]: Unmounting EFI System Partition Automount...
Oct 14 02:42:30 localhost systemd[1]: efi.mount: Deactivated successfully.
Oct 14 02:42:30 localhost systemd[1]: Unmounted EFI System Partition Automount.
Oct 14 02:43:20 localhost sshd[4180]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 02:43:38 localhost sshd[4181]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 02:48:17 localhost sshd[4185]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 02:52:11 localhost sshd[4188]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 02:52:11 localhost systemd[1]: Created slice User Slice of UID 1000.
Oct 14 02:52:11 localhost systemd[1]: Starting User Runtime Directory /run/user/1000...
Oct 14 02:52:11 localhost systemd-logind[760]: New session 1 of user zuul.
Oct 14 02:52:11 localhost systemd[1]: Finished User Runtime Directory /run/user/1000.
Oct 14 02:52:11 localhost systemd[1]: Starting User Manager for UID 1000...
Oct 14 02:52:11 localhost systemd[4192]: Queued start job for default target Main User Target.
Oct 14 02:52:11 localhost systemd[4192]: Created slice User Application Slice.
Oct 14 02:52:11 localhost systemd[4192]: Started Mark boot as successful after the user session has run 2 minutes.
Oct 14 02:52:11 localhost systemd[4192]: Started Daily Cleanup of User's Temporary Directories.
Oct 14 02:52:11 localhost systemd[4192]: Reached target Paths.
Oct 14 02:52:11 localhost systemd[4192]: Reached target Timers.
Oct 14 02:52:11 localhost systemd[4192]: Starting D-Bus User Message Bus Socket...
Oct 14 02:52:11 localhost systemd[4192]: Starting Create User's Volatile Files and Directories...
Oct 14 02:52:11 localhost systemd[4192]: Listening on D-Bus User Message Bus Socket.
Oct 14 02:52:11 localhost systemd[4192]: Reached target Sockets.
Oct 14 02:52:11 localhost systemd[4192]: Finished Create User's Volatile Files and Directories.
Oct 14 02:52:11 localhost systemd[4192]: Reached target Basic System.
Oct 14 02:52:11 localhost systemd[4192]: Reached target Main User Target.
Oct 14 02:52:11 localhost systemd[4192]: Startup finished in 123ms.
Oct 14 02:52:11 localhost systemd[1]: Started User Manager for UID 1000.
Oct 14 02:52:11 localhost systemd[1]: Started Session 1 of User zuul.
Oct 14 02:52:11 localhost python3[4245]: ansible-setup Invoked with gather_subset=['!all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 14 02:52:24 localhost python3[4264]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 14 02:52:31 localhost python3[4316]: ansible-setup Invoked with gather_subset=['network'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 14 02:52:33 localhost python3[4346]: ansible-zuul_console Invoked with path=/tmp/console-{log_uuid}.log port=19885 state=present
Oct 14 02:52:36 localhost python3[4362]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDUv/ZB171sShkvmUwM4/A+38mOKHSoVqmUnoFRrcde+TmaD2jOKfnaBsMdk2YTdAdiPwM8PX7LYcOftZjXZ92Uqg/gQ0pshmFBVtIcoN0HEQlFtMQltRrBVPG+qHK5UOF2bUImKqqFx3uTPSmteSVgJtwvFqp/51YTUibYgQBWJPCcOSze95nxendWi6PoXzvorqCyVS44Llj4LmLChBJeqAI5cWs2EeDhQ4Tw8F33iKpBg8WjZAbQVbe2KIQYURMtANtjUJ0Yg5RTArSq57504iqodB4+ynahul8Dp5+TocLZTPu5orcqRGqWDe7CN5pc1eXZQuNNZ0jW59y52GY+ox+WCmp1qvB7TQzhc/r+kAVmT8VNTVUvC5TBGcIw3yxI7lzrd03zpenSL3oyJnFN4SXCeAA8YcXlz7ySaO9YAtbCSdkgj8QJCiykvalRm17F4d4aRX5+rtfEm+WG670vF6FRNNo5OTXTK2Ja84pej1bjzDBvEz81D1EqnHybfJ0= zuul-build-sshkey manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 14 02:52:36 localhost python3[4376]: ansible-file Invoked with state=directory path=/home/zuul/.ssh mode=448 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 02:52:37 localhost python3[4435]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 14 02:52:38 localhost python3[4476]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1760424757.6898775-394-207479539088520/source dest=/home/zuul/.ssh/id_rsa mode=384 force=False _original_basename=6c04c38dfe8e446399a5e5f9dbe4740b_id_rsa follow=False checksum=ca0549f1043aa781cfe5001a3649a4105abf4f82 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 02:52:39 localhost python3[4549]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa.pub follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 14 02:52:39 localhost python3[4590]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1760424759.2774718-494-266137178110341/source dest=/home/zuul/.ssh/id_rsa.pub mode=420 force=False _original_basename=6c04c38dfe8e446399a5e5f9dbe4740b_id_rsa.pub follow=False checksum=8b573aa2906c160b2f7b53c64bd37790afdd4394 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 02:52:41 localhost python3[4618]: ansible-ping Invoked with data=pong
Oct 14 02:52:43 localhost python3[4632]: ansible-setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 14 02:52:47 localhost python3[4684]: ansible-zuul_debug_info Invoked with ipv4_route_required=False ipv6_route_required=False image_manifest_files=['/etc/dib-builddate.txt', '/etc/image-hostname.txt'] image_manifest=None traceroute_host=None
Oct 14 02:52:50 localhost python3[4706]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 02:52:50 localhost python3[4720]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 02:52:50 localhost python3[4734]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 02:52:51 localhost python3[4748]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 02:52:52 localhost python3[4762]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 02:52:52 localhost python3[4776]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 02:52:54 localhost python3[4792]: ansible-file Invoked with path=/etc/ci state=directory owner=root group=root mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 02:52:56 localhost python3[4840]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/mirror_info.sh follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 14 02:52:56 localhost python3[4883]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/mirror_info.sh owner=root group=root mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1760424775.9858994-103-60447058887715/source follow=False _original_basename=mirror_info.sh.j2 checksum=92d92a03afdddee82732741071f662c729080c35 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 02:53:04 localhost python3[4911]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAABIwAAAQEA4Z/c9osaGGtU6X8fgELwfj/yayRurfcKA0HMFfdpPxev2dbwljysMuzoVp4OZmW1gvGtyYPSNRvnzgsaabPNKNo2ym5NToCP6UM+KSe93aln4BcM/24mXChYAbXJQ5Bqq/pIzsGs/pKetQN+vwvMxLOwTvpcsCJBXaa981RKML6xj9l/UZ7IIq1HSEKMvPLxZMWdu0Ut8DkCd5F4nOw9Wgml2uYpDCj5LLCrQQ9ChdOMz8hz6SighhNlRpPkvPaet3OXxr/ytFMu7j7vv06CaEnuMMiY2aTWN1Imin9eHAylIqFHta/3gFfQSWt9jXM7owkBLKL7ATzhaAn+fjNupw== arxcruz@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 14 02:53:04 localhost python3[4925]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQDS4Fn6k4deCnIlOtLWqZJyksbepjQt04j8Ed8CGx9EKkj0fKiAxiI4TadXQYPuNHMixZy4Nevjb6aDhL5Z906TfvNHKUrjrG7G26a0k8vdc61NEQ7FmcGMWRLwwc6ReDO7lFpzYKBMk4YqfWgBuGU/K6WLKiVW2cVvwIuGIaYrE1OiiX0iVUUk7KApXlDJMXn7qjSYynfO4mF629NIp8FJal38+Kv+HA+0QkE5Y2xXnzD4Lar5+keymiCHRntPppXHeLIRzbt0gxC7v3L72hpQ3BTBEzwHpeS8KY+SX1y5lRMN45thCHfJqGmARJREDjBvWG8JXOPmVIKQtZmVcD5b mandreou@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 14 02:53:04 localhost python3[4939]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC9MiLfy30deHA7xPOAlew5qUq3UP2gmRMYJi8PtkjFB20/DKeWwWNnkZPqP9AayruRoo51SIiVg870gbZE2jYl+Ncx/FYDe56JeC3ySZsXoAVkC9bP7gkOGqOmJjirvAgPMI7bogVz8i+66Q4Ar7OKTp3762G4IuWPPEg4ce4Y7lx9qWocZapHYq4cYKMxrOZ7SEbFSATBbe2bPZAPKTw8do/Eny+Hq/LkHFhIeyra6cqTFQYShr+zPln0Cr+ro/pDX3bB+1ubFgTpjpkkkQsLhDfR6cCdCWM2lgnS3BTtYj5Ct9/JRPR5YOphqZz+uB+OEu2IL68hmU9vNTth1KeX rlandy@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 14 02:53:05 localhost python3[4953]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIFCbgz8gdERiJlk2IKOtkjQxEXejrio6ZYMJAVJYpOIp raukadah@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 14 02:53:05 localhost python3[4967]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIBqb3Q/9uDf4LmihQ7xeJ9gA/STIQUFPSfyyV0m8AoQi bshewale@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 14 02:53:05 localhost python3[4981]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC0I8QqQx0Az2ysJt2JuffucLijhBqnsXKEIx5GyHwxVULROa8VtNFXUDH6ZKZavhiMcmfHB2+TBTda+lDP4FldYj06dGmzCY+IYGa+uDRdxHNGYjvCfLFcmLlzRK6fNbTcui+KlUFUdKe0fb9CRoGKyhlJD5GRkM1Dv+Yb6Bj+RNnmm1fVGYxzmrD2utvffYEb0SZGWxq2R9gefx1q/3wCGjeqvufEV+AskPhVGc5T7t9eyZ4qmslkLh1/nMuaIBFcr9AUACRajsvk6mXrAN1g3HlBf2gQlhi1UEyfbqIQvzzFtsbLDlSum/KmKjy818GzvWjERfQ0VkGzCd9bSLVL dviroel@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 14 02:53:05 localhost python3[4995]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDLOQd4ZLtkZXQGY6UwAr/06ppWQK4fDO3HaqxPk98csyOCBXsliSKK39Bso828+5srIXiW7aI6aC9P5mwi4mUZlGPfJlQbfrcGvY+b/SocuvaGK+1RrHLoJCT52LBhwgrzlXio2jeksZeein8iaTrhsPrOAs7KggIL/rB9hEiB3NaOPWhhoCP4vlW6MEMExGcqB/1FVxXFBPnLkEyW0Lk7ycVflZl2ocRxbfjZi0+tI1Wlinp8PvSQSc/WVrAcDgKjc/mB4ODPOyYy3G8FHgfMsrXSDEyjBKgLKMsdCrAUcqJQWjkqXleXSYOV4q3pzL+9umK+q/e3P/bIoSFQzmJKTU1eDfuvPXmow9F5H54fii/Da7ezlMJ+wPGHJrRAkmzvMbALy7xwswLhZMkOGNtRcPqaKYRmIBKpw3o6bCTtcNUHOtOQnzwY8JzrM2eBWJBXAANYw+9/ho80JIiwhg29CFNpVBuHbql2YxJQNrnl90guN65rYNpDxdIluweyUf8= anbanerj@kaermorhen manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 14 02:53:06 localhost python3[5009]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC3VwV8Im9kRm49lt3tM36hj4Zv27FxGo4C1Q/0jqhzFmHY7RHbmeRr8ObhwWoHjXSozKWg8FL5ER0z3hTwL0W6lez3sL7hUaCmSuZmG5Hnl3x4vTSxDI9JZ/Y65rtYiiWQo2fC5xJhU/4+0e5e/pseCm8cKRSu+SaxhO+sd6FDojA2x1BzOzKiQRDy/1zWGp/cZkxcEuB1wHI5LMzN03c67vmbu+fhZRAUO4dQkvcnj2LrhQtpa+ytvnSjr8icMDosf1OsbSffwZFyHB/hfWGAfe0eIeSA2XPraxiPknXxiPKx2MJsaUTYbsZcm3EjFdHBBMumw5rBI74zLrMRvCO9GwBEmGT4rFng1nP+yw5DB8sn2zqpOsPg1LYRwCPOUveC13P6pgsZZPh812e8v5EKnETct+5XI3dVpdw6CnNiLwAyVAF15DJvBGT/u1k0Myg/bQn+Gv9k2MSj6LvQmf6WbZu2Wgjm30z3FyCneBqTL7mLF19YXzeC0ufHz5pnO1E= dasm@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 14 02:53:06 localhost python3[5023]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIHUnwjB20UKmsSed9X73eGNV5AOEFccQ3NYrRW776pEk cjeanner manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 14 02:53:06 localhost python3[5037]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDercCMGn8rW1C4P67tHgtflPdTeXlpyUJYH+6XDd2lR jgilaber@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 14 02:53:07 localhost python3[5051]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIAMI6kkg9Wg0sG7jIJmyZemEBwUn1yzNpQQd3gnulOmZ adrianfuscoarnejo@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 14 02:53:07 localhost python3[5065]: ansible-authorized_key Invoked with user=zuul state=present key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPijwpQu/3jhhhBZInXNOLEH57DrknPc3PLbsRvYyJIFzwYjX+WD4a7+nGnMYS42MuZk6TJcVqgnqofVx4isoD4= ramishra@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 14 02:53:07 localhost python3[5079]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIGpU/BepK3qX0NRf5Np+dOBDqzQEefhNrw2DCZaH3uWW rebtoor@monolith manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 14 02:53:07 localhost python3[5093]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDK0iKdi8jQTpQrDdLVH/AAgLVYyTXF7AQ1gjc/5uT3t ykarel@yatinkarel manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 14 02:53:08 localhost python3[5107]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIF/V/cLotA6LZeO32VL45Hd78skuA2lJA425Sm2LlQeZ fmount@horcrux manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 14 02:53:08 localhost python3[5121]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDa7QCjuDMVmRPo1rREbGwzYeBCYVN+Ou/3WKXZEC6Sr manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 14 02:53:08 localhost python3[5135]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQCfNtF7NvKl915TGsGGoseUb06Hj8L/S4toWf0hExeY+F00woL6NvBlJD0nDct+P5a22I4EhvoQCRQ8reaPCm1lybR3uiRIJsj+8zkVvLwby9LXzfZorlNG9ofjd00FEmB09uW/YvTl6Q9XwwwX6tInzIOv3TMqTHHGOL74ibbj8J/FJR0cFEyj0z4WQRvtkh32xAHl83gbuINryMt0sqRI+clj2381NKL55DRLQrVw0gsfqqxiHAnXg21qWmc4J+b9e9kiuAFQjcjwTVkwJCcg3xbPwC/qokYRby/Y5S40UUd7/jEARGXT7RZgpzTuDd1oZiCVrnrqJNPaMNdVv5MLeFdf1B7iIe5aa/fGouX7AO4SdKhZUdnJmCFAGvjC6S3JMZ2wAcUl+OHnssfmdj7XL50cLo27vjuzMtLAgSqi6N99m92WCF2s8J9aVzszX7Xz9OKZCeGsiVJp3/NdABKzSEAyM9xBD/5Vho894Sav+otpySHe3p6RUTgbB5Zu8VyZRZ/UtB3ueXxyo764yrc6qWIDqrehm84Xm9g+/jpIBzGPl07NUNJpdt/6Sgf9RIKXw/7XypO5yZfUcuFNGTxLfqjTNrtgLZNcjfav6sSdVXVcMPL//XNuRdKmVFaO76eV/oGMQGr1fGcCD+N+CpI7+Q+fCNB6VFWG4nZFuI/Iuw== averdagu@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 14 02:53:08 localhost python3[5149]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDq8l27xI+QlQVdS4djp9ogSoyrNE2+Ox6vKPdhSNL1J3PE5w+WCSvMz9A5gnNuH810zwbekEApbxTze/gLQJwBHA52CChfURpXrFaxY7ePXRElwKAL3mJfzBWY/c5jnNL9TCVmFJTGZkFZP3Nh+BMgZvL6xBkt3WKm6Uq18qzd9XeKcZusrA+O+uLv1fVeQnadY9RIqOCyeFYCzLWrUfTyE8x/XG0hAWIM7qpnF2cALQS2h9n4hW5ybiUN790H08wf9hFwEf5nxY9Z9dVkPFQiTSGKNBzmnCXU9skxS/xhpFjJ5duGSZdtAHe9O+nGZm9c67hxgtf8e5PDuqAdXEv2cf6e3VBAt+Bz8EKI3yosTj0oZHfwr42Yzb1l/SKy14Rggsrc9KAQlrGXan6+u2jcQqqx7l+SWmnpFiWTV9u5cWj2IgOhApOitmRBPYqk9rE2usfO0hLn/Pj/R/Nau4803e1/EikdLE7Ps95s9mX5jRDjAoUa2JwFF5RsVFyL910= ashigupt@ashigupt.remote.csb manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 14 02:53:09 localhost python3[5163]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIOKLl0NYKwoZ/JY5KeZU8VwRAggeOxqQJeoqp3dsAaY9 manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 14 02:53:09 localhost python3[5177]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIASASQOH2BcOyLKuuDOdWZlPi2orcjcA8q4400T73DLH evallesp@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 14 02:53:09 localhost python3[5191]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAILeBWlamUph+jRKV2qrx1PGU7vWuGIt5+z9k96I8WehW amsinha@amsinha-mac manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 14 02:53:10 localhost python3[5205]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIANvVgvJBlK3gb1yz5uef/JqIGq4HLEmY2dYA8e37swb morenod@redhat-laptop manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 14 02:53:10 localhost python3[5219]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQDZdI7t1cxYx65heVI24HTV4F7oQLW1zyfxHreL2TIJKxjyrUUKIFEUmTutcBlJRLNT2Eoix6x1sOw9YrchloCLcn//SGfTElr9mSc5jbjb7QXEU+zJMhtxyEJ1Po3CUGnj7ckiIXw7wcawZtrEOAQ9pH3ExYCJcEMiyNjRQZCxT3tPK+S4B95EWh5Fsrz9CkwpjNRPPH7LigCeQTM3Wc7r97utAslBUUvYceDSLA7rMgkitJE38b7rZBeYzsGQ8YYUBjTCtehqQXxCRjizbHWaaZkBU+N3zkKB6n/iCNGIO690NK7A/qb6msTijiz1PeuM8ThOsi9qXnbX5v0PoTpcFSojV7NHAQ71f0XXuS43FhZctT+Dcx44dT8Fb5vJu2cJGrk+qF8ZgJYNpRS7gPg0EG2EqjK7JMf9ULdjSu0r+KlqIAyLvtzT4eOnQipoKlb/WG5D/0ohKv7OMQ352ggfkBFIQsRXyyTCT98Ft9juqPuahi3CAQmP4H9dyE+7+Kz437PEtsxLmfm6naNmWi7Ee1DqWPwS8rEajsm4sNM4wW9gdBboJQtc0uZw0DfLj1I9r3Mc8Ol0jYtz0yNQDSzVLrGCaJlC311trU70tZ+ZkAVV6Mn8lOhSbj1cK0lvSr6ZK4dgqGl3I1eTZJJhbLNdg7UOVaiRx9543+C/p/As7w== brjackma@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 14 02:53:10 localhost python3[5233]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIKwedoZ0TWPJX/z/4TAbO/kKcDZOQVgRH0hAqrL5UCI1
vcastell@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None Oct 14 02:53:10 localhost python3[5247]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIEmv8sE8GCk6ZTPIqF0FQrttBdL3mq7rCm/IJy0xDFh7 michburk@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None Oct 14 02:53:11 localhost python3[5261]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAICy6GpGEtwevXEEn4mmLR5lmSLe23dGgAvzkB9DMNbkf rsafrono@rsafrono manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None Oct 14 02:53:12 localhost python3[5277]: ansible-community.general.timezone Invoked with name=UTC hwclock=None Oct 14 02:53:12 localhost systemd[1]: Starting Time & Date Service... Oct 14 02:53:12 localhost systemd[1]: Started Time & Date Service. Oct 14 02:53:12 localhost systemd-timedated[5279]: Changed time zone to 'UTC' (UTC). 
Oct 14 02:53:13 localhost python3[5298]: ansible-file Invoked with path=/etc/nodepool state=directory mode=511 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 02:53:15 localhost python3[5344]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 14 02:53:15 localhost python3[5385]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes src=/home/zuul/.ansible/tmp/ansible-tmp-1760424794.7750154-499-149548107281415/source _original_basename=tmp__j1ff5s follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 02:53:16 localhost python3[5445]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 14 02:53:16 localhost python3[5486]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes_private src=/home/zuul/.ansible/tmp/ansible-tmp-1760424796.3270578-591-8098791883027/source _original_basename=tmposkcna2p follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 02:53:18 localhost python3[5548]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/node_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 14 02:53:19 localhost python3[5591]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/node_private src=/home/zuul/.ansible/tmp/ansible-tmp-1760424798.414809-733-83017023308743/source _original_basename=tmpa08lpbjq follow=False checksum=8298daad36b04db6ae7a2e2e919ac6744ee002c9 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 02:53:20 localhost python3[5619]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa /etc/nodepool/id_rsa zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 02:53:20 localhost python3[5635]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa.pub /etc/nodepool/id_rsa.pub zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 02:53:21 localhost python3[5685]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/zuul-sudo-grep follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 14 02:53:21 localhost python3[5728]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/zuul-sudo-grep mode=288 src=/home/zuul/.ansible/tmp/ansible-tmp-1760424801.341867-858-121272470165193/source _original_basename=tmp_5iz048k follow=False checksum=bdca1a77493d00fb51567671791f4aa30f66c2f0 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 02:53:23 localhost python3[5759]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/visudo -c zuul_log_id=fa163ef9-e89a-51fb-2668-000000000023-1-overcloudnovacompute2 zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 02:53:24 localhost python3[5777]: ansible-ansible.legacy.command Invoked with executable=/bin/bash _raw_params=env#012 _uses_shell=True zuul_log_id=fa163ef9-e89a-51fb-2668-000000000024-1-overcloudnovacompute2 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None creates=None removes=None stdin=None
Oct 14 02:53:26 localhost python3[5795]: ansible-file Invoked with path=/home/zuul/workspace state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 02:53:42 localhost systemd[1]: systemd-timedated.service: Deactivated successfully.
Oct 14 02:53:44 localhost python3[5813]: ansible-ansible.builtin.file Invoked with path=/etc/ci/env state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 02:54:30 localhost systemd[4192]: Starting Mark boot as successful...
Oct 14 02:54:30 localhost systemd[4192]: Finished Mark boot as successful.
Oct 14 02:54:45 localhost systemd[1]: Starting Cleanup of Temporary Directories...
Oct 14 02:54:45 localhost systemd-logind[760]: Session 1 logged out. Waiting for processes to exit.
Oct 14 02:54:45 localhost systemd[1]: systemd-tmpfiles-clean.service: Deactivated successfully.
Oct 14 02:54:45 localhost systemd[1]: Finished Cleanup of Temporary Directories.
Oct 14 02:54:45 localhost systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dclean.service.mount: Deactivated successfully.
Oct 14 02:57:30 localhost systemd[4192]: Created slice User Background Tasks Slice.
Oct 14 02:57:30 localhost systemd[4192]: Starting Cleanup of User's Temporary Files and Directories...
Oct 14 02:57:30 localhost systemd[4192]: Finished Cleanup of User's Temporary Files and Directories.
Oct 14 02:57:52 localhost kernel: pci 0000:00:07.0: [1af4:1000] type 00 class 0x020000
Oct 14 02:57:52 localhost kernel: pci 0000:00:07.0: reg 0x10: [io 0x0000-0x003f]
Oct 14 02:57:52 localhost kernel: pci 0000:00:07.0: reg 0x14: [mem 0x00000000-0x00000fff]
Oct 14 02:57:52 localhost kernel: pci 0000:00:07.0: reg 0x20: [mem 0x00000000-0x00003fff 64bit pref]
Oct 14 02:57:52 localhost kernel: pci 0000:00:07.0: reg 0x30: [mem 0x00000000-0x0007ffff pref]
Oct 14 02:57:52 localhost kernel: pci 0000:00:07.0: BAR 6: assigned [mem 0xc0000000-0xc007ffff pref]
Oct 14 02:57:52 localhost kernel: pci 0000:00:07.0: BAR 4: assigned [mem 0x440000000-0x440003fff 64bit pref]
Oct 14 02:57:52 localhost kernel: pci 0000:00:07.0: BAR 1: assigned [mem 0xc0080000-0xc0080fff]
Oct 14 02:57:52 localhost kernel: pci 0000:00:07.0: BAR 0: assigned [io 0x1000-0x103f]
Oct 14 02:57:52 localhost kernel: virtio-pci 0000:00:07.0: enabling device (0000 -> 0003)
Oct 14 02:57:52 localhost NetworkManager[789]: [1760425072.7225] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Oct 14 02:57:52 localhost systemd-udevd[5822]: Network interface NamePolicy= disabled on kernel command line.
Oct 14 02:57:52 localhost NetworkManager[789]: [1760425072.7381] device (eth1): state change: unmanaged -> unavailable (reason 'managed', sys-iface-state: 'external')
Oct 14 02:57:52 localhost NetworkManager[789]: [1760425072.7429] settings: (eth1): created default wired connection 'Wired connection 1'
Oct 14 02:57:52 localhost NetworkManager[789]: [1760425072.7435] device (eth1): carrier: link connected
Oct 14 02:57:52 localhost NetworkManager[789]: [1760425072.7439] device (eth1): state change: unavailable -> disconnected (reason 'carrier-changed', sys-iface-state: 'managed')
Oct 14 02:57:52 localhost NetworkManager[789]: [1760425072.7447] policy: auto-activating connection 'Wired connection 1' (7d00badf-b811-3c0c-86ef-e88a614366e0)
Oct 14 02:57:52 localhost NetworkManager[789]: [1760425072.7454] device (eth1): Activation: starting connection 'Wired connection 1' (7d00badf-b811-3c0c-86ef-e88a614366e0)
Oct 14 02:57:52 localhost NetworkManager[789]: [1760425072.7456] device (eth1): state change: disconnected -> prepare (reason 'none', sys-iface-state: 'managed')
Oct 14 02:57:52 localhost NetworkManager[789]: [1760425072.7464] device (eth1): state change: prepare -> config (reason 'none', sys-iface-state: 'managed')
Oct 14 02:57:52 localhost NetworkManager[789]: [1760425072.7472] device (eth1): state change: config -> ip-config (reason 'none', sys-iface-state: 'managed')
Oct 14 02:57:52 localhost NetworkManager[789]: [1760425072.7480] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Oct 14 02:57:53 localhost sshd[5824]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 02:57:53 localhost kernel: IPv6: ADDRCONF(NETDEV_CHANGE): eth1: link becomes ready
Oct 14 02:57:53 localhost systemd-logind[760]: New session 3 of user zuul.
Oct 14 02:57:53 localhost systemd[1]: Started Session 3 of User zuul.
Oct 14 02:57:54 localhost python3[5841]: ansible-ansible.legacy.command Invoked with _raw_params=ip -j link zuul_log_id=fa163ef9-e89a-99d4-3d1d-000000000475-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 02:58:07 localhost python3[5892]: ansible-ansible.legacy.stat Invoked with path=/etc/NetworkManager/system-connections/ci-private-network.nmconnection follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 14 02:58:07 localhost python3[5935]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1760425087.0843246-537-136962058012191/source dest=/etc/NetworkManager/system-connections/ci-private-network.nmconnection mode=0600 owner=root group=root follow=False _original_basename=bootstrap-ci-network-nm-connection.nmconnection.j2 checksum=f959702102de964ed0295a8d76011a0e7edb4d64 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 02:58:08 localhost python3[5965]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct 14 02:58:08 localhost systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Oct 14 02:58:08 localhost systemd[1]: Stopped Network Manager Wait Online.
Oct 14 02:58:08 localhost systemd[1]: Stopping Network Manager Wait Online...
Oct 14 02:58:08 localhost systemd[1]: Stopping Network Manager...
Oct 14 02:58:08 localhost NetworkManager[789]: [1760425088.3072] caught SIGTERM, shutting down normally.
Oct 14 02:58:08 localhost NetworkManager[789]: [1760425088.3198] dhcp4 (eth0): canceled DHCP transaction
Oct 14 02:58:08 localhost NetworkManager[789]: [1760425088.3199] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Oct 14 02:58:08 localhost NetworkManager[789]: [1760425088.3199] dhcp4 (eth0): state changed no lease
Oct 14 02:58:08 localhost NetworkManager[789]: [1760425088.3209] manager: NetworkManager state is now CONNECTING
Oct 14 02:58:08 localhost systemd[1]: Starting Network Manager Script Dispatcher Service...
Oct 14 02:58:08 localhost NetworkManager[789]: [1760425088.3307] dhcp4 (eth1): canceled DHCP transaction
Oct 14 02:58:08 localhost NetworkManager[789]: [1760425088.3308] dhcp4 (eth1): state changed no lease
Oct 14 02:58:08 localhost NetworkManager[789]: [1760425088.3371] exiting (success)
Oct 14 02:58:08 localhost systemd[1]: Started Network Manager Script Dispatcher Service.
Oct 14 02:58:08 localhost systemd[1]: NetworkManager.service: Deactivated successfully.
Oct 14 02:58:08 localhost systemd[1]: Stopped Network Manager.
Oct 14 02:58:08 localhost systemd[1]: NetworkManager.service: Consumed 6.101s CPU time.
Oct 14 02:58:08 localhost systemd[1]: Starting Network Manager...
Oct 14 02:58:08 localhost NetworkManager[5977]: [1760425088.3916] NetworkManager (version 1.42.2-1.el9) is starting... (after a restart, boot:6bfe016f-8d50-419d-b828-da4460617f42)
Oct 14 02:58:08 localhost NetworkManager[5977]: [1760425088.3918] Read config: /etc/NetworkManager/NetworkManager.conf (run: 15-carrier-timeout.conf)
Oct 14 02:58:08 localhost systemd[1]: Started Network Manager.
Oct 14 02:58:08 localhost NetworkManager[5977]: [1760425088.3945] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Oct 14 02:58:08 localhost systemd[1]: Starting Network Manager Wait Online...
Oct 14 02:58:08 localhost NetworkManager[5977]: [1760425088.4000] manager[0x560cec888090]: monitoring kernel firmware directory '/lib/firmware'.
Oct 14 02:58:08 localhost systemd[1]: Starting Hostname Service...
Oct 14 02:58:08 localhost systemd[1]: Started Hostname Service.
Oct 14 02:58:08 localhost NetworkManager[5977]: [1760425088.4684] hostname: hostname: using hostnamed
Oct 14 02:58:08 localhost NetworkManager[5977]: [1760425088.4685] hostname: static hostname changed from (none) to "np0005486733.novalocal"
Oct 14 02:58:08 localhost NetworkManager[5977]: [1760425088.4691] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Oct 14 02:58:08 localhost NetworkManager[5977]: [1760425088.4698] manager[0x560cec888090]: rfkill: Wi-Fi hardware radio set enabled
Oct 14 02:58:08 localhost NetworkManager[5977]: [1760425088.4699] manager[0x560cec888090]: rfkill: WWAN hardware radio set enabled
Oct 14 02:58:08 localhost NetworkManager[5977]: [1760425088.4741] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.42.2-1.el9/libnm-device-plugin-team.so)
Oct 14 02:58:08 localhost NetworkManager[5977]: [1760425088.4741] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Oct 14 02:58:08 localhost NetworkManager[5977]: [1760425088.4742] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Oct 14 02:58:08 localhost NetworkManager[5977]: [1760425088.4743] manager: Networking is enabled by state file
Oct 14 02:58:08 localhost NetworkManager[5977]: [1760425088.4750] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.42.2-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Oct 14 02:58:08 localhost NetworkManager[5977]: [1760425088.4755] settings: Loaded settings plugin: keyfile (internal)
Oct 14 02:58:08 localhost NetworkManager[5977]: [1760425088.4795] dhcp: init: Using DHCP client 'internal'
Oct 14 02:58:08 localhost NetworkManager[5977]: [1760425088.4801] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Oct 14 02:58:08 localhost NetworkManager[5977]: [1760425088.4809] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', sys-iface-state: 'external')
Oct 14 02:58:08 localhost NetworkManager[5977]: [1760425088.4819] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', sys-iface-state: 'external')
Oct 14 02:58:08 localhost NetworkManager[5977]: [1760425088.4835] device (lo): Activation: starting connection 'lo' (eaf496c7-1f18-48c4-bdf3-53eb5b9ead4a)
Oct 14 02:58:08 localhost NetworkManager[5977]: [1760425088.4846] device (eth0): carrier: link connected
Oct 14 02:58:08 localhost NetworkManager[5977]: [1760425088.4852] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Oct 14 02:58:08 localhost NetworkManager[5977]: [1760425088.4861] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Oct 14 02:58:08 localhost NetworkManager[5977]: [1760425088.4861] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', sys-iface-state: 'assume')
Oct 14 02:58:08 localhost NetworkManager[5977]: [1760425088.4875] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', sys-iface-state: 'assume')
Oct 14 02:58:08 localhost NetworkManager[5977]: [1760425088.4885] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Oct 14 02:58:08 localhost NetworkManager[5977]: [1760425088.4892] device (eth1): carrier: link connected
Oct 14 02:58:08 localhost NetworkManager[5977]: [1760425088.4899] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Oct 14 02:58:08 localhost NetworkManager[5977]: [1760425088.4910] manager: (eth1): assume: will attempt to assume matching connection 'Wired connection 1' (7d00badf-b811-3c0c-86ef-e88a614366e0) (indicated)
Oct 14 02:58:08 localhost NetworkManager[5977]: [1760425088.4911] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', sys-iface-state: 'assume')
Oct 14 02:58:08 localhost NetworkManager[5977]: [1760425088.4920] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', sys-iface-state: 'assume')
Oct 14 02:58:08 localhost NetworkManager[5977]: [1760425088.4931] device (eth1): Activation: starting connection 'Wired connection 1' (7d00badf-b811-3c0c-86ef-e88a614366e0)
Oct 14 02:58:08 localhost NetworkManager[5977]: [1760425088.4960] device (lo): state change: disconnected -> prepare (reason 'none', sys-iface-state: 'external')
Oct 14 02:58:08 localhost NetworkManager[5977]: [1760425088.4964] device (lo): state change: prepare -> config (reason 'none', sys-iface-state: 'external')
Oct 14 02:58:08 localhost NetworkManager[5977]: [1760425088.4967] device (lo): state change: config -> ip-config (reason 'none', sys-iface-state: 'external')
Oct 14 02:58:08 localhost NetworkManager[5977]: [1760425088.4970] device (eth0): state change: disconnected -> prepare (reason 'none', sys-iface-state: 'assume')
Oct 14 02:58:08 localhost NetworkManager[5977]: [1760425088.4973] device (eth0): state change: prepare -> config (reason 'none', sys-iface-state: 'assume')
Oct 14 02:58:08 localhost NetworkManager[5977]: [1760425088.4977] device (eth1): state change: disconnected -> prepare (reason 'none', sys-iface-state: 'assume')
Oct 14 02:58:08 localhost NetworkManager[5977]: [1760425088.4980] device (eth1): state change: prepare -> config (reason 'none', sys-iface-state: 'assume')
Oct 14 02:58:08 localhost NetworkManager[5977]: [1760425088.5006] device (lo): state change: ip-config -> ip-check (reason 'none', sys-iface-state: 'external')
Oct 14 02:58:08 localhost NetworkManager[5977]: [1760425088.5014] device (eth0): state change: config -> ip-config (reason 'none', sys-iface-state: 'assume')
Oct 14 02:58:08 localhost NetworkManager[5977]: [1760425088.5019] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Oct 14 02:58:08 localhost NetworkManager[5977]: [1760425088.5030] device (eth1): state change: config -> ip-config (reason 'none', sys-iface-state: 'assume')
Oct 14 02:58:08 localhost NetworkManager[5977]: [1760425088.5033] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Oct 14 02:58:08 localhost NetworkManager[5977]: [1760425088.5054] device (lo): state change: ip-check -> secondaries (reason 'none', sys-iface-state: 'external')
Oct 14 02:58:08 localhost NetworkManager[5977]: [1760425088.5061] device (lo): state change: secondaries -> activated (reason 'none', sys-iface-state: 'external')
Oct 14 02:58:08 localhost NetworkManager[5977]: [1760425088.5070] device (lo): Activation: successful, device activated.
Oct 14 02:58:08 localhost NetworkManager[5977]: [1760425088.5082] dhcp4 (eth0): state changed new lease, address=38.102.83.143
Oct 14 02:58:08 localhost NetworkManager[5977]: [1760425088.5088] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Oct 14 02:58:08 localhost NetworkManager[5977]: [1760425088.5170] device (eth0): state change: ip-config -> ip-check (reason 'none', sys-iface-state: 'assume')
Oct 14 02:58:08 localhost NetworkManager[5977]: [1760425088.5215] device (eth0): state change: ip-check -> secondaries (reason 'none', sys-iface-state: 'assume')
Oct 14 02:58:08 localhost NetworkManager[5977]: [1760425088.5218] device (eth0): state change: secondaries -> activated (reason 'none', sys-iface-state: 'assume')
Oct 14 02:58:08 localhost NetworkManager[5977]: [1760425088.5224] manager: NetworkManager state is now CONNECTED_SITE
Oct 14 02:58:08 localhost NetworkManager[5977]: [1760425088.5231] device (eth0): Activation: successful, device activated.
Oct 14 02:58:08 localhost NetworkManager[5977]: [1760425088.5242] manager: NetworkManager state is now CONNECTED_GLOBAL
Oct 14 02:58:08 localhost python3[6039]: ansible-ansible.legacy.command Invoked with _raw_params=ip route zuul_log_id=fa163ef9-e89a-99d4-3d1d-000000000136-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 02:58:18 localhost systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Oct 14 02:58:38 localhost systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Oct 14 02:58:53 localhost NetworkManager[5977]: [1760425133.8083] device (eth1): state change: ip-config -> ip-check (reason 'none', sys-iface-state: 'assume')
Oct 14 02:58:53 localhost systemd[1]: Starting Network Manager Script Dispatcher Service...
Oct 14 02:58:53 localhost systemd[1]: Started Network Manager Script Dispatcher Service.
Oct 14 02:58:53 localhost NetworkManager[5977]: [1760425133.8332] device (eth1): state change: ip-check -> secondaries (reason 'none', sys-iface-state: 'assume')
Oct 14 02:58:53 localhost NetworkManager[5977]: [1760425133.8335] device (eth1): state change: secondaries -> activated (reason 'none', sys-iface-state: 'assume')
Oct 14 02:58:53 localhost NetworkManager[5977]: [1760425133.8344] device (eth1): Activation: successful, device activated.
Oct 14 02:58:53 localhost NetworkManager[5977]: [1760425133.8353] manager: startup complete
Oct 14 02:58:53 localhost systemd[1]: Finished Network Manager Wait Online.
Oct 14 02:59:03 localhost systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Oct 14 02:59:08 localhost systemd[1]: session-3.scope: Deactivated successfully.
Oct 14 02:59:08 localhost systemd[1]: session-3.scope: Consumed 1.474s CPU time.
Oct 14 02:59:08 localhost systemd-logind[760]: Session 3 logged out. Waiting for processes to exit.
Oct 14 02:59:08 localhost systemd-logind[760]: Removed session 3.
Oct 14 02:59:28 localhost sshd[6065]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 02:59:28 localhost systemd-logind[760]: New session 4 of user zuul.
Oct 14 02:59:28 localhost systemd[1]: Started Session 4 of User zuul.
Oct 14 02:59:29 localhost python3[6116]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/env/networking-info.yml follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 14 02:59:29 localhost python3[6159]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/env/networking-info.yml owner=root group=root mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1760425168.8807044-628-25661556715602/source _original_basename=tmp4e189gck follow=False checksum=abc21b3971e70fb47653ad1df5ee2cc661041e3d backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 02:59:31 localhost systemd[1]: session-4.scope: Deactivated successfully.
Oct 14 02:59:31 localhost systemd-logind[760]: Session 4 logged out. Waiting for processes to exit.
Oct 14 02:59:31 localhost systemd-logind[760]: Removed session 4.
Oct 14 03:05:02 localhost sshd[6190]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 03:05:30 localhost sshd[6193]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 03:05:30 localhost systemd-logind[760]: New session 5 of user zuul.
Oct 14 03:05:30 localhost systemd[1]: Started Session 5 of User zuul.
Oct 14 03:05:30 localhost python3[6212]: ansible-ansible.legacy.command Invoked with _raw_params=lsblk -nd -o MAJ:MIN /dev/vda#012 _uses_shell=True zuul_log_id=fa163ef9-e89a-72d1-c0ab-000000001d20-1-overcloudnovacompute2 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 03:05:31 localhost python3[6231]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/init.scope state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 03:05:32 localhost python3[6247]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/machine.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 03:05:32 localhost python3[6263]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/system.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 03:05:32 localhost python3[6279]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/user.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 03:05:33 localhost python3[6295]: ansible-ansible.builtin.lineinfile Invoked with path=/etc/systemd/system.conf regexp=^#DefaultIOAccounting=no line=DefaultIOAccounting=yes state=present backrefs=False create=False backup=False firstmatch=False unsafe_writes=False search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 03:05:33 localhost python3[6295]: ansible-ansible.builtin.lineinfile [WARNING] Module remote_tmp /root/.ansible/tmp did not exist and was created with a mode of 0700, this may cause issues when running as another user. To avoid this, create the remote_tmp dir with the correct permissions manually
Oct 14 03:05:34 localhost python3[6311]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct 14 03:05:34 localhost systemd[1]: Reloading.
Oct 14 03:05:34 localhost systemd-rc-local-generator[6330]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 03:05:34 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 03:05:36 localhost python3[6359]: ansible-ansible.builtin.wait_for Invoked with path=/sys/fs/cgroup/system.slice/io.max state=present timeout=30 host=127.0.0.1 connect_timeout=5 delay=0 active_connection_states=['ESTABLISHED', 'FIN_WAIT1', 'FIN_WAIT2', 'SYN_RECV', 'SYN_SENT', 'TIME_WAIT'] sleep=1 port=None search_regex=None exclude_hosts=None msg=None Oct 14 03:05:37 localhost python3[6375]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0 riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/init.scope/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Oct 14 03:05:38 localhost python3[6393]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0 riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/machine.slice/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Oct 14 03:05:38 localhost python3[6411]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0 riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/system.slice/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Oct 14 03:05:38 localhost python3[6429]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0 riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/user.slice/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None 
stdin=None Oct 14 03:05:39 localhost python3[6446]: ansible-ansible.legacy.command Invoked with _raw_params=echo "init"; cat /sys/fs/cgroup/init.scope/io.max; echo "machine"; cat /sys/fs/cgroup/machine.slice/io.max; echo "system"; cat /sys/fs/cgroup/system.slice/io.max; echo "user"; cat /sys/fs/cgroup/user.slice/io.max;#012 _uses_shell=True zuul_log_id=fa163ef9-e89a-72d1-c0ab-000000001d26-1-overcloudnovacompute2 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Oct 14 03:05:40 localhost python3[6466]: ansible-ansible.builtin.stat Invoked with path=/sys/fs/cgroup/kubepods.slice/io.max follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Oct 14 03:05:43 localhost systemd[1]: session-5.scope: Deactivated successfully. Oct 14 03:05:43 localhost systemd[1]: session-5.scope: Consumed 3.263s CPU time. Oct 14 03:05:43 localhost systemd-logind[760]: Session 5 logged out. Waiting for processes to exit. Oct 14 03:05:43 localhost systemd-logind[760]: Removed session 5. Oct 14 03:07:14 localhost sshd[6473]: main: sshd: ssh-rsa algorithm is disabled Oct 14 03:07:14 localhost systemd-logind[760]: New session 6 of user zuul. Oct 14 03:07:14 localhost systemd[1]: Started Session 6 of User zuul. Oct 14 03:07:16 localhost systemd[1]: Starting RHSM dbus service... Oct 14 03:07:16 localhost systemd[1]: Started RHSM dbus service. 
Oct 14 03:07:16 localhost rhsm-service[6497]: INFO [subscription_manager.i18n:169] Could not import locale for C: [Errno 2] No translation file found for domain: 'rhsm' Oct 14 03:07:16 localhost rhsm-service[6497]: INFO [subscription_manager.i18n:139] Could not import locale either for C_C: [Errno 2] No translation file found for domain: 'rhsm' Oct 14 03:07:16 localhost rhsm-service[6497]: INFO [subscription_manager.i18n:169] Could not import locale for C: [Errno 2] No translation file found for domain: 'rhsm' Oct 14 03:07:16 localhost rhsm-service[6497]: INFO [subscription_manager.i18n:139] Could not import locale either for C_C: [Errno 2] No translation file found for domain: 'rhsm' Oct 14 03:07:17 localhost rhsm-service[6497]: INFO [subscription_manager.managerlib:90] Consumer created: np0005486733.novalocal (2fbb0dcc-d71d-422c-902e-d4c1db889f51) Oct 14 03:07:17 localhost subscription-manager[6497]: Registered system with identity: 2fbb0dcc-d71d-422c-902e-d4c1db889f51 Oct 14 03:07:18 localhost rhsm-service[6497]: INFO [subscription_manager.entcertlib:131] certs updated: Oct 14 03:07:18 localhost rhsm-service[6497]: Total updates: 1 Oct 14 03:07:18 localhost rhsm-service[6497]: Found (local) serial# [] Oct 14 03:07:18 localhost rhsm-service[6497]: Expected (UEP) serial# [2357020625818107267] Oct 14 03:07:18 localhost rhsm-service[6497]: Added (new) Oct 14 03:07:18 localhost rhsm-service[6497]: [sn:2357020625818107267 ( Content Access,) @ /etc/pki/entitlement/2357020625818107267.pem] Oct 14 03:07:18 localhost rhsm-service[6497]: Deleted (rogue): Oct 14 03:07:18 localhost rhsm-service[6497]: Oct 14 03:07:18 localhost subscription-manager[6497]: Added subscription for 'Content Access' contract 'None' Oct 14 03:07:18 localhost subscription-manager[6497]: Added subscription for product ' Content Access' Oct 14 03:07:19 localhost rhsm-service[6497]: INFO [subscription_manager.i18n:169] Could not import locale for C: [Errno 2] No translation file found for domain: 
'rhsm' Oct 14 03:07:19 localhost rhsm-service[6497]: INFO [subscription_manager.i18n:139] Could not import locale either for C_C: [Errno 2] No translation file found for domain: 'rhsm' Oct 14 03:07:19 localhost rhsm-service[6497]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server. Oct 14 03:07:19 localhost rhsm-service[6497]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server. Oct 14 03:07:20 localhost rhsm-service[6497]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server. Oct 14 03:07:20 localhost rhsm-service[6497]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server. Oct 14 03:07:20 localhost rhsm-service[6497]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server. Oct 14 03:07:22 localhost python3[6588]: ansible-ansible.legacy.command Invoked with _raw_params=cat /etc/redhat-release zuul_log_id=fa163ef9-e89a-1d49-1048-00000000000d-1-overcloudnovacompute2 zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Oct 14 03:08:16 localhost python3[6607]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None Oct 14 03:08:45 localhost setsebool[6682]: The virt_use_nfs policy 
boolean was changed to 1 by root Oct 14 03:08:45 localhost setsebool[6682]: The virt_sandbox_use_all_caps policy boolean was changed to 1 by root Oct 14 03:08:53 localhost kernel: SELinux: Converting 407 SID table entries... Oct 14 03:08:53 localhost kernel: SELinux: policy capability network_peer_controls=1 Oct 14 03:08:53 localhost kernel: SELinux: policy capability open_perms=1 Oct 14 03:08:53 localhost kernel: SELinux: policy capability extended_socket_class=1 Oct 14 03:08:53 localhost kernel: SELinux: policy capability always_check_network=0 Oct 14 03:08:53 localhost kernel: SELinux: policy capability cgroup_seclabel=1 Oct 14 03:08:53 localhost kernel: SELinux: policy capability nnp_nosuid_transition=1 Oct 14 03:08:53 localhost kernel: SELinux: policy capability genfs_seclabel_symlinks=1 Oct 14 03:09:06 localhost dbus-broker-launch[755]: avc: op=load_policy lsm=selinux seqno=3 res=1 Oct 14 03:09:06 localhost systemd[1]: Started /usr/bin/systemctl start man-db-cache-update. Oct 14 03:09:06 localhost systemd[1]: Starting man-db-cache-update.service... Oct 14 03:09:06 localhost systemd[1]: Reloading. Oct 14 03:09:06 localhost systemd-rc-local-generator[7500]: /etc/rc.d/rc.local is not marked executable, skipping. Oct 14 03:09:06 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Oct 14 03:09:06 localhost systemd[1]: Queuing reload/restart jobs for marked units… Oct 14 03:09:08 localhost rhsm-service[6497]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server. Oct 14 03:09:08 localhost rhsm-service[6497]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server. 
Oct 14 03:09:11 localhost podman[14288]: 2025-10-14 07:09:11.810526625 +0000 UTC m=+0.110705407 system refresh
Oct 14 03:09:12 localhost systemd[4192]: Starting D-Bus User Message Bus...
Oct 14 03:09:12 localhost dbus-broker-launch[15704]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +31: Eavesdropping is deprecated and ignored
Oct 14 03:09:12 localhost dbus-broker-launch[15704]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +33: Eavesdropping is deprecated and ignored
Oct 14 03:09:12 localhost systemd[4192]: Started D-Bus User Message Bus.
Oct 14 03:09:12 localhost journal[15704]: Ready
Oct 14 03:09:12 localhost systemd[4192]: selinux: avc: op=load_policy lsm=selinux seqno=3 res=1
Oct 14 03:09:12 localhost systemd[4192]: Created slice Slice /user.
Oct 14 03:09:12 localhost systemd[4192]: podman-15527.scope: unit configures an IP firewall, but not running as root.
Oct 14 03:09:12 localhost systemd[4192]: (This warning is only shown for the first unit using IP firewalling.)
Oct 14 03:09:12 localhost systemd[4192]: Started podman-15527.scope.
Oct 14 03:09:12 localhost systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 14 03:09:13 localhost systemd[4192]: Started podman-pause-7423eb28.scope.
Oct 14 03:09:13 localhost systemd[1]: session-6.scope: Deactivated successfully.
Oct 14 03:09:13 localhost systemd[1]: session-6.scope: Consumed 50.081s CPU time.
Oct 14 03:09:13 localhost systemd-logind[760]: Session 6 logged out. Waiting for processes to exit.
Oct 14 03:09:13 localhost systemd-logind[760]: Removed session 6.
Oct 14 03:09:14 localhost systemd[1]: man-db-cache-update.service: Deactivated successfully.
Oct 14 03:09:14 localhost systemd[1]: Finished man-db-cache-update.service.
Oct 14 03:09:14 localhost systemd[1]: man-db-cache-update.service: Consumed 9.410s CPU time.
Oct 14 03:09:14 localhost systemd[1]: run-r9eebba42c8f34c778eff0a97626db051.service: Deactivated successfully.
Oct 14 03:09:29 localhost sshd[18342]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 03:09:29 localhost sshd[18340]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 03:09:29 localhost sshd[18341]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 03:09:29 localhost sshd[18339]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 03:09:29 localhost sshd[18338]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 03:09:34 localhost sshd[18348]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 03:09:34 localhost systemd-logind[760]: New session 7 of user zuul.
Oct 14 03:09:34 localhost systemd[1]: Started Session 7 of User zuul.
Oct 14 03:09:34 localhost python3[18365]: ansible-ansible.posix.authorized_key Invoked with user=zuul key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBIq9But+/Hfc9J5vjzjHcMTQnDUUku1RFL7dcQIHYNLTUIGZ0AQaJy5Ycn5J06z6gzZ6xEr0ccDbinQsuD7Dk3c= zuul@np0005486725.novalocal#012 manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 14 03:09:35 localhost python3[18381]: ansible-ansible.posix.authorized_key Invoked with user=root key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBIq9But+/Hfc9J5vjzjHcMTQnDUUku1RFL7dcQIHYNLTUIGZ0AQaJy5Ycn5J06z6gzZ6xEr0ccDbinQsuD7Dk3c= zuul@np0005486725.novalocal#012 manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 14 03:09:37 localhost systemd[1]: session-7.scope: Deactivated successfully.
Oct 14 03:09:37 localhost systemd-logind[760]: Session 7 logged out. Waiting for processes to exit.
Oct 14 03:09:37 localhost systemd-logind[760]: Removed session 7.
Oct 14 03:10:59 localhost sshd[18383]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 03:10:59 localhost systemd-logind[760]: New session 8 of user zuul.
Oct 14 03:10:59 localhost systemd[1]: Started Session 8 of User zuul.
Oct 14 03:11:00 localhost python3[18402]: ansible-authorized_key Invoked with user=root manage_dir=True key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDUv/ZB171sShkvmUwM4/A+38mOKHSoVqmUnoFRrcde+TmaD2jOKfnaBsMdk2YTdAdiPwM8PX7LYcOftZjXZ92Uqg/gQ0pshmFBVtIcoN0HEQlFtMQltRrBVPG+qHK5UOF2bUImKqqFx3uTPSmteSVgJtwvFqp/51YTUibYgQBWJPCcOSze95nxendWi6PoXzvorqCyVS44Llj4LmLChBJeqAI5cWs2EeDhQ4Tw8F33iKpBg8WjZAbQVbe2KIQYURMtANtjUJ0Yg5RTArSq57504iqodB4+ynahul8Dp5+TocLZTPu5orcqRGqWDe7CN5pc1eXZQuNNZ0jW59y52GY+ox+WCmp1qvB7TQzhc/r+kAVmT8VNTVUvC5TBGcIw3yxI7lzrd03zpenSL3oyJnFN4SXCeAA8YcXlz7ySaO9YAtbCSdkgj8QJCiykvalRm17F4d4aRX5+rtfEm+WG670vF6FRNNo5OTXTK2Ja84pej1bjzDBvEz81D1EqnHybfJ0= zuul-build-sshkey state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 14 03:11:00 localhost python3[18418]: ansible-user Invoked with name=root state=present non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005486733.novalocal update_password=always uid=None group=None groups=None comment=None home=None shell=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None
Oct 14 03:11:02 localhost python3[18468]: ansible-ansible.legacy.stat Invoked with path=/root/.ssh/id_rsa follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 14 03:11:02 localhost python3[18511]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1760425862.302841-139-260415598768661/source dest=/root/.ssh/id_rsa mode=384 owner=root force=False _original_basename=6c04c38dfe8e446399a5e5f9dbe4740b_id_rsa follow=False checksum=ca0549f1043aa781cfe5001a3649a4105abf4f82 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 03:11:04 localhost python3[18573]: ansible-ansible.legacy.stat Invoked with path=/root/.ssh/id_rsa.pub follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 14 03:11:04 localhost python3[18616]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1760425863.9216893-228-53998857888031/source dest=/root/.ssh/id_rsa.pub mode=420 owner=root force=False _original_basename=6c04c38dfe8e446399a5e5f9dbe4740b_id_rsa.pub follow=False checksum=8b573aa2906c160b2f7b53c64bd37790afdd4394 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 03:11:06 localhost python3[18646]: ansible-ansible.builtin.file Invoked with path=/etc/nodepool state=directory mode=0777 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 03:11:07 localhost python3[18692]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 14 03:11:07 localhost python3[18708]: ansible-ansible.legacy.file Invoked with dest=/etc/nodepool/sub_nodes _original_basename=tmpf9je02za recurse=False state=file path=/etc/nodepool/sub_nodes force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 03:11:08 localhost python3[18768]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 14 03:11:09 localhost python3[18784]: ansible-ansible.legacy.file Invoked with dest=/etc/nodepool/sub_nodes_private _original_basename=tmpy_2so3j7 recurse=False state=file path=/etc/nodepool/sub_nodes_private force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 03:11:10 localhost python3[18844]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/node_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 14 03:11:10 localhost python3[18860]: ansible-ansible.legacy.file Invoked with dest=/etc/nodepool/node_private _original_basename=tmpsgklnaqn recurse=False state=file path=/etc/nodepool/node_private force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 03:11:11 localhost systemd[1]: session-8.scope: Deactivated successfully.
Oct 14 03:11:11 localhost systemd[1]: session-8.scope: Consumed 3.320s CPU time.
Oct 14 03:11:11 localhost systemd-logind[760]: Session 8 logged out. Waiting for processes to exit.
Oct 14 03:11:11 localhost systemd-logind[760]: Removed session 8.
Oct 14 03:11:20 localhost systemd[1]: Starting dnf makecache...
Oct 14 03:11:20 localhost dnf[18876]: Updating Subscription Management repositories.
Oct 14 03:11:22 localhost dnf[18876]: Failed determining last makecache time.
Oct 14 03:11:22 localhost dnf[18876]: Red Hat Enterprise Linux 9 for x86_64 - BaseOS 29 kB/s | 4.1 kB 00:00
Oct 14 03:11:22 localhost dnf[18876]: Red Hat Enterprise Linux 9 for x86_64 - AppStre 33 kB/s | 4.5 kB 00:00
Oct 14 03:11:23 localhost dnf[18876]: Red Hat Enterprise Linux 9 for x86_64 - AppStre 33 kB/s | 4.5 kB 00:00
Oct 14 03:11:23 localhost dnf[18876]: Red Hat Enterprise Linux 9 for x86_64 - BaseOS 30 kB/s | 4.1 kB 00:00
Oct 14 03:11:23 localhost dnf[18876]: Metadata cache created.
Oct 14 03:11:23 localhost systemd[1]: dnf-makecache.service: Deactivated successfully.
Oct 14 03:11:23 localhost systemd[1]: Finished dnf makecache.
Oct 14 03:11:23 localhost systemd[1]: dnf-makecache.service: Consumed 2.572s CPU time.
Oct 14 03:13:14 localhost sshd[18882]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 03:13:14 localhost systemd-logind[760]: New session 9 of user zuul.
Oct 14 03:13:14 localhost systemd[1]: Started Session 9 of User zuul.
Oct 14 03:13:15 localhost python3[18928]: ansible-ansible.legacy.command Invoked with _raw_params=hostname _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 03:18:14 localhost systemd[1]: session-9.scope: Deactivated successfully.
Oct 14 03:18:14 localhost systemd-logind[760]: Session 9 logged out. Waiting for processes to exit.
Oct 14 03:18:14 localhost systemd-logind[760]: Removed session 9.
Oct 14 03:23:44 localhost sshd[18936]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 03:23:44 localhost systemd-logind[760]: New session 10 of user zuul.
Oct 14 03:23:44 localhost systemd[1]: Started Session 10 of User zuul.
Oct 14 03:23:44 localhost python3[18953]: ansible-ansible.legacy.command Invoked with _raw_params=cat /etc/redhat-release zuul_log_id=fa163ef9-e89a-f848-5676-00000000000c-1-overcloudnovacompute2 zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 03:23:48 localhost python3[18973]: ansible-ansible.legacy.command Invoked with _raw_params=yum clean all zuul_log_id=fa163ef9-e89a-f848-5676-00000000000d-1-overcloudnovacompute2 zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 03:24:21 localhost python3[18993]: ansible-community.general.rhsm_repository Invoked with name=['rhel-9-for-x86_64-baseos-eus-rpms'] state=enabled purge=False
Oct 14 03:24:24 localhost rhsm-service[6497]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Oct 14 03:24:59 localhost python3[19149]: ansible-community.general.rhsm_repository Invoked with name=['rhel-9-for-x86_64-appstream-eus-rpms'] state=enabled purge=False
Oct 14 03:25:02 localhost rhsm-service[6497]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Oct 14 03:25:02 localhost rhsm-service[6497]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Oct 14 03:25:21 localhost python3[19291]: ansible-community.general.rhsm_repository Invoked with name=['rhel-9-for-x86_64-highavailability-eus-rpms'] state=enabled purge=False
Oct 14 03:25:24 localhost rhsm-service[6497]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Oct 14 03:25:24 localhost rhsm-service[6497]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Oct 14 03:25:30 localhost rhsm-service[6497]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Oct 14 03:25:30 localhost rhsm-service[6497]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Oct 14 03:25:52 localhost python3[19686]: ansible-community.general.rhsm_repository Invoked with name=['fast-datapath-for-rhel-9-x86_64-rpms'] state=enabled purge=False
Oct 14 03:25:56 localhost rhsm-service[6497]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Oct 14 03:25:56 localhost rhsm-service[6497]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Oct 14 03:26:01 localhost rhsm-service[6497]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Oct 14 03:26:01 localhost rhsm-service[6497]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Oct 14 03:26:25 localhost python3[20022]: ansible-community.general.rhsm_repository Invoked with name=['openstack-17.1-for-rhel-9-x86_64-rpms'] state=enabled purge=False
Oct 14 03:26:28 localhost rhsm-service[6497]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Oct 14 03:26:33 localhost rhsm-service[6497]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Oct 14 03:26:33 localhost rhsm-service[6497]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Oct 14 03:26:44 localhost python3[20359]: ansible-ansible.legacy.command Invoked with _raw_params=yum repolist --enabled#012 _uses_shell=True zuul_log_id=fa163ef9-e89a-f848-5676-000000000013-1-overcloudnovacompute2 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 03:27:13 localhost python3[20378]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch', 'os-net-config', 'ansible-core'] state=present update_cache=True allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Oct 14 03:27:23 localhost systemd[1]: Started daily update of the root trust anchor for DNSSEC.
Oct 14 03:27:31 localhost kernel: SELinux: Converting 501 SID table entries...
Oct 14 03:27:31 localhost kernel: SELinux: policy capability network_peer_controls=1
Oct 14 03:27:31 localhost kernel: SELinux: policy capability open_perms=1
Oct 14 03:27:31 localhost kernel: SELinux: policy capability extended_socket_class=1
Oct 14 03:27:31 localhost kernel: SELinux: policy capability always_check_network=0
Oct 14 03:27:31 localhost kernel: SELinux: policy capability cgroup_seclabel=1
Oct 14 03:27:31 localhost kernel: SELinux: policy capability nnp_nosuid_transition=1
Oct 14 03:27:31 localhost kernel: SELinux: policy capability genfs_seclabel_symlinks=1
Oct 14 03:27:35 localhost dbus-broker-launch[755]: avc: op=load_policy lsm=selinux seqno=4 res=1
Oct 14 03:27:35 localhost systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Oct 14 03:27:35 localhost systemd[1]: Starting man-db-cache-update.service...
Oct 14 03:27:35 localhost systemd[1]: Reloading.
Oct 14 03:27:35 localhost systemd-rc-local-generator[21035]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 03:27:35 localhost systemd-sysv-generator[21040]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 03:27:35 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 03:27:35 localhost systemd[1]: Queuing reload/restart jobs for marked units…
Oct 14 03:27:36 localhost systemd[1]: man-db-cache-update.service: Deactivated successfully.
Oct 14 03:27:36 localhost systemd[1]: Finished man-db-cache-update.service.
Oct 14 03:27:36 localhost systemd[1]: run-r1eb6473c3fea47acbf9f9487a2262b85.service: Deactivated successfully.
Oct 14 03:27:37 localhost rhsm-service[6497]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Oct 14 03:27:37 localhost rhsm-service[6497]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Oct 14 03:27:49 localhost python3[21686]: ansible-ansible.legacy.command Invoked with _raw_params=ansible-galaxy collection install ansible.posix#012 _uses_shell=True zuul_log_id=fa163ef9-e89a-f848-5676-000000000015-1-overcloudnovacompute2 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 03:28:05 localhost python3[21706]: ansible-ansible.builtin.file Invoked with path=/etc/os-net-config state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 03:28:06 localhost python3[21754]: ansible-ansible.legacy.stat Invoked with path=/etc/os-net-config/tripleo_config.yaml follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 14 03:28:07 localhost python3[21797]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1760426886.499339-336-14020813948421/source dest=/etc/os-net-config/tripleo_config.yaml mode=None follow=False _original_basename=overcloud_net_config.j2 checksum=91bc45728dd9738fc644e3ada9d8642294da29ff backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 03:28:09 localhost python3[21827]: ansible-community.general.nmcli Invoked with conn_name=ci-private-network state=absent ignore_unsupported_suboptions=False autoconnect=True gw4_ignore_auto=False never_default4=False dns4_ignore_auto=False may_fail4=True gw6_ignore_auto=False dns6_ignore_auto=False mode=balance-rr stp=True priority=128 slavepriority=32 forwarddelay=15 hellotime=2 maxage=20 ageingtime=300 hairpin=False path_cost=100 runner=roundrobin master=None slave_type=None ifname=None type=None ip4=None gw4=None routes4=None routes4_extended=None route_metric4=None routing_rules4=None dns4=None dns4_search=None dns4_options=None method4=None dhcp_client_id=None ip6=None gw6=None dns6=None dns6_search=None dns6_options=None routes6=None routes6_extended=None route_metric6=None method6=None ip_privacy6=None addr_gen_mode6=None miimon=None downdelay=None updelay=None xmit_hash_policy=None arp_interval=None arp_ip_target=None primary=None mtu=None mac=None zone=None runner_hwaddr_policy=None runner_fast_rate=None vlanid=None vlandev=None flags=None ingress=None egress=None vxlan_id=None vxlan_local=None vxlan_remote=None ip_tunnel_dev=None ip_tunnel_local=None ip_tunnel_remote=None ip_tunnel_input_key=NOT_LOGGING_PARAMETER ip_tunnel_output_key=NOT_LOGGING_PARAMETER ssid=None wifi=None wifi_sec=NOT_LOGGING_PARAMETER gsm=None macvlan=None wireguard=None vpn=None transport_mode=None
Oct 14 03:28:09 localhost systemd-journald[618]: Field hash table of /run/log/journal/8e1d5208cffec42b50976967e1d1cfd0/system.journal has a fill level at 91.6 (305 of 333 items), suggesting rotation.
Oct 14 03:28:09 localhost systemd-journald[618]: /run/log/journal/8e1d5208cffec42b50976967e1d1cfd0/system.journal: Journal header limits reached or header out-of-date, rotating.
Oct 14 03:28:09 localhost rsyslogd[759]: imjournal: journal files changed, reloading... [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Oct 14 03:28:09 localhost rsyslogd[759]: imjournal: journal files changed, reloading... [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Oct 14 03:28:09 localhost python3[21848]: ansible-community.general.nmcli Invoked with conn_name=ci-private-network-20 state=absent ignore_unsupported_suboptions=False autoconnect=True gw4_ignore_auto=False never_default4=False dns4_ignore_auto=False may_fail4=True gw6_ignore_auto=False dns6_ignore_auto=False mode=balance-rr stp=True priority=128 slavepriority=32 forwarddelay=15 hellotime=2 maxage=20 ageingtime=300 hairpin=False path_cost=100 runner=roundrobin master=None slave_type=None ifname=None type=None ip4=None gw4=None routes4=None routes4_extended=None route_metric4=None routing_rules4=None dns4=None dns4_search=None dns4_options=None method4=None dhcp_client_id=None ip6=None gw6=None dns6=None dns6_search=None dns6_options=None routes6=None routes6_extended=None route_metric6=None method6=None ip_privacy6=None addr_gen_mode6=None miimon=None downdelay=None updelay=None xmit_hash_policy=None arp_interval=None arp_ip_target=None primary=None mtu=None mac=None zone=None runner_hwaddr_policy=None runner_fast_rate=None vlanid=None vlandev=None flags=None ingress=None egress=None vxlan_id=None vxlan_local=None vxlan_remote=None ip_tunnel_dev=None ip_tunnel_local=None ip_tunnel_remote=None ip_tunnel_input_key=NOT_LOGGING_PARAMETER ip_tunnel_output_key=NOT_LOGGING_PARAMETER ssid=None wifi=None wifi_sec=NOT_LOGGING_PARAMETER gsm=None macvlan=None wireguard=None vpn=None transport_mode=None
Oct 14 03:28:09 localhost python3[21868]: ansible-community.general.nmcli Invoked with conn_name=ci-private-network-21 state=absent ignore_unsupported_suboptions=False autoconnect=True gw4_ignore_auto=False never_default4=False dns4_ignore_auto=False may_fail4=True gw6_ignore_auto=False dns6_ignore_auto=False mode=balance-rr stp=True priority=128 slavepriority=32 forwarddelay=15 hellotime=2 maxage=20 ageingtime=300 hairpin=False path_cost=100 runner=roundrobin master=None slave_type=None ifname=None type=None ip4=None gw4=None routes4=None routes4_extended=None route_metric4=None routing_rules4=None dns4=None dns4_search=None dns4_options=None method4=None dhcp_client_id=None ip6=None gw6=None dns6=None dns6_search=None dns6_options=None routes6=None routes6_extended=None route_metric6=None method6=None ip_privacy6=None addr_gen_mode6=None miimon=None downdelay=None updelay=None xmit_hash_policy=None arp_interval=None arp_ip_target=None primary=None mtu=None mac=None zone=None runner_hwaddr_policy=None runner_fast_rate=None vlanid=None vlandev=None flags=None ingress=None egress=None vxlan_id=None vxlan_local=None vxlan_remote=None ip_tunnel_dev=None ip_tunnel_local=None ip_tunnel_remote=None ip_tunnel_input_key=NOT_LOGGING_PARAMETER ip_tunnel_output_key=NOT_LOGGING_PARAMETER ssid=None wifi=None wifi_sec=NOT_LOGGING_PARAMETER gsm=None macvlan=None wireguard=None vpn=None transport_mode=None
Oct 14 03:28:10 localhost python3[21888]: ansible-community.general.nmcli Invoked with conn_name=ci-private-network-22 state=absent ignore_unsupported_suboptions=False autoconnect=True gw4_ignore_auto=False never_default4=False dns4_ignore_auto=False may_fail4=True gw6_ignore_auto=False dns6_ignore_auto=False mode=balance-rr stp=True priority=128 slavepriority=32 forwarddelay=15 hellotime=2 maxage=20 ageingtime=300 hairpin=False path_cost=100 runner=roundrobin master=None slave_type=None ifname=None type=None ip4=None gw4=None routes4=None routes4_extended=None route_metric4=None routing_rules4=None dns4=None dns4_search=None dns4_options=None method4=None dhcp_client_id=None ip6=None gw6=None dns6=None dns6_search=None dns6_options=None routes6=None routes6_extended=None route_metric6=None method6=None ip_privacy6=None addr_gen_mode6=None miimon=None downdelay=None updelay=None xmit_hash_policy=None arp_interval=None arp_ip_target=None primary=None mtu=None mac=None zone=None runner_hwaddr_policy=None runner_fast_rate=None vlanid=None vlandev=None flags=None ingress=None egress=None vxlan_id=None vxlan_local=None vxlan_remote=None ip_tunnel_dev=None ip_tunnel_local=None ip_tunnel_remote=None ip_tunnel_input_key=NOT_LOGGING_PARAMETER ip_tunnel_output_key=NOT_LOGGING_PARAMETER ssid=None wifi=None wifi_sec=NOT_LOGGING_PARAMETER gsm=None macvlan=None wireguard=None vpn=None transport_mode=None
Oct 14 03:28:10 localhost python3[21908]: ansible-community.general.nmcli Invoked with conn_name=ci-private-network-23 state=absent ignore_unsupported_suboptions=False autoconnect=True gw4_ignore_auto=False never_default4=False dns4_ignore_auto=False may_fail4=True gw6_ignore_auto=False dns6_ignore_auto=False mode=balance-rr stp=True priority=128 slavepriority=32 forwarddelay=15 hellotime=2 maxage=20 ageingtime=300 hairpin=False path_cost=100 runner=roundrobin master=None slave_type=None ifname=None type=None ip4=None gw4=None routes4=None routes4_extended=None route_metric4=None routing_rules4=None dns4=None dns4_search=None dns4_options=None method4=None dhcp_client_id=None ip6=None gw6=None dns6=None dns6_search=None dns6_options=None routes6=None routes6_extended=None route_metric6=None method6=None ip_privacy6=None addr_gen_mode6=None miimon=None downdelay=None updelay=None xmit_hash_policy=None arp_interval=None arp_ip_target=None primary=None mtu=None mac=None zone=None runner_hwaddr_policy=None runner_fast_rate=None vlanid=None vlandev=None flags=None ingress=None egress=None vxlan_id=None vxlan_local=None vxlan_remote=None ip_tunnel_dev=None ip_tunnel_local=None ip_tunnel_remote=None ip_tunnel_input_key=NOT_LOGGING_PARAMETER ip_tunnel_output_key=NOT_LOGGING_PARAMETER ssid=None wifi=None wifi_sec=NOT_LOGGING_PARAMETER gsm=None macvlan=None wireguard=None vpn=None transport_mode=None
Oct 14 03:28:14 localhost python3[21928]: ansible-ansible.builtin.systemd Invoked with name=network state=started daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct 14 03:28:14 localhost systemd[1]: Starting LSB: Bring up/down
networking... Oct 14 03:28:14 localhost network[21931]: WARN : [network] You are using 'network' service provided by 'network-scripts', which are now deprecated. Oct 14 03:28:14 localhost network[21942]: You are using 'network' service provided by 'network-scripts', which are now deprecated. Oct 14 03:28:14 localhost network[21931]: WARN : [network] 'network-scripts' will be removed from distribution in near future. Oct 14 03:28:14 localhost network[21943]: 'network-scripts' will be removed from distribution in near future. Oct 14 03:28:14 localhost network[21931]: WARN : [network] It is advised to switch to 'NetworkManager' instead for network management. Oct 14 03:28:14 localhost network[21944]: It is advised to switch to 'NetworkManager' instead for network management. Oct 14 03:28:14 localhost NetworkManager[5977]: [1760426894.4328] audit: op="connections-reload" pid=21972 uid=0 result="success" Oct 14 03:28:14 localhost network[21931]: Bringing up loopback interface: [ OK ] Oct 14 03:28:14 localhost NetworkManager[5977]: [1760426894.6292] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-eth0" pid=22060 uid=0 result="success" Oct 14 03:28:14 localhost network[21931]: Bringing up interface eth0: [ OK ] Oct 14 03:28:14 localhost systemd[1]: Started LSB: Bring up/down networking. Oct 14 03:28:15 localhost python3[22101]: ansible-ansible.builtin.systemd Invoked with name=openvswitch state=started daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None Oct 14 03:28:15 localhost systemd[1]: Starting Open vSwitch Database Unit... Oct 14 03:28:15 localhost chown[22105]: /usr/bin/chown: cannot access '/run/openvswitch': No such file or directory Oct 14 03:28:15 localhost ovs-ctl[22110]: /etc/openvswitch/conf.db does not exist ... (warning). 
Oct 14 03:28:15 localhost ovs-ctl[22110]: Creating empty database /etc/openvswitch/conf.db [ OK ] Oct 14 03:28:15 localhost ovs-ctl[22110]: Starting ovsdb-server [ OK ] Oct 14 03:28:15 localhost ovs-vsctl[22160]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait -- init -- set Open_vSwitch . db-version=8.5.1 Oct 14 03:28:15 localhost ovs-vsctl[22180]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait set Open_vSwitch . ovs-version=3.3.5-110.el9fdp "external-ids:system-id=\"9e4b0f79-1220-4c7d-a18d-fa1a88dab362\"" "external-ids:rundir=\"/var/run/openvswitch\"" "system-type=\"rhel\"" "system-version=\"9.2\"" Oct 14 03:28:15 localhost ovs-ctl[22110]: Configuring Open vSwitch system IDs [ OK ] Oct 14 03:28:15 localhost ovs-ctl[22110]: Enabling remote OVSDB managers [ OK ] Oct 14 03:28:15 localhost systemd[1]: Started Open vSwitch Database Unit. Oct 14 03:28:15 localhost ovs-vsctl[22186]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=np0005486733.novalocal Oct 14 03:28:15 localhost systemd[1]: Starting Open vSwitch Delete Transient Ports... Oct 14 03:28:15 localhost systemd[1]: Finished Open vSwitch Delete Transient Ports. Oct 14 03:28:15 localhost systemd[1]: Starting Open vSwitch Forwarding Unit... Oct 14 03:28:15 localhost kernel: openvswitch: Open vSwitch switching datapath Oct 14 03:28:15 localhost ovs-ctl[22231]: Inserting openvswitch module [ OK ] Oct 14 03:28:15 localhost ovs-ctl[22199]: Starting ovs-vswitchd [ OK ] Oct 14 03:28:15 localhost ovs-vsctl[22250]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=np0005486733.novalocal Oct 14 03:28:15 localhost ovs-ctl[22199]: Enabling remote OVSDB managers [ OK ] Oct 14 03:28:15 localhost systemd[1]: Started Open vSwitch Forwarding Unit. Oct 14 03:28:15 localhost systemd[1]: Starting Open vSwitch... Oct 14 03:28:15 localhost systemd[1]: Finished Open vSwitch. 
Oct 14 03:28:46 localhost python3[22268]: ansible-ansible.legacy.command Invoked with _raw_params=os-net-config -c /etc/os-net-config/tripleo_config.yaml#012 _uses_shell=True zuul_log_id=fa163ef9-e89a-f848-5676-00000000001a-1-overcloudnovacompute2 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Oct 14 03:28:47 localhost NetworkManager[5977]: [1760426927.7330] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-br-ex" pid=22427 uid=0 result="success" Oct 14 03:28:47 localhost ifup[22428]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated. Oct 14 03:28:47 localhost ifup[22429]: 'network-scripts' will be removed from distribution in near future. Oct 14 03:28:47 localhost ifup[22430]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well. Oct 14 03:28:47 localhost NetworkManager[5977]: [1760426927.7580] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-br-ex" pid=22436 uid=0 result="success" Oct 14 03:28:47 localhost ovs-vsctl[22438]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --may-exist add-br br-ex -- set bridge br-ex other-config:mac-table-size=50000 -- set bridge br-ex other-config:hwaddr=fa:16:3e:c0:7d:39 -- set bridge br-ex fail_mode=standalone -- del-controller br-ex Oct 14 03:28:47 localhost kernel: device ovs-system entered promiscuous mode Oct 14 03:28:47 localhost NetworkManager[5977]: [1760426927.8230] manager: (ovs-system): new Generic device (/org/freedesktop/NetworkManager/Devices/4) Oct 14 03:28:47 localhost kernel: Timeout policy base is empty Oct 14 03:28:47 localhost kernel: Failed to associated timeout policy `ovs_test_tp' Oct 14 03:28:47 localhost systemd-udevd[22439]: Network interface NamePolicy= disabled on kernel command line. 
Oct 14 03:28:47 localhost kernel: device br-ex entered promiscuous mode Oct 14 03:28:47 localhost NetworkManager[5977]: [1760426927.8648] manager: (br-ex): new Generic device (/org/freedesktop/NetworkManager/Devices/5) Oct 14 03:28:47 localhost NetworkManager[5977]: [1760426927.8905] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-br-ex" pid=22464 uid=0 result="success" Oct 14 03:28:47 localhost NetworkManager[5977]: [1760426927.9067] device (br-ex): carrier: link connected Oct 14 03:28:50 localhost NetworkManager[5977]: [1760426930.9537] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-br-ex" pid=22493 uid=0 result="success" Oct 14 03:28:50 localhost NetworkManager[5977]: [1760426930.9980] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-br-ex" pid=22508 uid=0 result="success" Oct 14 03:28:51 localhost NET[22533]: /etc/sysconfig/network-scripts/ifup-post : updated /etc/resolv.conf Oct 14 03:28:51 localhost NetworkManager[5977]: [1760426931.0937] device (eth1): state change: activated -> unmanaged (reason 'unmanaged', sys-iface-state: 'managed') Oct 14 03:28:51 localhost NetworkManager[5977]: [1760426931.1067] dhcp4 (eth1): canceled DHCP transaction Oct 14 03:28:51 localhost NetworkManager[5977]: [1760426931.1067] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds) Oct 14 03:28:51 localhost NetworkManager[5977]: [1760426931.1068] dhcp4 (eth1): state changed no lease Oct 14 03:28:51 localhost NetworkManager[5977]: [1760426931.1111] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-eth1" pid=22542 uid=0 result="success" Oct 14 03:28:51 localhost ifup[22543]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated. Oct 14 03:28:51 localhost ifup[22544]: 'network-scripts' will be removed from distribution in near future. Oct 14 03:28:51 localhost systemd[1]: Starting Network Manager Script Dispatcher Service... 
Oct 14 03:28:51 localhost ifup[22546]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well. Oct 14 03:28:51 localhost systemd[1]: Started Network Manager Script Dispatcher Service. Oct 14 03:28:51 localhost NetworkManager[5977]: [1760426931.1490] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-eth1" pid=22559 uid=0 result="success" Oct 14 03:28:51 localhost NetworkManager[5977]: [1760426931.1980] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-eth1" pid=22570 uid=0 result="success" Oct 14 03:28:51 localhost NetworkManager[5977]: [1760426931.2059] device (eth1): carrier: link connected Oct 14 03:28:51 localhost NetworkManager[5977]: [1760426931.2293] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-eth1" pid=22579 uid=0 result="success" Oct 14 03:28:51 localhost ipv6_wait_tentative[22591]: Waiting for interface eth1 IPv6 address(es) to leave the 'tentative' state Oct 14 03:28:52 localhost ipv6_wait_tentative[22596]: Waiting for interface eth1 IPv6 address(es) to leave the 'tentative' state Oct 14 03:28:53 localhost NetworkManager[5977]: [1760426933.3012] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-eth1" pid=22605 uid=0 result="success" Oct 14 03:28:53 localhost ovs-vsctl[22620]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex eth1 -- add-port br-ex eth1 Oct 14 03:28:53 localhost kernel: device eth1 entered promiscuous mode Oct 14 03:28:53 localhost NetworkManager[5977]: [1760426933.3722] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-br-ex" pid=22628 uid=0 result="success" Oct 14 03:28:53 localhost ifup[22629]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated. Oct 14 03:28:53 localhost ifup[22630]: 'network-scripts' will be removed from distribution in near future. 
Oct 14 03:28:53 localhost ifup[22631]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well. Oct 14 03:28:53 localhost NetworkManager[5977]: [1760426933.4048] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-br-ex" pid=22637 uid=0 result="success" Oct 14 03:28:53 localhost NetworkManager[5977]: [1760426933.4465] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan20" pid=22647 uid=0 result="success" Oct 14 03:28:53 localhost ifup[22648]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated. Oct 14 03:28:53 localhost ifup[22649]: 'network-scripts' will be removed from distribution in near future. Oct 14 03:28:53 localhost ifup[22650]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well. Oct 14 03:28:53 localhost NetworkManager[5977]: [1760426933.4815] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan20" pid=22656 uid=0 result="success" Oct 14 03:28:53 localhost ovs-vsctl[22659]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex vlan20 -- add-port br-ex vlan20 tag=20 -- set Interface vlan20 type=internal Oct 14 03:28:53 localhost kernel: device vlan20 entered promiscuous mode Oct 14 03:28:53 localhost NetworkManager[5977]: [1760426933.5240] manager: (vlan20): new Generic device (/org/freedesktop/NetworkManager/Devices/6) Oct 14 03:28:53 localhost systemd-udevd[22661]: Network interface NamePolicy= disabled on kernel command line. 
Oct 14 03:28:53 localhost NetworkManager[5977]: [1760426933.5512] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan20" pid=22670 uid=0 result="success" Oct 14 03:28:53 localhost NetworkManager[5977]: [1760426933.5734] device (vlan20): carrier: link connected Oct 14 03:28:56 localhost NetworkManager[5977]: [1760426936.6211] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan20" pid=22699 uid=0 result="success" Oct 14 03:28:56 localhost NetworkManager[5977]: [1760426936.6679] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan20" pid=22714 uid=0 result="success" Oct 14 03:28:56 localhost NetworkManager[5977]: [1760426936.7267] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan21" pid=22735 uid=0 result="success" Oct 14 03:28:56 localhost ifup[22736]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated. Oct 14 03:28:56 localhost ifup[22737]: 'network-scripts' will be removed from distribution in near future. Oct 14 03:28:56 localhost ifup[22738]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well. Oct 14 03:28:56 localhost NetworkManager[5977]: [1760426936.7599] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan21" pid=22744 uid=0 result="success" Oct 14 03:28:56 localhost ovs-vsctl[22747]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex vlan21 -- add-port br-ex vlan21 tag=21 -- set Interface vlan21 type=internal Oct 14 03:28:56 localhost kernel: device vlan21 entered promiscuous mode Oct 14 03:28:56 localhost NetworkManager[5977]: [1760426936.8351] manager: (vlan21): new Generic device (/org/freedesktop/NetworkManager/Devices/7) Oct 14 03:28:56 localhost systemd-udevd[22749]: Network interface NamePolicy= disabled on kernel command line. 
Oct 14 03:28:56 localhost NetworkManager[5977]: [1760426936.8604] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan21" pid=22759 uid=0 result="success" Oct 14 03:28:56 localhost NetworkManager[5977]: [1760426936.8811] device (vlan21): carrier: link connected Oct 14 03:28:59 localhost NetworkManager[5977]: [1760426939.9265] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan21" pid=22790 uid=0 result="success" Oct 14 03:28:59 localhost NetworkManager[5977]: [1760426939.9697] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan21" pid=22805 uid=0 result="success" Oct 14 03:29:00 localhost NetworkManager[5977]: [1760426940.0222] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan22" pid=22826 uid=0 result="success" Oct 14 03:29:00 localhost ifup[22827]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated. Oct 14 03:29:00 localhost ifup[22828]: 'network-scripts' will be removed from distribution in near future. Oct 14 03:29:00 localhost ifup[22829]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well. Oct 14 03:29:00 localhost NetworkManager[5977]: [1760426940.0531] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan22" pid=22835 uid=0 result="success" Oct 14 03:29:00 localhost ovs-vsctl[22838]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex vlan22 -- add-port br-ex vlan22 tag=22 -- set Interface vlan22 type=internal Oct 14 03:29:00 localhost kernel: device vlan22 entered promiscuous mode Oct 14 03:29:00 localhost systemd-udevd[22840]: Network interface NamePolicy= disabled on kernel command line. 
Oct 14 03:29:00 localhost NetworkManager[5977]: [1760426940.0919] manager: (vlan22): new Generic device (/org/freedesktop/NetworkManager/Devices/8) Oct 14 03:29:00 localhost NetworkManager[5977]: [1760426940.1152] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan22" pid=22850 uid=0 result="success" Oct 14 03:29:00 localhost NetworkManager[5977]: [1760426940.1358] device (vlan22): carrier: link connected Oct 14 03:29:01 localhost systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully. Oct 14 03:29:03 localhost NetworkManager[5977]: [1760426943.2129] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan22" pid=22881 uid=0 result="success" Oct 14 03:29:03 localhost NetworkManager[5977]: [1760426943.2584] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan22" pid=22896 uid=0 result="success" Oct 14 03:29:03 localhost NetworkManager[5977]: [1760426943.3182] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan23" pid=22917 uid=0 result="success" Oct 14 03:29:03 localhost ifup[22918]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated. Oct 14 03:29:03 localhost ifup[22919]: 'network-scripts' will be removed from distribution in near future. Oct 14 03:29:03 localhost ifup[22920]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well. Oct 14 03:29:03 localhost NetworkManager[5977]: [1760426943.3515] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan23" pid=22926 uid=0 result="success" Oct 14 03:29:03 localhost ovs-vsctl[22929]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex vlan23 -- add-port br-ex vlan23 tag=23 -- set Interface vlan23 type=internal Oct 14 03:29:03 localhost systemd-udevd[22931]: Network interface NamePolicy= disabled on kernel command line. 
Oct 14 03:29:03 localhost NetworkManager[5977]: [1760426943.3934] manager: (vlan23): new Generic device (/org/freedesktop/NetworkManager/Devices/9) Oct 14 03:29:03 localhost kernel: device vlan23 entered promiscuous mode Oct 14 03:29:03 localhost NetworkManager[5977]: [1760426943.4208] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan23" pid=22941 uid=0 result="success" Oct 14 03:29:03 localhost NetworkManager[5977]: [1760426943.4433] device (vlan23): carrier: link connected Oct 14 03:29:06 localhost NetworkManager[5977]: [1760426946.4942] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan23" pid=22971 uid=0 result="success" Oct 14 03:29:06 localhost NetworkManager[5977]: [1760426946.5347] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan23" pid=22986 uid=0 result="success" Oct 14 03:29:06 localhost NetworkManager[5977]: [1760426946.5878] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan44" pid=23007 uid=0 result="success" Oct 14 03:29:06 localhost ifup[23008]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated. Oct 14 03:29:06 localhost ifup[23009]: 'network-scripts' will be removed from distribution in near future. Oct 14 03:29:06 localhost ifup[23010]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well. Oct 14 03:29:06 localhost NetworkManager[5977]: [1760426946.6125] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan44" pid=23016 uid=0 result="success" Oct 14 03:29:06 localhost ovs-vsctl[23019]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex vlan44 -- add-port br-ex vlan44 tag=44 -- set Interface vlan44 type=internal Oct 14 03:29:06 localhost systemd-udevd[23021]: Network interface NamePolicy= disabled on kernel command line. 
Oct 14 03:29:06 localhost kernel: device vlan44 entered promiscuous mode Oct 14 03:29:06 localhost NetworkManager[5977]: [1760426946.6847] manager: (vlan44): new Generic device (/org/freedesktop/NetworkManager/Devices/10) Oct 14 03:29:06 localhost NetworkManager[5977]: [1760426946.7105] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan44" pid=23031 uid=0 result="success" Oct 14 03:29:06 localhost NetworkManager[5977]: [1760426946.7305] device (vlan44): carrier: link connected Oct 14 03:29:09 localhost NetworkManager[5977]: [1760426949.7765] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan44" pid=23061 uid=0 result="success" Oct 14 03:29:09 localhost NetworkManager[5977]: [1760426949.8174] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan44" pid=23076 uid=0 result="success" Oct 14 03:29:09 localhost NetworkManager[5977]: [1760426949.8640] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan44" pid=23097 uid=0 result="success" Oct 14 03:29:09 localhost ifup[23098]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated. Oct 14 03:29:09 localhost ifup[23099]: 'network-scripts' will be removed from distribution in near future. Oct 14 03:29:09 localhost ifup[23100]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well. 
Oct 14 03:29:09 localhost NetworkManager[5977]: [1760426949.8862] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan44" pid=23106 uid=0 result="success" Oct 14 03:29:09 localhost ovs-vsctl[23109]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex vlan44 -- add-port br-ex vlan44 tag=44 -- set Interface vlan44 type=internal Oct 14 03:29:09 localhost NetworkManager[5977]: [1760426949.9428] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan44" pid=23116 uid=0 result="success" Oct 14 03:29:10 localhost NetworkManager[5977]: [1760426950.9973] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan44" pid=23143 uid=0 result="success" Oct 14 03:29:11 localhost NetworkManager[5977]: [1760426951.0426] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan44" pid=23158 uid=0 result="success" Oct 14 03:29:11 localhost NetworkManager[5977]: [1760426951.0972] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan20" pid=23179 uid=0 result="success" Oct 14 03:29:11 localhost ifup[23180]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated. Oct 14 03:29:11 localhost ifup[23181]: 'network-scripts' will be removed from distribution in near future. Oct 14 03:29:11 localhost ifup[23182]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well. 
Oct 14 03:29:11 localhost NetworkManager[5977]: [1760426951.1252] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan20" pid=23188 uid=0 result="success" Oct 14 03:29:11 localhost ovs-vsctl[23191]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex vlan20 -- add-port br-ex vlan20 tag=20 -- set Interface vlan20 type=internal Oct 14 03:29:11 localhost NetworkManager[5977]: [1760426951.1772] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan20" pid=23198 uid=0 result="success" Oct 14 03:29:12 localhost NetworkManager[5977]: [1760426952.2310] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan20" pid=23226 uid=0 result="success" Oct 14 03:29:12 localhost NetworkManager[5977]: [1760426952.2794] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan20" pid=23241 uid=0 result="success" Oct 14 03:29:12 localhost NetworkManager[5977]: [1760426952.3414] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan21" pid=23262 uid=0 result="success" Oct 14 03:29:12 localhost ifup[23263]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated. Oct 14 03:29:12 localhost ifup[23264]: 'network-scripts' will be removed from distribution in near future. Oct 14 03:29:12 localhost ifup[23265]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well. 
Oct 14 03:29:12 localhost NetworkManager[5977]: [1760426952.3751] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan21" pid=23271 uid=0 result="success" Oct 14 03:29:12 localhost ovs-vsctl[23274]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex vlan21 -- add-port br-ex vlan21 tag=21 -- set Interface vlan21 type=internal Oct 14 03:29:12 localhost NetworkManager[5977]: [1760426952.4809] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan21" pid=23281 uid=0 result="success" Oct 14 03:29:13 localhost NetworkManager[5977]: [1760426953.5400] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan21" pid=23309 uid=0 result="success" Oct 14 03:29:13 localhost NetworkManager[5977]: [1760426953.5898] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan21" pid=23324 uid=0 result="success" Oct 14 03:29:13 localhost NetworkManager[5977]: [1760426953.6420] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan23" pid=23345 uid=0 result="success" Oct 14 03:29:13 localhost ifup[23346]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated. Oct 14 03:29:13 localhost ifup[23347]: 'network-scripts' will be removed from distribution in near future. Oct 14 03:29:13 localhost ifup[23348]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well. 
Oct 14 03:29:13 localhost NetworkManager[5977]: [1760426953.6678] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan23" pid=23354 uid=0 result="success" Oct 14 03:29:13 localhost ovs-vsctl[23357]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex vlan23 -- add-port br-ex vlan23 tag=23 -- set Interface vlan23 type=internal Oct 14 03:29:13 localhost NetworkManager[5977]: [1760426953.7193] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan23" pid=23364 uid=0 result="success" Oct 14 03:29:14 localhost NetworkManager[5977]: [1760426954.7748] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan23" pid=23392 uid=0 result="success" Oct 14 03:29:14 localhost NetworkManager[5977]: [1760426954.8256] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan23" pid=23407 uid=0 result="success" Oct 14 03:29:14 localhost NetworkManager[5977]: [1760426954.8912] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan22" pid=23428 uid=0 result="success" Oct 14 03:29:14 localhost ifup[23429]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated. Oct 14 03:29:14 localhost ifup[23430]: 'network-scripts' will be removed from distribution in near future. Oct 14 03:29:14 localhost ifup[23431]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well. 
Oct 14 03:29:14 localhost NetworkManager[5977]: [1760426954.9248] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan22" pid=23437 uid=0 result="success"
Oct 14 03:29:14 localhost ovs-vsctl[23440]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex vlan22 -- add-port br-ex vlan22 tag=22 -- set Interface vlan22 type=internal
Oct 14 03:29:15 localhost NetworkManager[5977]: [1760426955.0169] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan22" pid=23447 uid=0 result="success"
Oct 14 03:29:16 localhost NetworkManager[5977]: [1760426956.0834] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan22" pid=23475 uid=0 result="success"
Oct 14 03:29:16 localhost NetworkManager[5977]: [1760426956.1335] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan22" pid=23490 uid=0 result="success"
Oct 14 03:29:42 localhost python3[23522]: ansible-ansible.legacy.command Invoked with _raw_params=ip a#012ping -c 2 -W 2 192.168.122.10#012ping -c 2 -W 2 192.168.122.11#012 _uses_shell=True zuul_log_id=fa163ef9-e89a-f848-5676-00000000001b-1-overcloudnovacompute2 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 03:29:46 localhost python3[23541]: ansible-ansible.posix.authorized_key Invoked with user=zuul key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDUv/ZB171sShkvmUwM4/A+38mOKHSoVqmUnoFRrcde+TmaD2jOKfnaBsMdk2YTdAdiPwM8PX7LYcOftZjXZ92Uqg/gQ0pshmFBVtIcoN0HEQlFtMQltRrBVPG+qHK5UOF2bUImKqqFx3uTPSmteSVgJtwvFqp/51YTUibYgQBWJPCcOSze95nxendWi6PoXzvorqCyVS44Llj4LmLChBJeqAI5cWs2EeDhQ4Tw8F33iKpBg8WjZAbQVbe2KIQYURMtANtjUJ0Yg5RTArSq57504iqodB4+ynahul8Dp5+TocLZTPu5orcqRGqWDe7CN5pc1eXZQuNNZ0jW59y52GY+ox+WCmp1qvB7TQzhc/r+kAVmT8VNTVUvC5TBGcIw3yxI7lzrd03zpenSL3oyJnFN4SXCeAA8YcXlz7ySaO9YAtbCSdkgj8QJCiykvalRm17F4d4aRX5+rtfEm+WG670vF6FRNNo5OTXTK2Ja84pej1bjzDBvEz81D1EqnHybfJ0= zuul-build-sshkey manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 14 03:29:46 localhost python3[23557]: ansible-ansible.posix.authorized_key Invoked with user=root key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDUv/ZB171sShkvmUwM4/A+38mOKHSoVqmUnoFRrcde+TmaD2jOKfnaBsMdk2YTdAdiPwM8PX7LYcOftZjXZ92Uqg/gQ0pshmFBVtIcoN0HEQlFtMQltRrBVPG+qHK5UOF2bUImKqqFx3uTPSmteSVgJtwvFqp/51YTUibYgQBWJPCcOSze95nxendWi6PoXzvorqCyVS44Llj4LmLChBJeqAI5cWs2EeDhQ4Tw8F33iKpBg8WjZAbQVbe2KIQYURMtANtjUJ0Yg5RTArSq57504iqodB4+ynahul8Dp5+TocLZTPu5orcqRGqWDe7CN5pc1eXZQuNNZ0jW59y52GY+ox+WCmp1qvB7TQzhc/r+kAVmT8VNTVUvC5TBGcIw3yxI7lzrd03zpenSL3oyJnFN4SXCeAA8YcXlz7ySaO9YAtbCSdkgj8QJCiykvalRm17F4d4aRX5+rtfEm+WG670vF6FRNNo5OTXTK2Ja84pej1bjzDBvEz81D1EqnHybfJ0= zuul-build-sshkey manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 14 03:29:48 localhost python3[23571]: ansible-ansible.posix.authorized_key Invoked with user=zuul key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDUv/ZB171sShkvmUwM4/A+38mOKHSoVqmUnoFRrcde+TmaD2jOKfnaBsMdk2YTdAdiPwM8PX7LYcOftZjXZ92Uqg/gQ0pshmFBVtIcoN0HEQlFtMQltRrBVPG+qHK5UOF2bUImKqqFx3uTPSmteSVgJtwvFqp/51YTUibYgQBWJPCcOSze95nxendWi6PoXzvorqCyVS44Llj4LmLChBJeqAI5cWs2EeDhQ4Tw8F33iKpBg8WjZAbQVbe2KIQYURMtANtjUJ0Yg5RTArSq57504iqodB4+ynahul8Dp5+TocLZTPu5orcqRGqWDe7CN5pc1eXZQuNNZ0jW59y52GY+ox+WCmp1qvB7TQzhc/r+kAVmT8VNTVUvC5TBGcIw3yxI7lzrd03zpenSL3oyJnFN4SXCeAA8YcXlz7ySaO9YAtbCSdkgj8QJCiykvalRm17F4d4aRX5+rtfEm+WG670vF6FRNNo5OTXTK2Ja84pej1bjzDBvEz81D1EqnHybfJ0= zuul-build-sshkey manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 14 03:29:48 localhost python3[23587]: ansible-ansible.posix.authorized_key Invoked with user=root key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDUv/ZB171sShkvmUwM4/A+38mOKHSoVqmUnoFRrcde+TmaD2jOKfnaBsMdk2YTdAdiPwM8PX7LYcOftZjXZ92Uqg/gQ0pshmFBVtIcoN0HEQlFtMQltRrBVPG+qHK5UOF2bUImKqqFx3uTPSmteSVgJtwvFqp/51YTUibYgQBWJPCcOSze95nxendWi6PoXzvorqCyVS44Llj4LmLChBJeqAI5cWs2EeDhQ4Tw8F33iKpBg8WjZAbQVbe2KIQYURMtANtjUJ0Yg5RTArSq57504iqodB4+ynahul8Dp5+TocLZTPu5orcqRGqWDe7CN5pc1eXZQuNNZ0jW59y52GY+ox+WCmp1qvB7TQzhc/r+kAVmT8VNTVUvC5TBGcIw3yxI7lzrd03zpenSL3oyJnFN4SXCeAA8YcXlz7ySaO9YAtbCSdkgj8QJCiykvalRm17F4d4aRX5+rtfEm+WG670vF6FRNNo5OTXTK2Ja84pej1bjzDBvEz81D1EqnHybfJ0= zuul-build-sshkey manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 14 03:29:49 localhost python3[23601]: ansible-ansible.builtin.slurp Invoked with path=/etc/hostname src=/etc/hostname
Oct 14 03:29:50 localhost python3[23616]: ansible-ansible.legacy.command Invoked with _raw_params=hostname="np0005486733.novalocal"#012hostname_str_array=(${hostname//./ })#012echo ${hostname_str_array[0]} > /home/zuul/ansible_hostname#012 _uses_shell=True zuul_log_id=fa163ef9-e89a-f848-5676-000000000022-1-overcloudnovacompute2 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 03:29:51 localhost python3[23636]: ansible-ansible.legacy.command Invoked with _raw_params=hostname=$(cat /home/zuul/ansible_hostname)#012hostnamectl hostname "$hostname.localdomain"#012 _uses_shell=True zuul_log_id=fa163ef9-e89a-f848-5676-000000000023-1-overcloudnovacompute2 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 03:29:51 localhost systemd[1]: Starting Hostname Service...
Oct 14 03:29:51 localhost systemd[1]: Started Hostname Service.
Oct 14 03:29:51 localhost systemd-hostnamed[23640]: Hostname set to (static)
Oct 14 03:29:51 localhost NetworkManager[5977]: [1760426991.3749] hostname: static hostname changed from "np0005486733.novalocal" to "np0005486733.localdomain"
Oct 14 03:29:51 localhost systemd[1]: Starting Network Manager Script Dispatcher Service...
Oct 14 03:29:51 localhost systemd[1]: Started Network Manager Script Dispatcher Service.
Oct 14 03:29:52 localhost systemd[1]: session-10.scope: Deactivated successfully.
Oct 14 03:29:52 localhost systemd-logind[760]: Session 10 logged out. Waiting for processes to exit.
Oct 14 03:29:52 localhost systemd[1]: session-10.scope: Consumed 1min 42.975s CPU time.
Oct 14 03:29:52 localhost systemd-logind[760]: Removed session 10.
Oct 14 03:29:55 localhost sshd[23651]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 03:29:55 localhost systemd-logind[760]: New session 11 of user zuul.
Oct 14 03:29:55 localhost systemd[1]: Started Session 11 of User zuul.
Oct 14 03:29:55 localhost python3[23668]: ansible-ansible.builtin.slurp Invoked with path=/home/zuul/ansible_hostname src=/home/zuul/ansible_hostname
Oct 14 03:29:57 localhost systemd[1]: session-11.scope: Deactivated successfully.
Oct 14 03:29:57 localhost systemd-logind[760]: Session 11 logged out. Waiting for processes to exit.
Oct 14 03:29:57 localhost systemd-logind[760]: Removed session 11.
Oct 14 03:30:01 localhost systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Oct 14 03:30:21 localhost systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Oct 14 03:30:56 localhost sshd[23671]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 03:30:56 localhost systemd-logind[760]: New session 12 of user zuul.
Oct 14 03:30:56 localhost systemd[1]: Started Session 12 of User zuul.
Oct 14 03:30:56 localhost python3[23690]: ansible-ansible.legacy.dnf Invoked with name=['lvm2', 'jq'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Oct 14 03:31:00 localhost systemd[1]: Reloading.
Oct 14 03:31:00 localhost systemd-rc-local-generator[23731]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 03:31:00 localhost systemd-sysv-generator[23735]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 03:31:00 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 03:31:00 localhost systemd[1]: Listening on Device-mapper event daemon FIFOs.
Oct 14 03:31:00 localhost systemd[1]: Reloading.
Oct 14 03:31:00 localhost systemd-rc-local-generator[23774]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 03:31:00 localhost systemd-sysv-generator[23778]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 03:31:00 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 03:31:01 localhost systemd[1]: Starting Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling...
Oct 14 03:31:01 localhost systemd[1]: Finished Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling.
Oct 14 03:31:01 localhost systemd[1]: Reloading.
Oct 14 03:31:01 localhost systemd-rc-local-generator[23812]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 03:31:01 localhost systemd-sysv-generator[23818]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 03:31:01 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 03:31:01 localhost systemd[1]: Listening on LVM2 poll daemon socket.
Oct 14 03:31:01 localhost systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Oct 14 03:31:01 localhost systemd[1]: Starting man-db-cache-update.service...
Oct 14 03:31:01 localhost systemd[1]: Reloading.
Oct 14 03:31:01 localhost systemd-rc-local-generator[23864]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 03:31:01 localhost systemd-sysv-generator[23868]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 03:31:01 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 03:31:02 localhost systemd[1]: Queuing reload/restart jobs for marked units…
Oct 14 03:31:02 localhost systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Oct 14 03:31:02 localhost systemd[1]: man-db-cache-update.service: Deactivated successfully.
Oct 14 03:31:02 localhost systemd[1]: Finished man-db-cache-update.service.
Oct 14 03:31:02 localhost systemd[1]: run-r73da78879b6147dfa9fcdc33a6fdd905.service: Deactivated successfully.
Oct 14 03:31:02 localhost systemd[1]: run-r53d2f83812a744c69141bca69b2d88a6.service: Deactivated successfully.
Oct 14 03:32:03 localhost systemd[1]: session-12.scope: Deactivated successfully.
Oct 14 03:32:03 localhost systemd[1]: session-12.scope: Consumed 4.576s CPU time.
Oct 14 03:32:03 localhost systemd-logind[760]: Session 12 logged out. Waiting for processes to exit.
Oct 14 03:32:03 localhost systemd-logind[760]: Removed session 12.
Oct 14 03:47:58 localhost sshd[24471]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 03:47:58 localhost systemd-logind[760]: New session 13 of user zuul.
Oct 14 03:47:58 localhost systemd[1]: Started Session 13 of User zuul.
Oct 14 03:47:58 localhost python3[24519]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 14 03:48:00 localhost python3[24606]: ansible-ansible.builtin.dnf Invoked with name=['util-linux', 'lvm2', 'jq', 'podman'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Oct 14 03:48:03 localhost python3[24623]: ansible-ansible.builtin.stat Invoked with path=/dev/loop3 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 14 03:48:04 localhost python3[24640]: ansible-ansible.legacy.command Invoked with _raw_params=dd if=/dev/zero of=/var/lib/ceph-osd-0.img bs=1 count=0 seek=7G#012losetup /dev/loop3 /var/lib/ceph-osd-0.img#012lsblk _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 03:48:04 localhost kernel: loop: module loaded
Oct 14 03:48:04 localhost kernel: loop3: detected capacity change from 0 to 14680064
Oct 14 03:48:04 localhost python3[24665]: ansible-ansible.legacy.command Invoked with _raw_params=pvcreate /dev/loop3#012vgcreate ceph_vg0 /dev/loop3#012lvcreate -n ceph_lv0 -l +100%FREE ceph_vg0#012lvs _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 03:48:04 localhost lvm[24668]: PV /dev/loop3 not used.
Oct 14 03:48:04 localhost lvm[24671]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Oct 14 03:48:05 localhost systemd[1]: Started /usr/sbin/lvm vgchange -aay --autoactivation event ceph_vg0.
Oct 14 03:48:05 localhost lvm[24681]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Oct 14 03:48:05 localhost lvm[24681]: VG ceph_vg0 finished
Oct 14 03:48:05 localhost lvm[24679]: 1 logical volume(s) in volume group "ceph_vg0" now active
Oct 14 03:48:05 localhost systemd[1]: lvm-activate-ceph_vg0.service: Deactivated successfully.
Oct 14 03:48:05 localhost python3[24730]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/ceph-osd-losetup-0.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 14 03:48:06 localhost python3[24773]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1760428085.3665173-55036-234795925063982/source dest=/etc/systemd/system/ceph-osd-losetup-0.service mode=0644 force=True follow=False _original_basename=ceph-osd-losetup.service.j2 checksum=427b1db064a970126b729b07acf99fa7d0eecb9c backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 03:48:06 localhost python3[24803]: ansible-ansible.builtin.systemd Invoked with state=started enabled=True name=ceph-osd-losetup-0.service daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 14 03:48:07 localhost systemd[1]: Reloading.
Oct 14 03:48:07 localhost systemd-rc-local-generator[24832]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 03:48:07 localhost systemd-sysv-generator[24836]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 03:48:07 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 03:48:07 localhost systemd[1]: Starting Ceph OSD losetup...
Oct 14 03:48:07 localhost bash[24844]: /dev/loop3: [64516]:8399529 (/var/lib/ceph-osd-0.img)
Oct 14 03:48:07 localhost systemd[1]: Finished Ceph OSD losetup.
Oct 14 03:48:07 localhost lvm[24845]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Oct 14 03:48:07 localhost lvm[24845]: VG ceph_vg0 finished
Oct 14 03:48:08 localhost python3[24861]: ansible-ansible.builtin.dnf Invoked with name=['util-linux', 'lvm2', 'jq', 'podman'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Oct 14 03:48:11 localhost python3[24878]: ansible-ansible.builtin.stat Invoked with path=/dev/loop4 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 14 03:48:12 localhost python3[24894]: ansible-ansible.legacy.command Invoked with _raw_params=dd if=/dev/zero of=/var/lib/ceph-osd-1.img bs=1 count=0 seek=7G#012losetup /dev/loop4 /var/lib/ceph-osd-1.img#012lsblk _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 03:48:12 localhost kernel: loop4: detected capacity change from 0 to 14680064
Oct 14 03:48:13 localhost python3[24916]: ansible-ansible.legacy.command Invoked with _raw_params=pvcreate /dev/loop4#012vgcreate ceph_vg1 /dev/loop4#012lvcreate -n ceph_lv1 -l +100%FREE ceph_vg1#012lvs _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 03:48:13 localhost lvm[24919]: PV /dev/loop4 not used.
Oct 14 03:48:13 localhost lvm[24921]: PV /dev/loop4 online, VG ceph_vg1 is complete.
Oct 14 03:48:13 localhost systemd[1]: Started /usr/sbin/lvm vgchange -aay --autoactivation event ceph_vg1.
Oct 14 03:48:13 localhost lvm[24928]: 1 logical volume(s) in volume group "ceph_vg1" now active
Oct 14 03:48:13 localhost lvm[24932]: PV /dev/loop4 online, VG ceph_vg1 is complete.
Oct 14 03:48:13 localhost lvm[24932]: VG ceph_vg1 finished
Oct 14 03:48:13 localhost systemd[1]: lvm-activate-ceph_vg1.service: Deactivated successfully.
Oct 14 03:48:13 localhost python3[24980]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/ceph-osd-losetup-1.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 14 03:48:14 localhost python3[25023]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1760428093.4814517-55206-184037185555650/source dest=/etc/systemd/system/ceph-osd-losetup-1.service mode=0644 force=True follow=False _original_basename=ceph-osd-losetup.service.j2 checksum=19612168ea279db4171b94ee1f8625de1ec44b58 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 03:48:14 localhost python3[25053]: ansible-ansible.builtin.systemd Invoked with state=started enabled=True name=ceph-osd-losetup-1.service daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 14 03:48:14 localhost systemd[1]: Reloading.
Oct 14 03:48:14 localhost systemd-sysv-generator[25081]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 03:48:14 localhost systemd-rc-local-generator[25076]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 03:48:15 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 03:48:15 localhost systemd[1]: Starting Ceph OSD losetup...
Oct 14 03:48:15 localhost bash[25093]: /dev/loop4: [64516]:9185253 (/var/lib/ceph-osd-1.img)
Oct 14 03:48:15 localhost systemd[1]: Finished Ceph OSD losetup.
Oct 14 03:48:15 localhost lvm[25094]: PV /dev/loop4 online, VG ceph_vg1 is complete.
Oct 14 03:48:15 localhost lvm[25094]: VG ceph_vg1 finished
Oct 14 03:48:23 localhost python3[25140]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all', 'min'] gather_timeout=45 filter=[] fact_path=/etc/ansible/facts.d
Oct 14 03:48:24 localhost python3[25160]: ansible-hostname Invoked with name=np0005486733.localdomain use=None
Oct 14 03:48:24 localhost systemd[1]: Starting Hostname Service...
Oct 14 03:48:24 localhost systemd[1]: Started Hostname Service.
Oct 14 03:48:27 localhost python3[25183]: ansible-tempfile Invoked with state=file suffix=tmphosts prefix=ansible. path=None
Oct 14 03:48:27 localhost python3[25231]: ansible-ansible.legacy.copy Invoked with remote_src=True src=/etc/hosts dest=/tmp/ansible.cz0bl9cztmphosts mode=preserve backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 03:48:28 localhost python3[25261]: ansible-blockinfile Invoked with state=absent path=/tmp/ansible.cz0bl9cztmphosts block= marker=# {mark} marker_begin=HEAT_HOSTS_START - Do not edit manually within this section! marker_end=HEAT_HOSTS_END create=False backup=False unsafe_writes=False insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 03:48:28 localhost python3[25277]: ansible-blockinfile Invoked with create=True path=/tmp/ansible.cz0bl9cztmphosts insertbefore=BOF block=192.168.122.106 np0005486731.localdomain np0005486731#012192.168.122.106 np0005486731.ctlplane.localdomain np0005486731.ctlplane#012192.168.122.107 np0005486732.localdomain np0005486732#012192.168.122.107 np0005486732.ctlplane.localdomain np0005486732.ctlplane#012192.168.122.108 np0005486733.localdomain np0005486733#012192.168.122.108 np0005486733.ctlplane.localdomain np0005486733.ctlplane#012192.168.122.103 np0005486728.localdomain np0005486728#012192.168.122.103 np0005486728.ctlplane.localdomain np0005486728.ctlplane#012192.168.122.104 np0005486729.localdomain np0005486729#012192.168.122.104 np0005486729.ctlplane.localdomain np0005486729.ctlplane#012192.168.122.105 np0005486730.localdomain np0005486730#012192.168.122.105 np0005486730.ctlplane.localdomain np0005486730.ctlplane#012#012192.168.122.100 undercloud.ctlplane.localdomain undercloud.ctlplane#012 marker=# {mark} marker_begin=START_HOST_ENTRIES_FOR_STACK: overcloud marker_end=END_HOST_ENTRIES_FOR_STACK: overcloud state=present backup=False unsafe_writes=False insertafter=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 03:48:29 localhost python3[25293]: ansible-ansible.legacy.command Invoked with _raw_params=cp "/tmp/ansible.cz0bl9cztmphosts" "/etc/hosts" _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 03:48:29 localhost python3[25310]: ansible-file Invoked with path=/tmp/ansible.cz0bl9cztmphosts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 03:48:31 localhost python3[25326]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-active ntpd.service || systemctl is-enabled ntpd.service _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 03:48:32 localhost python3[25344]: ansible-ansible.legacy.dnf Invoked with name=['chrony'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Oct 14 03:48:36 localhost python3[25393]: ansible-ansible.legacy.stat Invoked with path=/etc/chrony.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 14 03:48:37 localhost python3[25438]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1760428116.4407268-56061-144438239082951/source dest=/etc/chrony.conf owner=root group=root mode=420 follow=False _original_basename=chrony.conf.j2 checksum=4fd4fbbb2de00c70a54478b7feb8ef8adf6a3362 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 03:48:38 localhost python3[25468]: ansible-ansible.legacy.systemd Invoked with enabled=True name=chronyd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 14 03:48:39 localhost python3[25486]: ansible-ansible.legacy.systemd Invoked with name=chronyd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct 14 03:48:39 localhost chronyd[766]: chronyd exiting
Oct 14 03:48:39 localhost systemd[1]: Stopping NTP client/server...
Oct 14 03:48:39 localhost systemd[1]: chronyd.service: Deactivated successfully.
Oct 14 03:48:39 localhost systemd[1]: Stopped NTP client/server.
Oct 14 03:48:39 localhost systemd[1]: chronyd.service: Consumed 98ms CPU time, read 1.9M from disk, written 0B to disk.
Oct 14 03:48:39 localhost systemd[1]: Starting NTP client/server...
Oct 14 03:48:39 localhost chronyd[25493]: chronyd version 4.3 starting (+CMDMON +NTP +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +ASYNCDNS +NTS +SECHASH +IPV6 +DEBUG)
Oct 14 03:48:39 localhost chronyd[25493]: Frequency -30.415 +/- 0.046 ppm read from /var/lib/chrony/drift
Oct 14 03:48:39 localhost chronyd[25493]: Loaded seccomp filter (level 2)
Oct 14 03:48:39 localhost systemd[1]: Started NTP client/server.
Oct 14 03:48:41 localhost python3[25542]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/chrony-online.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 14 03:48:41 localhost python3[25585]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1760428120.7806895-56293-63082571519306/source dest=/etc/systemd/system/chrony-online.service _original_basename=chrony-online.service follow=False checksum=d4d85e046d61f558ac7ec8178c6d529d893e81e1 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 03:48:42 localhost python3[25615]: ansible-systemd Invoked with state=started name=chrony-online.service enabled=True daemon-reload=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 14 03:48:42 localhost systemd[1]: Reloading.
Oct 14 03:48:42 localhost systemd-sysv-generator[25639]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 03:48:42 localhost systemd-rc-local-generator[25636]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 03:48:42 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 03:48:42 localhost systemd[1]: Reloading.
Oct 14 03:48:42 localhost systemd-rc-local-generator[25678]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 03:48:42 localhost systemd-sysv-generator[25683]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 03:48:42 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 03:48:42 localhost systemd[1]: Starting chronyd online sources service...
Oct 14 03:48:42 localhost chronyc[25691]: 200 OK
Oct 14 03:48:42 localhost systemd[1]: chrony-online.service: Deactivated successfully.
Oct 14 03:48:42 localhost systemd[1]: Finished chronyd online sources service.
Oct 14 03:48:43 localhost python3[25708]: ansible-ansible.legacy.command Invoked with _raw_params=chronyc makestep _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 03:48:43 localhost chronyd[25493]: System clock was stepped by 0.000000 seconds
Oct 14 03:48:43 localhost python3[25725]: ansible-ansible.legacy.command Invoked with _raw_params=chronyc waitsync 30 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 03:48:43 localhost chronyd[25493]: Selected source 216.232.132.102 (pool.ntp.org)
Oct 14 03:48:54 localhost python3[25743]: ansible-timezone Invoked with name=UTC hwclock=None
Oct 14 03:48:54 localhost systemd[1]: Starting Time & Date Service...
Oct 14 03:48:54 localhost systemd[1]: Started Time & Date Service.
Oct 14 03:48:54 localhost systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Oct 14 03:48:56 localhost python3[25765]: ansible-ansible.legacy.systemd Invoked with name=chronyd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct 14 03:48:56 localhost chronyd[25493]: chronyd exiting
Oct 14 03:48:56 localhost systemd[1]: Stopping NTP client/server...
Oct 14 03:48:56 localhost systemd[1]: chronyd.service: Deactivated successfully.
Oct 14 03:48:56 localhost systemd[1]: Stopped NTP client/server.
Oct 14 03:48:56 localhost systemd[1]: Starting NTP client/server...
Oct 14 03:48:56 localhost chronyd[25772]: chronyd version 4.3 starting (+CMDMON +NTP +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +ASYNCDNS +NTS +SECHASH +IPV6 +DEBUG)
Oct 14 03:48:56 localhost chronyd[25772]: Frequency -30.415 +/- 0.048 ppm read from /var/lib/chrony/drift
Oct 14 03:48:56 localhost chronyd[25772]: Loaded seccomp filter (level 2)
Oct 14 03:48:56 localhost systemd[1]: Started NTP client/server.
Oct 14 03:49:00 localhost chronyd[25772]: Selected source 162.159.200.1 (pool.ntp.org)
Oct 14 03:49:24 localhost systemd[1]: systemd-timedated.service: Deactivated successfully.
Oct 14 03:51:03 localhost sshd[25969]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 03:51:03 localhost systemd-logind[760]: New session 14 of user ceph-admin.
Oct 14 03:51:03 localhost systemd[1]: Created slice User Slice of UID 1002.
Oct 14 03:51:03 localhost systemd[1]: Starting User Runtime Directory /run/user/1002...
Oct 14 03:51:03 localhost systemd[1]: Finished User Runtime Directory /run/user/1002.
Oct 14 03:51:03 localhost systemd[1]: Starting User Manager for UID 1002...
Oct 14 03:51:03 localhost sshd[25986]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 03:51:03 localhost systemd[25973]: Queued start job for default target Main User Target.
Oct 14 03:51:03 localhost systemd[25973]: Created slice User Application Slice.
Oct 14 03:51:03 localhost systemd[25973]: Started Mark boot as successful after the user session has run 2 minutes.
Oct 14 03:51:03 localhost systemd[25973]: Started Daily Cleanup of User's Temporary Directories.
Oct 14 03:51:03 localhost systemd[25973]: Reached target Paths.
Oct 14 03:51:03 localhost systemd[25973]: Reached target Timers.
Oct 14 03:51:03 localhost systemd-logind[760]: New session 16 of user ceph-admin.
Oct 14 03:51:03 localhost systemd[25973]: Starting D-Bus User Message Bus Socket...
Oct 14 03:51:03 localhost systemd[25973]: Starting Create User's Volatile Files and Directories...
Oct 14 03:51:03 localhost systemd[25973]: Listening on D-Bus User Message Bus Socket.
Oct 14 03:51:03 localhost systemd[25973]: Reached target Sockets.
Oct 14 03:51:03 localhost systemd[25973]: Finished Create User's Volatile Files and Directories.
Oct 14 03:51:03 localhost systemd[25973]: Reached target Basic System.
Oct 14 03:51:03 localhost systemd[25973]: Reached target Main User Target.
Oct 14 03:51:03 localhost systemd[25973]: Startup finished in 122ms.
Oct 14 03:51:03 localhost systemd[1]: Started User Manager for UID 1002.
Oct 14 03:51:03 localhost systemd[1]: Started Session 14 of User ceph-admin.
Oct 14 03:51:03 localhost systemd[1]: Started Session 16 of User ceph-admin.
Oct 14 03:51:04 localhost sshd[26008]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 03:51:04 localhost systemd-logind[760]: New session 17 of user ceph-admin.
Oct 14 03:51:04 localhost systemd[1]: Started Session 17 of User ceph-admin.
Oct 14 03:51:04 localhost sshd[26027]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 03:51:04 localhost systemd-logind[760]: New session 18 of user ceph-admin.
Oct 14 03:51:04 localhost systemd[1]: Started Session 18 of User ceph-admin.
Oct 14 03:51:04 localhost sshd[26046]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 03:51:04 localhost systemd-logind[760]: New session 19 of user ceph-admin.
Oct 14 03:51:04 localhost systemd[1]: Started Session 19 of User ceph-admin.
Oct 14 03:51:05 localhost sshd[26065]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 03:51:05 localhost systemd-logind[760]: New session 20 of user ceph-admin.
Oct 14 03:51:05 localhost systemd[1]: Started Session 20 of User ceph-admin.
Oct 14 03:51:05 localhost sshd[26084]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 03:51:05 localhost systemd-logind[760]: New session 21 of user ceph-admin.
Oct 14 03:51:05 localhost systemd[1]: Started Session 21 of User ceph-admin.
Oct 14 03:51:05 localhost sshd[26103]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 03:51:05 localhost systemd-logind[760]: New session 22 of user ceph-admin.
Oct 14 03:51:05 localhost systemd[1]: Started Session 22 of User ceph-admin.
Oct 14 03:51:06 localhost sshd[26122]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 03:51:06 localhost systemd-logind[760]: New session 23 of user ceph-admin.
Oct 14 03:51:06 localhost systemd[1]: Started Session 23 of User ceph-admin.
Oct 14 03:51:06 localhost sshd[26141]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 03:51:06 localhost systemd-logind[760]: New session 24 of user ceph-admin.
Oct 14 03:51:06 localhost systemd[1]: Started Session 24 of User ceph-admin.
Oct 14 03:51:07 localhost sshd[26158]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 03:51:07 localhost systemd-logind[760]: New session 25 of user ceph-admin.
Oct 14 03:51:07 localhost systemd[1]: Started Session 25 of User ceph-admin.
Oct 14 03:51:07 localhost sshd[26177]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 03:51:07 localhost systemd-logind[760]: New session 26 of user ceph-admin.
Oct 14 03:51:07 localhost systemd[1]: Started Session 26 of User ceph-admin.
Oct 14 03:51:08 localhost systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 14 03:51:22 localhost systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 14 03:51:22 localhost systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 14 03:51:23 localhost systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 14 03:51:23 localhost systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 14 03:51:23 localhost systemd[1]: proc-sys-fs-binfmt_misc.automount: Got automount request for /proc/sys/fs/binfmt_misc, triggered by 26393 (sysctl)
Oct 14 03:51:23 localhost systemd[1]: Mounting Arbitrary Executable File Formats File System...
Oct 14 03:51:23 localhost systemd[1]: Mounted Arbitrary Executable File Formats File System.
Oct 14 03:51:24 localhost systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 14 03:51:24 localhost systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 14 03:51:24 localhost systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 14 03:51:28 localhost kernel: VFS: idmapped mount is not enabled.
Oct 14 03:51:48 localhost podman[26529]:
Oct 14 03:51:48 localhost podman[26529]: 2025-10-14 07:51:24.695945515 +0000 UTC m=+0.039891421 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Oct 14 03:51:48 localhost podman[26529]: 2025-10-14 07:51:48.065666164 +0000 UTC m=+23.409612040 container create 30642bd59eb27cdd49dbd3c06799190cc02f36cf9d63071f9bd5b79295446dea (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=strange_hypatia, distribution-scope=public, build-date=2025-09-24T08:57:55, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, architecture=x86_64, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux , vendor=Red Hat, Inc., com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, ceph=True, GIT_CLEAN=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, description=Red Hat Ceph Storage 7, release=553, RELEASE=main, CEPH_POINT_RELEASE=, version=7, io.buildah.version=1.33.12)
Oct 14 03:51:48 localhost systemd[1]: var-lib-containers-storage-overlay-volatile\x2dcheck1008937431-merged.mount: Deactivated successfully.
Oct 14 03:51:48 localhost systemd[1]: Created slice Slice /machine.
Oct 14 03:51:48 localhost systemd[1]: Started libpod-conmon-30642bd59eb27cdd49dbd3c06799190cc02f36cf9d63071f9bd5b79295446dea.scope.
Oct 14 03:51:48 localhost systemd[1]: Started libcrun container.
Oct 14 03:51:48 localhost podman[26529]: 2025-10-14 07:51:48.210383525 +0000 UTC m=+23.554329431 container init 30642bd59eb27cdd49dbd3c06799190cc02f36cf9d63071f9bd5b79295446dea (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=strange_hypatia, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, architecture=x86_64, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., vcs-type=git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_CLEAN=True, CEPH_POINT_RELEASE=, io.buildah.version=1.33.12, io.openshift.expose-services=, release=553, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, RELEASE=main, name=rhceph, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/agreements, ceph=True, description=Red Hat Ceph Storage 7, version=7, maintainer=Guillaume Abrioux , GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Oct 14 03:51:48 localhost podman[26529]: 2025-10-14 07:51:48.223317673 +0000 UTC m=+23.567263549 container start 30642bd59eb27cdd49dbd3c06799190cc02f36cf9d63071f9bd5b79295446dea (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=strange_hypatia, build-date=2025-09-24T08:57:55, vendor=Red Hat, Inc., io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, version=7, architecture=x86_64, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux , com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, ceph=True, RELEASE=main, release=553, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, GIT_CLEAN=True, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3)
Oct 14 03:51:48 localhost podman[26529]: 2025-10-14 07:51:48.223675114 +0000 UTC m=+23.567621050 container attach 30642bd59eb27cdd49dbd3c06799190cc02f36cf9d63071f9bd5b79295446dea (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=strange_hypatia, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_BRANCH=main, GIT_CLEAN=True, distribution-scope=public, maintainer=Guillaume Abrioux , vcs-type=git, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, release=553, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, io.buildah.version=1.33.12, ceph=True, architecture=x86_64, RELEASE=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, name=rhceph, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-09-24T08:57:55, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, version=7)
Oct 14 03:51:48 localhost strange_hypatia[26788]: 167 167
Oct 14 03:51:48 localhost systemd[1]: libpod-30642bd59eb27cdd49dbd3c06799190cc02f36cf9d63071f9bd5b79295446dea.scope: Deactivated successfully.
Oct 14 03:51:48 localhost podman[26529]: 2025-10-14 07:51:48.227654757 +0000 UTC m=+23.571600673 container died 30642bd59eb27cdd49dbd3c06799190cc02f36cf9d63071f9bd5b79295446dea (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=strange_hypatia, name=rhceph, release=553, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, maintainer=Guillaume Abrioux , vendor=Red Hat, Inc., vcs-type=git, GIT_BRANCH=main, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, build-date=2025-09-24T08:57:55, io.buildah.version=1.33.12, architecture=x86_64, distribution-scope=public, ceph=True, io.openshift.expose-services=, io.openshift.tags=rhceph ceph)
Oct 14 03:51:48 localhost podman[26793]: 2025-10-14 07:51:48.326841434 +0000 UTC m=+0.083457394 container remove 30642bd59eb27cdd49dbd3c06799190cc02f36cf9d63071f9bd5b79295446dea (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=strange_hypatia, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.component=rhceph-container, release=553, distribution-scope=public, GIT_BRANCH=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, vendor=Red Hat, Inc., ceph=True, io.openshift.expose-services=, name=rhceph, io.buildah.version=1.33.12, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, build-date=2025-09-24T08:57:55, maintainer=Guillaume Abrioux , RELEASE=main, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, version=7)
Oct 14 03:51:48 localhost systemd[1]: libpod-conmon-30642bd59eb27cdd49dbd3c06799190cc02f36cf9d63071f9bd5b79295446dea.scope: Deactivated successfully.
Oct 14 03:51:48 localhost podman[26813]:
Oct 14 03:51:48 localhost podman[26813]: 2025-10-14 07:51:48.52198386 +0000 UTC m=+0.040093057 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Oct 14 03:51:49 localhost systemd[1]: tmp-crun.wJAOff.mount: Deactivated successfully.
Oct 14 03:51:49 localhost systemd[1]: var-lib-containers-storage-overlay-c6e0c8752057e4d2c2cd4d67dae9cff119eaef57351a6988071af39f0e424b14-merged.mount: Deactivated successfully.
Oct 14 03:51:51 localhost podman[26813]: 2025-10-14 07:51:51.757810872 +0000 UTC m=+3.275920059 container create 7a61e91764278485ad754c2fb73e8d40dac00aa2c045306103c15f0d19f495dc (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=boring_villani, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux , vcs-type=git, name=rhceph, release=553, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.expose-services=, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, vendor=Red Hat, Inc., RELEASE=main, description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, build-date=2025-09-24T08:57:55, GIT_CLEAN=True, io.buildah.version=1.33.12, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Oct 14 03:51:51 localhost systemd[1]: Started libpod-conmon-7a61e91764278485ad754c2fb73e8d40dac00aa2c045306103c15f0d19f495dc.scope.
Oct 14 03:51:51 localhost systemd[1]: Started libcrun container.
Oct 14 03:51:51 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c45b08843b39847612b2c9d98d4af69344a5db451e81bc23ec75214c46405b7f/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Oct 14 03:51:51 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c45b08843b39847612b2c9d98d4af69344a5db451e81bc23ec75214c46405b7f/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Oct 14 03:51:51 localhost podman[26813]: 2025-10-14 07:51:51.861721496 +0000 UTC m=+3.379830713 container init 7a61e91764278485ad754c2fb73e8d40dac00aa2c045306103c15f0d19f495dc (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=boring_villani, description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux , GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., release=553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, name=rhceph, io.buildah.version=1.33.12, io.openshift.expose-services=, version=7, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, RELEASE=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64)
Oct 14 03:51:51 localhost podman[26813]: 2025-10-14 07:51:51.872160649 +0000 UTC m=+3.390269866 container start 7a61e91764278485ad754c2fb73e8d40dac00aa2c045306103c15f0d19f495dc (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=boring_villani, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.33.12, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, vendor=Red Hat, Inc., vcs-type=git, distribution-scope=public, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, GIT_CLEAN=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, maintainer=Guillaume Abrioux , GIT_BRANCH=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=553, io.openshift.tags=rhceph ceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, RELEASE=main)
Oct 14 03:51:51 localhost podman[26813]: 2025-10-14 07:51:51.873494409 +0000 UTC m=+3.391603696 container attach 7a61e91764278485ad754c2fb73e8d40dac00aa2c045306103c15f0d19f495dc (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=boring_villani, io.openshift.expose-services=, GIT_CLEAN=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.license_terms=https://www.redhat.com/agreements, CEPH_POINT_RELEASE=, GIT_BRANCH=main, maintainer=Guillaume Abrioux , version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, RELEASE=main, vcs-type=git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, release=553, io.buildah.version=1.33.12, io.openshift.tags=rhceph ceph, architecture=x86_64, build-date=2025-09-24T08:57:55, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, ceph=True, distribution-scope=public)
Oct 14 03:51:52 localhost boring_villani[27035]: [
Oct 14 03:51:52 localhost boring_villani[27035]: {
Oct 14 03:51:52 localhost boring_villani[27035]: "available": false,
Oct 14 03:51:52 localhost boring_villani[27035]: "ceph_device": false,
Oct 14 03:51:52 localhost boring_villani[27035]: "device_id": "QEMU_DVD-ROM_QM00001",
Oct 14 03:51:52 localhost boring_villani[27035]: "lsm_data": {},
Oct 14 03:51:52 localhost boring_villani[27035]: "lvs": [],
Oct 14 03:51:52 localhost boring_villani[27035]: "path": "/dev/sr0",
Oct 14 03:51:52 localhost boring_villani[27035]: "rejected_reasons": [
Oct 14 03:51:52 localhost boring_villani[27035]: "Insufficient space (<5GB)",
Oct 14 03:51:52 localhost boring_villani[27035]: "Has a FileSystem"
Oct 14 03:51:52 localhost boring_villani[27035]: ],
Oct 14 03:51:52 localhost boring_villani[27035]: "sys_api": {
Oct 14 03:51:52 localhost boring_villani[27035]: "actuators": null,
Oct 14 03:51:52 localhost boring_villani[27035]: "device_nodes": "sr0",
Oct 14 03:51:52 localhost boring_villani[27035]: "human_readable_size": "482.00 KB",
Oct 14 03:51:52 localhost boring_villani[27035]: "id_bus": "ata",
Oct 14 03:51:52 localhost boring_villani[27035]: "model": "QEMU DVD-ROM",
Oct 14 03:51:52 localhost boring_villani[27035]: "nr_requests": "2",
Oct 14 03:51:52 localhost boring_villani[27035]: "partitions": {},
Oct 14 03:51:52 localhost boring_villani[27035]: "path": "/dev/sr0",
Oct 14 03:51:52 localhost boring_villani[27035]: "removable": "1",
Oct 14 03:51:52 localhost boring_villani[27035]: "rev": "2.5+",
Oct 14 03:51:52 localhost boring_villani[27035]: "ro": "0",
Oct 14 03:51:52 localhost boring_villani[27035]: "rotational": "1",
Oct 14 03:51:52 localhost boring_villani[27035]: "sas_address": "",
Oct 14 03:51:52 localhost boring_villani[27035]: "sas_device_handle": "",
Oct 14 03:51:52 localhost boring_villani[27035]: "scheduler_mode": "mq-deadline",
Oct 14 03:51:52 localhost boring_villani[27035]: "sectors": 0,
Oct 14 03:51:52 localhost boring_villani[27035]: "sectorsize": "2048",
Oct 14 03:51:52 localhost boring_villani[27035]: "size": 493568.0,
Oct 14 03:51:52 localhost boring_villani[27035]: "support_discard": "0",
Oct 14 03:51:52 localhost boring_villani[27035]: "type": "disk",
Oct 14 03:51:52 localhost boring_villani[27035]: "vendor": "QEMU"
Oct 14 03:51:52 localhost boring_villani[27035]: }
Oct 14 03:51:52 localhost boring_villani[27035]: }
Oct 14 03:51:52 localhost boring_villani[27035]: ]
Oct 14 03:51:52 localhost systemd[1]: libpod-7a61e91764278485ad754c2fb73e8d40dac00aa2c045306103c15f0d19f495dc.scope: Deactivated successfully.
Oct 14 03:51:52 localhost podman[26813]: 2025-10-14 07:51:52.648334134 +0000 UTC m=+4.166443391 container died 7a61e91764278485ad754c2fb73e8d40dac00aa2c045306103c15f0d19f495dc (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=boring_villani, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, release=553, GIT_CLEAN=True, distribution-scope=public, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux , version=7, ceph=True, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, io.buildah.version=1.33.12, architecture=x86_64, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, build-date=2025-09-24T08:57:55, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, vcs-type=git, GIT_BRANCH=main)
Oct 14 03:51:52 localhost podman[28260]: 2025-10-14 07:51:52.743136266 +0000 UTC m=+0.082493874 container remove 7a61e91764278485ad754c2fb73e8d40dac00aa2c045306103c15f0d19f495dc (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=boring_villani, io.k8s.description=Red Hat Ceph Storage 7, version=7, ceph=True, io.openshift.tags=rhceph ceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, release=553, RELEASE=main, maintainer=Guillaume Abrioux , GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, GIT_BRANCH=main, name=rhceph, io.buildah.version=1.33.12, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, description=Red Hat Ceph Storage 7, distribution-scope=public, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, CEPH_POINT_RELEASE=, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-09-24T08:57:55, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553)
Oct 14 03:51:52 localhost systemd[1]: libpod-conmon-7a61e91764278485ad754c2fb73e8d40dac00aa2c045306103c15f0d19f495dc.scope: Deactivated successfully.
Oct 14 03:51:52 localhost systemd[1]: tmp-crun.YXg7bX.mount: Deactivated successfully.
Oct 14 03:51:52 localhost systemd[1]: var-lib-containers-storage-overlay-c45b08843b39847612b2c9d98d4af69344a5db451e81bc23ec75214c46405b7f-merged.mount: Deactivated successfully.
Oct 14 03:51:52 localhost systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 14 03:51:53 localhost systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 14 03:51:53 localhost systemd[1]: systemd-coredump.socket: Deactivated successfully.
Oct 14 03:51:53 localhost systemd[1]: Closed Process Core Dump Socket.
Oct 14 03:51:53 localhost systemd[1]: Stopping Process Core Dump Socket...
Oct 14 03:51:53 localhost systemd[1]: Listening on Process Core Dump Socket.
Oct 14 03:51:53 localhost systemd[1]: Reloading.
Oct 14 03:51:53 localhost systemd-rc-local-generator[28337]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 03:51:53 localhost systemd-sysv-generator[28343]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 03:51:53 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 03:51:53 localhost systemd[1]: Reloading.
Oct 14 03:51:53 localhost systemd-sysv-generator[28382]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 03:51:53 localhost systemd-rc-local-generator[28379]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 03:51:53 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 03:52:19 localhost systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 14 03:52:19 localhost systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 14 03:52:20 localhost podman[28465]:
Oct 14 03:52:20 localhost podman[28465]: 2025-10-14 07:52:20.115991714 +0000 UTC m=+0.075140729 container create 5000090e4a8fa353a91b2d65addb4f43585eae99f399a2d03ef419e6e3f1a5dc (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=silly_ganguly, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/agreements, release=553, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, version=7, architecture=x86_64, description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.buildah.version=1.33.12, ceph=True, name=rhceph, io.openshift.expose-services=, GIT_BRANCH=main, distribution-scope=public, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux , com.redhat.component=rhceph-container, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-09-24T08:57:55, RELEASE=main)
Oct 14 03:52:20 localhost systemd[1]: Started libpod-conmon-5000090e4a8fa353a91b2d65addb4f43585eae99f399a2d03ef419e6e3f1a5dc.scope.
Oct 14 03:52:20 localhost systemd[1]: Started libcrun container.
Oct 14 03:52:20 localhost podman[28465]: 2025-10-14 07:52:20.081997249 +0000 UTC m=+0.041146354 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Oct 14 03:52:20 localhost podman[28465]: 2025-10-14 07:52:20.188879626 +0000 UTC m=+0.148028611 container init 5000090e4a8fa353a91b2d65addb4f43585eae99f399a2d03ef419e6e3f1a5dc (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=silly_ganguly, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.license_terms=https://www.redhat.com/agreements, version=7, distribution-scope=public, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , vendor=Red Hat, Inc., GIT_BRANCH=main, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, io.openshift.tags=rhceph ceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.component=rhceph-container, release=553, GIT_CLEAN=True, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, io.buildah.version=1.33.12)
Oct 14 03:52:20 localhost podman[28465]: 2025-10-14 07:52:20.198769012 +0000 UTC m=+0.157918027 container start 5000090e4a8fa353a91b2d65addb4f43585eae99f399a2d03ef419e6e3f1a5dc (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=silly_ganguly, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vendor=Red Hat, Inc., GIT_BRANCH=main, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, RELEASE=main, version=7, architecture=x86_64, name=rhceph, maintainer=Guillaume Abrioux , io.openshift.expose-services=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.buildah.version=1.33.12, vcs-type=git, build-date=2025-09-24T08:57:55, ceph=True, distribution-scope=public, release=553, GIT_CLEAN=True)
Oct 14 03:52:20 localhost podman[28465]: 2025-10-14 07:52:20.199048925 +0000 UTC m=+0.158197910 container attach 5000090e4a8fa353a91b2d65addb4f43585eae99f399a2d03ef419e6e3f1a5dc (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=silly_ganguly, vendor=Red Hat, Inc., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.buildah.version=1.33.12, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , vcs-type=git, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, architecture=x86_64, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.expose-services=, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, name=rhceph, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, version=7, build-date=2025-09-24T08:57:55, release=553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main)
Oct 14 03:52:20 localhost silly_ganguly[28481]: 167 167
Oct 14 03:52:20 localhost systemd[1]: libpod-5000090e4a8fa353a91b2d65addb4f43585eae99f399a2d03ef419e6e3f1a5dc.scope: Deactivated successfully.
Oct 14 03:52:20 localhost podman[28465]: 2025-10-14 07:52:20.202124891 +0000 UTC m=+0.161273936 container died 5000090e4a8fa353a91b2d65addb4f43585eae99f399a2d03ef419e6e3f1a5dc (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=silly_ganguly, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=553, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=rhceph-container, build-date=2025-09-24T08:57:55, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, name=rhceph, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, RELEASE=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, vcs-type=git, distribution-scope=public, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_CLEAN=True, architecture=x86_64, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , version=7)
Oct 14 03:52:20 localhost podman[28486]: 2025-10-14 07:52:20.287310033 +0000 UTC m=+0.075940156 container remove 5000090e4a8fa353a91b2d65addb4f43585eae99f399a2d03ef419e6e3f1a5dc (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=silly_ganguly, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.tags=rhceph ceph, io.buildah.version=1.33.12, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_CLEAN=True, release=553, distribution-scope=public, RELEASE=main, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux , CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, GIT_BRANCH=main, vcs-type=git, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, version=7, build-date=2025-09-24T08:57:55, name=rhceph, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553)
Oct 14 03:52:20 localhost systemd[1]: libpod-conmon-5000090e4a8fa353a91b2d65addb4f43585eae99f399a2d03ef419e6e3f1a5dc.scope: Deactivated successfully.
Oct 14 03:52:20 localhost systemd[1]: Reloading.
Oct 14 03:52:20 localhost systemd-rc-local-generator[28528]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 03:52:20 localhost systemd-sysv-generator[28533]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 03:52:20 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 03:52:20 localhost systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 14 03:52:20 localhost systemd[1]: Reloading.
Oct 14 03:52:20 localhost systemd-rc-local-generator[28565]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 03:52:20 localhost systemd-sysv-generator[28568]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 03:52:20 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 03:52:20 localhost systemd[1]: Reached target All Ceph clusters and services.
Oct 14 03:52:20 localhost systemd[1]: Reloading.
Oct 14 03:52:20 localhost systemd-rc-local-generator[28603]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 03:52:20 localhost systemd-sysv-generator[28607]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 03:52:20 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 03:52:21 localhost systemd[1]: Reached target Ceph cluster fcadf6e2-9176-5818-a8d0-37b19acf8eaf.
Oct 14 03:52:21 localhost systemd[1]: Reloading.
Oct 14 03:52:21 localhost systemd-sysv-generator[28646]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 03:52:21 localhost systemd-rc-local-generator[28642]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 03:52:21 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 03:52:21 localhost systemd[1]: Reloading.
Oct 14 03:52:21 localhost systemd-sysv-generator[28688]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 03:52:21 localhost systemd-rc-local-generator[28683]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 03:52:21 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 03:52:21 localhost systemd[1]: Created slice Slice /system/ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf.
Oct 14 03:52:21 localhost systemd[1]: Reached target System Time Set.
Oct 14 03:52:21 localhost systemd[1]: Reached target System Time Synchronized.
Oct 14 03:52:21 localhost systemd[1]: Starting Ceph crash.np0005486733 for fcadf6e2-9176-5818-a8d0-37b19acf8eaf...
Oct 14 03:52:21 localhost systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 14 03:52:21 localhost systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 14 03:52:21 localhost podman[28746]:
Oct 14 03:52:21 localhost podman[28746]: 2025-10-14 07:52:21.831958574 +0000 UTC m=+0.066329932 container create b19a91314e045cbe5465f6abdeb6b420108fcda7860b498b01dcd4e65236d2c8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-crash-np0005486733, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, io.openshift.expose-services=, GIT_CLEAN=True, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, maintainer=Guillaume Abrioux , distribution-scope=public, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-09-24T08:57:55, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, architecture=x86_64, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, release=553, ceph=True, io.buildah.version=1.33.12, CEPH_POINT_RELEASE=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, RELEASE=main, io.openshift.tags=rhceph ceph, version=7, com.redhat.license_terms=https://www.redhat.com/agreements)
Oct 14 03:52:21 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9da5a5819fdec6461660a36c170b4260c40503d6062d559de925f4ea1caa661e/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Oct 14 03:52:21 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9da5a5819fdec6461660a36c170b4260c40503d6062d559de925f4ea1caa661e/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Oct 14 03:52:21 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9da5a5819fdec6461660a36c170b4260c40503d6062d559de925f4ea1caa661e/merged/etc/ceph/ceph.client.crash.np0005486733.keyring supports timestamps until 2038 (0x7fffffff)
Oct 14 03:52:21 localhost podman[28746]: 2025-10-14 07:52:21.901801732 +0000 UTC m=+0.136173040 container init b19a91314e045cbe5465f6abdeb6b420108fcda7860b498b01dcd4e65236d2c8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-crash-np0005486733, architecture=x86_64, description=Red Hat Ceph Storage 7, vcs-type=git, io.buildah.version=1.33.12, build-date=2025-09-24T08:57:55, name=rhceph, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux , io.k8s.description=Red Hat Ceph Storage 7, ceph=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vendor=Red Hat, Inc., RELEASE=main, com.redhat.component=rhceph-container, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.expose-services=, version=7, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, release=553, GIT_CLEAN=True)
Oct 14 03:52:21 localhost podman[28746]: 2025-10-14 07:52:21.805463684 +0000 UTC m=+0.039835032 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Oct 14 03:52:21 localhost podman[28746]: 2025-10-14 07:52:21.9108446 +0000 UTC m=+0.145215908 container start b19a91314e045cbe5465f6abdeb6b420108fcda7860b498b01dcd4e65236d2c8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-crash-np0005486733, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, build-date=2025-09-24T08:57:55, vendor=Red Hat, Inc., ceph=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, name=rhceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, distribution-scope=public, release=553, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux , GIT_BRANCH=main, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, com.redhat.component=rhceph-container, version=7, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=)
Oct 14 03:52:21 localhost bash[28746]: b19a91314e045cbe5465f6abdeb6b420108fcda7860b498b01dcd4e65236d2c8
Oct 14 03:52:21 localhost systemd[1]: Started Ceph crash.np0005486733 for fcadf6e2-9176-5818-a8d0-37b19acf8eaf.
Oct 14 03:52:21 localhost ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-crash-np0005486733[28760]: INFO:ceph-crash:pinging cluster to exercise our key
Oct 14 03:52:22 localhost ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-crash-np0005486733[28760]: 2025-10-14T07:52:22.074+0000 7f0db6f0b640 -1 auth: unable to find a keyring on /etc/ceph/ceph.client.admin.keyring,/etc/ceph/ceph.keyring,/etc/ceph/keyring,/etc/ceph/keyring.bin: (2) No such file or directory
Oct 14 03:52:22 localhost ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-crash-np0005486733[28760]: 2025-10-14T07:52:22.074+0000 7f0db6f0b640 -1 AuthRegistry(0x7f0db0067c70) no keyring found at /etc/ceph/ceph.client.admin.keyring,/etc/ceph/ceph.keyring,/etc/ceph/keyring,/etc/ceph/keyring.bin, disabling cephx
Oct 14 03:52:22 localhost ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-crash-np0005486733[28760]: 2025-10-14T07:52:22.075+0000 7f0db6f0b640 -1 auth: unable to find a keyring on /etc/ceph/ceph.client.admin.keyring,/etc/ceph/ceph.keyring,/etc/ceph/keyring,/etc/ceph/keyring.bin: (2) No such file or directory
Oct 14 03:52:22 localhost ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-crash-np0005486733[28760]: 2025-10-14T07:52:22.075+0000 7f0db6f0b640 -1 AuthRegistry(0x7f0db6f0a000) no keyring found at /etc/ceph/ceph.client.admin.keyring,/etc/ceph/ceph.keyring,/etc/ceph/keyring,/etc/ceph/keyring.bin, disabling cephx
Oct 14 03:52:22 localhost ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-crash-np0005486733[28760]: 2025-10-14T07:52:22.081+0000 7f0db4c80640 -1 monclient(hunting): handle_auth_bad_method server allowed_methods [2] but i only support [1]
Oct 14 03:52:22 localhost ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-crash-np0005486733[28760]: 2025-10-14T07:52:22.082+0000 7f0da7fff640 -1 monclient(hunting): handle_auth_bad_method server allowed_methods [2] but i only support [1]
Oct 14 03:52:22 localhost ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-crash-np0005486733[28760]: 2025-10-14T07:52:22.084+0000 7f0db5481640 -1 monclient(hunting): handle_auth_bad_method server allowed_methods [2] but i only support [1]
Oct 14 03:52:22 localhost ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-crash-np0005486733[28760]: 2025-10-14T07:52:22.084+0000 7f0db6f0b640 -1 monclient: authenticate NOTE: no keyring found; disabled cephx authentication
Oct 14 03:52:22 localhost ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-crash-np0005486733[28760]: [errno 13] RADOS permission denied (error connecting to the cluster)
Oct 14 03:52:22 localhost ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-crash-np0005486733[28760]: INFO:ceph-crash:monitoring path /var/lib/ceph/crash, delay 600s
Oct 14 03:52:25 localhost podman[28846]:
Oct 14 03:52:25 localhost podman[28846]: 2025-10-14 07:52:25.352771221 +0000 UTC m=+0.063805354 container create 190b83d888c6cf24bb5b9895e372f92fa3e657b3d9cc47734a20709b491c246d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=jovial_ride, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, io.buildah.version=1.33.12, architecture=x86_64, CEPH_POINT_RELEASE=, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-09-24T08:57:55, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_BRANCH=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, name=rhceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, ceph=True, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux , url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, distribution-scope=public, vendor=Red Hat, Inc., release=553)
Oct 14 03:52:25 localhost systemd[1]: Started libpod-conmon-190b83d888c6cf24bb5b9895e372f92fa3e657b3d9cc47734a20709b491c246d.scope.
Oct 14 03:52:25 localhost systemd[1]: Started libcrun container.
Oct 14 03:52:25 localhost podman[28846]: 2025-10-14 07:52:25.426074472 +0000 UTC m=+0.137108595 container init 190b83d888c6cf24bb5b9895e372f92fa3e657b3d9cc47734a20709b491c246d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=jovial_ride, GIT_CLEAN=True, architecture=x86_64, description=Red Hat Ceph Storage 7, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.license_terms=https://www.redhat.com/agreements, release=553, maintainer=Guillaume Abrioux , ceph=True, distribution-scope=public, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, CEPH_POINT_RELEASE=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_BRANCH=main, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.tags=rhceph ceph, build-date=2025-09-24T08:57:55, io.buildah.version=1.33.12, name=rhceph, com.redhat.component=rhceph-container)
Oct 14 03:52:25 localhost podman[28846]: 2025-10-14 07:52:25.330750981 +0000 UTC m=+0.041785074 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Oct 14 03:52:25 localhost podman[28846]: 2025-10-14 07:52:25.436477554 +0000 UTC m=+0.147511677 container start 190b83d888c6cf24bb5b9895e372f92fa3e657b3d9cc47734a20709b491c246d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=jovial_ride, architecture=x86_64, io.openshift.tags=rhceph ceph, vcs-type=git, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, release=553, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, distribution-scope=public, maintainer=Guillaume Abrioux , version=7, ceph=True, RELEASE=main, vendor=Red Hat, Inc., build-date=2025-09-24T08:57:55, name=rhceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.33.12, GIT_BRANCH=main, com.redhat.component=rhceph-container)
Oct 14 03:52:25 localhost podman[28846]: 2025-10-14 07:52:25.438758601 +0000 UTC m=+0.149792755 container attach 190b83d888c6cf24bb5b9895e372f92fa3e657b3d9cc47734a20709b491c246d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=jovial_ride, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, build-date=2025-09-24T08:57:55, release=553, io.k8s.description=Red Hat Ceph Storage 7, version=7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, distribution-scope=public, maintainer=Guillaume Abrioux , CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, io.openshift.expose-services=, name=rhceph, GIT_BRANCH=main, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, GIT_CLEAN=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3)
Oct 14 03:52:25 localhost systemd[1]: libpod-190b83d888c6cf24bb5b9895e372f92fa3e657b3d9cc47734a20709b491c246d.scope: Deactivated successfully.
Oct 14 03:52:25 localhost jovial_ride[28861]: 167 167
Oct 14 03:52:25 localhost podman[28846]: 2025-10-14 07:52:25.441675889 +0000 UTC m=+0.152710052 container died 190b83d888c6cf24bb5b9895e372f92fa3e657b3d9cc47734a20709b491c246d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=jovial_ride, build-date=2025-09-24T08:57:55, GIT_CLEAN=True, version=7, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, GIT_BRANCH=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=Guillaume Abrioux , ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.expose-services=, distribution-scope=public, name=rhceph, release=553, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.buildah.version=1.33.12, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc.)
Oct 14 03:52:25 localhost podman[28867]: 2025-10-14 07:52:25.521000524 +0000 UTC m=+0.066690150 container remove 190b83d888c6cf24bb5b9895e372f92fa3e657b3d9cc47734a20709b491c246d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=jovial_ride, ceph=True, io.openshift.tags=rhceph ceph, release=553, io.openshift.expose-services=, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, name=rhceph, build-date=2025-09-24T08:57:55, com.redhat.component=rhceph-container, architecture=x86_64, GIT_CLEAN=True, GIT_BRANCH=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, description=Red Hat Ceph Storage 7, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.33.12, vendor=Red Hat, Inc., version=7, maintainer=Guillaume Abrioux , io.k8s.description=Red Hat Ceph Storage 7)
Oct 14 03:52:25 localhost systemd[1]: libpod-conmon-190b83d888c6cf24bb5b9895e372f92fa3e657b3d9cc47734a20709b491c246d.scope: Deactivated successfully.
Oct 14 03:52:25 localhost podman[28888]:
Oct 14 03:52:25 localhost podman[28888]: 2025-10-14 07:52:25.70728587 +0000 UTC m=+0.050772878 container create 75ff3c4d3837f49d2be073a55beb4a7ab9c3da4f43c28c728a6dc27dd2cb8da8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=unruffled_noether, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, name=rhceph, vcs-type=git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., ceph=True, io.openshift.expose-services=, version=7, RELEASE=main, io.openshift.tags=rhceph ceph, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-09-24T08:57:55, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.buildah.version=1.33.12, GIT_CLEAN=True, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux , url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, release=553, distribution-scope=public)
Oct 14 03:52:25 localhost systemd[1]: Started libpod-conmon-75ff3c4d3837f49d2be073a55beb4a7ab9c3da4f43c28c728a6dc27dd2cb8da8.scope.
Oct 14 03:52:25 localhost systemd[1]: Started libcrun container.
Oct 14 03:52:25 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7bf53826edaa21ba0dcb5c420508a7d5b8345476078b989829383ba407ba0248/merged/rootfs supports timestamps until 2038 (0x7fffffff) Oct 14 03:52:25 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7bf53826edaa21ba0dcb5c420508a7d5b8345476078b989829383ba407ba0248/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff) Oct 14 03:52:25 localhost podman[28888]: 2025-10-14 07:52:25.688331395 +0000 UTC m=+0.031818393 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Oct 14 03:52:25 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7bf53826edaa21ba0dcb5c420508a7d5b8345476078b989829383ba407ba0248/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff) Oct 14 03:52:25 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7bf53826edaa21ba0dcb5c420508a7d5b8345476078b989829383ba407ba0248/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff) Oct 14 03:52:25 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7bf53826edaa21ba0dcb5c420508a7d5b8345476078b989829383ba407ba0248/merged/var/lib/ceph/bootstrap-osd/ceph.keyring supports timestamps until 2038 (0x7fffffff) Oct 14 03:52:25 localhost podman[28888]: 2025-10-14 07:52:25.81276525 +0000 UTC m=+0.156252218 container init 75ff3c4d3837f49d2be073a55beb4a7ab9c3da4f43c28c728a6dc27dd2cb8da8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=unruffled_noether, build-date=2025-09-24T08:57:55, GIT_BRANCH=main, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.33.12, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, distribution-scope=public, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, com.redhat.component=rhceph-container, 
version=7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, release=553, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, io.openshift.tags=rhceph ceph, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, maintainer=Guillaume Abrioux , GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64) Oct 14 03:52:25 localhost podman[28888]: 2025-10-14 07:52:25.821880621 +0000 UTC m=+0.165367629 container start 75ff3c4d3837f49d2be073a55beb4a7ab9c3da4f43c28c728a6dc27dd2cb8da8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=unruffled_noether, name=rhceph, io.openshift.expose-services=, release=553, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-09-24T08:57:55, RELEASE=main, architecture=x86_64, maintainer=Guillaume Abrioux , io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, com.redhat.component=rhceph-container, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vendor=Red Hat, Inc., ceph=True, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.buildah.version=1.33.12, GIT_CLEAN=True, CEPH_POINT_RELEASE=) Oct 14 03:52:25 localhost podman[28888]: 2025-10-14 07:52:25.822215787 +0000 UTC m=+0.165702775 
container attach 75ff3c4d3837f49d2be073a55beb4a7ab9c3da4f43c28c728a6dc27dd2cb8da8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=unruffled_noether, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, maintainer=Guillaume Abrioux , io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, name=rhceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_BRANCH=main, GIT_CLEAN=True, ceph=True, release=553, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, build-date=2025-09-24T08:57:55, io.openshift.tags=rhceph ceph, vcs-type=git, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0) Oct 14 03:52:26 localhost unruffled_noether[28903]: --> passed data devices: 0 physical, 2 LVM Oct 14 03:52:26 localhost unruffled_noether[28903]: --> relative data size: 1.0 Oct 14 03:52:26 localhost unruffled_noether[28903]: Running command: /usr/bin/ceph-authtool --gen-print-key Oct 14 03:52:26 localhost systemd[1]: var-lib-containers-storage-overlay-b815e8ef13ebef20d204dee6d9920f5dab2771fde4af115e93e600256ea5a086-merged.mount: Deactivated successfully. 
Oct 14 03:52:26 localhost unruffled_noether[28903]: Running command: /usr/bin/ceph --cluster ceph --name client.bootstrap-osd --keyring /var/lib/ceph/bootstrap-osd/ceph.keyring -i - osd new 671c314e-194c-4820-b83c-cca1cfcd5ad7 Oct 14 03:52:26 localhost unruffled_noether[28903]: Running command: /usr/bin/ceph-authtool --gen-print-key Oct 14 03:52:26 localhost lvm[28957]: PV /dev/loop3 online, VG ceph_vg0 is complete. Oct 14 03:52:26 localhost lvm[28957]: VG ceph_vg0 finished Oct 14 03:52:26 localhost unruffled_noether[28903]: Running command: /usr/bin/mount -t tmpfs tmpfs /var/lib/ceph/osd/ceph-0 Oct 14 03:52:26 localhost unruffled_noether[28903]: Running command: /usr/bin/chown -h ceph:ceph /dev/ceph_vg0/ceph_lv0 Oct 14 03:52:26 localhost unruffled_noether[28903]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0 Oct 14 03:52:26 localhost unruffled_noether[28903]: Running command: /usr/bin/ln -s /dev/ceph_vg0/ceph_lv0 /var/lib/ceph/osd/ceph-0/block Oct 14 03:52:26 localhost unruffled_noether[28903]: Running command: /usr/bin/ceph --cluster ceph --name client.bootstrap-osd --keyring /var/lib/ceph/bootstrap-osd/ceph.keyring mon getmap -o /var/lib/ceph/osd/ceph-0/activate.monmap Oct 14 03:52:27 localhost unruffled_noether[28903]: stderr: got monmap epoch 3 Oct 14 03:52:27 localhost unruffled_noether[28903]: --> Creating keyring file for osd.0 Oct 14 03:52:27 localhost unruffled_noether[28903]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0/keyring Oct 14 03:52:27 localhost unruffled_noether[28903]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0/ Oct 14 03:52:27 localhost unruffled_noether[28903]: Running command: /usr/bin/ceph-osd --cluster ceph --osd-objectstore bluestore --mkfs -i 0 --monmap /var/lib/ceph/osd/ceph-0/activate.monmap --keyfile - --osdspec-affinity default_drive_group --osd-data /var/lib/ceph/osd/ceph-0/ --osd-uuid 671c314e-194c-4820-b83c-cca1cfcd5ad7 --setuser ceph --setgroup ceph Oct 14 03:52:29 
localhost unruffled_noether[28903]: stderr: 2025-10-14T07:52:27.431+0000 7ffba2925a80 -1 bluestore(/var/lib/ceph/osd/ceph-0//block) _read_bdev_label unable to decode label at offset 102: void bluestore_bdev_label_t::decode(ceph::buffer::v15_2_0::list::const_iterator&) decode past end of struct encoding: Malformed input [buffer:3]
Oct 14 03:52:29 localhost unruffled_noether[28903]: stderr: 2025-10-14T07:52:27.431+0000 7ffba2925a80 -1 bluestore(/var/lib/ceph/osd/ceph-0/) _read_fsid unparsable uuid
Oct 14 03:52:29 localhost unruffled_noether[28903]: --> ceph-volume lvm prepare successful for: ceph_vg0/ceph_lv0
Oct 14 03:52:29 localhost unruffled_noether[28903]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0
Oct 14 03:52:29 localhost unruffled_noether[28903]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph_vg0/ceph_lv0 --path /var/lib/ceph/osd/ceph-0 --no-mon-config
Oct 14 03:52:29 localhost unruffled_noether[28903]: Running command: /usr/bin/ln -snf /dev/ceph_vg0/ceph_lv0 /var/lib/ceph/osd/ceph-0/block
Oct 14 03:52:29 localhost unruffled_noether[28903]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-0/block
Oct 14 03:52:29 localhost unruffled_noether[28903]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0
Oct 14 03:52:29 localhost unruffled_noether[28903]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0
Oct 14 03:52:29 localhost unruffled_noether[28903]: --> ceph-volume lvm activate successful for osd ID: 0
Oct 14 03:52:29 localhost unruffled_noether[28903]: --> ceph-volume lvm create successful for: ceph_vg0/ceph_lv0
Oct 14 03:52:30 localhost unruffled_noether[28903]: Running command: /usr/bin/ceph-authtool --gen-print-key
Oct 14 03:52:30 localhost unruffled_noether[28903]: Running command: /usr/bin/ceph --cluster ceph --name client.bootstrap-osd --keyring /var/lib/ceph/bootstrap-osd/ceph.keyring -i - osd new 65efa31d-b0f9-4ff9-86fe-a4dc8772e327
Oct 14 03:52:30 localhost lvm[29902]: PV /dev/loop4 online, VG ceph_vg1 is complete.
Oct 14 03:52:30 localhost lvm[29902]: VG ceph_vg1 finished
Oct 14 03:52:30 localhost unruffled_noether[28903]: Running command: /usr/bin/ceph-authtool --gen-print-key
Oct 14 03:52:30 localhost unruffled_noether[28903]: Running command: /usr/bin/mount -t tmpfs tmpfs /var/lib/ceph/osd/ceph-3
Oct 14 03:52:30 localhost unruffled_noether[28903]: Running command: /usr/bin/chown -h ceph:ceph /dev/ceph_vg1/ceph_lv1
Oct 14 03:52:30 localhost unruffled_noether[28903]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-1
Oct 14 03:52:30 localhost unruffled_noether[28903]: Running command: /usr/bin/ln -s /dev/ceph_vg1/ceph_lv1 /var/lib/ceph/osd/ceph-3/block
Oct 14 03:52:30 localhost unruffled_noether[28903]: Running command: /usr/bin/ceph --cluster ceph --name client.bootstrap-osd --keyring /var/lib/ceph/bootstrap-osd/ceph.keyring mon getmap -o /var/lib/ceph/osd/ceph-3/activate.monmap
Oct 14 03:52:31 localhost unruffled_noether[28903]: stderr: got monmap epoch 3
Oct 14 03:52:31 localhost unruffled_noether[28903]: --> Creating keyring file for osd.3
Oct 14 03:52:31 localhost unruffled_noether[28903]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-3/keyring
Oct 14 03:52:31 localhost unruffled_noether[28903]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-3/
Oct 14 03:52:31 localhost unruffled_noether[28903]: Running command: /usr/bin/ceph-osd --cluster ceph --osd-objectstore bluestore --mkfs -i 3 --monmap /var/lib/ceph/osd/ceph-3/activate.monmap --keyfile - --osdspec-affinity default_drive_group --osd-data /var/lib/ceph/osd/ceph-3/ --osd-uuid 65efa31d-b0f9-4ff9-86fe-a4dc8772e327 --setuser ceph --setgroup ceph
Oct 14 03:52:33 localhost unruffled_noether[28903]: stderr: 2025-10-14T07:52:31.183+0000 7f992c17da80 -1 bluestore(/var/lib/ceph/osd/ceph-3//block) _read_bdev_label unable to decode label at offset 102: void bluestore_bdev_label_t::decode(ceph::buffer::v15_2_0::list::const_iterator&) decode past end of struct encoding: Malformed input [buffer:3]
Oct 14 03:52:33 localhost unruffled_noether[28903]: stderr: 2025-10-14T07:52:31.183+0000 7f992c17da80 -1 bluestore(/var/lib/ceph/osd/ceph-3/) _read_fsid unparsable uuid
Oct 14 03:52:33 localhost unruffled_noether[28903]: --> ceph-volume lvm prepare successful for: ceph_vg1/ceph_lv1
Oct 14 03:52:33 localhost unruffled_noether[28903]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-3
Oct 14 03:52:33 localhost unruffled_noether[28903]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph_vg1/ceph_lv1 --path /var/lib/ceph/osd/ceph-3 --no-mon-config
Oct 14 03:52:33 localhost unruffled_noether[28903]: Running command: /usr/bin/ln -snf /dev/ceph_vg1/ceph_lv1 /var/lib/ceph/osd/ceph-3/block
Oct 14 03:52:33 localhost unruffled_noether[28903]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-3/block
Oct 14 03:52:33 localhost unruffled_noether[28903]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-1
Oct 14 03:52:33 localhost unruffled_noether[28903]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-3
Oct 14 03:52:33 localhost unruffled_noether[28903]: --> ceph-volume lvm activate successful for osd ID: 3
Oct 14 03:52:33 localhost unruffled_noether[28903]: --> ceph-volume lvm create successful for: ceph_vg1/ceph_lv1
Oct 14 03:52:33 localhost systemd[1]: libpod-75ff3c4d3837f49d2be073a55beb4a7ab9c3da4f43c28c728a6dc27dd2cb8da8.scope: Deactivated successfully.
Oct 14 03:52:33 localhost systemd[1]: libpod-75ff3c4d3837f49d2be073a55beb4a7ab9c3da4f43c28c728a6dc27dd2cb8da8.scope: Consumed 3.527s CPU time.
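Not part of the captured log: when entries like the ones above are fused onto single lines, it can help to pull out just the commands ceph-volume reports executing. A minimal, hypothetical Python helper (the function and regex names are my own; the sample lines are copied from this log) sketches one way to do that:

```python
import re

# Matches the "Running command: /usr/bin/..." entries ceph-volume logs,
# as seen in the unruffled_noether output above.
CMD_RE = re.compile(r"Running command: (?P<cmd>/\S+.*)$")

def extract_commands(lines):
    """Return the commands ceph-volume reports executing, in log order."""
    cmds = []
    for line in lines:
        m = CMD_RE.search(line)
        if m:
            cmds.append(m.group("cmd").strip())
    return cmds

# Sample entries taken verbatim from the log above.
sample = [
    "Oct 14 03:52:29 localhost unruffled_noether[28903]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0",
    "Oct 14 03:52:29 localhost unruffled_noether[28903]: --> ceph-volume lvm activate successful for osd ID: 0",
    "Oct 14 03:52:30 localhost unruffled_noether[28903]: Running command: /usr/bin/ceph-authtool --gen-print-key",
]

print(extract_commands(sample))
```

This is only a reading aid for logs shaped like the capture above; status lines (`-->`) and stderr output are deliberately skipped.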
Oct 14 03:52:33 localhost podman[30817]: 2025-10-14 07:52:33.821354841 +0000 UTC m=+0.045288199 container died 75ff3c4d3837f49d2be073a55beb4a7ab9c3da4f43c28c728a6dc27dd2cb8da8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=unruffled_noether, GIT_CLEAN=True, release=553, vcs-type=git, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, GIT_BRANCH=main, RELEASE=main, build-date=2025-09-24T08:57:55, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, architecture=x86_64, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux , io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12) Oct 14 03:52:33 localhost systemd[1]: tmp-crun.exLYi9.mount: Deactivated successfully. Oct 14 03:52:33 localhost systemd[1]: var-lib-containers-storage-overlay-7bf53826edaa21ba0dcb5c420508a7d5b8345476078b989829383ba407ba0248-merged.mount: Deactivated successfully. 
Oct 14 03:52:33 localhost podman[30817]: 2025-10-14 07:52:33.865252184 +0000 UTC m=+0.089185502 container remove 75ff3c4d3837f49d2be073a55beb4a7ab9c3da4f43c28c728a6dc27dd2cb8da8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=unruffled_noether, ceph=True, io.buildah.version=1.33.12, build-date=2025-09-24T08:57:55, architecture=x86_64, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, CEPH_POINT_RELEASE=, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, maintainer=Guillaume Abrioux , com.redhat.license_terms=https://www.redhat.com/agreements, version=7, release=553, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, name=rhceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, distribution-scope=public, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container) Oct 14 03:52:33 localhost systemd[1]: libpod-conmon-75ff3c4d3837f49d2be073a55beb4a7ab9c3da4f43c28c728a6dc27dd2cb8da8.scope: Deactivated successfully. 
Oct 14 03:52:34 localhost podman[30900]: Oct 14 03:52:34 localhost podman[30900]: 2025-10-14 07:52:34.607009057 +0000 UTC m=+0.066737962 container create eec669f7c8b980bb938131e97afe632812c462704783dbd9df97f089a0037c95 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=jolly_agnesi, build-date=2025-09-24T08:57:55, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.33.12, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, name=rhceph, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, version=7, release=553, GIT_BRANCH=main, architecture=x86_64, vcs-type=git, GIT_CLEAN=True, CEPH_POINT_RELEASE=, distribution-scope=public, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux , RELEASE=main, ceph=True, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements) Oct 14 03:52:34 localhost systemd[1]: Started libpod-conmon-eec669f7c8b980bb938131e97afe632812c462704783dbd9df97f089a0037c95.scope. Oct 14 03:52:34 localhost systemd[1]: Started libcrun container. 
Oct 14 03:52:34 localhost podman[30900]: 2025-10-14 07:52:34.66831086 +0000 UTC m=+0.128039765 container init eec669f7c8b980bb938131e97afe632812c462704783dbd9df97f089a0037c95 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=jolly_agnesi, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, distribution-scope=public, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, CEPH_POINT_RELEASE=, GIT_CLEAN=True, io.openshift.expose-services=, io.buildah.version=1.33.12, release=553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, description=Red Hat Ceph Storage 7, RELEASE=main, vcs-type=git, version=7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, build-date=2025-09-24T08:57:55, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , com.redhat.component=rhceph-container) Oct 14 03:52:34 localhost podman[30900]: 2025-10-14 07:52:34.577916133 +0000 UTC m=+0.037645048 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Oct 14 03:52:34 localhost podman[30900]: 2025-10-14 07:52:34.67910836 +0000 UTC m=+0.138837265 container start eec669f7c8b980bb938131e97afe632812c462704783dbd9df97f089a0037c95 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=jolly_agnesi, RELEASE=main, version=7, io.openshift.expose-services=, vendor=Red Hat, Inc., architecture=x86_64, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, io.buildah.version=1.33.12, release=553, 
com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, maintainer=Guillaume Abrioux , description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, distribution-scope=public, build-date=2025-09-24T08:57:55, name=rhceph) Oct 14 03:52:34 localhost podman[30900]: 2025-10-14 07:52:34.679392334 +0000 UTC m=+0.139121299 container attach eec669f7c8b980bb938131e97afe632812c462704783dbd9df97f089a0037c95 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=jolly_agnesi, vcs-type=git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, release=553, GIT_CLEAN=True, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.component=rhceph-container, ceph=True, io.buildah.version=1.33.12, vendor=Red Hat, Inc., build-date=2025-09-24T08:57:55, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, name=rhceph, maintainer=Guillaume Abrioux , distribution-scope=public, architecture=x86_64, RELEASE=main, io.openshift.expose-services=, 
GIT_REPO=https://github.com/ceph/ceph-container.git) Oct 14 03:52:34 localhost jolly_agnesi[30915]: 167 167 Oct 14 03:52:34 localhost podman[30900]: 2025-10-14 07:52:34.683434635 +0000 UTC m=+0.143163570 container died eec669f7c8b980bb938131e97afe632812c462704783dbd9df97f089a0037c95 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=jolly_agnesi, name=rhceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, RELEASE=main, com.redhat.component=rhceph-container, ceph=True, maintainer=Guillaume Abrioux , io.openshift.tags=rhceph ceph, GIT_BRANCH=main, io.buildah.version=1.33.12, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., distribution-scope=public, GIT_CLEAN=True, release=553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, vcs-type=git) Oct 14 03:52:34 localhost systemd[1]: libpod-eec669f7c8b980bb938131e97afe632812c462704783dbd9df97f089a0037c95.scope: Deactivated successfully. 
Oct 14 03:52:34 localhost podman[30920]: 2025-10-14 07:52:34.770735087 +0000 UTC m=+0.077265130 container remove eec669f7c8b980bb938131e97afe632812c462704783dbd9df97f089a0037c95 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=jolly_agnesi, vendor=Red Hat, Inc., release=553, CEPH_POINT_RELEASE=, RELEASE=main, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_BRANCH=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, ceph=True, com.redhat.component=rhceph-container, name=rhceph, maintainer=Guillaume Abrioux , version=7, GIT_CLEAN=True, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, build-date=2025-09-24T08:57:55, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9) Oct 14 03:52:34 localhost systemd[1]: libpod-conmon-eec669f7c8b980bb938131e97afe632812c462704783dbd9df97f089a0037c95.scope: Deactivated successfully. Oct 14 03:52:34 localhost systemd[1]: var-lib-containers-storage-overlay-681c3da23ee6aecf3415e42fd76ba3ba836c29dfcf2e2e5bf55f2e7e50b9db01-merged.mount: Deactivated successfully. 
Oct 14 03:52:34 localhost podman[30940]: Oct 14 03:52:34 localhost podman[30940]: 2025-10-14 07:52:34.987999306 +0000 UTC m=+0.080779725 container create b28ed94e5f2a969d9cc196cc1e7787059541d33f625bda8281c48de0ade0c38e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=confident_bartik, version=7, release=553, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, GIT_BRANCH=main, vendor=Red Hat, Inc., GIT_CLEAN=True, io.buildah.version=1.33.12, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, maintainer=Guillaume Abrioux , url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, name=rhceph, com.redhat.component=rhceph-container, distribution-scope=public, RELEASE=main, build-date=2025-09-24T08:57:55, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=) Oct 14 03:52:35 localhost systemd[1]: Started libpod-conmon-b28ed94e5f2a969d9cc196cc1e7787059541d33f625bda8281c48de0ade0c38e.scope. Oct 14 03:52:35 localhost systemd[1]: Started libcrun container. 
Oct 14 03:52:35 localhost podman[30940]: 2025-10-14 07:52:34.95294389 +0000 UTC m=+0.045724379 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Oct 14 03:52:35 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/95d69fe011e443d7ff21e42fd2c452d032d52dd6ec0d9fc073d2451802e7d888/merged/rootfs supports timestamps until 2038 (0x7fffffff) Oct 14 03:52:35 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/95d69fe011e443d7ff21e42fd2c452d032d52dd6ec0d9fc073d2451802e7d888/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff) Oct 14 03:52:35 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/95d69fe011e443d7ff21e42fd2c452d032d52dd6ec0d9fc073d2451802e7d888/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff) Oct 14 03:52:35 localhost podman[30940]: 2025-10-14 07:52:35.088843787 +0000 UTC m=+0.181624216 container init b28ed94e5f2a969d9cc196cc1e7787059541d33f625bda8281c48de0ade0c38e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=confident_bartik, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, description=Red Hat Ceph Storage 7, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, release=553, RELEASE=main, build-date=2025-09-24T08:57:55, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vendor=Red Hat, Inc., GIT_CLEAN=True, version=7, vcs-type=git, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux , GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, io.buildah.version=1.33.12, CEPH_POINT_RELEASE=, 
distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7) Oct 14 03:52:35 localhost podman[30940]: 2025-10-14 07:52:35.100745409 +0000 UTC m=+0.193525808 container start b28ed94e5f2a969d9cc196cc1e7787059541d33f625bda8281c48de0ade0c38e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=confident_bartik, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, architecture=x86_64, release=553, name=rhceph, ceph=True, version=7, com.redhat.component=rhceph-container, distribution-scope=public, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, GIT_BRANCH=main, io.buildah.version=1.33.12, vcs-type=git, vendor=Red Hat, Inc., build-date=2025-09-24T08:57:55, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux , GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main) Oct 14 03:52:35 localhost podman[30940]: 2025-10-14 07:52:35.101137467 +0000 UTC m=+0.193917876 container attach b28ed94e5f2a969d9cc196cc1e7787059541d33f625bda8281c48de0ade0c38e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=confident_bartik, release=553, architecture=x86_64, GIT_BRANCH=main, ceph=True, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.buildah.version=1.33.12, distribution-scope=public, io.openshift.tags=rhceph ceph, RELEASE=main, name=rhceph, 
io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/agreements, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux , description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, build-date=2025-09-24T08:57:55)
Oct 14 03:52:35 localhost confident_bartik[30955]: {
Oct 14 03:52:35 localhost confident_bartik[30955]: "0": [
Oct 14 03:52:35 localhost confident_bartik[30955]: {
Oct 14 03:52:35 localhost confident_bartik[30955]: "devices": [
Oct 14 03:52:35 localhost confident_bartik[30955]: "/dev/loop3"
Oct 14 03:52:35 localhost confident_bartik[30955]: ],
Oct 14 03:52:35 localhost confident_bartik[30955]: "lv_name": "ceph_lv0",
Oct 14 03:52:35 localhost confident_bartik[30955]: "lv_path": "/dev/ceph_vg0/ceph_lv0",
Oct 14 03:52:35 localhost confident_bartik[30955]: "lv_size": "7511998464",
Oct 14 03:52:35 localhost confident_bartik[30955]: "lv_tags": "ceph.block_device=/dev/ceph_vg0/ceph_lv0,ceph.block_uuid=L4ZMtz-CV0v-r30C-L078-BQlQ-UhPu-QVR2nE,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=fcadf6e2-9176-5818-a8d0-37b19acf8eaf,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=671c314e-194c-4820-b83c-cca1cfcd5ad7,ceph.osd_id=0,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Oct 14 03:52:35 localhost confident_bartik[30955]: "lv_uuid": "L4ZMtz-CV0v-r30C-L078-BQlQ-UhPu-QVR2nE",
Oct 14 03:52:35 localhost confident_bartik[30955]: "name": "ceph_lv0",
Oct 14 03:52:35 localhost confident_bartik[30955]: "path": "/dev/ceph_vg0/ceph_lv0",
Oct 14 03:52:35 localhost confident_bartik[30955]: "tags": {
Oct 14 03:52:35 localhost confident_bartik[30955]: "ceph.block_device": "/dev/ceph_vg0/ceph_lv0",
Oct 14 03:52:35 localhost confident_bartik[30955]: "ceph.block_uuid": "L4ZMtz-CV0v-r30C-L078-BQlQ-UhPu-QVR2nE",
Oct 14 03:52:35 localhost confident_bartik[30955]: "ceph.cephx_lockbox_secret": "",
Oct 14 03:52:35 localhost confident_bartik[30955]: "ceph.cluster_fsid": "fcadf6e2-9176-5818-a8d0-37b19acf8eaf",
Oct 14 03:52:35 localhost confident_bartik[30955]: "ceph.cluster_name": "ceph",
Oct 14 03:52:35 localhost confident_bartik[30955]: "ceph.crush_device_class": "",
Oct 14 03:52:35 localhost confident_bartik[30955]: "ceph.encrypted": "0",
Oct 14 03:52:35 localhost confident_bartik[30955]: "ceph.osd_fsid": "671c314e-194c-4820-b83c-cca1cfcd5ad7",
Oct 14 03:52:35 localhost confident_bartik[30955]: "ceph.osd_id": "0",
Oct 14 03:52:35 localhost confident_bartik[30955]: "ceph.osdspec_affinity": "default_drive_group",
Oct 14 03:52:35 localhost confident_bartik[30955]: "ceph.type": "block",
Oct 14 03:52:35 localhost confident_bartik[30955]: "ceph.vdo": "0"
Oct 14 03:52:35 localhost confident_bartik[30955]: },
Oct 14 03:52:35 localhost confident_bartik[30955]: "type": "block",
Oct 14 03:52:35 localhost confident_bartik[30955]: "vg_name": "ceph_vg0"
Oct 14 03:52:35 localhost confident_bartik[30955]: }
Oct 14 03:52:35 localhost confident_bartik[30955]: ],
Oct 14 03:52:35 localhost confident_bartik[30955]: "3": [
Oct 14 03:52:35 localhost confident_bartik[30955]: {
Oct 14 03:52:35 localhost confident_bartik[30955]: "devices": [
Oct 14 03:52:35 localhost confident_bartik[30955]: "/dev/loop4"
Oct 14 03:52:35 localhost confident_bartik[30955]: ],
Oct 14 03:52:35 localhost confident_bartik[30955]: "lv_name": "ceph_lv1",
Oct 14 03:52:35 localhost confident_bartik[30955]: "lv_path": "/dev/ceph_vg1/ceph_lv1",
Oct 14 03:52:35 localhost confident_bartik[30955]: "lv_size": "7511998464",
Oct 14 03:52:35 localhost confident_bartik[30955]: "lv_tags": "ceph.block_device=/dev/ceph_vg1/ceph_lv1,ceph.block_uuid=GlR1EB-eKdT-ET6h-MDuS-Q3Cu-14kG-Fy8xUC,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=fcadf6e2-9176-5818-a8d0-37b19acf8eaf,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=65efa31d-b0f9-4ff9-86fe-a4dc8772e327,ceph.osd_id=3,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Oct 14 03:52:35 localhost confident_bartik[30955]: "lv_uuid": "GlR1EB-eKdT-ET6h-MDuS-Q3Cu-14kG-Fy8xUC",
Oct 14 03:52:35 localhost confident_bartik[30955]: "name": "ceph_lv1",
Oct 14 03:52:35 localhost confident_bartik[30955]: "path": "/dev/ceph_vg1/ceph_lv1",
Oct 14 03:52:35 localhost confident_bartik[30955]: "tags": {
Oct 14 03:52:35 localhost confident_bartik[30955]: "ceph.block_device": "/dev/ceph_vg1/ceph_lv1",
Oct 14 03:52:35 localhost confident_bartik[30955]: "ceph.block_uuid": "GlR1EB-eKdT-ET6h-MDuS-Q3Cu-14kG-Fy8xUC",
Oct 14 03:52:35 localhost confident_bartik[30955]: "ceph.cephx_lockbox_secret": "",
Oct 14 03:52:35 localhost confident_bartik[30955]: "ceph.cluster_fsid": "fcadf6e2-9176-5818-a8d0-37b19acf8eaf",
Oct 14 03:52:35 localhost confident_bartik[30955]: "ceph.cluster_name": "ceph",
Oct 14 03:52:35 localhost confident_bartik[30955]: "ceph.crush_device_class": "",
Oct 14 03:52:35 localhost confident_bartik[30955]: "ceph.encrypted": "0",
Oct 14 03:52:35 localhost confident_bartik[30955]: "ceph.osd_fsid": "65efa31d-b0f9-4ff9-86fe-a4dc8772e327",
Oct 14 03:52:35 localhost confident_bartik[30955]: "ceph.osd_id": "3",
Oct 14 03:52:35 localhost confident_bartik[30955]: "ceph.osdspec_affinity": "default_drive_group",
Oct 14 03:52:35 localhost confident_bartik[30955]: "ceph.type": "block",
Oct 14 03:52:35 localhost confident_bartik[30955]: "ceph.vdo": "0"
Oct 14 03:52:35 localhost confident_bartik[30955]: },
Oct 14 03:52:35 localhost confident_bartik[30955]: "type": "block",
Oct 14 03:52:35 localhost confident_bartik[30955]: "vg_name": "ceph_vg1"
Oct 14 03:52:35 localhost confident_bartik[30955]: }
Oct 14 03:52:35 localhost confident_bartik[30955]: ]
Oct 14 03:52:35 localhost confident_bartik[30955]: }
Oct 14 03:52:35 localhost systemd[1]: libpod-b28ed94e5f2a969d9cc196cc1e7787059541d33f625bda8281c48de0ade0c38e.scope: Deactivated successfully.
Oct 14 03:52:35 localhost podman[30940]: 2025-10-14 07:52:35.455895948 +0000 UTC m=+0.548676327 container died b28ed94e5f2a969d9cc196cc1e7787059541d33f625bda8281c48de0ade0c38e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=confident_bartik, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, ceph=True, GIT_CLEAN=True, vendor=Red Hat, Inc., io.buildah.version=1.33.12, release=553, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, CEPH_POINT_RELEASE=, name=rhceph, maintainer=Guillaume Abrioux , vcs-type=git, io.openshift.expose-services=, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=rhceph-container, architecture=x86_64, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-09-24T08:57:55)
Oct 14 03:52:35 localhost podman[30964]: 2025-10-14 07:52:35.571652163 +0000 UTC m=+0.107376501 container remove b28ed94e5f2a969d9cc196cc1e7787059541d33f625bda8281c48de0ade0c38e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=confident_bartik, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , ceph=True,
RELEASE=main, com.redhat.component=rhceph-container, architecture=x86_64, name=rhceph, GIT_CLEAN=True, build-date=2025-09-24T08:57:55, release=553, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.33.12, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhceph ceph, vcs-type=git, version=7, distribution-scope=public, vendor=Red Hat, Inc.)
Oct 14 03:52:35 localhost systemd[1]: libpod-conmon-b28ed94e5f2a969d9cc196cc1e7787059541d33f625bda8281c48de0ade0c38e.scope: Deactivated successfully.
Oct 14 03:52:35 localhost systemd[1]: var-lib-containers-storage-overlay-95d69fe011e443d7ff21e42fd2c452d032d52dd6ec0d9fc073d2451802e7d888-merged.mount: Deactivated successfully.
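Not part of the captured log: the `confident_bartik` container above printed a `ceph-volume lvm list`-style JSON report mapping OSD ids to their logical volumes. A hedged Python sketch (the function name `osd_block_devices` is my own; the JSON is trimmed from the structure and values shown in the log) of how such a report can be reduced to an OSD-to-device map:

```python
import json

# Trimmed version of the JSON the container emitted above; only fields
# used below are kept, values copied from the log.
raw = """
{
  "0": [{"devices": ["/dev/loop3"],
         "lv_path": "/dev/ceph_vg0/ceph_lv0",
         "tags": {"ceph.osd_fsid": "671c314e-194c-4820-b83c-cca1cfcd5ad7",
                  "ceph.osd_id": "0", "ceph.type": "block"}}],
  "3": [{"devices": ["/dev/loop4"],
         "lv_path": "/dev/ceph_vg1/ceph_lv1",
         "tags": {"ceph.osd_fsid": "65efa31d-b0f9-4ff9-86fe-a4dc8772e327",
                  "ceph.osd_id": "3", "ceph.type": "block"}}]
}
"""

def osd_block_devices(listing):
    """Map each OSD id to the backing device(s) of its 'block' LV."""
    out = {}
    for osd_id, lvs in listing.items():
        for lv in lvs:
            # Each OSD may list several LVs (block, db, wal); keep 'block'.
            if lv["tags"].get("ceph.type") == "block":
                out[osd_id] = lv["devices"]
    return out

print(osd_block_devices(json.loads(raw)))
```

Against the listing in this log, this yields OSD 0 on /dev/loop3 and OSD 3 on /dev/loop4, matching the `ceph.osd_id` tags recorded during the `ceph-volume lvm create` runs earlier.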
Oct 14 03:52:36 localhost podman[31051]: Oct 14 03:52:36 localhost podman[31051]: 2025-10-14 07:52:36.387963325 +0000 UTC m=+0.080782605 container create 69e9f3aa41e3b13390522db6055ed18d962f4608560ed9e0ecbadd3311678691 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=adoring_murdock, ceph=True, build-date=2025-09-24T08:57:55, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, maintainer=Guillaume Abrioux , org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, description=Red Hat Ceph Storage 7, version=7, vendor=Red Hat, Inc., vcs-type=git, io.buildah.version=1.33.12, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, release=553, com.redhat.component=rhceph-container, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/agreements) Oct 14 03:52:36 localhost systemd[1]: Started libpod-conmon-69e9f3aa41e3b13390522db6055ed18d962f4608560ed9e0ecbadd3311678691.scope. Oct 14 03:52:36 localhost systemd[1]: Started libcrun container. 
Oct 14 03:52:36 localhost podman[31051]: 2025-10-14 07:52:36.35650902 +0000 UTC m=+0.049328330 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Oct 14 03:52:36 localhost podman[31051]: 2025-10-14 07:52:36.468239196 +0000 UTC m=+0.161058476 container init 69e9f3aa41e3b13390522db6055ed18d962f4608560ed9e0ecbadd3311678691 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=adoring_murdock, name=rhceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=553, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , version=7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, architecture=x86_64, distribution-scope=public, vendor=Red Hat, Inc., RELEASE=main, GIT_CLEAN=True, vcs-type=git, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, io.openshift.expose-services=, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-09-24T08:57:55)
Oct 14 03:52:36 localhost podman[31051]: 2025-10-14 07:52:36.479534889 +0000 UTC m=+0.172354159 container start 69e9f3aa41e3b13390522db6055ed18d962f4608560ed9e0ecbadd3311678691 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=adoring_murdock, release=553, build-date=2025-09-24T08:57:55, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux , distribution-scope=public, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, version=7, name=rhceph, vendor=Red Hat, Inc., GIT_CLEAN=True, architecture=x86_64, ceph=True, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main)
Oct 14 03:52:36 localhost podman[31051]: 2025-10-14 07:52:36.479879565 +0000 UTC m=+0.172698835 container attach 69e9f3aa41e3b13390522db6055ed18d962f4608560ed9e0ecbadd3311678691 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=adoring_murdock, name=rhceph, vcs-type=git, GIT_CLEAN=True, io.openshift.expose-services=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, RELEASE=main, GIT_BRANCH=main, build-date=2025-09-24T08:57:55, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, ceph=True, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, release=553, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=Guillaume Abrioux , distribution-scope=public, description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64)
Oct 14 03:52:36 localhost adoring_murdock[31066]: 167 167
Oct 14 03:52:36 localhost systemd[1]: libpod-69e9f3aa41e3b13390522db6055ed18d962f4608560ed9e0ecbadd3311678691.scope: Deactivated successfully.
Oct 14 03:52:36 localhost podman[31051]: 2025-10-14 07:52:36.485680529 +0000 UTC m=+0.178499829 container died 69e9f3aa41e3b13390522db6055ed18d962f4608560ed9e0ecbadd3311678691 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=adoring_murdock, maintainer=Guillaume Abrioux , architecture=x86_64, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, io.buildah.version=1.33.12, build-date=2025-09-24T08:57:55, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, name=rhceph, release=553, io.openshift.expose-services=, GIT_CLEAN=True, distribution-scope=public, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, ceph=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, RELEASE=main, vcs-type=git, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Oct 14 03:52:36 localhost podman[31071]: 2025-10-14 07:52:36.558554 +0000 UTC m=+0.065373848 container remove 69e9f3aa41e3b13390522db6055ed18d962f4608560ed9e0ecbadd3311678691 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=adoring_murdock, version=7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, distribution-scope=public, RELEASE=main, CEPH_POINT_RELEASE=, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.33.12, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, vendor=Red Hat, Inc., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, architecture=x86_64, GIT_BRANCH=main, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux , description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, release=553, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=)
Oct 14 03:52:36 localhost systemd[1]: libpod-conmon-69e9f3aa41e3b13390522db6055ed18d962f4608560ed9e0ecbadd3311678691.scope: Deactivated successfully.
Oct 14 03:52:36 localhost systemd[1]: tmp-crun.i3B5O7.mount: Deactivated successfully.
Oct 14 03:52:36 localhost systemd[1]: var-lib-containers-storage-overlay-f2da3536bf181d662609d47872db4b3b0cc1b3fb92b5bc46786f7d4b15a477ab-merged.mount: Deactivated successfully.
Oct 14 03:52:36 localhost podman[31099]:
Oct 14 03:52:36 localhost podman[31099]: 2025-10-14 07:52:36.861547746 +0000 UTC m=+0.070760112 container create b8f9d49a4410fedb77e9139bafb9a1006faa78e8ee6a373c292045f7e1a4dae2 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-osd-0-activate-test, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., distribution-scope=public, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, build-date=2025-09-24T08:57:55, release=553, version=7, architecture=x86_64, GIT_CLEAN=True, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.expose-services=, name=rhceph, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux , summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12)
Oct 14 03:52:36 localhost systemd[1]: Started libpod-conmon-b8f9d49a4410fedb77e9139bafb9a1006faa78e8ee6a373c292045f7e1a4dae2.scope.
Oct 14 03:52:36 localhost systemd[1]: Started libcrun container.
Oct 14 03:52:36 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ad8886f618ec2cf62729b685eaa745bb6092e5d5684164887ad27c1b81fd4281/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Oct 14 03:52:36 localhost podman[31099]: 2025-10-14 07:52:36.832661852 +0000 UTC m=+0.041874278 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Oct 14 03:52:36 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ad8886f618ec2cf62729b685eaa745bb6092e5d5684164887ad27c1b81fd4281/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Oct 14 03:52:36 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ad8886f618ec2cf62729b685eaa745bb6092e5d5684164887ad27c1b81fd4281/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Oct 14 03:52:36 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ad8886f618ec2cf62729b685eaa745bb6092e5d5684164887ad27c1b81fd4281/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Oct 14 03:52:36 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ad8886f618ec2cf62729b685eaa745bb6092e5d5684164887ad27c1b81fd4281/merged/var/lib/ceph/osd/ceph-0 supports timestamps until 2038 (0x7fffffff)
Oct 14 03:52:36 localhost podman[31099]: 2025-10-14 07:52:36.971229834 +0000 UTC m=+0.180442190 container init b8f9d49a4410fedb77e9139bafb9a1006faa78e8ee6a373c292045f7e1a4dae2 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-osd-0-activate-test, RELEASE=main, release=553, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, distribution-scope=public, maintainer=Guillaume Abrioux , url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., ceph=True, GIT_CLEAN=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, version=7, name=rhceph, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-09-24T08:57:55, description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, architecture=x86_64, CEPH_POINT_RELEASE=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Oct 14 03:52:36 localhost podman[31099]: 2025-10-14 07:52:36.981599274 +0000 UTC m=+0.190811640 container start b8f9d49a4410fedb77e9139bafb9a1006faa78e8ee6a373c292045f7e1a4dae2 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-osd-0-activate-test, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, version=7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, RELEASE=main, io.buildah.version=1.33.12, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-09-24T08:57:55, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.tags=rhceph ceph, release=553, architecture=x86_64, name=rhceph, ceph=True, GIT_CLEAN=True, maintainer=Guillaume Abrioux )
Oct 14 03:52:36 localhost podman[31099]: 2025-10-14 07:52:36.982095967 +0000 UTC m=+0.191308373 container attach b8f9d49a4410fedb77e9139bafb9a1006faa78e8ee6a373c292045f7e1a4dae2 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-osd-0-activate-test, io.openshift.tags=rhceph ceph, distribution-scope=public, name=rhceph, vendor=Red Hat, Inc., GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-09-24T08:57:55, ceph=True, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, release=553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, maintainer=Guillaume Abrioux , io.k8s.description=Red Hat Ceph Storage 7, version=7, architecture=x86_64, io.openshift.expose-services=, com.redhat.component=rhceph-container)
Oct 14 03:52:37 localhost ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-osd-0-activate-test[31114]: usage: ceph-volume activate [-h] [--osd-id OSD_ID] [--osd-uuid OSD_UUID]
Oct 14 03:52:37 localhost ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-osd-0-activate-test[31114]: [--no-systemd] [--no-tmpfs]
Oct 14 03:52:37 localhost ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-osd-0-activate-test[31114]: ceph-volume activate: error: unrecognized arguments: --bad-option
Oct 14 03:52:37 localhost systemd[1]: libpod-b8f9d49a4410fedb77e9139bafb9a1006faa78e8ee6a373c292045f7e1a4dae2.scope: Deactivated successfully.
Oct 14 03:52:37 localhost podman[31099]: 2025-10-14 07:52:37.206589287 +0000 UTC m=+0.415801683 container died b8f9d49a4410fedb77e9139bafb9a1006faa78e8ee6a373c292045f7e1a4dae2 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-osd-0-activate-test, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, version=7, vendor=Red Hat, Inc., GIT_BRANCH=main, maintainer=Guillaume Abrioux , vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, release=553, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, RELEASE=main, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, GIT_CLEAN=True, name=rhceph, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.33.12, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, build-date=2025-09-24T08:57:55, io.openshift.tags=rhceph ceph, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements)
Oct 14 03:52:37 localhost podman[31119]: 2025-10-14 07:52:37.27909437 +0000 UTC m=+0.065271892 container remove b8f9d49a4410fedb77e9139bafb9a1006faa78e8ee6a373c292045f7e1a4dae2 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-osd-0-activate-test, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-type=git, release=553, architecture=x86_64, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=Guillaume Abrioux , GIT_CLEAN=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_BRANCH=main, name=rhceph, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, version=7, io.openshift.expose-services=, io.buildah.version=1.33.12, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, build-date=2025-09-24T08:57:55, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, RELEASE=main)
Oct 14 03:52:37 localhost systemd-journald[618]: Field hash table of /run/log/journal/8e1d5208cffec42b50976967e1d1cfd0/system.journal has a fill level at 75.1 (250 of 333 items), suggesting rotation.
Oct 14 03:52:37 localhost systemd-journald[618]: /run/log/journal/8e1d5208cffec42b50976967e1d1cfd0/system.journal: Journal header limits reached or header out-of-date, rotating.
Oct 14 03:52:37 localhost rsyslogd[759]: imjournal: journal files changed, reloading... [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Oct 14 03:52:37 localhost systemd[1]: libpod-conmon-b8f9d49a4410fedb77e9139bafb9a1006faa78e8ee6a373c292045f7e1a4dae2.scope: Deactivated successfully.
Oct 14 03:52:37 localhost rsyslogd[759]: imjournal: journal files changed, reloading... [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Oct 14 03:52:37 localhost systemd[1]: Reloading.
Oct 14 03:52:37 localhost systemd-sysv-generator[31175]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 03:52:37 localhost systemd-rc-local-generator[31172]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 03:52:37 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 03:52:37 localhost systemd[1]: var-lib-containers-storage-overlay-ad8886f618ec2cf62729b685eaa745bb6092e5d5684164887ad27c1b81fd4281-merged.mount: Deactivated successfully.
Oct 14 03:52:37 localhost systemd[1]: Reloading.
Oct 14 03:52:37 localhost systemd-rc-local-generator[31209]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 03:52:37 localhost systemd-sysv-generator[31215]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 03:52:37 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 03:52:38 localhost systemd[1]: Starting Ceph osd.0 for fcadf6e2-9176-5818-a8d0-37b19acf8eaf...
Oct 14 03:52:38 localhost podman[31282]:
Oct 14 03:52:38 localhost podman[31282]: 2025-10-14 07:52:38.386413184 +0000 UTC m=+0.073013878 container create 6fd9bc4c53e25f9a3f543a9e16cbaa16983e6f6c7133753e0cb6d227bc404b12 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-osd-0-activate, io.openshift.expose-services=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, ceph=True, build-date=2025-09-24T08:57:55, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=553, RELEASE=main, distribution-scope=public, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux , vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_BRANCH=main, io.buildah.version=1.33.12, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vendor=Red Hat, Inc., version=7, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7)
Oct 14 03:52:38 localhost systemd[1]: Started libcrun container.
Oct 14 03:52:38 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/56bb065678c27143a9f3b929f9f11509f8891ea66e441b3758cf4d7843fbdeb5/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Oct 14 03:52:38 localhost podman[31282]: 2025-10-14 07:52:38.356875789 +0000 UTC m=+0.043476553 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Oct 14 03:52:38 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/56bb065678c27143a9f3b929f9f11509f8891ea66e441b3758cf4d7843fbdeb5/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Oct 14 03:52:38 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/56bb065678c27143a9f3b929f9f11509f8891ea66e441b3758cf4d7843fbdeb5/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Oct 14 03:52:38 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/56bb065678c27143a9f3b929f9f11509f8891ea66e441b3758cf4d7843fbdeb5/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Oct 14 03:52:38 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/56bb065678c27143a9f3b929f9f11509f8891ea66e441b3758cf4d7843fbdeb5/merged/var/lib/ceph/osd/ceph-0 supports timestamps until 2038 (0x7fffffff)
Oct 14 03:52:38 localhost podman[31282]: 2025-10-14 07:52:38.511247218 +0000 UTC m=+0.197847932 container init 6fd9bc4c53e25f9a3f543a9e16cbaa16983e6f6c7133753e0cb6d227bc404b12 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-osd-0-activate, vcs-type=git, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=Guillaume Abrioux , architecture=x86_64, release=553, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, version=7, GIT_BRANCH=main, ceph=True, build-date=2025-09-24T08:57:55, name=rhceph, RELEASE=main, distribution-scope=public, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553)
Oct 14 03:52:38 localhost podman[31282]: 2025-10-14 07:52:38.52059869 +0000 UTC m=+0.207199394 container start 6fd9bc4c53e25f9a3f543a9e16cbaa16983e6f6c7133753e0cb6d227bc404b12 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-osd-0-activate, GIT_CLEAN=True, version=7, io.buildah.version=1.33.12, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, release=553, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, maintainer=Guillaume Abrioux , GIT_BRANCH=main, ceph=True, build-date=2025-09-24T08:57:55, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, vcs-type=git, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/agreements)
Oct 14 03:52:38 localhost podman[31282]: 2025-10-14 07:52:38.520879983 +0000 UTC m=+0.207480757 container attach 6fd9bc4c53e25f9a3f543a9e16cbaa16983e6f6c7133753e0cb6d227bc404b12 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-osd-0-activate, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, GIT_BRANCH=main, release=553, CEPH_POINT_RELEASE=, build-date=2025-09-24T08:57:55, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.buildah.version=1.33.12, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, vendor=Red Hat, Inc., version=7, maintainer=Guillaume Abrioux , name=rhceph, distribution-scope=public, io.openshift.expose-services=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, description=Red Hat Ceph Storage 7, vcs-type=git)
Oct 14 03:52:39 localhost ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-osd-0-activate[31297]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0
Oct 14 03:52:39 localhost bash[31282]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0
Oct 14 03:52:39 localhost ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-osd-0-activate[31297]: Running command: /usr/bin/ceph-bluestore-tool prime-osd-dir --path /var/lib/ceph/osd/ceph-0 --no-mon-config --dev /dev/mapper/ceph_vg0-ceph_lv0
Oct 14 03:52:39 localhost bash[31282]: Running command: /usr/bin/ceph-bluestore-tool prime-osd-dir --path /var/lib/ceph/osd/ceph-0 --no-mon-config --dev /dev/mapper/ceph_vg0-ceph_lv0
Oct 14 03:52:39 localhost ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-osd-0-activate[31297]: Running command: /usr/bin/chown -h ceph:ceph /dev/mapper/ceph_vg0-ceph_lv0
Oct 14 03:52:39 localhost bash[31282]: Running command: /usr/bin/chown -h ceph:ceph /dev/mapper/ceph_vg0-ceph_lv0
Oct 14 03:52:39 localhost ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-osd-0-activate[31297]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0
Oct 14 03:52:39 localhost bash[31282]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0
Oct 14 03:52:39 localhost ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-osd-0-activate[31297]: Running command: /usr/bin/ln -s /dev/mapper/ceph_vg0-ceph_lv0 /var/lib/ceph/osd/ceph-0/block
Oct 14 03:52:39 localhost bash[31282]: Running command: /usr/bin/ln -s /dev/mapper/ceph_vg0-ceph_lv0 /var/lib/ceph/osd/ceph-0/block
Oct 14 03:52:39 localhost ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-osd-0-activate[31297]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0
Oct 14 03:52:39 localhost bash[31282]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0
Oct 14 03:52:39 localhost ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-osd-0-activate[31297]: --> ceph-volume raw activate successful for osd ID: 0
Oct 14 03:52:39 localhost bash[31282]: --> ceph-volume raw activate successful for osd ID: 0
Oct 14 03:52:39 localhost systemd[1]: libpod-6fd9bc4c53e25f9a3f543a9e16cbaa16983e6f6c7133753e0cb6d227bc404b12.scope: Deactivated successfully.
Oct 14 03:52:39 localhost podman[31422]: 2025-10-14 07:52:39.321009161 +0000 UTC m=+0.052740401 container died 6fd9bc4c53e25f9a3f543a9e16cbaa16983e6f6c7133753e0cb6d227bc404b12 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-osd-0-activate, description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, version=7, distribution-scope=public, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, ceph=True, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, architecture=x86_64, GIT_CLEAN=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.tags=rhceph ceph, vcs-type=git, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, release=553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vendor=Red Hat, Inc., build-date=2025-09-24T08:57:55, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux )
Oct 14 03:52:39 localhost systemd[1]: var-lib-containers-storage-overlay-56bb065678c27143a9f3b929f9f11509f8891ea66e441b3758cf4d7843fbdeb5-merged.mount: Deactivated successfully.
Oct 14 03:52:39 localhost podman[31422]: 2025-10-14 07:52:39.361340096 +0000 UTC m=+0.093071306 container remove 6fd9bc4c53e25f9a3f543a9e16cbaa16983e6f6c7133753e0cb6d227bc404b12 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-osd-0-activate, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, architecture=x86_64, distribution-scope=public, vcs-type=git, build-date=2025-09-24T08:57:55, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, release=553, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, name=rhceph, vendor=Red Hat, Inc., ceph=True, version=7, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, io.buildah.version=1.33.12, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux )
Oct 14 03:52:39 localhost podman[31481]:
Oct 14 03:52:39 localhost podman[31481]: 2025-10-14 07:52:39.651288946 +0000 UTC m=+0.057803270 container create 3ea3299b616e6be7c3f1fc07edfa13deb6ae4569f84e02dba62c43b301102f2d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-osd-0, io.openshift.tags=rhceph ceph, release=553, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, ceph=True, vcs-type=git, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., GIT_BRANCH=main, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux , io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, GIT_CLEAN=True, architecture=x86_64, RELEASE=main)
Oct 14 03:52:39 localhost systemd[1]: tmp-crun.JBonqA.mount: Deactivated successfully.
Oct 14 03:52:39 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/45c951e2d3741128403aa846ae0490f058938549be08d7d5319b52044e315772/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Oct 14 03:52:39 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/45c951e2d3741128403aa846ae0490f058938549be08d7d5319b52044e315772/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Oct 14 03:52:39 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/45c951e2d3741128403aa846ae0490f058938549be08d7d5319b52044e315772/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Oct 14 03:52:39 localhost podman[31481]: 2025-10-14 07:52:39.632343661 +0000 UTC m=+0.038857985 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Oct 14 03:52:39 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/45c951e2d3741128403aa846ae0490f058938549be08d7d5319b52044e315772/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Oct 14 03:52:39 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/45c951e2d3741128403aa846ae0490f058938549be08d7d5319b52044e315772/merged/var/lib/ceph/osd/ceph-0 supports timestamps until 2038 (0x7fffffff)
Oct 14 03:52:39 localhost podman[31481]: 2025-10-14 07:52:39.749356036 +0000 UTC m=+0.155870380 container init 3ea3299b616e6be7c3f1fc07edfa13deb6ae4569f84e02dba62c43b301102f2d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-osd-0, architecture=x86_64, RELEASE=main, vcs-type=git, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux , GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.buildah.version=1.33.12, distribution-scope=public, ceph=True, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, build-date=2025-09-24T08:57:55, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, release=553, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.description=Red Hat Ceph Storage 7)
Oct 14 03:52:39 localhost podman[31481]: 2025-10-14 07:52:39.757970443 +0000 UTC m=+0.164484787 container start 3ea3299b616e6be7c3f1fc07edfa13deb6ae4569f84e02dba62c43b301102f2d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-osd-0, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.33.12, vcs-type=git, release=553, CEPH_POINT_RELEASE=, ceph=True, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, RELEASE=main, build-date=2025-09-24T08:57:55, GIT_CLEAN=True, architecture=x86_64, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, maintainer=Guillaume Abrioux , GIT_BRANCH=main, io.openshift.tags=rhceph ceph, distribution-scope=public, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7)
Oct 14 03:52:39 localhost bash[31481]: 3ea3299b616e6be7c3f1fc07edfa13deb6ae4569f84e02dba62c43b301102f2d
Oct 14 03:52:39 localhost systemd[1]: Started Ceph osd.0 for fcadf6e2-9176-5818-a8d0-37b19acf8eaf.
Oct 14 03:52:39 localhost ceph-osd[31500]: set uid:gid to 167:167 (ceph:ceph)
Oct 14 03:52:39 localhost ceph-osd[31500]: ceph version 18.2.1-361.el9cp (439dcd6094d413840eb2ec590fe2194ec616687f) reef (stable), process ceph-osd, pid 2
Oct 14 03:52:39 localhost ceph-osd[31500]: pidfile_write: ignore empty --pid-file
Oct 14 03:52:39 localhost ceph-osd[31500]: bdev(0x5613efbcee00 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block
Oct 14 03:52:39 localhost ceph-osd[31500]: bdev(0x5613efbcee00 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument
Oct 14 03:52:39 localhost ceph-osd[31500]: bdev(0x5613efbcee00 /var/lib/ceph/osd/ceph-0/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Oct 14 03:52:39 localhost ceph-osd[31500]: bluestore(/var/lib/ceph/osd/ceph-0) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 data 0.06
Oct 14 03:52:39 localhost ceph-osd[31500]: bdev(0x5613efbcf180 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block
Oct 14 03:52:39 localhost ceph-osd[31500]: bdev(0x5613efbcf180 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument
Oct 14 03:52:39 localhost ceph-osd[31500]: bdev(0x5613efbcf180 /var/lib/ceph/osd/ceph-0/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Oct 14 03:52:39 localhost ceph-osd[31500]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-0/block size 7.0 GiB
Oct 14 03:52:39 localhost ceph-osd[31500]: bdev(0x5613efbcf180 /var/lib/ceph/osd/ceph-0/block) close
Oct 14 03:52:40 localhost ceph-osd[31500]: bdev(0x5613efbcee00 /var/lib/ceph/osd/ceph-0/block) close
Oct 14 03:52:40 localhost ceph-osd[31500]: starting osd.0 osd_data /var/lib/ceph/osd/ceph-0 /var/lib/ceph/osd/ceph-0/journal
Oct 14 03:52:40 localhost ceph-osd[31500]: load: jerasure load: lrc
Oct 14 03:52:40 localhost ceph-osd[31500]: bdev(0x5613efbcee00 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block
Oct 14 03:52:40 localhost ceph-osd[31500]: bdev(0x5613efbcee00 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument
Oct 14 03:52:40 localhost ceph-osd[31500]: bdev(0x5613efbcee00 /var/lib/ceph/osd/ceph-0/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Oct 14 03:52:40 localhost ceph-osd[31500]: bluestore(/var/lib/ceph/osd/ceph-0) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 data 0.06
Oct 14 03:52:40 localhost ceph-osd[31500]: bdev(0x5613efbcee00 /var/lib/ceph/osd/ceph-0/block) close
Oct 14 03:52:40 localhost podman[31591]:
Oct 14 03:52:40 localhost podman[31591]: 2025-10-14 07:52:40.636779006 +0000 UTC m=+0.061466873 container create 6897cfd2f01d548aed3a1b465ed90085a83ac8c369d226d6d4028a4347d62d16
(image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=condescending_margulis, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, architecture=x86_64, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, vcs-type=git, description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, build-date=2025-09-24T08:57:55, com.redhat.component=rhceph-container, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.expose-services=, version=7, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_REPO=https://github.com/ceph/ceph-container.git, release=553, name=rhceph, ceph=True, distribution-scope=public, maintainer=Guillaume Abrioux , CEPH_POINT_RELEASE=) Oct 14 03:52:40 localhost ceph-osd[31500]: bdev(0x5613efbcee00 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block Oct 14 03:52:40 localhost ceph-osd[31500]: bdev(0x5613efbcee00 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument Oct 14 03:52:40 localhost ceph-osd[31500]: bdev(0x5613efbcee00 /var/lib/ceph/osd/ceph-0/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported Oct 14 03:52:40 localhost ceph-osd[31500]: bluestore(/var/lib/ceph/osd/ceph-0) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 data 0.06 Oct 14 03:52:40 localhost ceph-osd[31500]: bdev(0x5613efbcee00 /var/lib/ceph/osd/ceph-0/block) close Oct 14 03:52:40 localhost systemd[1]: Started 
libpod-conmon-6897cfd2f01d548aed3a1b465ed90085a83ac8c369d226d6d4028a4347d62d16.scope. Oct 14 03:52:40 localhost systemd[1]: Started libcrun container. Oct 14 03:52:40 localhost podman[31591]: 2025-10-14 07:52:40.707943226 +0000 UTC m=+0.132631103 container init 6897cfd2f01d548aed3a1b465ed90085a83ac8c369d226d6d4028a4347d62d16 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=condescending_margulis, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-09-24T08:57:55, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, description=Red Hat Ceph Storage 7, release=553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, maintainer=Guillaume Abrioux , name=rhceph, com.redhat.license_terms=https://www.redhat.com/agreements, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.buildah.version=1.33.12, CEPH_POINT_RELEASE=, GIT_CLEAN=True, version=7, com.redhat.component=rhceph-container) Oct 14 03:52:40 localhost podman[31591]: 2025-10-14 07:52:40.614905063 +0000 UTC m=+0.039592950 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Oct 14 03:52:40 localhost podman[31591]: 2025-10-14 07:52:40.722979097 +0000 UTC m=+0.147667004 container start 6897cfd2f01d548aed3a1b465ed90085a83ac8c369d226d6d4028a4347d62d16 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=condescending_margulis, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, 
vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.component=rhceph-container, io.openshift.expose-services=, vendor=Red Hat, Inc., name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, GIT_CLEAN=True, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, maintainer=Guillaume Abrioux , description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, release=553, io.openshift.tags=rhceph ceph, distribution-scope=public, CEPH_POINT_RELEASE=, ceph=True, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, build-date=2025-09-24T08:57:55, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/agreements) Oct 14 03:52:40 localhost podman[31591]: 2025-10-14 07:52:40.723388286 +0000 UTC m=+0.148076153 container attach 6897cfd2f01d548aed3a1b465ed90085a83ac8c369d226d6d4028a4347d62d16 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=condescending_margulis, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-09-24T08:57:55, version=7, RELEASE=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, architecture=x86_64, vcs-type=git, io.openshift.tags=rhceph ceph, io.buildah.version=1.33.12, ceph=True, distribution-scope=public, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, release=553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, maintainer=Guillaume 
Abrioux , name=rhceph, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0) Oct 14 03:52:40 localhost condescending_margulis[31610]: 167 167 Oct 14 03:52:40 localhost systemd[1]: libpod-6897cfd2f01d548aed3a1b465ed90085a83ac8c369d226d6d4028a4347d62d16.scope: Deactivated successfully. Oct 14 03:52:40 localhost podman[31591]: 2025-10-14 07:52:40.731453997 +0000 UTC m=+0.156141864 container died 6897cfd2f01d548aed3a1b465ed90085a83ac8c369d226d6d4028a4347d62d16 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=condescending_margulis, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux , build-date=2025-09-24T08:57:55, com.redhat.component=rhceph-container, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, distribution-scope=public, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, RELEASE=main, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/agreements, ceph=True, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., GIT_BRANCH=main, io.buildah.version=1.33.12, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, description=Red Hat Ceph Storage 7, release=553) Oct 14 03:52:40 localhost podman[31615]: 2025-10-14 07:52:40.842444017 +0000 UTC m=+0.095274840 container remove 6897cfd2f01d548aed3a1b465ed90085a83ac8c369d226d6d4028a4347d62d16 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=condescending_margulis, io.openshift.expose-services=, vcs-type=git, 
com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-09-24T08:57:55, GIT_BRANCH=main, maintainer=Guillaume Abrioux , CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, RELEASE=main, distribution-scope=public, release=553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, io.openshift.tags=rhceph ceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.buildah.version=1.33.12, version=7, architecture=x86_64, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vendor=Red Hat, Inc.) Oct 14 03:52:40 localhost systemd[1]: libpod-conmon-6897cfd2f01d548aed3a1b465ed90085a83ac8c369d226d6d4028a4347d62d16.scope: Deactivated successfully. 
Oct 14 03:52:40 localhost ceph-osd[31500]: mClockScheduler: set_osd_capacity_params_from_config: osd_bandwidth_cost_per_io: 499321.90 bytes/io, osd_bandwidth_capacity_per_shard 157286400.00 bytes/second Oct 14 03:52:40 localhost ceph-osd[31500]: osd.0:0.OSDShard using op scheduler mclock_scheduler, cutoff=196 Oct 14 03:52:40 localhost ceph-osd[31500]: bdev(0x5613efbcee00 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block Oct 14 03:52:40 localhost ceph-osd[31500]: bdev(0x5613efbcee00 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument Oct 14 03:52:40 localhost ceph-osd[31500]: bdev(0x5613efbcee00 /var/lib/ceph/osd/ceph-0/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported Oct 14 03:52:40 localhost ceph-osd[31500]: bluestore(/var/lib/ceph/osd/ceph-0) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 data 0.06 Oct 14 03:52:40 localhost ceph-osd[31500]: bdev(0x5613efbcf180 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block Oct 14 03:52:40 localhost ceph-osd[31500]: bdev(0x5613efbcf180 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument Oct 14 03:52:40 localhost ceph-osd[31500]: bdev(0x5613efbcf180 /var/lib/ceph/osd/ceph-0/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported Oct 14 03:52:40 localhost ceph-osd[31500]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-0/block size 7.0 GiB Oct 14 03:52:40 localhost ceph-osd[31500]: bluefs mount Oct 14 03:52:40 localhost ceph-osd[31500]: bluefs _init_alloc shared, id 1, capacity 0x1bfc00000, block size 0x10000 Oct 14 03:52:40 localhost ceph-osd[31500]: bluefs mount shared_bdev_used = 0 Oct 14 03:52:40 localhost ceph-osd[31500]: bluestore(/var/lib/ceph/osd/ceph-0) _prepare_db_environment set db_paths to 
db,7136398540 db.slow,7136398540 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: RocksDB version: 7.9.2 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Git sha 0 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Compile date 2025-09-23 00:00:00 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: DB SUMMARY Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: DB Session ID: K6O7FDU8XXZ2Q8LN71S9 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: CURRENT file: CURRENT Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: IDENTITY file: IDENTITY Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: MANIFEST file: MANIFEST-000032 size: 1007 Bytes Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: SST files in db dir, Total Num: 1, files: 000030.sst Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: SST files in db.slow dir, Total Num: 0, files: Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Write Ahead Log file in db.wal: 000031.log size: 5093 ; Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.error_if_exists: 0 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.create_if_missing: 0 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.paranoid_checks: 1 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.flush_verify_memtable_count: 1 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.track_and_verify_wals_in_manifest: 0 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.verify_sst_unique_id_in_manifest: 1 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.env: 0x5613efe62cb0 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.fs: LegacyFileSystem Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.info_log: 0x5613f0b56380 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.max_file_opening_threads: 16 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.statistics: (nil) Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.use_fsync: 0 Oct 14 03:52:40 
localhost ceph-osd[31500]: rocksdb: Options.max_log_file_size: 0 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.max_manifest_file_size: 1073741824 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.log_file_time_to_roll: 0 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.keep_log_file_num: 1000 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.recycle_log_file_num: 0 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.allow_fallocate: 1 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.allow_mmap_reads: 0 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.allow_mmap_writes: 0 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.use_direct_reads: 0 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.use_direct_io_for_flush_and_compaction: 0 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.create_missing_column_families: 0 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.db_log_dir: Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.wal_dir: db.wal Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.table_cache_numshardbits: 6 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.WAL_ttl_seconds: 0 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.WAL_size_limit_MB: 0 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.max_write_batch_group_size_bytes: 1048576 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.manifest_preallocation_size: 4194304 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.is_fd_close_on_exec: 1 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.advise_random_on_open: 1 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.db_write_buffer_size: 0 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.write_buffer_manager: 0x5613efbb8140 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.access_hint_on_compaction_start: 1 Oct 14 03:52:40 localhost 
ceph-osd[31500]: rocksdb: Options.random_access_max_buffer_size: 1048576 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.use_adaptive_mutex: 0 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.rate_limiter: (nil) Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.sst_file_manager.rate_bytes_per_sec: 0 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.wal_recovery_mode: 2 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.enable_thread_tracking: 0 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.enable_pipelined_write: 0 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.unordered_write: 0 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.allow_concurrent_memtable_write: 1 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.enable_write_thread_adaptive_yield: 1 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.write_thread_max_yield_usec: 100 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.write_thread_slow_yield_usec: 3 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.row_cache: None Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.wal_filter: None Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.avoid_flush_during_recovery: 0 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.allow_ingest_behind: 0 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.two_write_queues: 0 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.manual_wal_flush: 0 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.wal_compression: 0 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.atomic_flush: 0 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.avoid_unnecessary_blocking_io: 0 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.persist_stats_to_disk: 0 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.write_dbid_to_manifest: 0 Oct 14 03:52:40 localhost ceph-osd[31500]: 
rocksdb: Options.log_readahead_size: 0 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.file_checksum_gen_factory: Unknown Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.best_efforts_recovery: 0 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.max_bgerror_resume_count: 2147483647 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.bgerror_resume_retry_interval: 1000000 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.allow_data_in_errors: 0 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.db_host_id: __hostname__ Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.enforce_single_del_contracts: true Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.max_background_jobs: 4 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.max_background_compactions: -1 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.max_subcompactions: 1 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.avoid_flush_during_shutdown: 0 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.writable_file_max_buffer_size: 0 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.delayed_write_rate : 16777216 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.max_total_wal_size: 1073741824 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.delete_obsolete_files_period_micros: 21600000000 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.stats_dump_period_sec: 600 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.stats_persist_period_sec: 600 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.stats_history_buffer_size: 1048576 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.max_open_files: -1 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.bytes_per_sync: 0 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.wal_bytes_per_sync: 0 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: 
Options.strict_bytes_per_sync: 0 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.compaction_readahead_size: 2097152 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.max_background_flushes: -1 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Compression algorithms supported: Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: #011kZSTD supported: 0 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: #011kXpressCompression supported: 0 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: #011kBZip2Compression supported: 0 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: #011kZSTDNotFinalCompression supported: 0 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: #011kLZ4Compression supported: 1 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: #011kZlibCompression supported: 1 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: #011kLZ4HCCompression supported: 1 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: #011kSnappyCompression supported: 1 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Fast CRC32 supported: Supported on x86 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: DMutex implementation: pthread_mutex_t Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: [db/db_impl/db_impl_readonly.cc:25] Opening the db in read only mode Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: db/MANIFEST-000032 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 0, name: default) Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]: Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.comparator: leveldb.BytewiseComparator Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.merge_operator: .T:int64_array.b:bitwise_xor Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.compaction_filter: None Oct 14 
03:52:40 localhost ceph-osd[31500]: rocksdb: Options.compaction_filter_factory: None Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.sst_partitioner_factory: None Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.memtable_factory: SkipListFactory Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.table_factory: BlockBasedTable Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5613f0b56540)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x5613efba6850#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.write_buffer_size: 16777216 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.max_write_buffer_number: 64 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.compression: LZ4 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.bottommost_compression: Disabled Oct 14 
03:52:40 localhost ceph-osd[31500]: rocksdb: Options.prefix_extractor: nullptr Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.num_levels: 7 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.min_write_buffer_number_to_merge: 6 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.bottommost_compression_opts.level: 32767 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.bottommost_compression_opts.strategy: 0 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.bottommost_compression_opts.enabled: false Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.compression_opts.window_bits: -14 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.compression_opts.level: 32767 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.compression_opts.strategy: 0 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.compression_opts.max_dict_bytes: 0 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Oct 14 
03:52:40 localhost ceph-osd[31500]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.compression_opts.parallel_threads: 1 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.compression_opts.enabled: false Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.level0_file_num_compaction_trigger: 8 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.level0_slowdown_writes_trigger: 20 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.level0_stop_writes_trigger: 36 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.target_file_size_base: 67108864 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.target_file_size_multiplier: 1 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.max_bytes_for_level_base: 1073741824 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.max_sequential_skip_in_iterations: 8 Oct 14 03:52:40 localhost 
ceph-osd[31500]: rocksdb: Options.max_compaction_bytes: 1677721600
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.arena_block_size: 1048576
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.disable_auto_compactions: 0
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.compaction_style: kCompactionStyleLevel
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.compaction_pri: kMinOverlappingRatio
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.table_properties_collectors:
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.inplace_update_support: 0
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.inplace_update_num_locks: 10000
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.memtable_whole_key_filtering: 0
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.memtable_huge_page_size: 0
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.bloom_locality: 0
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.max_successive_merges: 0
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.optimize_filters_for_hits: 0
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.paranoid_file_checks: 0
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.force_consistency_checks: 1
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.report_bg_io_stats: 0
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.ttl: 2592000
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.periodic_compaction_seconds: 0
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.preclude_last_level_data_seconds: 0
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.preserve_internal_time_seconds: 0
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.enable_blob_files: false
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.min_blob_size: 0
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.blob_file_size: 268435456
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.blob_compression_type: NoCompression
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.enable_blob_garbage_collection: false
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.blob_compaction_readahead_size: 0
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.blob_file_starting_level: 0
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 1, name: m-0)
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-0]:
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.comparator: leveldb.BytewiseComparator
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.merge_operator: None
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.compaction_filter: None
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.compaction_filter_factory: None
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.sst_partitioner_factory: None
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.memtable_factory: SkipListFactory
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.table_factory: BlockBasedTable
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5613f0b56540)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x5613efba6850#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.write_buffer_size: 16777216
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.max_write_buffer_number: 64
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.compression: LZ4
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.bottommost_compression: Disabled
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.prefix_extractor: nullptr
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.num_levels: 7
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.min_write_buffer_number_to_merge: 6
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.max_write_buffer_number_to_maintain: 0
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.max_write_buffer_size_to_maintain: 0
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.bottommost_compression_opts.window_bits: -14
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.bottommost_compression_opts.level: 32767
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.bottommost_compression_opts.strategy: 0
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.bottommost_compression_opts.enabled: false
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.compression_opts.window_bits: -14
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.compression_opts.level: 32767
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.compression_opts.strategy: 0
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.compression_opts.max_dict_bytes: 0
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.compression_opts.parallel_threads: 1
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.compression_opts.enabled: false
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.level0_file_num_compaction_trigger: 8
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.level0_slowdown_writes_trigger: 20
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.level0_stop_writes_trigger: 36
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.target_file_size_base: 67108864
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.target_file_size_multiplier: 1
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.max_bytes_for_level_base: 1073741824
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.max_sequential_skip_in_iterations: 8
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.max_compaction_bytes: 1677721600
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.arena_block_size: 1048576
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.disable_auto_compactions: 0
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.compaction_style: kCompactionStyleLevel
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.compaction_pri: kMinOverlappingRatio
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.inplace_update_support: 0
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.inplace_update_num_locks: 10000
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.memtable_whole_key_filtering: 0
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.memtable_huge_page_size: 0
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.bloom_locality: 0
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.max_successive_merges: 0
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.optimize_filters_for_hits: 0
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.paranoid_file_checks: 0
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.force_consistency_checks: 1
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.report_bg_io_stats: 0
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.ttl: 2592000
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.periodic_compaction_seconds: 0
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.preclude_last_level_data_seconds: 0
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.preserve_internal_time_seconds: 0
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.enable_blob_files: false
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.min_blob_size: 0
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.blob_file_size: 268435456
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.blob_compression_type: NoCompression
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.enable_blob_garbage_collection: false
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.blob_compaction_readahead_size: 0
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.blob_file_starting_level: 0
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 2, name: m-1)
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-1]:
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.comparator: leveldb.BytewiseComparator
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.merge_operator: None
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.compaction_filter: None
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.compaction_filter_factory: None
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.sst_partitioner_factory: None
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.memtable_factory: SkipListFactory
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.table_factory: BlockBasedTable
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5613f0b56540)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x5613efba6850#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.write_buffer_size: 16777216
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.max_write_buffer_number: 64
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.compression: LZ4
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.bottommost_compression: Disabled
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.prefix_extractor: nullptr
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.num_levels: 7
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.min_write_buffer_number_to_merge: 6
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.max_write_buffer_number_to_maintain: 0
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.max_write_buffer_size_to_maintain: 0
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.bottommost_compression_opts.window_bits: -14
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.bottommost_compression_opts.level: 32767
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.bottommost_compression_opts.strategy: 0
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.bottommost_compression_opts.enabled: false
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.compression_opts.window_bits: -14
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.compression_opts.level: 32767
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.compression_opts.strategy: 0
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.compression_opts.max_dict_bytes: 0
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.compression_opts.parallel_threads: 1
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.compression_opts.enabled: false
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.level0_file_num_compaction_trigger: 8
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.level0_slowdown_writes_trigger: 20
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.level0_stop_writes_trigger: 36
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.target_file_size_base: 67108864
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.target_file_size_multiplier: 1
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.max_bytes_for_level_base: 1073741824
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.max_sequential_skip_in_iterations: 8
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.max_compaction_bytes: 1677721600
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.arena_block_size: 1048576
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.disable_auto_compactions: 0
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.compaction_style: kCompactionStyleLevel
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.compaction_pri: kMinOverlappingRatio
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.inplace_update_support: 0
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.inplace_update_num_locks: 10000
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.memtable_whole_key_filtering: 0
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.memtable_huge_page_size: 0
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.bloom_locality: 0
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.max_successive_merges: 0
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.optimize_filters_for_hits: 0
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.paranoid_file_checks: 0
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.force_consistency_checks: 1
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.report_bg_io_stats: 0
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.ttl: 2592000
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.periodic_compaction_seconds: 0
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.preclude_last_level_data_seconds: 0
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.preserve_internal_time_seconds: 0
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.enable_blob_files: false
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.min_blob_size: 0
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.blob_file_size: 268435456
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.blob_compression_type: NoCompression
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.enable_blob_garbage_collection: false
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.blob_compaction_readahead_size: 0
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.blob_file_starting_level: 0
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 3, name: m-2)
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-2]:
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.comparator: leveldb.BytewiseComparator
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.merge_operator: None
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.compaction_filter: None
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.compaction_filter_factory: None
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.sst_partitioner_factory: None
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.memtable_factory: SkipListFactory
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.table_factory: BlockBasedTable
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5613f0b56540)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x5613efba6850#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.write_buffer_size: 16777216
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.max_write_buffer_number: 64
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.compression: LZ4
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.bottommost_compression: Disabled
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.prefix_extractor: nullptr
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.num_levels: 7
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.min_write_buffer_number_to_merge: 6
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.max_write_buffer_number_to_maintain: 0
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.max_write_buffer_size_to_maintain: 0
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.bottommost_compression_opts.window_bits: -14
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.bottommost_compression_opts.level: 32767
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.bottommost_compression_opts.strategy: 0
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.bottommost_compression_opts.enabled: false
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.compression_opts.window_bits: -14
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.compression_opts.level: 32767
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.compression_opts.strategy: 0
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.compression_opts.max_dict_bytes: 0
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.compression_opts.parallel_threads: 1
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.compression_opts.enabled: false
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.level0_file_num_compaction_trigger: 8
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.level0_slowdown_writes_trigger: 20
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.level0_stop_writes_trigger: 36
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.target_file_size_base: 67108864
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.target_file_size_multiplier: 1
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.max_bytes_for_level_base: 1073741824
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.max_sequential_skip_in_iterations: 8
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.max_compaction_bytes: 1677721600
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.arena_block_size: 1048576
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.disable_auto_compactions: 0
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.compaction_style: kCompactionStyleLevel
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.compaction_pri: kMinOverlappingRatio
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.inplace_update_support: 0
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.inplace_update_num_locks: 10000
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.memtable_whole_key_filtering: 0
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.memtable_huge_page_size: 0
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.bloom_locality: 0
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.max_successive_merges: 0
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.optimize_filters_for_hits: 0
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.paranoid_file_checks: 0
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.force_consistency_checks: 1
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.report_bg_io_stats: 0
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.ttl: 2592000
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.periodic_compaction_seconds: 0
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.preclude_last_level_data_seconds: 0
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.preserve_internal_time_seconds: 0
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.enable_blob_files: false
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.min_blob_size: 0
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.blob_file_size: 268435456
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.blob_compression_type: NoCompression
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.enable_blob_garbage_collection: false
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.blob_compaction_readahead_size: 0
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.blob_file_starting_level: 0
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 4, name: p-0)
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-0]:
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.comparator: leveldb.BytewiseComparator
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.merge_operator: None
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.compaction_filter: None
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.compaction_filter_factory: None
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.sst_partitioner_factory: None
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.memtable_factory: SkipListFactory
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.table_factory: BlockBasedTable
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5613f0b56540)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x5613efba6850#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.write_buffer_size: 16777216
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.max_write_buffer_number: 64
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.compression: LZ4
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.bottommost_compression: Disabled
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.prefix_extractor: nullptr
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.num_levels: 7
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.min_write_buffer_number_to_merge: 6
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.max_write_buffer_number_to_maintain: 0
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.max_write_buffer_size_to_maintain: 0
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.bottommost_compression_opts.window_bits: -14
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.bottommost_compression_opts.level: 32767
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.bottommost_compression_opts.strategy: 0
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.bottommost_compression_opts.enabled: false
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.compression_opts.window_bits: -14
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.compression_opts.level: 32767
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.compression_opts.strategy: 0
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.compression_opts.max_dict_bytes: 0
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.compression_opts.parallel_threads: 1
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.compression_opts.enabled: false
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.level0_file_num_compaction_trigger: 8
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.level0_slowdown_writes_trigger: 20
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.level0_stop_writes_trigger: 36
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.target_file_size_base: 67108864
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.target_file_size_multiplier: 1
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.max_bytes_for_level_base: 1073741824
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: 
Options.max_bytes_for_level_multiplier: 8.000000 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.max_sequential_skip_in_iterations: 8 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.max_compaction_bytes: 1677721600 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.arena_block_size: 1048576 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.disable_auto_compactions: 0 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.compaction_style: kCompactionStyleLevel Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.compaction_options_universal.size_ratio: 1 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Oct 14 03:52:40 
localhost ceph-osd[31500]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0); Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.inplace_update_support: 0 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.inplace_update_num_locks: 10000 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.memtable_whole_key_filtering: 0 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.memtable_huge_page_size: 0 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.bloom_locality: 0 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.max_successive_merges: 0 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.optimize_filters_for_hits: 0 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.paranoid_file_checks: 0 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.force_consistency_checks: 1 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.report_bg_io_stats: 0 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.ttl: 2592000 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.periodic_compaction_seconds: 0 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.preclude_last_level_data_seconds: 0 Oct 
14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.preserve_internal_time_seconds: 0 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.enable_blob_files: false Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.min_blob_size: 0 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.blob_file_size: 268435456 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.blob_compression_type: NoCompression Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.enable_blob_garbage_collection: false Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.blob_compaction_readahead_size: 0 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.blob_file_starting_level: 0 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 5, name: p-1) Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-1]: Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.comparator: leveldb.BytewiseComparator Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.merge_operator: None Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.compaction_filter: None Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.compaction_filter_factory: None Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.sst_partitioner_factory: None Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.memtable_factory: SkipListFactory Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.table_factory: BlockBasedTable Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: table_factory 
options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5613f0b56540)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x5613efba6850#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.write_buffer_size: 16777216 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.max_write_buffer_number: 64 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.compression: LZ4 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.bottommost_compression: Disabled Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.prefix_extractor: nullptr Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.num_levels: 7 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.min_write_buffer_number_to_merge: 6 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: 
Options.max_write_buffer_number_to_maintain: 0 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.bottommost_compression_opts.level: 32767 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.bottommost_compression_opts.strategy: 0 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.bottommost_compression_opts.enabled: false Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.compression_opts.window_bits: -14 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.compression_opts.level: 32767 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.compression_opts.strategy: 0 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.compression_opts.max_dict_bytes: 0 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.compression_opts.parallel_threads: 1 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.compression_opts.enabled: false Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Oct 14 03:52:40 localhost 
ceph-osd[31500]: rocksdb: Options.level0_file_num_compaction_trigger: 8 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.level0_slowdown_writes_trigger: 20 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.level0_stop_writes_trigger: 36 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.target_file_size_base: 67108864 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.target_file_size_multiplier: 1 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.max_bytes_for_level_base: 1073741824 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.max_sequential_skip_in_iterations: 8 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.max_compaction_bytes: 1677721600 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.arena_block_size: 1048576 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 Oct 14 03:52:40 localhost ceph-osd[31500]: 
rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.disable_auto_compactions: 0 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.compaction_style: kCompactionStyleLevel Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.compaction_options_universal.size_ratio: 1 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0); Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.inplace_update_support: 0 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.inplace_update_num_locks: 10000 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.memtable_whole_key_filtering: 0 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.memtable_huge_page_size: 0 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: 
Options.bloom_locality: 0 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.max_successive_merges: 0 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.optimize_filters_for_hits: 0 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.paranoid_file_checks: 0 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.force_consistency_checks: 1 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.report_bg_io_stats: 0 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.ttl: 2592000 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.periodic_compaction_seconds: 0 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.preclude_last_level_data_seconds: 0 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.preserve_internal_time_seconds: 0 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.enable_blob_files: false Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.min_blob_size: 0 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.blob_file_size: 268435456 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.blob_compression_type: NoCompression Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.enable_blob_garbage_collection: false Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.blob_compaction_readahead_size: 0 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.blob_file_starting_level: 0 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 6, name: p-2) Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: [db/column_family.cc:630] --------------- Options for 
column family [p-2]: Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.comparator: leveldb.BytewiseComparator Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.merge_operator: None Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.compaction_filter: None Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.compaction_filter_factory: None Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.sst_partitioner_factory: None Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.memtable_factory: SkipListFactory Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.table_factory: BlockBasedTable Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5613f0b56540)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x5613efba6850#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: 
Options.write_buffer_size: 16777216 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.max_write_buffer_number: 64 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.compression: LZ4 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.bottommost_compression: Disabled Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.prefix_extractor: nullptr Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.num_levels: 7 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.min_write_buffer_number_to_merge: 6 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.bottommost_compression_opts.level: 32767 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.bottommost_compression_opts.strategy: 0 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.bottommost_compression_opts.enabled: false Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.compression_opts.window_bits: -14 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.compression_opts.level: 
32767 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.compression_opts.strategy: 0 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.compression_opts.max_dict_bytes: 0 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.compression_opts.parallel_threads: 1 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.compression_opts.enabled: false Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.level0_file_num_compaction_trigger: 8 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.level0_slowdown_writes_trigger: 20 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.level0_stop_writes_trigger: 36 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.target_file_size_base: 67108864 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.target_file_size_multiplier: 1 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.max_bytes_for_level_base: 1073741824 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 Oct 14 03:52:40 localhost 
ceph-osd[31500]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.max_sequential_skip_in_iterations: 8 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.max_compaction_bytes: 1677721600 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.arena_block_size: 1048576 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.disable_auto_compactions: 0 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.compaction_style: kCompactionStyleLevel Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.compaction_options_universal.size_ratio: 1 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Oct 14 
03:52:40 localhost ceph-osd[31500]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0); Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.inplace_update_support: 0 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.inplace_update_num_locks: 10000 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.memtable_whole_key_filtering: 0 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.memtable_huge_page_size: 0 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.bloom_locality: 0 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.max_successive_merges: 0 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.optimize_filters_for_hits: 0 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.paranoid_file_checks: 0 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.force_consistency_checks: 1 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.report_bg_io_stats: 0 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.ttl: 2592000 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.periodic_compaction_seconds: 0 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.preclude_last_level_data_seconds: 0 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.preserve_internal_time_seconds: 0 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.enable_blob_files: false Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.min_blob_size: 0 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.blob_file_size: 268435456 Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.blob_compression_type: NoCompression Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.enable_blob_garbage_collection: false Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: 
Options.blob_garbage_collection_age_cutoff: 0.250000
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.blob_compaction_readahead_size: 0
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.blob_file_starting_level: 0
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 7, name: O-0)
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-0]:
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.comparator: leveldb.BytewiseComparator
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.merge_operator: None
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.compaction_filter: None
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.compaction_filter_factory: None
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.sst_partitioner_factory: None
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.memtable_factory: SkipListFactory
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.table_factory: BlockBasedTable
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5613f0b56760)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x5613efba62d0#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 536870912#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.write_buffer_size: 16777216
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.max_write_buffer_number: 64
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.compression: LZ4
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.bottommost_compression: Disabled
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.prefix_extractor: nullptr
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.num_levels: 7
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.min_write_buffer_number_to_merge: 6
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.max_write_buffer_number_to_maintain: 0
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.max_write_buffer_size_to_maintain: 0
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.bottommost_compression_opts.window_bits: -14
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.bottommost_compression_opts.level: 32767
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.bottommost_compression_opts.strategy: 0
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.bottommost_compression_opts.enabled: false
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.compression_opts.window_bits: -14
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.compression_opts.level: 32767
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.compression_opts.strategy: 0
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.compression_opts.max_dict_bytes: 0
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.compression_opts.parallel_threads: 1
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.compression_opts.enabled: false
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.level0_file_num_compaction_trigger: 8
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.level0_slowdown_writes_trigger: 20
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.level0_stop_writes_trigger: 36
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.target_file_size_base: 67108864
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.target_file_size_multiplier: 1
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.max_bytes_for_level_base: 1073741824
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.max_sequential_skip_in_iterations: 8
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.max_compaction_bytes: 1677721600
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.arena_block_size: 1048576
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.disable_auto_compactions: 0
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.compaction_style: kCompactionStyleLevel
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.compaction_pri: kMinOverlappingRatio
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.inplace_update_support: 0
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.inplace_update_num_locks: 10000
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.memtable_whole_key_filtering: 0
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.memtable_huge_page_size: 0
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.bloom_locality: 0
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.max_successive_merges: 0
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.optimize_filters_for_hits: 0
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.paranoid_file_checks: 0
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.force_consistency_checks: 1
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.report_bg_io_stats: 0
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.ttl: 2592000
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb:
Options.periodic_compaction_seconds: 0
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.preclude_last_level_data_seconds: 0
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.preserve_internal_time_seconds: 0
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.enable_blob_files: false
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.min_blob_size: 0
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.blob_file_size: 268435456
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.blob_compression_type: NoCompression
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.enable_blob_garbage_collection: false
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.blob_compaction_readahead_size: 0
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.blob_file_starting_level: 0
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 8, name: O-1)
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-1]:
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 9, name: O-2)
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-2]:
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb:
Options.blob_compression_type: NoCompression
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.enable_blob_garbage_collection: false
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.blob_compaction_readahead_size: 0
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.blob_file_starting_level: 0
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 10, name: L)
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: [db/column_family.cc:635] #011(skipping printing options)
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 11, name: P)
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: [db/column_family.cc:635] #011(skipping printing options)
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:db/MANIFEST-000032 succeeded,manifest_file_number is 32, next_file_number is 34, last_sequence is 12, log_number is 5,prev_log_number is 0,max_column_family is 11,min_log_number_to_keep is 5
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 5
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: [db/version_set.cc:5581] Column family [m-0] (ID 1), log number is 5
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: [db/version_set.cc:5581] Column family [m-1] (ID 2), log number is 5
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: [db/version_set.cc:5581] Column family [m-2] (ID 3), log number is 5
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: [db/version_set.cc:5581] Column family [p-0] (ID 4), log number is 5
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: [db/version_set.cc:5581] Column family [p-1] (ID 5), log number is 5
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: [db/version_set.cc:5581] Column family [p-2] (ID 6), log number is 5
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: [db/version_set.cc:5581] Column family [O-0] (ID 7), log number is 5
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: [db/version_set.cc:5581] Column family [O-1] (ID 8), log number is 5
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: [db/version_set.cc:5581] Column family [O-2] (ID 9), log number is 5
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: [db/version_set.cc:5581] Column family [L] (ID 10), log number is 5
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: [db/version_set.cc:5581] Column family [P] (ID 11), log number is 5
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: 3ff74b14-08ee-4d42-a06d-2b04cbb35a5d
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760428360959218, "job": 1, "event": "recovery_started", "wal_files": [31]}
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #31 mode 2
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760428360959653, "job": 1, "event": "recovery_finished"}
Oct 14 03:52:40 localhost ceph-osd[31500]: bluestore(/var/lib/ceph/osd/ceph-0) _open_db opened rocksdb path db options compression=kLZ4Compression,max_write_buffer_number=64,min_write_buffer_number_to_merge=6,compaction_style=kCompactionStyleLevel,write_buffer_size=16777216,max_background_jobs=4,level0_file_num_compaction_trigger=8,max_bytes_for_level_base=1073741824,max_bytes_for_level_multiplier=8,compaction_readahead_size=2MB,max_total_wal_size=1073741824,writable_file_max_buffer_size=0
Oct 14 03:52:40 localhost ceph-osd[31500]: bluestore(/var/lib/ceph/osd/ceph-0) _open_super_meta old nid_max 1025
Oct 14 03:52:40 localhost ceph-osd[31500]: bluestore(/var/lib/ceph/osd/ceph-0) _open_super_meta old blobid_max 10240
Oct 14 03:52:40 localhost ceph-osd[31500]: bluestore(/var/lib/ceph/osd/ceph-0) _open_super_meta ondisk_format 4 compat_ondisk_format 3
Oct 14 03:52:40 localhost ceph-osd[31500]: bluestore(/var/lib/ceph/osd/ceph-0) _open_super_meta min_alloc_size 0x1000
Oct 14 03:52:40 localhost ceph-osd[31500]: freelist init
Oct 14 03:52:40 localhost ceph-osd[31500]: freelist _read_cfg
Oct 14 03:52:40 localhost ceph-osd[31500]: bluestore(/var/lib/ceph/osd/ceph-0) _init_alloc loaded 7.0 GiB in 2 extents, allocator type hybrid, capacity 0x1bfc00000, block size 0x1000, free 0x1bfbfd000, fragmentation 5.5e-07
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: [db/db_impl/db_impl.cc:496] Shutdown: canceling all background work
Oct 14 03:52:40 localhost ceph-osd[31500]: rocksdb: [db/db_impl/db_impl.cc:704] Shutdown complete
Oct 14 03:52:40 localhost ceph-osd[31500]: bluefs umount
Oct 14 03:52:40 localhost ceph-osd[31500]: bdev(0x5613efbcf180 /var/lib/ceph/osd/ceph-0/block) close
Oct 14 03:52:41 localhost podman[31837]:
Oct 14 03:52:41 localhost podman[31837]: 2025-10-14 07:52:41.170811231 +0000 UTC m=+0.062435599 container create 27f7e53ae699b1685ec7d3602ca070c4097db3359b21686401832e7973429053 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-osd-3-activate-test, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.expose-services=, ceph=True, release=553, vendor=Red Hat, Inc., RELEASE=main, GIT_CLEAN=True, GIT_BRANCH=main, io.buildah.version=1.33.12, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.tags=rhceph ceph, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, build-date=2025-09-24T08:57:55, architecture=x86_64, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=Guillaume Abrioux , name=rhceph, io.k8s.description=Red Hat Ceph Storage 7)
Oct 14 03:52:41 localhost ceph-osd[31500]: bdev(0x5613efbcf180 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block
Oct 14 03:52:41 localhost ceph-osd[31500]: bdev(0x5613efbcf180 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument
Oct 14 03:52:41 localhost ceph-osd[31500]: bdev(0x5613efbcf180 /var/lib/ceph/osd/ceph-0/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Oct 14 03:52:41 localhost ceph-osd[31500]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-0/block size 7.0 GiB
Oct 14 03:52:41 localhost ceph-osd[31500]: bluefs mount
Oct 14 03:52:41 localhost ceph-osd[31500]: bluefs _init_alloc shared, id 1, capacity 0x1bfc00000, block size 0x10000
Oct 14 03:52:41 localhost ceph-osd[31500]: bluefs mount shared_bdev_used = 4718592
Oct 14 03:52:41 localhost ceph-osd[31500]: bluestore(/var/lib/ceph/osd/ceph-0) _prepare_db_environment set db_paths to db,7136398540 db.slow,7136398540
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: RocksDB version: 7.9.2
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Git sha 0
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Compile date 2025-09-23 00:00:00
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: DB SUMMARY
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: DB Session ID: K6O7FDU8XXZ2Q8LN71S8
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: CURRENT file: CURRENT
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: IDENTITY file: IDENTITY
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: MANIFEST file: MANIFEST-000032 size: 1007 Bytes
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: SST files in db dir, Total Num: 1, files: 000030.sst
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: SST files in db.slow dir, Total Num: 0, files:
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Write Ahead Log file in db.wal: 000031.log size: 5093 ;
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.error_if_exists: 0
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.create_if_missing: 0
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.paranoid_checks: 1
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.flush_verify_memtable_count: 1
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.track_and_verify_wals_in_manifest: 0
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.verify_sst_unique_id_in_manifest: 1
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.env: 0x5613efe63e30
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.fs: LegacyFileSystem
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.info_log: 0x5613f0b56ea0
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.max_file_opening_threads: 16
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.statistics: (nil)
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.use_fsync: 0
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.max_log_file_size: 0
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.max_manifest_file_size: 1073741824
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.log_file_time_to_roll: 0
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.keep_log_file_num: 1000
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.recycle_log_file_num: 0
Oct 14
03:52:41 localhost ceph-osd[31500]: rocksdb: Options.allow_fallocate: 1 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.allow_mmap_reads: 0 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.allow_mmap_writes: 0 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.use_direct_reads: 0 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.use_direct_io_for_flush_and_compaction: 0 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.create_missing_column_families: 0 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.db_log_dir: Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.wal_dir: db.wal Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.table_cache_numshardbits: 6 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.WAL_ttl_seconds: 0 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.WAL_size_limit_MB: 0 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.max_write_batch_group_size_bytes: 1048576 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.manifest_preallocation_size: 4194304 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.is_fd_close_on_exec: 1 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.advise_random_on_open: 1 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.db_write_buffer_size: 0 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.write_buffer_manager: 0x5613efbb9540 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.access_hint_on_compaction_start: 1 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.random_access_max_buffer_size: 1048576 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.use_adaptive_mutex: 0 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.rate_limiter: (nil) Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.sst_file_manager.rate_bytes_per_sec: 0 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.wal_recovery_mode: 2 Oct 14 
03:52:41 localhost ceph-osd[31500]: rocksdb: Options.enable_thread_tracking: 0 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.enable_pipelined_write: 0 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.unordered_write: 0 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.allow_concurrent_memtable_write: 1 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.enable_write_thread_adaptive_yield: 1 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.write_thread_max_yield_usec: 100 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.write_thread_slow_yield_usec: 3 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.row_cache: None Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.wal_filter: None Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.avoid_flush_during_recovery: 0 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.allow_ingest_behind: 0 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.two_write_queues: 0 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.manual_wal_flush: 0 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.wal_compression: 0 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.atomic_flush: 0 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.avoid_unnecessary_blocking_io: 0 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.persist_stats_to_disk: 0 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.write_dbid_to_manifest: 0 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.log_readahead_size: 0 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.file_checksum_gen_factory: Unknown Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.best_efforts_recovery: 0 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.max_bgerror_resume_count: 2147483647 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.bgerror_resume_retry_interval: 1000000 Oct 14 
03:52:41 localhost ceph-osd[31500]: rocksdb: Options.allow_data_in_errors: 0 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.db_host_id: __hostname__ Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.enforce_single_del_contracts: true Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.max_background_jobs: 4 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.max_background_compactions: -1 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.max_subcompactions: 1 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.avoid_flush_during_shutdown: 0 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.writable_file_max_buffer_size: 0 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.delayed_write_rate : 16777216 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.max_total_wal_size: 1073741824 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.delete_obsolete_files_period_micros: 21600000000 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.stats_dump_period_sec: 600 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.stats_persist_period_sec: 600 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.stats_history_buffer_size: 1048576 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.max_open_files: -1 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.bytes_per_sync: 0 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.wal_bytes_per_sync: 0 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.strict_bytes_per_sync: 0 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compaction_readahead_size: 2097152 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.max_background_flushes: -1 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Compression algorithms supported: Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: #011kZSTD supported: 0 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: 
#011kXpressCompression supported: 0 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: #011kBZip2Compression supported: 0 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: #011kZSTDNotFinalCompression supported: 0 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: #011kLZ4Compression supported: 1 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: #011kZlibCompression supported: 1 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: #011kLZ4HCCompression supported: 1 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: #011kSnappyCompression supported: 1 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Fast CRC32 supported: Supported on x86 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: DMutex implementation: pthread_mutex_t Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: db/MANIFEST-000032 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 0, name: default) Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]: Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.comparator: leveldb.BytewiseComparator Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.merge_operator: .T:int64_array.b:bitwise_xor Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compaction_filter: None Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compaction_filter_factory: None Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.sst_partitioner_factory: None Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.memtable_factory: SkipListFactory Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.table_factory: BlockBasedTable Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5613f0bf0da0)#012 cache_index_and_filter_blocks: 1#012 
cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x5613efba7610#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.write_buffer_size: 16777216 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.max_write_buffer_number: 64 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compression: LZ4 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.bottommost_compression: Disabled Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.prefix_extractor: nullptr Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.num_levels: 7 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.min_write_buffer_number_to_merge: 6 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.bottommost_compression_opts.level: 32767 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.bottommost_compression_opts.strategy: 0 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.bottommost_compression_opts.enabled: false Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compression_opts.window_bits: -14 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compression_opts.level: 32767 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compression_opts.strategy: 0 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compression_opts.max_dict_bytes: 0 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compression_opts.parallel_threads: 1 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compression_opts.enabled: false Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.level0_file_num_compaction_trigger: 8 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: 
Options.level0_slowdown_writes_trigger: 20 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.level0_stop_writes_trigger: 36 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.target_file_size_base: 67108864 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.target_file_size_multiplier: 1 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.max_bytes_for_level_base: 1073741824 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.max_sequential_skip_in_iterations: 8 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.max_compaction_bytes: 1677721600 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.arena_block_size: 1048576 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: 
Options.disable_auto_compactions: 0 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compaction_style: kCompactionStyleLevel Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compaction_options_universal.size_ratio: 1 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.table_properties_collectors: Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.inplace_update_support: 0 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.inplace_update_num_locks: 10000 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.memtable_whole_key_filtering: 0 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.memtable_huge_page_size: 0 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.bloom_locality: 0 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.max_successive_merges: 0 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.optimize_filters_for_hits: 0 Oct 14 03:52:41 localhost ceph-osd[31500]: 
rocksdb: Options.paranoid_file_checks: 0 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.force_consistency_checks: 1 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.report_bg_io_stats: 0 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.ttl: 2592000 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.periodic_compaction_seconds: 0 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.preclude_last_level_data_seconds: 0 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.preserve_internal_time_seconds: 0 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.enable_blob_files: false Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.min_blob_size: 0 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.blob_file_size: 268435456 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.blob_compression_type: NoCompression Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.enable_blob_garbage_collection: false Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.blob_compaction_readahead_size: 0 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.blob_file_starting_level: 0 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 1, name: m-0) Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-0]: Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.comparator: leveldb.BytewiseComparator Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.merge_operator: None Oct 14 03:52:41 localhost 
ceph-osd[31500]: rocksdb: Options.compaction_filter: None Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compaction_filter_factory: None Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.sst_partitioner_factory: None Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.memtable_factory: SkipListFactory Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.table_factory: BlockBasedTable Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5613f0bf0da0)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x5613efba7610#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.write_buffer_size: 16777216 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.max_write_buffer_number: 64 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compression: LZ4 Oct 14 03:52:41 localhost 
ceph-osd[31500]: rocksdb: Options.bottommost_compression: Disabled Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.prefix_extractor: nullptr Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.num_levels: 7 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.min_write_buffer_number_to_merge: 6 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.bottommost_compression_opts.level: 32767 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.bottommost_compression_opts.strategy: 0 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.bottommost_compression_opts.enabled: false Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compression_opts.window_bits: -14 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compression_opts.level: 32767 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compression_opts.strategy: 0 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compression_opts.max_dict_bytes: 0 Oct 14 03:52:41 localhost 
ceph-osd[31500]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compression_opts.parallel_threads: 1 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compression_opts.enabled: false Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.level0_file_num_compaction_trigger: 8 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.level0_slowdown_writes_trigger: 20 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.level0_stop_writes_trigger: 36 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.target_file_size_base: 67108864 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.target_file_size_multiplier: 1 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.max_bytes_for_level_base: 1073741824 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 Oct 14 03:52:41 localhost ceph-osd[31500]: 
rocksdb: Options.max_sequential_skip_in_iterations: 8 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.max_compaction_bytes: 1677721600 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.arena_block_size: 1048576 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.disable_auto_compactions: 0 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compaction_style: kCompactionStyleLevel Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compaction_options_universal.size_ratio: 1 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0); Oct 14 03:52:41 localhost ceph-osd[31500]: 
rocksdb: Options.inplace_update_support: 0 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.inplace_update_num_locks: 10000 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.memtable_whole_key_filtering: 0 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.memtable_huge_page_size: 0 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.bloom_locality: 0 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.max_successive_merges: 0 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.optimize_filters_for_hits: 0 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.paranoid_file_checks: 0 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.force_consistency_checks: 1 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.report_bg_io_stats: 0 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.ttl: 2592000 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.periodic_compaction_seconds: 0 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.preclude_last_level_data_seconds: 0 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.preserve_internal_time_seconds: 0 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.enable_blob_files: false Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.min_blob_size: 0 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.blob_file_size: 268435456 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.blob_compression_type: NoCompression Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.enable_blob_garbage_collection: false Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: 
Options.blob_compaction_readahead_size: 0 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.blob_file_starting_level: 0 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 2, name: m-1) Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-1]: Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.comparator: leveldb.BytewiseComparator Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.merge_operator: None Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compaction_filter: None Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compaction_filter_factory: None Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.sst_partitioner_factory: None Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.memtable_factory: SkipListFactory Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.table_factory: BlockBasedTable Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5613f0bf0da0)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x5613efba7610#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 
1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.write_buffer_size: 16777216 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.max_write_buffer_number: 64 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compression: LZ4 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.bottommost_compression: Disabled Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.prefix_extractor: nullptr Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.num_levels: 7 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.min_write_buffer_number_to_merge: 6 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.bottommost_compression_opts.level: 32767 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.bottommost_compression_opts.strategy: 0 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Oct 14 03:52:41 localhost 
ceph-osd[31500]: rocksdb: Options.bottommost_compression_opts.enabled: false Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compression_opts.window_bits: -14 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compression_opts.level: 32767 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compression_opts.strategy: 0 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compression_opts.max_dict_bytes: 0 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compression_opts.parallel_threads: 1 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compression_opts.enabled: false Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.level0_file_num_compaction_trigger: 8 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.level0_slowdown_writes_trigger: 20 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.level0_stop_writes_trigger: 36 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.target_file_size_base: 67108864 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.target_file_size_multiplier: 1 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.max_bytes_for_level_base: 1073741824 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: 
Options.max_bytes_for_level_multiplier_addtl[0]: 1 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.max_sequential_skip_in_iterations: 8 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.max_compaction_bytes: 1677721600 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.arena_block_size: 1048576 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.disable_auto_compactions: 0 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compaction_style: kCompactionStyleLevel Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compaction_options_universal.size_ratio: 1 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: 
Options.compaction_options_universal.max_size_amplification_percent: 200 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0); Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.inplace_update_support: 0 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.inplace_update_num_locks: 10000 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.memtable_whole_key_filtering: 0 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.memtable_huge_page_size: 0 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.bloom_locality: 0 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.max_successive_merges: 0 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.optimize_filters_for_hits: 0 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.paranoid_file_checks: 0 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.force_consistency_checks: 1 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.report_bg_io_stats: 0 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.ttl: 2592000 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.periodic_compaction_seconds: 0 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.preclude_last_level_data_seconds: 0 Oct 14 03:52:41 localhost 
ceph-osd[31500]: rocksdb: Options.preserve_internal_time_seconds: 0 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.enable_blob_files: false Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.min_blob_size: 0 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.blob_file_size: 268435456 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.blob_compression_type: NoCompression Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.enable_blob_garbage_collection: false Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.blob_compaction_readahead_size: 0 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.blob_file_starting_level: 0 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 3, name: m-2) Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-2]: Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.comparator: leveldb.BytewiseComparator Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.merge_operator: None Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compaction_filter: None Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compaction_filter_factory: None Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.sst_partitioner_factory: None Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.memtable_factory: SkipListFactory Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.table_factory: BlockBasedTable Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: table_factory options: 
flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5613f0bf0da0)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x5613efba7610#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.write_buffer_size: 16777216 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.max_write_buffer_number: 64 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compression: LZ4 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.bottommost_compression: Disabled Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.prefix_extractor: nullptr Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.num_levels: 7 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.min_write_buffer_number_to_merge: 6 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: 
Options.max_write_buffer_number_to_maintain: 0 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.bottommost_compression_opts.level: 32767 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.bottommost_compression_opts.strategy: 0 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.bottommost_compression_opts.enabled: false Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compression_opts.window_bits: -14 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compression_opts.level: 32767 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compression_opts.strategy: 0 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compression_opts.max_dict_bytes: 0 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compression_opts.parallel_threads: 1 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compression_opts.enabled: false Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Oct 14 03:52:41 localhost 
ceph-osd[31500]: rocksdb: Options.level0_file_num_compaction_trigger: 8 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.level0_slowdown_writes_trigger: 20 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.level0_stop_writes_trigger: 36 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.target_file_size_base: 67108864 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.target_file_size_multiplier: 1 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.max_bytes_for_level_base: 1073741824 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.max_sequential_skip_in_iterations: 8 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.max_compaction_bytes: 1677721600 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.arena_block_size: 1048576 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 Oct 14 03:52:41 localhost ceph-osd[31500]: 
rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.disable_auto_compactions: 0 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compaction_style: kCompactionStyleLevel Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compaction_options_universal.size_ratio: 1 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0); Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.inplace_update_support: 0 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.inplace_update_num_locks: 10000 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.memtable_whole_key_filtering: 0 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.memtable_huge_page_size: 0 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: 
Options.bloom_locality: 0 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.max_successive_merges: 0 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.optimize_filters_for_hits: 0 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.paranoid_file_checks: 0 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.force_consistency_checks: 1 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.report_bg_io_stats: 0 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.ttl: 2592000 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.periodic_compaction_seconds: 0 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.preclude_last_level_data_seconds: 0 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.preserve_internal_time_seconds: 0 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.enable_blob_files: false Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.min_blob_size: 0 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.blob_file_size: 268435456 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.blob_compression_type: NoCompression Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.enable_blob_garbage_collection: false Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.blob_compaction_readahead_size: 0 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.blob_file_starting_level: 0 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 4, name: p-0) Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: [db/column_family.cc:630] --------------- Options for 
column family [p-0]: Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.comparator: leveldb.BytewiseComparator Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.merge_operator: None Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compaction_filter: None Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compaction_filter_factory: None Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.sst_partitioner_factory: None Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.memtable_factory: SkipListFactory Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.table_factory: BlockBasedTable Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5613f0bf0da0)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x5613efba7610#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: 
Options.write_buffer_size: 16777216 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.max_write_buffer_number: 64 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compression: LZ4 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.bottommost_compression: Disabled Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.prefix_extractor: nullptr Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.num_levels: 7 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.min_write_buffer_number_to_merge: 6 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.bottommost_compression_opts.level: 32767 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.bottommost_compression_opts.strategy: 0 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.bottommost_compression_opts.enabled: false Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compression_opts.window_bits: -14 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compression_opts.level: 
32767 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compression_opts.strategy: 0 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compression_opts.max_dict_bytes: 0 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compression_opts.parallel_threads: 1 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compression_opts.enabled: false Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.level0_file_num_compaction_trigger: 8 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.level0_slowdown_writes_trigger: 20 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.level0_stop_writes_trigger: 36 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.target_file_size_base: 67108864 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.target_file_size_multiplier: 1 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.max_bytes_for_level_base: 1073741824 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 Oct 14 03:52:41 localhost 
ceph-osd[31500]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.max_sequential_skip_in_iterations: 8 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.max_compaction_bytes: 1677721600 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.arena_block_size: 1048576 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.disable_auto_compactions: 0 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compaction_style: kCompactionStyleLevel Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compaction_options_universal.size_ratio: 1 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Oct 14 
03:52:41 localhost ceph-osd[31500]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0); Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.inplace_update_support: 0 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.inplace_update_num_locks: 10000 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.memtable_whole_key_filtering: 0 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.memtable_huge_page_size: 0 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.bloom_locality: 0 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.max_successive_merges: 0 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.optimize_filters_for_hits: 0 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.paranoid_file_checks: 0 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.force_consistency_checks: 1 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.report_bg_io_stats: 0 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.ttl: 2592000 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.periodic_compaction_seconds: 0 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.preclude_last_level_data_seconds: 0 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.preserve_internal_time_seconds: 0 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.enable_blob_files: false Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.min_blob_size: 0 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.blob_file_size: 268435456 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.blob_compression_type: NoCompression Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.enable_blob_garbage_collection: false Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: 
Options.blob_garbage_collection_age_cutoff: 0.250000
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.blob_compaction_readahead_size: 0
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.blob_file_starting_level: 0
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 5, name: p-1)
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-1]:
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.comparator: leveldb.BytewiseComparator
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.merge_operator: None
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compaction_filter: None
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compaction_filter_factory: None
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.sst_partitioner_factory: None
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.memtable_factory: SkipListFactory
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.table_factory: BlockBasedTable
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5613f0bf0da0)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x5613efba7610#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.write_buffer_size: 16777216
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.max_write_buffer_number: 64
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compression: LZ4
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.bottommost_compression: Disabled
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.prefix_extractor: nullptr
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.num_levels: 7
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.min_write_buffer_number_to_merge: 6
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.max_write_buffer_number_to_maintain: 0
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.max_write_buffer_size_to_maintain: 0
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.bottommost_compression_opts.window_bits: -14
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.bottommost_compression_opts.level: 32767
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.bottommost_compression_opts.strategy: 0
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.bottommost_compression_opts.enabled: false
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compression_opts.window_bits: -14
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compression_opts.level: 32767
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compression_opts.strategy: 0
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compression_opts.max_dict_bytes: 0
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compression_opts.parallel_threads: 1
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compression_opts.enabled: false
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.level0_file_num_compaction_trigger: 8
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.level0_slowdown_writes_trigger: 20
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.level0_stop_writes_trigger: 36
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.target_file_size_base: 67108864
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.target_file_size_multiplier: 1
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.max_bytes_for_level_base: 1073741824
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.max_sequential_skip_in_iterations: 8
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.max_compaction_bytes: 1677721600
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.arena_block_size: 1048576
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.disable_auto_compactions: 0
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compaction_style: kCompactionStyleLevel
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compaction_pri: kMinOverlappingRatio
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.inplace_update_support: 0
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.inplace_update_num_locks: 10000
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.memtable_whole_key_filtering: 0
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.memtable_huge_page_size: 0
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.bloom_locality: 0
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.max_successive_merges: 0
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.optimize_filters_for_hits: 0
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.paranoid_file_checks: 0
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.force_consistency_checks: 1
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.report_bg_io_stats: 0
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.ttl: 2592000
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.periodic_compaction_seconds: 0
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.preclude_last_level_data_seconds: 0
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.preserve_internal_time_seconds: 0
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.enable_blob_files: false
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.min_blob_size: 0
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.blob_file_size: 268435456
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.blob_compression_type: NoCompression
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.enable_blob_garbage_collection: false
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.blob_compaction_readahead_size: 0
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.blob_file_starting_level: 0
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 6, name: p-2)
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-2]:
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.comparator: leveldb.BytewiseComparator
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.merge_operator: None
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compaction_filter: None
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compaction_filter_factory: None
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.sst_partitioner_factory: None
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.memtable_factory: SkipListFactory
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.table_factory: BlockBasedTable
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5613f0bf0da0)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x5613efba7610#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.write_buffer_size: 16777216
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.max_write_buffer_number: 64
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compression: LZ4
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.bottommost_compression: Disabled
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.prefix_extractor: nullptr
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.num_levels: 7
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.min_write_buffer_number_to_merge: 6
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.max_write_buffer_number_to_maintain: 0
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.max_write_buffer_size_to_maintain: 0
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.bottommost_compression_opts.window_bits: -14
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.bottommost_compression_opts.level: 32767
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.bottommost_compression_opts.strategy: 0
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.bottommost_compression_opts.enabled: false
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compression_opts.window_bits: -14
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compression_opts.level: 32767
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compression_opts.strategy: 0
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compression_opts.max_dict_bytes: 0
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compression_opts.parallel_threads: 1
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compression_opts.enabled: false
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.level0_file_num_compaction_trigger: 8
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.level0_slowdown_writes_trigger: 20
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.level0_stop_writes_trigger: 36
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.target_file_size_base: 67108864
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.target_file_size_multiplier: 1
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.max_bytes_for_level_base: 1073741824
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.max_sequential_skip_in_iterations: 8
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.max_compaction_bytes: 1677721600
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.arena_block_size: 1048576
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.disable_auto_compactions: 0
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compaction_style: kCompactionStyleLevel
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compaction_pri: kMinOverlappingRatio
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.inplace_update_support: 0
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.inplace_update_num_locks: 10000
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.memtable_whole_key_filtering: 0
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.memtable_huge_page_size: 0
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.bloom_locality: 0
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.max_successive_merges: 0
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.optimize_filters_for_hits: 0
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.paranoid_file_checks: 0
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.force_consistency_checks: 1
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.report_bg_io_stats: 0
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.ttl: 2592000
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.periodic_compaction_seconds: 0
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.preclude_last_level_data_seconds: 0
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.preserve_internal_time_seconds: 0
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.enable_blob_files: false
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.min_blob_size: 0
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.blob_file_size: 268435456
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.blob_compression_type: NoCompression
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.enable_blob_garbage_collection: false
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.blob_compaction_readahead_size: 0
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.blob_file_starting_level: 0
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 7, name: O-0)
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-0]:
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.comparator: leveldb.BytewiseComparator
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.merge_operator: None
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compaction_filter: None
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compaction_filter_factory: None
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.sst_partitioner_factory: None
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.memtable_factory: SkipListFactory
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.table_factory: BlockBasedTable
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5613f0bf1000)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x5613efba62d0#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 536870912#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.write_buffer_size: 16777216
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.max_write_buffer_number: 64
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compression: LZ4
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.bottommost_compression: Disabled
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.prefix_extractor: nullptr
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.num_levels: 7
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.min_write_buffer_number_to_merge: 6
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.max_write_buffer_number_to_maintain: 0
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.max_write_buffer_size_to_maintain: 0
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.bottommost_compression_opts.window_bits: -14
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.bottommost_compression_opts.level: 32767
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.bottommost_compression_opts.strategy: 0
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.bottommost_compression_opts.enabled: false
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compression_opts.window_bits: -14
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compression_opts.level: 32767
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compression_opts.strategy: 0
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compression_opts.max_dict_bytes: 0
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compression_opts.parallel_threads: 1
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compression_opts.enabled: false
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.level0_file_num_compaction_trigger: 8
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.level0_slowdown_writes_trigger: 20
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.level0_stop_writes_trigger: 36
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.target_file_size_base: 67108864
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.target_file_size_multiplier: 1
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.max_bytes_for_level_base: 1073741824
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.max_sequential_skip_in_iterations: 8
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.max_compaction_bytes: 1677721600
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.arena_block_size: 1048576
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.disable_auto_compactions: 0
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compaction_style: kCompactionStyleLevel
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compaction_pri: kMinOverlappingRatio
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.inplace_update_support: 0
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.inplace_update_num_locks: 10000
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.memtable_whole_key_filtering: 0
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.memtable_huge_page_size: 0
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.bloom_locality: 0
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.max_successive_merges: 0
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.optimize_filters_for_hits: 0
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.paranoid_file_checks: 0
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.force_consistency_checks: 1
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.report_bg_io_stats: 0
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.ttl: 2592000
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.periodic_compaction_seconds: 0
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.preclude_last_level_data_seconds: 0
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.preserve_internal_time_seconds: 0
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.enable_blob_files: false
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.min_blob_size: 0
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.blob_file_size: 268435456
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.blob_compression_type: NoCompression
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.enable_blob_garbage_collection: false
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.blob_compaction_readahead_size: 0
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.blob_file_starting_level: 0
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 8, name: O-1)
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-1]:
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.comparator: leveldb.BytewiseComparator
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.merge_operator: None
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compaction_filter: None
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compaction_filter_factory: None
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.sst_partitioner_factory: None
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.memtable_factory: SkipListFactory
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.table_factory: BlockBasedTable
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5613f0bf1000)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x5613efba62d0#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 536870912#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.write_buffer_size: 16777216
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.max_write_buffer_number: 64
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compression: LZ4
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.bottommost_compression: Disabled
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.prefix_extractor: nullptr
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.num_levels: 7
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.min_write_buffer_number_to_merge: 6
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.max_write_buffer_number_to_maintain: 0
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.max_write_buffer_size_to_maintain: 0
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.bottommost_compression_opts.window_bits: -14
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.bottommost_compression_opts.level: 32767
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.bottommost_compression_opts.strategy: 0
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.bottommost_compression_opts.enabled: false
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compression_opts.window_bits: -14
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compression_opts.level: 32767
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compression_opts.strategy: 0
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compression_opts.max_dict_bytes: 0
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compression_opts.parallel_threads: 1
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compression_opts.enabled: false
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.level0_file_num_compaction_trigger: 8
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.level0_slowdown_writes_trigger: 20
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.level0_stop_writes_trigger: 36
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.target_file_size_base: 67108864
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.target_file_size_multiplier: 1
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.max_bytes_for_level_base: 1073741824
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.max_sequential_skip_in_iterations: 8
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.max_compaction_bytes: 1677721600
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.arena_block_size: 1048576
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.disable_auto_compactions: 0
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compaction_style: kCompactionStyleLevel
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compaction_pri: kMinOverlappingRatio
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.inplace_update_support: 0
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.inplace_update_num_locks: 10000
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.memtable_whole_key_filtering: 0
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.memtable_huge_page_size: 0
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.bloom_locality: 0
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.max_successive_merges: 0
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.optimize_filters_for_hits: 0
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.paranoid_file_checks: 0
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.force_consistency_checks: 1
Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb:
Options.report_bg_io_stats: 0 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.ttl: 2592000 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.periodic_compaction_seconds: 0 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.preclude_last_level_data_seconds: 0 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.preserve_internal_time_seconds: 0 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.enable_blob_files: false Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.min_blob_size: 0 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.blob_file_size: 268435456 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.blob_compression_type: NoCompression Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.enable_blob_garbage_collection: false Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.blob_compaction_readahead_size: 0 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.blob_file_starting_level: 0 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 9, name: O-2) Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-2]: Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.comparator: leveldb.BytewiseComparator Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.merge_operator: None Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compaction_filter: None Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compaction_filter_factory: None Oct 14 03:52:41 localhost ceph-osd[31500]: 
rocksdb: Options.sst_partitioner_factory: None Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.memtable_factory: SkipListFactory Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.table_factory: BlockBasedTable Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5613f0bf1000)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x5613efba62d0#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 536870912#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.write_buffer_size: 16777216 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.max_write_buffer_number: 64 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compression: LZ4 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.bottommost_compression: Disabled Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.prefix_extractor: nullptr Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: 
Options.memtable_insert_with_hint_prefix_extractor: nullptr Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.num_levels: 7 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.min_write_buffer_number_to_merge: 6 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.bottommost_compression_opts.level: 32767 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.bottommost_compression_opts.strategy: 0 Oct 14 03:52:41 localhost systemd[1]: Started libpod-conmon-27f7e53ae699b1685ec7d3602ca070c4097db3359b21686401832e7973429053.scope. Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.bottommost_compression_opts.enabled: false Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compression_opts.window_bits: -14 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compression_opts.level: 32767 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compression_opts.strategy: 0 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compression_opts.max_dict_bytes: 0 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Oct 14 
03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compression_opts.parallel_threads: 1 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compression_opts.enabled: false Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.level0_file_num_compaction_trigger: 8 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.level0_slowdown_writes_trigger: 20 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.level0_stop_writes_trigger: 36 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.target_file_size_base: 67108864 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.target_file_size_multiplier: 1 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.max_bytes_for_level_base: 1073741824 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.max_sequential_skip_in_iterations: 8 Oct 14 03:52:41 localhost 
ceph-osd[31500]: rocksdb: Options.max_compaction_bytes: 1677721600 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.arena_block_size: 1048576 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.disable_auto_compactions: 0 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compaction_style: kCompactionStyleLevel Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compaction_options_universal.size_ratio: 1 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0); Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.inplace_update_support: 0 Oct 14 03:52:41 localhost 
ceph-osd[31500]: rocksdb: Options.inplace_update_num_locks: 10000 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.memtable_whole_key_filtering: 0 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.memtable_huge_page_size: 0 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.bloom_locality: 0 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.max_successive_merges: 0 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.optimize_filters_for_hits: 0 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.paranoid_file_checks: 0 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.force_consistency_checks: 1 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.report_bg_io_stats: 0 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.ttl: 2592000 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.periodic_compaction_seconds: 0 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.preclude_last_level_data_seconds: 0 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.preserve_internal_time_seconds: 0 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.enable_blob_files: false Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.min_blob_size: 0 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.blob_file_size: 268435456 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.blob_compression_type: NoCompression Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.enable_blob_garbage_collection: false Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.blob_compaction_readahead_size: 0 Oct 14 03:52:41 localhost 
ceph-osd[31500]: rocksdb: Options.blob_file_starting_level: 0 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 10, name: L) Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: [db/column_family.cc:635] #011(skipping printing options) Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 11, name: P) Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: [db/column_family.cc:635] #011(skipping printing options) Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:db/MANIFEST-000032 succeeded,manifest_file_number is 32, next_file_number is 34, last_sequence is 12, log_number is 5,prev_log_number is 0,max_column_family is 11,min_log_number_to_keep is 5 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 5 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: [db/version_set.cc:5581] Column family [m-0] (ID 1), log number is 5 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: [db/version_set.cc:5581] Column family [m-1] (ID 2), log number is 5 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: [db/version_set.cc:5581] Column family [m-2] (ID 3), log number is 5 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: [db/version_set.cc:5581] Column family [p-0] (ID 4), log number is 5 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: [db/version_set.cc:5581] Column family [p-1] (ID 5), log number is 5 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: [db/version_set.cc:5581] Column family [p-2] (ID 6), log number is 5 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: [db/version_set.cc:5581] Column family [O-0] (ID 7), log number is 5 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: 
[db/version_set.cc:5581] Column family [O-1] (ID 8), log number is 5 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: [db/version_set.cc:5581] Column family [O-2] (ID 9), log number is 5 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: [db/version_set.cc:5581] Column family [L] (ID 10), log number is 5 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: [db/version_set.cc:5581] Column family [P] (ID 11), log number is 5 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: 3ff74b14-08ee-4d42-a06d-2b04cbb35a5d Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760428361216182, "job": 1, "event": "recovery_started", "wal_files": [31]} Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #31 mode 2 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760428361222960, "cf_name": "default", "job": 1, "event": "table_file_creation", "file_number": 35, "file_size": 1261, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 13, "largest_seqno": 21, "table_properties": {"data_size": 128, "index_size": 27, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 87, "raw_average_key_size": 17, "raw_value_size": 82, "raw_average_value_size": 16, "num_data_blocks": 1, "num_entries": 5, "num_filter_entries": 5, "num_deletions": 0, "num_merge_operands": 2, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": ".T:int64_array.b:bitwise_xor", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; 
max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1760428361, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "3ff74b14-08ee-4d42-a06d-2b04cbb35a5d", "db_session_id": "K6O7FDU8XXZ2Q8LN71S8", "orig_file_number": 35, "seqno_to_time_mapping": "N/A"}} Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760428361226509, "cf_name": "p-0", "job": 1, "event": "table_file_creation", "file_number": 36, "file_size": 1609, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 14, "largest_seqno": 15, "table_properties": {"data_size": 468, "index_size": 39, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 72, "raw_average_key_size": 36, "raw_value_size": 567, "raw_average_value_size": 283, "num_data_blocks": 1, "num_entries": 2, "num_filter_entries": 2, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "p-0", "column_family_id": 4, "comparator": "leveldb.BytewiseComparator", "merge_operator": "nullptr", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1760428361, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "3ff74b14-08ee-4d42-a06d-2b04cbb35a5d", "db_session_id": "K6O7FDU8XXZ2Q8LN71S8", "orig_file_number": 36, "seqno_to_time_mapping": "N/A"}} Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760428361230235, "cf_name": "O-2", 
"job": 1, "event": "table_file_creation", "file_number": 37, "file_size": 1290, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 16, "largest_seqno": 16, "table_properties": {"data_size": 121, "index_size": 64, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 55, "raw_average_key_size": 55, "raw_value_size": 50, "raw_average_value_size": 50, "num_data_blocks": 1, "num_entries": 1, "num_filter_entries": 1, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "O-2", "column_family_id": 9, "comparator": "leveldb.BytewiseComparator", "merge_operator": "nullptr", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1760428361, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "3ff74b14-08ee-4d42-a06d-2b04cbb35a5d", "db_session_id": "K6O7FDU8XXZ2Q8LN71S8", "orig_file_number": 37, "seqno_to_time_mapping": "N/A"}} Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: [db/db_impl/db_impl_open.cc:1432] Failed to truncate log #31: IO error: No such file or directory: While open a file for appending: db.wal/000031.log: No such file or directory Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760428361235159, "job": 1, "event": "recovery_finished"} Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: [db/version_set.cc:5047] Creating manifest 40 Oct 14 03:52:41 localhost podman[31837]: 2025-10-14 07:52:41.147417426 +0000 UTC m=+0.039041784 image pull 
registry.redhat.io/rhceph/rhceph-7-rhel9:latest Oct 14 03:52:41 localhost systemd[1]: Started libcrun container. Oct 14 03:52:41 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6826aa28b5c0da8421eb423e8c1f9137e3ca856c6925711527298ef1fedabf8e/merged/rootfs supports timestamps until 2038 (0x7fffffff) Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: [db/db_impl/db_impl_open.cc:1987] SstFileManager instance 0x5613efbcf500 Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: DB pointer 0x5613f0aada00 Oct 14 03:52:41 localhost ceph-osd[31500]: bluestore(/var/lib/ceph/osd/ceph-0) _open_db opened rocksdb path db options compression=kLZ4Compression,max_write_buffer_number=64,min_write_buffer_number_to_merge=6,compaction_style=kCompactionStyleLevel,write_buffer_size=16777216,max_background_jobs=4,level0_file_num_compaction_trigger=8,max_bytes_for_level_base=1073741824,max_bytes_for_level_multiplier=8,compaction_readahead_size=2MB,max_total_wal_size=1073741824,writable_file_max_buffer_size=0 Oct 14 03:52:41 localhost ceph-osd[31500]: bluestore(/var/lib/ceph/osd/ceph-0) _upgrade_super from 4, latest 4 Oct 14 03:52:41 localhost ceph-osd[31500]: bluestore(/var/lib/ceph/osd/ceph-0) _upgrade_super done Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Oct 14 03:52:41 localhost ceph-osd[31500]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 0.1 total, 0.1 interval#012Cumulative writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 GB, 0.00 MB/s#012Cumulative WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s#012Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 
percent#012#012** Compaction Stats [default] **#012Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 L0 2/0 2.61 KB 0.2 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.2 0.01 0.00 1 0.007 0 0 0.0 0.0#012 Sum 2/0 2.61 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.2 0.01 0.00 1 0.007 0 0 0.0 0.0#012 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.2 0.01 0.00 1 0.007 0 0 0.0 0.0#012#012** Compaction Stats [default] **#012Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012User 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.2 0.01 0.00 1 0.007 0 0 0.0 0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 0.1 total, 0.1 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.02 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.02 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 
memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x5613efba7610#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 4.2e-05 secs_since: 0#012Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **#012#012** Compaction Stats [m-0] **#012Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012#012** Compaction Stats [m-0] **#012Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 0.1 total, 0.1 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 
GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x5613efba7610#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 4.2e-05 secs_since: 0#012Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [m-0] **#012#012** Compaction Stats [m-1] **#012Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012#012** Compaction Stats [m-1] **#012Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 0.1 total, 0.1 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 
0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x5613efba7610#2 capacity: 460.80 MB usag Oct 14 03:52:41 localhost ceph-osd[31500]: /builddir/build/BUILD/ceph-18.2.1/src/cls/cephfs/cls_cephfs.cc:201: loading cephfs Oct 14 03:52:41 localhost ceph-osd[31500]: /builddir/build/BUILD/ceph-18.2.1/src/cls/hello/cls_hello.cc:316: loading cls_hello Oct 14 03:52:41 localhost ceph-osd[31500]: _get_class not permitted to load lua Oct 14 03:52:41 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6826aa28b5c0da8421eb423e8c1f9137e3ca856c6925711527298ef1fedabf8e/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff) Oct 14 03:52:41 localhost ceph-osd[31500]: _get_class not permitted to load sdk Oct 14 03:52:41 localhost ceph-osd[31500]: _get_class not permitted to load test_remote_reads Oct 14 03:52:41 localhost ceph-osd[31500]: osd.0 0 crush map has features 288232575208783872, adjusting msgr requires for clients Oct 14 03:52:41 localhost ceph-osd[31500]: osd.0 0 crush map has features 288232575208783872 was 8705, adjusting msgr requires for mons Oct 14 03:52:41 localhost ceph-osd[31500]: osd.0 0 crush map has features 288232575208783872, adjusting msgr requires for osds Oct 14 03:52:41 localhost ceph-osd[31500]: osd.0 0 check_osdmap_features enabling on-disk ERASURE CODES compat feature Oct 14 03:52:41 localhost ceph-osd[31500]: osd.0 0 load_pgs Oct 14 03:52:41 localhost ceph-osd[31500]: osd.0 0 
load_pgs opened 0 pgs Oct 14 03:52:41 localhost ceph-osd[31500]: osd.0 0 log_to_monitors true Oct 14 03:52:41 localhost ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-osd-0[31496]: 2025-10-14T07:52:41.281+0000 7f87fbba8a80 -1 osd.0 0 log_to_monitors true Oct 14 03:52:41 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6826aa28b5c0da8421eb423e8c1f9137e3ca856c6925711527298ef1fedabf8e/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff) Oct 14 03:52:41 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6826aa28b5c0da8421eb423e8c1f9137e3ca856c6925711527298ef1fedabf8e/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff) Oct 14 03:52:41 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6826aa28b5c0da8421eb423e8c1f9137e3ca856c6925711527298ef1fedabf8e/merged/var/lib/ceph/osd/ceph-3 supports timestamps until 2038 (0x7fffffff) Oct 14 03:52:41 localhost podman[31837]: 2025-10-14 07:52:41.320197965 +0000 UTC m=+0.211822333 container init 27f7e53ae699b1685ec7d3602ca070c4097db3359b21686401832e7973429053 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-osd-3-activate-test, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/agreements, release=553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, com.redhat.component=rhceph-container, GIT_BRANCH=main, io.buildah.version=1.33.12, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, RELEASE=main, vendor=Red Hat, Inc., build-date=2025-09-24T08:57:55, architecture=x86_64, name=rhceph, 
org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, ceph=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, distribution-scope=public) Oct 14 03:52:41 localhost podman[31837]: 2025-10-14 07:52:41.332202101 +0000 UTC m=+0.223826449 container start 27f7e53ae699b1685ec7d3602ca070c4097db3359b21686401832e7973429053 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-osd-3-activate-test, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, release=553, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, RELEASE=main, io.buildah.version=1.33.12, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., io.openshift.expose-services=, vcs-type=git, ceph=True, CEPH_POINT_RELEASE=, architecture=x86_64, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, build-date=2025-09-24T08:57:55) Oct 14 03:52:41 localhost podman[31837]: 2025-10-14 07:52:41.332407711 +0000 UTC m=+0.224032059 container attach 27f7e53ae699b1685ec7d3602ca070c4097db3359b21686401832e7973429053 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-osd-3-activate-test, name=rhceph, GIT_BRANCH=main, 
vcs-type=git, architecture=x86_64, io.buildah.version=1.33.12, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, com.redhat.component=rhceph-container, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, build-date=2025-09-24T08:57:55, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, version=7, io.openshift.tags=rhceph ceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux , GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, release=553) Oct 14 03:52:41 localhost systemd[1]: tmp-crun.246aQm.mount: Deactivated successfully. Oct 14 03:52:41 localhost systemd[1]: var-lib-containers-storage-overlay-7dd6466645146f61130fc3e3c4d002a32e03b13ca812269d3b5394eff62ad71d-merged.mount: Deactivated successfully. Oct 14 03:52:41 localhost ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-osd-3-activate-test[32033]: usage: ceph-volume activate [-h] [--osd-id OSD_ID] [--osd-uuid OSD_UUID] Oct 14 03:52:41 localhost ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-osd-3-activate-test[32033]: [--no-systemd] [--no-tmpfs] Oct 14 03:52:41 localhost ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-osd-3-activate-test[32033]: ceph-volume activate: error: unrecognized arguments: --bad-option Oct 14 03:52:41 localhost systemd[1]: libpod-27f7e53ae699b1685ec7d3602ca070c4097db3359b21686401832e7973429053.scope: Deactivated successfully. 
Oct 14 03:52:41 localhost podman[31837]: 2025-10-14 07:52:41.578756752 +0000 UTC m=+0.470381140 container died 27f7e53ae699b1685ec7d3602ca070c4097db3359b21686401832e7973429053 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-osd-3-activate-test, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., io.buildah.version=1.33.12, GIT_BRANCH=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-09-24T08:57:55, version=7, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.license_terms=https://www.redhat.com/agreements, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, release=553, name=rhceph, distribution-scope=public, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux , io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, architecture=x86_64, RELEASE=main, vcs-type=git) Oct 14 03:52:41 localhost systemd[1]: tmp-crun.tMhTEC.mount: Deactivated successfully. Oct 14 03:52:41 localhost systemd[1]: var-lib-containers-storage-overlay-6826aa28b5c0da8421eb423e8c1f9137e3ca856c6925711527298ef1fedabf8e-merged.mount: Deactivated successfully. 
Oct 14 03:52:41 localhost podman[32072]: 2025-10-14 07:52:41.686905479 +0000 UTC m=+0.090933005 container remove 27f7e53ae699b1685ec7d3602ca070c4097db3359b21686401832e7973429053 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-osd-3-activate-test, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.component=rhceph-container, distribution-scope=public, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, architecture=x86_64, vendor=Red Hat, Inc., release=553, vcs-type=git, ceph=True, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.33.12, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=Guillaume Abrioux , GIT_BRANCH=main, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, version=7, build-date=2025-09-24T08:57:55, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, RELEASE=main, CEPH_POINT_RELEASE=) Oct 14 03:52:41 localhost systemd[1]: libpod-conmon-27f7e53ae699b1685ec7d3602ca070c4097db3359b21686401832e7973429053.scope: Deactivated successfully. Oct 14 03:52:41 localhost systemd[1]: Reloading. Oct 14 03:52:42 localhost systemd-sysv-generator[32127]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Oct 14 03:52:42 localhost systemd-rc-local-generator[32120]: /etc/rc.d/rc.local is not marked executable, skipping. 
Oct 14 03:52:42 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Oct 14 03:52:42 localhost systemd[1]: Reloading. Oct 14 03:52:42 localhost ceph-osd[31500]: log_channel(cluster) log [DBG] : purged_snaps scrub starts Oct 14 03:52:42 localhost ceph-osd[31500]: log_channel(cluster) log [DBG] : purged_snaps scrub ok Oct 14 03:52:42 localhost systemd-sysv-generator[32171]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Oct 14 03:52:42 localhost systemd-rc-local-generator[32167]: /etc/rc.d/rc.local is not marked executable, skipping. Oct 14 03:52:42 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Oct 14 03:52:42 localhost systemd[1]: Starting Ceph osd.3 for fcadf6e2-9176-5818-a8d0-37b19acf8eaf... 
Oct 14 03:52:42 localhost podman[32231]: Oct 14 03:52:42 localhost ceph-osd[31500]: osd.0 0 done with init, starting boot process Oct 14 03:52:42 localhost ceph-osd[31500]: osd.0 0 start_boot Oct 14 03:52:42 localhost ceph-osd[31500]: osd.0 0 maybe_override_options_for_qos osd_max_backfills set to 1 Oct 14 03:52:42 localhost ceph-osd[31500]: osd.0 0 maybe_override_options_for_qos osd_recovery_max_active set to 0 Oct 14 03:52:42 localhost ceph-osd[31500]: osd.0 0 maybe_override_options_for_qos osd_recovery_max_active_hdd set to 3 Oct 14 03:52:42 localhost ceph-osd[31500]: osd.0 0 maybe_override_options_for_qos osd_recovery_max_active_ssd set to 10 Oct 14 03:52:42 localhost ceph-osd[31500]: osd.0 0 bench count 12288000 bsize 4 KiB Oct 14 03:52:42 localhost podman[32231]: 2025-10-14 07:52:42.937958818 +0000 UTC m=+0.085738859 container create 3249896d4368fe76040ec8ce70cc6af42b99111f4e95da4406693dd40611a6aa (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-osd-3-activate, RELEASE=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, build-date=2025-09-24T08:57:55, release=553, vcs-type=git, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, architecture=x86_64, GIT_BRANCH=main, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux , GIT_CLEAN=True, distribution-scope=public, io.openshift.tags=rhceph ceph, version=7, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., ceph=True, 
GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.33.12) Oct 14 03:52:42 localhost systemd[1]: tmp-crun.jn7fj9.mount: Deactivated successfully. Oct 14 03:52:42 localhost podman[32231]: 2025-10-14 07:52:42.896938332 +0000 UTC m=+0.044718303 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Oct 14 03:52:43 localhost systemd[1]: Started libcrun container. Oct 14 03:52:43 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d060e405949df596d076d1afef86cf549dbeba83de9027b341a6132916363428/merged/rootfs supports timestamps until 2038 (0x7fffffff) Oct 14 03:52:43 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d060e405949df596d076d1afef86cf549dbeba83de9027b341a6132916363428/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff) Oct 14 03:52:43 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d060e405949df596d076d1afef86cf549dbeba83de9027b341a6132916363428/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff) Oct 14 03:52:43 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d060e405949df596d076d1afef86cf549dbeba83de9027b341a6132916363428/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff) Oct 14 03:52:43 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d060e405949df596d076d1afef86cf549dbeba83de9027b341a6132916363428/merged/var/lib/ceph/osd/ceph-3 supports timestamps until 2038 (0x7fffffff) Oct 14 03:52:43 localhost podman[32231]: 2025-10-14 07:52:43.087846305 +0000 UTC m=+0.235626236 container init 3249896d4368fe76040ec8ce70cc6af42b99111f4e95da4406693dd40611a6aa (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-osd-3-activate, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., 
io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, distribution-scope=public, ceph=True, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux , io.buildah.version=1.33.12, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.component=rhceph-container, build-date=2025-09-24T08:57:55, release=553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vcs-type=git, name=rhceph, io.openshift.tags=rhceph ceph, architecture=x86_64, GIT_BRANCH=main) Oct 14 03:52:43 localhost podman[32231]: 2025-10-14 07:52:43.103887443 +0000 UTC m=+0.251667384 container start 3249896d4368fe76040ec8ce70cc6af42b99111f4e95da4406693dd40611a6aa (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-osd-3-activate, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, build-date=2025-09-24T08:57:55, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.buildah.version=1.33.12, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, GIT_BRANCH=main, maintainer=Guillaume Abrioux , CEPH_POINT_RELEASE=, 
org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, name=rhceph, com.redhat.license_terms=https://www.redhat.com/agreements, ceph=True, vendor=Red Hat, Inc., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, release=553, distribution-scope=public) Oct 14 03:52:43 localhost podman[32231]: 2025-10-14 07:52:43.104197958 +0000 UTC m=+0.251977899 container attach 3249896d4368fe76040ec8ce70cc6af42b99111f4e95da4406693dd40611a6aa (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-osd-3-activate, RELEASE=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, version=7, maintainer=Guillaume Abrioux , ceph=True, CEPH_POINT_RELEASE=, distribution-scope=public, io.buildah.version=1.33.12, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=553, name=rhceph, vcs-type=git, com.redhat.component=rhceph-container, GIT_BRANCH=main, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-09-24T08:57:55, description=Red Hat Ceph Storage 7) Oct 14 03:52:43 localhost ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-osd-3-activate[32244]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-3 Oct 14 03:52:43 localhost bash[32231]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-3 Oct 14 03:52:43 localhost ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-osd-3-activate[32244]: Running command: /usr/bin/ceph-bluestore-tool 
prime-osd-dir --path /var/lib/ceph/osd/ceph-3 --no-mon-config --dev /dev/mapper/ceph_vg1-ceph_lv1 Oct 14 03:52:43 localhost bash[32231]: Running command: /usr/bin/ceph-bluestore-tool prime-osd-dir --path /var/lib/ceph/osd/ceph-3 --no-mon-config --dev /dev/mapper/ceph_vg1-ceph_lv1 Oct 14 03:52:43 localhost ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-osd-3-activate[32244]: Running command: /usr/bin/chown -h ceph:ceph /dev/mapper/ceph_vg1-ceph_lv1 Oct 14 03:52:43 localhost bash[32231]: Running command: /usr/bin/chown -h ceph:ceph /dev/mapper/ceph_vg1-ceph_lv1 Oct 14 03:52:43 localhost ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-osd-3-activate[32244]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-1 Oct 14 03:52:43 localhost bash[32231]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-1 Oct 14 03:52:43 localhost ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-osd-3-activate[32244]: Running command: /usr/bin/ln -s /dev/mapper/ceph_vg1-ceph_lv1 /var/lib/ceph/osd/ceph-3/block Oct 14 03:52:43 localhost bash[32231]: Running command: /usr/bin/ln -s /dev/mapper/ceph_vg1-ceph_lv1 /var/lib/ceph/osd/ceph-3/block Oct 14 03:52:43 localhost ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-osd-3-activate[32244]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-3 Oct 14 03:52:43 localhost bash[32231]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-3 Oct 14 03:52:43 localhost ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-osd-3-activate[32244]: --> ceph-volume raw activate successful for osd ID: 3 Oct 14 03:52:43 localhost bash[32231]: --> ceph-volume raw activate successful for osd ID: 3 Oct 14 03:52:43 localhost systemd[1]: libpod-3249896d4368fe76040ec8ce70cc6af42b99111f4e95da4406693dd40611a6aa.scope: Deactivated successfully. 
Oct 14 03:52:43 localhost podman[32231]: 2025-10-14 07:52:43.718350875 +0000 UTC m=+0.866130826 container died 3249896d4368fe76040ec8ce70cc6af42b99111f4e95da4406693dd40611a6aa (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-osd-3-activate, com.redhat.license_terms=https://www.redhat.com/agreements, release=553, build-date=2025-09-24T08:57:55, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, maintainer=Guillaume Abrioux , url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.33.12, version=7, io.openshift.expose-services=, RELEASE=main, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, architecture=x86_64, CEPH_POINT_RELEASE=, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, vcs-type=git, distribution-scope=public) Oct 14 03:52:43 localhost podman[32360]: 2025-10-14 07:52:43.805841606 +0000 UTC m=+0.081242077 container remove 3249896d4368fe76040ec8ce70cc6af42b99111f4e95da4406693dd40611a6aa (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-osd-3-activate, vendor=Red Hat, Inc., io.buildah.version=1.33.12, GIT_CLEAN=True, distribution-scope=public, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, release=553, description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, 
io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux , GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, version=7, ceph=True, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, name=rhceph, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/agreements) Oct 14 03:52:43 localhost systemd[1]: var-lib-containers-storage-overlay-d060e405949df596d076d1afef86cf549dbeba83de9027b341a6132916363428-merged.mount: Deactivated successfully. Oct 14 03:52:44 localhost podman[32421]: Oct 14 03:52:44 localhost podman[32421]: 2025-10-14 07:52:44.104757399 +0000 UTC m=+0.074470087 container create bf42a7d6bc58c4808aa1988e9a0505f673795d0e0367c21c5b6bbe76769171d8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-osd-3, maintainer=Guillaume Abrioux , summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., ceph=True, architecture=x86_64, name=rhceph, io.openshift.tags=rhceph ceph, version=7, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, release=553, distribution-scope=public, io.buildah.version=1.33.12, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, vcs-type=git, RELEASE=main, GIT_CLEAN=True, 
com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-09-24T08:57:55, io.openshift.expose-services=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d) Oct 14 03:52:44 localhost systemd[1]: tmp-crun.Ij7gck.mount: Deactivated successfully. Oct 14 03:52:44 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/56f5b1ceea5f0be96ba86f2e5c88425b562da6df5e2fcc979708e795e017e00a/merged/rootfs supports timestamps until 2038 (0x7fffffff) Oct 14 03:52:44 localhost podman[32421]: 2025-10-14 07:52:44.075819113 +0000 UTC m=+0.045531721 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Oct 14 03:52:44 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/56f5b1ceea5f0be96ba86f2e5c88425b562da6df5e2fcc979708e795e017e00a/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff) Oct 14 03:52:44 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/56f5b1ceea5f0be96ba86f2e5c88425b562da6df5e2fcc979708e795e017e00a/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff) Oct 14 03:52:44 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/56f5b1ceea5f0be96ba86f2e5c88425b562da6df5e2fcc979708e795e017e00a/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff) Oct 14 03:52:44 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/56f5b1ceea5f0be96ba86f2e5c88425b562da6df5e2fcc979708e795e017e00a/merged/var/lib/ceph/osd/ceph-3 supports timestamps until 2038 (0x7fffffff) Oct 14 03:52:44 localhost podman[32421]: 2025-10-14 07:52:44.256914183 +0000 UTC m=+0.226626821 container init bf42a7d6bc58c4808aa1988e9a0505f673795d0e0367c21c5b6bbe76769171d8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-osd-3, distribution-scope=public, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux , 
vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, name=rhceph, RELEASE=main, io.buildah.version=1.33.12, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, CEPH_POINT_RELEASE=, release=553, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-09-24T08:57:55, version=7, vcs-type=git) Oct 14 03:52:44 localhost podman[32421]: 2025-10-14 07:52:44.262952768 +0000 UTC m=+0.232665406 container start bf42a7d6bc58c4808aa1988e9a0505f673795d0e0367c21c5b6bbe76769171d8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-osd-3, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, GIT_BRANCH=main, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=Guillaume Abrioux , description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_CLEAN=True, distribution-scope=public, build-date=2025-09-24T08:57:55, vendor=Red Hat, Inc., io.buildah.version=1.33.12, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, architecture=x86_64, 
io.openshift.expose-services=, RELEASE=main, name=rhceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=553) Oct 14 03:52:44 localhost bash[32421]: bf42a7d6bc58c4808aa1988e9a0505f673795d0e0367c21c5b6bbe76769171d8 Oct 14 03:52:44 localhost systemd[1]: Started Ceph osd.3 for fcadf6e2-9176-5818-a8d0-37b19acf8eaf. Oct 14 03:52:44 localhost ceph-osd[32440]: set uid:gid to 167:167 (ceph:ceph) Oct 14 03:52:44 localhost ceph-osd[32440]: ceph version 18.2.1-361.el9cp (439dcd6094d413840eb2ec590fe2194ec616687f) reef (stable), process ceph-osd, pid 2 Oct 14 03:52:44 localhost ceph-osd[32440]: pidfile_write: ignore empty --pid-file Oct 14 03:52:44 localhost ceph-osd[32440]: bdev(0x55b0d2324e00 /var/lib/ceph/osd/ceph-3/block) open path /var/lib/ceph/osd/ceph-3/block Oct 14 03:52:44 localhost ceph-osd[32440]: bdev(0x55b0d2324e00 /var/lib/ceph/osd/ceph-3/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-3/block failed: (22) Invalid argument Oct 14 03:52:44 localhost ceph-osd[32440]: bdev(0x55b0d2324e00 /var/lib/ceph/osd/ceph-3/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported Oct 14 03:52:44 localhost ceph-osd[32440]: bluestore(/var/lib/ceph/osd/ceph-3) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 data 0.06 Oct 14 03:52:44 localhost ceph-osd[32440]: bdev(0x55b0d2325180 /var/lib/ceph/osd/ceph-3/block) open path /var/lib/ceph/osd/ceph-3/block Oct 14 03:52:44 localhost ceph-osd[32440]: bdev(0x55b0d2325180 /var/lib/ceph/osd/ceph-3/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-3/block failed: (22) Invalid argument Oct 14 03:52:44 localhost ceph-osd[32440]: bdev(0x55b0d2325180 /var/lib/ceph/osd/ceph-3/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported Oct 14 03:52:44 localhost ceph-osd[32440]: bluefs 
add_block_device bdev 1 path /var/lib/ceph/osd/ceph-3/block size 7.0 GiB Oct 14 03:52:44 localhost ceph-osd[32440]: bdev(0x55b0d2325180 /var/lib/ceph/osd/ceph-3/block) close Oct 14 03:52:44 localhost ceph-osd[32440]: bdev(0x55b0d2324e00 /var/lib/ceph/osd/ceph-3/block) close Oct 14 03:52:44 localhost ceph-osd[32440]: starting osd.3 osd_data /var/lib/ceph/osd/ceph-3 /var/lib/ceph/osd/ceph-3/journal Oct 14 03:52:44 localhost ceph-osd[32440]: load: jerasure load: lrc Oct 14 03:52:44 localhost ceph-osd[32440]: bdev(0x55b0d2324e00 /var/lib/ceph/osd/ceph-3/block) open path /var/lib/ceph/osd/ceph-3/block Oct 14 03:52:44 localhost ceph-osd[32440]: bdev(0x55b0d2324e00 /var/lib/ceph/osd/ceph-3/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-3/block failed: (22) Invalid argument Oct 14 03:52:44 localhost ceph-osd[32440]: bdev(0x55b0d2324e00 /var/lib/ceph/osd/ceph-3/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported Oct 14 03:52:44 localhost ceph-osd[32440]: bluestore(/var/lib/ceph/osd/ceph-3) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 data 0.06 Oct 14 03:52:44 localhost ceph-osd[32440]: bdev(0x55b0d2324e00 /var/lib/ceph/osd/ceph-3/block) close Oct 14 03:52:44 localhost ceph-osd[32440]: bdev(0x55b0d2324e00 /var/lib/ceph/osd/ceph-3/block) open path /var/lib/ceph/osd/ceph-3/block Oct 14 03:52:44 localhost ceph-osd[32440]: bdev(0x55b0d2324e00 /var/lib/ceph/osd/ceph-3/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-3/block failed: (22) Invalid argument Oct 14 03:52:44 localhost ceph-osd[32440]: bdev(0x55b0d2324e00 /var/lib/ceph/osd/ceph-3/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported Oct 14 03:52:44 localhost ceph-osd[32440]: bluestore(/var/lib/ceph/osd/ceph-3) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 data 0.06 Oct 14 03:52:44 localhost ceph-osd[32440]: bdev(0x55b0d2324e00 /var/lib/ceph/osd/ceph-3/block) 
close Oct 14 03:52:45 localhost podman[32534]: Oct 14 03:52:45 localhost podman[32534]: 2025-10-14 07:52:45.085082276 +0000 UTC m=+0.050977877 container create c70a75485f6fdde862728a92f5928320af77c7b6b556aee8e1236a45e492f71a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=xenodochial_shtern, name=rhceph, architecture=x86_64, description=Red Hat Ceph Storage 7, ceph=True, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux , com.redhat.component=rhceph-container, io.openshift.expose-services=, GIT_BRANCH=main, vcs-type=git, io.buildah.version=1.33.12, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, version=7, release=553, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-09-24T08:57:55, GIT_CLEAN=True, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9) Oct 14 03:52:45 localhost systemd[1]: Started libpod-conmon-c70a75485f6fdde862728a92f5928320af77c7b6b556aee8e1236a45e492f71a.scope. Oct 14 03:52:45 localhost systemd[1]: Started libcrun container. Oct 14 03:52:45 localhost ceph-osd[31500]: osd.0 0 maybe_override_max_osd_capacity_for_qos osd bench result - bandwidth (MiB/sec): 30.147 iops: 7717.726 elapsed_sec: 0.389 Oct 14 03:52:45 localhost ceph-osd[31500]: log_channel(cluster) log [WRN] : OSD bench result of 7717.726261 IOPS is not within the threshold limit range of 50.000000 IOPS and 500.000000 IOPS for osd.0. IOPS capacity is unchanged at 315.000000 IOPS. 
The recommendation is to establish the osd's IOPS capacity using other benchmark tools (e.g. Fio) and then override osd_mclock_max_capacity_iops_[hdd|ssd]. Oct 14 03:52:45 localhost ceph-osd[31500]: osd.0 0 waiting for initial osdmap Oct 14 03:52:45 localhost ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-osd-0[31496]: 2025-10-14T07:52:45.151+0000 7f87f833c640 -1 osd.0 0 waiting for initial osdmap Oct 14 03:52:45 localhost ceph-osd[32440]: mClockScheduler: set_osd_capacity_params_from_config: osd_bandwidth_cost_per_io: 499321.90 bytes/io, osd_bandwidth_capacity_per_shard 157286400.00 bytes/second Oct 14 03:52:45 localhost ceph-osd[32440]: osd.3:0.OSDShard using op scheduler mclock_scheduler, cutoff=196 Oct 14 03:52:45 localhost ceph-osd[32440]: bdev(0x55b0d2324e00 /var/lib/ceph/osd/ceph-3/block) open path /var/lib/ceph/osd/ceph-3/block Oct 14 03:52:45 localhost ceph-osd[32440]: bdev(0x55b0d2324e00 /var/lib/ceph/osd/ceph-3/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-3/block failed: (22) Invalid argument Oct 14 03:52:45 localhost ceph-osd[32440]: bdev(0x55b0d2324e00 /var/lib/ceph/osd/ceph-3/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported Oct 14 03:52:45 localhost ceph-osd[32440]: bluestore(/var/lib/ceph/osd/ceph-3) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 data 0.06 Oct 14 03:52:45 localhost ceph-osd[32440]: bdev(0x55b0d2325180 /var/lib/ceph/osd/ceph-3/block) open path /var/lib/ceph/osd/ceph-3/block Oct 14 03:52:45 localhost ceph-osd[32440]: bdev(0x55b0d2325180 /var/lib/ceph/osd/ceph-3/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-3/block failed: (22) Invalid argument Oct 14 03:52:45 localhost ceph-osd[32440]: bdev(0x55b0d2325180 /var/lib/ceph/osd/ceph-3/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported Oct 14 03:52:45 localhost ceph-osd[32440]: bluefs add_block_device bdev 1 path 
/var/lib/ceph/osd/ceph-3/block size 7.0 GiB Oct 14 03:52:45 localhost ceph-osd[32440]: bluefs mount Oct 14 03:52:45 localhost ceph-osd[32440]: bluefs _init_alloc shared, id 1, capacity 0x1bfc00000, block size 0x10000 Oct 14 03:52:45 localhost ceph-osd[32440]: bluefs mount shared_bdev_used = 0 Oct 14 03:52:45 localhost ceph-osd[32440]: bluestore(/var/lib/ceph/osd/ceph-3) _prepare_db_environment set db_paths to db,7136398540 db.slow,7136398540 Oct 14 03:52:45 localhost podman[32534]: 2025-10-14 07:52:45.064149777 +0000 UTC m=+0.030045408 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Oct 14 03:52:45 localhost ceph-osd[31500]: osd.0 10 crush map has features 288514050185494528, adjusting msgr requires for clients Oct 14 03:52:45 localhost ceph-osd[31500]: osd.0 10 crush map has features 288514050185494528 was 288232575208792577, adjusting msgr requires for mons Oct 14 03:52:45 localhost ceph-osd[31500]: osd.0 10 crush map has features 3314932999778484224, adjusting msgr requires for osds Oct 14 03:52:45 localhost ceph-osd[31500]: osd.0 10 check_osdmap_features require_osd_release unknown -> reef Oct 14 03:52:45 localhost podman[32534]: 2025-10-14 07:52:45.165912103 +0000 UTC m=+0.131807744 container init c70a75485f6fdde862728a92f5928320af77c7b6b556aee8e1236a45e492f71a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=xenodochial_shtern, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, version=7, com.redhat.component=rhceph-container, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, release=553, distribution-scope=public, io.buildah.version=1.33.12, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-09-24T08:57:55, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, 
io.openshift.expose-services=, RELEASE=main, io.openshift.tags=rhceph ceph, ceph=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux , GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0) Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: RocksDB version: 7.9.2 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Git sha 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Compile date 2025-09-23 00:00:00 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: DB SUMMARY Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: DB Session ID: ZK2S4ZHXI3MPRH2BPQT8 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: CURRENT file: CURRENT Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: IDENTITY file: IDENTITY Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: MANIFEST file: MANIFEST-000032 size: 1007 Bytes Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: SST files in db dir, Total Num: 1, files: 000030.sst Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: SST files in db.slow dir, Total Num: 0, files: Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Write Ahead Log file in db.wal: 000031.log size: 5093 ; Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.error_if_exists: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.create_if_missing: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.paranoid_checks: 1 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.flush_verify_memtable_count: 1 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.track_and_verify_wals_in_manifest: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.verify_sst_unique_id_in_manifest: 1 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.env: 0x55b0d25b8c40 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: 
Options.fs: LegacyFileSystem Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.info_log: 0x55b0d32a44c0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_file_opening_threads: 16 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.statistics: (nil) Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.use_fsync: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_log_file_size: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_manifest_file_size: 1073741824 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.log_file_time_to_roll: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.keep_log_file_num: 1000 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.recycle_log_file_num: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.allow_fallocate: 1 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.allow_mmap_reads: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.allow_mmap_writes: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.use_direct_reads: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.use_direct_io_for_flush_and_compaction: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.create_missing_column_families: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.db_log_dir: Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.wal_dir: db.wal Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.table_cache_numshardbits: 6 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.WAL_ttl_seconds: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.WAL_size_limit_MB: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_write_batch_group_size_bytes: 1048576 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.manifest_preallocation_size: 4194304 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.is_fd_close_on_exec: 1 Oct 14 03:52:45 
localhost ceph-osd[32440]: rocksdb: Options.advise_random_on_open: 1 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.db_write_buffer_size: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.write_buffer_manager: 0x55b0d230e140 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.access_hint_on_compaction_start: 1 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.random_access_max_buffer_size: 1048576 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.use_adaptive_mutex: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.rate_limiter: (nil) Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.sst_file_manager.rate_bytes_per_sec: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.wal_recovery_mode: 2 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.enable_thread_tracking: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.enable_pipelined_write: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.unordered_write: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.allow_concurrent_memtable_write: 1 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.enable_write_thread_adaptive_yield: 1 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.write_thread_max_yield_usec: 100 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.write_thread_slow_yield_usec: 3 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.row_cache: None Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.wal_filter: None Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.avoid_flush_during_recovery: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.allow_ingest_behind: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.two_write_queues: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.manual_wal_flush: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.wal_compression: 0 Oct 14 03:52:45 
localhost ceph-osd[32440]: rocksdb: Options.atomic_flush: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.avoid_unnecessary_blocking_io: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.persist_stats_to_disk: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.write_dbid_to_manifest: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.log_readahead_size: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.file_checksum_gen_factory: Unknown Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.best_efforts_recovery: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_bgerror_resume_count: 2147483647 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bgerror_resume_retry_interval: 1000000 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.allow_data_in_errors: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.db_host_id: __hostname__ Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.enforce_single_del_contracts: true Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_background_jobs: 4 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_background_compactions: -1 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_subcompactions: 1 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.avoid_flush_during_shutdown: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.writable_file_max_buffer_size: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.delayed_write_rate : 16777216 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_total_wal_size: 1073741824 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.delete_obsolete_files_period_micros: 21600000000 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.stats_dump_period_sec: 600 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.stats_persist_period_sec: 600 Oct 14 03:52:45 localhost ceph-osd[32440]: 
rocksdb: Options.stats_history_buffer_size: 1048576 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_open_files: -1 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bytes_per_sync: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.wal_bytes_per_sync: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.strict_bytes_per_sync: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_readahead_size: 2097152 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_background_flushes: -1 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Compression algorithms supported: Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: #011kZSTD supported: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: #011kXpressCompression supported: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: #011kBZip2Compression supported: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: #011kZSTDNotFinalCompression supported: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: #011kLZ4Compression supported: 1 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: #011kZlibCompression supported: 1 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: #011kLZ4HCCompression supported: 1 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: #011kSnappyCompression supported: 1 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Fast CRC32 supported: Supported on x86 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: DMutex implementation: pthread_mutex_t Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: [db/db_impl/db_impl_readonly.cc:25] Opening the db in read only mode Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: db/MANIFEST-000032 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 0, name: default) Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: [db/column_family.cc:630] 
--------------- Options for column family [default]: Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.comparator: leveldb.BytewiseComparator Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.merge_operator: .T:int64_array.b:bitwise_xor Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_filter: None Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_filter_factory: None Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.sst_partitioner_factory: None Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.memtable_factory: SkipListFactory Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.table_factory: BlockBasedTable Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55b0d32a4680)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x55b0d22fc850#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2 Oct 14 
03:52:45 localhost ceph-osd[32440]: rocksdb: Options.write_buffer_size: 16777216 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_write_buffer_number: 64 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compression: LZ4 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression: Disabled Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.prefix_extractor: nullptr Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.num_levels: 7 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.min_write_buffer_number_to_merge: 6 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression_opts.level: 32767 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression_opts.strategy: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression_opts.enabled: false Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compression_opts.window_bits: -14 Oct 14 03:52:45 localhost 
ceph-osd[32440]: rocksdb: Options.compression_opts.level: 32767 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compression_opts.strategy: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compression_opts.max_dict_bytes: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compression_opts.parallel_threads: 1 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compression_opts.enabled: false Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.level0_file_num_compaction_trigger: 8 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.level0_slowdown_writes_trigger: 20 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.level0_stop_writes_trigger: 36 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.target_file_size_base: 67108864 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.target_file_size_multiplier: 1 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_bytes_for_level_base: 1073741824 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: 
Options.max_bytes_for_level_multiplier_addtl[4]: 1 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_sequential_skip_in_iterations: 8 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_compaction_bytes: 1677721600 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.arena_block_size: 1048576 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.disable_auto_compactions: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_style: kCompactionStyleLevel Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_options_universal.size_ratio: 1 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 Oct 14 03:52:45 localhost 
ceph-osd[32440]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.table_properties_collectors: Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.inplace_update_support: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.inplace_update_num_locks: 10000 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.memtable_whole_key_filtering: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.memtable_huge_page_size: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bloom_locality: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_successive_merges: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.optimize_filters_for_hits: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.paranoid_file_checks: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.force_consistency_checks: 1 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.report_bg_io_stats: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.ttl: 2592000 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.periodic_compaction_seconds: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.preclude_last_level_data_seconds: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.preserve_internal_time_seconds: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.enable_blob_files: false Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.min_blob_size: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.blob_file_size: 268435456 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.blob_compression_type: NoCompression Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.enable_blob_garbage_collection: false Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: 
Options.blob_garbage_collection_age_cutoff: 0.250000 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.blob_compaction_readahead_size: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.blob_file_starting_level: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 1, name: m-0) Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-0]: Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.comparator: leveldb.BytewiseComparator Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.merge_operator: None Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_filter: None Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_filter_factory: None Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.sst_partitioner_factory: None Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.memtable_factory: SkipListFactory Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.table_factory: BlockBasedTable Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55b0d32a4680)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x55b0d22fc850#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 
strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.write_buffer_size: 16777216 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_write_buffer_number: 64 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compression: LZ4 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression: Disabled Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.prefix_extractor: nullptr Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.num_levels: 7 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.min_write_buffer_number_to_merge: 6 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression_opts.level: 32767 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression_opts.strategy: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Oct 14 03:52:45 localhost 
ceph-osd[32440]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression_opts.enabled: false Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compression_opts.window_bits: -14 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compression_opts.level: 32767 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compression_opts.strategy: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compression_opts.max_dict_bytes: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compression_opts.parallel_threads: 1 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compression_opts.enabled: false Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.level0_file_num_compaction_trigger: 8 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.level0_slowdown_writes_trigger: 20 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.level0_stop_writes_trigger: 36 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.target_file_size_base: 67108864 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.target_file_size_multiplier: 1 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_bytes_for_level_base: 1073741824 Oct 14 03:52:45 localhost ceph-osd[32440]: 
rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_sequential_skip_in_iterations: 8
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_compaction_bytes: 1677721600
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.arena_block_size: 1048576
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.disable_auto_compactions: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_style: kCompactionStyleLevel
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_pri: kMinOverlappingRatio
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.inplace_update_support: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.inplace_update_num_locks: 10000
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.memtable_whole_key_filtering: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.memtable_huge_page_size: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bloom_locality: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_successive_merges: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.optimize_filters_for_hits: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.paranoid_file_checks: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.force_consistency_checks: 1
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.report_bg_io_stats: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.ttl: 2592000
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.periodic_compaction_seconds: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.preclude_last_level_data_seconds: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.preserve_internal_time_seconds: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.enable_blob_files: false
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.min_blob_size: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.blob_file_size: 268435456
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.blob_compression_type: NoCompression
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.enable_blob_garbage_collection: false
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.blob_compaction_readahead_size: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.blob_file_starting_level: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 2, name: m-1)
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-1]:
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.comparator: leveldb.BytewiseComparator
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.merge_operator: None
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_filter: None
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_filter_factory: None
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.sst_partitioner_factory: None
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.memtable_factory: SkipListFactory
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.table_factory: BlockBasedTable
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55b0d32a4680)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x55b0d22fc850#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.write_buffer_size: 16777216
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_write_buffer_number: 64
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compression: LZ4
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression: Disabled
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.prefix_extractor: nullptr
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.num_levels: 7
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.min_write_buffer_number_to_merge: 6
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_write_buffer_number_to_maintain: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_write_buffer_size_to_maintain: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression_opts.window_bits: -14
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression_opts.level: 32767
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression_opts.strategy: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression_opts.enabled: false
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compression_opts.window_bits: -14
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compression_opts.level: 32767
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compression_opts.strategy: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compression_opts.max_dict_bytes: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compression_opts.parallel_threads: 1
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compression_opts.enabled: false
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.level0_file_num_compaction_trigger: 8
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.level0_slowdown_writes_trigger: 20
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.level0_stop_writes_trigger: 36
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.target_file_size_base: 67108864
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.target_file_size_multiplier: 1
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_bytes_for_level_base: 1073741824
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_sequential_skip_in_iterations: 8
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_compaction_bytes: 1677721600
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.arena_block_size: 1048576
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.disable_auto_compactions: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_style: kCompactionStyleLevel
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_pri: kMinOverlappingRatio
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.inplace_update_support: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.inplace_update_num_locks: 10000
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.memtable_whole_key_filtering: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.memtable_huge_page_size: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bloom_locality: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_successive_merges: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.optimize_filters_for_hits: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.paranoid_file_checks: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.force_consistency_checks: 1
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.report_bg_io_stats: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.ttl: 2592000
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.periodic_compaction_seconds: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.preclude_last_level_data_seconds: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.preserve_internal_time_seconds: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.enable_blob_files: false
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.min_blob_size: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.blob_file_size: 268435456
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.blob_compression_type: NoCompression
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.enable_blob_garbage_collection: false
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000
Oct 14 03:52:45 localhost ceph-osd[31500]: osd.0 10 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory
Oct 14 03:52:45 localhost ceph-osd[31500]: osd.0 10 set_numa_affinity not setting numa affinity
Oct 14 03:52:45 localhost ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-osd-0[31496]: 2025-10-14T07:52:45.175+0000 7f87f3151640 -1 osd.0 10 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory
Oct 14 03:52:45 localhost ceph-osd[31500]: osd.0 10 _collect_metadata loop3: no unique device id for loop3: fallback method has no model nor serial
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.blob_compaction_readahead_size: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.blob_file_starting_level: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 3, name: m-2)
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-2]:
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.comparator: leveldb.BytewiseComparator
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.merge_operator: None
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_filter: None
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_filter_factory: None
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.sst_partitioner_factory: None
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.memtable_factory: SkipListFactory
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.table_factory: BlockBasedTable
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55b0d32a4680)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x55b0d22fc850#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.write_buffer_size: 16777216
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_write_buffer_number: 64
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compression: LZ4
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression: Disabled
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.prefix_extractor: nullptr
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.num_levels: 7
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.min_write_buffer_number_to_merge: 6
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_write_buffer_number_to_maintain: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_write_buffer_size_to_maintain: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression_opts.window_bits: -14
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression_opts.level: 32767
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression_opts.strategy: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression_opts.enabled: false
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compression_opts.window_bits: -14
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compression_opts.level: 32767
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compression_opts.strategy: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compression_opts.max_dict_bytes: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compression_opts.parallel_threads: 1
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compression_opts.enabled: false
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.level0_file_num_compaction_trigger: 8
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.level0_slowdown_writes_trigger: 20
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.level0_stop_writes_trigger: 36
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.target_file_size_base: 67108864
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.target_file_size_multiplier: 1
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_bytes_for_level_base: 1073741824
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_sequential_skip_in_iterations: 8
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_compaction_bytes: 1677721600
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.arena_block_size: 1048576
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.disable_auto_compactions: 0
Oct 14 03:52:45 localhost podman[32534]: 2025-10-14 07:52:45.181207325 +0000 UTC m=+0.147102926 container start c70a75485f6fdde862728a92f5928320af77c7b6b556aee8e1236a45e492f71a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=xenodochial_shtern, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, vcs-type=git, com.redhat.component=rhceph-container, RELEASE=main, build-date=2025-09-24T08:57:55, version=7, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux , GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, release=553, GIT_BRANCH=main, CEPH_POINT_RELEASE=, ceph=True, distribution-scope=public)
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_style: kCompactionStyleLevel
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_pri: kMinOverlappingRatio
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.inplace_update_support: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.inplace_update_num_locks: 10000
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.memtable_whole_key_filtering: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.memtable_huge_page_size: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bloom_locality: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_successive_merges: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.optimize_filters_for_hits: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.paranoid_file_checks: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.force_consistency_checks: 1
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.report_bg_io_stats: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.ttl: 2592000
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.periodic_compaction_seconds: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.preclude_last_level_data_seconds: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.preserve_internal_time_seconds: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.enable_blob_files: false
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.min_blob_size: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.blob_file_size: 268435456
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.blob_compression_type: NoCompression
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.enable_blob_garbage_collection: false
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.blob_compaction_readahead_size: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.blob_file_starting_level: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 4, name: p-0)
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-0]:
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.comparator: leveldb.BytewiseComparator
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.merge_operator: None
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_filter: None
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_filter_factory: None
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.sst_partitioner_factory: None
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.memtable_factory: SkipListFactory
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.table_factory: BlockBasedTable
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55b0d32a4680)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x55b0d22fc850#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.write_buffer_size: 16777216
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_write_buffer_number: 64
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compression: LZ4
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression: Disabled
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.prefix_extractor: nullptr
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.num_levels: 7
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.min_write_buffer_number_to_merge: 6
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_write_buffer_number_to_maintain: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_write_buffer_size_to_maintain: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression_opts.window_bits: -14
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression_opts.level: 32767
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression_opts.strategy: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression_opts.enabled: false
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compression_opts.window_bits: -14
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compression_opts.level: 32767
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compression_opts.strategy: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compression_opts.max_dict_bytes: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compression_opts.parallel_threads: 1
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compression_opts.enabled: false
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0
Oct 14 03:52:45 localhost podman[32534]: 2025-10-14 07:52:45.181445486 +0000 UTC m=+0.147341167 container attach c70a75485f6fdde862728a92f5928320af77c7b6b556aee8e1236a45e492f71a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=xenodochial_shtern, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, maintainer=Guillaume Abrioux , com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, ceph=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, io.openshift.tags=rhceph ceph, architecture=x86_64, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, vcs-type=git, release=553, distribution-scope=public, build-date=2025-09-24T08:57:55, GIT_CLEAN=True, version=7)
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.level0_file_num_compaction_trigger: 8
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.level0_slowdown_writes_trigger: 20
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.level0_stop_writes_trigger: 36
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.target_file_size_base: 67108864
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.target_file_size_multiplier: 1
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_bytes_for_level_base: 1073741824
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_sequential_skip_in_iterations: 8
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_compaction_bytes: 1677721600
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.arena_block_size: 1048576
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.disable_auto_compactions: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_style: kCompactionStyleLevel
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_pri: kMinOverlappingRatio
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0); Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.inplace_update_support: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.inplace_update_num_locks: 10000 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.memtable_whole_key_filtering: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.memtable_huge_page_size: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bloom_locality: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_successive_merges: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.optimize_filters_for_hits: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.paranoid_file_checks: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.force_consistency_checks: 1 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.report_bg_io_stats: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.ttl: 2592000 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.periodic_compaction_seconds: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.preclude_last_level_data_seconds: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.preserve_internal_time_seconds: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.enable_blob_files: false Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.min_blob_size: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.blob_file_size: 268435456 Oct 14 03:52:45 
localhost ceph-osd[32440]: rocksdb: Options.blob_compression_type: NoCompression Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.enable_blob_garbage_collection: false Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.blob_compaction_readahead_size: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.blob_file_starting_level: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 5, name: p-1) Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-1]: Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.comparator: leveldb.BytewiseComparator Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.merge_operator: None Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_filter: None Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_filter_factory: None Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.sst_partitioner_factory: None Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.memtable_factory: SkipListFactory Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.table_factory: BlockBasedTable Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55b0d32a4680)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 
data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x55b0d22fc850#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.write_buffer_size: 16777216 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_write_buffer_number: 64 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compression: LZ4 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression: Disabled Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.prefix_extractor: nullptr Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.num_levels: 7 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.min_write_buffer_number_to_merge: 6 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression_opts.level: 32767 Oct 14 
03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression_opts.strategy: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression_opts.enabled: false Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compression_opts.window_bits: -14 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compression_opts.level: 32767 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compression_opts.strategy: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compression_opts.max_dict_bytes: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compression_opts.parallel_threads: 1 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compression_opts.enabled: false Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.level0_file_num_compaction_trigger: 8 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.level0_slowdown_writes_trigger: 20 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.level0_stop_writes_trigger: 36 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.target_file_size_base: 67108864 Oct 14 
03:52:45 localhost ceph-osd[32440]: rocksdb: Options.target_file_size_multiplier: 1 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_bytes_for_level_base: 1073741824 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 Oct 14 03:52:45 localhost xenodochial_shtern[32548]: 167 167 Oct 14 03:52:45 localhost systemd[1]: libpod-c70a75485f6fdde862728a92f5928320af77c7b6b556aee8e1236a45e492f71a.scope: Deactivated successfully. Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_sequential_skip_in_iterations: 8 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_compaction_bytes: 1677721600 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.arena_block_size: 1048576 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.disable_auto_compactions: 0 Oct 
14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_style: kCompactionStyleLevel Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_options_universal.size_ratio: 1 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0); Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.inplace_update_support: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.inplace_update_num_locks: 10000 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.memtable_whole_key_filtering: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.memtable_huge_page_size: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bloom_locality: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_successive_merges: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: 
Options.optimize_filters_for_hits: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.paranoid_file_checks: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.force_consistency_checks: 1 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.report_bg_io_stats: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.ttl: 2592000 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.periodic_compaction_seconds: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.preclude_last_level_data_seconds: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.preserve_internal_time_seconds: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.enable_blob_files: false Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.min_blob_size: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.blob_file_size: 268435456 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.blob_compression_type: NoCompression Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.enable_blob_garbage_collection: false Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.blob_compaction_readahead_size: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.blob_file_starting_level: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 6, name: p-2) Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-2]: Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.comparator: leveldb.BytewiseComparator Oct 14 03:52:45 localhost ceph-osd[32440]: 
rocksdb: Options.merge_operator: None Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_filter: None Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_filter_factory: None Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.sst_partitioner_factory: None Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.memtable_factory: SkipListFactory Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.table_factory: BlockBasedTable Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55b0d32a4680)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x55b0d22fc850#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.write_buffer_size: 16777216 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_write_buffer_number: 64 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: 
Options.compression: LZ4 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression: Disabled Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.prefix_extractor: nullptr Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.num_levels: 7 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.min_write_buffer_number_to_merge: 6 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression_opts.level: 32767 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression_opts.strategy: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression_opts.enabled: false Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compression_opts.window_bits: -14 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compression_opts.level: 32767 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compression_opts.strategy: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: 
Options.compression_opts.max_dict_bytes: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compression_opts.parallel_threads: 1 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compression_opts.enabled: false Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.level0_file_num_compaction_trigger: 8 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.level0_slowdown_writes_trigger: 20 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.level0_stop_writes_trigger: 36 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.target_file_size_base: 67108864 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.target_file_size_multiplier: 1 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_bytes_for_level_base: 1073741824 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 Oct 14 03:52:45 localhost podman[32534]: 2025-10-14 07:52:45.185725078 +0000 UTC m=+0.151620669 container died c70a75485f6fdde862728a92f5928320af77c7b6b556aee8e1236a45e492f71a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=xenodochial_shtern, vendor=Red Hat, Inc., vcs-type=git, io.openshift.expose-services=, version=7, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, 
io.buildah.version=1.33.12, architecture=x86_64, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, ceph=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, maintainer=Guillaume Abrioux , release=553, io.openshift.tags=rhceph ceph, build-date=2025-09-24T08:57:55, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, RELEASE=main, name=rhceph) Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_sequential_skip_in_iterations: 8 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_compaction_bytes: 1677721600 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.arena_block_size: 1048576 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.disable_auto_compactions: 0 Oct 14 
03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_style: kCompactionStyleLevel Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_options_universal.size_ratio: 1 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0); Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.inplace_update_support: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.inplace_update_num_locks: 10000 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.memtable_whole_key_filtering: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.memtable_huge_page_size: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bloom_locality: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_successive_merges: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: 
Options.optimize_filters_for_hits: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.paranoid_file_checks: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.force_consistency_checks: 1 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.report_bg_io_stats: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.ttl: 2592000 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.periodic_compaction_seconds: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.preclude_last_level_data_seconds: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.preserve_internal_time_seconds: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.enable_blob_files: false Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.min_blob_size: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.blob_file_size: 268435456 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.blob_compression_type: NoCompression Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.enable_blob_garbage_collection: false Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.blob_compaction_readahead_size: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.blob_file_starting_level: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 7, name: O-0) Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-0]: Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.comparator: leveldb.BytewiseComparator Oct 14 03:52:45 localhost ceph-osd[32440]: 
rocksdb: Options.merge_operator: None Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_filter: None Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_filter_factory: None Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.sst_partitioner_factory: None Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.memtable_factory: SkipListFactory Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.table_factory: BlockBasedTable Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55b0d32a48a0)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x55b0d22fc2d0#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 536870912#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.write_buffer_size: 16777216 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_write_buffer_number: 64 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: 
Options.compression: LZ4
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression: Disabled
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.prefix_extractor: nullptr
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.num_levels: 7
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.min_write_buffer_number_to_merge: 6
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_write_buffer_number_to_maintain: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_write_buffer_size_to_maintain: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression_opts.window_bits: -14
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression_opts.level: 32767
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression_opts.strategy: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression_opts.enabled: false
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compression_opts.window_bits: -14
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compression_opts.level: 32767
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compression_opts.strategy: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compression_opts.max_dict_bytes: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compression_opts.parallel_threads: 1
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compression_opts.enabled: false
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.level0_file_num_compaction_trigger: 8
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.level0_slowdown_writes_trigger: 20
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.level0_stop_writes_trigger: 36
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.target_file_size_base: 67108864
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.target_file_size_multiplier: 1
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_bytes_for_level_base: 1073741824
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_sequential_skip_in_iterations: 8
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_compaction_bytes: 1677721600
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.arena_block_size: 1048576
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.disable_auto_compactions: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_style: kCompactionStyleLevel
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_pri: kMinOverlappingRatio
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.inplace_update_support: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.inplace_update_num_locks: 10000
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.memtable_whole_key_filtering: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.memtable_huge_page_size: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bloom_locality: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_successive_merges: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.optimize_filters_for_hits: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.paranoid_file_checks: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.force_consistency_checks: 1
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.report_bg_io_stats: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.ttl: 2592000
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.periodic_compaction_seconds: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.preclude_last_level_data_seconds: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.preserve_internal_time_seconds: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.enable_blob_files: false
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.min_blob_size: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.blob_file_size: 268435456
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.blob_compression_type: NoCompression
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.enable_blob_garbage_collection: false
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb:
Options.blob_garbage_collection_force_threshold: 1.000000
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.blob_compaction_readahead_size: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.blob_file_starting_level: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 8, name: O-1)
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-1]:
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.comparator: leveldb.BytewiseComparator
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.merge_operator: None
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_filter: None
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_filter_factory: None
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.sst_partitioner_factory: None
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.memtable_factory: SkipListFactory
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.table_factory: BlockBasedTable
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55b0d32a48a0)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x55b0d22fc2d0#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 536870912#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.write_buffer_size: 16777216
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_write_buffer_number: 64
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compression: LZ4
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression: Disabled
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.prefix_extractor: nullptr
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.num_levels: 7
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.min_write_buffer_number_to_merge: 6
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_write_buffer_number_to_maintain: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_write_buffer_size_to_maintain: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression_opts.window_bits: -14
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression_opts.level: 32767
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression_opts.strategy: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression_opts.enabled: false
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compression_opts.window_bits: -14
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compression_opts.level: 32767
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compression_opts.strategy: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compression_opts.max_dict_bytes: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compression_opts.parallel_threads: 1
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compression_opts.enabled: false
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.level0_file_num_compaction_trigger: 8
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.level0_slowdown_writes_trigger: 20
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.level0_stop_writes_trigger: 36
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.target_file_size_base: 67108864
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.target_file_size_multiplier: 1
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_bytes_for_level_base: 1073741824
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_sequential_skip_in_iterations: 8
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_compaction_bytes: 1677721600
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.arena_block_size: 1048576
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.disable_auto_compactions: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_style: kCompactionStyleLevel
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_pri: kMinOverlappingRatio
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Oct 14 03:52:45
localhost ceph-osd[32440]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.inplace_update_support: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.inplace_update_num_locks: 10000
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.memtable_whole_key_filtering: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.memtable_huge_page_size: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bloom_locality: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_successive_merges: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.optimize_filters_for_hits: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.paranoid_file_checks: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.force_consistency_checks: 1
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.report_bg_io_stats: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.ttl: 2592000
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.periodic_compaction_seconds: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.preclude_last_level_data_seconds: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.preserve_internal_time_seconds: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.enable_blob_files: false
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.min_blob_size: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.blob_file_size: 268435456
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.blob_compression_type: NoCompression
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.enable_blob_garbage_collection: false
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.blob_compaction_readahead_size: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.blob_file_starting_level: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 9, name: O-2)
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-2]:
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.comparator: leveldb.BytewiseComparator
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.merge_operator: None
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_filter: None
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_filter_factory: None
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.sst_partitioner_factory: None
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.memtable_factory: SkipListFactory
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.table_factory: BlockBasedTable
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55b0d32a48a0)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x55b0d22fc2d0#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 536870912#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.write_buffer_size: 16777216
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_write_buffer_number: 64
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compression: LZ4
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression: Disabled
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.prefix_extractor: nullptr
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.num_levels: 7
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.min_write_buffer_number_to_merge: 6
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_write_buffer_number_to_maintain: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_write_buffer_size_to_maintain: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression_opts.window_bits: -14
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression_opts.level: 32767
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression_opts.strategy: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression_opts.enabled: false
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compression_opts.window_bits: -14
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compression_opts.level: 32767
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compression_opts.strategy: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compression_opts.max_dict_bytes: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compression_opts.parallel_threads: 1
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compression_opts.enabled: false
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0
Oct 14 03:52:45 localhost
ceph-osd[32440]: rocksdb: Options.level0_file_num_compaction_trigger: 8
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.level0_slowdown_writes_trigger: 20
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.level0_stop_writes_trigger: 36
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.target_file_size_base: 67108864
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.target_file_size_multiplier: 1
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_bytes_for_level_base: 1073741824
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_sequential_skip_in_iterations: 8
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_compaction_bytes: 1677721600
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.arena_block_size: 1048576
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.disable_auto_compactions: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_style: kCompactionStyleLevel
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_pri: kMinOverlappingRatio
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.inplace_update_support: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.inplace_update_num_locks: 10000
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.memtable_whole_key_filtering: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.memtable_huge_page_size: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bloom_locality: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_successive_merges: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.optimize_filters_for_hits: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.paranoid_file_checks: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.force_consistency_checks: 1
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.report_bg_io_stats: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.ttl: 2592000
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.periodic_compaction_seconds: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.preclude_last_level_data_seconds: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.preserve_internal_time_seconds: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.enable_blob_files: false
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.min_blob_size: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.blob_file_size: 268435456
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.blob_compression_type: NoCompression
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.enable_blob_garbage_collection: false
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.blob_compaction_readahead_size: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.blob_file_starting_level: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 10, name: L)
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: [db/column_family.cc:635] #011(skipping printing options)
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 11, name: P)
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: [db/column_family.cc:635] #011(skipping printing options)
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:db/MANIFEST-000032 succeeded,manifest_file_number is 32, next_file_number is 34, last_sequence is 12, log_number is 5,prev_log_number is 0,max_column_family is 11,min_log_number_to_keep is 5
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 5
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: [db/version_set.cc:5581] Column family [m-0] (ID 1), log number is 5
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: [db/version_set.cc:5581] Column family [m-1] (ID 2), log number is 5
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: [db/version_set.cc:5581] Column family [m-2] (ID 3), log number is 5
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: [db/version_set.cc:5581] Column family [p-0] (ID 4), log number is 5
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: [db/version_set.cc:5581] Column family [p-1] (ID 5), log number is 5
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: [db/version_set.cc:5581] Column family [p-2] (ID 6), log number is 5
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: [db/version_set.cc:5581] Column family [O-0] (ID 7), log number is 5
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: [db/version_set.cc:5581] Column family [O-1] (ID 8), log number is 5
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: [db/version_set.cc:5581] Column family [O-2] (ID 9), log number is 5
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: [db/version_set.cc:5581] Column family [L] (ID 10), log number is 5
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: [db/version_set.cc:5581] Column family [P] (ID 11),
log number is 5
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: 2f47c283-c2dc-4450-bf01-44670425f63a
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760428365190366, "job": 1, "event": "recovery_started", "wal_files": [31]}
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #31 mode 2
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760428365190846, "job": 1, "event": "recovery_finished"}
Oct 14 03:52:45 localhost ceph-osd[32440]: bluestore(/var/lib/ceph/osd/ceph-3) _open_db opened rocksdb path db options compression=kLZ4Compression,max_write_buffer_number=64,min_write_buffer_number_to_merge=6,compaction_style=kCompactionStyleLevel,write_buffer_size=16777216,max_background_jobs=4,level0_file_num_compaction_trigger=8,max_bytes_for_level_base=1073741824,max_bytes_for_level_multiplier=8,compaction_readahead_size=2MB,max_total_wal_size=1073741824,writable_file_max_buffer_size=0
Oct 14 03:52:45 localhost ceph-osd[32440]: bluestore(/var/lib/ceph/osd/ceph-3) _open_super_meta old nid_max 1025
Oct 14 03:52:45 localhost ceph-osd[32440]: bluestore(/var/lib/ceph/osd/ceph-3) _open_super_meta old blobid_max 10240
Oct 14 03:52:45 localhost ceph-osd[32440]: bluestore(/var/lib/ceph/osd/ceph-3) _open_super_meta ondisk_format 4 compat_ondisk_format 3
Oct 14 03:52:45 localhost ceph-osd[32440]: bluestore(/var/lib/ceph/osd/ceph-3) _open_super_meta min_alloc_size 0x1000
Oct 14 03:52:45 localhost ceph-osd[32440]: freelist init
Oct 14 03:52:45 localhost ceph-osd[32440]: freelist _read_cfg
Oct 14 03:52:45 localhost ceph-osd[32440]: bluestore(/var/lib/ceph/osd/ceph-3) _init_alloc loaded 7.0 GiB in 2 extents, allocator type hybrid, capacity 0x1bfc00000, block size 0x1000, free 0x1bfbfd000, fragmentation 5.5e-07
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: [db/db_impl/db_impl.cc:496] Shutdown: canceling all background work
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: [db/db_impl/db_impl.cc:704] Shutdown complete
Oct 14 03:52:45 localhost ceph-osd[32440]: bluefs umount
Oct 14 03:52:45 localhost ceph-osd[32440]: bdev(0x55b0d2325180 /var/lib/ceph/osd/ceph-3/block) close
Oct 14 03:52:45 localhost podman[32705]: 2025-10-14 07:52:45.277933232 +0000 UTC m=+0.080434869 container remove c70a75485f6fdde862728a92f5928320af77c7b6b556aee8e1236a45e492f71a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=xenodochial_shtern, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/agreements, release=553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-09-24T08:57:55, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, RELEASE=main, version=7, GIT_CLEAN=True, com.redhat.component=rhceph-container, distribution-scope=public, io.buildah.version=1.33.12, architecture=x86_64, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux )
Oct 14 03:52:45 localhost systemd[1]: libpod-conmon-c70a75485f6fdde862728a92f5928320af77c7b6b556aee8e1236a45e492f71a.scope: Deactivated successfully.
Oct 14 03:52:45 localhost ceph-osd[32440]: bdev(0x55b0d2325180 /var/lib/ceph/osd/ceph-3/block) open path /var/lib/ceph/osd/ceph-3/block
Oct 14 03:52:45 localhost ceph-osd[32440]: bdev(0x55b0d2325180 /var/lib/ceph/osd/ceph-3/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-3/block failed: (22) Invalid argument
Oct 14 03:52:45 localhost ceph-osd[32440]: bdev(0x55b0d2325180 /var/lib/ceph/osd/ceph-3/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Oct 14 03:52:45 localhost ceph-osd[32440]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-3/block size 7.0 GiB
Oct 14 03:52:45 localhost ceph-osd[32440]: bluefs mount
Oct 14 03:52:45 localhost ceph-osd[32440]: bluefs _init_alloc shared, id 1, capacity 0x1bfc00000, block size 0x10000
Oct 14 03:52:45 localhost ceph-osd[32440]: bluefs mount shared_bdev_used = 4718592
Oct 14 03:52:45 localhost ceph-osd[32440]: bluestore(/var/lib/ceph/osd/ceph-3) _prepare_db_environment set db_paths to db,7136398540 db.slow,7136398540
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: RocksDB version: 7.9.2
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Git sha 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Compile date 2025-09-23 00:00:00
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: DB SUMMARY
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: DB Session ID: ZK2S4ZHXI3MPRH2BPQT9
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: CURRENT file: CURRENT
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: IDENTITY file: IDENTITY
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: MANIFEST file: MANIFEST-000032 size: 1007 Bytes
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: SST files in db dir, Total Num: 1, files: 000030.sst
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: SST files in db.slow dir, Total Num: 0, files:
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Write Ahead Log file in db.wal: 000031.log size: 5093 ;
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.error_if_exists: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.create_if_missing: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.paranoid_checks: 1
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.flush_verify_memtable_count: 1
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.track_and_verify_wals_in_manifest: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.verify_sst_unique_id_in_manifest: 1
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.env: 0x55b0d25b9ce0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.fs: LegacyFileSystem
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.info_log: 0x55b0d32a55c0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_file_opening_threads: 16
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.statistics: (nil)
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.use_fsync: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_log_file_size: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_manifest_file_size: 1073741824
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.log_file_time_to_roll: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.keep_log_file_num: 1000
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.recycle_log_file_num: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.allow_fallocate: 1
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.allow_mmap_reads: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.allow_mmap_writes: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.use_direct_reads: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.use_direct_io_for_flush_and_compaction: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.create_missing_column_families: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb:
Options.db_log_dir: Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.wal_dir: db.wal Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.table_cache_numshardbits: 6 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.WAL_ttl_seconds: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.WAL_size_limit_MB: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_write_batch_group_size_bytes: 1048576 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.manifest_preallocation_size: 4194304 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.is_fd_close_on_exec: 1 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.advise_random_on_open: 1 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.db_write_buffer_size: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.write_buffer_manager: 0x55b0d230f540 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.access_hint_on_compaction_start: 1 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.random_access_max_buffer_size: 1048576 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.use_adaptive_mutex: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.rate_limiter: (nil) Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.sst_file_manager.rate_bytes_per_sec: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.wal_recovery_mode: 2 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.enable_thread_tracking: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.enable_pipelined_write: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.unordered_write: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.allow_concurrent_memtable_write: 1 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.enable_write_thread_adaptive_yield: 1 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.write_thread_max_yield_usec: 100 Oct 14 03:52:45 localhost 
ceph-osd[32440]: rocksdb: Options.write_thread_slow_yield_usec: 3 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.row_cache: None Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.wal_filter: None Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.avoid_flush_during_recovery: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.allow_ingest_behind: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.two_write_queues: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.manual_wal_flush: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.wal_compression: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.atomic_flush: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.avoid_unnecessary_blocking_io: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.persist_stats_to_disk: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.write_dbid_to_manifest: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.log_readahead_size: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.file_checksum_gen_factory: Unknown Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.best_efforts_recovery: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_bgerror_resume_count: 2147483647 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bgerror_resume_retry_interval: 1000000 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.allow_data_in_errors: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.db_host_id: __hostname__ Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.enforce_single_del_contracts: true Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_background_jobs: 4 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_background_compactions: -1 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_subcompactions: 1 Oct 14 03:52:45 localhost ceph-osd[32440]: 
rocksdb: Options.avoid_flush_during_shutdown: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.writable_file_max_buffer_size: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.delayed_write_rate : 16777216 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_total_wal_size: 1073741824 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.delete_obsolete_files_period_micros: 21600000000 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.stats_dump_period_sec: 600 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.stats_persist_period_sec: 600 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.stats_history_buffer_size: 1048576 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_open_files: -1 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bytes_per_sync: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.wal_bytes_per_sync: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.strict_bytes_per_sync: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_readahead_size: 2097152 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_background_flushes: -1 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Compression algorithms supported: Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: #011kZSTD supported: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: #011kXpressCompression supported: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: #011kBZip2Compression supported: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: #011kZSTDNotFinalCompression supported: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: #011kLZ4Compression supported: 1 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: #011kZlibCompression supported: 1 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: #011kLZ4HCCompression supported: 1 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: #011kSnappyCompression supported: 1 Oct 14 
03:52:45 localhost ceph-osd[32440]: rocksdb: Fast CRC32 supported: Supported on x86 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: DMutex implementation: pthread_mutex_t Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: db/MANIFEST-000032 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 0, name: default) Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]: Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.comparator: leveldb.BytewiseComparator Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.merge_operator: .T:int64_array.b:bitwise_xor Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_filter: None Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_filter_factory: None Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.sst_partitioner_factory: None Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.memtable_factory: SkipListFactory Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.table_factory: BlockBasedTable Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55b0d3376f40)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x55b0d22fc2d0#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 
block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.write_buffer_size: 16777216 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_write_buffer_number: 64 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compression: LZ4 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression: Disabled Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.prefix_extractor: nullptr Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.num_levels: 7 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.min_write_buffer_number_to_merge: 6 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression_opts.level: 32767 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression_opts.strategy: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Oct 14 03:52:45 localhost 
ceph-osd[32440]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression_opts.enabled: false Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compression_opts.window_bits: -14 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compression_opts.level: 32767 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compression_opts.strategy: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compression_opts.max_dict_bytes: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compression_opts.parallel_threads: 1 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compression_opts.enabled: false Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.level0_file_num_compaction_trigger: 8 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.level0_slowdown_writes_trigger: 20 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.level0_stop_writes_trigger: 36 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.target_file_size_base: 67108864 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.target_file_size_multiplier: 1 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_bytes_for_level_base: 1073741824 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: 
Options.max_bytes_for_level_multiplier: 8.000000 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_sequential_skip_in_iterations: 8 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_compaction_bytes: 1677721600 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.arena_block_size: 1048576 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.disable_auto_compactions: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_style: kCompactionStyleLevel Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_options_universal.size_ratio: 1 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Oct 14 03:52:45 
localhost ceph-osd[32440]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.table_properties_collectors: Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.inplace_update_support: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.inplace_update_num_locks: 10000 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.memtable_whole_key_filtering: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.memtable_huge_page_size: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bloom_locality: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_successive_merges: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.optimize_filters_for_hits: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.paranoid_file_checks: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.force_consistency_checks: 1 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.report_bg_io_stats: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.ttl: 2592000 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.periodic_compaction_seconds: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.preclude_last_level_data_seconds: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.preserve_internal_time_seconds: 0 Oct 14 
03:52:45 localhost ceph-osd[32440]: rocksdb: Options.enable_blob_files: false Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.min_blob_size: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.blob_file_size: 268435456 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.blob_compression_type: NoCompression Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.enable_blob_garbage_collection: false Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.blob_compaction_readahead_size: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.blob_file_starting_level: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 1, name: m-0) Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-0]: Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.comparator: leveldb.BytewiseComparator Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.merge_operator: None Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_filter: None Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_filter_factory: None Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.sst_partitioner_factory: None Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.memtable_factory: SkipListFactory Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.table_factory: BlockBasedTable Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55b0d3376f40)#012 
cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x55b0d22fc2d0#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.write_buffer_size: 16777216 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_write_buffer_number: 64 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compression: LZ4 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression: Disabled Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.prefix_extractor: nullptr Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.num_levels: 7 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.min_write_buffer_number_to_merge: 6 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: 
Options.max_write_buffer_size_to_maintain: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression_opts.level: 32767 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression_opts.strategy: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression_opts.enabled: false Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compression_opts.window_bits: -14 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compression_opts.level: 32767 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compression_opts.strategy: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compression_opts.max_dict_bytes: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compression_opts.parallel_threads: 1 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compression_opts.enabled: false Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.level0_file_num_compaction_trigger: 8 Oct 14 03:52:45 localhost 
ceph-osd[32440]: rocksdb: Options.level0_slowdown_writes_trigger: 20 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.level0_stop_writes_trigger: 36 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.target_file_size_base: 67108864 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.target_file_size_multiplier: 1 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_bytes_for_level_base: 1073741824 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_sequential_skip_in_iterations: 8 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_compaction_bytes: 1677721600 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.arena_block_size: 1048576 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Oct 14 03:52:45 localhost 
ceph-osd[32440]: rocksdb: Options.disable_auto_compactions: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_style: kCompactionStyleLevel Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_options_universal.size_ratio: 1 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0); Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.inplace_update_support: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.inplace_update_num_locks: 10000 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.memtable_whole_key_filtering: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.memtable_huge_page_size: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bloom_locality: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_successive_merges: 0 Oct 
14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.optimize_filters_for_hits: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.paranoid_file_checks: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.force_consistency_checks: 1 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.report_bg_io_stats: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.ttl: 2592000 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.periodic_compaction_seconds: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.preclude_last_level_data_seconds: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.preserve_internal_time_seconds: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.enable_blob_files: false Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.min_blob_size: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.blob_file_size: 268435456 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.blob_compression_type: NoCompression Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.enable_blob_garbage_collection: false Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.blob_compaction_readahead_size: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.blob_file_starting_level: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 2, name: m-1) Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-1]: Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.comparator: 
leveldb.BytewiseComparator
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.merge_operator: None
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_filter: None
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_filter_factory: None
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.sst_partitioner_factory: None
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.memtable_factory: SkipListFactory
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.table_factory: BlockBasedTable
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55b0d3376f40)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x55b0d22fc2d0#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.write_buffer_size: 16777216
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_write_buffer_number: 64
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compression: LZ4
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression: Disabled
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.prefix_extractor: nullptr
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.num_levels: 7
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.min_write_buffer_number_to_merge: 6
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_write_buffer_number_to_maintain: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_write_buffer_size_to_maintain: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression_opts.window_bits: -14
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression_opts.level: 32767
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression_opts.strategy: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression_opts.enabled: false
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compression_opts.window_bits: -14
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compression_opts.level: 32767
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compression_opts.strategy: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compression_opts.max_dict_bytes: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compression_opts.parallel_threads: 1
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compression_opts.enabled: false
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.level0_file_num_compaction_trigger: 8
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.level0_slowdown_writes_trigger: 20
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.level0_stop_writes_trigger: 36
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.target_file_size_base: 67108864
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.target_file_size_multiplier: 1
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_bytes_for_level_base: 1073741824
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_sequential_skip_in_iterations: 8
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_compaction_bytes: 1677721600
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.arena_block_size: 1048576
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.disable_auto_compactions: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_style: kCompactionStyleLevel
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_pri: kMinOverlappingRatio
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.inplace_update_support: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.inplace_update_num_locks: 10000
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.memtable_whole_key_filtering: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.memtable_huge_page_size: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bloom_locality: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_successive_merges: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.optimize_filters_for_hits: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.paranoid_file_checks: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.force_consistency_checks: 1
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.report_bg_io_stats: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.ttl: 2592000
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.periodic_compaction_seconds: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.preclude_last_level_data_seconds: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.preserve_internal_time_seconds: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.enable_blob_files: false
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.min_blob_size: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.blob_file_size: 268435456
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.blob_compression_type: NoCompression
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.enable_blob_garbage_collection: false
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.blob_compaction_readahead_size: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.blob_file_starting_level: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 3, name: m-2)
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-2]:
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.comparator: leveldb.BytewiseComparator
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.merge_operator: None
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_filter: None
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_filter_factory: None
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.sst_partitioner_factory: None
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.memtable_factory: SkipListFactory
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.table_factory: BlockBasedTable
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55b0d3376f40)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x55b0d22fc2d0#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.write_buffer_size: 16777216
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_write_buffer_number: 64
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compression: LZ4
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression: Disabled
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.prefix_extractor: nullptr
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.num_levels: 7
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.min_write_buffer_number_to_merge: 6
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_write_buffer_number_to_maintain: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_write_buffer_size_to_maintain: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression_opts.window_bits: -14
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression_opts.level: 32767
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression_opts.strategy: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression_opts.enabled: false
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compression_opts.window_bits: -14
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compression_opts.level: 32767
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compression_opts.strategy: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compression_opts.max_dict_bytes: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compression_opts.parallel_threads: 1
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compression_opts.enabled: false
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.level0_file_num_compaction_trigger: 8
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.level0_slowdown_writes_trigger: 20
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.level0_stop_writes_trigger: 36
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.target_file_size_base: 67108864
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.target_file_size_multiplier: 1
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_bytes_for_level_base: 1073741824
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_sequential_skip_in_iterations: 8
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_compaction_bytes: 1677721600
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.arena_block_size: 1048576
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.disable_auto_compactions: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_style: kCompactionStyleLevel
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_pri: kMinOverlappingRatio
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.inplace_update_support: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.inplace_update_num_locks: 10000
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.memtable_whole_key_filtering: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.memtable_huge_page_size: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bloom_locality: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_successive_merges: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.optimize_filters_for_hits: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.paranoid_file_checks: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.force_consistency_checks: 1
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.report_bg_io_stats: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.ttl: 2592000
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.periodic_compaction_seconds: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.preclude_last_level_data_seconds: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.preserve_internal_time_seconds: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.enable_blob_files: false
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.min_blob_size: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.blob_file_size: 268435456
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.blob_compression_type: NoCompression
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.enable_blob_garbage_collection: false
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.blob_compaction_readahead_size: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.blob_file_starting_level: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 4, name: p-0)
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-0]:
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.comparator: leveldb.BytewiseComparator
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.merge_operator: None
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_filter: None
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_filter_factory: None
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.sst_partitioner_factory: None
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.memtable_factory: SkipListFactory
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.table_factory: BlockBasedTable
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55b0d3376f40)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x55b0d22fc2d0#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.write_buffer_size: 16777216
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_write_buffer_number: 64
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compression: LZ4
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression: Disabled
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.prefix_extractor: nullptr
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.num_levels: 7
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.min_write_buffer_number_to_merge: 6
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_write_buffer_number_to_maintain: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_write_buffer_size_to_maintain: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression_opts.window_bits: -14
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression_opts.level: 32767
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression_opts.strategy: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression_opts.enabled: false
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compression_opts.window_bits: -14
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compression_opts.level: 32767
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compression_opts.strategy: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compression_opts.max_dict_bytes: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compression_opts.parallel_threads: 1
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compression_opts.enabled: false
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.level0_file_num_compaction_trigger: 8
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.level0_slowdown_writes_trigger: 20
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.level0_stop_writes_trigger: 36
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.target_file_size_base: 67108864
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.target_file_size_multiplier: 1
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_bytes_for_level_base: 1073741824
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_sequential_skip_in_iterations: 8
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_compaction_bytes: 1677721600
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.arena_block_size: 1048576
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.disable_auto_compactions: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_style: kCompactionStyleLevel
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_pri: kMinOverlappingRatio
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.inplace_update_support: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.inplace_update_num_locks: 10000
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.memtable_whole_key_filtering: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.memtable_huge_page_size: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bloom_locality: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_successive_merges: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.optimize_filters_for_hits: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.paranoid_file_checks: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.force_consistency_checks: 1
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.report_bg_io_stats: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.ttl: 2592000
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.periodic_compaction_seconds: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.preclude_last_level_data_seconds: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.preserve_internal_time_seconds: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.enable_blob_files: false
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.min_blob_size: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.blob_file_size: 268435456
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.blob_compression_type: NoCompression
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.enable_blob_garbage_collection: false
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.blob_compaction_readahead_size: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.blob_file_starting_level: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 5, name: p-1)
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-1]:
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.comparator: leveldb.BytewiseComparator
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.merge_operator: None
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_filter: None
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_filter_factory: None
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.sst_partitioner_factory: None
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.memtable_factory: SkipListFactory
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.table_factory: BlockBasedTable
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55b0d3376f40)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x55b0d22fc2d0#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.write_buffer_size: 16777216
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_write_buffer_number: 64
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compression: LZ4
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression: Disabled
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.prefix_extractor: nullptr
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.num_levels: 7
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.min_write_buffer_number_to_merge: 6
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_write_buffer_number_to_maintain: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_write_buffer_size_to_maintain: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression_opts.window_bits: -14
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression_opts.level: 32767
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression_opts.strategy: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression_opts.enabled: false
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compression_opts.window_bits: -14
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compression_opts.level: 32767
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compression_opts.strategy: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compression_opts.max_dict_bytes: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compression_opts.parallel_threads: 1
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compression_opts.enabled: false
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.level0_file_num_compaction_trigger: 8
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.level0_slowdown_writes_trigger: 20
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.level0_stop_writes_trigger: 36
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.target_file_size_base: 67108864
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.target_file_size_multiplier: 1
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_bytes_for_level_base: 1073741824
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_sequential_skip_in_iterations: 8
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_compaction_bytes: 1677721600
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.arena_block_size: 1048576
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.disable_auto_compactions: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_style: kCompactionStyleLevel
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_pri: kMinOverlappingRatio
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.inplace_update_support: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.inplace_update_num_locks: 10000
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.memtable_whole_key_filtering: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.memtable_huge_page_size: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bloom_locality: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_successive_merges: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.optimize_filters_for_hits: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.paranoid_file_checks: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.force_consistency_checks: 1
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.report_bg_io_stats: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.ttl: 2592000
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.periodic_compaction_seconds: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.preclude_last_level_data_seconds: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.preserve_internal_time_seconds: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.enable_blob_files: false
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.min_blob_size: 0
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.blob_file_size: 268435456
Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: 
Options.blob_compression_type: NoCompression Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.enable_blob_garbage_collection: false Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.blob_compaction_readahead_size: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.blob_file_starting_level: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 6, name: p-2) Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-2]: Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.comparator: leveldb.BytewiseComparator Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.merge_operator: None Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_filter: None Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_filter_factory: None Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.sst_partitioner_factory: None Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.memtable_factory: SkipListFactory Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.table_factory: BlockBasedTable Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55b0d3376f40)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 
checksum: 4#012 no_block_cache: 0#012 block_cache: 0x55b0d22fc2d0#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.write_buffer_size: 16777216 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_write_buffer_number: 64 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compression: LZ4 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression: Disabled Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.prefix_extractor: nullptr Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.num_levels: 7 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.min_write_buffer_number_to_merge: 6 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression_opts.level: 32767 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: 
Options.bottommost_compression_opts.strategy: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression_opts.enabled: false Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compression_opts.window_bits: -14 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compression_opts.level: 32767 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compression_opts.strategy: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compression_opts.max_dict_bytes: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compression_opts.parallel_threads: 1 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compression_opts.enabled: false Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.level0_file_num_compaction_trigger: 8 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.level0_slowdown_writes_trigger: 20 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.level0_stop_writes_trigger: 36 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.target_file_size_base: 67108864 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: 
Options.target_file_size_multiplier: 1 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_bytes_for_level_base: 1073741824 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_sequential_skip_in_iterations: 8 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_compaction_bytes: 1677721600 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.arena_block_size: 1048576 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.disable_auto_compactions: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_style: kCompactionStyleLevel Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: 
Options.compaction_options_universal.size_ratio: 1 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0); Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.inplace_update_support: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.inplace_update_num_locks: 10000 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.memtable_whole_key_filtering: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.memtable_huge_page_size: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bloom_locality: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_successive_merges: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.optimize_filters_for_hits: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.paranoid_file_checks: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.force_consistency_checks: 1 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: 
Options.report_bg_io_stats: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.ttl: 2592000 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.periodic_compaction_seconds: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.preclude_last_level_data_seconds: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.preserve_internal_time_seconds: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.enable_blob_files: false Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.min_blob_size: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.blob_file_size: 268435456 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.blob_compression_type: NoCompression Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.enable_blob_garbage_collection: false Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.blob_compaction_readahead_size: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.blob_file_starting_level: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 7, name: O-0) Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-0]: Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.comparator: leveldb.BytewiseComparator Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.merge_operator: None Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_filter: None Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_filter_factory: None Oct 14 03:52:45 localhost ceph-osd[32440]: 
rocksdb: Options.sst_partitioner_factory: None Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.memtable_factory: SkipListFactory Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.table_factory: BlockBasedTable Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55b0d3377240)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x55b0d22fd610#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 536870912#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.write_buffer_size: 16777216 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_write_buffer_number: 64 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compression: LZ4 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression: Disabled Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.prefix_extractor: nullptr Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: 
Options.memtable_insert_with_hint_prefix_extractor: nullptr Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.num_levels: 7 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.min_write_buffer_number_to_merge: 6 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression_opts.level: 32767 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression_opts.strategy: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression_opts.enabled: false Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compression_opts.window_bits: -14 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compression_opts.level: 32767 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compression_opts.strategy: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compression_opts.max_dict_bytes: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Oct 14 03:52:45 localhost 
ceph-osd[32440]: rocksdb: Options.compression_opts.parallel_threads: 1 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compression_opts.enabled: false Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.level0_file_num_compaction_trigger: 8 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.level0_slowdown_writes_trigger: 20 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.level0_stop_writes_trigger: 36 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.target_file_size_base: 67108864 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.target_file_size_multiplier: 1 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_bytes_for_level_base: 1073741824 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_sequential_skip_in_iterations: 8 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_compaction_bytes: 1677721600 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: 
Options.ignore_max_compaction_bytes_for_input: true Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.arena_block_size: 1048576 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.disable_auto_compactions: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_style: kCompactionStyleLevel Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_options_universal.size_ratio: 1 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0); Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.inplace_update_support: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.inplace_update_num_locks: 10000 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: 
Options.memtable_prefix_bloom_size_ratio: 0.000000 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.memtable_whole_key_filtering: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.memtable_huge_page_size: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bloom_locality: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_successive_merges: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.optimize_filters_for_hits: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.paranoid_file_checks: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.force_consistency_checks: 1 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.report_bg_io_stats: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.ttl: 2592000 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.periodic_compaction_seconds: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.preclude_last_level_data_seconds: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.preserve_internal_time_seconds: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.enable_blob_files: false Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.min_blob_size: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.blob_file_size: 268435456 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.blob_compression_type: NoCompression Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.enable_blob_garbage_collection: false Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.blob_compaction_readahead_size: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.blob_file_starting_level: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: 
Options.experimental_mempurge_threshold: 0.000000 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 8, name: O-1) Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-1]: Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.comparator: leveldb.BytewiseComparator Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.merge_operator: None Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_filter: None Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_filter_factory: None Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.sst_partitioner_factory: None Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.memtable_factory: SkipListFactory Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.table_factory: BlockBasedTable Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55b0d3377240)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x55b0d22fd610#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 536870912#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 
read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.write_buffer_size: 16777216 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_write_buffer_number: 64 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compression: LZ4 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression: Disabled Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.prefix_extractor: nullptr Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.num_levels: 7 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.min_write_buffer_number_to_merge: 6 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression_opts.level: 32767 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression_opts.strategy: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression_opts.enabled: false Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: 
Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compression_opts.window_bits: -14 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compression_opts.level: 32767 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compression_opts.strategy: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compression_opts.max_dict_bytes: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compression_opts.parallel_threads: 1 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compression_opts.enabled: false Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.level0_file_num_compaction_trigger: 8 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.level0_slowdown_writes_trigger: 20 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.level0_stop_writes_trigger: 36 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.target_file_size_base: 67108864 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.target_file_size_multiplier: 1 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_bytes_for_level_base: 1073741824 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: 
Options.max_bytes_for_level_multiplier_addtl[1]: 1 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_sequential_skip_in_iterations: 8 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_compaction_bytes: 1677721600 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.arena_block_size: 1048576 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.disable_auto_compactions: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_style: kCompactionStyleLevel Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_options_universal.size_ratio: 1 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: 
Options.compaction_options_universal.compression_size_percent: -1 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0); Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.inplace_update_support: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.inplace_update_num_locks: 10000 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.memtable_whole_key_filtering: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.memtable_huge_page_size: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bloom_locality: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_successive_merges: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.optimize_filters_for_hits: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.paranoid_file_checks: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.force_consistency_checks: 1 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.report_bg_io_stats: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.ttl: 2592000 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.periodic_compaction_seconds: 0 Oct 14 03:52:45 localhost podman[32771]: Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.preclude_last_level_data_seconds: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.preserve_internal_time_seconds: 0 Oct 14 03:52:45 localhost 
ceph-osd[32440]: rocksdb: Options.enable_blob_files: false Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.min_blob_size: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.blob_file_size: 268435456 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.blob_compression_type: NoCompression Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.enable_blob_garbage_collection: false Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.blob_compaction_readahead_size: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.blob_file_starting_level: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 9, name: O-2) Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-2]: Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.comparator: leveldb.BytewiseComparator Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.merge_operator: None Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_filter: None Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_filter_factory: None Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.sst_partitioner_factory: None Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.memtable_factory: SkipListFactory Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.table_factory: BlockBasedTable Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55b0d3377240)#012 
cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x55b0d22fd610#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 536870912#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.write_buffer_size: 16777216 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_write_buffer_number: 64 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compression: LZ4 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression: Disabled Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.prefix_extractor: nullptr Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.num_levels: 7 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.min_write_buffer_number_to_merge: 6 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: 
Options.max_write_buffer_size_to_maintain: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression_opts.level: 32767 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression_opts.strategy: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression_opts.enabled: false Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compression_opts.window_bits: -14 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compression_opts.level: 32767 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compression_opts.strategy: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compression_opts.max_dict_bytes: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compression_opts.parallel_threads: 1 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compression_opts.enabled: false Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.level0_file_num_compaction_trigger: 8 Oct 14 03:52:45 localhost 
ceph-osd[32440]: rocksdb: Options.level0_slowdown_writes_trigger: 20 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.level0_stop_writes_trigger: 36 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.target_file_size_base: 67108864 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.target_file_size_multiplier: 1 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_bytes_for_level_base: 1073741824 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_sequential_skip_in_iterations: 8 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_compaction_bytes: 1677721600 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.arena_block_size: 1048576 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Oct 14 03:52:45 localhost 
ceph-osd[32440]: rocksdb: Options.disable_auto_compactions: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_style: kCompactionStyleLevel Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_options_universal.size_ratio: 1 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0); Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.inplace_update_support: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.inplace_update_num_locks: 10000 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.memtable_whole_key_filtering: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.memtable_huge_page_size: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.bloom_locality: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.max_successive_merges: 0 Oct 
14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.optimize_filters_for_hits: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.paranoid_file_checks: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.force_consistency_checks: 1 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.report_bg_io_stats: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.ttl: 2592000 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.periodic_compaction_seconds: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.preclude_last_level_data_seconds: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.preserve_internal_time_seconds: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.enable_blob_files: false Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.min_blob_size: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.blob_file_size: 268435456 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.blob_compression_type: NoCompression Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.enable_blob_garbage_collection: false Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.blob_compaction_readahead_size: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.blob_file_starting_level: 0 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 10, name: L) Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: [db/column_family.cc:635] #011(skipping printing options) Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: [db/column_family.cc:578] Failed to register data paths of 
column family (id: 11, name: P) Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: [db/column_family.cc:635] #011(skipping printing options) Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:db/MANIFEST-000032 succeeded,manifest_file_number is 32, next_file_number is 34, last_sequence is 12, log_number is 5,prev_log_number is 0,max_column_family is 11,min_log_number_to_keep is 5 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 5 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: [db/version_set.cc:5581] Column family [m-0] (ID 1), log number is 5 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: [db/version_set.cc:5581] Column family [m-1] (ID 2), log number is 5 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: [db/version_set.cc:5581] Column family [m-2] (ID 3), log number is 5 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: [db/version_set.cc:5581] Column family [p-0] (ID 4), log number is 5 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: [db/version_set.cc:5581] Column family [p-1] (ID 5), log number is 5 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: [db/version_set.cc:5581] Column family [p-2] (ID 6), log number is 5 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: [db/version_set.cc:5581] Column family [O-0] (ID 7), log number is 5 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: [db/version_set.cc:5581] Column family [O-1] (ID 8), log number is 5 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: [db/version_set.cc:5581] Column family [O-2] (ID 9), log number is 5 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: [db/version_set.cc:5581] Column family [L] (ID 10), log number is 5 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: [db/version_set.cc:5581] Column family [P] (ID 11), log number is 5 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: 
2f47c283-c2dc-4450-bf01-44670425f63a Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760428365470329, "job": 1, "event": "recovery_started", "wal_files": [31]} Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #31 mode 2 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760428365474516, "cf_name": "default", "job": 1, "event": "table_file_creation", "file_number": 35, "file_size": 1261, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 13, "largest_seqno": 21, "table_properties": {"data_size": 128, "index_size": 27, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 87, "raw_average_key_size": 17, "raw_value_size": 82, "raw_average_value_size": 16, "num_data_blocks": 1, "num_entries": 5, "num_filter_entries": 5, "num_deletions": 0, "num_merge_operands": 2, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": ".T:int64_array.b:bitwise_xor", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1760428365, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2f47c283-c2dc-4450-bf01-44670425f63a", "db_session_id": "ZK2S4ZHXI3MPRH2BPQT9", "orig_file_number": 35, "seqno_to_time_mapping": "N/A"}} Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760428365478165, "cf_name": "p-0", "job": 1, "event": "table_file_creation", "file_number": 
36, "file_size": 1607, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 14, "largest_seqno": 15, "table_properties": {"data_size": 466, "index_size": 39, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 72, "raw_average_key_size": 36, "raw_value_size": 567, "raw_average_value_size": 283, "num_data_blocks": 1, "num_entries": 2, "num_filter_entries": 2, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "p-0", "column_family_id": 4, "comparator": "leveldb.BytewiseComparator", "merge_operator": "nullptr", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1760428365, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2f47c283-c2dc-4450-bf01-44670425f63a", "db_session_id": "ZK2S4ZHXI3MPRH2BPQT9", "orig_file_number": 36, "seqno_to_time_mapping": "N/A"}} Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760428365481717, "cf_name": "O-2", "job": 1, "event": "table_file_creation", "file_number": 37, "file_size": 1290, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 16, "largest_seqno": 16, "table_properties": {"data_size": 121, "index_size": 64, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 55, "raw_average_key_size": 55, "raw_value_size": 50, "raw_average_value_size": 50, "num_data_blocks": 1, "num_entries": 1, "num_filter_entries": 1, 
"num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "O-2", "column_family_id": 9, "comparator": "leveldb.BytewiseComparator", "merge_operator": "nullptr", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1760428365, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2f47c283-c2dc-4450-bf01-44670425f63a", "db_session_id": "ZK2S4ZHXI3MPRH2BPQT9", "orig_file_number": 37, "seqno_to_time_mapping": "N/A"}} Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: [db/db_impl/db_impl_open.cc:1432] Failed to truncate log #31: IO error: No such file or directory: While open a file for appending: db.wal/000031.log: No such file or directory Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760428365485212, "job": 1, "event": "recovery_finished"} Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: [db/version_set.cc:5047] Creating manifest 40 Oct 14 03:52:45 localhost podman[32771]: 2025-10-14 07:52:45.491529487 +0000 UTC m=+0.082843103 container create 6411164060b4add6e1522b23adb4ffb00b2e708389aaa0c28667273406341597 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=mystifying_blackwell, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_CLEAN=True, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, name=rhceph, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, release=553, 
com.redhat.component=rhceph-container, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, maintainer=Guillaume Abrioux , ceph=True, version=7, io.buildah.version=1.33.12, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, architecture=x86_64) Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: [db/db_impl/db_impl_open.cc:1987] SstFileManager instance 0x55b0d338a380 Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: DB pointer 0x55b0d31fba00 Oct 14 03:52:45 localhost ceph-osd[32440]: bluestore(/var/lib/ceph/osd/ceph-3) _open_db opened rocksdb path db options compression=kLZ4Compression,max_write_buffer_number=64,min_write_buffer_number_to_merge=6,compaction_style=kCompactionStyleLevel,write_buffer_size=16777216,max_background_jobs=4,level0_file_num_compaction_trigger=8,max_bytes_for_level_base=1073741824,max_bytes_for_level_multiplier=8,compaction_readahead_size=2MB,max_total_wal_size=1073741824,writable_file_max_buffer_size=0 Oct 14 03:52:45 localhost ceph-osd[32440]: bluestore(/var/lib/ceph/osd/ceph-3) _upgrade_super from 4, latest 4 Oct 14 03:52:45 localhost ceph-osd[32440]: bluestore(/var/lib/ceph/osd/ceph-3) _upgrade_super done Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Oct 14 03:52:45 localhost ceph-osd[32440]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 0.1 total, 0.1 interval#012Cumulative writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 GB, 0.00 MB/s#012Cumulative WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Cumulative 
stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s#012Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 L0 2/0 2.61 KB 0.2 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.3 0.00 0.00 1 0.004 0 0 0.0 0.0#012 Sum 2/0 2.61 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.3 0.00 0.00 1 0.004 0 0 0.0 0.0#012 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.3 0.00 0.00 1 0.004 0 0 0.0 0.0#012#012** Compaction Stats [default] **#012Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012User 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.3 0.00 0.00 1 0.004 0 0 0.0 0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 0.1 total, 0.1 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.02 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 
0.02 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55b0d22fc2d0#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 4.9e-05 secs_since: 0#012Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **#012#012** Compaction Stats [m-0] **#012Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012#012** Compaction Stats [m-0] **#012Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 0.1 total, 0.1 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 
0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55b0d22fc2d0#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 4.9e-05 secs_since: 0#012Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [m-0] **#012#012** Compaction Stats [m-1] **#012Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012#012** Compaction Stats [m-1] **#012Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file 
count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 0.1 total, 0.1 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55b0d22fc2d0#2 capacity: 460.80 MB usag Oct 14 03:52:45 localhost ceph-osd[32440]: /builddir/build/BUILD/ceph-18.2.1/src/cls/cephfs/cls_cephfs.cc:201: loading cephfs Oct 14 03:52:45 localhost systemd[1]: Started libpod-conmon-6411164060b4add6e1522b23adb4ffb00b2e708389aaa0c28667273406341597.scope. Oct 14 03:52:45 localhost ceph-osd[32440]: /builddir/build/BUILD/ceph-18.2.1/src/cls/hello/cls_hello.cc:316: loading cls_hello Oct 14 03:52:45 localhost ceph-osd[32440]: _get_class not permitted to load lua Oct 14 03:52:45 localhost systemd[1]: Started libcrun container. 
Oct 14 03:52:45 localhost ceph-osd[32440]: _get_class not permitted to load sdk Oct 14 03:52:45 localhost ceph-osd[32440]: _get_class not permitted to load test_remote_reads Oct 14 03:52:45 localhost ceph-osd[32440]: osd.3 0 crush map has features 288232575208783872, adjusting msgr requires for clients Oct 14 03:52:45 localhost ceph-osd[32440]: osd.3 0 crush map has features 288232575208783872 was 8705, adjusting msgr requires for mons Oct 14 03:52:45 localhost ceph-osd[32440]: osd.3 0 crush map has features 288232575208783872, adjusting msgr requires for osds Oct 14 03:52:45 localhost ceph-osd[32440]: osd.3 0 check_osdmap_features enabling on-disk ERASURE CODES compat feature Oct 14 03:52:45 localhost ceph-osd[32440]: osd.3 0 load_pgs Oct 14 03:52:45 localhost ceph-osd[32440]: osd.3 0 load_pgs opened 0 pgs Oct 14 03:52:45 localhost ceph-osd[32440]: osd.3 0 log_to_monitors true Oct 14 03:52:45 localhost ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-osd-3[32436]: 2025-10-14T07:52:45.540+0000 7feb8dde1a80 -1 osd.3 0 log_to_monitors true Oct 14 03:52:45 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1e7f5e59c453769cffd4ebe18bf6711386d7335862495fa1d63b6938cc676ed1/merged/rootfs supports timestamps until 2038 (0x7fffffff) Oct 14 03:52:45 localhost podman[32771]: 2025-10-14 07:52:45.45919805 +0000 UTC m=+0.050511746 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Oct 14 03:52:45 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1e7f5e59c453769cffd4ebe18bf6711386d7335862495fa1d63b6938cc676ed1/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff) Oct 14 03:52:45 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1e7f5e59c453769cffd4ebe18bf6711386d7335862495fa1d63b6938cc676ed1/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff) Oct 14 03:52:45 localhost podman[32771]: 2025-10-14 07:52:45.575393817 +0000 UTC m=+0.166707423 
container init 6411164060b4add6e1522b23adb4ffb00b2e708389aaa0c28667273406341597 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=mystifying_blackwell, version=7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, io.buildah.version=1.33.12, description=Red Hat Ceph Storage 7, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-09-24T08:57:55, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/agreements, RELEASE=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, distribution-scope=public, maintainer=Guillaume Abrioux , vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=553, vcs-type=git, GIT_CLEAN=True, ceph=True) Oct 14 03:52:45 localhost podman[32771]: 2025-10-14 07:52:45.582224249 +0000 UTC m=+0.173537855 container start 6411164060b4add6e1522b23adb4ffb00b2e708389aaa0c28667273406341597 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=mystifying_blackwell, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, io.buildah.version=1.33.12, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, release=553, RELEASE=main, architecture=x86_64, version=7, build-date=2025-09-24T08:57:55, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, ceph=True, vcs-type=git, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and 
supported base image., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux , GIT_CLEAN=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container) Oct 14 03:52:45 localhost podman[32771]: 2025-10-14 07:52:45.582375976 +0000 UTC m=+0.173689612 container attach 6411164060b4add6e1522b23adb4ffb00b2e708389aaa0c28667273406341597 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=mystifying_blackwell, maintainer=Guillaume Abrioux , vcs-type=git, CEPH_POINT_RELEASE=, GIT_CLEAN=True, architecture=x86_64, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.buildah.version=1.33.12, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.component=rhceph-container, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-09-24T08:57:55, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=553, vendor=Red Hat, Inc., ceph=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, RELEASE=main) Oct 14 03:52:46 localhost ceph-osd[31500]: osd.0 11 state: booting -> active Oct 14 03:52:46 localhost systemd[1]: tmp-crun.jhHrD4.mount: Deactivated successfully. 
Oct 14 03:52:46 localhost systemd[1]: var-lib-containers-storage-overlay-afd64918196e3b0e2851689cf06db81b18537e1f2a534e93cc6a677f1d42428e-merged.mount: Deactivated successfully.
Oct 14 03:52:46 localhost mystifying_blackwell[32974]: {
Oct 14 03:52:46 localhost mystifying_blackwell[32974]: "65efa31d-b0f9-4ff9-86fe-a4dc8772e327": {
Oct 14 03:52:46 localhost mystifying_blackwell[32974]: "ceph_fsid": "fcadf6e2-9176-5818-a8d0-37b19acf8eaf",
Oct 14 03:52:46 localhost mystifying_blackwell[32974]: "device": "/dev/mapper/ceph_vg1-ceph_lv1",
Oct 14 03:52:46 localhost mystifying_blackwell[32974]: "osd_id": 3,
Oct 14 03:52:46 localhost mystifying_blackwell[32974]: "osd_uuid": "65efa31d-b0f9-4ff9-86fe-a4dc8772e327",
Oct 14 03:52:46 localhost mystifying_blackwell[32974]: "type": "bluestore"
Oct 14 03:52:46 localhost mystifying_blackwell[32974]: },
Oct 14 03:52:46 localhost mystifying_blackwell[32974]: "671c314e-194c-4820-b83c-cca1cfcd5ad7": {
Oct 14 03:52:46 localhost mystifying_blackwell[32974]: "ceph_fsid": "fcadf6e2-9176-5818-a8d0-37b19acf8eaf",
Oct 14 03:52:46 localhost mystifying_blackwell[32974]: "device": "/dev/mapper/ceph_vg0-ceph_lv0",
Oct 14 03:52:46 localhost mystifying_blackwell[32974]: "osd_id": 0,
Oct 14 03:52:46 localhost mystifying_blackwell[32974]: "osd_uuid": "671c314e-194c-4820-b83c-cca1cfcd5ad7",
Oct 14 03:52:46 localhost mystifying_blackwell[32974]: "type": "bluestore"
Oct 14 03:52:46 localhost mystifying_blackwell[32974]: }
Oct 14 03:52:46 localhost mystifying_blackwell[32974]: }
Oct 14 03:52:46 localhost systemd[1]: libpod-6411164060b4add6e1522b23adb4ffb00b2e708389aaa0c28667273406341597.scope: Deactivated successfully.
Oct 14 03:52:46 localhost podman[32771]: 2025-10-14 07:52:46.187414563 +0000 UTC m=+0.778728259 container died 6411164060b4add6e1522b23adb4ffb00b2e708389aaa0c28667273406341597 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=mystifying_blackwell, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, maintainer=Guillaume Abrioux , io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=553, io.openshift.tags=rhceph ceph, version=7, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, ceph=True, RELEASE=main, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.33.12, build-date=2025-09-24T08:57:55, name=rhceph, GIT_BRANCH=main, com.redhat.component=rhceph-container, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, distribution-scope=public) Oct 14 03:52:46 localhost systemd[1]: var-lib-containers-storage-overlay-1e7f5e59c453769cffd4ebe18bf6711386d7335862495fa1d63b6938cc676ed1-merged.mount: Deactivated successfully. 
Oct 14 03:52:46 localhost podman[33039]: 2025-10-14 07:52:46.274006812 +0000 UTC m=+0.077151373 container remove 6411164060b4add6e1522b23adb4ffb00b2e708389aaa0c28667273406341597 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=mystifying_blackwell, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, version=7, description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, io.openshift.tags=rhceph ceph, name=rhceph, vendor=Red Hat, Inc., GIT_CLEAN=True, ceph=True, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, distribution-scope=public, io.openshift.expose-services=, io.buildah.version=1.33.12, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, RELEASE=main, release=553, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-type=git, maintainer=Guillaume Abrioux , org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0) Oct 14 03:52:46 localhost systemd[1]: libpod-conmon-6411164060b4add6e1522b23adb4ffb00b2e708389aaa0c28667273406341597.scope: Deactivated successfully. 
Oct 14 03:52:46 localhost ceph-osd[32440]: log_channel(cluster) log [DBG] : purged_snaps scrub starts
Oct 14 03:52:46 localhost ceph-osd[32440]: log_channel(cluster) log [DBG] : purged_snaps scrub ok
Oct 14 03:52:47 localhost ceph-osd[32440]: osd.3 0 done with init, starting boot process
Oct 14 03:52:47 localhost ceph-osd[32440]: osd.3 0 start_boot
Oct 14 03:52:47 localhost ceph-osd[32440]: osd.3 0 maybe_override_options_for_qos osd_max_backfills set to 1
Oct 14 03:52:47 localhost ceph-osd[32440]: osd.3 0 maybe_override_options_for_qos osd_recovery_max_active set to 0
Oct 14 03:52:47 localhost ceph-osd[32440]: osd.3 0 maybe_override_options_for_qos osd_recovery_max_active_hdd set to 3
Oct 14 03:52:47 localhost ceph-osd[32440]: osd.3 0 maybe_override_options_for_qos osd_recovery_max_active_ssd set to 10
Oct 14 03:52:47 localhost ceph-osd[32440]: osd.3 0 bench count 12288000 bsize 4 KiB
Oct 14 03:52:48 localhost podman[33167]: 2025-10-14 07:52:48.020947955 +0000 UTC m=+0.078608442 container exec b19a91314e045cbe5465f6abdeb6b420108fcda7860b498b01dcd4e65236d2c8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-crash-np0005486733, name=rhceph, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-type=git, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/agreements, RELEASE=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, distribution-scope=public, version=7, io.buildah.version=1.33.12, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.expose-services=, maintainer=Guillaume Abrioux , build-date=2025-09-24T08:57:55, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, ceph=True, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on
RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, release=553, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7) Oct 14 03:52:48 localhost ceph-osd[31500]: osd.0 13 crush map has features 288514051259236352, adjusting msgr requires for clients Oct 14 03:52:48 localhost ceph-osd[31500]: osd.0 13 crush map has features 288514051259236352 was 288514050185503233, adjusting msgr requires for mons Oct 14 03:52:48 localhost ceph-osd[31500]: osd.0 13 crush map has features 3314933000852226048, adjusting msgr requires for osds Oct 14 03:52:48 localhost podman[33167]: 2025-10-14 07:52:48.150627248 +0000 UTC m=+0.208287715 container exec_died b19a91314e045cbe5465f6abdeb6b420108fcda7860b498b01dcd4e65236d2c8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-crash-np0005486733, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, release=553, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux , org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, distribution-scope=public, io.buildah.version=1.33.12, ceph=True, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-09-24T08:57:55, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, version=7, vcs-type=git) Oct 14 03:52:49 localhost ceph-osd[32440]: osd.3 0 
maybe_override_max_osd_capacity_for_qos osd bench result - bandwidth (MiB/sec): 33.321 iops: 8530.117 elapsed_sec: 0.352 Oct 14 03:52:49 localhost ceph-osd[32440]: log_channel(cluster) log [WRN] : OSD bench result of 8530.116850 IOPS is not within the threshold limit range of 50.000000 IOPS and 500.000000 IOPS for osd.3. IOPS capacity is unchanged at 315.000000 IOPS. The recommendation is to establish the osd's IOPS capacity using other benchmark tools (e.g. Fio) and then override osd_mclock_max_capacity_iops_[hdd|ssd]. Oct 14 03:52:49 localhost ceph-osd[32440]: osd.3 0 waiting for initial osdmap Oct 14 03:52:49 localhost ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-osd-3[32436]: 2025-10-14T07:52:49.294+0000 7feb8a575640 -1 osd.3 0 waiting for initial osdmap Oct 14 03:52:49 localhost ceph-osd[32440]: osd.3 14 crush map has features 288514051259236352, adjusting msgr requires for clients Oct 14 03:52:49 localhost ceph-osd[32440]: osd.3 14 crush map has features 288514051259236352 was 288232575208792577, adjusting msgr requires for mons Oct 14 03:52:49 localhost ceph-osd[32440]: osd.3 14 crush map has features 3314933000852226048, adjusting msgr requires for osds Oct 14 03:52:49 localhost ceph-osd[32440]: osd.3 14 check_osdmap_features require_osd_release unknown -> reef Oct 14 03:52:49 localhost ceph-osd[32440]: osd.3 14 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory Oct 14 03:52:49 localhost ceph-osd[32440]: osd.3 14 set_numa_affinity not setting numa affinity Oct 14 03:52:49 localhost ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-osd-3[32436]: 2025-10-14T07:52:49.310+0000 7feb8538a640 -1 osd.3 14 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory Oct 14 03:52:49 localhost ceph-osd[32440]: osd.3 14 _collect_metadata loop4: no unique device id for loop4: fallback method has no model nor serial Oct 14 03:52:50 localhost podman[33368]: Oct 14 03:52:50 localhost podman[33368]: 
2025-10-14 07:52:50.125262622 +0000 UTC m=+0.059258869 container create 0ba1668eb179fa57bcf73474b7ded8bec2e2c2b32b72ff9796ca7dcc2506c4f6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=great_darwin, GIT_CLEAN=True, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, maintainer=Guillaume Abrioux , version=7, ceph=True, release=553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, build-date=2025-09-24T08:57:55, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, vcs-type=git, architecture=x86_64) Oct 14 03:52:50 localhost systemd[1]: Started libpod-conmon-0ba1668eb179fa57bcf73474b7ded8bec2e2c2b32b72ff9796ca7dcc2506c4f6.scope. Oct 14 03:52:50 localhost systemd[1]: Started libcrun container. 
Oct 14 03:52:50 localhost podman[33368]: 2025-10-14 07:52:50.182653691 +0000 UTC m=+0.116649998 container init 0ba1668eb179fa57bcf73474b7ded8bec2e2c2b32b72ff9796ca7dcc2506c4f6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=great_darwin, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, version=7, vcs-type=git, ceph=True, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, GIT_BRANCH=main, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, distribution-scope=public, build-date=2025-09-24T08:57:55, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.33.12, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, maintainer=Guillaume Abrioux , io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/agreements, RELEASE=main, release=553, name=rhceph, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True) Oct 14 03:52:50 localhost systemd[1]: tmp-crun.A7N8Q7.mount: Deactivated successfully. 
Oct 14 03:52:50 localhost podman[33368]: 2025-10-14 07:52:50.191960921 +0000 UTC m=+0.125957168 container start 0ba1668eb179fa57bcf73474b7ded8bec2e2c2b32b72ff9796ca7dcc2506c4f6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=great_darwin, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, release=553, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., name=rhceph, GIT_BRANCH=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, maintainer=Guillaume Abrioux , build-date=2025-09-24T08:57:55, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d) Oct 14 03:52:50 localhost podman[33368]: 2025-10-14 07:52:50.192179421 +0000 UTC m=+0.126175738 container attach 0ba1668eb179fa57bcf73474b7ded8bec2e2c2b32b72ff9796ca7dcc2506c4f6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=great_darwin, release=553, ceph=True, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, maintainer=Guillaume Abrioux , GIT_CLEAN=True, RELEASE=main, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.33.12, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, version=7, io.k8s.description=Red Hat Ceph Storage 7, 
io.openshift.expose-services=, vendor=Red Hat, Inc., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-type=git, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, build-date=2025-09-24T08:57:55, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, distribution-scope=public) Oct 14 03:52:50 localhost great_darwin[33383]: 167 167 Oct 14 03:52:50 localhost systemd[1]: libpod-0ba1668eb179fa57bcf73474b7ded8bec2e2c2b32b72ff9796ca7dcc2506c4f6.scope: Deactivated successfully. Oct 14 03:52:50 localhost podman[33368]: 2025-10-14 07:52:50.194356674 +0000 UTC m=+0.128352951 container died 0ba1668eb179fa57bcf73474b7ded8bec2e2c2b32b72ff9796ca7dcc2506c4f6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=great_darwin, version=7, GIT_CLEAN=True, io.openshift.expose-services=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, maintainer=Guillaume Abrioux , description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-09-24T08:57:55, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, distribution-scope=public, RELEASE=main, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., vcs-type=git, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.33.12, release=553, GIT_BRANCH=main, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553) Oct 14 03:52:50 localhost podman[33368]: 2025-10-14 07:52:50.09663998 +0000 UTC m=+0.030636257 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Oct 14 03:52:50 localhost ceph-osd[32440]: osd.3 15 state: booting -> active Oct 14 03:52:50 localhost podman[33388]: 2025-10-14 07:52:50.285309648 +0000 UTC m=+0.081191604 container remove 0ba1668eb179fa57bcf73474b7ded8bec2e2c2b32b72ff9796ca7dcc2506c4f6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=great_darwin, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=Guillaume Abrioux , GIT_CLEAN=True, version=7, vendor=Red Hat, Inc., ceph=True, io.openshift.expose-services=, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, release=553, vcs-type=git, architecture=x86_64, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, CEPH_POINT_RELEASE=, io.buildah.version=1.33.12, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, build-date=2025-09-24T08:57:55, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3) Oct 14 03:52:50 localhost systemd[1]: libpod-conmon-0ba1668eb179fa57bcf73474b7ded8bec2e2c2b32b72ff9796ca7dcc2506c4f6.scope: Deactivated successfully. 
Oct 14 03:52:50 localhost podman[33407]: Oct 14 03:52:50 localhost podman[33407]: 2025-10-14 07:52:50.473168948 +0000 UTC m=+0.075211322 container create a5ff4fdf3f83cc2345d8603a51e4919d49a391e3c5d7a91754c013458df1fa2d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sleepy_hypatia, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=553, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., build-date=2025-09-24T08:57:55, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.33.12, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, distribution-scope=public, maintainer=Guillaume Abrioux , url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, ceph=True, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, version=7, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, description=Red Hat Ceph Storage 7, architecture=x86_64, name=rhceph, io.openshift.expose-services=, RELEASE=main, vcs-type=git, CEPH_POINT_RELEASE=) Oct 14 03:52:50 localhost systemd[1]: Started libpod-conmon-a5ff4fdf3f83cc2345d8603a51e4919d49a391e3c5d7a91754c013458df1fa2d.scope. Oct 14 03:52:50 localhost systemd[1]: Started libcrun container. 
Oct 14 03:52:50 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/27c8c1d49facd5cf06212b5f9629a76b11e0d59fe9975db38a97c7c126ee7d79/merged/rootfs supports timestamps until 2038 (0x7fffffff) Oct 14 03:52:50 localhost podman[33407]: 2025-10-14 07:52:50.442133943 +0000 UTC m=+0.044176377 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Oct 14 03:52:50 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/27c8c1d49facd5cf06212b5f9629a76b11e0d59fe9975db38a97c7c126ee7d79/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff) Oct 14 03:52:50 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/27c8c1d49facd5cf06212b5f9629a76b11e0d59fe9975db38a97c7c126ee7d79/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff) Oct 14 03:52:50 localhost podman[33407]: 2025-10-14 07:52:50.561977232 +0000 UTC m=+0.164019596 container init a5ff4fdf3f83cc2345d8603a51e4919d49a391e3c5d7a91754c013458df1fa2d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sleepy_hypatia, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, distribution-scope=public, build-date=2025-09-24T08:57:55, version=7, CEPH_POINT_RELEASE=, io.openshift.expose-services=, vcs-type=git, ceph=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, maintainer=Guillaume Abrioux , GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, release=553, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, name=rhceph, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., 
GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_BRANCH=main) Oct 14 03:52:50 localhost podman[33407]: 2025-10-14 07:52:50.57213189 +0000 UTC m=+0.174174264 container start a5ff4fdf3f83cc2345d8603a51e4919d49a391e3c5d7a91754c013458df1fa2d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sleepy_hypatia, com.redhat.component=rhceph-container, ceph=True, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, distribution-scope=public, build-date=2025-09-24T08:57:55, release=553, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_CLEAN=True, io.buildah.version=1.33.12, GIT_BRANCH=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, io.openshift.expose-services=, maintainer=Guillaume Abrioux ) Oct 14 03:52:50 localhost podman[33407]: 2025-10-14 07:52:50.572388183 +0000 UTC m=+0.174430597 container attach a5ff4fdf3f83cc2345d8603a51e4919d49a391e3c5d7a91754c013458df1fa2d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sleepy_hypatia, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, vcs-type=git, architecture=x86_64, build-date=2025-09-24T08:57:55, name=rhceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.license_terms=https://www.redhat.com/agreements, 
CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_CLEAN=True, maintainer=Guillaume Abrioux , release=553, vendor=Red Hat, Inc., distribution-scope=public, ceph=True, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, version=7, RELEASE=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main)
Oct 14 03:52:51 localhost systemd[1]: var-lib-containers-storage-overlay-c6a91083463a1848b3f62e2b95da1d16e5f677eb90c6cfeea2c695388c960e28-merged.mount: Deactivated successfully.
Oct 14 03:52:51 localhost sleepy_hypatia[33422]: [
Oct 14 03:52:51 localhost sleepy_hypatia[33422]: {
Oct 14 03:52:51 localhost sleepy_hypatia[33422]: "available": false,
Oct 14 03:52:51 localhost sleepy_hypatia[33422]: "ceph_device": false,
Oct 14 03:52:51 localhost sleepy_hypatia[33422]: "device_id": "QEMU_DVD-ROM_QM00001",
Oct 14 03:52:51 localhost sleepy_hypatia[33422]: "lsm_data": {},
Oct 14 03:52:51 localhost sleepy_hypatia[33422]: "lvs": [],
Oct 14 03:52:51 localhost sleepy_hypatia[33422]: "path": "/dev/sr0",
Oct 14 03:52:51 localhost sleepy_hypatia[33422]: "rejected_reasons": [
Oct 14 03:52:51 localhost sleepy_hypatia[33422]: "Insufficient space (<5GB)",
Oct 14 03:52:51 localhost sleepy_hypatia[33422]: "Has a FileSystem"
Oct 14 03:52:51 localhost sleepy_hypatia[33422]: ],
Oct 14 03:52:51 localhost sleepy_hypatia[33422]: "sys_api": {
Oct 14 03:52:51 localhost sleepy_hypatia[33422]: "actuators": null,
Oct 14 03:52:51 localhost sleepy_hypatia[33422]: "device_nodes": "sr0",
Oct 14 03:52:51 localhost sleepy_hypatia[33422]: "human_readable_size": "482.00 KB",
Oct 14 03:52:51 localhost sleepy_hypatia[33422]: "id_bus": "ata",
Oct 14 03:52:51 localhost sleepy_hypatia[33422]: "model": "QEMU DVD-ROM",
Oct 14 03:52:51 localhost sleepy_hypatia[33422]: "nr_requests": "2",
Oct 14 03:52:51 localhost sleepy_hypatia[33422]: "partitions": {},
Oct 14 03:52:51 localhost sleepy_hypatia[33422]: "path": "/dev/sr0",
Oct 14 03:52:51 localhost sleepy_hypatia[33422]: "removable": "1",
Oct 14 03:52:51 localhost sleepy_hypatia[33422]: "rev": "2.5+",
Oct 14 03:52:51 localhost sleepy_hypatia[33422]: "ro": "0",
Oct 14 03:52:51 localhost sleepy_hypatia[33422]: "rotational": "1",
Oct 14 03:52:51 localhost sleepy_hypatia[33422]: "sas_address": "",
Oct 14 03:52:51 localhost sleepy_hypatia[33422]: "sas_device_handle": "",
Oct 14 03:52:51 localhost sleepy_hypatia[33422]: "scheduler_mode": "mq-deadline",
Oct 14 03:52:51 localhost sleepy_hypatia[33422]: "sectors": 0,
Oct 14 03:52:51 localhost sleepy_hypatia[33422]: "sectorsize": "2048",
Oct 14 03:52:51 localhost sleepy_hypatia[33422]: "size": 493568.0,
Oct 14 03:52:51 localhost sleepy_hypatia[33422]: "support_discard": "0",
Oct 14 03:52:51 localhost sleepy_hypatia[33422]: "type": "disk",
Oct 14 03:52:51 localhost sleepy_hypatia[33422]: "vendor": "QEMU"
Oct 14 03:52:51 localhost sleepy_hypatia[33422]: }
Oct 14 03:52:51 localhost sleepy_hypatia[33422]: }
Oct 14 03:52:51 localhost sleepy_hypatia[33422]: ]
Oct 14 03:52:51 localhost systemd[1]: libpod-a5ff4fdf3f83cc2345d8603a51e4919d49a391e3c5d7a91754c013458df1fa2d.scope: Deactivated successfully.
Oct 14 03:52:51 localhost podman[33407]: 2025-10-14 07:52:51.473141703 +0000 UTC m=+1.075184067 container died a5ff4fdf3f83cc2345d8603a51e4919d49a391e3c5d7a91754c013458df1fa2d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sleepy_hypatia, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, release=553, io.buildah.version=1.33.12, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, maintainer=Guillaume Abrioux , ceph=True, GIT_BRANCH=main, RELEASE=main, vcs-type=git, architecture=x86_64, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, io.openshift.tags=rhceph ceph, name=rhceph, GIT_CLEAN=True, io.openshift.expose-services=, build-date=2025-09-24T08:57:55) Oct 14 03:52:51 localhost systemd[1]: var-lib-containers-storage-overlay-27c8c1d49facd5cf06212b5f9629a76b11e0d59fe9975db38a97c7c126ee7d79-merged.mount: Deactivated successfully. 
Oct 14 03:52:51 localhost podman[34879]: 2025-10-14 07:52:51.556254637 +0000 UTC m=+0.077501701 container remove a5ff4fdf3f83cc2345d8603a51e4919d49a391e3c5d7a91754c013458df1fa2d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sleepy_hypatia, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, architecture=x86_64, name=rhceph, description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux , CEPH_POINT_RELEASE=, io.buildah.version=1.33.12, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_CLEAN=True, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.expose-services=, release=553, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.) Oct 14 03:52:51 localhost systemd[1]: libpod-conmon-a5ff4fdf3f83cc2345d8603a51e4919d49a391e3c5d7a91754c013458df1fa2d.scope: Deactivated successfully. Oct 14 03:52:51 localhost ceph-osd[32440]: osd.3 pg_epoch: 15 pg[1.0( empty local-lis/les=0/0 n=0 ec=13/13 lis/c=13/0 les/c/f=14/0/0 sis=15) [1,2,3] r=2 lpr=15 pi=[13,15)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Oct 14 03:53:00 localhost systemd[1]: tmp-crun.bdATUg.mount: Deactivated successfully. 
Oct 14 03:53:00 localhost podman[35007]: 2025-10-14 07:53:00.216608531 +0000 UTC m=+0.090681112 container exec b19a91314e045cbe5465f6abdeb6b420108fcda7860b498b01dcd4e65236d2c8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-crash-np0005486733, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_CLEAN=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, distribution-scope=public, architecture=x86_64, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, build-date=2025-09-24T08:57:55, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, io.buildah.version=1.33.12, vcs-type=git, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, maintainer=Guillaume Abrioux , vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, name=rhceph, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, RELEASE=main, release=553) Oct 14 03:53:00 localhost podman[35007]: 2025-10-14 07:53:00.345417533 +0000 UTC m=+0.219490074 container exec_died b19a91314e045cbe5465f6abdeb6b420108fcda7860b498b01dcd4e65236d2c8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-crash-np0005486733, name=rhceph, GIT_CLEAN=True, build-date=2025-09-24T08:57:55, io.buildah.version=1.33.12, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, ceph=True, 
maintainer=Guillaume Abrioux , version=7, vendor=Red Hat, Inc., architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=553, vcs-type=git, GIT_BRANCH=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9) Oct 14 03:53:30 localhost systemd[25973]: Starting Mark boot as successful... Oct 14 03:53:30 localhost systemd[25973]: Finished Mark boot as successful. Oct 14 03:54:02 localhost systemd[1]: tmp-crun.QBoWEZ.mount: Deactivated successfully. Oct 14 03:54:02 localhost podman[35185]: 2025-10-14 07:54:02.125760565 +0000 UTC m=+0.083840415 container exec b19a91314e045cbe5465f6abdeb6b420108fcda7860b498b01dcd4e65236d2c8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-crash-np0005486733, release=553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, GIT_BRANCH=main, CEPH_POINT_RELEASE=, ceph=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-type=git, RELEASE=main, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, maintainer=Guillaume Abrioux , GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, io.buildah.version=1.33.12, summary=Provides the latest Red Hat Ceph Storage 7 on 
RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/agreements, name=rhceph, com.redhat.component=rhceph-container, build-date=2025-09-24T08:57:55) Oct 14 03:54:02 localhost podman[35185]: 2025-10-14 07:54:02.248224893 +0000 UTC m=+0.206304723 container exec_died b19a91314e045cbe5465f6abdeb6b420108fcda7860b498b01dcd4e65236d2c8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-crash-np0005486733, io.openshift.tags=rhceph ceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, release=553, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, GIT_CLEAN=True, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux , vcs-type=git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-09-24T08:57:55, ceph=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.33.12, RELEASE=main, io.openshift.expose-services=) Oct 14 03:54:15 localhost systemd[1]: session-13.scope: Deactivated successfully. Oct 14 03:54:15 localhost systemd[1]: session-13.scope: Consumed 20.905s CPU time. Oct 14 03:54:15 localhost systemd-logind[760]: Session 13 logged out. Waiting for processes to exit. Oct 14 03:54:15 localhost systemd-logind[760]: Removed session 13. Oct 14 03:56:30 localhost systemd[25973]: Created slice User Background Tasks Slice. 
Oct 14 03:56:30 localhost systemd[25973]: Starting Cleanup of User's Temporary Files and Directories... Oct 14 03:56:30 localhost systemd[25973]: Finished Cleanup of User's Temporary Files and Directories. Oct 14 03:56:34 localhost sshd[35482]: main: sshd: ssh-rsa algorithm is disabled Oct 14 03:57:40 localhost sshd[35558]: main: sshd: ssh-rsa algorithm is disabled Oct 14 03:57:40 localhost systemd-logind[760]: New session 27 of user zuul. Oct 14 03:57:40 localhost systemd[1]: Started Session 27 of User zuul. Oct 14 03:57:40 localhost python3[35606]: ansible-ansible.legacy.ping Invoked with data=pong Oct 14 03:57:41 localhost python3[35651]: ansible-setup Invoked with gather_subset=['!facter', '!ohai'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Oct 14 03:57:42 localhost python3[35671]: ansible-user Invoked with name=tripleo-admin generate_ssh_key=False state=present non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005486733.localdomain update_password=always uid=None group=None groups=None comment=None home=None shell=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None hidden=None seuser=None skeleton=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None Oct 14 03:57:42 localhost python3[35727]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/tripleo-admin follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Oct 14 03:57:43 localhost python3[35770]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/tripleo-admin mode=288 owner=root group=root src=/home/zuul/.ansible/tmp/ansible-tmp-1760428662.3827138-66574-65781828654276/source _original_basename=tmppiu8whje follow=False 
checksum=b3e7ecdcc699d217c6b083a91b07208207813d93 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 03:57:43 localhost python3[35800]: ansible-file Invoked with path=/home/tripleo-admin state=directory owner=tripleo-admin group=tripleo-admin mode=448 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 03:57:43 localhost python3[35816]: ansible-file Invoked with path=/home/tripleo-admin/.ssh state=directory owner=tripleo-admin group=tripleo-admin mode=448 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 03:57:44 localhost python3[35832]: ansible-file Invoked with path=/home/tripleo-admin/.ssh/authorized_keys state=touch owner=tripleo-admin group=tripleo-admin mode=384 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 03:57:45 localhost python3[35848]: ansible-lineinfile Invoked with path=/home/tripleo-admin/.ssh/authorized_keys line=ssh-rsa 
AAAAB3NzaC1yc2EAAAADAQABAAABgQDUv/ZB171sShkvmUwM4/A+38mOKHSoVqmUnoFRrcde+TmaD2jOKfnaBsMdk2YTdAdiPwM8PX7LYcOftZjXZ92Uqg/gQ0pshmFBVtIcoN0HEQlFtMQltRrBVPG+qHK5UOF2bUImKqqFx3uTPSmteSVgJtwvFqp/51YTUibYgQBWJPCcOSze95nxendWi6PoXzvorqCyVS44Llj4LmLChBJeqAI5cWs2EeDhQ4Tw8F33iKpBg8WjZAbQVbe2KIQYURMtANtjUJ0Yg5RTArSq57504iqodB4+ynahul8Dp5+TocLZTPu5orcqRGqWDe7CN5pc1eXZQuNNZ0jW59y52GY+ox+WCmp1qvB7TQzhc/r+kAVmT8VNTVUvC5TBGcIw3yxI7lzrd03zpenSL3oyJnFN4SXCeAA8YcXlz7ySaO9YAtbCSdkgj8QJCiykvalRm17F4d4aRX5+rtfEm+WG670vF6FRNNo5OTXTK2Ja84pej1bjzDBvEz81D1EqnHybfJ0= zuul-build-sshkey#012 regexp=Generated by TripleO state=present backrefs=False create=False backup=False firstmatch=False unsafe_writes=False search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 03:57:45 localhost python3[35862]: ansible-ping Invoked with data=pong Oct 14 03:57:56 localhost sshd[35864]: main: sshd: ssh-rsa algorithm is disabled Oct 14 03:57:56 localhost systemd-logind[760]: New session 28 of user tripleo-admin. Oct 14 03:57:56 localhost systemd[1]: Created slice User Slice of UID 1003. Oct 14 03:57:56 localhost systemd[1]: Starting User Runtime Directory /run/user/1003... Oct 14 03:57:56 localhost systemd[1]: Finished User Runtime Directory /run/user/1003. Oct 14 03:57:56 localhost systemd[1]: Starting User Manager for UID 1003... Oct 14 03:57:56 localhost systemd[35868]: Queued start job for default target Main User Target. Oct 14 03:57:56 localhost systemd[35868]: Created slice User Application Slice. Oct 14 03:57:56 localhost systemd[35868]: Started Mark boot as successful after the user session has run 2 minutes. Oct 14 03:57:56 localhost systemd[35868]: Started Daily Cleanup of User's Temporary Directories. Oct 14 03:57:56 localhost systemd[35868]: Reached target Paths. Oct 14 03:57:56 localhost systemd[35868]: Reached target Timers. 
Oct 14 03:57:56 localhost systemd[35868]: Starting D-Bus User Message Bus Socket... Oct 14 03:57:56 localhost systemd[35868]: Starting Create User's Volatile Files and Directories... Oct 14 03:57:56 localhost systemd[35868]: Finished Create User's Volatile Files and Directories. Oct 14 03:57:56 localhost systemd[35868]: Listening on D-Bus User Message Bus Socket. Oct 14 03:57:56 localhost systemd[35868]: Reached target Sockets. Oct 14 03:57:56 localhost systemd[35868]: Reached target Basic System. Oct 14 03:57:56 localhost systemd[35868]: Reached target Main User Target. Oct 14 03:57:56 localhost systemd[35868]: Startup finished in 131ms. Oct 14 03:57:56 localhost systemd[1]: Started User Manager for UID 1003. Oct 14 03:57:56 localhost systemd[1]: Started Session 28 of User tripleo-admin. Oct 14 03:57:57 localhost python3[35929]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all', 'min'] gather_timeout=45 filter=[] fact_path=/etc/ansible/facts.d Oct 14 03:58:02 localhost python3[35949]: ansible-selinux Invoked with policy=targeted state=enforcing configfile=/etc/selinux/config Oct 14 03:58:03 localhost python3[35965]: ansible-tempfile Invoked with state=file suffix=tmphosts prefix=ansible. path=None Oct 14 03:58:03 localhost python3[36013]: ansible-ansible.legacy.copy Invoked with remote_src=True src=/etc/hosts dest=/tmp/ansible.2tbh23mctmphosts mode=preserve backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 03:58:04 localhost python3[36043]: ansible-blockinfile Invoked with state=absent path=/tmp/ansible.2tbh23mctmphosts block= marker=# {mark} marker_begin=HEAT_HOSTS_START - Do not edit manually within this section! 
marker_end=HEAT_HOSTS_END create=False backup=False unsafe_writes=False insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 03:58:05 localhost python3[36059]: ansible-blockinfile Invoked with create=True path=/tmp/ansible.2tbh23mctmphosts insertbefore=BOF block=172.17.0.106 np0005486731.localdomain np0005486731#012172.18.0.106 np0005486731.storage.localdomain np0005486731.storage#012172.20.0.106 np0005486731.storagemgmt.localdomain np0005486731.storagemgmt#012172.17.0.106 np0005486731.internalapi.localdomain np0005486731.internalapi#012172.19.0.106 np0005486731.tenant.localdomain np0005486731.tenant#012192.168.122.106 np0005486731.ctlplane.localdomain np0005486731.ctlplane#012172.17.0.107 np0005486732.localdomain np0005486732#012172.18.0.107 np0005486732.storage.localdomain np0005486732.storage#012172.20.0.107 np0005486732.storagemgmt.localdomain np0005486732.storagemgmt#012172.17.0.107 np0005486732.internalapi.localdomain np0005486732.internalapi#012172.19.0.107 np0005486732.tenant.localdomain np0005486732.tenant#012192.168.122.107 np0005486732.ctlplane.localdomain np0005486732.ctlplane#012172.17.0.108 np0005486733.localdomain np0005486733#012172.18.0.108 np0005486733.storage.localdomain np0005486733.storage#012172.20.0.108 np0005486733.storagemgmt.localdomain np0005486733.storagemgmt#012172.17.0.108 np0005486733.internalapi.localdomain np0005486733.internalapi#012172.19.0.108 np0005486733.tenant.localdomain np0005486733.tenant#012192.168.122.108 np0005486733.ctlplane.localdomain np0005486733.ctlplane#012172.17.0.103 np0005486728.localdomain np0005486728#012172.18.0.103 np0005486728.storage.localdomain np0005486728.storage#012172.20.0.103 np0005486728.storagemgmt.localdomain np0005486728.storagemgmt#012172.17.0.103 np0005486728.internalapi.localdomain np0005486728.internalapi#012172.19.0.103 np0005486728.tenant.localdomain np0005486728.tenant#012192.168.122.103 
np0005486728.ctlplane.localdomain np0005486728.ctlplane#012172.17.0.104 np0005486729.localdomain np0005486729#012172.18.0.104 np0005486729.storage.localdomain np0005486729.storage#012172.20.0.104 np0005486729.storagemgmt.localdomain np0005486729.storagemgmt#012172.17.0.104 np0005486729.internalapi.localdomain np0005486729.internalapi#012172.19.0.104 np0005486729.tenant.localdomain np0005486729.tenant#012192.168.122.104 np0005486729.ctlplane.localdomain np0005486729.ctlplane#012172.17.0.105 np0005486730.localdomain np0005486730#012172.18.0.105 np0005486730.storage.localdomain np0005486730.storage#012172.20.0.105 np0005486730.storagemgmt.localdomain np0005486730.storagemgmt#012172.17.0.105 np0005486730.internalapi.localdomain np0005486730.internalapi#012172.19.0.105 np0005486730.tenant.localdomain np0005486730.tenant#012192.168.122.105 np0005486730.ctlplane.localdomain np0005486730.ctlplane#012#012192.168.122.100 undercloud.ctlplane.localdomain undercloud.ctlplane#012192.168.122.99 overcloud.ctlplane.localdomain#012172.18.0.210 overcloud.storage.localdomain#012172.20.0.247 overcloud.storagemgmt.localdomain#012172.17.0.162 overcloud.internalapi.localdomain#012172.21.0.142 overcloud.localdomain#012 marker=# {mark} marker_begin=START_HOST_ENTRIES_FOR_STACK: overcloud marker_end=END_HOST_ENTRIES_FOR_STACK: overcloud state=present backup=False unsafe_writes=False insertafter=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 03:58:05 localhost python3[36075]: ansible-ansible.legacy.command Invoked with _raw_params=cp "/tmp/ansible.2tbh23mctmphosts" "/etc/hosts" _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Oct 14 03:58:06 localhost python3[36092]: ansible-file Invoked with path=/tmp/ansible.2tbh23mctmphosts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S 
access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 03:58:07 localhost python3[36108]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -q --whatprovides rhosp-release _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Oct 14 03:58:07 localhost python3[36125]: ansible-ansible.legacy.dnf Invoked with name=['rhosp-release'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None Oct 14 03:58:12 localhost python3[36223]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -q --whatprovides driverctl lvm2 jq nftables openvswitch openstack-heat-agents openstack-selinux os-net-config python3-libselinux python3-pyyaml puppet-tripleo rsync tmpwatch sysstat iproute-tc _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Oct 14 03:58:12 localhost python3[36240]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'jq', 'nftables', 'openvswitch', 'openstack-heat-agents', 'openstack-selinux', 'os-net-config', 'python3-libselinux', 'python3-pyyaml', 'puppet-tripleo', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] 
download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None Oct 14 03:59:18 localhost kernel: SELinux: Converting 2699 SID table entries... Oct 14 03:59:18 localhost kernel: SELinux: policy capability network_peer_controls=1 Oct 14 03:59:18 localhost kernel: SELinux: policy capability open_perms=1 Oct 14 03:59:18 localhost kernel: SELinux: policy capability extended_socket_class=1 Oct 14 03:59:18 localhost kernel: SELinux: policy capability always_check_network=0 Oct 14 03:59:18 localhost kernel: SELinux: policy capability cgroup_seclabel=1 Oct 14 03:59:18 localhost kernel: SELinux: policy capability nnp_nosuid_transition=1 Oct 14 03:59:18 localhost kernel: SELinux: policy capability genfs_seclabel_symlinks=1 Oct 14 03:59:19 localhost dbus-broker-launch[755]: avc: op=load_policy lsm=selinux seqno=6 res=1 Oct 14 03:59:19 localhost systemd[1]: Started /usr/bin/systemctl start man-db-cache-update. Oct 14 03:59:19 localhost systemd[1]: Starting man-db-cache-update.service... Oct 14 03:59:19 localhost systemd[1]: Reloading. Oct 14 03:59:19 localhost systemd-sysv-generator[37082]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Oct 14 03:59:19 localhost systemd-rc-local-generator[37074]: /etc/rc.d/rc.local is not marked executable, skipping. Oct 14 03:59:19 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Oct 14 03:59:19 localhost systemd[1]: Queuing reload/restart jobs for marked units… Oct 14 03:59:19 localhost systemd[1]: man-db-cache-update.service: Deactivated successfully. Oct 14 03:59:19 localhost systemd[1]: Finished man-db-cache-update.service. Oct 14 03:59:19 localhost systemd[1]: run-r7aa6c3939ab04c91a2a79e9be2a30a59.service: Deactivated successfully. Oct 14 03:59:24 localhost python3[37533]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 jq nftables openvswitch openstack-heat-agents openstack-selinux os-net-config python3-libselinux python3-pyyaml puppet-tripleo rsync tmpwatch sysstat iproute-tc _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Oct 14 03:59:25 localhost python3[37672]: ansible-ansible.legacy.systemd Invoked with name=openvswitch enabled=True state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Oct 14 03:59:25 localhost systemd[1]: Reloading. Oct 14 03:59:25 localhost systemd-rc-local-generator[37697]: /etc/rc.d/rc.local is not marked executable, skipping. Oct 14 03:59:25 localhost systemd-sysv-generator[37700]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Oct 14 03:59:25 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Oct 14 03:59:27 localhost python3[37726]: ansible-file Invoked with path=/var/lib/heat-config/tripleo-config-download state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 03:59:28 localhost python3[37742]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -q --whatprovides openstack-network-scripts _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Oct 14 03:59:28 localhost python3[37759]: ansible-systemd Invoked with name=NetworkManager enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None Oct 14 03:59:30 localhost python3[37777]: ansible-ini_file Invoked with path=/etc/NetworkManager/NetworkManager.conf state=present no_extra_spaces=True section=main option=dns value=none backup=True exclusive=True allow_no_value=False create=True unsafe_writes=False values=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 03:59:30 localhost python3[37795]: ansible-ini_file Invoked with path=/etc/NetworkManager/NetworkManager.conf state=present no_extra_spaces=True section=main option=rc-manager value=unmanaged backup=True exclusive=True allow_no_value=False create=True unsafe_writes=False values=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 03:59:31 localhost python3[37813]: ansible-ansible.legacy.systemd Invoked with name=NetworkManager state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None Oct 14 03:59:31 localhost systemd[1]: Reloading Network Manager... 
Oct 14 03:59:31 localhost NetworkManager[5977]: [1760428771.3214] audit: op="reload" arg="0" pid=37816 uid=0 result="success" Oct 14 03:59:31 localhost NetworkManager[5977]: [1760428771.3218] config: signal: SIGHUP,config-files,values,values-user,no-auto-default,dns-mode,rc-manager (/etc/NetworkManager/NetworkManager.conf (lib: 00-server.conf) (run: 15-carrier-timeout.conf)) Oct 14 03:59:31 localhost NetworkManager[5977]: [1760428771.3218] dns-mgr: init: dns=none,systemd-resolved rc-manager=unmanaged Oct 14 03:59:31 localhost systemd[1]: Reloaded Network Manager. Oct 14 03:59:31 localhost python3[37832]: ansible-ansible.legacy.command Invoked with _raw_params=ln -f -s /usr/share/openstack-puppet/modules/* /etc/puppet/modules/ _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Oct 14 03:59:32 localhost python3[37849]: ansible-stat Invoked with path=/usr/bin/ansible-playbook follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Oct 14 03:59:32 localhost python3[37867]: ansible-stat Invoked with path=/usr/bin/ansible-playbook-3 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Oct 14 03:59:33 localhost python3[37883]: ansible-file Invoked with state=link src=/usr/bin/ansible-playbook path=/usr/bin/ansible-playbook-3 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 03:59:33 localhost python3[37899]: ansible-tempfile Invoked with state=file prefix=ansible. 
suffix= path=None Oct 14 03:59:34 localhost python3[37915]: ansible-stat Invoked with path=/etc/ssh/ssh_known_hosts follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Oct 14 03:59:35 localhost python3[37931]: ansible-blockinfile Invoked with path=/tmp/ansible._i30egap block=[192.168.122.106]*,[np0005486731.ctlplane.localdomain]*,[172.17.0.106]*,[np0005486731.internalapi.localdomain]*,[172.18.0.106]*,[np0005486731.storage.localdomain]*,[172.20.0.106]*,[np0005486731.storagemgmt.localdomain]*,[172.19.0.106]*,[np0005486731.tenant.localdomain]*,[np0005486731.localdomain]*,[np0005486731]* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCirnE0NbUtG1POhhB+AhKCgxEghhJb/WUMq5UfTpoI7+sU48jNxRyEvlJ9WLGLD82QYzFzvYceQHGF3QzqwIybk7JFKNvYYEOkz9hG//Xjh6A/3qZ0QptW0dWlBpSs0CuOATe19vBa98AfD1qNMYOAwwjlRDvjVW17VALcKjVesDK4LNkVfCSX9cK7Gdd1LfEkwQwxiTTZeSd91DSx5XIm3hz9RcMpxpCgc3snA81FXTTb4G1v39rycXuWjjlp/2B4CRlgPrIb6u1X/hkN0uxSMiwMQG7fZladvZi8RTRyt2EmTR0l8f0eDeuN1gLfOFVlQSfj33xH8/2G2s4IUhbudf732i4GKxgy5WBMiH2DVHzoO7LGdKlYKRvxgNG8qx68hOAzHokMnmaHnKlTsXNPph6MD/ufoeHaEG35xMkewSoY70MzDny/Z9lllfTTs+Yi5YEO22s5EoS6KK9C1+WShW9TELIuj5X8P1VeD+LlKJIwbLQzEHLc1irbnJ2RgUc=#012[192.168.122.107]*,[np0005486732.ctlplane.localdomain]*,[172.17.0.107]*,[np0005486732.internalapi.localdomain]*,[172.18.0.107]*,[np0005486732.storage.localdomain]*,[172.20.0.107]*,[np0005486732.storagemgmt.localdomain]*,[172.19.0.107]*,[np0005486732.tenant.localdomain]*,[np0005486732.localdomain]*,[np0005486732]* ssh-rsa 
AAAAB3NzaC1yc2EAAAADAQABAAABgQDM+kpIg8Y4xlC9n9pfBoVDeeU3WOfZT4Yf4ib8bb9MSMyOwJpLVbkpe3nLg73heYlLISwD3ojybTo9jDmNS7Pq+q5bGue4oqLk7f5B7IMwrmkfzjKYQpGMLL7FdErlDs6IP2jQ82E+uJ7M54Kv5g0rr+blVacsnYetzjJM26r3UcKTdOjJyIHuvQWa4IzNJRydr8s9//7Orf7269xlmVoqyAkcrhzcewCVeaK7VOrIcy3oKzOtwYpQmSxUumuX5rxE8KoCn4Ag0V3Mpp7hqN2xrry1hJN1J7yXSYaF1pc4MJKvCK6k0VqK4dY6CppsQvx2HW1s/Ib5UxJ/+JypjsqwYcSL7BSesfCtHtY8Tn1bbI+nm+nbMw1VIECq94FvZldDnxbaCQDP7dkFxqJaZebSFX+XAsRqJq4M8/rAm2gFUtCisiggasuEgfBfODBwb5+EYGNBCS/72Xs3b1h+hoMh0XCocdkTpzbr40FK6djLBdZXBAt7/Vwy0fTpC9G8H+s=#012[192.168.122.108]*,[np0005486733.ctlplane.localdomain]*,[172.17.0.108]*,[np0005486733.internalapi.localdomain]*,[172.18.0.108]*,[np0005486733.storage.localdomain]*,[172.20.0.108]*,[np0005486733.storagemgmt.localdomain]*,[172.19.0.108]*,[np0005486733.tenant.localdomain]*,[np0005486733.localdomain]*,[np0005486733]* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDPo0GfacWT5Pc+C+u+omIcLodqLCmBuNDNfCjeb037QgP4jmD3LwkBVK9lXeF6bKJmM0PzOPagPFh4T7FwHNF7Np+V7e+YWSARFeetHnxYmMZdWYyfKTaZrS25xRraxyGrunWniIhAKFUaTz7e6OjUqNe25eVURCgpvQnsWeDwm/Gk9GfpfMCIFRtF7phpUKzSaz/8IpyLG1IzRSMsUkEtoKFxbAkuuJrkD4IWeWvEqn02yWC2WFGEdpQu8kcnxIshwqf9bEa7rYrjDTR++5AuztTSbppQL+8RIclxDR3uCVxzprf9Pj2C0e2X7TVKUs1tlduvrPK7uS10NGx3CK5iUe+uX+4V+jNrpe35OBv2vzdbzR+W6ciNtdy2lWLTou66Fm+/a3XwfJQb66dWQrLIyc6T64D8BysHjA8ER5TZ7N8AZoFZ8tNRzPgNWFZhjzoXdYisTvN9CjcpLgVpzekjeQS4BNNzh7bs+FPdB49TSf65NLzBIhWNqHT8weDoO58=#012[192.168.122.103]*,[np0005486728.ctlplane.localdomain]*,[172.17.0.103]*,[np0005486728.internalapi.localdomain]*,[172.18.0.103]*,[np0005486728.storage.localdomain]*,[172.20.0.103]*,[np0005486728.storagemgmt.localdomain]*,[172.19.0.103]*,[np0005486728.tenant.localdomain]*,[np0005486728.localdomain]*,[np0005486728]* ssh-rsa 
AAAAB3NzaC1yc2EAAAADAQABAAABgQDr2nlXCVxp/8oDgdtx78rfaKpbpZ2BVPZ6HGLZUj0EA3A0bpv/vCkjK3KQT3TI7v1XfpgRbj08G0BbDhcTce9c8drn6X7lMpxvdMYZKKMTHnRs3mq9RsfEuWH3Q8Aa22LiA7rLwzVM2bbdbUcx/55pt3si8ariZ274Pzbprq7RrthEdE9xo5SDFIi+VJNQfQa+igaLblAAoG8qz+WChOAEmghfOAe4F7vBmidVxT92aYUE03zpWtqox4fE1U2dC0FMJ6Jro1ONj8KKCyEL+oLEbWFbPR4ynCyRvGaMIYh+9scB5yCf7vgPXNqu8sG+gR9i5wG43Nnh+76+XX/k+4Vyw/VeNANTjdiGvBcWmj1LLMDetoxZ5AdfklGaQq5qmrIvGqvIAGd7NgdwwWWw2umuIru3mi/5Z0H5I1uhLgTdknibTJSkhkkt/sBiBuyAXM3/HneFzlxDlYgA1xwdZeNnfiH010AO2W8pkWmWsYdMOEOBsM3SmGWtUuGKApwHcs8=#012[192.168.122.104]*,[np0005486729.ctlplane.localdomain]*,[172.17.0.104]*,[np0005486729.internalapi.localdomain]*,[172.18.0.104]*,[np0005486729.storage.localdomain]*,[172.20.0.104]*,[np0005486729.storagemgmt.localdomain]*,[172.19.0.104]*,[np0005486729.tenant.localdomain]*,[np0005486729.localdomain]*,[np0005486729]* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCuTpRqp6mqKsQmynNLG8q8Bb4GSKNLRdYVfi81dV1W3aIPFsswo/C9+5nbZA1YVPY02cdXFps4EmIQl2tQ0sKmdo4HGexnhUJjKuyXFTu0kCYUasXCE5+sSjRVUCF4RfD3+6jQ9w6hHM1R3JkkhPZtKs4ykqH+8Gr2B918BdDuVaujfMmVWMv8M46JDuDO9vGPlWpM+xZkFZ1zjG2I2UIvWLkEnVdta7QIgxIPTlX7rOokadGrkAcIYb87wONg2vJiTPWO4ht4yHUIvTGNHSTmCXK0sdQLiZzjR2P/k67s1KMeWjaWAe3NXygnpvgENx9Qf9NkOYhvz8j+xZXat4Pa/I38V79XAjE3nWEF/KM6a4nKK9Lz5GXOvsQ+LIXBBY6HSAqBY4Lc21xwCJxEoO5Iftn56HzDFA+iyex5FMeT12ANKmVF9D+NHdaiZ3d5iPW6cOPqph1UjWsofejhEt0dxmCbippl74SWTZey9dQ3TKM9BGf2QfH1GvasiC+CsVU=#012[192.168.122.105]*,[np0005486730.ctlplane.localdomain]*,[172.17.0.105]*,[np0005486730.internalapi.localdomain]*,[172.18.0.105]*,[np0005486730.storage.localdomain]*,[172.20.0.105]*,[np0005486730.storagemgmt.localdomain]*,[172.19.0.105]*,[np0005486730.tenant.localdomain]*,[np0005486730.localdomain]*,[np0005486730]* ssh-rsa 
AAAAB3NzaC1yc2EAAAADAQABAAABgQDtk5xAqdm3oDp772fF0Tcpwt7lZCIcJjfcDVjKALPT5gaSA/ogGG08ba03OQjSa4fktVIIeYQdRVzWIscOCoWMDa+vnXRStoi9DI+3rLz3nQvH190s8hPq6KxWR8DzGiqF8GwF1Kfuc7wz4c9jdElv6iWUfZuxCSLQfPSRYOw9IIII6knfTuRjQAIdmUJwnjN9K5n2n8rISg0VPd9kUHZR8jL+zFPsv5XkwfW/t5CEMmx6WG8w8Q6gY+yoeU4qINcRzFjKx/s6ParctRSYzJDPYEyhrgqQUesBDU4nyxRDpFilkeZI46TfqC9bG5bKTVfVy6qnAgkt4vg6buwszUTRdx6a0v68zWAwKGNAHRKS/HQ/CRe7CHYqsob7w41V4RvOtP5kz+dniINeT/K71sL3ZwcciRuGM10ayjaxBw7HOMJHi9RWrPWads3ubzTErcORb9mdWdlSomqfEGB8Ig/tKeFTipyN39TKKHLD+o6Tjnxqb3imMsE1kZWQOzHbFhE=#012 create=True state=present marker=# {mark} ANSIBLE MANAGED BLOCK backup=False marker_begin=BEGIN marker_end=END unsafe_writes=False insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 03:59:35 localhost python3[37947]: ansible-ansible.legacy.command Invoked with _raw_params=cat '/tmp/ansible._i30egap' > /etc/ssh/ssh_known_hosts _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Oct 14 03:59:36 localhost python3[37965]: ansible-file Invoked with path=/tmp/ansible._i30egap state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 03:59:37 localhost python3[37981]: ansible-file Invoked with path=/var/log/journal state=directory mode=0750 owner=root group=root setype=var_log_t recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Oct 14 03:59:37 localhost 
python3[37997]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-active cloud-init.service || systemctl is-enabled cloud-init.service _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Oct 14 03:59:37 localhost python3[38015]: ansible-ansible.legacy.command Invoked with _raw_params=cat /proc/cmdline | grep -q cloud-init=disabled _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Oct 14 03:59:38 localhost python3[38034]: ansible-community.general.cloud_init_data_facts Invoked with filter=status Oct 14 03:59:40 localhost python3[38171]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -q --whatprovides tuned tuned-profiles-cpu-partitioning _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Oct 14 03:59:41 localhost python3[38188]: ansible-ansible.legacy.dnf Invoked with name=['tuned', 'tuned-profiles-cpu-partitioning'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None Oct 14 03:59:44 localhost dbus-broker-launch[751]: Noticed file-system modification, trigger reload. Oct 14 03:59:44 localhost dbus-broker-launch[751]: Noticed file-system modification, trigger reload. Oct 14 03:59:44 localhost systemd[1]: Started /usr/bin/systemctl start man-db-cache-update. Oct 14 03:59:44 localhost systemd[1]: Starting man-db-cache-update.service... 
Oct 14 03:59:44 localhost systemd[1]: Reloading. Oct 14 03:59:44 localhost systemd-rc-local-generator[38273]: /etc/rc.d/rc.local is not marked executable, skipping. Oct 14 03:59:44 localhost systemd-sysv-generator[38280]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Oct 14 03:59:44 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Oct 14 03:59:44 localhost systemd[1]: Queuing reload/restart jobs for marked units… Oct 14 03:59:44 localhost systemd[1]: Stopping Dynamic System Tuning Daemon... Oct 14 03:59:44 localhost systemd[1]: tuned.service: Deactivated successfully. Oct 14 03:59:44 localhost systemd[1]: Stopped Dynamic System Tuning Daemon. Oct 14 03:59:44 localhost systemd[1]: tuned.service: Consumed 1.968s CPU time. Oct 14 03:59:44 localhost systemd[1]: Starting Dynamic System Tuning Daemon... Oct 14 03:59:44 localhost systemd[1]: man-db-cache-update.service: Deactivated successfully. Oct 14 03:59:44 localhost systemd[1]: Finished man-db-cache-update.service. Oct 14 03:59:44 localhost systemd[1]: run-rac8f5d99aabf4e7bb039def551e6ebd2.service: Deactivated successfully. Oct 14 03:59:46 localhost systemd[1]: Started Dynamic System Tuning Daemon. Oct 14 03:59:46 localhost systemd[1]: Started /usr/bin/systemctl start man-db-cache-update. Oct 14 03:59:46 localhost systemd[1]: Starting man-db-cache-update.service... Oct 14 03:59:46 localhost systemd[1]: man-db-cache-update.service: Deactivated successfully. Oct 14 03:59:46 localhost systemd[1]: Finished man-db-cache-update.service. Oct 14 03:59:46 localhost systemd[1]: run-r82837075dc104311b33508c95e2623bf.service: Deactivated successfully. 
Oct 14 03:59:47 localhost python3[38626]: ansible-systemd Invoked with name=tuned state=restarted enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Oct 14 03:59:47 localhost systemd[1]: Stopping Dynamic System Tuning Daemon... Oct 14 03:59:47 localhost systemd[1]: tuned.service: Deactivated successfully. Oct 14 03:59:47 localhost systemd[1]: Stopped Dynamic System Tuning Daemon. Oct 14 03:59:47 localhost systemd[1]: Starting Dynamic System Tuning Daemon... Oct 14 03:59:49 localhost systemd[1]: Started Dynamic System Tuning Daemon. Oct 14 03:59:49 localhost python3[38821]: ansible-ansible.legacy.command Invoked with _raw_params=which tuned-adm _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Oct 14 03:59:50 localhost python3[38838]: ansible-slurp Invoked with src=/etc/tuned/active_profile Oct 14 03:59:50 localhost python3[38854]: ansible-stat Invoked with path=/etc/tuned/throughput-performance-variables.conf follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Oct 14 03:59:51 localhost python3[38870]: ansible-ansible.legacy.command Invoked with _raw_params=tuned-adm profile throughput-performance _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Oct 14 03:59:53 localhost python3[38890]: ansible-ansible.legacy.command Invoked with _raw_params=cat /proc/cmdline _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Oct 14 03:59:53 localhost python3[38907]: ansible-stat Invoked with path=/var/lib/config-data/puppet-generated/nova_libvirt/etc/nova/nova.conf follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Oct 14 03:59:56 localhost python3[38923]: ansible-replace 
Invoked with regexp=TRIPLEO_HEAT_TEMPLATE_KERNEL_ARGS dest=/etc/default/grub replace= path=/etc/default/grub backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 04:00:01 localhost python3[38939]: ansible-file Invoked with path=/etc/puppet/hieradata state=directory mode=448 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 04:00:02 localhost python3[38989]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hiera.yaml follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Oct 14 04:00:02 localhost python3[39034]: ansible-ansible.legacy.copy Invoked with mode=384 dest=/etc/puppet/hiera.yaml src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1760428801.8358715-71110-134299507867699/source _original_basename=tmpr0v80wje follow=False checksum=aaf3699defba931d532f4955ae152f505046749a backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 04:00:03 localhost python3[39064]: ansible-file Invoked with src=/etc/puppet/hiera.yaml dest=/etc/hiera.yaml state=link force=True path=/etc/hiera.yaml recurse=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 04:00:03 localhost python3[39112]: ansible-ansible.legacy.stat 
Invoked with path=/etc/puppet/hieradata/all_nodes.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Oct 14 04:00:04 localhost python3[39155]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1760428803.4482324-71243-152510376343909/source dest=/etc/puppet/hieradata/all_nodes.json _original_basename=overcloud.json follow=False checksum=3ad8d0209f3c580b846ebda0d1ccff7b6a77b702 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 04:00:04 localhost python3[39217]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/bootstrap_node.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Oct 14 04:00:05 localhost python3[39260]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1760428804.340473-71302-72972842077206/source dest=/etc/puppet/hieradata/bootstrap_node.json mode=None follow=False _original_basename=bootstrap_node.j2 checksum=cd6b22568046e0a42e6fe7d93359257b42ca6ee5 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 04:00:05 localhost python3[39322]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/vip_data.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Oct 14 04:00:06 localhost python3[39365]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1760428805.2961419-71302-94812466264497/source dest=/etc/puppet/hieradata/vip_data.json mode=None follow=False 
_original_basename=vip_data.j2 checksum=988045890eab4c878cbeebf6fe69706ab2c2cfec backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 04:00:06 localhost python3[39427]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/net_ip_map.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Oct 14 04:00:07 localhost python3[39470]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1760428806.2043009-71302-130081884649462/source dest=/etc/puppet/hieradata/net_ip_map.json mode=None follow=False _original_basename=net_ip_map.j2 checksum=1bd75eeb71ad8a06f7ad5bd2e02e7279e09e867f backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 04:00:07 localhost python3[39532]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/cloud_domain.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Oct 14 04:00:07 localhost python3[39575]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1760428807.1747-71302-224631530467337/source dest=/etc/puppet/hieradata/cloud_domain.json mode=None follow=False _original_basename=cloud_domain.j2 checksum=5dd835a63e6a03d74797c2e2eadf4bea1cecd9d9 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 04:00:08 localhost python3[39637]: ansible-ansible.legacy.stat Invoked with 
path=/etc/puppet/hieradata/fqdn.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Oct 14 04:00:08 localhost python3[39680]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1760428808.0770738-71302-167323442338829/source dest=/etc/puppet/hieradata/fqdn.json mode=None follow=False _original_basename=fqdn.j2 checksum=2eb02fb4bdbeff75de0f38c02672a609439b5b00 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 04:00:09 localhost python3[39742]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/service_names.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Oct 14 04:00:09 localhost python3[39785]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1760428808.9861393-71302-247959593829786/source dest=/etc/puppet/hieradata/service_names.json mode=None follow=False _original_basename=service_names.j2 checksum=ff586b96402d8ae133745cf06f17e772b2f22d52 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 04:00:10 localhost python3[39847]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/service_configs.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Oct 14 04:00:10 localhost python3[39890]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1760428809.8137424-71302-186091482923405/source dest=/etc/puppet/hieradata/service_configs.json mode=None follow=False 
_original_basename=service_configs.j2 checksum=8391a3a377145b325f1f0c494e2f35795c60fdac backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 04:00:10 localhost python3[39952]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/extraconfig.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Oct 14 04:00:11 localhost python3[39995]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1760428810.6072168-71302-269055401571630/source dest=/etc/puppet/hieradata/extraconfig.json mode=None follow=False _original_basename=extraconfig.j2 checksum=5f36b2ea290645ee34d943220a14b54ee5ea5be5 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 04:00:11 localhost python3[40087]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/role_extraconfig.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Oct 14 04:00:12 localhost python3[40155]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1760428811.3702383-71302-3085236172905/source dest=/etc/puppet/hieradata/role_extraconfig.json mode=None follow=False _original_basename=role_extraconfig.j2 checksum=34875968bf996542162e620523f9dcfb3deac331 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 04:00:12 localhost python3[40223]: ansible-ansible.legacy.stat Invoked with 
path=/etc/puppet/hieradata/ovn_chassis_mac_map.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Oct 14 04:00:12 localhost python3[40281]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1760428812.2361991-71302-3196884640545/source dest=/etc/puppet/hieradata/ovn_chassis_mac_map.json mode=None follow=False _original_basename=ovn_chassis_mac_map.j2 checksum=ac6ba2437b60b77f26fbe88089c3be0fef46eebf backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 04:00:13 localhost python3[40311]: ansible-stat Invoked with path={'src': '/etc/puppet/hieradata/ansible_managed.json'} follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Oct 14 04:00:14 localhost python3[40359]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/ansible_managed.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Oct 14 04:00:14 localhost python3[40402]: ansible-ansible.legacy.copy Invoked with dest=/etc/puppet/hieradata/ansible_managed.json owner=root group=root mode=0644 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1760428814.2531219-72096-140338549597719/source _original_basename=tmpo3ahwyoh follow=False checksum=bf21a9e8fbc5a3846fb05b4fa0859e0917b2202f backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 04:00:19 localhost python3[40432]: ansible-setup Invoked with gather_subset=['!all', '!min', 'network'] filter=['ansible_default_ipv4'] gather_timeout=10 fact_path=/etc/ansible/facts.d Oct 14 04:00:21 localhost python3[40493]: 
ansible-ansible.legacy.command Invoked with _raw_params=ping -w 10 -c 5 38.102.83.1 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Oct 14 04:00:22 localhost sshd[40495]: main: sshd: ssh-rsa algorithm is disabled Oct 14 04:00:25 localhost python3[40512]: ansible-ansible.legacy.command Invoked with _raw_params=ping -w 10 -c 5 192.168.122.10 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Oct 14 04:00:30 localhost python3[40529]: ansible-ansible.legacy.command Invoked with _raw_params=INT=$(ip ro get 192.168.122.106 | head -1 | sed -nr "s/.* dev (\w+) .*/\1/p")#012MTU=$(cat /sys/class/net/${INT}/mtu 2>/dev/null || echo "0")#012echo "$INT $MTU"#012 _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Oct 14 04:00:30 localhost systemd[35868]: Starting Mark boot as successful... Oct 14 04:00:30 localhost systemd[35868]: Finished Mark boot as successful. 
Oct 14 04:00:30 localhost python3[40553]: ansible-ansible.legacy.command Invoked with _raw_params=ping -w 10 -c 5 192.168.122.106 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Oct 14 04:00:35 localhost python3[40570]: ansible-ansible.legacy.command Invoked with _raw_params=INT=$(ip ro get 172.18.0.106 | head -1 | sed -nr "s/.* dev (\w+) .*/\1/p")#012MTU=$(cat /sys/class/net/${INT}/mtu 2>/dev/null || echo "0")#012echo "$INT $MTU"#012 _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Oct 14 04:00:35 localhost python3[40593]: ansible-ansible.legacy.command Invoked with _raw_params=ping -w 10 -c 5 172.18.0.106 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Oct 14 04:00:40 localhost python3[40610]: ansible-ansible.legacy.command Invoked with _raw_params=ping -w 10 -s 1472 -c 5 172.18.0.106 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Oct 14 04:00:44 localhost python3[40627]: ansible-ansible.legacy.command Invoked with _raw_params=INT=$(ip ro get 172.20.0.106 | head -1 | sed -nr "s/.* dev (\w+) .*/\1/p")#012MTU=$(cat /sys/class/net/${INT}/mtu 2>/dev/null || echo "0")#012echo "$INT $MTU"#012 _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Oct 14 04:00:45 localhost python3[40650]: ansible-ansible.legacy.command Invoked with _raw_params=ping -w 10 -c 5 172.20.0.106 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Oct 14 04:00:49 localhost python3[40667]: ansible-ansible.legacy.command Invoked with _raw_params=ping -w 10 -s 1472 -c 5 172.20.0.106 
_uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Oct 14 04:00:54 localhost python3[40684]: ansible-ansible.legacy.command Invoked with _raw_params=INT=$(ip ro get 172.17.0.106 | head -1 | sed -nr "s/.* dev (\w+) .*/\1/p")#012MTU=$(cat /sys/class/net/${INT}/mtu 2>/dev/null || echo "0")#012echo "$INT $MTU"#012 _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Oct 14 04:00:54 localhost python3[40707]: ansible-ansible.legacy.command Invoked with _raw_params=ping -w 10 -c 5 172.17.0.106 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Oct 14 04:00:59 localhost python3[40724]: ansible-ansible.legacy.command Invoked with _raw_params=ping -w 10 -s 1472 -c 5 172.17.0.106 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Oct 14 04:01:03 localhost python3[40752]: ansible-ansible.legacy.command Invoked with _raw_params=INT=$(ip ro get 172.19.0.106 | head -1 | sed -nr "s/.* dev (\w+) .*/\1/p")#012MTU=$(cat /sys/class/net/${INT}/mtu 2>/dev/null || echo "0")#012echo "$INT $MTU"#012 _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Oct 14 04:01:03 localhost python3[40775]: ansible-ansible.legacy.command Invoked with _raw_params=ping -w 10 -c 5 172.19.0.106 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Oct 14 04:01:08 localhost python3[40792]: ansible-ansible.legacy.command Invoked with _raw_params=ping -w 10 -s 1472 -c 5 172.19.0.106 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None 
stdin=None Oct 14 04:01:14 localhost python3[40870]: ansible-file Invoked with path=/etc/puppet/hieradata state=directory mode=448 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 04:01:14 localhost python3[40918]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hiera.yaml follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Oct 14 04:01:15 localhost python3[40936]: ansible-ansible.legacy.file Invoked with mode=384 dest=/etc/puppet/hiera.yaml _original_basename=tmpgqd4jfhv recurse=False state=file path=/etc/puppet/hiera.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 04:01:15 localhost python3[40966]: ansible-file Invoked with src=/etc/puppet/hiera.yaml dest=/etc/hiera.yaml state=link force=True path=/etc/hiera.yaml recurse=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 04:01:16 localhost python3[41029]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/all_nodes.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Oct 14 04:01:16 localhost python3[41047]: ansible-ansible.legacy.file Invoked with dest=/etc/puppet/hieradata/all_nodes.json _original_basename=overcloud.json recurse=False state=file 
path=/etc/puppet/hieradata/all_nodes.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 04:01:17 localhost python3[41109]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/bootstrap_node.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Oct 14 04:01:17 localhost python3[41127]: ansible-ansible.legacy.file Invoked with mode=None dest=/etc/puppet/hieradata/bootstrap_node.json _original_basename=bootstrap_node.j2 recurse=False state=file path=/etc/puppet/hieradata/bootstrap_node.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 04:01:18 localhost python3[41189]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/vip_data.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Oct 14 04:01:18 localhost python3[41207]: ansible-ansible.legacy.file Invoked with mode=None dest=/etc/puppet/hieradata/vip_data.json _original_basename=vip_data.j2 recurse=False state=file path=/etc/puppet/hieradata/vip_data.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 04:01:18 localhost python3[41269]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/net_ip_map.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False 
get_mime=True get_attributes=True Oct 14 04:01:19 localhost python3[41287]: ansible-ansible.legacy.file Invoked with mode=None dest=/etc/puppet/hieradata/net_ip_map.json _original_basename=net_ip_map.j2 recurse=False state=file path=/etc/puppet/hieradata/net_ip_map.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 04:01:19 localhost python3[41349]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/cloud_domain.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Oct 14 04:01:19 localhost python3[41367]: ansible-ansible.legacy.file Invoked with mode=None dest=/etc/puppet/hieradata/cloud_domain.json _original_basename=cloud_domain.j2 recurse=False state=file path=/etc/puppet/hieradata/cloud_domain.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 04:01:20 localhost python3[41429]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/fqdn.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Oct 14 04:01:20 localhost python3[41447]: ansible-ansible.legacy.file Invoked with mode=None dest=/etc/puppet/hieradata/fqdn.json _original_basename=fqdn.j2 recurse=False state=file path=/etc/puppet/hieradata/fqdn.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 
04:01:21 localhost python3[41509]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/service_names.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Oct 14 04:01:21 localhost python3[41527]: ansible-ansible.legacy.file Invoked with mode=None dest=/etc/puppet/hieradata/service_names.json _original_basename=service_names.j2 recurse=False state=file path=/etc/puppet/hieradata/service_names.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 04:01:22 localhost python3[41589]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/service_configs.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Oct 14 04:01:22 localhost python3[41607]: ansible-ansible.legacy.file Invoked with mode=None dest=/etc/puppet/hieradata/service_configs.json _original_basename=service_configs.j2 recurse=False state=file path=/etc/puppet/hieradata/service_configs.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 04:01:22 localhost python3[41669]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/extraconfig.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Oct 14 04:01:23 localhost python3[41687]: ansible-ansible.legacy.file Invoked with mode=None dest=/etc/puppet/hieradata/extraconfig.json _original_basename=extraconfig.j2 recurse=False state=file path=/etc/puppet/hieradata/extraconfig.json force=False follow=True 
modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 04:01:23 localhost python3[41749]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/role_extraconfig.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Oct 14 04:01:23 localhost python3[41767]: ansible-ansible.legacy.file Invoked with mode=None dest=/etc/puppet/hieradata/role_extraconfig.json _original_basename=role_extraconfig.j2 recurse=False state=file path=/etc/puppet/hieradata/role_extraconfig.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 04:01:24 localhost python3[41829]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/ovn_chassis_mac_map.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Oct 14 04:01:24 localhost python3[41847]: ansible-ansible.legacy.file Invoked with mode=None dest=/etc/puppet/hieradata/ovn_chassis_mac_map.json _original_basename=ovn_chassis_mac_map.j2 recurse=False state=file path=/etc/puppet/hieradata/ovn_chassis_mac_map.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 04:01:25 localhost python3[41877]: ansible-stat Invoked with path={'src': '/etc/puppet/hieradata/ansible_managed.json'} follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True 
checksum_algorithm=sha1 Oct 14 04:01:25 localhost python3[41925]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/ansible_managed.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Oct 14 04:01:26 localhost python3[41943]: ansible-ansible.legacy.file Invoked with owner=root group=root mode=0644 dest=/etc/puppet/hieradata/ansible_managed.json _original_basename=tmp7ojlvl94 recurse=False state=file path=/etc/puppet/hieradata/ansible_managed.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 04:01:28 localhost python3[41973]: ansible-dnf Invoked with name=['firewalld'] state=absent allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None Oct 14 04:01:33 localhost python3[41990]: ansible-ansible.builtin.systemd Invoked with name=iptables.service state=stopped enabled=False daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Oct 14 04:01:35 localhost python3[42008]: ansible-ansible.builtin.systemd Invoked with name=ip6tables.service state=stopped enabled=False daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Oct 14 04:01:35 localhost python3[42026]: ansible-ansible.builtin.systemd Invoked with name=nftables state=started enabled=True daemon_reload=False daemon_reexec=False scope=system 
no_block=False force=None masked=None Oct 14 04:01:35 localhost systemd[1]: Reloading. Oct 14 04:01:35 localhost systemd-sysv-generator[42059]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Oct 14 04:01:35 localhost systemd-rc-local-generator[42054]: /etc/rc.d/rc.local is not marked executable, skipping. Oct 14 04:01:35 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Oct 14 04:01:35 localhost systemd[1]: Starting Netfilter Tables... Oct 14 04:01:36 localhost systemd[1]: Finished Netfilter Tables. Oct 14 04:01:36 localhost python3[42116]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Oct 14 04:01:37 localhost python3[42159]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/iptables.nft src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1760428896.3602686-74942-168038171278314/source _original_basename=iptables.nft follow=False checksum=ede9860c99075946a7bc827210247aac639bc84a backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 04:01:37 localhost python3[42189]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/iptables.nft _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Oct 14 04:01:37 localhost python3[42207]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False 
stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Oct 14 04:01:38 localhost python3[42256]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/tripleo-jumps.nft follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Oct 14 04:01:38 localhost python3[42299]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/tripleo-jumps.nft src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1760428898.149952-75057-279052838213843/source mode=None follow=False _original_basename=jump-chain.j2 checksum=eec306c3276262a27663d76bd0ea526457445afa backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 04:01:39 localhost python3[42361]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/tripleo-update-jumps.nft follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Oct 14 04:01:39 localhost python3[42404]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/tripleo-update-jumps.nft src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1760428899.0969546-75118-118257016715505/source mode=None follow=False _original_basename=jump-chain.j2 checksum=eec306c3276262a27663d76bd0ea526457445afa backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 04:01:40 localhost python3[42466]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/tripleo-flushes.nft follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Oct 14 04:01:40 localhost python3[42509]: ansible-ansible.legacy.copy Invoked with 
dest=/etc/nftables/tripleo-flushes.nft src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1760428900.127376-75189-184314384545035/source mode=None follow=False _original_basename=flush-chain.j2 checksum=e8e7b8db0d61a7fe393441cc91613f470eb34a6e backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 04:01:41 localhost python3[42571]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/tripleo-chains.nft follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Oct 14 04:01:41 localhost python3[42614]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/tripleo-chains.nft src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1760428900.9935641-75226-11640933964882/source mode=None follow=False _original_basename=chains.j2 checksum=e60ee651f5014e83924f4e901ecc8e25b1906610 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 04:01:42 localhost python3[42676]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/tripleo-rules.nft follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Oct 14 04:01:43 localhost python3[42719]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/tripleo-rules.nft src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1760428901.9425552-75278-173962127636358/source mode=None follow=False _original_basename=ruleset.j2 checksum=0444e4206083f91e2fb2aabfa2928244c2db35ed backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None 
attributes=None Oct 14 04:01:43 localhost python3[42749]: ansible-ansible.legacy.command Invoked with _raw_params=cat /etc/nftables/tripleo-chains.nft /etc/nftables/tripleo-flushes.nft /etc/nftables/tripleo-rules.nft /etc/nftables/tripleo-update-jumps.nft /etc/nftables/tripleo-jumps.nft | nft -c -f - _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Oct 14 04:01:44 localhost python3[42814]: ansible-ansible.builtin.blockinfile Invoked with path=/etc/sysconfig/nftables.conf backup=False validate=nft -c -f %s block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/tripleo-chains.nft"#012include "/etc/nftables/tripleo-rules.nft"#012include "/etc/nftables/tripleo-jumps.nft"#012 state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 04:01:44 localhost python3[42831]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/tripleo-chains.nft _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Oct 14 04:01:45 localhost python3[42848]: ansible-ansible.legacy.command Invoked with _raw_params=cat /etc/nftables/tripleo-flushes.nft /etc/nftables/tripleo-rules.nft /etc/nftables/tripleo-update-jumps.nft | nft -f - _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Oct 14 04:01:45 localhost python3[42867]: ansible-file Invoked with mode=0750 path=/var/log/containers/collectd setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None 
modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None Oct 14 04:01:45 localhost python3[42883]: ansible-file Invoked with mode=0755 path=/var/lib/container-user-scripts/ setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None Oct 14 04:01:46 localhost python3[42899]: ansible-file Invoked with mode=0750 path=/var/log/containers/ceilometer setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None Oct 14 04:01:46 localhost python3[42915]: ansible-seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False Oct 14 04:01:47 localhost dbus-broker-launch[755]: avc: op=load_policy lsm=selinux seqno=7 res=1 Oct 14 04:01:47 localhost python3[42935]: ansible-community.general.sefcontext Invoked with setype=container_file_t state=present target=/etc/iscsi(/.*)? ignore_selinux_state=False ftype=a reload=True seuser=None selevel=None Oct 14 04:01:48 localhost kernel: SELinux: Converting 2703 SID table entries... 
Oct 14 04:01:48 localhost kernel: SELinux: policy capability network_peer_controls=1 Oct 14 04:01:48 localhost kernel: SELinux: policy capability open_perms=1 Oct 14 04:01:48 localhost kernel: SELinux: policy capability extended_socket_class=1 Oct 14 04:01:48 localhost kernel: SELinux: policy capability always_check_network=0 Oct 14 04:01:48 localhost kernel: SELinux: policy capability cgroup_seclabel=1 Oct 14 04:01:48 localhost kernel: SELinux: policy capability nnp_nosuid_transition=1 Oct 14 04:01:48 localhost kernel: SELinux: policy capability genfs_seclabel_symlinks=1 Oct 14 04:01:48 localhost dbus-broker-launch[755]: avc: op=load_policy lsm=selinux seqno=8 res=1 Oct 14 04:01:49 localhost python3[42956]: ansible-community.general.sefcontext Invoked with setype=container_file_t state=present target=/etc/target(/.*)? ignore_selinux_state=False ftype=a reload=True seuser=None selevel=None Oct 14 04:01:49 localhost kernel: SELinux: Converting 2703 SID table entries... Oct 14 04:01:49 localhost kernel: SELinux: policy capability network_peer_controls=1 Oct 14 04:01:49 localhost kernel: SELinux: policy capability open_perms=1 Oct 14 04:01:49 localhost kernel: SELinux: policy capability extended_socket_class=1 Oct 14 04:01:49 localhost kernel: SELinux: policy capability always_check_network=0 Oct 14 04:01:49 localhost kernel: SELinux: policy capability cgroup_seclabel=1 Oct 14 04:01:49 localhost kernel: SELinux: policy capability nnp_nosuid_transition=1 Oct 14 04:01:49 localhost kernel: SELinux: policy capability genfs_seclabel_symlinks=1 Oct 14 04:01:50 localhost dbus-broker-launch[755]: avc: op=load_policy lsm=selinux seqno=9 res=1 Oct 14 04:01:50 localhost python3[42978]: ansible-community.general.sefcontext Invoked with setype=container_file_t state=present target=/var/lib/iscsi(/.*)? ignore_selinux_state=False ftype=a reload=True seuser=None selevel=None Oct 14 04:01:50 localhost kernel: SELinux: Converting 2703 SID table entries... 
Oct 14 04:01:50 localhost kernel: SELinux: policy capability network_peer_controls=1 Oct 14 04:01:50 localhost kernel: SELinux: policy capability open_perms=1 Oct 14 04:01:50 localhost kernel: SELinux: policy capability extended_socket_class=1 Oct 14 04:01:50 localhost kernel: SELinux: policy capability always_check_network=0 Oct 14 04:01:51 localhost kernel: SELinux: policy capability cgroup_seclabel=1 Oct 14 04:01:51 localhost kernel: SELinux: policy capability nnp_nosuid_transition=1 Oct 14 04:01:51 localhost kernel: SELinux: policy capability genfs_seclabel_symlinks=1 Oct 14 04:01:51 localhost dbus-broker-launch[755]: avc: op=load_policy lsm=selinux seqno=10 res=1 Oct 14 04:01:51 localhost python3[42999]: ansible-file Invoked with path=/etc/iscsi setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None Oct 14 04:01:51 localhost python3[43015]: ansible-file Invoked with path=/etc/target setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None Oct 14 04:01:52 localhost python3[43031]: ansible-file Invoked with path=/var/lib/iscsi setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None Oct 14 04:01:52 localhost 
python3[43047]: ansible-stat Invoked with path=/lib/systemd/system/iscsid.socket follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Oct 14 04:01:52 localhost python3[43063]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-enabled --quiet iscsi.service _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Oct 14 04:01:53 localhost python3[43080]: ansible-ansible.legacy.dnf Invoked with name=['dracut-config-generic'] state=absent allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None Oct 14 04:01:57 localhost python3[43097]: ansible-file Invoked with path=/etc/modules-load.d state=directory mode=493 owner=root group=root setype=etc_t recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Oct 14 04:01:58 localhost python3[43145]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/99-tripleo.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Oct 14 04:01:58 localhost python3[43188]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1760428917.8317785-76158-50186392321606/source dest=/etc/modules-load.d/99-tripleo.conf mode=420 owner=root group=root setype=etc_t follow=False 
_original_basename=tripleo-modprobe.conf.j2 checksum=8021efe01721d8fa8cab46b95c00ec1be6dbb9d0 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None attributes=None Oct 14 04:01:59 localhost python3[43218]: ansible-systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None Oct 14 04:01:59 localhost systemd[1]: systemd-modules-load.service: Deactivated successfully. Oct 14 04:01:59 localhost systemd[1]: Stopped Load Kernel Modules. Oct 14 04:01:59 localhost systemd[1]: Stopping Load Kernel Modules... Oct 14 04:01:59 localhost systemd[1]: Starting Load Kernel Modules... Oct 14 04:01:59 localhost kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. Oct 14 04:01:59 localhost systemd-modules-load[43221]: Inserted module 'br_netfilter' Oct 14 04:01:59 localhost kernel: Bridge firewalling registered Oct 14 04:01:59 localhost systemd-modules-load[43221]: Module 'msr' is built in Oct 14 04:01:59 localhost systemd[1]: Finished Load Kernel Modules. 
Oct 14 04:01:59 localhost python3[43272]: ansible-ansible.legacy.stat Invoked with path=/etc/sysctl.d/99-tripleo.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Oct 14 04:02:00 localhost python3[43315]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1760428919.3776453-76256-52068172682551/source dest=/etc/sysctl.d/99-tripleo.conf mode=420 owner=root group=root setype=etc_t follow=False _original_basename=tripleo-sysctl.conf.j2 checksum=cddb9401fdafaaf28a4a94b98448f98ae93c94c9 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None attributes=None Oct 14 04:02:00 localhost python3[43345]: ansible-sysctl Invoked with name=fs.aio-max-nr value=1048576 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False Oct 14 04:02:00 localhost python3[43362]: ansible-sysctl Invoked with name=fs.inotify.max_user_instances value=1024 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False Oct 14 04:02:01 localhost python3[43380]: ansible-sysctl Invoked with name=kernel.pid_max value=1048576 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False Oct 14 04:02:01 localhost python3[43398]: ansible-sysctl Invoked with name=net.bridge.bridge-nf-call-arptables value=1 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False Oct 14 04:02:02 localhost python3[43415]: ansible-sysctl Invoked with name=net.bridge.bridge-nf-call-ip6tables value=1 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False Oct 14 04:02:03 localhost python3[43432]: ansible-sysctl Invoked with name=net.bridge.bridge-nf-call-iptables value=1 sysctl_set=True 
state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False Oct 14 04:02:03 localhost python3[43449]: ansible-sysctl Invoked with name=net.ipv4.conf.all.rp_filter value=1 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False Oct 14 04:02:03 localhost python3[43467]: ansible-sysctl Invoked with name=net.ipv4.ip_forward value=1 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False Oct 14 04:02:03 localhost python3[43485]: ansible-sysctl Invoked with name=net.ipv4.ip_local_reserved_ports value=35357,49000-49001 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False Oct 14 04:02:04 localhost python3[43503]: ansible-sysctl Invoked with name=net.ipv4.ip_nonlocal_bind value=1 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False Oct 14 04:02:04 localhost python3[43521]: ansible-sysctl Invoked with name=net.ipv4.neigh.default.gc_thresh1 value=1024 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False Oct 14 04:02:04 localhost python3[43539]: ansible-sysctl Invoked with name=net.ipv4.neigh.default.gc_thresh2 value=2048 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False Oct 14 04:02:05 localhost python3[43557]: ansible-sysctl Invoked with name=net.ipv4.neigh.default.gc_thresh3 value=4096 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False Oct 14 04:02:05 localhost python3[43575]: ansible-sysctl Invoked with name=net.ipv6.conf.all.disable_ipv6 value=0 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False Oct 14 04:02:05 localhost python3[43592]: ansible-sysctl Invoked with name=net.ipv6.conf.all.forwarding value=0 sysctl_set=True state=present 
sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False Oct 14 04:02:06 localhost python3[43609]: ansible-sysctl Invoked with name=net.ipv6.conf.default.disable_ipv6 value=0 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False Oct 14 04:02:06 localhost python3[43626]: ansible-sysctl Invoked with name=net.ipv6.conf.lo.disable_ipv6 value=0 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False Oct 14 04:02:06 localhost python3[43643]: ansible-sysctl Invoked with name=net.ipv6.ip_nonlocal_bind value=1 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False Oct 14 04:02:07 localhost python3[43661]: ansible-systemd Invoked with name=systemd-sysctl.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None Oct 14 04:02:07 localhost systemd[1]: systemd-sysctl.service: Deactivated successfully. Oct 14 04:02:07 localhost systemd[1]: Stopped Apply Kernel Variables. Oct 14 04:02:07 localhost systemd[1]: Stopping Apply Kernel Variables... Oct 14 04:02:07 localhost systemd[1]: Starting Apply Kernel Variables... Oct 14 04:02:07 localhost systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully. Oct 14 04:02:07 localhost systemd[1]: Finished Apply Kernel Variables. 
Oct 14 04:02:07 localhost python3[43682]: ansible-file Invoked with mode=0750 path=/var/log/containers/metrics_qdr setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None Oct 14 04:02:08 localhost python3[43698]: ansible-file Invoked with path=/var/lib/metrics_qdr setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None Oct 14 04:02:08 localhost python3[43714]: ansible-file Invoked with mode=0750 path=/var/log/containers/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None Oct 14 04:02:08 localhost python3[43730]: ansible-stat Invoked with path=/var/lib/nova/instances follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Oct 14 04:02:09 localhost python3[43746]: ansible-file Invoked with path=/var/lib/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None Oct 14 04:02:09 localhost 
python3[43762]: ansible-file Invoked with path=/var/lib/_nova_secontext setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None Oct 14 04:02:09 localhost python3[43778]: ansible-file Invoked with path=/var/lib/nova/instances setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None Oct 14 04:02:09 localhost python3[43794]: ansible-file Invoked with path=/var/lib/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None Oct 14 04:02:10 localhost python3[43810]: ansible-file Invoked with path=/etc/tmpfiles.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 04:02:10 localhost python3[43858]: ansible-ansible.legacy.stat Invoked with path=/etc/tmpfiles.d/run-nova.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Oct 14 04:02:11 localhost python3[43901]: 
ansible-ansible.legacy.copy Invoked with dest=/etc/tmpfiles.d/run-nova.conf src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1760428930.546881-76693-100975331198765/source _original_basename=tmpzwz4r0en follow=False checksum=f834349098718ec09c7562bcb470b717a83ff411 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 04:02:11 localhost python3[43931]: ansible-ansible.legacy.command Invoked with _raw_params=systemd-tmpfiles --create _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Oct 14 04:02:13 localhost python3[43948]: ansible-file Invoked with path=/var/lib/tripleo-config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 04:02:13 localhost python3[43996]: ansible-ansible.legacy.stat Invoked with path=/var/lib/nova/delay-nova-compute follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Oct 14 04:02:13 localhost python3[44039]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/nova/delay-nova-compute mode=493 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1760428933.2463076-76862-240384122246892/source _original_basename=tmp6z9eewr1 follow=False checksum=f07ad3e8cf3766b3b3b07ae8278826a0ef3bb5e3 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 04:02:14 localhost 
python3[44069]: ansible-file Invoked with mode=0750 path=/var/log/containers/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None Oct 14 04:02:14 localhost python3[44085]: ansible-file Invoked with path=/etc/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None Oct 14 04:02:15 localhost python3[44101]: ansible-file Invoked with path=/etc/libvirt/secrets setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None Oct 14 04:02:15 localhost python3[44117]: ansible-file Invoked with path=/etc/libvirt/qemu setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None Oct 14 04:02:15 localhost python3[44133]: ansible-file Invoked with path=/var/lib/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False 
_original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None Oct 14 04:02:16 localhost python3[44163]: ansible-file Invoked with path=/var/cache/libvirt state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 04:02:16 localhost python3[44195]: ansible-file Invoked with path=/var/lib/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None Oct 14 04:02:16 localhost python3[44236]: ansible-file Invoked with path=/run/libvirt state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 04:02:16 localhost python3[44278]: ansible-file Invoked with mode=0770 path=/var/log/containers/libvirt/swtpm setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None Oct 14 04:02:17 localhost python3[44327]: ansible-group Invoked with gid=107 name=qemu state=present 
system=False local=False non_unique=False Oct 14 04:02:17 localhost python3[44349]: ansible-user Invoked with comment=qemu user group=qemu name=qemu shell=/sbin/nologin state=present uid=107 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005486733.localdomain update_password=always groups=None home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None Oct 14 04:02:18 localhost python3[44388]: ansible-file Invoked with group=qemu owner=qemu path=/var/lib/vhost_sockets setype=virt_cache_t seuser=system_u state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None serole=None selevel=None attributes=None Oct 14 04:02:18 localhost python3[44404]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/bin/rpm -q libvirt-daemon _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Oct 14 04:02:18 localhost python3[44453]: ansible-ansible.legacy.stat Invoked with path=/etc/tmpfiles.d/run-libvirt.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Oct 14 04:02:19 localhost python3[44496]: ansible-ansible.legacy.copy Invoked with dest=/etc/tmpfiles.d/run-libvirt.conf src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1760428938.6985376-77324-22224591842150/source _original_basename=tmpbrc4ikh5 follow=False checksum=57f3ff94c666c6aae69ae22e23feb750cf9e8b13 backup=False 
force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 04:02:19 localhost python3[44526]: ansible-seboolean Invoked with name=os_enable_vtpm persistent=True state=True ignore_selinux_state=False Oct 14 04:02:20 localhost dbus-broker-launch[755]: avc: op=load_policy lsm=selinux seqno=11 res=1 Oct 14 04:02:20 localhost python3[44546]: ansible-file Invoked with path=/run/libvirt setype=virt_var_run_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None Oct 14 04:02:21 localhost python3[44562]: ansible-seboolean Invoked with name=logrotate_read_inside_containers persistent=True state=True ignore_selinux_state=False Oct 14 04:02:22 localhost dbus-broker-launch[755]: avc: op=load_policy lsm=selinux seqno=12 res=1 Oct 14 04:02:22 localhost python3[44583]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None Oct 14 04:02:25 localhost python3[44600]: ansible-setup Invoked with gather_subset=['!all', '!min', 'network'] filter=['ansible_interfaces'] gather_timeout=10 fact_path=/etc/ansible/facts.d Oct 14 04:02:26 localhost python3[44661]: ansible-file 
Invoked with path=/etc/containers/networks state=directory recurse=True mode=493 owner=root group=root force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 04:02:26 localhost python3[44677]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman#012 _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Oct 14 04:02:27 localhost python3[44737]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Oct 14 04:02:27 localhost python3[44780]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1760428947.0000236-77571-145352373061207/source dest=/etc/containers/networks/podman.json mode=0644 owner=root group=root follow=False _original_basename=podman_network_config.j2 checksum=ea25f21a255459f512e5e01de4ea914d3ecdf5a1 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 04:02:28 localhost python3[44842]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Oct 14 04:02:28 localhost python3[44887]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1760428948.1026218-77624-102166085656288/source dest=/etc/containers/registries.conf owner=root group=root setype=etc_t mode=0644 follow=False _original_basename=registries.conf.j2 
checksum=710a00cfb11a4c3eba9c028ef1984a9fea9ba83a backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None attributes=None Oct 14 04:02:29 localhost python3[44917]: ansible-ini_file Invoked with path=/etc/containers/containers.conf owner=root group=root setype=etc_t mode=0644 create=True section=containers option=pids_limit value=4096 backup=False state=present exclusive=True no_extra_spaces=False allow_no_value=False unsafe_writes=False values=None seuser=None serole=None selevel=None attributes=None Oct 14 04:02:29 localhost python3[44933]: ansible-ini_file Invoked with path=/etc/containers/containers.conf owner=root group=root setype=etc_t mode=0644 create=True section=engine option=events_logger value="journald" backup=False state=present exclusive=True no_extra_spaces=False allow_no_value=False unsafe_writes=False values=None seuser=None serole=None selevel=None attributes=None Oct 14 04:02:29 localhost python3[44949]: ansible-ini_file Invoked with path=/etc/containers/containers.conf owner=root group=root setype=etc_t mode=0644 create=True section=engine option=runtime value="crun" backup=False state=present exclusive=True no_extra_spaces=False allow_no_value=False unsafe_writes=False values=None seuser=None serole=None selevel=None attributes=None Oct 14 04:02:30 localhost python3[44965]: ansible-ini_file Invoked with path=/etc/containers/containers.conf owner=root group=root setype=etc_t mode=0644 create=True section=network option=network_backend value="netavark" backup=False state=present exclusive=True no_extra_spaces=False allow_no_value=False unsafe_writes=False values=None seuser=None serole=None selevel=None attributes=None Oct 14 04:02:30 localhost python3[45013]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True 
get_attributes=True Oct 14 04:02:31 localhost python3[45056]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysconfig/podman_drop_in src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1760428950.6498282-77794-268938116765287/source _original_basename=tmpfx54emqt follow=False checksum=0bfbc70e9a4740c9004b9947da681f723d529c83 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 04:02:31 localhost python3[45086]: ansible-file Invoked with mode=0750 path=/var/log/containers/rsyslog setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None Oct 14 04:02:32 localhost python3[45102]: ansible-file Invoked with path=/var/lib/rsyslog.container setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None Oct 14 04:02:32 localhost python3[45118]: ansible-ansible.legacy.dnf Invoked with name=['openssh-server'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None 
disable_excludes=None download_dir=None list=None releasever=None Oct 14 04:02:36 localhost python3[45167]: ansible-ansible.legacy.stat Invoked with path=/etc/ssh/sshd_config follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Oct 14 04:02:36 localhost python3[45212]: ansible-ansible.legacy.copy Invoked with dest=/etc/ssh/sshd_config src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1760428955.9626298-78098-257873281496966/source validate=/usr/sbin/sshd -T -f %s mode=None follow=False _original_basename=sshd_config_block.j2 checksum=913c99ed7d5c33615bfb07a6792a4ef143dcfd2b backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 04:02:37 localhost python3[45243]: ansible-systemd Invoked with name=sshd state=restarted enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Oct 14 04:02:37 localhost systemd[1]: Stopping OpenSSH server daemon... Oct 14 04:02:37 localhost systemd[1]: sshd.service: Deactivated successfully. Oct 14 04:02:37 localhost systemd[1]: Stopped OpenSSH server daemon. Oct 14 04:02:37 localhost systemd[1]: sshd.service: Consumed 1.933s CPU time, read 2.0M from disk, written 12.0K to disk. Oct 14 04:02:37 localhost systemd[1]: Stopped target sshd-keygen.target. Oct 14 04:02:37 localhost systemd[1]: Stopping sshd-keygen.target... Oct 14 04:02:37 localhost systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target). Oct 14 04:02:37 localhost systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target). 
Oct 14 04:02:37 localhost systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Oct 14 04:02:37 localhost systemd[1]: Reached target sshd-keygen.target.
Oct 14 04:02:37 localhost systemd[1]: Starting OpenSSH server daemon...
Oct 14 04:02:37 localhost sshd[45247]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 04:02:37 localhost systemd[1]: Started OpenSSH server daemon.
Oct 14 04:02:37 localhost python3[45263]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-active ntpd.service || systemctl is-enabled ntpd.service _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 04:02:38 localhost python3[45281]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-active ntpd.service || systemctl is-enabled ntpd.service _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 04:02:40 localhost python3[45299]: ansible-ansible.legacy.dnf Invoked with name=['chrony'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Oct 14 04:02:41 localhost ceph-osd[31500]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Oct 14 04:02:41 localhost ceph-osd[31500]: rocksdb: [db/db_impl/db_impl.cc:1111]
** DB Stats **
Uptime(secs): 600.1 total, 600.0 interval
Cumulative writes: 3258 writes, 16K keys, 3258 commit groups, 1.0 writes per commit group, ingest: 0.01 GB, 0.02 MB/s
Cumulative WAL: 3258 writes, 145 syncs, 22.47 writes per sync, written: 0.01 GB, 0.02 MB/s
Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
Interval writes: 3258 writes, 16K keys, 3258 commit groups, 1.0 writes per commit group, ingest: 14.66 MB, 0.02 MB/s
Interval WAL: 3258 writes, 145 syncs, 22.47 writes per sync, written: 0.01 GB, 0.02 MB/s
Interval stall: 00:00:0.000 H:M:S, 0.0 percent

** Compaction Stats [default] **
Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
 L0 2/0 2.61 KB 0.2 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.2 0.01 0.00 1 0.007 0 0 0.0 0.0
 Sum 2/0 2.61 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.2 0.01 0.00 1 0.007 0 0 0.0 0.0
 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0

** Compaction Stats [default] **
Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
User 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.2 0.01 0.00 1 0.007 0 0 0.0 0.0

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 600.1 total, 600.0 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
Block cache BinnedLRUCache@0x5613efba7610#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 3.4e-05 secs_since: 0
Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)

** File Read Latency Histogram By Level [default] **

** Compaction Stats [m-0] **
Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
 Sum 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0
 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0

** Compaction Stats [m-0] **
Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 600.1 total, 600.0 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
Block cache BinnedLRUCache@0x5613efba7610#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 3.4e-05 secs_since: 0
Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)

** File Read Latency Histogram By Level [m-0] **

** Compaction Stats [m-1] **
Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
 Sum 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0
 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0

** Compaction Stats [m-1] **
Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 600.1 total, 600.0 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memt
Oct 14 04:02:44 localhost python3[45348]: ansible-ansible.legacy.stat Invoked with path=/etc/chrony.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 14 04:02:44 localhost python3[45366]: ansible-ansible.legacy.file Invoked with owner=root group=root mode=420 dest=/etc/chrony.conf _original_basename=chrony.conf.j2 recurse=False state=file path=/etc/chrony.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 04:02:45 localhost python3[45396]:
ansible-ansible.legacy.systemd Invoked with enabled=True name=chronyd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 14 04:02:45 localhost ceph-osd[32440]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Oct 14 04:02:45 localhost ceph-osd[32440]: rocksdb: [db/db_impl/db_impl.cc:1111]
** DB Stats **
Uptime(secs): 600.1 total, 600.0 interval
Cumulative writes: 3282 writes, 16K keys, 3282 commit groups, 1.0 writes per commit group, ingest: 0.01 GB, 0.02 MB/s
Cumulative WAL: 3282 writes, 155 syncs, 21.17 writes per sync, written: 0.01 GB, 0.02 MB/s
Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
Interval writes: 3282 writes, 16K keys, 3282 commit groups, 1.0 writes per commit group, ingest: 14.82 MB, 0.02 MB/s
Interval WAL: 3282 writes, 155 syncs, 21.17 writes per sync, written: 0.01 GB, 0.02 MB/s
Interval stall: 00:00:0.000 H:M:S, 0.0 percent

** Compaction Stats [default] **
Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
 L0 2/0 2.61 KB 0.2 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.3 0.00 0.00 1 0.004 0 0 0.0 0.0
 Sum 2/0 2.61 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.3 0.00 0.00 1 0.004 0 0 0.0 0.0
 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0

** Compaction Stats [default] **
Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
User 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.3 0.00 0.00 1 0.004 0 0 0.0 0.0

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 600.1 total, 600.0 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
Block cache BinnedLRUCache@0x55b0d22fc2d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 4e-05 secs_since: 0
Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)

** File Read Latency Histogram By Level [default] **

** Compaction Stats [m-0] **
Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
 Sum 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0
 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0

** Compaction Stats [m-0] **
Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 600.1 total, 600.0 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
Block cache BinnedLRUCache@0x55b0d22fc2d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 4e-05 secs_since: 0
Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)

** File Read Latency Histogram By Level [m-0] **

** Compaction Stats [m-1] **
Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
 Sum 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0
 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0

** Compaction Stats [m-1] **
Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 600.1 total, 600.0 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable
Oct 14 04:02:45 localhost python3[45446]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/chrony-online.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 14 04:02:45 localhost python3[45464]: ansible-ansible.legacy.file Invoked with
dest=/etc/systemd/system/chrony-online.service _original_basename=chrony-online.service recurse=False state=file path=/etc/systemd/system/chrony-online.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 04:02:46 localhost python3[45494]: ansible-systemd Invoked with state=started name=chrony-online.service enabled=True daemon-reload=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None Oct 14 04:02:46 localhost systemd[1]: Reloading. Oct 14 04:02:46 localhost systemd-rc-local-generator[45519]: /etc/rc.d/rc.local is not marked executable, skipping. Oct 14 04:02:46 localhost systemd-sysv-generator[45524]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Oct 14 04:02:46 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Oct 14 04:02:46 localhost systemd[1]: Starting chronyd online sources service... Oct 14 04:02:46 localhost chronyc[45534]: 200 OK Oct 14 04:02:46 localhost systemd[1]: chrony-online.service: Deactivated successfully. Oct 14 04:02:46 localhost systemd[1]: Finished chronyd online sources service. 
Oct 14 04:02:47 localhost python3[45550]: ansible-ansible.legacy.command Invoked with _raw_params=chronyc makestep _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Oct 14 04:02:47 localhost chronyd[25772]: System clock was stepped by -0.000025 seconds Oct 14 04:02:47 localhost python3[45567]: ansible-ansible.legacy.command Invoked with _raw_params=chronyc waitsync 30 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Oct 14 04:02:48 localhost python3[45584]: ansible-ansible.legacy.command Invoked with _raw_params=chronyc makestep _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Oct 14 04:02:48 localhost chronyd[25772]: System clock was stepped by 0.000000 seconds Oct 14 04:02:48 localhost python3[45601]: ansible-ansible.legacy.command Invoked with _raw_params=chronyc waitsync 30 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Oct 14 04:02:49 localhost python3[45618]: ansible-timezone Invoked with name=UTC hwclock=None Oct 14 04:02:49 localhost systemd[1]: Starting Time & Date Service... Oct 14 04:02:49 localhost systemd[1]: Started Time & Date Service. 
Oct 14 04:02:50 localhost python3[45638]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -q --whatprovides tuned tuned-profiles-cpu-partitioning _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Oct 14 04:02:50 localhost python3[45655]: ansible-ansible.legacy.command Invoked with _raw_params=which tuned-adm _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Oct 14 04:02:51 localhost python3[45672]: ansible-slurp Invoked with src=/etc/tuned/active_profile Oct 14 04:02:51 localhost python3[45688]: ansible-stat Invoked with path=/etc/tuned/throughput-performance-variables.conf follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Oct 14 04:02:52 localhost python3[45704]: ansible-file Invoked with mode=0750 path=/var/log/containers/openvswitch setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None Oct 14 04:02:52 localhost python3[45720]: ansible-file Invoked with path=/var/lib/openvswitch/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None Oct 14 04:02:53 localhost python3[45768]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/neutron-cleanup follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Oct 14 04:02:53 
localhost python3[45811]: ansible-ansible.legacy.copy Invoked with dest=/usr/libexec/neutron-cleanup force=True mode=0755 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1760428973.0634289-79137-175222082218747/source _original_basename=tmp9v4qgh5n follow=False checksum=f9cc7d1e91fbae49caa7e35eb2253bba146a73b4 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 04:02:54 localhost python3[45873]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/neutron-cleanup.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Oct 14 04:02:54 localhost python3[45916]: ansible-ansible.legacy.copy Invoked with dest=/usr/lib/systemd/system/neutron-cleanup.service force=True src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1760428973.8941565-79177-104590529011402/source _original_basename=tmpaf9_tlp5 follow=False checksum=6b6cd9f074903a28d054eb530a10c7235d0c39fc backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 04:02:55 localhost python3[45946]: ansible-ansible.legacy.systemd Invoked with enabled=True name=neutron-cleanup daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None Oct 14 04:02:55 localhost systemd[1]: Reloading. Oct 14 04:02:55 localhost systemd-sysv-generator[45975]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. 
Oct 14 04:02:55 localhost systemd-rc-local-generator[45970]: /etc/rc.d/rc.local is not marked executable, skipping. Oct 14 04:02:55 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Oct 14 04:02:55 localhost python3[45999]: ansible-file Invoked with mode=0750 path=/var/log/containers/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None Oct 14 04:02:56 localhost python3[46015]: ansible-ansible.legacy.command Invoked with _raw_params=ip netns add ns_temp _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Oct 14 04:02:56 localhost python3[46032]: ansible-ansible.legacy.command Invoked with _raw_params=ip netns delete ns_temp _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Oct 14 04:02:56 localhost systemd[1]: run-netns-ns_temp.mount: Deactivated successfully. 
Oct 14 04:02:56 localhost python3[46049]: ansible-file Invoked with path=/var/lib/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None Oct 14 04:02:57 localhost python3[46065]: ansible-file Invoked with path=/var/lib/neutron/kill_scripts state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 04:02:57 localhost python3[46113]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/kill_scripts/haproxy-kill follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Oct 14 04:02:58 localhost python3[46156]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/kill_scripts/haproxy-kill mode=493 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1760428977.3749747-79358-181503134889263/source _original_basename=tmpfcuh7zug follow=False checksum=2f369fbe8f83639cdfd4efc53e7feb4ee77d1ed7 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 04:03:19 localhost systemd[1]: systemd-timedated.service: Deactivated successfully. 
Oct 14 04:03:22 localhost python3[46265]: ansible-file Invoked with path=/var/log/containers state=directory setype=container_file_t selevel=s0 mode=488 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None Oct 14 04:03:23 localhost python3[46281]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory selevel=s0 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None setype=None attributes=None Oct 14 04:03:23 localhost python3[46297]: ansible-file Invoked with path=/var/lib/tripleo-config state=directory setype=container_file_t selevel=s0 recurse=True force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None Oct 14 04:03:24 localhost python3[46313]: ansible-file Invoked with path=/var/lib/container-startup-configs.json state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 04:03:24 localhost python3[46329]: ansible-file Invoked with path=/var/lib/docker-container-startup-configs.json state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S 
unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 04:03:24 localhost python3[46345]: ansible-community.general.sefcontext Invoked with target=/var/lib/container-config-scripts(/.*)? setype=container_file_t state=present ignore_selinux_state=False ftype=a reload=True seuser=None selevel=None Oct 14 04:03:25 localhost kernel: SELinux: Converting 2706 SID table entries... Oct 14 04:03:25 localhost kernel: SELinux: policy capability network_peer_controls=1 Oct 14 04:03:25 localhost kernel: SELinux: policy capability open_perms=1 Oct 14 04:03:25 localhost kernel: SELinux: policy capability extended_socket_class=1 Oct 14 04:03:25 localhost kernel: SELinux: policy capability always_check_network=0 Oct 14 04:03:25 localhost kernel: SELinux: policy capability cgroup_seclabel=1 Oct 14 04:03:25 localhost kernel: SELinux: policy capability nnp_nosuid_transition=1 Oct 14 04:03:25 localhost kernel: SELinux: policy capability genfs_seclabel_symlinks=1 Oct 14 04:03:25 localhost dbus-broker-launch[755]: avc: op=load_policy lsm=selinux seqno=13 res=1 Oct 14 04:03:26 localhost python3[46367]: ansible-file Invoked with path=/var/lib/container-config-scripts state=directory setype=container_file_t recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None Oct 14 04:03:28 localhost python3[46504]: ansible-container_startup_config Invoked with config_base_dir=/var/lib/tripleo-config/container-startup-config config_data={'step_1': {'metrics_qdr': {'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, 'metrics_qdr_init_logs': {'command': ['/bin/bash', '-c', 'chown -R qdrouterd:qdrouterd /var/log/qdrouterd'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'none', 'privileged': False, 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}}, 'step_2': {'create_haproxy_wrapper': {'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::neutron::ovn_metadata_agent_wrappers'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z']}, 'create_virtlogd_wrapper': {'cgroupns': 'host', 'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::nova::virtlogd_wrapper'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1760428406'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/container-config-scripts:/var/lib/container-config-scripts:shared,z']}, 'nova_compute_init_log': {'command': ['/bin/bash', '-c', 'chown -R nova:nova /var/log/nova'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1760428406'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'user': 'root', 'volumes': ['/var/log/containers/nova:/var/log/nova:z']}, 'nova_virtqemud_init_logs': {'command': ['/bin/bash', '-c', 'chown -R tss:tss /var/log/swtpm'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1760428406'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'none', 
'privileged': True, 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'user': 'root', 'volumes': ['/var/log/containers/libvirt/swtpm:/var/log/swtpm:shared,z']}}, 'step_3': {'ceilometer_init_log': {'command': ['/bin/bash', '-c', 'chown -R ceilometer:ceilometer /var/log/ceilometer'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'none', 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/ceilometer:/var/log/ceilometer:z']}, 'collectd': {'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, 'iscsid': {'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 
'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, 'nova_statedir_owner': {'command': '/container-config-scripts/pyshim.sh /container-config-scripts/nova_statedir_ownership.py', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': 'triliovault-mounts', 'TRIPLEO_DEPLOY_IDENTIFIER': '1760428406', '__OS_DEBUG': 'true'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/container-config-scripts:/container-config-scripts:z']}, 'nova_virtlogd_wrapper': {'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 0, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': [ Oct 14 04:03:28 localhost rsyslogd[759]: message too long (31243) with configured size 8096, begin of message is: ansible-container_startup_config 
Invoked with config_base_dir=/var/lib/tripleo-c [v8.2102.0-111.el9 try https://www.rsyslog.com/e/2445 ] Oct 14 04:03:28 localhost python3[46520]: ansible-file Invoked with path=/var/lib/kolla/config_files state=directory setype=container_file_t selevel=s0 recurse=True force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None Oct 14 04:03:29 localhost python3[46536]: ansible-file Invoked with path=/var/lib/config-data mode=493 state=directory setype=container_file_t selevel=s0 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None Oct 14 04:03:29 localhost python3[46552]: ansible-tripleo_container_configs Invoked with config_data={'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json': {'command': '/usr/bin/ceilometer-polling --polling-namespaces ipmi --logfile /var/log/ceilometer/ipmi.log', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}]}, '/var/lib/kolla/config_files/ceilometer_agent_compute.json': {'command': '/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /var/log/ceilometer/compute.log', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}]}, '/var/lib/kolla/config_files/collectd.json': {'command': '/usr/sbin/collectd -f', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}, {'dest': '/etc/', 'merge': False, 'preserve_properties': True, 'source': 
'/var/lib/kolla/config_files/src/etc/collectd.d'}], 'permissions': [{'owner': 'collectd:collectd', 'path': '/var/log/collectd', 'recurse': True}, {'owner': 'collectd:collectd', 'path': '/scripts', 'recurse': True}, {'owner': 'collectd:collectd', 'path': '/config-scripts', 'recurse': True}]}, '/var/lib/kolla/config_files/iscsid.json': {'command': '/usr/sbin/iscsid -f', 'config_files': [{'dest': '/etc/iscsi/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src-iscsid/'}]}, '/var/lib/kolla/config_files/logrotate-crond.json': {'command': '/usr/sbin/crond -s -n', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}]}, '/var/lib/kolla/config_files/metrics_qdr.json': {'command': '/usr/sbin/qdrouterd -c /etc/qpid-dispatch/qdrouterd.conf', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}, {'dest': '/', 'merge': True, 'optional': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src-tls/*'}], 'permissions': [{'owner': 'qdrouterd:qdrouterd', 'path': '/var/lib/qdrouterd', 'recurse': True}, {'optional': True, 'owner': 'qdrouterd:qdrouterd', 'path': '/etc/pki/tls/certs/metrics_qdr.crt'}, {'optional': True, 'owner': 'qdrouterd:qdrouterd', 'path': '/etc/pki/tls/private/metrics_qdr.key'}]}, '/var/lib/kolla/config_files/nova-migration-target.json': {'command': 'dumb-init --single-child -- /usr/sbin/sshd -D -p 2022', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}, {'dest': '/etc/ssh/', 'owner': 'root', 'perm': '0600', 'source': '/host-ssh/ssh_host_*_key'}]}, '/var/lib/kolla/config_files/nova_compute.json': {'command': '/var/lib/nova/delay-nova-compute --delay 180 --nova-binary /usr/bin/nova-compute ', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': 
'/var/lib/kolla/config_files/src/*'}, {'dest': '/etc/iscsi/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src-iscsid/*'}, {'dest': '/etc/ceph/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src-ceph/'}], 'permissions': [{'owner': 'nova:nova', 'path': '/var/log/nova', 'recurse': True}, {'owner': 'nova:nova', 'path': '/etc/ceph/ceph.client.openstack.keyring', 'perm': '0600'}]}, '/var/lib/kolla/config_files/nova_compute_wait_for_compute_service.json': {'command': '/container-config-scripts/pyshim.sh /container-config-scripts/nova_wait_for_compute_service.py', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}], 'permissions': [{'owner': 'nova:nova', 'path': '/var/log/nova', 'recurse': True}]}, '/var/lib/kolla/config_files/nova_virtlogd.json': {'command': '/usr/local/bin/virtlogd_wrapper', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}, {'dest': '/etc/ceph/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src-ceph/'}], 'permissions': [{'owner': 'nova:nova', 'path': '/etc/ceph/ceph.client.openstack.keyring', 'perm': '0600'}]}, '/var/lib/kolla/config_files/nova_virtnodedevd.json': {'command': '/usr/sbin/virtnodedevd --config /etc/libvirt/virtnodedevd.conf', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}, {'dest': '/etc/ceph/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src-ceph/'}], 'permissions': [{'owner': 'nova:nova', 'path': '/etc/ceph/ceph.client.openstack.keyring', 'perm': '0600'}]}, '/var/lib/kolla/config_files/nova_virtproxyd.json': {'command': '/usr/sbin/virtproxyd --config /etc/libvirt/virtproxyd.conf', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': 
'/var/lib/kolla/config_files/src/*'}, {'dest': '/etc/ceph/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src-ceph/'}], 'permissions': [{'owner': 'nova:nova', 'path': '/etc/ceph/ceph.client.openstack.keyring', 'perm': '0600'}]}, '/var/lib/kolla/config_files/nova_virtqemud.json': {'command': '/usr/sbin/virtqemud --config /etc/libvirt/virtqemud.conf', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}, {'dest': '/etc/ceph/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src-ceph/'}], 'permissions': [{'owner': 'nova:nova', 'path': '/etc/ceph/ceph.client.openstack.keyring', 'perm': '0600'}]}, '/var/lib/kolla/config_files/nova_virtsecretd.json': {'command': '/usr/sbin/virtsecretd --config /etc/libvirt/virtsecretd.conf', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}, {'dest': '/etc/ceph/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src-ceph/'}], 'permissions': [{'owner': 'nova:nova', 'path': '/etc/ceph/ceph.client.openstack.keyring', 'perm': '0600'}]}, '/var/lib/kolla/config_files/nova_virtstoraged.json': {'command': '/usr/sbin/virtstoraged --config /etc/libvirt/virtstoraged.conf', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}, {'dest': '/etc/ceph/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src-ceph/'}], 'permissions': [{'owner': 'nova:nova', 'path': '/etc/ceph/ceph.client.openstack.keyring', 'perm': '0600'}]}, '/var/lib/kolla/config_files/ovn_controller.json': {'command': '/usr/bin/ovn-controller --pidfile --log-file unix:/run/openvswitch/db.sock ', 'permissions': [{'owner': 'root:root', 'path': '/var/log/openvswitch', 'recurse': True}, {'owner': 'root:root', 'path': '/var/log/ovn', 'recurse': 
True}]}, '/var/lib/kolla/config_files/ovn_metadata_agent.json': {'command': '/usr/bin/networking-ovn-metadata-agent --config-file /etc/neutron/neutron.conf --config-file /etc/neutron/plugins/networking-ovn/networking-ovn-metadata-agent.ini --log-file=/var/log/neutron/ovn-metadata-agent.log', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}], 'permissions': [{'owner': 'neutron:neutron', 'path': '/var/log/neutron', 'recurse': True}, {'owner': 'neutron:neutron', 'path': '/var/lib/neutron', 'recurse': True}, {'optional': True, 'owner': 'neutron:neutron', 'path': '/etc/pki/tls/certs/ovn_metadata.crt', 'perm': '0644'}, {'optional': True, 'owner': 'neutron:neutron', 'path': '/etc/pki/tls/private/ovn_metadata.key', 'perm': '0644'}]}, '/var/lib/kolla/config_files/rsyslog.json': {'command': '/usr/sbin/rsyslogd -n', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}], 'permissions': [{'owner': 'root:root', 'path': '/var/lib/rsyslog', 'recurse': True}, {'owner': 'root:root', 'path': '/var/log/rsyslog', 'recurse': True}]}} Oct 14 04:03:30 localhost systemd[35868]: Created slice User Background Tasks Slice. Oct 14 04:03:30 localhost systemd[35868]: Starting Cleanup of User's Temporary Files and Directories... Oct 14 04:03:30 localhost systemd[35868]: Finished Cleanup of User's Temporary Files and Directories. 
Oct 14 04:03:35 localhost python3[46601]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/config_step.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Oct 14 04:03:35 localhost python3[46644]: ansible-ansible.legacy.copy Invoked with dest=/etc/puppet/hieradata/config_step.json force=True mode=0600 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1760429014.9410305-80933-49813366267958/source _original_basename=tmpq3jg1sxr follow=False checksum=dfdcc7695edd230e7a2c06fc7b739bfa56506d8f backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 04:03:36 localhost python3[46674]: ansible-stat Invoked with path=/var/lib/tripleo-config/container-startup-config/step_1 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Oct 14 04:03:38 localhost python3[46797]: ansible-file Invoked with path=/var/lib/container-puppet state=directory setype=container_file_t selevel=s0 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None Oct 14 04:03:40 localhost python3[46918]: ansible-container_puppet_config Invoked with update_config_hash_only=True no_archive=True check_mode=False config_vol_prefix=/var/lib/config-data debug=False net_host=True puppet_config= short_hostname= step=6 Oct 14 04:03:42 localhost python3[46934]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -q lvm2 _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Oct 14 04:03:43 localhost python3[46951]: 
ansible-ansible.legacy.dnf Invoked with name=['systemd-container'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Oct 14 04:03:47 localhost dbus-broker-launch[751]: Noticed file-system modification, trigger reload.
Oct 14 04:03:47 localhost dbus-broker-launch[15704]: Noticed file-system modification, trigger reload.
Oct 14 04:03:47 localhost dbus-broker-launch[751]: Noticed file-system modification, trigger reload.
Oct 14 04:03:47 localhost dbus-broker-launch[15704]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +31: Eavesdropping is deprecated and ignored
Oct 14 04:03:47 localhost dbus-broker-launch[15704]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +33: Eavesdropping is deprecated and ignored
Oct 14 04:03:47 localhost systemd[1]: Reexecuting.
Oct 14 04:03:47 localhost systemd[1]: systemd 252-14.el9_2.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Oct 14 04:03:47 localhost systemd[1]: Detected virtualization kvm.
Oct 14 04:03:47 localhost systemd[1]: Detected architecture x86-64.
Oct 14 04:03:47 localhost systemd-rc-local-generator[47003]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 04:03:47 localhost systemd-sysv-generator[47008]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 04:03:47 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 04:03:56 localhost kernel: SELinux: Converting 2706 SID table entries...
Oct 14 04:03:56 localhost kernel: SELinux: policy capability network_peer_controls=1
Oct 14 04:03:56 localhost kernel: SELinux: policy capability open_perms=1
Oct 14 04:03:56 localhost kernel: SELinux: policy capability extended_socket_class=1
Oct 14 04:03:56 localhost kernel: SELinux: policy capability always_check_network=0
Oct 14 04:03:56 localhost kernel: SELinux: policy capability cgroup_seclabel=1
Oct 14 04:03:56 localhost kernel: SELinux: policy capability nnp_nosuid_transition=1
Oct 14 04:03:56 localhost kernel: SELinux: policy capability genfs_seclabel_symlinks=1
Oct 14 04:03:56 localhost dbus-broker-launch[751]: Noticed file-system modification, trigger reload.
Oct 14 04:03:56 localhost dbus-broker-launch[755]: avc: op=load_policy lsm=selinux seqno=14 res=1
Oct 14 04:03:57 localhost systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Oct 14 04:03:57 localhost systemd[1]: Starting man-db-cache-update.service...
Oct 14 04:03:57 localhost systemd[1]: Reloading.
Oct 14 04:03:57 localhost systemd-sysv-generator[47125]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 04:03:57 localhost systemd-rc-local-generator[47122]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 04:03:57 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 04:03:57 localhost systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Oct 14 04:03:57 localhost systemd[1]: Queuing reload/restart jobs for marked units…
Oct 14 04:03:57 localhost systemd[1]: Stopping Journal Service...
Oct 14 04:03:57 localhost systemd[1]: Stopping Rule-based Manager for Device Events and Files...
Oct 14 04:03:57 localhost systemd-journald[618]: Received SIGTERM from PID 1 (systemd).
Oct 14 04:03:57 localhost systemd-journald[618]: Journal stopped
Oct 14 04:03:57 localhost systemd[1]: systemd-journald.service: Deactivated successfully.
Oct 14 04:03:57 localhost systemd[1]: Stopped Journal Service.
Oct 14 04:03:57 localhost systemd[1]: systemd-journald.service: Consumed 1.798s CPU time.
Oct 14 04:03:57 localhost systemd[1]: Starting Journal Service...
Oct 14 04:03:57 localhost systemd[1]: systemd-udevd.service: Deactivated successfully.
Oct 14 04:03:57 localhost systemd[1]: Stopped Rule-based Manager for Device Events and Files.
Oct 14 04:03:57 localhost systemd[1]: systemd-udevd.service: Consumed 2.996s CPU time.
Oct 14 04:03:57 localhost systemd[1]: Starting Rule-based Manager for Device Events and Files...
Oct 14 04:03:57 localhost systemd-journald[47488]: Journal started
Oct 14 04:03:57 localhost systemd-journald[47488]: Runtime Journal (/run/log/journal/8e1d5208cffec42b50976967e1d1cfd0) is 12.1M, max 314.7M, 302.6M free.
Oct 14 04:03:57 localhost systemd[1]: Started Journal Service.
Oct 14 04:03:57 localhost systemd-journald[47488]: Field hash table of /run/log/journal/8e1d5208cffec42b50976967e1d1cfd0/system.journal has a fill level at 75.4 (251 of 333 items), suggesting rotation.
Oct 14 04:03:57 localhost systemd-journald[47488]: /run/log/journal/8e1d5208cffec42b50976967e1d1cfd0/system.journal: Journal header limits reached or header out-of-date, rotating.
Oct 14 04:03:57 localhost rsyslogd[759]: imjournal: journal files changed, reloading... [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Oct 14 04:03:57 localhost rsyslogd[759]: imjournal: journal files changed, reloading... [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Oct 14 04:03:57 localhost systemd-udevd[47493]: Using default interface naming scheme 'rhel-9.0'.
Oct 14 04:03:57 localhost systemd[1]: Started Rule-based Manager for Device Events and Files.
Oct 14 04:03:58 localhost systemd[1]: Reloading.
Oct 14 04:03:58 localhost systemd-rc-local-generator[48097]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 04:03:58 localhost systemd-sysv-generator[48102]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 04:03:58 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 04:03:58 localhost systemd[1]: Queuing reload/restart jobs for marked units…
Oct 14 04:03:58 localhost systemd[1]: man-db-cache-update.service: Deactivated successfully.
Oct 14 04:03:58 localhost systemd[1]: Finished man-db-cache-update.service.
Oct 14 04:03:58 localhost systemd[1]: man-db-cache-update.service: Consumed 1.329s CPU time.
Oct 14 04:03:58 localhost systemd[1]: run-re4782dd3434b4ca3afaebd5a3a40093b.service: Deactivated successfully.
Oct 14 04:03:58 localhost systemd[1]: run-r97ec7082d24c49e9b0cda5716ece5568.service: Deactivated successfully.
Oct 14 04:04:00 localhost python3[48438]: ansible-sysctl Invoked with name=vm.unprivileged_userfaultfd reload=True state=present sysctl_file=/etc/sysctl.d/99-tripleo-postcopy.conf sysctl_set=True value=1 ignoreerrors=False
Oct 14 04:04:00 localhost python3[48457]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-active ksm.service || systemctl is-enabled ksm.service _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 04:04:01 localhost python3[48475]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None
Oct 14 04:04:01 localhost python3[48475]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1 --format json
Oct 14 04:04:01 localhost python3[48475]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1 -q --tls-verify=false
Oct 14 04:04:09 localhost podman[48487]: 2025-10-14 08:04:01.926954778 +0000 UTC m=+0.049514650 image pull registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1
Oct 14 04:04:09 localhost python3[48475]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect 1571c200d626c35388c5864f613dd17fb1618f6192fe622da60a47fa61763c46 --format json
Oct 14 04:04:10 localhost python3[48634]: ansible-containers.podman.podman_image Invoked with force=True
name=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None
Oct 14 04:04:10 localhost python3[48634]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1 --format json
Oct 14 04:04:10 localhost python3[48634]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1 -q --tls-verify=false
Oct 14 04:04:18 localhost podman[48647]: 2025-10-14 08:04:10.206972317 +0000 UTC m=+0.038057457 image pull registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1
Oct 14 04:04:18 localhost python3[48634]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect 1e3eee8f9b979ec527f69dda079bc969bf9ddbe65c90f0543f3891d72e56a75e --format json
Oct 14 04:04:19 localhost python3[48806]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None
Oct 14 04:04:19 localhost python3[48806]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG:
/bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 --format json
Oct 14 04:04:19 localhost python3[48806]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 -q --tls-verify=false
Oct 14 04:04:21 localhost podman[48933]: 2025-10-14 08:04:21.229819233 +0000 UTC m=+0.081407354 container exec b19a91314e045cbe5465f6abdeb6b420108fcda7860b498b01dcd4e65236d2c8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-crash-np0005486733, GIT_CLEAN=True, architecture=x86_64, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, vcs-type=git, release=553, RELEASE=main, ceph=True, distribution-scope=public, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, maintainer=Guillaume Abrioux , CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, name=rhceph, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-09-24T08:57:55)
Oct 14 04:04:21 localhost podman[48933]: 2025-10-14 08:04:21.356985788 +0000 UTC m=+0.208573979 container exec_died b19a91314e045cbe5465f6abdeb6b420108fcda7860b498b01dcd4e65236d2c8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-crash-np0005486733, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux ,
description=Red Hat Ceph Storage 7, GIT_CLEAN=True, distribution-scope=public, vendor=Red Hat, Inc., build-date=2025-09-24T08:57:55, ceph=True, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.33.12, name=rhceph, release=553, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, version=7, vcs-type=git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, RELEASE=main, io.openshift.expose-services=, CEPH_POINT_RELEASE=)
Oct 14 04:04:37 localhost podman[48819]: 2025-10-14 08:04:19.323219988 +0000 UTC m=+0.046450554 image pull registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Oct 14 04:04:37 localhost python3[48806]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect a56a2196ea2290002b5e3e60b4c440f2326e4f1173ca4d9c0a320716a756e568 --format json
Oct 14 04:04:37 localhost python3[49440]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None
Oct 14 04:04:37 localhost python3[49440]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG:
/bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1 --format json
Oct 14 04:04:37 localhost python3[49440]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1 -q --tls-verify=false
Oct 14 04:04:54 localhost podman[49453]: 2025-10-14 08:04:37.569001122 +0000 UTC m=+0.042714596 image pull registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1
Oct 14 04:04:54 localhost python3[49440]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect 89ed729ad5d881399a0bbd370b8f3c39b84e5a87c6e02b0d1f2c943d2d9cfb7a --format json
Oct 14 04:04:55 localhost python3[49900]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None
Oct 14 04:04:55 localhost python3[49900]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1 --format json
Oct 14 04:04:55 localhost python3[49900]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1 -q --tls-verify=false
Oct 14 04:05:01 localhost podman[49914]: 2025-10-14 08:04:55.203998026 +0000 UTC m=+0.041178906 image pull registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1
Oct 14 04:05:01 localhost python3[49900]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect
a5e44a6280ab7a1da1b469cc214b40ecdad1d13f0c37c24f32cb45b40cce41d6 --format json
Oct 14 04:05:01 localhost python3[50244]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None
Oct 14 04:05:01 localhost python3[50244]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1 --format json
Oct 14 04:05:01 localhost python3[50244]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1 -q --tls-verify=false
Oct 14 04:05:06 localhost podman[50256]: 2025-10-14 08:05:01.757178705 +0000 UTC m=+0.044659588 image pull registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1
Oct 14 04:05:06 localhost python3[50244]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect ef4308e71ba3950618e5de99f6c775558514a06fb9f6d93ca5c54d685a1349a6 --format json
Oct 14 04:05:07 localhost python3[50380]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None
Oct 14 04:05:07 localhost python3[50380]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1 --format json
Oct 14 04:05:07 localhost python3[50380]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1 -q --tls-verify=false
Oct 14 04:05:10 localhost podman[50392]: 2025-10-14 08:05:07.365220824 +0000 UTC m=+0.032387509 image pull registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1
Oct 14 04:05:10 localhost python3[50380]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect 5b5e3dbf480a168d795a47e53d0695cd833f381ef10119a3de87e5946f6b53e5 --format json
Oct 14 04:05:10 localhost python3[50516]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None
Oct 14 04:05:10 localhost python3[50516]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1 --format json
Oct 14 04:05:10 localhost python3[50516]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1 -q --tls-verify=false
Oct 14 04:05:14 localhost podman[50528]: 2025-10-14 08:05:10.977783976 +0000 UTC m=+0.047130148 image pull registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1
Oct 14 04:05:14 localhost python3[50516]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect
250768c493b95c1151e047902a648e6659ba35adb4c6e0af85c231937d0cc9b7 --format json
Oct 14 04:05:15 localhost python3[50652]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None
Oct 14 04:05:15 localhost python3[50652]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1 --format json
Oct 14 04:05:15 localhost python3[50652]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1 -q --tls-verify=false
Oct 14 04:05:18 localhost podman[50665]: 2025-10-14 08:05:15.135004892 +0000 UTC m=+0.045137755 image pull registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1
Oct 14 04:05:18 localhost python3[50652]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect 68d3d3a77bfc9fce94ca9ce2b28076450b851f6f1e82e97fbe356ce4ab0f7849 --format json
Oct 14 04:05:18 localhost python3[50789]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None
password=NOT_LOGGING_PARAMETER ca_cert_dir=None
Oct 14 04:05:18 localhost python3[50789]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1 --format json
Oct 14 04:05:18 localhost python3[50789]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1 -q --tls-verify=false
Oct 14 04:05:24 localhost podman[50801]: 2025-10-14 08:05:18.888987117 +0000 UTC m=+0.043746510 image pull registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1
Oct 14 04:05:24 localhost python3[50789]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect 01fc8d861e2b923ef0bf1d5c40a269bd976b00e8a31e8c56d63f3504b82b1c76 --format json
Oct 14 04:05:24 localhost python3[50936]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None
Oct 14 04:05:24 localhost python3[50936]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-cron:17.1 --format json
Oct 14 04:05:24 localhost python3[50936]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-cron:17.1 -q --tls-verify=false
Oct 14 04:05:27 localhost podman[50949]: 2025-10-14 08:05:24.830449214 +0000 UTC m=+0.042164181 image pull registry.redhat.io/rhosp-rhel9/openstack-cron:17.1
Oct 14 04:05:27 localhost python3[50936]:
ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect 7f7fcb1a516a6191c7a8cb132a460e04d50ca4381f114f08dcbfe84340e49ac0 --format json
Oct 14 04:05:28 localhost python3[51147]: ansible-stat Invoked with path=/var/lib/tripleo-config/container-startup-config/step_1 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 14 04:05:30 localhost ansible-async_wrapper.py[51319]: Invoked with 344386798795 3600 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1760429129.606731-83884-180812856246561/AnsiballZ_command.py _
Oct 14 04:05:30 localhost ansible-async_wrapper.py[51322]: Starting module and watcher
Oct 14 04:05:30 localhost ansible-async_wrapper.py[51322]: Start watching 51323 (3600)
Oct 14 04:05:30 localhost ansible-async_wrapper.py[51323]: Start module (51323)
Oct 14 04:05:30 localhost ansible-async_wrapper.py[51319]: Return async_wrapper task started.
Oct 14 04:05:30 localhost python3[51343]: ansible-ansible.legacy.async_status Invoked with jid=344386798795.51319 mode=status _async_dir=/tmp/.ansible_async
Oct 14 04:05:34 localhost puppet-user[51327]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5
Oct 14 04:05:34 localhost puppet-user[51327]: (file: /etc/puppet/hiera.yaml)
Oct 14 04:05:34 localhost puppet-user[51327]: Warning: Undefined variable '::deploy_config_name';
Oct 14 04:05:34 localhost puppet-user[51327]: (file & line not available)
Oct 14 04:05:34 localhost puppet-user[51327]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html
Oct 14 04:05:34 localhost puppet-user[51327]: (file & line not available)
Oct 14 04:05:34 localhost puppet-user[51327]: Warning: Unknown variable: '::deployment_type'.
(file: /etc/puppet/modules/tripleo/manifests/profile/base/database/mysql/client.pp, line: 89, column: 8)
Oct 14 04:05:34 localhost puppet-user[51327]: Warning: Unknown variable: '::deployment_type'. (file: /etc/puppet/modules/tripleo/manifests/packages.pp, line: 39, column: 69)
Oct 14 04:05:34 localhost puppet-user[51327]: Notice: Compiled catalog for np0005486733.localdomain in environment production in 0.13 seconds
Oct 14 04:05:34 localhost puppet-user[51327]: Notice: /Stage[main]/Tripleo::Profile::Base::Database::Mysql::Client/Exec[directory-create-etc-my.cnf.d]/returns: executed successfully
Oct 14 04:05:34 localhost puppet-user[51327]: Notice: /Stage[main]/Tripleo::Profile::Base::Database::Mysql::Client/File[/etc/my.cnf.d/tripleo.cnf]/ensure: created
Oct 14 04:05:34 localhost puppet-user[51327]: Notice: /Stage[main]/Tripleo::Profile::Base::Database::Mysql::Client/Augeas[tripleo-mysql-client-conf]/returns: executed successfully
Oct 14 04:05:34 localhost puppet-user[51327]: Notice: Applied catalog in 0.05 seconds
Oct 14 04:05:34 localhost puppet-user[51327]: Application:
Oct 14 04:05:34 localhost puppet-user[51327]: Initial environment: production
Oct 14 04:05:34 localhost puppet-user[51327]: Converged environment: production
Oct 14 04:05:34 localhost puppet-user[51327]: Run mode: user
Oct 14 04:05:34 localhost puppet-user[51327]: Changes:
Oct 14 04:05:34 localhost puppet-user[51327]: Total: 3
Oct 14 04:05:34 localhost puppet-user[51327]: Events:
Oct 14 04:05:34 localhost puppet-user[51327]: Success: 3
Oct 14 04:05:34 localhost puppet-user[51327]: Total: 3
Oct 14 04:05:34 localhost puppet-user[51327]: Resources:
Oct 14 04:05:34 localhost puppet-user[51327]: Changed: 3
Oct 14 04:05:34 localhost puppet-user[51327]: Out of sync: 3
Oct 14 04:05:34 localhost puppet-user[51327]: Total: 10
Oct 14 04:05:34 localhost puppet-user[51327]: Time:
Oct 14 04:05:34 localhost puppet-user[51327]: Schedule: 0.00
Oct 14 04:05:34 localhost puppet-user[51327]: File: 0.00
Oct 14 04:05:34 localhost puppet-user[51327]: Exec: 0.02
Oct 14 04:05:34 localhost puppet-user[51327]: Augeas: 0.02
Oct 14 04:05:34 localhost puppet-user[51327]: Transaction evaluation: 0.05
Oct 14 04:05:34 localhost puppet-user[51327]: Catalog application: 0.05
Oct 14 04:05:34 localhost puppet-user[51327]: Config retrieval: 0.17
Oct 14 04:05:34 localhost puppet-user[51327]: Last run: 1760429134
Oct 14 04:05:34 localhost puppet-user[51327]: Filebucket: 0.00
Oct 14 04:05:34 localhost puppet-user[51327]: Total: 0.05
Oct 14 04:05:34 localhost puppet-user[51327]: Version:
Oct 14 04:05:34 localhost puppet-user[51327]: Config: 1760429134
Oct 14 04:05:34 localhost puppet-user[51327]: Puppet: 7.10.0
Oct 14 04:05:34 localhost ansible-async_wrapper.py[51323]: Module complete (51323)
Oct 14 04:05:35 localhost ansible-async_wrapper.py[51322]: Done in kid B.
Oct 14 04:05:40 localhost python3[51474]: ansible-ansible.legacy.async_status Invoked with jid=344386798795.51319 mode=status _async_dir=/tmp/.ansible_async
Oct 14 04:05:41 localhost python3[51490]: ansible-file Invoked with path=/var/lib/container-puppet/puppetlabs state=directory setype=svirt_sandbox_file_t selevel=s0 recurse=True force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None
Oct 14 04:05:42 localhost python3[51506]: ansible-stat Invoked with path=/var/lib/container-puppet/puppetlabs/facter.conf follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 14 04:05:42 localhost python3[51554]: ansible-ansible.legacy.stat Invoked with path=/var/lib/container-puppet/puppetlabs/facter.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 14 04:05:42 localhost python3[51597]: ansible-ansible.legacy.copy
Invoked with dest=/var/lib/container-puppet/puppetlabs/facter.conf setype=svirt_sandbox_file_t selevel=s0 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1760429142.2292578-84135-8478402181633/source _original_basename=tmp22pcgtml follow=False checksum=53908622cb869db5e2e2a68e737aa2ab1a872111 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None attributes=None
Oct 14 04:05:43 localhost python3[51627]: ansible-file Invoked with path=/opt/puppetlabs/facter state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 04:05:44 localhost python3[51730]: ansible-ansible.posix.synchronize Invoked with src=/opt/puppetlabs/ dest=/var/lib/container-puppet/puppetlabs/ _local_rsync_path=rsync _local_rsync_password=NOT_LOGGING_PARAMETER rsync_path=None delete=False _substitute_controller=False archive=True checksum=False compress=True existing_only=False dirs=False copy_links=False set_remote_user=True rsync_timeout=0 rsync_opts=[] ssh_connection_multiplexing=False partial=False verify_host=False mode=push dest_port=None private_key=None recursive=None links=None perms=None times=None owner=None group=None ssh_args=None link_dest=None
Oct 14 04:05:45 localhost python3[51749]: ansible-file Invoked with path=/var/lib/tripleo-config/container-puppet-config mode=448 recurse=True setype=container_file_t force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None
selevel=None attributes=None Oct 14 04:05:45 localhost python3[51765]: ansible-container_puppet_config Invoked with check_mode=False config_vol_prefix=/var/lib/config-data debug=True net_host=True no_archive=False puppet_config=/var/lib/container-puppet/container-puppet.json short_hostname=np0005486733 step=1 update_config_hash_only=False Oct 14 04:05:46 localhost python3[51781]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 04:05:46 localhost python3[51797]: ansible-container_config_data Invoked with config_path=/var/lib/tripleo-config/container-puppet-config/step_1 config_pattern=container-puppet-*.json config_overrides={} debug=True Oct 14 04:05:47 localhost python3[51813]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None Oct 14 04:05:48 localhost python3[51855]: ansible-tripleo_container_manage Invoked with config_id=tripleo_puppet_step1 config_dir=/var/lib/tripleo-config/container-puppet-config/step_1 config_patterns=container-puppet-*.json config_overrides={} concurrency=6 log_base_path=/var/log/containers/stdouts debug=False Oct 14 04:05:48 localhost podman[52008]: 2025-10-14 08:05:48.980740188 +0000 UTC m=+0.096144513 container create e7a289a1f3de1ca338cb8413f958fb6b1eb7d6cab69c1586cdb10c1b26af17ba (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=container-puppet-collectd, batch=17.1_20250721.1, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005486733', 'NO_ARCHIVE': 
'', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,collectd_client_config,exec', 'NAME': 'collectd', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::collectd'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, vcs-type=git, description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-07-21T13:04:03, name=rhosp17/openstack-collectd, io.openshift.expose-services=, container_name=container-puppet-collectd, managed_by=tripleo_ansible, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1, io.buildah.version=1.33.12, release=2, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_puppet_step1, version=17.1.9, vendor=Red Hat, Inc., com.redhat.component=openstack-collectd-container, 
tcib_managed=true) Oct 14 04:05:48 localhost podman[52042]: 2025-10-14 08:05:48.998293642 +0000 UTC m=+0.082346138 container create e5e7c2638372aa10cb729f60ba8141c435263d73f9fdc0bea9ed569b948ad58a (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=container-puppet-metrics_qdr, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005486733', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'metrics_qdr', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::qdr\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, vcs-type=git, version=17.1.9, 
com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, maintainer=OpenStack TripleO Team, release=1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, config_id=tripleo_puppet_step1, name=rhosp17/openstack-qdrouterd, tcib_managed=true, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, container_name=container-puppet-metrics_qdr, vendor=Red Hat, Inc., batch=17.1_20250721.1, build-date=2025-07-21T13:07:59) Oct 14 04:05:49 localhost systemd[1]: Started libpod-conmon-e7a289a1f3de1ca338cb8413f958fb6b1eb7d6cab69c1586cdb10c1b26af17ba.scope. Oct 14 04:05:49 localhost systemd[1]: Started libpod-conmon-e5e7c2638372aa10cb729f60ba8141c435263d73f9fdc0bea9ed569b948ad58a.scope. Oct 14 04:05:49 localhost podman[52008]: 2025-10-14 08:05:48.942894444 +0000 UTC m=+0.058298749 image pull registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1 Oct 14 04:05:49 localhost podman[52056]: 2025-10-14 08:05:49.050951463 +0000 UTC m=+0.103995101 container create 7c07f0587c79df4162acb2af02e0c77b7b00174934856f1af08337b0bb3d6f3c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=container-puppet-crond, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 cron, release=1, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, batch=17.1_20250721.1, config_id=tripleo_puppet_step1, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.component=openstack-cron-container, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, maintainer=OpenStack TripleO Team, tcib_managed=true, managed_by=tripleo_ansible, vendor=Red Hat, Inc., config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': 
True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005486733', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'crond', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::logrotate'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, name=rhosp17/openstack-cron, version=17.1.9, build-date=2025-07-21T13:07:52, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, container_name=container-puppet-crond, architecture=x86_64) Oct 14 04:05:49 localhost podman[52042]: 2025-10-14 08:05:48.952726035 +0000 UTC m=+0.036778531 image pull registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1 Oct 14 04:05:49 localhost systemd[1]: Started libcrun container. Oct 14 04:05:49 localhost systemd[1]: Started libcrun container. 
Oct 14 04:05:49 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/66141e0355e434a1428da5b2027ef6192344d1c6afa950636647476e8925671b/merged/var/lib/config-data supports timestamps until 2038 (0x7fffffff) Oct 14 04:05:49 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d1668dbabecea61a717977938a99d4a46ffa99afa4505047a6e5a86838675946/merged/var/lib/config-data supports timestamps until 2038 (0x7fffffff) Oct 14 04:05:49 localhost podman[52061]: 2025-10-14 08:05:48.980016206 +0000 UTC m=+0.031302609 image pull registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1 Oct 14 04:05:49 localhost podman[52056]: 2025-10-14 08:05:48.99123384 +0000 UTC m=+0.044277478 image pull registry.redhat.io/rhosp-rhel9/openstack-cron:17.1 Oct 14 04:05:49 localhost podman[52066]: 2025-10-14 08:05:49.008590327 +0000 UTC m=+0.044369791 image pull registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 Oct 14 04:05:49 localhost podman[52066]: 2025-10-14 08:05:49.109527301 +0000 UTC m=+0.145306725 container create 4c438342fb722ce7a3d59e73e41c2395edc0f72e837b1d6bb00dfb2e80e443fb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=container-puppet-nova_libvirt, version=17.1.9, vcs-ref=809f31d3cd93a9e04341110fb85686656c754dc0, maintainer=OpenStack TripleO Team, distribution-scope=public, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005486733', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,nova_config,libvirtd_config,virtlogd_config,virtproxyd_config,virtqemud_config,virtnodedevd_config,virtsecretd_config,virtstoraged_config,nova_config,file,libvirt_tls_password,libvirtd_config,nova_config,file,libvirt_tls_password', 'NAME': 'nova_libvirt', 'STEP_CONFIG': "include ::tripleo::packages\n# 
TODO(emilien): figure how to deal with libvirt profile.\n# We'll probably treat it like we do with Neutron plugins.\n# Until then, just include it in the default nova-compute role.\ninclude tripleo::profile::base::nova::compute::libvirt\n\ninclude tripleo::profile::base::nova::libvirt\n\ninclude tripleo::profile::base::nova::compute::libvirt_guests\n\ninclude tripleo::profile::base::sshd\ninclude tripleo::profile::base::nova::migration::target"}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=container-puppet-nova_libvirt, tcib_managed=true, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1, io.openshift.expose-services=, config_id=tripleo_puppet_step1, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, architecture=x86_64, io.buildah.version=1.33.12, release=2, batch=17.1_20250721.1, build-date=2025-07-21T14:56:59, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, 
description=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-libvirt/images/17.1.9-2) Oct 14 04:05:49 localhost systemd[1]: Started libpod-conmon-7c07f0587c79df4162acb2af02e0c77b7b00174934856f1af08337b0bb3d6f3c.scope. Oct 14 04:05:49 localhost systemd[1]: Started libpod-conmon-4c438342fb722ce7a3d59e73e41c2395edc0f72e837b1d6bb00dfb2e80e443fb.scope. Oct 14 04:05:49 localhost podman[52008]: 2025-10-14 08:05:49.129437039 +0000 UTC m=+0.244841354 container init e7a289a1f3de1ca338cb8413f958fb6b1eb7d6cab69c1586cdb10c1b26af17ba (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=container-puppet-collectd, vcs-type=git, build-date=2025-07-21T13:04:03, description=Red Hat OpenStack Platform 17.1 collectd, container_name=container-puppet-collectd, tcib_managed=true, architecture=x86_64, io.openshift.expose-services=, io.buildah.version=1.33.12, name=rhosp17/openstack-collectd, batch=17.1_20250721.1, config_id=tripleo_puppet_step1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, com.redhat.component=openstack-collectd-container, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1, version=17.1.9, release=2, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Red Hat OpenStack Platform 17.1 collectd, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005486733', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,collectd_client_config,exec', 'NAME': 'collectd', 
'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::collectd'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc.) Oct 14 04:05:49 localhost systemd[1]: Started libcrun container. Oct 14 04:05:49 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/279af57fe640a81799041eadaf076a38dc293fb9fa2d8ceac0fa223bb06cffab/merged/var/lib/config-data supports timestamps until 2038 (0x7fffffff) Oct 14 04:05:49 localhost systemd[1]: Started libcrun container. 
Oct 14 04:05:49 localhost podman[52008]: 2025-10-14 08:05:49.141624933 +0000 UTC m=+0.257029228 container start e7a289a1f3de1ca338cb8413f958fb6b1eb7d6cab69c1586cdb10c1b26af17ba (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=container-puppet-collectd, com.redhat.component=openstack-collectd-container, batch=17.1_20250721.1, io.buildah.version=1.33.12, io.openshift.expose-services=, config_id=tripleo_puppet_step1, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, vcs-type=git, release=2, com.redhat.license_terms=https://www.redhat.com/agreements, tcib_managed=true, managed_by=tripleo_ansible, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005486733', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,collectd_client_config,exec', 'NAME': 'collectd', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::collectd'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', 
'/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, version=17.1.9, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-07-21T13:04:03, container_name=container-puppet-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64) Oct 14 04:05:49 localhost podman[52008]: 2025-10-14 08:05:49.144155193 +0000 UTC m=+0.259559698 container attach e7a289a1f3de1ca338cb8413f958fb6b1eb7d6cab69c1586cdb10c1b26af17ba (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=container-puppet-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-07-21T13:04:03, com.redhat.component=openstack-collectd-container, container_name=container-puppet-collectd, batch=17.1_20250721.1, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005486733', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,collectd_client_config,exec', 'NAME': 'collectd', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::collectd'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, config_id=tripleo_puppet_step1, version=17.1.9, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 collectd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, vcs-type=git, name=rhosp17/openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, io.openshift.expose-services=, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, release=2, tcib_managed=true, managed_by=tripleo_ansible, architecture=x86_64) Oct 14 04:05:49 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/705239d69edbb97c498d74570c74cd8434e37024eb25add7e89f19b22fc90898/merged/var/lib/config-data supports timestamps until 2038 (0x7fffffff) Oct 14 04:05:49 localhost podman[52056]: 2025-10-14 08:05:49.148300584 +0000 UTC m=+0.201344202 container init 7c07f0587c79df4162acb2af02e0c77b7b00174934856f1af08337b0bb3d6f3c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=container-puppet-crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, batch=17.1_20250721.1, architecture=x86_64, build-date=2025-07-21T13:07:52, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, vcs-type=git, 
io.openshift.expose-services=, version=17.1.9, config_id=tripleo_puppet_step1, name=rhosp17/openstack-cron, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, container_name=container-puppet-crond, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, release=1, com.redhat.component=openstack-cron-container, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005486733', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'crond', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::logrotate'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, com.redhat.license_terms=https://www.redhat.com/agreements, managed_by=tripleo_ansible) Oct 14 04:05:49 localhost podman[52066]: 2025-10-14 08:05:49.156276746 +0000 
UTC m=+0.192056160 container init 4c438342fb722ce7a3d59e73e41c2395edc0f72e837b1d6bb00dfb2e80e443fb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=container-puppet-nova_libvirt, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, vendor=Red Hat, Inc., vcs-ref=809f31d3cd93a9e04341110fb85686656c754dc0, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20250721.1, io.openshift.expose-services=, io.buildah.version=1.33.12, config_id=tripleo_puppet_step1, architecture=x86_64, maintainer=OpenStack TripleO Team, container_name=container-puppet-nova_libvirt, tcib_managed=true, build-date=2025-07-21T14:56:59, com.redhat.component=openstack-nova-libvirt-container, vcs-type=git, release=2, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005486733', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,nova_config,libvirtd_config,virtlogd_config,virtproxyd_config,virtqemud_config,virtnodedevd_config,virtsecretd_config,virtstoraged_config,nova_config,file,libvirt_tls_password,libvirtd_config,nova_config,file,libvirt_tls_password', 'NAME': 'nova_libvirt', 'STEP_CONFIG': "include ::tripleo::packages\n# TODO(emilien): figure how to deal with libvirt profile.\n# We'll probably treat it like we do with Neutron plugins.\n# Until then, just include it in the default nova-compute role.\ninclude tripleo::profile::base::nova::compute::libvirt\n\ninclude tripleo::profile::base::nova::libvirt\n\ninclude tripleo::profile::base::nova::compute::libvirt_guests\n\ninclude tripleo::profile::base::sshd\ninclude 
tripleo::profile::base::nova::migration::target"}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-libvirt/images/17.1.9-2, name=rhosp17/openstack-nova-libvirt, version=17.1.9) Oct 14 04:05:49 localhost podman[52061]: 2025-10-14 08:05:49.161556841 +0000 UTC m=+0.212843224 container create acd1c4ab6c1d8932ceea5284c5afe13514a697ecaf4356ac8c447e4780adbed8 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=container-puppet-iscsid, architecture=x86_64, name=rhosp17/openstack-iscsid, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-07-21T13:27:15, container_name=container-puppet-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, 
io.buildah.version=1.33.12, managed_by=tripleo_ansible, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005486733', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,iscsid_config', 'NAME': 'iscsid', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::iscsid\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/iscsi:/tmp/iscsi.host:z', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, version=17.1.9, maintainer=OpenStack TripleO Team, batch=17.1_20250721.1, tcib_managed=true, config_id=tripleo_puppet_step1, release=1) Oct 14 04:05:49 localhost systemd[1]: Started libpod-conmon-acd1c4ab6c1d8932ceea5284c5afe13514a697ecaf4356ac8c447e4780adbed8.scope. Oct 14 04:05:49 localhost systemd[1]: Started libcrun container. 
Oct 14 04:05:49 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/02ae85124e4959ea5e505d3d23ffa956e944453d3a9644fd9b15fb0c07f7fbc0/merged/tmp/iscsi.host supports timestamps until 2038 (0x7fffffff) Oct 14 04:05:49 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/02ae85124e4959ea5e505d3d23ffa956e944453d3a9644fd9b15fb0c07f7fbc0/merged/var/lib/config-data supports timestamps until 2038 (0x7fffffff) Oct 14 04:05:50 localhost podman[52042]: 2025-10-14 08:05:50.121030167 +0000 UTC m=+1.205082663 container init e5e7c2638372aa10cb729f60ba8141c435263d73f9fdc0bea9ed569b948ad58a (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=container-puppet-metrics_qdr, config_id=tripleo_puppet_step1, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, io.openshift.expose-services=, name=rhosp17/openstack-qdrouterd, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, version=17.1.9, description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005486733', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'metrics_qdr', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::qdr\n'}, 'net': ['host'], 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, build-date=2025-07-21T13:07:59, io.buildah.version=1.33.12, batch=17.1_20250721.1, com.redhat.component=openstack-qdrouterd-container, container_name=container-puppet-metrics_qdr, release=1, tcib_managed=true, distribution-scope=public) Oct 14 04:05:50 localhost systemd[1]: tmp-crun.Zuejh7.mount: Deactivated successfully. 
Oct 14 04:05:50 localhost podman[52061]: 2025-10-14 08:05:50.155173144 +0000 UTC m=+1.206459547 container init acd1c4ab6c1d8932ceea5284c5afe13514a697ecaf4356ac8c447e4780adbed8 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=container-puppet-iscsid, batch=17.1_20250721.1, config_id=tripleo_puppet_step1, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, distribution-scope=public, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, name=rhosp17/openstack-iscsid, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005486733', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,iscsid_config', 'NAME': 'iscsid', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::iscsid\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/iscsi:/tmp/iscsi.host:z', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, 
io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.33.12, build-date=2025-07-21T13:27:15, container_name=container-puppet-iscsid, com.redhat.license_terms=https://www.redhat.com/agreements, version=17.1.9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, release=1, com.redhat.component=openstack-iscsid-container, tcib_managed=true) Oct 14 04:05:50 localhost systemd[1]: tmp-crun.NEhEVY.mount: Deactivated successfully. Oct 14 04:05:50 localhost podman[52056]: 2025-10-14 08:05:50.16549749 +0000 UTC m=+1.218541148 container start 7c07f0587c79df4162acb2af02e0c77b7b00174934856f1af08337b0bb3d6f3c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=container-puppet-crond, build-date=2025-07-21T13:07:52, io.openshift.expose-services=, container_name=container-puppet-crond, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-cron, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, vcs-type=git, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, version=17.1.9, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, release=1, batch=17.1_20250721.1, summary=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., config_id=tripleo_puppet_step1, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005486733', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'crond', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude 
tripleo::profile::base::logging::logrotate'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, com.redhat.component=openstack-cron-container, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, architecture=x86_64) Oct 14 04:05:50 localhost podman[52056]: 2025-10-14 08:05:50.166045497 +0000 UTC m=+1.219089225 container attach 7c07f0587c79df4162acb2af02e0c77b7b00174934856f1af08337b0bb3d6f3c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=container-puppet-crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, container_name=container-puppet-crond, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, architecture=x86_64, io.buildah.version=1.33.12, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, name=rhosp17/openstack-cron, distribution-scope=public, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 cron, release=1, config_id=tripleo_puppet_step1, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 
'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005486733', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'crond', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::logrotate'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, tcib_managed=true, build-date=2025-07-21T13:07:52, io.openshift.expose-services=, version=17.1.9, com.redhat.component=openstack-cron-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team) Oct 14 04:05:50 localhost podman[52066]: 2025-10-14 08:05:50.170162717 +0000 UTC m=+1.205942171 container start 4c438342fb722ce7a3d59e73e41c2395edc0f72e837b1d6bb00dfb2e80e443fb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=container-puppet-nova_libvirt, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-libvirt/images/17.1.9-2, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005486733', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,nova_config,libvirtd_config,virtlogd_config,virtproxyd_config,virtqemud_config,virtnodedevd_config,virtsecretd_config,virtstoraged_config,nova_config,file,libvirt_tls_password,libvirtd_config,nova_config,file,libvirt_tls_password', 'NAME': 'nova_libvirt', 'STEP_CONFIG': "include ::tripleo::packages\n# TODO(emilien): figure how to deal with libvirt profile.\n# We'll probably treat it like we do with Neutron plugins.\n# Until then, just include it in the default nova-compute role.\ninclude tripleo::profile::base::nova::compute::libvirt\n\ninclude tripleo::profile::base::nova::libvirt\n\ninclude tripleo::profile::base::nova::compute::libvirt_guests\n\ninclude tripleo::profile::base::sshd\ninclude tripleo::profile::base::nova::migration::target"}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', 
'/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, vendor=Red Hat, Inc., vcs-type=git, io.openshift.expose-services=, batch=17.1_20250721.1, container_name=container-puppet-nova_libvirt, release=2, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-07-21T14:56:59, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, version=17.1.9, com.redhat.component=openstack-nova-libvirt-container, maintainer=OpenStack TripleO Team, io.buildah.version=1.33.12, vcs-ref=809f31d3cd93a9e04341110fb85686656c754dc0, description=Red Hat OpenStack Platform 17.1 nova-libvirt, config_id=tripleo_puppet_step1, tcib_managed=true, architecture=x86_64, name=rhosp17/openstack-nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, distribution-scope=public) Oct 14 04:05:50 localhost podman[52066]: 2025-10-14 08:05:50.170476178 +0000 UTC m=+1.206255632 container attach 4c438342fb722ce7a3d59e73e41c2395edc0f72e837b1d6bb00dfb2e80e443fb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=container-puppet-nova_libvirt, container_name=container-puppet-nova_libvirt, name=rhosp17/openstack-nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, vcs-ref=809f31d3cd93a9e04341110fb85686656c754dc0, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, release=2, io.openshift.expose-services=, build-date=2025-07-21T14:56:59, managed_by=tripleo_ansible, io.buildah.version=1.33.12, vcs-type=git, batch=17.1_20250721.1, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 
'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005486733', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,nova_config,libvirtd_config,virtlogd_config,virtproxyd_config,virtqemud_config,virtnodedevd_config,virtsecretd_config,virtstoraged_config,nova_config,file,libvirt_tls_password,libvirtd_config,nova_config,file,libvirt_tls_password', 'NAME': 'nova_libvirt', 'STEP_CONFIG': "include ::tripleo::packages\n# TODO(emilien): figure how to deal with libvirt profile.\n# We'll probably treat it like we do with Neutron plugins.\n# Until then, just include it in the default nova-compute role.\ninclude tripleo::profile::base::nova::compute::libvirt\n\ninclude tripleo::profile::base::nova::libvirt\n\ninclude tripleo::profile::base::nova::compute::libvirt_guests\n\ninclude tripleo::profile::base::sshd\ninclude tripleo::profile::base::nova::migration::target"}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, tcib_managed=true, com.redhat.component=openstack-nova-libvirt-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-libvirt/images/17.1.9-2, description=Red Hat OpenStack Platform 17.1 nova-libvirt, 
config_id=tripleo_puppet_step1, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, vendor=Red Hat, Inc., version=17.1.9, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1) Oct 14 04:05:50 localhost podman[52061]: 2025-10-14 08:05:50.172716377 +0000 UTC m=+1.224002800 container start acd1c4ab6c1d8932ceea5284c5afe13514a697ecaf4356ac8c447e4780adbed8 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=container-puppet-iscsid, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, distribution-scope=public, container_name=container-puppet-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.9, com.redhat.component=openstack-iscsid-container, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20250721.1, config_id=tripleo_puppet_step1, vendor=Red Hat, Inc., name=rhosp17/openstack-iscsid, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, release=1, build-date=2025-07-21T13:27:15, io.buildah.version=1.33.12, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005486733', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,iscsid_config', 'NAME': 'iscsid', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::iscsid\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', 
'/etc/iscsi:/tmp/iscsi.host:z', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}) Oct 14 04:05:50 localhost podman[52061]: 2025-10-14 08:05:50.172954865 +0000 UTC m=+1.224241288 container attach acd1c4ab6c1d8932ceea5284c5afe13514a697ecaf4356ac8c447e4780adbed8 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=container-puppet-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, release=1, io.buildah.version=1.33.12, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20250721.1, io.openshift.expose-services=, managed_by=tripleo_ansible, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, name=rhosp17/openstack-iscsid, vcs-type=git, container_name=container-puppet-iscsid, tcib_managed=true, config_id=tripleo_puppet_step1, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.9, com.redhat.component=openstack-iscsid-container, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-07-21T13:27:15, architecture=x86_64, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': 
False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005486733', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,iscsid_config', 'NAME': 'iscsid', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::iscsid\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/iscsi:/tmp/iscsi.host:z', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, maintainer=OpenStack TripleO Team) Oct 14 04:05:50 localhost podman[52042]: 2025-10-14 08:05:50.184610373 +0000 UTC m=+1.268662929 container start e5e7c2638372aa10cb729f60ba8141c435263d73f9fdc0bea9ed569b948ad58a (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=container-puppet-metrics_qdr, io.openshift.expose-services=, version=17.1.9, name=rhosp17/openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, vendor=Red Hat, Inc., build-date=2025-07-21T13:07:59, batch=17.1_20250721.1, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, architecture=x86_64, config_id=tripleo_puppet_step1, release=1, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005486733', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'metrics_qdr', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::qdr\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=container-puppet-metrics_qdr, managed_by=tripleo_ansible, tcib_managed=true, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git) Oct 14 04:05:50 localhost podman[52042]: 2025-10-14 08:05:50.185035937 +0000 UTC m=+1.269088513 container attach 
e5e7c2638372aa10cb729f60ba8141c435263d73f9fdc0bea9ed569b948ad58a (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=container-puppet-metrics_qdr, tcib_managed=true, io.buildah.version=1.33.12, vcs-type=git, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, architecture=x86_64, release=1, vendor=Red Hat, Inc., build-date=2025-07-21T13:07:59, io.openshift.expose-services=, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_puppet_step1, maintainer=OpenStack TripleO Team, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005486733', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'metrics_qdr', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::qdr\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', 
'/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, batch=17.1_20250721.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.9, name=rhosp17/openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, container_name=container-puppet-metrics_qdr, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1) Oct 14 04:05:51 localhost ovs-vsctl[52349]: ovs|00001|db_ctl_base|ERR|unix:/var/run/openvswitch/db.sock: database connection failed (No such file or directory) Oct 14 04:05:51 localhost puppet-user[52190]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5 Oct 14 04:05:51 localhost puppet-user[52190]: (file: /etc/puppet/hiera.yaml) Oct 14 04:05:51 localhost puppet-user[52190]: Warning: Undefined variable '::deploy_config_name'; Oct 14 04:05:51 localhost puppet-user[52190]: (file & line not available) Oct 14 04:05:52 localhost puppet-user[52190]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html Oct 14 04:05:52 localhost puppet-user[52190]: (file & line not available) Oct 14 04:05:52 localhost puppet-user[52177]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5 Oct 14 04:05:52 localhost puppet-user[52177]: (file: /etc/puppet/hiera.yaml) Oct 14 04:05:52 localhost puppet-user[52177]: Warning: Undefined variable '::deploy_config_name'; Oct 14 04:05:52 localhost puppet-user[52177]: (file & line not available) Oct 14 04:05:52 localhost puppet-user[52173]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. 
It should be converted to version 5 Oct 14 04:05:52 localhost puppet-user[52173]: (file: /etc/puppet/hiera.yaml) Oct 14 04:05:52 localhost puppet-user[52173]: Warning: Undefined variable '::deploy_config_name'; Oct 14 04:05:52 localhost puppet-user[52173]: (file & line not available) Oct 14 04:05:52 localhost puppet-user[52177]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html Oct 14 04:05:52 localhost puppet-user[52177]: (file & line not available) Oct 14 04:05:52 localhost puppet-user[52190]: Notice: Accepting previously invalid value for target type 'Integer' Oct 14 04:05:52 localhost puppet-user[52173]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html Oct 14 04:05:52 localhost puppet-user[52173]: (file & line not available) Oct 14 04:05:52 localhost puppet-user[52175]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5 Oct 14 04:05:52 localhost puppet-user[52175]: (file: /etc/puppet/hiera.yaml) Oct 14 04:05:52 localhost puppet-user[52175]: Warning: Undefined variable '::deploy_config_name'; Oct 14 04:05:52 localhost puppet-user[52175]: (file & line not available) Oct 14 04:05:52 localhost puppet-user[52190]: Notice: Compiled catalog for np0005486733.localdomain in environment production in 0.13 seconds Oct 14 04:05:52 localhost puppet-user[52173]: Notice: Compiled catalog for np0005486733.localdomain in environment production in 0.08 seconds Oct 14 04:05:52 localhost puppet-user[52175]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. 
See https://puppet.com/docs/puppet/7.10/deprecated_language.html Oct 14 04:05:52 localhost puppet-user[52175]: (file & line not available) Oct 14 04:05:52 localhost puppet-user[52173]: Notice: /Stage[main]/Tripleo::Profile::Base::Logging::Logrotate/File[/etc/logrotate-crond.conf]/ensure: defined content as '{sha256}1c3202f58bd2ae16cb31badcbb7f0d4e6697157b987d1887736ad96bb73d70b0' Oct 14 04:05:52 localhost puppet-user[52190]: Notice: /Stage[main]/Qdr::Config/File[/var/lib/qdrouterd]/owner: owner changed 'qdrouterd' to 'root' Oct 14 04:05:52 localhost puppet-user[52190]: Notice: /Stage[main]/Qdr::Config/File[/var/lib/qdrouterd]/group: group changed 'qdrouterd' to 'root' Oct 14 04:05:52 localhost puppet-user[52190]: Notice: /Stage[main]/Qdr::Config/File[/var/lib/qdrouterd]/mode: mode changed '0700' to '0755' Oct 14 04:05:52 localhost puppet-user[52190]: Notice: /Stage[main]/Qdr::Config/File[/etc/qpid-dispatch/ssl]/ensure: created Oct 14 04:05:52 localhost puppet-user[52190]: Notice: /Stage[main]/Qdr::Config/File[qdrouterd.conf]/content: content changed '{sha256}89e10d8896247f992c5f0baf027c25a8ca5d0441be46d8859d9db2067ea74cd3' to '{sha256}43e138b8e1b278bb76322abe2e4719a94d31f53a9d747f5d70206ca6126c71d6' Oct 14 04:05:52 localhost puppet-user[52190]: Notice: /Stage[main]/Qdr::Config/File[/var/log/qdrouterd]/ensure: created Oct 14 04:05:52 localhost puppet-user[52173]: Notice: /Stage[main]/Tripleo::Profile::Base::Logging::Logrotate/Cron[logrotate-crond]/ensure: created Oct 14 04:05:52 localhost puppet-user[52190]: Notice: /Stage[main]/Qdr::Config/File[/var/log/qdrouterd/metrics_qdr.log]/ensure: created Oct 14 04:05:52 localhost puppet-user[52190]: Notice: Applied catalog in 0.03 seconds Oct 14 04:05:52 localhost puppet-user[52190]: Application: Oct 14 04:05:52 localhost puppet-user[52190]: Initial environment: production Oct 14 04:05:52 localhost puppet-user[52190]: Converged environment: production Oct 14 04:05:52 localhost puppet-user[52190]: Run mode: user Oct 14 
04:05:52 localhost puppet-user[52190]: Changes: Oct 14 04:05:52 localhost puppet-user[52190]: Total: 7 Oct 14 04:05:52 localhost puppet-user[52190]: Events: Oct 14 04:05:52 localhost puppet-user[52190]: Success: 7 Oct 14 04:05:52 localhost puppet-user[52190]: Total: 7 Oct 14 04:05:52 localhost puppet-user[52190]: Resources: Oct 14 04:05:52 localhost puppet-user[52190]: Skipped: 13 Oct 14 04:05:52 localhost puppet-user[52190]: Changed: 5 Oct 14 04:05:52 localhost puppet-user[52190]: Out of sync: 5 Oct 14 04:05:52 localhost puppet-user[52190]: Total: 20 Oct 14 04:05:52 localhost puppet-user[52190]: Time: Oct 14 04:05:52 localhost puppet-user[52190]: File: 0.01 Oct 14 04:05:52 localhost puppet-user[52190]: Transaction evaluation: 0.02 Oct 14 04:05:52 localhost puppet-user[52190]: Catalog application: 0.03 Oct 14 04:05:52 localhost puppet-user[52190]: Config retrieval: 0.16 Oct 14 04:05:52 localhost puppet-user[52190]: Last run: 1760429152 Oct 14 04:05:52 localhost puppet-user[52190]: Total: 0.03 Oct 14 04:05:52 localhost puppet-user[52190]: Version: Oct 14 04:05:52 localhost puppet-user[52190]: Config: 1760429151 Oct 14 04:05:52 localhost puppet-user[52190]: Puppet: 7.10.0 Oct 14 04:05:52 localhost puppet-user[52196]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. 
It should be converted to version 5 Oct 14 04:05:52 localhost puppet-user[52196]: (file: /etc/puppet/hiera.yaml) Oct 14 04:05:52 localhost puppet-user[52196]: Warning: Undefined variable '::deploy_config_name'; Oct 14 04:05:52 localhost puppet-user[52196]: (file & line not available) Oct 14 04:05:52 localhost puppet-user[52173]: Notice: Applied catalog in 0.06 seconds Oct 14 04:05:52 localhost puppet-user[52173]: Application: Oct 14 04:05:52 localhost puppet-user[52173]: Initial environment: production Oct 14 04:05:52 localhost puppet-user[52173]: Converged environment: production Oct 14 04:05:52 localhost puppet-user[52173]: Run mode: user Oct 14 04:05:52 localhost puppet-user[52173]: Changes: Oct 14 04:05:52 localhost puppet-user[52173]: Total: 2 Oct 14 04:05:52 localhost puppet-user[52173]: Events: Oct 14 04:05:52 localhost puppet-user[52173]: Success: 2 Oct 14 04:05:52 localhost puppet-user[52173]: Total: 2 Oct 14 04:05:52 localhost puppet-user[52173]: Resources: Oct 14 04:05:52 localhost puppet-user[52173]: Changed: 2 Oct 14 04:05:52 localhost puppet-user[52173]: Out of sync: 2 Oct 14 04:05:52 localhost puppet-user[52173]: Skipped: 7 Oct 14 04:05:52 localhost puppet-user[52173]: Total: 9 Oct 14 04:05:52 localhost puppet-user[52173]: Time: Oct 14 04:05:52 localhost puppet-user[52173]: File: 0.01 Oct 14 04:05:52 localhost puppet-user[52173]: Cron: 0.02 Oct 14 04:05:52 localhost puppet-user[52173]: Transaction evaluation: 0.05 Oct 14 04:05:52 localhost puppet-user[52173]: Catalog application: 0.06 Oct 14 04:05:52 localhost puppet-user[52173]: Config retrieval: 0.11 Oct 14 04:05:52 localhost puppet-user[52173]: Last run: 1760429152 Oct 14 04:05:52 localhost puppet-user[52173]: Total: 0.06 Oct 14 04:05:52 localhost puppet-user[52173]: Version: Oct 14 04:05:52 localhost puppet-user[52173]: Config: 1760429152 Oct 14 04:05:52 localhost puppet-user[52173]: Puppet: 7.10.0 Oct 14 04:05:52 localhost puppet-user[52196]: Warning: The function 'hiera' is deprecated in favor 
of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html Oct 14 04:05:52 localhost puppet-user[52196]: (file & line not available) Oct 14 04:05:52 localhost podman[51939]: 2025-10-14 08:05:48.837510841 +0000 UTC m=+0.034396877 image pull registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1 Oct 14 04:05:52 localhost puppet-user[52196]: Notice: Compiled catalog for np0005486733.localdomain in environment production in 0.11 seconds Oct 14 04:05:52 localhost puppet-user[52196]: Notice: /Stage[main]/Tripleo::Profile::Base::Iscsid/Exec[reset-iscsi-initiator-name]/returns: executed successfully Oct 14 04:05:52 localhost puppet-user[52196]: Notice: /Stage[main]/Tripleo::Profile::Base::Iscsid/File[/etc/iscsi/.initiator_reset]/ensure: created Oct 14 04:05:52 localhost puppet-user[52175]: Warning: Scope(Class[Nova]): The os_region_name parameter is deprecated and will be removed \ Oct 14 04:05:52 localhost puppet-user[52175]: in a future release. Use nova::cinder::os_region_name instead Oct 14 04:05:52 localhost puppet-user[52175]: Warning: Scope(Class[Nova]): The catalog_info parameter is deprecated and will be removed \ Oct 14 04:05:52 localhost puppet-user[52175]: in a future release. 
Use nova::cinder::catalog_info instead Oct 14 04:05:52 localhost puppet-user[52196]: Notice: /Stage[main]/Tripleo::Profile::Base::Iscsid/Exec[sync-iqn-to-host]/returns: executed successfully Oct 14 04:05:52 localhost puppet-user[52177]: Notice: Compiled catalog for np0005486733.localdomain in environment production in 0.38 seconds Oct 14 04:05:52 localhost podman[52645]: 2025-10-14 08:05:52.421557375 +0000 UTC m=+0.086510170 container create d76960b3ef10f2e9e25c8f677807e95f6772d682b73e83100105989c556baa57 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1, name=container-puppet-ceilometer, com.redhat.component=openstack-ceilometer-central-container, config_id=tripleo_puppet_step1, build-date=2025-07-21T14:49:23, distribution-scope=public, managed_by=tripleo_ansible, vcs-type=git, batch=17.1_20250721.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-central, io.openshift.tags=rhosp osp openstack osp-17.1, container_name=container-puppet-ceilometer, summary=Red Hat OpenStack Platform 17.1 ceilometer-central, vcs-ref=1ce3db7211bdafb9cc5e59a103488bd6a8dc3f2f, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-central, io.buildah.version=1.33.12, release=1, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-central/images/17.1.9-1, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005486733', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,ceilometer_config,ceilometer_config', 'NAME': 'ceilometer', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::ceilometer::agent::polling\ninclude 
tripleo::profile::base::ceilometer::agent::polling\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, description=Red Hat OpenStack Platform 17.1 ceilometer-central, io.openshift.expose-services=, architecture=x86_64, name=rhosp17/openstack-ceilometer-central, version=17.1.9) Oct 14 04:05:52 localhost systemd[1]: libpod-e5e7c2638372aa10cb729f60ba8141c435263d73f9fdc0bea9ed569b948ad58a.scope: Deactivated successfully. Oct 14 04:05:52 localhost systemd[1]: libpod-e5e7c2638372aa10cb729f60ba8141c435263d73f9fdc0bea9ed569b948ad58a.scope: Consumed 2.118s CPU time. Oct 14 04:05:52 localhost systemd[1]: Started libpod-conmon-d76960b3ef10f2e9e25c8f677807e95f6772d682b73e83100105989c556baa57.scope. Oct 14 04:05:52 localhost systemd[1]: Started libcrun container. 
Oct 14 04:05:52 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/215025152e7486dca6aa506e7e941c98eca167be4a4853b2a3771ef4f2b39afc/merged/var/lib/config-data supports timestamps until 2038 (0x7fffffff) Oct 14 04:05:52 localhost podman[52645]: 2025-10-14 08:05:52.376814263 +0000 UTC m=+0.041767038 image pull registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1 Oct 14 04:05:52 localhost systemd[1]: libpod-7c07f0587c79df4162acb2af02e0c77b7b00174934856f1af08337b0bb3d6f3c.scope: Deactivated successfully. Oct 14 04:05:52 localhost systemd[1]: libpod-7c07f0587c79df4162acb2af02e0c77b7b00174934856f1af08337b0bb3d6f3c.scope: Consumed 2.157s CPU time. Oct 14 04:05:52 localhost podman[52645]: 2025-10-14 08:05:52.480753382 +0000 UTC m=+0.145706187 container init d76960b3ef10f2e9e25c8f677807e95f6772d682b73e83100105989c556baa57 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1, name=container-puppet-ceilometer, config_id=tripleo_puppet_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-central, vcs-type=git, io.openshift.expose-services=, container_name=container-puppet-ceilometer, summary=Red Hat OpenStack Platform 17.1 ceilometer-central, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team, tcib_managed=true, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005486733', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,ceilometer_config,ceilometer_config', 'NAME': 'ceilometer', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::ceilometer::agent::polling\ninclude tripleo::profile::base::ceilometer::agent::polling\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1', 'volumes': 
['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, description=Red Hat OpenStack Platform 17.1 ceilometer-central, version=17.1.9, com.redhat.component=openstack-ceilometer-central-container, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-central, distribution-scope=public, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-central/images/17.1.9-1, release=1, name=rhosp17/openstack-ceilometer-central, batch=17.1_20250721.1, managed_by=tripleo_ansible, build-date=2025-07-21T14:49:23, vcs-ref=1ce3db7211bdafb9cc5e59a103488bd6a8dc3f2f, architecture=x86_64) Oct 14 04:05:52 localhost podman[52056]: 2025-10-14 08:05:52.481487205 +0000 UTC m=+3.534530853 container died 7c07f0587c79df4162acb2af02e0c77b7b00174934856f1af08337b0bb3d6f3c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=container-puppet-crond, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, vcs-type=git, release=1, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, io.openshift.expose-services=, 
com.redhat.component=openstack-cron-container, io.buildah.version=1.33.12, config_id=tripleo_puppet_step1, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, name=rhosp17/openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, container_name=container-puppet-crond, tcib_managed=true, batch=17.1_20250721.1, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, version=17.1.9, build-date=2025-07-21T13:07:52, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005486733', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'crond', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::logrotate'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public) Oct 14 04:05:52 localhost 
puppet-user[52175]: Warning: Unknown variable: '::nova::compute::verify_glance_signatures'. (file: /etc/puppet/modules/nova/manifests/glance.pp, line: 62, column: 41) Oct 14 04:05:52 localhost podman[52708]: 2025-10-14 08:05:52.507722433 +0000 UTC m=+0.041396467 container died e5e7c2638372aa10cb729f60ba8141c435263d73f9fdc0bea9ed569b948ad58a (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=container-puppet-metrics_qdr, name=rhosp17/openstack-qdrouterd, config_id=tripleo_puppet_step1, container_name=container-puppet-metrics_qdr, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, vcs-type=git, vendor=Red Hat, Inc., config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005486733', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'metrics_qdr', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::qdr\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', 
'/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, managed_by=tripleo_ansible, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/agreements, version=17.1.9, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, build-date=2025-07-21T13:07:59, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, architecture=x86_64, batch=17.1_20250721.1) Oct 14 04:05:52 localhost puppet-user[52175]: Warning: Unknown variable: '::nova::compute::libvirt::remove_unused_base_images'. (file: /etc/puppet/modules/nova/manifests/compute/image_cache.pp, line: 44, column: 5) Oct 14 04:05:52 localhost puppet-user[52175]: Warning: Unknown variable: '::nova::compute::libvirt::remove_unused_original_minimum_age_seconds'. (file: /etc/puppet/modules/nova/manifests/compute/image_cache.pp, line: 48, column: 5) Oct 14 04:05:52 localhost puppet-user[52175]: Warning: Unknown variable: '::nova::compute::libvirt::remove_unused_resized_minimum_age_seconds'. 
(file: /etc/puppet/modules/nova/manifests/compute/image_cache.pp, line: 52, column: 5) Oct 14 04:05:52 localhost puppet-user[52175]: Warning: Scope(Class[Tripleo::Profile::Base::Nova::Compute]): The keymgr_backend parameter has been deprecated Oct 14 04:05:52 localhost puppet-user[52177]: Notice: /Stage[main]/Collectd::Config/File[collectd.conf]/content: content changed '{sha256}aea388a73ebafc7e07a81ddb930a91099211f660eee55fbf92c13007a77501e5' to '{sha256}2523d01ee9c3022c0e9f61d896b1474a168e18472aee141cc278e69fe13f41c1' Oct 14 04:05:52 localhost puppet-user[52177]: Notice: /Stage[main]/Collectd::Config/File[collectd.conf]/owner: owner changed 'collectd' to 'root' Oct 14 04:05:52 localhost puppet-user[52177]: Notice: /Stage[main]/Collectd::Config/File[collectd.conf]/group: group changed 'collectd' to 'root' Oct 14 04:05:52 localhost puppet-user[52177]: Notice: /Stage[main]/Collectd::Config/File[collectd.conf]/mode: mode changed '0644' to '0640' Oct 14 04:05:52 localhost puppet-user[52175]: Warning: Scope(Class[Nova::Compute]): vcpu_pin_set is deprecated, instead use cpu_dedicated_set or cpu_shared_set. Oct 14 04:05:52 localhost puppet-user[52175]: Warning: Scope(Class[Nova::Compute]): verify_glance_signatures is deprecated. 
Use the same parameter in nova::glance Oct 14 04:05:52 localhost puppet-user[52177]: Notice: /Stage[main]/Collectd::Config/File[collectd.d]/owner: owner changed 'collectd' to 'root' Oct 14 04:05:52 localhost puppet-user[52177]: Notice: /Stage[main]/Collectd::Config/File[collectd.d]/group: group changed 'collectd' to 'root' Oct 14 04:05:52 localhost puppet-user[52177]: Notice: /Stage[main]/Collectd::Config/File[collectd.d]/mode: mode changed '0755' to '0750' Oct 14 04:05:52 localhost puppet-user[52177]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/90-default-plugins-cpu.conf]/ensure: removed Oct 14 04:05:52 localhost puppet-user[52177]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/90-default-plugins-interface.conf]/ensure: removed Oct 14 04:05:52 localhost puppet-user[52177]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/90-default-plugins-load.conf]/ensure: removed Oct 14 04:05:52 localhost puppet-user[52177]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/90-default-plugins-memory.conf]/ensure: removed Oct 14 04:05:52 localhost puppet-user[52177]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/90-default-plugins-syslog.conf]/ensure: removed Oct 14 04:05:52 localhost puppet-user[52177]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/apache.conf]/ensure: removed Oct 14 04:05:52 localhost puppet-user[52177]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/dns.conf]/ensure: removed Oct 14 04:05:52 localhost puppet-user[52177]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/ipmi.conf]/ensure: removed Oct 14 04:05:52 localhost puppet-user[52177]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/mcelog.conf]/ensure: removed Oct 14 04:05:52 localhost puppet-user[52177]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/mysql.conf]/ensure: removed Oct 14 04:05:52 localhost puppet-user[52177]: Notice: 
/Stage[main]/Collectd::Config/File[/etc/collectd.d/ovs-events.conf]/ensure: removed Oct 14 04:05:52 localhost puppet-user[52177]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/ovs-stats.conf]/ensure: removed Oct 14 04:05:52 localhost puppet-user[52177]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/ping.conf]/ensure: removed Oct 14 04:05:52 localhost puppet-user[52177]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/pmu.conf]/ensure: removed Oct 14 04:05:52 localhost puppet-user[52177]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/rdt.conf]/ensure: removed Oct 14 04:05:52 localhost puppet-user[52177]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/sensors.conf]/ensure: removed Oct 14 04:05:52 localhost puppet-user[52177]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/snmp.conf]/ensure: removed Oct 14 04:05:52 localhost puppet-user[52177]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/write_prometheus.conf]/ensure: removed Oct 14 04:05:52 localhost puppet-user[52177]: Notice: /Stage[main]/Collectd::Plugin::Python/File[/usr/lib/python3.9/site-packages]/mode: mode changed '0755' to '0750' Oct 14 04:05:52 localhost puppet-user[52177]: Notice: /Stage[main]/Collectd::Plugin::Python/Collectd::Plugin[python]/File[python.load]/ensure: defined content as '{sha256}0163924a0099dd43fe39cb85e836df147fd2cfee8197dc6866d3c384539eb6ee' Oct 14 04:05:52 localhost podman[52708]: 2025-10-14 08:05:52.671926012 +0000 UTC m=+0.205600046 container cleanup e5e7c2638372aa10cb729f60ba8141c435263d73f9fdc0bea9ed569b948ad58a (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=container-puppet-metrics_qdr, vendor=Red Hat, Inc., architecture=x86_64, batch=17.1_20250721.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005486733', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'metrics_qdr', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::qdr\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, io.openshift.expose-services=, com.redhat.component=openstack-qdrouterd-container, vcs-type=git, container_name=container-puppet-metrics_qdr, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1, version=17.1.9, build-date=2025-07-21T13:07:59, io.buildah.version=1.33.12, config_id=tripleo_puppet_step1, 
vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, distribution-scope=public, maintainer=OpenStack TripleO Team) Oct 14 04:05:52 localhost puppet-user[52177]: Notice: /Stage[main]/Collectd::Plugin::Python/Concat[/etc/collectd.d/python-config.conf]/File[/etc/collectd.d/python-config.conf]/ensure: defined content as '{sha256}2e5fb20e60b30f84687fc456a37fc62451000d2d85f5bbc1b3fca3a5eac9deeb' Oct 14 04:05:52 localhost systemd[1]: libpod-conmon-e5e7c2638372aa10cb729f60ba8141c435263d73f9fdc0bea9ed569b948ad58a.scope: Deactivated successfully. Oct 14 04:05:52 localhost puppet-user[52177]: Notice: /Stage[main]/Collectd::Plugin::Logfile/Collectd::Plugin[logfile]/File[logfile.load]/ensure: defined content as '{sha256}07bbda08ef9b824089500bdc6ac5a86e7d1ef2ae3ed4ed423c0559fe6361e5af' Oct 14 04:05:52 localhost python3[51855]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name container-puppet-metrics_qdr --conmon-pidfile /run/container-puppet-metrics_qdr.pid --detach=False --entrypoint /var/lib/container-puppet/container-puppet.sh --env STEP=6 --env NET_HOST=true --env DEBUG=true --env HOSTNAME=np0005486733 --env NO_ARCHIVE= --env PUPPET_TAGS=file,file_line,concat,augeas,cron --env NAME=metrics_qdr --env STEP_CONFIG=include ::tripleo::packages#012include tripleo::profile::base::metrics::qdr#012 --label config_id=tripleo_puppet_step1 --label container_name=container-puppet-metrics_qdr --label managed_by=tripleo_ansible --label config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005486733', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'metrics_qdr', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::qdr\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'volumes': 
['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/container-puppet-metrics_qdr.log --network host --security-opt label=disable --user 0 --volume /dev/log:/dev/log:rw --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /var/lib/config-data:/var/lib/config-data:rw --volume /var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro --volume /var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro --volume /var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1 Oct 14 04:05:52 localhost puppet-user[52177]: Notice: /Stage[main]/Collectd::Plugin::Amqp1/Collectd::Plugin[amqp1]/File[amqp1.load]/ensure: defined content as 
'{sha256}8dd3769945b86c38433504b97f7851a931eb3c94b667298d10a9796a3d020595' Oct 14 04:05:52 localhost puppet-user[52177]: Notice: /Stage[main]/Collectd::Plugin::Ceph/Collectd::Plugin[ceph]/File[ceph.load]/ensure: defined content as '{sha256}c796abffda2e860875295b4fc11cc95c6032b4e13fa8fb128e839a305aa1676c' Oct 14 04:05:52 localhost puppet-user[52177]: Notice: /Stage[main]/Collectd::Plugin::Cpu/Collectd::Plugin[cpu]/File[cpu.load]/ensure: defined content as '{sha256}67d4c8bf6bf5785f4cb6b596712204d9eacbcebbf16fe289907195d4d3cb0e34' Oct 14 04:05:52 localhost puppet-user[52177]: Notice: /Stage[main]/Collectd::Plugin::Df/Collectd::Plugin[df]/File[df.load]/ensure: defined content as '{sha256}edeb4716d96fc9dca2c6adfe07bae70ba08c6af3944a3900581cba0f08f3c4ba' Oct 14 04:05:52 localhost puppet-user[52177]: Notice: /Stage[main]/Collectd::Plugin::Disk/Collectd::Plugin[disk]/File[disk.load]/ensure: defined content as '{sha256}1d0cb838278f3226fcd381f0fc2e0e1abaf0d590f4ba7bcb2fc6ec113d3ebde7' Oct 14 04:05:52 localhost puppet-user[52177]: Notice: /Stage[main]/Collectd::Plugin::Hugepages/Collectd::Plugin[hugepages]/File[hugepages.load]/ensure: defined content as '{sha256}9b9f35b65a73da8d4037e4355a23b678f2cf61997ccf7a5e1adf2a7ce6415827' Oct 14 04:05:52 localhost puppet-user[52177]: Notice: /Stage[main]/Collectd::Plugin::Hugepages/Collectd::Plugin[hugepages]/File[older_hugepages.load]/ensure: removed Oct 14 04:05:52 localhost puppet-user[52177]: Notice: /Stage[main]/Collectd::Plugin::Interface/Collectd::Plugin[interface]/File[interface.load]/ensure: defined content as '{sha256}b76b315dc312e398940fe029c6dbc5c18d2b974ff7527469fc7d3617b5222046' Oct 14 04:05:52 localhost puppet-user[52177]: Notice: /Stage[main]/Collectd::Plugin::Load/Collectd::Plugin[load]/File[load.load]/ensure: defined content as '{sha256}af2403f76aebd2f10202d66d2d55e1a8d987eed09ced5a3e3873a4093585dc31' Oct 14 04:05:52 localhost puppet-user[52177]: Notice: 
/Stage[main]/Collectd::Plugin::Memory/Collectd::Plugin[memory]/File[memory.load]/ensure: defined content as '{sha256}0f270425ee6b05fc9440ee32b9afd1010dcbddd9b04ca78ff693858f7ecb9d0e' Oct 14 04:05:52 localhost puppet-user[52177]: Notice: /Stage[main]/Collectd::Plugin::Unixsock/Collectd::Plugin[unixsock]/File[unixsock.load]/ensure: defined content as '{sha256}9d1ec1c51ba386baa6f62d2e019dbd6998ad924bf868b3edc2d24d3dc3c63885' Oct 14 04:05:52 localhost puppet-user[52177]: Notice: /Stage[main]/Collectd::Plugin::Uptime/Collectd::Plugin[uptime]/File[uptime.load]/ensure: defined content as '{sha256}f7a26c6369f904d0ca1af59627ebea15f5e72160bcacdf08d217af282b42e5c0' Oct 14 04:05:52 localhost podman[52722]: 2025-10-14 08:05:52.734647811 +0000 UTC m=+0.243720329 container cleanup 7c07f0587c79df4162acb2af02e0c77b7b00174934856f1af08337b0bb3d6f3c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=container-puppet-crond, description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., version=17.1.9, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, build-date=2025-07-21T13:07:52, container_name=container-puppet-crond, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, batch=17.1_20250721.1, com.redhat.component=openstack-cron-container, release=1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_puppet_step1, io.openshift.expose-services=, vcs-type=git, name=rhosp17/openstack-cron, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 
'DEBUG': 'true', 'HOSTNAME': 'np0005486733', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'crond', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::logrotate'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, summary=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64) Oct 14 04:05:52 localhost puppet-user[52177]: Notice: /Stage[main]/Collectd::Plugin::Virt/Collectd::Plugin[virt]/File[virt.load]/ensure: defined content as '{sha256}9a2bcf913f6bf8a962a0ff351a9faea51ae863cc80af97b77f63f8ab68941c62' Oct 14 04:05:52 localhost puppet-user[52177]: Notice: /Stage[main]/Collectd::Plugin::Virt/Collectd::Plugin[virt]/File[older_virt.load]/ensure: removed Oct 14 04:05:52 localhost systemd[1]: libpod-conmon-7c07f0587c79df4162acb2af02e0c77b7b00174934856f1af08337b0bb3d6f3c.scope: Deactivated successfully. 
Oct 14 04:05:52 localhost puppet-user[52177]: Notice: Applied catalog in 0.26 seconds Oct 14 04:05:52 localhost puppet-user[52177]: Application: Oct 14 04:05:52 localhost puppet-user[52177]: Initial environment: production Oct 14 04:05:52 localhost puppet-user[52177]: Converged environment: production Oct 14 04:05:52 localhost puppet-user[52177]: Run mode: user Oct 14 04:05:52 localhost puppet-user[52177]: Changes: Oct 14 04:05:52 localhost puppet-user[52177]: Total: 43 Oct 14 04:05:52 localhost puppet-user[52177]: Events: Oct 14 04:05:52 localhost puppet-user[52177]: Success: 43 Oct 14 04:05:52 localhost puppet-user[52177]: Total: 43 Oct 14 04:05:52 localhost puppet-user[52177]: Resources: Oct 14 04:05:52 localhost puppet-user[52177]: Skipped: 14 Oct 14 04:05:52 localhost puppet-user[52177]: Changed: 38 Oct 14 04:05:52 localhost puppet-user[52177]: Out of sync: 38 Oct 14 04:05:52 localhost puppet-user[52177]: Total: 82 Oct 14 04:05:52 localhost puppet-user[52177]: Time: Oct 14 04:05:52 localhost puppet-user[52177]: File: 0.13 Oct 14 04:05:52 localhost puppet-user[52177]: Transaction evaluation: 0.25 Oct 14 04:05:52 localhost puppet-user[52177]: Catalog application: 0.26 Oct 14 04:05:52 localhost puppet-user[52177]: Config retrieval: 0.49 Oct 14 04:05:52 localhost puppet-user[52177]: Last run: 1760429152 Oct 14 04:05:52 localhost puppet-user[52177]: Concat fragment: 0.00 Oct 14 04:05:52 localhost puppet-user[52177]: Concat file: 0.00 Oct 14 04:05:52 localhost puppet-user[52177]: Total: 0.26 Oct 14 04:05:52 localhost puppet-user[52177]: Version: Oct 14 04:05:52 localhost puppet-user[52177]: Config: 1760429152 Oct 14 04:05:52 localhost puppet-user[52177]: Puppet: 7.10.0 Oct 14 04:05:52 localhost podman[52645]: 2025-10-14 08:05:52.747664002 +0000 UTC m=+0.412616807 container start d76960b3ef10f2e9e25c8f677807e95f6772d682b73e83100105989c556baa57 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1, name=container-puppet-ceilometer, 
build-date=2025-07-21T14:49:23, vcs-type=git, tcib_managed=true, io.buildah.version=1.33.12, batch=17.1_20250721.1, release=1, version=17.1.9, managed_by=tripleo_ansible, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-central/images/17.1.9-1, container_name=container-puppet-ceilometer, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005486733', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,ceilometer_config,ceilometer_config', 'NAME': 'ceilometer', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::ceilometer::agent::polling\ninclude tripleo::profile::base::ceilometer::agent::polling\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, vcs-ref=1ce3db7211bdafb9cc5e59a103488bd6a8dc3f2f, name=rhosp17/openstack-ceilometer-central, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-central, config_id=tripleo_puppet_step1, 
io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-central, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-central, com.redhat.component=openstack-ceilometer-central-container, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-central) Oct 14 04:05:52 localhost podman[52645]: 2025-10-14 08:05:52.748143917 +0000 UTC m=+0.413096712 container attach d76960b3ef10f2e9e25c8f677807e95f6772d682b73e83100105989c556baa57 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1, name=container-puppet-ceilometer, vendor=Red Hat, Inc., io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-central, config_id=tripleo_puppet_step1, version=17.1.9, vcs-ref=1ce3db7211bdafb9cc5e59a103488bd6a8dc3f2f, build-date=2025-07-21T14:49:23, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005486733', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,ceilometer_config,ceilometer_config', 'NAME': 'ceilometer', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::ceilometer::agent::polling\ninclude tripleo::profile::base::ceilometer::agent::polling\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, description=Red Hat OpenStack Platform 17.1 ceilometer-central, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-central/images/17.1.9-1, release=1, architecture=x86_64, distribution-scope=public, com.redhat.component=openstack-ceilometer-central-container, container_name=container-puppet-ceilometer, io.buildah.version=1.33.12, batch=17.1_20250721.1, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-central, summary=Red Hat OpenStack Platform 17.1 ceilometer-central, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-central, com.redhat.license_terms=https://www.redhat.com/agreements) Oct 14 04:05:52 localhost puppet-user[52175]: Warning: Scope(Class[Nova::Compute::Libvirt]): nova::compute::libvirt::images_type will be required if rbd ephemeral storage is used. 
Oct 14 04:05:52 localhost puppet-user[52196]: Notice: /Stage[main]/Tripleo::Profile::Base::Iscsid/Augeas[chap_algs in /etc/iscsi/iscsid.conf]/returns: executed successfully Oct 14 04:05:52 localhost python3[51855]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name container-puppet-crond --conmon-pidfile /run/container-puppet-crond.pid --detach=False --entrypoint /var/lib/container-puppet/container-puppet.sh --env STEP=6 --env NET_HOST=true --env DEBUG=true --env HOSTNAME=np0005486733 --env NO_ARCHIVE= --env PUPPET_TAGS=file,file_line,concat,augeas,cron --env NAME=crond --env STEP_CONFIG=include ::tripleo::packages#012include tripleo::profile::base::logging::logrotate --label config_id=tripleo_puppet_step1 --label container_name=container-puppet-crond --label managed_by=tripleo_ansible --label config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005486733', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'crond', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::logrotate'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', 
'/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/container-puppet-crond.log --network host --security-opt label=disable --user 0 --volume /dev/log:/dev/log:rw --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /var/lib/config-data:/var/lib/config-data:rw --volume /var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro --volume /var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro --volume /var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro registry.redhat.io/rhosp-rhel9/openstack-cron:17.1 Oct 14 04:05:52 localhost puppet-user[52196]: Notice: Applied catalog in 0.53 seconds Oct 14 04:05:52 localhost puppet-user[52196]: Application: Oct 14 04:05:52 localhost puppet-user[52196]: Initial environment: production Oct 14 04:05:52 localhost puppet-user[52196]: Converged environment: production Oct 14 04:05:52 localhost puppet-user[52196]: Run mode: user Oct 14 04:05:52 localhost puppet-user[52196]: Changes: Oct 14 04:05:52 localhost puppet-user[52196]: Total: 4 Oct 14 04:05:52 localhost puppet-user[52196]: Events: Oct 14 04:05:52 localhost puppet-user[52196]: Success: 4 Oct 14 04:05:52 localhost puppet-user[52196]: Total: 4 Oct 14 04:05:52 localhost puppet-user[52196]: Resources: Oct 14 04:05:52 localhost puppet-user[52196]: Changed: 4 Oct 14 04:05:52 localhost puppet-user[52196]: Out of sync: 
4 Oct 14 04:05:52 localhost puppet-user[52196]: Skipped: 8 Oct 14 04:05:52 localhost puppet-user[52196]: Total: 13 Oct 14 04:05:52 localhost puppet-user[52196]: Time: Oct 14 04:05:52 localhost puppet-user[52196]: File: 0.00 Oct 14 04:05:52 localhost puppet-user[52196]: Exec: 0.07 Oct 14 04:05:52 localhost puppet-user[52196]: Config retrieval: 0.14 Oct 14 04:05:52 localhost puppet-user[52196]: Augeas: 0.43 Oct 14 04:05:52 localhost puppet-user[52196]: Transaction evaluation: 0.53 Oct 14 04:05:52 localhost puppet-user[52196]: Catalog application: 0.53 Oct 14 04:05:52 localhost puppet-user[52196]: Last run: 1760429152 Oct 14 04:05:52 localhost puppet-user[52196]: Total: 0.53 Oct 14 04:05:52 localhost puppet-user[52196]: Version: Oct 14 04:05:52 localhost puppet-user[52196]: Config: 1760429152 Oct 14 04:05:52 localhost puppet-user[52196]: Puppet: 7.10.0 Oct 14 04:05:53 localhost systemd[1]: libpod-acd1c4ab6c1d8932ceea5284c5afe13514a697ecaf4356ac8c447e4780adbed8.scope: Deactivated successfully. Oct 14 04:05:53 localhost systemd[1]: libpod-acd1c4ab6c1d8932ceea5284c5afe13514a697ecaf4356ac8c447e4780adbed8.scope: Consumed 2.693s CPU time. 
Oct 14 04:05:53 localhost podman[52061]: 2025-10-14 08:05:53.100938156 +0000 UTC m=+4.152224549 container died acd1c4ab6c1d8932ceea5284c5afe13514a697ecaf4356ac8c447e4780adbed8 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=container-puppet-iscsid, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., io.openshift.expose-services=, build-date=2025-07-21T13:27:15, managed_by=tripleo_ansible, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, summary=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.9, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005486733', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,iscsid_config', 'NAME': 'iscsid', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::iscsid\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/iscsi:/tmp/iscsi.host:z', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', 
'/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1, batch=17.1_20250721.1, config_id=tripleo_puppet_step1, container_name=container-puppet-iscsid, architecture=x86_64, com.redhat.component=openstack-iscsid-container, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, release=1, tcib_managed=true, io.buildah.version=1.33.12, name=rhosp17/openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid) Oct 14 04:05:53 localhost podman[52867]: 2025-10-14 08:05:53.116222437 +0000 UTC m=+0.087484350 container create 6da84e96fa92b609f00406101e268dd1331c0e896b503ade54cf93e0e5d81b9c (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=container-puppet-rsyslog, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005486733', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,rsyslog::generate_concat,concat::fragment', 'NAME': 'rsyslog', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::rsyslog'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', 
'/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1, description=Red Hat OpenStack Platform 17.1 rsyslog, maintainer=OpenStack TripleO Team, vcs-type=git, vcs-ref=38a223d7b691af709e0a5f628409462e34eea167, architecture=x86_64, build-date=2025-07-21T12:58:40, io.buildah.version=1.33.12, container_name=container-puppet-rsyslog, config_id=tripleo_puppet_step1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-rsyslog/images/17.1.9-1, managed_by=tripleo_ansible, distribution-scope=public, com.redhat.component=openstack-rsyslog-container, summary=Red Hat OpenStack Platform 17.1 rsyslog, batch=17.1_20250721.1, version=17.1.9, vendor=Red Hat, Inc., release=1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhosp17/openstack-rsyslog, io.openshift.expose-services=, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog) Oct 14 04:05:53 localhost systemd[1]: Started libpod-conmon-6da84e96fa92b609f00406101e268dd1331c0e896b503ade54cf93e0e5d81b9c.scope. Oct 14 04:05:53 localhost systemd[1]: libpod-e7a289a1f3de1ca338cb8413f958fb6b1eb7d6cab69c1586cdb10c1b26af17ba.scope: Deactivated successfully. Oct 14 04:05:53 localhost systemd[1]: libpod-e7a289a1f3de1ca338cb8413f958fb6b1eb7d6cab69c1586cdb10c1b26af17ba.scope: Consumed 2.764s CPU time. 
Oct 14 04:05:53 localhost podman[52008]: 2025-10-14 08:05:53.145857503 +0000 UTC m=+4.261261808 container died e7a289a1f3de1ca338cb8413f958fb6b1eb7d6cab69c1586cdb10c1b26af17ba (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=container-puppet-collectd, io.openshift.expose-services=, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, container_name=container-puppet-collectd, tcib_managed=true, managed_by=tripleo_ansible, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005486733', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,collectd_client_config,exec', 'NAME': 'collectd', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::collectd'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, architecture=x86_64, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, release=2, distribution-scope=public, com.redhat.component=openstack-collectd-container, build-date=2025-07-21T13:04:03, 
description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.9, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_puppet_step1, maintainer=OpenStack TripleO Team, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, batch=17.1_20250721.1, name=rhosp17/openstack-collectd, io.buildah.version=1.33.12) Oct 14 04:05:53 localhost systemd[1]: Started libcrun container. Oct 14 04:05:53 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/edfe04f9a55df360a7193df528cae2e7de5655e253d67fa05d14b0067469a682/merged/var/lib/config-data supports timestamps until 2038 (0x7fffffff) Oct 14 04:05:53 localhost podman[52867]: 2025-10-14 08:05:53.057391012 +0000 UTC m=+0.028652945 image pull registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1 Oct 14 04:05:53 localhost podman[52914]: 2025-10-14 08:05:53.206541346 +0000 UTC m=+0.095154012 container cleanup acd1c4ab6c1d8932ceea5284c5afe13514a697ecaf4356ac8c447e4780adbed8 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=container-puppet-iscsid, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005486733', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,iscsid_config', 'NAME': 'iscsid', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::iscsid\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/iscsi:/tmp/iscsi.host:z', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, com.redhat.component=openstack-iscsid-container, description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-07-21T13:27:15, name=rhosp17/openstack-iscsid, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, io.buildah.version=1.33.12, batch=17.1_20250721.1, config_id=tripleo_puppet_step1, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1, architecture=x86_64, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, vcs-type=git, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.9, container_name=container-puppet-iscsid) Oct 14 04:05:53 localhost systemd[1]: libpod-conmon-acd1c4ab6c1d8932ceea5284c5afe13514a697ecaf4356ac8c447e4780adbed8.scope: Deactivated successfully. 
Oct 14 04:05:53 localhost python3[51855]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name container-puppet-iscsid --conmon-pidfile /run/container-puppet-iscsid.pid --detach=False --entrypoint /var/lib/container-puppet/container-puppet.sh --env STEP=6 --env NET_HOST=true --env DEBUG=true --env HOSTNAME=np0005486733 --env NO_ARCHIVE= --env PUPPET_TAGS=file,file_line,concat,augeas,cron,iscsid_config --env NAME=iscsid --env STEP_CONFIG=include ::tripleo::packages#012include tripleo::profile::base::iscsid#012 --label config_id=tripleo_puppet_step1 --label container_name=container-puppet-iscsid --label managed_by=tripleo_ansible --label config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005486733', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,iscsid_config', 'NAME': 'iscsid', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::iscsid\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/iscsi:/tmp/iscsi.host:z', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']} 
--log-driver k8s-file --log-opt path=/var/log/containers/stdouts/container-puppet-iscsid.log --network host --security-opt label=disable --user 0 --volume /dev/log:/dev/log:rw --volume /etc/hosts:/etc/hosts:ro --volume /etc/iscsi:/tmp/iscsi.host:z --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /var/lib/config-data:/var/lib/config-data:rw --volume /var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro --volume /var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro --volume /var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1 Oct 14 04:05:53 localhost podman[52938]: 2025-10-14 08:05:53.221064314 +0000 UTC m=+0.065067163 container cleanup e7a289a1f3de1ca338cb8413f958fb6b1eb7d6cab69c1586cdb10c1b26af17ba (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=container-puppet-collectd, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-07-21T13:04:03, description=Red Hat OpenStack Platform 17.1 collectd, container_name=container-puppet-collectd, io.openshift.tags=rhosp osp openstack osp-17.1, version=17.1.9, com.redhat.component=openstack-collectd-container, maintainer=OpenStack TripleO Team, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, release=2, vcs-type=git, 
architecture=x86_64, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005486733', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,collectd_client_config,exec', 'NAME': 'collectd', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::collectd'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, io.openshift.expose-services=, vendor=Red Hat, Inc., config_id=tripleo_puppet_step1, name=rhosp17/openstack-collectd, tcib_managed=true, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, com.redhat.license_terms=https://www.redhat.com/agreements, batch=17.1_20250721.1, managed_by=tripleo_ansible) Oct 14 04:05:53 localhost systemd[1]: libpod-conmon-e7a289a1f3de1ca338cb8413f958fb6b1eb7d6cab69c1586cdb10c1b26af17ba.scope: Deactivated successfully. 
Oct 14 04:05:53 localhost python3[51855]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name container-puppet-collectd --conmon-pidfile /run/container-puppet-collectd.pid --detach=False --entrypoint /var/lib/container-puppet/container-puppet.sh --env STEP=6 --env NET_HOST=true --env DEBUG=true --env HOSTNAME=np0005486733 --env NO_ARCHIVE= --env PUPPET_TAGS=file,file_line,concat,augeas,cron,collectd_client_config,exec --env NAME=collectd --env STEP_CONFIG=include ::tripleo::packages#012include tripleo::profile::base::metrics::collectd --label config_id=tripleo_puppet_step1 --label container_name=container-puppet-collectd --label managed_by=tripleo_ansible --label config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005486733', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,collectd_client_config,exec', 'NAME': 'collectd', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::collectd'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', 
'/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/container-puppet-collectd.log --network host --security-opt label=disable --user 0 --volume /dev/log:/dev/log:rw --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /var/lib/config-data:/var/lib/config-data:rw --volume /var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro --volume /var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro --volume /var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1 Oct 14 04:05:53 localhost podman[52867]: 2025-10-14 08:05:53.26593915 +0000 UTC m=+0.237201113 container init 6da84e96fa92b609f00406101e268dd1331c0e896b503ade54cf93e0e5d81b9c (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=container-puppet-rsyslog, version=17.1.9, description=Red Hat OpenStack Platform 17.1 rsyslog, build-date=2025-07-21T12:58:40, batch=17.1_20250721.1, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-rsyslog/images/17.1.9-1, container_name=container-puppet-rsyslog, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, config_id=tripleo_puppet_step1, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, distribution-scope=public, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 rsyslog, vendor=Red Hat, Inc., 
config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005486733', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,rsyslog::generate_concat,concat::fragment', 'NAME': 'rsyslog', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::rsyslog'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, vcs-type=git, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=38a223d7b691af709e0a5f628409462e34eea167, architecture=x86_64, name=rhosp17/openstack-rsyslog, release=1, com.redhat.component=openstack-rsyslog-container) Oct 14 04:05:53 localhost puppet-user[52175]: Notice: Compiled catalog for np0005486733.localdomain in environment production in 1.24 seconds Oct 14 04:05:53 localhost podman[52867]: 2025-10-14 08:05:53.32870432 +0000 UTC m=+0.299966263 container start 
6da84e96fa92b609f00406101e268dd1331c0e896b503ade54cf93e0e5d81b9c (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=container-puppet-rsyslog, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, io.buildah.version=1.33.12, batch=17.1_20250721.1, io.openshift.expose-services=, container_name=container-puppet-rsyslog, tcib_managed=true, vendor=Red Hat, Inc., architecture=x86_64, name=rhosp17/openstack-rsyslog, version=17.1.9, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005486733', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,rsyslog::generate_concat,concat::fragment', 'NAME': 'rsyslog', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::rsyslog'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, distribution-scope=public, com.redhat.component=openstack-rsyslog-container, summary=Red Hat OpenStack Platform 17.1 rsyslog, 
vcs-ref=38a223d7b691af709e0a5f628409462e34eea167, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, release=1, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 rsyslog, config_id=tripleo_puppet_step1, build-date=2025-07-21T12:58:40, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-rsyslog/images/17.1.9-1, io.openshift.tags=rhosp osp openstack osp-17.1) Oct 14 04:05:53 localhost podman[52867]: 2025-10-14 08:05:53.329507105 +0000 UTC m=+0.300769048 container attach 6da84e96fa92b609f00406101e268dd1331c0e896b503ade54cf93e0e5d81b9c (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=container-puppet-rsyslog, container_name=container-puppet-rsyslog, tcib_managed=true, config_id=tripleo_puppet_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.license_terms=https://www.redhat.com/agreements, batch=17.1_20250721.1, io.buildah.version=1.33.12, build-date=2025-07-21T12:58:40, com.redhat.component=openstack-rsyslog-container, vcs-type=git, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005486733', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,rsyslog::generate_concat,concat::fragment', 'NAME': 'rsyslog', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::rsyslog'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 rsyslog, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-rsyslog/images/17.1.9-1, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, release=1, description=Red Hat OpenStack Platform 17.1 rsyslog, distribution-scope=public, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-rsyslog, vendor=Red Hat, Inc., managed_by=tripleo_ansible, vcs-ref=38a223d7b691af709e0a5f628409462e34eea167, io.openshift.expose-services=, version=17.1.9) Oct 14 04:05:53 localhost systemd[1]: var-lib-containers-storage-overlay-02ae85124e4959ea5e505d3d23ffa956e944453d3a9644fd9b15fb0c07f7fbc0-merged.mount: Deactivated successfully. Oct 14 04:05:53 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-acd1c4ab6c1d8932ceea5284c5afe13514a697ecaf4356ac8c447e4780adbed8-userdata-shm.mount: Deactivated successfully. Oct 14 04:05:53 localhost systemd[1]: var-lib-containers-storage-overlay-279af57fe640a81799041eadaf076a38dc293fb9fa2d8ceac0fa223bb06cffab-merged.mount: Deactivated successfully. Oct 14 04:05:53 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-7c07f0587c79df4162acb2af02e0c77b7b00174934856f1af08337b0bb3d6f3c-userdata-shm.mount: Deactivated successfully. 
Oct 14 04:05:53 localhost systemd[1]: var-lib-containers-storage-overlay-d1668dbabecea61a717977938a99d4a46ffa99afa4505047a6e5a86838675946-merged.mount: Deactivated successfully. Oct 14 04:05:53 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e5e7c2638372aa10cb729f60ba8141c435263d73f9fdc0bea9ed569b948ad58a-userdata-shm.mount: Deactivated successfully. Oct 14 04:05:53 localhost systemd[1]: var-lib-containers-storage-overlay-66141e0355e434a1428da5b2027ef6192344d1c6afa950636647476e8925671b-merged.mount: Deactivated successfully. Oct 14 04:05:53 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e7a289a1f3de1ca338cb8413f958fb6b1eb7d6cab69c1586cdb10c1b26af17ba-userdata-shm.mount: Deactivated successfully. Oct 14 04:05:53 localhost podman[52985]: 2025-10-14 08:05:53.444138611 +0000 UTC m=+0.148451314 image pull registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1 Oct 14 04:05:53 localhost podman[52985]: 2025-10-14 08:05:53.555663319 +0000 UTC m=+0.259975922 container create bb72d998034803f1d22964fba0c703967fd4ffa64b183db41d3e23363af093c4 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=container-puppet-ovn_controller, com.redhat.license_terms=https://www.redhat.com/agreements, tcib_managed=true, vendor=Red Hat, Inc., distribution-scope=public, version=17.1.9, architecture=x86_64, config_id=tripleo_puppet_step1, io.buildah.version=1.33.12, container_name=container-puppet-ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, release=1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, batch=17.1_20250721.1, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 
'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005486733', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,vs_config,exec', 'NAME': 'ovn_controller', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::agents::ovn\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/etc/sysconfig/modules:/etc/sysconfig/modules', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, com.redhat.component=openstack-ovn-controller-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-07-21T13:28:44, name=rhosp17/openstack-ovn-controller, vcs-type=git, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible) Oct 14 04:05:53 localhost systemd[1]: Started libpod-conmon-bb72d998034803f1d22964fba0c703967fd4ffa64b183db41d3e23363af093c4.scope. Oct 14 04:05:53 localhost systemd[1]: Started libcrun container. 
Oct 14 04:05:53 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f2e190ad6809837b5cab304a57a4ee2a4332703e3d64c552aa0dce906fb85119/merged/etc/sysconfig/modules supports timestamps until 2038 (0x7fffffff) Oct 14 04:05:53 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f2e190ad6809837b5cab304a57a4ee2a4332703e3d64c552aa0dce906fb85119/merged/var/lib/config-data supports timestamps until 2038 (0x7fffffff) Oct 14 04:05:53 localhost puppet-user[52175]: Notice: /Stage[main]/Tripleo::Profile::Base::Nova::Migration::Client/File[/etc/nova/migration/identity]/content: content changed '{sha256}86610d84e745a3992358ae0b747297805d075492e5114c666fa08f8aecce7da0' to '{sha256}0fdf4bb00e72dcbd4ea68fd251936fe0a9549636ee60a27eac1ec895516e4cd2' Oct 14 04:05:53 localhost puppet-user[52175]: Notice: /Stage[main]/Tripleo::Profile::Base::Nova::Migration::Client/File_line[nova_ssh_port]/ensure: created Oct 14 04:05:53 localhost podman[52985]: 2025-10-14 08:05:53.620328159 +0000 UTC m=+0.324640752 container init bb72d998034803f1d22964fba0c703967fd4ffa64b183db41d3e23363af093c4 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=container-puppet-ovn_controller, config_id=tripleo_puppet_step1, io.buildah.version=1.33.12, build-date=2025-07-21T13:28:44, batch=17.1_20250721.1, io.openshift.expose-services=, com.redhat.component=openstack-ovn-controller-container, container_name=container-puppet-ovn_controller, tcib_managed=true, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, release=1, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, vcs-type=git, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': 
'/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005486733', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,vs_config,exec', 'NAME': 'ovn_controller', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::agents::ovn\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/etc/sysconfig/modules:/etc/sysconfig/modules', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, managed_by=tripleo_ansible, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, version=17.1.9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, summary=Red Hat OpenStack Platform 17.1 ovn-controller) Oct 14 04:05:53 localhost puppet-user[52175]: Notice: /Stage[main]/Tripleo::Profile::Base::Nova::Libvirt/File[/etc/sasl2/libvirt.conf]/content: content changed '{sha256}78510a0d6f14b269ddeb9f9638dfdfba9f976d370ee2ec04ba25352a8af6df35' to 
'{sha256}6d7bcae773217a30c0772f75d0d1b6d21f5d64e72853f5e3d91bb47799dbb7fe' Oct 14 04:05:53 localhost puppet-user[52175]: Warning: Empty environment setting 'TLS_PASSWORD' Oct 14 04:05:53 localhost puppet-user[52175]: (file: /etc/puppet/modules/tripleo/manifests/profile/base/nova/libvirt.pp, line: 182) Oct 14 04:05:53 localhost podman[52985]: 2025-10-14 08:05:53.626749941 +0000 UTC m=+0.331062534 container start bb72d998034803f1d22964fba0c703967fd4ffa64b183db41d3e23363af093c4 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=container-puppet-ovn_controller, maintainer=OpenStack TripleO Team, batch=17.1_20250721.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, io.openshift.expose-services=, config_id=tripleo_puppet_step1, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, release=1, io.buildah.version=1.33.12, name=rhosp17/openstack-ovn-controller, distribution-scope=public, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, io.openshift.tags=rhosp osp openstack osp-17.1, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005486733', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,vs_config,exec', 'NAME': 'ovn_controller', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::agents::ovn\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/etc/sysconfig/modules:/etc/sysconfig/modules', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, container_name=container-puppet-ovn_controller, vendor=Red Hat, Inc., version=17.1.9, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-07-21T13:28:44) Oct 14 04:05:53 localhost podman[52985]: 2025-10-14 08:05:53.626876965 +0000 UTC m=+0.331189558 container attach bb72d998034803f1d22964fba0c703967fd4ffa64b183db41d3e23363af093c4 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=container-puppet-ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., distribution-scope=public, release=1, managed_by=tripleo_ansible, config_id=tripleo_puppet_step1, name=rhosp17/openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=container-puppet-ovn_controller, version=17.1.9, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/agreements, tcib_managed=true, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, vcs-type=git, batch=17.1_20250721.1, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005486733', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,vs_config,exec', 'NAME': 'ovn_controller', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::agents::ovn\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/etc/sysconfig/modules:/etc/sysconfig/modules', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, build-date=2025-07-21T13:28:44, maintainer=OpenStack TripleO Team, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, io.buildah.version=1.33.12, com.redhat.component=openstack-ovn-controller-container) Oct 14 04:05:53 localhost puppet-user[52175]: Notice: 
/Stage[main]/Tripleo::Profile::Base::Nova::Libvirt/Exec[set libvirt sasl credentials]/returns: executed successfully Oct 14 04:05:53 localhost puppet-user[52175]: Notice: /Stage[main]/Tripleo::Profile::Base::Nova::Migration::Target/File[/etc/nova/migration/authorized_keys]/content: content changed '{sha256}0d05a8832f36c0517b84e9c3ad11069d531c7d2be5297661e5552fd29e3a5e47' to '{sha256}36f28826b5ffa6d226163e1590f039bf4106b313174c95fd46ed1b050a897488' Oct 14 04:05:53 localhost puppet-user[52175]: Notice: /Stage[main]/Tripleo::Profile::Base::Nova::Migration::Target/File_line[nova_migration_logindefs]/ensure: created Oct 14 04:05:53 localhost puppet-user[52175]: Notice: /Stage[main]/Nova::Workarounds/Nova_config[workarounds/never_download_image_if_on_rbd]/ensure: created Oct 14 04:05:53 localhost puppet-user[52175]: Notice: /Stage[main]/Nova::Workarounds/Nova_config[workarounds/disable_compute_service_check_for_ffu]/ensure: created Oct 14 04:05:53 localhost puppet-user[52175]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/ssl_only]/ensure: created Oct 14 04:05:53 localhost puppet-user[52175]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/my_ip]/ensure: created Oct 14 04:05:53 localhost puppet-user[52175]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/host]/ensure: created Oct 14 04:05:53 localhost puppet-user[52175]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/cpu_allocation_ratio]/ensure: created Oct 14 04:05:53 localhost puppet-user[52175]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/ram_allocation_ratio]/ensure: created Oct 14 04:05:53 localhost puppet-user[52175]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/disk_allocation_ratio]/ensure: created Oct 14 04:05:53 localhost puppet-user[52175]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/dhcp_domain]/ensure: created Oct 14 04:05:53 localhost puppet-user[52175]: Notice: /Stage[main]/Nova/Nova_config[vif_plug_ovs/ovsdb_connection]/ensure: created Oct 14 04:05:54 localhost puppet-user[52175]: Notice: 
/Stage[main]/Nova/Nova_config[notifications/notification_format]/ensure: created Oct 14 04:05:54 localhost puppet-user[52175]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/state_path]/ensure: created Oct 14 04:05:54 localhost puppet-user[52175]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/service_down_time]/ensure: created Oct 14 04:05:54 localhost puppet-user[52175]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/rootwrap_config]/ensure: created Oct 14 04:05:54 localhost puppet-user[52175]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/report_interval]/ensure: created Oct 14 04:05:54 localhost puppet-user[52175]: Notice: /Stage[main]/Nova/Nova_config[notifications/notify_on_state_change]/ensure: created Oct 14 04:05:54 localhost puppet-user[52175]: Notice: /Stage[main]/Nova/Nova_config[cinder/cross_az_attach]/ensure: created Oct 14 04:05:54 localhost puppet-user[52175]: Notice: /Stage[main]/Nova::Glance/Nova_config[glance/valid_interfaces]/ensure: created Oct 14 04:05:54 localhost puppet-user[52175]: Notice: /Stage[main]/Nova::Placement/Nova_config[placement/auth_type]/ensure: created Oct 14 04:05:54 localhost puppet-user[52175]: Notice: /Stage[main]/Nova::Placement/Nova_config[placement/auth_url]/ensure: created Oct 14 04:05:54 localhost puppet-user[52175]: Notice: /Stage[main]/Nova::Placement/Nova_config[placement/password]/ensure: created Oct 14 04:05:54 localhost puppet-user[52175]: Notice: /Stage[main]/Nova::Placement/Nova_config[placement/project_domain_name]/ensure: created Oct 14 04:05:54 localhost puppet-user[52175]: Notice: /Stage[main]/Nova::Placement/Nova_config[placement/project_name]/ensure: created Oct 14 04:05:54 localhost puppet-user[52175]: Notice: /Stage[main]/Nova::Placement/Nova_config[placement/user_domain_name]/ensure: created Oct 14 04:05:54 localhost puppet-user[52175]: Notice: /Stage[main]/Nova::Placement/Nova_config[placement/username]/ensure: created Oct 14 04:05:54 localhost puppet-user[52175]: Notice: 
/Stage[main]/Nova::Placement/Nova_config[placement/region_name]/ensure: created Oct 14 04:05:54 localhost puppet-user[52175]: Notice: /Stage[main]/Nova::Placement/Nova_config[placement/valid_interfaces]/ensure: created Oct 14 04:05:54 localhost puppet-user[52175]: Notice: /Stage[main]/Nova::Cinder/Nova_config[cinder/password]/ensure: created Oct 14 04:05:54 localhost puppet-user[52175]: Notice: /Stage[main]/Nova::Cinder/Nova_config[cinder/auth_type]/ensure: created Oct 14 04:05:54 localhost puppet-user[52175]: Notice: /Stage[main]/Nova::Cinder/Nova_config[cinder/auth_url]/ensure: created Oct 14 04:05:54 localhost puppet-user[52175]: Notice: /Stage[main]/Nova::Cinder/Nova_config[cinder/region_name]/ensure: created Oct 14 04:05:54 localhost puppet-user[52175]: Notice: /Stage[main]/Nova::Cinder/Nova_config[cinder/project_name]/ensure: created Oct 14 04:05:54 localhost puppet-user[52175]: Notice: /Stage[main]/Nova::Cinder/Nova_config[cinder/project_domain_name]/ensure: created Oct 14 04:05:54 localhost puppet-user[52175]: Notice: /Stage[main]/Nova::Cinder/Nova_config[cinder/username]/ensure: created Oct 14 04:05:54 localhost puppet-user[52175]: Notice: /Stage[main]/Nova::Cinder/Nova_config[cinder/user_domain_name]/ensure: created Oct 14 04:05:54 localhost puppet-user[52175]: Notice: /Stage[main]/Nova::Cinder/Nova_config[cinder/os_region_name]/ensure: created Oct 14 04:05:54 localhost puppet-user[52175]: Notice: /Stage[main]/Nova::Cinder/Nova_config[cinder/catalog_info]/ensure: created Oct 14 04:05:54 localhost puppet-user[52175]: Notice: /Stage[main]/Nova::Compute::Image_cache/Nova_config[image_cache/manager_interval]/ensure: created Oct 14 04:05:54 localhost puppet-user[52175]: Notice: /Stage[main]/Nova::Compute::Image_cache/Nova_config[image_cache/remove_unused_base_images]/ensure: created Oct 14 04:05:54 localhost puppet-user[52175]: Notice: /Stage[main]/Nova::Compute::Image_cache/Nova_config[image_cache/remove_unused_original_minimum_age_seconds]/ensure: created 
Oct 14 04:05:54 localhost puppet-user[52175]: Notice: /Stage[main]/Nova::Compute::Image_cache/Nova_config[image_cache/remove_unused_resized_minimum_age_seconds]/ensure: created Oct 14 04:05:54 localhost puppet-user[52175]: Notice: /Stage[main]/Nova::Compute::Image_cache/Nova_config[image_cache/precache_concurrency]/ensure: created Oct 14 04:05:54 localhost puppet-user[52747]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5 Oct 14 04:05:54 localhost puppet-user[52747]: (file: /etc/puppet/hiera.yaml) Oct 14 04:05:54 localhost puppet-user[52747]: Warning: Undefined variable '::deploy_config_name'; Oct 14 04:05:54 localhost puppet-user[52747]: (file & line not available) Oct 14 04:05:54 localhost puppet-user[52747]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html Oct 14 04:05:54 localhost puppet-user[52747]: (file & line not available) Oct 14 04:05:54 localhost puppet-user[52175]: Notice: /Stage[main]/Nova::Vendordata/Nova_config[vendordata_dynamic_auth/project_domain_name]/ensure: created Oct 14 04:05:54 localhost puppet-user[52175]: Notice: /Stage[main]/Nova::Vendordata/Nova_config[vendordata_dynamic_auth/user_domain_name]/ensure: created Oct 14 04:05:54 localhost puppet-user[52175]: Notice: /Stage[main]/Nova::Compute::Provider/Nova_config[compute/provider_config_location]/ensure: created Oct 14 04:05:54 localhost puppet-user[52175]: Notice: /Stage[main]/Nova::Compute::Provider/File[/etc/nova/provider_config]/ensure: created Oct 14 04:05:54 localhost puppet-user[52747]: Warning: Unknown variable: '::ceilometer::cache_backend'. (file: /etc/puppet/modules/ceilometer/manifests/cache.pp, line: 145, column: 39) Oct 14 04:05:54 localhost puppet-user[52747]: Warning: Unknown variable: '::ceilometer::memcache_servers'. 
(file: /etc/puppet/modules/ceilometer/manifests/cache.pp, line: 146, column: 39) Oct 14 04:05:54 localhost puppet-user[52747]: Warning: Unknown variable: '::ceilometer::cache_tls_enabled'. (file: /etc/puppet/modules/ceilometer/manifests/cache.pp, line: 147, column: 39) Oct 14 04:05:54 localhost puppet-user[52747]: Warning: Unknown variable: '::ceilometer::cache_tls_cafile'. (file: /etc/puppet/modules/ceilometer/manifests/cache.pp, line: 148, column: 39) Oct 14 04:05:54 localhost puppet-user[52747]: Warning: Unknown variable: '::ceilometer::cache_tls_certfile'. (file: /etc/puppet/modules/ceilometer/manifests/cache.pp, line: 149, column: 39) Oct 14 04:05:54 localhost puppet-user[52747]: Warning: Unknown variable: '::ceilometer::cache_tls_keyfile'. (file: /etc/puppet/modules/ceilometer/manifests/cache.pp, line: 150, column: 39) Oct 14 04:05:54 localhost puppet-user[52747]: Warning: Unknown variable: '::ceilometer::cache_tls_allowed_ciphers'. (file: /etc/puppet/modules/ceilometer/manifests/cache.pp, line: 151, column: 39) Oct 14 04:05:54 localhost puppet-user[52747]: Warning: Unknown variable: '::ceilometer::manage_backend_package'. (file: /etc/puppet/modules/ceilometer/manifests/cache.pp, line: 152, column: 39) Oct 14 04:05:54 localhost puppet-user[52747]: Warning: Unknown variable: '::ceilometer::agent::auth::auth_password'. (file: /etc/puppet/modules/ceilometer/manifests/agent/service_credentials.pp, line: 63, column: 25) Oct 14 04:05:54 localhost puppet-user[52747]: Warning: Unknown variable: '::ceilometer::agent::auth::auth_url'. (file: /etc/puppet/modules/ceilometer/manifests/agent/service_credentials.pp, line: 68, column: 25) Oct 14 04:05:54 localhost puppet-user[52747]: Warning: Unknown variable: '::ceilometer::agent::auth::auth_region'. (file: /etc/puppet/modules/ceilometer/manifests/agent/service_credentials.pp, line: 69, column: 28) Oct 14 04:05:54 localhost puppet-user[52747]: Warning: Unknown variable: '::ceilometer::agent::auth::auth_user'. 
(file: /etc/puppet/modules/ceilometer/manifests/agent/service_credentials.pp, line: 70, column: 25) Oct 14 04:05:54 localhost puppet-user[52747]: Warning: Unknown variable: '::ceilometer::agent::auth::auth_tenant_name'. (file: /etc/puppet/modules/ceilometer/manifests/agent/service_credentials.pp, line: 71, column: 29) Oct 14 04:05:54 localhost puppet-user[52747]: Warning: Unknown variable: '::ceilometer::agent::auth::auth_cacert'. (file: /etc/puppet/modules/ceilometer/manifests/agent/service_credentials.pp, line: 72, column: 23) Oct 14 04:05:54 localhost puppet-user[52747]: Warning: Unknown variable: '::ceilometer::agent::auth::auth_endpoint_type'. (file: /etc/puppet/modules/ceilometer/manifests/agent/service_credentials.pp, line: 73, column: 26) Oct 14 04:05:54 localhost puppet-user[52747]: Warning: Unknown variable: '::ceilometer::agent::auth::auth_user_domain_name'. (file: /etc/puppet/modules/ceilometer/manifests/agent/service_credentials.pp, line: 74, column: 33) Oct 14 04:05:54 localhost puppet-user[52747]: Warning: Unknown variable: '::ceilometer::agent::auth::auth_project_domain_name'. (file: /etc/puppet/modules/ceilometer/manifests/agent/service_credentials.pp, line: 75, column: 36) Oct 14 04:05:54 localhost puppet-user[52747]: Warning: Unknown variable: '::ceilometer::agent::auth::auth_type'. 
(file: /etc/puppet/modules/ceilometer/manifests/agent/service_credentials.pp, line: 76, column: 26) Oct 14 04:05:54 localhost puppet-user[52175]: Notice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/use_cow_images]/ensure: created Oct 14 04:05:54 localhost puppet-user[52175]: Notice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/mkisofs_cmd]/ensure: created Oct 14 04:05:54 localhost puppet-user[52175]: Notice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/force_raw_images]/ensure: created Oct 14 04:05:54 localhost puppet-user[52175]: Notice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/reserved_host_memory_mb]/ensure: created Oct 14 04:05:54 localhost puppet-user[52175]: Notice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/reserved_huge_pages]/ensure: created Oct 14 04:05:54 localhost puppet-user[52747]: Notice: Compiled catalog for np0005486733.localdomain in environment production in 0.37 seconds Oct 14 04:05:54 localhost puppet-user[52175]: Notice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/resume_guests_state_on_host_boot]/ensure: created Oct 14 04:05:54 localhost puppet-user[52175]: Notice: /Stage[main]/Nova::Compute/Nova_config[key_manager/backend]/ensure: created Oct 14 04:05:54 localhost puppet-user[52175]: Notice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/sync_power_state_interval]/ensure: created Oct 14 04:05:54 localhost puppet-user[52747]: Notice: /Stage[main]/Ceilometer/Ceilometer_config[DEFAULT/http_timeout]/ensure: created Oct 14 04:05:54 localhost puppet-user[52175]: Notice: /Stage[main]/Nova::Compute/Nova_config[compute/consecutive_build_service_disable_threshold]/ensure: created Oct 14 04:05:54 localhost puppet-user[52747]: Notice: /Stage[main]/Ceilometer/Ceilometer_config[DEFAULT/host]/ensure: created Oct 14 04:05:54 localhost puppet-user[52747]: Notice: /Stage[main]/Ceilometer/Ceilometer_config[publisher/telemetry_secret]/ensure: created Oct 14 04:05:54 localhost puppet-user[52747]: Notice: 
/Stage[main]/Ceilometer/Ceilometer_config[hardware/readonly_user_name]/ensure: created Oct 14 04:05:54 localhost puppet-user[52175]: Notice: /Stage[main]/Nova::Compute/Nova_config[compute/live_migration_wait_for_vif_plug]/ensure: created Oct 14 04:05:54 localhost puppet-user[52747]: Notice: /Stage[main]/Ceilometer/Ceilometer_config[hardware/readonly_user_password]/ensure: created Oct 14 04:05:54 localhost puppet-user[52747]: Notice: /Stage[main]/Ceilometer::Agent::Service_credentials/Ceilometer_config[service_credentials/auth_url]/ensure: created Oct 14 04:05:55 localhost puppet-user[52747]: Notice: /Stage[main]/Ceilometer::Agent::Service_credentials/Ceilometer_config[service_credentials/region_name]/ensure: created Oct 14 04:05:55 localhost puppet-user[52175]: Notice: /Stage[main]/Nova::Compute/Nova_config[compute/max_disk_devices_to_attach]/ensure: created Oct 14 04:05:55 localhost puppet-user[52747]: Notice: /Stage[main]/Ceilometer::Agent::Service_credentials/Ceilometer_config[service_credentials/username]/ensure: created Oct 14 04:05:55 localhost puppet-user[52747]: Notice: /Stage[main]/Ceilometer::Agent::Service_credentials/Ceilometer_config[service_credentials/password]/ensure: created Oct 14 04:05:55 localhost puppet-user[52747]: Notice: /Stage[main]/Ceilometer::Agent::Service_credentials/Ceilometer_config[service_credentials/project_name]/ensure: created Oct 14 04:05:55 localhost puppet-user[52747]: Notice: /Stage[main]/Ceilometer::Agent::Service_credentials/Ceilometer_config[service_credentials/interface]/ensure: created Oct 14 04:05:55 localhost puppet-user[52747]: Notice: /Stage[main]/Ceilometer::Agent::Service_credentials/Ceilometer_config[service_credentials/user_domain_name]/ensure: created Oct 14 04:05:55 localhost puppet-user[52747]: Notice: /Stage[main]/Ceilometer::Agent::Service_credentials/Ceilometer_config[service_credentials/project_domain_name]/ensure: created Oct 14 04:05:55 localhost puppet-user[52747]: Notice: 
/Stage[main]/Ceilometer::Agent::Service_credentials/Ceilometer_config[service_credentials/auth_type]/ensure: created Oct 14 04:05:55 localhost puppet-user[52747]: Notice: /Stage[main]/Ceilometer::Agent::Polling/Ceilometer_config[compute/instance_discovery_method]/ensure: created Oct 14 04:05:55 localhost puppet-user[52747]: Notice: /Stage[main]/Ceilometer::Agent::Polling/Ceilometer_config[DEFAULT/polling_namespaces]/ensure: created Oct 14 04:05:55 localhost puppet-user[52747]: Notice: /Stage[main]/Ceilometer::Agent::Polling/Ceilometer_config[polling/tenant_name_discovery]/ensure: created Oct 14 04:05:55 localhost puppet-user[52747]: Notice: /Stage[main]/Ceilometer::Agent::Polling/Ceilometer_config[coordination/backend_url]/ensure: created Oct 14 04:05:55 localhost puppet-user[52175]: Notice: /Stage[main]/Nova::Vncproxy::Common/Nova_config[vnc/novncproxy_base_url]/ensure: created Oct 14 04:05:55 localhost puppet-user[52747]: Notice: /Stage[main]/Ceilometer::Cache/Oslo::Cache[ceilometer_config]/Ceilometer_config[cache/backend]/ensure: created Oct 14 04:05:55 localhost puppet-user[52747]: Notice: /Stage[main]/Ceilometer::Cache/Oslo::Cache[ceilometer_config]/Ceilometer_config[cache/enabled]/ensure: created Oct 14 04:05:55 localhost puppet-user[52747]: Notice: /Stage[main]/Ceilometer::Cache/Oslo::Cache[ceilometer_config]/Ceilometer_config[cache/memcache_servers]/ensure: created Oct 14 04:05:55 localhost puppet-user[52175]: Notice: /Stage[main]/Nova::Compute/Nova_config[vnc/server_proxyclient_address]/ensure: created Oct 14 04:05:55 localhost puppet-user[52175]: Notice: /Stage[main]/Nova::Compute/Nova_config[vnc/enabled]/ensure: created Oct 14 04:05:55 localhost puppet-user[52175]: Notice: /Stage[main]/Nova::Compute/Nova_config[spice/enabled]/ensure: created Oct 14 04:05:55 localhost puppet-user[52747]: Notice: /Stage[main]/Ceilometer::Cache/Oslo::Cache[ceilometer_config]/Ceilometer_config[cache/tls_enabled]/ensure: created Oct 14 04:05:55 localhost puppet-user[52175]: 
Notice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/instance_usage_audit]/ensure: created Oct 14 04:05:55 localhost puppet-user[53018]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5 Oct 14 04:05:55 localhost puppet-user[53018]: (file: /etc/puppet/hiera.yaml) Oct 14 04:05:55 localhost puppet-user[53018]: Warning: Undefined variable '::deploy_config_name'; Oct 14 04:05:55 localhost puppet-user[53018]: (file & line not available) Oct 14 04:05:55 localhost puppet-user[52175]: Notice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/instance_usage_audit_period]/ensure: created Oct 14 04:05:55 localhost puppet-user[52747]: Notice: /Stage[main]/Ceilometer/Oslo::Messaging::Rabbit[ceilometer_config]/Ceilometer_config[oslo_messaging_rabbit/heartbeat_in_pthread]/ensure: created Oct 14 04:05:55 localhost puppet-user[52175]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[DEFAULT/vif_plugging_is_fatal]/ensure: created Oct 14 04:05:55 localhost puppet-user[53018]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. 
See https://puppet.com/docs/puppet/7.10/deprecated_language.html Oct 14 04:05:55 localhost puppet-user[53018]: (file & line not available) Oct 14 04:05:55 localhost puppet-user[52175]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[DEFAULT/vif_plugging_timeout]/ensure: created Oct 14 04:05:55 localhost puppet-user[52175]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/default_floating_pool]/ensure: created Oct 14 04:05:55 localhost puppet-user[52175]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/timeout]/ensure: created Oct 14 04:05:55 localhost puppet-user[52747]: Notice: /Stage[main]/Ceilometer/Oslo::Messaging::Amqp[ceilometer_config]/Ceilometer_config[oslo_messaging_amqp/rpc_address_prefix]/ensure: created Oct 14 04:05:55 localhost puppet-user[52747]: Notice: /Stage[main]/Ceilometer/Oslo::Messaging::Amqp[ceilometer_config]/Ceilometer_config[oslo_messaging_amqp/notify_address_prefix]/ensure: created Oct 14 04:05:55 localhost puppet-user[52175]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/project_name]/ensure: created Oct 14 04:05:55 localhost puppet-user[52175]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/project_domain_name]/ensure: created Oct 14 04:05:55 localhost puppet-user[52175]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/region_name]/ensure: created Oct 14 04:05:55 localhost puppet-user[52175]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/username]/ensure: created Oct 14 04:05:55 localhost puppet-user[52175]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/user_domain_name]/ensure: created Oct 14 04:05:55 localhost puppet-user[52175]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/password]/ensure: created Oct 14 04:05:55 localhost puppet-user[52175]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/auth_url]/ensure: created Oct 14 04:05:55 localhost puppet-user[52175]: Notice: 
/Stage[main]/Nova::Network::Neutron/Nova_config[neutron/valid_interfaces]/ensure: created Oct 14 04:05:55 localhost puppet-user[52175]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/ovs_bridge]/ensure: created Oct 14 04:05:55 localhost puppet-user[52175]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/extension_sync_interval]/ensure: created Oct 14 04:05:55 localhost puppet-user[52747]: Notice: /Stage[main]/Ceilometer/Oslo::Messaging::Notifications[ceilometer_config]/Ceilometer_config[oslo_messaging_notifications/driver]/ensure: created Oct 14 04:05:55 localhost puppet-user[52747]: Notice: /Stage[main]/Ceilometer/Oslo::Messaging::Notifications[ceilometer_config]/Ceilometer_config[oslo_messaging_notifications/transport_url]/ensure: created Oct 14 04:05:55 localhost puppet-user[52747]: Notice: /Stage[main]/Ceilometer/Oslo::Messaging::Notifications[ceilometer_config]/Ceilometer_config[oslo_messaging_notifications/topics]/ensure: created Oct 14 04:05:55 localhost puppet-user[52175]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/auth_type]/ensure: created Oct 14 04:05:55 localhost puppet-user[52747]: Notice: /Stage[main]/Ceilometer/Oslo::Messaging::Default[ceilometer_config]/Ceilometer_config[DEFAULT/transport_url]/ensure: created Oct 14 04:05:55 localhost puppet-user[52747]: Notice: /Stage[main]/Ceilometer::Logging/Oslo::Log[ceilometer_config]/Ceilometer_config[DEFAULT/debug]/ensure: created Oct 14 04:05:55 localhost puppet-user[52175]: Notice: /Stage[main]/Nova::Migration::Libvirt/Nova_config[libvirt/live_migration_uri]/ensure: created Oct 14 04:05:55 localhost puppet-user[52747]: Notice: /Stage[main]/Ceilometer::Logging/Oslo::Log[ceilometer_config]/Ceilometer_config[DEFAULT/log_dir]/ensure: created Oct 14 04:05:55 localhost puppet-user[52175]: Notice: /Stage[main]/Nova::Migration::Libvirt/Nova_config[libvirt/live_migration_tunnelled]/ensure: created Oct 14 04:05:55 localhost puppet-user[53018]: Notice: Compiled 
catalog for np0005486733.localdomain in environment production in 0.24 seconds Oct 14 04:05:55 localhost puppet-user[52175]: Notice: /Stage[main]/Nova::Migration::Libvirt/Nova_config[libvirt/live_migration_inbound_addr]/ensure: created Oct 14 04:05:55 localhost puppet-user[52747]: Notice: Applied catalog in 0.41 seconds Oct 14 04:05:55 localhost puppet-user[52747]: Application: Oct 14 04:05:55 localhost puppet-user[52747]: Initial environment: production Oct 14 04:05:55 localhost puppet-user[52747]: Converged environment: production Oct 14 04:05:55 localhost puppet-user[52747]: Run mode: user Oct 14 04:05:55 localhost puppet-user[52747]: Changes: Oct 14 04:05:55 localhost puppet-user[52747]: Total: 31 Oct 14 04:05:55 localhost puppet-user[52747]: Events: Oct 14 04:05:55 localhost puppet-user[52747]: Success: 31 Oct 14 04:05:55 localhost puppet-user[52747]: Total: 31 Oct 14 04:05:55 localhost puppet-user[52747]: Resources: Oct 14 04:05:55 localhost puppet-user[52747]: Skipped: 22 Oct 14 04:05:55 localhost puppet-user[52747]: Changed: 31 Oct 14 04:05:55 localhost puppet-user[52747]: Out of sync: 31 Oct 14 04:05:55 localhost puppet-user[52747]: Total: 151 Oct 14 04:05:55 localhost puppet-user[52747]: Time: Oct 14 04:05:55 localhost puppet-user[52747]: Package: 0.02 Oct 14 04:05:55 localhost puppet-user[52747]: Ceilometer config: 0.30 Oct 14 04:05:55 localhost puppet-user[52747]: Transaction evaluation: 0.40 Oct 14 04:05:55 localhost puppet-user[52747]: Catalog application: 0.41 Oct 14 04:05:55 localhost puppet-user[52747]: Config retrieval: 0.44 Oct 14 04:05:55 localhost puppet-user[52747]: Last run: 1760429155 Oct 14 04:05:55 localhost puppet-user[52747]: Resources: 0.00 Oct 14 04:05:55 localhost puppet-user[52747]: Total: 0.41 Oct 14 04:05:55 localhost puppet-user[52747]: Version: Oct 14 04:05:55 localhost puppet-user[52747]: Config: 1760429154 Oct 14 04:05:55 localhost puppet-user[52747]: Puppet: 7.10.0 Oct 14 04:05:55 localhost puppet-user[52175]: Notice: 
/Stage[main]/Nova::Migration::Libvirt/Nova_config[libvirt/live_migration_permit_post_copy]/ensure: created Oct 14 04:05:55 localhost puppet-user[52175]: Notice: /Stage[main]/Nova::Migration::Libvirt/Nova_config[libvirt/live_migration_permit_auto_converge]/ensure: created Oct 14 04:05:55 localhost puppet-user[52175]: Notice: /Stage[main]/Nova::Migration::Libvirt/Virtproxyd_config[listen_tls]/ensure: created Oct 14 04:05:55 localhost puppet-user[52175]: Notice: /Stage[main]/Nova::Migration::Libvirt/Virtproxyd_config[listen_tcp]/ensure: created Oct 14 04:05:55 localhost puppet-user[52175]: Notice: /Stage[main]/Nova::Compute::Rbd/Nova_config[libvirt/rbd_user]/ensure: created Oct 14 04:05:55 localhost puppet-user[52175]: Notice: /Stage[main]/Nova::Compute::Rbd/Nova_config[libvirt/rbd_secret_uuid]/ensure: created Oct 14 04:05:55 localhost puppet-user[52175]: Notice: /Stage[main]/Nova::Compute::Rbd/File[/etc/nova/secret.xml]/ensure: defined content as '{sha256}12e84d657e52aba69da43e57ae7a44cbb966f3f84d32b3c865ab366a8f5b2c46' Oct 14 04:05:55 localhost puppet-user[52175]: Notice: /Stage[main]/Nova::Compute::Rbd/Nova_config[libvirt/images_type]/ensure: created Oct 14 04:05:55 localhost puppet-user[52175]: Notice: /Stage[main]/Nova::Compute::Rbd/Nova_config[libvirt/images_rbd_pool]/ensure: created Oct 14 04:05:55 localhost puppet-user[52175]: Notice: /Stage[main]/Nova::Compute::Rbd/Nova_config[libvirt/images_rbd_ceph_conf]/ensure: created Oct 14 04:05:55 localhost puppet-user[52175]: Notice: /Stage[main]/Nova::Compute::Rbd/Nova_config[libvirt/images_rbd_glance_store_name]/ensure: created Oct 14 04:05:55 localhost puppet-user[52175]: Notice: /Stage[main]/Nova::Compute::Rbd/Nova_config[libvirt/images_rbd_glance_copy_poll_interval]/ensure: created Oct 14 04:05:55 localhost puppet-user[52175]: Notice: /Stage[main]/Nova::Compute::Rbd/Nova_config[libvirt/images_rbd_glance_copy_timeout]/ensure: created Oct 14 04:05:55 localhost puppet-user[53018]: Notice: 
/Stage[main]/Rsyslog::Base/File[/etc/rsyslog.conf]/content: content changed '{sha256}d6f679f6a4eb6f33f9fc20c846cb30bef93811e1c86bc4da1946dc3100b826c3' to '{sha256}7963bd801fadd49a17561f4d3f80738c3f504b413b11c443432d8303138041f2' Oct 14 04:05:55 localhost puppet-user[53018]: Notice: /Stage[main]/Rsyslog::Config::Global/Rsyslog::Component::Global_config[MaxMessageSize]/Rsyslog::Generate_concat[rsyslog::concat::global_config::MaxMessageSize]/Concat[/etc/rsyslog.d/00_rsyslog.conf]/File[/etc/rsyslog.d/00_rsyslog.conf]/ensure: defined content as '{sha256}a291d5cc6d5884a978161f4c7b5831d43edd07797cc590bae366e7f150b8643b' Oct 14 04:05:55 localhost puppet-user[53018]: Notice: /Stage[main]/Rsyslog::Config::Templates/Rsyslog::Component::Template[rsyslog-node-index]/Rsyslog::Generate_concat[rsyslog::concat::template::rsyslog-node-index]/Concat[/etc/rsyslog.d/50_openstack_logs.conf]/File[/etc/rsyslog.d/50_openstack_logs.conf]/ensure: defined content as '{sha256}f3c4d1793bf2548f76b612b414cb747e06475c87f12d2b30229fa21d9f9f3a9d' Oct 14 04:05:55 localhost puppet-user[52175]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[DEFAULT/compute_driver]/ensure: created Oct 14 04:05:55 localhost puppet-user[53018]: Notice: Applied catalog in 0.12 seconds Oct 14 04:05:55 localhost puppet-user[53018]: Application: Oct 14 04:05:55 localhost puppet-user[53018]: Initial environment: production Oct 14 04:05:55 localhost puppet-user[53018]: Converged environment: production Oct 14 04:05:55 localhost puppet-user[53018]: Run mode: user Oct 14 04:05:55 localhost puppet-user[53018]: Changes: Oct 14 04:05:55 localhost puppet-user[53018]: Total: 3 Oct 14 04:05:55 localhost puppet-user[53018]: Events: Oct 14 04:05:55 localhost puppet-user[53018]: Success: 3 Oct 14 04:05:55 localhost puppet-user[53018]: Total: 3 Oct 14 04:05:55 localhost puppet-user[53018]: Resources: Oct 14 04:05:55 localhost puppet-user[53018]: Skipped: 11 Oct 14 04:05:55 localhost puppet-user[53018]: Changed: 3 Oct 14 04:05:55 
localhost puppet-user[53018]: Out of sync: 3 Oct 14 04:05:55 localhost puppet-user[53018]: Total: 25 Oct 14 04:05:55 localhost puppet-user[53018]: Time: Oct 14 04:05:55 localhost puppet-user[53018]: Concat file: 0.00 Oct 14 04:05:55 localhost puppet-user[53018]: Concat fragment: 0.00 Oct 14 04:05:55 localhost puppet-user[53018]: File: 0.02 Oct 14 04:05:55 localhost puppet-user[53018]: Transaction evaluation: 0.12 Oct 14 04:05:55 localhost puppet-user[53018]: Catalog application: 0.12 Oct 14 04:05:55 localhost puppet-user[53018]: Config retrieval: 0.29 Oct 14 04:05:55 localhost puppet-user[53018]: Last run: 1760429155 Oct 14 04:05:55 localhost puppet-user[53018]: Total: 0.12 Oct 14 04:05:55 localhost puppet-user[53018]: Version: Oct 14 04:05:55 localhost puppet-user[53018]: Config: 1760429155 Oct 14 04:05:55 localhost puppet-user[53018]: Puppet: 7.10.0 Oct 14 04:05:55 localhost puppet-user[52175]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[DEFAULT/preallocate_images]/ensure: created Oct 14 04:05:55 localhost puppet-user[52175]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[vnc/server_listen]/ensure: created Oct 14 04:05:55 localhost puppet-user[53075]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5 Oct 14 04:05:55 localhost puppet-user[53075]: (file: /etc/puppet/hiera.yaml) Oct 14 04:05:55 localhost puppet-user[53075]: Warning: Undefined variable '::deploy_config_name'; Oct 14 04:05:55 localhost puppet-user[53075]: (file & line not available) Oct 14 04:05:55 localhost puppet-user[52175]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/virt_type]/ensure: created Oct 14 04:05:55 localhost puppet-user[52175]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/cpu_mode]/ensure: created Oct 14 04:05:55 localhost puppet-user[53075]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. 
See https://puppet.com/docs/puppet/7.10/deprecated_language.html Oct 14 04:05:55 localhost puppet-user[53075]: (file & line not available) Oct 14 04:05:55 localhost puppet-user[52175]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/inject_password]/ensure: created Oct 14 04:05:55 localhost puppet-user[52175]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/inject_key]/ensure: created Oct 14 04:05:55 localhost puppet-user[52175]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/inject_partition]/ensure: created Oct 14 04:05:55 localhost puppet-user[52175]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/hw_disk_discard]/ensure: created Oct 14 04:05:55 localhost puppet-user[52175]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/hw_machine_type]/ensure: created Oct 14 04:05:55 localhost puppet-user[52175]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/enabled_perf_events]/ensure: created Oct 14 04:05:55 localhost puppet-user[52175]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/rx_queue_size]/ensure: created Oct 14 04:05:55 localhost puppet-user[52175]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/tx_queue_size]/ensure: created Oct 14 04:05:55 localhost puppet-user[52175]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/file_backed_memory]/ensure: created Oct 14 04:05:55 localhost puppet-user[52175]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/volume_use_multipath]/ensure: created Oct 14 04:05:55 localhost puppet-user[52175]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/num_pcie_ports]/ensure: created Oct 14 04:05:55 localhost puppet-user[52175]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/mem_stats_period_seconds]/ensure: created Oct 14 04:05:55 localhost puppet-user[52175]: Notice: 
/Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/pmem_namespaces]/ensure: created Oct 14 04:05:55 localhost puppet-user[53075]: Notice: Compiled catalog for np0005486733.localdomain in environment production in 0.22 seconds Oct 14 04:05:55 localhost puppet-user[52175]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/swtpm_enabled]/ensure: created Oct 14 04:05:55 localhost systemd[1]: libpod-d76960b3ef10f2e9e25c8f677807e95f6772d682b73e83100105989c556baa57.scope: Deactivated successfully. Oct 14 04:05:55 localhost systemd[1]: libpod-d76960b3ef10f2e9e25c8f677807e95f6772d682b73e83100105989c556baa57.scope: Consumed 2.970s CPU time. Oct 14 04:05:55 localhost podman[52645]: 2025-10-14 08:05:55.755598053 +0000 UTC m=+3.420550858 container died d76960b3ef10f2e9e25c8f677807e95f6772d682b73e83100105989c556baa57 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1, name=container-puppet-ceilometer, maintainer=OpenStack TripleO Team, tcib_managed=true, architecture=x86_64, distribution-scope=public, com.redhat.component=openstack-ceilometer-central-container, vcs-type=git, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005486733', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,ceilometer_config,ceilometer_config', 'NAME': 'ceilometer', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::ceilometer::agent::polling\ninclude tripleo::profile::base::ceilometer::agent::polling\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, description=Red Hat OpenStack Platform 17.1 ceilometer-central, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-central/images/17.1.9-1, vcs-ref=1ce3db7211bdafb9cc5e59a103488bd6a8dc3f2f, config_id=tripleo_puppet_step1, io.openshift.expose-services=, container_name=container-puppet-ceilometer, batch=17.1_20250721.1, release=1, summary=Red Hat OpenStack Platform 17.1 ceilometer-central, name=rhosp17/openstack-ceilometer-central, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-central, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1, version=17.1.9, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-central, build-date=2025-07-21T14:49:23) Oct 14 04:05:55 localhost puppet-user[52175]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/cpu_model_extra_flags]/ensure: created Oct 14 04:05:55 localhost puppet-user[52175]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/disk_cachemodes]/ensure: created Oct 14 04:05:55 localhost puppet-user[52175]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtlogd/Virtlogd_config[log_filters]/ensure: created Oct 14 04:05:55 localhost puppet-user[52175]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtlogd/Virtlogd_config[log_outputs]/ensure: 
created Oct 14 04:05:55 localhost ovs-vsctl[53412]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-remote=tcp:172.17.0.103:6642,tcp:172.17.0.104:6642,tcp:172.17.0.105:6642 Oct 14 04:05:55 localhost puppet-user[52175]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtproxyd/Virtproxyd_config[log_filters]/ensure: created Oct 14 04:05:55 localhost puppet-user[52175]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtproxyd/Virtproxyd_config[log_outputs]/ensure: created Oct 14 04:05:55 localhost puppet-user[53075]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-remote]/ensure: created Oct 14 04:05:55 localhost puppet-user[52175]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtqemud/Virtqemud_config[log_filters]/ensure: created Oct 14 04:05:55 localhost puppet-user[52175]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtqemud/Virtqemud_config[log_outputs]/ensure: created Oct 14 04:05:55 localhost puppet-user[52175]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtnodedevd/Virtnodedevd_config[log_filters]/ensure: created Oct 14 04:05:55 localhost puppet-user[52175]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtnodedevd/Virtnodedevd_config[log_outputs]/ensure: created Oct 14 04:05:55 localhost systemd[1]: libpod-6da84e96fa92b609f00406101e268dd1331c0e896b503ade54cf93e0e5d81b9c.scope: Deactivated successfully. Oct 14 04:05:55 localhost systemd[1]: libpod-6da84e96fa92b609f00406101e268dd1331c0e896b503ade54cf93e0e5d81b9c.scope: Consumed 2.390s CPU time. Oct 14 04:05:55 localhost ovs-vsctl[53421]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . 
external_ids:ovn-encap-type=geneve Oct 14 04:05:55 localhost puppet-user[52175]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtstoraged/Virtstoraged_config[log_filters]/ensure: created Oct 14 04:05:55 localhost puppet-user[52175]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtstoraged/Virtstoraged_config[log_outputs]/ensure: created Oct 14 04:05:55 localhost puppet-user[53075]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-encap-type]/ensure: created Oct 14 04:05:55 localhost puppet-user[52175]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtsecretd/Virtsecretd_config[log_filters]/ensure: created Oct 14 04:05:55 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d76960b3ef10f2e9e25c8f677807e95f6772d682b73e83100105989c556baa57-userdata-shm.mount: Deactivated successfully. Oct 14 04:05:55 localhost systemd[1]: var-lib-containers-storage-overlay-215025152e7486dca6aa506e7e941c98eca167be4a4853b2a3771ef4f2b39afc-merged.mount: Deactivated successfully. 
Oct 14 04:05:55 localhost podman[52867]: 2025-10-14 08:05:55.843647171 +0000 UTC m=+2.814909094 container died 6da84e96fa92b609f00406101e268dd1331c0e896b503ade54cf93e0e5d81b9c (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=container-puppet-rsyslog, architecture=x86_64, com.redhat.component=openstack-rsyslog-container, summary=Red Hat OpenStack Platform 17.1 rsyslog, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, container_name=container-puppet-rsyslog, managed_by=tripleo_ansible, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005486733', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,rsyslog::generate_concat,concat::fragment', 'NAME': 'rsyslog', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::rsyslog'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, release=1, description=Red Hat OpenStack Platform 17.1 rsyslog, name=rhosp17/openstack-rsyslog, 
batch=17.1_20250721.1, vcs-type=git, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, version=17.1.9, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, io.openshift.tags=rhosp osp openstack osp-17.1, io.buildah.version=1.33.12, vcs-ref=38a223d7b691af709e0a5f628409462e34eea167, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-rsyslog/images/17.1.9-1, tcib_managed=true, config_id=tripleo_puppet_step1, maintainer=OpenStack TripleO Team, build-date=2025-07-21T12:58:40) Oct 14 04:05:55 localhost puppet-user[52175]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtsecretd/Virtsecretd_config[log_outputs]/ensure: created Oct 14 04:05:55 localhost ovs-vsctl[53426]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-encap-ip=172.19.0.108 Oct 14 04:05:55 localhost puppet-user[53075]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-encap-ip]/ensure: created Oct 14 04:05:55 localhost puppet-user[52175]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtnodedevd_config[unix_sock_group]/ensure: created Oct 14 04:05:55 localhost puppet-user[52175]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtnodedevd_config[auth_unix_ro]/ensure: created Oct 14 04:05:55 localhost puppet-user[52175]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtnodedevd_config[auth_unix_rw]/ensure: created Oct 14 04:05:55 localhost puppet-user[52175]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtnodedevd_config[unix_sock_ro_perms]/ensure: created Oct 14 04:05:55 localhost ovs-vsctl[53434]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . 
external_ids:hostname=np0005486733.localdomain Oct 14 04:05:55 localhost podman[53396]: 2025-10-14 08:05:55.889150706 +0000 UTC m=+0.122240077 container cleanup d76960b3ef10f2e9e25c8f677807e95f6772d682b73e83100105989c556baa57 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1, name=container-puppet-ceilometer, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, container_name=container-puppet-ceilometer, io.buildah.version=1.33.12, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhosp17/openstack-ceilometer-central, version=17.1.9, vendor=Red Hat, Inc., batch=17.1_20250721.1, config_id=tripleo_puppet_step1, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-central, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-central/images/17.1.9-1, vcs-ref=1ce3db7211bdafb9cc5e59a103488bd6a8dc3f2f, release=1, summary=Red Hat OpenStack Platform 17.1 ceilometer-central, build-date=2025-07-21T14:49:23, com.redhat.component=openstack-ceilometer-central-container, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-central, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-central, vcs-type=git, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005486733', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,ceilometer_config,ceilometer_config', 'NAME': 'ceilometer', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::ceilometer::agent::polling\ninclude tripleo::profile::base::ceilometer::agent::polling\n'}, 'net': ['host'], 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}) Oct 14 04:05:55 localhost systemd[1]: libpod-conmon-d76960b3ef10f2e9e25c8f677807e95f6772d682b73e83100105989c556baa57.scope: Deactivated successfully. Oct 14 04:05:55 localhost python3[51855]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name container-puppet-ceilometer --conmon-pidfile /run/container-puppet-ceilometer.pid --detach=False --entrypoint /var/lib/container-puppet/container-puppet.sh --env STEP=6 --env NET_HOST=true --env DEBUG=true --env HOSTNAME=np0005486733 --env NO_ARCHIVE= --env PUPPET_TAGS=file,file_line,concat,augeas,cron,ceilometer_config,ceilometer_config --env NAME=ceilometer --env STEP_CONFIG=include ::tripleo::packages#012include tripleo::profile::base::ceilometer::agent::polling#012include tripleo::profile::base::ceilometer::agent::polling#012 --label config_id=tripleo_puppet_step1 --label container_name=container-puppet-ceilometer --label managed_by=tripleo_ansible --label config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 
'true', 'HOSTNAME': 'np0005486733', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,ceilometer_config,ceilometer_config', 'NAME': 'ceilometer', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::ceilometer::agent::polling\ninclude tripleo::profile::base::ceilometer::agent::polling\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/container-puppet-ceilometer.log --network host --security-opt label=disable --user 0 --volume /dev/log:/dev/log:rw --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /var/lib/config-data:/var/lib/config-data:rw --volume 
/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro --volume /var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro --volume /var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1 Oct 14 04:05:55 localhost puppet-user[52175]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtnodedevd_config[unix_sock_rw_perms]/ensure: created Oct 14 04:05:55 localhost puppet-user[53075]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:hostname]/value: value changed 'np0005486733.novalocal' to 'np0005486733.localdomain' Oct 14 04:05:55 localhost puppet-user[52175]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtproxyd_config[unix_sock_group]/ensure: created Oct 14 04:05:55 localhost puppet-user[52175]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtproxyd_config[auth_unix_ro]/ensure: created Oct 14 04:05:55 localhost puppet-user[52175]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtproxyd_config[auth_unix_rw]/ensure: created Oct 14 04:05:55 localhost puppet-user[52175]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtproxyd_config[unix_sock_ro_perms]/ensure: created Oct 14 04:05:55 localhost puppet-user[52175]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtproxyd_config[unix_sock_rw_perms]/ensure: created Oct 14 04:05:55 localhost puppet-user[52175]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtqemud_config[unix_sock_group]/ensure: created Oct 14 04:05:55 localhost ovs-vsctl[53442]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . 
external_ids:ovn-bridge=br-int Oct 14 04:05:55 localhost puppet-user[52175]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtqemud_config[auth_unix_ro]/ensure: created Oct 14 04:05:55 localhost puppet-user[52175]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtqemud_config[auth_unix_rw]/ensure: created Oct 14 04:05:55 localhost puppet-user[53075]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-bridge]/ensure: created Oct 14 04:05:55 localhost puppet-user[52175]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtqemud_config[unix_sock_ro_perms]/ensure: created Oct 14 04:05:55 localhost puppet-user[52175]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtqemud_config[unix_sock_rw_perms]/ensure: created Oct 14 04:05:55 localhost puppet-user[52175]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtsecretd_config[unix_sock_group]/ensure: created Oct 14 04:05:55 localhost puppet-user[52175]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtsecretd_config[auth_unix_ro]/ensure: created Oct 14 04:05:55 localhost puppet-user[52175]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtsecretd_config[auth_unix_rw]/ensure: created Oct 14 04:05:55 localhost puppet-user[52175]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtsecretd_config[unix_sock_ro_perms]/ensure: created Oct 14 04:05:55 localhost puppet-user[52175]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtsecretd_config[unix_sock_rw_perms]/ensure: created Oct 14 04:05:55 localhost puppet-user[52175]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtstoraged_config[unix_sock_group]/ensure: created Oct 14 04:05:55 localhost puppet-user[52175]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtstoraged_config[auth_unix_ro]/ensure: created Oct 14 04:05:55 localhost puppet-user[52175]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtstoraged_config[auth_unix_rw]/ensure: created Oct 14 04:05:55 localhost ovs-vsctl[53447]: 
ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-remote-probe-interval=60000 Oct 14 04:05:55 localhost puppet-user[52175]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtstoraged_config[unix_sock_ro_perms]/ensure: created Oct 14 04:05:55 localhost puppet-user[52175]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtstoraged_config[unix_sock_rw_perms]/ensure: created Oct 14 04:05:55 localhost puppet-user[53075]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-remote-probe-interval]/ensure: created Oct 14 04:05:55 localhost ovs-vsctl[53454]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-openflow-probe-interval=60 Oct 14 04:05:55 localhost puppet-user[53075]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-openflow-probe-interval]/ensure: created Oct 14 04:05:55 localhost ovs-vsctl[53464]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-monitor-all=true Oct 14 04:05:55 localhost podman[53423]: 2025-10-14 08:05:55.993756385 +0000 UTC m=+0.135891207 container cleanup 6da84e96fa92b609f00406101e268dd1331c0e896b503ade54cf93e0e5d81b9c (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=container-puppet-rsyslog, name=rhosp17/openstack-rsyslog, description=Red Hat OpenStack Platform 17.1 rsyslog, release=1, vendor=Red Hat, Inc., architecture=x86_64, container_name=container-puppet-rsyslog, vcs-type=git, config_id=tripleo_puppet_step1, vcs-ref=38a223d7b691af709e0a5f628409462e34eea167, maintainer=OpenStack TripleO Team, version=17.1.9, com.redhat.component=openstack-rsyslog-container, tcib_managed=true, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, batch=17.1_20250721.1, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 
'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005486733', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,rsyslog::generate_concat,concat::fragment', 'NAME': 'rsyslog', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::rsyslog'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, build-date=2025-07-21T12:58:40, summary=Red Hat OpenStack Platform 17.1 rsyslog, distribution-scope=public, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-rsyslog/images/17.1.9-1, io.openshift.tags=rhosp osp openstack osp-17.1, io.openshift.expose-services=) Oct 14 04:05:55 localhost puppet-user[53075]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-monitor-all]/ensure: created Oct 14 04:05:55 localhost systemd[1]: libpod-conmon-6da84e96fa92b609f00406101e268dd1331c0e896b503ade54cf93e0e5d81b9c.scope: Deactivated successfully. 
Oct 14 04:05:56 localhost python3[51855]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name container-puppet-rsyslog --conmon-pidfile /run/container-puppet-rsyslog.pid --detach=False --entrypoint /var/lib/container-puppet/container-puppet.sh --env STEP=6 --env NET_HOST=true --env DEBUG=true --env HOSTNAME=np0005486733 --env NO_ARCHIVE= --env PUPPET_TAGS=file,file_line,concat,augeas,cron,rsyslog::generate_concat,concat::fragment --env NAME=rsyslog --env STEP_CONFIG=include ::tripleo::packages#012include tripleo::profile::base::logging::rsyslog --label config_id=tripleo_puppet_step1 --label container_name=container-puppet-rsyslog --label managed_by=tripleo_ansible --label config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005486733', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,rsyslog::generate_concat,concat::fragment', 'NAME': 'rsyslog', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::rsyslog'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', 
'/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/container-puppet-rsyslog.log --network host --security-opt label=disable --user 0 --volume /dev/log:/dev/log:rw --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /var/lib/config-data:/var/lib/config-data:rw --volume /var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro --volume /var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro --volume /var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1 Oct 14 04:05:56 localhost ovs-vsctl[53470]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-ofctrl-wait-before-clear=8000 Oct 14 04:05:56 localhost puppet-user[53075]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-ofctrl-wait-before-clear]/ensure: created Oct 14 04:05:56 localhost ovs-vsctl[53480]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-encap-tos=0 Oct 14 04:05:56 localhost puppet-user[53075]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-encap-tos]/ensure: created Oct 14 04:05:56 localhost ovs-vsctl[53484]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . 
external_ids:ovn-chassis-mac-mappings=datacentre:fa:16:3e:6d:4a:99 Oct 14 04:05:56 localhost puppet-user[53075]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-chassis-mac-mappings]/ensure: created Oct 14 04:05:56 localhost ovs-vsctl[53487]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-bridge-mappings=datacentre:br-ex Oct 14 04:05:56 localhost puppet-user[53075]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-bridge-mappings]/ensure: created Oct 14 04:05:56 localhost ovs-vsctl[53500]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-match-northd-version=false Oct 14 04:05:56 localhost puppet-user[53075]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-match-northd-version]/ensure: created Oct 14 04:05:56 localhost ovs-vsctl[53502]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:garp-max-timeout-sec=0 Oct 14 04:05:56 localhost puppet-user[53075]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:garp-max-timeout-sec]/ensure: created Oct 14 04:05:56 localhost puppet-user[53075]: Notice: Applied catalog in 0.42 seconds Oct 14 04:05:56 localhost puppet-user[53075]: Application: Oct 14 04:05:56 localhost puppet-user[53075]: Initial environment: production Oct 14 04:05:56 localhost puppet-user[53075]: Converged environment: production Oct 14 04:05:56 localhost puppet-user[53075]: Run mode: user Oct 14 04:05:56 localhost puppet-user[53075]: Changes: Oct 14 04:05:56 localhost puppet-user[53075]: Total: 14 Oct 14 04:05:56 localhost puppet-user[53075]: Events: Oct 14 04:05:56 localhost puppet-user[53075]: Success: 14 Oct 14 04:05:56 localhost puppet-user[53075]: Total: 14 Oct 14 04:05:56 localhost puppet-user[53075]: Resources: Oct 14 04:05:56 localhost puppet-user[53075]: Skipped: 12 Oct 14 04:05:56 localhost puppet-user[53075]: Changed: 14 Oct 14 04:05:56 localhost puppet-user[53075]: Out of sync: 14 
Oct 14 04:05:56 localhost puppet-user[53075]: Total: 29 Oct 14 04:05:56 localhost puppet-user[53075]: Time: Oct 14 04:05:56 localhost puppet-user[53075]: Exec: 0.01 Oct 14 04:05:56 localhost puppet-user[53075]: Config retrieval: 0.27 Oct 14 04:05:56 localhost puppet-user[53075]: Vs config: 0.35 Oct 14 04:05:56 localhost puppet-user[53075]: Transaction evaluation: 0.40 Oct 14 04:05:56 localhost puppet-user[53075]: Catalog application: 0.42 Oct 14 04:05:56 localhost puppet-user[53075]: Last run: 1760429156 Oct 14 04:05:56 localhost puppet-user[53075]: Total: 0.42 Oct 14 04:05:56 localhost puppet-user[53075]: Version: Oct 14 04:05:56 localhost puppet-user[53075]: Config: 1760429155 Oct 14 04:05:56 localhost puppet-user[53075]: Puppet: 7.10.0 Oct 14 04:05:56 localhost puppet-user[52175]: Notice: /Stage[main]/Nova::Compute::Libvirt::Qemu/Augeas[qemu-conf-limits]/returns: executed successfully Oct 14 04:05:56 localhost systemd[1]: libpod-bb72d998034803f1d22964fba0c703967fd4ffa64b183db41d3e23363af093c4.scope: Deactivated successfully. Oct 14 04:05:56 localhost systemd[1]: libpod-bb72d998034803f1d22964fba0c703967fd4ffa64b183db41d3e23363af093c4.scope: Consumed 2.810s CPU time. 
Oct 14 04:05:56 localhost podman[52985]: 2025-10-14 08:05:56.657599536 +0000 UTC m=+3.361912149 container died bb72d998034803f1d22964fba0c703967fd4ffa64b183db41d3e23363af093c4 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=container-puppet-ovn_controller, tcib_managed=true, name=rhosp17/openstack-ovn-controller, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, architecture=x86_64, com.redhat.component=openstack-ovn-controller-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, version=17.1.9, io.openshift.expose-services=, release=1, io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T13:28:44, maintainer=OpenStack TripleO Team, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_puppet_step1, io.buildah.version=1.33.12, container_name=container-puppet-ovn_controller, com.redhat.license_terms=https://www.redhat.com/agreements, batch=17.1_20250721.1, managed_by=tripleo_ansible, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005486733', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,vs_config,exec', 'NAME': 'ovn_controller', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::agents::ovn\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/etc/sysconfig/modules:/etc/sysconfig/modules', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}) Oct 14 04:05:56 localhost systemd[1]: var-lib-containers-storage-overlay-f2e190ad6809837b5cab304a57a4ee2a4332703e3d64c552aa0dce906fb85119-merged.mount: Deactivated successfully. Oct 14 04:05:56 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-bb72d998034803f1d22964fba0c703967fd4ffa64b183db41d3e23363af093c4-userdata-shm.mount: Deactivated successfully. Oct 14 04:05:56 localhost systemd[1]: var-lib-containers-storage-overlay-edfe04f9a55df360a7193df528cae2e7de5655e253d67fa05d14b0067469a682-merged.mount: Deactivated successfully. Oct 14 04:05:56 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-6da84e96fa92b609f00406101e268dd1331c0e896b503ade54cf93e0e5d81b9c-userdata-shm.mount: Deactivated successfully. 
Oct 14 04:05:57 localhost puppet-user[52175]: Notice: /Stage[main]/Nova::Migration::Qemu/Augeas[qemu-conf-migration-ports]/returns: executed successfully Oct 14 04:05:57 localhost puppet-user[52175]: Notice: /Stage[main]/Nova::Logging/Oslo::Log[nova_config]/Nova_config[DEFAULT/debug]/ensure: created Oct 14 04:05:57 localhost podman[53553]: 2025-10-14 08:05:57.263838859 +0000 UTC m=+0.600869345 container cleanup bb72d998034803f1d22964fba0c703967fd4ffa64b183db41d3e23363af093c4 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=container-puppet-ovn_controller, tcib_managed=true, config_id=tripleo_puppet_step1, batch=17.1_20250721.1, com.redhat.component=openstack-ovn-controller-container, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, name=rhosp17/openstack-ovn-controller, container_name=container-puppet-ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-07-21T13:28:44, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005486733', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,vs_config,exec', 'NAME': 'ovn_controller', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::agents::ovn\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/etc/sysconfig/modules:/etc/sysconfig/modules', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.33.12, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, version=17.1.9, architecture=x86_64, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, managed_by=tripleo_ansible) Oct 14 04:05:57 localhost python3[51855]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name container-puppet-ovn_controller --conmon-pidfile /run/container-puppet-ovn_controller.pid --detach=False --entrypoint /var/lib/container-puppet/container-puppet.sh --env STEP=6 --env NET_HOST=true --env DEBUG=true --env HOSTNAME=np0005486733 --env NO_ARCHIVE= --env PUPPET_TAGS=file,file_line,concat,augeas,cron,vs_config,exec --env NAME=ovn_controller --env STEP_CONFIG=include ::tripleo::packages#012include tripleo::profile::base::neutron::agents::ovn#012 --label config_id=tripleo_puppet_step1 --label container_name=container-puppet-ovn_controller --label managed_by=tripleo_ansible --label config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 
'HOSTNAME': 'np0005486733', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,vs_config,exec', 'NAME': 'ovn_controller', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::agents::ovn\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/etc/sysconfig/modules:/etc/sysconfig/modules', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/container-puppet-ovn_controller.log --network host --security-opt label=disable --user 0 --volume /dev/log:/dev/log:rw --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /etc/sysconfig/modules:/etc/sysconfig/modules --volume /lib/modules:/lib/modules:ro --volume /run/openvswitch:/run/openvswitch:shared,z --volume 
/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /var/lib/config-data:/var/lib/config-data:rw --volume /var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro --volume /var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro --volume /var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1 Oct 14 04:05:57 localhost systemd[1]: libpod-conmon-bb72d998034803f1d22964fba0c703967fd4ffa64b183db41d3e23363af093c4.scope: Deactivated successfully. Oct 14 04:05:57 localhost podman[53043]: 2025-10-14 08:05:53.487320654 +0000 UTC m=+0.041968745 image pull registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1 Oct 14 04:05:57 localhost puppet-user[52175]: Notice: /Stage[main]/Nova::Logging/Oslo::Log[nova_config]/Nova_config[DEFAULT/log_dir]/ensure: created Oct 14 04:05:57 localhost puppet-user[52175]: Notice: /Stage[main]/Nova::Cache/Oslo::Cache[nova_config]/Nova_config[cache/backend]/ensure: created Oct 14 04:05:57 localhost puppet-user[52175]: Notice: /Stage[main]/Nova::Cache/Oslo::Cache[nova_config]/Nova_config[cache/enabled]/ensure: created Oct 14 04:05:57 localhost puppet-user[52175]: Notice: /Stage[main]/Nova::Cache/Oslo::Cache[nova_config]/Nova_config[cache/memcache_servers]/ensure: created Oct 14 04:05:57 localhost podman[53824]: 2025-10-14 08:05:57.487920728 +0000 UTC m=+0.087214443 container create 7ab2c88e5d6693c43a3a41954d512afdeb313affcbacb2ba23cfd899fd39565b (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1, name=container-puppet-neutron, com.redhat.component=openstack-neutron-server-container, maintainer=OpenStack TripleO Team, version=17.1.9, batch=17.1_20250721.1, build-date=2025-07-21T15:44:03, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-server, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, io.buildah.version=1.33.12, release=1, 
name=rhosp17/openstack-neutron-server, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-server, summary=Red Hat OpenStack Platform 17.1 neutron-server, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-server/images/17.1.9-1, architecture=x86_64, vcs-ref=a2a5d3babd6b02c0b20df6d01cd606fef9bdf69d, container_name=container-puppet-neutron, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_puppet_step1, vcs-type=git, vendor=Red Hat, Inc., managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 neutron-server, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005486733', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,neutron_config,ovn_metadata_agent_config', 'NAME': 'neutron', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::ovn_metadata\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', 
'/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}) Oct 14 04:05:57 localhost systemd[1]: Started libpod-conmon-7ab2c88e5d6693c43a3a41954d512afdeb313affcbacb2ba23cfd899fd39565b.scope. Oct 14 04:05:57 localhost podman[53824]: 2025-10-14 08:05:57.444114916 +0000 UTC m=+0.043408601 image pull registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1 Oct 14 04:05:57 localhost systemd[1]: Started libcrun container. Oct 14 04:05:57 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/18fed1bc5c055eece5466d40a513df73328df93a77e2aad253cd120d7b08bd42/merged/var/lib/config-data supports timestamps until 2038 (0x7fffffff) Oct 14 04:05:57 localhost puppet-user[52175]: Notice: /Stage[main]/Nova::Cache/Oslo::Cache[nova_config]/Nova_config[cache/tls_enabled]/ensure: created Oct 14 04:05:57 localhost podman[53824]: 2025-10-14 08:05:57.571800604 +0000 UTC m=+0.171094329 container init 7ab2c88e5d6693c43a3a41954d512afdeb313affcbacb2ba23cfd899fd39565b (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1, name=container-puppet-neutron, version=17.1.9, release=1, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 neutron-server, config_id=tripleo_puppet_step1, io.buildah.version=1.33.12, batch=17.1_20250721.1, maintainer=OpenStack TripleO Team, tcib_managed=true, io.openshift.expose-services=, vcs-ref=a2a5d3babd6b02c0b20df6d01cd606fef9bdf69d, build-date=2025-07-21T15:44:03, vendor=Red Hat, Inc., distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-server/images/17.1.9-1, com.redhat.component=openstack-neutron-server-container, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 
'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005486733', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,neutron_config,ovn_metadata_agent_config', 'NAME': 'neutron', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::ovn_metadata\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-server, architecture=x86_64, name=rhosp17/openstack-neutron-server, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-server, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1, container_name=container-puppet-neutron, summary=Red Hat OpenStack Platform 17.1 neutron-server) Oct 14 04:05:57 localhost podman[53824]: 2025-10-14 08:05:57.583620776 +0000 UTC m=+0.182914461 container start 7ab2c88e5d6693c43a3a41954d512afdeb313affcbacb2ba23cfd899fd39565b (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1, name=container-puppet-neutron, managed_by=tripleo_ansible, architecture=x86_64, maintainer=OpenStack TripleO Team, 
name=rhosp17/openstack-neutron-server, build-date=2025-07-21T15:44:03, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-server, tcib_managed=true, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.component=openstack-neutron-server-container, version=17.1.9, vcs-ref=a2a5d3babd6b02c0b20df6d01cd606fef9bdf69d, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 neutron-server, config_id=tripleo_puppet_step1, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005486733', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,neutron_config,ovn_metadata_agent_config', 'NAME': 'neutron', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::ovn_metadata\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-server/images/17.1.9-1, release=1, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 neutron-server, io.buildah.version=1.33.12, vcs-type=git, batch=17.1_20250721.1, container_name=container-puppet-neutron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-server, com.redhat.license_terms=https://www.redhat.com/agreements) Oct 14 04:05:57 localhost podman[53824]: 2025-10-14 08:05:57.583797242 +0000 UTC m=+0.183090937 container attach 7ab2c88e5d6693c43a3a41954d512afdeb313affcbacb2ba23cfd899fd39565b (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1, name=container-puppet-neutron, container_name=container-puppet-neutron, vcs-type=git, vcs-ref=a2a5d3babd6b02c0b20df6d01cd606fef9bdf69d, managed_by=tripleo_ansible, config_id=tripleo_puppet_step1, com.redhat.component=openstack-neutron-server-container, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005486733', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,neutron_config,ovn_metadata_agent_config', 'NAME': 'neutron', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::ovn_metadata\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', 
'/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-server/images/17.1.9-1, version=17.1.9, distribution-scope=public, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-neutron-server, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-server, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Red Hat OpenStack Platform 17.1 neutron-server, architecture=x86_64, release=1, batch=17.1_20250721.1, build-date=2025-07-21T15:44:03, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-server, description=Red Hat OpenStack Platform 17.1 neutron-server, io.openshift.expose-services=, io.buildah.version=1.33.12) Oct 14 04:05:57 localhost puppet-user[52175]: Notice: /Stage[main]/Nova/Oslo::Messaging::Rabbit[nova_config]/Nova_config[oslo_messaging_rabbit/heartbeat_in_pthread]/ensure: created Oct 14 04:05:57 localhost puppet-user[52175]: Notice: /Stage[main]/Nova/Oslo::Messaging::Rabbit[nova_config]/Nova_config[oslo_messaging_rabbit/heartbeat_timeout_threshold]/ensure: created Oct 14 04:05:57 localhost puppet-user[52175]: Notice: /Stage[main]/Nova/Oslo::Messaging::Rabbit[nova_config]/Nova_config[oslo_messaging_rabbit/ssl]/ensure: created Oct 14 04:05:58 localhost puppet-user[52175]: Notice: /Stage[main]/Nova/Oslo::Messaging::Default[nova_config]/Nova_config[DEFAULT/transport_url]/ensure: created Oct 14 04:05:58 localhost puppet-user[52175]: Notice: 
/Stage[main]/Nova/Oslo::Messaging::Notifications[nova_config]/Nova_config[oslo_messaging_notifications/driver]/ensure: created Oct 14 04:05:58 localhost puppet-user[52175]: Notice: /Stage[main]/Nova/Oslo::Messaging::Notifications[nova_config]/Nova_config[oslo_messaging_notifications/transport_url]/ensure: created Oct 14 04:05:58 localhost puppet-user[52175]: Notice: /Stage[main]/Nova/Oslo::Concurrency[nova_config]/Nova_config[oslo_concurrency/lock_path]/ensure: created Oct 14 04:05:58 localhost puppet-user[52175]: Notice: /Stage[main]/Nova::Keystone::Service_user/Keystone::Resource::Service_user[nova_config]/Nova_config[service_user/auth_type]/ensure: created Oct 14 04:05:58 localhost puppet-user[52175]: Notice: /Stage[main]/Nova::Keystone::Service_user/Keystone::Resource::Service_user[nova_config]/Nova_config[service_user/region_name]/ensure: created Oct 14 04:05:58 localhost puppet-user[52175]: Notice: /Stage[main]/Nova::Keystone::Service_user/Keystone::Resource::Service_user[nova_config]/Nova_config[service_user/auth_url]/ensure: created Oct 14 04:05:58 localhost puppet-user[52175]: Notice: /Stage[main]/Nova::Keystone::Service_user/Keystone::Resource::Service_user[nova_config]/Nova_config[service_user/username]/ensure: created Oct 14 04:05:58 localhost puppet-user[52175]: Notice: /Stage[main]/Nova::Keystone::Service_user/Keystone::Resource::Service_user[nova_config]/Nova_config[service_user/password]/ensure: created Oct 14 04:05:58 localhost puppet-user[52175]: Notice: /Stage[main]/Nova::Keystone::Service_user/Keystone::Resource::Service_user[nova_config]/Nova_config[service_user/user_domain_name]/ensure: created Oct 14 04:05:58 localhost puppet-user[52175]: Notice: /Stage[main]/Nova::Keystone::Service_user/Keystone::Resource::Service_user[nova_config]/Nova_config[service_user/project_name]/ensure: created Oct 14 04:05:58 localhost puppet-user[52175]: Notice: 
/Stage[main]/Nova::Keystone::Service_user/Keystone::Resource::Service_user[nova_config]/Nova_config[service_user/project_domain_name]/ensure: created
Oct 14 04:05:58 localhost puppet-user[52175]: Notice: /Stage[main]/Nova::Keystone::Service_user/Keystone::Resource::Service_user[nova_config]/Nova_config[service_user/send_service_user_token]/ensure: created
Oct 14 04:05:58 localhost puppet-user[52175]: Notice: /Stage[main]/Ssh::Server::Config/Concat[/etc/ssh/sshd_config]/File[/etc/ssh/sshd_config]/ensure: defined content as '{sha256}66a7ab6cc1a19ea5002a5aaa2cfb2f196778c89c859d0afac926fe3fac9c75a4'
Oct 14 04:05:58 localhost puppet-user[52175]: Notice: Applied catalog in 4.79 seconds
Oct 14 04:05:58 localhost puppet-user[52175]: Application:
Oct 14 04:05:58 localhost puppet-user[52175]: Initial environment: production
Oct 14 04:05:58 localhost puppet-user[52175]: Converged environment: production
Oct 14 04:05:58 localhost puppet-user[52175]: Run mode: user
Oct 14 04:05:58 localhost puppet-user[52175]: Changes:
Oct 14 04:05:58 localhost puppet-user[52175]: Total: 183
Oct 14 04:05:58 localhost puppet-user[52175]: Events:
Oct 14 04:05:58 localhost puppet-user[52175]: Success: 183
Oct 14 04:05:58 localhost puppet-user[52175]: Total: 183
Oct 14 04:05:58 localhost puppet-user[52175]: Resources:
Oct 14 04:05:58 localhost puppet-user[52175]: Changed: 183
Oct 14 04:05:58 localhost puppet-user[52175]: Out of sync: 183
Oct 14 04:05:58 localhost puppet-user[52175]: Skipped: 57
Oct 14 04:05:58 localhost puppet-user[52175]: Total: 487
Oct 14 04:05:58 localhost puppet-user[52175]: Time:
Oct 14 04:05:58 localhost puppet-user[52175]: Concat fragment: 0.00
Oct 14 04:05:58 localhost puppet-user[52175]: Anchor: 0.00
Oct 14 04:05:58 localhost puppet-user[52175]: File line: 0.00
Oct 14 04:05:58 localhost puppet-user[52175]: Virtlogd config: 0.00
Oct 14 04:05:58 localhost puppet-user[52175]: Exec: 0.01
Oct 14 04:05:58 localhost puppet-user[52175]: Virtstoraged config: 0.02
Oct 14 04:05:58 localhost puppet-user[52175]: Virtqemud config: 0.02
Oct 14 04:05:58 localhost puppet-user[52175]: Package: 0.03
Oct 14 04:05:58 localhost puppet-user[52175]: Virtproxyd config: 0.03
Oct 14 04:05:58 localhost puppet-user[52175]: Virtsecretd config: 0.03
Oct 14 04:05:58 localhost puppet-user[52175]: File: 0.03
Oct 14 04:05:58 localhost puppet-user[52175]: Virtnodedevd config: 0.05
Oct 14 04:05:58 localhost puppet-user[52175]: Augeas: 1.29
Oct 14 04:05:58 localhost puppet-user[52175]: Config retrieval: 1.48
Oct 14 04:05:58 localhost puppet-user[52175]: Last run: 1760429158
Oct 14 04:05:58 localhost puppet-user[52175]: Nova config: 3.07
Oct 14 04:05:58 localhost puppet-user[52175]: Transaction evaluation: 4.77
Oct 14 04:05:58 localhost puppet-user[52175]: Catalog application: 4.79
Oct 14 04:05:58 localhost puppet-user[52175]: Resources: 0.00
Oct 14 04:05:58 localhost puppet-user[52175]: Concat file: 0.00
Oct 14 04:05:58 localhost puppet-user[52175]: Total: 4.79
Oct 14 04:05:58 localhost puppet-user[52175]: Version:
Oct 14 04:05:58 localhost puppet-user[52175]: Config: 1760429152
Oct 14 04:05:58 localhost puppet-user[52175]: Puppet: 7.10.0
Oct 14 04:05:59 localhost systemd[1]: libpod-4c438342fb722ce7a3d59e73e41c2395edc0f72e837b1d6bb00dfb2e80e443fb.scope: Deactivated successfully.
Oct 14 04:05:59 localhost systemd[1]: libpod-4c438342fb722ce7a3d59e73e41c2395edc0f72e837b1d6bb00dfb2e80e443fb.scope: Consumed 8.415s CPU time.
Oct 14 04:05:59 localhost podman[52066]: 2025-10-14 08:05:59.15712889 +0000 UTC m=+10.192908344 container died 4c438342fb722ce7a3d59e73e41c2395edc0f72e837b1d6bb00dfb2e80e443fb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=container-puppet-nova_libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, architecture=x86_64, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-libvirt/images/17.1.9-2, tcib_managed=true, io.openshift.expose-services=, config_id=tripleo_puppet_step1, managed_by=tripleo_ansible, vcs-ref=809f31d3cd93a9e04341110fb85686656c754dc0, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, vcs-type=git, release=2, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005486733', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,nova_config,libvirtd_config,virtlogd_config,virtproxyd_config,virtqemud_config,virtnodedevd_config,virtsecretd_config,virtstoraged_config,nova_config,file,libvirt_tls_password,libvirtd_config,nova_config,file,libvirt_tls_password', 'NAME': 'nova_libvirt', 'STEP_CONFIG': "include ::tripleo::packages\n# TODO(emilien): figure how to deal with libvirt profile.\n# We'll probably treat it like we do with Neutron plugins.\n# Until then, just include it in the default nova-compute role.\ninclude tripleo::profile::base::nova::compute::libvirt\n\ninclude tripleo::profile::base::nova::libvirt\n\ninclude tripleo::profile::base::nova::compute::libvirt_guests\n\ninclude tripleo::profile::base::sshd\ninclude tripleo::profile::base::nova::migration::target"}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'volumes': ['/dev/log:/dev/log:rw', 
'/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-07-21T14:56:59, version=17.1.9, distribution-scope=public, batch=17.1_20250721.1, container_name=container-puppet-nova_libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, name=rhosp17/openstack-nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1, description=Red Hat OpenStack Platform 17.1 nova-libvirt) Oct 14 04:05:59 localhost systemd[1]: tmp-crun.sfZHdo.mount: Deactivated successfully. Oct 14 04:05:59 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-4c438342fb722ce7a3d59e73e41c2395edc0f72e837b1d6bb00dfb2e80e443fb-userdata-shm.mount: Deactivated successfully. Oct 14 04:05:59 localhost systemd[1]: var-lib-containers-storage-overlay-705239d69edbb97c498d74570c74cd8434e37024eb25add7e89f19b22fc90898-merged.mount: Deactivated successfully. 
Oct 14 04:05:59 localhost podman[53897]: 2025-10-14 08:05:59.313832853 +0000 UTC m=+0.147552345 container cleanup 4c438342fb722ce7a3d59e73e41c2395edc0f72e837b1d6bb00dfb2e80e443fb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=container-puppet-nova_libvirt, build-date=2025-07-21T14:56:59, batch=17.1_20250721.1, managed_by=tripleo_ansible, version=17.1.9, maintainer=OpenStack TripleO Team, vcs-type=git, config_id=tripleo_puppet_step1, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, architecture=x86_64, name=rhosp17/openstack-nova-libvirt, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, release=2, description=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-ref=809f31d3cd93a9e04341110fb85686656c754dc0, com.redhat.component=openstack-nova-libvirt-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005486733', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,nova_config,libvirtd_config,virtlogd_config,virtproxyd_config,virtqemud_config,virtnodedevd_config,virtsecretd_config,virtstoraged_config,nova_config,file,libvirt_tls_password,libvirtd_config,nova_config,file,libvirt_tls_password', 'NAME': 'nova_libvirt', 'STEP_CONFIG': "include ::tripleo::packages\n# TODO(emilien): figure how to deal with libvirt profile.\n# We'll probably treat it like we do with Neutron plugins.\n# Until then, just include it in the default nova-compute role.\ninclude tripleo::profile::base::nova::compute::libvirt\n\ninclude tripleo::profile::base::nova::libvirt\n\ninclude 
tripleo::profile::base::nova::compute::libvirt_guests\n\ninclude tripleo::profile::base::sshd\ninclude tripleo::profile::base::nova::migration::target"}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-libvirt/images/17.1.9-2, io.openshift.tags=rhosp osp openstack osp-17.1, container_name=container-puppet-nova_libvirt, distribution-scope=public) Oct 14 04:05:59 localhost systemd[1]: libpod-conmon-4c438342fb722ce7a3d59e73e41c2395edc0f72e837b1d6bb00dfb2e80e443fb.scope: Deactivated successfully. 
Oct 14 04:05:59 localhost python3[51855]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name container-puppet-nova_libvirt --conmon-pidfile /run/container-puppet-nova_libvirt.pid --detach=False --entrypoint /var/lib/container-puppet/container-puppet.sh --env STEP=6 --env NET_HOST=true --env DEBUG=true --env HOSTNAME=np0005486733 --env NO_ARCHIVE= --env PUPPET_TAGS=file,file_line,concat,augeas,cron,nova_config,libvirtd_config,virtlogd_config,virtproxyd_config,virtqemud_config,virtnodedevd_config,virtsecretd_config,virtstoraged_config,nova_config,file,libvirt_tls_password,libvirtd_config,nova_config,file,libvirt_tls_password --env NAME=nova_libvirt --env STEP_CONFIG=include ::tripleo::packages#012# TODO(emilien): figure how to deal with libvirt profile.#012# We'll probably treat it like we do with Neutron plugins.#012# Until then, just include it in the default nova-compute role.#012include tripleo::profile::base::nova::compute::libvirt#012#012include tripleo::profile::base::nova::libvirt#012#012include tripleo::profile::base::nova::compute::libvirt_guests#012#012include tripleo::profile::base::sshd#012include tripleo::profile::base::nova::migration::target --label config_id=tripleo_puppet_step1 --label container_name=container-puppet-nova_libvirt --label managed_by=tripleo_ansible --label config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005486733', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,nova_config,libvirtd_config,virtlogd_config,virtproxyd_config,virtqemud_config,virtnodedevd_config,virtsecretd_config,virtstoraged_config,nova_config,file,libvirt_tls_password,libvirtd_config,nova_config,file,libvirt_tls_password', 'NAME': 'nova_libvirt', 'STEP_CONFIG': "include ::tripleo::packages\n# TODO(emilien): figure how to deal with libvirt 
profile.\n# We'll probably treat it like we do with Neutron plugins.\n# Until then, just include it in the default nova-compute role.\ninclude tripleo::profile::base::nova::compute::libvirt\n\ninclude tripleo::profile::base::nova::libvirt\n\ninclude tripleo::profile::base::nova::compute::libvirt_guests\n\ninclude tripleo::profile::base::sshd\ninclude tripleo::profile::base::nova::migration::target"}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/container-puppet-nova_libvirt.log --network host --security-opt label=disable --user 0 --volume /dev/log:/dev/log:rw --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /var/lib/config-data:/var/lib/config-data:rw 
--volume /var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro --volume /var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro --volume /var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Oct 14 04:05:59 localhost puppet-user[53854]: Error: Facter: error while resolving custom fact "haproxy_version": undefined method `strip' for nil:NilClass
Oct 14 04:05:59 localhost puppet-user[53854]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5
Oct 14 04:05:59 localhost puppet-user[53854]: (file: /etc/puppet/hiera.yaml)
Oct 14 04:05:59 localhost puppet-user[53854]: Warning: Undefined variable '::deploy_config_name';
Oct 14 04:05:59 localhost puppet-user[53854]: (file & line not available)
Oct 14 04:05:59 localhost puppet-user[53854]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html
Oct 14 04:05:59 localhost puppet-user[53854]: (file & line not available)
Oct 14 04:05:59 localhost puppet-user[53854]: Warning: Unknown variable: 'dhcp_agents_per_net'.
(file: /etc/puppet/modules/tripleo/manifests/profile/base/neutron.pp, line: 154, column: 37) Oct 14 04:06:00 localhost puppet-user[53854]: Notice: Compiled catalog for np0005486733.localdomain in environment production in 0.64 seconds Oct 14 04:06:00 localhost puppet-user[53854]: Notice: /Stage[main]/Neutron/Neutron_config[DEFAULT/auth_strategy]/ensure: created Oct 14 04:06:00 localhost puppet-user[53854]: Notice: /Stage[main]/Neutron/Neutron_config[DEFAULT/core_plugin]/ensure: created Oct 14 04:06:00 localhost puppet-user[53854]: Notice: /Stage[main]/Neutron/Neutron_config[DEFAULT/host]/ensure: created Oct 14 04:06:00 localhost puppet-user[53854]: Notice: /Stage[main]/Neutron/Neutron_config[DEFAULT/dns_domain]/ensure: created Oct 14 04:06:00 localhost puppet-user[53854]: Notice: /Stage[main]/Neutron/Neutron_config[DEFAULT/dhcp_agent_notification]/ensure: created Oct 14 04:06:00 localhost puppet-user[53854]: Notice: /Stage[main]/Neutron/Neutron_config[DEFAULT/allow_overlapping_ips]/ensure: created Oct 14 04:06:00 localhost puppet-user[53854]: Notice: /Stage[main]/Neutron/Neutron_config[DEFAULT/global_physnet_mtu]/ensure: created Oct 14 04:06:00 localhost puppet-user[53854]: Notice: /Stage[main]/Neutron/Neutron_config[DEFAULT/vlan_transparent]/ensure: created Oct 14 04:06:00 localhost puppet-user[53854]: Notice: /Stage[main]/Neutron/Neutron_config[agent/root_helper]/ensure: created Oct 14 04:06:00 localhost puppet-user[53854]: Notice: /Stage[main]/Neutron/Neutron_config[agent/report_interval]/ensure: created Oct 14 04:06:00 localhost puppet-user[53854]: Notice: /Stage[main]/Neutron/Neutron_config[DEFAULT/service_plugins]/ensure: created Oct 14 04:06:00 localhost puppet-user[53854]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[DEFAULT/debug]/ensure: created Oct 14 04:06:00 localhost puppet-user[53854]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[DEFAULT/nova_metadata_host]/ensure: created Oct 14 
04:06:00 localhost puppet-user[53854]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[DEFAULT/nova_metadata_protocol]/ensure: created Oct 14 04:06:00 localhost puppet-user[53854]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[DEFAULT/metadata_proxy_shared_secret]/ensure: created Oct 14 04:06:00 localhost puppet-user[53854]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[DEFAULT/metadata_workers]/ensure: created Oct 14 04:06:00 localhost puppet-user[53854]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[DEFAULT/state_path]/ensure: created Oct 14 04:06:00 localhost puppet-user[53854]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[DEFAULT/hwol_qos_enabled]/ensure: created Oct 14 04:06:00 localhost puppet-user[53854]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[agent/root_helper]/ensure: created Oct 14 04:06:00 localhost puppet-user[53854]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[ovs/ovsdb_connection]/ensure: created Oct 14 04:06:00 localhost puppet-user[53854]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[ovs/ovsdb_connection_timeout]/ensure: created Oct 14 04:06:00 localhost puppet-user[53854]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[ovn/ovsdb_probe_interval]/ensure: created Oct 14 04:06:00 localhost puppet-user[53854]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[ovn/ovn_nb_connection]/ensure: created Oct 14 04:06:00 localhost puppet-user[53854]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[ovn/ovn_sb_connection]/ensure: created Oct 14 04:06:00 localhost puppet-user[53854]: Notice: /Stage[main]/Neutron/Oslo::Messaging::Default[neutron_config]/Neutron_config[DEFAULT/transport_url]/ensure: created Oct 14 04:06:00 
localhost puppet-user[53854]: Notice: /Stage[main]/Neutron/Oslo::Messaging::Default[neutron_config]/Neutron_config[DEFAULT/control_exchange]/ensure: created Oct 14 04:06:00 localhost puppet-user[53854]: Notice: /Stage[main]/Neutron/Oslo::Concurrency[neutron_config]/Neutron_config[oslo_concurrency/lock_path]/ensure: created Oct 14 04:06:00 localhost puppet-user[53854]: Notice: /Stage[main]/Neutron/Oslo::Messaging::Notifications[neutron_config]/Neutron_config[oslo_messaging_notifications/driver]/ensure: created Oct 14 04:06:00 localhost puppet-user[53854]: Notice: /Stage[main]/Neutron/Oslo::Messaging::Notifications[neutron_config]/Neutron_config[oslo_messaging_notifications/transport_url]/ensure: created Oct 14 04:06:00 localhost puppet-user[53854]: Notice: /Stage[main]/Neutron/Oslo::Messaging::Rabbit[neutron_config]/Neutron_config[oslo_messaging_rabbit/heartbeat_in_pthread]/ensure: created Oct 14 04:06:00 localhost puppet-user[53854]: Notice: /Stage[main]/Neutron/Oslo::Messaging::Rabbit[neutron_config]/Neutron_config[oslo_messaging_rabbit/heartbeat_timeout_threshold]/ensure: created Oct 14 04:06:00 localhost puppet-user[53854]: Notice: /Stage[main]/Neutron::Logging/Oslo::Log[neutron_config]/Neutron_config[DEFAULT/debug]/ensure: created Oct 14 04:06:00 localhost puppet-user[53854]: Notice: /Stage[main]/Neutron::Logging/Oslo::Log[neutron_config]/Neutron_config[DEFAULT/log_dir]/ensure: created Oct 14 04:06:00 localhost puppet-user[53854]: Notice: Applied catalog in 0.62 seconds Oct 14 04:06:00 localhost puppet-user[53854]: Application: Oct 14 04:06:00 localhost puppet-user[53854]: Initial environment: production Oct 14 04:06:00 localhost puppet-user[53854]: Converged environment: production Oct 14 04:06:00 localhost puppet-user[53854]: Run mode: user Oct 14 04:06:00 localhost puppet-user[53854]: Changes: Oct 14 04:06:00 localhost puppet-user[53854]: Total: 33 Oct 14 04:06:00 localhost puppet-user[53854]: Events: Oct 14 04:06:00 localhost puppet-user[53854]: Success: 33 
Oct 14 04:06:00 localhost puppet-user[53854]: Total: 33
Oct 14 04:06:00 localhost puppet-user[53854]: Resources:
Oct 14 04:06:00 localhost puppet-user[53854]: Skipped: 21
Oct 14 04:06:00 localhost puppet-user[53854]: Changed: 33
Oct 14 04:06:00 localhost puppet-user[53854]: Out of sync: 33
Oct 14 04:06:00 localhost puppet-user[53854]: Total: 155
Oct 14 04:06:00 localhost puppet-user[53854]: Time:
Oct 14 04:06:00 localhost puppet-user[53854]: Resources: 0.00
Oct 14 04:06:00 localhost puppet-user[53854]: Ovn metadata agent config: 0.02
Oct 14 04:06:00 localhost puppet-user[53854]: Neutron config: 0.53
Oct 14 04:06:00 localhost puppet-user[53854]: Transaction evaluation: 0.60
Oct 14 04:06:00 localhost puppet-user[53854]: Catalog application: 0.62
Oct 14 04:06:00 localhost puppet-user[53854]: Config retrieval: 0.71
Oct 14 04:06:00 localhost puppet-user[53854]: Last run: 1760429160
Oct 14 04:06:00 localhost puppet-user[53854]: Total: 0.62
Oct 14 04:06:00 localhost puppet-user[53854]: Version:
Oct 14 04:06:00 localhost puppet-user[53854]: Config: 1760429159
Oct 14 04:06:00 localhost puppet-user[53854]: Puppet: 7.10.0
Oct 14 04:06:01 localhost systemd[1]: libpod-7ab2c88e5d6693c43a3a41954d512afdeb313affcbacb2ba23cfd899fd39565b.scope: Deactivated successfully.
Oct 14 04:06:01 localhost systemd[1]: libpod-7ab2c88e5d6693c43a3a41954d512afdeb313affcbacb2ba23cfd899fd39565b.scope: Consumed 3.841s CPU time.
Oct 14 04:06:01 localhost podman[53824]: 2025-10-14 08:06:01.479030232 +0000 UTC m=+4.078323947 container died 7ab2c88e5d6693c43a3a41954d512afdeb313affcbacb2ba23cfd899fd39565b (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1, name=container-puppet-neutron, vcs-type=git, vendor=Red Hat, Inc., distribution-scope=public, tcib_managed=true, io.buildah.version=1.33.12, io.openshift.expose-services=, container_name=container-puppet-neutron, maintainer=OpenStack TripleO Team, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 neutron-server, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=a2a5d3babd6b02c0b20df6d01cd606fef9bdf69d, version=17.1.9, release=1, build-date=2025-07-21T15:44:03, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_puppet_step1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-server/images/17.1.9-1, com.redhat.component=openstack-neutron-server-container, batch=17.1_20250721.1, name=rhosp17/openstack-neutron-server, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005486733', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,neutron_config,ovn_metadata_agent_config', 'NAME': 'neutron', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::ovn_metadata\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-server, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 neutron-server, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-server) Oct 14 04:06:01 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-7ab2c88e5d6693c43a3a41954d512afdeb313affcbacb2ba23cfd899fd39565b-userdata-shm.mount: Deactivated successfully. Oct 14 04:06:01 localhost systemd[1]: var-lib-containers-storage-overlay-18fed1bc5c055eece5466d40a513df73328df93a77e2aad253cd120d7b08bd42-merged.mount: Deactivated successfully. 
Oct 14 04:06:01 localhost podman[54039]: 2025-10-14 08:06:01.633938929 +0000 UTC m=+0.145875842 container cleanup 7ab2c88e5d6693c43a3a41954d512afdeb313affcbacb2ba23cfd899fd39565b (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1, name=container-puppet-neutron, vendor=Red Hat, Inc., version=17.1.9, tcib_managed=true, vcs-ref=a2a5d3babd6b02c0b20df6d01cd606fef9bdf69d, io.buildah.version=1.33.12, name=rhosp17/openstack-neutron-server, build-date=2025-07-21T15:44:03, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-server/images/17.1.9-1, container_name=container-puppet-neutron, distribution-scope=public, vcs-type=git, config_id=tripleo_puppet_step1, release=1, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-server, architecture=x86_64, com.redhat.component=openstack-neutron-server-container, batch=17.1_20250721.1, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005486733', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,neutron_config,ovn_metadata_agent_config', 'NAME': 'neutron', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::ovn_metadata\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/puppet:/tmp/puppet-etc:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, description=Red Hat OpenStack Platform 17.1 neutron-server, summary=Red Hat OpenStack Platform 17.1 neutron-server, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-server, maintainer=OpenStack TripleO Team, io.openshift.expose-services=) Oct 14 04:06:01 localhost systemd[1]: libpod-conmon-7ab2c88e5d6693c43a3a41954d512afdeb313affcbacb2ba23cfd899fd39565b.scope: Deactivated successfully. Oct 14 04:06:01 localhost python3[51855]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name container-puppet-neutron --conmon-pidfile /run/container-puppet-neutron.pid --detach=False --entrypoint /var/lib/container-puppet/container-puppet.sh --env STEP=6 --env NET_HOST=true --env DEBUG=true --env HOSTNAME=np0005486733 --env NO_ARCHIVE= --env PUPPET_TAGS=file,file_line,concat,augeas,cron,neutron_config,ovn_metadata_agent_config --env NAME=neutron --env STEP_CONFIG=include ::tripleo::packages#012include tripleo::profile::base::neutron::ovn_metadata#012 --label config_id=tripleo_puppet_step1 --label container_name=container-puppet-neutron --label managed_by=tripleo_ansible --label config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005486733', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,neutron_config,ovn_metadata_agent_config', 'NAME': 'neutron', 'STEP_CONFIG': 
'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::ovn_metadata\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/container-puppet-neutron.log --network host --security-opt label=disable --user 0 --volume /dev/log:/dev/log:rw --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /lib/modules:/lib/modules:ro --volume /run/openvswitch:/run/openvswitch:shared,z --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /var/lib/config-data:/var/lib/config-data:rw --volume /var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro --volume 
/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro --volume /var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1
Oct 14 04:06:02 localhost python3[54092]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 04:06:03 localhost python3[54124]: ansible-stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 14 04:06:04 localhost python3[54174]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/tripleo-container-shutdown follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 14 04:06:04 localhost python3[54217]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1760429163.8217332-84687-204268284211178/source dest=/usr/libexec/tripleo-container-shutdown mode=0700 owner=root group=root _original_basename=tripleo-container-shutdown follow=False checksum=7d67b1986212f5548057505748cd74cfcf9c0d35 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 04:06:05 localhost python3[54279]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/tripleo-start-podman-container follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 14 04:06:05 localhost python3[54322]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1760429164.7064316-84687-157677558191253/source dest=/usr/libexec/tripleo-start-podman-container mode=0700 owner=root group=root _original_basename=tripleo-start-podman-container follow=False checksum=536965633b8d3b1ce794269ffb07be0105a560a0 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 04:06:06 localhost python3[54384]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/tripleo-container-shutdown.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 14 04:06:06 localhost python3[54427]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1760429165.7102485-84777-122681233443371/source dest=/usr/lib/systemd/system/tripleo-container-shutdown.service mode=0644 owner=root group=root _original_basename=tripleo-container-shutdown-service follow=False checksum=66c1d41406ba8714feb9ed0a35259a7a57ef9707 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 04:06:06 localhost python3[54489]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 14 04:06:07 localhost python3[54532]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1760429166.6744587-84880-127175534107862/source dest=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset mode=0644 owner=root group=root _original_basename=91-tripleo-container-shutdown-preset follow=False checksum=bccb1207dcbcfaa5ca05f83c8f36ce4c2460f081 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 04:06:07 localhost python3[54562]: ansible-systemd Invoked with name=tripleo-container-shutdown state=started enabled=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 14 04:06:07 localhost systemd[1]: Reloading.
Oct 14 04:06:08 localhost systemd-rc-local-generator[54585]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 04:06:08 localhost systemd-sysv-generator[54589]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 04:06:08 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 04:06:08 localhost systemd[1]: Reloading.
Oct 14 04:06:08 localhost systemd-rc-local-generator[54628]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 04:06:08 localhost systemd-sysv-generator[54631]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 04:06:08 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 04:06:08 localhost systemd[1]: Starting TripleO Container Shutdown...
Oct 14 04:06:08 localhost systemd[1]: Finished TripleO Container Shutdown.
Oct 14 04:06:08 localhost python3[54686]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/netns-placeholder.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 14 04:06:09 localhost python3[54729]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1760429168.6939855-84947-82672191963146/source dest=/usr/lib/systemd/system/netns-placeholder.service mode=0644 owner=root group=root _original_basename=netns-placeholder-service follow=False checksum=8e9c6d5ce3a6e7f71c18780ec899f32f23de4c71 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 04:06:09 localhost python3[54791]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 14 04:06:10 localhost python3[54834]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1760429169.6190224-84972-87633677248596/source dest=/usr/lib/systemd/system-preset/91-netns-placeholder.preset mode=0644 owner=root group=root _original_basename=91-netns-placeholder-preset follow=False checksum=28b7b9aa893525d134a1eeda8a0a48fb25b736b9 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 04:06:10 localhost python3[54864]: ansible-systemd Invoked with name=netns-placeholder state=started enabled=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 14 04:06:10 localhost systemd[1]: Reloading.
Oct 14 04:06:10 localhost systemd-rc-local-generator[54890]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 04:06:10 localhost systemd-sysv-generator[54895]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 04:06:10 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 04:06:11 localhost systemd[1]: Reloading.
Oct 14 04:06:11 localhost systemd-rc-local-generator[54926]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 04:06:11 localhost systemd-sysv-generator[54930]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 04:06:11 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 04:06:11 localhost systemd[1]: Starting Create netns directory...
Oct 14 04:06:11 localhost systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Oct 14 04:06:11 localhost systemd[1]: netns-placeholder.service: Deactivated successfully.
Oct 14 04:06:11 localhost systemd[1]: Finished Create netns directory.
Oct 14 04:06:11 localhost python3[54957]: ansible-container_puppet_config Invoked with update_config_hash_only=True no_archive=True check_mode=False config_vol_prefix=/var/lib/config-data debug=False net_host=True puppet_config= short_hostname= step=6
Oct 14 04:06:11 localhost python3[54957]: ansible-container_puppet_config [WARNING] Config change detected for metrics_qdr, new hash: 8203f25645ef4c13974e350f23db228e
Oct 14 04:06:11 localhost python3[54957]: ansible-container_puppet_config [WARNING] Config change detected for collectd, new hash: d31718fcd17fdeee6489534105191c7a
Oct 14 04:06:11 localhost python3[54957]: ansible-container_puppet_config [WARNING] Config change detected for iscsid, new hash: 49c4309af9a4fea3d3f53b6222780f5a
Oct 14 04:06:11 localhost python3[54957]: ansible-container_puppet_config [WARNING] Config change detected for nova_virtlogd_wrapper, new hash: 4d186a6228facd5bcddf9bcc145eb470
Oct 14 04:06:11 localhost python3[54957]: ansible-container_puppet_config [WARNING] Config change detected for nova_virtnodedevd, new hash: 4d186a6228facd5bcddf9bcc145eb470
Oct 14 04:06:11 localhost python3[54957]: ansible-container_puppet_config [WARNING] Config change detected for nova_virtproxyd, new hash: 4d186a6228facd5bcddf9bcc145eb470
Oct 14 04:06:11 localhost python3[54957]: ansible-container_puppet_config [WARNING] Config change detected for nova_virtqemud, new hash: 4d186a6228facd5bcddf9bcc145eb470
Oct 14 04:06:11 localhost python3[54957]: ansible-container_puppet_config [WARNING] Config change detected for nova_virtsecretd, new hash: 4d186a6228facd5bcddf9bcc145eb470
Oct 14 04:06:11 localhost python3[54957]: ansible-container_puppet_config [WARNING] Config change detected for nova_virtstoraged, new hash: 4d186a6228facd5bcddf9bcc145eb470
Oct 14 04:06:11 localhost python3[54957]: ansible-container_puppet_config [WARNING] Config change detected for rsyslog, new hash: dda1083e68f30de2da9a23107b96824d
Oct 14 04:06:11 localhost python3[54957]: ansible-container_puppet_config [WARNING] Config change detected for ceilometer_agent_compute, new hash: 0fa4c62fe8881d1f7112b22e9fd9421c
Oct 14 04:06:11 localhost python3[54957]: ansible-container_puppet_config [WARNING] Config change detected for ceilometer_agent_ipmi, new hash: 0fa4c62fe8881d1f7112b22e9fd9421c
Oct 14 04:06:11 localhost python3[54957]: ansible-container_puppet_config [WARNING] Config change detected for logrotate_crond, new hash: 53ed83bb0cae779ff95edb2002262c6f
Oct 14 04:06:11 localhost python3[54957]: ansible-container_puppet_config [WARNING] Config change detected for nova_libvirt_init_secret, new hash: 4d186a6228facd5bcddf9bcc145eb470
Oct 14 04:06:11 localhost python3[54957]: ansible-container_puppet_config [WARNING] Config change detected for nova_migration_target, new hash: 4d186a6228facd5bcddf9bcc145eb470
Oct 14 04:06:11 localhost python3[54957]: ansible-container_puppet_config [WARNING] Config change detected for ovn_metadata_agent, new hash: 0a131c335ed9f542ed2a9fb22aa1dfa8
Oct 14 04:06:11 localhost python3[54957]: ansible-container_puppet_config [WARNING] Config change detected for nova_compute, new hash: 49c4309af9a4fea3d3f53b6222780f5a-4d186a6228facd5bcddf9bcc145eb470
Oct 14 04:06:11 localhost python3[54957]: ansible-container_puppet_config [WARNING] Config change detected for nova_wait_for_compute_service, new hash: 4d186a6228facd5bcddf9bcc145eb470
Oct 14 04:06:13 localhost python3[55013]: ansible-tripleo_container_manage Invoked with config_id=tripleo_step1 config_dir=/var/lib/tripleo-config/container-startup-config/step_1 config_patterns=*.json config_overrides={} concurrency=5 log_base_path=/var/log/containers/stdouts debug=False
Oct 14 04:06:13 localhost podman[55052]: 2025-10-14 08:06:13.732468062 +0000 UTC m=+0.093036085 container create 72e96aa9b3b4eabf50086dc66fce56a2ef659f8c2feec7b5024978482d4db6fb (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr_init_logs, managed_by=tripleo_ansible,
maintainer=OpenStack TripleO Team, com.redhat.component=openstack-qdrouterd-container, vcs-type=git, config_id=tripleo_step1, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, release=1, vendor=Red Hat, Inc., io.openshift.expose-services=, batch=17.1_20250721.1, architecture=x86_64, container_name=metrics_qdr_init_logs, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.9, distribution-scope=public, config_data={'command': ['/bin/bash', '-c', 'chown -R qdrouterd:qdrouterd /var/log/qdrouterd'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'none', 'privileged': False, 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1, io.buildah.version=1.33.12, build-date=2025-07-21T13:07:59, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed) Oct 14 04:06:13 localhost podman[55052]: 2025-10-14 08:06:13.688522447 +0000 UTC m=+0.049090480 image pull registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1 Oct 14 04:06:13 localhost systemd[1]: Started libpod-conmon-72e96aa9b3b4eabf50086dc66fce56a2ef659f8c2feec7b5024978482d4db6fb.scope. Oct 14 04:06:13 localhost systemd[1]: Started libcrun container. 
Oct 14 04:06:13 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7318240cee041ece4858dfb7aae8a7f672f3a389f0d73c3ccdde9a227d0af2bb/merged/var/log/qdrouterd supports timestamps until 2038 (0x7fffffff) Oct 14 04:06:13 localhost podman[55052]: 2025-10-14 08:06:13.835022698 +0000 UTC m=+0.195590721 container init 72e96aa9b3b4eabf50086dc66fce56a2ef659f8c2feec7b5024978482d4db6fb (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr_init_logs, distribution-scope=public, name=rhosp17/openstack-qdrouterd, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, version=17.1.9, release=1, config_id=tripleo_step1, io.openshift.expose-services=, container_name=metrics_qdr_init_logs, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'command': ['/bin/bash', '-c', 'chown -R qdrouterd:qdrouterd /var/log/qdrouterd'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'none', 'privileged': False, 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.component=openstack-qdrouterd-container, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, build-date=2025-07-21T13:07:59, vcs-type=git, vendor=Red Hat, Inc., vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, architecture=x86_64, tcib_managed=true) Oct 14 04:06:13 localhost podman[55052]: 2025-10-14 08:06:13.847067517 +0000 UTC m=+0.207635540 container start 72e96aa9b3b4eabf50086dc66fce56a2ef659f8c2feec7b5024978482d4db6fb 
(image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr_init_logs, vcs-type=git, architecture=x86_64, tcib_managed=true, version=17.1.9, io.buildah.version=1.33.12, com.redhat.component=openstack-qdrouterd-container, name=rhosp17/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-07-21T13:07:59, batch=17.1_20250721.1, config_id=tripleo_step1, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, distribution-scope=public, release=1, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, container_name=metrics_qdr_init_logs, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'command': ['/bin/bash', '-c', 'chown -R qdrouterd:qdrouterd /var/log/qdrouterd'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'none', 'privileged': False, 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}) Oct 14 04:06:13 localhost podman[55052]: 2025-10-14 08:06:13.847308075 +0000 UTC m=+0.207876098 container attach 72e96aa9b3b4eabf50086dc66fce56a2ef659f8c2feec7b5024978482d4db6fb (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr_init_logs, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, batch=17.1_20250721.1, io.openshift.tags=rhosp osp openstack osp-17.1, io.buildah.version=1.33.12, com.redhat.component=openstack-qdrouterd-container, version=17.1.9, distribution-scope=public, release=1, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, 
container_name=metrics_qdr_init_logs, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, build-date=2025-07-21T13:07:59, vcs-type=git, io.openshift.expose-services=, architecture=x86_64, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, config_data={'command': ['/bin/bash', '-c', 'chown -R qdrouterd:qdrouterd /var/log/qdrouterd'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'none', 'privileged': False, 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, name=rhosp17/openstack-qdrouterd) Oct 14 04:06:13 localhost podman[55052]: 2025-10-14 08:06:13.855175603 +0000 UTC m=+0.215743606 container died 72e96aa9b3b4eabf50086dc66fce56a2ef659f8c2feec7b5024978482d4db6fb (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr_init_logs, release=1, vendor=Red Hat, Inc., tcib_managed=true, version=17.1.9, container_name=metrics_qdr_init_logs, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, managed_by=tripleo_ansible, io.buildah.version=1.33.12, batch=17.1_20250721.1, config_id=tripleo_step1, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'command': ['/bin/bash', '-c', 'chown 
-R qdrouterd:qdrouterd /var/log/qdrouterd'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'none', 'privileged': False, 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.component=openstack-qdrouterd-container, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, build-date=2025-07-21T13:07:59, name=rhosp17/openstack-qdrouterd) Oct 14 04:06:13 localhost systemd[1]: libpod-72e96aa9b3b4eabf50086dc66fce56a2ef659f8c2feec7b5024978482d4db6fb.scope: Deactivated successfully. Oct 14 04:06:13 localhost podman[55072]: 2025-10-14 08:06:13.947750583 +0000 UTC m=+0.081240143 container cleanup 72e96aa9b3b4eabf50086dc66fce56a2ef659f8c2feec7b5024978482d4db6fb (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr_init_logs, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr_init_logs, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, distribution-scope=public, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, version=17.1.9, build-date=2025-07-21T13:07:59, tcib_managed=true, maintainer=OpenStack TripleO Team, vcs-type=git, architecture=x86_64, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, config_id=tripleo_step1, batch=17.1_20250721.1, io.openshift.tags=rhosp osp openstack osp-17.1, config_data={'command': ['/bin/bash', '-c', 'chown -R qdrouterd:qdrouterd /var/log/qdrouterd'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'none', 'privileged': False, 'start_order': 0, 'user': 
'root', 'volumes': ['/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, release=1) Oct 14 04:06:13 localhost systemd[1]: libpod-conmon-72e96aa9b3b4eabf50086dc66fce56a2ef659f8c2feec7b5024978482d4db6fb.scope: Deactivated successfully. Oct 14 04:06:13 localhost python3[55013]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name metrics_qdr_init_logs --conmon-pidfile /run/metrics_qdr_init_logs.pid --detach=False --label config_id=tripleo_step1 --label container_name=metrics_qdr_init_logs --label managed_by=tripleo_ansible --label config_data={'command': ['/bin/bash', '-c', 'chown -R qdrouterd:qdrouterd /var/log/qdrouterd'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'none', 'privileged': False, 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/metrics_qdr_init_logs.log --network none --privileged=False --user root --volume /var/log/containers/metrics_qdr:/var/log/qdrouterd:z registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1 /bin/bash -c chown -R qdrouterd:qdrouterd /var/log/qdrouterd Oct 14 04:06:14 localhost podman[55148]: 2025-10-14 08:06:14.401401553 +0000 UTC m=+0.075956367 container create 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, name=rhosp17/openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, version=17.1.9, release=1, managed_by=tripleo_ansible, build-date=2025-07-21T13:07:59, vcs-type=git, vendor=Red Hat, Inc., tcib_managed=true, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, io.openshift.expose-services=, batch=17.1_20250721.1, architecture=x86_64, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step1, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team) Oct 14 04:06:14 localhost systemd[1]: Started libpod-conmon-1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6.scope. Oct 14 04:06:14 localhost systemd[1]: Started libcrun container. 
Oct 14 04:06:14 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/168207db095cdd373b28e32e9bd8a2aa29e7cbcdf9040af1b44bb5a093e7f31e/merged/var/log/qdrouterd supports timestamps until 2038 (0x7fffffff) Oct 14 04:06:14 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/168207db095cdd373b28e32e9bd8a2aa29e7cbcdf9040af1b44bb5a093e7f31e/merged/var/lib/qdrouterd supports timestamps until 2038 (0x7fffffff) Oct 14 04:06:14 localhost podman[55148]: 2025-10-14 08:06:14.362835497 +0000 UTC m=+0.037390331 image pull registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1 Oct 14 04:06:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6. Oct 14 04:06:14 localhost podman[55148]: 2025-10-14 08:06:14.484944779 +0000 UTC m=+0.159499653 container init 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, com.redhat.component=openstack-qdrouterd-container, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, io.openshift.tags=rhosp osp openstack osp-17.1, container_name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1, managed_by=tripleo_ansible, io.openshift.expose-services=, version=17.1.9, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, tcib_managed=true, build-date=2025-07-21T13:07:59, vcs-type=git, 
config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}) Oct 14 04:06:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6. 
Oct 14 04:06:14 localhost podman[55148]: 2025-10-14 08:06:14.518934121 +0000 UTC m=+0.193488975 container start 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, managed_by=tripleo_ansible, version=17.1.9, name=rhosp17/openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, com.redhat.component=openstack-qdrouterd-container, vendor=Red Hat, Inc., tcib_managed=true, release=1, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, container_name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', 
'/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, io.openshift.tags=rhosp osp openstack osp-17.1, io.buildah.version=1.33.12, build-date=2025-07-21T13:07:59, vcs-type=git) Oct 14 04:06:14 localhost python3[55013]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name metrics_qdr --conmon-pidfile /run/metrics_qdr.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=8203f25645ef4c13974e350f23db228e --healthcheck-command /openstack/healthcheck --label config_id=tripleo_step1 --label container_name=metrics_qdr --label managed_by=tripleo_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/metrics_qdr.log --network host --privileged=False --user qdrouterd --volume 
/etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro --volume /var/lib/metrics_qdr:/var/lib/qdrouterd:z --volume /var/log/containers/metrics_qdr:/var/log/qdrouterd:z registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1 Oct 14 04:06:14 localhost podman[55170]: 2025-10-14 08:06:14.617240402 +0000 UTC m=+0.088198893 container health_status 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=starting, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, config_id=tripleo_step1, build-date=2025-07-21T13:07:59, com.redhat.component=openstack-qdrouterd-container, batch=17.1_20250721.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, vcs-type=git, name=rhosp17/openstack-qdrouterd, container_name=metrics_qdr, release=1, architecture=x86_64, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, version=17.1.9, 
tcib_managed=true, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, distribution-scope=public) Oct 14 04:06:14 localhost systemd[1]: var-lib-containers-storage-overlay-7318240cee041ece4858dfb7aae8a7f672f3a389f0d73c3ccdde9a227d0af2bb-merged.mount: Deactivated successfully. Oct 14 04:06:14 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-72e96aa9b3b4eabf50086dc66fce56a2ef659f8c2feec7b5024978482d4db6fb-userdata-shm.mount: Deactivated successfully. 
Oct 14 04:06:14 localhost podman[55170]: 2025-10-14 08:06:14.816086484 +0000 UTC m=+0.287044905 container exec_died 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, name=rhosp17/openstack-qdrouterd, release=1, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, io.buildah.version=1.33.12, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20250721.1, version=17.1.9, config_id=tripleo_step1, io.openshift.expose-services=, 
distribution-scope=public, tcib_managed=true, container_name=metrics_qdr, architecture=x86_64, build-date=2025-07-21T13:07:59, managed_by=tripleo_ansible, com.redhat.component=openstack-qdrouterd-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1) Oct 14 04:06:14 localhost systemd[1]: 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6.service: Deactivated successfully. Oct 14 04:06:15 localhost python3[55244]: ansible-file Invoked with path=/etc/systemd/system/tripleo_metrics_qdr.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 04:06:15 localhost python3[55260]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_metrics_qdr_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Oct 14 04:06:15 localhost python3[55321]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1760429175.4501889-85146-5656874214453/source dest=/etc/systemd/system/tripleo_metrics_qdr.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 04:06:16 localhost python3[55337]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None Oct 14 04:06:16 localhost systemd[1]: Reloading. 
Oct 14 04:06:16 localhost systemd-rc-local-generator[55360]: /etc/rc.d/rc.local is not marked executable, skipping. Oct 14 04:06:16 localhost systemd-sysv-generator[55363]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Oct 14 04:06:16 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Oct 14 04:06:17 localhost python3[55389]: ansible-systemd Invoked with state=restarted name=tripleo_metrics_qdr.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Oct 14 04:06:17 localhost systemd[1]: Reloading. Oct 14 04:06:17 localhost systemd-sysv-generator[55418]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Oct 14 04:06:17 localhost systemd-rc-local-generator[55414]: /etc/rc.d/rc.local is not marked executable, skipping. Oct 14 04:06:17 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Oct 14 04:06:17 localhost systemd[1]: Starting metrics_qdr container... Oct 14 04:06:17 localhost systemd[1]: Started metrics_qdr container. 
Oct 14 04:06:18 localhost python3[55471]: ansible-file Invoked with path=/var/lib/container-puppet/container-puppet-tasks1.json state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 04:06:19 localhost python3[55592]: ansible-container_puppet_config Invoked with check_mode=False config_vol_prefix=/var/lib/config-data debug=True net_host=True no_archive=True puppet_config=/var/lib/container-puppet/container-puppet-tasks1.json short_hostname=np0005486733 step=1 update_config_hash_only=False Oct 14 04:06:20 localhost python3[55608]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 04:06:20 localhost python3[55624]: ansible-container_config_data Invoked with config_path=/var/lib/tripleo-config/container-puppet-config/step_1 config_pattern=container-puppet-*.json config_overrides={} debug=True Oct 14 04:06:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6. Oct 14 04:06:45 localhost systemd[1]: tmp-crun.HrbY4S.mount: Deactivated successfully. 
Oct 14 04:06:45 localhost podman[55701]: 2025-10-14 08:06:45.75987486 +0000 UTC m=+0.096565076 container health_status 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, version=17.1.9, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-qdrouterd, build-date=2025-07-21T13:07:59, vendor=Red Hat, Inc., batch=17.1_20250721.1, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, release=1, managed_by=tripleo_ansible, container_name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}) Oct 14 04:06:45 localhost podman[55701]: 2025-10-14 08:06:45.966077325 +0000 UTC m=+0.302767541 container exec_died 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, name=rhosp17/openstack-qdrouterd, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, batch=17.1_20250721.1, vcs-type=git, container_name=metrics_qdr, io.openshift.expose-services=, version=17.1.9, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step1, build-date=2025-07-21T13:07:59, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, vendor=Red Hat, Inc., architecture=x86_64, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed) Oct 14 04:06:45 localhost systemd[1]: 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6.service: Deactivated successfully. Oct 14 04:07:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6. Oct 14 04:07:16 localhost podman[55731]: 2025-10-14 08:07:16.746724604 +0000 UTC m=+0.075974577 container health_status 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, container_name=metrics_qdr, distribution-scope=public, managed_by=tripleo_ansible, config_id=tripleo_step1, com.redhat.component=openstack-qdrouterd-container, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.9, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 
'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, name=rhosp17/openstack-qdrouterd, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., architecture=x86_64, build-date=2025-07-21T13:07:59, com.redhat.license_terms=https://www.redhat.com/agreements, batch=17.1_20250721.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, release=1) Oct 14 04:07:16 localhost podman[55731]: 2025-10-14 08:07:16.940094984 +0000 UTC m=+0.269344977 container exec_died 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, architecture=x86_64, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1, batch=17.1_20250721.1, release=1, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, build-date=2025-07-21T13:07:59, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': 
'/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, io.buildah.version=1.33.12, com.redhat.component=openstack-qdrouterd-container, container_name=metrics_qdr, config_id=tripleo_step1, version=17.1.9, vcs-type=git, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/agreements) Oct 14 04:07:16 localhost systemd[1]: 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6.service: Deactivated successfully. Oct 14 04:07:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6. 
Oct 14 04:07:47 localhost podman[55839]: 2025-10-14 08:07:47.73365222 +0000 UTC m=+0.079594430 container health_status 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, vendor=Red Hat, Inc., container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, distribution-scope=public, release=1, build-date=2025-07-21T13:07:59, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, 
vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, batch=17.1_20250721.1, config_id=tripleo_step1, version=17.1.9, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1) Oct 14 04:07:47 localhost podman[55839]: 2025-10-14 08:07:47.929913343 +0000 UTC m=+0.275855503 container exec_died 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.buildah.version=1.33.12, version=17.1.9, distribution-scope=public, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, name=rhosp17/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, batch=17.1_20250721.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, io.openshift.tags=rhosp osp openstack osp-17.1, config_id=tripleo_step1, release=1, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, architecture=x86_64, build-date=2025-07-21T13:07:59, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, managed_by=tripleo_ansible, tcib_managed=true, vendor=Red Hat, Inc.) Oct 14 04:07:47 localhost systemd[1]: 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6.service: Deactivated successfully. Oct 14 04:08:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6. Oct 14 04:08:18 localhost podman[55869]: 2025-10-14 08:08:18.739489954 +0000 UTC m=+0.083587904 container health_status 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, com.redhat.component=openstack-qdrouterd-container, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, architecture=x86_64, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 
'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, version=17.1.9, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, name=rhosp17/openstack-qdrouterd, release=1, container_name=metrics_qdr, config_id=tripleo_step1, io.buildah.version=1.33.12, distribution-scope=public, build-date=2025-07-21T13:07:59, managed_by=tripleo_ansible, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements) Oct 14 04:08:18 localhost podman[55869]: 2025-10-14 08:08:18.964255941 +0000 UTC m=+0.308353871 container exec_died 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, managed_by=tripleo_ansible, release=1, tcib_managed=true, vendor=Red Hat, Inc., vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, 
version=17.1.9, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, name=rhosp17/openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, distribution-scope=public, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-07-21T13:07:59, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, batch=17.1_20250721.1, container_name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1) Oct 14 04:08:18 localhost systemd[1]: 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6.service: Deactivated successfully. Oct 14 04:08:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6. 
Oct 14 04:08:49 localhost systemd[1]: tmp-crun.idZtGc.mount: Deactivated successfully. Oct 14 04:08:49 localhost podman[55976]: 2025-10-14 08:08:49.736067691 +0000 UTC m=+0.080154009 container health_status 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vcs-type=git, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20250721.1, name=rhosp17/openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.component=openstack-qdrouterd-container, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true, version=17.1.9, build-date=2025-07-21T13:07:59, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, container_name=metrics_qdr, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., release=1, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, io.buildah.version=1.33.12, io.openshift.expose-services=, managed_by=tripleo_ansible, architecture=x86_64) Oct 14 04:08:49 localhost podman[55976]: 2025-10-14 08:08:49.927998598 +0000 UTC m=+0.272084966 container exec_died 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, tcib_managed=true, batch=17.1_20250721.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, com.redhat.component=openstack-qdrouterd-container, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, config_id=tripleo_step1, release=1, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, container_name=metrics_qdr, maintainer=OpenStack TripleO Team, version=17.1.9, build-date=2025-07-21T13:07:59, io.buildah.version=1.33.12, architecture=x86_64, distribution-scope=public, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, summary=Red Hat OpenStack Platform 17.1 qdrouterd) Oct 14 04:08:49 localhost systemd[1]: 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6.service: Deactivated successfully. Oct 14 04:09:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6. Oct 14 04:09:20 localhost systemd[1]: tmp-crun.aaBr6W.mount: Deactivated successfully. 
Oct 14 04:09:20 localhost podman[56006]: 2025-10-14 08:09:20.737731073 +0000 UTC m=+0.081237672 container health_status 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vcs-type=git, managed_by=tripleo_ansible, build-date=2025-07-21T13:07:59, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step1, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.33.12, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1, 
tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, name=rhosp17/openstack-qdrouterd, batch=17.1_20250721.1, version=17.1.9, container_name=metrics_qdr, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd) Oct 14 04:09:20 localhost podman[56006]: 2025-10-14 08:09:20.931242339 +0000 UTC m=+0.274748858 container exec_died 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, container_name=metrics_qdr, io.buildah.version=1.33.12, 
build-date=2025-07-21T13:07:59, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, config_id=tripleo_step1, name=rhosp17/openstack-qdrouterd, release=1, managed_by=tripleo_ansible, architecture=x86_64, batch=17.1_20250721.1, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true, vendor=Red Hat, Inc., distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd) Oct 14 04:09:20 localhost systemd[1]: 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6.service: Deactivated successfully. Oct 14 04:09:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6. Oct 14 04:09:51 localhost podman[56112]: 2025-10-14 08:09:51.741072778 +0000 UTC m=+0.084201280 container health_status 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-07-21T13:07:59, container_name=metrics_qdr, io.openshift.expose-services=, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, architecture=x86_64, vendor=Red Hat, Inc., config_id=tripleo_step1, batch=17.1_20250721.1, version=17.1.9, release=1, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, vcs-type=git, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 
'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.tags=rhosp osp openstack osp-17.1, distribution-scope=public, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhosp17/openstack-qdrouterd, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, summary=Red Hat OpenStack Platform 17.1 qdrouterd) Oct 14 04:09:51 localhost podman[56112]: 2025-10-14 08:09:51.980043466 +0000 UTC m=+0.323171908 container exec_died 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, architecture=x86_64, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 
'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, version=17.1.9, container_name=metrics_qdr, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, config_id=tripleo_step1, io.buildah.version=1.33.12, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, release=1, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhosp17/openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20250721.1, build-date=2025-07-21T13:07:59) Oct 14 04:09:51 localhost systemd[1]: 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6.service: Deactivated successfully. Oct 14 04:10:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6. Oct 14 04:10:22 localhost systemd[1]: tmp-crun.kHC1Id.mount: Deactivated successfully. 
Oct 14 04:10:22 localhost podman[56141]: 2025-10-14 08:10:22.737377102 +0000 UTC m=+0.075059853 container health_status 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, version=17.1.9, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20250721.1, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, config_id=tripleo_step1, io.buildah.version=1.33.12, release=1, tcib_managed=true, com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, build-date=2025-07-21T13:07:59, 
io.openshift.expose-services=, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, distribution-scope=public, name=rhosp17/openstack-qdrouterd, container_name=metrics_qdr, vendor=Red Hat, Inc.) Oct 14 04:10:22 localhost podman[56141]: 2025-10-14 08:10:22.975551356 +0000 UTC m=+0.313234057 container exec_died 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-qdrouterd, vendor=Red Hat, Inc., version=17.1.9, batch=17.1_20250721.1, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-qdrouterd-container, tcib_managed=true, vcs-type=git, architecture=x86_64, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, release=1, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, build-date=2025-07-21T13:07:59, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, container_name=metrics_qdr) Oct 14 04:10:22 localhost systemd[1]: 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6.service: Deactivated successfully. Oct 14 04:10:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6. Oct 14 04:10:53 localhost systemd[1]: tmp-crun.VWNx3C.mount: Deactivated successfully. 
Oct 14 04:10:53 localhost podman[56249]: 2025-10-14 08:10:53.73732113 +0000 UTC m=+0.081746187 container health_status 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., build-date=2025-07-21T13:07:59, container_name=metrics_qdr, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, config_id=tripleo_step1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, tcib_managed=true, release=1, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, vcs-type=git, version=17.1.9, distribution-scope=public, io.buildah.version=1.33.12, io.openshift.expose-services=, name=rhosp17/openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.component=openstack-qdrouterd-container, maintainer=OpenStack TripleO Team) Oct 14 04:10:53 localhost podman[56249]: 2025-10-14 08:10:53.955305122 +0000 UTC m=+0.299730129 container exec_died 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, name=rhosp17/openstack-qdrouterd, config_id=tripleo_step1, batch=17.1_20250721.1, release=1, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, maintainer=OpenStack TripleO Team, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, vcs-type=git, managed_by=tripleo_ansible, build-date=2025-07-21T13:07:59, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, container_name=metrics_qdr, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.9, io.buildah.version=1.33.12) Oct 14 04:10:53 localhost systemd[1]: 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6.service: Deactivated successfully. Oct 14 04:11:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6. Oct 14 04:11:24 localhost podman[56279]: 2025-10-14 08:11:24.734867695 +0000 UTC m=+0.079015033 container health_status 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20250721.1, release=1, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, version=17.1.9, com.redhat.component=openstack-qdrouterd-container, vcs-type=git, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, io.openshift.expose-services=, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-07-21T13:07:59, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, tcib_managed=true, name=rhosp17/openstack-qdrouterd, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=metrics_qdr, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd) Oct 14 04:11:24 localhost podman[56279]: 2025-10-14 08:11:24.936168723 +0000 UTC m=+0.280316081 container exec_died 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-07-21T13:07:59, vcs-type=git, batch=17.1_20250721.1, container_name=metrics_qdr, name=rhosp17/openstack-qdrouterd, tcib_managed=true, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, vendor=Red Hat, Inc., release=1, maintainer=OpenStack TripleO Team, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1, 
config_id=tripleo_step1, io.buildah.version=1.33.12, io.openshift.expose-services=, managed_by=tripleo_ansible, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, version=17.1.9, com.redhat.component=openstack-qdrouterd-container) Oct 14 04:11:24 localhost systemd[1]: 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6.service: Deactivated successfully. 
Oct 14 04:11:35 localhost ceph-osd[32440]: osd.3 pg_epoch: 18 pg[2.0( empty local-lis/les=0/0 n=0 ec=18/18 lis/c=0/0 les/c/f=0/0/0 sis=18) [5,2,3] r=2 lpr=18 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Oct 14 04:11:36 localhost ceph-osd[31500]: osd.0 pg_epoch: 20 pg[3.0( empty local-lis/les=0/0 n=0 ec=20/20 lis/c=0/0 les/c/f=0/0/0 sis=20) [2,1,0] r=2 lpr=20 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Oct 14 04:11:36 localhost ceph-osd[32440]: osd.3 pg_epoch: 21 pg[4.0( empty local-lis/les=0/0 n=0 ec=21/21 lis/c=0/0 les/c/f=0/0/0 sis=21) [3,5,2] r=0 lpr=21 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Oct 14 04:11:37 localhost sshd[56309]: main: sshd: ssh-rsa algorithm is disabled Oct 14 04:11:38 localhost ceph-osd[32440]: osd.3 pg_epoch: 22 pg[4.0( empty local-lis/les=21/22 n=0 ec=21/21 lis/c=0/0 les/c/f=0/0/0 sis=21) [3,5,2] r=0 lpr=21 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Oct 14 04:11:39 localhost ceph-osd[32440]: osd.3 pg_epoch: 22 pg[5.0( empty local-lis/les=0/0 n=0 ec=22/22 lis/c=0/0 les/c/f=0/0/0 sis=22) [2,3,1] r=1 lpr=22 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Oct 14 04:11:51 localhost ceph-osd[31500]: osd.0 pg_epoch: 28 pg[6.0( empty local-lis/les=0/0 n=0 ec=28/28 lis/c=0/0 les/c/f=0/0/0 sis=28) [0,5,2] r=0 lpr=28 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Oct 14 04:11:52 localhost ceph-osd[31500]: osd.0 pg_epoch: 29 pg[6.0( empty local-lis/les=28/29 n=0 ec=28/28 lis/c=0/0 les/c/f=0/0/0 sis=28) [0,5,2] r=0 lpr=28 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Oct 14 04:11:54 localhost ceph-osd[32440]: osd.3 pg_epoch: 29 pg[7.0( empty local-lis/les=0/0 n=0 ec=29/29 lis/c=0/0 les/c/f=0/0/0 sis=29) [5,4,3] r=2 lpr=29 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Oct 14 04:11:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run 
1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6. Oct 14 04:11:55 localhost podman[56389]: 2025-10-14 08:11:55.723904532 +0000 UTC m=+0.067959394 container health_status 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vcs-type=git, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, build-date=2025-07-21T13:07:59, com.redhat.component=openstack-qdrouterd-container, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, batch=17.1_20250721.1, tcib_managed=true, config_id=tripleo_step1, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.license_terms=https://www.redhat.com/agreements, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 qdrouterd, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, release=1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, io.buildah.version=1.33.12, name=rhosp17/openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., architecture=x86_64, version=17.1.9, io.openshift.tags=rhosp osp openstack osp-17.1, description=Red Hat OpenStack Platform 17.1 qdrouterd) Oct 14 04:11:55 localhost podman[56389]: 2025-10-14 08:11:55.931347103 +0000 UTC m=+0.275401985 container exec_died 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.33.12, config_id=tripleo_step1, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, release=1, batch=17.1_20250721.1, maintainer=OpenStack TripleO Team, tcib_managed=true, architecture=x86_64, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, name=rhosp17/openstack-qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, vendor=Red Hat, Inc., build-date=2025-07-21T13:07:59, version=17.1.9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd) Oct 14 04:11:55 localhost systemd[1]: 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6.service: Deactivated successfully. Oct 14 04:12:15 localhost ceph-osd[32440]: osd.3 pg_epoch: 33 pg[2.0( empty local-lis/les=18/19 n=0 ec=18/18 lis/c=18/18 les/c/f=19/19/0 sis=33 pruub=8.168042183s) [5,2,3] r=2 lpr=33 pi=[18,33)/1 crt=0'0 mlcod 0'0 active pruub 1177.721557617s@ mbc={}] start_peering_interval up [5,2,3] -> [5,2,3], acting [5,2,3] -> [5,2,3], acting_primary 5 -> 5, up_primary 5 -> 5, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Oct 14 04:12:15 localhost ceph-osd[32440]: osd.3 pg_epoch: 33 pg[2.0( empty local-lis/les=18/19 n=0 ec=18/18 lis/c=18/18 les/c/f=19/19/0 sis=33 pruub=8.166176796s) [5,2,3] r=2 lpr=33 pi=[18,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1177.721557617s@ mbc={}] state: transitioning to Stray Oct 14 04:12:16 localhost ceph-osd[32440]: osd.3 pg_epoch: 34 pg[2.1f( empty local-lis/les=18/19 n=0 ec=33/18 lis/c=18/18 les/c/f=19/19/0 sis=33) [5,2,3] r=2 lpr=33 pi=[18,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Oct 14 04:12:16 localhost ceph-osd[32440]: osd.3 
pg_epoch: 34 pg[2.1e( empty local-lis/les=18/19 n=0 ec=33/18 lis/c=18/18 les/c/f=19/19/0 sis=33) [5,2,3] r=2 lpr=33 pi=[18,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Oct 14 04:12:16 localhost ceph-osd[32440]: osd.3 pg_epoch: 34 pg[2.1d( empty local-lis/les=18/19 n=0 ec=33/18 lis/c=18/18 les/c/f=19/19/0 sis=33) [5,2,3] r=2 lpr=33 pi=[18,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Oct 14 04:12:16 localhost ceph-osd[32440]: osd.3 pg_epoch: 34 pg[2.1a( empty local-lis/les=18/19 n=0 ec=33/18 lis/c=18/18 les/c/f=19/19/0 sis=33) [5,2,3] r=2 lpr=33 pi=[18,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Oct 14 04:12:16 localhost ceph-osd[32440]: osd.3 pg_epoch: 34 pg[2.1c( empty local-lis/les=18/19 n=0 ec=33/18 lis/c=18/18 les/c/f=19/19/0 sis=33) [5,2,3] r=2 lpr=33 pi=[18,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Oct 14 04:12:16 localhost ceph-osd[32440]: osd.3 pg_epoch: 34 pg[2.19( empty local-lis/les=18/19 n=0 ec=33/18 lis/c=18/18 les/c/f=19/19/0 sis=33) [5,2,3] r=2 lpr=33 pi=[18,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Oct 14 04:12:16 localhost ceph-osd[32440]: osd.3 pg_epoch: 34 pg[2.1b( empty local-lis/les=18/19 n=0 ec=33/18 lis/c=18/18 les/c/f=19/19/0 sis=33) [5,2,3] r=2 lpr=33 pi=[18,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Oct 14 04:12:16 localhost ceph-osd[32440]: osd.3 pg_epoch: 34 pg[2.8( empty local-lis/les=18/19 n=0 ec=33/18 lis/c=18/18 les/c/f=19/19/0 sis=33) [5,2,3] r=2 lpr=33 pi=[18,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Oct 14 04:12:16 localhost ceph-osd[32440]: osd.3 pg_epoch: 34 pg[2.1( empty local-lis/les=18/19 n=0 ec=33/18 lis/c=18/18 les/c/f=19/19/0 sis=33) [5,2,3] r=2 lpr=33 pi=[18,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Oct 14 04:12:16 localhost ceph-osd[32440]: osd.3 pg_epoch: 
34 pg[2.4( empty local-lis/les=18/19 n=0 ec=33/18 lis/c=18/18 les/c/f=19/19/0 sis=33) [5,2,3] r=2 lpr=33 pi=[18,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Oct 14 04:12:16 localhost ceph-osd[32440]: osd.3 pg_epoch: 34 pg[2.2( empty local-lis/les=18/19 n=0 ec=33/18 lis/c=18/18 les/c/f=19/19/0 sis=33) [5,2,3] r=2 lpr=33 pi=[18,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Oct 14 04:12:16 localhost ceph-osd[32440]: osd.3 pg_epoch: 34 pg[2.5( empty local-lis/les=18/19 n=0 ec=33/18 lis/c=18/18 les/c/f=19/19/0 sis=33) [5,2,3] r=2 lpr=33 pi=[18,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Oct 14 04:12:16 localhost ceph-osd[32440]: osd.3 pg_epoch: 34 pg[2.7( empty local-lis/les=18/19 n=0 ec=33/18 lis/c=18/18 les/c/f=19/19/0 sis=33) [5,2,3] r=2 lpr=33 pi=[18,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Oct 14 04:12:16 localhost ceph-osd[32440]: osd.3 pg_epoch: 34 pg[2.6( empty local-lis/les=18/19 n=0 ec=33/18 lis/c=18/18 les/c/f=19/19/0 sis=33) [5,2,3] r=2 lpr=33 pi=[18,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Oct 14 04:12:16 localhost ceph-osd[32440]: osd.3 pg_epoch: 34 pg[2.3( empty local-lis/les=18/19 n=0 ec=33/18 lis/c=18/18 les/c/f=19/19/0 sis=33) [5,2,3] r=2 lpr=33 pi=[18,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Oct 14 04:12:16 localhost ceph-osd[32440]: osd.3 pg_epoch: 34 pg[2.9( empty local-lis/les=18/19 n=0 ec=33/18 lis/c=18/18 les/c/f=19/19/0 sis=33) [5,2,3] r=2 lpr=33 pi=[18,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Oct 14 04:12:16 localhost ceph-osd[32440]: osd.3 pg_epoch: 34 pg[2.a( empty local-lis/les=18/19 n=0 ec=33/18 lis/c=18/18 les/c/f=19/19/0 sis=33) [5,2,3] r=2 lpr=33 pi=[18,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Oct 14 04:12:16 localhost ceph-osd[32440]: osd.3 pg_epoch: 34 pg[2.b( 
empty local-lis/les=18/19 n=0 ec=33/18 lis/c=18/18 les/c/f=19/19/0 sis=33) [5,2,3] r=2 lpr=33 pi=[18,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Oct 14 04:12:16 localhost ceph-osd[32440]: osd.3 pg_epoch: 34 pg[2.c( empty local-lis/les=18/19 n=0 ec=33/18 lis/c=18/18 les/c/f=19/19/0 sis=33) [5,2,3] r=2 lpr=33 pi=[18,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Oct 14 04:12:16 localhost ceph-osd[32440]: osd.3 pg_epoch: 34 pg[2.d( empty local-lis/les=18/19 n=0 ec=33/18 lis/c=18/18 les/c/f=19/19/0 sis=33) [5,2,3] r=2 lpr=33 pi=[18,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Oct 14 04:12:16 localhost ceph-osd[32440]: osd.3 pg_epoch: 34 pg[2.e( empty local-lis/les=18/19 n=0 ec=33/18 lis/c=18/18 les/c/f=19/19/0 sis=33) [5,2,3] r=2 lpr=33 pi=[18,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Oct 14 04:12:16 localhost ceph-osd[32440]: osd.3 pg_epoch: 34 pg[2.f( empty local-lis/les=18/19 n=0 ec=33/18 lis/c=18/18 les/c/f=19/19/0 sis=33) [5,2,3] r=2 lpr=33 pi=[18,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Oct 14 04:12:16 localhost ceph-osd[32440]: osd.3 pg_epoch: 34 pg[2.10( empty local-lis/les=18/19 n=0 ec=33/18 lis/c=18/18 les/c/f=19/19/0 sis=33) [5,2,3] r=2 lpr=33 pi=[18,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Oct 14 04:12:16 localhost ceph-osd[32440]: osd.3 pg_epoch: 34 pg[2.12( empty local-lis/les=18/19 n=0 ec=33/18 lis/c=18/18 les/c/f=19/19/0 sis=33) [5,2,3] r=2 lpr=33 pi=[18,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Oct 14 04:12:16 localhost ceph-osd[32440]: osd.3 pg_epoch: 34 pg[2.11( empty local-lis/les=18/19 n=0 ec=33/18 lis/c=18/18 les/c/f=19/19/0 sis=33) [5,2,3] r=2 lpr=33 pi=[18,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Oct 14 04:12:16 localhost ceph-osd[32440]: osd.3 pg_epoch: 34 pg[2.13( empty 
local-lis/les=18/19 n=0 ec=33/18 lis/c=18/18 les/c/f=19/19/0 sis=33) [5,2,3] r=2 lpr=33 pi=[18,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Oct 14 04:12:16 localhost ceph-osd[32440]: osd.3 pg_epoch: 34 pg[2.14( empty local-lis/les=18/19 n=0 ec=33/18 lis/c=18/18 les/c/f=19/19/0 sis=33) [5,2,3] r=2 lpr=33 pi=[18,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Oct 14 04:12:16 localhost ceph-osd[32440]: osd.3 pg_epoch: 34 pg[2.17( empty local-lis/les=18/19 n=0 ec=33/18 lis/c=18/18 les/c/f=19/19/0 sis=33) [5,2,3] r=2 lpr=33 pi=[18,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Oct 14 04:12:16 localhost ceph-osd[32440]: osd.3 pg_epoch: 34 pg[2.15( empty local-lis/les=18/19 n=0 ec=33/18 lis/c=18/18 les/c/f=19/19/0 sis=33) [5,2,3] r=2 lpr=33 pi=[18,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Oct 14 04:12:16 localhost ceph-osd[32440]: osd.3 pg_epoch: 34 pg[2.18( empty local-lis/les=18/19 n=0 ec=33/18 lis/c=18/18 les/c/f=19/19/0 sis=33) [5,2,3] r=2 lpr=33 pi=[18,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Oct 14 04:12:16 localhost ceph-osd[32440]: osd.3 pg_epoch: 34 pg[2.16( empty local-lis/les=18/19 n=0 ec=33/18 lis/c=18/18 les/c/f=19/19/0 sis=33) [5,2,3] r=2 lpr=33 pi=[18,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Oct 14 04:12:17 localhost ceph-osd[31500]: osd.0 pg_epoch: 35 pg[3.0( empty local-lis/les=20/21 n=0 ec=20/20 lis/c=20/20 les/c/f=21/21/0 sis=35 pruub=15.732213020s) [2,1,0] r=2 lpr=35 pi=[20,35)/1 crt=0'0 mlcod 0'0 active pruub 1191.556274414s@ mbc={}] start_peering_interval up [2,1,0] -> [2,1,0], acting [2,1,0] -> [2,1,0], acting_primary 2 -> 2, up_primary 2 -> 2, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Oct 14 04:12:17 localhost ceph-osd[32440]: osd.3 pg_epoch: 35 pg[2.18( empty local-lis/les=33/34 n=0 ec=33/18 lis/c=33/33 
les/c/f=34/34/0 sis=35 pruub=15.012010574s) [4,1,3] r=2 lpr=35 pi=[33,35)/1 crt=0'0 mlcod 0'0 active pruub 1186.588256836s@ mbc={}] start_peering_interval up [5,2,3] -> [4,1,3], acting [5,2,3] -> [4,1,3], acting_primary 5 -> 4, up_primary 5 -> 4, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Oct 14 04:12:17 localhost ceph-osd[32440]: osd.3 pg_epoch: 35 pg[2.18( empty local-lis/les=33/34 n=0 ec=33/18 lis/c=33/33 les/c/f=34/34/0 sis=35 pruub=15.011944771s) [4,1,3] r=2 lpr=35 pi=[33,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1186.588256836s@ mbc={}] state: transitioning to Stray Oct 14 04:12:17 localhost ceph-osd[32440]: osd.3 pg_epoch: 35 pg[2.15( empty local-lis/les=33/34 n=0 ec=33/18 lis/c=33/33 les/c/f=34/34/0 sis=35 pruub=15.011978149s) [4,0,5] r=-1 lpr=35 pi=[33,35)/1 crt=0'0 mlcod 0'0 active pruub 1186.588256836s@ mbc={}] start_peering_interval up [5,2,3] -> [4,0,5], acting [5,2,3] -> [4,0,5], acting_primary 5 -> 4, up_primary 5 -> 4, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Oct 14 04:12:17 localhost ceph-osd[32440]: osd.3 pg_epoch: 35 pg[2.16( empty local-lis/les=33/34 n=0 ec=33/18 lis/c=33/33 les/c/f=34/34/0 sis=35 pruub=15.012033463s) [5,2,0] r=-1 lpr=35 pi=[33,35)/1 crt=0'0 mlcod 0'0 active pruub 1186.588378906s@ mbc={}] start_peering_interval up [5,2,3] -> [5,2,0], acting [5,2,3] -> [5,2,0], acting_primary 5 -> 5, up_primary 5 -> 5, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Oct 14 04:12:17 localhost ceph-osd[32440]: osd.3 pg_epoch: 35 pg[2.14( empty local-lis/les=33/34 n=0 ec=33/18 lis/c=33/33 les/c/f=34/34/0 sis=35 pruub=15.011924744s) [5,4,0] r=-1 lpr=35 pi=[33,35)/1 crt=0'0 mlcod 0'0 active pruub 1186.588256836s@ mbc={}] start_peering_interval up [5,2,3] -> [5,4,0], acting [5,2,3] -> [5,4,0], acting_primary 5 -> 5, up_primary 5 -> 5, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Oct 14 04:12:17 localhost 
ceph-osd[32440]: osd.3 pg_epoch: 35 pg[2.14( empty local-lis/les=33/34 n=0 ec=33/18 lis/c=33/33 les/c/f=34/34/0 sis=35 pruub=15.011830330s) [5,4,0] r=-1 lpr=35 pi=[33,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1186.588256836s@ mbc={}] state: transitioning to Stray Oct 14 04:12:17 localhost ceph-osd[32440]: osd.3 pg_epoch: 35 pg[2.15( empty local-lis/les=33/34 n=0 ec=33/18 lis/c=33/33 les/c/f=34/34/0 sis=35 pruub=15.011846542s) [4,0,5] r=-1 lpr=35 pi=[33,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1186.588256836s@ mbc={}] state: transitioning to Stray Oct 14 04:12:17 localhost ceph-osd[32440]: osd.3 pg_epoch: 35 pg[2.16( empty local-lis/les=33/34 n=0 ec=33/18 lis/c=33/33 les/c/f=34/34/0 sis=35 pruub=15.011686325s) [5,2,0] r=-1 lpr=35 pi=[33,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1186.588378906s@ mbc={}] state: transitioning to Stray Oct 14 04:12:17 localhost ceph-osd[32440]: osd.3 pg_epoch: 35 pg[2.13( empty local-lis/les=33/34 n=0 ec=33/18 lis/c=33/33 les/c/f=34/34/0 sis=35 pruub=15.011466026s) [2,5,3] r=2 lpr=35 pi=[33,35)/1 crt=0'0 mlcod 0'0 active pruub 1186.588378906s@ mbc={}] start_peering_interval up [5,2,3] -> [2,5,3], acting [5,2,3] -> [2,5,3], acting_primary 5 -> 2, up_primary 5 -> 2, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Oct 14 04:12:17 localhost ceph-osd[32440]: osd.3 pg_epoch: 35 pg[2.12( empty local-lis/les=33/34 n=0 ec=33/18 lis/c=33/33 les/c/f=34/34/0 sis=35 pruub=15.010694504s) [2,3,1] r=1 lpr=35 pi=[33,35)/1 crt=0'0 mlcod 0'0 active pruub 1186.587646484s@ mbc={}] start_peering_interval up [5,2,3] -> [2,3,1], acting [5,2,3] -> [2,3,1], acting_primary 5 -> 2, up_primary 5 -> 2, role 2 -> 1, features acting 4540138322906710015 upacting 4540138322906710015 Oct 14 04:12:17 localhost ceph-osd[32440]: osd.3 pg_epoch: 35 pg[2.13( empty local-lis/les=33/34 n=0 ec=33/18 lis/c=33/33 les/c/f=34/34/0 sis=35 pruub=15.011409760s) [2,5,3] r=2 lpr=35 pi=[33,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 
1186.588378906s@ mbc={}] state: transitioning to Stray Oct 14 04:12:17 localhost ceph-osd[32440]: osd.3 pg_epoch: 35 pg[2.12( empty local-lis/les=33/34 n=0 ec=33/18 lis/c=33/33 les/c/f=34/34/0 sis=35 pruub=15.010643959s) [2,3,1] r=1 lpr=35 pi=[33,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1186.587646484s@ mbc={}] state: transitioning to Stray Oct 14 04:12:17 localhost ceph-osd[32440]: osd.3 pg_epoch: 35 pg[2.11( empty local-lis/les=33/34 n=0 ec=33/18 lis/c=33/33 les/c/f=34/34/0 sis=35 pruub=15.010786057s) [5,3,2] r=1 lpr=35 pi=[33,35)/1 crt=0'0 mlcod 0'0 active pruub 1186.587890625s@ mbc={}] start_peering_interval up [5,2,3] -> [5,3,2], acting [5,2,3] -> [5,3,2], acting_primary 5 -> 5, up_primary 5 -> 5, role 2 -> 1, features acting 4540138322906710015 upacting 4540138322906710015 Oct 14 04:12:17 localhost ceph-osd[32440]: osd.3 pg_epoch: 35 pg[2.10( empty local-lis/les=33/34 n=0 ec=33/18 lis/c=33/33 les/c/f=34/34/0 sis=35 pruub=15.010797501s) [2,0,5] r=-1 lpr=35 pi=[33,35)/1 crt=0'0 mlcod 0'0 active pruub 1186.587890625s@ mbc={}] start_peering_interval up [5,2,3] -> [2,0,5], acting [5,2,3] -> [2,0,5], acting_primary 5 -> 2, up_primary 5 -> 2, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Oct 14 04:12:17 localhost ceph-osd[32440]: osd.3 pg_epoch: 35 pg[2.11( empty local-lis/les=33/34 n=0 ec=33/18 lis/c=33/33 les/c/f=34/34/0 sis=35 pruub=15.010727882s) [5,3,2] r=1 lpr=35 pi=[33,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1186.587890625s@ mbc={}] state: transitioning to Stray Oct 14 04:12:17 localhost ceph-osd[32440]: osd.3 pg_epoch: 35 pg[2.10( empty local-lis/les=33/34 n=0 ec=33/18 lis/c=33/33 les/c/f=34/34/0 sis=35 pruub=15.010739326s) [2,0,5] r=-1 lpr=35 pi=[33,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1186.587890625s@ mbc={}] state: transitioning to Stray Oct 14 04:12:17 localhost ceph-osd[32440]: osd.3 pg_epoch: 35 pg[2.f( empty local-lis/les=33/34 n=0 ec=33/18 lis/c=33/33 les/c/f=34/34/0 sis=35 pruub=15.010265350s) 
[2,1,0] r=-1 lpr=35 pi=[33,35)/1 crt=0'0 mlcod 0'0 active pruub 1186.587646484s@ mbc={}] start_peering_interval up [5,2,3] -> [2,1,0], acting [5,2,3] -> [2,1,0], acting_primary 5 -> 2, up_primary 5 -> 2, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Oct 14 04:12:17 localhost ceph-osd[32440]: osd.3 pg_epoch: 35 pg[2.d( empty local-lis/les=33/34 n=0 ec=33/18 lis/c=33/33 les/c/f=34/34/0 sis=35 pruub=15.010337830s) [4,1,3] r=2 lpr=35 pi=[33,35)/1 crt=0'0 mlcod 0'0 active pruub 1186.587646484s@ mbc={}] start_peering_interval up [5,2,3] -> [4,1,3], acting [5,2,3] -> [4,1,3], acting_primary 5 -> 4, up_primary 5 -> 4, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Oct 14 04:12:17 localhost ceph-osd[32440]: osd.3 pg_epoch: 35 pg[2.e( empty local-lis/les=33/34 n=0 ec=33/18 lis/c=33/33 les/c/f=34/34/0 sis=35 pruub=15.010436058s) [3,4,5] r=0 lpr=35 pi=[33,35)/1 crt=0'0 mlcod 0'0 active pruub 1186.587768555s@ mbc={}] start_peering_interval up [5,2,3] -> [3,4,5], acting [5,2,3] -> [3,4,5], acting_primary 5 -> 3, up_primary 5 -> 3, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015 Oct 14 04:12:17 localhost ceph-osd[32440]: osd.3 pg_epoch: 35 pg[2.d( empty local-lis/les=33/34 n=0 ec=33/18 lis/c=33/33 les/c/f=34/34/0 sis=35 pruub=15.010265350s) [4,1,3] r=2 lpr=35 pi=[33,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1186.587646484s@ mbc={}] state: transitioning to Stray Oct 14 04:12:17 localhost ceph-osd[32440]: osd.3 pg_epoch: 35 pg[2.c( empty local-lis/les=33/34 n=0 ec=33/18 lis/c=33/33 les/c/f=34/34/0 sis=35 pruub=15.010596275s) [2,0,1] r=-1 lpr=35 pi=[33,35)/1 crt=0'0 mlcod 0'0 active pruub 1186.588012695s@ mbc={}] start_peering_interval up [5,2,3] -> [2,0,1], acting [5,2,3] -> [2,0,1], acting_primary 5 -> 2, up_primary 5 -> 2, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Oct 14 04:12:17 localhost ceph-osd[32440]: osd.3 pg_epoch: 35 pg[2.f( empty 
local-lis/les=33/34 n=0 ec=33/18 lis/c=33/33 les/c/f=34/34/0 sis=35 pruub=15.010177612s) [2,1,0] r=-1 lpr=35 pi=[33,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1186.587646484s@ mbc={}] state: transitioning to Stray Oct 14 04:12:17 localhost ceph-osd[32440]: osd.3 pg_epoch: 35 pg[2.e( empty local-lis/les=33/34 n=0 ec=33/18 lis/c=33/33 les/c/f=34/34/0 sis=35 pruub=15.010436058s) [3,4,5] r=0 lpr=35 pi=[33,35)/1 crt=0'0 mlcod 0'0 unknown pruub 1186.587768555s@ mbc={}] state: transitioning to Primary Oct 14 04:12:17 localhost ceph-osd[32440]: osd.3 pg_epoch: 35 pg[2.c( empty local-lis/les=33/34 n=0 ec=33/18 lis/c=33/33 les/c/f=34/34/0 sis=35 pruub=15.010559082s) [2,0,1] r=-1 lpr=35 pi=[33,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1186.588012695s@ mbc={}] state: transitioning to Stray Oct 14 04:12:17 localhost ceph-osd[32440]: osd.3 pg_epoch: 35 pg[2.b( empty local-lis/les=33/34 n=0 ec=33/18 lis/c=33/33 les/c/f=34/34/0 sis=35 pruub=15.010403633s) [4,5,0] r=-1 lpr=35 pi=[33,35)/1 crt=0'0 mlcod 0'0 active pruub 1186.588012695s@ mbc={}] start_peering_interval up [5,2,3] -> [4,5,0], acting [5,2,3] -> [4,5,0], acting_primary 5 -> 4, up_primary 5 -> 4, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Oct 14 04:12:17 localhost ceph-osd[32440]: osd.3 pg_epoch: 35 pg[2.a( empty local-lis/les=33/34 n=0 ec=33/18 lis/c=33/33 les/c/f=34/34/0 sis=35 pruub=15.009862900s) [2,3,1] r=1 lpr=35 pi=[33,35)/1 crt=0'0 mlcod 0'0 active pruub 1186.587524414s@ mbc={}] start_peering_interval up [5,2,3] -> [2,3,1], acting [5,2,3] -> [2,3,1], acting_primary 5 -> 2, up_primary 5 -> 2, role 2 -> 1, features acting 4540138322906710015 upacting 4540138322906710015 Oct 14 04:12:17 localhost ceph-osd[32440]: osd.3 pg_epoch: 35 pg[2.b( empty local-lis/les=33/34 n=0 ec=33/18 lis/c=33/33 les/c/f=34/34/0 sis=35 pruub=15.010334969s) [4,5,0] r=-1 lpr=35 pi=[33,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1186.588012695s@ mbc={}] state: transitioning to Stray Oct 14 04:12:17 
localhost ceph-osd[32440]: osd.3 pg_epoch: 35 pg[2.9( empty local-lis/les=33/34 n=0 ec=33/18 lis/c=33/33 les/c/f=34/34/0 sis=35 pruub=15.009704590s) [3,1,4] r=0 lpr=35 pi=[33,35)/1 crt=0'0 mlcod 0'0 active pruub 1186.587524414s@ mbc={}] start_peering_interval up [5,2,3] -> [3,1,4], acting [5,2,3] -> [3,1,4], acting_primary 5 -> 3, up_primary 5 -> 3, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015 Oct 14 04:12:17 localhost ceph-osd[32440]: osd.3 pg_epoch: 35 pg[2.a( empty local-lis/les=33/34 n=0 ec=33/18 lis/c=33/33 les/c/f=34/34/0 sis=35 pruub=15.009788513s) [2,3,1] r=1 lpr=35 pi=[33,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1186.587524414s@ mbc={}] state: transitioning to Stray Oct 14 04:12:17 localhost ceph-osd[32440]: osd.3 pg_epoch: 35 pg[2.9( empty local-lis/les=33/34 n=0 ec=33/18 lis/c=33/33 les/c/f=34/34/0 sis=35 pruub=15.009704590s) [3,1,4] r=0 lpr=35 pi=[33,35)/1 crt=0'0 mlcod 0'0 unknown pruub 1186.587524414s@ mbc={}] state: transitioning to Primary Oct 14 04:12:17 localhost ceph-osd[32440]: osd.3 pg_epoch: 35 pg[2.3( empty local-lis/les=33/34 n=0 ec=33/18 lis/c=33/33 les/c/f=34/34/0 sis=35 pruub=15.009025574s) [5,3,2] r=1 lpr=35 pi=[33,35)/1 crt=0'0 mlcod 0'0 active pruub 1186.586791992s@ mbc={}] start_peering_interval up [5,2,3] -> [5,3,2], acting [5,2,3] -> [5,3,2], acting_primary 5 -> 5, up_primary 5 -> 5, role 2 -> 1, features acting 4540138322906710015 upacting 4540138322906710015 Oct 14 04:12:17 localhost ceph-osd[32440]: osd.3 pg_epoch: 35 pg[2.3( empty local-lis/les=33/34 n=0 ec=33/18 lis/c=33/33 les/c/f=34/34/0 sis=35 pruub=15.008968353s) [5,3,2] r=1 lpr=35 pi=[33,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1186.586791992s@ mbc={}] state: transitioning to Stray Oct 14 04:12:17 localhost ceph-osd[32440]: osd.3 pg_epoch: 35 pg[4.0( empty local-lis/les=21/22 n=0 ec=21/21 lis/c=21/21 les/c/f=22/22/0 sis=35 pruub=8.944942474s) [3,5,2] r=0 lpr=35 pi=[21,35)/1 crt=0'0 mlcod 0'0 active pruub 1180.522949219s@ mbc={}] 
start_peering_interval up [3,5,2] -> [3,5,2], acting [3,5,2] -> [3,5,2], acting_primary 3 -> 3, up_primary 3 -> 3, role 0 -> 0, features acting 4540138322906710015 upacting 4540138322906710015 Oct 14 04:12:17 localhost ceph-osd[32440]: osd.3 pg_epoch: 35 pg[2.6( empty local-lis/les=33/34 n=0 ec=33/18 lis/c=33/33 les/c/f=34/34/0 sis=35 pruub=15.009499550s) [3,2,1] r=0 lpr=35 pi=[33,35)/1 crt=0'0 mlcod 0'0 active pruub 1186.587524414s@ mbc={}] start_peering_interval up [5,2,3] -> [3,2,1], acting [5,2,3] -> [3,2,1], acting_primary 5 -> 3, up_primary 5 -> 3, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015 Oct 14 04:12:17 localhost ceph-osd[32440]: osd.3 pg_epoch: 35 pg[2.6( empty local-lis/les=33/34 n=0 ec=33/18 lis/c=33/33 les/c/f=34/34/0 sis=35 pruub=15.009499550s) [3,2,1] r=0 lpr=35 pi=[33,35)/1 crt=0'0 mlcod 0'0 unknown pruub 1186.587524414s@ mbc={}] state: transitioning to Primary Oct 14 04:12:17 localhost ceph-osd[32440]: osd.3 pg_epoch: 35 pg[2.5( empty local-lis/les=33/34 n=0 ec=33/18 lis/c=33/33 les/c/f=34/34/0 sis=35 pruub=15.010208130s) [2,0,1] r=-1 lpr=35 pi=[33,35)/1 crt=0'0 mlcod 0'0 active pruub 1186.588378906s@ mbc={}] start_peering_interval up [5,2,3] -> [2,0,1], acting [5,2,3] -> [2,0,1], acting_primary 5 -> 2, up_primary 5 -> 2, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Oct 14 04:12:17 localhost ceph-osd[32440]: osd.3 pg_epoch: 35 pg[2.1( empty local-lis/les=33/34 n=0 ec=33/18 lis/c=33/33 les/c/f=34/34/0 sis=35 pruub=15.004164696s) [3,4,5] r=0 lpr=35 pi=[33,35)/1 crt=0'0 mlcod 0'0 active pruub 1186.582397461s@ mbc={}] start_peering_interval up [5,2,3] -> [3,4,5], acting [5,2,3] -> [3,4,5], acting_primary 5 -> 3, up_primary 5 -> 3, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015 Oct 14 04:12:17 localhost ceph-osd[32440]: osd.3 pg_epoch: 35 pg[2.5( empty local-lis/les=33/34 n=0 ec=33/18 lis/c=33/33 les/c/f=34/34/0 sis=35 pruub=15.010116577s) [2,0,1] r=-1 
lpr=35 pi=[33,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1186.588378906s@ mbc={}] state: transitioning to Stray Oct 14 04:12:17 localhost ceph-osd[32440]: osd.3 pg_epoch: 35 pg[2.1( empty local-lis/les=33/34 n=0 ec=33/18 lis/c=33/33 les/c/f=34/34/0 sis=35 pruub=15.004164696s) [3,4,5] r=0 lpr=35 pi=[33,35)/1 crt=0'0 mlcod 0'0 unknown pruub 1186.582397461s@ mbc={}] state: transitioning to Primary Oct 14 04:12:17 localhost ceph-osd[32440]: osd.3 pg_epoch: 35 pg[2.4( empty local-lis/les=33/34 n=0 ec=33/18 lis/c=33/33 les/c/f=34/34/0 sis=35 pruub=15.008090019s) [3,2,1] r=0 lpr=35 pi=[33,35)/1 crt=0'0 mlcod 0'0 active pruub 1186.586791992s@ mbc={}] start_peering_interval up [5,2,3] -> [3,2,1], acting [5,2,3] -> [3,2,1], acting_primary 5 -> 3, up_primary 5 -> 3, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015 Oct 14 04:12:17 localhost ceph-osd[32440]: osd.3 pg_epoch: 35 pg[2.4( empty local-lis/les=33/34 n=0 ec=33/18 lis/c=33/33 les/c/f=34/34/0 sis=35 pruub=15.008090019s) [3,2,1] r=0 lpr=35 pi=[33,35)/1 crt=0'0 mlcod 0'0 unknown pruub 1186.586791992s@ mbc={}] state: transitioning to Primary Oct 14 04:12:17 localhost ceph-osd[32440]: osd.3 pg_epoch: 35 pg[2.2( empty local-lis/les=33/34 n=0 ec=33/18 lis/c=33/33 les/c/f=34/34/0 sis=35 pruub=15.007407188s) [5,0,2] r=-1 lpr=35 pi=[33,35)/1 crt=0'0 mlcod 0'0 active pruub 1186.586303711s@ mbc={}] start_peering_interval up [5,2,3] -> [5,0,2], acting [5,2,3] -> [5,0,2], acting_primary 5 -> 5, up_primary 5 -> 5, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Oct 14 04:12:17 localhost ceph-osd[32440]: osd.3 pg_epoch: 35 pg[2.8( empty local-lis/les=33/34 n=0 ec=33/18 lis/c=33/33 les/c/f=34/34/0 sis=35 pruub=15.002880096s) [1,2,0] r=-1 lpr=35 pi=[33,35)/1 crt=0'0 mlcod 0'0 active pruub 1186.581665039s@ mbc={}] start_peering_interval up [5,2,3] -> [1,2,0], acting [5,2,3] -> [1,2,0], acting_primary 5 -> 1, up_primary 5 -> 1, role 2 -> -1, features acting 4540138322906710015 
upacting 4540138322906710015 Oct 14 04:12:17 localhost ceph-osd[32440]: osd.3 pg_epoch: 35 pg[2.2( empty local-lis/les=33/34 n=0 ec=33/18 lis/c=33/33 les/c/f=34/34/0 sis=35 pruub=15.007355690s) [5,0,2] r=-1 lpr=35 pi=[33,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1186.586303711s@ mbc={}] state: transitioning to Stray Oct 14 04:12:17 localhost ceph-osd[32440]: osd.3 pg_epoch: 35 pg[2.8( empty local-lis/les=33/34 n=0 ec=33/18 lis/c=33/33 les/c/f=34/34/0 sis=35 pruub=15.002770424s) [1,2,0] r=-1 lpr=35 pi=[33,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1186.581665039s@ mbc={}] state: transitioning to Stray Oct 14 04:12:17 localhost ceph-osd[32440]: osd.3 pg_epoch: 35 pg[2.7( empty local-lis/les=33/34 n=0 ec=33/18 lis/c=33/33 les/c/f=34/34/0 sis=35 pruub=15.007686615s) [1,2,3] r=2 lpr=35 pi=[33,35)/1 crt=0'0 mlcod 0'0 active pruub 1186.586669922s@ mbc={}] start_peering_interval up [5,2,3] -> [1,2,3], acting [5,2,3] -> [1,2,3], acting_primary 5 -> 1, up_primary 5 -> 1, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Oct 14 04:12:17 localhost ceph-osd[32440]: osd.3 pg_epoch: 35 pg[2.19( empty local-lis/les=33/34 n=0 ec=33/18 lis/c=33/33 les/c/f=34/34/0 sis=35 pruub=15.002489090s) [3,5,2] r=0 lpr=35 pi=[33,35)/1 crt=0'0 mlcod 0'0 active pruub 1186.581542969s@ mbc={}] start_peering_interval up [5,2,3] -> [3,5,2], acting [5,2,3] -> [3,5,2], acting_primary 5 -> 3, up_primary 5 -> 3, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015 Oct 14 04:12:17 localhost ceph-osd[32440]: osd.3 pg_epoch: 35 pg[2.1a( empty local-lis/les=33/34 n=0 ec=33/18 lis/c=33/33 les/c/f=34/34/0 sis=35 pruub=15.008250237s) [5,4,3] r=2 lpr=35 pi=[33,35)/1 crt=0'0 mlcod 0'0 active pruub 1186.587524414s@ mbc={}] start_peering_interval up [5,2,3] -> [5,4,3], acting [5,2,3] -> [5,4,3], acting_primary 5 -> 5, up_primary 5 -> 5, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Oct 14 04:12:17 localhost ceph-osd[32440]: 
osd.3 pg_epoch: 35 pg[2.19( empty local-lis/les=33/34 n=0 ec=33/18 lis/c=33/33 les/c/f=34/34/0 sis=35 pruub=15.002489090s) [3,5,2] r=0 lpr=35 pi=[33,35)/1 crt=0'0 mlcod 0'0 unknown pruub 1186.581542969s@ mbc={}] state: transitioning to Primary Oct 14 04:12:17 localhost ceph-osd[32440]: osd.3 pg_epoch: 35 pg[2.1a( empty local-lis/les=33/34 n=0 ec=33/18 lis/c=33/33 les/c/f=34/34/0 sis=35 pruub=15.008194923s) [5,4,3] r=2 lpr=35 pi=[33,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1186.587524414s@ mbc={}] state: transitioning to Stray Oct 14 04:12:17 localhost ceph-osd[32440]: osd.3 pg_epoch: 35 pg[2.1b( empty local-lis/les=33/34 n=0 ec=33/18 lis/c=33/33 les/c/f=34/34/0 sis=35 pruub=15.002889633s) [2,5,3] r=2 lpr=35 pi=[33,35)/1 crt=0'0 mlcod 0'0 active pruub 1186.582275391s@ mbc={}] start_peering_interval up [5,2,3] -> [2,5,3], acting [5,2,3] -> [2,5,3], acting_primary 5 -> 2, up_primary 5 -> 2, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Oct 14 04:12:17 localhost ceph-osd[32440]: osd.3 pg_epoch: 35 pg[2.1b( empty local-lis/les=33/34 n=0 ec=33/18 lis/c=33/33 les/c/f=34/34/0 sis=35 pruub=15.002835274s) [2,5,3] r=2 lpr=35 pi=[33,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1186.582275391s@ mbc={}] state: transitioning to Stray Oct 14 04:12:17 localhost ceph-osd[32440]: osd.3 pg_epoch: 35 pg[2.1c( empty local-lis/les=33/34 n=0 ec=33/18 lis/c=33/33 les/c/f=34/34/0 sis=35 pruub=15.006642342s) [4,1,0] r=-1 lpr=35 pi=[33,35)/1 crt=0'0 mlcod 0'0 active pruub 1186.586181641s@ mbc={}] start_peering_interval up [5,2,3] -> [4,1,0], acting [5,2,3] -> [4,1,0], acting_primary 5 -> 4, up_primary 5 -> 4, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Oct 14 04:12:17 localhost ceph-osd[32440]: osd.3 pg_epoch: 35 pg[2.1d( empty local-lis/les=33/34 n=0 ec=33/18 lis/c=33/33 les/c/f=34/34/0 sis=35 pruub=15.003362656s) [4,5,0] r=-1 lpr=35 pi=[33,35)/1 crt=0'0 mlcod 0'0 active pruub 1186.582885742s@ mbc={}] 
start_peering_interval up [5,2,3] -> [4,5,0], acting [5,2,3] -> [4,5,0], acting_primary 5 -> 4, up_primary 5 -> 4, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Oct 14 04:12:17 localhost ceph-osd[32440]: osd.3 pg_epoch: 35 pg[2.1c( empty local-lis/les=33/34 n=0 ec=33/18 lis/c=33/33 les/c/f=34/34/0 sis=35 pruub=15.006575584s) [4,1,0] r=-1 lpr=35 pi=[33,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1186.586181641s@ mbc={}] state: transitioning to Stray Oct 14 04:12:17 localhost ceph-osd[32440]: osd.3 pg_epoch: 35 pg[2.1d( empty local-lis/les=33/34 n=0 ec=33/18 lis/c=33/33 les/c/f=34/34/0 sis=35 pruub=15.003265381s) [4,5,0] r=-1 lpr=35 pi=[33,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1186.582885742s@ mbc={}] state: transitioning to Stray Oct 14 04:12:17 localhost ceph-osd[32440]: osd.3 pg_epoch: 35 pg[2.1e( empty local-lis/les=33/34 n=0 ec=33/18 lis/c=33/33 les/c/f=34/34/0 sis=35 pruub=15.003407478s) [3,4,5] r=0 lpr=35 pi=[33,35)/1 crt=0'0 mlcod 0'0 active pruub 1186.583251953s@ mbc={}] start_peering_interval up [5,2,3] -> [3,4,5], acting [5,2,3] -> [3,4,5], acting_primary 5 -> 3, up_primary 5 -> 3, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015 Oct 14 04:12:17 localhost ceph-osd[32440]: osd.3 pg_epoch: 35 pg[2.1f( empty local-lis/les=33/34 n=0 ec=33/18 lis/c=33/33 les/c/f=34/34/0 sis=35 pruub=15.003489494s) [0,1,4] r=-1 lpr=35 pi=[33,35)/1 crt=0'0 mlcod 0'0 active pruub 1186.583374023s@ mbc={}] start_peering_interval up [5,2,3] -> [0,1,4], acting [5,2,3] -> [0,1,4], acting_primary 5 -> 0, up_primary 5 -> 0, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Oct 14 04:12:17 localhost ceph-osd[31500]: osd.0 pg_epoch: 35 pg[2.1f( empty local-lis/les=0/0 n=0 ec=33/18 lis/c=33/33 les/c/f=34/34/0 sis=35) [0,1,4] r=0 lpr=35 pi=[33,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Oct 14 04:12:17 localhost ceph-osd[32440]: osd.3 pg_epoch: 35 pg[2.1e( empty 
local-lis/les=33/34 n=0 ec=33/18 lis/c=33/33 les/c/f=34/34/0 sis=35 pruub=15.003407478s) [3,4,5] r=0 lpr=35 pi=[33,35)/1 crt=0'0 mlcod 0'0 unknown pruub 1186.583251953s@ mbc={}] state: transitioning to Primary Oct 14 04:12:17 localhost ceph-osd[32440]: osd.3 pg_epoch: 35 pg[2.1f( empty local-lis/les=33/34 n=0 ec=33/18 lis/c=33/33 les/c/f=34/34/0 sis=35 pruub=15.003440857s) [0,1,4] r=-1 lpr=35 pi=[33,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1186.583374023s@ mbc={}] state: transitioning to Stray Oct 14 04:12:17 localhost ceph-osd[32440]: osd.3 pg_epoch: 35 pg[2.7( empty local-lis/les=33/34 n=0 ec=33/18 lis/c=33/33 les/c/f=34/34/0 sis=35 pruub=15.007259369s) [1,2,3] r=2 lpr=35 pi=[33,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1186.586669922s@ mbc={}] state: transitioning to Stray Oct 14 04:12:17 localhost ceph-osd[32440]: osd.3 pg_epoch: 35 pg[4.0( empty local-lis/les=21/22 n=0 ec=21/21 lis/c=21/21 les/c/f=22/22/0 sis=35 pruub=8.944942474s) [3,5,2] r=0 lpr=35 pi=[21,35)/1 crt=0'0 mlcod 0'0 unknown pruub 1180.522949219s@ mbc={}] state: transitioning to Primary Oct 14 04:12:17 localhost ceph-osd[31500]: osd.0 pg_epoch: 35 pg[3.0( empty local-lis/les=20/21 n=0 ec=20/20 lis/c=20/20 les/c/f=21/21/0 sis=35 pruub=15.724010468s) [2,1,0] r=2 lpr=35 pi=[20,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1191.556274414s@ mbc={}] state: transitioning to Stray Oct 14 04:12:18 localhost ceph-osd[31500]: osd.0 pg_epoch: 36 pg[3.1e( empty local-lis/les=20/21 n=0 ec=35/20 lis/c=20/20 les/c/f=21/21/0 sis=35) [2,1,0] r=2 lpr=35 pi=[20,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Oct 14 04:12:18 localhost ceph-osd[31500]: osd.0 pg_epoch: 36 pg[3.1d( empty local-lis/les=20/21 n=0 ec=35/20 lis/c=20/20 les/c/f=21/21/0 sis=35) [2,1,0] r=2 lpr=35 pi=[20,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Oct 14 04:12:18 localhost ceph-osd[31500]: osd.0 pg_epoch: 36 pg[3.a( empty local-lis/les=20/21 n=0 ec=35/20 lis/c=20/20 
les/c/f=21/21/0 sis=35) [2,1,0] r=2 lpr=35 pi=[20,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Oct 14 04:12:18 localhost ceph-osd[31500]: osd.0 pg_epoch: 36 pg[3.b( empty local-lis/les=20/21 n=0 ec=35/20 lis/c=20/20 les/c/f=21/21/0 sis=35) [2,1,0] r=2 lpr=35 pi=[20,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Oct 14 04:12:18 localhost ceph-osd[31500]: osd.0 pg_epoch: 36 pg[3.1f( empty local-lis/les=20/21 n=0 ec=35/20 lis/c=20/20 les/c/f=21/21/0 sis=35) [2,1,0] r=2 lpr=35 pi=[20,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Oct 14 04:12:18 localhost ceph-osd[31500]: osd.0 pg_epoch: 36 pg[3.9( empty local-lis/les=20/21 n=0 ec=35/20 lis/c=20/20 les/c/f=21/21/0 sis=35) [2,1,0] r=2 lpr=35 pi=[20,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Oct 14 04:12:18 localhost ceph-osd[31500]: osd.0 pg_epoch: 36 pg[3.7( empty local-lis/les=20/21 n=0 ec=35/20 lis/c=20/20 les/c/f=21/21/0 sis=35) [2,1,0] r=2 lpr=35 pi=[20,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Oct 14 04:12:18 localhost ceph-osd[31500]: osd.0 pg_epoch: 36 pg[3.8( empty local-lis/les=20/21 n=0 ec=35/20 lis/c=20/20 les/c/f=21/21/0 sis=35) [2,1,0] r=2 lpr=35 pi=[20,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Oct 14 04:12:18 localhost ceph-osd[31500]: osd.0 pg_epoch: 36 pg[3.6( empty local-lis/les=20/21 n=0 ec=35/20 lis/c=20/20 les/c/f=21/21/0 sis=35) [2,1,0] r=2 lpr=35 pi=[20,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Oct 14 04:12:18 localhost ceph-osd[31500]: osd.0 pg_epoch: 36 pg[3.4( empty local-lis/les=20/21 n=0 ec=35/20 lis/c=20/20 les/c/f=21/21/0 sis=35) [2,1,0] r=2 lpr=35 pi=[20,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Oct 14 04:12:18 localhost ceph-osd[31500]: osd.0 pg_epoch: 36 pg[3.3( empty local-lis/les=20/21 n=0 ec=35/20 lis/c=20/20 
les/c/f=21/21/0 sis=35) [2,1,0] r=2 lpr=35 pi=[20,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Oct 14 04:12:18 localhost ceph-osd[31500]: osd.0 pg_epoch: 36 pg[3.2( empty local-lis/les=20/21 n=0 ec=35/20 lis/c=20/20 les/c/f=21/21/0 sis=35) [2,1,0] r=2 lpr=35 pi=[20,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Oct 14 04:12:18 localhost ceph-osd[31500]: osd.0 pg_epoch: 36 pg[3.1c( empty local-lis/les=20/21 n=0 ec=35/20 lis/c=20/20 les/c/f=21/21/0 sis=35) [2,1,0] r=2 lpr=35 pi=[20,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Oct 14 04:12:18 localhost ceph-osd[31500]: osd.0 pg_epoch: 36 pg[3.1( empty local-lis/les=20/21 n=0 ec=35/20 lis/c=20/20 les/c/f=21/21/0 sis=35) [2,1,0] r=2 lpr=35 pi=[20,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Oct 14 04:12:18 localhost ceph-osd[31500]: osd.0 pg_epoch: 36 pg[3.d( empty local-lis/les=20/21 n=0 ec=35/20 lis/c=20/20 les/c/f=21/21/0 sis=35) [2,1,0] r=2 lpr=35 pi=[20,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Oct 14 04:12:18 localhost ceph-osd[31500]: osd.0 pg_epoch: 36 pg[3.5( empty local-lis/les=20/21 n=0 ec=35/20 lis/c=20/20 les/c/f=21/21/0 sis=35) [2,1,0] r=2 lpr=35 pi=[20,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Oct 14 04:12:18 localhost ceph-osd[31500]: osd.0 pg_epoch: 36 pg[3.f( empty local-lis/les=20/21 n=0 ec=35/20 lis/c=20/20 les/c/f=21/21/0 sis=35) [2,1,0] r=2 lpr=35 pi=[20,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Oct 14 04:12:18 localhost ceph-osd[31500]: osd.0 pg_epoch: 36 pg[3.10( empty local-lis/les=20/21 n=0 ec=35/20 lis/c=20/20 les/c/f=21/21/0 sis=35) [2,1,0] r=2 lpr=35 pi=[20,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Oct 14 04:12:18 localhost ceph-osd[31500]: osd.0 pg_epoch: 36 pg[3.11( empty local-lis/les=20/21 n=0 ec=35/20 lis/c=20/20 
les/c/f=21/21/0 sis=35) [2,1,0] r=2 lpr=35 pi=[20,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Oct 14 04:12:18 localhost ceph-osd[31500]: osd.0 pg_epoch: 36 pg[3.12( empty local-lis/les=20/21 n=0 ec=35/20 lis/c=20/20 les/c/f=21/21/0 sis=35) [2,1,0] r=2 lpr=35 pi=[20,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Oct 14 04:12:18 localhost ceph-osd[31500]: osd.0 pg_epoch: 36 pg[3.e( empty local-lis/les=20/21 n=0 ec=35/20 lis/c=20/20 les/c/f=21/21/0 sis=35) [2,1,0] r=2 lpr=35 pi=[20,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Oct 14 04:12:18 localhost ceph-osd[31500]: osd.0 pg_epoch: 36 pg[3.13( empty local-lis/les=20/21 n=0 ec=35/20 lis/c=20/20 les/c/f=21/21/0 sis=35) [2,1,0] r=2 lpr=35 pi=[20,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Oct 14 04:12:18 localhost ceph-osd[31500]: osd.0 pg_epoch: 36 pg[3.c( empty local-lis/les=20/21 n=0 ec=35/20 lis/c=20/20 les/c/f=21/21/0 sis=35) [2,1,0] r=2 lpr=35 pi=[20,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Oct 14 04:12:18 localhost ceph-osd[31500]: osd.0 pg_epoch: 36 pg[3.14( empty local-lis/les=20/21 n=0 ec=35/20 lis/c=20/20 les/c/f=21/21/0 sis=35) [2,1,0] r=2 lpr=35 pi=[20,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Oct 14 04:12:18 localhost ceph-osd[31500]: osd.0 pg_epoch: 36 pg[3.15( empty local-lis/les=20/21 n=0 ec=35/20 lis/c=20/20 les/c/f=21/21/0 sis=35) [2,1,0] r=2 lpr=35 pi=[20,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Oct 14 04:12:18 localhost ceph-osd[31500]: osd.0 pg_epoch: 36 pg[3.16( empty local-lis/les=20/21 n=0 ec=35/20 lis/c=20/20 les/c/f=21/21/0 sis=35) [2,1,0] r=2 lpr=35 pi=[20,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Oct 14 04:12:18 localhost ceph-osd[31500]: osd.0 pg_epoch: 36 pg[3.18( empty local-lis/les=20/21 n=0 ec=35/20 lis/c=20/20 
les/c/f=21/21/0 sis=35) [2,1,0] r=2 lpr=35 pi=[20,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Oct 14 04:12:18 localhost ceph-osd[31500]: osd.0 pg_epoch: 36 pg[3.1a( empty local-lis/les=20/21 n=0 ec=35/20 lis/c=20/20 les/c/f=21/21/0 sis=35) [2,1,0] r=2 lpr=35 pi=[20,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Oct 14 04:12:18 localhost ceph-osd[31500]: osd.0 pg_epoch: 36 pg[3.19( empty local-lis/les=20/21 n=0 ec=35/20 lis/c=20/20 les/c/f=21/21/0 sis=35) [2,1,0] r=2 lpr=35 pi=[20,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Oct 14 04:12:18 localhost ceph-osd[31500]: osd.0 pg_epoch: 36 pg[3.1b( empty local-lis/les=20/21 n=0 ec=35/20 lis/c=20/20 les/c/f=21/21/0 sis=35) [2,1,0] r=2 lpr=35 pi=[20,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Oct 14 04:12:18 localhost ceph-osd[31500]: osd.0 pg_epoch: 36 pg[3.17( empty local-lis/les=20/21 n=0 ec=35/20 lis/c=20/20 les/c/f=21/21/0 sis=35) [2,1,0] r=2 lpr=35 pi=[20,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Oct 14 04:12:18 localhost ceph-osd[32440]: osd.3 pg_epoch: 36 pg[4.1e( empty local-lis/les=21/22 n=0 ec=35/21 lis/c=21/21 les/c/f=22/22/0 sis=35) [3,5,2] r=0 lpr=35 pi=[21,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Oct 14 04:12:18 localhost ceph-osd[32440]: osd.3 pg_epoch: 36 pg[4.11( empty local-lis/les=21/22 n=0 ec=35/21 lis/c=21/21 les/c/f=22/22/0 sis=35) [3,5,2] r=0 lpr=35 pi=[21,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Oct 14 04:12:18 localhost ceph-osd[32440]: osd.3 pg_epoch: 36 pg[4.10( empty local-lis/les=21/22 n=0 ec=35/21 lis/c=21/21 les/c/f=22/22/0 sis=35) [3,5,2] r=0 lpr=35 pi=[21,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Oct 14 04:12:18 localhost ceph-osd[32440]: osd.3 pg_epoch: 36 pg[4.13( empty local-lis/les=21/22 n=0 ec=35/21 lis/c=21/21 les/c/f=22/22/0 
sis=35) [3,5,2] r=0 lpr=35 pi=[21,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Oct 14 04:12:18 localhost ceph-osd[32440]: osd.3 pg_epoch: 36 pg[4.12( empty local-lis/les=21/22 n=0 ec=35/21 lis/c=21/21 les/c/f=22/22/0 sis=35) [3,5,2] r=0 lpr=35 pi=[21,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Oct 14 04:12:18 localhost ceph-osd[32440]: osd.3 pg_epoch: 36 pg[4.14( empty local-lis/les=21/22 n=0 ec=35/21 lis/c=21/21 les/c/f=22/22/0 sis=35) [3,5,2] r=0 lpr=35 pi=[21,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Oct 14 04:12:18 localhost ceph-osd[32440]: osd.3 pg_epoch: 36 pg[4.17( empty local-lis/les=21/22 n=0 ec=35/21 lis/c=21/21 les/c/f=22/22/0 sis=35) [3,5,2] r=0 lpr=35 pi=[21,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Oct 14 04:12:18 localhost ceph-osd[32440]: osd.3 pg_epoch: 36 pg[4.16( empty local-lis/les=21/22 n=0 ec=35/21 lis/c=21/21 les/c/f=22/22/0 sis=35) [3,5,2] r=0 lpr=35 pi=[21,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Oct 14 04:12:18 localhost ceph-osd[32440]: osd.3 pg_epoch: 36 pg[4.15( empty local-lis/les=21/22 n=0 ec=35/21 lis/c=21/21 les/c/f=22/22/0 sis=35) [3,5,2] r=0 lpr=35 pi=[21,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Oct 14 04:12:18 localhost ceph-osd[32440]: osd.3 pg_epoch: 36 pg[4.9( empty local-lis/les=21/22 n=0 ec=35/21 lis/c=21/21 les/c/f=22/22/0 sis=35) [3,5,2] r=0 lpr=35 pi=[21,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Oct 14 04:12:18 localhost ceph-osd[32440]: osd.3 pg_epoch: 36 pg[4.8( empty local-lis/les=21/22 n=0 ec=35/21 lis/c=21/21 les/c/f=22/22/0 sis=35) [3,5,2] r=0 lpr=35 pi=[21,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Oct 14 04:12:18 localhost ceph-osd[32440]: osd.3 pg_epoch: 36 pg[4.a( empty local-lis/les=21/22 n=0 ec=35/21 lis/c=21/21 les/c/f=22/22/0 sis=35) [3,5,2] r=0 lpr=35 pi=[21,35)/1 crt=0'0 
mlcod 0'0 unknown mbc={}] state: transitioning to Primary Oct 14 04:12:18 localhost ceph-osd[32440]: osd.3 pg_epoch: 36 pg[4.d( empty local-lis/les=21/22 n=0 ec=35/21 lis/c=21/21 les/c/f=22/22/0 sis=35) [3,5,2] r=0 lpr=35 pi=[21,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Oct 14 04:12:18 localhost ceph-osd[32440]: osd.3 pg_epoch: 36 pg[4.c( empty local-lis/les=21/22 n=0 ec=35/21 lis/c=21/21 les/c/f=22/22/0 sis=35) [3,5,2] r=0 lpr=35 pi=[21,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Oct 14 04:12:18 localhost ceph-osd[32440]: osd.3 pg_epoch: 36 pg[4.f( empty local-lis/les=21/22 n=0 ec=35/21 lis/c=21/21 les/c/f=22/22/0 sis=35) [3,5,2] r=0 lpr=35 pi=[21,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Oct 14 04:12:18 localhost ceph-osd[32440]: osd.3 pg_epoch: 36 pg[4.5( empty local-lis/les=21/22 n=0 ec=35/21 lis/c=21/21 les/c/f=22/22/0 sis=35) [3,5,2] r=0 lpr=35 pi=[21,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Oct 14 04:12:18 localhost ceph-osd[32440]: osd.3 pg_epoch: 36 pg[4.6( empty local-lis/les=21/22 n=0 ec=35/21 lis/c=21/21 les/c/f=22/22/0 sis=35) [3,5,2] r=0 lpr=35 pi=[21,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Oct 14 04:12:18 localhost ceph-osd[32440]: osd.3 pg_epoch: 36 pg[4.1( empty local-lis/les=21/22 n=0 ec=35/21 lis/c=21/21 les/c/f=22/22/0 sis=35) [3,5,2] r=0 lpr=35 pi=[21,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Oct 14 04:12:18 localhost ceph-osd[32440]: osd.3 pg_epoch: 36 pg[4.b( empty local-lis/les=21/22 n=0 ec=35/21 lis/c=21/21 les/c/f=22/22/0 sis=35) [3,5,2] r=0 lpr=35 pi=[21,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Oct 14 04:12:18 localhost ceph-osd[32440]: osd.3 pg_epoch: 36 pg[4.7( empty local-lis/les=21/22 n=0 ec=35/21 lis/c=21/21 les/c/f=22/22/0 sis=35) [3,5,2] r=0 lpr=35 pi=[21,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to 
Primary Oct 14 04:12:18 localhost ceph-osd[32440]: osd.3 pg_epoch: 36 pg[4.4( empty local-lis/les=21/22 n=0 ec=35/21 lis/c=21/21 les/c/f=22/22/0 sis=35) [3,5,2] r=0 lpr=35 pi=[21,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Oct 14 04:12:18 localhost ceph-osd[32440]: osd.3 pg_epoch: 36 pg[4.2( empty local-lis/les=21/22 n=0 ec=35/21 lis/c=21/21 les/c/f=22/22/0 sis=35) [3,5,2] r=0 lpr=35 pi=[21,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Oct 14 04:12:18 localhost ceph-osd[32440]: osd.3 pg_epoch: 36 pg[4.3( empty local-lis/les=21/22 n=0 ec=35/21 lis/c=21/21 les/c/f=22/22/0 sis=35) [3,5,2] r=0 lpr=35 pi=[21,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Oct 14 04:12:18 localhost ceph-osd[32440]: osd.3 pg_epoch: 36 pg[4.e( empty local-lis/les=21/22 n=0 ec=35/21 lis/c=21/21 les/c/f=22/22/0 sis=35) [3,5,2] r=0 lpr=35 pi=[21,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Oct 14 04:12:18 localhost ceph-osd[32440]: osd.3 pg_epoch: 36 pg[4.1f( empty local-lis/les=21/22 n=0 ec=35/21 lis/c=21/21 les/c/f=22/22/0 sis=35) [3,5,2] r=0 lpr=35 pi=[21,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Oct 14 04:12:18 localhost ceph-osd[32440]: osd.3 pg_epoch: 36 pg[4.1c( empty local-lis/les=21/22 n=0 ec=35/21 lis/c=21/21 les/c/f=22/22/0 sis=35) [3,5,2] r=0 lpr=35 pi=[21,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Oct 14 04:12:18 localhost ceph-osd[32440]: osd.3 pg_epoch: 36 pg[4.1d( empty local-lis/les=21/22 n=0 ec=35/21 lis/c=21/21 les/c/f=22/22/0 sis=35) [3,5,2] r=0 lpr=35 pi=[21,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Oct 14 04:12:18 localhost ceph-osd[32440]: osd.3 pg_epoch: 36 pg[4.18( empty local-lis/les=21/22 n=0 ec=35/21 lis/c=21/21 les/c/f=22/22/0 sis=35) [3,5,2] r=0 lpr=35 pi=[21,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Oct 14 04:12:18 localhost ceph-osd[32440]: 
osd.3 pg_epoch: 36 pg[4.1b( empty local-lis/les=21/22 n=0 ec=35/21 lis/c=21/21 les/c/f=22/22/0 sis=35) [3,5,2] r=0 lpr=35 pi=[21,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Oct 14 04:12:18 localhost ceph-osd[32440]: osd.3 pg_epoch: 36 pg[4.19( empty local-lis/les=21/22 n=0 ec=35/21 lis/c=21/21 les/c/f=22/22/0 sis=35) [3,5,2] r=0 lpr=35 pi=[21,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Oct 14 04:12:18 localhost ceph-osd[32440]: osd.3 pg_epoch: 36 pg[4.1a( empty local-lis/les=21/22 n=0 ec=35/21 lis/c=21/21 les/c/f=22/22/0 sis=35) [3,5,2] r=0 lpr=35 pi=[21,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Oct 14 04:12:18 localhost ceph-osd[31500]: osd.0 pg_epoch: 35 pg[2.15( empty local-lis/les=0/0 n=0 ec=33/18 lis/c=33/33 les/c/f=34/34/0 sis=35) [4,0,5] r=1 lpr=35 pi=[33,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Oct 14 04:12:18 localhost ceph-osd[31500]: osd.0 pg_epoch: 35 pg[2.b( empty local-lis/les=0/0 n=0 ec=33/18 lis/c=33/33 les/c/f=34/34/0 sis=35) [4,5,0] r=2 lpr=35 pi=[33,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Oct 14 04:12:18 localhost ceph-osd[31500]: osd.0 pg_epoch: 35 pg[2.1c( empty local-lis/les=0/0 n=0 ec=33/18 lis/c=33/33 les/c/f=34/34/0 sis=35) [4,1,0] r=2 lpr=35 pi=[33,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Oct 14 04:12:18 localhost ceph-osd[31500]: osd.0 pg_epoch: 35 pg[2.1d( empty local-lis/les=0/0 n=0 ec=33/18 lis/c=33/33 les/c/f=34/34/0 sis=35) [4,5,0] r=2 lpr=35 pi=[33,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Oct 14 04:12:18 localhost ceph-osd[31500]: osd.0 pg_epoch: 35 pg[2.8( empty local-lis/les=0/0 n=0 ec=33/18 lis/c=33/33 les/c/f=34/34/0 sis=35) [1,2,0] r=2 lpr=35 pi=[33,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Oct 14 04:12:18 localhost ceph-osd[31500]: osd.0 pg_epoch: 35 pg[2.f( empty local-lis/les=0/0 n=0 ec=33/18 
lis/c=33/33 les/c/f=34/34/0 sis=35) [2,1,0] r=2 lpr=35 pi=[33,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Oct 14 04:12:18 localhost ceph-osd[31500]: osd.0 pg_epoch: 35 pg[2.c( empty local-lis/les=0/0 n=0 ec=33/18 lis/c=33/33 les/c/f=34/34/0 sis=35) [2,0,1] r=1 lpr=35 pi=[33,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Oct 14 04:12:18 localhost ceph-osd[31500]: osd.0 pg_epoch: 35 pg[2.10( empty local-lis/les=0/0 n=0 ec=33/18 lis/c=33/33 les/c/f=34/34/0 sis=35) [2,0,5] r=1 lpr=35 pi=[33,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Oct 14 04:12:18 localhost ceph-osd[31500]: osd.0 pg_epoch: 35 pg[2.5( empty local-lis/les=0/0 n=0 ec=33/18 lis/c=33/33 les/c/f=34/34/0 sis=35) [2,0,1] r=1 lpr=35 pi=[33,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Oct 14 04:12:18 localhost ceph-osd[31500]: osd.0 pg_epoch: 35 pg[2.14( empty local-lis/les=0/0 n=0 ec=33/18 lis/c=33/33 les/c/f=34/34/0 sis=35) [5,4,0] r=2 lpr=35 pi=[33,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Oct 14 04:12:18 localhost ceph-osd[31500]: osd.0 pg_epoch: 35 pg[2.16( empty local-lis/les=0/0 n=0 ec=33/18 lis/c=33/33 les/c/f=34/34/0 sis=35) [5,2,0] r=2 lpr=35 pi=[33,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Oct 14 04:12:18 localhost ceph-osd[31500]: osd.0 pg_epoch: 35 pg[2.2( empty local-lis/les=0/0 n=0 ec=33/18 lis/c=33/33 les/c/f=34/34/0 sis=35) [5,0,2] r=1 lpr=35 pi=[33,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Oct 14 04:12:18 localhost ceph-osd[32440]: osd.3 pg_epoch: 36 pg[2.9( empty local-lis/les=35/36 n=0 ec=33/18 lis/c=33/33 les/c/f=34/34/0 sis=35) [3,1,4] r=0 lpr=35 pi=[33,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Oct 14 04:12:18 localhost ceph-osd[31500]: osd.0 pg_epoch: 36 pg[2.1f( empty local-lis/les=35/36 n=0 ec=33/18 lis/c=33/33 les/c/f=34/34/0 sis=35) [0,1,4] r=0 lpr=35 
pi=[33,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Oct 14 04:12:18 localhost ceph-osd[32440]: osd.3 pg_epoch: 36 pg[2.6( empty local-lis/les=35/36 n=0 ec=33/18 lis/c=33/33 les/c/f=34/34/0 sis=35) [3,2,1] r=0 lpr=35 pi=[33,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Oct 14 04:12:18 localhost ceph-osd[32440]: osd.3 pg_epoch: 36 pg[2.4( empty local-lis/les=35/36 n=0 ec=33/18 lis/c=33/33 les/c/f=34/34/0 sis=35) [3,2,1] r=0 lpr=35 pi=[33,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Oct 14 04:12:18 localhost ceph-osd[32440]: osd.3 pg_epoch: 36 pg[2.e( empty local-lis/les=35/36 n=0 ec=33/18 lis/c=33/33 les/c/f=34/34/0 sis=35) [3,4,5] r=0 lpr=35 pi=[33,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Oct 14 04:12:18 localhost ceph-osd[32440]: osd.3 pg_epoch: 36 pg[4.0( empty local-lis/les=35/36 n=0 ec=21/21 lis/c=21/21 les/c/f=22/22/0 sis=35) [3,5,2] r=0 lpr=35 pi=[21,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Oct 14 04:12:18 localhost ceph-osd[32440]: osd.3 pg_epoch: 36 pg[2.1( empty local-lis/les=35/36 n=0 ec=33/18 lis/c=33/33 les/c/f=34/34/0 sis=35) [3,4,5] r=0 lpr=35 pi=[33,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Oct 14 04:12:18 localhost ceph-osd[32440]: osd.3 pg_epoch: 36 pg[2.19( empty local-lis/les=35/36 n=0 ec=33/18 lis/c=33/33 les/c/f=34/34/0 sis=35) [3,5,2] r=0 lpr=35 pi=[33,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Oct 14 04:12:18 localhost ceph-osd[32440]: osd.3 pg_epoch: 36 pg[2.1e( empty local-lis/les=35/36 n=0 ec=33/18 lis/c=33/33 les/c/f=34/34/0 sis=35) [3,4,5] r=0 lpr=35 pi=[33,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Oct 14 04:12:18 localhost ceph-osd[32440]: osd.3 
pg_epoch: 36 pg[4.f( empty local-lis/les=35/36 n=0 ec=35/21 lis/c=21/21 les/c/f=22/22/0 sis=35) [3,5,2] r=0 lpr=35 pi=[21,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Oct 14 04:12:18 localhost ceph-osd[32440]: osd.3 pg_epoch: 36 pg[4.6( empty local-lis/les=35/36 n=0 ec=35/21 lis/c=21/21 les/c/f=22/22/0 sis=35) [3,5,2] r=0 lpr=35 pi=[21,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Oct 14 04:12:18 localhost ceph-osd[32440]: osd.3 pg_epoch: 36 pg[4.8( empty local-lis/les=35/36 n=0 ec=35/21 lis/c=21/21 les/c/f=22/22/0 sis=35) [3,5,2] r=0 lpr=35 pi=[21,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Oct 14 04:12:18 localhost ceph-osd[32440]: osd.3 pg_epoch: 36 pg[4.5( empty local-lis/les=35/36 n=0 ec=35/21 lis/c=21/21 les/c/f=22/22/0 sis=35) [3,5,2] r=0 lpr=35 pi=[21,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Oct 14 04:12:18 localhost ceph-osd[32440]: osd.3 pg_epoch: 36 pg[4.a( empty local-lis/les=35/36 n=0 ec=35/21 lis/c=21/21 les/c/f=22/22/0 sis=35) [3,5,2] r=0 lpr=35 pi=[21,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Oct 14 04:12:18 localhost ceph-osd[32440]: osd.3 pg_epoch: 36 pg[4.17( empty local-lis/les=35/36 n=0 ec=35/21 lis/c=21/21 les/c/f=22/22/0 sis=35) [3,5,2] r=0 lpr=35 pi=[21,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Oct 14 04:12:18 localhost ceph-osd[32440]: osd.3 pg_epoch: 36 pg[4.9( empty local-lis/les=35/36 n=0 ec=35/21 lis/c=21/21 les/c/f=22/22/0 sis=35) [3,5,2] r=0 lpr=35 pi=[21,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Oct 14 04:12:18 localhost ceph-osd[32440]: osd.3 pg_epoch: 36 pg[4.14( empty local-lis/les=35/36 n=0 ec=35/21 lis/c=21/21 les/c/f=22/22/0 sis=35) [3,5,2] r=0 lpr=35 pi=[21,35)/1 crt=0'0 mlcod 0'0 
active mbc={}] state: react AllReplicasActivated Activating complete Oct 14 04:12:18 localhost ceph-osd[32440]: osd.3 pg_epoch: 36 pg[4.16( empty local-lis/les=35/36 n=0 ec=35/21 lis/c=21/21 les/c/f=22/22/0 sis=35) [3,5,2] r=0 lpr=35 pi=[21,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Oct 14 04:12:18 localhost ceph-osd[32440]: osd.3 pg_epoch: 36 pg[4.15( empty local-lis/les=35/36 n=0 ec=35/21 lis/c=21/21 les/c/f=22/22/0 sis=35) [3,5,2] r=0 lpr=35 pi=[21,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Oct 14 04:12:18 localhost ceph-osd[32440]: osd.3 pg_epoch: 36 pg[4.12( empty local-lis/les=35/36 n=0 ec=35/21 lis/c=21/21 les/c/f=22/22/0 sis=35) [3,5,2] r=0 lpr=35 pi=[21,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Oct 14 04:12:18 localhost ceph-osd[32440]: osd.3 pg_epoch: 36 pg[4.10( empty local-lis/les=35/36 n=0 ec=35/21 lis/c=21/21 les/c/f=22/22/0 sis=35) [3,5,2] r=0 lpr=35 pi=[21,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Oct 14 04:12:18 localhost ceph-osd[32440]: osd.3 pg_epoch: 36 pg[4.1e( empty local-lis/les=35/36 n=0 ec=35/21 lis/c=21/21 les/c/f=22/22/0 sis=35) [3,5,2] r=0 lpr=35 pi=[21,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Oct 14 04:12:18 localhost ceph-osd[32440]: osd.3 pg_epoch: 36 pg[4.d( empty local-lis/les=35/36 n=0 ec=35/21 lis/c=21/21 les/c/f=22/22/0 sis=35) [3,5,2] r=0 lpr=35 pi=[21,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Oct 14 04:12:18 localhost ceph-osd[32440]: osd.3 pg_epoch: 36 pg[4.13( empty local-lis/les=35/36 n=0 ec=35/21 lis/c=21/21 les/c/f=22/22/0 sis=35) [3,5,2] r=0 lpr=35 pi=[21,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Oct 14 04:12:18 localhost ceph-osd[32440]: osd.3 pg_epoch: 36 pg[4.2( empty 
local-lis/les=35/36 n=0 ec=35/21 lis/c=21/21 les/c/f=22/22/0 sis=35) [3,5,2] r=0 lpr=35 pi=[21,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Oct 14 04:12:18 localhost ceph-osd[32440]: osd.3 pg_epoch: 36 pg[4.b( empty local-lis/les=35/36 n=0 ec=35/21 lis/c=21/21 les/c/f=22/22/0 sis=35) [3,5,2] r=0 lpr=35 pi=[21,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Oct 14 04:12:18 localhost ceph-osd[32440]: osd.3 pg_epoch: 36 pg[4.c( empty local-lis/les=35/36 n=0 ec=35/21 lis/c=21/21 les/c/f=22/22/0 sis=35) [3,5,2] r=0 lpr=35 pi=[21,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Oct 14 04:12:18 localhost ceph-osd[32440]: osd.3 pg_epoch: 36 pg[4.3( empty local-lis/les=35/36 n=0 ec=35/21 lis/c=21/21 les/c/f=22/22/0 sis=35) [3,5,2] r=0 lpr=35 pi=[21,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Oct 14 04:12:18 localhost ceph-osd[32440]: osd.3 pg_epoch: 36 pg[4.1( empty local-lis/les=35/36 n=0 ec=35/21 lis/c=21/21 les/c/f=22/22/0 sis=35) [3,5,2] r=0 lpr=35 pi=[21,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Oct 14 04:12:18 localhost ceph-osd[32440]: osd.3 pg_epoch: 36 pg[4.1f( empty local-lis/les=35/36 n=0 ec=35/21 lis/c=21/21 les/c/f=22/22/0 sis=35) [3,5,2] r=0 lpr=35 pi=[21,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Oct 14 04:12:18 localhost ceph-osd[32440]: osd.3 pg_epoch: 36 pg[4.7( empty local-lis/les=35/36 n=0 ec=35/21 lis/c=21/21 les/c/f=22/22/0 sis=35) [3,5,2] r=0 lpr=35 pi=[21,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Oct 14 04:12:18 localhost ceph-osd[32440]: osd.3 pg_epoch: 36 pg[4.19( empty local-lis/les=35/36 n=0 ec=35/21 lis/c=21/21 les/c/f=22/22/0 sis=35) [3,5,2] r=0 lpr=35 pi=[21,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react 
AllReplicasActivated Activating complete Oct 14 04:12:18 localhost ceph-osd[32440]: osd.3 pg_epoch: 36 pg[4.e( empty local-lis/les=35/36 n=0 ec=35/21 lis/c=21/21 les/c/f=22/22/0 sis=35) [3,5,2] r=0 lpr=35 pi=[21,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Oct 14 04:12:18 localhost ceph-osd[32440]: osd.3 pg_epoch: 36 pg[4.1b( empty local-lis/les=35/36 n=0 ec=35/21 lis/c=21/21 les/c/f=22/22/0 sis=35) [3,5,2] r=0 lpr=35 pi=[21,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Oct 14 04:12:18 localhost ceph-osd[32440]: osd.3 pg_epoch: 36 pg[4.1d( empty local-lis/les=35/36 n=0 ec=35/21 lis/c=21/21 les/c/f=22/22/0 sis=35) [3,5,2] r=0 lpr=35 pi=[21,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Oct 14 04:12:18 localhost ceph-osd[32440]: osd.3 pg_epoch: 36 pg[4.4( empty local-lis/les=35/36 n=0 ec=35/21 lis/c=21/21 les/c/f=22/22/0 sis=35) [3,5,2] r=0 lpr=35 pi=[21,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Oct 14 04:12:18 localhost ceph-osd[32440]: osd.3 pg_epoch: 36 pg[4.1c( empty local-lis/les=35/36 n=0 ec=35/21 lis/c=21/21 les/c/f=22/22/0 sis=35) [3,5,2] r=0 lpr=35 pi=[21,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Oct 14 04:12:18 localhost ceph-osd[32440]: osd.3 pg_epoch: 36 pg[4.1a( empty local-lis/les=35/36 n=0 ec=35/21 lis/c=21/21 les/c/f=22/22/0 sis=35) [3,5,2] r=0 lpr=35 pi=[21,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Oct 14 04:12:18 localhost ceph-osd[32440]: osd.3 pg_epoch: 36 pg[4.11( empty local-lis/les=35/36 n=0 ec=35/21 lis/c=21/21 les/c/f=22/22/0 sis=35) [3,5,2] r=0 lpr=35 pi=[21,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Oct 14 04:12:18 localhost ceph-osd[32440]: osd.3 pg_epoch: 36 pg[4.18( empty local-lis/les=35/36 n=0 
ec=35/21 lis/c=21/21 les/c/f=22/22/0 sis=35) [3,5,2] r=0 lpr=35 pi=[21,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Oct 14 04:12:19 localhost ceph-osd[31500]: osd.0 pg_epoch: 37 pg[6.0( empty local-lis/les=28/29 n=0 ec=28/28 lis/c=28/28 les/c/f=29/29/0 sis=37 pruub=13.261057854s) [0,5,2] r=0 lpr=37 pi=[28,37)/1 crt=0'0 mlcod 0'0 active pruub 1191.127685547s@ mbc={}] start_peering_interval up [0,5,2] -> [0,5,2], acting [0,5,2] -> [0,5,2], acting_primary 0 -> 0, up_primary 0 -> 0, role 0 -> 0, features acting 4540138322906710015 upacting 4540138322906710015 Oct 14 04:12:19 localhost ceph-osd[31500]: osd.0 pg_epoch: 37 pg[6.0( empty local-lis/les=28/29 n=0 ec=28/28 lis/c=28/28 les/c/f=29/29/0 sis=37 pruub=13.261057854s) [0,5,2] r=0 lpr=37 pi=[28,37)/1 crt=0'0 mlcod 0'0 unknown pruub 1191.127685547s@ mbc={}] state: transitioning to Primary Oct 14 04:12:19 localhost ceph-osd[32440]: osd.3 pg_epoch: 37 pg[5.0( empty local-lis/les=22/23 n=0 ec=22/22 lis/c=22/22 les/c/f=23/23/0 sis=37 pruub=8.165534019s) [2,3,1] r=1 lpr=37 pi=[22,37)/1 crt=0'0 mlcod 0'0 active pruub 1181.805541992s@ mbc={}] start_peering_interval up [2,3,1] -> [2,3,1], acting [2,3,1] -> [2,3,1], acting_primary 2 -> 2, up_primary 2 -> 2, role 1 -> 1, features acting 4540138322906710015 upacting 4540138322906710015 Oct 14 04:12:19 localhost ceph-osd[32440]: osd.3 pg_epoch: 37 pg[5.0( empty local-lis/les=22/23 n=0 ec=22/22 lis/c=22/22 les/c/f=23/23/0 sis=37 pruub=8.162278175s) [2,3,1] r=1 lpr=37 pi=[22,37)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1181.805541992s@ mbc={}] state: transitioning to Stray Oct 14 04:12:19 localhost ceph-osd[32440]: log_channel(cluster) log [DBG] : 2.4 scrub starts Oct 14 04:12:19 localhost ceph-osd[31500]: log_channel(cluster) log [DBG] : 2.1f scrub starts Oct 14 04:12:19 localhost ceph-osd[31500]: log_channel(cluster) log [DBG] : 2.1f scrub ok Oct 14 04:12:20 localhost ceph-osd[31500]: osd.0 pg_epoch: 38 pg[6.1e( empty 
local-lis/les=28/29 n=0 ec=37/28 lis/c=28/28 les/c/f=29/29/0 sis=37) [0,5,2] r=0 lpr=37 pi=[28,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Oct 14 04:12:20 localhost ceph-osd[31500]: osd.0 pg_epoch: 38 pg[6.1c( empty local-lis/les=28/29 n=0 ec=37/28 lis/c=28/28 les/c/f=29/29/0 sis=37) [0,5,2] r=0 lpr=37 pi=[28,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Oct 14 04:12:20 localhost ceph-osd[31500]: osd.0 pg_epoch: 38 pg[6.1f( empty local-lis/les=28/29 n=0 ec=37/28 lis/c=28/28 les/c/f=29/29/0 sis=37) [0,5,2] r=0 lpr=37 pi=[28,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Oct 14 04:12:20 localhost ceph-osd[31500]: osd.0 pg_epoch: 38 pg[6.12( empty local-lis/les=28/29 n=0 ec=37/28 lis/c=28/28 les/c/f=29/29/0 sis=37) [0,5,2] r=0 lpr=37 pi=[28,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Oct 14 04:12:20 localhost ceph-osd[31500]: osd.0 pg_epoch: 38 pg[6.13( empty local-lis/les=28/29 n=0 ec=37/28 lis/c=28/28 les/c/f=29/29/0 sis=37) [0,5,2] r=0 lpr=37 pi=[28,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Oct 14 04:12:20 localhost ceph-osd[31500]: osd.0 pg_epoch: 38 pg[6.11( empty local-lis/les=28/29 n=0 ec=37/28 lis/c=28/28 les/c/f=29/29/0 sis=37) [0,5,2] r=0 lpr=37 pi=[28,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Oct 14 04:12:20 localhost ceph-osd[31500]: osd.0 pg_epoch: 38 pg[6.10( empty local-lis/les=28/29 n=0 ec=37/28 lis/c=28/28 les/c/f=29/29/0 sis=37) [0,5,2] r=0 lpr=37 pi=[28,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Oct 14 04:12:20 localhost ceph-osd[31500]: osd.0 pg_epoch: 38 pg[6.16( empty local-lis/les=28/29 n=0 ec=37/28 lis/c=28/28 les/c/f=29/29/0 sis=37) [0,5,2] r=0 lpr=37 pi=[28,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Oct 14 04:12:20 localhost ceph-osd[31500]: osd.0 pg_epoch: 38 pg[6.17( empty local-lis/les=28/29 n=0 ec=37/28 lis/c=28/28 
les/c/f=29/29/0 sis=37) [0,5,2] r=0 lpr=37 pi=[28,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Oct 14 04:12:20 localhost ceph-osd[31500]: osd.0 pg_epoch: 38 pg[6.1d( empty local-lis/les=28/29 n=0 ec=37/28 lis/c=28/28 les/c/f=29/29/0 sis=37) [0,5,2] r=0 lpr=37 pi=[28,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Oct 14 04:12:20 localhost ceph-osd[31500]: osd.0 pg_epoch: 38 pg[6.14( empty local-lis/les=28/29 n=0 ec=37/28 lis/c=28/28 les/c/f=29/29/0 sis=37) [0,5,2] r=0 lpr=37 pi=[28,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Oct 14 04:12:20 localhost ceph-osd[31500]: osd.0 pg_epoch: 38 pg[6.15( empty local-lis/les=28/29 n=0 ec=37/28 lis/c=28/28 les/c/f=29/29/0 sis=37) [0,5,2] r=0 lpr=37 pi=[28,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Oct 14 04:12:20 localhost ceph-osd[31500]: osd.0 pg_epoch: 38 pg[6.b( empty local-lis/les=28/29 n=0 ec=37/28 lis/c=28/28 les/c/f=29/29/0 sis=37) [0,5,2] r=0 lpr=37 pi=[28,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Oct 14 04:12:20 localhost ceph-osd[31500]: osd.0 pg_epoch: 38 pg[6.a( empty local-lis/les=28/29 n=0 ec=37/28 lis/c=28/28 les/c/f=29/29/0 sis=37) [0,5,2] r=0 lpr=37 pi=[28,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Oct 14 04:12:20 localhost ceph-osd[31500]: osd.0 pg_epoch: 38 pg[6.9( empty local-lis/les=28/29 n=0 ec=37/28 lis/c=28/28 les/c/f=29/29/0 sis=37) [0,5,2] r=0 lpr=37 pi=[28,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Oct 14 04:12:20 localhost ceph-osd[31500]: osd.0 pg_epoch: 38 pg[6.5( empty local-lis/les=28/29 n=0 ec=37/28 lis/c=28/28 les/c/f=29/29/0 sis=37) [0,5,2] r=0 lpr=37 pi=[28,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Oct 14 04:12:20 localhost ceph-osd[31500]: osd.0 pg_epoch: 38 pg[6.8( empty local-lis/les=28/29 n=0 ec=37/28 lis/c=28/28 les/c/f=29/29/0 sis=37) [0,5,2] r=0 lpr=37 
pi=[28,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Oct 14 04:12:20 localhost ceph-osd[31500]: osd.0 pg_epoch: 38 pg[6.7( empty local-lis/les=28/29 n=0 ec=37/28 lis/c=28/28 les/c/f=29/29/0 sis=37) [0,5,2] r=0 lpr=37 pi=[28,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Oct 14 04:12:20 localhost ceph-osd[31500]: osd.0 pg_epoch: 38 pg[6.4( empty local-lis/les=28/29 n=0 ec=37/28 lis/c=28/28 les/c/f=29/29/0 sis=37) [0,5,2] r=0 lpr=37 pi=[28,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Oct 14 04:12:20 localhost ceph-osd[31500]: osd.0 pg_epoch: 38 pg[6.6( empty local-lis/les=28/29 n=0 ec=37/28 lis/c=28/28 les/c/f=29/29/0 sis=37) [0,5,2] r=0 lpr=37 pi=[28,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Oct 14 04:12:20 localhost ceph-osd[31500]: osd.0 pg_epoch: 38 pg[6.1( empty local-lis/les=28/29 n=0 ec=37/28 lis/c=28/28 les/c/f=29/29/0 sis=37) [0,5,2] r=0 lpr=37 pi=[28,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Oct 14 04:12:20 localhost ceph-osd[31500]: osd.0 pg_epoch: 38 pg[6.2( empty local-lis/les=28/29 n=0 ec=37/28 lis/c=28/28 les/c/f=29/29/0 sis=37) [0,5,2] r=0 lpr=37 pi=[28,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Oct 14 04:12:20 localhost ceph-osd[31500]: osd.0 pg_epoch: 38 pg[6.d( empty local-lis/les=28/29 n=0 ec=37/28 lis/c=28/28 les/c/f=29/29/0 sis=37) [0,5,2] r=0 lpr=37 pi=[28,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Oct 14 04:12:20 localhost ceph-osd[31500]: osd.0 pg_epoch: 38 pg[6.c( empty local-lis/les=28/29 n=0 ec=37/28 lis/c=28/28 les/c/f=29/29/0 sis=37) [0,5,2] r=0 lpr=37 pi=[28,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Oct 14 04:12:20 localhost ceph-osd[31500]: osd.0 pg_epoch: 38 pg[6.f( empty local-lis/les=28/29 n=0 ec=37/28 lis/c=28/28 les/c/f=29/29/0 sis=37) [0,5,2] r=0 lpr=37 pi=[28,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: 
transitioning to Primary Oct 14 04:12:20 localhost ceph-osd[31500]: osd.0 pg_epoch: 38 pg[6.e( empty local-lis/les=28/29 n=0 ec=37/28 lis/c=28/28 les/c/f=29/29/0 sis=37) [0,5,2] r=0 lpr=37 pi=[28,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Oct 14 04:12:20 localhost ceph-osd[31500]: osd.0 pg_epoch: 38 pg[6.19( empty local-lis/les=28/29 n=0 ec=37/28 lis/c=28/28 les/c/f=29/29/0 sis=37) [0,5,2] r=0 lpr=37 pi=[28,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Oct 14 04:12:20 localhost ceph-osd[31500]: osd.0 pg_epoch: 38 pg[6.1b( empty local-lis/les=28/29 n=0 ec=37/28 lis/c=28/28 les/c/f=29/29/0 sis=37) [0,5,2] r=0 lpr=37 pi=[28,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Oct 14 04:12:20 localhost ceph-osd[31500]: osd.0 pg_epoch: 38 pg[6.3( empty local-lis/les=28/29 n=0 ec=37/28 lis/c=28/28 les/c/f=29/29/0 sis=37) [0,5,2] r=0 lpr=37 pi=[28,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Oct 14 04:12:20 localhost ceph-osd[31500]: osd.0 pg_epoch: 38 pg[6.18( empty local-lis/les=28/29 n=0 ec=37/28 lis/c=28/28 les/c/f=29/29/0 sis=37) [0,5,2] r=0 lpr=37 pi=[28,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Oct 14 04:12:20 localhost ceph-osd[31500]: osd.0 pg_epoch: 38 pg[6.1a( empty local-lis/les=28/29 n=0 ec=37/28 lis/c=28/28 les/c/f=29/29/0 sis=37) [0,5,2] r=0 lpr=37 pi=[28,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Oct 14 04:12:20 localhost ceph-osd[32440]: osd.3 pg_epoch: 38 pg[5.19( empty local-lis/les=22/23 n=0 ec=37/22 lis/c=22/22 les/c/f=23/23/0 sis=37) [2,3,1] r=1 lpr=37 pi=[22,37)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Oct 14 04:12:20 localhost ceph-osd[32440]: osd.3 pg_epoch: 38 pg[5.18( empty local-lis/les=22/23 n=0 ec=37/22 lis/c=22/22 les/c/f=23/23/0 sis=37) [2,3,1] r=1 lpr=37 pi=[22,37)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Oct 14 
04:12:20 localhost ceph-osd[32440]: osd.3 pg_epoch: 38 pg[5.1c( empty local-lis/les=22/23 n=0 ec=37/22 lis/c=22/22 les/c/f=23/23/0 sis=37) [2,3,1] r=1 lpr=37 pi=[22,37)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Oct 14 04:12:20 localhost ceph-osd[32440]: osd.3 pg_epoch: 38 pg[5.1d( empty local-lis/les=22/23 n=0 ec=37/22 lis/c=22/22 les/c/f=23/23/0 sis=37) [2,3,1] r=1 lpr=37 pi=[22,37)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Oct 14 04:12:20 localhost ceph-osd[32440]: osd.3 pg_epoch: 38 pg[5.1e( empty local-lis/les=22/23 n=0 ec=37/22 lis/c=22/22 les/c/f=23/23/0 sis=37) [2,3,1] r=1 lpr=37 pi=[22,37)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Oct 14 04:12:20 localhost ceph-osd[32440]: osd.3 pg_epoch: 38 pg[5.1a( empty local-lis/les=22/23 n=0 ec=37/22 lis/c=22/22 les/c/f=23/23/0 sis=37) [2,3,1] r=1 lpr=37 pi=[22,37)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Oct 14 04:12:20 localhost ceph-osd[32440]: osd.3 pg_epoch: 38 pg[5.f( empty local-lis/les=22/23 n=0 ec=37/22 lis/c=22/22 les/c/f=23/23/0 sis=37) [2,3,1] r=1 lpr=37 pi=[22,37)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Oct 14 04:12:20 localhost ceph-osd[32440]: osd.3 pg_epoch: 38 pg[5.3( empty local-lis/les=22/23 n=0 ec=37/22 lis/c=22/22 les/c/f=23/23/0 sis=37) [2,3,1] r=1 lpr=37 pi=[22,37)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Oct 14 04:12:20 localhost ceph-osd[32440]: osd.3 pg_epoch: 38 pg[5.6( empty local-lis/les=22/23 n=0 ec=37/22 lis/c=22/22 les/c/f=23/23/0 sis=37) [2,3,1] r=1 lpr=37 pi=[22,37)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Oct 14 04:12:20 localhost ceph-osd[32440]: osd.3 pg_epoch: 38 pg[5.5( empty local-lis/les=22/23 n=0 ec=37/22 lis/c=22/22 les/c/f=23/23/0 sis=37) [2,3,1] r=1 lpr=37 pi=[22,37)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Oct 14 04:12:20 
localhost ceph-osd[32440]: osd.3 pg_epoch: 38 pg[5.2( empty local-lis/les=22/23 n=0 ec=37/22 lis/c=22/22 les/c/f=23/23/0 sis=37) [2,3,1] r=1 lpr=37 pi=[22,37)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Oct 14 04:12:20 localhost ceph-osd[32440]: osd.3 pg_epoch: 38 pg[5.1( empty local-lis/les=22/23 n=0 ec=37/22 lis/c=22/22 les/c/f=23/23/0 sis=37) [2,3,1] r=1 lpr=37 pi=[22,37)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Oct 14 04:12:20 localhost ceph-osd[32440]: osd.3 pg_epoch: 38 pg[5.7( empty local-lis/les=22/23 n=0 ec=37/22 lis/c=22/22 les/c/f=23/23/0 sis=37) [2,3,1] r=1 lpr=37 pi=[22,37)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Oct 14 04:12:20 localhost ceph-osd[32440]: osd.3 pg_epoch: 38 pg[5.e( empty local-lis/les=22/23 n=0 ec=37/22 lis/c=22/22 les/c/f=23/23/0 sis=37) [2,3,1] r=1 lpr=37 pi=[22,37)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Oct 14 04:12:20 localhost ceph-osd[32440]: osd.3 pg_epoch: 38 pg[5.b( empty local-lis/les=22/23 n=0 ec=37/22 lis/c=22/22 les/c/f=23/23/0 sis=37) [2,3,1] r=1 lpr=37 pi=[22,37)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Oct 14 04:12:20 localhost ceph-osd[32440]: osd.3 pg_epoch: 38 pg[5.d( empty local-lis/les=22/23 n=0 ec=37/22 lis/c=22/22 les/c/f=23/23/0 sis=37) [2,3,1] r=1 lpr=37 pi=[22,37)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Oct 14 04:12:20 localhost ceph-osd[32440]: osd.3 pg_epoch: 38 pg[5.4( empty local-lis/les=22/23 n=0 ec=37/22 lis/c=22/22 les/c/f=23/23/0 sis=37) [2,3,1] r=1 lpr=37 pi=[22,37)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Oct 14 04:12:20 localhost ceph-osd[32440]: osd.3 pg_epoch: 38 pg[5.c( empty local-lis/les=22/23 n=0 ec=37/22 lis/c=22/22 les/c/f=23/23/0 sis=37) [2,3,1] r=1 lpr=37 pi=[22,37)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Oct 14 04:12:20 localhost 
ceph-osd[32440]: osd.3 pg_epoch: 38 pg[5.a( empty local-lis/les=22/23 n=0 ec=37/22 lis/c=22/22 les/c/f=23/23/0 sis=37) [2,3,1] r=1 lpr=37 pi=[22,37)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Oct 14 04:12:20 localhost ceph-osd[32440]: osd.3 pg_epoch: 38 pg[5.9( empty local-lis/les=22/23 n=0 ec=37/22 lis/c=22/22 les/c/f=23/23/0 sis=37) [2,3,1] r=1 lpr=37 pi=[22,37)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Oct 14 04:12:20 localhost ceph-osd[32440]: osd.3 pg_epoch: 38 pg[5.17( empty local-lis/les=22/23 n=0 ec=37/22 lis/c=22/22 les/c/f=23/23/0 sis=37) [2,3,1] r=1 lpr=37 pi=[22,37)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Oct 14 04:12:20 localhost ceph-osd[32440]: osd.3 pg_epoch: 38 pg[5.8( empty local-lis/les=22/23 n=0 ec=37/22 lis/c=22/22 les/c/f=23/23/0 sis=37) [2,3,1] r=1 lpr=37 pi=[22,37)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Oct 14 04:12:20 localhost ceph-osd[32440]: osd.3 pg_epoch: 38 pg[5.15( empty local-lis/les=22/23 n=0 ec=37/22 lis/c=22/22 les/c/f=23/23/0 sis=37) [2,3,1] r=1 lpr=37 pi=[22,37)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Oct 14 04:12:20 localhost ceph-osd[32440]: osd.3 pg_epoch: 38 pg[5.14( empty local-lis/les=22/23 n=0 ec=37/22 lis/c=22/22 les/c/f=23/23/0 sis=37) [2,3,1] r=1 lpr=37 pi=[22,37)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Oct 14 04:12:20 localhost ceph-osd[32440]: osd.3 pg_epoch: 38 pg[5.13( empty local-lis/les=22/23 n=0 ec=37/22 lis/c=22/22 les/c/f=23/23/0 sis=37) [2,3,1] r=1 lpr=37 pi=[22,37)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Oct 14 04:12:20 localhost ceph-osd[32440]: osd.3 pg_epoch: 38 pg[5.12( empty local-lis/les=22/23 n=0 ec=37/22 lis/c=22/22 les/c/f=23/23/0 sis=37) [2,3,1] r=1 lpr=37 pi=[22,37)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Oct 14 04:12:20 localhost 
ceph-osd[32440]: osd.3 pg_epoch: 38 pg[5.1f( empty local-lis/les=22/23 n=0 ec=37/22 lis/c=22/22 les/c/f=23/23/0 sis=37) [2,3,1] r=1 lpr=37 pi=[22,37)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Oct 14 04:12:20 localhost ceph-osd[32440]: osd.3 pg_epoch: 38 pg[5.11( empty local-lis/les=22/23 n=0 ec=37/22 lis/c=22/22 les/c/f=23/23/0 sis=37) [2,3,1] r=1 lpr=37 pi=[22,37)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Oct 14 04:12:20 localhost ceph-osd[32440]: osd.3 pg_epoch: 38 pg[5.16( empty local-lis/les=22/23 n=0 ec=37/22 lis/c=22/22 les/c/f=23/23/0 sis=37) [2,3,1] r=1 lpr=37 pi=[22,37)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Oct 14 04:12:20 localhost ceph-osd[32440]: osd.3 pg_epoch: 38 pg[5.1b( empty local-lis/les=22/23 n=0 ec=37/22 lis/c=22/22 les/c/f=23/23/0 sis=37) [2,3,1] r=1 lpr=37 pi=[22,37)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Oct 14 04:12:20 localhost ceph-osd[32440]: osd.3 pg_epoch: 38 pg[5.10( empty local-lis/les=22/23 n=0 ec=37/22 lis/c=22/22 les/c/f=23/23/0 sis=37) [2,3,1] r=1 lpr=37 pi=[22,37)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Oct 14 04:12:20 localhost ceph-osd[31500]: osd.0 pg_epoch: 38 pg[6.0( empty local-lis/les=37/38 n=0 ec=28/28 lis/c=28/28 les/c/f=29/29/0 sis=37) [0,5,2] r=0 lpr=37 pi=[28,37)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Oct 14 04:12:20 localhost ceph-osd[31500]: osd.0 pg_epoch: 38 pg[6.19( empty local-lis/les=37/38 n=0 ec=37/28 lis/c=28/28 les/c/f=29/29/0 sis=37) [0,5,2] r=0 lpr=37 pi=[28,37)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Oct 14 04:12:20 localhost ceph-osd[31500]: osd.0 pg_epoch: 38 pg[6.1e( empty local-lis/les=37/38 n=0 ec=37/28 lis/c=28/28 les/c/f=29/29/0 sis=37) [0,5,2] r=0 lpr=37 pi=[28,37)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated 
Activating complete Oct 14 04:12:20 localhost ceph-osd[31500]: osd.0 pg_epoch: 38 pg[6.1b( empty local-lis/les=37/38 n=0 ec=37/28 lis/c=28/28 les/c/f=29/29/0 sis=37) [0,5,2] r=0 lpr=37 pi=[28,37)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Oct 14 04:12:20 localhost ceph-osd[31500]: osd.0 pg_epoch: 38 pg[6.3( empty local-lis/les=37/38 n=0 ec=37/28 lis/c=28/28 les/c/f=29/29/0 sis=37) [0,5,2] r=0 lpr=37 pi=[28,37)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Oct 14 04:12:20 localhost ceph-osd[31500]: osd.0 pg_epoch: 38 pg[6.18( empty local-lis/les=37/38 n=0 ec=37/28 lis/c=28/28 les/c/f=29/29/0 sis=37) [0,5,2] r=0 lpr=37 pi=[28,37)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Oct 14 04:12:20 localhost ceph-osd[31500]: osd.0 pg_epoch: 38 pg[6.1a( empty local-lis/les=37/38 n=0 ec=37/28 lis/c=28/28 les/c/f=29/29/0 sis=37) [0,5,2] r=0 lpr=37 pi=[28,37)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Oct 14 04:12:20 localhost ceph-osd[31500]: osd.0 pg_epoch: 38 pg[6.5( empty local-lis/les=37/38 n=0 ec=37/28 lis/c=28/28 les/c/f=29/29/0 sis=37) [0,5,2] r=0 lpr=37 pi=[28,37)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Oct 14 04:12:20 localhost ceph-osd[31500]: osd.0 pg_epoch: 38 pg[6.c( empty local-lis/les=37/38 n=0 ec=37/28 lis/c=28/28 les/c/f=29/29/0 sis=37) [0,5,2] r=0 lpr=37 pi=[28,37)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Oct 14 04:12:20 localhost ceph-osd[31500]: osd.0 pg_epoch: 38 pg[6.1( empty local-lis/les=37/38 n=0 ec=37/28 lis/c=28/28 les/c/f=29/29/0 sis=37) [0,5,2] r=0 lpr=37 pi=[28,37)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Oct 14 04:12:20 localhost ceph-osd[31500]: osd.0 pg_epoch: 38 pg[6.7( empty local-lis/les=37/38 n=0 ec=37/28 lis/c=28/28 
les/c/f=29/29/0 sis=37) [0,5,2] r=0 lpr=37 pi=[28,37)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Oct 14 04:12:20 localhost ceph-osd[31500]: osd.0 pg_epoch: 38 pg[6.2( empty local-lis/les=37/38 n=0 ec=37/28 lis/c=28/28 les/c/f=29/29/0 sis=37) [0,5,2] r=0 lpr=37 pi=[28,37)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Oct 14 04:12:20 localhost ceph-osd[31500]: osd.0 pg_epoch: 38 pg[6.d( empty local-lis/les=37/38 n=0 ec=37/28 lis/c=28/28 les/c/f=29/29/0 sis=37) [0,5,2] r=0 lpr=37 pi=[28,37)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Oct 14 04:12:20 localhost ceph-osd[31500]: osd.0 pg_epoch: 38 pg[6.a( empty local-lis/les=37/38 n=0 ec=37/28 lis/c=28/28 les/c/f=29/29/0 sis=37) [0,5,2] r=0 lpr=37 pi=[28,37)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Oct 14 04:12:20 localhost ceph-osd[31500]: osd.0 pg_epoch: 38 pg[6.1f( empty local-lis/les=37/38 n=0 ec=37/28 lis/c=28/28 les/c/f=29/29/0 sis=37) [0,5,2] r=0 lpr=37 pi=[28,37)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Oct 14 04:12:20 localhost ceph-osd[31500]: osd.0 pg_epoch: 38 pg[6.4( empty local-lis/les=37/38 n=0 ec=37/28 lis/c=28/28 les/c/f=29/29/0 sis=37) [0,5,2] r=0 lpr=37 pi=[28,37)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Oct 14 04:12:20 localhost ceph-osd[31500]: osd.0 pg_epoch: 38 pg[6.9( empty local-lis/les=37/38 n=0 ec=37/28 lis/c=28/28 les/c/f=29/29/0 sis=37) [0,5,2] r=0 lpr=37 pi=[28,37)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Oct 14 04:12:20 localhost ceph-osd[31500]: osd.0 pg_epoch: 38 pg[6.8( empty local-lis/les=37/38 n=0 ec=37/28 lis/c=28/28 les/c/f=29/29/0 sis=37) [0,5,2] r=0 lpr=37 pi=[28,37)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Oct 14 
04:12:20 localhost ceph-osd[31500]: osd.0 pg_epoch: 38 pg[6.b( empty local-lis/les=37/38 n=0 ec=37/28 lis/c=28/28 les/c/f=29/29/0 sis=37) [0,5,2] r=0 lpr=37 pi=[28,37)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Oct 14 04:12:20 localhost ceph-osd[31500]: osd.0 pg_epoch: 38 pg[6.e( empty local-lis/les=37/38 n=0 ec=37/28 lis/c=28/28 les/c/f=29/29/0 sis=37) [0,5,2] r=0 lpr=37 pi=[28,37)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Oct 14 04:12:20 localhost ceph-osd[31500]: osd.0 pg_epoch: 38 pg[6.14( empty local-lis/les=37/38 n=0 ec=37/28 lis/c=28/28 les/c/f=29/29/0 sis=37) [0,5,2] r=0 lpr=37 pi=[28,37)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Oct 14 04:12:20 localhost ceph-osd[31500]: osd.0 pg_epoch: 38 pg[6.15( empty local-lis/les=37/38 n=0 ec=37/28 lis/c=28/28 les/c/f=29/29/0 sis=37) [0,5,2] r=0 lpr=37 pi=[28,37)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Oct 14 04:12:20 localhost ceph-osd[31500]: osd.0 pg_epoch: 38 pg[6.f( empty local-lis/les=37/38 n=0 ec=37/28 lis/c=28/28 les/c/f=29/29/0 sis=37) [0,5,2] r=0 lpr=37 pi=[28,37)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Oct 14 04:12:20 localhost ceph-osd[31500]: osd.0 pg_epoch: 38 pg[6.6( empty local-lis/les=37/38 n=0 ec=37/28 lis/c=28/28 les/c/f=29/29/0 sis=37) [0,5,2] r=0 lpr=37 pi=[28,37)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Oct 14 04:12:20 localhost ceph-osd[31500]: osd.0 pg_epoch: 38 pg[6.10( empty local-lis/les=37/38 n=0 ec=37/28 lis/c=28/28 les/c/f=29/29/0 sis=37) [0,5,2] r=0 lpr=37 pi=[28,37)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Oct 14 04:12:20 localhost ceph-osd[31500]: osd.0 pg_epoch: 38 pg[6.16( empty local-lis/les=37/38 n=0 ec=37/28 lis/c=28/28 les/c/f=29/29/0 sis=37) [0,5,2] 
r=0 lpr=37 pi=[28,37)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Oct 14 04:12:20 localhost ceph-osd[31500]: osd.0 pg_epoch: 38 pg[6.12( empty local-lis/les=37/38 n=0 ec=37/28 lis/c=28/28 les/c/f=29/29/0 sis=37) [0,5,2] r=0 lpr=37 pi=[28,37)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Oct 14 04:12:20 localhost ceph-osd[31500]: osd.0 pg_epoch: 38 pg[6.1c( empty local-lis/les=37/38 n=0 ec=37/28 lis/c=28/28 les/c/f=29/29/0 sis=37) [0,5,2] r=0 lpr=37 pi=[28,37)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Oct 14 04:12:20 localhost ceph-osd[31500]: osd.0 pg_epoch: 38 pg[6.13( empty local-lis/les=37/38 n=0 ec=37/28 lis/c=28/28 les/c/f=29/29/0 sis=37) [0,5,2] r=0 lpr=37 pi=[28,37)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Oct 14 04:12:20 localhost ceph-osd[31500]: osd.0 pg_epoch: 38 pg[6.11( empty local-lis/les=37/38 n=0 ec=37/28 lis/c=28/28 les/c/f=29/29/0 sis=37) [0,5,2] r=0 lpr=37 pi=[28,37)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Oct 14 04:12:20 localhost ceph-osd[31500]: osd.0 pg_epoch: 38 pg[6.1d( empty local-lis/les=37/38 n=0 ec=37/28 lis/c=28/28 les/c/f=29/29/0 sis=37) [0,5,2] r=0 lpr=37 pi=[28,37)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Oct 14 04:12:20 localhost ceph-osd[31500]: osd.0 pg_epoch: 38 pg[6.17( empty local-lis/les=37/38 n=0 ec=37/28 lis/c=28/28 les/c/f=29/29/0 sis=37) [0,5,2] r=0 lpr=37 pi=[28,37)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Oct 14 04:12:21 localhost ceph-osd[32440]: log_channel(cluster) log [DBG] : 2.1e scrub starts Oct 14 04:12:21 localhost ceph-osd[32440]: log_channel(cluster) log [DBG] : 2.1e scrub ok Oct 14 04:12:21 localhost ceph-osd[32440]: osd.3 pg_epoch: 39 pg[7.0( v 31'39 (0'0,31'39] local-lis/les=29/30 n=22 
ec=29/29 lis/c=29/29 les/c/f=30/30/0 sis=39 pruub=12.525411606s) [5,4,3] r=2 lpr=39 pi=[29,39)/1 luod=0'0 lua=31'37 crt=31'39 lcod 31'38 mlcod 0'0 active pruub 1188.942871094s@ mbc={}] start_peering_interval up [5,4,3] -> [5,4,3], acting [5,4,3] -> [5,4,3], acting_primary 5 -> 5, up_primary 5 -> 5, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Oct 14 04:12:21 localhost ceph-osd[32440]: osd.3 pg_epoch: 39 pg[7.0( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=29/30 n=1 ec=29/29 lis/c=29/29 les/c/f=30/30/0 sis=39 pruub=12.523823738s) [5,4,3] r=2 lpr=39 pi=[29,39)/1 crt=31'39 lcod 31'38 mlcod 0'0 unknown NOTIFY pruub 1188.942871094s@ mbc={}] state: transitioning to Stray Oct 14 04:12:22 localhost ceph-osd[32440]: osd.3 pg_epoch: 40 pg[7.d( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=29/30 n=1 ec=39/29 lis/c=29/29 les/c/f=30/30/0 sis=39) [5,4,3] r=2 lpr=39 pi=[29,39)/1 crt=31'39 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Oct 14 04:12:22 localhost ceph-osd[32440]: osd.3 pg_epoch: 40 pg[7.4( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=29/30 n=2 ec=39/29 lis/c=29/29 les/c/f=30/30/0 sis=39) [5,4,3] r=2 lpr=39 pi=[29,39)/1 crt=31'39 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Oct 14 04:12:22 localhost ceph-osd[32440]: osd.3 pg_epoch: 40 pg[7.2( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=29/30 n=2 ec=39/29 lis/c=29/29 les/c/f=30/30/0 sis=39) [5,4,3] r=2 lpr=39 pi=[29,39)/1 crt=31'39 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Oct 14 04:12:22 localhost ceph-osd[32440]: osd.3 pg_epoch: 40 pg[7.1( v 31'39 (0'0,31'39] local-lis/les=29/30 n=2 ec=39/29 lis/c=29/29 les/c/f=30/30/0 sis=39) [5,4,3] r=2 lpr=39 pi=[29,39)/1 crt=31'39 lcod 0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Oct 14 04:12:22 localhost ceph-osd[32440]: osd.3 pg_epoch: 40 pg[7.3( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=29/30 n=2 ec=39/29 lis/c=29/29 les/c/f=30/30/0 sis=39) [5,4,3] r=2 lpr=39 pi=[29,39)/1 crt=31'39 
mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Oct 14 04:12:22 localhost ceph-osd[32440]: osd.3 pg_epoch: 40 pg[7.5( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=29/30 n=2 ec=39/29 lis/c=29/29 les/c/f=30/30/0 sis=39) [5,4,3] r=2 lpr=39 pi=[29,39)/1 crt=31'39 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Oct 14 04:12:22 localhost ceph-osd[32440]: osd.3 pg_epoch: 40 pg[7.7( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=29/30 n=1 ec=39/29 lis/c=29/29 les/c/f=30/30/0 sis=39) [5,4,3] r=2 lpr=39 pi=[29,39)/1 crt=31'39 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Oct 14 04:12:22 localhost ceph-osd[32440]: osd.3 pg_epoch: 40 pg[7.c( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=29/30 n=1 ec=39/29 lis/c=29/29 les/c/f=30/30/0 sis=39) [5,4,3] r=2 lpr=39 pi=[29,39)/1 crt=31'39 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Oct 14 04:12:22 localhost ceph-osd[32440]: osd.3 pg_epoch: 40 pg[7.6( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=29/30 n=2 ec=39/29 lis/c=29/29 les/c/f=30/30/0 sis=39) [5,4,3] r=2 lpr=39 pi=[29,39)/1 crt=31'39 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Oct 14 04:12:22 localhost ceph-osd[32440]: osd.3 pg_epoch: 40 pg[7.e( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=29/30 n=1 ec=39/29 lis/c=29/29 les/c/f=30/30/0 sis=39) [5,4,3] r=2 lpr=39 pi=[29,39)/1 crt=31'39 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Oct 14 04:12:22 localhost ceph-osd[32440]: osd.3 pg_epoch: 40 pg[7.8( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=29/30 n=1 ec=39/29 lis/c=29/29 les/c/f=30/30/0 sis=39) [5,4,3] r=2 lpr=39 pi=[29,39)/1 crt=31'39 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Oct 14 04:12:22 localhost ceph-osd[32440]: osd.3 pg_epoch: 40 pg[7.9( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=29/30 n=1 ec=39/29 lis/c=29/29 les/c/f=30/30/0 sis=39) [5,4,3] r=2 lpr=39 pi=[29,39)/1 crt=31'39 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Oct 14 04:12:22 localhost 
ceph-osd[32440]: osd.3 pg_epoch: 40 pg[7.f( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=29/30 n=1 ec=39/29 lis/c=29/29 les/c/f=30/30/0 sis=39) [5,4,3] r=2 lpr=39 pi=[29,39)/1 crt=31'39 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Oct 14 04:12:22 localhost ceph-osd[32440]: osd.3 pg_epoch: 40 pg[7.b( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=29/30 n=1 ec=39/29 lis/c=29/29 les/c/f=30/30/0 sis=39) [5,4,3] r=2 lpr=39 pi=[29,39)/1 crt=31'39 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Oct 14 04:12:22 localhost ceph-osd[32440]: osd.3 pg_epoch: 40 pg[7.a( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=29/30 n=1 ec=39/29 lis/c=29/29 les/c/f=30/30/0 sis=39) [5,4,3] r=2 lpr=39 pi=[29,39)/1 crt=31'39 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Oct 14 04:12:23 localhost ceph-osd[32440]: log_channel(cluster) log [DBG] : 2.9 scrub starts Oct 14 04:12:23 localhost ceph-osd[32440]: log_channel(cluster) log [DBG] : 2.9 scrub ok Oct 14 04:12:24 localhost ceph-osd[32440]: log_channel(cluster) log [DBG] : 2.1 deep-scrub starts Oct 14 04:12:24 localhost ceph-osd[32440]: log_channel(cluster) log [DBG] : 2.1 deep-scrub ok Oct 14 04:12:25 localhost ceph-osd[32440]: log_channel(cluster) log [DBG] : 2.6 scrub starts Oct 14 04:12:25 localhost ceph-osd[32440]: log_channel(cluster) log [DBG] : 2.6 scrub ok Oct 14 04:12:25 localhost python3[56480]: ansible-file Invoked with path=/var/lib/tripleo-config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 04:12:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6. 
Oct 14 04:12:26 localhost podman[56481]: 2025-10-14 08:12:26.748305631 +0000 UTC m=+0.087981660 container health_status 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, managed_by=tripleo_ansible, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, batch=17.1_20250721.1, container_name=metrics_qdr, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.9, tcib_managed=true, vcs-type=git, config_id=tripleo_step1, vendor=Red Hat, Inc., architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T13:07:59, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', 
'/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, name=rhosp17/openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, maintainer=OpenStack TripleO Team) Oct 14 04:12:26 localhost podman[56481]: 2025-10-14 08:12:26.954154741 +0000 UTC m=+0.293830790 container exec_died 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, container_name=metrics_qdr, distribution-scope=public, name=rhosp17/openstack-qdrouterd, architecture=x86_64, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-qdrouterd-container, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.33.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20250721.1, release=1, version=17.1.9, tcib_managed=true, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, build-date=2025-07-21T13:07:59, config_id=tripleo_step1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, vendor=Red Hat, Inc.) Oct 14 04:12:26 localhost systemd[1]: 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6.service: Deactivated successfully. Oct 14 04:12:27 localhost ceph-osd[31500]: osd.0 pg_epoch: 41 pg[3.1b( empty local-lis/les=35/36 n=0 ec=35/20 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=14.812197685s) [4,5,3] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active pruub 1200.866333008s@ mbc={}] start_peering_interval up [2,1,0] -> [4,5,3], acting [2,1,0] -> [4,5,3], acting_primary 2 -> 4, up_primary 2 -> 4, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Oct 14 04:12:27 localhost ceph-osd[31500]: osd.0 pg_epoch: 41 pg[3.1a( empty local-lis/les=35/36 n=0 ec=35/20 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=14.811490059s) [4,3,1] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active pruub 1200.865722656s@ mbc={}] start_peering_interval up [2,1,0] -> [4,3,1], acting [2,1,0] -> [4,3,1], acting_primary 2 -> 4, up_primary 2 -> 4, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Oct 14 04:12:27 localhost ceph-osd[31500]: osd.0 pg_epoch: 41 pg[6.1f( empty local-lis/les=37/38 n=0 ec=37/28 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=8.880626678s) [3,4,1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active pruub 
1194.934936523s@ mbc={}] start_peering_interval up [0,5,2] -> [3,4,1], acting [0,5,2] -> [3,4,1], acting_primary 0 -> 3, up_primary 0 -> 3, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Oct 14 04:12:27 localhost ceph-osd[31500]: osd.0 pg_epoch: 41 pg[3.1a( empty local-lis/les=35/36 n=0 ec=35/20 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=14.811364174s) [4,3,1] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1200.865722656s@ mbc={}] state: transitioning to Stray Oct 14 04:12:27 localhost ceph-osd[31500]: osd.0 pg_epoch: 41 pg[3.1b( empty local-lis/les=35/36 n=0 ec=35/20 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=14.812002182s) [4,5,3] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1200.866333008s@ mbc={}] state: transitioning to Stray Oct 14 04:12:27 localhost ceph-osd[31500]: osd.0 pg_epoch: 41 pg[6.1f( empty local-lis/les=37/38 n=0 ec=37/28 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=8.880532265s) [3,4,1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1194.934936523s@ mbc={}] state: transitioning to Stray Oct 14 04:12:27 localhost ceph-osd[31500]: osd.0 pg_epoch: 41 pg[3.19( empty local-lis/les=35/36 n=0 ec=35/20 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=14.810677528s) [0,1,2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active pruub 1200.865356445s@ mbc={}] start_peering_interval up [2,1,0] -> [0,1,2], acting [2,1,0] -> [0,1,2], acting_primary 2 -> 0, up_primary 2 -> 0, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015 Oct 14 04:12:27 localhost ceph-osd[31500]: osd.0 pg_epoch: 41 pg[3.19( empty local-lis/les=35/36 n=0 ec=35/20 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=14.810677528s) [0,1,2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown pruub 1200.865356445s@ mbc={}] state: transitioning to Primary Oct 14 04:12:27 localhost ceph-osd[31500]: osd.0 pg_epoch: 41 pg[6.1c( empty local-lis/les=37/38 n=0 ec=37/28 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=8.881100655s) 
[4,3,5] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active pruub 1194.935791016s@ mbc={}] start_peering_interval up [0,5,2] -> [4,3,5], acting [0,5,2] -> [4,3,5], acting_primary 0 -> 4, up_primary 0 -> 4, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Oct 14 04:12:27 localhost ceph-osd[31500]: osd.0 pg_epoch: 41 pg[6.1c( empty local-lis/les=37/38 n=0 ec=37/28 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=8.881028175s) [4,3,5] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1194.935791016s@ mbc={}] state: transitioning to Stray Oct 14 04:12:27 localhost ceph-osd[31500]: osd.0 pg_epoch: 41 pg[6.1d( empty local-lis/les=37/38 n=0 ec=37/28 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=8.881332397s) [3,4,5] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active pruub 1194.936279297s@ mbc={}] start_peering_interval up [0,5,2] -> [3,4,5], acting [0,5,2] -> [3,4,5], acting_primary 0 -> 3, up_primary 0 -> 3, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Oct 14 04:12:27 localhost ceph-osd[31500]: osd.0 pg_epoch: 41 pg[6.1d( empty local-lis/les=37/38 n=0 ec=37/28 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=8.881295204s) [3,4,5] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1194.936279297s@ mbc={}] state: transitioning to Stray Oct 14 04:12:27 localhost ceph-osd[31500]: osd.0 pg_epoch: 41 pg[3.18( empty local-lis/les=35/36 n=0 ec=35/20 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=14.810394287s) [3,4,1] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active pruub 1200.865356445s@ mbc={}] start_peering_interval up [2,1,0] -> [3,4,1], acting [2,1,0] -> [3,4,1], acting_primary 2 -> 3, up_primary 2 -> 3, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Oct 14 04:12:27 localhost ceph-osd[31500]: osd.0 pg_epoch: 41 pg[3.18( empty local-lis/les=35/36 n=0 ec=35/20 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=14.810354233s) [3,4,1] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY 
pruub 1200.865356445s@ mbc={}] state: transitioning to Stray Oct 14 04:12:27 localhost ceph-osd[31500]: osd.0 pg_epoch: 41 pg[6.1e( empty local-lis/les=37/38 n=0 ec=37/28 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=8.879520416s) [4,5,3] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active pruub 1194.933593750s@ mbc={}] start_peering_interval up [0,5,2] -> [4,5,3], acting [0,5,2] -> [4,5,3], acting_primary 0 -> 4, up_primary 0 -> 4, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Oct 14 04:12:27 localhost ceph-osd[31500]: osd.0 pg_epoch: 41 pg[6.12( empty local-lis/les=37/38 n=0 ec=37/28 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=8.880610466s) [4,1,0] r=2 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active pruub 1194.935668945s@ mbc={}] start_peering_interval up [0,5,2] -> [4,1,0], acting [0,5,2] -> [4,1,0], acting_primary 0 -> 4, up_primary 0 -> 4, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Oct 14 04:12:27 localhost ceph-osd[31500]: osd.0 pg_epoch: 41 pg[6.13( empty local-lis/les=37/38 n=0 ec=37/28 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=8.880889893s) [3,4,5] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active pruub 1194.936279297s@ mbc={}] start_peering_interval up [0,5,2] -> [3,4,5], acting [0,5,2] -> [3,4,5], acting_primary 0 -> 3, up_primary 0 -> 3, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Oct 14 04:12:27 localhost ceph-osd[31500]: osd.0 pg_epoch: 41 pg[3.16( empty local-lis/les=35/36 n=0 ec=35/20 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=14.811059952s) [1,3,4] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active pruub 1200.866333008s@ mbc={}] start_peering_interval up [2,1,0] -> [1,3,4], acting [2,1,0] -> [1,3,4], acting_primary 2 -> 1, up_primary 2 -> 1, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Oct 14 04:12:27 localhost ceph-osd[31500]: osd.0 pg_epoch: 41 pg[6.1e( empty local-lis/les=37/38 n=0 ec=37/28 lis/c=37/37 les/c/f=38/38/0 sis=41 
pruub=8.878392220s) [4,5,3] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1194.933593750s@ mbc={}] state: transitioning to Stray Oct 14 04:12:27 localhost ceph-osd[31500]: osd.0 pg_epoch: 41 pg[3.17( empty local-lis/les=35/36 n=0 ec=35/20 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=14.810394287s) [0,5,4] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active pruub 1200.865722656s@ mbc={}] start_peering_interval up [2,1,0] -> [0,5,4], acting [2,1,0] -> [0,5,4], acting_primary 2 -> 0, up_primary 2 -> 0, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015 Oct 14 04:12:27 localhost ceph-osd[31500]: osd.0 pg_epoch: 41 pg[3.17( empty local-lis/les=35/36 n=0 ec=35/20 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=14.810394287s) [0,5,4] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown pruub 1200.865722656s@ mbc={}] state: transitioning to Primary Oct 14 04:12:27 localhost ceph-osd[31500]: osd.0 pg_epoch: 41 pg[6.13( empty local-lis/les=37/38 n=0 ec=37/28 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=8.880832672s) [3,4,5] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1194.936279297s@ mbc={}] state: transitioning to Stray Oct 14 04:12:27 localhost ceph-osd[31500]: osd.0 pg_epoch: 41 pg[6.12( empty local-lis/les=37/38 n=0 ec=37/28 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=8.880496025s) [4,1,0] r=2 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1194.935668945s@ mbc={}] state: transitioning to Stray Oct 14 04:12:27 localhost ceph-osd[31500]: osd.0 pg_epoch: 41 pg[3.16( empty local-lis/les=35/36 n=0 ec=35/20 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=14.811023712s) [1,3,4] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1200.866333008s@ mbc={}] state: transitioning to Stray Oct 14 04:12:27 localhost ceph-osd[31500]: osd.0 pg_epoch: 41 pg[6.10( empty local-lis/les=37/38 n=0 ec=37/28 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=8.879928589s) [0,2,5] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active pruub 
1194.935546875s@ mbc={}] start_peering_interval up [0,5,2] -> [0,2,5], acting [0,5,2] -> [0,2,5], acting_primary 0 -> 0, up_primary 0 -> 0, role 0 -> 0, features acting 4540138322906710015 upacting 4540138322906710015 Oct 14 04:12:27 localhost ceph-osd[31500]: osd.0 pg_epoch: 41 pg[6.11( empty local-lis/les=37/38 n=0 ec=37/28 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=8.880527496s) [3,4,1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active pruub 1194.936279297s@ mbc={}] start_peering_interval up [0,5,2] -> [3,4,1], acting [0,5,2] -> [3,4,1], acting_primary 0 -> 3, up_primary 0 -> 3, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Oct 14 04:12:27 localhost ceph-osd[31500]: osd.0 pg_epoch: 41 pg[6.16( empty local-lis/les=37/38 n=0 ec=37/28 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=8.879952431s) [0,1,4] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active pruub 1194.935668945s@ mbc={}] start_peering_interval up [0,5,2] -> [0,1,4], acting [0,5,2] -> [0,1,4], acting_primary 0 -> 0, up_primary 0 -> 0, role 0 -> 0, features acting 4540138322906710015 upacting 4540138322906710015 Oct 14 04:12:27 localhost ceph-osd[31500]: osd.0 pg_epoch: 41 pg[3.13( empty local-lis/les=35/36 n=0 ec=35/20 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=14.811300278s) [1,3,2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active pruub 1200.867187500s@ mbc={}] start_peering_interval up [2,1,0] -> [1,3,2], acting [2,1,0] -> [1,3,2], acting_primary 2 -> 1, up_primary 2 -> 1, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Oct 14 04:12:27 localhost ceph-osd[31500]: osd.0 pg_epoch: 41 pg[6.11( empty local-lis/les=37/38 n=0 ec=37/28 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=8.880465508s) [3,4,1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1194.936279297s@ mbc={}] state: transitioning to Stray Oct 14 04:12:27 localhost ceph-osd[31500]: osd.0 pg_epoch: 41 pg[3.14( empty local-lis/les=35/36 n=0 ec=35/20 lis/c=35/35 les/c/f=36/36/0 sis=41 
pruub=14.810933113s) [1,2,0] r=2 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active pruub 1200.866699219s@ mbc={}] start_peering_interval up [2,1,0] -> [1,2,0], acting [2,1,0] -> [1,2,0], acting_primary 2 -> 1, up_primary 2 -> 1, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Oct 14 04:12:27 localhost ceph-osd[31500]: osd.0 pg_epoch: 41 pg[6.16( empty local-lis/les=37/38 n=0 ec=37/28 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=8.879952431s) [0,1,4] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown pruub 1194.935668945s@ mbc={}] state: transitioning to Primary Oct 14 04:12:27 localhost ceph-osd[31500]: osd.0 pg_epoch: 41 pg[3.13( empty local-lis/les=35/36 n=0 ec=35/20 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=14.811250687s) [1,3,2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1200.867187500s@ mbc={}] state: transitioning to Stray Oct 14 04:12:27 localhost ceph-osd[31500]: osd.0 pg_epoch: 41 pg[3.14( empty local-lis/les=35/36 n=0 ec=35/20 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=14.810636520s) [1,2,0] r=2 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1200.866699219s@ mbc={}] state: transitioning to Stray Oct 14 04:12:27 localhost ceph-osd[31500]: osd.0 pg_epoch: 41 pg[3.12( empty local-lis/les=35/36 n=0 ec=35/20 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=14.810376167s) [0,5,4] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active pruub 1200.866699219s@ mbc={}] start_peering_interval up [2,1,0] -> [0,5,4], acting [2,1,0] -> [0,5,4], acting_primary 2 -> 0, up_primary 2 -> 0, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015 Oct 14 04:12:27 localhost ceph-osd[31500]: osd.0 pg_epoch: 41 pg[6.17( empty local-lis/les=37/38 n=0 ec=37/28 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=8.879979134s) [4,0,1] r=1 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active pruub 1194.936401367s@ mbc={}] start_peering_interval up [0,5,2] -> [4,0,1], acting [0,5,2] -> [4,0,1], acting_primary 0 -> 4, up_primary 0 -> 4, role 0 -> 
1, features acting 4540138322906710015 upacting 4540138322906710015 Oct 14 04:12:27 localhost ceph-osd[31500]: osd.0 pg_epoch: 41 pg[3.12( empty local-lis/les=35/36 n=0 ec=35/20 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=14.810376167s) [0,5,4] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown pruub 1200.866699219s@ mbc={}] state: transitioning to Primary Oct 14 04:12:27 localhost ceph-osd[31500]: osd.0 pg_epoch: 41 pg[6.10( empty local-lis/les=37/38 n=0 ec=37/28 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=8.879928589s) [0,2,5] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown pruub 1194.935546875s@ mbc={}] state: transitioning to Primary Oct 14 04:12:27 localhost ceph-osd[31500]: osd.0 pg_epoch: 41 pg[6.14( empty local-lis/les=37/38 n=0 ec=37/28 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=8.878652573s) [3,5,4] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active pruub 1194.935058594s@ mbc={}] start_peering_interval up [0,5,2] -> [3,5,4], acting [0,5,2] -> [3,5,4], acting_primary 0 -> 3, up_primary 0 -> 3, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Oct 14 04:12:27 localhost ceph-osd[31500]: osd.0 pg_epoch: 41 pg[3.11( empty local-lis/les=35/36 n=0 ec=35/20 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=14.809930801s) [4,5,0] r=2 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active pruub 1200.866333008s@ mbc={}] start_peering_interval up [2,1,0] -> [4,5,0], acting [2,1,0] -> [4,5,0], acting_primary 2 -> 4, up_primary 2 -> 4, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Oct 14 04:12:27 localhost ceph-osd[31500]: osd.0 pg_epoch: 41 pg[6.17( empty local-lis/les=37/38 n=0 ec=37/28 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=8.879895210s) [4,0,1] r=1 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1194.936401367s@ mbc={}] state: transitioning to Stray Oct 14 04:12:27 localhost ceph-osd[31500]: osd.0 pg_epoch: 41 pg[3.11( empty local-lis/les=35/36 n=0 ec=35/20 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=14.809887886s) 
[4,5,0] r=2 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1200.866333008s@ mbc={}] state: transitioning to Stray Oct 14 04:12:27 localhost ceph-osd[31500]: osd.0 pg_epoch: 41 pg[6.14( empty local-lis/les=37/38 n=0 ec=37/28 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=8.878595352s) [3,5,4] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1194.935058594s@ mbc={}] state: transitioning to Stray Oct 14 04:12:27 localhost ceph-osd[31500]: osd.0 pg_epoch: 41 pg[3.10( empty local-lis/les=35/36 n=0 ec=35/20 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=14.809736252s) [1,4,3] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active pruub 1200.866455078s@ mbc={}] start_peering_interval up [2,1,0] -> [1,4,3], acting [2,1,0] -> [1,4,3], acting_primary 2 -> 1, up_primary 2 -> 1, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Oct 14 04:12:27 localhost ceph-osd[31500]: osd.0 pg_epoch: 41 pg[6.15( empty local-lis/les=37/38 n=0 ec=37/28 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=8.878708839s) [5,4,0] r=2 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active pruub 1194.935424805s@ mbc={}] start_peering_interval up [0,5,2] -> [5,4,0], acting [0,5,2] -> [5,4,0], acting_primary 0 -> 5, up_primary 0 -> 5, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Oct 14 04:12:27 localhost ceph-osd[31500]: osd.0 pg_epoch: 41 pg[3.10( empty local-lis/les=35/36 n=0 ec=35/20 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=14.809669495s) [1,4,3] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1200.866455078s@ mbc={}] state: transitioning to Stray Oct 14 04:12:27 localhost ceph-osd[31500]: osd.0 pg_epoch: 41 pg[3.f( empty local-lis/les=35/36 n=0 ec=35/20 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=14.810736656s) [5,4,0] r=2 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active pruub 1200.867675781s@ mbc={}] start_peering_interval up [2,1,0] -> [5,4,0], acting [2,1,0] -> [5,4,0], acting_primary 2 -> 5, up_primary 2 -> 5, role 2 -> 2, features 
acting 4540138322906710015 upacting 4540138322906710015 Oct 14 04:12:27 localhost ceph-osd[31500]: osd.0 pg_epoch: 41 pg[6.15( empty local-lis/les=37/38 n=0 ec=37/28 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=8.878673553s) [5,4,0] r=2 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1194.935424805s@ mbc={}] state: transitioning to Stray Oct 14 04:12:27 localhost ceph-osd[31500]: osd.0 pg_epoch: 41 pg[3.f( empty local-lis/les=35/36 n=0 ec=35/20 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=14.810689926s) [5,4,0] r=2 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1200.867675781s@ mbc={}] state: transitioning to Stray Oct 14 04:12:27 localhost ceph-osd[31500]: osd.0 pg_epoch: 41 pg[6.a( empty local-lis/les=37/38 n=0 ec=37/28 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=8.877969742s) [5,0,4] r=1 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active pruub 1194.934936523s@ mbc={}] start_peering_interval up [0,5,2] -> [5,0,4], acting [0,5,2] -> [5,0,4], acting_primary 0 -> 5, up_primary 0 -> 5, role 0 -> 1, features acting 4540138322906710015 upacting 4540138322906710015 Oct 14 04:12:27 localhost ceph-osd[31500]: osd.0 pg_epoch: 41 pg[3.e( empty local-lis/les=35/36 n=0 ec=35/20 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=14.809835434s) [2,5,0] r=2 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active pruub 1200.866699219s@ mbc={}] start_peering_interval up [2,1,0] -> [2,5,0], acting [2,1,0] -> [2,5,0], acting_primary 2 -> 2, up_primary 2 -> 2, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Oct 14 04:12:27 localhost ceph-osd[31500]: osd.0 pg_epoch: 41 pg[3.e( empty local-lis/les=35/36 n=0 ec=35/20 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=14.809800148s) [2,5,0] r=2 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1200.866699219s@ mbc={}] state: transitioning to Stray Oct 14 04:12:27 localhost ceph-osd[31500]: osd.0 pg_epoch: 41 pg[6.a( empty local-lis/les=37/38 n=0 ec=37/28 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=8.877907753s) [5,0,4] r=1 
lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1194.934936523s@ mbc={}] state: transitioning to Stray Oct 14 04:12:27 localhost ceph-osd[31500]: osd.0 pg_epoch: 41 pg[3.d( empty local-lis/les=35/36 n=0 ec=35/20 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=14.810348511s) [5,2,3] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active pruub 1200.867553711s@ mbc={}] start_peering_interval up [2,1,0] -> [5,2,3], acting [2,1,0] -> [5,2,3], acting_primary 2 -> 5, up_primary 2 -> 5, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Oct 14 04:12:27 localhost ceph-osd[31500]: osd.0 pg_epoch: 41 pg[6.8( empty local-lis/les=37/38 n=0 ec=37/28 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=8.877741814s) [5,2,3] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active pruub 1194.935058594s@ mbc={}] start_peering_interval up [0,5,2] -> [5,2,3], acting [0,5,2] -> [5,2,3], acting_primary 0 -> 5, up_primary 0 -> 5, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Oct 14 04:12:27 localhost ceph-osd[31500]: osd.0 pg_epoch: 41 pg[3.c( empty local-lis/les=35/36 n=0 ec=35/20 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=14.810262680s) [5,3,4] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active pruub 1200.867553711s@ mbc={}] start_peering_interval up [2,1,0] -> [5,3,4], acting [2,1,0] -> [5,3,4], acting_primary 2 -> 5, up_primary 2 -> 5, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Oct 14 04:12:27 localhost ceph-osd[31500]: osd.0 pg_epoch: 41 pg[3.d( empty local-lis/les=35/36 n=0 ec=35/20 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=14.810228348s) [5,2,3] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1200.867553711s@ mbc={}] state: transitioning to Stray Oct 14 04:12:27 localhost ceph-osd[31500]: osd.0 pg_epoch: 41 pg[6.8( empty local-lis/les=37/38 n=0 ec=37/28 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=8.877709389s) [5,2,3] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 
1194.935058594s@ mbc={}] state: transitioning to Stray Oct 14 04:12:27 localhost ceph-osd[31500]: osd.0 pg_epoch: 41 pg[6.b( empty local-lis/les=37/38 n=0 ec=37/28 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=8.877722740s) [3,1,4] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active pruub 1194.935058594s@ mbc={}] start_peering_interval up [0,5,2] -> [3,1,4], acting [0,5,2] -> [3,1,4], acting_primary 0 -> 3, up_primary 0 -> 3, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Oct 14 04:12:27 localhost ceph-osd[31500]: osd.0 pg_epoch: 41 pg[3.c( empty local-lis/les=35/36 n=0 ec=35/20 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=14.810149193s) [5,3,4] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1200.867553711s@ mbc={}] state: transitioning to Stray Oct 14 04:12:27 localhost ceph-osd[31500]: osd.0 pg_epoch: 41 pg[6.b( empty local-lis/les=37/38 n=0 ec=37/28 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=8.877593994s) [3,1,4] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1194.935058594s@ mbc={}] state: transitioning to Stray Oct 14 04:12:27 localhost ceph-osd[31500]: osd.0 pg_epoch: 41 pg[6.9( empty local-lis/les=37/38 n=0 ec=37/28 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=8.877406120s) [0,2,5] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active pruub 1194.934936523s@ mbc={}] start_peering_interval up [0,5,2] -> [0,2,5], acting [0,5,2] -> [0,2,5], acting_primary 0 -> 0, up_primary 0 -> 0, role 0 -> 0, features acting 4540138322906710015 upacting 4540138322906710015 Oct 14 04:12:27 localhost ceph-osd[31500]: osd.0 pg_epoch: 41 pg[6.5( empty local-lis/les=37/38 n=0 ec=37/28 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=8.876602173s) [5,2,0] r=2 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active pruub 1194.934326172s@ mbc={}] start_peering_interval up [0,5,2] -> [5,2,0], acting [0,5,2] -> [5,2,0], acting_primary 0 -> 5, up_primary 0 -> 5, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Oct 14 04:12:27 
localhost ceph-osd[31500]: osd.0 pg_epoch: 41 pg[6.9( empty local-lis/les=37/38 n=0 ec=37/28 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=8.877406120s) [0,2,5] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown pruub 1194.934936523s@ mbc={}] state: transitioning to Primary Oct 14 04:12:27 localhost ceph-osd[31500]: osd.0 pg_epoch: 41 pg[3.5( empty local-lis/les=35/36 n=0 ec=35/20 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=14.809152603s) [5,3,4] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active pruub 1200.866821289s@ mbc={}] start_peering_interval up [2,1,0] -> [5,3,4], acting [2,1,0] -> [5,3,4], acting_primary 2 -> 5, up_primary 2 -> 5, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Oct 14 04:12:27 localhost ceph-osd[31500]: osd.0 pg_epoch: 41 pg[6.5( empty local-lis/les=37/38 n=0 ec=37/28 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=8.876527786s) [5,2,0] r=2 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1194.934326172s@ mbc={}] state: transitioning to Stray Oct 14 04:12:27 localhost ceph-osd[31500]: osd.0 pg_epoch: 41 pg[3.1( empty local-lis/les=35/36 n=0 ec=35/20 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=14.809659958s) [0,5,2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active pruub 1200.867553711s@ mbc={}] start_peering_interval up [2,1,0] -> [0,5,2], acting [2,1,0] -> [0,5,2], acting_primary 2 -> 0, up_primary 2 -> 0, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015 Oct 14 04:12:27 localhost ceph-osd[31500]: osd.0 pg_epoch: 41 pg[3.5( empty local-lis/les=35/36 n=0 ec=35/20 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=14.809075356s) [5,3,4] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1200.866821289s@ mbc={}] state: transitioning to Stray Oct 14 04:12:27 localhost ceph-osd[31500]: osd.0 pg_epoch: 41 pg[3.1( empty local-lis/les=35/36 n=0 ec=35/20 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=14.809659958s) [0,5,2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown pruub 1200.867553711s@ mbc={}] 
state: transitioning to Primary Oct 14 04:12:27 localhost ceph-osd[31500]: osd.0 pg_epoch: 41 pg[3.2( empty local-lis/les=35/36 n=0 ec=35/20 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=14.809311867s) [3,2,5] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active pruub 1200.867187500s@ mbc={}] start_peering_interval up [2,1,0] -> [3,2,5], acting [2,1,0] -> [3,2,5], acting_primary 2 -> 3, up_primary 2 -> 3, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Oct 14 04:12:27 localhost ceph-osd[31500]: osd.0 pg_epoch: 41 pg[6.4( empty local-lis/les=37/38 n=0 ec=37/28 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=8.876947403s) [3,1,4] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active pruub 1194.934936523s@ mbc={}] start_peering_interval up [0,5,2] -> [3,1,4], acting [0,5,2] -> [3,1,4], acting_primary 0 -> 3, up_primary 0 -> 3, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Oct 14 04:12:27 localhost ceph-osd[31500]: osd.0 pg_epoch: 41 pg[3.2( empty local-lis/les=35/36 n=0 ec=35/20 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=14.809275627s) [3,2,5] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1200.867187500s@ mbc={}] state: transitioning to Stray Oct 14 04:12:27 localhost ceph-osd[31500]: osd.0 pg_epoch: 41 pg[3.3( empty local-lis/les=35/36 n=0 ec=35/20 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=14.808588028s) [5,0,4] r=1 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active pruub 1200.866821289s@ mbc={}] start_peering_interval up [2,1,0] -> [5,0,4], acting [2,1,0] -> [5,0,4], acting_primary 2 -> 5, up_primary 2 -> 5, role 2 -> 1, features acting 4540138322906710015 upacting 4540138322906710015 Oct 14 04:12:27 localhost ceph-osd[31500]: osd.0 pg_epoch: 41 pg[6.4( empty local-lis/les=37/38 n=0 ec=37/28 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=8.876817703s) [3,1,4] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1194.934936523s@ mbc={}] state: transitioning to Stray Oct 14 04:12:27 localhost ceph-osd[31500]: 
osd.0 pg_epoch: 41 pg[6.7( empty local-lis/les=37/38 n=0 ec=37/28 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=8.875946045s) [5,3,4] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active pruub 1194.934326172s@ mbc={}] start_peering_interval up [0,5,2] -> [5,3,4], acting [0,5,2] -> [5,3,4], acting_primary 0 -> 5, up_primary 0 -> 5, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Oct 14 04:12:27 localhost ceph-osd[31500]: osd.0 pg_epoch: 41 pg[6.7( empty local-lis/les=37/38 n=0 ec=37/28 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=8.875857353s) [5,3,4] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1194.934326172s@ mbc={}] state: transitioning to Stray Oct 14 04:12:27 localhost ceph-osd[31500]: osd.0 pg_epoch: 41 pg[3.3( empty local-lis/les=35/36 n=0 ec=35/20 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=14.808465958s) [5,0,4] r=1 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1200.866821289s@ mbc={}] state: transitioning to Stray Oct 14 04:12:27 localhost ceph-osd[31500]: osd.0 pg_epoch: 41 pg[6.6( empty local-lis/les=37/38 n=0 ec=37/28 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=8.876214027s) [3,5,4] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active pruub 1194.935546875s@ mbc={}] start_peering_interval up [0,5,2] -> [3,5,4], acting [0,5,2] -> [3,5,4], acting_primary 0 -> 3, up_primary 0 -> 3, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Oct 14 04:12:27 localhost ceph-osd[31500]: osd.0 pg_epoch: 41 pg[6.6( empty local-lis/les=37/38 n=0 ec=37/28 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=8.876128197s) [3,5,4] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1194.935546875s@ mbc={}] state: transitioning to Stray Oct 14 04:12:27 localhost ceph-osd[31500]: osd.0 pg_epoch: 41 pg[3.4( empty local-lis/les=35/36 n=0 ec=35/20 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=14.807239532s) [3,2,1] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active pruub 1200.866577148s@ mbc={}] 
start_peering_interval up [2,1,0] -> [3,2,1], acting [2,1,0] -> [3,2,1], acting_primary 2 -> 3, up_primary 2 -> 3, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Oct 14 04:12:27 localhost ceph-osd[31500]: osd.0 pg_epoch: 41 pg[6.1( empty local-lis/les=37/38 n=0 ec=37/28 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=8.874656677s) [2,1,3] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active pruub 1194.934326172s@ mbc={}] start_peering_interval up [0,5,2] -> [2,1,3], acting [0,5,2] -> [2,1,3], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Oct 14 04:12:27 localhost ceph-osd[31500]: osd.0 pg_epoch: 41 pg[3.6( empty local-lis/les=35/36 n=0 ec=35/20 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=14.807065010s) [0,1,4] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active pruub 1200.866699219s@ mbc={}] start_peering_interval up [2,1,0] -> [0,1,4], acting [2,1,0] -> [0,1,4], acting_primary 2 -> 0, up_primary 2 -> 0, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015 Oct 14 04:12:27 localhost ceph-osd[31500]: osd.0 pg_epoch: 41 pg[6.3( empty local-lis/les=37/38 n=0 ec=37/28 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=8.874226570s) [5,2,0] r=2 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active pruub 1194.934082031s@ mbc={}] start_peering_interval up [0,5,2] -> [5,2,0], acting [0,5,2] -> [5,2,0], acting_primary 0 -> 5, up_primary 0 -> 5, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Oct 14 04:12:27 localhost ceph-osd[31500]: osd.0 pg_epoch: 41 pg[3.4( empty local-lis/les=35/36 n=0 ec=35/20 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=14.806760788s) [3,2,1] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1200.866577148s@ mbc={}] state: transitioning to Stray Oct 14 04:12:27 localhost ceph-osd[31500]: osd.0 pg_epoch: 41 pg[3.6( empty local-lis/les=35/36 n=0 ec=35/20 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=14.807065010s) [0,1,4] 
r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown pruub 1200.866699219s@ mbc={}] state: transitioning to Primary Oct 14 04:12:27 localhost ceph-osd[31500]: osd.0 pg_epoch: 41 pg[6.1( empty local-lis/les=37/38 n=0 ec=37/28 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=8.874429703s) [2,1,3] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1194.934326172s@ mbc={}] state: transitioning to Stray Oct 14 04:12:27 localhost ceph-osd[31500]: osd.0 pg_epoch: 41 pg[6.2( empty local-lis/les=37/38 n=0 ec=37/28 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=8.874296188s) [1,3,4] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active pruub 1194.934448242s@ mbc={}] start_peering_interval up [0,5,2] -> [1,3,4], acting [0,5,2] -> [1,3,4], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Oct 14 04:12:27 localhost ceph-osd[31500]: osd.0 pg_epoch: 41 pg[3.8( empty local-lis/les=35/36 n=0 ec=35/20 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=14.806596756s) [4,0,5] r=1 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active pruub 1200.866699219s@ mbc={}] start_peering_interval up [2,1,0] -> [4,0,5], acting [2,1,0] -> [4,0,5], acting_primary 2 -> 4, up_primary 2 -> 4, role 2 -> 1, features acting 4540138322906710015 upacting 4540138322906710015 Oct 14 04:12:27 localhost ceph-osd[31500]: osd.0 pg_epoch: 41 pg[6.2( empty local-lis/les=37/38 n=0 ec=37/28 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=8.874171257s) [1,3,4] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1194.934448242s@ mbc={}] state: transitioning to Stray Oct 14 04:12:27 localhost ceph-osd[31500]: osd.0 pg_epoch: 41 pg[3.8( empty local-lis/les=35/36 n=0 ec=35/20 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=14.806524277s) [4,0,5] r=1 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1200.866699219s@ mbc={}] state: transitioning to Stray Oct 14 04:12:27 localhost ceph-osd[31500]: osd.0 pg_epoch: 41 pg[6.3( empty local-lis/les=37/38 n=0 ec=37/28 lis/c=37/37 
les/c/f=38/38/0 sis=41 pruub=8.874117851s) [5,2,0] r=2 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1194.934082031s@ mbc={}] state: transitioning to Stray Oct 14 04:12:27 localhost ceph-osd[31500]: osd.0 pg_epoch: 41 pg[3.7( empty local-lis/les=35/36 n=0 ec=35/20 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=14.806664467s) [3,2,1] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active pruub 1200.866699219s@ mbc={}] start_peering_interval up [2,1,0] -> [3,2,1], acting [2,1,0] -> [3,2,1], acting_primary 2 -> 3, up_primary 2 -> 3, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Oct 14 04:12:27 localhost ceph-osd[31500]: osd.0 pg_epoch: 41 pg[3.7( empty local-lis/les=35/36 n=0 ec=35/20 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=14.806282997s) [3,2,1] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1200.866699219s@ mbc={}] state: transitioning to Stray Oct 14 04:12:27 localhost ceph-osd[31500]: osd.0 pg_epoch: 41 pg[6.d( empty local-lis/les=37/38 n=0 ec=37/28 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=8.874011040s) [1,3,2] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active pruub 1194.934570312s@ mbc={}] start_peering_interval up [0,5,2] -> [1,3,2], acting [0,5,2] -> [1,3,2], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Oct 14 04:12:27 localhost ceph-osd[31500]: osd.0 pg_epoch: 41 pg[3.9( empty local-lis/les=35/36 n=0 ec=35/20 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=14.806074142s) [4,1,3] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active pruub 1200.866577148s@ mbc={}] start_peering_interval up [2,1,0] -> [4,1,3], acting [2,1,0] -> [4,1,3], acting_primary 2 -> 4, up_primary 2 -> 4, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Oct 14 04:12:27 localhost ceph-osd[31500]: osd.0 pg_epoch: 41 pg[6.d( empty local-lis/les=37/38 n=0 ec=37/28 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=8.873959541s) [1,3,2] r=-1 lpr=41 
pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1194.934570312s@ mbc={}] state: transitioning to Stray Oct 14 04:12:27 localhost ceph-osd[31500]: osd.0 pg_epoch: 41 pg[3.a( empty local-lis/les=35/36 n=0 ec=35/20 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=14.806906700s) [5,3,2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active pruub 1200.867675781s@ mbc={}] start_peering_interval up [2,1,0] -> [5,3,2], acting [2,1,0] -> [5,3,2], acting_primary 2 -> 5, up_primary 2 -> 5, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Oct 14 04:12:27 localhost ceph-osd[31500]: osd.0 pg_epoch: 41 pg[3.9( empty local-lis/les=35/36 n=0 ec=35/20 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=14.806025505s) [4,1,3] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1200.866577148s@ mbc={}] state: transitioning to Stray Oct 14 04:12:27 localhost ceph-osd[31500]: osd.0 pg_epoch: 41 pg[6.e( empty local-lis/les=37/38 n=0 ec=37/28 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=8.874244690s) [5,3,2] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active pruub 1194.935058594s@ mbc={}] start_peering_interval up [0,5,2] -> [5,3,2], acting [0,5,2] -> [5,3,2], acting_primary 0 -> 5, up_primary 0 -> 5, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Oct 14 04:12:27 localhost ceph-osd[31500]: osd.0 pg_epoch: 41 pg[3.a( empty local-lis/les=35/36 n=0 ec=35/20 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=14.806832314s) [5,3,2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1200.867675781s@ mbc={}] state: transitioning to Stray Oct 14 04:12:27 localhost ceph-osd[31500]: osd.0 pg_epoch: 41 pg[6.e( empty local-lis/les=37/38 n=0 ec=37/28 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=8.874194145s) [5,3,2] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1194.935058594s@ mbc={}] state: transitioning to Stray Oct 14 04:12:27 localhost ceph-osd[31500]: osd.0 pg_epoch: 41 pg[6.c( empty local-lis/les=37/38 n=0 ec=37/28 lis/c=37/37 
les/c/f=38/38/0 sis=41 pruub=8.873300552s) [3,1,4] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active pruub 1194.934326172s@ mbc={}] start_peering_interval up [0,5,2] -> [3,1,4], acting [0,5,2] -> [3,1,4], acting_primary 0 -> 3, up_primary 0 -> 3, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Oct 14 04:12:27 localhost ceph-osd[31500]: osd.0 pg_epoch: 41 pg[6.19( empty local-lis/les=37/38 n=0 ec=37/28 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=8.872550964s) [5,3,4] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active pruub 1194.933593750s@ mbc={}] start_peering_interval up [0,5,2] -> [5,3,4], acting [0,5,2] -> [5,3,4], acting_primary 0 -> 5, up_primary 0 -> 5, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Oct 14 04:12:27 localhost ceph-osd[31500]: osd.0 pg_epoch: 41 pg[3.b( empty local-lis/les=35/36 n=0 ec=35/20 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=14.805617332s) [3,1,4] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active pruub 1200.866577148s@ mbc={}] start_peering_interval up [2,1,0] -> [3,1,4], acting [2,1,0] -> [3,1,4], acting_primary 2 -> 3, up_primary 2 -> 3, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Oct 14 04:12:27 localhost ceph-osd[31500]: osd.0 pg_epoch: 41 pg[3.1c( empty local-lis/les=35/36 n=0 ec=35/20 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=14.804511070s) [5,3,2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active pruub 1200.865722656s@ mbc={}] start_peering_interval up [2,1,0] -> [5,3,2], acting [2,1,0] -> [5,3,2], acting_primary 2 -> 5, up_primary 2 -> 5, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Oct 14 04:12:27 localhost ceph-osd[31500]: osd.0 pg_epoch: 41 pg[6.19( empty local-lis/les=37/38 n=0 ec=37/28 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=8.872511864s) [5,3,4] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1194.933593750s@ mbc={}] state: transitioning to Stray Oct 14 04:12:27 localhost 
ceph-osd[31500]: osd.0 pg_epoch: 41 pg[6.c( empty local-lis/les=37/38 n=0 ec=37/28 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=8.873167038s) [3,1,4] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1194.934326172s@ mbc={}] state: transitioning to Stray Oct 14 04:12:27 localhost ceph-osd[31500]: osd.0 pg_epoch: 41 pg[3.b( empty local-lis/les=35/36 n=0 ec=35/20 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=14.805485725s) [3,1,4] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1200.866577148s@ mbc={}] state: transitioning to Stray Oct 14 04:12:27 localhost ceph-osd[31500]: osd.0 pg_epoch: 41 pg[3.1c( empty local-lis/les=35/36 n=0 ec=35/20 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=14.804476738s) [5,3,2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1200.865722656s@ mbc={}] state: transitioning to Stray Oct 14 04:12:27 localhost ceph-osd[31500]: osd.0 pg_epoch: 41 pg[3.1d( empty local-lis/les=35/36 n=0 ec=35/20 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=14.803957939s) [2,5,3] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active pruub 1200.865356445s@ mbc={}] start_peering_interval up [2,1,0] -> [2,5,3], acting [2,1,0] -> [2,5,3], acting_primary 2 -> 2, up_primary 2 -> 2, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Oct 14 04:12:27 localhost ceph-osd[31500]: osd.0 pg_epoch: 41 pg[6.f( empty local-lis/les=37/38 n=0 ec=37/28 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=8.874012947s) [3,4,5] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active pruub 1194.935424805s@ mbc={}] start_peering_interval up [0,5,2] -> [3,4,5], acting [0,5,2] -> [3,4,5], acting_primary 0 -> 3, up_primary 0 -> 3, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Oct 14 04:12:27 localhost ceph-osd[31500]: osd.0 pg_epoch: 41 pg[3.1d( empty local-lis/les=35/36 n=0 ec=35/20 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=14.803921700s) [2,5,3] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 
1200.865356445s@ mbc={}] state: transitioning to Stray Oct 14 04:12:27 localhost ceph-osd[31500]: osd.0 pg_epoch: 41 pg[6.18( empty local-lis/les=37/38 n=0 ec=37/28 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=8.872651100s) [0,1,4] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active pruub 1194.934082031s@ mbc={}] start_peering_interval up [0,5,2] -> [0,1,4], acting [0,5,2] -> [0,1,4], acting_primary 0 -> 0, up_primary 0 -> 0, role 0 -> 0, features acting 4540138322906710015 upacting 4540138322906710015 Oct 14 04:12:27 localhost ceph-osd[31500]: osd.0 pg_epoch: 41 pg[6.1b( empty local-lis/les=37/38 n=0 ec=37/28 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=8.871747971s) [2,1,0] r=2 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active pruub 1194.933715820s@ mbc={}] start_peering_interval up [0,5,2] -> [2,1,0], acting [0,5,2] -> [2,1,0], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Oct 14 04:12:27 localhost ceph-osd[31500]: osd.0 pg_epoch: 41 pg[6.18( empty local-lis/les=37/38 n=0 ec=37/28 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=8.872651100s) [0,1,4] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown pruub 1194.934082031s@ mbc={}] state: transitioning to Primary Oct 14 04:12:27 localhost ceph-osd[31500]: osd.0 pg_epoch: 41 pg[6.f( empty local-lis/les=37/38 n=0 ec=37/28 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=8.873546600s) [3,4,5] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1194.935424805s@ mbc={}] state: transitioning to Stray Oct 14 04:12:27 localhost ceph-osd[31500]: osd.0 pg_epoch: 41 pg[6.1b( empty local-lis/les=37/38 n=0 ec=37/28 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=8.871669769s) [2,1,0] r=2 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1194.933715820s@ mbc={}] state: transitioning to Stray Oct 14 04:12:27 localhost ceph-osd[31500]: osd.0 pg_epoch: 41 pg[3.1e( empty local-lis/les=35/36 n=0 ec=35/20 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=14.804007530s) [3,2,5] r=-1 
lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active pruub 1200.866210938s@ mbc={}] start_peering_interval up [2,1,0] -> [3,2,5], acting [2,1,0] -> [3,2,5], acting_primary 2 -> 3, up_primary 2 -> 3, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Oct 14 04:12:27 localhost ceph-osd[31500]: osd.0 pg_epoch: 41 pg[6.1a( empty local-lis/les=37/38 n=0 ec=37/28 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=8.871774673s) [5,2,0] r=2 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active pruub 1194.934082031s@ mbc={}] start_peering_interval up [0,5,2] -> [5,2,0], acting [0,5,2] -> [5,2,0], acting_primary 0 -> 5, up_primary 0 -> 5, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Oct 14 04:12:27 localhost ceph-osd[31500]: osd.0 pg_epoch: 41 pg[3.1e( empty local-lis/les=35/36 n=0 ec=35/20 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=14.803868294s) [3,2,5] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1200.866210938s@ mbc={}] state: transitioning to Stray Oct 14 04:12:27 localhost ceph-osd[31500]: osd.0 pg_epoch: 41 pg[6.1a( empty local-lis/les=37/38 n=0 ec=37/28 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=8.871729851s) [5,2,0] r=2 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1194.934082031s@ mbc={}] state: transitioning to Stray Oct 14 04:12:27 localhost ceph-osd[31500]: osd.0 pg_epoch: 41 pg[3.1f( empty local-lis/les=35/36 n=0 ec=35/20 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=14.803479195s) [0,1,4] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active pruub 1200.866333008s@ mbc={}] start_peering_interval up [2,1,0] -> [0,1,4], acting [2,1,0] -> [0,1,4], acting_primary 2 -> 0, up_primary 2 -> 0, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015 Oct 14 04:12:27 localhost ceph-osd[31500]: osd.0 pg_epoch: 41 pg[3.1f( empty local-lis/les=35/36 n=0 ec=35/20 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=14.803479195s) [0,1,4] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown pruub 1200.866333008s@ 
mbc={}] state: transitioning to Primary Oct 14 04:12:27 localhost ceph-osd[31500]: osd.0 pg_epoch: 41 pg[5.a( empty local-lis/les=0/0 n=0 ec=37/22 lis/c=37/37 les/c/f=38/38/0 sis=41) [0,4,1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Oct 14 04:12:27 localhost ceph-osd[31500]: osd.0 pg_epoch: 41 pg[5.19( empty local-lis/les=0/0 n=0 ec=37/22 lis/c=37/37 les/c/f=38/38/0 sis=41) [0,5,2] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Oct 14 04:12:27 localhost ceph-osd[31500]: osd.0 pg_epoch: 41 pg[5.5( empty local-lis/les=0/0 n=0 ec=37/22 lis/c=37/37 les/c/f=38/38/0 sis=41) [0,1,4] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Oct 14 04:12:27 localhost ceph-osd[32440]: osd.3 pg_epoch: 41 pg[4.19( empty local-lis/les=35/36 n=0 ec=35/21 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=14.807618141s) [2,3,1] r=1 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active pruub 1196.621704102s@ mbc={}] start_peering_interval up [3,5,2] -> [2,3,1], acting [3,5,2] -> [2,3,1], acting_primary 3 -> 2, up_primary 3 -> 2, role 0 -> 1, features acting 4540138322906710015 upacting 4540138322906710015 Oct 14 04:12:27 localhost ceph-osd[32440]: osd.3 pg_epoch: 41 pg[5.18( empty local-lis/les=37/38 n=0 ec=37/22 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=8.872303963s) [1,2,3] r=2 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active pruub 1190.686523438s@ mbc={}] start_peering_interval up [2,3,1] -> [1,2,3], acting [2,3,1] -> [1,2,3], acting_primary 2 -> 1, up_primary 2 -> 1, role 1 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Oct 14 04:12:27 localhost ceph-osd[32440]: osd.3 pg_epoch: 41 pg[4.19( empty local-lis/les=35/36 n=0 ec=35/21 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=14.807479858s) [2,3,1] r=1 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1196.621704102s@ mbc={}] state: transitioning to Stray Oct 14 04:12:27 localhost ceph-osd[32440]: osd.3 
pg_epoch: 41 pg[5.18( empty local-lis/les=37/38 n=0 ec=37/22 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=8.872169495s) [1,2,3] r=2 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1190.686523438s@ mbc={}] state: transitioning to Stray Oct 14 04:12:27 localhost ceph-osd[32440]: osd.3 pg_epoch: 41 pg[3.1e( empty local-lis/les=0/0 n=0 ec=35/20 lis/c=35/35 les/c/f=36/36/0 sis=41) [3,2,5] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Oct 14 04:12:27 localhost ceph-osd[31500]: osd.0 pg_epoch: 41 pg[5.3( empty local-lis/les=0/0 n=0 ec=37/22 lis/c=37/37 les/c/f=38/38/0 sis=41) [0,1,2] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Oct 14 04:12:27 localhost ceph-osd[32440]: osd.3 pg_epoch: 41 pg[5.1a( empty local-lis/les=37/38 n=0 ec=37/22 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=8.871683121s) [2,1,3] r=2 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active pruub 1190.686523438s@ mbc={}] start_peering_interval up [2,3,1] -> [2,1,3], acting [2,3,1] -> [2,1,3], acting_primary 2 -> 2, up_primary 2 -> 2, role 1 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Oct 14 04:12:27 localhost ceph-osd[32440]: osd.3 pg_epoch: 41 pg[4.1d( empty local-lis/les=35/36 n=0 ec=35/21 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=14.806840897s) [2,1,3] r=2 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active pruub 1196.621826172s@ mbc={}] start_peering_interval up [3,5,2] -> [2,1,3], acting [3,5,2] -> [2,1,3], acting_primary 3 -> 2, up_primary 3 -> 2, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Oct 14 04:12:27 localhost ceph-osd[32440]: osd.3 pg_epoch: 41 pg[6.1f( empty local-lis/les=0/0 n=0 ec=37/28 lis/c=37/37 les/c/f=38/38/0 sis=41) [3,4,1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Oct 14 04:12:27 localhost ceph-osd[32440]: osd.3 pg_epoch: 41 pg[4.1d( empty local-lis/les=35/36 n=0 ec=35/21 lis/c=35/35 les/c/f=36/36/0 
sis=41 pruub=14.806685448s) [2,1,3] r=2 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1196.621826172s@ mbc={}] state: transitioning to Stray Oct 14 04:12:27 localhost ceph-osd[32440]: osd.3 pg_epoch: 41 pg[4.1c( empty local-lis/les=35/36 n=0 ec=35/21 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=14.806315422s) [2,3,1] r=1 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active pruub 1196.621704102s@ mbc={}] start_peering_interval up [3,5,2] -> [2,3,1], acting [3,5,2] -> [2,3,1], acting_primary 3 -> 2, up_primary 3 -> 2, role 0 -> 1, features acting 4540138322906710015 upacting 4540138322906710015 Oct 14 04:12:27 localhost ceph-osd[32440]: osd.3 pg_epoch: 41 pg[4.1c( empty local-lis/les=35/36 n=0 ec=35/21 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=14.806265831s) [2,3,1] r=1 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1196.621704102s@ mbc={}] state: transitioning to Stray Oct 14 04:12:27 localhost ceph-osd[31500]: osd.0 pg_epoch: 41 pg[5.1e( empty local-lis/les=0/0 n=0 ec=37/22 lis/c=37/37 les/c/f=38/38/0 sis=41) [0,5,4] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Oct 14 04:12:27 localhost ceph-osd[32440]: osd.3 pg_epoch: 41 pg[6.1d( empty local-lis/les=0/0 n=0 ec=37/28 lis/c=37/37 les/c/f=38/38/0 sis=41) [3,4,5] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Oct 14 04:12:27 localhost ceph-osd[32440]: osd.3 pg_epoch: 41 pg[5.1a( empty local-lis/les=37/38 n=0 ec=37/22 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=8.871606827s) [2,1,3] r=2 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1190.686523438s@ mbc={}] state: transitioning to Stray Oct 14 04:12:27 localhost ceph-osd[32440]: osd.3 pg_epoch: 41 pg[3.18( empty local-lis/les=0/0 n=0 ec=35/20 lis/c=35/35 les/c/f=36/36/0 sis=41) [3,4,1] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Oct 14 04:12:27 localhost ceph-osd[32440]: osd.3 pg_epoch: 41 pg[6.c( empty 
local-lis/les=0/0 n=0 ec=37/28 lis/c=37/37 les/c/f=38/38/0 sis=41) [3,1,4] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Oct 14 04:12:27 localhost ceph-osd[31500]: osd.0 pg_epoch: 41 pg[4.1e( empty local-lis/les=0/0 n=0 ec=35/21 lis/c=35/35 les/c/f=36/36/0 sis=41) [0,5,2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Oct 14 04:12:27 localhost ceph-osd[32440]: osd.3 pg_epoch: 41 pg[6.6( empty local-lis/les=0/0 n=0 ec=37/28 lis/c=37/37 les/c/f=38/38/0 sis=41) [3,5,4] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Oct 14 04:12:27 localhost ceph-osd[31500]: osd.0 pg_epoch: 41 pg[4.4( empty local-lis/les=0/0 n=0 ec=35/21 lis/c=35/35 les/c/f=36/36/0 sis=41) [0,4,1] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Oct 14 04:12:27 localhost ceph-osd[32440]: osd.3 pg_epoch: 41 pg[5.6( empty local-lis/les=37/38 n=0 ec=37/22 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=8.872975349s) [3,5,2] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active pruub 1190.689941406s@ mbc={}] start_peering_interval up [2,3,1] -> [3,5,2], acting [2,3,1] -> [3,5,2], acting_primary 2 -> 3, up_primary 2 -> 3, role 1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015 Oct 14 04:12:27 localhost ceph-osd[32440]: osd.3 pg_epoch: 41 pg[5.6( empty local-lis/les=37/38 n=0 ec=37/22 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=8.872975349s) [3,5,2] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown pruub 1190.689941406s@ mbc={}] state: transitioning to Primary Oct 14 04:12:27 localhost ceph-osd[32440]: osd.3 pg_epoch: 41 pg[3.4( empty local-lis/les=0/0 n=0 ec=35/20 lis/c=35/35 les/c/f=36/36/0 sis=41) [3,2,1] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Oct 14 04:12:27 localhost ceph-osd[32440]: osd.3 pg_epoch: 41 pg[5.1( empty local-lis/les=37/38 n=0 ec=37/22 lis/c=37/37 les/c/f=38/38/0 
sis=41 pruub=8.871656418s) [1,3,4] r=1 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active pruub 1190.689941406s@ mbc={}] start_peering_interval up [2,3,1] -> [1,3,4], acting [2,3,1] -> [1,3,4], acting_primary 2 -> 1, up_primary 2 -> 1, role 1 -> 1, features acting 4540138322906710015 upacting 4540138322906710015 Oct 14 04:12:27 localhost ceph-osd[32440]: osd.3 pg_epoch: 41 pg[5.1( empty local-lis/les=37/38 n=0 ec=37/22 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=8.871606827s) [1,3,4] r=1 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1190.689941406s@ mbc={}] state: transitioning to Stray Oct 14 04:12:27 localhost ceph-osd[32440]: osd.3 pg_epoch: 41 pg[3.7( empty local-lis/les=0/0 n=0 ec=35/20 lis/c=35/35 les/c/f=36/36/0 sis=41) [3,2,1] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Oct 14 04:12:27 localhost ceph-osd[32440]: osd.3 pg_epoch: 41 pg[6.4( empty local-lis/les=0/0 n=0 ec=37/28 lis/c=37/37 les/c/f=38/38/0 sis=41) [3,1,4] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Oct 14 04:12:27 localhost ceph-osd[32440]: osd.3 pg_epoch: 41 pg[3.2( empty local-lis/les=0/0 n=0 ec=35/20 lis/c=35/35 les/c/f=36/36/0 sis=41) [3,2,5] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Oct 14 04:12:27 localhost ceph-osd[32440]: osd.3 pg_epoch: 41 pg[3.b( empty local-lis/les=0/0 n=0 ec=35/20 lis/c=35/35 les/c/f=36/36/0 sis=41) [3,1,4] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Oct 14 04:12:27 localhost ceph-osd[32440]: osd.3 pg_epoch: 41 pg[5.c( empty local-lis/les=37/38 n=0 ec=37/22 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=8.869995117s) [3,1,4] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active pruub 1190.691162109s@ mbc={}] start_peering_interval up [2,3,1] -> [3,1,4], acting [2,3,1] -> [3,1,4], acting_primary 2 -> 3, up_primary 2 -> 3, role 1 -> 0, features acting 4540138322906710015 upacting 
4540138322906710015 Oct 14 04:12:27 localhost ceph-osd[32440]: osd.3 pg_epoch: 41 pg[5.c( empty local-lis/les=37/38 n=0 ec=37/22 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=8.869995117s) [3,1,4] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown pruub 1190.691162109s@ mbc={}] state: transitioning to Primary Oct 14 04:12:27 localhost ceph-osd[31500]: osd.0 pg_epoch: 41 pg[4.7( empty local-lis/les=0/0 n=0 ec=35/21 lis/c=35/35 les/c/f=36/36/0 sis=41) [0,2,5] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Oct 14 04:12:27 localhost ceph-osd[32440]: osd.3 pg_epoch: 41 pg[6.f( empty local-lis/les=0/0 n=0 ec=37/28 lis/c=37/37 les/c/f=38/38/0 sis=41) [3,4,5] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Oct 14 04:12:27 localhost ceph-osd[32440]: osd.3 pg_epoch: 41 pg[5.9( empty local-lis/les=37/38 n=0 ec=37/22 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=8.868842125s) [1,2,0] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active pruub 1190.690795898s@ mbc={}] start_peering_interval up [2,3,1] -> [1,2,0], acting [2,3,1] -> [1,2,0], acting_primary 2 -> 1, up_primary 2 -> 1, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Oct 14 04:12:27 localhost ceph-osd[32440]: osd.3 pg_epoch: 41 pg[5.9( empty local-lis/les=37/38 n=0 ec=37/22 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=8.868775368s) [1,2,0] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1190.690795898s@ mbc={}] state: transitioning to Stray Oct 14 04:12:27 localhost ceph-osd[32440]: osd.3 pg_epoch: 41 pg[6.b( empty local-lis/les=0/0 n=0 ec=37/28 lis/c=37/37 les/c/f=38/38/0 sis=41) [3,1,4] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Oct 14 04:12:27 localhost ceph-osd[31500]: osd.0 pg_epoch: 41 pg[4.b( empty local-lis/les=0/0 n=0 ec=35/21 lis/c=35/35 les/c/f=36/36/0 sis=41) [0,1,4] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to 
Primary Oct 14 04:12:27 localhost ceph-osd[32440]: osd.3 pg_epoch: 41 pg[5.17( empty local-lis/les=37/38 n=0 ec=37/22 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=8.868238449s) [3,2,5] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active pruub 1190.690795898s@ mbc={}] start_peering_interval up [2,3,1] -> [3,2,5], acting [2,3,1] -> [3,2,5], acting_primary 2 -> 3, up_primary 2 -> 3, role 1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015 Oct 14 04:12:27 localhost ceph-osd[32440]: osd.3 pg_epoch: 41 pg[5.17( empty local-lis/les=37/38 n=0 ec=37/22 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=8.868238449s) [3,2,5] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown pruub 1190.690795898s@ mbc={}] state: transitioning to Primary Oct 14 04:12:27 localhost ceph-osd[31500]: osd.0 pg_epoch: 41 pg[4.16( empty local-lis/les=0/0 n=0 ec=35/21 lis/c=35/35 les/c/f=36/36/0 sis=41) [0,1,2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Oct 14 04:12:27 localhost ceph-osd[32440]: osd.3 pg_epoch: 41 pg[6.14( empty local-lis/les=0/0 n=0 ec=37/28 lis/c=37/37 les/c/f=38/38/0 sis=41) [3,5,4] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Oct 14 04:12:27 localhost ceph-osd[32440]: osd.3 pg_epoch: 41 pg[6.11( empty local-lis/les=0/0 n=0 ec=37/28 lis/c=37/37 les/c/f=38/38/0 sis=41) [3,4,1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Oct 14 04:12:27 localhost ceph-osd[32440]: osd.3 pg_epoch: 41 pg[6.13( empty local-lis/les=0/0 n=0 ec=37/28 lis/c=37/37 les/c/f=38/38/0 sis=41) [3,4,5] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Oct 14 04:12:27 localhost ceph-osd[32440]: osd.3 pg_epoch: 41 pg[5.1f( empty local-lis/les=37/38 n=0 ec=37/22 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=8.865918159s) [1,2,3] r=2 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active pruub 1190.691162109s@ mbc={}] start_peering_interval up [2,3,1] -> 
[1,2,3], acting [2,3,1] -> [1,2,3], acting_primary 2 -> 1, up_primary 2 -> 1, role 1 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Oct 14 04:12:27 localhost ceph-osd[32440]: osd.3 pg_epoch: 41 pg[5.1f( empty local-lis/les=37/38 n=0 ec=37/22 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=8.865801811s) [1,2,3] r=2 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1190.691162109s@ mbc={}] state: transitioning to Stray Oct 14 04:12:27 localhost ceph-osd[32440]: osd.3 pg_epoch: 41 pg[5.10( empty local-lis/les=37/38 n=0 ec=37/22 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=8.865532875s) [5,4,0] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active pruub 1190.691284180s@ mbc={}] start_peering_interval up [2,3,1] -> [5,4,0], acting [2,3,1] -> [5,4,0], acting_primary 2 -> 5, up_primary 2 -> 5, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Oct 14 04:12:27 localhost ceph-osd[32440]: osd.3 pg_epoch: 41 pg[5.10( empty local-lis/les=37/38 n=0 ec=37/22 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=8.865479469s) [5,4,0] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1190.691284180s@ mbc={}] state: transitioning to Stray Oct 14 04:12:27 localhost ceph-osd[32440]: osd.3 pg_epoch: 41 pg[4.1e( empty local-lis/les=35/36 n=0 ec=35/21 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=14.794961929s) [0,5,2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active pruub 1196.620849609s@ mbc={}] start_peering_interval up [3,5,2] -> [0,5,2], acting [3,5,2] -> [0,5,2], acting_primary 3 -> 0, up_primary 3 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Oct 14 04:12:27 localhost ceph-osd[32440]: osd.3 pg_epoch: 41 pg[5.12( empty local-lis/les=37/38 n=0 ec=37/22 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=8.865249634s) [4,5,3] r=2 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active pruub 1190.691284180s@ mbc={}] start_peering_interval up [2,3,1] -> [4,5,3], acting [2,3,1] -> [4,5,3], acting_primary 2 -> 4, up_primary 2 
-> 4, role 1 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Oct 14 04:12:27 localhost ceph-osd[32440]: osd.3 pg_epoch: 41 pg[4.10( empty local-lis/les=35/36 n=0 ec=35/21 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=14.794714928s) [3,4,5] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active pruub 1196.620849609s@ mbc={}] start_peering_interval up [3,5,2] -> [3,4,5], acting [3,5,2] -> [3,4,5], acting_primary 3 -> 3, up_primary 3 -> 3, role 0 -> 0, features acting 4540138322906710015 upacting 4540138322906710015 Oct 14 04:12:27 localhost ceph-osd[32440]: osd.3 pg_epoch: 41 pg[5.12( empty local-lis/les=37/38 n=0 ec=37/22 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=8.865118980s) [4,5,3] r=2 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1190.691284180s@ mbc={}] state: transitioning to Stray Oct 14 04:12:27 localhost ceph-osd[32440]: osd.3 pg_epoch: 41 pg[4.10( empty local-lis/les=35/36 n=0 ec=35/21 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=14.794714928s) [3,4,5] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown pruub 1196.620849609s@ mbc={}] state: transitioning to Primary Oct 14 04:12:27 localhost ceph-osd[32440]: osd.3 pg_epoch: 41 pg[5.11( empty local-lis/les=37/38 n=0 ec=37/22 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=8.865569115s) [1,2,0] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active pruub 1190.691284180s@ mbc={}] start_peering_interval up [2,3,1] -> [1,2,0], acting [2,3,1] -> [1,2,0], acting_primary 2 -> 1, up_primary 2 -> 1, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Oct 14 04:12:27 localhost ceph-osd[32440]: osd.3 pg_epoch: 41 pg[5.13( empty local-lis/les=37/38 n=0 ec=37/22 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=8.864461899s) [4,0,1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active pruub 1190.690917969s@ mbc={}] start_peering_interval up [2,3,1] -> [4,0,1], acting [2,3,1] -> [4,0,1], acting_primary 2 -> 4, up_primary 2 -> 4, role 1 -> -1, features acting 4540138322906710015 upacting 
4540138322906710015 Oct 14 04:12:27 localhost ceph-osd[32440]: osd.3 pg_epoch: 41 pg[5.13( empty local-lis/les=37/38 n=0 ec=37/22 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=8.864388466s) [4,0,1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1190.690917969s@ mbc={}] state: transitioning to Stray Oct 14 04:12:27 localhost ceph-osd[32440]: osd.3 pg_epoch: 41 pg[4.13( empty local-lis/les=35/36 n=0 ec=35/21 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=14.794207573s) [5,2,3] r=2 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active pruub 1196.620727539s@ mbc={}] start_peering_interval up [3,5,2] -> [5,2,3], acting [3,5,2] -> [5,2,3], acting_primary 3 -> 5, up_primary 3 -> 5, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Oct 14 04:12:27 localhost ceph-osd[32440]: osd.3 pg_epoch: 41 pg[5.14( empty local-lis/les=37/38 n=0 ec=37/22 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=8.863971710s) [3,2,5] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active pruub 1190.690917969s@ mbc={}] start_peering_interval up [2,3,1] -> [3,2,5], acting [2,3,1] -> [3,2,5], acting_primary 2 -> 3, up_primary 2 -> 3, role 1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015 Oct 14 04:12:27 localhost ceph-osd[32440]: osd.3 pg_epoch: 41 pg[5.14( empty local-lis/les=37/38 n=0 ec=37/22 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=8.863971710s) [3,2,5] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown pruub 1190.690917969s@ mbc={}] state: transitioning to Primary Oct 14 04:12:27 localhost ceph-osd[32440]: osd.3 pg_epoch: 41 pg[4.13( empty local-lis/les=35/36 n=0 ec=35/21 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=14.793963432s) [5,2,3] r=2 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1196.620727539s@ mbc={}] state: transitioning to Stray Oct 14 04:12:27 localhost ceph-osd[32440]: osd.3 pg_epoch: 41 pg[4.15( empty local-lis/les=35/36 n=0 ec=35/21 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=14.792289734s) [4,3,1] r=1 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 
active pruub 1196.619750977s@ mbc={}] start_peering_interval up [3,5,2] -> [4,3,1], acting [3,5,2] -> [4,3,1], acting_primary 3 -> 4, up_primary 3 -> 4, role 0 -> 1, features acting 4540138322906710015 upacting 4540138322906710015 Oct 14 04:12:27 localhost ceph-osd[32440]: osd.3 pg_epoch: 41 pg[4.12( empty local-lis/les=35/36 n=0 ec=35/21 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=14.792537689s) [0,4,1] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active pruub 1196.619995117s@ mbc={}] start_peering_interval up [3,5,2] -> [0,4,1], acting [3,5,2] -> [0,4,1], acting_primary 3 -> 0, up_primary 3 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Oct 14 04:12:27 localhost ceph-osd[32440]: osd.3 pg_epoch: 41 pg[4.15( empty local-lis/les=35/36 n=0 ec=35/21 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=14.792184830s) [4,3,1] r=1 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1196.619750977s@ mbc={}] state: transitioning to Stray Oct 14 04:12:27 localhost ceph-osd[32440]: osd.3 pg_epoch: 41 pg[4.12( empty local-lis/les=35/36 n=0 ec=35/21 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=14.792331696s) [0,4,1] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1196.619995117s@ mbc={}] state: transitioning to Stray Oct 14 04:12:27 localhost ceph-osd[32440]: osd.3 pg_epoch: 41 pg[5.a( empty local-lis/les=37/38 n=0 ec=37/22 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=8.862846375s) [0,4,1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active pruub 1190.690429688s@ mbc={}] start_peering_interval up [2,3,1] -> [0,4,1], acting [2,3,1] -> [0,4,1], acting_primary 2 -> 0, up_primary 2 -> 0, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Oct 14 04:12:27 localhost ceph-osd[32440]: osd.3 pg_epoch: 41 pg[4.14( empty local-lis/les=35/36 n=0 ec=35/21 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=14.791593552s) [4,0,5] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active pruub 1196.619628906s@ mbc={}] start_peering_interval up 
[3,5,2] -> [4,0,5], acting [3,5,2] -> [4,0,5], acting_primary 3 -> 4, up_primary 3 -> 4, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Oct 14 04:12:27 localhost ceph-osd[32440]: osd.3 pg_epoch: 41 pg[5.a( empty local-lis/les=37/38 n=0 ec=37/22 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=8.862652779s) [0,4,1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1190.690429688s@ mbc={}] state: transitioning to Stray Oct 14 04:12:27 localhost ceph-osd[32440]: osd.3 pg_epoch: 41 pg[5.15( empty local-lis/les=37/38 n=0 ec=37/22 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=8.862999916s) [5,3,2] r=1 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active pruub 1190.690917969s@ mbc={}] start_peering_interval up [2,3,1] -> [5,3,2], acting [2,3,1] -> [5,3,2], acting_primary 2 -> 5, up_primary 2 -> 5, role 1 -> 1, features acting 4540138322906710015 upacting 4540138322906710015 Oct 14 04:12:27 localhost ceph-osd[32440]: osd.3 pg_epoch: 41 pg[5.15( empty local-lis/les=37/38 n=0 ec=37/22 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=8.862871170s) [5,3,2] r=1 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1190.690917969s@ mbc={}] state: transitioning to Stray Oct 14 04:12:27 localhost ceph-osd[32440]: osd.3 pg_epoch: 41 pg[4.14( empty local-lis/les=35/36 n=0 ec=35/21 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=14.791541100s) [4,0,5] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1196.619628906s@ mbc={}] state: transitioning to Stray Oct 14 04:12:27 localhost ceph-osd[32440]: osd.3 pg_epoch: 41 pg[5.11( empty local-lis/les=37/38 n=0 ec=37/22 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=8.863426208s) [1,2,0] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1190.691284180s@ mbc={}] state: transitioning to Stray Oct 14 04:12:27 localhost ceph-osd[31500]: osd.0 pg_epoch: 41 pg[4.12( empty local-lis/les=0/0 n=0 ec=35/21 lis/c=35/35 les/c/f=36/36/0 sis=41) [0,4,1] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: 
transitioning to Primary Oct 14 04:12:27 localhost ceph-osd[32440]: osd.3 pg_epoch: 41 pg[5.16( empty local-lis/les=37/38 n=0 ec=37/22 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=8.862985611s) [5,3,2] r=1 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active pruub 1190.691284180s@ mbc={}] start_peering_interval up [2,3,1] -> [5,3,2], acting [2,3,1] -> [5,3,2], acting_primary 2 -> 5, up_primary 2 -> 5, role 1 -> 1, features acting 4540138322906710015 upacting 4540138322906710015 Oct 14 04:12:27 localhost ceph-osd[32440]: osd.3 pg_epoch: 41 pg[5.16( empty local-lis/les=37/38 n=0 ec=37/22 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=8.862930298s) [5,3,2] r=1 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1190.691284180s@ mbc={}] state: transitioning to Stray Oct 14 04:12:27 localhost ceph-osd[32440]: osd.3 pg_epoch: 41 pg[4.17( empty local-lis/les=35/36 n=0 ec=35/21 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=14.790843964s) [3,1,2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active pruub 1196.619506836s@ mbc={}] start_peering_interval up [3,5,2] -> [3,1,2], acting [3,5,2] -> [3,1,2], acting_primary 3 -> 3, up_primary 3 -> 3, role 0 -> 0, features acting 4540138322906710015 upacting 4540138322906710015 Oct 14 04:12:27 localhost ceph-osd[32440]: osd.3 pg_epoch: 41 pg[4.16( empty local-lis/les=35/36 n=0 ec=35/21 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=14.790891647s) [0,1,2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active pruub 1196.619750977s@ mbc={}] start_peering_interval up [3,5,2] -> [0,1,2], acting [3,5,2] -> [0,1,2], acting_primary 3 -> 0, up_primary 3 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Oct 14 04:12:27 localhost ceph-osd[32440]: osd.3 pg_epoch: 41 pg[4.17( empty local-lis/les=35/36 n=0 ec=35/21 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=14.790843964s) [3,1,2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown pruub 1196.619506836s@ mbc={}] state: transitioning to Primary Oct 14 04:12:27 localhost ceph-osd[32440]: osd.3 
pg_epoch: 41 pg[4.16( empty local-lis/les=35/36 n=0 ec=35/21 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=14.790843964s) [0,1,2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1196.619750977s@ mbc={}] state: transitioning to Stray Oct 14 04:12:27 localhost ceph-osd[32440]: osd.3 pg_epoch: 41 pg[5.8( empty local-lis/les=37/38 n=0 ec=37/22 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=8.862078667s) [2,0,1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active pruub 1190.691162109s@ mbc={}] start_peering_interval up [2,3,1] -> [2,0,1], acting [2,3,1] -> [2,0,1], acting_primary 2 -> 2, up_primary 2 -> 2, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Oct 14 04:12:27 localhost ceph-osd[32440]: osd.3 pg_epoch: 41 pg[5.8( empty local-lis/les=37/38 n=0 ec=37/22 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=8.862035751s) [2,0,1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1190.691162109s@ mbc={}] state: transitioning to Stray Oct 14 04:12:27 localhost ceph-osd[32440]: osd.3 pg_epoch: 41 pg[5.19( empty local-lis/les=37/38 n=0 ec=37/22 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=8.857095718s) [0,5,2] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active pruub 1190.686401367s@ mbc={}] start_peering_interval up [2,3,1] -> [0,5,2], acting [2,3,1] -> [0,5,2], acting_primary 2 -> 0, up_primary 2 -> 0, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Oct 14 04:12:27 localhost ceph-osd[32440]: osd.3 pg_epoch: 41 pg[5.19( empty local-lis/les=37/38 n=0 ec=37/22 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=8.857045174s) [0,5,2] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1190.686401367s@ mbc={}] state: transitioning to Stray Oct 14 04:12:27 localhost ceph-osd[32440]: osd.3 pg_epoch: 41 pg[5.5( empty local-lis/les=37/38 n=0 ec=37/22 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=8.860488892s) [0,1,4] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active pruub 1190.689941406s@ mbc={}] start_peering_interval 
up [2,3,1] -> [0,1,4], acting [2,3,1] -> [0,1,4], acting_primary 2 -> 0, up_primary 2 -> 0, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Oct 14 04:12:27 localhost ceph-osd[32440]: osd.3 pg_epoch: 41 pg[5.5( empty local-lis/les=37/38 n=0 ec=37/22 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=8.860453606s) [0,1,4] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1190.689941406s@ mbc={}] state: transitioning to Stray Oct 14 04:12:27 localhost ceph-osd[32440]: osd.3 pg_epoch: 41 pg[4.1e( empty local-lis/les=35/36 n=0 ec=35/21 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=14.792387009s) [0,5,2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1196.620849609s@ mbc={}] state: transitioning to Stray Oct 14 04:12:27 localhost ceph-osd[32440]: osd.3 pg_epoch: 41 pg[5.1d( empty local-lis/les=37/38 n=0 ec=37/22 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=8.856534004s) [3,1,4] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active pruub 1190.686401367s@ mbc={}] start_peering_interval up [2,3,1] -> [3,1,4], acting [2,3,1] -> [3,1,4], acting_primary 2 -> 3, up_primary 2 -> 3, role 1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015 Oct 14 04:12:27 localhost ceph-osd[32440]: osd.3 pg_epoch: 41 pg[5.1d( empty local-lis/les=37/38 n=0 ec=37/22 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=8.856534004s) [3,1,4] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown pruub 1190.686401367s@ mbc={}] state: transitioning to Primary Oct 14 04:12:27 localhost ceph-osd[32440]: osd.3 pg_epoch: 41 pg[5.3( empty local-lis/les=37/38 n=0 ec=37/22 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=8.860051155s) [0,1,2] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active pruub 1190.690063477s@ mbc={}] start_peering_interval up [2,3,1] -> [0,1,2], acting [2,3,1] -> [0,1,2], acting_primary 2 -> 0, up_primary 2 -> 0, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Oct 14 04:12:27 localhost ceph-osd[32440]: osd.3 pg_epoch: 41 
pg[5.3( empty local-lis/les=37/38 n=0 ec=37/22 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=8.859985352s) [0,1,2] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1190.690063477s@ mbc={}] state: transitioning to Stray Oct 14 04:12:27 localhost ceph-osd[32440]: osd.3 pg_epoch: 41 pg[5.1b( empty local-lis/les=37/38 n=0 ec=37/22 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=8.861444473s) [1,0,4] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active pruub 1190.691528320s@ mbc={}] start_peering_interval up [2,3,1] -> [1,0,4], acting [2,3,1] -> [1,0,4], acting_primary 2 -> 1, up_primary 2 -> 1, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Oct 14 04:12:27 localhost ceph-osd[32440]: osd.3 pg_epoch: 41 pg[4.1b( empty local-lis/les=35/36 n=0 ec=35/21 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=14.791361809s) [1,3,2] r=1 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active pruub 1196.621704102s@ mbc={}] start_peering_interval up [3,5,2] -> [1,3,2], acting [3,5,2] -> [1,3,2], acting_primary 3 -> 1, up_primary 3 -> 1, role 0 -> 1, features acting 4540138322906710015 upacting 4540138322906710015 Oct 14 04:12:27 localhost ceph-osd[32440]: osd.3 pg_epoch: 41 pg[4.1b( empty local-lis/les=35/36 n=0 ec=35/21 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=14.791291237s) [1,3,2] r=1 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1196.621704102s@ mbc={}] state: transitioning to Stray Oct 14 04:12:27 localhost ceph-osd[32440]: osd.3 pg_epoch: 41 pg[5.1b( empty local-lis/les=37/38 n=0 ec=37/22 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=8.861214638s) [1,0,4] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1190.691528320s@ mbc={}] state: transitioning to Stray Oct 14 04:12:27 localhost ceph-osd[32440]: osd.3 pg_epoch: 41 pg[4.6( empty local-lis/les=35/36 n=0 ec=35/21 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=14.787593842s) [2,3,1] r=1 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active pruub 1196.619262695s@ mbc={}] start_peering_interval up [3,5,2] -> 
[2,3,1], acting [3,5,2] -> [2,3,1], acting_primary 3 -> 2, up_primary 3 -> 2, role 0 -> 1, features acting 4540138322906710015 upacting 4540138322906710015 Oct 14 04:12:27 localhost ceph-osd[32440]: osd.3 pg_epoch: 41 pg[4.9( empty local-lis/les=35/36 n=0 ec=35/21 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=14.787835121s) [4,0,1] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active pruub 1196.619506836s@ mbc={}] start_peering_interval up [3,5,2] -> [4,0,1], acting [3,5,2] -> [4,0,1], acting_primary 3 -> 4, up_primary 3 -> 4, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Oct 14 04:12:27 localhost ceph-osd[32440]: osd.3 pg_epoch: 41 pg[4.8( empty local-lis/les=35/36 n=0 ec=35/21 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=14.787425995s) [4,5,3] r=2 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active pruub 1196.619262695s@ mbc={}] start_peering_interval up [3,5,2] -> [4,5,3], acting [3,5,2] -> [4,5,3], acting_primary 3 -> 4, up_primary 3 -> 4, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Oct 14 04:12:27 localhost ceph-osd[32440]: osd.3 pg_epoch: 41 pg[4.6( empty local-lis/les=35/36 n=0 ec=35/21 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=14.787384987s) [2,3,1] r=1 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1196.619262695s@ mbc={}] state: transitioning to Stray Oct 14 04:12:27 localhost ceph-osd[32440]: osd.3 pg_epoch: 41 pg[4.8( empty local-lis/les=35/36 n=0 ec=35/21 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=14.787371635s) [4,5,3] r=2 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1196.619262695s@ mbc={}] state: transitioning to Stray Oct 14 04:12:27 localhost ceph-osd[32440]: osd.3 pg_epoch: 41 pg[4.b( empty local-lis/les=35/36 n=0 ec=35/21 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=14.789340973s) [0,1,4] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active pruub 1196.621337891s@ mbc={}] start_peering_interval up [3,5,2] -> [0,1,4], acting [3,5,2] -> [0,1,4], acting_primary 3 -> 0, up_primary 3 -> 
0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Oct 14 04:12:27 localhost ceph-osd[32440]: osd.3 pg_epoch: 41 pg[4.3( empty local-lis/les=35/36 n=0 ec=35/21 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=14.789671898s) [2,5,3] r=2 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active pruub 1196.621459961s@ mbc={}] start_peering_interval up [3,5,2] -> [2,5,3], acting [3,5,2] -> [2,5,3], acting_primary 3 -> 2, up_primary 3 -> 2, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Oct 14 04:12:27 localhost ceph-osd[32440]: osd.3 pg_epoch: 41 pg[4.a( empty local-lis/les=35/36 n=0 ec=35/21 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=14.787229538s) [1,0,2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active pruub 1196.619506836s@ mbc={}] start_peering_interval up [3,5,2] -> [1,0,2], acting [3,5,2] -> [1,0,2], acting_primary 3 -> 1, up_primary 3 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Oct 14 04:12:27 localhost ceph-osd[32440]: osd.3 pg_epoch: 41 pg[4.a( empty local-lis/les=35/36 n=0 ec=35/21 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=14.787098885s) [1,0,2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1196.619506836s@ mbc={}] state: transitioning to Stray Oct 14 04:12:27 localhost ceph-osd[32440]: osd.3 pg_epoch: 41 pg[4.b( empty local-lis/les=35/36 n=0 ec=35/21 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=14.789215088s) [0,1,4] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1196.621337891s@ mbc={}] state: transitioning to Stray Oct 14 04:12:27 localhost ceph-osd[32440]: osd.3 pg_epoch: 41 pg[5.b( empty local-lis/les=37/38 n=0 ec=37/22 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=8.857454300s) [2,0,5] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active pruub 1190.690307617s@ mbc={}] start_peering_interval up [2,3,1] -> [2,0,5], acting [2,3,1] -> [2,0,5], acting_primary 2 -> 2, up_primary 2 -> 2, role 1 -> -1, features acting 4540138322906710015 upacting 
4540138322906710015 Oct 14 04:12:27 localhost ceph-osd[32440]: osd.3 pg_epoch: 41 pg[5.b( empty local-lis/les=37/38 n=0 ec=37/22 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=8.857383728s) [2,0,5] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1190.690307617s@ mbc={}] state: transitioning to Stray Oct 14 04:12:27 localhost ceph-osd[32440]: osd.3 pg_epoch: 41 pg[4.d( empty local-lis/les=35/36 n=0 ec=35/21 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=14.788196564s) [1,4,3] r=2 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active pruub 1196.621093750s@ mbc={}] start_peering_interval up [3,5,2] -> [1,4,3], acting [3,5,2] -> [1,4,3], acting_primary 3 -> 1, up_primary 3 -> 1, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Oct 14 04:12:27 localhost ceph-osd[32440]: osd.3 pg_epoch: 41 pg[4.9( empty local-lis/les=35/36 n=0 ec=35/21 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=14.786919594s) [4,0,1] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1196.619506836s@ mbc={}] state: transitioning to Stray Oct 14 04:12:27 localhost ceph-osd[32440]: osd.3 pg_epoch: 41 pg[4.d( empty local-lis/les=35/36 n=0 ec=35/21 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=14.788055420s) [1,4,3] r=2 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1196.621093750s@ mbc={}] state: transitioning to Stray Oct 14 04:12:27 localhost ceph-osd[32440]: osd.3 pg_epoch: 41 pg[5.d( empty local-lis/les=37/38 n=0 ec=37/22 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=8.857270241s) [2,5,0] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active pruub 1190.690307617s@ mbc={}] start_peering_interval up [2,3,1] -> [2,5,0], acting [2,3,1] -> [2,5,0], acting_primary 2 -> 2, up_primary 2 -> 2, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Oct 14 04:12:27 localhost ceph-osd[32440]: osd.3 pg_epoch: 41 pg[4.3( empty local-lis/les=35/36 n=0 ec=35/21 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=14.789155006s) [2,5,3] r=2 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 
0'0 unknown NOTIFY pruub 1196.621459961s@ mbc={}] state: transitioning to Stray Oct 14 04:12:27 localhost ceph-osd[32440]: osd.3 pg_epoch: 41 pg[5.e( empty local-lis/les=37/38 n=0 ec=37/22 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=8.856881142s) [4,0,1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active pruub 1190.690185547s@ mbc={}] start_peering_interval up [2,3,1] -> [4,0,1], acting [2,3,1] -> [4,0,1], acting_primary 2 -> 4, up_primary 2 -> 4, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Oct 14 04:12:27 localhost ceph-osd[32440]: osd.3 pg_epoch: 41 pg[5.d( empty local-lis/les=37/38 n=0 ec=37/22 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=8.857108116s) [2,5,0] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1190.690307617s@ mbc={}] state: transitioning to Stray Oct 14 04:12:27 localhost ceph-osd[32440]: osd.3 pg_epoch: 41 pg[5.e( empty local-lis/les=37/38 n=0 ec=37/22 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=8.856831551s) [4,0,1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1190.690185547s@ mbc={}] state: transitioning to Stray Oct 14 04:12:27 localhost ceph-osd[32440]: osd.3 pg_epoch: 41 pg[4.f( empty local-lis/les=35/36 n=0 ec=35/21 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=14.785413742s) [3,2,5] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active pruub 1196.619018555s@ mbc={}] start_peering_interval up [3,5,2] -> [3,2,5], acting [3,5,2] -> [3,2,5], acting_primary 3 -> 3, up_primary 3 -> 3, role 0 -> 0, features acting 4540138322906710015 upacting 4540138322906710015 Oct 14 04:12:27 localhost ceph-osd[32440]: osd.3 pg_epoch: 41 pg[5.4( empty local-lis/les=37/38 n=0 ec=37/22 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=8.857278824s) [4,3,5] r=1 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active pruub 1190.690917969s@ mbc={}] start_peering_interval up [2,3,1] -> [4,3,5], acting [2,3,1] -> [4,3,5], acting_primary 2 -> 4, up_primary 2 -> 4, role 1 -> 1, features acting 4540138322906710015 upacting 4540138322906710015 
Oct 14 04:12:27 localhost ceph-osd[32440]: osd.3 pg_epoch: 41 pg[5.4( empty local-lis/les=37/38 n=0 ec=37/22 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=8.857237816s) [4,3,5] r=1 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1190.690917969s@ mbc={}] state: transitioning to Stray Oct 14 04:12:27 localhost ceph-osd[32440]: osd.3 pg_epoch: 41 pg[4.f( empty local-lis/les=35/36 n=0 ec=35/21 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=14.785413742s) [3,2,5] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown pruub 1196.619018555s@ mbc={}] state: transitioning to Primary Oct 14 04:12:27 localhost ceph-osd[32440]: osd.3 pg_epoch: 41 pg[4.c( empty local-lis/les=35/36 n=0 ec=35/21 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=14.787257195s) [5,3,4] r=1 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active pruub 1196.621337891s@ mbc={}] start_peering_interval up [3,5,2] -> [5,3,4], acting [3,5,2] -> [5,3,4], acting_primary 3 -> 5, up_primary 3 -> 5, role 0 -> 1, features acting 4540138322906710015 upacting 4540138322906710015 Oct 14 04:12:27 localhost ceph-osd[32440]: osd.3 pg_epoch: 41 pg[4.c( empty local-lis/les=35/36 n=0 ec=35/21 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=14.787203789s) [5,3,4] r=1 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1196.621337891s@ mbc={}] state: transitioning to Stray Oct 14 04:12:27 localhost ceph-osd[32440]: osd.3 pg_epoch: 41 pg[5.7( empty local-lis/les=37/38 n=0 ec=37/22 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=8.855983734s) [5,3,4] r=1 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active pruub 1190.690429688s@ mbc={}] start_peering_interval up [2,3,1] -> [5,3,4], acting [2,3,1] -> [5,3,4], acting_primary 2 -> 5, up_primary 2 -> 5, role 1 -> 1, features acting 4540138322906710015 upacting 4540138322906710015 Oct 14 04:12:27 localhost ceph-osd[32440]: osd.3 pg_epoch: 41 pg[5.7( empty local-lis/les=37/38 n=0 ec=37/22 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=8.855928421s) [5,3,4] r=1 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 
1190.690429688s@ mbc={}] state: transitioning to Stray Oct 14 04:12:27 localhost ceph-osd[32440]: osd.3 pg_epoch: 41 pg[4.5( empty local-lis/les=35/36 n=0 ec=35/21 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=14.784215927s) [1,2,0] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active pruub 1196.619384766s@ mbc={}] start_peering_interval up [3,5,2] -> [1,2,0], acting [3,5,2] -> [1,2,0], acting_primary 3 -> 1, up_primary 3 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Oct 14 04:12:27 localhost ceph-osd[32440]: osd.3 pg_epoch: 41 pg[4.1( empty local-lis/les=35/36 n=0 ec=35/21 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=14.786221504s) [4,1,0] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active pruub 1196.621459961s@ mbc={}] start_peering_interval up [3,5,2] -> [4,1,0], acting [3,5,2] -> [4,1,0], acting_primary 3 -> 4, up_primary 3 -> 4, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Oct 14 04:12:27 localhost ceph-osd[32440]: osd.3 pg_epoch: 41 pg[4.5( empty local-lis/les=35/36 n=0 ec=35/21 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=14.784075737s) [1,2,0] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1196.619384766s@ mbc={}] state: transitioning to Stray Oct 14 04:12:27 localhost ceph-osd[32440]: osd.3 pg_epoch: 41 pg[5.2( empty local-lis/les=37/38 n=0 ec=37/22 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=8.855365753s) [5,0,2] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active pruub 1190.690673828s@ mbc={}] start_peering_interval up [2,3,1] -> [5,0,2], acting [2,3,1] -> [5,0,2], acting_primary 2 -> 5, up_primary 2 -> 5, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Oct 14 04:12:27 localhost ceph-osd[32440]: osd.3 pg_epoch: 41 pg[5.2( empty local-lis/les=37/38 n=0 ec=37/22 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=8.855234146s) [5,0,2] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1190.690673828s@ mbc={}] state: transitioning to Stray Oct 14 04:12:27 
localhost ceph-osd[32440]: osd.3 pg_epoch: 41 pg[4.7( empty local-lis/les=35/36 n=0 ec=35/21 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=14.786059380s) [0,2,5] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active pruub 1196.621704102s@ mbc={}] start_peering_interval up [3,5,2] -> [0,2,5], acting [3,5,2] -> [0,2,5], acting_primary 3 -> 0, up_primary 3 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Oct 14 04:12:27 localhost ceph-osd[32440]: osd.3 pg_epoch: 41 pg[4.7( empty local-lis/les=35/36 n=0 ec=35/21 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=14.786004066s) [0,2,5] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1196.621704102s@ mbc={}] state: transitioning to Stray Oct 14 04:12:27 localhost ceph-osd[32440]: osd.3 pg_epoch: 41 pg[4.4( empty local-lis/les=35/36 n=0 ec=35/21 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=14.786011696s) [0,4,1] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active pruub 1196.621826172s@ mbc={}] start_peering_interval up [3,5,2] -> [0,4,1], acting [3,5,2] -> [0,4,1], acting_primary 3 -> 0, up_primary 3 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Oct 14 04:12:27 localhost ceph-osd[32440]: osd.3 pg_epoch: 41 pg[4.4( empty local-lis/les=35/36 n=0 ec=35/21 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=14.785951614s) [0,4,1] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1196.621826172s@ mbc={}] state: transitioning to Stray Oct 14 04:12:27 localhost ceph-osd[32440]: osd.3 pg_epoch: 41 pg[4.1( empty local-lis/les=35/36 n=0 ec=35/21 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=14.786148071s) [4,1,0] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1196.621459961s@ mbc={}] state: transitioning to Stray Oct 14 04:12:27 localhost ceph-osd[32440]: osd.3 pg_epoch: 41 pg[5.1e( empty local-lis/les=37/38 n=0 ec=37/22 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=8.850193024s) [0,5,4] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active pruub 
1190.686401367s@ mbc={}] start_peering_interval up [2,3,1] -> [0,5,4], acting [2,3,1] -> [0,5,4], acting_primary 2 -> 0, up_primary 2 -> 0, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Oct 14 04:12:27 localhost ceph-osd[32440]: osd.3 pg_epoch: 41 pg[5.1e( empty local-lis/les=37/38 n=0 ec=37/22 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=8.850145340s) [0,5,4] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1190.686401367s@ mbc={}] state: transitioning to Stray Oct 14 04:12:27 localhost ceph-osd[32440]: osd.3 pg_epoch: 41 pg[4.2( empty local-lis/les=35/36 n=0 ec=35/21 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=14.784661293s) [4,5,3] r=2 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active pruub 1196.621215820s@ mbc={}] start_peering_interval up [3,5,2] -> [4,5,3], acting [3,5,2] -> [4,5,3], acting_primary 3 -> 4, up_primary 3 -> 4, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Oct 14 04:12:27 localhost ceph-osd[32440]: osd.3 pg_epoch: 41 pg[4.e( empty local-lis/les=35/36 n=0 ec=35/21 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=14.785603523s) [1,4,0] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active pruub 1196.621704102s@ mbc={}] start_peering_interval up [3,5,2] -> [1,4,0], acting [3,5,2] -> [1,4,0], acting_primary 3 -> 1, up_primary 3 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Oct 14 04:12:27 localhost ceph-osd[32440]: osd.3 pg_epoch: 41 pg[4.2( empty local-lis/les=35/36 n=0 ec=35/21 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=14.784621239s) [4,5,3] r=2 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1196.621215820s@ mbc={}] state: transitioning to Stray Oct 14 04:12:27 localhost ceph-osd[32440]: osd.3 pg_epoch: 41 pg[4.e( empty local-lis/les=35/36 n=0 ec=35/21 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=14.785085678s) [1,4,0] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1196.621704102s@ mbc={}] state: transitioning to Stray Oct 14 
04:12:27 localhost ceph-osd[32440]: osd.3 pg_epoch: 41 pg[5.f( empty local-lis/les=37/38 n=0 ec=37/22 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=8.849960327s) [5,2,3] r=2 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active pruub 1190.686767578s@ mbc={}] start_peering_interval up [2,3,1] -> [5,2,3], acting [2,3,1] -> [5,2,3], acting_primary 2 -> 5, up_primary 2 -> 5, role 1 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Oct 14 04:12:27 localhost ceph-osd[32440]: osd.3 pg_epoch: 41 pg[5.1c( empty local-lis/les=37/38 n=0 ec=37/22 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=8.853155136s) [5,2,0] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active pruub 1190.690063477s@ mbc={}] start_peering_interval up [2,3,1] -> [5,2,0], acting [2,3,1] -> [5,2,0], acting_primary 2 -> 5, up_primary 2 -> 5, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Oct 14 04:12:27 localhost ceph-osd[32440]: osd.3 pg_epoch: 41 pg[5.f( empty local-lis/les=37/38 n=0 ec=37/22 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=8.849905014s) [5,2,3] r=2 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1190.686767578s@ mbc={}] state: transitioning to Stray
Oct 14 04:12:27 localhost ceph-osd[32440]: osd.3 pg_epoch: 41 pg[5.1c( empty local-lis/les=37/38 n=0 ec=37/22 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=8.853117943s) [5,2,0] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1190.690063477s@ mbc={}] state: transitioning to Stray
Oct 14 04:12:27 localhost ceph-osd[32440]: osd.3 pg_epoch: 41 pg[4.1a( empty local-lis/les=35/36 n=0 ec=35/21 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=14.784959793s) [5,3,4] r=1 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active pruub 1196.622070312s@ mbc={}] start_peering_interval up [3,5,2] -> [5,3,4], acting [3,5,2] -> [5,3,4], acting_primary 3 -> 5, up_primary 3 -> 5, role 0 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Oct 14 04:12:27 localhost ceph-osd[32440]: osd.3 pg_epoch: 41 pg[4.18( empty local-lis/les=35/36 n=0 ec=35/21 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=14.785409927s) [5,3,2] r=1 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active pruub 1196.622558594s@ mbc={}] start_peering_interval up [3,5,2] -> [5,3,2], acting [3,5,2] -> [5,3,2], acting_primary 3 -> 5, up_primary 3 -> 5, role 0 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Oct 14 04:12:27 localhost ceph-osd[32440]: osd.3 pg_epoch: 41 pg[4.18( empty local-lis/les=35/36 n=0 ec=35/21 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=14.785360336s) [5,3,2] r=1 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1196.622558594s@ mbc={}] state: transitioning to Stray
Oct 14 04:12:27 localhost ceph-osd[32440]: osd.3 pg_epoch: 41 pg[4.1f( empty local-lis/les=35/36 n=0 ec=35/21 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=14.784052849s) [4,5,3] r=2 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active pruub 1196.621704102s@ mbc={}] start_peering_interval up [3,5,2] -> [4,5,3], acting [3,5,2] -> [4,5,3], acting_primary 3 -> 4, up_primary 3 -> 4, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Oct 14 04:12:27 localhost ceph-osd[32440]: osd.3 pg_epoch: 41 pg[4.1f( empty local-lis/les=35/36 n=0 ec=35/21 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=14.783980370s) [4,5,3] r=2 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1196.621704102s@ mbc={}] state: transitioning to Stray
Oct 14 04:12:27 localhost ceph-osd[32440]: osd.3 pg_epoch: 41 pg[4.1a( empty local-lis/les=35/36 n=0 ec=35/21 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=14.784212112s) [5,3,4] r=1 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1196.622070312s@ mbc={}] state: transitioning to Stray
Oct 14 04:12:27 localhost ceph-osd[32440]: log_channel(cluster) log [DBG] : 2.e scrub starts
Oct 14 04:12:27 localhost ceph-osd[32440]: log_channel(cluster) log [DBG] : 2.e scrub ok
Oct 14 04:12:27 localhost python3[56526]: ansible-file Invoked with path=/var/lib/tripleo-config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 04:12:28 localhost ceph-osd[31500]: osd.0 pg_epoch: 41 pg[5.9( empty local-lis/les=0/0 n=0 ec=37/22 lis/c=37/37 les/c/f=38/38/0 sis=41) [1,2,0] r=2 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray
Oct 14 04:12:28 localhost ceph-osd[31500]: osd.0 pg_epoch: 41 pg[4.5( empty local-lis/les=0/0 n=0 ec=35/21 lis/c=35/35 les/c/f=36/36/0 sis=41) [1,2,0] r=2 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray
Oct 14 04:12:28 localhost ceph-osd[31500]: osd.0 pg_epoch: 41 pg[4.a( empty local-lis/les=0/0 n=0 ec=35/21 lis/c=35/35 les/c/f=36/36/0 sis=41) [1,0,2] r=1 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray
Oct 14 04:12:28 localhost ceph-osd[31500]: osd.0 pg_epoch: 41 pg[4.e( empty local-lis/les=0/0 n=0 ec=35/21 lis/c=35/35 les/c/f=36/36/0 sis=41) [1,4,0] r=2 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray
Oct 14 04:12:28 localhost ceph-osd[32440]: osd.3 pg_epoch: 41 pg[3.16( empty local-lis/les=0/0 n=0 ec=35/20 lis/c=35/35 les/c/f=36/36/0 sis=41) [1,3,4] r=1 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray
Oct 14 04:12:28 localhost ceph-osd[31500]: osd.0 pg_epoch: 41 pg[5.b( empty local-lis/les=0/0 n=0 ec=37/22 lis/c=37/37 les/c/f=38/38/0 sis=41) [2,0,5] r=1 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray
Oct 14 04:12:28 localhost ceph-osd[32440]: osd.3 pg_epoch: 41 pg[3.13( empty local-lis/les=0/0 n=0 ec=35/20 lis/c=35/35 les/c/f=36/36/0 sis=41) [1,3,2] r=1 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray
Oct 14 04:12:28 localhost ceph-osd[31500]: osd.0 pg_epoch: 41 pg[5.8( empty local-lis/les=0/0 n=0 ec=37/22 lis/c=37/37 les/c/f=38/38/0 sis=41) [2,0,1] r=1 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray
Oct 14 04:12:28 localhost ceph-osd[31500]: osd.0 pg_epoch: 41 pg[5.11( empty local-lis/les=0/0 n=0 ec=37/22 lis/c=37/37 les/c/f=38/38/0 sis=41) [1,2,0] r=2 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray
Oct 14 04:12:28 localhost ceph-osd[31500]: osd.0 pg_epoch: 41 pg[5.d( empty local-lis/les=0/0 n=0 ec=37/22 lis/c=37/37 les/c/f=38/38/0 sis=41) [2,5,0] r=2 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray
Oct 14 04:12:28 localhost ceph-osd[31500]: osd.0 pg_epoch: 41 pg[5.1b( empty local-lis/les=0/0 n=0 ec=37/22 lis/c=37/37 les/c/f=38/38/0 sis=41) [1,0,4] r=1 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray
Oct 14 04:12:28 localhost ceph-osd[32440]: osd.3 pg_epoch: 41 pg[6.d( empty local-lis/les=0/0 n=0 ec=37/28 lis/c=37/37 les/c/f=38/38/0 sis=41) [1,3,2] r=1 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray
Oct 14 04:12:28 localhost ceph-osd[32440]: osd.3 pg_epoch: 41 pg[6.2( empty local-lis/les=0/0 n=0 ec=37/28 lis/c=37/37 les/c/f=38/38/0 sis=41) [1,3,4] r=1 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray
Oct 14 04:12:28 localhost ceph-osd[32440]: osd.3 pg_epoch: 41 pg[3.10( empty local-lis/les=0/0 n=0 ec=35/20 lis/c=35/35 les/c/f=36/36/0 sis=41) [1,4,3] r=2 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray
Oct 14 04:12:28 localhost ceph-osd[31500]: osd.0 pg_epoch: 41 pg[5.10( empty local-lis/les=0/0 n=0 ec=37/22 lis/c=37/37 les/c/f=38/38/0 sis=41) [5,4,0] r=2 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray
Oct 14 04:12:28 localhost ceph-osd[32440]: osd.3 pg_epoch: 41 pg[6.1( empty local-lis/les=0/0 n=0 ec=37/28 lis/c=37/37 les/c/f=38/38/0 sis=41) [2,1,3] r=2 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray
Oct 14 04:12:28 localhost ceph-osd[32440]: osd.3 pg_epoch: 41 pg[3.1d( empty local-lis/les=0/0 n=0 ec=35/20 lis/c=35/35 les/c/f=36/36/0 sis=41) [2,5,3] r=2 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray
Oct 14 04:12:28 localhost ceph-osd[31500]: osd.0 pg_epoch: 41 pg[5.2( empty local-lis/les=0/0 n=0 ec=37/22 lis/c=37/37 les/c/f=38/38/0 sis=41) [5,0,2] r=1 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray
Oct 14 04:12:28 localhost ceph-osd[31500]: osd.0 pg_epoch: 41 pg[5.1c( empty local-lis/les=0/0 n=0 ec=37/22 lis/c=37/37 les/c/f=38/38/0 sis=41) [5,2,0] r=2 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray
Oct 14 04:12:28 localhost ceph-osd[32440]: osd.3 pg_epoch: 41 pg[3.c( empty local-lis/les=0/0 n=0 ec=35/20 lis/c=35/35 les/c/f=36/36/0 sis=41) [5,3,4] r=1 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray
Oct 14 04:12:28 localhost ceph-osd[32440]: osd.3 pg_epoch: 41 pg[3.d( empty local-lis/les=0/0 n=0 ec=35/20 lis/c=35/35 les/c/f=36/36/0 sis=41) [5,2,3] r=2 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray
Oct 14 04:12:28 localhost ceph-osd[32440]: osd.3 pg_epoch: 41 pg[6.8( empty local-lis/les=0/0 n=0 ec=37/28 lis/c=37/37 les/c/f=38/38/0 sis=41) [5,2,3] r=2 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray
Oct 14 04:12:28 localhost ceph-osd[32440]: osd.3 pg_epoch: 41 pg[3.a( empty local-lis/les=0/0 n=0 ec=35/20 lis/c=35/35 les/c/f=36/36/0 sis=41) [5,3,2] r=1 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray
Oct 14 04:12:28 localhost ceph-osd[32440]: osd.3 pg_epoch: 41 pg[6.e( empty local-lis/les=0/0 n=0 ec=37/28 lis/c=37/37 les/c/f=38/38/0 sis=41) [5,3,2] r=1 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray
Oct 14 04:12:28 localhost ceph-osd[32440]: osd.3 pg_epoch: 41 pg[6.7( empty local-lis/les=0/0 n=0 ec=37/28 lis/c=37/37 les/c/f=38/38/0 sis=41) [5,3,4] r=1 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray
Oct 14 04:12:28 localhost ceph-osd[32440]: osd.3 pg_epoch: 41 pg[3.5( empty local-lis/les=0/0 n=0 ec=35/20 lis/c=35/35 les/c/f=36/36/0 sis=41) [5,3,4] r=1 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray
Oct 14 04:12:28 localhost ceph-osd[32440]: osd.3 pg_epoch: 41 pg[3.1c( empty local-lis/les=0/0 n=0 ec=35/20 lis/c=35/35 les/c/f=36/36/0 sis=41) [5,3,2] r=1 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray
Oct 14 04:12:28 localhost ceph-osd[32440]: osd.3 pg_epoch: 41 pg[6.19( empty local-lis/les=0/0 n=0 ec=37/28 lis/c=37/37 les/c/f=38/38/0 sis=41) [5,3,4] r=1 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray
Oct 14 04:12:28 localhost ceph-osd[32440]: osd.3 pg_epoch: 41 pg[3.1a( empty local-lis/les=0/0 n=0 ec=35/20 lis/c=35/35 les/c/f=36/36/0 sis=41) [4,3,1] r=1 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray
Oct 14 04:12:28 localhost ceph-osd[31500]: osd.0 pg_epoch: 42 pg[4.16( empty local-lis/les=41/42 n=0 ec=35/21 lis/c=35/35 les/c/f=36/36/0 sis=41) [0,1,2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Oct 14 04:12:28 localhost ceph-osd[31500]: osd.0 pg_epoch: 42 pg[5.3( empty local-lis/les=41/42 n=0 ec=37/22 lis/c=37/37 les/c/f=38/38/0 sis=41) [0,1,2] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Oct 14 04:12:28 localhost ceph-osd[31500]: osd.0 pg_epoch: 42 pg[3.19( empty local-lis/les=41/42 n=0 ec=35/20 lis/c=35/35 les/c/f=36/36/0 sis=41) [0,1,2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Oct 14 04:12:28 localhost ceph-osd[32440]: osd.3 pg_epoch: 41 pg[6.1c( empty local-lis/les=0/0 n=0 ec=37/28 lis/c=37/37 les/c/f=38/38/0 sis=41) [4,3,5] r=1 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray
Oct 14 04:12:28 localhost ceph-osd[32440]: osd.3 pg_epoch: 41 pg[6.1e( empty local-lis/les=0/0 n=0 ec=37/28 lis/c=37/37 les/c/f=38/38/0 sis=41) [4,5,3] r=2 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray
Oct 14 04:12:28 localhost ceph-osd[31500]: osd.0 pg_epoch: 42 pg[4.1e( empty local-lis/les=41/42 n=0 ec=35/21 lis/c=35/35 les/c/f=36/36/0 sis=41) [0,5,2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Oct 14 04:12:28 localhost ceph-osd[31500]: osd.0 pg_epoch: 42 pg[6.10( empty local-lis/les=41/42 n=0 ec=37/28 lis/c=37/37 les/c/f=38/38/0 sis=41) [0,2,5] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Oct 14 04:12:28 localhost ceph-osd[31500]: osd.0 pg_epoch: 42 pg[6.9( empty local-lis/les=41/42 n=0 ec=37/28 lis/c=37/37 les/c/f=38/38/0 sis=41) [0,2,5] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Oct 14 04:12:28 localhost ceph-osd[31500]: osd.0 pg_epoch: 42 pg[3.1( empty local-lis/les=41/42 n=0 ec=35/20 lis/c=35/35 les/c/f=36/36/0 sis=41) [0,5,2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Oct 14 04:12:28 localhost ceph-osd[31500]: osd.0 pg_epoch: 42 pg[4.7( empty local-lis/les=41/42 n=0 ec=35/21 lis/c=35/35 les/c/f=36/36/0 sis=41) [0,2,5] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Oct 14 04:12:28 localhost ceph-osd[32440]: osd.3 pg_epoch: 41 pg[3.1b( empty local-lis/les=0/0 n=0 ec=35/20 lis/c=35/35 les/c/f=36/36/0 sis=41) [4,5,3] r=2 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray
Oct 14 04:12:28 localhost ceph-osd[32440]: osd.3 pg_epoch: 42 pg[3.7( empty local-lis/les=41/42 n=0 ec=35/20 lis/c=35/35 les/c/f=36/36/0 sis=41) [3,2,1] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Oct 14 04:12:28 localhost ceph-osd[31500]: osd.0 pg_epoch: 42 pg[5.19( empty local-lis/les=41/42 n=0 ec=37/22 lis/c=37/37 les/c/f=38/38/0 sis=41) [0,5,2] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Oct 14 04:12:28 localhost ceph-osd[32440]: osd.3 pg_epoch: 41 pg[3.9( empty local-lis/les=0/0 n=0 ec=35/20 lis/c=35/35 les/c/f=36/36/0 sis=41) [4,1,3] r=2 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray
Oct 14 04:12:28 localhost ceph-osd[32440]: osd.3 pg_epoch: 42 pg[3.4( empty local-lis/les=41/42 n=0 ec=35/20 lis/c=35/35 les/c/f=36/36/0 sis=41) [3,2,1] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Oct 14 04:12:28 localhost ceph-osd[32440]: osd.3 pg_epoch: 42 pg[5.17( empty local-lis/les=41/42 n=0 ec=37/22 lis/c=37/37 les/c/f=38/38/0 sis=41) [3,2,5] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Oct 14 04:12:28 localhost ceph-osd[32440]: osd.3 pg_epoch: 42 pg[5.14( empty local-lis/les=41/42 n=0 ec=37/22 lis/c=37/37 les/c/f=38/38/0 sis=41) [3,2,5] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Oct 14 04:12:28 localhost ceph-osd[32440]: osd.3 pg_epoch: 42 pg[3.2( empty local-lis/les=41/42 n=0 ec=35/20 lis/c=35/35 les/c/f=36/36/0 sis=41) [3,2,5] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Oct 14 04:12:28 localhost ceph-osd[32440]: osd.3 pg_epoch: 42 pg[5.6( empty local-lis/les=41/42 n=0 ec=37/22 lis/c=37/37 les/c/f=38/38/0 sis=41) [3,5,2] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Oct 14 04:12:28 localhost ceph-osd[32440]: osd.3 pg_epoch: 42 pg[3.1e( empty local-lis/les=41/42 n=0 ec=35/20 lis/c=35/35 les/c/f=36/36/0 sis=41) [3,2,5] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Oct 14 04:12:28 localhost ceph-osd[32440]: osd.3 pg_epoch: 42 pg[4.f( empty local-lis/les=41/42 n=0 ec=35/21 lis/c=35/35 les/c/f=36/36/0 sis=41) [3,2,5] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Oct 14 04:12:28 localhost ceph-osd[32440]: osd.3 pg_epoch: 42 pg[4.17( empty local-lis/les=41/42 n=0 ec=35/21 lis/c=35/35 les/c/f=36/36/0 sis=41) [3,1,2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Oct 14 04:12:28 localhost ceph-osd[32440]: osd.3 pg_epoch: 42 pg[6.1d( empty local-lis/les=41/42 n=0 ec=37/28 lis/c=37/37 les/c/f=38/38/0 sis=41) [3,4,5] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Oct 14 04:12:28 localhost ceph-osd[32440]: osd.3 pg_epoch: 42 pg[3.18( empty local-lis/les=41/42 n=0 ec=35/20 lis/c=35/35 les/c/f=36/36/0 sis=41) [3,4,1] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Oct 14 04:12:28 localhost ceph-osd[31500]: osd.0 pg_epoch: 41 pg[4.14( empty local-lis/les=0/0 n=0 ec=35/21 lis/c=35/35 les/c/f=36/36/0 sis=41) [4,0,5] r=1 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray
Oct 14 04:12:28 localhost ceph-osd[32440]: osd.3 pg_epoch: 42 pg[6.13( empty local-lis/les=41/42 n=0 ec=37/28 lis/c=37/37 les/c/f=38/38/0 sis=41) [3,4,5] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Oct 14 04:12:28 localhost ceph-osd[32440]: osd.3 pg_epoch: 42 pg[6.14( empty local-lis/les=41/42 n=0 ec=37/28 lis/c=37/37 les/c/f=38/38/0 sis=41) [3,5,4] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Oct 14 04:12:28 localhost ceph-osd[31500]: osd.0 pg_epoch: 41 pg[5.e( empty local-lis/les=0/0 n=0 ec=37/22 lis/c=37/37 les/c/f=38/38/0 sis=41) [4,0,1] r=1 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray
Oct 14 04:12:28 localhost ceph-osd[31500]: osd.0 pg_epoch: 41 pg[5.13( empty local-lis/les=0/0 n=0 ec=37/22 lis/c=37/37 les/c/f=38/38/0 sis=41) [4,0,1] r=1 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray
Oct 14 04:12:28 localhost ceph-osd[31500]: osd.0 pg_epoch: 41 pg[4.1( empty local-lis/les=0/0 n=0 ec=35/21 lis/c=35/35 les/c/f=36/36/0 sis=41) [4,1,0] r=2 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray
Oct 14 04:12:28 localhost ceph-osd[32440]: osd.3 pg_epoch: 42 pg[6.1f( empty local-lis/les=41/42 n=0 ec=37/28 lis/c=37/37 les/c/f=38/38/0 sis=41) [3,4,1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Oct 14 04:12:28 localhost ceph-osd[31500]: osd.0 pg_epoch: 41 pg[4.9( empty local-lis/les=0/0 n=0 ec=35/21 lis/c=35/35 les/c/f=36/36/0 sis=41) [4,0,1] r=1 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray
Oct 14 04:12:28 localhost ceph-osd[32440]: osd.3 pg_epoch: 42 pg[6.4( empty local-lis/les=41/42 n=0 ec=37/28 lis/c=37/37 les/c/f=38/38/0 sis=41) [3,1,4] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Oct 14 04:12:28 localhost ceph-osd[32440]: osd.3 pg_epoch: 42 pg[6.6( empty local-lis/les=41/42 n=0 ec=37/28 lis/c=37/37 les/c/f=38/38/0 sis=41) [3,5,4] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Oct 14 04:12:28 localhost ceph-osd[31500]: osd.0 pg_epoch: 42 pg[3.12( empty local-lis/les=41/42 n=0 ec=35/20 lis/c=35/35 les/c/f=36/36/0 sis=41) [0,5,4] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Oct 14 04:12:28 localhost ceph-osd[31500]: osd.0 pg_epoch: 42 pg[3.17( empty local-lis/les=41/42 n=0 ec=35/20 lis/c=35/35 les/c/f=36/36/0 sis=41) [0,5,4] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Oct 14 04:12:28 localhost ceph-osd[31500]: osd.0 pg_epoch: 42 pg[6.16( empty local-lis/les=41/42 n=0 ec=37/28 lis/c=37/37 les/c/f=38/38/0 sis=41) [0,1,4] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Oct 14 04:12:28 localhost ceph-osd[31500]: osd.0 pg_epoch: 42 pg[6.18( empty local-lis/les=41/42 n=0 ec=37/28 lis/c=37/37 les/c/f=38/38/0 sis=41) [0,1,4] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Oct 14 04:12:28 localhost ceph-osd[31500]: osd.0 pg_epoch: 42 pg[5.1e( empty local-lis/les=41/42 n=0 ec=37/22 lis/c=37/37 les/c/f=38/38/0 sis=41) [0,5,4] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Oct 14 04:12:28 localhost ceph-osd[31500]: osd.0 pg_epoch: 42 pg[5.5( empty local-lis/les=41/42 n=0 ec=37/22 lis/c=37/37 les/c/f=38/38/0 sis=41) [0,1,4] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Oct 14 04:12:28 localhost ceph-osd[31500]: osd.0 pg_epoch: 42 pg[4.4( empty local-lis/les=41/42 n=0 ec=35/21 lis/c=35/35 les/c/f=36/36/0 sis=41) [0,4,1] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Oct 14 04:12:28 localhost ceph-osd[31500]: osd.0 pg_epoch: 42 pg[5.a( empty local-lis/les=41/42 n=0 ec=37/22 lis/c=37/37 les/c/f=38/38/0 sis=41) [0,4,1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Oct 14 04:12:28 localhost ceph-osd[31500]: osd.0 pg_epoch: 42 pg[3.1f( empty local-lis/les=41/42 n=0 ec=35/20 lis/c=35/35 les/c/f=36/36/0 sis=41) [0,1,4] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Oct 14 04:12:28 localhost ceph-osd[31500]: osd.0 pg_epoch: 42 pg[4.b( empty local-lis/les=41/42 n=0 ec=35/21 lis/c=35/35 les/c/f=36/36/0 sis=41) [0,1,4] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Oct 14 04:12:28 localhost ceph-osd[32440]: osd.3 pg_epoch: 42 pg[6.11( empty local-lis/les=41/42 n=0 ec=37/28 lis/c=37/37 les/c/f=38/38/0 sis=41) [3,4,1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Oct 14 04:12:28 localhost ceph-osd[32440]: osd.3 pg_epoch: 42 pg[5.c( empty local-lis/les=41/42 n=0 ec=37/22 lis/c=37/37 les/c/f=38/38/0 sis=41) [3,1,4] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Oct 14 04:12:28 localhost ceph-osd[32440]: osd.3 pg_epoch: 42 pg[6.f( empty local-lis/les=41/42 n=0 ec=37/28 lis/c=37/37 les/c/f=38/38/0 sis=41) [3,4,5] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Oct 14 04:12:28 localhost ceph-osd[32440]: osd.3 pg_epoch: 42 pg[5.1d( empty local-lis/les=41/42 n=0 ec=37/22 lis/c=37/37 les/c/f=38/38/0 sis=41) [3,1,4] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Oct 14 04:12:28 localhost ceph-osd[31500]: osd.0 pg_epoch: 42 pg[4.12( empty local-lis/les=41/42 n=0 ec=35/21 lis/c=35/35 les/c/f=36/36/0 sis=41) [0,4,1] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Oct 14 04:12:28 localhost ceph-osd[31500]: osd.0 pg_epoch: 42 pg[3.6( empty local-lis/les=41/42 n=0 ec=35/20 lis/c=35/35 les/c/f=36/36/0 sis=41) [0,1,4] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Oct 14 04:12:28 localhost ceph-osd[32440]: osd.3 pg_epoch: 42 pg[6.b( empty local-lis/les=41/42 n=0 ec=37/28 lis/c=37/37 les/c/f=38/38/0 sis=41) [3,1,4] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Oct 14 04:12:28 localhost ceph-osd[32440]: osd.3 pg_epoch: 42 pg[3.b( empty local-lis/les=41/42 n=0 ec=35/20 lis/c=35/35 les/c/f=36/36/0 sis=41) [3,1,4] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Oct 14 04:12:28 localhost ceph-osd[32440]: osd.3 pg_epoch: 42 pg[6.c( empty local-lis/les=41/42 n=0 ec=37/28 lis/c=37/37 les/c/f=38/38/0 sis=41) [3,1,4] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Oct 14 04:12:28 localhost ceph-osd[32440]: osd.3 pg_epoch: 42 pg[4.10( empty local-lis/les=41/42 n=0 ec=35/21 lis/c=35/35 les/c/f=36/36/0 sis=41) [3,4,5] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Oct 14 04:12:29 localhost ceph-osd[32440]: osd.3 pg_epoch: 43 pg[7.a( v 31'39 (0'0,31'39] local-lis/les=39/40 n=1 ec=39/29 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=8.857314110s) [3,2,1] r=0 lpr=43 pi=[39,43)/1 luod=0'0 crt=31'39 lcod 0'0 mlcod 0'0 active pruub 1192.734985352s@ mbc={}] start_peering_interval up [5,4,3] -> [3,2,1], acting [5,4,3] -> [3,2,1], acting_primary 5 -> 3, up_primary 5 -> 3, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Oct 14 04:12:29 localhost ceph-osd[32440]: osd.3 pg_epoch: 43 pg[7.a( v 31'39 (0'0,31'39] local-lis/les=39/40 n=1 ec=39/29 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=8.857314110s) [3,2,1] r=0 lpr=43 pi=[39,43)/1 crt=31'39 lcod 0'0 mlcod 0'0 unknown pruub 1192.734985352s@ mbc={}] state: transitioning to Primary
Oct 14 04:12:29 localhost ceph-osd[32440]: osd.3 pg_epoch: 43 pg[7.6( v 31'39 (0'0,31'39] local-lis/les=39/40 n=2 ec=39/29 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=8.856261253s) [3,2,1] r=0 lpr=43 pi=[39,43)/1 luod=0'0 crt=31'39 lcod 0'0 mlcod 0'0 active pruub 1192.734497070s@ mbc={}] start_peering_interval up [5,4,3] -> [3,2,1], acting [5,4,3] -> [3,2,1], acting_primary 5 -> 3, up_primary 5 -> 3, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Oct 14 04:12:29 localhost ceph-osd[32440]: osd.3 pg_epoch: 43 pg[7.6( v 31'39 (0'0,31'39] local-lis/les=39/40 n=2 ec=39/29 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=8.856261253s) [3,2,1] r=0 lpr=43 pi=[39,43)/1 crt=31'39 lcod 0'0 mlcod 0'0 unknown pruub 1192.734497070s@ mbc={}] state: transitioning to Primary
Oct 14 04:12:29 localhost ceph-osd[32440]: osd.3 pg_epoch: 43 pg[7.2( v 31'39 (0'0,31'39] local-lis/les=39/40 n=2 ec=39/29 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=8.855790138s) [3,2,1] r=0 lpr=43 pi=[39,43)/1 luod=0'0 crt=31'39 lcod 0'0 mlcod 0'0 active pruub 1192.734130859s@ mbc={}] start_peering_interval up [5,4,3] -> [3,2,1], acting [5,4,3] -> [3,2,1], acting_primary 5 -> 3, up_primary 5 -> 3, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Oct 14 04:12:29 localhost ceph-osd[32440]: osd.3 pg_epoch: 43 pg[7.e( v 31'39 (0'0,31'39] local-lis/les=39/40 n=1 ec=39/29 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=8.856481552s) [3,2,1] r=0 lpr=43 pi=[39,43)/1 luod=0'0 crt=31'39 lcod 0'0 mlcod 0'0 active pruub 1192.734863281s@ mbc={}] start_peering_interval up [5,4,3] -> [3,2,1], acting [5,4,3] -> [3,2,1], acting_primary 5 -> 3, up_primary 5 -> 3, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Oct 14 04:12:29 localhost ceph-osd[32440]: osd.3 pg_epoch: 43 pg[7.2( v 31'39 (0'0,31'39] local-lis/les=39/40 n=2 ec=39/29 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=8.855790138s) [3,2,1] r=0 lpr=43 pi=[39,43)/1 crt=31'39 lcod 0'0 mlcod 0'0 unknown pruub 1192.734130859s@ mbc={}] state: transitioning to Primary
Oct 14 04:12:29 localhost ceph-osd[32440]: osd.3 pg_epoch: 43 pg[7.e( v 31'39 (0'0,31'39] local-lis/les=39/40 n=1 ec=39/29 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=8.856481552s) [3,2,1] r=0 lpr=43 pi=[39,43)/1 crt=31'39 lcod 0'0 mlcod 0'0 unknown pruub 1192.734863281s@ mbc={}] state: transitioning to Primary
Oct 14 04:12:29 localhost python3[56542]: ansible-file Invoked with path=/var/lib/tripleo-config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 04:12:30 localhost ceph-osd[32440]: osd.3 pg_epoch: 44 pg[7.a( v 31'39 (0'0,31'39] local-lis/les=43/44 n=1 ec=39/29 lis/c=39/39 les/c/f=40/40/0 sis=43) [3,2,1] r=0 lpr=43 pi=[39,43)/1 crt=31'39 lcod 0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Oct 14 04:12:30 localhost ceph-osd[32440]: osd.3 pg_epoch: 44 pg[7.6( v 31'39 (0'0,31'39] local-lis/les=43/44 n=2 ec=39/29 lis/c=39/39 les/c/f=40/40/0 sis=43) [3,2,1] r=0 lpr=43 pi=[39,43)/1 crt=31'39 lcod 0'0 mlcod 0'0 active+degraded mbc={255={(1+2)=1}}] state: react AllReplicasActivated Activating complete
Oct 14 04:12:30 localhost ceph-osd[32440]: osd.3 pg_epoch: 44 pg[7.e( v 31'39 (0'0,31'39] local-lis/les=43/44 n=1 ec=39/29 lis/c=39/39 les/c/f=40/40/0 sis=43) [3,2,1] r=0 lpr=43 pi=[39,43)/1 crt=31'39 lcod 0'0 mlcod 0'0 active+degraded mbc={255={(1+2)=1}}] state: react AllReplicasActivated Activating complete
Oct 14 04:12:30 localhost ceph-osd[32440]: osd.3 pg_epoch: 44 pg[7.2( v 31'39 (0'0,31'39] local-lis/les=43/44 n=2 ec=39/29 lis/c=39/39 les/c/f=40/40/0 sis=43) [3,2,1] r=0 lpr=43 pi=[39,43)/1 crt=31'39 lcod 0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Oct 14 04:12:31 localhost ceph-osd[32440]: log_channel(cluster) log [DBG] : 2.19 scrub starts
Oct 14 04:12:31 localhost ceph-osd[32440]: osd.3 pg_epoch: 45 pg[7.7( v 31'39 (0'0,31'39] local-lis/les=39/40 n=1 ec=39/29 lis/c=39/39 les/c/f=40/40/0 sis=45 pruub=14.819221497s) [3,1,4] r=0 lpr=45 pi=[39,45)/1 luod=0'0 crt=31'39 lcod 0'0 mlcod 0'0 active pruub 1200.734985352s@ mbc={}] start_peering_interval up [5,4,3] -> [3,1,4], acting [5,4,3] -> [3,1,4], acting_primary 5 -> 3, up_primary 5 -> 3, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Oct 14 04:12:31 localhost ceph-osd[32440]: osd.3 pg_epoch: 45 pg[7.7( v 31'39 (0'0,31'39] local-lis/les=39/40 n=1 ec=39/29 lis/c=39/39 les/c/f=40/40/0 sis=45 pruub=14.819221497s) [3,1,4] r=0 lpr=45 pi=[39,45)/1 crt=31'39 lcod 0'0 mlcod 0'0 unknown pruub 1200.734985352s@ mbc={}] state: transitioning to Primary
Oct 14 04:12:31 localhost ceph-osd[32440]: osd.3 pg_epoch: 45 pg[7.3( v 31'39 (0'0,31'39] local-lis/les=39/40 n=2 ec=39/29 lis/c=39/39 les/c/f=40/40/0 sis=45 pruub=14.818769455s) [3,1,4] r=0 lpr=45 pi=[39,45)/1 luod=0'0 crt=31'39 lcod 0'0 mlcod 0'0 active pruub 1200.734863281s@ mbc={}] start_peering_interval up [5,4,3] -> [3,1,4], acting [5,4,3] -> [3,1,4], acting_primary 5 -> 3, up_primary 5 -> 3, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Oct 14 04:12:31 localhost ceph-osd[32440]: osd.3 pg_epoch: 45 pg[7.3( v 31'39 (0'0,31'39] local-lis/les=39/40 n=2 ec=39/29 lis/c=39/39 les/c/f=40/40/0 sis=45 pruub=14.818769455s) [3,1,4] r=0 lpr=45 pi=[39,45)/1 crt=31'39 lcod 0'0 mlcod 0'0 unknown pruub 1200.734863281s@ mbc={}] state: transitioning to Primary
Oct 14 04:12:31 localhost ceph-osd[32440]: osd.3 pg_epoch: 45 pg[7.f( v 31'39 (0'0,31'39] local-lis/les=39/40 n=1 ec=39/29 lis/c=39/39 les/c/f=40/40/0 sis=45 pruub=14.819266319s) [3,1,4] r=0 lpr=45 pi=[39,45)/1 luod=0'0 crt=31'39 lcod 0'0 mlcod 0'0 active pruub 1200.735961914s@ mbc={}] start_peering_interval up [5,4,3] -> [3,1,4], acting [5,4,3] -> [3,1,4], acting_primary 5 -> 3, up_primary 5 -> 3, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Oct 14 04:12:31 localhost ceph-osd[32440]: osd.3 pg_epoch: 45 pg[7.f( v 31'39 (0'0,31'39] local-lis/les=39/40 n=1 ec=39/29 lis/c=39/39 les/c/f=40/40/0 sis=45 pruub=14.819266319s) [3,1,4] r=0 lpr=45 pi=[39,45)/1 crt=31'39 lcod 0'0 mlcod 0'0 unknown pruub 1200.735961914s@ mbc={}] state: transitioning to Primary
Oct 14 04:12:31 localhost ceph-osd[32440]: osd.3 pg_epoch: 45 pg[7.b( v 31'39 (0'0,31'39] local-lis/les=39/40 n=1 ec=39/29 lis/c=39/39 les/c/f=40/40/0 sis=45 pruub=14.818964005s) [3,1,4] r=0 lpr=45 pi=[39,45)/1 luod=0'0 crt=31'39 lcod 0'0 mlcod 0'0 active pruub 1200.735595703s@ mbc={}] start_peering_interval up [5,4,3] -> [3,1,4], acting [5,4,3] -> [3,1,4], acting_primary 5 -> 3, up_primary 5 -> 3, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Oct 14 04:12:31 localhost ceph-osd[32440]: osd.3 pg_epoch: 45 pg[7.b( v 31'39 (0'0,31'39] local-lis/les=39/40 n=1 ec=39/29 lis/c=39/39 les/c/f=40/40/0 sis=45 pruub=14.818964005s) [3,1,4] r=0 lpr=45 pi=[39,45)/1 crt=31'39 lcod 0'0 mlcod 0'0 unknown pruub 1200.735595703s@ mbc={}] state: transitioning to Primary
Oct 14 04:12:32 localhost ceph-osd[31500]: log_channel(cluster) log [DBG] : 6.0 scrub starts
Oct 14 04:12:32 localhost ceph-osd[32440]: osd.3 pg_epoch: 46 pg[7.7( v 31'39 (0'0,31'39] local-lis/les=45/46 n=1 ec=39/29 lis/c=39/39 les/c/f=40/40/0 sis=45) [3,1,4] r=0 lpr=45 pi=[39,45)/1 crt=31'39 lcod 0'0 mlcod 0'0 active+degraded mbc={255={(2+1)=1}}] state: react AllReplicasActivated Activating complete
Oct 14 04:12:32 localhost ceph-osd[32440]: osd.3 pg_epoch: 46 pg[7.f( v 31'39 (0'0,31'39] local-lis/les=45/46 n=1 ec=39/29 lis/c=39/39 les/c/f=40/40/0 sis=45) [3,1,4] r=0 lpr=45 pi=[39,45)/1 crt=31'39 lcod 0'0 mlcod 0'0 active+degraded mbc={255={(2+1)=3}}] state: react AllReplicasActivated Activating complete
Oct 14 04:12:32 localhost ceph-osd[32440]: osd.3 pg_epoch: 46 pg[7.b( v 31'39 (0'0,31'39] local-lis/les=45/46 n=1 ec=39/29 lis/c=39/39 les/c/f=40/40/0 sis=45) [3,1,4] r=0 lpr=45 pi=[39,45)/1 crt=31'39 lcod 0'0 mlcod 0'0 active+degraded mbc={255={(2+1)=1}}] state: react AllReplicasActivated Activating complete
Oct 14 04:12:32 localhost ceph-osd[32440]: osd.3 pg_epoch: 46 pg[7.3( v 31'39 (0'0,31'39] local-lis/les=45/46 n=2 ec=39/29 lis/c=39/39 les/c/f=40/40/0 sis=45) [3,1,4] r=0 lpr=45 pi=[39,45)/1 crt=31'39 lcod 0'0 mlcod 0'0 active+degraded mbc={255={(2+1)=2}}] state: react AllReplicasActivated Activating complete
Oct 14 04:12:32 localhost python3[56591]: ansible-ansible.legacy.stat Invoked with path=/var/lib/tripleo-config/ceph/ceph.client.openstack.keyring follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 14 04:12:33 localhost python3[56634]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1760429552.5806885-92638-165256650867266/source dest=/var/lib/tripleo-config/ceph/ceph.client.openstack.keyring mode=600 _original_basename=ceph.client.openstack.keyring follow=False checksum=0991400062f1e3522feec6859340320816889889 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 04:12:34 localhost ceph-osd[31500]: log_channel(cluster) log [DBG] : 3.19 scrub starts
Oct 14 04:12:35 localhost ceph-osd[31500]: log_channel(cluster) log [DBG] : 6.10 scrub starts
Oct 14 04:12:37 localhost ceph-osd[31500]: osd.0 pg_epoch: 47 pg[7.c( empty local-lis/les=0/0 n=0 ec=39/29 lis/c=39/39 les/c/f=40/40/0 sis=47) [0,1,2] r=0 lpr=47 pi=[39,47)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Oct 14 04:12:37 localhost ceph-osd[31500]: osd.0 pg_epoch: 47 pg[7.4( empty local-lis/les=0/0 n=0 ec=39/29 lis/c=39/39 les/c/f=40/40/0 sis=47) [0,1,2] r=0 lpr=47 pi=[39,47)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Oct 14 04:12:37 localhost ceph-osd[32440]: osd.3 pg_epoch: 47 pg[7.c( v 31'39 (0'0,31'39] local-lis/les=39/40 n=1 ec=39/29 lis/c=39/39 les/c/f=40/40/0 sis=47 pruub=9.228775978s) [0,1,2] r=-1 lpr=47 pi=[39,47)/1 luod=0'0 crt=31'39 lcod 0'0 mlcod 0'0 active pruub 1200.734985352s@ mbc={}] start_peering_interval up [5,4,3] -> [0,1,2], acting [5,4,3] -> [0,1,2], acting_primary 5 -> 0, up_primary 5 -> 0, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Oct 14 04:12:37 localhost ceph-osd[32440]: osd.3 pg_epoch: 47 pg[7.c( v 31'39 (0'0,31'39] local-lis/les=39/40 n=1 ec=39/29 lis/c=39/39 les/c/f=40/40/0 sis=47 pruub=9.228672981s) [0,1,2] r=-1 lpr=47 pi=[39,47)/1 crt=31'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1200.734985352s@ mbc={}] state: transitioning to Stray
Oct 14 04:12:37 localhost ceph-osd[32440]: osd.3 pg_epoch: 47 pg[7.4( v 31'39 (0'0,31'39] local-lis/les=39/40 n=2 ec=39/29 lis/c=39/39 les/c/f=40/40/0 sis=47 pruub=9.229033470s) [0,1,2] r=-1 lpr=47 pi=[39,47)/1 luod=0'0 crt=31'39 lcod 0'0 mlcod 0'0 active pruub 1200.735351562s@ mbc={}] start_peering_interval up [5,4,3] -> [0,1,2], acting [5,4,3] -> [0,1,2], acting_primary 5 -> 0, up_primary 5 -> 0, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Oct 14 04:12:37 localhost ceph-osd[32440]: osd.3 pg_epoch: 47 pg[7.4( v 31'39 (0'0,31'39] local-lis/les=39/40 n=2 ec=39/29 lis/c=39/39 les/c/f=40/40/0 sis=47 pruub=9.228783607s) [0,1,2] r=-1 lpr=47 pi=[39,47)/1 crt=31'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1200.735351562s@ mbc={}] state: transitioning to Stray
Oct 14 04:12:38 localhost ceph-osd[31500]: osd.0 pg_epoch: 48 pg[7.c( v 31'39 lc 31'17 (0'0,31'39] local-lis/les=47/48 n=1 ec=39/29 lis/c=39/39 les/c/f=40/40/0 sis=47) [0,1,2] r=0 lpr=47 pi=[39,47)/1 crt=31'39 lcod 0'0 mlcod 0'0 active+degraded m=1 mbc={255={(0+3)=1}}] state: react AllReplicasActivated Activating complete
Oct 14 04:12:38 localhost ceph-osd[31500]: osd.0 pg_epoch: 48 pg[7.4( v 31'39 lc 31'15 (0'0,31'39] local-lis/les=47/48 n=2 ec=39/29 lis/c=39/39 les/c/f=40/40/0 sis=47) [0,1,2] r=0 lpr=47 pi=[39,47)/1 crt=31'39 lcod 0'0 mlcod 0'0 active+degraded m=4 mbc={255={(0+3)=4}}] state: react AllReplicasActivated Activating complete
Oct 14 04:12:38 localhost python3[56696]: ansible-ansible.legacy.stat Invoked with path=/var/lib/tripleo-config/ceph/ceph.client.manila.keyring follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 14 04:12:38 localhost ceph-osd[32440]: log_channel(cluster) log [DBG] : 4.0 scrub starts
Oct 14 04:12:38 localhost python3[56739]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1760429557.9017637-92638-157424959801724/source dest=/var/lib/tripleo-config/ceph/ceph.client.manila.keyring mode=600 _original_basename=ceph.client.manila.keyring follow=False checksum=ba6c47c4b62a1635e77f10e9e003b0ff16f31619 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 04:12:38 localhost ceph-osd[31500]: log_channel(cluster) log [DBG] : 3.6 scrub starts
Oct 14 04:12:38 localhost ceph-osd[31500]: log_channel(cluster) log [DBG] : 3.6 scrub ok
Oct 14 04:12:38 localhost ceph-osd[32440]: log_channel(cluster) log [DBG] : 4.0 scrub ok
Oct 14 04:12:41 localhost ceph-osd[31500]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Oct 14 04:12:41 localhost ceph-osd[31500]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 1200.1 total, 600.0 interval#012Cumulative writes: 4120 writes, 19K keys, 4120 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s#012Cumulative WAL: 4120 writes, 290 syncs, 14.21 writes per sync, written: 0.02 GB, 0.01 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 862 writes, 3363 keys, 862 commit groups, 1.0 writes per commit group, ingest: 1.43 MB, 0.00 MB/s#012Interval WAL: 862 writes, 145 syncs, 5.94 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 L0 2/0 2.61 KB 0.2 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.2 0.01 0.00 1 0.007 0 0 0.0 0.0#012 Sum 2/0 2.61 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.2 0.01 0.00 1 0.007 0 0 0.0 0.0#012 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012#012** Compaction Stats [default] **#012Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012User 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.2 0.01 0.00 1 0.007 0 0 0.0 0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 1200.1 total, 600.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 
0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x5613efba7610#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 4.4e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **#012#012** Compaction Stats [m-0] **#012Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012#012** Compaction Stats [m-0] **#012Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) 
Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 1200.1 total, 600.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x5613efba7610#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 4.4e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [m-0] **#012#012** Compaction Stats [m-1] **#012Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012 Int 0/0 0.00 KB 0.0 0.0 
0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012#012** Compaction Stats [m-1] **#012Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 1200.1 total, 600.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memt Oct 14 04:12:42 localhost ceph-osd[32440]: log_channel(cluster) log [DBG] : 4.11 scrub starts Oct 14 04:12:42 localhost ceph-osd[32440]: log_channel(cluster) log [DBG] : 4.11 scrub ok Oct 14 04:12:43 localhost ceph-osd[32440]: osd.3 pg_epoch: 49 pg[7.5( v 31'39 (0'0,31'39] local-lis/les=39/40 n=2 ec=39/29 lis/c=39/39 les/c/f=40/40/0 sis=49 pruub=11.079285622s) [4,0,1] r=-1 lpr=49 pi=[39,49)/1 luod=0'0 crt=31'39 lcod 0'0 mlcod 0'0 active pruub 1208.734375000s@ mbc={}] start_peering_interval up [5,4,3] -> [4,0,1], acting [5,4,3] -> [4,0,1], acting_primary 5 -> 4, up_primary 5 -> 4, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Oct 14 04:12:43 localhost ceph-osd[32440]: 
osd.3 pg_epoch: 49 pg[7.5( v 31'39 (0'0,31'39] local-lis/les=39/40 n=2 ec=39/29 lis/c=39/39 les/c/f=40/40/0 sis=49 pruub=11.079186440s) [4,0,1] r=-1 lpr=49 pi=[39,49)/1 crt=31'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1208.734375000s@ mbc={}] state: transitioning to Stray Oct 14 04:12:43 localhost ceph-osd[32440]: osd.3 pg_epoch: 49 pg[7.d( v 31'39 (0'0,31'39] local-lis/les=39/40 n=1 ec=39/29 lis/c=39/39 les/c/f=40/40/0 sis=49 pruub=11.079216003s) [4,0,1] r=-1 lpr=49 pi=[39,49)/1 luod=0'0 crt=31'39 lcod 0'0 mlcod 0'0 active pruub 1208.734375000s@ mbc={}] start_peering_interval up [5,4,3] -> [4,0,1], acting [5,4,3] -> [4,0,1], acting_primary 5 -> 4, up_primary 5 -> 4, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Oct 14 04:12:43 localhost ceph-osd[32440]: osd.3 pg_epoch: 49 pg[7.d( v 31'39 (0'0,31'39] local-lis/les=39/40 n=1 ec=39/29 lis/c=39/39 les/c/f=40/40/0 sis=49 pruub=11.078792572s) [4,0,1] r=-1 lpr=49 pi=[39,49)/1 crt=31'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1208.734375000s@ mbc={}] state: transitioning to Stray Oct 14 04:12:43 localhost ceph-osd[32440]: log_channel(cluster) log [DBG] : 4.f scrub starts Oct 14 04:12:43 localhost ceph-osd[32440]: log_channel(cluster) log [DBG] : 4.f scrub ok Oct 14 04:12:43 localhost python3[56801]: ansible-ansible.legacy.stat Invoked with path=/var/lib/tripleo-config/ceph/ceph.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Oct 14 04:12:43 localhost python3[56844]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1760429563.1997614-92638-41278663095220/source dest=/var/lib/tripleo-config/ceph/ceph.conf mode=644 _original_basename=ceph.conf follow=False checksum=2a2148c4af133c419b7d1e891437641895bee05f backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None 
selevel=None setype=None attributes=None Oct 14 04:12:44 localhost ceph-osd[31500]: osd.0 pg_epoch: 49 pg[7.5( empty local-lis/les=0/0 n=0 ec=39/29 lis/c=39/39 les/c/f=40/40/0 sis=49) [4,0,1] r=1 lpr=49 pi=[39,49)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Oct 14 04:12:44 localhost ceph-osd[31500]: osd.0 pg_epoch: 49 pg[7.d( empty local-lis/les=0/0 n=0 ec=39/29 lis/c=39/39 les/c/f=40/40/0 sis=49) [4,0,1] r=1 lpr=49 pi=[39,49)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Oct 14 04:12:45 localhost ceph-osd[32440]: osd.3 pg_epoch: 51 pg[7.6( v 31'39 (0'0,31'39] local-lis/les=43/44 n=2 ec=39/29 lis/c=43/43 les/c/f=44/46/0 sis=51 pruub=9.558267593s) [0,1,4] r=-1 lpr=51 pi=[43,51)/1 crt=31'39 lcod 0'0 mlcod 0'0 active pruub 1209.238769531s@ mbc={255={}}] start_peering_interval up [3,2,1] -> [0,1,4], acting [3,2,1] -> [0,1,4], acting_primary 3 -> 0, up_primary 3 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Oct 14 04:12:45 localhost ceph-osd[32440]: osd.3 pg_epoch: 51 pg[7.6( v 31'39 (0'0,31'39] local-lis/les=43/44 n=2 ec=39/29 lis/c=43/43 les/c/f=44/46/0 sis=51 pruub=9.558148384s) [0,1,4] r=-1 lpr=51 pi=[43,51)/1 crt=31'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1209.238769531s@ mbc={}] state: transitioning to Stray Oct 14 04:12:45 localhost ceph-osd[32440]: osd.3 pg_epoch: 51 pg[7.e( v 31'39 (0'0,31'39] local-lis/les=43/44 n=1 ec=39/29 lis/c=43/43 les/c/f=44/46/0 sis=51 pruub=9.558197975s) [0,1,4] r=-1 lpr=51 pi=[43,51)/1 crt=31'39 lcod 0'0 mlcod 0'0 active pruub 1209.238891602s@ mbc={255={}}] start_peering_interval up [3,2,1] -> [0,1,4], acting [3,2,1] -> [0,1,4], acting_primary 3 -> 0, up_primary 3 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Oct 14 04:12:45 localhost ceph-osd[32440]: osd.3 pg_epoch: 51 pg[7.e( v 31'39 (0'0,31'39] local-lis/les=43/44 n=1 ec=39/29 lis/c=43/43 les/c/f=44/46/0 sis=51 pruub=9.558073044s) [0,1,4] r=-1 lpr=51 
pi=[43,51)/1 crt=31'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1209.238891602s@ mbc={}] state: transitioning to Stray Oct 14 04:12:45 localhost ceph-osd[31500]: osd.0 pg_epoch: 51 pg[7.6( empty local-lis/les=0/0 n=0 ec=39/29 lis/c=43/43 les/c/f=44/46/0 sis=51) [0,1,4] r=0 lpr=51 pi=[43,51)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Oct 14 04:12:45 localhost ceph-osd[31500]: osd.0 pg_epoch: 51 pg[7.e( empty local-lis/les=0/0 n=0 ec=39/29 lis/c=43/43 les/c/f=44/46/0 sis=51) [0,1,4] r=0 lpr=51 pi=[43,51)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Oct 14 04:12:45 localhost ceph-osd[32440]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Oct 14 04:12:45 localhost ceph-osd[32440]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 1200.1 total, 600.0 interval#012Cumulative writes: 4652 writes, 21K keys, 4652 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s#012Cumulative WAL: 4652 writes, 380 syncs, 12.24 writes per sync, written: 0.02 GB, 0.01 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 1370 writes, 5205 keys, 1370 commit groups, 1.0 writes per commit group, ingest: 1.99 MB, 0.00 MB/s#012Interval WAL: 1370 writes, 225 syncs, 6.09 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 L0 2/0 2.61 KB 0.2 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.3 0.00 0.00 1 0.004 0 0 0.0 0.0#012 Sum 2/0 2.61 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.3 0.00 0.00 1 0.004 0 0 0.0 0.0#012 Int 0/0 
0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012#012** Compaction Stats [default] **#012Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012User 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.3 0.00 0.00 1 0.004 0 0 0.0 0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 1200.1 total, 600.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55b0d22fc2d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 6.1e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **#012#012** Compaction Stats [m-0] **#012Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) 
Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012#012** Compaction Stats [m-0] **#012Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 1200.1 total, 600.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55b0d22fc2d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 6.1e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) 
FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [m-0] **#012#012** Compaction Stats [m-1] **#012Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012#012** Compaction Stats [m-1] **#012Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 1200.1 total, 600.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 m Oct 14 04:12:45 localhost 
ceph-osd[31500]: log_channel(cluster) log [DBG] : 6.9 scrub starts Oct 14 04:12:45 localhost ceph-osd[31500]: log_channel(cluster) log [DBG] : 6.9 scrub ok Oct 14 04:12:46 localhost ceph-osd[31500]: osd.0 pg_epoch: 52 pg[7.6( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=51/52 n=2 ec=39/29 lis/c=43/43 les/c/f=44/46/0 sis=51) [0,1,4] r=0 lpr=51 pi=[43,51)/1 crt=31'39 mlcod 0'0 active+degraded m=1 mbc={255={(1+2)=1}}] state: react AllReplicasActivated Activating complete Oct 14 04:12:46 localhost ceph-osd[31500]: osd.0 pg_epoch: 52 pg[7.e( v 31'39 lc 31'19 (0'0,31'39] local-lis/les=51/52 n=1 ec=39/29 lis/c=43/43 les/c/f=44/46/0 sis=51) [0,1,4] r=0 lpr=51 pi=[43,51)/1 crt=31'39 lcod 0'0 mlcod 0'0 active+degraded m=1 mbc={255={(1+2)=1}}] state: react AllReplicasActivated Activating complete Oct 14 04:12:46 localhost ceph-osd[31500]: log_channel(cluster) log [DBG] : 3.1 scrub starts Oct 14 04:12:47 localhost ceph-osd[31500]: log_channel(cluster) log [DBG] : 3.1 scrub ok Oct 14 04:12:47 localhost ceph-osd[32440]: osd.3 pg_epoch: 53 pg[7.f( v 31'39 (0'0,31'39] local-lis/les=45/46 n=1 ec=39/29 lis/c=45/45 les/c/f=46/46/0 sis=53 pruub=9.196358681s) [1,4,3] r=2 lpr=53 pi=[45,53)/1 crt=31'39 lcod 0'0 mlcod 0'0 active pruub 1210.952148438s@ mbc={255={}}] start_peering_interval up [3,1,4] -> [1,4,3], acting [3,1,4] -> [1,4,3], acting_primary 3 -> 1, up_primary 3 -> 1, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Oct 14 04:12:47 localhost ceph-osd[32440]: osd.3 pg_epoch: 53 pg[7.f( v 31'39 (0'0,31'39] local-lis/les=45/46 n=1 ec=39/29 lis/c=45/45 les/c/f=46/46/0 sis=53 pruub=9.196192741s) [1,4,3] r=2 lpr=53 pi=[45,53)/1 crt=31'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1210.952148438s@ mbc={}] state: transitioning to Stray Oct 14 04:12:47 localhost ceph-osd[32440]: osd.3 pg_epoch: 53 pg[7.7( v 31'39 (0'0,31'39] local-lis/les=45/46 n=1 ec=39/29 lis/c=45/45 les/c/f=46/46/0 sis=53 pruub=9.188833237s) [1,4,3] r=2 lpr=53 pi=[45,53)/1 crt=31'39 lcod 0'0 
mlcod 0'0 active pruub 1210.944946289s@ mbc={255={}}] start_peering_interval up [3,1,4] -> [1,4,3], acting [3,1,4] -> [1,4,3], acting_primary 3 -> 1, up_primary 3 -> 1, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Oct 14 04:12:47 localhost ceph-osd[32440]: osd.3 pg_epoch: 53 pg[7.7( v 31'39 (0'0,31'39] local-lis/les=45/46 n=1 ec=39/29 lis/c=45/45 les/c/f=46/46/0 sis=53 pruub=9.188641548s) [1,4,3] r=2 lpr=53 pi=[45,53)/1 crt=31'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1210.944946289s@ mbc={}] state: transitioning to Stray Oct 14 04:12:47 localhost ceph-osd[32440]: log_channel(cluster) log [DBG] : 3.7 scrub starts Oct 14 04:12:47 localhost ceph-osd[32440]: log_channel(cluster) log [DBG] : 3.7 scrub ok Oct 14 04:12:48 localhost python3[56906]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/config_step.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Oct 14 04:12:48 localhost ceph-osd[32440]: log_channel(cluster) log [DBG] : 3.4 scrub starts Oct 14 04:12:48 localhost ceph-osd[32440]: log_channel(cluster) log [DBG] : 3.4 scrub ok Oct 14 04:12:48 localhost python3[56951]: ansible-ansible.legacy.copy Invoked with dest=/etc/puppet/hieradata/config_step.json force=True mode=0600 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1760429567.8972337-93152-246457790564947/source _original_basename=tmphesijjf1 follow=False checksum=f17091ee142621a3c8290c8c96b5b52d67b3a864 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 04:12:49 localhost python3[57013]: ansible-ansible.legacy.stat Invoked with path=/usr/local/sbin/containers-tmpwatch follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Oct 14 04:12:50 localhost python3[57056]: 
ansible-ansible.legacy.copy Invoked with dest=/usr/local/sbin/containers-tmpwatch group=root mode=493 owner=root src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1760429569.538499-93268-84260235500463/source _original_basename=tmpyb67iotn follow=False checksum=84397b037dad9813fed388c4bcdd4871f384cd22 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 04:12:50 localhost ceph-osd[32440]: osd.3 pg_epoch: 55 pg[7.8( v 31'39 (0'0,31'39] local-lis/les=39/40 n=1 ec=39/29 lis/c=39/39 les/c/f=40/40/0 sis=55 pruub=11.926024437s) [3,1,2] r=0 lpr=55 pi=[39,55)/1 luod=0'0 crt=31'39 lcod 0'0 mlcod 0'0 active pruub 1216.735351562s@ mbc={}] start_peering_interval up [5,4,3] -> [3,1,2], acting [5,4,3] -> [3,1,2], acting_primary 5 -> 3, up_primary 5 -> 3, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015 Oct 14 04:12:50 localhost ceph-osd[32440]: osd.3 pg_epoch: 55 pg[7.8( v 31'39 (0'0,31'39] local-lis/les=39/40 n=1 ec=39/29 lis/c=39/39 les/c/f=40/40/0 sis=55 pruub=11.926024437s) [3,1,2] r=0 lpr=55 pi=[39,55)/1 crt=31'39 lcod 0'0 mlcod 0'0 unknown pruub 1216.735351562s@ mbc={}] state: transitioning to Primary Oct 14 04:12:50 localhost python3[57086]: ansible-cron Invoked with job=/usr/local/sbin/containers-tmpwatch name=Remove old logs special_time=daily user=root state=present backup=False minute=* hour=* day=* month=* weekday=* disabled=False env=False cron_file=None insertafter=None insertbefore=None Oct 14 04:12:51 localhost python3[57104]: ansible-stat Invoked with path=/var/lib/tripleo-config/container-startup-config/step_2 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Oct 14 04:12:51 localhost ceph-osd[31500]: osd.0 pg_epoch: 56 pg[7.9( empty local-lis/les=0/0 n=0 ec=39/29 lis/c=39/39 les/c/f=40/40/0 sis=56) [0,4,5] r=0 lpr=56 
pi=[39,56)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Oct 14 04:12:51 localhost ceph-osd[32440]: osd.3 pg_epoch: 56 pg[7.9( v 31'39 (0'0,31'39] local-lis/les=39/40 n=1 ec=39/29 lis/c=39/39 les/c/f=40/40/0 sis=56 pruub=10.886432648s) [0,4,5] r=-1 lpr=56 pi=[39,56)/1 luod=0'0 crt=31'39 lcod 0'0 mlcod 0'0 active pruub 1216.735961914s@ mbc={}] start_peering_interval up [5,4,3] -> [0,4,5], acting [5,4,3] -> [0,4,5], acting_primary 5 -> 0, up_primary 5 -> 0, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Oct 14 04:12:51 localhost ceph-osd[32440]: osd.3 pg_epoch: 56 pg[7.9( v 31'39 (0'0,31'39] local-lis/les=39/40 n=1 ec=39/29 lis/c=39/39 les/c/f=40/40/0 sis=56 pruub=10.886179924s) [0,4,5] r=-1 lpr=56 pi=[39,56)/1 crt=31'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1216.735961914s@ mbc={}] state: transitioning to Stray Oct 14 04:12:51 localhost ceph-osd[32440]: osd.3 pg_epoch: 56 pg[7.8( v 31'39 (0'0,31'39] local-lis/les=55/56 n=1 ec=39/29 lis/c=39/39 les/c/f=40/40/0 sis=55) [3,1,2] r=0 lpr=55 pi=[39,55)/1 crt=31'39 lcod 0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Oct 14 04:12:52 localhost ceph-osd[31500]: osd.0 pg_epoch: 57 pg[7.9( v 31'39 (0'0,31'39] local-lis/les=56/57 n=1 ec=39/29 lis/c=39/39 les/c/f=40/40/0 sis=56) [0,4,5] r=0 lpr=56 pi=[39,56)/1 crt=31'39 lcod 0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Oct 14 04:12:52 localhost ansible-async_wrapper.py[57276]: Invoked with 222019460660 3600 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1760429572.28814-93371-164766603742499/AnsiballZ_command.py _ Oct 14 04:12:52 localhost ansible-async_wrapper.py[57279]: Starting module and watcher Oct 14 04:12:52 localhost ansible-async_wrapper.py[57279]: Start watching 57280 (3600) Oct 14 04:12:52 localhost ansible-async_wrapper.py[57280]: Start module (57280) Oct 14 04:12:52 localhost ansible-async_wrapper.py[57276]: Return async_wrapper task 
started. Oct 14 04:12:53 localhost python3[57298]: ansible-ansible.legacy.async_status Invoked with jid=222019460660.57276 mode=status _async_dir=/tmp/.ansible_async Oct 14 04:12:56 localhost ceph-osd[31500]: log_channel(cluster) log [DBG] : 5.19 scrub starts Oct 14 04:12:56 localhost ceph-osd[31500]: log_channel(cluster) log [DBG] : 5.19 scrub ok Oct 14 04:12:56 localhost puppet-user[57300]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5 Oct 14 04:12:56 localhost puppet-user[57300]: (file: /etc/puppet/hiera.yaml) Oct 14 04:12:56 localhost puppet-user[57300]: Warning: Undefined variable '::deploy_config_name'; Oct 14 04:12:56 localhost puppet-user[57300]: (file & line not available) Oct 14 04:12:56 localhost puppet-user[57300]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html Oct 14 04:12:56 localhost puppet-user[57300]: (file & line not available) Oct 14 04:12:56 localhost puppet-user[57300]: Warning: Unknown variable: '::deployment_type'. (file: /etc/puppet/modules/tripleo/manifests/profile/base/database/mysql/client.pp, line: 89, column: 8) Oct 14 04:12:56 localhost puppet-user[57300]: Warning: Unknown variable: '::deployment_type'. 
(file: /etc/puppet/modules/tripleo/manifests/packages.pp, line: 39, column: 69) Oct 14 04:12:56 localhost puppet-user[57300]: Notice: Compiled catalog for np0005486733.localdomain in environment production in 0.12 seconds Oct 14 04:12:56 localhost puppet-user[57300]: Notice: Applied catalog in 0.04 seconds Oct 14 04:12:56 localhost puppet-user[57300]: Application: Oct 14 04:12:56 localhost puppet-user[57300]: Initial environment: production Oct 14 04:12:56 localhost puppet-user[57300]: Converged environment: production Oct 14 04:12:56 localhost puppet-user[57300]: Run mode: user Oct 14 04:12:56 localhost puppet-user[57300]: Changes: Oct 14 04:12:56 localhost puppet-user[57300]: Events: Oct 14 04:12:56 localhost puppet-user[57300]: Resources: Oct 14 04:12:56 localhost puppet-user[57300]: Total: 10 Oct 14 04:12:56 localhost puppet-user[57300]: Time: Oct 14 04:12:56 localhost puppet-user[57300]: Schedule: 0.00 Oct 14 04:12:56 localhost puppet-user[57300]: File: 0.00 Oct 14 04:12:56 localhost puppet-user[57300]: Exec: 0.01 Oct 14 04:12:56 localhost puppet-user[57300]: Augeas: 0.01 Oct 14 04:12:56 localhost puppet-user[57300]: Transaction evaluation: 0.03 Oct 14 04:12:56 localhost puppet-user[57300]: Catalog application: 0.04 Oct 14 04:12:56 localhost puppet-user[57300]: Config retrieval: 0.15 Oct 14 04:12:56 localhost puppet-user[57300]: Last run: 1760429576 Oct 14 04:12:56 localhost puppet-user[57300]: Filebucket: 0.00 Oct 14 04:12:56 localhost puppet-user[57300]: Total: 0.04 Oct 14 04:12:56 localhost puppet-user[57300]: Version: Oct 14 04:12:56 localhost puppet-user[57300]: Config: 1760429576 Oct 14 04:12:56 localhost puppet-user[57300]: Puppet: 7.10.0 Oct 14 04:12:57 localhost ansible-async_wrapper.py[57280]: Module complete (57280) Oct 14 04:12:57 localhost ceph-osd[32440]: log_channel(cluster) log [DBG] : 4.17 scrub starts Oct 14 04:12:57 localhost ceph-osd[32440]: log_channel(cluster) log [DBG] : 4.17 scrub ok Oct 14 04:12:57 localhost systemd[1]: Started 
/usr/bin/podman healthcheck run 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6.
Oct 14 04:12:57 localhost podman[57411]: 2025-10-14 08:12:57.73464922 +0000 UTC m=+0.073612960 container health_status 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., version=17.1.9, description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, release=1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, batch=17.1_20250721.1, config_id=tripleo_step1, tcib_managed=true, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, container_name=metrics_qdr, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, com.redhat.component=openstack-qdrouterd-container, io.openshift.tags=rhosp osp openstack osp-17.1, architecture=x86_64, build-date=2025-07-21T13:07:59, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro',
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1)
Oct 14 04:12:57 localhost ansible-async_wrapper.py[57279]: Done in kid B.
Oct 14 04:12:57 localhost podman[57411]: 2025-10-14 08:12:57.922939182 +0000 UTC m=+0.261902842 container exec_died 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, architecture=x86_64, build-date=2025-07-21T13:07:59, container_name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, maintainer=OpenStack TripleO Team, vcs-type=git, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro',
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.buildah.version=1.33.12, version=17.1.9, tcib_managed=true, com.redhat.component=openstack-qdrouterd-container, vendor=Red Hat, Inc., name=rhosp17/openstack-qdrouterd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, managed_by=tripleo_ansible, release=1, config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd)
Oct 14 04:12:57 localhost systemd[1]: 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6.service: Deactivated successfully.
Oct 14 04:12:59 localhost ceph-osd[32440]: osd.3 pg_epoch: 58 pg[7.a( v 31'39 (0'0,31'39] local-lis/les=43/44 n=1 ec=39/29 lis/c=43/43 les/c/f=44/44/0 sis=58 pruub=11.648591042s) [4,0,5] r=-1 lpr=58 pi=[43,58)/1 crt=31'39 lcod 0'0 mlcod 0'0 active pruub 1225.236206055s@ mbc={}] start_peering_interval up [3,2,1] -> [4,0,5], acting [3,2,1] -> [4,0,5], acting_primary 3 -> 4, up_primary 3 -> 4, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Oct 14 04:12:59 localhost ceph-osd[32440]: osd.3 pg_epoch: 58 pg[7.a( v 31'39 (0'0,31'39] local-lis/les=43/44 n=1 ec=39/29 lis/c=43/43 les/c/f=44/44/0 sis=58 pruub=11.648453712s) [4,0,5] r=-1 lpr=58 pi=[43,58)/1 crt=31'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1225.236206055s@ mbc={}] state: transitioning to Stray
Oct 14 04:13:00 localhost ceph-osd[31500]: osd.0 pg_epoch: 58 pg[7.a( empty local-lis/les=0/0 n=0 ec=39/29 lis/c=43/43 les/c/f=44/44/0 sis=58) [4,0,5] r=1 lpr=58 pi=[43,58)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray
Oct 14 04:13:01 localhost ceph-osd[32440]: osd.3 pg_epoch: 60 pg[7.b( v 31'39 (0'0,31'39] local-lis/les=45/46 n=1 ec=39/29 lis/c=45/45 les/c/f=46/46/0 sis=60 pruub=11.216345787s) [3,1,2] r=0 lpr=60 pi=[45,60)/1 crt=31'39 lcod 0'0 mlcod 0'0 active pruub 1226.952758789s@ mbc={255={}}] start_peering_interval up [3,1,4] -> [3,1,2], acting [3,1,4] -> [3,1,2], acting_primary 3 -> 3, up_primary 3 -> 3, role 0 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Oct 14 04:13:01 localhost ceph-osd[32440]: osd.3 pg_epoch: 60 pg[7.b( v 31'39 (0'0,31'39] local-lis/les=45/46 n=1 ec=39/29 lis/c=45/45 les/c/f=46/46/0 sis=60 pruub=11.216345787s) [3,1,2] r=0 lpr=60 pi=[45,60)/1 crt=31'39 lcod 0'0 mlcod 0'0 unknown pruub 1226.952758789s@ mbc={}] state: transitioning to Primary
Oct 14 04:13:01 localhost ceph-osd[31500]: log_channel(cluster) log [DBG] : 5.3 scrub starts
Oct 14 04:13:01 localhost ceph-osd[31500]: log_channel(cluster) log [DBG] : 5.3 scrub ok
Oct 14 04:13:02 localhost ceph-osd[32440]: osd.3 pg_epoch: 61 pg[7.b( v 31'39 (0'0,31'39] local-lis/les=60/61 n=1 ec=39/29 lis/c=45/45 les/c/f=46/46/0 sis=60) [3,1,2] r=0 lpr=60 pi=[45,60)/1 crt=31'39 lcod 0'0 mlcod 0'0 active+degraded mbc={255={(2+1)=1}}] state: react AllReplicasActivated Activating complete
Oct 14 04:13:03 localhost ceph-osd[31500]: osd.0 pg_epoch: 62 pg[7.c( v 31'39 (0'0,31'39] local-lis/les=47/48 n=1 ec=39/29 lis/c=47/47 les/c/f=48/48/0 sis=62 pruub=14.765545845s) [1,3,4] r=-1 lpr=62 pi=[47,62)/1 crt=31'39 mlcod 31'39 active pruub 1236.774658203s@ mbc={255={}}] start_peering_interval up [0,1,2] -> [1,3,4], acting [0,1,2] -> [1,3,4], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Oct 14 04:13:03 localhost ceph-osd[31500]: osd.0 pg_epoch: 62 pg[7.c( v 31'39 (0'0,31'39] local-lis/les=47/48 n=1 ec=39/29 lis/c=47/47 les/c/f=48/48/0 sis=62 pruub=14.765382767s) [1,3,4] r=-1 lpr=62 pi=[47,62)/1 crt=31'39 mlcod 0'0 unknown NOTIFY pruub 1236.774658203s@ mbc={}] state: transitioning to Stray
Oct 14 04:13:03 localhost python3[57583]: ansible-ansible.legacy.async_status Invoked with jid=222019460660.57276 mode=status _async_dir=/tmp/.ansible_async
Oct 14 04:13:04 localhost python3[57599]: ansible-file Invoked with path=/var/lib/container-puppet/puppetlabs state=directory setype=svirt_sandbox_file_t selevel=s0 recurse=True force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None
Oct 14 04:13:04 localhost ceph-osd[32440]: osd.3 pg_epoch: 62 pg[7.c( empty local-lis/les=0/0 n=0 ec=39/29 lis/c=47/47 les/c/f=48/48/0 sis=62) [1,3,4] r=1 lpr=62 pi=[47,62)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray
Oct 14 04:13:04 localhost python3[57615]: ansible-stat
Invoked with path=/var/lib/container-puppet/puppetlabs/facter.conf follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 14 04:13:04 localhost ceph-osd[31500]: log_channel(cluster) log [DBG] : 3.1f scrub starts
Oct 14 04:13:04 localhost ceph-osd[31500]: log_channel(cluster) log [DBG] : 3.1f scrub ok
Oct 14 04:13:05 localhost python3[57665]: ansible-ansible.legacy.stat Invoked with path=/var/lib/container-puppet/puppetlabs/facter.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 14 04:13:05 localhost ceph-osd[31500]: osd.0 pg_epoch: 64 pg[7.d( v 31'39 (0'0,31'39] local-lis/les=49/50 n=1 ec=39/29 lis/c=49/49 les/c/f=50/50/0 sis=64 pruub=11.269297600s) [1,3,2] r=-1 lpr=64 pi=[49,64)/1 luod=0'0 crt=31'39 mlcod 0'0 active pruub 1235.336181641s@ mbc={}] start_peering_interval up [4,0,1] -> [1,3,2], acting [4,0,1] -> [1,3,2], acting_primary 4 -> 1, up_primary 4 -> 1, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Oct 14 04:13:05 localhost ceph-osd[31500]: osd.0 pg_epoch: 64 pg[7.d( v 31'39 (0'0,31'39] local-lis/les=49/50 n=1 ec=39/29 lis/c=49/49 les/c/f=50/50/0 sis=64 pruub=11.269159317s) [1,3,2] r=-1 lpr=64 pi=[49,64)/1 crt=31'39 mlcod 0'0 unknown NOTIFY pruub 1235.336181641s@ mbc={}] state: transitioning to Stray
Oct 14 04:13:05 localhost python3[57683]: ansible-ansible.legacy.file Invoked with setype=svirt_sandbox_file_t selevel=s0 dest=/var/lib/container-puppet/puppetlabs/facter.conf _original_basename=tmpkfla1koo recurse=False state=file path=/var/lib/container-puppet/puppetlabs/facter.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None
Oct 14 04:13:05 localhost python3[57713]: ansible-file Invoked with
path=/opt/puppetlabs/facter state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 04:13:06 localhost ceph-osd[32440]: osd.3 pg_epoch: 64 pg[7.d( empty local-lis/les=0/0 n=0 ec=39/29 lis/c=49/49 les/c/f=50/50/0 sis=64) [1,3,2] r=1 lpr=64 pi=[49,64)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray
Oct 14 04:13:06 localhost python3[57816]: ansible-ansible.posix.synchronize Invoked with src=/opt/puppetlabs/ dest=/var/lib/container-puppet/puppetlabs/ _local_rsync_path=rsync _local_rsync_password=NOT_LOGGING_PARAMETER rsync_path=None delete=False _substitute_controller=False archive=True checksum=False compress=True existing_only=False dirs=False copy_links=False set_remote_user=True rsync_timeout=0 rsync_opts=[] ssh_connection_multiplexing=False partial=False verify_host=False mode=push dest_port=None private_key=None recursive=None links=None perms=None times=None owner=None group=None ssh_args=None link_dest=None
Oct 14 04:13:07 localhost ceph-osd[31500]: osd.0 pg_epoch: 66 pg[7.e( v 31'39 (0'0,31'39] local-lis/les=51/52 n=1 ec=39/29 lis/c=51/51 les/c/f=52/52/0 sis=66 pruub=11.050047874s) [3,4,1] r=-1 lpr=66 pi=[51,66)/1 crt=31'39 mlcod 0'0 active pruub 1237.169677734s@ mbc={255={}}] start_peering_interval up [0,1,4] -> [3,4,1], acting [0,1,4] -> [3,4,1], acting_primary 0 -> 3, up_primary 0 -> 3, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Oct 14 04:13:07 localhost ceph-osd[31500]: osd.0 pg_epoch: 66 pg[7.e( v 31'39 (0'0,31'39] local-lis/les=51/52 n=1 ec=39/29 lis/c=51/51 les/c/f=52/52/0 sis=66 pruub=11.049933434s) [3,4,1] r=-1 lpr=66 pi=[51,66)/1 crt=31'39 mlcod 0'0 unknown NOTIFY pruub 1237.169677734s@ mbc={}] state: transitioning to
Stray
Oct 14 04:13:07 localhost ceph-osd[32440]: osd.3 pg_epoch: 66 pg[7.e( empty local-lis/les=0/0 n=0 ec=39/29 lis/c=51/51 les/c/f=52/52/0 sis=66) [3,4,1] r=0 lpr=66 pi=[51,66)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Oct 14 04:13:07 localhost ceph-osd[31500]: log_channel(cluster) log [DBG] : 6.18 scrub starts
Oct 14 04:13:07 localhost ceph-osd[31500]: log_channel(cluster) log [DBG] : 6.18 scrub ok
Oct 14 04:13:07 localhost python3[57835]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 04:13:08 localhost ceph-osd[32440]: osd.3 pg_epoch: 67 pg[7.e( v 31'39 lc 31'19 (0'0,31'39] local-lis/les=66/67 n=1 ec=39/29 lis/c=51/51 les/c/f=52/52/0 sis=66) [3,4,1] r=0 lpr=66 pi=[51,66)/1 crt=31'39 lcod 0'0 mlcod 0'0 active+degraded m=1 mbc={255={(2+1)=1}}] state: react AllReplicasActivated Activating complete
Oct 14 04:13:08 localhost ceph-osd[31500]: log_channel(cluster) log [DBG] : 6.16 scrub starts
Oct 14 04:13:08 localhost ceph-osd[31500]: log_channel(cluster) log [DBG] : 6.16 scrub ok
Oct 14 04:13:08 localhost python3[57867]: ansible-stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 14 04:13:09 localhost python3[57917]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/tripleo-container-shutdown follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 14 04:13:09 localhost ceph-osd[32440]: osd.3 pg_epoch: 68 pg[7.f( v 31'39 (0'0,31'39] local-lis/les=53/54 n=1 ec=39/29 lis/c=53/53 les/c/f=54/54/0 sis=68 pruub=10.661118507s) [0,4,5]
r=-1 lpr=68 pi=[53,68)/1 luod=0'0 crt=31'39 lcod 0'0 mlcod 0'0 active pruub 1234.788208008s@ mbc={}] start_peering_interval up [1,4,3] -> [0,4,5], acting [1,4,3] -> [0,4,5], acting_primary 1 -> 0, up_primary 1 -> 0, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Oct 14 04:13:09 localhost ceph-osd[32440]: osd.3 pg_epoch: 68 pg[7.f( v 31'39 (0'0,31'39] local-lis/les=53/54 n=1 ec=39/29 lis/c=53/53 les/c/f=54/54/0 sis=68 pruub=10.661038399s) [0,4,5] r=-1 lpr=68 pi=[53,68)/1 crt=31'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1234.788208008s@ mbc={}] state: transitioning to Stray
Oct 14 04:13:09 localhost ceph-osd[31500]: osd.0 pg_epoch: 68 pg[7.f( empty local-lis/les=0/0 n=0 ec=39/29 lis/c=53/53 les/c/f=54/54/0 sis=68) [0,4,5] r=0 lpr=68 pi=[53,68)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Oct 14 04:13:09 localhost python3[57935]: ansible-ansible.legacy.file Invoked with mode=0700 owner=root group=root dest=/usr/libexec/tripleo-container-shutdown _original_basename=tripleo-container-shutdown recurse=False state=file path=/usr/libexec/tripleo-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 04:13:10 localhost python3[57997]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/tripleo-start-podman-container follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 14 04:13:10 localhost ceph-osd[31500]: log_channel(cluster) log [DBG] : 3.12 scrub starts
Oct 14 04:13:10 localhost ceph-osd[31500]: osd.0 pg_epoch: 69 pg[7.f( v 31'39 lc 31'1 (0'0,31'39] local-lis/les=68/69 n=1 ec=39/29 lis/c=53/53 les/c/f=54/54/0 sis=68) [0,4,5] r=0 lpr=68 pi=[53,68)/1 crt=31'39 lcod 0'0 mlcod 0'0 active+degraded m=3 mbc={255={(1+2)=3}}] state: react
AllReplicasActivated Activating complete
Oct 14 04:13:10 localhost ceph-osd[31500]: log_channel(cluster) log [DBG] : 3.12 scrub ok
Oct 14 04:13:10 localhost python3[58015]: ansible-ansible.legacy.file Invoked with mode=0700 owner=root group=root dest=/usr/libexec/tripleo-start-podman-container _original_basename=tripleo-start-podman-container recurse=False state=file path=/usr/libexec/tripleo-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 04:13:11 localhost python3[58077]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/tripleo-container-shutdown.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 14 04:13:11 localhost ceph-osd[32440]: log_channel(cluster) log [DBG] : 5.c scrub starts
Oct 14 04:13:11 localhost ceph-osd[32440]: log_channel(cluster) log [DBG] : 5.c scrub ok
Oct 14 04:13:11 localhost python3[58095]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system/tripleo-container-shutdown.service _original_basename=tripleo-container-shutdown-service recurse=False state=file path=/usr/lib/systemd/system/tripleo-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 04:13:11 localhost python3[58157]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 14 04:13:12 localhost python3[58175]: ansible-ansible.legacy.file
Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset _original_basename=91-tripleo-container-shutdown-preset recurse=False state=file path=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 04:13:12 localhost ceph-osd[32440]: log_channel(cluster) log [DBG] : 4.10 deep-scrub starts
Oct 14 04:13:12 localhost ceph-osd[32440]: log_channel(cluster) log [DBG] : 4.10 deep-scrub ok
Oct 14 04:13:12 localhost python3[58205]: ansible-systemd Invoked with name=tripleo-container-shutdown state=started enabled=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 14 04:13:12 localhost systemd[1]: Reloading.
Oct 14 04:13:12 localhost systemd-rc-local-generator[58225]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 04:13:12 localhost systemd-sysv-generator[58230]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 04:13:13 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 04:13:13 localhost systemd[1]: Starting dnf makecache...
Oct 14 04:13:13 localhost dnf[58243]: Updating Subscription Management repositories.
Oct 14 04:13:13 localhost ceph-osd[31500]: log_channel(cluster) log [DBG] : 3.17 scrub starts
Oct 14 04:13:13 localhost ceph-osd[31500]: log_channel(cluster) log [DBG] : 3.17 scrub ok
Oct 14 04:13:13 localhost python3[58292]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/netns-placeholder.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 14 04:13:13 localhost python3[58310]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/usr/lib/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 04:13:14 localhost ceph-osd[32440]: log_channel(cluster) log [DBG] : 5.1d scrub starts
Oct 14 04:13:14 localhost ceph-osd[32440]: log_channel(cluster) log [DBG] : 5.1d scrub ok
Oct 14 04:13:14 localhost ceph-osd[31500]: log_channel(cluster) log [DBG] : 5.a scrub starts
Oct 14 04:13:14 localhost python3[58372]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 14 04:13:14 localhost python3[58390]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/usr/lib/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None
seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 04:13:14 localhost ceph-osd[31500]: log_channel(cluster) log [DBG] : 5.a scrub ok
Oct 14 04:13:15 localhost dnf[58243]: Metadata cache refreshed recently.
Oct 14 04:13:15 localhost python3[58420]: ansible-systemd Invoked with name=netns-placeholder state=started enabled=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 14 04:13:15 localhost systemd[1]: Reloading.
Oct 14 04:13:15 localhost systemd-rc-local-generator[58446]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 04:13:15 localhost systemd-sysv-generator[58450]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 04:13:15 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 04:13:15 localhost systemd[1]: dnf-makecache.service: Deactivated successfully.
Oct 14 04:13:15 localhost systemd[1]: Finished dnf makecache.
Oct 14 04:13:15 localhost systemd[1]: dnf-makecache.service: Consumed 2.233s CPU time.
Oct 14 04:13:15 localhost systemd[1]: Starting Create netns directory...
Oct 14 04:13:15 localhost systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Oct 14 04:13:15 localhost systemd[1]: netns-placeholder.service: Deactivated successfully.
Oct 14 04:13:15 localhost systemd[1]: Finished Create netns directory.
Oct 14 04:13:16 localhost python3[58480]: ansible-container_puppet_config Invoked with update_config_hash_only=True no_archive=True check_mode=False config_vol_prefix=/var/lib/config-data debug=False net_host=True puppet_config= short_hostname= step=6
Oct 14 04:13:17 localhost ceph-osd[32440]: log_channel(cluster) log [DBG] : 5.6 deep-scrub starts
Oct 14 04:13:17 localhost ceph-osd[32440]: log_channel(cluster) log [DBG] : 5.6 deep-scrub ok
Oct 14 04:13:17 localhost python3[58538]: ansible-tripleo_container_manage Invoked with config_id=tripleo_step2 config_dir=/var/lib/tripleo-config/container-startup-config/step_2 config_patterns=*.json config_overrides={} concurrency=5 log_base_path=/var/log/containers/stdouts debug=False
Oct 14 04:13:18 localhost podman[58613]: 2025-10-14 08:13:18.195022144 +0000 UTC m=+0.057236640 container create 2d882b1ac5380c199c3aff4015df0158283c9f5ad4c3e7c158e231d24ef57d7f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud_init_logs, vendor=Red Hat, Inc., architecture=x86_64, config_data={'command': ['/bin/bash', '-c', 'chown -R tss:tss /var/log/swtpm'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1760428406'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'none', 'privileged': True, 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'user': 'root', 'volumes': ['/var/log/containers/libvirt/swtpm:/var/log/swtpm:shared,z']}, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true, config_id=tripleo_step2, vcs-ref=809f31d3cd93a9e04341110fb85686656c754dc0, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-libvirt/images/17.1.9-2, vcs-type=git, name=rhosp17/openstack-nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, batch=17.1_20250721.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1
nova-libvirt, io.openshift.expose-services=, build-date=2025-07-21T14:56:59, distribution-scope=public, release=2, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.buildah.version=1.33.12, container_name=nova_virtqemud_init_logs, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements)
Oct 14 04:13:18 localhost systemd[1]: Started libpod-conmon-2d882b1ac5380c199c3aff4015df0158283c9f5ad4c3e7c158e231d24ef57d7f.scope.
Oct 14 04:13:18 localhost systemd[1]: Started libcrun container.
Oct 14 04:13:18 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/14055c4da6e79e5c651346093268013a82ff4f993f6084d00241e7b8936c8586/merged/var/log/swtpm supports timestamps until 2038 (0x7fffffff)
Oct 14 04:13:18 localhost podman[58619]: 2025-10-14 08:13:18.255012228 +0000 UTC m=+0.101518043 container create 028441b8b364569d2cd3f7d87adb93dd643373f82ad54bd1969c04c9ad47c8a8 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute_init_log, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, build-date=2025-07-21T14:48:37, config_data={'command': ['/bin/bash', '-c', 'chown -R nova:nova /var/log/nova'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1760428406'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'user': 'root', 'volumes': ['/var/log/containers/nova:/var/log/nova:z']}, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step2, io.buildah.version=1.33.12, container_name=nova_compute_init_log, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, version=17.1.9, vendor=Red Hat, Inc., summary=Red Hat OpenStack
Platform 17.1 nova-compute, architecture=x86_64, maintainer=OpenStack TripleO Team, vcs-type=git, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, batch=17.1_20250721.1, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute)
Oct 14 04:13:18 localhost podman[58613]: 2025-10-14 08:13:18.261376706 +0000 UTC m=+0.123591202 container init 2d882b1ac5380c199c3aff4015df0158283c9f5ad4c3e7c158e231d24ef57d7f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud_init_logs, com.redhat.component=openstack-nova-libvirt-container, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-libvirt, container_name=nova_virtqemud_init_logs, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, config_id=tripleo_step2, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, managed_by=tripleo_ansible, config_data={'command': ['/bin/bash', '-c', 'chown -R tss:tss /var/log/swtpm'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1760428406'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'none', 'privileged': True, 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'user': 'root', 'volumes': ['/var/log/containers/libvirt/swtpm:/var/log/swtpm:shared,z']}, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-libvirt/images/17.1.9-2, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2025-07-21T14:56:59, name=rhosp17/openstack-nova-libvirt,
io.buildah.version=1.33.12, batch=17.1_20250721.1, version=17.1.9, vcs-ref=809f31d3cd93a9e04341110fb85686656c754dc0, vendor=Red Hat, Inc., release=2)
Oct 14 04:13:18 localhost podman[58613]: 2025-10-14 08:13:18.169138014 +0000 UTC m=+0.031352520 image pull registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Oct 14 04:13:18 localhost podman[58613]: 2025-10-14 08:13:18.270376667 +0000 UTC m=+0.132591163 container start 2d882b1ac5380c199c3aff4015df0158283c9f5ad4c3e7c158e231d24ef57d7f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud_init_logs, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt, config_data={'command': ['/bin/bash', '-c', 'chown -R tss:tss /var/log/swtpm'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1760428406'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'none', 'privileged': True, 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'user': 'root', 'volumes': ['/var/log/containers/libvirt/swtpm:/var/log/swtpm:shared,z']}, io.buildah.version=1.33.12, build-date=2025-07-21T14:56:59, container_name=nova_virtqemud_init_logs, maintainer=OpenStack TripleO Team, release=2, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=openstack-nova-libvirt-container, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, config_id=tripleo_step2, io.openshift.tags=rhosp osp openstack osp-17.1, architecture=x86_64, vcs-type=git, name=rhosp17/openstack-nova-libvirt, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-libvirt/images/17.1.9-2, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, managed_by=tripleo_ansible, distribution-scope=public, tcib_managed=true, batch=17.1_20250721.1, version=17.1.9, vcs-ref=809f31d3cd93a9e04341110fb85686656c754dc0,
io.openshift.expose-services=) Oct 14 04:13:18 localhost python3[58538]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_virtqemud_init_logs --conmon-pidfile /run/nova_virtqemud_init_logs.pid --detach=True --env TRIPLEO_DEPLOY_IDENTIFIER=1760428406 --label config_id=tripleo_step2 --label container_name=nova_virtqemud_init_logs --label managed_by=tripleo_ansible --label config_data={'command': ['/bin/bash', '-c', 'chown -R tss:tss /var/log/swtpm'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1760428406'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'none', 'privileged': True, 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'user': 'root', 'volumes': ['/var/log/containers/libvirt/swtpm:/var/log/swtpm:shared,z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_virtqemud_init_logs.log --network none --privileged=True --security-opt label=level:s0 --security-opt label=type:spc_t --security-opt label=filetype:container_file_t --user root --volume /var/log/containers/libvirt/swtpm:/var/log/swtpm:shared,z registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 /bin/bash -c chown -R tss:tss /var/log/swtpm Oct 14 04:13:18 localhost systemd[1]: libpod-2d882b1ac5380c199c3aff4015df0158283c9f5ad4c3e7c158e231d24ef57d7f.scope: Deactivated successfully. Oct 14 04:13:18 localhost podman[58619]: 2025-10-14 08:13:18.205067907 +0000 UTC m=+0.051573802 image pull registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1 Oct 14 04:13:18 localhost systemd[1]: Started libpod-conmon-028441b8b364569d2cd3f7d87adb93dd643373f82ad54bd1969c04c9ad47c8a8.scope. 
Oct 14 04:13:18 localhost podman[58646]: 2025-10-14 08:13:18.348001723 +0000 UTC m=+0.061633747 container died 2d882b1ac5380c199c3aff4015df0158283c9f5ad4c3e7c158e231d24ef57d7f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud_init_logs, name=rhosp17/openstack-nova-libvirt, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, architecture=x86_64, build-date=2025-07-21T14:56:59, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, container_name=nova_virtqemud_init_logs, vcs-type=git, batch=17.1_20250721.1, release=2, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-libvirt/images/17.1.9-2, com.redhat.component=openstack-nova-libvirt-container, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, managed_by=tripleo_ansible, config_id=tripleo_step2, vendor=Red Hat, Inc., version=17.1.9, description=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-ref=809f31d3cd93a9e04341110fb85686656c754dc0, config_data={'command': ['/bin/bash', '-c', 'chown -R tss:tss /var/log/swtpm'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1760428406'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'none', 'privileged': True, 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'user': 'root', 'volumes': ['/var/log/containers/libvirt/swtpm:/var/log/swtpm:shared,z']}, distribution-scope=public, maintainer=OpenStack TripleO Team) Oct 14 04:13:18 localhost systemd[1]: Started libcrun container. 
Oct 14 04:13:18 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9b088732740630f4022bff32bb793337565a1d5042b99108823757a338ad7c00/merged/var/log/nova supports timestamps until 2038 (0x7fffffff) Oct 14 04:13:18 localhost podman[58619]: 2025-10-14 08:13:18.373471089 +0000 UTC m=+0.219976904 container init 028441b8b364569d2cd3f7d87adb93dd643373f82ad54bd1969c04c9ad47c8a8 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute_init_log, maintainer=OpenStack TripleO Team, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, distribution-scope=public, managed_by=tripleo_ansible, config_id=tripleo_step2, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, release=1, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, config_data={'command': ['/bin/bash', '-c', 'chown -R nova:nova /var/log/nova'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1760428406'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'user': 'root', 'volumes': ['/var/log/containers/nova:/var/log/nova:z']}, version=17.1.9, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-compute, tcib_managed=true, build-date=2025-07-21T14:48:37, batch=17.1_20250721.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, container_name=nova_compute_init_log, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute) Oct 14 04:13:18 localhost podman[58619]: 2025-10-14 08:13:18.378723483 +0000 UTC m=+0.225229318 container start 028441b8b364569d2cd3f7d87adb93dd643373f82ad54bd1969c04c9ad47c8a8 
(image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute_init_log, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step2, container_name=nova_compute_init_log, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, distribution-scope=public, managed_by=tripleo_ansible, architecture=x86_64, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1, batch=17.1_20250721.1, config_data={'command': ['/bin/bash', '-c', 'chown -R nova:nova /var/log/nova'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1760428406'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'user': 'root', 'volumes': ['/var/log/containers/nova:/var/log/nova:z']}, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.9, release=1, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, tcib_managed=true, build-date=2025-07-21T14:48:37, io.openshift.expose-services=, maintainer=OpenStack TripleO Team) Oct 14 04:13:18 localhost python3[58538]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_compute_init_log --conmon-pidfile /run/nova_compute_init_log.pid --detach=True --env TRIPLEO_DEPLOY_IDENTIFIER=1760428406 --label config_id=tripleo_step2 --label container_name=nova_compute_init_log --label managed_by=tripleo_ansible --label config_data={'command': ['/bin/bash', '-c', 'chown -R nova:nova /var/log/nova'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1760428406'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 
'privileged': False, 'user': 'root', 'volumes': ['/var/log/containers/nova:/var/log/nova:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_compute_init_log.log --network none --privileged=False --user root --volume /var/log/containers/nova:/var/log/nova:z registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1 /bin/bash -c chown -R nova:nova /var/log/nova Oct 14 04:13:18 localhost systemd[1]: libpod-028441b8b364569d2cd3f7d87adb93dd643373f82ad54bd1969c04c9ad47c8a8.scope: Deactivated successfully. Oct 14 04:13:18 localhost podman[58679]: 2025-10-14 08:13:18.433486494 +0000 UTC m=+0.040391073 container died 028441b8b364569d2cd3f7d87adb93dd643373f82ad54bd1969c04c9ad47c8a8 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute_init_log, vendor=Red Hat, Inc., architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1, version=17.1.9, build-date=2025-07-21T14:48:37, container_name=nova_compute_init_log, config_id=tripleo_step2, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, name=rhosp17/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1, config_data={'command': ['/bin/bash', '-c', 'chown -R nova:nova /var/log/nova'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1760428406'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'user': 'root', 'volumes': ['/var/log/containers/nova:/var/log/nova:z']}, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, io.buildah.version=1.33.12, batch=17.1_20250721.1, com.redhat.component=openstack-nova-compute-container, 
com.redhat.license_terms=https://www.redhat.com/agreements, managed_by=tripleo_ansible, tcib_managed=true) Oct 14 04:13:18 localhost podman[58680]: 2025-10-14 08:13:18.469546141 +0000 UTC m=+0.072168216 container cleanup 028441b8b364569d2cd3f7d87adb93dd643373f82ad54bd1969c04c9ad47c8a8 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute_init_log, name=rhosp17/openstack-nova-compute, config_id=tripleo_step2, release=1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.9, config_data={'command': ['/bin/bash', '-c', 'chown -R nova:nova /var/log/nova'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1760428406'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'user': 'root', 'volumes': ['/var/log/containers/nova:/var/log/nova:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true, batch=17.1_20250721.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, io.openshift.expose-services=, architecture=x86_64, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-07-21T14:48:37, distribution-scope=public, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, container_name=nova_compute_init_log, managed_by=tripleo_ansible) Oct 14 04:13:18 localhost systemd[1]: libpod-conmon-028441b8b364569d2cd3f7d87adb93dd643373f82ad54bd1969c04c9ad47c8a8.scope: Deactivated successfully. 
Oct 14 04:13:18 localhost podman[58647]: 2025-10-14 08:13:18.530323879 +0000 UTC m=+0.235732436 container cleanup 2d882b1ac5380c199c3aff4015df0158283c9f5ad4c3e7c158e231d24ef57d7f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud_init_logs, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, version=17.1.9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-libvirt/images/17.1.9-2, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, config_id=tripleo_step2, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, config_data={'command': ['/bin/bash', '-c', 'chown -R tss:tss /var/log/swtpm'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1760428406'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'none', 'privileged': True, 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'user': 'root', 'volumes': ['/var/log/containers/libvirt/swtpm:/var/log/swtpm:shared,z']}, io.buildah.version=1.33.12, com.redhat.component=openstack-nova-libvirt-container, io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T14:56:59, vcs-type=git, managed_by=tripleo_ansible, architecture=x86_64, distribution-scope=public, name=rhosp17/openstack-nova-libvirt, container_name=nova_virtqemud_init_logs, release=2, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, vendor=Red Hat, Inc., vcs-ref=809f31d3cd93a9e04341110fb85686656c754dc0, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, maintainer=OpenStack TripleO Team) Oct 14 04:13:18 localhost systemd[1]: libpod-conmon-2d882b1ac5380c199c3aff4015df0158283c9f5ad4c3e7c158e231d24ef57d7f.scope: Deactivated successfully. 
Oct 14 04:13:18 localhost podman[58798]: 2025-10-14 08:13:18.72172304 +0000 UTC m=+0.053780652 container create 2ad1916047d01d96a60b1f5a470426e33b626a18b065b35f6dd0e1af52e6181e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=create_haproxy_wrapper, release=1, architecture=x86_64, container_name=create_haproxy_wrapper, maintainer=OpenStack TripleO Team, vcs-type=git, config_data={'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::neutron::ovn_metadata_agent_wrappers'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z']}, vendor=Red Hat, Inc., io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-07-21T16:28:53, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.expose-services=, distribution-scope=public, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, version=17.1.9, config_id=tripleo_step2, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, 
io.openshift.tags=rhosp osp openstack osp-17.1, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, name=rhosp17/openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/agreements) Oct 14 04:13:18 localhost systemd[1]: Started libpod-conmon-2ad1916047d01d96a60b1f5a470426e33b626a18b065b35f6dd0e1af52e6181e.scope. Oct 14 04:13:18 localhost systemd[1]: Started libcrun container. Oct 14 04:13:18 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/26c7b014151266d081bb0d73c08f9962548726db348c3c8122e5e1462f16ca73/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Oct 14 04:13:18 localhost podman[58798]: 2025-10-14 08:13:18.787138253 +0000 UTC m=+0.119195865 container init 2ad1916047d01d96a60b1f5a470426e33b626a18b065b35f6dd0e1af52e6181e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=create_haproxy_wrapper, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, managed_by=tripleo_ansible, io.openshift.expose-services=, vendor=Red Hat, Inc., io.buildah.version=1.33.12, build-date=2025-07-21T16:28:53, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::neutron::ovn_metadata_agent_wrappers'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z']}, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, version=17.1.9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step2, architecture=x86_64, container_name=create_haproxy_wrapper, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, release=1) Oct 14 04:13:18 localhost podman[58798]: 2025-10-14 08:13:18.69293412 +0000 UTC m=+0.024991702 image pull registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1 Oct 14 04:13:18 localhost podman[58798]: 2025-10-14 08:13:18.800637715 +0000 UTC m=+0.132695317 container start 2ad1916047d01d96a60b1f5a470426e33b626a18b065b35f6dd0e1af52e6181e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=create_haproxy_wrapper, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, io.openshift.expose-services=, 
vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step2, build-date=2025-07-21T16:28:53, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, maintainer=OpenStack TripleO Team, tcib_managed=true, name=rhosp17/openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_data={'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::neutron::ovn_metadata_agent_wrappers'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z']}, container_name=create_haproxy_wrapper, io.buildah.version=1.33.12, architecture=x86_64, version=17.1.9, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, release=1, io.openshift.tags=rhosp osp openstack osp-17.1, batch=17.1_20250721.1) Oct 14 04:13:18 localhost podman[58798]: 2025-10-14 08:13:18.800981596 +0000 UTC m=+0.133039198 container attach 
2ad1916047d01d96a60b1f5a470426e33b626a18b065b35f6dd0e1af52e6181e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=create_haproxy_wrapper, batch=17.1_20250721.1, build-date=2025-07-21T16:28:53, managed_by=tripleo_ansible, vendor=Red Hat, Inc., config_data={'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::neutron::ovn_metadata_agent_wrappers'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z']}, vcs-type=git, container_name=create_haproxy_wrapper, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-neutron-metadata-agent-ovn, release=1, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, config_id=tripleo_step2, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, version=17.1.9, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.expose-services=, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64) Oct 14 04:13:18 localhost podman[58812]: 2025-10-14 08:13:18.826921186 +0000 UTC m=+0.137311131 container create dcc82f78347f4eb88779691429eefe902968d962a3c0a069fd6e706ce0435a68 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=create_virtlogd_wrapper, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, name=rhosp17/openstack-nova-libvirt, io.openshift.expose-services=, config_data={'cgroupns': 'host', 'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::nova::virtlogd_wrapper'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1760428406'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/container-config-scripts:/var/lib/container-config-scripts:shared,z']}, description=Red Hat OpenStack Platform 17.1 nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, config_id=tripleo_step2, version=17.1.9, io.openshift.tags=rhosp osp openstack 
osp-17.1, managed_by=tripleo_ansible, tcib_managed=true, release=2, architecture=x86_64, io.buildah.version=1.33.12, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-libvirt/images/17.1.9-2, vcs-ref=809f31d3cd93a9e04341110fb85686656c754dc0, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=openstack-nova-libvirt-container, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20250721.1, build-date=2025-07-21T14:56:59, vcs-type=git, container_name=create_virtlogd_wrapper) Oct 14 04:13:18 localhost systemd[1]: Started libpod-conmon-dcc82f78347f4eb88779691429eefe902968d962a3c0a069fd6e706ce0435a68.scope. Oct 14 04:13:18 localhost systemd[1]: Started libcrun container. Oct 14 04:13:18 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b323c7b12cc075908cfc59295640d329582d5902bacc1e83223968c79062b1a1/merged/var/lib/container-config-scripts supports timestamps until 2038 (0x7fffffff) Oct 14 04:13:18 localhost podman[58812]: 2025-10-14 08:13:18.871168378 +0000 UTC m=+0.181558333 container init dcc82f78347f4eb88779691429eefe902968d962a3c0a069fd6e706ce0435a68 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=create_virtlogd_wrapper, container_name=create_virtlogd_wrapper, architecture=x86_64, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhosp17/openstack-nova-libvirt, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, release=2, com.redhat.component=openstack-nova-libvirt-container, distribution-scope=public, vcs-ref=809f31d3cd93a9e04341110fb85686656c754dc0, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, io.buildah.version=1.33.12, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-libvirt/images/17.1.9-2, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, managed_by=tripleo_ansible, version=17.1.9, batch=17.1_20250721.1, build-date=2025-07-21T14:56:59, io.openshift.expose-services=, vcs-type=git, config_data={'cgroupns': 'host', 'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::nova::virtlogd_wrapper'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1760428406'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/container-config-scripts:/var/lib/container-config-scripts:shared,z']}, vendor=Red Hat, Inc., config_id=tripleo_step2) Oct 14 04:13:18 localhost podman[58812]: 2025-10-14 08:13:18.877473515 +0000 UTC m=+0.187863480 container start dcc82f78347f4eb88779691429eefe902968d962a3c0a069fd6e706ce0435a68 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=create_virtlogd_wrapper, build-date=2025-07-21T14:56:59, tcib_managed=true, com.redhat.component=openstack-nova-libvirt-container, description=Red Hat OpenStack Platform 17.1 nova-libvirt, vendor=Red Hat, Inc., vcs-ref=809f31d3cd93a9e04341110fb85686656c754dc0, 
com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, io.openshift.expose-services=, vcs-type=git, name=rhosp17/openstack-nova-libvirt, version=17.1.9, container_name=create_virtlogd_wrapper, batch=17.1_20250721.1, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, release=2, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-libvirt/images/17.1.9-2, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, managed_by=tripleo_ansible, config_data={'cgroupns': 'host', 'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::nova::virtlogd_wrapper'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1760428406'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/container-config-scripts:/var/lib/container-config-scripts:shared,z']}, distribution-scope=public, config_id=tripleo_step2, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team) Oct 14 04:13:18 localhost podman[58812]: 2025-10-14 08:13:18.877738273 +0000 UTC m=+0.188128308 container attach 
dcc82f78347f4eb88779691429eefe902968d962a3c0a069fd6e706ce0435a68 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=create_virtlogd_wrapper, version=17.1.9, io.openshift.tags=rhosp osp openstack osp-17.1, container_name=create_virtlogd_wrapper, batch=17.1_20250721.1, config_data={'cgroupns': 'host', 'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::nova::virtlogd_wrapper'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1760428406'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/container-config-scripts:/var/lib/container-config-scripts:shared,z']}, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-libvirt, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-libvirt, release=2, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-libvirt-container, architecture=x86_64, distribution-scope=public, config_id=tripleo_step2, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-libvirt/images/17.1.9-2, 
com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-07-21T14:56:59, io.openshift.expose-services=, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-ref=809f31d3cd93a9e04341110fb85686656c754dc0) Oct 14 04:13:18 localhost podman[58812]: 2025-10-14 08:13:18.782776286 +0000 UTC m=+0.093166261 image pull registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 Oct 14 04:13:19 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-028441b8b364569d2cd3f7d87adb93dd643373f82ad54bd1969c04c9ad47c8a8-userdata-shm.mount: Deactivated successfully. Oct 14 04:13:19 localhost systemd[1]: var-lib-containers-storage-overlay-14055c4da6e79e5c651346093268013a82ff4f993f6084d00241e7b8936c8586-merged.mount: Deactivated successfully. Oct 14 04:13:19 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-2d882b1ac5380c199c3aff4015df0158283c9f5ad4c3e7c158e231d24ef57d7f-userdata-shm.mount: Deactivated successfully. Oct 14 04:13:20 localhost ceph-osd[31500]: log_channel(cluster) log [DBG] : 5.5 scrub starts Oct 14 04:13:20 localhost ceph-osd[31500]: log_channel(cluster) log [DBG] : 5.5 scrub ok Oct 14 04:13:20 localhost ovs-vsctl[58986]: ovs|00001|db_ctl_base|ERR|unix:/var/run/openvswitch/db.sock: database connection failed (No such file or directory) Oct 14 04:13:21 localhost systemd[1]: libpod-dcc82f78347f4eb88779691429eefe902968d962a3c0a069fd6e706ce0435a68.scope: Deactivated successfully. Oct 14 04:13:21 localhost systemd[1]: libpod-dcc82f78347f4eb88779691429eefe902968d962a3c0a069fd6e706ce0435a68.scope: Consumed 2.274s CPU time. 
Oct 14 04:13:21 localhost podman[58812]: 2025-10-14 08:13:21.149398077 +0000 UTC m=+2.459788112 container died dcc82f78347f4eb88779691429eefe902968d962a3c0a069fd6e706ce0435a68 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=create_virtlogd_wrapper, distribution-scope=public, config_id=tripleo_step2, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-libvirt/images/17.1.9-2, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, version=17.1.9, com.redhat.component=openstack-nova-libvirt-container, architecture=x86_64, vcs-type=git, container_name=create_virtlogd_wrapper, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, io.openshift.expose-services=, batch=17.1_20250721.1, vcs-ref=809f31d3cd93a9e04341110fb85686656c754dc0, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-libvirt, build-date=2025-07-21T14:56:59, tcib_managed=true, release=2, io.buildah.version=1.33.12, config_data={'cgroupns': 'host', 'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::nova::virtlogd_wrapper'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1760428406'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/container-config-scripts:/var/lib/container-config-scripts:shared,z']}, description=Red Hat OpenStack Platform 17.1 nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt) Oct 14 04:13:21 localhost systemd[1]: tmp-crun.rMUu0w.mount: Deactivated successfully. Oct 14 04:13:21 localhost systemd[1]: tmp-crun.JBtuoo.mount: Deactivated successfully. Oct 14 04:13:21 localhost podman[59054]: 2025-10-14 08:13:21.264812743 +0000 UTC m=+0.102589536 container cleanup dcc82f78347f4eb88779691429eefe902968d962a3c0a069fd6e706ce0435a68 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=create_virtlogd_wrapper, vendor=Red Hat, Inc., vcs-type=git, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=create_virtlogd_wrapper, io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T14:56:59, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-libvirt/images/17.1.9-2, io.buildah.version=1.33.12, com.redhat.component=openstack-nova-libvirt-container, name=rhosp17/openstack-nova-libvirt, config_id=tripleo_step2, vcs-ref=809f31d3cd93a9e04341110fb85686656c754dc0, tcib_managed=true, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-libvirt, release=2, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, config_data={'cgroupns': 'host', 'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::nova::virtlogd_wrapper'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1760428406'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/container-config-scripts:/var/lib/container-config-scripts:shared,z']}, maintainer=OpenStack TripleO Team, batch=17.1_20250721.1, version=17.1.9, managed_by=tripleo_ansible, architecture=x86_64) Oct 14 04:13:21 localhost systemd[1]: libpod-conmon-dcc82f78347f4eb88779691429eefe902968d962a3c0a069fd6e706ce0435a68.scope: Deactivated successfully. 
Oct 14 04:13:21 localhost python3[58538]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name create_virtlogd_wrapper --cgroupns=host --conmon-pidfile /run/create_virtlogd_wrapper.pid --detach=False --env TRIPLEO_DEPLOY_IDENTIFIER=1760428406 --label config_id=tripleo_step2 --label container_name=create_virtlogd_wrapper --label managed_by=tripleo_ansible --label config_data={'cgroupns': 'host', 'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::nova::virtlogd_wrapper'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1760428406'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/container-config-scripts:/var/lib/container-config-scripts:shared,z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/create_virtlogd_wrapper.log --network host --pid host --user root --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro 
--volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /var/lib/container-config-scripts:/var/lib/container-config-scripts:shared,z registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 /container_puppet_apply.sh 4 file include ::tripleo::profile::base::nova::virtlogd_wrapper Oct 14 04:13:21 localhost ceph-osd[31500]: log_channel(cluster) log [DBG] : 4.b scrub starts Oct 14 04:13:21 localhost ceph-osd[31500]: log_channel(cluster) log [DBG] : 4.b scrub ok Oct 14 04:13:21 localhost systemd[1]: libpod-2ad1916047d01d96a60b1f5a470426e33b626a18b065b35f6dd0e1af52e6181e.scope: Deactivated successfully. Oct 14 04:13:21 localhost systemd[1]: libpod-2ad1916047d01d96a60b1f5a470426e33b626a18b065b35f6dd0e1af52e6181e.scope: Consumed 2.092s CPU time. 
Oct 14 04:13:21 localhost podman[58798]: 2025-10-14 08:13:21.860983769 +0000 UTC m=+3.193041421 container died 2ad1916047d01d96a60b1f5a470426e33b626a18b065b35f6dd0e1af52e6181e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=create_haproxy_wrapper, release=1, io.openshift.expose-services=, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=create_haproxy_wrapper, batch=17.1_20250721.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, config_id=tripleo_step2, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T16:28:53, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, version=17.1.9, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_data={'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::neutron::ovn_metadata_agent_wrappers'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', 
'/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z']}, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, tcib_managed=true, vcs-type=git) Oct 14 04:13:21 localhost podman[59093]: 2025-10-14 08:13:21.941063891 +0000 UTC m=+0.073809187 container cleanup 2ad1916047d01d96a60b1f5a470426e33b626a18b065b35f6dd0e1af52e6181e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=create_haproxy_wrapper, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, container_name=create_haproxy_wrapper, batch=17.1_20250721.1, managed_by=tripleo_ansible, config_data={'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::neutron::ovn_metadata_agent_wrappers'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', 
'/var/lib/neutron:/var/lib/neutron:shared,z']}, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-07-21T16:28:53, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.expose-services=, release=1, version=17.1.9, maintainer=OpenStack TripleO Team, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1, io.buildah.version=1.33.12, vendor=Red Hat, Inc., config_id=tripleo_step2, tcib_managed=true, name=rhosp17/openstack-neutron-metadata-agent-ovn, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1) Oct 14 04:13:21 localhost systemd[1]: libpod-conmon-2ad1916047d01d96a60b1f5a470426e33b626a18b065b35f6dd0e1af52e6181e.scope: Deactivated successfully. 
Oct 14 04:13:21 localhost python3[58538]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name create_haproxy_wrapper --conmon-pidfile /run/create_haproxy_wrapper.pid --detach=False --label config_id=tripleo_step2 --label container_name=create_haproxy_wrapper --label managed_by=tripleo_ansible --label config_data={'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::neutron::ovn_metadata_agent_wrappers'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/create_haproxy_wrapper.log --network host --pid host --user root --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume 
/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /run/openvswitch:/run/openvswitch:shared,z --volume /var/lib/neutron:/var/lib/neutron:shared,z registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1 /container_puppet_apply.sh 4 file include ::tripleo::profile::base::neutron::ovn_metadata_agent_wrappers Oct 14 04:13:22 localhost systemd[1]: var-lib-containers-storage-overlay-b323c7b12cc075908cfc59295640d329582d5902bacc1e83223968c79062b1a1-merged.mount: Deactivated successfully. Oct 14 04:13:22 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-dcc82f78347f4eb88779691429eefe902968d962a3c0a069fd6e706ce0435a68-userdata-shm.mount: Deactivated successfully. Oct 14 04:13:22 localhost systemd[1]: var-lib-containers-storage-overlay-26c7b014151266d081bb0d73c08f9962548726db348c3c8122e5e1462f16ca73-merged.mount: Deactivated successfully. Oct 14 04:13:22 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-2ad1916047d01d96a60b1f5a470426e33b626a18b065b35f6dd0e1af52e6181e-userdata-shm.mount: Deactivated successfully. 
Oct 14 04:13:22 localhost python3[59147]: ansible-file Invoked with path=/var/lib/container-puppet/container-puppet-tasks2.json state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 04:13:23 localhost ceph-osd[31500]: log_channel(cluster) log [DBG] : 4.16 deep-scrub starts Oct 14 04:13:24 localhost python3[59268]: ansible-container_puppet_config Invoked with check_mode=False config_vol_prefix=/var/lib/config-data debug=True net_host=True no_archive=True puppet_config=/var/lib/container-puppet/container-puppet-tasks2.json short_hostname=np0005486733 step=2 update_config_hash_only=False Oct 14 04:13:24 localhost ceph-osd[31500]: log_channel(cluster) log [DBG] : 4.4 scrub starts Oct 14 04:13:24 localhost ceph-osd[31500]: log_channel(cluster) log [DBG] : 4.4 scrub ok Oct 14 04:13:24 localhost ceph-osd[32440]: log_channel(cluster) log [DBG] : 6.1f scrub starts Oct 14 04:13:24 localhost python3[59284]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 04:13:24 localhost ceph-osd[32440]: log_channel(cluster) log [DBG] : 6.1f scrub ok Oct 14 04:13:25 localhost python3[59300]: ansible-container_config_data Invoked with config_path=/var/lib/tripleo-config/container-puppet-config/step_2 config_pattern=container-puppet-*.json config_overrides={} debug=True Oct 14 04:13:25 localhost ceph-osd[32440]: log_channel(cluster) log [DBG] : 6.c scrub starts Oct 14 
04:13:25 localhost ceph-osd[32440]: log_channel(cluster) log [DBG] : 6.c scrub ok Oct 14 04:13:26 localhost ceph-osd[31500]: log_channel(cluster) log [DBG] : 4.7 deep-scrub starts Oct 14 04:13:26 localhost ceph-osd[31500]: log_channel(cluster) log [DBG] : 4.7 deep-scrub ok Oct 14 04:13:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6. Oct 14 04:13:28 localhost systemd[1]: tmp-crun.QxZyGO.mount: Deactivated successfully. Oct 14 04:13:28 localhost podman[59301]: 2025-10-14 08:13:28.732525087 +0000 UTC m=+0.077340538 container health_status 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc., batch=17.1_20250721.1, io.buildah.version=1.33.12, architecture=x86_64, config_id=tripleo_step1, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, release=1, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, name=rhosp17/openstack-qdrouterd, tcib_managed=true, vcs-type=git, version=17.1.9, build-date=2025-07-21T13:07:59, container_name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, distribution-scope=public, managed_by=tripleo_ansible, io.openshift.expose-services=) Oct 14 04:13:28 localhost podman[59301]: 2025-10-14 08:13:28.947016419 +0000 UTC m=+0.291831830 container exec_died 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.expose-services=, vendor=Red Hat, Inc., batch=17.1_20250721.1, version=17.1.9, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=openstack-qdrouterd-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, maintainer=OpenStack TripleO Team, build-date=2025-07-21T13:07:59, config_id=tripleo_step1, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.buildah.version=1.33.12, container_name=metrics_qdr, vcs-type=git, distribution-scope=public, release=1, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, architecture=x86_64) Oct 14 04:13:28 localhost systemd[1]: 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6.service: Deactivated successfully. 
Oct 14 04:13:32 localhost ceph-osd[32440]: log_channel(cluster) log [DBG] : 6.4 scrub starts Oct 14 04:13:32 localhost ceph-osd[32440]: log_channel(cluster) log [DBG] : 6.4 scrub ok Oct 14 04:13:33 localhost ceph-osd[31500]: log_channel(cluster) log [DBG] : 4.12 scrub starts Oct 14 04:13:33 localhost ceph-osd[31500]: log_channel(cluster) log [DBG] : 4.12 scrub ok Oct 14 04:13:34 localhost ceph-osd[32440]: log_channel(cluster) log [DBG] : 6.f scrub starts Oct 14 04:13:34 localhost ceph-osd[32440]: log_channel(cluster) log [DBG] : 6.f scrub ok Oct 14 04:13:35 localhost ceph-osd[31500]: log_channel(cluster) log [DBG] : 4.1e scrub starts Oct 14 04:13:35 localhost ceph-osd[31500]: log_channel(cluster) log [DBG] : 4.1e scrub ok Oct 14 04:13:37 localhost ceph-osd[32440]: log_channel(cluster) log [DBG] : 3.18 scrub starts Oct 14 04:13:37 localhost ceph-osd[32440]: log_channel(cluster) log [DBG] : 3.18 scrub ok Oct 14 04:13:38 localhost ceph-osd[32440]: log_channel(cluster) log [DBG] : 3.b scrub starts Oct 14 04:13:38 localhost ceph-osd[32440]: log_channel(cluster) log [DBG] : 3.b scrub ok Oct 14 04:13:39 localhost ceph-osd[31500]: log_channel(cluster) log [DBG] : 5.1e scrub starts Oct 14 04:13:39 localhost ceph-osd[31500]: log_channel(cluster) log [DBG] : 5.1e scrub ok Oct 14 04:13:40 localhost ceph-osd[32440]: log_channel(cluster) log [DBG] : 6.b deep-scrub starts Oct 14 04:13:40 localhost ceph-osd[32440]: log_channel(cluster) log [DBG] : 6.b deep-scrub ok Oct 14 04:13:41 localhost ceph-osd[31500]: log_channel(cluster) log [DBG] : 7.4 scrub starts Oct 14 04:13:41 localhost ceph-osd[31500]: log_channel(cluster) log [DBG] : 7.4 scrub ok Oct 14 04:13:41 localhost ceph-osd[32440]: log_channel(cluster) log [DBG] : 3.1e deep-scrub starts Oct 14 04:13:41 localhost ceph-osd[32440]: log_channel(cluster) log [DBG] : 3.1e deep-scrub ok Oct 14 04:13:48 localhost ceph-osd[31500]: log_channel(cluster) log [DBG] : 7.6 scrub starts Oct 14 04:13:48 localhost ceph-osd[31500]: 
log_channel(cluster) log [DBG] : 7.6 scrub ok Oct 14 04:13:48 localhost ceph-osd[32440]: log_channel(cluster) log [DBG] : 6.14 scrub starts Oct 14 04:13:48 localhost ceph-osd[32440]: log_channel(cluster) log [DBG] : 6.14 scrub ok Oct 14 04:13:49 localhost ceph-osd[32440]: log_channel(cluster) log [DBG] : 5.17 scrub starts Oct 14 04:13:49 localhost ceph-osd[32440]: log_channel(cluster) log [DBG] : 5.17 scrub ok Oct 14 04:13:50 localhost ceph-osd[31500]: log_channel(cluster) log [DBG] : 7.9 scrub starts Oct 14 04:13:50 localhost ceph-osd[31500]: log_channel(cluster) log [DBG] : 7.9 scrub ok Oct 14 04:13:51 localhost ceph-osd[32440]: log_channel(cluster) log [DBG] : 5.14 deep-scrub starts Oct 14 04:13:51 localhost ceph-osd[32440]: log_channel(cluster) log [DBG] : 5.14 deep-scrub ok Oct 14 04:13:54 localhost ceph-osd[32440]: log_channel(cluster) log [DBG] : 6.13 scrub starts Oct 14 04:13:54 localhost ceph-osd[32440]: log_channel(cluster) log [DBG] : 6.13 scrub ok Oct 14 04:13:55 localhost ceph-osd[31500]: log_channel(cluster) log [DBG] : 7.f scrub starts Oct 14 04:13:55 localhost ceph-osd[31500]: log_channel(cluster) log [DBG] : 7.f scrub ok Oct 14 04:13:55 localhost ceph-osd[32440]: log_channel(cluster) log [DBG] : 6.6 scrub starts Oct 14 04:13:55 localhost ceph-osd[32440]: log_channel(cluster) log [DBG] : 6.6 scrub ok Oct 14 04:13:56 localhost ceph-osd[32440]: log_channel(cluster) log [DBG] : 6.11 scrub starts Oct 14 04:13:56 localhost ceph-osd[32440]: log_channel(cluster) log [DBG] : 6.11 scrub ok Oct 14 04:13:58 localhost ceph-osd[32440]: log_channel(cluster) log [DBG] : 6.1d scrub starts Oct 14 04:13:58 localhost ceph-osd[32440]: log_channel(cluster) log [DBG] : 6.1d scrub ok Oct 14 04:13:59 localhost ceph-osd[31500]: log_channel(cluster) log [DBG] : 6.0 scrub starts Oct 14 04:13:59 localhost ceph-osd[31500]: log_channel(cluster) log [DBG] : 6.0 scrub ok Oct 14 04:13:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run 
1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6. Oct 14 04:13:59 localhost systemd[1]: tmp-crun.I3k9mT.mount: Deactivated successfully. Oct 14 04:13:59 localhost podman[59330]: 2025-10-14 08:13:59.75734867 +0000 UTC m=+0.098421142 container health_status 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20250721.1, io.buildah.version=1.33.12, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, version=17.1.9, container_name=metrics_qdr, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, name=rhosp17/openstack-qdrouterd, vendor=Red Hat, Inc., managed_by=tripleo_ansible, architecture=x86_64, io.openshift.tags=rhosp 
osp openstack osp-17.1, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-07-21T13:07:59, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, io.openshift.expose-services=, distribution-scope=public, vcs-type=git) Oct 14 04:13:59 localhost podman[59330]: 2025-10-14 08:13:59.956211786 +0000 UTC m=+0.297284248 container exec_died 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vcs-type=git, description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-07-21T13:07:59, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, release=1, com.redhat.component=openstack-qdrouterd-container, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, tcib_managed=true, io.buildah.version=1.33.12, name=rhosp17/openstack-qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, batch=17.1_20250721.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, version=17.1.9, container_name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 
'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']})
Oct 14 04:13:59 localhost systemd[1]: 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6.service: Deactivated successfully.
Oct 14 04:14:01 localhost ceph-osd[31500]: log_channel(cluster) log [DBG] : 3.19 deep-scrub starts
Oct 14 04:14:01 localhost ceph-osd[32440]: log_channel(cluster) log [DBG] : 3.2 scrub starts
Oct 14 04:14:01 localhost ceph-osd[32440]: log_channel(cluster) log [DBG] : 3.2 scrub ok
Oct 14 04:14:03 localhost ceph-osd[31500]: log_channel(cluster) log [DBG] : 6.10 scrub starts
Oct 14 04:14:03 localhost ceph-osd[31500]: log_channel(cluster) log [DBG] : 6.10 scrub ok
Oct 14 04:14:05 localhost ceph-osd[31500]: log_channel(cluster) log [DBG] : 4.16 scrub starts
Oct 14 04:14:05 localhost ceph-osd[31500]: log_channel(cluster) log [DBG] : 4.16 scrub ok
Oct 14 04:14:05 localhost ceph-osd[32440]: log_channel(cluster) log [DBG] : 7.2 scrub starts
Oct 14 04:14:05 localhost ceph-osd[32440]: log_channel(cluster) log [DBG] : 7.2 scrub ok
Oct 14 04:14:06 localhost ceph-osd[31500]: log_channel(cluster) log [DBG] : 3.19 scrub starts
Oct 14 04:14:06 localhost ceph-osd[31500]: log_channel(cluster) log [DBG] : 3.19 scrub ok
Oct 14 04:14:07 localhost ceph-osd[32440]: log_channel(cluster) log [DBG] : 7.3 scrub starts
Oct 14 04:14:07 localhost ceph-osd[32440]: log_channel(cluster) log [DBG] : 7.3 scrub ok
Oct 14 04:14:09 localhost ceph-osd[32440]: log_channel(cluster) log [DBG] : 7.8 scrub starts
Oct 14 04:14:09 localhost ceph-osd[32440]: log_channel(cluster) log [DBG] : 7.8 scrub ok
Oct 14 04:14:11 localhost ceph-osd[32440]: log_channel(cluster) log [DBG] : 7.b scrub starts
Oct 14 04:14:11 localhost ceph-osd[32440]: log_channel(cluster) log [DBG] : 7.b scrub ok
Oct 14 04:14:15 localhost ceph-osd[32440]: log_channel(cluster) log [DBG] : 7.e scrub starts
Oct 14 04:14:15 localhost ceph-osd[32440]: log_channel(cluster) log [DBG] : 7.e scrub ok
Oct 14 04:14:16 localhost ceph-osd[32440]: log_channel(cluster) log [DBG] : 2.4 scrub starts
Oct 14 04:14:16 localhost ceph-osd[32440]: log_channel(cluster) log [DBG] : 2.4 scrub ok
Oct 14 04:14:23 localhost ceph-osd[32440]: log_channel(cluster) log [DBG] : 2.19 scrub starts
Oct 14 04:14:23 localhost ceph-osd[32440]: log_channel(cluster) log [DBG] : 2.19 scrub ok
Oct 14 04:14:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6.
Oct 14 04:14:30 localhost podman[59436]: 2025-10-14 08:14:30.735545374 +0000 UTC m=+0.074522705 container health_status 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, version=17.1.9, maintainer=OpenStack TripleO Team, container_name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, tcib_managed=true, batch=17.1_20250721.1, config_id=tripleo_step1, vendor=Red Hat, Inc., architecture=x86_64, io.openshift.expose-services=, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, 
com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-07-21T13:07:59, name=rhosp17/openstack-qdrouterd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1, managed_by=tripleo_ansible, vcs-type=git) Oct 14 04:14:30 localhost podman[59436]: 2025-10-14 08:14:30.923892454 +0000 UTC m=+0.262869835 container exec_died 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, config_id=tripleo_step1, container_name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, architecture=x86_64, name=rhosp17/openstack-qdrouterd, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team, release=1, distribution-scope=public, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, summary=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.9, build-date=2025-07-21T13:07:59, com.redhat.component=openstack-qdrouterd-container, io.openshift.expose-services=, io.buildah.version=1.33.12, tcib_managed=true) Oct 14 04:14:30 localhost systemd[1]: 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6.service: Deactivated successfully. Oct 14 04:15:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6. Oct 14 04:15:01 localhost systemd[1]: tmp-crun.VY8su4.mount: Deactivated successfully. 
Oct 14 04:15:01 localhost podman[59465]: 2025-10-14 08:15:01.730588907 +0000 UTC m=+0.076338290 container health_status 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, build-date=2025-07-21T13:07:59, io.openshift.expose-services=, distribution-scope=public, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, version=17.1.9, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, summary=Red Hat OpenStack Platform 17.1 qdrouterd, release=1, config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, 
vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, maintainer=OpenStack TripleO Team, batch=17.1_20250721.1, com.redhat.component=openstack-qdrouterd-container, tcib_managed=true, architecture=x86_64, container_name=metrics_qdr) Oct 14 04:15:01 localhost podman[59465]: 2025-10-14 08:15:01.9122026 +0000 UTC m=+0.257951983 container exec_died 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, build-date=2025-07-21T13:07:59, batch=17.1_20250721.1, version=17.1.9, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, io.openshift.tags=rhosp osp openstack osp-17.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.buildah.version=1.33.12, summary=Red 
Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, vendor=Red Hat, Inc., config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, vcs-type=git, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, managed_by=tripleo_ansible, name=rhosp17/openstack-qdrouterd, distribution-scope=public, architecture=x86_64) Oct 14 04:15:01 localhost systemd[1]: 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6.service: Deactivated successfully. Oct 14 04:15:03 localhost podman[59593]: 2025-10-14 08:15:03.110642616 +0000 UTC m=+0.090851079 container exec b19a91314e045cbe5465f6abdeb6b420108fcda7860b498b01dcd4e65236d2c8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-crash-np0005486733, RELEASE=main, release=553, version=7, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, io.openshift.expose-services=, vendor=Red Hat, Inc., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, ceph=True, build-date=2025-09-24T08:57:55, architecture=x86_64, maintainer=Guillaume Abrioux 
, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, io.buildah.version=1.33.12) Oct 14 04:15:03 localhost podman[59593]: 2025-10-14 08:15:03.24927094 +0000 UTC m=+0.229479403 container exec_died b19a91314e045cbe5465f6abdeb6b420108fcda7860b498b01dcd4e65236d2c8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-crash-np0005486733, com.redhat.license_terms=https://www.redhat.com/agreements, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, ceph=True, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, RELEASE=main, description=Red Hat Ceph Storage 7, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, distribution-scope=public, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.buildah.version=1.33.12, name=rhceph, release=553, build-date=2025-09-24T08:57:55, maintainer=Guillaume Abrioux , GIT_BRANCH=main) Oct 14 04:15:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6. Oct 14 04:15:32 localhost systemd[1]: tmp-crun.yDUBd3.mount: Deactivated successfully. 
Oct 14 04:15:32 localhost podman[59738]: 2025-10-14 08:15:32.774068018 +0000 UTC m=+0.112194158 container health_status 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, maintainer=OpenStack TripleO Team, vcs-type=git, com.redhat.component=openstack-qdrouterd-container, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.expose-services=, batch=17.1_20250721.1, vendor=Red Hat, Inc., tcib_managed=true, managed_by=tripleo_ansible, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, 
architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, version=17.1.9, io.buildah.version=1.33.12, release=1, build-date=2025-07-21T13:07:59, config_id=tripleo_step1) Oct 14 04:15:32 localhost podman[59738]: 2025-10-14 08:15:32.96893202 +0000 UTC m=+0.307058180 container exec_died 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1, vendor=Red Hat, Inc., architecture=x86_64, build-date=2025-07-21T13:07:59, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, io.buildah.version=1.33.12, name=rhosp17/openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.expose-services=, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, version=17.1.9, vcs-type=git, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, container_name=metrics_qdr) Oct 14 04:15:32 localhost systemd[1]: 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6.service: Deactivated successfully. Oct 14 04:16:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6. Oct 14 04:16:03 localhost podman[59768]: 2025-10-14 08:16:03.738647086 +0000 UTC m=+0.080721518 container health_status 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, tcib_managed=true, batch=17.1_20250721.1, config_id=tripleo_step1, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, build-date=2025-07-21T13:07:59, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.9, vendor=Red Hat, Inc., io.buildah.version=1.33.12, architecture=x86_64, container_name=metrics_qdr, name=rhosp17/openstack-qdrouterd, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 qdrouterd, release=1) Oct 14 04:16:03 localhost podman[59768]: 2025-10-14 08:16:03.939608357 +0000 UTC m=+0.281682829 container exec_died 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1, version=17.1.9, io.buildah.version=1.33.12, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, managed_by=tripleo_ansible, build-date=2025-07-21T13:07:59, name=rhosp17/openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, 
container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, release=1, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, vendor=Red Hat, Inc., batch=17.1_20250721.1, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, architecture=x86_64) Oct 14 04:16:03 localhost systemd[1]: 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6.service: Deactivated successfully. Oct 14 04:16:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6. 
Oct 14 04:16:34 localhost systemd[1]: tmp-crun.h8bxPI.mount: Deactivated successfully. Oct 14 04:16:34 localhost podman[59874]: 2025-10-14 08:16:34.753135362 +0000 UTC m=+0.091869977 container health_status 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, version=17.1.9, container_name=metrics_qdr, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, release=1, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-07-21T13:07:59, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', 
'/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, architecture=x86_64, io.openshift.expose-services=, managed_by=tripleo_ansible, config_id=tripleo_step1, distribution-scope=public) Oct 14 04:16:34 localhost podman[59874]: 2025-10-14 08:16:34.968813423 +0000 UTC m=+0.307548028 container exec_died 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, version=17.1.9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, managed_by=tripleo_ansible, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, 
summary=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., config_id=tripleo_step1, batch=17.1_20250721.1, build-date=2025-07-21T13:07:59, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, tcib_managed=true, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, container_name=metrics_qdr, release=1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, vcs-type=git) Oct 14 04:16:34 localhost systemd[1]: 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6.service: Deactivated successfully. Oct 14 04:17:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6. Oct 14 04:17:05 localhost systemd[1]: tmp-crun.fzNEAP.mount: Deactivated successfully. 
Oct 14 04:17:05 localhost podman[59904]: 2025-10-14 08:17:05.739216269 +0000 UTC m=+0.078084465 container health_status 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.33.12, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, io.openshift.expose-services=, container_name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20250721.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, build-date=2025-07-21T13:07:59, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, distribution-scope=public, name=rhosp17/openstack-qdrouterd, tcib_managed=true, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, vcs-type=git, config_id=tripleo_step1, com.redhat.component=openstack-qdrouterd-container) Oct 14 04:17:05 localhost podman[59904]: 2025-10-14 08:17:05.954115846 +0000 UTC m=+0.292984052 container exec_died 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.buildah.version=1.33.12, config_id=tripleo_step1, com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, container_name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1, architecture=x86_64, 
io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, version=17.1.9, vcs-type=git, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-qdrouterd, tcib_managed=true, vendor=Red Hat, Inc., batch=17.1_20250721.1, build-date=2025-07-21T13:07:59, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible) Oct 14 04:17:05 localhost systemd[1]: 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6.service: Deactivated successfully. Oct 14 04:17:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6. 
Oct 14 04:17:36 localhost podman[60010]: 2025-10-14 08:17:36.742821704 +0000 UTC m=+0.080791440 container health_status 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, tcib_managed=true, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, release=1, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.33.12, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, version=17.1.9, managed_by=tripleo_ansible, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 
17.1 qdrouterd, build-date=2025-07-21T13:07:59, container_name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, vcs-type=git, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1) Oct 14 04:17:36 localhost podman[60010]: 2025-10-14 08:17:36.931055396 +0000 UTC m=+0.269025092 container exec_died 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-07-21T13:07:59, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, tcib_managed=true, batch=17.1_20250721.1, config_id=tripleo_step1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, version=17.1.9, distribution-scope=public, 
managed_by=tripleo_ansible, vcs-type=git, io.buildah.version=1.33.12, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, com.redhat.component=openstack-qdrouterd-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, container_name=metrics_qdr, release=1, architecture=x86_64, vendor=Red Hat, Inc.) Oct 14 04:17:36 localhost systemd[1]: 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6.service: Deactivated successfully. Oct 14 04:17:53 localhost python3[60086]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/config_step.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Oct 14 04:17:54 localhost python3[60131]: ansible-ansible.legacy.copy Invoked with dest=/etc/puppet/hieradata/config_step.json force=True mode=0600 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1760429873.4231718-99458-102834636835211/source _original_basename=tmpouapecgk follow=False checksum=62439dd24dde40c90e7a39f6a1b31cc6061fe59b backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 04:17:55 localhost python3[60161]: ansible-stat Invoked with path=/var/lib/tripleo-config/container-startup-config/step_3 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Oct 14 04:17:56 localhost ansible-async_wrapper.py[60333]: Invoked with 535604242096 3600 
/home/tripleo-admin/.ansible/tmp/ansible-tmp-1760429876.2356217-99629-144981378054228/AnsiballZ_command.py _ Oct 14 04:17:56 localhost ansible-async_wrapper.py[60336]: Starting module and watcher Oct 14 04:17:56 localhost ansible-async_wrapper.py[60336]: Start watching 60337 (3600) Oct 14 04:17:56 localhost ansible-async_wrapper.py[60337]: Start module (60337) Oct 14 04:17:56 localhost ansible-async_wrapper.py[60333]: Return async_wrapper task started. Oct 14 04:17:57 localhost python3[60357]: ansible-ansible.legacy.async_status Invoked with jid=535604242096.60333 mode=status _async_dir=/tmp/.ansible_async Oct 14 04:18:00 localhost puppet-user[60354]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5 Oct 14 04:18:00 localhost puppet-user[60354]: (file: /etc/puppet/hiera.yaml) Oct 14 04:18:00 localhost puppet-user[60354]: Warning: Undefined variable '::deploy_config_name'; Oct 14 04:18:00 localhost puppet-user[60354]: (file & line not available) Oct 14 04:18:00 localhost puppet-user[60354]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html Oct 14 04:18:00 localhost puppet-user[60354]: (file & line not available) Oct 14 04:18:00 localhost puppet-user[60354]: Warning: Unknown variable: '::deployment_type'. (file: /etc/puppet/modules/tripleo/manifests/profile/base/database/mysql/client.pp, line: 89, column: 8) Oct 14 04:18:00 localhost puppet-user[60354]: Warning: Unknown variable: '::deployment_type'. 
(file: /etc/puppet/modules/tripleo/manifests/packages.pp, line: 39, column: 69) Oct 14 04:18:00 localhost puppet-user[60354]: Notice: Compiled catalog for np0005486733.localdomain in environment production in 0.13 seconds Oct 14 04:18:00 localhost puppet-user[60354]: Notice: Applied catalog in 0.04 seconds Oct 14 04:18:00 localhost puppet-user[60354]: Application: Oct 14 04:18:00 localhost puppet-user[60354]: Initial environment: production Oct 14 04:18:00 localhost puppet-user[60354]: Converged environment: production Oct 14 04:18:00 localhost puppet-user[60354]: Run mode: user Oct 14 04:18:00 localhost puppet-user[60354]: Changes: Oct 14 04:18:00 localhost puppet-user[60354]: Events: Oct 14 04:18:00 localhost puppet-user[60354]: Resources: Oct 14 04:18:00 localhost puppet-user[60354]: Total: 10 Oct 14 04:18:00 localhost puppet-user[60354]: Time: Oct 14 04:18:00 localhost puppet-user[60354]: Schedule: 0.00 Oct 14 04:18:00 localhost puppet-user[60354]: File: 0.00 Oct 14 04:18:00 localhost puppet-user[60354]: Exec: 0.01 Oct 14 04:18:00 localhost puppet-user[60354]: Augeas: 0.01 Oct 14 04:18:00 localhost puppet-user[60354]: Transaction evaluation: 0.03 Oct 14 04:18:00 localhost puppet-user[60354]: Catalog application: 0.04 Oct 14 04:18:00 localhost puppet-user[60354]: Config retrieval: 0.16 Oct 14 04:18:00 localhost puppet-user[60354]: Last run: 1760429880 Oct 14 04:18:00 localhost puppet-user[60354]: Filebucket: 0.00 Oct 14 04:18:00 localhost puppet-user[60354]: Total: 0.04 Oct 14 04:18:00 localhost puppet-user[60354]: Version: Oct 14 04:18:00 localhost puppet-user[60354]: Config: 1760429880 Oct 14 04:18:00 localhost puppet-user[60354]: Puppet: 7.10.0 Oct 14 04:18:00 localhost ansible-async_wrapper.py[60337]: Module complete (60337) Oct 14 04:18:01 localhost ansible-async_wrapper.py[60336]: Done in kid B. Oct 14 04:18:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6. 
Oct 14 04:18:07 localhost podman[60484]: 2025-10-14 08:18:07.336750875 +0000 UTC m=+0.062933281 container health_status 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, io.openshift.expose-services=, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.33.12, version=17.1.9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, vendor=Red Hat, Inc., container_name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, 
com.redhat.component=openstack-qdrouterd-container, maintainer=OpenStack TripleO Team, config_id=tripleo_step1, release=1, build-date=2025-07-21T13:07:59, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, batch=17.1_20250721.1, name=rhosp17/openstack-qdrouterd, managed_by=tripleo_ansible) Oct 14 04:18:07 localhost python3[60483]: ansible-ansible.legacy.async_status Invoked with jid=535604242096.60333 mode=status _async_dir=/tmp/.ansible_async Oct 14 04:18:07 localhost podman[60484]: 2025-10-14 08:18:07.505019302 +0000 UTC m=+0.231201748 container exec_died 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, com.redhat.component=openstack-qdrouterd-container, tcib_managed=true, version=17.1.9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, build-date=2025-07-21T13:07:59, distribution-scope=public, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, name=rhosp17/openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, release=1, io.openshift.tags=rhosp osp openstack osp-17.1, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, managed_by=tripleo_ansible, config_id=tripleo_step1, vcs-type=git) Oct 14 04:18:07 localhost systemd[1]: 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6.service: Deactivated successfully. Oct 14 04:18:08 localhost python3[60568]: ansible-file Invoked with path=/var/lib/container-puppet/puppetlabs state=directory setype=svirt_sandbox_file_t selevel=s0 recurse=True force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None Oct 14 04:18:08 localhost python3[60603]: ansible-stat Invoked with path=/var/lib/container-puppet/puppetlabs/facter.conf follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Oct 14 04:18:09 localhost python3[60667]: ansible-ansible.legacy.stat Invoked with path=/var/lib/container-puppet/puppetlabs/facter.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Oct 14 04:18:09 localhost python3[60686]: ansible-ansible.legacy.file Invoked with setype=svirt_sandbox_file_t selevel=s0 
dest=/var/lib/container-puppet/puppetlabs/facter.conf _original_basename=tmp2oiurdo1 recurse=False state=file path=/var/lib/container-puppet/puppetlabs/facter.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None Oct 14 04:18:09 localhost python3[60716]: ansible-file Invoked with path=/opt/puppetlabs/facter state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 04:18:11 localhost python3[60820]: ansible-ansible.posix.synchronize Invoked with src=/opt/puppetlabs/ dest=/var/lib/container-puppet/puppetlabs/ _local_rsync_path=rsync _local_rsync_password=NOT_LOGGING_PARAMETER rsync_path=None delete=False _substitute_controller=False archive=True checksum=False compress=True existing_only=False dirs=False copy_links=False set_remote_user=True rsync_timeout=0 rsync_opts=[] ssh_connection_multiplexing=False partial=False verify_host=False mode=push dest_port=None private_key=None recursive=None links=None perms=None times=None owner=None group=None ssh_args=None link_dest=None Oct 14 04:18:12 localhost python3[60839]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 04:18:13 localhost python3[60871]: ansible-stat Invoked with 
path=/etc/sysconfig/podman_drop_in follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Oct 14 04:18:14 localhost python3[60921]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/tripleo-container-shutdown follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Oct 14 04:18:14 localhost python3[60939]: ansible-ansible.legacy.file Invoked with mode=0700 owner=root group=root dest=/usr/libexec/tripleo-container-shutdown _original_basename=tripleo-container-shutdown recurse=False state=file path=/usr/libexec/tripleo-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 04:18:14 localhost python3[61001]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/tripleo-start-podman-container follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Oct 14 04:18:15 localhost python3[61019]: ansible-ansible.legacy.file Invoked with mode=0700 owner=root group=root dest=/usr/libexec/tripleo-start-podman-container _original_basename=tripleo-start-podman-container recurse=False state=file path=/usr/libexec/tripleo-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 04:18:15 localhost python3[61081]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/tripleo-container-shutdown.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Oct 14 04:18:16 localhost python3[61099]: ansible-ansible.legacy.file Invoked with 
mode=0644 owner=root group=root dest=/usr/lib/systemd/system/tripleo-container-shutdown.service _original_basename=tripleo-container-shutdown-service recurse=False state=file path=/usr/lib/systemd/system/tripleo-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 04:18:16 localhost python3[61161]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Oct 14 04:18:16 localhost python3[61179]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset _original_basename=91-tripleo-container-shutdown-preset recurse=False state=file path=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 04:18:17 localhost python3[61209]: ansible-systemd Invoked with name=tripleo-container-shutdown state=started enabled=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None Oct 14 04:18:17 localhost systemd[1]: Reloading. Oct 14 04:18:17 localhost systemd-rc-local-generator[61230]: /etc/rc.d/rc.local is not marked executable, skipping. Oct 14 04:18:17 localhost systemd-sysv-generator[61233]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. 
Please update package to include a native systemd unit file, in order to make it more safe and robust. Oct 14 04:18:17 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Oct 14 04:18:18 localhost python3[61295]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/netns-placeholder.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Oct 14 04:18:18 localhost python3[61313]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/usr/lib/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 04:18:19 localhost python3[61375]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Oct 14 04:18:19 localhost python3[61393]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/usr/lib/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 04:18:19 localhost python3[61423]: ansible-systemd Invoked with name=netns-placeholder 
state=started enabled=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None Oct 14 04:18:19 localhost systemd[1]: Reloading. Oct 14 04:18:20 localhost systemd-sysv-generator[61453]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Oct 14 04:18:20 localhost systemd-rc-local-generator[61450]: /etc/rc.d/rc.local is not marked executable, skipping. Oct 14 04:18:20 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Oct 14 04:18:20 localhost systemd[1]: Starting Create netns directory... Oct 14 04:18:20 localhost systemd[1]: run-netns-placeholder.mount: Deactivated successfully. Oct 14 04:18:20 localhost systemd[1]: netns-placeholder.service: Deactivated successfully. Oct 14 04:18:20 localhost systemd[1]: Finished Create netns directory. 
Oct 14 04:18:20 localhost python3[61480]: ansible-container_puppet_config Invoked with update_config_hash_only=True no_archive=True check_mode=False config_vol_prefix=/var/lib/config-data debug=False net_host=True puppet_config= short_hostname= step=6 Oct 14 04:18:22 localhost python3[61538]: ansible-tripleo_container_manage Invoked with config_id=tripleo_step3 config_dir=/var/lib/tripleo-config/container-startup-config/step_3 config_patterns=*.json config_overrides={} concurrency=5 log_base_path=/var/log/containers/stdouts debug=False Oct 14 04:18:23 localhost podman[61700]: 2025-10-14 08:18:23.162607491 +0000 UTC m=+0.069735754 container create 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, name=rhosp17/openstack-collectd, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true, version=17.1.9, config_id=tripleo_step3, release=2, io.buildah.version=1.33.12, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, build-date=2025-07-21T13:04:03, io.openshift.expose-services=, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20250721.1, distribution-scope=public, container_name=collectd, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=openstack-collectd-container, description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc.) 
Oct 14 04:18:23 localhost podman[61706]: 2025-10-14 08:18:23.183249618 +0000 UTC m=+0.085919551 container create b97a8257ba2511805a046403dbba16e8733667e02309c88bf2fb88d0470c2f9b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_statedir_owner, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-07-21T14:48:37, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, tcib_managed=true, name=rhosp17/openstack-nova-compute, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1, batch=17.1_20250721.1, architecture=x86_64, config_data={'command': '/container-config-scripts/pyshim.sh /container-config-scripts/nova_statedir_ownership.py', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': 'triliovault-mounts', 'TRIPLEO_DEPLOY_IDENTIFIER': '1760428406', '__OS_DEBUG': 'true'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/container-config-scripts:/container-config-scripts:z']}, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, config_id=tripleo_step3, vendor=Red Hat, Inc., container_name=nova_statedir_owner, version=17.1.9, maintainer=OpenStack TripleO Team, io.buildah.version=1.33.12, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute) Oct 14 04:18:23 localhost systemd[1]: Started libpod-conmon-0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8.scope. 
Oct 14 04:18:23 localhost podman[61691]: 2025-10-14 08:18:23.203803181 +0000 UTC m=+0.121691530 container create 0a931392ffb1dd9dc1600def18a62aecf9fb9e17a5158a034ff1b31f874cb92d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_init_log, com.redhat.component=openstack-ceilometer-ipmi-container, config_data={'command': ['/bin/bash', '-c', 'chown -R ceilometer:ceilometer /var/log/ceilometer'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'none', 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step3, io.openshift.expose-services=, distribution-scope=public, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, batch=17.1_20250721.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, version=17.1.9, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, io.buildah.version=1.33.12, name=rhosp17/openstack-ceilometer-ipmi, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, release=1, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-07-21T15:29:47, container_name=ceilometer_init_log, io.openshift.tags=rhosp osp openstack osp-17.1) Oct 14 04:18:23 localhost systemd[1]: Started libpod-conmon-b97a8257ba2511805a046403dbba16e8733667e02309c88bf2fb88d0470c2f9b.scope. Oct 14 04:18:23 localhost systemd[1]: Started libcrun container. Oct 14 04:18:23 localhost systemd[1]: Started libcrun container. 
Oct 14 04:18:23 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/edfc0b06a4a796de9d2029c8d0ee2f6200965b91068a8e771289702817852d05/merged/container-config-scripts supports timestamps until 2038 (0x7fffffff) Oct 14 04:18:23 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/edfc0b06a4a796de9d2029c8d0ee2f6200965b91068a8e771289702817852d05/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff) Oct 14 04:18:23 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bf7b7bb08b691d902a723a4644d1ae132381580429bddbb6a2334ee503c366a0/merged/scripts supports timestamps until 2038 (0x7fffffff) Oct 14 04:18:23 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/edfc0b06a4a796de9d2029c8d0ee2f6200965b91068a8e771289702817852d05/merged/var/lib/_nova_secontext supports timestamps until 2038 (0x7fffffff) Oct 14 04:18:23 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bf7b7bb08b691d902a723a4644d1ae132381580429bddbb6a2334ee503c366a0/merged/var/log/collectd supports timestamps until 2038 (0x7fffffff) Oct 14 04:18:23 localhost podman[61726]: 2025-10-14 08:18:23.226097349 +0000 UTC m=+0.110131688 container create 5697944f09f52bb27341b791937b3551397f253570b866c4cb3214e8127000d6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd_wrapper, name=rhosp17/openstack-nova-libvirt, io.openshift.expose-services=, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1, distribution-scope=public, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-libvirt, container_name=nova_virtlogd_wrapper, vcs-ref=809f31d3cd93a9e04341110fb85686656c754dc0, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.display-name=Red Hat OpenStack 
Platform 17.1 nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, batch=17.1_20250721.1, tcib_managed=true, version=17.1.9, release=2, architecture=x86_64, config_id=tripleo_step3, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 0, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtlogd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/container-config-scripts/virtlogd_wrapper:/usr/local/bin/virtlogd_wrapper:ro']}, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-libvirt/images/17.1.9-2, description=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, build-date=2025-07-21T14:56:59) Oct 14 04:18:23 localhost podman[61700]: 2025-10-14 08:18:23.128801603 +0000 UTC m=+0.035929906 image pull registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1 Oct 14 04:18:23 localhost podman[61706]: 2025-10-14 08:18:23.230154285 +0000 UTC m=+0.132824228 container init b97a8257ba2511805a046403dbba16e8733667e02309c88bf2fb88d0470c2f9b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_statedir_owner, version=17.1.9, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true, distribution-scope=public, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, container_name=nova_statedir_owner, build-date=2025-07-21T14:48:37, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, release=1, config_id=tripleo_step3, config_data={'command': '/container-config-scripts/pyshim.sh /container-config-scripts/nova_statedir_ownership.py', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': 'triliovault-mounts', 'TRIPLEO_DEPLOY_IDENTIFIER': '1760428406', '__OS_DEBUG': 'true'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/container-config-scripts:/container-config-scripts:z']}, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack 
Platform 17.1 nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, batch=17.1_20250721.1, architecture=x86_64, maintainer=OpenStack TripleO Team, io.buildah.version=1.33.12) Oct 14 04:18:23 localhost podman[61706]: 2025-10-14 08:18:23.130551918 +0000 UTC m=+0.033221861 image pull registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1 Oct 14 04:18:23 localhost systemd[1]: Started libpod-conmon-0a931392ffb1dd9dc1600def18a62aecf9fb9e17a5158a034ff1b31f874cb92d.scope. Oct 14 04:18:23 localhost podman[61691]: 2025-10-14 08:18:23.135977758 +0000 UTC m=+0.053866137 image pull registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1 Oct 14 04:18:23 localhost podman[61706]: 2025-10-14 08:18:23.238226558 +0000 UTC m=+0.140896511 container start b97a8257ba2511805a046403dbba16e8733667e02309c88bf2fb88d0470c2f9b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_statedir_owner, io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T14:48:37, release=1, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., container_name=nova_statedir_owner, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.9, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20250721.1, config_data={'command': '/container-config-scripts/pyshim.sh /container-config-scripts/nova_statedir_ownership.py', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': 'triliovault-mounts', 'TRIPLEO_DEPLOY_IDENTIFIER': '1760428406', '__OS_DEBUG': 'true'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': 
['/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/container-config-scripts:/container-config-scripts:z']}, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, config_id=tripleo_step3, tcib_managed=true) Oct 14 04:18:23 localhost podman[61706]: 2025-10-14 08:18:23.239824078 +0000 UTC m=+0.142494021 container attach b97a8257ba2511805a046403dbba16e8733667e02309c88bf2fb88d0470c2f9b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_statedir_owner, name=rhosp17/openstack-nova-compute, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, io.buildah.version=1.33.12, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, version=17.1.9, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'command': '/container-config-scripts/pyshim.sh /container-config-scripts/nova_statedir_ownership.py', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': 'triliovault-mounts', 'TRIPLEO_DEPLOY_IDENTIFIER': '1760428406', '__OS_DEBUG': 'true'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', 
'/var/lib/container-config-scripts:/container-config-scripts:z']}, vendor=Red Hat, Inc., container_name=nova_statedir_owner, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T14:48:37, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, release=1, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step3) Oct 14 04:18:23 localhost systemd[1]: Started libcrun container. Oct 14 04:18:23 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9d54c722c2284b3655acb982a0155c0f83e97b8ed3151318b123cb4230b9dfcf/merged/var/log/ceilometer supports timestamps until 2038 (0x7fffffff) Oct 14 04:18:23 localhost podman[61691]: 2025-10-14 08:18:23.251202085 +0000 UTC m=+0.169090444 container init 0a931392ffb1dd9dc1600def18a62aecf9fb9e17a5158a034ff1b31f874cb92d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_init_log, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-07-21T15:29:47, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, batch=17.1_20250721.1, version=17.1.9, io.buildah.version=1.33.12, config_data={'command': ['/bin/bash', '-c', 'chown -R ceilometer:ceilometer /var/log/ceilometer'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'none', 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp17/openstack-ceilometer-ipmi, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, container_name=ceilometer_init_log, 
architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1, distribution-scope=public, config_id=tripleo_step3, vendor=Red Hat, Inc., vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, release=1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements) Oct 14 04:18:23 localhost podman[61715]: 2025-10-14 08:18:23.156340225 +0000 UTC m=+0.049464010 image pull registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1 Oct 14 04:18:23 localhost podman[61726]: 2025-10-14 08:18:23.156959694 +0000 UTC m=+0.040994043 image pull registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 Oct 14 04:18:23 localhost podman[61691]: 2025-10-14 08:18:23.257255614 +0000 UTC m=+0.175143973 container start 0a931392ffb1dd9dc1600def18a62aecf9fb9e17a5158a034ff1b31f874cb92d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_init_log, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1, distribution-scope=public, batch=17.1_20250721.1, config_id=tripleo_step3, vcs-type=git, tcib_managed=true, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.33.12, managed_by=tripleo_ansible, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, config_data={'command': ['/bin/bash', '-c', 'chown -R ceilometer:ceilometer /var/log/ceilometer'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'none', 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, container_name=ceilometer_init_log, build-date=2025-07-21T15:29:47, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, release=1, 
summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, version=17.1.9) Oct 14 04:18:23 localhost python3[61538]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name ceilometer_init_log --conmon-pidfile /run/ceilometer_init_log.pid --detach=True --label config_id=tripleo_step3 --label container_name=ceilometer_init_log --label managed_by=tripleo_ansible --label config_data={'command': ['/bin/bash', '-c', 'chown -R ceilometer:ceilometer /var/log/ceilometer'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'none', 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/ceilometer:/var/log/ceilometer:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/ceilometer_init_log.log --network none --user root --volume /var/log/containers/ceilometer:/var/log/ceilometer:z registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1 /bin/bash -c chown -R ceilometer:ceilometer /var/log/ceilometer Oct 14 04:18:23 localhost systemd[1]: libpod-0a931392ffb1dd9dc1600def18a62aecf9fb9e17a5158a034ff1b31f874cb92d.scope: Deactivated successfully. Oct 14 04:18:23 localhost systemd[1]: libpod-b97a8257ba2511805a046403dbba16e8733667e02309c88bf2fb88d0470c2f9b.scope: Deactivated successfully. 
Oct 14 04:18:23 localhost podman[61715]: 2025-10-14 08:18:23.277558289 +0000 UTC m=+0.170682034 container create b9eb3b21ea34563206e7df64a670b1c29ec5b7333233a277ed5563ce6b2a51a0 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, vcs-ref=38a223d7b691af709e0a5f628409462e34eea167, release=1, summary=Red Hat OpenStack Platform 17.1 rsyslog, distribution-scope=public, io.buildah.version=1.33.12, batch=17.1_20250721.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 rsyslog, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhosp17/openstack-rsyslog, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-rsyslog/images/17.1.9-1, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1, io.openshift.expose-services=, vcs-type=git, version=17.1.9, build-date=2025-07-21T12:58:40, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dda1083e68f30de2da9a23107b96824d'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, container_name=rsyslog, vendor=Red Hat, Inc., com.redhat.component=openstack-rsyslog-container, tcib_managed=true) Oct 14 04:18:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8. Oct 14 04:18:23 localhost podman[61700]: 2025-10-14 08:18:23.300420085 +0000 UTC m=+0.207548388 container init 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, managed_by=tripleo_ansible, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, config_id=tripleo_step3, batch=17.1_20250721.1, build-date=2025-07-21T13:04:03, io.buildah.version=1.33.12, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, release=2, com.redhat.component=openstack-collectd-container, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, tcib_managed=true, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 collectd, version=17.1.9, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, container_name=collectd, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 collectd) Oct 14 04:18:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8. 
Oct 14 04:18:23 localhost podman[61781]: 2025-10-14 08:18:23.318662586 +0000 UTC m=+0.047065584 container died 0a931392ffb1dd9dc1600def18a62aecf9fb9e17a5158a034ff1b31f874cb92d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_init_log, release=1, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., version=17.1.9, tcib_managed=true, managed_by=tripleo_ansible, container_name=ceilometer_init_log, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'command': ['/bin/bash', '-c', 'chown -R ceilometer:ceilometer /var/log/ceilometer'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'none', 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, batch=17.1_20250721.1, com.redhat.component=openstack-ceilometer-ipmi-container, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, build-date=2025-07-21T15:29:47, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, config_id=tripleo_step3) Oct 14 04:18:23 localhost systemd-logind[760]: Existing logind session ID 28 used by new audit session, ignoring. 
Oct 14 04:18:23 localhost podman[61706]: 2025-10-14 08:18:23.329466394 +0000 UTC m=+0.232136347 container died b97a8257ba2511805a046403dbba16e8733667e02309c88bf2fb88d0470c2f9b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_statedir_owner, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, version=17.1.9, io.buildah.version=1.33.12, config_data={'command': '/container-config-scripts/pyshim.sh /container-config-scripts/nova_statedir_ownership.py', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': 'triliovault-mounts', 'TRIPLEO_DEPLOY_IDENTIFIER': '1760428406', '__OS_DEBUG': 'true'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/container-config-scripts:/container-config-scripts:z']}, distribution-scope=public, build-date=2025-07-21T14:48:37, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, vendor=Red Hat, Inc., managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_statedir_owner, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, architecture=x86_64) Oct 14 04:18:23 localhost systemd[1]: Created slice User Slice of UID 0. 
Oct 14 04:18:23 localhost systemd[1]: Starting User Runtime Directory /run/user/0... Oct 14 04:18:23 localhost systemd[1]: Finished User Runtime Directory /run/user/0. Oct 14 04:18:23 localhost systemd[1]: Starting User Manager for UID 0... Oct 14 04:18:23 localhost podman[61798]: 2025-10-14 08:18:23.394215941 +0000 UTC m=+0.107277119 container cleanup b97a8257ba2511805a046403dbba16e8733667e02309c88bf2fb88d0470c2f9b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_statedir_owner, io.openshift.expose-services=, version=17.1.9, config_data={'command': '/container-config-scripts/pyshim.sh /container-config-scripts/nova_statedir_ownership.py', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': 'triliovault-mounts', 'TRIPLEO_DEPLOY_IDENTIFIER': '1760428406', '__OS_DEBUG': 'true'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/container-config-scripts:/container-config-scripts:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, vendor=Red Hat, Inc., batch=17.1_20250721.1, tcib_managed=true, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, release=1, build-date=2025-07-21T14:48:37, config_id=tripleo_step3, container_name=nova_statedir_owner, 
io.buildah.version=1.33.12, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, architecture=x86_64, vcs-type=git) Oct 14 04:18:23 localhost systemd[1]: libpod-conmon-b97a8257ba2511805a046403dbba16e8733667e02309c88bf2fb88d0470c2f9b.scope: Deactivated successfully. Oct 14 04:18:23 localhost python3[61538]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_statedir_owner --conmon-pidfile /run/nova_statedir_owner.pid --detach=False --env NOVA_STATEDIR_OWNERSHIP_SKIP=triliovault-mounts --env TRIPLEO_DEPLOY_IDENTIFIER=1760428406 --env __OS_DEBUG=true --label config_id=tripleo_step3 --label container_name=nova_statedir_owner --label managed_by=tripleo_ansible --label config_data={'command': '/container-config-scripts/pyshim.sh /container-config-scripts/nova_statedir_ownership.py', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': 'triliovault-mounts', 'TRIPLEO_DEPLOY_IDENTIFIER': '1760428406', '__OS_DEBUG': 'true'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/container-config-scripts:/container-config-scripts:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_statedir_owner.log --network none --privileged=False --security-opt label=disable --user root --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z --volume /var/lib/container-config-scripts:/container-config-scripts:z registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1 /container-config-scripts/pyshim.sh /container-config-scripts/nova_statedir_ownership.py Oct 14 04:18:23 localhost systemd[1]: Started libpod-conmon-b9eb3b21ea34563206e7df64a670b1c29ec5b7333233a277ed5563ce6b2a51a0.scope. 
Oct 14 04:18:23 localhost podman[61700]: 2025-10-14 08:18:23.423102525 +0000 UTC m=+0.330230798 container start 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, release=2, tcib_managed=true, version=17.1.9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, com.redhat.component=openstack-collectd-container, architecture=x86_64, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, vendor=Red Hat, Inc., config_id=tripleo_step3, maintainer=OpenStack TripleO Team, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-07-21T13:04:03, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, batch=17.1_20250721.1, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, name=rhosp17/openstack-collectd) Oct 14 04:18:23 localhost systemd[1]: Started libcrun container. Oct 14 04:18:23 localhost python3[61538]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name collectd --cap-add IPC_LOCK --conmon-pidfile /run/collectd.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=d31718fcd17fdeee6489534105191c7a --healthcheck-command /openstack/healthcheck --label config_id=tripleo_step3 --label container_name=collectd --label managed_by=tripleo_ansible --label config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/collectd.log --memory 512m --network host --pid host --user root --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro --volume /var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro --volume /var/log/containers/collectd:/var/log/collectd:rw,z --volume /var/lib/container-config-scripts:/config-scripts:ro --volume /var/lib/container-user-scripts:/scripts:z --volume /run:/run:rw --volume /sys/fs/cgroup:/sys/fs/cgroup:ro registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1 Oct 14 04:18:23 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3f2142f0cf6c862446ab1f2dfb59e029b67fc4461f58078ef3c91fbbb00953f4/merged/var/log/rsyslog supports timestamps until 2038 (0x7fffffff) Oct 14 04:18:23 localhost kernel: xfs filesystem being remounted at 
/var/lib/containers/storage/overlay/3f2142f0cf6c862446ab1f2dfb59e029b67fc4461f58078ef3c91fbbb00953f4/merged/var/lib/rsyslog supports timestamps until 2038 (0x7fffffff) Oct 14 04:18:23 localhost podman[61823]: 2025-10-14 08:18:23.463554012 +0000 UTC m=+0.129769824 container health_status 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=starting, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, release=2, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, version=17.1.9, vendor=Red Hat, Inc., config_id=tripleo_step3, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T13:04:03, container_name=collectd, io.openshift.expose-services=, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, distribution-scope=public, managed_by=tripleo_ansible, tcib_managed=true, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, com.redhat.component=openstack-collectd-container, name=rhosp17/openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd) Oct 14 04:18:23 localhost podman[61783]: 2025-10-14 08:18:23.483114844 +0000 UTC m=+0.213971679 container cleanup 0a931392ffb1dd9dc1600def18a62aecf9fb9e17a5158a034ff1b31f874cb92d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_init_log, batch=17.1_20250721.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, version=17.1.9, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-ipmi-container, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true, io.buildah.version=1.33.12, vendor=Red Hat, Inc., architecture=x86_64, build-date=2025-07-21T15:29:47, container_name=ceilometer_init_log, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, config_data={'command': ['/bin/bash', '-c', 'chown -R ceilometer:ceilometer /var/log/ceilometer'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'none', 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, managed_by=tripleo_ansible, 
com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git) Oct 14 04:18:23 localhost systemd[61836]: Queued start job for default target Main User Target. Oct 14 04:18:23 localhost systemd[1]: libpod-conmon-0a931392ffb1dd9dc1600def18a62aecf9fb9e17a5158a034ff1b31f874cb92d.scope: Deactivated successfully. Oct 14 04:18:23 localhost systemd[61836]: Created slice User Application Slice. Oct 14 04:18:23 localhost systemd[61836]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system). Oct 14 04:18:23 localhost systemd[61836]: Started Daily Cleanup of User's Temporary Directories. Oct 14 04:18:23 localhost systemd[61836]: Reached target Paths. Oct 14 04:18:23 localhost systemd[61836]: Reached target Timers. Oct 14 04:18:23 localhost systemd[61836]: Starting D-Bus User Message Bus Socket... Oct 14 04:18:23 localhost systemd[61836]: Starting Create User's Volatile Files and Directories... 
Oct 14 04:18:23 localhost podman[61715]: 2025-10-14 08:18:23.495918534 +0000 UTC m=+0.389042309 container init b9eb3b21ea34563206e7df64a670b1c29ec5b7333233a277ed5563ce6b2a51a0 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 rsyslog, io.buildah.version=1.33.12, com.redhat.component=openstack-rsyslog-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, architecture=x86_64, version=17.1.9, tcib_managed=true, vcs-ref=38a223d7b691af709e0a5f628409462e34eea167, build-date=2025-07-21T12:58:40, description=Red Hat OpenStack Platform 17.1 rsyslog, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, name=rhosp17/openstack-rsyslog, managed_by=tripleo_ansible, batch=17.1_20250721.1, container_name=rsyslog, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-rsyslog/images/17.1.9-1, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, release=1, vcs-type=git, distribution-scope=public, config_id=tripleo_step3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dda1083e68f30de2da9a23107b96824d'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}) Oct 14 04:18:23 localhost podman[61715]: 2025-10-14 08:18:23.506485285 +0000 UTC m=+0.399609060 container start b9eb3b21ea34563206e7df64a670b1c29ec5b7333233a277ed5563ce6b2a51a0 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, batch=17.1_20250721.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-rsyslog/images/17.1.9-1, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, maintainer=OpenStack TripleO Team, release=1, config_id=tripleo_step3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dda1083e68f30de2da9a23107b96824d'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', 
'/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true, vcs-ref=38a223d7b691af709e0a5f628409462e34eea167, architecture=x86_64, container_name=rsyslog, summary=Red Hat OpenStack Platform 17.1 rsyslog, description=Red Hat OpenStack Platform 17.1 rsyslog, managed_by=tripleo_ansible, distribution-scope=public, version=17.1.9, build-date=2025-07-21T12:58:40, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhosp17/openstack-rsyslog, io.buildah.version=1.33.12, vendor=Red Hat, Inc., com.redhat.component=openstack-rsyslog-container) Oct 14 04:18:23 localhost systemd[61836]: Listening on D-Bus User Message Bus Socket. Oct 14 04:18:23 localhost systemd[1]: Started libpod-conmon-5697944f09f52bb27341b791937b3551397f253570b866c4cb3214e8127000d6.scope. Oct 14 04:18:23 localhost systemd[61836]: Reached target Sockets. Oct 14 04:18:23 localhost python3[61538]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name rsyslog --conmon-pidfile /run/rsyslog.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=dda1083e68f30de2da9a23107b96824d --label config_id=tripleo_step3 --label container_name=rsyslog --label managed_by=tripleo_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dda1083e68f30de2da9a23107b96824d'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/rsyslog.log --network host --privileged=True --security-opt label=disable --user root --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro --volume /var/log/containers:/var/log/containers:ro --volume /var/log/containers/rsyslog:/var/log/rsyslog:rw,z --volume /var/log:/var/log/host:ro --volume /var/lib/rsyslog.container:/var/lib/rsyslog:rw,z registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1 Oct 14 04:18:23 localhost systemd[61836]: Finished Create User's Volatile Files and Directories. Oct 14 04:18:23 localhost systemd[61836]: Reached target Basic System. Oct 14 04:18:23 localhost systemd[61836]: Reached target Main User Target. Oct 14 04:18:23 localhost systemd[61836]: Startup finished in 126ms. 
Oct 14 04:18:23 localhost systemd[1]: Started User Manager for UID 0. Oct 14 04:18:23 localhost systemd[1]: Started Session c1 of User root. Oct 14 04:18:23 localhost systemd[1]: Started libcrun container. Oct 14 04:18:23 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/082ddc67d34506bb0770407c13ba7bb6b21421c42c2e601c85c5744cdf0ac287/merged/etc/libvirt supports timestamps until 2038 (0x7fffffff) Oct 14 04:18:23 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/082ddc67d34506bb0770407c13ba7bb6b21421c42c2e601c85c5744cdf0ac287/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff) Oct 14 04:18:23 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/082ddc67d34506bb0770407c13ba7bb6b21421c42c2e601c85c5744cdf0ac287/merged/var/cache/libvirt supports timestamps until 2038 (0x7fffffff) Oct 14 04:18:23 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/082ddc67d34506bb0770407c13ba7bb6b21421c42c2e601c85c5744cdf0ac287/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff) Oct 14 04:18:23 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/082ddc67d34506bb0770407c13ba7bb6b21421c42c2e601c85c5744cdf0ac287/merged/var/lib/vhost_sockets supports timestamps until 2038 (0x7fffffff) Oct 14 04:18:23 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/082ddc67d34506bb0770407c13ba7bb6b21421c42c2e601c85c5744cdf0ac287/merged/var/log/libvirt supports timestamps until 2038 (0x7fffffff) Oct 14 04:18:23 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/082ddc67d34506bb0770407c13ba7bb6b21421c42c2e601c85c5744cdf0ac287/merged/var/lib/kolla/config_files/src-ceph supports timestamps until 2038 (0x7fffffff) Oct 14 04:18:23 localhost podman[61726]: 2025-10-14 08:18:23.53410034 +0000 UTC m=+0.418134679 container init 
5697944f09f52bb27341b791937b3551397f253570b866c4cb3214e8127000d6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd_wrapper, io.openshift.tags=rhosp osp openstack osp-17.1, batch=17.1_20250721.1, vcs-type=git, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, container_name=nova_virtlogd_wrapper, io.openshift.expose-services=, version=17.1.9, name=rhosp17/openstack-nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, vendor=Red Hat, Inc., build-date=2025-07-21T14:56:59, release=2, tcib_managed=true, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, io.buildah.version=1.33.12, description=Red Hat OpenStack Platform 17.1 nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-ref=809f31d3cd93a9e04341110fb85686656c754dc0, architecture=x86_64, config_id=tripleo_step3, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 0, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', 
'/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtlogd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/container-config-scripts/virtlogd_wrapper:/usr/local/bin/virtlogd_wrapper:ro']}, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-libvirt/images/17.1.9-2) Oct 14 04:18:23 localhost podman[61726]: 2025-10-14 08:18:23.544824336 +0000 UTC m=+0.428858665 container start 5697944f09f52bb27341b791937b3551397f253570b866c4cb3214e8127000d6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd_wrapper, container_name=nova_virtlogd_wrapper, io.openshift.tags=rhosp osp openstack osp-17.1, batch=17.1_20250721.1, vendor=Red Hat, Inc., build-date=2025-07-21T14:56:59, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.license_terms=https://www.redhat.com/agreements, version=17.1.9, config_id=tripleo_step3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-libvirt/images/17.1.9-2, name=rhosp17/openstack-nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, io.buildah.version=1.33.12, tcib_managed=true, distribution-scope=public, release=2, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, description=Red Hat OpenStack Platform 17.1 
nova-libvirt, io.openshift.expose-services=, vcs-type=git, maintainer=OpenStack TripleO Team, vcs-ref=809f31d3cd93a9e04341110fb85686656c754dc0, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 0, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtlogd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/container-config-scripts/virtlogd_wrapper:/usr/local/bin/virtlogd_wrapper:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 
nova-libvirt) Oct 14 04:18:23 localhost python3[61538]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_virtlogd_wrapper --cgroupns=host --conmon-pidfile /run/nova_virtlogd_wrapper.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=4d186a6228facd5bcddf9bcc145eb470 --label config_id=tripleo_step3 --label container_name=nova_virtlogd_wrapper --label managed_by=tripleo_ansible --label config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 0, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtlogd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/container-config-scripts/virtlogd_wrapper:/usr/local/bin/virtlogd_wrapper:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_virtlogd_wrapper.log --network host --pid host --privileged=True --security-opt label=level:s0 --security-opt label=type:spc_t --security-opt label=filetype:container_file_t --ulimit nofile=131072 --ulimit nproc=126960 --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/log/containers/libvirt:/var/log/libvirt:shared,z --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /run:/run --volume /sys/fs/cgroup:/sys/fs/cgroup --volume /sys/fs/selinux:/sys/fs/selinux --volume /etc/selinux/config:/etc/selinux/config:ro --volume /etc/libvirt:/etc/libvirt:shared --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro --volume /run/libvirt:/run/libvirt:shared,z --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/libvirt:/var/lib/libvirt:shared --volume /var/cache/libvirt:/var/cache/libvirt:shared --volume /var/lib/vhost_sockets:/var/lib/vhost_sockets --volume /var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro --volume /var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z --volume /var/lib/kolla/config_files/nova_virtlogd.json:/var/lib/kolla/config_files/config.json:ro --volume 
/var/lib/container-config-scripts/virtlogd_wrapper:/usr/local/bin/virtlogd_wrapper:ro registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 Oct 14 04:18:23 localhost systemd-logind[760]: Existing logind session ID 28 used by new audit session, ignoring. Oct 14 04:18:23 localhost podman[61823]: 2025-10-14 08:18:23.575993481 +0000 UTC m=+0.242209313 container exec_died 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, name=rhosp17/openstack-collectd, managed_by=tripleo_ansible, distribution-scope=public, batch=17.1_20250721.1, version=17.1.9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, summary=Red Hat OpenStack Platform 17.1 collectd, release=2, tcib_managed=true, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', 
'/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.component=openstack-collectd-container, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, build-date=2025-07-21T13:04:03, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, config_id=tripleo_step3, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b) Oct 14 04:18:23 localhost podman[61823]: unhealthy Oct 14 04:18:23 localhost systemd[1]: Started Session c2 of User root. Oct 14 04:18:23 localhost systemd[1]: 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8.service: Main process exited, code=exited, status=1/FAILURE Oct 14 04:18:23 localhost systemd[1]: 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8.service: Failed with result 'exit-code'. Oct 14 04:18:23 localhost systemd[1]: libpod-b9eb3b21ea34563206e7df64a670b1c29ec5b7333233a277ed5563ce6b2a51a0.scope: Deactivated successfully. Oct 14 04:18:23 localhost systemd[1]: session-c1.scope: Deactivated successfully. 
Oct 14 04:18:23 localhost podman[61944]: 2025-10-14 08:18:23.663036306 +0000 UTC m=+0.049759589 container died b9eb3b21ea34563206e7df64a670b1c29ec5b7333233a277ed5563ce6b2a51a0 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.component=openstack-rsyslog-container, summary=Red Hat OpenStack Platform 17.1 rsyslog, batch=17.1_20250721.1, io.buildah.version=1.33.12, vcs-type=git, architecture=x86_64, build-date=2025-07-21T12:58:40, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 rsyslog, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dda1083e68f30de2da9a23107b96824d'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, container_name=rsyslog, managed_by=tripleo_ansible, vcs-ref=38a223d7b691af709e0a5f628409462e34eea167, 
com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-rsyslog/images/17.1.9-1, maintainer=OpenStack TripleO Team, distribution-scope=public, version=17.1.9, name=rhosp17/openstack-rsyslog, release=1, vendor=Red Hat, Inc.) Oct 14 04:18:23 localhost systemd[1]: session-c2.scope: Deactivated successfully. Oct 14 04:18:23 localhost podman[61944]: 2025-10-14 08:18:23.696154173 +0000 UTC m=+0.082877446 container cleanup b9eb3b21ea34563206e7df64a670b1c29ec5b7333233a277ed5563ce6b2a51a0 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, build-date=2025-07-21T12:58:40, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team, vcs-type=git, vendor=Red Hat, Inc., batch=17.1_20250721.1, io.buildah.version=1.33.12, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, managed_by=tripleo_ansible, container_name=rsyslog, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-rsyslog/images/17.1.9-1, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 rsyslog, release=1, io.openshift.expose-services=, vcs-ref=38a223d7b691af709e0a5f628409462e34eea167, description=Red Hat OpenStack Platform 17.1 rsyslog, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, name=rhosp17/openstack-rsyslog, config_id=tripleo_step3, com.redhat.component=openstack-rsyslog-container, version=17.1.9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dda1083e68f30de2da9a23107b96824d'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}) Oct 14 04:18:23 localhost systemd[1]: libpod-conmon-b9eb3b21ea34563206e7df64a670b1c29ec5b7333233a277ed5563ce6b2a51a0.scope: Deactivated successfully. Oct 14 04:18:23 localhost podman[62071]: 2025-10-14 08:18:23.911821723 +0000 UTC m=+0.074805352 container create e7d667bdb252c6a015719de25a2bb01984cf600c61bb68939738f913fb3ae534 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd, name=rhosp17/openstack-nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-ref=809f31d3cd93a9e04341110fb85686656c754dc0, com.redhat.license_terms=https://www.redhat.com/agreements, tcib_managed=true, version=17.1.9, vendor=Red Hat, Inc., distribution-scope=public, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-libvirt-container, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-type=git, release=2, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt, architecture=x86_64, build-date=2025-07-21T14:56:59, batch=17.1_20250721.1, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-libvirt/images/17.1.9-2) Oct 14 04:18:23 localhost systemd[1]: Started libpod-conmon-e7d667bdb252c6a015719de25a2bb01984cf600c61bb68939738f913fb3ae534.scope. Oct 14 04:18:23 localhost podman[62071]: 2025-10-14 08:18:23.870777888 +0000 UTC m=+0.033761537 image pull registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 Oct 14 04:18:23 localhost systemd[1]: Started libcrun container. Oct 14 04:18:23 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/360c9d6681680c6086250bb7d9532a72bd783cf0586b4983df05bec6e05d323a/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff) Oct 14 04:18:23 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/360c9d6681680c6086250bb7d9532a72bd783cf0586b4983df05bec6e05d323a/merged/var/log/libvirt supports timestamps until 2038 (0x7fffffff) Oct 14 04:18:23 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/360c9d6681680c6086250bb7d9532a72bd783cf0586b4983df05bec6e05d323a/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff) Oct 14 04:18:23 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/360c9d6681680c6086250bb7d9532a72bd783cf0586b4983df05bec6e05d323a/merged/var/log/swtpm/libvirt supports timestamps until 2038 (0x7fffffff) Oct 14 04:18:23 localhost podman[62071]: 2025-10-14 08:18:23.982599949 +0000 UTC m=+0.145583548 container init e7d667bdb252c6a015719de25a2bb01984cf600c61bb68939738f913fb3ae534 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, release=2, batch=17.1_20250721.1, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-type=git, tcib_managed=true, 
version=17.1.9, com.redhat.component=openstack-nova-libvirt-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-libvirt/images/17.1.9-2, io.buildah.version=1.33.12, build-date=2025-07-21T14:56:59, io.openshift.expose-services=, name=rhosp17/openstack-nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc., distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-ref=809f31d3cd93a9e04341110fb85686656c754dc0) Oct 14 04:18:23 localhost podman[62071]: 2025-10-14 08:18:23.987905415 +0000 UTC m=+0.150889024 container start e7d667bdb252c6a015719de25a2bb01984cf600c61bb68939738f913fb3ae534 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd, version=17.1.9, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2025-07-21T14:56:59, release=2, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 nova-libvirt, architecture=x86_64, com.redhat.component=openstack-nova-libvirt-container, batch=17.1_20250721.1, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-libvirt/images/17.1.9-2, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, io.buildah.version=1.33.12, vcs-type=git, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-ref=809f31d3cd93a9e04341110fb85686656c754dc0, distribution-scope=public, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-libvirt) Oct 14 04:18:24 localhost podman[62109]: 2025-10-14 08:18:24.006320291 +0000 UTC m=+0.066120980 container create 02380eea0a197064ac4a61c7601e9a90a4b04b9a25c36e2d784cf28785ae6eaf (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, 
name=nova_virtsecretd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-libvirt-container, architecture=x86_64, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 1, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtsecretd.json:/var/lib/kolla/config_files/config.json:ro']}, summary=Red Hat OpenStack 
Platform 17.1 nova-libvirt, container_name=nova_virtsecretd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-libvirt/images/17.1.9-2, release=2, name=rhosp17/openstack-nova-libvirt, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1, version=17.1.9, io.openshift.expose-services=, batch=17.1_20250721.1, build-date=2025-07-21T14:56:59, vcs-ref=809f31d3cd93a9e04341110fb85686656c754dc0, vcs-type=git, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, distribution-scope=public, io.buildah.version=1.33.12) Oct 14 04:18:24 localhost systemd[1]: Started libpod-conmon-02380eea0a197064ac4a61c7601e9a90a4b04b9a25c36e2d784cf28785ae6eaf.scope. Oct 14 04:18:24 localhost systemd[1]: Started libcrun container. Oct 14 04:18:24 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/14e63e4b374fcf1d9cd6a42836a9c1d845dc465af9c3574a672d16caea6c12c8/merged/etc/libvirt supports timestamps until 2038 (0x7fffffff) Oct 14 04:18:24 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/14e63e4b374fcf1d9cd6a42836a9c1d845dc465af9c3574a672d16caea6c12c8/merged/var/lib/vhost_sockets supports timestamps until 2038 (0x7fffffff) Oct 14 04:18:24 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/14e63e4b374fcf1d9cd6a42836a9c1d845dc465af9c3574a672d16caea6c12c8/merged/var/cache/libvirt supports timestamps until 2038 (0x7fffffff) Oct 14 04:18:24 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/14e63e4b374fcf1d9cd6a42836a9c1d845dc465af9c3574a672d16caea6c12c8/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff) Oct 14 04:18:24 localhost kernel: xfs filesystem being remounted at 
/var/lib/containers/storage/overlay/14e63e4b374fcf1d9cd6a42836a9c1d845dc465af9c3574a672d16caea6c12c8/merged/var/log/libvirt supports timestamps until 2038 (0x7fffffff) Oct 14 04:18:24 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/14e63e4b374fcf1d9cd6a42836a9c1d845dc465af9c3574a672d16caea6c12c8/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff) Oct 14 04:18:24 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/14e63e4b374fcf1d9cd6a42836a9c1d845dc465af9c3574a672d16caea6c12c8/merged/var/lib/kolla/config_files/src-ceph supports timestamps until 2038 (0x7fffffff) Oct 14 04:18:24 localhost podman[62109]: 2025-10-14 08:18:23.974639529 +0000 UTC m=+0.034440228 image pull registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 Oct 14 04:18:24 localhost podman[62109]: 2025-10-14 08:18:24.076841699 +0000 UTC m=+0.136642398 container init 02380eea0a197064ac4a61c7601e9a90a4b04b9a25c36e2d784cf28785ae6eaf (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtsecretd, name=rhosp17/openstack-nova-libvirt, build-date=2025-07-21T14:56:59, io.buildah.version=1.33.12, release=2, vcs-ref=809f31d3cd93a9e04341110fb85686656c754dc0, description=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-type=git, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, vendor=Red Hat, Inc., distribution-scope=public, config_id=tripleo_step3, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 
'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 1, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtsecretd.json:/var/lib/kolla/config_files/config.json:ro']}, container_name=nova_virtsecretd, maintainer=OpenStack TripleO Team, tcib_managed=true, batch=17.1_20250721.1, com.redhat.component=openstack-nova-libvirt-container, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-libvirt/images/17.1.9-2, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=) Oct 14 04:18:24 localhost podman[62109]: 2025-10-14 08:18:24.09031648 +0000 UTC m=+0.150117179 container start 
02380eea0a197064ac4a61c7601e9a90a4b04b9a25c36e2d784cf28785ae6eaf (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtsecretd, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-libvirt/images/17.1.9-2, name=rhosp17/openstack-nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 1, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtsecretd.json:/var/lib/kolla/config_files/config.json:ro']}, batch=17.1_20250721.1, tcib_managed=true, build-date=2025-07-21T14:56:59, distribution-scope=public, vcs-ref=809f31d3cd93a9e04341110fb85686656c754dc0, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.buildah.version=1.33.12, container_name=nova_virtsecretd, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/agreements, version=17.1.9, release=2, vcs-type=git, config_id=tripleo_step3, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-libvirt-container) Oct 14 04:18:24 localhost python3[61538]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_virtsecretd --cgroupns=host --conmon-pidfile /run/nova_virtsecretd.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=4d186a6228facd5bcddf9bcc145eb470 --label config_id=tripleo_step3 --label container_name=nova_virtsecretd --label managed_by=tripleo_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 1, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtsecretd.json:/var/lib/kolla/config_files/config.json:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_virtsecretd.log --network host --pid host --pids-limit 65536 --privileged=True --security-opt label=level:s0 --security-opt label=type:spc_t --security-opt label=filetype:container_file_t --ulimit nofile=131072 --ulimit nproc=126960 --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro 
--volume /var/log/containers/libvirt:/var/log/libvirt:shared,z --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /run:/run --volume /sys/fs/cgroup:/sys/fs/cgroup --volume /sys/fs/selinux:/sys/fs/selinux --volume /etc/selinux/config:/etc/selinux/config:ro --volume /etc/libvirt:/etc/libvirt:shared --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro --volume /run/libvirt:/run/libvirt:shared,z --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/libvirt:/var/lib/libvirt:shared --volume /var/cache/libvirt:/var/cache/libvirt:shared --volume /var/lib/vhost_sockets:/var/lib/vhost_sockets --volume /var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro --volume /var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z --volume /var/lib/kolla/config_files/nova_virtsecretd.json:/var/lib/kolla/config_files/config.json:ro registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 Oct 14 04:18:24 localhost systemd-logind[760]: Existing logind session ID 28 used by new audit session, ignoring. Oct 14 04:18:24 localhost systemd[1]: Started Session c3 of User root. Oct 14 04:18:24 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-0a931392ffb1dd9dc1600def18a62aecf9fb9e17a5158a034ff1b31f874cb92d-userdata-shm.mount: Deactivated successfully. Oct 14 04:18:24 localhost systemd[1]: var-lib-containers-storage-overlay-edfc0b06a4a796de9d2029c8d0ee2f6200965b91068a8e771289702817852d05-merged.mount: Deactivated successfully. Oct 14 04:18:24 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b97a8257ba2511805a046403dbba16e8733667e02309c88bf2fb88d0470c2f9b-userdata-shm.mount: Deactivated successfully. Oct 14 04:18:24 localhost systemd[1]: session-c3.scope: Deactivated successfully. 
Oct 14 04:18:24 localhost podman[62252]: 2025-10-14 08:18:24.542349331 +0000 UTC m=+0.071955144 container create b8448d328f64def3e65a4b47453ff3736b00383ab3f8c127ff29df806213ba2c (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtnodedevd, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 2, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', 
'/var/lib/kolla/config_files/nova_virtnodedevd.json:/var/lib/kolla/config_files/config.json:ro']}, distribution-scope=public, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, name=rhosp17/openstack-nova-libvirt, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20250721.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, container_name=nova_virtnodedevd, tcib_managed=true, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-libvirt-container, io.buildah.version=1.33.12, io.openshift.expose-services=, version=17.1.9, vcs-ref=809f31d3cd93a9e04341110fb85686656c754dc0, managed_by=tripleo_ansible, release=2, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-libvirt/images/17.1.9-2, build-date=2025-07-21T14:56:59) Oct 14 04:18:24 localhost systemd[1]: Started libpod-conmon-b8448d328f64def3e65a4b47453ff3736b00383ab3f8c127ff29df806213ba2c.scope. Oct 14 04:18:24 localhost systemd[1]: Started libcrun container. 
Oct 14 04:18:24 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d651fdeeacd3d56019b1d31627c266ca6522613e8d6238e23e02ec783ccfc1eb/merged/etc/libvirt supports timestamps until 2038 (0x7fffffff) Oct 14 04:18:24 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d651fdeeacd3d56019b1d31627c266ca6522613e8d6238e23e02ec783ccfc1eb/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff) Oct 14 04:18:24 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d651fdeeacd3d56019b1d31627c266ca6522613e8d6238e23e02ec783ccfc1eb/merged/var/lib/vhost_sockets supports timestamps until 2038 (0x7fffffff) Oct 14 04:18:24 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d651fdeeacd3d56019b1d31627c266ca6522613e8d6238e23e02ec783ccfc1eb/merged/var/cache/libvirt supports timestamps until 2038 (0x7fffffff) Oct 14 04:18:24 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d651fdeeacd3d56019b1d31627c266ca6522613e8d6238e23e02ec783ccfc1eb/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff) Oct 14 04:18:24 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d651fdeeacd3d56019b1d31627c266ca6522613e8d6238e23e02ec783ccfc1eb/merged/var/log/libvirt supports timestamps until 2038 (0x7fffffff) Oct 14 04:18:24 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d651fdeeacd3d56019b1d31627c266ca6522613e8d6238e23e02ec783ccfc1eb/merged/var/lib/kolla/config_files/src-ceph supports timestamps until 2038 (0x7fffffff) Oct 14 04:18:24 localhost podman[62252]: 2025-10-14 08:18:24.504374471 +0000 UTC m=+0.033980374 image pull registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 Oct 14 04:18:24 localhost podman[62252]: 2025-10-14 08:18:24.61039851 +0000 UTC m=+0.140004323 container init 
b8448d328f64def3e65a4b47453ff3736b00383ab3f8c127ff29df806213ba2c (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtnodedevd, architecture=x86_64, vcs-ref=809f31d3cd93a9e04341110fb85686656c754dc0, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, managed_by=tripleo_ansible, version=17.1.9, container_name=nova_virtnodedevd, name=rhosp17/openstack-nova-libvirt, release=2, build-date=2025-07-21T14:56:59, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 nova-libvirt, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 2, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtnodedevd.json:/var/lib/kolla/config_files/config.json:ro']}, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-libvirt/images/17.1.9-2, maintainer=OpenStack TripleO Team, distribution-scope=public, batch=17.1_20250721.1, io.openshift.expose-services=, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=openstack-nova-libvirt-container) Oct 14 04:18:24 localhost podman[62252]: 2025-10-14 08:18:24.621909721 +0000 UTC m=+0.151515554 container start b8448d328f64def3e65a4b47453ff3736b00383ab3f8c127ff29df806213ba2c (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtnodedevd, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, managed_by=tripleo_ansible, vcs-ref=809f31d3cd93a9e04341110fb85686656c754dc0, build-date=2025-07-21T14:56:59, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 
'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 2, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtnodedevd.json:/var/lib/kolla/config_files/config.json:ro']}, distribution-scope=public, vendor=Red Hat, Inc., config_id=tripleo_step3, batch=17.1_20250721.1, io.openshift.expose-services=, release=2, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, io.openshift.tags=rhosp osp openstack osp-17.1, architecture=x86_64, version=17.1.9, container_name=nova_virtnodedevd, name=rhosp17/openstack-nova-libvirt, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-libvirt/images/17.1.9-2, io.buildah.version=1.33.12) Oct 14 04:18:24 localhost 
python3[61538]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_virtnodedevd --cgroupns=host --conmon-pidfile /run/nova_virtnodedevd.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=4d186a6228facd5bcddf9bcc145eb470 --label config_id=tripleo_step3 --label container_name=nova_virtnodedevd --label managed_by=tripleo_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 2, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtnodedevd.json:/var/lib/kolla/config_files/config.json:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_virtnodedevd.log --network host --pid host --pids-limit 65536 --privileged=True --security-opt label=level:s0 --security-opt label=type:spc_t --security-opt label=filetype:container_file_t --ulimit nofile=131072 --ulimit nproc=126960 --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/log/containers/libvirt:/var/log/libvirt:shared,z --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /run:/run --volume /sys/fs/cgroup:/sys/fs/cgroup --volume /sys/fs/selinux:/sys/fs/selinux --volume /etc/selinux/config:/etc/selinux/config:ro --volume /etc/libvirt:/etc/libvirt:shared --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro --volume /run/libvirt:/run/libvirt:shared,z --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/libvirt:/var/lib/libvirt:shared --volume /var/cache/libvirt:/var/cache/libvirt:shared --volume /var/lib/vhost_sockets:/var/lib/vhost_sockets --volume /var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro --volume /var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z --volume /var/lib/kolla/config_files/nova_virtnodedevd.json:/var/lib/kolla/config_files/config.json:ro 
registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 Oct 14 04:18:24 localhost podman[62265]: 2025-10-14 08:18:24.642193165 +0000 UTC m=+0.137430652 container create 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, architecture=x86_64, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, name=rhosp17/openstack-iscsid, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, batch=17.1_20250721.1, com.redhat.component=openstack-iscsid-container, build-date=2025-07-21T13:27:15, 
summary=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 iscsid, release=1, io.buildah.version=1.33.12, container_name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true, version=17.1.9, vendor=Red Hat, Inc.) Oct 14 04:18:24 localhost systemd-logind[760]: Existing logind session ID 28 used by new audit session, ignoring. Oct 14 04:18:24 localhost systemd[1]: Started Session c4 of User root. Oct 14 04:18:24 localhost systemd[1]: Started libpod-conmon-2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea.scope. Oct 14 04:18:24 localhost systemd[1]: Started libcrun container. Oct 14 04:18:24 localhost podman[62265]: 2025-10-14 08:18:24.596577288 +0000 UTC m=+0.091814855 image pull registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1 Oct 14 04:18:24 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ed0a16e5ece4aebc29c3c8ec7f3de2cd4153e83805d28599b6e1826f63716e16/merged/etc/target supports timestamps until 2038 (0x7fffffff) Oct 14 04:18:24 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ed0a16e5ece4aebc29c3c8ec7f3de2cd4153e83805d28599b6e1826f63716e16/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff) Oct 14 04:18:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea. 
Oct 14 04:18:24 localhost podman[62265]: 2025-10-14 08:18:24.72731294 +0000 UTC m=+0.222550497 container init 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, com.redhat.component=openstack-iscsid-container, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, vendor=Red Hat, Inc., io.buildah.version=1.33.12, container_name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, name=rhosp17/openstack-iscsid, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, 
managed_by=tripleo_ansible, release=1, vcs-type=git, build-date=2025-07-21T13:27:15, config_id=tripleo_step3, version=17.1.9, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements) Oct 14 04:18:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea. Oct 14 04:18:24 localhost systemd-logind[760]: Existing logind session ID 28 used by new audit session, ignoring. Oct 14 04:18:24 localhost systemd[1]: Started Session c5 of User root. Oct 14 04:18:24 localhost systemd[1]: session-c4.scope: Deactivated successfully. Oct 14 04:18:24 localhost podman[62265]: 2025-10-14 08:18:24.819357772 +0000 UTC m=+0.314595259 container start 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, architecture=x86_64, vcs-type=git, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, config_id=tripleo_step3, version=17.1.9, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20250721.1, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhosp17/openstack-iscsid, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T13:27:15, distribution-scope=public, com.redhat.component=openstack-iscsid-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 
'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.expose-services=, io.buildah.version=1.33.12) Oct 14 04:18:24 localhost python3[61538]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name iscsid --conmon-pidfile /run/iscsid.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=49c4309af9a4fea3d3f53b6222780f5a --healthcheck-command /openstack/healthcheck --label config_id=tripleo_step3 --label container_name=iscsid --label managed_by=tripleo_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/iscsid.log --network host --privileged=True --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro --volume /dev:/dev --volume /run:/run --volume /sys:/sys --volume /lib/modules:/lib/modules:ro --volume /var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro --volume /etc/target:/etc/target:z --volume /var/lib/iscsi:/var/lib/iscsi:z registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1 Oct 14 04:18:24 localhost systemd[1]: session-c5.scope: Deactivated successfully. 
Oct 14 04:18:24 localhost podman[62308]: 2025-10-14 08:18:24.882384374 +0000 UTC m=+0.103858642 container health_status 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=starting, vcs-type=git, com.redhat.component=openstack-iscsid-container, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, version=17.1.9, distribution-scope=public, release=1, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, summary=Red Hat OpenStack 
Platform 17.1 iscsid, build-date=2025-07-21T13:27:15, container_name=iscsid, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, vendor=Red Hat, Inc., batch=17.1_20250721.1, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64) Oct 14 04:18:24 localhost kernel: Loading iSCSI transport class v2.0-870. Oct 14 04:18:24 localhost podman[62308]: 2025-10-14 08:18:24.927004871 +0000 UTC m=+0.148479149 container exec_died 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-07-21T13:27:15, com.redhat.component=openstack-iscsid-container, 
io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, vendor=Red Hat, Inc., architecture=x86_64, release=1, description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, name=rhosp17/openstack-iscsid, tcib_managed=true, vcs-type=git, batch=17.1_20250721.1, distribution-scope=public, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, version=17.1.9, config_id=tripleo_step3, container_name=iscsid) Oct 14 04:18:24 localhost systemd[1]: 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea.service: Deactivated successfully. Oct 14 04:18:25 localhost podman[62428]: 2025-10-14 08:18:25.199122409 +0000 UTC m=+0.087208211 container create 642ddc0adda99f37b6ecfbddc75eb67ba42a08fdfa9f5ce9223f91ce82a5e0ee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtstoraged, architecture=x86_64, io.openshift.expose-services=, version=17.1.9, build-date=2025-07-21T14:56:59, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20250721.1, container_name=nova_virtstoraged, config_id=tripleo_step3, name=rhosp17/openstack-nova-libvirt, vcs-ref=809f31d3cd93a9e04341110fb85686656c754dc0, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 
'label=filetype:container_file_t'], 'start_order': 3, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtstoraged.json:/var/lib/kolla/config_files/config.json:ro']}, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-libvirt-container, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-libvirt/images/17.1.9-2, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, managed_by=tripleo_ansible, distribution-scope=public, release=2) Oct 14 04:18:25 localhost systemd[1]: Started 
libpod-conmon-642ddc0adda99f37b6ecfbddc75eb67ba42a08fdfa9f5ce9223f91ce82a5e0ee.scope. Oct 14 04:18:25 localhost podman[62428]: 2025-10-14 08:18:25.160903353 +0000 UTC m=+0.048989185 image pull registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 Oct 14 04:18:25 localhost systemd[1]: Started libcrun container. Oct 14 04:18:25 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a6c55f01d04ba30e2a7871446cca9a16a6473cee6ca9a477923fc0329ccb78a4/merged/etc/libvirt supports timestamps until 2038 (0x7fffffff) Oct 14 04:18:25 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a6c55f01d04ba30e2a7871446cca9a16a6473cee6ca9a477923fc0329ccb78a4/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff) Oct 14 04:18:25 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a6c55f01d04ba30e2a7871446cca9a16a6473cee6ca9a477923fc0329ccb78a4/merged/var/lib/vhost_sockets supports timestamps until 2038 (0x7fffffff) Oct 14 04:18:25 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a6c55f01d04ba30e2a7871446cca9a16a6473cee6ca9a477923fc0329ccb78a4/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff) Oct 14 04:18:25 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a6c55f01d04ba30e2a7871446cca9a16a6473cee6ca9a477923fc0329ccb78a4/merged/var/log/libvirt supports timestamps until 2038 (0x7fffffff) Oct 14 04:18:25 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a6c55f01d04ba30e2a7871446cca9a16a6473cee6ca9a477923fc0329ccb78a4/merged/var/cache/libvirt supports timestamps until 2038 (0x7fffffff) Oct 14 04:18:25 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a6c55f01d04ba30e2a7871446cca9a16a6473cee6ca9a477923fc0329ccb78a4/merged/var/lib/kolla/config_files/src-ceph supports timestamps until 2038 (0x7fffffff) Oct 14 
04:18:25 localhost podman[62428]: 2025-10-14 08:18:25.277642826 +0000 UTC m=+0.165728588 container init 642ddc0adda99f37b6ecfbddc75eb67ba42a08fdfa9f5ce9223f91ce82a5e0ee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtstoraged, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt, vendor=Red Hat, Inc., tcib_managed=true, io.openshift.expose-services=, build-date=2025-07-21T14:56:59, io.openshift.tags=rhosp osp openstack osp-17.1, container_name=nova_virtstoraged, distribution-scope=public, version=17.1.9, batch=17.1_20250721.1, vcs-ref=809f31d3cd93a9e04341110fb85686656c754dc0, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, architecture=x86_64, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-libvirt/images/17.1.9-2, release=2, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 3, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtstoraged.json:/var/lib/kolla/config_files/config.json:ro']}, vcs-type=git, com.redhat.component=openstack-nova-libvirt-container, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, config_id=tripleo_step3, name=rhosp17/openstack-nova-libvirt) Oct 14 04:18:25 localhost podman[62428]: 2025-10-14 08:18:25.284957946 +0000 UTC m=+0.173043708 container start 642ddc0adda99f37b6ecfbddc75eb67ba42a08fdfa9f5ce9223f91ce82a5e0ee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtstoraged, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, tcib_managed=true, config_id=tripleo_step3, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 3, 'ulimit': 
['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtstoraged.json:/var/lib/kolla/config_files/config.json:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, managed_by=tripleo_ansible, release=2, name=rhosp17/openstack-nova-libvirt, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-ref=809f31d3cd93a9e04341110fb85686656c754dc0, build-date=2025-07-21T14:56:59, vcs-type=git, com.redhat.component=openstack-nova-libvirt-container, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, container_name=nova_virtstoraged, vendor=Red Hat, Inc., version=17.1.9, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20250721.1, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-libvirt/images/17.1.9-2, distribution-scope=public) Oct 14 04:18:25 localhost python3[61538]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_virtstoraged --cgroupns=host --conmon-pidfile /run/nova_virtstoraged.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=4d186a6228facd5bcddf9bcc145eb470 --label config_id=tripleo_step3 --label container_name=nova_virtstoraged --label managed_by=tripleo_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 3, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', 
'/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtstoraged.json:/var/lib/kolla/config_files/config.json:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_virtstoraged.log --network host --pid host --pids-limit 65536 --privileged=True --security-opt label=level:s0 --security-opt label=type:spc_t --security-opt label=filetype:container_file_t --ulimit nofile=131072 --ulimit nproc=126960 --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/log/containers/libvirt:/var/log/libvirt:shared,z --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /run:/run --volume /sys/fs/cgroup:/sys/fs/cgroup --volume /sys/fs/selinux:/sys/fs/selinux --volume /etc/selinux/config:/etc/selinux/config:ro --volume /etc/libvirt:/etc/libvirt:shared --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro --volume /run/libvirt:/run/libvirt:shared,z --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/libvirt:/var/lib/libvirt:shared --volume /var/cache/libvirt:/var/cache/libvirt:shared --volume /var/lib/vhost_sockets:/var/lib/vhost_sockets --volume /var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro --volume /var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z --volume 
/var/lib/kolla/config_files/nova_virtstoraged.json:/var/lib/kolla/config_files/config.json:ro registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 Oct 14 04:18:25 localhost systemd-logind[760]: Existing logind session ID 28 used by new audit session, ignoring. Oct 14 04:18:25 localhost systemd[1]: Started Session c6 of User root. Oct 14 04:18:25 localhost systemd[1]: session-c6.scope: Deactivated successfully. Oct 14 04:18:25 localhost podman[62532]: 2025-10-14 08:18:25.651546171 +0000 UTC m=+0.091908058 container create 99bc33bcb09deb13e3026a7596b5a2cca2bae64aca6ed031d39492bb093d9aa2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud, release=2, batch=17.1_20250721.1, maintainer=OpenStack TripleO Team, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 4, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', 
'/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtqemud.json:/var/lib/kolla/config_files/config.json:ro', '/var/log/containers/libvirt/swtpm:/var/log/swtpm:z']}, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-libvirt-container, config_id=tripleo_step3, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, container_name=nova_virtqemud, version=17.1.9, architecture=x86_64, distribution-scope=public, vcs-type=git, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, io.buildah.version=1.33.12, description=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-ref=809f31d3cd93a9e04341110fb85686656c754dc0, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-libvirt/images/17.1.9-2, build-date=2025-07-21T14:56:59, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-libvirt) Oct 14 04:18:25 localhost systemd[1]: Started libpod-conmon-99bc33bcb09deb13e3026a7596b5a2cca2bae64aca6ed031d39492bb093d9aa2.scope. Oct 14 04:18:25 localhost podman[62532]: 2025-10-14 08:18:25.601262227 +0000 UTC m=+0.041624194 image pull registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 Oct 14 04:18:25 localhost systemd[1]: Started libcrun container. 
Oct 14 04:18:25 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/55b86a5ddc64363ea624c80cfd4e28da33e700e176dc8af3450feb34716db754/merged/etc/libvirt supports timestamps until 2038 (0x7fffffff) Oct 14 04:18:25 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/55b86a5ddc64363ea624c80cfd4e28da33e700e176dc8af3450feb34716db754/merged/var/cache/libvirt supports timestamps until 2038 (0x7fffffff) Oct 14 04:18:25 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/55b86a5ddc64363ea624c80cfd4e28da33e700e176dc8af3450feb34716db754/merged/var/log/libvirt supports timestamps until 2038 (0x7fffffff) Oct 14 04:18:25 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/55b86a5ddc64363ea624c80cfd4e28da33e700e176dc8af3450feb34716db754/merged/var/log/swtpm supports timestamps until 2038 (0x7fffffff) Oct 14 04:18:25 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/55b86a5ddc64363ea624c80cfd4e28da33e700e176dc8af3450feb34716db754/merged/var/lib/vhost_sockets supports timestamps until 2038 (0x7fffffff) Oct 14 04:18:25 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/55b86a5ddc64363ea624c80cfd4e28da33e700e176dc8af3450feb34716db754/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff) Oct 14 04:18:25 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/55b86a5ddc64363ea624c80cfd4e28da33e700e176dc8af3450feb34716db754/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff) Oct 14 04:18:25 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/55b86a5ddc64363ea624c80cfd4e28da33e700e176dc8af3450feb34716db754/merged/var/lib/kolla/config_files/src-ceph supports timestamps until 2038 (0x7fffffff) Oct 14 04:18:25 localhost podman[62532]: 2025-10-14 08:18:25.724002019 +0000 UTC 
m=+0.164363876 container init 99bc33bcb09deb13e3026a7596b5a2cca2bae64aca6ed031d39492bb093d9aa2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 4, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', 
'/var/lib/kolla/config_files/nova_virtqemud.json:/var/lib/kolla/config_files/config.json:ro', '/var/log/containers/libvirt/swtpm:/var/log/swtpm:z']}, vcs-type=git, com.redhat.component=openstack-nova-libvirt-container, name=rhosp17/openstack-nova-libvirt, io.openshift.expose-services=, build-date=2025-07-21T14:56:59, managed_by=tripleo_ansible, version=17.1.9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-libvirt/images/17.1.9-2, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt, distribution-scope=public, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., vcs-ref=809f31d3cd93a9e04341110fb85686656c754dc0, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, container_name=nova_virtqemud, config_id=tripleo_step3, release=2, batch=17.1_20250721.1, io.openshift.tags=rhosp osp openstack osp-17.1) Oct 14 04:18:25 localhost podman[62532]: 2025-10-14 08:18:25.735571141 +0000 UTC m=+0.175933008 container start 99bc33bcb09deb13e3026a7596b5a2cca2bae64aca6ed031d39492bb093d9aa2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, architecture=x86_64, distribution-scope=public, com.redhat.component=openstack-nova-libvirt-container, container_name=nova_virtqemud, build-date=2025-07-21T14:56:59, managed_by=tripleo_ansible, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-type=git, name=rhosp17/openstack-nova-libvirt, version=17.1.9, config_id=tripleo_step3, io.openshift.expose-services=, batch=17.1_20250721.1, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-ref=809f31d3cd93a9e04341110fb85686656c754dc0, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-libvirt/images/17.1.9-2, release=2, io.openshift.tags=rhosp osp openstack osp-17.1, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 4, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtqemud.json:/var/lib/kolla/config_files/config.json:ro', '/var/log/containers/libvirt/swtpm:/var/log/swtpm:z']}) Oct 14 04:18:25 localhost python3[61538]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_virtqemud --cgroupns=host --conmon-pidfile /run/nova_virtqemud.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=4d186a6228facd5bcddf9bcc145eb470 --label config_id=tripleo_step3 --label container_name=nova_virtqemud --label managed_by=tripleo_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 4, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtqemud.json:/var/lib/kolla/config_files/config.json:ro', '/var/log/containers/libvirt/swtpm:/var/log/swtpm:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_virtqemud.log --network host --pid host --pids-limit 65536 --privileged=True --security-opt label=level:s0 --security-opt label=type:spc_t --security-opt label=filetype:container_file_t --ulimit nofile=131072 --ulimit nproc=126960 --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/log/containers/libvirt:/var/log/libvirt:shared,z --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /run:/run --volume /sys/fs/cgroup:/sys/fs/cgroup --volume /sys/fs/selinux:/sys/fs/selinux --volume /etc/selinux/config:/etc/selinux/config:ro --volume /etc/libvirt:/etc/libvirt:shared --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro --volume /run/libvirt:/run/libvirt:shared,z --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/libvirt:/var/lib/libvirt:shared --volume /var/cache/libvirt:/var/cache/libvirt:shared --volume /var/lib/vhost_sockets:/var/lib/vhost_sockets --volume 
/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro --volume /var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z --volume /var/lib/kolla/config_files/nova_virtqemud.json:/var/lib/kolla/config_files/config.json:ro --volume /var/log/containers/libvirt/swtpm:/var/log/swtpm:z registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 Oct 14 04:18:25 localhost systemd-logind[760]: Existing logind session ID 28 used by new audit session, ignoring. Oct 14 04:18:25 localhost systemd[1]: Started Session c7 of User root. Oct 14 04:18:25 localhost systemd[1]: session-c7.scope: Deactivated successfully. Oct 14 04:18:26 localhost podman[62636]: 2025-10-14 08:18:26.137692148 +0000 UTC m=+0.079984054 container create f61c6985320c4e3d1194575a3954495dcc2f873d72a594570e6c3b74cf93fd8f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtproxyd, container_name=nova_virtproxyd, com.redhat.component=openstack-nova-libvirt-container, io.buildah.version=1.33.12, config_id=tripleo_step3, version=17.1.9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-libvirt/images/17.1.9-2, vcs-type=git, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 5, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtproxyd.json:/var/lib/kolla/config_files/config.json:ro']}, batch=17.1_20250721.1, name=rhosp17/openstack-nova-libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, vendor=Red Hat, Inc., build-date=2025-07-21T14:56:59, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=809f31d3cd93a9e04341110fb85686656c754dc0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, release=2) Oct 14 04:18:26 localhost podman[62636]: 2025-10-14 08:18:26.088609062 +0000 UTC m=+0.030901008 image pull registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 Oct 14 04:18:26 localhost systemd[1]: Started libpod-conmon-f61c6985320c4e3d1194575a3954495dcc2f873d72a594570e6c3b74cf93fd8f.scope. 
Oct 14 04:18:26 localhost systemd[1]: Started libcrun container. Oct 14 04:18:26 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/32472dd13182638cf4e918dd146bf2d7453b4f75ed169a2fe8fd3591fc0a1be9/merged/etc/libvirt supports timestamps until 2038 (0x7fffffff) Oct 14 04:18:26 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/32472dd13182638cf4e918dd146bf2d7453b4f75ed169a2fe8fd3591fc0a1be9/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff) Oct 14 04:18:26 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/32472dd13182638cf4e918dd146bf2d7453b4f75ed169a2fe8fd3591fc0a1be9/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff) Oct 14 04:18:26 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/32472dd13182638cf4e918dd146bf2d7453b4f75ed169a2fe8fd3591fc0a1be9/merged/var/lib/vhost_sockets supports timestamps until 2038 (0x7fffffff) Oct 14 04:18:26 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/32472dd13182638cf4e918dd146bf2d7453b4f75ed169a2fe8fd3591fc0a1be9/merged/var/log/libvirt supports timestamps until 2038 (0x7fffffff) Oct 14 04:18:26 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/32472dd13182638cf4e918dd146bf2d7453b4f75ed169a2fe8fd3591fc0a1be9/merged/var/cache/libvirt supports timestamps until 2038 (0x7fffffff) Oct 14 04:18:26 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/32472dd13182638cf4e918dd146bf2d7453b4f75ed169a2fe8fd3591fc0a1be9/merged/var/lib/kolla/config_files/src-ceph supports timestamps until 2038 (0x7fffffff) Oct 14 04:18:26 localhost podman[62636]: 2025-10-14 08:18:26.219118097 +0000 UTC m=+0.161410003 container init f61c6985320c4e3d1194575a3954495dcc2f873d72a594570e6c3b74cf93fd8f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, 
name=nova_virtproxyd, tcib_managed=true, batch=17.1_20250721.1, vendor=Red Hat, Inc., architecture=x86_64, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 5, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtproxyd.json:/var/lib/kolla/config_files/config.json:ro']}, build-date=2025-07-21T14:56:59, config_id=tripleo_step3, container_name=nova_virtproxyd, version=17.1.9, 
com.redhat.component=openstack-nova-libvirt-container, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-libvirt/images/17.1.9-2, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-type=git, io.buildah.version=1.33.12, distribution-scope=public, release=2, description=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhosp17/openstack-nova-libvirt, vcs-ref=809f31d3cd93a9e04341110fb85686656c754dc0) Oct 14 04:18:26 localhost podman[62636]: 2025-10-14 08:18:26.228526622 +0000 UTC m=+0.170818528 container start f61c6985320c4e3d1194575a3954495dcc2f873d72a594570e6c3b74cf93fd8f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtproxyd, architecture=x86_64, managed_by=tripleo_ansible, container_name=nova_virtproxyd, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20250721.1, vcs-ref=809f31d3cd93a9e04341110fb85686656c754dc0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, distribution-scope=public, com.redhat.component=openstack-nova-libvirt-container, io.openshift.expose-services=, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 5, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtproxyd.json:/var/lib/kolla/config_files/config.json:ro']}, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-libvirt/images/17.1.9-2, vendor=Red Hat, Inc., vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1, config_id=tripleo_step3, version=17.1.9, name=rhosp17/openstack-nova-libvirt, build-date=2025-07-21T14:56:59, release=2, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt) Oct 14 04:18:26 localhost python3[61538]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_virtproxyd --cgroupns=host 
--conmon-pidfile /run/nova_virtproxyd.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=4d186a6228facd5bcddf9bcc145eb470 --label config_id=tripleo_step3 --label container_name=nova_virtproxyd --label managed_by=tripleo_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 5, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', 
'/var/lib/kolla/config_files/nova_virtproxyd.json:/var/lib/kolla/config_files/config.json:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_virtproxyd.log --network host --pid host --pids-limit 65536 --privileged=True --security-opt label=level:s0 --security-opt label=type:spc_t --security-opt label=filetype:container_file_t --ulimit nofile=131072 --ulimit nproc=126960 --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/log/containers/libvirt:/var/log/libvirt:shared,z --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /run:/run --volume /sys/fs/cgroup:/sys/fs/cgroup --volume /sys/fs/selinux:/sys/fs/selinux --volume /etc/selinux/config:/etc/selinux/config:ro --volume /etc/libvirt:/etc/libvirt:shared --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro --volume /run/libvirt:/run/libvirt:shared,z --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/libvirt:/var/lib/libvirt:shared --volume /var/cache/libvirt:/var/cache/libvirt:shared --volume /var/lib/vhost_sockets:/var/lib/vhost_sockets --volume /var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro --volume /var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z --volume /var/lib/kolla/config_files/nova_virtproxyd.json:/var/lib/kolla/config_files/config.json:ro registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Oct 14 04:18:26 localhost systemd-logind[760]: Existing logind session ID 28 used by new audit session, ignoring.
Oct 14 04:18:26 localhost systemd[1]: Started Session c8 of User root.
Oct 14 04:18:26 localhost systemd[1]: session-c8.scope: Deactivated successfully.
Oct 14 04:18:26 localhost python3[62717]: ansible-file Invoked with path=/etc/systemd/system/tripleo_collectd.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 04:18:27 localhost python3[62733]: ansible-file Invoked with path=/etc/systemd/system/tripleo_iscsid.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 04:18:27 localhost python3[62749]: ansible-file Invoked with path=/etc/systemd/system/tripleo_nova_virtlogd_wrapper.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 04:18:27 localhost python3[62765]: ansible-file Invoked with path=/etc/systemd/system/tripleo_nova_virtnodedevd.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 04:18:27 localhost
python3[62781]: ansible-file Invoked with path=/etc/systemd/system/tripleo_nova_virtproxyd.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 04:18:28 localhost python3[62797]: ansible-file Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 04:18:28 localhost python3[62813]: ansible-file Invoked with path=/etc/systemd/system/tripleo_nova_virtsecretd.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 04:18:28 localhost python3[62829]: ansible-file Invoked with path=/etc/systemd/system/tripleo_nova_virtstoraged.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 04:18:28 localhost python3[62845]: ansible-file Invoked with path=/etc/systemd/system/tripleo_rsyslog.requires state=absent recurse=False force=False follow=True
modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 04:18:29 localhost python3[62861]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_collectd_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 14 04:18:29 localhost python3[62877]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_iscsid_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 14 04:18:29 localhost python3[62893]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_nova_virtlogd_wrapper_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 14 04:18:29 localhost python3[62909]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_nova_virtnodedevd_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 14 04:18:30 localhost python3[62925]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_nova_virtproxyd_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 14 04:18:30 localhost python3[62941]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 14 04:18:30 localhost python3[62958]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_nova_virtsecretd_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 14 04:18:30 localhost python3[62974]: ansible-stat Invoked
with path=/etc/systemd/system/tripleo_nova_virtstoraged_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 14 04:18:31 localhost python3[62991]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_rsyslog_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 14 04:18:31 localhost python3[63052]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1760429911.1958828-100896-246869110559510/source dest=/etc/systemd/system/tripleo_collectd.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 04:18:32 localhost python3[63081]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1760429911.1958828-100896-246869110559510/source dest=/etc/systemd/system/tripleo_iscsid.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 04:18:32 localhost python3[63110]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1760429911.1958828-100896-246869110559510/source dest=/etc/systemd/system/tripleo_nova_virtlogd_wrapper.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 04:18:33 localhost python3[63139]: ansible-copy Invoked
with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1760429911.1958828-100896-246869110559510/source dest=/etc/systemd/system/tripleo_nova_virtnodedevd.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 04:18:33 localhost python3[63168]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1760429911.1958828-100896-246869110559510/source dest=/etc/systemd/system/tripleo_nova_virtproxyd.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 04:18:34 localhost python3[63197]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1760429911.1958828-100896-246869110559510/source dest=/etc/systemd/system/tripleo_nova_virtqemud.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 04:18:34 localhost python3[63226]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1760429911.1958828-100896-246869110559510/source dest=/etc/systemd/system/tripleo_nova_virtsecretd.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14
04:18:35 localhost python3[63255]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1760429911.1958828-100896-246869110559510/source dest=/etc/systemd/system/tripleo_nova_virtstoraged.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 04:18:35 localhost python3[63284]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1760429911.1958828-100896-246869110559510/source dest=/etc/systemd/system/tripleo_rsyslog.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 04:18:36 localhost python3[63300]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct 14 04:18:36 localhost systemd[1]: Reloading.
Oct 14 04:18:36 localhost systemd-sysv-generator[63325]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 04:18:36 localhost systemd-rc-local-generator[63321]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 04:18:36 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 04:18:36 localhost systemd[1]: Stopping User Manager for UID 0...
Oct 14 04:18:36 localhost systemd[61836]: Activating special unit Exit the Session...
Oct 14 04:18:36 localhost systemd[61836]: Stopped target Main User Target.
Oct 14 04:18:36 localhost systemd[61836]: Stopped target Basic System.
Oct 14 04:18:36 localhost systemd[61836]: Stopped target Paths.
Oct 14 04:18:36 localhost systemd[61836]: Stopped target Sockets.
Oct 14 04:18:36 localhost systemd[61836]: Stopped target Timers.
Oct 14 04:18:36 localhost systemd[61836]: Stopped Daily Cleanup of User's Temporary Directories.
Oct 14 04:18:36 localhost systemd[61836]: Closed D-Bus User Message Bus Socket.
Oct 14 04:18:36 localhost systemd[61836]: Stopped Create User's Volatile Files and Directories.
Oct 14 04:18:36 localhost systemd[61836]: Removed slice User Application Slice.
Oct 14 04:18:36 localhost systemd[61836]: Reached target Shutdown.
Oct 14 04:18:36 localhost systemd[61836]: Finished Exit the Session.
Oct 14 04:18:36 localhost systemd[61836]: Reached target Exit the Session.
Oct 14 04:18:36 localhost systemd[1]: user@0.service: Deactivated successfully.
Oct 14 04:18:36 localhost systemd[1]: Stopped User Manager for UID 0.
Oct 14 04:18:36 localhost systemd[1]: Stopping User Runtime Directory /run/user/0...
Oct 14 04:18:36 localhost systemd[1]: run-user-0.mount: Deactivated successfully.
Oct 14 04:18:36 localhost systemd[1]: user-runtime-dir@0.service: Deactivated successfully.
Oct 14 04:18:36 localhost systemd[1]: Stopped User Runtime Directory /run/user/0.
Oct 14 04:18:36 localhost systemd[1]: Removed slice User Slice of UID 0.
Oct 14 04:18:37 localhost python3[63353]: ansible-systemd Invoked with state=restarted name=tripleo_collectd.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 14 04:18:37 localhost systemd[1]: Reloading.
Oct 14 04:18:37 localhost systemd-rc-local-generator[63383]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 04:18:37 localhost systemd-sysv-generator[63387]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 04:18:37 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 04:18:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6.
Oct 14 04:18:37 localhost systemd[1]: Starting collectd container...
Oct 14 04:18:37 localhost systemd[1]: tmp-crun.TRTXoi.mount: Deactivated successfully.
Oct 14 04:18:37 localhost podman[63393]: 2025-10-14 08:18:37.708019646 +0000 UTC m=+0.078500558 container health_status 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, config_id=tripleo_step1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, architecture=x86_64, batch=17.1_20250721.1, tcib_managed=true, version=17.1.9, container_name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro',
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, build-date=2025-07-21T13:07:59, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, release=1, com.redhat.component=openstack-qdrouterd-container, io.openshift.expose-services=, vendor=Red Hat, Inc., name=rhosp17/openstack-qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, distribution-scope=public, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd)
Oct 14 04:18:37 localhost systemd[1]: Started collectd container.
Oct 14 04:18:37 localhost podman[63393]: 2025-10-14 08:18:37.865074943 +0000 UTC m=+0.235555865 container exec_died 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, managed_by=tripleo_ansible, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, io.buildah.version=1.33.12, batch=17.1_20250721.1, build-date=2025-07-21T13:07:59, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, 
name=rhosp17/openstack-qdrouterd, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, container_name=metrics_qdr, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, com.redhat.component=openstack-qdrouterd-container, version=17.1.9, config_id=tripleo_step1, vcs-type=git)
Oct 14 04:18:37 localhost systemd[1]: 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6.service: Deactivated successfully.
Oct 14 04:18:38 localhost python3[63450]: ansible-systemd Invoked with state=restarted name=tripleo_iscsid.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 14 04:18:38 localhost systemd[1]: Reloading.
Oct 14 04:18:38 localhost systemd-rc-local-generator[63475]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 04:18:38 localhost systemd-sysv-generator[63478]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 04:18:38 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 04:18:38 localhost systemd[1]: Starting iscsid container...
Oct 14 04:18:38 localhost systemd[1]: Started iscsid container.
Oct 14 04:18:39 localhost python3[63516]: ansible-systemd Invoked with state=restarted name=tripleo_nova_virtlogd_wrapper.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 14 04:18:39 localhost systemd[1]: Reloading.
Oct 14 04:18:39 localhost systemd-sysv-generator[63545]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility.
Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 04:18:39 localhost systemd-rc-local-generator[63540]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 04:18:39 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 04:18:39 localhost systemd[1]: Starting nova_virtlogd_wrapper container...
Oct 14 04:18:39 localhost systemd[1]: Started nova_virtlogd_wrapper container.
Oct 14 04:18:40 localhost python3[63583]: ansible-systemd Invoked with state=restarted name=tripleo_nova_virtnodedevd.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 14 04:18:40 localhost systemd[1]: Reloading.
Oct 14 04:18:40 localhost systemd-sysv-generator[63616]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 04:18:40 localhost systemd-rc-local-generator[63612]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 04:18:40 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 04:18:40 localhost systemd[1]: Starting nova_virtnodedevd container...
Oct 14 04:18:40 localhost tripleo-start-podman-container[63623]: Creating additional drop-in dependency for "nova_virtnodedevd" (b8448d328f64def3e65a4b47453ff3736b00383ab3f8c127ff29df806213ba2c)
Oct 14 04:18:40 localhost systemd[1]: Reloading.
Oct 14 04:18:41 localhost systemd-rc-local-generator[63684]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 04:18:41 localhost systemd-sysv-generator[63687]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 04:18:41 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 04:18:41 localhost systemd[1]: Started nova_virtnodedevd container.
Oct 14 04:18:41 localhost python3[63708]: ansible-systemd Invoked with state=restarted name=tripleo_nova_virtproxyd.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 14 04:18:41 localhost systemd[1]: Reloading.
Oct 14 04:18:42 localhost systemd-sysv-generator[63741]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 04:18:42 localhost systemd-rc-local-generator[63738]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 04:18:42 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 04:18:42 localhost systemd[1]: Starting nova_virtproxyd container...
Oct 14 04:18:42 localhost tripleo-start-podman-container[63748]: Creating additional drop-in dependency for "nova_virtproxyd" (f61c6985320c4e3d1194575a3954495dcc2f873d72a594570e6c3b74cf93fd8f)
Oct 14 04:18:42 localhost systemd[1]: Reloading.
Oct 14 04:18:42 localhost systemd-sysv-generator[63809]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility.
Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 04:18:42 localhost systemd-rc-local-generator[63804]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 04:18:42 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 04:18:42 localhost systemd[1]: Started nova_virtproxyd container.
Oct 14 04:18:43 localhost python3[63833]: ansible-systemd Invoked with state=restarted name=tripleo_nova_virtqemud.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 14 04:18:44 localhost systemd[1]: Reloading.
Oct 14 04:18:44 localhost systemd-sysv-generator[63862]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 04:18:44 localhost systemd-rc-local-generator[63859]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 04:18:44 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 04:18:44 localhost systemd[1]: Starting nova_virtqemud container...
Oct 14 04:18:44 localhost tripleo-start-podman-container[63872]: Creating additional drop-in dependency for "nova_virtqemud" (99bc33bcb09deb13e3026a7596b5a2cca2bae64aca6ed031d39492bb093d9aa2)
Oct 14 04:18:44 localhost systemd[1]: Reloading.
Oct 14 04:18:44 localhost systemd-sysv-generator[63931]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 04:18:44 localhost systemd-rc-local-generator[63927]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 04:18:44 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 04:18:45 localhost systemd[1]: Started nova_virtqemud container.
Oct 14 04:18:45 localhost python3[63957]: ansible-systemd Invoked with state=restarted name=tripleo_nova_virtsecretd.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 14 04:18:45 localhost systemd[1]: Reloading.
Oct 14 04:18:45 localhost systemd-rc-local-generator[63979]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 04:18:45 localhost systemd-sysv-generator[63984]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 04:18:45 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 04:18:45 localhost systemd[1]: Starting nova_virtsecretd container...
Oct 14 04:18:46 localhost tripleo-start-podman-container[63996]: Creating additional drop-in dependency for "nova_virtsecretd" (02380eea0a197064ac4a61c7601e9a90a4b04b9a25c36e2d784cf28785ae6eaf)
Oct 14 04:18:46 localhost systemd[1]: Reloading.
Oct 14 04:18:46 localhost systemd-sysv-generator[64055]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 04:18:46 localhost systemd-rc-local-generator[64051]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 04:18:46 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 04:18:46 localhost systemd[1]: Started nova_virtsecretd container.
Oct 14 04:18:47 localhost python3[64080]: ansible-systemd Invoked with state=restarted name=tripleo_nova_virtstoraged.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 14 04:18:47 localhost systemd[1]: Reloading.
Oct 14 04:18:47 localhost systemd-sysv-generator[64110]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 04:18:47 localhost systemd-rc-local-generator[64105]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 04:18:47 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 04:18:47 localhost systemd[1]: Starting nova_virtstoraged container...
Oct 14 04:18:47 localhost tripleo-start-podman-container[64120]: Creating additional drop-in dependency for "nova_virtstoraged" (642ddc0adda99f37b6ecfbddc75eb67ba42a08fdfa9f5ce9223f91ce82a5e0ee)
Oct 14 04:18:47 localhost systemd[1]: Reloading.
Oct 14 04:18:47 localhost systemd-rc-local-generator[64178]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 04:18:47 localhost systemd-sysv-generator[64181]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 04:18:47 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Oct 14 04:18:47 localhost systemd[1]: Started nova_virtstoraged container. Oct 14 04:18:48 localhost python3[64203]: ansible-systemd Invoked with state=restarted name=tripleo_rsyslog.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Oct 14 04:18:48 localhost systemd[1]: Reloading. Oct 14 04:18:48 localhost systemd-rc-local-generator[64230]: /etc/rc.d/rc.local is not marked executable, skipping. Oct 14 04:18:48 localhost systemd-sysv-generator[64236]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Oct 14 04:18:48 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Oct 14 04:18:49 localhost systemd[1]: Starting rsyslog container... Oct 14 04:18:49 localhost systemd[1]: Started libcrun container. 
Oct 14 04:18:49 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3f2142f0cf6c862446ab1f2dfb59e029b67fc4461f58078ef3c91fbbb00953f4/merged/var/log/rsyslog supports timestamps until 2038 (0x7fffffff) Oct 14 04:18:49 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3f2142f0cf6c862446ab1f2dfb59e029b67fc4461f58078ef3c91fbbb00953f4/merged/var/lib/rsyslog supports timestamps until 2038 (0x7fffffff) Oct 14 04:18:49 localhost podman[64243]: 2025-10-14 08:18:49.17499897 +0000 UTC m=+0.124048865 container init b9eb3b21ea34563206e7df64a670b1c29ec5b7333233a277ed5563ce6b2a51a0 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-rsyslog, release=1, vendor=Red Hat, Inc., config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, build-date=2025-07-21T12:58:40, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dda1083e68f30de2da9a23107b96824d'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', 
'/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, container_name=rsyslog, vcs-type=git, version=17.1.9, vcs-ref=38a223d7b691af709e0a5f628409462e34eea167, distribution-scope=public, batch=17.1_20250721.1, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, tcib_managed=true, com.redhat.component=openstack-rsyslog-container, summary=Red Hat OpenStack Platform 17.1 rsyslog, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-rsyslog/images/17.1.9-1, description=Red Hat OpenStack Platform 17.1 rsyslog, io.buildah.version=1.33.12, architecture=x86_64) Oct 14 04:18:49 localhost systemd[1]: tmp-crun.vP3lVi.mount: Deactivated successfully. Oct 14 04:18:49 localhost podman[64243]: 2025-10-14 08:18:49.18874271 +0000 UTC m=+0.137792585 container start b9eb3b21ea34563206e7df64a670b1c29ec5b7333233a277ed5563ce6b2a51a0 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, vendor=Red Hat, Inc., version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, batch=17.1_20250721.1, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, container_name=rsyslog, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-rsyslog, io.buildah.version=1.33.12, vcs-ref=38a223d7b691af709e0a5f628409462e34eea167, build-date=2025-07-21T12:58:40, com.redhat.component=openstack-rsyslog-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dda1083e68f30de2da9a23107b96824d'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, vcs-type=git, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, description=Red Hat OpenStack Platform 17.1 rsyslog, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, io.openshift.expose-services=, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 rsyslog, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-rsyslog/images/17.1.9-1, architecture=x86_64, tcib_managed=true) Oct 14 04:18:49 localhost podman[64243]: rsyslog Oct 14 04:18:49 localhost systemd[1]: Started rsyslog container. Oct 14 04:18:49 localhost systemd[1]: libpod-b9eb3b21ea34563206e7df64a670b1c29ec5b7333233a277ed5563ce6b2a51a0.scope: Deactivated successfully. 
Oct 14 04:18:49 localhost podman[64273]: 2025-10-14 08:18:49.359917128 +0000 UTC m=+0.056841361 container died b9eb3b21ea34563206e7df64a670b1c29ec5b7333233a277ed5563ce6b2a51a0 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, name=rhosp17/openstack-rsyslog, description=Red Hat OpenStack Platform 17.1 rsyslog, vcs-ref=38a223d7b691af709e0a5f628409462e34eea167, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, distribution-scope=public, vendor=Red Hat, Inc., com.redhat.component=openstack-rsyslog-container, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 rsyslog, build-date=2025-07-21T12:58:40, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dda1083e68f30de2da9a23107b96824d'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', 
'/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, batch=17.1_20250721.1, release=1, version=17.1.9, container_name=rsyslog, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-rsyslog/images/17.1.9-1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, architecture=x86_64) Oct 14 04:18:49 localhost podman[64273]: 2025-10-14 08:18:49.381157242 +0000 UTC m=+0.078081405 container cleanup b9eb3b21ea34563206e7df64a670b1c29ec5b7333233a277ed5563ce6b2a51a0 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, build-date=2025-07-21T12:58:40, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, io.openshift.expose-services=, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 rsyslog, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=rsyslog, managed_by=tripleo_ansible, release=1, com.redhat.component=openstack-rsyslog-container, summary=Red Hat OpenStack Platform 17.1 rsyslog, batch=17.1_20250721.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-rsyslog/images/17.1.9-1, name=rhosp17/openstack-rsyslog, version=17.1.9, vcs-ref=38a223d7b691af709e0a5f628409462e34eea167, config_id=tripleo_step3, tcib_managed=true, vcs-type=git, io.buildah.version=1.33.12, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dda1083e68f30de2da9a23107b96824d'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, distribution-scope=public) Oct 14 04:18:49 localhost systemd[1]: tripleo_rsyslog.service: Main process exited, code=exited, status=1/FAILURE Oct 14 04:18:49 localhost podman[64291]: 2025-10-14 08:18:49.460505027 +0000 UTC m=+0.048634694 container cleanup b9eb3b21ea34563206e7df64a670b1c29ec5b7333233a277ed5563ce6b2a51a0 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, release=1, build-date=2025-07-21T12:58:40, vcs-type=git, version=17.1.9, name=rhosp17/openstack-rsyslog, description=Red Hat OpenStack Platform 17.1 rsyslog, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=38a223d7b691af709e0a5f628409462e34eea167, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 rsyslog, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, batch=17.1_20250721.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dda1083e68f30de2da9a23107b96824d'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 
'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-rsyslog/images/17.1.9-1, container_name=rsyslog, tcib_managed=true, vendor=Red Hat, Inc., config_id=tripleo_step3, architecture=x86_64, com.redhat.component=openstack-rsyslog-container, io.buildah.version=1.33.12) Oct 14 04:18:49 localhost podman[64291]: rsyslog Oct 14 04:18:49 localhost systemd[1]: tripleo_rsyslog.service: Failed with result 'exit-code'. Oct 14 04:18:49 localhost systemd[1]: tripleo_rsyslog.service: Scheduled restart job, restart counter is at 1. Oct 14 04:18:49 localhost systemd[1]: Stopped rsyslog container. Oct 14 04:18:49 localhost systemd[1]: Starting rsyslog container... Oct 14 04:18:49 localhost systemd[1]: Started libcrun container. 
Oct 14 04:18:49 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3f2142f0cf6c862446ab1f2dfb59e029b67fc4461f58078ef3c91fbbb00953f4/merged/var/log/rsyslog supports timestamps until 2038 (0x7fffffff) Oct 14 04:18:49 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3f2142f0cf6c862446ab1f2dfb59e029b67fc4461f58078ef3c91fbbb00953f4/merged/var/lib/rsyslog supports timestamps until 2038 (0x7fffffff) Oct 14 04:18:49 localhost python3[64319]: ansible-file Invoked with path=/var/lib/container-puppet/container-puppet-tasks3.json state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 04:18:49 localhost podman[64320]: 2025-10-14 08:18:49.766262838 +0000 UTC m=+0.128414791 container init b9eb3b21ea34563206e7df64a670b1c29ec5b7333233a277ed5563ce6b2a51a0 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, maintainer=OpenStack TripleO Team, container_name=rsyslog, release=1, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, version=17.1.9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-rsyslog/images/17.1.9-1, io.buildah.version=1.33.12, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 rsyslog, vendor=Red Hat, Inc., batch=17.1_20250721.1, name=rhosp17/openstack-rsyslog, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dda1083e68f30de2da9a23107b96824d'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': 
True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 rsyslog, com.redhat.component=openstack-rsyslog-container, build-date=2025-07-21T12:58:40, managed_by=tripleo_ansible, vcs-ref=38a223d7b691af709e0a5f628409462e34eea167, config_id=tripleo_step3, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog) Oct 14 04:18:49 localhost podman[64320]: 2025-10-14 08:18:49.777393836 +0000 UTC m=+0.139545789 container start b9eb3b21ea34563206e7df64a670b1c29ec5b7333233a277ed5563ce6b2a51a0 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-rsyslog/images/17.1.9-1, vcs-type=git, version=17.1.9, container_name=rsyslog, io.openshift.expose-services=, tcib_managed=true, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-rsyslog, release=1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, architecture=x86_64, managed_by=tripleo_ansible, summary=Red Hat OpenStack 
Platform 17.1 rsyslog, vcs-ref=38a223d7b691af709e0a5f628409462e34eea167, distribution-scope=public, batch=17.1_20250721.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dda1083e68f30de2da9a23107b96824d'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, com.redhat.component=openstack-rsyslog-container, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 rsyslog, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, vendor=Red Hat, Inc., build-date=2025-07-21T12:58:40) Oct 14 04:18:49 localhost podman[64320]: rsyslog Oct 14 04:18:49 localhost systemd[1]: Started rsyslog container. Oct 14 04:18:49 localhost systemd[1]: libpod-b9eb3b21ea34563206e7df64a670b1c29ec5b7333233a277ed5563ce6b2a51a0.scope: Deactivated successfully. 
Oct 14 04:18:49 localhost podman[64340]: 2025-10-14 08:18:49.941803513 +0000 UTC m=+0.046259170 container died b9eb3b21ea34563206e7df64a670b1c29ec5b7333233a277ed5563ce6b2a51a0 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, architecture=x86_64, name=rhosp17/openstack-rsyslog, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-rsyslog/images/17.1.9-1, batch=17.1_20250721.1, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, maintainer=OpenStack TripleO Team, distribution-scope=public, com.redhat.component=openstack-rsyslog-container, description=Red Hat OpenStack Platform 17.1 rsyslog, version=17.1.9, build-date=2025-07-21T12:58:40, summary=Red Hat OpenStack Platform 17.1 rsyslog, io.openshift.expose-services=, release=1, tcib_managed=true, vcs-ref=38a223d7b691af709e0a5f628409462e34eea167, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, container_name=rsyslog, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dda1083e68f30de2da9a23107b96824d'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, config_id=tripleo_step3, vendor=Red Hat, Inc., io.buildah.version=1.33.12) Oct 14 04:18:49 localhost podman[64340]: 2025-10-14 08:18:49.968626532 +0000 UTC m=+0.073082139 container cleanup b9eb3b21ea34563206e7df64a670b1c29ec5b7333233a277ed5563ce6b2a51a0 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, vcs-ref=38a223d7b691af709e0a5f628409462e34eea167, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 rsyslog, name=rhosp17/openstack-rsyslog, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dda1083e68f30de2da9a23107b96824d'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, batch=17.1_20250721.1, maintainer=OpenStack TripleO Team, distribution-scope=public, description=Red 
Hat OpenStack Platform 17.1 rsyslog, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-rsyslog/images/17.1.9-1, com.redhat.license_terms=https://www.redhat.com/agreements, tcib_managed=true, com.redhat.component=openstack-rsyslog-container, container_name=rsyslog, managed_by=tripleo_ansible, release=1, architecture=x86_64, version=17.1.9, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T12:58:40, vcs-type=git) Oct 14 04:18:49 localhost systemd[1]: tripleo_rsyslog.service: Main process exited, code=exited, status=1/FAILURE Oct 14 04:18:50 localhost podman[64353]: 2025-10-14 08:18:50.061967983 +0000 UTC m=+0.064302433 container cleanup b9eb3b21ea34563206e7df64a670b1c29ec5b7333233a277ed5563ce6b2a51a0 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, version=17.1.9, name=rhosp17/openstack-rsyslog, config_id=tripleo_step3, build-date=2025-07-21T12:58:40, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-rsyslog/images/17.1.9-1, com.redhat.component=openstack-rsyslog-container, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, vendor=Red Hat, Inc., architecture=x86_64, description=Red Hat OpenStack Platform 17.1 rsyslog, io.openshift.expose-services=, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 rsyslog, container_name=rsyslog, vcs-ref=38a223d7b691af709e0a5f628409462e34eea167, batch=17.1_20250721.1, managed_by=tripleo_ansible, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, config_data={'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dda1083e68f30de2da9a23107b96824d'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}) Oct 14 04:18:50 localhost podman[64353]: rsyslog Oct 14 04:18:50 localhost systemd[1]: tripleo_rsyslog.service: Failed with result 'exit-code'. Oct 14 04:18:50 localhost systemd[1]: var-lib-containers-storage-overlay-3f2142f0cf6c862446ab1f2dfb59e029b67fc4461f58078ef3c91fbbb00953f4-merged.mount: Deactivated successfully. Oct 14 04:18:50 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b9eb3b21ea34563206e7df64a670b1c29ec5b7333233a277ed5563ce6b2a51a0-userdata-shm.mount: Deactivated successfully. Oct 14 04:18:50 localhost systemd[1]: tripleo_rsyslog.service: Scheduled restart job, restart counter is at 2. Oct 14 04:18:50 localhost systemd[1]: Stopped rsyslog container. Oct 14 04:18:50 localhost systemd[1]: Starting rsyslog container... Oct 14 04:18:50 localhost systemd[1]: Started libcrun container. 
Oct 14 04:18:50 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3f2142f0cf6c862446ab1f2dfb59e029b67fc4461f58078ef3c91fbbb00953f4/merged/var/log/rsyslog supports timestamps until 2038 (0x7fffffff) Oct 14 04:18:50 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3f2142f0cf6c862446ab1f2dfb59e029b67fc4461f58078ef3c91fbbb00953f4/merged/var/lib/rsyslog supports timestamps until 2038 (0x7fffffff) Oct 14 04:18:50 localhost podman[64412]: 2025-10-14 08:18:50.457477175 +0000 UTC m=+0.126930765 container init b9eb3b21ea34563206e7df64a670b1c29ec5b7333233a277ed5563ce6b2a51a0 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step3, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=38a223d7b691af709e0a5f628409462e34eea167, name=rhosp17/openstack-rsyslog, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dda1083e68f30de2da9a23107b96824d'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', 
'/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, batch=17.1_20250721.1, architecture=x86_64, container_name=rsyslog, com.redhat.component=openstack-rsyslog-container, summary=Red Hat OpenStack Platform 17.1 rsyslog, io.buildah.version=1.33.12, managed_by=tripleo_ansible, release=1, vcs-type=git, description=Red Hat OpenStack Platform 17.1 rsyslog, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, version=17.1.9, io.openshift.expose-services=, tcib_managed=true, build-date=2025-07-21T12:58:40, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-rsyslog/images/17.1.9-1) Oct 14 04:18:50 localhost podman[64412]: 2025-10-14 08:18:50.464319198 +0000 UTC m=+0.133772788 container start b9eb3b21ea34563206e7df64a670b1c29ec5b7333233a277ed5563ce6b2a51a0 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, container_name=rsyslog, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dda1083e68f30de2da9a23107b96824d'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, description=Red Hat OpenStack Platform 17.1 rsyslog, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhosp17/openstack-rsyslog, distribution-scope=public, com.redhat.component=openstack-rsyslog-container, vendor=Red Hat, Inc., io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, architecture=x86_64, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 rsyslog, build-date=2025-07-21T12:58:40, managed_by=tripleo_ansible, release=1, vcs-ref=38a223d7b691af709e0a5f628409462e34eea167, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-rsyslog/images/17.1.9-1, version=17.1.9, batch=17.1_20250721.1, maintainer=OpenStack TripleO Team) Oct 14 04:18:50 localhost podman[64412]: rsyslog Oct 14 04:18:50 localhost systemd[1]: Started rsyslog container. Oct 14 04:18:50 localhost systemd[1]: libpod-b9eb3b21ea34563206e7df64a670b1c29ec5b7333233a277ed5563ce6b2a51a0.scope: Deactivated successfully. 
Oct 14 04:18:50 localhost podman[64447]: 2025-10-14 08:18:50.6167614 +0000 UTC m=+0.071879281 container died b9eb3b21ea34563206e7df64a670b1c29ec5b7333233a277ed5563ce6b2a51a0 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, summary=Red Hat OpenStack Platform 17.1 rsyslog, container_name=rsyslog, name=rhosp17/openstack-rsyslog, version=17.1.9, io.buildah.version=1.33.12, com.redhat.component=openstack-rsyslog-container, io.openshift.expose-services=, managed_by=tripleo_ansible, batch=17.1_20250721.1, tcib_managed=true, architecture=x86_64, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-rsyslog/images/17.1.9-1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dda1083e68f30de2da9a23107b96824d'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, vcs-type=git, description=Red Hat OpenStack Platform 17.1 rsyslog, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, io.k8s.description=Red Hat OpenStack 
Platform 17.1 rsyslog, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, release=1, maintainer=OpenStack TripleO Team, build-date=2025-07-21T12:58:40, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=38a223d7b691af709e0a5f628409462e34eea167, config_id=tripleo_step3) Oct 14 04:18:50 localhost podman[64447]: 2025-10-14 08:18:50.641504104 +0000 UTC m=+0.096621915 container cleanup b9eb3b21ea34563206e7df64a670b1c29ec5b7333233a277ed5563ce6b2a51a0 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-rsyslog/images/17.1.9-1, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=rsyslog, managed_by=tripleo_ansible, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, build-date=2025-07-21T12:58:40, com.redhat.component=openstack-rsyslog-container, summary=Red Hat OpenStack Platform 17.1 rsyslog, version=17.1.9, name=rhosp17/openstack-rsyslog, tcib_managed=true, release=1, vcs-ref=38a223d7b691af709e0a5f628409462e34eea167, io.openshift.tags=rhosp osp openstack osp-17.1, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dda1083e68f30de2da9a23107b96824d'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, io.buildah.version=1.33.12, vcs-type=git, io.openshift.expose-services=, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 rsyslog, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, vendor=Red Hat, Inc.) Oct 14 04:18:50 localhost systemd[1]: tripleo_rsyslog.service: Main process exited, code=exited, status=1/FAILURE Oct 14 04:18:50 localhost podman[64475]: 2025-10-14 08:18:50.742266269 +0000 UTC m=+0.065187412 container cleanup b9eb3b21ea34563206e7df64a670b1c29ec5b7333233a277ed5563ce6b2a51a0 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, managed_by=tripleo_ansible, architecture=x86_64, com.redhat.component=openstack-rsyslog-container, summary=Red Hat OpenStack Platform 17.1 rsyslog, distribution-scope=public, build-date=2025-07-21T12:58:40, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, io.openshift.expose-services=, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, maintainer=OpenStack TripleO Team, version=17.1.9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dda1083e68f30de2da9a23107b96824d'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, batch=17.1_20250721.1, container_name=rsyslog, vcs-ref=38a223d7b691af709e0a5f628409462e34eea167, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, release=1, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-rsyslog/images/17.1.9-1, vcs-type=git, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 rsyslog, name=rhosp17/openstack-rsyslog, vendor=Red Hat, Inc.) Oct 14 04:18:50 localhost podman[64475]: rsyslog Oct 14 04:18:50 localhost systemd[1]: tripleo_rsyslog.service: Failed with result 'exit-code'. Oct 14 04:18:50 localhost systemd[1]: tripleo_rsyslog.service: Scheduled restart job, restart counter is at 3. Oct 14 04:18:50 localhost systemd[1]: Stopped rsyslog container. Oct 14 04:18:50 localhost systemd[1]: Starting rsyslog container... Oct 14 04:18:51 localhost systemd[1]: Started libcrun container. 
Oct 14 04:18:51 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3f2142f0cf6c862446ab1f2dfb59e029b67fc4461f58078ef3c91fbbb00953f4/merged/var/log/rsyslog supports timestamps until 2038 (0x7fffffff) Oct 14 04:18:51 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3f2142f0cf6c862446ab1f2dfb59e029b67fc4461f58078ef3c91fbbb00953f4/merged/var/lib/rsyslog supports timestamps until 2038 (0x7fffffff) Oct 14 04:18:51 localhost podman[64504]: 2025-10-14 08:18:51.028773818 +0000 UTC m=+0.116531219 container init b9eb3b21ea34563206e7df64a670b1c29ec5b7333233a277ed5563ce6b2a51a0 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, name=rhosp17/openstack-rsyslog, tcib_managed=true, build-date=2025-07-21T12:58:40, io.openshift.tags=rhosp osp openstack osp-17.1, distribution-scope=public, io.openshift.expose-services=, com.redhat.component=openstack-rsyslog-container, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dda1083e68f30de2da9a23107b96824d'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, vendor=Red Hat, Inc., io.buildah.version=1.33.12, container_name=rsyslog, description=Red Hat OpenStack Platform 17.1 rsyslog, vcs-ref=38a223d7b691af709e0a5f628409462e34eea167, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Red Hat OpenStack Platform 17.1 rsyslog, vcs-type=git, release=1, architecture=x86_64, config_id=tripleo_step3, version=17.1.9, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, managed_by=tripleo_ansible, batch=17.1_20250721.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-rsyslog/images/17.1.9-1) Oct 14 04:18:51 localhost podman[64504]: 2025-10-14 08:18:51.037242543 +0000 UTC m=+0.124999944 container start b9eb3b21ea34563206e7df64a670b1c29ec5b7333233a277ed5563ce6b2a51a0 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, managed_by=tripleo_ansible, batch=17.1_20250721.1, vendor=Red Hat, Inc., com.redhat.component=openstack-rsyslog-container, summary=Red Hat OpenStack Platform 17.1 rsyslog, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, version=17.1.9, io.buildah.version=1.33.12, name=rhosp17/openstack-rsyslog, release=1, vcs-ref=38a223d7b691af709e0a5f628409462e34eea167, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=rsyslog, description=Red Hat OpenStack Platform 17.1 rsyslog, tcib_managed=true, build-date=2025-07-21T12:58:40, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-rsyslog/images/17.1.9-1, architecture=x86_64, config_id=tripleo_step3, distribution-scope=public, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'dda1083e68f30de2da9a23107b96824d'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog) Oct 14 04:18:51 localhost podman[64504]: rsyslog Oct 14 04:18:51 localhost systemd[1]: Started rsyslog container. Oct 14 04:18:51 localhost systemd[1]: libpod-b9eb3b21ea34563206e7df64a670b1c29ec5b7333233a277ed5563ce6b2a51a0.scope: Deactivated successfully. 
Oct 14 04:18:51 localhost podman[64555]: 2025-10-14 08:18:51.210781654 +0000 UTC m=+0.059124992 container died b9eb3b21ea34563206e7df64a670b1c29ec5b7333233a277ed5563ce6b2a51a0 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, vcs-type=git, vendor=Red Hat, Inc., com.redhat.component=openstack-rsyslog-container, container_name=rsyslog, description=Red Hat OpenStack Platform 17.1 rsyslog, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dda1083e68f30de2da9a23107b96824d'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, config_id=tripleo_step3, vcs-ref=38a223d7b691af709e0a5f628409462e34eea167, io.openshift.expose-services=, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-rsyslog/images/17.1.9-1, summary=Red Hat OpenStack Platform 17.1 rsyslog, build-date=2025-07-21T12:58:40, distribution-scope=public, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-rsyslog, batch=17.1_20250721.1, io.buildah.version=1.33.12) Oct 14 04:18:51 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b9eb3b21ea34563206e7df64a670b1c29ec5b7333233a277ed5563ce6b2a51a0-userdata-shm.mount: Deactivated successfully. Oct 14 04:18:51 localhost systemd[1]: var-lib-containers-storage-overlay-3f2142f0cf6c862446ab1f2dfb59e029b67fc4461f58078ef3c91fbbb00953f4-merged.mount: Deactivated successfully. Oct 14 04:18:51 localhost podman[64555]: 2025-10-14 08:18:51.235608381 +0000 UTC m=+0.083951719 container cleanup b9eb3b21ea34563206e7df64a670b1c29ec5b7333233a277ed5563ce6b2a51a0 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, managed_by=tripleo_ansible, distribution-scope=public, vcs-type=git, vendor=Red Hat, Inc., vcs-ref=38a223d7b691af709e0a5f628409462e34eea167, architecture=x86_64, name=rhosp17/openstack-rsyslog, config_id=tripleo_step3, com.redhat.component=openstack-rsyslog-container, description=Red Hat OpenStack Platform 17.1 rsyslog, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, io.buildah.version=1.33.12, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-rsyslog/images/17.1.9-1, summary=Red Hat OpenStack Platform 17.1 rsyslog, build-date=2025-07-21T12:58:40, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, maintainer=OpenStack TripleO Team, batch=17.1_20250721.1, version=17.1.9, container_name=rsyslog, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'dda1083e68f30de2da9a23107b96824d'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}) Oct 14 04:18:51 localhost systemd[1]: tripleo_rsyslog.service: Main process exited, code=exited, status=1/FAILURE Oct 14 04:18:51 localhost podman[64572]: 2025-10-14 08:18:51.329993166 +0000 UTC m=+0.062748875 container cleanup b9eb3b21ea34563206e7df64a670b1c29ec5b7333233a277ed5563ce6b2a51a0 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, maintainer=OpenStack TripleO Team, container_name=rsyslog, build-date=2025-07-21T12:58:40, release=1, vcs-ref=38a223d7b691af709e0a5f628409462e34eea167, config_id=tripleo_step3, io.buildah.version=1.33.12, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 rsyslog, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, vendor=Red Hat, Inc., batch=17.1_20250721.1, name=rhosp17/openstack-rsyslog, distribution-scope=public, architecture=x86_64, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-rsyslog/images/17.1.9-1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dda1083e68f30de2da9a23107b96824d'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, description=Red Hat OpenStack Platform 17.1 rsyslog, tcib_managed=true, version=17.1.9, com.redhat.component=openstack-rsyslog-container, managed_by=tripleo_ansible) Oct 14 04:18:51 localhost podman[64572]: rsyslog Oct 14 04:18:51 localhost systemd[1]: tripleo_rsyslog.service: Failed with result 'exit-code'. 
Oct 14 04:18:51 localhost python3[64566]: ansible-container_puppet_config Invoked with check_mode=False config_vol_prefix=/var/lib/config-data debug=True net_host=True no_archive=True puppet_config=/var/lib/container-puppet/container-puppet-tasks3.json short_hostname=np0005486733 step=3 update_config_hash_only=False Oct 14 04:18:51 localhost systemd[1]: tripleo_rsyslog.service: Scheduled restart job, restart counter is at 4. Oct 14 04:18:51 localhost systemd[1]: Stopped rsyslog container. Oct 14 04:18:51 localhost systemd[1]: Starting rsyslog container... Oct 14 04:18:51 localhost systemd[1]: Started libcrun container. Oct 14 04:18:51 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3f2142f0cf6c862446ab1f2dfb59e029b67fc4461f58078ef3c91fbbb00953f4/merged/var/log/rsyslog supports timestamps until 2038 (0x7fffffff) Oct 14 04:18:51 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3f2142f0cf6c862446ab1f2dfb59e029b67fc4461f58078ef3c91fbbb00953f4/merged/var/lib/rsyslog supports timestamps until 2038 (0x7fffffff) Oct 14 04:18:51 localhost podman[64598]: 2025-10-14 08:18:51.796168808 +0000 UTC m=+0.131440224 container init b9eb3b21ea34563206e7df64a670b1c29ec5b7333233a277ed5563ce6b2a51a0 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, vcs-ref=38a223d7b691af709e0a5f628409462e34eea167, io.buildah.version=1.33.12, release=1, batch=17.1_20250721.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, build-date=2025-07-21T12:58:40, description=Red Hat OpenStack Platform 17.1 rsyslog, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, summary=Red Hat OpenStack Platform 17.1 rsyslog, version=17.1.9, container_name=rsyslog, tcib_managed=true, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., 
distribution-scope=public, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dda1083e68f30de2da9a23107b96824d'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, managed_by=tripleo_ansible, vcs-type=git, com.redhat.component=openstack-rsyslog-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-rsyslog/images/17.1.9-1, name=rhosp17/openstack-rsyslog) Oct 14 04:18:51 localhost podman[64598]: 2025-10-14 08:18:51.805828641 +0000 UTC m=+0.141100007 container start b9eb3b21ea34563206e7df64a670b1c29ec5b7333233a277ed5563ce6b2a51a0 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, batch=17.1_20250721.1, build-date=2025-07-21T12:58:40, version=17.1.9, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 
'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dda1083e68f30de2da9a23107b96824d'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, managed_by=tripleo_ansible, vcs-type=git, architecture=x86_64, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-rsyslog/images/17.1.9-1, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 rsyslog, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, summary=Red Hat OpenStack Platform 17.1 rsyslog, name=rhosp17/openstack-rsyslog, io.buildah.version=1.33.12, vcs-ref=38a223d7b691af709e0a5f628409462e34eea167, container_name=rsyslog, io.openshift.expose-services=, com.redhat.component=openstack-rsyslog-container, distribution-scope=public, config_id=tripleo_step3) Oct 14 04:18:51 localhost podman[64598]: rsyslog Oct 14 04:18:51 localhost systemd[1]: Started rsyslog container. 
Oct 14 04:18:51 localhost python3[64611]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 04:18:51 localhost systemd[1]: libpod-b9eb3b21ea34563206e7df64a670b1c29ec5b7333233a277ed5563ce6b2a51a0.scope: Deactivated successfully. Oct 14 04:18:51 localhost podman[64621]: 2025-10-14 08:18:51.952249184 +0000 UTC m=+0.043519443 container died b9eb3b21ea34563206e7df64a670b1c29ec5b7333233a277ed5563ce6b2a51a0 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dda1083e68f30de2da9a23107b96824d'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, 
com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-rsyslog/images/17.1.9-1, batch=17.1_20250721.1, container_name=rsyslog, description=Red Hat OpenStack Platform 17.1 rsyslog, io.buildah.version=1.33.12, managed_by=tripleo_ansible, name=rhosp17/openstack-rsyslog, version=17.1.9, architecture=x86_64, tcib_managed=true, vcs-ref=38a223d7b691af709e0a5f628409462e34eea167, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 rsyslog, distribution-scope=public, com.redhat.component=openstack-rsyslog-container, config_id=tripleo_step3, vendor=Red Hat, Inc., build-date=2025-07-21T12:58:40, release=1) Oct 14 04:18:51 localhost podman[64621]: 2025-10-14 08:18:51.977431262 +0000 UTC m=+0.068701471 container cleanup b9eb3b21ea34563206e7df64a670b1c29ec5b7333233a277ed5563ce6b2a51a0 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-rsyslog/images/17.1.9-1, version=17.1.9, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, vcs-type=git, io.buildah.version=1.33.12, vcs-ref=38a223d7b691af709e0a5f628409462e34eea167, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 rsyslog, maintainer=OpenStack TripleO Team, batch=17.1_20250721.1, container_name=rsyslog, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dda1083e68f30de2da9a23107b96824d'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 
'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, release=1, com.redhat.component=openstack-rsyslog-container, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, architecture=x86_64, build-date=2025-07-21T12:58:40, summary=Red Hat OpenStack Platform 17.1 rsyslog, managed_by=tripleo_ansible, tcib_managed=true, name=rhosp17/openstack-rsyslog, vendor=Red Hat, Inc.) 
Oct 14 04:18:51 localhost systemd[1]: tripleo_rsyslog.service: Main process exited, code=exited, status=1/FAILURE Oct 14 04:18:52 localhost podman[64639]: 2025-10-14 08:18:52.065600933 +0000 UTC m=+0.059866546 container cleanup b9eb3b21ea34563206e7df64a670b1c29ec5b7333233a277ed5563ce6b2a51a0 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, managed_by=tripleo_ansible, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, com.redhat.component=openstack-rsyslog-container, architecture=x86_64, batch=17.1_20250721.1, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-rsyslog/images/17.1.9-1, vendor=Red Hat, Inc., container_name=rsyslog, release=1, vcs-ref=38a223d7b691af709e0a5f628409462e34eea167, description=Red Hat OpenStack Platform 17.1 rsyslog, distribution-scope=public, name=rhosp17/openstack-rsyslog, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, maintainer=OpenStack TripleO Team, build-date=2025-07-21T12:58:40, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dda1083e68f30de2da9a23107b96824d'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, summary=Red Hat OpenStack Platform 17.1 rsyslog, tcib_managed=true) Oct 14 04:18:52 localhost podman[64639]: rsyslog Oct 14 04:18:52 localhost systemd[1]: tripleo_rsyslog.service: Failed with result 'exit-code'. Oct 14 04:18:52 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b9eb3b21ea34563206e7df64a670b1c29ec5b7333233a277ed5563ce6b2a51a0-userdata-shm.mount: Deactivated successfully. Oct 14 04:18:52 localhost python3[64659]: ansible-container_config_data Invoked with config_path=/var/lib/tripleo-config/container-puppet-config/step_3 config_pattern=container-puppet-*.json config_overrides={} debug=True Oct 14 04:18:52 localhost systemd[1]: tripleo_rsyslog.service: Scheduled restart job, restart counter is at 5. Oct 14 04:18:52 localhost systemd[1]: Stopped rsyslog container. Oct 14 04:18:52 localhost systemd[1]: tripleo_rsyslog.service: Start request repeated too quickly. Oct 14 04:18:52 localhost systemd[1]: tripleo_rsyslog.service: Failed with result 'exit-code'. Oct 14 04:18:52 localhost systemd[1]: Failed to start rsyslog container. Oct 14 04:18:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8. 
Oct 14 04:18:53 localhost podman[64660]: 2025-10-14 08:18:53.744125475 +0000 UTC m=+0.090129903 container health_status 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=starting, maintainer=OpenStack TripleO Team, tcib_managed=true, container_name=collectd, com.redhat.component=openstack-collectd-container, vcs-type=git, vendor=Red Hat, Inc., config_id=tripleo_step3, release=2, description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.33.12, architecture=x86_64, batch=17.1_20250721.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, name=rhosp17/openstack-collectd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, build-date=2025-07-21T13:04:03, version=17.1.9) Oct 14 04:18:53 localhost podman[64660]: 2025-10-14 08:18:53.759035731 +0000 UTC m=+0.105040149 container exec_died 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.openshift.expose-services=, config_id=tripleo_step3, release=2, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-07-21T13:04:03, com.redhat.component=openstack-collectd-container, vcs-type=git, vendor=Red Hat, Inc., config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, architecture=x86_64, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.9, distribution-scope=public, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b) Oct 14 04:18:53 localhost systemd[1]: 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8.service: Deactivated successfully. Oct 14 04:18:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea. 
Oct 14 04:18:55 localhost podman[64681]: 2025-10-14 08:18:55.741386714 +0000 UTC m=+0.080862543 container health_status 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, config_id=tripleo_step3, io.openshift.expose-services=, managed_by=tripleo_ansible, name=rhosp17/openstack-iscsid, container_name=iscsid, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, batch=17.1_20250721.1, com.redhat.component=openstack-iscsid-container, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 iscsid, release=1, version=17.1.9, vendor=Red Hat, Inc., build-date=2025-07-21T13:27:15, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1, architecture=x86_64) Oct 14 04:18:55 localhost podman[64681]: 2025-10-14 08:18:55.751240961 +0000 UTC m=+0.090716770 container exec_died 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, managed_by=tripleo_ansible, vendor=Red Hat, Inc., container_name=iscsid, com.redhat.component=openstack-iscsid-container, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, build-date=2025-07-21T13:27:15, com.redhat.license_terms=https://www.redhat.com/agreements, version=17.1.9, tcib_managed=true, vcs-type=git, architecture=x86_64, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, batch=17.1_20250721.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, description=Red Hat OpenStack Platform 17.1 iscsid, release=1, distribution-scope=public, config_id=tripleo_step3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid) Oct 14 04:18:55 localhost systemd[1]: 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea.service: Deactivated successfully. Oct 14 04:19:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6. Oct 14 04:19:08 localhost systemd[1]: tmp-crun.yksYi3.mount: Deactivated successfully. 
Oct 14 04:19:08 localhost podman[64700]: 2025-10-14 08:19:08.74606334 +0000 UTC m=+0.087864221 container health_status 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, vcs-type=git, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, release=1, io.openshift.expose-services=, io.buildah.version=1.33.12, architecture=x86_64, batch=17.1_20250721.1, version=17.1.9, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, config_id=tripleo_step1, managed_by=tripleo_ansible, com.redhat.component=openstack-qdrouterd-container, container_name=metrics_qdr, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, build-date=2025-07-21T13:07:59, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhosp17/openstack-qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1) Oct 14 04:19:08 localhost podman[64700]: 2025-10-14 08:19:08.933177547 +0000 UTC m=+0.274978418 container exec_died 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, tcib_managed=true, batch=17.1_20250721.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1, vcs-type=git, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhosp17/openstack-qdrouterd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, version=17.1.9, com.redhat.component=openstack-qdrouterd-container, io.openshift.expose-services=, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, architecture=x86_64, build-date=2025-07-21T13:07:59, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.33.12, vendor=Red Hat, Inc., managed_by=tripleo_ansible, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1) Oct 14 04:19:08 localhost systemd[1]: 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6.service: Deactivated successfully. Oct 14 04:19:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8. Oct 14 04:19:24 localhost podman[64807]: 2025-10-14 08:19:24.760021124 +0000 UTC m=+0.093733074 container health_status 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.openshift.expose-services=, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, build-date=2025-07-21T13:04:03, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, com.redhat.component=openstack-collectd-container, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, architecture=x86_64, io.buildah.version=1.33.12, release=2, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, vendor=Red Hat, Inc., config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 
'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, summary=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, version=17.1.9, batch=17.1_20250721.1, vcs-type=git, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-collectd, managed_by=tripleo_ansible) Oct 14 04:19:24 localhost podman[64807]: 2025-10-14 08:19:24.771148722 +0000 UTC m=+0.104860662 container exec_died 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.openshift.expose-services=, batch=17.1_20250721.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 
'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, build-date=2025-07-21T13:04:03, release=2, vendor=Red Hat, Inc., tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, io.buildah.version=1.33.12, version=17.1.9, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-collectd, managed_by=tripleo_ansible, container_name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, 
com.redhat.component=openstack-collectd-container) Oct 14 04:19:24 localhost systemd[1]: 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8.service: Deactivated successfully. Oct 14 04:19:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea. Oct 14 04:19:26 localhost podman[64828]: 2025-10-14 08:19:26.749613464 +0000 UTC m=+0.091366282 container health_status 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, release=1, com.redhat.component=openstack-iscsid-container, io.buildah.version=1.33.12, architecture=x86_64, maintainer=OpenStack TripleO Team, 
vcs-type=git, vendor=Red Hat, Inc., container_name=iscsid, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, version=17.1.9, batch=17.1_20250721.1, build-date=2025-07-21T13:27:15, managed_by=tripleo_ansible, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, summary=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public) Oct 14 04:19:26 localhost podman[64828]: 2025-10-14 08:19:26.788141379 +0000 UTC m=+0.129894177 container exec_died 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, release=1, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, container_name=iscsid, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, batch=17.1_20250721.1, name=rhosp17/openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, com.redhat.component=openstack-iscsid-container, io.openshift.expose-services=, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, tcib_managed=true, version=17.1.9, build-date=2025-07-21T13:27:15, distribution-scope=public) Oct 14 04:19:26 localhost systemd[1]: 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea.service: Deactivated successfully. Oct 14 04:19:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6. Oct 14 04:19:39 localhost systemd[1]: tmp-crun.9OFuK2.mount: Deactivated successfully. 
Oct 14 04:19:39 localhost podman[64846]: 2025-10-14 08:19:39.730889297 +0000 UTC m=+0.073776860 container health_status 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., batch=17.1_20250721.1, distribution-scope=public, tcib_managed=true, architecture=x86_64, com.redhat.component=openstack-qdrouterd-container, build-date=2025-07-21T13:07:59, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.33.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, container_name=metrics_qdr, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, io.openshift.expose-services=, vcs-type=git, 
version=17.1.9, name=rhosp17/openstack-qdrouterd, release=1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, config_id=tripleo_step1) Oct 14 04:19:39 localhost podman[64846]: 2025-10-14 08:19:39.941763118 +0000 UTC m=+0.284650671 container exec_died 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, config_id=tripleo_step1, distribution-scope=public, name=rhosp17/openstack-qdrouterd, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, build-date=2025-07-21T13:07:59, description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, io.buildah.version=1.33.12, version=17.1.9, release=1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, architecture=x86_64, maintainer=OpenStack TripleO Team, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, batch=17.1_20250721.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.component=openstack-qdrouterd-container) Oct 14 04:19:39 localhost systemd[1]: 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6.service: Deactivated successfully. Oct 14 04:19:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8. Oct 14 04:19:55 localhost systemd[1]: tmp-crun.q8jjpP.mount: Deactivated successfully. 
Oct 14 04:19:55 localhost podman[64875]: 2025-10-14 08:19:55.741741923 +0000 UTC m=+0.084435394 container health_status 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.9, build-date=2025-07-21T13:04:03, managed_by=tripleo_ansible, com.redhat.component=openstack-collectd-container, config_id=tripleo_step3, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, batch=17.1_20250721.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, vcs-type=git, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', 
'/sys/fs/cgroup:/sys/fs/cgroup:ro']}, name=rhosp17/openstack-collectd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, summary=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., release=2, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=collectd, tcib_managed=true, architecture=x86_64, io.openshift.expose-services=, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd) Oct 14 04:19:55 localhost podman[64875]: 2025-10-14 08:19:55.777279335 +0000 UTC m=+0.119972836 container exec_died 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1, version=17.1.9, batch=17.1_20250721.1, summary=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., release=2, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, container_name=collectd, com.redhat.component=openstack-collectd-container, io.openshift.expose-services=, tcib_managed=true, build-date=2025-07-21T13:04:03, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, distribution-scope=public, managed_by=tripleo_ansible, config_id=tripleo_step3, vcs-type=git, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhosp17/openstack-collectd, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd) Oct 14 04:19:55 localhost systemd[1]: 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8.service: Deactivated successfully. Oct 14 04:19:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea. 
Oct 14 04:19:57 localhost podman[64895]: 2025-10-14 08:19:57.735974576 +0000 UTC m=+0.080353206 container health_status 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, build-date=2025-07-21T13:27:15, maintainer=OpenStack TripleO Team, vcs-type=git, version=17.1.9, com.redhat.component=openstack-iscsid-container, batch=17.1_20250721.1, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1, container_name=iscsid, release=1, summary=Red Hat OpenStack Platform 17.1 iscsid, 
description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, architecture=x86_64, config_id=tripleo_step3, distribution-scope=public, io.openshift.expose-services=) Oct 14 04:19:57 localhost podman[64895]: 2025-10-14 08:19:57.769076972 +0000 UTC m=+0.113455342 container exec_died 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, vcs-type=git, release=1, summary=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.33.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.expose-services=, version=17.1.9, distribution-scope=public, vendor=Red Hat, Inc., 
com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, architecture=x86_64, batch=17.1_20250721.1, container_name=iscsid, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, name=rhosp17/openstack-iscsid, com.redhat.component=openstack-iscsid-container, tcib_managed=true, build-date=2025-07-21T13:27:15, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1) Oct 14 04:19:57 localhost systemd[1]: 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea.service: Deactivated successfully. Oct 14 04:20:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6. Oct 14 04:20:10 localhost podman[64929]: 2025-10-14 08:20:10.620745439 +0000 UTC m=+0.075269823 container health_status 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-07-21T13:07:59, io.buildah.version=1.33.12, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, summary=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1, architecture=x86_64, io.openshift.expose-services=, name=rhosp17/openstack-qdrouterd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, com.redhat.component=openstack-qdrouterd-container, release=1, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step1, version=17.1.9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, container_name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., tcib_managed=true, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed) Oct 14 04:20:10 localhost podman[64929]: 2025-10-14 08:20:10.840275742 +0000 UTC m=+0.294800086 container exec_died 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, name=rhosp17/openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, release=1, vcs-type=git, architecture=x86_64, build-date=2025-07-21T13:07:59, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1, version=17.1.9, distribution-scope=public, managed_by=tripleo_ansible, batch=17.1_20250721.1, io.buildah.version=1.33.12, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., com.redhat.component=openstack-qdrouterd-container, tcib_managed=true, config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr) Oct 14 04:20:10 localhost systemd[1]: 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6.service: Deactivated successfully. Oct 14 04:20:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8. Oct 14 04:20:26 localhost systemd[1]: tmp-crun.Dl9Y9j.mount: Deactivated successfully. 
Oct 14 04:20:26 localhost podman[65019]: 2025-10-14 08:20:26.74260904 +0000 UTC m=+0.086401175 container health_status 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, distribution-scope=public, release=2, container_name=collectd, com.redhat.component=openstack-collectd-container, description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, batch=17.1_20250721.1, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-collectd, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, 
com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, vendor=Red Hat, Inc., version=17.1.9, build-date=2025-07-21T13:04:03, summary=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, io.openshift.expose-services=, vcs-type=git) Oct 14 04:20:26 localhost podman[65019]: 2025-10-14 08:20:26.752174174 +0000 UTC m=+0.095966279 container exec_died 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, version=17.1.9, name=rhosp17/openstack-collectd, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, vendor=Red Hat, Inc., io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, architecture=x86_64, io.openshift.expose-services=, com.redhat.component=openstack-collectd-container, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, container_name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, release=2, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, vcs-type=git, build-date=2025-07-21T13:04:03, description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 
'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, summary=Red Hat OpenStack Platform 17.1 collectd) Oct 14 04:20:26 localhost systemd[1]: 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8.service: Deactivated successfully. Oct 14 04:20:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea. Oct 14 04:20:28 localhost systemd[1]: tmp-crun.QIIzuZ.mount: Deactivated successfully. 
Oct 14 04:20:28 localhost podman[65040]: 2025-10-14 08:20:28.742629224 +0000 UTC m=+0.086121537 container health_status 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.33.12, com.redhat.component=openstack-iscsid-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20250721.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, io.openshift.expose-services=, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, release=1, 
tcib_managed=true, config_id=tripleo_step3, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1, description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.9, architecture=x86_64, name=rhosp17/openstack-iscsid, container_name=iscsid, distribution-scope=public, build-date=2025-07-21T13:27:15) Oct 14 04:20:28 localhost podman[65040]: 2025-10-14 08:20:28.776584507 +0000 UTC m=+0.120076800 container exec_died 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.buildah.version=1.33.12, name=rhosp17/openstack-iscsid, tcib_managed=true, container_name=iscsid, managed_by=tripleo_ansible, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, com.redhat.component=openstack-iscsid-container, distribution-scope=public, batch=17.1_20250721.1, vcs-type=git, vendor=Red Hat, Inc., vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, version=17.1.9, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, release=1, build-date=2025-07-21T13:27:15, io.openshift.tags=rhosp osp openstack osp-17.1) Oct 14 04:20:28 localhost systemd[1]: 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea.service: Deactivated successfully. Oct 14 04:20:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6. Oct 14 04:20:41 localhost podman[65060]: 2025-10-14 08:20:41.751449764 +0000 UTC m=+0.095397442 container health_status 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., vcs-type=git, architecture=x86_64, io.buildah.version=1.33.12, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, release=1, container_name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, tcib_managed=true, managed_by=tripleo_ansible, name=rhosp17/openstack-qdrouterd, config_data={'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, build-date=2025-07-21T13:07:59, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.9) Oct 14 04:20:41 localhost podman[65060]: 2025-10-14 08:20:41.929797991 +0000 UTC m=+0.273745689 container exec_died 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.buildah.version=1.33.12, name=rhosp17/openstack-qdrouterd, version=17.1.9, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, release=1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-07-21T13:07:59, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 
'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vendor=Red Hat, Inc., config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, batch=17.1_20250721.1, com.redhat.component=openstack-qdrouterd-container, vcs-type=git, container_name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed) Oct 14 04:20:41 localhost systemd[1]: 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6.service: Deactivated successfully. Oct 14 04:20:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8. 
Oct 14 04:20:57 localhost podman[65088]: 2025-10-14 08:20:57.713749922 +0000 UTC m=+0.060777638 container health_status 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, description=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-07-21T13:04:03, com.redhat.component=openstack-collectd-container, io.buildah.version=1.33.12, managed_by=tripleo_ansible, architecture=x86_64, batch=17.1_20250721.1, release=2, vcs-type=git, version=17.1.9, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, config_id=tripleo_step3, name=rhosp17/openstack-collectd, 
vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, container_name=collectd, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, io.openshift.expose-services=, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team) Oct 14 04:20:57 localhost podman[65088]: 2025-10-14 08:20:57.751362147 +0000 UTC m=+0.098389813 container exec_died 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, summary=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, com.redhat.component=openstack-collectd-container, name=rhosp17/openstack-collectd, release=2, vcs-type=git, container_name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T13:04:03, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.expose-services=, managed_by=tripleo_ansible, tcib_managed=true, batch=17.1_20250721.1, version=17.1.9) Oct 14 04:20:57 localhost systemd[1]: 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8.service: Deactivated successfully. Oct 14 04:20:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea. 
Oct 14 04:20:59 localhost podman[65108]: 2025-10-14 08:20:59.740812676 +0000 UTC m=+0.083544338 container health_status 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vcs-type=git, container_name=iscsid, name=rhosp17/openstack-iscsid, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, summary=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, distribution-scope=public, managed_by=tripleo_ansible, com.redhat.component=openstack-iscsid-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, build-date=2025-07-21T13:27:15, vendor=Red Hat, Inc., batch=17.1_20250721.1, 
com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team, release=1, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, version=17.1.9, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid) Oct 14 04:20:59 localhost podman[65108]: 2025-10-14 08:20:59.753093914 +0000 UTC m=+0.095825596 container exec_died 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vendor=Red Hat, Inc., version=17.1.9, io.openshift.tags=rhosp osp openstack osp-17.1, batch=17.1_20250721.1, managed_by=tripleo_ansible, release=1, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, architecture=x86_64, config_id=tripleo_step3, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, container_name=iscsid, 
name=rhosp17/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-07-21T13:27:15, vcs-type=git, distribution-scope=public, io.buildah.version=1.33.12, com.redhat.component=openstack-iscsid-container, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team) Oct 14 04:20:59 localhost systemd[1]: 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea.service: Deactivated successfully. Oct 14 04:21:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6. Oct 14 04:21:12 localhost systemd[1]: tmp-crun.2xK7VT.mount: Deactivated successfully. 
Oct 14 04:21:12 localhost podman[65142]: 2025-10-14 08:21:12.184142263 +0000 UTC m=+0.080518655 container health_status 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, version=17.1.9, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1, config_id=tripleo_step1, distribution-scope=public, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhosp17/openstack-qdrouterd, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, tcib_managed=true, architecture=x86_64, container_name=metrics_qdr, build-date=2025-07-21T13:07:59, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.component=openstack-qdrouterd-container, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1) Oct 14 04:21:12 localhost podman[65142]: 2025-10-14 08:21:12.382143315 +0000 UTC m=+0.278519717 container exec_died 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, name=rhosp17/openstack-qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, container_name=metrics_qdr, release=1, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20250721.1, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, com.redhat.component=openstack-qdrouterd-container, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., build-date=2025-07-21T13:07:59, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, io.openshift.expose-services=, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, version=17.1.9) Oct 14 04:21:12 localhost systemd[1]: 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6.service: Deactivated successfully. Oct 14 04:21:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8. Oct 14 04:21:28 localhost systemd[1]: tmp-crun.Ftq3uJ.mount: Deactivated successfully. 
Oct 14 04:21:28 localhost podman[65233]: 2025-10-14 08:21:28.709253168 +0000 UTC m=+0.057349123 container health_status 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, container_name=collectd, version=17.1.9, com.redhat.component=openstack-collectd-container, batch=17.1_20250721.1, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 collectd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-collectd, release=2, build-date=2025-07-21T13:04:03, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1) Oct 14 04:21:28 localhost podman[65233]: 2025-10-14 08:21:28.716762429 +0000 UTC m=+0.064858394 container exec_died 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=openstack-collectd-container, tcib_managed=true, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, build-date=2025-07-21T13:04:03, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, vendor=Red Hat, Inc., container_name=collectd, io.openshift.expose-services=, architecture=x86_64, maintainer=OpenStack TripleO Team, version=17.1.9, release=2, batch=17.1_20250721.1) Oct 14 04:21:28 localhost systemd[1]: 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8.service: Deactivated successfully. Oct 14 04:21:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea. Oct 14 04:21:30 localhost systemd[1]: tmp-crun.PSkJ9Z.mount: Deactivated successfully. 
Oct 14 04:21:30 localhost podman[65254]: 2025-10-14 08:21:30.721661872 +0000 UTC m=+0.069716063 container health_status 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, com.redhat.component=openstack-iscsid-container, distribution-scope=public, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-iscsid, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, container_name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T13:27:15, 
io.buildah.version=1.33.12, release=1, vcs-type=git, io.openshift.expose-services=, version=17.1.9, batch=17.1_20250721.1, managed_by=tripleo_ansible, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1) Oct 14 04:21:30 localhost podman[65254]: 2025-10-14 08:21:30.735097595 +0000 UTC m=+0.083151776 container exec_died 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, name=rhosp17/openstack-iscsid, release=1, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, vcs-type=git, com.redhat.component=openstack-iscsid-container, architecture=x86_64, container_name=iscsid, build-date=2025-07-21T13:27:15, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc., version=17.1.9, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, managed_by=tripleo_ansible, io.openshift.expose-services=) Oct 14 04:21:30 localhost systemd[1]: 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea.service: Deactivated successfully. Oct 14 04:21:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6. Oct 14 04:21:42 localhost podman[65272]: 2025-10-14 08:21:42.725823691 +0000 UTC m=+0.069109114 container health_status 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, io.buildah.version=1.33.12, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, vendor=Red Hat, Inc., distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, version=17.1.9, io.openshift.tags=rhosp osp openstack osp-17.1, architecture=x86_64, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, name=rhosp17/openstack-qdrouterd, build-date=2025-07-21T13:07:59, 
managed_by=tripleo_ansible, release=1, io.openshift.expose-services=, vcs-type=git, container_name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}) Oct 14 04:21:42 localhost podman[65272]: 2025-10-14 08:21:42.940899707 +0000 UTC m=+0.284185110 container exec_died 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, build-date=2025-07-21T13:07:59, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 
'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, name=rhosp17/openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., vcs-type=git, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, io.openshift.expose-services=, container_name=metrics_qdr, architecture=x86_64, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1, batch=17.1_20250721.1, io.buildah.version=1.33.12, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd) Oct 14 04:21:42 localhost systemd[1]: 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6.service: Deactivated successfully. Oct 14 04:21:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8. 
Oct 14 04:21:59 localhost podman[65301]: 2025-10-14 08:21:59.735027307 +0000 UTC m=+0.080417561 container health_status 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, container_name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, version=17.1.9, distribution-scope=public, release=2, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, tcib_managed=true, com.redhat.component=openstack-collectd-container, io.buildah.version=1.33.12, build-date=2025-07-21T13:04:03, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, summary=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20250721.1, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1, architecture=x86_64) Oct 14 04:21:59 localhost podman[65301]: 2025-10-14 08:21:59.748074308 +0000 UTC m=+0.093464522 container exec_died 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, batch=17.1_20250721.1, release=2, build-date=2025-07-21T13:04:03, com.redhat.component=openstack-collectd-container, com.redhat.license_terms=https://www.redhat.com/agreements, version=17.1.9, vendor=Red Hat, Inc., container_name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, distribution-scope=public, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b) Oct 14 04:21:59 localhost systemd[1]: 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8.service: Deactivated successfully. Oct 14 04:22:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea. Oct 14 04:22:01 localhost systemd[1]: tmp-crun.oujH3c.mount: Deactivated successfully. 
Oct 14 04:22:01 localhost podman[65320]: 2025-10-14 08:22:01.73690838 +0000 UTC m=+0.076945586 container health_status 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, version=17.1.9, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, batch=17.1_20250721.1, distribution-scope=public, io.buildah.version=1.33.12, architecture=x86_64, config_id=tripleo_step3, tcib_managed=true, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 iscsid, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, build-date=2025-07-21T13:27:15, com.redhat.component=openstack-iscsid-container, release=1, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1) Oct 14 04:22:01 localhost podman[65320]: 2025-10-14 08:22:01.774001519 +0000 UTC m=+0.114038695 container exec_died 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.buildah.version=1.33.12, container_name=iscsid, maintainer=OpenStack TripleO Team, version=17.1.9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, batch=17.1_20250721.1, build-date=2025-07-21T13:27:15, vendor=Red Hat, Inc., com.redhat.component=openstack-iscsid-container, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, config_id=tripleo_step3, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, vcs-type=git, description=Red Hat OpenStack Platform 17.1 iscsid) Oct 14 04:22:01 localhost systemd[1]: 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea.service: Deactivated successfully. Oct 14 04:22:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6. Oct 14 04:22:13 localhost systemd[1]: tmp-crun.xqjlZv.mount: Deactivated successfully. 
Oct 14 04:22:13 localhost podman[65339]: 2025-10-14 08:22:13.758092822 +0000 UTC m=+0.102661806 container health_status 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.33.12, name=rhosp17/openstack-qdrouterd, vendor=Red Hat, Inc., build-date=2025-07-21T13:07:59, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, managed_by=tripleo_ansible, distribution-scope=public, release=1, container_name=metrics_qdr, maintainer=OpenStack TripleO Team, architecture=x86_64, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, version=17.1.9, config_id=tripleo_step1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20250721.1) Oct 14 04:22:13 localhost podman[65339]: 2025-10-14 08:22:13.964575414 +0000 UTC m=+0.309144408 container exec_died 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.33.12, tcib_managed=true, version=17.1.9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, com.redhat.component=openstack-qdrouterd-container, build-date=2025-07-21T13:07:59, config_id=tripleo_step1, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, managed_by=tripleo_ansible, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, architecture=x86_64, name=rhosp17/openstack-qdrouterd, distribution-scope=public, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, container_name=metrics_qdr, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd) Oct 14 04:22:13 localhost systemd[1]: 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6.service: Deactivated successfully. Oct 14 04:22:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8. Oct 14 04:22:30 localhost systemd[1]: tmp-crun.cvHkof.mount: Deactivated successfully. 
Oct 14 04:22:30 localhost podman[65497]: 2025-10-14 08:22:30.730768977 +0000 UTC m=+0.069869468 container health_status 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, architecture=x86_64, 
batch=17.1_20250721.1, name=rhosp17/openstack-collectd, com.redhat.component=openstack-collectd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.9, io.buildah.version=1.33.12, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, release=2, summary=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-07-21T13:04:03, distribution-scope=public, tcib_managed=true, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, vcs-type=git) Oct 14 04:22:30 localhost podman[65497]: 2025-10-14 08:22:30.740385072 +0000 UTC m=+0.079485573 container exec_died 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, vcs-type=git, com.redhat.component=openstack-collectd-container, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, release=2, version=17.1.9, io.openshift.expose-services=, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=collectd, vendor=Red Hat, Inc., config_id=tripleo_step3, name=rhosp17/openstack-collectd, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 collectd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, architecture=x86_64, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 
'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, build-date=2025-07-21T13:04:03, distribution-scope=public) Oct 14 04:22:30 localhost systemd[1]: 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8.service: Deactivated successfully. Oct 14 04:22:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea. 
Oct 14 04:22:32 localhost podman[65516]: 2025-10-14 08:22:32.748039631 +0000 UTC m=+0.091452971 container health_status 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, config_id=tripleo_step3, tcib_managed=true, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 iscsid, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, summary=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhosp17/openstack-iscsid, release=1, io.openshift.expose-services=, container_name=iscsid, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, io.k8s.description=Red Hat OpenStack 
Platform 17.1 iscsid, vcs-type=git, vendor=Red Hat, Inc., batch=17.1_20250721.1, architecture=x86_64, com.redhat.component=openstack-iscsid-container, maintainer=OpenStack TripleO Team, version=17.1.9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-07-21T13:27:15, io.buildah.version=1.33.12) Oct 14 04:22:32 localhost podman[65516]: 2025-10-14 08:22:32.758579374 +0000 UTC m=+0.101992704 container exec_died 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, version=17.1.9, name=rhosp17/openstack-iscsid, batch=17.1_20250721.1, io.openshift.tags=rhosp osp openstack osp-17.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, 
vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, summary=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-07-21T13:27:15, architecture=x86_64, container_name=iscsid, release=1, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, tcib_managed=true, vcs-type=git, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, com.redhat.component=openstack-iscsid-container, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 iscsid) Oct 14 04:22:32 localhost systemd[1]: 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea.service: Deactivated successfully. Oct 14 04:22:41 localhost ceph-osd[31500]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Oct 14 04:22:41 localhost ceph-osd[31500]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 1800.1 total, 600.0 interval#012Cumulative writes: 4460 writes, 20K keys, 4460 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s#012Cumulative WAL: 4460 writes, 458 syncs, 9.74 writes per sync, written: 0.02 GB, 0.01 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 340 writes, 780 keys, 340 commit groups, 1.0 writes per commit group, ingest: 0.60 MB, 0.00 MB/s#012Interval WAL: 340 writes, 168 syncs, 2.02 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent Oct 14 04:22:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6. 
Oct 14 04:22:44 localhost podman[65536]: 2025-10-14 08:22:44.736817217 +0000 UTC m=+0.082347890 container health_status 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, architecture=x86_64, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, container_name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-qdrouterd, build-date=2025-07-21T13:07:59, com.redhat.component=openstack-qdrouterd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, vcs-type=git, 
io.openshift.expose-services=, io.buildah.version=1.33.12, version=17.1.9, tcib_managed=true, vendor=Red Hat, Inc., batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, release=1) Oct 14 04:22:44 localhost podman[65536]: 2025-10-14 08:22:44.945787677 +0000 UTC m=+0.291318340 container exec_died 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, release=1, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-qdrouterd, version=17.1.9, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-07-21T13:07:59, summary=Red 
Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, io.buildah.version=1.33.12, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=metrics_qdr, io.openshift.expose-services=, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, tcib_managed=true, distribution-scope=public, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, vendor=Red Hat, Inc.) Oct 14 04:22:44 localhost systemd[1]: 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6.service: Deactivated successfully. Oct 14 04:22:45 localhost ceph-osd[32440]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Oct 14 04:22:45 localhost ceph-osd[32440]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 1800.1 total, 600.0 interval#012Cumulative writes: 5034 writes, 22K keys, 5034 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s#012Cumulative WAL: 5034 writes, 570 syncs, 8.83 writes per sync, written: 0.02 GB, 0.01 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 382 writes, 802 keys, 382 commit groups, 1.0 writes per commit group, ingest: 0.54 MB, 0.00 MB/s#012Interval WAL: 382 writes, 190 syncs, 2.01 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent Oct 14 04:22:51 localhost python3[65612]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/config_step.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Oct 14 04:22:52 localhost python3[65657]: ansible-ansible.legacy.copy Invoked with dest=/etc/puppet/hieradata/config_step.json force=True mode=0600 
src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1760430171.229459-108137-849072837942/source _original_basename=tmp0rcv4t1o follow=False checksum=ee48fb03297eb703b1954c8852d0f67fab51dac1 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 04:22:53 localhost python3[65719]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/recover_tripleo_nova_virtqemud.sh follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Oct 14 04:22:53 localhost python3[65762]: ansible-ansible.legacy.copy Invoked with dest=/usr/libexec/recover_tripleo_nova_virtqemud.sh mode=0755 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1760430173.0762224-108290-55041237216335/source _original_basename=tmp16ig2ue5 follow=False checksum=922b8aa8342176110bffc2e39abdccc2b39e53a9 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 04:22:54 localhost python3[65824]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud_recover.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Oct 14 04:22:54 localhost python3[65867]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/tripleo_nova_virtqemud_recover.service mode=0644 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1760430174.1130922-108441-56145526243383/source _original_basename=tmpoveeiagh follow=False checksum=92f73544b703afc85885fa63ab07bdf8f8671554 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None 
selevel=None setype=None attributes=None Oct 14 04:22:55 localhost python3[65929]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud_recover.timer follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Oct 14 04:22:55 localhost python3[65972]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/tripleo_nova_virtqemud_recover.timer mode=0644 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1760430175.0707817-108498-176155615533957/source _original_basename=tmp6joj0twg follow=False checksum=c6e5f76a53c0d6ccaf46c4b48d813dc2891ad8e9 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 04:22:56 localhost python3[66002]: ansible-systemd Invoked with daemon_reload=True enabled=True name=tripleo_nova_virtqemud_recover.service daemon_reexec=False scope=system no_block=False state=None force=None masked=None Oct 14 04:22:56 localhost systemd[1]: Reloading. Oct 14 04:22:56 localhost systemd-rc-local-generator[66030]: /etc/rc.d/rc.local is not marked executable, skipping. Oct 14 04:22:56 localhost systemd-sysv-generator[66033]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Oct 14 04:22:56 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Oct 14 04:22:57 localhost systemd[1]: Reloading. Oct 14 04:22:57 localhost systemd-sysv-generator[66067]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. 
Please update package to include a native systemd unit file, in order to make it more safe and robust. Oct 14 04:22:57 localhost systemd-rc-local-generator[66063]: /etc/rc.d/rc.local is not marked executable, skipping. Oct 14 04:22:57 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Oct 14 04:22:57 localhost python3[66092]: ansible-systemd Invoked with daemon_reload=True enabled=True name=tripleo_nova_virtqemud_recover.timer state=restarted daemon_reexec=False scope=system no_block=False force=None masked=None Oct 14 04:22:57 localhost systemd[1]: Reloading. Oct 14 04:22:58 localhost systemd-rc-local-generator[66116]: /etc/rc.d/rc.local is not marked executable, skipping. Oct 14 04:22:58 localhost systemd-sysv-generator[66121]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Oct 14 04:22:58 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Oct 14 04:22:58 localhost systemd[1]: Reloading. Oct 14 04:22:58 localhost systemd-rc-local-generator[66154]: /etc/rc.d/rc.local is not marked executable, skipping. Oct 14 04:22:58 localhost systemd-sysv-generator[66158]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Oct 14 04:22:58 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Oct 14 04:22:58 localhost systemd[1]: Started Check and recover tripleo_nova_virtqemud every 10m. Oct 14 04:22:59 localhost python3[66184]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl enable --now tripleo_nova_virtqemud_recover.timer _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Oct 14 04:22:59 localhost systemd[1]: Reloading. Oct 14 04:22:59 localhost systemd-rc-local-generator[66212]: /etc/rc.d/rc.local is not marked executable, skipping. Oct 14 04:22:59 localhost systemd-sysv-generator[66216]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Oct 14 04:22:59 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Oct 14 04:22:59 localhost python3[66268]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/tripleo_nova_libvirt.target follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Oct 14 04:23:00 localhost python3[66311]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/tripleo_nova_libvirt.target group=root mode=0644 owner=root src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1760430179.4899876-108596-46640591704334/source _original_basename=tmptx6wk9i9 follow=False checksum=c064b4a8e7d3d1d7c62d1f80a09e350659996afd backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 04:23:00 localhost python3[66341]: ansible-systemd Invoked with daemon_reload=True enabled=True name=tripleo_nova_libvirt.target state=restarted daemon_reexec=False scope=system no_block=False force=None masked=None Oct 14 04:23:00 localhost systemd[1]: Reloading. Oct 14 04:23:00 localhost systemd-rc-local-generator[66367]: /etc/rc.d/rc.local is not marked executable, skipping. Oct 14 04:23:00 localhost systemd-sysv-generator[66372]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Oct 14 04:23:00 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Oct 14 04:23:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8. Oct 14 04:23:01 localhost systemd[1]: Reached target tripleo_nova_libvirt.target. 
Oct 14 04:23:01 localhost podman[66379]: 2025-10-14 08:23:01.211845486 +0000 UTC m=+0.097077823 container health_status 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, architecture=x86_64, io.buildah.version=1.33.12, container_name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-07-21T13:04:03, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, release=2, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1, config_id=tripleo_step3, tcib_managed=true, com.redhat.component=openstack-collectd-container, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, batch=17.1_20250721.1, name=rhosp17/openstack-collectd, version=17.1.9) Oct 14 04:23:01 localhost podman[66379]: 2025-10-14 08:23:01.22759755 +0000 UTC m=+0.112829867 container exec_died 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, com.redhat.component=openstack-collectd-container, tcib_managed=true, vcs-type=git, distribution-scope=public, managed_by=tripleo_ansible, batch=17.1_20250721.1, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, io.buildah.version=1.33.12, build-date=2025-07-21T13:04:03, architecture=x86_64, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, release=2, vendor=Red Hat, Inc., io.openshift.expose-services=, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, version=17.1.9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/agreements) Oct 14 04:23:01 localhost systemd[1]: 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8.service: Deactivated successfully. Oct 14 04:23:01 localhost python3[66417]: ansible-stat Invoked with path=/var/lib/tripleo-config/container-startup-config/step_4 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Oct 14 04:23:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea. Oct 14 04:23:03 localhost systemd[1]: tmp-crun.GxqXSO.mount: Deactivated successfully. 
Oct 14 04:23:03 localhost podman[66590]: 2025-10-14 08:23:03.239843369 +0000 UTC m=+0.095268757 container health_status 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, architecture=x86_64, com.redhat.component=openstack-iscsid-container, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, version=17.1.9, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-iscsid, release=1, summary=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc., io.buildah.version=1.33.12, build-date=2025-07-21T13:27:15, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, io.openshift.expose-services=, 
vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, tcib_managed=true, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/agreements) Oct 14 04:23:03 localhost podman[66590]: 2025-10-14 08:23:03.272554194 +0000 UTC m=+0.127979582 container exec_died 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1, architecture=x86_64, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, version=17.1.9, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, 
managed_by=tripleo_ansible, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, release=1, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, name=rhosp17/openstack-iscsid, com.redhat.component=openstack-iscsid-container, vendor=Red Hat, Inc., io.buildah.version=1.33.12, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, build-date=2025-07-21T13:27:15, batch=17.1_20250721.1, container_name=iscsid, com.redhat.license_terms=https://www.redhat.com/agreements) Oct 14 04:23:03 localhost systemd[1]: 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea.service: Deactivated successfully. Oct 14 04:23:03 localhost ansible-async_wrapper.py[66589]: Invoked with 888329571133 3600 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1760430182.7515893-108709-196188699161659/AnsiballZ_command.py _ Oct 14 04:23:03 localhost ansible-async_wrapper.py[66611]: Starting module and watcher Oct 14 04:23:03 localhost ansible-async_wrapper.py[66611]: Start watching 66612 (3600) Oct 14 04:23:03 localhost ansible-async_wrapper.py[66612]: Start module (66612) Oct 14 04:23:03 localhost ansible-async_wrapper.py[66589]: Return async_wrapper task started. Oct 14 04:23:03 localhost python3[66632]: ansible-ansible.legacy.async_status Invoked with jid=888329571133.66589 mode=status _async_dir=/tmp/.ansible_async Oct 14 04:23:06 localhost puppet-user[66631]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5 Oct 14 04:23:06 localhost puppet-user[66631]: (file: /etc/puppet/hiera.yaml) Oct 14 04:23:06 localhost puppet-user[66631]: Warning: Undefined variable '::deploy_config_name'; Oct 14 04:23:06 localhost puppet-user[66631]: (file & line not available) Oct 14 04:23:07 localhost puppet-user[66631]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. 
See https://puppet.com/docs/puppet/7.10/deprecated_language.html Oct 14 04:23:07 localhost puppet-user[66631]: (file & line not available) Oct 14 04:23:07 localhost puppet-user[66631]: Warning: Unknown variable: '::deployment_type'. (file: /etc/puppet/modules/tripleo/manifests/profile/base/database/mysql/client.pp, line: 89, column: 8) Oct 14 04:23:07 localhost puppet-user[66631]: Warning: This method is deprecated, please use match expressions with Stdlib::Compat::String instead. They are described at https://docs.puppet.com/puppet/latest/reference/lang_data_type.html#match-expressions. at ["/etc/puppet/modules/snmp/manifests/params.pp", 310]:["/var/lib/tripleo-config/puppet_step_config.pp", 4] Oct 14 04:23:07 localhost puppet-user[66631]: (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation') Oct 14 04:23:07 localhost puppet-user[66631]: Warning: This method is deprecated, please use the stdlib validate_legacy function, Oct 14 04:23:07 localhost puppet-user[66631]: with Stdlib::Compat::Bool. There is further documentation for validate_legacy function in the README. at ["/etc/puppet/modules/snmp/manifests/init.pp", 358]:["/var/lib/tripleo-config/puppet_step_config.pp", 4] Oct 14 04:23:07 localhost puppet-user[66631]: (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation') Oct 14 04:23:07 localhost puppet-user[66631]: Warning: This method is deprecated, please use the stdlib validate_legacy function, Oct 14 04:23:07 localhost puppet-user[66631]: with Stdlib::Compat::Array. There is further documentation for validate_legacy function in the README. 
at ["/etc/puppet/modules/snmp/manifests/init.pp", 367]:["/var/lib/tripleo-config/puppet_step_config.pp", 4] Oct 14 04:23:07 localhost puppet-user[66631]: (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation') Oct 14 04:23:07 localhost puppet-user[66631]: Warning: This method is deprecated, please use the stdlib validate_legacy function, Oct 14 04:23:07 localhost puppet-user[66631]: with Stdlib::Compat::String. There is further documentation for validate_legacy function in the README. at ["/etc/puppet/modules/snmp/manifests/init.pp", 382]:["/var/lib/tripleo-config/puppet_step_config.pp", 4] Oct 14 04:23:07 localhost puppet-user[66631]: (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation') Oct 14 04:23:07 localhost puppet-user[66631]: Warning: This method is deprecated, please use the stdlib validate_legacy function, Oct 14 04:23:07 localhost puppet-user[66631]: with Stdlib::Compat::Numeric. There is further documentation for validate_legacy function in the README. at ["/etc/puppet/modules/snmp/manifests/init.pp", 388]:["/var/lib/tripleo-config/puppet_step_config.pp", 4] Oct 14 04:23:07 localhost puppet-user[66631]: (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation') Oct 14 04:23:07 localhost puppet-user[66631]: Warning: This method is deprecated, please use the stdlib validate_legacy function, Oct 14 04:23:07 localhost puppet-user[66631]: with Pattern[]. There is further documentation for validate_legacy function in the README. at ["/etc/puppet/modules/snmp/manifests/init.pp", 393]:["/var/lib/tripleo-config/puppet_step_config.pp", 4] Oct 14 04:23:07 localhost puppet-user[66631]: (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation') Oct 14 04:23:07 localhost puppet-user[66631]: Warning: Unknown variable: '::deployment_type'. 
(file: /etc/puppet/modules/tripleo/manifests/packages.pp, line: 39, column: 69) Oct 14 04:23:07 localhost puppet-user[66631]: Notice: Compiled catalog for np0005486733.localdomain in environment production in 0.21 seconds Oct 14 04:23:08 localhost ansible-async_wrapper.py[66611]: 66612 still running (3600) Oct 14 04:23:13 localhost ansible-async_wrapper.py[66611]: 66612 still running (3595) Oct 14 04:23:13 localhost python3[66832]: ansible-ansible.legacy.async_status Invoked with jid=888329571133.66589 mode=status _async_dir=/tmp/.ansible_async Oct 14 04:23:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6. Oct 14 04:23:15 localhost podman[66836]: 2025-10-14 08:23:15.279521678 +0000 UTC m=+0.100071985 container health_status 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, version=17.1.9, managed_by=tripleo_ansible, batch=17.1_20250721.1, tcib_managed=true, architecture=x86_64, com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.33.12, build-date=2025-07-21T13:07:59, config_id=tripleo_step1, name=rhosp17/openstack-qdrouterd, container_name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 
'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, summary=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd) Oct 14 04:23:15 localhost systemd[1]: Started /usr/bin/systemctl start man-db-cache-update. Oct 14 04:23:15 localhost systemd[1]: Starting man-db-cache-update.service... Oct 14 04:23:15 localhost systemd[1]: Reloading. 
Oct 14 04:23:15 localhost podman[66836]: 2025-10-14 08:23:15.46778508 +0000 UTC m=+0.288335387 container exec_died 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vcs-type=git, io.openshift.expose-services=, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, build-date=2025-07-21T13:07:59, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, com.redhat.license_terms=https://www.redhat.com/agreements, release=1, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', 
'/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, batch=17.1_20250721.1, io.buildah.version=1.33.12, vendor=Red Hat, Inc., com.redhat.component=openstack-qdrouterd-container, version=17.1.9, distribution-scope=public, managed_by=tripleo_ansible, tcib_managed=true) Oct 14 04:23:15 localhost systemd-rc-local-generator[66939]: /etc/rc.d/rc.local is not marked executable, skipping. Oct 14 04:23:15 localhost systemd-sysv-generator[66945]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Oct 14 04:23:15 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Oct 14 04:23:15 localhost systemd[1]: 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6.service: Deactivated successfully. Oct 14 04:23:15 localhost systemd[1]: Queuing reload/restart jobs for marked units… Oct 14 04:23:16 localhost systemd[1]: man-db-cache-update.service: Deactivated successfully. Oct 14 04:23:16 localhost systemd[1]: Finished man-db-cache-update.service. Oct 14 04:23:16 localhost systemd[1]: man-db-cache-update.service: Consumed 1.283s CPU time. Oct 14 04:23:16 localhost systemd[1]: run-r38aba43167ba44fca2a9c54960528dad.service: Deactivated successfully. 
Oct 14 04:23:17 localhost puppet-user[66631]: Notice: /Stage[main]/Snmp/Package[snmpd]/ensure: created Oct 14 04:23:17 localhost puppet-user[66631]: Notice: /Stage[main]/Snmp/File[snmpd.conf]/content: content changed '{sha256}2b743f970e80e2150759bfc66f2d8d0fbd8b31624f79e2991248d1a5ac57494e' to '{sha256}1dfae865846e2643dfb62a1f8b0d38724087c54d0c8cd78450bd8701f86240ac' Oct 14 04:23:17 localhost puppet-user[66631]: Notice: /Stage[main]/Snmp/File[snmpd.sysconfig]/content: content changed '{sha256}b63afb2dee7419b6834471f88581d981c8ae5c8b27b9d329ba67a02f3ddd8221' to '{sha256}3917ee8bbc680ad50d77186ad4a1d2705c2025c32fc32f823abbda7f2328dfbd' Oct 14 04:23:17 localhost puppet-user[66631]: Notice: /Stage[main]/Snmp/File[snmptrapd.conf]/content: content changed '{sha256}2e1ca894d609ef337b6243909bf5623c87fd5df98ecbd00c7d4c12cf12f03c4e' to '{sha256}3ecf18da1ba84ea3932607f2b903ee6a038b6f9ac4e1e371e48f3ef61c5052ea' Oct 14 04:23:17 localhost puppet-user[66631]: Notice: /Stage[main]/Snmp/File[snmptrapd.sysconfig]/content: content changed '{sha256}86ee5797ad10cb1ea0f631e9dfa6ae278ecf4f4d16f4c80f831cdde45601b23c' to '{sha256}2244553364afcca151958f8e2003e4c182f5e2ecfbe55405cec73fd818581e97' Oct 14 04:23:17 localhost puppet-user[66631]: Notice: /Stage[main]/Snmp/Service[snmptrapd]: Triggered 'refresh' from 2 events Oct 14 04:23:18 localhost ansible-async_wrapper.py[66611]: 66612 still running (3590) Oct 14 04:23:22 localhost puppet-user[66631]: Notice: /Stage[main]/Tripleo::Profile::Base::Snmp/Snmp::Snmpv3_user[ro_snmp_user]/Exec[create-snmpv3-user-ro_snmp_user]/returns: executed successfully Oct 14 04:23:22 localhost systemd[1]: Reloading. Oct 14 04:23:22 localhost systemd-rc-local-generator[67993]: /etc/rc.d/rc.local is not marked executable, skipping. Oct 14 04:23:22 localhost systemd-sysv-generator[67997]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. 
Please update package to include a native systemd unit file, in order to make it more safe and robust. Oct 14 04:23:22 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Oct 14 04:23:22 localhost systemd[1]: Starting Simple Network Management Protocol (SNMP) Daemon.... Oct 14 04:23:22 localhost snmpd[68005]: Can't find directory of RPM packages Oct 14 04:23:22 localhost snmpd[68005]: Duplicate IPv4 address detected, some interfaces may not be visible in IP-MIB Oct 14 04:23:22 localhost systemd[1]: Started Simple Network Management Protocol (SNMP) Daemon.. Oct 14 04:23:23 localhost systemd[1]: Reloading. Oct 14 04:23:23 localhost systemd-sysv-generator[68062]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Oct 14 04:23:23 localhost systemd-rc-local-generator[68058]: /etc/rc.d/rc.local is not marked executable, skipping. Oct 14 04:23:23 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Oct 14 04:23:23 localhost ansible-async_wrapper.py[66611]: 66612 still running (3585) Oct 14 04:23:23 localhost systemd[1]: Reloading. Oct 14 04:23:23 localhost systemd-rc-local-generator[68098]: /etc/rc.d/rc.local is not marked executable, skipping. Oct 14 04:23:23 localhost systemd-sysv-generator[68101]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. 
Oct 14 04:23:23 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Oct 14 04:23:23 localhost puppet-user[66631]: Notice: /Stage[main]/Snmp/Service[snmpd]/ensure: ensure changed 'stopped' to 'running' Oct 14 04:23:23 localhost puppet-user[66631]: Notice: Applied catalog in 16.45 seconds Oct 14 04:23:23 localhost puppet-user[66631]: Application: Oct 14 04:23:23 localhost puppet-user[66631]: Initial environment: production Oct 14 04:23:23 localhost puppet-user[66631]: Converged environment: production Oct 14 04:23:23 localhost puppet-user[66631]: Run mode: user Oct 14 04:23:23 localhost puppet-user[66631]: Changes: Oct 14 04:23:23 localhost puppet-user[66631]: Total: 8 Oct 14 04:23:23 localhost puppet-user[66631]: Events: Oct 14 04:23:23 localhost puppet-user[66631]: Success: 8 Oct 14 04:23:23 localhost puppet-user[66631]: Total: 8 Oct 14 04:23:23 localhost puppet-user[66631]: Resources: Oct 14 04:23:23 localhost puppet-user[66631]: Restarted: 1 Oct 14 04:23:23 localhost puppet-user[66631]: Changed: 8 Oct 14 04:23:23 localhost puppet-user[66631]: Out of sync: 8 Oct 14 04:23:23 localhost puppet-user[66631]: Total: 19 Oct 14 04:23:23 localhost puppet-user[66631]: Time: Oct 14 04:23:23 localhost puppet-user[66631]: Filebucket: 0.00 Oct 14 04:23:23 localhost puppet-user[66631]: Schedule: 0.00 Oct 14 04:23:23 localhost puppet-user[66631]: Augeas: 0.01 Oct 14 04:23:23 localhost puppet-user[66631]: File: 0.09 Oct 14 04:23:23 localhost puppet-user[66631]: Config retrieval: 0.27 Oct 14 04:23:23 localhost puppet-user[66631]: Service: 1.30 Oct 14 04:23:23 localhost puppet-user[66631]: Transaction evaluation: 16.44 Oct 14 04:23:23 localhost puppet-user[66631]: Catalog application: 16.45 Oct 14 04:23:23 localhost puppet-user[66631]: Last run: 1760430203 Oct 14 04:23:23 localhost puppet-user[66631]: Exec: 5.06 Oct 14 04:23:23 localhost 
puppet-user[66631]: Package: 9.83
Oct 14 04:23:23 localhost puppet-user[66631]: Total: 16.45
Oct 14 04:23:23 localhost puppet-user[66631]: Version:
Oct 14 04:23:23 localhost puppet-user[66631]: Config: 1760430186
Oct 14 04:23:23 localhost puppet-user[66631]: Puppet: 7.10.0
Oct 14 04:23:23 localhost ansible-async_wrapper.py[66612]: Module complete (66612)
Oct 14 04:23:24 localhost python3[68173]: ansible-ansible.legacy.async_status Invoked with jid=888329571133.66589 mode=status _async_dir=/tmp/.ansible_async
Oct 14 04:23:24 localhost python3[68239]: ansible-file Invoked with path=/var/lib/container-puppet/puppetlabs state=directory setype=svirt_sandbox_file_t selevel=s0 recurse=True force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None
Oct 14 04:23:25 localhost python3[68275]: ansible-stat Invoked with path=/var/lib/container-puppet/puppetlabs/facter.conf follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 14 04:23:25 localhost podman[68304]:
Oct 14 04:23:25 localhost podman[68304]: 2025-10-14 08:23:25.332261273 +0000 UTC m=+0.063126719 container create 3c349f6608085bee947a60ebdb5e56c0135963845a13330799c830868f18cc72 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=objective_thompson, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, CEPH_POINT_RELEASE=, name=rhceph, GIT_CLEAN=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, RELEASE=main, release=553, version=7, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, architecture=x86_64, 
com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, build-date=2025-09-24T08:57:55, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux ) Oct 14 04:23:25 localhost systemd[1]: Started libpod-conmon-3c349f6608085bee947a60ebdb5e56c0135963845a13330799c830868f18cc72.scope. Oct 14 04:23:25 localhost systemd[1]: Started libcrun container. Oct 14 04:23:25 localhost podman[68304]: 2025-10-14 08:23:25.402046077 +0000 UTC m=+0.132911523 container init 3c349f6608085bee947a60ebdb5e56c0135963845a13330799c830868f18cc72 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=objective_thompson, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, release=553, version=7, name=rhceph, maintainer=Guillaume Abrioux , build-date=2025-09-24T08:57:55, description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, GIT_BRANCH=main, com.redhat.component=rhceph-container, distribution-scope=public, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.33.12, ceph=True, 
vcs-type=git) Oct 14 04:23:25 localhost podman[68304]: 2025-10-14 08:23:25.302774018 +0000 UTC m=+0.033639494 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Oct 14 04:23:25 localhost systemd[1]: tmp-crun.Povc1P.mount: Deactivated successfully. Oct 14 04:23:25 localhost podman[68304]: 2025-10-14 08:23:25.414306764 +0000 UTC m=+0.145172210 container start 3c349f6608085bee947a60ebdb5e56c0135963845a13330799c830868f18cc72 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=objective_thompson, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux , description=Red Hat Ceph Storage 7, release=553, architecture=x86_64, io.buildah.version=1.33.12, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, distribution-scope=public, build-date=2025-09-24T08:57:55, version=7, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, name=rhceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_CLEAN=True, ceph=True, GIT_BRANCH=main, vcs-type=git, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7) Oct 14 04:23:25 localhost podman[68304]: 2025-10-14 08:23:25.414562062 +0000 UTC m=+0.145427518 container attach 3c349f6608085bee947a60ebdb5e56c0135963845a13330799c830868f18cc72 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=objective_thompson, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, 
vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, release=553, io.buildah.version=1.33.12, name=rhceph, GIT_CLEAN=True, RELEASE=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.description=Red Hat Ceph Storage 7, version=7, build-date=2025-09-24T08:57:55, architecture=x86_64, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, distribution-scope=public, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, maintainer=Guillaume Abrioux , ceph=True, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container) Oct 14 04:23:25 localhost objective_thompson[68319]: 167 167 Oct 14 04:23:25 localhost systemd[1]: libpod-3c349f6608085bee947a60ebdb5e56c0135963845a13330799c830868f18cc72.scope: Deactivated successfully. 
Oct 14 04:23:25 localhost podman[68304]: 2025-10-14 08:23:25.41744477 +0000 UTC m=+0.148310206 container died 3c349f6608085bee947a60ebdb5e56c0135963845a13330799c830868f18cc72 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=objective_thompson, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, vcs-type=git, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, release=553, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-09-24T08:57:55, version=7, architecture=x86_64, CEPH_POINT_RELEASE=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux , ceph=True, io.openshift.expose-services=, RELEASE=main, io.buildah.version=1.33.12) Oct 14 04:23:25 localhost podman[68337]: 2025-10-14 08:23:25.498712377 +0000 UTC m=+0.075293394 container remove 3c349f6608085bee947a60ebdb5e56c0135963845a13330799c830868f18cc72 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=objective_thompson, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, build-date=2025-09-24T08:57:55, release=553, io.buildah.version=1.33.12, architecture=x86_64, GIT_CLEAN=True, distribution-scope=public, name=rhceph, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.license_terms=https://www.redhat.com/agreements, RELEASE=main, GIT_BRANCH=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.expose-services=, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., ceph=True, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux , com.redhat.component=rhceph-container, version=7) Oct 14 04:23:25 localhost systemd[1]: libpod-conmon-3c349f6608085bee947a60ebdb5e56c0135963845a13330799c830868f18cc72.scope: Deactivated successfully. Oct 14 04:23:25 localhost podman[68394]: Oct 14 04:23:25 localhost podman[68394]: 2025-10-14 08:23:25.699229185 +0000 UTC m=+0.074481258 container create ca8989c949f048361218c55f876dbac23e8123d6a3af18776ca780ff7bf61032 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=peaceful_elbakyan, io.openshift.tags=rhceph ceph, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux , GIT_CLEAN=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, distribution-scope=public, GIT_BRANCH=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, io.buildah.version=1.33.12, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, release=553, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., ceph=True, build-date=2025-09-24T08:57:55, com.redhat.component=rhceph-container, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, 
version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=) Oct 14 04:23:25 localhost systemd[1]: Started libpod-conmon-ca8989c949f048361218c55f876dbac23e8123d6a3af18776ca780ff7bf61032.scope. Oct 14 04:23:25 localhost python3[68388]: ansible-ansible.legacy.stat Invoked with path=/var/lib/container-puppet/puppetlabs/facter.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Oct 14 04:23:25 localhost systemd[1]: Started libcrun container. Oct 14 04:23:25 localhost podman[68394]: 2025-10-14 08:23:25.664662184 +0000 UTC m=+0.039914347 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Oct 14 04:23:25 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e932d8b512ca7d21f2331eb1229c4517cc8059fd5e2dc04f90a0f4ce4c947396/merged/rootfs supports timestamps until 2038 (0x7fffffff) Oct 14 04:23:25 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e932d8b512ca7d21f2331eb1229c4517cc8059fd5e2dc04f90a0f4ce4c947396/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff) Oct 14 04:23:25 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e932d8b512ca7d21f2331eb1229c4517cc8059fd5e2dc04f90a0f4ce4c947396/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff) Oct 14 04:23:25 localhost podman[68394]: 2025-10-14 08:23:25.773949581 +0000 UTC m=+0.149201654 container init ca8989c949f048361218c55f876dbac23e8123d6a3af18776ca780ff7bf61032 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=peaceful_elbakyan, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, version=7, RELEASE=main, distribution-scope=public, CEPH_POINT_RELEASE=, 
architecture=x86_64, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, build-date=2025-09-24T08:57:55, GIT_BRANCH=main, io.buildah.version=1.33.12, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, release=553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, maintainer=Guillaume Abrioux , com.redhat.license_terms=https://www.redhat.com/agreements, ceph=True) Oct 14 04:23:25 localhost podman[68394]: 2025-10-14 08:23:25.784583638 +0000 UTC m=+0.159835711 container start ca8989c949f048361218c55f876dbac23e8123d6a3af18776ca780ff7bf61032 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=peaceful_elbakyan, vcs-type=git, io.buildah.version=1.33.12, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, name=rhceph, com.redhat.license_terms=https://www.redhat.com/agreements, ceph=True, io.openshift.expose-services=, GIT_CLEAN=True, architecture=x86_64, com.redhat.component=rhceph-container, build-date=2025-09-24T08:57:55, maintainer=Guillaume Abrioux , vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, distribution-scope=public, version=7, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=553) Oct 14 04:23:25 localhost podman[68394]: 
2025-10-14 08:23:25.784826855 +0000 UTC m=+0.160078928 container attach ca8989c949f048361218c55f876dbac23e8123d6a3af18776ca780ff7bf61032 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=peaceful_elbakyan, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, io.openshift.expose-services=, vcs-type=git, release=553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., io.buildah.version=1.33.12, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, maintainer=Guillaume Abrioux , GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/agreements, ceph=True, io.openshift.tags=rhceph ceph, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, distribution-scope=public, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-09-24T08:57:55) Oct 14 04:23:26 localhost python3[68431]: ansible-ansible.legacy.file Invoked with setype=svirt_sandbox_file_t selevel=s0 dest=/var/lib/container-puppet/puppetlabs/facter.conf _original_basename=tmp56_7e_qe recurse=False state=file path=/var/lib/container-puppet/puppetlabs/facter.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None Oct 14 04:23:26 localhost systemd[1]: var-lib-containers-storage-overlay-c16977602fc2db5eb8fee7f44fea6e9b8d6e39ca2cddd602e1b035845ef4604f-merged.mount: Deactivated successfully. 
Oct 14 04:23:26 localhost python3[68591]: ansible-file Invoked with path=/opt/puppetlabs/facter state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 04:23:26 localhost peaceful_elbakyan[68409]: [
Oct 14 04:23:26 localhost peaceful_elbakyan[68409]: {
Oct 14 04:23:26 localhost peaceful_elbakyan[68409]: "available": false,
Oct 14 04:23:26 localhost peaceful_elbakyan[68409]: "ceph_device": false,
Oct 14 04:23:26 localhost peaceful_elbakyan[68409]: "device_id": "QEMU_DVD-ROM_QM00001",
Oct 14 04:23:26 localhost peaceful_elbakyan[68409]: "lsm_data": {},
Oct 14 04:23:26 localhost peaceful_elbakyan[68409]: "lvs": [],
Oct 14 04:23:26 localhost peaceful_elbakyan[68409]: "path": "/dev/sr0",
Oct 14 04:23:26 localhost peaceful_elbakyan[68409]: "rejected_reasons": [
Oct 14 04:23:26 localhost peaceful_elbakyan[68409]: "Has a FileSystem",
Oct 14 04:23:26 localhost peaceful_elbakyan[68409]: "Insufficient space (<5GB)"
Oct 14 04:23:26 localhost peaceful_elbakyan[68409]: ],
Oct 14 04:23:26 localhost peaceful_elbakyan[68409]: "sys_api": {
Oct 14 04:23:26 localhost peaceful_elbakyan[68409]: "actuators": null,
Oct 14 04:23:26 localhost peaceful_elbakyan[68409]: "device_nodes": "sr0",
Oct 14 04:23:26 localhost peaceful_elbakyan[68409]: "human_readable_size": "482.00 KB",
Oct 14 04:23:26 localhost peaceful_elbakyan[68409]: "id_bus": "ata",
Oct 14 04:23:26 localhost peaceful_elbakyan[68409]: "model": "QEMU DVD-ROM",
Oct 14 04:23:26 localhost peaceful_elbakyan[68409]: "nr_requests": "2",
Oct 14 04:23:26 localhost peaceful_elbakyan[68409]: "partitions": {},
Oct 14 04:23:26 localhost peaceful_elbakyan[68409]: "path": "/dev/sr0",
Oct 14 04:23:26 localhost peaceful_elbakyan[68409]: "removable": "1",
Oct 14 04:23:26 
localhost peaceful_elbakyan[68409]: "rev": "2.5+",
Oct 14 04:23:26 localhost peaceful_elbakyan[68409]: "ro": "0",
Oct 14 04:23:26 localhost peaceful_elbakyan[68409]: "rotational": "1",
Oct 14 04:23:26 localhost peaceful_elbakyan[68409]: "sas_address": "",
Oct 14 04:23:26 localhost peaceful_elbakyan[68409]: "sas_device_handle": "",
Oct 14 04:23:26 localhost peaceful_elbakyan[68409]: "scheduler_mode": "mq-deadline",
Oct 14 04:23:26 localhost peaceful_elbakyan[68409]: "sectors": 0,
Oct 14 04:23:26 localhost peaceful_elbakyan[68409]: "sectorsize": "2048",
Oct 14 04:23:26 localhost peaceful_elbakyan[68409]: "size": 493568.0,
Oct 14 04:23:26 localhost peaceful_elbakyan[68409]: "support_discard": "0",
Oct 14 04:23:26 localhost peaceful_elbakyan[68409]: "type": "disk",
Oct 14 04:23:26 localhost peaceful_elbakyan[68409]: "vendor": "QEMU"
Oct 14 04:23:26 localhost peaceful_elbakyan[68409]: }
Oct 14 04:23:26 localhost peaceful_elbakyan[68409]: }
Oct 14 04:23:26 localhost peaceful_elbakyan[68409]: ]
Oct 14 04:23:26 localhost systemd[1]: libpod-ca8989c949f048361218c55f876dbac23e8123d6a3af18776ca780ff7bf61032.scope: Deactivated successfully.
Oct 14 04:23:26 localhost systemd[1]: libpod-ca8989c949f048361218c55f876dbac23e8123d6a3af18776ca780ff7bf61032.scope: Consumed 1.021s CPU time.
Oct 14 04:23:26 localhost podman[68394]: 2025-10-14 08:23:26.777894808 +0000 UTC m=+1.153146881 container died ca8989c949f048361218c55f876dbac23e8123d6a3af18776ca780ff7bf61032 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=peaceful_elbakyan, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhceph, maintainer=Guillaume Abrioux , RELEASE=main, vcs-type=git, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, vendor=Red Hat, Inc., build-date=2025-09-24T08:57:55, release=553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_CLEAN=True, distribution-scope=public, ceph=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, version=7, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64) Oct 14 04:23:26 localhost systemd[1]: tmp-crun.69jNa6.mount: Deactivated successfully. Oct 14 04:23:26 localhost systemd[1]: var-lib-containers-storage-overlay-e932d8b512ca7d21f2331eb1229c4517cc8059fd5e2dc04f90a0f4ce4c947396-merged.mount: Deactivated successfully. 
Oct 14 04:23:26 localhost podman[70298]: 2025-10-14 08:23:26.888489986 +0000 UTC m=+0.093516234 container remove ca8989c949f048361218c55f876dbac23e8123d6a3af18776ca780ff7bf61032 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=peaceful_elbakyan, GIT_CLEAN=True, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux , release=553, architecture=x86_64, GIT_BRANCH=main, vendor=Red Hat, Inc., distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, version=7, RELEASE=main, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, name=rhceph, ceph=True, build-date=2025-09-24T08:57:55, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7) Oct 14 04:23:26 localhost systemd[1]: libpod-conmon-ca8989c949f048361218c55f876dbac23e8123d6a3af18776ca780ff7bf61032.scope: Deactivated successfully. 
Oct 14 04:23:27 localhost python3[70398]: ansible-ansible.posix.synchronize Invoked with src=/opt/puppetlabs/ dest=/var/lib/container-puppet/puppetlabs/ _local_rsync_path=rsync _local_rsync_password=NOT_LOGGING_PARAMETER rsync_path=None delete=False _substitute_controller=False archive=True checksum=False compress=True existing_only=False dirs=False copy_links=False set_remote_user=True rsync_timeout=0 rsync_opts=[] ssh_connection_multiplexing=False partial=False verify_host=False mode=push dest_port=None private_key=None recursive=None links=None perms=None times=None owner=None group=None ssh_args=None link_dest=None
Oct 14 04:23:28 localhost ansible-async_wrapper.py[66611]: Done in kid B.
Oct 14 04:23:28 localhost python3[70432]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 04:23:29 localhost python3[70464]: ansible-stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 14 04:23:29 localhost python3[70514]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/tripleo-container-shutdown follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 14 04:23:30 localhost python3[70532]: ansible-ansible.legacy.file Invoked with mode=0700 owner=root group=root dest=/usr/libexec/tripleo-container-shutdown _original_basename=tripleo-container-shutdown recurse=False state=file path=/usr/libexec/tripleo-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None 
modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 04:23:30 localhost python3[70594]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/tripleo-start-podman-container follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 14 04:23:31 localhost python3[70612]: ansible-ansible.legacy.file Invoked with mode=0700 owner=root group=root dest=/usr/libexec/tripleo-start-podman-container _original_basename=tripleo-start-podman-container recurse=False state=file path=/usr/libexec/tripleo-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 04:23:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8.
Oct 14 04:23:31 localhost podman[70675]: 2025-10-14 08:23:31.641965187 +0000 UTC m=+0.102188501 container health_status 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=openstack-collectd-container, io.openshift.tags=rhosp osp openstack osp-17.1, container_name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, tcib_managed=true, architecture=x86_64, batch=17.1_20250721.1, release=2, summary=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, version=17.1.9, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.expose-services=, vcs-type=git, build-date=2025-07-21T13:04:03, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-collectd, config_id=tripleo_step3, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, vendor=Red Hat, Inc.) Oct 14 04:23:31 localhost podman[70675]: 2025-10-14 08:23:31.6531249 +0000 UTC m=+0.113348224 container exec_died 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-collectd, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.license_terms=https://www.redhat.com/agreements, tcib_managed=true, vcs-type=git, architecture=x86_64, release=2, com.redhat.component=openstack-collectd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, batch=17.1_20250721.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, version=17.1.9, distribution-scope=public, build-date=2025-07-21T13:04:03, container_name=collectd, summary=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.33.12, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, vendor=Red Hat, Inc.) Oct 14 04:23:31 localhost systemd[1]: 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8.service: Deactivated successfully. 
Oct 14 04:23:31 localhost python3[70674]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/tripleo-container-shutdown.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 14 04:23:31 localhost python3[70713]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system/tripleo-container-shutdown.service _original_basename=tripleo-container-shutdown-service recurse=False state=file path=/usr/lib/systemd/system/tripleo-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 04:23:32 localhost python3[70775]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 14 04:23:32 localhost python3[70793]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset _original_basename=91-tripleo-container-shutdown-preset recurse=False state=file path=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 04:23:33 localhost python3[70823]: ansible-systemd Invoked with name=tripleo-container-shutdown state=started enabled=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 14 04:23:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run 
2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea. Oct 14 04:23:33 localhost systemd[1]: Reloading. Oct 14 04:23:33 localhost systemd-sysv-generator[70862]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Oct 14 04:23:33 localhost systemd-rc-local-generator[70859]: /etc/rc.d/rc.local is not marked executable, skipping. Oct 14 04:23:33 localhost podman[70825]: 2025-10-14 08:23:33.528079572 +0000 UTC m=+0.108693070 container health_status 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 iscsid, 
io.openshift.expose-services=, vendor=Red Hat, Inc., architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, version=17.1.9, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, io.buildah.version=1.33.12, tcib_managed=true, batch=17.1_20250721.1, container_name=iscsid, name=rhosp17/openstack-iscsid, config_id=tripleo_step3, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, build-date=2025-07-21T13:27:15, managed_by=tripleo_ansible, release=1) Oct 14 04:23:33 localhost podman[70825]: 2025-10-14 08:23:33.536694936 +0000 UTC m=+0.117308404 container exec_died 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, version=17.1.9, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T13:27:15, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, vcs-type=git, container_name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': 
True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, distribution-scope=public, name=rhosp17/openstack-iscsid, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, architecture=x86_64, batch=17.1_20250721.1, release=1, com.redhat.component=openstack-iscsid-container, io.buildah.version=1.33.12, io.openshift.expose-services=, vendor=Red Hat, Inc.) Oct 14 04:23:33 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Oct 14 04:23:33 localhost systemd[1]: 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea.service: Deactivated successfully. 
Oct 14 04:23:34 localhost python3[70927]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/netns-placeholder.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Oct 14 04:23:34 localhost python3[70945]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/usr/lib/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 04:23:34 localhost python3[71007]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Oct 14 04:23:35 localhost python3[71025]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/usr/lib/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 04:23:35 localhost python3[71056]: ansible-systemd Invoked with name=netns-placeholder state=started enabled=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None Oct 14 04:23:35 localhost systemd[1]: Reloading. Oct 14 04:23:35 localhost systemd-sysv-generator[71081]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. 
Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Oct 14 04:23:35 localhost systemd-rc-local-generator[71078]: /etc/rc.d/rc.local is not marked executable, skipping. Oct 14 04:23:35 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Oct 14 04:23:36 localhost systemd[1]: Starting Create netns directory... Oct 14 04:23:36 localhost systemd[1]: run-netns-placeholder.mount: Deactivated successfully. Oct 14 04:23:36 localhost systemd[1]: netns-placeholder.service: Deactivated successfully. Oct 14 04:23:36 localhost systemd[1]: Finished Create netns directory. Oct 14 04:23:36 localhost python3[71112]: ansible-container_puppet_config Invoked with update_config_hash_only=True no_archive=True check_mode=False config_vol_prefix=/var/lib/config-data debug=False net_host=True puppet_config= short_hostname= step=6 Oct 14 04:23:38 localhost python3[71170]: ansible-tripleo_container_manage Invoked with config_id=tripleo_step4 config_dir=/var/lib/tripleo-config/container-startup-config/step_4 config_patterns=*.json config_overrides={} concurrency=5 log_base_path=/var/log/containers/stdouts debug=False Oct 14 04:23:38 localhost podman[71327]: 2025-10-14 08:23:38.6382792 +0000 UTC m=+0.074045255 container create d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, build-date=2025-07-21T13:07:52, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, com.redhat.component=openstack-cron-container, tcib_managed=true, io.openshift.expose-services=, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 
'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20250721.1, distribution-scope=public, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, name=rhosp17/openstack-cron, managed_by=tripleo_ansible, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, architecture=x86_64, release=1, config_id=tripleo_step4, version=17.1.9, io.openshift.tags=rhosp osp openstack osp-17.1) Oct 14 04:23:38 localhost podman[71338]: 2025-10-14 08:23:38.657779359 +0000 UTC m=+0.080585697 container create 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.buildah.version=1.33.12, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, batch=17.1_20250721.1, config_id=tripleo_step4, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1, version=17.1.9, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, vcs-type=git, name=rhosp17/openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/agreements, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-ipmi-container, build-date=2025-07-21T15:29:47, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, architecture=x86_64, io.openshift.expose-services=) Oct 14 04:23:38 localhost podman[71357]: 2025-10-14 08:23:38.68287484 +0000 UTC m=+0.078601196 container create 
f4dfc463f8efdc63cfa12e2cf12618d28d3ac5d26b3d30b713cd89dd3b958cd1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=configure_cms_options, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, version=17.1.9, name=rhosp17/openstack-ovn-controller, config_data={'command': ['/bin/bash', '-c', 'CMS_OPTS=$(hiera ovn::controller::ovn_cms_options -c /etc/puppet/hiera.yaml); if [ X"$CMS_OPTS" != X ]; then ovs-vsctl set open . external_ids:ovn-cms-options=$CMS_OPTS;else ovs-vsctl remove open . external_ids ovn-cms-options; fi'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1760428406'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-07-21T13:28:44, architecture=x86_64, tcib_managed=true, io.buildah.version=1.33.12, release=1, container_name=configure_cms_options, distribution-scope=public, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, summary=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20250721.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., 
com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, managed_by=tripleo_ansible, vcs-type=git) Oct 14 04:23:38 localhost systemd[1]: Started libpod-conmon-d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f.scope. Oct 14 04:23:38 localhost systemd[1]: Started libpod-conmon-07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638.scope. Oct 14 04:23:38 localhost systemd[1]: Started libcrun container. Oct 14 04:23:38 localhost systemd[1]: Started libpod-conmon-f4dfc463f8efdc63cfa12e2cf12618d28d3ac5d26b3d30b713cd89dd3b958cd1.scope. Oct 14 04:23:38 localhost podman[71358]: 2025-10-14 08:23:38.708255749 +0000 UTC m=+0.107027748 container create def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=ceilometer_agent_compute, version=17.1.9, io.buildah.version=1.33.12, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, managed_by=tripleo_ansible, batch=17.1_20250721.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, name=rhosp17/openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, vendor=Red Hat, Inc., distribution-scope=public, release=1, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, build-date=2025-07-21T14:45:33, io.openshift.expose-services=, 
config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}) Oct 14 04:23:38 localhost systemd[1]: Started libcrun container. 
Oct 14 04:23:38 localhost podman[71327]: 2025-10-14 08:23:38.608703281 +0000 UTC m=+0.044469506 image pull registry.redhat.io/rhosp-rhel9/openstack-cron:17.1 Oct 14 04:23:38 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/93e7bd8fd2f06fab388e6a7e1d321b8be1a1e6e35c116f25c67dac1cc2084007/merged/var/log/containers supports timestamps until 2038 (0x7fffffff) Oct 14 04:23:38 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ab5ae649c8d55ad3b61d788cb1580bb9641b83eaaaa1a012a2f15a639ad5659a/merged/var/log/ceilometer supports timestamps until 2038 (0x7fffffff) Oct 14 04:23:38 localhost podman[71338]: 2025-10-14 08:23:38.612251421 +0000 UTC m=+0.035057779 image pull registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1 Oct 14 04:23:38 localhost systemd[1]: Started libcrun container. Oct 14 04:23:38 localhost podman[71357]: 2025-10-14 08:23:38.724999854 +0000 UTC m=+0.120726220 container init f4dfc463f8efdc63cfa12e2cf12618d28d3ac5d26b3d30b713cd89dd3b958cd1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=configure_cms_options, io.openshift.tags=rhosp osp openstack osp-17.1, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, config_data={'command': ['/bin/bash', '-c', 'CMS_OPTS=$(hiera ovn::controller::ovn_cms_options -c /etc/puppet/hiera.yaml); if [ X"$CMS_OPTS" != X ]; then ovs-vsctl set open . external_ids:ovn-cms-options=$CMS_OPTS;else ovs-vsctl remove open . 
external_ids ovn-cms-options; fi'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1760428406'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, io.buildah.version=1.33.12, container_name=configure_cms_options, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, tcib_managed=true, batch=17.1_20250721.1, distribution-scope=public, version=17.1.9, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, build-date=2025-07-21T13:28:44, managed_by=tripleo_ansible, release=1, architecture=x86_64, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller) Oct 14 04:23:38 localhost podman[71358]: 2025-10-14 08:23:38.632779191 +0000 UTC m=+0.031551190 image pull registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1 Oct 14 04:23:38 localhost podman[71357]: 2025-10-14 08:23:38.633948686 +0000 UTC m=+0.029675062 image pull registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1 Oct 14 04:23:38 localhost podman[71357]: 2025-10-14 08:23:38.735006211 +0000 UTC m=+0.130732567 container start f4dfc463f8efdc63cfa12e2cf12618d28d3ac5d26b3d30b713cd89dd3b958cd1 
(image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=configure_cms_options, name=rhosp17/openstack-ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/agreements, version=17.1.9, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, vcs-type=git, config_data={'command': ['/bin/bash', '-c', 'CMS_OPTS=$(hiera ovn::controller::ovn_cms_options -c /etc/puppet/hiera.yaml); if [ X"$CMS_OPTS" != X ]; then ovs-vsctl set open . external_ids:ovn-cms-options=$CMS_OPTS;else ovs-vsctl remove open . external_ids ovn-cms-options; fi'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1760428406'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, managed_by=tripleo_ansible, release=1, architecture=x86_64, container_name=configure_cms_options, description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-07-21T13:28:44, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, io.buildah.version=1.33.12, config_id=tripleo_step4, com.redhat.component=openstack-ovn-controller-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, 
vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, vendor=Red Hat, Inc., batch=17.1_20250721.1, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ovn-controller) Oct 14 04:23:38 localhost podman[71357]: 2025-10-14 08:23:38.735181197 +0000 UTC m=+0.130907553 container attach f4dfc463f8efdc63cfa12e2cf12618d28d3ac5d26b3d30b713cd89dd3b958cd1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=configure_cms_options, config_data={'command': ['/bin/bash', '-c', 'CMS_OPTS=$(hiera ovn::controller::ovn_cms_options -c /etc/puppet/hiera.yaml); if [ X"$CMS_OPTS" != X ]; then ovs-vsctl set open . external_ids:ovn-cms-options=$CMS_OPTS;else ovs-vsctl remove open . external_ids ovn-cms-options; fi'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1760428406'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, com.redhat.component=openstack-ovn-controller-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, vcs-type=git, tcib_managed=true, io.buildah.version=1.33.12, release=1, batch=17.1_20250721.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, 
config_id=tripleo_step4, architecture=x86_64, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhosp17/openstack-ovn-controller, build-date=2025-07-21T13:28:44, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, container_name=configure_cms_options, vendor=Red Hat, Inc., version=17.1.9) Oct 14 04:23:38 localhost systemd[1]: Started libpod-conmon-def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e.scope. Oct 14 04:23:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f. Oct 14 04:23:38 localhost podman[71327]: 2025-10-14 08:23:38.746707451 +0000 UTC m=+0.182473546 container init d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, build-date=2025-07-21T13:07:52, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, release=1, batch=17.1_20250721.1, vcs-type=git, io.openshift.expose-services=, com.redhat.component=openstack-cron-container, name=rhosp17/openstack-cron, version=17.1.9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, managed_by=tripleo_ansible, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 cron, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, vendor=Red Hat, Inc., io.buildah.version=1.33.12, container_name=logrotate_crond, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/agreements) Oct 14 04:23:38 localhost systemd[1]: Started libcrun container. Oct 14 04:23:38 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/53d9edaa733ed29c02c610d1bdf1d5de99f647c41b4e60d4ee0bce49725f7b4f/merged/var/log/ceilometer supports timestamps until 2038 (0x7fffffff) Oct 14 04:23:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f. 
Oct 14 04:23:38 localhost podman[71327]: 2025-10-14 08:23:38.770927564 +0000 UTC m=+0.206693629 container start d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, com.redhat.component=openstack-cron-container, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, container_name=logrotate_crond, architecture=x86_64, version=17.1.9, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc., config_id=tripleo_step4, io.buildah.version=1.33.12, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, 
tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 cron, build-date=2025-07-21T13:07:52, release=1, managed_by=tripleo_ansible, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, batch=17.1_20250721.1, io.openshift.expose-services=) Oct 14 04:23:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e. Oct 14 04:23:38 localhost podman[71358]: 2025-10-14 08:23:38.774214976 +0000 UTC m=+0.172986955 container init def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, batch=17.1_20250721.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, 
com.redhat.component=openstack-ceilometer-compute-container, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, name=rhosp17/openstack-ceilometer-compute, container_name=ceilometer_agent_compute, vendor=Red Hat, Inc., io.buildah.version=1.33.12, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, config_id=tripleo_step4, tcib_managed=true, version=17.1.9, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-07-21T14:45:33, maintainer=OpenStack TripleO Team, release=1, io.openshift.expose-services=, architecture=x86_64) Oct 14 04:23:38 localhost python3[71170]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name logrotate_crond --conmon-pidfile /run/logrotate_crond.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=53ed83bb0cae779ff95edb2002262c6f --healthcheck-command /usr/share/openstack-tripleo-common/healthcheck/cron --label config_id=tripleo_step4 --label container_name=logrotate_crond --label managed_by=tripleo_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/logrotate_crond.log --network none --pid host --privileged=True --user root --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro --volume /var/log/containers:/var/log/containers:z registry.redhat.io/rhosp-rhel9/openstack-cron:17.1 Oct 14 04:23:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e. 
Oct 14 04:23:38 localhost podman[71358]: 2025-10-14 08:23:38.800435411 +0000 UTC m=+0.199207390 container start def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.expose-services=, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, version=17.1.9, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, io.buildah.version=1.33.12, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_compute, batch=17.1_20250721.1, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, config_id=tripleo_step4, build-date=2025-07-21T14:45:33) Oct 14 04:23:38 localhost python3[71170]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name ceilometer_agent_compute --conmon-pidfile /run/ceilometer_agent_compute.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=0fa4c62fe8881d1f7112b22e9fd9421c --healthcheck-command /openstack/healthcheck --label config_id=tripleo_step4 --label container_name=ceilometer_agent_compute --label managed_by=tripleo_ansible --label config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/ceilometer_agent_compute.log --network host --privileged=False --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro --volume /run/libvirt:/run/libvirt:shared,z --volume /var/log/containers/ceilometer:/var/log/ceilometer:z registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1 Oct 14 04:23:38 localhost ovs-vsctl[71475]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove open . 
external_ids ovn-cms-options Oct 14 04:23:38 localhost podman[71337]: 2025-10-14 08:23:38.876169686 +0000 UTC m=+0.297888130 container create 6c754c652c411ed47e35a6e105f0d60abb4aaec91c448c734ac7e917f9367261 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_libvirt_init_secret, config_id=tripleo_step4, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, io.openshift.expose-services=, release=2, managed_by=tripleo_ansible, vcs-ref=809f31d3cd93a9e04341110fb85686656c754dc0, tcib_managed=true, version=17.1.9, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20250721.1, name=rhosp17/openstack-nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-libvirt/images/17.1.9-2, io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T14:56:59, config_data={'cgroupns': 'host', 'command': '/nova_libvirt_init_secret.sh ceph:openstack', 'detach': False, 'environment': {'LIBVIRT_DEFAULT_URI': 'qemu:///system', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/config-data/puppet-generated/nova_libvirt/etc/nova:/etc/nova', '/etc/libvirt:/etc/libvirt', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/lib/container-config-scripts/nova_libvirt_init_secret.sh:/nova_libvirt_init_secret.sh:ro', '/var/lib/tripleo-config/ceph:/etc/ceph:ro']}, vendor=Red Hat, Inc., vcs-type=git, container_name=nova_libvirt_init_secret, io.buildah.version=1.33.12) Oct 14 04:23:38 localhost systemd[1]: libpod-f4dfc463f8efdc63cfa12e2cf12618d28d3ac5d26b3d30b713cd89dd3b958cd1.scope: Deactivated successfully. Oct 14 04:23:38 localhost systemd[1]: Started libpod-conmon-6c754c652c411ed47e35a6e105f0d60abb4aaec91c448c734ac7e917f9367261.scope. Oct 14 04:23:38 localhost systemd[1]: Started libcrun container. Oct 14 04:23:38 localhost podman[71337]: 2025-10-14 08:23:38.810764628 +0000 UTC m=+0.232483082 image pull registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 Oct 14 04:23:38 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5f427123442eee41b986e1fae04c3a7af8dd46dc41159db5ec6876f25e2a2dde/merged/etc/nova supports timestamps until 2038 (0x7fffffff) Oct 14 04:23:38 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5f427123442eee41b986e1fae04c3a7af8dd46dc41159db5ec6876f25e2a2dde/merged/etc/libvirt supports timestamps until 2038 (0x7fffffff) Oct 14 04:23:38 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5f427123442eee41b986e1fae04c3a7af8dd46dc41159db5ec6876f25e2a2dde/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff) Oct 14 04:23:38 localhost podman[71337]: 2025-10-14 08:23:38.920644743 +0000 UTC m=+0.342363187 container init 6c754c652c411ed47e35a6e105f0d60abb4aaec91c448c734ac7e917f9367261 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_libvirt_init_secret, io.openshift.tags=rhosp osp openstack 
osp-17.1, version=17.1.9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=, io.buildah.version=1.33.12, config_data={'cgroupns': 'host', 'command': '/nova_libvirt_init_secret.sh ceph:openstack', 'detach': False, 'environment': {'LIBVIRT_DEFAULT_URI': 'qemu:///system', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/config-data/puppet-generated/nova_libvirt/etc/nova:/etc/nova', '/etc/libvirt:/etc/libvirt', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/lib/container-config-scripts/nova_libvirt_init_secret.sh:/nova_libvirt_init_secret.sh:ro', '/var/lib/tripleo-config/ceph:/etc/ceph:ro']}, vendor=Red Hat, Inc., release=2, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=809f31d3cd93a9e04341110fb85686656c754dc0, tcib_managed=true, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, architecture=x86_64, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-libvirt/images/17.1.9-2, com.redhat.component=openstack-nova-libvirt-container, vcs-type=git, build-date=2025-07-21T14:56:59, container_name=nova_libvirt_init_secret, 
name=rhosp17/openstack-nova-libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20250721.1, distribution-scope=public) Oct 14 04:23:38 localhost podman[71357]: 2025-10-14 08:23:38.929451024 +0000 UTC m=+0.325177390 container died f4dfc463f8efdc63cfa12e2cf12618d28d3ac5d26b3d30b713cd89dd3b958cd1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=configure_cms_options, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, tcib_managed=true, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, container_name=configure_cms_options, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, config_data={'command': ['/bin/bash', '-c', 'CMS_OPTS=$(hiera ovn::controller::ovn_cms_options -c /etc/puppet/hiera.yaml); if [ X"$CMS_OPTS" != X ]; then ovs-vsctl set open . external_ids:ovn-cms-options=$CMS_OPTS;else ovs-vsctl remove open . external_ids ovn-cms-options; fi'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1760428406'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, release=1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/agreements, managed_by=tripleo_ansible, architecture=x86_64, description=Red Hat OpenStack 
Platform 17.1 ovn-controller, build-date=2025-07-21T13:28:44, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, com.redhat.component=openstack-ovn-controller-container, batch=17.1_20250721.1, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.9, name=rhosp17/openstack-ovn-controller, config_id=tripleo_step4) Oct 14 04:23:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638. Oct 14 04:23:38 localhost podman[71338]: 2025-10-14 08:23:38.94786912 +0000 UTC m=+0.370675448 container init 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, tcib_managed=true, config_id=tripleo_step4, io.buildah.version=1.33.12, container_name=ceilometer_agent_ipmi, vcs-type=git, batch=17.1_20250721.1, release=1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, build-date=2025-07-21T15:29:47, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': 
'/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team) Oct 14 04:23:38 localhost podman[71425]: 2025-10-14 08:23:38.875732273 +0000 UTC m=+0.099698053 container health_status d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=starting, version=17.1.9, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, build-date=2025-07-21T13:07:52, com.redhat.component=openstack-cron-container, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, architecture=x86_64, config_id=tripleo_step4, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': 
{'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=logrotate_crond, managed_by=tripleo_ansible, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1, batch=17.1_20250721.1, maintainer=OpenStack TripleO Team, tcib_managed=true) Oct 14 04:23:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638. 
Oct 14 04:23:38 localhost podman[71338]: 2025-10-14 08:23:38.983082921 +0000 UTC m=+0.405889249 container start 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, name=rhosp17/openstack-ceilometer-ipmi, distribution-scope=public, maintainer=OpenStack TripleO Team, build-date=2025-07-21T15:29:47, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, io.buildah.version=1.33.12, config_id=tripleo_step4, vendor=Red Hat, Inc., release=1, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, 
com.redhat.component=openstack-ceilometer-ipmi-container, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, vcs-type=git, batch=17.1_20250721.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.9, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi) Oct 14 04:23:38 localhost python3[71170]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name ceilometer_agent_ipmi --conmon-pidfile /run/ceilometer_agent_ipmi.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=0fa4c62fe8881d1f7112b22e9fd9421c --healthcheck-command /openstack/healthcheck --label config_id=tripleo_step4 --label container_name=ceilometer_agent_ipmi --label managed_by=tripleo_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/ceilometer_agent_ipmi.log --network host --privileged=True --volume 
/etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro --volume /var/log/containers/ceilometer:/var/log/ceilometer:z registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1 Oct 14 04:23:39 localhost podman[71442]: 2025-10-14 08:23:39.014125744 +0000 UTC m=+0.211480826 container health_status def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=starting, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, vcs-type=git, architecture=x86_64, distribution-scope=public, config_id=tripleo_step4, build-date=2025-07-21T14:45:33, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20250721.1, com.redhat.component=openstack-ceilometer-compute-container, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, version=17.1.9, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 
'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, release=1, io.buildah.version=1.33.12, name=rhosp17/openstack-ceilometer-compute, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., container_name=ceilometer_agent_compute, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true) Oct 14 04:23:39 localhost podman[71425]: 2025-10-14 08:23:39.04031695 +0000 UTC m=+0.264282700 container exec_died d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.component=openstack-cron-container, architecture=x86_64, vcs-type=git, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, version=17.1.9, managed_by=tripleo_ansible, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 cron, 
config_id=tripleo_step4, batch=17.1_20250721.1, build-date=2025-07-21T13:07:52, name=rhosp17/openstack-cron, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, distribution-scope=public, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=logrotate_crond, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.buildah.version=1.33.12) Oct 14 04:23:39 localhost systemd[1]: libpod-6c754c652c411ed47e35a6e105f0d60abb4aaec91c448c734ac7e917f9367261.scope: Deactivated successfully. 
Oct 14 04:23:39 localhost podman[71442]: 2025-10-14 08:23:39.054799464 +0000 UTC m=+0.252154556 container exec_died def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, com.redhat.component=openstack-ceilometer-compute-container, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, version=17.1.9, release=1, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T14:45:33, tcib_managed=true, vendor=Red Hat, Inc., distribution-scope=public, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, 
config_id=tripleo_step4, container_name=ceilometer_agent_compute, batch=17.1_20250721.1, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, architecture=x86_64, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute) Oct 14 04:23:39 localhost systemd[1]: d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f.service: Deactivated successfully. Oct 14 04:23:39 localhost podman[71442]: unhealthy Oct 14 04:23:39 localhost systemd[1]: def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e.service: Main process exited, code=exited, status=1/FAILURE Oct 14 04:23:39 localhost systemd[1]: def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e.service: Failed with result 'exit-code'. Oct 14 04:23:39 localhost podman[71337]: 2025-10-14 08:23:39.083608519 +0000 UTC m=+0.505326963 container start 6c754c652c411ed47e35a6e105f0d60abb4aaec91c448c734ac7e917f9367261 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_libvirt_init_secret, vendor=Red Hat, Inc., architecture=x86_64, io.openshift.expose-services=, batch=17.1_20250721.1, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-libvirt/images/17.1.9-2, vcs-type=git, name=rhosp17/openstack-nova-libvirt, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-libvirt-container, release=2, description=Red Hat OpenStack Platform 17.1 nova-libvirt, config_data={'cgroupns': 'host', 'command': '/nova_libvirt_init_secret.sh ceph:openstack', 'detach': False, 'environment': {'LIBVIRT_DEFAULT_URI': 'qemu:///system', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'privileged': False, 
'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/config-data/puppet-generated/nova_libvirt/etc/nova:/etc/nova', '/etc/libvirt:/etc/libvirt', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/lib/container-config-scripts/nova_libvirt_init_secret.sh:/nova_libvirt_init_secret.sh:ro', '/var/lib/tripleo-config/ceph:/etc/ceph:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, container_name=nova_libvirt_init_secret, build-date=2025-07-21T14:56:59, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step4, vcs-ref=809f31d3cd93a9e04341110fb85686656c754dc0, distribution-scope=public, version=17.1.9, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 nova-libvirt) Oct 14 04:23:39 localhost podman[71337]: 2025-10-14 08:23:39.084002941 +0000 UTC m=+0.505721405 container attach 6c754c652c411ed47e35a6e105f0d60abb4aaec91c448c734ac7e917f9367261 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_libvirt_init_secret, vcs-type=git, vendor=Red Hat, Inc., build-date=2025-07-21T14:56:59, io.openshift.expose-services=, name=rhosp17/openstack-nova-libvirt, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, managed_by=tripleo_ansible, config_id=tripleo_step4, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-libvirt/images/17.1.9-2, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20250721.1, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, container_name=nova_libvirt_init_secret, maintainer=OpenStack TripleO Team, architecture=x86_64, release=2, description=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-ref=809f31d3cd93a9e04341110fb85686656c754dc0, config_data={'cgroupns': 'host', 'command': '/nova_libvirt_init_secret.sh ceph:openstack', 'detach': False, 'environment': {'LIBVIRT_DEFAULT_URI': 'qemu:///system', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/config-data/puppet-generated/nova_libvirt/etc/nova:/etc/nova', '/etc/libvirt:/etc/libvirt', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/lib/container-config-scripts/nova_libvirt_init_secret.sh:/nova_libvirt_init_secret.sh:ro', '/var/lib/tripleo-config/ceph:/etc/ceph:ro']}, distribution-scope=public, com.redhat.component=openstack-nova-libvirt-container, version=17.1.9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt) Oct 14 04:23:39 localhost podman[71337]: 2025-10-14 08:23:39.08558125 +0000 UTC m=+0.507299704 container died 
6c754c652c411ed47e35a6e105f0d60abb4aaec91c448c734ac7e917f9367261 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_libvirt_init_secret, description=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, config_data={'cgroupns': 'host', 'command': '/nova_libvirt_init_secret.sh ceph:openstack', 'detach': False, 'environment': {'LIBVIRT_DEFAULT_URI': 'qemu:///system', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/config-data/puppet-generated/nova_libvirt/etc/nova:/etc/nova', '/etc/libvirt:/etc/libvirt', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/lib/container-config-scripts/nova_libvirt_init_secret.sh:/nova_libvirt_init_secret.sh:ro', '/var/lib/tripleo-config/ceph:/etc/ceph:ro']}, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-libvirt/images/17.1.9-2, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, release=2, name=rhosp17/openstack-nova-libvirt, io.buildah.version=1.33.12, batch=17.1_20250721.1, io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T14:56:59, io.openshift.expose-services=, container_name=nova_libvirt_init_secret, managed_by=tripleo_ansible, 
vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, version=17.1.9, vcs-ref=809f31d3cd93a9e04341110fb85686656c754dc0, distribution-scope=public, com.redhat.component=openstack-nova-libvirt-container, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, tcib_managed=true, config_id=tripleo_step4) Oct 14 04:23:39 localhost podman[71533]: 2025-10-14 08:23:39.090368256 +0000 UTC m=+0.105280634 container health_status 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=starting, architecture=x86_64, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.component=openstack-ceilometer-ipmi-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-07-21T15:29:47, container_name=ceilometer_agent_ipmi, maintainer=OpenStack TripleO Team, version=17.1.9, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, release=1, name=rhosp17/openstack-ceilometer-ipmi, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, config_id=tripleo_step4, managed_by=tripleo_ansible, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc.) Oct 14 04:23:39 localhost podman[71533]: 2025-10-14 08:23:39.133387968 +0000 UTC m=+0.148300316 container exec_died 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, build-date=2025-07-21T15:29:47, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, batch=17.1_20250721.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1, container_name=ceilometer_agent_ipmi, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, architecture=x86_64, release=1, vcs-type=git, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team) Oct 14 04:23:39 localhost podman[71533]: unhealthy Oct 14 04:23:39 localhost systemd[1]: 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638.service: Main process exited, code=exited, status=1/FAILURE Oct 14 04:23:39 localhost systemd[1]: 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638.service: Failed with result 'exit-code'. Oct 14 04:23:39 localhost podman[71480]: 2025-10-14 08:23:39.149915026 +0000 UTC m=+0.260207994 container cleanup f4dfc463f8efdc63cfa12e2cf12618d28d3ac5d26b3d30b713cd89dd3b958cd1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=configure_cms_options, summary=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, config_data={'command': ['/bin/bash', '-c', 'CMS_OPTS=$(hiera ovn::controller::ovn_cms_options -c /etc/puppet/hiera.yaml); if [ X"$CMS_OPTS" != X ]; then ovs-vsctl set open . 
external_ids:ovn-cms-options=$CMS_OPTS;else ovs-vsctl remove open . external_ids ovn-cms-options; fi'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1760428406'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, config_id=tripleo_step4, distribution-scope=public, managed_by=tripleo_ansible, release=1, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=configure_cms_options, io.openshift.tags=rhosp osp openstack osp-17.1, io.openshift.expose-services=, name=rhosp17/openstack-ovn-controller, vcs-type=git, batch=17.1_20250721.1, com.redhat.component=openstack-ovn-controller-container, io.buildah.version=1.33.12, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, build-date=2025-07-21T13:28:44, version=17.1.9) Oct 14 04:23:39 localhost systemd[1]: libpod-conmon-f4dfc463f8efdc63cfa12e2cf12618d28d3ac5d26b3d30b713cd89dd3b958cd1.scope: Deactivated successfully. 
Oct 14 04:23:39 localhost python3[71170]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name configure_cms_options --conmon-pidfile /run/configure_cms_options.pid --detach=False --env TRIPLEO_DEPLOY_IDENTIFIER=1760428406 --label config_id=tripleo_step4 --label container_name=configure_cms_options --label managed_by=tripleo_ansible --label config_data={'command': ['/bin/bash', '-c', 'CMS_OPTS=$(hiera ovn::controller::ovn_cms_options -c /etc/puppet/hiera.yaml); if [ X"$CMS_OPTS" != X ]; then ovs-vsctl set open . external_ids:ovn-cms-options=$CMS_OPTS;else ovs-vsctl remove open . external_ids ovn-cms-options; fi'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1760428406'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/configure_cms_options.log --network host --privileged=True --user root --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume 
/dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /lib/modules:/lib/modules:ro --volume /run/openvswitch:/run/openvswitch:shared,z registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1 /bin/bash -c CMS_OPTS=$(hiera ovn::controller::ovn_cms_options -c /etc/puppet/hiera.yaml); if [ X"$CMS_OPTS" != X ]; then ovs-vsctl set open . external_ids:ovn-cms-options=$CMS_OPTS;else ovs-vsctl remove open . external_ids ovn-cms-options; fi Oct 14 04:23:39 localhost podman[71632]: 2025-10-14 08:23:39.204057509 +0000 UTC m=+0.075345546 container create 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, build-date=2025-07-21T14:48:37, release=1, architecture=x86_64, version=17.1.9, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20250721.1, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, io.openshift.tags=rhosp osp openstack osp-17.1, container_name=nova_migration_target, config_id=tripleo_step4, tcib_managed=true, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, distribution-scope=public, vendor=Red Hat, Inc.) Oct 14 04:23:39 localhost systemd[1]: Started libpod-conmon-7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884.scope. Oct 14 04:23:39 localhost systemd[1]: Started libcrun container. 
Oct 14 04:23:39 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0f85b4a61f0484c7d6d1230e2bc736bd8398f5346eb8306d97c0ce215dfc5ab2/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff) Oct 14 04:23:39 localhost podman[71581]: 2025-10-14 08:23:39.272805261 +0000 UTC m=+0.217513853 container cleanup 6c754c652c411ed47e35a6e105f0d60abb4aaec91c448c734ac7e917f9367261 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_libvirt_init_secret, vcs-type=git, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, managed_by=tripleo_ansible, architecture=x86_64, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt, name=rhosp17/openstack-nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, release=2, vcs-ref=809f31d3cd93a9e04341110fb85686656c754dc0, container_name=nova_libvirt_init_secret, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, build-date=2025-07-21T14:56:59, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-libvirt/images/17.1.9-2, version=17.1.9, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'cgroupns': 'host', 'command': '/nova_libvirt_init_secret.sh ceph:openstack', 'detach': False, 'environment': {'LIBVIRT_DEFAULT_URI': 'qemu:///system', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/config-data/puppet-generated/nova_libvirt/etc/nova:/etc/nova', '/etc/libvirt:/etc/libvirt', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/lib/container-config-scripts/nova_libvirt_init_secret.sh:/nova_libvirt_init_secret.sh:ro', '/var/lib/tripleo-config/ceph:/etc/ceph:ro']}) Oct 14 04:23:39 localhost podman[71632]: 2025-10-14 08:23:39.174564492 +0000 UTC m=+0.045852539 image pull registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1 Oct 14 04:23:39 localhost systemd[1]: libpod-conmon-6c754c652c411ed47e35a6e105f0d60abb4aaec91c448c734ac7e917f9367261.scope: Deactivated successfully. 
Oct 14 04:23:39 localhost python3[71170]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_libvirt_init_secret --cgroupns=host --conmon-pidfile /run/nova_libvirt_init_secret.pid --detach=False --env LIBVIRT_DEFAULT_URI=qemu:///system --env TRIPLEO_CONFIG_HASH=4d186a6228facd5bcddf9bcc145eb470 --label config_id=tripleo_step4 --label container_name=nova_libvirt_init_secret --label managed_by=tripleo_ansible --label config_data={'cgroupns': 'host', 'command': '/nova_libvirt_init_secret.sh ceph:openstack', 'detach': False, 'environment': {'LIBVIRT_DEFAULT_URI': 'qemu:///system', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/config-data/puppet-generated/nova_libvirt/etc/nova:/etc/nova', '/etc/libvirt:/etc/libvirt', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/lib/container-config-scripts/nova_libvirt_init_secret.sh:/nova_libvirt_init_secret.sh:ro', '/var/lib/tripleo-config/ceph:/etc/ceph:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_libvirt_init_secret.log --network host --privileged=False --security-opt label=disable --user root --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume 
/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/lib/config-data/puppet-generated/nova_libvirt/etc/nova:/etc/nova --volume /etc/libvirt:/etc/libvirt --volume /run/libvirt:/run/libvirt:shared,z --volume /var/lib/libvirt:/var/lib/libvirt:shared --volume /var/lib/container-config-scripts/nova_libvirt_init_secret.sh:/nova_libvirt_init_secret.sh:ro --volume /var/lib/tripleo-config/ceph:/etc/ceph:ro registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 /nova_libvirt_init_secret.sh ceph:openstack Oct 14 04:23:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884. Oct 14 04:23:39 localhost podman[71632]: 2025-10-14 08:23:39.284765358 +0000 UTC m=+0.156053395 container init 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.9, distribution-scope=public, name=rhosp17/openstack-nova-compute, vcs-type=git, release=1, com.redhat.component=openstack-nova-compute-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., build-date=2025-07-21T14:48:37, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1, io.openshift.expose-services=, config_id=tripleo_step4, architecture=x86_64, managed_by=tripleo_ansible, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, container_name=nova_migration_target, batch=17.1_20250721.1, tcib_managed=true) Oct 14 04:23:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884. 
Oct 14 04:23:39 localhost podman[71632]: 2025-10-14 08:23:39.308264249 +0000 UTC m=+0.179552286 container start 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, tcib_managed=true, vcs-type=git, build-date=2025-07-21T14:48:37, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1, container_name=nova_migration_target, release=1, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.9, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, batch=17.1_20250721.1, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, 
com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, vendor=Red Hat, Inc.) Oct 14 04:23:39 localhost python3[71170]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_migration_target --conmon-pidfile /run/nova_migration_target.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=4d186a6228facd5bcddf9bcc145eb470 --healthcheck-command /openstack/healthcheck --label config_id=tripleo_step4 --label container_name=nova_migration_target --label managed_by=tripleo_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_migration_target.log --network host 
--privileged=True --user root --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro --volume /etc/ssh:/host-ssh:ro --volume /run/libvirt:/run/libvirt:shared,z --volume /var/lib/nova:/var/lib/nova:shared registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1 Oct 14 04:23:39 localhost podman[71698]: 2025-10-14 08:23:39.392387383 +0000 UTC m=+0.082111443 container health_status 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=starting, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_migration_target, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, io.buildah.version=1.33.12, vcs-type=git, build-date=2025-07-21T14:48:37, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, distribution-scope=public, version=17.1.9, batch=17.1_20250721.1, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute) Oct 14 04:23:39 localhost podman[71746]: 2025-10-14 08:23:39.452975755 +0000 UTC m=+0.064586915 container create 164ca0277d8e3ccd20202bb0b1b958aa88c4220ab3e7840bbafe0f003e097815 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=setup_ovs_manager, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, version=17.1.9, batch=17.1_20250721.1, vcs-type=git, io.buildah.version=1.33.12, 
com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.expose-services=, config_data={'command': ['/container_puppet_apply.sh', '4', 'exec', 'include tripleo::profile::base::neutron::ovn_metadata'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1760428406'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-07-21T16:28:53, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, architecture=x86_64, vendor=Red Hat, Inc., tcib_managed=true, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, container_name=setup_ovs_manager) Oct 14 04:23:39 localhost systemd[1]: Started libpod-conmon-164ca0277d8e3ccd20202bb0b1b958aa88c4220ab3e7840bbafe0f003e097815.scope. Oct 14 04:23:39 localhost systemd[1]: Started libcrun container. 
Oct 14 04:23:39 localhost podman[71746]: 2025-10-14 08:23:39.507455238 +0000 UTC m=+0.119066398 container init 164ca0277d8e3ccd20202bb0b1b958aa88c4220ab3e7840bbafe0f003e097815 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=setup_ovs_manager, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, io.buildah.version=1.33.12, vendor=Red Hat, Inc., build-date=2025-07-21T16:28:53, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, config_data={'command': ['/container_puppet_apply.sh', '4', 'exec', 'include tripleo::profile::base::neutron::ovn_metadata'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1760428406'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, batch=17.1_20250721.1, managed_by=tripleo_ansible, vcs-type=git, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, 
name=rhosp17/openstack-neutron-metadata-agent-ovn, container_name=setup_ovs_manager, version=17.1.9, release=1, tcib_managed=true, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1) Oct 14 04:23:39 localhost podman[71746]: 2025-10-14 08:23:39.515350131 +0000 UTC m=+0.126961301 container start 164ca0277d8e3ccd20202bb0b1b958aa88c4220ab3e7840bbafe0f003e097815 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=setup_ovs_manager, distribution-scope=public, container_name=setup_ovs_manager, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, vendor=Red Hat, Inc., vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'command': ['/container_puppet_apply.sh', '4', 'exec', 'include tripleo::profile::base::neutron::ovn_metadata'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1760428406'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', 
'/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, version=17.1.9, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.33.12, vcs-type=git, name=rhosp17/openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, build-date=2025-07-21T16:28:53, release=1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1) Oct 14 04:23:39 localhost podman[71746]: 2025-10-14 08:23:39.515579777 +0000 UTC m=+0.127190947 container attach 164ca0277d8e3ccd20202bb0b1b958aa88c4220ab3e7840bbafe0f003e097815 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=setup_ovs_manager, build-date=2025-07-21T16:28:53, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, architecture=x86_64, io.openshift.expose-services=, name=rhosp17/openstack-neutron-metadata-agent-ovn, batch=17.1_20250721.1, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.9, config_data={'command': ['/container_puppet_apply.sh', '4', 'exec', 'include tripleo::profile::base::neutron::ovn_metadata'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1760428406'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, tcib_managed=true, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, io.buildah.version=1.33.12, container_name=setup_ovs_manager, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3) Oct 14 04:23:39 localhost podman[71746]: 2025-10-14 08:23:39.426202002 +0000 UTC m=+0.037813162 image pull registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1 Oct 14 04:23:39 localhost systemd[1]: var-lib-containers-storage-overlay-3c56646706fff247676980ac78d7924c31221bb364af528e25b8eedf875d177e-merged.mount: Deactivated successfully. Oct 14 04:23:39 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f4dfc463f8efdc63cfa12e2cf12618d28d3ac5d26b3d30b713cd89dd3b958cd1-userdata-shm.mount: Deactivated successfully. 
Oct 14 04:23:39 localhost podman[71698]: 2025-10-14 08:23:39.734081929 +0000 UTC m=+0.423806039 container exec_died 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1, tcib_managed=true, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1, version=17.1.9, architecture=x86_64, build-date=2025-07-21T14:48:37, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, container_name=nova_migration_target, distribution-scope=public, 
name=rhosp17/openstack-nova-compute, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.33.12, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, batch=17.1_20250721.1, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/agreements) Oct 14 04:23:39 localhost systemd[1]: 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884.service: Deactivated successfully. Oct 14 04:23:40 localhost kernel: capability: warning: `privsep-helper' uses deprecated v2 capabilities in a way that may be insecure Oct 14 04:23:42 localhost ovs-vsctl[71924]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --timeout=5 --id=@manager -- create Manager "target=\"ptcp:6640:127.0.0.1\"" -- add Open_vSwitch . manager_options @manager Oct 14 04:23:42 localhost systemd[1]: libpod-164ca0277d8e3ccd20202bb0b1b958aa88c4220ab3e7840bbafe0f003e097815.scope: Deactivated successfully. Oct 14 04:23:42 localhost systemd[1]: libpod-164ca0277d8e3ccd20202bb0b1b958aa88c4220ab3e7840bbafe0f003e097815.scope: Consumed 2.687s CPU time. 
Oct 14 04:23:42 localhost podman[71746]: 2025-10-14 08:23:42.228873261 +0000 UTC m=+2.840484441 container died 164ca0277d8e3ccd20202bb0b1b958aa88c4220ab3e7840bbafe0f003e097815 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=setup_ovs_manager, container_name=setup_ovs_manager, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/agreements, version=17.1.9, architecture=x86_64, vcs-type=git, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1, config_data={'command': ['/container_puppet_apply.sh', '4', 'exec', 'include tripleo::profile::base::neutron::ovn_metadata'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1760428406'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_id=tripleo_step4, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, release=1, batch=17.1_20250721.1, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.33.12, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-07-21T16:28:53, distribution-scope=public, tcib_managed=true) Oct 14 04:23:42 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-164ca0277d8e3ccd20202bb0b1b958aa88c4220ab3e7840bbafe0f003e097815-userdata-shm.mount: Deactivated successfully. Oct 14 04:23:42 localhost systemd[1]: var-lib-containers-storage-overlay-9627ceeeadda540cd5b7624a5c18b2d6257f00b841a3ce165dd1352e6e7c53ee-merged.mount: Deactivated successfully. Oct 14 04:23:42 localhost podman[71925]: 2025-10-14 08:23:42.303218485 +0000 UTC m=+0.068428883 container cleanup 164ca0277d8e3ccd20202bb0b1b958aa88c4220ab3e7840bbafe0f003e097815 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=setup_ovs_manager, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-07-21T16:28:53, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, config_data={'command': ['/container_puppet_apply.sh', '4', 'exec', 'include tripleo::profile::base::neutron::ovn_metadata'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1760428406'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, io.openshift.expose-services=, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., com.redhat.component=openstack-neutron-metadata-agent-ovn-container, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20250721.1, release=1, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, container_name=setup_ovs_manager, io.openshift.tags=rhosp osp openstack osp-17.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, version=17.1.9, architecture=x86_64, managed_by=tripleo_ansible, name=rhosp17/openstack-neutron-metadata-agent-ovn, tcib_managed=true) Oct 14 04:23:42 localhost systemd[1]: libpod-conmon-164ca0277d8e3ccd20202bb0b1b958aa88c4220ab3e7840bbafe0f003e097815.scope: Deactivated successfully. 
Oct 14 04:23:42 localhost python3[71170]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name setup_ovs_manager --conmon-pidfile /run/setup_ovs_manager.pid --detach=False --env TRIPLEO_DEPLOY_IDENTIFIER=1760428406 --label config_id=tripleo_step4 --label container_name=setup_ovs_manager --label managed_by=tripleo_ansible --label config_data={'command': ['/container_puppet_apply.sh', '4', 'exec', 'include tripleo::profile::base::neutron::ovn_metadata'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1760428406'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/setup_ovs_manager.log --network host --privileged=True --user root --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume 
/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /lib/modules:/lib/modules:ro --volume /run/openvswitch:/run/openvswitch:shared,z registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1 /container_puppet_apply.sh 4 exec include tripleo::profile::base::neutron::ovn_metadata Oct 14 04:23:42 localhost podman[72037]: 2025-10-14 08:23:42.691837862 +0000 UTC m=+0.052666559 container create 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, name=rhosp17/openstack-neutron-metadata-agent-ovn, build-date=2025-07-21T16:28:53, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20250721.1, release=1, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, io.buildah.version=1.33.12, version=17.1.9, architecture=x86_64, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, com.redhat.license_terms=https://www.redhat.com/agreements) Oct 14 04:23:42 localhost systemd[1]: Started libpod-conmon-7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e.scope. Oct 14 04:23:42 localhost systemd[1]: Started libcrun container. 
Oct 14 04:23:42 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/47662028da5721c0c74410535dffb8022fc35a3f39034c6ae555d30a14cd4135/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Oct 14 04:23:42 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/47662028da5721c0c74410535dffb8022fc35a3f39034c6ae555d30a14cd4135/merged/var/log/neutron supports timestamps until 2038 (0x7fffffff) Oct 14 04:23:42 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/47662028da5721c0c74410535dffb8022fc35a3f39034c6ae555d30a14cd4135/merged/etc/neutron/kill_scripts supports timestamps until 2038 (0x7fffffff) Oct 14 04:23:42 localhost podman[72037]: 2025-10-14 08:23:42.662523332 +0000 UTC m=+0.023352039 image pull registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1 Oct 14 04:23:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e. 
Oct 14 04:23:42 localhost podman[72037]: 2025-10-14 08:23:42.76830687 +0000 UTC m=+0.129135617 container init 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.33.12, vendor=Red Hat, Inc., container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, version=17.1.9, release=1, tcib_managed=true, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20250721.1, architecture=x86_64, build-date=2025-07-21T16:28:53, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn) Oct 14 04:23:42 localhost podman[72038]: 2025-10-14 08:23:42.783191648 +0000 UTC m=+0.136164134 container create a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.buildah.version=1.33.12, batch=17.1_20250721.1, com.redhat.component=openstack-ovn-controller-container, vendor=Red Hat, Inc., container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, 
managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, build-date=2025-07-21T13:28:44, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1, vcs-type=git, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1, config_id=tripleo_step4, name=rhosp17/openstack-ovn-controller, version=17.1.9, summary=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1) Oct 14 04:23:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e. Oct 14 04:23:42 localhost podman[72037]: 2025-10-14 08:23:42.797831188 +0000 UTC m=+0.158659905 container start 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.tags=rhosp osp openstack osp-17.1, container_name=ovn_metadata_agent, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20250721.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, build-date=2025-07-21T16:28:53, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, vendor=Red Hat, Inc., vcs-type=git, config_id=tripleo_step4, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, distribution-scope=public, release=1, tcib_managed=true, io.buildah.version=1.33.12, version=17.1.9, name=rhosp17/openstack-neutron-metadata-agent-ovn) Oct 14 04:23:42 localhost python3[71170]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name ovn_metadata_agent --cgroupns=host --conmon-pidfile /run/ovn_metadata_agent.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=0a131c335ed9f542ed2a9fb22aa1dfa8 --healthcheck-command /openstack/healthcheck --label config_id=tripleo_step4 --label 
container_name=ovn_metadata_agent --label managed_by=tripleo_ansible --label config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/ovn_metadata_agent.log --network host --pid host --privileged=True --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume 
/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/log/containers/neutron:/var/log/neutron:z --volume /var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro --volume /lib/modules:/lib/modules:ro --volume /run/openvswitch:/run/openvswitch:shared,z --volume /var/lib/neutron:/var/lib/neutron:shared,z --volume /run/netns:/run/netns:shared --volume /var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z --volume /var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1 Oct 14 04:23:42 localhost systemd[1]: Started libpod-conmon-a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5.scope. Oct 14 04:23:42 localhost podman[72038]: 2025-10-14 08:23:42.74222723 +0000 UTC m=+0.095199686 image pull registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1 Oct 14 04:23:42 localhost systemd[1]: Started libcrun container. Oct 14 04:23:42 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/982298a30930777b3d22fc278491644ea40b8cb3cf87000aab15fb7785d6006b/merged/run/ovn supports timestamps until 2038 (0x7fffffff) Oct 14 04:23:42 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/982298a30930777b3d22fc278491644ea40b8cb3cf87000aab15fb7785d6006b/merged/var/log/openvswitch supports timestamps until 2038 (0x7fffffff) Oct 14 04:23:42 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/982298a30930777b3d22fc278491644ea40b8cb3cf87000aab15fb7785d6006b/merged/var/log/ovn supports timestamps until 2038 (0x7fffffff) Oct 14 04:23:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5. 
Oct 14 04:23:42 localhost podman[72038]: 2025-10-14 08:23:42.882645623 +0000 UTC m=+0.235618079 container init a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, managed_by=tripleo_ansible, build-date=2025-07-21T13:28:44, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.buildah.version=1.33.12, release=1, description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, version=17.1.9, io.openshift.expose-services=, config_id=tripleo_step4, name=rhosp17/openstack-ovn-controller, architecture=x86_64, container_name=ovn_controller, tcib_managed=true, batch=17.1_20250721.1) Oct 14 04:23:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run 
a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5. Oct 14 04:23:42 localhost systemd-logind[760]: Existing logind session ID 28 used by new audit session, ignoring. Oct 14 04:23:42 localhost systemd[1]: Created slice User Slice of UID 0. Oct 14 04:23:42 localhost systemd[1]: Starting User Runtime Directory /run/user/0... Oct 14 04:23:42 localhost podman[72038]: 2025-10-14 08:23:42.923433796 +0000 UTC m=+0.276406272 container start a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, distribution-scope=public, version=17.1.9, vcs-type=git, container_name=ovn_controller, architecture=x86_64, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/agreements, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., com.redhat.component=openstack-ovn-controller-container, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-07-21T13:28:44, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1, 
io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, batch=17.1_20250721.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, io.buildah.version=1.33.12) Oct 14 04:23:42 localhost python3[71170]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name ovn_controller --conmon-pidfile /run/ovn_controller.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --healthcheck-command /openstack/healthcheck 6642 --label config_id=tripleo_step4 --label container_name=ovn_controller --label managed_by=tripleo_ansible --label config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/ovn_controller.log --network host --privileged=True --user root --volume /var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro --volume /lib/modules:/lib/modules:ro --volume /run:/run --volume /var/lib/openvswitch/ovn:/run/ovn:shared,z --volume /var/log/containers/openvswitch:/var/log/openvswitch:z --volume /var/log/containers/openvswitch:/var/log/ovn:z registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1 Oct 14 04:23:42 localhost systemd[1]: Finished User Runtime Directory /run/user/0. Oct 14 04:23:42 localhost systemd[1]: Starting User Manager for UID 0... 
Oct 14 04:23:42 localhost podman[72073]: 2025-10-14 08:23:42.964526548 +0000 UTC m=+0.160133870 container health_status 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=starting, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.9, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, io.buildah.version=1.33.12, container_name=ovn_metadata_agent, name=rhosp17/openstack-neutron-metadata-agent-ovn, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, build-date=2025-07-21T16:28:53, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', 
'/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, architecture=x86_64, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20250721.1, io.openshift.expose-services=, release=1, distribution-scope=public, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements) Oct 14 04:23:43 localhost podman[72073]: 2025-10-14 08:23:43.006057384 +0000 UTC m=+0.201664656 container exec_died 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1, config_id=tripleo_step4, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, architecture=x86_64, managed_by=tripleo_ansible, version=17.1.9, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-07-21T16:28:53, distribution-scope=public, 
tcib_managed=true, io.openshift.expose-services=, batch=17.1_20250721.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}) Oct 14 04:23:43 localhost podman[72073]: unhealthy Oct 14 04:23:43 localhost systemd[1]: 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e.service: Main process exited, code=exited, status=1/FAILURE Oct 14 04:23:43 localhost systemd[1]: 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e.service: Failed with result 'exit-code'. 
Oct 14 04:23:43 localhost podman[72109]: 2025-10-14 08:23:43.044192315 +0000 UTC m=+0.120931975 container health_status a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=starting, container_name=ovn_controller, batch=17.1_20250721.1, release=1, version=17.1.9, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, managed_by=tripleo_ansible, vendor=Red Hat, Inc., vcs-type=git, com.redhat.component=openstack-ovn-controller-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-07-21T13:28:44, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, io.openshift.tags=rhosp osp openstack osp-17.1, distribution-scope=public, io.buildah.version=1.33.12, name=rhosp17/openstack-ovn-controller, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, tcib_managed=true) Oct 14 04:23:43 localhost podman[72109]: 2025-10-14 08:23:43.085036029 +0000 
UTC m=+0.161775759 container exec_died a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, release=1, vcs-type=git, version=17.1.9, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, vendor=Red Hat, Inc., io.buildah.version=1.33.12, name=rhosp17/openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, container_name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, batch=17.1_20250721.1, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ovn-controller-container, managed_by=tripleo_ansible, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, build-date=2025-07-21T13:28:44, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64) Oct 14 04:23:43 localhost podman[72109]: unhealthy Oct 14 04:23:43 localhost systemd[1]: a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5.service: Main process 
exited, code=exited, status=1/FAILURE Oct 14 04:23:43 localhost systemd[1]: a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5.service: Failed with result 'exit-code'. Oct 14 04:23:43 localhost systemd[72129]: Queued start job for default target Main User Target. Oct 14 04:23:43 localhost systemd[72129]: Created slice User Application Slice. Oct 14 04:23:43 localhost systemd[72129]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system). Oct 14 04:23:43 localhost systemd[72129]: Started Daily Cleanup of User's Temporary Directories. Oct 14 04:23:43 localhost systemd[72129]: Reached target Paths. Oct 14 04:23:43 localhost systemd[72129]: Reached target Timers. Oct 14 04:23:43 localhost systemd[72129]: Starting D-Bus User Message Bus Socket... Oct 14 04:23:43 localhost systemd[72129]: Starting Create User's Volatile Files and Directories... Oct 14 04:23:43 localhost systemd[72129]: Listening on D-Bus User Message Bus Socket. Oct 14 04:23:43 localhost systemd[72129]: Reached target Sockets. Oct 14 04:23:43 localhost systemd[72129]: Finished Create User's Volatile Files and Directories. Oct 14 04:23:43 localhost systemd[72129]: Reached target Basic System. Oct 14 04:23:43 localhost systemd[72129]: Reached target Main User Target. Oct 14 04:23:43 localhost systemd[72129]: Startup finished in 134ms. Oct 14 04:23:43 localhost systemd[1]: Started User Manager for UID 0. Oct 14 04:23:43 localhost systemd[1]: Started Session c9 of User root. Oct 14 04:23:43 localhost systemd[1]: session-c9.scope: Deactivated successfully. Oct 14 04:23:43 localhost kernel: device br-int entered promiscuous mode Oct 14 04:23:43 localhost NetworkManager[5977]: [1760430223.2581] manager: (br-int): new Generic device (/org/freedesktop/NetworkManager/Devices/11) Oct 14 04:23:43 localhost systemd-udevd[72188]: Network interface NamePolicy= disabled on kernel command line. 
Oct 14 04:23:43 localhost NetworkManager[5977]: [1760430223.3140] device (genev_sys_6081): carrier: link connected Oct 14 04:23:43 localhost systemd-udevd[72191]: Network interface NamePolicy= disabled on kernel command line. Oct 14 04:23:43 localhost kernel: device genev_sys_6081 entered promiscuous mode Oct 14 04:23:43 localhost NetworkManager[5977]: [1760430223.3142] manager: (genev_sys_6081): new Generic device (/org/freedesktop/NetworkManager/Devices/12) Oct 14 04:23:43 localhost python3[72210]: ansible-file Invoked with path=/etc/systemd/system/tripleo_ceilometer_agent_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 04:23:43 localhost python3[72226]: ansible-file Invoked with path=/etc/systemd/system/tripleo_ceilometer_agent_ipmi.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 04:23:44 localhost python3[72242]: ansible-file Invoked with path=/etc/systemd/system/tripleo_logrotate_crond.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 04:23:44 localhost python3[72258]: ansible-file Invoked with path=/etc/systemd/system/tripleo_nova_migration_target.requires state=absent 
recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 04:23:44 localhost python3[72274]: ansible-file Invoked with path=/etc/systemd/system/tripleo_ovn_controller.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 04:23:44 localhost python3[72293]: ansible-file Invoked with path=/etc/systemd/system/tripleo_ovn_metadata_agent.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 04:23:45 localhost python3[72311]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_ceilometer_agent_compute_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Oct 14 04:23:45 localhost python3[72327]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_ceilometer_agent_ipmi_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Oct 14 04:23:45 localhost python3[72345]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_logrotate_crond_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Oct 14 04:23:45 localhost python3[72363]: 
ansible-stat Invoked with path=/etc/systemd/system/tripleo_nova_migration_target_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Oct 14 04:23:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6. Oct 14 04:23:46 localhost systemd[1]: tmp-crun.F8xgT1.mount: Deactivated successfully. Oct 14 04:23:46 localhost podman[72380]: 2025-10-14 08:23:46.066172011 +0000 UTC m=+0.088634304 container health_status 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, distribution-scope=public, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1, architecture=x86_64, io.buildah.version=1.33.12, version=17.1.9, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, build-date=2025-07-21T13:07:59, release=1, tcib_managed=true, batch=17.1_20250721.1, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, container_name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, maintainer=OpenStack TripleO Team, config_id=tripleo_step1) Oct 14 04:23:46 localhost python3[72379]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_ovn_controller_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Oct 14 04:23:46 localhost podman[72380]: 2025-10-14 08:23:46.260807999 +0000 UTC m=+0.283270302 container exec_died 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vcs-type=git, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, distribution-scope=public, config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1, batch=17.1_20250721.1, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, name=rhosp17/openstack-qdrouterd, architecture=x86_64, build-date=2025-07-21T13:07:59, io.openshift.expose-services=, tcib_managed=true, com.redhat.component=openstack-qdrouterd-container, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1, version=17.1.9, container_name=metrics_qdr) Oct 14 04:23:46 localhost systemd[1]: 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6.service: Deactivated successfully. 
Oct 14 04:23:46 localhost python3[72423]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_ovn_metadata_agent_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Oct 14 04:23:46 localhost python3[72485]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1760430226.4163597-110027-123563269151460/source dest=/etc/systemd/system/tripleo_ceilometer_agent_compute.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 04:23:47 localhost python3[72514]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1760430226.4163597-110027-123563269151460/source dest=/etc/systemd/system/tripleo_ceilometer_agent_ipmi.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 04:23:47 localhost python3[72543]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1760430226.4163597-110027-123563269151460/source dest=/etc/systemd/system/tripleo_logrotate_crond.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 04:23:48 localhost python3[72572]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1760430226.4163597-110027-123563269151460/source 
dest=/etc/systemd/system/tripleo_nova_migration_target.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 04:23:49 localhost python3[72601]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1760430226.4163597-110027-123563269151460/source dest=/etc/systemd/system/tripleo_ovn_controller.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 04:23:49 localhost python3[72630]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1760430226.4163597-110027-123563269151460/source dest=/etc/systemd/system/tripleo_ovn_metadata_agent.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 04:23:49 localhost python3[72646]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None Oct 14 04:23:49 localhost systemd[1]: Reloading. Oct 14 04:23:50 localhost systemd-rc-local-generator[72667]: /etc/rc.d/rc.local is not marked executable, skipping. Oct 14 04:23:50 localhost systemd-sysv-generator[72672]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. 
Please update package to include a native systemd unit file, in order to make it more safe and robust. Oct 14 04:23:50 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Oct 14 04:23:50 localhost python3[72698]: ansible-systemd Invoked with state=restarted name=tripleo_ceilometer_agent_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Oct 14 04:23:50 localhost systemd[1]: Reloading. Oct 14 04:23:51 localhost systemd-rc-local-generator[72724]: /etc/rc.d/rc.local is not marked executable, skipping. Oct 14 04:23:51 localhost systemd-sysv-generator[72727]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Oct 14 04:23:51 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Oct 14 04:23:51 localhost systemd[1]: Starting ceilometer_agent_compute container... Oct 14 04:23:51 localhost tripleo-start-podman-container[72738]: Creating additional drop-in dependency for "ceilometer_agent_compute" (def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e) Oct 14 04:23:51 localhost systemd[1]: Reloading. Oct 14 04:23:51 localhost systemd-rc-local-generator[72794]: /etc/rc.d/rc.local is not marked executable, skipping. Oct 14 04:23:51 localhost systemd-sysv-generator[72797]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. 
Oct 14 04:23:51 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Oct 14 04:23:51 localhost systemd[1]: Started ceilometer_agent_compute container. Oct 14 04:23:52 localhost python3[72821]: ansible-systemd Invoked with state=restarted name=tripleo_ceilometer_agent_ipmi.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Oct 14 04:23:52 localhost systemd[1]: Reloading. Oct 14 04:23:52 localhost systemd-rc-local-generator[72847]: /etc/rc.d/rc.local is not marked executable, skipping. Oct 14 04:23:52 localhost systemd-sysv-generator[72853]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Oct 14 04:23:52 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Oct 14 04:23:52 localhost systemd[1]: Starting ceilometer_agent_ipmi container... Oct 14 04:23:52 localhost systemd[1]: Started ceilometer_agent_ipmi container. Oct 14 04:23:53 localhost systemd[1]: Stopping User Manager for UID 0... Oct 14 04:23:53 localhost systemd[72129]: Activating special unit Exit the Session... Oct 14 04:23:53 localhost systemd[72129]: Stopped target Main User Target. Oct 14 04:23:53 localhost systemd[72129]: Stopped target Basic System. Oct 14 04:23:53 localhost systemd[72129]: Stopped target Paths. Oct 14 04:23:53 localhost systemd[72129]: Stopped target Sockets. Oct 14 04:23:53 localhost systemd[72129]: Stopped target Timers. Oct 14 04:23:53 localhost systemd[72129]: Stopped Daily Cleanup of User's Temporary Directories. 
Oct 14 04:23:53 localhost systemd[72129]: Closed D-Bus User Message Bus Socket. Oct 14 04:23:53 localhost systemd[72129]: Stopped Create User's Volatile Files and Directories. Oct 14 04:23:53 localhost systemd[72129]: Removed slice User Application Slice. Oct 14 04:23:53 localhost systemd[72129]: Reached target Shutdown. Oct 14 04:23:53 localhost systemd[72129]: Finished Exit the Session. Oct 14 04:23:53 localhost systemd[72129]: Reached target Exit the Session. Oct 14 04:23:53 localhost systemd[1]: user@0.service: Deactivated successfully. Oct 14 04:23:53 localhost systemd[1]: Stopped User Manager for UID 0. Oct 14 04:23:53 localhost systemd[1]: Stopping User Runtime Directory /run/user/0... Oct 14 04:23:53 localhost systemd[1]: run-user-0.mount: Deactivated successfully. Oct 14 04:23:53 localhost systemd[1]: user-runtime-dir@0.service: Deactivated successfully. Oct 14 04:23:53 localhost systemd[1]: Stopped User Runtime Directory /run/user/0. Oct 14 04:23:53 localhost systemd[1]: Removed slice User Slice of UID 0. Oct 14 04:23:53 localhost python3[72891]: ansible-systemd Invoked with state=restarted name=tripleo_logrotate_crond.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Oct 14 04:23:53 localhost systemd[1]: Reloading. Oct 14 04:23:53 localhost systemd-rc-local-generator[72916]: /etc/rc.d/rc.local is not marked executable, skipping. Oct 14 04:23:53 localhost systemd-sysv-generator[72920]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Oct 14 04:23:53 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Oct 14 04:23:53 localhost systemd[1]: Starting logrotate_crond container... 
Oct 14 04:23:53 localhost systemd[1]: Started logrotate_crond container. Oct 14 04:23:54 localhost python3[72959]: ansible-systemd Invoked with state=restarted name=tripleo_nova_migration_target.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Oct 14 04:23:54 localhost systemd[1]: Reloading. Oct 14 04:23:54 localhost systemd-rc-local-generator[72990]: /etc/rc.d/rc.local is not marked executable, skipping. Oct 14 04:23:54 localhost systemd-sysv-generator[72993]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Oct 14 04:23:54 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Oct 14 04:23:54 localhost systemd[1]: Starting nova_migration_target container... Oct 14 04:23:54 localhost systemd[1]: Started nova_migration_target container. Oct 14 04:23:55 localhost python3[73027]: ansible-systemd Invoked with state=restarted name=tripleo_ovn_controller.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Oct 14 04:23:55 localhost systemd[1]: Reloading. Oct 14 04:23:55 localhost systemd-rc-local-generator[73053]: /etc/rc.d/rc.local is not marked executable, skipping. Oct 14 04:23:55 localhost systemd-sysv-generator[73057]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Oct 14 04:23:55 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Oct 14 04:23:56 localhost systemd[1]: Starting ovn_controller container... Oct 14 04:23:56 localhost tripleo-start-podman-container[73067]: Creating additional drop-in dependency for "ovn_controller" (a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5) Oct 14 04:23:56 localhost systemd[1]: Reloading. Oct 14 04:23:56 localhost systemd-rc-local-generator[73126]: /etc/rc.d/rc.local is not marked executable, skipping. Oct 14 04:23:56 localhost systemd-sysv-generator[73129]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Oct 14 04:23:56 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Oct 14 04:23:56 localhost systemd[1]: Started ovn_controller container. Oct 14 04:23:57 localhost python3[73151]: ansible-systemd Invoked with state=restarted name=tripleo_ovn_metadata_agent.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Oct 14 04:23:57 localhost systemd[1]: Reloading. Oct 14 04:23:57 localhost systemd-sysv-generator[73180]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Oct 14 04:23:57 localhost systemd-rc-local-generator[73176]: /etc/rc.d/rc.local is not marked executable, skipping. Oct 14 04:23:57 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Oct 14 04:23:57 localhost systemd[1]: Starting ovn_metadata_agent container... 
Oct 14 04:23:57 localhost systemd[1]: Started ovn_metadata_agent container. Oct 14 04:23:58 localhost python3[73231]: ansible-file Invoked with path=/var/lib/container-puppet/container-puppet-tasks4.json state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 04:23:59 localhost python3[73352]: ansible-container_puppet_config Invoked with check_mode=False config_vol_prefix=/var/lib/config-data debug=True net_host=True no_archive=True puppet_config=/var/lib/container-puppet/container-puppet-tasks4.json short_hostname=np0005486733 step=4 update_config_hash_only=False Oct 14 04:24:00 localhost python3[73368]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 04:24:00 localhost python3[73384]: ansible-container_config_data Invoked with config_path=/var/lib/tripleo-config/container-puppet-config/step_4 config_pattern=container-puppet-*.json config_overrides={} debug=True Oct 14 04:24:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8. 
Oct 14 04:24:02 localhost podman[73386]: 2025-10-14 08:24:02.722809877 +0000 UTC m=+0.065548274 container health_status 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, io.openshift.expose-services=, name=rhosp17/openstack-collectd, version=17.1.9, managed_by=tripleo_ansible, release=2, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-07-21T13:04:03, com.redhat.component=openstack-collectd-container, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', 
'/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, tcib_managed=true, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, architecture=x86_64, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 collectd) Oct 14 04:24:02 localhost podman[73386]: 2025-10-14 08:24:02.732921738 +0000 UTC m=+0.075660235 container exec_died 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, vcs-type=git, build-date=2025-07-21T13:04:03, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, com.redhat.component=openstack-collectd-container, architecture=x86_64, name=rhosp17/openstack-collectd, version=17.1.9, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20250721.1, tcib_managed=true, release=2, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2) Oct 14 04:24:02 localhost systemd[1]: 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8.service: Deactivated successfully. Oct 14 04:24:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea. Oct 14 04:24:04 localhost systemd[1]: tmp-crun.hHtjyU.mount: Deactivated successfully. 
Oct 14 04:24:04 localhost podman[73407]: 2025-10-14 08:24:04.732498769 +0000 UTC m=+0.075966155 container health_status 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, container_name=iscsid, batch=17.1_20250721.1, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, architecture=x86_64, io.buildah.version=1.33.12, managed_by=tripleo_ansible, com.redhat.component=openstack-iscsid-container, version=17.1.9, description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, vcs-type=git, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1, release=1, tcib_managed=true, build-date=2025-07-21T13:27:15, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, name=rhosp17/openstack-iscsid, config_id=tripleo_step3) Oct 14 04:24:04 localhost podman[73407]: 2025-10-14 08:24:04.739848354 +0000 UTC m=+0.083315750 container exec_died 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, io.openshift.tags=rhosp osp openstack osp-17.1, io.openshift.expose-services=, build-date=2025-07-21T13:27:15, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, distribution-scope=public, vcs-type=git, version=17.1.9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., tcib_managed=true, batch=17.1_20250721.1, release=1, summary=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, io.buildah.version=1.33.12, com.redhat.component=openstack-iscsid-container, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhosp17/openstack-iscsid, container_name=iscsid) Oct 14 04:24:04 localhost systemd[1]: 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea.service: Deactivated successfully. Oct 14 04:24:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638. Oct 14 04:24:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f. Oct 14 04:24:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e. Oct 14 04:24:09 localhost systemd[1]: tmp-crun.fi5mqF.mount: Deactivated successfully. 
Oct 14 04:24:09 localhost podman[73426]: 2025-10-14 08:24:09.740772727 +0000 UTC m=+0.088834980 container health_status 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=starting, io.openshift.expose-services=, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1, version=17.1.9, release=1, container_name=ceilometer_agent_ipmi, name=rhosp17/openstack-ceilometer-ipmi, distribution-scope=public, architecture=x86_64, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, 
io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20250721.1, com.redhat.component=openstack-ceilometer-ipmi-container, tcib_managed=true, build-date=2025-07-21T15:29:47, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/agreements) Oct 14 04:24:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884. Oct 14 04:24:09 localhost podman[73427]: 2025-10-14 08:24:09.800720508 +0000 UTC m=+0.142393195 container health_status d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, build-date=2025-07-21T13:07:52, vcs-type=git, com.redhat.component=openstack-cron-container, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhosp17/openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, distribution-scope=public, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, config_id=tripleo_step4, tcib_managed=true, architecture=x86_64, container_name=logrotate_crond, io.buildah.version=1.33.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 
'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}) Oct 14 04:24:09 localhost podman[73427]: 2025-10-14 08:24:09.837114936 +0000 UTC m=+0.178787633 container exec_died d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, release=1, batch=17.1_20250721.1, vendor=Red Hat, Inc., config_id=tripleo_step4, container_name=logrotate_crond, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, version=17.1.9, name=rhosp17/openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, build-date=2025-07-21T13:07:52, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible) Oct 14 04:24:09 localhost podman[73431]: 2025-10-14 08:24:09.850434105 +0000 UTC m=+0.189010807 container health_status def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=starting, io.buildah.version=1.33.12, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, build-date=2025-07-21T14:45:33, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, container_name=ceilometer_agent_compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.9, name=rhosp17/openstack-ceilometer-compute, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20250721.1, release=1, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, vendor=Red Hat, Inc., io.openshift.expose-services=, tcib_managed=true, config_id=tripleo_step4, managed_by=tripleo_ansible) Oct 14 04:24:09 localhost systemd[1]: d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f.service: Deactivated successfully. 
Oct 14 04:24:09 localhost podman[73426]: 2025-10-14 08:24:09.871190883 +0000 UTC m=+0.219253216 container exec_died 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.openshift.expose-services=, config_id=tripleo_step4, release=1, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_ipmi, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, build-date=2025-07-21T15:29:47, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20250721.1, distribution-scope=public, version=17.1.9, managed_by=tripleo_ansible, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, architecture=x86_64, com.redhat.component=openstack-ceilometer-ipmi-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-ceilometer-ipmi) Oct 14 04:24:09 localhost systemd[1]: 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638.service: Deactivated successfully. Oct 14 04:24:09 localhost podman[73479]: 2025-10-14 08:24:09.932272409 +0000 UTC m=+0.136723731 container health_status 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, build-date=2025-07-21T14:48:37, release=1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, tcib_managed=true, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, container_name=nova_migration_target, distribution-scope=public, batch=17.1_20250721.1, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., version=17.1.9, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, architecture=x86_64, io.buildah.version=1.33.12, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 nova-compute) Oct 14 04:24:09 localhost podman[73431]: 2025-10-14 08:24:09.955227404 +0000 UTC m=+0.293804106 container exec_died def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, distribution-scope=public, release=1, io.buildah.version=1.33.12, version=17.1.9, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, batch=17.1_20250721.1, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, container_name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, 
vcs-type=git, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-compute, architecture=x86_64, managed_by=tripleo_ansible, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, build-date=2025-07-21T14:45:33) Oct 14 04:24:09 localhost systemd[1]: def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e.service: Deactivated successfully. 
Oct 14 04:24:10 localhost podman[73479]: 2025-10-14 08:24:10.295123764 +0000 UTC m=+0.499575106 container exec_died 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, vcs-type=git, vendor=Red Hat, Inc., distribution-scope=public, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.33.12, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T14:48:37, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=nova_migration_target, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, release=1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.9, batch=17.1_20250721.1, name=rhosp17/openstack-nova-compute) Oct 14 04:24:10 localhost systemd[1]: 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884.service: Deactivated successfully. Oct 14 04:24:10 localhost systemd[1]: tmp-crun.ma5z3B.mount: Deactivated successfully. Oct 14 04:24:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e. Oct 14 04:24:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5. Oct 14 04:24:13 localhost podman[73521]: 2025-10-14 08:24:13.744850908 +0000 UTC m=+0.087271611 container health_status 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=starting, container_name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, build-date=2025-07-21T16:28:53, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1, config_id=tripleo_step4, distribution-scope=public, managed_by=tripleo_ansible, batch=17.1_20250721.1, vcs-type=git, release=1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.33.12, name=rhosp17/openstack-neutron-metadata-agent-ovn, 
com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, architecture=x86_64, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}) Oct 14 04:24:13 localhost podman[73521]: 2025-10-14 08:24:13.787150748 +0000 UTC m=+0.129571441 container exec_died 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, 
build-date=2025-07-21T16:28:53, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, managed_by=tripleo_ansible, batch=17.1_20250721.1, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, maintainer=OpenStack TripleO Team, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, io.buildah.version=1.33.12, tcib_managed=true, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc., config_id=tripleo_step4, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, release=1, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', 
'/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, container_name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.9, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Oct 14 04:24:13 localhost podman[73522]: 2025-10-14 08:24:13.798726303 +0000 UTC m=+0.137586827 container health_status a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=starting, name=rhosp17/openstack-ovn-controller, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true, distribution-scope=public, vendor=Red Hat, Inc., vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.9, container_name=ovn_controller, managed_by=tripleo_ansible, release=1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, io.openshift.expose-services=, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, batch=17.1_20250721.1, com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': 
['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.buildah.version=1.33.12, architecture=x86_64, build-date=2025-07-21T13:28:44, description=Red Hat OpenStack Platform 17.1 ovn-controller) Oct 14 04:24:13 localhost systemd[1]: 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e.service: Deactivated successfully. Oct 14 04:24:13 localhost podman[73522]: 2025-10-14 08:24:13.814357054 +0000 UTC m=+0.153217598 container exec_died a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, name=rhosp17/openstack-ovn-controller, tcib_managed=true, container_name=ovn_controller, vendor=Red Hat, Inc., distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, com.redhat.component=openstack-ovn-controller-container, description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.9, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.tags=rhosp osp openstack osp-17.1, io.buildah.version=1.33.12, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, 
vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, build-date=2025-07-21T13:28:44, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, io.openshift.expose-services=, release=1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, batch=17.1_20250721.1, vcs-type=git) Oct 14 04:24:13 localhost systemd[1]: a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5.service: Deactivated successfully. Oct 14 04:24:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6. Oct 14 04:24:16 localhost podman[73569]: 2025-10-14 08:24:16.737941996 +0000 UTC m=+0.084479885 container health_status 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, distribution-scope=public, name=rhosp17/openstack-qdrouterd, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, container_name=metrics_qdr, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vendor=Red Hat, Inc., architecture=x86_64, io.openshift.expose-services=, vcs-type=git, build-date=2025-07-21T13:07:59, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.9, batch=17.1_20250721.1) Oct 14 04:24:16 localhost podman[73569]: 2025-10-14 08:24:16.930073888 +0000 UTC m=+0.276611777 container exec_died 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, batch=17.1_20250721.1, distribution-scope=public, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 
'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, name=rhosp17/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.9, architecture=x86_64, io.buildah.version=1.33.12, build-date=2025-07-21T13:07:59, config_id=tripleo_step1, container_name=metrics_qdr, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, release=1) Oct 14 04:24:16 localhost systemd[1]: 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6.service: Deactivated successfully. Oct 14 04:24:22 localhost snmpd[68005]: empty variable list in _query Oct 14 04:24:22 localhost snmpd[68005]: empty variable list in _query Oct 14 04:24:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8. 
Oct 14 04:24:33 localhost podman[73676]: 2025-10-14 08:24:33.748285825 +0000 UTC m=+0.087550291 container health_status 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.component=openstack-collectd-container, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/agreements, version=17.1.9, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, batch=17.1_20250721.1, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, vcs-type=git, architecture=x86_64, container_name=collectd, release=2, build-date=2025-07-21T13:04:03, distribution-scope=public, io.openshift.expose-services=, name=rhosp17/openstack-collectd, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vendor=Red Hat, Inc.) Oct 14 04:24:33 localhost podman[73676]: 2025-10-14 08:24:33.787200244 +0000 UTC m=+0.126464680 container exec_died 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, vcs-type=git, version=17.1.9, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-07-21T13:04:03, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, com.redhat.component=openstack-collectd-container, release=2, summary=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, architecture=x86_64, batch=17.1_20250721.1, vendor=Red Hat, Inc., vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, managed_by=tripleo_ansible, distribution-scope=public, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=collectd, name=rhosp17/openstack-collectd, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1) Oct 14 04:24:33 localhost systemd[1]: 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8.service: Deactivated successfully. Oct 14 04:24:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea. 
Oct 14 04:24:35 localhost podman[73697]: 2025-10-14 08:24:35.760611556 +0000 UTC m=+0.092443640 container health_status 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, container_name=iscsid, config_id=tripleo_step3, distribution-scope=public, com.redhat.component=openstack-iscsid-container, description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.license_terms=https://www.redhat.com/agreements, managed_by=tripleo_ansible, io.buildah.version=1.33.12, 
name=rhosp17/openstack-iscsid, version=17.1.9, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, release=1, vendor=Red Hat, Inc., batch=17.1_20250721.1, build-date=2025-07-21T13:27:15, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid) Oct 14 04:24:35 localhost podman[73697]: 2025-10-14 08:24:35.796607847 +0000 UTC m=+0.128439931 container exec_died 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1, version=17.1.9, io.openshift.expose-services=, architecture=x86_64, release=1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', 
'/var/lib/iscsi:/var/lib/iscsi:z']}, distribution-scope=public, io.buildah.version=1.33.12, description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., managed_by=tripleo_ansible, build-date=2025-07-21T13:27:15, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/agreements, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, vcs-type=git, config_id=tripleo_step3, name=rhosp17/openstack-iscsid, batch=17.1_20250721.1, maintainer=OpenStack TripleO Team) Oct 14 04:24:35 localhost systemd[1]: 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea.service: Deactivated successfully. Oct 14 04:24:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638. Oct 14 04:24:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884. Oct 14 04:24:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f. Oct 14 04:24:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e. 
Oct 14 04:24:40 localhost podman[73718]: 2025-10-14 08:24:40.750050587 +0000 UTC m=+0.084699646 container health_status d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, architecture=x86_64, maintainer=OpenStack TripleO Team, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, version=17.1.9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, batch=17.1_20250721.1, io.openshift.tags=rhosp osp openstack osp-17.1, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, managed_by=tripleo_ansible, vendor=Red Hat, Inc., container_name=logrotate_crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, build-date=2025-07-21T13:07:52, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, tcib_managed=true, io.buildah.version=1.33.12, name=rhosp17/openstack-cron, vcs-type=git, distribution-scope=public, release=1) Oct 14 04:24:40 localhost podman[73718]: 2025-10-14 08:24:40.756537453 +0000 UTC m=+0.091186472 container exec_died d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.33.12, distribution-scope=public, io.openshift.expose-services=, release=1, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vendor=Red Hat, Inc., 
config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, build-date=2025-07-21T13:07:52, name=rhosp17/openstack-cron, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, version=17.1.9, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20250721.1, container_name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1) Oct 14 04:24:40 localhost systemd[1]: d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f.service: Deactivated successfully. Oct 14 04:24:40 localhost podman[73717]: 2025-10-14 08:24:40.807027082 +0000 UTC m=+0.143608550 container health_status 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-07-21T14:48:37, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, release=1, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, vcs-type=git, batch=17.1_20250721.1, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, io.buildah.version=1.33.12, version=17.1.9, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=) Oct 14 04:24:40 localhost podman[73716]: 2025-10-14 08:24:40.712479888 +0000 UTC m=+0.055214272 container health_status 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, config_id=tripleo_step4, container_name=ceilometer_agent_ipmi, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, architecture=x86_64, build-date=2025-07-21T15:29:47, io.openshift.expose-services=, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, distribution-scope=public, version=17.1.9, name=rhosp17/openstack-ceilometer-ipmi) Oct 14 04:24:40 localhost podman[73721]: 2025-10-14 08:24:40.867868065 +0000 UTC m=+0.198688068 container health_status def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, version=17.1.9, build-date=2025-07-21T14:45:33, vcs-type=git, batch=17.1_20250721.1, maintainer=OpenStack TripleO Team, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, vendor=Red Hat, Inc., architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true, com.redhat.component=openstack-ceilometer-compute-container, distribution-scope=public, io.buildah.version=1.33.12, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-compute, container_name=ceilometer_agent_compute, release=1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/agreements) Oct 14 04:24:40 localhost podman[73716]: 2025-10-14 08:24:40.892108218 +0000 UTC 
m=+0.234842652 container exec_died 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, container_name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.component=openstack-ceilometer-ipmi-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, version=17.1.9, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, name=rhosp17/openstack-ceilometer-ipmi, architecture=x86_64, 
vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, build-date=2025-07-21T15:29:47, config_id=tripleo_step4, distribution-scope=public, vendor=Red Hat, Inc., vcs-type=git, io.openshift.expose-services=, release=1, batch=17.1_20250721.1) Oct 14 04:24:40 localhost systemd[1]: 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638.service: Deactivated successfully. Oct 14 04:24:40 localhost podman[73721]: 2025-10-14 08:24:40.948047432 +0000 UTC m=+0.278867495 container exec_died def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step4, vendor=Red Hat, Inc., architecture=x86_64, name=rhosp17/openstack-ceilometer-compute, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, release=1, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true, managed_by=tripleo_ansible, vcs-type=git, version=17.1.9, container_name=ceilometer_agent_compute, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20250721.1, build-date=2025-07-21T14:45:33, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container) Oct 14 04:24:40 localhost systemd[1]: def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e.service: Deactivated successfully. 
Oct 14 04:24:41 localhost podman[73717]: 2025-10-14 08:24:41.182857023 +0000 UTC m=+0.519438481 container exec_died 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, container_name=nova_migration_target, io.openshift.expose-services=, vcs-type=git, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step4, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, release=1, io.buildah.version=1.33.12, vendor=Red Hat, Inc., distribution-scope=public, tcib_managed=true, name=rhosp17/openstack-nova-compute, version=17.1.9, batch=17.1_20250721.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-07-21T14:48:37, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}) Oct 14 04:24:41 localhost systemd[1]: 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884.service: Deactivated successfully. Oct 14 04:24:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e. Oct 14 04:24:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5. Oct 14 04:24:44 localhost systemd[1]: tmp-crun.aHRKw7.mount: Deactivated successfully. Oct 14 04:24:44 localhost podman[73810]: 2025-10-14 08:24:44.742551547 +0000 UTC m=+0.085899003 container health_status 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, release=1, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, build-date=2025-07-21T16:28:53, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, vendor=Red Hat, Inc., version=17.1.9, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.33.12, io.openshift.expose-services=, name=rhosp17/openstack-neutron-metadata-agent-ovn, batch=17.1_20250721.1, config_id=tripleo_step4, container_name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-type=git, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements) Oct 14 04:24:44 localhost podman[73810]: 2025-10-14 08:24:44.790302002 +0000 UTC m=+0.133649438 container exec_died 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, architecture=x86_64, name=rhosp17/openstack-neutron-metadata-agent-ovn, 
maintainer=OpenStack TripleO Team, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, distribution-scope=public, vcs-type=git, batch=17.1_20250721.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, vendor=Red Hat, Inc., com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, version=17.1.9, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', 
'/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, release=1, build-date=2025-07-21T16:28:53, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible) Oct 14 04:24:44 localhost systemd[1]: 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e.service: Deactivated successfully. Oct 14 04:24:44 localhost podman[73811]: 2025-10-14 08:24:44.791414296 +0000 UTC m=+0.131786571 container health_status a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, container_name=ovn_controller, batch=17.1_20250721.1, distribution-scope=public, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, build-date=2025-07-21T13:28:44, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, architecture=x86_64, release=1, vcs-type=git, com.redhat.component=openstack-ovn-controller-container, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, version=17.1.9, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', 
'/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhosp17/openstack-ovn-controller, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller) Oct 14 04:24:44 localhost podman[73811]: 2025-10-14 08:24:44.871035507 +0000 UTC m=+0.211407792 container exec_died a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, container_name=ovn_controller, name=rhosp17/openstack-ovn-controller, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, architecture=x86_64, release=1, com.redhat.license_terms=https://www.redhat.com/agreements, version=17.1.9, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, build-date=2025-07-21T13:28:44, maintainer=OpenStack TripleO Team, vcs-type=git, vendor=Red Hat, Inc., batch=17.1_20250721.1, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.33.12, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', 
'/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.component=openstack-ovn-controller-container, distribution-scope=public, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller) Oct 14 04:24:44 localhost systemd[1]: a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5.service: Deactivated successfully. Oct 14 04:24:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6. Oct 14 04:24:47 localhost podman[73858]: 2025-10-14 08:24:47.743098635 +0000 UTC m=+0.088514502 container health_status 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, build-date=2025-07-21T13:07:59, maintainer=OpenStack TripleO Team, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, vendor=Red Hat, Inc., version=17.1.9, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1, io.buildah.version=1.33.12, vcs-type=git, container_name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, batch=17.1_20250721.1, com.redhat.component=openstack-qdrouterd-container, release=1, tcib_managed=true, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': 
False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, name=rhosp17/openstack-qdrouterd, config_id=tripleo_step1) Oct 14 04:24:47 localhost podman[73858]: 2025-10-14 08:24:47.963368755 +0000 UTC m=+0.308784612 container exec_died 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, batch=17.1_20250721.1, io.openshift.expose-services=, distribution-scope=public, vcs-type=git, container_name=metrics_qdr, maintainer=OpenStack TripleO Team, release=1, build-date=2025-07-21T13:07:59, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, io.buildah.version=1.33.12, tcib_managed=true, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, vendor=Red Hat, Inc., version=17.1.9, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, managed_by=tripleo_ansible) Oct 14 04:24:47 localhost systemd[1]: 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6.service: Deactivated successfully. Oct 14 04:25:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8. 
Oct 14 04:25:04 localhost podman[73888]: 2025-10-14 08:25:04.703936045 +0000 UTC m=+0.052296954 container health_status 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, batch=17.1_20250721.1, release=2, tcib_managed=true, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, vcs-type=git, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, io.buildah.version=1.33.12, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-collectd, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, managed_by=tripleo_ansible, com.redhat.component=openstack-collectd-container, description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.9, architecture=x86_64, vendor=Red Hat, Inc., config_id=tripleo_step3, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, build-date=2025-07-21T13:04:03, container_name=collectd) Oct 14 04:25:04 localhost podman[73888]: 2025-10-14 08:25:04.717111824 +0000 UTC m=+0.065472773 container exec_died 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', 
'/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, distribution-scope=public, tcib_managed=true, vcs-type=git, config_id=tripleo_step3, io.buildah.version=1.33.12, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., com.redhat.component=openstack-collectd-container, release=2, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-07-21T13:04:03, batch=17.1_20250721.1, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.9, architecture=x86_64, container_name=collectd, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-collectd, com.redhat.license_terms=https://www.redhat.com/agreements) Oct 14 04:25:04 localhost systemd[1]: 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8.service: Deactivated successfully. Oct 14 04:25:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea. 
Oct 14 04:25:06 localhost podman[73908]: 2025-10-14 08:25:06.730981852 +0000 UTC m=+0.068984500 container health_status 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, build-date=2025-07-21T13:27:15, container_name=iscsid, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/agreements, managed_by=tripleo_ansible, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, com.redhat.component=openstack-iscsid-container, distribution-scope=public, version=17.1.9, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, tcib_managed=true, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', 
'/var/lib/iscsi:/var/lib/iscsi:z']}, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, batch=17.1_20250721.1, release=1, vendor=Red Hat, Inc.) Oct 14 04:25:06 localhost podman[73908]: 2025-10-14 08:25:06.739621714 +0000 UTC m=+0.077624352 container exec_died 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, release=1, name=rhosp17/openstack-iscsid, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-07-21T13:27:15, batch=17.1_20250721.1, vcs-type=git, com.redhat.component=openstack-iscsid-container, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, managed_by=tripleo_ansible, container_name=iscsid, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.33.12, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.tags=rhosp osp openstack osp-17.1, config_id=tripleo_step3, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, architecture=x86_64) Oct 14 04:25:06 localhost systemd[1]: 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea.service: Deactivated successfully. Oct 14 04:25:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638. Oct 14 04:25:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884. Oct 14 04:25:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f. Oct 14 04:25:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e. Oct 14 04:25:11 localhost systemd[1]: tmp-crun.yQ4KYg.mount: Deactivated successfully. 
Oct 14 04:25:11 localhost podman[73926]: 2025-10-14 08:25:11.763158826 +0000 UTC m=+0.104504235 container health_status 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.33.12, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-ipmi-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, batch=17.1_20250721.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, maintainer=OpenStack TripleO Team, tcib_managed=true, build-date=2025-07-21T15:29:47, distribution-scope=public, io.k8s.display-name=Red Hat 
OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, version=17.1.9, architecture=x86_64, container_name=ceilometer_agent_ipmi, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhosp17/openstack-ceilometer-ipmi, config_id=tripleo_step4, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1) Oct 14 04:25:11 localhost podman[73926]: 2025-10-14 08:25:11.798098315 +0000 UTC m=+0.139443764 container exec_died 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, maintainer=OpenStack TripleO Team, build-date=2025-07-21T15:29:47, com.redhat.component=openstack-ceilometer-ipmi-container, distribution-scope=public, architecture=x86_64, version=17.1.9, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.33.12, managed_by=tripleo_ansible, vcs-type=git, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, batch=17.1_20250721.1, release=1, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1) Oct 14 04:25:11 localhost podman[73927]: 2025-10-14 08:25:11.807617233 +0000 UTC m=+0.143506806 container health_status 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, com.redhat.component=openstack-nova-compute-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc., architecture=x86_64, container_name=nova_migration_target, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1, managed_by=tripleo_ansible, tcib_managed=true, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, version=17.1.9, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, build-date=2025-07-21T14:48:37, com.redhat.license_terms=https://www.redhat.com/agreements, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute) Oct 14 04:25:11 localhost systemd[1]: 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638.service: Deactivated successfully. 
Oct 14 04:25:11 localhost podman[73928]: 2025-10-14 08:25:11.841006694 +0000 UTC m=+0.174278538 container health_status d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, container_name=logrotate_crond, vcs-type=git, io.openshift.expose-services=, release=1, version=17.1.9, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, name=rhosp17/openstack-cron, tcib_managed=true, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.component=openstack-cron-container, batch=17.1_20250721.1, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, build-date=2025-07-21T13:07:52, architecture=x86_64) Oct 14 04:25:11 localhost podman[73928]: 2025-10-14 08:25:11.854220064 +0000 UTC m=+0.187491948 container exec_died d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.buildah.version=1.33.12, io.openshift.expose-services=, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, config_id=tripleo_step4, distribution-scope=public, build-date=2025-07-21T13:07:52, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, name=rhosp17/openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, managed_by=tripleo_ansible, vcs-type=git, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, tcib_managed=true, architecture=x86_64, batch=17.1_20250721.1, version=17.1.9, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c) Oct 14 04:25:11 localhost systemd[1]: d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f.service: Deactivated successfully. Oct 14 04:25:11 localhost podman[73929]: 2025-10-14 08:25:11.899788674 +0000 UTC m=+0.231380997 container health_status def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, release=1, vcs-type=git, batch=17.1_20250721.1, io.openshift.tags=rhosp osp openstack osp-17.1, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, vendor=Red Hat, Inc., build-date=2025-07-21T14:45:33, managed_by=tripleo_ansible, distribution-scope=public, com.redhat.component=openstack-ceilometer-compute-container, config_id=tripleo_step4, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true) Oct 14 04:25:11 localhost podman[73929]: 2025-10-14 08:25:11.95612997 +0000 UTC m=+0.287722243 container exec_died def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, architecture=x86_64, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, batch=17.1_20250721.1, version=17.1.9, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, io.buildah.version=1.33.12, name=rhosp17/openstack-ceilometer-compute, build-date=2025-07-21T14:45:33, com.redhat.component=openstack-ceilometer-compute-container, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_compute) Oct 14 04:25:11 localhost systemd[1]: def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e.service: Deactivated successfully. Oct 14 04:25:12 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... 
Oct 14 04:25:12 localhost podman[73927]: 2025-10-14 08:25:12.236236433 +0000 UTC m=+0.572125966 container exec_died 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, release=1, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, distribution-scope=public, managed_by=tripleo_ansible, tcib_managed=true, version=17.1.9, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=nova_migration_target, io.buildah.version=1.33.12, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, batch=17.1_20250721.1, io.openshift.expose-services=, build-date=2025-07-21T14:48:37, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git) Oct 14 04:25:12 localhost systemd[1]: 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884.service: Deactivated successfully. Oct 14 04:25:12 localhost recover_tripleo_nova_virtqemud[74021]: 62551 Oct 14 04:25:12 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Oct 14 04:25:12 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. Oct 14 04:25:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e. Oct 14 04:25:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5. 
Oct 14 04:25:15 localhost podman[74023]: 2025-10-14 08:25:15.73036982 +0000 UTC m=+0.070525807 container health_status a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, config_id=tripleo_step4, release=1, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, build-date=2025-07-21T13:28:44, io.openshift.expose-services=, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, batch=17.1_20250721.1, vcs-type=git, com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhosp17/openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, container_name=ovn_controller, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.9, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller) Oct 14 04:25:15 localhost podman[74022]: 2025-10-14 08:25:15.788775878 +0000 
UTC m=+0.129827642 container health_status 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, config_id=tripleo_step4, vendor=Red Hat, Inc., io.buildah.version=1.33.12, architecture=x86_64, vcs-type=git, container_name=ovn_metadata_agent, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1, 
batch=17.1_20250721.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-07-21T16:28:53, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.9, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, name=rhosp17/openstack-neutron-metadata-agent-ovn, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3) Oct 14 04:25:15 localhost podman[74023]: 2025-10-14 08:25:15.841775233 +0000 UTC m=+0.181931220 container exec_died a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, maintainer=OpenStack TripleO Team, build-date=2025-07-21T13:28:44, architecture=x86_64, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', 
'/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, container_name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/agreements, release=1, vcs-type=git, com.redhat.component=openstack-ovn-controller-container, version=17.1.9, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-ovn-controller, io.openshift.expose-services=, batch=17.1_20250721.1, distribution-scope=public, vendor=Red Hat, Inc.) Oct 14 04:25:15 localhost systemd[1]: a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5.service: Deactivated successfully. Oct 14 04:25:15 localhost podman[74022]: 2025-10-14 08:25:15.862232893 +0000 UTC m=+0.203284707 container exec_died 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team, release=1, version=17.1.9, managed_by=tripleo_ansible, config_id=tripleo_step4, batch=17.1_20250721.1, container_name=ovn_metadata_agent, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, vcs-type=git, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, io.openshift.expose-services=, tcib_managed=true, build-date=2025-07-21T16:28:53, distribution-scope=public, io.buildah.version=1.33.12, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container) Oct 14 04:25:15 localhost systemd[1]: 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e.service: Deactivated successfully. Oct 14 04:25:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6. 
Oct 14 04:25:18 localhost podman[74070]: 2025-10-14 08:25:18.759970478 +0000 UTC m=+0.095432871 container health_status 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1, distribution-scope=public, version=17.1.9, name=rhosp17/openstack-qdrouterd, build-date=2025-07-21T13:07:59, description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, com.redhat.component=openstack-qdrouterd-container, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.33.12, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', 
'/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, batch=17.1_20250721.1, container_name=metrics_qdr, vcs-type=git) Oct 14 04:25:18 localhost podman[74070]: 2025-10-14 08:25:18.996647395 +0000 UTC m=+0.332109778 container exec_died 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, architecture=x86_64, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, release=1, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, config_id=tripleo_step1, vendor=Red Hat, Inc., build-date=2025-07-21T13:07:59, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, com.redhat.component=openstack-qdrouterd-container, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.33.12, vcs-type=git, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 qdrouterd) Oct 14 04:25:19 localhost systemd[1]: 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6.service: Deactivated successfully. Oct 14 04:25:30 localhost systemd[1]: tmp-crun.cuiGVv.mount: Deactivated successfully. Oct 14 04:25:30 localhost podman[74201]: 2025-10-14 08:25:30.054919494 +0000 UTC m=+0.082754248 container exec b19a91314e045cbe5465f6abdeb6b420108fcda7860b498b01dcd4e65236d2c8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-crash-np0005486733, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, com.redhat.component=rhceph-container, name=rhceph, vendor=Red Hat, Inc., RELEASE=main, io.buildah.version=1.33.12, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, CEPH_POINT_RELEASE=, io.openshift.expose-services=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.tags=rhceph ceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, release=553, summary=Provides the latest 
Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, maintainer=Guillaume Abrioux , GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-09-24T08:57:55) Oct 14 04:25:30 localhost podman[74201]: 2025-10-14 08:25:30.171704311 +0000 UTC m=+0.199539055 container exec_died b19a91314e045cbe5465f6abdeb6b420108fcda7860b498b01dcd4e65236d2c8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-crash-np0005486733, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.component=rhceph-container, io.buildah.version=1.33.12, CEPH_POINT_RELEASE=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.expose-services=, version=7, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-09-24T08:57:55, architecture=x86_64, name=rhceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, maintainer=Guillaume Abrioux , vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, vcs-type=git, release=553, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, ceph=True, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.) Oct 14 04:25:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8. Oct 14 04:25:35 localhost systemd[1]: tmp-crun.xGi3WE.mount: Deactivated successfully. 
Oct 14 04:25:35 localhost podman[74343]: 2025-10-14 08:25:35.737476774 +0000 UTC m=+0.082102057 container health_status 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, vcs-type=git, build-date=2025-07-21T13:04:03, 
com.redhat.component=openstack-collectd-container, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., config_id=tripleo_step3, container_name=collectd, version=17.1.9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, io.openshift.expose-services=, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, release=2, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true, batch=17.1_20250721.1) Oct 14 04:25:35 localhost podman[74343]: 2025-10-14 08:25:35.772400292 +0000 UTC m=+0.117025555 container exec_died 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, tcib_managed=true, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, com.redhat.component=openstack-collectd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, name=rhosp17/openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, version=17.1.9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, io.buildah.version=1.33.12, managed_by=tripleo_ansible, config_id=tripleo_step3, container_name=collectd, vcs-type=git, build-date=2025-07-21T13:04:03, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, vendor=Red Hat, Inc., batch=17.1_20250721.1, release=2, io.openshift.tags=rhosp osp openstack osp-17.1) Oct 14 04:25:35 localhost systemd[1]: 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8.service: Deactivated successfully. Oct 14 04:25:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea. 
Oct 14 04:25:37 localhost podman[74365]: 2025-10-14 08:25:37.710889828 +0000 UTC m=+0.056902755 container health_status 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, com.redhat.component=openstack-iscsid-container, container_name=iscsid, build-date=2025-07-21T13:27:15, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, batch=17.1_20250721.1, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, version=17.1.9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, vendor=Red Hat, Inc., managed_by=tripleo_ansible, release=1, config_id=tripleo_step3, io.openshift.expose-services=, 
summary=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-iscsid, maintainer=OpenStack TripleO Team) Oct 14 04:25:37 localhost podman[74365]: 2025-10-14 08:25:37.722052576 +0000 UTC m=+0.068065473 container exec_died 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, distribution-scope=public, managed_by=tripleo_ansible, container_name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, tcib_managed=true, maintainer=OpenStack TripleO Team, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, vcs-type=git, build-date=2025-07-21T13:27:15, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.9, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., batch=17.1_20250721.1, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, config_id=tripleo_step3, name=rhosp17/openstack-iscsid) Oct 14 04:25:37 localhost systemd[1]: 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea.service: Deactivated successfully. Oct 14 04:25:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638. Oct 14 04:25:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884. Oct 14 04:25:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f. Oct 14 04:25:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e. Oct 14 04:25:42 localhost systemd[1]: tmp-crun.fQQ6Ut.mount: Deactivated successfully. 
Oct 14 04:25:42 localhost podman[74387]: 2025-10-14 08:25:42.747035173 +0000 UTC m=+0.077874640 container health_status d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, vendor=Red Hat, Inc., config_id=tripleo_step4, batch=17.1_20250721.1, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, build-date=2025-07-21T13:07:52, distribution-scope=public, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, version=17.1.9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, com.redhat.license_terms=https://www.redhat.com/agreements, 
name=rhosp17/openstack-cron, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, tcib_managed=true, container_name=logrotate_crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64) Oct 14 04:25:42 localhost podman[74387]: 2025-10-14 08:25:42.759082637 +0000 UTC m=+0.089922084 container exec_died d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, batch=17.1_20250721.1, version=17.1.9, com.redhat.component=openstack-cron-container, tcib_managed=true, maintainer=OpenStack TripleO Team, container_name=logrotate_crond, vcs-type=git, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, io.buildah.version=1.33.12, release=1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, managed_by=tripleo_ansible, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-07-21T13:07:52, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 cron, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, distribution-scope=public, io.openshift.expose-services=) Oct 14 04:25:42 localhost podman[74388]: 2025-10-14 08:25:42.805784681 +0000 UTC m=+0.136833054 container health_status def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, com.redhat.component=openstack-ceilometer-compute-container, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, vcs-type=git, batch=17.1_20250721.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, name=rhosp17/openstack-ceilometer-compute, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, version=17.1.9, architecture=x86_64, distribution-scope=public, vendor=Red Hat, Inc., build-date=2025-07-21T14:45:33, io.openshift.expose-services=, release=1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute) Oct 14 04:25:42 localhost systemd[1]: d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f.service: Deactivated successfully. 
Oct 14 04:25:42 localhost podman[74388]: 2025-10-14 08:25:42.840276226 +0000 UTC m=+0.171324639 container exec_died def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, name=rhosp17/openstack-ceilometer-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, maintainer=OpenStack TripleO Team, architecture=x86_64, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, 
vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, build-date=2025-07-21T14:45:33, com.redhat.component=openstack-ceilometer-compute-container, io.buildah.version=1.33.12, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc., batch=17.1_20250721.1, container_name=ceilometer_agent_compute, distribution-scope=public, tcib_managed=true, release=1, com.redhat.license_terms=https://www.redhat.com/agreements, version=17.1.9) Oct 14 04:25:42 localhost systemd[1]: def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e.service: Deactivated successfully. Oct 14 04:25:42 localhost podman[74386]: 2025-10-14 08:25:42.907305696 +0000 UTC m=+0.240147644 container health_status 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20250721.1, container_name=nova_migration_target, io.buildah.version=1.33.12, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, tcib_managed=true, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, release=1, build-date=2025-07-21T14:48:37, managed_by=tripleo_ansible, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, name=rhosp17/openstack-nova-compute, vendor=Red Hat, Inc., config_id=tripleo_step4, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 nova-compute) Oct 14 04:25:42 localhost podman[74385]: 2025-10-14 08:25:42.949795463 +0000 UTC m=+0.287728065 container health_status 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc., build-date=2025-07-21T15:29:47, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=openstack-ceilometer-ipmi-container, architecture=x86_64, managed_by=tripleo_ansible, batch=17.1_20250721.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': 
'/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, name=rhosp17/openstack-ceilometer-ipmi, config_id=tripleo_step4, version=17.1.9, distribution-scope=public, maintainer=OpenStack TripleO Team, io.buildah.version=1.33.12, container_name=ceilometer_agent_ipmi) Oct 14 04:25:42 localhost podman[74385]: 2025-10-14 08:25:42.981459401 +0000 UTC m=+0.319392103 container exec_died 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, batch=17.1_20250721.1, build-date=2025-07-21T15:29:47, container_name=ceilometer_agent_ipmi, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.33.12, name=rhosp17/openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, 
version=17.1.9, architecture=x86_64, com.redhat.component=openstack-ceilometer-ipmi-container, maintainer=OpenStack TripleO Team, vcs-type=git, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, managed_by=tripleo_ansible, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, tcib_managed=true) Oct 14 04:25:42 localhost systemd[1]: 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638.service: Deactivated successfully. 
Oct 14 04:25:43 localhost podman[74386]: 2025-10-14 08:25:43.259407798 +0000 UTC m=+0.592249756 container exec_died 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.33.12, architecture=x86_64, managed_by=tripleo_ansible, config_id=tripleo_step4, tcib_managed=true, release=1, vcs-type=git, version=17.1.9, container_name=nova_migration_target, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, distribution-scope=public, maintainer=OpenStack TripleO Team, batch=17.1_20250721.1, com.redhat.component=openstack-nova-compute-container, build-date=2025-07-21T14:48:37, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, vendor=Red Hat, Inc., io.openshift.expose-services=) Oct 14 04:25:43 localhost systemd[1]: 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884.service: Deactivated successfully. Oct 14 04:25:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e. Oct 14 04:25:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5. Oct 14 04:25:46 localhost systemd[1]: tmp-crun.8z9HOt.mount: Deactivated successfully. Oct 14 04:25:46 localhost podman[74482]: 2025-10-14 08:25:46.757193966 +0000 UTC m=+0.095416600 container health_status a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, vcs-type=git, version=17.1.9, batch=17.1_20250721.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, vendor=Red Hat, Inc., config_id=tripleo_step4, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, architecture=x86_64, io.buildah.version=1.33.12, managed_by=tripleo_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': 
True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, build-date=2025-07-21T13:28:44, container_name=ovn_controller, com.redhat.component=openstack-ovn-controller-container, release=1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller) Oct 14 04:25:46 localhost podman[74481]: 2025-10-14 08:25:46.804946182 +0000 UTC m=+0.144653001 container health_status 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, vcs-type=git, build-date=2025-07-21T16:28:53, name=rhosp17/openstack-neutron-metadata-agent-ovn, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, architecture=x86_64, distribution-scope=public, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20250721.1, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., com.redhat.component=openstack-neutron-metadata-agent-ovn-container, container_name=ovn_metadata_agent, com.redhat.license_terms=https://www.redhat.com/agreements, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, version=17.1.9, tcib_managed=true, io.openshift.expose-services=) Oct 14 04:25:46 localhost podman[74482]: 2025-10-14 08:25:46.828966929 +0000 UTC m=+0.167189563 container exec_died a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, name=rhosp17/openstack-ovn-controller, tcib_managed=true, io.buildah.version=1.33.12, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, version=17.1.9, config_id=tripleo_step4, com.redhat.component=openstack-ovn-controller-container, managed_by=tripleo_ansible, vendor=Red Hat, Inc., build-date=2025-07-21T13:28:44, distribution-scope=public, container_name=ovn_controller, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, architecture=x86_64) Oct 14 04:25:46 localhost podman[74481]: 2025-10-14 08:25:46.837854758 +0000 UTC m=+0.177561517 container exec_died 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 
neutron-metadata-agent-ovn, io.buildah.version=1.33.12, release=1, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., com.redhat.component=openstack-neutron-metadata-agent-ovn-container, build-date=2025-07-21T16:28:53, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/agreements, tcib_managed=true, batch=17.1_20250721.1, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, version=17.1.9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, container_name=ovn_metadata_agent, distribution-scope=public, name=rhosp17/openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}) Oct 14 04:25:46 localhost systemd[1]: a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5.service: Deactivated successfully. Oct 14 04:25:46 localhost systemd[1]: 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e.service: Deactivated successfully. Oct 14 04:25:47 localhost systemd[1]: tmp-crun.nv2oVY.mount: Deactivated successfully. Oct 14 04:25:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6. Oct 14 04:25:49 localhost podman[74529]: 2025-10-14 08:25:49.75199645 +0000 UTC m=+0.096118142 container health_status 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vcs-type=git, version=17.1.9, architecture=x86_64, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, tcib_managed=true, release=1, container_name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-07-21T13:07:59, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, description=Red Hat OpenStack Platform 17.1 qdrouterd, 
vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, io.buildah.version=1.33.12, name=rhosp17/openstack-qdrouterd, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}) Oct 14 04:25:50 localhost podman[74529]: 2025-10-14 08:25:50.047095526 +0000 UTC m=+0.391217138 container exec_died 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, build-date=2025-07-21T13:07:59, version=17.1.9, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 qdrouterd, release=1, maintainer=OpenStack TripleO Team, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.33.12, batch=17.1_20250721.1, tcib_managed=true, distribution-scope=public, config_id=tripleo_step1, 
com.redhat.component=openstack-qdrouterd-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, container_name=metrics_qdr, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/agreements, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd) Oct 14 04:25:50 localhost systemd[1]: 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6.service: Deactivated successfully. Oct 14 04:26:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8. 
Oct 14 04:26:06 localhost podman[74559]: 2025-10-14 08:26:06.717636706 +0000 UTC m=+0.063505883 container health_status 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, config_id=tripleo_step3, vcs-type=git, vendor=Red Hat, Inc., com.redhat.component=openstack-collectd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true, release=2, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-07-21T13:04:03, version=17.1.9, container_name=collectd, name=rhosp17/openstack-collectd, managed_by=tripleo_ansible, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, maintainer=OpenStack TripleO Team, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, architecture=x86_64, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.expose-services=) Oct 14 04:26:06 localhost podman[74559]: 2025-10-14 08:26:06.731238338 +0000 UTC m=+0.077107495 container exec_died 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, com.redhat.component=openstack-collectd-container, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, version=17.1.9, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20250721.1, summary=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, architecture=x86_64, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T13:04:03, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, name=rhosp17/openstack-collectd, container_name=collectd, distribution-scope=public, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., release=2) Oct 14 04:26:06 localhost systemd[1]: 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8.service: Deactivated successfully. Oct 14 04:26:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea. 
Oct 14 04:26:08 localhost podman[74579]: 2025-10-14 08:26:08.727142603 +0000 UTC m=+0.072709513 container health_status 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, managed_by=tripleo_ansible, name=rhosp17/openstack-iscsid, version=17.1.9, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, com.redhat.component=openstack-iscsid-container, architecture=x86_64, vendor=Red Hat, Inc., batch=17.1_20250721.1, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step3, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp 
openstack osp-17.1, maintainer=OpenStack TripleO Team, build-date=2025-07-21T13:27:15, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, container_name=iscsid, io.openshift.expose-services=) Oct 14 04:26:08 localhost podman[74579]: 2025-10-14 08:26:08.733854606 +0000 UTC m=+0.079421526 container exec_died 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, maintainer=OpenStack TripleO Team, architecture=x86_64, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, distribution-scope=public, build-date=2025-07-21T13:27:15, version=17.1.9, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step3, managed_by=tripleo_ansible, batch=17.1_20250721.1, release=1, description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., container_name=iscsid, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.expose-services=, name=rhosp17/openstack-iscsid, io.buildah.version=1.33.12) Oct 14 04:26:08 localhost systemd[1]: 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea.service: Deactivated successfully. Oct 14 04:26:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638. Oct 14 04:26:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884. Oct 14 04:26:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f. Oct 14 04:26:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e. Oct 14 04:26:13 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Oct 14 04:26:13 localhost recover_tripleo_nova_virtqemud[74622]: 62551 Oct 14 04:26:13 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Oct 14 04:26:13 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. 
Oct 14 04:26:13 localhost podman[74600]: 2025-10-14 08:26:13.733592098 +0000 UTC m=+0.080702755 container health_status 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, com.redhat.component=openstack-ceilometer-ipmi-container, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, io.openshift.expose-services=, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, managed_by=tripleo_ansible, config_id=tripleo_step4, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, 
name=rhosp17/openstack-ceilometer-ipmi, build-date=2025-07-21T15:29:47, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, version=17.1.9, container_name=ceilometer_agent_ipmi, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64) Oct 14 04:26:13 localhost podman[74600]: 2025-10-14 08:26:13.75907932 +0000 UTC m=+0.106190017 container exec_died 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.9, container_name=ceilometer_agent_ipmi, name=rhosp17/openstack-ceilometer-ipmi, batch=17.1_20250721.1, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, vendor=Red Hat, Inc., managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, tcib_managed=true, com.redhat.component=openstack-ceilometer-ipmi-container, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-07-21T15:29:47, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.33.12, architecture=x86_64, release=1) Oct 14 04:26:13 localhost podman[74601]: 2025-10-14 08:26:13.770077103 +0000 UTC m=+0.111587090 container health_status 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, tcib_managed=true, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, version=17.1.9, vcs-type=git, container_name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20250721.1, config_id=tripleo_step4, build-date=2025-07-21T14:48:37, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-nova-compute, release=1, com.redhat.license_terms=https://www.redhat.com/agreements) Oct 14 04:26:13 localhost systemd[1]: 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638.service: Deactivated successfully. 
Oct 14 04:26:13 localhost podman[74602]: 2025-10-14 08:26:13.848146147 +0000 UTC m=+0.187204730 container health_status d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, build-date=2025-07-21T13:07:52, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, name=rhosp17/openstack-cron, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, com.redhat.component=openstack-cron-container, batch=17.1_20250721.1, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, vcs-type=git, 
vendor=Red Hat, Inc., version=17.1.9, container_name=logrotate_crond, release=1, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 cron, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1) Oct 14 04:26:13 localhost podman[74602]: 2025-10-14 08:26:13.85520282 +0000 UTC m=+0.194261403 container exec_died d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, com.redhat.component=openstack-cron-container, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-07-21T13:07:52, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, batch=17.1_20250721.1, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, description=Red Hat OpenStack Platform 17.1 cron, release=1, version=17.1.9, summary=Red Hat OpenStack Platform 17.1 cron, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.expose-services=, name=rhosp17/openstack-cron, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, io.buildah.version=1.33.12, vendor=Red Hat, Inc., tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, vcs-type=git, architecture=x86_64) Oct 14 04:26:13 localhost systemd[1]: d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f.service: Deactivated successfully. Oct 14 04:26:13 localhost podman[74608]: 2025-10-14 08:26:13.952551559 +0000 UTC m=+0.287136067 container health_status def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.openshift.expose-services=, vendor=Red Hat, Inc., build-date=2025-07-21T14:45:33, com.redhat.license_terms=https://www.redhat.com/agreements, tcib_managed=true, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1, version=17.1.9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, container_name=ceilometer_agent_compute, vcs-type=git, name=rhosp17/openstack-ceilometer-compute, release=1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, batch=17.1_20250721.1, io.buildah.version=1.33.12, architecture=x86_64, managed_by=tripleo_ansible, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-compute) Oct 14 04:26:13 localhost podman[74608]: 2025-10-14 08:26:13.9879519 +0000 UTC m=+0.322536398 container exec_died def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, vendor=Red Hat, Inc., batch=17.1_20250721.1, architecture=x86_64, managed_by=tripleo_ansible, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, distribution-scope=public, build-date=2025-07-21T14:45:33, container_name=ceilometer_agent_compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, version=17.1.9, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-compute-container, tcib_managed=true, vcs-type=git, maintainer=OpenStack TripleO Team) Oct 14 04:26:13 localhost systemd[1]: def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e.service: Deactivated successfully. 
Oct 14 04:26:14 localhost podman[74601]: 2025-10-14 08:26:14.163128616 +0000 UTC m=+0.504638643 container exec_died 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, io.buildah.version=1.33.12, name=rhosp17/openstack-nova-compute, version=17.1.9, container_name=nova_migration_target, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, build-date=2025-07-21T14:48:37, io.openshift.expose-services=, vcs-type=git, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 
nova-compute, batch=17.1_20250721.1, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, distribution-scope=public) Oct 14 04:26:14 localhost systemd[1]: 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884.service: Deactivated successfully. Oct 14 04:26:14 localhost systemd[1]: tmp-crun.EoOztF.mount: Deactivated successfully. Oct 14 04:26:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e. Oct 14 04:26:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5. Oct 14 04:26:17 localhost podman[74697]: 2025-10-14 08:26:17.729134979 +0000 UTC m=+0.070564057 container health_status a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.expose-services=, config_id=tripleo_step4, com.redhat.component=openstack-ovn-controller-container, tcib_managed=true, batch=17.1_20250721.1, build-date=2025-07-21T13:28:44, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/agreements, version=17.1.9, maintainer=OpenStack TripleO Team, distribution-scope=public, name=rhosp17/openstack-ovn-controller, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, architecture=x86_64, container_name=ovn_controller, release=1, io.openshift.tags=rhosp osp openstack osp-17.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.33.12, vendor=Red Hat, Inc.) Oct 14 04:26:17 localhost podman[74697]: 2025-10-14 08:26:17.754157387 +0000 UTC m=+0.095586495 container exec_died a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, com.redhat.component=openstack-ovn-controller-container, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true, vendor=Red Hat, Inc., release=1, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, distribution-scope=public, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-07-21T13:28:44, vcs-type=git, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.9, summary=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, batch=17.1_20250721.1, 
config_id=tripleo_step4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, name=rhosp17/openstack-ovn-controller) Oct 14 04:26:17 localhost systemd[1]: a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5.service: Deactivated successfully. Oct 14 04:26:17 localhost podman[74696]: 2025-10-14 08:26:17.884272786 +0000 UTC m=+0.229008664 container health_status 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, batch=17.1_20250721.1, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T16:28:53, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, architecture=x86_64, vendor=Red Hat, Inc., io.openshift.expose-services=, release=1, container_name=ovn_metadata_agent, tcib_managed=true, version=17.1.9, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Oct 14 04:26:17 localhost podman[74696]: 2025-10-14 08:26:17.923871105 +0000 UTC m=+0.268607003 container exec_died 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, config_id=tripleo_step4, tcib_managed=true, architecture=x86_64, batch=17.1_20250721.1, build-date=2025-07-21T16:28:53, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc., vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, version=17.1.9, distribution-scope=public, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, com.redhat.license_terms=https://www.redhat.com/agreements, release=1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, managed_by=tripleo_ansible, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}) Oct 14 04:26:17 localhost systemd[1]: 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e.service: Deactivated successfully. Oct 14 04:26:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6. Oct 14 04:26:20 localhost podman[74742]: 2025-10-14 08:26:20.707398902 +0000 UTC m=+0.053068408 container health_status 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, build-date=2025-07-21T13:07:59, vcs-type=git, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, tcib_managed=true, config_id=tripleo_step1, com.redhat.component=openstack-qdrouterd-container, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.33.12, release=1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, batch=17.1_20250721.1, version=17.1.9, managed_by=tripleo_ansible, name=rhosp17/openstack-qdrouterd, container_name=metrics_qdr, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1) Oct 14 04:26:20 localhost podman[74742]: 2025-10-14 08:26:20.89820658 +0000 UTC m=+0.243876086 container exec_died 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, release=1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, maintainer=OpenStack TripleO Team, tcib_managed=true, com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, 
distribution-scope=public, build-date=2025-07-21T13:07:59, container_name=metrics_qdr, batch=17.1_20250721.1, version=17.1.9, io.openshift.tags=rhosp osp openstack osp-17.1, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, managed_by=tripleo_ansible, name=rhosp17/openstack-qdrouterd, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}) Oct 14 04:26:20 localhost systemd[1]: 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6.service: Deactivated successfully. Oct 14 04:26:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8. Oct 14 04:26:37 localhost systemd[1]: tmp-crun.noslOL.mount: Deactivated successfully. 
Oct 14 04:26:37 localhost podman[74849]: 2025-10-14 08:26:37.738284154 +0000 UTC m=+0.082548020 container health_status 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, com.redhat.component=openstack-collectd-container, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, managed_by=tripleo_ansible, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat 
OpenStack Platform 17.1 collectd, container_name=collectd, io.buildah.version=1.33.12, build-date=2025-07-21T13:04:03, distribution-scope=public, io.openshift.expose-services=, vendor=Red Hat, Inc., release=2, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhosp17/openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, version=17.1.9, description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, batch=17.1_20250721.1) Oct 14 04:26:37 localhost podman[74849]: 2025-10-14 08:26:37.754217837 +0000 UTC m=+0.098481693 container exec_died 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, release=2, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=collectd, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, architecture=x86_64, build-date=2025-07-21T13:04:03, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, maintainer=OpenStack TripleO Team, version=17.1.9, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, distribution-scope=public, io.openshift.expose-services=, batch=17.1_20250721.1, io.buildah.version=1.33.12, tcib_managed=true, config_id=tripleo_step3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 collectd) Oct 14 04:26:37 localhost systemd[1]: 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8.service: Deactivated successfully. Oct 14 04:26:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea. 
Oct 14 04:26:39 localhost podman[74869]: 2025-10-14 08:26:39.729331381 +0000 UTC m=+0.074905120 container health_status 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-iscsid, distribution-scope=public, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, io.openshift.tags=rhosp osp openstack osp-17.1, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, release=1, batch=17.1_20250721.1, com.redhat.component=openstack-iscsid-container, build-date=2025-07-21T13:27:15, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, architecture=x86_64, 
com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step3, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 iscsid, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, io.openshift.expose-services=, tcib_managed=true, version=17.1.9, container_name=iscsid) Oct 14 04:26:39 localhost podman[74869]: 2025-10-14 08:26:39.741022755 +0000 UTC m=+0.086596504 container exec_died 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-type=git, release=1, batch=17.1_20250721.1, com.redhat.component=openstack-iscsid-container, container_name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhosp17/openstack-iscsid, io.buildah.version=1.33.12, vendor=Red Hat, Inc., build-date=2025-07-21T13:27:15, config_id=tripleo_step3, architecture=x86_64, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, version=17.1.9, distribution-scope=public) Oct 14 04:26:39 localhost systemd[1]: 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea.service: Deactivated successfully. Oct 14 04:26:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638. Oct 14 04:26:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884. Oct 14 04:26:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f. Oct 14 04:26:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e. Oct 14 04:26:44 localhost systemd[1]: tmp-crun.8YRcL1.mount: Deactivated successfully. 
Oct 14 04:26:44 localhost podman[74888]: 2025-10-14 08:26:44.74578693 +0000 UTC m=+0.080663084 container health_status 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, distribution-scope=public, io.openshift.expose-services=, release=1, tcib_managed=true, vendor=Red Hat, Inc., architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, container_name=ceilometer_agent_ipmi, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, build-date=2025-07-21T15:29:47, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, 
managed_by=tripleo_ansible, version=17.1.9, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, batch=17.1_20250721.1, name=rhosp17/openstack-ceilometer-ipmi) Oct 14 04:26:44 localhost podman[74889]: 2025-10-14 08:26:44.802532338 +0000 UTC m=+0.133174824 container health_status 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, vcs-type=git, config_id=tripleo_step4, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, name=rhosp17/openstack-nova-compute, architecture=x86_64, batch=17.1_20250721.1, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true, version=17.1.9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, release=1, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, build-date=2025-07-21T14:48:37, container_name=nova_migration_target, summary=Red Hat OpenStack Platform 17.1 nova-compute) Oct 14 04:26:44 localhost podman[74896]: 2025-10-14 08:26:44.774126568 +0000 UTC m=+0.098179165 container health_status def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/agreements, tcib_managed=true, container_name=ceilometer_agent_compute, distribution-scope=public, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, release=1, maintainer=OpenStack TripleO Team, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, name=rhosp17/openstack-ceilometer-compute, batch=17.1_20250721.1, architecture=x86_64, version=17.1.9, io.buildah.version=1.33.12, config_id=tripleo_step4, vcs-type=git, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T14:45:33, vendor=Red Hat, Inc.) 
Oct 14 04:26:44 localhost podman[74890]: 2025-10-14 08:26:44.877830028 +0000 UTC m=+0.202595216 container health_status d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, maintainer=OpenStack TripleO Team, tcib_managed=true, vcs-type=git, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, distribution-scope=public, name=rhosp17/openstack-cron, config_id=tripleo_step4, container_name=logrotate_crond, io.openshift.expose-services=, build-date=2025-07-21T13:07:52, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1, 
vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, version=17.1.9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, com.redhat.component=openstack-cron-container, io.buildah.version=1.33.12, release=1, vendor=Red Hat, Inc.) Oct 14 04:26:44 localhost podman[74890]: 2025-10-14 08:26:44.886011816 +0000 UTC m=+0.210777034 container exec_died d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., version=17.1.9, maintainer=OpenStack TripleO Team, tcib_managed=true, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, vcs-type=git, description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, release=1, container_name=logrotate_crond, batch=17.1_20250721.1, build-date=2025-07-21T13:07:52, io.openshift.tags=rhosp osp openstack osp-17.1, io.buildah.version=1.33.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, distribution-scope=public, managed_by=tripleo_ansible, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, config_id=tripleo_step4) Oct 14 04:26:44 localhost systemd[1]: d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f.service: Deactivated successfully. Oct 14 04:26:44 localhost podman[74896]: 2025-10-14 08:26:44.904527857 +0000 UTC m=+0.228580454 container exec_died def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, name=rhosp17/openstack-ceilometer-compute, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, com.redhat.component=openstack-ceilometer-compute-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, managed_by=tripleo_ansible, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, version=17.1.9, build-date=2025-07-21T14:45:33, container_name=ceilometer_agent_compute, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, batch=17.1_20250721.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 
'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc.) Oct 14 04:26:44 localhost systemd[1]: def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e.service: Deactivated successfully. 
Oct 14 04:26:44 localhost podman[74888]: 2025-10-14 08:26:44.955299155 +0000 UTC m=+0.290175229 container exec_died 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.9, container_name=ceilometer_agent_ipmi, vcs-type=git, com.redhat.component=openstack-ceilometer-ipmi-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-07-21T15:29:47, managed_by=tripleo_ansible, tcib_managed=true, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, io.buildah.version=1.33.12, batch=17.1_20250721.1, name=rhosp17/openstack-ceilometer-ipmi, architecture=x86_64, io.openshift.expose-services=, release=1, distribution-scope=public, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f) Oct 14 04:26:44 localhost systemd[1]: 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638.service: Deactivated successfully. Oct 14 04:26:45 localhost podman[74889]: 2025-10-14 08:26:45.149111434 +0000 UTC m=+0.479753950 container exec_died 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vendor=Red Hat, Inc., architecture=x86_64, batch=17.1_20250721.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO 
Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true, build-date=2025-07-21T14:48:37, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, version=17.1.9, config_id=tripleo_step4, io.buildah.version=1.33.12, distribution-scope=public, container_name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible) Oct 14 04:26:45 localhost systemd[1]: 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884.service: Deactivated successfully. Oct 14 04:26:45 localhost systemd[1]: tmp-crun.sS7DJc.mount: Deactivated successfully. Oct 14 04:26:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e. Oct 14 04:26:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5. Oct 14 04:26:48 localhost systemd[1]: tmp-crun.eOgJRK.mount: Deactivated successfully. 
Oct 14 04:26:48 localhost podman[74982]: 2025-10-14 08:26:48.761073389 +0000 UTC m=+0.096738250 container health_status a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, vendor=Red Hat, Inc., com.redhat.component=openstack-ovn-controller-container, io.openshift.tags=rhosp osp openstack osp-17.1, container_name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-07-21T13:28:44, vcs-type=git, version=17.1.9, config_id=tripleo_step4, io.buildah.version=1.33.12, io.openshift.expose-services=, distribution-scope=public, architecture=x86_64, batch=17.1_20250721.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, name=rhosp17/openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, release=1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, maintainer=OpenStack TripleO Team, tcib_managed=true) Oct 14 04:26:48 localhost podman[74981]: 2025-10-14 08:26:48.729757981 +0000 
UTC m=+0.075124516 container health_status 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, version=17.1.9, managed_by=tripleo_ansible, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 
neutron-metadata-agent-ovn, io.openshift.expose-services=, release=1, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team, build-date=2025-07-21T16:28:53, distribution-scope=public, batch=17.1_20250721.1, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, container_name=ovn_metadata_agent, tcib_managed=true, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.33.12, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-type=git, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc.) Oct 14 04:26:48 localhost podman[74982]: 2025-10-14 08:26:48.792086787 +0000 UTC m=+0.127751618 container exec_died a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, release=1, vcs-type=git, version=17.1.9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, distribution-scope=public, name=rhosp17/openstack-ovn-controller, batch=17.1_20250721.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, maintainer=OpenStack TripleO Team, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T13:28:44, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 
'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, config_id=tripleo_step4, vendor=Red Hat, Inc., io.openshift.expose-services=, container_name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, tcib_managed=true) Oct 14 04:26:48 localhost systemd[1]: a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5.service: Deactivated successfully. Oct 14 04:26:48 localhost podman[74981]: 2025-10-14 08:26:48.813098134 +0000 UTC m=+0.158464669 container exec_died 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, distribution-scope=public, build-date=2025-07-21T16:28:53, release=1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, managed_by=tripleo_ansible, io.openshift.expose-services=, container_name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1, config_id=tripleo_step4, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, batch=17.1_20250721.1, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, io.buildah.version=1.33.12, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vendor=Red Hat, Inc., vcs-type=git) Oct 14 04:26:48 localhost systemd[1]: 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e.service: Deactivated successfully. 
Oct 14 04:26:49 localhost python3[75077]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/config_step.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Oct 14 04:26:49 localhost systemd[1]: tmp-crun.mcZvL5.mount: Deactivated successfully. Oct 14 04:26:50 localhost python3[75122]: ansible-ansible.legacy.copy Invoked with dest=/etc/puppet/hieradata/config_step.json force=True mode=0600 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1760430409.2147539-114292-275955096537729/source _original_basename=tmpk5utn8ff follow=False checksum=039e0b234f00fbd1242930f0d5dc67e8b4c067fe backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 04:26:50 localhost python3[75152]: ansible-stat Invoked with path=/var/lib/tripleo-config/container-startup-config/step_5 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Oct 14 04:26:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6. 
Oct 14 04:26:51 localhost podman[75203]: 2025-10-14 08:26:51.55500417 +0000 UTC m=+0.071481496 container health_status 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.9, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, io.openshift.expose-services=, vcs-type=git, container_name=metrics_qdr, io.buildah.version=1.33.12, vendor=Red Hat, Inc., distribution-scope=public, name=rhosp17/openstack-qdrouterd, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, batch=17.1_20250721.1, release=1, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, build-date=2025-07-21T13:07:59, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true) Oct 14 04:26:51 localhost podman[75203]: 2025-10-14 08:26:51.780378585 +0000 UTC m=+0.296856011 container exec_died 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, container_name=metrics_qdr, io.openshift.expose-services=, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, vcs-type=git, description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, version=17.1.9, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, tcib_managed=true, batch=17.1_20250721.1, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., build-date=2025-07-21T13:07:59, distribution-scope=public, io.buildah.version=1.33.12, name=rhosp17/openstack-qdrouterd, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}) Oct 14 04:26:51 localhost systemd[1]: 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6.service: Deactivated successfully. Oct 14 04:26:52 localhost ansible-async_wrapper.py[75353]: Invoked with 269928068175 3600 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1760430412.145414-114422-146044210958842/AnsiballZ_command.py _ Oct 14 04:26:52 localhost ansible-async_wrapper.py[75356]: Starting module and watcher Oct 14 04:26:52 localhost ansible-async_wrapper.py[75356]: Start watching 75357 (3600) Oct 14 04:26:52 localhost ansible-async_wrapper.py[75357]: Start module (75357) Oct 14 04:26:52 localhost ansible-async_wrapper.py[75353]: Return async_wrapper task started. Oct 14 04:26:52 localhost python3[75377]: ansible-ansible.legacy.async_status Invoked with jid=269928068175.75353 mode=status _async_dir=/tmp/.ansible_async Oct 14 04:26:56 localhost puppet-user[75361]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. 
It should be converted to version 5 Oct 14 04:26:56 localhost puppet-user[75361]: (file: /etc/puppet/hiera.yaml) Oct 14 04:26:56 localhost puppet-user[75361]: Warning: Undefined variable '::deploy_config_name'; Oct 14 04:26:56 localhost puppet-user[75361]: (file & line not available) Oct 14 04:26:56 localhost puppet-user[75361]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html Oct 14 04:26:56 localhost puppet-user[75361]: (file & line not available) Oct 14 04:26:56 localhost puppet-user[75361]: Warning: Unknown variable: '::deployment_type'. (file: /etc/puppet/modules/tripleo/manifests/profile/base/database/mysql/client.pp, line: 89, column: 8) Oct 14 04:26:56 localhost puppet-user[75361]: Warning: This method is deprecated, please use match expressions with Stdlib::Compat::String instead. They are described at https://docs.puppet.com/puppet/latest/reference/lang_data_type.html#match-expressions. at ["/etc/puppet/modules/snmp/manifests/params.pp", 310]:["/var/lib/tripleo-config/puppet_step_config.pp", 4] Oct 14 04:26:56 localhost puppet-user[75361]: (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation') Oct 14 04:26:56 localhost puppet-user[75361]: Warning: This method is deprecated, please use the stdlib validate_legacy function, Oct 14 04:26:56 localhost puppet-user[75361]: with Stdlib::Compat::Bool. There is further documentation for validate_legacy function in the README. at ["/etc/puppet/modules/snmp/manifests/init.pp", 358]:["/var/lib/tripleo-config/puppet_step_config.pp", 4] Oct 14 04:26:56 localhost puppet-user[75361]: (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation') Oct 14 04:26:56 localhost puppet-user[75361]: Warning: This method is deprecated, please use the stdlib validate_legacy function, Oct 14 04:26:56 localhost puppet-user[75361]: with Stdlib::Compat::Array. 
There is further documentation for validate_legacy function in the README. at ["/etc/puppet/modules/snmp/manifests/init.pp", 367]:["/var/lib/tripleo-config/puppet_step_config.pp", 4] Oct 14 04:26:56 localhost puppet-user[75361]: (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation') Oct 14 04:26:56 localhost puppet-user[75361]: Warning: This method is deprecated, please use the stdlib validate_legacy function, Oct 14 04:26:56 localhost puppet-user[75361]: with Stdlib::Compat::String. There is further documentation for validate_legacy function in the README. at ["/etc/puppet/modules/snmp/manifests/init.pp", 382]:["/var/lib/tripleo-config/puppet_step_config.pp", 4] Oct 14 04:26:56 localhost puppet-user[75361]: (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation') Oct 14 04:26:56 localhost puppet-user[75361]: Warning: This method is deprecated, please use the stdlib validate_legacy function, Oct 14 04:26:56 localhost puppet-user[75361]: with Stdlib::Compat::Numeric. There is further documentation for validate_legacy function in the README. at ["/etc/puppet/modules/snmp/manifests/init.pp", 388]:["/var/lib/tripleo-config/puppet_step_config.pp", 4] Oct 14 04:26:56 localhost puppet-user[75361]: (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation') Oct 14 04:26:56 localhost puppet-user[75361]: Warning: This method is deprecated, please use the stdlib validate_legacy function, Oct 14 04:26:56 localhost puppet-user[75361]: with Pattern[]. There is further documentation for validate_legacy function in the README. at ["/etc/puppet/modules/snmp/manifests/init.pp", 393]:["/var/lib/tripleo-config/puppet_step_config.pp", 4] Oct 14 04:26:56 localhost puppet-user[75361]: (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation') Oct 14 04:26:56 localhost puppet-user[75361]: Warning: Unknown variable: '::deployment_type'. 
(file: /etc/puppet/modules/tripleo/manifests/packages.pp, line: 39, column: 69) Oct 14 04:26:56 localhost puppet-user[75361]: Notice: Compiled catalog for np0005486733.localdomain in environment production in 0.25 seconds Oct 14 04:26:57 localhost puppet-user[75361]: Notice: Applied catalog in 0.29 seconds Oct 14 04:26:57 localhost puppet-user[75361]: Application: Oct 14 04:26:57 localhost puppet-user[75361]: Initial environment: production Oct 14 04:26:57 localhost puppet-user[75361]: Converged environment: production Oct 14 04:26:57 localhost puppet-user[75361]: Run mode: user Oct 14 04:26:57 localhost puppet-user[75361]: Changes: Oct 14 04:26:57 localhost puppet-user[75361]: Events: Oct 14 04:26:57 localhost puppet-user[75361]: Resources: Oct 14 04:26:57 localhost puppet-user[75361]: Total: 19 Oct 14 04:26:57 localhost puppet-user[75361]: Time: Oct 14 04:26:57 localhost puppet-user[75361]: Filebucket: 0.00 Oct 14 04:26:57 localhost puppet-user[75361]: Schedule: 0.00 Oct 14 04:26:57 localhost puppet-user[75361]: Package: 0.00 Oct 14 04:26:57 localhost puppet-user[75361]: Exec: 0.01 Oct 14 04:26:57 localhost puppet-user[75361]: Augeas: 0.01 Oct 14 04:26:57 localhost puppet-user[75361]: File: 0.04 Oct 14 04:26:57 localhost puppet-user[75361]: Service: 0.07 Oct 14 04:26:57 localhost puppet-user[75361]: Transaction evaluation: 0.28 Oct 14 04:26:57 localhost puppet-user[75361]: Catalog application: 0.29 Oct 14 04:26:57 localhost puppet-user[75361]: Config retrieval: 0.31 Oct 14 04:26:57 localhost puppet-user[75361]: Last run: 1760430417 Oct 14 04:26:57 localhost puppet-user[75361]: Total: 0.29 Oct 14 04:26:57 localhost puppet-user[75361]: Version: Oct 14 04:26:57 localhost puppet-user[75361]: Config: 1760430416 Oct 14 04:26:57 localhost puppet-user[75361]: Puppet: 7.10.0 Oct 14 04:26:57 localhost ansible-async_wrapper.py[75357]: Module complete (75357) Oct 14 04:26:57 localhost ansible-async_wrapper.py[75356]: Done in kid B. 
Oct 14 04:27:03 localhost python3[75515]: ansible-ansible.legacy.async_status Invoked with jid=269928068175.75353 mode=status _async_dir=/tmp/.ansible_async Oct 14 04:27:03 localhost python3[75531]: ansible-file Invoked with path=/var/lib/container-puppet/puppetlabs state=directory setype=svirt_sandbox_file_t selevel=s0 recurse=True force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None Oct 14 04:27:04 localhost python3[75547]: ansible-stat Invoked with path=/var/lib/container-puppet/puppetlabs/facter.conf follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Oct 14 04:27:04 localhost python3[75597]: ansible-ansible.legacy.stat Invoked with path=/var/lib/container-puppet/puppetlabs/facter.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Oct 14 04:27:05 localhost python3[75615]: ansible-ansible.legacy.file Invoked with setype=svirt_sandbox_file_t selevel=s0 dest=/var/lib/container-puppet/puppetlabs/facter.conf _original_basename=tmpf5binfb2 recurse=False state=file path=/var/lib/container-puppet/puppetlabs/facter.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None Oct 14 04:27:05 localhost python3[75645]: ansible-file Invoked with path=/opt/puppetlabs/facter state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None 
selevel=None setype=None attributes=None Oct 14 04:27:06 localhost python3[75750]: ansible-ansible.posix.synchronize Invoked with src=/opt/puppetlabs/ dest=/var/lib/container-puppet/puppetlabs/ _local_rsync_path=rsync _local_rsync_password=NOT_LOGGING_PARAMETER rsync_path=None delete=False _substitute_controller=False archive=True checksum=False compress=True existing_only=False dirs=False copy_links=False set_remote_user=True rsync_timeout=0 rsync_opts=[] ssh_connection_multiplexing=False partial=False verify_host=False mode=push dest_port=None private_key=None recursive=None links=None perms=None times=None owner=None group=None ssh_args=None link_dest=None Oct 14 04:27:07 localhost python3[75769]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 04:27:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8. 
Oct 14 04:27:08 localhost podman[75802]: 2025-10-14 08:27:08.310893254 +0000 UTC m=+0.093631757 container health_status 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, vcs-type=git, com.redhat.component=openstack-collectd-container, config_id=tripleo_step3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, distribution-scope=public, io.buildah.version=1.33.12, name=rhosp17/openstack-collectd, container_name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-07-21T13:04:03, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, batch=17.1_20250721.1, managed_by=tripleo_ansible, release=2, architecture=x86_64, version=17.1.9, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}) Oct 14 04:27:08 localhost python3[75801]: ansible-stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Oct 14 04:27:08 localhost podman[75802]: 2025-10-14 08:27:08.352081032 +0000 UTC m=+0.134819565 container exec_died 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.buildah.version=1.33.12, description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, build-date=2025-07-21T13:04:03, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, container_name=collectd, release=2, tcib_managed=true, config_id=tripleo_step3, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.expose-services=, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, version=17.1.9, summary=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, batch=17.1_20250721.1, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.component=openstack-collectd-container, distribution-scope=public) Oct 14 04:27:08 localhost systemd[1]: 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8.service: Deactivated successfully. 
Oct 14 04:27:08 localhost python3[75870]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/tripleo-container-shutdown follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Oct 14 04:27:09 localhost python3[75888]: ansible-ansible.legacy.file Invoked with mode=0700 owner=root group=root dest=/usr/libexec/tripleo-container-shutdown _original_basename=tripleo-container-shutdown recurse=False state=file path=/usr/libexec/tripleo-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 04:27:09 localhost python3[75950]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/tripleo-start-podman-container follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Oct 14 04:27:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea. 
Oct 14 04:27:09 localhost podman[75968]: 2025-10-14 08:27:09.929128861 +0000 UTC m=+0.085010375 container health_status 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, container_name=iscsid, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, version=17.1.9, vendor=Red Hat, Inc., config_id=tripleo_step3, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, build-date=2025-07-21T13:27:15, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=openstack-iscsid-container, io.openshift.expose-services=, io.buildah.version=1.33.12, managed_by=tripleo_ansible, batch=17.1_20250721.1, summary=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}) Oct 14 04:27:09 localhost podman[75968]: 2025-10-14 08:27:09.968122662 +0000 UTC m=+0.124004156 container exec_died 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-iscsid, release=1, build-date=2025-07-21T13:27:15, io.buildah.version=1.33.12, batch=17.1_20250721.1, version=17.1.9, com.redhat.component=openstack-iscsid-container, architecture=x86_64, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, container_name=iscsid, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', 
'/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, vendor=Red Hat, Inc., config_id=tripleo_step3) Oct 14 04:27:09 localhost systemd[1]: 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea.service: Deactivated successfully. Oct 14 04:27:10 localhost python3[75969]: ansible-ansible.legacy.file Invoked with mode=0700 owner=root group=root dest=/usr/libexec/tripleo-start-podman-container _original_basename=tripleo-start-podman-container recurse=False state=file path=/usr/libexec/tripleo-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 04:27:10 localhost python3[76049]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/tripleo-container-shutdown.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Oct 14 04:27:10 localhost python3[76067]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system/tripleo-container-shutdown.service _original_basename=tripleo-container-shutdown-service recurse=False state=file path=/usr/lib/systemd/system/tripleo-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None 
src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 04:27:11 localhost python3[76129]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Oct 14 04:27:11 localhost python3[76147]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset _original_basename=91-tripleo-container-shutdown-preset recurse=False state=file path=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 04:27:12 localhost python3[76177]: ansible-systemd Invoked with name=tripleo-container-shutdown state=started enabled=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None Oct 14 04:27:12 localhost systemd[1]: Reloading. Oct 14 04:27:12 localhost systemd-sysv-generator[76208]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Oct 14 04:27:12 localhost systemd-rc-local-generator[76204]: /etc/rc.d/rc.local is not marked executable, skipping. Oct 14 04:27:12 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Oct 14 04:27:13 localhost python3[76263]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/netns-placeholder.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Oct 14 04:27:13 localhost python3[76281]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/usr/lib/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 04:27:13 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Oct 14 04:27:13 localhost recover_tripleo_nova_virtqemud[76345]: 62551 Oct 14 04:27:13 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Oct 14 04:27:13 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. 
Oct 14 04:27:13 localhost python3[76343]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Oct 14 04:27:14 localhost python3[76363]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/usr/lib/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 04:27:14 localhost python3[76393]: ansible-systemd Invoked with name=netns-placeholder state=started enabled=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None Oct 14 04:27:14 localhost systemd[1]: Reloading. Oct 14 04:27:14 localhost systemd-rc-local-generator[76414]: /etc/rc.d/rc.local is not marked executable, skipping. Oct 14 04:27:14 localhost systemd-sysv-generator[76421]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Oct 14 04:27:14 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Oct 14 04:27:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638. Oct 14 04:27:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884. 
Oct 14 04:27:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f. Oct 14 04:27:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e. Oct 14 04:27:15 localhost systemd[1]: Starting Create netns directory... Oct 14 04:27:15 localhost systemd[1]: run-netns-placeholder.mount: Deactivated successfully. Oct 14 04:27:15 localhost systemd[1]: netns-placeholder.service: Deactivated successfully. Oct 14 04:27:15 localhost systemd[1]: Finished Create netns directory. Oct 14 04:27:15 localhost systemd[1]: tmp-crun.fsn1l7.mount: Deactivated successfully. Oct 14 04:27:15 localhost podman[76431]: 2025-10-14 08:27:15.283964747 +0000 UTC m=+0.098566696 container health_status 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, com.redhat.component=openstack-ceilometer-ipmi-container, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, maintainer=OpenStack TripleO Team, release=1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 
'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp17/openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, version=17.1.9, build-date=2025-07-21T15:29:47, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, tcib_managed=true, config_id=tripleo_step4, batch=17.1_20250721.1) Oct 14 04:27:15 localhost podman[76434]: 2025-10-14 08:27:15.325969199 +0000 UTC m=+0.136529046 container health_status def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.openshift.expose-services=, architecture=x86_64, container_name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.33.12, managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, 
vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, build-date=2025-07-21T14:45:33, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.9, vcs-type=git, com.redhat.component=openstack-ceilometer-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1) Oct 14 04:27:15 localhost podman[76431]: 2025-10-14 08:27:15.341048385 +0000 UTC m=+0.155650324 container exec_died 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, batch=17.1_20250721.1, distribution-scope=public, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, container_name=ceilometer_agent_ipmi, maintainer=OpenStack 
TripleO Team, architecture=x86_64, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T15:29:47, io.buildah.version=1.33.12, tcib_managed=true, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, com.redhat.license_terms=https://www.redhat.com/agreements, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, release=1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.9, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi) Oct 14 04:27:15 localhost systemd[1]: 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638.service: Deactivated successfully. 
Oct 14 04:27:15 localhost podman[76433]: 2025-10-14 08:27:15.373792917 +0000 UTC m=+0.188892061 container health_status d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/agreements, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, maintainer=OpenStack TripleO Team, build-date=2025-07-21T13:07:52, release=1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.33.12, description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, version=17.1.9, container_name=logrotate_crond, distribution-scope=public, vendor=Red Hat, Inc., vcs-type=git, summary=Red Hat OpenStack Platform 
17.1 cron, name=rhosp17/openstack-cron, com.redhat.component=openstack-cron-container, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1, architecture=x86_64, batch=17.1_20250721.1, config_id=tripleo_step4) Oct 14 04:27:15 localhost podman[76433]: 2025-10-14 08:27:15.385059968 +0000 UTC m=+0.200159112 container exec_died d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, build-date=2025-07-21T13:07:52, com.redhat.component=openstack-cron-container, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, io.openshift.tags=rhosp osp openstack osp-17.1, version=17.1.9, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20250721.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, distribution-scope=public, release=1, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=logrotate_crond, config_id=tripleo_step4, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.33.12, vendor=Red Hat, Inc., managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1) Oct 14 04:27:15 localhost systemd[1]: d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f.service: Deactivated successfully. Oct 14 04:27:15 localhost podman[76434]: 2025-10-14 08:27:15.396484754 +0000 UTC m=+0.207044581 container exec_died def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-ceilometer-compute, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, build-date=2025-07-21T14:45:33, vendor=Red Hat, Inc., batch=17.1_20250721.1, io.openshift.expose-services=, vcs-type=git, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, release=1, io.buildah.version=1.33.12, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.9, com.redhat.component=openstack-ceilometer-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute) Oct 14 04:27:15 localhost podman[76432]: 2025-10-14 08:27:15.303915871 +0000 UTC m=+0.117071626 container health_status 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, com.redhat.component=openstack-nova-compute-container, build-date=2025-07-21T14:48:37, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 
'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, version=17.1.9, vendor=Red Hat, Inc., io.buildah.version=1.33.12, vcs-type=git, config_id=tripleo_step4, batch=17.1_20250721.1, container_name=nova_migration_target, distribution-scope=public, name=rhosp17/openstack-nova-compute) Oct 14 04:27:15 localhost systemd[1]: def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e.service: Deactivated successfully. 
Oct 14 04:27:15 localhost podman[76432]: 2025-10-14 08:27:15.647158826 +0000 UTC m=+0.460314581 container exec_died 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, batch=17.1_20250721.1, managed_by=tripleo_ansible, version=17.1.9, architecture=x86_64, name=rhosp17/openstack-nova-compute, build-date=2025-07-21T14:48:37, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, com.redhat.license_terms=https://www.redhat.com/agreements, tcib_managed=true, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, container_name=nova_migration_target, io.buildah.version=1.33.12, release=1, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc.) Oct 14 04:27:15 localhost systemd[1]: 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884.service: Deactivated successfully. Oct 14 04:27:15 localhost python3[76541]: ansible-container_puppet_config Invoked with update_config_hash_only=True no_archive=True check_mode=False config_vol_prefix=/var/lib/config-data debug=False net_host=True puppet_config= short_hostname= step=6 Oct 14 04:27:17 localhost python3[76600]: ansible-tripleo_container_manage Invoked with config_id=tripleo_step5 config_dir=/var/lib/tripleo-config/container-startup-config/step_5 config_patterns=*.json config_overrides={} concurrency=5 log_base_path=/var/log/containers/stdouts debug=False Oct 14 04:27:18 localhost podman[76639]: 2025-10-14 08:27:18.096081339 +0000 UTC m=+0.086482200 container create ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, version=17.1.9, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a-4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, release=1, managed_by=tripleo_ansible, container_name=nova_compute, build-date=2025-07-21T14:48:37, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, vcs-type=git, vendor=Red Hat, Inc., batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 nova-compute) 
Oct 14 04:27:18 localhost podman[76639]: 2025-10-14 08:27:18.048829178 +0000 UTC m=+0.039230069 image pull registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1 Oct 14 04:27:18 localhost systemd[1]: Started libpod-conmon-ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2.scope. Oct 14 04:27:18 localhost systemd[1]: Started libcrun container. Oct 14 04:27:18 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/76cf0246342f1e521e8960667a4518c270830e003f626a6d56414187630bdbfc/merged/var/log/nova supports timestamps until 2038 (0x7fffffff) Oct 14 04:27:18 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/76cf0246342f1e521e8960667a4518c270830e003f626a6d56414187630bdbfc/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff) Oct 14 04:27:18 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/76cf0246342f1e521e8960667a4518c270830e003f626a6d56414187630bdbfc/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff) Oct 14 04:27:18 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/76cf0246342f1e521e8960667a4518c270830e003f626a6d56414187630bdbfc/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff) Oct 14 04:27:18 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/76cf0246342f1e521e8960667a4518c270830e003f626a6d56414187630bdbfc/merged/var/lib/kolla/config_files/src-ceph supports timestamps until 2038 (0x7fffffff) Oct 14 04:27:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2. 
Oct 14 04:27:18 localhost podman[76639]: 2025-10-14 08:27:18.208739421 +0000 UTC m=+0.199140262 container init ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a-4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', 
'/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, batch=17.1_20250721.1, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, managed_by=tripleo_ansible, container_name=nova_compute, vcs-type=git, version=17.1.9, name=rhosp17/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, architecture=x86_64, release=1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.33.12, vendor=Red Hat, Inc., tcib_managed=true, build-date=2025-07-21T14:48:37, maintainer=OpenStack TripleO Team, config_id=tripleo_step5) Oct 14 04:27:18 localhost systemd[1]: tmp-crun.qJ3Qcl.mount: Deactivated successfully. Oct 14 04:27:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2. 
Oct 14 04:27:18 localhost podman[76639]: 2025-10-14 08:27:18.253485026 +0000 UTC m=+0.243885877 container start ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, version=17.1.9, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, config_id=tripleo_step5, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a-4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, release=1, build-date=2025-07-21T14:48:37, vcs-type=git, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, container_name=nova_compute, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d) Oct 14 04:27:18 localhost systemd-logind[760]: Existing logind session ID 28 used by new audit session, ignoring. 
Oct 14 04:27:18 localhost python3[76600]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_compute --conmon-pidfile /run/nova_compute.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env LIBGUESTFS_BACKEND=direct --env TRIPLEO_CONFIG_HASH=49c4309af9a4fea3d3f53b6222780f5a-4d186a6228facd5bcddf9bcc145eb470 --healthcheck-command /openstack/healthcheck 5672 --ipc host --label config_id=tripleo_step5 --label container_name=nova_compute --label managed_by=tripleo_ansible --label config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a-4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', 
'/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_compute.log --network host --privileged=True --ulimit nofile=131072 --ulimit memlock=67108864 --user nova --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/log/containers/nova:/var/log/nova --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro --volume /var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro --volume /var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro --volume /var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z --volume /dev:/dev --volume /lib/modules:/lib/modules:ro --volume /run:/run --volume /run/nova:/run/nova:z --volume /var/lib/iscsi:/var/lib/iscsi:z --volume /var/lib/libvirt:/var/lib/libvirt:shared --volume /sys/class/net:/sys/class/net --volume /sys/bus/pci:/sys/bus/pci --volume /boot:/boot:ro --volume /var/lib/nova:/var/lib/nova:shared registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1 Oct 14 04:27:18 localhost systemd[1]: Created slice User Slice of UID 0. Oct 14 04:27:18 localhost systemd[1]: Starting User Runtime Directory /run/user/0... 
Oct 14 04:27:18 localhost systemd[1]: Finished User Runtime Directory /run/user/0. Oct 14 04:27:18 localhost systemd[1]: Starting User Manager for UID 0... Oct 14 04:27:18 localhost podman[76659]: 2025-10-14 08:27:18.4005874 +0000 UTC m=+0.135346169 container health_status ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=starting, release=1, batch=17.1_20250721.1, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc., distribution-scope=public, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a-4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step5, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/agreements, managed_by=tripleo_ansible, container_name=nova_compute, io.openshift.expose-services=, version=17.1.9, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, vcs-type=git, architecture=x86_64, build-date=2025-07-21T14:48:37, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, io.buildah.version=1.33.12) Oct 14 04:27:18 localhost podman[76659]: 2025-10-14 08:27:18.487173693 +0000 UTC m=+0.221932532 container exec_died ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-07-21T14:48:37, vendor=Red Hat, Inc., distribution-scope=public, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a-4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 
'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, batch=17.1_20250721.1, container_name=nova_compute, name=rhosp17/openstack-nova-compute, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, managed_by=tripleo_ansible, tcib_managed=true, architecture=x86_64, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, release=1, description=Red Hat OpenStack Platform 17.1 nova-compute, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, version=17.1.9) Oct 14 04:27:18 localhost podman[76659]: unhealthy Oct 14 04:27:18 localhost systemd[1]: ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2.service: Main process exited, code=exited, status=1/FAILURE Oct 14 04:27:18 localhost systemd[1]: ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2.service: Failed with result 'exit-code'. Oct 14 04:27:18 localhost systemd[76672]: Queued start job for default target Main User Target. Oct 14 04:27:18 localhost systemd[76672]: Created slice User Application Slice. Oct 14 04:27:18 localhost systemd[76672]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system). Oct 14 04:27:18 localhost systemd[76672]: Started Daily Cleanup of User's Temporary Directories. Oct 14 04:27:18 localhost systemd[76672]: Reached target Paths. Oct 14 04:27:18 localhost systemd[76672]: Reached target Timers. Oct 14 04:27:18 localhost systemd[76672]: Starting D-Bus User Message Bus Socket... Oct 14 04:27:18 localhost systemd[76672]: Starting Create User's Volatile Files and Directories... Oct 14 04:27:18 localhost systemd[76672]: Listening on D-Bus User Message Bus Socket. Oct 14 04:27:18 localhost systemd[76672]: Reached target Sockets. Oct 14 04:27:18 localhost systemd[76672]: Finished Create User's Volatile Files and Directories. Oct 14 04:27:18 localhost systemd[76672]: Reached target Basic System. Oct 14 04:27:18 localhost systemd[76672]: Reached target Main User Target. Oct 14 04:27:18 localhost systemd[76672]: Startup finished in 179ms. Oct 14 04:27:18 localhost systemd[1]: Started User Manager for UID 0. Oct 14 04:27:18 localhost systemd[1]: Started Session c10 of User root. Oct 14 04:27:18 localhost systemd[1]: session-c10.scope: Deactivated successfully. 
Oct 14 04:27:18 localhost podman[76759]: 2025-10-14 08:27:18.69772903 +0000 UTC m=+0.080985644 container create 1cfa0ca4ab729c0c5236a1f4f1a8f080074a2aefce02fc400d99485d2f6f506c (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_wait_for_compute_service, vendor=Red Hat, Inc., io.openshift.expose-services=, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-07-21T14:48:37, config_data={'detach': False, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', '__OS_DEBUG': 'true', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'start_order': 4, 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova_compute_wait_for_compute_service.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/log/containers/nova:/var/log/nova', '/var/lib/container-config-scripts:/container-config-scripts']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20250721.1, container_name=nova_wait_for_compute_service, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhosp17/openstack-nova-compute, 
managed_by=tripleo_ansible, distribution-scope=public, tcib_managed=true, config_id=tripleo_step5, vcs-type=git, architecture=x86_64, io.buildah.version=1.33.12, version=17.1.9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, io.openshift.tags=rhosp osp openstack osp-17.1) Oct 14 04:27:18 localhost systemd[1]: Started libpod-conmon-1cfa0ca4ab729c0c5236a1f4f1a8f080074a2aefce02fc400d99485d2f6f506c.scope. Oct 14 04:27:18 localhost systemd[1]: Started libcrun container. Oct 14 04:27:18 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9742bc003ef67d7b0fcfbbfc43cb220e80e11c29322c24576aca8c567b6093e5/merged/container-config-scripts supports timestamps until 2038 (0x7fffffff) Oct 14 04:27:18 localhost podman[76759]: 2025-10-14 08:27:18.658201772 +0000 UTC m=+0.041458486 image pull registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1 Oct 14 04:27:18 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9742bc003ef67d7b0fcfbbfc43cb220e80e11c29322c24576aca8c567b6093e5/merged/var/log/nova supports timestamps until 2038 (0x7fffffff) Oct 14 04:27:18 localhost podman[76759]: 2025-10-14 08:27:18.769929576 +0000 UTC m=+0.153186210 container init 1cfa0ca4ab729c0c5236a1f4f1a8f080074a2aefce02fc400d99485d2f6f506c (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_wait_for_compute_service, distribution-scope=public, tcib_managed=true, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, container_name=nova_wait_for_compute_service, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.9, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'detach': False, 'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', '__OS_DEBUG': 'true', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'start_order': 4, 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova_compute_wait_for_compute_service.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/log/containers/nova:/var/log/nova', '/var/lib/container-config-scripts:/container-config-scripts']}, managed_by=tripleo_ansible, batch=17.1_20250721.1, architecture=x86_64, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, build-date=2025-07-21T14:48:37, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc., io.openshift.expose-services=, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, release=1) Oct 14 04:27:18 localhost podman[76759]: 2025-10-14 08:27:18.778191836 +0000 UTC m=+0.161448470 container start 1cfa0ca4ab729c0c5236a1f4f1a8f080074a2aefce02fc400d99485d2f6f506c (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_wait_for_compute_service, tcib_managed=true, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, 
build-date=2025-07-21T14:48:37, managed_by=tripleo_ansible, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, name=rhosp17/openstack-nova-compute, batch=17.1_20250721.1, container_name=nova_wait_for_compute_service, architecture=x86_64, version=17.1.9, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, com.redhat.license_terms=https://www.redhat.com/agreements, release=1, description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'detach': False, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', '__OS_DEBUG': 'true', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'start_order': 4, 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova_compute_wait_for_compute_service.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/log/containers/nova:/var/log/nova', '/var/lib/container-config-scripts:/container-config-scripts']}, vendor=Red Hat, Inc., config_id=tripleo_step5, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team) Oct 14 04:27:18 localhost podman[76759]: 2025-10-14 08:27:18.778522296 +0000 UTC m=+0.161778940 container attach 
1cfa0ca4ab729c0c5236a1f4f1a8f080074a2aefce02fc400d99485d2f6f506c (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_wait_for_compute_service, io.openshift.expose-services=, container_name=nova_wait_for_compute_service, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, batch=17.1_20250721.1, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, config_id=tripleo_step5, config_data={'detach': False, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', '__OS_DEBUG': 'true', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'start_order': 4, 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova_compute_wait_for_compute_service.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/log/containers/nova:/var/log/nova', '/var/lib/container-config-scripts:/container-config-scripts']}, name=rhosp17/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack 
Platform 17.1 nova-compute, release=1, build-date=2025-07-21T14:48:37, managed_by=tripleo_ansible, distribution-scope=public, vendor=Red Hat, Inc., architecture=x86_64, io.buildah.version=1.33.12, version=17.1.9, vcs-type=git) Oct 14 04:27:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e. Oct 14 04:27:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5. Oct 14 04:27:18 localhost podman[76783]: 2025-10-14 08:27:18.995961791 +0000 UTC m=+0.077430546 container health_status 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', 
'/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, batch=17.1_20250721.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/agreements, release=1, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc., build-date=2025-07-21T16:28:53, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, container_name=ovn_metadata_agent, version=17.1.9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, distribution-scope=public, maintainer=OpenStack TripleO Team, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.33.12, io.openshift.expose-services=, vcs-type=git, architecture=x86_64) Oct 14 04:27:19 localhost podman[76784]: 2025-10-14 08:27:19.059599809 +0000 UTC m=+0.136878627 container health_status a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, vcs-type=git, build-date=2025-07-21T13:28:44, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ovn-controller, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, release=1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, container_name=ovn_controller, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step4, io.buildah.version=1.33.12, name=rhosp17/openstack-ovn-controller, batch=17.1_20250721.1, com.redhat.component=openstack-ovn-controller-container, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, maintainer=OpenStack TripleO Team) Oct 14 04:27:19 localhost podman[76783]: 2025-10-14 08:27:19.072085266 +0000 UTC m=+0.153553971 container exec_died 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, batch=17.1_20250721.1, container_name=ovn_metadata_agent, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, vcs-type=git, config_id=tripleo_step4, distribution-scope=public, maintainer=OpenStack TripleO Team, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, name=rhosp17/openstack-neutron-metadata-agent-ovn, version=17.1.9, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-07-21T16:28:53, release=1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., 
io.buildah.version=1.33.12, io.openshift.expose-services=) Oct 14 04:27:19 localhost podman[76784]: 2025-10-14 08:27:19.085340518 +0000 UTC m=+0.162619326 container exec_died a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, build-date=2025-07-21T13:28:44, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, release=1, version=17.1.9, name=rhosp17/openstack-ovn-controller, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, vendor=Red Hat, Inc., batch=17.1_20250721.1, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-type=git, architecture=x86_64, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.33.12) Oct 14 04:27:19 localhost systemd[1]: 
7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e.service: Deactivated successfully. Oct 14 04:27:19 localhost systemd[1]: a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5.service: Deactivated successfully. Oct 14 04:27:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6. Oct 14 04:27:22 localhost systemd[1]: tmp-crun.VkN9lS.mount: Deactivated successfully. Oct 14 04:27:22 localhost podman[76831]: 2025-10-14 08:27:22.759773624 +0000 UTC m=+0.100635538 container health_status 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, container_name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, vcs-type=git, name=rhosp17/openstack-qdrouterd, version=17.1.9, architecture=x86_64, io.openshift.expose-services=, build-date=2025-07-21T13:07:59, distribution-scope=public, io.buildah.version=1.33.12, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, managed_by=tripleo_ansible, tcib_managed=true, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 
'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd) Oct 14 04:27:22 localhost podman[76831]: 2025-10-14 08:27:22.956134901 +0000 UTC m=+0.296996815 container exec_died 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, vcs-type=git, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, managed_by=tripleo_ansible, container_name=metrics_qdr, architecture=x86_64, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, name=rhosp17/openstack-qdrouterd, version=17.1.9, com.redhat.component=openstack-qdrouterd-container, build-date=2025-07-21T13:07:59, batch=17.1_20250721.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, maintainer=OpenStack TripleO Team) Oct 14 04:27:22 localhost systemd[1]: 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6.service: Deactivated successfully. Oct 14 04:27:28 localhost systemd[1]: Stopping User Manager for UID 0... Oct 14 04:27:28 localhost systemd[76672]: Activating special unit Exit the Session... Oct 14 04:27:28 localhost systemd[76672]: Stopped target Main User Target. Oct 14 04:27:28 localhost systemd[76672]: Stopped target Basic System. Oct 14 04:27:28 localhost systemd[76672]: Stopped target Paths. Oct 14 04:27:28 localhost systemd[76672]: Stopped target Sockets. Oct 14 04:27:28 localhost systemd[76672]: Stopped target Timers. Oct 14 04:27:28 localhost systemd[76672]: Stopped Daily Cleanup of User's Temporary Directories. Oct 14 04:27:28 localhost systemd[76672]: Closed D-Bus User Message Bus Socket. 
Oct 14 04:27:28 localhost systemd[76672]: Stopped Create User's Volatile Files and Directories. Oct 14 04:27:28 localhost systemd[76672]: Removed slice User Application Slice. Oct 14 04:27:28 localhost systemd[76672]: Reached target Shutdown. Oct 14 04:27:28 localhost systemd[76672]: Finished Exit the Session. Oct 14 04:27:28 localhost systemd[76672]: Reached target Exit the Session. Oct 14 04:27:28 localhost systemd[1]: user@0.service: Deactivated successfully. Oct 14 04:27:28 localhost systemd[1]: Stopped User Manager for UID 0. Oct 14 04:27:28 localhost systemd[1]: Stopping User Runtime Directory /run/user/0... Oct 14 04:27:28 localhost systemd[1]: run-user-0.mount: Deactivated successfully. Oct 14 04:27:28 localhost systemd[1]: user-runtime-dir@0.service: Deactivated successfully. Oct 14 04:27:28 localhost systemd[1]: Stopped User Runtime Directory /run/user/0. Oct 14 04:27:28 localhost systemd[1]: Removed slice User Slice of UID 0. Oct 14 04:27:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8. 
Oct 14 04:27:38 localhost podman[76937]: 2025-10-14 08:27:38.765278525 +0000 UTC m=+0.102206456 container health_status 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, name=rhosp17/openstack-collectd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, architecture=x86_64, build-date=2025-07-21T13:04:03, config_id=tripleo_step3, managed_by=tripleo_ansible, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, batch=17.1_20250721.1, release=2, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', 
'/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, distribution-scope=public, vcs-type=git, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, summary=Red Hat OpenStack Platform 17.1 collectd, version=17.1.9) Oct 14 04:27:38 localhost podman[76937]: 2025-10-14 08:27:38.781352472 +0000 UTC m=+0.118280393 container exec_died 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, batch=17.1_20250721.1, managed_by=tripleo_ansible, tcib_managed=true, com.redhat.component=openstack-collectd-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, vendor=Red Hat, Inc., name=rhosp17/openstack-collectd, architecture=x86_64, container_name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, release=2, io.openshift.tags=rhosp osp openstack osp-17.1, version=17.1.9, io.openshift.expose-services=, build-date=2025-07-21T13:04:03, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.33.12, vcs-type=git, distribution-scope=public, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}) Oct 14 04:27:38 localhost systemd[1]: 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8.service: Deactivated successfully. Oct 14 04:27:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea. 
Oct 14 04:27:40 localhost podman[76957]: 2025-10-14 08:27:40.740838333 +0000 UTC m=+0.079535859 container health_status 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, maintainer=OpenStack TripleO Team, release=1, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.33.12, com.redhat.component=openstack-iscsid-container, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, build-date=2025-07-21T13:27:15, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, architecture=x86_64, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, tcib_managed=true, container_name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step3, version=17.1.9, batch=17.1_20250721.1, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2) Oct 14 04:27:40 localhost podman[76957]: 2025-10-14 08:27:40.781659149 +0000 UTC m=+0.120356655 container exec_died 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, architecture=x86_64, release=1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, container_name=iscsid, tcib_managed=true, io.buildah.version=1.33.12, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, distribution-scope=public, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, com.redhat.component=openstack-iscsid-container, summary=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, version=17.1.9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, batch=17.1_20250721.1, vendor=Red Hat, Inc., vcs-type=git, build-date=2025-07-21T13:27:15, name=rhosp17/openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/agreements) Oct 14 04:27:40 localhost systemd[1]: 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea.service: Deactivated successfully. Oct 14 04:27:45 localhost systemd[1]: session-27.scope: Deactivated successfully. Oct 14 04:27:45 localhost systemd[1]: session-27.scope: Consumed 3.074s CPU time. Oct 14 04:27:45 localhost systemd-logind[760]: Session 27 logged out. Waiting for processes to exit. Oct 14 04:27:45 localhost systemd-logind[760]: Removed session 27. Oct 14 04:27:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638. Oct 14 04:27:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f. Oct 14 04:27:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e. Oct 14 04:27:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884. Oct 14 04:27:45 localhost systemd[1]: tmp-crun.2dAWU9.mount: Deactivated successfully. 
Oct 14 04:27:45 localhost podman[76984]: 2025-10-14 08:27:45.764296503 +0000 UTC m=+0.090534282 container health_status 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, architecture=x86_64, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/agreements, managed_by=tripleo_ansible, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, build-date=2025-07-21T14:48:37, vendor=Red Hat, Inc., config_id=tripleo_step4, batch=17.1_20250721.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.9, container_name=nova_migration_target, vcs-type=git, io.buildah.version=1.33.12, com.redhat.component=openstack-nova-compute-container, release=1, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true) Oct 14 04:27:45 localhost systemd[1]: tmp-crun.UBkpuP.mount: Deactivated successfully. Oct 14 04:27:45 localhost podman[76978]: 2025-10-14 08:27:45.817938289 +0000 UTC m=+0.150882801 container health_status def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, distribution-scope=public, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, version=17.1.9, com.redhat.component=openstack-ceilometer-compute-container, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, container_name=ceilometer_agent_compute, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, architecture=x86_64, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., build-date=2025-07-21T14:45:33, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 
'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, vcs-type=git, batch=17.1_20250721.1) Oct 14 04:27:45 localhost podman[76976]: 2025-10-14 08:27:45.733919454 +0000 UTC m=+0.074438795 container health_status 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, batch=17.1_20250721.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, release=1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.9, name=rhosp17/openstack-ceilometer-ipmi, build-date=2025-07-21T15:29:47, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, io.buildah.version=1.33.12, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-ipmi-container, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, tcib_managed=true, container_name=ceilometer_agent_ipmi, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., config_id=tripleo_step4) Oct 14 04:27:45 localhost podman[76976]: 2025-10-14 08:27:45.868091646 +0000 UTC m=+0.208610977 container exec_died 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, version=17.1.9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, vendor=Red Hat, Inc., config_id=tripleo_step4, 
container_name=ceilometer_agent_ipmi, batch=17.1_20250721.1, io.openshift.expose-services=, vcs-type=git, com.redhat.component=openstack-ceilometer-ipmi-container, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, release=1, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, build-date=2025-07-21T15:29:47, architecture=x86_64, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1) Oct 14 04:27:45 localhost podman[76978]: 2025-10-14 08:27:45.8761292 +0000 UTC m=+0.209073782 container exec_died def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, config_id=tripleo_step4, version=17.1.9, 
batch=17.1_20250721.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, managed_by=tripleo_ansible, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, container_name=ceilometer_agent_compute, name=rhosp17/openstack-ceilometer-compute, io.buildah.version=1.33.12, release=1, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-compute-container, io.openshift.expose-services=, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T14:45:33, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, 
vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute) Oct 14 04:27:45 localhost systemd[1]: 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638.service: Deactivated successfully. Oct 14 04:27:45 localhost systemd[1]: def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e.service: Deactivated successfully. Oct 14 04:27:45 localhost podman[76977]: 2025-10-14 08:27:45.948983966 +0000 UTC m=+0.284473985 container health_status d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, tcib_managed=true, architecture=x86_64, build-date=2025-07-21T13:07:52, summary=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, release=1, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, config_id=tripleo_step4, batch=17.1_20250721.1, io.buildah.version=1.33.12, distribution-scope=public, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, vcs-type=git, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., com.redhat.component=openstack-cron-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, name=rhosp17/openstack-cron, version=17.1.9, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron) Oct 14 04:27:45 localhost podman[76977]: 2025-10-14 08:27:45.989131033 +0000 UTC m=+0.324621022 container exec_died d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., container_name=logrotate_crond, vcs-type=git, io.openshift.expose-services=, version=17.1.9, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.component=openstack-cron-container, build-date=2025-07-21T13:07:52, distribution-scope=public, name=rhosp17/openstack-cron, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, summary=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, batch=17.1_20250721.1, release=1, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': 
'/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, managed_by=tripleo_ansible, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team) Oct 14 04:27:46 localhost systemd[1]: d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f.service: Deactivated successfully. 
Oct 14 04:27:46 localhost podman[76984]: 2025-10-14 08:27:46.109170767 +0000 UTC m=+0.435408536 container exec_died 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, architecture=x86_64, build-date=2025-07-21T14:48:37, name=rhosp17/openstack-nova-compute, release=1, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=openstack-nova-compute-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, container_name=nova_migration_target, distribution-scope=public, version=17.1.9, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, maintainer=OpenStack TripleO 
Team, managed_by=tripleo_ansible, vendor=Red Hat, Inc., batch=17.1_20250721.1, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute) Oct 14 04:27:46 localhost systemd[1]: 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884.service: Deactivated successfully. Oct 14 04:27:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2. Oct 14 04:27:48 localhost podman[77072]: 2025-10-14 08:27:48.745907369 +0000 UTC m=+0.082890062 container health_status ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=starting, com.redhat.component=openstack-nova-compute-container, vcs-type=git, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, config_id=tripleo_step5, build-date=2025-07-21T14:48:37, summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, batch=17.1_20250721.1, distribution-scope=public, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, release=1, tcib_managed=true, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-compute, version=17.1.9, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 
'49c4309af9a4fea3d3f53b6222780f5a-4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.33.12) Oct 14 04:27:48 localhost podman[77072]: 2025-10-14 08:27:48.807214386 +0000 UTC m=+0.144197109 container exec_died ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, distribution-scope=public, version=17.1.9, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, 
maintainer=OpenStack TripleO Team, build-date=2025-07-21T14:48:37, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a-4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, vcs-type=git, 
container_name=nova_compute, name=rhosp17/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/agreements, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, io.openshift.tags=rhosp osp openstack osp-17.1, batch=17.1_20250721.1, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1, tcib_managed=true, architecture=x86_64) Oct 14 04:27:48 localhost podman[77072]: unhealthy Oct 14 04:27:48 localhost systemd[1]: ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2.service: Main process exited, code=exited, status=1/FAILURE Oct 14 04:27:48 localhost systemd[1]: ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2.service: Failed with result 'exit-code'. Oct 14 04:27:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e. Oct 14 04:27:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5. Oct 14 04:27:49 localhost systemd[1]: tmp-crun.uPXn8R.mount: Deactivated successfully. 
Oct 14 04:27:49 localhost podman[77096]: 2025-10-14 08:27:49.749940805 +0000 UTC m=+0.086999036 container health_status a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/agreements, managed_by=tripleo_ansible, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, batch=17.1_20250721.1, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, version=17.1.9, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, maintainer=OpenStack TripleO Team, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-07-21T13:28:44, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, container_name=ovn_controller, name=rhosp17/openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc.) 
Oct 14 04:27:49 localhost podman[77095]: 2025-10-14 08:27:49.792434361 +0000 UTC m=+0.132508593 container health_status 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step4, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1, version=17.1.9, architecture=x86_64, name=rhosp17/openstack-neutron-metadata-agent-ovn, tcib_managed=true, 
io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, batch=17.1_20250721.1, maintainer=OpenStack TripleO Team, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.expose-services=, vcs-type=git, io.buildah.version=1.33.12, container_name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., build-date=2025-07-21T16:28:53, release=1) Oct 14 04:27:49 localhost podman[77096]: 2025-10-14 08:27:49.798242997 +0000 UTC m=+0.135301208 container exec_died a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, build-date=2025-07-21T13:28:44, com.redhat.component=openstack-ovn-controller-container, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, container_name=ovn_controller, tcib_managed=true, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, 
vcs-type=git, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, name=rhosp17/openstack-ovn-controller, config_id=tripleo_step4, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., batch=17.1_20250721.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, version=17.1.9, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245) Oct 14 04:27:49 localhost systemd[1]: a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5.service: Deactivated successfully. Oct 14 04:27:49 localhost podman[77095]: 2025-10-14 08:27:49.834047052 +0000 UTC m=+0.174121314 container exec_died 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.buildah.version=1.33.12, version=17.1.9, batch=17.1_20250721.1, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step4, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, name=rhosp17/openstack-neutron-metadata-agent-ovn, release=1, tcib_managed=true, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., com.redhat.component=openstack-neutron-metadata-agent-ovn-container, build-date=2025-07-21T16:28:53, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent) Oct 14 04:27:49 localhost systemd[1]: 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e.service: Deactivated successfully. Oct 14 04:27:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6. 
Oct 14 04:27:53 localhost podman[77145]: 2025-10-14 08:27:53.750700874 +0000 UTC m=+0.088524451 container health_status 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, release=1, container_name=metrics_qdr, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, tcib_managed=true, com.redhat.component=openstack-qdrouterd-container, io.openshift.tags=rhosp osp 
openstack osp-17.1, architecture=x86_64, io.buildah.version=1.33.12, build-date=2025-07-21T13:07:59, version=17.1.9, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, batch=17.1_20250721.1, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, name=rhosp17/openstack-qdrouterd) Oct 14 04:27:53 localhost podman[77145]: 2025-10-14 08:27:53.975143021 +0000 UTC m=+0.312966588 container exec_died 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, config_id=tripleo_step1, vcs-type=git, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, batch=17.1_20250721.1, distribution-scope=public, io.openshift.expose-services=, name=rhosp17/openstack-qdrouterd, summary=Red Hat OpenStack 
Platform 17.1 qdrouterd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, release=1, architecture=x86_64, maintainer=OpenStack TripleO Team, build-date=2025-07-21T13:07:59, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.9, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, container_name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 qdrouterd) Oct 14 04:27:53 localhost systemd[1]: 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6.service: Deactivated successfully. Oct 14 04:28:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8. Oct 14 04:28:09 localhost systemd[1]: tmp-crun.hdAW5w.mount: Deactivated successfully. 
Oct 14 04:28:09 localhost podman[77174]: 2025-10-14 08:28:09.759562356 +0000 UTC m=+0.094674908 container health_status 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, managed_by=tripleo_ansible, com.redhat.component=openstack-collectd-container, description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, container_name=collectd, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, summary=Red Hat OpenStack Platform 17.1 collectd, version=17.1.9, batch=17.1_20250721.1, config_id=tripleo_step3, io.buildah.version=1.33.12, architecture=x86_64, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, build-date=2025-07-21T13:04:03, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, release=2, tcib_managed=true, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public) Oct 14 04:28:09 localhost podman[77174]: 2025-10-14 08:28:09.772930291 +0000 UTC m=+0.108042843 container exec_died 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, summary=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, name=rhosp17/openstack-collectd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, batch=17.1_20250721.1, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, config_id=tripleo_step3, release=2, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, build-date=2025-07-21T13:04:03, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1, version=17.1.9, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, container_name=collectd, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., tcib_managed=true, io.buildah.version=1.33.12) Oct 14 04:28:09 localhost systemd[1]: 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8.service: Deactivated successfully. Oct 14 04:28:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea. 
Oct 14 04:28:11 localhost podman[77194]: 2025-10-14 08:28:11.736895618 +0000 UTC m=+0.081765197 container health_status 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, com.redhat.component=openstack-iscsid-container, container_name=iscsid, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1, version=17.1.9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, vcs-type=git, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, tcib_managed=true, distribution-scope=public, io.buildah.version=1.33.12, io.openshift.expose-services=, name=rhosp17/openstack-iscsid, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20250721.1, release=1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', 
'/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, maintainer=OpenStack TripleO Team, build-date=2025-07-21T13:27:15, managed_by=tripleo_ansible, architecture=x86_64) Oct 14 04:28:11 localhost podman[77194]: 2025-10-14 08:28:11.775048763 +0000 UTC m=+0.119918302 container exec_died 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, build-date=2025-07-21T13:27:15, config_id=tripleo_step3, io.buildah.version=1.33.12, name=rhosp17/openstack-iscsid, release=1, architecture=x86_64, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, batch=17.1_20250721.1, tcib_managed=true, com.redhat.component=openstack-iscsid-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, version=17.1.9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, container_name=iscsid, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc.) Oct 14 04:28:11 localhost systemd[1]: 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea.service: Deactivated successfully. Oct 14 04:28:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638. Oct 14 04:28:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884. Oct 14 04:28:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f. Oct 14 04:28:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e. 
Oct 14 04:28:16 localhost podman[77222]: 2025-10-14 08:28:16.738214397 +0000 UTC m=+0.073020942 container health_status def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, container_name=ceilometer_agent_compute, io.buildah.version=1.33.12, vendor=Red Hat, Inc., io.openshift.expose-services=, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-07-21T14:45:33, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, vcs-type=git, release=1, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, batch=17.1_20250721.1, distribution-scope=public, version=17.1.9, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-ceilometer-compute) Oct 14 04:28:16 localhost podman[77222]: 2025-10-14 08:28:16.792008886 +0000 UTC m=+0.126815421 container exec_died def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.33.12, batch=17.1_20250721.1, 
distribution-scope=public, version=17.1.9, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1, io.openshift.expose-services=, managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-compute, build-date=2025-07-21T14:45:33, release=1, tcib_managed=true, com.redhat.component=openstack-ceilometer-compute-container, container_name=ceilometer_agent_compute, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute) Oct 14 04:28:16 localhost systemd[1]: def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e.service: Deactivated successfully. 
Oct 14 04:28:16 localhost podman[77216]: 2025-10-14 08:28:16.795729169 +0000 UTC m=+0.129783922 container health_status d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, architecture=x86_64, com.redhat.component=openstack-cron-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, build-date=2025-07-21T13:07:52, config_id=tripleo_step4, batch=17.1_20250721.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, name=rhosp17/openstack-cron, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, release=1, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, tcib_managed=true, 
com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., io.buildah.version=1.33.12, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1, version=17.1.9, container_name=logrotate_crond, distribution-scope=public) Oct 14 04:28:16 localhost systemd[1]: tmp-crun.iFTfZE.mount: Deactivated successfully. Oct 14 04:28:16 localhost podman[77215]: 2025-10-14 08:28:16.856907281 +0000 UTC m=+0.194613824 container health_status 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, build-date=2025-07-21T14:48:37, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, container_name=nova_migration_target, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-nova-compute, batch=17.1_20250721.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step4, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, vendor=Red Hat, Inc., architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, tcib_managed=true, io.buildah.version=1.33.12) Oct 14 04:28:16 localhost podman[77216]: 2025-10-14 08:28:16.887123686 +0000 UTC m=+0.221178449 container exec_died d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, version=17.1.9, io.buildah.version=1.33.12, container_name=logrotate_crond, com.redhat.component=openstack-cron-container, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1, config_id=tripleo_step4, build-date=2025-07-21T13:07:52, batch=17.1_20250721.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, distribution-scope=public, name=rhosp17/openstack-cron, vendor=Red Hat, Inc., io.openshift.expose-services=, managed_by=tripleo_ansible, vcs-type=git, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 
'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, release=1, summary=Red Hat OpenStack Platform 17.1 cron) Oct 14 04:28:16 localhost systemd[1]: d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f.service: Deactivated successfully. 
Oct 14 04:28:16 localhost podman[77214]: 2025-10-14 08:28:16.908777352 +0000 UTC m=+0.250501697 container health_status 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.33.12, io.openshift.expose-services=, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, build-date=2025-07-21T15:29:47, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhosp17/openstack-ceilometer-ipmi, distribution-scope=public, version=17.1.9, batch=17.1_20250721.1, release=1, description=Red Hat 
OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, vendor=Red Hat, Inc., container_name=ceilometer_agent_ipmi) Oct 14 04:28:16 localhost podman[77214]: 2025-10-14 08:28:16.941222804 +0000 UTC m=+0.282947119 container exec_died 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, maintainer=OpenStack TripleO Team, tcib_managed=true, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.expose-services=, release=1, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., vcs-type=git, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, distribution-scope=public, io.buildah.version=1.33.12, version=17.1.9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-07-21T15:29:47, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, name=rhosp17/openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20250721.1) Oct 14 04:28:16 localhost systemd[1]: 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638.service: Deactivated successfully. Oct 14 04:28:17 localhost podman[77215]: 2025-10-14 08:28:17.234082934 +0000 UTC m=+0.571789427 container exec_died 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, batch=17.1_20250721.1, build-date=2025-07-21T14:48:37, description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, release=1, config_id=tripleo_step4, io.buildah.version=1.33.12, architecture=x86_64, vcs-type=git, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, version=17.1.9, 
com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, maintainer=OpenStack TripleO Team) Oct 14 04:28:17 localhost systemd[1]: 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884.service: Deactivated successfully. Oct 14 04:28:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2. Oct 14 04:28:19 localhost systemd[1]: tmp-crun.2dOb2g.mount: Deactivated successfully. 
Oct 14 04:28:19 localhost podman[77306]: 2025-10-14 08:28:19.736836477 +0000 UTC m=+0.080251711 container health_status ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=starting, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a-4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', 
'/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1, container_name=nova_compute, version=17.1.9, release=1, vcs-type=git, vendor=Red Hat, Inc., vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, architecture=x86_64, build-date=2025-07-21T14:48:37, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step5, maintainer=OpenStack TripleO Team, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, batch=17.1_20250721.1, distribution-scope=public, managed_by=tripleo_ansible) Oct 14 04:28:19 localhost podman[77306]: 2025-10-14 08:28:19.803249188 +0000 UTC m=+0.146664472 container exec_died ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, batch=17.1_20250721.1, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, config_id=tripleo_step5, io.buildah.version=1.33.12, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, config_data={'depends_on': 
['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a-4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_compute, build-date=2025-07-21T14:48:37, io.openshift.tags=rhosp osp openstack osp-17.1, architecture=x86_64, version=17.1.9, name=rhosp17/openstack-nova-compute, tcib_managed=true, release=1, vcs-type=git, vendor=Red Hat, Inc., 
com.redhat.license_terms=https://www.redhat.com/agreements) Oct 14 04:28:19 localhost podman[77306]: unhealthy Oct 14 04:28:19 localhost systemd[1]: ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2.service: Main process exited, code=exited, status=1/FAILURE Oct 14 04:28:19 localhost systemd[1]: ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2.service: Failed with result 'exit-code'. Oct 14 04:28:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5. Oct 14 04:28:19 localhost podman[77330]: 2025-10-14 08:28:19.918038005 +0000 UTC m=+0.079167529 container health_status a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, version=17.1.9, tcib_managed=true, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhosp17/openstack-ovn-controller, distribution-scope=public, build-date=2025-07-21T13:28:44, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, vcs-type=git, com.redhat.component=openstack-ovn-controller-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20250721.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, container_name=ovn_controller, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, io.buildah.version=1.33.12, description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller) Oct 14 04:28:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e. Oct 14 04:28:19 localhost podman[77330]: 2025-10-14 08:28:19.972268746 +0000 UTC m=+0.133398240 container exec_died a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/agreements, managed_by=tripleo_ansible, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.33.12, com.redhat.component=openstack-ovn-controller-container, batch=17.1_20250721.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., build-date=2025-07-21T13:28:44, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 
'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, architecture=x86_64, config_id=tripleo_step4, name=rhosp17/openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, version=17.1.9) Oct 14 04:28:19 localhost systemd[1]: a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5.service: Deactivated successfully. Oct 14 04:28:20 localhost podman[77349]: 2025-10-14 08:28:20.040904316 +0000 UTC m=+0.092814822 container health_status 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, vendor=Red Hat, Inc., config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, version=17.1.9, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, distribution-scope=public, release=1, container_name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-07-21T16:28:53, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, io.buildah.version=1.33.12, io.openshift.expose-services=, batch=17.1_20250721.1, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, architecture=x86_64, config_data={'cgroupns': 'host', 'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Oct 14 04:28:20 localhost podman[77349]: 2025-10-14 08:28:20.117023951 +0000 UTC m=+0.168934487 container exec_died 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, 
io.openshift.expose-services=, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, vcs-type=git, batch=17.1_20250721.1, container_name=ovn_metadata_agent, name=rhosp17/openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., distribution-scope=public, build-date=2025-07-21T16:28:53, io.buildah.version=1.33.12, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1, tcib_managed=true, io.k8s.description=Red Hat 
OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, version=17.1.9) Oct 14 04:28:20 localhost systemd[1]: 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e.service: Deactivated successfully. Oct 14 04:28:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6. Oct 14 04:28:24 localhost podman[77378]: 2025-10-14 08:28:24.739047914 +0000 UTC m=+0.083796408 container health_status 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, build-date=2025-07-21T13:07:59, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-type=git, tcib_managed=true, 
vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, name=rhosp17/openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/agreements, release=1, io.openshift.tags=rhosp osp openstack osp-17.1, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, version=17.1.9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, managed_by=tripleo_ansible, io.buildah.version=1.33.12, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20250721.1, architecture=x86_64, com.redhat.component=openstack-qdrouterd-container, container_name=metrics_qdr, vendor=Red Hat, Inc., distribution-scope=public) Oct 14 04:28:24 localhost podman[77378]: 2025-10-14 08:28:24.945973591 +0000 UTC m=+0.290722115 container exec_died 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, version=17.1.9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.33.12, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, release=1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 
'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, container_name=metrics_qdr, io.openshift.expose-services=, architecture=x86_64, distribution-scope=public, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, batch=17.1_20250721.1, managed_by=tripleo_ansible, vendor=Red Hat, Inc., build-date=2025-07-21T13:07:59) Oct 14 04:28:24 localhost systemd[1]: 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6.service: Deactivated successfully. Oct 14 04:28:35 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Oct 14 04:28:35 localhost recover_tripleo_nova_virtqemud[77423]: 62551 Oct 14 04:28:35 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Oct 14 04:28:35 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. Oct 14 04:28:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8. 
Oct 14 04:28:40 localhost podman[77486]: 2025-10-14 08:28:40.745763222 +0000 UTC m=+0.080972533 container health_status 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, build-date=2025-07-21T13:04:03, batch=17.1_20250721.1, config_id=tripleo_step3, io.buildah.version=1.33.12, io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, tcib_managed=true, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, managed_by=tripleo_ansible, version=17.1.9, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, release=2, summary=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, container_name=collectd, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, architecture=x86_64, com.redhat.component=openstack-collectd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd) Oct 14 04:28:40 localhost podman[77486]: 2025-10-14 08:28:40.762063526 +0000 UTC m=+0.097272897 container exec_died 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., com.redhat.component=openstack-collectd-container, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, container_name=collectd, distribution-scope=public, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 collectd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, vcs-type=git, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1, io.buildah.version=1.33.12, tcib_managed=true, managed_by=tripleo_ansible, name=rhosp17/openstack-collectd, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, version=17.1.9, build-date=2025-07-21T13:04:03, release=2, summary=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20250721.1) Oct 14 04:28:40 localhost systemd[1]: 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8.service: Deactivated successfully. Oct 14 04:28:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea. 
Oct 14 04:28:42 localhost podman[77507]: 2025-10-14 08:28:42.720010162 +0000 UTC m=+0.066550384 container health_status 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, managed_by=tripleo_ansible, release=1, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-type=git, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, vendor=Red Hat, Inc., build-date=2025-07-21T13:27:15, container_name=iscsid, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhosp17/openstack-iscsid, architecture=x86_64, tcib_managed=true, 
vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.9) Oct 14 04:28:42 localhost podman[77507]: 2025-10-14 08:28:42.758199479 +0000 UTC m=+0.104739711 container exec_died 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, name=rhosp17/openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, managed_by=tripleo_ansible, io.openshift.expose-services=, vcs-type=git, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., container_name=iscsid, 
tcib_managed=true, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, architecture=x86_64, com.redhat.component=openstack-iscsid-container, config_id=tripleo_step3, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, release=1, io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T13:27:15) Oct 14 04:28:42 localhost systemd[1]: 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea.service: Deactivated successfully. Oct 14 04:28:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638. Oct 14 04:28:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884. Oct 14 04:28:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f. Oct 14 04:28:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e. Oct 14 04:28:47 localhost systemd[1]: tmp-crun.bCgPfa.mount: Deactivated successfully. 
Oct 14 04:28:47 localhost podman[77528]: 2025-10-14 08:28:47.76299932 +0000 UTC m=+0.096207158 container health_status d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, container_name=logrotate_crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, version=17.1.9, io.openshift.tags=rhosp osp openstack osp-17.1, architecture=x86_64, batch=17.1_20250721.1, io.buildah.version=1.33.12, vendor=Red Hat, Inc., config_id=tripleo_step4, tcib_managed=true, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, release=1, build-date=2025-07-21T13:07:52, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, name=rhosp17/openstack-cron, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, io.openshift.expose-services=) Oct 14 04:28:47 localhost podman[77538]: 2025-10-14 08:28:47.77241356 +0000 UTC m=+0.100485805 container health_status def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, vcs-type=git, architecture=x86_64, container_name=ceilometer_agent_compute, build-date=2025-07-21T14:45:33, release=1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, vendor=Red Hat, Inc., distribution-scope=public, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.9, name=rhosp17/openstack-ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1, config_id=tripleo_step4, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20250721.1, tcib_managed=true, com.redhat.component=openstack-ceilometer-compute-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute) Oct 14 04:28:47 localhost podman[77527]: 2025-10-14 08:28:47.833134959 +0000 UTC m=+0.175798918 container health_status 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, distribution-scope=public, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, batch=17.1_20250721.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, container_name=nova_migration_target, com.redhat.component=openstack-nova-compute-container, version=17.1.9, build-date=2025-07-21T14:48:37, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, tcib_managed=true, io.openshift.expose-services=, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.33.12, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute) Oct 14 04:28:47 localhost podman[77528]: 2025-10-14 08:28:47.850513406 +0000 UTC m=+0.183721304 container exec_died d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.buildah.version=1.33.12, release=1, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-type=git, vendor=Red Hat, Inc., architecture=x86_64, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, version=17.1.9, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, container_name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, batch=17.1_20250721.1, build-date=2025-07-21T13:07:52, io.openshift.expose-services=, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, tcib_managed=true) Oct 14 04:28:47 localhost systemd[1]: d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f.service: Deactivated successfully. 
Oct 14 04:28:47 localhost podman[77526]: 2025-10-14 08:28:47.748876769 +0000 UTC m=+0.089415175 container health_status 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, com.redhat.license_terms=https://www.redhat.com/agreements, release=1, io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-ipmi, io.buildah.version=1.33.12, build-date=2025-07-21T15:29:47, architecture=x86_64, batch=17.1_20250721.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, summary=Red Hat OpenStack Platform 17.1 
ceilometer-ipmi, vendor=Red Hat, Inc., managed_by=tripleo_ansible, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, distribution-scope=public, version=17.1.9, io.openshift.tags=rhosp osp openstack osp-17.1, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, vcs-type=git) Oct 14 04:28:47 localhost podman[77526]: 2025-10-14 08:28:47.938112965 +0000 UTC m=+0.278651361 container exec_died 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, distribution-scope=public, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, vcs-type=git, io.openshift.expose-services=, io.buildah.version=1.33.12, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step4, managed_by=tripleo_ansible, batch=17.1_20250721.1, name=rhosp17/openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.9, build-date=2025-07-21T15:29:47, container_name=ceilometer_agent_ipmi, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.component=openstack-ceilometer-ipmi-container, release=1, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}) Oct 14 04:28:47 localhost systemd[1]: 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638.service: Deactivated successfully. Oct 14 04:28:47 localhost podman[77538]: 2025-10-14 08:28:47.95672082 +0000 UTC m=+0.284793125 container exec_died def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, vendor=Red Hat, Inc., version=17.1.9, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, batch=17.1_20250721.1, container_name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-07-21T14:45:33, com.redhat.component=openstack-ceilometer-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 
'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, vcs-type=git, architecture=x86_64, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1) Oct 14 04:28:47 localhost systemd[1]: def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e.service: Deactivated successfully. 
Oct 14 04:28:48 localhost podman[77527]: 2025-10-14 08:28:48.226151516 +0000 UTC m=+0.568815485 container exec_died 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/agreements, tcib_managed=true, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, container_name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T14:48:37, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, distribution-scope=public, release=1, io.buildah.version=1.33.12, name=rhosp17/openstack-nova-compute, version=17.1.9, io.openshift.expose-services=, managed_by=tripleo_ansible, architecture=x86_64, vcs-type=git, config_id=tripleo_step4, batch=17.1_20250721.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container) Oct 14 04:28:48 localhost systemd[1]: 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884.service: Deactivated successfully. Oct 14 04:28:48 localhost systemd[1]: tmp-crun.hpUVrc.mount: Deactivated successfully. Oct 14 04:28:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e. Oct 14 04:28:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5. Oct 14 04:28:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2. 
Oct 14 04:28:50 localhost podman[77622]: 2025-10-14 08:28:50.718430254 +0000 UTC m=+0.063990159 container health_status 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, io.openshift.expose-services=, architecture=x86_64, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, tcib_managed=true, vcs-type=git, config_id=tripleo_step4, build-date=2025-07-21T16:28:53, container_name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 
neutron-metadata-agent-ovn, distribution-scope=public, name=rhosp17/openstack-neutron-metadata-agent-ovn, release=1, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, version=17.1.9, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3) Oct 14 04:28:50 localhost podman[77623]: 2025-10-14 08:28:50.781041739 +0000 UTC m=+0.122722638 container health_status a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, io.buildah.version=1.33.12, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, com.redhat.component=openstack-ovn-controller-container, release=1, maintainer=OpenStack TripleO Team, distribution-scope=public, tcib_managed=true, container_name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, version=17.1.9, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T13:28:44, config_id=tripleo_step4, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhosp17/openstack-ovn-controller, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20250721.1, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}) Oct 14 04:28:50 localhost podman[77629]: 2025-10-14 08:28:50.748912992 +0000 UTC m=+0.082912632 container health_status ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=unhealthy, release=1, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.9, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc., architecture=x86_64, io.openshift.expose-services=, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20250721.1, config_id=tripleo_step5, container_name=nova_compute, distribution-scope=public, name=rhosp17/openstack-nova-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, io.buildah.version=1.33.12, tcib_managed=true, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a-4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck 
5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, build-date=2025-07-21T14:48:37, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/agreements) Oct 14 04:28:50 localhost podman[77623]: 2025-10-14 08:28:50.808994601 +0000 UTC m=+0.150675490 container exec_died a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, architecture=x86_64, 
vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, container_name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, version=17.1.9, maintainer=OpenStack TripleO Team, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, distribution-scope=public, tcib_managed=true, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1, config_id=tripleo_step4, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, managed_by=tripleo_ansible, io.buildah.version=1.33.12, build-date=2025-07-21T13:28:44, description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1, com.redhat.component=openstack-ovn-controller-container) Oct 14 04:28:50 localhost systemd[1]: a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5.service: Deactivated successfully. 
Oct 14 04:28:50 localhost podman[77629]: 2025-10-14 08:28:50.832731599 +0000 UTC m=+0.166731299 container exec_died ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, vendor=Red Hat, Inc., distribution-scope=public, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.33.12, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a-4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', 
'/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, maintainer=OpenStack TripleO Team, config_id=tripleo_step5, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-nova-compute, release=1, batch=17.1_20250721.1, container_name=nova_compute, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-07-21T14:48:37, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, version=17.1.9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute) Oct 14 04:28:50 localhost podman[77629]: unhealthy Oct 14 04:28:50 localhost systemd[1]: ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2.service: Main process exited, code=exited, status=1/FAILURE Oct 14 04:28:50 localhost systemd[1]: ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2.service: Failed with result 'exit-code'. 
Oct 14 04:28:50 localhost podman[77622]: 2025-10-14 08:28:50.859415723 +0000 UTC m=+0.204975638 container exec_died 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, io.openshift.expose-services=, version=17.1.9, name=rhosp17/openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, release=1, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, batch=17.1_20250721.1, build-date=2025-07-21T16:28:53, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.33.12, tcib_managed=true, vendor=Red Hat, Inc., config_id=tripleo_step4, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}) Oct 14 04:28:50 localhost systemd[1]: 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e.service: Deactivated successfully. Oct 14 04:28:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6. 
Oct 14 04:28:55 localhost podman[77690]: 2025-10-14 08:28:55.723882674 +0000 UTC m=+0.072607894 container health_status 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, version=17.1.9, container_name=metrics_qdr, io.buildah.version=1.33.12, vcs-type=git, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, maintainer=OpenStack TripleO Team, build-date=2025-07-21T13:07:59, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, com.redhat.component=openstack-qdrouterd-container, batch=17.1_20250721.1, vendor=Red Hat, Inc., tcib_managed=true, config_id=tripleo_step1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, 
description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, release=1, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, io.openshift.tags=rhosp osp openstack osp-17.1) Oct 14 04:28:55 localhost podman[77690]: 2025-10-14 08:28:55.92721886 +0000 UTC m=+0.275944040 container exec_died 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, tcib_managed=true, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.9, batch=17.1_20250721.1, vendor=Red Hat, Inc., build-date=2025-07-21T13:07:59, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, com.redhat.license_terms=https://www.redhat.com/agreements, release=1, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, io.openshift.tags=rhosp osp openstack osp-17.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.buildah.version=1.33.12, io.openshift.expose-services=, name=rhosp17/openstack-qdrouterd, config_id=tripleo_step1, com.redhat.component=openstack-qdrouterd-container, distribution-scope=public) Oct 14 04:28:55 localhost systemd[1]: 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6.service: Deactivated successfully. Oct 14 04:29:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8. Oct 14 04:29:11 localhost podman[77720]: 2025-10-14 08:29:11.743562687 +0000 UTC m=+0.086370504 container health_status 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, batch=17.1_20250721.1, name=rhosp17/openstack-collectd, vcs-type=git, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, build-date=2025-07-21T13:04:03, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 
'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, release=2, container_name=collectd, io.buildah.version=1.33.12, managed_by=tripleo_ansible, distribution-scope=public, version=17.1.9, maintainer=OpenStack TripleO Team, architecture=x86_64, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2) Oct 14 04:29:11 localhost podman[77720]: 2025-10-14 08:29:11.778834968 +0000 UTC m=+0.121642765 container exec_died 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, version=17.1.9, batch=17.1_20250721.1, config_data={'cap_add': ['IPC_LOCK'], 
'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, build-date=2025-07-21T13:04:03, io.openshift.tags=rhosp osp openstack osp-17.1, container_name=collectd, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, release=2, vcs-type=git, io.buildah.version=1.33.12, config_id=tripleo_step3, vendor=Red Hat, Inc., architecture=x86_64, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 collectd, 
name=rhosp17/openstack-collectd) Oct 14 04:29:11 localhost systemd[1]: 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8.service: Deactivated successfully. Oct 14 04:29:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea. Oct 14 04:29:13 localhost podman[77740]: 2025-10-14 08:29:13.732561824 +0000 UTC m=+0.077956893 container health_status 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, distribution-scope=public, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, maintainer=OpenStack TripleO Team, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, release=1, name=rhosp17/openstack-iscsid, build-date=2025-07-21T13:27:15, version=17.1.9, com.redhat.component=openstack-iscsid-container, summary=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20250721.1, container_name=iscsid, io.buildah.version=1.33.12, vendor=Red Hat, Inc.) Oct 14 04:29:13 localhost podman[77740]: 2025-10-14 08:29:13.745145809 +0000 UTC m=+0.090540918 container exec_died 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, container_name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc., release=1, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, batch=17.1_20250721.1, config_id=tripleo_step3, name=rhosp17/openstack-iscsid, tcib_managed=true, managed_by=tripleo_ansible, version=17.1.9, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, build-date=2025-07-21T13:27:15, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-iscsid-container) Oct 14 04:29:13 localhost systemd[1]: 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea.service: Deactivated successfully. Oct 14 04:29:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638. Oct 14 04:29:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884. Oct 14 04:29:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f. Oct 14 04:29:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e. Oct 14 04:29:18 localhost systemd[1]: tmp-crun.L2Lzig.mount: Deactivated successfully. 
Oct 14 04:29:18 localhost podman[77759]: 2025-10-14 08:29:18.747428165 +0000 UTC m=+0.089534819 container health_status 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, maintainer=OpenStack TripleO Team, version=17.1.9, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, tcib_managed=true, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, name=rhosp17/openstack-nova-compute, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20250721.1, distribution-scope=public, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.33.12, description=Red Hat OpenStack Platform 17.1 nova-compute, 
vendor=Red Hat, Inc., build-date=2025-07-21T14:48:37, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, release=1, config_id=tripleo_step4, container_name=nova_migration_target) Oct 14 04:29:18 localhost podman[77758]: 2025-10-14 08:29:18.784760886 +0000 UTC m=+0.125600861 container health_status 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, managed_by=tripleo_ansible, release=1, distribution-scope=public, maintainer=OpenStack TripleO Team, batch=17.1_20250721.1, config_id=tripleo_step4, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.33.12, io.openshift.expose-services=, build-date=2025-07-21T15:29:47, io.openshift.tags=rhosp osp openstack osp-17.1, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/agreements, version=17.1.9, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, name=rhosp17/openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi) Oct 14 04:29:18 localhost podman[77761]: 2025-10-14 08:29:18.796883718 +0000 UTC m=+0.129227220 container health_status def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, version=17.1.9, build-date=2025-07-21T14:45:33, architecture=x86_64, distribution-scope=public, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, container_name=ceilometer_agent_compute, io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-compute, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, com.redhat.component=openstack-ceilometer-compute-container, com.redhat.license_terms=https://www.redhat.com/agreements, release=1, vcs-type=git, managed_by=tripleo_ansible, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20250721.1, config_id=tripleo_step4) Oct 14 04:29:18 localhost podman[77760]: 2025-10-14 08:29:18.76673714 +0000 UTC m=+0.102004179 container health_status d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., io.buildah.version=1.33.12, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 
'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.tags=rhosp osp openstack osp-17.1, io.openshift.expose-services=, config_id=tripleo_step4, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, distribution-scope=public, maintainer=OpenStack TripleO Team, batch=17.1_20250721.1, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 cron, build-date=2025-07-21T13:07:52, name=rhosp17/openstack-cron, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, com.redhat.component=openstack-cron-container, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, version=17.1.9, container_name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, vcs-type=git) Oct 14 04:29:18 localhost podman[77761]: 2025-10-14 08:29:18.822909773 +0000 UTC m=+0.155253305 container exec_died def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': 
'/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, tcib_managed=true, release=1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-07-21T14:45:33, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.component=openstack-ceilometer-compute-container, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., vcs-type=git, version=17.1.9, managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-compute, architecture=x86_64, batch=17.1_20250721.1, config_id=tripleo_step4, container_name=ceilometer_agent_compute, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.33.12, io.openshift.expose-services=) Oct 14 04:29:18 localhost systemd[1]: 
def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e.service: Deactivated successfully. Oct 14 04:29:18 localhost podman[77758]: 2025-10-14 08:29:18.837057544 +0000 UTC m=+0.177897519 container exec_died 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, vendor=Red Hat, Inc., version=17.1.9, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-07-21T15:29:47, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-ceilometer-ipmi, io.openshift.expose-services=, container_name=ceilometer_agent_ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step4, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.33.12, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, managed_by=tripleo_ansible, release=1, architecture=x86_64, com.redhat.component=openstack-ceilometer-ipmi-container) Oct 14 04:29:18 localhost systemd[1]: 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638.service: Deactivated successfully. Oct 14 04:29:18 localhost podman[77760]: 2025-10-14 08:29:18.84997715 +0000 UTC m=+0.185244199 container exec_died d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, 
batch=17.1_20250721.1, com.redhat.component=openstack-cron-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, name=rhosp17/openstack-cron, release=1, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, architecture=x86_64, vcs-type=git, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 cron, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, container_name=logrotate_crond, vendor=Red Hat, Inc., config_id=tripleo_step4, build-date=2025-07-21T13:07:52, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, version=17.1.9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, io.buildah.version=1.33.12) Oct 14 04:29:18 localhost systemd[1]: d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f.service: Deactivated successfully. 
Oct 14 04:29:19 localhost podman[77759]: 2025-10-14 08:29:19.086219446 +0000 UTC m=+0.428326100 container exec_died 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, config_id=tripleo_step4, name=rhosp17/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.license_terms=https://www.redhat.com/agreements, batch=17.1_20250721.1, tcib_managed=true, container_name=nova_migration_target, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-07-21T14:48:37, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, version=17.1.9, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, maintainer=OpenStack TripleO Team, io.buildah.version=1.33.12, vendor=Red Hat, Inc., managed_by=tripleo_ansible, release=1, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute) Oct 14 04:29:19 localhost systemd[1]: 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884.service: Deactivated successfully. Oct 14 04:29:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e. Oct 14 04:29:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5. Oct 14 04:29:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2. Oct 14 04:29:21 localhost podman[77852]: 2025-10-14 08:29:21.750289573 +0000 UTC m=+0.086746886 container health_status a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', 
'/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, name=rhosp17/openstack-ovn-controller, batch=17.1_20250721.1, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, vcs-type=git, build-date=2025-07-21T13:28:44, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, release=1, distribution-scope=public, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step4, tcib_managed=true, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, vendor=Red Hat, Inc.) Oct 14 04:29:21 localhost podman[77852]: 2025-10-14 08:29:21.803732245 +0000 UTC m=+0.140189578 container exec_died a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, batch=17.1_20250721.1, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, version=17.1.9, release=1, distribution-scope=public, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', 
'/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, container_name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., io.openshift.expose-services=, name=rhosp17/openstack-ovn-controller, tcib_managed=true, build-date=2025-07-21T13:28:44, com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ovn-controller) Oct 14 04:29:21 localhost systemd[1]: a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5.service: Deactivated successfully. Oct 14 04:29:21 localhost podman[77851]: 2025-10-14 08:29:21.84587903 +0000 UTC m=+0.183109136 container health_status 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, build-date=2025-07-21T16:28:53, config_id=tripleo_step4, release=1, batch=17.1_20250721.1, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, version=17.1.9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, tcib_managed=true, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, 
managed_by=tripleo_ansible, vcs-type=git, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.33.12, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.expose-services=, container_name=ovn_metadata_agent, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}) Oct 14 04:29:21 localhost podman[77853]: 2025-10-14 08:29:21.804546379 +0000 UTC m=+0.137228000 container health_status ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, 
name=nova_compute, health_status=unhealthy, config_id=tripleo_step5, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, version=17.1.9, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, build-date=2025-07-21T14:48:37, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, vendor=Red Hat, Inc., batch=17.1_20250721.1, io.buildah.version=1.33.12, release=1, managed_by=tripleo_ansible, vcs-type=git, container_name=nova_compute, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a-4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, architecture=x86_64, tcib_managed=true) Oct 14 04:29:21 localhost podman[77853]: 2025-10-14 08:29:21.88413288 +0000 UTC m=+0.216814491 container exec_died ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, container_name=nova_compute, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, tcib_managed=true, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-07-21T14:48:37, com.redhat.component=openstack-nova-compute-container, version=17.1.9, batch=17.1_20250721.1, vcs-type=git, architecture=x86_64, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 
'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a-4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, name=rhosp17/openstack-nova-compute, release=1) Oct 14 04:29:21 localhost podman[77853]: unhealthy Oct 14 04:29:21 localhost podman[77851]: 2025-10-14 08:29:21.893486448 +0000 UTC m=+0.230716554 container exec_died 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e 
(image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, container_name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, distribution-scope=public, build-date=2025-07-21T16:28:53, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-neutron-metadata-agent-ovn, tcib_managed=true, 
summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.9, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, batch=17.1_20250721.1, io.buildah.version=1.33.12, config_id=tripleo_step4, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, io.openshift.expose-services=) Oct 14 04:29:21 localhost systemd[1]: ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2.service: Main process exited, code=exited, status=1/FAILURE Oct 14 04:29:21 localhost systemd[1]: ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2.service: Failed with result 'exit-code'. Oct 14 04:29:21 localhost systemd[1]: 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e.service: Deactivated successfully. Oct 14 04:29:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6. 
Oct 14 04:29:26 localhost podman[77922]: 2025-10-14 08:29:26.743229749 +0000 UTC m=+0.084502698 container health_status 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, release=1, architecture=x86_64, tcib_managed=true, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, build-date=2025-07-21T13:07:59, config_id=tripleo_step1, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20250721.1, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.9, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, name=rhosp17/openstack-qdrouterd, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, distribution-scope=public, managed_by=tripleo_ansible) Oct 14 04:29:26 localhost podman[77922]: 2025-10-14 08:29:26.951205885 +0000 UTC m=+0.292478854 container exec_died 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, release=1, build-date=2025-07-21T13:07:59, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, maintainer=OpenStack TripleO Team, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, name=rhosp17/openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.33.12, version=17.1.9, 
com.redhat.component=openstack-qdrouterd-container, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, batch=17.1_20250721.1, io.openshift.expose-services=, container_name=metrics_qdr, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc.) Oct 14 04:29:26 localhost systemd[1]: 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6.service: Deactivated successfully. Oct 14 04:29:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8. Oct 14 04:29:42 localhost systemd[1]: tmp-crun.8t1nj7.mount: Deactivated successfully. 
Oct 14 04:29:42 localhost podman[78029]: 2025-10-14 08:29:42.752488342 +0000 UTC m=+0.089086266 container health_status 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhosp17/openstack-collectd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, config_id=tripleo_step3, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, managed_by=tripleo_ansible, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, vendor=Red Hat, Inc., tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, build-date=2025-07-21T13:04:03, container_name=collectd, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-collectd-container, release=2, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, version=17.1.9, batch=17.1_20250721.1) Oct 14 04:29:42 localhost podman[78029]: 2025-10-14 08:29:42.764971334 +0000 UTC m=+0.101569278 container exec_died 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, tcib_managed=true, architecture=x86_64, container_name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, release=2, vendor=Red Hat, Inc., config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, distribution-scope=public, name=rhosp17/openstack-collectd, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-07-21T13:04:03, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, io.buildah.version=1.33.12, description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, version=17.1.9, batch=17.1_20250721.1) Oct 14 04:29:42 localhost systemd[1]: 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8.service: Deactivated successfully. Oct 14 04:29:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea. 
Oct 14 04:29:44 localhost podman[78049]: 2025-10-14 08:29:44.748847078 +0000 UTC m=+0.085053355 container health_status 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.9, container_name=iscsid, vcs-type=git, description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-07-21T13:27:15, io.openshift.expose-services=, release=1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1, batch=17.1_20250721.1, io.buildah.version=1.33.12, config_id=tripleo_step3, vendor=Red Hat, Inc., distribution-scope=public, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, com.redhat.component=openstack-iscsid-container, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, name=rhosp17/openstack-iscsid, maintainer=OpenStack TripleO Team, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/agreements) Oct 14 04:29:44 localhost podman[78049]: 2025-10-14 08:29:44.783907432 +0000 UTC m=+0.120113719 container exec_died 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., container_name=iscsid, name=rhosp17/openstack-iscsid, vcs-type=git, version=17.1.9, io.buildah.version=1.33.12, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, architecture=x86_64, config_id=tripleo_step3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, managed_by=tripleo_ansible, release=1, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, com.redhat.component=openstack-iscsid-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, build-date=2025-07-21T13:27:15, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1) Oct 14 04:29:44 localhost systemd[1]: 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea.service: Deactivated successfully. Oct 14 04:29:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638. Oct 14 04:29:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884. Oct 14 04:29:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f. Oct 14 04:29:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e. 
Oct 14 04:29:49 localhost podman[78073]: 2025-10-14 08:29:49.730406676 +0000 UTC m=+0.067090670 container health_status def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, container_name=ceilometer_agent_compute, name=rhosp17/openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, vcs-type=git, com.redhat.component=openstack-ceilometer-compute-container, release=1, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, managed_by=tripleo_ansible, io.openshift.expose-services=, batch=17.1_20250721.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., version=17.1.9, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, build-date=2025-07-21T14:45:33, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1, io.buildah.version=1.33.12) Oct 14 04:29:49 localhost systemd[1]: tmp-crun.b08BS7.mount: Deactivated successfully. Oct 14 04:29:49 localhost podman[78072]: 2025-10-14 08:29:49.755088011 +0000 UTC m=+0.091609600 container health_status d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, tcib_managed=true, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, vcs-type=git, distribution-scope=public, io.buildah.version=1.33.12, io.openshift.expose-services=, version=17.1.9, release=1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, build-date=2025-07-21T13:07:52, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20250721.1, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond) Oct 14 04:29:49 localhost podman[78072]: 2025-10-14 08:29:49.790049352 +0000 UTC m=+0.126570901 container exec_died d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, vendor=Red Hat, Inc., version=17.1.9, com.redhat.component=openstack-cron-container, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T13:07:52, vcs-type=git, batch=17.1_20250721.1, tcib_managed=true, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, architecture=x86_64, io.buildah.version=1.33.12, name=rhosp17/openstack-cron, release=1, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements) Oct 14 04:29:49 localhost systemd[1]: tmp-crun.ha17OW.mount: Deactivated successfully. 
Oct 14 04:29:49 localhost podman[78070]: 2025-10-14 08:29:49.797264338 +0000 UTC m=+0.138718044 container health_status 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-07-21T15:29:47, io.openshift.tags=rhosp osp openstack osp-17.1, batch=17.1_20250721.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, architecture=x86_64, tcib_managed=true, name=rhosp17/openstack-ceilometer-ipmi, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1, vendor=Red Hat, Inc., container_name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 
ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, config_id=tripleo_step4, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.9, io.buildah.version=1.33.12, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, io.openshift.expose-services=) Oct 14 04:29:49 localhost systemd[1]: d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f.service: Deactivated successfully. Oct 14 04:29:49 localhost podman[78073]: 2025-10-14 08:29:49.808500152 +0000 UTC m=+0.145184146 container exec_died def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-compute-container, config_id=tripleo_step4, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, version=17.1.9, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1, vcs-type=git, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_compute, io.openshift.tags=rhosp osp openstack osp-17.1, distribution-scope=public, batch=17.1_20250721.1, build-date=2025-07-21T14:45:33, tcib_managed=true, managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-compute, architecture=x86_64, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute) Oct 14 04:29:49 localhost systemd[1]: def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e.service: Deactivated successfully. 
Oct 14 04:29:49 localhost podman[78070]: 2025-10-14 08:29:49.826915 +0000 UTC m=+0.168368766 container exec_died 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.buildah.version=1.33.12, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, release=1, vcs-type=git, batch=17.1_20250721.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, build-date=2025-07-21T15:29:47, name=rhosp17/openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, version=17.1.9, container_name=ceilometer_agent_ipmi, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
io.openshift.expose-services=, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc.) Oct 14 04:29:49 localhost podman[78071]: 2025-10-14 08:29:49.854534503 +0000 UTC m=+0.193250497 container health_status 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, release=1, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_migration_target, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, batch=17.1_20250721.1, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, distribution-scope=public, build-date=2025-07-21T14:48:37, name=rhosp17/openstack-nova-compute, io.buildah.version=1.33.12, architecture=x86_64, version=17.1.9, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container) Oct 14 04:29:49 localhost systemd[1]: 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638.service: Deactivated successfully. Oct 14 04:29:50 localhost podman[78071]: 2025-10-14 08:29:50.239170531 +0000 UTC m=+0.577886575 container exec_died 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vendor=Red Hat, Inc., batch=17.1_20250721.1, managed_by=tripleo_ansible, version=17.1.9, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step4, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, container_name=nova_migration_target, name=rhosp17/openstack-nova-compute, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team, release=1, build-date=2025-07-21T14:48:37, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, tcib_managed=true) Oct 14 04:29:50 localhost systemd[1]: 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884.service: Deactivated successfully. Oct 14 04:29:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e. Oct 14 04:29:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5. Oct 14 04:29:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2. 
Oct 14 04:29:52 localhost podman[78164]: 2025-10-14 08:29:52.748069195 +0000 UTC m=+0.087632662 container health_status 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, batch=17.1_20250721.1, vcs-type=git, architecture=x86_64, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, release=1, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, version=17.1.9, distribution-scope=public, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', 
'/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, build-date=2025-07-21T16:28:53, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.33.12, container_name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1) Oct 14 04:29:52 localhost podman[78164]: 2025-10-14 08:29:52.815149133 +0000 UTC m=+0.154712630 container exec_died 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.9, container_name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, release=1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-07-21T16:28:53, config_id=tripleo_step4, io.buildah.version=1.33.12, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, name=rhosp17/openstack-neutron-metadata-agent-ovn, batch=17.1_20250721.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, tcib_managed=true) Oct 14 04:29:52 localhost systemd[1]: 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e.service: Deactivated successfully. 
Oct 14 04:29:52 localhost podman[78166]: 2025-10-14 08:29:52.864138173 +0000 UTC m=+0.196010241 container health_status ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=unhealthy, config_id=tripleo_step5, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1, distribution-scope=public, maintainer=OpenStack TripleO Team, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a-4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', 
'/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhosp17/openstack-nova-compute, release=1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, batch=17.1_20250721.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-07-21T14:48:37, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, managed_by=tripleo_ansible, io.buildah.version=1.33.12, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, container_name=nova_compute, com.redhat.component=openstack-nova-compute-container, version=17.1.9, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1) Oct 14 04:29:52 localhost podman[78165]: 2025-10-14 08:29:52.819243875 +0000 UTC m=+0.151033220 container health_status a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, io.buildah.version=1.33.12, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, vcs-type=git, container_name=ovn_controller, batch=17.1_20250721.1, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-07-21T13:28:44, version=17.1.9, name=rhosp17/openstack-ovn-controller, maintainer=OpenStack TripleO Team, 
com.redhat.component=openstack-ovn-controller-container, architecture=x86_64, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, summary=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller) Oct 14 04:29:52 localhost podman[78165]: 2025-10-14 08:29:52.899959859 +0000 UTC m=+0.231749184 container exec_died a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, architecture=x86_64, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, 
io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1, com.redhat.component=openstack-ovn-controller-container, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, batch=17.1_20250721.1, container_name=ovn_controller, distribution-scope=public, managed_by=tripleo_ansible, vendor=Red Hat, Inc., tcib_managed=true, vcs-type=git, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T13:28:44, version=17.1.9, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements) Oct 14 04:29:52 localhost systemd[1]: a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5.service: Deactivated successfully. Oct 14 04:29:52 localhost podman[78166]: 2025-10-14 08:29:52.946179996 +0000 UTC m=+0.278052024 container exec_died ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=nova_compute, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a-4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, distribution-scope=public, release=1, com.redhat.component=openstack-nova-compute-container, batch=17.1_20250721.1, build-date=2025-07-21T14:48:37, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, vcs-type=git, io.buildah.version=1.33.12, version=17.1.9, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, managed_by=tripleo_ansible, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, name=rhosp17/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute) Oct 14 
04:29:52 localhost podman[78166]: unhealthy Oct 14 04:29:52 localhost systemd[1]: ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2.service: Main process exited, code=exited, status=1/FAILURE Oct 14 04:29:52 localhost systemd[1]: ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2.service: Failed with result 'exit-code'. Oct 14 04:29:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6. Oct 14 04:29:57 localhost podman[78235]: 2025-10-14 08:29:57.744330892 +0000 UTC m=+0.086636762 container health_status 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, config_id=tripleo_step1, architecture=x86_64, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, com.redhat.component=openstack-qdrouterd-container, version=17.1.9, maintainer=OpenStack TripleO Team, release=1, tcib_managed=true, build-date=2025-07-21T13:07:59, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, batch=17.1_20250721.1, managed_by=tripleo_ansible, vendor=Red Hat, Inc., distribution-scope=public, io.buildah.version=1.33.12, container_name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-qdrouterd) Oct 14 04:29:57 localhost podman[78235]: 2025-10-14 08:29:57.930175368 +0000 UTC m=+0.272481198 container exec_died 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, architecture=x86_64, version=17.1.9, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, name=rhosp17/openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, container_name=metrics_qdr, release=1, build-date=2025-07-21T13:07:59, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team) Oct 14 04:29:57 localhost systemd[1]: 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6.service: Deactivated successfully. Oct 14 04:30:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8. 
Oct 14 04:30:13 localhost podman[78265]: 2025-10-14 08:30:13.744496323 +0000 UTC m=+0.083705834 container health_status 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, distribution-scope=public, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T13:04:03, com.redhat.component=openstack-collectd-container, name=rhosp17/openstack-collectd, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, batch=17.1_20250721.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, release=2, summary=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.33.12, version=17.1.9, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, tcib_managed=true, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, managed_by=tripleo_ansible, config_id=tripleo_step3) Oct 14 04:30:13 localhost podman[78265]: 2025-10-14 08:30:13.758123129 +0000 UTC m=+0.097332680 container exec_died 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, version=17.1.9, batch=17.1_20250721.1, config_id=tripleo_step3, com.redhat.component=openstack-collectd-container, io.buildah.version=1.33.12, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, managed_by=tripleo_ansible, container_name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, release=2, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, build-date=2025-07-21T13:04:03, summary=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64) Oct 14 04:30:13 localhost systemd[1]: 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8.service: Deactivated successfully. Oct 14 04:30:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea. Oct 14 04:30:15 localhost systemd[1]: tmp-crun.jmP9KH.mount: Deactivated successfully. 
Oct 14 04:30:15 localhost podman[78285]: 2025-10-14 08:30:15.73348265 +0000 UTC m=+0.073671645 container health_status 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, container_name=iscsid, io.buildah.version=1.33.12, release=1, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., name=rhosp17/openstack-iscsid, architecture=x86_64, com.redhat.component=openstack-iscsid-container, vcs-type=git, version=17.1.9, description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, io.openshift.expose-services=, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', 
'/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, managed_by=tripleo_ansible, batch=17.1_20250721.1, build-date=2025-07-21T13:27:15) Oct 14 04:30:15 localhost podman[78285]: 2025-10-14 08:30:15.743291913 +0000 UTC m=+0.083480898 container exec_died 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.33.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, name=rhosp17/openstack-iscsid, version=17.1.9, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, batch=17.1_20250721.1, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, container_name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, vendor=Red Hat, Inc., distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, com.redhat.component=openstack-iscsid-container, tcib_managed=true, vcs-type=git, build-date=2025-07-21T13:27:15, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2) Oct 14 04:30:15 localhost systemd[1]: 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea.service: Deactivated successfully. Oct 14 04:30:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638. Oct 14 04:30:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884. Oct 14 04:30:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f. Oct 14 04:30:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e. Oct 14 04:30:20 localhost systemd[1]: tmp-crun.kIuqHi.mount: Deactivated successfully. 
Oct 14 04:30:20 localhost podman[78305]: 2025-10-14 08:30:20.75072059 +0000 UTC m=+0.090082664 container health_status 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-ipmi, version=17.1.9, build-date=2025-07-21T15:29:47, config_id=tripleo_step4, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.tags=rhosp osp openstack osp-17.1, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, vcs-type=git, io.buildah.version=1.33.12, managed_by=tripleo_ansible, release=1, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, tcib_managed=true, container_name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, batch=17.1_20250721.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}) Oct 14 04:30:20 localhost podman[78311]: 2025-10-14 08:30:20.804229154 +0000 UTC m=+0.134064574 container health_status d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, config_id=tripleo_step4, name=rhosp17/openstack-cron, io.openshift.expose-services=, tcib_managed=true, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, container_name=logrotate_crond, vcs-type=git, io.openshift.tags=rhosp osp 
openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, batch=17.1_20250721.1, com.redhat.component=openstack-cron-container, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.33.12, build-date=2025-07-21T13:07:52, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, version=17.1.9, release=1) Oct 14 04:30:20 localhost podman[78314]: 2025-10-14 08:30:20.777985843 +0000 UTC m=+0.095638570 container health_status def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, version=17.1.9, tcib_managed=true, container_name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-compute-container, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, managed_by=tripleo_ansible, config_id=tripleo_step4, batch=17.1_20250721.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 
'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-07-21T14:45:33, architecture=x86_64, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, release=1, vcs-type=git) Oct 14 04:30:20 localhost podman[78305]: 2025-10-14 08:30:20.888423422 +0000 UTC m=+0.227785516 container exec_died 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, release=1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2025-07-21T15:29:47, io.buildah.version=1.33.12, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1, architecture=x86_64, vcs-type=git, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-ipmi-container, batch=17.1_20250721.1, maintainer=OpenStack TripleO Team, version=17.1.9, config_id=tripleo_step4, name=rhosp17/openstack-ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., tcib_managed=true) Oct 14 04:30:20 localhost systemd[1]: 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638.service: Deactivated successfully. 
Oct 14 04:30:20 localhost podman[78306]: 2025-10-14 08:30:20.85982444 +0000 UTC m=+0.193058311 container health_status 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, tcib_managed=true, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, version=17.1.9, batch=17.1_20250721.1, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, distribution-scope=public, config_id=tripleo_step4, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, build-date=2025-07-21T14:48:37, release=1, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git) Oct 14 04:30:20 localhost podman[78314]: 2025-10-14 08:30:20.914106597 +0000 UTC m=+0.231759354 container exec_died def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1, batch=17.1_20250721.1, vcs-type=git, io.buildah.version=1.33.12, com.redhat.component=openstack-ceilometer-compute-container, config_id=tripleo_step4, vendor=Red Hat, Inc., io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-compute, build-date=2025-07-21T14:45:33, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, tcib_managed=true, container_name=ceilometer_agent_compute, io.openshift.tags=rhosp osp openstack osp-17.1, version=17.1.9, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/agreements, managed_by=tripleo_ansible) Oct 14 04:30:20 localhost systemd[1]: def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e.service: Deactivated successfully. 
Oct 14 04:30:20 localhost podman[78311]: 2025-10-14 08:30:20.943509443 +0000 UTC m=+0.273344923 container exec_died d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, vcs-type=git, com.redhat.component=openstack-cron-container, distribution-scope=public, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, vendor=Red Hat, Inc., config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 cron, release=1, io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T13:07:52, container_name=logrotate_crond, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, version=17.1.9, name=rhosp17/openstack-cron, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, batch=17.1_20250721.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12) Oct 14 04:30:20 localhost systemd[1]: d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f.service: Deactivated successfully. Oct 14 04:30:21 localhost podman[78306]: 2025-10-14 08:30:21.264168814 +0000 UTC m=+0.597402605 container exec_died 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, version=17.1.9, tcib_managed=true, architecture=x86_64, io.buildah.version=1.33.12, config_id=tripleo_step4, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1, managed_by=tripleo_ansible, build-date=2025-07-21T14:48:37, maintainer=OpenStack TripleO Team, vcs-type=git, com.redhat.component=openstack-nova-compute-container, container_name=nova_migration_target, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc.) Oct 14 04:30:21 localhost systemd[1]: 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884.service: Deactivated successfully. Oct 14 04:30:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e. Oct 14 04:30:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5. Oct 14 04:30:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2. Oct 14 04:30:23 localhost systemd[1]: tmp-crun.o8T3xd.mount: Deactivated successfully. 
Oct 14 04:30:23 localhost podman[78463]: 2025-10-14 08:30:23.754813845 +0000 UTC m=+0.092230118 container health_status 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, tcib_managed=true, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., architecture=x86_64, managed_by=tripleo_ansible, batch=17.1_20250721.1, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', 
'/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, version=17.1.9, build-date=2025-07-21T16:28:53, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step4, release=1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Oct 14 04:30:23 localhost podman[78465]: 2025-10-14 08:30:23.806909546 +0000 UTC m=+0.140032412 container health_status ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=unhealthy, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a-4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, container_name=nova_compute, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, distribution-scope=public, io.buildah.version=1.33.12, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, build-date=2025-07-21T14:48:37, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, architecture=x86_64, batch=17.1_20250721.1, vendor=Red Hat, Inc.) 
Oct 14 04:30:23 localhost podman[78465]: 2025-10-14 08:30:23.866869663 +0000 UTC m=+0.199992499 container exec_died ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/agreements, tcib_managed=true, version=17.1.9, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-nova-compute, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc., vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a-4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', 
'/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step5, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, batch=17.1_20250721.1, build-date=2025-07-21T14:48:37, managed_by=tripleo_ansible, release=1, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d) Oct 14 04:30:23 localhost systemd[1]: ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2.service: Deactivated successfully. 
Oct 14 04:30:23 localhost podman[78463]: 2025-10-14 08:30:23.888657531 +0000 UTC m=+0.226073794 container exec_died 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, batch=17.1_20250721.1, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, com.redhat.license_terms=https://www.redhat.com/agreements, version=17.1.9, release=1, io.openshift.expose-services=, name=rhosp17/openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, vcs-type=git, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', 
'/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, vendor=Red Hat, Inc., build-date=2025-07-21T16:28:53, config_id=tripleo_step4) Oct 14 04:30:23 localhost podman[78464]: 2025-10-14 08:30:23.867734228 +0000 UTC m=+0.202276196 container health_status a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, release=1, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, version=17.1.9, summary=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, architecture=x86_64, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ovn-controller, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, build-date=2025-07-21T13:28:44, config_id=tripleo_step4, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 
'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true, vcs-type=git, com.redhat.component=openstack-ovn-controller-container, vendor=Red Hat, Inc.) Oct 14 04:30:23 localhost systemd[1]: 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e.service: Deactivated successfully. Oct 14 04:30:23 localhost podman[78464]: 2025-10-14 08:30:23.951263597 +0000 UTC m=+0.285805605 container exec_died a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, name=rhosp17/openstack-ovn-controller, distribution-scope=public, build-date=2025-07-21T13:28:44, summary=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, 
com.redhat.component=openstack-ovn-controller-container, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, config_id=tripleo_step4, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, architecture=x86_64, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.9, container_name=ovn_controller, release=1, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=) Oct 14 04:30:23 localhost systemd[1]: a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5.service: Deactivated successfully. Oct 14 04:30:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6. 
Oct 14 04:30:28 localhost podman[78559]: 2025-10-14 08:30:28.737600118 +0000 UTC m=+0.082410315 container health_status 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vcs-type=git, build-date=2025-07-21T13:07:59, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/agreements, release=1, tcib_managed=true, architecture=x86_64, io.buildah.version=1.33.12, name=rhosp17/openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, distribution-scope=public, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, container_name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, batch=17.1_20250721.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., version=17.1.9) Oct 14 04:30:28 localhost podman[78559]: 2025-10-14 08:30:28.938803992 +0000 UTC m=+0.283614139 container exec_died 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, container_name=metrics_qdr, maintainer=OpenStack TripleO Team, vcs-type=git, config_id=tripleo_step1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, io.openshift.expose-services=, batch=17.1_20250721.1, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T13:07:59, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, version=17.1.9, io.buildah.version=1.33.12, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, release=1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}) Oct 14 04:30:28 localhost systemd[1]: 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6.service: Deactivated successfully. Oct 14 04:30:30 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Oct 14 04:30:30 localhost recover_tripleo_nova_virtqemud[78588]: 62551 Oct 14 04:30:30 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Oct 14 04:30:30 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. Oct 14 04:30:30 localhost systemd[1]: libpod-1cfa0ca4ab729c0c5236a1f4f1a8f080074a2aefce02fc400d99485d2f6f506c.scope: Deactivated successfully. 
Oct 14 04:30:31 localhost podman[78589]: 2025-10-14 08:30:31.002300808 +0000 UTC m=+0.048791915 container died 1cfa0ca4ab729c0c5236a1f4f1a8f080074a2aefce02fc400d99485d2f6f506c (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_wait_for_compute_service, io.buildah.version=1.33.12, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, release=1, name=rhosp17/openstack-nova-compute, version=17.1.9, config_id=tripleo_step5, architecture=x86_64, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., build-date=2025-07-21T14:48:37, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, container_name=nova_wait_for_compute_service, description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'detach': False, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', '__OS_DEBUG': 'true', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'start_order': 4, 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova_compute_wait_for_compute_service.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/log/containers/nova:/var/log/nova', '/var/lib/container-config-scripts:/container-config-scripts']}, tcib_managed=true, batch=17.1_20250721.1) Oct 14 04:30:31 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-1cfa0ca4ab729c0c5236a1f4f1a8f080074a2aefce02fc400d99485d2f6f506c-userdata-shm.mount: Deactivated successfully. Oct 14 04:30:31 localhost systemd[1]: var-lib-containers-storage-overlay-9742bc003ef67d7b0fcfbbfc43cb220e80e11c29322c24576aca8c567b6093e5-merged.mount: Deactivated successfully. Oct 14 04:30:31 localhost podman[78589]: 2025-10-14 08:30:31.03563312 +0000 UTC m=+0.082124157 container cleanup 1cfa0ca4ab729c0c5236a1f4f1a8f080074a2aefce02fc400d99485d2f6f506c (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_wait_for_compute_service, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, config_id=tripleo_step5, architecture=x86_64, distribution-scope=public, io.buildah.version=1.33.12, config_data={'detach': False, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', '__OS_DEBUG': 'true', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'start_order': 4, 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova_compute_wait_for_compute_service.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/log/containers/nova:/var/log/nova', '/var/lib/container-config-scripts:/container-config-scripts']}, name=rhosp17/openstack-nova-compute, vendor=Red Hat, Inc., build-date=2025-07-21T14:48:37, container_name=nova_wait_for_compute_service, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true, managed_by=tripleo_ansible, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.9, batch=17.1_20250721.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team) Oct 14 04:30:31 localhost systemd[1]: libpod-conmon-1cfa0ca4ab729c0c5236a1f4f1a8f080074a2aefce02fc400d99485d2f6f506c.scope: Deactivated successfully. 
Oct 14 04:30:31 localhost python3[76600]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_wait_for_compute_service --conmon-pidfile /run/nova_wait_for_compute_service.pid --detach=False --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env __OS_DEBUG=true --env TRIPLEO_CONFIG_HASH=4d186a6228facd5bcddf9bcc145eb470 --label config_id=tripleo_step5 --label container_name=nova_wait_for_compute_service --label managed_by=tripleo_ansible --label config_data={'detach': False, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', '__OS_DEBUG': 'true', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'start_order': 4, 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova_compute_wait_for_compute_service.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/log/containers/nova:/var/log/nova', '/var/lib/container-config-scripts:/container-config-scripts']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_wait_for_compute_service.log --network host --user nova --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume 
/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/lib/kolla/config_files/nova_compute_wait_for_compute_service.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro --volume /var/log/containers/nova:/var/log/nova --volume /var/lib/container-config-scripts:/container-config-scripts registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1 Oct 14 04:30:31 localhost python3[78643]: ansible-file Invoked with path=/etc/systemd/system/tripleo_nova_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 04:30:31 localhost python3[78659]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_nova_compute_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Oct 14 04:30:32 localhost python3[78720]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1760430631.9621713-119089-103160577797320/source dest=/etc/systemd/system/tripleo_nova_compute.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 04:30:32 localhost python3[78736]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None Oct 14 
04:30:32 localhost systemd[1]: Reloading. Oct 14 04:30:33 localhost systemd-rc-local-generator[78759]: /etc/rc.d/rc.local is not marked executable, skipping. Oct 14 04:30:33 localhost systemd-sysv-generator[78763]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Oct 14 04:30:33 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Oct 14 04:30:34 localhost python3[78788]: ansible-systemd Invoked with state=restarted name=tripleo_nova_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Oct 14 04:30:35 localhost systemd[1]: Reloading. Oct 14 04:30:35 localhost systemd-rc-local-generator[78817]: /etc/rc.d/rc.local is not marked executable, skipping. Oct 14 04:30:35 localhost systemd-sysv-generator[78821]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Oct 14 04:30:35 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Oct 14 04:30:35 localhost systemd[1]: Starting nova_compute container... Oct 14 04:30:35 localhost tripleo-start-podman-container[78828]: Creating additional drop-in dependency for "nova_compute" (ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2) Oct 14 04:30:35 localhost systemd[1]: Reloading. Oct 14 04:30:35 localhost systemd-sysv-generator[78886]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. 
Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Oct 14 04:30:35 localhost systemd-rc-local-generator[78881]: /etc/rc.d/rc.local is not marked executable, skipping. Oct 14 04:30:36 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Oct 14 04:30:36 localhost systemd[1]: Started nova_compute container. Oct 14 04:30:36 localhost python3[78925]: ansible-file Invoked with path=/var/lib/container-puppet/container-puppet-tasks5.json state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 04:30:38 localhost python3[79046]: ansible-container_puppet_config Invoked with check_mode=False config_vol_prefix=/var/lib/config-data debug=True net_host=True no_archive=True puppet_config=/var/lib/container-puppet/container-puppet-tasks5.json short_hostname=np0005486733 step=5 update_config_hash_only=False Oct 14 04:30:38 localhost python3[79092]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 04:30:39 localhost python3[79122]: ansible-container_config_data Invoked with config_path=/var/lib/tripleo-config/container-puppet-config/step_5 config_pattern=container-puppet-*.json config_overrides={} debug=True Oct 14 04:30:44 localhost 
systemd[1]: Started /usr/bin/podman healthcheck run 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8. Oct 14 04:30:44 localhost systemd[1]: tmp-crun.XPrLkz.mount: Deactivated successfully. Oct 14 04:30:44 localhost podman[79155]: 2025-10-14 08:30:44.759961233 +0000 UTC m=+0.099039572 container health_status 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, com.redhat.component=openstack-collectd-container, config_id=tripleo_step3, name=rhosp17/openstack-collectd, version=17.1.9, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', 
'/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, build-date=2025-07-21T13:04:03, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, release=2, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, vendor=Red Hat, Inc., batch=17.1_20250721.1, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, container_name=collectd) Oct 14 04:30:44 localhost podman[79155]: 2025-10-14 08:30:44.774046612 +0000 UTC m=+0.113124951 container exec_died 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, distribution-scope=public, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, version=17.1.9, vcs-type=git, build-date=2025-07-21T13:04:03, io.buildah.version=1.33.12, tcib_managed=true, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, release=2, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, com.redhat.component=openstack-collectd-container, config_id=tripleo_step3, container_name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 collectd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd) Oct 14 04:30:44 localhost systemd[1]: 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8.service: Deactivated successfully. Oct 14 04:30:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea. 
Oct 14 04:30:46 localhost podman[79176]: 2025-10-14 08:30:46.734749977 +0000 UTC m=+0.078677016 container health_status 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, container_name=iscsid, version=17.1.9, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, release=1, maintainer=OpenStack TripleO Team, tcib_managed=true, build-date=2025-07-21T13:27:15, managed_by=tripleo_ansible, architecture=x86_64, com.redhat.component=openstack-iscsid-container, config_id=tripleo_step3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, name=rhosp17/openstack-iscsid, vendor=Red Hat, Inc., vcs-type=git, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, batch=17.1_20250721.1) Oct 14 04:30:46 localhost podman[79176]: 2025-10-14 08:30:46.741863348 +0000 UTC m=+0.085790397 container exec_died 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, managed_by=tripleo_ansible, com.redhat.component=openstack-iscsid-container, io.openshift.expose-services=, tcib_managed=true, build-date=2025-07-21T13:27:15, io.buildah.version=1.33.12, version=17.1.9, distribution-scope=public, name=rhosp17/openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, release=1, batch=17.1_20250721.1, maintainer=OpenStack TripleO Team, container_name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, architecture=x86_64, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc.) Oct 14 04:30:46 localhost systemd[1]: 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea.service: Deactivated successfully. Oct 14 04:30:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638. Oct 14 04:30:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884. Oct 14 04:30:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f. Oct 14 04:30:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e. Oct 14 04:30:51 localhost systemd[1]: tmp-crun.beE2BZ.mount: Deactivated successfully. Oct 14 04:30:51 localhost systemd[1]: tmp-crun.8gLWc7.mount: Deactivated successfully. 
Oct 14 04:30:51 localhost podman[79203]: 2025-10-14 08:30:51.760397818 +0000 UTC m=+0.093885448 container health_status def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, container_name=ceilometer_agent_compute, managed_by=tripleo_ansible, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-07-21T14:45:33, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, distribution-scope=public, tcib_managed=true, batch=17.1_20250721.1, name=rhosp17/openstack-ceilometer-compute, io.buildah.version=1.33.12, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1, version=17.1.9, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step4) Oct 14 04:30:51 localhost podman[79196]: 2025-10-14 08:30:51.735409883 +0000 UTC m=+0.076686736 container health_status 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, 
maintainer=OpenStack TripleO Team, build-date=2025-07-21T14:48:37, version=17.1.9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, tcib_managed=true, vcs-type=git, name=rhosp17/openstack-nova-compute, batch=17.1_20250721.1, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, architecture=x86_64, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d) Oct 14 04:30:51 localhost podman[79203]: 2025-10-14 08:30:51.79072069 +0000 UTC m=+0.124208320 container exec_died def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.openshift.tags=rhosp osp openstack osp-17.1, version=17.1.9, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-compute-container, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, name=rhosp17/openstack-ceilometer-compute, build-date=2025-07-21T14:45:33, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, io.openshift.expose-services=, container_name=ceilometer_agent_compute, batch=17.1_20250721.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, vendor=Red Hat, Inc., 
config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, vcs-type=git, release=1, distribution-scope=public) Oct 14 04:30:51 localhost systemd[1]: def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e.service: Deactivated successfully. 
Oct 14 04:30:51 localhost podman[79197]: 2025-10-14 08:30:51.77392024 +0000 UTC m=+0.108542024 container health_status d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, container_name=logrotate_crond, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, name=rhosp17/openstack-cron, vcs-type=git, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, tcib_managed=true, version=17.1.9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, release=1, io.buildah.version=1.33.12, batch=17.1_20250721.1, managed_by=tripleo_ansible, architecture=x86_64, build-date=2025-07-21T13:07:52, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, com.redhat.license_terms=https://www.redhat.com/agreements) Oct 14 04:30:51 localhost podman[79195]: 2025-10-14 08:30:51.792159703 +0000 UTC m=+0.134917559 container health_status 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-ipmi-container, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, batch=17.1_20250721.1, distribution-scope=public, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T15:29:47, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, architecture=x86_64, tcib_managed=true, vcs-type=git, version=17.1.9, name=rhosp17/openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, io.openshift.expose-services=) Oct 14 04:30:51 localhost podman[79197]: 2025-10-14 08:30:51.856095348 +0000 UTC m=+0.190717102 container exec_died d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, version=17.1.9, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, batch=17.1_20250721.1, io.buildah.version=1.33.12, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, managed_by=tripleo_ansible, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, container_name=logrotate_crond, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhosp17/openstack-cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, tcib_managed=true, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.component=openstack-cron-container, build-date=2025-07-21T13:07:52, vcs-type=git) Oct 14 04:30:51 localhost systemd[1]: d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f.service: Deactivated successfully. 
Oct 14 04:30:51 localhost podman[79195]: 2025-10-14 08:30:51.872780565 +0000 UTC m=+0.215538381 container exec_died 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, build-date=2025-07-21T15:29:47, io.openshift.expose-services=, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1, architecture=x86_64, name=rhosp17/openstack-ceilometer-ipmi, release=1, distribution-scope=public, batch=17.1_20250721.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, io.buildah.version=1.33.12, tcib_managed=true, container_name=ceilometer_agent_ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-type=git, vendor=Red Hat, Inc.) Oct 14 04:30:51 localhost systemd[1]: 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638.service: Deactivated successfully. Oct 14 04:30:52 localhost podman[79196]: 2025-10-14 08:30:52.12238951 +0000 UTC m=+0.463666433 container exec_died 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, 
build-date=2025-07-21T14:48:37, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, vcs-type=git, container_name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, batch=17.1_20250721.1, vendor=Red Hat, Inc., version=17.1.9, architecture=x86_64, name=rhosp17/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible) Oct 14 04:30:52 localhost systemd[1]: 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884.service: Deactivated successfully. Oct 14 04:30:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e. Oct 14 04:30:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5. Oct 14 04:30:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2. 
Oct 14 04:30:54 localhost podman[79287]: 2025-10-14 08:30:54.73795938 +0000 UTC m=+0.080902930 container health_status 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, maintainer=OpenStack TripleO Team, tcib_managed=true, vendor=Red Hat, Inc., container_name=ovn_metadata_agent, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20250721.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, build-date=2025-07-21T16:28:53, managed_by=tripleo_ansible, architecture=x86_64, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', 
'/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.9, vcs-type=git, distribution-scope=public, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.expose-services=, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1, config_id=tripleo_step4) Oct 14 04:30:54 localhost podman[79287]: 2025-10-14 08:30:54.77923387 +0000 UTC m=+0.122177390 container exec_died 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, distribution-scope=public, io.buildah.version=1.33.12, tcib_managed=true, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=ovn_metadata_agent, vcs-type=git, batch=17.1_20250721.1, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, build-date=2025-07-21T16:28:53, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 
'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, name=rhosp17/openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1, version=17.1.9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vendor=Red Hat, Inc.) Oct 14 04:30:54 localhost systemd[1]: tmp-crun.4op0Yv.mount: Deactivated successfully. 
Oct 14 04:30:54 localhost podman[79288]: 2025-10-14 08:30:54.791119744 +0000 UTC m=+0.131572891 container health_status a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20250721.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, version=17.1.9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, com.redhat.component=openstack-ovn-controller-container, build-date=2025-07-21T13:28:44, vcs-type=git, name=rhosp17/openstack-ovn-controller, release=1, container_name=ovn_controller, io.buildah.version=1.33.12, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, distribution-scope=public, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ovn-controller) Oct 14 04:30:54 localhost systemd[1]: 
7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e.service: Deactivated successfully. Oct 14 04:30:54 localhost podman[79288]: 2025-10-14 08:30:54.812490121 +0000 UTC m=+0.152943278 container exec_died a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vcs-type=git, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, architecture=x86_64, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, com.redhat.component=openstack-ovn-controller-container, release=1, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, container_name=ovn_controller, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, distribution-scope=public, managed_by=tripleo_ansible, build-date=2025-07-21T13:28:44, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, io.openshift.expose-services=, version=17.1.9) Oct 
14 04:30:54 localhost systemd[1]: a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5.service: Deactivated successfully. Oct 14 04:30:54 localhost podman[79289]: 2025-10-14 08:30:54.90177981 +0000 UTC m=+0.239286829 container health_status ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, io.openshift.expose-services=, managed_by=tripleo_ansible, vendor=Red Hat, Inc., release=1, build-date=2025-07-21T14:48:37, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, config_id=tripleo_step5, com.redhat.component=openstack-nova-compute-container, name=rhosp17/openstack-nova-compute, maintainer=OpenStack TripleO Team, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1, version=17.1.9, io.buildah.version=1.33.12, container_name=nova_compute, distribution-scope=public, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a-4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20250721.1, tcib_managed=true) Oct 14 04:30:54 localhost podman[79289]: 2025-10-14 08:30:54.930065053 +0000 UTC m=+0.267572032 container exec_died ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a-4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 
'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, name=rhosp17/openstack-nova-compute, config_id=tripleo_step5, release=1, io.buildah.version=1.33.12, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., build-date=2025-07-21T14:48:37, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1, batch=17.1_20250721.1, io.openshift.expose-services=, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.9, 
container_name=nova_compute, maintainer=OpenStack TripleO Team, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements) Oct 14 04:30:54 localhost systemd[1]: ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2.service: Deactivated successfully. Oct 14 04:30:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6. Oct 14 04:30:59 localhost systemd[1]: tmp-crun.FAUs1h.mount: Deactivated successfully. Oct 14 04:30:59 localhost podman[79359]: 2025-10-14 08:30:59.736275185 +0000 UTC m=+0.079990965 container health_status 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., com.redhat.component=openstack-qdrouterd-container, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, io.buildah.version=1.33.12, tcib_managed=true, distribution-scope=public, io.openshift.expose-services=, batch=17.1_20250721.1, release=1, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, build-date=2025-07-21T13:07:59, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, version=17.1.9, container_name=metrics_qdr, maintainer=OpenStack TripleO Team) Oct 14 04:30:59 localhost podman[79359]: 2025-10-14 08:30:59.968712341 +0000 UTC m=+0.312428121 container exec_died 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20250721.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.9, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, com.redhat.component=openstack-qdrouterd-container, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, config_id=tripleo_step1, io.openshift.expose-services=, architecture=x86_64, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T13:07:59, tcib_managed=true) Oct 14 04:30:59 localhost systemd[1]: 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6.service: Deactivated successfully. Oct 14 04:31:09 localhost sshd[79389]: main: sshd: ssh-rsa algorithm is disabled Oct 14 04:31:09 localhost systemd-logind[760]: New session 33 of user zuul. Oct 14 04:31:09 localhost systemd[1]: Started Session 33 of User zuul. Oct 14 04:31:10 localhost python3[79498]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Oct 14 04:31:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8. 
Oct 14 04:31:15 localhost podman[79685]: 2025-10-14 08:31:15.769152936 +0000 UTC m=+0.098075362 container health_status 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, batch=17.1_20250721.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, release=2, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, com.redhat.component=openstack-collectd-container, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-collectd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, distribution-scope=public, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.9, container_name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, io.buildah.version=1.33.12, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-07-21T13:04:03, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, architecture=x86_64, vendor=Red Hat, Inc., config_id=tripleo_step3) Oct 14 04:31:15 localhost podman[79685]: 2025-10-14 08:31:15.781558025 +0000 UTC m=+0.110480451 container exec_died 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, vcs-type=git, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, architecture=x86_64, release=2, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, distribution-scope=public, version=17.1.9, vendor=Red Hat, Inc., io.openshift.expose-services=, name=rhosp17/openstack-collectd, tcib_managed=true, com.redhat.component=openstack-collectd-container, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-07-21T13:04:03, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}) Oct 14 04:31:15 localhost systemd[1]: 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8.service: Deactivated successfully. Oct 14 04:31:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea. Oct 14 04:31:17 localhost systemd[1]: tmp-crun.JCyujP.mount: Deactivated successfully. 
Oct 14 04:31:17 localhost podman[79782]: 2025-10-14 08:31:17.606753333 +0000 UTC m=+0.122457829 container health_status 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, name=rhosp17/openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, managed_by=tripleo_ansible, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, build-date=2025-07-21T13:27:15, vcs-type=git, maintainer=OpenStack TripleO Team, container_name=iscsid, batch=17.1_20250721.1, config_id=tripleo_step3, com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, architecture=x86_64, distribution-scope=public, io.openshift.expose-services=, vendor=Red Hat, Inc., version=17.1.9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1) Oct 14 04:31:17 localhost podman[79782]: 2025-10-14 08:31:17.64323887 +0000 UTC m=+0.158943406 container exec_died 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, architecture=x86_64, distribution-scope=public, io.openshift.expose-services=, name=rhosp17/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, vendor=Red Hat, Inc., io.buildah.version=1.33.12, 
container_name=iscsid, release=1, com.redhat.component=openstack-iscsid-container, vcs-type=git, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, version=17.1.9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, batch=17.1_20250721.1, build-date=2025-07-21T13:27:15, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible) Oct 14 04:31:17 localhost systemd[1]: 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea.service: Deactivated successfully. Oct 14 04:31:17 localhost python3[79783]: ansible-ansible.legacy.dnf Invoked with name=['iptables'] allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None state=None Oct 14 04:31:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638. Oct 14 04:31:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884. Oct 14 04:31:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f. Oct 14 04:31:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e. 
Oct 14 04:31:22 localhost podman[79940]: 2025-10-14 08:31:22.298505468 +0000 UTC m=+0.085618702 container health_status def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, maintainer=OpenStack TripleO Team, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, architecture=x86_64, batch=17.1_20250721.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-07-21T14:45:33, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, version=17.1.9, release=1, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, vendor=Red Hat, Inc., distribution-scope=public, config_id=tripleo_step4, io.buildah.version=1.33.12, 
name=rhosp17/openstack-ceilometer-compute, container_name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements) Oct 14 04:31:22 localhost python3[79936]: ansible-ansible.builtin.iptables Invoked with action=insert chain=INPUT comment=allow ssh access for zuul executor in_interface=eth0 jump=ACCEPT protocol=tcp source=38.102.83.114 table=filter state=present ip_version=ipv4 match=[] destination_ports=[] ctstate=[] syn=ignore flush=False chain_management=False numeric=False rule_num=None wait=None to_source=None destination=None to_destination=None tcp_flags=None gateway=None log_prefix=None log_level=None goto=None out_interface=None fragment=None set_counters=None source_port=None destination_port=None to_ports=None set_dscp_mark=None set_dscp_mark_class=None src_range=None dst_range=None match_set=None match_set_flags=None limit=None limit_burst=None uid_owner=None gid_owner=None reject_with=None icmp_type=None policy=None Oct 14 04:31:22 localhost podman[79938]: 2025-10-14 08:31:22.344475617 +0000 UTC m=+0.137637971 container health_status 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, name=rhosp17/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T14:48:37, config_id=tripleo_step4, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, 
managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, container_name=nova_migration_target, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, batch=17.1_20250721.1, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, release=1, vcs-type=git, maintainer=OpenStack TripleO Team, tcib_managed=true, version=17.1.9, architecture=x86_64) Oct 14 04:31:22 localhost systemd-journald[47488]: Field hash table of /run/log/journal/8e1d5208cffec42b50976967e1d1cfd0/system.journal has a fill level at 81.1 (270 of 333 items), suggesting rotation. 
Oct 14 04:31:22 localhost systemd-journald[47488]: /run/log/journal/8e1d5208cffec42b50976967e1d1cfd0/system.journal: Journal header limits reached or header out-of-date, rotating. Oct 14 04:31:22 localhost rsyslogd[759]: imjournal: journal files changed, reloading... [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ] Oct 14 04:31:22 localhost podman[79940]: 2025-10-14 08:31:22.346043974 +0000 UTC m=+0.133157218 container exec_died def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, release=1, managed_by=tripleo_ansible, build-date=2025-07-21T14:45:33, config_id=tripleo_step4, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., version=17.1.9, maintainer=OpenStack TripleO Team, 
io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, architecture=x86_64, batch=17.1_20250721.1, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, io.buildah.version=1.33.12) Oct 14 04:31:22 localhost kernel: Warning: Deprecated Driver is detected: nft_compat will not be maintained in a future major release and may be disabled Oct 14 04:31:22 localhost systemd[1]: def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e.service: Deactivated successfully. Oct 14 04:31:22 localhost rsyslogd[759]: imjournal: journal files changed, reloading... 
[v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ] Oct 14 04:31:22 localhost podman[79937]: 2025-10-14 08:31:22.400054543 +0000 UTC m=+0.193964269 container health_status 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, build-date=2025-07-21T15:29:47, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, config_id=tripleo_step4, name=rhosp17/openstack-ceilometer-ipmi, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_ipmi, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, batch=17.1_20250721.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1, version=17.1.9, maintainer=OpenStack TripleO Team, io.buildah.version=1.33.12) Oct 14 04:31:22 localhost podman[79937]: 2025-10-14 08:31:22.422659086 +0000 UTC m=+0.216568802 container exec_died 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, vcs-type=git, io.openshift.expose-services=, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, build-date=2025-07-21T15:29:47, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, 
com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-ipmi-container, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, tcib_managed=true, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, maintainer=OpenStack TripleO Team, io.buildah.version=1.33.12, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, container_name=ceilometer_agent_ipmi) Oct 14 04:31:22 localhost podman[79939]: 2025-10-14 08:31:22.33046123 +0000 UTC m=+0.123567142 container health_status d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20250721.1, release=1, tcib_managed=true, container_name=logrotate_crond, distribution-scope=public, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, io.buildah.version=1.33.12, version=17.1.9, com.redhat.component=openstack-cron-container, summary=Red Hat OpenStack Platform 17.1 cron, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, build-date=2025-07-21T13:07:52, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-cron, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 
'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, config_id=tripleo_step4, managed_by=tripleo_ansible) Oct 14 04:31:22 localhost systemd[1]: 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638.service: Deactivated successfully. 
Oct 14 04:31:22 localhost podman[79939]: 2025-10-14 08:31:22.461267216 +0000 UTC m=+0.254373118 container exec_died d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, version=17.1.9, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step4, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 cron, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, vcs-type=git, batch=17.1_20250721.1, io.buildah.version=1.33.12, io.openshift.expose-services=, release=1, tcib_managed=true, container_name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, build-date=2025-07-21T13:07:52, com.redhat.component=openstack-cron-container, name=rhosp17/openstack-cron, vendor=Red Hat, Inc.) Oct 14 04:31:22 localhost systemd[1]: d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f.service: Deactivated successfully. Oct 14 04:31:22 localhost podman[79938]: 2025-10-14 08:31:22.686602958 +0000 UTC m=+0.479765332 container exec_died 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, distribution-scope=public, config_id=tripleo_step4, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, container_name=nova_migration_target, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20250721.1, vcs-type=git, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, release=1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, version=17.1.9, build-date=2025-07-21T14:48:37, vendor=Red Hat, Inc., architecture=x86_64) Oct 14 04:31:22 localhost systemd[1]: 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884.service: Deactivated successfully. Oct 14 04:31:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e. Oct 14 04:31:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5. Oct 14 04:31:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2. 
Oct 14 04:31:25 localhost podman[80057]: 2025-10-14 08:31:25.751957876 +0000 UTC m=+0.083040915 container health_status ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, batch=17.1_20250721.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, build-date=2025-07-21T14:48:37, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., config_id=tripleo_step5, release=1, name=rhosp17/openstack-nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, vcs-type=git, distribution-scope=public, io.buildah.version=1.33.12, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a-4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, version=17.1.9) Oct 14 04:31:25 localhost podman[80055]: 2025-10-14 08:31:25.726898349 +0000 UTC m=+0.065830752 container health_status 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, release=1, version=17.1.9, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-type=git, build-date=2025-07-21T16:28:53, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, container_name=ovn_metadata_agent, distribution-scope=public, tcib_managed=true, batch=17.1_20250721.1, config_id=tripleo_step4, io.buildah.version=1.33.12, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3) Oct 14 04:31:25 localhost podman[80056]: 2025-10-14 08:31:25.789971169 +0000 UTC m=+0.125443329 container health_status 
a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, tcib_managed=true, batch=17.1_20250721.1, com.redhat.component=openstack-ovn-controller-container, io.openshift.expose-services=, version=17.1.9, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, release=1, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.33.12, build-date=2025-07-21T13:28:44, summary=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-ovn-controller, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible) Oct 14 04:31:25 localhost podman[80055]: 2025-10-14 08:31:25.810550522 +0000 UTC m=+0.149482945 container exec_died 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e 
(image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, architecture=x86_64, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20250721.1, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, release=1, version=17.1.9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, container_name=ovn_metadata_agent, managed_by=tripleo_ansible, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', 
'/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.buildah.version=1.33.12, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, tcib_managed=true, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, build-date=2025-07-21T16:28:53, distribution-scope=public) Oct 14 04:31:25 localhost systemd[1]: 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e.service: Deactivated successfully. Oct 14 04:31:25 localhost podman[80057]: 2025-10-14 08:31:25.836085273 +0000 UTC m=+0.167168282 container exec_died ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, vcs-type=git, build-date=2025-07-21T14:48:37, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, distribution-scope=public, io.buildah.version=1.33.12, name=rhosp17/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a-4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 
'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, release=1, io.openshift.tags=rhosp osp openstack osp-17.1, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., config_id=tripleo_step5, maintainer=OpenStack TripleO Team, tcib_managed=true, batch=17.1_20250721.1, version=17.1.9, io.openshift.expose-services=) Oct 14 04:31:25 localhost systemd[1]: ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2.service: Deactivated successfully. 
Oct 14 04:31:25 localhost podman[80056]: 2025-10-14 08:31:25.861984475 +0000 UTC m=+0.197456685 container exec_died a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, com.redhat.component=openstack-ovn-controller-container, distribution-scope=public, maintainer=OpenStack TripleO Team, io.buildah.version=1.33.12, container_name=ovn_controller, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, summary=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-07-21T13:28:44, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-ovn-controller, vcs-type=git, version=17.1.9, config_id=tripleo_step4, release=1, vendor=Red Hat, Inc., config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64) Oct 14 04:31:25 localhost systemd[1]: 
a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5.service: Deactivated successfully. Oct 14 04:31:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6. Oct 14 04:31:30 localhost podman[80126]: 2025-10-14 08:31:30.745826471 +0000 UTC m=+0.091078824 container health_status 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, distribution-scope=public, managed_by=tripleo_ansible, release=1, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, com.redhat.component=openstack-qdrouterd-container, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhosp17/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.buildah.version=1.33.12, build-date=2025-07-21T13:07:59, version=17.1.9, container_name=metrics_qdr, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1) Oct 14 04:31:30 localhost podman[80126]: 2025-10-14 08:31:30.947908551 +0000 UTC m=+0.293160894 container exec_died 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, tcib_managed=true, com.redhat.component=openstack-qdrouterd-container, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-qdrouterd, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, vcs-type=git, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-07-21T13:07:59, io.buildah.version=1.33.12, architecture=x86_64, container_name=metrics_qdr, batch=17.1_20250721.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, version=17.1.9, release=1, summary=Red Hat OpenStack Platform 17.1 qdrouterd) Oct 14 04:31:30 localhost systemd[1]: 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6.service: Deactivated successfully. Oct 14 04:31:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8. 
Oct 14 04:31:46 localhost podman[80234]: 2025-10-14 08:31:46.749576991 +0000 UTC m=+0.087721425 container health_status 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, build-date=2025-07-21T13:04:03, io.openshift.expose-services=, architecture=x86_64, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, config_id=tripleo_step3, batch=17.1_20250721.1, version=17.1.9, summary=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, description=Red Hat OpenStack Platform 
17.1 collectd, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, name=rhosp17/openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc., release=2, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team, tcib_managed=true, container_name=collectd, vcs-type=git, com.redhat.component=openstack-collectd-container) Oct 14 04:31:46 localhost podman[80234]: 2025-10-14 08:31:46.757411354 +0000 UTC m=+0.095555818 container exec_died 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, maintainer=OpenStack TripleO Team, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, vcs-type=git, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, build-date=2025-07-21T13:04:03, com.redhat.component=openstack-collectd-container, io.openshift.expose-services=, architecture=x86_64, config_id=tripleo_step3, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, vendor=Red Hat, Inc., batch=17.1_20250721.1, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, release=2, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, container_name=collectd) Oct 14 04:31:46 localhost systemd[1]: 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8.service: Deactivated successfully. Oct 14 04:31:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea. 
Oct 14 04:31:48 localhost podman[80253]: 2025-10-14 08:31:48.75680432 +0000 UTC m=+0.094059973 container health_status 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc., build-date=2025-07-21T13:27:15, release=1, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, tcib_managed=true, name=rhosp17/openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, version=17.1.9, container_name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, 
distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, managed_by=tripleo_ansible, vcs-type=git, io.buildah.version=1.33.12, batch=17.1_20250721.1, com.redhat.component=openstack-iscsid-container, maintainer=OpenStack TripleO Team) Oct 14 04:31:48 localhost podman[80253]: 2025-10-14 08:31:48.766147529 +0000 UTC m=+0.103403202 container exec_died 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.9, io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T13:27:15, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, release=1, tcib_managed=true, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-type=git, com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, name=rhosp17/openstack-iscsid, managed_by=tripleo_ansible, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, io.buildah.version=1.33.12, batch=17.1_20250721.1, container_name=iscsid, vendor=Red Hat, Inc., config_id=tripleo_step3) Oct 14 04:31:48 localhost systemd[1]: 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea.service: Deactivated successfully. Oct 14 04:31:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638. Oct 14 04:31:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f. Oct 14 04:31:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e. Oct 14 04:31:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884. 
Oct 14 04:31:52 localhost podman[80273]: 2025-10-14 08:31:52.797526453 +0000 UTC m=+0.136602980 container health_status d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, name=rhosp17/openstack-cron, container_name=logrotate_crond, maintainer=OpenStack TripleO Team, version=17.1.9, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 cron, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, vcs-type=git, batch=17.1_20250721.1, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.display-name=Red Hat OpenStack Platform 
17.1 cron, build-date=2025-07-21T13:07:52, config_id=tripleo_step4, distribution-scope=public, release=1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container) Oct 14 04:31:52 localhost podman[80273]: 2025-10-14 08:31:52.834139864 +0000 UTC m=+0.173216371 container exec_died d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.buildah.version=1.33.12, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-cron-container, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, summary=Red Hat OpenStack Platform 17.1 cron, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, build-date=2025-07-21T13:07:52, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, batch=17.1_20250721.1, config_id=tripleo_step4, version=17.1.9, vendor=Red Hat, Inc., tcib_managed=true, managed_by=tripleo_ansible, container_name=logrotate_crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git) Oct 14 04:31:52 localhost systemd[1]: d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f.service: Deactivated successfully. Oct 14 04:31:52 localhost podman[80318]: 2025-10-14 08:31:52.887549055 +0000 UTC m=+0.095997740 container health_status 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, name=rhosp17/openstack-nova-compute, tcib_managed=true, config_id=tripleo_step4, version=17.1.9, architecture=x86_64, batch=17.1_20250721.1, build-date=2025-07-21T14:48:37, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.33.12, vcs-type=git, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, container_name=nova_migration_target, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, release=1, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute) Oct 14 04:31:52 localhost podman[80274]: 2025-10-14 08:31:52.865073485 +0000 UTC m=+0.203317868 container health_status def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, build-date=2025-07-21T14:45:33, io.openshift.expose-services=, version=17.1.9, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, batch=17.1_20250721.1, managed_by=tripleo_ansible, config_data={'depends_on': 
['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., container_name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1, distribution-scope=public, io.buildah.version=1.33.12, release=1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, config_id=tripleo_step4) Oct 14 04:31:52 localhost podman[80274]: 2025-10-14 08:31:52.949528561 +0000 UTC m=+0.287772944 container exec_died def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, com.redhat.component=openstack-ceilometer-compute-container, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, architecture=x86_64, 
container_name=ceilometer_agent_compute, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, batch=17.1_20250721.1, build-date=2025-07-21T14:45:33, io.openshift.expose-services=, managed_by=tripleo_ansible, release=1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, distribution-scope=public, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, vendor=Red Hat, Inc., version=17.1.9, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}) Oct 14 04:31:52 localhost 
systemd[1]: def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e.service: Deactivated successfully. Oct 14 04:31:52 localhost podman[80272]: 2025-10-14 08:31:52.766256682 +0000 UTC m=+0.109619587 container health_status 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, vendor=Red Hat, Inc., io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, version=17.1.9, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, batch=17.1_20250721.1, container_name=ceilometer_agent_ipmi, tcib_managed=true, vcs-type=git, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.33.12, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, build-date=2025-07-21T15:29:47) Oct 14 04:31:53 localhost podman[80272]: 2025-10-14 08:31:53.000085277 +0000 UTC m=+0.343448122 container exec_died 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, architecture=x86_64, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, build-date=2025-07-21T15:29:47, name=rhosp17/openstack-ceilometer-ipmi, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, vcs-type=git, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, com.redhat.component=openstack-ceilometer-ipmi-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, tcib_managed=true, maintainer=OpenStack TripleO Team, batch=17.1_20250721.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.9, container_name=ceilometer_agent_ipmi, io.buildah.version=1.33.12) Oct 14 04:31:53 localhost systemd[1]: 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638.service: Deactivated successfully. 
Oct 14 04:31:53 localhost podman[80318]: 2025-10-14 08:31:53.29832206 +0000 UTC m=+0.506770635 container exec_died 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2025-07-21T14:48:37, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, com.redhat.component=openstack-nova-compute-container, architecture=x86_64, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 
17.1 nova-compute, container_name=nova_migration_target, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step4, version=17.1.9, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d) Oct 14 04:31:53 localhost systemd[1]: 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884.service: Deactivated successfully. Oct 14 04:31:53 localhost systemd[1]: tmp-crun.KtoYqn.mount: Deactivated successfully. Oct 14 04:31:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e. Oct 14 04:31:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5. Oct 14 04:31:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2. Oct 14 04:31:56 localhost systemd[1]: tmp-crun.07QtHq.mount: Deactivated successfully. 
Oct 14 04:31:56 localhost podman[80374]: 2025-10-14 08:31:56.759803269 +0000 UTC m=+0.092495356 container health_status ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a-4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', 
'/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.33.12, vcs-type=git, batch=17.1_20250721.1, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, build-date=2025-07-21T14:48:37, name=rhosp17/openstack-nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, vendor=Red Hat, Inc., distribution-scope=public, tcib_managed=true, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, container_name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1, config_id=tripleo_step5) Oct 14 04:31:56 localhost podman[80374]: 2025-10-14 08:31:56.785061212 +0000 UTC m=+0.117753269 container exec_died ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a-4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, build-date=2025-07-21T14:48:37, vcs-type=git, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, batch=17.1_20250721.1, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, release=1, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, version=17.1.9, config_id=tripleo_step5, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, 
com.redhat.license_terms=https://www.redhat.com/agreements) Oct 14 04:31:56 localhost systemd[1]: ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2.service: Deactivated successfully. Oct 14 04:31:56 localhost podman[80372]: 2025-10-14 08:31:56.79878402 +0000 UTC m=+0.136405504 container health_status 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, version=17.1.9, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, 
managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, batch=17.1_20250721.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, tcib_managed=true, name=rhosp17/openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., com.redhat.component=openstack-neutron-metadata-agent-ovn-container, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, config_id=tripleo_step4, architecture=x86_64, container_name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, release=1, build-date=2025-07-21T16:28:53) Oct 14 04:31:56 localhost podman[80372]: 2025-10-14 08:31:56.844263365 +0000 UTC m=+0.181884879 container exec_died 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, build-date=2025-07-21T16:28:53, container_name=ovn_metadata_agent, release=1, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, io.buildah.version=1.33.12, vcs-type=git, vendor=Red Hat, Inc., version=17.1.9, config_id=tripleo_step4, managed_by=tripleo_ansible, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, distribution-scope=public, architecture=x86_64, io.k8s.description=Red Hat 
OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, batch=17.1_20250721.1, tcib_managed=true, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Oct 14 04:31:56 localhost systemd[1]: 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e.service: Deactivated successfully. 
Oct 14 04:31:56 localhost podman[80373]: 2025-10-14 08:31:56.855902891 +0000 UTC m=+0.191004240 container health_status a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, io.openshift.expose-services=, container_name=ovn_controller, tcib_managed=true, vcs-type=git, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, com.redhat.component=openstack-ovn-controller-container, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, build-date=2025-07-21T13:28:44, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, vendor=Red Hat, Inc., version=17.1.9, batch=17.1_20250721.1, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, architecture=x86_64) Oct 14 04:31:56 localhost podman[80373]: 2025-10-14 08:31:56.935510963 +0000 
UTC m=+0.270612342 container exec_died a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, name=rhosp17/openstack-ovn-controller, build-date=2025-07-21T13:28:44, io.buildah.version=1.33.12, release=1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, architecture=x86_64, batch=17.1_20250721.1, io.openshift.tags=rhosp osp openstack osp-17.1, io.openshift.expose-services=, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, tcib_managed=true, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, vendor=Red Hat, Inc., version=17.1.9, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible) Oct 14 04:31:56 localhost systemd[1]: a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5.service: Deactivated successfully. 
Oct 14 04:32:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6. Oct 14 04:32:01 localhost podman[80450]: 2025-10-14 08:32:01.75191164 +0000 UTC m=+0.088204057 container health_status 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, version=17.1.9, tcib_managed=true, vendor=Red Hat, Inc., container_name=metrics_qdr, name=rhosp17/openstack-qdrouterd, 
maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, batch=17.1_20250721.1, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-07-21T13:07:59, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, architecture=x86_64, release=1) Oct 14 04:32:01 localhost podman[80450]: 2025-10-14 08:32:01.985318073 +0000 UTC m=+0.321610480 container exec_died 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, io.openshift.expose-services=, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhosp17/openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, version=17.1.9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, build-date=2025-07-21T13:07:59, architecture=x86_64, vcs-type=git, batch=17.1_20250721.1, tcib_managed=true, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 qdrouterd) Oct 14 04:32:02 localhost systemd[1]: 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6.service: Deactivated successfully. Oct 14 04:32:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8. Oct 14 04:32:17 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Oct 14 04:32:17 localhost recover_tripleo_nova_virtqemud[80481]: 62551 Oct 14 04:32:17 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Oct 14 04:32:17 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. 
Oct 14 04:32:17 localhost podman[80479]: 2025-10-14 08:32:17.756803984 +0000 UTC m=+0.092229328 container health_status 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, name=rhosp17/openstack-collectd, architecture=x86_64, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, com.redhat.component=openstack-collectd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, maintainer=OpenStack TripleO Team, container_name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20250721.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', 
'/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, release=2, summary=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, distribution-scope=public, io.openshift.expose-services=, version=17.1.9, build-date=2025-07-21T13:04:03, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, vendor=Red Hat, Inc.) Oct 14 04:32:17 localhost podman[80479]: 2025-10-14 08:32:17.769166533 +0000 UTC m=+0.104591907 container exec_died 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, name=rhosp17/openstack-collectd, build-date=2025-07-21T13:04:03, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, io.openshift.tags=rhosp osp openstack osp-17.1, io.openshift.expose-services=, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.9, vendor=Red Hat, Inc., config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.component=openstack-collectd-container, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, io.buildah.version=1.33.12, release=2, batch=17.1_20250721.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2) Oct 14 04:32:17 localhost systemd[1]: 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8.service: Deactivated successfully. Oct 14 04:32:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea. 
Oct 14 04:32:19 localhost podman[80501]: 2025-10-14 08:32:19.754050257 +0000 UTC m=+0.090729143 container health_status 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20250721.1, maintainer=OpenStack TripleO Team, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, vcs-type=git, build-date=2025-07-21T13:27:15, managed_by=tripleo_ansible, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, version=17.1.9, config_id=tripleo_step3, name=rhosp17/openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.expose-services=, release=1, summary=Red Hat 
OpenStack Platform 17.1 iscsid, container_name=iscsid, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, com.redhat.component=openstack-iscsid-container) Oct 14 04:32:19 localhost podman[80501]: 2025-10-14 08:32:19.793129611 +0000 UTC m=+0.129808417 container exec_died 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, version=17.1.9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, architecture=x86_64, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.33.12, tcib_managed=true, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, batch=17.1_20250721.1, com.redhat.component=openstack-iscsid-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', 
'/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, release=1, build-date=2025-07-21T13:27:15, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, config_id=tripleo_step3, container_name=iscsid) Oct 14 04:32:19 localhost systemd[1]: 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea.service: Deactivated successfully. Oct 14 04:32:22 localhost systemd[1]: session-33.scope: Deactivated successfully. Oct 14 04:32:22 localhost systemd[1]: session-33.scope: Consumed 6.243s CPU time. Oct 14 04:32:22 localhost systemd-logind[760]: Session 33 logged out. Waiting for processes to exit. Oct 14 04:32:22 localhost systemd-logind[760]: Removed session 33. Oct 14 04:32:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638. Oct 14 04:32:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884. Oct 14 04:32:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f. Oct 14 04:32:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e. 
Oct 14 04:32:23 localhost podman[80566]: 2025-10-14 08:32:23.746723928 +0000 UTC m=+0.079253011 container health_status d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, release=1, vcs-type=git, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, architecture=x86_64, version=17.1.9, com.redhat.component=openstack-cron-container, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.33.12, build-date=2025-07-21T13:07:52, distribution-scope=public, name=rhosp17/openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, 
io.openshift.tags=rhosp osp openstack osp-17.1, batch=17.1_20250721.1, tcib_managed=true, io.openshift.expose-services=, container_name=logrotate_crond, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, io.k8s.description=Red Hat OpenStack Platform 17.1 cron) Oct 14 04:32:23 localhost podman[80566]: 2025-10-14 08:32:23.787162563 +0000 UTC m=+0.119691616 container exec_died d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, vcs-type=git, architecture=x86_64, version=17.1.9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., build-date=2025-07-21T13:07:52, distribution-scope=public, io.buildah.version=1.33.12, batch=17.1_20250721.1, com.redhat.component=openstack-cron-container, config_id=tripleo_step4, 
container_name=logrotate_crond, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 cron, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, name=rhosp17/openstack-cron, release=1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron) Oct 14 04:32:23 localhost systemd[1]: d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f.service: Deactivated successfully. Oct 14 04:32:23 localhost systemd[1]: tmp-crun.1bRwE8.mount: Deactivated successfully. Oct 14 04:32:23 localhost podman[80565]: 2025-10-14 08:32:23.812755485 +0000 UTC m=+0.147874626 container health_status 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, name=rhosp17/openstack-nova-compute, release=1, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, config_id=tripleo_step4, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, vcs-type=git, build-date=2025-07-21T14:48:37, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, batch=17.1_20250721.1, tcib_managed=true, container_name=nova_migration_target, io.buildah.version=1.33.12, version=17.1.9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=) Oct 14 04:32:23 localhost podman[80567]: 2025-10-14 08:32:23.856023394 +0000 UTC m=+0.185149046 container health_status def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.9, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1, io.buildah.version=1.33.12, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, config_id=tripleo_step4, vcs-type=git, architecture=x86_64, com.redhat.component=openstack-ceilometer-compute-container, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/agreements, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, batch=17.1_20250721.1, vendor=Red Hat, Inc., io.openshift.expose-services=, release=1, container_name=ceilometer_agent_compute, build-date=2025-07-21T14:45:33) Oct 14 04:32:23 localhost podman[80564]: 2025-10-14 08:32:23.904440807 +0000 UTC m=+0.240522256 container health_status 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, 
com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20250721.1, vcs-type=git, maintainer=OpenStack TripleO Team, io.buildah.version=1.33.12, tcib_managed=true, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-07-21T15:29:47, com.redhat.component=openstack-ceilometer-ipmi-container, managed_by=tripleo_ansible, release=1, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, io.openshift.expose-services=, version=17.1.9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, name=rhosp17/openstack-ceilometer-ipmi, 
container_name=ceilometer_agent_ipmi) Oct 14 04:32:23 localhost podman[80567]: 2025-10-14 08:32:23.920130163 +0000 UTC m=+0.249255805 container exec_died def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step4, io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, managed_by=tripleo_ansible, batch=17.1_20250721.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.9, vcs-type=git, build-date=2025-07-21T14:45:33, io.buildah.version=1.33.12, architecture=x86_64, container_name=ceilometer_agent_compute, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 
ceilometer-compute, vendor=Red Hat, Inc., release=1, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3) Oct 14 04:32:23 localhost systemd[1]: def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e.service: Deactivated successfully. Oct 14 04:32:23 localhost podman[80564]: 2025-10-14 08:32:23.935928434 +0000 UTC m=+0.272009853 container exec_died 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, name=rhosp17/openstack-ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, release=1, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.33.12, batch=17.1_20250721.1, maintainer=OpenStack TripleO Team, version=17.1.9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-07-21T15:29:47, com.redhat.component=openstack-ceilometer-ipmi-container, architecture=x86_64, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, vcs-type=git, io.openshift.expose-services=, container_name=ceilometer_agent_ipmi, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, vendor=Red Hat, Inc.) Oct 14 04:32:23 localhost systemd[1]: 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638.service: Deactivated successfully. 
Oct 14 04:32:24 localhost podman[80565]: 2025-10-14 08:32:24.162243246 +0000 UTC m=+0.497362337 container exec_died 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, release=1, version=17.1.9, managed_by=tripleo_ansible, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, build-date=2025-07-21T14:48:37, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, distribution-scope=public, name=rhosp17/openstack-nova-compute, description=Red Hat 
OpenStack Platform 17.1 nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, io.openshift.expose-services=, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step4, com.redhat.component=openstack-nova-compute-container, container_name=nova_migration_target, io.buildah.version=1.33.12) Oct 14 04:32:24 localhost systemd[1]: 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884.service: Deactivated successfully. Oct 14 04:32:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e. Oct 14 04:32:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5. Oct 14 04:32:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2. Oct 14 04:32:27 localhost podman[80666]: 2025-10-14 08:32:27.746881682 +0000 UTC m=+0.075907852 container health_status ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, vcs-type=git, distribution-scope=public, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a-4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1, config_id=tripleo_step5, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, batch=17.1_20250721.1, vendor=Red Hat, Inc., architecture=x86_64, build-date=2025-07-21T14:48:37, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, io.buildah.version=1.33.12, tcib_managed=true, name=rhosp17/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=nova_compute, release=1, version=17.1.9) Oct 14 04:32:27 localhost podman[80664]: 2025-10-14 08:32:27.718924889 +0000 UTC 
m=+0.055866115 container health_status 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, managed_by=tripleo_ansible, config_id=tripleo_step4, name=rhosp17/openstack-neutron-metadata-agent-ovn, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, build-date=2025-07-21T16:28:53, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, batch=17.1_20250721.1, release=1, vendor=Red Hat, Inc., io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, architecture=x86_64, io.buildah.version=1.33.12, version=17.1.9, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}) Oct 14 04:32:27 localhost podman[80666]: 2025-10-14 08:32:27.777117463 +0000 UTC m=+0.106143633 container exec_died ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, vcs-type=git, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a-4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.9, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, batch=17.1_20250721.1, build-date=2025-07-21T14:48:37, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step5, release=1, distribution-scope=public, io.buildah.version=1.33.12, io.openshift.expose-services=, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, name=rhosp17/openstack-nova-compute, container_name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute) Oct 14 04:32:27 localhost systemd[1]: ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2.service: Deactivated successfully. 
Oct 14 04:32:27 localhost podman[80664]: 2025-10-14 08:32:27.801220731 +0000 UTC m=+0.138161977 container exec_died 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, version=17.1.9, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, config_id=tripleo_step4, container_name=ovn_metadata_agent, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, architecture=x86_64, tcib_managed=true, build-date=2025-07-21T16:28:53, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, distribution-scope=public, io.buildah.version=1.33.12, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, release=1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, batch=17.1_20250721.1) Oct 14 04:32:27 localhost systemd[1]: 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e.service: Deactivated successfully. Oct 14 04:32:27 localhost podman[80665]: 2025-10-14 08:32:27.884192852 +0000 UTC m=+0.220072246 container health_status a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, com.redhat.component=openstack-ovn-controller-container, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, architecture=x86_64, managed_by=tripleo_ansible, io.buildah.version=1.33.12, name=rhosp17/openstack-ovn-controller, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 
'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-type=git, batch=17.1_20250721.1, distribution-scope=public, build-date=2025-07-21T13:28:44, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, container_name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, vendor=Red Hat, Inc., version=17.1.9, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.license_terms=https://www.redhat.com/agreements) Oct 14 04:32:27 localhost podman[80665]: 2025-10-14 08:32:27.939663594 +0000 UTC m=+0.275542998 container exec_died a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1, release=1, distribution-scope=public, batch=17.1_20250721.1, version=17.1.9, architecture=x86_64, com.redhat.component=openstack-ovn-controller-container, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, managed_by=tripleo_ansible, vcs-type=git, config_id=tripleo_step4, name=rhosp17/openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, build-date=2025-07-21T13:28:44) Oct 14 04:32:27 localhost systemd[1]: a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5.service: Deactivated successfully. Oct 14 04:32:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6. 
Oct 14 04:32:32 localhost podman[80734]: 2025-10-14 08:32:32.740288363 +0000 UTC m=+0.083876121 container health_status 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, com.redhat.component=openstack-qdrouterd-container, batch=17.1_20250721.1, container_name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1, io.openshift.expose-services=, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, config_id=tripleo_step1, build-date=2025-07-21T13:07:59, maintainer=OpenStack TripleO Team, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.buildah.version=1.33.12, version=17.1.9) Oct 14 04:32:32 localhost podman[80734]: 2025-10-14 08:32:32.954126752 +0000 UTC m=+0.297714500 container exec_died 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, vcs-type=git, com.redhat.component=openstack-qdrouterd-container, batch=17.1_20250721.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', 
'/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, distribution-scope=public, build-date=2025-07-21T13:07:59, container_name=metrics_qdr, release=1, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, managed_by=tripleo_ansible, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, name=rhosp17/openstack-qdrouterd, config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, version=17.1.9) Oct 14 04:32:32 localhost systemd[1]: 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6.service: Deactivated successfully. Oct 14 04:32:34 localhost sshd[80762]: main: sshd: ssh-rsa algorithm is disabled Oct 14 04:32:34 localhost systemd-logind[760]: New session 34 of user zuul. Oct 14 04:32:34 localhost systemd[1]: Started Session 34 of User zuul. 
Oct 14 04:32:35 localhost python3[80781]: ansible-ansible.legacy.dnf Invoked with name=['systemd-container'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None Oct 14 04:32:41 localhost ceph-osd[31500]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Oct 14 04:32:41 localhost ceph-osd[31500]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 2400.1 total, 600.0 interval#012Cumulative writes: 4460 writes, 20K keys, 4460 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s#012Cumulative WAL: 4460 writes, 458 syncs, 9.74 writes per sync, written: 0.02 GB, 0.01 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s#012Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent Oct 14 04:32:45 localhost ceph-osd[32440]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Oct 14 04:32:45 localhost ceph-osd[32440]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 2400.1 total, 600.0 interval#012Cumulative writes: 5034 writes, 22K keys, 5034 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s#012Cumulative WAL: 5034 writes, 570 syncs, 8.83 writes per sync, written: 0.02 GB, 0.01 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, 
ingest: 0.00 MB, 0.00 MB/s#012Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent Oct 14 04:32:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8. Oct 14 04:32:48 localhost podman[80860]: 2025-10-14 08:32:48.763637856 +0000 UTC m=+0.101261298 container health_status 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, tcib_managed=true, vendor=Red Hat, Inc., batch=17.1_20250721.1, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, com.redhat.component=openstack-collectd-container, name=rhosp17/openstack-collectd, release=2, summary=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, build-date=2025-07-21T13:04:03, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 collectd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, config_id=tripleo_step3, io.openshift.expose-services=, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, version=17.1.9, container_name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}) Oct 14 04:32:48 localhost podman[80860]: 2025-10-14 08:32:48.782460186 +0000 UTC m=+0.120083638 container exec_died 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 collectd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, vcs-type=git, batch=17.1_20250721.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 
'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, distribution-scope=public, architecture=x86_64, com.redhat.component=openstack-collectd-container, container_name=collectd, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, release=2, maintainer=OpenStack TripleO Team, tcib_managed=true, build-date=2025-07-21T13:04:03, config_id=tripleo_step3) Oct 14 04:32:48 localhost systemd[1]: 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8.service: Deactivated successfully. Oct 14 04:32:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea. 
Oct 14 04:32:50 localhost podman[80880]: 2025-10-14 08:32:50.740566343 +0000 UTC m=+0.084619642 container health_status 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.openshift.expose-services=, vendor=Red Hat, Inc., config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.33.12, managed_by=tripleo_ansible, batch=17.1_20250721.1, summary=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true, build-date=2025-07-21T13:27:15, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, vcs-type=git, com.redhat.component=openstack-iscsid-container, container_name=iscsid, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, release=1, maintainer=OpenStack TripleO Team, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, description=Red Hat OpenStack Platform 17.1 iscsid) Oct 14 04:32:50 localhost podman[80880]: 2025-10-14 08:32:50.751778287 +0000 UTC m=+0.095831576 container exec_died 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, com.redhat.component=openstack-iscsid-container, description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.33.12, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', 
'/var/lib/iscsi:/var/lib/iscsi:z']}, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, version=17.1.9, release=1, architecture=x86_64, batch=17.1_20250721.1, build-date=2025-07-21T13:27:15, vcs-type=git, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, container_name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-iscsid, tcib_managed=true, config_id=tripleo_step3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1) Oct 14 04:32:50 localhost systemd[1]: 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea.service: Deactivated successfully. Oct 14 04:32:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638. Oct 14 04:32:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884. Oct 14 04:32:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f. Oct 14 04:32:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e. Oct 14 04:32:54 localhost systemd[1]: tmp-crun.4xYf6f.mount: Deactivated successfully. 
Oct 14 04:32:54 localhost podman[80902]: 2025-10-14 08:32:54.728543015 +0000 UTC m=+0.063659797 container health_status def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., container_name=ceilometer_agent_compute, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, config_id=tripleo_step4, name=rhosp17/openstack-ceilometer-compute, io.openshift.expose-services=, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, io.openshift.tags=rhosp osp openstack 
osp-17.1, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, release=1, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20250721.1, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, version=17.1.9, build-date=2025-07-21T14:45:33) Oct 14 04:32:54 localhost podman[80902]: 2025-10-14 08:32:54.744489541 +0000 UTC m=+0.079606313 container exec_died def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-compute-container, container_name=ceilometer_agent_compute, batch=17.1_20250721.1, name=rhosp17/openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, release=1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, managed_by=tripleo_ansible, version=17.1.9, vendor=Red Hat, Inc., vcs-type=git, architecture=x86_64, build-date=2025-07-21T14:45:33, distribution-scope=public) Oct 14 04:32:54 localhost systemd[1]: def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e.service: Deactivated successfully. 
Oct 14 04:32:54 localhost podman[80899]: 2025-10-14 08:32:54.782795501 +0000 UTC m=+0.123498940 container health_status 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.buildah.version=1.33.12, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhosp17/openstack-ceilometer-ipmi, version=17.1.9, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, container_name=ceilometer_agent_ipmi, release=1, batch=17.1_20250721.1, io.openshift.expose-services=, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, 
vcs-type=git, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, tcib_managed=true, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-07-21T15:29:47, distribution-scope=public, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi) Oct 14 04:32:54 localhost podman[80901]: 2025-10-14 08:32:54.832759819 +0000 UTC m=+0.168629903 container health_status d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, release=1, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, summary=Red Hat OpenStack Platform 17.1 cron, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, distribution-scope=public, vcs-type=git, version=17.1.9, architecture=x86_64, config_id=tripleo_step4, io.openshift.expose-services=, container_name=logrotate_crond, build-date=2025-07-21T13:07:52, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, name=rhosp17/openstack-cron) Oct 14 04:32:54 localhost podman[80901]: 2025-10-14 08:32:54.843921292 +0000 UTC m=+0.179791386 container exec_died d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhosp17/openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, build-date=2025-07-21T13:07:52, release=1, vcs-type=git, version=17.1.9, managed_by=tripleo_ansible, batch=17.1_20250721.1, vendor=Red Hat, Inc., container_name=logrotate_crond, architecture=x86_64, com.redhat.component=openstack-cron-container, tcib_managed=true) Oct 14 04:32:54 localhost systemd[1]: d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f.service: Deactivated successfully. 
Oct 14 04:32:54 localhost podman[80899]: 2025-10-14 08:32:54.885770919 +0000 UTC m=+0.226474408 container exec_died 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, container_name=ceilometer_agent_ipmi, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, vcs-type=git, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, release=1, batch=17.1_20250721.1, name=rhosp17/openstack-ceilometer-ipmi, io.buildah.version=1.33.12, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, tcib_managed=true, io.openshift.expose-services=, version=17.1.9, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack 
Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team, build-date=2025-07-21T15:29:47, com.redhat.component=openstack-ceilometer-ipmi-container) Oct 14 04:32:54 localhost systemd[1]: 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638.service: Deactivated successfully. Oct 14 04:32:54 localhost podman[80900]: 2025-10-14 08:32:54.885539932 +0000 UTC m=+0.219375395 container health_status 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, 
config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, batch=17.1_20250721.1, distribution-scope=public, release=1, build-date=2025-07-21T14:48:37, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, tcib_managed=true, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, vendor=Red Hat, Inc., container_name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.33.12, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible) Oct 14 04:32:55 localhost podman[80900]: 2025-10-14 08:32:55.232532478 +0000 UTC m=+0.566368021 container exec_died 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.33.12, batch=17.1_20250721.1, container_name=nova_migration_target, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, vcs-type=git, managed_by=tripleo_ansible, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, io.openshift.expose-services=, build-date=2025-07-21T14:48:37, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, name=rhosp17/openstack-nova-compute, tcib_managed=true, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute) Oct 14 04:32:55 localhost systemd[1]: 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884.service: Deactivated successfully. Oct 14 04:32:55 localhost systemd[1]: tmp-crun.gihgqH.mount: Deactivated successfully. Oct 14 04:32:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e. Oct 14 04:32:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5. Oct 14 04:32:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2. Oct 14 04:32:58 localhost systemd[1]: tmp-crun.rwP1Ni.mount: Deactivated successfully. 
Oct 14 04:32:58 localhost podman[80994]: 2025-10-14 08:32:58.743055717 +0000 UTC m=+0.082206999 container health_status 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, tcib_managed=true, vcs-type=git, distribution-scope=public, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, batch=17.1_20250721.1, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, build-date=2025-07-21T16:28:53, io.openshift.tags=rhosp osp openstack osp-17.1, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., version=17.1.9, release=1, architecture=x86_64, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, container_name=ovn_metadata_agent) Oct 14 04:32:58 localhost systemd[1]: tmp-crun.9rRNZ0.mount: Deactivated successfully. Oct 14 04:32:58 localhost podman[80996]: 2025-10-14 08:32:58.796354844 +0000 UTC m=+0.127740146 container health_status ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, tcib_managed=true, build-date=2025-07-21T14:48:37, io.openshift.tags=rhosp osp openstack osp-17.1, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, container_name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, io.buildah.version=1.33.12, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, name=rhosp17/openstack-nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, architecture=x86_64, maintainer=OpenStack TripleO Team, version=17.1.9, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, 
config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a-4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step5, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible) Oct 14 04:32:58 localhost podman[80995]: 2025-10-14 08:32:58.845330184 +0000 UTC m=+0.180970632 container 
health_status a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, release=1, build-date=2025-07-21T13:28:44, tcib_managed=true, name=rhosp17/openstack-ovn-controller, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, vcs-type=git, config_id=tripleo_step4, vendor=Red Hat, Inc., architecture=x86_64, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, com.redhat.component=openstack-ovn-controller-container, version=17.1.9, summary=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, distribution-scope=public, batch=17.1_20250721.1, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team) Oct 14 04:32:58 localhost podman[80994]: 2025-10-14 08:32:58.869722141 +0000 UTC m=+0.208873473 container exec_died 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e 
(image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, version=17.1.9, config_id=tripleo_step4, release=1, container_name=ovn_metadata_agent, tcib_managed=true, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, architecture=x86_64, batch=17.1_20250721.1, maintainer=OpenStack TripleO Team, distribution-scope=public, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, build-date=2025-07-21T16:28:53, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., managed_by=tripleo_ansible) Oct 14 04:32:58 localhost podman[80996]: 2025-10-14 08:32:58.876893524 +0000 UTC m=+0.208278866 container exec_died ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, version=17.1.9, release=1, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, vendor=Red Hat, Inc., build-date=2025-07-21T14:48:37, tcib_managed=true, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, io.openshift.expose-services=, container_name=nova_compute, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a-4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step5, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, io.buildah.version=1.33.12, distribution-scope=public, name=rhosp17/openstack-nova-compute) Oct 14 04:32:58 localhost systemd[1]: 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e.service: Deactivated successfully. Oct 14 04:32:58 localhost systemd[1]: ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2.service: Deactivated successfully. 
Oct 14 04:32:58 localhost podman[80995]: 2025-10-14 08:32:58.898098692 +0000 UTC m=+0.233739110 container exec_died a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.buildah.version=1.33.12, config_id=tripleo_step4, io.openshift.expose-services=, vendor=Red Hat, Inc., batch=17.1_20250721.1, build-date=2025-07-21T13:28:44, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, version=17.1.9, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, name=rhosp17/openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, release=1, distribution-scope=public, architecture=x86_64, container_name=ovn_controller, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1) Oct 14 04:32:58 localhost systemd[1]: 
a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5.service: Deactivated successfully. Oct 14 04:33:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6. Oct 14 04:33:03 localhost podman[81065]: 2025-10-14 08:33:03.741522392 +0000 UTC m=+0.081598233 container health_status 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, version=17.1.9, architecture=x86_64, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., batch=17.1_20250721.1, io.openshift.expose-services=, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.license_terms=https://www.redhat.com/agreements, managed_by=tripleo_ansible, name=rhosp17/openstack-qdrouterd, 
vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, build-date=2025-07-21T13:07:59, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, config_id=tripleo_step1, release=1, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, container_name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd) Oct 14 04:33:03 localhost podman[81065]: 2025-10-14 08:33:03.93537434 +0000 UTC m=+0.275450191 container exec_died 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, version=17.1.9, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, managed_by=tripleo_ansible, tcib_managed=true, io.openshift.expose-services=, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., name=rhosp17/openstack-qdrouterd, build-date=2025-07-21T13:07:59, io.buildah.version=1.33.12, release=1, com.redhat.component=openstack-qdrouterd-container) Oct 14 04:33:03 localhost systemd[1]: 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6.service: Deactivated successfully. Oct 14 04:33:04 localhost python3[81108]: ansible-ansible.legacy.dnf Invoked with name=['sos'] state=latest allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None Oct 14 04:33:08 localhost systemd[1]: Started /usr/bin/systemctl start man-db-cache-update. Oct 14 04:33:08 localhost systemd[1]: Starting man-db-cache-update.service... Oct 14 04:33:08 localhost systemd[1]: Started /usr/bin/systemctl start man-db-cache-update. 
Oct 14 04:33:09 localhost systemd[1]: man-db-cache-update.service: Deactivated successfully. Oct 14 04:33:09 localhost systemd[1]: Finished man-db-cache-update.service. Oct 14 04:33:09 localhost systemd[1]: run-rff7f3ea361394ca0b6c964307cea61c8.service: Deactivated successfully. Oct 14 04:33:09 localhost systemd[1]: run-rd6360e637408472095497e86291b10d3.service: Deactivated successfully. Oct 14 04:33:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8. Oct 14 04:33:19 localhost podman[81260]: 2025-10-14 08:33:19.738874088 +0000 UTC m=+0.081553953 container health_status 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.buildah.version=1.33.12, tcib_managed=true, version=17.1.9, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, com.redhat.component=openstack-collectd-container, container_name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-collectd, config_id=tripleo_step3, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vendor=Red Hat, Inc., release=2, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-07-21T13:04:03, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, managed_by=tripleo_ansible) Oct 14 04:33:19 localhost podman[81260]: 2025-10-14 08:33:19.778840578 +0000 UTC m=+0.121520403 container exec_died 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.openshift.expose-services=, config_id=tripleo_step3, vendor=Red Hat, Inc., release=2, vcs-type=git, com.redhat.component=openstack-collectd-container, name=rhosp17/openstack-collectd, batch=17.1_20250721.1, tcib_managed=true, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 
'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, summary=Red Hat OpenStack Platform 17.1 collectd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, build-date=2025-07-21T13:04:03, container_name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.license_terms=https://www.redhat.com/agreements, version=17.1.9, maintainer=OpenStack TripleO Team, architecture=x86_64, distribution-scope=public, managed_by=tripleo_ansible, io.buildah.version=1.33.12, description=Red Hat OpenStack Platform 17.1 collectd) Oct 14 04:33:19 localhost systemd[1]: 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8.service: Deactivated successfully. Oct 14 04:33:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea. Oct 14 04:33:21 localhost systemd[1]: tmp-crun.Kt36vV.mount: Deactivated successfully. 
Oct 14 04:33:21 localhost podman[81280]: 2025-10-14 08:33:21.749909065 +0000 UTC m=+0.089252272 container health_status 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, build-date=2025-07-21T13:27:15, name=rhosp17/openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, com.redhat.component=openstack-iscsid-container, distribution-scope=public, version=17.1.9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', 
'/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, container_name=iscsid, config_id=tripleo_step3, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team) Oct 14 04:33:21 localhost podman[81280]: 2025-10-14 08:33:21.792459976 +0000 UTC m=+0.131803163 container exec_died 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, release=1, managed_by=tripleo_ansible, batch=17.1_20250721.1, build-date=2025-07-21T13:27:15, container_name=iscsid, 
distribution-scope=public, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, com.redhat.component=openstack-iscsid-container, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, vcs-type=git, vendor=Red Hat, Inc., config_id=tripleo_step3, io.buildah.version=1.33.12, name=rhosp17/openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/agreements) Oct 14 04:33:21 localhost systemd[1]: 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea.service: Deactivated successfully. Oct 14 04:33:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638. Oct 14 04:33:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884. Oct 14 04:33:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f. Oct 14 04:33:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e. Oct 14 04:33:25 localhost systemd[1]: tmp-crun.siPrBg.mount: Deactivated successfully. 
Oct 14 04:33:25 localhost podman[81351]: 2025-10-14 08:33:25.755515857 +0000 UTC m=+0.086466854 container health_status def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.buildah.version=1.33.12, container_name=ceilometer_agent_compute, vcs-type=git, batch=17.1_20250721.1, com.redhat.component=openstack-ceilometer-compute-container, vendor=Red Hat, Inc., config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, build-date=2025-07-21T14:45:33, name=rhosp17/openstack-ceilometer-compute, distribution-scope=public, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, 
vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1, architecture=x86_64, version=17.1.9, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=) Oct 14 04:33:25 localhost podman[81343]: 2025-10-14 08:33:25.731381108 +0000 UTC m=+0.074682229 container health_status 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, version=17.1.9, container_name=ceilometer_agent_ipmi, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, managed_by=tripleo_ansible, vendor=Red Hat, Inc., name=rhosp17/openstack-ceilometer-ipmi, release=1, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.component=openstack-ceilometer-ipmi-container, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, distribution-scope=public, batch=17.1_20250721.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, build-date=2025-07-21T15:29:47, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, io.buildah.version=1.33.12, tcib_managed=true) Oct 14 04:33:25 localhost podman[81351]: 2025-10-14 08:33:25.801106463 +0000 UTC m=+0.132057430 container exec_died def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, release=1, config_id=tripleo_step4, vendor=Red Hat, Inc., name=rhosp17/openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.33.12, architecture=x86_64, batch=17.1_20250721.1, container_name=ceilometer_agent_compute, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': 
'/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, build-date=2025-07-21T14:45:33, com.redhat.component=openstack-ceilometer-compute-container, managed_by=tripleo_ansible) Oct 14 04:33:25 localhost podman[81344]: 2025-10-14 08:33:25.80133647 +0000 UTC m=+0.135331503 container health_status 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, distribution-scope=public, io.openshift.expose-services=, vendor=Red Hat, Inc., vcs-type=git, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, version=17.1.9, build-date=2025-07-21T14:48:37, 
io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/agreements, release=1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_migration_target, architecture=x86_64, io.buildah.version=1.33.12) Oct 14 04:33:25 localhost podman[81345]: 2025-10-14 08:33:25.861902419 +0000 UTC m=+0.192553947 container health_status d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, config_data={'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, name=rhosp17/openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, batch=17.1_20250721.1, build-date=2025-07-21T13:07:52, distribution-scope=public, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, tcib_managed=true, managed_by=tripleo_ansible, version=17.1.9, release=1, io.buildah.version=1.33.12, container_name=logrotate_crond, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, architecture=x86_64) Oct 14 04:33:25 localhost systemd[1]: 
def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e.service: Deactivated successfully. Oct 14 04:33:25 localhost podman[81345]: 2025-10-14 08:33:25.896916837 +0000 UTC m=+0.227568375 container exec_died d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, vcs-type=git, name=rhosp17/openstack-cron, config_id=tripleo_step4, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, build-date=2025-07-21T13:07:52, version=17.1.9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, batch=17.1_20250721.1, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack 
osp-17.1, architecture=x86_64, distribution-scope=public, io.openshift.expose-services=, release=1, com.redhat.component=openstack-cron-container, io.buildah.version=1.33.12, container_name=logrotate_crond, vendor=Red Hat, Inc., tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team) Oct 14 04:33:25 localhost systemd[1]: d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f.service: Deactivated successfully. Oct 14 04:33:25 localhost podman[81343]: 2025-10-14 08:33:25.910940562 +0000 UTC m=+0.254241683 container exec_died 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.9, batch=17.1_20250721.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2025-07-21T15:29:47, io.openshift.expose-services=, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat 
OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-type=git, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-ipmi, release=1, tcib_managed=true, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., architecture=x86_64, config_id=tripleo_step4, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=ceilometer_agent_ipmi, distribution-scope=public) Oct 14 04:33:25 localhost systemd[1]: 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638.service: Deactivated successfully. Oct 14 04:33:26 localhost podman[81344]: 2025-10-14 08:33:26.192865744 +0000 UTC m=+0.526860767 container exec_died 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, batch=17.1_20250721.1, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, version=17.1.9, vcs-type=git, maintainer=OpenStack TripleO Team, distribution-scope=public, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, config_id=tripleo_step4, io.buildah.version=1.33.12, vendor=Red Hat, Inc., build-date=2025-07-21T14:48:37, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, architecture=x86_64, name=rhosp17/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/agreements) Oct 14 04:33:26 localhost systemd[1]: 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884.service: Deactivated successfully. Oct 14 04:33:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e. Oct 14 04:33:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5. Oct 14 04:33:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2. 
Oct 14 04:33:29 localhost podman[81433]: 2025-10-14 08:33:29.750095477 +0000 UTC m=+0.087477377 container health_status 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, version=17.1.9, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, batch=17.1_20250721.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-type=git, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, architecture=x86_64, build-date=2025-07-21T16:28:53, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, container_name=ovn_metadata_agent, vendor=Red Hat, Inc., tcib_managed=true) Oct 14 04:33:29 localhost systemd[1]: tmp-crun.QXai2d.mount: Deactivated successfully. Oct 14 04:33:29 localhost podman[81433]: 2025-10-14 08:33:29.809208642 +0000 UTC m=+0.146590542 container exec_died 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.openshift.expose-services=, container_name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-07-21T16:28:53, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, vendor=Red Hat, Inc., version=17.1.9, batch=17.1_20250721.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, architecture=x86_64, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 
'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.33.12, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, distribution-scope=public, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, vcs-type=git, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true) Oct 14 04:33:29 localhost systemd[1]: 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e.service: Deactivated successfully. 
Oct 14 04:33:29 localhost podman[81434]: 2025-10-14 08:33:29.812573987 +0000 UTC m=+0.145918431 container health_status a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, container_name=ovn_controller, maintainer=OpenStack TripleO Team, tcib_managed=true, version=17.1.9, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, io.buildah.version=1.33.12, description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, batch=17.1_20250721.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, architecture=x86_64, io.openshift.expose-services=, build-date=2025-07-21T13:28:44, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, release=1, com.redhat.component=openstack-ovn-controller-container) Oct 14 04:33:29 localhost podman[81434]: 2025-10-14 08:33:29.892493407 +0000 
UTC m=+0.225837831 container exec_died a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, com.redhat.component=openstack-ovn-controller-container, batch=17.1_20250721.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, build-date=2025-07-21T13:28:44, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, version=17.1.9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, io.buildah.version=1.33.12, managed_by=tripleo_ansible, release=1, vcs-type=git, container_name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, architecture=x86_64, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245) Oct 14 04:33:29 localhost systemd[1]: a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5.service: Deactivated successfully. 
Oct 14 04:33:29 localhost podman[81435]: 2025-10-14 08:33:29.864995543 +0000 UTC m=+0.195640723 container health_status ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a-4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', 
'/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step5, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.9, release=1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, managed_by=tripleo_ansible, vcs-type=git, architecture=x86_64, batch=17.1_20250721.1, build-date=2025-07-21T14:48:37, distribution-scope=public, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.33.12, io.openshift.expose-services=, tcib_managed=true, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhosp17/openstack-nova-compute, vendor=Red Hat, Inc., container_name=nova_compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, description=Red Hat OpenStack Platform 17.1 nova-compute) Oct 14 04:33:29 localhost podman[81435]: 2025-10-14 08:33:29.945525254 +0000 UTC m=+0.276170394 container exec_died ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a-4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1, version=17.1.9, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.33.12, vendor=Red Hat, Inc., build-date=2025-07-21T14:48:37, distribution-scope=public, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=nova_compute, vcs-type=git, batch=17.1_20250721.1) Oct 14 04:33:29 localhost 
systemd[1]: ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2.service: Deactivated successfully. Oct 14 04:33:30 localhost systemd[1]: tmp-crun.WWiAv9.mount: Deactivated successfully. Oct 14 04:33:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6. Oct 14 04:33:34 localhost podman[81506]: 2025-10-14 08:33:34.740887323 +0000 UTC m=+0.083278696 container health_status 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, distribution-scope=public, release=1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, container_name=metrics_qdr, managed_by=tripleo_ansible, io.buildah.version=1.33.12, batch=17.1_20250721.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, version=17.1.9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhosp17/openstack-qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1, io.openshift.expose-services=, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, tcib_managed=true, vcs-type=git, architecture=x86_64, build-date=2025-07-21T13:07:59, description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc.) Oct 14 04:33:34 localhost podman[81506]: 2025-10-14 08:33:34.958280761 +0000 UTC m=+0.300672154 container exec_died 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, config_id=tripleo_step1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, managed_by=tripleo_ansible, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, release=1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1, io.buildah.version=1.33.12, vcs-type=git, version=17.1.9, architecture=x86_64, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-07-21T13:07:59, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., batch=17.1_20250721.1) Oct 14 04:33:34 localhost systemd[1]: 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6.service: Deactivated successfully. Oct 14 04:33:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8. 
Oct 14 04:33:50 localhost podman[81663]: 2025-10-14 08:33:50.759237668 +0000 UTC m=+0.095004921 container health_status 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, release=2, description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, io.buildah.version=1.33.12, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, version=17.1.9, architecture=x86_64, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Red Hat OpenStack Platform 17.1 collectd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, name=rhosp17/openstack-collectd, vcs-type=git, batch=17.1_20250721.1, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, vendor=Red Hat, Inc., com.redhat.component=openstack-collectd-container, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, tcib_managed=true, build-date=2025-07-21T13:04:03, distribution-scope=public) Oct 14 04:33:50 localhost podman[81663]: 2025-10-14 08:33:50.777124253 +0000 UTC m=+0.112891556 container exec_died 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, architecture=x86_64, name=rhosp17/openstack-collectd, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, version=17.1.9, build-date=2025-07-21T13:04:03, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=collectd, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, tcib_managed=true, vcs-type=git, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, release=2, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, vendor=Red Hat, Inc., com.redhat.component=openstack-collectd-container, io.openshift.tags=rhosp osp openstack osp-17.1, batch=17.1_20250721.1, maintainer=OpenStack TripleO Team, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}) Oct 14 04:33:50 localhost systemd[1]: 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8.service: Deactivated successfully. Oct 14 04:33:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea. Oct 14 04:33:52 localhost systemd[1]: tmp-crun.le2JMh.mount: Deactivated successfully. 
Oct 14 04:33:52 localhost podman[81697]: 2025-10-14 08:33:52.300357567 +0000 UTC m=+0.088084545 container health_status 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.openshift.expose-services=, architecture=x86_64, distribution-scope=public, build-date=2025-07-21T13:27:15, com.redhat.license_terms=https://www.redhat.com/agreements, managed_by=tripleo_ansible, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 iscsid, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, container_name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., batch=17.1_20250721.1, tcib_managed=true, release=1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.buildah.version=1.33.12, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, vcs-type=git, version=17.1.9, com.redhat.component=openstack-iscsid-container, name=rhosp17/openstack-iscsid) Oct 14 04:33:52 localhost podman[81697]: 2025-10-14 08:33:52.313113024 +0000 UTC m=+0.100839982 container exec_died 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=iscsid, batch=17.1_20250721.1, name=rhosp17/openstack-iscsid, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, build-date=2025-07-21T13:27:15, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, io.buildah.version=1.33.12, tcib_managed=true, io.openshift.expose-services=, architecture=x86_64, vcs-type=git, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., config_id=tripleo_step3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, distribution-scope=public, release=1, version=17.1.9, com.redhat.component=openstack-iscsid-container) Oct 14 04:33:52 localhost systemd[1]: 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea.service: Deactivated successfully. Oct 14 04:33:52 localhost python3[81696]: ansible-ansible.legacy.command Invoked with _raw_params=subscription-manager repos --disable rhel-9-for-x86_64-baseos-eus-rpms --disable rhel-9-for-x86_64-appstream-eus-rpms --disable rhel-9-for-x86_64-highavailability-eus-rpms --disable openstack-17.1-for-rhel-9-x86_64-rpms --disable fast-datapath-for-rhel-9-x86_64-rpms _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Oct 14 04:33:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638. Oct 14 04:33:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884. Oct 14 04:33:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f. Oct 14 04:33:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e. 
Oct 14 04:33:56 localhost rhsm-service[6497]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server. Oct 14 04:33:56 localhost systemd[1]: tmp-crun.HNl9Tb.mount: Deactivated successfully. Oct 14 04:33:56 localhost podman[81843]: 2025-10-14 08:33:56.747380792 +0000 UTC m=+0.078093194 container health_status d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, architecture=x86_64, com.redhat.component=openstack-cron-container, io.openshift.tags=rhosp osp openstack osp-17.1, batch=17.1_20250721.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, vcs-type=git, container_name=logrotate_crond, version=17.1.9, distribution-scope=public, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., name=rhosp17/openstack-cron, com.redhat.license_terms=https://www.redhat.com/agreements, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, release=1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, build-date=2025-07-21T13:07:52, io.buildah.version=1.33.12, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c) Oct 14 04:33:56 localhost podman[81842]: 2025-10-14 08:33:56.805786735 +0000 UTC m=+0.139354666 container health_status 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, io.buildah.version=1.33.12, container_name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, vendor=Red Hat, Inc., batch=17.1_20250721.1, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1, managed_by=tripleo_ansible, architecture=x86_64, build-date=2025-07-21T14:48:37, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step4, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d) Oct 14 04:33:56 localhost podman[81844]: 2025-10-14 08:33:56.877025257 +0000 UTC m=+0.203552130 container health_status def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, release=1, tcib_managed=true, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.9, com.redhat.component=openstack-ceilometer-compute-container, container_name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, name=rhosp17/openstack-ceilometer-compute, build-date=2025-07-21T14:45:33, config_id=tripleo_step4, io.openshift.expose-services=, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1) Oct 14 04:33:56 localhost podman[81844]: 2025-10-14 08:33:56.908155124 +0000 UTC m=+0.234682047 container exec_died def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.license_terms=https://www.redhat.com/agreements, release=1, tcib_managed=true, vendor=Red Hat, Inc., config_id=tripleo_step4, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, 
com.redhat.component=openstack-ceilometer-compute-container, name=rhosp17/openstack-ceilometer-compute, distribution-scope=public, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, maintainer=OpenStack TripleO Team, version=17.1.9, build-date=2025-07-21T14:45:33, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, managed_by=tripleo_ansible) Oct 14 04:33:56 localhost podman[81841]: 2025-10-14 08:33:56.916359538 +0000 UTC m=+0.252033794 container health_status 
07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, tcib_managed=true, batch=17.1_20250721.1, config_id=tripleo_step4, architecture=x86_64, name=rhosp17/openstack-ceilometer-ipmi, distribution-scope=public, release=1, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, managed_by=tripleo_ansible, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, vendor=Red Hat, Inc., io.buildah.version=1.33.12, vcs-type=git, build-date=2025-07-21T15:29:47, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}) Oct 14 04:33:56 localhost systemd[1]: def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e.service: Deactivated successfully. Oct 14 04:33:56 localhost podman[81843]: 2025-10-14 08:33:56.934625115 +0000 UTC m=+0.265337557 container exec_died d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, name=rhosp17/openstack-cron, build-date=2025-07-21T13:07:52, version=17.1.9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=logrotate_crond, managed_by=tripleo_ansible, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, batch=17.1_20250721.1, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, distribution-scope=public, config_id=tripleo_step4, architecture=x86_64) Oct 14 04:33:56 localhost systemd[1]: d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f.service: Deactivated successfully. Oct 14 04:33:56 localhost podman[81841]: 2025-10-14 08:33:56.952040055 +0000 UTC m=+0.287714301 container exec_died 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, version=17.1.9, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, io.buildah.version=1.33.12, build-date=2025-07-21T15:29:47, com.redhat.license_terms=https://www.redhat.com/agreements, managed_by=tripleo_ansible, vendor=Red Hat, Inc., batch=17.1_20250721.1, container_name=ceilometer_agent_ipmi, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, tcib_managed=true, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, architecture=x86_64, config_data={'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, vcs-type=git, release=1, config_id=tripleo_step4) Oct 14 04:33:56 localhost systemd[1]: 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638.service: Deactivated successfully. 
Oct 14 04:33:57 localhost podman[81842]: 2025-10-14 08:33:57.175262095 +0000 UTC m=+0.508829946 container exec_died 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, version=17.1.9, batch=17.1_20250721.1, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, container_name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, build-date=2025-07-21T14:48:37, io.openshift.expose-services=, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, 
config_id=tripleo_step4, io.buildah.version=1.33.12, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/agreements, release=1, vcs-type=git, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team) Oct 14 04:33:57 localhost systemd[1]: 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884.service: Deactivated successfully. Oct 14 04:33:57 localhost systemd[1]: tmp-crun.Otfyh3.mount: Deactivated successfully. Oct 14 04:34:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e. Oct 14 04:34:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5. Oct 14 04:34:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2. 
Oct 14 04:34:00 localhost podman[81995]: 2025-10-14 08:34:00.749179297 +0000 UTC m=+0.082759560 container health_status ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a-4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', 
'/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.33.12, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/agreements, release=1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.9, build-date=2025-07-21T14:48:37, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, config_id=tripleo_step5, container_name=nova_compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, vcs-type=git, vendor=Red Hat, Inc., distribution-scope=public, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20250721.1, name=rhosp17/openstack-nova-compute) Oct 14 04:34:00 localhost podman[81995]: 2025-10-14 08:34:00.783232114 +0000 UTC m=+0.116812367 container exec_died ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, container_name=nova_compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, io.openshift.expose-services=, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-07-21T14:48:37, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a-4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step5, version=17.1.9, batch=17.1_20250721.1, distribution-scope=public, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-compute, maintainer=OpenStack TripleO Team) Oct 14 04:34:00 
localhost systemd[1]: ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2.service: Deactivated successfully. Oct 14 04:34:00 localhost podman[81993]: 2025-10-14 08:34:00.798320712 +0000 UTC m=+0.136250670 container health_status 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, release=1, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, tcib_managed=true, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, io.buildah.version=1.33.12, batch=17.1_20250721.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.9, build-date=2025-07-21T16:28:53, config_id=tripleo_step4, vcs-type=git, io.openshift.expose-services=, maintainer=OpenStack TripleO Team) Oct 14 04:34:00 localhost podman[81994]: 2025-10-14 08:34:00.842634878 +0000 UTC m=+0.181034220 container health_status a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, com.redhat.component=openstack-ovn-controller-container, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20250721.1, name=rhosp17/openstack-ovn-controller, managed_by=tripleo_ansible, tcib_managed=true, container_name=ovn_controller, 
build-date=2025-07-21T13:28:44, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team, version=17.1.9, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vendor=Red Hat, Inc., distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ovn-controller) Oct 14 04:34:00 localhost podman[81994]: 2025-10-14 08:34:00.86844312 +0000 UTC m=+0.206842472 container exec_died a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, managed_by=tripleo_ansible, name=rhosp17/openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, release=1, io.openshift.tags=rhosp osp openstack osp-17.1, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, io.openshift.expose-services=, architecture=x86_64, build-date=2025-07-21T13:28:44, tcib_managed=true, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, maintainer=OpenStack TripleO Team, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, config_id=tripleo_step4, vendor=Red Hat, Inc., 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git) Oct 14 04:34:00 localhost systemd[1]: a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5.service: Deactivated successfully. Oct 14 04:34:00 localhost podman[81993]: 2025-10-14 08:34:00.894126077 +0000 UTC m=+0.232056005 container exec_died 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, io.openshift.expose-services=, name=rhosp17/openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1, com.redhat.license_terms=https://www.redhat.com/agreements, version=17.1.9, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.buildah.version=1.33.12, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., architecture=x86_64, tcib_managed=true, 
build-date=2025-07-21T16:28:53, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, container_name=ovn_metadata_agent) Oct 14 04:34:00 localhost systemd[1]: 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e.service: Deactivated successfully. Oct 14 04:34:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6. 
Oct 14 04:34:05 localhost podman[82065]: 2025-10-14 08:34:05.736837845 +0000 UTC m=+0.077524507 container health_status 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, build-date=2025-07-21T13:07:59, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, batch=17.1_20250721.1, io.buildah.version=1.33.12, release=1, architecture=x86_64, version=17.1.9, managed_by=tripleo_ansible, name=rhosp17/openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', 
'/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.expose-services=, tcib_managed=true, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1, container_name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1) Oct 14 04:34:05 localhost podman[82065]: 2025-10-14 08:34:05.952807669 +0000 UTC m=+0.293494281 container exec_died 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, tcib_managed=true, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, 
managed_by=tripleo_ansible, batch=17.1_20250721.1, io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T13:07:59, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.9, name=rhosp17/openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, container_name=metrics_qdr, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, config_id=tripleo_step1, release=1, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements) Oct 14 04:34:05 localhost systemd[1]: 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6.service: Deactivated successfully. Oct 14 04:34:20 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Oct 14 04:34:20 localhost recover_tripleo_nova_virtqemud[82095]: 62551 Oct 14 04:34:20 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Oct 14 04:34:20 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. Oct 14 04:34:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8. 
Oct 14 04:34:21 localhost podman[82096]: 2025-10-14 08:34:21.749952065 +0000 UTC m=+0.089155567 container health_status 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, architecture=x86_64, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, container_name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, version=17.1.9, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, tcib_managed=true, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, distribution-scope=public, io.openshift.expose-services=, build-date=2025-07-21T13:04:03, managed_by=tripleo_ansible, batch=17.1_20250721.1, release=2, vcs-type=git) Oct 14 04:34:21 localhost podman[82096]: 2025-10-14 08:34:21.763127765 +0000 UTC m=+0.102331267 container exec_died 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, com.redhat.component=openstack-collectd-container, distribution-scope=public, maintainer=OpenStack TripleO Team, tcib_managed=true, build-date=2025-07-21T13:04:03, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, vcs-type=git, name=rhosp17/openstack-collectd, config_id=tripleo_step3, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1, release=2, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., batch=17.1_20250721.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, version=17.1.9, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.expose-services=) Oct 14 04:34:21 localhost systemd[1]: 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8.service: Deactivated successfully. Oct 14 04:34:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea. 
Oct 14 04:34:22 localhost podman[82138]: 2025-10-14 08:34:22.74166036 +0000 UTC m=+0.077815816 container health_status 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.9, description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, config_id=tripleo_step3, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, batch=17.1_20250721.1, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, architecture=x86_64, build-date=2025-07-21T13:27:15, tcib_managed=true, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, container_name=iscsid, vcs-type=git, io.openshift.expose-services=, io.buildah.version=1.33.12, managed_by=tripleo_ansible, release=1, com.redhat.component=openstack-iscsid-container) Oct 14 04:34:22 localhost podman[82138]: 2025-10-14 08:34:22.752959352 +0000 UTC m=+0.089114808 container exec_died 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-iscsid, distribution-scope=public, release=1, com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/agreements, version=17.1.9, tcib_managed=true, vendor=Red Hat, Inc., vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-type=git, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.buildah.version=1.33.12, batch=17.1_20250721.1, build-date=2025-07-21T13:27:15) Oct 14 04:34:22 localhost systemd[1]: 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea.service: Deactivated successfully. Oct 14 04:34:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638. Oct 14 04:34:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884. Oct 14 04:34:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f. Oct 14 04:34:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e. Oct 14 04:34:27 localhost systemd[1]: tmp-crun.APtgpc.mount: Deactivated successfully. Oct 14 04:34:27 localhost systemd[1]: tmp-crun.pEvEYG.mount: Deactivated successfully. 
Oct 14 04:34:27 localhost podman[82182]: 2025-10-14 08:34:27.806199326 +0000 UTC m=+0.138733758 container health_status d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, config_id=tripleo_step4, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, version=17.1.9, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 cron, build-date=2025-07-21T13:07:52, vcs-type=git, vendor=Red Hat, Inc., tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, com.redhat.component=openstack-cron-container, managed_by=tripleo_ansible, name=rhosp17/openstack-cron, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack 
Platform 17.1 cron, maintainer=OpenStack TripleO Team, release=1, io.openshift.expose-services=, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond) Oct 14 04:34:27 localhost podman[82182]: 2025-10-14 08:34:27.813479302 +0000 UTC m=+0.146013724 container exec_died d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, container_name=logrotate_crond, managed_by=tripleo_ansible, version=17.1.9, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, build-date=2025-07-21T13:07:52, vcs-type=git, com.redhat.component=openstack-cron-container, io.buildah.version=1.33.12, description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, release=1, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, io.openshift.expose-services=, batch=17.1_20250721.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, architecture=x86_64, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, summary=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true) Oct 14 04:34:27 localhost systemd[1]: d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f.service: Deactivated successfully. Oct 14 04:34:27 localhost podman[82180]: 2025-10-14 08:34:27.790812288 +0000 UTC m=+0.129285994 container health_status 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, 
io.buildah.version=1.33.12, release=1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-07-21T15:29:47, io.openshift.expose-services=, managed_by=tripleo_ansible, vcs-type=git, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-ipmi, version=17.1.9, container_name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, architecture=x86_64, tcib_managed=true, com.redhat.component=openstack-ceilometer-ipmi-container, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, com.redhat.license_terms=https://www.redhat.com/agreements, batch=17.1_20250721.1) Oct 14 04:34:27 localhost podman[82183]: 2025-10-14 08:34:27.862944658 +0000 UTC m=+0.190432473 container health_status def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, tcib_managed=true, name=rhosp17/openstack-ceilometer-compute, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1, batch=17.1_20250721.1, build-date=2025-07-21T14:45:33, com.redhat.component=openstack-ceilometer-compute-container, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, io.openshift.expose-services=, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.9, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_compute, distribution-scope=public, io.buildah.version=1.33.12, vcs-type=git) Oct 14 04:34:27 localhost podman[82180]: 2025-10-14 08:34:27.876479527 +0000 UTC m=+0.214953203 container exec_died 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.33.12, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, com.redhat.license_terms=https://www.redhat.com/agreements, release=1, distribution-scope=public, build-date=2025-07-21T15:29:47, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_ipmi, tcib_managed=true, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, com.redhat.component=openstack-ceilometer-ipmi-container, version=17.1.9, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20250721.1, name=rhosp17/openstack-ceilometer-ipmi, managed_by=tripleo_ansible) Oct 14 04:34:27 localhost podman[82181]: 2025-10-14 08:34:27.826474716 +0000 UTC m=+0.159952068 container health_status 
7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, architecture=x86_64, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, build-date=2025-07-21T14:48:37, config_id=tripleo_step4, container_name=nova_migration_target, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, batch=17.1_20250721.1, release=1, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, version=17.1.9, tcib_managed=true) Oct 14 04:34:27 localhost systemd[1]: 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638.service: Deactivated successfully. Oct 14 04:34:27 localhost podman[82183]: 2025-10-14 08:34:27.897152679 +0000 UTC m=+0.224640524 container exec_died def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, distribution-scope=public, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, build-date=2025-07-21T14:45:33, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, container_name=ceilometer_agent_compute, managed_by=tripleo_ansible, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, tcib_managed=true, com.redhat.component=openstack-ceilometer-compute-container, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1, io.openshift.expose-services=, batch=17.1_20250721.1, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, architecture=x86_64, name=rhosp17/openstack-ceilometer-compute, version=17.1.9, release=1) Oct 14 04:34:27 localhost systemd[1]: def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e.service: Deactivated successfully. 
Oct 14 04:34:28 localhost podman[82181]: 2025-10-14 08:34:28.209389592 +0000 UTC m=+0.542866944 container exec_died 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, architecture=x86_64, managed_by=tripleo_ansible, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, container_name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, version=17.1.9, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, build-date=2025-07-21T14:48:37, vcs-type=git, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.33.12, batch=17.1_20250721.1, release=1, config_id=tripleo_step4) Oct 14 04:34:28 localhost systemd[1]: 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884.service: Deactivated successfully. Oct 14 04:34:30 localhost sshd[82274]: main: sshd: ssh-rsa algorithm is disabled Oct 14 04:34:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e. Oct 14 04:34:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5. Oct 14 04:34:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2. Oct 14 04:34:31 localhost systemd[1]: tmp-crun.CxC0ol.mount: Deactivated successfully. 
Oct 14 04:34:31 localhost podman[82276]: 2025-10-14 08:34:31.045695487 +0000 UTC m=+0.107291922 container health_status 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-07-21T16:28:53, version=17.1.9, architecture=x86_64, config_id=tripleo_step4, batch=17.1_20250721.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.buildah.version=1.33.12, release=1, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, managed_by=tripleo_ansible, container_name=ovn_metadata_agent, io.openshift.expose-services=, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}) Oct 14 04:34:31 localhost podman[82277]: 2025-10-14 08:34:31.022091564 +0000 UTC m=+0.085037070 container health_status a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-ovn-controller, batch=17.1_20250721.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, 
config_id=tripleo_step4, vendor=Red Hat, Inc., io.openshift.expose-services=, version=17.1.9, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, com.redhat.license_terms=https://www.redhat.com/agreements, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, vcs-type=git, container_name=ovn_controller, io.buildah.version=1.33.12, build-date=2025-07-21T13:28:44, distribution-scope=public, com.redhat.component=openstack-ovn-controller-container, release=1, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ovn-controller) Oct 14 04:34:31 localhost podman[82277]: 2025-10-14 08:34:31.101789409 +0000 UTC m=+0.164734915 container exec_died a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, maintainer=OpenStack TripleO Team, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, com.redhat.component=openstack-ovn-controller-container, release=1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1, version=17.1.9, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, vcs-type=git, batch=17.1_20250721.1, container_name=ovn_controller, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, architecture=x86_64, build-date=2025-07-21T13:28:44, managed_by=tripleo_ansible) Oct 14 04:34:31 localhost systemd[1]: a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5.service: Deactivated successfully. Oct 14 04:34:31 localhost podman[82276]: 2025-10-14 08:34:31.11601983 +0000 UTC m=+0.177615985 container exec_died 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, distribution-scope=public, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, batch=17.1_20250721.1, tcib_managed=true, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc., build-date=2025-07-21T16:28:53, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=ovn_metadata_agent, name=rhosp17/openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, version=17.1.9, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3) Oct 14 04:34:31 localhost podman[82278]: 2025-10-14 08:34:31.074944245 +0000 UTC m=+0.133687491 container health_status ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, release=1, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.license_terms=https://www.redhat.com/agreements, 
tcib_managed=true, vendor=Red Hat, Inc., build-date=2025-07-21T14:48:37, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a-4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', 
'/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, architecture=x86_64, vcs-type=git, container_name=nova_compute, distribution-scope=public, version=17.1.9, batch=17.1_20250721.1, config_id=tripleo_step5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=) Oct 14 04:34:31 localhost systemd[1]: 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e.service: Deactivated successfully. Oct 14 04:34:31 localhost podman[82278]: 2025-10-14 08:34:31.158764497 +0000 UTC m=+0.217507783 container exec_died ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, build-date=2025-07-21T14:48:37, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a-4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, container_name=nova_compute, batch=17.1_20250721.1, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, distribution-scope=public, version=17.1.9, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, io.openshift.expose-services=, tcib_managed=true, vendor=Red Hat, Inc., vcs-type=git, config_id=tripleo_step5) Oct 14 04:34:31 localhost systemd[1]: ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2.service: Deactivated successfully. Oct 14 04:34:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6. 
Oct 14 04:34:36 localhost podman[82349]: 2025-10-14 08:34:36.739498434 +0000 UTC m=+0.083025797 container health_status 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-qdrouterd, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, io.buildah.version=1.33.12, tcib_managed=true, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', 
'/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, version=17.1.9, architecture=x86_64, build-date=2025-07-21T13:07:59, batch=17.1_20250721.1, distribution-scope=public, config_id=tripleo_step1, container_name=metrics_qdr, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc.) Oct 14 04:34:36 localhost podman[82349]: 2025-10-14 08:34:36.92201772 +0000 UTC m=+0.265545133 container exec_died 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, tcib_managed=true, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, config_id=tripleo_step1, container_name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, 
io.buildah.version=1.33.12, name=rhosp17/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20250721.1, build-date=2025-07-21T13:07:59, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, managed_by=tripleo_ansible, release=1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.9, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=openstack-qdrouterd-container, io.openshift.tags=rhosp osp openstack osp-17.1, distribution-scope=public) Oct 14 04:34:36 localhost systemd[1]: 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6.service: Deactivated successfully. Oct 14 04:34:49 localhost python3[82455]: ansible-ansible.legacy.command Invoked with _raw_params=subscription-manager repos --disable rhceph-7-tools-for-rhel-9-x86_64-rpms _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Oct 14 04:34:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8. 
Oct 14 04:34:52 localhost podman[82592]: 2025-10-14 08:34:52.739354073 +0000 UTC m=+0.076270579 container health_status 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, version=17.1.9, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, name=rhosp17/openstack-collectd, container_name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, release=2, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, architecture=x86_64, vendor=Red 
Hat, Inc., distribution-scope=public, io.buildah.version=1.33.12, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=openstack-collectd-container, io.openshift.expose-services=, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, build-date=2025-07-21T13:04:03, description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible) Oct 14 04:34:52 localhost podman[82592]: 2025-10-14 08:34:52.753010746 +0000 UTC m=+0.089927262 container exec_died 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, release=2, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, architecture=x86_64, build-date=2025-07-21T13:04:03, com.redhat.license_terms=https://www.redhat.com/agreements, tcib_managed=true, batch=17.1_20250721.1, com.redhat.component=openstack-collectd-container, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, distribution-scope=public, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.9, io.buildah.version=1.33.12, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., name=rhosp17/openstack-collectd, container_name=collectd) Oct 14 04:34:52 localhost systemd[1]: 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8.service: Deactivated successfully. Oct 14 04:34:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea. 
Oct 14 04:34:52 localhost podman[82612]: 2025-10-14 08:34:52.854974572 +0000 UTC m=+0.064125091 container health_status 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, io.buildah.version=1.33.12, batch=17.1_20250721.1, vendor=Red Hat, Inc., name=rhosp17/openstack-iscsid, com.redhat.component=openstack-iscsid-container, tcib_managed=true, version=17.1.9, build-date=2025-07-21T13:27:15, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.expose-services=, 
managed_by=tripleo_ansible, container_name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, distribution-scope=public, vcs-type=git, config_id=tripleo_step3, release=1, summary=Red Hat OpenStack Platform 17.1 iscsid) Oct 14 04:34:52 localhost podman[82612]: 2025-10-14 08:34:52.894395616 +0000 UTC m=+0.103546125 container exec_died 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, batch=17.1_20250721.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, managed_by=tripleo_ansible, version=17.1.9, com.redhat.component=openstack-iscsid-container, maintainer=OpenStack TripleO Team, architecture=x86_64, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, 
config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, com.redhat.license_terms=https://www.redhat.com/agreements, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, build-date=2025-07-21T13:27:15, release=1, name=rhosp17/openstack-iscsid, io.buildah.version=1.33.12, container_name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc.) Oct 14 04:34:52 localhost systemd[1]: 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea.service: Deactivated successfully. Oct 14 04:34:53 localhost rhsm-service[6497]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server. Oct 14 04:34:54 localhost rhsm-service[6497]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server. Oct 14 04:34:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638. Oct 14 04:34:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884. Oct 14 04:34:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f. Oct 14 04:34:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e. Oct 14 04:34:58 localhost systemd[1]: tmp-crun.x79XSI.mount: Deactivated successfully. 
Oct 14 04:34:58 localhost podman[82642]: 2025-10-14 08:34:58.769120882 +0000 UTC m=+0.103789773 container health_status d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, architecture=x86_64, batch=17.1_20250721.1, com.redhat.component=openstack-cron-container, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, build-date=2025-07-21T13:07:52, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, summary=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., vcs-type=git, version=17.1.9, io.openshift.expose-services=, maintainer=OpenStack TripleO 
Team, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, name=rhosp17/openstack-cron, distribution-scope=public, release=1, config_id=tripleo_step4, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron) Oct 14 04:34:58 localhost podman[82641]: 2025-10-14 08:34:58.807195504 +0000 UTC m=+0.142309159 container health_status 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, batch=17.1_20250721.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, container_name=nova_migration_target, name=rhosp17/openstack-nova-compute, version=17.1.9, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc., 
build-date=2025-07-21T14:48:37, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, distribution-scope=public, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, release=1, com.redhat.component=openstack-nova-compute-container) Oct 14 04:34:58 localhost podman[82642]: 2025-10-14 08:34:58.848881227 +0000 UTC m=+0.183550118 container exec_died d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, tcib_managed=true, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, 
io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, vendor=Red Hat, Inc., com.redhat.component=openstack-cron-container, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step4, version=17.1.9, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1, batch=17.1_20250721.1, release=1, io.openshift.expose-services=, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 cron, build-date=2025-07-21T13:07:52, container_name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, name=rhosp17/openstack-cron) Oct 14 04:34:58 localhost podman[82640]: 2025-10-14 08:34:58.863938255 +0000 UTC m=+0.201639440 container health_status 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, name=rhosp17/openstack-ceilometer-ipmi, vcs-type=git, com.redhat.component=openstack-ceilometer-ipmi-container, batch=17.1_20250721.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.33.12, build-date=2025-07-21T15:29:47, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc., distribution-scope=public, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 
ceilometer-ipmi, version=17.1.9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1, tcib_managed=true, architecture=x86_64, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible) Oct 14 04:34:58 localhost systemd[1]: d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f.service: Deactivated successfully. 
Oct 14 04:34:58 localhost podman[82643]: 2025-10-14 08:34:58.920991246 +0000 UTC m=+0.251234499 container health_status def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.9, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, build-date=2025-07-21T14:45:33, config_id=tripleo_step4, com.redhat.component=openstack-ceilometer-compute-container, managed_by=tripleo_ansible, io.buildah.version=1.33.12, name=rhosp17/openstack-ceilometer-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1, vcs-type=git, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, container_name=ceilometer_agent_compute) Oct 14 04:34:58 localhost podman[82640]: 2025-10-14 08:34:58.925243758 +0000 UTC m=+0.262945003 container exec_died 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, name=rhosp17/openstack-ceilometer-ipmi, vcs-type=git, io.openshift.expose-services=, architecture=x86_64, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, build-date=2025-07-21T15:29:47, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, distribution-scope=public, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1, com.redhat.component=openstack-ceilometer-ipmi-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, tcib_managed=true, container_name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.9, batch=17.1_20250721.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc.) Oct 14 04:34:58 localhost systemd[1]: 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638.service: Deactivated successfully. Oct 14 04:34:58 localhost podman[82643]: 2025-10-14 08:34:58.9610482 +0000 UTC m=+0.291291423 container exec_died def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, release=1, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, name=rhosp17/openstack-ceilometer-compute, container_name=ceilometer_agent_compute, config_id=tripleo_step4, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-compute-container, tcib_managed=true, vcs-type=git, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, distribution-scope=public, maintainer=OpenStack TripleO Team, build-date=2025-07-21T14:45:33, version=17.1.9, io.openshift.tags=rhosp osp openstack osp-17.1, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, batch=17.1_20250721.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}) Oct 14 04:34:58 localhost systemd[1]: def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e.service: Deactivated successfully. 
Oct 14 04:34:59 localhost podman[82641]: 2025-10-14 08:34:59.20653086 +0000 UTC m=+0.541644535 container exec_died 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, build-date=2025-07-21T14:48:37, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, architecture=x86_64, config_id=tripleo_step4, batch=17.1_20250721.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.33.12, distribution-scope=public, name=rhosp17/openstack-nova-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, release=1, com.redhat.license_terms=https://www.redhat.com/agreements, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, version=17.1.9, com.redhat.component=openstack-nova-compute-container, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_migration_target) Oct 14 04:34:59 localhost systemd[1]: 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884.service: Deactivated successfully. Oct 14 04:35:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e. Oct 14 04:35:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5. Oct 14 04:35:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2. Oct 14 04:35:01 localhost podman[82794]: 2025-10-14 08:35:01.77399524 +0000 UTC m=+0.085720452 container health_status a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, name=rhosp17/openstack-ovn-controller, vcs-type=git, batch=17.1_20250721.1, io.openshift.expose-services=, release=1, tcib_managed=true, build-date=2025-07-21T13:28:44, managed_by=tripleo_ansible, vendor=Red Hat, Inc., config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, version=17.1.9, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, container_name=ovn_controller, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, com.redhat.component=openstack-ovn-controller-container, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller) Oct 14 04:35:01 localhost podman[82794]: 2025-10-14 08:35:01.82424414 +0000 UTC m=+0.135969382 container exec_died a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, batch=17.1_20250721.1, com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, architecture=x86_64, vendor=Red Hat, Inc., tcib_managed=true, build-date=2025-07-21T13:28:44, version=17.1.9, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, vcs-type=git, container_name=ovn_controller, name=rhosp17/openstack-ovn-controller, release=1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, 
config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, summary=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1) Oct 14 04:35:01 localhost systemd[1]: a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5.service: Deactivated successfully. Oct 14 04:35:01 localhost podman[82795]: 2025-10-14 08:35:01.876604035 +0000 UTC m=+0.186576222 container health_status ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, managed_by=tripleo_ansible, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=nova_compute, io.buildah.version=1.33.12, tcib_managed=true, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a-4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 
'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1, batch=17.1_20250721.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.9, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, build-date=2025-07-21T14:48:37, config_id=tripleo_step5, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, name=rhosp17/openstack-nova-compute) Oct 14 04:35:01 localhost podman[82795]: 2025-10-14 
08:35:01.906166313 +0000 UTC m=+0.216138500 container exec_died ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, distribution-scope=public, build-date=2025-07-21T14:48:37, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhosp17/openstack-nova-compute, container_name=nova_compute, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a-4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', 
'/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, release=1, config_id=tripleo_step5, version=17.1.9) Oct 14 04:35:01 localhost systemd[1]: ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2.service: Deactivated successfully. 
Oct 14 04:35:01 localhost podman[82793]: 2025-10-14 08:35:01.828159502 +0000 UTC m=+0.144611191 container health_status 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-07-21T16:28:53, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, 
tcib_managed=true, version=17.1.9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhosp17/openstack-neutron-metadata-agent-ovn, architecture=x86_64, batch=17.1_20250721.1, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, vcs-type=git, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, io.buildah.version=1.33.12, vendor=Red Hat, Inc., container_name=ovn_metadata_agent) Oct 14 04:35:01 localhost podman[82793]: 2025-10-14 08:35:01.964255507 +0000 UTC m=+0.280707166 container exec_died 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, managed_by=tripleo_ansible, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-07-21T16:28:53, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, container_name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20250721.1, version=17.1.9, 
config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, tcib_managed=true, io.buildah.version=1.33.12, io.openshift.expose-services=, release=1) Oct 14 04:35:01 localhost systemd[1]: 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e.service: Deactivated successfully. Oct 14 04:35:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6. Oct 14 04:35:07 localhost systemd[1]: tmp-crun.lldpId.mount: Deactivated successfully. 
Oct 14 04:35:07 localhost podman[82868]: 2025-10-14 08:35:07.754604032 +0000 UTC m=+0.097026563 container health_status 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, batch=17.1_20250721.1, io.buildah.version=1.33.12, io.openshift.expose-services=, com.redhat.component=openstack-qdrouterd-container, name=rhosp17/openstack-qdrouterd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, io.openshift.tags=rhosp osp openstack osp-17.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, config_id=tripleo_step1, container_name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-07-21T13:07:59, 
com.redhat.license_terms=https://www.redhat.com/agreements, tcib_managed=true, vendor=Red Hat, Inc., version=17.1.9, description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, maintainer=OpenStack TripleO Team, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, release=1, distribution-scope=public, managed_by=tripleo_ansible) Oct 14 04:35:07 localhost podman[82868]: 2025-10-14 08:35:07.996920654 +0000 UTC m=+0.339343185 container exec_died 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vcs-type=git, vendor=Red Hat, Inc., managed_by=tripleo_ansible, batch=17.1_20250721.1, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1, build-date=2025-07-21T13:07:59, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, distribution-scope=public, architecture=x86_64, com.redhat.component=openstack-qdrouterd-container, version=17.1.9, tcib_managed=true, container_name=metrics_qdr, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-qdrouterd, config_id=tripleo_step1) Oct 14 04:35:08 localhost systemd[1]: 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6.service: Deactivated successfully. Oct 14 04:35:16 localhost python3[82911]: ansible-ansible.builtin.slurp Invoked with path=/home/zuul/ansible_hostname src=/home/zuul/ansible_hostname Oct 14 04:35:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8. Oct 14 04:35:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea. Oct 14 04:35:23 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Oct 14 04:35:23 localhost recover_tripleo_nova_virtqemud[82969]: 62551 Oct 14 04:35:23 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Oct 14 04:35:23 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. 
Oct 14 04:35:23 localhost podman[82957]: 2025-10-14 08:35:23.751976776 +0000 UTC m=+0.084520604 container health_status 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, io.buildah.version=1.33.12, name=rhosp17/openstack-iscsid, tcib_managed=true, vcs-type=git, release=1, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, container_name=iscsid, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, build-date=2025-07-21T13:27:15, com.redhat.component=openstack-iscsid-container, version=17.1.9, io.openshift.tags=rhosp osp openstack osp-17.1) Oct 14 04:35:23 localhost podman[82957]: 2025-10-14 08:35:23.763016979 +0000 UTC m=+0.095560797 container exec_died 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, vcs-type=git, com.redhat.component=openstack-iscsid-container, release=1, summary=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, vendor=Red Hat, Inc., managed_by=tripleo_ansible, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true, build-date=2025-07-21T13:27:15, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, container_name=iscsid, batch=17.1_20250721.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, version=17.1.9, maintainer=OpenStack TripleO Team) Oct 14 04:35:23 localhost systemd[1]: 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea.service: Deactivated successfully. Oct 14 04:35:23 localhost podman[82956]: 2025-10-14 08:35:23.848870534 +0000 UTC m=+0.183284571 container health_status 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true, batch=17.1_20250721.1, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.9, distribution-scope=public, vcs-type=git, container_name=collectd, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, com.redhat.component=openstack-collectd-container, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-07-21T13:04:03, config_id=tripleo_step3, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, release=2, maintainer=OpenStack TripleO Team, io.openshift.expose-services=) Oct 14 04:35:23 localhost podman[82956]: 2025-10-14 08:35:23.882066974 +0000 UTC m=+0.216480991 container exec_died 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.9, distribution-scope=public, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, 
com.redhat.component=openstack-collectd-container, architecture=x86_64, batch=17.1_20250721.1, config_id=tripleo_step3, release=2, managed_by=tripleo_ansible, container_name=collectd, name=rhosp17/openstack-collectd, tcib_managed=true, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, summary=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., build-date=2025-07-21T13:04:03, io.openshift.expose-services=, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1) Oct 14 04:35:23 localhost systemd[1]: 
0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8.service: Deactivated successfully. Oct 14 04:35:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638. Oct 14 04:35:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884. Oct 14 04:35:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f. Oct 14 04:35:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e. Oct 14 04:35:29 localhost podman[82996]: 2025-10-14 08:35:29.741440531 +0000 UTC m=+0.078858789 container health_status 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, config_id=tripleo_step4, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-07-21T15:29:47, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1, com.redhat.component=openstack-ceilometer-ipmi-container, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, architecture=x86_64, container_name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, tcib_managed=true, version=17.1.9, io.openshift.expose-services=, batch=17.1_20250721.1, distribution-scope=public, name=rhosp17/openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi) Oct 14 04:35:29 localhost podman[82997]: 2025-10-14 08:35:29.792828496 +0000 UTC m=+0.127953302 container health_status 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/agreements, batch=17.1_20250721.1, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp17/openstack-nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, container_name=nova_migration_target, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-07-21T14:48:37, config_id=tripleo_step4, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, com.redhat.component=openstack-nova-compute-container, version=17.1.9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.buildah.version=1.33.12, vcs-type=git, architecture=x86_64, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team) Oct 14 04:35:29 localhost podman[82999]: 2025-10-14 08:35:29.860531699 +0000 UTC m=+0.189428892 container health_status def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.9, 
description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, batch=17.1_20250721.1, io.buildah.version=1.33.12, container_name=ceilometer_agent_compute, distribution-scope=public, architecture=x86_64, com.redhat.component=openstack-ceilometer-compute-container, vcs-type=git, managed_by=tripleo_ansible, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, release=1, build-date=2025-07-21T14:45:33, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, name=rhosp17/openstack-ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, io.openshift.expose-services=) Oct 14 04:35:29 localhost podman[82998]: 
2025-10-14 08:35:29.909432286 +0000 UTC m=+0.240386893 container health_status d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, architecture=x86_64, com.redhat.component=openstack-cron-container, vcs-type=git, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 cron, build-date=2025-07-21T13:07:52, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, distribution-scope=public, managed_by=tripleo_ansible, name=rhosp17/openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, config_id=tripleo_step4, io.openshift.expose-services=, version=17.1.9, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, vendor=Red Hat, Inc., 
com.redhat.license_terms=https://www.redhat.com/agreements, batch=17.1_20250721.1, release=1, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, io.buildah.version=1.33.12) Oct 14 04:35:29 localhost podman[82999]: 2025-10-14 08:35:29.918098425 +0000 UTC m=+0.246995628 container exec_died def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, architecture=x86_64, build-date=2025-07-21T14:45:33, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, vendor=Red Hat, Inc., io.openshift.expose-services=, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true, config_id=tripleo_step4, version=17.1.9, distribution-scope=public, io.buildah.version=1.33.12, container_name=ceilometer_agent_compute, batch=17.1_20250721.1, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp17/openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1, com.redhat.component=openstack-ceilometer-compute-container) Oct 14 04:35:29 localhost podman[82996]: 2025-10-14 08:35:29.929003694 +0000 UTC m=+0.266421892 container exec_died 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, name=rhosp17/openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, release=1, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, com.redhat.component=openstack-ceilometer-ipmi-container, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, config_id=tripleo_step4, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-07-21T15:29:47, batch=17.1_20250721.1, distribution-scope=public, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1, version=17.1.9, vendor=Red Hat, Inc.) Oct 14 04:35:29 localhost systemd[1]: def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e.service: Deactivated successfully. 
Oct 14 04:35:29 localhost podman[82998]: 2025-10-14 08:35:29.942459212 +0000 UTC m=+0.273413819 container exec_died d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, distribution-scope=public, com.redhat.component=openstack-cron-container, version=17.1.9, io.openshift.tags=rhosp osp openstack osp-17.1, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, vendor=Red Hat, Inc., name=rhosp17/openstack-cron, container_name=logrotate_crond, managed_by=tripleo_ansible, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, release=1, build-date=2025-07-21T13:07:52, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red 
Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, batch=17.1_20250721.1, tcib_managed=true) Oct 14 04:35:29 localhost systemd[1]: 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638.service: Deactivated successfully. Oct 14 04:35:29 localhost systemd[1]: d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f.service: Deactivated successfully. Oct 14 04:35:30 localhost podman[82997]: 2025-10-14 08:35:30.100985602 +0000 UTC m=+0.436110428 container exec_died 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, com.redhat.component=openstack-nova-compute-container, container_name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhosp17/openstack-nova-compute, vcs-type=git, version=17.1.9, vendor=Red Hat, Inc., batch=17.1_20250721.1, config_id=tripleo_step4, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.33.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, architecture=x86_64, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, build-date=2025-07-21T14:48:37, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d) Oct 14 04:35:30 localhost systemd[1]: 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884.service: Deactivated successfully. Oct 14 04:35:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e. Oct 14 04:35:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5. Oct 14 04:35:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2. Oct 14 04:35:32 localhost systemd[1]: tmp-crun.HExAx2.mount: Deactivated successfully. 
Oct 14 04:35:32 localhost podman[83086]: 2025-10-14 08:35:32.758799776 +0000 UTC m=+0.091549193 container health_status 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, vcs-type=git, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, io.buildah.version=1.33.12, vendor=Red Hat, Inc., distribution-scope=public, release=1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, build-date=2025-07-21T16:28:53, container_name=ovn_metadata_agent, version=17.1.9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, batch=17.1_20250721.1) Oct 14 04:35:32 localhost podman[83086]: 2025-10-14 08:35:32.809109068 +0000 UTC m=+0.141858465 container exec_died 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, io.openshift.expose-services=, vcs-type=git, tcib_managed=true, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, batch=17.1_20250721.1, version=17.1.9, release=1, build-date=2025-07-21T16:28:53, managed_by=tripleo_ansible, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, distribution-scope=public, container_name=ovn_metadata_agent, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Oct 14 04:35:32 localhost podman[83088]: 2025-10-14 08:35:32.816357864 +0000 UTC m=+0.142807525 container health_status ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, release=1, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, name=rhosp17/openstack-nova-compute, version=17.1.9, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., container_name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20250721.1, build-date=2025-07-21T14:48:37, distribution-scope=public, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step5, io.buildah.version=1.33.12, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a-4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1) Oct 14 04:35:32 localhost systemd[1]: 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e.service: Deactivated successfully. Oct 14 04:35:32 localhost podman[83088]: 2025-10-14 08:35:32.846566531 +0000 UTC m=+0.173016202 container exec_died ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, vcs-type=git, release=1, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, build-date=2025-07-21T14:48:37, name=rhosp17/openstack-nova-compute, version=17.1.9, container_name=nova_compute, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, config_id=tripleo_step5, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a-4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., batch=17.1_20250721.1, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=) Oct 14 04:35:32 localhost systemd[1]: ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2.service: Deactivated successfully. 
Oct 14 04:35:32 localhost podman[83087]: 2025-10-14 08:35:32.861965289 +0000 UTC m=+0.190694090 container health_status a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, release=1, name=rhosp17/openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20250721.1, tcib_managed=true, container_name=ovn_controller, distribution-scope=public, managed_by=tripleo_ansible, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, architecture=x86_64, com.redhat.component=openstack-ovn-controller-container, version=17.1.9, build-date=2025-07-21T13:28:44, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.buildah.version=1.33.12, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ovn-controller) Oct 14 04:35:32 localhost podman[83087]: 2025-10-14 08:35:32.889186444 +0000 
UTC m=+0.217915245 container exec_died a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/agreements, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, io.openshift.expose-services=, version=17.1.9, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-07-21T13:28:44, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, config_id=tripleo_step4, architecture=x86_64, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, batch=17.1_20250721.1, distribution-scope=public, maintainer=OpenStack TripleO Team, release=1, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, com.redhat.component=openstack-ovn-controller-container, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, name=rhosp17/openstack-ovn-controller, vendor=Red Hat, Inc.) Oct 14 04:35:32 localhost systemd[1]: a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5.service: Deactivated successfully. 
Oct 14 04:35:33 localhost systemd[1]: tmp-crun.YJzEHA.mount: Deactivated successfully. Oct 14 04:35:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6. Oct 14 04:35:38 localhost systemd[1]: tmp-crun.KUl7kA.mount: Deactivated successfully. Oct 14 04:35:38 localhost podman[83159]: 2025-10-14 08:35:38.739029015 +0000 UTC m=+0.078742245 container health_status 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.openshift.expose-services=, container_name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, batch=17.1_20250721.1, version=17.1.9, distribution-scope=public, architecture=x86_64, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, com.redhat.component=openstack-qdrouterd-container, vcs-type=git, build-date=2025-07-21T13:07:59, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, managed_by=tripleo_ansible, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, release=1, tcib_managed=true) Oct 14 04:35:38 localhost podman[83159]: 2025-10-14 08:35:38.960396157 +0000 UTC m=+0.300109317 container exec_died 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, build-date=2025-07-21T13:07:59, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.33.12, release=1, tcib_managed=true, batch=17.1_20250721.1, version=17.1.9, name=rhosp17/openstack-qdrouterd, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, config_id=tripleo_step1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, architecture=x86_64, com.redhat.component=openstack-qdrouterd-container, io.openshift.expose-services=, container_name=metrics_qdr, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc.) Oct 14 04:35:38 localhost systemd[1]: 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6.service: Deactivated successfully. 
Oct 14 04:35:50 localhost podman[83290]: 2025-10-14 08:35:50.110384328 +0000 UTC m=+0.084400521 container exec b19a91314e045cbe5465f6abdeb6b420108fcda7860b498b01dcd4e65236d2c8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-crash-np0005486733, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux , name=rhceph, io.openshift.tags=rhceph ceph, release=553, GIT_CLEAN=True, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, ceph=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, version=7, CEPH_POINT_RELEASE=, GIT_BRANCH=main, RELEASE=main, vendor=Red Hat, Inc., architecture=x86_64, build-date=2025-09-24T08:57:55, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, io.buildah.version=1.33.12) Oct 14 04:35:50 localhost podman[83290]: 2025-10-14 08:35:50.213895001 +0000 UTC m=+0.187911204 container exec_died b19a91314e045cbe5465f6abdeb6b420108fcda7860b498b01dcd4e65236d2c8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-crash-np0005486733, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=rhceph-container, name=rhceph, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-09-24T08:57:55, io.k8s.display-name=Red Hat Ceph Storage 7 on 
RHEL 9, io.buildah.version=1.33.12, distribution-scope=public, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, release=553, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, RELEASE=main, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux , vendor=Red Hat, Inc., vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, version=7, io.openshift.tags=rhceph ceph, GIT_CLEAN=True) Oct 14 04:35:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8. Oct 14 04:35:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea. Oct 14 04:35:54 localhost podman[83435]: 2025-10-14 08:35:54.729639821 +0000 UTC m=+0.072413499 container health_status 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, batch=17.1_20250721.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, vcs-type=git, architecture=x86_64, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, config_id=tripleo_step3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhosp17/openstack-collectd, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, io.buildah.version=1.33.12, build-date=2025-07-21T13:04:03, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.9, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, io.openshift.expose-services=, container_name=collectd, managed_by=tripleo_ansible, com.redhat.component=openstack-collectd-container, release=2) Oct 14 04:35:54 localhost podman[83435]: 2025-10-14 08:35:54.767073483 +0000 UTC m=+0.109847161 container exec_died 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': 
'/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, version=17.1.9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, batch=17.1_20250721.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, architecture=x86_64, distribution-scope=public, config_id=tripleo_step3, com.redhat.component=openstack-collectd-container, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-07-21T13:04:03, container_name=collectd, release=2, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1, io.buildah.version=1.33.12, tcib_managed=true, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, vcs-type=git) Oct 14 04:35:54 
localhost systemd[1]: 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8.service: Deactivated successfully. Oct 14 04:35:54 localhost podman[83436]: 2025-10-14 08:35:54.739984682 +0000 UTC m=+0.079346984 container health_status 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, maintainer=OpenStack TripleO Team, distribution-scope=public, architecture=x86_64, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, io.buildah.version=1.33.12, vendor=Red Hat, Inc., com.redhat.component=openstack-iscsid-container, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, release=1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, build-date=2025-07-21T13:27:15, container_name=iscsid, name=rhosp17/openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/agreements, version=17.1.9, batch=17.1_20250721.1, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid) Oct 14 04:35:54 localhost podman[83436]: 2025-10-14 08:35:54.825251619 +0000 UTC m=+0.164613941 container exec_died 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, build-date=2025-07-21T13:27:15, container_name=iscsid, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, tcib_managed=true, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, io.buildah.version=1.33.12, version=17.1.9, batch=17.1_20250721.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, config_id=tripleo_step3, release=1, vcs-type=git) Oct 14 04:35:54 localhost systemd[1]: 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea.service: Deactivated successfully. Oct 14 04:36:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638. Oct 14 04:36:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884. Oct 14 04:36:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f. Oct 14 04:36:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e. Oct 14 04:36:00 localhost systemd[1]: tmp-crun.ajYCbW.mount: Deactivated successfully. Oct 14 04:36:00 localhost systemd[1]: tmp-crun.HuB4Bj.mount: Deactivated successfully. 
Oct 14 04:36:00 localhost podman[83476]: 2025-10-14 08:36:00.773158874 +0000 UTC m=+0.074589316 container health_status d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, name=rhosp17/openstack-cron, vcs-type=git, tcib_managed=true, batch=17.1_20250721.1, maintainer=OpenStack TripleO Team, architecture=x86_64, container_name=logrotate_crond, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, config_id=tripleo_step4, release=1, build-date=2025-07-21T13:07:52, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 cron, version=17.1.9, io.openshift.tags=rhosp osp openstack osp-17.1, 
com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., com.redhat.component=openstack-cron-container, distribution-scope=public, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1) Oct 14 04:36:00 localhost podman[83476]: 2025-10-14 08:36:00.811101342 +0000 UTC m=+0.112531784 container exec_died d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.openshift.expose-services=, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, architecture=x86_64, tcib_managed=true, com.redhat.component=openstack-cron-container, io.buildah.version=1.33.12, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, container_name=logrotate_crond, vendor=Red Hat, Inc., config_id=tripleo_step4, name=rhosp17/openstack-cron, maintainer=OpenStack TripleO Team, release=1, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, build-date=2025-07-21T13:07:52, description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, version=17.1.9) Oct 14 04:36:00 localhost systemd[1]: d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f.service: Deactivated successfully. Oct 14 04:36:00 localhost podman[83475]: 2025-10-14 08:36:00.834598301 +0000 UTC m=+0.175750277 container health_status 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, managed_by=tripleo_ansible, container_name=nova_migration_target, com.redhat.component=openstack-nova-compute-container, release=1, tcib_managed=true, vendor=Red Hat, Inc., vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, config_id=tripleo_step4, distribution-scope=public, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 
'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.9, batch=17.1_20250721.1, build-date=2025-07-21T14:48:37, name=rhosp17/openstack-nova-compute) Oct 14 04:36:00 localhost podman[83474]: 2025-10-14 08:36:00.749110008 +0000 UTC m=+0.095386962 container health_status 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.33.12, version=17.1.9, build-date=2025-07-21T15:29:47, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, vendor=Red Hat, Inc., container_name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20250721.1, managed_by=tripleo_ansible, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, tcib_managed=true, name=rhosp17/openstack-ceilometer-ipmi, release=1, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, com.redhat.component=openstack-ceilometer-ipmi-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, com.redhat.license_terms=https://www.redhat.com/agreements) Oct 14 04:36:00 localhost podman[83474]: 2025-10-14 08:36:00.885275754 +0000 UTC m=+0.231552788 container exec_died 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': 
'/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc., vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, com.redhat.component=openstack-ceilometer-ipmi-container, version=17.1.9, distribution-scope=public, architecture=x86_64, batch=17.1_20250721.1, io.buildah.version=1.33.12, tcib_managed=true, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, release=1, name=rhosp17/openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, build-date=2025-07-21T15:29:47, container_name=ceilometer_agent_ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi) Oct 14 04:36:00 localhost systemd[1]: 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638.service: Deactivated successfully. 
Oct 14 04:36:00 localhost podman[83493]: 2025-10-14 08:36:00.93697402 +0000 UTC m=+0.234354707 container health_status def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, version=17.1.9, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_compute, io.openshift.expose-services=, config_id=tripleo_step4, batch=17.1_20250721.1, maintainer=OpenStack TripleO Team, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, 
vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, name=rhosp17/openstack-ceilometer-compute, managed_by=tripleo_ansible, release=1, build-date=2025-07-21T14:45:33, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.component=openstack-ceilometer-compute-container) Oct 14 04:36:00 localhost podman[83493]: 2025-10-14 08:36:00.994288958 +0000 UTC m=+0.291669645 container exec_died def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, com.redhat.license_terms=https://www.redhat.com/agreements, version=17.1.9, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, distribution-scope=public, build-date=2025-07-21T14:45:33, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, vcs-type=git, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-compute-container, release=1, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, name=rhosp17/openstack-ceilometer-compute, batch=17.1_20250721.1, vendor=Red Hat, Inc.) Oct 14 04:36:01 localhost systemd[1]: def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e.service: Deactivated successfully. 
Oct 14 04:36:01 localhost podman[83475]: 2025-10-14 08:36:01.229092688 +0000 UTC m=+0.570244604 container exec_died 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_migration_target, build-date=2025-07-21T14:48:37, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, vendor=Red Hat, Inc., 
distribution-scope=public, managed_by=tripleo_ansible, architecture=x86_64, io.openshift.expose-services=, version=17.1.9, io.buildah.version=1.33.12, vcs-type=git, name=rhosp17/openstack-nova-compute, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.license_terms=https://www.redhat.com/agreements, batch=17.1_20250721.1) Oct 14 04:36:01 localhost systemd[1]: 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884.service: Deactivated successfully. Oct 14 04:36:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e. Oct 14 04:36:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5. Oct 14 04:36:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2. Oct 14 04:36:03 localhost podman[83570]: 2025-10-14 08:36:03.744982836 +0000 UTC m=+0.084705430 container health_status 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, io.openshift.expose-services=, io.buildah.version=1.33.12, tcib_managed=true, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, batch=17.1_20250721.1, build-date=2025-07-21T16:28:53, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, version=17.1.9, release=1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, managed_by=tripleo_ansible, vendor=Red Hat, Inc., architecture=x86_64, 
name=rhosp17/openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, distribution-scope=public, maintainer=OpenStack TripleO Team) Oct 14 04:36:03 localhost podman[83571]: 2025-10-14 08:36:03.797852698 +0000 UTC m=+0.131621317 container health_status a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, 
name=ovn_controller, health_status=healthy, com.redhat.component=openstack-ovn-controller-container, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, io.buildah.version=1.33.12, name=rhosp17/openstack-ovn-controller, release=1, build-date=2025-07-21T13:28:44, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20250721.1, vcs-type=git, version=17.1.9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, container_name=ovn_controller, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible) Oct 14 04:36:03 localhost podman[83570]: 2025-10-14 08:36:03.819529071 +0000 UTC m=+0.159251645 container exec_died 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, managed_by=tripleo_ansible, 
tcib_managed=true, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step4, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-07-21T16:28:53, version=17.1.9, architecture=x86_64, name=rhosp17/openstack-neutron-metadata-agent-ovn, release=1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, container_name=ovn_metadata_agent, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, vendor=Red Hat, Inc., vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1, batch=17.1_20250721.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, distribution-scope=public) Oct 14 04:36:03 localhost podman[83571]: 2025-10-14 08:36:03.826165047 +0000 UTC m=+0.159933716 container exec_died a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, distribution-scope=public, com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20250721.1, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, tcib_managed=true, name=rhosp17/openstack-ovn-controller, container_name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, version=17.1.9, architecture=x86_64, managed_by=tripleo_ansible, build-date=2025-07-21T13:28:44, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', 
'/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vendor=Red Hat, Inc., io.buildah.version=1.33.12, vcs-type=git, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1) Oct 14 04:36:03 localhost systemd[1]: 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e.service: Deactivated successfully. Oct 14 04:36:03 localhost systemd[1]: a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5.service: Deactivated successfully. Oct 14 04:36:03 localhost podman[83572]: 2025-10-14 08:36:03.91035165 +0000 UTC m=+0.240815147 container health_status ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a-4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, container_name=nova_compute, io.buildah.version=1.33.12, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, vcs-type=git, tcib_managed=true, batch=17.1_20250721.1, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, build-date=2025-07-21T14:48:37, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.9) Oct 14 04:36:03 localhost podman[83572]: 2025-10-14 08:36:03.963711206 +0000 UTC m=+0.294174713 container exec_died ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, 
com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, build-date=2025-07-21T14:48:37, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, release=1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a-4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', 
'/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, tcib_managed=true, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step5, container_name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, version=17.1.9, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20250721.1) Oct 14 04:36:03 localhost systemd[1]: ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2.service: Deactivated successfully. Oct 14 04:36:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6. Oct 14 04:36:09 localhost podman[83646]: 2025-10-14 08:36:09.742407519 +0000 UTC m=+0.086687132 container health_status 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.openshift.expose-services=, container_name=metrics_qdr, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, com.redhat.component=openstack-qdrouterd-container, build-date=2025-07-21T13:07:59, distribution-scope=public, architecture=x86_64, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, batch=17.1_20250721.1, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, version=17.1.9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 
'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-qdrouterd, tcib_managed=true, release=1, config_id=tripleo_step1) Oct 14 04:36:09 localhost podman[83646]: 2025-10-14 08:36:09.942216441 +0000 UTC m=+0.286496124 container exec_died 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, build-date=2025-07-21T13:07:59, io.openshift.tags=rhosp osp openstack osp-17.1, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, batch=17.1_20250721.1, name=rhosp17/openstack-qdrouterd, release=1, config_id=tripleo_step1, io.buildah.version=1.33.12, tcib_managed=true, vendor=Red Hat, Inc., managed_by=tripleo_ansible, com.redhat.component=openstack-qdrouterd-container, summary=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed) Oct 14 04:36:09 localhost systemd[1]: 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6.service: Deactivated successfully. Oct 14 04:36:16 localhost systemd[1]: session-34.scope: Deactivated successfully. Oct 14 04:36:16 localhost systemd[1]: session-34.scope: Consumed 20.827s CPU time. 
Oct 14 04:36:16 localhost systemd-logind[760]: Session 34 logged out. Waiting for processes to exit. Oct 14 04:36:16 localhost systemd-logind[760]: Removed session 34. Oct 14 04:36:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8. Oct 14 04:36:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea. Oct 14 04:36:25 localhost podman[83720]: 2025-10-14 08:36:25.75899504 +0000 UTC m=+0.088180858 container health_status 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, distribution-scope=public, vendor=Red Hat, Inc., com.redhat.component=openstack-iscsid-container, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20250721.1, container_name=iscsid, io.buildah.version=1.33.12, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, version=17.1.9, summary=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-07-21T13:27:15, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, release=1, vcs-type=git, name=rhosp17/openstack-iscsid, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1, config_id=tripleo_step3, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, architecture=x86_64) Oct 14 04:36:25 localhost podman[83720]: 2025-10-14 08:36:25.771043884 +0000 UTC m=+0.100229672 container exec_died 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-iscsid, vendor=Red Hat, Inc., tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, vcs-type=git, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.33.12, com.redhat.component=openstack-iscsid-container, batch=17.1_20250721.1, config_id=tripleo_step3, container_name=iscsid, managed_by=tripleo_ansible, release=1, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T13:27:15) Oct 14 04:36:25 localhost systemd[1]: 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea.service: Deactivated successfully. Oct 14 04:36:25 localhost systemd[1]: tmp-crun.FOnKBu.mount: Deactivated successfully. 
Oct 14 04:36:25 localhost podman[83719]: 2025-10-14 08:36:25.861714589 +0000 UTC m=+0.190927998 container health_status 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, tcib_managed=true, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-collectd, container_name=collectd, managed_by=tripleo_ansible, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, vcs-type=git, release=2, 
batch=17.1_20250721.1, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, io.buildah.version=1.33.12, build-date=2025-07-21T13:04:03, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=openstack-collectd-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, vendor=Red Hat, Inc., io.openshift.expose-services=, maintainer=OpenStack TripleO Team) Oct 14 04:36:25 localhost podman[83719]: 2025-10-14 08:36:25.873019449 +0000 UTC m=+0.202232868 container exec_died 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', 
'/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, release=2, vendor=Red Hat, Inc., name=rhosp17/openstack-collectd, io.buildah.version=1.33.12, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, build-date=2025-07-21T13:04:03, config_id=tripleo_step3, container_name=collectd, distribution-scope=public, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, batch=17.1_20250721.1, architecture=x86_64, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.9, tcib_managed=true, com.redhat.component=openstack-collectd-container, maintainer=OpenStack TripleO Team) Oct 14 04:36:25 localhost systemd[1]: 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8.service: Deactivated successfully. Oct 14 04:36:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638. Oct 14 04:36:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884. Oct 14 04:36:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f. Oct 14 04:36:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e. Oct 14 04:36:31 localhost systemd[1]: tmp-crun.SW6R5y.mount: Deactivated successfully. 
Oct 14 04:36:31 localhost podman[83759]: 2025-10-14 08:36:31.756635239 +0000 UTC m=+0.095527416 container health_status 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T14:48:37, version=17.1.9, vcs-type=git, config_id=tripleo_step4, io.openshift.expose-services=, tcib_managed=true, batch=17.1_20250721.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, container_name=nova_migration_target, com.redhat.component=openstack-nova-compute-container, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 nova-compute) Oct 14 04:36:31 localhost podman[83761]: 2025-10-14 08:36:31.807024183 +0000 UTC m=+0.137189269 container health_status def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, name=rhosp17/openstack-ceilometer-compute, build-date=2025-07-21T14:45:33, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.9, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, tcib_managed=true, batch=17.1_20250721.1, container_name=ceilometer_agent_compute, architecture=x86_64, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, io.buildah.version=1.33.12, release=1, config_id=tripleo_step4, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/agreements) Oct 14 04:36:31 localhost podman[83761]: 2025-10-14 08:36:31.838022296 +0000 UTC m=+0.168187372 container exec_died def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20250721.1, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, config_id=tripleo_step4, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.component=openstack-ceilometer-compute-container, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-07-21T14:45:33, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.9, container_name=ceilometer_agent_compute, io.buildah.version=1.33.12, name=rhosp17/openstack-ceilometer-compute, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, tcib_managed=true, vcs-type=git, distribution-scope=public) Oct 14 04:36:31 localhost systemd[1]: def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e.service: Deactivated successfully. 
Oct 14 04:36:31 localhost podman[83760]: 2025-10-14 08:36:31.856778138 +0000 UTC m=+0.190061881 container health_status d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, name=rhosp17/openstack-cron, config_id=tripleo_step4, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, container_name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, version=17.1.9, summary=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, managed_by=tripleo_ansible, batch=17.1_20250721.1, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, architecture=x86_64, vendor=Red Hat, Inc., distribution-scope=public, description=Red Hat OpenStack Platform 17.1 cron, 
io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, build-date=2025-07-21T13:07:52, release=1, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, com.redhat.component=openstack-cron-container, com.redhat.license_terms=https://www.redhat.com/agreements) Oct 14 04:36:31 localhost podman[83760]: 2025-10-14 08:36:31.866001294 +0000 UTC m=+0.199285047 container exec_died d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, com.redhat.component=openstack-cron-container, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, config_id=tripleo_step4, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-cron, build-date=2025-07-21T13:07:52, managed_by=tripleo_ansible, release=1, container_name=logrotate_crond, batch=17.1_20250721.1, version=17.1.9, summary=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, architecture=x86_64, distribution-scope=public, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}) Oct 14 04:36:31 localhost podman[83758]: 2025-10-14 08:36:31.775896867 +0000 UTC m=+0.114627269 container health_status 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, build-date=2025-07-21T15:29:47, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, container_name=ceilometer_agent_ipmi, vcs-type=git, version=17.1.9, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, release=1, distribution-scope=public, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, io.buildah.version=1.33.12, io.openshift.expose-services=, architecture=x86_64, batch=17.1_20250721.1, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-ipmi-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 
'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}) Oct 14 04:36:31 localhost systemd[1]: d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f.service: Deactivated successfully. 
Oct 14 04:36:31 localhost podman[83758]: 2025-10-14 08:36:31.90806171 +0000 UTC m=+0.246792092 container exec_died 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1, config_id=tripleo_step4, vcs-type=git, version=17.1.9, managed_by=tripleo_ansible, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, architecture=x86_64, batch=17.1_20250721.1, tcib_managed=true, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, io.buildah.version=1.33.12, name=rhosp17/openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=ceilometer_agent_ipmi, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-07-21T15:29:47, distribution-scope=public, release=1, com.redhat.component=openstack-ceilometer-ipmi-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}) Oct 14 04:36:31 localhost systemd[1]: 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638.service: Deactivated successfully. Oct 14 04:36:32 localhost podman[83759]: 2025-10-14 08:36:32.162924181 +0000 UTC m=+0.501816378 container exec_died 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, release=1, vcs-type=git, name=rhosp17/openstack-nova-compute, 
io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, container_name=nova_migration_target, managed_by=tripleo_ansible, vendor=Red Hat, Inc., version=17.1.9, io.openshift.expose-services=, tcib_managed=true, architecture=x86_64, batch=17.1_20250721.1, distribution-scope=public, build-date=2025-07-21T14:48:37, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d) Oct 14 04:36:32 localhost systemd[1]: 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884.service: Deactivated successfully. Oct 14 04:36:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e. Oct 14 04:36:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5. Oct 14 04:36:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2. 
Oct 14 04:36:34 localhost podman[83852]: 2025-10-14 08:36:34.746273455 +0000 UTC m=+0.064610456 container health_status 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, distribution-scope=public, name=rhosp17/openstack-neutron-metadata-agent-ovn, batch=17.1_20250721.1, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, managed_by=tripleo_ansible, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, version=17.1.9, maintainer=OpenStack TripleO Team, build-date=2025-07-21T16:28:53, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, vendor=Red Hat, Inc., architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true, release=1, io.buildah.version=1.33.12, vcs-type=git) Oct 14 04:36:34 localhost podman[83852]: 2025-10-14 08:36:34.778010131 +0000 UTC m=+0.096347142 container exec_died 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, vendor=Red Hat, Inc., version=17.1.9, config_id=tripleo_step4, container_name=ovn_metadata_agent, io.openshift.expose-services=, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.buildah.version=1.33.12, release=1, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, distribution-scope=public, build-date=2025-07-21T16:28:53, batch=17.1_20250721.1, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'cgroupns': 'host', 'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, managed_by=tripleo_ansible, name=rhosp17/openstack-neutron-metadata-agent-ovn, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1) Oct 14 04:36:34 localhost podman[83853]: 2025-10-14 08:36:34.812963385 +0000 UTC m=+0.127038154 container health_status a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, com.redhat.component=openstack-ovn-controller-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., 
io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team, container_name=ovn_controller, version=17.1.9, vcs-type=git, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, build-date=2025-07-21T13:28:44, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, architecture=x86_64, batch=17.1_20250721.1, distribution-scope=public, io.buildah.version=1.33.12, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, name=rhosp17/openstack-ovn-controller, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1) Oct 14 04:36:34 localhost podman[83853]: 2025-10-14 08:36:34.835769903 +0000 UTC m=+0.149844692 container exec_died a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, com.redhat.component=openstack-ovn-controller-container, batch=17.1_20250721.1, container_name=ovn_controller, release=1, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, name=rhosp17/openstack-ovn-controller, tcib_managed=true, vcs-type=git, 
version=17.1.9, build-date=2025-07-21T13:28:44, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., architecture=x86_64, config_id=tripleo_step4, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, io.openshift.expose-services=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller) Oct 14 04:36:34 localhost systemd[1]: 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e.service: Deactivated successfully. Oct 14 04:36:34 localhost systemd[1]: a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5.service: Deactivated successfully. 
Oct 14 04:36:34 localhost podman[83854]: 2025-10-14 08:36:34.918779851 +0000 UTC m=+0.228781234 container health_status ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.buildah.version=1.33.12, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a-4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', 
'/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2025-07-21T14:48:37, distribution-scope=public, version=17.1.9, release=1, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, batch=17.1_20250721.1, io.openshift.expose-services=, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1) Oct 14 04:36:34 localhost podman[83854]: 2025-10-14 08:36:34.971015112 +0000 UTC m=+0.281016475 container exec_died ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, name=rhosp17/openstack-nova-compute, batch=17.1_20250721.1, container_name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-07-21T14:48:37, release=1, architecture=x86_64, maintainer=OpenStack TripleO Team, 
vendor=Red Hat, Inc., vcs-type=git, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a-4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, managed_by=tripleo_ansible, version=17.1.9) Oct 14 04:36:34 localhost 
systemd[1]: ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2.service: Deactivated successfully. Oct 14 04:36:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6. Oct 14 04:36:40 localhost podman[83928]: 2025-10-14 08:36:40.74473043 +0000 UTC m=+0.083221185 container health_status 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, build-date=2025-07-21T13:07:59, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.buildah.version=1.33.12, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, version=17.1.9, name=rhosp17/openstack-qdrouterd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, batch=17.1_20250721.1, distribution-scope=public, release=1, architecture=x86_64, container_name=metrics_qdr, io.openshift.expose-services=, tcib_managed=true, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1) Oct 14 04:36:40 localhost podman[83928]: 2025-10-14 08:36:40.953213781 +0000 UTC m=+0.291704566 container exec_died 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, name=rhosp17/openstack-qdrouterd, tcib_managed=true, build-date=2025-07-21T13:07:59, batch=17.1_20250721.1, com.redhat.component=openstack-qdrouterd-container, release=1, vendor=Red Hat, Inc., architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1, io.buildah.version=1.33.12, vcs-type=git, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, version=17.1.9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 
'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, managed_by=tripleo_ansible, container_name=metrics_qdr) Oct 14 04:36:40 localhost systemd[1]: 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6.service: Deactivated successfully. Oct 14 04:36:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8. Oct 14 04:36:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea. Oct 14 04:36:56 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Oct 14 04:36:56 localhost recover_tripleo_nova_virtqemud[84047]: 62551 Oct 14 04:36:56 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Oct 14 04:36:56 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. Oct 14 04:36:56 localhost systemd[1]: tmp-crun.EKXHNE.mount: Deactivated successfully. 
Oct 14 04:36:56 localhost podman[84034]: 2025-10-14 08:36:56.74288106 +0000 UTC m=+0.083232805 container health_status 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, batch=17.1_20250721.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, tcib_managed=true, build-date=2025-07-21T13:04:03, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/agreements, 
io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, architecture=x86_64, com.redhat.component=openstack-collectd-container, config_id=tripleo_step3, io.buildah.version=1.33.12, release=2, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1, version=17.1.9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git) Oct 14 04:36:56 localhost systemd[1]: tmp-crun.TexHwh.mount: Deactivated successfully. Oct 14 04:36:56 localhost podman[84035]: 2025-10-14 08:36:56.756403789 +0000 UTC m=+0.088876289 container health_status 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, name=rhosp17/openstack-iscsid, tcib_managed=true, build-date=2025-07-21T13:27:15, io.openshift.expose-services=, release=1, summary=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-iscsid-container, container_name=iscsid, batch=17.1_20250721.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, config_id=tripleo_step3, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, architecture=x86_64) Oct 14 04:36:56 localhost podman[84035]: 2025-10-14 08:36:56.794031557 +0000 UTC m=+0.126504047 container exec_died 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, version=17.1.9, com.redhat.component=openstack-iscsid-container, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, release=1, io.openshift.expose-services=, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T13:27:15, maintainer=OpenStack TripleO Team, tcib_managed=true, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 iscsid, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, name=rhosp17/openstack-iscsid, batch=17.1_20250721.1, io.k8s.description=Red Hat OpenStack 
Platform 17.1 iscsid, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.buildah.version=1.33.12, architecture=x86_64, config_id=tripleo_step3, container_name=iscsid) Oct 14 04:36:56 localhost systemd[1]: 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea.service: Deactivated successfully. 
Oct 14 04:36:56 localhost podman[84034]: 2025-10-14 08:36:56.809458386 +0000 UTC m=+0.149810181 container exec_died 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, com.redhat.license_terms=https://www.redhat.com/agreements, release=2, batch=17.1_20250721.1, build-date=2025-07-21T13:04:03, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, vendor=Red Hat, Inc., version=17.1.9, managed_by=tripleo_ansible, name=rhosp17/openstack-collectd, config_id=tripleo_step3, io.openshift.expose-services=, tcib_managed=true, 
vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, container_name=collectd, distribution-scope=public, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, architecture=x86_64, maintainer=OpenStack TripleO Team) Oct 14 04:36:56 localhost systemd[1]: 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8.service: Deactivated successfully. Oct 14 04:37:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638. Oct 14 04:37:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884. Oct 14 04:37:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f. Oct 14 04:37:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e. 
Oct 14 04:37:02 localhost podman[84075]: 2025-10-14 08:37:02.760002695 +0000 UTC m=+0.096776035 container health_status d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-cron, release=1, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, batch=17.1_20250721.1, version=17.1.9, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.expose-services=, container_name=logrotate_crond, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, vendor=Red 
Hat, Inc., com.redhat.component=openstack-cron-container, tcib_managed=true, build-date=2025-07-21T13:07:52, io.buildah.version=1.33.12, description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements) Oct 14 04:37:02 localhost systemd[1]: tmp-crun.1e0W2j.mount: Deactivated successfully. Oct 14 04:37:02 localhost podman[84073]: 2025-10-14 08:37:02.807834599 +0000 UTC m=+0.149181691 container health_status 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, vendor=Red Hat, Inc., build-date=2025-07-21T15:29:47, container_name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-ipmi, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, vcs-type=git, version=17.1.9, batch=17.1_20250721.1, config_id=tripleo_step4, io.buildah.version=1.33.12, release=1, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, maintainer=OpenStack TripleO Team, tcib_managed=true, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.tags=rhosp osp openstack osp-17.1) Oct 14 04:37:02 localhost podman[84073]: 2025-10-14 08:37:02.839524714 +0000 UTC m=+0.180871776 container exec_died 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-ceilometer-ipmi, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, vcs-type=git, vendor=Red Hat, Inc., build-date=2025-07-21T15:29:47, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, maintainer=OpenStack TripleO Team, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, config_id=tripleo_step4, release=1, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, version=17.1.9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}) Oct 14 04:37:02 localhost systemd[1]: 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638.service: Deactivated successfully. 
Oct 14 04:37:02 localhost podman[84076]: 2025-10-14 08:37:02.860155324 +0000 UTC m=+0.195956834 container health_status def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, com.redhat.component=openstack-ceilometer-compute-container, release=1, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, version=17.1.9, container_name=ceilometer_agent_compute, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, batch=17.1_20250721.1, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-07-21T14:45:33, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.33.12, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4) Oct 14 04:37:02 localhost podman[84075]: 2025-10-14 08:37:02.872713574 +0000 UTC m=+0.209486914 container exec_died d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, version=17.1.9, name=rhosp17/openstack-cron, build-date=2025-07-21T13:07:52, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, vendor=Red Hat, Inc., com.redhat.component=openstack-cron-container, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team, release=1, architecture=x86_64, io.openshift.expose-services=, batch=17.1_20250721.1, io.buildah.version=1.33.12) Oct 14 04:37:02 localhost systemd[1]: d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f.service: Deactivated successfully. Oct 14 04:37:02 localhost podman[84076]: 2025-10-14 08:37:02.890057082 +0000 UTC m=+0.225858612 container exec_died def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, architecture=x86_64, batch=17.1_20250721.1, name=rhosp17/openstack-ceilometer-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, vcs-type=git, release=1, io.openshift.expose-services=, vendor=Red Hat, Inc., config_id=tripleo_step4, container_name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-07-21T14:45:33, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.9, com.redhat.component=openstack-ceilometer-compute-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.33.12, managed_by=tripleo_ansible, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1) Oct 14 04:37:02 localhost systemd[1]: def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e.service: Deactivated successfully. 
Oct 14 04:37:02 localhost podman[84074]: 2025-10-14 08:37:02.945137311 +0000 UTC m=+0.284334426 container health_status 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, version=17.1.9, vcs-type=git, vendor=Red Hat, Inc., config_id=tripleo_step4, container_name=nova_migration_target, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20250721.1, tcib_managed=true, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, release=1, 
com.redhat.component=openstack-nova-compute-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, build-date=2025-07-21T14:48:37, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, distribution-scope=public, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1) Oct 14 04:37:03 localhost podman[84074]: 2025-10-14 08:37:03.315168409 +0000 UTC m=+0.654365504 container exec_died 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, name=rhosp17/openstack-nova-compute, batch=17.1_20250721.1, release=1, version=17.1.9, tcib_managed=true, io.buildah.version=1.33.12, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, container_name=nova_migration_target, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, build-date=2025-07-21T14:48:37, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}) Oct 14 04:37:03 localhost systemd[1]: 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884.service: Deactivated successfully. Oct 14 04:37:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e. Oct 14 04:37:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5. Oct 14 04:37:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2. 
Oct 14 04:37:05 localhost podman[84166]: 2025-10-14 08:37:05.728118042 +0000 UTC m=+0.069857239 container health_status a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, com.redhat.component=openstack-ovn-controller-container, managed_by=tripleo_ansible, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-07-21T13:28:44, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhosp17/openstack-ovn-controller, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., tcib_managed=true, container_name=ovn_controller, distribution-scope=public, batch=17.1_20250721.1, vcs-type=git, io.openshift.expose-services=, release=1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, version=17.1.9, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1) Oct 14 04:37:05 localhost podman[84166]: 2025-10-14 08:37:05.786116713 +0000 
UTC m=+0.127855900 container exec_died a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, vcs-type=git, version=17.1.9, tcib_managed=true, container_name=ovn_controller, batch=17.1_20250721.1, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, distribution-scope=public, release=1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, build-date=2025-07-21T13:28:44, io.openshift.tags=rhosp osp openstack osp-17.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., com.redhat.component=openstack-ovn-controller-container, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhosp17/openstack-ovn-controller, io.openshift.expose-services=) Oct 14 04:37:05 localhost podman[84165]: 2025-10-14 08:37:05.792046437 +0000 UTC m=+0.132960679 container health_status 
7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=ovn_metadata_agent, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, build-date=2025-07-21T16:28:53, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, version=17.1.9, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, distribution-scope=public, batch=17.1_20250721.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', 
'/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.buildah.version=1.33.12, io.openshift.expose-services=, vcs-type=git, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1) Oct 14 04:37:05 localhost podman[84165]: 2025-10-14 08:37:05.834063301 +0000 UTC m=+0.174977543 container exec_died 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, version=17.1.9, managed_by=tripleo_ansible, build-date=2025-07-21T16:28:53, maintainer=OpenStack TripleO Team, release=1, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.buildah.version=1.33.12, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step4, container_name=ovn_metadata_agent, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.expose-services=, tcib_managed=true, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, distribution-scope=public) Oct 14 04:37:05 localhost systemd[1]: 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e.service: Deactivated successfully. Oct 14 04:37:05 localhost systemd[1]: a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5.service: Deactivated successfully. 
Oct 14 04:37:05 localhost podman[84167]: 2025-10-14 08:37:05.858151998 +0000 UTC m=+0.193253270 container health_status ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1, batch=17.1_20250721.1, config_id=tripleo_step5, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a-4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', 
'/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, vcs-type=git, container_name=nova_compute, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, version=17.1.9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, distribution-scope=public, build-date=2025-07-21T14:48:37) Oct 14 04:37:05 localhost podman[84167]: 2025-10-14 08:37:05.887556412 +0000 UTC m=+0.222657634 container exec_died ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, maintainer=OpenStack TripleO Team, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, release=1, config_id=tripleo_step5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a-4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, vendor=Red Hat, Inc., io.buildah.version=1.33.12, version=17.1.9, build-date=2025-07-21T14:48:37, tcib_managed=true, name=rhosp17/openstack-nova-compute, batch=17.1_20250721.1, managed_by=tripleo_ansible, vcs-type=git, 
architecture=x86_64) Oct 14 04:37:05 localhost systemd[1]: ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2.service: Deactivated successfully. Oct 14 04:37:06 localhost systemd[1]: tmp-crun.uk6BMI.mount: Deactivated successfully. Oct 14 04:37:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6. Oct 14 04:37:11 localhost podman[84240]: 2025-10-14 08:37:11.786801855 +0000 UTC m=+0.114619329 container health_status 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, container_name=metrics_qdr, io.buildah.version=1.33.12, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1, io.openshift.expose-services=, architecture=x86_64, config_id=tripleo_step1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, build-date=2025-07-21T13:07:59, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, version=17.1.9, com.redhat.component=openstack-qdrouterd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, distribution-scope=public, tcib_managed=true, release=1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team, vcs-type=git, batch=17.1_20250721.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}) Oct 14 04:37:12 localhost podman[84240]: 2025-10-14 08:37:12.001150139 +0000 UTC m=+0.328967593 container exec_died 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, config_id=tripleo_step1, io.buildah.version=1.33.12, container_name=metrics_qdr, batch=17.1_20250721.1, release=1, distribution-scope=public, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, version=17.1.9, tcib_managed=true, name=rhosp17/openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, vendor=Red Hat, Inc., vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, build-date=2025-07-21T13:07:59, io.openshift.expose-services=) Oct 14 04:37:12 localhost systemd[1]: 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6.service: Deactivated successfully. Oct 14 04:37:20 localhost systemd-logind[760]: Existing logind session ID 28 used by new audit session, ignoring. Oct 14 04:37:20 localhost systemd[1]: Created slice User Slice of UID 0. Oct 14 04:37:20 localhost systemd[1]: Starting User Runtime Directory /run/user/0... Oct 14 04:37:20 localhost systemd[1]: Finished User Runtime Directory /run/user/0. Oct 14 04:37:20 localhost systemd[1]: Starting User Manager for UID 0... Oct 14 04:37:21 localhost systemd[84643]: Queued start job for default target Main User Target. Oct 14 04:37:21 localhost systemd[84643]: Created slice User Application Slice. 
Oct 14 04:37:21 localhost systemd[84643]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system). Oct 14 04:37:21 localhost systemd[84643]: Started Daily Cleanup of User's Temporary Directories. Oct 14 04:37:21 localhost systemd[84643]: Reached target Paths. Oct 14 04:37:21 localhost systemd[84643]: Reached target Timers. Oct 14 04:37:21 localhost systemd[84643]: Starting D-Bus User Message Bus Socket... Oct 14 04:37:21 localhost systemd[84643]: Starting Create User's Volatile Files and Directories... Oct 14 04:37:21 localhost systemd[84643]: Finished Create User's Volatile Files and Directories. Oct 14 04:37:21 localhost systemd[84643]: Listening on D-Bus User Message Bus Socket. Oct 14 04:37:21 localhost systemd[84643]: Reached target Sockets. Oct 14 04:37:21 localhost systemd[84643]: Reached target Basic System. Oct 14 04:37:21 localhost systemd[1]: Started User Manager for UID 0. Oct 14 04:37:21 localhost systemd[84643]: Reached target Main User Target. Oct 14 04:37:21 localhost systemd[84643]: Startup finished in 186ms. Oct 14 04:37:21 localhost systemd[1]: Started Session c11 of User root. Oct 14 04:37:22 localhost kernel: tun: Universal TUN/TAP device driver, 1.6 Oct 14 04:37:22 localhost kernel: device tap3ec9b060-f4 entered promiscuous mode Oct 14 04:37:22 localhost NetworkManager[5977]: [1760431042.1965] manager: (tap3ec9b060-f4): new Tun device (/org/freedesktop/NetworkManager/Devices/13) Oct 14 04:37:22 localhost systemd-udevd[84678]: Network interface NamePolicy= disabled on kernel command line. 
Oct 14 04:37:22 localhost NetworkManager[5977]: [1760431042.2169] device (tap3ec9b060-f4): state change: unmanaged -> unavailable (reason 'connection-assumed', sys-iface-state: 'external') Oct 14 04:37:22 localhost NetworkManager[5977]: [1760431042.2177] device (tap3ec9b060-f4): state change: unavailable -> disconnected (reason 'none', sys-iface-state: 'external') Oct 14 04:37:22 localhost systemd[1]: Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw). Oct 14 04:37:22 localhost systemd[1]: Starting Virtual Machine and Container Registration Service... Oct 14 04:37:22 localhost systemd[1]: Started Virtual Machine and Container Registration Service. Oct 14 04:37:22 localhost systemd-machined[84684]: New machine qemu-1-instance-00000002. Oct 14 04:37:22 localhost systemd[1]: Started Virtual Machine qemu-1-instance-00000002. Oct 14 04:37:22 localhost NetworkManager[5977]: [1760431042.4939] manager: (tap7d0cd696-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/14) Oct 14 04:37:22 localhost kernel: IPv6: ADDRCONF(NETDEV_CHANGE): tap7d0cd696-b1: link becomes ready Oct 14 04:37:22 localhost kernel: IPv6: ADDRCONF(NETDEV_CHANGE): tap7d0cd696-b0: link becomes ready Oct 14 04:37:22 localhost NetworkManager[5977]: [1760431042.5365] device (tap7d0cd696-b0): carrier: link connected Oct 14 04:37:22 localhost kernel: device tap7d0cd696-b0 entered promiscuous mode Oct 14 04:37:24 localhost systemd[1]: Starting SETroubleshoot daemon for processing new SELinux denial logs... 
Oct 14 04:37:24 localhost podman[84807]: 2025-10-14 08:37:24.406315263 +0000 UTC m=+0.077195366 container create 21da58e6954cfab73a5a811ca85c2c1bedc94e2e2a6445d96b7ae4c16b2e98df (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=neutron-haproxy-ovnmeta-7d0cd696-bdd7-4e70-9512-eb0d23640314, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, io.openshift.expose-services=, build-date=2025-07-21T16:28:53, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-type=git, tcib_managed=true, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1, batch=17.1_20250721.1, version=17.1.9, io.buildah.version=1.33.12) Oct 14 04:37:24 localhost systemd[1]: Started SETroubleshoot daemon for processing new SELinux denial logs. Oct 14 04:37:24 localhost systemd[1]: Started libpod-conmon-21da58e6954cfab73a5a811ca85c2c1bedc94e2e2a6445d96b7ae4c16b2e98df.scope. Oct 14 04:37:24 localhost systemd[1]: tmp-crun.OQqKMl.mount: Deactivated successfully. Oct 14 04:37:24 localhost podman[84807]: 2025-10-14 08:37:24.366188438 +0000 UTC m=+0.037068551 image pull registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1 Oct 14 04:37:24 localhost systemd[1]: Started libcrun container. 
Oct 14 04:37:24 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/796a4e695fd07aa1ca28abde2b1fa6fa4df60d3155a5cda32e2dcba101be602d/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Oct 14 04:37:24 localhost podman[84807]: 2025-10-14 08:37:24.502554201 +0000 UTC m=+0.173434304 container init 21da58e6954cfab73a5a811ca85c2c1bedc94e2e2a6445d96b7ae4c16b2e98df (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=neutron-haproxy-ovnmeta-7d0cd696-bdd7-4e70-9512-eb0d23640314, version=17.1.9, maintainer=OpenStack TripleO Team, release=1, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, distribution-scope=public, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, batch=17.1_20250721.1, tcib_managed=true, name=rhosp17/openstack-neutron-metadata-agent-ovn, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vendor=Red Hat, Inc., build-date=2025-07-21T16:28:53, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Oct 14 04:37:24 localhost podman[84807]: 2025-10-14 08:37:24.512483619 +0000 UTC m=+0.183363722 container start 21da58e6954cfab73a5a811ca85c2c1bedc94e2e2a6445d96b7ae4c16b2e98df (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=neutron-haproxy-ovnmeta-7d0cd696-bdd7-4e70-9512-eb0d23640314, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1, 
build-date=2025-07-21T16:28:53, version=17.1.9, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, vcs-type=git, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, batch=17.1_20250721.1, architecture=x86_64, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=) Oct 14 04:37:24 localhost systemd[1]: Created slice Slice /system/dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged. Oct 14 04:37:24 localhost systemd[1]: Started dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service. Oct 14 04:37:25 localhost setroubleshoot[84784]: SELinux is preventing /usr/libexec/qemu-kvm from read access on the file max_map_count. 
For complete SELinux messages run: sealert -l 5b2f497b-86cf-43db-9090-4f1c5a3a1db8 Oct 14 04:37:25 localhost setroubleshoot[84784]: SELinux is preventing /usr/libexec/qemu-kvm from read access on the file max_map_count.#012#012***** Plugin qemu_file_image (98.8 confidence) suggests *******************#012#012If max_map_count is a virtualization target#012Then you need to change the label on max_map_count'#012Do#012# semanage fcontext -a -t virt_image_t 'max_map_count'#012# restorecon -v 'max_map_count'#012#012***** Plugin catchall (2.13 confidence) suggests **************************#012#012If you believe that qemu-kvm should be allowed read access on the max_map_count file by default.#012Then you should report this as a bug.#012You can generate a local policy module to allow this access.#012Do#012allow this access for now by executing:#012# ausearch -c 'qemu-kvm' --raw | audit2allow -M my-qemukvm#012# semodule -X 300 -i my-qemukvm.pp#012 Oct 14 04:37:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8. Oct 14 04:37:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea. 
Oct 14 04:37:27 localhost podman[84892]: 2025-10-14 08:37:27.756776549 +0000 UTC m=+0.087974241 container health_status 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.openshift.expose-services=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, build-date=2025-07-21T13:27:15, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, name=rhosp17/openstack-iscsid, version=17.1.9, managed_by=tripleo_ansible, batch=17.1_20250721.1, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, release=1, vendor=Red Hat, Inc., io.buildah.version=1.33.12, distribution-scope=public, container_name=iscsid, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git) Oct 14 04:37:27 localhost podman[84892]: 2025-10-14 08:37:27.765815771 +0000 UTC m=+0.097013443 container exec_died 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, name=rhosp17/openstack-iscsid, vendor=Red Hat, Inc., build-date=2025-07-21T13:27:15, vcs-type=git, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, tcib_managed=true, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, managed_by=tripleo_ansible, container_name=iscsid, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-iscsid-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, config_id=tripleo_step3, version=17.1.9, batch=17.1_20250721.1, release=1) Oct 14 04:37:27 localhost systemd[1]: 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea.service: Deactivated successfully. Oct 14 04:37:27 localhost systemd[1]: tmp-crun.9v03EK.mount: Deactivated successfully. Oct 14 04:37:27 localhost podman[84891]: 2025-10-14 08:37:27.882477211 +0000 UTC m=+0.211514836 container health_status 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, batch=17.1_20250721.1, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, architecture=x86_64, maintainer=OpenStack TripleO Team, version=17.1.9, com.redhat.component=openstack-collectd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-07-21T13:04:03, description=Red Hat OpenStack Platform 17.1 collectd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, io.buildah.version=1.33.12, release=2, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc., vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step3, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, name=rhosp17/openstack-collectd, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, tcib_managed=true) Oct 14 04:37:27 localhost podman[84891]: 2025-10-14 08:37:27.892451381 +0000 UTC m=+0.221489016 container exec_died 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1, container_name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, version=17.1.9, release=2, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 collectd, 
architecture=x86_64, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-07-21T13:04:03, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, vcs-type=git, vendor=Red Hat, Inc., com.redhat.component=openstack-collectd-container, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, io.openshift.expose-services=, io.buildah.version=1.33.12, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhosp17/openstack-collectd, maintainer=OpenStack 
TripleO Team) Oct 14 04:37:27 localhost systemd[1]: 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8.service: Deactivated successfully. Oct 14 04:37:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638. Oct 14 04:37:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884. Oct 14 04:37:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f. Oct 14 04:37:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e. Oct 14 04:37:33 localhost systemd[1]: tmp-crun.KPidSs.mount: Deactivated successfully. Oct 14 04:37:33 localhost podman[84932]: 2025-10-14 08:37:33.784236365 +0000 UTC m=+0.117159518 container health_status 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, batch=17.1_20250721.1, container_name=ceilometer_agent_ipmi, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, tcib_managed=true, version=17.1.9, io.openshift.expose-services=, distribution-scope=public, name=rhosp17/openstack-ceilometer-ipmi, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, config_id=tripleo_step4, build-date=2025-07-21T15:29:47, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, release=1, vcs-type=git, com.redhat.component=openstack-ceilometer-ipmi-container, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi) Oct 14 04:37:33 localhost systemd[1]: tmp-crun.zk4WTH.mount: Deactivated successfully. 
Oct 14 04:37:33 localhost podman[84933]: 2025-10-14 08:37:33.83430809 +0000 UTC m=+0.161687440 container health_status 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vendor=Red Hat, Inc., version=17.1.9, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step4, com.redhat.component=openstack-nova-compute-container, architecture=x86_64, container_name=nova_migration_target, distribution-scope=public, io.openshift.expose-services=, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat 
OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, batch=17.1_20250721.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.33.12, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, managed_by=tripleo_ansible, build-date=2025-07-21T14:48:37, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true) Oct 14 04:37:33 localhost podman[84932]: 2025-10-14 08:37:33.841162392 +0000 UTC m=+0.174085395 container exec_died 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, vendor=Red Hat, Inc., architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
com.redhat.component=openstack-ceilometer-ipmi-container, container_name=ceilometer_agent_ipmi, version=17.1.9, build-date=2025-07-21T15:29:47, name=rhosp17/openstack-ceilometer-ipmi, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, io.buildah.version=1.33.12, vcs-type=git, batch=17.1_20250721.1, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1, io.openshift.expose-services=, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, managed_by=tripleo_ansible, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1) Oct 14 04:37:33 localhost systemd[1]: 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638.service: Deactivated successfully. Oct 14 04:37:33 localhost podman[84935]: 2025-10-14 08:37:33.92675913 +0000 UTC m=+0.251932132 container health_status def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1, batch=17.1_20250721.1, build-date=2025-07-21T14:45:33, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-compute-container, version=17.1.9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, io.buildah.version=1.33.12, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhosp17/openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, 
release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_compute) Oct 14 04:37:33 localhost podman[84934]: 2025-10-14 08:37:33.887944535 +0000 UTC m=+0.214401627 container health_status d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, release=1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20250721.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T13:07:52, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, vcs-type=git, com.redhat.component=openstack-cron-container, distribution-scope=public, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/agreements, version=17.1.9, tcib_managed=true, io.buildah.version=1.33.12, container_name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, architecture=x86_64, vendor=Red Hat, Inc.) 
Oct 14 04:37:33 localhost podman[84935]: 2025-10-14 08:37:33.96507919 +0000 UTC m=+0.290252192 container exec_died def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, build-date=2025-07-21T14:45:33, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-ceilometer-compute, container_name=ceilometer_agent_compute, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, vendor=Red Hat, Inc., io.buildah.version=1.33.12, io.openshift.expose-services=, 
com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=openstack-ceilometer-compute-container, batch=17.1_20250721.1, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, vcs-type=git, version=17.1.9, config_id=tripleo_step4, managed_by=tripleo_ansible) Oct 14 04:37:33 localhost systemd[1]: def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e.service: Deactivated successfully. Oct 14 04:37:34 localhost podman[84934]: 2025-10-14 08:37:34.019921712 +0000 UTC m=+0.346378844 container exec_died d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, build-date=2025-07-21T13:07:52, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20250721.1, summary=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.33.12, container_name=logrotate_crond, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, name=rhosp17/openstack-cron, vcs-type=git, com.redhat.component=openstack-cron-container, io.openshift.expose-services=, distribution-scope=public, version=17.1.9, architecture=x86_64, managed_by=tripleo_ansible, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}) Oct 14 04:37:34 localhost systemd[1]: d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f.service: Deactivated successfully. Oct 14 04:37:34 localhost podman[84933]: 2025-10-14 08:37:34.179964879 +0000 UTC m=+0.507344250 container exec_died 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, build-date=2025-07-21T14:48:37, com.redhat.component=openstack-nova-compute-container, release=1, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, architecture=x86_64, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20250721.1, container_name=nova_migration_target, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, 
config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, vendor=Red Hat, Inc., version=17.1.9, io.buildah.version=1.33.12, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.license_terms=https://www.redhat.com/agreements) Oct 14 04:37:34 localhost systemd[1]: 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884.service: Deactivated successfully. Oct 14 04:37:34 localhost systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service: Deactivated successfully. Oct 14 04:37:35 localhost systemd[1]: setroubleshootd.service: Deactivated successfully. Oct 14 04:37:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e. Oct 14 04:37:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5. 
Oct 14 04:37:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2. Oct 14 04:37:36 localhost systemd[1]: tmp-crun.vQ3wKI.mount: Deactivated successfully. Oct 14 04:37:36 localhost podman[85024]: 2025-10-14 08:37:36.76066111 +0000 UTC m=+0.094248506 container health_status 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, release=1, vcs-type=git, container_name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1, distribution-scope=public, name=rhosp17/openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, architecture=x86_64, tcib_managed=true, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20250721.1, config_id=tripleo_step4, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-07-21T16:28:53, managed_by=tripleo_ansible, version=17.1.9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container) Oct 14 04:37:36 localhost podman[85026]: 2025-10-14 08:37:36.82024776 +0000 UTC m=+0.145251660 container health_status ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, distribution-scope=public, config_id=tripleo_step5, io.openshift.tags=rhosp osp openstack osp-17.1, version=17.1.9, vcs-type=git, tcib_managed=true, vendor=Red Hat, Inc., build-date=2025-07-21T14:48:37, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a-4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, release=1, name=rhosp17/openstack-nova-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, 
architecture=x86_64, container_name=nova_compute) Oct 14 04:37:36 localhost podman[85024]: 2025-10-14 08:37:36.851790779 +0000 UTC m=+0.185378135 container exec_died 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, batch=17.1_20250721.1, vendor=Red Hat, Inc., vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, name=rhosp17/openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, io.openshift.expose-services=, 
build-date=2025-07-21T16:28:53, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, version=17.1.9, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, managed_by=tripleo_ansible, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, tcib_managed=true, distribution-scope=public, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1, container_name=ovn_metadata_agent) Oct 14 04:37:36 localhost systemd[1]: 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e.service: Deactivated successfully. Oct 14 04:37:36 localhost podman[85026]: 2025-10-14 08:37:36.881283065 +0000 UTC m=+0.206286975 container exec_died ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, managed_by=tripleo_ansible, release=1, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a-4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 
'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp17/openstack-nova-compute, container_name=nova_compute, version=17.1.9, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., build-date=2025-07-21T14:48:37, architecture=x86_64, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, batch=17.1_20250721.1, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, config_id=tripleo_step5, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1) Oct 14 04:37:36 localhost systemd[1]: 
ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2.service: Deactivated successfully. Oct 14 04:37:36 localhost podman[85025]: 2025-10-14 08:37:36.851461379 +0000 UTC m=+0.185030025 container health_status a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.9, batch=17.1_20250721.1, vcs-type=git, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc., build-date=2025-07-21T13:28:44, io.buildah.version=1.33.12, com.redhat.component=openstack-ovn-controller-container, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, config_id=tripleo_step4, release=1, io.openshift.expose-services=, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, 
architecture=x86_64) Oct 14 04:37:36 localhost podman[85025]: 2025-10-14 08:37:36.934147055 +0000 UTC m=+0.267715651 container exec_died a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, io.buildah.version=1.33.12, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=ovn_controller, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, batch=17.1_20250721.1, com.redhat.component=openstack-ovn-controller-container, summary=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, io.openshift.expose-services=, managed_by=tripleo_ansible, architecture=x86_64, version=17.1.9, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-07-21T13:28:44, vendor=Red Hat, Inc.) 
Oct 14 04:37:36 localhost systemd[1]: a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5.service: Deactivated successfully. Oct 14 04:37:42 localhost haproxy-metadata-proxy-7d0cd696-bdd7-4e70-9512-eb0d23640314[84828]: 192.168.0.46:34440 [14/Oct/2025:08:37:40.735] listener listener/metadata 0/0/0/1319/1319 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1" Oct 14 04:37:42 localhost haproxy-metadata-proxy-7d0cd696-bdd7-4e70-9512-eb0d23640314[84828]: 192.168.0.46:34454 [14/Oct/2025:08:37:42.162] listener listener/metadata 0/0/0/14/14 404 281 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-keys HTTP/1.1" Oct 14 04:37:42 localhost haproxy-metadata-proxy-7d0cd696-bdd7-4e70-9512-eb0d23640314[84828]: 192.168.0.46:34464 [14/Oct/2025:08:37:42.234] listener listener/metadata 0/0/0/13/13 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1" Oct 14 04:37:42 localhost haproxy-metadata-proxy-7d0cd696-bdd7-4e70-9512-eb0d23640314[84828]: 192.168.0.46:34478 [14/Oct/2025:08:37:42.299] listener listener/metadata 0/0/0/13/13 200 120 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1" Oct 14 04:37:42 localhost haproxy-metadata-proxy-7d0cd696-bdd7-4e70-9512-eb0d23640314[84828]: 192.168.0.46:34494 [14/Oct/2025:08:37:42.358] listener listener/metadata 0/0/0/12/12 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-type HTTP/1.1" Oct 14 04:37:42 localhost haproxy-metadata-proxy-7d0cd696-bdd7-4e70-9512-eb0d23640314[84828]: 192.168.0.46:34508 [14/Oct/2025:08:37:42.413] listener listener/metadata 0/0/0/11/11 200 132 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1" Oct 14 04:37:42 localhost haproxy-metadata-proxy-7d0cd696-bdd7-4e70-9512-eb0d23640314[84828]: 192.168.0.46:34520 [14/Oct/2025:08:37:42.467] listener listener/metadata 0/0/0/11/11 200 134 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1" Oct 14 04:37:42 localhost 
haproxy-metadata-proxy-7d0cd696-bdd7-4e70-9512-eb0d23640314[84828]: 192.168.0.46:34530 [14/Oct/2025:08:37:42.525] listener listener/metadata 0/0/0/27/27 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/hostname HTTP/1.1" Oct 14 04:37:42 localhost haproxy-metadata-proxy-7d0cd696-bdd7-4e70-9512-eb0d23640314[84828]: 192.168.0.46:34536 [14/Oct/2025:08:37:42.603] listener listener/metadata 0/0/0/14/14 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-hostname HTTP/1.1" Oct 14 04:37:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6. Oct 14 04:37:42 localhost haproxy-metadata-proxy-7d0cd696-bdd7-4e70-9512-eb0d23640314[84828]: 192.168.0.46:34540 [14/Oct/2025:08:37:42.662] listener listener/metadata 0/0/0/12/12 404 281 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/user-data HTTP/1.1" Oct 14 04:37:42 localhost systemd[1]: tmp-crun.voau0Y.mount: Deactivated successfully. Oct 14 04:37:42 localhost haproxy-metadata-proxy-7d0cd696-bdd7-4e70-9512-eb0d23640314[84828]: 192.168.0.46:34550 [14/Oct/2025:08:37:42.725] listener listener/metadata 0/0/0/11/11 200 139 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1" Oct 14 04:37:42 localhost podman[85098]: 2025-10-14 08:37:42.737802304 +0000 UTC m=+0.083892085 container health_status 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vcs-type=git, batch=17.1_20250721.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, container_name=metrics_qdr, maintainer=OpenStack TripleO Team, io.buildah.version=1.33.12, tcib_managed=true, config_id=tripleo_step1, name=rhosp17/openstack-qdrouterd, io.openshift.expose-services=, 
com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, build-date=2025-07-21T13:07:59, release=1, io.openshift.tags=rhosp osp openstack osp-17.1, version=17.1.9, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, managed_by=tripleo_ansible) Oct 14 04:37:42 localhost haproxy-metadata-proxy-7d0cd696-bdd7-4e70-9512-eb0d23640314[84828]: 192.168.0.46:34558 [14/Oct/2025:08:37:42.768] listener listener/metadata 0/0/0/12/12 200 122 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1" Oct 14 04:37:42 localhost haproxy-metadata-proxy-7d0cd696-bdd7-4e70-9512-eb0d23640314[84828]: 192.168.0.46:34574 [14/Oct/2025:08:37:42.813] 
listener listener/metadata 0/0/0/14/14 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/ephemeral0 HTTP/1.1" Oct 14 04:37:42 localhost haproxy-metadata-proxy-7d0cd696-bdd7-4e70-9512-eb0d23640314[84828]: 192.168.0.46:34576 [14/Oct/2025:08:37:42.859] listener listener/metadata 0/0/0/13/13 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1" Oct 14 04:37:42 localhost haproxy-metadata-proxy-7d0cd696-bdd7-4e70-9512-eb0d23640314[84828]: 192.168.0.46:34584 [14/Oct/2025:08:37:42.920] listener listener/metadata 0/0/0/13/13 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-hostname HTTP/1.1" Oct 14 04:37:42 localhost podman[85098]: 2025-10-14 08:37:42.966304478 +0000 UTC m=+0.312394269 container exec_died 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, managed_by=tripleo_ansible, name=rhosp17/openstack-qdrouterd, maintainer=OpenStack TripleO Team, tcib_managed=true, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, container_name=metrics_qdr, release=1, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1, batch=17.1_20250721.1, version=17.1.9, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, architecture=x86_64, build-date=2025-07-21T13:07:59, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., config_id=tripleo_step1) Oct 14 04:37:42 localhost systemd[1]: 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6.service: Deactivated successfully. Oct 14 04:37:43 localhost haproxy-metadata-proxy-7d0cd696-bdd7-4e70-9512-eb0d23640314[84828]: 192.168.0.46:34586 [14/Oct/2025:08:37:42.985] listener listener/metadata 0/0/0/15/15 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1" Oct 14 04:37:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8. Oct 14 04:37:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea. 
Oct 14 04:37:58 localhost podman[85204]: 2025-10-14 08:37:58.741488346 +0000 UTC m=+0.084175394 container health_status 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, com.redhat.component=openstack-collectd-container, config_id=tripleo_step3, io.openshift.expose-services=, managed_by=tripleo_ansible, vcs-type=git, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, version=17.1.9, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, io.buildah.version=1.33.12, batch=17.1_20250721.1, distribution-scope=public, maintainer=OpenStack TripleO Team, release=2, build-date=2025-07-21T13:04:03, container_name=collectd, architecture=x86_64, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b) Oct 14 04:37:58 localhost podman[85205]: 2025-10-14 08:37:58.799824757 +0000 UTC m=+0.136572011 container health_status 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, tcib_managed=true, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, build-date=2025-07-21T13:27:15, io.buildah.version=1.33.12, com.redhat.component=openstack-iscsid-container, container_name=iscsid, release=1, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, vcs-type=git, version=17.1.9, config_id=tripleo_step3, name=rhosp17/openstack-iscsid, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, vendor=Red Hat, Inc., architecture=x86_64) Oct 14 04:37:58 localhost podman[85205]: 2025-10-14 08:37:58.811962123 +0000 UTC m=+0.148709387 container exec_died 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, tcib_managed=true, container_name=iscsid, config_id=tripleo_step3, io.buildah.version=1.33.12, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T13:27:15, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, vcs-type=git, architecture=x86_64, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.component=openstack-iscsid-container, version=17.1.9, vendor=Red Hat, Inc., batch=17.1_20250721.1, name=rhosp17/openstack-iscsid, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public) Oct 14 04:37:58 localhost systemd[1]: 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea.service: Deactivated successfully. 
Oct 14 04:37:58 localhost podman[85204]: 2025-10-14 08:37:58.83151087 +0000 UTC m=+0.174197888 container exec_died 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, managed_by=tripleo_ansible, build-date=2025-07-21T13:04:03, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true, io.buildah.version=1.33.12, com.redhat.component=openstack-collectd-container, distribution-scope=public, vcs-type=git, architecture=x86_64, vendor=Red Hat, Inc., config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.display-name=Red Hat 
OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, config_id=tripleo_step3, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 collectd, version=17.1.9, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, release=2, batch=17.1_20250721.1, container_name=collectd, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd) Oct 14 04:37:58 localhost systemd[1]: 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8.service: Deactivated successfully. Oct 14 04:38:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638. Oct 14 04:38:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884. Oct 14 04:38:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f. Oct 14 04:38:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e. Oct 14 04:38:04 localhost systemd[1]: tmp-crun.uhuQCY.mount: Deactivated successfully. 
Oct 14 04:38:04 localhost podman[85243]: 2025-10-14 08:38:04.790777529 +0000 UTC m=+0.132520765 container health_status 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true, io.buildah.version=1.33.12, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, 
com.redhat.component=openstack-ceilometer-ipmi-container, vendor=Red Hat, Inc., version=17.1.9, batch=17.1_20250721.1, release=1, name=rhosp17/openstack-ceilometer-ipmi, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, config_id=tripleo_step4, container_name=ceilometer_agent_ipmi, architecture=x86_64, vcs-type=git, build-date=2025-07-21T15:29:47, distribution-scope=public) Oct 14 04:38:04 localhost podman[85244]: 2025-10-14 08:38:04.750769957 +0000 UTC m=+0.084945968 container health_status 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc., architecture=x86_64, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, build-date=2025-07-21T14:48:37, batch=17.1_20250721.1, container_name=nova_migration_target, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.9, name=rhosp17/openstack-nova-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, io.buildah.version=1.33.12, managed_by=tripleo_ansible, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step4, vcs-type=git, maintainer=OpenStack TripleO Team, release=1, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute) Oct 14 04:38:04 localhost podman[85251]: 2025-10-14 08:38:04.769041404 +0000 UTC m=+0.095600829 container health_status def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, release=1, com.redhat.component=openstack-ceilometer-compute-container, container_name=ceilometer_agent_compute, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-07-21T14:45:33, name=rhosp17/openstack-ceilometer-compute, distribution-scope=public, io.buildah.version=1.33.12, batch=17.1_20250721.1, version=17.1.9, tcib_managed=true, maintainer=OpenStack TripleO Team, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': 
False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc.) 
Oct 14 04:38:04 localhost podman[85243]: 2025-10-14 08:38:04.843217437 +0000 UTC m=+0.184960763 container exec_died 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., version=17.1.9, build-date=2025-07-21T15:29:47, io.buildah.version=1.33.12, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, tcib_managed=true, architecture=x86_64, com.redhat.component=openstack-ceilometer-ipmi-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, config_id=tripleo_step4, distribution-scope=public, vcs-type=git, batch=17.1_20250721.1, name=rhosp17/openstack-ceilometer-ipmi, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, container_name=ceilometer_agent_ipmi, io.openshift.tags=rhosp osp openstack osp-17.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1) Oct 14 04:38:04 localhost systemd[1]: 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638.service: Deactivated successfully. Oct 14 04:38:04 localhost podman[85245]: 2025-10-14 08:38:04.844378812 +0000 UTC m=+0.177878632 container health_status d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, vendor=Red Hat, Inc., build-date=2025-07-21T13:07:52, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, config_id=tripleo_step4, container_name=logrotate_crond, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, release=1, vcs-type=git, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, distribution-scope=public, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, name=rhosp17/openstack-cron, version=17.1.9, batch=17.1_20250721.1, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, com.redhat.component=openstack-cron-container, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12) Oct 14 04:38:04 localhost podman[85251]: 2025-10-14 08:38:04.90326748 +0000 UTC m=+0.229826925 container exec_died def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, io.buildah.version=1.33.12, batch=17.1_20250721.1, com.redhat.component=openstack-ceilometer-compute-container, container_name=ceilometer_agent_compute, maintainer=OpenStack TripleO Team, architecture=x86_64, managed_by=tripleo_ansible, build-date=2025-07-21T14:45:33, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, version=17.1.9, io.openshift.expose-services=, release=1, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, 
distribution-scope=public, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4) Oct 14 04:38:04 localhost systemd[1]: def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e.service: Deactivated successfully. 
Oct 14 04:38:04 localhost podman[85245]: 2025-10-14 08:38:04.929175485 +0000 UTC m=+0.262675325 container exec_died d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.component=openstack-cron-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, container_name=logrotate_crond, version=17.1.9, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, distribution-scope=public, managed_by=tripleo_ansible, release=1, io.k8s.description=Red Hat OpenStack 
Platform 17.1 cron, build-date=2025-07-21T13:07:52, io.openshift.expose-services=, batch=17.1_20250721.1, config_id=tripleo_step4, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12) Oct 14 04:38:04 localhost systemd[1]: d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f.service: Deactivated successfully. Oct 14 04:38:05 localhost podman[85244]: 2025-10-14 08:38:05.087213671 +0000 UTC m=+0.421389682 container exec_died 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.openshift.expose-services=, release=1, build-date=2025-07-21T14:48:37, vendor=Red Hat, Inc., distribution-scope=public, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, container_name=nova_migration_target, vcs-type=git, name=rhosp17/openstack-nova-compute, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20250721.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, io.openshift.tags=rhosp osp openstack osp-17.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64) Oct 14 04:38:05 localhost systemd[1]: 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884.service: Deactivated successfully. Oct 14 04:38:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e. Oct 14 04:38:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5. Oct 14 04:38:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2. Oct 14 04:38:07 localhost systemd[1]: tmp-crun.E5Lhx3.mount: Deactivated successfully. 
Oct 14 04:38:07 localhost podman[85335]: 2025-10-14 08:38:07.767296457 +0000 UTC m=+0.106795966 container health_status a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhosp17/openstack-ovn-controller, vendor=Red Hat, Inc., managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.33.12, tcib_managed=true, com.redhat.component=openstack-ovn-controller-container, config_id=tripleo_step4, container_name=ovn_controller, version=17.1.9, build-date=2025-07-21T13:28:44, vcs-type=git, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, batch=17.1_20250721.1, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1, release=1) Oct 14 04:38:07 localhost podman[85335]: 2025-10-14 08:38:07.791071924 +0000 
UTC m=+0.130571433 container exec_died a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1, build-date=2025-07-21T13:28:44, vcs-type=git, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.33.12, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ovn-controller, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, version=17.1.9, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, container_name=ovn_controller, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.component=openstack-ovn-controller-container, batch=17.1_20250721.1) Oct 14 04:38:07 localhost podman[85336]: 2025-10-14 08:38:07.808864727 +0000 UTC m=+0.145359504 container health_status 
ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, release=1, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, version=17.1.9, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a-4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', 
'/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, io.buildah.version=1.33.12, managed_by=tripleo_ansible, tcib_managed=true, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, name=rhosp17/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1, batch=17.1_20250721.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, config_id=tripleo_step5, container_name=nova_compute, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-07-21T14:48:37) Oct 14 04:38:07 localhost systemd[1]: a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5.service: Deactivated successfully. 
Oct 14 04:38:07 localhost podman[85336]: 2025-10-14 08:38:07.86309253 +0000 UTC m=+0.199587286 container exec_died ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, architecture=x86_64, build-date=2025-07-21T14:48:37, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, container_name=nova_compute, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, io.openshift.expose-services=, config_id=tripleo_step5, io.buildah.version=1.33.12, batch=17.1_20250721.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1, vcs-type=git, distribution-scope=public, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, com.redhat.component=openstack-nova-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a-4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, version=17.1.9, name=rhosp17/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc.) 
Oct 14 04:38:07 localhost podman[85334]: 2025-10-14 08:38:07.873549845 +0000 UTC m=+0.213406216 container health_status 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, batch=17.1_20250721.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, release=1, vcs-type=git, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', 
'/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, container_name=ovn_metadata_agent, managed_by=tripleo_ansible, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, io.openshift.expose-services=, version=17.1.9, build-date=2025-07-21T16:28:53, distribution-scope=public, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1) Oct 14 04:38:07 localhost systemd[1]: ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2.service: Deactivated successfully. Oct 14 04:38:07 localhost podman[85334]: 2025-10-14 08:38:07.926307812 +0000 UTC m=+0.266164203 container exec_died 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1, version=17.1.9, build-date=2025-07-21T16:28:53, config_id=tripleo_step4, release=1, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, batch=17.1_20250721.1, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, vendor=Red Hat, Inc., name=rhosp17/openstack-neutron-metadata-agent-ovn, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, architecture=x86_64, description=Red Hat OpenStack 
Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, vcs-type=git) Oct 14 04:38:07 localhost systemd[1]: 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e.service: Deactivated successfully. Oct 14 04:38:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6. 
Oct 14 04:38:13 localhost systemd[1]: tmp-crun.wYR36f.mount: Deactivated successfully. Oct 14 04:38:13 localhost podman[85404]: 2025-10-14 08:38:13.754926106 +0000 UTC m=+0.098710815 container health_status 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, distribution-scope=public, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, io.buildah.version=1.33.12, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, vcs-type=git, maintainer=OpenStack TripleO Team, release=1, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, batch=17.1_20250721.1, com.redhat.component=openstack-qdrouterd-container, container_name=metrics_qdr, version=17.1.9, summary=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, build-date=2025-07-21T13:07:59, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true) Oct 14 04:38:13 localhost podman[85404]: 2025-10-14 08:38:13.969465606 +0000 UTC m=+0.313250315 container exec_died 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vcs-type=git, distribution-scope=public, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, batch=17.1_20250721.1, container_name=metrics_qdr, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, build-date=2025-07-21T13:07:59, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team, release=1, config_id=tripleo_step1, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, tcib_managed=true, io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.33.12, version=17.1.9, name=rhosp17/openstack-qdrouterd, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd) Oct 14 04:38:13 localhost systemd[1]: 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6.service: Deactivated successfully. Oct 14 04:38:22 localhost snmpd[68005]: empty variable list in _query Oct 14 04:38:22 localhost snmpd[68005]: empty variable list in _query Oct 14 04:38:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8. Oct 14 04:38:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea. Oct 14 04:38:29 localhost systemd[1]: tmp-crun.D1escx.mount: Deactivated successfully. 
Oct 14 04:38:29 localhost podman[85478]: 2025-10-14 08:38:29.75947441 +0000 UTC m=+0.094905247 container health_status 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, config_id=tripleo_step3, name=rhosp17/openstack-collectd, batch=17.1_20250721.1, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, architecture=x86_64, com.redhat.component=openstack-collectd-container, tcib_managed=true, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, release=2, managed_by=tripleo_ansible, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, vendor=Red Hat, Inc., io.buildah.version=1.33.12, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-07-21T13:04:03, container_name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 collectd) Oct 14 04:38:29 localhost podman[85478]: 2025-10-14 08:38:29.813929351 +0000 UTC m=+0.149360168 container exec_died 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, release=2, version=17.1.9, maintainer=OpenStack TripleO Team, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, vcs-type=git, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1, batch=17.1_20250721.1, container_name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, io.buildah.version=1.33.12, io.openshift.expose-services=, name=rhosp17/openstack-collectd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, architecture=x86_64, build-date=2025-07-21T13:04:03, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step3) Oct 14 04:38:29 localhost systemd[1]: 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8.service: Deactivated successfully. 
Oct 14 04:38:29 localhost podman[85479]: 2025-10-14 08:38:29.941702387 +0000 UTC m=+0.272661635 container health_status 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, config_id=tripleo_step3, distribution-scope=public, version=17.1.9, container_name=iscsid, com.redhat.component=openstack-iscsid-container, name=rhosp17/openstack-iscsid, batch=17.1_20250721.1, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, release=1, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, 
com.redhat.license_terms=https://www.redhat.com/agreements, tcib_managed=true, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, build-date=2025-07-21T13:27:15, io.openshift.expose-services=, vcs-type=git, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team) Oct 14 04:38:29 localhost podman[85479]: 2025-10-14 08:38:29.949503619 +0000 UTC m=+0.280462857 container exec_died 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, tcib_managed=true, vcs-type=git, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, release=1, version=17.1.9, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 
iscsid, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, vendor=Red Hat, Inc., config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=iscsid, io.openshift.expose-services=, batch=17.1_20250721.1, distribution-scope=public, build-date=2025-07-21T13:27:15, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid) Oct 14 04:38:29 localhost systemd[1]: 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea.service: Deactivated successfully. Oct 14 04:38:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638. Oct 14 04:38:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884. Oct 14 04:38:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f. Oct 14 04:38:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e. 
Oct 14 04:38:35 localhost podman[85518]: 2025-10-14 08:38:35.730938557 +0000 UTC m=+0.078973963 container health_status 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, build-date=2025-07-21T15:29:47, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, release=1, batch=17.1_20250721.1, distribution-scope=public, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.component=openstack-ceilometer-ipmi-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, name=rhosp17/openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat 
OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, io.openshift.expose-services=, version=17.1.9, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1) Oct 14 04:38:35 localhost podman[85518]: 2025-10-14 08:38:35.792002902 +0000 UTC m=+0.140038328 container exec_died 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-ipmi, batch=17.1_20250721.1, io.buildah.version=1.33.12, tcib_managed=true, version=17.1.9, com.redhat.component=openstack-ceilometer-ipmi-container, maintainer=OpenStack TripleO Team, release=1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-07-21T15:29:47, io.openshift.expose-services=, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1, architecture=x86_64, vendor=Red Hat, Inc., distribution-scope=public, container_name=ceilometer_agent_ipmi, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f) Oct 14 04:38:35 localhost systemd[1]: 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638.service: Deactivated successfully. Oct 14 04:38:35 localhost podman[85519]: 2025-10-14 08:38:35.836025148 +0000 UTC m=+0.175125257 container health_status 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, container_name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, architecture=x86_64, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, vendor=Red Hat, Inc., version=17.1.9, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, com.redhat.license_terms=https://www.redhat.com/agreements, batch=17.1_20250721.1, build-date=2025-07-21T14:48:37, name=rhosp17/openstack-nova-compute, release=1, distribution-scope=public, 
io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}) Oct 14 04:38:35 localhost podman[85526]: 2025-10-14 08:38:35.793863039 +0000 UTC m=+0.130076818 container health_status def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, batch=17.1_20250721.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, architecture=x86_64, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/agreements, release=1, name=rhosp17/openstack-ceilometer-compute, build-date=2025-07-21T14:45:33, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1, distribution-scope=public, config_data={'depends_on': 
['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.9, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.expose-services=, managed_by=tripleo_ansible, tcib_managed=true, io.buildah.version=1.33.12, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3) Oct 14 04:38:35 localhost podman[85526]: 2025-10-14 08:38:35.882551733 +0000 UTC m=+0.218765492 container exec_died def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 
ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, com.redhat.component=openstack-ceilometer-compute-container, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, version=17.1.9, io.buildah.version=1.33.12, vendor=Red Hat, Inc., io.openshift.expose-services=, managed_by=tripleo_ansible, release=1, container_name=ceilometer_agent_compute, io.openshift.tags=rhosp osp openstack osp-17.1, batch=17.1_20250721.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, architecture=x86_64, tcib_managed=true, build-date=2025-07-21T14:45:33, 
vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute) Oct 14 04:38:35 localhost systemd[1]: def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e.service: Deactivated successfully. Oct 14 04:38:35 localhost podman[85520]: 2025-10-14 08:38:35.90341367 +0000 UTC m=+0.240462616 container health_status d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 cron, build-date=2025-07-21T13:07:52, config_id=tripleo_step4, container_name=logrotate_crond, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, managed_by=tripleo_ansible, batch=17.1_20250721.1, io.openshift.tags=rhosp osp openstack osp-17.1, distribution-scope=public, io.buildah.version=1.33.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, 
vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, description=Red Hat OpenStack Platform 17.1 cron, release=1, io.openshift.expose-services=, name=rhosp17/openstack-cron, vcs-type=git, version=17.1.9, architecture=x86_64, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., com.redhat.component=openstack-cron-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 cron) Oct 14 04:38:35 localhost podman[85520]: 2025-10-14 08:38:35.93914815 +0000 UTC m=+0.276197056 container exec_died d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, version=17.1.9, batch=17.1_20250721.1, io.openshift.expose-services=, vendor=Red Hat, Inc., build-date=2025-07-21T13:07:52, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, architecture=x86_64, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 cron, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, com.redhat.component=openstack-cron-container, name=rhosp17/openstack-cron, release=1, config_id=tripleo_step4) Oct 14 04:38:35 localhost systemd[1]: d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f.service: Deactivated successfully. Oct 14 04:38:36 localhost podman[85519]: 2025-10-14 08:38:36.231531585 +0000 UTC m=+0.570631704 container exec_died 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, config_id=tripleo_step4, name=rhosp17/openstack-nova-compute, distribution-scope=public, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.33.12, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T14:48:37, container_name=nova_migration_target, vendor=Red Hat, Inc., version=17.1.9, tcib_managed=true, architecture=x86_64, io.openshift.expose-services=, batch=17.1_20250721.1, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, 
maintainer=OpenStack TripleO Team, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}) Oct 14 04:38:36 localhost systemd[1]: 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884.service: Deactivated successfully. Oct 14 04:38:36 localhost systemd[1]: tmp-crun.1J9PqP.mount: Deactivated successfully. Oct 14 04:38:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e. Oct 14 04:38:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5. Oct 14 04:38:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2. Oct 14 04:38:38 localhost systemd[1]: tmp-crun.fkFg6I.mount: Deactivated successfully. 
Oct 14 04:38:38 localhost podman[85613]: 2025-10-14 08:38:38.758884369 +0000 UTC m=+0.096424355 container health_status 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, io.buildah.version=1.33.12, container_name=ovn_metadata_agent, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, vcs-type=git, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, build-date=2025-07-21T16:28:53, distribution-scope=public, name=rhosp17/openstack-neutron-metadata-agent-ovn, tcib_managed=true, version=17.1.9, managed_by=tripleo_ansible, release=1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}) Oct 14 04:38:38 localhost systemd[1]: tmp-crun.xLL7E8.mount: Deactivated successfully. Oct 14 04:38:38 localhost podman[85613]: 2025-10-14 08:38:38.810924934 +0000 UTC m=+0.148464920 container exec_died 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.license_terms=https://www.redhat.com/agreements, managed_by=tripleo_ansible, release=1, architecture=x86_64, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.openshift.expose-services=, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, container_name=ovn_metadata_agent, version=17.1.9, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, build-date=2025-07-21T16:28:53, distribution-scope=public, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Oct 14 04:38:38 localhost systemd[1]: 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e.service: Deactivated successfully. 
Oct 14 04:38:38 localhost podman[85615]: 2025-10-14 08:38:38.861920787 +0000 UTC m=+0.189106721 container health_status ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, build-date=2025-07-21T14:48:37, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, distribution-scope=public, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.9, maintainer=OpenStack TripleO Team, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, release=1, batch=17.1_20250721.1, name=rhosp17/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a-4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, container_name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, io.buildah.version=1.33.12) Oct 14 04:38:38 localhost podman[85614]: 2025-10-14 08:38:38.815805056 +0000 UTC m=+0.145001763 container health_status a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, tcib_managed=true, build-date=2025-07-21T13:28:44, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, release=1, container_name=ovn_controller, version=17.1.9, batch=17.1_20250721.1, managed_by=tripleo_ansible, vendor=Red Hat, Inc., config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, distribution-scope=public, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, io.buildah.version=1.33.12, architecture=x86_64, config_id=tripleo_step4, com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, name=rhosp17/openstack-ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1) Oct 14 04:38:38 localhost podman[85614]: 2025-10-14 08:38:38.900127933 +0000 UTC m=+0.229324670 container exec_died a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, maintainer=OpenStack TripleO Team, tcib_managed=true, version=17.1.9, io.openshift.expose-services=, batch=17.1_20250721.1, build-date=2025-07-21T13:28:44, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, vcs-type=git, vendor=Red Hat, Inc., distribution-scope=public, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, release=1, config_data={'depends_on': ['openvswitch.service'], 'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, summary=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, name=rhosp17/openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/agreements) Oct 14 04:38:38 localhost systemd[1]: a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5.service: Deactivated successfully. 
Oct 14 04:38:38 localhost podman[85615]: 2025-10-14 08:38:38.913910041 +0000 UTC m=+0.241095955 container exec_died ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, tcib_managed=true, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a-4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step5, container_name=nova_compute, version=17.1.9, io.buildah.version=1.33.12, name=rhosp17/openstack-nova-compute, build-date=2025-07-21T14:48:37, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, managed_by=tripleo_ansible, vcs-type=git, batch=17.1_20250721.1, io.openshift.tags=rhosp osp openstack osp-17.1) Oct 14 04:38:38 localhost systemd[1]: ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2.service: Deactivated successfully. Oct 14 04:38:40 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Oct 14 04:38:40 localhost recover_tripleo_nova_virtqemud[85684]: 62551 Oct 14 04:38:40 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Oct 14 04:38:40 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. Oct 14 04:38:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6. 
Oct 14 04:38:44 localhost podman[85686]: 2025-10-14 08:38:44.768320327 +0000 UTC m=+0.093967539 container health_status 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, tcib_managed=true, vendor=Red Hat, Inc., com.redhat.component=openstack-qdrouterd-container, distribution-scope=public, io.openshift.expose-services=, architecture=x86_64, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-07-21T13:07:59, name=rhosp17/openstack-qdrouterd, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, batch=17.1_20250721.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, maintainer=OpenStack 
TripleO Team, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/agreements, version=17.1.9, managed_by=tripleo_ansible, release=1, config_id=tripleo_step1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, vcs-type=git) Oct 14 04:38:44 localhost podman[85686]: 2025-10-14 08:38:44.979483361 +0000 UTC m=+0.305130573 container exec_died 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1, io.openshift.expose-services=, batch=17.1_20250721.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', 
'/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-type=git, version=17.1.9, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=metrics_qdr, io.buildah.version=1.33.12, config_id=tripleo_step1, build-date=2025-07-21T13:07:59, name=rhosp17/openstack-qdrouterd, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, managed_by=tripleo_ansible, com.redhat.component=openstack-qdrouterd-container, vendor=Red Hat, Inc.) Oct 14 04:38:44 localhost systemd[1]: 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6.service: Deactivated successfully. Oct 14 04:39:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8. Oct 14 04:39:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea. 
Oct 14 04:39:00 localhost podman[85792]: 2025-10-14 08:39:00.757963621 +0000 UTC m=+0.099308085 container health_status 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, vcs-type=git, vendor=Red Hat, Inc., config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, version=17.1.9, 
managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, release=2, io.buildah.version=1.33.12, com.redhat.component=openstack-collectd-container, io.openshift.expose-services=, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, architecture=x86_64, container_name=collectd, distribution-scope=public, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, batch=17.1_20250721.1, build-date=2025-07-21T13:04:03, com.redhat.license_terms=https://www.redhat.com/agreements, tcib_managed=true) Oct 14 04:39:00 localhost podman[85792]: 2025-10-14 08:39:00.803635048 +0000 UTC m=+0.144979452 container exec_died 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, name=rhosp17/openstack-collectd, tcib_managed=true, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, distribution-scope=public, maintainer=OpenStack TripleO Team, version=17.1.9, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, managed_by=tripleo_ansible, batch=17.1_20250721.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, build-date=2025-07-21T13:04:03, io.openshift.tags=rhosp osp openstack osp-17.1, release=2, architecture=x86_64, config_id=tripleo_step3, container_name=collectd, io.buildah.version=1.33.12) Oct 14 04:39:00 localhost systemd[1]: 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8.service: Deactivated successfully. Oct 14 04:39:00 localhost systemd[1]: tmp-crun.ZwGFQp.mount: Deactivated successfully. 
Oct 14 04:39:00 localhost podman[85793]: 2025-10-14 08:39:00.907910085 +0000 UTC m=+0.246153093 container health_status 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.33.12, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, vcs-type=git, com.redhat.component=openstack-iscsid-container, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, build-date=2025-07-21T13:27:15, managed_by=tripleo_ansible, name=rhosp17/openstack-iscsid, batch=17.1_20250721.1, summary=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, 
com.redhat.license_terms=https://www.redhat.com/agreements, container_name=iscsid, tcib_managed=true, version=17.1.9, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, distribution-scope=public) Oct 14 04:39:00 localhost podman[85793]: 2025-10-14 08:39:00.918293807 +0000 UTC m=+0.256536855 container exec_died 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, com.redhat.component=openstack-iscsid-container, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, release=1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.tags=rhosp osp openstack osp-17.1, config_id=tripleo_step3, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, distribution-scope=public, batch=17.1_20250721.1, container_name=iscsid, io.openshift.expose-services=, managed_by=tripleo_ansible, architecture=x86_64, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.9, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-07-21T13:27:15, io.buildah.version=1.33.12, vendor=Red Hat, Inc.) Oct 14 04:39:00 localhost systemd[1]: 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea.service: Deactivated successfully. Oct 14 04:39:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638. Oct 14 04:39:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884. Oct 14 04:39:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f. Oct 14 04:39:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e. 
Oct 14 04:39:06 localhost podman[85832]: 2025-10-14 08:39:06.734762466 +0000 UTC m=+0.080235343 container health_status 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, name=rhosp17/openstack-ceilometer-ipmi, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.33.12, com.redhat.component=openstack-ceilometer-ipmi-container, maintainer=OpenStack TripleO Team, release=1, architecture=x86_64, version=17.1.9, vcs-type=git, io.openshift.expose-services=, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, build-date=2025-07-21T15:29:47, com.redhat.license_terms=https://www.redhat.com/agreements, batch=17.1_20250721.1, distribution-scope=public, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 
ceilometer-ipmi, config_id=tripleo_step4, container_name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1) Oct 14 04:39:06 localhost podman[85832]: 2025-10-14 08:39:06.815557974 +0000 UTC m=+0.161030841 container exec_died 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, version=17.1.9, build-date=2025-07-21T15:29:47, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, tcib_managed=true, managed_by=tripleo_ansible, io.buildah.version=1.33.12, vendor=Red Hat, Inc., batch=17.1_20250721.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1, name=rhosp17/openstack-ceilometer-ipmi, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f) Oct 14 04:39:06 localhost systemd[1]: 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638.service: Deactivated successfully. Oct 14 04:39:06 localhost podman[85838]: 2025-10-14 08:39:06.782024123 +0000 UTC m=+0.114696392 container health_status def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, io.openshift.expose-services=, build-date=2025-07-21T14:45:33, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-compute-container, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, config_id=tripleo_step4, distribution-scope=public, managed_by=tripleo_ansible, release=1, batch=17.1_20250721.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.33.12, container_name=ceilometer_agent_compute, name=rhosp17/openstack-ceilometer-compute, version=17.1.9, architecture=x86_64, vendor=Red Hat, Inc.) 
Oct 14 04:39:06 localhost podman[85833]: 2025-10-14 08:39:06.858326641 +0000 UTC m=+0.198461112 container health_status 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, container_name=nova_migration_target, tcib_managed=true, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1, version=17.1.9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, config_id=tripleo_step4, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-07-21T14:48:37, description=Red 
Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, release=1, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, maintainer=OpenStack TripleO Team, batch=17.1_20250721.1, architecture=x86_64, name=rhosp17/openstack-nova-compute) Oct 14 04:39:06 localhost podman[85838]: 2025-10-14 08:39:06.866161955 +0000 UTC m=+0.198834224 container exec_died def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, tcib_managed=true, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, maintainer=OpenStack TripleO Team, build-date=2025-07-21T14:45:33, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., release=1, description=Red Hat 
OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20250721.1, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, distribution-scope=public, name=rhosp17/openstack-ceilometer-compute, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, managed_by=tripleo_ansible, version=17.1.9, architecture=x86_64, com.redhat.component=openstack-ceilometer-compute-container) Oct 14 04:39:06 localhost systemd[1]: def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e.service: Deactivated successfully. Oct 14 04:39:06 localhost podman[85834]: 2025-10-14 08:39:06.822505069 +0000 UTC m=+0.161583306 container health_status d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, batch=17.1_20250721.1, container_name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1, version=17.1.9, vendor=Red Hat, Inc., distribution-scope=public, name=rhosp17/openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, maintainer=OpenStack TripleO Team, release=1, build-date=2025-07-21T13:07:52, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step4, tcib_managed=true, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=) Oct 14 04:39:06 localhost podman[85834]: 2025-10-14 08:39:06.95397304 +0000 UTC m=+0.293051267 container exec_died d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1, version=17.1.9, vcs-type=git, vendor=Red Hat, Inc., managed_by=tripleo_ansible, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': 
True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, com.redhat.component=openstack-cron-container, release=1, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step4, name=rhosp17/openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, io.openshift.expose-services=, build-date=2025-07-21T13:07:52, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, batch=17.1_20250721.1, maintainer=OpenStack TripleO Team, container_name=logrotate_crond, distribution-scope=public) Oct 14 04:39:06 localhost systemd[1]: d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f.service: Deactivated successfully. 
Oct 14 04:39:07 localhost podman[85833]: 2025-10-14 08:39:07.277253936 +0000 UTC m=+0.617388467 container exec_died 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, version=17.1.9, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, build-date=2025-07-21T14:48:37, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, release=1, vcs-type=git, vendor=Red Hat, Inc., architecture=x86_64, maintainer=OpenStack TripleO Team, batch=17.1_20250721.1, com.redhat.component=openstack-nova-compute-container, 
io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step4, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute) Oct 14 04:39:07 localhost systemd[1]: 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884.service: Deactivated successfully. Oct 14 04:39:07 localhost systemd[1]: tmp-crun.gMzAIL.mount: Deactivated successfully. Oct 14 04:39:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e. Oct 14 04:39:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5. Oct 14 04:39:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2. Oct 14 04:39:09 localhost systemd[1]: tmp-crun.3JRVAZ.mount: Deactivated successfully. 
Oct 14 04:39:09 localhost podman[85931]: 2025-10-14 08:39:09.766014113 +0000 UTC m=+0.099646964 container health_status ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, config_id=tripleo_step5, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/agreements, managed_by=tripleo_ansible, vendor=Red Hat, Inc., batch=17.1_20250721.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a-4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', 
'/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, version=17.1.9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, io.buildah.version=1.33.12, name=rhosp17/openstack-nova-compute, vcs-type=git, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1, tcib_managed=true, build-date=2025-07-21T14:48:37, io.openshift.tags=rhosp osp openstack osp-17.1) Oct 14 04:39:09 localhost podman[85931]: 2025-10-14 08:39:09.79909523 +0000 UTC m=+0.132728121 container exec_died ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/agreements, batch=17.1_20250721.1, container_name=nova_compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, build-date=2025-07-21T14:48:37, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-nova-compute, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, release=1, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red 
Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, managed_by=tripleo_ansible, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a-4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, version=17.1.9, io.buildah.version=1.33.12, vcs-type=git, io.openshift.expose-services=) Oct 14 04:39:09 localhost 
podman[85930]: 2025-10-14 08:39:09.810159243 +0000 UTC m=+0.143347330 container health_status a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, config_id=tripleo_step4, io.openshift.expose-services=, batch=17.1_20250721.1, tcib_managed=true, vcs-type=git, release=1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, vendor=Red Hat, Inc., version=17.1.9, managed_by=tripleo_ansible, build-date=2025-07-21T13:28:44, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.component=openstack-ovn-controller-container, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-ovn-controller, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245) Oct 14 04:39:09 localhost systemd[1]: tmp-crun.kbazXD.mount: Deactivated successfully. 
Oct 14 04:39:09 localhost systemd[1]: ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2.service: Deactivated successfully. Oct 14 04:39:09 localhost podman[85930]: 2025-10-14 08:39:09.840380202 +0000 UTC m=+0.173568259 container exec_died a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, name=rhosp17/openstack-ovn-controller, tcib_managed=true, release=1, com.redhat.component=openstack-ovn-controller-container, batch=17.1_20250721.1, config_id=tripleo_step4, io.buildah.version=1.33.12, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, managed_by=tripleo_ansible, vendor=Red Hat, Inc., version=17.1.9, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-07-21T13:28:44, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, container_name=ovn_controller, io.k8s.display-name=Red Hat 
OpenStack Platform 17.1 ovn-controller) Oct 14 04:39:09 localhost systemd[1]: a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5.service: Deactivated successfully. Oct 14 04:39:09 localhost podman[85929]: 2025-10-14 08:39:09.915525983 +0000 UTC m=+0.254359126 container health_status 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, vendor=Red Hat, Inc., vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-07-21T16:28:53, io.openshift.tags=rhosp osp openstack osp-17.1, batch=17.1_20250721.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, maintainer=OpenStack TripleO Team, vcs-type=git, version=17.1.9, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, name=rhosp17/openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, tcib_managed=true, container_name=ovn_metadata_agent, distribution-scope=public, io.openshift.expose-services=) Oct 14 04:39:09 localhost podman[85929]: 2025-10-14 08:39:09.988282752 +0000 UTC m=+0.327115875 container exec_died 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.openshift.expose-services=, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, architecture=x86_64, config_id=tripleo_step4, container_name=ovn_metadata_agent, vendor=Red Hat, Inc., batch=17.1_20250721.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhosp17/openstack-neutron-metadata-agent-ovn, distribution-scope=public, release=1, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T16:28:53, vcs-type=git, version=17.1.9, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1) Oct 14 04:39:10 localhost systemd[1]: 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e.service: Deactivated successfully. 
Oct 14 04:39:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6. Oct 14 04:39:15 localhost podman[86002]: 2025-10-14 08:39:15.744394575 +0000 UTC m=+0.086509977 container health_status 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, version=17.1.9, container_name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, config_id=tripleo_step1, batch=17.1_20250721.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.33.12, com.redhat.component=openstack-qdrouterd-container, architecture=x86_64, build-date=2025-07-21T13:07:59, maintainer=OpenStack TripleO Team, 
io.openshift.expose-services=, tcib_managed=true, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, name=rhosp17/openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public) Oct 14 04:39:15 localhost podman[86002]: 2025-10-14 08:39:15.944047392 +0000 UTC m=+0.286162734 container exec_died 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step1, tcib_managed=true, version=17.1.9, container_name=metrics_qdr, io.openshift.expose-services=, io.buildah.version=1.33.12, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, release=1, build-date=2025-07-21T13:07:59, com.redhat.component=openstack-qdrouterd-container, distribution-scope=public, name=rhosp17/openstack-qdrouterd, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, vendor=Red Hat, Inc., architecture=x86_64, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, batch=17.1_20250721.1) Oct 14 04:39:15 localhost systemd[1]: 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6.service: Deactivated successfully. Oct 14 04:39:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8. Oct 14 04:39:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea. Oct 14 04:39:31 localhost systemd[1]: tmp-crun.J7CJVX.mount: Deactivated successfully. 
Oct 14 04:39:31 localhost podman[86075]: 2025-10-14 08:39:31.740828992 +0000 UTC m=+0.087813217 container health_status 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, build-date=2025-07-21T13:04:03, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, release=2, tcib_managed=true, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, container_name=collectd, io.openshift.expose-services=, name=rhosp17/openstack-collectd, vendor=Red Hat, Inc., architecture=x86_64, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, version=17.1.9, config_id=tripleo_step3, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 collectd) Oct 14 04:39:31 localhost podman[86075]: 2025-10-14 08:39:31.752945679 +0000 UTC m=+0.099929924 container exec_died 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, config_id=tripleo_step3, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, com.redhat.component=openstack-collectd-container, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, version=17.1.9, release=2, container_name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, managed_by=tripleo_ansible, build-date=2025-07-21T13:04:03, batch=17.1_20250721.1, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, vendor=Red Hat, Inc., vcs-type=git, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public) Oct 14 04:39:31 localhost systemd[1]: 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8.service: Deactivated successfully. 
Oct 14 04:39:31 localhost podman[86076]: 2025-10-14 08:39:31.843757627 +0000 UTC m=+0.183290530 container health_status 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, name=rhosp17/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, distribution-scope=public, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, release=1, vcs-type=git, batch=17.1_20250721.1, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-07-21T13:27:15, com.redhat.component=openstack-iscsid-container, version=17.1.9, managed_by=tripleo_ansible, tcib_managed=true, container_name=iscsid, architecture=x86_64) Oct 14 04:39:31 localhost podman[86076]: 2025-10-14 08:39:31.853145558 +0000 UTC m=+0.192678471 container exec_died 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, container_name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1, batch=17.1_20250721.1, vendor=Red Hat, Inc., io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', 
'/var/lib/iscsi:/var/lib/iscsi:z']}, version=17.1.9, com.redhat.component=openstack-iscsid-container, distribution-scope=public, release=1, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhosp17/openstack-iscsid, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-07-21T13:27:15, summary=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, tcib_managed=true, config_id=tripleo_step3, vcs-type=git, io.buildah.version=1.33.12) Oct 14 04:39:31 localhost systemd[1]: 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea.service: Deactivated successfully. Oct 14 04:39:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638. Oct 14 04:39:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884. Oct 14 04:39:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f. Oct 14 04:39:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e. Oct 14 04:39:37 localhost systemd[1]: tmp-crun.eUUcNv.mount: Deactivated successfully. 
Oct 14 04:39:37 localhost podman[86113]: 2025-10-14 08:39:37.755268604 +0000 UTC m=+0.095340101 container health_status 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, name=rhosp17/openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., build-date=2025-07-21T15:29:47, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.tags=rhosp osp openstack osp-17.1, batch=17.1_20250721.1, container_name=ceilometer_agent_ipmi, io.buildah.version=1.33.12, release=1, 
com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, version=17.1.9, config_id=tripleo_step4, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git) Oct 14 04:39:37 localhost podman[86113]: 2025-10-14 08:39:37.78703805 +0000 UTC m=+0.127109537 container exec_died 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, config_id=tripleo_step4, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, container_name=ceilometer_agent_ipmi, io.buildah.version=1.33.12, 
version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-ipmi-container, batch=17.1_20250721.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, build-date=2025-07-21T15:29:47, distribution-scope=public, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, io.openshift.expose-services=, release=1, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi) Oct 14 04:39:37 localhost systemd[1]: 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638.service: Deactivated successfully. Oct 14 04:39:37 localhost podman[86115]: 2025-10-14 08:39:37.811407266 +0000 UTC m=+0.145893299 container health_status d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, com.redhat.component=openstack-cron-container, summary=Red Hat OpenStack Platform 17.1 cron, build-date=2025-07-21T13:07:52, io.openshift.tags=rhosp osp openstack osp-17.1, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 
'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, version=17.1.9, io.openshift.expose-services=, release=1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, architecture=x86_64, managed_by=tripleo_ansible, batch=17.1_20250721.1, vcs-type=git) Oct 14 04:39:37 localhost podman[86114]: 2025-10-14 08:39:37.847543578 +0000 UTC m=+0.184492018 container health_status 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, release=1, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step4, io.buildah.version=1.33.12, vcs-type=git, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, build-date=2025-07-21T14:48:37, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.9, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, distribution-scope=public, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, architecture=x86_64, name=rhosp17/openstack-nova-compute, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, managed_by=tripleo_ansible) Oct 14 04:39:37 localhost podman[86116]: 2025-10-14 08:39:37.903911268 +0000 UTC m=+0.236168842 container health_status def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, build-date=2025-07-21T14:45:33, vendor=Red Hat, Inc., version=17.1.9, 
io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, distribution-scope=public, container_name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, batch=17.1_20250721.1, com.redhat.component=openstack-ceilometer-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, tcib_managed=true) Oct 14 04:39:37 localhost podman[86115]: 
2025-10-14 08:39:37.925627692 +0000 UTC m=+0.260113745 container exec_died d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, maintainer=OpenStack TripleO Team, build-date=2025-07-21T13:07:52, com.redhat.license_terms=https://www.redhat.com/agreements, version=17.1.9, name=rhosp17/openstack-cron, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, io.buildah.version=1.33.12, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, managed_by=tripleo_ansible, com.redhat.component=openstack-cron-container, io.openshift.tags=rhosp osp openstack osp-17.1, io.openshift.expose-services=, vcs-type=git, architecture=x86_64, batch=17.1_20250721.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, 
distribution-scope=public, vendor=Red Hat, Inc., release=1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond) Oct 14 04:39:37 localhost systemd[1]: d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f.service: Deactivated successfully. Oct 14 04:39:38 localhost podman[86116]: 2025-10-14 08:39:38.013450028 +0000 UTC m=+0.345707272 container exec_died def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, vendor=Red Hat, Inc., distribution-scope=public, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, build-date=2025-07-21T14:45:33, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1, io.openshift.expose-services=, vcs-type=git, batch=17.1_20250721.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.9, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, release=1) Oct 14 04:39:38 localhost systemd[1]: def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e.service: Deactivated successfully. Oct 14 04:39:38 localhost podman[86114]: 2025-10-14 08:39:38.286309229 +0000 UTC m=+0.623257659 container exec_died 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, distribution-scope=public, tcib_managed=true, batch=17.1_20250721.1, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.33.12, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, name=rhosp17/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, release=1, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, build-date=2025-07-21T14:48:37, container_name=nova_migration_target, io.openshift.expose-services=, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.9, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1) Oct 14 04:39:38 localhost systemd[1]: 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884.service: Deactivated successfully. Oct 14 04:39:38 localhost systemd[1]: tmp-crun.4DCg42.mount: Deactivated successfully. Oct 14 04:39:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e. Oct 14 04:39:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5. Oct 14 04:39:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2. 
Oct 14 04:39:40 localhost systemd[1]: tmp-crun.rbQchq.mount: Deactivated successfully. Oct 14 04:39:40 localhost systemd[1]: tmp-crun.dXwMUS.mount: Deactivated successfully. Oct 14 04:39:40 localhost podman[86211]: 2025-10-14 08:39:40.82158962 +0000 UTC m=+0.145890420 container health_status ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, config_id=tripleo_step5, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, vcs-type=git, build-date=2025-07-21T14:48:37, name=rhosp17/openstack-nova-compute, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=nova_compute, maintainer=OpenStack TripleO Team, tcib_managed=true, vendor=Red Hat, Inc., batch=17.1_20250721.1, io.buildah.version=1.33.12, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a-4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, version=17.1.9) Oct 14 04:39:40 localhost podman[86210]: 2025-10-14 08:39:40.784091205 +0000 UTC m=+0.113524395 container health_status a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, release=1, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, io.openshift.expose-services=, com.redhat.component=openstack-ovn-controller-container, io.openshift.tags=rhosp osp openstack osp-17.1, container_name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, version=17.1.9, 
managed_by=tripleo_ansible, tcib_managed=true, architecture=x86_64, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, config_id=tripleo_step4, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, batch=17.1_20250721.1, build-date=2025-07-21T13:28:44) Oct 14 04:39:40 localhost podman[86209]: 2025-10-14 08:39:40.862401776 +0000 UTC m=+0.195852491 container health_status 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, architecture=x86_64, io.openshift.expose-services=, distribution-scope=public, release=1, container_name=ovn_metadata_agent, batch=17.1_20250721.1, vcs-type=git, config_id=tripleo_step4, managed_by=tripleo_ansible, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 
'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.buildah.version=1.33.12, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, version=17.1.9, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, build-date=2025-07-21T16:28:53, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn) Oct 14 04:39:40 localhost podman[86210]: 2025-10-14 08:39:40.868348751 +0000 UTC m=+0.197781941 container 
exec_died a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, tcib_managed=true, vendor=Red Hat, Inc., architecture=x86_64, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, name=rhosp17/openstack-ovn-controller, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, release=1, managed_by=tripleo_ansible, version=17.1.9, io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T13:28:44, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, container_name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20250721.1) Oct 14 04:39:40 localhost podman[86211]: 2025-10-14 08:39:40.877151604 +0000 UTC m=+0.201452454 container exec_died ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2 
(image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, name=rhosp17/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-07-21T14:48:37, config_id=tripleo_step5, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, batch=17.1_20250721.1, io.buildah.version=1.33.12, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, version=17.1.9, tcib_managed=true, architecture=x86_64, container_name=nova_compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, release=1, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a-4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public) Oct 14 04:39:40 localhost systemd[1]: a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5.service: Deactivated successfully. Oct 14 04:39:40 localhost systemd[1]: ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2.service: Deactivated successfully. 
Oct 14 04:39:40 localhost podman[86209]: 2025-10-14 08:39:40.916240298 +0000 UTC m=+0.249691003 container exec_died 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, vendor=Red Hat, Inc., config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', 
'/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, tcib_managed=true, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, version=17.1.9, vcs-type=git, io.buildah.version=1.33.12, release=1, batch=17.1_20250721.1, build-date=2025-07-21T16:28:53, config_id=tripleo_step4, distribution-scope=public, managed_by=tripleo_ansible, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, name=rhosp17/openstack-neutron-metadata-agent-ovn) Oct 14 04:39:40 localhost systemd[1]: 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e.service: Deactivated successfully. Oct 14 04:39:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6. 
Oct 14 04:39:46 localhost podman[86282]: 2025-10-14 08:39:46.745772488 +0000 UTC m=+0.086857017 container health_status 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, maintainer=OpenStack TripleO Team, architecture=x86_64, batch=17.1_20250721.1, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, name=rhosp17/openstack-qdrouterd, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-type=git, description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, config_id=tripleo_step1, io.buildah.version=1.33.12, tcib_managed=true, version=17.1.9, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, release=1, build-date=2025-07-21T13:07:59, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, container_name=metrics_qdr, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd) Oct 14 04:39:46 localhost podman[86282]: 2025-10-14 08:39:46.933378412 +0000 UTC m=+0.274462941 container exec_died 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, container_name=metrics_qdr, vendor=Red Hat, Inc., version=17.1.9, config_id=tripleo_step1, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20250721.1, build-date=2025-07-21T13:07:59, io.openshift.expose-services=, vcs-type=git, release=1, com.redhat.component=openstack-qdrouterd-container, managed_by=tripleo_ansible, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, name=rhosp17/openstack-qdrouterd, architecture=x86_64, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd) Oct 14 04:39:46 localhost systemd[1]: 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6.service: Deactivated successfully. Oct 14 04:40:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8. Oct 14 04:40:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea. Oct 14 04:40:02 localhost systemd[1]: tmp-crun.2sQgXu.mount: Deactivated successfully. 
Oct 14 04:40:02 localhost podman[86389]: 2025-10-14 08:40:02.79132353 +0000 UTC m=+0.124775024 container health_status 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, distribution-scope=public, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, release=1, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, name=rhosp17/openstack-iscsid, vcs-type=git, version=17.1.9, description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., com.redhat.component=openstack-iscsid-container, batch=17.1_20250721.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.tags=rhosp osp openstack osp-17.1, 
architecture=x86_64, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, build-date=2025-07-21T13:27:15, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/agreements) Oct 14 04:40:02 localhost podman[86389]: 2025-10-14 08:40:02.805019765 +0000 UTC m=+0.138471269 container exec_died 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, tcib_managed=true, vcs-type=git, architecture=x86_64, release=1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, managed_by=tripleo_ansible, distribution-scope=public, summary=Red Hat OpenStack 
Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, container_name=iscsid, maintainer=OpenStack TripleO Team, build-date=2025-07-21T13:27:15, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.9, config_id=tripleo_step3, com.redhat.component=openstack-iscsid-container, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, vendor=Red Hat, Inc., batch=17.1_20250721.1) Oct 14 04:40:02 localhost podman[86388]: 2025-10-14 08:40:02.760131822 +0000 UTC m=+0.098422566 container health_status 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, version=17.1.9, com.redhat.component=openstack-collectd-container, io.openshift.expose-services=, build-date=2025-07-21T13:04:03, io.buildah.version=1.33.12, release=2, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, name=rhosp17/openstack-collectd, vendor=Red Hat, Inc., vcs-type=git, config_id=tripleo_step3, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, maintainer=OpenStack TripleO Team, tcib_managed=true, managed_by=tripleo_ansible) Oct 14 04:40:02 localhost systemd[1]: 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea.service: Deactivated successfully. 
Oct 14 04:40:02 localhost podman[86388]: 2025-10-14 08:40:02.840205838 +0000 UTC m=+0.178496552 container exec_died 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, summary=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, architecture=x86_64, name=rhosp17/openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, maintainer=OpenStack TripleO Team, 
config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, release=2, build-date=2025-07-21T13:04:03, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, version=17.1.9, com.redhat.component=openstack-collectd-container, tcib_managed=true, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, vcs-type=git, io.openshift.expose-services=) Oct 14 04:40:02 localhost systemd[1]: 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8.service: Deactivated successfully. Oct 14 04:40:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638. Oct 14 04:40:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884. Oct 14 04:40:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f. Oct 14 04:40:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e. 
Oct 14 04:40:08 localhost podman[86426]: 2025-10-14 08:40:08.75120741 +0000 UTC m=+0.088663763 container health_status 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, config_id=tripleo_step4, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, com.redhat.component=openstack-nova-compute-container, container_name=nova_migration_target, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.9, vcs-type=git, build-date=2025-07-21T14:48:37, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhosp17/openstack-nova-compute, vendor=Red Hat, Inc., distribution-scope=public, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, batch=17.1_20250721.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}) Oct 14 04:40:08 localhost systemd[1]: tmp-crun.ePO86l.mount: Deactivated successfully. Oct 14 04:40:08 localhost podman[86425]: 2025-10-14 08:40:08.811554303 +0000 UTC m=+0.150355208 container health_status 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, name=rhosp17/openstack-ceilometer-ipmi, batch=17.1_20250721.1, vendor=Red Hat, Inc., vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, io.openshift.tags=rhosp osp openstack osp-17.1, version=17.1.9, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_ipmi, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, release=1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, build-date=2025-07-21T15:29:47, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step4, io.buildah.version=1.33.12, vcs-type=git) Oct 14 04:40:08 localhost podman[86427]: 2025-10-14 08:40:08.852162584 +0000 UTC m=+0.186769919 container health_status d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, build-date=2025-07-21T13:07:52, com.redhat.license_terms=https://www.redhat.com/agreements, tcib_managed=true, vcs-type=git, vendor=Red Hat, Inc., com.redhat.component=openstack-cron-container, name=rhosp17/openstack-cron, distribution-scope=public, batch=17.1_20250721.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 
'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, version=17.1.9, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1, container_name=logrotate_crond, io.buildah.version=1.33.12, io.openshift.expose-services=, managed_by=tripleo_ansible, release=1, architecture=x86_64) Oct 14 04:40:08 localhost podman[86425]: 2025-10-14 08:40:08.870158682 +0000 UTC m=+0.208959557 container exec_died 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, distribution-scope=public, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.33.12, tcib_managed=true, vendor=Red Hat, Inc., architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.9, com.redhat.component=openstack-ceilometer-ipmi-container, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1, vcs-type=git, batch=17.1_20250721.1, build-date=2025-07-21T15:29:47, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, name=rhosp17/openstack-ceilometer-ipmi, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_ipmi) Oct 14 04:40:08 localhost systemd[1]: 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638.service: Deactivated successfully. 
Oct 14 04:40:08 localhost podman[86427]: 2025-10-14 08:40:08.889106861 +0000 UTC m=+0.223714176 container exec_died d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, distribution-scope=public, batch=17.1_20250721.1, io.openshift.expose-services=, name=rhosp17/openstack-cron, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step4, tcib_managed=true, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, build-date=2025-07-21T13:07:52, release=1, container_name=logrotate_crond, vcs-type=git, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1, version=17.1.9, summary=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.33.12, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, vendor=Red Hat, Inc., com.redhat.component=openstack-cron-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64) Oct 14 04:40:08 localhost systemd[1]: d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f.service: Deactivated successfully. Oct 14 04:40:08 localhost podman[86428]: 2025-10-14 08:40:08.95801606 +0000 UTC m=+0.288735555 container health_status def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, config_id=tripleo_step4, vendor=Red Hat, Inc., distribution-scope=public, tcib_managed=true, release=1, com.redhat.component=openstack-ceilometer-compute-container, version=17.1.9, name=rhosp17/openstack-ceilometer-compute, vcs-type=git, architecture=x86_64, io.buildah.version=1.33.12, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack TripleO Team, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, batch=17.1_20250721.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, build-date=2025-07-21T14:45:33, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1) Oct 14 04:40:08 localhost podman[86428]: 2025-10-14 08:40:08.993278264 +0000 UTC m=+0.323997759 container exec_died def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, vcs-type=git, build-date=2025-07-21T14:45:33, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, container_name=ceilometer_agent_compute, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-compute, batch=17.1_20250721.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, io.openshift.expose-services=, vendor=Red Hat, Inc., version=17.1.9, distribution-scope=public, tcib_managed=true, release=1) Oct 14 04:40:09 localhost systemd[1]: def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e.service: Deactivated successfully. 
Oct 14 04:40:09 localhost podman[86426]: 2025-10-14 08:40:09.121434993 +0000 UTC m=+0.458882046 container exec_died 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, managed_by=tripleo_ansible, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, build-date=2025-07-21T14:48:37, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., version=17.1.9, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack 
Platform 17.1 nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, io.buildah.version=1.33.12, name=rhosp17/openstack-nova-compute, batch=17.1_20250721.1, distribution-scope=public, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, config_id=tripleo_step4, container_name=nova_migration_target, release=1) Oct 14 04:40:09 localhost systemd[1]: 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884.service: Deactivated successfully. Oct 14 04:40:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e. Oct 14 04:40:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5. Oct 14 04:40:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2. Oct 14 04:40:11 localhost podman[86520]: 2025-10-14 08:40:11.733574219 +0000 UTC m=+0.077030592 container health_status 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, vendor=Red Hat, Inc., release=1, vcs-type=git, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.9, maintainer=OpenStack TripleO Team, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20250721.1, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=ovn_metadata_agent, distribution-scope=public, io.buildah.version=1.33.12, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, name=rhosp17/openstack-neutron-metadata-agent-ovn, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, build-date=2025-07-21T16:28:53, com.redhat.component=openstack-neutron-metadata-agent-ovn-container) Oct 14 04:40:11 localhost systemd[1]: tmp-crun.F2oUOq.mount: Deactivated successfully. 
Oct 14 04:40:11 localhost podman[86521]: 2025-10-14 08:40:11.757322426 +0000 UTC m=+0.096219248 container health_status a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, maintainer=OpenStack TripleO Team, release=1, com.redhat.component=openstack-ovn-controller-container, name=rhosp17/openstack-ovn-controller, container_name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., architecture=x86_64, config_id=tripleo_step4, tcib_managed=true, distribution-scope=public, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, version=17.1.9, summary=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20250721.1, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T13:28:44, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ovn-controller) Oct 14 04:40:11 localhost podman[86521]: 2025-10-14 08:40:11.808558567 +0000 
UTC m=+0.147455409 container exec_died a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, version=17.1.9, config_id=tripleo_step4, io.openshift.expose-services=, io.buildah.version=1.33.12, name=rhosp17/openstack-ovn-controller, vcs-type=git, com.redhat.component=openstack-ovn-controller-container, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, release=1, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, com.redhat.license_terms=https://www.redhat.com/agreements, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, build-date=2025-07-21T13:28:44, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, container_name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller) Oct 14 04:40:11 localhost systemd[1]: a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5.service: Deactivated successfully. 
Oct 14 04:40:11 localhost podman[86520]: 2025-10-14 08:40:11.82349228 +0000 UTC m=+0.166948613 container exec_died 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-07-21T16:28:53, distribution-scope=public, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, managed_by=tripleo_ansible, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, name=rhosp17/openstack-neutron-metadata-agent-ovn, release=1, batch=17.1_20250721.1, io.openshift.expose-services=, version=17.1.9, maintainer=OpenStack TripleO Team, tcib_managed=true, vendor=Red Hat, Inc., io.buildah.version=1.33.12, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Oct 14 04:40:11 localhost systemd[1]: 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e.service: Deactivated successfully. Oct 14 04:40:11 localhost podman[86522]: 2025-10-14 08:40:11.809050142 +0000 UTC m=+0.145578660 container health_status ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, name=rhosp17/openstack-nova-compute, vendor=Red Hat, Inc., io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, config_id=tripleo_step5, container_name=nova_compute, version=17.1.9, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, build-date=2025-07-21T14:48:37, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 
'49c4309af9a4fea3d3f53b6222780f5a-4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20250721.1, distribution-scope=public, io.buildah.version=1.33.12, release=1, com.redhat.license_terms=https://www.redhat.com/agreements, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, architecture=x86_64, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1) Oct 14 04:40:11 localhost podman[86522]: 2025-10-14 
08:40:11.893435851 +0000 UTC m=+0.229964369 container exec_died ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, maintainer=OpenStack TripleO Team, container_name=nova_compute, vendor=Red Hat, Inc., vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, vcs-type=git, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.9, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, architecture=x86_64, build-date=2025-07-21T14:48:37, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a-4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, release=1, io.openshift.tags=rhosp osp openstack osp-17.1, distribution-scope=public, io.buildah.version=1.33.12, tcib_managed=true, name=rhosp17/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container) Oct 14 04:40:11 localhost systemd[1]: ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2.service: Deactivated successfully. Oct 14 04:40:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6. Oct 14 04:40:17 localhost systemd[1]: tmp-crun.E4AIvw.mount: Deactivated successfully. 
Oct 14 04:40:17 localhost podman[86595]: 2025-10-14 08:40:17.750377424 +0000 UTC m=+0.091229833 container health_status 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.openshift.expose-services=, release=1, com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, build-date=2025-07-21T13:07:59, architecture=x86_64, container_name=metrics_qdr, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, batch=17.1_20250721.1, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, maintainer=OpenStack TripleO Team, io.buildah.version=1.33.12, vendor=Red Hat, Inc., tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 qdrouterd, 
io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, name=rhosp17/openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, managed_by=tripleo_ansible, config_id=tripleo_step1) Oct 14 04:40:17 localhost podman[86595]: 2025-10-14 08:40:17.941029602 +0000 UTC m=+0.281882021 container exec_died 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-07-21T13:07:59, tcib_managed=true, architecture=x86_64, version=17.1.9, release=1, com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 qdrouterd, 
vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1, config_id=tripleo_step1, distribution-scope=public, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, batch=17.1_20250721.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, container_name=metrics_qdr, maintainer=OpenStack TripleO Team, io.buildah.version=1.33.12) Oct 14 04:40:17 localhost systemd[1]: 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6.service: Deactivated successfully. Oct 14 04:40:30 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Oct 14 04:40:30 localhost recover_tripleo_nova_virtqemud[86669]: 62551 Oct 14 04:40:30 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Oct 14 04:40:30 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. Oct 14 04:40:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8. Oct 14 04:40:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea. 
Oct 14 04:40:33 localhost podman[86671]: 2025-10-14 08:40:33.739520756 +0000 UTC m=+0.080861221 container health_status 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, version=17.1.9, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, config_id=tripleo_step3, vcs-type=git, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., architecture=x86_64, name=rhosp17/openstack-iscsid, build-date=2025-07-21T13:27:15, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, batch=17.1_20250721.1, managed_by=tripleo_ansible, 
io.buildah.version=1.33.12, container_name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1) Oct 14 04:40:33 localhost podman[86671]: 2025-10-14 08:40:33.746702689 +0000 UTC m=+0.088043174 container exec_died 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vcs-type=git, io.openshift.expose-services=, version=17.1.9, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 iscsid, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, com.redhat.component=openstack-iscsid-container, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, distribution-scope=public, build-date=2025-07-21T13:27:15, container_name=iscsid, release=1, name=rhosp17/openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, managed_by=tripleo_ansible, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., batch=17.1_20250721.1) Oct 14 04:40:33 localhost systemd[1]: 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea.service: Deactivated successfully. Oct 14 04:40:33 localhost podman[86670]: 2025-10-14 08:40:33.830834891 +0000 UTC m=+0.172867107 container health_status 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, container_name=collectd, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, build-date=2025-07-21T13:04:03, batch=17.1_20250721.1, io.openshift.tags=rhosp osp openstack osp-17.1, version=17.1.9, com.redhat.component=openstack-collectd-container, architecture=x86_64, name=rhosp17/openstack-collectd, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, vcs-type=git, managed_by=tripleo_ansible, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., release=2, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.33.12) Oct 14 04:40:33 localhost podman[86670]: 2025-10-14 08:40:33.866547249 +0000 UTC m=+0.208579495 container exec_died 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, release=2, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, managed_by=tripleo_ansible, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, 
architecture=x86_64, version=17.1.9, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, build-date=2025-07-21T13:04:03, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-collectd, com.redhat.component=openstack-collectd-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, vcs-type=git, distribution-scope=public, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20250721.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, tcib_managed=true, container_name=collectd) Oct 14 04:40:33 localhost systemd[1]: 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8.service: 
Deactivated successfully. Oct 14 04:40:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638. Oct 14 04:40:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884. Oct 14 04:40:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f. Oct 14 04:40:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e. Oct 14 04:40:39 localhost podman[86712]: 2025-10-14 08:40:39.742045876 +0000 UTC m=+0.076744513 container health_status def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, vendor=Red Hat, Inc., container_name=ceilometer_agent_compute, release=1, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, architecture=x86_64, config_id=tripleo_step4, name=rhosp17/openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-07-21T14:45:33, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.9, io.buildah.version=1.33.12, batch=17.1_20250721.1, distribution-scope=public, io.openshift.expose-services=, vcs-type=git, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team) Oct 14 04:40:39 localhost podman[86711]: 2025-10-14 08:40:39.754223334 +0000 UTC m=+0.088164068 container health_status d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1, version=17.1.9, io.buildah.version=1.33.12, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 
'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, batch=17.1_20250721.1, com.redhat.component=openstack-cron-container, release=1, container_name=logrotate_crond, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step4, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, distribution-scope=public, architecture=x86_64, build-date=2025-07-21T13:07:52) Oct 14 04:40:39 localhost podman[86711]: 2025-10-14 08:40:39.75922133 +0000 UTC m=+0.093162084 container exec_died d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, config_id=tripleo_step4, architecture=x86_64, build-date=2025-07-21T13:07:52, io.openshift.expose-services=, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 cron, release=1, batch=17.1_20250721.1, name=rhosp17/openstack-cron, distribution-scope=public, io.buildah.version=1.33.12, vendor=Red Hat, Inc., vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1, version=17.1.9, container_name=logrotate_crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, com.redhat.component=openstack-cron-container) Oct 14 04:40:39 localhost systemd[1]: d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f.service: Deactivated successfully. 
Oct 14 04:40:39 localhost podman[86712]: 2025-10-14 08:40:39.771004075 +0000 UTC m=+0.105702722 container exec_died def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., container_name=ceilometer_agent_compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, io.buildah.version=1.33.12, com.redhat.component=openstack-ceilometer-compute-container, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, managed_by=tripleo_ansible, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, release=1, version=17.1.9, batch=17.1_20250721.1, architecture=x86_64, io.openshift.expose-services=, build-date=2025-07-21T14:45:33, name=rhosp17/openstack-ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3) Oct 14 04:40:39 localhost systemd[1]: def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e.service: Deactivated successfully. Oct 14 04:40:39 localhost podman[86710]: 2025-10-14 08:40:39.82303219 +0000 UTC m=+0.160157613 container health_status 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2025-07-21T14:48:37, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=nova_migration_target, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.9, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20250721.1, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, vendor=Red Hat, Inc., io.buildah.version=1.33.12, managed_by=tripleo_ansible, release=1) Oct 14 04:40:39 localhost podman[86709]: 2025-10-14 08:40:39.912437925 +0000 UTC m=+0.252896981 container health_status 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, managed_by=tripleo_ansible, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1, batch=17.1_20250721.1, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, distribution-scope=public, release=1, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, io.buildah.version=1.33.12, name=rhosp17/openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-07-21T15:29:47, vendor=Red Hat, Inc., architecture=x86_64, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.9, tcib_managed=true, container_name=ceilometer_agent_ipmi, vcs-type=git, com.redhat.component=openstack-ceilometer-ipmi-container) Oct 14 04:40:39 localhost podman[86709]: 2025-10-14 08:40:39.969236428 +0000 UTC m=+0.309695464 container exec_died 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, config_id=tripleo_step4, name=rhosp17/openstack-ceilometer-ipmi, tcib_managed=true, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, 
com.redhat.license_terms=https://www.redhat.com/agreements, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, version=17.1.9, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, io.openshift.tags=rhosp osp openstack osp-17.1, container_name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, batch=17.1_20250721.1, com.redhat.component=openstack-ceilometer-ipmi-container, build-date=2025-07-21T15:29:47, io.buildah.version=1.33.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64) Oct 14 04:40:39 localhost systemd[1]: 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638.service: Deactivated successfully. 
Oct 14 04:40:40 localhost podman[86710]: 2025-10-14 08:40:40.203137629 +0000 UTC m=+0.540263012 container exec_died 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, name=rhosp17/openstack-nova-compute, managed_by=tripleo_ansible, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step4, distribution-scope=public, release=1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-07-21T14:48:37, com.redhat.component=openstack-nova-compute-container, version=17.1.9, io.buildah.version=1.33.12, container_name=nova_migration_target, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 
17.1 nova-compute, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, vendor=Red Hat, Inc., architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git) Oct 14 04:40:40 localhost systemd[1]: 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884.service: Deactivated successfully. Oct 14 04:40:40 localhost systemd[1]: tmp-crun.rUzlah.mount: Deactivated successfully. Oct 14 04:40:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e. Oct 14 04:40:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5. Oct 14 04:40:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2. Oct 14 04:40:42 localhost podman[86806]: 2025-10-14 08:40:42.747240476 +0000 UTC m=+0.082151261 container health_status ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a-4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_compute, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, release=1, architecture=x86_64, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-07-21T14:48:37, io.openshift.expose-services=, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, name=rhosp17/openstack-nova-compute, vcs-type=git, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, 
Inc., managed_by=tripleo_ansible) Oct 14 04:40:42 localhost systemd[1]: tmp-crun.A17coN.mount: Deactivated successfully. Oct 14 04:40:42 localhost podman[86804]: 2025-10-14 08:40:42.798642791 +0000 UTC m=+0.135371603 container health_status 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, io.openshift.expose-services=, batch=17.1_20250721.1, config_id=tripleo_step4, managed_by=tripleo_ansible, architecture=x86_64, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-neutron-metadata-agent-ovn, release=1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, version=17.1.9, build-date=2025-07-21T16:28:53, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.tags=rhosp osp openstack osp-17.1, io.buildah.version=1.33.12, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements) Oct 14 04:40:42 localhost podman[86804]: 2025-10-14 08:40:42.83048805 +0000 UTC m=+0.167216862 container exec_died 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 
'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, vendor=Red Hat, Inc., config_id=tripleo_step4, container_name=ovn_metadata_agent, io.buildah.version=1.33.12, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, distribution-scope=public, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-neutron-metadata-agent-ovn, version=17.1.9, build-date=2025-07-21T16:28:53, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, batch=17.1_20250721.1, io.openshift.expose-services=) Oct 14 04:40:42 localhost systemd[1]: 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e.service: Deactivated successfully. 
Oct 14 04:40:42 localhost podman[86805]: 2025-10-14 08:40:42.846085204 +0000 UTC m=+0.181594697 container health_status a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, architecture=x86_64, vendor=Red Hat, Inc., name=rhosp17/openstack-ovn-controller, build-date=2025-07-21T13:28:44, release=1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, distribution-scope=public, batch=17.1_20250721.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, config_id=tripleo_step4, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, io.buildah.version=1.33.12, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.9, com.redhat.component=openstack-ovn-controller-container) Oct 14 04:40:42 localhost podman[86806]: 2025-10-14 08:40:42.869903644 +0000 
UTC m=+0.204814409 container exec_died ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a-4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', 
'/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step5, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-07-21T14:48:37, version=17.1.9, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, batch=17.1_20250721.1, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-nova-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, maintainer=OpenStack TripleO Team, architecture=x86_64, container_name=nova_compute, io.buildah.version=1.33.12, com.redhat.component=openstack-nova-compute-container, release=1, vendor=Red Hat, Inc.) Oct 14 04:40:42 localhost systemd[1]: ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2.service: Deactivated successfully. 
Oct 14 04:40:42 localhost podman[86805]: 2025-10-14 08:40:42.887749437 +0000 UTC m=+0.223258840 container exec_died a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc., distribution-scope=public, maintainer=OpenStack TripleO Team, build-date=2025-07-21T13:28:44, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, release=1, container_name=ovn_controller, config_id=tripleo_step4, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.9, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, managed_by=tripleo_ansible, vcs-type=git, com.redhat.component=openstack-ovn-controller-container, io.buildah.version=1.33.12, name=rhosp17/openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true) Oct 14 04:40:42 localhost systemd[1]: 
a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5.service: Deactivated successfully. Oct 14 04:40:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6. Oct 14 04:40:48 localhost podman[86876]: 2025-10-14 08:40:48.725407343 +0000 UTC m=+0.073649237 container health_status 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step1, name=rhosp17/openstack-qdrouterd, architecture=x86_64, build-date=2025-07-21T13:07:59, io.buildah.version=1.33.12, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, vendor=Red Hat, Inc., distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', 
'/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.expose-services=, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.9, com.redhat.component=openstack-qdrouterd-container, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, tcib_managed=true) Oct 14 04:40:48 localhost podman[86876]: 2025-10-14 08:40:48.921130318 +0000 UTC m=+0.269372152 container exec_died 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, distribution-scope=public, io.openshift.expose-services=, release=1, tcib_managed=true, io.buildah.version=1.33.12, version=17.1.9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-07-21T13:07:59, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, architecture=x86_64, container_name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, managed_by=tripleo_ansible, vendor=Red Hat, Inc., com.redhat.component=openstack-qdrouterd-container, batch=17.1_20250721.1) Oct 14 04:40:48 localhost systemd[1]: 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6.service: Deactivated successfully. Oct 14 04:41:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8. Oct 14 04:41:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea. 
Oct 14 04:41:04 localhost podman[86981]: 2025-10-14 08:41:04.739743047 +0000 UTC m=+0.080677306 container health_status 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc., com.redhat.component=openstack-iscsid-container, summary=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, name=rhosp17/openstack-iscsid, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, io.buildah.version=1.33.12, architecture=x86_64, io.openshift.expose-services=, release=1, batch=17.1_20250721.1, 
com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, vcs-type=git, build-date=2025-07-21T13:27:15, container_name=iscsid, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, version=17.1.9, managed_by=tripleo_ansible) Oct 14 04:41:04 localhost podman[86981]: 2025-10-14 08:41:04.751610425 +0000 UTC m=+0.092544744 container exec_died 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.openshift.expose-services=, version=17.1.9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-07-21T13:27:15, description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, release=1, managed_by=tripleo_ansible, com.redhat.component=openstack-iscsid-container, config_id=tripleo_step3, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', 
'/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, name=rhosp17/openstack-iscsid, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=iscsid, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, batch=17.1_20250721.1, architecture=x86_64, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 iscsid) Oct 14 04:41:04 localhost systemd[1]: 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea.service: Deactivated successfully. Oct 14 04:41:04 localhost podman[86980]: 2025-10-14 08:41:04.879945139 +0000 UTC m=+0.220464435 container health_status 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, name=rhosp17/openstack-collectd, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-07-21T13:04:03, com.redhat.license_terms=https://www.redhat.com/agreements, managed_by=tripleo_ansible, distribution-scope=public, batch=17.1_20250721.1, com.redhat.component=openstack-collectd-container, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true, io.buildah.version=1.33.12, config_id=tripleo_step3, release=2, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, version=17.1.9, vcs-type=git, vendor=Red Hat, Inc.) 
Oct 14 04:41:04 localhost podman[86980]: 2025-10-14 08:41:04.923190861 +0000 UTC m=+0.263710117 container exec_died 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, container_name=collectd, release=2, distribution-scope=public, batch=17.1_20250721.1, com.redhat.component=openstack-collectd-container, io.openshift.expose-services=, version=17.1.9, description=Red Hat OpenStack Platform 17.1 
collectd, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, tcib_managed=true, vcs-type=git, architecture=x86_64, build-date=2025-07-21T13:04:03, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step3) Oct 14 04:41:04 localhost systemd[1]: 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8.service: Deactivated successfully. Oct 14 04:41:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638. Oct 14 04:41:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884. Oct 14 04:41:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f. Oct 14 04:41:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e. Oct 14 04:41:10 localhost systemd[1]: tmp-crun.iqkHqp.mount: Deactivated successfully. 
Oct 14 04:41:10 localhost podman[87022]: 2025-10-14 08:41:10.74836558 +0000 UTC m=+0.081335776 container health_status def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, version=17.1.9, batch=17.1_20250721.1, managed_by=tripleo_ansible, container_name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, vendor=Red Hat, Inc., name=rhosp17/openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, build-date=2025-07-21T14:45:33, com.redhat.component=openstack-ceilometer-compute-container, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, io.buildah.version=1.33.12, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements) Oct 14 04:41:10 localhost systemd[1]: tmp-crun.4djV3r.mount: Deactivated successfully. Oct 14 04:41:10 localhost podman[87020]: 2025-10-14 08:41:10.81378634 +0000 UTC m=+0.150607906 container health_status 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.buildah.version=1.33.12, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, version=17.1.9, release=1, vcs-type=git, vendor=Red Hat, Inc., batch=17.1_20250721.1, tcib_managed=true, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-07-21T14:48:37, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, distribution-scope=public, name=rhosp17/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 
'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_migration_target, managed_by=tripleo_ansible, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, architecture=x86_64) Oct 14 04:41:10 localhost podman[87021]: 2025-10-14 08:41:10.856855167 +0000 UTC m=+0.191472325 container health_status d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, vcs-type=git, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, managed_by=tripleo_ansible, architecture=x86_64, config_id=tripleo_step4, io.buildah.version=1.33.12, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 cron, version=17.1.9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, container_name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, com.redhat.component=openstack-cron-container, build-date=2025-07-21T13:07:52, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, batch=17.1_20250721.1) Oct 14 04:41:10 localhost podman[87022]: 2025-10-14 08:41:10.877929751 +0000 UTC m=+0.210899997 container exec_died def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, 
build-date=2025-07-21T14:45:33, tcib_managed=true, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.expose-services=, io.buildah.version=1.33.12, vendor=Red Hat, Inc., version=17.1.9, batch=17.1_20250721.1, distribution-scope=public, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, architecture=x86_64, container_name=ceilometer_agent_compute, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3) Oct 14 04:41:10 localhost systemd[1]: def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e.service: Deactivated successfully. 
Oct 14 04:41:10 localhost podman[87021]: 2025-10-14 08:41:10.893173404 +0000 UTC m=+0.227790552 container exec_died d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, container_name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, version=17.1.9, tcib_managed=true, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, vcs-type=git, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., build-date=2025-07-21T13:07:52, com.redhat.component=openstack-cron-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, distribution-scope=public, managed_by=tripleo_ansible, release=1, batch=17.1_20250721.1, 
io.buildah.version=1.33.12, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-cron) Oct 14 04:41:10 localhost systemd[1]: d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f.service: Deactivated successfully. Oct 14 04:41:10 localhost podman[87019]: 2025-10-14 08:41:10.966259023 +0000 UTC m=+0.302736579 container health_status 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, name=rhosp17/openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, build-date=2025-07-21T15:29:47, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., 
version=17.1.9, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20250721.1, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, vcs-type=git, tcib_managed=true, managed_by=tripleo_ansible, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-ipmi-container, release=1, container_name=ceilometer_agent_ipmi, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, io.openshift.tags=rhosp osp openstack osp-17.1, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi) Oct 14 04:41:10 localhost podman[87019]: 2025-10-14 08:41:10.991815306 +0000 UTC m=+0.328292942 container exec_died 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, vcs-type=git, maintainer=OpenStack TripleO Team, io.buildah.version=1.33.12, vendor=Red Hat, Inc., version=17.1.9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, batch=17.1_20250721.1, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, io.openshift.expose-services=, architecture=x86_64, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, build-date=2025-07-21T15:29:47, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=ceilometer_agent_ipmi, release=1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi) Oct 14 04:41:11 localhost systemd[1]: 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638.service: Deactivated successfully. 
Oct 14 04:41:11 localhost podman[87020]: 2025-10-14 08:41:11.207647196 +0000 UTC m=+0.544468832 container exec_died 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, release=1, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.openshift.expose-services=, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, build-date=2025-07-21T14:48:37, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, batch=17.1_20250721.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, io.openshift.tags=rhosp osp openstack osp-17.1, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, version=17.1.9, config_id=tripleo_step4, io.buildah.version=1.33.12, com.redhat.component=openstack-nova-compute-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, name=rhosp17/openstack-nova-compute, container_name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute) Oct 14 04:41:11 localhost systemd[1]: 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884.service: Deactivated successfully. Oct 14 04:41:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e. Oct 14 04:41:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5. Oct 14 04:41:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2. Oct 14 04:41:13 localhost systemd[1]: tmp-crun.zxQksy.mount: Deactivated successfully. 
Oct 14 04:41:13 localhost podman[87116]: 2025-10-14 08:41:13.790277446 +0000 UTC m=+0.128349764 container health_status 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, batch=17.1_20250721.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, build-date=2025-07-21T16:28:53, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, tcib_managed=true, io.buildah.version=1.33.12, managed_by=tripleo_ansible, version=17.1.9, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, name=rhosp17/openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, vcs-type=git, container_name=ovn_metadata_agent, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, architecture=x86_64) Oct 14 04:41:13 localhost podman[87118]: 2025-10-14 08:41:13.752070311 +0000 UTC m=+0.088901941 container health_status ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a-4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step5, container_name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1, vendor=Red Hat, Inc., batch=17.1_20250721.1, io.buildah.version=1.33.12, vcs-type=git, name=rhosp17/openstack-nova-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, tcib_managed=true, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, architecture=x86_64, distribution-scope=public, version=17.1.9, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-07-21T14:48:37, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute) Oct 14 04:41:13 localhost podman[87118]: 2025-10-14 08:41:13.831434734 +0000 UTC m=+0.168266414 container exec_died ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2 
(image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, version=17.1.9, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a-4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, batch=17.1_20250721.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.33.12, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, io.openshift.expose-services=, vendor=Red Hat, Inc., tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, release=1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, config_id=tripleo_step5, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=nova_compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, build-date=2025-07-21T14:48:37) Oct 14 04:41:13 localhost podman[87117]: 2025-10-14 08:41:13.842093095 +0000 UTC m=+0.179838813 container health_status a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, config_id=tripleo_step4, distribution-scope=public, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, io.buildah.version=1.33.12, io.openshift.expose-services=, version=17.1.9, io.openshift.tags=rhosp osp openstack osp-17.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', 
'/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, batch=17.1_20250721.1, name=rhosp17/openstack-ovn-controller, vcs-type=git, container_name=ovn_controller, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-07-21T13:28:44, managed_by=tripleo_ansible, architecture=x86_64, vendor=Red Hat, Inc., tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ovn-controller) Oct 14 04:41:13 localhost systemd[1]: ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2.service: Deactivated successfully. Oct 14 04:41:13 localhost podman[87116]: 2025-10-14 08:41:13.8596527 +0000 UTC m=+0.197724998 container exec_died 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, tcib_managed=true, batch=17.1_20250721.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, vcs-type=git, managed_by=tripleo_ansible, version=17.1.9, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-07-21T16:28:53, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, distribution-scope=public, container_name=ovn_metadata_agent, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.33.12, architecture=x86_64, config_id=tripleo_step4, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Oct 14 04:41:13 localhost systemd[1]: 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e.service: Deactivated successfully. 
Oct 14 04:41:13 localhost podman[87117]: 2025-10-14 08:41:13.888900679 +0000 UTC m=+0.226646427 container exec_died a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, version=17.1.9, container_name=ovn_controller, build-date=2025-07-21T13:28:44, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, name=rhosp17/openstack-ovn-controller, vcs-type=git, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, architecture=x86_64, com.redhat.component=openstack-ovn-controller-container, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step4, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20250721.1, managed_by=tripleo_ansible) Oct 14 04:41:13 localhost systemd[1]: 
a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5.service: Deactivated successfully. Oct 14 04:41:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6. Oct 14 04:41:19 localhost podman[87187]: 2025-10-14 08:41:19.749207118 +0000 UTC m=+0.090906834 container health_status 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, config_id=tripleo_step1, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1, batch=17.1_20250721.1, com.redhat.component=openstack-qdrouterd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.9, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.33.12, tcib_managed=true, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, com.redhat.license_terms=https://www.redhat.com/agreements, release=1, build-date=2025-07-21T13:07:59, architecture=x86_64, distribution-scope=public, name=rhosp17/openstack-qdrouterd, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, container_name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd) Oct 14 04:41:19 localhost podman[87187]: 2025-10-14 08:41:19.953957974 +0000 UTC m=+0.295657640 container exec_died 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, name=rhosp17/openstack-qdrouterd, architecture=x86_64, com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, build-date=2025-07-21T13:07:59, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20250721.1, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, vcs-type=git, release=1, io.buildah.version=1.33.12, distribution-scope=public, container_name=metrics_qdr) Oct 14 04:41:19 localhost systemd[1]: 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6.service: Deactivated successfully. Oct 14 04:41:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8. Oct 14 04:41:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea. Oct 14 04:41:35 localhost systemd[1]: tmp-crun.JwX10r.mount: Deactivated successfully. 
Oct 14 04:41:35 localhost podman[87259]: 2025-10-14 08:41:35.72875287 +0000 UTC m=+0.070736051 container health_status 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., release=2, vcs-type=git, container_name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true, version=17.1.9, summary=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, managed_by=tripleo_ansible, name=rhosp17/openstack-collectd, distribution-scope=public, io.buildah.version=1.33.12, batch=17.1_20250721.1, build-date=2025-07-21T13:04:03, io.openshift.expose-services=, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-collectd-container) Oct 14 04:41:35 localhost podman[87259]: 2025-10-14 08:41:35.740012119 +0000 UTC m=+0.081995280 container exec_died 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, build-date=2025-07-21T13:04:03, version=17.1.9, architecture=x86_64, name=rhosp17/openstack-collectd, vcs-type=git, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true, config_id=tripleo_step3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20250721.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/agreements, release=2, distribution-scope=public, vendor=Red Hat, Inc., config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.buildah.version=1.33.12, com.redhat.component=openstack-collectd-container, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd) Oct 14 04:41:35 localhost systemd[1]: 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8.service: Deactivated successfully. 
Oct 14 04:41:35 localhost podman[87260]: 2025-10-14 08:41:35.824431381 +0000 UTC m=+0.163178821 container health_status 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, maintainer=OpenStack TripleO Team, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, summary=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, com.redhat.component=openstack-iscsid-container, distribution-scope=public, build-date=2025-07-21T13:27:15, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, vendor=Red Hat, Inc., name=rhosp17/openstack-iscsid, io.openshift.expose-services=, vcs-type=git, release=1, version=17.1.9, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 
iscsid, batch=17.1_20250721.1, io.buildah.version=1.33.12, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1) Oct 14 04:41:35 localhost podman[87260]: 2025-10-14 08:41:35.830784378 +0000 UTC m=+0.169531798 container exec_died 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, release=1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, container_name=iscsid, io.buildah.version=1.33.12, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, batch=17.1_20250721.1, vcs-type=git, build-date=2025-07-21T13:27:15, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc., tcib_managed=true, version=17.1.9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, com.redhat.component=openstack-iscsid-container, io.openshift.expose-services=) Oct 14 04:41:35 localhost systemd[1]: 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea.service: Deactivated successfully. Oct 14 04:41:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638. Oct 14 04:41:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884. Oct 14 04:41:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f. Oct 14 04:41:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e. Oct 14 04:41:41 localhost systemd[1]: tmp-crun.wJR47Q.mount: Deactivated successfully. Oct 14 04:41:41 localhost systemd[1]: tmp-crun.36sby2.mount: Deactivated successfully. 
Oct 14 04:41:41 localhost podman[87298]: 2025-10-14 08:41:41.749512603 +0000 UTC m=+0.087015444 container health_status def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, architecture=x86_64, release=1, build-date=2025-07-21T14:45:33, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-compute, io.buildah.version=1.33.12, tcib_managed=true, batch=17.1_20250721.1, config_id=tripleo_step4, container_name=ceilometer_agent_compute, maintainer=OpenStack TripleO Team, version=17.1.9, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, io.k8s.display-name=Red 
Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-compute-container, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1) Oct 14 04:41:41 localhost podman[87295]: 2025-10-14 08:41:41.71969478 +0000 UTC m=+0.066674355 container health_status 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.openshift.expose-services=, vendor=Red Hat, Inc., config_id=tripleo_step4, architecture=x86_64, batch=17.1_20250721.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1, distribution-scope=public, vcs-type=git, name=rhosp17/openstack-ceilometer-ipmi, io.buildah.version=1.33.12, container_name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, tcib_managed=true, com.redhat.component=openstack-ceilometer-ipmi-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-07-21T15:29:47, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 
'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team) Oct 14 04:41:41 localhost podman[87297]: 2025-10-14 08:41:41.793239255 +0000 UTC m=+0.132106029 container health_status d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, maintainer=OpenStack TripleO Team, architecture=x86_64, batch=17.1_20250721.1, config_id=tripleo_step4, com.redhat.component=openstack-cron-container, version=17.1.9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vendor=Red Hat, Inc., container_name=logrotate_crond, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.33.12, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1, io.openshift.expose-services=, distribution-scope=public, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, summary=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, build-date=2025-07-21T13:07:52) Oct 14 04:41:41 localhost podman[87297]: 2025-10-14 08:41:41.831153619 +0000 UTC m=+0.170020383 container exec_died d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-cron, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, vcs-type=git, container_name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, version=17.1.9, com.redhat.component=openstack-cron-container, build-date=2025-07-21T13:07:52, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, io.openshift.expose-services=, architecture=x86_64, config_data={'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, batch=17.1_20250721.1, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, vendor=Red Hat, Inc.) Oct 14 04:41:41 localhost systemd[1]: d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f.service: Deactivated successfully. 
Oct 14 04:41:41 localhost podman[87295]: 2025-10-14 08:41:41.907195972 +0000 UTC m=+0.254175587 container exec_died 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, vendor=Red Hat, Inc., container_name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-ipmi, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20250721.1, com.redhat.component=openstack-ceilometer-ipmi-container, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, release=1, architecture=x86_64, 
io.buildah.version=1.33.12, build-date=2025-07-21T15:29:47, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, version=17.1.9, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=) Oct 14 04:41:41 localhost systemd[1]: 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638.service: Deactivated successfully. Oct 14 04:41:41 localhost podman[87298]: 2025-10-14 08:41:41.925997745 +0000 UTC m=+0.263500586 container exec_died def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-ceilometer-compute, tcib_managed=true, version=17.1.9, io.openshift.expose-services=, distribution-scope=public, io.buildah.version=1.33.12, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., build-date=2025-07-21T14:45:33, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, batch=17.1_20250721.1, container_name=ceilometer_agent_compute, release=1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 
'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, config_id=tripleo_step4) Oct 14 04:41:41 localhost podman[87296]: 2025-10-14 08:41:41.883541081 +0000 UTC m=+0.222618031 container health_status 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, managed_by=tripleo_ansible, tcib_managed=true, io.openshift.expose-services=, vcs-type=git, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, distribution-scope=public, release=1, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, config_id=tripleo_step4, build-date=2025-07-21T14:48:37, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1, version=17.1.9, batch=17.1_20250721.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, container_name=nova_migration_target, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute) Oct 14 04:41:41 localhost systemd[1]: def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e.service: Deactivated successfully. 
Oct 14 04:41:42 localhost podman[87296]: 2025-10-14 08:41:42.272106086 +0000 UTC m=+0.611182986 container exec_died 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, architecture=x86_64, batch=17.1_20250721.1, build-date=2025-07-21T14:48:37, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, io.openshift.expose-services=, vcs-type=git, version=17.1.9, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, container_name=nova_migration_target, release=1, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, io.openshift.tags=rhosp osp openstack osp-17.1, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, tcib_managed=true, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, config_id=tripleo_step4, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute) Oct 14 04:41:42 localhost systemd[1]: 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884.service: Deactivated successfully. Oct 14 04:41:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e. Oct 14 04:41:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5. Oct 14 04:41:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2. Oct 14 04:41:44 localhost systemd[1]: tmp-crun.JtLwpl.mount: Deactivated successfully. Oct 14 04:41:44 localhost systemd[1]: tmp-crun.mv4w9G.mount: Deactivated successfully. 
Oct 14 04:41:44 localhost podman[87387]: 2025-10-14 08:41:44.732434851 +0000 UTC m=+0.071106883 container health_status 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, build-date=2025-07-21T16:28:53, config_id=tripleo_step4, container_name=ovn_metadata_agent, io.openshift.expose-services=, version=17.1.9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, distribution-scope=public, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, managed_by=tripleo_ansible, release=1, architecture=x86_64) Oct 14 04:41:44 localhost podman[87388]: 2025-10-14 08:41:44.792874611 +0000 UTC m=+0.131441260 container health_status a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, tcib_managed=true, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1, distribution-scope=public, version=17.1.9, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', 
'/var/log/containers/openvswitch:/var/log/ovn:z']}, container_name=ovn_controller, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, release=1, batch=17.1_20250721.1, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, name=rhosp17/openstack-ovn-controller, build-date=2025-07-21T13:28:44, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, architecture=x86_64, com.redhat.component=openstack-ovn-controller-container) Oct 14 04:41:44 localhost podman[87388]: 2025-10-14 08:41:44.810503167 +0000 UTC m=+0.149069786 container exec_died a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, release=1, container_name=ovn_controller, tcib_managed=true, batch=17.1_20250721.1, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-07-21T13:28:44, distribution-scope=public, io.buildah.version=1.33.12, version=17.1.9, vcs-type=git, config_id=tripleo_step4, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, architecture=x86_64, com.redhat.component=openstack-ovn-controller-container, summary=Red Hat OpenStack Platform 17.1 ovn-controller) Oct 14 04:41:44 localhost systemd[1]: a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5.service: Deactivated successfully. Oct 14 04:41:44 localhost podman[87389]: 2025-10-14 08:41:44.766575847 +0000 UTC m=+0.099337646 container health_status ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, vendor=Red Hat, Inc., container_name=nova_compute, distribution-scope=public, build-date=2025-07-21T14:48:37, maintainer=OpenStack TripleO Team, batch=17.1_20250721.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.33.12, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step5, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1, io.openshift.expose-services=, tcib_managed=true, release=1, version=17.1.9, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 
'direct', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a-4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, vcs-type=git, managed_by=tripleo_ansible) Oct 14 04:41:44 localhost podman[87389]: 2025-10-14 08:41:44.899186571 +0000 UTC m=+0.231948390 container exec_died ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2 
(image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step5, architecture=x86_64, name=rhosp17/openstack-nova-compute, managed_by=tripleo_ansible, vcs-type=git, io.openshift.expose-services=, tcib_managed=true, io.buildah.version=1.33.12, distribution-scope=public, version=17.1.9, build-date=2025-07-21T14:48:37, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, release=1, batch=17.1_20250721.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a-4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1) Oct 14 04:41:44 localhost systemd[1]: ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2.service: Deactivated successfully. Oct 14 04:41:44 localhost podman[87387]: 2025-10-14 08:41:44.919331724 +0000 UTC m=+0.258003776 container exec_died 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, name=rhosp17/openstack-neutron-metadata-agent-ovn, release=1, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., com.redhat.component=openstack-neutron-metadata-agent-ovn-container, distribution-scope=public, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, container_name=ovn_metadata_agent, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.9, batch=17.1_20250721.1, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, build-date=2025-07-21T16:28:53, io.openshift.expose-services=, architecture=x86_64) Oct 14 04:41:44 localhost systemd[1]: 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e.service: Deactivated successfully. 
Oct 14 04:41:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6. Oct 14 04:41:50 localhost podman[87460]: 2025-10-14 08:41:50.750771522 +0000 UTC m=+0.091776481 container health_status 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, build-date=2025-07-21T13:07:59, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.9, distribution-scope=public, managed_by=tripleo_ansible, config_id=tripleo_step1, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, io.openshift.expose-services=, batch=17.1_20250721.1, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.33.12, name=rhosp17/openstack-qdrouterd, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, com.redhat.license_terms=https://www.redhat.com/agreements, release=1, container_name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1) Oct 14 04:41:50 localhost podman[87460]: 2025-10-14 08:41:50.920899808 +0000 UTC m=+0.261904757 container exec_died 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1, vendor=Red Hat, Inc., version=17.1.9, tcib_managed=true, container_name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/agreements, batch=17.1_20250721.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, name=rhosp17/openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-07-21T13:07:59, vcs-type=git, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, architecture=x86_64, config_id=tripleo_step1, io.openshift.expose-services=, managed_by=tripleo_ansible) Oct 14 04:41:50 localhost systemd[1]: 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6.service: Deactivated successfully. Oct 14 04:42:01 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Oct 14 04:42:01 localhost recover_tripleo_nova_virtqemud[87566]: 62551 Oct 14 04:42:01 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Oct 14 04:42:01 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. Oct 14 04:42:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8. Oct 14 04:42:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea. 
Oct 14 04:42:06 localhost podman[87568]: 2025-10-14 08:42:06.747795988 +0000 UTC m=+0.083411531 container health_status 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, release=1, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, build-date=2025-07-21T13:27:15, io.openshift.expose-services=, name=rhosp17/openstack-iscsid, batch=17.1_20250721.1, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, container_name=iscsid, vendor=Red Hat, Inc., config_id=tripleo_step3, vcs-type=git, version=17.1.9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team, 
io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.component=openstack-iscsid-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, distribution-scope=public, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid) Oct 14 04:42:06 localhost podman[87568]: 2025-10-14 08:42:06.790296795 +0000 UTC m=+0.125912418 container exec_died 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, managed_by=tripleo_ansible, vcs-type=git, container_name=iscsid, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, release=1, architecture=x86_64, com.redhat.component=openstack-iscsid-container, build-date=2025-07-21T13:27:15, name=rhosp17/openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', 
'/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.33.12, tcib_managed=true, vendor=Red Hat, Inc., batch=17.1_20250721.1, summary=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, version=17.1.9) Oct 14 04:42:06 localhost systemd[1]: tmp-crun.ARB9g7.mount: Deactivated successfully. Oct 14 04:42:06 localhost systemd[1]: 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea.service: Deactivated successfully. Oct 14 04:42:06 localhost podman[87567]: 2025-10-14 08:42:06.808021973 +0000 UTC m=+0.143519303 container health_status 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, com.redhat.component=openstack-collectd-container, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, release=2, tcib_managed=true, io.buildah.version=1.33.12, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, io.openshift.expose-services=, managed_by=tripleo_ansible, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, build-date=2025-07-21T13:04:03, io.openshift.tags=rhosp osp openstack osp-17.1, version=17.1.9, vendor=Red Hat, Inc., name=rhosp17/openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=collectd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, vcs-type=git, description=Red Hat OpenStack Platform 17.1 collectd) Oct 14 04:42:06 localhost podman[87567]: 2025-10-14 08:42:06.892406394 +0000 UTC m=+0.227903734 container exec_died 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, 
maintainer=OpenStack TripleO Team, release=2, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, vcs-type=git, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, architecture=x86_64, distribution-scope=public, vendor=Red Hat, Inc., build-date=2025-07-21T13:04:03, io.buildah.version=1.33.12, batch=17.1_20250721.1, name=rhosp17/openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true, version=17.1.9, com.redhat.component=openstack-collectd-container, config_id=tripleo_step3) Oct 14 04:42:06 localhost systemd[1]: 
0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8.service: Deactivated successfully. Oct 14 04:42:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638. Oct 14 04:42:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884. Oct 14 04:42:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f. Oct 14 04:42:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e. Oct 14 04:42:12 localhost systemd[1]: tmp-crun.hSj6It.mount: Deactivated successfully. Oct 14 04:42:12 localhost systemd[1]: tmp-crun.lKcGKX.mount: Deactivated successfully. Oct 14 04:42:12 localhost podman[87607]: 2025-10-14 08:42:12.751068372 +0000 UTC m=+0.084910249 container health_status 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step4, architecture=x86_64, container_name=ceilometer_agent_ipmi, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, io.openshift.expose-services=, vcs-type=git, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, batch=17.1_20250721.1, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-07-21T15:29:47) Oct 14 04:42:12 localhost podman[87609]: 2025-10-14 08:42:12.828177859 +0000 UTC m=+0.151657925 container health_status d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.openshift.expose-services=, batch=17.1_20250721.1, distribution-scope=public, vcs-type=git, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, version=17.1.9, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, config_id=tripleo_step4, com.redhat.component=openstack-cron-container, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, container_name=logrotate_crond, name=rhosp17/openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, build-date=2025-07-21T13:07:52, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 cron) Oct 14 04:42:12 localhost podman[87613]: 2025-10-14 08:42:12.878743504 +0000 UTC m=+0.203897932 container health_status def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, config_id=tripleo_step4, io.buildah.version=1.33.12, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1, 
maintainer=OpenStack TripleO Team, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20250721.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-compute, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, vendor=Red Hat, Inc., version=17.1.9, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, build-date=2025-07-21T14:45:33, release=1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, 
com.redhat.component=openstack-ceilometer-compute-container) Oct 14 04:42:12 localhost podman[87608]: 2025-10-14 08:42:12.79204213 +0000 UTC m=+0.120234382 container health_status 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, architecture=x86_64, container_name=nova_migration_target, release=1, batch=17.1_20250721.1, vcs-type=git, build-date=2025-07-21T14:48:37, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, io.buildah.version=1.33.12, 
version=17.1.9, io.openshift.expose-services=, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, vendor=Red Hat, Inc.) Oct 14 04:42:12 localhost podman[87609]: 2025-10-14 08:42:12.894648276 +0000 UTC m=+0.218128342 container exec_died d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 cron, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, architecture=x86_64, release=1, config_id=tripleo_step4, name=rhosp17/openstack-cron, vcs-type=git, batch=17.1_20250721.1, build-date=2025-07-21T13:07:52, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, version=17.1.9, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, tcib_managed=true, com.redhat.component=openstack-cron-container, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, distribution-scope=public, io.buildah.version=1.33.12, container_name=logrotate_crond, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc.) Oct 14 04:42:12 localhost systemd[1]: d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f.service: Deactivated successfully. Oct 14 04:42:12 localhost podman[87607]: 2025-10-14 08:42:12.936167411 +0000 UTC m=+0.270009288 container exec_died 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, build-date=2025-07-21T15:29:47, release=1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.9, io.openshift.expose-services=, config_id=tripleo_step4, managed_by=tripleo_ansible, architecture=x86_64, tcib_managed=true, vendor=Red Hat, Inc., name=rhosp17/openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-ipmi-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, vcs-type=git, batch=17.1_20250721.1, distribution-scope=public, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, container_name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, com.redhat.license_terms=https://www.redhat.com/agreements) Oct 14 04:42:12 localhost systemd[1]: 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638.service: Deactivated successfully. 
Oct 14 04:42:12 localhost podman[87613]: 2025-10-14 08:42:12.955298333 +0000 UTC m=+0.280452771 container exec_died def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, managed_by=tripleo_ansible, build-date=2025-07-21T14:45:33, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.33.12, com.redhat.component=openstack-ceilometer-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20250721.1, 
config_id=tripleo_step4, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, io.openshift.tags=rhosp osp openstack osp-17.1, architecture=x86_64, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, name=rhosp17/openstack-ceilometer-compute, vcs-type=git, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.9) Oct 14 04:42:12 localhost systemd[1]: def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e.service: Deactivated successfully. Oct 14 04:42:12 localhost sshd[87701]: main: sshd: ssh-rsa algorithm is disabled Oct 14 04:42:13 localhost podman[87608]: 2025-10-14 08:42:13.156926633 +0000 UTC m=+0.485118875 container exec_died 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20250721.1, distribution-scope=public, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.buildah.version=1.33.12, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-07-21T14:48:37, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1, version=17.1.9, container_name=nova_migration_target, release=1, tcib_managed=true, config_id=tripleo_step4, vcs-type=git) Oct 14 04:42:13 localhost systemd[1]: 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884.service: Deactivated successfully. Oct 14 04:42:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e. Oct 14 04:42:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5. Oct 14 04:42:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2. Oct 14 04:42:15 localhost systemd[1]: tmp-crun.1XB9Yv.mount: Deactivated successfully. 
Oct 14 04:42:15 localhost podman[87704]: 2025-10-14 08:42:15.761100469 +0000 UTC m=+0.094997111 container health_status ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, release=1, config_id=tripleo_step5, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, vendor=Red Hat, Inc., architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, container_name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, batch=17.1_20250721.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a-4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, version=17.1.9, build-date=2025-07-21T14:48:37, distribution-scope=public, name=rhosp17/openstack-nova-compute, io.openshift.expose-services=) Oct 14 04:42:15 localhost systemd[1]: tmp-crun.aVxY0b.mount: Deactivated successfully. 
Oct 14 04:42:15 localhost podman[87703]: 2025-10-14 08:42:15.803294685 +0000 UTC m=+0.137335441 container health_status a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, batch=17.1_20250721.1, build-date=2025-07-21T13:28:44, summary=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-type=git, vendor=Red Hat, Inc., container_name=ovn_controller, distribution-scope=public, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, release=1, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/agreements, version=17.1.9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-ovn-controller, io.openshift.expose-services=, tcib_managed=true, com.redhat.component=openstack-ovn-controller-container, io.buildah.version=1.33.12, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller) Oct 14 04:42:15 localhost podman[87704]: 2025-10-14 08:42:15.816485333 +0000 
UTC m=+0.150381915 container exec_died ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, config_id=tripleo_step5, io.buildah.version=1.33.12, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, io.openshift.expose-services=, version=17.1.9, release=1, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-compute, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, architecture=x86_64, batch=17.1_20250721.1, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-07-21T14:48:37, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a-4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git) Oct 14 04:42:15 localhost systemd[1]: ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2.service: Deactivated successfully. 
Oct 14 04:42:15 localhost podman[87702]: 2025-10-14 08:42:15.862844258 +0000 UTC m=+0.200695823 container health_status 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, vendor=Red Hat, Inc., config_id=tripleo_step4, container_name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.33.12, managed_by=tripleo_ansible, name=rhosp17/openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-07-21T16:28:53, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, tcib_managed=true, release=1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, architecture=x86_64, batch=17.1_20250721.1, io.openshift.tags=rhosp osp openstack osp-17.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, io.openshift.expose-services=, distribution-scope=public, vcs-type=git, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3) Oct 14 04:42:15 localhost podman[87703]: 2025-10-14 08:42:15.870866846 +0000 UTC m=+0.204907662 container exec_died a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, maintainer=OpenStack TripleO Team, batch=17.1_20250721.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, release=1, vcs-type=git, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.9, 
io.buildah.version=1.33.12, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, io.openshift.expose-services=, tcib_managed=true, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, container_name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-ovn-controller, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, build-date=2025-07-21T13:28:44, vendor=Red Hat, Inc.) Oct 14 04:42:15 localhost systemd[1]: a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5.service: Deactivated successfully. Oct 14 04:42:15 localhost podman[87702]: 2025-10-14 08:42:15.912059801 +0000 UTC m=+0.249911406 container exec_died 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, release=1, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, container_name=ovn_metadata_agent, vendor=Red Hat, Inc., architecture=x86_64, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, build-date=2025-07-21T16:28:53, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step4, batch=17.1_20250721.1, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true, version=17.1.9, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 
'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, distribution-scope=public, name=rhosp17/openstack-neutron-metadata-agent-ovn) Oct 14 04:42:15 localhost systemd[1]: 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e.service: Deactivated successfully. 
Oct 14 04:42:16 localhost sshd[87776]: main: sshd: ssh-rsa algorithm is disabled Oct 14 04:42:17 localhost sshd[87777]: main: sshd: ssh-rsa algorithm is disabled Oct 14 04:42:17 localhost sshd[87778]: main: sshd: ssh-rsa algorithm is disabled Oct 14 04:42:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6. Oct 14 04:42:21 localhost systemd[84643]: Created slice User Background Tasks Slice. Oct 14 04:42:21 localhost systemd[1]: tmp-crun.HJnbVB.mount: Deactivated successfully. Oct 14 04:42:21 localhost systemd[84643]: Starting Cleanup of User's Temporary Files and Directories... Oct 14 04:42:21 localhost podman[87779]: 2025-10-14 08:42:21.744733024 +0000 UTC m=+0.086347514 container health_status 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, distribution-scope=public, build-date=2025-07-21T13:07:59, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, vendor=Red Hat, Inc., io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, managed_by=tripleo_ansible, release=1, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1, batch=17.1_20250721.1, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, tcib_managed=true, architecture=x86_64, container_name=metrics_qdr) Oct 14 04:42:21 localhost systemd[84643]: Finished Cleanup of User's Temporary Files and Directories. 
Oct 14 04:42:21 localhost podman[87779]: 2025-10-14 08:42:21.962231864 +0000 UTC m=+0.303846414 container exec_died 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.buildah.version=1.33.12, version=17.1.9, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-07-21T13:07:59, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, distribution-scope=public, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', 
'/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, managed_by=tripleo_ansible, name=rhosp17/openstack-qdrouterd, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, vcs-type=git, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, architecture=x86_64, batch=17.1_20250721.1) Oct 14 04:42:21 localhost systemd[1]: 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6.service: Deactivated successfully. Oct 14 04:42:37 localhost sshd[87853]: main: sshd: ssh-rsa algorithm is disabled Oct 14 04:42:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8. Oct 14 04:42:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea. Oct 14 04:42:37 localhost podman[87855]: 2025-10-14 08:42:37.756263242 +0000 UTC m=+0.094851017 container health_status 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., io.buildah.version=1.33.12, name=rhosp17/openstack-collectd, version=17.1.9, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, com.redhat.component=openstack-collectd-container, io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T13:04:03, release=2, batch=17.1_20250721.1, container_name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, architecture=x86_64, maintainer=OpenStack TripleO Team) Oct 14 04:42:37 localhost podman[87856]: 2025-10-14 08:42:37.7994969 +0000 UTC m=+0.137471565 container health_status 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, managed_by=tripleo_ansible, vcs-type=git, vendor=Red Hat, Inc., distribution-scope=public, release=1, config_id=tripleo_step3, io.buildah.version=1.33.12, version=17.1.9, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, 
build-date=2025-07-21T13:27:15, io.openshift.expose-services=, tcib_managed=true, name=rhosp17/openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.tags=rhosp osp openstack osp-17.1, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, summary=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid) Oct 14 04:42:37 localhost podman[87855]: 2025-10-14 08:42:37.81953454 +0000 UTC m=+0.158122255 container exec_died 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, 
name=rhosp17/openstack-collectd, tcib_managed=true, build-date=2025-07-21T13:04:03, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.component=openstack-collectd-container, container_name=collectd, vcs-type=git, architecture=x86_64, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.33.12, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, release=2, vendor=Red Hat, Inc., distribution-scope=public, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, version=17.1.9) Oct 14 04:42:37 localhost systemd[1]: 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8.service: Deactivated successfully. Oct 14 04:42:37 localhost podman[87856]: 2025-10-14 08:42:37.83824669 +0000 UTC m=+0.176221365 container exec_died 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, release=1, com.redhat.license_terms=https://www.redhat.com/agreements, batch=17.1_20250721.1, build-date=2025-07-21T13:27:15, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.component=openstack-iscsid-container, config_id=tripleo_step3, vendor=Red 
Hat, Inc., maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, io.openshift.expose-services=, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, container_name=iscsid, version=17.1.9, description=Red Hat OpenStack Platform 17.1 iscsid, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, io.buildah.version=1.33.12, name=rhosp17/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid) Oct 14 04:42:37 localhost systemd[1]: 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea.service: Deactivated successfully. Oct 14 04:42:41 localhost ceph-osd[31500]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Oct 14 04:42:41 localhost ceph-osd[31500]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 3000.1 total, 600.0 interval#012Cumulative writes: 4948 writes, 22K keys, 4948 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s#012Cumulative WAL: 4948 writes, 643 syncs, 7.70 writes per sync, written: 0.02 GB, 0.01 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 488 writes, 1736 keys, 488 commit groups, 1.0 writes per commit group, ingest: 2.32 MB, 0.00 MB/s#012Interval WAL: 488 writes, 185 syncs, 2.64 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent Oct 14 04:42:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638. Oct 14 04:42:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884. 
Oct 14 04:42:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f. Oct 14 04:42:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e. Oct 14 04:42:43 localhost systemd[1]: tmp-crun.8BO7GC.mount: Deactivated successfully. Oct 14 04:42:43 localhost podman[87896]: 2025-10-14 08:42:43.787643584 +0000 UTC m=+0.122624875 container health_status d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, io.openshift.expose-services=, container_name=logrotate_crond, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, tcib_managed=true, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, release=1, batch=17.1_20250721.1, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 cron, build-date=2025-07-21T13:07:52, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step4, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 cron, version=17.1.9, architecture=x86_64, name=rhosp17/openstack-cron, com.redhat.component=openstack-cron-container) Oct 14 04:42:43 localhost podman[87896]: 2025-10-14 08:42:43.87086523 +0000 UTC m=+0.205846521 container exec_died d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers:/var/log/containers:z']}, build-date=2025-07-21T13:07:52, io.openshift.expose-services=, config_id=tripleo_step4, name=rhosp17/openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/agreements, batch=17.1_20250721.1, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, vcs-type=git, architecture=x86_64, container_name=logrotate_crond, managed_by=tripleo_ansible, release=1, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, distribution-scope=public, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, version=17.1.9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, vendor=Red Hat, Inc.) Oct 14 04:42:43 localhost podman[87894]: 2025-10-14 08:42:43.826903229 +0000 UTC m=+0.168782404 container health_status 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, distribution-scope=public, batch=17.1_20250721.1, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.33.12, com.redhat.component=openstack-ceilometer-ipmi-container, vendor=Red Hat, Inc., config_id=tripleo_step4, container_name=ceilometer_agent_ipmi, release=1, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhosp17/openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, build-date=2025-07-21T15:29:47, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, version=17.1.9, managed_by=tripleo_ansible, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1) Oct 14 04:42:43 localhost systemd[1]: d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f.service: Deactivated successfully. 
Oct 14 04:42:43 localhost podman[87902]: 2025-10-14 08:42:43.886130812 +0000 UTC m=+0.217070819 container health_status def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, config_id=tripleo_step4, name=rhosp17/openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, tcib_managed=true, build-date=2025-07-21T14:45:33, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, batch=17.1_20250721.1, release=1, maintainer=OpenStack TripleO Team, vcs-type=git, version=17.1.9, container_name=ceilometer_agent_compute) Oct 14 04:42:43 localhost podman[87894]: 2025-10-14 08:42:43.910018962 +0000 UTC m=+0.251898137 container exec_died 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, container_name=ceilometer_agent_ipmi, release=1, com.redhat.component=openstack-ceilometer-ipmi-container, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-07-21T15:29:47, batch=17.1_20250721.1, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.33.12, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, version=17.1.9, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, architecture=x86_64, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, config_id=tripleo_step4, name=rhosp17/openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi) Oct 14 04:42:43 localhost systemd[1]: 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638.service: Deactivated successfully. Oct 14 04:42:43 localhost podman[87902]: 2025-10-14 08:42:43.938122441 +0000 UTC m=+0.269062488 container exec_died def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, version=17.1.9, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, build-date=2025-07-21T14:45:33, name=rhosp17/openstack-ceilometer-compute, vendor=Red Hat, Inc., config_id=tripleo_step4, container_name=ceilometer_agent_compute, tcib_managed=true, maintainer=OpenStack TripleO Team, release=1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, batch=17.1_20250721.1, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, 
io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-compute) Oct 14 04:42:43 localhost systemd[1]: def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e.service: Deactivated successfully. 
Oct 14 04:42:43 localhost podman[87895]: 2025-10-14 08:42:43.860749296 +0000 UTC m=+0.196519342 container health_status 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.openshift.expose-services=, container_name=nova_migration_target, tcib_managed=true, architecture=x86_64, config_id=tripleo_step4, managed_by=tripleo_ansible, build-date=2025-07-21T14:48:37, batch=17.1_20250721.1, name=rhosp17/openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, release=1, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, version=17.1.9, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, 
maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, io.buildah.version=1.33.12, vcs-type=git) Oct 14 04:42:44 localhost podman[87895]: 2025-10-14 08:42:44.20626655 +0000 UTC m=+0.542036586 container exec_died 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, com.redhat.license_terms=https://www.redhat.com/agreements, managed_by=tripleo_ansible, io.openshift.expose-services=, container_name=nova_migration_target, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, version=17.1.9, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., batch=17.1_20250721.1, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, name=rhosp17/openstack-nova-compute, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, com.redhat.component=openstack-nova-compute-container, vcs-type=git, build-date=2025-07-21T14:48:37, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, release=1) Oct 14 04:42:44 localhost systemd[1]: 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884.service: Deactivated successfully. Oct 14 04:42:45 localhost ceph-osd[32440]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Oct 14 04:42:45 localhost ceph-osd[32440]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 3000.1 total, 600.0 interval#012Cumulative writes: 5539 writes, 24K keys, 5539 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s#012Cumulative WAL: 5539 writes, 757 syncs, 7.32 writes per sync, written: 0.02 GB, 0.01 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 505 writes, 2125 keys, 505 commit groups, 1.0 writes per commit group, ingest: 2.50 MB, 0.00 MB/s#012Interval WAL: 505 writes, 187 syncs, 2.70 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent Oct 14 04:42:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e. Oct 14 04:42:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5. 
Oct 14 04:42:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2. Oct 14 04:42:46 localhost systemd[1]: tmp-crun.seu0l8.mount: Deactivated successfully. Oct 14 04:42:46 localhost podman[87991]: 2025-10-14 08:42:46.761878713 +0000 UTC m=+0.095327881 container health_status a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, name=rhosp17/openstack-ovn-controller, distribution-scope=public, managed_by=tripleo_ansible, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, build-date=2025-07-21T13:28:44, com.redhat.component=openstack-ovn-controller-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, maintainer=OpenStack TripleO Team, release=1, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.33.12, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc., version=17.1.9, batch=17.1_20250721.1, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, 
io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements) Oct 14 04:42:46 localhost podman[87991]: 2025-10-14 08:42:46.786009401 +0000 UTC m=+0.119458519 container exec_died a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, build-date=2025-07-21T13:28:44, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, container_name=ovn_controller, io.buildah.version=1.33.12, description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, version=17.1.9, batch=17.1_20250721.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, vendor=Red Hat, Inc., tcib_managed=true, name=rhosp17/openstack-ovn-controller, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, maintainer=OpenStack TripleO Team, release=1, 
config_id=tripleo_step4, com.redhat.component=openstack-ovn-controller-container) Oct 14 04:42:46 localhost systemd[1]: tmp-crun.6cuvNL.mount: Deactivated successfully. Oct 14 04:42:46 localhost podman[87990]: 2025-10-14 08:42:46.79989837 +0000 UTC m=+0.136959699 container health_status 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1, distribution-scope=public, name=rhosp17/openstack-neutron-metadata-agent-ovn, tcib_managed=true, container_name=ovn_metadata_agent, vcs-type=git, io.buildah.version=1.33.12, batch=17.1_20250721.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, com.redhat.license_terms=https://www.redhat.com/agreements, version=17.1.9, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-07-21T16:28:53, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, config_id=tripleo_step4, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, release=1) Oct 14 04:42:46 localhost systemd[1]: a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5.service: Deactivated successfully. 
Oct 14 04:42:46 localhost podman[87990]: 2025-10-14 08:42:46.836154003 +0000 UTC m=+0.173215362 container exec_died 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, config_id=tripleo_step4, io.openshift.expose-services=, vcs-type=git, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack 
Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.9, build-date=2025-07-21T16:28:53, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=ovn_metadata_agent, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, batch=17.1_20250721.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1) Oct 14 04:42:46 localhost podman[87992]: 2025-10-14 08:42:46.844357316 +0000 UTC m=+0.174758680 container health_status ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, build-date=2025-07-21T14:48:37, release=1, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a-4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp17/openstack-nova-compute, config_id=tripleo_step5, version=17.1.9, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, maintainer=OpenStack TripleO Team, io.buildah.version=1.33.12, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, batch=17.1_20250721.1, container_name=nova_compute, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/agreements) Oct 14 04:42:46 localhost systemd[1]: 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e.service: Deactivated successfully. 
Oct 14 04:42:46 localhost podman[87992]: 2025-10-14 08:42:46.868995758 +0000 UTC m=+0.199397152 container exec_died ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, tcib_managed=true, build-date=2025-07-21T14:48:37, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1, batch=17.1_20250721.1, config_id=tripleo_step5, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a-4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', 
'/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, architecture=x86_64, vendor=Red Hat, Inc., distribution-scope=public, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements) Oct 14 04:42:46 localhost systemd[1]: ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2.service: Deactivated successfully. Oct 14 04:42:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6. 
Oct 14 04:42:52 localhost podman[88062]: 2025-10-14 08:42:52.741065901 +0000 UTC m=+0.081678179 container health_status 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vcs-type=git, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-qdrouterd, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, architecture=x86_64, container_name=metrics_qdr, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T13:07:59, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1, tcib_managed=true, 
com.redhat.component=openstack-qdrouterd-container, batch=17.1_20250721.1, io.openshift.expose-services=, managed_by=tripleo_ansible, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.9, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1) Oct 14 04:42:52 localhost podman[88062]: 2025-10-14 08:42:52.951150413 +0000 UTC m=+0.291762701 container exec_died 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, build-date=2025-07-21T13:07:59, distribution-scope=public, architecture=x86_64, batch=17.1_20250721.1, container_name=metrics_qdr, version=17.1.9, release=1, io.openshift.expose-services=, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, config_id=tripleo_step1, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, vcs-type=git, description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 qdrouterd) Oct 14 04:42:52 localhost systemd[1]: 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6.service: Deactivated successfully. Oct 14 04:43:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8. Oct 14 04:43:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea. Oct 14 04:43:08 localhost systemd[1]: tmp-crun.6QMnNM.mount: Deactivated successfully. 
Oct 14 04:43:08 localhost podman[88167]: 2025-10-14 08:43:08.746376673 +0000 UTC m=+0.081204725 container health_status 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, build-date=2025-07-21T13:04:03, container_name=collectd, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, config_id=tripleo_step3, io.buildah.version=1.33.12, summary=Red 
Hat OpenStack Platform 17.1 collectd, version=17.1.9, release=2, distribution-scope=public, tcib_managed=true, vcs-type=git, maintainer=OpenStack TripleO Team, batch=17.1_20250721.1, com.redhat.component=openstack-collectd-container, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1, io.openshift.expose-services=, name=rhosp17/openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2) Oct 14 04:43:08 localhost podman[88167]: 2025-10-14 08:43:08.755867076 +0000 UTC m=+0.090695148 container exec_died 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, summary=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-07-21T13:04:03, config_id=tripleo_step3, managed_by=tripleo_ansible, io.openshift.expose-services=, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true, container_name=collectd, io.buildah.version=1.33.12, version=17.1.9, batch=17.1_20250721.1, name=rhosp17/openstack-collectd, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, release=2, com.redhat.component=openstack-collectd-container, distribution-scope=public, maintainer=OpenStack TripleO Team, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64) Oct 14 04:43:08 localhost systemd[1]: 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8.service: Deactivated successfully. Oct 14 04:43:08 localhost systemd[1]: tmp-crun.uFITlz.mount: Deactivated successfully. 
Oct 14 04:43:08 localhost podman[88168]: 2025-10-14 08:43:08.853729915 +0000 UTC m=+0.186451492 container health_status 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, com.redhat.component=openstack-iscsid-container, vcs-type=git, build-date=2025-07-21T13:27:15, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, release=1, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, architecture=x86_64, version=17.1.9, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20250721.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', 
'/var/lib/iscsi:/var/lib/iscsi:z']}, distribution-scope=public, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., io.openshift.expose-services=) Oct 14 04:43:08 localhost podman[88168]: 2025-10-14 08:43:08.88653561 +0000 UTC m=+0.219257177 container exec_died 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, name=rhosp17/openstack-iscsid, io.buildah.version=1.33.12, release=1, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, architecture=x86_64, distribution-scope=public, version=17.1.9, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, config_id=tripleo_step3, container_name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, build-date=2025-07-21T13:27:15, io.openshift.tags=rhosp osp openstack osp-17.1, batch=17.1_20250721.1, com.redhat.component=openstack-iscsid-container, vendor=Red Hat, Inc., io.openshift.expose-services=, managed_by=tripleo_ansible) Oct 14 04:43:08 localhost systemd[1]: 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea.service: Deactivated successfully. Oct 14 04:43:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638. Oct 14 04:43:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884. Oct 14 04:43:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f. Oct 14 04:43:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e. Oct 14 04:43:14 localhost systemd[1]: tmp-crun.91Jq4L.mount: Deactivated successfully. 
Oct 14 04:43:14 localhost podman[88208]: 2025-10-14 08:43:14.799101246 +0000 UTC m=+0.135016669 container health_status d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, batch=17.1_20250721.1, container_name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, release=1, description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, com.redhat.license_terms=https://www.redhat.com/agreements, version=17.1.9, com.redhat.component=openstack-cron-container, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., tcib_managed=true, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, architecture=x86_64, build-date=2025-07-21T13:07:52, vcs-type=git, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, config_id=tripleo_step4, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 cron) Oct 14 04:43:14 localhost podman[88208]: 2025-10-14 08:43:14.810030735 +0000 UTC m=+0.145946158 container exec_died d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, io.buildah.version=1.33.12, name=rhosp17/openstack-cron, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, batch=17.1_20250721.1, build-date=2025-07-21T13:07:52, version=17.1.9, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, release=1, container_name=logrotate_crond, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, distribution-scope=public, maintainer=OpenStack TripleO Team, vcs-type=git, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.component=openstack-cron-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, architecture=x86_64) Oct 14 04:43:14 localhost systemd[1]: d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f.service: Deactivated successfully. Oct 14 04:43:14 localhost podman[88207]: 2025-10-14 08:43:14.863221541 +0000 UTC m=+0.198685190 container health_status 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20250721.1, architecture=x86_64, container_name=nova_migration_target, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-07-21T14:48:37, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, name=rhosp17/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, version=17.1.9, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, io.openshift.expose-services=, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, distribution-scope=public) Oct 14 04:43:14 localhost podman[88206]: 2025-10-14 08:43:14.764558948 +0000 UTC m=+0.105599470 container health_status 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, release=1, build-date=2025-07-21T15:29:47, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, architecture=x86_64, com.redhat.component=openstack-ceilometer-ipmi-container, name=rhosp17/openstack-ceilometer-ipmi, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, 
description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, version=17.1.9, batch=17.1_20250721.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., distribution-scope=public) Oct 14 04:43:14 localhost podman[88206]: 2025-10-14 08:43:14.89388365 +0000 UTC m=+0.234924162 container exec_died 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, vendor=Red Hat, Inc., vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, managed_by=tripleo_ansible, 
com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, version=17.1.9, com.redhat.component=openstack-ceilometer-ipmi-container, batch=17.1_20250721.1, config_id=tripleo_step4, io.buildah.version=1.33.12, tcib_managed=true, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, architecture=x86_64, container_name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-07-21T15:29:47, distribution-scope=public, name=rhosp17/openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi) Oct 14 04:43:14 localhost systemd[1]: 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638.service: Deactivated successfully. 
Oct 14 04:43:14 localhost podman[88214]: 2025-10-14 08:43:14.937376696 +0000 UTC m=+0.267985775 container health_status def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, version=17.1.9, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1, container_name=ceilometer_agent_compute, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, maintainer=OpenStack TripleO Team, architecture=x86_64, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, build-date=2025-07-21T14:45:33, batch=17.1_20250721.1, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhosp17/openstack-ceilometer-compute, config_id=tripleo_step4, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3) Oct 14 04:43:15 localhost podman[88214]: 2025-10-14 08:43:15.00309241 +0000 UTC m=+0.333701499 container exec_died def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, build-date=2025-07-21T14:45:33, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, architecture=x86_64, config_id=tripleo_step4, version=17.1.9, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, batch=17.1_20250721.1, container_name=ceilometer_agent_compute, name=rhosp17/openstack-ceilometer-compute) Oct 14 04:43:15 localhost systemd[1]: def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e.service: Deactivated successfully. 
Oct 14 04:43:15 localhost podman[88207]: 2025-10-14 08:43:15.235216224 +0000 UTC m=+0.570679913 container exec_died 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, build-date=2025-07-21T14:48:37, container_name=nova_migration_target, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, batch=17.1_20250721.1, io.openshift.expose-services=, architecture=x86_64, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, release=1, config_id=tripleo_step4, distribution-scope=public, name=rhosp17/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack 
Platform 17.1 nova-compute, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/agreements, version=17.1.9, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute) Oct 14 04:43:15 localhost systemd[1]: 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884.service: Deactivated successfully. Oct 14 04:43:15 localhost systemd[1]: tmp-crun.A90zB5.mount: Deactivated successfully. Oct 14 04:43:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e. Oct 14 04:43:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5. Oct 14 04:43:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2. Oct 14 04:43:17 localhost systemd[1]: tmp-crun.mm8y9I.mount: Deactivated successfully. 
Oct 14 04:43:17 localhost podman[88301]: 2025-10-14 08:43:17.802880351 +0000 UTC m=+0.143384750 container health_status 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, config_id=tripleo_step4, io.buildah.version=1.33.12, tcib_managed=true, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20250721.1, architecture=x86_64, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, build-date=2025-07-21T16:28:53, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, container_name=ovn_metadata_agent, release=1, managed_by=tripleo_ansible) Oct 14 04:43:17 localhost podman[88302]: 2025-10-14 08:43:17.779809136 +0000 UTC m=+0.119364625 container health_status a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., vcs-type=git, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, name=rhosp17/openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, batch=17.1_20250721.1, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T13:28:44, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.9, architecture=x86_64, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, com.redhat.component=openstack-ovn-controller-container, container_name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, 
config_id=tripleo_step4, release=1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}) Oct 14 04:43:17 localhost podman[88303]: 2025-10-14 08:43:17.754623967 +0000 UTC m=+0.097107866 container health_status ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, distribution-scope=public, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, architecture=x86_64, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, maintainer=OpenStack TripleO Team, release=1, config_id=tripleo_step5, build-date=2025-07-21T14:48:37, com.redhat.license_terms=https://www.redhat.com/agreements, tcib_managed=true, version=17.1.9, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.33.12, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, managed_by=tripleo_ansible, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 
'49c4309af9a4fea3d3f53b6222780f5a-4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20250721.1) Oct 14 04:43:17 localhost podman[88302]: 2025-10-14 08:43:17.865202509 +0000 UTC m=+0.204758018 container exec_died a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, 
name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, container_name=ovn_controller, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-ovn-controller, batch=17.1_20250721.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, summary=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.9, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, distribution-scope=public, io.buildah.version=1.33.12, config_id=tripleo_step4, build-date=2025-07-21T13:28:44, architecture=x86_64, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, com.redhat.component=openstack-ovn-controller-container, tcib_managed=true) Oct 14 04:43:17 localhost systemd[1]: a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5.service: Deactivated successfully. 
Oct 14 04:43:17 localhost podman[88303]: 2025-10-14 08:43:17.884774295 +0000 UTC m=+0.227258154 container exec_died ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, config_id=tripleo_step5, container_name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.33.12, build-date=2025-07-21T14:48:37, name=rhosp17/openstack-nova-compute, vendor=Red Hat, Inc., release=1, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a-4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, vcs-type=git, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, batch=17.1_20250721.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, distribution-scope=public) Oct 14 04:43:17 localhost systemd[1]: ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2.service: Deactivated successfully. 
Oct 14 04:43:17 localhost podman[88301]: 2025-10-14 08:43:17.918005644 +0000 UTC m=+0.258510043 container exec_died 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, tcib_managed=true, vcs-type=git, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, architecture=x86_64, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', 
'/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, container_name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, config_id=tripleo_step4, io.buildah.version=1.33.12, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, release=1, io.openshift.expose-services=, version=17.1.9, batch=17.1_20250721.1, distribution-scope=public, build-date=2025-07-21T16:28:53, name=rhosp17/openstack-neutron-metadata-agent-ovn) Oct 14 04:43:17 localhost systemd[1]: 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e.service: Deactivated successfully. Oct 14 04:43:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6. Oct 14 04:43:23 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Oct 14 04:43:23 localhost recover_tripleo_nova_virtqemud[88381]: 62551 Oct 14 04:43:23 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Oct 14 04:43:23 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. 
Oct 14 04:43:23 localhost podman[88375]: 2025-10-14 08:43:23.754135092 +0000 UTC m=+0.091958777 container health_status 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, batch=17.1_20250721.1, container_name=metrics_qdr, vendor=Red Hat, Inc., architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, build-date=2025-07-21T13:07:59, version=17.1.9, io.buildah.version=1.33.12, tcib_managed=true, config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1, vcs-type=git, 
vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, name=rhosp17/openstack-qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, distribution-scope=public) Oct 14 04:43:23 localhost podman[88375]: 2025-10-14 08:43:23.977846186 +0000 UTC m=+0.315669821 container exec_died 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, com.redhat.component=openstack-qdrouterd-container, batch=17.1_20250721.1, container_name=metrics_qdr, build-date=2025-07-21T13:07:59, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, version=17.1.9, config_id=tripleo_step1, io.openshift.expose-services=, release=1, distribution-scope=public, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, name=rhosp17/openstack-qdrouterd, vendor=Red Hat, Inc., architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd) Oct 14 04:43:23 localhost systemd[1]: 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6.service: Deactivated successfully. Oct 14 04:43:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8. Oct 14 04:43:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea. Oct 14 04:43:39 localhost systemd[1]: tmp-crun.D34iFQ.mount: Deactivated successfully. 
Oct 14 04:43:39 localhost podman[88451]: 2025-10-14 08:43:39.742313085 +0000 UTC m=+0.083350481 container health_status 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1, release=2, build-date=2025-07-21T13:04:03, com.redhat.component=openstack-collectd-container, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, distribution-scope=public, version=17.1.9, managed_by=tripleo_ansible, vcs-type=git, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, io.openshift.expose-services=, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20250721.1, name=rhosp17/openstack-collectd, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3) Oct 14 04:43:39 localhost podman[88451]: 2025-10-14 08:43:39.843792046 +0000 UTC m=+0.184829502 container exec_died 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, managed_by=tripleo_ansible, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', 
'/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, maintainer=OpenStack TripleO Team, tcib_managed=true, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20250721.1, io.openshift.expose-services=, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step3, release=2, architecture=x86_64, build-date=2025-07-21T13:04:03, com.redhat.component=openstack-collectd-container, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, name=rhosp17/openstack-collectd, version=17.1.9) Oct 14 04:43:39 localhost systemd[1]: 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8.service: Deactivated successfully. 
Oct 14 04:43:39 localhost podman[88452]: 2025-10-14 08:43:39.972154058 +0000 UTC m=+0.308597881 container health_status 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, vcs-type=git, config_id=tripleo_step3, batch=17.1_20250721.1, com.redhat.component=openstack-iscsid-container, vendor=Red Hat, Inc., vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, name=rhosp17/openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, maintainer=OpenStack TripleO Team, version=17.1.9, architecture=x86_64, build-date=2025-07-21T13:27:15, managed_by=tripleo_ansible, tcib_managed=true, 
container_name=iscsid, distribution-scope=public, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=) Oct 14 04:43:40 localhost podman[88452]: 2025-10-14 08:43:40.011242868 +0000 UTC m=+0.347686661 container exec_died 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vendor=Red Hat, Inc., config_id=tripleo_step3, name=rhosp17/openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20250721.1, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, release=1, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-07-21T13:27:15, distribution-scope=public, io.buildah.version=1.33.12, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, version=17.1.9, com.redhat.component=openstack-iscsid-container, summary=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, vcs-type=git, container_name=iscsid, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 iscsid, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, io.openshift.tags=rhosp osp openstack osp-17.1) Oct 14 04:43:40 localhost systemd[1]: 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea.service: Deactivated successfully. Oct 14 04:43:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638. Oct 14 04:43:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884. Oct 14 04:43:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f. Oct 14 04:43:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e. Oct 14 04:43:45 localhost systemd[1]: tmp-crun.6JnZjV.mount: Deactivated successfully. 
Oct 14 04:43:45 localhost podman[88491]: 2025-10-14 08:43:45.744817386 +0000 UTC m=+0.089063077 container health_status 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, com.redhat.component=openstack-ceilometer-ipmi-container, container_name=ceilometer_agent_ipmi, io.buildah.version=1.33.12, name=rhosp17/openstack-ceilometer-ipmi, version=17.1.9, release=1, config_id=tripleo_step4, build-date=2025-07-21T15:29:47, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 
ceilometer-ipmi, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, tcib_managed=true, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, batch=17.1_20250721.1, vendor=Red Hat, Inc.) Oct 14 04:43:45 localhost podman[88493]: 2025-10-14 08:43:45.765552848 +0000 UTC m=+0.099784189 container health_status d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, com.redhat.component=openstack-cron-container, description=Red Hat OpenStack Platform 17.1 cron, build-date=2025-07-21T13:07:52, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, config_id=tripleo_step4, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, 
release=1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.33.12, architecture=x86_64, name=rhosp17/openstack-cron, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, version=17.1.9, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, io.openshift.expose-services=, vcs-type=git, managed_by=tripleo_ansible, tcib_managed=true, batch=17.1_20250721.1, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1) Oct 14 04:43:45 localhost podman[88491]: 2025-10-14 08:43:45.776977091 +0000 UTC m=+0.121222802 container exec_died 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, container_name=ceilometer_agent_ipmi, name=rhosp17/openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20250721.1, build-date=2025-07-21T15:29:47, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, io.buildah.version=1.33.12, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, vendor=Red Hat, Inc., io.openshift.expose-services=, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, release=1, tcib_managed=true, version=17.1.9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements) Oct 14 04:43:45 localhost systemd[1]: 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638.service: Deactivated successfully. 
Oct 14 04:43:45 localhost podman[88499]: 2025-10-14 08:43:45.823379088 +0000 UTC m=+0.151530051 container health_status def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.openshift.expose-services=, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, com.redhat.component=openstack-ceilometer-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, name=rhosp17/openstack-ceilometer-compute, vcs-type=git, build-date=2025-07-21T14:45:33, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 
17.1 ceilometer-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, batch=17.1_20250721.1, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=ceilometer_agent_compute, managed_by=tripleo_ansible, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.9, tcib_managed=true) Oct 14 04:43:45 localhost podman[88493]: 2025-10-14 08:43:45.852284392 +0000 UTC m=+0.186515773 container exec_died d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, build-date=2025-07-21T13:07:52, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, distribution-scope=public, architecture=x86_64, name=rhosp17/openstack-cron, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, summary=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, com.redhat.component=openstack-cron-container, description=Red Hat OpenStack Platform 17.1 cron, version=17.1.9, config_id=tripleo_step4, io.buildah.version=1.33.12, release=1, io.openshift.tags=rhosp osp openstack osp-17.1, batch=17.1_20250721.1, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.expose-services=, container_name=logrotate_crond) Oct 14 04:43:45 localhost systemd[1]: d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f.service: Deactivated successfully. Oct 14 04:43:45 localhost podman[88499]: 2025-10-14 08:43:45.873842469 +0000 UTC m=+0.201993522 container exec_died def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, config_id=tripleo_step4, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, vcs-type=git, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=ceilometer_agent_compute, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp17/openstack-ceilometer-compute, build-date=2025-07-21T14:45:33, io.openshift.expose-services=, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, batch=17.1_20250721.1, com.redhat.component=openstack-ceilometer-compute-container, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.9, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64) Oct 14 04:43:45 localhost systemd[1]: def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e.service: Deactivated successfully. 
Oct 14 04:43:45 localhost podman[88492]: 2025-10-14 08:43:45.85546513 +0000 UTC m=+0.192469757 container health_status 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, distribution-scope=public, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, name=rhosp17/openstack-nova-compute, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=nova_migration_target, managed_by=tripleo_ansible, architecture=x86_64, build-date=2025-07-21T14:48:37, io.buildah.version=1.33.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 nova-compute, 
io.openshift.expose-services=, vendor=Red Hat, Inc., version=17.1.9, com.redhat.component=openstack-nova-compute-container, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, release=1) Oct 14 04:43:46 localhost podman[88492]: 2025-10-14 08:43:46.187091654 +0000 UTC m=+0.524096271 container exec_died 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vendor=Red Hat, Inc., version=17.1.9, io.openshift.tags=rhosp osp openstack osp-17.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, io.k8s.display-name=Red Hat 
OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, architecture=x86_64, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=nova_migration_target, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, batch=17.1_20250721.1, io.openshift.expose-services=, release=1, build-date=2025-07-21T14:48:37, config_id=tripleo_step4, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.33.12, name=rhosp17/openstack-nova-compute) Oct 14 04:43:46 localhost systemd[1]: 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884.service: Deactivated successfully. Oct 14 04:43:46 localhost systemd[1]: tmp-crun.BuDpOP.mount: Deactivated successfully. Oct 14 04:43:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e. Oct 14 04:43:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5. Oct 14 04:43:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2. Oct 14 04:43:48 localhost systemd[1]: tmp-crun.jBeBDj.mount: Deactivated successfully. 
Oct 14 04:43:48 localhost podman[88586]: 2025-10-14 08:43:48.747797565 +0000 UTC m=+0.086476508 container health_status a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, vendor=Red Hat, Inc., io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, batch=17.1_20250721.1, name=rhosp17/openstack-ovn-controller, release=1, description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-07-21T13:28:44, io.openshift.tags=rhosp osp openstack osp-17.1, config_id=tripleo_step4, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, managed_by=tripleo_ansible, vcs-type=git, maintainer=OpenStack TripleO Team, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, distribution-scope=public, container_name=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, version=17.1.9, io.openshift.expose-services=) Oct 14 04:43:48 localhost podman[88585]: 2025-10-14 08:43:48.786285036 +0000 
UTC m=+0.130287294 container health_status 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., config_id=tripleo_step4, architecture=x86_64, container_name=ovn_metadata_agent, vcs-type=git, build-date=2025-07-21T16:28:53, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, batch=17.1_20250721.1, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, com.redhat.license_terms=https://www.redhat.com/agreements, version=17.1.9, tcib_managed=true, distribution-scope=public, name=rhosp17/openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.component=openstack-neutron-metadata-agent-ovn-container) Oct 14 04:43:48 localhost podman[88587]: 2025-10-14 08:43:48.796860092 +0000 UTC m=+0.136040980 container health_status ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, managed_by=tripleo_ansible, vendor=Red Hat, Inc., version=17.1.9, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a-4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, architecture=x86_64, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1, name=rhosp17/openstack-nova-compute, build-date=2025-07-21T14:48:37, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20250721.1, io.openshift.expose-services=, config_id=tripleo_step5, tcib_managed=true) Oct 14 04:43:48 localhost podman[88586]: 2025-10-14 08:43:48.804802178 +0000 UTC m=+0.143481131 container exec_died a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, release=1, 
io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, tcib_managed=true, build-date=2025-07-21T13:28:44, name=rhosp17/openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, maintainer=OpenStack TripleO Team, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, config_id=tripleo_step4, batch=17.1_20250721.1, com.redhat.component=openstack-ovn-controller-container, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., io.openshift.expose-services=, version=17.1.9, distribution-scope=public, io.buildah.version=1.33.12, container_name=ovn_controller) Oct 14 04:43:48 localhost systemd[1]: a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5.service: Deactivated successfully. 
Oct 14 04:43:48 localhost podman[88585]: 2025-10-14 08:43:48.828065018 +0000 UTC m=+0.172067296 container exec_died 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, architecture=x86_64, batch=17.1_20250721.1, container_name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vendor=Red Hat, Inc., tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, io.buildah.version=1.33.12, release=1, com.redhat.license_terms=https://www.redhat.com/agreements, managed_by=tripleo_ansible, build-date=2025-07-21T16:28:53, distribution-scope=public, config_id=tripleo_step4, version=17.1.9) Oct 14 04:43:48 localhost systemd[1]: 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e.service: Deactivated successfully. Oct 14 04:43:48 localhost podman[88587]: 2025-10-14 08:43:48.850283636 +0000 UTC m=+0.189464634 container exec_died ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, build-date=2025-07-21T14:48:37, batch=17.1_20250721.1, io.openshift.expose-services=, version=17.1.9, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a-4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step5, tcib_managed=true, io.buildah.version=1.33.12, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, container_name=nova_compute, distribution-scope=public, release=1, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, vcs-type=git, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, managed_by=tripleo_ansible) Oct 14 04:43:48 localhost systemd[1]: 
ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2.service: Deactivated successfully. Oct 14 04:43:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6. Oct 14 04:43:54 localhost systemd[1]: tmp-crun.Ca7f3p.mount: Deactivated successfully. Oct 14 04:43:54 localhost podman[88656]: 2025-10-14 08:43:54.72143979 +0000 UTC m=+0.063119424 container health_status 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, io.openshift.expose-services=, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=openstack-qdrouterd-container, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.33.12, managed_by=tripleo_ansible, name=rhosp17/openstack-qdrouterd, batch=17.1_20250721.1, build-date=2025-07-21T13:07:59, distribution-scope=public, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, release=1, version=17.1.9, maintainer=OpenStack TripleO Team, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1, description=Red Hat OpenStack Platform 17.1 qdrouterd) Oct 14 04:43:54 localhost podman[88656]: 2025-10-14 08:43:54.9347136 +0000 UTC m=+0.276393294 container exec_died 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vcs-type=git, description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, build-date=2025-07-21T13:07:59, release=1, batch=17.1_20250721.1, container_name=metrics_qdr, maintainer=OpenStack TripleO Team, tcib_managed=true, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, version=17.1.9, io.openshift.expose-services=, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 
'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.buildah.version=1.33.12) Oct 14 04:43:54 localhost systemd[1]: 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6.service: Deactivated successfully. Oct 14 04:44:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8. Oct 14 04:44:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea. 
Oct 14 04:44:10 localhost podman[88814]: 2025-10-14 08:44:10.741219834 +0000 UTC m=+0.080332377 container health_status 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1, config_id=tripleo_step3, vcs-type=git, tcib_managed=true, batch=17.1_20250721.1, com.redhat.component=openstack-collectd-container, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, architecture=x86_64, container_name=collectd, version=17.1.9, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, release=2, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, name=rhosp17/openstack-collectd, io.openshift.expose-services=, build-date=2025-07-21T13:04:03) Oct 14 04:44:10 localhost podman[88814]: 2025-10-14 08:44:10.757979263 +0000 UTC m=+0.097091786 container exec_died 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, name=rhosp17/openstack-collectd, io.openshift.expose-services=, architecture=x86_64, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, batch=17.1_20250721.1, com.redhat.component=openstack-collectd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, release=2, build-date=2025-07-21T13:04:03, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, tcib_managed=true, vcs-type=git, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1, container_name=collectd, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.9, distribution-scope=public, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2) Oct 14 04:44:10 localhost systemd[1]: 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8.service: Deactivated successfully. 
Oct 14 04:44:10 localhost podman[88815]: 2025-10-14 08:44:10.800090006 +0000 UTC m=+0.138597831 container health_status 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, com.redhat.component=openstack-iscsid-container, io.openshift.expose-services=, version=17.1.9, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-iscsid, release=1, config_id=tripleo_step3, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, tcib_managed=true, distribution-scope=public, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, managed_by=tripleo_ansible, vendor=Red Hat, Inc., batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-07-21T13:27:15, summary=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, container_name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1) Oct 14 04:44:10 localhost podman[88815]: 2025-10-14 08:44:10.812043236 +0000 UTC m=+0.150551091 container exec_died 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, vcs-type=git, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, managed_by=tripleo_ansible, architecture=x86_64, build-date=2025-07-21T13:27:15, description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, batch=17.1_20250721.1, release=1, config_id=tripleo_step3, container_name=iscsid, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, tcib_managed=true, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhosp17/openstack-iscsid, vendor=Red Hat, Inc.) Oct 14 04:44:10 localhost systemd[1]: 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea.service: Deactivated successfully. Oct 14 04:44:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638. Oct 14 04:44:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884. Oct 14 04:44:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f. Oct 14 04:44:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e. Oct 14 04:44:16 localhost systemd[1]: tmp-crun.esxOcE.mount: Deactivated successfully. 
Oct 14 04:44:16 localhost podman[88852]: 2025-10-14 08:44:16.74721036 +0000 UTC m=+0.091189183 container health_status 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, container_name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.33.12, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, tcib_managed=true, release=1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20250721.1, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, 
managed_by=tripleo_ansible, io.openshift.expose-services=, build-date=2025-07-21T15:29:47, version=17.1.9, architecture=x86_64, name=rhosp17/openstack-ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public) Oct 14 04:44:16 localhost podman[88852]: 2025-10-14 08:44:16.775107594 +0000 UTC m=+0.119086407 container exec_died 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.buildah.version=1.33.12, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, architecture=x86_64, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc., batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team, tcib_managed=true, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_ipmi, build-date=2025-07-21T15:29:47, distribution-scope=public, config_id=tripleo_step4, vcs-type=git, version=17.1.9, name=rhosp17/openstack-ceilometer-ipmi, release=1, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi) Oct 14 04:44:16 localhost systemd[1]: 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638.service: Deactivated successfully. Oct 14 04:44:16 localhost podman[88855]: 2025-10-14 08:44:16.794732661 +0000 UTC m=+0.132900855 container health_status def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.33.12, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-compute-container, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, container_name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, batch=17.1_20250721.1, architecture=x86_64, build-date=2025-07-21T14:45:33, distribution-scope=public, version=17.1.9, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, tcib_managed=true, vcs-type=git) Oct 14 04:44:16 localhost podman[88854]: 2025-10-14 08:44:16.836626348 +0000 UTC m=+0.172483520 container health_status d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, architecture=x86_64, com.redhat.component=openstack-cron-container, name=rhosp17/openstack-cron, io.openshift.expose-services=, io.buildah.version=1.33.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, container_name=logrotate_crond, release=1, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, vcs-type=git, version=17.1.9, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20250721.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, build-date=2025-07-21T13:07:52, distribution-scope=public) Oct 14 04:44:16 localhost podman[88854]: 2025-10-14 08:44:16.847923847 +0000 UTC m=+0.183781019 container exec_died d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, 
version=17.1.9, architecture=x86_64, com.redhat.component=openstack-cron-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, tcib_managed=true, release=1, vcs-type=git, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, build-date=2025-07-21T13:07:52, container_name=logrotate_crond, io.buildah.version=1.33.12, batch=17.1_20250721.1, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., distribution-scope=public, managed_by=tripleo_ansible, name=rhosp17/openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/agreements) Oct 14 04:44:16 localhost systemd[1]: d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f.service: Deactivated successfully. 
Oct 14 04:44:16 localhost podman[88855]: 2025-10-14 08:44:16.898862724 +0000 UTC m=+0.237031028 container exec_died def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, io.openshift.expose-services=, managed_by=tripleo_ansible, container_name=ceilometer_agent_compute, tcib_managed=true, batch=17.1_20250721.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, release=1, build-date=2025-07-21T14:45:33, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, 
version=17.1.9, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., architecture=x86_64, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-compute-container, com.redhat.license_terms=https://www.redhat.com/agreements) Oct 14 04:44:16 localhost podman[88853]: 2025-10-14 08:44:16.913179427 +0000 UTC m=+0.254660243 container health_status 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, build-date=2025-07-21T14:48:37, release=1, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, batch=17.1_20250721.1, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., version=17.1.9, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, name=rhosp17/openstack-nova-compute, container_name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step4, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/agreements) Oct 14 04:44:16 localhost systemd[1]: def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e.service: Deactivated successfully. Oct 14 04:44:17 localhost podman[88853]: 2025-10-14 08:44:17.244903113 +0000 UTC m=+0.586383889 container exec_died 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, version=17.1.9, config_id=tripleo_step4, distribution-scope=public, container_name=nova_migration_target, build-date=2025-07-21T14:48:37, release=1, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., batch=17.1_20250721.1, io.buildah.version=1.33.12, vcs-type=git, tcib_managed=true, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, managed_by=tripleo_ansible) Oct 14 04:44:17 localhost systemd[1]: 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884.service: Deactivated successfully. Oct 14 04:44:17 localhost systemd[1]: tmp-crun.8w8VpX.mount: Deactivated successfully. Oct 14 04:44:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e. Oct 14 04:44:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5. 
Oct 14 04:44:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2. Oct 14 04:44:19 localhost systemd[1]: tmp-crun.WITCf1.mount: Deactivated successfully. Oct 14 04:44:19 localhost podman[88947]: 2025-10-14 08:44:19.741454828 +0000 UTC m=+0.082840985 container health_status a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/agreements, release=1, com.redhat.component=openstack-ovn-controller-container, vcs-type=git, container_name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, vendor=Red Hat, Inc., managed_by=tripleo_ansible, batch=17.1_20250721.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.expose-services=, tcib_managed=true, version=17.1.9, architecture=x86_64, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, distribution-scope=public, io.buildah.version=1.33.12, name=rhosp17/openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, build-date=2025-07-21T13:28:44, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1) Oct 14 04:44:19 localhost podman[88948]: 2025-10-14 08:44:19.802629811 +0000 UTC m=+0.137865987 container health_status ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, version=17.1.9, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-07-21T14:48:37, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a-4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, release=1, config_id=tripleo_step5, io.buildah.version=1.33.12, io.openshift.expose-services=, tcib_managed=true, vcs-type=git, architecture=x86_64, batch=17.1_20250721.1, container_name=nova_compute, com.redhat.component=openstack-nova-compute-container, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc.) 
Oct 14 04:44:19 localhost podman[88946]: 2025-10-14 08:44:19.840102891 +0000 UTC m=+0.181787157 container health_status 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhosp17/openstack-neutron-metadata-agent-ovn, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, version=17.1.9, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, config_id=tripleo_step4, build-date=2025-07-21T16:28:53, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, vcs-type=git, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, maintainer=OpenStack TripleO Team, distribution-scope=public, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., managed_by=tripleo_ansible, architecture=x86_64) Oct 14 04:44:19 localhost podman[88947]: 2025-10-14 08:44:19.86817437 +0000 UTC m=+0.209560507 container exec_died a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, io.openshift.tags=rhosp osp openstack osp-17.1, batch=17.1_20250721.1, 
version=17.1.9, container_name=ovn_controller, config_id=tripleo_step4, distribution-scope=public, io.buildah.version=1.33.12, release=1, io.openshift.expose-services=, name=rhosp17/openstack-ovn-controller, build-date=2025-07-21T13:28:44, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, vendor=Red Hat, Inc., tcib_managed=true, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, com.redhat.license_terms=https://www.redhat.com/agreements) Oct 14 04:44:19 localhost podman[88946]: 2025-10-14 08:44:19.868747378 +0000 UTC m=+0.210431734 container exec_died 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, architecture=x86_64, name=rhosp17/openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-type=git, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, maintainer=OpenStack TripleO Team, tcib_managed=true, batch=17.1_20250721.1, config_id=tripleo_step4, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, build-date=2025-07-21T16:28:53, release=1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, container_name=ovn_metadata_agent, version=17.1.9, io.openshift.expose-services=, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc.) Oct 14 04:44:19 localhost systemd[1]: 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e.service: Deactivated successfully. Oct 14 04:44:19 localhost systemd[1]: a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5.service: Deactivated successfully. 
Oct 14 04:44:19 localhost podman[88948]: 2025-10-14 08:44:19.977149953 +0000 UTC m=+0.312386169 container exec_died ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, name=rhosp17/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, batch=17.1_20250721.1, release=1, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, io.buildah.version=1.33.12, vcs-type=git, architecture=x86_64, version=17.1.9, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, build-date=2025-07-21T14:48:37, com.redhat.component=openstack-nova-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a-4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_compute, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1) Oct 14 04:44:19 localhost systemd[1]: ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2.service: Deactivated successfully. Oct 14 04:44:20 localhost systemd[1]: tmp-crun.4ABhg8.mount: Deactivated successfully. Oct 14 04:44:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6. Oct 14 04:44:25 localhost systemd[1]: tmp-crun.b1AYkx.mount: Deactivated successfully. 
Oct 14 04:44:25 localhost podman[89020]: 2025-10-14 08:44:25.737362774 +0000 UTC m=+0.076188288 container health_status 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, tcib_managed=true, name=rhosp17/openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.33.12, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., architecture=x86_64, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 
qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.9, config_id=tripleo_step1, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T13:07:59, container_name=metrics_qdr, managed_by=tripleo_ansible) Oct 14 04:44:25 localhost podman[89020]: 2025-10-14 08:44:25.932289297 +0000 UTC m=+0.271114801 container exec_died 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, release=1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, com.redhat.component=openstack-qdrouterd-container, architecture=x86_64, io.openshift.expose-services=, container_name=metrics_qdr, io.buildah.version=1.33.12, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, config_id=tripleo_step1, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vendor=Red Hat, Inc., version=17.1.9, build-date=2025-07-21T13:07:59, description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true) Oct 14 04:44:25 localhost systemd[1]: 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6.service: Deactivated successfully. Oct 14 04:44:37 localhost sshd[89093]: main: sshd: ssh-rsa algorithm is disabled Oct 14 04:44:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8. Oct 14 04:44:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea. Oct 14 04:44:41 localhost systemd[1]: tmp-crun.Hxiczx.mount: Deactivated successfully. 
Oct 14 04:44:41 localhost podman[89094]: 2025-10-14 08:44:41.739375784 +0000 UTC m=+0.082793834 container health_status 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, container_name=collectd, release=2, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, distribution-scope=public, io.buildah.version=1.33.12, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., tcib_managed=true, batch=17.1_20250721.1, com.redhat.component=openstack-collectd-container, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, config_id=tripleo_step3, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T13:04:03, io.openshift.expose-services=, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd) Oct 14 04:44:41 localhost podman[89094]: 2025-10-14 08:44:41.746701832 +0000 UTC m=+0.090119882 container exec_died 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-collectd, com.redhat.component=openstack-collectd-container, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, version=17.1.9, description=Red Hat OpenStack Platform 17.1 collectd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, tcib_managed=true, batch=17.1_20250721.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, io.buildah.version=1.33.12, config_id=tripleo_step3, build-date=2025-07-21T13:04:03, release=2, managed_by=tripleo_ansible, vendor=Red Hat, Inc.) Oct 14 04:44:41 localhost systemd[1]: 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8.service: Deactivated successfully. 
Oct 14 04:44:41 localhost podman[89095]: 2025-10-14 08:44:41.794277714 +0000 UTC m=+0.132888754 container health_status 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, managed_by=tripleo_ansible, io.buildah.version=1.33.12, vendor=Red Hat, Inc., container_name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1, version=17.1.9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, release=1, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, vcs-type=git, tcib_managed=true, com.redhat.component=openstack-iscsid-container, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, batch=17.1_20250721.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', 
'/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, distribution-scope=public, build-date=2025-07-21T13:27:15, name=rhosp17/openstack-iscsid) Oct 14 04:44:41 localhost podman[89095]: 2025-10-14 08:44:41.802629112 +0000 UTC m=+0.141240242 container exec_died 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.buildah.version=1.33.12, vcs-type=git, description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, build-date=2025-07-21T13:27:15, vendor=Red Hat, Inc., version=17.1.9, distribution-scope=public, com.redhat.component=openstack-iscsid-container, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', 
'/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, release=1, summary=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20250721.1) Oct 14 04:44:41 localhost systemd[1]: 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea.service: Deactivated successfully. Oct 14 04:44:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638. Oct 14 04:44:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884. Oct 14 04:44:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f. Oct 14 04:44:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e. Oct 14 04:44:47 localhost systemd[1]: tmp-crun.LqmiJz.mount: Deactivated successfully. 
Oct 14 04:44:47 localhost podman[89138]: 2025-10-14 08:44:47.812697985 +0000 UTC m=+0.144287057 container health_status def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, managed_by=tripleo_ansible, vcs-type=git, distribution-scope=public, build-date=2025-07-21T14:45:33, com.redhat.component=openstack-ceilometer-compute-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, name=rhosp17/openstack-ceilometer-compute, tcib_managed=true, version=17.1.9, io.openshift.tags=rhosp osp openstack osp-17.1, container_name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack 
Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team, release=1, config_id=tripleo_step4, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.33.12) Oct 14 04:44:47 localhost podman[89136]: 2025-10-14 08:44:47.767074563 +0000 UTC m=+0.103635528 container health_status 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, container_name=nova_migration_target, distribution-scope=public, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, release=1, batch=17.1_20250721.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, build-date=2025-07-21T14:48:37, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, vendor=Red Hat, Inc., architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, version=17.1.9, vcs-type=git) Oct 14 04:44:47 localhost podman[89137]: 2025-10-14 08:44:47.897005914 +0000 UTC m=+0.229641228 container health_status d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, name=rhosp17/openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, build-date=2025-07-21T13:07:52, io.buildah.version=1.33.12, version=17.1.9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, batch=17.1_20250721.1, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, config_id=tripleo_step4, release=1, distribution-scope=public, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, container_name=logrotate_crond, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 
'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, managed_by=tripleo_ansible, architecture=x86_64, tcib_managed=true, com.redhat.component=openstack-cron-container, description=Red Hat OpenStack Platform 17.1 cron) Oct 14 04:44:47 localhost podman[89137]: 2025-10-14 08:44:47.958171387 +0000 UTC m=+0.290806741 container exec_died d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, batch=17.1_20250721.1, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, managed_by=tripleo_ansible, release=1, build-date=2025-07-21T13:07:52, distribution-scope=public, architecture=x86_64, tcib_managed=true, version=17.1.9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 
'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team, container_name=logrotate_crond, com.redhat.component=openstack-cron-container, config_id=tripleo_step4, io.buildah.version=1.33.12, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 cron) Oct 14 04:44:47 localhost podman[89135]: 2025-10-14 08:44:47.819506335 +0000 UTC m=+0.159954651 container health_status 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1, name=rhosp17/openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, vcs-type=git, 
config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., version=17.1.9, tcib_managed=true, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, build-date=2025-07-21T15:29:47, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, io.openshift.expose-services=, batch=17.1_20250721.1, container_name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1) Oct 14 04:44:47 localhost systemd[1]: d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f.service: Deactivated successfully. 
Oct 14 04:44:48 localhost podman[89135]: 2025-10-14 08:44:48.004342116 +0000 UTC m=+0.344790462 container exec_died 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20250721.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-ceilometer-ipmi, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, vendor=Red Hat, Inc., release=1, build-date=2025-07-21T15:29:47, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, 
config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.9, vcs-type=git, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.expose-services=, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/agreements, tcib_managed=true) Oct 14 04:44:48 localhost systemd[1]: 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638.service: Deactivated successfully. Oct 14 04:44:48 localhost podman[89138]: 2025-10-14 08:44:48.061538437 +0000 UTC m=+0.393127519 container exec_died def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-ceilometer-compute, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-compute-container, architecture=x86_64, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.33.12, vcs-type=git, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, tcib_managed=true, version=17.1.9, build-date=2025-07-21T14:45:33, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc.) Oct 14 04:44:48 localhost podman[89136]: 2025-10-14 08:44:48.098105018 +0000 UTC m=+0.434665973 container exec_died 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, container_name=nova_migration_target, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhosp17/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, version=17.1.9, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': 
'/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1, architecture=x86_64, batch=17.1_20250721.1, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, io.openshift.expose-services=, build-date=2025-07-21T14:48:37) Oct 14 04:44:48 localhost systemd[1]: def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e.service: Deactivated successfully. Oct 14 04:44:48 localhost systemd[1]: 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884.service: Deactivated successfully. Oct 14 04:44:48 localhost systemd[1]: tmp-crun.S416Rp.mount: Deactivated successfully. Oct 14 04:44:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e. Oct 14 04:44:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5. 
Oct 14 04:44:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2. Oct 14 04:44:50 localhost systemd[1]: tmp-crun.bWYmSc.mount: Deactivated successfully. Oct 14 04:44:50 localhost podman[89232]: 2025-10-14 08:44:50.737514044 +0000 UTC m=+0.075095605 container health_status ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, name=rhosp17/openstack-nova-compute, vcs-type=git, container_name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, architecture=x86_64, config_id=tripleo_step5, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, distribution-scope=public, batch=17.1_20250721.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a-4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.33.12, io.openshift.expose-services=, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, version=17.1.9, build-date=2025-07-21T14:48:37, com.redhat.component=openstack-nova-compute-container) Oct 14 04:44:50 localhost podman[89231]: 2025-10-14 08:44:50.790461163 +0000 UTC m=+0.130346266 container health_status a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', 
'/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, tcib_managed=true, distribution-scope=public, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, batch=17.1_20250721.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, version=17.1.9, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, container_name=ovn_controller, io.openshift.expose-services=, name=rhosp17/openstack-ovn-controller, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, vendor=Red Hat, Inc., architecture=x86_64, build-date=2025-07-21T13:28:44, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4) Oct 14 04:44:50 localhost podman[89232]: 2025-10-14 08:44:50.796320574 +0000 UTC m=+0.133902135 container exec_died ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, managed_by=tripleo_ansible, vcs-type=git, config_id=tripleo_step5, tcib_managed=true, version=17.1.9, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, 
io.openshift.expose-services=, build-date=2025-07-21T14:48:37, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a-4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, 
name=rhosp17/openstack-nova-compute, batch=17.1_20250721.1, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, io.buildah.version=1.33.12, release=1) Oct 14 04:44:50 localhost systemd[1]: ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2.service: Deactivated successfully. Oct 14 04:44:50 localhost podman[89230]: 2025-10-14 08:44:50.836485607 +0000 UTC m=+0.178680371 container health_status 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, tcib_managed=true, io.buildah.version=1.33.12, managed_by=tripleo_ansible, batch=17.1_20250721.1, build-date=2025-07-21T16:28:53, release=1, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-type=git, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, container_name=ovn_metadata_agent, architecture=x86_64, maintainer=OpenStack TripleO Team, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.9, io.openshift.tags=rhosp osp openstack osp-17.1) Oct 14 04:44:50 localhost podman[89231]: 2025-10-14 08:44:50.844136484 +0000 UTC m=+0.184021597 container exec_died a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, managed_by=tripleo_ansible, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': 
['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, maintainer=OpenStack TripleO Team, version=17.1.9, architecture=x86_64, build-date=2025-07-21T13:28:44, container_name=ovn_controller, release=1, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, name=rhosp17/openstack-ovn-controller, batch=17.1_20250721.1, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, vendor=Red Hat, Inc.) Oct 14 04:44:50 localhost systemd[1]: a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5.service: Deactivated successfully. 
Oct 14 04:44:50 localhost podman[89230]: 2025-10-14 08:44:50.878401394 +0000 UTC m=+0.220596218 container exec_died 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, version=17.1.9, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.license_terms=https://www.redhat.com/agreements, batch=17.1_20250721.1, vendor=Red Hat, Inc., config_id=tripleo_step4, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, managed_by=tripleo_ansible, io.buildah.version=1.33.12, container_name=ovn_metadata_agent, build-date=2025-07-21T16:28:53, release=1, tcib_managed=true, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-type=git, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, architecture=x86_64, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}) Oct 14 04:44:50 localhost systemd[1]: 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e.service: Deactivated successfully. Oct 14 04:44:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6. Oct 14 04:44:56 localhost systemd[1]: tmp-crun.ln3zGe.mount: Deactivated successfully. 
Oct 14 04:44:56 localhost podman[89302]: 2025-10-14 08:44:56.757761632 +0000 UTC m=+0.095883988 container health_status 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, maintainer=OpenStack TripleO Team, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, managed_by=tripleo_ansible, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, build-date=2025-07-21T13:07:59, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., io.openshift.expose-services=, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, config_id=tripleo_step1, container_name=metrics_qdr, release=1, batch=17.1_20250721.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, com.redhat.component=openstack-qdrouterd-container, vcs-type=git, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, tcib_managed=true) Oct 14 04:44:56 localhost podman[89302]: 2025-10-14 08:44:56.966150201 +0000 UTC m=+0.304272607 container exec_died 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, distribution-scope=public, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, version=17.1.9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, release=1, config_id=tripleo_step1, container_name=metrics_qdr, 
io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, vcs-type=git, com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.33.12, build-date=2025-07-21T13:07:59, description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, batch=17.1_20250721.1, io.openshift.expose-services=) Oct 14 04:44:56 localhost systemd[1]: 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6.service: Deactivated successfully. Oct 14 04:45:07 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Oct 14 04:45:07 localhost recover_tripleo_nova_virtqemud[89348]: 62551 Oct 14 04:45:07 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Oct 14 04:45:07 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. Oct 14 04:45:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8. Oct 14 04:45:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea. 
Oct 14 04:45:12 localhost podman[89412]: 2025-10-14 08:45:12.742986551 +0000 UTC m=+0.081840074 container health_status 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, batch=17.1_20250721.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T13:04:03, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, architecture=x86_64, tcib_managed=true, io.buildah.version=1.33.12, 
maintainer=OpenStack TripleO Team, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, vendor=Red Hat, Inc., config_id=tripleo_step3, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, version=17.1.9, container_name=collectd, com.redhat.component=openstack-collectd-container, release=2, name=rhosp17/openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=) Oct 14 04:45:12 localhost podman[89412]: 2025-10-14 08:45:12.75264436 +0000 UTC m=+0.091497863 container exec_died 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, container_name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, build-date=2025-07-21T13:04:03, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, managed_by=tripleo_ansible, batch=17.1_20250721.1, name=rhosp17/openstack-collectd, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, summary=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, release=2, architecture=x86_64, distribution-scope=public, vcs-type=git, com.redhat.component=openstack-collectd-container) Oct 14 04:45:12 localhost podman[89413]: 2025-10-14 08:45:12.792083811 +0000 UTC m=+0.128081606 container health_status 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vcs-type=git, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, managed_by=tripleo_ansible, com.redhat.component=openstack-iscsid-container, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true, batch=17.1_20250721.1, name=rhosp17/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': 
'/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, distribution-scope=public, release=1, vendor=Red Hat, Inc., vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, build-date=2025-07-21T13:27:15, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, architecture=x86_64, config_id=tripleo_step3) Oct 14 04:45:12 localhost podman[89413]: 2025-10-14 08:45:12.801231554 +0000 UTC m=+0.137229379 container exec_died 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 
'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, description=Red Hat OpenStack Platform 17.1 iscsid, release=1, com.redhat.component=openstack-iscsid-container, build-date=2025-07-21T13:27:15, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, distribution-scope=public, io.buildah.version=1.33.12, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, tcib_managed=true, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, batch=17.1_20250721.1, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, name=rhosp17/openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, vcs-type=git, io.openshift.expose-services=, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, architecture=x86_64) Oct 14 04:45:12 localhost systemd[1]: 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea.service: Deactivated successfully. Oct 14 04:45:12 localhost systemd[1]: 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8.service: Deactivated successfully. 
Oct 14 04:45:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638. Oct 14 04:45:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884. Oct 14 04:45:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f. Oct 14 04:45:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e. Oct 14 04:45:18 localhost podman[89453]: 2025-10-14 08:45:18.766746748 +0000 UTC m=+0.103498284 container health_status 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.openshift.expose-services=, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, tcib_managed=true, build-date=2025-07-21T15:29:47, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., version=17.1.9, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=ceilometer_agent_ipmi, maintainer=OpenStack TripleO Team, architecture=x86_64, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-ceilometer-ipmi, batch=17.1_20250721.1, distribution-scope=public, release=1, vcs-type=git) Oct 14 04:45:18 localhost podman[89453]: 2025-10-14 08:45:18.900470767 +0000 UTC m=+0.237222333 container exec_died 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, version=17.1.9, container_name=ceilometer_agent_ipmi, tcib_managed=true, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, distribution-scope=public, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20250721.1, build-date=2025-07-21T15:29:47, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, config_id=tripleo_step4, managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-ipmi, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, com.redhat.license_terms=https://www.redhat.com/agreements) Oct 14 04:45:18 localhost systemd[1]: 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638.service: Deactivated successfully. 
Oct 14 04:45:18 localhost podman[89456]: 2025-10-14 08:45:18.901445367 +0000 UTC m=+0.225557991 container health_status def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, vcs-type=git, name=rhosp17/openstack-ceilometer-compute, config_id=tripleo_step4, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.expose-services=, version=17.1.9, io.buildah.version=1.33.12, release=1, maintainer=OpenStack TripleO Team, batch=17.1_20250721.1, build-date=2025-07-21T14:45:33, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}) Oct 14 04:45:18 localhost podman[89454]: 2025-10-14 08:45:18.964961953 +0000 UTC m=+0.298980304 container health_status 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, architecture=x86_64, vcs-type=git, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhosp17/openstack-nova-compute, vendor=Red Hat, Inc., batch=17.1_20250721.1, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, config_id=tripleo_step4, container_name=nova_migration_target, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, release=1, build-date=2025-07-21T14:48:37, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public) Oct 14 04:45:18 localhost podman[89455]: 2025-10-14 08:45:18.928086522 +0000 UTC m=+0.258100119 container health_status d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, release=1, vendor=Red Hat, Inc., batch=17.1_20250721.1, io.buildah.version=1.33.12, tcib_managed=true, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1, container_name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, version=17.1.9, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=openstack-cron-container, description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': 
{'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, config_id=tripleo_step4, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, build-date=2025-07-21T13:07:52, name=rhosp17/openstack-cron, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, vcs-type=git) Oct 14 04:45:18 localhost podman[89456]: 2025-10-14 08:45:18.98747452 +0000 UTC m=+0.311587144 container exec_died def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, tcib_managed=true, version=17.1.9, vendor=Red Hat, Inc., vcs-type=git, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 
'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, config_id=tripleo_step4, container_name=ceilometer_agent_compute, release=1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-07-21T14:45:33, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-ceilometer-compute, io.buildah.version=1.33.12, com.redhat.component=openstack-ceilometer-compute-container, maintainer=OpenStack TripleO Team, batch=17.1_20250721.1, managed_by=tripleo_ansible) Oct 14 04:45:18 localhost systemd[1]: def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e.service: Deactivated successfully. 
Oct 14 04:45:19 localhost podman[89455]: 2025-10-14 08:45:19.016182378 +0000 UTC m=+0.346195935 container exec_died d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-cron-container, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, build-date=2025-07-21T13:07:52, io.openshift.tags=rhosp osp openstack osp-17.1, version=17.1.9, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., config_id=tripleo_step4, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, batch=17.1_20250721.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 
cron, tcib_managed=true, container_name=logrotate_crond, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, io.buildah.version=1.33.12, name=rhosp17/openstack-cron, com.redhat.license_terms=https://www.redhat.com/agreements, release=1) Oct 14 04:45:19 localhost systemd[1]: d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f.service: Deactivated successfully. Oct 14 04:45:19 localhost podman[89454]: 2025-10-14 08:45:19.345544252 +0000 UTC m=+0.679562563 container exec_died 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, com.redhat.license_terms=https://www.redhat.com/agreements, batch=17.1_20250721.1, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, build-date=2025-07-21T14:48:37, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step4, 
vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, version=17.1.9, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, architecture=x86_64, distribution-scope=public, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.33.12, name=rhosp17/openstack-nova-compute, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1, container_name=nova_migration_target, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, tcib_managed=true, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute) Oct 14 04:45:19 localhost systemd[1]: 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884.service: Deactivated successfully. Oct 14 04:45:19 localhost systemd[1]: tmp-crun.yrQXCw.mount: Deactivated successfully. Oct 14 04:45:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e. Oct 14 04:45:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5. Oct 14 04:45:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2. Oct 14 04:45:21 localhost systemd[1]: tmp-crun.P2TFIT.mount: Deactivated successfully. 
Oct 14 04:45:21 localhost podman[89550]: 2025-10-14 08:45:21.751169903 +0000 UTC m=+0.088137169 container health_status a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, build-date=2025-07-21T13:28:44, description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, batch=17.1_20250721.1, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, version=17.1.9, managed_by=tripleo_ansible, name=rhosp17/openstack-ovn-controller, tcib_managed=true, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, vendor=Red Hat, Inc., distribution-scope=public, container_name=ovn_controller, com.redhat.component=openstack-ovn-controller-container, vcs-type=git, architecture=x86_64, release=1, config_id=tripleo_step4, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller) Oct 14 04:45:21 localhost podman[89551]: 2025-10-14 08:45:21.756036633 +0000 
UTC m=+0.087164659 container health_status ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, build-date=2025-07-21T14:48:37, distribution-scope=public, managed_by=tripleo_ansible, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a-4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_compute, io.buildah.version=1.33.12, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, version=17.1.9, release=1, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, batch=17.1_20250721.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step5, com.redhat.license_terms=https://www.redhat.com/agreements) Oct 14 04:45:21 localhost podman[89550]: 2025-10-14 08:45:21.781388168 +0000 UTC m=+0.118355494 container exec_died a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, container_name=ovn_controller, com.redhat.component=openstack-ovn-controller-container, release=1, architecture=x86_64, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, build-date=2025-07-21T13:28:44, maintainer=OpenStack TripleO Team, io.buildah.version=1.33.12, batch=17.1_20250721.1, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/agreements, managed_by=tripleo_ansible, distribution-scope=public, version=17.1.9, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, 
config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vendor=Red Hat, Inc., tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, vcs-type=git, io.openshift.expose-services=) Oct 14 04:45:21 localhost systemd[1]: a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5.service: Deactivated successfully. Oct 14 04:45:21 localhost podman[89549]: 2025-10-14 08:45:21.8373561 +0000 UTC m=+0.174020737 container health_status 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, vendor=Red Hat, Inc., io.openshift.expose-services=, managed_by=tripleo_ansible, container_name=ovn_metadata_agent, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, tcib_managed=true, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, build-date=2025-07-21T16:28:53, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20250721.1, config_id=tripleo_step4, release=1, architecture=x86_64) Oct 14 04:45:21 localhost podman[89551]: 2025-10-14 08:45:21.85933279 +0000 UTC m=+0.190460896 container exec_died 
ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., container_name=nova_compute, version=17.1.9, release=1, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.33.12, architecture=x86_64, batch=17.1_20250721.1, vcs-type=git, build-date=2025-07-21T14:48:37, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step5, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a-4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute) Oct 14 04:45:21 localhost systemd[1]: ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2.service: Deactivated successfully. 
Oct 14 04:45:21 localhost podman[89549]: 2025-10-14 08:45:21.888113661 +0000 UTC m=+0.224778318 container exec_died 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, tcib_managed=true, batch=17.1_20250721.1, config_id=tripleo_step4, version=17.1.9, build-date=2025-07-21T16:28:53, io.buildah.version=1.33.12, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', 
'/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, release=1, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, distribution-scope=public, architecture=x86_64, io.openshift.expose-services=, maintainer=OpenStack TripleO Team) Oct 14 04:45:21 localhost systemd[1]: 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e.service: Deactivated successfully. Oct 14 04:45:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6. Oct 14 04:45:27 localhost systemd[1]: tmp-crun.vcdTFe.mount: Deactivated successfully. 
Oct 14 04:45:27 localhost podman[89646]: 2025-10-14 08:45:27.760527735 +0000 UTC m=+0.104369831 container health_status 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.buildah.version=1.33.12, vcs-type=git, build-date=2025-07-21T13:07:59, distribution-scope=public, version=17.1.9, com.redhat.component=openstack-qdrouterd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20250721.1, managed_by=tripleo_ansible, release=1, container_name=metrics_qdr, vendor=Red Hat, Inc., tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, 
config_id=tripleo_step1, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-qdrouterd, maintainer=OpenStack TripleO Team, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed) Oct 14 04:45:27 localhost podman[89646]: 2025-10-14 08:45:27.989218042 +0000 UTC m=+0.333060148 container exec_died 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-07-21T13:07:59, distribution-scope=public, io.openshift.expose-services=, batch=17.1_20250721.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc., 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, managed_by=tripleo_ansible, name=rhosp17/openstack-qdrouterd, release=1, description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, tcib_managed=true, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.9, container_name=metrics_qdr, vcs-type=git) Oct 14 04:45:28 localhost systemd[1]: 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6.service: Deactivated successfully. Oct 14 04:45:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8. Oct 14 04:45:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea. Oct 14 04:45:43 localhost systemd[1]: tmp-crun.R9qQMz.mount: Deactivated successfully. 
Oct 14 04:45:43 localhost podman[89698]: 2025-10-14 08:45:43.786346021 +0000 UTC m=+0.123899996 container health_status 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, release=2, architecture=x86_64, version=17.1.9, build-date=2025-07-21T13:04:03, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, container_name=collectd, io.openshift.expose-services=, vendor=Red Hat, Inc., config_id=tripleo_step3, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 collectd, 
com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, vcs-type=git, tcib_managed=true, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, com.redhat.component=openstack-collectd-container, managed_by=tripleo_ansible, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b) Oct 14 04:45:43 localhost podman[89698]: 2025-10-14 08:45:43.792927534 +0000 UTC m=+0.130481519 container exec_died 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, build-date=2025-07-21T13:04:03, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, container_name=collectd, config_id=tripleo_step3, version=17.1.9, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, tcib_managed=true, distribution-scope=public, name=rhosp17/openstack-collectd, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, description=Red Hat OpenStack Platform 17.1 collectd, release=2, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., architecture=x86_64, batch=17.1_20250721.1, com.redhat.component=openstack-collectd-container) Oct 14 04:45:43 localhost systemd[1]: 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8.service: Deactivated successfully. 
Oct 14 04:45:43 localhost podman[89699]: 2025-10-14 08:45:43.873730095 +0000 UTC m=+0.207968138 container health_status 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, io.buildah.version=1.33.12, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, version=17.1.9, com.redhat.component=openstack-iscsid-container, config_id=tripleo_step3, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, build-date=2025-07-21T13:27:15, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, name=rhosp17/openstack-iscsid, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, com.redhat.license_terms=https://www.redhat.com/agreements, 
tcib_managed=true, vcs-type=git, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20250721.1, container_name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, release=1, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team) Oct 14 04:45:43 localhost podman[89699]: 2025-10-14 08:45:43.912178575 +0000 UTC m=+0.246416638 container exec_died 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vcs-type=git, build-date=2025-07-21T13:27:15, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, batch=17.1_20250721.1, version=17.1.9, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', 
'/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, container_name=iscsid, tcib_managed=true, io.buildah.version=1.33.12, distribution-scope=public, release=1, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-iscsid-container, description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid) Oct 14 04:45:43 localhost systemd[1]: 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea.service: Deactivated successfully. Oct 14 04:45:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638. Oct 14 04:45:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884. Oct 14 04:45:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f. Oct 14 04:45:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e. 
Oct 14 04:45:49 localhost podman[89739]: 2025-10-14 08:45:49.736762107 +0000 UTC m=+0.071360450 container health_status 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, version=17.1.9, architecture=x86_64, tcib_managed=true, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, batch=17.1_20250721.1, io.openshift.tags=rhosp osp openstack osp-17.1, container_name=nova_migration_target, config_id=tripleo_step4, distribution-scope=public, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, build-date=2025-07-21T14:48:37, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp17/openstack-nova-compute, vendor=Red Hat, Inc., io.openshift.expose-services=, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team) Oct 14 04:45:49 localhost systemd[1]: tmp-crun.7SuDEY.mount: Deactivated successfully. Oct 14 04:45:49 localhost podman[89740]: 2025-10-14 08:45:49.76077677 +0000 UTC m=+0.089887203 container health_status d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, distribution-scope=public, vcs-type=git, managed_by=tripleo_ansible, release=1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true, version=17.1.9, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-cron-container, architecture=x86_64, io.buildah.version=1.33.12, name=rhosp17/openstack-cron, vendor=Red Hat, Inc., build-date=2025-07-21T13:07:52, container_name=logrotate_crond, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, io.openshift.expose-services=) Oct 14 04:45:49 localhost podman[89740]: 2025-10-14 08:45:49.793824823 +0000 UTC m=+0.122935336 container exec_died d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, vcs-type=git, version=17.1.9, architecture=x86_64, release=1, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, container_name=logrotate_crond, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.component=openstack-cron-container, io.buildah.version=1.33.12, name=rhosp17/openstack-cron, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 cron, build-date=2025-07-21T13:07:52) Oct 14 04:45:49 localhost systemd[1]: d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f.service: Deactivated successfully. 
Oct 14 04:45:49 localhost podman[89741]: 2025-10-14 08:45:49.80795344 +0000 UTC m=+0.136356851 container health_status def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, name=rhosp17/openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, container_name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, io.buildah.version=1.33.12, vcs-type=git, release=1, architecture=x86_64, maintainer=OpenStack TripleO Team, distribution-scope=public, version=17.1.9, vendor=Red Hat, Inc., build-date=2025-07-21T14:45:33, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, tcib_managed=true, batch=17.1_20250721.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container) Oct 14 04:45:49 localhost podman[89738]: 2025-10-14 08:45:49.858109512 +0000 UTC m=+0.192208630 container health_status 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=ceilometer_agent_ipmi, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, release=1, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-ipmi-container, io.buildah.version=1.33.12, managed_by=tripleo_ansible, build-date=2025-07-21T15:29:47, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, batch=17.1_20250721.1, tcib_managed=true, vcs-type=git, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, version=17.1.9, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi) Oct 14 04:45:49 localhost podman[89738]: 2025-10-14 08:45:49.879828325 +0000 UTC m=+0.213927393 container exec_died 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step4, container_name=ceilometer_agent_ipmi, build-date=2025-07-21T15:29:47, managed_by=tripleo_ansible, tcib_managed=true, com.redhat.component=openstack-ceilometer-ipmi-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20250721.1, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, 
vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, version=17.1.9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp17/openstack-ceilometer-ipmi, distribution-scope=public) Oct 14 04:45:49 localhost systemd[1]: 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638.service: Deactivated successfully. 
Oct 14 04:45:49 localhost podman[89741]: 2025-10-14 08:45:49.910745591 +0000 UTC m=+0.239149012 container exec_died def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, com.redhat.component=openstack-ceilometer-compute-container, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_compute, io.openshift.expose-services=, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20250721.1, distribution-scope=public, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/agreements, 
name=rhosp17/openstack-ceilometer-compute, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, version=17.1.9, build-date=2025-07-21T14:45:33, vendor=Red Hat, Inc., architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute) Oct 14 04:45:49 localhost systemd[1]: def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e.service: Deactivated successfully. Oct 14 04:45:50 localhost podman[89739]: 2025-10-14 08:45:50.125829267 +0000 UTC m=+0.460427620 container exec_died 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, tcib_managed=true, maintainer=OpenStack TripleO Team, container_name=nova_migration_target, managed_by=tripleo_ansible, release=1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-07-21T14:48:37, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.33.12, name=rhosp17/openstack-nova-compute, version=17.1.9, architecture=x86_64) Oct 14 04:45:50 localhost systemd[1]: 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884.service: Deactivated successfully. Oct 14 04:45:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e. Oct 14 04:45:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5. Oct 14 04:45:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2. 
Oct 14 04:45:52 localhost podman[89832]: 2025-10-14 08:45:52.748714284 +0000 UTC m=+0.086652323 container health_status 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, build-date=2025-07-21T16:28:53, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, io.buildah.version=1.33.12, 
com.redhat.license_terms=https://www.redhat.com/agreements, container_name=ovn_metadata_agent, release=1, batch=17.1_20250721.1, vcs-type=git, vendor=Red Hat, Inc., architecture=x86_64, config_id=tripleo_step4, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, io.openshift.expose-services=, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.9, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3) Oct 14 04:45:52 localhost podman[89834]: 2025-10-14 08:45:52.813815608 +0000 UTC m=+0.144901656 container health_status ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a-4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, config_id=tripleo_step5, version=17.1.9, io.buildah.version=1.33.12, name=rhosp17/openstack-nova-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, release=1, build-date=2025-07-21T14:48:37, container_name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, vcs-type=git, batch=17.1_20250721.1, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container) Oct 14 04:45:52 localhost podman[89832]: 2025-10-14 08:45:52.834643262 +0000 UTC m=+0.172581321 container exec_died 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e 
(image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, version=17.1.9, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, build-date=2025-07-21T16:28:53, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, tcib_managed=true, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, architecture=x86_64, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vendor=Red Hat, Inc., batch=17.1_20250721.1, container_name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, release=1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3) Oct 14 04:45:52 localhost systemd[1]: 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e.service: Deactivated successfully. Oct 14 04:45:52 localhost podman[89834]: 2025-10-14 08:45:52.861152223 +0000 UTC m=+0.192238281 container exec_died ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a-4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20250721.1, build-date=2025-07-21T14:48:37, config_id=tripleo_step5, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1, container_name=nova_compute, version=17.1.9, io.buildah.version=1.33.12, io.openshift.expose-services=, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, distribution-scope=public, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute) Oct 14 04:45:52 localhost systemd[1]: ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2.service: Deactivated successfully. 
Oct 14 04:45:52 localhost podman[89833]: 2025-10-14 08:45:52.732653596 +0000 UTC m=+0.068136919 container health_status a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20250721.1, build-date=2025-07-21T13:28:44, config_id=tripleo_step4, vendor=Red Hat, Inc., io.openshift.expose-services=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, name=rhosp17/openstack-ovn-controller, release=1, vcs-type=git, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, tcib_managed=true, container_name=ovn_controller, version=17.1.9, com.redhat.component=openstack-ovn-controller-container, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1) Oct 14 04:45:52 localhost podman[89833]: 2025-10-14 08:45:52.912996837 +0000 
UTC m=+0.248480160 container exec_died a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, vcs-type=git, io.buildah.version=1.33.12, config_id=tripleo_step4, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20250721.1, tcib_managed=true, com.redhat.component=openstack-ovn-controller-container, container_name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1, version=17.1.9, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, managed_by=tripleo_ansible, name=rhosp17/openstack-ovn-controller, release=1, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-07-21T13:28:44, maintainer=OpenStack TripleO Team) Oct 14 04:45:52 localhost systemd[1]: a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5.service: Deactivated successfully. 
Oct 14 04:45:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6. Oct 14 04:45:58 localhost podman[89906]: 2025-10-14 08:45:58.743200386 +0000 UTC m=+0.083327040 container health_status 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, distribution-scope=public, container_name=metrics_qdr, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, build-date=2025-07-21T13:07:59, release=1, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.33.12, name=rhosp17/openstack-qdrouterd, tcib_managed=true, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step1, io.openshift.expose-services=, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, batch=17.1_20250721.1, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc.) Oct 14 04:45:58 localhost podman[89906]: 2025-10-14 08:45:58.9753534 +0000 UTC m=+0.315480014 container exec_died 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-07-21T13:07:59, com.redhat.component=openstack-qdrouterd-container, vendor=Red Hat, Inc., config_id=tripleo_step1, name=rhosp17/openstack-qdrouterd, architecture=x86_64, batch=17.1_20250721.1, tcib_managed=true, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1, container_name=metrics_qdr, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, managed_by=tripleo_ansible, vcs-type=git, io.openshift.expose-services=, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 qdrouterd) Oct 14 04:45:58 localhost systemd[1]: 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6.service: Deactivated successfully. Oct 14 04:46:09 localhost systemd[1]: tmp-crun.cL9Nh2.mount: Deactivated successfully. 
Oct 14 04:46:09 localhost podman[90037]: 2025-10-14 08:46:09.908805266 +0000 UTC m=+0.086045213 container exec b19a91314e045cbe5465f6abdeb6b420108fcda7860b498b01dcd4e65236d2c8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-crash-np0005486733, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, io.openshift.tags=rhceph ceph, RELEASE=main, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, version=7, build-date=2025-09-24T08:57:55, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, distribution-scope=public, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/agreements, ceph=True, maintainer=Guillaume Abrioux , CEPH_POINT_RELEASE=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, name=rhceph, release=553) Oct 14 04:46:10 localhost podman[90037]: 2025-10-14 08:46:10.036189499 +0000 UTC m=+0.213429436 container exec_died b19a91314e045cbe5465f6abdeb6b420108fcda7860b498b01dcd4e65236d2c8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-crash-np0005486733, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, io.buildah.version=1.33.12, 
io.openshift.tags=rhceph ceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-09-24T08:57:55, ceph=True, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, GIT_CLEAN=True, io.openshift.expose-services=, release=553, GIT_BRANCH=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, maintainer=Guillaume Abrioux , version=7, architecture=x86_64, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=) Oct 14 04:46:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8. Oct 14 04:46:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea. Oct 14 04:46:14 localhost podman[90186]: 2025-10-14 08:46:14.73786106 +0000 UTC m=+0.072479335 container health_status 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, version=17.1.9, summary=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20250721.1, io.openshift.expose-services=, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, com.redhat.component=openstack-iscsid-container, container_name=iscsid, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 
'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, build-date=2025-07-21T13:27:15, io.buildah.version=1.33.12, release=1, distribution-scope=public, maintainer=OpenStack TripleO Team) Oct 14 04:46:14 localhost podman[90186]: 2025-10-14 08:46:14.755134124 +0000 UTC m=+0.089752429 container exec_died 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, build-date=2025-07-21T13:27:15, version=17.1.9, batch=17.1_20250721.1, container_name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 
'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, config_id=tripleo_step3, com.redhat.component=openstack-iscsid-container, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, release=1, tcib_managed=true, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, managed_by=tripleo_ansible) Oct 14 04:46:14 localhost systemd[1]: 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea.service: Deactivated successfully. Oct 14 04:46:14 localhost systemd[1]: tmp-crun.8I4my9.mount: Deactivated successfully. 
Oct 14 04:46:14 localhost podman[90185]: 2025-10-14 08:46:14.848124692 +0000 UTC m=+0.182212581 container health_status 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, version=17.1.9, name=rhosp17/openstack-collectd, batch=17.1_20250721.1, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 collectd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, build-date=2025-07-21T13:04:03, com.redhat.component=openstack-collectd-container, container_name=collectd, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, release=2, vendor=Red Hat, Inc., config_id=tripleo_step3, io.buildah.version=1.33.12, tcib_managed=true, architecture=x86_64, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}) Oct 14 04:46:14 localhost podman[90185]: 2025-10-14 08:46:14.861211737 +0000 UTC m=+0.195299696 container exec_died 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1, description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., com.redhat.component=openstack-collectd-container, config_id=tripleo_step3, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, tcib_managed=true, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, batch=17.1_20250721.1, distribution-scope=public, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, release=2, summary=Red Hat OpenStack Platform 17.1 collectd, version=17.1.9, architecture=x86_64, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, build-date=2025-07-21T13:04:03, io.buildah.version=1.33.12) Oct 14 04:46:14 localhost systemd[1]: 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8.service: Deactivated successfully. Oct 14 04:46:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638. Oct 14 04:46:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884. Oct 14 04:46:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f. Oct 14 04:46:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e. Oct 14 04:46:20 localhost systemd[1]: tmp-crun.sTVS9W.mount: Deactivated successfully. 
Oct 14 04:46:20 localhost systemd[1]: tmp-crun.tNsXHH.mount: Deactivated successfully. Oct 14 04:46:20 localhost podman[90225]: 2025-10-14 08:46:20.737077548 +0000 UTC m=+0.074455156 container health_status d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, release=1, config_id=tripleo_step4, distribution-scope=public, batch=17.1_20250721.1, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, vendor=Red Hat, Inc., managed_by=tripleo_ansible, version=17.1.9, build-date=2025-07-21T13:07:52, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, com.redhat.component=openstack-cron-container, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.tags=rhosp osp openstack osp-17.1, container_name=logrotate_crond, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, name=rhosp17/openstack-cron) Oct 14 04:46:20 localhost podman[90223]: 2025-10-14 08:46:20.73811508 +0000 UTC m=+0.080992968 container health_status 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, batch=17.1_20250721.1, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, release=1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step4, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.33.12, build-date=2025-07-21T15:29:47, vcs-type=git, distribution-scope=public, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, version=17.1.9, architecture=x86_64, maintainer=OpenStack TripleO Team) Oct 14 04:46:20 localhost podman[90225]: 2025-10-14 08:46:20.77141327 +0000 UTC m=+0.108790958 container exec_died d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, tcib_managed=true, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, config_id=tripleo_step4, distribution-scope=public, vcs-type=git, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T13:07:52, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, batch=17.1_20250721.1, maintainer=OpenStack TripleO Team, release=1, summary=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 
'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, com.redhat.component=openstack-cron-container, container_name=logrotate_crond, name=rhosp17/openstack-cron, version=17.1.9) Oct 14 04:46:20 localhost systemd[1]: d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f.service: Deactivated successfully. 
Oct 14 04:46:20 localhost podman[90226]: 2025-10-14 08:46:20.790334306 +0000 UTC m=+0.126922080 container health_status def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.openshift.expose-services=, release=1, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-compute-container, config_id=tripleo_step4, io.buildah.version=1.33.12, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp17/openstack-ceilometer-compute, version=17.1.9, batch=17.1_20250721.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, maintainer=OpenStack TripleO Team, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, build-date=2025-07-21T14:45:33, distribution-scope=public, managed_by=tripleo_ansible, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements) Oct 14 04:46:20 localhost podman[90224]: 2025-10-14 08:46:20.843843142 +0000 UTC m=+0.181822118 container health_status 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-nova-compute, vendor=Red Hat, Inc., 
com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, tcib_managed=true, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, build-date=2025-07-21T14:48:37, container_name=nova_migration_target, version=17.1.9, maintainer=OpenStack TripleO Team, release=1, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20250721.1, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, architecture=x86_64) Oct 14 04:46:20 localhost podman[90223]: 2025-10-14 08:46:20.867889506 +0000 UTC m=+0.210767364 container exec_died 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, maintainer=OpenStack TripleO Team, architecture=x86_64, release=1, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.33.12, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, vcs-type=git, batch=17.1_20250721.1, build-date=2025-07-21T15:29:47, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, tcib_managed=true, version=17.1.9, container_name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, io.openshift.expose-services=, distribution-scope=public) Oct 14 04:46:20 localhost systemd[1]: 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638.service: Deactivated successfully. 
Oct 14 04:46:20 localhost podman[90226]: 2025-10-14 08:46:20.920079111 +0000 UTC m=+0.256666875 container exec_died def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, release=1, build-date=2025-07-21T14:45:33, config_id=tripleo_step4, io.buildah.version=1.33.12, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, batch=17.1_20250721.1, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.9, summary=Red Hat OpenStack Platform 17.1 
ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, container_name=ceilometer_agent_compute, name=rhosp17/openstack-ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, tcib_managed=true, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, distribution-scope=public) Oct 14 04:46:20 localhost systemd[1]: def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e.service: Deactivated successfully. Oct 14 04:46:21 localhost podman[90224]: 2025-10-14 08:46:21.214041539 +0000 UTC m=+0.552020515 container exec_died 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, io.buildah.version=1.33.12, release=1, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, vcs-type=git, architecture=x86_64, container_name=nova_migration_target, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, build-date=2025-07-21T14:48:37, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, vendor=Red Hat, Inc.) Oct 14 04:46:21 localhost systemd[1]: 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884.service: Deactivated successfully. Oct 14 04:46:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e. Oct 14 04:46:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5. Oct 14 04:46:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2. Oct 14 04:46:23 localhost systemd[1]: tmp-crun.ecceyK.mount: Deactivated successfully. 
Oct 14 04:46:23 localhost podman[90319]: 2025-10-14 08:46:23.749599669 +0000 UTC m=+0.088028684 container health_status 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, release=1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, name=rhosp17/openstack-neutron-metadata-agent-ovn, version=17.1.9, io.buildah.version=1.33.12, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20250721.1, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, vcs-type=git, build-date=2025-07-21T16:28:53, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=ovn_metadata_agent, vendor=Red Hat, Inc., io.openshift.expose-services=, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3) Oct 14 04:46:23 localhost systemd[1]: tmp-crun.bHNAkN.mount: Deactivated successfully. Oct 14 04:46:23 localhost podman[90321]: 2025-10-14 08:46:23.768040341 +0000 UTC m=+0.094884668 container health_status ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, config_id=tripleo_step5, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, architecture=x86_64, io.openshift.expose-services=, managed_by=tripleo_ansible, build-date=2025-07-21T14:48:37, com.redhat.component=openstack-nova-compute-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-nova-compute, release=1, distribution-scope=public, version=17.1.9, batch=17.1_20250721.1, 
com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a-4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true) Oct 14 04:46:23 localhost podman[90321]: 2025-10-14 08:46:23.794175609 +0000 UTC 
m=+0.121019946 container exec_died ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a-4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.9, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, 
batch=17.1_20250721.1, distribution-scope=public, config_id=tripleo_step5, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, name=rhosp17/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, vendor=Red Hat, Inc., release=1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, architecture=x86_64, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, container_name=nova_compute, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, build-date=2025-07-21T14:48:37) Oct 14 04:46:23 localhost systemd[1]: ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2.service: Deactivated successfully. 
Oct 14 04:46:23 localhost podman[90319]: 2025-10-14 08:46:23.84626716 +0000 UTC m=+0.184696185 container exec_died 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, build-date=2025-07-21T16:28:53, config_id=tripleo_step4, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, architecture=x86_64, container_name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, maintainer=OpenStack TripleO Team, release=1, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, version=17.1.9, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', 
'/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, distribution-scope=public, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/agreements, batch=17.1_20250721.1, vendor=Red Hat, Inc.) Oct 14 04:46:23 localhost systemd[1]: 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e.service: Deactivated successfully. Oct 14 04:46:23 localhost podman[90320]: 2025-10-14 08:46:23.854615489 +0000 UTC m=+0.186552254 container health_status a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, config_id=tripleo_step4, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, batch=17.1_20250721.1, io.buildah.version=1.33.12, name=rhosp17/openstack-ovn-controller, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', 
'/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, version=17.1.9, managed_by=tripleo_ansible, architecture=x86_64, build-date=2025-07-21T13:28:44, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, release=1, vcs-type=git) Oct 14 04:46:23 localhost podman[90320]: 2025-10-14 08:46:23.937298378 +0000 UTC m=+0.269235143 container exec_died a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, distribution-scope=public, com.redhat.component=openstack-ovn-controller-container, vcs-type=git, container_name=ovn_controller, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.33.12, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, batch=17.1_20250721.1, build-date=2025-07-21T13:28:44, name=rhosp17/openstack-ovn-controller, tcib_managed=true, architecture=x86_64, io.openshift.expose-services=, release=1, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, managed_by=tripleo_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, vendor=Red Hat, Inc.) Oct 14 04:46:23 localhost systemd[1]: a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5.service: Deactivated successfully. Oct 14 04:46:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6. 
Oct 14 04:46:29 localhost podman[90414]: 2025-10-14 08:46:29.769191077 +0000 UTC m=+0.109562852 container health_status 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, maintainer=OpenStack TripleO Team, container_name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.9, io.openshift.expose-services=, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step1, architecture=x86_64, build-date=2025-07-21T13:07:59, description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, release=1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, distribution-scope=public, 
vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, name=rhosp17/openstack-qdrouterd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, batch=17.1_20250721.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd) Oct 14 04:46:29 localhost podman[90414]: 2025-10-14 08:46:29.976247565 +0000 UTC m=+0.316619320 container exec_died 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, build-date=2025-07-21T13:07:59, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, vendor=Red Hat, Inc., config_id=tripleo_step1, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20250721.1, container_name=metrics_qdr, io.buildah.version=1.33.12, name=rhosp17/openstack-qdrouterd, release=1, version=17.1.9, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, architecture=x86_64) Oct 14 04:46:29 localhost systemd[1]: 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6.service: Deactivated successfully. Oct 14 04:46:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8. Oct 14 04:46:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea. 
Oct 14 04:46:45 localhost podman[90443]: 2025-10-14 08:46:45.741496678 +0000 UTC m=+0.082857984 container health_status 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.buildah.version=1.33.12, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 collectd, release=2, version=17.1.9, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, vendor=Red Hat, Inc., com.redhat.component=openstack-collectd-container, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, vcs-type=git, name=rhosp17/openstack-collectd, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/agreements, tcib_managed=true, container_name=collectd, distribution-scope=public, managed_by=tripleo_ansible, build-date=2025-07-21T13:04:03, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, config_id=tripleo_step3) Oct 14 04:46:45 localhost podman[90443]: 2025-10-14 08:46:45.752336514 +0000 UTC m=+0.093697830 container exec_died 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, batch=17.1_20250721.1, com.redhat.component=openstack-collectd-container, release=2, io.buildah.version=1.33.12, tcib_managed=true, build-date=2025-07-21T13:04:03, io.openshift.tags=rhosp osp openstack osp-17.1, io.openshift.expose-services=, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, name=rhosp17/openstack-collectd, version=17.1.9, vendor=Red Hat, Inc., architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 collectd) Oct 14 04:46:45 localhost systemd[1]: 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8.service: Deactivated successfully. 
Oct 14 04:46:45 localhost podman[90444]: 2025-10-14 08:46:45.79196746 +0000 UTC m=+0.131788839 container health_status 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, com.redhat.component=openstack-iscsid-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, release=1, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., version=17.1.9, io.buildah.version=1.33.12, vcs-type=git, build-date=2025-07-21T13:27:15, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, container_name=iscsid, batch=17.1_20250721.1, distribution-scope=public, tcib_managed=true, managed_by=tripleo_ansible, name=rhosp17/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, 
com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, config_id=tripleo_step3, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid) Oct 14 04:46:45 localhost podman[90444]: 2025-10-14 08:46:45.828194212 +0000 UTC m=+0.168015591 container exec_died 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, name=rhosp17/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.license_terms=https://www.redhat.com/agreements, release=1, com.redhat.component=openstack-iscsid-container, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, container_name=iscsid, build-date=2025-07-21T13:27:15, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, version=17.1.9, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, maintainer=OpenStack TripleO Team, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, config_id=tripleo_step3, architecture=x86_64, tcib_managed=true) Oct 14 04:46:45 localhost systemd[1]: 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea.service: Deactivated successfully. Oct 14 04:46:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638. Oct 14 04:46:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884. Oct 14 04:46:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f. Oct 14 04:46:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e. Oct 14 04:46:51 localhost systemd[1]: tmp-crun.9uZkYo.mount: Deactivated successfully. 
Oct 14 04:46:51 localhost podman[90481]: 2025-10-14 08:46:51.751925074 +0000 UTC m=+0.090941796 container health_status 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, container_name=ceilometer_agent_ipmi, build-date=2025-07-21T15:29:47, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., name=rhosp17/openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20250721.1, io.buildah.version=1.33.12, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, distribution-scope=public, 
com.redhat.component=openstack-ceilometer-ipmi-container, release=1, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, maintainer=OpenStack TripleO Team, version=17.1.9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, vcs-type=git) Oct 14 04:46:51 localhost podman[90481]: 2025-10-14 08:46:51.838104022 +0000 UTC m=+0.177120744 container exec_died 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, vendor=Red Hat, Inc., io.buildah.version=1.33.12, name=rhosp17/openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, architecture=x86_64, version=17.1.9, vcs-type=git, com.redhat.component=openstack-ceilometer-ipmi-container, container_name=ceilometer_agent_ipmi, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T15:29:47, tcib_managed=true, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1) Oct 14 04:46:51 localhost podman[90488]: 2025-10-14 08:46:51.85584605 +0000 UTC m=+0.181914791 container health_status d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.33.12, name=rhosp17/openstack-cron, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/agreements, batch=17.1_20250721.1, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, architecture=x86_64, release=1, config_id=tripleo_step4, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 cron, version=17.1.9, container_name=logrotate_crond, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, build-date=2025-07-21T13:07:52, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container) Oct 14 04:46:51 localhost podman[90489]: 2025-10-14 08:46:51.778761184 +0000 UTC m=+0.104208266 container health_status def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, name=rhosp17/openstack-ceilometer-compute, vcs-type=git, config_id=tripleo_step4, distribution-scope=public, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, managed_by=tripleo_ansible, architecture=x86_64, build-date=2025-07-21T14:45:33, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1, 
com.redhat.component=openstack-ceilometer-compute-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, version=17.1.9, batch=17.1_20250721.1) Oct 14 04:46:51 localhost podman[90488]: 2025-10-14 08:46:51.890003417 +0000 UTC m=+0.216072138 container exec_died d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, tcib_managed=true, name=rhosp17/openstack-cron, release=1, managed_by=tripleo_ansible, version=17.1.9, distribution-scope=public, vcs-type=git, batch=17.1_20250721.1, summary=Red Hat OpenStack Platform 17.1 cron, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, 
config_id=tripleo_step4, container_name=logrotate_crond, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.33.12, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.expose-services=, build-date=2025-07-21T13:07:52, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1) Oct 14 04:46:51 localhost podman[90482]: 2025-10-14 08:46:51.836413309 +0000 UTC m=+0.167627419 container health_status 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, 
health_status=healthy, name=rhosp17/openstack-nova-compute, build-date=2025-07-21T14:48:37, distribution-scope=public, io.buildah.version=1.33.12, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, architecture=x86_64, maintainer=OpenStack TripleO Team, release=1, container_name=nova_migration_target, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step4, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, batch=17.1_20250721.1, version=17.1.9, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, 
managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/agreements) Oct 14 04:46:51 localhost systemd[1]: d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f.service: Deactivated successfully. Oct 14 04:46:51 localhost podman[90489]: 2025-10-14 08:46:51.912111602 +0000 UTC m=+0.237558724 container exec_died def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, name=rhosp17/openstack-ceilometer-compute, vcs-type=git, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, release=1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1, container_name=ceilometer_agent_compute, tcib_managed=true, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, build-date=2025-07-21T14:45:33, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20250721.1) Oct 14 04:46:51 localhost systemd[1]: def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e.service: Deactivated successfully. Oct 14 04:46:51 localhost systemd[1]: 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638.service: Deactivated successfully. Oct 14 04:46:52 localhost podman[90482]: 2025-10-14 08:46:52.230232317 +0000 UTC m=+0.561446507 container exec_died 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.buildah.version=1.33.12, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20250721.1, release=1, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc., build-date=2025-07-21T14:48:37, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, container_name=nova_migration_target, distribution-scope=public, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/agreements, version=17.1.9, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, vcs-type=git) Oct 14 04:46:52 localhost systemd[1]: 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884.service: Deactivated successfully. Oct 14 04:46:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e. Oct 14 04:46:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5. Oct 14 04:46:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2. 
Oct 14 04:46:54 localhost podman[90573]: 2025-10-14 08:46:54.743992684 +0000 UTC m=+0.081345718 container health_status 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, architecture=x86_64, managed_by=tripleo_ansible, vendor=Red Hat, Inc., release=1, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, io.buildah.version=1.33.12, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, batch=17.1_20250721.1, build-date=2025-07-21T16:28:53, container_name=ovn_metadata_agent, vcs-type=git, config_id=tripleo_step4, version=17.1.9, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}) Oct 14 04:46:54 localhost podman[90573]: 2025-10-14 08:46:54.786467178 +0000 UTC m=+0.123820242 container exec_died 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.buildah.version=1.33.12, name=rhosp17/openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20250721.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., container_name=ovn_metadata_agent, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true, architecture=x86_64, build-date=2025-07-21T16:28:53, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, managed_by=tripleo_ansible, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, vcs-type=git, version=17.1.9) Oct 14 04:46:54 localhost systemd[1]: 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e.service: Deactivated successfully. Oct 14 04:46:54 localhost systemd[1]: tmp-crun.34PCYv.mount: Deactivated successfully. 
Oct 14 04:46:54 localhost podman[90575]: 2025-10-14 08:46:54.849188489 +0000 UTC m=+0.182152588 container health_status ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a-4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', 
'/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step5, vcs-type=git, io.openshift.expose-services=, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, batch=17.1_20250721.1, io.buildah.version=1.33.12, build-date=2025-07-21T14:48:37, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=nova_compute, managed_by=tripleo_ansible, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, version=17.1.9, com.redhat.component=openstack-nova-compute-container, name=rhosp17/openstack-nova-compute, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc.) Oct 14 04:46:54 localhost podman[90575]: 2025-10-14 08:46:54.883081789 +0000 UTC m=+0.216045898 container exec_died ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, batch=17.1_20250721.1, com.redhat.component=openstack-nova-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a-4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2025-07-21T14:48:37, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1, tcib_managed=true, maintainer=OpenStack TripleO Team, architecture=x86_64, vendor=Red Hat, Inc., io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-nova-compute, vcs-type=git, config_id=tripleo_step5, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=nova_compute, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, version=17.1.9, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible) Oct 14 04:46:54 localhost systemd[1]: 
ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2.service: Deactivated successfully. Oct 14 04:46:54 localhost podman[90574]: 2025-10-14 08:46:54.79718243 +0000 UTC m=+0.132906935 container health_status a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, managed_by=tripleo_ansible, com.redhat.component=openstack-ovn-controller-container, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, vcs-type=git, io.openshift.expose-services=, vendor=Red Hat, Inc., version=17.1.9, architecture=x86_64, batch=17.1_20250721.1, container_name=ovn_controller, io.buildah.version=1.33.12, build-date=2025-07-21T13:28:44, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, config_id=tripleo_step4, release=1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1) Oct 14 04:46:54 localhost podman[90574]: 2025-10-14 08:46:54.929126513 +0000 UTC m=+0.264850918 container exec_died a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, managed_by=tripleo_ansible, architecture=x86_64, com.redhat.component=openstack-ovn-controller-container, io.openshift.tags=rhosp osp openstack osp-17.1, container_name=ovn_controller, distribution-scope=public, tcib_managed=true, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, version=17.1.9, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, release=1, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, build-date=2025-07-21T13:28:44, batch=17.1_20250721.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, 
io.buildah.version=1.33.12) Oct 14 04:46:54 localhost systemd[1]: a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5.service: Deactivated successfully. Oct 14 04:47:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6. Oct 14 04:47:00 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Oct 14 04:47:00 localhost recover_tripleo_nova_virtqemud[90650]: 62551 Oct 14 04:47:00 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Oct 14 04:47:00 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. Oct 14 04:47:00 localhost systemd[1]: tmp-crun.qv7Pyn.mount: Deactivated successfully. Oct 14 04:47:00 localhost podman[90643]: 2025-10-14 08:47:00.768955569 +0000 UTC m=+0.102208775 container health_status 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, build-date=2025-07-21T13:07:59, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, tcib_managed=true, container_name=metrics_qdr, config_id=tripleo_step1, managed_by=tripleo_ansible, name=rhosp17/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, release=1, description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, com.redhat.component=openstack-qdrouterd-container, maintainer=OpenStack TripleO Team, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., version=17.1.9, batch=17.1_20250721.1) Oct 14 04:47:00 localhost podman[90643]: 2025-10-14 08:47:00.963224961 +0000 UTC m=+0.296478187 container exec_died 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, maintainer=OpenStack TripleO Team, release=1, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, config_id=tripleo_step1, name=rhosp17/openstack-qdrouterd, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, architecture=x86_64, build-date=2025-07-21T13:07:59, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 
'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.license_terms=https://www.redhat.com/agreements, tcib_managed=true, version=17.1.9, com.redhat.component=openstack-qdrouterd-container, batch=17.1_20250721.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, distribution-scope=public, vendor=Red Hat, Inc., container_name=metrics_qdr) Oct 14 04:47:00 localhost systemd[1]: 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6.service: Deactivated successfully. Oct 14 04:47:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8. Oct 14 04:47:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea. 
Oct 14 04:47:16 localhost podman[90752]: 2025-10-14 08:47:16.763250743 +0000 UTC m=+0.092199764 container health_status 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-07-21T13:27:15, managed_by=tripleo_ansible, version=17.1.9, vcs-type=git, architecture=x86_64, tcib_managed=true, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, container_name=iscsid, config_id=tripleo_step3, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, batch=17.1_20250721.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, com.redhat.component=openstack-iscsid-container, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, release=1, io.buildah.version=1.33.12, name=rhosp17/openstack-iscsid) Oct 14 04:47:16 localhost podman[90752]: 2025-10-14 08:47:16.802974992 +0000 UTC m=+0.131924013 container exec_died 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red 
Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.33.12, release=1, vcs-type=git, batch=17.1_20250721.1, com.redhat.component=openstack-iscsid-container, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, config_id=tripleo_step3, io.openshift.expose-services=, name=rhosp17/openstack-iscsid, version=17.1.9, architecture=x86_64, distribution-scope=public, vendor=Red Hat, Inc., build-date=2025-07-21T13:27:15, container_name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible) Oct 14 04:47:16 localhost systemd[1]: 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea.service: Deactivated successfully. Oct 14 04:47:16 localhost podman[90751]: 2025-10-14 08:47:16.80675639 +0000 UTC m=+0.137303940 container health_status 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, container_name=collectd, tcib_managed=true, name=rhosp17/openstack-collectd, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-07-21T13:04:03, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, batch=17.1_20250721.1, release=2, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.33.12, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, config_id=tripleo_step3, vcs-type=git, version=17.1.9, io.openshift.expose-services=, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd) Oct 14 04:47:16 localhost podman[90751]: 2025-10-14 08:47:16.891227934 +0000 UTC m=+0.221775554 container exec_died 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, container_name=collectd, version=17.1.9, com.redhat.component=openstack-collectd-container, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1, distribution-scope=public, 
io.buildah.version=1.33.12, tcib_managed=true, config_id=tripleo_step3, name=rhosp17/openstack-collectd, managed_by=tripleo_ansible, vendor=Red Hat, Inc., architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, maintainer=OpenStack TripleO Team, batch=17.1_20250721.1, build-date=2025-07-21T13:04:03, release=2) Oct 14 04:47:16 localhost systemd[1]: 
0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8.service: Deactivated successfully. Oct 14 04:47:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638. Oct 14 04:47:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884. Oct 14 04:47:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f. Oct 14 04:47:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e. Oct 14 04:47:22 localhost systemd[1]: tmp-crun.52EAoU.mount: Deactivated successfully. Oct 14 04:47:22 localhost podman[90790]: 2025-10-14 08:47:22.733883876 +0000 UTC m=+0.078220502 container health_status 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.33.12, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1, build-date=2025-07-21T15:29:47, managed_by=tripleo_ansible, batch=17.1_20250721.1, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., vcs-type=git, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.tags=rhosp osp openstack osp-17.1, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.9, distribution-scope=public, name=rhosp17/openstack-ceilometer-ipmi) Oct 14 04:47:22 localhost podman[90792]: 2025-10-14 08:47:22.808831895 +0000 UTC m=+0.141693586 container health_status d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.buildah.version=1.33.12, name=rhosp17/openstack-cron, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 cron, version=17.1.9, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=openstack-cron-container, description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, distribution-scope=public, release=1, tcib_managed=true, batch=17.1_20250721.1, io.openshift.expose-services=, 
container_name=logrotate_crond, build-date=2025-07-21T13:07:52, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1) Oct 14 04:47:22 localhost podman[90798]: 2025-10-14 08:47:22.762110079 +0000 UTC m=+0.093561726 container health_status def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, managed_by=tripleo_ansible, version=17.1.9, release=1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, 
build-date=2025-07-21T14:45:33, container_name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, config_id=tripleo_step4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20250721.1, tcib_managed=true, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, distribution-scope=public, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, io.buildah.version=1.33.12, vendor=Red Hat, Inc.) 
Oct 14 04:47:22 localhost podman[90798]: 2025-10-14 08:47:22.84290603 +0000 UTC m=+0.174357687 container exec_died def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, batch=17.1_20250721.1, managed_by=tripleo_ansible, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, distribution-scope=public, build-date=2025-07-21T14:45:33, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, architecture=x86_64, io.openshift.expose-services=, release=1, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.33.12, name=rhosp17/openstack-ceilometer-compute, vendor=Red Hat, Inc., tcib_managed=true, version=17.1.9, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, config_id=tripleo_step4) Oct 14 04:47:22 localhost podman[90791]: 2025-10-14 08:47:22.789750185 +0000 UTC m=+0.130257062 container health_status 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, maintainer=OpenStack TripleO Team, vcs-type=git, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, batch=17.1_20250721.1, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step4, architecture=x86_64, release=1, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.33.12, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, build-date=2025-07-21T14:48:37, version=17.1.9, name=rhosp17/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public) Oct 14 04:47:22 localhost systemd[1]: def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e.service: Deactivated successfully. Oct 14 04:47:22 localhost podman[90790]: 2025-10-14 08:47:22.865466218 +0000 UTC m=+0.209802864 container exec_died 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-ceilometer-ipmi, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.33.12, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, tcib_managed=true, batch=17.1_20250721.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, architecture=x86_64, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.expose-services=, release=1, build-date=2025-07-21T15:29:47, version=17.1.9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, config_id=tripleo_step4) Oct 14 04:47:22 localhost systemd[1]: 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638.service: Deactivated successfully. 
Oct 14 04:47:22 localhost podman[90792]: 2025-10-14 08:47:22.916843519 +0000 UTC m=+0.249705190 container exec_died d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, com.redhat.license_terms=https://www.redhat.com/agreements, managed_by=tripleo_ansible, tcib_managed=true, distribution-scope=public, build-date=2025-07-21T13:07:52, container_name=logrotate_crond, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.33.12, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, vcs-type=git, 
io.k8s.description=Red Hat OpenStack Platform 17.1 cron, version=17.1.9, architecture=x86_64, name=rhosp17/openstack-cron, release=1, batch=17.1_20250721.1, io.openshift.expose-services=, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team) Oct 14 04:47:22 localhost systemd[1]: d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f.service: Deactivated successfully. Oct 14 04:47:23 localhost podman[90791]: 2025-10-14 08:47:23.210114054 +0000 UTC m=+0.550620931 container exec_died 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, container_name=nova_migration_target, distribution-scope=public, build-date=2025-07-21T14:48:37, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, batch=17.1_20250721.1, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, release=1, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, version=17.1.9, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d) Oct 14 04:47:23 localhost systemd[1]: 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884.service: Deactivated successfully. Oct 14 04:47:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e. Oct 14 04:47:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5. Oct 14 04:47:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2. 
Oct 14 04:47:25 localhost podman[90889]: 2025-10-14 08:47:25.784597062 +0000 UTC m=+0.114358200 container health_status ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.openshift.expose-services=, config_id=tripleo_step5, io.buildah.version=1.33.12, vcs-type=git, architecture=x86_64, build-date=2025-07-21T14:48:37, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a-4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1, distribution-scope=public, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, version=17.1.9, description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, batch=17.1_20250721.1, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d) Oct 14 04:47:25 localhost podman[90889]: 2025-10-14 08:47:25.815045714 +0000 UTC m=+0.144806802 container exec_died ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step5, io.buildah.version=1.33.12, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a-4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 
'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2025-07-21T14:48:37, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, release=1, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, vendor=Red Hat, Inc., io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, version=17.1.9, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1, batch=17.1_20250721.1, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 
nova-compute, container_name=nova_compute) Oct 14 04:47:25 localhost systemd[1]: ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2.service: Deactivated successfully. Oct 14 04:47:25 localhost systemd[1]: tmp-crun.UZ7Ddx.mount: Deactivated successfully. Oct 14 04:47:25 localhost podman[90887]: 2025-10-14 08:47:25.887071014 +0000 UTC m=+0.223555160 container health_status 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, release=1, config_id=tripleo_step4, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.tags=rhosp osp openstack osp-17.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-neutron-metadata-agent-ovn, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.33.12, version=17.1.9, distribution-scope=public, tcib_managed=true, build-date=2025-07-21T16:28:53, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, vcs-type=git, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, managed_by=tripleo_ansible, batch=17.1_20250721.1, container_name=ovn_metadata_agent) Oct 14 04:47:25 localhost podman[90887]: 2025-10-14 08:47:25.939482636 +0000 UTC m=+0.275966822 container exec_died 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, managed_by=tripleo_ansible, tcib_managed=true, config_id=tripleo_step4, io.buildah.version=1.33.12, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team, batch=17.1_20250721.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, container_name=ovn_metadata_agent, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, architecture=x86_64, distribution-scope=public, build-date=2025-07-21T16:28:53, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3) Oct 14 04:47:25 localhost systemd[1]: 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e.service: Deactivated successfully. 
Oct 14 04:47:25 localhost podman[90888]: 2025-10-14 08:47:25.94383133 +0000 UTC m=+0.276006182 container health_status a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/agreements, managed_by=tripleo_ansible, distribution-scope=public, name=rhosp17/openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, version=17.1.9, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, build-date=2025-07-21T13:28:44, vendor=Red Hat, Inc., config_id=tripleo_step4, io.openshift.expose-services=, vcs-type=git, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, tcib_managed=true, com.redhat.component=openstack-ovn-controller-container, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.buildah.version=1.33.12, batch=17.1_20250721.1, architecture=x86_64, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1) Oct 14 04:47:26 localhost podman[90888]: 2025-10-14 08:47:26.027714206 +0000 
UTC m=+0.359889058 container exec_died a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, vendor=Red Hat, Inc., name=rhosp17/openstack-ovn-controller, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, container_name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1, version=17.1.9, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.component=openstack-ovn-controller-container, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, architecture=x86_64, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, release=1, batch=17.1_20250721.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-07-21T13:28:44, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible) Oct 14 04:47:26 localhost systemd[1]: a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5.service: Deactivated successfully. 
Oct 14 04:47:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6. Oct 14 04:47:31 localhost podman[90982]: 2025-10-14 08:47:31.778069923 +0000 UTC m=+0.118519499 container health_status 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, build-date=2025-07-21T13:07:59, vendor=Red Hat, Inc., name=rhosp17/openstack-qdrouterd, release=1, config_id=tripleo_step1, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, 
com.redhat.component=openstack-qdrouterd-container, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, container_name=metrics_qdr, version=17.1.9, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, io.buildah.version=1.33.12) Oct 14 04:47:31 localhost podman[90982]: 2025-10-14 08:47:31.996352099 +0000 UTC m=+0.336801735 container exec_died 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.buildah.version=1.33.12, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, tcib_managed=true, release=1, io.openshift.expose-services=, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, managed_by=tripleo_ansible, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, com.redhat.component=openstack-qdrouterd-container, container_name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-07-21T13:07:59, version=17.1.9, config_id=tripleo_step1, distribution-scope=public, name=rhosp17/openstack-qdrouterd) Oct 14 04:47:32 localhost systemd[1]: 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6.service: Deactivated successfully. Oct 14 04:47:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8. Oct 14 04:47:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea. 
Oct 14 04:47:47 localhost podman[91012]: 2025-10-14 08:47:47.755950736 +0000 UTC m=+0.091575984 container health_status 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, batch=17.1_20250721.1, version=17.1.9, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, name=rhosp17/openstack-iscsid, release=1, vendor=Red Hat, Inc., managed_by=tripleo_ansible, build-date=2025-07-21T13:27:15, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, config_id=tripleo_step3, architecture=x86_64, 
io.buildah.version=1.33.12, com.redhat.component=openstack-iscsid-container, container_name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, distribution-scope=public) Oct 14 04:47:47 localhost podman[91012]: 2025-10-14 08:47:47.761251921 +0000 UTC m=+0.096877199 container exec_died 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20250721.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, version=17.1.9, io.openshift.expose-services=, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, build-date=2025-07-21T13:27:15, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', 
'/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, io.buildah.version=1.33.12, com.redhat.component=openstack-iscsid-container, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-iscsid, vendor=Red Hat, Inc., managed_by=tripleo_ansible, config_id=tripleo_step3, container_name=iscsid, release=1) Oct 14 04:47:47 localhost systemd[1]: 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea.service: Deactivated successfully. Oct 14 04:47:47 localhost podman[91011]: 2025-10-14 08:47:47.811776174 +0000 UTC m=+0.149023283 container health_status 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, tcib_managed=true, io.buildah.version=1.33.12, com.redhat.component=openstack-collectd-container, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, release=2, config_id=tripleo_step3, name=rhosp17/openstack-collectd, build-date=2025-07-21T13:04:03, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, vendor=Red Hat, Inc., config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 
'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, batch=17.1_20250721.1, container_name=collectd, com.redhat.license_terms=https://www.redhat.com/agreements, version=17.1.9, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1) Oct 14 04:47:47 localhost podman[91011]: 2025-10-14 08:47:47.851242426 +0000 UTC m=+0.188489565 container exec_died 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, managed_by=tripleo_ansible, build-date=2025-07-21T13:04:03, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step3, container_name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, vcs-type=git, batch=17.1_20250721.1, 
description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.expose-services=, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, io.openshift.tags=rhosp osp openstack osp-17.1, version=17.1.9, name=rhosp17/openstack-collectd, tcib_managed=true, com.redhat.component=openstack-collectd-container, summary=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, release=2) Oct 14 04:47:47 localhost systemd[1]: 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8.service: Deactivated successfully. 
Oct 14 04:47:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638. Oct 14 04:47:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884. Oct 14 04:47:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f. Oct 14 04:47:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e. Oct 14 04:47:53 localhost systemd[1]: tmp-crun.LNFmQG.mount: Deactivated successfully. Oct 14 04:47:53 localhost systemd[1]: tmp-crun.9Axmme.mount: Deactivated successfully. Oct 14 04:47:53 localhost podman[91051]: 2025-10-14 08:47:53.728072745 +0000 UTC m=+0.069998537 container health_status 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.buildah.version=1.33.12, managed_by=tripleo_ansible, version=17.1.9, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, config_id=tripleo_step4, release=1, tcib_managed=true, architecture=x86_64, build-date=2025-07-21T14:48:37, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, container_name=nova_migration_target, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 
'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, batch=17.1_20250721.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d) Oct 14 04:47:53 localhost podman[91052]: 2025-10-14 08:47:53.769790557 +0000 UTC m=+0.106478537 container health_status d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, version=17.1.9, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, managed_by=tripleo_ansible, name=rhosp17/openstack-cron, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, batch=17.1_20250721.1, container_name=logrotate_crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, build-date=2025-07-21T13:07:52, com.redhat.component=openstack-cron-container, release=1, description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, config_id=tripleo_step4, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, vcs-type=git) Oct 14 04:47:53 localhost podman[91052]: 2025-10-14 08:47:53.853125936 +0000 UTC m=+0.189813906 container exec_died d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, distribution-scope=public, build-date=2025-07-21T13:07:52, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, io.openshift.tags=rhosp osp openstack osp-17.1, version=17.1.9, tcib_managed=true, 
release=1, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.buildah.version=1.33.12, batch=17.1_20250721.1, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., architecture=x86_64, vcs-type=git, name=rhosp17/openstack-cron, com.redhat.component=openstack-cron-container, config_id=tripleo_step4, managed_by=tripleo_ansible, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron) Oct 14 04:47:53 localhost systemd[1]: d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f.service: Deactivated successfully. 
Oct 14 04:47:53 localhost podman[91053]: 2025-10-14 08:47:53.867995356 +0000 UTC m=+0.202042774 container health_status def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, build-date=2025-07-21T14:45:33, tcib_managed=true, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, architecture=x86_64, batch=17.1_20250721.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, 
distribution-scope=public, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, io.openshift.expose-services=, container_name=ceilometer_agent_compute, release=1, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, name=rhosp17/openstack-ceilometer-compute, version=17.1.9) Oct 14 04:47:53 localhost podman[91053]: 2025-10-14 08:47:53.90206841 +0000 UTC m=+0.236115878 container exec_died def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.buildah.version=1.33.12, io.openshift.expose-services=, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/agreements, 
release=1, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, build-date=2025-07-21T14:45:33, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, config_id=tripleo_step4, distribution-scope=public, managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-compute, version=17.1.9, batch=17.1_20250721.1, com.redhat.component=openstack-ceilometer-compute-container, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team) Oct 14 04:47:53 localhost podman[91050]: 2025-10-14 08:47:53.914772704 +0000 UTC m=+0.257018666 container health_status 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, config_id=tripleo_step4, tcib_managed=true, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1, architecture=x86_64, version=17.1.9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, build-date=2025-07-21T15:29:47, com.redhat.license_terms=https://www.redhat.com/agreements, batch=17.1_20250721.1, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-type=git, release=1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, vendor=Red Hat, Inc., container_name=ceilometer_agent_ipmi, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, distribution-scope=public, io.buildah.version=1.33.12, name=rhosp17/openstack-ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi) Oct 14 04:47:53 localhost systemd[1]: def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e.service: Deactivated successfully. 
Oct 14 04:47:53 localhost podman[91050]: 2025-10-14 08:47:53.943320207 +0000 UTC m=+0.285566199 container exec_died 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=openstack-ceilometer-ipmi-container, container_name=ceilometer_agent_ipmi, io.buildah.version=1.33.12, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, tcib_managed=true, name=rhosp17/openstack-ceilometer-ipmi, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, build-date=2025-07-21T15:29:47, config_id=tripleo_step4, io.openshift.expose-services=, batch=17.1_20250721.1, vcs-type=git, release=1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}) Oct 14 04:47:53 localhost systemd[1]: 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638.service: Deactivated successfully. Oct 14 04:47:54 localhost podman[91051]: 2025-10-14 08:47:54.127171008 +0000 UTC m=+0.469096840 container exec_died 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, io.buildah.version=1.33.12, vcs-type=git, version=17.1.9, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2025-07-21T14:48:37, io.openshift.tags=rhosp osp openstack osp-17.1, container_name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, distribution-scope=public, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/agreements, managed_by=tripleo_ansible, architecture=x86_64, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1, tcib_managed=true) Oct 14 04:47:54 localhost systemd[1]: 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884.service: Deactivated successfully. Oct 14 04:47:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e. Oct 14 04:47:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5. Oct 14 04:47:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2. Oct 14 04:47:56 localhost systemd[1]: tmp-crun.0xZnWU.mount: Deactivated successfully. 
Oct 14 04:47:56 localhost podman[91146]: 2025-10-14 08:47:56.762437916 +0000 UTC m=+0.096674882 container health_status a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, release=1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, io.buildah.version=1.33.12, com.redhat.component=openstack-ovn-controller-container, description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, name=rhosp17/openstack-ovn-controller, version=17.1.9, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, vendor=Red Hat, Inc., batch=17.1_20250721.1, build-date=2025-07-21T13:28:44, container_name=ovn_controller, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}) Oct 14 04:47:56 localhost podman[91146]: 2025-10-14 08:47:56.788016928 +0000 
UTC m=+0.122253874 container exec_died a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, name=rhosp17/openstack-ovn-controller, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, batch=17.1_20250721.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.buildah.version=1.33.12, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., managed_by=tripleo_ansible, release=1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.9, distribution-scope=public, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true, build-date=2025-07-21T13:28:44, container_name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container) Oct 14 04:47:56 localhost systemd[1]: a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5.service: Deactivated successfully. 
Oct 14 04:47:56 localhost podman[91145]: 2025-10-14 08:47:56.803156127 +0000 UTC m=+0.138513788 container health_status 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.buildah.version=1.33.12, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, tcib_managed=true, batch=17.1_20250721.1, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, managed_by=tripleo_ansible, summary=Red Hat 
OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, container_name=ovn_metadata_agent, build-date=2025-07-21T16:28:53, io.openshift.expose-services=, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, maintainer=OpenStack TripleO Team, release=1, config_id=tripleo_step4, distribution-scope=public, version=17.1.9) Oct 14 04:47:56 localhost podman[91145]: 2025-10-14 08:47:56.847281203 +0000 UTC m=+0.182638904 container exec_died 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, managed_by=tripleo_ansible, build-date=2025-07-21T16:28:53, release=1, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, version=17.1.9, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20250721.1, config_id=tripleo_step4, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, name=rhosp17/openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, distribution-scope=public, tcib_managed=true, container_name=ovn_metadata_agent) Oct 14 04:47:56 localhost systemd[1]: 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e.service: Deactivated successfully. 
Oct 14 04:47:56 localhost podman[91147]: 2025-10-14 08:47:56.870840602 +0000 UTC m=+0.201925151 container health_status ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, io.openshift.expose-services=, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, distribution-scope=public, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a-4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, container_name=nova_compute, config_id=tripleo_step5, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, vendor=Red Hat, Inc., build-date=2025-07-21T14:48:37, version=17.1.9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, release=1, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute) Oct 14 04:47:56 localhost podman[91147]: 2025-10-14 08:47:56.900023544 +0000 UTC m=+0.231108083 container exec_died ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, version=17.1.9, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, batch=17.1_20250721.1, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 
'49c4309af9a4fea3d3f53b6222780f5a-4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step5, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, release=1, vcs-type=git, build-date=2025-07-21T14:48:37, container_name=nova_compute, tcib_managed=true, distribution-scope=public, io.buildah.version=1.33.12) Oct 14 04:47:56 localhost systemd[1]: 
ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2.service: Deactivated successfully. Oct 14 04:48:00 localhost sshd[91216]: main: sshd: ssh-rsa algorithm is disabled Oct 14 04:48:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6. Oct 14 04:48:02 localhost podman[91217]: 2025-10-14 08:48:02.75274126 +0000 UTC m=+0.088025546 container health_status 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.component=openstack-qdrouterd-container, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.9, maintainer=OpenStack TripleO Team, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, io.buildah.version=1.33.12, io.openshift.expose-services=, managed_by=tripleo_ansible, name=rhosp17/openstack-qdrouterd, release=1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, architecture=x86_64, batch=17.1_20250721.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, container_name=metrics_qdr, build-date=2025-07-21T13:07:59, vendor=Red Hat, Inc.) Oct 14 04:48:02 localhost podman[91217]: 2025-10-14 08:48:02.984207453 +0000 UTC m=+0.319491769 container exec_died 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, managed_by=tripleo_ansible, batch=17.1_20250721.1, name=rhosp17/openstack-qdrouterd, release=1, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, version=17.1.9, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc., io.buildah.version=1.33.12, architecture=x86_64, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-07-21T13:07:59, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, distribution-scope=public, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr) Oct 14 04:48:02 localhost systemd[1]: 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6.service: Deactivated successfully. Oct 14 04:48:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8. Oct 14 04:48:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea. Oct 14 04:48:18 localhost systemd[1]: tmp-crun.sFznjv.mount: Deactivated successfully. 
Oct 14 04:48:18 localhost podman[91326]: 2025-10-14 08:48:18.771637723 +0000 UTC m=+0.104163254 container health_status 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step3, distribution-scope=public, managed_by=tripleo_ansible, container_name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, architecture=x86_64, batch=17.1_20250721.1, com.redhat.component=openstack-iscsid-container, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc., name=rhosp17/openstack-iscsid, build-date=2025-07-21T13:27:15, version=17.1.9, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, release=1, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, 
maintainer=OpenStack TripleO Team, io.buildah.version=1.33.12, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid) Oct 14 04:48:18 localhost podman[91326]: 2025-10-14 08:48:18.78799926 +0000 UTC m=+0.120524791 container exec_died 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, config_id=tripleo_step3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, maintainer=OpenStack TripleO Team, container_name=iscsid, io.openshift.expose-services=, managed_by=tripleo_ansible, 
build-date=2025-07-21T13:27:15, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, io.buildah.version=1.33.12, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, vendor=Red Hat, Inc., vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, name=rhosp17/openstack-iscsid, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, architecture=x86_64, batch=17.1_20250721.1, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.9, release=1) Oct 14 04:48:18 localhost systemd[1]: 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea.service: Deactivated successfully. Oct 14 04:48:18 localhost podman[91325]: 2025-10-14 08:48:18.873138164 +0000 UTC m=+0.206038457 container health_status 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, version=17.1.9, vendor=Red Hat, Inc., io.openshift.expose-services=, container_name=collectd, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, release=2, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, distribution-scope=public, name=rhosp17/openstack-collectd, vcs-type=git, io.buildah.version=1.33.12, tcib_managed=true, batch=17.1_20250721.1, build-date=2025-07-21T13:04:03, description=Red Hat OpenStack Platform 17.1 collectd) Oct 14 04:48:18 localhost podman[91325]: 2025-10-14 08:48:18.883999271 +0000 UTC m=+0.216899554 container exec_died 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, managed_by=tripleo_ansible, release=2, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, batch=17.1_20250721.1, name=rhosp17/openstack-collectd, io.k8s.description=Red Hat 
OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, vendor=Red Hat, Inc., config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, config_id=tripleo_step3, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, build-date=2025-07-21T13:04:03, com.redhat.component=openstack-collectd-container, version=17.1.9, maintainer=OpenStack TripleO Team, container_name=collectd, io.buildah.version=1.33.12) Oct 14 04:48:18 localhost systemd[1]: 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8.service: Deactivated successfully. 
Oct 14 04:48:19 localhost systemd[1]: tmp-crun.Hh5wrb.mount: Deactivated successfully. Oct 14 04:48:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638. Oct 14 04:48:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884. Oct 14 04:48:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f. Oct 14 04:48:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e. Oct 14 04:48:24 localhost systemd[1]: tmp-crun.CRX5XF.mount: Deactivated successfully. Oct 14 04:48:24 localhost podman[91365]: 2025-10-14 08:48:24.813649664 +0000 UTC m=+0.151266642 container health_status d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, distribution-scope=public, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.expose-services=, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, batch=17.1_20250721.1, build-date=2025-07-21T13:07:52, version=17.1.9, release=1, architecture=x86_64, maintainer=OpenStack TripleO Team, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, name=rhosp17/openstack-cron, config_id=tripleo_step4, container_name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, com.redhat.license_terms=https://www.redhat.com/agreements) Oct 14 04:48:24 localhost podman[91363]: 2025-10-14 08:48:24.82255129 +0000 UTC m=+0.165674249 container health_status 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, container_name=ceilometer_agent_ipmi, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, batch=17.1_20250721.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, vcs-type=git, 
distribution-scope=public, io.buildah.version=1.33.12, version=17.1.9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2025-07-21T15:29:47, managed_by=tripleo_ansible) Oct 14 04:48:24 localhost podman[91366]: 2025-10-14 08:48:24.828204025 +0000 UTC m=+0.155340119 container health_status def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, name=rhosp17/openstack-ceilometer-compute, architecture=x86_64, build-date=2025-07-21T14:45:33, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-compute-container, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, 
config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.9, io.buildah.version=1.33.12, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, batch=17.1_20250721.1, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=ceilometer_agent_compute, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1) Oct 14 04:48:24 localhost podman[91365]: 2025-10-14 08:48:24.849950618 +0000 UTC m=+0.187567576 container exec_died 
d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, build-date=2025-07-21T13:07:52, com.redhat.component=openstack-cron-container, name=rhosp17/openstack-cron, tcib_managed=true, vcs-type=git, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, container_name=logrotate_crond, io.buildah.version=1.33.12, release=1, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, distribution-scope=public, version=17.1.9, description=Red Hat OpenStack 
Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1, batch=17.1_20250721.1, io.openshift.expose-services=) Oct 14 04:48:24 localhost systemd[1]: d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f.service: Deactivated successfully. Oct 14 04:48:24 localhost podman[91364]: 2025-10-14 08:48:24.894585128 +0000 UTC m=+0.231674250 container health_status 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/agreements, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.9, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat 
OpenStack Platform 17.1 nova-compute, build-date=2025-07-21T14:48:37, container_name=nova_migration_target, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, vcs-type=git, batch=17.1_20250721.1, release=1, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, name=rhosp17/openstack-nova-compute, architecture=x86_64, io.buildah.version=1.33.12, io.openshift.expose-services=) Oct 14 04:48:24 localhost podman[91363]: 2025-10-14 08:48:24.927533638 +0000 UTC m=+0.270656597 container exec_died 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.9, architecture=x86_64, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, distribution-scope=public, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-ipmi, release=1, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, tcib_managed=true, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, vendor=Red Hat, Inc., container_name=ceilometer_agent_ipmi, build-date=2025-07-21T15:29:47, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi) Oct 14 04:48:24 localhost systemd[1]: 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638.service: Deactivated successfully. 
Oct 14 04:48:24 localhost podman[91366]: 2025-10-14 08:48:24.977950159 +0000 UTC m=+0.305086273 container exec_died def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, container_name=ceilometer_agent_compute, io.openshift.tags=rhosp osp openstack osp-17.1, version=17.1.9, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, vcs-type=git, maintainer=OpenStack TripleO Team, 
architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, name=rhosp17/openstack-ceilometer-compute, build-date=2025-07-21T14:45:33, config_id=tripleo_step4, batch=17.1_20250721.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, release=1) Oct 14 04:48:24 localhost systemd[1]: def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e.service: Deactivated successfully. Oct 14 04:48:25 localhost podman[91364]: 2025-10-14 08:48:25.237993617 +0000 UTC m=+0.575082729 container exec_died 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vcs-type=git, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack 
Platform 17.1 nova-compute, io.openshift.expose-services=, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, config_id=tripleo_step4, container_name=nova_migration_target, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/agreements, release=1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.33.12, com.redhat.component=openstack-nova-compute-container, architecture=x86_64, managed_by=tripleo_ansible, version=17.1.9, tcib_managed=true, batch=17.1_20250721.1, build-date=2025-07-21T14:48:37) Oct 14 04:48:25 localhost systemd[1]: 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884.service: Deactivated successfully. Oct 14 04:48:25 localhost systemd[1]: tmp-crun.FmoJ4W.mount: Deactivated successfully. Oct 14 04:48:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e. Oct 14 04:48:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5. Oct 14 04:48:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2. Oct 14 04:48:27 localhost systemd[1]: tmp-crun.EJTfmI.mount: Deactivated successfully. 
Oct 14 04:48:27 localhost podman[91460]: 2025-10-14 08:48:27.784282761 +0000 UTC m=+0.107784146 container health_status ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, architecture=x86_64, maintainer=OpenStack TripleO Team, container_name=nova_compute, name=rhosp17/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.33.12, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, batch=17.1_20250721.1, vcs-type=git, config_id=tripleo_step5, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a-4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.9, release=1, tcib_managed=true, build-date=2025-07-21T14:48:37, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public) Oct 14 04:48:27 localhost podman[91458]: 2025-10-14 08:48:27.745393037 +0000 UTC m=+0.080359188 container health_status 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1, batch=17.1_20250721.1, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=ovn_metadata_agent, managed_by=tripleo_ansible, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, build-date=2025-07-21T16:28:53, vcs-type=git, name=rhosp17/openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, architecture=x86_64, version=17.1.9, tcib_managed=true) Oct 14 04:48:27 localhost podman[91460]: 2025-10-14 08:48:27.816996334 +0000 UTC m=+0.140497739 container exec_died ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2 
(image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.buildah.version=1.33.12, io.openshift.expose-services=, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, architecture=x86_64, batch=17.1_20250721.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a-4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, distribution-scope=public, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, config_id=tripleo_step5, version=17.1.9, release=1, build-date=2025-07-21T14:48:37, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, tcib_managed=true) Oct 14 04:48:27 localhost podman[91458]: 2025-10-14 08:48:27.829464909 +0000 UTC m=+0.164431030 container exec_died 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, name=rhosp17/openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., version=17.1.9, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, distribution-scope=public, architecture=x86_64, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=ovn_metadata_agent, config_id=tripleo_step4, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, io.openshift.expose-services=, io.buildah.version=1.33.12, batch=17.1_20250721.1, release=1, build-date=2025-07-21T16:28:53, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Oct 14 04:48:27 localhost systemd[1]: ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2.service: Deactivated successfully. Oct 14 04:48:27 localhost systemd[1]: 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e.service: Deactivated successfully. 
Oct 14 04:48:27 localhost podman[91459]: 2025-10-14 08:48:27.914713958 +0000 UTC m=+0.245543981 container health_status a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, release=1, architecture=x86_64, com.redhat.component=openstack-ovn-controller-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.9, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, tcib_managed=true, vendor=Red Hat, Inc., vcs-type=git, description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, distribution-scope=public, container_name=ovn_controller, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/agreements, batch=17.1_20250721.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, config_id=tripleo_step4, build-date=2025-07-21T13:28:44) Oct 14 04:48:27 localhost podman[91459]: 2025-10-14 08:48:27.962207538 +0000 
UTC m=+0.293037591 container exec_died a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, distribution-scope=public, architecture=x86_64, vendor=Red Hat, Inc., tcib_managed=true, io.openshift.expose-services=, vcs-type=git, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, build-date=2025-07-21T13:28:44, version=17.1.9, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.tags=rhosp osp openstack osp-17.1, batch=17.1_20250721.1, com.redhat.component=openstack-ovn-controller-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, name=rhosp17/openstack-ovn-controller, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller) Oct 14 04:48:27 localhost systemd[1]: a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5.service: Deactivated successfully. 
Oct 14 04:48:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6. Oct 14 04:48:33 localhost podman[91531]: 2025-10-14 08:48:33.756635139 +0000 UTC m=+0.094648000 container health_status 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, build-date=2025-07-21T13:07:59, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., batch=17.1_20250721.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, com.redhat.component=openstack-qdrouterd-container, distribution-scope=public, name=rhosp17/openstack-qdrouterd, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', 
'/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, tcib_managed=true, version=17.1.9, container_name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1, config_id=tripleo_step1, com.redhat.license_terms=https://www.redhat.com/agreements, release=1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, architecture=x86_64) Oct 14 04:48:33 localhost podman[91531]: 2025-10-14 08:48:33.986270306 +0000 UTC m=+0.324283157 container exec_died 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, architecture=x86_64, config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, build-date=2025-07-21T13:07:59, com.redhat.component=openstack-qdrouterd-container, release=1, tcib_managed=true, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, vcs-type=git, version=17.1.9, batch=17.1_20250721.1, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, name=rhosp17/openstack-qdrouterd, io.buildah.version=1.33.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, container_name=metrics_qdr, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 qdrouterd) Oct 14 04:48:34 localhost systemd[1]: 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6.service: Deactivated successfully. Oct 14 04:48:47 localhost sshd[91561]: main: sshd: ssh-rsa algorithm is disabled Oct 14 04:48:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8. Oct 14 04:48:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea. 
Oct 14 04:48:49 localhost podman[91562]: 2025-10-14 08:48:49.744100521 +0000 UTC m=+0.083732102 container health_status 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, maintainer=OpenStack TripleO Team, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.component=openstack-collectd-container, vcs-type=git, container_name=collectd, version=17.1.9, config_id=tripleo_step3, build-date=2025-07-21T13:04:03, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, 
com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, release=2, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, batch=17.1_20250721.1, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, architecture=x86_64, io.buildah.version=1.33.12) Oct 14 04:48:49 localhost systemd[1]: tmp-crun.nrRFFe.mount: Deactivated successfully. Oct 14 04:48:49 localhost podman[91563]: 2025-10-14 08:48:49.809550596 +0000 UTC m=+0.147499395 container health_status 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, tcib_managed=true, version=17.1.9, batch=17.1_20250721.1, name=rhosp17/openstack-iscsid, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, io.openshift.expose-services=, release=1, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.component=openstack-iscsid-container, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, container_name=iscsid, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, vendor=Red Hat, Inc., build-date=2025-07-21T13:27:15, summary=Red Hat OpenStack Platform 17.1 iscsid) Oct 14 04:48:49 localhost podman[91563]: 2025-10-14 08:48:49.817283226 +0000 UTC m=+0.155231975 container exec_died 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, batch=17.1_20250721.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, name=rhosp17/openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T13:27:15, version=17.1.9, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, container_name=iscsid, distribution-scope=public, tcib_managed=true, io.openshift.expose-services=, com.redhat.component=openstack-iscsid-container, io.buildah.version=1.33.12, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, vendor=Red Hat, Inc., 
config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3) Oct 14 04:48:49 localhost systemd[1]: 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea.service: Deactivated successfully. 
Oct 14 04:48:49 localhost podman[91562]: 2025-10-14 08:48:49.831075863 +0000 UTC m=+0.170707424 container exec_died 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, com.redhat.component=openstack-collectd-container, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, 
io.openshift.expose-services=, vendor=Red Hat, Inc., build-date=2025-07-21T13:04:03, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, release=2, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, version=17.1.9, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, tcib_managed=true, distribution-scope=public, batch=17.1_20250721.1, container_name=collectd, architecture=x86_64) Oct 14 04:48:49 localhost systemd[1]: 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8.service: Deactivated successfully. Oct 14 04:48:50 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Oct 14 04:48:50 localhost recover_tripleo_nova_virtqemud[91602]: 62551 Oct 14 04:48:50 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Oct 14 04:48:50 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. Oct 14 04:48:52 localhost sshd[91603]: main: sshd: ssh-rsa algorithm is disabled Oct 14 04:48:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638. Oct 14 04:48:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884. Oct 14 04:48:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f. Oct 14 04:48:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e. Oct 14 04:48:55 localhost systemd[1]: tmp-crun.UIK0gx.mount: Deactivated successfully. 
Oct 14 04:48:55 localhost podman[91604]: 2025-10-14 08:48:55.773961898 +0000 UTC m=+0.105767753 container health_status 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.buildah.version=1.33.12, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, io.openshift.expose-services=, release=1, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-ipmi, vcs-type=git, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, build-date=2025-07-21T15:29:47, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
com.redhat.component=openstack-ceilometer-ipmi-container, managed_by=tripleo_ansible, version=17.1.9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, batch=17.1_20250721.1, tcib_managed=true) Oct 14 04:48:55 localhost podman[91606]: 2025-10-14 08:48:55.786427205 +0000 UTC m=+0.112721510 container health_status d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, config_id=tripleo_step4, build-date=2025-07-21T13:07:52, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, version=17.1.9, release=1, description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, vcs-type=git, architecture=x86_64, com.redhat.component=openstack-cron-container, summary=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., io.openshift.expose-services=, name=rhosp17/openstack-cron, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, container_name=logrotate_crond, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, batch=17.1_20250721.1, maintainer=OpenStack TripleO Team, io.buildah.version=1.33.12) Oct 14 04:48:55 localhost podman[91606]: 2025-10-14 08:48:55.820381865 +0000 UTC m=+0.146676170 container exec_died d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, release=1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc., io.buildah.version=1.33.12, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, maintainer=OpenStack TripleO Team, container_name=logrotate_crond, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, vcs-type=git, build-date=2025-07-21T13:07:52, batch=17.1_20250721.1, version=17.1.9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, name=rhosp17/openstack-cron, config_id=tripleo_step4, managed_by=tripleo_ansible, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, architecture=x86_64) Oct 14 04:48:55 localhost systemd[1]: d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f.service: Deactivated successfully. 
Oct 14 04:48:55 localhost podman[91607]: 2025-10-14 08:48:55.86545836 +0000 UTC m=+0.190446105 container health_status def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, com.redhat.component=openstack-ceilometer-compute-container, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.33.12, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, batch=17.1_20250721.1, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc., release=1, version=17.1.9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, container_name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 
ceilometer-compute, maintainer=OpenStack TripleO Team, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, vcs-type=git, architecture=x86_64, build-date=2025-07-21T14:45:33, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true) Oct 14 04:48:55 localhost podman[91604]: 2025-10-14 08:48:55.883420846 +0000 UTC m=+0.215226741 container exec_died 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, name=rhosp17/openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-ipmi-container, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.33.12, tcib_managed=true, batch=17.1_20250721.1, build-date=2025-07-21T15:29:47, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, release=1, container_name=ceilometer_agent_ipmi, vcs-type=git, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, io.openshift.tags=rhosp osp openstack osp-17.1, version=17.1.9, io.openshift.expose-services=) Oct 14 04:48:55 localhost podman[91607]: 2025-10-14 08:48:55.891195097 +0000 UTC m=+0.216182802 container exec_died def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, vendor=Red Hat, Inc., vcs-type=git, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step4, name=rhosp17/openstack-ceilometer-compute, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1, io.buildah.version=1.33.12, io.openshift.expose-services=, distribution-scope=public, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, version=17.1.9, maintainer=OpenStack TripleO Team, build-date=2025-07-21T14:45:33, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20250721.1, com.redhat.component=openstack-ceilometer-compute-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1) Oct 14 04:48:55 localhost systemd[1]: 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638.service: Deactivated successfully. Oct 14 04:48:55 localhost systemd[1]: def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e.service: Deactivated successfully. 
Oct 14 04:48:55 localhost podman[91605]: 2025-10-14 08:48:55.973393801 +0000 UTC m=+0.303606878 container health_status 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.openshift.expose-services=, managed_by=tripleo_ansible, vendor=Red Hat, Inc., build-date=2025-07-21T14:48:37, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.33.12, release=1, architecture=x86_64, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, com.redhat.component=openstack-nova-compute-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, io.openshift.tags=rhosp 
osp openstack osp-17.1, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, container_name=nova_migration_target, version=17.1.9, name=rhosp17/openstack-nova-compute, config_id=tripleo_step4, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute) Oct 14 04:48:56 localhost podman[91605]: 2025-10-14 08:48:56.392478371 +0000 UTC m=+0.722691448 container exec_died 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, version=17.1.9, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, name=rhosp17/openstack-nova-compute, build-date=2025-07-21T14:48:37, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20250721.1, container_name=nova_migration_target, io.buildah.version=1.33.12, managed_by=tripleo_ansible, config_id=tripleo_step4, vcs-type=git, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, release=1, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team) Oct 14 04:48:56 localhost systemd[1]: 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884.service: Deactivated successfully. Oct 14 04:48:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e. Oct 14 04:48:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5. Oct 14 04:48:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2. 
Oct 14 04:48:58 localhost podman[91700]: 2025-10-14 08:48:58.742887164 +0000 UTC m=+0.071530424 container health_status ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, config_id=tripleo_step5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, distribution-scope=public, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, com.redhat.component=openstack-nova-compute-container, release=1, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-07-21T14:48:37, io.openshift.tags=rhosp osp openstack osp-17.1, io.openshift.expose-services=, io.buildah.version=1.33.12, name=rhosp17/openstack-nova-compute, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20250721.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a-4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_compute, version=17.1.9, architecture=x86_64) Oct 14 04:48:58 localhost podman[91700]: 2025-10-14 08:48:58.766696161 +0000 UTC m=+0.095339301 container exec_died ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.openshift.tags=rhosp osp openstack osp-17.1, io.buildah.version=1.33.12, name=rhosp17/openstack-nova-compute, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.9, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, vendor=Red Hat, Inc., vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-07-21T14:48:37, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a-4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_compute, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, release=1, summary=Red Hat OpenStack Platform 
17.1 nova-compute) Oct 14 04:48:58 localhost systemd[1]: ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2.service: Deactivated successfully. Oct 14 04:48:58 localhost systemd[1]: tmp-crun.cFfSvX.mount: Deactivated successfully. Oct 14 04:48:58 localhost podman[91698]: 2025-10-14 08:48:58.847190212 +0000 UTC m=+0.182428937 container health_status 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, release=1, distribution-scope=public, io.buildah.version=1.33.12, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step4, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', 
'/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-07-21T16:28:53, batch=17.1_20250721.1, tcib_managed=true, architecture=x86_64, version=17.1.9, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, managed_by=tripleo_ansible, vcs-type=git, io.openshift.expose-services=, maintainer=OpenStack TripleO Team) Oct 14 04:48:58 localhost podman[91698]: 2025-10-14 08:48:58.880954897 +0000 UTC m=+0.216193612 container exec_died 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, name=rhosp17/openstack-neutron-metadata-agent-ovn, release=1, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, build-date=2025-07-21T16:28:53, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20250721.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, architecture=x86_64, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, io.openshift.expose-services=, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, vcs-type=git, version=17.1.9) Oct 14 04:48:58 localhost systemd[1]: 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e.service: Deactivated successfully. 
Oct 14 04:48:58 localhost podman[91699]: 2025-10-14 08:48:58.895426605 +0000 UTC m=+0.227196692 container health_status a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, com.redhat.license_terms=https://www.redhat.com/agreements, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, version=17.1.9, description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.buildah.version=1.33.12, release=1, build-date=2025-07-21T13:28:44, batch=17.1_20250721.1, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ovn-controller, distribution-scope=public, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.expose-services=, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, com.redhat.component=openstack-ovn-controller-container) Oct 14 04:48:58 localhost podman[91699]: 2025-10-14 08:48:58.909708517 +0000 
UTC m=+0.241478624 container exec_died a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-ovn-controller, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, managed_by=tripleo_ansible, tcib_managed=true, build-date=2025-07-21T13:28:44, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-type=git, architecture=x86_64, batch=17.1_20250721.1, container_name=ovn_controller, io.buildah.version=1.33.12, release=1, version=17.1.9, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, com.redhat.component=openstack-ovn-controller-container, distribution-scope=public, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, config_id=tripleo_step4, maintainer=OpenStack TripleO Team) Oct 14 04:48:58 localhost systemd[1]: a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5.service: Deactivated successfully. 
Oct 14 04:48:59 localhost systemd[1]: tmp-crun.3TMBHV.mount: Deactivated successfully. Oct 14 04:49:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6. Oct 14 04:49:04 localhost podman[91768]: 2025-10-14 08:49:04.731850955 +0000 UTC m=+0.072023290 container health_status 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, com.redhat.component=openstack-qdrouterd-container, io.openshift.tags=rhosp osp openstack osp-17.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, batch=17.1_20250721.1, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, build-date=2025-07-21T13:07:59, maintainer=OpenStack TripleO Team, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, name=rhosp17/openstack-qdrouterd, release=1, distribution-scope=public, vcs-type=git, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., config_id=tripleo_step1, managed_by=tripleo_ansible, tcib_managed=true) Oct 14 04:49:04 localhost podman[91768]: 2025-10-14 08:49:04.944345042 +0000 UTC m=+0.284517387 container exec_died 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, config_id=tripleo_step1, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', 
'/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, version=17.1.9, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-qdrouterd, managed_by=tripleo_ansible, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, vcs-type=git, build-date=2025-07-21T13:07:59, io.buildah.version=1.33.12, vendor=Red Hat, Inc., container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, tcib_managed=true, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, release=1) Oct 14 04:49:04 localhost systemd[1]: 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6.service: Deactivated successfully. Oct 14 04:49:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8. Oct 14 04:49:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea. 
Oct 14 04:49:20 localhost podman[91874]: 2025-10-14 08:49:20.740818582 +0000 UTC m=+0.073800626 container health_status 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, release=1, architecture=x86_64, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, com.redhat.component=openstack-iscsid-container, container_name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, vendor=Red Hat, Inc., version=17.1.9, build-date=2025-07-21T13:27:15, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step3, io.buildah.version=1.33.12, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat 
OpenStack Platform 17.1 iscsid, tcib_managed=true, io.openshift.expose-services=, distribution-scope=public, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, maintainer=OpenStack TripleO Team) Oct 14 04:49:20 localhost podman[91873]: 2025-10-14 08:49:20.793767151 +0000 UTC m=+0.126757575 container health_status 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, name=rhosp17/openstack-collectd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, version=17.1.9, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', 
'/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, release=2, description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, vcs-type=git, build-date=2025-07-21T13:04:03, container_name=collectd, maintainer=OpenStack TripleO Team, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, architecture=x86_64, io.buildah.version=1.33.12, vendor=Red Hat, Inc., com.redhat.component=openstack-collectd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible) Oct 14 04:49:20 localhost podman[91874]: 2025-10-14 08:49:20.8234948 +0000 UTC m=+0.156476904 container exec_died 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, name=rhosp17/openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true, build-date=2025-07-21T13:27:15, maintainer=OpenStack TripleO Team, distribution-scope=public, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, io.openshift.expose-services=, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, vcs-type=git, batch=17.1_20250721.1, managed_by=tripleo_ansible, vendor=Red Hat, Inc., container_name=iscsid, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.buildah.version=1.33.12, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.9, com.redhat.component=openstack-iscsid-container, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid) Oct 14 04:49:20 localhost podman[91873]: 2025-10-14 08:49:20.831270841 +0000 UTC m=+0.164261335 container exec_died 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, distribution-scope=public, vcs-type=git, version=17.1.9, vendor=Red Hat, Inc., io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, io.buildah.version=1.33.12, managed_by=tripleo_ansible, config_id=tripleo_step3, container_name=collectd, batch=17.1_20250721.1, build-date=2025-07-21T13:04:03, config_data={'cap_add': ['IPC_LOCK'], 'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, release=2, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhosp17/openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd) Oct 14 04:49:20 localhost systemd[1]: 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea.service: Deactivated successfully. Oct 14 04:49:20 localhost systemd[1]: 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8.service: Deactivated successfully. 
Oct 14 04:49:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638. Oct 14 04:49:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884. Oct 14 04:49:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f. Oct 14 04:49:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e. Oct 14 04:49:26 localhost systemd[1]: tmp-crun.5dwkAQ.mount: Deactivated successfully. Oct 14 04:49:26 localhost podman[91916]: 2025-10-14 08:49:26.757999364 +0000 UTC m=+0.091424911 container health_status def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.9, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, managed_by=tripleo_ansible, vendor=Red Hat, Inc., container_name=ceilometer_agent_compute, com.redhat.component=openstack-ceilometer-compute-container, io.buildah.version=1.33.12, build-date=2025-07-21T14:45:33, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, io.openshift.expose-services=, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, batch=17.1_20250721.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1, tcib_managed=true) Oct 14 04:49:26 localhost podman[91915]: 2025-10-14 08:49:26.727803869 +0000 UTC m=+0.066902841 container health_status d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, batch=17.1_20250721.1, io.openshift.expose-services=, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, build-date=2025-07-21T13:07:52, com.redhat.component=openstack-cron-container, container_name=logrotate_crond, distribution-scope=public, architecture=x86_64, release=1, name=rhosp17/openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, 
managed_by=tripleo_ansible, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, version=17.1.9, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1) Oct 14 04:49:26 localhost podman[91915]: 2025-10-14 08:49:26.811006294 +0000 UTC m=+0.150105266 container exec_died d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, container_name=logrotate_crond, com.redhat.component=openstack-cron-container, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/agreements, batch=17.1_20250721.1, 
config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, version=17.1.9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, release=1, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, build-date=2025-07-21T13:07:52, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.33.12, io.openshift.expose-services=, tcib_managed=true, name=rhosp17/openstack-cron, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, vcs-type=git, managed_by=tripleo_ansible) Oct 14 04:49:26 localhost podman[91916]: 2025-10-14 08:49:26.816976629 +0000 UTC m=+0.150402186 container exec_died def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, 
name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-ceilometer-compute, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=ceilometer_agent_compute, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20250721.1, com.redhat.component=openstack-ceilometer-compute-container, vcs-type=git, io.buildah.version=1.33.12, tcib_managed=true, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat 
OpenStack Platform 17.1 ceilometer-compute, release=1, version=17.1.9, config_id=tripleo_step4, build-date=2025-07-21T14:45:33, distribution-scope=public, vendor=Red Hat, Inc.) Oct 14 04:49:26 localhost systemd[1]: d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f.service: Deactivated successfully. Oct 14 04:49:26 localhost systemd[1]: def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e.service: Deactivated successfully. Oct 14 04:49:26 localhost podman[91914]: 2025-10-14 08:49:26.857701359 +0000 UTC m=+0.194661325 container health_status 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, build-date=2025-07-21T14:48:37, distribution-scope=public, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, tcib_managed=true, io.openshift.expose-services=, vcs-type=git, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, vendor=Red Hat, Inc., config_id=tripleo_step4, io.buildah.version=1.33.12, name=rhosp17/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1, version=17.1.9, release=1, batch=17.1_20250721.1, container_name=nova_migration_target) Oct 14 04:49:26 localhost podman[91913]: 2025-10-14 08:49:26.81408967 +0000 UTC m=+0.154501653 container health_status 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, com.redhat.component=openstack-ceilometer-ipmi-container, release=1, vcs-type=git, architecture=x86_64, name=rhosp17/openstack-ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1, distribution-scope=public, vendor=Red Hat, Inc., container_name=ceilometer_agent_ipmi, io.buildah.version=1.33.12, config_id=tripleo_step4, managed_by=tripleo_ansible, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20250721.1, build-date=2025-07-21T15:29:47, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, version=17.1.9) Oct 14 04:49:26 localhost podman[91913]: 2025-10-14 08:49:26.89907309 +0000 UTC m=+0.239485093 container exec_died 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, version=17.1.9, com.redhat.component=openstack-ceilometer-ipmi-container, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': 
True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_ipmi, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, vendor=Red Hat, Inc., io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-07-21T15:29:47, release=1, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhosp17/openstack-ceilometer-ipmi, config_id=tripleo_step4, managed_by=tripleo_ansible, batch=17.1_20250721.1, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, tcib_managed=true) Oct 14 04:49:26 localhost systemd[1]: 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638.service: Deactivated successfully. 
Oct 14 04:49:27 localhost podman[91914]: 2025-10-14 08:49:27.224617295 +0000 UTC m=+0.561577291 container exec_died 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, batch=17.1_20250721.1, io.buildah.version=1.33.12, release=1, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, config_id=tripleo_step4, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, build-date=2025-07-21T14:48:37, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, version=17.1.9, vcs-type=git, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-compute) Oct 14 04:49:27 localhost systemd[1]: 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884.service: Deactivated successfully. Oct 14 04:49:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e. Oct 14 04:49:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5. Oct 14 04:49:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2. Oct 14 04:49:29 localhost podman[92007]: 2025-10-14 08:49:29.736519276 +0000 UTC m=+0.075767156 container health_status 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, managed_by=tripleo_ansible, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, architecture=x86_64, io.buildah.version=1.33.12, build-date=2025-07-21T16:28:53, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, tcib_managed=true, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, config_id=tripleo_step4, vendor=Red Hat, Inc., batch=17.1_20250721.1, version=17.1.9, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, container_name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1, io.openshift.expose-services=) Oct 14 04:49:29 localhost podman[92007]: 2025-10-14 08:49:29.780135125 +0000 UTC m=+0.119382985 container exec_died 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, 
config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.buildah.version=1.33.12, version=17.1.9, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, container_name=ovn_metadata_agent, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, vendor=Red Hat, Inc., build-date=2025-07-21T16:28:53, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, 
vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, vcs-type=git, config_id=tripleo_step4, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, io.openshift.expose-services=, release=1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20250721.1) Oct 14 04:49:29 localhost systemd[1]: tmp-crun.ukSIYB.mount: Deactivated successfully. Oct 14 04:49:29 localhost podman[92009]: 2025-10-14 08:49:29.798707031 +0000 UTC m=+0.131086149 container health_status ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, release=1, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., batch=17.1_20250721.1, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, config_id=tripleo_step5, io.buildah.version=1.33.12, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a-4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': 
'/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp17/openstack-nova-compute, version=17.1.9, build-date=2025-07-21T14:48:37) Oct 14 04:49:29 localhost systemd[1]: 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e.service: Deactivated successfully. 
Oct 14 04:49:29 localhost podman[92008]: 2025-10-14 08:49:29.844415705 +0000 UTC m=+0.179802146 container health_status a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, container_name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, tcib_managed=true, vcs-type=git, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, distribution-scope=public, version=17.1.9, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1, build-date=2025-07-21T13:28:44, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-ovn-controller, vendor=Red Hat, Inc., batch=17.1_20250721.1, architecture=x86_64, com.redhat.component=openstack-ovn-controller-container, io.buildah.version=1.33.12) Oct 14 04:49:29 localhost podman[92009]: 2025-10-14 08:49:29.852031841 +0000 
UTC m=+0.184410969 container exec_died ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, vendor=Red Hat, Inc., io.buildah.version=1.33.12, io.openshift.expose-services=, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, architecture=x86_64, name=rhosp17/openstack-nova-compute, version=17.1.9, vcs-type=git, container_name=nova_compute, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a-4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step5, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-07-21T14:48:37, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, release=1) Oct 14 04:49:29 localhost systemd[1]: ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2.service: Deactivated successfully. 
Oct 14 04:49:29 localhost podman[92008]: 2025-10-14 08:49:29.891281406 +0000 UTC m=+0.226667847 container exec_died a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, vcs-type=git, name=rhosp17/openstack-ovn-controller, config_id=tripleo_step4, io.buildah.version=1.33.12, release=1, maintainer=OpenStack TripleO Team, build-date=2025-07-21T13:28:44, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, managed_by=tripleo_ansible, architecture=x86_64, container_name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/agreements, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., version=17.1.9, batch=17.1_20250721.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245) Oct 14 04:49:29 localhost systemd[1]: 
a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5.service: Deactivated successfully. Oct 14 04:49:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6. Oct 14 04:49:35 localhost systemd[1]: tmp-crun.ZXjGAl.mount: Deactivated successfully. Oct 14 04:49:35 localhost podman[92079]: 2025-10-14 08:49:35.749023906 +0000 UTC m=+0.088355885 container health_status 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, io.openshift.tags=rhosp osp openstack osp-17.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-07-21T13:07:59, managed_by=tripleo_ansible, io.buildah.version=1.33.12, container_name=metrics_qdr, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, distribution-scope=public, maintainer=OpenStack TripleO Team, config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, name=rhosp17/openstack-qdrouterd, tcib_managed=true, release=1, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, version=17.1.9, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., vcs-type=git) Oct 14 04:49:35 localhost podman[92079]: 2025-10-14 08:49:35.938135729 +0000 UTC m=+0.277467698 container exec_died 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, com.redhat.component=openstack-qdrouterd-container, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step1, managed_by=tripleo_ansible, io.openshift.expose-services=, tcib_managed=true, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., name=rhosp17/openstack-qdrouterd, version=17.1.9, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, io.buildah.version=1.33.12, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, release=1, description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20250721.1, build-date=2025-07-21T13:07:59) Oct 14 04:49:35 localhost systemd[1]: 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6.service: Deactivated successfully. Oct 14 04:49:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8. Oct 14 04:49:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea. 
Oct 14 04:49:51 localhost podman[92108]: 2025-10-14 08:49:51.744534626 +0000 UTC m=+0.090568634 container health_status 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1, release=2, tcib_managed=true, 
batch=17.1_20250721.1, distribution-scope=public, version=17.1.9, container_name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, io.buildah.version=1.33.12, name=rhosp17/openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, build-date=2025-07-21T13:04:03, config_id=tripleo_step3, com.redhat.component=openstack-collectd-container) Oct 14 04:49:51 localhost podman[92108]: 2025-10-14 08:49:51.761050097 +0000 UTC m=+0.107084065 container exec_died 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, batch=17.1_20250721.1, com.redhat.component=openstack-collectd-container, io.openshift.expose-services=, version=17.1.9, config_id=tripleo_step3, name=rhosp17/openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1, distribution-scope=public, container_name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, managed_by=tripleo_ansible, release=2, tcib_managed=true, vendor=Red Hat, Inc., io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, build-date=2025-07-21T13:04:03, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b) Oct 14 04:49:51 localhost systemd[1]: 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8.service: Deactivated successfully. Oct 14 04:49:51 localhost systemd[1]: tmp-crun.8XJt0S.mount: Deactivated successfully. 
Oct 14 04:49:51 localhost podman[92109]: 2025-10-14 08:49:51.848324268 +0000 UTC m=+0.190136145 container health_status 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=iscsid, release=1, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, io.buildah.version=1.33.12, vendor=Red Hat, Inc., name=rhosp17/openstack-iscsid, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, build-date=2025-07-21T13:27:15, version=17.1.9, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, batch=17.1_20250721.1, com.redhat.component=openstack-iscsid-container, distribution-scope=public, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', 
'/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, vcs-type=git) Oct 14 04:49:51 localhost podman[92109]: 2025-10-14 08:49:51.887788269 +0000 UTC m=+0.229600176 container exec_died 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 iscsid, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, io.buildah.version=1.33.12, com.redhat.component=openstack-iscsid-container, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, name=rhosp17/openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, tcib_managed=true, vendor=Red Hat, Inc., batch=17.1_20250721.1, vcs-type=git, build-date=2025-07-21T13:27:15, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, architecture=x86_64, version=17.1.9, summary=Red Hat OpenStack Platform 17.1 iscsid) Oct 14 04:49:51 localhost systemd[1]: 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea.service: Deactivated successfully. Oct 14 04:49:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638. Oct 14 04:49:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884. Oct 14 04:49:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f. Oct 14 04:49:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e. Oct 14 04:49:57 localhost systemd[1]: tmp-crun.GA5cXI.mount: Deactivated successfully. 
Oct 14 04:49:57 localhost podman[92149]: 2025-10-14 08:49:57.753805146 +0000 UTC m=+0.083286509 container health_status def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, name=rhosp17/openstack-ceilometer-compute, architecture=x86_64, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, release=1, version=17.1.9, container_name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, com.redhat.component=openstack-ceilometer-compute-container, 
managed_by=tripleo_ansible, distribution-scope=public, io.buildah.version=1.33.12, batch=17.1_20250721.1, vcs-type=git, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, tcib_managed=true, build-date=2025-07-21T14:45:33) Oct 14 04:49:57 localhost podman[92149]: 2025-10-14 08:49:57.785902489 +0000 UTC m=+0.115383812 container exec_died def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, distribution-scope=public, architecture=x86_64, io.buildah.version=1.33.12, vcs-type=git, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhosp17/openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20250721.1, build-date=2025-07-21T14:45:33, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-compute-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, version=17.1.9, container_name=ceilometer_agent_compute, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, io.openshift.tags=rhosp osp openstack osp-17.1, release=1) Oct 14 04:49:57 localhost systemd[1]: def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e.service: Deactivated successfully. 
Oct 14 04:49:57 localhost podman[92148]: 2025-10-14 08:49:57.805418303 +0000 UTC m=+0.134387980 container health_status d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, container_name=logrotate_crond, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.33.12, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, version=17.1.9, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, distribution-scope=public, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, name=rhosp17/openstack-cron, batch=17.1_20250721.1, io.openshift.tags=rhosp osp openstack 
osp-17.1, vcs-type=git, com.redhat.component=openstack-cron-container, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., build-date=2025-07-21T13:07:52, release=1) Oct 14 04:49:57 localhost podman[92148]: 2025-10-14 08:49:57.845305738 +0000 UTC m=+0.174275445 container exec_died d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, config_id=tripleo_step4, name=rhosp17/openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.33.12, com.redhat.component=openstack-cron-container, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, maintainer=OpenStack 
TripleO Team, build-date=2025-07-21T13:07:52, io.openshift.expose-services=, tcib_managed=true, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, architecture=x86_64, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, batch=17.1_20250721.1, container_name=logrotate_crond, summary=Red Hat OpenStack Platform 17.1 cron) Oct 14 04:49:57 localhost systemd[1]: d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f.service: Deactivated successfully. Oct 14 04:49:57 localhost podman[92147]: 2025-10-14 08:49:57.863998746 +0000 UTC m=+0.196885764 container health_status 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.buildah.version=1.33.12, config_id=tripleo_step4, name=rhosp17/openstack-nova-compute, tcib_managed=true, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., batch=17.1_20250721.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, container_name=nova_migration_target, managed_by=tripleo_ansible, release=1, build-date=2025-07-21T14:48:37, com.redhat.license_terms=https://www.redhat.com/agreements, version=17.1.9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 
'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, maintainer=OpenStack TripleO Team) Oct 14 04:49:57 localhost podman[92146]: 2025-10-14 08:49:57.908758391 +0000 UTC m=+0.243593360 container health_status 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-07-21T15:29:47, batch=17.1_20250721.1, io.openshift.expose-services=, tcib_managed=true, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, vcs-type=git, version=17.1.9, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 
'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1, architecture=x86_64, io.buildah.version=1.33.12, vendor=Red Hat, Inc., container_name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-ipmi-container, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi) Oct 14 04:49:57 localhost podman[92146]: 2025-10-14 08:49:57.966269761 +0000 UTC m=+0.301104660 container exec_died 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, architecture=x86_64, release=1, vendor=Red Hat, Inc., io.buildah.version=1.33.12, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-07-21T15:29:47, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, config_id=tripleo_step4, description=Red Hat 
OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_ipmi, version=17.1.9, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, com.redhat.component=openstack-ceilometer-ipmi-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20250721.1, io.openshift.tags=rhosp osp openstack osp-17.1) Oct 14 04:49:57 localhost systemd[1]: 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638.service: Deactivated successfully. 
Oct 14 04:49:58 localhost podman[92147]: 2025-10-14 08:49:58.243306296 +0000 UTC m=+0.576193304 container exec_died 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, managed_by=tripleo_ansible, version=17.1.9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, io.buildah.version=1.33.12, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, 
name=rhosp17/openstack-nova-compute, release=1, batch=17.1_20250721.1, build-date=2025-07-21T14:48:37, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1, io.openshift.expose-services=, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, vcs-type=git) Oct 14 04:49:58 localhost systemd[1]: 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884.service: Deactivated successfully. Oct 14 04:49:58 localhost systemd[1]: tmp-crun.oVaqot.mount: Deactivated successfully. Oct 14 04:50:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e. Oct 14 04:50:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5. Oct 14 04:50:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2. Oct 14 04:50:00 localhost podman[92239]: 2025-10-14 08:50:00.777585188 +0000 UTC m=+0.065372954 container health_status 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, release=1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, batch=17.1_20250721.1, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, tcib_managed=true, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, version=17.1.9, name=rhosp17/openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T16:28:53, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., config_id=tripleo_step4) Oct 14 04:50:00 localhost podman[92244]: 2025-10-14 08:50:00.797066081 +0000 UTC m=+0.074977061 container health_status ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2 
(image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, batch=17.1_20250721.1, vcs-type=git, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, container_name=nova_compute, config_id=tripleo_step5, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-07-21T14:48:37, io.buildah.version=1.33.12, name=rhosp17/openstack-nova-compute, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a-4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', 
'/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, release=1, version=17.1.9, managed_by=tripleo_ansible, tcib_managed=true, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements) Oct 14 04:50:00 localhost podman[92244]: 2025-10-14 08:50:00.850623268 +0000 UTC m=+0.128534248 container exec_died ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a-4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20250721.1, architecture=x86_64, io.buildah.version=1.33.12, name=rhosp17/openstack-nova-compute, release=1, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc., config_id=tripleo_step5, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-07-21T14:48:37, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, tcib_managed=true, container_name=nova_compute, vcs-type=git, io.openshift.expose-services=, version=17.1.9, distribution-scope=public) Oct 14 04:50:00 localhost systemd[1]: ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2.service: Deactivated successfully. 
Oct 14 04:50:00 localhost podman[92239]: 2025-10-14 08:50:00.873730154 +0000 UTC m=+0.161517920 container exec_died 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., batch=17.1_20250721.1, distribution-scope=public, io.openshift.expose-services=, build-date=2025-07-21T16:28:53, com.redhat.license_terms=https://www.redhat.com/agreements, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, name=rhosp17/openstack-neutron-metadata-agent-ovn, tcib_managed=true, maintainer=OpenStack TripleO Team, architecture=x86_64, config_id=tripleo_step4, container_name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1, version=17.1.9, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, managed_by=tripleo_ansible, io.buildah.version=1.33.12) Oct 14 04:50:00 localhost systemd[1]: 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e.service: Deactivated successfully. Oct 14 04:50:00 localhost podman[92240]: 2025-10-14 08:50:00.854893701 +0000 UTC m=+0.134938977 container health_status a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, io.buildah.version=1.33.12, build-date=2025-07-21T13:28:44, io.openshift.expose-services=, container_name=ovn_controller, distribution-scope=public, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, com.redhat.component=openstack-ovn-controller-container, config_id=tripleo_step4, architecture=x86_64, batch=17.1_20250721.1, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, name=rhosp17/openstack-ovn-controller, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, release=1, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.9) Oct 14 04:50:00 localhost podman[92240]: 2025-10-14 08:50:00.935456425 +0000 UTC m=+0.215501701 container exec_died a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, name=rhosp17/openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.9, com.redhat.component=openstack-ovn-controller-container, release=1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': 
True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, container_name=ovn_controller, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, vcs-type=git, io.buildah.version=1.33.12, architecture=x86_64, io.openshift.expose-services=, config_id=tripleo_step4, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-07-21T13:28:44, vendor=Red Hat, Inc., batch=17.1_20250721.1) Oct 14 04:50:00 localhost systemd[1]: a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5.service: Deactivated successfully. Oct 14 04:50:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6. 
Oct 14 04:50:06 localhost podman[92311]: 2025-10-14 08:50:06.748502829 +0000 UTC m=+0.088330035 container health_status 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, name=rhosp17/openstack-qdrouterd, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, release=1, summary=Red Hat 
OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, vendor=Red Hat, Inc., container_name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.33.12, tcib_managed=true, batch=17.1_20250721.1, version=17.1.9, architecture=x86_64, distribution-scope=public, build-date=2025-07-21T13:07:59) Oct 14 04:50:06 localhost podman[92311]: 2025-10-14 08:50:06.965015683 +0000 UTC m=+0.304842819 container exec_died 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, managed_by=tripleo_ansible, batch=17.1_20250721.1, config_id=tripleo_step1, maintainer=OpenStack TripleO Team, version=17.1.9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, vcs-type=git, build-date=2025-07-21T13:07:59, architecture=x86_64, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', 
'/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.component=openstack-qdrouterd-container, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.33.12, release=1, io.openshift.expose-services=, container_name=metrics_qdr, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed) Oct 14 04:50:06 localhost systemd[1]: 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6.service: Deactivated successfully. Oct 14 04:50:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8. Oct 14 04:50:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea. 
Oct 14 04:50:22 localhost podman[92419]: 2025-10-14 08:50:22.74831412 +0000 UTC m=+0.078977818 container health_status 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, description=Red Hat OpenStack Platform 17.1 collectd, release=2, io.openshift.expose-services=, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, vcs-type=git, distribution-scope=public, com.redhat.component=openstack-collectd-container, managed_by=tripleo_ansible, architecture=x86_64, io.buildah.version=1.33.12, container_name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, version=17.1.9, build-date=2025-07-21T13:04:03, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-collectd, config_id=tripleo_step3, vendor=Red Hat, Inc.) Oct 14 04:50:22 localhost podman[92419]: 2025-10-14 08:50:22.766147091 +0000 UTC m=+0.096810819 container exec_died 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, vcs-type=git, com.redhat.component=openstack-collectd-container, managed_by=tripleo_ansible, version=17.1.9, architecture=x86_64, tcib_managed=true, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-collectd, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, release=2, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, batch=17.1_20250721.1, config_id=tripleo_step3, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-07-21T13:04:03, io.buildah.version=1.33.12) Oct 14 04:50:22 localhost systemd[1]: 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8.service: Deactivated successfully. 
Oct 14 04:50:22 localhost podman[92420]: 2025-10-14 08:50:22.858578263 +0000 UTC m=+0.188850799 container health_status 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, name=rhosp17/openstack-iscsid, com.redhat.component=openstack-iscsid-container, build-date=2025-07-21T13:27:15, io.openshift.expose-services=, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20250721.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, version=17.1.9, maintainer=OpenStack TripleO Team, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, description=Red Hat OpenStack Platform 17.1 iscsid, 
com.redhat.license_terms=https://www.redhat.com/agreements, tcib_managed=true, architecture=x86_64, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, release=1, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, vendor=Red Hat, Inc., config_id=tripleo_step3, container_name=iscsid) Oct 14 04:50:22 localhost podman[92420]: 2025-10-14 08:50:22.874660989 +0000 UTC m=+0.204933575 container exec_died 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, com.redhat.component=openstack-iscsid-container, config_id=tripleo_step3, container_name=iscsid, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-iscsid, vendor=Red Hat, Inc., batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.33.12, release=1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-07-21T13:27:15, summary=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, managed_by=tripleo_ansible, distribution-scope=public, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, architecture=x86_64, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2) Oct 14 04:50:22 localhost systemd[1]: 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea.service: Deactivated successfully. Oct 14 04:50:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638. Oct 14 04:50:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884. Oct 14 04:50:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f. Oct 14 04:50:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e. Oct 14 04:50:28 localhost systemd[1]: tmp-crun.5k2EgB.mount: Deactivated successfully. 
Oct 14 04:50:28 localhost podman[92459]: 2025-10-14 08:50:28.736946313 +0000 UTC m=+0.077102181 container health_status 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, com.redhat.component=openstack-ceilometer-ipmi-container, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, name=rhosp17/openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-07-21T15:29:47, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, batch=17.1_20250721.1, distribution-scope=public, architecture=x86_64, tcib_managed=true, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.33.12, io.openshift.expose-services=, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}) Oct 14 04:50:28 localhost podman[92467]: 2025-10-14 08:50:28.795750887 +0000 UTC m=+0.125788982 container health_status def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, config_id=tripleo_step4, io.buildah.version=1.33.12, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-07-21T14:45:33, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, io.openshift.expose-services=, release=1, name=rhosp17/openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/agreements, version=17.1.9, managed_by=tripleo_ansible, vcs-type=git, container_name=ceilometer_agent_compute, io.openshift.tags=rhosp osp openstack osp-17.1, architecture=x86_64, batch=17.1_20250721.1, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1) Oct 14 04:50:28 localhost podman[92460]: 2025-10-14 08:50:28.760822649 +0000 UTC m=+0.094883998 container health_status 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, release=1, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, vcs-type=git, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, version=17.1.9, name=rhosp17/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20250721.1, build-date=2025-07-21T14:48:37, container_name=nova_migration_target, architecture=x86_64, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, config_id=tripleo_step4) Oct 14 04:50:28 localhost podman[92459]: 2025-10-14 08:50:28.818195 +0000 UTC m=+0.158350838 container exec_died 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.openshift.expose-services=, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, version=17.1.9, config_id=tripleo_step4, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-07-21T15:29:47, 
com.redhat.component=openstack-ceilometer-ipmi-container, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, io.buildah.version=1.33.12, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, vcs-type=git, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1, architecture=x86_64) Oct 14 04:50:28 localhost systemd[1]: 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638.service: Deactivated successfully. 
Oct 14 04:50:28 localhost podman[92461]: 2025-10-14 08:50:28.912604674 +0000 UTC m=+0.242847715 container health_status d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, vcs-type=git, vendor=Red Hat, Inc., version=17.1.9, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, release=1, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T13:07:52, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, distribution-scope=public, io.buildah.version=1.33.12, com.redhat.component=openstack-cron-container, description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20250721.1, config_id=tripleo_step4, summary=Red Hat 
OpenStack Platform 17.1 cron, container_name=logrotate_crond, managed_by=tripleo_ansible, name=rhosp17/openstack-cron, tcib_managed=true, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c) Oct 14 04:50:28 localhost podman[92467]: 2025-10-14 08:50:28.923285192 +0000 UTC m=+0.253323357 container exec_died def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, name=rhosp17/openstack-ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, build-date=2025-07-21T14:45:33, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, version=17.1.9, 
com.redhat.component=openstack-ceilometer-compute-container, tcib_managed=true, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, distribution-scope=public, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, container_name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, managed_by=tripleo_ansible, batch=17.1_20250721.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, config_id=tripleo_step4, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1) Oct 14 04:50:28 localhost systemd[1]: def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e.service: Deactivated successfully. Oct 14 04:50:28 localhost podman[92461]: 2025-10-14 08:50:28.975886556 +0000 UTC m=+0.306129657 container exec_died d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, container_name=logrotate_crond, release=1, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, version=17.1.9, batch=17.1_20250721.1, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, com.redhat.component=openstack-cron-container, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, build-date=2025-07-21T13:07:52, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, io.buildah.version=1.33.12) Oct 14 04:50:28 localhost systemd[1]: d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f.service: Deactivated successfully. 
Oct 14 04:50:29 localhost podman[92460]: 2025-10-14 08:50:29.135177231 +0000 UTC m=+0.469238570 container exec_died 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, distribution-scope=public, release=1, io.openshift.tags=rhosp osp openstack osp-17.1, config_id=tripleo_step4, tcib_managed=true, container_name=nova_migration_target, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc., batch=17.1_20250721.1, version=17.1.9, build-date=2025-07-21T14:48:37, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat 
OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.33.12, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, architecture=x86_64, maintainer=OpenStack TripleO Team, vcs-type=git, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible) Oct 14 04:50:29 localhost systemd[1]: 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884.service: Deactivated successfully. Oct 14 04:50:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e. Oct 14 04:50:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5. Oct 14 04:50:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2. Oct 14 04:50:31 localhost podman[92552]: 2025-10-14 08:50:31.742551857 +0000 UTC m=+0.076504282 container health_status a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true, architecture=x86_64, version=17.1.9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., com.redhat.component=openstack-ovn-controller-container, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/agreements, release=1, batch=17.1_20250721.1, distribution-scope=public, config_id=tripleo_step4, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, summary=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-07-21T13:28:44, config_data={'depends_on': 
['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, name=rhosp17/openstack-ovn-controller, container_name=ovn_controller) Oct 14 04:50:31 localhost podman[92552]: 2025-10-14 08:50:31.767421415 +0000 UTC m=+0.101373870 container exec_died a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ovn-controller-container, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', 
'/var/log/containers/openvswitch:/var/log/ovn:z']}, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., managed_by=tripleo_ansible, name=rhosp17/openstack-ovn-controller, build-date=2025-07-21T13:28:44, io.openshift.tags=rhosp osp openstack osp-17.1, architecture=x86_64, version=17.1.9, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20250721.1, release=1, distribution-scope=public, config_id=tripleo_step4, io.openshift.expose-services=, tcib_managed=true, container_name=ovn_controller, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, io.buildah.version=1.33.12) Oct 14 04:50:31 localhost systemd[1]: a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5.service: Deactivated successfully. Oct 14 04:50:31 localhost podman[92553]: 2025-10-14 08:50:31.81036893 +0000 UTC m=+0.139350401 container health_status ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, managed_by=tripleo_ansible, container_name=nova_compute, tcib_managed=true, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, version=17.1.9, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, vendor=Red Hat, Inc., config_id=tripleo_step5, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, build-date=2025-07-21T14:48:37, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20250721.1, config_data={'depends_on': 
['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a-4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, release=1, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, name=rhosp17/openstack-nova-compute, architecture=x86_64) Oct 14 04:50:31 localhost systemd[1]: tmp-crun.7ZEHY8.mount: Deactivated successfully. 
Oct 14 04:50:31 localhost podman[92551]: 2025-10-14 08:50:31.851766508 +0000 UTC m=+0.187916910 container health_status 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, vendor=Red Hat, Inc., distribution-scope=public, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, batch=17.1_20250721.1, version=17.1.9, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.33.12, tcib_managed=true, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, container_name=ovn_metadata_agent, vcs-type=git, io.openshift.expose-services=, build-date=2025-07-21T16:28:53, architecture=x86_64, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, com.redhat.license_terms=https://www.redhat.com/agreements) Oct 14 04:50:31 localhost podman[92553]: 2025-10-14 08:50:31.869234506 +0000 UTC m=+0.198216017 container exec_died ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, vendor=Red Hat, Inc., container_name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20250721.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a-4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, release=1, name=rhosp17/openstack-nova-compute, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, managed_by=tripleo_ansible, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true, io.buildah.version=1.33.12, config_id=tripleo_step5, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-07-21T14:48:37, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, version=17.1.9) Oct 14 04:50:31 localhost systemd[1]: ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2.service: Deactivated successfully. 
Oct 14 04:50:31 localhost podman[92551]: 2025-10-14 08:50:31.893579868 +0000 UTC m=+0.229730210 container exec_died 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, container_name=ovn_metadata_agent, version=17.1.9, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, build-date=2025-07-21T16:28:53, release=1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, distribution-scope=public, batch=17.1_20250721.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., io.buildah.version=1.33.12, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=) Oct 14 04:50:31 localhost systemd[1]: 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e.service: Deactivated successfully. Oct 14 04:50:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6. 
Oct 14 04:50:37 localhost podman[92626]: 2025-10-14 08:50:37.758826512 +0000 UTC m=+0.091379830 container health_status 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, container_name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-07-21T13:07:59, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, name=rhosp17/openstack-qdrouterd, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, managed_by=tripleo_ansible, version=17.1.9, batch=17.1_20250721.1, maintainer=OpenStack TripleO Team, vcs-type=git, vendor=Red Hat, Inc., distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed) Oct 14 04:50:37 localhost podman[92626]: 2025-10-14 08:50:37.945976207 +0000 UTC m=+0.278529595 container exec_died 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, 
summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, version=17.1.9, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, name=rhosp17/openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, release=1, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, tcib_managed=true, vendor=Red Hat, Inc., container_name=metrics_qdr, batch=17.1_20250721.1, build-date=2025-07-21T13:07:59, distribution-scope=public) Oct 14 04:50:37 localhost systemd[1]: 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6.service: Deactivated successfully. Oct 14 04:50:50 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Oct 14 04:50:50 localhost recover_tripleo_nova_virtqemud[92656]: 62551 Oct 14 04:50:50 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Oct 14 04:50:50 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. Oct 14 04:50:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8. Oct 14 04:50:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea. Oct 14 04:50:53 localhost systemd[1]: tmp-crun.1zV7YA.mount: Deactivated successfully. 
Oct 14 04:50:53 localhost podman[92657]: 2025-10-14 08:50:53.744835873 +0000 UTC m=+0.087840882 container health_status 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, container_name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, maintainer=OpenStack TripleO Team, version=17.1.9, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', 
'/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.buildah.version=1.33.12, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc., release=2, build-date=2025-07-21T13:04:03, vcs-type=git, com.redhat.component=openstack-collectd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, distribution-scope=public, name=rhosp17/openstack-collectd, batch=17.1_20250721.1) Oct 14 04:50:53 localhost podman[92658]: 2025-10-14 08:50:53.781834315 +0000 UTC m=+0.121290214 container health_status 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, config_id=tripleo_step3, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, distribution-scope=public, io.openshift.expose-services=, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, version=17.1.9, build-date=2025-07-21T13:27:15, architecture=x86_64, vendor=Red Hat, Inc., vcs-type=git, description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, release=1, com.redhat.component=openstack-iscsid-container, name=rhosp17/openstack-iscsid) Oct 14 04:50:53 localhost podman[92657]: 2025-10-14 08:50:53.811885712 +0000 UTC m=+0.154890731 container exec_died 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, config_id=tripleo_step3, io.buildah.version=1.33.12, com.redhat.component=openstack-collectd-container, description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, summary=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20250721.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, version=17.1.9, vendor=Red Hat, Inc., build-date=2025-07-21T13:04:03, name=rhosp17/openstack-collectd, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': 
{'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, distribution-scope=public, release=2, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, container_name=collectd, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible) Oct 14 04:50:53 localhost systemd[1]: 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8.service: Deactivated successfully. 
Oct 14 04:50:53 localhost podman[92658]: 2025-10-14 08:50:53.86595293 +0000 UTC m=+0.205408859 container exec_died 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vcs-type=git, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.openshift.expose-services=, release=1, build-date=2025-07-21T13:27:15, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, version=17.1.9, batch=17.1_20250721.1, name=rhosp17/openstack-iscsid, io.buildah.version=1.33.12, architecture=x86_64, tcib_managed=true, container_name=iscsid, distribution-scope=public, io.openshift.tags=rhosp osp 
openstack osp-17.1, maintainer=OpenStack TripleO Team, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, com.redhat.component=openstack-iscsid-container, config_id=tripleo_step3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1) Oct 14 04:50:53 localhost systemd[1]: 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea.service: Deactivated successfully. Oct 14 04:50:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638. Oct 14 04:50:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884. Oct 14 04:50:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f. Oct 14 04:50:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e. Oct 14 04:50:59 localhost systemd[1]: tmp-crun.tO3YiR.mount: Deactivated successfully. 
Oct 14 04:50:59 localhost podman[92696]: 2025-10-14 08:50:59.739742938 +0000 UTC m=+0.076366838 container health_status 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/agreements, version=17.1.9, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, distribution-scope=public, com.redhat.component=openstack-ceilometer-ipmi-container, batch=17.1_20250721.1, vcs-type=git, release=1, vendor=Red Hat, Inc., container_name=ceilometer_agent_ipmi, io.openshift.expose-services=, build-date=2025-07-21T15:29:47, name=rhosp17/openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, io.buildah.version=1.33.12) Oct 14 04:50:59 localhost podman[92696]: 2025-10-14 08:50:59.767522115 +0000 UTC m=+0.104146015 container exec_died 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_ipmi, io.buildah.version=1.33.12, name=rhosp17/openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, 
version=17.1.9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, batch=17.1_20250721.1, vendor=Red Hat, Inc., io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, architecture=x86_64, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, release=1, tcib_managed=true, build-date=2025-07-21T15:29:47, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git) Oct 14 04:50:59 localhost systemd[1]: 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638.service: Deactivated successfully. Oct 14 04:50:59 localhost podman[92698]: 2025-10-14 08:50:59.819695154 +0000 UTC m=+0.148862704 container health_status d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, distribution-scope=public, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, batch=17.1_20250721.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, version=17.1.9, io.openshift.tags=rhosp osp openstack osp-17.1, architecture=x86_64, release=1, name=rhosp17/openstack-cron, config_id=tripleo_step4, vendor=Red Hat, Inc., io.openshift.expose-services=, io.buildah.version=1.33.12, tcib_managed=true, build-date=2025-07-21T13:07:52, com.redhat.component=openstack-cron-container, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, maintainer=OpenStack TripleO Team, config_data={'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron) Oct 14 04:50:59 localhost podman[92699]: 2025-10-14 08:50:59.822627225 +0000 UTC m=+0.148624327 container health_status def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, com.redhat.component=openstack-ceilometer-compute-container, name=rhosp17/openstack-ceilometer-compute, version=17.1.9, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true, vendor=Red Hat, Inc., config_id=tripleo_step4, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, maintainer=OpenStack TripleO Team, 
managed_by=tripleo_ansible, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, container_name=ceilometer_agent_compute, batch=17.1_20250721.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, build-date=2025-07-21T14:45:33, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1, io.buildah.version=1.33.12) Oct 14 04:50:59 localhost podman[92698]: 2025-10-14 08:50:59.827841976 +0000 UTC m=+0.157009526 container exec_died d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': 
'/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, version=17.1.9, config_id=tripleo_step4, distribution-scope=public, io.buildah.version=1.33.12, description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, release=1, summary=Red Hat OpenStack Platform 17.1 cron, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, io.openshift.tags=rhosp osp openstack osp-17.1, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, batch=17.1_20250721.1, build-date=2025-07-21T13:07:52, name=rhosp17/openstack-cron, tcib_managed=true, com.redhat.component=openstack-cron-container, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git) Oct 14 04:50:59 localhost systemd[1]: d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f.service: Deactivated successfully. 
Oct 14 04:50:59 localhost podman[92699]: 2025-10-14 08:50:59.853962612 +0000 UTC m=+0.179959754 container exec_died def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, build-date=2025-07-21T14:45:33, container_name=ceilometer_agent_compute, batch=17.1_20250721.1, distribution-scope=public, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-compute-container, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhosp17/openstack-ceilometer-compute, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git, maintainer=OpenStack TripleO Team, architecture=x86_64, version=17.1.9, io.openshift.tags=rhosp osp openstack osp-17.1, config_id=tripleo_step4, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3) Oct 14 04:50:59 localhost systemd[1]: def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e.service: Deactivated successfully. Oct 14 04:50:59 localhost podman[92697]: 2025-10-14 08:50:59.909711072 +0000 UTC m=+0.242289797 container health_status 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, architecture=x86_64, tcib_managed=true, vcs-type=git, version=17.1.9, io.openshift.tags=rhosp osp openstack osp-17.1, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1, distribution-scope=public, io.buildah.version=1.33.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc., io.openshift.expose-services=, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-07-21T14:48:37, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1) Oct 14 04:51:00 localhost podman[92697]: 2025-10-14 08:51:00.206295244 +0000 UTC m=+0.538873909 container exec_died 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.openshift.expose-services=, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, config_id=tripleo_step4, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, io.buildah.version=1.33.12, tcib_managed=true, name=rhosp17/openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-07-21T14:48:37, release=1, vendor=Red Hat, Inc., batch=17.1_20250721.1, io.openshift.tags=rhosp osp openstack osp-17.1, version=17.1.9, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, container_name=nova_migration_target, distribution-scope=public) Oct 14 04:51:00 localhost systemd[1]: 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884.service: Deactivated successfully. Oct 14 04:51:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e. Oct 14 04:51:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5. Oct 14 04:51:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2. 
Oct 14 04:51:02 localhost systemd[1]: tmp-crun.rCw5fY.mount: Deactivated successfully. Oct 14 04:51:02 localhost podman[92793]: 2025-10-14 08:51:02.742522705 +0000 UTC m=+0.076263965 container health_status 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., container_name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, architecture=x86_64, tcib_managed=true, version=17.1.9, io.buildah.version=1.33.12, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, distribution-scope=public, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, build-date=2025-07-21T16:28:53, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1, batch=17.1_20250721.1) Oct 14 04:51:02 localhost podman[92795]: 2025-10-14 08:51:02.762502321 +0000 UTC m=+0.089745710 container health_status ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, name=rhosp17/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1, config_id=tripleo_step5, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.buildah.version=1.33.12, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, container_name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, version=17.1.9, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, vcs-type=git, distribution-scope=public, build-date=2025-07-21T14:48:37, batch=17.1_20250721.1, 
vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a-4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, architecture=x86_64) Oct 14 04:51:02 localhost podman[92794]: 2025-10-14 08:51:02.806936442 +0000 UTC m=+0.140960201 container health_status 
a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, build-date=2025-07-21T13:28:44, config_id=tripleo_step4, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ovn-controller, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, tcib_managed=true, batch=17.1_20250721.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.9, container_name=ovn_controller, com.redhat.component=openstack-ovn-controller-container, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}) Oct 14 04:51:02 localhost podman[92793]: 2025-10-14 08:51:02.827067093 +0000 UTC m=+0.160808363 container exec_died 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e 
(image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, managed_by=tripleo_ansible, build-date=2025-07-21T16:28:53, distribution-scope=public, io.openshift.expose-services=, name=rhosp17/openstack-neutron-metadata-agent-ovn, architecture=x86_64, vendor=Red Hat, Inc., config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, container_name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, 
description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20250721.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, tcib_managed=true, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, version=17.1.9, io.buildah.version=1.33.12, vcs-type=git, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1) Oct 14 04:51:02 localhost systemd[1]: 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e.service: Deactivated successfully. Oct 14 04:51:02 localhost podman[92795]: 2025-10-14 08:51:02.847629007 +0000 UTC m=+0.174872386 container exec_died ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, name=rhosp17/openstack-nova-compute, version=17.1.9, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team, config_id=tripleo_step5, io.openshift.expose-services=, architecture=x86_64, batch=17.1_20250721.1, io.openshift.tags=rhosp osp openstack osp-17.1, distribution-scope=public, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a-4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 
'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2025-07-21T14:48:37, container_name=nova_compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.33.12) Oct 14 04:51:02 localhost systemd[1]: ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2.service: Deactivated successfully. 
Oct 14 04:51:02 localhost podman[92794]: 2025-10-14 08:51:02.90212719 +0000 UTC m=+0.236150879 container exec_died a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, container_name=ovn_controller, config_id=tripleo_step4, release=1, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, version=17.1.9, vcs-type=git, distribution-scope=public, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, batch=17.1_20250721.1, architecture=x86_64, build-date=2025-07-21T13:28:44, com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, vendor=Red Hat, Inc.) 
Oct 14 04:51:02 localhost systemd[1]: a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5.service: Deactivated successfully. Oct 14 04:51:03 localhost systemd[1]: tmp-crun.DFvvZT.mount: Deactivated successfully. Oct 14 04:51:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6. Oct 14 04:51:08 localhost podman[92866]: 2025-10-14 08:51:08.728100901 +0000 UTC m=+0.074438618 container health_status 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, build-date=2025-07-21T13:07:59, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, version=17.1.9, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, batch=17.1_20250721.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., architecture=x86_64, container_name=metrics_qdr, io.buildah.version=1.33.12, release=1) Oct 14 04:51:08 localhost podman[92866]: 2025-10-14 08:51:08.906408553 +0000 UTC m=+0.252746230 container exec_died 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, tcib_managed=true, io.buildah.version=1.33.12, release=1, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, vcs-type=git, vendor=Red Hat, Inc., build-date=2025-07-21T13:07:59, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1, container_name=metrics_qdr, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, io.openshift.expose-services=, architecture=x86_64, version=17.1.9, name=rhosp17/openstack-qdrouterd) Oct 14 04:51:08 localhost systemd[1]: 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6.service: Deactivated successfully. Oct 14 04:51:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8. Oct 14 04:51:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea. Oct 14 04:51:24 localhost systemd[1]: tmp-crun.RhCXGp.mount: Deactivated successfully. 
Oct 14 04:51:24 localhost podman[92974]: 2025-10-14 08:51:24.749420242 +0000 UTC m=+0.092346410 container health_status 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, name=rhosp17/openstack-iscsid, vcs-type=git, architecture=x86_64, com.redhat.component=openstack-iscsid-container, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, release=1, io.openshift.expose-services=, build-date=2025-07-21T13:27:15, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, io.buildah.version=1.33.12, version=17.1.9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO 
Team, summary=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., container_name=iscsid, com.redhat.license_terms=https://www.redhat.com/agreements, managed_by=tripleo_ansible, distribution-scope=public, tcib_managed=true, batch=17.1_20250721.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid) Oct 14 04:51:24 localhost podman[92974]: 2025-10-14 08:51:24.761335831 +0000 UTC m=+0.104261969 container exec_died 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, managed_by=tripleo_ansible, build-date=2025-07-21T13:27:15, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, vendor=Red Hat, Inc., tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', 
'/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.tags=rhosp osp openstack osp-17.1, batch=17.1_20250721.1, name=rhosp17/openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/agreements, release=1, com.redhat.component=openstack-iscsid-container, maintainer=OpenStack TripleO Team, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, version=17.1.9, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3) Oct 14 04:51:24 localhost systemd[1]: 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea.service: Deactivated successfully. Oct 14 04:51:24 localhost podman[92973]: 2025-10-14 08:51:24.856726294 +0000 UTC m=+0.198649171 container health_status 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, batch=17.1_20250721.1, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, build-date=2025-07-21T13:04:03, tcib_managed=true, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.expose-services=, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, version=17.1.9, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, release=2, name=rhosp17/openstack-collectd, container_name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, config_id=tripleo_step3, distribution-scope=public, com.redhat.component=openstack-collectd-container, io.openshift.tags=rhosp osp openstack osp-17.1) Oct 14 04:51:24 localhost podman[92973]: 2025-10-14 08:51:24.870082985 +0000 UTC m=+0.212005882 container exec_died 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, com.redhat.component=openstack-collectd-container, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, container_name=collectd, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, architecture=x86_64, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-07-21T13:04:03, batch=17.1_20250721.1, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, io.buildah.version=1.33.12, name=rhosp17/openstack-collectd, vendor=Red Hat, Inc., managed_by=tripleo_ansible, version=17.1.9, vcs-type=git, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, release=2, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd) Oct 14 04:51:24 localhost systemd[1]: 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8.service: Deactivated successfully. 
Oct 14 04:51:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638. Oct 14 04:51:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884. Oct 14 04:51:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f. Oct 14 04:51:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e. Oct 14 04:51:30 localhost podman[93013]: 2025-10-14 08:51:30.738947542 +0000 UTC m=+0.075818321 container health_status 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, com.redhat.component=openstack-ceilometer-ipmi-container, io.buildah.version=1.33.12, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1, batch=17.1_20250721.1, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, name=rhosp17/openstack-ceilometer-ipmi, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, com.redhat.license_terms=https://www.redhat.com/agreements, managed_by=tripleo_ansible, vcs-type=git, build-date=2025-07-21T15:29:47, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, version=17.1.9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': 
{'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=) Oct 14 04:51:30 localhost systemd[1]: tmp-crun.m7JiI4.mount: Deactivated successfully. Oct 14 04:51:30 localhost podman[93017]: 2025-10-14 08:51:30.793225247 +0000 UTC m=+0.121930644 container health_status def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, vendor=Red Hat, Inc., name=rhosp17/openstack-ceilometer-compute, distribution-scope=public, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, vcs-type=git, tcib_managed=true, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': 
False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, com.redhat.component=openstack-ceilometer-compute-container, build-date=2025-07-21T14:45:33, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, io.openshift.tags=rhosp osp openstack osp-17.1, version=17.1.9, container_name=ceilometer_agent_compute, maintainer=OpenStack TripleO Team, architecture=x86_64, batch=17.1_20250721.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.33.12) Oct 14 04:51:30 localhost podman[93014]: 2025-10-14 08:51:30.798115768 +0000 UTC m=+0.129495637 container health_status 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 
'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, architecture=x86_64, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.9, batch=17.1_20250721.1, vcs-type=git, name=rhosp17/openstack-nova-compute, release=1, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, build-date=2025-07-21T14:48:37, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, distribution-scope=public, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute) Oct 14 04:51:30 localhost podman[93015]: 2025-10-14 08:51:30.847754189 +0000 UTC m=+0.180438088 container health_status 
d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, com.redhat.component=openstack-cron-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, config_id=tripleo_step4, release=1, distribution-scope=public, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, build-date=2025-07-21T13:07:52, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, container_name=logrotate_crond, com.redhat.license_terms=https://www.redhat.com/agreements, managed_by=tripleo_ansible, batch=17.1_20250721.1, tcib_managed=true, name=rhosp17/openstack-cron, version=17.1.9, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, summary=Red Hat OpenStack Platform 17.1 cron, description=Red 
Hat OpenStack Platform 17.1 cron, architecture=x86_64, maintainer=OpenStack TripleO Team, io.buildah.version=1.33.12, vendor=Red Hat, Inc.) Oct 14 04:51:30 localhost podman[93017]: 2025-10-14 08:51:30.852997101 +0000 UTC m=+0.181702448 container exec_died def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, batch=17.1_20250721.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_compute, vcs-type=git, release=1, build-date=2025-07-21T14:45:33, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1, version=17.1.9, io.buildah.version=1.33.12, distribution-scope=public, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}) Oct 14 04:51:30 localhost systemd[1]: def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e.service: Deactivated successfully. Oct 14 04:51:30 localhost podman[93013]: 2025-10-14 08:51:30.867613712 +0000 UTC m=+0.204484321 container exec_died 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, batch=17.1_20250721.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, release=1, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, tcib_managed=true, io.buildah.version=1.33.12, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, vcs-type=git, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, version=17.1.9, architecture=x86_64, build-date=2025-07-21T15:29:47, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, vendor=Red Hat, Inc.) Oct 14 04:51:30 localhost systemd[1]: 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638.service: Deactivated successfully. 
Oct 14 04:51:30 localhost podman[93015]: 2025-10-14 08:51:30.883921145 +0000 UTC m=+0.216605034 container exec_died d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, version=17.1.9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, managed_by=tripleo_ansible, io.openshift.expose-services=, release=1, vendor=Red Hat, Inc., distribution-scope=public, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, build-date=2025-07-21T13:07:52, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, batch=17.1_20250721.1, com.redhat.component=openstack-cron-container, name=rhosp17/openstack-cron, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, config_id=tripleo_step4, architecture=x86_64, container_name=logrotate_crond) Oct 14 04:51:30 localhost systemd[1]: d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f.service: Deactivated successfully. Oct 14 04:51:31 localhost podman[93014]: 2025-10-14 08:51:31.150118499 +0000 UTC m=+0.481498388 container exec_died 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.buildah.version=1.33.12, distribution-scope=public, maintainer=OpenStack TripleO Team, batch=17.1_20250721.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, 
architecture=x86_64, name=rhosp17/openstack-nova-compute, managed_by=tripleo_ansible, version=17.1.9, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-07-21T14:48:37, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc., vcs-type=git, container_name=nova_migration_target, release=1, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=) Oct 14 04:51:31 localhost systemd[1]: 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884.service: Deactivated successfully. Oct 14 04:51:31 localhost systemd[1]: tmp-crun.bC78wd.mount: Deactivated successfully. Oct 14 04:51:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e. Oct 14 04:51:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5. Oct 14 04:51:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2. Oct 14 04:51:33 localhost systemd[1]: tmp-crun.we7cnL.mount: Deactivated successfully. 
Oct 14 04:51:33 localhost podman[93103]: 2025-10-14 08:51:33.739312874 +0000 UTC m=+0.078495903 container health_status 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, version=17.1.9, distribution-scope=public, name=rhosp17/openstack-neutron-metadata-agent-ovn, architecture=x86_64, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', 
'/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, config_id=tripleo_step4, container_name=ovn_metadata_agent, release=1, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, vcs-type=git, batch=17.1_20250721.1, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, io.buildah.version=1.33.12, build-date=2025-07-21T16:28:53) Oct 14 04:51:33 localhost podman[93109]: 2025-10-14 08:51:33.751093098 +0000 UTC m=+0.078024629 container health_status ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, com.redhat.component=openstack-nova-compute-container, name=rhosp17/openstack-nova-compute, release=1, config_id=tripleo_step5, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a-4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, version=17.1.9, architecture=x86_64, build-date=2025-07-21T14:48:37, maintainer=OpenStack TripleO Team, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=nova_compute, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute) Oct 14 04:51:33 localhost podman[93104]: 2025-10-14 08:51:33.78585383 +0000 UTC m=+0.118188187 container health_status a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 
(image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, build-date=2025-07-21T13:28:44, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, com.redhat.component=openstack-ovn-controller-container, container_name=ovn_controller, vcs-type=git, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, batch=17.1_20250721.1, io.openshift.expose-services=, vendor=Red Hat, Inc., release=1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, tcib_managed=true, version=17.1.9, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1, config_id=tripleo_step4, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, com.redhat.license_terms=https://www.redhat.com/agreements) Oct 14 04:51:33 localhost podman[93103]: 2025-10-14 08:51:33.805015022 +0000 UTC m=+0.144198011 container exec_died 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, 
name=ovn_metadata_agent, vcs-type=git, version=17.1.9, config_id=tripleo_step4, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20250721.1, distribution-scope=public, build-date=2025-07-21T16:28:53, release=1, tcib_managed=true, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', 
'/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.expose-services=, container_name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3) Oct 14 04:51:33 localhost podman[93109]: 2025-10-14 08:51:33.807086856 +0000 UTC m=+0.134018387 container exec_died ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, architecture=x86_64, tcib_managed=true, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a-4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.9, io.openshift.expose-services=, release=1, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, build-date=2025-07-21T14:48:37, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1, config_id=tripleo_step5, io.buildah.version=1.33.12, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, vcs-type=git, name=rhosp17/openstack-nova-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1) Oct 14 04:51:33 localhost systemd[1]: 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e.service: Deactivated successfully. 
Oct 14 04:51:33 localhost podman[93104]: 2025-10-14 08:51:33.858325237 +0000 UTC m=+0.190659634 container exec_died a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, container_name=ovn_controller, distribution-scope=public, release=1, architecture=x86_64, version=17.1.9, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, io.openshift.expose-services=, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, name=rhosp17/openstack-ovn-controller, vendor=Red Hat, Inc., build-date=2025-07-21T13:28:44, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/agreements, batch=17.1_20250721.1, config_id=tripleo_step4, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1) Oct 14 04:51:33 localhost systemd[1]: 
a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5.service: Deactivated successfully. Oct 14 04:51:33 localhost systemd[1]: ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2.service: Deactivated successfully. Oct 14 04:51:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6. Oct 14 04:51:39 localhost podman[93174]: 2025-10-14 08:51:39.726755291 +0000 UTC m=+0.064712548 container health_status 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, vcs-type=git, version=17.1.9, vendor=Red Hat, Inc., distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, name=rhosp17/openstack-qdrouterd, io.buildah.version=1.33.12, build-date=2025-07-21T13:07:59, managed_by=tripleo_ansible, release=1, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed) Oct 14 04:51:39 localhost podman[93174]: 2025-10-14 08:51:39.897385096 +0000 UTC m=+0.235342413 container exec_died 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, io.buildah.version=1.33.12, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, vendor=Red Hat, Inc., distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-07-21T13:07:59, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, release=1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, io.openshift.expose-services=, name=rhosp17/openstack-qdrouterd, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, vcs-type=git, version=17.1.9, tcib_managed=true, container_name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team) Oct 14 04:51:39 localhost systemd[1]: 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6.service: Deactivated successfully. Oct 14 04:51:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8. Oct 14 04:51:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea. Oct 14 04:51:55 localhost systemd[1]: tmp-crun.IPD0OT.mount: Deactivated successfully. 
Oct 14 04:51:55 localhost podman[93204]: 2025-10-14 08:51:55.735055131 +0000 UTC m=+0.073931132 container health_status 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, batch=17.1_20250721.1, distribution-scope=public, build-date=2025-07-21T13:27:15, com.redhat.license_terms=https://www.redhat.com/agreements, release=1, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, vcs-type=git, architecture=x86_64, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, io.openshift.expose-services=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, 
vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, version=17.1.9, com.redhat.component=openstack-iscsid-container, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., io.buildah.version=1.33.12) Oct 14 04:51:55 localhost podman[93204]: 2025-10-14 08:51:55.747070302 +0000 UTC m=+0.085946383 container exec_died 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, version=17.1.9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, release=1, name=rhosp17/openstack-iscsid, distribution-scope=public, vcs-type=git, config_id=tripleo_step3, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T13:27:15, tcib_managed=true, architecture=x86_64, com.redhat.component=openstack-iscsid-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible) Oct 14 04:51:55 localhost systemd[1]: 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea.service: Deactivated successfully. Oct 14 04:51:55 localhost podman[93203]: 2025-10-14 08:51:55.785011923 +0000 UTC m=+0.123692658 container health_status 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, build-date=2025-07-21T13:04:03, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, maintainer=OpenStack TripleO Team, container_name=collectd, batch=17.1_20250721.1, io.openshift.expose-services=, release=2, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, config_id=tripleo_step3, com.redhat.component=openstack-collectd-container, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, version=17.1.9, managed_by=tripleo_ansible, tcib_managed=true, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, distribution-scope=public, io.buildah.version=1.33.12, vendor=Red Hat, Inc., architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1) Oct 14 04:51:55 localhost podman[93203]: 2025-10-14 08:51:55.791434631 +0000 UTC m=+0.130115356 container exec_died 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, version=17.1.9, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, io.buildah.version=1.33.12, vendor=Red Hat, Inc., vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, architecture=x86_64, vcs-type=git, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, config_id=tripleo_step3, container_name=collectd, name=rhosp17/openstack-collectd, distribution-scope=public, release=2, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, batch=17.1_20250721.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-07-21T13:04:03, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2) Oct 14 04:51:55 localhost systemd[1]: 
0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8.service: Deactivated successfully. Oct 14 04:52:00 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Oct 14 04:52:00 localhost recover_tripleo_nova_virtqemud[93242]: 62551 Oct 14 04:52:00 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Oct 14 04:52:00 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. Oct 14 04:52:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638. Oct 14 04:52:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884. Oct 14 04:52:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f. Oct 14 04:52:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e. Oct 14 04:52:01 localhost systemd[1]: tmp-crun.IyUfHj.mount: Deactivated successfully. 
Oct 14 04:52:01 localhost podman[93244]: 2025-10-14 08:52:01.739021046 +0000 UTC m=+0.071562519 container health_status 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, architecture=x86_64, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, vcs-type=git, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1, container_name=nova_migration_target, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp17/openstack-nova-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, vendor=Red Hat, Inc., 
build-date=2025-07-21T14:48:37, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, release=1, managed_by=tripleo_ansible, version=17.1.9, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute) Oct 14 04:52:01 localhost systemd[1]: tmp-crun.iKTLoc.mount: Deactivated successfully. Oct 14 04:52:01 localhost podman[93243]: 2025-10-14 08:52:01.79164609 +0000 UTC m=+0.126345340 container health_status 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, version=17.1.9, architecture=x86_64, name=rhosp17/openstack-ceilometer-ipmi, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, release=1, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20250721.1, managed_by=tripleo_ansible, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T15:29:47, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.33.12, io.openshift.expose-services=, tcib_managed=true, container_name=ceilometer_agent_ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, distribution-scope=public) Oct 14 04:52:01 localhost podman[93245]: 2025-10-14 08:52:01.824579786 +0000 UTC m=+0.147789311 container health_status d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, name=rhosp17/openstack-cron, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, maintainer=OpenStack TripleO Team, release=1, com.redhat.license_terms=https://www.redhat.com/agreements, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, version=17.1.9, com.redhat.component=openstack-cron-container, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true, build-date=2025-07-21T13:07:52, batch=17.1_20250721.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 
'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, container_name=logrotate_crond, io.buildah.version=1.33.12, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, vcs-type=git) Oct 14 04:52:01 localhost podman[93253]: 2025-10-14 08:52:01.848882107 +0000 UTC m=+0.171375980 container health_status def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, managed_by=tripleo_ansible, build-date=2025-07-21T14:45:33, config_id=tripleo_step4, container_name=ceilometer_agent_compute, release=1, io.openshift.expose-services=, io.buildah.version=1.33.12, name=rhosp17/openstack-ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 
'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, batch=17.1_20250721.1, version=17.1.9, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, com.redhat.component=openstack-ceilometer-compute-container, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute) Oct 14 04:52:01 localhost podman[93243]: 2025-10-14 08:52:01.8781601 +0000 UTC m=+0.212859360 container exec_died 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, architecture=x86_64, 
distribution-scope=public, io.buildah.version=1.33.12, tcib_managed=true, container_name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-ipmi, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.9, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, batch=17.1_20250721.1, build-date=2025-07-21T15:29:47, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step4) Oct 14 04:52:01 localhost systemd[1]: 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638.service: Deactivated 
successfully. Oct 14 04:52:01 localhost podman[93245]: 2025-10-14 08:52:01.903260284 +0000 UTC m=+0.226469789 container exec_died d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.component=openstack-cron-container, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, build-date=2025-07-21T13:07:52, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, vcs-type=git, container_name=logrotate_crond, maintainer=OpenStack TripleO Team, version=17.1.9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, config_id=tripleo_step4, batch=17.1_20250721.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 cron, 
distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, architecture=x86_64, io.buildah.version=1.33.12, managed_by=tripleo_ansible, name=rhosp17/openstack-cron, release=1) Oct 14 04:52:01 localhost systemd[1]: d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f.service: Deactivated successfully. Oct 14 04:52:01 localhost podman[93253]: 2025-10-14 08:52:01.931776934 +0000 UTC m=+0.254270787 container exec_died def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1, container_name=ceilometer_agent_compute, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20250721.1, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, distribution-scope=public, com.redhat.component=openstack-ceilometer-compute-container, config_id=tripleo_step4, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, tcib_managed=true, version=17.1.9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1, build-date=2025-07-21T14:45:33, name=rhosp17/openstack-ceilometer-compute, io.buildah.version=1.33.12) Oct 14 04:52:01 localhost systemd[1]: def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e.service: Deactivated successfully. Oct 14 04:52:02 localhost podman[93244]: 2025-10-14 08:52:02.083085933 +0000 UTC m=+0.415627396 container exec_died 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, container_name=nova_migration_target, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, build-date=2025-07-21T14:48:37, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 
'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, vcs-type=git, config_id=tripleo_step4, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, architecture=x86_64, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, managed_by=tripleo_ansible, batch=17.1_20250721.1, distribution-scope=public) Oct 14 04:52:02 localhost systemd[1]: 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884.service: Deactivated successfully. Oct 14 04:52:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e. Oct 14 04:52:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5. Oct 14 04:52:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2. 
Oct 14 04:52:04 localhost podman[93337]: 2025-10-14 08:52:04.762624316 +0000 UTC m=+0.097532421 container health_status 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, vcs-type=git, distribution-scope=public, managed_by=tripleo_ansible, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, tcib_managed=true, build-date=2025-07-21T16:28:53, batch=17.1_20250721.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, architecture=x86_64, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, io.openshift.expose-services=, config_id=tripleo_step4, container_name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1) Oct 14 04:52:04 localhost systemd[1]: tmp-crun.1oGybb.mount: Deactivated successfully. Oct 14 04:52:04 localhost podman[93339]: 2025-10-14 08:52:04.818242932 +0000 UTC m=+0.148647139 container health_status ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, distribution-scope=public, architecture=x86_64, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, build-date=2025-07-21T14:48:37, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, container_name=nova_compute, version=17.1.9, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, name=rhosp17/openstack-nova-compute, io.buildah.version=1.33.12, io.openshift.expose-services=, vendor=Red Hat, Inc., release=1, description=Red Hat OpenStack Platform 17.1 nova-compute, 
vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, batch=17.1_20250721.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a-4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git) Oct 14 04:52:04 localhost podman[93337]: 2025-10-14 08:52:04.842575852 +0000 UTC m=+0.177483937 container exec_died 
7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, tcib_managed=true, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, release=1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 
neutron-metadata-agent-ovn, io.openshift.expose-services=, name=rhosp17/openstack-neutron-metadata-agent-ovn, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.9, distribution-scope=public, io.buildah.version=1.33.12, managed_by=tripleo_ansible, build-date=2025-07-21T16:28:53, config_id=tripleo_step4, vendor=Red Hat, Inc., vcs-type=git, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, io.openshift.tags=rhosp osp openstack osp-17.1) Oct 14 04:52:04 localhost podman[93339]: 2025-10-14 08:52:04.846164223 +0000 UTC m=+0.176568420 container exec_died ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, release=1, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a-4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., distribution-scope=public, io.buildah.version=1.33.12, build-date=2025-07-21T14:48:37, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20250721.1, config_id=tripleo_step5, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1, architecture=x86_64, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, version=17.1.9) Oct 14 04:52:04 localhost systemd[1]: 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e.service: Deactivated successfully. 
Oct 14 04:52:04 localhost podman[93338]: 2025-10-14 08:52:04.864631903 +0000 UTC m=+0.198251409 container health_status a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, managed_by=tripleo_ansible, container_name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.9, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, release=1, name=rhosp17/openstack-ovn-controller, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., architecture=x86_64, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, vcs-type=git, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, batch=17.1_20250721.1, build-date=2025-07-21T13:28:44, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1) Oct 14 04:52:04 localhost podman[93338]: 2025-10-14 08:52:04.883442684 +0000 
UTC m=+0.217062190 container exec_died a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, container_name=ovn_controller, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.expose-services=, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhosp17/openstack-ovn-controller, io.buildah.version=1.33.12, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, com.redhat.component=openstack-ovn-controller-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-07-21T13:28:44, architecture=x86_64, tcib_managed=true, distribution-scope=public, batch=17.1_20250721.1, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, config_id=tripleo_step4, version=17.1.9) Oct 14 04:52:04 localhost podman[93338]: unhealthy Oct 14 04:52:04 localhost systemd[1]: a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5.service: Main process 
exited, code=exited, status=1/FAILURE Oct 14 04:52:04 localhost systemd[1]: a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5.service: Failed with result 'exit-code'. Oct 14 04:52:04 localhost systemd[1]: ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2.service: Deactivated successfully. Oct 14 04:52:05 localhost systemd[1]: tmp-crun.83N0Q4.mount: Deactivated successfully. Oct 14 04:52:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6. Oct 14 04:52:10 localhost podman[93411]: 2025-10-14 08:52:10.731002772 +0000 UTC m=+0.073584891 container health_status 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, name=rhosp17/openstack-qdrouterd, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-07-21T13:07:59, vendor=Red Hat, Inc., batch=17.1_20250721.1, release=1, tcib_managed=true, version=17.1.9, managed_by=tripleo_ansible, com.redhat.component=openstack-qdrouterd-container, io.openshift.tags=rhosp osp openstack osp-17.1, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, container_name=metrics_qdr, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, vcs-type=git, distribution-scope=public, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd) Oct 14 04:52:10 localhost podman[93411]: 2025-10-14 08:52:10.942105797 +0000 UTC m=+0.284687926 container exec_died 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, com.redhat.component=openstack-qdrouterd-container, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, version=17.1.9, managed_by=tripleo_ansible, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, build-date=2025-07-21T13:07:59, tcib_managed=true, name=rhosp17/openstack-qdrouterd, release=1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red 
Hat, Inc., batch=17.1_20250721.1, config_id=tripleo_step1, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.expose-services=) Oct 14 04:52:10 localhost systemd[1]: 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6.service: Deactivated successfully. Oct 14 04:52:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8. Oct 14 04:52:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea. Oct 14 04:52:26 localhost systemd[1]: tmp-crun.UXqxQt.mount: Deactivated successfully. 
Oct 14 04:52:26 localhost podman[93553]: 2025-10-14 08:52:26.776650045 +0000 UTC m=+0.100028748 container health_status 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, com.redhat.component=openstack-iscsid-container, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, tcib_managed=true, config_id=tripleo_step3, distribution-scope=public, io.openshift.expose-services=, architecture=x86_64, build-date=2025-07-21T13:27:15, release=1, description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.33.12, name=rhosp17/openstack-iscsid, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, batch=17.1_20250721.1, managed_by=tripleo_ansible, version=17.1.9, maintainer=OpenStack TripleO Team) Oct 14 04:52:26 localhost podman[93553]: 2025-10-14 08:52:26.813477031 +0000 UTC m=+0.136855574 container exec_died 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step3, release=1, summary=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20250721.1, com.redhat.component=openstack-iscsid-container, tcib_managed=true, vendor=Red Hat, Inc., version=17.1.9, io.openshift.expose-services=, architecture=x86_64, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, managed_by=tripleo_ansible, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, build-date=2025-07-21T13:27:15, container_name=iscsid, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2) Oct 14 04:52:26 localhost systemd[1]: 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea.service: Deactivated successfully. Oct 14 04:52:26 localhost podman[93552]: 2025-10-14 08:52:26.874097732 +0000 UTC m=+0.202302064 container health_status 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, distribution-scope=public, architecture=x86_64, release=2, tcib_managed=true, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., com.redhat.component=openstack-collectd-container, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, version=17.1.9, summary=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, io.buildah.version=1.33.12, build-date=2025-07-21T13:04:03, container_name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=) Oct 14 04:52:26 localhost podman[93552]: 2025-10-14 08:52:26.914065095 +0000 UTC m=+0.242269387 container exec_died 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, architecture=x86_64, release=2, description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, maintainer=OpenStack TripleO Team, 
com.redhat.license_terms=https://www.redhat.com/agreements, version=17.1.9, build-date=2025-07-21T13:04:03, vcs-type=git, distribution-scope=public, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, io.buildah.version=1.33.12, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, vendor=Red Hat, Inc., config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, com.redhat.component=openstack-collectd-container, batch=17.1_20250721.1, tcib_managed=true) Oct 14 04:52:26 localhost systemd[1]: 
0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8.service: Deactivated successfully. Oct 14 04:52:31 localhost sshd[93607]: main: sshd: ssh-rsa algorithm is disabled Oct 14 04:52:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638. Oct 14 04:52:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884. Oct 14 04:52:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f. Oct 14 04:52:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e. Oct 14 04:52:32 localhost systemd[1]: tmp-crun.dJiPKR.mount: Deactivated successfully. Oct 14 04:52:32 localhost podman[93609]: 2025-10-14 08:52:32.771019983 +0000 UTC m=+0.108979083 container health_status 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, com.redhat.component=openstack-ceilometer-ipmi-container, release=1, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.33.12, batch=17.1_20250721.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-07-21T15:29:47, architecture=x86_64, container_name=ceilometer_agent_ipmi, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, config_data={'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., version=17.1.9, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1) Oct 14 04:52:32 localhost podman[93610]: 2025-10-14 08:52:32.801201534 +0000 UTC m=+0.138328219 container health_status 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, build-date=2025-07-21T14:48:37, container_name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 
'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.9, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20250721.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, name=rhosp17/openstack-nova-compute, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, vcs-type=git, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, tcib_managed=true, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d) Oct 14 04:52:32 localhost podman[93611]: 2025-10-14 08:52:32.813872175 +0000 UTC m=+0.147758460 container health_status d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, vendor=Red Hat, Inc., batch=17.1_20250721.1, container_name=logrotate_crond, architecture=x86_64, config_id=tripleo_step4, io.openshift.expose-services=, 
version=17.1.9, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, release=1, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, io.buildah.version=1.33.12, build-date=2025-07-21T13:07:52, com.redhat.license_terms=https://www.redhat.com/agreements, tcib_managed=true, vcs-type=git, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, com.redhat.component=openstack-cron-container, summary=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, name=rhosp17/openstack-cron) Oct 14 04:52:32 localhost podman[93611]: 2025-10-14 08:52:32.845491701 +0000 UTC m=+0.179377956 container exec_died d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f 
(image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, release=1, build-date=2025-07-21T13:07:52, name=rhosp17/openstack-cron, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, container_name=logrotate_crond, managed_by=tripleo_ansible, version=17.1.9, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step4, tcib_managed=true, com.redhat.component=openstack-cron-container, description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, batch=17.1_20250721.1, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, 
distribution-scope=public, io.openshift.expose-services=) Oct 14 04:52:32 localhost podman[93609]: 2025-10-14 08:52:32.85905568 +0000 UTC m=+0.197014760 container exec_died 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, build-date=2025-07-21T15:29:47, version=17.1.9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.33.12, name=rhosp17/openstack-ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, batch=17.1_20250721.1, vendor=Red Hat, Inc., config_id=tripleo_step4, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 
ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/agreements, managed_by=tripleo_ansible, release=1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, vcs-type=git) Oct 14 04:52:32 localhost systemd[1]: d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f.service: Deactivated successfully. Oct 14 04:52:32 localhost systemd[1]: 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638.service: Deactivated successfully. Oct 14 04:52:32 localhost podman[93612]: 2025-10-14 08:52:32.967210697 +0000 UTC m=+0.294573180 container health_status def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, architecture=x86_64, name=rhosp17/openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20250721.1, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git, container_name=ceilometer_agent_compute, release=1, managed_by=tripleo_ansible, config_id=tripleo_step4, io.buildah.version=1.33.12, tcib_managed=true, build-date=2025-07-21T14:45:33, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, version=17.1.9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute) Oct 14 04:52:33 localhost podman[93612]: 2025-10-14 08:52:33.013204246 +0000 UTC m=+0.340566679 container exec_died def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp17/openstack-ceilometer-compute, container_name=ceilometer_agent_compute, distribution-scope=public, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, batch=17.1_20250721.1, config_id=tripleo_step4, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-compute-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, build-date=2025-07-21T14:45:33, vcs-type=git, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, version=17.1.9, release=1) Oct 14 04:52:33 localhost systemd[1]: def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e.service: Deactivated successfully. 
Oct 14 04:52:33 localhost podman[93610]: 2025-10-14 08:52:33.213069603 +0000 UTC m=+0.550196258 container exec_died 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, version=17.1.9, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, release=1, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, 
name=rhosp17/openstack-nova-compute, build-date=2025-07-21T14:48:37, container_name=nova_migration_target, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, tcib_managed=true, batch=17.1_20250721.1, distribution-scope=public, io.buildah.version=1.33.12) Oct 14 04:52:33 localhost systemd[1]: 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884.service: Deactivated successfully. Oct 14 04:52:33 localhost systemd[1]: tmp-crun.CVfdSc.mount: Deactivated successfully. Oct 14 04:52:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e. Oct 14 04:52:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5. Oct 14 04:52:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2. Oct 14 04:52:35 localhost systemd[1]: tmp-crun.d62G8M.mount: Deactivated successfully. 
Oct 14 04:52:35 localhost podman[93703]: 2025-10-14 08:52:35.722531889 +0000 UTC m=+0.068608188 container health_status 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, config_id=tripleo_step4, version=17.1.9, container_name=ovn_metadata_agent, release=1, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., batch=17.1_20250721.1, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, name=rhosp17/openstack-neutron-metadata-agent-ovn, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T16:28:53, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, tcib_managed=true, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, io.openshift.expose-services=) Oct 14 04:52:35 localhost podman[93705]: 2025-10-14 08:52:35.791710374 +0000 UTC m=+0.131630663 container health_status ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, config_id=tripleo_step5, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, build-date=2025-07-21T14:48:37, container_name=nova_compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a-4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 
'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, managed_by=tripleo_ansible, batch=17.1_20250721.1, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, release=1, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., version=17.1.9, architecture=x86_64) Oct 14 04:52:35 localhost podman[93705]: 2025-10-14 08:52:35.818112248 +0000 UTC m=+0.158032537 container exec_died 
ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc., version=17.1.9, release=1, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, com.redhat.license_terms=https://www.redhat.com/agreements, managed_by=tripleo_ansible, build-date=2025-07-21T14:48:37, distribution-scope=public, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a-4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, batch=17.1_20250721.1, com.redhat.component=openstack-nova-compute-container, name=rhosp17/openstack-nova-compute) Oct 14 04:52:35 localhost systemd[1]: ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2.service: Deactivated successfully. 
Oct 14 04:52:35 localhost podman[93704]: 2025-10-14 08:52:35.838083645 +0000 UTC m=+0.179754788 container health_status a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, config_id=tripleo_step4, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-07-21T13:28:44, name=rhosp17/openstack-ovn-controller, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, architecture=x86_64, tcib_managed=true, container_name=ovn_controller, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, vcs-type=git, version=17.1.9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, release=1) Oct 14 04:52:35 localhost podman[93703]: 2025-10-14 08:52:35.842503281 +0000 
UTC m=+0.188579560 container exec_died 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., build-date=2025-07-21T16:28:53, tcib_managed=true, managed_by=tripleo_ansible, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1, version=17.1.9, batch=17.1_20250721.1, config_id=tripleo_step4, container_name=ovn_metadata_agent, 
io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, vcs-type=git, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, name=rhosp17/openstack-neutron-metadata-agent-ovn, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, architecture=x86_64, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3) Oct 14 04:52:35 localhost systemd[1]: 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e.service: Deactivated successfully. Oct 14 04:52:35 localhost podman[93704]: 2025-10-14 08:52:35.862138097 +0000 UTC m=+0.203809230 container exec_died a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, release=1, name=rhosp17/openstack-ovn-controller, distribution-scope=public, config_id=tripleo_step4, managed_by=tripleo_ansible, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1, version=17.1.9, vendor=Red Hat, Inc., build-date=2025-07-21T13:28:44, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', 
'/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.33.12, com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, batch=17.1_20250721.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller) Oct 14 04:52:35 localhost podman[93704]: unhealthy Oct 14 04:52:35 localhost systemd[1]: a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5.service: Main process exited, code=exited, status=1/FAILURE Oct 14 04:52:35 localhost systemd[1]: a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5.service: Failed with result 'exit-code'. Oct 14 04:52:36 localhost systemd[1]: tmp-crun.1pod6y.mount: Deactivated successfully. 
Oct 14 04:52:41 localhost ceph-osd[31500]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Oct 14 04:52:41 localhost ceph-osd[31500]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 3600.1 total, 600.0 interval#012Cumulative writes: 4952 writes, 22K keys, 4952 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s#012Cumulative WAL: 4952 writes, 645 syncs, 7.68 writes per sync, written: 0.02 GB, 0.01 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 4 writes, 22 keys, 4 commit groups, 1.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s#012Interval WAL: 4 writes, 2 syncs, 2.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent Oct 14 04:52:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6. Oct 14 04:52:41 localhost systemd[1]: tmp-crun.58DovD.mount: Deactivated successfully. 
Oct 14 04:52:41 localhost podman[93781]: 2025-10-14 08:52:41.744572453 +0000 UTC m=+0.083062395 container health_status 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, distribution-scope=public, maintainer=OpenStack TripleO Team, build-date=2025-07-21T13:07:59, com.redhat.component=openstack-qdrouterd-container, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhosp17/openstack-qdrouterd, release=1, config_id=tripleo_step1, container_name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', 
'/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., version=17.1.9, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, batch=17.1_20250721.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd) Oct 14 04:52:41 localhost podman[93781]: 2025-10-14 08:52:41.981926047 +0000 UTC m=+0.320415929 container exec_died 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack 
Platform 17.1 qdrouterd, release=1, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, io.openshift.expose-services=, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, architecture=x86_64, batch=17.1_20250721.1, vendor=Red Hat, Inc., build-date=2025-07-21T13:07:59, io.buildah.version=1.33.12, description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.9, config_id=tripleo_step1, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=metrics_qdr) Oct 14 04:52:41 localhost systemd[1]: 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6.service: Deactivated successfully. Oct 14 04:52:45 localhost ceph-osd[32440]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Oct 14 04:52:45 localhost ceph-osd[32440]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 3600.1 total, 600.0 interval#012Cumulative writes: 5543 writes, 24K keys, 5543 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s#012Cumulative WAL: 5543 writes, 759 syncs, 7.30 writes per sync, written: 0.02 GB, 0.01 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 4 writes, 9 keys, 4 commit groups, 1.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s#012Interval WAL: 4 writes, 2 syncs, 2.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent Oct 14 04:52:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8. Oct 14 04:52:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea. 
Oct 14 04:52:57 localhost podman[93810]: 2025-10-14 08:52:57.752151531 +0000 UTC m=+0.089073729 container health_status 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, com.redhat.component=openstack-iscsid-container, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20250721.1, container_name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, name=rhosp17/openstack-iscsid, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, vcs-type=git, distribution-scope=public, io.openshift.expose-services=, tcib_managed=true, 
io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, release=1, description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.9, build-date=2025-07-21T13:27:15, architecture=x86_64) Oct 14 04:52:57 localhost podman[93810]: 2025-10-14 08:52:57.766081761 +0000 UTC m=+0.103003979 container exec_died 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, architecture=x86_64, batch=17.1_20250721.1, config_id=tripleo_step3, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, distribution-scope=public, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, com.redhat.component=openstack-iscsid-container, description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, build-date=2025-07-21T13:27:15, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, release=1, io.buildah.version=1.33.12, name=rhosp17/openstack-iscsid, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, managed_by=tripleo_ansible, version=17.1.9, container_name=iscsid) Oct 14 04:52:57 localhost systemd[1]: tmp-crun.TJdtu1.mount: Deactivated successfully. Oct 14 04:52:57 localhost podman[93809]: 2025-10-14 08:52:57.814285608 +0000 UTC m=+0.151239077 container health_status 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.openshift.expose-services=, config_id=tripleo_step3, com.redhat.component=openstack-collectd-container, distribution-scope=public, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.9, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhosp17/openstack-collectd, vcs-type=git, batch=17.1_20250721.1, vendor=Red Hat, Inc., container_name=collectd, release=2, tcib_managed=true, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 
'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, build-date=2025-07-21T13:04:03, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd) Oct 14 04:52:57 localhost systemd[1]: 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea.service: Deactivated successfully. 
Oct 14 04:52:57 localhost podman[93809]: 2025-10-14 08:52:57.875877589 +0000 UTC m=+0.212831098 container exec_died 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, config_id=tripleo_step3, com.redhat.component=openstack-collectd-container, build-date=2025-07-21T13:04:03, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, io.openshift.expose-services=, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhosp17/openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.9, maintainer=OpenStack TripleO Team, release=2, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, batch=17.1_20250721.1) Oct 14 04:52:57 localhost systemd[1]: 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8.service: Deactivated successfully. Oct 14 04:53:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638. Oct 14 04:53:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884. Oct 14 04:53:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f. Oct 14 04:53:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e. Oct 14 04:53:03 localhost systemd[1]: tmp-crun.xyrDoT.mount: Deactivated successfully. 
Oct 14 04:53:03 localhost podman[93858]: 2025-10-14 08:53:03.799907637 +0000 UTC m=+0.117453485 container health_status def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.expose-services=, release=1, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, tcib_managed=true, version=17.1.9, io.buildah.version=1.33.12, architecture=x86_64, build-date=2025-07-21T14:45:33, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, vcs-type=git, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_compute, config_id=tripleo_step4) Oct 14 04:53:03 localhost podman[93850]: 2025-10-14 08:53:03.756708454 +0000 UTC m=+0.089145131 container health_status 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, tcib_managed=true, io.buildah.version=1.33.12, com.redhat.component=openstack-ceilometer-ipmi-container, container_name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, com.redhat.license_terms=https://www.redhat.com/agreements, release=1, build-date=2025-07-21T15:29:47, vcs-type=git, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, maintainer=OpenStack TripleO Team, architecture=x86_64, version=17.1.9, batch=17.1_20250721.1, distribution-scope=public) Oct 14 04:53:03 localhost podman[93851]: 2025-10-14 08:53:03.823984891 +0000 UTC m=+0.150507946 container health_status 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, container_name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step4, io.buildah.version=1.33.12, name=rhosp17/openstack-nova-compute, distribution-scope=public, architecture=x86_64, version=17.1.9, batch=17.1_20250721.1, com.redhat.component=openstack-nova-compute-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 
'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1, build-date=2025-07-21T14:48:37) Oct 14 04:53:03 localhost podman[93852]: 2025-10-14 08:53:03.86999742 +0000 UTC m=+0.193972426 container health_status d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20250721.1, version=17.1.9, com.redhat.component=openstack-cron-container, tcib_managed=true, config_id=tripleo_step4, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, vcs-type=git, io.openshift.expose-services=, 
release=1, summary=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, container_name=logrotate_crond, distribution-scope=public, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-cron, io.buildah.version=1.33.12, build-date=2025-07-21T13:07:52, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible) Oct 14 04:53:03 localhost podman[93852]: 2025-10-14 08:53:03.879130091 +0000 UTC m=+0.203105078 container exec_died d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, release=1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20250721.1, container_name=logrotate_crond, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1, 
tcib_managed=true, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, io.buildah.version=1.33.12, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, version=17.1.9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, vcs-type=git, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., build-date=2025-07-21T13:07:52, config_id=tripleo_step4, name=rhosp17/openstack-cron) Oct 14 04:53:03 localhost podman[93858]: 2025-10-14 08:53:03.891448932 +0000 UTC m=+0.208994750 container exec_died def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, 
tcib_managed=true, com.redhat.component=openstack-ceilometer-compute-container, io.buildah.version=1.33.12, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, version=17.1.9, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git, io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1, build-date=2025-07-21T14:45:33, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, batch=17.1_20250721.1, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1, 
io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, container_name=ceilometer_agent_compute) Oct 14 04:53:03 localhost systemd[1]: d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f.service: Deactivated successfully. Oct 14 04:53:03 localhost podman[93850]: 2025-10-14 08:53:03.899727348 +0000 UTC m=+0.232163945 container exec_died 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, config_id=tripleo_step4, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, io.openshift.expose-services=, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/agreements, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, 
batch=17.1_20250721.1, version=17.1.9, container_name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, architecture=x86_64, io.buildah.version=1.33.12, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T15:29:47, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-ipmi-container) Oct 14 04:53:03 localhost systemd[1]: def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e.service: Deactivated successfully. Oct 14 04:53:03 localhost systemd[1]: 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638.service: Deactivated successfully. Oct 14 04:53:04 localhost podman[93851]: 2025-10-14 08:53:04.196476604 +0000 UTC m=+0.522999629 container exec_died 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, build-date=2025-07-21T14:48:37, vendor=Red Hat, Inc., batch=17.1_20250721.1, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.33.12, name=rhosp17/openstack-nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, version=17.1.9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, container_name=nova_migration_target, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, release=1, architecture=x86_64) Oct 14 04:53:04 localhost systemd[1]: 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884.service: Deactivated successfully. Oct 14 04:53:04 localhost systemd[1]: tmp-crun.1WN71y.mount: Deactivated successfully. Oct 14 04:53:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e. Oct 14 04:53:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5. Oct 14 04:53:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2. 
Oct 14 04:53:06 localhost podman[93945]: 2025-10-14 08:53:06.757783279 +0000 UTC m=+0.094663412 container health_status a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, build-date=2025-07-21T13:28:44, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.component=openstack-ovn-controller-container, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/agreements, release=1, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, io.openshift.expose-services=, batch=17.1_20250721.1, version=17.1.9, name=rhosp17/openstack-ovn-controller, vcs-type=git, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, vendor=Red Hat, Inc., container_name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, io.buildah.version=1.33.12, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1) Oct 14 04:53:06 localhost podman[93945]: 2025-10-14 08:53:06.775053152 +0000 
UTC m=+0.111933245 container exec_died a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vcs-type=git, container_name=ovn_controller, architecture=x86_64, batch=17.1_20250721.1, name=rhosp17/openstack-ovn-controller, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, io.openshift.tags=rhosp osp openstack osp-17.1, io.buildah.version=1.33.12, build-date=2025-07-21T13:28:44, managed_by=tripleo_ansible, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, com.redhat.component=openstack-ovn-controller-container, distribution-scope=public, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, version=17.1.9) Oct 14 04:53:06 localhost podman[93944]: 2025-10-14 08:53:06.726770172 +0000 UTC m=+0.067908407 container health_status 
7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, config_id=tripleo_step4, container_name=ovn_metadata_agent, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, version=17.1.9, architecture=x86_64, 
build-date=2025-07-21T16:28:53, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, maintainer=OpenStack TripleO Team, distribution-scope=public, vcs-type=git, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20250721.1, io.buildah.version=1.33.12, io.openshift.expose-services=, release=1, vendor=Red Hat, Inc., com.redhat.component=openstack-neutron-metadata-agent-ovn-container) Oct 14 04:53:06 localhost systemd[1]: tmp-crun.jxf1NP.mount: Deactivated successfully. Oct 14 04:53:06 localhost podman[93946]: 2025-10-14 08:53:06.817357687 +0000 UTC m=+0.152918910 container health_status ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, build-date=2025-07-21T14:48:37, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.33.12, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a-4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step5, tcib_managed=true, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, maintainer=OpenStack TripleO Team, batch=17.1_20250721.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, com.redhat.component=openstack-nova-compute-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.9, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc.) 
Oct 14 04:53:06 localhost podman[93945]: unhealthy Oct 14 04:53:06 localhost systemd[1]: a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5.service: Main process exited, code=exited, status=1/FAILURE Oct 14 04:53:06 localhost systemd[1]: a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5.service: Failed with result 'exit-code'. Oct 14 04:53:06 localhost podman[93946]: 2025-10-14 08:53:06.852085619 +0000 UTC m=+0.187646842 container exec_died ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, distribution-scope=public, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-07-21T14:48:37, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a-4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, config_id=tripleo_step5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc., batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, version=17.1.9, io.buildah.version=1.33.12, tcib_managed=true, container_name=nova_compute, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d) Oct 14 04:53:06 localhost systemd[1]: ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2.service: Deactivated successfully. 
Oct 14 04:53:06 localhost podman[93944]: 2025-10-14 08:53:06.914092143 +0000 UTC m=+0.255230338 container exec_died 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=tripleo_ansible, version=17.1.9, distribution-scope=public, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, batch=17.1_20250721.1, io.buildah.version=1.33.12, build-date=2025-07-21T16:28:53, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, vcs-type=git, config_id=tripleo_step4, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', 
'/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn) Oct 14 04:53:06 localhost podman[93944]: unhealthy Oct 14 04:53:06 localhost systemd[1]: 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e.service: Main process exited, code=exited, status=1/FAILURE Oct 14 04:53:06 localhost systemd[1]: 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e.service: Failed with result 'exit-code'. Oct 14 04:53:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6. Oct 14 04:53:12 localhost systemd[1]: tmp-crun.gbkS9C.mount: Deactivated successfully. 
Oct 14 04:53:12 localhost podman[94006]: 2025-10-14 08:53:12.74411253 +0000 UTC m=+0.087621956 container health_status 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.33.12, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, managed_by=tripleo_ansible, batch=17.1_20250721.1, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1, architecture=x86_64, build-date=2025-07-21T13:07:59, version=17.1.9, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/agreements, 
container_name=metrics_qdr, name=rhosp17/openstack-qdrouterd, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, maintainer=OpenStack TripleO Team, config_id=tripleo_step1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, tcib_managed=true, io.openshift.expose-services=) Oct 14 04:53:12 localhost podman[94006]: 2025-10-14 08:53:12.984374813 +0000 UTC m=+0.327884259 container exec_died 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, version=17.1.9, distribution-scope=public, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, release=1, build-date=2025-07-21T13:07:59, name=rhosp17/openstack-qdrouterd, container_name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.33.12, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=openstack-qdrouterd-container, batch=17.1_20250721.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed) Oct 14 04:53:12 localhost systemd[1]: 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6.service: Deactivated successfully. Oct 14 04:53:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8. Oct 14 04:53:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea. 
Oct 14 04:53:28 localhost podman[94080]: 2025-10-14 08:53:28.237354235 +0000 UTC m=+0.070529827 container health_status 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vcs-type=git, release=1, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, architecture=x86_64, build-date=2025-07-21T13:27:15, container_name=iscsid, io.buildah.version=1.33.12, name=rhosp17/openstack-iscsid, config_id=tripleo_step3, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, batch=17.1_20250721.1, com.redhat.component=openstack-iscsid-container, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, version=17.1.9) Oct 14 04:53:28 localhost podman[94079]: 2025-10-14 08:53:28.302767315 +0000 UTC m=+0.135936996 container health_status 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, architecture=x86_64, com.redhat.component=openstack-collectd-container, release=2, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, vcs-type=git, container_name=collectd, name=rhosp17/openstack-collectd, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 collectd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, managed_by=tripleo_ansible, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, maintainer=OpenStack TripleO Team, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, version=17.1.9, batch=17.1_20250721.1, build-date=2025-07-21T13:04:03, summary=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.33.12) Oct 14 04:53:28 localhost podman[94080]: 2025-10-14 08:53:28.327658242 +0000 UTC m=+0.160833834 container exec_died 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, batch=17.1_20250721.1, com.redhat.component=openstack-iscsid-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, managed_by=tripleo_ansible, release=1, tcib_managed=true, config_id=tripleo_step3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, distribution-scope=public, build-date=2025-07-21T13:27:15, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, maintainer=OpenStack TripleO Team, version=17.1.9, vendor=Red Hat, Inc.) 
Oct 14 04:53:28 localhost podman[94079]: 2025-10-14 08:53:28.339181098 +0000 UTC m=+0.172350779 container exec_died 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, build-date=2025-07-21T13:04:03, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhosp17/openstack-collectd, release=2, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.component=openstack-collectd-container, version=17.1.9, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 collectd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, distribution-scope=public, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, managed_by=tripleo_ansible, batch=17.1_20250721.1, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, container_name=collectd, io.buildah.version=1.33.12) Oct 14 04:53:28 localhost systemd[1]: 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea.service: Deactivated successfully. Oct 14 04:53:28 localhost systemd[1]: 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8.service: Deactivated successfully. Oct 14 04:53:29 localhost podman[94203]: Oct 14 04:53:29 localhost podman[94203]: 2025-10-14 08:53:29.353182687 +0000 UTC m=+0.081990812 container create e09c8fa10724635aadfe523540112d96254fb44804875e935c077e5333f7afda (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=blissful_wing, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, distribution-scope=public, ceph=True, CEPH_POINT_RELEASE=, release=553, GIT_CLEAN=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, version=7, maintainer=Guillaume Abrioux , vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, architecture=x86_64, io.buildah.version=1.33.12, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, 
build-date=2025-09-24T08:57:55, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9) Oct 14 04:53:29 localhost systemd[1]: Started libpod-conmon-e09c8fa10724635aadfe523540112d96254fb44804875e935c077e5333f7afda.scope. Oct 14 04:53:29 localhost podman[94203]: 2025-10-14 08:53:29.322645485 +0000 UTC m=+0.051453650 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Oct 14 04:53:29 localhost systemd[1]: Started libcrun container. Oct 14 04:53:29 localhost podman[94203]: 2025-10-14 08:53:29.457721172 +0000 UTC m=+0.186529357 container init e09c8fa10724635aadfe523540112d96254fb44804875e935c077e5333f7afda (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=blissful_wing, CEPH_POINT_RELEASE=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vendor=Red Hat, Inc., ceph=True, release=553, vcs-type=git, description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, architecture=x86_64, io.openshift.expose-services=, name=rhceph, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/agreements, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, version=7, build-date=2025-09-24T08:57:55, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, distribution-scope=public, maintainer=Guillaume Abrioux , io.openshift.tags=rhceph ceph, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9) Oct 14 04:53:29 localhost systemd[1]: tmp-crun.jj9fv3.mount: Deactivated successfully. 
Oct 14 04:53:29 localhost podman[94203]: 2025-10-14 08:53:29.472786987 +0000 UTC m=+0.201595112 container start e09c8fa10724635aadfe523540112d96254fb44804875e935c077e5333f7afda (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=blissful_wing, architecture=x86_64, release=553, io.openshift.expose-services=, GIT_CLEAN=True, maintainer=Guillaume Abrioux , description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, version=7, ceph=True, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=rhceph-container, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-type=git, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main) Oct 14 04:53:29 localhost podman[94203]: 2025-10-14 08:53:29.474743618 +0000 UTC m=+0.203551793 container attach e09c8fa10724635aadfe523540112d96254fb44804875e935c077e5333f7afda (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=blissful_wing, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhceph, build-date=2025-09-24T08:57:55, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, com.redhat.component=rhceph-container, 
GIT_BRANCH=main, version=7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, maintainer=Guillaume Abrioux , summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, release=553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, distribution-scope=public, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, ceph=True) Oct 14 04:53:29 localhost blissful_wing[94219]: 167 167 Oct 14 04:53:29 localhost systemd[1]: libpod-e09c8fa10724635aadfe523540112d96254fb44804875e935c077e5333f7afda.scope: Deactivated successfully. Oct 14 04:53:29 localhost podman[94203]: 2025-10-14 08:53:29.479264717 +0000 UTC m=+0.208072872 container died e09c8fa10724635aadfe523540112d96254fb44804875e935c077e5333f7afda (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=blissful_wing, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, GIT_CLEAN=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.33.12, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, architecture=x86_64, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., ceph=True, com.redhat.license_terms=https://www.redhat.com/agreements, release=553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, RELEASE=main, maintainer=Guillaume Abrioux , CEPH_POINT_RELEASE=, version=7, build-date=2025-09-24T08:57:55, GIT_BRANCH=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, 
vcs-type=git) Oct 14 04:53:29 localhost podman[94224]: 2025-10-14 08:53:29.553546809 +0000 UTC m=+0.064921974 container remove e09c8fa10724635aadfe523540112d96254fb44804875e935c077e5333f7afda (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=blissful_wing, com.redhat.component=rhceph-container, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vcs-type=git, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, architecture=x86_64, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, distribution-scope=public, ceph=True, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, release=553, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.33.12, build-date=2025-09-24T08:57:55, maintainer=Guillaume Abrioux ) Oct 14 04:53:29 localhost systemd[1]: libpod-conmon-e09c8fa10724635aadfe523540112d96254fb44804875e935c077e5333f7afda.scope: Deactivated successfully. 
Oct 14 04:53:29 localhost podman[94245]: Oct 14 04:53:29 localhost podman[94245]: 2025-10-14 08:53:29.800573552 +0000 UTC m=+0.087708548 container create 940864fc3aa2418f55338eaadb0d75ddd3da7d4477e92631dc1a48143d975154 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=cool_panini, com.redhat.component=rhceph-container, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, CEPH_POINT_RELEASE=, architecture=x86_64, ceph=True, io.buildah.version=1.33.12, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, GIT_CLEAN=True, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=Guillaume Abrioux , distribution-scope=public, RELEASE=main, io.openshift.tags=rhceph ceph, release=553, io.openshift.expose-services=, vendor=Red Hat, Inc., version=7, build-date=2025-09-24T08:57:55) Oct 14 04:53:29 localhost systemd[1]: Started libpod-conmon-940864fc3aa2418f55338eaadb0d75ddd3da7d4477e92631dc1a48143d975154.scope. Oct 14 04:53:29 localhost systemd[1]: Started libcrun container. 
Oct 14 04:53:29 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/171c611f9f669e2c00f49392f261f97854b0f49f164a902890747c9f3afb5638/merged/rootfs supports timestamps until 2038 (0x7fffffff) Oct 14 04:53:29 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/171c611f9f669e2c00f49392f261f97854b0f49f164a902890747c9f3afb5638/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff) Oct 14 04:53:29 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/171c611f9f669e2c00f49392f261f97854b0f49f164a902890747c9f3afb5638/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff) Oct 14 04:53:29 localhost podman[94245]: 2025-10-14 08:53:29.764774367 +0000 UTC m=+0.051909393 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Oct 14 04:53:29 localhost podman[94245]: 2025-10-14 08:53:29.864608288 +0000 UTC m=+0.151743294 container init 940864fc3aa2418f55338eaadb0d75ddd3da7d4477e92631dc1a48143d975154 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=cool_panini, name=rhceph, io.openshift.expose-services=, build-date=2025-09-24T08:57:55, version=7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vendor=Red Hat, Inc., GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, maintainer=Guillaume Abrioux , description=Red Hat Ceph Storage 7, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, release=553, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=rhceph-container, vcs-type=git, ceph=True, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, 
org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d) Oct 14 04:53:29 localhost podman[94245]: 2025-10-14 08:53:29.873591955 +0000 UTC m=+0.160726931 container start 940864fc3aa2418f55338eaadb0d75ddd3da7d4477e92631dc1a48143d975154 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=cool_panini, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, vendor=Red Hat, Inc., architecture=x86_64, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, maintainer=Guillaume Abrioux , ceph=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, GIT_BRANCH=main, com.redhat.component=rhceph-container, io.openshift.expose-services=, build-date=2025-09-24T08:57:55, CEPH_POINT_RELEASE=, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, name=rhceph, io.buildah.version=1.33.12, GIT_CLEAN=True, release=553, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9) Oct 14 04:53:29 localhost podman[94245]: 2025-10-14 08:53:29.873951036 +0000 UTC m=+0.161086022 container attach 940864fc3aa2418f55338eaadb0d75ddd3da7d4477e92631dc1a48143d975154 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=cool_panini, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-09-24T08:57:55, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, GIT_CLEAN=True, io.buildah.version=1.33.12, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, architecture=x86_64, release=553, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, ceph=True, vcs-type=git, CEPH_POINT_RELEASE=, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux , vendor=Red Hat, Inc., RELEASE=main)
Oct 14 04:53:30 localhost systemd[1]: var-lib-containers-storage-overlay-38ff26ceb80f69943115fc35d763febe865b1f8c5bc9145da6ffa24cb5a2f4ca-merged.mount: Deactivated successfully.
Oct 14 04:53:30 localhost cool_panini[94260]: [
Oct 14 04:53:30 localhost cool_panini[94260]: {
Oct 14 04:53:30 localhost cool_panini[94260]: "available": false,
Oct 14 04:53:30 localhost cool_panini[94260]: "ceph_device": false,
Oct 14 04:53:30 localhost cool_panini[94260]: "device_id": "QEMU_DVD-ROM_QM00001",
Oct 14 04:53:30 localhost cool_panini[94260]: "lsm_data": {},
Oct 14 04:53:30 localhost cool_panini[94260]: "lvs": [],
Oct 14 04:53:30 localhost cool_panini[94260]: "path": "/dev/sr0",
Oct 14 04:53:30 localhost cool_panini[94260]: "rejected_reasons": [
Oct 14 04:53:30 localhost cool_panini[94260]: "Has a FileSystem",
Oct 14 04:53:30 localhost cool_panini[94260]: "Insufficient space (<5GB)"
Oct 14 04:53:30 localhost cool_panini[94260]: ],
Oct 14 04:53:30 localhost cool_panini[94260]: "sys_api": {
Oct 14 04:53:30 localhost cool_panini[94260]: "actuators": null,
Oct 14 04:53:30 localhost cool_panini[94260]: "device_nodes": "sr0",
Oct 14 04:53:30 localhost cool_panini[94260]: "human_readable_size": "482.00 KB",
Oct 14 04:53:30 localhost cool_panini[94260]: "id_bus": "ata",
Oct 14 04:53:30 localhost cool_panini[94260]: "model": "QEMU DVD-ROM",
Oct 14 04:53:30 localhost cool_panini[94260]: "nr_requests": "2",
Oct 14 04:53:30 localhost cool_panini[94260]: "partitions": {},
Oct 14 04:53:30 localhost cool_panini[94260]: "path": "/dev/sr0",
Oct 14 04:53:30 localhost cool_panini[94260]: "removable": "1",
Oct 14 04:53:30 localhost cool_panini[94260]: "rev": "2.5+",
Oct 14 04:53:30 localhost cool_panini[94260]: "ro": "0",
Oct 14 04:53:30 localhost cool_panini[94260]: "rotational": "1",
Oct 14 04:53:30 localhost cool_panini[94260]: "sas_address": "",
Oct 14 04:53:30 localhost cool_panini[94260]: "sas_device_handle": "",
Oct 14 04:53:30 localhost cool_panini[94260]: "scheduler_mode": "mq-deadline",
Oct 14 04:53:30 localhost cool_panini[94260]: "sectors": 0,
Oct 14 04:53:30 localhost cool_panini[94260]: "sectorsize": "2048",
Oct 14 04:53:30 localhost cool_panini[94260]: "size": 493568.0,
Oct 14 04:53:30 localhost cool_panini[94260]: "support_discard": "0",
Oct 14 04:53:30 localhost cool_panini[94260]: "type": "disk",
Oct 14 04:53:30 localhost cool_panini[94260]: "vendor": "QEMU"
Oct 14 04:53:30 localhost cool_panini[94260]: }
Oct 14 04:53:30 localhost cool_panini[94260]: }
Oct 14 04:53:30 localhost cool_panini[94260]: ]
Oct 14 04:53:30 localhost systemd[1]: libpod-940864fc3aa2418f55338eaadb0d75ddd3da7d4477e92631dc1a48143d975154.scope: Deactivated successfully.
Oct 14 04:53:30 localhost systemd[1]: libpod-940864fc3aa2418f55338eaadb0d75ddd3da7d4477e92631dc1a48143d975154.scope: Consumed 1.083s CPU time.
Oct 14 04:53:30 localhost podman[94245]: 2025-10-14 08:53:30.986026191 +0000 UTC m=+1.273161157 container died 940864fc3aa2418f55338eaadb0d75ddd3da7d4477e92631dc1a48143d975154 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=cool_panini, distribution-scope=public, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=553, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, build-date=2025-09-24T08:57:55, description=Red Hat Ceph Storage 7, RELEASE=main, io.openshift.expose-services=, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, GIT_BRANCH=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, version=7, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux , org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-type=git, ceph=True, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.) Oct 14 04:53:31 localhost systemd[1]: tmp-crun.PYUdwv.mount: Deactivated successfully. Oct 14 04:53:31 localhost systemd[1]: var-lib-containers-storage-overlay-171c611f9f669e2c00f49392f261f97854b0f49f164a902890747c9f3afb5638-merged.mount: Deactivated successfully. 
Oct 14 04:53:31 localhost podman[96117]: 2025-10-14 08:53:31.101850806 +0000 UTC m=+0.105463686 container remove 940864fc3aa2418f55338eaadb0d75ddd3da7d4477e92631dc1a48143d975154 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=cool_panini, maintainer=Guillaume Abrioux , CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=553, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.buildah.version=1.33.12, com.redhat.component=rhceph-container, GIT_BRANCH=main, RELEASE=main, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, version=7, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, build-date=2025-09-24T08:57:55, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, io.openshift.expose-services=, vendor=Red Hat, Inc.)
Oct 14 04:53:31 localhost systemd[1]: libpod-conmon-940864fc3aa2418f55338eaadb0d75ddd3da7d4477e92631dc1a48143d975154.scope: Deactivated successfully.
Oct 14 04:53:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638.
Oct 14 04:53:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884.
Oct 14 04:53:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f.
Oct 14 04:53:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e.
Oct 14 04:53:34 localhost podman[96147]: 2025-10-14 08:53:34.768338473 +0000 UTC m=+0.101859825 container health_status 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20250721.1, config_id=tripleo_step4, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, architecture=x86_64, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, vendor=Red Hat, Inc., container_name=nova_migration_target, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Red Hat OpenStack Platform 17.1 nova-compute, 
com.redhat.component=openstack-nova-compute-container, version=17.1.9, tcib_managed=true, build-date=2025-07-21T14:48:37, name=rhosp17/openstack-nova-compute, release=1, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, vcs-type=git) Oct 14 04:53:34 localhost systemd[1]: tmp-crun.DvTtNo.mount: Deactivated successfully. Oct 14 04:53:34 localhost podman[96148]: 2025-10-14 08:53:34.830253873 +0000 UTC m=+0.159399999 container health_status d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, name=rhosp17/openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1, batch=17.1_20250721.1, com.redhat.component=openstack-cron-container, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-07-21T13:07:52, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, architecture=x86_64, distribution-scope=public, release=1, version=17.1.9, container_name=logrotate_crond, managed_by=tripleo_ansible, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, config_id=tripleo_step4, vcs-type=git, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1) Oct 14 04:53:34 localhost podman[96148]: 2025-10-14 08:53:34.853081428 +0000 UTC m=+0.182227534 container exec_died d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.buildah.version=1.33.12, vcs-type=git, config_id=tripleo_step4, container_name=logrotate_crond, distribution-scope=public, managed_by=tripleo_ansible, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, release=1, build-date=2025-07-21T13:07:52, version=17.1.9, batch=17.1_20250721.1, com.redhat.component=openstack-cron-container, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, vendor=Red Hat, Inc., vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 
'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, name=rhosp17/openstack-cron) Oct 14 04:53:34 localhost systemd[1]: d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f.service: Deactivated successfully. 
Oct 14 04:53:34 localhost podman[96146]: 2025-10-14 08:53:34.871816495 +0000 UTC m=+0.204955475 container health_status 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, com.redhat.component=openstack-ceilometer-ipmi-container, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, batch=17.1_20250721.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, release=1, container_name=ceilometer_agent_ipmi, io.openshift.expose-services=, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, vcs-type=git, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T15:29:47, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, version=17.1.9) Oct 14 04:53:34 localhost podman[96146]: 2025-10-14 08:53:34.901230413 +0000 UTC m=+0.234369413 container exec_died 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, vcs-type=git, name=rhosp17/openstack-ceilometer-ipmi, io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, container_name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T15:29:47, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20250721.1, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, release=1, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.9, maintainer=OpenStack TripleO Team, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, com.redhat.component=openstack-ceilometer-ipmi-container) Oct 14 04:53:34 localhost systemd[1]: 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638.service: Deactivated successfully. Oct 14 04:53:34 localhost podman[96149]: 2025-10-14 08:53:34.933597802 +0000 UTC m=+0.258138347 container health_status def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, vcs-type=git, distribution-scope=public, com.redhat.component=openstack-ceilometer-compute-container, container_name=ceilometer_agent_compute, io.buildah.version=1.33.12, release=1, batch=17.1_20250721.1, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, name=rhosp17/openstack-ceilometer-compute, 
vendor=Red Hat, Inc., build-date=2025-07-21T14:45:33, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.9, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team) Oct 14 04:53:34 localhost podman[96149]: 2025-10-14 08:53:34.95913738 +0000 UTC m=+0.283677925 container exec_died def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.openshift.tags=rhosp osp openstack osp-17.1, config_id=tripleo_step4, release=1, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 
'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, tcib_managed=true, build-date=2025-07-21T14:45:33, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, managed_by=tripleo_ansible, io.buildah.version=1.33.12, name=rhosp17/openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, batch=17.1_20250721.1, vcs-type=git, vendor=Red Hat, Inc., version=17.1.9, architecture=x86_64) Oct 14 04:53:34 localhost systemd[1]: def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e.service: Deactivated successfully. 
Oct 14 04:53:35 localhost podman[96147]: 2025-10-14 08:53:35.164652272 +0000 UTC m=+0.498173544 container exec_died 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, managed_by=tripleo_ansible, version=17.1.9, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp17/openstack-nova-compute, release=1, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.33.12, container_name=nova_migration_target, tcib_managed=true, 
build-date=2025-07-21T14:48:37, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step4, batch=17.1_20250721.1, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc., vcs-type=git, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d)
Oct 14 04:53:35 localhost systemd[1]: 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884.service: Deactivated successfully.
Oct 14 04:53:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e.
Oct 14 04:53:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5.
Oct 14 04:53:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2.
Oct 14 04:53:37 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Oct 14 04:53:37 localhost recover_tripleo_nova_virtqemud[96262]: 62551
Oct 14 04:53:37 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Oct 14 04:53:37 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Oct 14 04:53:37 localhost systemd[1]: tmp-crun.Qox1Xt.mount: Deactivated successfully.
Oct 14 04:53:37 localhost podman[96241]: 2025-10-14 08:53:37.78601049 +0000 UTC m=+0.123251524 container health_status 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, architecture=x86_64, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, version=17.1.9, io.openshift.expose-services=, build-date=2025-07-21T16:28:53, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1, io.buildah.version=1.33.12, tcib_managed=true, name=rhosp17/openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, container_name=ovn_metadata_agent) Oct 14 04:53:37 localhost podman[96241]: 2025-10-14 08:53:37.802191839 +0000 UTC m=+0.139432873 container exec_died 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, name=rhosp17/openstack-neutron-metadata-agent-ovn, release=1, vcs-type=git, distribution-scope=public, io.buildah.version=1.33.12, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T16:28:53, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, version=17.1.9, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 
'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, container_name=ovn_metadata_agent, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, batch=17.1_20250721.1, architecture=x86_64, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, tcib_managed=true)
Oct 14 04:53:37 localhost podman[96241]: unhealthy
Oct 14 04:53:37 localhost systemd[1]: 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 04:53:37 localhost systemd[1]: 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e.service: Failed with result 'exit-code'.
Oct 14 04:53:37 localhost podman[96242]: 2025-10-14 08:53:37.753272129 +0000 UTC m=+0.089512863 container health_status a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, com.redhat.license_terms=https://www.redhat.com/agreements, managed_by=tripleo_ansible, architecture=x86_64, container_name=ovn_controller, distribution-scope=public, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, batch=17.1_20250721.1, com.redhat.component=openstack-ovn-controller-container, build-date=2025-07-21T13:28:44, config_id=tripleo_step4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., vcs-type=git, version=17.1.9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, io.openshift.expose-services=, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, release=1, description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller) Oct 14 04:53:37 localhost podman[96243]: 2025-10-14 08:53:37.853181243 +0000 
UTC m=+0.183184975 container health_status ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a-4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, 
summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, name=rhosp17/openstack-nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, version=17.1.9, build-date=2025-07-21T14:48:37, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, config_id=tripleo_step5, batch=17.1_20250721.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, container_name=nova_compute, release=1, io.openshift.tags=rhosp osp openstack osp-17.1, io.buildah.version=1.33.12, tcib_managed=true) Oct 14 04:53:37 localhost podman[96242]: 2025-10-14 08:53:37.886100648 +0000 UTC m=+0.222341372 container exec_died a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, io.openshift.expose-services=, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, tcib_managed=true, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.9, build-date=2025-07-21T13:28:44, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.component=openstack-ovn-controller-container, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20250721.1, 
config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, release=1, vcs-type=git, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, name=rhosp17/openstack-ovn-controller, io.buildah.version=1.33.12, container_name=ovn_controller, distribution-scope=public) Oct 14 04:53:37 localhost podman[96242]: unhealthy Oct 14 04:53:37 localhost systemd[1]: a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5.service: Main process exited, code=exited, status=1/FAILURE Oct 14 04:53:37 localhost systemd[1]: a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5.service: Failed with result 'exit-code'. 
Oct 14 04:53:37 localhost podman[96243]: 2025-10-14 08:53:37.939148555 +0000 UTC m=+0.269152277 container exec_died ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, architecture=x86_64, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a-4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', 
'/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1, com.redhat.component=openstack-nova-compute-container, name=rhosp17/openstack-nova-compute, vendor=Red Hat, Inc., tcib_managed=true, container_name=nova_compute, version=17.1.9, build-date=2025-07-21T14:48:37, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, config_id=tripleo_step5, managed_by=tripleo_ansible, batch=17.1_20250721.1) Oct 14 04:53:37 localhost systemd[1]: ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2.service: Deactivated successfully. Oct 14 04:53:38 localhost systemd[1]: tmp-crun.f1D2Si.mount: Deactivated successfully. Oct 14 04:53:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6. Oct 14 04:53:43 localhost systemd[1]: tmp-crun.cSWDWy.mount: Deactivated successfully. 
Oct 14 04:53:43 localhost podman[96310]: 2025-10-14 08:53:43.75224936 +0000 UTC m=+0.090531344 container health_status 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, build-date=2025-07-21T13:07:59, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., release=1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, vcs-type=git, architecture=x86_64, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, container_name=metrics_qdr, name=rhosp17/openstack-qdrouterd, maintainer=OpenStack TripleO Team, batch=17.1_20250721.1, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, version=17.1.9, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1) Oct 14 04:53:43 localhost podman[96310]: 2025-10-14 08:53:43.958449643 +0000 UTC m=+0.296731647 container exec_died 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vendor=Red Hat, Inc., com.redhat.component=openstack-qdrouterd-container, managed_by=tripleo_ansible, name=rhosp17/openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, release=1, config_id=tripleo_step1, container_name=metrics_qdr, architecture=x86_64, batch=17.1_20250721.1, build-date=2025-07-21T13:07:59, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.9, vcs-type=git, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd) Oct 14 04:53:43 localhost systemd[1]: 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6.service: Deactivated successfully. Oct 14 04:53:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8. Oct 14 04:53:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea. Oct 14 04:53:58 localhost systemd[1]: tmp-crun.5ZEaNV.mount: Deactivated successfully. 
Oct 14 04:53:58 localhost podman[96340]: 2025-10-14 08:53:58.747271754 +0000 UTC m=+0.090782062 container health_status 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, com.redhat.component=openstack-collectd-container, config_id=tripleo_step3, name=rhosp17/openstack-collectd, version=17.1.9, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., managed_by=tripleo_ansible, tcib_managed=true, architecture=x86_64, release=2, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20250721.1, distribution-scope=public, build-date=2025-07-21T13:04:03, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, container_name=collectd, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1) Oct 14 04:53:58 localhost podman[96340]: 2025-10-14 08:53:58.755591631 +0000 UTC m=+0.099101939 container exec_died 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, version=17.1.9, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, io.openshift.expose-services=, build-date=2025-07-21T13:04:03, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, container_name=collectd, vcs-type=git, release=2, architecture=x86_64, name=rhosp17/openstack-collectd, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, com.redhat.component=openstack-collectd-container) Oct 14 04:53:58 localhost systemd[1]: 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8.service: Deactivated successfully. 
Oct 14 04:53:58 localhost podman[96341]: 2025-10-14 08:53:58.806352177 +0000 UTC m=+0.141989822 container health_status 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, build-date=2025-07-21T13:27:15, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1, container_name=iscsid, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, release=1, vcs-type=git, batch=17.1_20250721.1, name=rhosp17/openstack-iscsid, vendor=Red Hat, Inc., version=17.1.9, com.redhat.component=openstack-iscsid-container, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 iscsid) Oct 14 04:53:58 localhost podman[96341]: 2025-10-14 08:53:58.844210486 +0000 UTC m=+0.179848111 container exec_died 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, name=rhosp17/openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, vcs-type=git, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, build-date=2025-07-21T13:27:15, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, vendor=Red Hat, Inc., release=1, batch=17.1_20250721.1, managed_by=tripleo_ansible, com.redhat.component=openstack-iscsid-container, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1, distribution-scope=public, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=iscsid, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}) Oct 14 04:53:58 localhost systemd[1]: 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea.service: Deactivated successfully. Oct 14 04:54:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638. Oct 14 04:54:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884. Oct 14 04:54:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f. Oct 14 04:54:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e. Oct 14 04:54:05 localhost systemd[1]: tmp-crun.1xvyHW.mount: Deactivated successfully. 
Oct 14 04:54:05 localhost podman[96381]: 2025-10-14 08:54:05.782138388 +0000 UTC m=+0.108280431 container health_status d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, release=1, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, com.redhat.component=openstack-cron-container, description=Red Hat OpenStack Platform 17.1 cron, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, vendor=Red Hat, Inc., vcs-type=git, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, container_name=logrotate_crond, version=17.1.9, name=rhosp17/openstack-cron, summary=Red Hat OpenStack Platform 17.1 
cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, io.buildah.version=1.33.12, distribution-scope=public, tcib_managed=true, batch=17.1_20250721.1, build-date=2025-07-21T13:07:52) Oct 14 04:54:05 localhost podman[96380]: 2025-10-14 08:54:05.742737273 +0000 UTC m=+0.078863155 container health_status 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.9, container_name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T14:48:37, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, release=1, vendor=Red Hat, Inc., config_id=tripleo_step4, io.buildah.version=1.33.12, io.openshift.expose-services=, batch=17.1_20250721.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute) Oct 14 04:54:05 localhost podman[96381]: 2025-10-14 08:54:05.81817117 +0000 UTC m=+0.144313223 container exec_died d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, architecture=x86_64, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-cron-container, version=17.1.9, config_id=tripleo_step4, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, release=1, distribution-scope=public, managed_by=tripleo_ansible, batch=17.1_20250721.1, io.openshift.tags=rhosp osp openstack osp-17.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, build-date=2025-07-21T13:07:52, com.redhat.license_terms=https://www.redhat.com/agreements, tcib_managed=true, name=rhosp17/openstack-cron, vendor=Red Hat, Inc.) Oct 14 04:54:05 localhost podman[96393]: 2025-10-14 08:54:05.825309631 +0000 UTC m=+0.142824928 container health_status def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, tcib_managed=true, version=17.1.9, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, com.redhat.component=openstack-ceilometer-compute-container, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, architecture=x86_64, config_id=tripleo_step4, io.openshift.expose-services=, container_name=ceilometer_agent_compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, build-date=2025-07-21T14:45:33, io.buildah.version=1.33.12) Oct 14 04:54:05 localhost systemd[1]: d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f.service: Deactivated successfully. 
Oct 14 04:54:05 localhost podman[96379]: 2025-10-14 08:54:05.793870401 +0000 UTC m=+0.129923370 container health_status 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_ipmi, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-ipmi, build-date=2025-07-21T15:29:47, io.openshift.tags=rhosp osp openstack osp-17.1, 
io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, version=17.1.9, tcib_managed=true, vendor=Red Hat, Inc., release=1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-ipmi-container) Oct 14 04:54:05 localhost podman[96379]: 2025-10-14 08:54:05.875969684 +0000 UTC m=+0.212022703 container exec_died 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, release=1, managed_by=tripleo_ansible, build-date=2025-07-21T15:29:47, container_name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.33.12, io.openshift.expose-services=, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, batch=17.1_20250721.1, com.redhat.component=openstack-ceilometer-ipmi-container, com.redhat.license_terms=https://www.redhat.com/agreements, tcib_managed=true, name=rhosp17/openstack-ceilometer-ipmi, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, vendor=Red Hat, Inc., distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, version=17.1.9) Oct 14 04:54:05 localhost systemd[1]: 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638.service: Deactivated successfully. Oct 14 04:54:05 localhost podman[96393]: 2025-10-14 08:54:05.897767907 +0000 UTC m=+0.215283274 container exec_died def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, io.buildah.version=1.33.12, vendor=Red Hat, Inc., container_name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, version=17.1.9, config_id=tripleo_step4, distribution-scope=public, managed_by=tripleo_ansible, release=1, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2025-07-21T14:45:33, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, batch=17.1_20250721.1, com.redhat.component=openstack-ceilometer-compute-container, architecture=x86_64) Oct 14 04:54:05 localhost systemd[1]: def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e.service: Deactivated successfully. 
Oct 14 04:54:06 localhost podman[96380]: 2025-10-14 08:54:06.08908704 +0000 UTC m=+0.425212952 container exec_died 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, tcib_managed=true, batch=17.1_20250721.1, container_name=nova_migration_target, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, io.buildah.version=1.33.12, release=1, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.9, architecture=x86_64, build-date=2025-07-21T14:48:37, config_id=tripleo_step4, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=openstack-nova-compute-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, distribution-scope=public) Oct 14 04:54:06 localhost systemd[1]: 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884.service: Deactivated successfully. Oct 14 04:54:06 localhost systemd[1]: tmp-crun.tPhPbW.mount: Deactivated successfully. Oct 14 04:54:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e. Oct 14 04:54:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5. Oct 14 04:54:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2. Oct 14 04:54:08 localhost systemd[1]: tmp-crun.2mD0W4.mount: Deactivated successfully. 
Oct 14 04:54:08 localhost podman[96473]: 2025-10-14 08:54:08.747665866 +0000 UTC m=+0.081243828 container health_status 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, name=rhosp17/openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, batch=17.1_20250721.1, io.openshift.expose-services=, version=17.1.9, release=1, architecture=x86_64, container_name=ovn_metadata_agent, distribution-scope=public, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, build-date=2025-07-21T16:28:53, vendor=Red Hat, Inc., vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, com.redhat.license_terms=https://www.redhat.com/agreements, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Oct 14 04:54:08 localhost podman[96473]: 2025-10-14 08:54:08.761950427 +0000 UTC m=+0.095528309 container exec_died 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, container_name=ovn_metadata_agent, distribution-scope=public, io.buildah.version=1.33.12, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.expose-services=, tcib_managed=true, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, maintainer=OpenStack TripleO Team, batch=17.1_20250721.1, managed_by=tripleo_ansible, version=17.1.9, build-date=2025-07-21T16:28:53, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, vendor=Red Hat, Inc., vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Oct 14 04:54:08 localhost podman[96473]: unhealthy Oct 14 04:54:08 localhost systemd[1]: 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e.service: Main process exited, code=exited, status=1/FAILURE Oct 14 04:54:08 localhost systemd[1]: 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e.service: Failed with result 'exit-code'. Oct 14 04:54:08 localhost systemd[1]: tmp-crun.LUpHRq.mount: Deactivated successfully. 
Oct 14 04:54:08 localhost podman[96474]: 2025-10-14 08:54:08.80485331 +0000 UTC m=+0.129124155 container health_status a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, com.redhat.component=openstack-ovn-controller-container, batch=17.1_20250721.1, version=17.1.9, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, io.openshift.expose-services=, container_name=ovn_controller, name=rhosp17/openstack-ovn-controller, tcib_managed=true, managed_by=tripleo_ansible, vendor=Red Hat, Inc., build-date=2025-07-21T13:28:44, vcs-type=git, release=1, description=Red Hat OpenStack Platform 17.1 ovn-controller) Oct 14 04:54:08 localhost podman[96474]: 2025-10-14 08:54:08.844343139 +0000 
UTC m=+0.168613994 container exec_died a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, managed_by=tripleo_ansible, com.redhat.component=openstack-ovn-controller-container, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, distribution-scope=public, io.openshift.expose-services=, architecture=x86_64, batch=17.1_20250721.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.33.12, release=1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.9, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-ovn-controller, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, container_name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, build-date=2025-07-21T13:28:44, vendor=Red Hat, Inc.) 
Oct 14 04:54:08 localhost podman[96474]: unhealthy Oct 14 04:54:08 localhost systemd[1]: a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5.service: Main process exited, code=exited, status=1/FAILURE Oct 14 04:54:08 localhost systemd[1]: a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5.service: Failed with result 'exit-code'. Oct 14 04:54:08 localhost podman[96476]: 2025-10-14 08:54:08.846216317 +0000 UTC m=+0.171256495 container health_status ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, version=17.1.9, batch=17.1_20250721.1, container_name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, config_id=tripleo_step5, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, release=1, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a-4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 
'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-07-21T14:48:37) Oct 14 04:54:08 localhost podman[96476]: 2025-10-14 08:54:08.929216428 +0000 UTC m=+0.254256636 container exec_died ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, managed_by=tripleo_ansible, io.buildah.version=1.33.12, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a-4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': 
'/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step5, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, container_name=nova_compute, 
version=17.1.9, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T14:48:37, io.openshift.expose-services=, batch=17.1_20250721.1, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, vcs-type=git, tcib_managed=true) Oct 14 04:54:08 localhost systemd[1]: ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2.service: Deactivated successfully. Oct 14 04:54:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6. Oct 14 04:54:14 localhost systemd[1]: tmp-crun.Qhx6MU.mount: Deactivated successfully. Oct 14 04:54:14 localhost podman[96535]: 2025-10-14 08:54:14.746126341 +0000 UTC m=+0.081655281 container health_status 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, io.buildah.version=1.33.12, io.openshift.expose-services=, name=rhosp17/openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, batch=17.1_20250721.1, distribution-scope=public, maintainer=OpenStack TripleO Team, container_name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.license_terms=https://www.redhat.com/agreements, version=17.1.9, config_id=tripleo_step1, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, vendor=Red Hat, Inc., release=1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, build-date=2025-07-21T13:07:59, summary=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, architecture=x86_64, managed_by=tripleo_ansible) Oct 14 04:54:14 localhost podman[96535]: 2025-10-14 08:54:14.947245737 +0000 UTC m=+0.282774667 container exec_died 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.33.12, architecture=x86_64, container_name=metrics_qdr, config_id=tripleo_step1, build-date=2025-07-21T13:07:59, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, vendor=Red Hat, Inc., version=17.1.9, io.openshift.expose-services=, com.redhat.component=openstack-qdrouterd-container, managed_by=tripleo_ansible, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, tcib_managed=true, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, distribution-scope=public) Oct 14 04:54:14 localhost systemd[1]: 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6.service: Deactivated successfully. Oct 14 04:54:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8. Oct 14 04:54:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea. 
Oct 14 04:54:29 localhost podman[96564]: 2025-10-14 08:54:29.712096799 +0000 UTC m=+0.057763864 container health_status 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, vcs-type=git, build-date=2025-07-21T13:04:03, version=17.1.9, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, com.redhat.component=openstack-collectd-container, description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, vendor=Red Hat, Inc., architecture=x86_64, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, name=rhosp17/openstack-collectd, batch=17.1_20250721.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, io.openshift.expose-services=, tcib_managed=true, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, release=2) Oct 14 04:54:29 localhost systemd[1]: tmp-crun.q9g1hU.mount: Deactivated successfully. Oct 14 04:54:29 localhost podman[96565]: 2025-10-14 08:54:29.770836201 +0000 UTC m=+0.115039181 container health_status 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, container_name=iscsid, maintainer=OpenStack TripleO Team, io.buildah.version=1.33.12, io.openshift.expose-services=, config_id=tripleo_step3, build-date=2025-07-21T13:27:15, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, release=1, version=17.1.9, description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, batch=17.1_20250721.1, architecture=x86_64, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, name=rhosp17/openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, distribution-scope=public, vendor=Red Hat, Inc., com.redhat.component=openstack-iscsid-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1) Oct 14 04:54:29 localhost podman[96564]: 2025-10-14 08:54:29.794158771 +0000 UTC m=+0.139825866 container exec_died 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.buildah.version=1.33.12, distribution-scope=public, com.redhat.component=openstack-collectd-container, release=2, container_name=collectd, name=rhosp17/openstack-collectd, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., architecture=x86_64, tcib_managed=true, build-date=2025-07-21T13:04:03, managed_by=tripleo_ansible, version=17.1.9, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, batch=17.1_20250721.1, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1) Oct 14 04:54:29 localhost systemd[1]: 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8.service: Deactivated successfully. 
Oct 14 04:54:29 localhost podman[96565]: 2025-10-14 08:54:29.806596214 +0000 UTC m=+0.150799194 container exec_died 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, release=1, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, version=17.1.9, distribution-scope=public, io.openshift.expose-services=, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, container_name=iscsid, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhosp17/openstack-iscsid, architecture=x86_64, build-date=2025-07-21T13:27:15, config_id=tripleo_step3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', 
'/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1, io.buildah.version=1.33.12, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20250721.1) Oct 14 04:54:29 localhost systemd[1]: 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea.service: Deactivated successfully. Oct 14 04:54:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638. Oct 14 04:54:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884. Oct 14 04:54:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f. Oct 14 04:54:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e. Oct 14 04:54:36 localhost systemd[1]: tmp-crun.QWmxsc.mount: Deactivated successfully. 
Oct 14 04:54:36 localhost podman[96730]: 2025-10-14 08:54:36.767707084 +0000 UTC m=+0.102707100 container health_status 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp17/openstack-ceilometer-ipmi, managed_by=tripleo_ansible, io.openshift.expose-services=, distribution-scope=public, com.redhat.component=openstack-ceilometer-ipmi-container, architecture=x86_64, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., version=17.1.9, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, build-date=2025-07-21T15:29:47, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, release=1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, vcs-type=git, batch=17.1_20250721.1, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team) Oct 14 04:54:36 localhost podman[96730]: 2025-10-14 08:54:36.803030095 +0000 UTC m=+0.138030081 container exec_died 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, release=1, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-ipmi-container, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, batch=17.1_20250721.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.9, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, build-date=2025-07-21T15:29:47, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, container_name=ceilometer_agent_ipmi, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1) Oct 14 04:54:36 localhost systemd[1]: 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638.service: Deactivated successfully. Oct 14 04:54:36 localhost podman[96732]: 2025-10-14 08:54:36.82946112 +0000 UTC m=+0.159275806 container health_status d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=logrotate_crond, config_id=tripleo_step4, name=rhosp17/openstack-cron, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, com.redhat.component=openstack-cron-container, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., architecture=x86_64, version=17.1.9, summary=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20250721.1, build-date=2025-07-21T13:07:52, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': 
'/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.buildah.version=1.33.12, managed_by=tripleo_ansible, release=1, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1) Oct 14 04:54:36 localhost podman[96732]: 2025-10-14 08:54:36.840230432 +0000 UTC m=+0.170045088 container exec_died d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, com.redhat.component=openstack-cron-container, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20250721.1, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, container_name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, release=1, tcib_managed=true, architecture=x86_64, name=rhosp17/openstack-cron, build-date=2025-07-21T13:07:52, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 
'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/agreements, managed_by=tripleo_ansible, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, vcs-type=git, version=17.1.9) Oct 14 04:54:36 localhost systemd[1]: d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f.service: Deactivated successfully. 
Oct 14 04:54:36 localhost podman[96733]: 2025-10-14 08:54:36.926796844 +0000 UTC m=+0.251942456 container health_status def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.9, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.expose-services=, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, vendor=Red Hat, Inc., distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true, name=rhosp17/openstack-ceilometer-compute, build-date=2025-07-21T14:45:33, 
com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, release=1, container_name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20250721.1, io.buildah.version=1.33.12, config_id=tripleo_step4, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, vcs-type=git) Oct 14 04:54:36 localhost podman[96731]: 2025-10-14 08:54:36.978981124 +0000 UTC m=+0.314894568 container health_status 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/agreements, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, architecture=x86_64, vendor=Red Hat, Inc., batch=17.1_20250721.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, vcs-type=git, com.redhat.component=openstack-nova-compute-container, container_name=nova_migration_target, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, config_id=tripleo_step4, build-date=2025-07-21T14:48:37, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, version=17.1.9) Oct 14 04:54:36 localhost podman[96733]: 2025-10-14 08:54:36.98825934 +0000 UTC m=+0.313404872 container exec_died def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, name=rhosp17/openstack-ceilometer-compute, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_compute, tcib_managed=true, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, maintainer=OpenStack TripleO Team, architecture=x86_64, build-date=2025-07-21T14:45:33, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, io.openshift.expose-services=, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, com.redhat.license_terms=https://www.redhat.com/agreements, version=17.1.9) Oct 14 04:54:37 localhost systemd[1]: def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e.service: Deactivated successfully. 
Oct 14 04:54:37 localhost podman[96731]: 2025-10-14 08:54:37.371092994 +0000 UTC m=+0.707006478 container exec_died 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, release=1, container_name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-07-21T14:48:37, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, io.buildah.version=1.33.12, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc., config_id=tripleo_step4, managed_by=tripleo_ansible, io.openshift.expose-services=, batch=17.1_20250721.1, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, tcib_managed=true, version=17.1.9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}) Oct 14 04:54:37 localhost systemd[1]: 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884.service: Deactivated successfully. Oct 14 04:54:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e. Oct 14 04:54:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5. Oct 14 04:54:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2. Oct 14 04:54:39 localhost podman[96825]: 2025-10-14 08:54:39.746849983 +0000 UTC m=+0.085089577 container health_status a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, config_id=tripleo_step4, distribution-scope=public, build-date=2025-07-21T13:28:44, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, batch=17.1_20250721.1, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, com.redhat.component=openstack-ovn-controller-container, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 
'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, container_name=ovn_controller, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, name=rhosp17/openstack-ovn-controller, version=17.1.9, release=1, maintainer=OpenStack TripleO Team) Oct 14 04:54:39 localhost podman[96826]: 2025-10-14 08:54:39.724743601 +0000 UTC m=+0.064304125 container health_status ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, build-date=2025-07-21T14:48:37, container_name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a-4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 
'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, release=1, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step5, io.buildah.version=1.33.12, version=17.1.9, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, batch=17.1_20250721.1, managed_by=tripleo_ansible, tcib_managed=true, architecture=x86_64) Oct 14 04:54:39 localhost systemd[1]: tmp-crun.vEnTdd.mount: Deactivated successfully. 
Oct 14 04:54:39 localhost podman[96824]: 2025-10-14 08:54:39.79180434 +0000 UTC m=+0.133050017 container health_status 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, build-date=2025-07-21T16:28:53, vcs-type=git, architecture=x86_64, distribution-scope=public, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, batch=17.1_20250721.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, 
vendor=Red Hat, Inc., com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, io.buildah.version=1.33.12, release=1, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent) Oct 14 04:54:39 localhost podman[96826]: 2025-10-14 08:54:39.808163744 +0000 UTC m=+0.147724288 container exec_died ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.33.12, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, container_name=nova_compute, distribution-scope=public, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step5, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, batch=17.1_20250721.1, vcs-type=git, version=17.1.9, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a-4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': 
'/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, build-date=2025-07-21T14:48:37, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc.) Oct 14 04:54:39 localhost systemd[1]: ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2.service: Deactivated successfully. 
Oct 14 04:54:39 localhost podman[96824]: 2025-10-14 08:54:39.862234583 +0000 UTC m=+0.203480240 container exec_died 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, tcib_managed=true, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, io.buildah.version=1.33.12, build-date=2025-07-21T16:28:53, 
vcs-type=git, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.9, batch=17.1_20250721.1, container_name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, release=1, architecture=x86_64, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, com.redhat.license_terms=https://www.redhat.com/agreements) Oct 14 04:54:39 localhost podman[96825]: 2025-10-14 08:54:39.86279288 +0000 UTC m=+0.201032504 container exec_died a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, architecture=x86_64, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, batch=17.1_20250721.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', 
'/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-type=git, build-date=2025-07-21T13:28:44, container_name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1, io.buildah.version=1.33.12, com.redhat.component=openstack-ovn-controller-container, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, vendor=Red Hat, Inc., version=17.1.9, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, io.openshift.expose-services=) Oct 14 04:54:39 localhost podman[96825]: unhealthy Oct 14 04:54:39 localhost systemd[1]: a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5.service: Main process exited, code=exited, status=1/FAILURE Oct 14 04:54:39 localhost systemd[1]: a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5.service: Failed with result 'exit-code'. Oct 14 04:54:39 localhost podman[96824]: unhealthy Oct 14 04:54:39 localhost systemd[1]: 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e.service: Main process exited, code=exited, status=1/FAILURE Oct 14 04:54:39 localhost systemd[1]: 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e.service: Failed with result 'exit-code'. Oct 14 04:54:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6. 
Oct 14 04:54:45 localhost podman[96888]: 2025-10-14 08:54:45.748723043 +0000 UTC m=+0.088974437 container health_status 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, architecture=x86_64, managed_by=tripleo_ansible, version=17.1.9, io.openshift.expose-services=, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, build-date=2025-07-21T13:07:59, com.redhat.component=openstack-qdrouterd-container, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, container_name=metrics_qdr, name=rhosp17/openstack-qdrouterd, release=1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', 
'/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, vcs-type=git, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd) Oct 14 04:54:45 localhost podman[96888]: 2025-10-14 08:54:45.993194466 +0000 UTC m=+0.333445860 container exec_died 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, architecture=x86_64, vcs-type=git, version=17.1.9, io.buildah.version=1.33.12, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, release=1, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, batch=17.1_20250721.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, name=rhosp17/openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, distribution-scope=public, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc., build-date=2025-07-21T13:07:59, description=Red Hat OpenStack Platform 17.1 qdrouterd) Oct 14 04:54:46 localhost systemd[1]: 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6.service: Deactivated successfully. Oct 14 04:55:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8. Oct 14 04:55:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea. Oct 14 04:55:00 localhost systemd[1]: tmp-crun.PCkALt.mount: Deactivated successfully. 
Oct 14 04:55:00 localhost podman[96918]: 2025-10-14 08:55:00.767977704 +0000 UTC m=+0.106648591 container health_status 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1, io.buildah.version=1.33.12, distribution-scope=public, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, vcs-type=git, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/agreements, managed_by=tripleo_ansible, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, name=rhosp17/openstack-collectd, batch=17.1_20250721.1, summary=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., version=17.1.9, container_name=collectd, release=2, architecture=x86_64, tcib_managed=true, build-date=2025-07-21T13:04:03) Oct 14 04:55:00 localhost podman[96918]: 2025-10-14 08:55:00.781133861 +0000 UTC m=+0.119804808 container exec_died 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, version=17.1.9, io.openshift.tags=rhosp osp openstack osp-17.1, distribution-scope=public, name=rhosp17/openstack-collectd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, build-date=2025-07-21T13:04:03, container_name=collectd, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, batch=17.1_20250721.1, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, vendor=Red Hat, Inc., release=2, com.redhat.component=openstack-collectd-container, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, config_id=tripleo_step3, managed_by=tripleo_ansible, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 collectd) Oct 14 04:55:00 localhost systemd[1]: 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8.service: Deactivated successfully. Oct 14 04:55:00 localhost systemd[1]: tmp-crun.UuEkYA.mount: Deactivated successfully. 
Oct 14 04:55:00 localhost podman[96919]: 2025-10-14 08:55:00.858048755 +0000 UTC m=+0.193890615 container health_status 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, vendor=Red Hat, Inc., io.openshift.expose-services=, com.redhat.component=openstack-iscsid-container, config_id=tripleo_step3, vcs-type=git, release=1, architecture=x86_64, name=rhosp17/openstack-iscsid, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, maintainer=OpenStack TripleO Team, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-07-21T13:27:15, 
com.redhat.license_terms=https://www.redhat.com/agreements, version=17.1.9, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, batch=17.1_20250721.1, io.openshift.tags=rhosp osp openstack osp-17.1) Oct 14 04:55:00 localhost podman[96919]: 2025-10-14 08:55:00.869936751 +0000 UTC m=+0.205778591 container exec_died 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20250721.1, architecture=x86_64, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, com.redhat.component=openstack-iscsid-container, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', 
'/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, container_name=iscsid, tcib_managed=true, distribution-scope=public, build-date=2025-07-21T13:27:15, com.redhat.license_terms=https://www.redhat.com/agreements, managed_by=tripleo_ansible, release=1, version=17.1.9, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., name=rhosp17/openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid) Oct 14 04:55:00 localhost systemd[1]: 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea.service: Deactivated successfully. Oct 14 04:55:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638. Oct 14 04:55:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884. Oct 14 04:55:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f. Oct 14 04:55:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e. Oct 14 04:55:07 localhost systemd[1]: tmp-crun.JDC7M7.mount: Deactivated successfully. 
Oct 14 04:55:07 localhost podman[96958]: 2025-10-14 08:55:07.724233565 +0000 UTC m=+0.063079158 container health_status 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, name=rhosp17/openstack-nova-compute, container_name=nova_migration_target, architecture=x86_64, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, build-date=2025-07-21T14:48:37, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, managed_by=tripleo_ansible, release=1, version=17.1.9, com.redhat.component=openstack-nova-compute-container, batch=17.1_20250721.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}) Oct 14 04:55:07 localhost podman[96959]: 2025-10-14 08:55:07.78729332 +0000 UTC m=+0.121349005 container health_status d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/agreements, tcib_managed=true, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, release=1, distribution-scope=public, batch=17.1_20250721.1, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, io.buildah.version=1.33.12, vendor=Red Hat, Inc., container_name=logrotate_crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, build-date=2025-07-21T13:07:52, maintainer=OpenStack TripleO Team, version=17.1.9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, vcs-type=git, architecture=x86_64, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c) Oct 14 04:55:07 localhost podman[96959]: 2025-10-14 08:55:07.795894186 +0000 UTC m=+0.129949871 container exec_died d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.openshift.expose-services=, release=1, batch=17.1_20250721.1, managed_by=tripleo_ansible, build-date=2025-07-21T13:07:52, config_id=tripleo_step4, com.redhat.component=openstack-cron-container, summary=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, vendor=Red Hat, Inc., version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, name=rhosp17/openstack-cron, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, container_name=logrotate_crond, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1) Oct 14 04:55:07 localhost systemd[1]: d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f.service: Deactivated successfully. 
Oct 14 04:55:07 localhost podman[96957]: 2025-10-14 08:55:07.844259108 +0000 UTC m=+0.180989715 container health_status 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, container_name=ceilometer_agent_ipmi, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, tcib_managed=true, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=openstack-ceilometer-ipmi-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-07-21T15:29:47, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, config_id=tripleo_step4, name=rhosp17/openstack-ceilometer-ipmi, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, batch=17.1_20250721.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1, distribution-scope=public, version=17.1.9, release=1, vendor=Red Hat, Inc.) Oct 14 04:55:07 localhost podman[96960]: 2025-10-14 08:55:07.890490125 +0000 UTC m=+0.223781086 container health_status def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, build-date=2025-07-21T14:45:33, tcib_managed=true, maintainer=OpenStack TripleO Team, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-compute-container, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20250721.1, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, version=17.1.9, container_name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, architecture=x86_64, io.buildah.version=1.33.12, release=1, vendor=Red Hat, Inc., name=rhosp17/openstack-ceilometer-compute) Oct 14 04:55:07 localhost podman[96957]: 2025-10-14 08:55:07.902128624 +0000 UTC m=+0.238859191 container exec_died 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, batch=17.1_20250721.1, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2025-07-21T15:29:47, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/agreements, version=17.1.9, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, vcs-type=git, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1, io.openshift.expose-services=, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, container_name=ceilometer_agent_ipmi, distribution-scope=public, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-ipmi) Oct 14 04:55:07 localhost systemd[1]: 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638.service: Deactivated successfully. 
Oct 14 04:55:07 localhost podman[96960]: 2025-10-14 08:55:07.921611315 +0000 UTC m=+0.254902246 container exec_died def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T14:45:33, name=rhosp17/openstack-ceilometer-compute, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_compute, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20250721.1, release=1, 
com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, config_id=tripleo_step4, version=17.1.9, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-compute-container, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3) Oct 14 04:55:07 localhost systemd[1]: def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e.service: Deactivated successfully. Oct 14 04:55:08 localhost podman[96958]: 2025-10-14 08:55:08.136380333 +0000 UTC m=+0.475225916 container exec_died 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, batch=17.1_20250721.1, managed_by=tripleo_ansible, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, vendor=Red Hat, Inc., architecture=x86_64, build-date=2025-07-21T14:48:37, container_name=nova_migration_target, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, version=17.1.9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, io.openshift.expose-services=, distribution-scope=public, release=1) Oct 14 04:55:08 localhost systemd[1]: 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884.service: Deactivated successfully. Oct 14 04:55:08 localhost systemd[1]: tmp-crun.KZQ7DR.mount: Deactivated successfully. Oct 14 04:55:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e. Oct 14 04:55:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5. Oct 14 04:55:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2. Oct 14 04:55:10 localhost systemd[1]: tmp-crun.SFBbKp.mount: Deactivated successfully. 
Oct 14 04:55:10 localhost podman[97052]: 2025-10-14 08:55:10.725317239 +0000 UTC m=+0.067480693 container health_status 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, managed_by=tripleo_ansible, architecture=x86_64, container_name=ovn_metadata_agent, version=17.1.9, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, build-date=2025-07-21T16:28:53, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, release=1, batch=17.1_20250721.1, io.openshift.expose-services=, config_id=tripleo_step4, distribution-scope=public, name=rhosp17/openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true) Oct 14 04:55:10 localhost podman[97056]: 2025-10-14 08:55:10.737054771 +0000 UTC m=+0.069672651 container health_status ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, version=17.1.9, name=rhosp17/openstack-nova-compute, release=1, maintainer=OpenStack TripleO Team, build-date=2025-07-21T14:48:37, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20250721.1, config_id=tripleo_step5, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 
'49c4309af9a4fea3d3f53b6222780f5a-4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.33.12, vcs-type=git, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc., architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d) Oct 14 04:55:10 localhost podman[97056]: 2025-10-14 08:55:10.762084973 +0000 UTC m=+0.094702843 container exec_died 
ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, vendor=Red Hat, Inc., managed_by=tripleo_ansible, build-date=2025-07-21T14:48:37, io.openshift.tags=rhosp osp openstack osp-17.1, config_id=tripleo_step5, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a-4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', 
'/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, container_name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, version=17.1.9, com.redhat.component=openstack-nova-compute-container, vcs-type=git, release=1, distribution-scope=public, batch=17.1_20250721.1, io.openshift.expose-services=, architecture=x86_64) Oct 14 04:55:10 localhost podman[97053]: 2025-10-14 08:55:10.770631357 +0000 UTC m=+0.107798747 container health_status a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, batch=17.1_20250721.1, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 
'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.expose-services=, version=17.1.9, config_id=tripleo_step4, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, architecture=x86_64, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, container_name=ovn_controller, managed_by=tripleo_ansible, vendor=Red Hat, Inc., build-date=2025-07-21T13:28:44, com.redhat.component=openstack-ovn-controller-container, release=1) Oct 14 04:55:10 localhost systemd[1]: ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2.service: Deactivated successfully. Oct 14 04:55:10 localhost podman[97053]: 2025-10-14 08:55:10.784084592 +0000 UTC m=+0.121252002 container exec_died a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, name=rhosp17/openstack-ovn-controller, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, release=1, vendor=Red Hat, Inc., batch=17.1_20250721.1, maintainer=OpenStack TripleO Team, version=17.1.9, architecture=x86_64, 
vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, build-date=2025-07-21T13:28:44, container_name=ovn_controller, distribution-scope=public, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}) Oct 14 04:55:10 localhost podman[97053]: unhealthy Oct 14 04:55:10 localhost systemd[1]: a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5.service: Main process exited, code=exited, status=1/FAILURE Oct 14 04:55:10 localhost systemd[1]: a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5.service: Failed with result 'exit-code'. 
Oct 14 04:55:10 localhost podman[97052]: 2025-10-14 08:55:10.81996842 +0000 UTC m=+0.162131864 container exec_died 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., version=17.1.9, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-07-21T16:28:53, release=1, managed_by=tripleo_ansible, architecture=x86_64, name=rhosp17/openstack-neutron-metadata-agent-ovn, tcib_managed=true, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, vcs-type=git, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.33.12, io.openshift.expose-services=, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Oct 14 04:55:10 localhost podman[97052]: unhealthy Oct 14 04:55:10 localhost systemd[1]: 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e.service: Main process exited, code=exited, status=1/FAILURE Oct 14 04:55:10 localhost systemd[1]: 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e.service: Failed with result 'exit-code'. Oct 14 04:55:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6. Oct 14 04:55:16 localhost systemd[1]: tmp-crun.B8tsep.mount: Deactivated successfully. 
Oct 14 04:55:16 localhost podman[97114]: 2025-10-14 08:55:16.750837329 +0000 UTC m=+0.097875732 container health_status 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc., version=17.1.9, tcib_managed=true, io.openshift.expose-services=, batch=17.1_20250721.1, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, name=rhosp17/openstack-qdrouterd, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, io.buildah.version=1.33.12, managed_by=tripleo_ansible, vcs-type=git, build-date=2025-07-21T13:07:59, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd, release=1, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public) Oct 14 04:55:16 localhost podman[97114]: 2025-10-14 08:55:16.97545813 +0000 UTC m=+0.322496533 container exec_died 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1, architecture=x86_64, build-date=2025-07-21T13:07:59, tcib_managed=true, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', 
'/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.expose-services=, vendor=Red Hat, Inc., version=17.1.9, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, io.buildah.version=1.33.12, release=1, batch=17.1_20250721.1, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, vcs-type=git, com.redhat.component=openstack-qdrouterd-container) Oct 14 04:55:16 localhost systemd[1]: 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6.service: Deactivated successfully. Oct 14 04:55:30 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Oct 14 04:55:30 localhost recover_tripleo_nova_virtqemud[97144]: 62551 Oct 14 04:55:30 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Oct 14 04:55:30 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. Oct 14 04:55:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8. Oct 14 04:55:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea. Oct 14 04:55:31 localhost systemd[1]: tmp-crun.fFzHUB.mount: Deactivated successfully. 
Oct 14 04:55:31 localhost podman[97145]: 2025-10-14 08:55:31.740164085 +0000 UTC m=+0.077672307 container health_status 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, name=rhosp17/openstack-collectd, version=17.1.9, config_id=tripleo_step3, release=2, batch=17.1_20250721.1, build-date=2025-07-21T13:04:03, com.redhat.component=openstack-collectd-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, vendor=Red Hat, Inc., io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, vcs-type=git, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, summary=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team) Oct 14 04:55:31 localhost podman[97146]: 2025-10-14 08:55:31.752840797 +0000 UTC m=+0.084108666 container health_status 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, release=1, container_name=iscsid, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, io.buildah.version=1.33.12, version=17.1.9, architecture=x86_64, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, build-date=2025-07-21T13:27:15, name=rhosp17/openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step3, batch=17.1_20250721.1, com.redhat.component=openstack-iscsid-container, tcib_managed=true, vcs-type=git) Oct 14 04:55:31 localhost podman[97146]: 2025-10-14 08:55:31.764344832 +0000 UTC m=+0.095612701 container exec_died 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, batch=17.1_20250721.1, config_id=tripleo_step3, container_name=iscsid, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, build-date=2025-07-21T13:27:15, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=openstack-iscsid-container, distribution-scope=public, name=rhosp17/openstack-iscsid, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.9, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., release=1, vcs-type=git) Oct 14 04:55:31 localhost systemd[1]: 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea.service: Deactivated successfully. 
Oct 14 04:55:31 localhost podman[97145]: 2025-10-14 08:55:31.80607635 +0000 UTC m=+0.143584602 container exec_died 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, architecture=x86_64, name=rhosp17/openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, config_id=tripleo_step3, distribution-scope=public, maintainer=OpenStack TripleO Team, release=2, com.redhat.license_terms=https://www.redhat.com/agreements, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, io.buildah.version=1.33.12, tcib_managed=true, batch=17.1_20250721.1, version=17.1.9, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-07-21T13:04:03, vcs-type=git, io.openshift.expose-services=, managed_by=tripleo_ansible) Oct 14 04:55:31 localhost systemd[1]: 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8.service: Deactivated successfully. Oct 14 04:55:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638. Oct 14 04:55:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884. Oct 14 04:55:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f. Oct 14 04:55:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e. Oct 14 04:55:38 localhost systemd[1]: tmp-crun.poy6KF.mount: Deactivated successfully. 
Oct 14 04:55:38 localhost podman[97259]: 2025-10-14 08:55:38.860631433 +0000 UTC m=+0.194351778 container health_status 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, tcib_managed=true, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, com.redhat.license_terms=https://www.redhat.com/agreements, batch=17.1_20250721.1, io.openshift.tags=rhosp osp openstack osp-17.1, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1, maintainer=OpenStack TripleO Team, version=17.1.9, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-ipmi-container, 
vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, vendor=Red Hat, Inc., build-date=2025-07-21T15:29:47, architecture=x86_64, name=rhosp17/openstack-ceilometer-ipmi, managed_by=tripleo_ansible, container_name=ceilometer_agent_ipmi, distribution-scope=public) Oct 14 04:55:38 localhost podman[97261]: 2025-10-14 08:55:38.771275756 +0000 UTC m=+0.100147341 container health_status d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, container_name=logrotate_crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, summary=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, com.redhat.component=openstack-cron-container, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, batch=17.1_20250721.1, distribution-scope=public, release=1, architecture=x86_64, build-date=2025-07-21T13:07:52, tcib_managed=true, vcs-type=git, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, version=17.1.9, vendor=Red Hat, Inc., vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, com.redhat.license_terms=https://www.redhat.com/agreements) Oct 14 04:55:38 localhost podman[97261]: 2025-10-14 08:55:38.904353152 +0000 UTC m=+0.233224767 container exec_died d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, managed_by=tripleo_ansible, container_name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-cron, com.redhat.component=openstack-cron-container, summary=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, build-date=2025-07-21T13:07:52, vcs-type=git, io.buildah.version=1.33.12, release=1, tcib_managed=true, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, batch=17.1_20250721.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, architecture=x86_64, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, version=17.1.9, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': 
'/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, com.redhat.license_terms=https://www.redhat.com/agreements) Oct 14 04:55:38 localhost systemd[1]: d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f.service: Deactivated successfully. 
Oct 14 04:55:38 localhost podman[97259]: 2025-10-14 08:55:38.923150122 +0000 UTC m=+0.256870437 container exec_died 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, name=rhosp17/openstack-ceilometer-ipmi, release=1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, batch=17.1_20250721.1, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, version=17.1.9, vendor=Red Hat, Inc., build-date=2025-07-21T15:29:47, vcs-type=git, architecture=x86_64, container_name=ceilometer_agent_ipmi, maintainer=OpenStack TripleO Team, distribution-scope=public, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, io.openshift.expose-services=, tcib_managed=true, io.buildah.version=1.33.12) Oct 14 04:55:38 localhost systemd[1]: 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638.service: Deactivated successfully. Oct 14 04:55:38 localhost podman[97262]: 2025-10-14 08:55:38.822471446 +0000 UTC m=+0.151465736 container health_status def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, release=1, vcs-type=git, vendor=Red Hat, Inc., batch=17.1_20250721.1, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.expose-services=, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, version=17.1.9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, tcib_managed=true, build-date=2025-07-21T14:45:33, name=rhosp17/openstack-ceilometer-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, container_name=ceilometer_agent_compute, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public) Oct 14 04:55:38 localhost podman[97260]: 2025-10-14 08:55:38.929430176 +0000 UTC m=+0.263002296 container health_status 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, maintainer=OpenStack TripleO Team, vcs-type=git, batch=17.1_20250721.1, container_name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, io.buildah.version=1.33.12, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.9, architecture=x86_64, vendor=Red Hat, Inc., config_id=tripleo_step4, release=1, build-date=2025-07-21T14:48:37, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d) Oct 14 04:55:39 localhost podman[97262]: 2025-10-14 08:55:39.005176273 +0000 UTC m=+0.334170613 container exec_died def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, managed_by=tripleo_ansible, config_id=tripleo_step4, io.openshift.expose-services=, build-date=2025-07-21T14:45:33, name=rhosp17/openstack-ceilometer-compute, release=1, distribution-scope=public, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, 
com.redhat.component=openstack-ceilometer-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., architecture=x86_64, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.9, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, container_name=ceilometer_agent_compute, batch=17.1_20250721.1) Oct 14 04:55:39 localhost systemd[1]: def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e.service: Deactivated successfully. 
Oct 14 04:55:39 localhost podman[97260]: 2025-10-14 08:55:39.327141198 +0000 UTC m=+0.660713298 container exec_died 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, batch=17.1_20250721.1, container_name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, build-date=2025-07-21T14:48:37, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, config_id=tripleo_step4, io.buildah.version=1.33.12, release=1, 
description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, vcs-type=git, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, version=17.1.9) Oct 14 04:55:39 localhost systemd[1]: 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884.service: Deactivated successfully. Oct 14 04:55:39 localhost systemd[1]: tmp-crun.rvNyZt.mount: Deactivated successfully. Oct 14 04:55:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e. Oct 14 04:55:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5. Oct 14 04:55:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2. Oct 14 04:55:41 localhost systemd[1]: tmp-crun.Zz4diK.mount: Deactivated successfully. 
Oct 14 04:55:41 localhost podman[97356]: 2025-10-14 08:55:41.760115123 +0000 UTC m=+0.097025034 container health_status 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, batch=17.1_20250721.1, container_name=ovn_metadata_agent, vendor=Red Hat, Inc., com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.expose-services=, 
config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, distribution-scope=public, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-type=git, version=17.1.9, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T16:28:53, io.buildah.version=1.33.12) Oct 14 04:55:41 localhost systemd[1]: tmp-crun.i24C9W.mount: Deactivated successfully. Oct 14 04:55:41 localhost podman[97356]: 2025-10-14 08:55:41.805116702 +0000 UTC m=+0.142026553 container exec_died 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, batch=17.1_20250721.1, release=1, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, build-date=2025-07-21T16:28:53, io.openshift.tags=rhosp osp openstack osp-17.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, tcib_managed=true, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, container_name=ovn_metadata_agent, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, architecture=x86_64, version=17.1.9, vcs-type=git, config_id=tripleo_step4) Oct 14 04:55:41 localhost podman[97356]: unhealthy Oct 14 04:55:41 localhost systemd[1]: 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e.service: Main process exited, code=exited, status=1/FAILURE Oct 14 04:55:41 localhost systemd[1]: 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e.service: Failed with result 
'exit-code'. Oct 14 04:55:41 localhost podman[97357]: 2025-10-14 08:55:41.806381311 +0000 UTC m=+0.137241266 container health_status a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vendor=Red Hat, Inc., com.redhat.component=openstack-ovn-controller-container, vcs-type=git, config_id=tripleo_step4, distribution-scope=public, architecture=x86_64, build-date=2025-07-21T13:28:44, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, name=rhosp17/openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, version=17.1.9, batch=17.1_20250721.1) Oct 14 04:55:41 localhost podman[97358]: 2025-10-14 
08:55:41.860615194 +0000 UTC m=+0.187973031 container health_status ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, container_name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., io.openshift.expose-services=, vcs-type=git, tcib_managed=true, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, batch=17.1_20250721.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a-4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp17/openstack-nova-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, version=17.1.9, build-date=2025-07-21T14:48:37, config_id=tripleo_step5, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, io.buildah.version=1.33.12, release=1, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d) Oct 14 04:55:41 localhost podman[97357]: 2025-10-14 08:55:41.888175655 +0000 UTC m=+0.219035630 container exec_died a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, release=1, config_id=tripleo_step4, distribution-scope=public, tcib_managed=true, vendor=Red Hat, Inc., com.redhat.component=openstack-ovn-controller-container, io.openshift.tags=rhosp osp openstack osp-17.1, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, build-date=2025-07-21T13:28:44, io.buildah.version=1.33.12, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 
'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, container_name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/agreements, managed_by=tripleo_ansible, version=17.1.9, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, architecture=x86_64, batch=17.1_20250721.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller) Oct 14 04:55:41 localhost podman[97357]: unhealthy Oct 14 04:55:41 localhost systemd[1]: a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5.service: Main process exited, code=exited, status=1/FAILURE Oct 14 04:55:41 localhost systemd[1]: a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5.service: Failed with result 'exit-code'. 
Oct 14 04:55:41 localhost podman[97358]: 2025-10-14 08:55:41.910762412 +0000 UTC m=+0.238120229 container exec_died ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, name=rhosp17/openstack-nova-compute, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, config_id=tripleo_step5, vcs-type=git, container_name=nova_compute, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, tcib_managed=true, build-date=2025-07-21T14:48:37, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.33.12, version=17.1.9, batch=17.1_20250721.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a-4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, com.redhat.component=openstack-nova-compute-container) Oct 14 04:55:41 localhost systemd[1]: ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2.service: Deactivated successfully. Oct 14 04:55:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6. 
Oct 14 04:55:47 localhost podman[97421]: 2025-10-14 08:55:47.738554922 +0000 UTC m=+0.079295898 container health_status 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.9, io.buildah.version=1.33.12, architecture=x86_64, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, build-date=2025-07-21T13:07:59, config_id=tripleo_step1, batch=17.1_20250721.1, com.redhat.component=openstack-qdrouterd-container, name=rhosp17/openstack-qdrouterd, release=1, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vendor=Red Hat, Inc., container_name=metrics_qdr, tcib_managed=true) Oct 14 04:55:47 localhost podman[97421]: 2025-10-14 08:55:47.924014614 +0000 UTC m=+0.264755580 container exec_died 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-qdrouterd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', 
'/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., tcib_managed=true, distribution-scope=public, io.buildah.version=1.33.12, config_id=tripleo_step1, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, version=17.1.9, release=1, build-date=2025-07-21T13:07:59, container_name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20250721.1, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, architecture=x86_64, managed_by=tripleo_ansible) Oct 14 04:55:47 localhost systemd[1]: 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6.service: Deactivated successfully. Oct 14 04:56:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8. Oct 14 04:56:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea. 
Oct 14 04:56:02 localhost podman[97450]: 2025-10-14 08:56:02.746041319 +0000 UTC m=+0.082786685 container health_status 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, config_id=tripleo_step3, io.buildah.version=1.33.12, managed_by=tripleo_ansible, name=rhosp17/openstack-iscsid, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.license_terms=https://www.redhat.com/agreements, batch=17.1_20250721.1, distribution-scope=public, maintainer=OpenStack TripleO Team, tcib_managed=true, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-07-21T13:27:15, version=17.1.9, com.redhat.component=openstack-iscsid-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, release=1, container_name=iscsid, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1) Oct 14 04:56:02 localhost podman[97450]: 2025-10-14 08:56:02.760073972 +0000 UTC m=+0.096819268 container exec_died 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, batch=17.1_20250721.1, build-date=2025-07-21T13:27:15, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, 
com.redhat.component=openstack-iscsid-container, container_name=iscsid, version=17.1.9, vcs-type=git, config_id=tripleo_step3, io.buildah.version=1.33.12, name=rhosp17/openstack-iscsid, vendor=Red Hat, Inc., release=1, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, architecture=x86_64, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1, distribution-scope=public) Oct 14 04:56:02 localhost systemd[1]: 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea.service: Deactivated successfully. Oct 14 04:56:02 localhost systemd[1]: tmp-crun.u0Umd0.mount: Deactivated successfully. Oct 14 04:56:02 localhost podman[97449]: 2025-10-14 08:56:02.864482345 +0000 UTC m=+0.203103729 container health_status 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, batch=17.1_20250721.1, maintainer=OpenStack TripleO Team, tcib_managed=true, com.redhat.component=openstack-collectd-container, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.buildah.version=1.33.12, build-date=2025-07-21T13:04:03, name=rhosp17/openstack-collectd, release=2, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, container_name=collectd, io.openshift.expose-services=, architecture=x86_64, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, vendor=Red Hat, Inc., version=17.1.9, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, description=Red Hat OpenStack Platform 17.1 collectd) Oct 14 04:56:02 localhost podman[97449]: 2025-10-14 08:56:02.880404385 +0000 UTC m=+0.219025759 container exec_died 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, summary=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20250721.1, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, config_id=tripleo_step3, container_name=collectd, io.openshift.expose-services=, managed_by=tripleo_ansible, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1, release=2, vendor=Red Hat, Inc., vcs-type=git, version=17.1.9, io.buildah.version=1.33.12, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, build-date=2025-07-21T13:04:03, 
tcib_managed=true, name=rhosp17/openstack-collectd) Oct 14 04:56:02 localhost systemd[1]: 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8.service: Deactivated successfully. Oct 14 04:56:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638. Oct 14 04:56:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884. Oct 14 04:56:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f. Oct 14 04:56:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e. Oct 14 04:56:09 localhost systemd[1]: tmp-crun.WUWhGz.mount: Deactivated successfully. Oct 14 04:56:09 localhost podman[97493]: 2025-10-14 08:56:09.75380343 +0000 UTC m=+0.081507686 container health_status def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, config_id=tripleo_step4, architecture=x86_64, release=1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-compute-container, name=rhosp17/openstack-ceilometer-compute, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T14:45:33, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, io.buildah.version=1.33.12, batch=17.1_20250721.1, version=17.1.9, distribution-scope=public, managed_by=tripleo_ansible) Oct 14 04:56:09 localhost systemd[1]: tmp-crun.Q99qLl.mount: Deactivated successfully. 
Oct 14 04:56:09 localhost podman[97490]: 2025-10-14 08:56:09.800923414 +0000 UTC m=+0.136821323 container health_status 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, com.redhat.component=openstack-nova-compute-container, container_name=nova_migration_target, build-date=2025-07-21T14:48:37, io.buildah.version=1.33.12, tcib_managed=true, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, version=17.1.9, distribution-scope=public, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step4, vendor=Red Hat, Inc., release=1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, managed_by=tripleo_ansible, batch=17.1_20250721.1) Oct 14 04:56:09 localhost podman[97493]: 2025-10-14 08:56:09.80919786 +0000 UTC m=+0.136902146 container exec_died def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, release=1, tcib_managed=true, vcs-type=git, config_id=tripleo_step4, container_name=ceilometer_agent_compute, batch=17.1_20250721.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, version=17.1.9, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, io.openshift.expose-services=, architecture=x86_64, com.redhat.component=openstack-ceilometer-compute-container, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-compute, build-date=2025-07-21T14:45:33, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3) Oct 14 04:56:09 localhost systemd[1]: def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e.service: Deactivated successfully. 
Oct 14 04:56:09 localhost podman[97489]: 2025-10-14 08:56:09.850932497 +0000 UTC m=+0.191856821 container health_status 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, batch=17.1_20250721.1, container_name=ceilometer_agent_ipmi, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-07-21T15:29:47, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, name=rhosp17/openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, distribution-scope=public, release=1, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, architecture=x86_64, version=17.1.9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.33.12, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.expose-services=, managed_by=tripleo_ansible) Oct 14 04:56:09 localhost podman[97491]: 2025-10-14 08:56:09.911550287 +0000 UTC m=+0.245707392 container health_status d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 cron, release=1, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, batch=17.1_20250721.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, name=rhosp17/openstack-cron, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, vcs-type=git, build-date=2025-07-21T13:07:52, distribution-scope=public, io.buildah.version=1.33.12, version=17.1.9, architecture=x86_64, com.redhat.component=openstack-cron-container, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, maintainer=OpenStack TripleO Team, container_name=logrotate_crond) Oct 14 04:56:09 localhost podman[97491]: 2025-10-14 08:56:09.924416125 +0000 UTC m=+0.258573220 container exec_died d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, summary=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, io.buildah.version=1.33.12, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, name=rhosp17/openstack-cron, version=17.1.9, managed_by=tripleo_ansible, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, vendor=Red Hat, Inc., vcs-type=git, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, batch=17.1_20250721.1, io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T13:07:52, container_name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, release=1, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c) Oct 14 04:56:09 localhost podman[97489]: 2025-10-14 08:56:09.930495202 +0000 UTC m=+0.271419516 container exec_died 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.33.12, name=rhosp17/openstack-ceilometer-ipmi, batch=17.1_20250721.1, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, architecture=x86_64, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, io.openshift.expose-services=, version=17.1.9, io.openshift.tags=rhosp osp openstack osp-17.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-07-21T15:29:47, container_name=ceilometer_agent_ipmi, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=openstack-ceilometer-ipmi-container, distribution-scope=public, tcib_managed=true) Oct 14 04:56:09 localhost systemd[1]: 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638.service: Deactivated successfully. Oct 14 04:56:09 localhost systemd[1]: d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f.service: Deactivated successfully. 
Oct 14 04:56:10 localhost podman[97490]: 2025-10-14 08:56:10.15792104 +0000 UTC m=+0.493818989 container exec_died 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, io.buildah.version=1.33.12, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, container_name=nova_migration_target, config_id=tripleo_step4, io.openshift.expose-services=, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20250721.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1, build-date=2025-07-21T14:48:37, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, vcs-type=git, distribution-scope=public) Oct 14 04:56:10 localhost systemd[1]: 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884.service: Deactivated successfully. Oct 14 04:56:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e. Oct 14 04:56:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5. Oct 14 04:56:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2. Oct 14 04:56:12 localhost podman[97585]: 2025-10-14 08:56:12.749772635 +0000 UTC m=+0.085416756 container health_status ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.9, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, managed_by=tripleo_ansible, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, tcib_managed=true, maintainer=OpenStack TripleO Team, config_id=tripleo_step5, batch=17.1_20250721.1, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-07-21T14:48:37, io.buildah.version=1.33.12, container_name=nova_compute, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a-4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-nova-compute, vcs-type=git) Oct 14 04:56:12 localhost podman[97585]: 2025-10-14 08:56:12.780008218 +0000 UTC 
m=+0.115652349 container exec_died ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.openshift.tags=rhosp osp openstack osp-17.1, container_name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.9, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, architecture=x86_64, io.buildah.version=1.33.12, distribution-scope=public, maintainer=OpenStack TripleO Team, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, managed_by=tripleo_ansible, config_id=tripleo_step5, release=1, com.redhat.component=openstack-nova-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a-4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, build-date=2025-07-21T14:48:37, name=rhosp17/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20250721.1, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements) Oct 14 04:56:12 localhost systemd[1]: tmp-crun.mOINvC.mount: Deactivated successfully. Oct 14 04:56:12 localhost podman[97583]: 2025-10-14 08:56:12.798395096 +0000 UTC m=+0.140279030 container health_status 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, vcs-type=git, version=17.1.9, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.buildah.version=1.33.12, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, maintainer=OpenStack TripleO Team, release=1, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, vendor=Red Hat, Inc., architecture=x86_64, 
distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, container_name=ovn_metadata_agent, name=rhosp17/openstack-neutron-metadata-agent-ovn, build-date=2025-07-21T16:28:53, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, batch=17.1_20250721.1) Oct 14 04:56:12 localhost systemd[1]: ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2.service: Deactivated successfully. 
Oct 14 04:56:12 localhost podman[97583]: 2025-10-14 08:56:12.814300406 +0000 UTC m=+0.156184380 container exec_died 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, release=1, vendor=Red Hat, Inc., build-date=2025-07-21T16:28:53, io.buildah.version=1.33.12, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, container_name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, version=17.1.9, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, 
tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, managed_by=tripleo_ansible, name=rhosp17/openstack-neutron-metadata-agent-ovn, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, batch=17.1_20250721.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, vcs-type=git) Oct 14 04:56:12 localhost podman[97584]: 2025-10-14 08:56:12.853790055 +0000 UTC m=+0.191166600 container health_status a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20250721.1, io.buildah.version=1.33.12, io.openshift.expose-services=, architecture=x86_64, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., description=Red Hat 
OpenStack Platform 17.1 ovn-controller, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, com.redhat.component=openstack-ovn-controller-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, tcib_managed=true, container_name=ovn_controller, release=1, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, distribution-scope=public, vcs-type=git, name=rhosp17/openstack-ovn-controller, build-date=2025-07-21T13:28:44) Oct 14 04:56:12 localhost podman[97583]: unhealthy Oct 14 04:56:12 localhost systemd[1]: 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e.service: Main process exited, code=exited, status=1/FAILURE Oct 14 04:56:12 localhost systemd[1]: 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e.service: Failed with result 'exit-code'. 
Oct 14 04:56:12 localhost podman[97584]: 2025-10-14 08:56:12.900355752 +0000 UTC m=+0.237732247 container exec_died a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, name=rhosp17/openstack-ovn-controller, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.component=openstack-ovn-controller-container, distribution-scope=public, io.openshift.expose-services=, vcs-type=git, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, batch=17.1_20250721.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, release=1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, io.buildah.version=1.33.12, version=17.1.9, container_name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, tcib_managed=true, build-date=2025-07-21T13:28:44) Oct 14 04:56:12 localhost podman[97584]: unhealthy Oct 14 04:56:12 localhost systemd[1]: 
a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5.service: Main process exited, code=exited, status=1/FAILURE Oct 14 04:56:12 localhost systemd[1]: a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5.service: Failed with result 'exit-code'. Oct 14 04:56:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6. Oct 14 04:56:18 localhost podman[97647]: 2025-10-14 08:56:18.747098354 +0000 UTC m=+0.083084184 container health_status 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, distribution-scope=public, com.redhat.component=openstack-qdrouterd-container, name=rhosp17/openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, vcs-type=git, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, batch=17.1_20250721.1, managed_by=tripleo_ansible, release=1, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-07-21T13:07:59, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, maintainer=OpenStack TripleO Team, version=17.1.9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1) Oct 14 04:56:18 localhost podman[97647]: 2025-10-14 08:56:18.953269736 +0000 UTC m=+0.289255646 container exec_died 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, name=rhosp17/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, build-date=2025-07-21T13:07:59, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., config_id=tripleo_step1, vcs-type=git, managed_by=tripleo_ansible, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, distribution-scope=public, release=1, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=metrics_qdr, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, version=17.1.9) Oct 14 04:56:18 localhost systemd[1]: 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6.service: Deactivated successfully. Oct 14 04:56:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8. Oct 14 04:56:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea. 
Oct 14 04:56:33 localhost podman[97677]: 2025-10-14 08:56:33.753661974 +0000 UTC m=+0.086584843 container health_status 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, batch=17.1_20250721.1, vendor=Red Hat, Inc., io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, tcib_managed=true, name=rhosp17/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, version=17.1.9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, container_name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, 
architecture=x86_64, distribution-scope=public, config_id=tripleo_step3, build-date=2025-07-21T13:27:15, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/agreements, release=1, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2) Oct 14 04:56:33 localhost podman[97677]: 2025-10-14 08:56:33.764065005 +0000 UTC m=+0.096987824 container exec_died 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, vcs-type=git, vendor=Red Hat, Inc., version=17.1.9, release=1, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T13:27:15, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.expose-services=, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, distribution-scope=public, config_id=tripleo_step3, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, name=rhosp17/openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=iscsid) Oct 14 04:56:33 localhost systemd[1]: 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea.service: Deactivated successfully. Oct 14 04:56:33 localhost podman[97676]: 2025-10-14 08:56:33.726409923 +0000 UTC m=+0.066217734 container health_status 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, managed_by=tripleo_ansible, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, io.buildah.version=1.33.12, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 collectd, release=2, config_id=tripleo_step3, io.openshift.expose-services=, batch=17.1_20250721.1, container_name=collectd, version=17.1.9, build-date=2025-07-21T13:04:03, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team, architecture=x86_64, com.redhat.component=openstack-collectd-container, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd) Oct 14 04:56:33 localhost podman[97676]: 2025-10-14 08:56:33.809096605 +0000 UTC m=+0.148904356 container exec_died 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.openshift.expose-services=, managed_by=tripleo_ansible, version=17.1.9, io.openshift.tags=rhosp osp openstack osp-17.1, io.buildah.version=1.33.12, vcs-type=git, container_name=collectd, vendor=Red Hat, Inc., architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-07-21T13:04:03, 
release=2, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, batch=17.1_20250721.1, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-collectd, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2) Oct 14 04:56:33 localhost systemd[1]: 
0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8.service: Deactivated successfully. Oct 14 04:56:36 localhost systemd[1]: tmp-crun.lwyZSp.mount: Deactivated successfully. Oct 14 04:56:36 localhost podman[97819]: 2025-10-14 08:56:36.329974822 +0000 UTC m=+0.072137487 container exec b19a91314e045cbe5465f6abdeb6b420108fcda7860b498b01dcd4e65236d2c8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-crash-np0005486733, name=rhceph, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-09-24T08:57:55, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, architecture=x86_64, version=7, maintainer=Guillaume Abrioux , io.buildah.version=1.33.12, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=553, GIT_CLEAN=True, ceph=True, io.openshift.expose-services=, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, distribution-scope=public) Oct 14 04:56:36 localhost podman[97819]: 2025-10-14 08:56:36.439181472 +0000 UTC m=+0.181344207 container exec_died b19a91314e045cbe5465f6abdeb6b420108fcda7860b498b01dcd4e65236d2c8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-crash-np0005486733, ceph=True, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, release=553, RELEASE=main, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, build-date=2025-09-24T08:57:55, architecture=x86_64, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux , io.openshift.expose-services=, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, io.openshift.tags=rhceph ceph, name=rhceph, description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, version=7, GIT_CLEAN=True, com.redhat.component=rhceph-container, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.) Oct 14 04:56:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638. Oct 14 04:56:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884. Oct 14 04:56:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f. Oct 14 04:56:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e. Oct 14 04:56:40 localhost systemd[1]: tmp-crun.Z5V9HS.mount: Deactivated successfully. 
Oct 14 04:56:40 localhost podman[97964]: 2025-10-14 08:56:40.750516997 +0000 UTC m=+0.088929696 container health_status 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, managed_by=tripleo_ansible, vcs-type=git, tcib_managed=true, build-date=2025-07-21T15:29:47, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-ipmi, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, config_id=tripleo_step4, distribution-scope=public, com.redhat.component=openstack-ceilometer-ipmi-container, container_name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.9, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, io.openshift.expose-services=, release=1, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., batch=17.1_20250721.1) Oct 14 04:56:40 localhost podman[97964]: 2025-10-14 08:56:40.783027249 +0000 UTC m=+0.121439938 container exec_died 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, distribution-scope=public, vcs-type=git, com.redhat.component=openstack-ceilometer-ipmi-container, tcib_managed=true, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, 
io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20250721.1, container_name=ceilometer_agent_ipmi, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-ipmi, vendor=Red Hat, Inc., build-date=2025-07-21T15:29:47, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, version=17.1.9, io.openshift.tags=rhosp osp openstack osp-17.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f) Oct 14 04:56:40 localhost systemd[1]: 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638.service: Deactivated successfully. Oct 14 04:56:40 localhost podman[97967]: 2025-10-14 08:56:40.795179225 +0000 UTC m=+0.124592656 container health_status def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, tcib_managed=true, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, batch=17.1_20250721.1, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-compute-container, release=1, name=rhosp17/openstack-ceilometer-compute, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-07-21T14:45:33, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 
'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.9, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, container_name=ceilometer_agent_compute, maintainer=OpenStack TripleO Team, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute) Oct 14 04:56:40 localhost podman[97965]: 2025-10-14 08:56:40.856047862 +0000 UTC m=+0.193959746 container health_status 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, container_name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, build-date=2025-07-21T14:48:37, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 
'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, vendor=Red Hat, Inc., batch=17.1_20250721.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.33.12, config_id=tripleo_step4, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, name=rhosp17/openstack-nova-compute, version=17.1.9, maintainer=OpenStack TripleO Team, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute) Oct 14 04:56:40 localhost podman[97967]: 2025-10-14 08:56:40.858059165 +0000 UTC m=+0.187472626 container exec_died def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.description=Red Hat 
OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, architecture=x86_64, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20250721.1, io.openshift.tags=rhosp osp openstack osp-17.1, version=17.1.9, distribution-scope=public, io.buildah.version=1.33.12, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, name=rhosp17/openstack-ceilometer-compute, build-date=2025-07-21T14:45:33, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, 
managed_by=tripleo_ansible, release=1, container_name=ceilometer_agent_compute) Oct 14 04:56:40 localhost systemd[1]: def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e.service: Deactivated successfully. Oct 14 04:56:40 localhost podman[97966]: 2025-10-14 08:56:40.911737901 +0000 UTC m=+0.243335310 container health_status d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, vcs-type=git, name=rhosp17/openstack-cron, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, io.openshift.expose-services=, managed_by=tripleo_ansible, container_name=logrotate_crond, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 cron, build-date=2025-07-21T13:07:52, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vendor=Red Hat, Inc., com.redhat.component=openstack-cron-container, batch=17.1_20250721.1, distribution-scope=public, release=1, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, config_id=tripleo_step4) Oct 14 04:56:40 localhost podman[97966]: 2025-10-14 08:56:40.951761997 +0000 UTC m=+0.283359426 container exec_died d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, managed_by=tripleo_ansible, com.redhat.component=openstack-cron-container, io.openshift.tags=rhosp osp openstack osp-17.1, distribution-scope=public, release=1, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, vendor=Red Hat, Inc., container_name=logrotate_crond, io.buildah.version=1.33.12, name=rhosp17/openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, version=17.1.9, maintainer=OpenStack TripleO Team, tcib_managed=true, architecture=x86_64, build-date=2025-07-21T13:07:52, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, config_id=tripleo_step4, batch=17.1_20250721.1) Oct 14 04:56:40 localhost systemd[1]: d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f.service: Deactivated successfully. Oct 14 04:56:41 localhost podman[97965]: 2025-10-14 08:56:41.22023132 +0000 UTC m=+0.558143204 container exec_died 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, io.buildah.version=1.33.12, tcib_managed=true, batch=17.1_20250721.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, name=rhosp17/openstack-nova-compute, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, distribution-scope=public, vcs-type=git, build-date=2025-07-21T14:48:37, release=1, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, container_name=nova_migration_target) Oct 14 04:56:41 localhost systemd[1]: 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884.service: Deactivated successfully. Oct 14 04:56:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e. Oct 14 04:56:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5. Oct 14 04:56:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2. Oct 14 04:56:43 localhost systemd[1]: tmp-crun.nIgOWs.mount: Deactivated successfully. 
Oct 14 04:56:43 localhost podman[98060]: 2025-10-14 08:56:43.763186938 +0000 UTC m=+0.094555289 container health_status ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, version=17.1.9, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a-4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', 
'/var/lib/nova:/var/lib/nova:shared']}, name=rhosp17/openstack-nova-compute, architecture=x86_64, config_id=tripleo_step5, container_name=nova_compute, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, batch=17.1_20250721.1, io.buildah.version=1.33.12, managed_by=tripleo_ansible, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-07-21T14:48:37, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute) Oct 14 04:56:43 localhost systemd[1]: tmp-crun.av0jIo.mount: Deactivated successfully. 
Oct 14 04:56:43 localhost podman[98060]: 2025-10-14 08:56:43.797021132 +0000 UTC m=+0.128389483 container exec_died ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.33.12, tcib_managed=true, managed_by=tripleo_ansible, build-date=2025-07-21T14:48:37, summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a-4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1, batch=17.1_20250721.1, io.openshift.expose-services=, vcs-type=git, distribution-scope=public, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, config_id=tripleo_step5, container_name=nova_compute, name=rhosp17/openstack-nova-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute) Oct 14 04:56:43 localhost systemd[1]: ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2.service: Deactivated successfully. 
Oct 14 04:56:43 localhost podman[98058]: 2025-10-14 08:56:43.847557271 +0000 UTC m=+0.185838845 container health_status 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, batch=17.1_20250721.1, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, name=rhosp17/openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, description=Red Hat OpenStack Platform 
17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc., vcs-type=git, container_name=ovn_metadata_agent, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, managed_by=tripleo_ansible, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step4, version=17.1.9, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.33.12, build-date=2025-07-21T16:28:53, io.openshift.expose-services=) Oct 14 04:56:43 localhost podman[98058]: 2025-10-14 08:56:43.861568393 +0000 UTC m=+0.199849997 container exec_died 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, name=rhosp17/openstack-neutron-metadata-agent-ovn, build-date=2025-07-21T16:28:53, release=1, io.buildah.version=1.33.12, managed_by=tripleo_ansible, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, vcs-type=git, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, batch=17.1_20250721.1, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, container_name=ovn_metadata_agent, tcib_managed=true, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, architecture=x86_64, version=17.1.9) Oct 14 04:56:43 localhost podman[98058]: unhealthy Oct 14 04:56:43 localhost systemd[1]: 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e.service: Main process exited, code=exited, status=1/FAILURE Oct 14 04:56:43 localhost systemd[1]: 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e.service: Failed with result 'exit-code'. 
Oct 14 04:56:43 localhost podman[98059]: 2025-10-14 08:56:43.801513051 +0000 UTC m=+0.139326881 container health_status a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, name=rhosp17/openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, batch=17.1_20250721.1, com.redhat.component=openstack-ovn-controller-container, container_name=ovn_controller, vcs-type=git, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, io.buildah.version=1.33.12, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.9, managed_by=tripleo_ansible, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-07-21T13:28:44, distribution-scope=public, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}) Oct 14 04:56:43 localhost podman[98059]: 2025-10-14 08:56:43.931181912 +0000 
UTC m=+0.268995712 container exec_died a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, release=1, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, distribution-scope=public, version=17.1.9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, io.openshift.expose-services=, managed_by=tripleo_ansible, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1, batch=17.1_20250721.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, build-date=2025-07-21T13:28:44, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc.) 
Oct 14 04:56:43 localhost podman[98059]: unhealthy Oct 14 04:56:43 localhost systemd[1]: a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5.service: Main process exited, code=exited, status=1/FAILURE Oct 14 04:56:43 localhost systemd[1]: a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5.service: Failed with result 'exit-code'. Oct 14 04:56:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6. Oct 14 04:56:49 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Oct 14 04:56:49 localhost recover_tripleo_nova_virtqemud[98131]: 62551 Oct 14 04:56:49 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Oct 14 04:56:49 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. Oct 14 04:56:49 localhost podman[98124]: 2025-10-14 08:56:49.754803592 +0000 UTC m=+0.090817674 container health_status 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, build-date=2025-07-21T13:07:59, com.redhat.component=openstack-qdrouterd-container, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20250721.1, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 
'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, tcib_managed=true, distribution-scope=public, release=1, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step1, io.buildah.version=1.33.12, vcs-type=git, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, container_name=metrics_qdr) Oct 14 04:56:49 localhost podman[98124]: 2025-10-14 08:56:49.922027583 +0000 UTC m=+0.258041655 container exec_died 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, container_name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1, build-date=2025-07-21T13:07:59, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, architecture=x86_64, version=17.1.9, 
com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, vcs-type=git, com.redhat.component=openstack-qdrouterd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.33.12, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, tcib_managed=true, distribution-scope=public) Oct 14 04:56:49 localhost systemd[1]: 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6.service: Deactivated successfully. Oct 14 04:57:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8. Oct 14 04:57:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea. 
Oct 14 04:57:04 localhost podman[98155]: 2025-10-14 08:57:04.742287802 +0000 UTC m=+0.081577368 container health_status 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, vendor=Red Hat, Inc., io.buildah.version=1.33.12, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, build-date=2025-07-21T13:04:03, managed_by=tripleo_ansible, tcib_managed=true, io.k8s.description=Red Hat 
OpenStack Platform 17.1 collectd, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, release=2, batch=17.1_20250721.1, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-collectd, com.redhat.component=openstack-collectd-container, container_name=collectd, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, version=17.1.9) Oct 14 04:57:04 localhost podman[98155]: 2025-10-14 08:57:04.788303031 +0000 UTC m=+0.127592597 container exec_died 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, com.redhat.license_terms=https://www.redhat.com/agreements, batch=17.1_20250721.1, io.openshift.tags=rhosp osp openstack osp-17.1, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, com.redhat.component=openstack-collectd-container, container_name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, vcs-type=git, release=2, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, config_id=tripleo_step3, io.buildah.version=1.33.12, architecture=x86_64, maintainer=OpenStack TripleO Team, build-date=2025-07-21T13:04:03, tcib_managed=true, vendor=Red Hat, Inc., version=17.1.9, io.openshift.expose-services=) Oct 14 04:57:04 localhost systemd[1]: 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8.service: Deactivated successfully. 
Oct 14 04:57:04 localhost podman[98156]: 2025-10-14 08:57:04.789582731 +0000 UTC m=+0.128419764 container health_status 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, batch=17.1_20250721.1, config_id=tripleo_step3, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, name=rhosp17/openstack-iscsid, vcs-type=git, tcib_managed=true, build-date=2025-07-21T13:27:15, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, 
com.redhat.license_terms=https://www.redhat.com/agreements, version=17.1.9, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.33.12, com.redhat.component=openstack-iscsid-container, io.openshift.expose-services=, container_name=iscsid, release=1, managed_by=tripleo_ansible) Oct 14 04:57:04 localhost podman[98156]: 2025-10-14 08:57:04.875250694 +0000 UTC m=+0.214087747 container exec_died 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, build-date=2025-07-21T13:27:15, container_name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.9, name=rhosp17/openstack-iscsid, release=1, tcib_managed=true, io.openshift.expose-services=, com.redhat.component=openstack-iscsid-container, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, vcs-type=git, batch=17.1_20250721.1, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible) Oct 14 04:57:04 localhost systemd[1]: 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea.service: Deactivated successfully. Oct 14 04:57:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638. Oct 14 04:57:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884. Oct 14 04:57:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f. Oct 14 04:57:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e. Oct 14 04:57:11 localhost systemd[1]: tmp-crun.0qeWVm.mount: Deactivated successfully. 
Oct 14 04:57:11 localhost podman[98194]: 2025-10-14 08:57:11.75584151 +0000 UTC m=+0.092354309 container health_status 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, name=rhosp17/openstack-ceilometer-ipmi, build-date=2025-07-21T15:29:47, release=1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., container_name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, architecture=x86_64, 
vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, batch=17.1_20250721.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1, version=17.1.9, config_id=tripleo_step4, managed_by=tripleo_ansible, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, distribution-scope=public) Oct 14 04:57:11 localhost systemd[1]: tmp-crun.fISw5V.mount: Deactivated successfully. Oct 14 04:57:11 localhost podman[98196]: 2025-10-14 08:57:11.804461241 +0000 UTC m=+0.136076880 container health_status d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, version=17.1.9, release=1, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, vcs-type=git, batch=17.1_20250721.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, architecture=x86_64, com.redhat.component=openstack-cron-container, build-date=2025-07-21T13:07:52, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, container_name=logrotate_crond, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, io.openshift.expose-services=, distribution-scope=public, config_id=tripleo_step4) Oct 14 04:57:11 localhost podman[98196]: 2025-10-14 08:57:11.818165264 +0000 UTC m=+0.149780883 container exec_died d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, version=17.1.9, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-cron-container, vcs-type=git, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, managed_by=tripleo_ansible, name=rhosp17/openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, batch=17.1_20250721.1, tcib_managed=true, release=1, container_name=logrotate_crond, build-date=2025-07-21T13:07:52, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, config_id=tripleo_step4) Oct 14 04:57:11 localhost systemd[1]: d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f.service: Deactivated successfully. 
Oct 14 04:57:11 localhost podman[98194]: 2025-10-14 08:57:11.857926481 +0000 UTC m=+0.194439290 container exec_died 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, vcs-type=git, architecture=x86_64, name=rhosp17/openstack-ceilometer-ipmi, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.33.12, vendor=Red Hat, Inc., release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, io.openshift.tags=rhosp osp openstack osp-17.1, container_name=ceilometer_agent_ipmi, description=Red Hat OpenStack 
Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, tcib_managed=true, version=17.1.9, batch=17.1_20250721.1, com.redhat.component=openstack-ceilometer-ipmi-container, build-date=2025-07-21T15:29:47, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements) Oct 14 04:57:11 localhost systemd[1]: 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638.service: Deactivated successfully. Oct 14 04:57:11 localhost podman[98195]: 2025-10-14 08:57:11.871935873 +0000 UTC m=+0.205927375 container health_status 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, tcib_managed=true, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, vendor=Red Hat, Inc., vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, com.redhat.component=openstack-nova-compute-container, version=17.1.9, name=rhosp17/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/agreements, batch=17.1_20250721.1, container_name=nova_migration_target, build-date=2025-07-21T14:48:37, release=1, config_id=tripleo_step4, distribution-scope=public, io.buildah.version=1.33.12, architecture=x86_64) Oct 14 04:57:11 localhost podman[98197]: 2025-10-14 08:57:11.918866512 +0000 UTC m=+0.248934593 container health_status def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, com.redhat.component=openstack-ceilometer-compute-container, name=rhosp17/openstack-ceilometer-compute, build-date=2025-07-21T14:45:33, tcib_managed=true, version=17.1.9, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_compute, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, architecture=x86_64, config_id=tripleo_step4, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, io.openshift.tags=rhosp osp openstack osp-17.1, batch=17.1_20250721.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, vendor=Red Hat, Inc., release=1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute) Oct 14 04:57:11 localhost podman[98197]: 2025-10-14 08:57:11.956049289 +0000 UTC m=+0.286117400 container exec_died def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., release=1, build-date=2025-07-21T14:45:33, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, distribution-scope=public, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.33.12, batch=17.1_20250721.1, io.openshift.expose-services=, architecture=x86_64, name=rhosp17/openstack-ceilometer-compute, container_name=ceilometer_agent_compute, com.redhat.component=openstack-ceilometer-compute-container) Oct 14 04:57:11 localhost systemd[1]: def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e.service: Deactivated successfully. 
Oct 14 04:57:12 localhost podman[98195]: 2025-10-14 08:57:12.257827841 +0000 UTC m=+0.591819383 container exec_died 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, batch=17.1_20250721.1, architecture=x86_64, build-date=2025-07-21T14:48:37, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, tcib_managed=true, managed_by=tripleo_ansible, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc., vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, distribution-scope=public, vcs-type=git) Oct 14 04:57:12 localhost systemd[1]: 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884.service: Deactivated successfully. Oct 14 04:57:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e. Oct 14 04:57:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5. Oct 14 04:57:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2. Oct 14 04:57:14 localhost systemd[1]: tmp-crun.R8YWLQ.mount: Deactivated successfully. Oct 14 04:57:14 localhost systemd[1]: tmp-crun.BA9RpJ.mount: Deactivated successfully. 
Oct 14 04:57:14 localhost podman[98291]: 2025-10-14 08:57:14.733196753 +0000 UTC m=+0.075282294 container health_status a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, build-date=2025-07-21T13:28:44, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, io.openshift.expose-services=, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, architecture=x86_64, com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/agreements, release=1, config_id=tripleo_step4, batch=17.1_20250721.1, container_name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.buildah.version=1.33.12, description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, vendor=Red Hat, Inc., version=17.1.9, distribution-scope=public) Oct 14 04:57:14 localhost podman[98292]: 2025-10-14 08:57:14.76616011 +0000 
UTC m=+0.102776593 container health_status ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, build-date=2025-07-21T14:48:37, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, managed_by=tripleo_ansible, vcs-type=git, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a-4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, release=1, version=17.1.9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, config_id=tripleo_step5, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, container_name=nova_compute, name=rhosp17/openstack-nova-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1) Oct 14 04:57:14 localhost podman[98291]: 2025-10-14 08:57:14.823180329 +0000 UTC m=+0.165265800 container exec_died a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, maintainer=OpenStack TripleO Team, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, batch=17.1_20250721.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, managed_by=tripleo_ansible, vendor=Red Hat, Inc., container_name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/agreements, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, com.redhat.component=openstack-ovn-controller-container, config_data={'depends_on': 
['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, build-date=2025-07-21T13:28:44, name=rhosp17/openstack-ovn-controller, version=17.1.9, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, architecture=x86_64) Oct 14 04:57:14 localhost podman[98291]: unhealthy Oct 14 04:57:14 localhost systemd[1]: a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5.service: Main process exited, code=exited, status=1/FAILURE Oct 14 04:57:14 localhost systemd[1]: a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5.service: Failed with result 'exit-code'. 
Oct 14 04:57:14 localhost podman[98290]: 2025-10-14 08:57:14.791496802 +0000 UTC m=+0.134871793 container health_status 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, build-date=2025-07-21T16:28:53, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, tcib_managed=true, version=17.1.9, config_id=tripleo_step4, managed_by=tripleo_ansible, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, batch=17.1_20250721.1, release=1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}) Oct 14 04:57:14 localhost podman[98290]: 2025-10-14 08:57:14.872738689 +0000 UTC m=+0.216113720 container exec_died 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, version=17.1.9, build-date=2025-07-21T16:28:53, vendor=Red Hat, Inc., config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20250721.1, release=1, name=rhosp17/openstack-neutron-metadata-agent-ovn, distribution-scope=public, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, container_name=ovn_metadata_agent, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1, 
managed_by=tripleo_ansible, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, com.redhat.license_terms=https://www.redhat.com/agreements) Oct 14 04:57:14 localhost podman[98290]: unhealthy Oct 14 04:57:14 localhost systemd[1]: 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e.service: Main process exited, code=exited, status=1/FAILURE Oct 14 04:57:14 localhost systemd[1]: 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e.service: Failed with result 'exit-code'. 
Oct 14 04:57:14 localhost podman[98292]: 2025-10-14 08:57:14.896620106 +0000 UTC m=+0.233236629 container exec_died ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a-4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, 
description=Red Hat OpenStack Platform 17.1 nova-compute, release=1, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, tcib_managed=true, version=17.1.9, io.buildah.version=1.33.12, batch=17.1_20250721.1, build-date=2025-07-21T14:48:37, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, distribution-scope=public, managed_by=tripleo_ansible, vcs-type=git) Oct 14 04:57:14 localhost systemd[1]: ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2.service: Deactivated successfully. Oct 14 04:57:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6. 
Oct 14 04:57:20 localhost podman[98356]: 2025-10-14 08:57:20.733795662 +0000 UTC m=+0.078251876 container health_status 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, release=1, architecture=x86_64, version=17.1.9, build-date=2025-07-21T13:07:59, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, batch=17.1_20250721.1, container_name=metrics_qdr, 
vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, vendor=Red Hat, Inc., com.redhat.component=openstack-qdrouterd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, vcs-type=git, managed_by=tripleo_ansible, name=rhosp17/openstack-qdrouterd, distribution-scope=public, maintainer=OpenStack TripleO Team) Oct 14 04:57:20 localhost podman[98356]: 2025-10-14 08:57:20.965236154 +0000 UTC m=+0.309692358 container exec_died 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.buildah.version=1.33.12, tcib_managed=true, maintainer=OpenStack TripleO Team, version=17.1.9, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, container_name=metrics_qdr, vcs-type=git, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, release=1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, managed_by=tripleo_ansible, build-date=2025-07-21T13:07:59, architecture=x86_64, batch=17.1_20250721.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, name=rhosp17/openstack-qdrouterd) Oct 14 04:57:20 localhost systemd[1]: 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6.service: Deactivated successfully. Oct 14 04:57:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8. Oct 14 04:57:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea. Oct 14 04:57:35 localhost systemd[1]: tmp-crun.rUWeO5.mount: Deactivated successfully. 
Oct 14 04:57:35 localhost podman[98385]: 2025-10-14 08:57:35.752409811 +0000 UTC m=+0.091527274 container health_status 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, batch=17.1_20250721.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, distribution-scope=public, io.buildah.version=1.33.12, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, 
container_name=collectd, io.openshift.expose-services=, vcs-type=git, build-date=2025-07-21T13:04:03, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, release=2, version=17.1.9, name=rhosp17/openstack-collectd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 collectd) Oct 14 04:57:35 localhost podman[98386]: 2025-10-14 08:57:35.79610345 +0000 UTC m=+0.131993774 container health_status 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, container_name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-07-21T13:27:15, com.redhat.component=openstack-iscsid-container, description=Red Hat OpenStack Platform 17.1 iscsid, release=1, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-iscsid, config_id=tripleo_step3, io.buildah.version=1.33.12, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., batch=17.1_20250721.1, distribution-scope=public, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, vcs-type=git, tcib_managed=true, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, version=17.1.9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible) Oct 14 04:57:35 localhost podman[98385]: 2025-10-14 08:57:35.820841553 +0000 UTC m=+0.159959486 container exec_died 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, batch=17.1_20250721.1, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, name=rhosp17/openstack-collectd, architecture=x86_64, com.redhat.component=openstack-collectd-container, container_name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, build-date=2025-07-21T13:04:03, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 
'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, vendor=Red Hat, Inc., version=17.1.9, managed_by=tripleo_ansible, release=2) Oct 14 04:57:35 localhost podman[98386]: 2025-10-14 08:57:35.833304658 +0000 UTC m=+0.169195022 container exec_died 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vcs-type=git, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 
'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-iscsid, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, release=1, version=17.1.9, com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, vendor=Red Hat, Inc., io.buildah.version=1.33.12, batch=17.1_20250721.1, managed_by=tripleo_ansible, container_name=iscsid, maintainer=OpenStack TripleO Team, build-date=2025-07-21T13:27:15) Oct 14 04:57:35 localhost systemd[1]: 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8.service: Deactivated successfully. Oct 14 04:57:35 localhost systemd[1]: 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea.service: Deactivated successfully. 
Oct 14 04:57:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638. Oct 14 04:57:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884. Oct 14 04:57:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f. Oct 14 04:57:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e. Oct 14 04:57:42 localhost podman[98503]: 2025-10-14 08:57:42.769945763 +0000 UTC m=+0.102327378 container health_status 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, com.redhat.component=openstack-ceilometer-ipmi-container, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, build-date=2025-07-21T15:29:47, vendor=Red Hat, Inc., io.buildah.version=1.33.12, version=17.1.9, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, batch=17.1_20250721.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, io.openshift.expose-services=, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1, container_name=ceilometer_agent_ipmi, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-ipmi) Oct 14 04:57:42 localhost podman[98505]: 2025-10-14 08:57:42.806983666 +0000 UTC m=+0.136349988 container health_status d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, managed_by=tripleo_ansible, com.redhat.component=openstack-cron-container, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., version=17.1.9, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, 
vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, container_name=logrotate_crond, io.buildah.version=1.33.12, distribution-scope=public, build-date=2025-07-21T13:07:52, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, release=1, io.openshift.expose-services=) Oct 14 04:57:42 localhost podman[98504]: 2025-10-14 08:57:42.866614386 +0000 UTC m=+0.199705963 container health_status 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, config_id=tripleo_step4, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, version=17.1.9, tcib_managed=true, container_name=nova_migration_target, managed_by=tripleo_ansible, release=1, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., batch=17.1_20250721.1, com.redhat.component=openstack-nova-compute-container, 
distribution-scope=public, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, build-date=2025-07-21T14:48:37, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, name=rhosp17/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d) Oct 14 04:57:42 localhost podman[98503]: 2025-10-14 08:57:42.913098131 +0000 UTC m=+0.245479706 container exec_died 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.openshift.tags=rhosp osp openstack osp-17.1, architecture=x86_64, 
summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/agreements, batch=17.1_20250721.1, vendor=Red Hat, Inc., vcs-type=git, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, com.redhat.component=openstack-ceilometer-ipmi-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-ipmi, release=1, version=17.1.9, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-07-21T15:29:47, distribution-scope=public, io.buildah.version=1.33.12, io.openshift.expose-services=) Oct 14 04:57:42 localhost systemd[1]: 
07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638.service: Deactivated successfully. Oct 14 04:57:42 localhost podman[98506]: 2025-10-14 08:57:42.931166158 +0000 UTC m=+0.256657111 container health_status def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1, vcs-type=git, batch=17.1_20250721.1, build-date=2025-07-21T14:45:33, container_name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, name=rhosp17/openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, version=17.1.9, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team) Oct 14 04:57:42 localhost podman[98505]: 2025-10-14 08:57:42.944314624 +0000 UTC m=+0.273681046 container exec_died d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, build-date=2025-07-21T13:07:52, com.redhat.component=openstack-cron-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, version=17.1.9, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc., batch=17.1_20250721.1, vcs-type=git, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, container_name=logrotate_crond, io.buildah.version=1.33.12, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-cron, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, config_id=tripleo_step4, architecture=x86_64, tcib_managed=true, release=1, summary=Red Hat OpenStack Platform 17.1 cron) Oct 14 04:57:42 localhost systemd[1]: d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f.service: Deactivated successfully. Oct 14 04:57:42 localhost podman[98506]: 2025-10-14 08:57:42.996019219 +0000 UTC m=+0.321510122 container exec_died def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, com.redhat.component=openstack-ceilometer-compute-container, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, version=17.1.9, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, release=1, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true, architecture=x86_64, io.buildah.version=1.33.12, managed_by=tripleo_ansible, io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-compute, build-date=2025-07-21T14:45:33, config_id=tripleo_step4, container_name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute) Oct 14 04:57:43 localhost systemd[1]: def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e.service: Deactivated successfully. 
Oct 14 04:57:43 localhost podman[98504]: 2025-10-14 08:57:43.237614364 +0000 UTC m=+0.570705941 container exec_died 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, batch=17.1_20250721.1, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, build-date=2025-07-21T14:48:37, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, config_id=tripleo_step4, io.buildah.version=1.33.12, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, version=17.1.9, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, release=1, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, name=rhosp17/openstack-nova-compute, vcs-type=git, container_name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute) Oct 14 04:57:43 localhost systemd[1]: 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884.service: Deactivated successfully. Oct 14 04:57:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e. Oct 14 04:57:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5. Oct 14 04:57:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2. Oct 14 04:57:45 localhost podman[98598]: 2025-10-14 08:57:45.74778058 +0000 UTC m=+0.085981623 container health_status 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, vendor=Red Hat, Inc., tcib_managed=true, distribution-scope=public, build-date=2025-07-21T16:28:53, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20250721.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, 
vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, release=1, io.buildah.version=1.33.12, name=rhosp17/openstack-neutron-metadata-agent-ovn, architecture=x86_64, container_name=ovn_metadata_agent, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}) Oct 14 04:57:45 localhost podman[98598]: 2025-10-14 08:57:45.788290861 +0000 UTC m=+0.126491854 container exec_died 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, 
tcib_managed=true, distribution-scope=public, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, maintainer=OpenStack TripleO Team, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, io.buildah.version=1.33.12, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, vcs-type=git, vendor=Red Hat, Inc., summary=Red Hat 
OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, name=rhosp17/openstack-neutron-metadata-agent-ovn, build-date=2025-07-21T16:28:53, managed_by=tripleo_ansible, io.openshift.expose-services=, release=1, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20250721.1, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, version=17.1.9) Oct 14 04:57:45 localhost podman[98598]: unhealthy Oct 14 04:57:45 localhost systemd[1]: tmp-crun.Oh2nqJ.mount: Deactivated successfully. Oct 14 04:57:45 localhost systemd[1]: 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e.service: Main process exited, code=exited, status=1/FAILURE Oct 14 04:57:45 localhost systemd[1]: 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e.service: Failed with result 'exit-code'. Oct 14 04:57:45 localhost podman[98600]: 2025-10-14 08:57:45.814439287 +0000 UTC m=+0.146173391 container health_status ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, version=17.1.9, vcs-type=git, name=rhosp17/openstack-nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a-4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': 
['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, release=1, distribution-scope=public, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-07-21T14:48:37, container_name=nova_compute, managed_by=tripleo_ansible, config_id=tripleo_step5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc.) 
Oct 14 04:57:45 localhost podman[98600]: 2025-10-14 08:57:45.846127136 +0000 UTC m=+0.177861220 container exec_died ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, container_name=nova_compute, com.redhat.license_terms=https://www.redhat.com/agreements, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, architecture=x86_64, io.buildah.version=1.33.12, build-date=2025-07-21T14:48:37, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a-4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, vcs-type=git, batch=17.1_20250721.1, release=1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.9) Oct 14 04:57:45 localhost systemd[1]: ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2.service: Deactivated successfully. 
Oct 14 04:57:45 localhost podman[98599]: 2025-10-14 08:57:45.869488906 +0000 UTC m=+0.204007816 container health_status a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, io.buildah.version=1.33.12, managed_by=tripleo_ansible, tcib_managed=true, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, com.redhat.component=openstack-ovn-controller-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, architecture=x86_64, batch=17.1_20250721.1, container_name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, vendor=Red Hat, Inc., version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhosp17/openstack-ovn-controller, distribution-scope=public, build-date=2025-07-21T13:28:44, release=1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1) Oct 14 04:57:45 localhost podman[98599]: 2025-10-14 08:57:45.885231221 +0000 
UTC m=+0.219750151 container exec_died a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, container_name=ovn_controller, io.buildah.version=1.33.12, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, tcib_managed=true, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, build-date=2025-07-21T13:28:44, com.redhat.component=openstack-ovn-controller-container, name=rhosp17/openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, version=17.1.9, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, batch=17.1_20250721.1, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, release=1) Oct 14 04:57:45 localhost podman[98599]: unhealthy Oct 14 04:57:45 localhost systemd[1]: a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5.service: Main process 
exited, code=exited, status=1/FAILURE Oct 14 04:57:45 localhost systemd[1]: a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5.service: Failed with result 'exit-code'. Oct 14 04:57:46 localhost systemd[1]: tmp-crun.NU7Cpj.mount: Deactivated successfully. Oct 14 04:57:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6. Oct 14 04:57:51 localhost systemd[1]: tmp-crun.j8lSzn.mount: Deactivated successfully. Oct 14 04:57:51 localhost podman[98664]: 2025-10-14 08:57:51.774498119 +0000 UTC m=+0.083619451 container health_status 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, tcib_managed=true, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, name=rhosp17/openstack-qdrouterd, build-date=2025-07-21T13:07:59, io.buildah.version=1.33.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, batch=17.1_20250721.1, version=17.1.9, distribution-scope=public, io.openshift.expose-services=, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, release=1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, container_name=metrics_qdr) Oct 14 04:57:51 localhost podman[98664]: 2025-10-14 08:57:51.960924371 +0000 UTC m=+0.270045683 container exec_died 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, maintainer=OpenStack TripleO Team, container_name=metrics_qdr, io.buildah.version=1.33.12, version=17.1.9, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, config_id=tripleo_step1, release=1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., name=rhosp17/openstack-qdrouterd, tcib_managed=true, batch=17.1_20250721.1, com.redhat.component=openstack-qdrouterd-container, build-date=2025-07-21T13:07:59, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, managed_by=tripleo_ansible, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, 
com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd) Oct 14 04:57:51 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Oct 14 04:57:51 localhost systemd[1]: 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6.service: Deactivated successfully. Oct 14 04:57:52 localhost recover_tripleo_nova_virtqemud[98695]: 62551 Oct 14 04:57:52 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Oct 14 04:57:52 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. Oct 14 04:58:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8. 
Oct 14 04:58:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea. Oct 14 04:58:06 localhost systemd[1]: tmp-crun.3VmuRR.mount: Deactivated successfully. Oct 14 04:58:06 localhost podman[98697]: 2025-10-14 08:58:06.738085002 +0000 UTC m=+0.072777337 container health_status 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, container_name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, distribution-scope=public, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., version=17.1.9, com.redhat.component=openstack-iscsid-container, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, name=rhosp17/openstack-iscsid, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, io.buildah.version=1.33.12, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 iscsid, release=1, batch=17.1_20250721.1, build-date=2025-07-21T13:27:15, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}) Oct 14 04:58:06 localhost podman[98697]: 2025-10-14 08:58:06.772718881 +0000 UTC m=+0.107411246 container exec_died 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.openshift.expose-services=, managed_by=tripleo_ansible, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, batch=17.1_20250721.1, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, distribution-scope=public, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., build-date=2025-07-21T13:27:15, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, release=1, container_name=iscsid, version=17.1.9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, tcib_managed=true, com.redhat.component=openstack-iscsid-container) Oct 14 04:58:06 localhost systemd[1]: 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea.service: Deactivated successfully. 
Oct 14 04:58:06 localhost podman[98696]: 2025-10-14 08:58:06.814747538 +0000 UTC m=+0.151153635 container health_status 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20250721.1, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, vendor=Red Hat, Inc., version=17.1.9, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, release=2, container_name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, io.buildah.version=1.33.12, io.openshift.expose-services=, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-type=git, architecture=x86_64, managed_by=tripleo_ansible, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, name=rhosp17/openstack-collectd, build-date=2025-07-21T13:04:03, io.openshift.tags=rhosp osp openstack osp-17.1) Oct 14 04:58:06 localhost podman[98696]: 2025-10-14 08:58:06.824170859 +0000 UTC m=+0.160576906 container exec_died 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, version=17.1.9, maintainer=OpenStack TripleO Team, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, vendor=Red Hat, Inc., io.openshift.expose-services=, architecture=x86_64, batch=17.1_20250721.1, managed_by=tripleo_ansible, container_name=collectd, summary=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-07-21T13:04:03, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 
'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.buildah.version=1.33.12, release=2, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhosp17/openstack-collectd) Oct 14 04:58:06 localhost systemd[1]: 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8.service: Deactivated successfully. Oct 14 04:58:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638. Oct 14 04:58:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884. Oct 14 04:58:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f. Oct 14 04:58:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e. Oct 14 04:58:13 localhost systemd[1]: tmp-crun.cxR8FG.mount: Deactivated successfully. Oct 14 04:58:13 localhost systemd[1]: tmp-crun.EDEsTW.mount: Deactivated successfully. 
Oct 14 04:58:13 localhost podman[98735]: 2025-10-14 08:58:13.812939664 +0000 UTC m=+0.151809066 container health_status 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-07-21T14:48:37, container_name=nova_migration_target, io.buildah.version=1.33.12, release=1, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.9, batch=17.1_20250721.1, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc., vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, name=rhosp17/openstack-nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, architecture=x86_64) Oct 14 04:58:13 localhost podman[98734]: 2025-10-14 08:58:13.77360493 +0000 UTC m=+0.115984021 container health_status 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, build-date=2025-07-21T15:29:47, release=1, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, batch=17.1_20250721.1, version=17.1.9, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, architecture=x86_64, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., io.buildah.version=1.33.12, container_name=ceilometer_agent_ipmi, distribution-scope=public, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, com.redhat.component=openstack-ceilometer-ipmi-container, name=rhosp17/openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/agreements) Oct 14 04:58:13 localhost podman[98736]: 2025-10-14 08:58:13.864184005 +0000 UTC m=+0.200763797 container health_status d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, architecture=x86_64, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, release=1, com.redhat.component=openstack-cron-container, description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20250721.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, name=rhosp17/openstack-cron, version=17.1.9, config_id=tripleo_step4, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, build-date=2025-07-21T13:07:52, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, io.buildah.version=1.33.12, io.openshift.expose-services=) Oct 14 04:58:13 localhost podman[98736]: 2025-10-14 08:58:13.87405682 +0000 UTC m=+0.210636592 container exec_died d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.openshift.expose-services=, vcs-type=git, container_name=logrotate_crond, architecture=x86_64, tcib_managed=true, batch=17.1_20250721.1, io.buildah.version=1.33.12, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-cron, vendor=Red Hat, Inc., version=17.1.9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, release=1, build-date=2025-07-21T13:07:52, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, config_id=tripleo_step4, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 cron, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, com.redhat.component=openstack-cron-container, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, description=Red Hat OpenStack Platform 17.1 cron) Oct 14 04:58:13 localhost podman[98734]: 2025-10-14 08:58:13.908040978 +0000 UTC m=+0.250419989 container exec_died 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, io.buildah.version=1.33.12, release=1, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, version=17.1.9, name=rhosp17/openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20250721.1, build-date=2025-07-21T15:29:47, vendor=Red Hat, Inc., container_name=ceilometer_agent_ipmi, distribution-scope=public, io.openshift.expose-services=, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, com.redhat.license_terms=https://www.redhat.com/agreements) Oct 14 04:58:13 localhost systemd[1]: 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638.service: Deactivated successfully. 
Oct 14 04:58:13 localhost podman[98737]: 2025-10-14 08:58:13.923265588 +0000 UTC m=+0.256664092 container health_status def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, architecture=x86_64, container_name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, io.buildah.version=1.33.12, version=17.1.9, build-date=2025-07-21T14:45:33, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20250721.1, release=1, 
com.redhat.component=openstack-ceilometer-compute-container, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, vcs-type=git, vendor=Red Hat, Inc., tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute) Oct 14 04:58:13 localhost systemd[1]: d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f.service: Deactivated successfully. Oct 14 04:58:13 localhost podman[98737]: 2025-10-14 08:58:13.989195302 +0000 UTC m=+0.322593756 container exec_died def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, release=1, version=17.1.9, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, io.openshift.expose-services=, container_name=ceilometer_agent_compute, io.buildah.version=1.33.12, com.redhat.component=openstack-ceilometer-compute-container, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-07-21T14:45:33, maintainer=OpenStack TripleO Team, tcib_managed=true, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1, distribution-scope=public, architecture=x86_64, batch=17.1_20250721.1, name=rhosp17/openstack-ceilometer-compute) Oct 14 04:58:14 localhost systemd[1]: def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e.service: Deactivated successfully. 
Oct 14 04:58:14 localhost podman[98735]: 2025-10-14 08:58:14.209602113 +0000 UTC m=+0.548471515 container exec_died 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, architecture=x86_64, container_name=nova_migration_target, managed_by=tripleo_ansible, release=1, com.redhat.component=openstack-nova-compute-container, version=17.1.9, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, tcib_managed=true, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, 
config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-07-21T14:48:37, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1) Oct 14 04:58:14 localhost systemd[1]: 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884.service: Deactivated successfully. Oct 14 04:58:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e. Oct 14 04:58:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5. Oct 14 04:58:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2. Oct 14 04:58:16 localhost podman[98827]: 2025-10-14 08:58:16.732197974 +0000 UTC m=+0.067547816 container health_status ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, container_name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-compute, tcib_managed=true, version=17.1.9, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, managed_by=tripleo_ansible, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-07-21T14:48:37, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a-4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.33.12, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, maintainer=OpenStack TripleO Team, architecture=x86_64, batch=17.1_20250721.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container) Oct 14 04:58:16 localhost podman[98827]: 2025-10-14 08:58:16.756033618 +0000 UTC 
m=+0.091383450 container exec_died ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, com.redhat.component=openstack-nova-compute-container, architecture=x86_64, tcib_managed=true, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a-4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', 
'/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, container_name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., build-date=2025-07-21T14:48:37, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.9, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, maintainer=OpenStack TripleO Team, batch=17.1_20250721.1, config_id=tripleo_step5, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhosp17/openstack-nova-compute, release=1) Oct 14 04:58:16 localhost systemd[1]: ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2.service: Deactivated successfully. 
Oct 14 04:58:16 localhost podman[98826]: 2025-10-14 08:58:16.797619142 +0000 UTC m=+0.132914082 container health_status a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, com.redhat.license_terms=https://www.redhat.com/agreements, managed_by=tripleo_ansible, distribution-scope=public, version=17.1.9, architecture=x86_64, com.redhat.component=openstack-ovn-controller-container, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., batch=17.1_20250721.1, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, name=rhosp17/openstack-ovn-controller, build-date=2025-07-21T13:28:44, config_id=tripleo_step4, release=1, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, container_name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller) Oct 14 04:58:16 localhost systemd[1]: tmp-crun.I0lC5y.mount: Deactivated 
successfully. Oct 14 04:58:16 localhost podman[98825]: 2025-10-14 08:58:16.844486158 +0000 UTC m=+0.182783221 container health_status 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.33.12, architecture=x86_64, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, managed_by=tripleo_ansible, batch=17.1_20250721.1, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.license_terms=https://www.redhat.com/agreements, 
distribution-scope=public, maintainer=OpenStack TripleO Team, container_name=ovn_metadata_agent, version=17.1.9, vcs-type=git, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, config_id=tripleo_step4, build-date=2025-07-21T16:28:53, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, io.openshift.expose-services=, release=1, vendor=Red Hat, Inc.) Oct 14 04:58:16 localhost podman[98826]: 2025-10-14 08:58:16.849047968 +0000 UTC m=+0.184342838 container exec_died a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, config_id=tripleo_step4, io.openshift.expose-services=, vendor=Red Hat, Inc., container_name=ovn_controller, managed_by=tripleo_ansible, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, version=17.1.9, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': 
['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, name=rhosp17/openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, distribution-scope=public, build-date=2025-07-21T13:28:44, io.buildah.version=1.33.12, batch=17.1_20250721.1, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, summary=Red Hat OpenStack Platform 17.1 ovn-controller) Oct 14 04:58:16 localhost podman[98826]: unhealthy Oct 14 04:58:16 localhost systemd[1]: a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5.service: Main process exited, code=exited, status=1/FAILURE Oct 14 04:58:16 localhost systemd[1]: a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5.service: Failed with result 'exit-code'. 
Oct 14 04:58:16 localhost podman[98825]: 2025-10-14 08:58:16.921459783 +0000 UTC m=+0.259756896 container exec_died 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, build-date=2025-07-21T16:28:53, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-type=git, batch=17.1_20250721.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1, version=17.1.9, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.expose-services=, 
managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, container_name=ovn_metadata_agent, tcib_managed=true, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, io.buildah.version=1.33.12, architecture=x86_64, name=rhosp17/openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team) Oct 14 04:58:16 localhost podman[98825]: unhealthy Oct 14 04:58:16 localhost systemd[1]: 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e.service: Main process exited, code=exited, status=1/FAILURE Oct 14 04:58:16 localhost systemd[1]: 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e.service: Failed with result 'exit-code'. Oct 14 04:58:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6. Oct 14 04:58:22 localhost systemd[1]: tmp-crun.NASjvr.mount: Deactivated successfully. 
Oct 14 04:58:22 localhost podman[98890]: 2025-10-14 08:58:22.765831175 +0000 UTC m=+0.106729736 container health_status 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.33.12, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, managed_by=tripleo_ansible, name=rhosp17/openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/agreements, release=1, container_name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, build-date=2025-07-21T13:07:59, config_id=tripleo_step1, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, maintainer=OpenStack TripleO Team, version=17.1.9, description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, vendor=Red Hat, Inc., vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1, batch=17.1_20250721.1) Oct 14 04:58:22 localhost podman[98890]: 2025-10-14 08:58:22.963987628 +0000 UTC m=+0.304886219 container exec_died 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-07-21T13:07:59, vendor=Red Hat, Inc., name=rhosp17/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, architecture=x86_64, io.buildah.version=1.33.12, 
io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, release=1, version=17.1.9, container_name=metrics_qdr, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, config_id=tripleo_step1, vcs-type=git, com.redhat.component=openstack-qdrouterd-container, tcib_managed=true, batch=17.1_20250721.1, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd) Oct 14 04:58:22 localhost systemd[1]: 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6.service: Deactivated successfully. Oct 14 04:58:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8. Oct 14 04:58:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea. 
Oct 14 04:58:37 localhost podman[98920]: 2025-10-14 08:58:37.734840654 +0000 UTC m=+0.078556875 container health_status 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, container_name=collectd, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, io.openshift.expose-services=, build-date=2025-07-21T13:04:03, name=rhosp17/openstack-collectd, config_id=tripleo_step3, vendor=Red Hat, Inc., com.redhat.component=openstack-collectd-container, io.buildah.version=1.33.12, managed_by=tripleo_ansible, version=17.1.9, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1, release=2, 
description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, architecture=x86_64, vcs-type=git, batch=17.1_20250721.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd) Oct 14 04:58:37 localhost podman[98920]: 2025-10-14 08:58:37.74187814 +0000 UTC m=+0.085594341 container exec_died 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, build-date=2025-07-21T13:04:03, io.buildah.version=1.33.12, release=2, distribution-scope=public, managed_by=tripleo_ansible, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, com.redhat.component=openstack-collectd-container, vendor=Red Hat, Inc., tcib_managed=true, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, version=17.1.9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, container_name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-collectd, com.redhat.license_terms=https://www.redhat.com/agreements, batch=17.1_20250721.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git) Oct 14 04:58:37 localhost systemd[1]: 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8.service: Deactivated successfully. 
Oct 14 04:58:37 localhost podman[98921]: 2025-10-14 08:58:37.78626114 +0000 UTC m=+0.126955978 container health_status 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, config_id=tripleo_step3, container_name=iscsid, vendor=Red Hat, Inc., com.redhat.component=openstack-iscsid-container, io.buildah.version=1.33.12, architecture=x86_64, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-07-21T13:27:15, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, version=17.1.9, release=1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', 
'/var/lib/iscsi:/var/lib/iscsi:z']}, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, batch=17.1_20250721.1, name=rhosp17/openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=) Oct 14 04:58:37 localhost podman[98921]: 2025-10-14 08:58:37.817318128 +0000 UTC m=+0.158012996 container exec_died 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20250721.1, distribution-scope=public, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, 
com.redhat.license_terms=https://www.redhat.com/agreements, container_name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, tcib_managed=true, config_id=tripleo_step3, version=17.1.9, io.buildah.version=1.33.12, name=rhosp17/openstack-iscsid, release=1, summary=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, architecture=x86_64, build-date=2025-07-21T13:27:15, com.redhat.component=openstack-iscsid-container) Oct 14 04:58:37 localhost systemd[1]: 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea.service: Deactivated successfully. Oct 14 04:58:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638. Oct 14 04:58:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884. Oct 14 04:58:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f. Oct 14 04:58:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e. 
Oct 14 04:58:44 localhost podman[99037]: 2025-10-14 08:58:44.762070567 +0000 UTC m=+0.093059801 container health_status 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, vendor=Red Hat, Inc., version=17.1.9, build-date=2025-07-21T14:48:37, io.openshift.expose-services=, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, managed_by=tripleo_ansible, container_name=nova_migration_target, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-nova-compute, release=1, com.redhat.component=openstack-nova-compute-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', 
'/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, io.buildah.version=1.33.12, batch=17.1_20250721.1) Oct 14 04:58:44 localhost podman[99036]: 2025-10-14 08:58:44.810894968 +0000 UTC m=+0.143170163 container health_status 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, version=17.1.9, architecture=x86_64, batch=17.1_20250721.1, io.buildah.version=1.33.12, managed_by=tripleo_ansible, vendor=Red Hat, Inc., build-date=2025-07-21T15:29:47, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, name=rhosp17/openstack-ceilometer-ipmi, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_ipmi, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step4, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi) Oct 14 04:58:44 localhost systemd[1]: tmp-crun.SSSna7.mount: Deactivated successfully. Oct 14 04:58:44 localhost podman[99044]: 2025-10-14 08:58:44.838275036 +0000 UTC m=+0.156116193 container health_status def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, container_name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, architecture=x86_64, config_id=tripleo_step4, com.redhat.component=openstack-ceilometer-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git, batch=17.1_20250721.1, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, version=17.1.9, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, distribution-scope=public, name=rhosp17/openstack-ceilometer-compute, release=1, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-07-21T14:45:33) Oct 14 04:58:44 localhost podman[99036]: 2025-10-14 08:58:44.880101821 +0000 UTC m=+0.212377045 container exec_died 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, release=1, vcs-type=git, io.openshift.expose-services=, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, tcib_managed=true, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 
ceilometer-ipmi, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, architecture=x86_64, config_id=tripleo_step4, batch=17.1_20250721.1, managed_by=tripleo_ansible, container_name=ceilometer_agent_ipmi, com.redhat.license_terms=https://www.redhat.com/agreements, version=17.1.9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, name=rhosp17/openstack-ceilometer-ipmi, build-date=2025-07-21T15:29:47, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.33.12, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.tags=rhosp osp openstack osp-17.1) Oct 14 04:58:44 localhost systemd[1]: 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638.service: Deactivated successfully. 
Oct 14 04:58:44 localhost podman[99038]: 2025-10-14 08:58:44.931018077 +0000 UTC m=+0.256423808 container health_status d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, io.openshift.tags=rhosp osp openstack osp-17.1, container_name=logrotate_crond, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, version=17.1.9, distribution-scope=public, batch=17.1_20250721.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.buildah.version=1.33.12, build-date=2025-07-21T13:07:52, vcs-type=git, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 cron, 
com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, vendor=Red Hat, Inc., com.redhat.component=openstack-cron-container, architecture=x86_64, io.openshift.expose-services=, maintainer=OpenStack TripleO Team) Oct 14 04:58:44 localhost podman[99038]: 2025-10-14 08:58:44.943993048 +0000 UTC m=+0.269398759 container exec_died d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, managed_by=tripleo_ansible, build-date=2025-07-21T13:07:52, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, vendor=Red Hat, Inc., architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 cron, version=17.1.9, distribution-scope=public, name=rhosp17/openstack-cron, vcs-type=git, com.redhat.component=openstack-cron-container, batch=17.1_20250721.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, description=Red Hat OpenStack Platform 17.1 cron, release=1) Oct 14 04:58:44 localhost systemd[1]: d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f.service: Deactivated successfully. Oct 14 04:58:44 localhost podman[99044]: 2025-10-14 08:58:44.996876964 +0000 UTC m=+0.314718151 container exec_died def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.buildah.version=1.33.12, com.redhat.component=openstack-ceilometer-compute-container, build-date=2025-07-21T14:45:33, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, distribution-scope=public, container_name=ceilometer_agent_compute, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, vcs-type=git, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, release=1, vendor=Red Hat, Inc., managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, batch=17.1_20250721.1, version=17.1.9, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 
'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-compute) Oct 14 04:58:45 localhost systemd[1]: def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e.service: Deactivated successfully. 
Oct 14 04:58:45 localhost podman[99037]: 2025-10-14 08:58:45.116898429 +0000 UTC m=+0.447887683 container exec_died 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, batch=17.1_20250721.1, managed_by=tripleo_ansible, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_migration_target, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, version=17.1.9, build-date=2025-07-21T14:48:37, io.openshift.expose-services=, config_id=tripleo_step4, release=1) Oct 14 04:58:45 localhost systemd[1]: 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884.service: Deactivated successfully. Oct 14 04:58:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e. Oct 14 04:58:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5. Oct 14 04:58:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2. Oct 14 04:58:47 localhost podman[99133]: 2025-10-14 08:58:47.736098946 +0000 UTC m=+0.073710333 container health_status a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, 
release=1, description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-07-21T13:28:44, com.redhat.component=openstack-ovn-controller-container, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, com.redhat.license_terms=https://www.redhat.com/agreements, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, architecture=x86_64, container_name=ovn_controller, version=17.1.9, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, io.buildah.version=1.33.12, distribution-scope=public, io.openshift.expose-services=, vcs-type=git, vendor=Red Hat, Inc., name=rhosp17/openstack-ovn-controller, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20250721.1) Oct 14 04:58:47 localhost systemd[1]: tmp-crun.AV1PLm.mount: Deactivated successfully. 
Oct 14 04:58:47 localhost podman[99134]: 2025-10-14 08:58:47.823419338 +0000 UTC m=+0.157519646 container health_status ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-nova-compute, version=17.1.9, distribution-scope=public, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20250721.1, build-date=2025-07-21T14:48:37, maintainer=OpenStack TripleO Team, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, vcs-type=git, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a-4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', 
'/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, release=1, container_name=nova_compute, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.33.12) Oct 14 04:58:47 localhost podman[99134]: 2025-10-14 08:58:47.853711966 +0000 UTC m=+0.187812254 container exec_died ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, distribution-scope=public, release=1, container_name=nova_compute, build-date=2025-07-21T14:48:37, vcs-type=git, io.buildah.version=1.33.12, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 
'49c4309af9a4fea3d3f53b6222780f5a-4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, name=rhosp17/openstack-nova-compute, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, managed_by=tripleo_ansible, batch=17.1_20250721.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container) Oct 14 04:58:47 localhost 
podman[99132]: 2025-10-14 08:58:47.86806563 +0000 UTC m=+0.204760238 container health_status 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, com.redhat.license_terms=https://www.redhat.com/agreements, release=1, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, io.openshift.expose-services=, io.buildah.version=1.33.12, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, tcib_managed=true, name=rhosp17/openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, maintainer=OpenStack TripleO Team, batch=17.1_20250721.1, build-date=2025-07-21T16:28:53, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.9, vcs-type=git, container_name=ovn_metadata_agent, vendor=Red Hat, Inc.) Oct 14 04:58:47 localhost systemd[1]: ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2.service: Deactivated successfully. Oct 14 04:58:47 localhost podman[99133]: 2025-10-14 08:58:47.875949144 +0000 UTC m=+0.213560581 container exec_died a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, release=1, io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T13:28:44, com.redhat.component=openstack-ovn-controller-container, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.33.12, distribution-scope=public, maintainer=OpenStack TripleO Team, version=17.1.9, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, container_name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20250721.1, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 
6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-type=git, architecture=x86_64, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements) Oct 14 04:58:47 localhost podman[99133]: unhealthy Oct 14 04:58:47 localhost systemd[1]: a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5.service: Main process exited, code=exited, status=1/FAILURE Oct 14 04:58:47 localhost systemd[1]: a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5.service: Failed with result 'exit-code'. 
Oct 14 04:58:47 localhost podman[99132]: 2025-10-14 08:58:47.896101048 +0000 UTC m=+0.232795696 container exec_died 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, container_name=ovn_metadata_agent, batch=17.1_20250721.1, io.buildah.version=1.33.12, architecture=x86_64, release=1, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.display-name=Red Hat 
OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, vcs-type=git, vendor=Red Hat, Inc., build-date=2025-07-21T16:28:53, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, version=17.1.9, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, distribution-scope=public, name=rhosp17/openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4) Oct 14 04:58:47 localhost podman[99132]: unhealthy Oct 14 04:58:47 localhost systemd[1]: 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e.service: Main process exited, code=exited, status=1/FAILURE Oct 14 04:58:47 localhost systemd[1]: 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e.service: Failed with result 'exit-code'. Oct 14 04:58:48 localhost systemd[1]: tmp-crun.6lxH4v.mount: Deactivated successfully. Oct 14 04:58:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6. 
Oct 14 04:58:53 localhost podman[99197]: 2025-10-14 08:58:53.743703378 +0000 UTC m=+0.082539656 container health_status 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, architecture=x86_64, build-date=2025-07-21T13:07:59, managed_by=tripleo_ansible, vcs-type=git, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=metrics_qdr, name=rhosp17/openstack-qdrouterd, io.openshift.expose-services=, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, vendor=Red Hat, Inc., tcib_managed=true, config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1, batch=17.1_20250721.1, com.redhat.component=openstack-qdrouterd-container, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team) Oct 14 04:58:53 localhost podman[99197]: 2025-10-14 08:58:53.953173031 +0000 UTC m=+0.292009239 container exec_died 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, release=1, build-date=2025-07-21T13:07:59, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, vcs-type=git, batch=17.1_20250721.1, vendor=Red Hat, Inc., config_id=tripleo_step1, distribution-scope=public, maintainer=OpenStack TripleO Team, architecture=x86_64, name=rhosp17/openstack-qdrouterd, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, container_name=metrics_qdr, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, com.redhat.component=openstack-qdrouterd-container, io.openshift.expose-services=, io.buildah.version=1.33.12) Oct 14 04:58:53 localhost systemd[1]: 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6.service: Deactivated successfully. Oct 14 04:59:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8. Oct 14 04:59:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea. Oct 14 04:59:08 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Oct 14 04:59:08 localhost recover_tripleo_nova_virtqemud[99239]: 62551 Oct 14 04:59:08 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Oct 14 04:59:08 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. Oct 14 04:59:08 localhost systemd[1]: tmp-crun.4HX6NG.mount: Deactivated successfully. 
Oct 14 04:59:08 localhost podman[99226]: 2025-10-14 08:59:08.75572618 +0000 UTC m=+0.092011079 container health_status 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, version=17.1.9, com.redhat.component=openstack-collectd-container, container_name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1, description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, vcs-type=git, build-date=2025-07-21T13:04:03, distribution-scope=public, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, 
config_id=tripleo_step3, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, batch=17.1_20250721.1, release=2, io.openshift.expose-services=, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, tcib_managed=true, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team) Oct 14 04:59:08 localhost podman[99226]: 2025-10-14 08:59:08.76248261 +0000 UTC m=+0.098767739 container exec_died 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, container_name=collectd, name=rhosp17/openstack-collectd, io.openshift.tags=rhosp osp openstack osp-17.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, com.redhat.component=openstack-collectd-container, summary=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, maintainer=OpenStack TripleO Team, version=17.1.9, config_id=tripleo_step3, release=2, vendor=Red Hat, Inc., batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-07-21T13:04:03, architecture=x86_64, distribution-scope=public, io.openshift.expose-services=, io.buildah.version=1.33.12) Oct 14 04:59:08 localhost systemd[1]: 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8.service: Deactivated successfully. 
Oct 14 04:59:08 localhost podman[99227]: 2025-10-14 08:59:08.847175451 +0000 UTC m=+0.182258842 container health_status 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, name=rhosp17/openstack-iscsid, architecture=x86_64, com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, release=1, build-date=2025-07-21T13:27:15, maintainer=OpenStack TripleO Team, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.9, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red 
Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, distribution-scope=public, vendor=Red Hat, Inc., tcib_managed=true, container_name=iscsid, managed_by=tripleo_ansible, vcs-type=git) Oct 14 04:59:08 localhost podman[99227]: 2025-10-14 08:59:08.856237781 +0000 UTC m=+0.191321132 container exec_died 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, container_name=iscsid, name=rhosp17/openstack-iscsid, com.redhat.component=openstack-iscsid-container, maintainer=OpenStack TripleO Team, io.buildah.version=1.33.12, vendor=Red Hat, Inc., config_id=tripleo_step3, vcs-type=git, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.9, managed_by=tripleo_ansible, release=1, build-date=2025-07-21T13:27:15, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1) Oct 14 04:59:08 localhost systemd[1]: 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea.service: Deactivated successfully. Oct 14 04:59:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638. Oct 14 04:59:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884. Oct 14 04:59:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f. Oct 14 04:59:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e. Oct 14 04:59:15 localhost systemd[1]: tmp-crun.W4tTeJ.mount: Deactivated successfully. 
Oct 14 04:59:15 localhost podman[99268]: 2025-10-14 08:59:15.739311627 +0000 UTC m=+0.077073446 container health_status def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, com.redhat.component=openstack-ceilometer-compute-container, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.33.12, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, build-date=2025-07-21T14:45:33, maintainer=OpenStack TripleO Team, batch=17.1_20250721.1, release=1, vcs-type=git, name=rhosp17/openstack-ceilometer-compute, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., container_name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, distribution-scope=public, version=17.1.9) Oct 14 04:59:15 localhost podman[99268]: 2025-10-14 08:59:15.775093845 +0000 UTC m=+0.112855684 container exec_died def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20250721.1, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, distribution-scope=public, release=1, vcs-type=git, managed_by=tripleo_ansible, tcib_managed=true, build-date=2025-07-21T14:45:33, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_compute, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, vendor=Red Hat, Inc., architecture=x86_64, version=17.1.9, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, com.redhat.component=openstack-ceilometer-compute-container, io.buildah.version=1.33.12) Oct 14 04:59:15 localhost systemd[1]: def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e.service: Deactivated successfully. 
Oct 14 04:59:15 localhost podman[99265]: 2025-10-14 08:59:15.781225004 +0000 UTC m=+0.123735980 container health_status 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, vcs-type=git, release=1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20250721.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.9, io.buildah.version=1.33.12, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2025-07-21T15:29:47, com.redhat.component=openstack-ceilometer-ipmi-container, managed_by=tripleo_ansible, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true, 
config_id=tripleo_step4, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_ipmi, name=rhosp17/openstack-ceilometer-ipmi, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, vendor=Red Hat, Inc., vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f) Oct 14 04:59:15 localhost podman[99266]: 2025-10-14 08:59:15.842269874 +0000 UTC m=+0.180667783 container health_status 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, io.openshift.expose-services=, managed_by=tripleo_ansible, batch=17.1_20250721.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, name=rhosp17/openstack-nova-compute, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step4, build-date=2025-07-21T14:48:37, com.redhat.component=openstack-nova-compute-container, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, version=17.1.9, architecture=x86_64, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, vendor=Red Hat, Inc., distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1, io.buildah.version=1.33.12) Oct 14 04:59:15 localhost podman[99265]: 2025-10-14 08:59:15.861620033 +0000 UTC m=+0.204131019 container exec_died 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, version=17.1.9, batch=17.1_20250721.1, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, build-date=2025-07-21T15:29:47, com.redhat.license_terms=https://www.redhat.com/agreements, managed_by=tripleo_ansible, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, tcib_managed=true, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-ipmi-container, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, architecture=x86_64, container_name=ceilometer_agent_ipmi, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi) Oct 14 04:59:15 localhost systemd[1]: 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638.service: Deactivated successfully. 
Oct 14 04:59:15 localhost podman[99267]: 2025-10-14 08:59:15.925560512 +0000 UTC m=+0.258601486 container health_status d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, tcib_managed=true, batch=17.1_20250721.1, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, container_name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, config_id=tripleo_step4, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, build-date=2025-07-21T13:07:52, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, 
io.openshift.expose-services=, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-cron, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/agreements, version=17.1.9, com.redhat.component=openstack-cron-container, release=1, vcs-type=git) Oct 14 04:59:15 localhost podman[99267]: 2025-10-14 08:59:15.957514761 +0000 UTC m=+0.290555785 container exec_died d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, build-date=2025-07-21T13:07:52, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, config_id=tripleo_step4, tcib_managed=true, architecture=x86_64, batch=17.1_20250721.1, version=17.1.9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, release=1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, container_name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-cron-container, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, vcs-type=git) Oct 14 04:59:15 localhost systemd[1]: d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f.service: Deactivated successfully. Oct 14 04:59:16 localhost podman[99266]: 2025-10-14 08:59:16.221224113 +0000 UTC m=+0.559622102 container exec_died 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.9, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1, container_name=nova_migration_target, name=rhosp17/openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team, batch=17.1_20250721.1, io.openshift.expose-services=, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, vendor=Red Hat, Inc., build-date=2025-07-21T14:48:37, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, architecture=x86_64, tcib_managed=true, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1, io.buildah.version=1.33.12) Oct 14 04:59:16 localhost systemd[1]: 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884.service: Deactivated successfully. Oct 14 04:59:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e. Oct 14 04:59:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5. Oct 14 04:59:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2. 
Oct 14 04:59:18 localhost podman[99356]: 2025-10-14 08:59:18.753479228 +0000 UTC m=+0.090183362 container health_status 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., name=rhosp17/openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', 
'/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, tcib_managed=true, version=17.1.9, container_name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T16:28:53, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, release=1, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, batch=17.1_20250721.1, vcs-type=git, io.buildah.version=1.33.12, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3) Oct 14 04:59:18 localhost podman[99356]: 2025-10-14 08:59:18.767454581 +0000 UTC m=+0.104158705 container exec_died 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, release=1, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, maintainer=OpenStack TripleO Team, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, container_name=ovn_metadata_agent, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, name=rhosp17/openstack-neutron-metadata-agent-ovn, build-date=2025-07-21T16:28:53, version=17.1.9, config_id=tripleo_step4, architecture=x86_64, vendor=Red Hat, Inc., io.openshift.expose-services=, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, batch=17.1_20250721.1, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Oct 14 04:59:18 localhost podman[99358]: 2025-10-14 08:59:18.801894886 +0000 UTC m=+0.129258011 container health_status ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, name=rhosp17/openstack-nova-compute, config_id=tripleo_step5, container_name=nova_compute, description=Red Hat OpenStack Platform 17.1 
nova-compute, distribution-scope=public, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a-4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 
nova-compute, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, architecture=x86_64, release=1, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, build-date=2025-07-21T14:48:37, version=17.1.9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, vendor=Red Hat, Inc., batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements) Oct 14 04:59:18 localhost podman[99356]: unhealthy Oct 14 04:59:18 localhost systemd[1]: 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e.service: Main process exited, code=exited, status=1/FAILURE Oct 14 04:59:18 localhost systemd[1]: 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e.service: Failed with result 'exit-code'. Oct 14 04:59:18 localhost podman[99358]: 2025-10-14 08:59:18.833410972 +0000 UTC m=+0.160774047 container exec_died ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, vendor=Red Hat, Inc., config_id=tripleo_step5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1, managed_by=tripleo_ansible, container_name=nova_compute, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a-4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.33.12, name=rhosp17/openstack-nova-compute, build-date=2025-07-21T14:48:37, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1, io.openshift.expose-services=, version=17.1.9, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git) Oct 14 04:59:18 localhost systemd[1]: 
ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2.service: Deactivated successfully. Oct 14 04:59:18 localhost systemd[1]: tmp-crun.rX6RzE.mount: Deactivated successfully. Oct 14 04:59:18 localhost podman[99357]: 2025-10-14 08:59:18.920651112 +0000 UTC m=+0.252141205 container health_status a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, managed_by=tripleo_ansible, io.openshift.expose-services=, vcs-type=git, batch=17.1_20250721.1, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, tcib_managed=true, com.redhat.component=openstack-ovn-controller-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, build-date=2025-07-21T13:28:44, container_name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, config_id=tripleo_step4, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, 
io.buildah.version=1.33.12, name=rhosp17/openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller) Oct 14 04:59:18 localhost podman[99357]: 2025-10-14 08:59:18.964336164 +0000 UTC m=+0.295826297 container exec_died a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vcs-type=git, vendor=Red Hat, Inc., batch=17.1_20250721.1, version=17.1.9, io.buildah.version=1.33.12, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/agreements, managed_by=tripleo_ansible, build-date=2025-07-21T13:28:44, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, tcib_managed=true, com.redhat.component=openstack-ovn-controller-container, distribution-scope=public, config_id=tripleo_step4, container_name=ovn_controller, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, 
architecture=x86_64) Oct 14 04:59:18 localhost podman[99357]: unhealthy Oct 14 04:59:18 localhost systemd[1]: a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5.service: Main process exited, code=exited, status=1/FAILURE Oct 14 04:59:18 localhost systemd[1]: a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5.service: Failed with result 'exit-code'. Oct 14 04:59:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6. Oct 14 04:59:24 localhost podman[99423]: 2025-10-14 08:59:24.735232958 +0000 UTC m=+0.080456601 container health_status 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc., version=17.1.9, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, config_id=tripleo_step1, io.openshift.expose-services=, name=rhosp17/openstack-qdrouterd, release=1, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, com.redhat.component=openstack-qdrouterd-container, container_name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, batch=17.1_20250721.1, io.buildah.version=1.33.12, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team, vcs-type=git, description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, build-date=2025-07-21T13:07:59) Oct 14 04:59:24 localhost podman[99423]: 2025-10-14 08:59:24.977079184 +0000 UTC m=+0.322302747 container exec_died 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, release=1, version=17.1.9, config_id=tripleo_step1, io.openshift.expose-services=, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, container_name=metrics_qdr, build-date=2025-07-21T13:07:59, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, tcib_managed=true, vcs-type=git, batch=17.1_20250721.1, io.buildah.version=1.33.12, managed_by=tripleo_ansible, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Red Hat OpenStack Platform 17.1 qdrouterd) Oct 14 04:59:24 localhost systemd[1]: 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6.service: Deactivated successfully. Oct 14 04:59:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8. Oct 14 04:59:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea. Oct 14 04:59:39 localhost systemd[1]: tmp-crun.bqMri8.mount: Deactivated successfully. 
Oct 14 04:59:39 localhost podman[99452]: 2025-10-14 08:59:39.758187414 +0000 UTC m=+0.098446318 container health_status 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, version=17.1.9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, distribution-scope=public, io.openshift.expose-services=, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, batch=17.1_20250721.1, release=2, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, summary=Red Hat OpenStack Platform 17.1 collectd, 
com.redhat.component=openstack-collectd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-collectd, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-07-21T13:04:03, description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, config_id=tripleo_step3, managed_by=tripleo_ansible, container_name=collectd) Oct 14 04:59:39 localhost podman[99452]: 2025-10-14 08:59:39.795185139 +0000 UTC m=+0.135444013 container exec_died 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, io.openshift.expose-services=, architecture=x86_64, managed_by=tripleo_ansible, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, config_id=tripleo_step3, version=17.1.9, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20250721.1, distribution-scope=public, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, vendor=Red Hat, Inc., com.redhat.component=openstack-collectd-container, name=rhosp17/openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-07-21T13:04:03, description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, io.buildah.version=1.33.12, release=2) Oct 14 04:59:39 localhost systemd[1]: 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8.service: Deactivated successfully. 
Oct 14 04:59:39 localhost podman[99453]: 2025-10-14 08:59:39.853736031 +0000 UTC m=+0.191418435 container health_status 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, batch=17.1_20250721.1, config_id=tripleo_step3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, build-date=2025-07-21T13:27:15, distribution-scope=public, version=17.1.9, architecture=x86_64, name=rhosp17/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true, container_name=iscsid, release=1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, managed_by=tripleo_ansible, vcs-type=git, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2) Oct 14 04:59:39 localhost podman[99453]: 2025-10-14 08:59:39.892185512 +0000 UTC m=+0.229867866 container exec_died 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., build-date=2025-07-21T13:27:15, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, maintainer=OpenStack TripleO Team, version=17.1.9, com.redhat.component=openstack-iscsid-container, vcs-type=git, container_name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, release=1, io.openshift.expose-services=, architecture=x86_64, io.buildah.version=1.33.12, batch=17.1_20250721.1, name=rhosp17/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, distribution-scope=public) Oct 14 04:59:39 localhost systemd[1]: 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea.service: Deactivated successfully. Oct 14 04:59:40 localhost systemd[1]: tmp-crun.Nmf4Aj.mount: Deactivated successfully. Oct 14 04:59:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638. Oct 14 04:59:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884. Oct 14 04:59:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f. Oct 14 04:59:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e. Oct 14 04:59:46 localhost systemd[1]: tmp-crun.YyP5vc.mount: Deactivated successfully. 
Oct 14 04:59:46 localhost podman[99570]: 2025-10-14 08:59:46.825585266 +0000 UTC m=+0.149508908 container health_status d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, vcs-type=git, com.redhat.component=openstack-cron-container, distribution-scope=public, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, release=1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, name=rhosp17/openstack-cron, build-date=2025-07-21T13:07:52, io.openshift.expose-services=, batch=17.1_20250721.1, version=17.1.9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, container_name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c) Oct 14 04:59:46 localhost podman[99570]: 2025-10-14 08:59:46.836245646 +0000 UTC m=+0.160169308 container exec_died d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.openshift.expose-services=, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, name=rhosp17/openstack-cron, config_id=tripleo_step4, distribution-scope=public, batch=17.1_20250721.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat 
OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-cron-container, io.buildah.version=1.33.12, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, build-date=2025-07-21T13:07:52, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., architecture=x86_64, container_name=logrotate_crond, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, version=17.1.9, description=Red Hat OpenStack Platform 17.1 cron, release=1) Oct 14 04:59:46 localhost systemd[1]: d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f.service: Deactivated successfully. Oct 14 04:59:46 localhost podman[99568]: 2025-10-14 08:59:46.871796416 +0000 UTC m=+0.203273792 container health_status 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20250721.1, build-date=2025-07-21T15:29:47, name=rhosp17/openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-type=git, release=1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.33.12, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1, container_name=ceilometer_agent_ipmi, io.openshift.expose-services=, version=17.1.9, architecture=x86_64, managed_by=tripleo_ansible, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, vendor=Red Hat, Inc.) 
Oct 14 04:59:46 localhost podman[99568]: 2025-10-14 08:59:46.902980162 +0000 UTC m=+0.234457548 container exec_died 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20250721.1, com.redhat.component=openstack-ceilometer-ipmi-container, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, container_name=ceilometer_agent_ipmi, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1, 
version=17.1.9, maintainer=OpenStack TripleO Team, vcs-type=git, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, tcib_managed=true, build-date=2025-07-21T15:29:47, vendor=Red Hat, Inc., io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi) Oct 14 04:59:46 localhost systemd[1]: 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638.service: Deactivated successfully. Oct 14 04:59:46 localhost podman[99569]: 2025-10-14 08:59:46.919318447 +0000 UTC m=+0.248580254 container health_status 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.9, architecture=x86_64, config_id=tripleo_step4, io.buildah.version=1.33.12, vcs-type=git, maintainer=OpenStack TripleO Team, container_name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-07-21T14:48:37, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, tcib_managed=true, name=rhosp17/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/agreements, batch=17.1_20250721.1, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 
'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}) Oct 14 04:59:46 localhost podman[99571]: 2025-10-14 08:59:46.795208376 +0000 UTC m=+0.115071693 container health_status def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., managed_by=tripleo_ansible, distribution-scope=public, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.33.12, architecture=x86_64, com.redhat.component=openstack-ceilometer-compute-container, version=17.1.9, io.openshift.tags=rhosp osp openstack osp-17.1, io.openshift.expose-services=, release=1, batch=17.1_20250721.1, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 
'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, vcs-type=git, build-date=2025-07-21T14:45:33) Oct 14 04:59:46 localhost podman[99571]: 2025-10-14 08:59:46.979247872 +0000 UTC m=+0.299111229 container exec_died def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, build-date=2025-07-21T14:45:33, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, com.redhat.component=openstack-ceilometer-compute-container, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, config_id=tripleo_step4, 
architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.9, distribution-scope=public, io.buildah.version=1.33.12, release=1, managed_by=tripleo_ansible, container_name=ceilometer_agent_compute, io.openshift.expose-services=, vendor=Red Hat, Inc., tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20250721.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-compute) Oct 14 04:59:46 localhost systemd[1]: def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e.service: Deactivated successfully. 
Oct 14 04:59:47 localhost podman[99569]: 2025-10-14 08:59:47.280790715 +0000 UTC m=+0.610052552 container exec_died 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, batch=17.1_20250721.1, build-date=2025-07-21T14:48:37, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, distribution-scope=public, io.buildah.version=1.33.12, io.openshift.expose-services=, managed_by=tripleo_ansible, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, release=1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_migration_target, version=17.1.9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, vendor=Red Hat, Inc.) Oct 14 04:59:47 localhost systemd[1]: 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884.service: Deactivated successfully. Oct 14 04:59:47 localhost systemd[1]: tmp-crun.mORYnb.mount: Deactivated successfully. Oct 14 04:59:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e. Oct 14 04:59:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5. Oct 14 04:59:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2. Oct 14 04:59:49 localhost systemd[1]: tmp-crun.3sWtIY.mount: Deactivated successfully. 
Oct 14 04:59:49 localhost podman[99662]: 2025-10-14 08:59:49.748723259 +0000 UTC m=+0.077652253 container health_status ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, config_id=tripleo_step5, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, io.openshift.expose-services=, batch=17.1_20250721.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a-4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_compute, name=rhosp17/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, version=17.1.9, release=1, build-date=2025-07-21T14:48:37, com.redhat.license_terms=https://www.redhat.com/agreements) Oct 14 04:59:49 localhost podman[99660]: 2025-10-14 08:59:49.790150582 +0000 UTC m=+0.122530404 container health_status 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, build-date=2025-07-21T16:28:53, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, distribution-scope=public, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 
'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, container_name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., architecture=x86_64, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1, maintainer=OpenStack TripleO Team, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1, config_id=tripleo_step4, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20250721.1, io.buildah.version=1.33.12, name=rhosp17/openstack-neutron-metadata-agent-ovn) Oct 14 04:59:49 localhost podman[99660]: 2025-10-14 08:59:49.803580618 +0000 UTC m=+0.135960430 container exec_died 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e 
(image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, managed_by=tripleo_ansible, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-type=git, container_name=ovn_metadata_agent, io.buildah.version=1.33.12, architecture=x86_64, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, maintainer=OpenStack TripleO Team, release=1, 
distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, version=17.1.9, batch=17.1_20250721.1, build-date=2025-07-21T16:28:53, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Oct 14 04:59:49 localhost podman[99660]: unhealthy Oct 14 04:59:49 localhost systemd[1]: 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e.service: Main process exited, code=exited, status=1/FAILURE Oct 14 04:59:49 localhost systemd[1]: 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e.service: Failed with result 'exit-code'. Oct 14 04:59:49 localhost podman[99662]: 2025-10-14 08:59:49.829724816 +0000 UTC m=+0.158653820 container exec_died ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, build-date=2025-07-21T14:48:37, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, config_id=tripleo_step5, container_name=nova_compute, io.openshift.tags=rhosp osp openstack osp-17.1, io.buildah.version=1.33.12, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20250721.1, distribution-scope=public, release=1, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a-4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': 
'/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, version=17.1.9, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, vendor=Red Hat, Inc.) 
Oct 14 04:59:49 localhost systemd[1]: ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2.service: Deactivated successfully. Oct 14 04:59:49 localhost podman[99661]: 2025-10-14 08:59:49.864160523 +0000 UTC m=+0.191830089 container health_status a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, com.redhat.component=openstack-ovn-controller-container, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, managed_by=tripleo_ansible, release=1, build-date=2025-07-21T13:28:44, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true, batch=17.1_20250721.1, vcs-type=git, vendor=Red Hat, Inc., architecture=x86_64, io.buildah.version=1.33.12, version=17.1.9, distribution-scope=public, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, config_id=tripleo_step4) Oct 14 04:59:49 localhost podman[99661]: 2025-10-14 08:59:49.901626502 +0000 UTC m=+0.229296138 container exec_died a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.9, tcib_managed=true, config_id=tripleo_step4, batch=17.1_20250721.1, build-date=2025-07-21T13:28:44, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.component=openstack-ovn-controller-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.33.12, io.openshift.expose-services=, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, release=1, container_name=ovn_controller, vendor=Red Hat, Inc., managed_by=tripleo_ansible, 
name=rhosp17/openstack-ovn-controller) Oct 14 04:59:49 localhost podman[99661]: unhealthy Oct 14 04:59:49 localhost systemd[1]: a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5.service: Main process exited, code=exited, status=1/FAILURE Oct 14 04:59:49 localhost systemd[1]: a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5.service: Failed with result 'exit-code'. Oct 14 04:59:50 localhost systemd[1]: tmp-crun.iQ4UTl.mount: Deactivated successfully. Oct 14 04:59:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6. Oct 14 04:59:55 localhost podman[99723]: 2025-10-14 08:59:55.749416665 +0000 UTC m=+0.088308654 container health_status 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.9, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, name=rhosp17/openstack-qdrouterd, release=1, io.buildah.version=1.33.12, distribution-scope=public, config_id=tripleo_step1, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, architecture=x86_64, com.redhat.component=openstack-qdrouterd-container, vcs-type=git, tcib_managed=true, managed_by=tripleo_ansible, build-date=2025-07-21T13:07:59, vendor=Red Hat, Inc., batch=17.1_20250721.1, container_name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, io.openshift.tags=rhosp osp openstack osp-17.1) Oct 14 04:59:55 localhost podman[99723]: 2025-10-14 08:59:55.941205352 +0000 UTC m=+0.280097341 container exec_died 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, build-date=2025-07-21T13:07:59, config_id=tripleo_step1, batch=17.1_20250721.1, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1, tcib_managed=true, io.buildah.version=1.33.12, managed_by=tripleo_ansible, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, version=17.1.9, architecture=x86_64, container_name=metrics_qdr, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, name=rhosp17/openstack-qdrouterd) Oct 14 04:59:55 localhost systemd[1]: 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6.service: Deactivated successfully. Oct 14 05:00:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8. Oct 14 05:00:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea. Oct 14 05:00:10 localhost systemd[1]: tmp-crun.76WUkk.mount: Deactivated successfully. 
Oct 14 05:00:10 localhost podman[99755]: 2025-10-14 09:00:10.742309128 +0000 UTC m=+0.080694588 container health_status 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-collectd, io.buildah.version=1.33.12, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, architecture=x86_64, com.redhat.component=openstack-collectd-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, release=2, build-date=2025-07-21T13:04:03, tcib_managed=true, version=17.1.9, batch=17.1_20250721.1) Oct 14 05:00:10 localhost podman[99756]: 2025-10-14 09:00:10.724819006 +0000 UTC m=+0.065662902 container health_status 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, tcib_managed=true, build-date=2025-07-21T13:27:15, container_name=iscsid, distribution-scope=public, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', 
'/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 iscsid, release=1, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, com.redhat.component=openstack-iscsid-container, managed_by=tripleo_ansible, config_id=tripleo_step3, name=rhosp17/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, vendor=Red Hat, Inc., version=17.1.9) Oct 14 05:00:10 localhost podman[99756]: 2025-10-14 09:00:10.804487543 +0000 UTC m=+0.145331449 container exec_died 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, version=17.1.9, com.redhat.component=openstack-iscsid-container, distribution-scope=public, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, vcs-type=git, io.openshift.expose-services=, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, batch=17.1_20250721.1, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, com.redhat.license_terms=https://www.redhat.com/agreements, release=1, managed_by=tripleo_ansible, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, build-date=2025-07-21T13:27:15, container_name=iscsid) Oct 14 05:00:10 localhost systemd[1]: 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea.service: Deactivated successfully. 
Oct 14 05:00:10 localhost podman[99755]: 2025-10-14 09:00:10.828254688 +0000 UTC m=+0.166640148 container exec_died 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, build-date=2025-07-21T13:04:03, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, tcib_managed=true, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, container_name=collectd, distribution-scope=public, vcs-type=git, com.redhat.component=openstack-collectd-container, version=17.1.9, 
release=2, architecture=x86_64, config_id=tripleo_step3, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhosp17/openstack-collectd, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20250721.1) Oct 14 05:00:10 localhost systemd[1]: 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8.service: Deactivated successfully. Oct 14 05:00:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638. Oct 14 05:00:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884. Oct 14 05:00:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f. Oct 14 05:00:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e. Oct 14 05:00:17 localhost systemd[1]: tmp-crun.zYmFts.mount: Deactivated successfully. Oct 14 05:00:17 localhost systemd[1]: tmp-crun.bXw0wD.mount: Deactivated successfully. 
Oct 14 05:00:17 localhost podman[99795]: 2025-10-14 09:00:17.763024997 +0000 UTC m=+0.102139443 container health_status 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, maintainer=OpenStack TripleO Team, release=1, container_name=ceilometer_agent_ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20250721.1, name=rhosp17/openstack-ceilometer-ipmi, version=17.1.9, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, build-date=2025-07-21T15:29:47, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, io.buildah.version=1.33.12, distribution-scope=public, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, vcs-type=git, architecture=x86_64) Oct 14 05:00:17 localhost podman[99797]: 2025-10-14 09:00:17.813886621 +0000 UTC m=+0.141414599 container health_status d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20250721.1, io.openshift.expose-services=, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, managed_by=tripleo_ansible, vcs-type=git, name=rhosp17/openstack-cron, config_id=tripleo_step4, container_name=logrotate_crond, release=1, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, architecture=x86_64, com.redhat.component=openstack-cron-container, io.openshift.tags=rhosp osp openstack osp-17.1, version=17.1.9, vendor=Red Hat, Inc., build-date=2025-07-21T13:07:52) Oct 14 05:00:17 localhost podman[99802]: 2025-10-14 09:00:17.792319413 +0000 UTC m=+0.116539068 container health_status def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, tcib_managed=true, vendor=Red Hat, Inc., name=rhosp17/openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-07-21T14:45:33, architecture=x86_64, com.redhat.component=openstack-ceilometer-compute-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, managed_by=tripleo_ansible, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, config_id=tripleo_step4, vcs-type=git, distribution-scope=public, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_compute, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1) Oct 14 05:00:17 localhost podman[99795]: 2025-10-14 09:00:17.843784015 +0000 UTC m=+0.182898481 container exec_died 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, name=rhosp17/openstack-ceilometer-ipmi, version=17.1.9, architecture=x86_64, distribution-scope=public, tcib_managed=true, config_id=tripleo_step4, vcs-type=git, com.redhat.component=openstack-ceilometer-ipmi-container, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, release=1, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, container_name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack 
Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2025-07-21T15:29:47, batch=17.1_20250721.1, io.buildah.version=1.33.12) Oct 14 05:00:17 localhost podman[99802]: 2025-10-14 09:00:17.88303441 +0000 UTC m=+0.207254025 container exec_died def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, version=17.1.9, build-date=2025-07-21T14:45:33, config_id=tripleo_step4, release=1, architecture=x86_64, distribution-scope=public, batch=17.1_20250721.1, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-ceilometer-compute, vcs-type=git, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, tcib_managed=true, container_name=ceilometer_agent_compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-compute-container, com.redhat.license_terms=https://www.redhat.com/agreements, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-compute) Oct 14 05:00:17 localhost podman[99796]: 2025-10-14 09:00:17.892449412 +0000 UTC 
m=+0.223517879 container health_status 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step4, managed_by=tripleo_ansible, vcs-type=git, io.buildah.version=1.33.12, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, version=17.1.9, com.redhat.component=openstack-nova-compute-container, 
container_name=nova_migration_target, batch=17.1_20250721.1, build-date=2025-07-21T14:48:37, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1, architecture=x86_64) Oct 14 05:00:17 localhost systemd[1]: def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e.service: Deactivated successfully. Oct 14 05:00:17 localhost podman[99797]: 2025-10-14 09:00:17.911051847 +0000 UTC m=+0.238579805 container exec_died d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, container_name=logrotate_crond, maintainer=OpenStack TripleO Team, version=17.1.9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, distribution-scope=public, summary=Red Hat OpenStack Platform 
17.1 cron, architecture=x86_64, io.openshift.expose-services=, name=rhosp17/openstack-cron, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20250721.1, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, vcs-type=git, build-date=2025-07-21T13:07:52, com.redhat.component=openstack-cron-container, config_id=tripleo_step4, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, io.openshift.tags=rhosp osp openstack osp-17.1, release=1) Oct 14 05:00:17 localhost systemd[1]: d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f.service: Deactivated successfully. Oct 14 05:00:17 localhost systemd[1]: 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638.service: Deactivated successfully. Oct 14 05:00:18 localhost podman[99796]: 2025-10-14 09:00:18.21902443 +0000 UTC m=+0.550092907 container exec_died 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1, vcs-type=git, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, version=17.1.9, io.buildah.version=1.33.12, batch=17.1_20250721.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step4, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1, container_name=nova_migration_target, build-date=2025-07-21T14:48:37, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, maintainer=OpenStack TripleO Team, distribution-scope=public, architecture=x86_64, com.redhat.component=openstack-nova-compute-container) Oct 14 05:00:18 localhost systemd[1]: 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884.service: Deactivated successfully. Oct 14 05:00:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e. Oct 14 05:00:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5. Oct 14 05:00:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2. 
Oct 14 05:00:20 localhost systemd[1]: tmp-crun.hFcvQC.mount: Deactivated successfully. Oct 14 05:00:20 localhost podman[99888]: 2025-10-14 09:00:20.713408563 +0000 UTC m=+0.057010516 container health_status 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, container_name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, release=1, build-date=2025-07-21T16:28:53, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, batch=17.1_20250721.1, distribution-scope=public, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, architecture=x86_64, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, config_id=tripleo_step4, tcib_managed=true, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc.) Oct 14 05:00:20 localhost podman[99890]: 2025-10-14 09:00:20.774106851 +0000 UTC m=+0.111061628 container health_status ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, distribution-scope=public, com.redhat.component=openstack-nova-compute-container, version=17.1.9, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a-4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step5, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, container_name=nova_compute, batch=17.1_20250721.1, name=rhosp17/openstack-nova-compute, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, build-date=2025-07-21T14:48:37, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, architecture=x86_64, managed_by=tripleo_ansible, release=1, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute) Oct 14 05:00:20 localhost podman[99888]: 2025-10-14 09:00:20.793551123 +0000 UTC m=+0.137153106 container 
exec_died 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, distribution-scope=public, tcib_managed=true, container_name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, io.buildah.version=1.33.12, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, name=rhosp17/openstack-neutron-metadata-agent-ovn, release=1, build-date=2025-07-21T16:28:53, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, version=17.1.9, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, architecture=x86_64, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, config_id=tripleo_step4) Oct 14 05:00:20 localhost podman[99888]: unhealthy Oct 14 05:00:20 localhost systemd[1]: 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e.service: Main process exited, code=exited, status=1/FAILURE Oct 14 05:00:20 localhost systemd[1]: 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e.service: Failed with result 'exit-code'. 
Oct 14 05:00:20 localhost podman[99890]: 2025-10-14 09:00:20.847615197 +0000 UTC m=+0.184570004 container exec_died ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, vcs-type=git, config_id=tripleo_step5, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a-4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', 
'/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, vendor=Red Hat, Inc., managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, tcib_managed=true, io.buildah.version=1.33.12, name=rhosp17/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.9, io.openshift.tags=rhosp osp openstack osp-17.1, container_name=nova_compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, batch=17.1_20250721.1, build-date=2025-07-21T14:48:37, architecture=x86_64, release=1) Oct 14 05:00:20 localhost systemd[1]: ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2.service: Deactivated successfully. 
Oct 14 05:00:20 localhost podman[99889]: 2025-10-14 09:00:20.746943671 +0000 UTC m=+0.085442816 container health_status a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, io.buildah.version=1.33.12, architecture=x86_64, tcib_managed=true, com.redhat.component=openstack-ovn-controller-container, description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, vcs-type=git, vendor=Red Hat, Inc., batch=17.1_20250721.1, config_id=tripleo_step4, release=1, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, maintainer=OpenStack TripleO Team, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1, container_name=ovn_controller, distribution-scope=public, name=rhosp17/openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-07-21T13:28:44) Oct 14 05:00:20 localhost podman[99889]: 2025-10-14 09:00:20.928024385 +0000 
UTC m=+0.266523510 container exec_died a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, build-date=2025-07-21T13:28:44, container_name=ovn_controller, io.buildah.version=1.33.12, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, tcib_managed=true, managed_by=tripleo_ansible, release=1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhosp17/openstack-ovn-controller, batch=17.1_20250721.1, com.redhat.component=openstack-ovn-controller-container, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.9, architecture=x86_64, io.openshift.expose-services=) Oct 14 05:00:20 localhost podman[99889]: unhealthy Oct 14 05:00:20 localhost systemd[1]: a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5.service: Main process 
exited, code=exited, status=1/FAILURE Oct 14 05:00:20 localhost systemd[1]: a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5.service: Failed with result 'exit-code'. Oct 14 05:00:21 localhost systemd[1]: tmp-crun.T2dyxK.mount: Deactivated successfully. Oct 14 05:00:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6. Oct 14 05:00:26 localhost systemd[1]: tmp-crun.hi4OvI.mount: Deactivated successfully. Oct 14 05:00:26 localhost podman[99953]: 2025-10-14 09:00:26.748155233 +0000 UTC m=+0.093082662 container health_status 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, container_name=metrics_qdr, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, architecture=x86_64, batch=17.1_20250721.1, io.openshift.tags=rhosp osp openstack osp-17.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, vendor=Red Hat, Inc., io.openshift.expose-services=, build-date=2025-07-21T13:07:59, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, vcs-type=git, distribution-scope=public, version=17.1.9, tcib_managed=true, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, release=1) Oct 14 05:00:26 localhost podman[99953]: 2025-10-14 09:00:26.945341556 +0000 UTC m=+0.290269125 container exec_died 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.expose-services=, name=rhosp17/openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/agreements, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1, architecture=x86_64, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.33.12, release=1, batch=17.1_20250721.1, com.redhat.component=openstack-qdrouterd-container, version=17.1.9, description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-type=git, build-date=2025-07-21T13:07:59, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 qdrouterd) Oct 14 05:00:26 localhost systemd[1]: 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6.service: Deactivated successfully. Oct 14 05:00:39 localhost systemd[1]: session-28.scope: Deactivated successfully. Oct 14 05:00:39 localhost systemd[1]: session-28.scope: Consumed 7min 15.099s CPU time. Oct 14 05:00:39 localhost systemd-logind[760]: Session 28 logged out. Waiting for processes to exit. Oct 14 05:00:39 localhost systemd-logind[760]: Removed session 28. Oct 14 05:00:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8. Oct 14 05:00:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea. 
Oct 14 05:00:41 localhost podman[99983]: 2025-10-14 09:00:41.732575061 +0000 UTC m=+0.079450701 container health_status 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, config_id=tripleo_step3, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, maintainer=OpenStack TripleO Team, version=17.1.9, com.redhat.component=openstack-collectd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, description=Red Hat OpenStack Platform 17.1 collectd, 
vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, io.buildah.version=1.33.12, container_name=collectd, batch=17.1_20250721.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-07-21T13:04:03, vcs-type=git, architecture=x86_64, io.openshift.expose-services=, managed_by=tripleo_ansible, release=2) Oct 14 05:00:41 localhost podman[99983]: 2025-10-14 09:00:41.740030831 +0000 UTC m=+0.086906461 container exec_died 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, io.openshift.expose-services=, vcs-type=git, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, com.redhat.component=openstack-collectd-container, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T13:04:03, container_name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, version=17.1.9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, release=2, config_id=tripleo_step3, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.33.12) Oct 14 05:00:41 localhost systemd[1]: 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8.service: Deactivated successfully. Oct 14 05:00:41 localhost systemd[1]: tmp-crun.kHjAvi.mount: Deactivated successfully. 
Oct 14 05:00:41 localhost podman[99984]: 2025-10-14 09:00:41.784392144 +0000 UTC m=+0.128266880 container health_status 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, architecture=x86_64, io.buildah.version=1.33.12, batch=17.1_20250721.1, tcib_managed=true, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=openstack-iscsid-container, name=rhosp17/openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.9, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.display-name=Red Hat 
OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, vcs-type=git, build-date=2025-07-21T13:27:15, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 iscsid, release=1, container_name=iscsid, maintainer=OpenStack TripleO Team) Oct 14 05:00:41 localhost podman[99984]: 2025-10-14 09:00:41.794844868 +0000 UTC m=+0.138719604 container exec_died 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, architecture=x86_64, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, maintainer=OpenStack TripleO Team, vcs-type=git, com.redhat.component=openstack-iscsid-container, name=rhosp17/openstack-iscsid, batch=17.1_20250721.1, release=1, io.openshift.tags=rhosp osp openstack osp-17.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, distribution-scope=public, io.openshift.expose-services=, build-date=2025-07-21T13:27:15, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.9, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/agreements, managed_by=tripleo_ansible, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.33.12, container_name=iscsid) Oct 14 05:00:41 localhost systemd[1]: 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea.service: Deactivated successfully. Oct 14 05:00:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638. Oct 14 05:00:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884. Oct 14 05:00:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f. Oct 14 05:00:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e. Oct 14 05:00:48 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Oct 14 05:00:48 localhost recover_tripleo_nova_virtqemud[100124]: 62551 Oct 14 05:00:48 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Oct 14 05:00:48 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. Oct 14 05:00:48 localhost systemd[1]: tmp-crun.tIwe6t.mount: Deactivated successfully. Oct 14 05:00:48 localhost systemd[1]: tmp-crun.MHzpCs.mount: Deactivated successfully. 
Oct 14 05:00:48 localhost podman[100097]: 2025-10-14 09:00:48.775245786 +0000 UTC m=+0.105145325 container health_status 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, architecture=x86_64, config_id=tripleo_step4, tcib_managed=true, managed_by=tripleo_ansible, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, release=1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, 
distribution-scope=public, vcs-type=git, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, io.buildah.version=1.33.12, batch=17.1_20250721.1, build-date=2025-07-21T15:29:47, container_name=ceilometer_agent_ipmi, version=17.1.9, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.component=openstack-ceilometer-ipmi-container) Oct 14 05:00:48 localhost podman[100098]: 2025-10-14 09:00:48.833075176 +0000 UTC m=+0.157939359 container health_status 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, build-date=2025-07-21T14:48:37, container_name=nova_migration_target, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.9, vcs-type=git, vendor=Red Hat, Inc., tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.license_terms=https://www.redhat.com/agreements, 
config_id=tripleo_step4, com.redhat.component=openstack-nova-compute-container, release=1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20250721.1, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.33.12, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64) Oct 14 05:00:48 localhost podman[100097]: 2025-10-14 09:00:48.860099122 +0000 UTC m=+0.189998661 container exec_died 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, batch=17.1_20250721.1, build-date=2025-07-21T15:29:47, config_id=tripleo_step4, io.buildah.version=1.33.12, managed_by=tripleo_ansible, architecture=x86_64, container_name=ceilometer_agent_ipmi, name=rhosp17/openstack-ceilometer-ipmi, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1, version=17.1.9, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, vcs-type=git, com.redhat.component=openstack-ceilometer-ipmi-container, vendor=Red Hat, Inc.) 
Oct 14 05:00:48 localhost podman[100100]: 2025-10-14 09:00:48.816744571 +0000 UTC m=+0.139681904 container health_status def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, com.redhat.component=openstack-ceilometer-compute-container, managed_by=tripleo_ansible, build-date=2025-07-21T14:45:33, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.33.12, vendor=Red Hat, Inc., 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-compute, release=1, io.openshift.expose-services=, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, architecture=x86_64, batch=17.1_20250721.1) Oct 14 05:00:48 localhost systemd[1]: 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638.service: Deactivated successfully. Oct 14 05:00:48 localhost podman[100100]: 2025-10-14 09:00:48.900156492 +0000 UTC m=+0.223093875 container exec_died def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, name=rhosp17/openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1, distribution-scope=public, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., batch=17.1_20250721.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, release=1, vcs-type=git, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, build-date=2025-07-21T14:45:33, io.openshift.expose-services=, version=17.1.9, container_name=ceilometer_agent_compute, tcib_managed=true) Oct 14 05:00:48 localhost systemd[1]: def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e.service: Deactivated successfully. 
Oct 14 05:00:48 localhost podman[100099]: 2025-10-14 09:00:48.901886666 +0000 UTC m=+0.230593758 container health_status d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 cron, build-date=2025-07-21T13:07:52, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, container_name=logrotate_crond, architecture=x86_64, batch=17.1_20250721.1, name=rhosp17/openstack-cron, version=17.1.9, config_id=tripleo_step4, tcib_managed=true, distribution-scope=public, release=1, vendor=Red Hat, Inc., 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, managed_by=tripleo_ansible) Oct 14 05:00:48 localhost podman[100099]: 2025-10-14 09:00:48.981374006 +0000 UTC m=+0.310081078 container exec_died d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 cron, release=1, version=17.1.9, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true, batch=17.1_20250721.1, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, vcs-type=git, name=rhosp17/openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, com.redhat.component=openstack-cron-container, config_id=tripleo_step4, container_name=logrotate_crond, summary=Red Hat OpenStack Platform 17.1 cron, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, build-date=2025-07-21T13:07:52) Oct 14 05:00:48 localhost systemd[1]: d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f.service: Deactivated successfully. Oct 14 05:00:49 localhost podman[100098]: 2025-10-14 09:00:49.213105278 +0000 UTC m=+0.537969431 container exec_died 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, managed_by=tripleo_ansible, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, release=1, io.openshift.tags=rhosp osp openstack osp-17.1, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, vcs-type=git, batch=17.1_20250721.1, build-date=2025-07-21T14:48:37) Oct 14 05:00:49 localhost systemd[1]: 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884.service: Deactivated successfully. Oct 14 05:00:49 localhost systemd[1]: Stopping User Manager for UID 1003... Oct 14 05:00:49 localhost systemd[35868]: Activating special unit Exit the Session... Oct 14 05:00:49 localhost systemd[35868]: Removed slice User Background Tasks Slice. Oct 14 05:00:49 localhost systemd[35868]: Stopped target Main User Target. Oct 14 05:00:49 localhost systemd[35868]: Stopped target Basic System. Oct 14 05:00:49 localhost systemd[35868]: Stopped target Paths. Oct 14 05:00:49 localhost systemd[35868]: Stopped target Sockets. Oct 14 05:00:49 localhost systemd[35868]: Stopped target Timers. Oct 14 05:00:49 localhost systemd[35868]: Stopped Mark boot as successful after the user session has run 2 minutes. 
Oct 14 05:00:49 localhost systemd[35868]: Stopped Daily Cleanup of User's Temporary Directories. Oct 14 05:00:49 localhost systemd[35868]: Closed D-Bus User Message Bus Socket. Oct 14 05:00:49 localhost systemd[35868]: Stopped Create User's Volatile Files and Directories. Oct 14 05:00:49 localhost systemd[35868]: Removed slice User Application Slice. Oct 14 05:00:49 localhost systemd[35868]: Reached target Shutdown. Oct 14 05:00:49 localhost systemd[35868]: Finished Exit the Session. Oct 14 05:00:49 localhost systemd[35868]: Reached target Exit the Session. Oct 14 05:00:49 localhost systemd[1]: user@1003.service: Deactivated successfully. Oct 14 05:00:49 localhost systemd[1]: Stopped User Manager for UID 1003. Oct 14 05:00:49 localhost systemd[1]: user@1003.service: Consumed 4.934s CPU time, read 0B from disk, written 7.0K to disk. Oct 14 05:00:49 localhost systemd[1]: Stopping User Runtime Directory /run/user/1003... Oct 14 05:00:49 localhost systemd[1]: user-runtime-dir@1003.service: Deactivated successfully. Oct 14 05:00:49 localhost systemd[1]: Stopped User Runtime Directory /run/user/1003. Oct 14 05:00:49 localhost systemd[1]: Removed slice User Slice of UID 1003. Oct 14 05:00:49 localhost systemd[1]: user-1003.slice: Consumed 7min 20.066s CPU time. Oct 14 05:00:49 localhost systemd[1]: run-user-1003.mount: Deactivated successfully. Oct 14 05:00:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e. Oct 14 05:00:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5. Oct 14 05:00:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2. Oct 14 05:00:51 localhost systemd[1]: tmp-crun.QKXRw5.mount: Deactivated successfully. 
Oct 14 05:00:51 localhost podman[100199]: 2025-10-14 09:00:51.762292058 +0000 UTC m=+0.091869014 container health_status ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, build-date=2025-07-21T14:48:37, io.openshift.expose-services=, batch=17.1_20250721.1, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=nova_compute, distribution-scope=public, release=1, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, architecture=x86_64, version=17.1.9, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a-4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step5, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc., vcs-type=git, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute) Oct 14 05:00:51 localhost podman[100197]: 2025-10-14 09:00:51.736526381 +0000 UTC m=+0.074715604 container health_status 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., io.openshift.expose-services=, build-date=2025-07-21T16:28:53, container_name=ovn_metadata_agent, config_id=tripleo_step4, io.buildah.version=1.33.12, batch=17.1_20250721.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, tcib_managed=true, distribution-scope=public, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, 
io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, release=1) Oct 14 05:00:51 localhost podman[100199]: 2025-10-14 09:00:51.842076368 +0000 UTC m=+0.171653304 container exec_died ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2 
(image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, managed_by=tripleo_ansible, batch=17.1_20250721.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a-4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, container_name=nova_compute, 
io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team, distribution-scope=public, release=1, version=17.1.9, config_id=tripleo_step5, tcib_managed=true, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, name=rhosp17/openstack-nova-compute, architecture=x86_64, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, build-date=2025-07-21T14:48:37) Oct 14 05:00:51 localhost systemd[1]: ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2.service: Deactivated successfully. Oct 14 05:00:51 localhost podman[100197]: 2025-10-14 09:00:51.872484938 +0000 UTC m=+0.210674191 container exec_died 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, container_name=ovn_metadata_agent, batch=17.1_20250721.1, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 
'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, distribution-scope=public, build-date=2025-07-21T16:28:53, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.9, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, config_id=tripleo_step4, release=1, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, architecture=x86_64, name=rhosp17/openstack-neutron-metadata-agent-ovn) Oct 14 05:00:51 localhost podman[100197]: unhealthy Oct 14 05:00:51 localhost systemd[1]: 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e.service: Main process exited, code=exited, status=1/FAILURE Oct 14 05:00:51 localhost systemd[1]: 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e.service: Failed with 
result 'exit-code'. Oct 14 05:00:51 localhost podman[100198]: 2025-10-14 09:00:51.8434417 +0000 UTC m=+0.177242947 container health_status a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, release=1, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, architecture=x86_64, maintainer=OpenStack TripleO Team, distribution-scope=public, name=rhosp17/openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/agreements, version=17.1.9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, com.redhat.component=openstack-ovn-controller-container, config_id=tripleo_step4, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, build-date=2025-07-21T13:28:44, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.buildah.version=1.33.12, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, container_name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1) Oct 14 05:00:51 localhost podman[100198]: 2025-10-14 
09:00:51.927279175 +0000 UTC m=+0.261080402 container exec_died a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, distribution-scope=public, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20250721.1, vcs-type=git, com.redhat.component=openstack-ovn-controller-container, io.openshift.tags=rhosp osp openstack osp-17.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, build-date=2025-07-21T13:28:44, config_id=tripleo_step4, tcib_managed=true, container_name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, release=1, com.redhat.license_terms=https://www.redhat.com/agreements, version=17.1.9, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., io.buildah.version=1.33.12, name=rhosp17/openstack-ovn-controller) Oct 14 05:00:51 localhost podman[100198]: unhealthy Oct 14 05:00:51 localhost systemd[1]: 
a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5.service: Main process exited, code=exited, status=1/FAILURE Oct 14 05:00:51 localhost systemd[1]: a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5.service: Failed with result 'exit-code'. Oct 14 05:00:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6. Oct 14 05:00:57 localhost podman[100259]: 2025-10-14 09:00:57.742867323 +0000 UTC m=+0.087299083 container health_status 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, batch=17.1_20250721.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, distribution-scope=public, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, build-date=2025-07-21T13:07:59, config_id=tripleo_step1, maintainer=OpenStack TripleO Team, io.buildah.version=1.33.12, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, com.redhat.component=openstack-qdrouterd-container, io.openshift.tags=rhosp osp openstack osp-17.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, container_name=metrics_qdr, managed_by=tripleo_ansible, vcs-type=git, version=17.1.9, architecture=x86_64, io.openshift.expose-services=, tcib_managed=true) Oct 14 05:00:57 localhost podman[100259]: 2025-10-14 09:00:57.973183221 +0000 UTC m=+0.317614911 container exec_died 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, release=1, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, batch=17.1_20250721.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, distribution-scope=public, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team, tcib_managed=true, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, build-date=2025-07-21T13:07:59, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vendor=Red Hat, Inc., io.buildah.version=1.33.12, version=17.1.9, config_id=tripleo_step1) Oct 14 05:00:57 localhost systemd[1]: 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6.service: Deactivated successfully. Oct 14 05:01:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8. Oct 14 05:01:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea. Oct 14 05:01:12 localhost systemd[1]: tmp-crun.VK4Kcl.mount: Deactivated successfully. 
Oct 14 05:01:12 localhost podman[100316]: 2025-10-14 09:01:12.760550423 +0000 UTC m=+0.096458437 container health_status 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.component=openstack-collectd-container, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, version=17.1.9, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, config_id=tripleo_step3, distribution-scope=public, maintainer=OpenStack TripleO Team, batch=17.1_20250721.1, name=rhosp17/openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, architecture=x86_64, vcs-type=git, build-date=2025-07-21T13:04:03, summary=Red Hat OpenStack Platform 17.1 collectd, release=2, io.openshift.expose-services=, vendor=Red Hat, Inc.) Oct 14 05:01:12 localhost podman[100316]: 2025-10-14 09:01:12.806108263 +0000 UTC m=+0.142016277 container exec_died 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, release=2, vendor=Red Hat, Inc., com.redhat.component=openstack-collectd-container, distribution-scope=public, tcib_managed=true, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, version=17.1.9, description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.expose-services=, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, container_name=collectd, managed_by=tripleo_ansible, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 collectd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, build-date=2025-07-21T13:04:03, config_id=tripleo_step3, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-collectd) Oct 14 05:01:12 localhost systemd[1]: 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8.service: Deactivated successfully. 
Oct 14 05:01:12 localhost podman[100317]: 2025-10-14 09:01:12.807063792 +0000 UTC m=+0.141462229 container health_status 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1, distribution-scope=public, io.buildah.version=1.33.12, build-date=2025-07-21T13:27:15, com.redhat.license_terms=https://www.redhat.com/agreements, version=17.1.9, io.openshift.expose-services=, name=rhosp17/openstack-iscsid, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, com.redhat.component=openstack-iscsid-container, release=1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, 
config_id=tripleo_step3, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, batch=17.1_20250721.1, container_name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid) Oct 14 05:01:12 localhost podman[100317]: 2025-10-14 09:01:12.88774773 +0000 UTC m=+0.222146147 container exec_died 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, distribution-scope=public, com.redhat.component=openstack-iscsid-container, description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, maintainer=OpenStack TripleO Team, 
io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, com.redhat.license_terms=https://www.redhat.com/agreements, batch=17.1_20250721.1, vcs-type=git, config_id=tripleo_step3, build-date=2025-07-21T13:27:15, version=17.1.9, io.openshift.expose-services=, container_name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, io.buildah.version=1.33.12, release=1) Oct 14 05:01:12 localhost systemd[1]: 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea.service: Deactivated successfully. Oct 14 05:01:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638. Oct 14 05:01:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884. Oct 14 05:01:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f. Oct 14 05:01:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e. 
Oct 14 05:01:19 localhost podman[100356]: 2025-10-14 09:01:19.74216235 +0000 UTC m=+0.086197139 container health_status 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, release=1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.9, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20250721.1, distribution-scope=public, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_ipmi, config_id=tripleo_step4, 
build-date=2025-07-21T15:29:47, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, vendor=Red Hat, Inc., name=rhosp17/openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/agreements, managed_by=tripleo_ansible, vcs-type=git) Oct 14 05:01:19 localhost podman[100356]: 2025-10-14 09:01:19.792946932 +0000 UTC m=+0.136981731 container exec_died 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, distribution-scope=public, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, vcs-type=git, version=17.1.9, tcib_managed=true, io.buildah.version=1.33.12, name=rhosp17/openstack-ceilometer-ipmi, build-date=2025-07-21T15:29:47, io.openshift.expose-services=, batch=17.1_20250721.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=ceilometer_agent_ipmi) Oct 14 05:01:19 localhost systemd[1]: 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638.service: Deactivated successfully. Oct 14 05:01:19 localhost podman[100364]: 2025-10-14 09:01:19.792591211 +0000 UTC m=+0.126166326 container health_status def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, name=rhosp17/openstack-ceilometer-compute, io.buildah.version=1.33.12, io.openshift.expose-services=, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, com.redhat.component=openstack-ceilometer-compute-container, tcib_managed=true, distribution-scope=public, vcs-type=git, batch=17.1_20250721.1, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1, version=17.1.9, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-07-21T14:45:33, architecture=x86_64, vendor=Red Hat, Inc., container_name=ceilometer_agent_compute, release=1) Oct 14 05:01:19 localhost podman[100357]: 2025-10-14 09:01:19.855117646 +0000 UTC m=+0.192368604 container health_status 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, release=1, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, 
vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, io.buildah.version=1.33.12, distribution-scope=public, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/agreements, version=17.1.9, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc., container_name=nova_migration_target, tcib_managed=true, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2025-07-21T14:48:37, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, batch=17.1_20250721.1) Oct 14 05:01:19 localhost podman[100358]: 2025-10-14 09:01:19.908099276 +0000 UTC m=+0.241934469 container health_status d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, tcib_managed=true, 
vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, vcs-type=git, distribution-scope=public, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, build-date=2025-07-21T13:07:52, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.component=openstack-cron-container, container_name=logrotate_crond, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, name=rhosp17/openstack-cron, release=1, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/agreements, version=17.1.9, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, managed_by=tripleo_ansible, batch=17.1_20250721.1) Oct 14 05:01:19 localhost podman[100358]: 
2025-10-14 09:01:19.916055032 +0000 UTC m=+0.249890285 container exec_died d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, container_name=logrotate_crond, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, release=1, name=rhosp17/openstack-cron, version=17.1.9, build-date=2025-07-21T13:07:52, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, distribution-scope=public, io.openshift.expose-services=, batch=17.1_20250721.1, 
com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container) Oct 14 05:01:19 localhost podman[100364]: 2025-10-14 09:01:19.927108995 +0000 UTC m=+0.260684090 container exec_died def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, build-date=2025-07-21T14:45:33, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp17/openstack-ceilometer-compute, config_id=tripleo_step4, version=17.1.9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.expose-services=) Oct 14 05:01:19 localhost systemd[1]: d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f.service: Deactivated successfully. Oct 14 05:01:19 localhost systemd[1]: def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e.service: Deactivated successfully. Oct 14 05:01:20 localhost podman[100357]: 2025-10-14 09:01:20.209789514 +0000 UTC m=+0.547040472 container exec_died 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, version=17.1.9, io.openshift.expose-services=, batch=17.1_20250721.1, build-date=2025-07-21T14:48:37, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, managed_by=tripleo_ansible, config_id=tripleo_step4, name=rhosp17/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, vcs-type=git, architecture=x86_64) Oct 14 05:01:20 localhost systemd[1]: 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884.service: Deactivated successfully. Oct 14 05:01:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e. Oct 14 05:01:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5. Oct 14 05:01:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2. 
Oct 14 05:01:22 localhost podman[100449]: 2025-10-14 09:01:22.748745767 +0000 UTC m=+0.077296713 container health_status ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, batch=17.1_20250721.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, container_name=nova_compute, release=1, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a-4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.33.12, name=rhosp17/openstack-nova-compute, architecture=x86_64, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.9, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, build-date=2025-07-21T14:48:37, config_id=tripleo_step5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, io.openshift.expose-services=) Oct 14 05:01:22 localhost podman[100449]: 2025-10-14 09:01:22.796172715 +0000 UTC m=+0.124723681 container exec_died ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, build-date=2025-07-21T14:48:37, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=nova_compute, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.9, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, release=1, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, batch=17.1_20250721.1, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, config_id=tripleo_step5, io.buildah.version=1.33.12, managed_by=tripleo_ansible, tcib_managed=true, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a-4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, 
distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute) Oct 14 05:01:22 localhost systemd[1]: tmp-crun.eVBMsm.mount: Deactivated successfully. Oct 14 05:01:22 localhost systemd[1]: ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2.service: Deactivated successfully. Oct 14 05:01:22 localhost podman[100448]: 2025-10-14 09:01:22.809419935 +0000 UTC m=+0.137877678 container health_status a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, tcib_managed=true, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, release=1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, io.openshift.expose-services=, vcs-type=git, version=17.1.9, description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20250721.1, com.redhat.component=openstack-ovn-controller-container, name=rhosp17/openstack-ovn-controller, build-date=2025-07-21T13:28:44, 
io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, distribution-scope=public, container_name=ovn_controller, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1) Oct 14 05:01:22 localhost podman[100448]: 2025-10-14 09:01:22.823456799 +0000 UTC m=+0.151914542 container exec_died a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, maintainer=OpenStack TripleO Team, batch=17.1_20250721.1, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, vcs-type=git, build-date=2025-07-21T13:28:44, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, release=1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.9, architecture=x86_64, com.redhat.component=openstack-ovn-controller-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.expose-services=, name=rhosp17/openstack-ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, vendor=Red Hat, Inc., 
com.redhat.license_terms=https://www.redhat.com/agreements, container_name=ovn_controller, io.buildah.version=1.33.12, managed_by=tripleo_ansible) Oct 14 05:01:22 localhost podman[100448]: unhealthy Oct 14 05:01:22 localhost systemd[1]: a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5.service: Main process exited, code=exited, status=1/FAILURE Oct 14 05:01:22 localhost systemd[1]: a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5.service: Failed with result 'exit-code'. Oct 14 05:01:22 localhost podman[100447]: 2025-10-14 09:01:22.901471854 +0000 UTC m=+0.233105646 container health_status 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, tcib_managed=true, architecture=x86_64, managed_by=tripleo_ansible, release=1, version=17.1.9, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, io.openshift.expose-services=, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 
'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, batch=17.1_20250721.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, vendor=Red Hat, Inc., build-date=2025-07-21T16:28:53) Oct 14 05:01:22 localhost podman[100447]: 2025-10-14 09:01:22.914262749 +0000 UTC m=+0.245896491 container exec_died 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.openshift.expose-services=, name=rhosp17/openstack-neutron-metadata-agent-ovn, architecture=x86_64, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, 
io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T16:28:53, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.33.12, batch=17.1_20250721.1, container_name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, config_id=tripleo_step4, managed_by=tripleo_ansible, release=1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, vcs-type=git, version=17.1.9, 
maintainer=OpenStack TripleO Team) Oct 14 05:01:22 localhost podman[100447]: unhealthy Oct 14 05:01:22 localhost systemd[1]: 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e.service: Main process exited, code=exited, status=1/FAILURE Oct 14 05:01:22 localhost systemd[1]: 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e.service: Failed with result 'exit-code'. Oct 14 05:01:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6. Oct 14 05:01:28 localhost podman[100515]: 2025-10-14 09:01:28.739886859 +0000 UTC m=+0.079749370 container health_status 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, build-date=2025-07-21T13:07:59, name=rhosp17/openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, tcib_managed=true, release=1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, container_name=metrics_qdr, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, io.openshift.expose-services=, batch=17.1_20250721.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, architecture=x86_64) Oct 14 05:01:28 localhost podman[100515]: 2025-10-14 09:01:28.964998706 +0000 UTC m=+0.304861137 container exec_died 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vendor=Red Hat, Inc., distribution-scope=public, name=rhosp17/openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.33.12, build-date=2025-07-21T13:07:59, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, version=17.1.9, managed_by=tripleo_ansible, tcib_managed=true, release=1, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, vcs-type=git, config_id=tripleo_step1, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64) Oct 14 05:01:28 localhost systemd[1]: 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6.service: Deactivated successfully. Oct 14 05:01:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8. Oct 14 05:01:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea. 
Oct 14 05:01:43 localhost podman[100543]: 2025-10-14 09:01:43.724183616 +0000 UTC m=+0.065735936 container health_status 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, architecture=x86_64, com.redhat.component=openstack-iscsid-container, container_name=iscsid, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, batch=17.1_20250721.1, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, io.buildah.version=1.33.12, config_id=tripleo_step3, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, distribution-scope=public, version=17.1.9, release=1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', 
'/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-07-21T13:27:15, name=rhosp17/openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc.) Oct 14 05:01:43 localhost systemd[1]: tmp-crun.vEebNP.mount: Deactivated successfully. Oct 14 05:01:43 localhost podman[100542]: 2025-10-14 09:01:43.790479247 +0000 UTC m=+0.132413859 container health_status 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, build-date=2025-07-21T13:04:03, container_name=collectd, com.redhat.component=openstack-collectd-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, architecture=x86_64, distribution-scope=public, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.9, name=rhosp17/openstack-collectd, batch=17.1_20250721.1, io.openshift.expose-services=, managed_by=tripleo_ansible, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, release=2, vcs-type=git, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 collectd) Oct 14 05:01:43 localhost podman[100542]: 2025-10-14 09:01:43.802147148 +0000 UTC m=+0.144081800 container exec_died 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1, config_id=tripleo_step3, version=17.1.9, maintainer=OpenStack TripleO Team, distribution-scope=public, io.openshift.expose-services=, managed_by=tripleo_ansible, io.buildah.version=1.33.12, build-date=2025-07-21T13:04:03, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, batch=17.1_20250721.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, vcs-type=git, tcib_managed=true, release=2, container_name=collectd, summary=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, com.redhat.component=openstack-collectd-container) Oct 14 05:01:43 localhost systemd[1]: 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8.service: Deactivated successfully. 
Oct 14 05:01:43 localhost podman[100543]: 2025-10-14 09:01:43.85709985 +0000 UTC m=+0.198652240 container exec_died 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, config_id=tripleo_step3, build-date=2025-07-21T13:27:15, name=rhosp17/openstack-iscsid, batch=17.1_20250721.1, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/agreements, release=1, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, io.buildah.version=1.33.12, summary=Red Hat 
OpenStack Platform 17.1 iscsid, vcs-type=git, architecture=x86_64, com.redhat.component=openstack-iscsid-container, vendor=Red Hat, Inc., vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, version=17.1.9) Oct 14 05:01:43 localhost systemd[1]: 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea.service: Deactivated successfully. Oct 14 05:01:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638. Oct 14 05:01:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884. Oct 14 05:01:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f. Oct 14 05:01:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e. Oct 14 05:01:50 localhost systemd[1]: tmp-crun.qR7erl.mount: Deactivated successfully. 
Oct 14 05:01:50 localhost podman[100662]: 2025-10-14 09:01:50.769884788 +0000 UTC m=+0.100923605 container health_status def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, container_name=ceilometer_agent_compute, config_id=tripleo_step4, managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, release=1, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, io.openshift.expose-services=, architecture=x86_64, vcs-type=git, description=Red Hat OpenStack Platform 17.1 
ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, version=17.1.9, build-date=2025-07-21T14:45:33, tcib_managed=true, com.redhat.component=openstack-ceilometer-compute-container, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, batch=17.1_20250721.1) Oct 14 05:01:50 localhost podman[100659]: 2025-10-14 09:01:50.810133213 +0000 UTC m=+0.145395301 container health_status 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, name=rhosp17/openstack-ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1, batch=17.1_20250721.1, build-date=2025-07-21T15:29:47, config_id=tripleo_step4, managed_by=tripleo_ansible, vcs-type=git, com.redhat.component=openstack-ceilometer-ipmi-container, distribution-scope=public, container_name=ceilometer_agent_ipmi, io.openshift.expose-services=, tcib_managed=true, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, version=17.1.9, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': 
True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., io.buildah.version=1.33.12) Oct 14 05:01:50 localhost podman[100660]: 2025-10-14 09:01:50.861820563 +0000 UTC m=+0.197297078 container health_status 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, name=rhosp17/openstack-nova-compute, vcs-type=git, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, build-date=2025-07-21T14:48:37, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, batch=17.1_20250721.1, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.33.12, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.9, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, tcib_managed=true) Oct 14 05:01:50 localhost podman[100659]: 2025-10-14 09:01:50.867166208 +0000 UTC m=+0.202428296 container exec_died 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, version=17.1.9, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, build-date=2025-07-21T15:29:47, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, batch=17.1_20250721.1, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': 
{'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, name=rhosp17/openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, io.buildah.version=1.33.12, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, architecture=x86_64, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, config_id=tripleo_step4) Oct 14 05:01:50 localhost podman[100662]: 2025-10-14 09:01:50.877415415 +0000 UTC m=+0.208454292 container exec_died def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.33.12, 
vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, com.redhat.component=openstack-ceilometer-compute-container, managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-compute, tcib_managed=true, build-date=2025-07-21T14:45:33, container_name=ceilometer_agent_compute, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1, version=17.1.9, batch=17.1_20250721.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, vendor=Red 
Hat, Inc.) Oct 14 05:01:50 localhost systemd[1]: 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638.service: Deactivated successfully. Oct 14 05:01:50 localhost systemd[1]: def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e.service: Deactivated successfully. Oct 14 05:01:50 localhost podman[100661]: 2025-10-14 09:01:50.96223849 +0000 UTC m=+0.297605222 container health_status d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, version=17.1.9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/agreements, tcib_managed=true, batch=17.1_20250721.1, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, architecture=x86_64, build-date=2025-07-21T13:07:52, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, distribution-scope=public, vcs-type=git, release=1, io.buildah.version=1.33.12, name=rhosp17/openstack-cron, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, com.redhat.component=openstack-cron-container, managed_by=tripleo_ansible, vendor=Red Hat, Inc.) Oct 14 05:01:50 localhost podman[100661]: 2025-10-14 09:01:50.970011781 +0000 UTC m=+0.305378503 container exec_died d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, container_name=logrotate_crond, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, name=rhosp17/openstack-cron, com.redhat.component=openstack-cron-container, managed_by=tripleo_ansible, build-date=2025-07-21T13:07:52, description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, release=1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, tcib_managed=true, batch=17.1_20250721.1, io.openshift.expose-services=, vcs-type=git, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1, config_id=tripleo_step4, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, version=17.1.9) Oct 14 05:01:50 localhost systemd[1]: d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f.service: Deactivated successfully. 
Oct 14 05:01:51 localhost podman[100660]: 2025-10-14 09:01:51.228031317 +0000 UTC m=+0.563507812 container exec_died 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, managed_by=tripleo_ansible, build-date=2025-07-21T14:48:37, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1, distribution-scope=public, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', 
'/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, release=1, batch=17.1_20250721.1, version=17.1.9, vendor=Red Hat, Inc., container_name=nova_migration_target, name=rhosp17/openstack-nova-compute, io.buildah.version=1.33.12) Oct 14 05:01:51 localhost systemd[1]: 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884.service: Deactivated successfully. Oct 14 05:01:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e. Oct 14 05:01:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5. Oct 14 05:01:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2. Oct 14 05:01:53 localhost systemd[1]: tmp-crun.3oX4NZ.mount: Deactivated successfully. 
Oct 14 05:01:53 localhost podman[100751]: 2025-10-14 09:01:53.753475182 +0000 UTC m=+0.085908001 container health_status ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, config_id=tripleo_step5, container_name=nova_compute, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/agreements, batch=17.1_20250721.1, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, build-date=2025-07-21T14:48:37, com.redhat.component=openstack-nova-compute-container, vcs-type=git, version=17.1.9, io.openshift.expose-services=, release=1, vendor=Red Hat, Inc., managed_by=tripleo_ansible, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a-4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1) Oct 14 05:01:53 localhost podman[100751]: 2025-10-14 09:01:53.788037621 +0000 UTC m=+0.120470500 container exec_died ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, batch=17.1_20250721.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a-4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 
'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=nova_compute, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, version=17.1.9, build-date=2025-07-21T14:48:37, io.openshift.expose-services=, config_id=tripleo_step5, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, vendor=Red Hat, Inc.) 
Oct 14 05:01:53 localhost podman[100749]: 2025-10-14 09:01:53.804995036 +0000 UTC m=+0.141820280 container health_status 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, managed_by=tripleo_ansible, batch=17.1_20250721.1, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, architecture=x86_64, distribution-scope=public, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, release=1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.9, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, vcs-type=git, build-date=2025-07-21T16:28:53, name=rhosp17/openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.expose-services=, container_name=ovn_metadata_agent) Oct 14 05:01:53 localhost systemd[1]: ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2.service: Deactivated successfully. Oct 14 05:01:53 localhost podman[100749]: 2025-10-14 09:01:53.842542439 +0000 UTC m=+0.179367683 container exec_died 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, config_id=tripleo_step4, distribution-scope=public, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, build-date=2025-07-21T16:28:53, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.openshift.expose-services=, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=ovn_metadata_agent, version=17.1.9, name=rhosp17/openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 
neutron-metadata-agent-ovn, tcib_managed=true, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git) Oct 14 05:01:53 localhost podman[100749]: unhealthy Oct 14 05:01:53 localhost systemd[1]: 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e.service: Main process exited, code=exited, status=1/FAILURE Oct 14 05:01:53 localhost systemd[1]: 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e.service: 
Failed with result 'exit-code'. Oct 14 05:01:53 localhost podman[100750]: 2025-10-14 09:01:53.897999874 +0000 UTC m=+0.235967564 container health_status a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.33.12, build-date=2025-07-21T13:28:44, com.redhat.component=openstack-ovn-controller-container, config_id=tripleo_step4, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-ovn-controller, managed_by=tripleo_ansible, container_name=ovn_controller, release=1, architecture=x86_64, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, batch=17.1_20250721.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, version=17.1.9, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vendor=Red Hat, Inc., io.openshift.expose-services=, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245) Oct 14 05:01:53 localhost podman[100750]: 
2025-10-14 09:01:53.915988752 +0000 UTC m=+0.253956472 container exec_died a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, managed_by=tripleo_ansible, release=1, build-date=2025-07-21T13:28:44, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, batch=17.1_20250721.1, com.redhat.component=openstack-ovn-controller-container, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, vendor=Red Hat, Inc., vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, architecture=x86_64, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, maintainer=OpenStack TripleO Team, version=17.1.9) Oct 14 05:01:53 localhost podman[100750]: unhealthy Oct 14 05:01:53 localhost systemd[1]: 
a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5.service: Main process exited, code=exited, status=1/FAILURE Oct 14 05:01:53 localhost systemd[1]: a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5.service: Failed with result 'exit-code'. Oct 14 05:01:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6. Oct 14 05:01:59 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Oct 14 05:01:59 localhost recover_tripleo_nova_virtqemud[100814]: 62551 Oct 14 05:01:59 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Oct 14 05:01:59 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. Oct 14 05:01:59 localhost systemd[1]: tmp-crun.8NFFPi.mount: Deactivated successfully. Oct 14 05:01:59 localhost podman[100812]: 2025-10-14 09:01:59.751924529 +0000 UTC m=+0.076660094 container health_status 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vcs-type=git, io.buildah.version=1.33.12, batch=17.1_20250721.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true, build-date=2025-07-21T13:07:59, architecture=x86_64, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, release=1, config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=metrics_qdr, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, io.openshift.expose-services=) Oct 14 05:01:59 localhost podman[100812]: 2025-10-14 09:01:59.983106914 +0000 UTC m=+0.307842539 container exec_died 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, container_name=metrics_qdr, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, vcs-type=git, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T13:07:59, architecture=x86_64, release=1, com.redhat.component=openstack-qdrouterd-container, summary=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step1, tcib_managed=true, io.openshift.expose-services=, batch=17.1_20250721.1, distribution-scope=public, vendor=Red Hat, Inc.) Oct 14 05:01:59 localhost systemd[1]: 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6.service: Deactivated successfully. Oct 14 05:02:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8. 
Oct 14 05:02:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea. Oct 14 05:02:14 localhost systemd[1]: tmp-crun.lJc1TS.mount: Deactivated successfully. Oct 14 05:02:14 localhost podman[100843]: 2025-10-14 09:02:14.752760545 +0000 UTC m=+0.097851939 container health_status 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, vcs-type=git, batch=17.1_20250721.1, release=2, build-date=2025-07-21T13:04:03, config_id=tripleo_step3, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, name=rhosp17/openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.33.12, distribution-scope=public, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, maintainer=OpenStack TripleO Team, version=17.1.9, managed_by=tripleo_ansible, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., io.openshift.expose-services=) Oct 14 05:02:14 localhost podman[100844]: 2025-10-14 09:02:14.788802431 +0000 UTC m=+0.131062768 container health_status 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, managed_by=tripleo_ansible, config_id=tripleo_step3, io.openshift.expose-services=, io.buildah.version=1.33.12, name=rhosp17/openstack-iscsid, build-date=2025-07-21T13:27:15, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/agreements, tcib_managed=true, batch=17.1_20250721.1, container_name=iscsid, com.redhat.component=openstack-iscsid-container, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, vcs-type=git, distribution-scope=public, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, version=17.1.9, description=Red Hat OpenStack Platform 17.1 iscsid, release=1) Oct 14 05:02:14 localhost podman[100843]: 2025-10-14 09:02:14.812153754 +0000 UTC m=+0.157245208 container exec_died 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, com.redhat.component=openstack-collectd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 collectd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, build-date=2025-07-21T13:04:03, architecture=x86_64, release=2, tcib_managed=true, container_name=collectd, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, version=17.1.9, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, 
vcs-type=git, batch=17.1_20250721.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.33.12, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step3, distribution-scope=public) Oct 14 05:02:14 localhost systemd[1]: 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8.service: Deactivated successfully. 
Oct 14 05:02:14 localhost podman[100844]: 2025-10-14 09:02:14.82496045 +0000 UTC m=+0.167220807 container exec_died 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, batch=17.1_20250721.1, container_name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, release=1, build-date=2025-07-21T13:27:15, com.redhat.component=openstack-iscsid-container, distribution-scope=public, managed_by=tripleo_ansible, name=rhosp17/openstack-iscsid, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, version=17.1.9, io.openshift.expose-services=, vendor=Red Hat, Inc., io.buildah.version=1.33.12, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, vcs-type=git) Oct 14 05:02:14 localhost systemd[1]: 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea.service: Deactivated successfully. Oct 14 05:02:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638. Oct 14 05:02:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884. Oct 14 05:02:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f. Oct 14 05:02:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e. Oct 14 05:02:21 localhost systemd[1]: tmp-crun.RE0lmx.mount: Deactivated successfully. 
Oct 14 05:02:21 localhost podman[100883]: 2025-10-14 09:02:21.752405599 +0000 UTC m=+0.090258064 container health_status 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, version=17.1.9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, architecture=x86_64, name=rhosp17/openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, 
maintainer=OpenStack TripleO Team, release=1, batch=17.1_20250721.1, container_name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, tcib_managed=true, build-date=2025-07-21T15:29:47, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, config_id=tripleo_step4, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi) Oct 14 05:02:21 localhost podman[100883]: 2025-10-14 09:02:21.800970683 +0000 UTC m=+0.138823138 container exec_died 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, build-date=2025-07-21T15:29:47, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, architecture=x86_64, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/agreements, release=1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20250721.1, io.buildah.version=1.33.12, version=17.1.9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-ipmi-container, config_id=tripleo_step4, distribution-scope=public, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_ipmi) Oct 14 05:02:21 localhost podman[100891]: 2025-10-14 09:02:21.80025915 +0000 UTC m=+0.126875477 container health_status def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.33.12, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, version=17.1.9, container_name=ceilometer_agent_compute, release=1, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, distribution-scope=public, tcib_managed=true, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1, config_id=tripleo_step4, managed_by=tripleo_ansible, batch=17.1_20250721.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, build-date=2025-07-21T14:45:33, vcs-type=git) Oct 14 05:02:21 localhost systemd[1]: 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638.service: Deactivated successfully. 
Oct 14 05:02:21 localhost podman[100884]: 2025-10-14 09:02:21.812385597 +0000 UTC m=+0.142911965 container health_status 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, distribution-scope=public, io.buildah.version=1.33.12, release=1, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, architecture=x86_64, tcib_managed=true, version=17.1.9, batch=17.1_20250721.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, build-date=2025-07-21T14:48:37, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, 
com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step4, name=rhosp17/openstack-nova-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, managed_by=tripleo_ansible, container_name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1) Oct 14 05:02:21 localhost podman[100885]: 2025-10-14 09:02:21.849231457 +0000 UTC m=+0.177224066 container health_status d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.33.12, release=1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, build-date=2025-07-21T13:07:52, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team, version=17.1.9, 
io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, batch=17.1_20250721.1, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, architecture=x86_64, managed_by=tripleo_ansible, vcs-type=git, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-cron, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, vendor=Red Hat, Inc., config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, distribution-scope=public) Oct 14 05:02:21 localhost podman[100891]: 2025-10-14 09:02:21.882998582 +0000 UTC m=+0.209614919 container exec_died def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, architecture=x86_64, batch=17.1_20250721.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-compute-container, container_name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, managed_by=tripleo_ansible, build-date=2025-07-21T14:45:33, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, version=17.1.9, release=1, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, tcib_managed=true, vendor=Red Hat, Inc., name=rhosp17/openstack-ceilometer-compute, config_id=tripleo_step4) Oct 14 05:02:21 localhost systemd[1]: def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e.service: Deactivated successfully. 
Oct 14 05:02:21 localhost podman[100885]: 2025-10-14 09:02:21.934161665 +0000 UTC m=+0.262154274 container exec_died d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, distribution-scope=public, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.component=openstack-cron-container, build-date=2025-07-21T13:07:52, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, release=1, batch=17.1_20250721.1, container_name=logrotate_crond, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, vendor=Red Hat, Inc., io.openshift.expose-services=, name=rhosp17/openstack-cron, summary=Red Hat OpenStack Platform 
17.1 cron, tcib_managed=true, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, version=17.1.9, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, io.buildah.version=1.33.12, config_id=tripleo_step4) Oct 14 05:02:21 localhost systemd[1]: d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f.service: Deactivated successfully. Oct 14 05:02:22 localhost podman[100884]: 2025-10-14 09:02:22.117238572 +0000 UTC m=+0.447764960 container exec_died 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, batch=17.1_20250721.1, managed_by=tripleo_ansible, architecture=x86_64, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 
nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step4, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, tcib_managed=true, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, version=17.1.9, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., build-date=2025-07-21T14:48:37, distribution-scope=public, io.buildah.version=1.33.12, release=1) Oct 14 05:02:22 localhost systemd[1]: 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884.service: Deactivated successfully. Oct 14 05:02:22 localhost systemd[1]: tmp-crun.vfY0lh.mount: Deactivated successfully. Oct 14 05:02:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e. Oct 14 05:02:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5. Oct 14 05:02:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2. 
Oct 14 05:02:24 localhost podman[100979]: 2025-10-14 09:02:24.749486432 +0000 UTC m=+0.089998017 container health_status 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, vcs-type=git, architecture=x86_64, build-date=2025-07-21T16:28:53, vendor=Red Hat, Inc., distribution-scope=public, container_name=ovn_metadata_agent, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true, name=rhosp17/openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.9, release=1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', 
'/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, maintainer=OpenStack TripleO Team, batch=17.1_20250721.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, config_id=tripleo_step4, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, io.buildah.version=1.33.12) Oct 14 05:02:24 localhost podman[100979]: 2025-10-14 09:02:24.791393639 +0000 UTC m=+0.131905294 container exec_died 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, architecture=x86_64, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, container_name=ovn_metadata_agent, build-date=2025-07-21T16:28:53, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, managed_by=tripleo_ansible, release=1, version=17.1.9, io.openshift.expose-services=, vcs-type=git, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, tcib_managed=true, batch=17.1_20250721.1, vendor=Red Hat, Inc.) Oct 14 05:02:24 localhost podman[100979]: unhealthy Oct 14 05:02:24 localhost systemd[1]: tmp-crun.sHXzyr.mount: Deactivated successfully. 
Oct 14 05:02:24 localhost systemd[1]: 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e.service: Main process exited, code=exited, status=1/FAILURE Oct 14 05:02:24 localhost systemd[1]: 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e.service: Failed with result 'exit-code'. Oct 14 05:02:24 localhost podman[100981]: 2025-10-14 09:02:24.816024631 +0000 UTC m=+0.149455396 container health_status ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, container_name=nova_compute, io.openshift.expose-services=, architecture=x86_64, name=rhosp17/openstack-nova-compute, tcib_managed=true, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, build-date=2025-07-21T14:48:37, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, batch=17.1_20250721.1, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a-4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.9, config_id=tripleo_step5, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, release=1) Oct 14 05:02:24 localhost podman[100980]: 2025-10-14 09:02:24.855292537 +0000 UTC m=+0.192224341 container health_status a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, batch=17.1_20250721.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 
'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, maintainer=OpenStack TripleO Team, build-date=2025-07-21T13:28:44, io.buildah.version=1.33.12, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, vendor=Red Hat, Inc., tcib_managed=true, release=1, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, com.redhat.component=openstack-ovn-controller-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, version=17.1.9, io.openshift.tags=rhosp osp openstack osp-17.1, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1) Oct 14 05:02:24 localhost podman[100981]: 2025-10-14 09:02:24.896265845 +0000 UTC m=+0.229696610 container exec_died ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, batch=17.1_20250721.1, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, config_id=tripleo_step5, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.33.12, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 
'direct', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a-4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.9, build-date=2025-07-21T14:48:37, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=nova_compute, tcib_managed=true, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute) Oct 14 05:02:24 localhost systemd[1]: ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2.service: Deactivated successfully. Oct 14 05:02:24 localhost podman[100980]: 2025-10-14 09:02:24.920780193 +0000 UTC m=+0.257712037 container exec_died a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, build-date=2025-07-21T13:28:44, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., batch=17.1_20250721.1, vcs-type=git, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, maintainer=OpenStack TripleO Team, version=17.1.9, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, com.redhat.component=openstack-ovn-controller-container, container_name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step4, distribution-scope=public, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 
'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, tcib_managed=true, managed_by=tripleo_ansible) Oct 14 05:02:24 localhost podman[100980]: unhealthy Oct 14 05:02:24 localhost systemd[1]: a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5.service: Main process exited, code=exited, status=1/FAILURE Oct 14 05:02:24 localhost systemd[1]: a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5.service: Failed with result 'exit-code'. Oct 14 05:02:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6. Oct 14 05:02:30 localhost podman[101044]: 2025-10-14 09:02:30.75473507 +0000 UTC m=+0.096585751 container health_status 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, batch=17.1_20250721.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, container_name=metrics_qdr, release=1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.9, name=rhosp17/openstack-qdrouterd, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T13:07:59, tcib_managed=true, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, maintainer=OpenStack TripleO Team, config_id=tripleo_step1, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, io.openshift.expose-services=) Oct 14 05:02:30 localhost podman[101044]: 2025-10-14 09:02:30.98835508 +0000 UTC m=+0.330205811 container exec_died 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, architecture=x86_64, tcib_managed=true, name=rhosp17/openstack-qdrouterd, vcs-type=git, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 
'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, release=1, com.redhat.component=openstack-qdrouterd-container, batch=17.1_20250721.1, io.buildah.version=1.33.12, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-07-21T13:07:59, container_name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, version=17.1.9, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=) Oct 14 05:02:31 localhost systemd[1]: 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6.service: Deactivated successfully. 
Oct 14 05:02:41 localhost ceph-osd[31500]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Oct 14 05:02:41 localhost ceph-osd[31500]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 4200.1 total, 600.0 interval#012Cumulative writes: 4956 writes, 22K keys, 4956 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s#012Cumulative WAL: 4956 writes, 647 syncs, 7.66 writes per sync, written: 0.02 GB, 0.00 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 4 writes, 8 keys, 4 commit groups, 1.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s#012Interval WAL: 4 writes, 2 syncs, 2.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent Oct 14 05:02:45 localhost ceph-osd[32440]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Oct 14 05:02:45 localhost ceph-osd[32440]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 4200.1 total, 600.0 interval#012Cumulative writes: 5547 writes, 24K keys, 5547 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s#012Cumulative WAL: 5547 writes, 761 syncs, 7.29 writes per sync, written: 0.02 GB, 0.00 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 4 writes, 8 keys, 4 commit groups, 1.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s#012Interval WAL: 4 writes, 2 syncs, 2.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent Oct 14 05:02:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8. Oct 14 05:02:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea. 
Oct 14 05:02:45 localhost podman[101074]: 2025-10-14 09:02:45.761191582 +0000 UTC m=+0.099595434 container health_status 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, tcib_managed=true, version=17.1.9, name=rhosp17/openstack-iscsid, container_name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, batch=17.1_20250721.1, vcs-type=git, config_id=tripleo_step3, io.buildah.version=1.33.12, com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', 
'/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, build-date=2025-07-21T13:27:15, vendor=Red Hat, Inc., io.openshift.expose-services=, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 iscsid) Oct 14 05:02:45 localhost podman[101074]: 2025-10-14 09:02:45.796748132 +0000 UTC m=+0.135152024 container exec_died 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, release=1, io.openshift.expose-services=, managed_by=tripleo_ansible, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, container_name=iscsid, vendor=Red Hat, Inc., io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, batch=17.1_20250721.1, com.redhat.component=openstack-iscsid-container, name=rhosp17/openstack-iscsid, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, build-date=2025-07-21T13:27:15, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, description=Red Hat OpenStack Platform 17.1 iscsid) Oct 14 05:02:45 localhost systemd[1]: 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea.service: Deactivated successfully. Oct 14 05:02:45 localhost podman[101073]: 2025-10-14 09:02:45.798454615 +0000 UTC m=+0.137872618 container health_status 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1, release=2, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, batch=17.1_20250721.1, build-date=2025-07-21T13:04:03, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, distribution-scope=public, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., architecture=x86_64, config_id=tripleo_step3, name=rhosp17/openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, config_data={'cap_add': ['IPC_LOCK'], 'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.buildah.version=1.33.12, version=17.1.9, com.redhat.component=openstack-collectd-container, vcs-type=git) Oct 14 05:02:45 localhost podman[101073]: 2025-10-14 09:02:45.879743151 +0000 UTC m=+0.219161164 container exec_died 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, name=rhosp17/openstack-collectd, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, release=2, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, managed_by=tripleo_ansible, architecture=x86_64, batch=17.1_20250721.1, config_data={'cap_add': ['IPC_LOCK'], 
'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1, io.buildah.version=1.33.12, io.openshift.expose-services=, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, com.redhat.component=openstack-collectd-container, maintainer=OpenStack TripleO Team, build-date=2025-07-21T13:04:03, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3) Oct 14 05:02:45 localhost systemd[1]: 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8.service: 
Deactivated successfully. Oct 14 05:02:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638. Oct 14 05:02:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884. Oct 14 05:02:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f. Oct 14 05:02:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e. Oct 14 05:02:52 localhost systemd[1]: tmp-crun.kYMVun.mount: Deactivated successfully. Oct 14 05:02:52 localhost podman[101190]: 2025-10-14 09:02:52.736715171 +0000 UTC m=+0.076593542 container health_status 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.openshift.expose-services=, distribution-scope=public, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, vcs-type=git, architecture=x86_64, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.9, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, name=rhosp17/openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., build-date=2025-07-21T15:29:47, container_name=ceilometer_agent_ipmi) Oct 14 05:02:52 localhost systemd[1]: tmp-crun.NuwF6k.mount: Deactivated successfully. 
Oct 14 05:02:52 localhost podman[101193]: 2025-10-14 09:02:52.782117736 +0000 UTC m=+0.110455180 container health_status def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-07-21T14:45:33, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, io.openshift.expose-services=, tcib_managed=true, vcs-type=git, distribution-scope=public, release=1, batch=17.1_20250721.1, container_name=ceilometer_agent_compute, io.openshift.tags=rhosp osp openstack osp-17.1, version=17.1.9) Oct 14 05:02:52 localhost podman[101190]: 2025-10-14 09:02:52.789182114 +0000 UTC m=+0.129060505 container exec_died 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., build-date=2025-07-21T15:29:47, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-ipmi, version=17.1.9, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, vcs-type=git, io.openshift.expose-services=, distribution-scope=public, config_id=tripleo_step4, io.buildah.version=1.33.12, release=1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20250721.1, tcib_managed=true, architecture=x86_64, com.redhat.component=openstack-ceilometer-ipmi-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/agreements) Oct 14 05:02:52 localhost systemd[1]: 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638.service: Deactivated successfully. 
Oct 14 05:02:52 localhost podman[101192]: 2025-10-14 09:02:52.752735926 +0000 UTC m=+0.084156295 container health_status d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, build-date=2025-07-21T13:07:52, version=17.1.9, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, distribution-scope=public, com.redhat.component=openstack-cron-container, io.buildah.version=1.33.12, io.openshift.expose-services=, managed_by=tripleo_ansible, release=1, summary=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, container_name=logrotate_crond, config_id=tripleo_step4, vendor=Red Hat, Inc., batch=17.1_20250721.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron) Oct 14 05:02:52 localhost podman[101191]: 2025-10-14 09:02:52.804548991 +0000 UTC m=+0.138448287 container health_status 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, tcib_managed=true, config_id=tripleo_step4, io.buildah.version=1.33.12, io.openshift.expose-services=, architecture=x86_64, distribution-scope=public, version=17.1.9, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, com.redhat.component=openstack-nova-compute-container, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, batch=17.1_20250721.1, managed_by=tripleo_ansible, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2025-07-21T14:48:37) Oct 14 05:02:52 localhost podman[101193]: 2025-10-14 09:02:52.865057504 +0000 UTC m=+0.193394948 container exec_died def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, maintainer=OpenStack TripleO Team, release=1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, version=17.1.9, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, container_name=ceilometer_agent_compute, io.buildah.version=1.33.12, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': 
False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, build-date=2025-07-21T14:45:33, name=rhosp17/openstack-ceilometer-compute, batch=17.1_20250721.1, vcs-type=git, tcib_managed=true, architecture=x86_64, managed_by=tripleo_ansible) Oct 14 05:02:52 localhost systemd[1]: def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e.service: Deactivated successfully. 
Oct 14 05:02:52 localhost podman[101192]: 2025-10-14 09:02:52.887212159 +0000 UTC m=+0.218632528 container exec_died d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, summary=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, batch=17.1_20250721.1, com.redhat.component=openstack-cron-container, vendor=Red Hat, Inc., version=17.1.9, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, config_id=tripleo_step4, build-date=2025-07-21T13:07:52, io.openshift.expose-services=, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, 
distribution-scope=public, name=rhosp17/openstack-cron, container_name=logrotate_crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, release=1) Oct 14 05:02:52 localhost systemd[1]: d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f.service: Deactivated successfully. Oct 14 05:02:53 localhost podman[101191]: 2025-10-14 09:02:53.134223685 +0000 UTC m=+0.468123011 container exec_died 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', 
'/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, build-date=2025-07-21T14:48:37, release=1, container_name=nova_migration_target, com.redhat.component=openstack-nova-compute-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, config_id=tripleo_step4, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, distribution-scope=public, managed_by=tripleo_ansible, batch=17.1_20250721.1, version=17.1.9, architecture=x86_64) Oct 14 05:02:53 localhost systemd[1]: 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884.service: Deactivated successfully. Oct 14 05:02:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e. Oct 14 05:02:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5. Oct 14 05:02:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2. Oct 14 05:02:55 localhost systemd[1]: tmp-crun.Qr0TwB.mount: Deactivated successfully. 
Oct 14 05:02:55 localhost podman[101281]: 2025-10-14 09:02:55.741932035 +0000 UTC m=+0.083362840 container health_status 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1, version=17.1.9, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.33.12, architecture=x86_64, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, build-date=2025-07-21T16:28:53, managed_by=tripleo_ansible, batch=17.1_20250721.1, io.openshift.expose-services=, container_name=ovn_metadata_agent, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, distribution-scope=public, name=rhosp17/openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, release=1, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Oct 14 05:02:55 localhost podman[101281]: 2025-10-14 09:02:55.755431123 +0000 UTC m=+0.096861978 container exec_died 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.openshift.expose-services=, release=1, version=17.1.9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, distribution-scope=public, batch=17.1_20250721.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, name=rhosp17/openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, container_name=ovn_metadata_agent, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, build-date=2025-07-21T16:28:53, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Oct 14 05:02:55 localhost podman[101281]: unhealthy Oct 14 05:02:55 localhost systemd[1]: 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e.service: Main process exited, code=exited, status=1/FAILURE Oct 14 05:02:55 localhost systemd[1]: 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e.service: Failed with result 'exit-code'. 
Oct 14 05:02:55 localhost podman[101286]: 2025-10-14 09:02:55.802232122 +0000 UTC m=+0.129776097 container health_status ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, build-date=2025-07-21T14:48:37, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1, container_name=nova_compute, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, release=1, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, managed_by=tripleo_ansible, version=17.1.9, architecture=x86_64, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, io.buildah.version=1.33.12, vcs-type=git, io.openshift.expose-services=, batch=17.1_20250721.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a-4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public) Oct 14 05:02:55 localhost podman[101286]: 2025-10-14 09:02:55.828567167 +0000 UTC m=+0.156111142 container exec_died ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, vcs-type=git, batch=17.1_20250721.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a-4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.33.12, build-date=2025-07-21T14:48:37, container_name=nova_compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, config_id=tripleo_step5, version=17.1.9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, vendor=Red Hat, Inc.) 
Oct 14 05:02:55 localhost systemd[1]: ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2.service: Deactivated successfully. Oct 14 05:02:55 localhost podman[101282]: 2025-10-14 09:02:55.847257985 +0000 UTC m=+0.181707845 container health_status a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, build-date=2025-07-21T13:28:44, release=1, vcs-type=git, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, version=17.1.9, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, com.redhat.component=openstack-ovn-controller-container, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., config_id=tripleo_step4, container_name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, io.openshift.expose-services=, tcib_managed=true, batch=17.1_20250721.1, 
distribution-scope=public, maintainer=OpenStack TripleO Team) Oct 14 05:02:55 localhost podman[101282]: 2025-10-14 09:02:55.890327268 +0000 UTC m=+0.224777078 container exec_died a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, name=rhosp17/openstack-ovn-controller, vendor=Red Hat, Inc., config_id=tripleo_step4, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, io.openshift.expose-services=, tcib_managed=true, com.redhat.component=openstack-ovn-controller-container, version=17.1.9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ovn-controller, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, build-date=2025-07-21T13:28:44, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.33.12, batch=17.1_20250721.1, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, vcs-type=git) Oct 14 05:02:55 localhost podman[101282]: 
unhealthy Oct 14 05:02:55 localhost systemd[1]: a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5.service: Main process exited, code=exited, status=1/FAILURE Oct 14 05:02:55 localhost systemd[1]: a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5.service: Failed with result 'exit-code'. Oct 14 05:03:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6. Oct 14 05:03:01 localhost podman[101350]: 2025-10-14 09:03:01.749727512 +0000 UTC m=+0.082759113 container health_status 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, release=1, version=17.1.9, container_name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.33.12, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.expose-services=, managed_by=tripleo_ansible, vcs-type=git, name=rhosp17/openstack-qdrouterd, build-date=2025-07-21T13:07:59, distribution-scope=public, maintainer=OpenStack TripleO Team, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, architecture=x86_64) Oct 14 05:03:01 localhost podman[101350]: 2025-10-14 09:03:01.987068688 +0000 UTC m=+0.320100249 container exec_died 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, release=1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, tcib_managed=true, distribution-scope=public, io.buildah.version=1.33.12, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-07-21T13:07:59, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, batch=17.1_20250721.1, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, container_name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, maintainer=OpenStack TripleO Team, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, managed_by=tripleo_ansible, version=17.1.9) Oct 14 05:03:01 localhost systemd[1]: 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6.service: Deactivated successfully. Oct 14 05:03:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8. Oct 14 05:03:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea. Oct 14 05:03:16 localhost systemd[1]: tmp-crun.cNaGUn.mount: Deactivated successfully. 
Oct 14 05:03:16 localhost podman[101378]: 2025-10-14 09:03:16.747548518 +0000 UTC m=+0.085295562 container health_status 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, tcib_managed=true, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, io.buildah.version=1.33.12, com.redhat.component=openstack-collectd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, 
io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, release=2, batch=17.1_20250721.1, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, managed_by=tripleo_ansible, version=17.1.9, vendor=Red Hat, Inc., build-date=2025-07-21T13:04:03, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=) Oct 14 05:03:16 localhost podman[101378]: 2025-10-14 09:03:16.762965244 +0000 UTC m=+0.100712368 container exec_died 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T13:04:03, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, distribution-scope=public, vendor=Red Hat, Inc., config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, maintainer=OpenStack TripleO Team, batch=17.1_20250721.1, name=rhosp17/openstack-collectd, com.redhat.component=openstack-collectd-container, managed_by=tripleo_ansible, version=17.1.9, io.buildah.version=1.33.12, release=2, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, io.openshift.expose-services=, tcib_managed=true, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd) Oct 14 05:03:16 localhost podman[101379]: 2025-10-14 09:03:16.801697763 +0000 UTC m=+0.138403814 container health_status 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.buildah.version=1.33.12, batch=17.1_20250721.1, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.expose-services=, distribution-scope=public, maintainer=OpenStack TripleO Team, vcs-type=git, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-iscsid, container_name=iscsid, release=1, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, version=17.1.9, tcib_managed=true, config_id=tripleo_step3, build-date=2025-07-21T13:27:15, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid) Oct 14 05:03:16 localhost podman[101379]: 2025-10-14 09:03:16.812304402 +0000 UTC m=+0.149010483 container exec_died 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, tcib_managed=true, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.component=openstack-iscsid-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 
'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, architecture=x86_64, io.openshift.expose-services=, io.buildah.version=1.33.12, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, version=17.1.9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20250721.1, container_name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, managed_by=tripleo_ansible, release=1, name=rhosp17/openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, build-date=2025-07-21T13:27:15, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid) Oct 14 05:03:16 localhost systemd[1]: 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea.service: Deactivated successfully. Oct 14 05:03:16 localhost systemd[1]: 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8.service: Deactivated successfully. 
Oct 14 05:03:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638. Oct 14 05:03:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884. Oct 14 05:03:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f. Oct 14 05:03:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e. Oct 14 05:03:23 localhost systemd[1]: tmp-crun.vsF79i.mount: Deactivated successfully. Oct 14 05:03:23 localhost podman[101419]: 2025-10-14 09:03:23.74293499 +0000 UTC m=+0.075534029 container health_status 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, container_name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, release=1, batch=17.1_20250721.1, distribution-scope=public, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, architecture=x86_64, build-date=2025-07-21T14:48:37, com.redhat.component=openstack-nova-compute-container, name=rhosp17/openstack-nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, version=17.1.9) Oct 14 05:03:23 localhost podman[101420]: 2025-10-14 09:03:23.798257172 +0000 UTC m=+0.127180627 container health_status d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.openshift.expose-services=, config_id=tripleo_step4, container_name=logrotate_crond, release=1, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': 
True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, tcib_managed=true, batch=17.1_20250721.1, io.buildah.version=1.33.12, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, build-date=2025-07-21T13:07:52, com.redhat.component=openstack-cron-container, vendor=Red Hat, Inc., name=rhosp17/openstack-cron, distribution-scope=public, managed_by=tripleo_ansible, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1) Oct 14 05:03:23 localhost podman[101418]: 2025-10-14 09:03:23.858516977 +0000 UTC m=+0.193002665 container health_status 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, name=rhosp17/openstack-ceilometer-ipmi, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, release=1, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., container_name=ceilometer_agent_ipmi, maintainer=OpenStack TripleO Team, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, version=17.1.9, config_id=tripleo_step4, build-date=2025-07-21T15:29:47, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true, io.buildah.version=1.33.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, batch=17.1_20250721.1, distribution-scope=public, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi) Oct 14 05:03:23 localhost podman[101421]: 2025-10-14 09:03:23.777877081 +0000 UTC m=+0.102123031 container health_status def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 
ceilometer-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, release=1, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, distribution-scope=public, io.openshift.expose-services=, managed_by=tripleo_ansible, vendor=Red Hat, Inc., batch=17.1_20250721.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2025-07-21T14:45:33, architecture=x86_64, version=17.1.9, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, name=rhosp17/openstack-ceilometer-compute, config_id=tripleo_step4, 
com.redhat.component=openstack-ceilometer-compute-container, container_name=ceilometer_agent_compute) Oct 14 05:03:23 localhost podman[101420]: 2025-10-14 09:03:23.881810368 +0000 UTC m=+0.210733853 container exec_died d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, batch=17.1_20250721.1, io.openshift.expose-services=, tcib_managed=true, build-date=2025-07-21T13:07:52, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, name=rhosp17/openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., container_name=logrotate_crond, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, 
version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, vcs-type=git, release=1, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, config_id=tripleo_step4, managed_by=tripleo_ansible) Oct 14 05:03:23 localhost systemd[1]: d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f.service: Deactivated successfully. Oct 14 05:03:23 localhost podman[101421]: 2025-10-14 09:03:23.911137686 +0000 UTC m=+0.235383666 container exec_died def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, managed_by=tripleo_ansible, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, container_name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, distribution-scope=public, config_id=tripleo_step4, architecture=x86_64, vcs-type=git, version=17.1.9, name=rhosp17/openstack-ceilometer-compute, release=1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20250721.1, tcib_managed=true, build-date=2025-07-21T14:45:33, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute) Oct 14 05:03:23 localhost systemd[1]: def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e.service: Deactivated successfully. 
Oct 14 05:03:23 localhost podman[101418]: 2025-10-14 09:03:23.931135445 +0000 UTC m=+0.265621133 container exec_died 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, container_name=ceilometer_agent_ipmi, config_id=tripleo_step4, version=17.1.9, com.redhat.component=openstack-ceilometer-ipmi-container, name=rhosp17/openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, release=1, com.redhat.license_terms=https://www.redhat.com/agreements, managed_by=tripleo_ansible, batch=17.1_20250721.1, build-date=2025-07-21T15:29:47, vendor=Red Hat, Inc., distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, io.buildah.version=1.33.12, tcib_managed=true, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f) Oct 14 05:03:23 localhost systemd[1]: 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638.service: Deactivated successfully. Oct 14 05:03:24 localhost podman[101419]: 2025-10-14 09:03:24.120136514 +0000 UTC m=+0.452735573 container exec_died 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, vcs-type=git, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-07-21T14:48:37, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, container_name=nova_migration_target, batch=17.1_20250721.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, version=17.1.9, config_id=tripleo_step4, architecture=x86_64, vendor=Red Hat, Inc., release=1, name=rhosp17/openstack-nova-compute, io.buildah.version=1.33.12) Oct 14 05:03:24 localhost systemd[1]: 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884.service: Deactivated successfully. Oct 14 05:03:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e. Oct 14 05:03:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5. Oct 14 05:03:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2. Oct 14 05:03:26 localhost systemd[1]: tmp-crun.s1O9B2.mount: Deactivated successfully. 
Oct 14 05:03:26 localhost podman[101516]: 2025-10-14 09:03:26.753290493 +0000 UTC m=+0.089418058 container health_status a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, name=rhosp17/openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, build-date=2025-07-21T13:28:44, config_id=tripleo_step4, vendor=Red Hat, Inc., architecture=x86_64, batch=17.1_20250721.1, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, release=1, vcs-type=git, maintainer=OpenStack TripleO Team, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, com.redhat.component=openstack-ovn-controller-container, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, tcib_managed=true, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1) Oct 14 05:03:26 localhost podman[101515]: 2025-10-14 09:03:26.788927537 
+0000 UTC m=+0.127145556 container health_status 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, com.redhat.license_terms=https://www.redhat.com/agreements, batch=17.1_20250721.1, io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T16:28:53, vendor=Red Hat, Inc., name=rhosp17/openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.buildah.version=1.33.12, container_name=ovn_metadata_agent, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, managed_by=tripleo_ansible, release=1, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.9, architecture=x86_64, tcib_managed=true) Oct 14 05:03:26 localhost podman[101516]: 2025-10-14 09:03:26.796242754 +0000 UTC m=+0.132370279 container exec_died a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, build-date=2025-07-21T13:28:44, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, io.openshift.expose-services=, tcib_managed=true, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, com.redhat.component=openstack-ovn-controller-container, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, architecture=x86_64, maintainer=OpenStack TripleO Team, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, name=rhosp17/openstack-ovn-controller, config_data={'depends_on': 
['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, batch=17.1_20250721.1) Oct 14 05:03:26 localhost podman[101516]: unhealthy Oct 14 05:03:26 localhost systemd[1]: a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5.service: Main process exited, code=exited, status=1/FAILURE Oct 14 05:03:26 localhost systemd[1]: a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5.service: Failed with result 'exit-code'. Oct 14 05:03:26 localhost podman[101517]: 2025-10-14 09:03:26.770701553 +0000 UTC m=+0.101747281 container health_status ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, config_id=tripleo_step5, io.openshift.tags=rhosp osp openstack osp-17.1, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-07-21T14:48:37, release=1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, version=17.1.9, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, architecture=x86_64, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, maintainer=OpenStack TripleO Team, 
tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, name=rhosp17/openstack-nova-compute, vcs-type=git, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a-4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, container_name=nova_compute) Oct 14 
05:03:26 localhost podman[101515]: 2025-10-14 09:03:26.824310252 +0000 UTC m=+0.162528271 container exec_died 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.33.12, architecture=x86_64, batch=17.1_20250721.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vendor=Red Hat, Inc., config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, managed_by=tripleo_ansible, description=Red 
Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-07-21T16:28:53, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, name=rhosp17/openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, version=17.1.9, release=1) Oct 14 05:03:26 localhost podman[101515]: unhealthy Oct 14 05:03:26 localhost systemd[1]: 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e.service: Main process exited, code=exited, status=1/FAILURE Oct 14 05:03:26 localhost systemd[1]: 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e.service: Failed with result 'exit-code'. 
Oct 14 05:03:26 localhost podman[101517]: 2025-10-14 09:03:26.849807031 +0000 UTC m=+0.180852709 container exec_died ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, maintainer=OpenStack TripleO Team, container_name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, tcib_managed=true, io.buildah.version=1.33.12, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, batch=17.1_20250721.1, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, architecture=x86_64, name=rhosp17/openstack-nova-compute, release=1, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, distribution-scope=public, vcs-type=git, build-date=2025-07-21T14:48:37, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a-4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.9) Oct 14 05:03:26 localhost systemd[1]: ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2.service: Deactivated successfully. Oct 14 05:03:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6. 
Oct 14 05:03:32 localhost podman[101576]: 2025-10-14 09:03:32.725042314 +0000 UTC m=+0.071657519 container health_status 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, distribution-scope=public, batch=17.1_20250721.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, architecture=x86_64, build-date=2025-07-21T13:07:59, container_name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, vendor=Red Hat, Inc., io.openshift.expose-services=, com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, version=17.1.9, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, release=1, vcs-type=git, config_id=tripleo_step1) Oct 14 05:03:32 localhost podman[101576]: 2025-10-14 09:03:32.924661942 +0000 UTC m=+0.271277107 container exec_died 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.33.12, architecture=x86_64, build-date=2025-07-21T13:07:59, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, tcib_managed=true, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, 
com.redhat.license_terms=https://www.redhat.com/agreements, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, name=rhosp17/openstack-qdrouterd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, version=17.1.9, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, vcs-type=git, config_id=tripleo_step1, io.openshift.expose-services=, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, maintainer=OpenStack TripleO Team, container_name=metrics_qdr, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd) Oct 14 05:03:32 localhost systemd[1]: 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6.service: Deactivated successfully. Oct 14 05:03:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8. Oct 14 05:03:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea. Oct 14 05:03:47 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Oct 14 05:03:47 localhost recover_tripleo_nova_virtqemud[101617]: 62551 Oct 14 05:03:47 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Oct 14 05:03:47 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. 
Oct 14 05:03:47 localhost podman[101605]: 2025-10-14 09:03:47.754590479 +0000 UTC m=+0.089244914 container health_status 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20250721.1, tcib_managed=true, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, com.redhat.component=openstack-iscsid-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, distribution-scope=public, io.openshift.expose-services=, container_name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, build-date=2025-07-21T13:27:15, io.buildah.version=1.33.12, 
io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.9, name=rhosp17/openstack-iscsid, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, vcs-type=git, managed_by=tripleo_ansible, vendor=Red Hat, Inc., config_id=tripleo_step3, release=1) Oct 14 05:03:47 localhost podman[101605]: 2025-10-14 09:03:47.789186709 +0000 UTC m=+0.123841114 container exec_died 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, build-date=2025-07-21T13:27:15, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-iscsid, vcs-type=git, com.redhat.component=openstack-iscsid-container, description=Red Hat OpenStack Platform 17.1 iscsid, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, io.openshift.expose-services=, tcib_managed=true, batch=17.1_20250721.1, io.buildah.version=1.33.12, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc., config_id=tripleo_step3, release=1, container_name=iscsid, version=17.1.9) Oct 14 05:03:47 localhost podman[101604]: 2025-10-14 09:03:47.803546724 +0000 UTC m=+0.139394295 container health_status 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, build-date=2025-07-21T13:04:03, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, com.redhat.component=openstack-collectd-container, distribution-scope=public, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=collectd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, summary=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, config_id=tripleo_step3, batch=17.1_20250721.1, vcs-type=git, version=17.1.9, release=2, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., tcib_managed=true, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64) Oct 14 05:03:47 localhost systemd[1]: 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea.service: Deactivated successfully. 
Oct 14 05:03:47 localhost podman[101604]: 2025-10-14 09:03:47.817133184 +0000 UTC m=+0.152980775 container exec_died 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, vcs-type=git, architecture=x86_64, com.redhat.component=openstack-collectd-container, release=2, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, name=rhosp17/openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-07-21T13:04:03, config_id=tripleo_step3, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.9, distribution-scope=public, vendor=Red Hat, Inc., batch=17.1_20250721.1, container_name=collectd, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, io.openshift.tags=rhosp osp openstack osp-17.1) Oct 14 05:03:47 localhost systemd[1]: 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8.service: Deactivated successfully. Oct 14 05:03:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638. Oct 14 05:03:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884. Oct 14 05:03:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f. Oct 14 05:03:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e. Oct 14 05:03:54 localhost systemd[1]: tmp-crun.f2oEnw.mount: Deactivated successfully. 
Oct 14 05:03:54 localhost podman[101722]: 2025-10-14 09:03:54.841889107 +0000 UTC m=+0.146962199 container health_status 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, release=1, version=17.1.9, io.buildah.version=1.33.12, batch=17.1_20250721.1, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true, distribution-scope=public, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, 
vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, build-date=2025-07-21T14:48:37, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, container_name=nova_migration_target) Oct 14 05:03:54 localhost podman[101721]: 2025-10-14 09:03:54.803785357 +0000 UTC m=+0.108747756 container health_status 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T15:29:47, io.buildah.version=1.33.12, distribution-scope=public, maintainer=OpenStack TripleO Team, version=17.1.9, architecture=x86_64, tcib_managed=true, vcs-type=git, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, vendor=Red Hat, Inc., container_name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, batch=17.1_20250721.1, io.openshift.expose-services=, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, release=1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1) Oct 14 05:03:54 localhost podman[101721]: 2025-10-14 09:03:54.883887436 +0000 UTC m=+0.188849905 container exec_died 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, build-date=2025-07-21T15:29:47, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.9, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_ipmi, io.buildah.version=1.33.12, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, com.redhat.license_terms=https://www.redhat.com/agreements, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, release=1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, batch=17.1_20250721.1, distribution-scope=public, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1) Oct 14 05:03:54 localhost systemd[1]: 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638.service: Deactivated successfully. 
Oct 14 05:03:54 localhost podman[101724]: 2025-10-14 09:03:54.913210175 +0000 UTC m=+0.210310792 container health_status def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, release=1, container_name=ceilometer_agent_compute, managed_by=tripleo_ansible, config_id=tripleo_step4, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-07-21T14:45:33, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, 
batch=17.1_20250721.1, distribution-scope=public, vendor=Red Hat, Inc., io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1, architecture=x86_64, version=17.1.9, com.redhat.component=openstack-ceilometer-compute-container, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, vcs-type=git) Oct 14 05:03:54 localhost podman[101724]: 2025-10-14 09:03:54.953101159 +0000 UTC m=+0.250201756 container exec_died def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, maintainer=OpenStack TripleO Team, distribution-scope=public, managed_by=tripleo_ansible, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, version=17.1.9, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20250721.1, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, release=1, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-07-21T14:45:33, name=rhosp17/openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, container_name=ceilometer_agent_compute, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.33.12, tcib_managed=true) Oct 14 05:03:54 localhost systemd[1]: def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e.service: Deactivated successfully. 
Oct 14 05:03:54 localhost podman[101723]: 2025-10-14 09:03:54.993614333 +0000 UTC m=+0.296800518 container health_status d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, architecture=x86_64, batch=17.1_20250721.1, distribution-scope=public, release=1, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, managed_by=tripleo_ansible, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, com.redhat.component=openstack-cron-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, container_name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, build-date=2025-07-21T13:07:52, version=17.1.9, io.buildah.version=1.33.12) Oct 14 05:03:55 localhost podman[101723]: 2025-10-14 09:03:55.027658886 +0000 UTC m=+0.330845031 container exec_died d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, container_name=logrotate_crond, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team, tcib_managed=true, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, com.redhat.component=openstack-cron-container, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, distribution-scope=public, release=1, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, version=17.1.9, architecture=x86_64, io.buildah.version=1.33.12, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, build-date=2025-07-21T13:07:52, config_id=tripleo_step4, batch=17.1_20250721.1) Oct 14 05:03:55 localhost systemd[1]: d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f.service: Deactivated successfully. Oct 14 05:03:55 localhost podman[101722]: 2025-10-14 09:03:55.218838494 +0000 UTC m=+0.523911556 container exec_died 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, io.openshift.expose-services=, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, distribution-scope=public, maintainer=OpenStack TripleO Team, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.9, batch=17.1_20250721.1, config_id=tripleo_step4, build-date=2025-07-21T14:48:37, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1, container_name=nova_migration_target, vcs-type=git, vendor=Red Hat, Inc., io.buildah.version=1.33.12) Oct 14 05:03:55 localhost systemd[1]: 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884.service: Deactivated successfully. Oct 14 05:03:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e. Oct 14 05:03:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5. Oct 14 05:03:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2. 
Oct 14 05:03:57 localhost podman[101811]: 2025-10-14 09:03:57.749729678 +0000 UTC m=+0.088370586 container health_status 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, maintainer=OpenStack TripleO Team, vcs-type=git, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, 
release=1, tcib_managed=true, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, distribution-scope=public, io.openshift.expose-services=, vendor=Red Hat, Inc., config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1, batch=17.1_20250721.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.9, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-07-21T16:28:53, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, com.redhat.license_terms=https://www.redhat.com/agreements) Oct 14 05:03:57 localhost podman[101811]: 2025-10-14 09:03:57.760749749 +0000 UTC m=+0.099390667 container exec_died 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, architecture=x86_64, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-07-21T16:28:53, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.33.12, container_name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, managed_by=tripleo_ansible, release=1, tcib_managed=true, distribution-scope=public, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, version=17.1.9, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, config_id=tripleo_step4, batch=17.1_20250721.1) Oct 14 05:03:57 localhost podman[101811]: unhealthy Oct 14 05:03:57 localhost systemd[1]: 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e.service: Main process exited, code=exited, status=1/FAILURE Oct 14 05:03:57 localhost systemd[1]: 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e.service: Failed with result 'exit-code'. 
Oct 14 05:03:57 localhost podman[101813]: 2025-10-14 09:03:57.819570029 +0000 UTC m=+0.154143471 container health_status ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a-4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step5, distribution-scope=public, container_name=nova_compute, name=rhosp17/openstack-nova-compute, release=1, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.9, batch=17.1_20250721.1, vcs-type=git, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, build-date=2025-07-21T14:48:37, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, managed_by=tripleo_ansible) Oct 14 05:03:57 localhost podman[101813]: 2025-10-14 09:03:57.877110311 +0000 UTC m=+0.211683723 container exec_died ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, version=17.1.9, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a-4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_compute, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, batch=17.1_20250721.1, build-date=2025-07-21T14:48:37, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, com.redhat.component=openstack-nova-compute-container, 
io.openshift.expose-services=) Oct 14 05:03:57 localhost systemd[1]: ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2.service: Deactivated successfully. Oct 14 05:03:57 localhost systemd[1]: tmp-crun.sbdJhI.mount: Deactivated successfully. Oct 14 05:03:57 localhost podman[101812]: 2025-10-14 09:03:57.974000439 +0000 UTC m=+0.310947025 container health_status a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, build-date=2025-07-21T13:28:44, com.redhat.license_terms=https://www.redhat.com/agreements, tcib_managed=true, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, batch=17.1_20250721.1, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, vcs-type=git, container_name=ovn_controller, name=rhosp17/openstack-ovn-controller, io.buildah.version=1.33.12, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc., version=17.1.9, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, release=1) Oct 14 05:03:58 localhost podman[101812]: 2025-10-14 09:03:58.015342059 +0000 UTC m=+0.352288685 container exec_died a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, managed_by=tripleo_ansible, tcib_managed=true, distribution-scope=public, vendor=Red Hat, Inc., config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, container_name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ovn-controller-container, io.openshift.tags=rhosp osp openstack osp-17.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, release=1, build-date=2025-07-21T13:28:44, batch=17.1_20250721.1, io.openshift.expose-services=, 
architecture=x86_64, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, vcs-type=git, version=17.1.9) Oct 14 05:03:58 localhost podman[101812]: unhealthy Oct 14 05:03:58 localhost systemd[1]: a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5.service: Main process exited, code=exited, status=1/FAILURE Oct 14 05:03:58 localhost systemd[1]: a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5.service: Failed with result 'exit-code'. Oct 14 05:04:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6. Oct 14 05:04:03 localhost podman[101877]: 2025-10-14 09:04:03.736772112 +0000 UTC m=+0.079404828 container health_status 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, container_name=metrics_qdr, maintainer=OpenStack TripleO Team, version=17.1.9, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, tcib_managed=true, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, com.redhat.component=openstack-qdrouterd-container, distribution-scope=public, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, summary=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, io.buildah.version=1.33.12, batch=17.1_20250721.1, build-date=2025-07-21T13:07:59) Oct 14 05:04:03 localhost podman[101877]: 2025-10-14 09:04:03.924247515 +0000 UTC m=+0.266880251 container exec_died 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, name=rhosp17/openstack-qdrouterd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, container_name=metrics_qdr, batch=17.1_20250721.1, build-date=2025-07-21T13:07:59, distribution-scope=public, managed_by=tripleo_ansible, architecture=x86_64, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., release=1, vcs-type=git, io.buildah.version=1.33.12, version=17.1.9) Oct 14 05:04:03 localhost systemd[1]: 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6.service: Deactivated successfully. Oct 14 05:04:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8. Oct 14 05:04:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea. Oct 14 05:04:18 localhost systemd[1]: tmp-crun.w7ksYi.mount: Deactivated successfully. 
Oct 14 05:04:18 localhost podman[101905]: 2025-10-14 09:04:18.7440898 +0000 UTC m=+0.082024880 container health_status 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, version=17.1.9, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, release=2, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-07-21T13:04:03, name=rhosp17/openstack-collectd, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, distribution-scope=public, vendor=Red Hat, 
Inc., vcs-type=git, com.redhat.component=openstack-collectd-container, tcib_managed=true, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, architecture=x86_64, io.openshift.expose-services=, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, io.buildah.version=1.33.12) Oct 14 05:04:18 localhost systemd[1]: tmp-crun.LF9oCD.mount: Deactivated successfully. Oct 14 05:04:18 localhost podman[101905]: 2025-10-14 09:04:18.766650168 +0000 UTC m=+0.104585248 container exec_died 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, build-date=2025-07-21T13:04:03, description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, distribution-scope=public, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, release=2, io.openshift.expose-services=, com.redhat.component=openstack-collectd-container, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, version=17.1.9, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, tcib_managed=true, batch=17.1_20250721.1, config_id=tripleo_step3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, name=rhosp17/openstack-collectd, architecture=x86_64, container_name=collectd) Oct 14 05:04:18 localhost systemd[1]: 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8.service: Deactivated successfully. 
Oct 14 05:04:18 localhost podman[101906]: 2025-10-14 09:04:18.773211672 +0000 UTC m=+0.104902099 container health_status 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, version=17.1.9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, container_name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, distribution-scope=public, com.redhat.component=openstack-iscsid-container, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc., release=1, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, 
name=rhosp17/openstack-iscsid, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, batch=17.1_20250721.1, build-date=2025-07-21T13:27:15, summary=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.33.12) Oct 14 05:04:18 localhost podman[101906]: 2025-10-14 09:04:18.857132369 +0000 UTC m=+0.188822746 container exec_died 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.buildah.version=1.33.12, architecture=x86_64, container_name=iscsid, io.openshift.expose-services=, name=rhosp17/openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, version=17.1.9, com.redhat.component=openstack-iscsid-container, managed_by=tripleo_ansible, batch=17.1_20250721.1, build-date=2025-07-21T13:27:15, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, release=1, tcib_managed=true) Oct 14 05:04:18 localhost systemd[1]: 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea.service: Deactivated successfully. Oct 14 05:04:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638. Oct 14 05:04:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884. Oct 14 05:04:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f. Oct 14 05:04:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e. Oct 14 05:04:25 localhost systemd[1]: tmp-crun.0n3KMm.mount: Deactivated successfully. 
Oct 14 05:04:25 localhost podman[101943]: 2025-10-14 09:04:25.77389391 +0000 UTC m=+0.111192863 container health_status 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, version=17.1.9, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, tcib_managed=true, name=rhosp17/openstack-ceilometer-ipmi, build-date=2025-07-21T15:29:47, io.buildah.version=1.33.12, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, container_name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, vcs-type=git, io.openshift.expose-services=, release=1, 
com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20250721.1) Oct 14 05:04:25 localhost podman[101943]: 2025-10-14 09:04:25.800132412 +0000 UTC m=+0.137431405 container exec_died 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.33.12, release=1, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, io.openshift.expose-services=, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20250721.1, com.redhat.component=openstack-ceilometer-ipmi-container, container_name=ceilometer_agent_ipmi, maintainer=OpenStack TripleO Team, architecture=x86_64, name=rhosp17/openstack-ceilometer-ipmi, build-date=2025-07-21T15:29:47, com.redhat.license_terms=https://www.redhat.com/agreements, version=17.1.9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, managed_by=tripleo_ansible) Oct 14 05:04:25 localhost systemd[1]: 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638.service: Deactivated successfully. Oct 14 05:04:25 localhost podman[101945]: 2025-10-14 09:04:25.823739322 +0000 UTC m=+0.150739906 container health_status d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, build-date=2025-07-21T13:07:52, vcs-type=git, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, com.redhat.component=openstack-cron-container, batch=17.1_20250721.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, io.buildah.version=1.33.12, release=1, summary=Red Hat OpenStack Platform 17.1 cron, version=17.1.9, io.openshift.expose-services=, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, container_name=logrotate_crond, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-cron) Oct 14 05:04:25 localhost podman[101945]: 2025-10-14 09:04:25.860312805 +0000 UTC m=+0.187313429 container exec_died d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, name=rhosp17/openstack-cron, version=17.1.9, architecture=x86_64, com.redhat.component=openstack-cron-container, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, batch=17.1_20250721.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': 
'/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., io.buildah.version=1.33.12, description=Red Hat OpenStack Platform 17.1 cron, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, build-date=2025-07-21T13:07:52, maintainer=OpenStack TripleO Team, release=1, container_name=logrotate_crond, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4) Oct 14 05:04:25 localhost podman[101944]: 2025-10-14 09:04:25.869236031 +0000 UTC m=+0.198733662 container health_status 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, release=1, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-nova-compute, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step4, 
distribution-scope=public, architecture=x86_64, build-date=2025-07-21T14:48:37, tcib_managed=true, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.9, com.redhat.component=openstack-nova-compute-container, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20250721.1, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_migration_target) Oct 14 05:04:25 localhost systemd[1]: d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f.service: Deactivated successfully. 
Oct 14 05:04:25 localhost podman[101950]: 2025-10-14 09:04:25.927020679 +0000 UTC m=+0.247535882 container health_status def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, version=17.1.9, container_name=ceilometer_agent_compute, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, name=rhosp17/openstack-ceilometer-compute, release=1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, architecture=x86_64, build-date=2025-07-21T14:45:33, config_id=tripleo_step4, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, vcs-type=git, vendor=Red Hat, Inc., io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements) Oct 14 05:04:25 localhost podman[101950]: 2025-10-14 09:04:25.960097033 +0000 UTC m=+0.280612296 container exec_died def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, container_name=ceilometer_agent_compute, architecture=x86_64, build-date=2025-07-21T14:45:33, batch=17.1_20250721.1, release=1, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-compute, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible) Oct 14 05:04:25 localhost systemd[1]: def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e.service: Deactivated successfully. 
Oct 14 05:04:26 localhost podman[101944]: 2025-10-14 09:04:26.244034021 +0000 UTC m=+0.573531712 container exec_died 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-07-21T14:48:37, tcib_managed=true, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, distribution-scope=public, architecture=x86_64, batch=17.1_20250721.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, vcs-type=git, config_id=tripleo_step4, managed_by=tripleo_ansible, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, name=rhosp17/openstack-nova-compute, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', 
'/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_migration_target, vendor=Red Hat, Inc., vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements) Oct 14 05:04:26 localhost systemd[1]: 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884.service: Deactivated successfully. Oct 14 05:04:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e. Oct 14 05:04:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5. Oct 14 05:04:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2. Oct 14 05:04:28 localhost podman[102035]: 2025-10-14 09:04:28.758853577 +0000 UTC m=+0.090083769 container health_status a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, container_name=ovn_controller, io.openshift.expose-services=, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, distribution-scope=public, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, managed_by=tripleo_ansible, release=1, io.buildah.version=1.33.12, architecture=x86_64, build-date=2025-07-21T13:28:44, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.9, com.redhat.component=openstack-ovn-controller-container, name=rhosp17/openstack-ovn-controller) Oct 14 05:04:28 localhost podman[102035]: 2025-10-14 09:04:28.798318438 +0000 UTC m=+0.129548590 container exec_died a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.buildah.version=1.33.12, config_id=tripleo_step4, batch=17.1_20250721.1, version=17.1.9, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.openshift.expose-services=, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1, architecture=x86_64, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, maintainer=OpenStack TripleO Team, container_name=ovn_controller, build-date=2025-07-21T13:28:44, summary=Red Hat OpenStack Platform 17.1 ovn-controller, release=1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 
'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, distribution-scope=public, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245) Oct 14 05:04:28 localhost podman[102035]: unhealthy Oct 14 05:04:28 localhost systemd[1]: a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5.service: Main process exited, code=exited, status=1/FAILURE Oct 14 05:04:28 localhost systemd[1]: a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5.service: Failed with result 'exit-code'. 
Oct 14 05:04:28 localhost podman[102034]: 2025-10-14 09:04:28.799737933 +0000 UTC m=+0.133653128 container health_status 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, build-date=2025-07-21T16:28:53, io.openshift.expose-services=, vcs-type=git, managed_by=tripleo_ansible, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, batch=17.1_20250721.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, config_id=tripleo_step4, container_name=ovn_metadata_agent, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, version=17.1.9, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.buildah.version=1.33.12, release=1, name=rhosp17/openstack-neutron-metadata-agent-ovn) Oct 14 05:04:28 localhost podman[102036]: 2025-10-14 09:04:28.862389112 +0000 UTC m=+0.191318753 container health_status ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a-4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc., tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, io.buildah.version=1.33.12, vcs-type=git, managed_by=tripleo_ansible, release=1, batch=17.1_20250721.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, version=17.1.9, maintainer=OpenStack TripleO Team, build-date=2025-07-21T14:48:37, io.openshift.expose-services=, config_id=tripleo_step5, distribution-scope=public, name=rhosp17/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/agreements) Oct 14 05:04:28 localhost podman[102034]: 2025-10-14 09:04:28.880084679 +0000 UTC m=+0.213999864 container exec_died 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e 
(image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step4, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, vcs-type=git, managed_by=tripleo_ansible, version=17.1.9, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, architecture=x86_64, build-date=2025-07-21T16:28:53, io.buildah.version=1.33.12, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', 
'/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20250721.1, release=1, tcib_managed=true, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-neutron-metadata-agent-ovn, distribution-scope=public, container_name=ovn_metadata_agent) Oct 14 05:04:28 localhost podman[102034]: unhealthy Oct 14 05:04:28 localhost systemd[1]: 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e.service: Main process exited, code=exited, status=1/FAILURE Oct 14 05:04:28 localhost systemd[1]: 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e.service: Failed with result 'exit-code'. Oct 14 05:04:28 localhost podman[102036]: 2025-10-14 09:04:28.897148988 +0000 UTC m=+0.226078619 container exec_died ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, com.redhat.license_terms=https://www.redhat.com/agreements, release=1, container_name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, batch=17.1_20250721.1, distribution-scope=public, config_id=tripleo_step5, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-07-21T14:48:37, io.openshift.expose-services=, tcib_managed=true, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a-4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': 
'/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.33.12, vcs-type=git, managed_by=tripleo_ansible, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.9) Oct 14 05:04:28 localhost systemd[1]: 
ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2.service: Deactivated successfully. Oct 14 05:04:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6. Oct 14 05:04:34 localhost podman[102099]: 2025-10-14 09:04:34.743539901 +0000 UTC m=+0.087114148 container health_status 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, version=17.1.9, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., name=rhosp17/openstack-qdrouterd, release=1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.component=openstack-qdrouterd-container, distribution-scope=public, build-date=2025-07-21T13:07:59, io.openshift.expose-services=, batch=17.1_20250721.1, config_id=tripleo_step1, container_name=metrics_qdr, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}) Oct 14 05:04:35 localhost podman[102099]: 2025-10-14 09:04:35.012112523 +0000 UTC m=+0.355686800 container exec_died 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, tcib_managed=true, com.redhat.component=openstack-qdrouterd-container, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, container_name=metrics_qdr, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, architecture=x86_64, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, build-date=2025-07-21T13:07:59, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhosp17/openstack-qdrouterd, io.buildah.version=1.33.12, vcs-type=git, version=17.1.9, io.openshift.expose-services=, vendor=Red Hat, Inc., batch=17.1_20250721.1, maintainer=OpenStack TripleO Team, release=1) Oct 14 05:04:35 localhost systemd[1]: 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6.service: Deactivated successfully. Oct 14 05:04:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8. Oct 14 05:04:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea. Oct 14 05:04:49 localhost systemd[1]: tmp-crun.CEANNL.mount: Deactivated successfully. 
Oct 14 05:04:49 localhost podman[102128]: 2025-10-14 09:04:49.758531401 +0000 UTC m=+0.101598846 container health_status 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, batch=17.1_20250721.1, com.redhat.component=openstack-collectd-container, release=2, description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, com.redhat.license_terms=https://www.redhat.com/agreements, tcib_managed=true, vendor=Red Hat, Inc., config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', 
'/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, container_name=collectd, build-date=2025-07-21T13:04:03, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, version=17.1.9, config_id=tripleo_step3, name=rhosp17/openstack-collectd, managed_by=tripleo_ansible, vcs-type=git) Oct 14 05:04:49 localhost podman[102129]: 2025-10-14 09:04:49.796059323 +0000 UTC m=+0.136270889 container health_status 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-07-21T13:27:15, description=Red Hat OpenStack Platform 17.1 iscsid, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, container_name=iscsid, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, tcib_managed=true, name=rhosp17/openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, architecture=x86_64, config_id=tripleo_step3, io.openshift.expose-services=, batch=17.1_20250721.1, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, version=17.1.9, release=1, com.redhat.component=openstack-iscsid-container, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid) Oct 14 05:04:49 localhost podman[102128]: 2025-10-14 09:04:49.825633058 +0000 UTC m=+0.168700563 container exec_died 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, container_name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-07-21T13:04:03, distribution-scope=public, vendor=Red Hat, Inc., batch=17.1_20250721.1, config_id=tripleo_step3, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, release=2, name=rhosp17/openstack-collectd, tcib_managed=true, com.redhat.component=openstack-collectd-container, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': 
'/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, version=17.1.9, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2) Oct 14 05:04:49 localhost systemd[1]: 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8.service: Deactivated successfully. 
Oct 14 05:04:49 localhost podman[102129]: 2025-10-14 09:04:49.880594719 +0000 UTC m=+0.220806345 container exec_died 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, release=1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.buildah.version=1.33.12, container_name=iscsid, config_id=tripleo_step3, name=rhosp17/openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, architecture=x86_64, com.redhat.component=openstack-iscsid-container, build-date=2025-07-21T13:27:15, vendor=Red Hat, Inc., batch=17.1_20250721.1, 
distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, tcib_managed=true, version=17.1.9, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/agreements) Oct 14 05:04:49 localhost systemd[1]: 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea.service: Deactivated successfully. Oct 14 05:04:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638. Oct 14 05:04:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884. Oct 14 05:04:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f. Oct 14 05:04:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e. Oct 14 05:04:56 localhost systemd[1]: tmp-crun.H4kUN6.mount: Deactivated successfully. 
Oct 14 05:04:56 localhost podman[102296]: 2025-10-14 09:04:56.74373728 +0000 UTC m=+0.083042082 container health_status 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, distribution-scope=public, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, vendor=Red Hat, Inc., io.buildah.version=1.33.12, tcib_managed=true, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, maintainer=OpenStack TripleO Team, release=1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, container_name=ceilometer_agent_ipmi, architecture=x86_64, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.9, batch=17.1_20250721.1, build-date=2025-07-21T15:29:47, com.redhat.component=openstack-ceilometer-ipmi-container) Oct 14 05:04:56 localhost systemd[1]: tmp-crun.GMFYpC.mount: Deactivated successfully. Oct 14 05:04:56 localhost podman[102305]: 2025-10-14 09:04:56.78801014 +0000 UTC m=+0.118398796 container health_status def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.openshift.expose-services=, vendor=Red Hat, Inc., build-date=2025-07-21T14:45:33, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, release=1, io.buildah.version=1.33.12, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, version=17.1.9, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, config_id=tripleo_step4, vcs-type=git, container_name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, managed_by=tripleo_ansible, batch=17.1_20250721.1) Oct 14 05:04:56 localhost podman[102296]: 2025-10-14 09:04:56.798235776 +0000 UTC m=+0.137540548 container exec_died 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, release=1, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-07-21T15:29:47, com.redhat.component=openstack-ceilometer-ipmi-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 
'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.9, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1, batch=17.1_20250721.1, container_name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, architecture=x86_64, name=rhosp17/openstack-ceilometer-ipmi, io.buildah.version=1.33.12, vcs-type=git, config_id=tripleo_step4, io.openshift.expose-services=) Oct 14 05:04:56 localhost systemd[1]: 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638.service: Deactivated successfully. 
Oct 14 05:04:56 localhost podman[102305]: 2025-10-14 09:04:56.824013604 +0000 UTC m=+0.154402310 container exec_died def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, batch=17.1_20250721.1, container_name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, distribution-scope=public, version=17.1.9, io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, managed_by=tripleo_ansible, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, release=1, build-date=2025-07-21T14:45:33, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1) Oct 14 05:04:56 localhost systemd[1]: def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e.service: Deactivated successfully. Oct 14 05:04:56 localhost podman[102298]: 2025-10-14 09:04:56.851068482 +0000 UTC m=+0.184708559 container health_status d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, distribution-scope=public, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, vcs-type=git, tcib_managed=true, build-date=2025-07-21T13:07:52, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., io.buildah.version=1.33.12, description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, version=17.1.9, name=rhosp17/openstack-cron, architecture=x86_64, managed_by=tripleo_ansible, io.openshift.expose-services=, release=1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, container_name=logrotate_crond, maintainer=OpenStack TripleO Team, batch=17.1_20250721.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron) Oct 14 05:04:56 localhost podman[102298]: 2025-10-14 09:04:56.864057684 +0000 UTC m=+0.197697751 container exec_died d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, com.redhat.license_terms=https://www.redhat.com/agreements, batch=17.1_20250721.1, summary=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, com.redhat.component=openstack-cron-container, container_name=logrotate_crond, vcs-type=git, build-date=2025-07-21T13:07:52, description=Red Hat OpenStack Platform 17.1 cron, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, vendor=Red Hat, Inc., io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, version=17.1.9, io.openshift.expose-services=, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-cron, tcib_managed=true, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 
'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, distribution-scope=public, release=1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron) Oct 14 05:04:56 localhost systemd[1]: d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f.service: Deactivated successfully. 
Oct 14 05:04:56 localhost podman[102297]: 2025-10-14 09:04:56.763937145 +0000 UTC m=+0.102032479 container health_status 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, container_name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.license_terms=https://www.redhat.com/agreements, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.33.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, 
managed_by=tripleo_ansible, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.9, build-date=2025-07-21T14:48:37, name=rhosp17/openstack-nova-compute, release=1, vcs-type=git, batch=17.1_20250721.1, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, architecture=x86_64, maintainer=OpenStack TripleO Team, io.openshift.expose-services=) Oct 14 05:04:57 localhost podman[102297]: 2025-10-14 09:04:57.127059844 +0000 UTC m=+0.465155218 container exec_died 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, version=17.1.9, 
com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, name=rhosp17/openstack-nova-compute, release=1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, architecture=x86_64, container_name=nova_migration_target, tcib_managed=true, build-date=2025-07-21T14:48:37, vcs-type=git, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20250721.1, io.openshift.tags=rhosp osp openstack osp-17.1, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container) Oct 14 05:04:57 localhost systemd[1]: 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884.service: Deactivated successfully. Oct 14 05:04:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e. Oct 14 05:04:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5. Oct 14 05:04:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2. 
Oct 14 05:04:59 localhost podman[102393]: 2025-10-14 09:04:59.746301023 +0000 UTC m=+0.081076870 container health_status a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.component=openstack-ovn-controller-container, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, architecture=x86_64, io.buildah.version=1.33.12, release=1, tcib_managed=true, vcs-type=git, batch=17.1_20250721.1, container_name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, build-date=2025-07-21T13:28:44, description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ovn-controller) Oct 14 05:04:59 localhost podman[102393]: 2025-10-14 09:04:59.786078664 
+0000 UTC m=+0.120854511 container exec_died a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, version=17.1.9, container_name=ovn_controller, release=1, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, build-date=2025-07-21T13:28:44, io.openshift.tags=rhosp osp openstack osp-17.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, distribution-scope=public, vendor=Red Hat, Inc., name=rhosp17/openstack-ovn-controller, batch=17.1_20250721.1, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, config_id=tripleo_step4, tcib_managed=true, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, vcs-type=git, architecture=x86_64, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, com.redhat.license_terms=https://www.redhat.com/agreements) Oct 14 05:04:59 localhost podman[102393]: unhealthy Oct 14 05:04:59 localhost systemd[1]: a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5.service: Main process 
exited, code=exited, status=1/FAILURE Oct 14 05:04:59 localhost systemd[1]: a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5.service: Failed with result 'exit-code'. Oct 14 05:04:59 localhost podman[102394]: 2025-10-14 09:04:59.80405653 +0000 UTC m=+0.130990784 container health_status ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, vendor=Red Hat, Inc., version=17.1.9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, architecture=x86_64, container_name=nova_compute, release=1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-07-21T14:48:37, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a-4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, name=rhosp17/openstack-nova-compute, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20250721.1, distribution-scope=public, io.openshift.expose-services=, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, config_id=tripleo_step5) Oct 14 05:04:59 localhost podman[102394]: 2025-10-14 09:04:59.836043201 +0000 UTC m=+0.162977465 container exec_died ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 
'49c4309af9a4fea3d3f53b6222780f5a-4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, vendor=Red Hat, Inc., config_id=tripleo_step5, distribution-scope=public, architecture=x86_64, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-07-21T14:48:37, version=17.1.9, io.openshift.tags=rhosp osp openstack osp-17.1, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, io.buildah.version=1.33.12, batch=17.1_20250721.1, name=rhosp17/openstack-nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, container_name=nova_compute, io.openshift.expose-services=, tcib_managed=true) Oct 14 05:04:59 localhost systemd[1]: tmp-crun.rKVeRo.mount: Deactivated successfully. Oct 14 05:04:59 localhost systemd[1]: ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2.service: Deactivated successfully. Oct 14 05:04:59 localhost podman[102392]: 2025-10-14 09:04:59.85638118 +0000 UTC m=+0.191301482 container health_status 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, vcs-type=git, config_id=tripleo_step4, release=1, architecture=x86_64, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, distribution-scope=public, build-date=2025-07-21T16:28:53, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, batch=17.1_20250721.1, io.buildah.version=1.33.12, version=17.1.9, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, vendor=Red Hat, Inc., container_name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn) Oct 14 05:04:59 localhost podman[102392]: 2025-10-14 09:04:59.870723234 +0000 UTC m=+0.205643536 container exec_died 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, release=1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, tcib_managed=true, batch=17.1_20250721.1, config_id=tripleo_step4, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.33.12, container_name=ovn_metadata_agent, build-date=2025-07-21T16:28:53, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., vcs-type=git, 
version=17.1.9, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/agreements) Oct 14 05:04:59 localhost podman[102392]: unhealthy Oct 14 05:04:59 localhost systemd[1]: 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e.service: Main process exited, code=exited, status=1/FAILURE Oct 14 05:04:59 localhost systemd[1]: 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e.service: Failed with result 'exit-code'. Oct 14 05:05:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6. Oct 14 05:05:05 localhost podman[102457]: 2025-10-14 09:05:05.728842858 +0000 UTC m=+0.074651941 container health_status 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, container_name=metrics_qdr, tcib_managed=true, io.buildah.version=1.33.12, name=rhosp17/openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, config_id=tripleo_step1, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.openshift.expose-services=, vcs-type=git, version=17.1.9, architecture=x86_64, batch=17.1_20250721.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': 
False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, release=1, build-date=2025-07-21T13:07:59, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed) Oct 14 05:05:05 localhost podman[102457]: 2025-10-14 09:05:05.914140313 +0000 UTC m=+0.259949386 container exec_died 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, release=1, version=17.1.9, architecture=x86_64, io.buildah.version=1.33.12, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20250721.1, config_id=tripleo_step1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc., com.redhat.component=openstack-qdrouterd-container, container_name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 
'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, build-date=2025-07-21T13:07:59, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, name=rhosp17/openstack-qdrouterd, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, tcib_managed=true, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd) Oct 14 05:05:05 localhost systemd[1]: 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6.service: Deactivated successfully. Oct 14 05:05:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8. Oct 14 05:05:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea. 
Oct 14 05:05:20 localhost podman[102487]: 2025-10-14 09:05:20.75174704 +0000 UTC m=+0.080608566 container health_status 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, build-date=2025-07-21T13:27:15, vendor=Red Hat, Inc., io.openshift.expose-services=, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/agreements, version=17.1.9, batch=17.1_20250721.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, release=1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1, 
architecture=x86_64, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, container_name=iscsid, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, distribution-scope=public) Oct 14 05:05:20 localhost podman[102487]: 2025-10-14 09:05:20.75917154 +0000 UTC m=+0.088033076 container exec_died 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vcs-type=git, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, release=1, vendor=Red Hat, Inc., build-date=2025-07-21T13:27:15, container_name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-iscsid, version=17.1.9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, batch=17.1_20250721.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', 
'/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, managed_by=tripleo_ansible, tcib_managed=true, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, io.buildah.version=1.33.12, com.redhat.component=openstack-iscsid-container, io.openshift.expose-services=, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2) Oct 14 05:05:20 localhost systemd[1]: 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea.service: Deactivated successfully. Oct 14 05:05:20 localhost systemd[1]: tmp-crun.Pbnyjj.mount: Deactivated successfully. Oct 14 05:05:20 localhost podman[102486]: 2025-10-14 09:05:20.816920907 +0000 UTC m=+0.146164275 container health_status 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, batch=17.1_20250721.1, managed_by=tripleo_ansible, build-date=2025-07-21T13:04:03, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, io.openshift.expose-services=, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 
'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1, release=2, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, config_id=tripleo_step3, distribution-scope=public, vendor=Red Hat, Inc., name=rhosp17/openstack-collectd, version=17.1.9) Oct 14 05:05:20 localhost podman[102486]: 2025-10-14 09:05:20.825376588 +0000 UTC m=+0.154619986 container exec_died 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, distribution-scope=public, io.openshift.expose-services=, tcib_managed=true, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, batch=17.1_20250721.1, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, 
maintainer=OpenStack TripleO Team, com.redhat.component=openstack-collectd-container, container_name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, vcs-type=git, build-date=2025-07-21T13:04:03, summary=Red Hat OpenStack Platform 17.1 collectd, version=17.1.9, description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, architecture=x86_64, vendor=Red Hat, Inc., release=2, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-collectd) Oct 14 05:05:20 localhost systemd[1]: 
0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8.service: Deactivated successfully. Oct 14 05:05:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638. Oct 14 05:05:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884. Oct 14 05:05:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f. Oct 14 05:05:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e. Oct 14 05:05:27 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Oct 14 05:05:27 localhost recover_tripleo_nova_virtqemud[102545]: 62551 Oct 14 05:05:27 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Oct 14 05:05:27 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. 
Oct 14 05:05:27 localhost podman[102525]: 2025-10-14 09:05:27.724891604 +0000 UTC m=+0.067746268 container health_status 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, batch=17.1_20250721.1, build-date=2025-07-21T15:29:47, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true, release=1, com.redhat.component=openstack-ceilometer-ipmi-container, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.33.12, version=17.1.9, name=rhosp17/openstack-ceilometer-ipmi, vcs-type=git, config_id=tripleo_step4, container_name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi) Oct 14 05:05:27 localhost podman[102525]: 2025-10-14 09:05:27.775264803 +0000 UTC m=+0.118119447 container exec_died 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.buildah.version=1.33.12, release=1, version=17.1.9, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1, config_id=tripleo_step4, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, distribution-scope=public, tcib_managed=true, io.openshift.expose-services=, build-date=2025-07-21T15:29:47, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-ipmi-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, name=rhosp17/openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, batch=17.1_20250721.1, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi) Oct 14 05:05:27 localhost systemd[1]: tmp-crun.NJW6zN.mount: Deactivated successfully. Oct 14 05:05:27 localhost systemd[1]: 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638.service: Deactivated successfully. Oct 14 05:05:27 localhost podman[102528]: 2025-10-14 09:05:27.793869479 +0000 UTC m=+0.129523211 container health_status def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-compute, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, batch=17.1_20250721.1, vendor=Red Hat, Inc., io.buildah.version=1.33.12, build-date=2025-07-21T14:45:33, vcs-type=git, io.openshift.expose-services=, managed_by=tripleo_ansible, release=1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_compute, com.redhat.component=openstack-ceilometer-compute-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.9, tcib_managed=true, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64) Oct 14 05:05:27 localhost podman[102528]: 2025-10-14 09:05:27.840353127 +0000 UTC m=+0.176006829 container exec_died def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, com.redhat.license_terms=https://www.redhat.com/agreements, release=1, com.redhat.component=openstack-ceilometer-compute-container, maintainer=OpenStack TripleO Team, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, 
container_name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., architecture=x86_64, io.buildah.version=1.33.12, build-date=2025-07-21T14:45:33, config_id=tripleo_step4, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, batch=17.1_20250721.1, version=17.1.9) Oct 14 05:05:27 localhost systemd[1]: def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e.service: Deactivated 
successfully. Oct 14 05:05:27 localhost podman[102527]: 2025-10-14 09:05:27.887400873 +0000 UTC m=+0.223848009 container health_status d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, config_id=tripleo_step4, release=1, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/agreements, tcib_managed=true, container_name=logrotate_crond, io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T13:07:52, description=Red Hat OpenStack Platform 17.1 cron, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, batch=17.1_20250721.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-cron-container, vendor=Red Hat, Inc., 
version=17.1.9, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, distribution-scope=public, io.openshift.expose-services=, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, architecture=x86_64, name=rhosp17/openstack-cron) Oct 14 05:05:27 localhost podman[102526]: 2025-10-14 09:05:27.844530237 +0000 UTC m=+0.179979352 container health_status 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, version=17.1.9, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.33.12, tcib_managed=true, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, build-date=2025-07-21T14:48:37, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, distribution-scope=public, config_id=tripleo_step4, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20250721.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc., container_name=nova_migration_target, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, io.openshift.expose-services=) Oct 14 05:05:27 localhost podman[102527]: 2025-10-14 09:05:27.922464399 +0000 UTC m=+0.258911535 container exec_died d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, summary=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20250721.1, tcib_managed=true, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, build-date=2025-07-21T13:07:52, container_name=logrotate_crond, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, description=Red Hat OpenStack Platform 17.1 cron, version=17.1.9, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.component=openstack-cron-container, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, name=rhosp17/openstack-cron, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, config_id=tripleo_step4, managed_by=tripleo_ansible, release=1) Oct 14 05:05:27 localhost systemd[1]: d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f.service: Deactivated successfully. 
Oct 14 05:05:28 localhost podman[102526]: 2025-10-14 09:05:28.246157027 +0000 UTC m=+0.581606122 container exec_died 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1, config_id=tripleo_step4, vendor=Red Hat, Inc., container_name=nova_migration_target, distribution-scope=public, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.33.12, tcib_managed=true, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, 
vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhosp17/openstack-nova-compute, build-date=2025-07-21T14:48:37, release=1, version=17.1.9, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20250721.1) Oct 14 05:05:28 localhost systemd[1]: 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884.service: Deactivated successfully. Oct 14 05:05:28 localhost systemd[1]: tmp-crun.wGSkTe.mount: Deactivated successfully. Oct 14 05:05:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e. Oct 14 05:05:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5. Oct 14 05:05:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2. Oct 14 05:05:30 localhost podman[102621]: 2025-10-14 09:05:30.789845418 +0000 UTC m=+0.090659347 container health_status 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1, version=17.1.9, batch=17.1_20250721.1, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, container_name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1, io.openshift.expose-services=, name=rhosp17/openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., com.redhat.component=openstack-neutron-metadata-agent-ovn-container, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-07-21T16:28:53, com.redhat.license_terms=https://www.redhat.com/agreements, tcib_managed=true, vcs-type=git, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.33.12, config_id=tripleo_step4, distribution-scope=public) Oct 14 05:05:30 localhost podman[102621]: 2025-10-14 09:05:30.838109891 +0000 UTC m=+0.138923850 container exec_died 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e 
(image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, version=17.1.9, io.buildah.version=1.33.12, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, container_name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, tcib_managed=true, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, architecture=x86_64, batch=17.1_20250721.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.expose-services=, config_id=tripleo_step4, build-date=2025-07-21T16:28:53, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1) Oct 14 05:05:30 localhost podman[102621]: unhealthy Oct 14 05:05:30 localhost systemd[1]: 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e.service: Main process exited, code=exited, status=1/FAILURE Oct 14 05:05:30 localhost systemd[1]: 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e.service: Failed with result 'exit-code'. Oct 14 05:05:30 localhost podman[102622]: 2025-10-14 09:05:30.894949301 +0000 UTC m=+0.192419477 container health_status a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhosp17/openstack-ovn-controller, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, batch=17.1_20250721.1, container_name=ovn_controller, io.openshift.expose-services=, com.redhat.component=openstack-ovn-controller-container, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1, config_id=tripleo_step4, version=17.1.9, io.buildah.version=1.33.12, tcib_managed=true, architecture=x86_64, 
release=1, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, build-date=2025-07-21T13:28:44, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245) Oct 14 05:05:30 localhost podman[102623]: 2025-10-14 09:05:30.849515254 +0000 UTC m=+0.141195561 container health_status ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, release=1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, tcib_managed=true, vendor=Red Hat, Inc., batch=17.1_20250721.1, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a-4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': 
['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2025-07-21T14:48:37, io.openshift.expose-services=, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, distribution-scope=public, config_id=tripleo_step5, io.buildah.version=1.33.12, com.redhat.component=openstack-nova-compute-container, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d) Oct 14 05:05:30 localhost podman[102623]: 2025-10-14 09:05:30.933158483 +0000 UTC m=+0.224838510 container exec_died 
ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a-4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', 
'/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, container_name=nova_compute, io.buildah.version=1.33.12, name=rhosp17/openstack-nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, release=1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, build-date=2025-07-21T14:48:37, version=17.1.9, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, tcib_managed=true, batch=17.1_20250721.1, architecture=x86_64, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1) Oct 14 05:05:30 localhost systemd[1]: ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2.service: Deactivated successfully. Oct 14 05:05:30 localhost podman[102622]: 2025-10-14 09:05:30.984081849 +0000 UTC m=+0.281552065 container exec_died a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, architecture=x86_64, com.redhat.component=openstack-ovn-controller-container, batch=17.1_20250721.1, container_name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, distribution-scope=public, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, version=17.1.9, managed_by=tripleo_ansible, build-date=2025-07-21T13:28:44, release=1) Oct 14 05:05:30 localhost podman[102622]: unhealthy Oct 14 05:05:30 localhost systemd[1]: a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5.service: Main process exited, code=exited, status=1/FAILURE Oct 14 05:05:30 localhost systemd[1]: a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5.service: Failed with result 'exit-code'. Oct 14 05:05:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6. Oct 14 05:05:36 localhost systemd[1]: tmp-crun.Odrzj8.mount: Deactivated successfully. 
Oct 14 05:05:36 localhost podman[102687]: 2025-10-14 09:05:36.729119903 +0000 UTC m=+0.072192785 container health_status 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, vendor=Red Hat, Inc., managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/agreements, release=1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, build-date=2025-07-21T13:07:59, config_id=tripleo_step1, io.buildah.version=1.33.12, container_name=metrics_qdr, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, distribution-scope=public, vcs-type=git, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1) Oct 14 05:05:36 localhost podman[102687]: 2025-10-14 09:05:36.949800774 +0000 UTC m=+0.292873716 container exec_died 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, tcib_managed=true, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, com.redhat.license_terms=https://www.redhat.com/agreements, batch=17.1_20250721.1, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-07-21T13:07:59, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-qdrouterd, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, release=1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-qdrouterd-container, io.openshift.expose-services=, version=17.1.9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, distribution-scope=public, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, managed_by=tripleo_ansible) Oct 14 05:05:36 localhost systemd[1]: 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6.service: Deactivated successfully. Oct 14 05:05:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8. Oct 14 05:05:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea. 
Oct 14 05:05:51 localhost podman[102716]: 2025-10-14 09:05:51.756174853 +0000 UTC m=+0.090800090 container health_status 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, managed_by=tripleo_ansible, name=rhosp17/openstack-collectd, architecture=x86_64, batch=17.1_20250721.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', 
'/sys/fs/cgroup:/sys/fs/cgroup:ro']}, tcib_managed=true, container_name=collectd, distribution-scope=public, io.buildah.version=1.33.12, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, build-date=2025-07-21T13:04:03, release=2, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, vcs-type=git, version=17.1.9) Oct 14 05:05:51 localhost podman[102716]: 2025-10-14 09:05:51.790384383 +0000 UTC m=+0.125009590 container exec_died 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, version=17.1.9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, vendor=Red Hat, Inc., release=2, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, config_id=tripleo_step3, container_name=collectd, name=rhosp17/openstack-collectd, build-date=2025-07-21T13:04:03, vcs-type=git, batch=17.1_20250721.1, distribution-scope=public, io.buildah.version=1.33.12, com.redhat.component=openstack-collectd-container, summary=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true) Oct 14 05:05:51 localhost systemd[1]: 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8.service: Deactivated successfully. 
Oct 14 05:05:51 localhost podman[102717]: 2025-10-14 09:05:51.793916572 +0000 UTC m=+0.124932158 container health_status 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, tcib_managed=true, config_id=tripleo_step3, vcs-type=git, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, com.redhat.component=openstack-iscsid-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, managed_by=tripleo_ansible, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, container_name=iscsid, release=1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, batch=17.1_20250721.1, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, description=Red Hat 
OpenStack Platform 17.1 iscsid, distribution-scope=public, version=17.1.9, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, build-date=2025-07-21T13:27:15, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 iscsid) Oct 14 05:05:51 localhost podman[102717]: 2025-10-14 09:05:51.873395402 +0000 UTC m=+0.204411018 container exec_died 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, batch=17.1_20250721.1, build-date=2025-07-21T13:27:15, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, release=1, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', 
'/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1, container_name=iscsid, name=rhosp17/openstack-iscsid, version=17.1.9, maintainer=OpenStack TripleO Team, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, architecture=x86_64, com.redhat.component=openstack-iscsid-container, managed_by=tripleo_ansible, io.buildah.version=1.33.12) Oct 14 05:05:51 localhost systemd[1]: 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea.service: Deactivated successfully. Oct 14 05:05:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638. Oct 14 05:05:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884. Oct 14 05:05:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f. Oct 14 05:05:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e. 
Oct 14 05:05:58 localhost podman[102835]: 2025-10-14 09:05:58.755327173 +0000 UTC m=+0.085980523 container health_status def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-07-21T14:45:33, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, config_id=tripleo_step4, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=ceilometer_agent_compute, com.redhat.component=openstack-ceilometer-compute-container, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-compute, version=17.1.9, release=1, tcib_managed=true, vcs-type=git, batch=17.1_20250721.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3) Oct 14 05:05:58 localhost podman[102834]: 2025-10-14 09:05:58.808089426 +0000 UTC m=+0.142793290 container health_status d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/agreements, release=1, vendor=Red Hat, Inc., architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, version=17.1.9, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, io.openshift.expose-services=, config_id=tripleo_step4, name=rhosp17/openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, batch=17.1_20250721.1, build-date=2025-07-21T13:07:52, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.component=openstack-cron-container, distribution-scope=public, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, container_name=logrotate_crond) Oct 14 05:05:58 localhost podman[102832]: 2025-10-14 09:05:58.729799323 +0000 UTC m=+0.072918928 container health_status 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_ipmi, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, vcs-type=git, maintainer=OpenStack TripleO Team, architecture=x86_64, build-date=2025-07-21T15:29:47, release=1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step4, io.openshift.expose-services=, vendor=Red Hat, Inc., batch=17.1_20250721.1, io.buildah.version=1.33.12, version=17.1.9) Oct 14 05:05:58 localhost podman[102833]: 2025-10-14 09:05:58.785192397 +0000 UTC m=+0.125217146 container health_status 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, release=1, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-07-21T14:48:37, name=rhosp17/openstack-nova-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, version=17.1.9, description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.buildah.version=1.33.12, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, managed_by=tripleo_ansible, batch=17.1_20250721.1) Oct 14 05:05:58 localhost podman[102835]: 2025-10-14 09:05:58.845196094 +0000 UTC m=+0.175849464 container exec_died def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, release=1, build-date=2025-07-21T14:45:33, 
com.redhat.component=openstack-ceilometer-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.9, io.buildah.version=1.33.12, config_id=tripleo_step4, container_name=ceilometer_agent_compute, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1, distribution-scope=public, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, architecture=x86_64, batch=17.1_20250721.1, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, 
io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible) Oct 14 05:05:58 localhost systemd[1]: def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e.service: Deactivated successfully. Oct 14 05:05:58 localhost podman[102832]: 2025-10-14 09:05:58.863201001 +0000 UTC m=+0.206320596 container exec_died 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, vendor=Red Hat, Inc., batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, io.openshift.expose-services=, architecture=x86_64, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, com.redhat.component=openstack-ceilometer-ipmi-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, version=17.1.9, distribution-scope=public, tcib_managed=true, release=1, config_id=tripleo_step4, build-date=2025-07-21T15:29:47, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_ipmi) Oct 14 05:05:58 localhost systemd[1]: 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638.service: Deactivated successfully. Oct 14 05:05:58 localhost podman[102834]: 2025-10-14 09:05:58.947215512 +0000 UTC m=+0.281919356 container exec_died d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, managed_by=tripleo_ansible, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, io.buildah.version=1.33.12, name=rhosp17/openstack-cron, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, com.redhat.component=openstack-cron-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, architecture=x86_64, vcs-type=git, version=17.1.9, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, release=1, summary=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, batch=17.1_20250721.1, build-date=2025-07-21T13:07:52, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1) Oct 14 05:05:58 localhost systemd[1]: d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f.service: Deactivated successfully. 
Oct 14 05:05:59 localhost podman[102833]: 2025-10-14 09:05:59.113067566 +0000 UTC m=+0.453092315 container exec_died 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, tcib_managed=true, build-date=2025-07-21T14:48:37, version=17.1.9, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.33.12, release=1, name=rhosp17/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/agreements, 
distribution-scope=public, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., batch=17.1_20250721.1, com.redhat.component=openstack-nova-compute-container) Oct 14 05:05:59 localhost systemd[1]: 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884.service: Deactivated successfully. Oct 14 05:05:59 localhost systemd[1]: tmp-crun.sAtTIg.mount: Deactivated successfully. Oct 14 05:06:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e. Oct 14 05:06:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5. Oct 14 05:06:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2. Oct 14 05:06:01 localhost systemd[1]: tmp-crun.4oOyCr.mount: Deactivated successfully. 
Oct 14 05:06:01 localhost podman[102925]: 2025-10-14 09:06:01.741702493 +0000 UTC m=+0.078965205 container health_status 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, version=17.1.9, name=rhosp17/openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, distribution-scope=public, managed_by=tripleo_ansible, io.buildah.version=1.33.12, tcib_managed=true, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, description=Red Hat OpenStack 
Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, batch=17.1_20250721.1, build-date=2025-07-21T16:28:53, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1, container_name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, io.openshift.expose-services=, architecture=x86_64) Oct 14 05:06:01 localhost podman[102926]: 2025-10-14 09:06:01.780207425 +0000 UTC m=+0.113091852 container health_status a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, io.openshift.expose-services=, vendor=Red Hat, Inc., config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, container_name=ovn_controller, vcs-type=git, version=17.1.9, 
build-date=2025-07-21T13:28:44, description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-ovn-controller, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, batch=17.1_20250721.1, distribution-scope=public, tcib_managed=true) Oct 14 05:06:01 localhost podman[102925]: 2025-10-14 09:06:01.809631215 +0000 UTC m=+0.146893977 container exec_died 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T16:28:53, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, tcib_managed=true, batch=17.1_20250721.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, version=17.1.9, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, managed_by=tripleo_ansible, container_name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step4, name=rhosp17/openstack-neutron-metadata-agent-ovn, release=1, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, io.buildah.version=1.33.12, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git) Oct 14 05:06:01 localhost podman[102927]: 2025-10-14 09:06:01.855487924 +0000 UTC m=+0.184223062 container health_status ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, version=17.1.9, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, vcs-type=git, build-date=2025-07-21T14:48:37, container_name=nova_compute, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 
'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a-4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, config_id=tripleo_step5, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, release=1, batch=17.1_20250721.1, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.33.12, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 nova-compute) Oct 14 05:06:01 localhost podman[102925]: unhealthy Oct 14 05:06:01 localhost systemd[1]: 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e.service: Main process exited, code=exited, status=1/FAILURE Oct 14 05:06:01 localhost systemd[1]: 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e.service: Failed with result 'exit-code'. Oct 14 05:06:01 localhost podman[102926]: 2025-10-14 09:06:01.872630995 +0000 UTC m=+0.205515482 container exec_died a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.buildah.version=1.33.12, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1, config_id=tripleo_step4, container_name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 
ovn-controller, tcib_managed=true, com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, maintainer=OpenStack TripleO Team, distribution-scope=public, name=rhosp17/openstack-ovn-controller, version=17.1.9, build-date=2025-07-21T13:28:44, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, io.openshift.expose-services=, batch=17.1_20250721.1) Oct 14 05:06:01 localhost podman[102926]: unhealthy Oct 14 05:06:01 localhost podman[102927]: 2025-10-14 09:06:01.890117976 +0000 UTC m=+0.218853104 container exec_died ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, release=1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-07-21T14:48:37, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a-4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 
'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step5, name=rhosp17/openstack-nova-compute, container_name=nova_compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, vcs-type=git, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc., version=17.1.9, distribution-scope=public) Oct 14 05:06:01 localhost systemd[1]: a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5.service: Main process exited, code=exited, status=1/FAILURE Oct 14 05:06:01 localhost 
systemd[1]: a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5.service: Failed with result 'exit-code'. Oct 14 05:06:01 localhost systemd[1]: ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2.service: Deactivated successfully. Oct 14 05:06:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6. Oct 14 05:06:07 localhost podman[102990]: 2025-10-14 09:06:07.722915515 +0000 UTC m=+0.070141362 container health_status 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, version=17.1.9, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, name=rhosp17/openstack-qdrouterd, container_name=metrics_qdr, vcs-type=git, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, build-date=2025-07-21T13:07:59, release=1, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, batch=17.1_20250721.1, config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, vendor=Red Hat, Inc., managed_by=tripleo_ansible, architecture=x86_64, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.component=openstack-qdrouterd-container, io.openshift.tags=rhosp osp openstack osp-17.1) Oct 14 05:06:07 localhost podman[102990]: 2025-10-14 09:06:07.938099185 +0000 UTC m=+0.285325062 container exec_died 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.9, architecture=x86_64, name=rhosp17/openstack-qdrouterd, vendor=Red Hat, Inc., config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1, io.openshift.expose-services=, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, io.buildah.version=1.33.12, vcs-type=git, batch=17.1_20250721.1, build-date=2025-07-21T13:07:59, managed_by=tripleo_ansible, container_name=metrics_qdr, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/agreements, release=1) Oct 14 05:06:07 localhost systemd[1]: 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6.service: Deactivated successfully. Oct 14 05:06:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8. Oct 14 05:06:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea. Oct 14 05:06:22 localhost systemd[1]: tmp-crun.KVr6IR.mount: Deactivated successfully. 
Oct 14 05:06:22 localhost podman[103021]: 2025-10-14 09:06:22.747336917 +0000 UTC m=+0.087832599 container health_status 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, version=17.1.9, distribution-scope=public, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, batch=17.1_20250721.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.component=openstack-iscsid-container, io.buildah.version=1.33.12, tcib_managed=true, managed_by=tripleo_ansible, build-date=2025-07-21T13:27:15, io.openshift.tags=rhosp osp openstack osp-17.1, architecture=x86_64, vendor=Red Hat, Inc., name=rhosp17/openstack-iscsid, release=1, 
com.redhat.license_terms=https://www.redhat.com/agreements, container_name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, vcs-type=git, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid) Oct 14 05:06:22 localhost podman[103021]: 2025-10-14 09:06:22.75618173 +0000 UTC m=+0.096677392 container exec_died 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, container_name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, io.buildah.version=1.33.12, vcs-type=git, batch=17.1_20250721.1, config_id=tripleo_step3, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, summary=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=openstack-iscsid-container, version=17.1.9, architecture=x86_64, release=1, tcib_managed=true, build-date=2025-07-21T13:27:15, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid) Oct 14 05:06:22 localhost systemd[1]: 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea.service: Deactivated successfully. Oct 14 05:06:22 localhost podman[103020]: 2025-10-14 09:06:22.830002276 +0000 UTC m=+0.170198649 container health_status 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, architecture=x86_64, build-date=2025-07-21T13:04:03, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-collectd, tcib_managed=true, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, managed_by=tripleo_ansible, release=2, maintainer=OpenStack TripleO Team, vcs-type=git, com.redhat.component=openstack-collectd-container, version=17.1.9, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, vendor=Red Hat, Inc., io.openshift.expose-services=) Oct 14 05:06:22 localhost podman[103020]: 2025-10-14 09:06:22.844377691 +0000 UTC m=+0.184574084 container exec_died 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 collectd, release=2, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-07-21T13:04:03, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': 
'/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.buildah.version=1.33.12, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, tcib_managed=true, version=17.1.9, container_name=collectd, batch=17.1_20250721.1, com.redhat.component=openstack-collectd-container, distribution-scope=public, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step3, vendor=Red Hat, Inc.) Oct 14 05:06:22 localhost systemd[1]: 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8.service: Deactivated successfully. 
Oct 14 05:06:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638. Oct 14 05:06:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884. Oct 14 05:06:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f. Oct 14 05:06:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e. Oct 14 05:06:29 localhost podman[103058]: 2025-10-14 09:06:29.741524352 +0000 UTC m=+0.080131312 container health_status 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, batch=17.1_20250721.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, build-date=2025-07-21T14:48:37, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, io.buildah.version=1.33.12, version=17.1.9, architecture=x86_64, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1, container_name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhosp17/openstack-nova-compute, io.openshift.expose-services=, managed_by=tripleo_ansible) Oct 14 05:06:29 localhost systemd[1]: tmp-crun.t4Thci.mount: Deactivated successfully. 
Oct 14 05:06:29 localhost podman[103060]: 2025-10-14 09:06:29.80676139 +0000 UTC m=+0.142884903 container health_status def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, com.redhat.component=openstack-ceilometer-compute-container, name=rhosp17/openstack-ceilometer-compute, release=1, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true, architecture=x86_64, io.buildah.version=1.33.12, container_name=ceilometer_agent_compute, io.openshift.expose-services=, version=17.1.9, 
config_id=tripleo_step4, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, batch=17.1_20250721.1, build-date=2025-07-21T14:45:33) Oct 14 05:06:29 localhost podman[103057]: 2025-10-14 09:06:29.858279765 +0000 UTC m=+0.193775689 container health_status 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, vcs-type=git, 
architecture=x86_64, name=rhosp17/openstack-ceilometer-ipmi, version=17.1.9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, io.openshift.expose-services=, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, vendor=Red Hat, Inc., batch=17.1_20250721.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, build-date=2025-07-21T15:29:47, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, release=1, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, container_name=ceilometer_agent_ipmi, io.buildah.version=1.33.12) Oct 14 05:06:29 localhost podman[103059]: 2025-10-14 09:06:29.893280958 +0000 UTC m=+0.232239489 container health_status d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.buildah.version=1.33.12, config_id=tripleo_step4, architecture=x86_64, com.redhat.component=openstack-cron-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, release=1, com.redhat.license_terms=https://www.redhat.com/agreements, batch=17.1_20250721.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, vcs-type=git, container_name=logrotate_crond, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-cron, vendor=Red Hat, Inc., io.openshift.expose-services=, distribution-scope=public, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, build-date=2025-07-21T13:07:52, version=17.1.9, description=Red Hat OpenStack Platform 17.1 cron) Oct 14 05:06:29 localhost podman[103059]: 2025-10-14 09:06:29.901959077 +0000 UTC m=+0.240917528 container exec_died d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, summary=Red Hat OpenStack Platform 17.1 cron, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, release=1, io.openshift.expose-services=, batch=17.1_20250721.1, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, version=17.1.9, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.component=openstack-cron-container, container_name=logrotate_crond, 
tcib_managed=true, distribution-scope=public, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.buildah.version=1.33.12, vcs-type=git, build-date=2025-07-21T13:07:52, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-cron, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 cron) Oct 14 05:06:29 localhost podman[103060]: 2025-10-14 09:06:29.911399829 +0000 UTC m=+0.247523422 container exec_died def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.33.12, io.openshift.expose-services=, 
io.openshift.tags=rhosp osp openstack osp-17.1, architecture=x86_64, batch=17.1_20250721.1, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, build-date=2025-07-21T14:45:33, com.redhat.component=openstack-ceilometer-compute-container, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, distribution-scope=public, vcs-type=git, version=17.1.9, managed_by=tripleo_ansible, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_compute, name=rhosp17/openstack-ceilometer-compute) Oct 14 05:06:29 localhost podman[103057]: 2025-10-14 09:06:29.914109533 +0000 UTC m=+0.249605447 container exec_died 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 
(image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, architecture=x86_64, build-date=2025-07-21T15:29:47, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, vendor=Red Hat, Inc., name=rhosp17/openstack-ceilometer-ipmi, batch=17.1_20250721.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, version=17.1.9, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, container_name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, com.redhat.component=openstack-ceilometer-ipmi-container, io.buildah.version=1.33.12, release=1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1, config_id=tripleo_step4, io.openshift.expose-services=, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements) Oct 14 05:06:29 localhost systemd[1]: d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f.service: Deactivated successfully. Oct 14 05:06:29 localhost systemd[1]: 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638.service: Deactivated successfully. Oct 14 05:06:29 localhost systemd[1]: def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e.service: Deactivated successfully. Oct 14 05:06:30 localhost podman[103058]: 2025-10-14 09:06:30.110047027 +0000 UTC m=+0.448653957 container exec_died 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, managed_by=tripleo_ansible, version=17.1.9, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat 
OpenStack Platform 17.1 nova-compute, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, release=1, io.buildah.version=1.33.12, name=rhosp17/openstack-nova-compute, container_name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-07-21T14:48:37, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, vcs-type=git, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, batch=17.1_20250721.1, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc.) Oct 14 05:06:30 localhost systemd[1]: 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884.service: Deactivated successfully. Oct 14 05:06:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e. Oct 14 05:06:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5. Oct 14 05:06:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2. 
Oct 14 05:06:32 localhost podman[103150]: 2025-10-14 09:06:32.764897448 +0000 UTC m=+0.092354509 container health_status ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, vcs-type=git, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, distribution-scope=public, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a-4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=nova_compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-compute, io.buildah.version=1.33.12, managed_by=tripleo_ansible, build-date=2025-07-21T14:48:37, config_id=tripleo_step5, release=1, batch=17.1_20250721.1, version=17.1.9, com.redhat.component=openstack-nova-compute-container) Oct 14 05:06:32 localhost systemd[1]: tmp-crun.hxJrwE.mount: Deactivated successfully. 
Oct 14 05:06:32 localhost podman[103148]: 2025-10-14 09:06:32.822407579 +0000 UTC m=+0.153964567 container health_status 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, release=1, build-date=2025-07-21T16:28:53, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-type=git, architecture=x86_64, io.openshift.expose-services=, version=17.1.9, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.33.12, container_name=ovn_metadata_agent, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vendor=Red Hat, Inc., managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, tcib_managed=true) Oct 14 05:06:32 localhost podman[103148]: 2025-10-14 09:06:32.871050664 +0000 UTC m=+0.202607762 container exec_died 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, vcs-type=git, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-07-21T16:28:53, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1, io.openshift.expose-services=, version=17.1.9, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, vendor=Red Hat, Inc., distribution-scope=public, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': 
'/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, maintainer=OpenStack TripleO Team, release=1, container_name=ovn_metadata_agent, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.33.12) Oct 14 05:06:32 localhost podman[103148]: unhealthy Oct 14 05:06:32 localhost podman[103149]: 2025-10-14 09:06:32.880908669 +0000 UTC m=+0.209710101 container health_status a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, io.openshift.expose-services=, release=1, distribution-scope=public, io.k8s.description=Red Hat OpenStack 
Platform 17.1 ovn-controller, managed_by=tripleo_ansible, config_id=tripleo_step4, vcs-type=git, com.redhat.component=openstack-ovn-controller-container, container_name=ovn_controller, version=17.1.9, description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-07-21T13:28:44, tcib_managed=true, architecture=x86_64, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, vendor=Red Hat, Inc., batch=17.1_20250721.1, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-ovn-controller) Oct 14 05:06:32 localhost systemd[1]: 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e.service: Main process exited, code=exited, status=1/FAILURE Oct 14 05:06:32 localhost systemd[1]: 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e.service: Failed with result 'exit-code'. 
Oct 14 05:06:32 localhost podman[103150]: 2025-10-14 09:06:32.896753319 +0000 UTC m=+0.224210410 container exec_died ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, vendor=Red Hat, Inc., batch=17.1_20250721.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1, config_id=tripleo_step5, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.33.12, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a-4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, io.openshift.expose-services=, build-date=2025-07-21T14:48:37, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, com.redhat.component=openstack-nova-compute-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, container_name=nova_compute, version=17.1.9, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1) Oct 14 05:06:32 localhost systemd[1]: ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2.service: Deactivated successfully. 
Oct 14 05:06:32 localhost podman[103149]: 2025-10-14 09:06:32.925154708 +0000 UTC m=+0.253956120 container exec_died a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.expose-services=, release=1, batch=17.1_20250721.1, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, container_name=ovn_controller, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, tcib_managed=true, version=17.1.9, vendor=Red Hat, Inc., com.redhat.component=openstack-ovn-controller-container, io.buildah.version=1.33.12, build-date=2025-07-21T13:28:44, config_id=tripleo_step4, vcs-type=git) Oct 14 05:06:32 localhost podman[103149]: unhealthy Oct 14 05:06:32 localhost systemd[1]: 
a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5.service: Main process exited, code=exited, status=1/FAILURE Oct 14 05:06:32 localhost systemd[1]: a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5.service: Failed with result 'exit-code'. Oct 14 05:06:33 localhost systemd[1]: tmp-crun.w1QJTc.mount: Deactivated successfully. Oct 14 05:06:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6. Oct 14 05:06:38 localhost systemd[1]: tmp-crun.HM2mjP.mount: Deactivated successfully. Oct 14 05:06:38 localhost podman[103212]: 2025-10-14 09:06:38.774338326 +0000 UTC m=+0.091057179 container health_status 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', 
'/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, tcib_managed=true, distribution-scope=public, version=17.1.9, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, managed_by=tripleo_ansible, build-date=2025-07-21T13:07:59, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, container_name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, vendor=Red Hat, Inc., config_id=tripleo_step1, release=1, architecture=x86_64, batch=17.1_20250721.1) Oct 14 05:06:38 localhost podman[103212]: 2025-10-14 09:06:38.991135186 +0000 UTC m=+0.307854029 container exec_died 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, config_id=tripleo_step1, distribution-scope=public, version=17.1.9, description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20250721.1, com.redhat.component=openstack-qdrouterd-container, vendor=Red Hat, Inc., io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1, container_name=metrics_qdr, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, tcib_managed=true, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-qdrouterd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, summary=Red Hat OpenStack Platform 17.1 
qdrouterd, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, build-date=2025-07-21T13:07:59) Oct 14 05:06:39 localhost systemd[1]: 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6.service: Deactivated successfully. Oct 14 05:06:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8. Oct 14 05:06:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea. 
Oct 14 05:06:53 localhost podman[103242]: 2025-10-14 09:06:53.789781027 +0000 UTC m=+0.118554620 container health_status 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, build-date=2025-07-21T13:27:15, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, managed_by=tripleo_ansible, name=rhosp17/openstack-iscsid, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, version=17.1.9, container_name=iscsid, vendor=Red Hat, Inc., batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, release=1, io.buildah.version=1.33.12, 
io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, tcib_managed=true, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, config_id=tripleo_step3) Oct 14 05:06:53 localhost podman[103242]: 2025-10-14 09:06:53.804071389 +0000 UTC m=+0.132844972 container exec_died 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, tcib_managed=true, com.redhat.component=openstack-iscsid-container, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.9, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-07-21T13:27:15, io.buildah.version=1.33.12, description=Red Hat 
OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1, io.openshift.expose-services=, vcs-type=git, architecture=x86_64, container_name=iscsid, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, maintainer=OpenStack TripleO Team, batch=17.1_20250721.1, summary=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, managed_by=tripleo_ansible, name=rhosp17/openstack-iscsid, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step3) Oct 14 05:06:53 localhost systemd[1]: 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea.service: Deactivated successfully. Oct 14 05:06:53 localhost podman[103241]: 2025-10-14 09:06:53.89327909 +0000 UTC m=+0.227826292 container health_status 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, maintainer=OpenStack TripleO Team, tcib_managed=true, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, build-date=2025-07-21T13:04:03, io.buildah.version=1.33.12, com.redhat.component=openstack-collectd-container, architecture=x86_64, io.openshift.expose-services=, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, vendor=Red Hat, Inc., container_name=collectd, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, release=2, summary=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20250721.1, name=rhosp17/openstack-collectd, distribution-scope=public, vcs-type=git, managed_by=tripleo_ansible, 
config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}) Oct 14 05:06:53 localhost podman[103241]: 2025-10-14 09:06:53.907019316 +0000 UTC m=+0.241566528 container exec_died 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, com.redhat.component=openstack-collectd-container, name=rhosp17/openstack-collectd, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-07-21T13:04:03, config_id=tripleo_step3, release=2, vcs-type=git, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, distribution-scope=public, 
io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, managed_by=tripleo_ansible, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, architecture=x86_64, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, version=17.1.9, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.33.12, batch=17.1_20250721.1, tcib_managed=true) Oct 14 05:06:53 localhost systemd[1]: 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8.service: Deactivated successfully. 
Oct 14 05:06:56 localhost podman[103383]: 2025-10-14 09:06:56.491331833 +0000 UTC m=+0.082853976 container exec b19a91314e045cbe5465f6abdeb6b420108fcda7860b498b01dcd4e65236d2c8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-crash-np0005486733, RELEASE=main, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, build-date=2025-09-24T08:57:55, io.openshift.expose-services=, vcs-type=git, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux , GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, distribution-scope=public, architecture=x86_64, version=7, release=553, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.33.12) Oct 14 05:06:56 localhost podman[103383]: 2025-10-14 09:06:56.615203967 +0000 UTC m=+0.206726150 container exec_died b19a91314e045cbe5465f6abdeb6b420108fcda7860b498b01dcd4e65236d2c8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-crash-np0005486733, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, io.openshift.expose-services=, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux , architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, 
GIT_BRANCH=main, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, build-date=2025-09-24T08:57:55, name=rhceph, description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, io.openshift.tags=rhceph ceph, version=7, distribution-scope=public, release=553) Oct 14 05:07:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638. Oct 14 05:07:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884. Oct 14 05:07:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f. Oct 14 05:07:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e. 
Oct 14 05:07:00 localhost podman[103525]: 2025-10-14 09:07:00.759156546 +0000 UTC m=+0.086316892 container health_status 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, version=17.1.9, io.buildah.version=1.33.12, io.openshift.expose-services=, release=1, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-ipmi-container, architecture=x86_64, build-date=2025-07-21T15:29:47, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, 
vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, batch=17.1_20250721.1, container_name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, vcs-type=git, vendor=Red Hat, Inc.) Oct 14 05:07:00 localhost podman[103527]: 2025-10-14 09:07:00.777135032 +0000 UTC m=+0.094585967 container health_status d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-cron, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step4, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, release=1, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, vendor=Red Hat, Inc., build-date=2025-07-21T13:07:52, vcs-type=git, batch=17.1_20250721.1, architecture=x86_64, com.redhat.component=openstack-cron-container, container_name=logrotate_crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, version=17.1.9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible) Oct 14 05:07:00 localhost podman[103526]: 2025-10-14 09:07:00.832341171 +0000 UTC m=+0.154533553 container health_status 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, batch=17.1_20250721.1, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc., release=1, version=17.1.9, io.buildah.version=1.33.12, managed_by=tripleo_ansible, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1, architecture=x86_64, build-date=2025-07-21T14:48:37, container_name=nova_migration_target) Oct 14 05:07:00 localhost podman[103525]: 2025-10-14 09:07:00.850103361 +0000 UTC m=+0.177263737 container exec_died 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.buildah.version=1.33.12, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, version=17.1.9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-07-21T15:29:47, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc., managed_by=tripleo_ansible, architecture=x86_64, container_name=ceilometer_agent_ipmi, release=1, config_id=tripleo_step4, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, 
vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, com.redhat.component=openstack-ceilometer-ipmi-container, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-ipmi, batch=17.1_20250721.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi) Oct 14 05:07:00 localhost podman[103527]: 2025-10-14 09:07:00.861437012 +0000 UTC m=+0.178887937 container exec_died d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 cron, version=17.1.9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 
'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, managed_by=tripleo_ansible, container_name=logrotate_crond, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 cron, release=1, vcs-type=git, com.redhat.component=openstack-cron-container, vendor=Red Hat, Inc., config_id=tripleo_step4, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, build-date=2025-07-21T13:07:52, name=rhosp17/openstack-cron, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, maintainer=OpenStack TripleO Team, batch=17.1_20250721.1, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c) Oct 14 05:07:00 localhost systemd[1]: 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638.service: Deactivated successfully. Oct 14 05:07:00 localhost systemd[1]: d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f.service: Deactivated successfully. 
Oct 14 05:07:00 localhost podman[103534]: 2025-10-14 09:07:00.936245637 +0000 UTC m=+0.247070758 container health_status def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, tcib_managed=true, release=1, vcs-type=git, vendor=Red Hat, Inc., vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, batch=17.1_20250721.1, io.openshift.tags=rhosp osp openstack osp-17.1, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-07-21T14:45:33, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, config_id=tripleo_step4, com.redhat.component=openstack-ceilometer-compute-container, version=17.1.9, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute) Oct 14 05:07:00 localhost podman[103534]: 2025-10-14 09:07:00.994227872 +0000 UTC m=+0.305052943 container exec_died def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, managed_by=tripleo_ansible, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, config_id=tripleo_step4, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-07-21T14:45:33, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=ceilometer_agent_compute, tcib_managed=true, vcs-type=git, version=17.1.9, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, name=rhosp17/openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, maintainer=OpenStack TripleO Team, release=1, batch=17.1_20250721.1, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, vendor=Red Hat, Inc.) Oct 14 05:07:01 localhost systemd[1]: def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e.service: Deactivated successfully. 
Oct 14 05:07:01 localhost podman[103526]: 2025-10-14 09:07:01.183220141 +0000 UTC m=+0.505412503 container exec_died 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, com.redhat.license_terms=https://www.redhat.com/agreements, batch=17.1_20250721.1, build-date=2025-07-21T14:48:37, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, architecture=x86_64, container_name=nova_migration_target, io.buildah.version=1.33.12, com.redhat.component=openstack-nova-compute-container, release=1, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step4, name=rhosp17/openstack-nova-compute, version=17.1.9, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp 
openstack osp-17.1, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., distribution-scope=public) Oct 14 05:07:01 localhost systemd[1]: 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884.service: Deactivated successfully. Oct 14 05:07:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e. Oct 14 05:07:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5. Oct 14 05:07:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2. Oct 14 05:07:03 localhost systemd[1]: tmp-crun.9VJTXS.mount: Deactivated successfully. 
Oct 14 05:07:03 localhost podman[103618]: 2025-10-14 09:07:03.759518241 +0000 UTC m=+0.092269177 container health_status a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, vcs-type=git, name=rhosp17/openstack-ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1, architecture=x86_64, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, release=1, version=17.1.9, com.redhat.component=openstack-ovn-controller-container, build-date=2025-07-21T13:28:44, managed_by=tripleo_ansible, container_name=ovn_controller, config_id=tripleo_step4, vendor=Red Hat, Inc., tcib_managed=true, batch=17.1_20250721.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, distribution-scope=public) Oct 14 05:07:03 localhost systemd[1]: tmp-crun.yCeZvi.mount: Deactivated 
successfully. Oct 14 05:07:03 localhost podman[103618]: 2025-10-14 09:07:03.803318367 +0000 UTC m=+0.136069303 container exec_died a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, managed_by=tripleo_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, release=1, config_id=tripleo_step4, vendor=Red Hat, Inc., com.redhat.component=openstack-ovn-controller-container, io.openshift.expose-services=, container_name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, architecture=x86_64, distribution-scope=public, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, name=rhosp17/openstack-ovn-controller, version=17.1.9, io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T13:28:44, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1) Oct 14 05:07:03 localhost podman[103618]: unhealthy Oct 14 05:07:03 localhost 
podman[103617]: 2025-10-14 09:07:03.81118866 +0000 UTC m=+0.145694571 container health_status 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, vendor=Red Hat, Inc., vcs-type=git, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, maintainer=OpenStack TripleO Team, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.9, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', 
'/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20250721.1, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, release=1, build-date=2025-07-21T16:28:53) Oct 14 05:07:03 localhost systemd[1]: a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5.service: Main process exited, code=exited, status=1/FAILURE Oct 14 05:07:03 localhost systemd[1]: a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5.service: Failed with result 'exit-code'. 
Oct 14 05:07:03 localhost podman[103619]: 2025-10-14 09:07:03.857559605 +0000 UTC m=+0.186579856 container health_status ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-07-21T14:48:37, managed_by=tripleo_ansible, release=1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, architecture=x86_64, batch=17.1_20250721.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, container_name=nova_compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, vendor=Red Hat, Inc., version=17.1.9, io.buildah.version=1.33.12, config_id=tripleo_step5, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a-4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, distribution-scope=public) Oct 14 05:07:03 localhost podman[103617]: 2025-10-14 09:07:03.880638259 +0000 UTC m=+0.215144180 container exec_died 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, version=17.1.9, distribution-scope=public, build-date=2025-07-21T16:28:53, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.expose-services=, release=1, com.redhat.license_terms=https://www.redhat.com/agreements, 
config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-type=git, container_name=ovn_metadata_agent, architecture=x86_64, batch=17.1_20250721.1, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, tcib_managed=true, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, name=rhosp17/openstack-neutron-metadata-agent-ovn) Oct 14 05:07:03 localhost podman[103617]: unhealthy Oct 14 05:07:03 localhost systemd[1]: 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e.service: Main process exited, code=exited, status=1/FAILURE Oct 14 05:07:03 localhost systemd[1]: 
7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e.service: Failed with result 'exit-code'. Oct 14 05:07:03 localhost podman[103619]: 2025-10-14 09:07:03.93332999 +0000 UTC m=+0.262350251 container exec_died ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, vendor=Red Hat, Inc., distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.33.12, config_id=tripleo_step5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, version=17.1.9, build-date=2025-07-21T14:48:37, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=nova_compute, tcib_managed=true, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a-4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 nova-compute) Oct 14 05:07:03 localhost systemd[1]: ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2.service: Deactivated successfully. Oct 14 05:07:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6. Oct 14 05:07:09 localhost systemd[1]: tmp-crun.GyaeG4.mount: Deactivated successfully. 
Oct 14 05:07:09 localhost podman[103685]: 2025-10-14 09:07:09.763883101 +0000 UTC m=+0.101664427 container health_status 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vcs-type=git, build-date=2025-07-21T13:07:59, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.9, summary=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, container_name=metrics_qdr, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, name=rhosp17/openstack-qdrouterd, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, managed_by=tripleo_ansible, 
maintainer=OpenStack TripleO Team, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, batch=17.1_20250721.1, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1, release=1) Oct 14 05:07:09 localhost podman[103685]: 2025-10-14 09:07:09.973244801 +0000 UTC m=+0.311026057 container exec_died 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc., tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.9, summary=Red Hat OpenStack Platform 17.1 qdrouterd, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, distribution-scope=public, build-date=2025-07-21T13:07:59, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, name=rhosp17/openstack-qdrouterd, batch=17.1_20250721.1, managed_by=tripleo_ansible, com.redhat.component=openstack-qdrouterd-container, maintainer=OpenStack TripleO Team, config_id=tripleo_step1, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, release=1, architecture=x86_64, container_name=metrics_qdr, io.buildah.version=1.33.12) Oct 14 05:07:09 localhost systemd[1]: 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6.service: Deactivated successfully. Oct 14 05:07:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8. Oct 14 05:07:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea. Oct 14 05:07:24 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Oct 14 05:07:24 localhost recover_tripleo_nova_virtqemud[103726]: 62551 Oct 14 05:07:24 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Oct 14 05:07:24 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. Oct 14 05:07:24 localhost systemd[1]: tmp-crun.ABMU23.mount: Deactivated successfully. 
Oct 14 05:07:24 localhost podman[103715]: 2025-10-14 09:07:24.745320408 +0000 UTC m=+0.077563011 container health_status 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, tcib_managed=true, vendor=Red Hat, Inc., distribution-scope=public, name=rhosp17/openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, build-date=2025-07-21T13:27:15, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, container_name=iscsid, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, batch=17.1_20250721.1, release=1, vcs-type=git, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, io.openshift.expose-services=) Oct 14 05:07:24 localhost podman[103715]: 2025-10-14 09:07:24.786256606 +0000 UTC m=+0.118499189 container exec_died 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, maintainer=OpenStack TripleO Team, container_name=iscsid, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=openstack-iscsid-container, summary=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, distribution-scope=public, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, version=17.1.9, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, build-date=2025-07-21T13:27:15, release=1, architecture=x86_64, io.buildah.version=1.33.12, tcib_managed=true, batch=17.1_20250721.1) Oct 14 05:07:24 localhost systemd[1]: tmp-crun.coSKLy.mount: Deactivated successfully. Oct 14 05:07:24 localhost systemd[1]: 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea.service: Deactivated successfully. Oct 14 05:07:24 localhost podman[103714]: 2025-10-14 09:07:24.806778271 +0000 UTC m=+0.148204969 container health_status 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, name=rhosp17/openstack-collectd, build-date=2025-07-21T13:04:03, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, version=17.1.9, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step3, managed_by=tripleo_ansible, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, io.openshift.tags=rhosp osp openstack osp-17.1, distribution-scope=public, batch=17.1_20250721.1, io.openshift.expose-services=, release=2, tcib_managed=true, container_name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, description=Red Hat OpenStack Platform 17.1 collectd) Oct 14 05:07:24 localhost podman[103714]: 2025-10-14 09:07:24.821135945 +0000 UTC m=+0.162562673 container exec_died 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, distribution-scope=public, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 
'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.33.12, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, release=2, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, io.openshift.tags=rhosp osp openstack osp-17.1, architecture=x86_64, batch=17.1_20250721.1, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., version=17.1.9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, build-date=2025-07-21T13:04:03, container_name=collectd, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step3, com.redhat.component=openstack-collectd-container) Oct 14 05:07:24 localhost systemd[1]: 
0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8.service: Deactivated successfully. Oct 14 05:07:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638. Oct 14 05:07:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884. Oct 14 05:07:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f. Oct 14 05:07:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e. Oct 14 05:07:31 localhost podman[103755]: 2025-10-14 09:07:31.755470451 +0000 UTC m=+0.094686212 container health_status 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, release=1, vcs-type=git, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-ceilometer-ipmi, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, batch=17.1_20250721.1, com.redhat.component=openstack-ceilometer-ipmi-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, io.buildah.version=1.33.12, build-date=2025-07-21T15:29:47, container_name=ceilometer_agent_ipmi, io.openshift.expose-services=, config_id=tripleo_step4, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, version=17.1.9, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi) Oct 14 05:07:31 localhost podman[103756]: 2025-10-14 09:07:31.816001434 +0000 UTC m=+0.149708764 container health_status 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vcs-type=git, container_name=nova_migration_target, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, config_id=tripleo_step4, build-date=2025-07-21T14:48:37, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, 
vendor=Red Hat, Inc., vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, name=rhosp17/openstack-nova-compute, release=1, version=17.1.9, batch=17.1_20250721.1, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=) Oct 14 05:07:31 localhost podman[103758]: 2025-10-14 09:07:31.879571742 +0000 UTC m=+0.208860746 container health_status def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, distribution-scope=public, managed_by=tripleo_ansible, vcs-type=git, build-date=2025-07-21T14:45:33, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, container_name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, vendor=Red Hat, Inc., release=1, architecture=x86_64, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, batch=17.1_20250721.1, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.component=openstack-ceilometer-compute-container, name=rhosp17/openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/agreements, version=17.1.9, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute) Oct 14 05:07:31 
localhost podman[103755]: 2025-10-14 09:07:31.887482237 +0000 UTC m=+0.226697988 container exec_died 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, release=1, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1, batch=17.1_20250721.1, build-date=2025-07-21T15:29:47, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, managed_by=tripleo_ansible, io.openshift.expose-services=, tcib_managed=true, com.redhat.component=openstack-ceilometer-ipmi-container, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.33.12, version=17.1.9, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_ipmi, vendor=Red Hat, Inc.) Oct 14 05:07:31 localhost systemd[1]: 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638.service: Deactivated successfully. Oct 14 05:07:31 localhost podman[103758]: 2025-10-14 09:07:31.914170283 +0000 UTC m=+0.243459287 container exec_died def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true, config_id=tripleo_step4, distribution-scope=public, batch=17.1_20250721.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-compute, managed_by=tripleo_ansible, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, container_name=ceilometer_agent_compute, com.redhat.component=openstack-ceilometer-compute-container, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, build-date=2025-07-21T14:45:33, architecture=x86_64, version=17.1.9, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1) Oct 14 05:07:31 localhost systemd[1]: def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e.service: Deactivated successfully. 
Oct 14 05:07:31 localhost podman[103757]: 2025-10-14 09:07:31.97029891 +0000 UTC m=+0.300573684 container health_status d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, vendor=Red Hat, Inc., io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/agreements, managed_by=tripleo_ansible, release=1, vcs-type=git, architecture=x86_64, batch=17.1_20250721.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, tcib_managed=true, build-date=2025-07-21T13:07:52, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, version=17.1.9, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-cron, io.openshift.tags=rhosp osp openstack osp-17.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, config_id=tripleo_step4, io.openshift.expose-services=) Oct 14 05:07:31 localhost podman[103757]: 2025-10-14 09:07:31.979757833 +0000 UTC m=+0.310032617 container exec_died d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, name=rhosp17/openstack-cron, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step4, batch=17.1_20250721.1, vendor=Red Hat, Inc., version=17.1.9, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, vcs-type=git, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers:/var/log/containers:z']}, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, build-date=2025-07-21T13:07:52, io.openshift.expose-services=, tcib_managed=true, com.redhat.component=openstack-cron-container, description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, release=1, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team) Oct 14 05:07:31 localhost systemd[1]: d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f.service: Deactivated successfully. Oct 14 05:07:32 localhost podman[103756]: 2025-10-14 09:07:32.194149838 +0000 UTC m=+0.527857058 container exec_died 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, architecture=x86_64, batch=17.1_20250721.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.33.12, name=rhosp17/openstack-nova-compute, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_migration_target, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, tcib_managed=true, build-date=2025-07-21T14:48:37, managed_by=tripleo_ansible, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, release=1, vendor=Red Hat, Inc., version=17.1.9) Oct 14 05:07:32 localhost systemd[1]: 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884.service: Deactivated successfully. Oct 14 05:07:32 localhost systemd[1]: tmp-crun.3SKiSE.mount: Deactivated successfully. Oct 14 05:07:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e. Oct 14 05:07:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5. Oct 14 05:07:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2. 
Oct 14 05:07:34 localhost podman[103848]: 2025-10-14 09:07:34.743664658 +0000 UTC m=+0.081884536 container health_status 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, build-date=2025-07-21T16:28:53, vcs-type=git, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1, io.buildah.version=1.33.12, version=17.1.9, container_name=ovn_metadata_agent, release=1, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', 
'/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, architecture=x86_64, batch=17.1_20250721.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1) Oct 14 05:07:34 localhost podman[103849]: 2025-10-14 09:07:34.799707232 +0000 UTC m=+0.135112172 container health_status a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, architecture=x86_64, com.redhat.component=openstack-ovn-controller-container, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, vendor=Red Hat, Inc., config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, 
container_name=ovn_controller, io.openshift.expose-services=, distribution-scope=public, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-07-21T13:28:44, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20250721.1, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, version=17.1.9, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhosp17/openstack-ovn-controller, release=1) Oct 14 05:07:34 localhost podman[103849]: 2025-10-14 09:07:34.81513347 +0000 UTC m=+0.150538410 container exec_died a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/agreements, batch=17.1_20250721.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, managed_by=tripleo_ansible, build-date=2025-07-21T13:28:44, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, 
io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-ovn-controller, release=1, version=17.1.9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, vcs-type=git, tcib_managed=true, vendor=Red Hat, Inc., vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, container_name=ovn_controller, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, distribution-scope=public, io.buildah.version=1.33.12) Oct 14 05:07:34 localhost podman[103849]: unhealthy Oct 14 05:07:34 localhost systemd[1]: a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5.service: Main process exited, code=exited, status=1/FAILURE Oct 14 05:07:34 localhost systemd[1]: a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5.service: Failed with result 'exit-code'. 
Oct 14 05:07:34 localhost podman[103848]: 2025-10-14 09:07:34.867217532 +0000 UTC m=+0.205437480 container exec_died 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, build-date=2025-07-21T16:28:53, io.buildah.version=1.33.12, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, distribution-scope=public, maintainer=OpenStack TripleO Team, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, vcs-type=git, batch=17.1_20250721.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, vendor=Red Hat, Inc., release=1, io.openshift.expose-services=, config_id=tripleo_step4, name=rhosp17/openstack-neutron-metadata-agent-ovn, version=17.1.9, container_name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Oct 14 05:07:34 localhost podman[103848]: unhealthy Oct 14 05:07:34 localhost systemd[1]: 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e.service: Main process exited, code=exited, status=1/FAILURE Oct 14 05:07:34 localhost systemd[1]: 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e.service: Failed with result 'exit-code'. 
Oct 14 05:07:34 localhost podman[103850]: 2025-10-14 09:07:34.958732094 +0000 UTC m=+0.291097830 container health_status ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, batch=17.1_20250721.1, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, container_name=nova_compute, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-nova-compute, tcib_managed=true, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, vendor=Red Hat, Inc., architecture=x86_64, release=1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a-4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', 
'/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.9, io.openshift.expose-services=, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-07-21T14:48:37) Oct 14 05:07:35 localhost podman[103850]: 2025-10-14 09:07:35.01289641 +0000 UTC m=+0.345262146 container exec_died ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, distribution-scope=public, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, container_name=nova_compute, name=rhosp17/openstack-nova-compute, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a-4d186a6228facd5bcddf9bcc145eb470'}, 
'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.9, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, build-date=2025-07-21T14:48:37, io.openshift.expose-services=, config_id=tripleo_step5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, batch=17.1_20250721.1, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, com.redhat.license_terms=https://www.redhat.com/agreements) Oct 14 05:07:35 
localhost systemd[1]: ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2.service: Deactivated successfully. Oct 14 05:07:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6. Oct 14 05:07:40 localhost systemd[1]: tmp-crun.VCveKc.mount: Deactivated successfully. Oct 14 05:07:40 localhost podman[103914]: 2025-10-14 09:07:40.720042061 +0000 UTC m=+0.064518908 container health_status 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, managed_by=tripleo_ansible, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, release=1, architecture=x86_64, io.openshift.expose-services=, vcs-type=git, build-date=2025-07-21T13:07:59, vendor=Red Hat, Inc., io.buildah.version=1.33.12, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, distribution-scope=public, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, container_name=metrics_qdr, tcib_managed=true, name=rhosp17/openstack-qdrouterd) Oct 14 05:07:40 localhost podman[103914]: 2025-10-14 09:07:40.94618237 +0000 UTC m=+0.290659147 container exec_died 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', 
'/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, tcib_managed=true, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1, container_name=metrics_qdr, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-07-21T13:07:59, managed_by=tripleo_ansible, name=rhosp17/openstack-qdrouterd, batch=17.1_20250721.1, vcs-type=git, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-qdrouterd-container, summary=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, vendor=Red Hat, Inc., architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1, io.buildah.version=1.33.12) Oct 14 05:07:40 localhost systemd[1]: 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6.service: Deactivated successfully. Oct 14 05:07:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8. Oct 14 05:07:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea. 
Oct 14 05:07:55 localhost podman[103943]: 2025-10-14 09:07:55.745565984 +0000 UTC m=+0.084841617 container health_status 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, tcib_managed=true, container_name=collectd, io.buildah.version=1.33.12, name=rhosp17/openstack-collectd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, release=2, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, io.openshift.tags=rhosp osp openstack osp-17.1, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, architecture=x86_64, batch=17.1_20250721.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, build-date=2025-07-21T13:04:03) Oct 14 05:07:55 localhost podman[103943]: 2025-10-14 09:07:55.759236847 +0000 UTC m=+0.098512520 container exec_died 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step3, version=17.1.9, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, com.redhat.component=openstack-collectd-container, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-collectd, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1, distribution-scope=public, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 collectd, release=2, architecture=x86_64, batch=17.1_20250721.1, io.buildah.version=1.33.12, vcs-type=git, build-date=2025-07-21T13:04:03, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2) Oct 14 05:07:55 localhost systemd[1]: 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8.service: Deactivated successfully. 
Oct 14 05:07:55 localhost podman[103944]: 2025-10-14 09:07:55.808963326 +0000 UTC m=+0.143344757 container health_status 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, vcs-type=git, version=17.1.9, com.redhat.component=openstack-iscsid-container, vendor=Red Hat, Inc., release=1, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-07-21T13:27:15, managed_by=tripleo_ansible, container_name=iscsid, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, io.openshift.expose-services=, name=rhosp17/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}) Oct 14 05:07:55 localhost podman[103944]: 2025-10-14 09:07:55.849074767 +0000 UTC m=+0.183456178 container exec_died 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step3, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, version=17.1.9, com.redhat.component=openstack-iscsid-container, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, io.openshift.expose-services=, architecture=x86_64, maintainer=OpenStack TripleO Team, batch=17.1_20250721.1, name=rhosp17/openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', 
'/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-type=git, build-date=2025-07-21T13:27:15, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, vendor=Red Hat, Inc., vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2) Oct 14 05:07:55 localhost systemd[1]: 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea.service: Deactivated successfully. Oct 14 05:08:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638. Oct 14 05:08:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884. Oct 14 05:08:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f. Oct 14 05:08:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e. 
Oct 14 05:08:02 localhost podman[104060]: 2025-10-14 09:08:02.758349005 +0000 UTC m=+0.091275026 container health_status d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, vcs-type=git, vendor=Red Hat, Inc., architecture=x86_64, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/agreements, tcib_managed=true, batch=17.1_20250721.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, com.redhat.component=openstack-cron-container, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, 
managed_by=tripleo_ansible, container_name=logrotate_crond, name=rhosp17/openstack-cron, release=1, io.openshift.expose-services=, version=17.1.9, summary=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, build-date=2025-07-21T13:07:52) Oct 14 05:08:02 localhost podman[104061]: 2025-10-14 09:08:02.767441547 +0000 UTC m=+0.092336199 container health_status def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, com.redhat.component=openstack-ceilometer-compute-container, name=rhosp17/openstack-ceilometer-compute, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, architecture=x86_64, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20250721.1, container_name=ceilometer_agent_compute, distribution-scope=public, build-date=2025-07-21T14:45:33, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.33.12, config_id=tripleo_step4, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, vendor=Red Hat, Inc., version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, maintainer=OpenStack TripleO Team) Oct 14 05:08:02 localhost podman[104060]: 2025-10-14 09:08:02.802963765 +0000 UTC m=+0.135889806 container exec_died d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, version=17.1.9, config_id=tripleo_step4, managed_by=tripleo_ansible, vcs-type=git, batch=17.1_20250721.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, release=1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-cron, tcib_managed=true, com.redhat.component=openstack-cron-container, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 cron, build-date=2025-07-21T13:07:52, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, io.openshift.tags=rhosp osp openstack osp-17.1, container_name=logrotate_crond, distribution-scope=public, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12) Oct 14 05:08:02 localhost systemd[1]: d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f.service: Deactivated successfully. 
Oct 14 05:08:02 localhost podman[104059]: 2025-10-14 09:08:02.824039948 +0000 UTC m=+0.157058102 container health_status 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, distribution-scope=public, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, batch=17.1_20250721.1, io.buildah.version=1.33.12, vcs-type=git, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.component=openstack-nova-compute-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, version=17.1.9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-07-21T14:48:37, tcib_managed=true, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, 
managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1, config_id=tripleo_step4, container_name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 nova-compute) Oct 14 05:08:02 localhost podman[104061]: 2025-10-14 09:08:02.832276453 +0000 UTC m=+0.157171075 container exec_died def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, container_name=ceilometer_agent_compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-compute, batch=17.1_20250721.1, vendor=Red Hat, Inc., version=17.1.9, managed_by=tripleo_ansible, io.openshift.expose-services=, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, build-date=2025-07-21T14:45:33, vcs-type=git, architecture=x86_64, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute) Oct 14 05:08:02 localhost systemd[1]: def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e.service: Deactivated successfully. Oct 14 05:08:02 localhost podman[104058]: 2025-10-14 09:08:02.801643615 +0000 UTC m=+0.134667739 container health_status 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, container_name=ceilometer_agent_ipmi, batch=17.1_20250721.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhosp17/openstack-ceilometer-ipmi, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, com.redhat.component=openstack-ceilometer-ipmi-container, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, tcib_managed=true, architecture=x86_64, build-date=2025-07-21T15:29:47, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1, version=17.1.9) Oct 14 05:08:02 localhost podman[104058]: 2025-10-14 09:08:02.909739271 +0000 UTC m=+0.242763375 container exec_died 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20250721.1, managed_by=tripleo_ansible, build-date=2025-07-21T15:29:47, io.openshift.expose-services=, release=1, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, version=17.1.9, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, tcib_managed=true, com.redhat.component=openstack-ceilometer-ipmi-container, io.buildah.version=1.33.12, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1) Oct 14 05:08:02 localhost systemd[1]: 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638.service: Deactivated successfully. 
Oct 14 05:08:03 localhost podman[104059]: 2025-10-14 09:08:03.202287645 +0000 UTC m=+0.535305819 container exec_died 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, batch=17.1_20250721.1, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, tcib_managed=true, io.buildah.version=1.33.12, 
vendor=Red Hat, Inc., config_id=tripleo_step4, vcs-type=git, com.redhat.component=openstack-nova-compute-container, architecture=x86_64, build-date=2025-07-21T14:48:37, distribution-scope=public, release=1, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, version=17.1.9, io.openshift.tags=rhosp osp openstack osp-17.1) Oct 14 05:08:03 localhost systemd[1]: 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884.service: Deactivated successfully. Oct 14 05:08:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e. Oct 14 05:08:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5. Oct 14 05:08:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2. Oct 14 05:08:05 localhost systemd[1]: tmp-crun.JOaqEv.mount: Deactivated successfully. Oct 14 05:08:05 localhost podman[104157]: 2025-10-14 09:08:05.757217023 +0000 UTC m=+0.087471318 container health_status ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, build-date=2025-07-21T14:48:37, io.buildah.version=1.33.12, io.openshift.expose-services=, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true, container_name=nova_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a-4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 
'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.9, batch=17.1_20250721.1) Oct 14 
05:08:05 localhost podman[104157]: 2025-10-14 09:08:05.781069792 +0000 UTC m=+0.111324097 container exec_died ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a-4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', 
'/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, release=1, io.openshift.expose-services=, batch=17.1_20250721.1, container_name=nova_compute, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, name=rhosp17/openstack-nova-compute, config_id=tripleo_step5, distribution-scope=public, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, vendor=Red Hat, Inc., managed_by=tripleo_ansible, version=17.1.9, build-date=2025-07-21T14:48:37, com.redhat.license_terms=https://www.redhat.com/agreements) Oct 14 05:08:05 localhost systemd[1]: ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2.service: Deactivated successfully. 
Oct 14 05:08:05 localhost podman[104156]: 2025-10-14 09:08:05.733871131 +0000 UTC m=+0.074208609 container health_status a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, distribution-scope=public, build-date=2025-07-21T13:28:44, vendor=Red Hat, Inc., architecture=x86_64, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, tcib_managed=true, release=1, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, managed_by=tripleo_ansible, batch=17.1_20250721.1, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-ovn-controller, container_name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/agreements, version=17.1.9, io.buildah.version=1.33.12, com.redhat.component=openstack-ovn-controller-container) Oct 14 05:08:05 localhost podman[104155]: 2025-10-14 09:08:05.851558933 
+0000 UTC m=+0.191816418 container health_status 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, name=rhosp17/openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, batch=17.1_20250721.1, config_id=tripleo_step4, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, vendor=Red Hat, Inc., com.redhat.component=openstack-neutron-metadata-agent-ovn-container, managed_by=tripleo_ansible, container_name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, distribution-scope=public, release=1, version=17.1.9, io.openshift.expose-services=, architecture=x86_64, build-date=2025-07-21T16:28:53, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.33.12) Oct 14 05:08:05 localhost podman[104156]: 2025-10-14 09:08:05.864536554 +0000 UTC m=+0.204874072 container exec_died a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, tcib_managed=true, distribution-scope=public, batch=17.1_20250721.1, io.openshift.tags=rhosp osp openstack osp-17.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, config_id=tripleo_step4, container_name=ovn_controller, io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.component=openstack-ovn-controller-container, build-date=2025-07-21T13:28:44, managed_by=tripleo_ansible, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': 
True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, version=17.1.9, name=rhosp17/openstack-ovn-controller, release=1, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245) Oct 14 05:08:05 localhost podman[104156]: unhealthy Oct 14 05:08:05 localhost systemd[1]: a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5.service: Main process exited, code=exited, status=1/FAILURE Oct 14 05:08:05 localhost systemd[1]: a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5.service: Failed with result 'exit-code'. Oct 14 05:08:05 localhost podman[104155]: 2025-10-14 09:08:05.897050841 +0000 UTC m=+0.237308346 container exec_died 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, architecture=x86_64, release=1, version=17.1.9, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, tcib_managed=true, build-date=2025-07-21T16:28:53, vcs-type=git, batch=17.1_20250721.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, name=rhosp17/openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12) Oct 14 05:08:05 localhost podman[104155]: unhealthy Oct 14 05:08:05 localhost systemd[1]: 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e.service: Main process exited, code=exited, status=1/FAILURE Oct 14 05:08:05 localhost systemd[1]: 
7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e.service: Failed with result 'exit-code'. Oct 14 05:08:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6. Oct 14 05:08:11 localhost podman[104217]: 2025-10-14 09:08:11.73921259 +0000 UTC m=+0.076073395 container health_status 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, distribution-scope=public, name=rhosp17/openstack-qdrouterd, vcs-type=git, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.buildah.version=1.33.12, architecture=x86_64, io.openshift.expose-services=, 
container_name=metrics_qdr, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-qdrouterd-container, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20250721.1, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, build-date=2025-07-21T13:07:59, config_id=tripleo_step1, summary=Red Hat OpenStack Platform 17.1 qdrouterd) Oct 14 05:08:11 localhost podman[104217]: 2025-10-14 09:08:11.915186547 +0000 UTC m=+0.252047392 container exec_died 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1, distribution-scope=public, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.9, com.redhat.component=openstack-qdrouterd-container, architecture=x86_64, batch=17.1_20250721.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-07-21T13:07:59, config_id=tripleo_step1, container_name=metrics_qdr, name=rhosp17/openstack-qdrouterd, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 
'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, managed_by=tripleo_ansible, tcib_managed=true, vcs-type=git) Oct 14 05:08:11 localhost systemd[1]: 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6.service: Deactivated successfully. Oct 14 05:08:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8. Oct 14 05:08:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea. Oct 14 05:08:26 localhost systemd[1]: tmp-crun.aam5V0.mount: Deactivated successfully. 
Oct 14 05:08:26 localhost podman[104248]: 2025-10-14 09:08:26.748606786 +0000 UTC m=+0.079718298 container health_status 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, batch=17.1_20250721.1, io.openshift.expose-services=, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, name=rhosp17/openstack-iscsid, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, release=1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, 
tcib_managed=true, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, com.redhat.component=openstack-iscsid-container, version=17.1.9, build-date=2025-07-21T13:27:15, io.openshift.tags=rhosp osp openstack osp-17.1) Oct 14 05:08:26 localhost podman[104248]: 2025-10-14 09:08:26.786034335 +0000 UTC m=+0.117145847 container exec_died 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-iscsid-container, tcib_managed=true, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 iscsid, release=1, container_name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, build-date=2025-07-21T13:27:15, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, version=17.1.9, architecture=x86_64, vcs-type=git, batch=17.1_20250721.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, distribution-scope=public, io.openshift.expose-services=, vendor=Red Hat, Inc.) Oct 14 05:08:26 localhost systemd[1]: 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea.service: Deactivated successfully. Oct 14 05:08:26 localhost podman[104247]: 2025-10-14 09:08:26.789947456 +0000 UTC m=+0.123338379 container health_status 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, build-date=2025-07-21T13:04:03, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, vcs-type=git, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20250721.1, release=2, com.redhat.component=openstack-collectd-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, name=rhosp17/openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, architecture=x86_64, container_name=collectd, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., version=17.1.9, config_id=tripleo_step3, tcib_managed=true) Oct 14 05:08:26 localhost podman[104247]: 2025-10-14 09:08:26.875329938 +0000 UTC m=+0.208720851 container exec_died 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, architecture=x86_64, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': 
'512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, description=Red Hat OpenStack Platform 17.1 collectd, release=2, vendor=Red Hat, Inc., distribution-scope=public, version=17.1.9, com.redhat.component=openstack-collectd-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, batch=17.1_20250721.1, tcib_managed=true, maintainer=OpenStack TripleO Team, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-07-21T13:04:03, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step3, container_name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, name=rhosp17/openstack-collectd) Oct 14 05:08:26 localhost systemd[1]: 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8.service: Deactivated 
successfully. Oct 14 05:08:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638. Oct 14 05:08:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884. Oct 14 05:08:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f. Oct 14 05:08:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e. Oct 14 05:08:33 localhost podman[104286]: 2025-10-14 09:08:33.76463103 +0000 UTC m=+0.096560640 container health_status 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, container_name=ceilometer_agent_ipmi, distribution-scope=public, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, version=17.1.9, release=1, io.buildah.version=1.33.12, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, architecture=x86_64, batch=17.1_20250721.1, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-07-21T15:29:47, managed_by=tripleo_ansible, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/agreements) Oct 14 05:08:33 localhost podman[104286]: 2025-10-14 09:08:33.833351606 +0000 UTC m=+0.165281176 container exec_died 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, tcib_managed=true, vcs-type=git, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.buildah.version=1.33.12, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, distribution-scope=public, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-ipmi, version=17.1.9, build-date=2025-07-21T15:29:47, managed_by=tripleo_ansible, batch=17.1_20250721.1, 
config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=ceilometer_agent_ipmi, release=1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4) Oct 14 05:08:33 localhost systemd[1]: 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638.service: Deactivated successfully. 
Oct 14 05:08:33 localhost podman[104288]: 2025-10-14 09:08:33.835144882 +0000 UTC m=+0.158505027 container health_status d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, distribution-scope=public, vcs-type=git, batch=17.1_20250721.1, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., version=17.1.9, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, tcib_managed=true, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, container_name=logrotate_crond, release=1, name=rhosp17/openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, 
vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, com.redhat.component=openstack-cron-container, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-07-21T13:07:52) Oct 14 05:08:33 localhost podman[104287]: 2025-10-14 09:08:33.888950107 +0000 UTC m=+0.215004506 container health_status 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, maintainer=OpenStack TripleO Team, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, config_id=tripleo_step4, distribution-scope=public, build-date=2025-07-21T14:48:37, container_name=nova_migration_target, io.openshift.expose-services=, vendor=Red Hat, Inc., architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1, version=17.1.9, name=rhosp17/openstack-nova-compute, vcs-type=git, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, release=1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}) Oct 14 05:08:33 localhost podman[104288]: 2025-10-14 09:08:33.914645252 +0000 UTC m=+0.238005417 container exec_died d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, distribution-scope=public, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers:/var/log/containers:z']}, container_name=logrotate_crond, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, build-date=2025-07-21T13:07:52, maintainer=OpenStack TripleO Team, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, managed_by=tripleo_ansible, version=17.1.9, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20250721.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, architecture=x86_64, name=rhosp17/openstack-cron) Oct 14 05:08:33 localhost systemd[1]: d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f.service: Deactivated successfully. 
Oct 14 05:08:33 localhost podman[104294]: 2025-10-14 09:08:33.935424856 +0000 UTC m=+0.252253578 container health_status def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1, managed_by=tripleo_ansible, container_name=ceilometer_agent_compute, com.redhat.component=openstack-ceilometer-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, name=rhosp17/openstack-ceilometer-compute, architecture=x86_64, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, 
vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.9, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T14:45:33, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=) Oct 14 05:08:33 localhost podman[104294]: 2025-10-14 09:08:33.993228724 +0000 UTC m=+0.310057456 container exec_died def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, vcs-type=git, batch=17.1_20250721.1, tcib_managed=true, vendor=Red Hat, Inc., build-date=2025-07-21T14:45:33, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, io.buildah.version=1.33.12, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-ceilometer-compute, release=1, version=17.1.9, com.redhat.component=openstack-ceilometer-compute-container, com.redhat.license_terms=https://www.redhat.com/agreements, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, distribution-scope=public) Oct 14 05:08:34 localhost systemd[1]: def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e.service: Deactivated successfully. 
Oct 14 05:08:34 localhost podman[104287]: 2025-10-14 09:08:34.265217973 +0000 UTC m=+0.591272422 container exec_died 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, distribution-scope=public, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=nova_migration_target, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, tcib_managed=true, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1, 
build-date=2025-07-21T14:48:37, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, batch=17.1_20250721.1, version=17.1.9, name=rhosp17/openstack-nova-compute, vendor=Red Hat, Inc., config_id=tripleo_step4) Oct 14 05:08:34 localhost systemd[1]: 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884.service: Deactivated successfully. Oct 14 05:08:34 localhost systemd[1]: tmp-crun.i26dIa.mount: Deactivated successfully. Oct 14 05:08:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e. Oct 14 05:08:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5. Oct 14 05:08:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2. 
Oct 14 05:08:36 localhost podman[104380]: 2025-10-14 09:08:36.746333447 +0000 UTC m=+0.080513713 container health_status 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, io.openshift.expose-services=, build-date=2025-07-21T16:28:53, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, container_name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, release=1, config_id=tripleo_step4, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, distribution-scope=public, vcs-type=git, version=17.1.9, tcib_managed=true, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20250721.1) Oct 14 05:08:36 localhost podman[104380]: 2025-10-14 09:08:36.792030131 +0000 UTC m=+0.126210407 container exec_died 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, build-date=2025-07-21T16:28:53, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, distribution-scope=public, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, io.buildah.version=1.33.12, managed_by=tripleo_ansible, release=1, batch=17.1_20250721.1, config_id=tripleo_step4, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, container_name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1, version=17.1.9, tcib_managed=true) Oct 14 05:08:36 localhost podman[104380]: unhealthy Oct 14 05:08:36 localhost podman[104382]: 2025-10-14 09:08:36.809224804 +0000 UTC m=+0.134242305 container health_status ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, 
com.redhat.component=openstack-nova-compute-container, container_name=nova_compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.33.12, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, release=1, batch=17.1_20250721.1, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a-4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', 
'/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, version=17.1.9, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-07-21T14:48:37, maintainer=OpenStack TripleO Team, architecture=x86_64) Oct 14 05:08:36 localhost systemd[1]: 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e.service: Main process exited, code=exited, status=1/FAILURE Oct 14 05:08:36 localhost systemd[1]: 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e.service: Failed with result 'exit-code'. Oct 14 05:08:36 localhost podman[104382]: 2025-10-14 09:08:36.862028178 +0000 UTC m=+0.187045599 container exec_died ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.9, io.openshift.expose-services=, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, config_id=tripleo_step5, container_name=nova_compute, com.redhat.component=openstack-nova-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a-4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, architecture=x86_64, name=rhosp17/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-07-21T14:48:37, batch=17.1_20250721.1, io.buildah.version=1.33.12, vendor=Red Hat, Inc., vcs-type=git, managed_by=tripleo_ansible, release=1) Oct 14 05:08:36 localhost systemd[1]: ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2.service: 
Deactivated successfully. Oct 14 05:08:36 localhost podman[104381]: 2025-10-14 09:08:36.862795212 +0000 UTC m=+0.191233050 container health_status a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, container_name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, config_id=tripleo_step4, tcib_managed=true, maintainer=OpenStack TripleO Team, build-date=2025-07-21T13:28:44, com.redhat.component=openstack-ovn-controller-container, io.buildah.version=1.33.12, description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20250721.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, name=rhosp17/openstack-ovn-controller, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, version=17.1.9, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, architecture=x86_64, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller) Oct 14 05:08:36 localhost podman[104381]: 
2025-10-14 09:08:36.942343184 +0000 UTC m=+0.270781072 container exec_died a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, config_id=tripleo_step4, name=rhosp17/openstack-ovn-controller, architecture=x86_64, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, vcs-type=git, com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-07-21T13:28:44, batch=17.1_20250721.1, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, release=1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, io.openshift.expose-services=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, container_name=ovn_controller, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, vendor=Red Hat, Inc., version=17.1.9) Oct 14 05:08:36 localhost podman[104381]: unhealthy Oct 14 05:08:36 localhost systemd[1]: 
a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5.service: Main process exited, code=exited, status=1/FAILURE Oct 14 05:08:36 localhost systemd[1]: a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5.service: Failed with result 'exit-code'. Oct 14 05:08:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6. Oct 14 05:08:42 localhost systemd[1]: tmp-crun.K8kCKP.mount: Deactivated successfully. Oct 14 05:08:42 localhost podman[104447]: 2025-10-14 09:08:42.751018007 +0000 UTC m=+0.090889275 container health_status 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, version=17.1.9, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=metrics_qdr, distribution-scope=public, com.redhat.component=openstack-qdrouterd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20250721.1, build-date=2025-07-21T13:07:59, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, name=rhosp17/openstack-qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, tcib_managed=true, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, io.buildah.version=1.33.12, vendor=Red Hat, Inc., config_id=tripleo_step1) Oct 14 05:08:42 localhost podman[104447]: 2025-10-14 09:08:42.950078758 +0000 UTC m=+0.289949966 container exec_died 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, config_id=tripleo_step1, tcib_managed=true, build-date=2025-07-21T13:07:59, container_name=metrics_qdr, name=rhosp17/openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, io.openshift.expose-services=, 
config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, release=1, description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.9) Oct 14 05:08:42 localhost systemd[1]: 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6.service: Deactivated successfully. Oct 14 05:08:50 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Oct 14 05:08:50 localhost recover_tripleo_nova_virtqemud[104478]: 62551 Oct 14 05:08:50 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Oct 14 05:08:50 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. Oct 14 05:08:54 localhost sshd[104479]: main: sshd: ssh-rsa algorithm is disabled Oct 14 05:08:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8. 
Oct 14 05:08:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea. Oct 14 05:08:57 localhost systemd[1]: tmp-crun.lcMiNW.mount: Deactivated successfully. Oct 14 05:08:57 localhost podman[104481]: 2025-10-14 09:08:57.771546274 +0000 UTC m=+0.105236138 container health_status 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, release=2, com.redhat.component=openstack-collectd-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, vcs-type=git, io.buildah.version=1.33.12, tcib_managed=true, distribution-scope=public, build-date=2025-07-21T13:04:03, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, config_id=tripleo_step3, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1, description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20250721.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, version=17.1.9) Oct 14 05:08:57 localhost podman[104482]: 2025-10-14 09:08:57.804861195 +0000 UTC m=+0.138551089 container health_status 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, com.redhat.component=openstack-iscsid-container, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, architecture=x86_64, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, build-date=2025-07-21T13:27:15, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vendor=Red Hat, Inc., io.buildah.version=1.33.12, config_id=tripleo_step3, release=1, distribution-scope=public, name=rhosp17/openstack-iscsid, container_name=iscsid, batch=17.1_20250721.1, tcib_managed=true, io.openshift.expose-services=) Oct 14 05:08:57 localhost podman[104482]: 2025-10-14 09:08:57.813945406 +0000 UTC m=+0.147635310 container exec_died 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20250721.1, io.openshift.tags=rhosp osp openstack osp-17.1, io.openshift.expose-services=, distribution-scope=public, name=rhosp17/openstack-iscsid, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, release=1, build-date=2025-07-21T13:27:15, com.redhat.component=openstack-iscsid-container, container_name=iscsid, version=17.1.9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, config_id=tripleo_step3, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, vendor=Red Hat, Inc., vcs-type=git, architecture=x86_64) Oct 14 05:08:57 localhost systemd[1]: 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea.service: Deactivated successfully. 
Oct 14 05:08:57 localhost podman[104481]: 2025-10-14 09:08:57.832525922 +0000 UTC m=+0.166215756 container exec_died 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, architecture=x86_64, managed_by=tripleo_ansible, container_name=collectd, name=rhosp17/openstack-collectd, release=2, com.redhat.component=openstack-collectd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, version=17.1.9, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1, io.openshift.expose-services=, vcs-type=git, batch=17.1_20250721.1, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-07-21T13:04:03, tcib_managed=true, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', 
'/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, vendor=Red Hat, Inc.) Oct 14 05:08:57 localhost systemd[1]: 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8.service: Deactivated successfully. Oct 14 05:09:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638. Oct 14 05:09:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884. Oct 14 05:09:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f. Oct 14 05:09:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e. Oct 14 05:09:04 localhost systemd[1]: tmp-crun.Ymr4Ve.mount: Deactivated successfully. 
Oct 14 05:09:04 localhost podman[104595]: 2025-10-14 09:09:04.745757681 +0000 UTC m=+0.085697464 container health_status 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, com.redhat.component=openstack-ceilometer-ipmi-container, vendor=Red Hat, Inc., distribution-scope=public, managed_by=tripleo_ansible, architecture=x86_64, name=rhosp17/openstack-ceilometer-ipmi, vcs-type=git, config_id=tripleo_step4, io.buildah.version=1.33.12, release=1, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, version=17.1.9, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, build-date=2025-07-21T15:29:47, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, batch=17.1_20250721.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_ipmi) Oct 14 05:09:04 localhost podman[104596]: 2025-10-14 09:09:04.716038011 +0000 UTC m=+0.057444829 container health_status 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, release=1, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, 
tcib_managed=true, batch=17.1_20250721.1, io.buildah.version=1.33.12, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, build-date=2025-07-21T14:48:37, version=17.1.9, name=rhosp17/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, container_name=nova_migration_target, distribution-scope=public, vendor=Red Hat, Inc.) Oct 14 05:09:04 localhost systemd[1]: tmp-crun.oC3AgV.mount: Deactivated successfully. Oct 14 05:09:04 localhost podman[104595]: 2025-10-14 09:09:04.79485998 +0000 UTC m=+0.134799743 container exec_died 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, batch=17.1_20250721.1, name=rhosp17/openstack-ceilometer-ipmi, build-date=2025-07-21T15:29:47, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=ceilometer_agent_ipmi, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.33.12, release=1, com.redhat.component=openstack-ceilometer-ipmi-container, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, version=17.1.9, vcs-type=git, tcib_managed=true, io.openshift.expose-services=, distribution-scope=public, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 
'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1) Oct 14 05:09:04 localhost systemd[1]: 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638.service: Deactivated successfully. 
Oct 14 05:09:04 localhost podman[104597]: 2025-10-14 09:09:04.808158002 +0000 UTC m=+0.143802141 container health_status d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, release=1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, com.redhat.component=openstack-cron-container, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, vendor=Red Hat, Inc., version=17.1.9, description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, distribution-scope=public, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, container_name=logrotate_crond, batch=17.1_20250721.1, 
build-date=2025-07-21T13:07:52, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhosp17/openstack-cron, architecture=x86_64, config_id=tripleo_step4, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1) Oct 14 05:09:04 localhost podman[104597]: 2025-10-14 09:09:04.81808006 +0000 UTC m=+0.153724209 container exec_died d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, container_name=logrotate_crond, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, release=1, summary=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, batch=17.1_20250721.1, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, version=17.1.9, description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, distribution-scope=public, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, config_id=tripleo_step4, io.openshift.expose-services=, vendor=Red Hat, Inc., build-date=2025-07-21T13:07:52, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, com.redhat.component=openstack-cron-container) Oct 14 05:09:04 localhost systemd[1]: d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f.service: Deactivated successfully. Oct 14 05:09:04 localhost podman[104598]: 2025-10-14 09:09:04.785529472 +0000 UTC m=+0.119637284 container health_status def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, container_name=ceilometer_agent_compute, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, version=17.1.9, build-date=2025-07-21T14:45:33, release=1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, distribution-scope=public, managed_by=tripleo_ansible, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-compute-container, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhosp17/openstack-ceilometer-compute, batch=17.1_20250721.1, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, config_id=tripleo_step4, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat 
OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}) Oct 14 05:09:04 localhost podman[104598]: 2025-10-14 09:09:04.869181141 +0000 UTC m=+0.203288973 container exec_died def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, com.redhat.component=openstack-ceilometer-compute-container, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-07-21T14:45:33, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, 
vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, io.openshift.expose-services=, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhosp17/openstack-ceilometer-compute, batch=17.1_20250721.1, tcib_managed=true, config_id=tripleo_step4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.9, distribution-scope=public, release=1, container_name=ceilometer_agent_compute, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1) Oct 14 05:09:04 localhost systemd[1]: def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e.service: Deactivated successfully. 
Oct 14 05:09:05 localhost podman[104596]: 2025-10-14 09:09:05.074290179 +0000 UTC m=+0.415697007 container exec_died 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vcs-type=git, build-date=2025-07-21T14:48:37, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_migration_target, name=rhosp17/openstack-nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., release=1, config_id=tripleo_step4, managed_by=tripleo_ansible, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, version=17.1.9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, batch=17.1_20250721.1, architecture=x86_64, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack 
Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.33.12, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=) Oct 14 05:09:05 localhost systemd[1]: 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884.service: Deactivated successfully. Oct 14 05:09:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e. Oct 14 05:09:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5. Oct 14 05:09:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2. Oct 14 05:09:07 localhost podman[104689]: 2025-10-14 09:09:07.75628498 +0000 UTC m=+0.088876572 container health_status ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, container_name=nova_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a-4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, architecture=x86_64, build-date=2025-07-21T14:48:37, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, release=1, distribution-scope=public, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, io.buildah.version=1.33.12, name=rhosp17/openstack-nova-compute, vendor=Red Hat, Inc., batch=17.1_20250721.1, config_id=tripleo_step5, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1) Oct 14 05:09:07 localhost podman[104688]: 
2025-10-14 09:09:07.802464929 +0000 UTC m=+0.139261141 container health_status a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, build-date=2025-07-21T13:28:44, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, vcs-type=git, batch=17.1_20250721.1, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, tcib_managed=true, release=1, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-ovn-controller, architecture=x86_64, version=17.1.9, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245) Oct 14 05:09:07 localhost podman[104688]: 2025-10-14 09:09:07.817202955 +0000 UTC m=+0.153999147 container exec_died 
a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, vcs-type=git, batch=17.1_20250721.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, name=rhosp17/openstack-ovn-controller, release=1, version=17.1.9, tcib_managed=true, config_id=tripleo_step4, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, build-date=2025-07-21T13:28:44, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, container_name=ovn_controller, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.component=openstack-ovn-controller-container, io.openshift.expose-services=) Oct 14 05:09:07 localhost podman[104688]: unhealthy Oct 14 05:09:07 localhost systemd[1]: a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5.service: Main process exited, code=exited, status=1/FAILURE Oct 14 
05:09:07 localhost systemd[1]: a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5.service: Failed with result 'exit-code'. Oct 14 05:09:07 localhost podman[104689]: 2025-10-14 09:09:07.833094857 +0000 UTC m=+0.165686439 container exec_died ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, tcib_managed=true, io.buildah.version=1.33.12, vcs-type=git, release=1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=nova_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a-4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, vendor=Red Hat, Inc., managed_by=tripleo_ansible, architecture=x86_64, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, batch=17.1_20250721.1, build-date=2025-07-21T14:48:37, config_id=tripleo_step5, distribution-scope=public, version=17.1.9) Oct 14 05:09:07 localhost systemd[1]: ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2.service: Deactivated successfully. 
Oct 14 05:09:07 localhost podman[104687]: 2025-10-14 09:09:07.900636228 +0000 UTC m=+0.241383942 container health_status 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, tcib_managed=true, container_name=ovn_metadata_agent, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, build-date=2025-07-21T16:28:53, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.expose-services=, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, architecture=x86_64, maintainer=OpenStack TripleO Team, vcs-type=git, batch=17.1_20250721.1, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, release=1, version=17.1.9, vendor=Red Hat, Inc.) Oct 14 05:09:07 localhost podman[104687]: 2025-10-14 09:09:07.913543617 +0000 UTC m=+0.254291311 container exec_died 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, container_name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 
'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-type=git, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-07-21T16:28:53, batch=17.1_20250721.1, io.openshift.expose-services=, version=17.1.9, io.buildah.version=1.33.12, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, maintainer=OpenStack TripleO Team, release=1, tcib_managed=true) Oct 14 05:09:07 localhost podman[104687]: unhealthy Oct 14 05:09:07 localhost systemd[1]: 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e.service: Main process exited, code=exited, status=1/FAILURE Oct 14 05:09:07 localhost systemd[1]: 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e.service: Failed with result 'exit-code'. 
Oct 14 05:09:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6. Oct 14 05:09:13 localhost podman[104753]: 2025-10-14 09:09:13.753046974 +0000 UTC m=+0.094208826 container health_status 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step1, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vendor=Red Hat, Inc., release=1, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20250721.1, com.redhat.component=openstack-qdrouterd-container, architecture=x86_64, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-qdrouterd, 
managed_by=tripleo_ansible, version=17.1.9, container_name=metrics_qdr, build-date=2025-07-21T13:07:59, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed) Oct 14 05:09:14 localhost podman[104753]: 2025-10-14 09:09:14.002947488 +0000 UTC m=+0.344109210 container exec_died 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', 
'/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, vendor=Red Hat, Inc., build-date=2025-07-21T13:07:59, container_name=metrics_qdr, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, io.buildah.version=1.33.12, tcib_managed=true, vcs-type=git, managed_by=tripleo_ansible, version=17.1.9, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, release=1, name=rhosp17/openstack-qdrouterd, batch=17.1_20250721.1, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 qdrouterd) Oct 14 05:09:14 localhost systemd[1]: 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6.service: Deactivated successfully. Oct 14 05:09:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8. Oct 14 05:09:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea. 
Oct 14 05:09:28 localhost podman[104783]: 2025-10-14 09:09:28.726457018 +0000 UTC m=+0.068234803 container health_status 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, distribution-scope=public, batch=17.1_20250721.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, container_name=collectd, managed_by=tripleo_ansible, release=2, tcib_managed=true, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', 
'/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, com.redhat.component=openstack-collectd-container, io.buildah.version=1.33.12, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-07-21T13:04:03, version=17.1.9, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc., config_id=tripleo_step3, vcs-type=git) Oct 14 05:09:28 localhost systemd[1]: tmp-crun.6YAWjk.mount: Deactivated successfully. Oct 14 05:09:28 localhost podman[104783]: 2025-10-14 09:09:28.745185158 +0000 UTC m=+0.086962823 container exec_died 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, config_id=tripleo_step3, managed_by=tripleo_ansible, release=2, tcib_managed=true, version=17.1.9, architecture=x86_64, io.openshift.expose-services=, com.redhat.component=openstack-collectd-container, distribution-scope=public, io.buildah.version=1.33.12, batch=17.1_20250721.1, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc., vcs-type=git, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-07-21T13:04:03) Oct 14 05:09:28 localhost systemd[1]: 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8.service: Deactivated successfully. 
Oct 14 05:09:28 localhost podman[104784]: 2025-10-14 09:09:28.746470667 +0000 UTC m=+0.084228358 container health_status 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-iscsid, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, distribution-scope=public, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, vendor=Red Hat, Inc., architecture=x86_64, build-date=2025-07-21T13:27:15, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20250721.1, container_name=iscsid, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, version=17.1.9, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, summary=Red Hat OpenStack Platform 17.1 iscsid) Oct 14 05:09:28 localhost podman[104784]: 2025-10-14 09:09:28.830522849 +0000 UTC m=+0.168280610 container exec_died 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, container_name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, version=17.1.9, com.redhat.component=openstack-iscsid-container, managed_by=tripleo_ansible, vcs-type=git, batch=17.1_20250721.1, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', 
'/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-07-21T13:27:15, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, distribution-scope=public, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12) Oct 14 05:09:28 localhost systemd[1]: 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea.service: Deactivated successfully. Oct 14 05:09:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638. Oct 14 05:09:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884. Oct 14 05:09:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f. Oct 14 05:09:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e. Oct 14 05:09:35 localhost systemd[1]: tmp-crun.J0YDwB.mount: Deactivated successfully. 
Oct 14 05:09:35 localhost podman[104825]: 2025-10-14 09:09:35.746024049 +0000 UTC m=+0.081007038 container health_status def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, version=17.1.9, io.buildah.version=1.33.12, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, container_name=ceilometer_agent_compute, name=rhosp17/openstack-ceilometer-compute, architecture=x86_64, build-date=2025-07-21T14:45:33, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, release=1, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc., 
io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20250721.1, config_id=tripleo_step4, com.redhat.component=openstack-ceilometer-compute-container, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, distribution-scope=public) Oct 14 05:09:35 localhost systemd[1]: tmp-crun.R1xXWh.mount: Deactivated successfully. Oct 14 05:09:35 localhost podman[104823]: 2025-10-14 09:09:35.796008366 +0000 UTC m=+0.134432781 container health_status 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, release=1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, vendor=Red Hat, Inc., distribution-scope=public, vcs-type=git, build-date=2025-07-21T14:48:37, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20250721.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, maintainer=OpenStack TripleO Team, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, container_name=nova_migration_target, name=rhosp17/openstack-nova-compute) Oct 14 05:09:35 localhost podman[104824]: 2025-10-14 09:09:35.770824887 +0000 UTC m=+0.105276510 container health_status d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, version=17.1.9, batch=17.1_20250721.1, container_name=logrotate_crond, com.redhat.component=openstack-cron-container, io.buildah.version=1.33.12, vendor=Red Hat, Inc., release=1, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, config_id=tripleo_step4, 
managed_by=tripleo_ansible, distribution-scope=public, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, maintainer=OpenStack TripleO Team, build-date=2025-07-21T13:07:52) Oct 14 05:09:35 localhost podman[104824]: 2025-10-14 09:09:35.854027271 +0000 UTC m=+0.188478884 container exec_died d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, summary=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, batch=17.1_20250721.1, container_name=logrotate_crond, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, version=17.1.9, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, distribution-scope=public, release=1, io.openshift.expose-services=, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.component=openstack-cron-container, maintainer=OpenStack TripleO Team, vcs-type=git, build-date=2025-07-21T13:07:52) Oct 14 05:09:35 localhost systemd[1]: d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f.service: Deactivated successfully. 
Oct 14 05:09:35 localhost podman[104825]: 2025-10-14 09:09:35.870282325 +0000 UTC m=+0.205265334 container exec_died def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., io.openshift.expose-services=, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20250721.1, config_id=tripleo_step4, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, build-date=2025-07-21T14:45:33, com.redhat.component=openstack-ceilometer-compute-container, tcib_managed=true, distribution-scope=public, release=1, container_name=ceilometer_agent_compute, io.openshift.tags=rhosp osp openstack osp-17.1, io.buildah.version=1.33.12, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.9, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp17/openstack-ceilometer-compute) Oct 14 05:09:35 localhost systemd[1]: def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e.service: Deactivated successfully. Oct 14 05:09:35 localhost podman[104822]: 2025-10-14 09:09:35.909397225 +0000 UTC m=+0.249135022 container health_status 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git, tcib_managed=true, io.openshift.expose-services=, version=17.1.9, 
container_name=ceilometer_agent_ipmi, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, architecture=x86_64, distribution-scope=public, managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-ipmi, io.buildah.version=1.33.12, release=1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1, batch=17.1_20250721.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-07-21T15:29:47) Oct 14 05:09:35 localhost podman[104822]: 2025-10-14 09:09:35.969491926 +0000 UTC m=+0.309229713 container exec_died 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, version=17.1.9, container_name=ceilometer_agent_ipmi, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T15:29:47, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, com.redhat.component=openstack-ceilometer-ipmi-container, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, architecture=x86_64, io.buildah.version=1.33.12, config_id=tripleo_step4, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, 
config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1, vendor=Red Hat, Inc., io.openshift.expose-services=, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20250721.1) Oct 14 05:09:35 localhost systemd[1]: 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638.service: Deactivated successfully. 
Oct 14 05:09:36 localhost podman[104823]: 2025-10-14 09:09:36.174156 +0000 UTC m=+0.512580485 container exec_died 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, release=1, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T14:48:37, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, vendor=Red Hat, Inc., container_name=nova_migration_target, vcs-type=git, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, version=17.1.9, name=rhosp17/openstack-nova-compute, batch=17.1_20250721.1, config_id=tripleo_step4, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team, distribution-scope=public) Oct 14 05:09:36 localhost systemd[1]: 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884.service: Deactivated successfully. Oct 14 05:09:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e. Oct 14 05:09:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5. Oct 14 05:09:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2. Oct 14 05:09:38 localhost systemd[1]: tmp-crun.63foeg.mount: Deactivated successfully. 
Oct 14 05:09:38 localhost podman[104915]: 2025-10-14 09:09:38.752283185 +0000 UTC m=+0.089782250 container health_status 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, batch=17.1_20250721.1, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-07-21T16:28:53, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, managed_by=tripleo_ansible, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1, container_name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, distribution-scope=public, release=1, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, version=17.1.9, io.buildah.version=1.33.12, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, config_id=tripleo_step4) Oct 14 05:09:38 localhost systemd[1]: tmp-crun.CnHNUN.mount: Deactivated successfully. Oct 14 05:09:38 localhost podman[104916]: 2025-10-14 09:09:38.767136254 +0000 UTC m=+0.098049625 container health_status a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, io.openshift.expose-services=, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, build-date=2025-07-21T13:28:44, name=rhosp17/openstack-ovn-controller, managed_by=tripleo_ansible, release=1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, container_name=ovn_controller, distribution-scope=public, batch=17.1_20250721.1, io.buildah.version=1.33.12, config_id=tripleo_step4, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 
'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=openstack-ovn-controller-container) Oct 14 05:09:38 localhost podman[104915]: 2025-10-14 09:09:38.795127941 +0000 UTC m=+0.132627006 container exec_died 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.9, batch=17.1_20250721.1, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-type=git, build-date=2025-07-21T16:28:53, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, tcib_managed=true, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, config_id=tripleo_step4, architecture=x86_64, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, container_name=ovn_metadata_agent, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Oct 14 05:09:38 localhost podman[104915]: unhealthy Oct 14 05:09:38 localhost systemd[1]: 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e.service: Main process exited, code=exited, status=1/FAILURE Oct 14 05:09:38 localhost systemd[1]: 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e.service: 
Failed with result 'exit-code'. Oct 14 05:09:38 localhost podman[104916]: 2025-10-14 09:09:38.806165042 +0000 UTC m=+0.137078423 container exec_died a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, io.openshift.tags=rhosp osp openstack osp-17.1, batch=17.1_20250721.1, build-date=2025-07-21T13:28:44, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, config_id=tripleo_step4, vcs-type=git, com.redhat.component=openstack-ovn-controller-container, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, io.openshift.expose-services=, container_name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team, version=17.1.9, distribution-scope=public, architecture=x86_64, managed_by=tripleo_ansible, release=1, summary=Red Hat OpenStack Platform 17.1 ovn-controller) Oct 14 05:09:38 localhost podman[104916]: unhealthy Oct 14 05:09:38 
localhost podman[104917]: 2025-10-14 09:09:38.817167653 +0000 UTC m=+0.143993147 container health_status ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, build-date=2025-07-21T14:48:37, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, name=rhosp17/openstack-nova-compute, vendor=Red Hat, Inc., architecture=x86_64, maintainer=OpenStack TripleO Team, release=1, distribution-scope=public, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1, config_id=tripleo_step5, container_name=nova_compute, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, version=17.1.9, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a-4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute) Oct 14 05:09:38 localhost systemd[1]: a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5.service: Main process exited, code=exited, status=1/FAILURE Oct 14 05:09:38 localhost systemd[1]: a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5.service: Failed with result 'exit-code'. 
Oct 14 05:09:38 localhost podman[104917]: 2025-10-14 09:09:38.868248814 +0000 UTC m=+0.195074288 container exec_died ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, release=1, io.openshift.tags=rhosp osp openstack osp-17.1, version=17.1.9, distribution-scope=public, managed_by=tripleo_ansible, tcib_managed=true, build-date=2025-07-21T14:48:37, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, vcs-type=git, batch=17.1_20250721.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a-4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, container_name=nova_compute, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, vendor=Red Hat, Inc.) Oct 14 05:09:38 localhost systemd[1]: ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2.service: Deactivated successfully. Oct 14 05:09:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6. 
Oct 14 05:09:44 localhost podman[104976]: 2025-10-14 09:09:44.730567218 +0000 UTC m=+0.068086138 container health_status 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, config_id=tripleo_step1, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, managed_by=tripleo_ansible, batch=17.1_20250721.1, container_name=metrics_qdr, build-date=2025-07-21T13:07:59, com.redhat.license_terms=https://www.redhat.com/agreements, release=1, version=17.1.9, architecture=x86_64, name=rhosp17/openstack-qdrouterd, tcib_managed=true, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 
qdrouterd, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, com.redhat.component=openstack-qdrouterd-container, io.openshift.tags=rhosp osp openstack osp-17.1) Oct 14 05:09:44 localhost podman[104976]: 2025-10-14 09:09:44.897414293 +0000 UTC m=+0.234933223 container exec_died 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true, build-date=2025-07-21T13:07:59, release=1, vcs-type=git, vendor=Red Hat, Inc., architecture=x86_64, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, batch=17.1_20250721.1, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhosp17/openstack-qdrouterd, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, version=17.1.9) Oct 14 05:09:44 localhost systemd[1]: 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6.service: Deactivated successfully. Oct 14 05:09:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8. Oct 14 05:09:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea. Oct 14 05:09:59 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Oct 14 05:09:59 localhost recover_tripleo_nova_virtqemud[105018]: 62551 Oct 14 05:09:59 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Oct 14 05:09:59 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. 
Oct 14 05:09:59 localhost podman[105007]: 2025-10-14 09:09:59.756341778 +0000 UTC m=+0.092938188 container health_status 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, io.openshift.expose-services=, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, release=1, version=17.1.9, managed_by=tripleo_ansible, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, vendor=Red Hat, Inc., build-date=2025-07-21T13:27:15, container_name=iscsid, vcs-type=git) Oct 14 05:09:59 localhost podman[105007]: 2025-10-14 09:09:59.770096953 +0000 UTC m=+0.106693353 container exec_died 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, tcib_managed=true, config_id=tripleo_step3, version=17.1.9, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-iscsid, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, release=1, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, container_name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T13:27:15, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, com.redhat.component=openstack-iscsid-container, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/agreements, batch=17.1_20250721.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64) Oct 14 05:09:59 localhost systemd[1]: 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea.service: Deactivated successfully. Oct 14 05:09:59 localhost podman[105006]: 2025-10-14 09:09:59.732237261 +0000 UTC m=+0.074689133 container health_status 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, release=2, io.buildah.version=1.33.12, tcib_managed=true, com.redhat.component=openstack-collectd-container, managed_by=tripleo_ansible, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, container_name=collectd, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-collectd, build-date=2025-07-21T13:04:03, description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, version=17.1.9, distribution-scope=public, config_data={'cap_add': ['IPC_LOCK'], 'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-type=git, batch=17.1_20250721.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd) Oct 14 05:09:59 localhost podman[105006]: 2025-10-14 09:09:59.818440399 +0000 UTC m=+0.160892311 container exec_died 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1, release=2, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, 
config_id=tripleo_step3, distribution-scope=public, io.buildah.version=1.33.12, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, version=17.1.9, maintainer=OpenStack TripleO Team, tcib_managed=true, build-date=2025-07-21T13:04:03, vcs-type=git, batch=17.1_20250721.1, io.openshift.expose-services=, vendor=Red Hat, Inc., name=rhosp17/openstack-collectd, com.redhat.component=openstack-collectd-container, summary=Red Hat OpenStack Platform 17.1 collectd) Oct 14 05:09:59 localhost systemd[1]: 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8.service: 
Deactivated successfully. Oct 14 05:10:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638. Oct 14 05:10:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884. Oct 14 05:10:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f. Oct 14 05:10:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e. Oct 14 05:10:06 localhost podman[105121]: 2025-10-14 09:10:06.743516278 +0000 UTC m=+0.074657372 container health_status d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, batch=17.1_20250721.1, version=17.1.9, com.redhat.component=openstack-cron-container, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, container_name=logrotate_crond, release=1, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, managed_by=tripleo_ansible, architecture=x86_64, build-date=2025-07-21T13:07:52, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-cron, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1) Oct 14 05:10:06 localhost podman[105119]: 2025-10-14 09:10:06.767919863 +0000 UTC m=+0.102023549 container health_status 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, tcib_managed=true, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-ipmi-container, release=1, io.openshift.expose-services=, build-date=2025-07-21T15:29:47, io.buildah.version=1.33.12, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, vcs-type=git, container_name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, version=17.1.9, architecture=x86_64, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi) Oct 14 05:10:06 localhost podman[105120]: 2025-10-14 09:10:06.80529289 +0000 UTC m=+0.135644979 container health_status 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=nova_migration_target, version=17.1.9, build-date=2025-07-21T14:48:37, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 
'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, maintainer=OpenStack TripleO Team, release=1, name=rhosp17/openstack-nova-compute, vcs-type=git, distribution-scope=public, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, batch=17.1_20250721.1, io.buildah.version=1.33.12, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1, io.openshift.expose-services=) Oct 14 05:10:06 localhost podman[105119]: 2025-10-14 09:10:06.822669437 +0000 UTC m=+0.156773133 container exec_died 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, vendor=Red Hat, Inc., version=17.1.9, release=1, build-date=2025-07-21T15:29:47, io.buildah.version=1.33.12, 
io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, tcib_managed=true, container_name=ceilometer_agent_ipmi, distribution-scope=public, batch=17.1_20250721.1, com.redhat.component=openstack-ceilometer-ipmi-container, config_id=tripleo_step4, architecture=x86_64, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=) Oct 14 05:10:06 localhost systemd[1]: 
07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638.service: Deactivated successfully. Oct 14 05:10:06 localhost podman[105122]: 2025-10-14 09:10:06.870213409 +0000 UTC m=+0.194839031 container health_status def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, architecture=x86_64, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, vcs-type=git, io.buildah.version=1.33.12, build-date=2025-07-21T14:45:33, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-compute, batch=17.1_20250721.1, release=1, 
tcib_managed=true, vendor=Red Hat, Inc., config_id=tripleo_step4, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, com.redhat.license_terms=https://www.redhat.com/agreements, version=17.1.9, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1, distribution-scope=public) Oct 14 05:10:06 localhost podman[105121]: 2025-10-14 09:10:06.924819649 +0000 UTC m=+0.255960823 container exec_died d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, architecture=x86_64, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, batch=17.1_20250721.1, build-date=2025-07-21T13:07:52, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, com.redhat.license_terms=https://www.redhat.com/agreements, version=17.1.9, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.component=openstack-cron-container, release=1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, container_name=logrotate_crond, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, vcs-type=git, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, io.buildah.version=1.33.12, config_id=tripleo_step4, io.openshift.expose-services=) Oct 14 05:10:06 localhost systemd[1]: d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f.service: Deactivated successfully. Oct 14 05:10:06 localhost podman[105122]: 2025-10-14 09:10:06.975606861 +0000 UTC m=+0.300232503 container exec_died def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, architecture=x86_64, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, config_id=tripleo_step4, name=rhosp17/openstack-ceilometer-compute, managed_by=tripleo_ansible, distribution-scope=public, version=17.1.9, io.buildah.version=1.33.12, release=1, build-date=2025-07-21T14:45:33, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 
'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, com.redhat.component=openstack-ceilometer-compute-container, description=Red Hat OpenStack Platform 17.1 ceilometer-compute) Oct 14 05:10:06 localhost podman[105122]: unhealthy Oct 14 05:10:06 localhost systemd[1]: def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e.service: Main process exited, code=exited, status=1/FAILURE Oct 14 05:10:06 localhost systemd[1]: def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e.service: Failed with result 'exit-code'. 
Oct 14 05:10:07 localhost podman[105120]: 2025-10-14 09:10:07.170804002 +0000 UTC m=+0.501156131 container exec_died 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, config_id=tripleo_step4, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, vendor=Red Hat, Inc., io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, release=1, distribution-scope=public, version=17.1.9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20250721.1, io.openshift.expose-services=, container_name=nova_migration_target, 
build-date=2025-07-21T14:48:37, name=rhosp17/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1) Oct 14 05:10:07 localhost systemd[1]: 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884.service: Deactivated successfully. Oct 14 05:10:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e. Oct 14 05:10:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5. Oct 14 05:10:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2. Oct 14 05:10:09 localhost podman[105215]: 2025-10-14 09:10:09.77485763 +0000 UTC m=+0.108307133 container health_status 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-type=git, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.9, architecture=x86_64, container_name=ovn_metadata_agent, io.buildah.version=1.33.12, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, build-date=2025-07-21T16:28:53, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, distribution-scope=public, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, batch=17.1_20250721.1, tcib_managed=true, release=1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc.) 
Oct 14 05:10:09 localhost podman[105215]: 2025-10-14 09:10:09.797605504 +0000 UTC m=+0.131055057 container exec_died 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, config_id=tripleo_step4, release=1, io.openshift.expose-services=, io.buildah.version=1.33.12, tcib_managed=true, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, batch=17.1_20250721.1, version=17.1.9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, build-date=2025-07-21T16:28:53, container_name=ovn_metadata_agent, managed_by=tripleo_ansible) Oct 14 05:10:09 localhost podman[105215]: unhealthy Oct 14 05:10:09 localhost systemd[1]: 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e.service: Main process exited, code=exited, status=1/FAILURE Oct 14 05:10:09 localhost systemd[1]: 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e.service: Failed with result 'exit-code'. 
Oct 14 05:10:09 localhost podman[105217]: 2025-10-14 09:10:09.807094707 +0000 UTC m=+0.134045489 container health_status ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, managed_by=tripleo_ansible, tcib_managed=true, io.openshift.expose-services=, container_name=nova_compute, io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T14:48:37, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, batch=17.1_20250721.1, distribution-scope=public, name=rhosp17/openstack-nova-compute, config_id=tripleo_step5, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, com.redhat.component=openstack-nova-compute-container, release=1, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.9, maintainer=OpenStack TripleO Team, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a-4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}) Oct 14 05:10:09 localhost systemd[1]: tmp-crun.hiZjhX.mount: Deactivated successfully. 
Oct 14 05:10:09 localhost podman[105216]: 2025-10-14 09:10:09.874788503 +0000 UTC m=+0.204138309 container health_status a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=openstack-ovn-controller-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, config_id=tripleo_step4, release=1, distribution-scope=public, container_name=ovn_controller, vendor=Red Hat, Inc., batch=17.1_20250721.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, tcib_managed=true, architecture=x86_64, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, name=rhosp17/openstack-ovn-controller, io.buildah.version=1.33.12, io.openshift.expose-services=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, build-date=2025-07-21T13:28:44, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, version=17.1.9, description=Red Hat OpenStack Platform 17.1 ovn-controller) Oct 14 05:10:09 localhost podman[105217]: 2025-10-14 09:10:09.892177151 
+0000 UTC m=+0.219127853 container exec_died ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, vendor=Red Hat, Inc., container_name=nova_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a-4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step5, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, release=1, batch=17.1_20250721.1, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-07-21T14:48:37, tcib_managed=true, version=17.1.9) Oct 14 05:10:09 localhost podman[105217]: unhealthy Oct 14 05:10:09 localhost systemd[1]: ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2.service: Main process exited, code=exited, status=1/FAILURE Oct 14 05:10:09 localhost systemd[1]: ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2.service: Failed with result 'exit-code'. 
Oct 14 05:10:09 localhost podman[105216]: 2025-10-14 09:10:09.942517709 +0000 UTC m=+0.271867485 container exec_died a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., batch=17.1_20250721.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, managed_by=tripleo_ansible, tcib_managed=true, io.openshift.expose-services=, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-ovn-controller, version=17.1.9, container_name=ovn_controller, io.buildah.version=1.33.12, com.redhat.component=openstack-ovn-controller-container, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, build-date=2025-07-21T13:28:44, config_id=tripleo_step4) Oct 14 05:10:09 localhost podman[105216]: unhealthy Oct 14 05:10:09 localhost systemd[1]: 
a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5.service: Main process exited, code=exited, status=1/FAILURE Oct 14 05:10:09 localhost systemd[1]: a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5.service: Failed with result 'exit-code'. Oct 14 05:10:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6. Oct 14 05:10:15 localhost podman[105275]: 2025-10-14 09:10:15.749279074 +0000 UTC m=+0.086805338 container health_status 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, managed_by=tripleo_ansible, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhosp17/openstack-qdrouterd, 
vcs-type=git, batch=17.1_20250721.1, com.redhat.component=openstack-qdrouterd-container, container_name=metrics_qdr, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, tcib_managed=true, config_id=tripleo_step1, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, build-date=2025-07-21T13:07:59, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd) Oct 14 05:10:15 localhost podman[105275]: 2025-10-14 09:10:15.947950923 +0000 UTC m=+0.285477127 container exec_died 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, distribution-scope=public, batch=17.1_20250721.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, container_name=metrics_qdr, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1, com.redhat.license_terms=https://www.redhat.com/agreements, version=17.1.9, vendor=Red Hat, Inc., config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true, vcs-type=git, com.redhat.component=openstack-qdrouterd-container, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 
'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.33.12, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, build-date=2025-07-21T13:07:59, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed) Oct 14 05:10:15 localhost systemd[1]: 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6.service: Deactivated successfully. Oct 14 05:10:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8. Oct 14 05:10:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea. 
Oct 14 05:10:30 localhost podman[105307]: 2025-10-14 09:10:30.761236567 +0000 UTC m=+0.085817506 container health_status 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, com.redhat.component=openstack-iscsid-container, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, managed_by=tripleo_ansible, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, architecture=x86_64, distribution-scope=public, io.openshift.expose-services=, release=1, version=17.1.9, name=rhosp17/openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, build-date=2025-07-21T13:27:15, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20250721.1, container_name=iscsid, vendor=Red Hat, Inc.) Oct 14 05:10:30 localhost podman[105306]: 2025-10-14 09:10:30.802468004 +0000 UTC m=+0.134426272 container health_status 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20250721.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, name=rhosp17/openstack-collectd, tcib_managed=true, version=17.1.9, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, io.buildah.version=1.33.12, com.redhat.component=openstack-collectd-container, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, summary=Red Hat OpenStack Platform 17.1 collectd, release=2, vendor=Red Hat, Inc., build-date=2025-07-21T13:04:03, vcs-type=git) Oct 14 05:10:30 localhost podman[105306]: 2025-10-14 09:10:30.844175855 +0000 UTC m=+0.176134123 container exec_died 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, release=2, vendor=Red Hat, Inc., architecture=x86_64, build-date=2025-07-21T13:04:03, config_id=tripleo_step3, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, name=rhosp17/openstack-collectd, tcib_managed=true, version=17.1.9, com.redhat.component=openstack-collectd-container, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, distribution-scope=public, io.buildah.version=1.33.12, managed_by=tripleo_ansible, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 collectd) Oct 14 05:10:30 localhost systemd[1]: 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8.service: Deactivated successfully. 
Oct 14 05:10:30 localhost podman[105307]: 2025-10-14 09:10:30.856788204 +0000 UTC m=+0.181369153 container exec_died 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, build-date=2025-07-21T13:27:15, distribution-scope=public, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, container_name=iscsid, managed_by=tripleo_ansible, release=1, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, batch=17.1_20250721.1, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.9, architecture=x86_64, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, io.openshift.expose-services=, com.redhat.component=openstack-iscsid-container, name=rhosp17/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, config_id=tripleo_step3) Oct 14 05:10:30 localhost systemd[1]: 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea.service: Deactivated successfully. Oct 14 05:10:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638. Oct 14 05:10:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884. Oct 14 05:10:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f. Oct 14 05:10:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e. Oct 14 05:10:37 localhost systemd[1]: tmp-crun.O6oboB.mount: Deactivated successfully. 
Oct 14 05:10:37 localhost podman[105353]: 2025-10-14 09:10:37.789920101 +0000 UTC m=+0.113967958 container health_status def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, config_id=tripleo_step4, vcs-type=git, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, managed_by=tripleo_ansible, batch=17.1_20250721.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-ceilometer-compute, version=17.1.9, distribution-scope=public, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.33.12, 
maintainer=OpenStack TripleO Team, build-date=2025-07-21T14:45:33, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, release=1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute) Oct 14 05:10:37 localhost podman[105345]: 2025-10-14 09:10:37.744021661 +0000 UTC m=+0.081999759 container health_status 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, vendor=Red Hat, Inc., config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, name=rhosp17/openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, release=1, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, com.redhat.license_terms=https://www.redhat.com/agreements, managed_by=tripleo_ansible, io.openshift.expose-services=, container_name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20250721.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.9, distribution-scope=public, build-date=2025-07-21T15:29:47, architecture=x86_64) Oct 14 05:10:37 localhost podman[105346]: 2025-10-14 09:10:37.798100604 +0000 UTC m=+0.129515049 container health_status 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, container_name=nova_migration_target, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, vcs-type=git, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.9, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, batch=17.1_20250721.1, release=1, architecture=x86_64, distribution-scope=public, managed_by=tripleo_ansible, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, 
build-date=2025-07-21T14:48:37, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute) Oct 14 05:10:37 localhost podman[105353]: 2025-10-14 09:10:37.821085205 +0000 UTC m=+0.145133012 container exec_died def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, vcs-type=git, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, build-date=2025-07-21T14:45:33, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.33.12, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, distribution-scope=public, managed_by=tripleo_ansible, config_id=tripleo_step4, batch=17.1_20250721.1, io.openshift.expose-services=, release=1, version=17.1.9, com.redhat.component=openstack-ceilometer-compute-container, architecture=x86_64, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute) Oct 14 05:10:37 localhost podman[105353]: unhealthy Oct 14 05:10:37 localhost systemd[1]: def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e.service: Main process exited, code=exited, status=1/FAILURE Oct 14 05:10:37 localhost systemd[1]: def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e.service: Failed with result 'exit-code'. 
Oct 14 05:10:37 localhost podman[105347]: 2025-10-14 09:10:37.875112148 +0000 UTC m=+0.202547140 container health_status d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, version=17.1.9, io.openshift.tags=rhosp osp openstack osp-17.1, io.openshift.expose-services=, name=rhosp17/openstack-cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, build-date=2025-07-21T13:07:52, io.buildah.version=1.33.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, config_id=tripleo_step4, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, vcs-type=git, vendor=Red Hat, Inc., distribution-scope=public, managed_by=tripleo_ansible, release=1, 
com.redhat.license_terms=https://www.redhat.com/agreements, summary=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, architecture=x86_64, maintainer=OpenStack TripleO Team, batch=17.1_20250721.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron) Oct 14 05:10:37 localhost podman[105345]: 2025-10-14 09:10:37.878452822 +0000 UTC m=+0.216430930 container exec_died 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.33.12, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., tcib_managed=true, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T15:29:47, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, version=17.1.9, com.redhat.component=openstack-ceilometer-ipmi-container, container_name=ceilometer_agent_ipmi, maintainer=OpenStack TripleO Team, architecture=x86_64, release=1, io.openshift.expose-services=) Oct 14 05:10:37 localhost systemd[1]: 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638.service: Deactivated successfully. Oct 14 05:10:37 localhost podman[105347]: 2025-10-14 09:10:37.905836119 +0000 UTC m=+0.233271081 container exec_died d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, vcs-type=git, name=rhosp17/openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc., io.openshift.expose-services=, com.redhat.component=openstack-cron-container, config_id=tripleo_step4, architecture=x86_64, build-date=2025-07-21T13:07:52, container_name=logrotate_crond, io.buildah.version=1.33.12, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, version=17.1.9, managed_by=tripleo_ansible, batch=17.1_20250721.1, release=1) Oct 14 05:10:37 localhost systemd[1]: d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f.service: Deactivated successfully. 
Oct 14 05:10:38 localhost podman[105346]: 2025-10-14 09:10:38.150417619 +0000 UTC m=+0.481832124 container exec_died 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, build-date=2025-07-21T14:48:37, maintainer=OpenStack TripleO Team, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, distribution-scope=public, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-nova-compute, version=17.1.9, io.buildah.version=1.33.12, container_name=nova_migration_target, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step4, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, vendor=Red Hat, Inc., batch=17.1_20250721.1, tcib_managed=true, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, release=1) Oct 14 05:10:38 localhost systemd[1]: 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884.service: Deactivated successfully. Oct 14 05:10:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e. Oct 14 05:10:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5. Oct 14 05:10:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2. Oct 14 05:10:40 localhost podman[105441]: 2025-10-14 09:10:40.73356159 +0000 UTC m=+0.073429823 container health_status 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, architecture=x86_64, managed_by=tripleo_ansible, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, container_name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1, version=17.1.9, io.buildah.version=1.33.12, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1, tcib_managed=true, batch=17.1_20250721.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, distribution-scope=public, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-07-21T16:28:53) Oct 14 05:10:40 localhost podman[105442]: 2025-10-14 09:10:40.793810546 +0000 UTC m=+0.128854930 container health_status a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, 
health_status=unhealthy, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, io.openshift.tags=rhosp osp openstack osp-17.1, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-07-21T13:28:44, batch=17.1_20250721.1, container_name=ovn_controller, name=rhosp17/openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, distribution-scope=public, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, release=1, io.buildah.version=1.33.12, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, vendor=Red Hat, Inc., vcs-type=git, architecture=x86_64, version=17.1.9, com.redhat.component=openstack-ovn-controller-container) Oct 14 05:10:40 localhost podman[105442]: 2025-10-14 09:10:40.806860129 +0000 UTC m=+0.141904503 container exec_died a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, version=17.1.9, batch=17.1_20250721.1, com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, tcib_managed=true, vendor=Red Hat, Inc., distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, container_name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.33.12, vcs-type=git, build-date=2025-07-21T13:28:44, release=1, io.openshift.expose-services=, name=rhosp17/openstack-ovn-controller, config_id=tripleo_step4, architecture=x86_64) Oct 14 05:10:40 localhost podman[105442]: unhealthy Oct 14 05:10:40 localhost systemd[1]: a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5.service: Main process exited, code=exited, status=1/FAILURE Oct 14 05:10:40 localhost systemd[1]: a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5.service: Failed with result 'exit-code'. 
Oct 14 05:10:40 localhost podman[105441]: 2025-10-14 09:10:40.819408388 +0000 UTC m=+0.159276621 container exec_died 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-07-21T16:28:53, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, batch=17.1_20250721.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, release=1, tcib_managed=true, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, distribution-scope=public, architecture=x86_64, vcs-type=git, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, container_name=ovn_metadata_agent, version=17.1.9, config_id=tripleo_step4, io.openshift.expose-services=) Oct 14 05:10:40 localhost podman[105441]: unhealthy Oct 14 05:10:40 localhost systemd[1]: 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e.service: Main process exited, code=exited, status=1/FAILURE Oct 14 05:10:40 localhost systemd[1]: 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e.service: Failed with result 'exit-code'. 
Oct 14 05:10:40 localhost podman[105443]: 2025-10-14 09:10:40.769471662 +0000 UTC m=+0.097944613 container health_status ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, managed_by=tripleo_ansible, io.openshift.expose-services=, tcib_managed=true, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, container_name=nova_compute, build-date=2025-07-21T14:48:37, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step5, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a-4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, batch=17.1_20250721.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, version=17.1.9, release=1, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute) Oct 14 05:10:40 localhost podman[105443]: 2025-10-14 09:10:40.904310825 +0000 UTC m=+0.232783766 container exec_died ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, release=1, tcib_managed=true, build-date=2025-07-21T14:48:37, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.33.12, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 
'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a-4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_compute, vcs-type=git, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, architecture=x86_64, batch=17.1_20250721.1, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, version=17.1.9, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1) Oct 14 05:10:40 
localhost podman[105443]: unhealthy Oct 14 05:10:40 localhost systemd[1]: ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2.service: Main process exited, code=exited, status=1/FAILURE Oct 14 05:10:40 localhost systemd[1]: ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2.service: Failed with result 'exit-code'. Oct 14 05:10:41 localhost systemd[1]: tmp-crun.q0J8XF.mount: Deactivated successfully. Oct 14 05:10:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6. Oct 14 05:10:46 localhost podman[105506]: 2025-10-14 09:10:46.758945261 +0000 UTC m=+0.092546966 container health_status 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, com.redhat.component=openstack-qdrouterd-container, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, container_name=metrics_qdr, io.buildah.version=1.33.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vendor=Red Hat, Inc., version=17.1.9, build-date=2025-07-21T13:07:59, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, name=rhosp17/openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, vcs-type=git, release=1, tcib_managed=true, batch=17.1_20250721.1, distribution-scope=public, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 qdrouterd) Oct 14 05:10:46 localhost podman[105506]: 2025-10-14 09:10:46.997548225 +0000 UTC m=+0.331149910 container exec_died 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, release=1, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.9, managed_by=tripleo_ansible, build-date=2025-07-21T13:07:59, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20250721.1, io.buildah.version=1.33.12, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, 
com.redhat.component=openstack-qdrouterd-container, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, vcs-type=git, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, config_id=tripleo_step1) Oct 14 05:10:47 localhost systemd[1]: 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6.service: Deactivated successfully. Oct 14 05:11:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8. Oct 14 05:11:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea. 
Oct 14 05:11:01 localhost podman[105536]: 2025-10-14 09:11:01.733102182 +0000 UTC m=+0.075220649 container health_status 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, vcs-type=git, build-date=2025-07-21T13:27:15, maintainer=OpenStack TripleO Team, distribution-scope=public, com.redhat.component=openstack-iscsid-container, config_id=tripleo_step3, version=17.1.9, io.buildah.version=1.33.12, release=1, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhosp17/openstack-iscsid, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, tcib_managed=true, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', 
'/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, batch=17.1_20250721.1, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1) Oct 14 05:11:01 localhost podman[105536]: 2025-10-14 09:11:01.742908056 +0000 UTC m=+0.085026473 container exec_died 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, com.redhat.component=openstack-iscsid-container, io.openshift.expose-services=, release=1, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20250721.1, build-date=2025-07-21T13:27:15, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, container_name=iscsid, 
io.buildah.version=1.33.12, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, architecture=x86_64, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-iscsid, tcib_managed=true, managed_by=tripleo_ansible) Oct 14 05:11:01 localhost systemd[1]: 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea.service: Deactivated successfully. Oct 14 05:11:01 localhost systemd[1]: tmp-crun.7BBmRL.mount: Deactivated successfully. Oct 14 05:11:01 localhost podman[105535]: 2025-10-14 09:11:01.792092728 +0000 UTC m=+0.134209445 container health_status 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, architecture=x86_64, build-date=2025-07-21T13:04:03, tcib_managed=true, container_name=collectd, release=2, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, vcs-type=git, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.33.12, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1, config_id=tripleo_step3, name=rhosp17/openstack-collectd, batch=17.1_20250721.1, distribution-scope=public, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Red Hat OpenStack Platform 17.1 collectd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, maintainer=OpenStack TripleO Team) Oct 14 05:11:01 localhost podman[105535]: 2025-10-14 09:11:01.831152697 +0000 UTC m=+0.173269374 container exec_died 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, name=rhosp17/openstack-collectd, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=collectd, release=2, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, architecture=x86_64, 
batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.expose-services=, build-date=2025-07-21T13:04:03, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, version=17.1.9, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, vcs-type=git, com.redhat.component=openstack-collectd-container) Oct 14 05:11:01 localhost systemd[1]: 
0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8.service: Deactivated successfully. Oct 14 05:11:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638. Oct 14 05:11:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884. Oct 14 05:11:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f. Oct 14 05:11:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e. Oct 14 05:11:08 localhost podman[105652]: 2025-10-14 09:11:08.769004 +0000 UTC m=+0.095633851 container health_status d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-type=git, architecture=x86_64, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, version=17.1.9, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, release=1, config_id=tripleo_step4, container_name=logrotate_crond, io.buildah.version=1.33.12, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20250721.1, maintainer=OpenStack TripleO Team, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, build-date=2025-07-21T13:07:52) Oct 14 05:11:08 localhost podman[105652]: 2025-10-14 09:11:08.775882222 +0000 UTC m=+0.102512103 container exec_died d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, version=17.1.9, io.buildah.version=1.33.12, name=rhosp17/openstack-cron, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, build-date=2025-07-21T13:07:52, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, vcs-type=git, managed_by=tripleo_ansible, release=1, architecture=x86_64, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, com.redhat.component=openstack-cron-container, container_name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, batch=17.1_20250721.1, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, distribution-scope=public) Oct 14 05:11:08 localhost systemd[1]: tmp-crun.T3gROW.mount: Deactivated successfully. 
Oct 14 05:11:08 localhost podman[105653]: 2025-10-14 09:11:08.82522951 +0000 UTC m=+0.145239656 container health_status def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, architecture=x86_64, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, config_id=tripleo_step4, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-07-21T14:45:33, release=1, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1, batch=17.1_20250721.1, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, tcib_managed=true, version=17.1.9) Oct 14 05:11:08 localhost podman[105653]: 2025-10-14 09:11:08.851985158 +0000 UTC m=+0.171995304 container exec_died def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, batch=17.1_20250721.1, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, 
com.redhat.component=openstack-ceilometer-compute-container, container_name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, build-date=2025-07-21T14:45:33, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, io.buildah.version=1.33.12, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1, config_id=tripleo_step4, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, version=17.1.9, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-compute) Oct 14 05:11:08 localhost podman[105653]: unhealthy Oct 14 05:11:08 localhost systemd[1]: def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e.service: Main process exited, code=exited, status=1/FAILURE Oct 14 05:11:08 localhost systemd[1]: def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e.service: Failed with result 'exit-code'. 
Oct 14 05:11:08 localhost podman[105650]: 2025-10-14 09:11:08.866622482 +0000 UTC m=+0.192993575 container health_status 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, com.redhat.component=openstack-ceilometer-ipmi-container, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, tcib_managed=true, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, build-date=2025-07-21T15:29:47, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, config_id=tripleo_step4, name=rhosp17/openstack-ceilometer-ipmi, managed_by=tripleo_ansible, batch=17.1_20250721.1, release=1, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, 
io.buildah.version=1.33.12, container_name=ceilometer_agent_ipmi, version=17.1.9, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, com.redhat.license_terms=https://www.redhat.com/agreements) Oct 14 05:11:08 localhost systemd[1]: d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f.service: Deactivated successfully. Oct 14 05:11:08 localhost podman[105651]: 2025-10-14 09:11:08.921911552 +0000 UTC m=+0.247832351 container health_status 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-07-21T14:48:37, vendor=Red Hat, Inc., release=1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhosp17/openstack-nova-compute, tcib_managed=true, config_id=tripleo_step4, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, batch=17.1_20250721.1, container_name=nova_migration_target, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 
'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.33.12, vcs-type=git, version=17.1.9, managed_by=tripleo_ansible) Oct 14 05:11:08 localhost podman[105650]: 2025-10-14 09:11:08.931491069 +0000 UTC m=+0.257862122 container exec_died 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, vcs-type=git, vendor=Red Hat, Inc., distribution-scope=public, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-ipmi-container, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, config_id=tripleo_step4, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1, batch=17.1_20250721.1, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, maintainer=OpenStack TripleO Team, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, build-date=2025-07-21T15:29:47, version=17.1.9, name=rhosp17/openstack-ceilometer-ipmi, release=1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12) Oct 14 05:11:08 localhost systemd[1]: 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638.service: Deactivated successfully. 
Oct 14 05:11:09 localhost podman[105651]: 2025-10-14 09:11:09.274149334 +0000 UTC m=+0.600070193 container exec_died 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20250721.1, name=rhosp17/openstack-nova-compute, build-date=2025-07-21T14:48:37, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, version=17.1.9, com.redhat.component=openstack-nova-compute-container, container_name=nova_migration_target, distribution-scope=public, io.buildah.version=1.33.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., io.openshift.expose-services=) Oct 14 05:11:09 localhost systemd[1]: 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884.service: Deactivated successfully. Oct 14 05:11:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e. Oct 14 05:11:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5. Oct 14 05:11:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2. Oct 14 05:11:11 localhost systemd[1]: tmp-crun.Lx3v2D.mount: Deactivated successfully. 
Oct 14 05:11:11 localhost podman[105740]: 2025-10-14 09:11:11.763787301 +0000 UTC m=+0.099683446 container health_status 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1, io.buildah.version=1.33.12, io.openshift.expose-services=, batch=17.1_20250721.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, architecture=x86_64, managed_by=tripleo_ansible, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-type=git, release=1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, build-date=2025-07-21T16:28:53, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, container_name=ovn_metadata_agent, version=17.1.9, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc.) Oct 14 05:11:11 localhost podman[105740]: 2025-10-14 09:11:11.778829887 +0000 UTC m=+0.114725992 container exec_died 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, version=17.1.9, architecture=x86_64, build-date=2025-07-21T16:28:53, container_name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.33.12, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20250721.1, distribution-scope=public, name=rhosp17/openstack-neutron-metadata-agent-ovn, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, vcs-type=git, vendor=Red Hat, Inc., io.openshift.expose-services=, maintainer=OpenStack TripleO Team, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step4) Oct 14 05:11:11 localhost podman[105740]: unhealthy Oct 14 05:11:11 localhost systemd[1]: 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e.service: Main process exited, code=exited, status=1/FAILURE Oct 14 05:11:11 localhost systemd[1]: 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e.service: Failed with result 'exit-code'. 
Oct 14 05:11:11 localhost podman[105741]: 2025-10-14 09:11:11.851689322 +0000 UTC m=+0.185226174 container health_status a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, name=rhosp17/openstack-ovn-controller, distribution-scope=public, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, tcib_managed=true, build-date=2025-07-21T13:28:44, description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20250721.1, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/agreements, release=1, vcs-type=git, version=17.1.9, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, com.redhat.component=openstack-ovn-controller-container, io.buildah.version=1.33.12) Oct 14 05:11:11 localhost podman[105741]: 2025-10-14 09:11:11.898226232 
+0000 UTC m=+0.231763114 container exec_died a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, release=1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., com.redhat.component=openstack-ovn-controller-container, description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, config_id=tripleo_step4, io.openshift.expose-services=, vcs-type=git, version=17.1.9, architecture=x86_64, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, container_name=ovn_controller, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T13:28:44, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhosp17/openstack-ovn-controller, batch=17.1_20250721.1, maintainer=OpenStack TripleO Team, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245) Oct 14 05:11:11 localhost podman[105741]: unhealthy Oct 14 05:11:11 localhost systemd[1]: a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5.service: Main process 
exited, code=exited, status=1/FAILURE Oct 14 05:11:11 localhost systemd[1]: a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5.service: Failed with result 'exit-code'. Oct 14 05:11:11 localhost podman[105742]: 2025-10-14 09:11:11.911741401 +0000 UTC m=+0.242406084 container health_status ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, release=1, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, vcs-type=git, com.redhat.component=openstack-nova-compute-container, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a-4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, batch=17.1_20250721.1, managed_by=tripleo_ansible, version=17.1.9, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc., architecture=x86_64, build-date=2025-07-21T14:48:37, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, config_id=tripleo_step5, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute) Oct 14 05:11:11 localhost podman[105742]: 2025-10-14 09:11:11.935257898 +0000 UTC m=+0.265922581 container exec_died ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, distribution-scope=public, io.openshift.expose-services=, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a-4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 
'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1, tcib_managed=true, build-date=2025-07-21T14:48:37, vcs-type=git, version=17.1.9, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1, architecture=x86_64, container_name=nova_compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, io.buildah.version=1.33.12, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/agreements, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, 
com.redhat.component=openstack-nova-compute-container, name=rhosp17/openstack-nova-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1) Oct 14 05:11:11 localhost podman[105742]: unhealthy Oct 14 05:11:11 localhost systemd[1]: ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2.service: Main process exited, code=exited, status=1/FAILURE Oct 14 05:11:11 localhost systemd[1]: ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2.service: Failed with result 'exit-code'. Oct 14 05:11:15 localhost sshd[105797]: main: sshd: ssh-rsa algorithm is disabled Oct 14 05:11:15 localhost systemd-logind[760]: New session 36 of user zuul. Oct 14 05:11:15 localhost systemd[1]: Started Session 36 of User zuul. Oct 14 05:11:16 localhost python3.9[105892]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated/nova_libvirt/etc/nova/nova.conf follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Oct 14 05:11:16 localhost python3.9[105986]: ansible-ansible.legacy.command Invoked with cmd=python3 -c "import configparser as c; p = c.ConfigParser(strict=False); p.read('/var/lib/config-data/puppet-generated/nova_libvirt/etc/nova/nova.conf'); print(p['DEFAULT']['host'])"#012 _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Oct 14 05:11:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6. Oct 14 05:11:17 localhost systemd[1]: tmp-crun.J5qoXb.mount: Deactivated successfully. 
Oct 14 05:11:17 localhost podman[106080]: 2025-10-14 09:11:17.617845651 +0000 UTC m=+0.104355492 container health_status 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, com.redhat.component=openstack-qdrouterd-container, architecture=x86_64, batch=17.1_20250721.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., vcs-type=git, version=17.1.9, release=1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, distribution-scope=public, container_name=metrics_qdr, build-date=2025-07-21T13:07:59, name=rhosp17/openstack-qdrouterd, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, 
io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.33.12) Oct 14 05:11:17 localhost python3.9[106079]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated/neutron/etc/neutron/neutron.conf follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Oct 14 05:11:17 localhost podman[106080]: 2025-10-14 09:11:17.809720819 +0000 UTC m=+0.296230680 container exec_died 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.buildah.version=1.33.12, name=rhosp17/openstack-qdrouterd, container_name=metrics_qdr, release=1, build-date=2025-07-21T13:07:59, vendor=Red Hat, Inc., vcs-type=git, config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.9, architecture=x86_64, batch=17.1_20250721.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 
'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, distribution-scope=public) Oct 14 05:11:17 localhost systemd[1]: 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6.service: Deactivated successfully. 
Oct 14 05:11:18 localhost python3.9[106201]: ansible-ansible.legacy.command Invoked with cmd=python3 -c "import configparser as c; p = c.ConfigParser(strict=False); p.read('/var/lib/config-data/puppet-generated/neutron/etc/neutron/neutron.conf'); print(p['DEFAULT']['host'])"#012 _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Oct 14 05:11:19 localhost python3.9[106294]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/hostname -f _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Oct 14 05:11:20 localhost python3.9[106385]: ansible-ansible.builtin.slurp Invoked with src=/proc/cmdline Oct 14 05:11:21 localhost python3.9[106476]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/active_profile follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Oct 14 05:11:22 localhost python3.9[106568]: ansible-ansible.builtin.slurp Invoked with src=/etc/tuned/active_profile Oct 14 05:11:23 localhost python3.9[106658]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d Oct 14 05:11:24 localhost python3.9[106706]: ansible-ansible.legacy.dnf Invoked with name=['systemd-container'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None 
releasever=None Oct 14 05:11:25 localhost systemd[1]: session-36.scope: Deactivated successfully. Oct 14 05:11:25 localhost systemd[1]: session-36.scope: Consumed 5.164s CPU time. Oct 14 05:11:25 localhost systemd-logind[760]: Session 36 logged out. Waiting for processes to exit. Oct 14 05:11:25 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Oct 14 05:11:25 localhost systemd-logind[760]: Removed session 36. Oct 14 05:11:25 localhost recover_tripleo_nova_virtqemud[106723]: 62551 Oct 14 05:11:25 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Oct 14 05:11:25 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. Oct 14 05:11:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8. Oct 14 05:11:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea. Oct 14 05:11:32 localhost podman[106725]: 2025-10-14 09:11:32.733931566 +0000 UTC m=+0.069712457 container health_status 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, vcs-type=git, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, build-date=2025-07-21T13:27:15, io.openshift.expose-services=, batch=17.1_20250721.1, com.redhat.component=openstack-iscsid-container, architecture=x86_64, name=rhosp17/openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, 
summary=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.9, release=1, tcib_managed=true, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, maintainer=OpenStack TripleO Team) Oct 14 05:11:32 localhost podman[106725]: 2025-10-14 09:11:32.774227884 +0000 UTC m=+0.110008825 container exec_died 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vendor=Red Hat, Inc., container_name=iscsid, build-date=2025-07-21T13:27:15, vcs-type=git, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 iscsid, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, com.redhat.component=openstack-iscsid-container, summary=Red Hat OpenStack Platform 17.1 iscsid, 
version=17.1.9, config_id=tripleo_step3, distribution-scope=public, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/agreements, managed_by=tripleo_ansible, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, tcib_managed=true, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=) Oct 14 05:11:32 localhost systemd[1]: 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea.service: Deactivated successfully. 
Oct 14 05:11:32 localhost podman[106724]: 2025-10-14 09:11:32.80027026 +0000 UTC m=+0.136271809 container health_status 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, com.redhat.component=openstack-collectd-container, vcs-type=git, name=rhosp17/openstack-collectd, release=2, tcib_managed=true, distribution-scope=public, config_id=tripleo_step3, vendor=Red Hat, Inc., managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, build-date=2025-07-21T13:04:03, 
com.redhat.license_terms=https://www.redhat.com/agreements, summary=Red Hat OpenStack Platform 17.1 collectd, version=17.1.9, batch=17.1_20250721.1, container_name=collectd, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, io.openshift.tags=rhosp osp openstack osp-17.1, architecture=x86_64, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team) Oct 14 05:11:32 localhost podman[106724]: 2025-10-14 09:11:32.81415534 +0000 UTC m=+0.150156929 container exec_died 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, distribution-scope=public, build-date=2025-07-21T13:04:03, com.redhat.component=openstack-collectd-container, architecture=x86_64, io.openshift.expose-services=, vendor=Red Hat, Inc., tcib_managed=true, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, version=17.1.9, vcs-type=git, name=rhosp17/openstack-collectd, release=2, description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20250721.1, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd) Oct 14 05:11:32 localhost systemd[1]: 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8.service: Deactivated successfully. Oct 14 05:11:34 localhost sshd[106762]: main: sshd: ssh-rsa algorithm is disabled Oct 14 05:11:34 localhost systemd-logind[760]: New session 37 of user zuul. Oct 14 05:11:34 localhost systemd[1]: Started Session 37 of User zuul. Oct 14 05:11:35 localhost python3.9[106857]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None Oct 14 05:11:35 localhost systemd[1]: Reloading. Oct 14 05:11:35 localhost systemd-rc-local-generator[106879]: /etc/rc.d/rc.local is not marked executable, skipping. 
Oct 14 05:11:35 localhost systemd-sysv-generator[106883]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Oct 14 05:11:35 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Oct 14 05:11:36 localhost python3.9[106983]: ansible-ansible.builtin.service_facts Invoked Oct 14 05:11:36 localhost network[107000]: You are using 'network' service provided by 'network-scripts', which are now deprecated. Oct 14 05:11:36 localhost network[107001]: 'network-scripts' will be removed from distribution in near future. Oct 14 05:11:36 localhost network[107002]: It is advised to switch to 'NetworkManager' instead for network management. Oct 14 05:11:37 localhost systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Oct 14 05:11:38 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64425 DF PROTO=TCP SPT=58988 DPT=9105 SEQ=1371795636 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AEFCADE0000000001030307) Oct 14 05:11:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638. Oct 14 05:11:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f. Oct 14 05:11:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e. 
Oct 14 05:11:39 localhost podman[107051]: 2025-10-14 09:11:39.154752026 +0000 UTC m=+0.091021427 container health_status 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp17/openstack-ceilometer-ipmi, build-date=2025-07-21T15:29:47, batch=17.1_20250721.1, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, version=17.1.9, io.buildah.version=1.33.12, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, release=1, io.k8s.description=Red Hat OpenStack 
Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, container_name=ceilometer_agent_ipmi, vendor=Red Hat, Inc., vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1) Oct 14 05:11:39 localhost podman[107051]: 2025-10-14 09:11:39.212391481 +0000 UTC m=+0.148660932 container exec_died 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_ipmi, distribution-scope=public, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, 
build-date=2025-07-21T15:29:47, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.9, config_id=tripleo_step4, io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, batch=17.1_20250721.1, vendor=Red Hat, Inc., release=1) Oct 14 05:11:39 localhost systemd[1]: 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638.service: Deactivated successfully. Oct 14 05:11:39 localhost podman[107052]: 2025-10-14 09:11:39.270563761 +0000 UTC m=+0.207721890 container health_status d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, distribution-scope=public, managed_by=tripleo_ansible, vcs-type=git, architecture=x86_64, build-date=2025-07-21T13:07:52, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, version=17.1.9, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-cron-container, vendor=Red Hat, Inc., batch=17.1_20250721.1, io.buildah.version=1.33.12, name=rhosp17/openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c) Oct 14 05:11:39 localhost podman[107052]: 2025-10-14 09:11:39.305159412 +0000 UTC m=+0.242317561 container exec_died d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, distribution-scope=public, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, version=17.1.9, config_id=tripleo_step4, vendor=Red Hat, Inc., container_name=logrotate_crond, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': 
True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1, architecture=x86_64, name=rhosp17/openstack-cron, tcib_managed=true, com.redhat.component=openstack-cron-container, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, batch=17.1_20250721.1, build-date=2025-07-21T13:07:52, io.buildah.version=1.33.12, release=1, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible) Oct 14 05:11:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884. Oct 14 05:11:39 localhost systemd[1]: d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f.service: Deactivated successfully. 
Oct 14 05:11:39 localhost podman[107053]: 2025-10-14 09:11:39.221355908 +0000 UTC m=+0.158363942 container health_status def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=unhealthy, io.buildah.version=1.33.12, vendor=Red Hat, Inc., tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, batch=17.1_20250721.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, name=rhosp17/openstack-ceilometer-compute, version=17.1.9, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, config_id=tripleo_step4, build-date=2025-07-21T14:45:33, container_name=ceilometer_agent_compute, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-compute-container, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, managed_by=tripleo_ansible, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}) Oct 14 05:11:39 localhost podman[107053]: 2025-10-14 09:11:39.359502724 +0000 UTC m=+0.296510718 container exec_died def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1, com.redhat.component=openstack-ceilometer-compute-container, build-date=2025-07-21T14:45:33, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, maintainer=OpenStack TripleO Team, batch=17.1_20250721.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, io.openshift.expose-services=, tcib_managed=true, managed_by=tripleo_ansible, version=17.1.9, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, vcs-type=git, architecture=x86_64, config_id=tripleo_step4, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, container_name=ceilometer_agent_compute, name=rhosp17/openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': 
'/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}) Oct 14 05:11:39 localhost podman[107053]: unhealthy Oct 14 05:11:39 localhost systemd[1]: def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e.service: Main process exited, code=exited, status=1/FAILURE Oct 14 05:11:39 localhost systemd[1]: def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e.service: Failed with result 'exit-code'. 
Oct 14 05:11:39 localhost podman[107135]: 2025-10-14 09:11:39.40458499 +0000 UTC m=+0.072405332 container health_status 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, architecture=x86_64, name=rhosp17/openstack-nova-compute, build-date=2025-07-21T14:48:37, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, vcs-type=git, batch=17.1_20250721.1, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1, io.openshift.tags=rhosp osp openstack osp-17.1, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=nova_migration_target, io.buildah.version=1.33.12, io.openshift.expose-services=, version=17.1.9, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, managed_by=tripleo_ansible, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public) Oct 14 05:11:39 localhost podman[107135]: 2025-10-14 09:11:39.777047598 +0000 UTC m=+0.444867950 container exec_died 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, tcib_managed=true, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, batch=17.1_20250721.1, version=17.1.9, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T14:48:37, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, release=1, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.33.12, distribution-scope=public, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible) Oct 14 05:11:39 localhost systemd[1]: 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884.service: Deactivated successfully. Oct 14 05:11:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64426 DF PROTO=TCP SPT=58988 DPT=9105 SEQ=1371795636 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AEFCEDA0000000001030307) Oct 14 05:11:40 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55077 DF PROTO=TCP SPT=45028 DPT=9100 SEQ=2061350281 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AEFD0C50000000001030307) Oct 14 05:11:41 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55081 DF PROTO=TCP SPT=58618 DPT=9101 SEQ=4180357961 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AEFD3A90000000001030307) Oct 14 05:11:41 localhost kernel: DROPPING: 
IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55078 DF PROTO=TCP SPT=45028 DPT=9100 SEQ=2061350281 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AEFD4DA0000000001030307) Oct 14 05:11:41 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64427 DF PROTO=TCP SPT=58988 DPT=9105 SEQ=1371795636 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AEFD6DB0000000001030307) Oct 14 05:11:42 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55082 DF PROTO=TCP SPT=58618 DPT=9101 SEQ=4180357961 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AEFD79A0000000001030307) Oct 14 05:11:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e. Oct 14 05:11:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5. Oct 14 05:11:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2. 
Oct 14 05:11:42 localhost podman[107219]: 2025-10-14 09:11:42.787249726 +0000 UTC m=+0.121907594 container health_status a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, container_name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, build-date=2025-07-21T13:28:44, vcs-type=git, batch=17.1_20250721.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1, architecture=x86_64, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/agreements, managed_by=tripleo_ansible, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, io.openshift.expose-services=, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, distribution-scope=public, config_id=tripleo_step4, vendor=Red Hat, Inc., version=17.1.9, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1) Oct 14 05:11:42 localhost podman[107218]: 2025-10-14 09:11:42.741887302 
+0000 UTC m=+0.082250216 container health_status 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, distribution-scope=public, release=1, vcs-type=git, vendor=Red Hat, Inc., config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1, architecture=x86_64, io.buildah.version=1.33.12, io.openshift.expose-services=, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', 
'/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, batch=17.1_20250721.1, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/agreements, version=17.1.9, build-date=2025-07-21T16:28:53, container_name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Oct 14 05:11:42 localhost podman[107219]: 2025-10-14 09:11:42.807364708 +0000 UTC m=+0.142022526 container exec_died a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, com.redhat.component=openstack-ovn-controller-container, name=rhosp17/openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1, io.buildah.version=1.33.12, vcs-type=git, batch=17.1_20250721.1, version=17.1.9, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, tcib_managed=true, build-date=2025-07-21T13:28:44, distribution-scope=public, architecture=x86_64, container_name=ovn_controller, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, release=1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, config_data={'depends_on': 
['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}) Oct 14 05:11:42 localhost podman[107219]: unhealthy Oct 14 05:11:42 localhost podman[107218]: 2025-10-14 09:11:42.82066271 +0000 UTC m=+0.161025634 container exec_died 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, name=rhosp17/openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, tcib_managed=true, container_name=ovn_metadata_agent, build-date=2025-07-21T16:28:53, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.33.12, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, release=1, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, batch=17.1_20250721.1, vcs-type=git, managed_by=tripleo_ansible, vendor=Red Hat, Inc., architecture=x86_64, version=17.1.9) Oct 14 05:11:42 localhost systemd[1]: a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5.service: Main process exited, code=exited, status=1/FAILURE Oct 14 05:11:42 localhost systemd[1]: a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5.service: Failed with result 'exit-code'. 
Oct 14 05:11:42 localhost podman[107218]: unhealthy Oct 14 05:11:42 localhost systemd[1]: 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e.service: Main process exited, code=exited, status=1/FAILURE Oct 14 05:11:42 localhost systemd[1]: 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e.service: Failed with result 'exit-code'. Oct 14 05:11:42 localhost podman[107225]: 2025-10-14 09:11:42.76508588 +0000 UTC m=+0.092959498 container health_status ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=unhealthy, release=1, version=17.1.9, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a-4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', 
'/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, config_id=tripleo_step5, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true, vcs-type=git, batch=17.1_20250721.1, build-date=2025-07-21T14:48:37, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., container_name=nova_compute, managed_by=tripleo_ansible, architecture=x86_64) Oct 14 05:11:42 localhost podman[107225]: 2025-10-14 09:11:42.895338002 +0000 UTC m=+0.223211680 container exec_died ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, build-date=2025-07-21T14:48:37, version=17.1.9, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, managed_by=tripleo_ansible, container_name=nova_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 
'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a-4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, config_id=tripleo_step5, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, vendor=Red Hat, Inc., 
com.redhat.component=openstack-nova-compute-container, batch=17.1_20250721.1, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1, release=1) Oct 14 05:11:42 localhost podman[107225]: unhealthy Oct 14 05:11:42 localhost systemd[1]: ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2.service: Main process exited, code=exited, status=1/FAILURE Oct 14 05:11:42 localhost systemd[1]: ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2.service: Failed with result 'exit-code'. Oct 14 05:11:43 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55079 DF PROTO=TCP SPT=45028 DPT=9100 SEQ=2061350281 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AEFDCDA0000000001030307) Oct 14 05:11:44 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55083 DF PROTO=TCP SPT=58618 DPT=9101 SEQ=4180357961 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AEFDF9B0000000001030307) Oct 14 05:11:44 localhost python3.9[107355]: ansible-ansible.builtin.service_facts Invoked Oct 14 05:11:44 localhost network[107372]: You are using 'network' service provided by 'network-scripts', which are now deprecated. Oct 14 05:11:44 localhost network[107373]: 'network-scripts' will be removed from distribution in near future. Oct 14 05:11:44 localhost network[107374]: It is advised to switch to 'NetworkManager' instead for network management. 
Oct 14 05:11:45 localhost systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Oct 14 05:11:45 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64428 DF PROTO=TCP SPT=58988 DPT=9105 SEQ=1371795636 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AEFE69A0000000001030307) Oct 14 05:11:47 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55080 DF PROTO=TCP SPT=45028 DPT=9100 SEQ=2061350281 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AEFEC9A0000000001030307) Oct 14 05:11:48 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55084 DF PROTO=TCP SPT=58618 DPT=9101 SEQ=4180357961 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AEFEF5B0000000001030307) Oct 14 05:11:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6. Oct 14 05:11:48 localhost systemd[1]: tmp-crun.1iplq2.mount: Deactivated successfully. 
Oct 14 05:11:48 localhost podman[107496]: 2025-10-14 09:11:48.753120167 +0000 UTC m=+0.094179017 container health_status 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, container_name=metrics_qdr, config_id=tripleo_step1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, version=17.1.9, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.33.12, release=1, tcib_managed=true, batch=17.1_20250721.1, build-date=2025-07-21T13:07:59, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, architecture=x86_64, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, io.openshift.expose-services=, managed_by=tripleo_ansible) Oct 14 05:11:48 localhost podman[107496]: 2025-10-14 09:11:48.986331574 +0000 UTC m=+0.327390384 container exec_died 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=metrics_qdr, io.buildah.version=1.33.12, config_id=tripleo_step1, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, architecture=x86_64, distribution-scope=public, build-date=2025-07-21T13:07:59, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, batch=17.1_20250721.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, tcib_managed=true, version=17.1.9, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd) Oct 14 05:11:49 localhost systemd[1]: 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6.service: Deactivated successfully. Oct 14 05:11:49 localhost python3.9[107602]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_ceilometer_agent_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Oct 14 05:11:50 localhost systemd[1]: Reloading. Oct 14 05:11:50 localhost systemd-rc-local-generator[107625]: /etc/rc.d/rc.local is not marked executable, skipping. Oct 14 05:11:50 localhost systemd-sysv-generator[107629]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Oct 14 05:11:50 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Oct 14 05:11:50 localhost systemd[1]: Stopping ceilometer_agent_compute container... 
Oct 14 05:11:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47077 DF PROTO=TCP SPT=42516 DPT=9102 SEQ=4016949133 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF004640000000001030307) Oct 14 05:11:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47078 DF PROTO=TCP SPT=42516 DPT=9102 SEQ=4016949133 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF0085B0000000001030307) Oct 14 05:11:56 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47079 DF PROTO=TCP SPT=42516 DPT=9102 SEQ=4016949133 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF0105A0000000001030307) Oct 14 05:11:58 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22023 DF PROTO=TCP SPT=37288 DPT=9882 SEQ=4079629157 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF0172A0000000001030307) Oct 14 05:11:59 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22024 DF PROTO=TCP SPT=37288 DPT=9882 SEQ=4079629157 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF01B1A0000000001030307) Oct 14 05:12:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47080 DF PROTO=TCP SPT=42516 DPT=9102 SEQ=4016949133 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT 
(020405500402080A2AF0201A0000000001030307) Oct 14 05:12:01 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22025 DF PROTO=TCP SPT=37288 DPT=9882 SEQ=4079629157 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF0231A0000000001030307) Oct 14 05:12:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8. Oct 14 05:12:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea. Oct 14 05:12:03 localhost systemd[1]: tmp-crun.qZKs9u.mount: Deactivated successfully. Oct 14 05:12:03 localhost podman[107657]: 2025-10-14 09:12:03.272767794 +0000 UTC m=+0.111071159 container health_status 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, com.redhat.component=openstack-collectd-container, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 collectd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, config_id=tripleo_step3, distribution-scope=public, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, tcib_managed=true, build-date=2025-07-21T13:04:03, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, version=17.1.9, description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, vcs-type=git, release=2, batch=17.1_20250721.1, architecture=x86_64) Oct 14 05:12:03 localhost podman[107657]: 2025-10-14 09:12:03.290162563 +0000 UTC m=+0.128465908 container exec_died 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, com.redhat.component=openstack-collectd-container, version=17.1.9, container_name=collectd, distribution-scope=public, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, maintainer=OpenStack TripleO Team, 
config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, io.buildah.version=1.33.12, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, batch=17.1_20250721.1, release=2, summary=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-07-21T13:04:03, name=rhosp17/openstack-collectd, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true) Oct 14 05:12:03 localhost systemd[1]: 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8.service: Deactivated successfully. 
Oct 14 05:12:03 localhost podman[107658]: 2025-10-14 09:12:03.364973738 +0000 UTC m=+0.200389443 container health_status 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, release=1, batch=17.1_20250721.1, vcs-type=git, description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, tcib_managed=true, container_name=iscsid, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.9, architecture=x86_64, distribution-scope=public, build-date=2025-07-21T13:27:15, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, com.redhat.component=openstack-iscsid-container, 
io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, name=rhosp17/openstack-iscsid, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1) Oct 14 05:12:03 localhost podman[107658]: 2025-10-14 09:12:03.375321398 +0000 UTC m=+0.210737153 container exec_died 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, managed_by=tripleo_ansible, release=1, vcs-type=git, container_name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, io.buildah.version=1.33.12, com.redhat.component=openstack-iscsid-container, 
version=17.1.9, batch=17.1_20250721.1, build-date=2025-07-21T13:27:15, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-iscsid, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, config_id=tripleo_step3) Oct 14 05:12:03 localhost systemd[1]: 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea.service: Deactivated successfully. Oct 14 05:12:05 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22026 DF PROTO=TCP SPT=37288 DPT=9882 SEQ=4079629157 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF032DA0000000001030307) Oct 14 05:12:08 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43391 DF PROTO=TCP SPT=45108 DPT=9105 SEQ=3851403259 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF0400F0000000001030307) Oct 14 05:12:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638. Oct 14 05:12:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f. Oct 14 05:12:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e. Oct 14 05:12:09 localhost systemd[1]: tmp-crun.6zWlzk.mount: Deactivated successfully. 
Oct 14 05:12:09 localhost podman[107779]: Error: container def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e is not running Oct 14 05:12:09 localhost systemd[1]: def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e.service: Main process exited, code=exited, status=125/n/a Oct 14 05:12:09 localhost systemd[1]: def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e.service: Failed with result 'exit-code'. Oct 14 05:12:09 localhost podman[107777]: 2025-10-14 09:12:09.537835051 +0000 UTC m=+0.115827786 container health_status 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.9, container_name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-ceilometer-ipmi, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-07-21T15:29:47, maintainer=OpenStack TripleO Team, release=1, vcs-type=git, io.buildah.version=1.33.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, managed_by=tripleo_ansible, vendor=Red Hat, Inc., architecture=x86_64, batch=17.1_20250721.1, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true) Oct 14 05:12:09 localhost podman[107777]: 2025-10-14 09:12:09.61890885 +0000 UTC m=+0.196901585 container exec_died 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.openshift.expose-services=, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-type=git, release=1, distribution-scope=public, version=17.1.9, container_name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-07-21T15:29:47, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., name=rhosp17/openstack-ceilometer-ipmi, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, io.buildah.version=1.33.12, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, architecture=x86_64, batch=17.1_20250721.1, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/agreements) Oct 14 05:12:09 localhost systemd[1]: 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638.service: Deactivated successfully. 
Oct 14 05:12:09 localhost podman[107778]: 2025-10-14 09:12:09.684327364 +0000 UTC m=+0.262156014 container health_status d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, build-date=2025-07-21T13:07:52, name=rhosp17/openstack-cron, com.redhat.component=openstack-cron-container, architecture=x86_64, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/agreements, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 cron, version=17.1.9, description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20250721.1, container_name=logrotate_crond, io.buildah.version=1.33.12, io.openshift.expose-services=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, release=1, distribution-scope=public) Oct 14 05:12:09 localhost podman[107778]: 2025-10-14 09:12:09.696165472 +0000 UTC m=+0.273994112 container exec_died d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, version=17.1.9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, managed_by=tripleo_ansible, release=1, com.redhat.component=openstack-cron-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, io.openshift.tags=rhosp osp openstack 
osp-17.1, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, name=rhosp17/openstack-cron, vcs-type=git, io.openshift.expose-services=, build-date=2025-07-21T13:07:52, config_id=tripleo_step4, tcib_managed=true, vendor=Red Hat, Inc., architecture=x86_64, container_name=logrotate_crond, io.buildah.version=1.33.12, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron) Oct 14 05:12:09 localhost systemd[1]: d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f.service: Deactivated successfully. Oct 14 05:12:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43392 DF PROTO=TCP SPT=45108 DPT=9105 SEQ=3851403259 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF0441A0000000001030307) Oct 14 05:12:10 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30374 DF PROTO=TCP SPT=55272 DPT=9100 SEQ=908697809 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF045F70000000001030307) Oct 14 05:12:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884. 
Oct 14 05:12:10 localhost podman[107834]: 2025-10-14 09:12:10.460274 +0000 UTC m=+0.056339065 container health_status 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-compute, config_id=tripleo_step4, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, tcib_managed=true, vcs-type=git, batch=17.1_20250721.1, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-07-21T14:48:37, distribution-scope=public, container_name=nova_migration_target, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, release=1, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.9) Oct 14 05:12:10 localhost systemd[1]: tmp-crun.SftoMw.mount: Deactivated successfully. Oct 14 05:12:10 localhost podman[107834]: 2025-10-14 09:12:10.820115698 +0000 UTC m=+0.416180743 container exec_died 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, release=1, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-07-21T14:48:37, managed_by=tripleo_ansible, 
io.openshift.expose-services=, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, version=17.1.9, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, name=rhosp17/openstack-nova-compute, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, batch=17.1_20250721.1, container_name=nova_migration_target, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1) Oct 14 05:12:10 localhost systemd[1]: 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884.service: Deactivated successfully. Oct 14 05:12:11 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30135 DF PROTO=TCP SPT=35728 DPT=9101 SEQ=4113503426 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF048D90000000001030307) Oct 14 05:12:11 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30375 DF PROTO=TCP SPT=55272 DPT=9100 SEQ=908697809 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF04A1A0000000001030307) Oct 14 05:12:11 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43393 DF PROTO=TCP SPT=45108 DPT=9105 SEQ=3851403259 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF04C1A0000000001030307) Oct 14 
05:12:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e. Oct 14 05:12:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5. Oct 14 05:12:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2. Oct 14 05:12:13 localhost systemd[1]: tmp-crun.m91sqL.mount: Deactivated successfully. Oct 14 05:12:13 localhost podman[107859]: 2025-10-14 09:12:13.242397959 +0000 UTC m=+0.071225286 container health_status a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, release=1, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, version=17.1.9, build-date=2025-07-21T13:28:44, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.tags=rhosp osp openstack osp-17.1, 
com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, tcib_managed=true, vendor=Red Hat, Inc., batch=17.1_20250721.1, config_id=tripleo_step4, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, vcs-type=git, com.redhat.component=openstack-ovn-controller-container, container_name=ovn_controller, io.buildah.version=1.33.12) Oct 14 05:12:13 localhost podman[107859]: 2025-10-14 09:12:13.252441059 +0000 UTC m=+0.081268426 container exec_died a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, architecture=x86_64, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, version=17.1.9, description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, managed_by=tripleo_ansible, release=1, container_name=ovn_controller, io.buildah.version=1.33.12, vcs-type=git, build-date=2025-07-21T13:28:44, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, vendor=Red Hat, Inc., config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 
17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, batch=17.1_20250721.1, com.redhat.component=openstack-ovn-controller-container, config_id=tripleo_step4) Oct 14 05:12:13 localhost podman[107859]: unhealthy Oct 14 05:12:13 localhost systemd[1]: a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5.service: Main process exited, code=exited, status=1/FAILURE Oct 14 05:12:13 localhost systemd[1]: a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5.service: Failed with result 'exit-code'. Oct 14 05:12:13 localhost podman[107865]: 2025-10-14 09:12:13.301655643 +0000 UTC m=+0.127550119 container health_status ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=unhealthy, io.openshift.expose-services=, managed_by=tripleo_ansible, build-date=2025-07-21T14:48:37, com.redhat.license_terms=https://www.redhat.com/agreements, release=1, io.openshift.tags=rhosp osp openstack osp-17.1, batch=17.1_20250721.1, distribution-scope=public, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, architecture=x86_64, container_name=nova_compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a-4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 
'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.9, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, vcs-type=git) Oct 14 05:12:13 localhost podman[107865]: 2025-10-14 09:12:13.31350857 +0000 UTC m=+0.139403056 container exec_died ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, container_name=nova_compute, 
vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, batch=17.1_20250721.1, build-date=2025-07-21T14:48:37, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a-4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, architecture=x86_64, vendor=Red Hat, 
Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step5, tcib_managed=true, distribution-scope=public, io.openshift.expose-services=, managed_by=tripleo_ansible, release=1, com.redhat.component=openstack-nova-compute-container) Oct 14 05:12:13 localhost podman[107865]: unhealthy Oct 14 05:12:13 localhost podman[107858]: 2025-10-14 09:12:13.221971536 +0000 UTC m=+0.059405619 container health_status 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, maintainer=OpenStack TripleO Team, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-07-21T16:28:53, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=ovn_metadata_agent, version=17.1.9, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20250721.1, io.buildah.version=1.33.12, io.openshift.expose-services=, distribution-scope=public, vcs-type=git, release=1, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, managed_by=tripleo_ansible, name=rhosp17/openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, vendor=Red Hat, Inc., com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64) Oct 14 05:12:13 localhost systemd[1]: ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2.service: Main process exited, code=exited, status=1/FAILURE Oct 14 05:12:13 localhost systemd[1]: ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2.service: Failed with result 'exit-code'. 
Oct 14 05:12:13 localhost podman[107858]: 2025-10-14 09:12:13.351153964 +0000 UTC m=+0.188588077 container exec_died 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.buildah.version=1.33.12, io.openshift.expose-services=, version=17.1.9, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, config_id=tripleo_step4, build-date=2025-07-21T16:28:53, maintainer=OpenStack TripleO 
Team, release=1, container_name=ovn_metadata_agent, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, batch=17.1_20250721.1, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, managed_by=tripleo_ansible, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, vendor=Red Hat, Inc., com.redhat.component=openstack-neutron-metadata-agent-ovn-container) Oct 14 05:12:13 localhost podman[107858]: unhealthy Oct 14 05:12:13 localhost systemd[1]: 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e.service: Main process exited, code=exited, status=1/FAILURE Oct 14 05:12:13 localhost systemd[1]: 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e.service: Failed with result 'exit-code'. Oct 14 05:12:15 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43394 DF PROTO=TCP SPT=45108 DPT=9105 SEQ=3851403259 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF05BDB0000000001030307) Oct 14 05:12:18 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30138 DF PROTO=TCP SPT=35728 DPT=9101 SEQ=4113503426 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF0649A0000000001030307) Oct 14 05:12:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6. 
Oct 14 05:12:19 localhost systemd[1]: tmp-crun.vwvVu6.mount: Deactivated successfully.
Oct 14 05:12:19 localhost podman[107918]: 2025-10-14 09:12:19.496486145 +0000 UTC m=+0.079333786 container health_status 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, release=1, batch=17.1_20250721.1, build-date=2025-07-21T13:07:59, container_name=metrics_qdr, name=rhosp17/openstack-qdrouterd, tcib_managed=true, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.9, com.redhat.component=openstack-qdrouterd-container, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, vcs-type=git, vendor=Red Hat, Inc., distribution-scope=public, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']})
Oct 14 05:12:19 localhost podman[107918]: 2025-10-14 09:12:19.676979141 +0000 UTC m=+0.259826732 container exec_died 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, batch=17.1_20250721.1, container_name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1, io.openshift.expose-services=, name=rhosp17/openstack-qdrouterd, config_id=tripleo_step1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-07-21T13:07:59, vcs-type=git, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, vendor=Red Hat, Inc., com.redhat.component=openstack-qdrouterd-container, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/agreements, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, version=17.1.9, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, release=1)
Oct 14 05:12:19 localhost systemd[1]: 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6.service: Deactivated successfully.
Oct 14 05:12:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20325 DF PROTO=TCP SPT=33458 DPT=9102 SEQ=2972490276 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF079950000000001030307)
Oct 14 05:12:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20326 DF PROTO=TCP SPT=33458 DPT=9102 SEQ=2972490276 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF07D9A0000000001030307)
Oct 14 05:12:28 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64380 DF PROTO=TCP SPT=52550 DPT=9882 SEQ=2993700080 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF08C5A0000000001030307)
Oct 14 05:12:30 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20328 DF PROTO=TCP SPT=33458 DPT=9102 SEQ=2972490276 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF0955A0000000001030307)
Oct 14 05:12:32 localhost podman[107642]: time="2025-10-14T09:12:32Z" level=warning msg="StopSignal SIGTERM failed to stop container ceilometer_agent_compute in 42 seconds, resorting to SIGKILL"
Oct 14 05:12:32 localhost systemd[1]: libpod-def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e.scope: Deactivated successfully.
Oct 14 05:12:32 localhost systemd[1]: libpod-def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e.scope: Consumed 6.642s CPU time.
Oct 14 05:12:32 localhost podman[107642]: 2025-10-14 09:12:32.615640234 +0000 UTC m=+42.097208521 container died def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, managed_by=tripleo_ansible, vcs-type=git, release=1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, build-date=2025-07-21T14:45:33, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, batch=17.1_20250721.1, com.redhat.component=openstack-ceilometer-compute-container, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.9, container_name=ceilometer_agent_compute, io.buildah.version=1.33.12, io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-compute, tcib_managed=true, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step4, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public)
Oct 14 05:12:32 localhost systemd[1]: def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e.timer: Deactivated successfully.
Oct 14 05:12:32 localhost systemd[1]: Stopped /usr/bin/podman healthcheck run def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e.
Oct 14 05:12:32 localhost systemd[1]: def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e.service: Failed to open /run/systemd/transient/def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e.service: No such file or directory
Oct 14 05:12:32 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e-userdata-shm.mount: Deactivated successfully.
Oct 14 05:12:32 localhost systemd[1]: var-lib-containers-storage-overlay-53d9edaa733ed29c02c610d1bdf1d5de99f647c41b4e60d4ee0bce49725f7b4f-merged.mount: Deactivated successfully.
Oct 14 05:12:32 localhost podman[107642]: 2025-10-14 09:12:32.66881535 +0000 UTC m=+42.150383627 container cleanup def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, tcib_managed=true, architecture=x86_64, config_id=tripleo_step4, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, com.redhat.component=openstack-ceilometer-compute-container, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=ceilometer_agent_compute, vcs-type=git, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, io.buildah.version=1.33.12, name=rhosp17/openstack-ceilometer-compute, build-date=2025-07-21T14:45:33, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, version=17.1.9, batch=17.1_20250721.1)
Oct 14 05:12:32 localhost podman[107642]: ceilometer_agent_compute
Oct 14 05:12:32 localhost systemd[1]: def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e.timer: Failed to open /run/systemd/transient/def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e.timer: No such file or directory
Oct 14 05:12:32 localhost systemd[1]: def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e.service: Failed to open /run/systemd/transient/def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e.service: No such file or directory
Oct 14 05:12:32 localhost podman[107948]: 2025-10-14 09:12:32.712904305 +0000 UTC m=+0.083173426 container cleanup def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, version=17.1.9, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, vcs-type=git, architecture=x86_64, batch=17.1_20250721.1, com.redhat.component=openstack-ceilometer-compute-container, distribution-scope=public, build-date=2025-07-21T14:45:33, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1, container_name=ceilometer_agent_compute, io.buildah.version=1.33.12, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhosp17/openstack-ceilometer-compute, vendor=Red Hat, Inc., config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible)
Oct 14 05:12:32 localhost systemd[1]: libpod-conmon-def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e.scope: Deactivated successfully.
Oct 14 05:12:32 localhost systemd[1]: def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e.timer: Failed to open /run/systemd/transient/def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e.timer: No such file or directory
Oct 14 05:12:32 localhost systemd[1]: def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e.service: Failed to open /run/systemd/transient/def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e.service: No such file or directory
Oct 14 05:12:32 localhost podman[107961]: 2025-10-14 09:12:32.818748491 +0000 UTC m=+0.073385343 container cleanup def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20250721.1, container_name=ceilometer_agent_compute, com.redhat.license_terms=https://www.redhat.com/agreements, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.9, config_id=tripleo_step4, io.openshift.expose-services=, distribution-scope=public, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, build-date=2025-07-21T14:45:33, release=1, tcib_managed=true, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, vcs-type=git, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-compute-container, name=rhosp17/openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']})
Oct 14 05:12:32 localhost podman[107961]: ceilometer_agent_compute
Oct 14 05:12:32 localhost systemd[1]: tripleo_ceilometer_agent_compute.service: Deactivated successfully.
Oct 14 05:12:32 localhost systemd[1]: Stopped ceilometer_agent_compute container.
Oct 14 05:12:32 localhost systemd[1]: tripleo_ceilometer_agent_compute.service: Consumed 1.130s CPU time, no IO.
Oct 14 05:12:33 localhost python3.9[108065]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_ceilometer_agent_ipmi.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 14 05:12:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8.
Oct 14 05:12:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea.
Oct 14 05:12:33 localhost systemd[1]: Reloading.
Oct 14 05:12:33 localhost podman[108068]: 2025-10-14 09:12:33.721734217 +0000 UTC m=+0.074423205 container health_status 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, batch=17.1_20250721.1, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.expose-services=, vcs-type=git, config_id=tripleo_step3, release=1, version=17.1.9, architecture=x86_64, build-date=2025-07-21T13:27:15, com.redhat.component=openstack-iscsid-container, maintainer=OpenStack TripleO Team, container_name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, name=rhosp17/openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid)
Oct 14 05:12:33 localhost podman[108067]: 2025-10-14 09:12:33.762944653 +0000 UTC m=+0.118097587 container health_status 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, architecture=x86_64, vendor=Red Hat, Inc., container_name=collectd, vcs-type=git, com.redhat.component=openstack-collectd-container, io.openshift.tags=rhosp osp openstack osp-17.1, io.buildah.version=1.33.12, release=2, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20250721.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step3, name=rhosp17/openstack-collectd, build-date=2025-07-21T13:04:03, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, version=17.1.9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, tcib_managed=true)
Oct 14 05:12:33 localhost podman[108068]: 2025-10-14 09:12:33.789124583 +0000 UTC m=+0.141813561 container exec_died 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, batch=17.1_20250721.1, managed_by=tripleo_ansible, version=17.1.9, name=rhosp17/openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, container_name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.33.12, build-date=2025-07-21T13:27:15, com.redhat.component=openstack-iscsid-container, maintainer=OpenStack TripleO Team, vcs-type=git, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2)
Oct 14 05:12:33 localhost systemd-rc-local-generator[108134]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 05:12:33 localhost systemd-sysv-generator[108137]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 05:12:33 localhost podman[108067]: 2025-10-14 09:12:33.849383188 +0000 UTC m=+0.204536082 container exec_died 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, release=2, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-07-21T13:04:03, io.openshift.expose-services=, managed_by=tripleo_ansible, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, config_id=tripleo_step3, com.redhat.component=openstack-collectd-container, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, architecture=x86_64, batch=17.1_20250721.1, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, vendor=Red Hat, Inc., container_name=collectd, name=rhosp17/openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd)
Oct 14 05:12:33 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 05:12:34 localhost systemd[1]: 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8.service: Deactivated successfully.
Oct 14 05:12:34 localhost systemd[1]: 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea.service: Deactivated successfully.
Oct 14 05:12:34 localhost systemd[1]: Stopping ceilometer_agent_ipmi container...
Oct 14 05:12:35 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64383 DF PROTO=TCP SPT=52550 DPT=9882 SEQ=2993700080 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF0A81A0000000001030307)
Oct 14 05:12:38 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1888 DF PROTO=TCP SPT=55416 DPT=9105 SEQ=332721880 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF0B53E0000000001030307)
Oct 14 05:12:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638.
Oct 14 05:12:39 localhost podman[108159]: Error: container 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 is not running
Oct 14 05:12:39 localhost systemd[1]: 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638.service: Main process exited, code=exited, status=125/n/a
Oct 14 05:12:39 localhost systemd[1]: 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638.service: Failed with result 'exit-code'.
Oct 14 05:12:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f.
Oct 14 05:12:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1889 DF PROTO=TCP SPT=55416 DPT=9105 SEQ=332721880 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF0B95B0000000001030307)
Oct 14 05:12:39 localhost podman[108172]: 2025-10-14 09:12:39.848739453 +0000 UTC m=+0.098204080 container health_status d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, version=17.1.9, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, managed_by=tripleo_ansible, io.buildah.version=1.33.12, vcs-type=git, com.redhat.component=openstack-cron-container, distribution-scope=public, vendor=Red Hat, Inc., name=rhosp17/openstack-cron, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, container_name=logrotate_crond, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, architecture=x86_64, build-date=2025-07-21T13:07:52, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, release=1, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true, batch=17.1_20250721.1)
Oct 14 05:12:39 localhost podman[108172]: 2025-10-14 09:12:39.884301944 +0000 UTC m=+0.133766531 container exec_died d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, architecture=x86_64, build-date=2025-07-21T13:07:52, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, release=1, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-cron, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, summary=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, com.redhat.license_terms=https://www.redhat.com/agreements, version=17.1.9, batch=17.1_20250721.1, container_name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, com.redhat.component=openstack-cron-container, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, vendor=Red Hat, Inc., distribution-scope=public, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1)
Oct 14 05:12:39 localhost systemd[1]: d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f.service: Deactivated successfully.
Oct 14 05:12:41 localhost ceph-osd[31500]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Oct 14 05:12:41 localhost ceph-osd[31500]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 4800.1 total, 600.0 interval#012Cumulative writes: 4960 writes, 22K keys, 4960 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s#012Cumulative WAL: 4960 writes, 649 syncs, 7.64 writes per sync, written: 0.02 GB, 0.00 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 4 writes, 8 keys, 4 commit groups, 1.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s#012Interval WAL: 4 writes, 2 syncs, 2.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent Oct 14 05:12:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884. Oct 14 05:12:41 localhost podman[108191]: 2025-10-14 09:12:41.763234016 +0000 UTC m=+0.104564597 container health_status 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, build-date=2025-07-21T14:48:37, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, release=1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20250721.1, io.buildah.version=1.33.12, io.openshift.expose-services=, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-compute, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, version=17.1.9, container_name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, tcib_managed=true) Oct 14 05:12:41 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1890 DF PROTO=TCP SPT=55416 DPT=9105 SEQ=332721880 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF0C15A0000000001030307) Oct 14 05:12:42 localhost podman[108191]: 2025-10-14 09:12:42.113122735 +0000 UTC m=+0.454453316 container exec_died 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, distribution-scope=public, version=17.1.9, build-date=2025-07-21T14:48:37, 
container_name=nova_migration_target, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, batch=17.1_20250721.1, tcib_managed=true, vendor=Red Hat, Inc., managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, vcs-type=git, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/agreements) Oct 14 
05:12:42 localhost systemd[1]: 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884.service: Deactivated successfully. Oct 14 05:12:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e. Oct 14 05:12:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5. Oct 14 05:12:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2. Oct 14 05:12:43 localhost podman[108215]: 2025-10-14 09:12:43.755314493 +0000 UTC m=+0.087654165 container health_status 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, tcib_managed=true, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-07-21T16:28:53, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, container_name=ovn_metadata_agent, release=1, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.tags=rhosp osp openstack osp-17.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 
'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.buildah.version=1.33.12, distribution-scope=public, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, vcs-type=git, batch=17.1_20250721.1, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=) Oct 14 05:12:43 localhost podman[108215]: 2025-10-14 09:12:43.768948864 +0000 UTC m=+0.101288556 container exec_died 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, config_id=tripleo_step4, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-type=git, 
container_name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, tcib_managed=true, io.buildah.version=1.33.12, managed_by=tripleo_ansible, version=17.1.9, architecture=x86_64, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, build-date=2025-07-21T16:28:53, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, distribution-scope=public, summary=Red 
Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, batch=17.1_20250721.1) Oct 14 05:12:43 localhost podman[108215]: unhealthy Oct 14 05:12:43 localhost systemd[1]: 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e.service: Main process exited, code=exited, status=1/FAILURE Oct 14 05:12:43 localhost systemd[1]: 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e.service: Failed with result 'exit-code'. Oct 14 05:12:43 localhost podman[108216]: 2025-10-14 09:12:43.732873579 +0000 UTC m=+0.066637914 container health_status a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.9, io.openshift.expose-services=, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, maintainer=OpenStack TripleO Team, batch=17.1_20250721.1, container_name=ovn_controller, io.buildah.version=1.33.12, name=rhosp17/openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, build-date=2025-07-21T13:28:44, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, distribution-scope=public, release=1, tcib_managed=true, 
config_id=tripleo_step4, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, architecture=x86_64, com.redhat.component=openstack-ovn-controller-container, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1) Oct 14 05:12:43 localhost podman[108217]: 2025-10-14 09:12:43.849292191 +0000 UTC m=+0.181013413 container health_status ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=unhealthy, name=rhosp17/openstack-nova-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, managed_by=tripleo_ansible, version=17.1.9, build-date=2025-07-21T14:48:37, architecture=x86_64, tcib_managed=true, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.33.12, vendor=Red Hat, Inc., batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, distribution-scope=public, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, config_id=tripleo_step5, container_name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a-4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 
'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team) Oct 14 05:12:43 localhost podman[108217]: 2025-10-14 09:12:43.863019186 +0000 UTC m=+0.194740358 container exec_died ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 
'49c4309af9a4fea3d3f53b6222780f5a-4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, vendor=Red Hat, Inc., tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, io.buildah.version=1.33.12, config_id=tripleo_step5, io.k8s.description=Red Hat OpenStack Platform 
17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, release=1, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, container_name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, batch=17.1_20250721.1, version=17.1.9, build-date=2025-07-21T14:48:37) Oct 14 05:12:43 localhost podman[108216]: 2025-10-14 09:12:43.865344528 +0000 UTC m=+0.199108863 container exec_died a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, batch=17.1_20250721.1, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, io.openshift.expose-services=, vcs-type=git, name=rhosp17/openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, architecture=x86_64, build-date=2025-07-21T13:28:44, vendor=Red Hat, Inc., container_name=ovn_controller, version=17.1.9, managed_by=tripleo_ansible, 
com.redhat.component=openstack-ovn-controller-container, description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true) Oct 14 05:12:43 localhost podman[108216]: unhealthy Oct 14 05:12:43 localhost systemd[1]: a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5.service: Main process exited, code=exited, status=1/FAILURE Oct 14 05:12:43 localhost systemd[1]: a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5.service: Failed with result 'exit-code'. Oct 14 05:12:43 localhost podman[108217]: unhealthy Oct 14 05:12:43 localhost systemd[1]: ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2.service: Main process exited, code=exited, status=1/FAILURE Oct 14 05:12:43 localhost systemd[1]: ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2.service: Failed with result 'exit-code'. Oct 14 05:12:44 localhost systemd[1]: tmp-crun.OK2urs.mount: Deactivated successfully. 
Oct 14 05:12:45 localhost ceph-osd[32440]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Oct 14 05:12:45 localhost ceph-osd[32440]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 4800.1 total, 600.0 interval#012Cumulative writes: 5551 writes, 24K keys, 5551 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s#012Cumulative WAL: 5551 writes, 763 syncs, 7.28 writes per sync, written: 0.02 GB, 0.00 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 4 writes, 8 keys, 4 commit groups, 1.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s#012Interval WAL: 4 writes, 2 syncs, 2.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent Oct 14 05:12:45 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1891 DF PROTO=TCP SPT=55416 DPT=9105 SEQ=332721880 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF0D11A0000000001030307) Oct 14 05:12:48 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63194 DF PROTO=TCP SPT=44034 DPT=9101 SEQ=3784802963 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF0D9DA0000000001030307) Oct 14 05:12:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6. 
Oct 14 05:12:50 localhost podman[108281]: 2025-10-14 09:12:50.011013611 +0000 UTC m=+0.094218247 container health_status 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, managed_by=tripleo_ansible, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., config_id=tripleo_step1, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, container_name=metrics_qdr, io.buildah.version=1.33.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, batch=17.1_20250721.1, build-date=2025-07-21T13:07:59, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, 
com.redhat.component=openstack-qdrouterd-container, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, version=17.1.9, distribution-scope=public, name=rhosp17/openstack-qdrouterd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1) Oct 14 05:12:50 localhost podman[108281]: 2025-10-14 09:12:50.188132473 +0000 UTC m=+0.271337119 container exec_died 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, managed_by=tripleo_ansible, container_name=metrics_qdr, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, version=17.1.9, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.33.12, build-date=2025-07-21T13:07:59, release=1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', 
'/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, name=rhosp17/openstack-qdrouterd, maintainer=OpenStack TripleO Team, batch=17.1_20250721.1, config_id=tripleo_step1, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, io.openshift.expose-services=, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, architecture=x86_64, com.redhat.component=openstack-qdrouterd-container) Oct 14 05:12:50 localhost systemd[1]: 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6.service: Deactivated successfully. Oct 14 05:12:52 localhost sshd[108309]: main: sshd: ssh-rsa algorithm is disabled Oct 14 05:12:52 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Oct 14 05:12:52 localhost recover_tripleo_nova_virtqemud[108312]: 62551 Oct 14 05:12:52 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Oct 14 05:12:52 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. 
Oct 14 05:12:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29558 DF PROTO=TCP SPT=54208 DPT=9102 SEQ=4061358172 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF0EEC50000000001030307) Oct 14 05:12:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29559 DF PROTO=TCP SPT=54208 DPT=9102 SEQ=4061358172 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF0F2DA0000000001030307) Oct 14 05:12:58 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64346 DF PROTO=TCP SPT=60850 DPT=9882 SEQ=1902545445 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF1018B0000000001030307) Oct 14 05:13:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29561 DF PROTO=TCP SPT=54208 DPT=9102 SEQ=4061358172 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF10A9B0000000001030307) Oct 14 05:13:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8. Oct 14 05:13:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea. 
Oct 14 05:13:04 localhost podman[108314]: 2025-10-14 09:13:04.483650048 +0000 UTC m=+0.067413937 container health_status 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, config_id=tripleo_step3, architecture=x86_64, io.openshift.expose-services=, build-date=2025-07-21T13:27:15, batch=17.1_20250721.1, com.redhat.component=openstack-iscsid-container, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, vcs-type=git, release=1, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, managed_by=tripleo_ansible, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, name=rhosp17/openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.9) Oct 14 05:13:04 localhost podman[108314]: 2025-10-14 09:13:04.493944917 +0000 UTC m=+0.077708796 container exec_died 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, container_name=iscsid, io.buildah.version=1.33.12, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, name=rhosp17/openstack-iscsid, batch=17.1_20250721.1, version=17.1.9, 
architecture=x86_64, build-date=2025-07-21T13:27:15, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, com.redhat.component=openstack-iscsid-container, distribution-scope=public, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, release=1) Oct 14 05:13:04 localhost systemd[1]: 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea.service: Deactivated successfully. Oct 14 05:13:04 localhost systemd[1]: tmp-crun.vLObZL.mount: Deactivated successfully. Oct 14 05:13:04 localhost podman[108313]: 2025-10-14 09:13:04.545698249 +0000 UTC m=+0.131511682 container health_status 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, io.buildah.version=1.33.12, architecture=x86_64, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T13:04:03, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, release=2, managed_by=tripleo_ansible, distribution-scope=public, maintainer=OpenStack TripleO Team, version=17.1.9, tcib_managed=true, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 
'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.expose-services=, name=rhosp17/openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, container_name=collectd) Oct 14 05:13:04 localhost podman[108313]: 2025-10-14 09:13:04.558057991 +0000 UTC m=+0.143871434 container exec_died 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, description=Red Hat OpenStack 
Platform 17.1 collectd, io.openshift.expose-services=, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, container_name=collectd, build-date=2025-07-21T13:04:03, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, architecture=x86_64, version=17.1.9, batch=17.1_20250721.1, release=2, name=rhosp17/openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3) Oct 14 05:13:04 localhost 
systemd[1]: 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8.service: Deactivated successfully. Oct 14 05:13:05 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64349 DF PROTO=TCP SPT=60850 DPT=9882 SEQ=1902545445 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF11D5A0000000001030307) Oct 14 05:13:08 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19081 DF PROTO=TCP SPT=39056 DPT=9105 SEQ=957374180 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF12A6F0000000001030307) Oct 14 05:13:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19082 DF PROTO=TCP SPT=39056 DPT=9105 SEQ=957374180 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF12E5A0000000001030307) Oct 14 05:13:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638. Oct 14 05:13:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f. Oct 14 05:13:10 localhost systemd[1]: tmp-crun.yvWiNI.mount: Deactivated successfully. 
Oct 14 05:13:10 localhost podman[108430]: 2025-10-14 09:13:10.265606353 +0000 UTC m=+0.095531267 container health_status d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, com.redhat.component=openstack-cron-container, managed_by=tripleo_ansible, architecture=x86_64, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, build-date=2025-07-21T13:07:52, release=1, container_name=logrotate_crond, distribution-scope=public, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20250721.1, config_id=tripleo_step4, io.buildah.version=1.33.12, name=rhosp17/openstack-cron, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-type=git) Oct 14 05:13:10 localhost podman[108429]: Error: container 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 is not running Oct 14 05:13:10 localhost systemd[1]: 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638.service: Main process exited, code=exited, status=125/n/a Oct 14 05:13:10 localhost systemd[1]: 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638.service: Failed with result 'exit-code'. Oct 14 05:13:10 localhost podman[108430]: 2025-10-14 09:13:10.303072643 +0000 UTC m=+0.132997507 container exec_died d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, vendor=Red Hat, Inc., version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=openstack-cron-container, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, build-date=2025-07-21T13:07:52, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, container_name=logrotate_crond, io.openshift.tags=rhosp osp openstack osp-17.1, description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, batch=17.1_20250721.1, name=rhosp17/openstack-cron, release=1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 
'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-type=git, architecture=x86_64) Oct 14 05:13:10 localhost systemd[1]: d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f.service: Deactivated successfully. Oct 14 05:13:11 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19083 DF PROTO=TCP SPT=39056 DPT=9105 SEQ=957374180 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF1365A0000000001030307) Oct 14 05:13:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884. 
Oct 14 05:13:12 localhost podman[108461]: 2025-10-14 09:13:12.495323995 +0000 UTC m=+0.076440708 container health_status 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, container_name=nova_migration_target, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., batch=17.1_20250721.1, name=rhosp17/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, version=17.1.9, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, build-date=2025-07-21T14:48:37, managed_by=tripleo_ansible, release=1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, com.redhat.component=openstack-nova-compute-container) Oct 14 05:13:12 localhost podman[108461]: 2025-10-14 09:13:12.894243571 +0000 UTC m=+0.475360254 container exec_died 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, version=17.1.9, config_id=tripleo_step4, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, managed_by=tripleo_ansible, release=1, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, name=rhosp17/openstack-nova-compute, batch=17.1_20250721.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, container_name=nova_migration_target, io.openshift.expose-services=, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, tcib_managed=true, vendor=Red Hat, Inc., build-date=2025-07-21T14:48:37, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, summary=Red Hat OpenStack Platform 17.1 nova-compute) Oct 14 05:13:12 localhost systemd[1]: 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884.service: Deactivated successfully. Oct 14 05:13:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e. Oct 14 05:13:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5. Oct 14 05:13:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2. 
Oct 14 05:13:14 localhost podman[108483]: 2025-10-14 09:13:14.750421032 +0000 UTC m=+0.086295592 container health_status a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, name=rhosp17/openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.component=openstack-ovn-controller-container, architecture=x86_64, config_id=tripleo_step4, vcs-type=git, release=1, build-date=2025-07-21T13:28:44, description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, version=17.1.9, batch=17.1_20250721.1, container_name=ovn_controller, io.buildah.version=1.33.12, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements) Oct 14 05:13:14 localhost podman[108483]: 2025-10-14 09:13:14.794054342 
+0000 UTC m=+0.129928912 container exec_died a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.component=openstack-ovn-controller-container, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1, architecture=x86_64, batch=17.1_20250721.1, version=17.1.9, description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, io.openshift.expose-services=, vendor=Red Hat, Inc., vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1, build-date=2025-07-21T13:28:44, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, io.buildah.version=1.33.12, config_id=tripleo_step4, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, container_name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller) Oct 14 05:13:14 localhost podman[108483]: unhealthy Oct 14 05:13:14 localhost systemd[1]: a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5.service: Main process 
exited, code=exited, status=1/FAILURE Oct 14 05:13:14 localhost systemd[1]: a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5.service: Failed with result 'exit-code'. Oct 14 05:13:14 localhost systemd[1]: tmp-crun.rQhk7A.mount: Deactivated successfully. Oct 14 05:13:14 localhost podman[108482]: 2025-10-14 09:13:14.824134593 +0000 UTC m=+0.162567323 container health_status 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, name=rhosp17/openstack-neutron-metadata-agent-ovn, version=17.1.9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, batch=17.1_20250721.1, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T16:28:53, vendor=Red Hat, Inc., config_id=tripleo_step4, container_name=ovn_metadata_agent, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, maintainer=OpenStack TripleO Team, architecture=x86_64, vcs-type=git, io.buildah.version=1.33.12, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Oct 14 05:13:14 localhost podman[108484]: 2025-10-14 09:13:14.727535304 +0000 UTC m=+0.065435907 container health_status ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=unhealthy, build-date=2025-07-21T14:48:37, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 
'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a-4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20250721.1, maintainer=OpenStack TripleO Team, version=17.1.9, config_id=tripleo_step5, distribution-scope=public, container_name=nova_compute, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, 
vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc.) Oct 14 05:13:14 localhost podman[108484]: 2025-10-14 09:13:14.859299382 +0000 UTC m=+0.197200015 container exec_died ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a-4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', 
'/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, managed_by=tripleo_ansible, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, release=1, tcib_managed=true, batch=17.1_20250721.1, config_id=tripleo_step5, com.redhat.component=openstack-nova-compute-container, version=17.1.9, build-date=2025-07-21T14:48:37, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-nova-compute, vcs-type=git) Oct 14 05:13:14 localhost podman[108484]: unhealthy Oct 14 05:13:14 localhost podman[108482]: 2025-10-14 09:13:14.866218046 +0000 UTC m=+0.204650766 container exec_died 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., config_id=tripleo_step4, container_name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack 
Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, version=17.1.9, io.openshift.expose-services=, tcib_managed=true, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, release=1, io.buildah.version=1.33.12, vcs-type=git, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-07-21T16:28:53, architecture=x86_64) Oct 14 05:13:14 localhost systemd[1]: 
ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2.service: Main process exited, code=exited, status=1/FAILURE Oct 14 05:13:14 localhost systemd[1]: ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2.service: Failed with result 'exit-code'. Oct 14 05:13:14 localhost podman[108482]: unhealthy Oct 14 05:13:14 localhost systemd[1]: 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e.service: Main process exited, code=exited, status=1/FAILURE Oct 14 05:13:14 localhost systemd[1]: 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e.service: Failed with result 'exit-code'. Oct 14 05:13:15 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19084 DF PROTO=TCP SPT=39056 DPT=9105 SEQ=957374180 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF1461B0000000001030307) Oct 14 05:13:16 localhost podman[108145]: time="2025-10-14T09:13:16Z" level=warning msg="StopSignal SIGTERM failed to stop container ceilometer_agent_ipmi in 42 seconds, resorting to SIGKILL" Oct 14 05:13:16 localhost systemd[1]: libpod-07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638.scope: Deactivated successfully. Oct 14 05:13:16 localhost systemd[1]: libpod-07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638.scope: Consumed 6.630s CPU time. 
Oct 14 05:13:16 localhost podman[108145]: 2025-10-14 09:13:16.136240723 +0000 UTC m=+42.070972594 container stop 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.buildah.version=1.33.12, vcs-type=git, container_name=ceilometer_agent_ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, version=17.1.9, release=1, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, batch=17.1_20250721.1, com.redhat.component=openstack-ceilometer-ipmi-container, build-date=2025-07-21T15:29:47, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack 
Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-ipmi, managed_by=tripleo_ansible, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f) Oct 14 05:13:16 localhost podman[108145]: 2025-10-14 09:13:16.172160775 +0000 UTC m=+42.106892716 container died 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, config_id=tripleo_step4, tcib_managed=true, 
com.redhat.license_terms=https://www.redhat.com/agreements, container_name=ceilometer_agent_ipmi, release=1, build-date=2025-07-21T15:29:47, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., version=17.1.9, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-ipmi, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, vcs-type=git, batch=17.1_20250721.1, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, io.openshift.tags=rhosp osp openstack osp-17.1) Oct 14 05:13:16 localhost systemd[1]: 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638.timer: Deactivated successfully. Oct 14 05:13:16 localhost systemd[1]: Stopped /usr/bin/podman healthcheck run 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638. Oct 14 05:13:16 localhost systemd[1]: 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638.service: Failed to open /run/systemd/transient/07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638.service: No such file or directory Oct 14 05:13:16 localhost systemd[1]: tmp-crun.6o9gpt.mount: Deactivated successfully. 
Oct 14 05:13:16 localhost podman[108145]: 2025-10-14 09:13:16.238033544 +0000 UTC m=+42.172765445 container cleanup 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-ipmi, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-07-21T15:29:47, com.redhat.license_terms=https://www.redhat.com/agreements, managed_by=tripleo_ansible, release=1, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, io.k8s.display-name=Red Hat 
OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, version=17.1.9, com.redhat.component=openstack-ceilometer-ipmi-container, io.buildah.version=1.33.12, distribution-scope=public, architecture=x86_64, batch=17.1_20250721.1, config_id=tripleo_step4, container_name=ceilometer_agent_ipmi) Oct 14 05:13:16 localhost podman[108145]: ceilometer_agent_ipmi Oct 14 05:13:16 localhost systemd[1]: 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638.timer: Failed to open /run/systemd/transient/07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638.timer: No such file or directory Oct 14 05:13:16 localhost systemd[1]: 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638.service: Failed to open /run/systemd/transient/07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638.service: No such file or directory Oct 14 05:13:16 localhost podman[108543]: 2025-10-14 09:13:16.254617308 +0000 UTC m=+0.099530002 container cleanup 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, distribution-scope=public, release=1, tcib_managed=true, container_name=ceilometer_agent_ipmi, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, name=rhosp17/openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.expose-services=, batch=17.1_20250721.1, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, build-date=2025-07-21T15:29:47, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.license_terms=https://www.redhat.com/agreements, version=17.1.9, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, architecture=x86_64, vcs-type=git) Oct 14 05:13:16 localhost systemd[1]: libpod-conmon-07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638.scope: Deactivated successfully. 
Oct 14 05:13:16 localhost systemd[1]: 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638.timer: Failed to open /run/systemd/transient/07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638.timer: No such file or directory Oct 14 05:13:16 localhost systemd[1]: 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638.service: Failed to open /run/systemd/transient/07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638.service: No such file or directory Oct 14 05:13:16 localhost podman[108558]: 2025-10-14 09:13:16.353608891 +0000 UTC m=+0.068048737 container cleanup 07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-ipmi-container, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, release=1, vendor=Red 
Hat, Inc., vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20250721.1, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, tcib_managed=true, managed_by=tripleo_ansible, version=17.1.9, architecture=x86_64, build-date=2025-07-21T15:29:47, name=rhosp17/openstack-ceilometer-ipmi, vcs-type=git) Oct 14 05:13:16 localhost podman[108558]: ceilometer_agent_ipmi Oct 14 05:13:16 localhost systemd[1]: tripleo_ceilometer_agent_ipmi.service: Deactivated successfully. Oct 14 05:13:16 localhost systemd[1]: Stopped ceilometer_agent_ipmi container. Oct 14 05:13:17 localhost systemd[1]: var-lib-containers-storage-overlay-ab5ae649c8d55ad3b61d788cb1580bb9641b83eaaaa1a012a2f15a639ad5659a-merged.mount: Deactivated successfully. Oct 14 05:13:17 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638-userdata-shm.mount: Deactivated successfully. Oct 14 05:13:17 localhost python3.9[108664]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_collectd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Oct 14 05:13:17 localhost systemd[1]: Reloading. Oct 14 05:13:17 localhost systemd-rc-local-generator[108694]: /etc/rc.d/rc.local is not marked executable, skipping. Oct 14 05:13:17 localhost systemd-sysv-generator[108697]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. 
Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Oct 14 05:13:17 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Oct 14 05:13:17 localhost systemd[1]: Stopping collectd container... Oct 14 05:13:17 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:49:0d:95 MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.110 DST=192.168.122.108 LEN=40 TOS=0x00 PREC=0x00 TTL=64 ID=0 DF PROTO=TCP SPT=6379 DPT=52766 SEQ=93525319 ACK=0 WINDOW=0 RES=0x00 RST URGP=0 Oct 14 05:13:18 localhost systemd[1]: libpod-0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8.scope: Deactivated successfully. Oct 14 05:13:18 localhost systemd[1]: libpod-0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8.scope: Consumed 2.360s CPU time. Oct 14 05:13:18 localhost podman[108705]: 2025-10-14 09:13:18.715502854 +0000 UTC m=+1.110983526 container died 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, version=17.1.9, config_id=tripleo_step3, architecture=x86_64, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1, container_name=collectd, tcib_managed=true, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.33.12, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, release=2, build-date=2025-07-21T13:04:03, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 
'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, name=rhosp17/openstack-collectd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, batch=17.1_20250721.1) Oct 14 05:13:18 localhost systemd[1]: 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8.timer: Deactivated successfully. Oct 14 05:13:18 localhost systemd[1]: Stopped /usr/bin/podman healthcheck run 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8. 
Oct 14 05:13:18 localhost systemd[1]: 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8.service: Failed to open /run/systemd/transient/0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8.service: No such file or directory Oct 14 05:13:18 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8-userdata-shm.mount: Deactivated successfully. Oct 14 05:13:18 localhost podman[108705]: 2025-10-14 09:13:18.782931231 +0000 UTC m=+1.178411873 container cleanup 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, summary=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, build-date=2025-07-21T13:04:03, com.redhat.component=openstack-collectd-container, version=17.1.9, config_id=tripleo_step3, architecture=x86_64, vcs-type=git, vendor=Red Hat, Inc., batch=17.1_20250721.1, managed_by=tripleo_ansible, name=rhosp17/openstack-collectd, io.buildah.version=1.33.12, container_name=collectd, maintainer=OpenStack TripleO Team, tcib_managed=true, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, release=2, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd) Oct 14 05:13:18 localhost podman[108705]: collectd Oct 14 05:13:18 localhost systemd[1]: 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8.timer: Failed to open /run/systemd/transient/0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8.timer: No such file or directory Oct 14 05:13:18 localhost systemd[1]: 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8.service: Failed to open /run/systemd/transient/0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8.service: No such file or directory Oct 14 05:13:18 localhost podman[108717]: 2025-10-14 09:13:18.832026111 +0000 UTC m=+0.092005809 container cleanup 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 collectd, 
release=2, com.redhat.component=openstack-collectd-container, container_name=collectd, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team, version=17.1.9, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, vcs-type=git, tcib_managed=true, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20250721.1, vendor=Red Hat, Inc., build-date=2025-07-21T13:04:03, distribution-scope=public, 
name=rhosp17/openstack-collectd) Oct 14 05:13:18 localhost systemd[1]: tripleo_collectd.service: Main process exited, code=exited, status=1/FAILURE Oct 14 05:13:18 localhost systemd[1]: libpod-conmon-0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8.scope: Deactivated successfully. Oct 14 05:13:18 localhost podman[108744]: error opening file `/run/crun/0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8/status`: No such file or directory Oct 14 05:13:18 localhost systemd[1]: 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8.timer: Failed to open /run/systemd/transient/0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8.timer: No such file or directory Oct 14 05:13:18 localhost systemd[1]: 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8.service: Failed to open /run/systemd/transient/0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8.service: No such file or directory Oct 14 05:13:18 localhost podman[108732]: 2025-10-14 09:13:18.946884226 +0000 UTC m=+0.073447374 container cleanup 0e38c3176cec14501c67e8a59785bee248daf29cf9ac7f4b8834da764d6b77e8 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, summary=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, config_id=tripleo_step3, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, release=2, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-collectd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, com.redhat.component=openstack-collectd-container, io.openshift.expose-services=, version=17.1.9, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc., batch=17.1_20250721.1, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.33.12, container_name=collectd, build-date=2025-07-21T13:04:03, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, vcs-type=git, tcib_managed=true, distribution-scope=public) Oct 14 05:13:18 localhost podman[108732]: collectd Oct 14 05:13:18 localhost systemd[1]: tripleo_collectd.service: Failed with result 'exit-code'. Oct 14 05:13:18 localhost systemd[1]: Stopped collectd container. Oct 14 05:13:19 localhost systemd[1]: var-lib-containers-storage-overlay-bf7b7bb08b691d902a723a4644d1ae132381580429bddbb6a2334ee503c366a0-merged.mount: Deactivated successfully. 
Oct 14 05:13:19 localhost python3.9[108837]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_iscsid.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Oct 14 05:13:19 localhost systemd[1]: Reloading. Oct 14 05:13:20 localhost systemd-rc-local-generator[108862]: /etc/rc.d/rc.local is not marked executable, skipping. Oct 14 05:13:20 localhost systemd-sysv-generator[108868]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Oct 14 05:13:20 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Oct 14 05:13:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6. Oct 14 05:13:20 localhost systemd[1]: Stopping iscsid container... 
Oct 14 05:13:20 localhost podman[108877]: 2025-10-14 09:13:20.431251279 +0000 UTC m=+0.077809770 container health_status 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, tcib_managed=true, config_id=tripleo_step1, vcs-type=git, io.buildah.version=1.33.12, name=rhosp17/openstack-qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, version=17.1.9, build-date=2025-07-21T13:07:59, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, release=1, 
maintainer=OpenStack TripleO Team, architecture=x86_64, batch=17.1_20250721.1, container_name=metrics_qdr, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 qdrouterd) Oct 14 05:13:20 localhost systemd[1]: libpod-2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea.scope: Deactivated successfully. Oct 14 05:13:20 localhost systemd[1]: libpod-2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea.scope: Consumed 1.190s CPU time. Oct 14 05:13:20 localhost podman[108879]: 2025-10-14 09:13:20.480148322 +0000 UTC m=+0.118534310 container died 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20250721.1, build-date=2025-07-21T13:27:15, vcs-type=git, name=rhosp17/openstack-iscsid, version=17.1.9, release=1, tcib_managed=true, container_name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, config_id=tripleo_step3, io.buildah.version=1.33.12, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, com.redhat.component=openstack-iscsid-container, vendor=Red Hat, Inc., distribution-scope=public, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid) Oct 14 05:13:20 localhost systemd[1]: 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea.timer: Deactivated successfully. Oct 14 05:13:20 localhost systemd[1]: Stopped /usr/bin/podman healthcheck run 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea. Oct 14 05:13:20 localhost systemd[1]: 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea.service: Failed to open /run/systemd/transient/2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea.service: No such file or directory Oct 14 05:13:20 localhost systemd[1]: tmp-crun.81wZnH.mount: Deactivated successfully. 
Oct 14 05:13:20 localhost podman[108879]: 2025-10-14 09:13:20.526812956 +0000 UTC m=+0.165198934 container cleanup 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, name=rhosp17/openstack-iscsid, distribution-scope=public, io.openshift.expose-services=, config_id=tripleo_step3, tcib_managed=true, container_name=iscsid, release=1, version=17.1.9, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, managed_by=tripleo_ansible, architecture=x86_64, 
vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team, io.buildah.version=1.33.12, batch=17.1_20250721.1, build-date=2025-07-21T13:27:15) Oct 14 05:13:20 localhost podman[108879]: iscsid Oct 14 05:13:20 localhost systemd[1]: 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea.timer: Failed to open /run/systemd/transient/2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea.timer: No such file or directory Oct 14 05:13:20 localhost systemd[1]: 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea.service: Failed to open /run/systemd/transient/2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea.service: No such file or directory Oct 14 05:13:20 localhost podman[108918]: 2025-10-14 09:13:20.549825558 +0000 UTC m=+0.058592724 container cleanup 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20250721.1, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, name=rhosp17/openstack-iscsid, architecture=x86_64, release=1, description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, build-date=2025-07-21T13:27:15, io.buildah.version=1.33.12, managed_by=tripleo_ansible, config_id=tripleo_step3, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, container_name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 
'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, vcs-type=git, version=17.1.9) Oct 14 05:13:20 localhost systemd[1]: libpod-conmon-2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea.scope: Deactivated successfully. 
Oct 14 05:13:20 localhost systemd[1]: 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea.timer: Failed to open /run/systemd/transient/2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea.timer: No such file or directory Oct 14 05:13:20 localhost systemd[1]: 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea.service: Failed to open /run/systemd/transient/2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea.service: No such file or directory Oct 14 05:13:20 localhost podman[108934]: 2025-10-14 09:13:20.649253505 +0000 UTC m=+0.063902388 container cleanup 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, architecture=x86_64, io.openshift.expose-services=, version=17.1.9, release=1, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, build-date=2025-07-21T13:27:15, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=openstack-iscsid-container, description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., name=rhosp17/openstack-iscsid, config_id=tripleo_step3, container_name=iscsid, io.buildah.version=1.33.12, batch=17.1_20250721.1, managed_by=tripleo_ansible) Oct 14 05:13:20 localhost podman[108934]: iscsid Oct 14 05:13:20 localhost systemd[1]: tripleo_iscsid.service: Deactivated successfully. Oct 14 05:13:20 localhost systemd[1]: Stopped iscsid container. Oct 14 05:13:20 localhost podman[108877]: 2025-10-14 09:13:20.663042882 +0000 UTC m=+0.309601313 container exec_died 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, build-date=2025-07-21T13:07:59, config_id=tripleo_step1, container_name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, batch=17.1_20250721.1, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 
'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, name=rhosp17/openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., version=17.1.9, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, vcs-type=git, managed_by=tripleo_ansible, distribution-scope=public, io.openshift.expose-services=, release=1) Oct 14 05:13:20 localhost systemd[1]: 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6.service: Deactivated successfully. Oct 14 05:13:20 localhost systemd[1]: var-lib-containers-storage-overlay-ed0a16e5ece4aebc29c3c8ec7f3de2cd4153e83805d28599b6e1826f63716e16-merged.mount: Deactivated successfully. Oct 14 05:13:20 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea-userdata-shm.mount: Deactivated successfully. 
Oct 14 05:13:21 localhost python3.9[109037]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_logrotate_crond.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Oct 14 05:13:21 localhost systemd[1]: Reloading. Oct 14 05:13:21 localhost systemd-rc-local-generator[109064]: /etc/rc.d/rc.local is not marked executable, skipping. Oct 14 05:13:21 localhost systemd-sysv-generator[109070]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Oct 14 05:13:21 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Oct 14 05:13:21 localhost systemd[1]: Stopping logrotate_crond container... Oct 14 05:13:21 localhost systemd[1]: tmp-crun.O8IOF2.mount: Deactivated successfully. Oct 14 05:13:21 localhost systemd[1]: libpod-d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f.scope: Deactivated successfully. Oct 14 05:13:21 localhost systemd[1]: libpod-d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f.scope: Consumed 1.039s CPU time. 
Oct 14 05:13:21 localhost podman[109078]: 2025-10-14 09:13:21.856847111 +0000 UTC m=+0.088578092 container died d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, name=rhosp17/openstack-cron, build-date=2025-07-21T13:07:52, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, maintainer=OpenStack TripleO Team, container_name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, config_id=tripleo_step4, batch=17.1_20250721.1, managed_by=tripleo_ansible, io.buildah.version=1.33.12, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, release=1, summary=Red Hat 
OpenStack Platform 17.1 cron, architecture=x86_64, io.openshift.expose-services=, version=17.1.9, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1) Oct 14 05:13:21 localhost systemd[1]: d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f.timer: Deactivated successfully. Oct 14 05:13:21 localhost systemd[1]: Stopped /usr/bin/podman healthcheck run d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f. Oct 14 05:13:21 localhost systemd[1]: d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f.service: Failed to open /run/systemd/transient/d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f.service: No such file or directory Oct 14 05:13:21 localhost podman[109078]: 2025-10-14 09:13:21.949798589 +0000 UTC m=+0.181529480 container cleanup d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, distribution-scope=public, com.redhat.component=openstack-cron-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, tcib_managed=true, io.buildah.version=1.33.12, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 cron, release=1, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, version=17.1.9, batch=17.1_20250721.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, name=rhosp17/openstack-cron, build-date=2025-07-21T13:07:52, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc.) 
Oct 14 05:13:21 localhost podman[109078]: logrotate_crond Oct 14 05:13:21 localhost systemd[1]: d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f.timer: Failed to open /run/systemd/transient/d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f.timer: No such file or directory Oct 14 05:13:21 localhost systemd[1]: d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f.service: Failed to open /run/systemd/transient/d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f.service: No such file or directory Oct 14 05:13:21 localhost podman[109092]: 2025-10-14 09:13:21.967964001 +0000 UTC m=+0.110707127 container cleanup d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, config_id=tripleo_step4, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, 
build-date=2025-07-21T13:07:52, vcs-type=git, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1, batch=17.1_20250721.1, summary=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, managed_by=tripleo_ansible, vendor=Red Hat, Inc., container_name=logrotate_crond, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, release=1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, distribution-scope=public, com.redhat.component=openstack-cron-container, version=17.1.9, name=rhosp17/openstack-cron) Oct 14 05:13:21 localhost systemd[1]: libpod-conmon-d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f.scope: Deactivated successfully. Oct 14 05:13:22 localhost podman[109120]: error opening file `/run/crun/d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f/status`: No such file or directory Oct 14 05:13:22 localhost systemd[1]: d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f.timer: Failed to open /run/systemd/transient/d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f.timer: No such file or directory Oct 14 05:13:22 localhost systemd[1]: d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f.service: Failed to open /run/systemd/transient/d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f.service: No such file or directory Oct 14 05:13:22 localhost podman[109109]: 2025-10-14 09:13:22.061700453 +0000 UTC m=+0.064311262 container cleanup d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, 
com.redhat.component=openstack-cron-container, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/agreements, release=1, architecture=x86_64, container_name=logrotate_crond, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, vendor=Red Hat, Inc., name=rhosp17/openstack-cron, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T13:07:52, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.33.12, managed_by=tripleo_ansible, version=17.1.9, batch=17.1_20250721.1, distribution-scope=public, io.openshift.expose-services=, tcib_managed=true) Oct 14 05:13:22 localhost podman[109109]: logrotate_crond Oct 14 05:13:22 localhost systemd[1]: 
tripleo_logrotate_crond.service: Deactivated successfully. Oct 14 05:13:22 localhost systemd[1]: Stopped logrotate_crond container. Oct 14 05:13:22 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:49:0d:95 MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.110 DST=192.168.122.108 LEN=40 TOS=0x00 PREC=0x00 TTL=64 ID=0 DF PROTO=TCP SPT=6379 DPT=52766 SEQ=93525319 ACK=0 WINDOW=0 RES=0x00 RST URGP=0 Oct 14 05:13:22 localhost python3.9[109213]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_metrics_qdr.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Oct 14 05:13:22 localhost systemd[1]: var-lib-containers-storage-overlay-93e7bd8fd2f06fab388e6a7e1d321b8be1a1e6e35c116f25c67dac1cc2084007-merged.mount: Deactivated successfully. Oct 14 05:13:22 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f-userdata-shm.mount: Deactivated successfully. Oct 14 05:13:22 localhost systemd[1]: Reloading. Oct 14 05:13:23 localhost systemd-rc-local-generator[109240]: /etc/rc.d/rc.local is not marked executable, skipping. Oct 14 05:13:23 localhost systemd-sysv-generator[109243]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Oct 14 05:13:23 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Oct 14 05:13:23 localhost systemd[1]: Stopping metrics_qdr container... 
Oct 14 05:13:23 localhost kernel: qdrouterd[55167]: segfault at 0 ip 00007fbcb2a0b7cb sp 00007fffb203b460 error 4 in libc.so.6[7fbcb29a8000+175000] Oct 14 05:13:23 localhost kernel: Code: 0b 00 64 44 89 23 85 c0 75 d4 e9 2b ff ff ff e8 db a5 00 00 e9 fd fe ff ff e8 41 1d 0d 00 90 f3 0f 1e fa 41 54 55 48 89 fd 53 <8b> 07 f6 c4 20 0f 85 aa 00 00 00 89 c2 81 e2 00 80 00 00 0f 84 a9 Oct 14 05:13:23 localhost systemd[1]: Created slice Slice /system/systemd-coredump. Oct 14 05:13:23 localhost systemd[1]: Started Process Core Dump (PID 109266/UID 0). Oct 14 05:13:23 localhost systemd-coredump[109267]: Resource limits disable core dumping for process 55167 (qdrouterd). Oct 14 05:13:23 localhost systemd-coredump[109267]: Process 55167 (qdrouterd) of user 42465 dumped core. Oct 14 05:13:23 localhost systemd[1]: systemd-coredump@0-109266-0.service: Deactivated successfully. Oct 14 05:13:23 localhost podman[109254]: 2025-10-14 09:13:23.466511242 +0000 UTC m=+0.231522277 container died 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1, config_id=tripleo_step1, io.buildah.version=1.33.12, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.9, container_name=metrics_qdr, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': 
False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.expose-services=, managed_by=tripleo_ansible, release=1, build-date=2025-07-21T13:07:59, name=rhosp17/openstack-qdrouterd, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, com.redhat.component=openstack-qdrouterd-container, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 qdrouterd) Oct 14 05:13:23 localhost systemd[1]: libpod-1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6.scope: Deactivated successfully. Oct 14 05:13:23 localhost systemd[1]: libpod-1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6.scope: Consumed 29.566s CPU time. Oct 14 05:13:23 localhost systemd[1]: 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6.timer: Deactivated successfully. Oct 14 05:13:23 localhost systemd[1]: Stopped /usr/bin/podman healthcheck run 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6. 
Oct 14 05:13:23 localhost systemd[1]: 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6.service: Failed to open /run/systemd/transient/1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6.service: No such file or directory Oct 14 05:13:23 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6-userdata-shm.mount: Deactivated successfully. Oct 14 05:13:23 localhost podman[109254]: 2025-10-14 09:13:23.509443871 +0000 UTC m=+0.274454926 container cleanup 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, managed_by=tripleo_ansible, release=1, name=rhosp17/openstack-qdrouterd, container_name=metrics_qdr, architecture=x86_64, com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, 
io.openshift.expose-services=, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.9, io.openshift.tags=rhosp osp openstack osp-17.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, distribution-scope=public, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., build-date=2025-07-21T13:07:59, io.buildah.version=1.33.12) Oct 14 05:13:23 localhost podman[109254]: metrics_qdr Oct 14 05:13:23 localhost systemd[1]: 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6.timer: Failed to open /run/systemd/transient/1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6.timer: No such file or directory Oct 14 05:13:23 localhost systemd[1]: 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6.service: Failed to open /run/systemd/transient/1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6.service: No such file or directory Oct 14 05:13:23 localhost podman[109271]: 2025-10-14 09:13:23.531512994 +0000 UTC m=+0.046014625 container cleanup 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, build-date=2025-07-21T13:07:59, release=1, version=17.1.9, tcib_managed=true, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1, architecture=x86_64, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, 
io.buildah.version=1.33.12, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, container_name=metrics_qdr, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, managed_by=tripleo_ansible) Oct 14 05:13:23 localhost systemd[1]: tripleo_metrics_qdr.service: Main process exited, code=exited, status=139/n/a Oct 14 05:13:23 localhost systemd[1]: libpod-conmon-1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6.scope: Deactivated successfully. 
Oct 14 05:13:23 localhost systemd[1]: 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6.timer: Failed to open /run/systemd/transient/1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6.timer: No such file or directory Oct 14 05:13:23 localhost systemd[1]: 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6.service: Failed to open /run/systemd/transient/1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6.service: No such file or directory Oct 14 05:13:23 localhost podman[109286]: 2025-10-14 09:13:23.607777835 +0000 UTC m=+0.051192906 container cleanup 1aec9f100a916c850eaee3b034dce9ea05137a9acd0cb3439688ce3d9b7357b6 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8203f25645ef4c13974e350f23db228e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, distribution-scope=public, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, 
managed_by=tripleo_ansible, version=17.1.9, description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, container_name=metrics_qdr, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, architecture=x86_64, build-date=2025-07-21T13:07:59, release=1, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-qdrouterd, vcs-type=git, tcib_managed=true, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1) Oct 14 05:13:23 localhost podman[109286]: metrics_qdr Oct 14 05:13:23 localhost systemd[1]: tripleo_metrics_qdr.service: Failed with result 'exit-code'. Oct 14 05:13:23 localhost systemd[1]: Stopped metrics_qdr container. Oct 14 05:13:23 localhost systemd[1]: var-lib-containers-storage-overlay-168207db095cdd373b28e32e9bd8a2aa29e7cbcdf9040af1b44bb5a093e7f31e-merged.mount: Deactivated successfully. 
Oct 14 05:13:24 localhost python3.9[109390]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_neutron_dhcp.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Oct 14 05:13:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37149 DF PROTO=TCP SPT=44894 DPT=9102 SEQ=1714030963 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF1681A0000000001030307) Oct 14 05:13:25 localhost python3.9[109483]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_neutron_l3_agent.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Oct 14 05:13:26 localhost python3.9[109576]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_neutron_ovs_agent.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Oct 14 05:13:26 localhost python3.9[109669]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Oct 14 05:13:26 localhost systemd[1]: Reloading. Oct 14 05:13:26 localhost systemd-rc-local-generator[109693]: /etc/rc.d/rc.local is not marked executable, skipping. Oct 14 05:13:26 localhost systemd-sysv-generator[109699]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Oct 14 05:13:26 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. 
Support for MemoryLimit= will be removed soon. Oct 14 05:13:27 localhost systemd[1]: Stopping nova_compute container... Oct 14 05:13:28 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10395 DF PROTO=TCP SPT=50994 DPT=9882 SEQ=3425009413 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF176BB0000000001030307) Oct 14 05:13:30 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37151 DF PROTO=TCP SPT=44894 DPT=9102 SEQ=1714030963 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF17FDB0000000001030307) Oct 14 05:13:35 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10398 DF PROTO=TCP SPT=50994 DPT=9882 SEQ=3425009413 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF1929B0000000001030307) Oct 14 05:13:38 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3713 DF PROTO=TCP SPT=44498 DPT=9105 SEQ=4026041804 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF19F9E0000000001030307) Oct 14 05:13:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3714 DF PROTO=TCP SPT=44498 DPT=9105 SEQ=4026041804 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF1A39A0000000001030307) Oct 14 05:13:41 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3715 
DF PROTO=TCP SPT=44498 DPT=9105 SEQ=4026041804 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF1AB9A0000000001030307) Oct 14 05:13:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884. Oct 14 05:13:43 localhost systemd[1]: tmp-crun.qrE59A.mount: Deactivated successfully. Oct 14 05:13:43 localhost podman[109722]: 2025-10-14 09:13:43.233366198 +0000 UTC m=+0.074984842 container health_status 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, managed_by=tripleo_ansible, tcib_managed=true, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.9, name=rhosp17/openstack-nova-compute, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, release=1, build-date=2025-07-21T14:48:37, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., batch=17.1_20250721.1, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, container_name=nova_migration_target, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.33.12, vcs-type=git, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=) Oct 14 05:13:43 localhost podman[109722]: 2025-10-14 09:13:43.533200238 +0000 UTC m=+0.374818942 container exec_died 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, version=17.1.9, build-date=2025-07-21T14:48:37, config_id=tripleo_step4, name=rhosp17/openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, batch=17.1_20250721.1, managed_by=tripleo_ansible, distribution-scope=public, io.buildah.version=1.33.12, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, container_name=nova_migration_target) Oct 14 05:13:43 localhost systemd[1]: 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884.service: Deactivated successfully. Oct 14 05:13:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e. Oct 14 05:13:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5. Oct 14 05:13:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2. 
Oct 14 05:13:45 localhost podman[109747]: Error: container ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2 is not running Oct 14 05:13:45 localhost systemd[1]: ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2.service: Main process exited, code=exited, status=125/n/a Oct 14 05:13:45 localhost systemd[1]: ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2.service: Failed with result 'exit-code'. Oct 14 05:13:45 localhost podman[109745]: 2025-10-14 09:13:45.536491251 +0000 UTC m=+0.124026029 container health_status 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, io.openshift.tags=rhosp osp openstack osp-17.1, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, architecture=x86_64, version=17.1.9, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, container_name=ovn_metadata_agent, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 
'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, vendor=Red Hat, Inc., batch=17.1_20250721.1, release=1, build-date=2025-07-21T16:28:53) Oct 14 05:13:45 localhost podman[109745]: 2025-10-14 09:13:45.58006907 +0000 UTC m=+0.167603878 container exec_died 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, config_id=tripleo_step4, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, version=17.1.9, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, 
io.openshift.tags=rhosp osp openstack osp-17.1, io.buildah.version=1.33.12, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, container_name=ovn_metadata_agent, batch=17.1_20250721.1, release=1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, tcib_managed=true, vendor=Red Hat, Inc., vcs-type=git, build-date=2025-07-21T16:28:53, maintainer=OpenStack TripleO Team) Oct 14 05:13:45 localhost 
podman[109745]: unhealthy Oct 14 05:13:45 localhost systemd[1]: 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e.service: Main process exited, code=exited, status=1/FAILURE Oct 14 05:13:45 localhost systemd[1]: 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e.service: Failed with result 'exit-code'. Oct 14 05:13:45 localhost systemd[1]: tmp-crun.T2oYNa.mount: Deactivated successfully. Oct 14 05:13:45 localhost podman[109746]: 2025-10-14 09:13:45.665779533 +0000 UTC m=+0.248160492 container health_status a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, com.redhat.component=openstack-ovn-controller-container, io.buildah.version=1.33.12, config_id=tripleo_step4, vcs-type=git, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, managed_by=tripleo_ansible, vendor=Red Hat, Inc., architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-07-21T13:28:44, com.redhat.license_terms=https://www.redhat.com/agreements, version=17.1.9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, release=1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20250721.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, name=rhosp17/openstack-ovn-controller, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true, distribution-scope=public, container_name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller) Oct 14 05:13:45 localhost podman[109746]: 2025-10-14 09:13:45.6782884 +0000 UTC m=+0.260669379 container exec_died a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, name=rhosp17/openstack-ovn-controller, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, managed_by=tripleo_ansible, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-07-21T13:28:44, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, container_name=ovn_controller, config_id=tripleo_step4, version=17.1.9, com.redhat.component=openstack-ovn-controller-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, architecture=x86_64, batch=17.1_20250721.1, release=1, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 
17.1 ovn-controller, vcs-type=git, io.openshift.expose-services=, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245) Oct 14 05:13:45 localhost podman[109746]: unhealthy Oct 14 05:13:45 localhost systemd[1]: a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5.service: Main process exited, code=exited, status=1/FAILURE Oct 14 05:13:45 localhost systemd[1]: a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5.service: Failed with result 'exit-code'. Oct 14 05:13:45 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3716 DF PROTO=TCP SPT=44498 DPT=9105 SEQ=4026041804 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF1BB5B0000000001030307) Oct 14 05:13:48 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9790 DF PROTO=TCP SPT=43180 DPT=9101 SEQ=2252299503 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF1C41A0000000001030307) Oct 14 05:13:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=4289 DF PROTO=TCP SPT=50116 DPT=9102 SEQ=940307072 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF1D9250000000001030307) Oct 14 05:13:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=4290 DF PROTO=TCP SPT=50116 DPT=9102 SEQ=940307072 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF1DD1A0000000001030307) Oct 14 05:13:58 localhost kernel: DROPPING: 
IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2590 DF PROTO=TCP SPT=53264 DPT=9882 SEQ=2035632970 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF1EBEB0000000001030307) Oct 14 05:14:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=4292 DF PROTO=TCP SPT=50116 DPT=9102 SEQ=940307072 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF1F4DA0000000001030307) Oct 14 05:14:05 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2593 DF PROTO=TCP SPT=53264 DPT=9882 SEQ=2035632970 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF2079B0000000001030307) Oct 14 05:14:08 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14801 DF PROTO=TCP SPT=54960 DPT=9105 SEQ=2313393081 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF214CE0000000001030307) Oct 14 05:14:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:49:0d:95 MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.110 DST=192.168.122.108 LEN=40 TOS=0x00 PREC=0x00 TTL=64 ID=0 DF PROTO=TCP SPT=6379 DPT=52766 SEQ=93525319 ACK=0 WINDOW=0 RES=0x00 RST URGP=0 Oct 14 05:14:09 localhost podman[109710]: time="2025-10-14T09:14:09Z" level=warning msg="StopSignal SIGTERM failed to stop container nova_compute in 42 seconds, resorting to SIGKILL" Oct 14 05:14:09 localhost systemd[1]: session-c11.scope: Deactivated successfully. Oct 14 05:14:09 localhost systemd[1]: libpod-ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2.scope: Deactivated successfully. 
Oct 14 05:14:09 localhost systemd[1]: libpod-ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2.scope: Consumed 37.237s CPU time. Oct 14 05:14:09 localhost podman[109710]: 2025-10-14 09:14:09.283511687 +0000 UTC m=+42.115736039 container died ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, batch=17.1_20250721.1, build-date=2025-07-21T14:48:37, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a-4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, release=1, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, version=17.1.9, vendor=Red Hat, Inc., io.buildah.version=1.33.12, io.openshift.expose-services=, managed_by=tripleo_ansible, config_id=tripleo_step5, container_name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-nova-compute) Oct 14 05:14:09 localhost systemd[1]: ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2.timer: Deactivated successfully. Oct 14 05:14:09 localhost systemd[1]: Stopped /usr/bin/podman healthcheck run ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2. Oct 14 05:14:09 localhost systemd[1]: ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2.service: Failed to open /run/systemd/transient/ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2.service: No such file or directory Oct 14 05:14:09 localhost systemd[1]: var-lib-containers-storage-overlay-76cf0246342f1e521e8960667a4518c270830e003f626a6d56414187630bdbfc-merged.mount: Deactivated successfully. 
Oct 14 05:14:09 localhost podman[109710]: 2025-10-14 09:14:09.340970826 +0000 UTC m=+42.173195148 container cleanup ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc., io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step5, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, version=17.1.9, name=rhosp17/openstack-nova-compute, release=1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, build-date=2025-07-21T14:48:37, container_name=nova_compute, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, architecture=x86_64, batch=17.1_20250721.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a-4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.33.12) Oct 14 05:14:09 localhost podman[109710]: nova_compute Oct 14 05:14:09 localhost systemd[1]: ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2.timer: Failed to open /run/systemd/transient/ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2.timer: No such file or directory Oct 14 05:14:09 localhost systemd[1]: ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2.service: Failed to open /run/systemd/transient/ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2.service: No such file or directory Oct 14 05:14:09 localhost podman[109858]: 2025-10-14 09:14:09.354261487 +0000 UTC m=+0.062166995 container cleanup ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, release=1, build-date=2025-07-21T14:48:37, 
io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, config_id=tripleo_step5, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, managed_by=tripleo_ansible, container_name=nova_compute, vcs-type=git, name=rhosp17/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a-4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, distribution-scope=public, io.buildah.version=1.33.12, io.openshift.expose-services=, version=17.1.9) Oct 14 05:14:09 localhost systemd[1]: libpod-conmon-ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2.scope: Deactivated successfully. Oct 14 05:14:09 localhost systemd[1]: ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2.timer: Failed to open /run/systemd/transient/ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2.timer: No such file or directory Oct 14 05:14:09 localhost systemd[1]: ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2.service: Failed to open /run/systemd/transient/ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2.service: No such file or directory Oct 14 05:14:09 localhost podman[109872]: 2025-10-14 09:14:09.435951506 +0000 UTC m=+0.048541314 container cleanup ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, config_id=tripleo_step5, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, version=17.1.9, io.openshift.expose-services=, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, com.redhat.license_terms=https://www.redhat.com/agreements, 
config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a-4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp17/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-07-21T14:48:37, release=1, batch=17.1_20250721.1, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, 
distribution-scope=public, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, architecture=x86_64, vcs-type=git, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute) Oct 14 05:14:09 localhost podman[109872]: nova_compute Oct 14 05:14:09 localhost systemd[1]: tripleo_nova_compute.service: Deactivated successfully. Oct 14 05:14:09 localhost systemd[1]: Stopped nova_compute container. Oct 14 05:14:09 localhost systemd[1]: tripleo_nova_compute.service: Consumed 1.151s CPU time, no IO. Oct 14 05:14:10 localhost python3.9[109978]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_migration_target.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Oct 14 05:14:10 localhost systemd[1]: Reloading. Oct 14 05:14:10 localhost systemd-rc-local-generator[110001]: /etc/rc.d/rc.local is not marked executable, skipping. Oct 14 05:14:10 localhost systemd-sysv-generator[110004]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Oct 14 05:14:10 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Oct 14 05:14:10 localhost systemd[1]: Stopping nova_migration_target container... Oct 14 05:14:10 localhost systemd[1]: libpod-7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884.scope: Deactivated successfully. Oct 14 05:14:10 localhost systemd[1]: libpod-7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884.scope: Consumed 33.885s CPU time. 
Oct 14 05:14:10 localhost podman[110018]: 2025-10-14 09:14:10.651847899 +0000 UTC m=+0.065834809 container died 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, architecture=x86_64, build-date=2025-07-21T14:48:37, tcib_managed=true, vcs-type=git, version=17.1.9, batch=17.1_20250721.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step4, container_name=nova_migration_target, description=Red Hat 
OpenStack Platform 17.1 nova-compute, io.buildah.version=1.33.12, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-nova-compute, release=1, maintainer=OpenStack TripleO Team, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute) Oct 14 05:14:10 localhost systemd[1]: 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884.timer: Deactivated successfully. Oct 14 05:14:10 localhost systemd[1]: Stopped /usr/bin/podman healthcheck run 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884. Oct 14 05:14:10 localhost systemd[1]: 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884.service: Failed to open /run/systemd/transient/7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884.service: No such file or directory Oct 14 05:14:10 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884-userdata-shm.mount: Deactivated successfully. Oct 14 05:14:10 localhost systemd[1]: var-lib-containers-storage-overlay-0f85b4a61f0484c7d6d1230e2bc736bd8398f5346eb8306d97c0ce215dfc5ab2-merged.mount: Deactivated successfully. 
Oct 14 05:14:10 localhost podman[110018]: 2025-10-14 09:14:10.712275169 +0000 UTC m=+0.126262059 container cleanup 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2025-07-21T14:48:37, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, io.buildah.version=1.33.12, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1, architecture=x86_64, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, config_id=tripleo_step4, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, release=1, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, version=17.1.9, maintainer=OpenStack TripleO Team, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, batch=17.1_20250721.1) Oct 14 05:14:10 localhost podman[110018]: nova_migration_target Oct 14 05:14:10 localhost systemd[1]: 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884.timer: Failed to open /run/systemd/transient/7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884.timer: No such file or directory Oct 14 05:14:10 localhost systemd[1]: 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884.service: Failed to open /run/systemd/transient/7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884.service: No such file or directory Oct 14 05:14:10 localhost podman[110030]: 2025-10-14 09:14:10.749878534 +0000 UTC m=+0.083244378 container cleanup 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, build-date=2025-07-21T14:48:37, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, version=17.1.9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc., container_name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, release=1, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.33.12, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, vcs-type=git, batch=17.1_20250721.1, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team) Oct 14 05:14:10 localhost systemd[1]: libpod-conmon-7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884.scope: Deactivated successfully. 
Oct 14 05:14:10 localhost systemd[1]: 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884.timer: Failed to open /run/systemd/transient/7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884.timer: No such file or directory Oct 14 05:14:10 localhost systemd[1]: 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884.service: Failed to open /run/systemd/transient/7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884.service: No such file or directory Oct 14 05:14:10 localhost podman[110045]: 2025-10-14 09:14:10.833057827 +0000 UTC m=+0.055223809 container cleanup 7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, config_id=tripleo_step4, version=17.1.9, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20250721.1, release=1, name=rhosp17/openstack-nova-compute, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, vcs-type=git, distribution-scope=public, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, container_name=nova_migration_target, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1, io.buildah.version=1.33.12, build-date=2025-07-21T14:48:37) Oct 14 05:14:10 localhost podman[110045]: nova_migration_target Oct 14 05:14:10 localhost systemd[1]: tripleo_nova_migration_target.service: Deactivated successfully. Oct 14 05:14:10 localhost systemd[1]: Stopped nova_migration_target container. Oct 14 05:14:11 localhost python3.9[110149]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtlogd_wrapper.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Oct 14 05:14:11 localhost systemd[1]: Reloading. Oct 14 05:14:11 localhost systemd-rc-local-generator[110176]: /etc/rc.d/rc.local is not marked executable, skipping. Oct 14 05:14:11 localhost systemd-sysv-generator[110179]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Oct 14 05:14:11 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. 
Support for MemoryLimit= will be removed soon. Oct 14 05:14:11 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14803 DF PROTO=TCP SPT=54960 DPT=9105 SEQ=2313393081 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF220DB0000000001030307) Oct 14 05:14:11 localhost systemd[1]: Stopping nova_virtlogd_wrapper container... Oct 14 05:14:11 localhost systemd[1]: libpod-5697944f09f52bb27341b791937b3551397f253570b866c4cb3214e8127000d6.scope: Deactivated successfully. Oct 14 05:14:11 localhost podman[110189]: 2025-10-14 09:14:11.971455393 +0000 UTC m=+0.060244376 container died 5697944f09f52bb27341b791937b3551397f253570b866c4cb3214e8127000d6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd_wrapper, io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T14:56:59, batch=17.1_20250721.1, version=17.1.9, io.openshift.expose-services=, release=2, name=rhosp17/openstack-nova-libvirt, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 0, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtlogd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/container-config-scripts/virtlogd_wrapper:/usr/local/bin/virtlogd_wrapper:ro']}, config_id=tripleo_step3, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-libvirt/images/17.1.9-2, com.redhat.component=openstack-nova-libvirt-container, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, tcib_managed=true, container_name=nova_virtlogd_wrapper, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-type=git, distribution-scope=public, vcs-ref=809f31d3cd93a9e04341110fb85686656c754dc0) Oct 14 05:14:11 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-5697944f09f52bb27341b791937b3551397f253570b866c4cb3214e8127000d6-userdata-shm.mount: Deactivated successfully. 
Oct 14 05:14:12 localhost systemd[1]: var-lib-containers-storage-overlay-082ddc67d34506bb0770407c13ba7bb6b21421c42c2e601c85c5744cdf0ac287-merged.mount: Deactivated successfully. Oct 14 05:14:12 localhost podman[110189]: 2025-10-14 09:14:12.019986974 +0000 UTC m=+0.108775877 container cleanup 5697944f09f52bb27341b791937b3551397f253570b866c4cb3214e8127000d6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd_wrapper, vcs-ref=809f31d3cd93a9e04341110fb85686656c754dc0, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step3, build-date=2025-07-21T14:56:59, version=17.1.9, tcib_managed=true, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 0, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', 
'/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtlogd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/container-config-scripts/virtlogd_wrapper:/usr/local/bin/virtlogd_wrapper:ro']}, release=2, com.redhat.component=openstack-nova-libvirt-container, managed_by=tripleo_ansible, container_name=nova_virtlogd_wrapper, io.buildah.version=1.33.12, vcs-type=git, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-libvirt/images/17.1.9-2, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-nova-libvirt, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt) Oct 14 05:14:12 localhost podman[110189]: nova_virtlogd_wrapper Oct 14 05:14:12 localhost podman[110204]: 2025-10-14 09:14:12.095249114 +0000 UTC m=+0.107763797 container cleanup 5697944f09f52bb27341b791937b3551397f253570b866c4cb3214e8127000d6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd_wrapper, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-libvirt/images/17.1.9-2, batch=17.1_20250721.1, maintainer=OpenStack TripleO Team, release=2, tcib_managed=true, build-date=2025-07-21T14:56:59, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'4d186a6228facd5bcddf9bcc145eb470'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 0, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtlogd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/container-config-scripts/virtlogd_wrapper:/usr/local/bin/virtlogd_wrapper:ro']}, description=Red Hat OpenStack Platform 17.1 nova-libvirt, name=rhosp17/openstack-nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, version=17.1.9, vcs-type=git, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=, 
com.redhat.license_terms=https://www.redhat.com/agreements, managed_by=tripleo_ansible, container_name=nova_virtlogd_wrapper, vcs-ref=809f31d3cd93a9e04341110fb85686656c754dc0, com.redhat.component=openstack-nova-libvirt-container, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1, distribution-scope=public, vendor=Red Hat, Inc., architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt) Oct 14 05:14:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e. Oct 14 05:14:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5. Oct 14 05:14:15 localhost podman[110235]: 2025-10-14 09:14:15.737078322 +0000 UTC m=+0.077479889 container health_status 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.33.12, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, container_name=ovn_metadata_agent, build-date=2025-07-21T16:28:53, config_id=tripleo_step4, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, managed_by=tripleo_ansible, batch=17.1_20250721.1, version=17.1.9, release=1, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, distribution-scope=public, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., vcs-type=git) Oct 14 05:14:15 localhost podman[110235]: 2025-10-14 09:14:15.7861359 +0000 UTC m=+0.126537497 container exec_died 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.openshift.expose-services=, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 
'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, version=17.1.9, container_name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, vendor=Red Hat, Inc., com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, vcs-type=git, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, release=1, tcib_managed=true, batch=17.1_20250721.1, distribution-scope=public, build-date=2025-07-21T16:28:53) Oct 14 05:14:15 localhost podman[110235]: unhealthy Oct 14 05:14:15 localhost systemd[1]: 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e.service: Main process exited, code=exited, status=1/FAILURE Oct 14 05:14:15 localhost systemd[1]: 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e.service: Failed with result 'exit-code'. Oct 14 05:14:15 localhost podman[110254]: 2025-10-14 09:14:15.829810852 +0000 UTC m=+0.081733470 container health_status a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, vendor=Red Hat, Inc., version=17.1.9, build-date=2025-07-21T13:28:44, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=ovn_controller, name=rhosp17/openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, release=1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', 
'/var/log/containers/openvswitch:/var/log/ovn:z']}, io.buildah.version=1.33.12, architecture=x86_64, vcs-type=git, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, batch=17.1_20250721.1, com.redhat.component=openstack-ovn-controller-container) Oct 14 05:14:15 localhost podman[110254]: 2025-10-14 09:14:15.845070384 +0000 UTC m=+0.096993002 container exec_died a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, distribution-scope=public, managed_by=tripleo_ansible, build-date=2025-07-21T13:28:44, config_id=tripleo_step4, container_name=ovn_controller, maintainer=OpenStack TripleO Team, vcs-type=git, name=rhosp17/openstack-ovn-controller, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, release=1, com.redhat.component=openstack-ovn-controller-container, io.buildah.version=1.33.12, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.9, vendor=Red Hat, Inc., tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}) Oct 14 05:14:15 localhost podman[110254]: unhealthy Oct 14 05:14:15 localhost systemd[1]: a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5.service: Main process exited, code=exited, status=1/FAILURE Oct 14 05:14:15 localhost systemd[1]: a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5.service: Failed with result 'exit-code'. Oct 14 05:14:15 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14804 DF PROTO=TCP SPT=54960 DPT=9105 SEQ=2313393081 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF2309A0000000001030307) Oct 14 05:14:18 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47698 DF PROTO=TCP SPT=35188 DPT=9101 SEQ=965772130 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF2395A0000000001030307) Oct 14 05:14:19 localhost systemd[1]: Stopping User Manager for UID 0... Oct 14 05:14:19 localhost systemd[84643]: Activating special unit Exit the Session... Oct 14 05:14:19 localhost systemd[84643]: Removed slice User Background Tasks Slice. Oct 14 05:14:19 localhost systemd[84643]: Stopped target Main User Target. Oct 14 05:14:19 localhost systemd[84643]: Stopped target Basic System. 
Oct 14 05:14:19 localhost systemd[84643]: Stopped target Paths. Oct 14 05:14:19 localhost systemd[84643]: Stopped target Sockets. Oct 14 05:14:19 localhost systemd[84643]: Stopped target Timers. Oct 14 05:14:19 localhost systemd[84643]: Stopped Daily Cleanup of User's Temporary Directories. Oct 14 05:14:19 localhost systemd[84643]: Closed D-Bus User Message Bus Socket. Oct 14 05:14:19 localhost systemd[84643]: Stopped Create User's Volatile Files and Directories. Oct 14 05:14:19 localhost systemd[84643]: Removed slice User Application Slice. Oct 14 05:14:19 localhost systemd[84643]: Reached target Shutdown. Oct 14 05:14:19 localhost systemd[84643]: Finished Exit the Session. Oct 14 05:14:19 localhost systemd[84643]: Reached target Exit the Session. Oct 14 05:14:19 localhost systemd[1]: user@0.service: Deactivated successfully. Oct 14 05:14:19 localhost systemd[1]: Stopped User Manager for UID 0. Oct 14 05:14:19 localhost systemd[1]: user@0.service: Consumed 4.559s CPU time, no IO. Oct 14 05:14:19 localhost systemd[1]: Stopping User Runtime Directory /run/user/0... Oct 14 05:14:19 localhost systemd[1]: run-user-0.mount: Deactivated successfully. Oct 14 05:14:19 localhost systemd[1]: user-runtime-dir@0.service: Deactivated successfully. Oct 14 05:14:19 localhost systemd[1]: Stopped User Runtime Directory /run/user/0. Oct 14 05:14:19 localhost systemd[1]: Removed slice User Slice of UID 0. Oct 14 05:14:19 localhost systemd[1]: user-0.slice: Consumed 5.492s CPU time. 
Oct 14 05:14:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12192 DF PROTO=TCP SPT=37344 DPT=9102 SEQ=1076352224 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF24E540000000001030307) Oct 14 05:14:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12193 DF PROTO=TCP SPT=37344 DPT=9102 SEQ=1076352224 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF2525B0000000001030307) Oct 14 05:14:28 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19535 DF PROTO=TCP SPT=60038 DPT=9882 SEQ=3737045660 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF2611A0000000001030307) Oct 14 05:14:30 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12195 DF PROTO=TCP SPT=37344 DPT=9102 SEQ=1076352224 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF26A1A0000000001030307) Oct 14 05:14:34 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Oct 14 05:14:34 localhost recover_tripleo_nova_virtqemud[110277]: 62551 Oct 14 05:14:34 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Oct 14 05:14:34 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. 
Oct 14 05:14:35 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19538 DF PROTO=TCP SPT=60038 DPT=9882 SEQ=3737045660 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF27CDB0000000001030307) Oct 14 05:14:38 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54203 DF PROTO=TCP SPT=32834 DPT=9105 SEQ=4084403190 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF289FE0000000001030307) Oct 14 05:14:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54204 DF PROTO=TCP SPT=32834 DPT=9105 SEQ=4084403190 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF28E1A0000000001030307) Oct 14 05:14:41 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54205 DF PROTO=TCP SPT=32834 DPT=9105 SEQ=4084403190 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF2961A0000000001030307) Oct 14 05:14:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e. Oct 14 05:14:45 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54206 DF PROTO=TCP SPT=32834 DPT=9105 SEQ=4084403190 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF2A5DB0000000001030307) Oct 14 05:14:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5. 
Oct 14 05:14:46 localhost systemd[1]: tmp-crun.Yvcy9Z.mount: Deactivated successfully. Oct 14 05:14:46 localhost podman[110278]: 2025-10-14 09:14:46.012971955 +0000 UTC m=+0.093030270 container health_status 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-07-21T16:28:53, io.openshift.expose-services=, release=1, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, managed_by=tripleo_ansible, architecture=x86_64, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, batch=17.1_20250721.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1, io.buildah.version=1.33.12) Oct 14 05:14:46 localhost systemd[1]: tmp-crun.Vwg3KK.mount: Deactivated successfully. Oct 14 05:14:46 localhost podman[110278]: 2025-10-14 09:14:46.058827465 +0000 UTC m=+0.138885870 container exec_died 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, name=rhosp17/openstack-neutron-metadata-agent-ovn, architecture=x86_64, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1, container_name=ovn_metadata_agent, config_id=tripleo_step4, build-date=2025-07-21T16:28:53, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-type=git, version=17.1.9, batch=17.1_20250721.1, io.buildah.version=1.33.12, managed_by=tripleo_ansible, release=1) Oct 14 05:14:46 localhost podman[110278]: unhealthy Oct 14 05:14:46 localhost systemd[1]: 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e.service: Main process exited, code=exited, status=1/FAILURE Oct 14 05:14:46 localhost systemd[1]: 
7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e.service: Failed with result 'exit-code'. Oct 14 05:14:46 localhost podman[110279]: 2025-10-14 09:14:46.065625675 +0000 UTC m=+0.138179577 container health_status a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, tcib_managed=true, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, distribution-scope=public, io.buildah.version=1.33.12, name=rhosp17/openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-07-21T13:28:44, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, batch=17.1_20250721.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.component=openstack-ovn-controller-container, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, managed_by=tripleo_ansible, architecture=x86_64, release=1, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, description=Red Hat OpenStack 
Platform 17.1 ovn-controller) Oct 14 05:14:46 localhost podman[110279]: 2025-10-14 09:14:46.150272965 +0000 UTC m=+0.222826907 container exec_died a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, managed_by=tripleo_ansible, distribution-scope=public, com.redhat.component=openstack-ovn-controller-container, vcs-type=git, config_id=tripleo_step4, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1, batch=17.1_20250721.1, io.openshift.expose-services=, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, vendor=Red Hat, Inc., container_name=ovn_controller, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhosp17/openstack-ovn-controller, version=17.1.9, build-date=2025-07-21T13:28:44, io.openshift.tags=rhosp osp openstack osp-17.1, description=Red Hat OpenStack Platform 17.1 ovn-controller) Oct 14 05:14:46 localhost podman[110279]: unhealthy Oct 14 05:14:46 
localhost systemd[1]: a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5.service: Main process exited, code=exited, status=1/FAILURE Oct 14 05:14:46 localhost systemd[1]: a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5.service: Failed with result 'exit-code'. Oct 14 05:14:48 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=13305 DF PROTO=TCP SPT=57676 DPT=9101 SEQ=3682583823 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF2AE9A0000000001030307) Oct 14 05:14:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12331 DF PROTO=TCP SPT=47122 DPT=9102 SEQ=1784621883 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF2C3850000000001030307) Oct 14 05:14:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12332 DF PROTO=TCP SPT=47122 DPT=9102 SEQ=1784621883 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF2C79A0000000001030307) Oct 14 05:14:58 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44512 DF PROTO=TCP SPT=39046 DPT=9882 SEQ=2455246186 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF2D64B0000000001030307) Oct 14 05:15:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12334 DF PROTO=TCP SPT=47122 DPT=9102 SEQ=1784621883 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF2DF5A0000000001030307) Oct 14 05:15:05 localhost 
kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44515 DF PROTO=TCP SPT=39046 DPT=9882 SEQ=2455246186 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF2F21A0000000001030307) Oct 14 05:15:08 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22648 DF PROTO=TCP SPT=50760 DPT=9105 SEQ=371596692 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF2FF2F0000000001030307) Oct 14 05:15:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22649 DF PROTO=TCP SPT=50760 DPT=9105 SEQ=371596692 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF3031B0000000001030307) Oct 14 05:15:11 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22650 DF PROTO=TCP SPT=50760 DPT=9105 SEQ=371596692 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF30B1A0000000001030307) Oct 14 05:15:15 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22651 DF PROTO=TCP SPT=50760 DPT=9105 SEQ=371596692 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF31ADA0000000001030307) Oct 14 05:15:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e. Oct 14 05:15:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5. 
Oct 14 05:15:16 localhost podman[110447]: 2025-10-14 09:15:16.477455549 +0000 UTC m=+0.063688983 container health_status a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, tcib_managed=true, build-date=2025-07-21T13:28:44, com.redhat.license_terms=https://www.redhat.com/agreements, release=1, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, vcs-type=git, config_id=tripleo_step4, container_name=ovn_controller, managed_by=tripleo_ansible, vendor=Red Hat, Inc., architecture=x86_64, name=rhosp17/openstack-ovn-controller, version=17.1.9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1, distribution-scope=public, io.openshift.expose-services=, batch=17.1_20250721.1) Oct 14 05:15:16 localhost podman[110447]: 2025-10-14 09:15:16.521024588 
+0000 UTC m=+0.107258012 container exec_died a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.33.12, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-ovn-controller, io.openshift.expose-services=, tcib_managed=true, batch=17.1_20250721.1, container_name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-07-21T13:28:44, vcs-type=git, version=17.1.9, release=1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, managed_by=tripleo_ansible, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1) Oct 14 05:15:16 localhost podman[110447]: unhealthy Oct 14 05:15:16 localhost systemd[1]: tmp-crun.czGxkQ.mount: Deactivated successfully. 
Oct 14 05:15:16 localhost systemd[1]: a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5.service: Main process exited, code=exited, status=1/FAILURE Oct 14 05:15:16 localhost systemd[1]: a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5.service: Failed with result 'exit-code'. Oct 14 05:15:16 localhost podman[110446]: 2025-10-14 09:15:16.537623231 +0000 UTC m=+0.125001750 container health_status 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, container_name=ovn_metadata_agent, io.openshift.expose-services=, io.buildah.version=1.33.12, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', 
'/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, version=17.1.9, architecture=x86_64, tcib_managed=true, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, batch=17.1_20250721.1, vcs-type=git, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, build-date=2025-07-21T16:28:53, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhosp17/openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4) Oct 14 05:15:16 localhost podman[110446]: 2025-10-14 09:15:16.579124005 +0000 UTC m=+0.166502544 container exec_died 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, batch=17.1_20250721.1, container_name=ovn_metadata_agent, io.openshift.expose-services=, release=1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 
'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, vendor=Red Hat, Inc., build-date=2025-07-21T16:28:53, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true, io.buildah.version=1.33.12, name=rhosp17/openstack-neutron-metadata-agent-ovn) Oct 14 05:15:16 localhost podman[110446]: unhealthy Oct 14 05:15:16 localhost systemd[1]: 
7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e.service: Main process exited, code=exited, status=1/FAILURE Oct 14 05:15:16 localhost systemd[1]: 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e.service: Failed with result 'exit-code'. Oct 14 05:15:18 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15453 DF PROTO=TCP SPT=38518 DPT=9101 SEQ=2752263105 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF323DB0000000001030307) Oct 14 05:15:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52369 DF PROTO=TCP SPT=59034 DPT=9102 SEQ=3085530595 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF338B50000000001030307) Oct 14 05:15:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52370 DF PROTO=TCP SPT=59034 DPT=9102 SEQ=3085530595 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF33CDA0000000001030307) Oct 14 05:15:27 localhost sshd[110484]: main: sshd: ssh-rsa algorithm is disabled Oct 14 05:15:28 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2419 DF PROTO=TCP SPT=47460 DPT=9882 SEQ=3369305963 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF34B7B0000000001030307) Oct 14 05:15:30 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52372 DF PROTO=TCP SPT=59034 DPT=9102 SEQ=3085530595 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT 
(020405500402080A2AF3549A0000000001030307) Oct 14 05:15:35 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2422 DF PROTO=TCP SPT=47460 DPT=9882 SEQ=3369305963 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF3675A0000000001030307) Oct 14 05:15:36 localhost systemd[1]: tripleo_nova_virtlogd_wrapper.service: State 'stop-sigterm' timed out. Killing. Oct 14 05:15:36 localhost systemd[1]: tripleo_nova_virtlogd_wrapper.service: Killing process 61903 (conmon) with signal SIGKILL. Oct 14 05:15:36 localhost systemd[1]: tripleo_nova_virtlogd_wrapper.service: Main process exited, code=killed, status=9/KILL Oct 14 05:15:36 localhost systemd[1]: libpod-conmon-5697944f09f52bb27341b791937b3551397f253570b866c4cb3214e8127000d6.scope: Deactivated successfully. Oct 14 05:15:36 localhost podman[110497]: error opening file `/run/crun/5697944f09f52bb27341b791937b3551397f253570b866c4cb3214e8127000d6/status`: No such file or directory Oct 14 05:15:36 localhost systemd[1]: tmp-crun.3AuFgH.mount: Deactivated successfully. 
Oct 14 05:15:36 localhost podman[110485]: 2025-10-14 09:15:36.237919002 +0000 UTC m=+0.074288340 container cleanup 5697944f09f52bb27341b791937b3551397f253570b866c4cb3214e8127000d6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd_wrapper, container_name=nova_virtlogd_wrapper, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, release=2, build-date=2025-07-21T14:56:59, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-ref=809f31d3cd93a9e04341110fb85686656c754dc0, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 0, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', 
'/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtlogd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/container-config-scripts/virtlogd_wrapper:/usr/local/bin/virtlogd_wrapper:ro']}, managed_by=tripleo_ansible, architecture=x86_64, version=17.1.9, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, name=rhosp17/openstack-nova-libvirt, io.openshift.expose-services=, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step3, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-libvirt/images/17.1.9-2, tcib_managed=true, maintainer=OpenStack TripleO Team) Oct 14 05:15:36 localhost podman[110485]: nova_virtlogd_wrapper Oct 14 05:15:36 localhost systemd[1]: tripleo_nova_virtlogd_wrapper.service: Failed with result 'timeout'. Oct 14 05:15:36 localhost systemd[1]: Stopped nova_virtlogd_wrapper container. Oct 14 05:15:37 localhost python3.9[110590]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtnodedevd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Oct 14 05:15:37 localhost systemd[1]: Reloading. Oct 14 05:15:37 localhost systemd-rc-local-generator[110615]: /etc/rc.d/rc.local is not marked executable, skipping. Oct 14 05:15:37 localhost systemd-sysv-generator[110620]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. 
Please update package to include a native systemd unit file, in order to make it more safe and robust. Oct 14 05:15:37 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Oct 14 05:15:37 localhost systemd[1]: Stopping nova_virtnodedevd container... Oct 14 05:15:37 localhost systemd[1]: libpod-b8448d328f64def3e65a4b47453ff3736b00383ab3f8c127ff29df806213ba2c.scope: Deactivated successfully. Oct 14 05:15:37 localhost systemd[1]: libpod-b8448d328f64def3e65a4b47453ff3736b00383ab3f8c127ff29df806213ba2c.scope: Consumed 1.478s CPU time. Oct 14 05:15:37 localhost podman[110631]: 2025-10-14 09:15:37.557174115 +0000 UTC m=+0.081149533 container died b8448d328f64def3e65a4b47453ff3736b00383ab3f8c127ff29df806213ba2c (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtnodedevd, build-date=2025-07-21T14:56:59, name=rhosp17/openstack-nova-libvirt, io.buildah.version=1.33.12, batch=17.1_20250721.1, maintainer=OpenStack TripleO Team, vcs-type=git, tcib_managed=true, vcs-ref=809f31d3cd93a9e04341110fb85686656c754dc0, description=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, managed_by=tripleo_ansible, version=17.1.9, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, container_name=nova_virtnodedevd, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 
'label=filetype:container_file_t'], 'start_order': 2, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtnodedevd.json:/var/lib/kolla/config_files/config.json:ro']}, release=2, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-libvirt/images/17.1.9-2, io.openshift.expose-services=, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, architecture=x86_64, distribution-scope=public) Oct 14 05:15:37 localhost systemd[1]: tmp-crun.CAwzXo.mount: Deactivated successfully. 
Oct 14 05:15:37 localhost podman[110631]: 2025-10-14 09:15:37.591954891 +0000 UTC m=+0.115930269 container cleanup b8448d328f64def3e65a4b47453ff3736b00383ab3f8c127ff29df806213ba2c (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtnodedevd, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, architecture=x86_64, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=, vendor=Red Hat, Inc., container_name=nova_virtnodedevd, release=2, tcib_managed=true, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20250721.1, vcs-ref=809f31d3cd93a9e04341110fb85686656c754dc0, name=rhosp17/openstack-nova-libvirt, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-libvirt/images/17.1.9-2, version=17.1.9, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2025-07-21T14:56:59, io.openshift.tags=rhosp osp openstack osp-17.1, io.buildah.version=1.33.12, com.redhat.component=openstack-nova-libvirt-container, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 2, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtnodedevd.json:/var/lib/kolla/config_files/config.json:ro']}) Oct 14 05:15:37 localhost podman[110631]: nova_virtnodedevd Oct 14 05:15:37 localhost podman[110645]: 2025-10-14 09:15:37.640612447 +0000 UTC m=+0.067896293 container cleanup b8448d328f64def3e65a4b47453ff3736b00383ab3f8c127ff29df806213ba2c (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtnodedevd, com.redhat.license_terms=https://www.redhat.com/agreements, batch=17.1_20250721.1, build-date=2025-07-21T14:56:59, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 2, 'ulimit': 
['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtnodedevd.json:/var/lib/kolla/config_files/config.json:ro']}, container_name=nova_virtnodedevd, managed_by=tripleo_ansible, vcs-type=git, io.buildah.version=1.33.12, name=rhosp17/openstack-nova-libvirt, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., tcib_managed=true, version=17.1.9, com.redhat.component=openstack-nova-libvirt-container, release=2, distribution-scope=public, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-libvirt/images/17.1.9-2, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, 
vcs-ref=809f31d3cd93a9e04341110fb85686656c754dc0, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=) Oct 14 05:15:37 localhost systemd[1]: libpod-conmon-b8448d328f64def3e65a4b47453ff3736b00383ab3f8c127ff29df806213ba2c.scope: Deactivated successfully. Oct 14 05:15:37 localhost podman[110672]: error opening file `/run/crun/b8448d328f64def3e65a4b47453ff3736b00383ab3f8c127ff29df806213ba2c/status`: No such file or directory Oct 14 05:15:37 localhost podman[110661]: 2025-10-14 09:15:37.731618893 +0000 UTC m=+0.060900506 container cleanup b8448d328f64def3e65a4b47453ff3736b00383ab3f8c127ff29df806213ba2c (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtnodedevd, distribution-scope=public, build-date=2025-07-21T14:56:59, io.buildah.version=1.33.12, managed_by=tripleo_ansible, io.openshift.expose-services=, release=2, batch=17.1_20250721.1, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-type=git, container_name=nova_virtnodedevd, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-libvirt, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., config_id=tripleo_step3, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 2, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtnodedevd.json:/var/lib/kolla/config_files/config.json:ro']}, description=Red Hat OpenStack Platform 17.1 nova-libvirt, version=17.1.9, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-libvirt/images/17.1.9-2, com.redhat.component=openstack-nova-libvirt-container, vcs-ref=809f31d3cd93a9e04341110fb85686656c754dc0, tcib_managed=true) Oct 14 05:15:37 localhost podman[110661]: nova_virtnodedevd Oct 14 05:15:37 localhost systemd[1]: tripleo_nova_virtnodedevd.service: Deactivated successfully. Oct 14 05:15:37 localhost systemd[1]: Stopped nova_virtnodedevd container. Oct 14 05:15:38 localhost systemd[1]: var-lib-containers-storage-overlay-d651fdeeacd3d56019b1d31627c266ca6522613e8d6238e23e02ec783ccfc1eb-merged.mount: Deactivated successfully. 
Oct 14 05:15:38 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b8448d328f64def3e65a4b47453ff3736b00383ab3f8c127ff29df806213ba2c-userdata-shm.mount: Deactivated successfully. Oct 14 05:15:38 localhost python3.9[110766]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtproxyd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Oct 14 05:15:38 localhost systemd[1]: Reloading. Oct 14 05:15:38 localhost systemd-rc-local-generator[110788]: /etc/rc.d/rc.local is not marked executable, skipping. Oct 14 05:15:38 localhost systemd-sysv-generator[110792]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Oct 14 05:15:38 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Oct 14 05:15:38 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42603 DF PROTO=TCP SPT=35346 DPT=9105 SEQ=3422966727 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF3745E0000000001030307) Oct 14 05:15:38 localhost systemd[1]: Stopping nova_virtproxyd container... Oct 14 05:15:39 localhost systemd[1]: libpod-f61c6985320c4e3d1194575a3954495dcc2f873d72a594570e6c3b74cf93fd8f.scope: Deactivated successfully. 
Oct 14 05:15:39 localhost podman[110807]: 2025-10-14 09:15:39.046804339 +0000 UTC m=+0.093086152 container died f61c6985320c4e3d1194575a3954495dcc2f873d72a594570e6c3b74cf93fd8f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtproxyd, description=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-libvirt/images/17.1.9-2, architecture=x86_64, io.buildah.version=1.33.12, container_name=nova_virtproxyd, name=rhosp17/openstack-nova-libvirt, batch=17.1_20250721.1, vcs-type=git, maintainer=OpenStack TripleO Team, release=2, distribution-scope=public, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, version=17.1.9, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2025-07-21T14:56:59, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 5, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtproxyd.json:/var/lib/kolla/config_files/config.json:ro']}, tcib_managed=true, config_id=tripleo_step3, vcs-ref=809f31d3cd93a9e04341110fb85686656c754dc0, com.redhat.license_terms=https://www.redhat.com/agreements) Oct 14 05:15:39 localhost podman[110807]: 2025-10-14 09:15:39.085046053 +0000 UTC m=+0.131327866 container cleanup f61c6985320c4e3d1194575a3954495dcc2f873d72a594570e6c3b74cf93fd8f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtproxyd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-libvirt/images/17.1.9-2, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 5, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtproxyd.json:/var/lib/kolla/config_files/config.json:ro']}, release=2, container_name=nova_virtproxyd, batch=17.1_20250721.1, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, architecture=x86_64, io.buildah.version=1.33.12, vendor=Red Hat, Inc., version=17.1.9, tcib_managed=true, vcs-ref=809f31d3cd93a9e04341110fb85686656c754dc0, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhosp17/openstack-nova-libvirt, build-date=2025-07-21T14:56:59, vcs-type=git, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, io.k8s.display-name=Red Hat OpenStack 
Platform 17.1 nova-libvirt, distribution-scope=public, io.openshift.expose-services=) Oct 14 05:15:39 localhost podman[110807]: nova_virtproxyd Oct 14 05:15:39 localhost podman[110820]: 2025-10-14 09:15:39.139303013 +0000 UTC m=+0.079724319 container cleanup f61c6985320c4e3d1194575a3954495dcc2f873d72a594570e6c3b74cf93fd8f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtproxyd, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, container_name=nova_virtproxyd, io.openshift.expose-services=, tcib_managed=true, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-type=git, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 5, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', 
'/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtproxyd.json:/var/lib/kolla/config_files/config.json:ro']}, release=2, io.buildah.version=1.33.12, build-date=2025-07-21T14:56:59, version=17.1.9, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-ref=809f31d3cd93a9e04341110fb85686656c754dc0, description=Red Hat OpenStack Platform 17.1 nova-libvirt, config_id=tripleo_step3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-libvirt/images/17.1.9-2, batch=17.1_20250721.1, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team) Oct 14 05:15:39 localhost systemd[1]: libpod-conmon-f61c6985320c4e3d1194575a3954495dcc2f873d72a594570e6c3b74cf93fd8f.scope: Deactivated successfully. 
Oct 14 05:15:39 localhost podman[110847]: error opening file `/run/crun/f61c6985320c4e3d1194575a3954495dcc2f873d72a594570e6c3b74cf93fd8f/status`: No such file or directory Oct 14 05:15:39 localhost podman[110836]: 2025-10-14 09:15:39.237928936 +0000 UTC m=+0.070131202 container cleanup f61c6985320c4e3d1194575a3954495dcc2f873d72a594570e6c3b74cf93fd8f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtproxyd, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, distribution-scope=public, release=2, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 5, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', 
'/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtproxyd.json:/var/lib/kolla/config_files/config.json:ro']}, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-libvirt/images/17.1.9-2, com.redhat.component=openstack-nova-libvirt-container, vcs-ref=809f31d3cd93a9e04341110fb85686656c754dc0, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, architecture=x86_64, build-date=2025-07-21T14:56:59, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, managed_by=tripleo_ansible, container_name=nova_virtproxyd, tcib_managed=true, vendor=Red Hat, Inc., version=17.1.9, batch=17.1_20250721.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, name=rhosp17/openstack-nova-libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1) Oct 14 05:15:39 localhost podman[110836]: nova_virtproxyd Oct 14 05:15:39 localhost systemd[1]: tripleo_nova_virtproxyd.service: Deactivated successfully. Oct 14 05:15:39 localhost systemd[1]: Stopped nova_virtproxyd container. Oct 14 05:15:39 localhost systemd[1]: var-lib-containers-storage-overlay-32472dd13182638cf4e918dd146bf2d7453b4f75ed169a2fe8fd3591fc0a1be9-merged.mount: Deactivated successfully. Oct 14 05:15:39 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f61c6985320c4e3d1194575a3954495dcc2f873d72a594570e6c3b74cf93fd8f-userdata-shm.mount: Deactivated successfully. 
Oct 14 05:15:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42604 DF PROTO=TCP SPT=35346 DPT=9105 SEQ=3422966727 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF3785A0000000001030307) Oct 14 05:15:39 localhost python3.9[110940]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtqemud.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Oct 14 05:15:40 localhost systemd[1]: Reloading. Oct 14 05:15:40 localhost systemd-sysv-generator[110972]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Oct 14 05:15:40 localhost systemd-rc-local-generator[110967]: /etc/rc.d/rc.local is not marked executable, skipping. Oct 14 05:15:40 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Oct 14 05:15:40 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Oct 14 05:15:40 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Main process exited, code=killed, status=15/TERM Oct 14 05:15:40 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Failed with result 'signal'. Oct 14 05:15:40 localhost systemd[1]: Stopped Check and recover tripleo_nova_virtqemud. Oct 14 05:15:40 localhost systemd[1]: tripleo_nova_virtqemud_recover.timer: Deactivated successfully. Oct 14 05:15:40 localhost systemd[1]: Stopped Check and recover tripleo_nova_virtqemud every 10m. Oct 14 05:15:40 localhost systemd[1]: Stopping nova_virtqemud container... 
Oct 14 05:15:40 localhost systemd[1]: tmp-crun.1n7ipj.mount: Deactivated successfully. Oct 14 05:15:40 localhost systemd[1]: libpod-99bc33bcb09deb13e3026a7596b5a2cca2bae64aca6ed031d39492bb093d9aa2.scope: Deactivated successfully. Oct 14 05:15:40 localhost systemd[1]: libpod-99bc33bcb09deb13e3026a7596b5a2cca2bae64aca6ed031d39492bb093d9aa2.scope: Consumed 2.887s CPU time. Oct 14 05:15:40 localhost podman[110981]: 2025-10-14 09:15:40.435322055 +0000 UTC m=+0.079285854 container died 99bc33bcb09deb13e3026a7596b5a2cca2bae64aca6ed031d39492bb093d9aa2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud, vcs-type=git, maintainer=OpenStack TripleO Team, build-date=2025-07-21T14:56:59, distribution-scope=public, tcib_managed=true, name=rhosp17/openstack-nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, container_name=nova_virtqemud, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-libvirt/images/17.1.9-2, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, version=17.1.9, batch=17.1_20250721.1, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.buildah.version=1.33.12, release=2, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 4, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtqemud.json:/var/lib/kolla/config_files/config.json:ro', '/var/log/containers/libvirt/swtpm:/var/log/swtpm:z']}, io.openshift.tags=rhosp osp openstack osp-17.1, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=, managed_by=tripleo_ansible, vcs-ref=809f31d3cd93a9e04341110fb85686656c754dc0) Oct 14 05:15:40 localhost podman[110981]: 2025-10-14 09:15:40.466415538 +0000 UTC m=+0.110379297 container cleanup 99bc33bcb09deb13e3026a7596b5a2cca2bae64aca6ed031d39492bb093d9aa2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 nova-libvirt, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, managed_by=tripleo_ansible, release=2, 
vcs-ref=809f31d3cd93a9e04341110fb85686656c754dc0, com.redhat.component=openstack-nova-libvirt-container, batch=17.1_20250721.1, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=, vcs-type=git, architecture=x86_64, build-date=2025-07-21T14:56:59, name=rhosp17/openstack-nova-libvirt, container_name=nova_virtqemud, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, version=17.1.9, vendor=Red Hat, Inc., config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 4, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', 
'/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtqemud.json:/var/lib/kolla/config_files/config.json:ro', '/var/log/containers/libvirt/swtpm:/var/log/swtpm:z']}, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-libvirt/images/17.1.9-2, maintainer=OpenStack TripleO Team, tcib_managed=true) Oct 14 05:15:40 localhost podman[110981]: nova_virtqemud Oct 14 05:15:40 localhost podman[110995]: 2025-10-14 09:15:40.509820842 +0000 UTC m=+0.057701008 container cleanup 99bc33bcb09deb13e3026a7596b5a2cca2bae64aca6ed031d39492bb093d9aa2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud, build-date=2025-07-21T14:56:59, release=2, io.buildah.version=1.33.12, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, vcs-ref=809f31d3cd93a9e04341110fb85686656c754dc0, container_name=nova_virtqemud, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 4, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtqemud.json:/var/lib/kolla/config_files/config.json:ro', '/var/log/containers/libvirt/swtpm:/var/log/swtpm:z']}, description=Red Hat OpenStack Platform 17.1 nova-libvirt, name=rhosp17/openstack-nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-libvirt/images/17.1.9-2, version=17.1.9, vcs-type=git, maintainer=OpenStack TripleO Team, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20250721.1, config_id=tripleo_step3) Oct 14 05:15:40 localhost systemd[1]: var-lib-containers-storage-overlay-55b86a5ddc64363ea624c80cfd4e28da33e700e176dc8af3450feb34716db754-merged.mount: Deactivated successfully. 
Oct 14 05:15:40 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-99bc33bcb09deb13e3026a7596b5a2cca2bae64aca6ed031d39492bb093d9aa2-userdata-shm.mount: Deactivated successfully. Oct 14 05:15:41 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42605 DF PROTO=TCP SPT=35346 DPT=9105 SEQ=3422966727 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF3805B0000000001030307) Oct 14 05:15:45 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42606 DF PROTO=TCP SPT=35346 DPT=9105 SEQ=3422966727 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF3901A0000000001030307) Oct 14 05:15:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e. Oct 14 05:15:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5. 
Oct 14 05:15:46 localhost podman[111012]: 2025-10-14 09:15:46.743164861 +0000 UTC m=+0.085008193 container health_status 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, config_id=tripleo_step4, container_name=ovn_metadata_agent, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, batch=17.1_20250721.1, io.buildah.version=1.33.12, io.openshift.expose-services=, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, managed_by=tripleo_ansible, name=rhosp17/openstack-neutron-metadata-agent-ovn, architecture=x86_64, vendor=Red Hat, Inc., version=17.1.9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, build-date=2025-07-21T16:28:53, release=1, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3) Oct 14 05:15:46 localhost podman[111012]: 2025-10-14 09:15:46.762260222 +0000 UTC m=+0.104103504 container exec_died 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, container_name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, batch=17.1_20250721.1, build-date=2025-07-21T16:28:53, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-neutron-metadata-agent-ovn, tcib_managed=true, io.buildah.version=1.33.12, vendor=Red Hat, Inc., config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, release=1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.9, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3) Oct 14 05:15:46 localhost podman[111012]: unhealthy Oct 14 05:15:46 localhost systemd[1]: 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e.service: Main process exited, code=exited, status=1/FAILURE Oct 14 05:15:46 localhost systemd[1]: 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e.service: Failed with result 'exit-code'. 
Oct 14 05:15:46 localhost podman[111013]: 2025-10-14 09:15:46.851587947 +0000 UTC m=+0.190973683 container health_status a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, com.redhat.component=openstack-ovn-controller-container, container_name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, build-date=2025-07-21T13:28:44, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.33.12, vcs-type=git, batch=17.1_20250721.1, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, name=rhosp17/openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/agreements, tcib_managed=true, release=1, version=17.1.9, architecture=x86_64) Oct 14 05:15:46 localhost podman[111013]: 2025-10-14 09:15:46.865970872 
+0000 UTC m=+0.205356698 container exec_died a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vcs-type=git, build-date=2025-07-21T13:28:44, description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.9, batch=17.1_20250721.1, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, container_name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, com.redhat.component=openstack-ovn-controller-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.openshift.expose-services=, name=rhosp17/openstack-ovn-controller, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.33.12) Oct 14 05:15:46 localhost podman[111013]: unhealthy Oct 14 05:15:46 localhost systemd[1]: a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5.service: Main process 
exited, code=exited, status=1/FAILURE Oct 14 05:15:46 localhost systemd[1]: a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5.service: Failed with result 'exit-code'. Oct 14 05:15:47 localhost sshd[111047]: main: sshd: ssh-rsa algorithm is disabled Oct 14 05:15:48 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44089 DF PROTO=TCP SPT=52042 DPT=9101 SEQ=2666693964 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF398DA0000000001030307) Oct 14 05:15:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8675 DF PROTO=TCP SPT=41656 DPT=9102 SEQ=2077140008 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF3ADE50000000001030307) Oct 14 05:15:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8676 DF PROTO=TCP SPT=41656 DPT=9102 SEQ=2077140008 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF3B1DB0000000001030307) Oct 14 05:15:58 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17667 DF PROTO=TCP SPT=56054 DPT=9882 SEQ=3500276288 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF3C0AB0000000001030307) Oct 14 05:15:59 localhost sshd[111049]: main: sshd: ssh-rsa algorithm is disabled Oct 14 05:16:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8678 DF PROTO=TCP SPT=41656 DPT=9102 SEQ=2077140008 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT 
(020405500402080A2AF3C99A0000000001030307) Oct 14 05:16:05 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17670 DF PROTO=TCP SPT=56054 DPT=9882 SEQ=3500276288 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF3DC5B0000000001030307) Oct 14 05:16:08 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=794 DF PROTO=TCP SPT=51818 DPT=9105 SEQ=90596209 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF3E98E0000000001030307) Oct 14 05:16:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=795 DF PROTO=TCP SPT=51818 DPT=9105 SEQ=90596209 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF3ED9A0000000001030307) Oct 14 05:16:11 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=796 DF PROTO=TCP SPT=51818 DPT=9105 SEQ=90596209 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF3F59B0000000001030307) Oct 14 05:16:15 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=797 DF PROTO=TCP SPT=51818 DPT=9105 SEQ=90596209 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF4055A0000000001030307) Oct 14 05:16:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e. Oct 14 05:16:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5. 
Oct 14 05:16:17 localhost podman[111129]: 2025-10-14 09:16:17.018939096 +0000 UTC m=+0.101192166 container health_status 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, architecture=x86_64, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, container_name=ovn_metadata_agent, tcib_managed=true, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, batch=17.1_20250721.1, version=17.1.9, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, 
io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, build-date=2025-07-21T16:28:53, release=1, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, distribution-scope=public, io.openshift.expose-services=, name=rhosp17/openstack-neutron-metadata-agent-ovn) Oct 14 05:16:17 localhost podman[111130]: 2025-10-14 09:16:17.054602771 +0000 UTC m=+0.136787439 container health_status a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, vendor=Red Hat, Inc., io.buildah.version=1.33.12, build-date=2025-07-21T13:28:44, com.redhat.license_terms=https://www.redhat.com/agreements, release=1, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, architecture=x86_64, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', 
'/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, version=17.1.9, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, vcs-type=git, name=rhosp17/openstack-ovn-controller, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, managed_by=tripleo_ansible) Oct 14 05:16:17 localhost podman[111129]: 2025-10-14 09:16:17.086111257 +0000 UTC m=+0.168364287 container exec_died 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, config_id=tripleo_step4, version=17.1.9, batch=17.1_20250721.1, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, tcib_managed=true, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, release=1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.33.12, architecture=x86_64, vcs-type=git, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, name=rhosp17/openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, container_name=ovn_metadata_agent, build-date=2025-07-21T16:28:53, distribution-scope=public, io.openshift.expose-services=) Oct 14 05:16:17 localhost podman[111129]: unhealthy Oct 14 05:16:17 localhost podman[111130]: 2025-10-14 09:16:17.094559848 +0000 UTC m=+0.176744486 container exec_died a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, 
description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step4, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1, architecture=x86_64, com.redhat.component=openstack-ovn-controller-container, batch=17.1_20250721.1, build-date=2025-07-21T13:28:44, maintainer=OpenStack TripleO Team, release=1, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, name=rhosp17/openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.33.12, distribution-scope=public, io.openshift.expose-services=, vendor=Red Hat, Inc., config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, container_name=ovn_controller, version=17.1.9) Oct 14 05:16:17 localhost systemd[1]: 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e.service: Main process exited, code=exited, status=1/FAILURE Oct 14 05:16:17 localhost systemd[1]: 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e.service: Failed with result 'exit-code'. 
Oct 14 05:16:17 localhost podman[111130]: unhealthy Oct 14 05:16:17 localhost systemd[1]: a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5.service: Main process exited, code=exited, status=1/FAILURE Oct 14 05:16:17 localhost systemd[1]: a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5.service: Failed with result 'exit-code'. Oct 14 05:16:18 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23623 DF PROTO=TCP SPT=38504 DPT=9101 SEQ=1241696832 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF40E1A0000000001030307) Oct 14 05:16:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36208 DF PROTO=TCP SPT=47270 DPT=9102 SEQ=3048207069 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF423140000000001030307) Oct 14 05:16:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36209 DF PROTO=TCP SPT=47270 DPT=9102 SEQ=3048207069 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF4271A0000000001030307) Oct 14 05:16:28 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47280 DF PROTO=TCP SPT=38812 DPT=9882 SEQ=263092035 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF435DA0000000001030307) Oct 14 05:16:30 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36211 DF PROTO=TCP SPT=47270 DPT=9102 SEQ=3048207069 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT 
(020405500402080A2AF43EDA0000000001030307) Oct 14 05:16:35 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47283 DF PROTO=TCP SPT=38812 DPT=9882 SEQ=263092035 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF4519A0000000001030307) Oct 14 05:16:38 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=56309 DF PROTO=TCP SPT=40806 DPT=9105 SEQ=2847503052 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF45EBF0000000001030307) Oct 14 05:16:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=56310 DF PROTO=TCP SPT=40806 DPT=9105 SEQ=2847503052 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF462DA0000000001030307) Oct 14 05:16:41 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=56311 DF PROTO=TCP SPT=40806 DPT=9105 SEQ=2847503052 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF46ADA0000000001030307) Oct 14 05:16:45 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=56312 DF PROTO=TCP SPT=40806 DPT=9105 SEQ=2847503052 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF47A9A0000000001030307) Oct 14 05:16:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e. 
Oct 14 05:16:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5. Oct 14 05:16:47 localhost podman[111169]: 2025-10-14 09:16:47.224927568 +0000 UTC m=+0.068302418 container health_status 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, vendor=Red Hat, Inc., version=17.1.9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, release=1, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, build-date=2025-07-21T16:28:53, name=rhosp17/openstack-neutron-metadata-agent-ovn, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, vcs-type=git, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20250721.1, container_name=ovn_metadata_agent, distribution-scope=public, tcib_managed=true, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, architecture=x86_64) Oct 14 05:16:47 localhost podman[111169]: 2025-10-14 09:16:47.240056196 +0000 UTC m=+0.083431036 container exec_died 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, distribution-scope=public, build-date=2025-07-21T16:28:53, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, name=rhosp17/openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, release=1, vcs-type=git, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, container_name=ovn_metadata_agent, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1, batch=17.1_20250721.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, managed_by=tripleo_ansible, version=17.1.9, architecture=x86_64, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., io.openshift.expose-services=) Oct 14 05:16:47 localhost podman[111169]: unhealthy Oct 14 05:16:47 localhost systemd[1]: 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e.service: Main process exited, code=exited, status=1/FAILURE Oct 14 05:16:47 localhost systemd[1]: 
7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e.service: Failed with result 'exit-code'. Oct 14 05:16:47 localhost podman[111170]: 2025-10-14 09:16:47.294498864 +0000 UTC m=+0.132775166 container health_status a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, version=17.1.9, com.redhat.component=openstack-ovn-controller-container, batch=17.1_20250721.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, architecture=x86_64, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, build-date=2025-07-21T13:28:44, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, config_id=tripleo_step4, vcs-type=git, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/agreements, managed_by=tripleo_ansible, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, distribution-scope=public, 
io.buildah.version=1.33.12, release=1) Oct 14 05:16:47 localhost podman[111170]: 2025-10-14 09:16:47.332559732 +0000 UTC m=+0.170836034 container exec_died a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, name=rhosp17/openstack-ovn-controller, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, managed_by=tripleo_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1, distribution-scope=public, build-date=2025-07-21T13:28:44, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, com.redhat.component=openstack-ovn-controller-container, config_id=tripleo_step4, vcs-type=git, io.openshift.expose-services=, io.buildah.version=1.33.12, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1, container_name=ovn_controller, version=17.1.9, architecture=x86_64, batch=17.1_20250721.1) Oct 14 05:16:47 localhost podman[111170]: unhealthy Oct 14 
05:16:47 localhost systemd[1]: a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5.service: Main process exited, code=exited, status=1/FAILURE Oct 14 05:16:47 localhost systemd[1]: a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5.service: Failed with result 'exit-code'. Oct 14 05:16:48 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9353 DF PROTO=TCP SPT=37182 DPT=9101 SEQ=1979821569 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF4835B0000000001030307) Oct 14 05:16:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24092 DF PROTO=TCP SPT=53314 DPT=9102 SEQ=1121004164 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF498450000000001030307) Oct 14 05:16:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24093 DF PROTO=TCP SPT=53314 DPT=9102 SEQ=1121004164 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF49C5A0000000001030307) Oct 14 05:16:58 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14055 DF PROTO=TCP SPT=35424 DPT=9882 SEQ=597726478 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF4AB0B0000000001030307) Oct 14 05:17:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24095 DF PROTO=TCP SPT=53314 DPT=9102 SEQ=1121004164 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF4B41A0000000001030307) Oct 14 05:17:04 
localhost systemd[1]: tripleo_nova_virtqemud.service: State 'stop-sigterm' timed out. Killing. Oct 14 05:17:04 localhost systemd[1]: tripleo_nova_virtqemud.service: Killing process 62547 (conmon) with signal SIGKILL. Oct 14 05:17:04 localhost systemd[1]: tripleo_nova_virtqemud.service: Main process exited, code=killed, status=9/KILL Oct 14 05:17:04 localhost systemd[1]: libpod-conmon-99bc33bcb09deb13e3026a7596b5a2cca2bae64aca6ed031d39492bb093d9aa2.scope: Deactivated successfully. Oct 14 05:17:04 localhost podman[111222]: error opening file `/run/crun/99bc33bcb09deb13e3026a7596b5a2cca2bae64aca6ed031d39492bb093d9aa2/status`: No such file or directory Oct 14 05:17:04 localhost podman[111209]: 2025-10-14 09:17:04.726103976 +0000 UTC m=+0.069711171 container cleanup 99bc33bcb09deb13e3026a7596b5a2cca2bae64aca6ed031d39492bb093d9aa2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 4, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', 
'/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtqemud.json:/var/lib/kolla/config_files/config.json:ro', '/var/log/containers/libvirt/swtpm:/var/log/swtpm:z']}, distribution-scope=public, architecture=x86_64, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2025-07-21T14:56:59, com.redhat.component=openstack-nova-libvirt-container, description=Red Hat OpenStack Platform 17.1 nova-libvirt, release=2, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, io.buildah.version=1.33.12, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-libvirt/images/17.1.9-2, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-ref=809f31d3cd93a9e04341110fb85686656c754dc0, batch=17.1_20250721.1, name=rhosp17/openstack-nova-libvirt, version=17.1.9, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, container_name=nova_virtqemud) Oct 14 05:17:04 localhost podman[111209]: nova_virtqemud Oct 14 05:17:04 localhost systemd[1]: tripleo_nova_virtqemud.service: Failed with result 'timeout'. Oct 14 05:17:04 localhost systemd[1]: Stopped nova_virtqemud container. 
Oct 14 05:17:05 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14058 DF PROTO=TCP SPT=35424 DPT=9882 SEQ=597726478 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF4C6DB0000000001030307) Oct 14 05:17:05 localhost python3.9[111315]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtqemud_recover.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Oct 14 05:17:05 localhost systemd[1]: Reloading. Oct 14 05:17:05 localhost systemd-rc-local-generator[111340]: /etc/rc.d/rc.local is not marked executable, skipping. Oct 14 05:17:05 localhost systemd-sysv-generator[111344]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Oct 14 05:17:05 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Oct 14 05:17:06 localhost python3.9[111445]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtsecretd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Oct 14 05:17:06 localhost systemd[1]: Reloading. Oct 14 05:17:06 localhost systemd-rc-local-generator[111475]: /etc/rc.d/rc.local is not marked executable, skipping. Oct 14 05:17:06 localhost systemd-sysv-generator[111480]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. 
Oct 14 05:17:06 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Oct 14 05:17:07 localhost systemd[1]: Stopping nova_virtsecretd container... Oct 14 05:17:07 localhost systemd[1]: tmp-crun.vFu4hj.mount: Deactivated successfully. Oct 14 05:17:07 localhost systemd[1]: libpod-02380eea0a197064ac4a61c7601e9a90a4b04b9a25c36e2d784cf28785ae6eaf.scope: Deactivated successfully. Oct 14 05:17:07 localhost podman[111487]: 2025-10-14 09:17:07.112438677 +0000 UTC m=+0.083599241 container died 02380eea0a197064ac4a61c7601e9a90a4b04b9a25c36e2d784cf28785ae6eaf (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtsecretd, release=2, vcs-ref=809f31d3cd93a9e04341110fb85686656c754dc0, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-libvirt-container, architecture=x86_64, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 1, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtsecretd.json:/var/lib/kolla/config_files/config.json:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, container_name=nova_virtsecretd, distribution-scope=public, vcs-type=git, version=17.1.9, build-date=2025-07-21T14:56:59, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, name=rhosp17/openstack-nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-libvirt/images/17.1.9-2, managed_by=tripleo_ansible, batch=17.1_20250721.1, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, description=Red Hat OpenStack Platform 17.1 nova-libvirt, vendor=Red Hat, Inc., io.openshift.expose-services=) Oct 14 05:17:07 localhost podman[111487]: 2025-10-14 09:17:07.147044039 +0000 UTC m=+0.118204573 container cleanup 02380eea0a197064ac4a61c7601e9a90a4b04b9a25c36e2d784cf28785ae6eaf (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtsecretd, release=2, io.buildah.version=1.33.12, name=rhosp17/openstack-nova-libvirt, vendor=Red Hat, Inc., tcib_managed=true, io.openshift.expose-services=, 
config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 1, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtsecretd.json:/var/lib/kolla/config_files/config.json:ro']}, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-libvirt/images/17.1.9-2, com.redhat.component=openstack-nova-libvirt-container, 
config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, container_name=nova_virtsecretd, build-date=2025-07-21T14:56:59, vcs-ref=809f31d3cd93a9e04341110fb85686656c754dc0, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-type=git, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1, batch=17.1_20250721.1, architecture=x86_64, version=17.1.9, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.license_terms=https://www.redhat.com/agreements) Oct 14 05:17:07 localhost podman[111487]: nova_virtsecretd Oct 14 05:17:07 localhost podman[111502]: 2025-10-14 09:17:07.199065981 +0000 UTC m=+0.072767985 container cleanup 02380eea0a197064ac4a61c7601e9a90a4b04b9a25c36e2d784cf28785ae6eaf (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtsecretd, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, architecture=x86_64, vendor=Red Hat, Inc., vcs-ref=809f31d3cd93a9e04341110fb85686656c754dc0, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, managed_by=tripleo_ansible, release=2, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-libvirt/images/17.1.9-2, vcs-type=git, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-libvirt, version=17.1.9, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, container_name=nova_virtsecretd, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.buildah.version=1.33.12, batch=17.1_20250721.1, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'4d186a6228facd5bcddf9bcc145eb470'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 1, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtsecretd.json:/var/lib/kolla/config_files/config.json:ro']}, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step3, build-date=2025-07-21T14:56:59) Oct 14 05:17:07 localhost systemd[1]: libpod-conmon-02380eea0a197064ac4a61c7601e9a90a4b04b9a25c36e2d784cf28785ae6eaf.scope: Deactivated successfully. 
Oct 14 05:17:07 localhost podman[111532]: error opening file `/run/crun/02380eea0a197064ac4a61c7601e9a90a4b04b9a25c36e2d784cf28785ae6eaf/status`: No such file or directory Oct 14 05:17:07 localhost podman[111519]: 2025-10-14 09:17:07.308295025 +0000 UTC m=+0.071896348 container cleanup 02380eea0a197064ac4a61c7601e9a90a4b04b9a25c36e2d784cf28785ae6eaf (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtsecretd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, release=2, vcs-ref=809f31d3cd93a9e04341110fb85686656c754dc0, name=rhosp17/openstack-nova-libvirt, distribution-scope=public, build-date=2025-07-21T14:56:59, container_name=nova_virtsecretd, managed_by=tripleo_ansible, tcib_managed=true, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1, batch=17.1_20250721.1, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, version=17.1.9, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 1, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtsecretd.json:/var/lib/kolla/config_files/config.json:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-libvirt, architecture=x86_64, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-libvirt/images/17.1.9-2, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-libvirt-container) Oct 14 05:17:07 localhost podman[111519]: nova_virtsecretd Oct 14 05:17:07 localhost systemd[1]: tripleo_nova_virtsecretd.service: Deactivated successfully. Oct 14 05:17:07 localhost systemd[1]: Stopped nova_virtsecretd container. Oct 14 05:17:08 localhost python3.9[111625]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtstoraged.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Oct 14 05:17:08 localhost systemd[1]: var-lib-containers-storage-overlay-14e63e4b374fcf1d9cd6a42836a9c1d845dc465af9c3574a672d16caea6c12c8-merged.mount: Deactivated successfully. 
Oct 14 05:17:08 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-02380eea0a197064ac4a61c7601e9a90a4b04b9a25c36e2d784cf28785ae6eaf-userdata-shm.mount: Deactivated successfully. Oct 14 05:17:08 localhost systemd[1]: Reloading. Oct 14 05:17:08 localhost systemd-rc-local-generator[111655]: /etc/rc.d/rc.local is not marked executable, skipping. Oct 14 05:17:08 localhost systemd-sysv-generator[111658]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Oct 14 05:17:08 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Oct 14 05:17:08 localhost systemd[1]: Stopping nova_virtstoraged container... Oct 14 05:17:08 localhost systemd[1]: tmp-crun.nGCX6N.mount: Deactivated successfully. Oct 14 05:17:08 localhost systemd[1]: libpod-642ddc0adda99f37b6ecfbddc75eb67ba42a08fdfa9f5ce9223f91ce82a5e0ee.scope: Deactivated successfully. 
Oct 14 05:17:08 localhost podman[111666]: 2025-10-14 09:17:08.615749831 +0000 UTC m=+0.114371933 container died 642ddc0adda99f37b6ecfbddc75eb67ba42a08fdfa9f5ce9223f91ce82a5e0ee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtstoraged, distribution-scope=public, vcs-ref=809f31d3cd93a9e04341110fb85686656c754dc0, container_name=nova_virtstoraged, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, release=2, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=, name=rhosp17/openstack-nova-libvirt, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.component=openstack-nova-libvirt-container, managed_by=tripleo_ansible, vcs-type=git, build-date=2025-07-21T14:56:59, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-libvirt/images/17.1.9-2, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, version=17.1.9, architecture=x86_64, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 3, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtstoraged.json:/var/lib/kolla/config_files/config.json:ro']}, batch=17.1_20250721.1, vendor=Red Hat, Inc., io.buildah.version=1.33.12, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt) Oct 14 05:17:08 localhost podman[111666]: 2025-10-14 09:17:08.658530207 +0000 UTC m=+0.157152299 container cleanup 642ddc0adda99f37b6ecfbddc75eb67ba42a08fdfa9f5ce9223f91ce82a5e0ee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtstoraged, version=17.1.9, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 3, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtstoraged.json:/var/lib/kolla/config_files/config.json:ro']}, distribution-scope=public, io.openshift.expose-services=, vendor=Red Hat, Inc., build-date=2025-07-21T14:56:59, release=2, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, io.buildah.version=1.33.12, name=rhosp17/openstack-nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 nova-libvirt, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-libvirt/images/17.1.9-2, vcs-ref=809f31d3cd93a9e04341110fb85686656c754dc0, maintainer=OpenStack TripleO Team, 
batch=17.1_20250721.1, managed_by=tripleo_ansible, tcib_managed=true, container_name=nova_virtstoraged, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt) Oct 14 05:17:08 localhost podman[111666]: nova_virtstoraged Oct 14 05:17:08 localhost podman[111680]: 2025-10-14 09:17:08.687138743 +0000 UTC m=+0.064367495 container cleanup 642ddc0adda99f37b6ecfbddc75eb67ba42a08fdfa9f5ce9223f91ce82a5e0ee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtstoraged, description=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, vcs-type=git, build-date=2025-07-21T14:56:59, container_name=nova_virtstoraged, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-libvirt-container, vcs-ref=809f31d3cd93a9e04341110fb85686656c754dc0, version=17.1.9, release=2, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 3, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', 
'/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtstoraged.json:/var/lib/kolla/config_files/config.json:ro']}, config_id=tripleo_step3, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, tcib_managed=true, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-libvirt/images/17.1.9-2, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, name=rhosp17/openstack-nova-libvirt) Oct 14 05:17:08 localhost systemd[1]: libpod-conmon-642ddc0adda99f37b6ecfbddc75eb67ba42a08fdfa9f5ce9223f91ce82a5e0ee.scope: Deactivated successfully. 
Oct 14 05:17:08 localhost podman[111709]: error opening file `/run/crun/642ddc0adda99f37b6ecfbddc75eb67ba42a08fdfa9f5ce9223f91ce82a5e0ee/status`: No such file or directory Oct 14 05:17:08 localhost podman[111697]: 2025-10-14 09:17:08.773814009 +0000 UTC m=+0.057071319 container cleanup 642ddc0adda99f37b6ecfbddc75eb67ba42a08fdfa9f5ce9223f91ce82a5e0ee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtstoraged, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4d186a6228facd5bcddf9bcc145eb470'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 3, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtstoraged.json:/var/lib/kolla/config_files/config.json:ro']}, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-libvirt, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true, vendor=Red Hat, Inc., batch=17.1_20250721.1, release=2, container_name=nova_virtstoraged, vcs-type=git, com.redhat.component=openstack-nova-libvirt-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2025-07-21T14:56:59, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=, managed_by=tripleo_ansible, vcs-ref=809f31d3cd93a9e04341110fb85686656c754dc0, distribution-scope=public, io.buildah.version=1.33.12, version=17.1.9, config_id=tripleo_step3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-libvirt/images/17.1.9-2, description=Red Hat OpenStack Platform 17.1 nova-libvirt) Oct 14 05:17:08 localhost podman[111697]: nova_virtstoraged Oct 14 05:17:08 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15448 DF PROTO=TCP SPT=49638 DPT=9105 SEQ=2185358872 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF4D3EE0000000001030307) Oct 14 05:17:08 localhost systemd[1]: tripleo_nova_virtstoraged.service: Deactivated successfully. Oct 14 05:17:08 localhost systemd[1]: Stopped nova_virtstoraged container. Oct 14 05:17:09 localhost systemd[1]: tmp-crun.FpytY0.mount: Deactivated successfully. 
Oct 14 05:17:09 localhost systemd[1]: var-lib-containers-storage-overlay-a6c55f01d04ba30e2a7871446cca9a16a6473cee6ca9a477923fc0329ccb78a4-merged.mount: Deactivated successfully. Oct 14 05:17:09 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-642ddc0adda99f37b6ecfbddc75eb67ba42a08fdfa9f5ce9223f91ce82a5e0ee-userdata-shm.mount: Deactivated successfully. Oct 14 05:17:09 localhost python3.9[111802]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_ovn_controller.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Oct 14 05:17:09 localhost systemd[1]: Reloading. Oct 14 05:17:09 localhost systemd-rc-local-generator[111827]: /etc/rc.d/rc.local is not marked executable, skipping. Oct 14 05:17:09 localhost systemd-sysv-generator[111833]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Oct 14 05:17:09 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Oct 14 05:17:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15449 DF PROTO=TCP SPT=49638 DPT=9105 SEQ=2185358872 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF4D7DA0000000001030307) Oct 14 05:17:09 localhost systemd[1]: Stopping ovn_controller container... Oct 14 05:17:10 localhost systemd[1]: libpod-a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5.scope: Deactivated successfully. Oct 14 05:17:10 localhost systemd[1]: libpod-a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5.scope: Consumed 2.696s CPU time. 
Oct 14 05:17:10 localhost podman[111842]: 2025-10-14 09:17:10.083535185 +0000 UTC m=+0.118965036 container stop a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, name=rhosp17/openstack-ovn-controller, version=17.1.9, io.buildah.version=1.33.12, architecture=x86_64, maintainer=OpenStack TripleO Team, build-date=2025-07-21T13:28:44, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, release=1, com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, container_name=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.expose-services=, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, batch=17.1_20250721.1, config_id=tripleo_step4, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git) Oct 14 05:17:10 localhost podman[111842]: 2025-10-14 09:17:10.116838787 +0000 UTC m=+0.152268618 container 
died a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, batch=17.1_20250721.1, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, io.buildah.version=1.33.12, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, com.redhat.component=openstack-ovn-controller-container, release=1, build-date=2025-07-21T13:28:44, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1, version=17.1.9, architecture=x86_64, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., container_name=ovn_controller, name=rhosp17/openstack-ovn-controller, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1) Oct 14 05:17:10 localhost systemd[1]: a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5.timer: Deactivated successfully. 
Oct 14 05:17:10 localhost systemd[1]: Stopped /usr/bin/podman healthcheck run a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5. Oct 14 05:17:10 localhost systemd[1]: a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5.service: Failed to open /run/systemd/transient/a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5.service: No such file or directory Oct 14 05:17:10 localhost systemd[1]: tmp-crun.9VGxlf.mount: Deactivated successfully. Oct 14 05:17:10 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5-userdata-shm.mount: Deactivated successfully. Oct 14 05:17:10 localhost systemd[1]: var-lib-containers-storage-overlay-982298a30930777b3d22fc278491644ea40b8cb3cf87000aab15fb7785d6006b-merged.mount: Deactivated successfully. Oct 14 05:17:10 localhost podman[111842]: 2025-10-14 09:17:10.173387549 +0000 UTC m=+0.208817380 container cleanup a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.component=openstack-ovn-controller-container, build-date=2025-07-21T13:28:44, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, 
com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, version=17.1.9, tcib_managed=true, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, batch=17.1_20250721.1, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, release=1, architecture=x86_64, vcs-type=git, managed_by=tripleo_ansible) Oct 14 05:17:10 localhost podman[111842]: ovn_controller Oct 14 05:17:10 localhost systemd[1]: a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5.timer: Failed to open /run/systemd/transient/a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5.timer: No such file or directory Oct 14 05:17:10 localhost systemd[1]: a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5.service: Failed to open /run/systemd/transient/a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5.service: No such file or directory Oct 14 05:17:10 localhost podman[111855]: 2025-10-14 09:17:10.2279581 +0000 UTC m=+0.123482597 container cleanup a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, release=1, distribution-scope=public, version=17.1.9, io.openshift.expose-services=, com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true, vcs-type=git, config_id=tripleo_step4, 
io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20250721.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, architecture=x86_64, build-date=2025-07-21T13:28:44, name=rhosp17/openstack-ovn-controller, container_name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, io.buildah.version=1.33.12) Oct 14 05:17:10 localhost systemd[1]: libpod-conmon-a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5.scope: Deactivated successfully. 
Oct 14 05:17:10 localhost systemd[1]: a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5.timer: Failed to open /run/systemd/transient/a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5.timer: No such file or directory Oct 14 05:17:10 localhost systemd[1]: a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5.service: Failed to open /run/systemd/transient/a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5.service: No such file or directory Oct 14 05:17:10 localhost podman[111866]: 2025-10-14 09:17:10.331619431 +0000 UTC m=+0.062457955 container cleanup a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc., config_id=tripleo_step4, io.buildah.version=1.33.12, name=rhosp17/openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, distribution-scope=public, architecture=x86_64, io.openshift.expose-services=, managed_by=tripleo_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, 
build-date=2025-07-21T13:28:44, com.redhat.license_terms=https://www.redhat.com/agreements, batch=17.1_20250721.1, com.redhat.component=openstack-ovn-controller-container, container_name=ovn_controller, maintainer=OpenStack TripleO Team, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, vcs-type=git, release=1, version=17.1.9, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller) Oct 14 05:17:10 localhost podman[111866]: ovn_controller Oct 14 05:17:10 localhost systemd[1]: tripleo_ovn_controller.service: Deactivated successfully. Oct 14 05:17:10 localhost systemd[1]: Stopped ovn_controller container. Oct 14 05:17:11 localhost python3.9[111970]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_ovn_metadata_agent.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Oct 14 05:17:11 localhost systemd[1]: Reloading. Oct 14 05:17:11 localhost systemd-sysv-generator[112003]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Oct 14 05:17:11 localhost systemd-rc-local-generator[111999]: /etc/rc.d/rc.local is not marked executable, skipping. Oct 14 05:17:11 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Oct 14 05:17:11 localhost systemd[1]: Stopping ovn_metadata_agent container... Oct 14 05:17:11 localhost systemd[1]: tmp-crun.lvTV9M.mount: Deactivated successfully. 
Oct 14 05:17:11 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15450 DF PROTO=TCP SPT=49638 DPT=9105 SEQ=2185358872 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF4DFDA0000000001030307) Oct 14 05:17:11 localhost systemd[1]: libpod-7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e.scope: Deactivated successfully. Oct 14 05:17:11 localhost systemd[1]: libpod-7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e.scope: Consumed 11.992s CPU time. Oct 14 05:17:11 localhost podman[112011]: 2025-10-14 09:17:11.901062764 +0000 UTC m=+0.377618650 container died 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, config_id=tripleo_step4, architecture=x86_64, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, container_name=ovn_metadata_agent, vendor=Red Hat, Inc., release=1, batch=17.1_20250721.1, io.buildah.version=1.33.12, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T16:28:53, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, vcs-type=git, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, version=17.1.9, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible) Oct 14 05:17:11 localhost systemd[1]: 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e.timer: Deactivated successfully. Oct 14 05:17:11 localhost systemd[1]: Stopped /usr/bin/podman healthcheck run 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e. 
Oct 14 05:17:11 localhost systemd[1]: 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e.service: Failed to open /run/systemd/transient/7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e.service: No such file or directory Oct 14 05:17:11 localhost podman[112011]: 2025-10-14 09:17:11.974046446 +0000 UTC m=+0.450602272 container cleanup 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, 
config_id=tripleo_step4, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T16:28:53, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., distribution-scope=public, release=1, version=17.1.9, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=ovn_metadata_agent, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, managed_by=tripleo_ansible, io.openshift.expose-services=, vcs-type=git, name=rhosp17/openstack-neutron-metadata-agent-ovn, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1) Oct 14 05:17:11 localhost podman[112011]: ovn_metadata_agent Oct 14 05:17:12 localhost systemd[1]: 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e.timer: Failed to open /run/systemd/transient/7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e.timer: No such file or directory Oct 14 05:17:12 localhost systemd[1]: 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e.service: Failed to open /run/systemd/transient/7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e.service: No such file or directory Oct 14 05:17:12 localhost podman[112024]: 2025-10-14 09:17:12.062811296 +0000 UTC m=+0.147887783 container cleanup 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, build-date=2025-07-21T16:28:53, config_id=tripleo_step4, io.buildah.version=1.33.12, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team, tcib_managed=true, vcs-type=git, vendor=Red Hat, Inc., version=17.1.9, architecture=x86_64, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, release=1, distribution-scope=public, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp 
openstack osp-17.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20250721.1, container_name=ovn_metadata_agent) Oct 14 05:17:12 localhost systemd[1]: var-lib-containers-storage-overlay-47662028da5721c0c74410535dffb8022fc35a3f39034c6ae555d30a14cd4135-merged.mount: Deactivated successfully. Oct 14 05:17:12 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e-userdata-shm.mount: Deactivated successfully. Oct 14 05:17:15 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15451 DF PROTO=TCP SPT=49638 DPT=9105 SEQ=2185358872 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF4EF9A0000000001030307) Oct 14 05:17:17 localhost podman[112144]: 2025-10-14 09:17:17.096277989 +0000 UTC m=+0.103448696 container exec b19a91314e045cbe5465f6abdeb6b420108fcda7860b498b01dcd4e65236d2c8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-crash-np0005486733, io.openshift.expose-services=, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux , build-date=2025-09-24T08:57:55, io.buildah.version=1.33.12, version=7, RELEASE=main, ceph=True, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph 
ceph, distribution-scope=public, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, GIT_CLEAN=True, release=553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9) Oct 14 05:17:17 localhost podman[112144]: 2025-10-14 09:17:17.228293449 +0000 UTC m=+0.235464196 container exec_died b19a91314e045cbe5465f6abdeb6b420108fcda7860b498b01dcd4e65236d2c8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-crash-np0005486733, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, release=553, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-type=git, version=7, distribution-scope=public, name=rhceph, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-09-24T08:57:55, maintainer=Guillaume Abrioux , io.buildah.version=1.33.12, vendor=Red Hat, Inc., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_CLEAN=True, io.openshift.tags=rhceph ceph) Oct 14 05:17:18 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40427 DF PROTO=TCP SPT=49136 DPT=9101 SEQ=3900930622 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT 
(020405500402080A2AF4F89A0000000001030307) Oct 14 05:17:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41617 DF PROTO=TCP SPT=37018 DPT=9102 SEQ=2442899288 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF50D750000000001030307) Oct 14 05:17:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41618 DF PROTO=TCP SPT=37018 DPT=9102 SEQ=2442899288 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF5119A0000000001030307) Oct 14 05:17:28 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19529 DF PROTO=TCP SPT=42884 DPT=9882 SEQ=2534537159 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF5203A0000000001030307) Oct 14 05:17:30 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41620 DF PROTO=TCP SPT=37018 DPT=9102 SEQ=2442899288 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF5295A0000000001030307) Oct 14 05:17:35 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19532 DF PROTO=TCP SPT=42884 DPT=9882 SEQ=2534537159 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF53C1B0000000001030307) Oct 14 05:17:38 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60226 DF PROTO=TCP SPT=46138 DPT=9105 SEQ=1096624180 ACK=0 WINDOW=32640 
RES=0x00 SYN URGP=0 OPT (020405500402080A2AF5491E0000000001030307) Oct 14 05:17:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60227 DF PROTO=TCP SPT=46138 DPT=9105 SEQ=1096624180 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF54D1A0000000001030307) Oct 14 05:17:41 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60228 DF PROTO=TCP SPT=46138 DPT=9105 SEQ=1096624180 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF5551B0000000001030307) Oct 14 05:17:45 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60229 DF PROTO=TCP SPT=46138 DPT=9105 SEQ=1096624180 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF564DA0000000001030307) Oct 14 05:17:48 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58322 DF PROTO=TCP SPT=58902 DPT=9101 SEQ=1936226091 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF56D9A0000000001030307) Oct 14 05:17:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59867 DF PROTO=TCP SPT=48198 DPT=9102 SEQ=2352394263 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF582A50000000001030307) Oct 14 05:17:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59868 DF PROTO=TCP SPT=48198 DPT=9102 SEQ=2352394263 
ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF5869B0000000001030307) Oct 14 05:17:58 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26280 DF PROTO=TCP SPT=55754 DPT=9882 SEQ=3644752773 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF5956B0000000001030307) Oct 14 05:18:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59870 DF PROTO=TCP SPT=48198 DPT=9102 SEQ=2352394263 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF59E5A0000000001030307) Oct 14 05:18:05 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26283 DF PROTO=TCP SPT=55754 DPT=9882 SEQ=3644752773 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF5B11A0000000001030307) Oct 14 05:18:08 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28849 DF PROTO=TCP SPT=34906 DPT=9105 SEQ=4173587401 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF5BE4E0000000001030307) Oct 14 05:18:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28850 DF PROTO=TCP SPT=34906 DPT=9105 SEQ=4173587401 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF5C25A0000000001030307) Oct 14 05:18:11 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28851 DF PROTO=TCP SPT=34906 
DPT=9105 SEQ=4173587401 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF5CA5B0000000001030307) Oct 14 05:18:15 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28852 DF PROTO=TCP SPT=34906 DPT=9105 SEQ=4173587401 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF5DA1A0000000001030307) Oct 14 05:18:18 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16589 DF PROTO=TCP SPT=36096 DPT=9101 SEQ=422965733 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF5E2DA0000000001030307) Oct 14 05:18:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22385 DF PROTO=TCP SPT=39924 DPT=9102 SEQ=1310416538 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF5F7D50000000001030307) Oct 14 05:18:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22386 DF PROTO=TCP SPT=39924 DPT=9102 SEQ=1310416538 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF5FBDA0000000001030307) Oct 14 05:18:28 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37333 DF PROTO=TCP SPT=47646 DPT=9882 SEQ=3582073061 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF60A9B0000000001030307) Oct 14 05:18:30 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22388 DF 
PROTO=TCP SPT=39924 DPT=9102 SEQ=1310416538 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF6139B0000000001030307) Oct 14 05:18:35 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37336 DF PROTO=TCP SPT=47646 DPT=9882 SEQ=3582073061 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF6265A0000000001030307) Oct 14 05:18:36 localhost systemd[1]: tripleo_ovn_metadata_agent.service: State 'stop-sigterm' timed out. Killing. Oct 14 05:18:36 localhost systemd[1]: tripleo_ovn_metadata_agent.service: Killing process 72063 (conmon) with signal SIGKILL. Oct 14 05:18:36 localhost systemd[1]: tripleo_ovn_metadata_agent.service: Main process exited, code=killed, status=9/KILL Oct 14 05:18:36 localhost systemd[1]: libpod-conmon-7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e.scope: Deactivated successfully. Oct 14 05:18:36 localhost podman[112378]: error opening file `/run/crun/7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e/status`: No such file or directory Oct 14 05:18:36 localhost systemd[1]: tmp-crun.Q9sO3S.mount: Deactivated successfully. 
Oct 14 05:18:36 localhost systemd[1]: 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e.timer: Failed to open /run/systemd/transient/7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e.timer: No such file or directory Oct 14 05:18:36 localhost systemd[1]: 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e.service: Failed to open /run/systemd/transient/7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e.service: No such file or directory Oct 14 05:18:36 localhost podman[112366]: 2025-10-14 09:18:36.241403833 +0000 UTC m=+0.080659910 container cleanup 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, container_name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, vcs-type=git, vendor=Red Hat, Inc., com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, name=rhosp17/openstack-neutron-metadata-agent-ovn, version=17.1.9, build-date=2025-07-21T16:28:53, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.openshift.expose-services=, batch=17.1_20250721.1, release=1, io.buildah.version=1.33.12, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1) Oct 14 05:18:36 localhost podman[112366]: ovn_metadata_agent Oct 14 05:18:36 localhost systemd[1]: tripleo_ovn_metadata_agent.service: Failed with result 'timeout'. Oct 14 05:18:36 localhost systemd[1]: Stopped ovn_metadata_agent container. Oct 14 05:18:37 localhost python3.9[112472]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_rsyslog.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Oct 14 05:18:37 localhost systemd[1]: Reloading. Oct 14 05:18:37 localhost systemd-sysv-generator[112502]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. 
Please update package to include a native systemd unit file, in order to make it more safe and robust. Oct 14 05:18:37 localhost systemd-rc-local-generator[112495]: /etc/rc.d/rc.local is not marked executable, skipping. Oct 14 05:18:37 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Oct 14 05:18:38 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5904 DF PROTO=TCP SPT=57242 DPT=9105 SEQ=2729141939 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF6337E0000000001030307) Oct 14 05:18:39 localhost python3.9[112601]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:18:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5905 DF PROTO=TCP SPT=57242 DPT=9105 SEQ=2729141939 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF6379A0000000001030307) Oct 14 05:18:40 localhost python3.9[112693]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_ceilometer_agent_ipmi.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None 
seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:18:40 localhost python3.9[112785]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_collectd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:18:41 localhost python3.9[112877]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_iscsid.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:18:41 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5906 DF PROTO=TCP SPT=57242 DPT=9105 SEQ=2729141939 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF63F9B0000000001030307) Oct 14 05:18:41 localhost python3.9[112969]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_logrotate_crond.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:18:42 localhost python3.9[113061]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_metrics_qdr.service state=absent recurse=False force=False 
follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:18:43 localhost python3.9[113153]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_neutron_dhcp.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:18:43 localhost python3.9[113245]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_neutron_l3_agent.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:18:44 localhost python3.9[113337]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_neutron_ovs_agent.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:18:45 localhost python3.9[113429]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S 
unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:18:45 localhost python3.9[113521]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:18:45 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5907 DF PROTO=TCP SPT=57242 DPT=9105 SEQ=2729141939 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF64F5A0000000001030307) Oct 14 05:18:46 localhost python3.9[113613]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:18:47 localhost python3.9[113705]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 
05:18:47 localhost python3.9[113797]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:18:48 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30168 DF PROTO=TCP SPT=52562 DPT=9101 SEQ=63393729 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF6581B0000000001030307) Oct 14 05:18:48 localhost python3.9[113889]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:18:49 localhost python3.9[113981]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtqemud_recover.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:18:49 localhost python3.9[114073]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True 
modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 05:18:50 localhost python3.9[114165]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 05:18:50 localhost python3.9[114257]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_ovn_controller.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 05:18:51 localhost python3.9[114349]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_ovn_metadata_agent.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 05:18:52 localhost python3.9[114441]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_rsyslog.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 05:18:53 localhost python3.9[114533]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 05:18:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43548 DF PROTO=TCP SPT=49156 DPT=9102 SEQ=3677283698 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF66D050000000001030307)
Oct 14 05:18:53 localhost python3.9[114625]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_ceilometer_agent_ipmi.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 05:18:54 localhost python3.9[114717]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_collectd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 05:18:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43549 DF PROTO=TCP SPT=49156 DPT=9102 SEQ=3677283698 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF6711A0000000001030307)
Oct 14 05:18:55 localhost python3.9[114809]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_iscsid.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 05:18:55 localhost python3.9[114901]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_logrotate_crond.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 05:18:56 localhost python3.9[114993]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_metrics_qdr.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 05:18:57 localhost python3.9[115085]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_neutron_dhcp.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 05:18:57 localhost python3.9[115177]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_neutron_l3_agent.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 05:18:58 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40376 DF PROTO=TCP SPT=48942 DPT=9882 SEQ=4054627429 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF67FCC0000000001030307)
Oct 14 05:18:58 localhost python3.9[115269]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_neutron_ovs_agent.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 05:18:59 localhost python3.9[115361]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 05:18:59 localhost python3.9[115453]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 05:19:00 localhost python3.9[115545]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 05:19:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43551 DF PROTO=TCP SPT=49156 DPT=9102 SEQ=3677283698 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF688DA0000000001030307)
Oct 14 05:19:01 localhost python3.9[115637]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 05:19:01 localhost python3.9[115729]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 05:19:02 localhost python3.9[115821]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 05:19:03 localhost python3.9[115913]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud_recover.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 05:19:03 localhost python3.9[116005]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 05:19:04 localhost python3.9[116097]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 05:19:05 localhost python3.9[116189]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_ovn_controller.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 05:19:05 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40379 DF PROTO=TCP SPT=48942 DPT=9882 SEQ=4054627429 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF69B9A0000000001030307)
Oct 14 05:19:05 localhost python3.9[116281]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_ovn_metadata_agent.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 05:19:06 localhost python3.9[116373]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_rsyslog.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 05:19:07 localhost python3.9[116465]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then#012  systemctl disable --now certmonger.service#012  test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service#012fi#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 05:19:08 localhost python3.9[116557]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Oct 14 05:19:08 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26809 DF PROTO=TCP SPT=45818 DPT=9105 SEQ=4119446714 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF6A8AE0000000001030307)
Oct 14 05:19:09 localhost python3.9[116649]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct 14 05:19:09 localhost systemd[1]: Reloading.
Oct 14 05:19:09 localhost systemd-rc-local-generator[116673]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 05:19:09 localhost systemd-sysv-generator[116677]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 05:19:09 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 05:19:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26810 DF PROTO=TCP SPT=45818 DPT=9105 SEQ=4119446714 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF6AC9B0000000001030307)
Oct 14 05:19:10 localhost python3.9[116777]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_ceilometer_agent_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 05:19:10 localhost python3.9[116870]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_ceilometer_agent_ipmi.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 05:19:11 localhost python3.9[116963]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_collectd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 05:19:11 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26811 DF PROTO=TCP SPT=45818 DPT=9105 SEQ=4119446714 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF6B49B0000000001030307)
Oct 14 05:19:12 localhost python3.9[117056]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_iscsid.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 05:19:12 localhost python3.9[117149]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_logrotate_crond.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 05:19:13 localhost python3.9[117242]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_metrics_qdr.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 05:19:14 localhost python3.9[117335]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_neutron_dhcp.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 05:19:14 localhost python3.9[117428]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_neutron_l3_agent.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 05:19:15 localhost python3.9[117521]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_neutron_ovs_agent.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 05:19:15 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26812 DF PROTO=TCP SPT=45818 DPT=9105 SEQ=4119446714 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF6C45B0000000001030307)
Oct 14 05:19:16 localhost python3.9[117614]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 05:19:16 localhost python3.9[117707]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_migration_target.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 05:19:17 localhost python3.9[117800]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtlogd_wrapper.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 05:19:17 localhost python3.9[117893]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtnodedevd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 05:19:18 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49886 DF PROTO=TCP SPT=47106 DPT=9101 SEQ=971360917 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF6CD5A0000000001030307)
Oct 14 05:19:18 localhost python3.9[117986]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtproxyd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 05:19:19 localhost python3.9[118079]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtqemud.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 05:19:19 localhost python3.9[118172]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtqemud_recover.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 05:19:20 localhost python3.9[118265]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtsecretd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 05:19:21 localhost python3.9[118388]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtstoraged.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 05:19:21 localhost python3.9[118512]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_ovn_controller.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 05:19:22 localhost python3.9[118620]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_ovn_metadata_agent.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 05:19:23 localhost python3.9[118713]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_rsyslog.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 05:19:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28018 DF PROTO=TCP SPT=38274 DPT=9102 SEQ=4235933627 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF6E2350000000001030307)
Oct 14 05:19:24 localhost systemd[1]: session-37.scope: Deactivated successfully.
Oct 14 05:19:24 localhost systemd[1]: session-37.scope: Consumed 51.291s CPU time.
Oct 14 05:19:24 localhost systemd-logind[760]: Session 37 logged out. Waiting for processes to exit.
Oct 14 05:19:24 localhost systemd-logind[760]: Removed session 37.
Oct 14 05:19:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28019 DF PROTO=TCP SPT=38274 DPT=9102 SEQ=4235933627 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF6E65B0000000001030307)
Oct 14 05:19:28 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58303 DF PROTO=TCP SPT=57804 DPT=9882 SEQ=4209456313 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF6F4FB0000000001030307)
Oct 14 05:19:30 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28021 DF PROTO=TCP SPT=38274 DPT=9102 SEQ=4235933627 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF6FE1B0000000001030307)
Oct 14 05:19:35 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58306 DF PROTO=TCP SPT=57804 DPT=9882 SEQ=4209456313 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF710DA0000000001030307)
Oct 14 05:19:38 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1172 DF PROTO=TCP SPT=57590 DPT=9105 SEQ=5492057 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF71DDE0000000001030307)
Oct 14 05:19:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1173 DF PROTO=TCP SPT=57590 DPT=9105 SEQ=5492057 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF721DA0000000001030307)
Oct 14 05:19:41 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1174 DF PROTO=TCP SPT=57590 DPT=9105 SEQ=5492057 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF729DA0000000001030307)
Oct 14 05:19:45 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1175 DF PROTO=TCP SPT=57590 DPT=9105 SEQ=5492057 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF7399B0000000001030307)
Oct 14 05:19:48 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7843 DF PROTO=TCP SPT=44754 DPT=9101 SEQ=3945474460 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF7425A0000000001030307)
Oct 14 05:19:48 localhost sshd[118729]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 05:19:49 localhost systemd-logind[760]: New session 38 of user zuul.
Oct 14 05:19:49 localhost systemd[1]: Started Session 38 of User zuul.
Oct 14 05:19:49 localhost python3.9[118822]: ansible-ansible.legacy.ping Invoked with data=pong
Oct 14 05:19:51 localhost python3.9[118926]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 14 05:19:52 localhost python3.9[119018]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 05:19:53 localhost python3.9[119111]: ansible-ansible.builtin.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 14 05:19:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48882 DF PROTO=TCP SPT=44158 DPT=9102 SEQ=338936325 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF757640000000001030307)
Oct 14 05:19:53 localhost python3.9[119203]: ansible-ansible.builtin.file Invoked with mode=755 path=/etc/ansible/facts.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 05:19:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48883 DF PROTO=TCP SPT=44158 DPT=9102 SEQ=338936325 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF75B5A0000000001030307)
Oct 14 05:19:54 localhost python3.9[119295]: ansible-ansible.legacy.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 05:19:55 localhost python3.9[119368]: ansible-ansible.legacy.copy Invoked with dest=/etc/ansible/facts.d/bootc.fact mode=755 src=/home/zuul/.ansible/tmp/ansible-tmp-1760433594.152209-179-100325007257187/.source.fact _original_basename=bootc.fact follow=False checksum=eb4122ce7fc50a38407beb511c4ff8c178005b12 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 05:19:56 localhost python3.9[119460]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 14 05:19:57 localhost python3.9[119556]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/log/journal setype=var_log_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 14 05:19:58 localhost python3.9[119646]: ansible-ansible.builtin.service_facts Invoked
Oct 14 05:19:58 localhost network[119663]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Oct 14 05:19:58 localhost network[119664]: 'network-scripts' will be removed from distribution in near future.
Oct 14 05:19:58 localhost network[119665]: It is advised to switch to 'NetworkManager' instead for network management.
Oct 14 05:19:58 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1946 DF PROTO=TCP SPT=41926 DPT=9882 SEQ=903280494 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF76A2B0000000001030307)
Oct 14 05:19:59 localhost systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 05:20:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48885 DF PROTO=TCP SPT=44158 DPT=9102 SEQ=338936325 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF7731B0000000001030307)
Oct 14 05:20:02 localhost python3.9[119862]: ansible-ansible.builtin.lineinfile Invoked with line=cloud-init=disabled path=/proc/cmdline state=present backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 05:20:03 localhost python3.9[119952]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 14 05:20:04 localhost python3.9[120048]: ansible-ansible.legacy.command Invoked with _raw_params=# This is a hack to deploy RDO Delorean repos to RHEL as if it were Centos 9 Stream#012set -euxo pipefail#012curl -sL https://github.com/openstack-k8s-operators/repo-setup/archive/refs/heads/main.tar.gz | tar -xz#012python3 -m venv ./venv#012PBR_VERSION=0.0.0 ./venv/bin/pip install ./repo-setup-main#012# This is required for FIPS enabled until trunk.rdoproject.org#012# is not being served from a centos7 host, tracked by#012# https://issues.redhat.com/browse/RHOSZUUL-1517#012dnf -y install crypto-policies#012update-crypto-policies --set FIPS:NO-ENFORCE-EMS#012./venv/bin/repo-setup current-podified -b antelope -d centos9 --stream#012#012# Exclude ceph-common-18.2.7 as it's pulling newer openssl not compatible#012# with rhel 9.2 openssh#012dnf config-manager --setopt centos9-storage.exclude="ceph-common-18.2.7" --save#012# FIXME: perform dnf upgrade for other packages in EDPM ansible#012# here we only ensuring that decontainerized libvirt can start#012dnf -y upgrade openstack-selinux#012rm -f /run/virtlogd.pid#012#012rm -rf repo-setup-main#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 05:20:05 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1949 DF PROTO=TCP SPT=41926 DPT=9882 SEQ=903280494 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF785DA0000000001030307)
Oct 14 05:20:08 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41428 DF PROTO=TCP SPT=60150 DPT=9105 SEQ=2597727512 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF7930E0000000001030307)
Oct 14 05:20:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41429 DF PROTO=TCP SPT=60150 DPT=9105 SEQ=2597727512 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF7971B0000000001030307)
Oct 14 05:20:11 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41430 DF PROTO=TCP SPT=60150 DPT=9105 SEQ=2597727512 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF79F1A0000000001030307)
Oct 14 05:20:13 localhost systemd[1]: Stopping OpenSSH server daemon...
Oct 14 05:20:13 localhost systemd[1]: sshd.service: Deactivated successfully.
Oct 14 05:20:13 localhost systemd[1]: Stopped OpenSSH server daemon.
Oct 14 05:20:13 localhost systemd[1]: Stopped target sshd-keygen.target.
Oct 14 05:20:13 localhost systemd[1]: Stopping sshd-keygen.target...
Oct 14 05:20:13 localhost systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Oct 14 05:20:13 localhost systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Oct 14 05:20:13 localhost systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Oct 14 05:20:13 localhost systemd[1]: Reached target sshd-keygen.target.
Oct 14 05:20:13 localhost systemd[1]: Starting OpenSSH server daemon...
Oct 14 05:20:13 localhost sshd[120091]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 05:20:13 localhost systemd[1]: Started OpenSSH server daemon.
Oct 14 05:20:14 localhost systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Oct 14 05:20:14 localhost systemd[1]: Starting man-db-cache-update.service...
Oct 14 05:20:14 localhost systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Oct 14 05:20:14 localhost systemd[1]: man-db-cache-update.service: Deactivated successfully.
Oct 14 05:20:14 localhost systemd[1]: Finished man-db-cache-update.service.
Oct 14 05:20:14 localhost systemd[1]: run-r302e2612f836448e9d3317a8a7054a43.service: Deactivated successfully.
Oct 14 05:20:14 localhost systemd[1]: run-r2f338c9d9be644d0bccfb246339384a8.service: Deactivated successfully. Oct 14 05:20:15 localhost systemd[1]: Stopping OpenSSH server daemon... Oct 14 05:20:15 localhost systemd[1]: sshd.service: Deactivated successfully. Oct 14 05:20:15 localhost systemd[1]: Stopped OpenSSH server daemon. Oct 14 05:20:15 localhost systemd[1]: Stopped target sshd-keygen.target. Oct 14 05:20:15 localhost systemd[1]: Stopping sshd-keygen.target... Oct 14 05:20:15 localhost systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target). Oct 14 05:20:15 localhost systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target). Oct 14 05:20:15 localhost systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target). Oct 14 05:20:15 localhost systemd[1]: Reached target sshd-keygen.target. Oct 14 05:20:15 localhost systemd[1]: Starting OpenSSH server daemon... Oct 14 05:20:15 localhost sshd[120263]: main: sshd: ssh-rsa algorithm is disabled Oct 14 05:20:15 localhost systemd[1]: Started OpenSSH server daemon. 
Oct 14 05:20:15 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41431 DF PROTO=TCP SPT=60150 DPT=9105 SEQ=2597727512 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF7AEDB0000000001030307) Oct 14 05:20:18 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36350 DF PROTO=TCP SPT=48140 DPT=9101 SEQ=802877454 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF7B79A0000000001030307) Oct 14 05:20:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2868 DF PROTO=TCP SPT=51724 DPT=9102 SEQ=986371911 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF7CC940000000001030307) Oct 14 05:20:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2869 DF PROTO=TCP SPT=51724 DPT=9102 SEQ=986371911 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF7D09A0000000001030307) Oct 14 05:20:28 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23865 DF PROTO=TCP SPT=60258 DPT=9882 SEQ=898040061 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF7DF5B0000000001030307) Oct 14 05:20:30 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2871 DF PROTO=TCP SPT=51724 DPT=9102 SEQ=986371911 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT 
(020405500402080A2AF7E85A0000000001030307) Oct 14 05:20:35 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23868 DF PROTO=TCP SPT=60258 DPT=9882 SEQ=898040061 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF7FB1A0000000001030307) Oct 14 05:20:38 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29320 DF PROTO=TCP SPT=40928 DPT=9105 SEQ=3121325009 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF8083F0000000001030307) Oct 14 05:20:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29321 DF PROTO=TCP SPT=40928 DPT=9105 SEQ=3121325009 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF80C5A0000000001030307) Oct 14 05:20:41 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29322 DF PROTO=TCP SPT=40928 DPT=9105 SEQ=3121325009 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF8145B0000000001030307) Oct 14 05:20:45 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29323 DF PROTO=TCP SPT=40928 DPT=9105 SEQ=3121325009 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF8241A0000000001030307) Oct 14 05:20:48 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15206 DF PROTO=TCP SPT=46812 DPT=9101 SEQ=861388729 ACK=0 WINDOW=32640 RES=0x00 
SYN URGP=0 OPT (020405500402080A2AF82CDA0000000001030307) Oct 14 05:20:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49990 DF PROTO=TCP SPT=43702 DPT=9102 SEQ=818557157 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF841C40000000001030307) Oct 14 05:20:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49991 DF PROTO=TCP SPT=43702 DPT=9102 SEQ=818557157 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF845DB0000000001030307) Oct 14 05:20:58 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60668 DF PROTO=TCP SPT=51860 DPT=9882 SEQ=3188134762 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF8548B0000000001030307) Oct 14 05:21:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49993 DF PROTO=TCP SPT=43702 DPT=9102 SEQ=818557157 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF85D9A0000000001030307) Oct 14 05:21:05 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60671 DF PROTO=TCP SPT=51860 DPT=9882 SEQ=3188134762 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF870720000000001030307) Oct 14 05:21:08 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47154 DF PROTO=TCP SPT=55198 DPT=9105 SEQ=1014913829 ACK=0 
WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF87D6F0000000001030307) Oct 14 05:21:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47155 DF PROTO=TCP SPT=55198 DPT=9105 SEQ=1014913829 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF8815A0000000001030307) Oct 14 05:21:11 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47156 DF PROTO=TCP SPT=55198 DPT=9105 SEQ=1014913829 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF8895A0000000001030307) Oct 14 05:21:15 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47157 DF PROTO=TCP SPT=55198 DPT=9105 SEQ=1014913829 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF8991A0000000001030307) Oct 14 05:21:18 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8760 DF PROTO=TCP SPT=47642 DPT=9101 SEQ=1413358377 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF8A21A0000000001030307) Oct 14 05:21:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42862 DF PROTO=TCP SPT=39842 DPT=9102 SEQ=3865189853 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF8B6F50000000001030307) Oct 14 05:21:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42863 DF PROTO=TCP SPT=39842 DPT=9102 
SEQ=3865189853 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF8BB1A0000000001030307) Oct 14 05:21:28 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7358 DF PROTO=TCP SPT=41794 DPT=9882 SEQ=861095755 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF8C9BB0000000001030307) Oct 14 05:21:30 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42865 DF PROTO=TCP SPT=39842 DPT=9102 SEQ=3865189853 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF8D2DA0000000001030307) Oct 14 05:21:31 localhost kernel: SELinux: Converting 2753 SID table entries... Oct 14 05:21:31 localhost kernel: SELinux: policy capability network_peer_controls=1 Oct 14 05:21:31 localhost kernel: SELinux: policy capability open_perms=1 Oct 14 05:21:31 localhost kernel: SELinux: policy capability extended_socket_class=1 Oct 14 05:21:31 localhost kernel: SELinux: policy capability always_check_network=0 Oct 14 05:21:31 localhost kernel: SELinux: policy capability cgroup_seclabel=1 Oct 14 05:21:31 localhost kernel: SELinux: policy capability nnp_nosuid_transition=1 Oct 14 05:21:31 localhost kernel: SELinux: policy capability genfs_seclabel_symlinks=1 Oct 14 05:21:33 localhost dbus-broker-launch[755]: avc: op=load_policy lsm=selinux seqno=17 res=1 Oct 14 05:21:33 localhost python3.9[120975]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/ansible/facts.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:21:34 localhost 
python3.9[121067]: ansible-ansible.legacy.stat Invoked with path=/etc/ansible/facts.d/edpm.fact follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True Oct 14 05:21:34 localhost python3.9[121140]: ansible-ansible.legacy.copy Invoked with dest=/etc/ansible/facts.d/edpm.fact mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1760433693.7553673-402-15278195578070/.source.fact _original_basename=.6u2_8c_a follow=False checksum=03aee63dcf9b49b0ac4473b2f1a1b5d3783aa639 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:21:35 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7361 DF PROTO=TCP SPT=41794 DPT=9882 SEQ=861095755 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF8E59B0000000001030307) Oct 14 05:21:35 localhost python3.9[121230]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Oct 14 05:21:36 localhost python3.9[121328]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d Oct 14 05:21:37 localhost python3.9[121382]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False 
enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None Oct 14 05:21:38 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5784 DF PROTO=TCP SPT=45110 DPT=9105 SEQ=3607197674 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF8F29F0000000001030307) Oct 14 05:21:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5785 DF PROTO=TCP SPT=45110 DPT=9105 SEQ=3607197674 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF8F69B0000000001030307) Oct 14 05:21:41 localhost systemd[1]: Reloading. Oct 14 05:21:41 localhost systemd-rc-local-generator[121415]: /etc/rc.d/rc.local is not marked executable, skipping. Oct 14 05:21:41 localhost systemd-sysv-generator[121419]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Oct 14 05:21:41 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Oct 14 05:21:41 localhost systemd[1]: Queuing reload/restart jobs for marked units… Oct 14 05:21:41 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5786 DF PROTO=TCP SPT=45110 DPT=9105 SEQ=3607197674 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF8FE9A0000000001030307) Oct 14 05:21:43 localhost python3.9[121521]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Oct 14 05:21:45 localhost python3.9[121760]: ansible-ansible.posix.selinux Invoked with policy=targeted state=enforcing configfile=/etc/selinux/config update_kernel_param=False Oct 14 05:21:45 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5787 DF PROTO=TCP SPT=45110 DPT=9105 SEQ=3607197674 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF90E5B0000000001030307) Oct 14 05:21:46 localhost python3.9[121852]: ansible-ansible.legacy.command Invoked with cmd=dd if=/dev/zero of=/swap count=1024 bs=1M creates=/swap _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None removes=None stdin=None Oct 14 05:21:47 localhost python3.9[121945]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/swap recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None 
_original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:21:48 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=379 DF PROTO=TCP SPT=41400 DPT=9101 SEQ=3607891205 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF9171A0000000001030307) Oct 14 05:21:48 localhost python3.9[122037]: ansible-ansible.posix.mount Invoked with dump=0 fstype=swap name=none opts=sw passno=0 src=/swap state=present path=none boot=True opts_no_log=False backup=False fstab=None Oct 14 05:21:49 localhost python3.9[122129]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/ca-trust/source/anchors setype=cert_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Oct 14 05:21:50 localhost python3.9[122221]: ansible-ansible.legacy.stat Invoked with path=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True Oct 14 05:21:51 localhost python3.9[122294]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1760433710.1925461-726-8356035221698/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=2c0c9af0a7c9617e778807fbf142c88d84b85267 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:21:52 
localhost python3.9[122386]: ansible-ansible.builtin.getent Invoked with database=passwd key=qemu fail_key=True service=None split=None Oct 14 05:21:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3161 DF PROTO=TCP SPT=55524 DPT=9102 SEQ=2681845826 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF92C250000000001030307) Oct 14 05:21:53 localhost python3.9[122479]: ansible-ansible.builtin.getent Invoked with database=passwd key=hugetlbfs fail_key=True service=None split=None Oct 14 05:21:54 localhost python3.9[122572]: ansible-ansible.builtin.group Invoked with gid=42477 name=hugetlbfs state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None Oct 14 05:21:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3162 DF PROTO=TCP SPT=55524 DPT=9102 SEQ=2681845826 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF9301B0000000001030307) Oct 14 05:21:55 localhost python3.9[122670]: ansible-ansible.builtin.file Invoked with group=qemu mode=0755 owner=qemu path=/var/lib/vhost_sockets setype=virt_cache_t seuser=system_u state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None serole=None selevel=None attributes=None Oct 14 05:21:56 localhost python3.9[122762]: ansible-ansible.legacy.dnf Invoked with name=['dracut-config-generic'] state=absent allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ 
install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None Oct 14 05:21:58 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5258 DF PROTO=TCP SPT=60204 DPT=9882 SEQ=1951528308 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF93EEB0000000001030307) Oct 14 05:22:00 localhost python3.9[122856]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/modules-load.d setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Oct 14 05:22:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3164 DF PROTO=TCP SPT=55524 DPT=9102 SEQ=2681845826 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF947DA0000000001030307) Oct 14 05:22:01 localhost python3.9[122948]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True Oct 14 05:22:01 localhost python3.9[123021]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1760433720.854303-968-65831856141658/.source.conf follow=False _original_basename=edpm-modprobe.conf.j2 checksum=8021efe01721d8fa8cab46b95c00ec1be6dbb9d0 
backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None Oct 14 05:22:03 localhost python3.9[123113]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None Oct 14 05:22:03 localhost systemd[1]: systemd-modules-load.service: Deactivated successfully. Oct 14 05:22:03 localhost systemd[1]: Stopped Load Kernel Modules. Oct 14 05:22:03 localhost systemd[1]: Stopping Load Kernel Modules... Oct 14 05:22:03 localhost systemd[1]: Starting Load Kernel Modules... Oct 14 05:22:03 localhost systemd-modules-load[123117]: Module 'msr' is built in Oct 14 05:22:03 localhost systemd[1]: Finished Load Kernel Modules. Oct 14 05:22:03 localhost python3.9[123211]: ansible-ansible.legacy.stat Invoked with path=/etc/sysctl.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True Oct 14 05:22:04 localhost python3.9[123284]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysctl.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1760433723.3763633-1038-227390343308191/.source.conf follow=False _original_basename=edpm-sysctl.conf.j2 checksum=2a366439721b855adcfe4d7f152babb68596a007 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None Oct 14 05:22:05 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5261 DF PROTO=TCP SPT=60204 DPT=9882 SEQ=1951528308 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT 
(020405500402080A2AF95A9A0000000001030307) Oct 14 05:22:08 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42317 DF PROTO=TCP SPT=52514 DPT=9105 SEQ=1014687020 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF967CE0000000001030307) Oct 14 05:22:09 localhost python3.9[123376]: ansible-ansible.legacy.dnf Invoked with name=['tuned', 'tuned-profiles-cpu-partitioning'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None Oct 14 05:22:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42318 DF PROTO=TCP SPT=52514 DPT=9105 SEQ=1014687020 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF96BDA0000000001030307) Oct 14 05:22:11 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42319 DF PROTO=TCP SPT=52514 DPT=9105 SEQ=1014687020 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF973DA0000000001030307) Oct 14 05:22:13 localhost python3.9[123469]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/active_profile follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Oct 14 05:22:15 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b 
MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42320 DF PROTO=TCP SPT=52514 DPT=9105 SEQ=1014687020 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF9839B0000000001030307) Oct 14 05:22:16 localhost python3.9[123561]: ansible-ansible.builtin.slurp Invoked with src=/etc/tuned/active_profile Oct 14 05:22:17 localhost python3.9[123651]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/throughput-performance-variables.conf follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Oct 14 05:22:18 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33455 DF PROTO=TCP SPT=35540 DPT=9101 SEQ=2464613413 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF98C5A0000000001030307) Oct 14 05:22:18 localhost python3.9[123743]: ansible-ansible.builtin.systemd Invoked with enabled=True name=tuned state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Oct 14 05:22:19 localhost systemd[1]: Stopping Dynamic System Tuning Daemon... Oct 14 05:22:19 localhost systemd[1]: tuned.service: Deactivated successfully. Oct 14 05:22:19 localhost systemd[1]: Stopped Dynamic System Tuning Daemon. Oct 14 05:22:19 localhost systemd[1]: tuned.service: Consumed 2.062s CPU time, no IO. Oct 14 05:22:19 localhost systemd[1]: Starting Dynamic System Tuning Daemon... Oct 14 05:22:20 localhost systemd[1]: Started Dynamic System Tuning Daemon. 
Oct 14 05:22:21 localhost python3.9[123845]: ansible-ansible.builtin.slurp Invoked with src=/proc/cmdline Oct 14 05:22:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33704 DF PROTO=TCP SPT=45886 DPT=9102 SEQ=3878813208 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF9A1540000000001030307) Oct 14 05:22:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33705 DF PROTO=TCP SPT=45886 DPT=9102 SEQ=3878813208 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF9A55B0000000001030307) Oct 14 05:22:25 localhost python3.9[123937]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksm.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Oct 14 05:22:25 localhost systemd[1]: Reloading. Oct 14 05:22:25 localhost systemd-rc-local-generator[123962]: /etc/rc.d/rc.local is not marked executable, skipping. Oct 14 05:22:25 localhost systemd-sysv-generator[123967]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Oct 14 05:22:25 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Oct 14 05:22:26 localhost python3.9[124066]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksmtuned.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Oct 14 05:22:26 localhost systemd[1]: Reloading. 
Oct 14 05:22:26 localhost systemd-rc-local-generator[124091]: /etc/rc.d/rc.local is not marked executable, skipping. Oct 14 05:22:26 localhost systemd-sysv-generator[124096]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Oct 14 05:22:26 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Oct 14 05:22:26 localhost systemd[1]: Starting dnf makecache... Oct 14 05:22:26 localhost dnf[124104]: Updating Subscription Management repositories. Oct 14 05:22:28 localhost python3.9[124196]: ansible-ansible.legacy.command Invoked with _raw_params=mkswap "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Oct 14 05:22:28 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=4290 DF PROTO=TCP SPT=42406 DPT=9882 SEQ=2269816687 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF9B41B0000000001030307) Oct 14 05:22:28 localhost dnf[124104]: Metadata cache refreshed recently. Oct 14 05:22:28 localhost python3.9[124319]: ansible-ansible.legacy.command Invoked with _raw_params=swapon "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Oct 14 05:22:28 localhost kernel: Adding 1048572k swap on /swap. Priority:-2 extents:1 across:1048572k FS Oct 14 05:22:29 localhost systemd[1]: dnf-makecache.service: Deactivated successfully. Oct 14 05:22:29 localhost systemd[1]: Finished dnf makecache. 
Oct 14 05:22:29 localhost systemd[1]: dnf-makecache.service: Consumed 2.282s CPU time. Oct 14 05:22:29 localhost python3.9[124445]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/bin/update-ca-trust _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Oct 14 05:22:30 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33707 DF PROTO=TCP SPT=45886 DPT=9102 SEQ=3878813208 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF9BD1B0000000001030307) Oct 14 05:22:31 localhost python3.9[124594]: ansible-ansible.legacy.command Invoked with _raw_params=echo 2 >/sys/kernel/mm/ksm/run _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Oct 14 05:22:32 localhost python3.9[124687]: ansible-ansible.builtin.systemd Invoked with name=systemd-sysctl.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None Oct 14 05:22:32 localhost systemd[1]: systemd-sysctl.service: Deactivated successfully. Oct 14 05:22:32 localhost systemd[1]: Stopped Apply Kernel Variables. Oct 14 05:22:32 localhost systemd[1]: Stopping Apply Kernel Variables... Oct 14 05:22:32 localhost systemd[1]: Starting Apply Kernel Variables... Oct 14 05:22:32 localhost systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully. Oct 14 05:22:32 localhost systemd[1]: Finished Apply Kernel Variables. Oct 14 05:22:32 localhost systemd[1]: session-38.scope: Deactivated successfully. Oct 14 05:22:32 localhost systemd[1]: session-38.scope: Consumed 2min 3.520s CPU time. Oct 14 05:22:32 localhost systemd-logind[760]: Session 38 logged out. 
Waiting for processes to exit. Oct 14 05:22:32 localhost systemd-logind[760]: Removed session 38. Oct 14 05:22:35 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=4293 DF PROTO=TCP SPT=42406 DPT=9882 SEQ=2269816687 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF9CFDA0000000001030307) Oct 14 05:22:38 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44607 DF PROTO=TCP SPT=37688 DPT=9105 SEQ=426342894 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF9DCFF0000000001030307) Oct 14 05:22:38 localhost sshd[124722]: main: sshd: ssh-rsa algorithm is disabled Oct 14 05:22:39 localhost systemd-logind[760]: New session 39 of user zuul. Oct 14 05:22:39 localhost systemd[1]: Started Session 39 of User zuul. Oct 14 05:22:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44608 DF PROTO=TCP SPT=37688 DPT=9105 SEQ=426342894 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF9E11A0000000001030307) Oct 14 05:22:40 localhost python3.9[124815]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Oct 14 05:22:41 localhost sshd[124893]: main: sshd: ssh-rsa algorithm is disabled Oct 14 05:22:41 localhost ceph-osd[31500]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Oct 14 05:22:41 localhost ceph-osd[31500]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 5400.1 total, 600.0 interval#012Cumulative writes: 4960 writes, 22K keys, 4960 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s#012Cumulative WAL: 
4960 writes, 649 syncs, 7.64 writes per sync, written: 0.02 GB, 0.00 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s#012Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent Oct 14 05:22:41 localhost python3.9[124911]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Oct 14 05:22:41 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44609 DF PROTO=TCP SPT=37688 DPT=9105 SEQ=426342894 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF9E91B0000000001030307) Oct 14 05:22:42 localhost python3.9[125007]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Oct 14 05:22:43 localhost python3.9[125098]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Oct 14 05:22:44 localhost python3.9[125194]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d Oct 14 05:22:45 localhost ceph-osd[32440]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Oct 14 05:22:45 localhost ceph-osd[32440]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 5400.1 total, 600.0 interval#012Cumulative writes: 5551 writes, 24K keys, 5551 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 
MB/s#012Cumulative WAL: 5551 writes, 763 syncs, 7.28 writes per sync, written: 0.02 GB, 0.00 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s#012Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent Oct 14 05:22:45 localhost python3.9[125248]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None Oct 14 05:22:45 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44610 DF PROTO=TCP SPT=37688 DPT=9105 SEQ=426342894 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AF9F8DB0000000001030307) Oct 14 05:22:48 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51865 DF PROTO=TCP SPT=57742 DPT=9101 SEQ=332891473 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AFA019A0000000001030307) Oct 14 05:22:50 localhost python3.9[125342]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d Oct 14 05:22:52 localhost python3.9[125497]: ansible-ansible.builtin.file Invoked with group=root 
mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:22:53 localhost python3.9[125589]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Oct 14 05:22:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40016 DF PROTO=TCP SPT=42374 DPT=9102 SEQ=2223877847 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AFA16850000000001030307) Oct 14 05:22:54 localhost python3.9[125692]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True Oct 14 05:22:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40017 DF PROTO=TCP SPT=42374 DPT=9102 SEQ=2223877847 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AFA1A9A0000000001030307) Oct 14 05:22:54 localhost python3.9[125740]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/containers/networks/podman.json _original_basename=podman_network_config.j2 recurse=False state=file path=/etc/containers/networks/podman.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None 
access_time=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:22:55 localhost python3.9[125832]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True Oct 14 05:22:56 localhost python3.9[125905]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1760433774.9070182-326-228632273184224/.source.conf follow=False _original_basename=registries.conf.j2 checksum=804a0d01b832e60d20f779a331306df708c87b02 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None Oct 14 05:22:57 localhost python3.9[125997]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None Oct 14 05:22:57 localhost systemd-journald[47488]: Field hash table of /run/log/journal/8e1d5208cffec42b50976967e1d1cfd0/system.journal has a fill level at 75.1 (250 of 333 items), suggesting rotation. Oct 14 05:22:57 localhost systemd-journald[47488]: /run/log/journal/8e1d5208cffec42b50976967e1d1cfd0/system.journal: Journal header limits reached or header out-of-date, rotating. Oct 14 05:22:57 localhost rsyslogd[759]: imjournal: journal files changed, reloading... 
[v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ] Oct 14 05:22:57 localhost rsyslogd[759]: imjournal: journal files changed, reloading... [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ] Oct 14 05:22:57 localhost rsyslogd[759]: imjournal: journal files changed, reloading... [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ] Oct 14 05:22:57 localhost python3.9[126090]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None Oct 14 05:22:58 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41439 DF PROTO=TCP SPT=59108 DPT=9882 SEQ=2784570814 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AFA294B0000000001030307) Oct 14 05:22:58 localhost python3.9[126182]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None Oct 14 05:22:58 localhost python3.9[126274]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False 
modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None Oct 14 05:22:59 localhost python3.9[126364]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Oct 14 05:23:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40019 DF PROTO=TCP SPT=42374 DPT=9102 SEQ=2223877847 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AFA325A0000000001030307) Oct 14 05:23:00 localhost python3.9[126458]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None Oct 14 05:23:04 localhost python3.9[126552]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openstack-network-scripts'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False 
validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None Oct 14 05:23:05 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41442 DF PROTO=TCP SPT=59108 DPT=9882 SEQ=2784570814 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AFA451B0000000001030307) Oct 14 05:23:08 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14326 DF PROTO=TCP SPT=45718 DPT=9105 SEQ=2948685790 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AFA522E0000000001030307) Oct 14 05:23:09 localhost python3.9[126646]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['podman', 'buildah'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None Oct 14 05:23:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14327 DF PROTO=TCP SPT=45718 DPT=9105 SEQ=2948685790 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AFA561A0000000001030307) Oct 14 05:23:11 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 
PREC=0x00 TTL=62 ID=14328 DF PROTO=TCP SPT=45718 DPT=9105 SEQ=2948685790 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AFA5E1A0000000001030307) Oct 14 05:23:13 localhost python3.9[126746]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['tuned', 'tuned-profiles-cpu-partitioning'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None Oct 14 05:23:15 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14329 DF PROTO=TCP SPT=45718 DPT=9105 SEQ=2948685790 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AFA6DDA0000000001030307) Oct 14 05:23:18 localhost python3.9[126840]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['os-net-config'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None Oct 14 05:23:18 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22000 DF PROTO=TCP SPT=56792 
DPT=9101 SEQ=2521834883 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AFA76E10000000001030307) Oct 14 05:23:22 localhost python3.9[126934]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openssh-server'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None Oct 14 05:23:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37709 DF PROTO=TCP SPT=54286 DPT=9102 SEQ=748276291 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AFA8BB50000000001030307) Oct 14 05:23:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37710 DF PROTO=TCP SPT=54286 DPT=9102 SEQ=748276291 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AFA8FDA0000000001030307) Oct 14 05:23:26 localhost python3.9[127028]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False 
update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None Oct 14 05:23:28 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1670 DF PROTO=TCP SPT=57536 DPT=9882 SEQ=719531788 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AFA9E7A0000000001030307) Oct 14 05:23:30 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37712 DF PROTO=TCP SPT=54286 DPT=9102 SEQ=748276291 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AFAA79A0000000001030307) Oct 14 05:23:31 localhost sshd[127038]: main: sshd: ssh-rsa algorithm is disabled Oct 14 05:23:32 localhost sshd[127040]: main: sshd: ssh-rsa algorithm is disabled Oct 14 05:23:32 localhost sshd[127042]: main: sshd: ssh-rsa algorithm is disabled Oct 14 05:23:35 localhost podman[127201]: Oct 14 05:23:35 localhost podman[127201]: 2025-10-14 09:23:35.348996137 +0000 UTC m=+0.089453292 container create 7e670a1a5624e7b3d8ed033d687fb79e0653998f059e030f20d02a413f9287cb (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=exciting_elgamal, name=rhceph, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, distribution-scope=public, GIT_BRANCH=main, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, ceph=True, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, CEPH_POINT_RELEASE=, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, maintainer=Guillaume Abrioux , io.openshift.tags=rhceph ceph, build-date=2025-09-24T08:57:55, release=553, GIT_CLEAN=True, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., version=7) Oct 14 05:23:35 localhost podman[127201]: 2025-10-14 09:23:35.294735186 +0000 UTC m=+0.035192381 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Oct 14 05:23:35 localhost systemd[1]: Started libpod-conmon-7e670a1a5624e7b3d8ed033d687fb79e0653998f059e030f20d02a413f9287cb.scope. Oct 14 05:23:35 localhost systemd[1]: Started libcrun container. Oct 14 05:23:35 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1673 DF PROTO=TCP SPT=57536 DPT=9882 SEQ=719531788 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AFABA5A0000000001030307) Oct 14 05:23:35 localhost podman[127201]: 2025-10-14 09:23:35.464693912 +0000 UTC m=+0.205151037 container init 7e670a1a5624e7b3d8ed033d687fb79e0653998f059e030f20d02a413f9287cb (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=exciting_elgamal, description=Red Hat Ceph Storage 7, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=Guillaume Abrioux , io.buildah.version=1.33.12, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, RELEASE=main, release=553, distribution-scope=public, ceph=True, version=7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, 
io.openshift.tags=rhceph ceph, build-date=2025-09-24T08:57:55, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, architecture=x86_64, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True) Oct 14 05:23:35 localhost exciting_elgamal[127225]: 167 167 Oct 14 05:23:35 localhost systemd[1]: libpod-7e670a1a5624e7b3d8ed033d687fb79e0653998f059e030f20d02a413f9287cb.scope: Deactivated successfully. Oct 14 05:23:35 localhost podman[127201]: 2025-10-14 09:23:35.485792895 +0000 UTC m=+0.226250030 container start 7e670a1a5624e7b3d8ed033d687fb79e0653998f059e030f20d02a413f9287cb (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=exciting_elgamal, release=553, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_CLEAN=True, build-date=2025-09-24T08:57:55, distribution-scope=public, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, vcs-type=git, io.openshift.expose-services=, GIT_BRANCH=main, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, version=7, io.openshift.tags=rhceph ceph, io.buildah.version=1.33.12, ceph=True, maintainer=Guillaume Abrioux , com.redhat.license_terms=https://www.redhat.com/agreements, RELEASE=main, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, architecture=x86_64, vendor=Red Hat, Inc.) 
Oct 14 05:23:35 localhost podman[127201]: 2025-10-14 09:23:35.485969131 +0000 UTC m=+0.226426266 container attach 7e670a1a5624e7b3d8ed033d687fb79e0653998f059e030f20d02a413f9287cb (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=exciting_elgamal, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat Ceph Storage 7, vcs-type=git, release=553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., RELEASE=main, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux , GIT_CLEAN=True, io.buildah.version=1.33.12, com.redhat.component=rhceph-container, architecture=x86_64, build-date=2025-09-24T08:57:55, distribution-scope=public, io.openshift.expose-services=, version=7, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553) Oct 14 05:23:35 localhost podman[127201]: 2025-10-14 09:23:35.487586911 +0000 UTC m=+0.228044066 container died 7e670a1a5624e7b3d8ed033d687fb79e0653998f059e030f20d02a413f9287cb (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=exciting_elgamal, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, version=7, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, name=rhceph, io.openshift.expose-services=, maintainer=Guillaume Abrioux , summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 
9, vcs-type=git, vendor=Red Hat, Inc., GIT_CLEAN=True, build-date=2025-09-24T08:57:55, ceph=True, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, RELEASE=main, architecture=x86_64, release=553, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7) Oct 14 05:23:35 localhost systemd[1]: tmp-crun.tUJArY.mount: Deactivated successfully. Oct 14 05:23:35 localhost podman[127231]: 2025-10-14 09:23:35.67734565 +0000 UTC m=+0.196622273 container remove 7e670a1a5624e7b3d8ed033d687fb79e0653998f059e030f20d02a413f9287cb (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=exciting_elgamal, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=Guillaume Abrioux , com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, io.buildah.version=1.33.12, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, ceph=True, distribution-scope=public, release=553, build-date=2025-09-24T08:57:55, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, version=7, GIT_CLEAN=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, RELEASE=main) Oct 14 05:23:35 localhost systemd[1]: 
libpod-conmon-7e670a1a5624e7b3d8ed033d687fb79e0653998f059e030f20d02a413f9287cb.scope: Deactivated successfully. Oct 14 05:23:35 localhost podman[127263]: Oct 14 05:23:35 localhost podman[127263]: 2025-10-14 09:23:35.857537542 +0000 UTC m=+0.067603665 container create 5d12f128834dca8b3e517645e648cabfd4cff6c454f998ea6152bab9da714361 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=admiring_ardinghelli, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, name=rhceph, architecture=x86_64, GIT_BRANCH=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.description=Red Hat Ceph Storage 7, release=553, io.openshift.expose-services=, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux , RELEASE=main, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.33.12, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, distribution-scope=public, build-date=2025-09-24T08:57:55, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7) Oct 14 05:23:35 localhost systemd[1]: Started libpod-conmon-5d12f128834dca8b3e517645e648cabfd4cff6c454f998ea6152bab9da714361.scope. Oct 14 05:23:35 localhost podman[127263]: 2025-10-14 09:23:35.820780683 +0000 UTC m=+0.030846816 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Oct 14 05:23:35 localhost systemd[1]: Started libcrun container. 
Oct 14 05:23:35 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8bf10646f5a374f5204c2d3b010a2d12c03485a4fd9becc8c50a248c7b70b3a6/merged/rootfs supports timestamps until 2038 (0x7fffffff) Oct 14 05:23:35 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8bf10646f5a374f5204c2d3b010a2d12c03485a4fd9becc8c50a248c7b70b3a6/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff) Oct 14 05:23:35 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8bf10646f5a374f5204c2d3b010a2d12c03485a4fd9becc8c50a248c7b70b3a6/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff) Oct 14 05:23:35 localhost podman[127263]: 2025-10-14 09:23:35.968466299 +0000 UTC m=+0.178532442 container init 5d12f128834dca8b3e517645e648cabfd4cff6c454f998ea6152bab9da714361 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=admiring_ardinghelli, ceph=True, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vendor=Red Hat, Inc., release=553, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, com.redhat.component=rhceph-container, name=rhceph, vcs-type=git, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, distribution-scope=public, io.buildah.version=1.33.12, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, GIT_CLEAN=True, maintainer=Guillaume Abrioux , architecture=x86_64, build-date=2025-09-24T08:57:55) Oct 14 05:23:35 
localhost podman[127263]: 2025-10-14 09:23:35.991316227 +0000 UTC m=+0.201382330 container start 5d12f128834dca8b3e517645e648cabfd4cff6c454f998ea6152bab9da714361 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=admiring_ardinghelli, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_BRANCH=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-type=git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=553, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, architecture=x86_64, version=7, distribution-scope=public, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.33.12, maintainer=Guillaume Abrioux , io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, build-date=2025-09-24T08:57:55, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True) Oct 14 05:23:35 localhost podman[127263]: 2025-10-14 09:23:35.991567795 +0000 UTC m=+0.201634008 container attach 5d12f128834dca8b3e517645e648cabfd4cff6c454f998ea6152bab9da714361 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=admiring_ardinghelli, architecture=x86_64, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, release=553, name=rhceph, ceph=True, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-type=git, io.openshift.tags=rhceph ceph, io.buildah.version=1.33.12, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-09-24T08:57:55, GIT_BRANCH=main, version=7, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux , com.redhat.component=rhceph-container, distribution-scope=public, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, description=Red Hat Ceph Storage 7) Oct 14 05:23:36 localhost systemd[1]: var-lib-containers-storage-overlay-2fda1a74f7cc55c07f0966ab5df1e014af2f698094cbf51a0610cfa3479b961d-merged.mount: Deactivated successfully. Oct 14 05:23:36 localhost admiring_ardinghelli[127285]: [ Oct 14 05:23:36 localhost admiring_ardinghelli[127285]: { Oct 14 05:23:36 localhost admiring_ardinghelli[127285]: "available": false, Oct 14 05:23:36 localhost admiring_ardinghelli[127285]: "ceph_device": false, Oct 14 05:23:36 localhost admiring_ardinghelli[127285]: "device_id": "QEMU_DVD-ROM_QM00001", Oct 14 05:23:36 localhost admiring_ardinghelli[127285]: "lsm_data": {}, Oct 14 05:23:36 localhost admiring_ardinghelli[127285]: "lvs": [], Oct 14 05:23:36 localhost admiring_ardinghelli[127285]: "path": "/dev/sr0", Oct 14 05:23:36 localhost admiring_ardinghelli[127285]: "rejected_reasons": [ Oct 14 05:23:36 localhost admiring_ardinghelli[127285]: "Has a FileSystem", Oct 14 05:23:36 localhost admiring_ardinghelli[127285]: "Insufficient space (<5GB)" Oct 14 05:23:36 localhost admiring_ardinghelli[127285]: ], Oct 14 05:23:36 localhost admiring_ardinghelli[127285]: "sys_api": { Oct 14 05:23:36 localhost admiring_ardinghelli[127285]: "actuators": null, Oct 14 05:23:36 localhost admiring_ardinghelli[127285]: "device_nodes": "sr0", Oct 14 05:23:36 localhost admiring_ardinghelli[127285]: "human_readable_size": "482.00 KB", Oct 14 05:23:36 localhost admiring_ardinghelli[127285]: "id_bus": "ata", 
Oct 14 05:23:36 localhost admiring_ardinghelli[127285]: "model": "QEMU DVD-ROM", Oct 14 05:23:36 localhost admiring_ardinghelli[127285]: "nr_requests": "2", Oct 14 05:23:36 localhost admiring_ardinghelli[127285]: "partitions": {}, Oct 14 05:23:36 localhost admiring_ardinghelli[127285]: "path": "/dev/sr0", Oct 14 05:23:36 localhost admiring_ardinghelli[127285]: "removable": "1", Oct 14 05:23:36 localhost admiring_ardinghelli[127285]: "rev": "2.5+", Oct 14 05:23:36 localhost admiring_ardinghelli[127285]: "ro": "0", Oct 14 05:23:36 localhost admiring_ardinghelli[127285]: "rotational": "1", Oct 14 05:23:36 localhost admiring_ardinghelli[127285]: "sas_address": "", Oct 14 05:23:36 localhost admiring_ardinghelli[127285]: "sas_device_handle": "", Oct 14 05:23:36 localhost admiring_ardinghelli[127285]: "scheduler_mode": "mq-deadline", Oct 14 05:23:36 localhost admiring_ardinghelli[127285]: "sectors": 0, Oct 14 05:23:36 localhost admiring_ardinghelli[127285]: "sectorsize": "2048", Oct 14 05:23:36 localhost admiring_ardinghelli[127285]: "size": 493568.0, Oct 14 05:23:36 localhost admiring_ardinghelli[127285]: "support_discard": "0", Oct 14 05:23:36 localhost admiring_ardinghelli[127285]: "type": "disk", Oct 14 05:23:36 localhost admiring_ardinghelli[127285]: "vendor": "QEMU" Oct 14 05:23:36 localhost admiring_ardinghelli[127285]: } Oct 14 05:23:36 localhost admiring_ardinghelli[127285]: } Oct 14 05:23:36 localhost admiring_ardinghelli[127285]: ] Oct 14 05:23:36 localhost systemd[1]: libpod-5d12f128834dca8b3e517645e648cabfd4cff6c454f998ea6152bab9da714361.scope: Deactivated successfully. 
Oct 14 05:23:36 localhost podman[127263]: 2025-10-14 09:23:36.994377753 +0000 UTC m=+1.204443936 container died 5d12f128834dca8b3e517645e648cabfd4cff6c454f998ea6152bab9da714361 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=admiring_ardinghelli, RELEASE=main, distribution-scope=public, description=Red Hat Ceph Storage 7, vcs-type=git, GIT_BRANCH=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-09-24T08:57:55, name=rhceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.expose-services=, release=553, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/agreements, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, GIT_CLEAN=True, maintainer=Guillaume Abrioux , url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.33.12, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, vendor=Red Hat, Inc.) Oct 14 05:23:37 localhost systemd[1]: var-lib-containers-storage-overlay-8bf10646f5a374f5204c2d3b010a2d12c03485a4fd9becc8c50a248c7b70b3a6-merged.mount: Deactivated successfully. 
Oct 14 05:23:37 localhost podman[128905]: 2025-10-14 09:23:37.095265799 +0000 UTC m=+0.089367180 container remove 5d12f128834dca8b3e517645e648cabfd4cff6c454f998ea6152bab9da714361 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=admiring_ardinghelli, version=7, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, CEPH_POINT_RELEASE=, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, release=553, build-date=2025-09-24T08:57:55, GIT_CLEAN=True, name=rhceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, distribution-scope=public, com.redhat.component=rhceph-container, io.openshift.expose-services=, vendor=Red Hat, Inc., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux , io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, architecture=x86_64, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, description=Red Hat Ceph Storage 7, ceph=True) Oct 14 05:23:37 localhost systemd[1]: libpod-conmon-5d12f128834dca8b3e517645e648cabfd4cff6c454f998ea6152bab9da714361.scope: Deactivated successfully. 
Oct 14 05:23:37 localhost python3.9[128997]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:23:38 localhost python3.9[129117]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True Oct 14 05:23:38 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40043 DF PROTO=TCP SPT=49714 DPT=9105 SEQ=4189340479 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AFAC75E0000000001030307) Oct 14 05:23:38 localhost python3.9[129190]: ansible-ansible.legacy.copy Invoked with dest=/root/.config/containers/auth.json group=zuul mode=0660 owner=zuul src=/home/zuul/.ansible/tmp/ansible-tmp-1760433817.8759844-721-250282642771823/.source.json _original_basename=.ckhb9nku follow=False checksum=bf21a9e8fbc5a3846fb05b4fa0859e0917b2202f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:23:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40044 DF PROTO=TCP SPT=49714 DPT=9105 SEQ=4189340479 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AFACB5A0000000001030307) Oct 14 05:23:40 localhost python3.9[129282]: ansible-containers.podman.podman_image Invoked with 
auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None Oct 14 05:23:41 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40045 DF PROTO=TCP SPT=49714 DPT=9105 SEQ=4189340479 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AFAD35A0000000001030307) Oct 14 05:23:45 localhost podman[129295]: 2025-10-14 09:23:40.183293339 +0000 UTC m=+0.034915432 image pull quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified Oct 14 05:23:45 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40046 DF PROTO=TCP SPT=49714 DPT=9105 SEQ=4189340479 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AFAE31A0000000001030307) Oct 14 05:23:47 localhost python3.9[129495]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 
'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None Oct 14 05:23:48 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26103 DF PROTO=TCP SPT=45288 DPT=9101 SEQ=4118393183 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AFAEBDA0000000001030307) Oct 14 05:23:53 localhost sshd[129546]: main: sshd: ssh-rsa algorithm is disabled Oct 14 05:23:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52023 DF PROTO=TCP SPT=37822 DPT=9102 SEQ=3970162881 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AFB00E50000000001030307) Oct 14 05:23:53 localhost sshd[129548]: main: sshd: ssh-rsa algorithm is disabled Oct 14 05:23:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52024 DF PROTO=TCP SPT=37822 DPT=9102 SEQ=3970162881 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AFB04DA0000000001030307) Oct 14 05:23:55 localhost podman[129508]: 2025-10-14 09:23:47.439245567 +0000 UTC m=+0.038635537 image pull quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified Oct 14 05:23:56 localhost sshd[129698]: main: sshd: ssh-rsa algorithm is disabled Oct 14 05:23:56 localhost python3.9[129714]: ansible-containers.podman.podman_image Invoked with 
auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None Oct 14 05:23:58 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21691 DF PROTO=TCP SPT=52828 DPT=9882 SEQ=3289358937 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AFB13AB0000000001030307) Oct 14 05:23:58 localhost podman[129729]: 2025-10-14 09:23:56.913026386 +0000 UTC m=+0.044022974 image pull quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified Oct 14 05:23:58 localhost sshd[129753]: main: sshd: ssh-rsa algorithm is disabled Oct 14 05:23:59 localhost python3.9[129893]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} 
arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None Oct 14 05:24:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52026 DF PROTO=TCP SPT=37822 DPT=9102 SEQ=3970162881 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AFB1C9A0000000001030307) Oct 14 05:24:01 localhost podman[129907]: 2025-10-14 09:24:00.070126146 +0000 UTC m=+0.034921062 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Oct 14 05:24:02 localhost python3.9[130071]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None Oct 14 05:24:05 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21694 DF PROTO=TCP SPT=52828 DPT=9882 SEQ=3289358937 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AFB2F5A0000000001030307) Oct 14 05:24:06 localhost sshd[130122]: main: sshd: ssh-rsa algorithm is disabled Oct 14 
05:24:08 localhost podman[130084]: 2025-10-14 09:24:02.649466848 +0000 UTC m=+0.041588270 image pull quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified Oct 14 05:24:08 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26325 DF PROTO=TCP SPT=37104 DPT=9105 SEQ=389726677 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AFB3C8E0000000001030307) Oct 14 05:24:09 localhost python3.9[130496]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None Oct 14 05:24:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26326 DF PROTO=TCP SPT=37104 DPT=9105 SEQ=389726677 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AFB409A0000000001030307) Oct 14 05:24:11 localhost sshd[130557]: main: sshd: ssh-rsa algorithm is disabled Oct 14 05:24:11 localhost podman[130508]: 2025-10-14 09:24:09.408788669 +0000 UTC m=+0.039689020 image pull 
quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c Oct 14 05:24:11 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26327 DF PROTO=TCP SPT=37104 DPT=9105 SEQ=389726677 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AFB489A0000000001030307) Oct 14 05:24:11 localhost systemd[1]: session-39.scope: Deactivated successfully. Oct 14 05:24:11 localhost systemd[1]: session-39.scope: Consumed 1min 38.977s CPU time. Oct 14 05:24:11 localhost systemd-logind[760]: Session 39 logged out. Waiting for processes to exit. Oct 14 05:24:11 localhost systemd-logind[760]: Removed session 39. Oct 14 05:24:13 localhost sshd[130880]: main: sshd: ssh-rsa algorithm is disabled Oct 14 05:24:15 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26328 DF PROTO=TCP SPT=37104 DPT=9105 SEQ=389726677 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AFB585A0000000001030307) Oct 14 05:24:17 localhost sshd[130892]: main: sshd: ssh-rsa algorithm is disabled Oct 14 05:24:17 localhost systemd-logind[760]: New session 40 of user zuul. Oct 14 05:24:17 localhost systemd[1]: Started Session 40 of User zuul. 
Oct 14 05:24:18 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43812 DF PROTO=TCP SPT=40936 DPT=9101 SEQ=1390865219 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AFB611B0000000001030307) Oct 14 05:24:18 localhost sshd[131118]: main: sshd: ssh-rsa algorithm is disabled Oct 14 05:24:19 localhost python3.9[131117]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Oct 14 05:24:19 localhost sshd[131124]: main: sshd: ssh-rsa algorithm is disabled Oct 14 05:24:22 localhost python3.9[131217]: ansible-ansible.builtin.getent Invoked with database=passwd key=openvswitch fail_key=True service=None split=None Oct 14 05:24:23 localhost sshd[131233]: main: sshd: ssh-rsa algorithm is disabled Oct 14 05:24:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51601 DF PROTO=TCP SPT=57290 DPT=9102 SEQ=3075239440 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AFB76150000000001030307) Oct 14 05:24:24 localhost python3.9[131312]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d Oct 14 05:24:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51602 DF PROTO=TCP SPT=57290 DPT=9102 SEQ=3075239440 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AFB7A1A0000000001030307) Oct 14 05:24:25 localhost python3.9[131366]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openvswitch3.3'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False 
cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None Oct 14 05:24:28 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14971 DF PROTO=TCP SPT=45670 DPT=9882 SEQ=589167196 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AFB88DB0000000001030307) Oct 14 05:24:30 localhost python3.9[131460]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch3.3'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None Oct 14 05:24:30 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51604 DF PROTO=TCP SPT=57290 DPT=9102 SEQ=3075239440 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AFB91DA0000000001030307) Oct 14 05:24:31 localhost sshd[131463]: main: sshd: ssh-rsa algorithm is disabled Oct 14 05:24:34 localhost python3.9[131556]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False 
daemon_reexec=False scope=system no_block=False force=None Oct 14 05:24:35 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14974 DF PROTO=TCP SPT=45670 DPT=9882 SEQ=589167196 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AFBA49A0000000001030307) Oct 14 05:24:36 localhost python3.9[131649]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Oct 14 05:24:37 localhost sshd[131696]: main: sshd: ssh-rsa algorithm is disabled Oct 14 05:24:37 localhost python3.9[131743]: ansible-community.general.sefcontext Invoked with selevel=s0 setype=container_file_t state=present target=/var/lib/edpm-config(/.*)? ignore_selinux_state=False ftype=a reload=True substitute=None seuser=None Oct 14 05:24:38 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58550 DF PROTO=TCP SPT=38156 DPT=9105 SEQ=1952435620 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AFBB1BE0000000001030307) Oct 14 05:24:39 localhost kernel: SELinux: Converting 2755 SID table entries... 
Oct 14 05:24:39 localhost kernel: SELinux: policy capability network_peer_controls=1
Oct 14 05:24:39 localhost kernel: SELinux: policy capability open_perms=1
Oct 14 05:24:39 localhost kernel: SELinux: policy capability extended_socket_class=1
Oct 14 05:24:39 localhost kernel: SELinux: policy capability always_check_network=0
Oct 14 05:24:39 localhost kernel: SELinux: policy capability cgroup_seclabel=1
Oct 14 05:24:39 localhost kernel: SELinux: policy capability nnp_nosuid_transition=1
Oct 14 05:24:39 localhost kernel: SELinux: policy capability genfs_seclabel_symlinks=1
Oct 14 05:24:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58551 DF PROTO=TCP SPT=38156 DPT=9105 SEQ=1952435620 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AFBB5DA0000000001030307)
Oct 14 05:24:41 localhost python3.9[131925]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 14 05:24:41 localhost dbus-broker-launch[755]: avc: op=load_policy lsm=selinux seqno=18 res=1
Oct 14 05:24:41 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58552 DF PROTO=TCP SPT=38156 DPT=9105 SEQ=1952435620 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AFBBDDA0000000001030307)
Oct 14 05:24:42 localhost python3.9[132038]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct 14 05:24:45 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58553 DF PROTO=TCP SPT=38156 DPT=9105 SEQ=1952435620 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AFBCD9B0000000001030307)
Oct 14 05:24:46 localhost python3.9[132132]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 05:24:48 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16974 DF PROTO=TCP SPT=38312 DPT=9101 SEQ=3659050790 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AFBD65A0000000001030307)
Oct 14 05:24:48 localhost python3.9[132377]: ansible-ansible.builtin.file Invoked with mode=0750 path=/var/lib/edpm-config selevel=s0 setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Oct 14 05:24:49 localhost python3.9[132467]: ansible-ansible.builtin.stat Invoked with path=/etc/cloud/cloud.cfg.d follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 14 05:24:49 localhost sshd[132470]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 05:24:49 localhost python3.9[132563]: ansible-ansible.legacy.dnf Invoked with name=['os-net-config'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct 14 05:24:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61016 DF PROTO=TCP SPT=40408 DPT=9102 SEQ=1944912368 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AFBEB440000000001030307)
Oct 14 05:24:53 localhost python3.9[132657]: ansible-ansible.legacy.dnf Invoked with name=['openstack-network-scripts'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct 14 05:24:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61017 DF PROTO=TCP SPT=40408 DPT=9102 SEQ=1944912368 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AFBEF5A0000000001030307)
Oct 14 05:24:58 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41086 DF PROTO=TCP SPT=49894 DPT=9882 SEQ=3702493230 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AFBFE0B0000000001030307)
Oct 14 05:24:58 localhost python3.9[132751]: ansible-ansible.builtin.systemd Invoked with enabled=True name=network daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Oct 14 05:24:58 localhost systemd[1]: Reloading.
Oct 14 05:24:58 localhost systemd-rc-local-generator[132780]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 05:24:58 localhost systemd-sysv-generator[132784]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 05:24:58 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 05:24:59 localhost python3.9[132883]: ansible-ansible.builtin.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 14 05:25:00 localhost python3.9[132975]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=no-auto-default path=/etc/NetworkManager/NetworkManager.conf section=main state=present value=* exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 05:25:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61019 DF PROTO=TCP SPT=40408 DPT=9102 SEQ=1944912368 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AFC071A0000000001030307)
Oct 14 05:25:01 localhost python3.9[133069]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/NetworkManager.conf section=main state=present value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 05:25:01 localhost python3.9[133161]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/NetworkManager.conf section=main state=present value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 05:25:02 localhost python3.9[133253]: ansible-ansible.legacy.stat Invoked with path=/etc/dhcp/dhclient-enter-hooks follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 05:25:03 localhost python3.9[133326]: ansible-ansible.legacy.copy Invoked with dest=/etc/dhcp/dhclient-enter-hooks mode=0755 src=/home/zuul/.ansible/tmp/ansible-tmp-1760433902.3071866-565-73465197832298/.source _original_basename=.fhlk9ran follow=False checksum=f6278a40de79a9841f6ed1fc584538225566990c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 05:25:04 localhost python3.9[133418]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/os-net-config state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 05:25:05 localhost python3.9[133510]: ansible-edpm_os_net_config_mappings Invoked with net_config_data_lookup={}
Oct 14 05:25:05 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41089 DF PROTO=TCP SPT=49894 DPT=9882 SEQ=3702493230 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AFC19DB0000000001030307)
Oct 14 05:25:05 localhost python3.9[133602]: ansible-ansible.builtin.file Invoked with path=/var/lib/edpm-config/scripts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 05:25:06 localhost python3.9[133694]: ansible-ansible.legacy.stat Invoked with path=/etc/os-net-config/config.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 05:25:07 localhost python3.9[133767]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/os-net-config/config.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1760433906.3337207-691-174263390211924/.source.yaml _original_basename=.66zhorp9 follow=False checksum=4c28d1662755c608a6ffaa942e27a2488c0a78a3 force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 05:25:08 localhost python3.9[133859]: ansible-ansible.builtin.slurp Invoked with path=/etc/os-net-config/config.yaml src=/etc/os-net-config/config.yaml
Oct 14 05:25:08 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27578 DF PROTO=TCP SPT=43202 DPT=9105 SEQ=3565482225 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AFC26F00000000001030307)
Oct 14 05:25:09 localhost ansible-async_wrapper.py[133964]: Invoked with j406162820818 300 /home/zuul/.ansible/tmp/ansible-tmp-1760433908.7580779-763-127519515532900/AnsiballZ_edpm_os_net_config.py _
Oct 14 05:25:09 localhost ansible-async_wrapper.py[133967]: Starting module and watcher
Oct 14 05:25:09 localhost ansible-async_wrapper.py[133967]: Start watching 133968 (300)
Oct 14 05:25:09 localhost ansible-async_wrapper.py[133968]: Start module (133968)
Oct 14 05:25:09 localhost ansible-async_wrapper.py[133964]: Return async_wrapper task started.
Oct 14 05:25:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27579 DF PROTO=TCP SPT=43202 DPT=9105 SEQ=3565482225 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AFC2ADA0000000001030307)
Oct 14 05:25:09 localhost python3.9[133969]: ansible-edpm_os_net_config Invoked with cleanup=False config_file=/etc/os-net-config/config.yaml debug=True detailed_exit_codes=True safe_defaults=False use_nmstate=False
Oct 14 05:25:10 localhost ansible-async_wrapper.py[133968]: Module complete (133968)
Oct 14 05:25:11 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27580 DF PROTO=TCP SPT=43202 DPT=9105 SEQ=3565482225 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AFC32DA0000000001030307)
Oct 14 05:25:13 localhost python3.9[134061]: ansible-ansible.legacy.async_status Invoked with jid=j406162820818.133964 mode=status _async_dir=/root/.ansible_async
Oct 14 05:25:13 localhost python3.9[134120]: ansible-ansible.legacy.async_status Invoked with jid=j406162820818.133964 mode=cleanup _async_dir=/root/.ansible_async
Oct 14 05:25:14 localhost python3.9[134212]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 05:25:14 localhost ansible-async_wrapper.py[133967]: Done in kid B.
Oct 14 05:25:15 localhost python3.9[134285]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/os-net-config.returncode mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1760433914.0197887-829-277687224662331/.source.returncode _original_basename=._sjd9dzs follow=False checksum=b6589fc6ab0dc82cf12099d1c2d40ab994e8410c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 05:25:15 localhost python3.9[134377]: ansible-ansible.legacy.stat Invoked with path=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 05:25:15 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27581 DF PROTO=TCP SPT=43202 DPT=9105 SEQ=3565482225 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AFC429B0000000001030307)
Oct 14 05:25:16 localhost python3.9[134450]: ansible-ansible.legacy.copy Invoked with dest=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1760433915.241236-877-110283099243759/.source.cfg _original_basename=.pxm7mhi0 follow=False checksum=f3c5952a9cd4c6c31b314b25eb897168971cc86e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 05:25:17 localhost python3.9[134542]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct 14 05:25:17 localhost systemd[1]: Reloading Network Manager...
Oct 14 05:25:17 localhost NetworkManager[5977]: [1760433917.3087] audit: op="reload" arg="0" pid=134546 uid=0 result="success"
Oct 14 05:25:17 localhost NetworkManager[5977]: [1760433917.3102] config: signal: SIGHUP (no changes from disk)
Oct 14 05:25:17 localhost systemd[1]: Reloaded Network Manager.
Oct 14 05:25:18 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12685 DF PROTO=TCP SPT=51580 DPT=9101 SEQ=3343649792 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AFC4B9A0000000001030307)
Oct 14 05:25:18 localhost systemd[1]: session-40.scope: Deactivated successfully.
Oct 14 05:25:18 localhost systemd[1]: session-40.scope: Consumed 37.638s CPU time.
Oct 14 05:25:18 localhost systemd-logind[760]: Session 40 logged out. Waiting for processes to exit.
Oct 14 05:25:18 localhost systemd-logind[760]: Removed session 40.
Oct 14 05:25:23 localhost sshd[134561]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 05:25:23 localhost systemd-logind[760]: New session 41 of user zuul.
Oct 14 05:25:23 localhost systemd[1]: Started Session 41 of User zuul.
Oct 14 05:25:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=13117 DF PROTO=TCP SPT=34942 DPT=9102 SEQ=4228449663 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AFC60750000000001030307)
Oct 14 05:25:24 localhost python3.9[134654]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 14 05:25:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=13118 DF PROTO=TCP SPT=34942 DPT=9102 SEQ=4228449663 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AFC649A0000000001030307)
Oct 14 05:25:25 localhost python3.9[134748]: ansible-ansible.builtin.setup Invoked with filter=['ansible_default_ipv4'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct 14 05:25:26 localhost python3.9[134901]: ansible-ansible.legacy.command Invoked with _raw_params=hostname -f _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 05:25:27 localhost systemd[1]: session-41.scope: Deactivated successfully.
Oct 14 05:25:27 localhost systemd[1]: session-41.scope: Consumed 2.147s CPU time.
Oct 14 05:25:27 localhost systemd-logind[760]: Session 41 logged out. Waiting for processes to exit.
Oct 14 05:25:27 localhost systemd-logind[760]: Removed session 41.
Oct 14 05:25:28 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32531 DF PROTO=TCP SPT=48590 DPT=9882 SEQ=4030038001 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AFC733B0000000001030307)
Oct 14 05:25:29 localhost sshd[134917]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 05:25:30 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=13120 DF PROTO=TCP SPT=34942 DPT=9102 SEQ=4228449663 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AFC7C5B0000000001030307)
Oct 14 05:25:32 localhost sshd[134919]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 05:25:32 localhost systemd-logind[760]: New session 42 of user zuul.
Oct 14 05:25:32 localhost systemd[1]: Started Session 42 of User zuul.
Oct 14 05:25:33 localhost sshd[135013]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 05:25:34 localhost python3.9[135012]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 14 05:25:35 localhost python3.9[135108]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 14 05:25:35 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32534 DF PROTO=TCP SPT=48590 DPT=9882 SEQ=4030038001 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AFC8F1A0000000001030307)
Oct 14 05:25:36 localhost python3.9[135204]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct 14 05:25:37 localhost python3.9[135258]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct 14 05:25:38 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58681 DF PROTO=TCP SPT=42068 DPT=9105 SEQ=682831893 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AFC9C1E0000000001030307)
Oct 14 05:25:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58682 DF PROTO=TCP SPT=42068 DPT=9105 SEQ=682831893 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AFCA01A0000000001030307)
Oct 14 05:25:41 localhost python3.9[135352]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct 14 05:25:41 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58683 DF PROTO=TCP SPT=42068 DPT=9105 SEQ=682831893 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AFCA81A0000000001030307)
Oct 14 05:25:42 localhost python3.9[135568]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 05:25:43 localhost python3.9[135711]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 05:25:44 localhost python3.9[135829]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 05:25:44 localhost python3.9[135877]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/containers/networks/podman.json _original_basename=podman_network_config.j2 recurse=False state=file path=/etc/containers/networks/podman.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 05:25:45 localhost python3.9[135969]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 05:25:45 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58684 DF PROTO=TCP SPT=42068 DPT=9105 SEQ=682831893 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AFCB7DB0000000001030307)
Oct 14 05:25:46 localhost python3.9[136017]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root setype=etc_t dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf _original_basename=registries.conf.j2 recurse=False state=file path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 14 05:25:46 localhost python3.9[136109]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Oct 14 05:25:47 localhost python3.9[136201]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Oct 14 05:25:48 localhost python3.9[136293]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Oct 14 05:25:48 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35008 DF PROTO=TCP SPT=51862 DPT=9101 SEQ=1108655827 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AFCC09B0000000001030307)
Oct 14 05:25:48 localhost python3.9[136385]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Oct 14 05:25:49 localhost python3.9[136477]: ansible-ansible.legacy.dnf Invoked with name=['openssh-server'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct 14 05:25:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40545 DF PROTO=TCP SPT=37166 DPT=9102 SEQ=148792024 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AFCD5A40000000001030307)
Oct 14 05:25:54 localhost python3.9[136571]: ansible-setup Invoked with gather_subset=['!all', '!min', 'distribution', 'distribution_major_version', 'distribution_version', 'os_family'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 14 05:25:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40546 DF PROTO=TCP SPT=37166 DPT=9102 SEQ=148792024 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AFCD99A0000000001030307)
Oct 14 05:25:54 localhost python3.9[136665]: ansible-stat Invoked with path=/run/ostree-booted follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 14 05:25:55 localhost python3.9[136757]: ansible-stat Invoked with path=/sbin/transactional-update follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 14 05:25:56 localhost python3.9[136849]: ansible-service_facts Invoked
Oct 14 05:25:56 localhost network[136866]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Oct 14 05:25:56 localhost network[136867]: 'network-scripts' will be removed from distribution in near future.
Oct 14 05:25:56 localhost network[136868]: It is advised to switch to 'NetworkManager' instead for network management.
Oct 14 05:25:57 localhost systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 05:25:58 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36338 DF PROTO=TCP SPT=54826 DPT=9882 SEQ=1989169476 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AFCE86B0000000001030307)
Oct 14 05:26:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40548 DF PROTO=TCP SPT=37166 DPT=9102 SEQ=148792024 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AFCF15A0000000001030307)
Oct 14 05:26:02 localhost python3.9[137189]: ansible-ansible.legacy.dnf Invoked with name=['chrony'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct 14 05:26:05 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36341 DF PROTO=TCP SPT=54826 DPT=9882 SEQ=1989169476 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AFD041A0000000001030307)
Oct 14 05:26:06 localhost python3.9[137283]: ansible-package_facts Invoked with manager=['auto'] strategy=first
Oct 14 05:26:08 localhost python3.9[137375]: ansible-ansible.legacy.stat Invoked with path=/etc/chrony.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 05:26:08 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7531 DF PROTO=TCP SPT=51948 DPT=9105 SEQ=2569012647 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AFD114E0000000001030307)
Oct 14 05:26:09 localhost python3.9[137450]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/chrony.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1760433968.0144029-622-213801411880159/.source.conf follow=False _original_basename=chrony.conf.j2 checksum=cfb003e56d02d0d2c65555452eb1a05073fecdad force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 05:26:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7532 DF PROTO=TCP SPT=51948 DPT=9105 SEQ=2569012647 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AFD155A0000000001030307)
Oct 14 05:26:10 localhost python3.9[137544]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/chronyd follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 05:26:10 localhost python3.9[137619]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/sysconfig/chronyd mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1760433969.606966-669-46164799922148/.source follow=False _original_basename=chronyd.sysconfig.j2 checksum=dd196b1ff1f915b23eebc37ec77405b5dd3df76c force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 05:26:11 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7533 DF PROTO=TCP SPT=51948 DPT=9105 SEQ=2569012647 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AFD1D5A0000000001030307)
Oct 14 05:26:12 localhost python3.9[137713]: ansible-lineinfile Invoked with backup=True create=True dest=/etc/sysconfig/network line=PEERNTP=no mode=0644 regexp=^PEERNTP= state=present path=/etc/sysconfig/network backrefs=False firstmatch=False unsafe_writes=False search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 05:26:13 localhost python3.9[137807]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct 14 05:26:15 localhost python3.9[137861]: ansible-ansible.legacy.systemd Invoked with enabled=True name=chronyd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 14 05:26:15 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7534 DF PROTO=TCP SPT=51948 DPT=9105 SEQ=2569012647 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AFD2D1A0000000001030307)
Oct 14 05:26:17 localhost python3.9[137955]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct 14 05:26:18 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38161 DF PROTO=TCP SPT=34162 DPT=9101 SEQ=2801523649 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AFD35DA0000000001030307)
Oct 14 05:26:18 localhost python3.9[138009]: ansible-ansible.legacy.systemd Invoked with name=chronyd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct 14 05:26:18 localhost systemd[1]: Stopping NTP client/server...
Oct 14 05:26:18 localhost chronyd[25772]: chronyd exiting
Oct 14 05:26:18 localhost systemd[1]: chronyd.service: Deactivated successfully.
Oct 14 05:26:18 localhost systemd[1]: Stopped NTP client/server.
Oct 14 05:26:18 localhost systemd[1]: Starting NTP client/server...
Oct 14 05:26:18 localhost chronyd[138017]: chronyd version 4.3 starting (+CMDMON +NTP +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +ASYNCDNS +NTS +SECHASH +IPV6 +DEBUG)
Oct 14 05:26:18 localhost chronyd[138017]: Frequency -30.617 +/- 0.207 ppm read from /var/lib/chrony/drift
Oct 14 05:26:18 localhost chronyd[138017]: Loaded seccomp filter (level 2)
Oct 14 05:26:18 localhost systemd[1]: Started NTP client/server.
Oct 14 05:26:19 localhost systemd[1]: session-42.scope: Deactivated successfully.
Oct 14 05:26:19 localhost systemd[1]: session-42.scope: Consumed 28.847s CPU time.
Oct 14 05:26:19 localhost systemd-logind[760]: Session 42 logged out. Waiting for processes to exit.
Oct 14 05:26:19 localhost systemd-logind[760]: Removed session 42.
Oct 14 05:26:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59623 DF PROTO=TCP SPT=49802 DPT=9102 SEQ=3880577034 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AFD4AD50000000001030307)
Oct 14 05:26:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59624 DF PROTO=TCP SPT=49802 DPT=9102 SEQ=3880577034 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AFD4EDA0000000001030307)
Oct 14 05:26:24 localhost sshd[138034]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 05:26:24 localhost systemd-logind[760]: New session 43 of user zuul.
Oct 14 05:26:24 localhost systemd[1]: Started Session 43 of User zuul.
Oct 14 05:26:25 localhost python3.9[138127]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 14 05:26:26 localhost python3.9[138223]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 05:26:27 localhost python3.9[138328]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 05:26:28 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39126 DF PROTO=TCP SPT=57708 DPT=9882 SEQ=3554190716 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AFD5D9B0000000001030307)
Oct 14 05:26:28 localhost python3.9[138376]: ansible-ansible.legacy.file Invoked with group=zuul mode=0660 owner=zuul dest=/root/.config/containers/auth.json _original_basename=.m_xemq7y recurse=False state=file path=/root/.config/containers/auth.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 05:26:29 localhost python3.9[138468]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 05:26:29 localhost auditd[722]: Audit daemon rotating log files
Oct 14 05:26:29 localhost python3.9[138543]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysconfig/podman_drop_in mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1760433988.7985191-145-112017515524858/.source _original_basename=.7h3di9ko follow=False checksum=125299ce8dea7711a76292961206447f0043248b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 05:26:30 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59626 DF PROTO=TCP SPT=49802 DPT=9102 SEQ=3880577034 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AFD669A0000000001030307)
Oct 14 05:26:30 localhost python3.9[138635]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 14 05:26:31 localhost python3.9[138727]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 05:26:32 localhost python3.9[138800]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-container-shutdown group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1760433990.9509616-217-87925072633825/.source _original_basename=edpm-container-shutdown follow=False checksum=632c3792eb3dce4288b33ae7b265b71950d69f13 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Oct 14 05:26:32 localhost python3.9[138892]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 05:26:33 localhost python3.9[138965]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-start-podman-container group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1760433992.2206-217-50651848681860/.source _original_basename=edpm-start-podman-container follow=False checksum=b963c569d75a655c0ccae95d9bb4a2a9a4df27d1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Oct 14 05:26:34 localhost python3.9[139057]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 05:26:34 localhost python3.9[139149]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 05:26:35 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39129 DF PROTO=TCP SPT=57708 DPT=9882 SEQ=3554190716 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AFD795B0000000001030307)
Oct 14 05:26:35 localhost python3.9[139222]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm-container-shutdown.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1760433994.212846-328-236075153929868/.source.service _original_basename=edpm-container-shutdown-service follow=False checksum=6336835cb0f888670cc99de31e19c8c071444d33 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 05:26:36 localhost python3.9[139314]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 05:26:36 localhost python3.9[139387]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1760433995.6158752-373-264048413752594/.source.preset _original_basename=91-edpm-container-shutdown-preset follow=False checksum=b275e4375287528cb63464dd32f622c4f142a915 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 05:26:37 localhost python3.9[139479]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 14 05:26:37 localhost systemd[1]: Reloading.
Oct 14 05:26:37 localhost systemd-rc-local-generator[139507]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 05:26:37 localhost systemd-sysv-generator[139510]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 05:26:38 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 05:26:38 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64470 DF PROTO=TCP SPT=41922 DPT=9105 SEQ=1317047055 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AFD867E0000000001030307)
Oct 14 05:26:39 localhost systemd[1]: Reloading.
Oct 14 05:26:39 localhost systemd-sysv-generator[139549]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 05:26:39 localhost systemd-rc-local-generator[139545]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 05:26:39 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 05:26:39 localhost systemd[1]: Starting EDPM Container Shutdown...
Oct 14 05:26:39 localhost systemd[1]: Finished EDPM Container Shutdown.
Oct 14 05:26:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64471 DF PROTO=TCP SPT=41922 DPT=9105 SEQ=1317047055 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AFD8A9A0000000001030307)
Oct 14 05:26:40 localhost python3.9[139649]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 05:26:40 localhost python3.9[139722]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/netns-placeholder.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1760433999.790644-442-223726455440350/.source.service _original_basename=netns-placeholder-service follow=False checksum=b61b1b5918c20c877b8b226fbf34ff89a082d972 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 05:26:41 localhost python3.9[139814]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 05:26:41 localhost sshd[139868]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 05:26:41 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64472 DF PROTO=TCP SPT=41922 DPT=9105 SEQ=1317047055 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AFD929A0000000001030307)
Oct 14 05:26:42 localhost python3.9[139889]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-netns-placeholder.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1760434001.077056-487-109945914302761/.source.preset _original_basename=91-netns-placeholder-preset follow=False checksum=28b7b9aa893525d134a1eeda8a0a48fb25b736b9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 05:26:42 localhost python3.9[139981]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 14 05:26:42 localhost systemd[1]: Reloading.
Oct 14 05:26:43 localhost systemd-rc-local-generator[140008]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 05:26:43 localhost systemd-sysv-generator[140011]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 05:26:43 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 05:26:43 localhost systemd[1]: Starting Create netns directory...
Oct 14 05:26:43 localhost systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Oct 14 05:26:43 localhost systemd[1]: netns-placeholder.service: Deactivated successfully.
Oct 14 05:26:43 localhost systemd[1]: Finished Create netns directory.
Oct 14 05:26:44 localhost python3.9[140143]: ansible-ansible.builtin.service_facts Invoked
Oct 14 05:26:44 localhost network[140162]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Oct 14 05:26:44 localhost network[140168]: 'network-scripts' will be removed from distribution in near future.
Oct 14 05:26:44 localhost network[140169]: It is advised to switch to 'NetworkManager' instead for network management.
Oct 14 05:26:45 localhost systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 05:26:45 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64473 DF PROTO=TCP SPT=41922 DPT=9105 SEQ=1317047055 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AFDA25B0000000001030307)
Oct 14 05:26:47 localhost sshd[140365]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 05:26:47 localhost python3.9[140412]: ansible-ansible.legacy.stat Invoked with path=/etc/ssh/sshd_config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 05:26:48 localhost sshd[140437]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 05:26:48 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41407 DF PROTO=TCP SPT=57150 DPT=9101 SEQ=1305423641 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AFDAB1B0000000001030307)
Oct 14 05:26:48 localhost python3.9[140489]: ansible-ansible.legacy.copy Invoked with dest=/etc/ssh/sshd_config mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1760434007.443597-611-148216699297760/.source validate=/usr/sbin/sshd -T -f %s follow=False _original_basename=sshd_config_block.j2 checksum=4729b6ffc5b555fa142bf0b6e6dc15609cb89a22 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 05:26:49 localhost python3.9[140580]: ansible-ansible.builtin.systemd Invoked with name=sshd state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct 14 05:26:51 localhost sshd[140596]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 05:26:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39194 DF PROTO=TCP SPT=57162 DPT=9102 SEQ=1471862947 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AFDC0050000000001030307)
Oct 14 05:26:53 localhost sshd[140598]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 05:26:54 localhost sshd[140600]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 05:26:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39195 DF PROTO=TCP SPT=57162 DPT=9102 SEQ=1471862947 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AFDC41A0000000001030307)
Oct 14 05:26:54 localhost sshd[140602]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 05:26:58 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30184 DF PROTO=TCP SPT=41730 DPT=9882 SEQ=1686927018 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AFDD2CB0000000001030307)
Oct 14 05:27:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39197 DF PROTO=TCP SPT=57162 DPT=9102 SEQ=1471862947 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AFDDBDA0000000001030307)
Oct 14 05:27:05 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30187 DF PROTO=TCP SPT=41730 DPT=9882 SEQ=1686927018 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AFDEE9A0000000001030307)
Oct 14 05:27:08 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=4025 DF PROTO=TCP SPT=57976 DPT=9105 SEQ=3032160290 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AFDFBAE0000000001030307)
Oct 14 05:27:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=4026 DF PROTO=TCP SPT=57976 DPT=9105 SEQ=3032160290 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AFDFF9A0000000001030307)
Oct 14 05:27:11 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=4027 DF PROTO=TCP SPT=57976 DPT=9105 SEQ=3032160290 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AFE079A0000000001030307)
Oct 14 05:27:14 localhost systemd[1]: session-43.scope: Deactivated successfully.
Oct 14 05:27:14 localhost systemd[1]: session-43.scope: Consumed 14.504s CPU time.
Oct 14 05:27:14 localhost systemd-logind[760]: Session 43 logged out. Waiting for processes to exit.
Oct 14 05:27:14 localhost systemd-logind[760]: Removed session 43.
Oct 14 05:27:15 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=4028 DF PROTO=TCP SPT=57976 DPT=9105 SEQ=3032160290 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AFE175A0000000001030307)
Oct 14 05:27:18 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8345 DF PROTO=TCP SPT=40976 DPT=9101 SEQ=1626619357 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AFE205A0000000001030307)
Oct 14 05:27:19 localhost sshd[140618]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 05:27:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32764 DF PROTO=TCP SPT=41198 DPT=9102 SEQ=2069327885 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AFE35340000000001030307)
Oct 14 05:27:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32765 DF PROTO=TCP SPT=41198 DPT=9102 SEQ=2069327885 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AFE395A0000000001030307)
Oct 14 05:27:26 localhost sshd[140620]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 05:27:26 localhost systemd-logind[760]: New session 44 of user zuul.
Oct 14 05:27:26 localhost systemd[1]: Started Session 44 of User zuul.
Oct 14 05:27:27 localhost python3.9[140713]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 14 05:27:28 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9784 DF PROTO=TCP SPT=46622 DPT=9882 SEQ=1475290560 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AFE47FA0000000001030307)
Oct 14 05:27:29 localhost python3.9[140809]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 05:27:30 localhost python3.9[140914]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 05:27:30 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32767 DF PROTO=TCP SPT=41198 DPT=9102 SEQ=2069327885 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AFE511A0000000001030307)
Oct 14 05:27:30 localhost python3.9[140962]: ansible-ansible.legacy.file Invoked with group=zuul mode=0660 owner=zuul dest=/root/.config/containers/auth.json _original_basename=.nly657g_ recurse=False state=file path=/root/.config/containers/auth.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 05:27:31 localhost python3.9[141054]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 05:27:32 localhost python3.9[141102]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/etc/sysconfig/podman_drop_in _original_basename=.5p_9ck9i recurse=False state=file path=/etc/sysconfig/podman_drop_in force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 05:27:32 localhost python3.9[141194]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 14 05:27:33 localhost python3.9[141286]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 05:27:34 localhost python3.9[141334]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 14 05:27:34 localhost python3.9[141426]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 05:27:35 localhost python3.9[141474]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 14 05:27:35 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9787 DF PROTO=TCP SPT=46622 DPT=9882 SEQ=1475290560 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AFE63DB0000000001030307)
Oct 14 05:27:35 localhost python3.9[141566]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 05:27:36 localhost python3.9[141658]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 05:27:36 localhost python3.9[141706]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 05:27:37 localhost python3.9[141798]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 05:27:38 localhost python3.9[141846]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 05:27:38 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=4801 DF PROTO=TCP SPT=58902 DPT=9105 SEQ=1459006149 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AFE70DE0000000001030307)
Oct 14 05:27:39 localhost python3.9[141938]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 14 05:27:39 localhost systemd[1]: Reloading.
Oct 14 05:27:39 localhost systemd-sysv-generator[141966]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 05:27:39 localhost systemd-rc-local-generator[141962]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 05:27:39 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 05:27:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=4802 DF PROTO=TCP SPT=58902 DPT=9105 SEQ=1459006149 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AFE74DA0000000001030307)
Oct 14 05:27:40 localhost python3.9[142068]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 05:27:40 localhost python3.9[142116]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 05:27:41 localhost sshd[142204]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 05:27:41 localhost python3.9[142210]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 05:27:41 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b
MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=4803 DF PROTO=TCP SPT=58902 DPT=9105 SEQ=1459006149 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AFE7CDA0000000001030307) Oct 14 05:27:42 localhost python3.9[142258]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:27:42 localhost python3.9[142350]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None Oct 14 05:27:42 localhost systemd[1]: Reloading. Oct 14 05:27:42 localhost systemd-rc-local-generator[142374]: /etc/rc.d/rc.local is not marked executable, skipping. Oct 14 05:27:42 localhost systemd-sysv-generator[142379]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Oct 14 05:27:42 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Oct 14 05:27:43 localhost systemd[1]: Starting Create netns directory... Oct 14 05:27:43 localhost systemd[1]: run-netns-placeholder.mount: Deactivated successfully. Oct 14 05:27:43 localhost systemd[1]: netns-placeholder.service: Deactivated successfully. 
Oct 14 05:27:43 localhost systemd[1]: Finished Create netns directory. Oct 14 05:27:43 localhost python3.9[142481]: ansible-ansible.builtin.service_facts Invoked Oct 14 05:27:44 localhost network[142498]: You are using 'network' service provided by 'network-scripts', which are now deprecated. Oct 14 05:27:44 localhost network[142499]: 'network-scripts' will be removed from distribution in near future. Oct 14 05:27:44 localhost network[142500]: It is advised to switch to 'NetworkManager' instead for network management. Oct 14 05:27:45 localhost sshd[142533]: main: sshd: ssh-rsa algorithm is disabled Oct 14 05:27:45 localhost systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Oct 14 05:27:45 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=4804 DF PROTO=TCP SPT=58902 DPT=9105 SEQ=1459006149 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AFE8C9A0000000001030307) Oct 14 05:27:46 localhost podman[142659]: 2025-10-14 09:27:46.275150023 +0000 UTC m=+0.093024672 container exec b19a91314e045cbe5465f6abdeb6b420108fcda7860b498b01dcd4e65236d2c8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-crash-np0005486733, GIT_BRANCH=main, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, io.buildah.version=1.33.12, maintainer=Guillaume Abrioux , vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, 
GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, distribution-scope=public, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/agreements, release=553, architecture=x86_64, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, RELEASE=main, CEPH_POINT_RELEASE=, build-date=2025-09-24T08:57:55) Oct 14 05:27:46 localhost podman[142659]: 2025-10-14 09:27:46.403298443 +0000 UTC m=+0.221173192 container exec_died b19a91314e045cbe5465f6abdeb6b420108fcda7860b498b01dcd4e65236d2c8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-crash-np0005486733, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, io.buildah.version=1.33.12, vcs-type=git, vendor=Red Hat, Inc., build-date=2025-09-24T08:57:55, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, maintainer=Guillaume Abrioux , RELEASE=main, name=rhceph, release=553, architecture=x86_64, GIT_CLEAN=True, com.redhat.component=rhceph-container, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, version=7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.license_terms=https://www.redhat.com/agreements, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0) Oct 14 05:27:48 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1755 DF PROTO=TCP 
SPT=59000 DPT=9101 SEQ=1834560162 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AFE955A0000000001030307) Oct 14 05:27:50 localhost sshd[142916]: main: sshd: ssh-rsa algorithm is disabled Oct 14 05:27:51 localhost python3.9[142950]: ansible-ansible.legacy.stat Invoked with path=/etc/ssh/sshd_config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True Oct 14 05:27:51 localhost python3.9[142998]: ansible-ansible.legacy.file Invoked with mode=0600 dest=/etc/ssh/sshd_config _original_basename=sshd_config_block.j2 recurse=False state=file path=/etc/ssh/sshd_config force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:27:52 localhost python3.9[143090]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:27:52 localhost sshd[143110]: main: sshd: ssh-rsa algorithm is disabled Oct 14 05:27:53 localhost python3.9[143184]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/sshd-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True Oct 14 05:27:53 localhost sshd[143185]: main: sshd: ssh-rsa algorithm is disabled Oct 14 05:27:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58877 DF PROTO=TCP SPT=41612 DPT=9102 
SEQ=2410047873 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AFEAA650000000001030307) Oct 14 05:27:53 localhost python3.9[143259]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/sshd-networks.yaml group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1760434072.5455515-611-249467767860838/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=0bfc8440fd8f39002ab90252479fb794f51b5ae8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:27:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58878 DF PROTO=TCP SPT=41612 DPT=9102 SEQ=2410047873 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AFEAE5B0000000001030307) Oct 14 05:27:54 localhost python3.9[143351]: ansible-community.general.timezone Invoked with name=UTC hwclock=None Oct 14 05:27:54 localhost systemd[1]: Starting Time & Date Service... Oct 14 05:27:55 localhost systemd[1]: Started Time & Date Service. 
Oct 14 05:27:55 localhost python3.9[143447]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:27:56 localhost python3.9[143539]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True Oct 14 05:27:57 localhost python3.9[143612]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1760434076.0325718-716-35197534641038/.source.yaml follow=False _original_basename=base-rules.yaml.j2 checksum=450456afcafded6d4bdecceec7a02e806eebd8b3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:27:57 localhost python3.9[143704]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True Oct 14 05:27:58 localhost python3.9[143777]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1760434077.2845905-761-267569066791609/.source.yaml _original_basename=.vkc5yik8 follow=False checksum=97d170e1550eee4afc0af065b78cda302a97674c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None 
directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:27:58 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35224 DF PROTO=TCP SPT=44790 DPT=9882 SEQ=4110298892 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AFEBD2B0000000001030307) Oct 14 05:27:58 localhost sshd[143870]: main: sshd: ssh-rsa algorithm is disabled Oct 14 05:27:58 localhost python3.9[143869]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True Oct 14 05:27:59 localhost python3.9[143946]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/iptables.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1760434078.4673314-806-201277621082737/.source.nft _original_basename=iptables.nft follow=False checksum=3e02df08f1f3ab4a513e94056dbd390e3d38fe30 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:28:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58880 DF PROTO=TCP SPT=41612 DPT=9102 SEQ=2410047873 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AFEC61A0000000001030307) Oct 14 05:28:00 localhost python3.9[144038]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/iptables.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Oct 14 05:28:01 localhost python3.9[144131]: 
ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Oct 14 05:28:01 localhost sshd[144147]: main: sshd: ssh-rsa algorithm is disabled Oct 14 05:28:02 localhost python3[144226]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall Oct 14 05:28:03 localhost python3.9[144318]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True Oct 14 05:28:03 localhost python3.9[144391]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1760434082.5430715-923-203580329841056/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:28:04 localhost python3.9[144483]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True Oct 14 05:28:04 localhost python3.9[144556]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1760434083.7990537-968-182779633487317/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None 
attributes=None Oct 14 05:28:05 localhost python3.9[144648]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True Oct 14 05:28:06 localhost python3.9[144721]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1760434085.1426616-1013-32583965669504/.source.nft follow=False _original_basename=flush-chain.j2 checksum=d16337256a56373421842284fe09e4e6c7df417e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:28:06 localhost python3.9[144813]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True Oct 14 05:28:07 localhost sshd[144887]: main: sshd: ssh-rsa algorithm is disabled Oct 14 05:28:07 localhost python3.9[144886]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1760434086.4268394-1058-124578363502570/.source.nft follow=False _original_basename=chains.j2 checksum=2079f3b60590a165d1d502e763170876fc8e2984 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:28:08 localhost python3.9[144980]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True Oct 14 05:28:08 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 
SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18999 DF PROTO=TCP SPT=36386 DPT=9105 SEQ=1828088164 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AFEE60E0000000001030307) Oct 14 05:28:08 localhost python3.9[145053]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1760434087.701961-1103-171182303863445/.source.nft follow=False _original_basename=ruleset.j2 checksum=15a82a0dc61abfd6aa593407582b5b950437eb80 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:28:08 localhost sshd[145068]: main: sshd: ssh-rsa algorithm is disabled Oct 14 05:28:09 localhost python3.9[145147]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:28:10 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24708 DF PROTO=TCP SPT=33892 DPT=9100 SEQ=3451486365 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AFEEBF50000000001030307) Oct 14 05:28:10 localhost python3.9[145239]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True 
strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Oct 14 05:28:10 localhost sshd[145257]: main: sshd: ssh-rsa algorithm is disabled Oct 14 05:28:11 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44094 DF PROTO=TCP SPT=33376 DPT=9101 SEQ=3289306042 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AFEEED90000000001030307) Oct 14 05:28:11 localhost python3.9[145336]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:28:12 localhost sshd[145430]: main: sshd: ssh-rsa algorithm is disabled Oct 14 05:28:12 localhost python3.9[145429]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages1G state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:28:12 localhost python3.9[145523]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages2M state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None 
_diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:28:13 localhost python3.9[145615]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=1G path=/dev/hugepages1G src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None Oct 14 05:28:14 localhost python3.9[145708]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=2M path=/dev/hugepages2M src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None Oct 14 05:28:14 localhost systemd[1]: session-44.scope: Deactivated successfully. Oct 14 05:28:14 localhost systemd[1]: session-44.scope: Consumed 28.169s CPU time. Oct 14 05:28:14 localhost systemd-logind[760]: Session 44 logged out. Waiting for processes to exit. Oct 14 05:28:14 localhost systemd-logind[760]: Removed session 44. Oct 14 05:28:20 localhost sshd[145724]: main: sshd: ssh-rsa algorithm is disabled Oct 14 05:28:20 localhost systemd-logind[760]: New session 45 of user zuul. Oct 14 05:28:20 localhost systemd[1]: Started Session 45 of User zuul. Oct 14 05:28:20 localhost python3.9[145819]: ansible-ansible.builtin.tempfile Invoked with state=file prefix=ansible. 
suffix= path=None Oct 14 05:28:22 localhost python3.9[145911]: ansible-ansible.builtin.stat Invoked with path=/etc/ssh/ssh_known_hosts follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Oct 14 05:28:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10407 DF PROTO=TCP SPT=46376 DPT=9102 SEQ=3196039572 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AFF1F950000000001030307) Oct 14 05:28:23 localhost python3.9[146005]: ansible-ansible.builtin.slurp Invoked with src=/etc/ssh/ssh_known_hosts Oct 14 05:28:25 localhost systemd[1]: systemd-timedated.service: Deactivated successfully. Oct 14 05:28:25 localhost python3.9[146097]: ansible-ansible.legacy.stat Invoked with path=/tmp/ansible.eedehguz follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True Oct 14 05:28:25 localhost python3.9[146174]: ansible-ansible.legacy.copy Invoked with dest=/tmp/ansible.eedehguz mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1760434104.7681184-192-205422322331559/.source.eedehguz _original_basename=.d8i433nx follow=False checksum=3d1ed25b73f46d4ec79674ca0a766646d7ecfda1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:28:28 localhost python3.9[146266]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'ssh_host_key_rsa_public', 'ssh_host_key_ed25519_public', 'ssh_host_key_ecdsa_public'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Oct 14 05:28:28 localhost chronyd[138017]: Selected source 167.160.187.179 (pool.ntp.org) Oct 14 05:28:28 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 
MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64540 DF PROTO=TCP SPT=54744 DPT=9882 SEQ=2777638720 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AFF325B0000000001030307) Oct 14 05:28:29 localhost python3.9[146358]: ansible-ansible.builtin.blockinfile Invoked with block=np0005486733.localdomain,192.168.122.108,np0005486733* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDPo0GfacWT5Pc+C+u+omIcLodqLCmBuNDNfCjeb037QgP4jmD3LwkBVK9lXeF6bKJmM0PzOPagPFh4T7FwHNF7Np+V7e+YWSARFeetHnxYmMZdWYyfKTaZrS25xRraxyGrunWniIhAKFUaTz7e6OjUqNe25eVURCgpvQnsWeDwm/Gk9GfpfMCIFRtF7phpUKzSaz/8IpyLG1IzRSMsUkEtoKFxbAkuuJrkD4IWeWvEqn02yWC2WFGEdpQu8kcnxIshwqf9bEa7rYrjDTR++5AuztTSbppQL+8RIclxDR3uCVxzprf9Pj2C0e2X7TVKUs1tlduvrPK7uS10NGx3CK5iUe+uX+4V+jNrpe35OBv2vzdbzR+W6ciNtdy2lWLTou66Fm+/a3XwfJQb66dWQrLIyc6T64D8BysHjA8ER5TZ7N8AZoFZ8tNRzPgNWFZhjzoXdYisTvN9CjcpLgVpzekjeQS4BNNzh7bs+FPdB49TSf65NLzBIhWNqHT8weDoO58=#012np0005486733.localdomain,192.168.122.108,np0005486733* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIBuLxBOOZ2E9wHKjXOMebj4OZ7Ol59V1QC+zoNcmtlAO#012np0005486733.localdomain,192.168.122.108,np0005486733* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBFER+xRcpNFhmN1n2ALXYX9o2Jz+2SveGOJaTigZLIqTfd5sCQS0J9/MB5gF5Mfkep3gloJeQ8cIc77b5oI9Z0M=#012np0005486728.localdomain,192.168.122.103,np0005486728* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDr2nlXCVxp/8oDgdtx78rfaKpbpZ2BVPZ6HGLZUj0EA3A0bpv/vCkjK3KQT3TI7v1XfpgRbj08G0BbDhcTce9c8drn6X7lMpxvdMYZKKMTHnRs3mq9RsfEuWH3Q8Aa22LiA7rLwzVM2bbdbUcx/55pt3si8ariZ274Pzbprq7RrthEdE9xo5SDFIi+VJNQfQa+igaLblAAoG8qz+WChOAEmghfOAe4F7vBmidVxT92aYUE03zpWtqox4fE1U2dC0FMJ6Jro1ONj8KKCyEL+oLEbWFbPR4ynCyRvGaMIYh+9scB5yCf7vgPXNqu8sG+gR9i5wG43Nnh+76+XX/k+4Vyw/VeNANTjdiGvBcWmj1LLMDetoxZ5AdfklGaQq5qmrIvGqvIAGd7NgdwwWWw2umuIru3mi/5Z0H5I1uhLgTdknibTJSkhkkt/sBiBuyAXM3/HneFzlxDlYgA1xwdZeNnfiH010AO2W8pkWmWsYdMOEOBsM3SmGWtUuGKApwHcs8=#012np0005486728.localdomain,192.168.122.103,np0005486728* ssh-ed25519 
AAAAC3NzaC1lZDI1NTE5AAAAIFZgIdoXZ49/AzXU+oHb4E9FVVTK2wJq4yrcPHjFQfqz#012np0005486728.localdomain,192.168.122.103,np0005486728* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBMFbWHTiTeGA1XRQ7iWIJKpfCQXOTyNXNwCMjLTErss66DUcnzonE/JU1XrsOoRs149r+P1WVqvqD4ixVbvoNVw=#012np0005486729.localdomain,192.168.122.104,np0005486729* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCuTpRqp6mqKsQmynNLG8q8Bb4GSKNLRdYVfi81dV1W3aIPFsswo/C9+5nbZA1YVPY02cdXFps4EmIQl2tQ0sKmdo4HGexnhUJjKuyXFTu0kCYUasXCE5+sSjRVUCF4RfD3+6jQ9w6hHM1R3JkkhPZtKs4ykqH+8Gr2B918BdDuVaujfMmVWMv8M46JDuDO9vGPlWpM+xZkFZ1zjG2I2UIvWLkEnVdta7QIgxIPTlX7rOokadGrkAcIYb87wONg2vJiTPWO4ht4yHUIvTGNHSTmCXK0sdQLiZzjR2P/k67s1KMeWjaWAe3NXygnpvgENx9Qf9NkOYhvz8j+xZXat4Pa/I38V79XAjE3nWEF/KM6a4nKK9Lz5GXOvsQ+LIXBBY6HSAqBY4Lc21xwCJxEoO5Iftn56HzDFA+iyex5FMeT12ANKmVF9D+NHdaiZ3d5iPW6cOPqph1UjWsofejhEt0dxmCbippl74SWTZey9dQ3TKM9BGf2QfH1GvasiC+CsVU=#012np0005486729.localdomain,192.168.122.104,np0005486729* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIGxeq2D+w4uY3tKP5yQyoEBem0b2s2hPrJdTzpIuGozW#012np0005486729.localdomain,192.168.122.104,np0005486729* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBEXD7o/mHlyE5FPGkrRBHPWn+AwId2YyEBOT/QILn8qgF7Mym76ZEJFAVw5zzuZA1ef4oRAHz26eg7bkU00wtUI=#012np0005486731.localdomain,192.168.122.106,np0005486731* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCirnE0NbUtG1POhhB+AhKCgxEghhJb/WUMq5UfTpoI7+sU48jNxRyEvlJ9WLGLD82QYzFzvYceQHGF3QzqwIybk7JFKNvYYEOkz9hG//Xjh6A/3qZ0QptW0dWlBpSs0CuOATe19vBa98AfD1qNMYOAwwjlRDvjVW17VALcKjVesDK4LNkVfCSX9cK7Gdd1LfEkwQwxiTTZeSd91DSx5XIm3hz9RcMpxpCgc3snA81FXTTb4G1v39rycXuWjjlp/2B4CRlgPrIb6u1X/hkN0uxSMiwMQG7fZladvZi8RTRyt2EmTR0l8f0eDeuN1gLfOFVlQSfj33xH8/2G2s4IUhbudf732i4GKxgy5WBMiH2DVHzoO7LGdKlYKRvxgNG8qx68hOAzHokMnmaHnKlTsXNPph6MD/ufoeHaEG35xMkewSoY70MzDny/Z9lllfTTs+Yi5YEO22s5EoS6KK9C1+WShW9TELIuj5X8P1VeD+LlKJIwbLQzEHLc1irbnJ2RgUc=#012np0005486731.localdomain,192.168.122.106,np0005486731* ssh-ed25519 
AAAAC3NzaC1lZDI1NTE5AAAAIKTfHN8Jmqa+7PsF3vFOpO1ETsyPHFXELxpBTIpPfddD#012np0005486731.localdomain,192.168.122.106,np0005486731* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBO1jfw99DJUv0J3Z9AkynHW2Up1hO/BlEGnvsE92l9HdbBSEY1YHu6GqkahHkqrmTxGZ5NofIRR2e10OiKQQIV0=#012np0005486732.localdomain,192.168.122.107,np0005486732* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDM+kpIg8Y4xlC9n9pfBoVDeeU3WOfZT4Yf4ib8bb9MSMyOwJpLVbkpe3nLg73heYlLISwD3ojybTo9jDmNS7Pq+q5bGue4oqLk7f5B7IMwrmkfzjKYQpGMLL7FdErlDs6IP2jQ82E+uJ7M54Kv5g0rr+blVacsnYetzjJM26r3UcKTdOjJyIHuvQWa4IzNJRydr8s9//7Orf7269xlmVoqyAkcrhzcewCVeaK7VOrIcy3oKzOtwYpQmSxUumuX5rxE8KoCn4Ag0V3Mpp7hqN2xrry1hJN1J7yXSYaF1pc4MJKvCK6k0VqK4dY6CppsQvx2HW1s/Ib5UxJ/+JypjsqwYcSL7BSesfCtHtY8Tn1bbI+nm+nbMw1VIECq94FvZldDnxbaCQDP7dkFxqJaZebSFX+XAsRqJq4M8/rAm2gFUtCisiggasuEgfBfODBwb5+EYGNBCS/72Xs3b1h+hoMh0XCocdkTpzbr40FK6djLBdZXBAt7/Vwy0fTpC9G8H+s=#012np0005486732.localdomain,192.168.122.107,np0005486732* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIP18wQUsgo1UBda3H3zHF+IC2kyNZ51YCgvk01Gn/dse#012np0005486732.localdomain,192.168.122.107,np0005486732* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBCuamP8L0Ru51uRKZu8uXCmbi6mkdIaPGzpAzsbiDGTvO4mQOVAysASx3inqIoCaiUKcwRI9OHoaL30bXMeCfgY=#012np0005486730.localdomain,192.168.122.105,np0005486730* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDtk5xAqdm3oDp772fF0Tcpwt7lZCIcJjfcDVjKALPT5gaSA/ogGG08ba03OQjSa4fktVIIeYQdRVzWIscOCoWMDa+vnXRStoi9DI+3rLz3nQvH190s8hPq6KxWR8DzGiqF8GwF1Kfuc7wz4c9jdElv6iWUfZuxCSLQfPSRYOw9IIII6knfTuRjQAIdmUJwnjN9K5n2n8rISg0VPd9kUHZR8jL+zFPsv5XkwfW/t5CEMmx6WG8w8Q6gY+yoeU4qINcRzFjKx/s6ParctRSYzJDPYEyhrgqQUesBDU4nyxRDpFilkeZI46TfqC9bG5bKTVfVy6qnAgkt4vg6buwszUTRdx6a0v68zWAwKGNAHRKS/HQ/CRe7CHYqsob7w41V4RvOtP5kz+dniINeT/K71sL3ZwcciRuGM10ayjaxBw7HOMJHi9RWrPWads3ubzTErcORb9mdWdlSomqfEGB8Ig/tKeFTipyN39TKKHLD+o6Tjnxqb3imMsE1kZWQOzHbFhE=#012np0005486730.localdomain,192.168.122.105,np0005486730* ssh-ed25519 
AAAAC3NzaC1lZDI1NTE5AAAAIDq3H3Fetnx28JUaDyUkNg0MiLRsl8k1oSo01bE4tTx4#012np0005486730.localdomain,192.168.122.105,np0005486730* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBO3n1UwhGEzXCVrYMBza4JMt6lsbT42NITUCGasB/Q88juksY/4w67C7ec1FV7QYfygjevsjTj8uJGh0384TqeQ=#012 create=True mode=0644 path=/tmp/ansible.eedehguz state=present marker=# {mark} ANSIBLE MANAGED BLOCK backup=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False unsafe_writes=False insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:28:31 localhost python3.9[146450]: ansible-ansible.legacy.command Invoked with _raw_params=cat '/tmp/ansible.eedehguz' > /etc/ssh/ssh_known_hosts _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Oct 14 05:28:32 localhost python3.9[146544]: ansible-ansible.builtin.file Invoked with path=/tmp/ansible.eedehguz state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:28:33 localhost systemd-logind[760]: Session 45 logged out. Waiting for processes to exit. Oct 14 05:28:33 localhost systemd[1]: session-45.scope: Deactivated successfully. Oct 14 05:28:33 localhost systemd[1]: session-45.scope: Consumed 4.295s CPU time. Oct 14 05:28:33 localhost systemd-logind[760]: Removed session 45. 
Oct 14 05:28:38 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34661 DF PROTO=TCP SPT=48760 DPT=9105 SEQ=659178881 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AFF5B3E0000000001030307) Oct 14 05:28:39 localhost sshd[146559]: main: sshd: ssh-rsa algorithm is disabled Oct 14 05:28:39 localhost systemd-logind[760]: New session 46 of user zuul. Oct 14 05:28:39 localhost systemd[1]: Started Session 46 of User zuul. Oct 14 05:28:40 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9132 DF PROTO=TCP SPT=38776 DPT=9100 SEQ=3988462113 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AFF61250000000001030307) Oct 14 05:28:40 localhost python3.9[146652]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Oct 14 05:28:41 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10870 DF PROTO=TCP SPT=42834 DPT=9101 SEQ=1736226574 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AFF64090000000001030307) Oct 14 05:28:41 localhost python3.9[146748]: ansible-ansible.builtin.systemd Invoked with enabled=True name=sshd daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None Oct 14 05:28:43 localhost python3.9[146842]: ansible-ansible.builtin.systemd Invoked with name=sshd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None Oct 14 05:28:44 localhost python3.9[146935]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft 
_uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Oct 14 05:28:45 localhost python3.9[147028]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Oct 14 05:28:46 localhost python3.9[147122]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Oct 14 05:28:47 localhost python3.9[147217]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:28:47 localhost systemd-logind[760]: Session 46 logged out. Waiting for processes to exit. Oct 14 05:28:47 localhost systemd[1]: session-46.scope: Deactivated successfully. Oct 14 05:28:47 localhost systemd[1]: session-46.scope: Consumed 4.111s CPU time. Oct 14 05:28:47 localhost systemd-logind[760]: Removed session 46. Oct 14 05:28:52 localhost sshd[147310]: main: sshd: ssh-rsa algorithm is disabled Oct 14 05:28:52 localhost systemd-logind[760]: New session 47 of user zuul. Oct 14 05:28:52 localhost systemd[1]: Started Session 47 of User zuul. 
Oct 14 05:28:53 localhost python3.9[147403]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Oct 14 05:28:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17040 DF PROTO=TCP SPT=43318 DPT=9102 SEQ=3552275696 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AFF94C50000000001030307) Oct 14 05:28:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17041 DF PROTO=TCP SPT=43318 DPT=9102 SEQ=3552275696 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AFF98DB0000000001030307) Oct 14 05:28:54 localhost python3.9[147499]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d Oct 14 05:28:55 localhost python3.9[147553]: ansible-ansible.legacy.dnf Invoked with name=['yum-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None Oct 14 05:28:56 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17042 DF PROTO=TCP SPT=43318 DPT=9102 SEQ=3552275696 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT 
(020405500402080A2AFFA0DA0000000001030307) Oct 14 05:28:58 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44115 DF PROTO=TCP SPT=43260 DPT=9882 SEQ=548636177 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AFFA78B0000000001030307) Oct 14 05:28:59 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44116 DF PROTO=TCP SPT=43260 DPT=9882 SEQ=548636177 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AFFAB9A0000000001030307) Oct 14 05:28:59 localhost python3.9[147645]: ansible-ansible.legacy.command Invoked with _raw_params=needs-restarting -r _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Oct 14 05:29:00 localhost sshd[147647]: main: sshd: ssh-rsa algorithm is disabled Oct 14 05:29:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17043 DF PROTO=TCP SPT=43318 DPT=9102 SEQ=3552275696 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AFFB09A0000000001030307) Oct 14 05:29:00 localhost sshd[147695]: main: sshd: ssh-rsa algorithm is disabled Oct 14 05:29:01 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44117 DF PROTO=TCP SPT=43260 DPT=9882 SEQ=548636177 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AFFB39A0000000001030307) Oct 14 05:29:01 localhost python3.9[147742]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/openstack/reboot_required/ state=directory 
recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:29:02 localhost python3.9[147834]: ansible-ansible.builtin.file Invoked with mode=0600 path=/var/lib/openstack/reboot_required/needs_restarting state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:29:03 localhost python3.9[147926]: ansible-ansible.builtin.lineinfile Invoked with dest=/var/lib/openstack/reboot_required/needs_restarting line=Not root, Subscription Management repositories not updated#012Core libraries or services have been updated since boot-up:#012 * systemd#012#012Reboot is required to fully utilize these updates.#012More information: https://access.redhat.com/solutions/27943 path=/var/lib/openstack/reboot_required/needs_restarting state=present backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:29:03 localhost python3.9[148016]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/reboot_required/'] patterns=[] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None Oct 14 05:29:04 localhost python3.9[148106]: ansible-ansible.builtin.stat 
Invoked with path=/var/lib/config-data/puppet-generated follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Oct 14 05:29:05 localhost python3.9[148198]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/config follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Oct 14 05:29:05 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44118 DF PROTO=TCP SPT=43260 DPT=9882 SEQ=548636177 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AFFC35B0000000001030307) Oct 14 05:29:05 localhost systemd-logind[760]: Session 47 logged out. Waiting for processes to exit. Oct 14 05:29:05 localhost systemd[1]: session-47.scope: Deactivated successfully. Oct 14 05:29:05 localhost systemd[1]: session-47.scope: Consumed 9.077s CPU time. Oct 14 05:29:05 localhost systemd-logind[760]: Removed session 47. 
Oct 14 05:29:08 localhost sshd[148214]: main: sshd: ssh-rsa algorithm is disabled Oct 14 05:29:08 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42056 DF PROTO=TCP SPT=40332 DPT=9105 SEQ=3391696259 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AFFD06E0000000001030307) Oct 14 05:29:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42057 DF PROTO=TCP SPT=40332 DPT=9105 SEQ=3391696259 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AFFD45A0000000001030307) Oct 14 05:29:10 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6711 DF PROTO=TCP SPT=36082 DPT=9100 SEQ=4245352943 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AFFD6560000000001030307) Oct 14 05:29:10 localhost sshd[148216]: main: sshd: ssh-rsa algorithm is disabled Oct 14 05:29:11 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59478 DF PROTO=TCP SPT=38062 DPT=9101 SEQ=1013376436 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AFFD9390000000001030307) Oct 14 05:29:11 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6712 DF PROTO=TCP SPT=36082 DPT=9100 SEQ=4245352943 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AFFDA5B0000000001030307) Oct 14 05:29:11 localhost sshd[148218]: main: sshd: ssh-rsa algorithm is disabled Oct 14 05:29:11 localhost systemd-logind[760]: New session 48 of user zuul. 
Oct 14 05:29:11 localhost systemd[1]: Started Session 48 of User zuul. Oct 14 05:29:11 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42058 DF PROTO=TCP SPT=40332 DPT=9105 SEQ=3391696259 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AFFDC5B0000000001030307) Oct 14 05:29:12 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59479 DF PROTO=TCP SPT=38062 DPT=9101 SEQ=1013376436 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AFFDD5B0000000001030307) Oct 14 05:29:12 localhost python3.9[148311]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Oct 14 05:29:13 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6713 DF PROTO=TCP SPT=36082 DPT=9100 SEQ=4245352943 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AFFE25B0000000001030307) Oct 14 05:29:15 localhost python3.9[148407]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/telemetry setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Oct 14 05:29:15 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42059 DF PROTO=TCP SPT=40332 DPT=9105 SEQ=3391696259 ACK=0 
WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2AFFEC1A0000000001030307) Oct 14 05:29:16 localhost python3.9[148499]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True Oct 14 05:29:16 localhost python3.9[148572]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1760434155.3436897-184-141177977558474/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=2c0c9af0a7c9617e778807fbf142c88d84b85267 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:29:17 localhost python3.9[148664]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/neutron-sriov setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Oct 14 05:29:18 localhost python3.9[148757]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True Oct 14 05:29:18 localhost sshd[148758]: main: sshd: ssh-rsa algorithm is disabled Oct 14 05:29:18 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59481 DF PROTO=TCP SPT=38062 DPT=9101 SEQ=1013376436 ACK=0 WINDOW=32640 RES=0x00 
SYN URGP=0 OPT (020405500402080A2AFFF51B0000000001030307) Oct 14 05:29:18 localhost python3.9[148832]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1760434157.6095312-253-99415169291005/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=2c0c9af0a7c9617e778807fbf142c88d84b85267 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:29:19 localhost python3.9[148924]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/neutron-dhcp setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Oct 14 05:29:20 localhost python3.9[149016]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True Oct 14 05:29:20 localhost python3.9[149089]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1760434159.5179033-325-171030878402983/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=2c0c9af0a7c9617e778807fbf142c88d84b85267 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:29:20 localhost sshd[149104]: main: sshd: 
ssh-rsa algorithm is disabled Oct 14 05:29:21 localhost python3.9[149183]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Oct 14 05:29:21 localhost sshd[149198]: main: sshd: ssh-rsa algorithm is disabled Oct 14 05:29:21 localhost python3.9[149277]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True Oct 14 05:29:22 localhost python3.9[149350]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1760434161.4380217-397-232074671168455/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=2c0c9af0a7c9617e778807fbf142c88d84b85267 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:29:23 localhost python3.9[149442]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Oct 14 05:29:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 
SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11192 DF PROTO=TCP SPT=40594 DPT=9102 SEQ=2202394417 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B0009F40000000001030307) Oct 14 05:29:23 localhost python3.9[149534]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True Oct 14 05:29:24 localhost python3.9[149607]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1760434163.231975-468-90105857283716/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=2c0c9af0a7c9617e778807fbf142c88d84b85267 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:29:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11193 DF PROTO=TCP SPT=40594 DPT=9102 SEQ=2202394417 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B000E1A0000000001030307) Oct 14 05:29:25 localhost python3.9[149699]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Oct 14 05:29:25 localhost python3.9[149791]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False 
get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True Oct 14 05:29:26 localhost python3.9[149864]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1760434165.235077-540-45624600959462/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=2c0c9af0a7c9617e778807fbf142c88d84b85267 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:29:27 localhost python3.9[149956]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/bootstrap setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Oct 14 05:29:27 localhost python3.9[150048]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True Oct 14 05:29:28 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6074 DF PROTO=TCP SPT=59938 DPT=9882 SEQ=4169186021 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B001CBB0000000001030307) Oct 14 05:29:28 localhost python3.9[150121]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1760434167.1692476-613-47189290359895/.source.pem 
_original_basename=tls-ca-bundle.pem follow=False checksum=2c0c9af0a7c9617e778807fbf142c88d84b85267 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:29:29 localhost python3.9[150213]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/neutron-metadata setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Oct 14 05:29:29 localhost python3.9[150305]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True Oct 14 05:29:30 localhost python3.9[150378]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1760434169.2396655-687-123648208944664/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=2c0c9af0a7c9617e778807fbf142c88d84b85267 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:29:30 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11195 DF PROTO=TCP SPT=40594 DPT=9102 SEQ=2202394417 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B0025DA0000000001030307) Oct 14 05:29:30 localhost 
systemd[1]: session-48.scope: Deactivated successfully. Oct 14 05:29:30 localhost systemd[1]: session-48.scope: Consumed 11.850s CPU time. Oct 14 05:29:30 localhost systemd-logind[760]: Session 48 logged out. Waiting for processes to exit. Oct 14 05:29:30 localhost systemd-logind[760]: Removed session 48. Oct 14 05:29:35 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6077 DF PROTO=TCP SPT=59938 DPT=9882 SEQ=4169186021 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B00389B0000000001030307) Oct 14 05:29:36 localhost sshd[150393]: main: sshd: ssh-rsa algorithm is disabled Oct 14 05:29:36 localhost systemd-logind[760]: New session 49 of user zuul. Oct 14 05:29:36 localhost systemd[1]: Started Session 49 of User zuul. Oct 14 05:29:37 localhost python3.9[150488]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/openstack/config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:29:38 localhost python3.9[150580]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/ceph/ceph.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True Oct 14 05:29:38 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64515 DF PROTO=TCP SPT=60318 DPT=9105 SEQ=2673955938 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B00459E0000000001030307) Oct 14 05:29:39 localhost python3.9[150653]: ansible-ansible.legacy.copy Invoked with 
dest=/var/lib/openstack/config/ceph/ceph.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1760434177.6093411-64-212940730049779/.source.conf _original_basename=ceph.conf follow=False checksum=3ea08ebaa38e66fdc9487ab3279546d8d5630636 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:29:39 localhost python3.9[150745]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/ceph/ceph.client.openstack.keyring follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True Oct 14 05:29:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64516 DF PROTO=TCP SPT=60318 DPT=9105 SEQ=2673955938 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B00499B0000000001030307) Oct 14 05:29:40 localhost python3.9[150818]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/ceph/ceph.client.openstack.keyring mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1760434179.1576748-64-257439226101558/.source.keyring _original_basename=ceph.client.openstack.keyring follow=False checksum=0991400062f1e3522feec6859340320816889889 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:29:40 localhost systemd[1]: session-49.scope: Deactivated successfully. Oct 14 05:29:40 localhost systemd[1]: session-49.scope: Consumed 2.419s CPU time. Oct 14 05:29:40 localhost systemd-logind[760]: Session 49 logged out. Waiting for processes to exit. Oct 14 05:29:40 localhost systemd-logind[760]: Removed session 49. 
Oct 14 05:29:41 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64517 DF PROTO=TCP SPT=60318 DPT=9105 SEQ=2673955938 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B00519B0000000001030307)
Oct 14 05:29:45 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64518 DF PROTO=TCP SPT=60318 DPT=9105 SEQ=2673955938 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B00615A0000000001030307)
Oct 14 05:29:46 localhost sshd[150833]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 05:29:46 localhost systemd-logind[760]: New session 50 of user zuul.
Oct 14 05:29:46 localhost systemd[1]: Started Session 50 of User zuul.
Oct 14 05:29:47 localhost python3.9[150926]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 14 05:29:48 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39652 DF PROTO=TCP SPT=53538 DPT=9101 SEQ=3895317122 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B006A1A0000000001030307)
Oct 14 05:29:48 localhost python3.9[151022]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 14 05:29:49 localhost python3.9[151114]: ansible-ansible.builtin.file Invoked with group=openvswitch owner=openvswitch path=/var/lib/openvswitch/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Oct 14 05:29:50 localhost python3.9[151234]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 14 05:29:51 localhost python3.9[151370]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Oct 14 05:29:52 localhost python3.9[151464]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct 14 05:29:53 localhost python3.9[151518]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch3.3'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct 14 05:29:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51992 DF PROTO=TCP SPT=58698 DPT=9102 SEQ=1024187651 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B007F250000000001030307)
Oct 14 05:29:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51993 DF PROTO=TCP SPT=58698 DPT=9102 SEQ=1024187651 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B00831B0000000001030307)
Oct 14 05:29:57 localhost python3.9[151612]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Oct 14 05:29:58 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37754 DF PROTO=TCP SPT=51840 DPT=9882 SEQ=2151715631 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B0091EB0000000001030307)
Oct 14 05:29:58 localhost python3[151707]: ansible-osp.edpm.edpm_nftables_snippet Invoked with content=- rule_name: 118 neutron vxlan networks#012 rule:#012 proto: udp#012 dport: 4789#012- rule_name: 119 neutron geneve networks#012 rule:#012 proto: udp#012 dport: 6081#012 state: ["UNTRACKED"]#012- rule_name: 120 neutron geneve networks no conntrack#012 rule:#012 proto: udp#012 dport: 6081#012 table: raw#012 chain: OUTPUT#012 jump: NOTRACK#012 action: append#012 state: []#012- rule_name: 121 neutron geneve networks no conntrack#012 rule:#012 proto: udp#012 dport: 6081#012 table: raw#012 chain: PREROUTING#012 jump: NOTRACK#012 action: append#012 state: []#012 dest=/var/lib/edpm-config/firewall/ovn.yaml state=present
Oct 14 05:29:59 localhost python3.9[151799]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 05:30:00 localhost python3.9[151891]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 05:30:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51995 DF PROTO=TCP SPT=58698 DPT=9102 SEQ=1024187651 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B009ADA0000000001030307)
Oct 14 05:30:00 localhost python3.9[151939]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 05:30:01 localhost python3.9[152031]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 05:30:01 localhost python3.9[152079]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.fh23dlx5 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 05:30:02 localhost python3.9[152171]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 05:30:03 localhost python3.9[152219]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 05:30:04 localhost python3.9[152311]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 05:30:05 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37757 DF PROTO=TCP SPT=51840 DPT=9882 SEQ=2151715631 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B00AD9A0000000001030307)
Oct 14 05:30:05 localhost python3[152404]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Oct 14 05:30:06 localhost python3.9[152496]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 05:30:07 localhost python3.9[152571]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1760434205.6498518-433-46964684810262/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 05:30:08 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2438 DF PROTO=TCP SPT=51260 DPT=9105 SEQ=3430914053 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B00BACF0000000001030307)
Oct 14 05:30:08 localhost python3.9[152664]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 05:30:09 localhost python3.9[152739]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1760434208.1470606-478-29288655920294/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 05:30:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2439 DF PROTO=TCP SPT=51260 DPT=9105 SEQ=3430914053 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B00BEDA0000000001030307)
Oct 14 05:30:10 localhost python3.9[152832]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 05:30:10 localhost python3.9[152907]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1760434209.6703727-524-224198029549994/.source.nft follow=False _original_basename=flush-chain.j2 checksum=4d3ffec49c8eb1a9b80d2f1e8cd64070063a87b4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 05:30:11 localhost python3.9[152999]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 05:30:11 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2440 DF PROTO=TCP SPT=51260 DPT=9105 SEQ=3430914053 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B00C6DA0000000001030307)
Oct 14 05:30:12 localhost python3.9[153074]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1760434210.9185143-568-149446942983304/.source.nft follow=False _original_basename=chains.j2 checksum=298ada419730ec15df17ded0cc50c97a4014a591 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 05:30:12 localhost python3.9[153166]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 05:30:13 localhost python3.9[153241]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1760434212.2622612-614-172371613423383/.source.nft follow=False _original_basename=ruleset.j2 checksum=eb691bdb7d792c5f8ff0d719e807fe1c95b09438 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 05:30:14 localhost python3.9[153333]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 05:30:14 localhost python3.9[153425]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 05:30:15 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2441 DF PROTO=TCP SPT=51260 DPT=9105 SEQ=3430914053 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B00D69A0000000001030307)
Oct 14 05:30:15 localhost python3.9[153520]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 05:30:16 localhost python3.9[153612]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 05:30:17 localhost python3.9[153705]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 14 05:30:18 localhost python3.9[153799]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 05:30:18 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11141 DF PROTO=TCP SPT=36340 DPT=9101 SEQ=27226405 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B00DF5A0000000001030307)
Oct 14 05:30:18 localhost python3.9[153894]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 05:30:19 localhost python3.9[153984]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'machine'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 14 05:30:21 localhost python3.9[154077]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl set open . external_ids:hostname=np0005486733.localdomain external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings="datacentre:3e:0a:2c:0c:de:0a" external_ids:ovn-encap-ip=172.19.0.108 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=tcp:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch #012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 05:30:21 localhost ovs-vsctl[154078]: ovs|00001|vsctl|INFO|Called as ovs-vsctl set open . external_ids:hostname=np0005486733.localdomain external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings=datacentre:3e:0a:2c:0c:de:0a external_ids:ovn-encap-ip=172.19.0.108 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=tcp:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch
Oct 14 05:30:22 localhost python3.9[154170]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail#012ovs-vsctl show | grep -q "Manager"#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 05:30:23 localhost python3.9[154263]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 14 05:30:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34556 DF PROTO=TCP SPT=56438 DPT=9102 SEQ=493682149 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B00F4550000000001030307)
Oct 14 05:30:23 localhost python3.9[154357]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 14 05:30:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34557 DF PROTO=TCP SPT=56438 DPT=9102 SEQ=493682149 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B00F85A0000000001030307)
Oct 14 05:30:24 localhost python3.9[154449]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 05:30:25 localhost python3.9[154497]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 14 05:30:25 localhost python3.9[154589]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 05:30:26 localhost python3.9[154637]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 14 05:30:26 localhost python3.9[154729]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 05:30:27 localhost python3.9[154821]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 05:30:28 localhost python3.9[154869]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 05:30:28 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55557 DF PROTO=TCP SPT=44532 DPT=9882 SEQ=3442776617 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B01071A0000000001030307)
Oct 14 05:30:28 localhost python3.9[154961]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 05:30:29 localhost python3.9[155009]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 05:30:30 localhost python3.9[155101]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 14 05:30:30 localhost systemd[1]: Reloading.
Oct 14 05:30:30 localhost systemd-rc-local-generator[155126]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 05:30:30 localhost systemd-sysv-generator[155129]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 05:30:30 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 05:30:30 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34559 DF PROTO=TCP SPT=56438 DPT=9102 SEQ=493682149 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B01101B0000000001030307)
Oct 14 05:30:31 localhost python3.9[155231]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 05:30:31 localhost python3.9[155279]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 05:30:32 localhost python3.9[155371]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 05:30:32 localhost python3.9[155419]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 05:30:33 localhost python3.9[155511]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 14 05:30:33 localhost systemd[1]: Reloading.
Oct 14 05:30:33 localhost systemd-rc-local-generator[155535]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 05:30:33 localhost systemd-sysv-generator[155540]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 05:30:33 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 05:30:33 localhost systemd[1]: Starting Create netns directory...
Oct 14 05:30:33 localhost systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Oct 14 05:30:33 localhost systemd[1]: netns-placeholder.service: Deactivated successfully.
Oct 14 05:30:33 localhost systemd[1]: Finished Create netns directory.
Oct 14 05:30:34 localhost python3.9[155646]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 14 05:30:35 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55560 DF PROTO=TCP SPT=44532 DPT=9882 SEQ=3442776617 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B0122DA0000000001030307)
Oct 14 05:30:35 localhost python3.9[155738]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_controller/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 05:30:35 localhost python3.9[155811]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_controller/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1760434234.9887407-1345-186006998840313/.source _original_basename=healthcheck follow=False checksum=4098dd010265fabdf5c26b97d169fc4e575ff457 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Oct 14 05:30:36 localhost python3.9[155903]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 14 05:30:37 localhost python3.9[155995]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_controller.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 05:30:38 localhost python3.9[156070]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_controller.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1760434237.2308629-1420-186345290959878/.source.json _original_basename=.e41jpom7 follow=False checksum=38f75f59f5c2ef6b5da12297bfd31cd1e97012ac backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 05:30:38 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19439 DF PROTO=TCP SPT=49516 DPT=9105 SEQ=2816354378 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B012FFF0000000001030307)
Oct 14 05:30:39 localhost python3.9[156162]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_controller state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 05:30:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19440 DF PROTO=TCP SPT=49516 DPT=9105 SEQ=2816354378 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B01341A0000000001030307)
Oct 14 05:30:41 localhost python3.9[156419]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_controller config_pattern=*.json debug=False
Oct 14 05:30:41 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19441 DF PROTO=TCP SPT=49516 DPT=9105 SEQ=2816354378 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B013C1A0000000001030307)
Oct 14 05:30:42 localhost python3.9[156511]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Oct 14 05:30:43 localhost python3.9[156603]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Oct 14 05:30:45 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19442 DF PROTO=TCP SPT=49516 DPT=9105 SEQ=2816354378 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B014BDA0000000001030307)
Oct 14 05:30:47 localhost python3[156722]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_controller config_id=ovn_controller config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Oct 14 05:30:47 localhost python3[156722]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: [#012 {#012 "Id": "8808c3fcdd35e5a4eacb6d3f5ed89688361f4338056395008c191e57b6afaf7d",#012 "Digest": "sha256:31464fe4defe28fe4896a946cfe50ee0b001d1a03081174d9f69e4a313b0f21e",#012 "RepoTags": [#012 "quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified"#012 ],#012 "RepoDigests": [#012 "quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:31464fe4defe28fe4896a946cfe50ee0b001d1a03081174d9f69e4a313b0f21e"#012 ],#012 "Parent": "",#012 "Comment": "",#012 "Created": "2025-10-13T13:00:39.999290816Z",#012 "Config": {#012 "User": "root",#012 "Env": [#012 "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin",#012 "LANG=en_US.UTF-8",#012 "TZ=UTC",#012 "container=oci"#012 ],#012 "Entrypoint": [#012 "dumb-init",#012 "--single-child",#012 "--"#012 ],#012 "Cmd": [#012 "kolla_start"#012 ],#012 "Labels": {#012 "io.buildah.version": "1.41.3",#012 "maintainer": "OpenStack Kubernetes Operator team",#012 "org.label-schema.build-date": "20251009",#012 "org.label-schema.license": "GPLv2",#012 "org.label-schema.name": "CentOS Stream 9 Base Image",#012 "org.label-schema.schema-version": "1.0",#012 "org.label-schema.vendor": "CentOS",#012 "tcib_build_tag": "1e4eeec18f8da2b364b39b7a7358aef5",#012 "tcib_managed": "true"#012 },#012 "StopSignal": "SIGTERM"#012 },#012 "Version": "",#012 "Author": "",#012 "Architecture": "amd64",#012 "Os": "linux",#012 "Size": 345598922,#012 "VirtualSize": 345598922,#012 "GraphDriver": {#012 "Name": "overlay",#012 "Data": {#012 "LowerDir": "/var/lib/containers/storage/overlay/9353b4c9b77a60c02d5cd3c8f9b94918c7a607156d2f7e1365b30ffe1fa49c89/diff:/var/lib/containers/storage/overlay/ab64777085904da680013c790c3f2c65f0b954578737ec4d7fa836f56655c34a/diff:/var/lib/containers/storage/overlay/f3f40f6483bf6d587286da9e86e40878c2aaaf723da5aa2364fff24f5ea28424/diff",#012 "UpperDir": "/var/lib/containers/storage/overlay/41d6d78d48a59c2a92b7ebbd672b507950bf0a9c199b961ef8dec56e0bf4d10d/diff",#012 "WorkDir": "/var/lib/containers/storage/overlay/41d6d78d48a59c2a92b7ebbd672b507950bf0a9c199b961ef8dec56e0bf4d10d/work"#012 }#012 },#012 "RootFS": {#012 "Type": "layers",#012 "Layers": [#012 "sha256:f3f40f6483bf6d587286da9e86e40878c2aaaf723da5aa2364fff24f5ea28424",#012 
"sha256:2c35d1af0a6e73cbcf6c04a576d2e6a150aeaa6ae9408c81b2003edd71d6ae59",#012 "sha256:941d6c62fda0ad5502f66ca2e71ffe6e3f64b2a5a0db75dac0075fa750a883f2",#012 "sha256:a82e45bff332403f46d24749948c917d1a37ea0b8ab922688da4f6038dc99c66"#012 ]#012 },#012 "Labels": {#012 "io.buildah.version": "1.41.3",#012 "maintainer": "OpenStack Kubernetes Operator team",#012 "org.label-schema.build-date": "20251009",#012 "org.label-schema.license": "GPLv2",#012 "org.label-schema.name": "CentOS Stream 9 Base Image",#012 "org.label-schema.schema-version": "1.0",#012 "org.label-schema.vendor": "CentOS",#012 "tcib_build_tag": "1e4eeec18f8da2b364b39b7a7358aef5",#012 "tcib_managed": "true"#012 },#012 "Annotations": {},#012 "ManifestType": "application/vnd.docker.distribution.manifest.v2+json",#012 "User": "root",#012 "History": [#012 {#012 "created": "2025-10-09T00:18:03.867908726Z",#012 "created_by": "/bin/sh -c #(nop) ADD file:b2e608b9da8e087a764c2aebbd9c2cc9181047f5b301f1dab77fdf098a28268b in / ",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-10-09T00:18:03.868015697Z",#012 "created_by": "/bin/sh -c #(nop) LABEL org.label-schema.schema-version=\"1.0\" org.label-schema.name=\"CentOS Stream 9 Base Image\" org.label-schema.vendor=\"CentOS\" org.label-schema.license=\"GPLv2\" org.label-schema.build-date=\"20251009\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-10-09T00:18:07.890794359Z",#012 "created_by": "/bin/sh -c #(nop) CMD [\"/bin/bash\"]"#012 },#012 {#012 "created": "2025-10-13T12:28:42.843286399Z",#012 "created_by": "/bin/sh -c #(nop) LABEL maintainer=\"OpenStack Kubernetes Operator team\"",#012 "comment": "FROM quay.io/centos/centos:stream9",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-10-13T12:28:42.843354051Z",#012 "created_by": "/bin/sh -c #(nop) LABEL tcib_managed=true",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-10-13T12:28:42.843394192Z",#012 "created_by": "/bin/sh -c #(nop) ENV LANG=\"en_US.UTF-8\"",#012 "empty_layer": 
true#012 },#012 {#012 "created": "2025-10-13T12:28:42.843417133Z",#012 "created_by": "/bin/sh -c #(nop) ENV TZ=\"UTC\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-10-13T12:28:42.843442193Z",#012 "created_by": "/bin/sh -c #(nop) ENV container=\"oci\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-10-13T12:28:42.843461914Z",#012 "created_by": "/bin/sh -c #(nop) USER root",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-10-13T12:28:43.236856724Z",#012 "created_by": "/bin/sh -c if [ -f \"/etc/yum.repos.d/ubi.repo\" ]; then rm -f /etc/yum.repos.d/ubi.repo && dnf clean all && rm -rf /var/cache/dnf; fi",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-10-13T12:29:17.539596691Z",#012 "created_by": "/bin/sh -c dnf install -y crudini && crudini --del /etc/dnf/dnf.conf main override_install_langs && crudini --set /etc/dnf/dnf.conf main clean_requirements_on_remove True && crudini --set /etc/dnf/dnf.conf main exactarch 1 && crudini --set /etc/dnf/dnf.conf main gpgcheck 1 && crudini --set /etc/dnf/dnf.conf main install_weak_deps False && if [ 'centos' == 'centos' ];then crudini --set /etc/dnf/dnf.conf main best False; fi && crudini --set /etc/dnf/dnf.conf main installonly_limit 0 && crudini --set /etc/dnf/dnf.conf main keepcache 0 && crudini --set /etc/dnf/dnf.conf main obsoletes 1 && crudini --set /etc/dnf/dnf.conf main plugins 1 && crudini --set /etc/dnf/dnf.conf main skip_missing_names_on_install False && crudini --set /etc/dnf/dnf.conf main tsflags nodocs",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-10-13T12:29:21.007092512Z",#012 "created_by": "/bin/sh -c dnf install -y ca-certificates dumb-init glibc-langpack-en procps-ng python3 sudo util- Oct 14 05:30:47 localhost podman[156772]: 2025-10-14 09:30:47.644801925 +0000 UTC m=+0.086483719 container remove a611fa24fd97a6cdc6d208d6d39bcbc5f05f1b08c85b4529bb13a8579732f0f5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, 
io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=ovn_controller, version=17.1.9, description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., release=1, config_id=tripleo_step4, managed_by=tripleo_ansible, name=rhosp17/openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, architecture=x86_64, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-07-21T13:28:44, com.redhat.component=openstack-ovn-controller-container, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, maintainer=OpenStack TripleO Team, batch=17.1_20250721.1, io.openshift.expose-services=, tcib_managed=true) Oct 14 05:30:47 localhost python3[156722]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman rm --force ovn_controller Oct 14 05:30:47 localhost podman[156785]: Oct 14 05:30:47 localhost podman[156785]: 2025-10-14 09:30:47.728768575 +0000 UTC m=+0.067631366 container create 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f 
(image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.schema-version=1.0, container_name=ovn_controller, config_id=ovn_controller, tcib_managed=true, org.label-schema.build-date=20251009, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, managed_by=edpm_ansible) Oct 14 05:30:47 localhost podman[156785]: 2025-10-14 09:30:47.693873374 +0000 UTC m=+0.032736155 image pull quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified Oct 14 05:30:47 localhost python3[156722]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_controller --conmon-pidfile /run/ovn_controller.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --healthcheck-command /openstack/healthcheck --label config_id=ovn_controller --label container_name=ovn_controller --label managed_by=edpm_ansible --label config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': 
'/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --user root --volume /lib/modules:/lib/modules:ro --volume /run:/run --volume /var/lib/openvswitch/ovn:/run/ovn:shared,z --volume /var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified Oct 14 05:30:48 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28777 DF PROTO=TCP SPT=46968 DPT=9101 SEQ=922501728 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B01549A0000000001030307) Oct 14 05:30:48 localhost python3.9[156912]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Oct 14 05:30:49 localhost python3.9[157006]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_controller.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False 
_original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:30:49 localhost python3.9[157052]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_controller_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Oct 14 05:30:50 localhost python3.9[157143]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1760434250.038011-1684-152643352128227/source dest=/etc/systemd/system/edpm_ovn_controller.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:30:51 localhost python3.9[157189]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None Oct 14 05:30:51 localhost systemd[1]: Reloading. Oct 14 05:30:51 localhost systemd-rc-local-generator[157239]: /etc/rc.d/rc.local is not marked executable, skipping. Oct 14 05:30:51 localhost systemd-sysv-generator[157247]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Oct 14 05:30:51 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Oct 14 05:30:52 localhost python3.9[157331]: ansible-systemd Invoked with state=restarted name=edpm_ovn_controller.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Oct 14 05:30:52 localhost systemd[1]: Reloading. Oct 14 05:30:52 localhost systemd-rc-local-generator[157362]: /etc/rc.d/rc.local is not marked executable, skipping. Oct 14 05:30:52 localhost systemd-sysv-generator[157366]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Oct 14 05:30:52 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Oct 14 05:30:52 localhost systemd[1]: Starting ovn_controller container... Oct 14 05:30:52 localhost systemd[1]: Started libcrun container. Oct 14 05:30:52 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ae8de889a6f1b6044dbec6d8cd0c34680358f170b5f1158854e1afc0535f3c1e/merged/run/ovn supports timestamps until 2038 (0x7fffffff) Oct 14 05:30:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f. 
Oct 14 05:30:52 localhost podman[157374]: 2025-10-14 09:30:52.604907811 +0000 UTC m=+0.142778062 container init 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, container_name=ovn_controller, io.buildah.version=1.41.3) Oct 14 05:30:52 localhost ovn_controller[157396]: + sudo -E kolla_set_configs Oct 14 05:30:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f. 
Oct 14 05:30:52 localhost podman[157374]: 2025-10-14 09:30:52.635170898 +0000 UTC m=+0.173041149 container start 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5) Oct 14 05:30:52 localhost edpm-start-podman-container[157374]: ovn_controller Oct 14 05:30:52 localhost systemd[1]: Created slice User Slice of UID 0. Oct 14 05:30:52 localhost systemd[1]: Starting User Runtime Directory /run/user/0... Oct 14 05:30:52 localhost systemd[1]: Finished User Runtime Directory /run/user/0. Oct 14 05:30:52 localhost systemd[1]: Starting User Manager for UID 0... 
Oct 14 05:30:52 localhost podman[157411]: 2025-10-14 09:30:52.709874882 +0000 UTC m=+0.067810822 container health_status 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=starting, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3) Oct 14 05:30:52 localhost systemd[157430]: Queued start job for default target Main User Target. Oct 14 05:30:52 localhost systemd[157430]: Created slice User Application Slice. Oct 14 05:30:52 localhost systemd[157430]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system). Oct 14 05:30:52 localhost systemd[157430]: Started Daily Cleanup of User's Temporary Directories. 
Oct 14 05:30:52 localhost systemd[157430]: Reached target Paths. Oct 14 05:30:52 localhost systemd[157430]: Reached target Timers. Oct 14 05:30:52 localhost systemd[157430]: Starting D-Bus User Message Bus Socket... Oct 14 05:30:52 localhost systemd[157430]: Starting Create User's Volatile Files and Directories... Oct 14 05:30:52 localhost systemd[157430]: Finished Create User's Volatile Files and Directories. Oct 14 05:30:52 localhost systemd[157430]: Listening on D-Bus User Message Bus Socket. Oct 14 05:30:52 localhost systemd[157430]: Reached target Sockets. Oct 14 05:30:52 localhost systemd[157430]: Reached target Basic System. Oct 14 05:30:52 localhost systemd[1]: Started User Manager for UID 0. Oct 14 05:30:52 localhost systemd[157430]: Reached target Main User Target. Oct 14 05:30:52 localhost systemd[157430]: Startup finished in 102ms. Oct 14 05:30:52 localhost podman[157411]: 2025-10-14 09:30:52.798269499 +0000 UTC m=+0.156205439 container exec_died 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251009, 
org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team) Oct 14 05:30:52 localhost systemd[1]: Started Session c12 of User root. Oct 14 05:30:52 localhost podman[157411]: unhealthy Oct 14 05:30:52 localhost systemd[1]: 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f.service: Main process exited, code=exited, status=1/FAILURE Oct 14 05:30:52 localhost systemd[1]: 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f.service: Failed with result 'exit-code'. Oct 14 05:30:52 localhost edpm-start-podman-container[157373]: Creating additional drop-in dependency for "ovn_controller" (1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f) Oct 14 05:30:52 localhost systemd[1]: Reloading. Oct 14 05:30:52 localhost ovn_controller[157396]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json Oct 14 05:30:52 localhost ovn_controller[157396]: INFO:__main__:Validating config file Oct 14 05:30:52 localhost ovn_controller[157396]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS Oct 14 05:30:52 localhost ovn_controller[157396]: INFO:__main__:Writing out command to execute Oct 14 05:30:52 localhost ovn_controller[157396]: ++ cat /run_command Oct 14 05:30:52 localhost ovn_controller[157396]: + CMD='/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock ' Oct 14 05:30:52 localhost ovn_controller[157396]: + ARGS= Oct 14 05:30:52 localhost ovn_controller[157396]: + sudo kolla_copy_cacerts Oct 14 05:30:52 localhost systemd-rc-local-generator[157487]: /etc/rc.d/rc.local is not marked executable, skipping. Oct 14 05:30:52 localhost systemd-sysv-generator[157490]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. 
Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Oct 14 05:30:52 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Oct 14 05:30:53 localhost systemd[1]: session-c12.scope: Deactivated successfully. Oct 14 05:30:53 localhost systemd[1]: Started ovn_controller container. Oct 14 05:30:53 localhost systemd[1]: Started Session c13 of User root. Oct 14 05:30:53 localhost systemd[1]: session-c13.scope: Deactivated successfully. Oct 14 05:30:53 localhost ovn_controller[157396]: + [[ ! -n '' ]] Oct 14 05:30:53 localhost ovn_controller[157396]: + . kolla_extend_start Oct 14 05:30:53 localhost ovn_controller[157396]: + echo 'Running command: '\''/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock '\''' Oct 14 05:30:53 localhost ovn_controller[157396]: + umask 0022 Oct 14 05:30:53 localhost ovn_controller[157396]: + exec /usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock Oct 14 05:30:53 localhost ovn_controller[157396]: Running command: '/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock ' Oct 14 05:30:53 localhost ovn_controller[157396]: 2025-10-14T09:30:53Z|00001|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting... Oct 14 05:30:53 localhost ovn_controller[157396]: 2025-10-14T09:30:53Z|00002|reconnect|INFO|unix:/run/openvswitch/db.sock: connected Oct 14 05:30:53 localhost ovn_controller[157396]: 2025-10-14T09:30:53Z|00003|main|INFO|OVN internal version is : [24.03.7-20.33.0-76.8] Oct 14 05:30:53 localhost ovn_controller[157396]: 2025-10-14T09:30:53Z|00004|main|INFO|OVS IDL reconnected, force recompute. Oct 14 05:30:53 localhost ovn_controller[157396]: 2025-10-14T09:30:53Z|00005|reconnect|INFO|tcp:ovsdbserver-sb.openstack.svc:6642: connecting... 
Oct 14 05:30:53 localhost ovn_controller[157396]: 2025-10-14T09:30:53Z|00006|main|INFO|OVNSB IDL reconnected, force recompute. Oct 14 05:30:53 localhost ovn_controller[157396]: 2025-10-14T09:30:53Z|00007|reconnect|INFO|tcp:ovsdbserver-sb.openstack.svc:6642: connected Oct 14 05:30:53 localhost ovn_controller[157396]: 2025-10-14T09:30:53Z|00008|features|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch Oct 14 05:30:53 localhost ovn_controller[157396]: 2025-10-14T09:30:53Z|00009|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting... Oct 14 05:30:53 localhost ovn_controller[157396]: 2025-10-14T09:30:53Z|00010|features|INFO|OVS Feature: ct_zero_snat, state: supported Oct 14 05:30:53 localhost ovn_controller[157396]: 2025-10-14T09:30:53Z|00011|features|INFO|OVS Feature: ct_flush, state: supported Oct 14 05:30:53 localhost ovn_controller[157396]: 2025-10-14T09:30:53Z|00012|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting... Oct 14 05:30:53 localhost ovn_controller[157396]: 2025-10-14T09:30:53Z|00013|main|INFO|OVS feature set changed, force recompute. Oct 14 05:30:53 localhost ovn_controller[157396]: 2025-10-14T09:30:53Z|00014|ofctrl|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch Oct 14 05:30:53 localhost ovn_controller[157396]: 2025-10-14T09:30:53Z|00015|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting... Oct 14 05:30:53 localhost ovn_controller[157396]: 2025-10-14T09:30:53Z|00016|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected Oct 14 05:30:53 localhost ovn_controller[157396]: 2025-10-14T09:30:53Z|00017|ofctrl|INFO|ofctrl-wait-before-clear is now 8000 ms (was 0 ms) Oct 14 05:30:53 localhost ovn_controller[157396]: 2025-10-14T09:30:53Z|00018|main|INFO|OVS OpenFlow connection reconnected,force recompute. 
Oct 14 05:30:53 localhost ovn_controller[157396]: 2025-10-14T09:30:53Z|00019|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected Oct 14 05:30:53 localhost ovn_controller[157396]: 2025-10-14T09:30:53Z|00020|reconnect|INFO|unix:/run/openvswitch/db.sock: connected Oct 14 05:30:53 localhost ovn_controller[157396]: 2025-10-14T09:30:53Z|00021|main|INFO|OVS feature set changed, force recompute. Oct 14 05:30:53 localhost ovn_controller[157396]: 2025-10-14T09:30:53Z|00022|ovn_bfd|INFO|Disabled BFD on interface ovn-31b4da-0 Oct 14 05:30:53 localhost ovn_controller[157396]: 2025-10-14T09:30:53Z|00023|ovn_bfd|INFO|Disabled BFD on interface ovn-4e3575-0 Oct 14 05:30:53 localhost ovn_controller[157396]: 2025-10-14T09:30:53Z|00024|ovn_bfd|INFO|Disabled BFD on interface ovn-953af5-0 Oct 14 05:30:53 localhost ovn_controller[157396]: 2025-10-14T09:30:53Z|00025|features|INFO|OVS DB schema supports 4 flow table prefixes, our IDL supports: 4 Oct 14 05:30:53 localhost ovn_controller[157396]: 2025-10-14T09:30:53Z|00026|binding|INFO|Claiming lport 3ec9b060-f43d-4698-9c76-6062c70911d5 for this chassis. Oct 14 05:30:53 localhost ovn_controller[157396]: 2025-10-14T09:30:53Z|00027|binding|INFO|3ec9b060-f43d-4698-9c76-6062c70911d5: Claiming fa:16:3e:84:5e:e5 192.168.0.46 Oct 14 05:30:53 localhost ovn_controller[157396]: 2025-10-14T09:30:53Z|00028|binding|INFO|Releasing lport 25c6586a-239c-451b-aac2-e0a3ee5c3145 from this chassis (sb_readonly=0) Oct 14 05:30:53 localhost ovn_controller[157396]: 2025-10-14T09:30:53Z|00001|pinctrl(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch Oct 14 05:30:53 localhost ovn_controller[157396]: 2025-10-14T09:30:53Z|00002|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting... 
Oct 14 05:30:53 localhost ovn_controller[157396]: 2025-10-14T09:30:53Z|00029|binding|INFO|Removing lport 3ec9b060-f43d-4698-9c76-6062c70911d5 ovn-installed in OVS
Oct 14 05:30:53 localhost ovn_controller[157396]: 2025-10-14T09:30:53Z|00001|statctrl(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Oct 14 05:30:53 localhost ovn_controller[157396]: 2025-10-14T09:30:53Z|00002|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Oct 14 05:30:53 localhost ovn_controller[157396]: 2025-10-14T09:30:53Z|00003|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Oct 14 05:30:53 localhost ovn_controller[157396]: 2025-10-14T09:30:53Z|00003|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Oct 14 05:30:53 localhost ovn_controller[157396]: 2025-10-14T09:30:53Z|00030|binding|INFO|Releasing lport 25c6586a-239c-451b-aac2-e0a3ee5c3145 from this chassis (sb_readonly=0)
Oct 14 05:30:53 localhost ovn_controller[157396]: 2025-10-14T09:30:53Z|00031|ovn_bfd|INFO|Enabled BFD on interface ovn-31b4da-0
Oct 14 05:30:53 localhost ovn_controller[157396]: 2025-10-14T09:30:53Z|00032|ovn_bfd|INFO|Enabled BFD on interface ovn-4e3575-0
Oct 14 05:30:53 localhost ovn_controller[157396]: 2025-10-14T09:30:53Z|00033|ovn_bfd|INFO|Enabled BFD on interface ovn-953af5-0
Oct 14 05:30:53 localhost ovn_controller[157396]: 2025-10-14T09:30:53Z|00034|binding|INFO|Releasing lport 25c6586a-239c-451b-aac2-e0a3ee5c3145 from this chassis (sb_readonly=0)
Oct 14 05:30:53 localhost ovn_controller[157396]: 2025-10-14T09:30:53Z|00035|binding|INFO|Releasing lport 25c6586a-239c-451b-aac2-e0a3ee5c3145 from this chassis (sb_readonly=0)
Oct 14 05:30:53 localhost ovn_controller[157396]: 2025-10-14T09:30:53Z|00036|binding|INFO|Releasing lport 25c6586a-239c-451b-aac2-e0a3ee5c3145 from this chassis (sb_readonly=0)
Oct 14 05:30:53 localhost ovn_controller[157396]: 2025-10-14T09:30:53Z|00037|binding|INFO|Releasing lport 25c6586a-239c-451b-aac2-e0a3ee5c3145 from this chassis (sb_readonly=0)
Oct 14 05:30:53 localhost ovn_controller[157396]: 2025-10-14T09:30:53Z|00038|binding|INFO|Releasing lport 25c6586a-239c-451b-aac2-e0a3ee5c3145 from this chassis (sb_readonly=0)
Oct 14 05:30:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43046 DF PROTO=TCP SPT=46378 DPT=9102 SEQ=2384661746 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B0169840000000001030307)
Oct 14 05:30:53 localhost python3.9[157605]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove open . other_config hw-offload#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 05:30:53 localhost ovs-vsctl[157606]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove open . other_config hw-offload
Oct 14 05:30:54 localhost ovn_controller[157396]: 2025-10-14T09:30:54Z|00039|binding|INFO|Releasing lport 25c6586a-239c-451b-aac2-e0a3ee5c3145 from this chassis (sb_readonly=0)
Oct 14 05:30:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43047 DF PROTO=TCP SPT=46378 DPT=9102 SEQ=2384661746 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B016D9A0000000001030307)
Oct 14 05:30:54 localhost python3.9[157698]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl get Open_vSwitch . external_ids:ovn-cms-options | sed 's/\"//g'#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 05:30:54 localhost ovs-vsctl[157700]: ovs|00001|db_ctl_base|ERR|no key "ovn-cms-options" in Open_vSwitch record "." column external_ids
Oct 14 05:30:55 localhost ovn_controller[157396]: 2025-10-14T09:30:55Z|00040|binding|INFO|Releasing lport 25c6586a-239c-451b-aac2-e0a3ee5c3145 from this chassis (sb_readonly=0)
Oct 14 05:30:55 localhost ovn_controller[157396]: 2025-10-14T09:30:55Z|00041|binding|INFO|Releasing lport 25c6586a-239c-451b-aac2-e0a3ee5c3145 from this chassis (sb_readonly=0)
Oct 14 05:30:55 localhost python3.9[157793]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 05:30:55 localhost ovs-vsctl[157794]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options
Oct 14 05:30:56 localhost systemd[1]: session-50.scope: Deactivated successfully.
Oct 14 05:30:56 localhost systemd-logind[760]: Session 50 logged out. Waiting for processes to exit.
Oct 14 05:30:56 localhost systemd[1]: session-50.scope: Consumed 41.709s CPU time.
Oct 14 05:30:56 localhost systemd-logind[760]: Removed session 50.
Oct 14 05:30:58 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31850 DF PROTO=TCP SPT=39094 DPT=9882 SEQ=2771121644 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B017C4B0000000001030307)
Oct 14 05:31:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43049 DF PROTO=TCP SPT=46378 DPT=9102 SEQ=2384661746 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B01855A0000000001030307)
Oct 14 05:31:01 localhost ovn_controller[157396]: 2025-10-14T09:31:01Z|00042|binding|INFO|Setting lport 3ec9b060-f43d-4698-9c76-6062c70911d5 ovn-installed in OVS
Oct 14 05:31:01 localhost ovn_controller[157396]: 2025-10-14T09:31:01Z|00043|binding|INFO|Setting lport 3ec9b060-f43d-4698-9c76-6062c70911d5 up in Southbound
Oct 14 05:31:02 localhost sshd[157809]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 05:31:02 localhost systemd-logind[760]: New session 52 of user zuul.
Oct 14 05:31:02 localhost systemd[1]: Started Session 52 of User zuul.
Oct 14 05:31:03 localhost python3.9[157902]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 14 05:31:03 localhost systemd[1]: Stopping User Manager for UID 0...
Oct 14 05:31:03 localhost systemd[157430]: Activating special unit Exit the Session...
Oct 14 05:31:03 localhost systemd[157430]: Stopped target Main User Target.
Oct 14 05:31:03 localhost systemd[157430]: Stopped target Basic System.
Oct 14 05:31:03 localhost systemd[157430]: Stopped target Paths.
Oct 14 05:31:03 localhost systemd[157430]: Stopped target Sockets.
Oct 14 05:31:03 localhost systemd[157430]: Stopped target Timers.
Oct 14 05:31:03 localhost systemd[157430]: Stopped Daily Cleanup of User's Temporary Directories.
Oct 14 05:31:03 localhost systemd[157430]: Closed D-Bus User Message Bus Socket.
Oct 14 05:31:03 localhost systemd[157430]: Stopped Create User's Volatile Files and Directories.
Oct 14 05:31:03 localhost systemd[157430]: Removed slice User Application Slice.
Oct 14 05:31:03 localhost systemd[157430]: Reached target Shutdown.
Oct 14 05:31:03 localhost systemd[157430]: Finished Exit the Session.
Oct 14 05:31:03 localhost systemd[157430]: Reached target Exit the Session.
Oct 14 05:31:03 localhost systemd[1]: user@0.service: Deactivated successfully.
Oct 14 05:31:03 localhost systemd[1]: Stopped User Manager for UID 0.
Oct 14 05:31:03 localhost systemd[1]: Stopping User Runtime Directory /run/user/0...
Oct 14 05:31:03 localhost systemd[1]: run-user-0.mount: Deactivated successfully.
Oct 14 05:31:03 localhost systemd[1]: user-runtime-dir@0.service: Deactivated successfully.
Oct 14 05:31:03 localhost systemd[1]: Stopped User Runtime Directory /run/user/0.
Oct 14 05:31:03 localhost systemd[1]: Removed slice User Slice of UID 0.
Oct 14 05:31:04 localhost python3.9[158001]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Oct 14 05:31:05 localhost python3.9[158093]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 14 05:31:05 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31853 DF PROTO=TCP SPT=39094 DPT=9882 SEQ=2771121644 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B01981A0000000001030307)
Oct 14 05:31:05 localhost python3.9[158185]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/kill_scripts setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 14 05:31:06 localhost python3.9[158277]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/ovn-metadata-proxy setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 14 05:31:07 localhost python3.9[158369]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/external/pids setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 14 05:31:07 localhost python3.9[158459]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 14 05:31:08 localhost python3.9[158551]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Oct 14 05:31:08 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52403 DF PROTO=TCP SPT=56716 DPT=9105 SEQ=2504539774 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B01A52E0000000001030307)
Oct 14 05:31:09 localhost python3.9[158641]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/ovn_metadata_haproxy_wrapper follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 05:31:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52404 DF PROTO=TCP SPT=56716 DPT=9105 SEQ=2504539774 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B01A91B0000000001030307)
Oct 14 05:31:10 localhost python3.9[158714]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/ovn_metadata_haproxy_wrapper mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1760434269.0849354-220-194315281624689/.source follow=False _original_basename=haproxy.j2 checksum=95c62e64c8f82dd9393a560d1b052dc98d38f810 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 14 05:31:11 localhost python3.9[158804]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/kill_scripts/haproxy-kill follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 05:31:11 localhost python3.9[158878]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/kill_scripts/haproxy-kill mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1760434270.6660576-265-130808595045735/.source follow=False _original_basename=kill-script.j2 checksum=2dfb5489f491f61b95691c3bf95fa1fe48ff3700 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 14 05:31:11 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52405 DF PROTO=TCP SPT=56716 DPT=9105 SEQ=2504539774 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B01B11A0000000001030307)
Oct 14 05:31:12 localhost python3.9[158970]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct 14 05:31:13 localhost python3.9[159024]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch3.3'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct 14 05:31:15 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52406 DF PROTO=TCP SPT=56716 DPT=9105 SEQ=2504539774 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B01C0DA0000000001030307)
Oct 14 05:31:17 localhost python3.9[159118]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Oct 14 05:31:18 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34318 DF PROTO=TCP SPT=45528 DPT=9101 SEQ=824576425 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B01C9DA0000000001030307)
Oct 14 05:31:18 localhost ovn_controller[157396]: 2025-10-14T09:31:18Z|00044|memory|INFO|18952 kB peak resident set size after 25.0 seconds
Oct 14 05:31:18 localhost ovn_controller[157396]: 2025-10-14T09:31:18Z|00045|memory|INFO|idl-cells-OVN_Southbound:3978 idl-cells-Open_vSwitch:1045 if_status_mgr_ifaces_state_usage-KB:1 if_status_mgr_ifaces_usage-KB:1 lflow-cache-entries-cache-expr:76 lflow-cache-entries-cache-matches:195 lflow-cache-size-KB:288 local_datapath_usage-KB:1 ofctrl_desired_flow_usage-KB:153 ofctrl_installed_flow_usage-KB:111 ofctrl_sb_flow_ref_usage-KB:66
Oct 14 05:31:18 localhost python3.9[159211]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-rootwrap.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 05:31:19 localhost python3.9[159282]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-rootwrap.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1760434278.3052573-376-157114008955713/.source.conf follow=False _original_basename=rootwrap.conf.j2 checksum=11f2cfb4b7d97b2cef3c2c2d88089e6999cffe22 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 14 05:31:19 localhost python3.9[159372]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 05:31:20 localhost python3.9[159443]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1760434279.497956-376-120279793588870/.source.conf follow=False _original_basename=neutron-ovn-metadata-agent.conf.j2 checksum=8bc979abbe81c2cf3993a225517a7e2483e20443 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 14 05:31:22 localhost python3.9[159533]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/10-neutron-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 05:31:22 localhost python3.9[159604]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/10-neutron-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1760434281.617631-508-140637942932595/.source.conf _original_basename=10-neutron-metadata.conf follow=False checksum=aa9e89725fbcebf7a5c773d7b97083445b7b7759 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 14 05:31:23 localhost python3.9[159694]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/05-nova-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 05:31:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=747 DF PROTO=TCP SPT=42850 DPT=9102 SEQ=4290030590 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B01DEB50000000001030307)
Oct 14 05:31:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f.
Oct 14 05:31:23 localhost python3.9[159765]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/05-nova-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1760434282.8009923-508-243338336882876/.source.conf _original_basename=05-nova-metadata.conf follow=False checksum=979187b925479d81d0609f4188e5b95fe1f92c18 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 14 05:31:23 localhost systemd[1]: tmp-crun.d0RKKk.mount: Deactivated successfully.
Oct 14 05:31:23 localhost podman[159766]: 2025-10-14 09:31:23.775796829 +0000 UTC m=+0.109303746 container health_status 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=starting, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.schema-version=1.0)
Oct 14 05:31:23 localhost podman[159766]: 2025-10-14 09:31:23.810054169 +0000 UTC m=+0.143561016 container exec_died 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20251009, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, io.buildah.version=1.41.3)
Oct 14 05:31:23 localhost systemd[1]: 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f.service: Deactivated successfully.
Oct 14 05:31:24 localhost python3.9[159879]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 14 05:31:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=748 DF PROTO=TCP SPT=42850 DPT=9102 SEQ=4290030590 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B01E2DA0000000001030307)
Oct 14 05:31:25 localhost python3.9[159973]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 14 05:31:26 localhost python3.9[160065]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 05:31:26 localhost python3.9[160113]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 14 05:31:27 localhost python3.9[160205]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 05:31:27 localhost python3.9[160253]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 14 05:31:28 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48601 DF PROTO=TCP SPT=51084 DPT=9882 SEQ=1731472783 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B01F17B0000000001030307)
Oct 14 05:31:28 localhost python3.9[160345]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 05:31:29 localhost python3.9[160437]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 05:31:29 localhost python3.9[160485]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 05:31:30 localhost python3.9[160577]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 05:31:30 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=750 DF PROTO=TCP SPT=42850 DPT=9102 SEQ=4290030590 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B01FA9A0000000001030307)
Oct 14 05:31:30 localhost python3.9[160625]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 05:31:31 localhost ovn_controller[157396]: 2025-10-14T09:31:31Z|00046|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Oct 14 05:31:31 localhost python3.9[160717]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 14 05:31:31 localhost systemd[1]: Reloading.
Oct 14 05:31:31 localhost systemd-sysv-generator[160743]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 05:31:31 localhost systemd-rc-local-generator[160737]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 05:31:31 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 05:31:33 localhost python3.9[160847]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 05:31:33 localhost python3.9[160895]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 05:31:34 localhost python3.9[160987]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 05:31:35 localhost python3.9[161035]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 05:31:35 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48604 DF PROTO=TCP SPT=51084 DPT=9882 SEQ=1731472783 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B020D5A0000000001030307)
Oct 14 05:31:36 localhost python3.9[161127]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 14 05:31:36 localhost systemd[1]: Reloading.
Oct 14 05:31:36 localhost systemd-rc-local-generator[161144]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 05:31:36 localhost systemd-sysv-generator[161150]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 05:31:36 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 05:31:36 localhost systemd[1]: Starting Create netns directory...
Oct 14 05:31:36 localhost systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Oct 14 05:31:36 localhost systemd[1]: netns-placeholder.service: Deactivated successfully.
Oct 14 05:31:36 localhost systemd[1]: Finished Create netns directory.
Oct 14 05:31:37 localhost python3.9[161261]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 14 05:31:37 localhost python3.9[161353]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_metadata_agent/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 05:31:38 localhost python3.9[161426]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_metadata_agent/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1760434297.4523056-962-153181467197613/.source _original_basename=healthcheck follow=False checksum=898a5a1fcd473cf731177fc866e3bd7ebf20a131 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Oct 14 05:31:38 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34191 DF PROTO=TCP SPT=56534 DPT=9105 SEQ=2974494904 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B021A5E0000000001030307)
Oct 14 05:31:39 localhost python3.9[161518]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 14 05:31:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34192 DF PROTO=TCP SPT=56534 DPT=9105 SEQ=2974494904 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B021E5A0000000001030307)
Oct 14 05:31:40 localhost python3.9[161610]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_metadata_agent.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 05:31:41 localhost python3.9[161685]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_metadata_agent.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1760434300.0144196-1036-67518345884634/.source.json _original_basename=.edg909_r follow=False checksum=a908ef151ded3a33ae6c9ac8be72a35e5e33b9dc backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 05:31:41 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34193 DF PROTO=TCP SPT=56534 DPT=9105 SEQ=2974494904 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B02265A0000000001030307)
Oct 14 05:31:41 localhost python3.9[161777]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 05:31:44 localhost python3.9[162034]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_pattern=*.json debug=False
Oct 14 05:31:45 localhost python3.9[162126]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Oct 14 05:31:45 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34194 DF PROTO=TCP SPT=56534 DPT=9105 SEQ=2974494904 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B02361A0000000001030307)
Oct 14 05:31:46 localhost python3.9[162218]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Oct 14 05:31:48 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64784 DF PROTO=TCP SPT=52434 DPT=9101 SEQ=2757707756 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B023EDA0000000001030307)
Oct 14 05:31:50 localhost python3[162337]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_id=ovn_metadata_agent config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Oct 14 05:31:50 localhost python3[162337]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: [#012 {#012 "Id": "c6d1b3e4cccd28b7c818995b8e8c01f80bc6d31844f018079ac974a1bc7ff587",#012 "Digest": "sha256:cc78c4a7fbd7c7348d3ee41420dd7c42d83eb1e76a8db6bb94a538a5d2f2c424",#012 "RepoTags": [#012 "quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified"#012 ],#012 "RepoDigests": [#012 
"quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:cc78c4a7fbd7c7348d3ee41420dd7c42d83eb1e76a8db6bb94a538a5d2f2c424"#012 ],#012 "Parent": "",#012 "Comment": "",#012 "Created": "2025-10-13T12:47:50.032440747Z",#012 "Config": {#012 "User": "neutron",#012 "Env": [#012 "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin",#012 "LANG=en_US.UTF-8",#012 "TZ=UTC",#012 "container=oci"#012 ],#012 "Entrypoint": [#012 "dumb-init",#012 "--single-child",#012 "--"#012 ],#012 "Cmd": [#012 "kolla_start"#012 ],#012 "Labels": {#012 "io.buildah.version": "1.41.3",#012 "maintainer": "OpenStack Kubernetes Operator team",#012 "org.label-schema.build-date": "20251009",#012 "org.label-schema.license": "GPLv2",#012 "org.label-schema.name": "CentOS Stream 9 Base Image",#012 "org.label-schema.schema-version": "1.0",#012 "org.label-schema.vendor": "CentOS",#012 "tcib_build_tag": "1e4eeec18f8da2b364b39b7a7358aef5",#012 "tcib_managed": "true"#012 },#012 "StopSignal": "SIGTERM"#012 },#012 "Version": "",#012 "Author": "",#012 "Architecture": "amd64",#012 "Os": "linux",#012 "Size": 783982852,#012 "VirtualSize": 783982852,#012 "GraphDriver": {#012 "Name": "overlay",#012 "Data": {#012 "LowerDir": "/var/lib/containers/storage/overlay/3d32571c90c517218e75b400153bfe2946f348989aeee2613f1e17f32183ce41/diff:/var/lib/containers/storage/overlay/a10bb81cada1063fdd09337579a73ba5c07dabd1b81c2bfe70924b91722bf534/diff:/var/lib/containers/storage/overlay/0accaf46e2ca98f20a95b21cea4fb623de0e5378cb14b163bca0a8771d84c861/diff:/var/lib/containers/storage/overlay/ab64777085904da680013c790c3f2c65f0b954578737ec4d7fa836f56655c34a/diff:/var/lib/containers/storage/overlay/f3f40f6483bf6d587286da9e86e40878c2aaaf723da5aa2364fff24f5ea28424/diff",#012 "UpperDir": "/var/lib/containers/storage/overlay/9dce2160573984ba54f17e563b839daf8c243479b9d2f49c1195fe30690bd2c9/diff",#012 "WorkDir": 
"/var/lib/containers/storage/overlay/9dce2160573984ba54f17e563b839daf8c243479b9d2f49c1195fe30690bd2c9/work"#012 }#012 },#012 "RootFS": {#012 "Type": "layers",#012 "Layers": [#012 "sha256:f3f40f6483bf6d587286da9e86e40878c2aaaf723da5aa2364fff24f5ea28424",#012 "sha256:2c35d1af0a6e73cbcf6c04a576d2e6a150aeaa6ae9408c81b2003edd71d6ae59",#012 "sha256:3ad61591f8d467f7db4e096e1991f274fe1d4f8ad685b553dacb57c5e894eab0",#012 "sha256:921303cda5c9d8779e6603d3888ac24385c443b872bec9c3138835df3416e3df",#012 "sha256:c059b89efb40f3097e4f1e24153e4ed15b8a660accccb7f6b341c8900767b90e",#012 "sha256:e4b986e48b4f8d2e3d4ecc6d2e17b8ac252dfafd4e4fec6074bd29e67b374a2f"#012 ]#012 },#012 "Labels": {#012 "io.buildah.version": "1.41.3",#012 "maintainer": "OpenStack Kubernetes Operator team",#012 "org.label-schema.build-date": "20251009",#012 "org.label-schema.license": "GPLv2",#012 "org.label-schema.name": "CentOS Stream 9 Base Image",#012 "org.label-schema.schema-version": "1.0",#012 "org.label-schema.vendor": "CentOS",#012 "tcib_build_tag": "1e4eeec18f8da2b364b39b7a7358aef5",#012 "tcib_managed": "true"#012 },#012 "Annotations": {},#012 "ManifestType": "application/vnd.docker.distribution.manifest.v2+json",#012 "User": "neutron",#012 "History": [#012 {#012 "created": "2025-10-09T00:18:03.867908726Z",#012 "created_by": "/bin/sh -c #(nop) ADD file:b2e608b9da8e087a764c2aebbd9c2cc9181047f5b301f1dab77fdf098a28268b in / ",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-10-09T00:18:03.868015697Z",#012 "created_by": "/bin/sh -c #(nop) LABEL org.label-schema.schema-version=\"1.0\" org.label-schema.name=\"CentOS Stream 9 Base Image\" org.label-schema.vendor=\"CentOS\" org.label-schema.license=\"GPLv2\" org.label-schema.build-date=\"20251009\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-10-09T00:18:07.890794359Z",#012 "created_by": "/bin/sh -c #(nop) CMD [\"/bin/bash\"]"#012 },#012 {#012 "created": "2025-10-13T12:28:42.843286399Z",#012 "created_by": "/bin/sh -c #(nop) LABEL 
maintainer=\"OpenStack Kubernetes Operator team\"",#012 "comment": "FROM quay.io/centos/centos:stream9",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-10-13T12:28:42.843354051Z",#012 "created_by": "/bin/sh -c #(nop) LABEL tcib_managed=true",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-10-13T12:28:42.843394192Z",#012 "created_by": "/bin/sh -c #(nop) ENV LANG=\"en_US.UTF-8\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-10-13T12:28:42.843417133Z",#012 "created_by": "/bin/sh -c #(nop) ENV TZ=\"UTC\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-10-13T12:28:42.843442193Z",#012 "created_by": "/bin/sh -c #(nop) ENV container=\"oci\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-10-13T12:28:42.843461914Z",#012 "created_by": "/bin/sh -c #(nop) USER root",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-10-13T12:28:43.236856724Z",#012 "created_by": "/bin/sh -c if [ -f \"/etc/yum.repos.d/ubi.repo\" ]; then rm -f /etc/yum.repos.d/ubi.repo && dnf clean all && rm -rf /var/cache/dnf; fi",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-10-13T12:29:17.539596691Z",#012 "created_by": "/bin/sh -c dnf install -y crudini && crudini --del /etc/dnf/dnf.conf main override_install_langs && crudini --set /etc/dnf/dnf.conf main clean_requirements_on_remove True && crudini --set /etc/dnf/dnf.conf main exactarch 1 && crudini --set /etc/dnf/dnf.conf main gpgcheck 1 && crudini --set /etc/dnf/dnf.conf main install_weak_deps False && if [ 'centos' == 'centos' ];then crudini --set /etc/dnf/dnf.conf main best False; fi && crudini --set /etc/dnf/dnf.conf main installonly_limit 0 && crudini --set /etc/dnf/dnf.conf main keepcache 0 && crudini --set /etc/dnf/dnf.conf main obsoletes 1 && crudini --set /etc/dnf/dnf.con Oct 14 05:31:50 localhost podman[162390]: 2025-10-14 09:31:50.649606916 +0000 UTC m=+0.093611789 container remove 7e9d1179a8f4f94e3276942fbe64369e67e966d31c190dc6fb0e5d6cf2cea97e 
(image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, tcib_managed=true, container_name=ovn_metadata_agent, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, vcs-type=git, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, managed_by=tripleo_ansible, release=1, build-date=2025-07-21T16:28:53, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, distribution-scope=public, vendor=Red Hat, Inc., version=17.1.9, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0a131c335ed9f542ed2a9fb22aa1dfa8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}) Oct 14 05:31:50 localhost python3[162337]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman rm --force ovn_metadata_agent Oct 14 05:31:50 localhost podman[162404]: Oct 14 05:31:50 localhost podman[162404]: 2025-10-14 09:31:50.771229392 +0000 UTC m=+0.100418930 container create 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251009) Oct 14 05:31:50 localhost podman[162404]: 2025-10-14 09:31:50.719966055 +0000 UTC m=+0.049155633 image pull quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified Oct 14 05:31:50 localhost python3[162337]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_metadata_agent --cgroupns=host --conmon-pidfile /run/ovn_metadata_agent.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311 --healthcheck-command /openstack/healthcheck --label config_id=ovn_metadata_agent --label container_name=ovn_metadata_agent --label managed_by=edpm_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', 
'/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']} --log-driver journald --log-level info --network host --pid host --privileged=True --user root --volume /run/openvswitch:/run/openvswitch:z --volume /var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z --volume /run/netns:/run/netns:shared --volume /var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/neutron:/var/lib/neutron:shared,z --volume /var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro --volume /var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro --volume /var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified Oct 14 05:31:51 localhost python3.9[162533]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Oct 14 05:31:52 localhost python3.9[162627]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None 
serole=None selevel=None setype=None attributes=None Oct 14 05:31:52 localhost python3.9[162703]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Oct 14 05:31:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49340 DF PROTO=TCP SPT=43408 DPT=9102 SEQ=2406185996 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B0253E40000000001030307) Oct 14 05:31:53 localhost python3.9[162826]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1760434313.019322-1300-53562919769255/source dest=/etc/systemd/system/edpm_ovn_metadata_agent.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:31:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f. Oct 14 05:31:54 localhost systemd[1]: tmp-crun.wmacV6.mount: Deactivated successfully. 
Oct 14 05:31:54 localhost podman[162873]: 2025-10-14 09:31:54.181471765 +0000 UTC m=+0.134007471 container health_status 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, io.buildah.version=1.41.3, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5) Oct 14 05:31:54 localhost podman[162873]: 2025-10-14 09:31:54.248012885 +0000 UTC m=+0.200548581 container exec_died 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, managed_by=edpm_ansible, 
org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true) Oct 14 05:31:54 localhost systemd[1]: 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f.service: Deactivated successfully. Oct 14 05:31:54 localhost python3.9[162872]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None Oct 14 05:31:54 localhost systemd[1]: Reloading. Oct 14 05:31:54 localhost systemd-rc-local-generator[162936]: /etc/rc.d/rc.local is not marked executable, skipping. Oct 14 05:31:54 localhost systemd-sysv-generator[162941]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. 
Oct 14 05:31:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49341 DF PROTO=TCP SPT=43408 DPT=9102 SEQ=2406185996 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B0257DB0000000001030307) Oct 14 05:31:54 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Oct 14 05:31:55 localhost python3.9[162994]: ansible-systemd Invoked with state=restarted name=edpm_ovn_metadata_agent.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Oct 14 05:31:55 localhost systemd[1]: Reloading. Oct 14 05:31:55 localhost systemd-rc-local-generator[163019]: /etc/rc.d/rc.local is not marked executable, skipping. Oct 14 05:31:55 localhost systemd-sysv-generator[163025]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Oct 14 05:31:55 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Oct 14 05:31:55 localhost systemd[1]: Starting ovn_metadata_agent container... Oct 14 05:31:55 localhost systemd[1]: Started libcrun container. 
Oct 14 05:31:55 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ac9b811e287a27c8ef5c68d48bd0adf4260cb499bb8096b1300c9e83115bd948/merged/etc/neutron.conf.d supports timestamps until 2038 (0x7fffffff) Oct 14 05:31:55 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ac9b811e287a27c8ef5c68d48bd0adf4260cb499bb8096b1300c9e83115bd948/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Oct 14 05:31:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857. Oct 14 05:31:55 localhost podman[163036]: 2025-10-14 09:31:55.903903187 +0000 UTC m=+0.135411025 container init 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, tcib_managed=true, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image) Oct 14 05:31:55 localhost ovn_metadata_agent[163050]: + sudo -E kolla_set_configs Oct 14 05:31:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857. Oct 14 05:31:55 localhost podman[163036]: 2025-10-14 09:31:55.944792212 +0000 UTC m=+0.176300020 container start 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251009, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible) Oct 14 05:31:55 localhost edpm-start-podman-container[163036]: ovn_metadata_agent Oct 14 05:31:56 localhost ovn_metadata_agent[163050]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json Oct 14 05:31:56 localhost ovn_metadata_agent[163050]: INFO:__main__:Validating config file Oct 14 05:31:56 localhost ovn_metadata_agent[163050]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS Oct 14 05:31:56 localhost ovn_metadata_agent[163050]: INFO:__main__:Copying service configuration files Oct 14 05:31:56 localhost ovn_metadata_agent[163050]: INFO:__main__:Deleting /etc/neutron/rootwrap.conf Oct 14 05:31:56 localhost ovn_metadata_agent[163050]: INFO:__main__:Copying /etc/neutron.conf.d/01-rootwrap.conf to /etc/neutron/rootwrap.conf Oct 14 05:31:56 localhost ovn_metadata_agent[163050]: INFO:__main__:Setting permission for /etc/neutron/rootwrap.conf Oct 14 05:31:56 localhost ovn_metadata_agent[163050]: INFO:__main__:Writing out command to execute Oct 14 05:31:56 localhost ovn_metadata_agent[163050]: INFO:__main__:Setting permission for /var/lib/neutron Oct 14 05:31:56 localhost ovn_metadata_agent[163050]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts Oct 14 05:31:56 localhost ovn_metadata_agent[163050]: INFO:__main__:Setting permission for 
/var/lib/neutron/.cache Oct 14 05:31:56 localhost ovn_metadata_agent[163050]: INFO:__main__:Setting permission for /var/lib/neutron/external Oct 14 05:31:56 localhost ovn_metadata_agent[163050]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy Oct 14 05:31:56 localhost ovn_metadata_agent[163050]: INFO:__main__:Setting permission for /var/lib/neutron/ovn_metadata_haproxy_wrapper Oct 14 05:31:56 localhost ovn_metadata_agent[163050]: INFO:__main__:Setting permission for /var/lib/neutron/metadata_proxy Oct 14 05:31:56 localhost ovn_metadata_agent[163050]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/haproxy-kill Oct 14 05:31:56 localhost ovn_metadata_agent[163050]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints Oct 14 05:31:56 localhost ovn_metadata_agent[163050]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints/333254bb87316156e96cebc0941f89c4b6bf7d0c72b62f2bd2e3f232ec27cb23 Oct 14 05:31:56 localhost ovn_metadata_agent[163050]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids Oct 14 05:31:56 localhost ovn_metadata_agent[163050]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids/7d0cd696-bdd7-4e70-9512-eb0d23640314.pid.haproxy Oct 14 05:31:56 localhost ovn_metadata_agent[163050]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy/7d0cd696-bdd7-4e70-9512-eb0d23640314.conf Oct 14 05:31:56 localhost ovn_metadata_agent[163050]: ++ cat /run_command Oct 14 05:31:56 localhost ovn_metadata_agent[163050]: + CMD=neutron-ovn-metadata-agent Oct 14 05:31:56 localhost ovn_metadata_agent[163050]: + ARGS= Oct 14 05:31:56 localhost ovn_metadata_agent[163050]: + sudo kolla_copy_cacerts Oct 14 05:31:56 localhost ovn_metadata_agent[163050]: Running command: 'neutron-ovn-metadata-agent' Oct 14 05:31:56 localhost ovn_metadata_agent[163050]: + [[ ! -n '' ]] Oct 14 05:31:56 localhost ovn_metadata_agent[163050]: + . 
kolla_extend_start Oct 14 05:31:56 localhost ovn_metadata_agent[163050]: + echo 'Running command: '\''neutron-ovn-metadata-agent'\''' Oct 14 05:31:56 localhost ovn_metadata_agent[163050]: + umask 0022 Oct 14 05:31:56 localhost ovn_metadata_agent[163050]: + exec neutron-ovn-metadata-agent Oct 14 05:31:56 localhost podman[163057]: 2025-10-14 09:31:56.054897032 +0000 UTC m=+0.103724843 container health_status 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=starting, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true, 
maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ovn_metadata_agent, org.label-schema.build-date=20251009, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS) Oct 14 05:31:56 localhost edpm-start-podman-container[163035]: Creating additional drop-in dependency for "ovn_metadata_agent" (28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857) Oct 14 05:31:56 localhost podman[163057]: 2025-10-14 09:31:56.088242184 +0000 UTC m=+0.137069995 container exec_died 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251009, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3) Oct 14 05:31:56 localhost systemd[1]: 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857.service: Deactivated successfully. Oct 14 05:31:56 localhost systemd[1]: Reloading. Oct 14 05:31:56 localhost systemd-sysv-generator[163128]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Oct 14 05:31:56 localhost systemd-rc-local-generator[163124]: /etc/rc.d/rc.local is not marked executable, skipping. Oct 14 05:31:56 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Oct 14 05:31:56 localhost systemd[1]: Started ovn_metadata_agent container. Oct 14 05:31:56 localhost systemd[1]: session-52.scope: Deactivated successfully. Oct 14 05:31:56 localhost systemd[1]: session-52.scope: Consumed 32.884s CPU time. Oct 14 05:31:56 localhost systemd-logind[760]: Session 52 logged out. Waiting for processes to exit. Oct 14 05:31:56 localhost systemd-logind[760]: Removed session 52. 
Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.701 163055 INFO neutron.common.config [-] Logging enabled!#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.701 163055 INFO neutron.common.config [-] /usr/bin/neutron-ovn-metadata-agent version 22.2.2.dev43#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.701 163055 DEBUG neutron.common.config [-] command line: /usr/bin/neutron-ovn-metadata-agent setup_logging /usr/lib/python3.9/site-packages/neutron/common/config.py:123#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.701 163055 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.702 163055 DEBUG neutron.agent.ovn.metadata_agent [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.702 163055 DEBUG neutron.agent.ovn.metadata_agent [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.702 163055 DEBUG neutron.agent.ovn.metadata_agent [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.702 163055 DEBUG neutron.agent.ovn.metadata_agent [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.702 163055 DEBUG 
neutron.agent.ovn.metadata_agent [-] agent_down_time = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.702 163055 DEBUG neutron.agent.ovn.metadata_agent [-] allow_bulk = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.702 163055 DEBUG neutron.agent.ovn.metadata_agent [-] api_extensions_path = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.702 163055 DEBUG neutron.agent.ovn.metadata_agent [-] api_paste_config = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.702 163055 DEBUG neutron.agent.ovn.metadata_agent [-] api_workers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.703 163055 DEBUG neutron.agent.ovn.metadata_agent [-] auth_ca_cert = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.703 163055 DEBUG neutron.agent.ovn.metadata_agent [-] auth_strategy = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.703 163055 DEBUG neutron.agent.ovn.metadata_agent [-] backlog = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.703 163055 DEBUG neutron.agent.ovn.metadata_agent [-] base_mac = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:31:57 
localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.703 163055 DEBUG neutron.agent.ovn.metadata_agent [-] bind_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.703 163055 DEBUG neutron.agent.ovn.metadata_agent [-] bind_port = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.703 163055 DEBUG neutron.agent.ovn.metadata_agent [-] client_socket_timeout = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.703 163055 DEBUG neutron.agent.ovn.metadata_agent [-] config_dir = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.703 163055 DEBUG neutron.agent.ovn.metadata_agent [-] config_file = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.703 163055 DEBUG neutron.agent.ovn.metadata_agent [-] config_source = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.703 163055 DEBUG neutron.agent.ovn.metadata_agent [-] control_exchange = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.704 163055 DEBUG neutron.agent.ovn.metadata_agent [-] core_plugin = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.704 163055 DEBUG neutron.agent.ovn.metadata_agent [-] debug = True 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.704 163055 DEBUG neutron.agent.ovn.metadata_agent [-] default_availability_zones = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.704 163055 DEBUG neutron.agent.ovn.metadata_agent [-] default_log_levels = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.704 163055 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_agent_notification = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.704 163055 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_lease_duration = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.704 163055 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_load_type = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.704 163055 DEBUG neutron.agent.ovn.metadata_agent [-] 
dns_domain = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.704 163055 DEBUG neutron.agent.ovn.metadata_agent [-] enable_new_agents = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.704 163055 DEBUG neutron.agent.ovn.metadata_agent [-] enable_traditional_dhcp = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.705 163055 DEBUG neutron.agent.ovn.metadata_agent [-] external_dns_driver = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.705 163055 DEBUG neutron.agent.ovn.metadata_agent [-] external_pids = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.705 163055 DEBUG neutron.agent.ovn.metadata_agent [-] filter_validation = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.705 163055 DEBUG neutron.agent.ovn.metadata_agent [-] global_physnet_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.705 163055 DEBUG neutron.agent.ovn.metadata_agent [-] host = np0005486733.localdomain log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.705 163055 DEBUG neutron.agent.ovn.metadata_agent [-] http_retries = 3 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.705 163055 DEBUG neutron.agent.ovn.metadata_agent [-] instance_format = [instance: %(uuid)s] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.705 163055 DEBUG neutron.agent.ovn.metadata_agent [-] instance_uuid_format = [instance: %(uuid)s] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.705 163055 DEBUG neutron.agent.ovn.metadata_agent [-] ipam_driver = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.706 163055 DEBUG neutron.agent.ovn.metadata_agent [-] ipv6_pd_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.706 163055 DEBUG neutron.agent.ovn.metadata_agent [-] log_config_append = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.706 163055 DEBUG neutron.agent.ovn.metadata_agent [-] log_date_format = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.706 163055 DEBUG neutron.agent.ovn.metadata_agent [-] log_dir = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.706 163055 DEBUG neutron.agent.ovn.metadata_agent [-] log_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:31:57 localhost 
ovn_metadata_agent[163050]: 2025-10-14 09:31:57.706 163055 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.706 163055 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval_type = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.706 163055 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotation_type = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.706 163055 DEBUG neutron.agent.ovn.metadata_agent [-] logging_context_format_string = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.706 163055 DEBUG neutron.agent.ovn.metadata_agent [-] logging_debug_format_suffix = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.706 163055 DEBUG neutron.agent.ovn.metadata_agent [-] logging_default_format_string = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.706 163055 DEBUG neutron.agent.ovn.metadata_agent [-] logging_exception_prefix = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:31:57 localhost 
ovn_metadata_agent[163050]: 2025-10-14 09:31:57.707 163055 DEBUG neutron.agent.ovn.metadata_agent [-] logging_user_identity_format = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.707 163055 DEBUG neutron.agent.ovn.metadata_agent [-] max_dns_nameservers = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.707 163055 DEBUG neutron.agent.ovn.metadata_agent [-] max_header_line = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.707 163055 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_count = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.707 163055 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_size_mb = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.707 163055 DEBUG neutron.agent.ovn.metadata_agent [-] max_subnet_host_routes = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.707 163055 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_backlog = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.707 163055 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_group = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 
09:31:57.707 163055 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.708 163055 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.708 163055 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket_mode = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.708 163055 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_user = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.708 163055 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_workers = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.708 163055 DEBUG neutron.agent.ovn.metadata_agent [-] network_link_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.708 163055 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.708 163055 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.708 163055 DEBUG 
neutron.agent.ovn.metadata_agent [-] nova_client_cert = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.708 163055 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_priv_key = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.708 163055 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_host = nova-metadata-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.709 163055 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.709 163055 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_port = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.709 163055 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_protocol = http log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.709 163055 DEBUG neutron.agent.ovn.metadata_agent [-] pagination_max_limit = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.709 163055 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_fuzzy_delay = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.709 163055 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_interval = 40 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.709 163055 DEBUG neutron.agent.ovn.metadata_agent [-] publish_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.709 163055 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_burst = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.709 163055 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_except_level = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.709 163055 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.709 163055 DEBUG neutron.agent.ovn.metadata_agent [-] retry_until_window = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.710 163055 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_resources_processing_step = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.710 163055 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_response_max_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.710 163055 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_state_report_workers = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:31:57 localhost 
ovn_metadata_agent[163050]: 2025-10-14 09:31:57.710 163055 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_workers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.710 163055 DEBUG neutron.agent.ovn.metadata_agent [-] send_events_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.710 163055 DEBUG neutron.agent.ovn.metadata_agent [-] service_plugins = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.710 163055 DEBUG neutron.agent.ovn.metadata_agent [-] setproctitle = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.710 163055 DEBUG neutron.agent.ovn.metadata_agent [-] state_path = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.710 163055 DEBUG neutron.agent.ovn.metadata_agent [-] syslog_log_facility = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.711 163055 DEBUG neutron.agent.ovn.metadata_agent [-] tcp_keepidle = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.711 163055 DEBUG neutron.agent.ovn.metadata_agent [-] transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.711 163055 DEBUG neutron.agent.ovn.metadata_agent [-] use_eventlog = False log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.711 163055 DEBUG neutron.agent.ovn.metadata_agent [-] use_journal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.711 163055 DEBUG neutron.agent.ovn.metadata_agent [-] use_json = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.711 163055 DEBUG neutron.agent.ovn.metadata_agent [-] use_ssl = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.711 163055 DEBUG neutron.agent.ovn.metadata_agent [-] use_stderr = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.711 163055 DEBUG neutron.agent.ovn.metadata_agent [-] use_syslog = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.711 163055 DEBUG neutron.agent.ovn.metadata_agent [-] vlan_transparent = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.711 163055 DEBUG neutron.agent.ovn.metadata_agent [-] watch_log_file = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.711 163055 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_default_pool_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.712 163055 DEBUG 
neutron.agent.ovn.metadata_agent [-] wsgi_keep_alive = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.712 163055 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_log_format = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.712 163055 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_server_debug = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.712 163055 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.712 163055 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.lock_path = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.712 163055 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.connection_string = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.712 163055 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.712 163055 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_doc_type = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 
09:31:57.712 163055 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_size = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.713 163055 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_time = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.713 163055 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.filter_error_trace = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.713 163055 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.hmac_keys = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.713 163055 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.713 163055 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.socket_timeout = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.713 163055 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.trace_sqlalchemy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.713 163055 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.713 163055 DEBUG neutron.agent.ovn.metadata_agent 
[-] oslo_policy.enforce_scope = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.713 163055 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.714 163055 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_dirs = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.714 163055 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_file = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.714 163055 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.714 163055 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.714 163055 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.714 163055 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.714 163055 DEBUG 
neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.714 163055 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.715 163055 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.715 163055 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_process_name = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.715 163055 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.715 163055 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.715 163055 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.715 163055 DEBUG neutron.agent.ovn.metadata_agent [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 
14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.715 163055 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.capabilities = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.715 163055 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.715 163055 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.716 163055 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.716 163055 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.716 163055 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.716 163055 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.716 163055 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 
09:31:57.716 163055 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.716 163055 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.716 163055 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.716 163055 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.716 163055 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.717 163055 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.717 163055 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.717 163055 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 
2025-10-14 09:31:57.717 163055 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.717 163055 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.717 163055 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.717 163055 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.717 163055 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.717 163055 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.718 163055 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.718 163055 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.718 163055 DEBUG 
neutron.agent.ovn.metadata_agent [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.718 163055 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.718 163055 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.718 163055 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.718 163055 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.718 163055 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.719 163055 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.capabilities = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.719 163055 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.719 163055 DEBUG neutron.agent.ovn.metadata_agent [-] 
privsep_link.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.719 163055 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.719 163055 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.719 163055 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.719 163055 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.719 163055 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.719 163055 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.comment_iptables_rules = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.719 163055 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.debug_iptables_rules = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.720 163055 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.kill_scripts_path = 
/etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.720 163055 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.720 163055 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper_daemon = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.720 163055 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_helper_for_ns_read = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.720 163055 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_random_fully = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.720 163055 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.720 163055 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.default_quota = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.720 163055 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_driver = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.720 163055 DEBUG 
neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_network = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.721 163055 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_port = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.721 163055 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.721 163055 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.721 163055 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_subnet = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.721 163055 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.track_quota_usage = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.721 163055 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.721 163055 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.721 163055 DEBUG neutron.agent.ovn.metadata_agent [-] nova.cafile = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.721 163055 DEBUG neutron.agent.ovn.metadata_agent [-] nova.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.722 163055 DEBUG neutron.agent.ovn.metadata_agent [-] nova.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.722 163055 DEBUG neutron.agent.ovn.metadata_agent [-] nova.endpoint_type = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.722 163055 DEBUG neutron.agent.ovn.metadata_agent [-] nova.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.722 163055 DEBUG neutron.agent.ovn.metadata_agent [-] nova.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.722 163055 DEBUG neutron.agent.ovn.metadata_agent [-] nova.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.722 163055 DEBUG neutron.agent.ovn.metadata_agent [-] nova.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.722 163055 DEBUG neutron.agent.ovn.metadata_agent [-] nova.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.722 
163055 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.722 163055 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.723 163055 DEBUG neutron.agent.ovn.metadata_agent [-] placement.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.723 163055 DEBUG neutron.agent.ovn.metadata_agent [-] placement.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.723 163055 DEBUG neutron.agent.ovn.metadata_agent [-] placement.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.723 163055 DEBUG neutron.agent.ovn.metadata_agent [-] placement.endpoint_type = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.723 163055 DEBUG neutron.agent.ovn.metadata_agent [-] placement.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.723 163055 DEBUG neutron.agent.ovn.metadata_agent [-] placement.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.723 163055 DEBUG neutron.agent.ovn.metadata_agent [-] placement.region_name = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.723 163055 DEBUG neutron.agent.ovn.metadata_agent [-] placement.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.723 163055 DEBUG neutron.agent.ovn.metadata_agent [-] placement.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.724 163055 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.724 163055 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.724 163055 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.724 163055 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.724 163055 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.724 163055 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 
2025-10-14 09:31:57.724 163055 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.724 163055 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.enable_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.724 163055 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.724 163055 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.725 163055 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.725 163055 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.725 163055 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.725 163055 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.725 163055 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.region_name = None 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.725 163055 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.725 163055 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.725 163055 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.726 163055 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.726 163055 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.726 163055 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.726 163055 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.valid_interfaces = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.726 163055 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:31:57 localhost 
ovn_metadata_agent[163050]: 2025-10-14 09:31:57.726 163055 DEBUG neutron.agent.ovn.metadata_agent [-] cli_script.dry_run = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.726 163055 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.726 163055 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dhcp_default_lease_time = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.726 163055 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.727 163055 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dns_servers = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.727 163055 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.727 163055 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.neutron_sync_mode = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.727 163055 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp4_global_options = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.727 
163055 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp6_global_options = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.727 163055 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_emit_need_to_frag = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.727 163055 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_mode = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.727 163055 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_scheduler = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.727 163055 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_metadata_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.728 163055 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_ca_cert = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.728 163055 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_certificate = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.728 163055 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_connection = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.728 163055 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_private_key = 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.728 163055 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_ca_cert = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.728 163055 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_certificate = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.728 163055 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_connection = tcp:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.728 163055 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_private_key = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.728 163055 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_connection_timeout = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.729 163055 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_log_level = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.729 163055 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_probe_interval = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.729 163055 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_retry_max_interval = 180 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.729 163055 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vhost_sock_dir = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.729 163055 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vif_type = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.729 163055 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.bridge_mac_table_size = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.729 163055 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.igmp_snooping_enable = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.729 163055 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ovsdb_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.729 163055 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.730 163055 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection_timeout = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.730 163055 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.730 163055 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.730 163055 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.730 163055 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.730 163055 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.730 163055 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.730 163055 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.730 163055 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.730 163055 DEBUG neutron.agent.ovn.metadata_agent [-] 
oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.731 163055 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.731 163055 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.731 163055 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.731 163055 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.731 163055 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.731 163055 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.731 163055 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:31:57 localhost 
ovn_metadata_agent[163050]: 2025-10-14 09:31:57.731 163055 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.731 163055 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.732 163055 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.732 163055 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.732 163055 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.732 163055 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.732 163055 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.732 163055 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.732 163055 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.732 163055 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.732 163055 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_ca_file = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.732 163055 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_cert_file = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.733 163055 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.733 163055 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_key_file = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.733 163055 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_version = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.733 163055 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.driver = [] log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.733 163055 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.733 163055 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.733 163055 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.733 163055 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.799 163055 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.799 163055 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.799 163055 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.800 163055 INFO 
ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connecting...#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.800 163055 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connected#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.817 163055 DEBUG neutron.agent.ovn.metadata.agent [-] Loaded chassis name 9e4b0f79-1220-4c7d-a18d-fa1a88dab362 (UUID: 9e4b0f79-1220-4c7d-a18d-fa1a88dab362) and ovn bridge br-int. _load_config /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:309#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.833 163055 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.833 163055 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.834 163055 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.834 163055 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Chassis_Private.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.836 163055 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.839 163055 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:ovsdbserver-sb.openstack.svc:6642: connected#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 
09:31:57.847 163055 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched CREATE: PortBindingCreateWithChassis(events=('create',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:84:5e:e5 192.168.0.46'], port_security=['fa:16:3e:84:5e:e5 192.168.0.46'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005486733.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '192.168.0.46/24', 'neutron:device_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'neutron:device_owner': 'compute:nova', 'neutron:host_id': 'np0005486733.localdomain', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7d0cd696-bdd7-4e70-9512-eb0d23640314', 'neutron:port_capabilities': '', 'neutron:port_fip': '192.168.122.20', 'neutron:port_name': '', 'neutron:project_id': '41187b090f3d4818a32baa37ce8a3991', 'neutron:revision_number': '6', 'neutron:security_group_ids': '313d605c-14d3-4f16-b913-a4f55afa256e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0d31a249-7ee5-4da6-a9d1-dab19bbf097c, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[], logical_port=3ec9b060-f43d-4698-9c76-6062c70911d5) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.848 163055 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched CREATE: ChassisPrivateCreateEvent(events=('create',), table='Chassis_Private', conditions=(('name', '=', '9e4b0f79-1220-4c7d-a18d-fa1a88dab362'),), old_conditions=None), priority=20 to row=Chassis_Private(chassis=[], external_ids={'neutron:ovn-metadata-id': '3f69ebf8-22bb-5c58-9aa2-18a03394eccc', 'neutron:ovn-metadata-sb-cfg': '1'}, name=9e4b0f79-1220-4c7d-a18d-fa1a88dab362, 
nb_cfg_timestamp=1760434261254, nb_cfg=3) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.849 163055 INFO neutron.agent.ovn.metadata.agent [-] Port 3ec9b060-f43d-4698-9c76-6062c70911d5 in datapath 7d0cd696-bdd7-4e70-9512-eb0d23640314 bound to our chassis on insert#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.849 163055 DEBUG neutron_lib.callbacks.manager [-] Subscribe: > process after_init 55550000, False subscribe /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:52#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.850 163055 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.850 163055 DEBUG oslo_concurrency.lockutils [-] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.850 163055 DEBUG oslo_concurrency.lockutils [-] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.850 163055 INFO oslo_service.service [-] Starting 1 workers#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.853 163055 DEBUG oslo_service.service [-] Started child 163154 _start_child /usr/lib/python3.9/site-packages/oslo_service/service.py:575#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.856 163055 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 7d0cd696-bdd7-4e70-9512-eb0d23640314#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.857 
163154 DEBUG neutron_lib.callbacks.manager [-] Publish callbacks ['neutron.agent.ovn.metadata.server.MetadataProxyHandler.post_fork_initialize-192465'] for process (None), after_init _notify_loop /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:184#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.858 163055 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.default', '--privsep_sock_path', '/tmp/tmp1tuqbpbs/privsep.sock']#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.876 163154 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.877 163154 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.877 163154 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.879 163154 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.880 163154 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:ovsdbserver-sb.openstack.svc:6642: connected#033[00m Oct 14 05:31:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:57.888 163154 INFO eventlet.wsgi.server [-] (163154) wsgi starting up on http:/var/lib/neutron/metadata_proxy#033[00m Oct 14 05:31:58 localhost kernel: 
DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=824 DF PROTO=TCP SPT=59410 DPT=9882 SEQ=3810009322 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B0266AB0000000001030307) Oct 14 05:31:58 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:58.489 163055 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap#033[00m Oct 14 05:31:58 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:58.489 163055 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmp1tuqbpbs/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362#033[00m Oct 14 05:31:58 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:58.367 163159 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m Oct 14 05:31:58 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:58.372 163159 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m Oct 14 05:31:58 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:58.373 163159 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/none#033[00m Oct 14 05:31:58 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:58.374 163159 INFO oslo.privsep.daemon [-] privsep daemon running as pid 163159#033[00m Oct 14 05:31:58 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:58.492 163159 DEBUG oslo.privsep.daemon [-] privsep: reply[a7235f07-bee5-4323-af4f-d4578336682e]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Oct 14 05:31:58 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:58.914 163159 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner 
/usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Oct 14 05:31:58 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:58.915 163159 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Oct 14 05:31:58 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:58.915 163159 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Oct 14 05:31:59 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:59.412 163159 DEBUG oslo.privsep.daemon [-] privsep: reply[60001a3f-b897-4e0e-8ae0-4eb4cb0e20dc]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Oct 14 05:31:59 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:59.414 163055 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.link_cmd', '--privsep_sock_path', '/tmp/tmpazbvjthn/privsep.sock']#033[00m Oct 14 05:32:00 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:00.035 163055 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap#033[00m Oct 14 05:32:00 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:00.036 163055 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmpazbvjthn/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362#033[00m Oct 14 05:32:00 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:59.939 163170 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m Oct 14 05:32:00 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:59.943 163170 INFO 
oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m Oct 14 05:32:00 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:59.945 163170 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_NET_ADMIN|CAP_SYS_ADMIN/none#033[00m Oct 14 05:32:00 localhost ovn_metadata_agent[163050]: 2025-10-14 09:31:59.945 163170 INFO oslo.privsep.daemon [-] privsep daemon running as pid 163170#033[00m Oct 14 05:32:00 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:00.040 163170 DEBUG oslo.privsep.daemon [-] privsep: reply[6069d0ea-1643-4807-93a3-4c7efed9a247]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Oct 14 05:32:00 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:00.577 163170 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Oct 14 05:32:00 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:00.577 163170 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Oct 14 05:32:00 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:00.577 163170 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Oct 14 05:32:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49343 DF PROTO=TCP SPT=43408 DPT=9102 SEQ=2406185996 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B026F9B0000000001030307) Oct 14 05:32:01 localhost ovn_metadata_agent[163050]: 
2025-10-14 09:32:01.079 163170 DEBUG oslo.privsep.daemon [-] privsep: reply[2c1fc90e-48db-46b2-a338-2d4a8283f7c1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Oct 14 05:32:01 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:01.082 163170 DEBUG oslo.privsep.daemon [-] privsep: reply[5f0be4c8-1bef-4a84-851a-fad4ebf28ead]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Oct 14 05:32:01 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:01.105 163170 DEBUG oslo.privsep.daemon [-] privsep: reply[3398bc45-fd0c-41e1-bb3a-c1b8445b8554]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Oct 14 05:32:01 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:01.126 163159 DEBUG oslo.privsep.daemon [-] privsep: reply[48b47fed-af73-4859-9dcf-742aa77b9e51]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7d0cd696-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_QDISC', 'noqueue'], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['IFLA_ADDRESS', 'fa:16:3e:7e:3c:60'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 104, 'tx_packets': 68, 'rx_bytes': 8926, 'tx_bytes': 7142, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 
0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 104, 'tx_packets': 68, 'rx_bytes': 8926, 'tx_bytes': 7142, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 14], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 1, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483664], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 705865, 'reachable_time': 37779, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 
86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 37, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 17, 'outoctets': 1164, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 17, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 1164, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 17, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}]], 'header': {'length': 1400, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 163180, 'error': None, 'target': 'ovnmeta-7d0cd696-bdd7-4e70-9512-eb0d23640314', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Oct 14 05:32:01 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:01.146 163159 DEBUG oslo.privsep.daemon [-] privsep: reply[442cd330-ded2-49c2-8ff5-4683af6344ca]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap7d0cd696-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 
4294967295, 'ifa_valid': 4294967295, 'cstamp': 705872, 'tstamp': 705872}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 163181, 'error': None, 'target': 'ovnmeta-7d0cd696-bdd7-4e70-9512-eb0d23640314', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 24, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '192.168.0.2'], ['IFA_LOCAL', '192.168.0.2'], ['IFA_BROADCAST', '192.168.0.255'], ['IFA_LABEL', 'tap7d0cd696-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 705876, 'tstamp': 705876}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 163181, 'error': None, 'target': 'ovnmeta-7d0cd696-bdd7-4e70-9512-eb0d23640314', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 10, 'prefixlen': 64, 'flags': 128, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::a9fe:a9fe'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 705875, 'tstamp': 705875}], ['IFA_FLAGS', 128]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 163181, 'error': None, 'target': 'ovnmeta-7d0cd696-bdd7-4e70-9512-eb0d23640314', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 10, 'prefixlen': 64, 'flags': 128, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe7e:3c60'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 705865, 'tstamp': 705865}], ['IFA_FLAGS', 128]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 163181, 'error': None, 'target': 'ovnmeta-7d0cd696-bdd7-4e70-9512-eb0d23640314', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Oct 14 05:32:01 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:01.205 163159 DEBUG oslo.privsep.daemon [-] privsep: 
reply[d82c62dc-82bc-4512-af9d-ac1b687b4cd8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Oct 14 05:32:01 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:01.208 163055 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7d0cd696-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Oct 14 05:32:01 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:01.213 163055 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7d0cd696-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Oct 14 05:32:01 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:01.214 163055 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m Oct 14 05:32:01 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:01.215 163055 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap7d0cd696-b0, col_values=(('external_ids', {'iface-id': '25c6586a-239c-451b-aac2-e0a3ee5c3145'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Oct 14 05:32:01 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:01.215 163055 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m Oct 14 05:32:01 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:01.220 163055 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', 
'--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.namespace_cmd', '--privsep_sock_path', '/tmp/tmpgenh15rs/privsep.sock']#033[00m Oct 14 05:32:01 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:01.820 163055 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap#033[00m Oct 14 05:32:01 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:01.821 163055 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmpgenh15rs/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362#033[00m Oct 14 05:32:01 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:01.729 163190 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m Oct 14 05:32:01 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:01.733 163190 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m Oct 14 05:32:01 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:01.736 163190 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_SYS_ADMIN/CAP_SYS_ADMIN/none#033[00m Oct 14 05:32:01 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:01.736 163190 INFO oslo.privsep.daemon [-] privsep daemon running as pid 163190#033[00m Oct 14 05:32:01 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:01.823 163190 DEBUG oslo.privsep.daemon [-] privsep: reply[75fc8c49-5e15-449c-bae2-4009953698b8]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.267 163190 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.267 163190 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: 
waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.267 163190 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.725 163190 DEBUG oslo.privsep.daemon [-] privsep: reply[921d3ae9-cf94-4505-ad18-91a96e961771]: (4, ['ovnmeta-7d0cd696-bdd7-4e70-9512-eb0d23640314']) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.727 163055 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbAddCommand(_result=None, table=Chassis_Private, record=9e4b0f79-1220-4c7d-a18d-fa1a88dab362, column=external_ids, values=({'neutron:ovn-metadata-id': '3f69ebf8-22bb-5c58-9aa2-18a03394eccc'},)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.728 163055 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.729 163055 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=9e4b0f79-1220-4c7d-a18d-fa1a88dab362, col_values=(('external_ids', {'neutron:ovn-bridge': 'br-int'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.795 163055 DEBUG oslo_service.service [-] Full set of CONF: wait 
/usr/lib/python3.9/site-packages/oslo_service/service.py:649#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.795 163055 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.795 163055 DEBUG oslo_service.service [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.795 163055 DEBUG oslo_service.service [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.795 163055 DEBUG oslo_service.service [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.796 163055 DEBUG oslo_service.service [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.796 163055 DEBUG oslo_service.service [-] agent_down_time = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.796 163055 DEBUG oslo_service.service [-] allow_bulk = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.796 163055 DEBUG oslo_service.service [-] api_extensions_path = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:32:02 localhost 
ovn_metadata_agent[163050]: 2025-10-14 09:32:02.796 163055 DEBUG oslo_service.service [-] api_paste_config = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.796 163055 DEBUG oslo_service.service [-] api_workers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.797 163055 DEBUG oslo_service.service [-] auth_ca_cert = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.797 163055 DEBUG oslo_service.service [-] auth_strategy = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.797 163055 DEBUG oslo_service.service [-] backlog = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.797 163055 DEBUG oslo_service.service [-] base_mac = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.797 163055 DEBUG oslo_service.service [-] bind_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.798 163055 DEBUG oslo_service.service [-] bind_port = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.798 163055 DEBUG oslo_service.service [-] client_socket_timeout = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 
2025-10-14 09:32:02.798 163055 DEBUG oslo_service.service [-] config_dir = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.798 163055 DEBUG oslo_service.service [-] config_file = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.798 163055 DEBUG oslo_service.service [-] config_source = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.798 163055 DEBUG oslo_service.service [-] control_exchange = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.799 163055 DEBUG oslo_service.service [-] core_plugin = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.799 163055 DEBUG oslo_service.service [-] debug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.799 163055 DEBUG oslo_service.service [-] default_availability_zones = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.799 163055 DEBUG oslo_service.service [-] default_log_levels = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 
'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.799 163055 DEBUG oslo_service.service [-] dhcp_agent_notification = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.800 163055 DEBUG oslo_service.service [-] dhcp_lease_duration = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.800 163055 DEBUG oslo_service.service [-] dhcp_load_type = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.800 163055 DEBUG oslo_service.service [-] dns_domain = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.800 163055 DEBUG oslo_service.service [-] enable_new_agents = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.800 163055 DEBUG oslo_service.service [-] enable_traditional_dhcp = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.800 163055 DEBUG oslo_service.service [-] external_dns_driver = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.800 163055 DEBUG 
oslo_service.service [-] external_pids = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.801 163055 DEBUG oslo_service.service [-] filter_validation = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.801 163055 DEBUG oslo_service.service [-] global_physnet_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.801 163055 DEBUG oslo_service.service [-] graceful_shutdown_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.801 163055 DEBUG oslo_service.service [-] host = np0005486733.localdomain log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.801 163055 DEBUG oslo_service.service [-] http_retries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.801 163055 DEBUG oslo_service.service [-] instance_format = [instance: %(uuid)s] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.802 163055 DEBUG oslo_service.service [-] instance_uuid_format = [instance: %(uuid)s] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.802 163055 DEBUG oslo_service.service [-] ipam_driver = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:32:02 localhost 
ovn_metadata_agent[163050]: 2025-10-14 09:32:02.802 163055 DEBUG oslo_service.service [-] ipv6_pd_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.802 163055 DEBUG oslo_service.service [-] log_config_append = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.802 163055 DEBUG oslo_service.service [-] log_date_format = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.802 163055 DEBUG oslo_service.service [-] log_dir = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.803 163055 DEBUG oslo_service.service [-] log_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.803 163055 DEBUG oslo_service.service [-] log_options = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.803 163055 DEBUG oslo_service.service [-] log_rotate_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.803 163055 DEBUG oslo_service.service [-] log_rotate_interval_type = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.803 163055 DEBUG oslo_service.service [-] log_rotation_type = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:32:02 localhost 
ovn_metadata_agent[163050]: 2025-10-14 09:32:02.803 163055 DEBUG oslo_service.service [-] logging_context_format_string = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.804 163055 DEBUG oslo_service.service [-] logging_debug_format_suffix = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.804 163055 DEBUG oslo_service.service [-] logging_default_format_string = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.804 163055 DEBUG oslo_service.service [-] logging_exception_prefix = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.804 163055 DEBUG oslo_service.service [-] logging_user_identity_format = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.804 163055 DEBUG oslo_service.service [-] max_dns_nameservers = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.804 163055 DEBUG oslo_service.service [-] max_header_line = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:32:02 localhost 
ovn_metadata_agent[163050]: 2025-10-14 09:32:02.805 163055 DEBUG oslo_service.service [-] max_logfile_count = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.805 163055 DEBUG oslo_service.service [-] max_logfile_size_mb = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.805 163055 DEBUG oslo_service.service [-] max_subnet_host_routes = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.805 163055 DEBUG oslo_service.service [-] metadata_backlog = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.805 163055 DEBUG oslo_service.service [-] metadata_proxy_group = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.805 163055 DEBUG oslo_service.service [-] metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.806 163055 DEBUG oslo_service.service [-] metadata_proxy_socket = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.806 163055 DEBUG oslo_service.service [-] metadata_proxy_socket_mode = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.806 163055 DEBUG oslo_service.service [-] metadata_proxy_user = log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.806 163055 DEBUG oslo_service.service [-] metadata_workers = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.806 163055 DEBUG oslo_service.service [-] network_link_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.806 163055 DEBUG oslo_service.service [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.806 163055 DEBUG oslo_service.service [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.807 163055 DEBUG oslo_service.service [-] nova_client_cert = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.807 163055 DEBUG oslo_service.service [-] nova_client_priv_key = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.807 163055 DEBUG oslo_service.service [-] nova_metadata_host = nova-metadata-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.807 163055 DEBUG oslo_service.service [-] nova_metadata_insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.807 163055 DEBUG 
oslo_service.service [-] nova_metadata_port = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.807 163055 DEBUG oslo_service.service [-] nova_metadata_protocol = http log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.808 163055 DEBUG oslo_service.service [-] pagination_max_limit = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.808 163055 DEBUG oslo_service.service [-] periodic_fuzzy_delay = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.808 163055 DEBUG oslo_service.service [-] periodic_interval = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.808 163055 DEBUG oslo_service.service [-] publish_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.808 163055 DEBUG oslo_service.service [-] rate_limit_burst = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.808 163055 DEBUG oslo_service.service [-] rate_limit_except_level = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.809 163055 DEBUG oslo_service.service [-] rate_limit_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.809 163055 DEBUG 
oslo_service.service [-] retry_until_window = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.809 163055 DEBUG oslo_service.service [-] rpc_resources_processing_step = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.809 163055 DEBUG oslo_service.service [-] rpc_response_max_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.809 163055 DEBUG oslo_service.service [-] rpc_state_report_workers = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.809 163055 DEBUG oslo_service.service [-] rpc_workers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.809 163055 DEBUG oslo_service.service [-] send_events_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.810 163055 DEBUG oslo_service.service [-] service_plugins = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.810 163055 DEBUG oslo_service.service [-] setproctitle = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.810 163055 DEBUG oslo_service.service [-] state_path = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.810 163055 DEBUG 
oslo_service.service [-] syslog_log_facility = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.810 163055 DEBUG oslo_service.service [-] tcp_keepidle = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.810 163055 DEBUG oslo_service.service [-] transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.811 163055 DEBUG oslo_service.service [-] use_eventlog = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.811 163055 DEBUG oslo_service.service [-] use_journal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.811 163055 DEBUG oslo_service.service [-] use_json = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.811 163055 DEBUG oslo_service.service [-] use_ssl = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.811 163055 DEBUG oslo_service.service [-] use_stderr = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.811 163055 DEBUG oslo_service.service [-] use_syslog = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.812 163055 DEBUG oslo_service.service [-] vlan_transparent = False 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.812 163055 DEBUG oslo_service.service [-] watch_log_file = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.812 163055 DEBUG oslo_service.service [-] wsgi_default_pool_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.812 163055 DEBUG oslo_service.service [-] wsgi_keep_alive = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.812 163055 DEBUG oslo_service.service [-] wsgi_log_format = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.812 163055 DEBUG oslo_service.service [-] wsgi_server_debug = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.813 163055 DEBUG oslo_service.service [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.813 163055 DEBUG oslo_service.service [-] oslo_concurrency.lock_path = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.813 163055 DEBUG oslo_service.service [-] profiler.connection_string = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 
14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.813 163055 DEBUG oslo_service.service [-] profiler.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.813 163055 DEBUG oslo_service.service [-] profiler.es_doc_type = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.813 163055 DEBUG oslo_service.service [-] profiler.es_scroll_size = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.814 163055 DEBUG oslo_service.service [-] profiler.es_scroll_time = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.814 163055 DEBUG oslo_service.service [-] profiler.filter_error_trace = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.814 163055 DEBUG oslo_service.service [-] profiler.hmac_keys = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.814 163055 DEBUG oslo_service.service [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.814 163055 DEBUG oslo_service.service [-] profiler.socket_timeout = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.814 163055 DEBUG oslo_service.service [-] profiler.trace_sqlalchemy = False log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.815 163055 DEBUG oslo_service.service [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.815 163055 DEBUG oslo_service.service [-] oslo_policy.enforce_scope = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.815 163055 DEBUG oslo_service.service [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.815 163055 DEBUG oslo_service.service [-] oslo_policy.policy_dirs = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.815 163055 DEBUG oslo_service.service [-] oslo_policy.policy_file = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.816 163055 DEBUG oslo_service.service [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.816 163055 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.816 163055 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:32:02 
localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.816 163055 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.816 163055 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.816 163055 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.817 163055 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.817 163055 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_process_name = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.817 163055 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.817 163055 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.817 163055 DEBUG oslo_service.service [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:32:02 
localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.817 163055 DEBUG oslo_service.service [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.818 163055 DEBUG oslo_service.service [-] privsep.capabilities = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.818 163055 DEBUG oslo_service.service [-] privsep.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.818 163055 DEBUG oslo_service.service [-] privsep.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.818 163055 DEBUG oslo_service.service [-] privsep.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.818 163055 DEBUG oslo_service.service [-] privsep.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.818 163055 DEBUG oslo_service.service [-] privsep.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.818 163055 DEBUG oslo_service.service [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.819 163055 DEBUG oslo_service.service [-] privsep_dhcp_release.group = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.819 163055 DEBUG oslo_service.service [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.819 163055 DEBUG oslo_service.service [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.819 163055 DEBUG oslo_service.service [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.819 163055 DEBUG oslo_service.service [-] privsep_dhcp_release.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.819 163055 DEBUG oslo_service.service [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.820 163055 DEBUG oslo_service.service [-] privsep_ovs_vsctl.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.820 163055 DEBUG oslo_service.service [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.820 163055 DEBUG oslo_service.service [-] privsep_ovs_vsctl.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:32:02 localhost 
ovn_metadata_agent[163050]: 2025-10-14 09:32:02.820 163055 DEBUG oslo_service.service [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.820 163055 DEBUG oslo_service.service [-] privsep_ovs_vsctl.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.821 163055 DEBUG oslo_service.service [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.821 163055 DEBUG oslo_service.service [-] privsep_namespace.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.821 163055 DEBUG oslo_service.service [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.821 163055 DEBUG oslo_service.service [-] privsep_namespace.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.821 163055 DEBUG oslo_service.service [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.821 163055 DEBUG oslo_service.service [-] privsep_namespace.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.821 163055 DEBUG oslo_service.service [-] privsep_conntrack.capabilities = [12] 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.822 163055 DEBUG oslo_service.service [-] privsep_conntrack.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.822 163055 DEBUG oslo_service.service [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.822 163055 DEBUG oslo_service.service [-] privsep_conntrack.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.822 163055 DEBUG oslo_service.service [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.822 163055 DEBUG oslo_service.service [-] privsep_conntrack.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.822 163055 DEBUG oslo_service.service [-] privsep_link.capabilities = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.823 163055 DEBUG oslo_service.service [-] privsep_link.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.823 163055 DEBUG oslo_service.service [-] privsep_link.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 
2025-10-14 09:32:02.823 163055 DEBUG oslo_service.service [-] privsep_link.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.823 163055 DEBUG oslo_service.service [-] privsep_link.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.823 163055 DEBUG oslo_service.service [-] privsep_link.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.823 163055 DEBUG oslo_service.service [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.823 163055 DEBUG oslo_service.service [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.824 163055 DEBUG oslo_service.service [-] AGENT.comment_iptables_rules = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.824 163055 DEBUG oslo_service.service [-] AGENT.debug_iptables_rules = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.824 163055 DEBUG oslo_service.service [-] AGENT.kill_scripts_path = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.824 163055 DEBUG oslo_service.service [-] AGENT.root_helper = sudo neutron-rootwrap 
/etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.824 163055 DEBUG oslo_service.service [-] AGENT.root_helper_daemon = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.824 163055 DEBUG oslo_service.service [-] AGENT.use_helper_for_ns_read = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.825 163055 DEBUG oslo_service.service [-] AGENT.use_random_fully = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.825 163055 DEBUG oslo_service.service [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.825 163055 DEBUG oslo_service.service [-] QUOTAS.default_quota = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.825 163055 DEBUG oslo_service.service [-] QUOTAS.quota_driver = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.825 163055 DEBUG oslo_service.service [-] QUOTAS.quota_network = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.825 163055 DEBUG oslo_service.service [-] QUOTAS.quota_port = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:32:02 
localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.826 163055 DEBUG oslo_service.service [-] QUOTAS.quota_security_group = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.826 163055 DEBUG oslo_service.service [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.826 163055 DEBUG oslo_service.service [-] QUOTAS.quota_subnet = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.826 163055 DEBUG oslo_service.service [-] QUOTAS.track_quota_usage = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.826 163055 DEBUG oslo_service.service [-] nova.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.826 163055 DEBUG oslo_service.service [-] nova.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.826 163055 DEBUG oslo_service.service [-] nova.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.827 163055 DEBUG oslo_service.service [-] nova.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.827 163055 DEBUG oslo_service.service [-] nova.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 
05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.827 163055 DEBUG oslo_service.service [-] nova.endpoint_type = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.827 163055 DEBUG oslo_service.service [-] nova.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.827 163055 DEBUG oslo_service.service [-] nova.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.827 163055 DEBUG oslo_service.service [-] nova.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.827 163055 DEBUG oslo_service.service [-] nova.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.828 163055 DEBUG oslo_service.service [-] nova.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.828 163055 DEBUG oslo_service.service [-] placement.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.828 163055 DEBUG oslo_service.service [-] placement.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.828 163055 DEBUG oslo_service.service [-] placement.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:32:02 
localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.828 163055 DEBUG oslo_service.service [-] placement.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.828 163055 DEBUG oslo_service.service [-] placement.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.829 163055 DEBUG oslo_service.service [-] placement.endpoint_type = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.829 163055 DEBUG oslo_service.service [-] placement.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.829 163055 DEBUG oslo_service.service [-] placement.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.829 163055 DEBUG oslo_service.service [-] placement.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.829 163055 DEBUG oslo_service.service [-] placement.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.829 163055 DEBUG oslo_service.service [-] placement.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.829 163055 DEBUG oslo_service.service [-] ironic.auth_section = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.830 163055 DEBUG oslo_service.service [-] ironic.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.830 163055 DEBUG oslo_service.service [-] ironic.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.830 163055 DEBUG oslo_service.service [-] ironic.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.830 163055 DEBUG oslo_service.service [-] ironic.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.830 163055 DEBUG oslo_service.service [-] ironic.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.830 163055 DEBUG oslo_service.service [-] ironic.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.830 163055 DEBUG oslo_service.service [-] ironic.enable_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.831 163055 DEBUG oslo_service.service [-] ironic.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.831 163055 DEBUG oslo_service.service [-] ironic.insecure = 
False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.831 163055 DEBUG oslo_service.service [-] ironic.interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.831 163055 DEBUG oslo_service.service [-] ironic.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.831 163055 DEBUG oslo_service.service [-] ironic.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.831 163055 DEBUG oslo_service.service [-] ironic.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.832 163055 DEBUG oslo_service.service [-] ironic.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.832 163055 DEBUG oslo_service.service [-] ironic.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.832 163055 DEBUG oslo_service.service [-] ironic.service_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.832 163055 DEBUG oslo_service.service [-] ironic.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.832 163055 DEBUG oslo_service.service [-] 
ironic.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.832 163055 DEBUG oslo_service.service [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.833 163055 DEBUG oslo_service.service [-] ironic.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.833 163055 DEBUG oslo_service.service [-] ironic.valid_interfaces = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.833 163055 DEBUG oslo_service.service [-] ironic.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.833 163055 DEBUG oslo_service.service [-] cli_script.dry_run = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.833 163055 DEBUG oslo_service.service [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.833 163055 DEBUG oslo_service.service [-] ovn.dhcp_default_lease_time = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.834 163055 DEBUG oslo_service.service [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:32:02 localhost 
ovn_metadata_agent[163050]: 2025-10-14 09:32:02.834 163055 DEBUG oslo_service.service [-] ovn.dns_servers = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.834 163055 DEBUG oslo_service.service [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.834 163055 DEBUG oslo_service.service [-] ovn.neutron_sync_mode = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.834 163055 DEBUG oslo_service.service [-] ovn.ovn_dhcp4_global_options = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.834 163055 DEBUG oslo_service.service [-] ovn.ovn_dhcp6_global_options = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.835 163055 DEBUG oslo_service.service [-] ovn.ovn_emit_need_to_frag = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.835 163055 DEBUG oslo_service.service [-] ovn.ovn_l3_mode = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.835 163055 DEBUG oslo_service.service [-] ovn.ovn_l3_scheduler = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.835 163055 DEBUG oslo_service.service [-] ovn.ovn_metadata_enabled = False log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.835 163055 DEBUG oslo_service.service [-] ovn.ovn_nb_ca_cert = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.835 163055 DEBUG oslo_service.service [-] ovn.ovn_nb_certificate = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.835 163055 DEBUG oslo_service.service [-] ovn.ovn_nb_connection = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.836 163055 DEBUG oslo_service.service [-] ovn.ovn_nb_private_key = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.836 163055 DEBUG oslo_service.service [-] ovn.ovn_sb_ca_cert = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.836 163055 DEBUG oslo_service.service [-] ovn.ovn_sb_certificate = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.836 163055 DEBUG oslo_service.service [-] ovn.ovn_sb_connection = tcp:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.836 163055 DEBUG oslo_service.service [-] ovn.ovn_sb_private_key = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.836 163055 DEBUG oslo_service.service [-] 
ovn.ovsdb_connection_timeout = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.836 163055 DEBUG oslo_service.service [-] ovn.ovsdb_log_level = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.837 163055 DEBUG oslo_service.service [-] ovn.ovsdb_probe_interval = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.837 163055 DEBUG oslo_service.service [-] ovn.ovsdb_retry_max_interval = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.837 163055 DEBUG oslo_service.service [-] ovn.vhost_sock_dir = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.837 163055 DEBUG oslo_service.service [-] ovn.vif_type = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.837 163055 DEBUG oslo_service.service [-] OVS.bridge_mac_table_size = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.837 163055 DEBUG oslo_service.service [-] OVS.igmp_snooping_enable = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.838 163055 DEBUG oslo_service.service [-] OVS.ovsdb_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 
09:32:02.838 163055 DEBUG oslo_service.service [-] ovs.ovsdb_connection = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.838 163055 DEBUG oslo_service.service [-] ovs.ovsdb_connection_timeout = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.838 163055 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.838 163055 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.838 163055 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.839 163055 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.839 163055 DEBUG oslo_service.service [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.839 163055 DEBUG oslo_service.service [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.839 163055 DEBUG oslo_service.service [-] 
oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.839 163055 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.839 163055 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.839 163055 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.840 163055 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.840 163055 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.840 163055 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.840 163055 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.840 163055 DEBUG oslo_service.service [-] 
oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.840 163055 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.841 163055 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.841 163055 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.841 163055 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.841 163055 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.841 163055 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.841 163055 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.841 163055 DEBUG oslo_service.service [-] 
oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.842 163055 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.842 163055 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.842 163055 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.842 163055 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_ca_file = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.842 163055 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_cert_file = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.842 163055 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.843 163055 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_key_file = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.843 163055 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_version = log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.843 163055 DEBUG oslo_service.service [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.843 163055 DEBUG oslo_service.service [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.843 163055 DEBUG oslo_service.service [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.843 163055 DEBUG oslo_service.service [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:32:02 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:02.844 163055 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m Oct 14 05:32:05 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=827 DF PROTO=TCP SPT=59410 DPT=9882 SEQ=3810009322 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B02825A0000000001030307) Oct 14 05:32:07 localhost sshd[163195]: main: sshd: ssh-rsa algorithm is disabled Oct 14 05:32:07 localhost systemd-logind[760]: New session 53 of user zuul. Oct 14 05:32:07 localhost systemd[1]: Started Session 53 of User zuul. 
Oct 14 05:32:08 localhost python3.9[163288]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Oct 14 05:32:08 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35383 DF PROTO=TCP SPT=43484 DPT=9105 SEQ=4103453504 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B028F8E0000000001030307) Oct 14 05:32:09 localhost python3.9[163384]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps -a --filter name=^nova_virtlogd$ --format \{\{.Names\}\} _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Oct 14 05:32:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35384 DF PROTO=TCP SPT=43484 DPT=9105 SEQ=4103453504 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B02939A0000000001030307) Oct 14 05:32:10 localhost python3.9[163489]: ansible-ansible.legacy.command Invoked with _raw_params=podman stop nova_virtlogd _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Oct 14 05:32:10 localhost systemd[1]: libpod-e7d667bdb252c6a015719de25a2bb01984cf600c61bb68939738f913fb3ae534.scope: Deactivated successfully. 
Oct 14 05:32:10 localhost podman[163490]: 2025-10-14 09:32:10.525458816 +0000 UTC m=+0.079483612 container died e7d667bdb252c6a015719de25a2bb01984cf600c61bb68939738f913fb3ae534 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd, architecture=x86_64, build-date=2025-07-21T14:56:59, distribution-scope=public, release=2, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, version=17.1.9, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, name=rhosp17/openstack-nova-libvirt, io.buildah.version=1.33.12, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-libvirt/images/17.1.9-2, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, maintainer=OpenStack TripleO Team, batch=17.1_20250721.1, io.openshift.expose-services=, vcs-ref=809f31d3cd93a9e04341110fb85686656c754dc0) Oct 14 05:32:10 localhost podman[163490]: 2025-10-14 09:32:10.558885761 +0000 UTC m=+0.112910557 container cleanup e7d667bdb252c6a015719de25a2bb01984cf600c61bb68939738f913fb3ae534 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd, vcs-ref=809f31d3cd93a9e04341110fb85686656c754dc0, batch=17.1_20250721.1, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, com.redhat.component=openstack-nova-libvirt-container, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, io.openshift.expose-services=, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-libvirt/images/17.1.9-2, build-date=2025-07-21T14:56:59, distribution-scope=public, name=rhosp17/openstack-nova-libvirt, vendor=Red Hat, Inc., release=2, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt) Oct 14 05:32:10 localhost podman[163503]: 2025-10-14 09:32:10.617354111 +0000 UTC m=+0.079135281 container remove e7d667bdb252c6a015719de25a2bb01984cf600c61bb68939738f913fb3ae534 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.openshift.expose-services=, release=2, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-libvirt/images/17.1.9-2, io.buildah.version=1.33.12, name=rhosp17/openstack-nova-libvirt, version=17.1.9, build-date=2025-07-21T14:56:59, description=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-ref=809f31d3cd93a9e04341110fb85686656c754dc0, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1, architecture=x86_64, distribution-scope=public, batch=17.1_20250721.1, com.redhat.component=openstack-nova-libvirt-container, com.redhat.license_terms=https://www.redhat.com/agreements, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt) Oct 14 05:32:10 localhost systemd[1]: libpod-conmon-e7d667bdb252c6a015719de25a2bb01984cf600c61bb68939738f913fb3ae534.scope: Deactivated successfully. Oct 14 05:32:11 localhost systemd[1]: var-lib-containers-storage-overlay-360c9d6681680c6086250bb7d9532a72bd783cf0586b4983df05bec6e05d323a-merged.mount: Deactivated successfully. 
Oct 14 05:32:11 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e7d667bdb252c6a015719de25a2bb01984cf600c61bb68939738f913fb3ae534-userdata-shm.mount: Deactivated successfully. Oct 14 05:32:11 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35385 DF PROTO=TCP SPT=43484 DPT=9105 SEQ=4103453504 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B029B9A0000000001030307) Oct 14 05:32:11 localhost python3.9[163616]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None Oct 14 05:32:11 localhost systemd[1]: Reloading. Oct 14 05:32:11 localhost systemd-rc-local-generator[163643]: /etc/rc.d/rc.local is not marked executable, skipping. Oct 14 05:32:11 localhost systemd-sysv-generator[163646]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Oct 14 05:32:12 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Oct 14 05:32:13 localhost python3.9[163742]: ansible-ansible.builtin.service_facts Invoked Oct 14 05:32:13 localhost network[163759]: You are using 'network' service provided by 'network-scripts', which are now deprecated. Oct 14 05:32:13 localhost network[163760]: 'network-scripts' will be removed from distribution in near future. Oct 14 05:32:13 localhost network[163761]: It is advised to switch to 'NetworkManager' instead for network management. 
Oct 14 05:32:15 localhost systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Oct 14 05:32:15 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35386 DF PROTO=TCP SPT=43484 DPT=9105 SEQ=4103453504 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B02AB5A0000000001030307) Oct 14 05:32:18 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32679 DF PROTO=TCP SPT=43496 DPT=9101 SEQ=3274265830 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B02B41B0000000001030307) Oct 14 05:32:18 localhost python3.9[163962]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_libvirt.target state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Oct 14 05:32:18 localhost systemd[1]: Reloading. Oct 14 05:32:18 localhost systemd-rc-local-generator[163989]: /etc/rc.d/rc.local is not marked executable, skipping. Oct 14 05:32:18 localhost systemd-sysv-generator[163995]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Oct 14 05:32:18 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Oct 14 05:32:18 localhost systemd[1]: Stopped target tripleo_nova_libvirt.target. 
Oct 14 05:32:19 localhost python3.9[164093]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtlogd_wrapper.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Oct 14 05:32:20 localhost python3.9[164186]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtnodedevd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Oct 14 05:32:21 localhost python3.9[164279]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtproxyd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Oct 14 05:32:22 localhost python3.9[164372]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtqemud.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Oct 14 05:32:22 localhost python3.9[164465]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtsecretd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Oct 14 05:32:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10087 DF PROTO=TCP SPT=49132 DPT=9102 SEQ=4056967525 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B02C9150000000001030307) Oct 14 05:32:23 localhost python3.9[164558]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtstoraged.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Oct 14 05:32:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 
SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10088 DF PROTO=TCP SPT=49132 DPT=9102 SEQ=4056967525 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B02CD1A0000000001030307) Oct 14 05:32:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f. Oct 14 05:32:24 localhost systemd[1]: tmp-crun.9qicFW.mount: Deactivated successfully. Oct 14 05:32:24 localhost podman[164574]: 2025-10-14 09:32:24.766510484 +0000 UTC m=+0.103253248 container health_status 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5) Oct 14 05:32:24 localhost podman[164574]: 2025-10-14 09:32:24.848341498 
+0000 UTC m=+0.185084272 container exec_died 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, io.buildah.version=1.41.3, managed_by=edpm_ansible) Oct 14 05:32:24 localhost systemd[1]: 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f.service: Deactivated successfully. Oct 14 05:32:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857. Oct 14 05:32:26 localhost systemd[1]: tmp-crun.gVQvvf.mount: Deactivated successfully. 
Oct 14 05:32:26 localhost podman[164599]: 2025-10-14 09:32:26.73582808 +0000 UTC m=+0.080289097 container health_status 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, org.label-schema.build-date=20251009, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, config_id=ovn_metadata_agent) Oct 14 05:32:26 localhost podman[164599]: 2025-10-14 09:32:26.740309919 +0000 UTC 
m=+0.084770956 container exec_died 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}) Oct 14 05:32:26 localhost systemd[1]: 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857.service: Deactivated successfully. 
Oct 14 05:32:27 localhost python3.9[164694]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:32:28 localhost python3.9[164786]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:32:28 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21162 DF PROTO=TCP SPT=60120 DPT=9882 SEQ=124716093 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B02DBDB0000000001030307) Oct 14 05:32:28 localhost python3.9[164878]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:32:29 localhost python3.9[164970]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True 
modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:32:29 localhost python3.9[165062]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:32:30 localhost python3.9[165154]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:32:30 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10090 DF PROTO=TCP SPT=49132 DPT=9102 SEQ=4056967525 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B02E4DA0000000001030307) Oct 14 05:32:31 localhost python3.9[165246]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None 
serole=None selevel=None setype=None attributes=None Oct 14 05:32:31 localhost python3.9[165338]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:32:32 localhost python3.9[165430]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:32:33 localhost python3.9[165522]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:32:33 localhost python3.9[165614]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:32:34 localhost python3.9[165706]: 
ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:32:35 localhost python3.9[165798]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:32:35 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21165 DF PROTO=TCP SPT=60120 DPT=9882 SEQ=124716093 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B02F79A0000000001030307) Oct 14 05:32:35 localhost python3.9[165890]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:32:36 localhost python3.9[165982]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then#012 systemctl disable --now certmonger.service#012 test -f /etc/systemd/system/certmonger.service || systemctl mask 
certmonger.service#012fi#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Oct 14 05:32:37 localhost python3.9[166074]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None Oct 14 05:32:38 localhost python3.9[166166]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None Oct 14 05:32:38 localhost systemd[1]: Reloading. Oct 14 05:32:38 localhost systemd-rc-local-generator[166191]: /etc/rc.d/rc.local is not marked executable, skipping. Oct 14 05:32:38 localhost systemd-sysv-generator[166196]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Oct 14 05:32:38 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Oct 14 05:32:38 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33401 DF PROTO=TCP SPT=51824 DPT=9105 SEQ=840869058 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B0304C00000000001030307) Oct 14 05:32:39 localhost python3.9[166294]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_libvirt.target _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Oct 14 05:32:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33402 DF PROTO=TCP SPT=51824 DPT=9105 SEQ=840869058 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B0308DA0000000001030307) Oct 14 05:32:40 localhost python3.9[166387]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtlogd_wrapper.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Oct 14 05:32:40 localhost python3.9[166480]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtnodedevd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Oct 14 05:32:41 localhost ceph-osd[31500]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Oct 14 05:32:41 localhost ceph-osd[31500]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 6000.1 total, 600.0 interval#012Cumulative writes: 4960 writes, 22K keys, 4960 commit 
groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s#012Cumulative WAL: 4960 writes, 649 syncs, 7.64 writes per sync, written: 0.02 GB, 0.00 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s#012Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 L0 2/0 2.61 KB 0.2 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.2 0.01 0.00 1 0.007 0 0 0.0 0.0#012 Sum 2/0 2.61 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.2 0.01 0.00 1 0.007 0 0 0.0 0.0#012 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012#012** Compaction Stats [default] **#012Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012User 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.2 0.01 0.00 1 0.007 0 0 0.0 0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 6000.1 total, 4800.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 
0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x5613efba7610#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 4.1e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **#012#012** Compaction Stats [m-0] **#012Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012#012** Compaction Stats [m-0] **#012Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 
GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 6000.1 total, 4800.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x5613efba7610#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 4.1e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [m-0] **#012#012** Compaction Stats [m-1] **#012Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012#012** Compaction Stats [m-1] **#012Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) 
Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 6000.1 total, 4800.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_sl Oct 14 05:32:41 localhost systemd-journald[47488]: Field hash table of /run/log/journal/8e1d5208cffec42b50976967e1d1cfd0/system.journal has a fill level at 76.0 (253 of 333 items), suggesting rotation. Oct 14 05:32:41 localhost systemd-journald[47488]: /run/log/journal/8e1d5208cffec42b50976967e1d1cfd0/system.journal: Journal header limits reached or header out-of-date, rotating. Oct 14 05:32:41 localhost rsyslogd[759]: imjournal: journal files changed, reloading... [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ] Oct 14 05:32:41 localhost rsyslogd[759]: imjournal: journal files changed, reloading... 
[v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ] Oct 14 05:32:41 localhost python3.9[166573]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtproxyd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Oct 14 05:32:41 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33403 DF PROTO=TCP SPT=51824 DPT=9105 SEQ=840869058 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B0310DB0000000001030307) Oct 14 05:32:42 localhost python3.9[166667]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtqemud.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Oct 14 05:32:42 localhost python3.9[166760]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtsecretd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Oct 14 05:32:43 localhost python3.9[166853]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtstoraged.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Oct 14 05:32:45 localhost python3.9[166946]: ansible-ansible.builtin.getent Invoked with database=passwd key=libvirt fail_key=True service=None split=None Oct 14 05:32:45 localhost ceph-osd[32440]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS 
------- Oct 14 05:32:45 localhost ceph-osd[32440]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 6000.1 total, 600.0 interval#012Cumulative writes: 5551 writes, 24K keys, 5551 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s#012Cumulative WAL: 5551 writes, 763 syncs, 7.28 writes per sync, written: 0.02 GB, 0.00 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s#012Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 L0 2/0 2.61 KB 0.2 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.3 0.00 0.00 1 0.004 0 0 0.0 0.0#012 Sum 2/0 2.61 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.3 0.00 0.00 1 0.004 0 0 0.0 0.0#012 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012#012** Compaction Stats [default] **#012Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012User 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.3 0.00 0.00 1 0.004 0 0 0.0 0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 6000.1 total, 4800.0 
interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55b0d22fc2d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 8.3e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **#012#012** Compaction Stats [m-0] **#012Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012#012** Compaction Stats [m-0] **#012Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) 
Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 6000.1 total, 4800.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55b0d22fc2d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 8.3e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [m-0] **#012#012** Compaction Stats [m-1] **#012Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012 Int 0/0 0.00 KB 0.0 
0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012#012** Compaction Stats [m-1] **#012Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 6000.1 total, 4800.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_sl Oct 14 05:32:45 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33404 DF PROTO=TCP SPT=51824 DPT=9105 SEQ=840869058 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B03209B0000000001030307) Oct 14 05:32:46 localhost python3.9[167039]: ansible-ansible.builtin.group Invoked with gid=42473 name=libvirt state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None Oct 14 05:32:47 localhost python3.9[167137]: ansible-ansible.builtin.user Invoked with comment=libvirt user group=libvirt groups=[''] name=libvirt 
shell=/sbin/nologin state=present uid=42473 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005486733.localdomain update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None Oct 14 05:32:48 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43788 DF PROTO=TCP SPT=58076 DPT=9101 SEQ=1774007617 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B03295A0000000001030307) Oct 14 05:32:48 localhost python3.9[167237]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d Oct 14 05:32:49 localhost python3.9[167291]: ansible-ansible.legacy.dnf Invoked with name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None 
Oct 14 05:32:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37096 DF PROTO=TCP SPT=56274 DPT=9102 SEQ=302787342 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B033E450000000001030307) Oct 14 05:32:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37097 DF PROTO=TCP SPT=56274 DPT=9102 SEQ=302787342 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B03425B0000000001030307) Oct 14 05:32:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f. Oct 14 05:32:55 localhost podman[167431]: 2025-10-14 09:32:55.762939279 +0000 UTC m=+0.094780135 container health_status 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, org.label-schema.build-date=20251009) Oct 14 05:32:55 localhost podman[167431]: 2025-10-14 09:32:55.79862477 +0000 UTC m=+0.130465636 container exec_died 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS) Oct 14 05:32:55 localhost systemd[1]: 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f.service: Deactivated successfully. 
Oct 14 05:32:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857. Oct 14 05:32:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:57.736 163055 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Oct 14 05:32:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:57.737 163055 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Oct 14 05:32:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:32:57.738 163055 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Oct 14 05:32:57 localhost podman[167463]: 2025-10-14 09:32:57.740560125 +0000 UTC m=+0.081637032 container health_status 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 
'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251009) Oct 14 05:32:57 localhost podman[167463]: 2025-10-14 09:32:57.74911129 +0000 UTC m=+0.090188167 container exec_died 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 
'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2) Oct 14 05:32:57 localhost systemd[1]: 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857.service: Deactivated successfully. 
Oct 14 05:32:58 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63232 DF PROTO=TCP SPT=50568 DPT=9882 SEQ=1688898643 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B03510B0000000001030307)
Oct 14 05:33:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37099 DF PROTO=TCP SPT=56274 DPT=9102 SEQ=302787342 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B035A1A0000000001030307)
Oct 14 05:33:05 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63235 DF PROTO=TCP SPT=50568 DPT=9882 SEQ=1688898643 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B036CDA0000000001030307)
Oct 14 05:33:08 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28585 DF PROTO=TCP SPT=52274 DPT=9105 SEQ=3043239435 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B0379EE0000000001030307)
Oct 14 05:33:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28586 DF PROTO=TCP SPT=52274 DPT=9105 SEQ=3043239435 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B037DDB0000000001030307)
Oct 14 05:33:11 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28587 DF PROTO=TCP SPT=52274 DPT=9105 SEQ=3043239435 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B0385DA0000000001030307)
Oct 14 05:33:15 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28588 DF PROTO=TCP SPT=52274 DPT=9105 SEQ=3043239435 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B03959A0000000001030307)
Oct 14 05:33:18 localhost kernel: SELinux: Converting 2759 SID table entries...
Oct 14 05:33:18 localhost kernel: SELinux: policy capability network_peer_controls=1
Oct 14 05:33:18 localhost kernel: SELinux: policy capability open_perms=1
Oct 14 05:33:18 localhost kernel: SELinux: policy capability extended_socket_class=1
Oct 14 05:33:18 localhost kernel: SELinux: policy capability always_check_network=0
Oct 14 05:33:18 localhost kernel: SELinux: policy capability cgroup_seclabel=1
Oct 14 05:33:18 localhost kernel: SELinux: policy capability nnp_nosuid_transition=1
Oct 14 05:33:18 localhost kernel: SELinux: policy capability genfs_seclabel_symlinks=1
Oct 14 05:33:18 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14091 DF PROTO=TCP SPT=34788 DPT=9101 SEQ=560592582 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B039E9B0000000001030307)
Oct 14 05:33:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17326 DF PROTO=TCP SPT=39634 DPT=9102 SEQ=1880428954 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B03B3740000000001030307)
Oct 14 05:33:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17327 DF PROTO=TCP SPT=39634 DPT=9102 SEQ=1880428954 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 
OPT (020405500402080A2B03B79A0000000001030307) Oct 14 05:33:26 localhost dbus-broker-launch[755]: avc: op=load_policy lsm=selinux seqno=19 res=1 Oct 14 05:33:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f. Oct 14 05:33:26 localhost systemd[1]: tmp-crun.57g6Bx.mount: Deactivated successfully. Oct 14 05:33:26 localhost podman[168483]: 2025-10-14 09:33:26.769857227 +0000 UTC m=+0.089677291 container health_status 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251009, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_id=ovn_controller, org.label-schema.vendor=CentOS) Oct 14 05:33:26 localhost podman[168483]: 2025-10-14 09:33:26.831251165 +0000 UTC m=+0.151071149 container exec_died 
1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}) Oct 14 05:33:26 localhost systemd[1]: 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f.service: Deactivated successfully. Oct 14 05:33:28 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52242 DF PROTO=TCP SPT=33082 DPT=9882 SEQ=4017625371 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B03C63A0000000001030307) Oct 14 05:33:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857. 
Oct 14 05:33:28 localhost podman[168509]: 2025-10-14 09:33:28.729489722 +0000 UTC m=+0.073036505 container health_status 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent) Oct 14 05:33:28 localhost podman[168509]: 2025-10-14 09:33:28.762523647 +0000 UTC 
m=+0.106070420 container exec_died 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, managed_by=edpm_ansible) Oct 14 05:33:28 localhost systemd[1]: 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857.service: Deactivated successfully. 
Oct 14 05:33:29 localhost kernel: SELinux: Converting 2762 SID table entries...
Oct 14 05:33:29 localhost kernel: SELinux: policy capability network_peer_controls=1
Oct 14 05:33:29 localhost kernel: SELinux: policy capability open_perms=1
Oct 14 05:33:29 localhost kernel: SELinux: policy capability extended_socket_class=1
Oct 14 05:33:29 localhost kernel: SELinux: policy capability always_check_network=0
Oct 14 05:33:29 localhost kernel: SELinux: policy capability cgroup_seclabel=1
Oct 14 05:33:29 localhost kernel: SELinux: policy capability nnp_nosuid_transition=1
Oct 14 05:33:29 localhost kernel: SELinux: policy capability genfs_seclabel_symlinks=1
Oct 14 05:33:30 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17329 DF PROTO=TCP SPT=39634 DPT=9102 SEQ=1880428954 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B03CF5A0000000001030307)
Oct 14 05:33:35 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52245 DF PROTO=TCP SPT=33082 DPT=9882 SEQ=4017625371 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B03E21A0000000001030307)
Oct 14 05:33:38 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62589 DF PROTO=TCP SPT=40678 DPT=9105 SEQ=4287782986 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B03EF1F0000000001030307)
Oct 14 05:33:39 localhost kernel: SELinux: Converting 2762 SID table entries...
Oct 14 05:33:39 localhost kernel: SELinux: policy capability network_peer_controls=1
Oct 14 05:33:39 localhost kernel: SELinux: policy capability open_perms=1
Oct 14 05:33:39 localhost kernel: SELinux: policy capability extended_socket_class=1
Oct 14 05:33:39 localhost kernel: SELinux: policy capability always_check_network=0
Oct 14 05:33:39 localhost kernel: SELinux: policy capability cgroup_seclabel=1
Oct 14 05:33:39 localhost kernel: SELinux: policy capability nnp_nosuid_transition=1
Oct 14 05:33:39 localhost kernel: SELinux: policy capability genfs_seclabel_symlinks=1
Oct 14 05:33:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62590 DF PROTO=TCP SPT=40678 DPT=9105 SEQ=4287782986 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B03F31B0000000001030307)
Oct 14 05:33:41 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62591 DF PROTO=TCP SPT=40678 DPT=9105 SEQ=4287782986 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B03FB1A0000000001030307)
Oct 14 05:33:45 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62592 DF PROTO=TCP SPT=40678 DPT=9105 SEQ=4287782986 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B040ADA0000000001030307)
Oct 14 05:33:48 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41 DF PROTO=TCP SPT=48514 DPT=9101 SEQ=1582038156 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B04139A0000000001030307)
Oct 14 05:33:48 localhost kernel: SELinux: Converting 2762 SID table entries...
Oct 14 05:33:48 localhost kernel: SELinux: policy capability network_peer_controls=1
Oct 14 05:33:48 localhost kernel: SELinux: policy capability open_perms=1
Oct 14 05:33:48 localhost kernel: SELinux: policy capability extended_socket_class=1
Oct 14 05:33:48 localhost kernel: SELinux: policy capability always_check_network=0
Oct 14 05:33:48 localhost kernel: SELinux: policy capability cgroup_seclabel=1
Oct 14 05:33:48 localhost kernel: SELinux: policy capability nnp_nosuid_transition=1
Oct 14 05:33:48 localhost kernel: SELinux: policy capability genfs_seclabel_symlinks=1
Oct 14 05:33:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30586 DF PROTO=TCP SPT=45602 DPT=9102 SEQ=1510412221 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B0428A50000000001030307)
Oct 14 05:33:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30587 DF PROTO=TCP SPT=45602 DPT=9102 SEQ=1510412221 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B042C9A0000000001030307)
Oct 14 05:33:55 localhost dbus-broker-launch[755]: avc: op=load_policy lsm=selinux seqno=22 res=1
Oct 14 05:33:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f. 
Oct 14 05:33:57 localhost podman[168642]: 2025-10-14 09:33:57.413950548 +0000 UTC m=+0.083054658 container health_status 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_controller) Oct 14 05:33:57 localhost podman[168642]: 2025-10-14 09:33:57.48058378 +0000 UTC m=+0.149687890 container exec_died 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5) Oct 14 05:33:57 localhost systemd[1]: 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f.service: Deactivated successfully. 
Oct 14 05:33:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:33:57.737 163055 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct 14 05:33:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:33:57.738 163055 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct 14 05:33:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:33:57.739 163055 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct 14 05:33:58 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20178 DF PROTO=TCP SPT=45744 DPT=9882 SEQ=2168878291 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B043B6B0000000001030307)
Oct 14 05:33:58 localhost kernel: SELinux: Converting 2762 SID table entries...
Oct 14 05:33:58 localhost kernel: SELinux: policy capability network_peer_controls=1 Oct 14 05:33:58 localhost kernel: SELinux: policy capability open_perms=1 Oct 14 05:33:58 localhost kernel: SELinux: policy capability extended_socket_class=1 Oct 14 05:33:58 localhost kernel: SELinux: policy capability always_check_network=0 Oct 14 05:33:58 localhost kernel: SELinux: policy capability cgroup_seclabel=1 Oct 14 05:33:58 localhost kernel: SELinux: policy capability nnp_nosuid_transition=1 Oct 14 05:33:58 localhost kernel: SELinux: policy capability genfs_seclabel_symlinks=1 Oct 14 05:33:59 localhost dbus-broker-launch[755]: avc: op=load_policy lsm=selinux seqno=23 res=1 Oct 14 05:33:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857. Oct 14 05:33:59 localhost systemd[1]: tmp-crun.SvtreD.mount: Deactivated successfully. Oct 14 05:33:59 localhost podman[168674]: 2025-10-14 09:33:59.733263049 +0000 UTC m=+0.068085667 container health_status 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 
'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0) Oct 14 05:33:59 localhost podman[168674]: 2025-10-14 09:33:59.767119611 +0000 UTC m=+0.101942209 container exec_died 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251009, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.schema-version=1.0) Oct 14 05:33:59 localhost systemd[1]: 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857.service: Deactivated successfully. Oct 14 05:34:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30589 DF PROTO=TCP SPT=45602 DPT=9102 SEQ=1510412221 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B04445A0000000001030307) Oct 14 05:34:05 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20181 DF PROTO=TCP SPT=45744 DPT=9882 SEQ=2168878291 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B04571A0000000001030307) Oct 14 05:34:07 localhost kernel: SELinux: Converting 2762 SID table entries... 
Oct 14 05:34:07 localhost kernel: SELinux: policy capability network_peer_controls=1 Oct 14 05:34:07 localhost kernel: SELinux: policy capability open_perms=1 Oct 14 05:34:07 localhost kernel: SELinux: policy capability extended_socket_class=1 Oct 14 05:34:07 localhost kernel: SELinux: policy capability always_check_network=0 Oct 14 05:34:07 localhost kernel: SELinux: policy capability cgroup_seclabel=1 Oct 14 05:34:07 localhost kernel: SELinux: policy capability nnp_nosuid_transition=1 Oct 14 05:34:07 localhost kernel: SELinux: policy capability genfs_seclabel_symlinks=1 Oct 14 05:34:07 localhost systemd[1]: Reloading. Oct 14 05:34:07 localhost dbus-broker-launch[755]: avc: op=load_policy lsm=selinux seqno=24 res=1 Oct 14 05:34:07 localhost systemd-rc-local-generator[168723]: /etc/rc.d/rc.local is not marked executable, skipping. Oct 14 05:34:07 localhost systemd-sysv-generator[168726]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Oct 14 05:34:07 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Oct 14 05:34:08 localhost systemd[1]: Reloading. Oct 14 05:34:08 localhost systemd-sysv-generator[168761]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Oct 14 05:34:08 localhost systemd-rc-local-generator[168758]: /etc/rc.d/rc.local is not marked executable, skipping. Oct 14 05:34:08 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Oct 14 05:34:08 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54428 DF PROTO=TCP SPT=49996 DPT=9105 SEQ=1368156015 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B04644F0000000001030307) Oct 14 05:34:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54429 DF PROTO=TCP SPT=49996 DPT=9105 SEQ=1368156015 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B04685B0000000001030307) Oct 14 05:34:11 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54430 DF PROTO=TCP SPT=49996 DPT=9105 SEQ=1368156015 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B04705A0000000001030307) Oct 14 05:34:15 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54431 DF PROTO=TCP SPT=49996 DPT=9105 SEQ=1368156015 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B04801A0000000001030307) Oct 14 05:34:17 localhost kernel: SELinux: Converting 2763 SID table entries... 
Oct 14 05:34:17 localhost kernel: SELinux: policy capability network_peer_controls=1 Oct 14 05:34:17 localhost kernel: SELinux: policy capability open_perms=1 Oct 14 05:34:17 localhost kernel: SELinux: policy capability extended_socket_class=1 Oct 14 05:34:17 localhost kernel: SELinux: policy capability always_check_network=0 Oct 14 05:34:17 localhost kernel: SELinux: policy capability cgroup_seclabel=1 Oct 14 05:34:17 localhost kernel: SELinux: policy capability nnp_nosuid_transition=1 Oct 14 05:34:17 localhost kernel: SELinux: policy capability genfs_seclabel_symlinks=1 Oct 14 05:34:18 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7779 DF PROTO=TCP SPT=35144 DPT=9101 SEQ=4272267856 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B0488DB0000000001030307) Oct 14 05:34:21 localhost dbus-broker-launch[751]: Noticed file-system modification, trigger reload. Oct 14 05:34:21 localhost dbus-broker-launch[755]: avc: op=load_policy lsm=selinux seqno=25 res=1 Oct 14 05:34:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24473 DF PROTO=TCP SPT=43392 DPT=9102 SEQ=2081107660 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B049DD50000000001030307) Oct 14 05:34:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24474 DF PROTO=TCP SPT=43392 DPT=9102 SEQ=2081107660 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B04A1DA0000000001030307) Oct 14 05:34:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f. 
Oct 14 05:34:27 localhost systemd[1]: tmp-crun.vJO6z1.mount: Deactivated successfully. Oct 14 05:34:27 localhost podman[169017]: 2025-10-14 09:34:27.783916975 +0000 UTC m=+0.115018253 container health_status 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251009, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image) Oct 14 05:34:27 localhost podman[169017]: 2025-10-14 09:34:27.850107943 +0000 UTC m=+0.181209191 container exec_died 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_id=ovn_controller, 
org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image) Oct 14 05:34:27 localhost systemd[1]: 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f.service: Deactivated successfully. Oct 14 05:34:28 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29227 DF PROTO=TCP SPT=40934 DPT=9882 SEQ=2353443793 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B04B09B0000000001030307) Oct 14 05:34:30 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24476 DF PROTO=TCP SPT=43392 DPT=9102 SEQ=2081107660 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B04B99A0000000001030307) Oct 14 05:34:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857. 
Oct 14 05:34:30 localhost systemd[1]: tmp-crun.434NOE.mount: Deactivated successfully. Oct 14 05:34:30 localhost podman[169040]: 2025-10-14 09:34:30.741287671 +0000 UTC m=+0.084656477 container health_status 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, 
tcib_managed=true) Oct 14 05:34:30 localhost podman[169040]: 2025-10-14 09:34:30.746286355 +0000 UTC m=+0.089655131 container exec_died 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, tcib_managed=true, container_name=ovn_metadata_agent, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}) Oct 14 05:34:30 localhost systemd[1]: 
28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857.service: Deactivated successfully. Oct 14 05:34:35 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29230 DF PROTO=TCP SPT=40934 DPT=9882 SEQ=2353443793 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B04CC5B0000000001030307) Oct 14 05:34:38 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7294 DF PROTO=TCP SPT=56818 DPT=9105 SEQ=3716573666 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B04D97E0000000001030307) Oct 14 05:34:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7295 DF PROTO=TCP SPT=56818 DPT=9105 SEQ=3716573666 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B04DD9A0000000001030307) Oct 14 05:34:41 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7296 DF PROTO=TCP SPT=56818 DPT=9105 SEQ=3716573666 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B04E59A0000000001030307) Oct 14 05:34:45 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7297 DF PROTO=TCP SPT=56818 DPT=9105 SEQ=3716573666 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B04F55A0000000001030307) Oct 14 05:34:48 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52045 DF PROTO=TCP 
SPT=39362 DPT=9101 SEQ=2400505781 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B04FE1B0000000001030307) Oct 14 05:34:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=4361 DF PROTO=TCP SPT=37834 DPT=9102 SEQ=1126919389 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B0513040000000001030307) Oct 14 05:34:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=4362 DF PROTO=TCP SPT=37834 DPT=9102 SEQ=1126919389 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B05171A0000000001030307) Oct 14 05:34:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:34:57.739 163055 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Oct 14 05:34:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:34:57.739 163055 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Oct 14 05:34:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:34:57.741 163055 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Oct 14 05:34:58 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32397 DF 
PROTO=TCP SPT=51012 DPT=9882 SEQ=4163532214 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B0525CB0000000001030307) Oct 14 05:34:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f. Oct 14 05:34:58 localhost podman[185879]: 2025-10-14 09:34:58.755056312 +0000 UTC m=+0.089444890 container health_status 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, container_name=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, managed_by=edpm_ansible, org.label-schema.vendor=CentOS) Oct 14 05:34:58 localhost podman[185879]: 2025-10-14 09:34:58.845400708 +0000 UTC m=+0.179789266 container exec_died 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f 
(image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_controller, org.label-schema.schema-version=1.0) Oct 14 05:34:58 localhost systemd[1]: 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f.service: Deactivated successfully. Oct 14 05:35:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=4364 DF PROTO=TCP SPT=37834 DPT=9102 SEQ=1126919389 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B052EDB0000000001030307) Oct 14 05:35:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857. 
Oct 14 05:35:01 localhost podman[186187]: 2025-10-14 09:35:01.08520046 +0000 UTC m=+0.095272923 container health_status 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.license=GPLv2, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251009, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team) Oct 14 05:35:01 localhost podman[186187]: 2025-10-14 09:35:01.121100289 +0000 UTC 
m=+0.131172702 container exec_died 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_id=ovn_metadata_agent) Oct 14 05:35:01 localhost systemd[1]: 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857.service: Deactivated successfully. 
Oct 14 05:35:02 localhost systemd[1]: Stopping OpenSSH server daemon... Oct 14 05:35:02 localhost systemd[1]: sshd.service: Deactivated successfully. Oct 14 05:35:02 localhost systemd[1]: Stopped OpenSSH server daemon. Oct 14 05:35:02 localhost systemd[1]: sshd.service: Consumed 2.633s CPU time, read 0B from disk, written 24.0K to disk. Oct 14 05:35:02 localhost systemd[1]: Stopped target sshd-keygen.target. Oct 14 05:35:02 localhost systemd[1]: Stopping sshd-keygen.target... Oct 14 05:35:02 localhost systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target). Oct 14 05:35:02 localhost systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target). Oct 14 05:35:02 localhost systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target). Oct 14 05:35:02 localhost systemd[1]: Reached target sshd-keygen.target. Oct 14 05:35:02 localhost systemd[1]: Starting OpenSSH server daemon... Oct 14 05:35:02 localhost sshd[186865]: main: sshd: ssh-rsa algorithm is disabled Oct 14 05:35:02 localhost systemd[1]: Started OpenSSH server daemon. Oct 14 05:35:04 localhost systemd[1]: Started /usr/bin/systemctl start man-db-cache-update. Oct 14 05:35:04 localhost systemd[1]: Starting man-db-cache-update.service... Oct 14 05:35:04 localhost systemd[1]: Reloading. Oct 14 05:35:04 localhost systemd-rc-local-generator[187099]: /etc/rc.d/rc.local is not marked executable, skipping. Oct 14 05:35:04 localhost systemd-sysv-generator[187103]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. 
Please update package to include a native systemd unit file, in order to make it more safe and robust. Oct 14 05:35:04 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Oct 14 05:35:04 localhost systemd[1]: Queuing reload/restart jobs for marked units… Oct 14 05:35:04 localhost systemd[1]: Started /usr/bin/systemctl start man-db-cache-update. Oct 14 05:35:05 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32400 DF PROTO=TCP SPT=51012 DPT=9882 SEQ=4163532214 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B05419A0000000001030307) Oct 14 05:35:08 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54483 DF PROTO=TCP SPT=45582 DPT=9105 SEQ=1893927366 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B054EAF0000000001030307) Oct 14 05:35:08 localhost python3.9[192569]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None Oct 14 05:35:09 localhost systemd[1]: Reloading. Oct 14 05:35:09 localhost systemd-rc-local-generator[192960]: /etc/rc.d/rc.local is not marked executable, skipping. Oct 14 05:35:09 localhost systemd-sysv-generator[192967]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Oct 14 05:35:09 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. 
Support for MemoryLimit= will be removed soon. Oct 14 05:35:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54484 DF PROTO=TCP SPT=45582 DPT=9105 SEQ=1893927366 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B05529A0000000001030307) Oct 14 05:35:10 localhost python3.9[193316]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None Oct 14 05:35:10 localhost systemd[1]: Reloading. Oct 14 05:35:10 localhost systemd-rc-local-generator[193442]: /etc/rc.d/rc.local is not marked executable, skipping. Oct 14 05:35:10 localhost systemd-sysv-generator[193449]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Oct 14 05:35:10 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Oct 14 05:35:11 localhost python3.9[193857]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tls.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None Oct 14 05:35:11 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54485 DF PROTO=TCP SPT=45582 DPT=9105 SEQ=1893927366 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B055A9A0000000001030307) Oct 14 05:35:12 localhost systemd[1]: Reloading. 
Oct 14 05:35:12 localhost systemd-rc-local-generator[194529]: /etc/rc.d/rc.local is not marked executable, skipping. Oct 14 05:35:12 localhost systemd-sysv-generator[194533]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Oct 14 05:35:12 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Oct 14 05:35:13 localhost python3.9[194949]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=virtproxyd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None Oct 14 05:35:13 localhost systemd[1]: Reloading. Oct 14 05:35:13 localhost systemd-sysv-generator[195192]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Oct 14 05:35:13 localhost systemd-rc-local-generator[195186]: /etc/rc.d/rc.local is not marked executable, skipping. Oct 14 05:35:13 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Oct 14 05:35:15 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54486 DF PROTO=TCP SPT=45582 DPT=9105 SEQ=1893927366 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B056A5A0000000001030307) Oct 14 05:35:16 localhost python3.9[196249]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None Oct 14 05:35:16 localhost systemd[1]: Reloading. Oct 14 05:35:16 localhost systemd-sysv-generator[196448]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Oct 14 05:35:16 localhost systemd-rc-local-generator[196443]: /etc/rc.d/rc.local is not marked executable, skipping. Oct 14 05:35:16 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Oct 14 05:35:16 localhost systemd[1]: man-db-cache-update.service: Deactivated successfully. Oct 14 05:35:16 localhost systemd[1]: Finished man-db-cache-update.service. Oct 14 05:35:16 localhost systemd[1]: man-db-cache-update.service: Consumed 14.496s CPU time. Oct 14 05:35:16 localhost systemd[1]: run-rc5382f7e4a964ccf9242fb1944535a48.service: Deactivated successfully. Oct 14 05:35:16 localhost systemd[1]: run-r81e369f8be2542bfa6e0fbf107380c52.service: Deactivated successfully. Oct 14 05:35:17 localhost python3.9[196727]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None Oct 14 05:35:17 localhost systemd[1]: Reloading. 
Oct 14 05:35:17 localhost systemd-sysv-generator[196755]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Oct 14 05:35:17 localhost systemd-rc-local-generator[196752]: /etc/rc.d/rc.local is not marked executable, skipping. Oct 14 05:35:17 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Oct 14 05:35:18 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37497 DF PROTO=TCP SPT=46452 DPT=9101 SEQ=3724358897 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B05735A0000000001030307) Oct 14 05:35:18 localhost python3.9[196875]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None Oct 14 05:35:18 localhost systemd[1]: Reloading. Oct 14 05:35:18 localhost systemd-rc-local-generator[196903]: /etc/rc.d/rc.local is not marked executable, skipping. Oct 14 05:35:18 localhost systemd-sysv-generator[196907]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Oct 14 05:35:18 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Oct 14 05:35:19 localhost python3.9[197026]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None Oct 14 05:35:20 localhost python3.9[197139]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None Oct 14 05:35:20 localhost systemd[1]: Reloading. Oct 14 05:35:20 localhost systemd-rc-local-generator[197167]: /etc/rc.d/rc.local is not marked executable, skipping. Oct 14 05:35:20 localhost systemd-sysv-generator[197171]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Oct 14 05:35:20 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Oct 14 05:35:21 localhost python3.9[197288]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=virtproxyd-tls.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None Oct 14 05:35:21 localhost systemd[1]: Reloading. Oct 14 05:35:21 localhost systemd-rc-local-generator[197315]: /etc/rc.d/rc.local is not marked executable, skipping. Oct 14 05:35:21 localhost systemd-sysv-generator[197320]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Oct 14 05:35:21 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. 
Support for MemoryLimit= will be removed soon. Oct 14 05:35:23 localhost python3.9[197437]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None Oct 14 05:35:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16630 DF PROTO=TCP SPT=52268 DPT=9102 SEQ=2648036432 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B0588350000000001030307) Oct 14 05:35:23 localhost python3.9[197550]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None Oct 14 05:35:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16631 DF PROTO=TCP SPT=52268 DPT=9102 SEQ=2648036432 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B058C5A0000000001030307) Oct 14 05:35:24 localhost python3.9[197663]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None Oct 14 05:35:25 localhost python3.9[197776]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None Oct 14 05:35:26 localhost python3.9[197889]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None Oct 14 05:35:27 localhost python3.9[198002]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False 
name=virtproxyd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None Oct 14 05:35:28 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22550 DF PROTO=TCP SPT=35278 DPT=9882 SEQ=3057711193 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B059AFA0000000001030307) Oct 14 05:35:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f. Oct 14 05:35:29 localhost podman[198005]: 2025-10-14 09:35:29.051432105 +0000 UTC m=+0.091341696 container health_status 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, 
managed_by=edpm_ansible, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5) Oct 14 05:35:29 localhost podman[198005]: 2025-10-14 09:35:29.163619976 +0000 UTC m=+0.203529517 container exec_died 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.build-date=20251009, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5) Oct 14 05:35:29 localhost systemd[1]: 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f.service: Deactivated successfully. 
Oct 14 05:35:29 localhost python3.9[198140]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None Oct 14 05:35:30 localhost python3.9[198253]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None Oct 14 05:35:30 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16633 DF PROTO=TCP SPT=52268 DPT=9102 SEQ=2648036432 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B05A41A0000000001030307) Oct 14 05:35:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857. Oct 14 05:35:31 localhost podman[198367]: 2025-10-14 09:35:31.276314106 +0000 UTC m=+0.071949534 container health_status 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.3) Oct 14 05:35:31 localhost podman[198367]: 2025-10-14 09:35:31.31035912 +0000 UTC m=+0.105994558 container exec_died 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true) Oct 14 05:35:31 localhost systemd[1]: 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857.service: Deactivated successfully. Oct 14 05:35:31 localhost python3.9[198366]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None Oct 14 05:35:32 localhost python3.9[198499]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None Oct 14 05:35:33 localhost python3.9[198612]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None Oct 14 05:35:34 localhost python3.9[198725]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None Oct 14 05:35:35 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 
DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22553 DF PROTO=TCP SPT=35278 DPT=9882 SEQ=3057711193 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B05B6DA0000000001030307) Oct 14 05:35:35 localhost python3.9[198838]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None Oct 14 05:35:36 localhost python3.9[198951]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None Oct 14 05:35:38 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49250 DF PROTO=TCP SPT=47998 DPT=9105 SEQ=1573918442 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B05C3DE0000000001030307) Oct 14 05:35:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49251 DF PROTO=TCP SPT=47998 DPT=9105 SEQ=1573918442 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B05C7DB0000000001030307) Oct 14 05:35:41 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49252 DF PROTO=TCP SPT=47998 DPT=9105 SEQ=1573918442 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B05CFDA0000000001030307) Oct 14 05:35:42 localhost python3.9[199064]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/etc/tmpfiles.d/ setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None 
_diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None Oct 14 05:35:42 localhost python3.9[199174]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None Oct 14 05:35:43 localhost python3.9[199284]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Oct 14 05:35:44 localhost python3.9[199394]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt/private setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Oct 14 05:35:44 localhost python3.9[199504]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/CA setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Oct 14 05:35:45 localhost 
python3.9[199614]: ansible-ansible.builtin.file Invoked with group=qemu owner=root path=/etc/pki/qemu setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None Oct 14 05:35:45 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49253 DF PROTO=TCP SPT=47998 DPT=9105 SEQ=1573918442 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B05DF9B0000000001030307) Oct 14 05:35:46 localhost python3.9[199724]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtlogd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True Oct 14 05:35:47 localhost python3.9[199814]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtlogd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1760434545.8439345-1645-8209128922198/.source.conf follow=False _original_basename=virtlogd.conf checksum=d7a72ae92c2c205983b029473e05a6aa4c58ec24 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:35:47 localhost python3.9[199924]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtnodedevd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True Oct 14 05:35:48 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58599 DF PROTO=TCP SPT=40900 DPT=9101 
SEQ=1141442931 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B05E85B0000000001030307) Oct 14 05:35:48 localhost python3.9[200014]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtnodedevd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1760434547.319589-1645-42609012382450/.source.conf follow=False _original_basename=virtnodedevd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:35:49 localhost python3.9[200124]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtproxyd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True Oct 14 05:35:49 localhost python3.9[200214]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtproxyd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1760434548.576033-1645-154785851533254/.source.conf follow=False _original_basename=virtproxyd.conf checksum=28bc484b7c9988e03de49d4fcc0a088ea975f716 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:35:50 localhost python3.9[200324]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtqemud.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True Oct 14 05:35:50 localhost python3.9[200414]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtqemud.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1760434549.7696586-1645-49043748345634/.source.conf follow=False _original_basename=virtqemud.conf 
checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:35:51 localhost python3.9[200524]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/qemu.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True Oct 14 05:35:52 localhost python3.9[200614]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/qemu.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1760434550.8859184-1645-47505979216602/.source.conf follow=False _original_basename=qemu.conf.j2 checksum=8d9b2057482987a531d808ceb2ac4bc7d43bf17c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:35:53 localhost python3.9[200724]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtsecretd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True Oct 14 05:35:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1630 DF PROTO=TCP SPT=49812 DPT=9102 SEQ=3832183123 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B05FD650000000001030307) Oct 14 05:35:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1631 DF PROTO=TCP SPT=49812 DPT=9102 SEQ=3832183123 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B06015B0000000001030307) Oct 14 05:35:54 localhost python3.9[200814]: ansible-ansible.legacy.copy Invoked 
with dest=/etc/libvirt/virtsecretd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1760434552.9088936-1645-52378309067666/.source.conf follow=False _original_basename=virtsecretd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:35:55 localhost python3.9[200924]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/auth.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True Oct 14 05:35:56 localhost python3.9[201012]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/auth.conf group=libvirt mode=0600 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1760434555.187205-1645-249420327082205/.source.conf follow=False _original_basename=auth.conf checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:35:56 localhost python3.9[201122]: ansible-ansible.legacy.stat Invoked with path=/etc/sasl2/libvirt.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True Oct 14 05:35:57 localhost python3.9[201212]: ansible-ansible.legacy.copy Invoked with dest=/etc/sasl2/libvirt.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1760434556.4117699-1645-27676183603805/.source.conf follow=False _original_basename=sasl_libvirt.conf checksum=652e4d404bf79253d06956b8e9847c9364979d4a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None 
attributes=None
Oct 14 05:35:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:35:57.740 163055 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 14 05:35:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:35:57.741 163055 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 14 05:35:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:35:57.742 163055 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 14 05:35:58 localhost python3.9[201322]: ansible-ansible.builtin.file Invoked with path=/etc/libvirt/passwd.db state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 05:35:58 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48621 DF PROTO=TCP SPT=59374 DPT=9882 SEQ=1271479570 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B06102B0000000001030307)
Oct 14 05:35:58 localhost python3.9[201432]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 05:35:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f.
Oct 14 05:35:59 localhost podman[201543]: 2025-10-14 09:35:59.53087191 +0000 UTC m=+0.083586867 container health_status 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.build-date=20251009, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller)
Oct 14 05:35:59 localhost podman[201543]: 2025-10-14 09:35:59.624916755 +0000 UTC m=+0.177631662 container exec_died 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, org.label-schema.build-date=20251009, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5)
Oct 14 05:35:59 localhost systemd[1]: 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f.service: Deactivated successfully.
Oct 14 05:35:59 localhost python3.9[201542]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 05:36:00 localhost python3.9[201677]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 05:36:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1633 DF PROTO=TCP SPT=49812 DPT=9102 SEQ=3832183123 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B06191A0000000001030307)
Oct 14 05:36:00 localhost python3.9[201787]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 05:36:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857.
Oct 14 05:36:01 localhost podman[201933]: 2025-10-14 09:36:01.494144712 +0000 UTC m=+0.086651408 container health_status 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible)
Oct 14 05:36:01 localhost podman[201933]: 2025-10-14 09:36:01.531967267 +0000 UTC m=+0.124473893 container exec_died 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251009, io.buildah.version=1.41.3, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5)
Oct 14 05:36:01 localhost systemd[1]: 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857.service: Deactivated successfully.
Oct 14 05:36:01 localhost python3.9[201934]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 05:36:02 localhost python3.9[202131]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 05:36:02 localhost python3.9[202258]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 05:36:03 localhost python3.9[202386]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 05:36:04 localhost python3.9[202496]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 05:36:04 localhost python3.9[202606]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 05:36:05 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48624 DF PROTO=TCP SPT=59374 DPT=9882 SEQ=1271479570 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B062BDB0000000001030307)
Oct 14 05:36:06 localhost python3.9[202716]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 05:36:07 localhost python3.9[202826]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 05:36:07 localhost python3.9[202936]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 05:36:08 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42183 DF PROTO=TCP SPT=42676 DPT=9105 SEQ=713188834 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B06390F0000000001030307)
Oct 14 05:36:09 localhost python3.9[203046]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 05:36:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42184 DF PROTO=TCP SPT=42676 DPT=9105 SEQ=713188834 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B063D1A0000000001030307)
Oct 14 05:36:10 localhost python3.9[203156]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 05:36:10 localhost python3.9[203244]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1760434569.4981556-2308-75174614270075/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 05:36:11 localhost python3.9[203354]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 05:36:11 localhost python3.9[203442]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1760434570.669112-2308-152527502057810/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 05:36:11 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42185 DF PROTO=TCP SPT=42676 DPT=9105 SEQ=713188834 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B06451A0000000001030307)
Oct 14 05:36:12 localhost python3.9[203552]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 05:36:12 localhost python3.9[203640]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1760434571.8432286-2308-40762597510768/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 05:36:13 localhost python3.9[203750]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 05:36:14 localhost python3.9[203838]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1760434573.1065416-2308-216936535286255/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 05:36:15 localhost python3.9[203948]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 05:36:15 localhost python3.9[204036]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1760434574.5396855-2308-8356775947083/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 05:36:15 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42186 DF PROTO=TCP SPT=42676 DPT=9105 SEQ=713188834 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B0654DB0000000001030307)
Oct 14 05:36:16 localhost python3.9[204146]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 05:36:16 localhost python3.9[204234]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1760434575.7599137-2308-2436544490434/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 05:36:17 localhost python3.9[204344]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 05:36:18 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21685 DF PROTO=TCP SPT=52826 DPT=9101 SEQ=2877925251 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B065D9A0000000001030307)
Oct 14 05:36:18 localhost python3.9[204432]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1760434577.0916412-2308-95499959656500/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 05:36:18 localhost python3.9[204542]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 05:36:19 localhost python3.9[204630]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1760434578.3724988-2308-59703791299443/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 05:36:20 localhost python3.9[204740]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 05:36:21 localhost python3.9[204828]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1760434579.580303-2308-204489911429289/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 05:36:21 localhost python3.9[204938]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 05:36:22 localhost python3.9[205026]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1760434581.3107278-2308-151032628513238/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 05:36:23 localhost python3.9[205136]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 05:36:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5699 DF PROTO=TCP SPT=33336 DPT=9102 SEQ=382198698 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B0672940000000001030307)
Oct 14 05:36:23 localhost python3.9[205224]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1760434582.946799-2308-164174346613587/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 05:36:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5700 DF PROTO=TCP SPT=33336 DPT=9102 SEQ=382198698 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B06769A0000000001030307)
Oct 14 05:36:24 localhost python3.9[205334]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 05:36:25 localhost python3.9[205422]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1760434584.1516902-2308-123877979074597/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 05:36:25 localhost python3.9[205532]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 05:36:26 localhost python3.9[205620]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1760434585.342569-2308-209121951517927/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 05:36:26 localhost python3.9[205730]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 05:36:27 localhost python3.9[205818]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1760434586.477157-2308-88305397877089/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 05:36:28 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47301 DF PROTO=TCP SPT=37220 DPT=9882 SEQ=3753610672 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B06855A0000000001030307)
Oct 14 05:36:28 localhost python3.9[205926]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail#012ls -lRZ /run/libvirt | grep -E ':container_\S+_t'#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 05:36:28 localhost sshd[205947]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 05:36:29 localhost python3.9[206041]: ansible-ansible.posix.seboolean Invoked with name=os_enable_vtpm persistent=True state=True ignore_selinux_state=False
Oct 14 05:36:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f.
Oct 14 05:36:30 localhost podman[206152]: 2025-10-14 09:36:30.269237594 +0000 UTC m=+0.096010170 container health_status 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3)
Oct 14 05:36:30 localhost podman[206152]: 2025-10-14 09:36:30.340221803 +0000 UTC m=+0.166994439 container exec_died 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, container_name=ovn_controller, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller)
Oct 14 05:36:30 localhost systemd[1]: 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f.service: Deactivated successfully.
Oct 14 05:36:30 localhost python3.9[206151]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtlogd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct 14 05:36:30 localhost systemd[1]: Reloading.
Oct 14 05:36:30 localhost systemd-rc-local-generator[206200]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 05:36:30 localhost systemd-sysv-generator[206204]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 05:36:30 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5702 DF PROTO=TCP SPT=33336 DPT=9102 SEQ=382198698 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B068E5B0000000001030307)
Oct 14 05:36:30 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 05:36:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857.
Oct 14 05:36:31 localhost podman[206211]: 2025-10-14 09:36:31.732340124 +0000 UTC m=+0.071570398 container health_status 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Oct 14 05:36:31 localhost podman[206211]: 2025-10-14 09:36:31.765181201 +0000 UTC m=+0.104411505 container exec_died 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Oct 14 05:36:31 localhost systemd[1]: 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857.service: Deactivated successfully.
Oct 14 05:36:31 localhost systemd[1]: Starting libvirt logging daemon socket...
Oct 14 05:36:31 localhost systemd[1]: Listening on libvirt logging daemon socket.
Oct 14 05:36:31 localhost systemd[1]: Starting libvirt logging daemon admin socket...
Oct 14 05:36:31 localhost systemd[1]: Listening on libvirt logging daemon admin socket.
Oct 14 05:36:31 localhost systemd[1]: Starting libvirt logging daemon...
Oct 14 05:36:31 localhost systemd[1]: Started libvirt logging daemon.
Oct 14 05:36:32 localhost python3.9[206345]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtnodedevd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct 14 05:36:32 localhost systemd[1]: Reloading.
Oct 14 05:36:32 localhost systemd-rc-local-generator[206371]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 05:36:32 localhost systemd-sysv-generator[206374]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 05:36:32 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Oct 14 05:36:34 localhost systemd[1]: Starting SETroubleshoot daemon for processing new SELinux denial logs... Oct 14 05:36:34 localhost systemd[1]: Starting libvirt nodedev daemon socket... Oct 14 05:36:34 localhost systemd[1]: Listening on libvirt nodedev daemon socket. Oct 14 05:36:34 localhost systemd[1]: Starting libvirt nodedev daemon admin socket... Oct 14 05:36:34 localhost systemd[1]: Starting libvirt nodedev daemon read-only socket... Oct 14 05:36:34 localhost systemd[1]: Listening on libvirt nodedev daemon admin socket. Oct 14 05:36:34 localhost systemd[1]: Listening on libvirt nodedev daemon read-only socket. Oct 14 05:36:34 localhost systemd[1]: Starting libvirt nodedev daemon... Oct 14 05:36:34 localhost systemd[1]: Started libvirt nodedev daemon. Oct 14 05:36:34 localhost systemd[1]: Started SETroubleshoot daemon for processing new SELinux denial logs. Oct 14 05:36:34 localhost setroubleshoot[206383]: Deleting alert 5b2f497b-86cf-43db-9090-4f1c5a3a1db8, it is allowed in current policy Oct 14 05:36:34 localhost systemd[1]: Started dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@1.service. Oct 14 05:36:34 localhost python3.9[206527]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtproxyd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None Oct 14 05:36:34 localhost systemd[1]: Reloading. Oct 14 05:36:34 localhost systemd-rc-local-generator[206553]: /etc/rc.d/rc.local is not marked executable, skipping. Oct 14 05:36:34 localhost systemd-sysv-generator[206556]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. 
Please update package to include a native systemd unit file, in order to make it more safe and robust. Oct 14 05:36:35 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Oct 14 05:36:35 localhost systemd[1]: Starting libvirt proxy daemon socket... Oct 14 05:36:35 localhost systemd[1]: Listening on libvirt proxy daemon socket. Oct 14 05:36:35 localhost systemd[1]: Starting libvirt proxy daemon admin socket... Oct 14 05:36:35 localhost systemd[1]: Starting libvirt proxy daemon read-only socket... Oct 14 05:36:35 localhost systemd[1]: Listening on libvirt proxy daemon admin socket. Oct 14 05:36:35 localhost systemd[1]: Listening on libvirt proxy daemon read-only socket. Oct 14 05:36:35 localhost systemd[1]: Starting libvirt proxy daemon... Oct 14 05:36:35 localhost systemd[1]: Started libvirt proxy daemon. Oct 14 05:36:35 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47304 DF PROTO=TCP SPT=37220 DPT=9882 SEQ=3753610672 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B06A11B0000000001030307) Oct 14 05:36:35 localhost setroubleshoot[206383]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. 
For complete SELinux messages run: sealert -l bd82ff95-a75d-47a6-b059-c3e3d3e819a8 Oct 14 05:36:35 localhost setroubleshoot[206383]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.#012#012***** Plugin dac_override (91.4 confidence) suggests **********************#012#012If you want to help identify if domain needs this access or you have a file with the wrong permissions on your system#012Then turn on full auditing to get path information about the offending file and generate the error again.#012Do#012#012Turn on full auditing#012# auditctl -w /etc/shadow -p w#012Try to recreate AVC. Then execute#012# ausearch -m avc -ts recent#012If you see PATH record check ownership/permissions on file, and fix it,#012otherwise report as a bugzilla.#012#012***** Plugin catchall (9.59 confidence) suggests **************************#012#012If you believe that virtlogd should have the dac_read_search capability by default.#012Then you should report this as a bug.#012You can generate a local policy module to allow this access.#012Do#012allow this access for now by executing:#012# ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd#012# semodule -X 300 -i my-virtlogd.pp#012 Oct 14 05:36:35 localhost setroubleshoot[206383]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. For complete SELinux messages run: sealert -l bd82ff95-a75d-47a6-b059-c3e3d3e819a8 Oct 14 05:36:35 localhost setroubleshoot[206383]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.#012#012***** Plugin dac_override (91.4 confidence) suggests **********************#012#012If you want to help identify if domain needs this access or you have a file with the wrong permissions on your system#012Then turn on full auditing to get path information about the offending file and generate the error again.#012Do#012#012Turn on full auditing#012# auditctl -w /etc/shadow -p w#012Try to recreate AVC. 
Then execute#012# ausearch -m avc -ts recent#012If you see PATH record check ownership/permissions on file, and fix it,#012otherwise report as a bugzilla.#012#012***** Plugin catchall (9.59 confidence) suggests **************************#012#012If you believe that virtlogd should have the dac_read_search capability by default.#012Then you should report this as a bug.#012You can generate a local policy module to allow this access.#012Do#012allow this access for now by executing:#012# ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd#012# semodule -X 300 -i my-virtlogd.pp#012 Oct 14 05:36:35 localhost python3.9[206699]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtqemud.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None Oct 14 05:36:35 localhost systemd[1]: Reloading. Oct 14 05:36:36 localhost systemd-rc-local-generator[206727]: /etc/rc.d/rc.local is not marked executable, skipping. Oct 14 05:36:36 localhost systemd-sysv-generator[206730]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Oct 14 05:36:36 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Oct 14 05:36:36 localhost systemd[1]: Listening on libvirt locking daemon socket. Oct 14 05:36:36 localhost systemd[1]: Starting libvirt QEMU daemon socket... Oct 14 05:36:36 localhost systemd[1]: Listening on libvirt QEMU daemon socket. Oct 14 05:36:36 localhost systemd[1]: Starting libvirt QEMU daemon admin socket... Oct 14 05:36:36 localhost systemd[1]: Starting libvirt QEMU daemon read-only socket... Oct 14 05:36:36 localhost systemd[1]: Listening on libvirt QEMU daemon admin socket. 
Oct 14 05:36:36 localhost systemd[1]: Listening on libvirt QEMU daemon read-only socket. Oct 14 05:36:36 localhost systemd[1]: Starting libvirt QEMU daemon... Oct 14 05:36:36 localhost systemd[1]: Started libvirt QEMU daemon. Oct 14 05:36:37 localhost python3.9[206889]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtsecretd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None Oct 14 05:36:37 localhost systemd[1]: Reloading. Oct 14 05:36:37 localhost systemd-sysv-generator[206918]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Oct 14 05:36:37 localhost systemd-rc-local-generator[206914]: /etc/rc.d/rc.local is not marked executable, skipping. Oct 14 05:36:37 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Oct 14 05:36:37 localhost systemd[1]: Starting libvirt secret daemon socket... Oct 14 05:36:37 localhost systemd[1]: Listening on libvirt secret daemon socket. Oct 14 05:36:37 localhost systemd[1]: Starting libvirt secret daemon admin socket... Oct 14 05:36:37 localhost systemd[1]: Starting libvirt secret daemon read-only socket... Oct 14 05:36:37 localhost systemd[1]: Listening on libvirt secret daemon admin socket. Oct 14 05:36:37 localhost systemd[1]: Listening on libvirt secret daemon read-only socket. Oct 14 05:36:37 localhost systemd[1]: Starting libvirt secret daemon... Oct 14 05:36:37 localhost systemd[1]: Started libvirt secret daemon. 
Oct 14 05:36:38 localhost python3.9[207061]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/openstack/config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:36:38 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15171 DF PROTO=TCP SPT=38150 DPT=9105 SEQ=4165855793 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B06AE3E0000000001030307) Oct 14 05:36:39 localhost python3.9[207171]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/config/ceph'] patterns=['*.conf'] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None Oct 14 05:36:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15172 DF PROTO=TCP SPT=38150 DPT=9105 SEQ=4165855793 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B06B25A0000000001030307) Oct 14 05:36:39 localhost python3.9[207281]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail;#012echo ceph#012awk -F '=' '/fsid/ {print $2}' /var/lib/openstack/config/ceph/ceph.conf | xargs#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Oct 14 05:36:40 localhost python3.9[207393]: ansible-ansible.builtin.find 
Invoked with paths=['/var/lib/openstack/config/ceph'] patterns=['*.keyring'] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None Oct 14 05:36:41 localhost python3.9[207501]: ansible-ansible.legacy.stat Invoked with path=/tmp/secret.xml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True Oct 14 05:36:41 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15173 DF PROTO=TCP SPT=38150 DPT=9105 SEQ=4165855793 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B06BA5A0000000001030307) Oct 14 05:36:42 localhost python3.9[207587]: ansible-ansible.legacy.copy Invoked with dest=/tmp/secret.xml mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1760434601.1619728-3172-104972149534091/.source.xml follow=False _original_basename=secret.xml.j2 checksum=a98993dd7f9443820dd0c69ee661382763176cb0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:36:42 localhost python3.9[207697]: ansible-ansible.legacy.command Invoked with _raw_params=virsh secret-undefine fcadf6e2-9176-5818-a8d0-37b19acf8eaf#012virsh secret-define --file /tmp/secret.xml#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Oct 14 05:36:43 localhost python3.9[207817]: ansible-ansible.builtin.file Invoked with path=/tmp/secret.xml state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S 
access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:36:45 localhost systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@1.service: Deactivated successfully. Oct 14 05:36:45 localhost systemd[1]: setroubleshootd.service: Deactivated successfully. Oct 14 05:36:45 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15174 DF PROTO=TCP SPT=38150 DPT=9105 SEQ=4165855793 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B06CA1B0000000001030307) Oct 14 05:36:46 localhost python3.9[208155]: ansible-ansible.legacy.copy Invoked with dest=/etc/ceph/ceph.conf group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/config/ceph/ceph.conf backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:36:46 localhost python3.9[208265]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/libvirt.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True Oct 14 05:36:47 localhost python3.9[208353]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/libvirt.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1760434606.2507608-3338-35619732659414/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=dc5ee7162311c27a6084cbee4052b901d56cb1ba backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None 
selevel=None setype=None attributes=None Oct 14 05:36:48 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47559 DF PROTO=TCP SPT=44578 DPT=9101 SEQ=791738887 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B06D2DA0000000001030307) Oct 14 05:36:49 localhost python3.9[208463]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:36:49 localhost python3.9[208573]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True Oct 14 05:36:50 localhost python3.9[208630]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:36:51 localhost python3.9[208740]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True Oct 14 05:36:52 localhost python3.9[208797]: ansible-ansible.legacy.file Invoked with mode=0644 
dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.r6mnt6wk recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:36:53 localhost python3.9[208907]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True Oct 14 05:36:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28895 DF PROTO=TCP SPT=37260 DPT=9102 SEQ=3583412368 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B06E7C50000000001030307) Oct 14 05:36:53 localhost python3.9[208964]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:36:54 localhost python3.9[209074]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Oct 14 05:36:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28896 DF 
PROTO=TCP SPT=37260 DPT=9102 SEQ=3583412368 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B06EBDA0000000001030307) Oct 14 05:36:55 localhost python3[209185]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall Oct 14 05:36:56 localhost python3.9[209295]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True Oct 14 05:36:56 localhost python3.9[209352]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:36:57 localhost python3.9[209462]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True Oct 14 05:36:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:36:57.741 163055 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Oct 14 05:36:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:36:57.742 163055 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Oct 14 05:36:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:36:57.744 163055 DEBUG oslo_concurrency.lockutils [-] Lock 
"_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Oct 14 05:36:57 localhost python3.9[209519]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-update-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-update-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:36:58 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32839 DF PROTO=TCP SPT=53902 DPT=9882 SEQ=2295134318 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B06FA8A0000000001030307) Oct 14 05:36:58 localhost python3.9[209629]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True Oct 14 05:36:59 localhost python3.9[209686]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:36:59 localhost python3.9[209796]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True 
Oct 14 05:37:00 localhost python3.9[209853]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:37:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28898 DF PROTO=TCP SPT=37260 DPT=9102 SEQ=3583412368 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B07039B0000000001030307) Oct 14 05:37:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f. Oct 14 05:37:00 localhost podman[209887]: 2025-10-14 09:37:00.74737775 +0000 UTC m=+0.087390228 container health_status 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251009, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image) Oct 14 05:37:00 localhost podman[209887]: 2025-10-14 09:37:00.807132401 +0000 UTC m=+0.147144859 container exec_died 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}) Oct 14 05:37:00 localhost systemd[1]: 
1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f.service: Deactivated successfully. Oct 14 05:37:01 localhost python3.9[209990]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True Oct 14 05:37:01 localhost python3.9[210080]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1760434620.647825-3713-160931937653531/.source.nft follow=False _original_basename=ruleset.j2 checksum=e2e2635f27347d386f310e86d2b40c40289835bb backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:37:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857. Oct 14 05:37:02 localhost systemd[1]: tmp-crun.f9rSfJ.mount: Deactivated successfully. 
Oct 14 05:37:02 localhost podman[210191]: 2025-10-14 09:37:02.498428484 +0000 UTC m=+0.087803780 container health_status 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, managed_by=edpm_ansible, container_name=ovn_metadata_agent, org.label-schema.build-date=20251009, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0) Oct 14 05:37:02 localhost podman[210191]: 2025-10-14 09:37:02.503312926 +0000 UTC 
m=+0.092688202 container exec_died 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.schema-version=1.0) Oct 14 05:37:02 localhost systemd[1]: 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857.service: Deactivated successfully. 
Oct 14 05:37:02 localhost python3.9[210190]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:37:04 localhost python3.9[210385]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Oct 14 05:37:05 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32842 DF PROTO=TCP SPT=53902 DPT=9882 SEQ=2295134318 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B07165A0000000001030307) Oct 14 05:37:05 localhost python3.9[210516]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:37:07 localhost python3.9[210626]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f 
/etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Oct 14 05:37:07 localhost python3.9[210737]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Oct 14 05:37:08 localhost python3.9[210849]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Oct 14 05:37:08 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45787 DF PROTO=TCP SPT=38230 DPT=9105 SEQ=3419366415 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B07236E0000000001030307) Oct 14 05:37:09 localhost python3.9[210962]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:37:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45788 DF PROTO=TCP SPT=38230 DPT=9105 SEQ=3419366415 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B07275A0000000001030307) Oct 14 05:37:10 localhost python3.9[211072]: 
ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True Oct 14 05:37:10 localhost python3.9[211160]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1760434629.5225205-3928-230109349032626/.source.target follow=False _original_basename=edpm_libvirt.target checksum=13035a1aa0f414c677b14be9a5a363b6623d393c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:37:11 localhost python3.9[211270]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt_guests.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True Oct 14 05:37:11 localhost python3.9[211358]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt_guests.service mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1760434630.7559109-3973-244909119771583/.source.service follow=False _original_basename=edpm_libvirt_guests.service checksum=db83430a42fc2ccfd6ed8b56ebf04f3dff9cd0cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:37:11 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45789 DF PROTO=TCP SPT=38230 DPT=9105 SEQ=3419366415 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B072F5A0000000001030307) Oct 14 05:37:12 localhost python3.9[211468]: ansible-ansible.legacy.stat 
Invoked with path=/etc/systemd/system/virt-guest-shutdown.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True Oct 14 05:37:13 localhost python3.9[211556]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virt-guest-shutdown.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1760434632.0673199-4019-56389346841985/.source.target follow=False _original_basename=virt-guest-shutdown.target checksum=49ca149619c596cbba877418629d2cf8f7b0f5cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:37:14 localhost python3.9[211666]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt.target state=restarted daemon_reexec=False scope=system no_block=False force=None masked=None Oct 14 05:37:14 localhost systemd[1]: Reloading. Oct 14 05:37:14 localhost systemd-rc-local-generator[211691]: /etc/rc.d/rc.local is not marked executable, skipping. Oct 14 05:37:14 localhost systemd-sysv-generator[211695]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Oct 14 05:37:14 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Oct 14 05:37:14 localhost systemd[1]: Reached target edpm_libvirt.target. Oct 14 05:37:15 localhost python3.9[211817]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt_guests daemon_reexec=False scope=system no_block=False state=None force=None masked=None Oct 14 05:37:15 localhost systemd[1]: Reloading. 
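The blockinfile entry at 05:37:05 above assembles the persistent nftables boot configuration. Decoding the `#012` (newline) escapes and the `BEGIN`/`END` markers logged there, the managed block written to /etc/sysconfig/nftables.conf would read:

```
# BEGIN ANSIBLE MANAGED BLOCK
include "/etc/nftables/iptables.nft"
include "/etc/nftables/edpm-chains.nft"
include "/etc/nftables/edpm-rules.nft"
include "/etc/nftables/edpm-jumps.nft"
# END ANSIBLE MANAGED BLOCK
```

The same entry shows the task validating the result with `nft -c -f %s` before committing it, matching the earlier `cat ... | nft -c -f -` dry-run at 05:37:04.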
Oct 14 05:37:15 localhost systemd-rc-local-generator[211845]: /etc/rc.d/rc.local is not marked executable, skipping. Oct 14 05:37:15 localhost systemd-sysv-generator[211848]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Oct 14 05:37:15 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Oct 14 05:37:15 localhost systemd[1]: Reloading. Oct 14 05:37:15 localhost systemd-rc-local-generator[211879]: /etc/rc.d/rc.local is not marked executable, skipping. Oct 14 05:37:15 localhost systemd-sysv-generator[211883]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Oct 14 05:37:15 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Oct 14 05:37:15 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45790 DF PROTO=TCP SPT=38230 DPT=9105 SEQ=3419366415 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B073F1A0000000001030307) Oct 14 05:37:16 localhost systemd[1]: session-53.scope: Deactivated successfully. Oct 14 05:37:16 localhost systemd[1]: session-53.scope: Consumed 3min 47.917s CPU time. Oct 14 05:37:16 localhost systemd-logind[760]: Session 53 logged out. Waiting for processes to exit. Oct 14 05:37:16 localhost systemd-logind[760]: Removed session 53. 
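The recurring kernel `DROPPING:` entries are produced by a logging rule in the EDPM nftables ruleset referenced above, which tags packets with a `DROPPING: ` prefix before discarding them. A minimal sketch of such a rule follows; the table and chain names here are assumptions for illustration, not taken from the actual edpm-*.nft files:

```
# Hypothetical sketch — the real rule lives in the edpm-*.nft files above;
# table/chain names assumed.
table inet filter {
    chain edpm-input {
        log prefix "DROPPING: " drop
    }
}
```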
Oct 14 05:37:18 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30896 DF PROTO=TCP SPT=42046 DPT=9101 SEQ=4212716301 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B07481A0000000001030307) Oct 14 05:37:19 localhost sshd[211909]: main: sshd: ssh-rsa algorithm is disabled Oct 14 05:37:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51008 DF PROTO=TCP SPT=51966 DPT=9102 SEQ=530804543 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B075CF50000000001030307) Oct 14 05:37:23 localhost sshd[211911]: main: sshd: ssh-rsa algorithm is disabled Oct 14 05:37:23 localhost systemd-logind[760]: New session 54 of user zuul. Oct 14 05:37:23 localhost systemd[1]: Started Session 54 of User zuul. Oct 14 05:37:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51009 DF PROTO=TCP SPT=51966 DPT=9102 SEQ=530804543 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B07611A0000000001030307) Oct 14 05:37:24 localhost python3.9[212022]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Oct 14 05:37:26 localhost python3.9[212136]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/iscsi setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None Oct 14 05:37:26 localhost python3.9[212246]: 
ansible-ansible.builtin.file Invoked with path=/etc/target setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None Oct 14 05:37:27 localhost python3.9[212356]: ansible-ansible.builtin.file Invoked with path=/var/lib/iscsi setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None Oct 14 05:37:28 localhost python3.9[212466]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/config-data selevel=s0 setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None Oct 14 05:37:28 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39831 DF PROTO=TCP SPT=60426 DPT=9882 SEQ=1370904131 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B076FBC0000000001030307) Oct 14 05:37:28 localhost python3.9[212576]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/config-data/ansible-generated/iscsid setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None 
_diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None Oct 14 05:37:29 localhost python3.9[212686]: ansible-ansible.builtin.stat Invoked with path=/lib/systemd/system/iscsid.socket follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Oct 14 05:37:30 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51011 DF PROTO=TCP SPT=51966 DPT=9102 SEQ=530804543 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B0778DA0000000001030307) Oct 14 05:37:30 localhost python3.9[212798]: ansible-ansible.builtin.systemd Invoked with enabled=False name=iscsid.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Oct 14 05:37:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f. Oct 14 05:37:31 localhost systemd[1]: Reloading. 
Oct 14 05:37:31 localhost podman[212800]: 2025-10-14 09:37:31.03818101 +0000 UTC m=+0.071803747 container health_status 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, managed_by=edpm_ansible) Oct 14 05:37:31 localhost podman[212800]: 2025-10-14 09:37:31.098050454 +0000 UTC m=+0.131673191 container exec_died 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_id=ovn_controller, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, 
io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2) Oct 14 05:37:31 localhost systemd-rc-local-generator[212852]: /etc/rc.d/rc.local is not marked executable, skipping. Oct 14 05:37:31 localhost systemd-sysv-generator[212856]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Oct 14 05:37:31 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Oct 14 05:37:31 localhost systemd[1]: 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f.service: Deactivated successfully. Oct 14 05:37:32 localhost python3.9[212970]: ansible-ansible.builtin.service_facts Invoked Oct 14 05:37:32 localhost network[212987]: You are using 'network' service provided by 'network-scripts', which are now deprecated. 
Oct 14 05:37:32 localhost network[212988]: 'network-scripts' will be removed from distribution in near future. Oct 14 05:37:32 localhost network[212989]: It is advised to switch to 'NetworkManager' instead for network management. Oct 14 05:37:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857. Oct 14 05:37:33 localhost systemd[1]: tmp-crun.Y4uIBh.mount: Deactivated successfully. Oct 14 05:37:33 localhost podman[212996]: 2025-10-14 09:37:33.118116009 +0000 UTC m=+0.092198860 container health_status 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_id=ovn_metadata_agent, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3) Oct 14 05:37:33 localhost podman[212996]: 2025-10-14 09:37:33.155138896 +0000 UTC m=+0.129221827 container exec_died 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251009, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5) Oct 14 05:37:33 localhost systemd[1]: 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857.service: Deactivated successfully. Oct 14 05:37:33 localhost systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Oct 14 05:37:35 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39834 DF PROTO=TCP SPT=60426 DPT=9882 SEQ=1370904131 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B078B9B0000000001030307) Oct 14 05:37:37 localhost python3.9[213240]: ansible-ansible.builtin.systemd Invoked with enabled=False name=iscsi-starter.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Oct 14 05:37:38 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59382 DF PROTO=TCP SPT=34552 DPT=9105 SEQ=227406702 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B07989E0000000001030307) Oct 14 05:37:38 localhost systemd[1]: Reloading. Oct 14 05:37:38 localhost systemd-rc-local-generator[213269]: /etc/rc.d/rc.local is not marked executable, skipping. Oct 14 05:37:38 localhost systemd-sysv-generator[213274]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. 
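Each `DROPPING:` record above is a flat run of `KEY=value` pairs, with flag-only tokens such as `SYN` and `DF` carrying no value. A small self-contained sketch of parsing these records into a dict, using only the field layout visible in the log:

```python
import re

# KEY=value pairs as they appear after the "DROPPING:" marker; flag-only
# tokens (SYN, DF, ...) contain no "=" and are simply skipped here.
FIELD_RE = re.compile(r"(\w+)=(\S*)")

def parse_dropping(line: str) -> dict:
    """Return the KEY=value fields of one kernel 'DROPPING:' log entry."""
    _, _, rest = line.partition("DROPPING:")
    return dict(FIELD_RE.findall(rest))

# Sample trimmed from the entries above.
sample = ("DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b "
          "SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TTL=62 "
          "PROTO=TCP SPT=38230 DPT=9105 SYN URGP=0")

fields = parse_dropping(sample)
print(fields["SRC"], fields["DPT"])  # → 192.168.122.10 9105
```

Empty values (e.g. `OUT=`) parse to empty strings, which distinguishes "present but empty" from "absent" when filtering these entries.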
Oct 14 05:37:38 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Oct 14 05:37:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59383 DF PROTO=TCP SPT=34552 DPT=9105 SEQ=227406702 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B079C9A0000000001030307) Oct 14 05:37:40 localhost python3.9[213386]: ansible-ansible.builtin.stat Invoked with path=/etc/iscsi/.initiator_reset follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Oct 14 05:37:40 localhost python3.9[213496]: ansible-ansible.builtin.stat Invoked with path=/etc/iscsi/iscsid.conf follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Oct 14 05:37:41 localhost python3.9[213608]: ansible-ansible.builtin.lineinfile Invoked with insertafter=^#node.session.auth.chap.algs line=node.session.auth.chap_algs = SHA3-256,SHA256,SHA1,MD5 path=/etc/iscsi/iscsid.conf regexp=^node.session.auth.chap_algs state=present backrefs=False create=False backup=False firstmatch=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:37:41 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59384 DF PROTO=TCP SPT=34552 DPT=9105 SEQ=227406702 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B07A49A0000000001030307) Oct 14 05:37:42 localhost python3.9[213718]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True 
modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None Oct 14 05:37:43 localhost python3.9[213828]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True Oct 14 05:37:43 localhost python3.9[213885]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Oct 14 05:37:44 localhost python3.9[213995]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True Oct 14 05:37:44 localhost python3.9[214052]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Oct 14 05:37:45 localhost python3.9[214162]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False 
force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:37:45 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59385 DF PROTO=TCP SPT=34552 DPT=9105 SEQ=227406702 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B07B45A0000000001030307) Oct 14 05:37:46 localhost python3.9[214272]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True Oct 14 05:37:46 localhost python3.9[214329]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:37:47 localhost python3.9[214439]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True Oct 14 05:37:48 localhost python3.9[214496]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file 
path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:37:48 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14764 DF PROTO=TCP SPT=37694 DPT=9101 SEQ=317659329 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B07BD1B0000000001030307) Oct 14 05:37:49 localhost python3.9[214606]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None Oct 14 05:37:49 localhost systemd[1]: Reloading. Oct 14 05:37:49 localhost systemd-sysv-generator[214635]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Oct 14 05:37:49 localhost systemd-rc-local-generator[214632]: /etc/rc.d/rc.local is not marked executable, skipping. Oct 14 05:37:49 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Oct 14 05:37:50 localhost python3.9[214755]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True Oct 14 05:37:51 localhost python3.9[214812]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:37:52 localhost python3.9[214922]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True Oct 14 05:37:52 localhost python3.9[214979]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:37:53 localhost python3.9[215089]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None Oct 14 05:37:53 localhost systemd[1]: Reloading. 
Oct 14 05:37:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58546 DF PROTO=TCP SPT=60140 DPT=9102 SEQ=3293457376 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B07D2250000000001030307) Oct 14 05:37:53 localhost systemd-rc-local-generator[215114]: /etc/rc.d/rc.local is not marked executable, skipping. Oct 14 05:37:53 localhost systemd-sysv-generator[215118]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Oct 14 05:37:53 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Oct 14 05:37:53 localhost systemd[1]: Starting Create netns directory... Oct 14 05:37:53 localhost systemd[1]: run-netns-placeholder.mount: Deactivated successfully. Oct 14 05:37:53 localhost systemd[1]: netns-placeholder.service: Deactivated successfully. Oct 14 05:37:53 localhost systemd[1]: Finished Create netns directory. 
Oct 14 05:37:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58547 DF PROTO=TCP SPT=60140 DPT=9102 SEQ=3293457376 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B07D61A0000000001030307) Oct 14 05:37:54 localhost python3.9[215241]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Oct 14 05:37:55 localhost python3.9[215351]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/iscsid/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True Oct 14 05:37:56 localhost python3.9[215439]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/iscsid/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1760434675.0328078-698-253108362279143/.source _original_basename=healthcheck follow=False checksum=2e1237e7fe015c809b173c52e24cfb87132f4344 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None Oct 14 05:37:57 localhost python3.9[215549]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None 
mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None Oct 14 05:37:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:37:57.742 163055 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Oct 14 05:37:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:37:57.744 163055 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Oct 14 05:37:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:37:57.745 163055 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Oct 14 05:37:57 localhost python3.9[215659]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/iscsid.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True Oct 14 05:37:58 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14841 DF PROTO=TCP SPT=53202 DPT=9882 SEQ=384601711 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B07E4EA0000000001030307) Oct 14 05:37:58 localhost python3.9[215749]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/iscsid.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1760434677.4729414-772-27477508919765/.source.json _original_basename=.nin0_ft5 follow=False checksum=80e4f97460718c7e5c66b21ef8b846eba0e0dbc8 backup=False 
force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:37:59 localhost python3.9[215859]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/iscsid state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:38:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58549 DF PROTO=TCP SPT=60140 DPT=9102 SEQ=3293457376 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B07EDDA0000000001030307) Oct 14 05:38:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f. Oct 14 05:38:01 localhost systemd[1]: tmp-crun.YUXDOQ.mount: Deactivated successfully. 
Oct 14 05:38:01 localhost podman[216058]: 2025-10-14 09:38:01.750240629 +0000 UTC m=+0.085171148 container health_status 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, managed_by=edpm_ansible) Oct 14 05:38:01 localhost podman[216058]: 2025-10-14 09:38:01.793109672 +0000 UTC m=+0.128040211 container exec_died 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': 
'/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.build-date=20251009, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS) Oct 14 05:38:01 localhost systemd[1]: 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f.service: Deactivated successfully. Oct 14 05:38:02 localhost python3.9[216194]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/iscsid config_pattern=*.json debug=False Oct 14 05:38:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857. 
Oct 14 05:38:03 localhost podman[216250]: 2025-10-14 09:38:03.733770303 +0000 UTC m=+0.075032493 container health_status 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true) Oct 14 05:38:03 localhost podman[216250]: 2025-10-14 09:38:03.743139755 +0000 UTC 
m=+0.084401945 container exec_died 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}) Oct 14 05:38:03 localhost systemd[1]: 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857.service: Deactivated successfully. 
Oct 14 05:38:04 localhost python3.9[216322]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data Oct 14 05:38:05 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14844 DF PROTO=TCP SPT=53202 DPT=9882 SEQ=384601711 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B08009A0000000001030307) Oct 14 05:38:05 localhost podman[216540]: 2025-10-14 09:38:05.805457855 +0000 UTC m=+0.095445348 container exec b19a91314e045cbe5465f6abdeb6b420108fcda7860b498b01dcd4e65236d2c8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-crash-np0005486733, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, architecture=x86_64, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/agreements, ceph=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vcs-type=git, name=rhceph, GIT_BRANCH=main, io.buildah.version=1.33.12, RELEASE=main, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, release=553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-09-24T08:57:55, vendor=Red Hat, Inc., version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, maintainer=Guillaume Abrioux ) Oct 14 05:38:05 localhost python3.9[216542]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None Oct 14 05:38:05 localhost podman[216540]: 2025-10-14 
09:38:05.941189677 +0000 UTC m=+0.231177150 container exec_died b19a91314e045cbe5465f6abdeb6b420108fcda7860b498b01dcd4e65236d2c8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-crash-np0005486733, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.33.12, CEPH_POINT_RELEASE=, name=rhceph, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, build-date=2025-09-24T08:57:55, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.tags=rhceph ceph, release=553, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, ceph=True, vendor=Red Hat, Inc., GIT_CLEAN=True, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux , com.redhat.component=rhceph-container, version=7, RELEASE=main) Oct 14 05:38:08 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11640 DF PROTO=TCP SPT=54700 DPT=9105 SEQ=2759824423 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B080DCE0000000001030307) Oct 14 05:38:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11641 DF PROTO=TCP SPT=54700 DPT=9105 SEQ=2759824423 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B0811DA0000000001030307) Oct 14 05:38:10 localhost python3[216827]: 
ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/iscsid config_id=iscsid config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False Oct 14 05:38:11 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11642 DF PROTO=TCP SPT=54700 DPT=9105 SEQ=2759824423 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B0819DA0000000001030307) Oct 14 05:38:13 localhost podman[216841]: 2025-10-14 09:38:10.531564047 +0000 UTC m=+0.031706147 image pull quay.io/podified-antelope-centos9/openstack-iscsid:current-podified Oct 14 05:38:13 localhost python3[216827]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: [#012 {#012 "Id": "4f44a4f5e0315c0d3dbd533e21d0927bf0518cf452942382901ff1ff9d621cbd",#012 "Digest": "sha256:2975c6e807fa09f0e2062da08d3a0bb209ca055d73011ebb91164def554f60aa",#012 "RepoTags": [#012 "quay.io/podified-antelope-centos9/openstack-iscsid:current-podified"#012 ],#012 "RepoDigests": [#012 "quay.io/podified-antelope-centos9/openstack-iscsid@sha256:2975c6e807fa09f0e2062da08d3a0bb209ca055d73011ebb91164def554f60aa"#012 ],#012 "Parent": "",#012 "Comment": "",#012 "Created": "2025-10-14T06:14:08.154480843Z",#012 "Config": {#012 "User": "root",#012 "Env": [#012 "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin",#012 "LANG=en_US.UTF-8",#012 "TZ=UTC",#012 "container=oci"#012 ],#012 "Entrypoint": [#012 "dumb-init",#012 "--single-child",#012 "--"#012 ],#012 "Cmd": [#012 "kolla_start"#012 ],#012 "Labels": {#012 "io.buildah.version": "1.41.3",#012 "maintainer": "OpenStack Kubernetes Operator team",#012 "org.label-schema.build-date": "20251009",#012 "org.label-schema.license": "GPLv2",#012 "org.label-schema.name": "CentOS Stream 9 Base Image",#012 "org.label-schema.schema-version": "1.0",#012 "org.label-schema.vendor": 
"CentOS",#012 "tcib_build_tag": "0468cb21803d466b2abfe00835cf1d2d",#012 "tcib_managed": "true"#012 },#012 "StopSignal": "SIGTERM"#012 },#012 "Version": "",#012 "Author": "",#012 "Architecture": "amd64",#012 "Os": "linux",#012 "Size": 403858061,#012 "VirtualSize": 403858061,#012 "GraphDriver": {#012 "Name": "overlay",#012 "Data": {#012 "LowerDir": "/var/lib/containers/storage/overlay/1b94024f0eaacdff3ae200e2172324d7aec107282443f6fc22fe2f0287bc90ec/diff:/var/lib/containers/storage/overlay/0b52816892c0967aea6a33893e73899adbf76e3ca055f6670535905d8ddf2b2c/diff:/var/lib/containers/storage/overlay/f3f40f6483bf6d587286da9e86e40878c2aaaf723da5aa2364fff24f5ea28424/diff",#012 "UpperDir": "/var/lib/containers/storage/overlay/9c7bc0417a3c6c9361659b5f2f41d814b152f8a47a3821564971debd2b788997/diff",#012 "WorkDir": "/var/lib/containers/storage/overlay/9c7bc0417a3c6c9361659b5f2f41d814b152f8a47a3821564971debd2b788997/work"#012 }#012 },#012 "RootFS": {#012 "Type": "layers",#012 "Layers": [#012 "sha256:f3f40f6483bf6d587286da9e86e40878c2aaaf723da5aa2364fff24f5ea28424",#012 "sha256:2896905ce9321c1f2feb1f3ada413e86eda3444455358ab965478a041351b392",#012 "sha256:f640179b0564dc7abbe22bd39fc8810d5bbb8e54094fe7ebc5b3c45b658c4983",#012 "sha256:f004953af60f7a99c360488169b0781a154164be09dce508bd68d57932c60f8f"#012 ]#012 },#012 "Labels": {#012 "io.buildah.version": "1.41.3",#012 "maintainer": "OpenStack Kubernetes Operator team",#012 "org.label-schema.build-date": "20251009",#012 "org.label-schema.license": "GPLv2",#012 "org.label-schema.name": "CentOS Stream 9 Base Image",#012 "org.label-schema.schema-version": "1.0",#012 "org.label-schema.vendor": "CentOS",#012 "tcib_build_tag": "0468cb21803d466b2abfe00835cf1d2d",#012 "tcib_managed": "true"#012 },#012 "Annotations": {},#012 "ManifestType": "application/vnd.docker.distribution.manifest.v2+json",#012 "User": "root",#012 "History": [#012 {#012 "created": "2025-10-09T00:18:03.867908726Z",#012 "created_by": "/bin/sh -c #(nop) ADD 
file:b2e608b9da8e087a764c2aebbd9c2cc9181047f5b301f1dab77fdf098a28268b in / ",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-10-09T00:18:03.868015697Z",#012 "created_by": "/bin/sh -c #(nop) LABEL org.label-schema.schema-version=\"1.0\" org.label-schema.name=\"CentOS Stream 9 Base Image\" org.label-schema.vendor=\"CentOS\" org.label-schema.license=\"GPLv2\" org.label-schema.build-date=\"20251009\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-10-09T00:18:07.890794359Z",#012 "created_by": "/bin/sh -c #(nop) CMD [\"/bin/bash\"]"#012 },#012 {#012 "created": "2025-10-14T06:08:54.969219151Z",#012 "created_by": "/bin/sh -c #(nop) LABEL maintainer=\"OpenStack Kubernetes Operator team\"",#012 "comment": "FROM quay.io/centos/centos:stream9",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-10-14T06:08:54.969253522Z",#012 "created_by": "/bin/sh -c #(nop) LABEL tcib_managed=true",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-10-14T06:08:54.969285133Z",#012 "created_by": "/bin/sh -c #(nop) ENV LANG=\"en_US.UTF-8\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-10-14T06:08:54.969308103Z",#012 "created_by": "/bin/sh -c #(nop) ENV TZ=\"UTC\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-10-14T06:08:54.969342284Z",#012 "created_by": "/bin/sh -c #(nop) ENV container=\"oci\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-10-14T06:08:54.969363945Z",#012 "created_by": "/bin/sh -c #(nop) USER root",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-10-14T06:08:55.340499198Z",#012 "created_by": "/bin/sh -c if [ -f \"/etc/yum.repos.d/ubi.repo\" ]; then rm -f /etc/yum.repos.d/ubi.repo && dnf clean all && rm -rf /var/cache/dnf; fi",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-10-14T06:09:32.389605838Z",#012 "created_by": "/bin/sh -c dnf install -y crudini && crudini --del /etc/dnf/dnf.conf main override_install_langs && crudini --set /etc/dnf/dnf.conf main 
clean_requirements_on_remove True && crudini --set /etc/dnf/dnf.conf main exactarch 1 && crudini --set /etc/dnf/dnf.conf main gpgcheck 1 && crudini --set /etc/dnf/dnf.conf main install_weak_deps False && if [ 'centos' == 'centos' ];then crudini --set /etc/dnf/dnf.conf main best False; fi && crudini --set /etc/dnf/dnf.conf main installonly_limit 0 && crudini --set /etc/dnf/dnf.conf main keepcache 0 && crudini --set /etc/dnf/dnf.conf main obsoletes 1 && crudini --set /etc/dnf/dnf.conf main plugins 1 && crudini --set /etc/dnf/dnf.conf main skip_missing_names_on_install False && crudini --set /etc/dnf/dnf.conf main tsflags nodocs",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-10-14T06:09:35.587912811Z",#012 "created_by": "/bin/sh -c dnf install -y ca-certificates dumb-init glibc-langpack-en procps-ng python3 sudo util-linux-user which Oct 14 05:38:13 localhost podman[216903]: 2025-10-14 09:38:13.351436684 +0000 UTC m=+0.084836291 container remove 2e644299c59a59570227c48d3b11e80fd3cc404afcbbc84b2169e6b3ab689fea (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, io.openshift.expose-services=, release=1, config_id=tripleo_step3, version=17.1.9, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, vendor=Red Hat, Inc., tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, batch=17.1_20250721.1, build-date=2025-07-21T13:27:15, architecture=x86_64, distribution-scope=public, maintainer=OpenStack TripleO Team, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, vcs-type=git, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2) Oct 14 05:38:13 localhost python3[216827]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman rm --force iscsid Oct 14 05:38:13 localhost podman[216917]: Oct 14 05:38:13 localhost podman[216917]: 2025-10-14 09:38:13.458234967 +0000 UTC m=+0.089731666 container create fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=iscsid, tcib_managed=true, managed_by=edpm_ansible) Oct 14 05:38:13 localhost podman[216917]: 2025-10-14 09:38:13.413733675 +0000 UTC m=+0.045230404 image pull quay.io/podified-antelope-centos9/openstack-iscsid:current-podified Oct 14 05:38:13 localhost python3[216827]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name iscsid --conmon-pidfile /run/iscsid.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --healthcheck-command /openstack/healthcheck --label config_id=iscsid --label container_name=iscsid --label managed_by=edpm_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro --volume /dev:/dev --volume /run:/run --volume /sys:/sys --volume /lib/modules:/lib/modules:ro --volume /etc/iscsi:/etc/iscsi:z --volume /etc/target:/etc/target:z --volume /var/lib/iscsi:/var/lib/iscsi:z --volume /var/lib/openstack/healthchecks/iscsid:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-iscsid:current-podified Oct 14 05:38:14 localhost python3.9[217065]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Oct 14 05:38:15 localhost python3.9[217177]: ansible-file Invoked with path=/etc/systemd/system/edpm_iscsid.requires state=absent recurse=False 
force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:38:15 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11643 DF PROTO=TCP SPT=54700 DPT=9105 SEQ=2759824423 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B08299B0000000001030307) Oct 14 05:38:16 localhost python3.9[217232]: ansible-stat Invoked with path=/etc/systemd/system/edpm_iscsid_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Oct 14 05:38:16 localhost python3.9[217341]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1760434696.101883-1036-252371193808403/source dest=/etc/systemd/system/edpm_iscsid.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:38:17 localhost python3.9[217396]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None Oct 14 05:38:17 localhost systemd[1]: Reloading. Oct 14 05:38:17 localhost systemd-sysv-generator[217425]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. 
Oct 14 05:38:17 localhost systemd-rc-local-generator[217421]: /etc/rc.d/rc.local is not marked executable, skipping. Oct 14 05:38:18 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Oct 14 05:38:18 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37542 DF PROTO=TCP SPT=46472 DPT=9101 SEQ=2562127632 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B08325A0000000001030307) Oct 14 05:38:18 localhost python3.9[217486]: ansible-systemd Invoked with state=restarted name=edpm_iscsid.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Oct 14 05:38:18 localhost systemd[1]: Reloading. Oct 14 05:38:18 localhost systemd-sysv-generator[217519]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Oct 14 05:38:18 localhost systemd-rc-local-generator[217516]: /etc/rc.d/rc.local is not marked executable, skipping. Oct 14 05:38:18 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Oct 14 05:38:19 localhost systemd[1]: Starting iscsid container... Oct 14 05:38:19 localhost systemd[1]: Started libcrun container. 
Oct 14 05:38:19 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/49c20d678c288528670e909b37629f6fe8db82db90dafb25b7a74d603708ca24/merged/etc/target supports timestamps until 2038 (0x7fffffff) Oct 14 05:38:19 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/49c20d678c288528670e909b37629f6fe8db82db90dafb25b7a74d603708ca24/merged/etc/iscsi supports timestamps until 2038 (0x7fffffff) Oct 14 05:38:19 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/49c20d678c288528670e909b37629f6fe8db82db90dafb25b7a74d603708ca24/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff) Oct 14 05:38:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29. Oct 14 05:38:19 localhost podman[217527]: 2025-10-14 09:38:19.202598649 +0000 UTC m=+0.118657446 container init fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, tcib_managed=true, managed_by=edpm_ansible, container_name=iscsid, config_id=iscsid, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, org.label-schema.vendor=CentOS) Oct 14 05:38:19 localhost iscsid[217542]: + sudo -E kolla_set_configs Oct 14 05:38:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29. Oct 14 05:38:19 localhost podman[217527]: 2025-10-14 09:38:19.239379347 +0000 UTC m=+0.155438114 container start fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', 
'/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, tcib_managed=true, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, container_name=iscsid) Oct 14 05:38:19 localhost podman[217527]: iscsid Oct 14 05:38:19 localhost systemd[1]: Created slice User Slice of UID 0. Oct 14 05:38:19 localhost systemd[1]: Starting User Runtime Directory /run/user/0... Oct 14 05:38:19 localhost systemd[1]: Started iscsid container. Oct 14 05:38:19 localhost systemd[1]: Finished User Runtime Directory /run/user/0. Oct 14 05:38:19 localhost systemd[1]: Starting User Manager for UID 0... Oct 14 05:38:19 localhost podman[217550]: 2025-10-14 09:38:19.314457071 +0000 UTC m=+0.075428916 container health_status fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=starting, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, container_name=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true) Oct 14 05:38:19 localhost podman[217550]: 2025-10-14 09:38:19.352090785 +0000 UTC m=+0.113062690 container exec_died fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, org.label-schema.build-date=20251009, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_id=iscsid, container_name=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}) Oct 14 05:38:19 localhost podman[217550]: unhealthy Oct 14 05:38:19 localhost systemd[1]: fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29.service: Main process exited, code=exited, status=1/FAILURE Oct 14 05:38:19 localhost systemd[1]: fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29.service: Failed with result 'exit-code'. Oct 14 05:38:19 localhost systemd[217558]: Queued start job for default target Main User Target. Oct 14 05:38:19 localhost systemd[217558]: Created slice User Application Slice. Oct 14 05:38:19 localhost systemd[217558]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system). Oct 14 05:38:19 localhost systemd[217558]: Started Daily Cleanup of User's Temporary Directories. Oct 14 05:38:19 localhost systemd[217558]: Reached target Paths. Oct 14 05:38:19 localhost systemd[217558]: Reached target Timers. Oct 14 05:38:19 localhost systemd[217558]: Starting D-Bus User Message Bus Socket... Oct 14 05:38:19 localhost systemd[217558]: Starting Create User's Volatile Files and Directories... Oct 14 05:38:19 localhost systemd[217558]: Listening on D-Bus User Message Bus Socket. Oct 14 05:38:19 localhost systemd[217558]: Reached target Sockets. Oct 14 05:38:19 localhost systemd[217558]: Finished Create User's Volatile Files and Directories. Oct 14 05:38:19 localhost systemd[217558]: Reached target Basic System. Oct 14 05:38:19 localhost systemd[217558]: Reached target Main User Target. 
Oct 14 05:38:19 localhost systemd[217558]: Startup finished in 118ms. Oct 14 05:38:19 localhost systemd[1]: Started User Manager for UID 0. Oct 14 05:38:19 localhost systemd[1]: Started Session c14 of User root. Oct 14 05:38:19 localhost iscsid[217542]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json Oct 14 05:38:19 localhost iscsid[217542]: INFO:__main__:Validating config file Oct 14 05:38:19 localhost iscsid[217542]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS Oct 14 05:38:19 localhost iscsid[217542]: INFO:__main__:Writing out command to execute Oct 14 05:38:19 localhost systemd[1]: session-c14.scope: Deactivated successfully. Oct 14 05:38:19 localhost iscsid[217542]: ++ cat /run_command Oct 14 05:38:19 localhost iscsid[217542]: + CMD='/usr/sbin/iscsid -f' Oct 14 05:38:19 localhost iscsid[217542]: + ARGS= Oct 14 05:38:19 localhost iscsid[217542]: + sudo kolla_copy_cacerts Oct 14 05:38:19 localhost systemd[1]: Started Session c15 of User root. Oct 14 05:38:19 localhost systemd[1]: session-c15.scope: Deactivated successfully. Oct 14 05:38:19 localhost iscsid[217542]: + [[ ! -n '' ]] Oct 14 05:38:19 localhost iscsid[217542]: + . kolla_extend_start Oct 14 05:38:19 localhost iscsid[217542]: ++ [[ ! 
-f /etc/iscsi/initiatorname.iscsi ]] Oct 14 05:38:19 localhost iscsid[217542]: Running command: '/usr/sbin/iscsid -f' Oct 14 05:38:19 localhost iscsid[217542]: + echo 'Running command: '\''/usr/sbin/iscsid -f'\''' Oct 14 05:38:19 localhost iscsid[217542]: + umask 0022 Oct 14 05:38:19 localhost iscsid[217542]: + exec /usr/sbin/iscsid -f Oct 14 05:38:21 localhost python3.9[217696]: ansible-ansible.builtin.stat Invoked with path=/etc/iscsi/.iscsid_restart_required follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Oct 14 05:38:22 localhost python3.9[217806]: ansible-ansible.builtin.file Invoked with path=/etc/iscsi/.iscsid_restart_required state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:38:23 localhost python3.9[217916]: ansible-ansible.builtin.service_facts Invoked Oct 14 05:38:23 localhost network[217933]: You are using 'network' service provided by 'network-scripts', which are now deprecated. Oct 14 05:38:23 localhost network[217934]: 'network-scripts' will be removed from distribution in near future. Oct 14 05:38:23 localhost network[217935]: It is advised to switch to 'NetworkManager' instead for network management. 
Oct 14 05:38:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6948 DF PROTO=TCP SPT=46666 DPT=9102 SEQ=668247626 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B0847550000000001030307) Oct 14 05:38:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6949 DF PROTO=TCP SPT=46666 DPT=9102 SEQ=668247626 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B084B5A0000000001030307) Oct 14 05:38:24 localhost systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Oct 14 05:38:28 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58078 DF PROTO=TCP SPT=46370 DPT=9882 SEQ=1297104196 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B085A1A0000000001030307) Oct 14 05:38:28 localhost python3.9[218168]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None Oct 14 05:38:29 localhost systemd[1]: Stopping User Manager for UID 0... Oct 14 05:38:29 localhost systemd[217558]: Activating special unit Exit the Session... Oct 14 05:38:29 localhost systemd[217558]: Stopped target Main User Target. Oct 14 05:38:29 localhost systemd[217558]: Stopped target Basic System. 
Oct 14 05:38:29 localhost systemd[217558]: Stopped target Paths. Oct 14 05:38:29 localhost systemd[217558]: Stopped target Sockets. Oct 14 05:38:29 localhost systemd[217558]: Stopped target Timers. Oct 14 05:38:29 localhost systemd[217558]: Stopped Daily Cleanup of User's Temporary Directories. Oct 14 05:38:29 localhost systemd[217558]: Closed D-Bus User Message Bus Socket. Oct 14 05:38:29 localhost systemd[217558]: Stopped Create User's Volatile Files and Directories. Oct 14 05:38:29 localhost systemd[217558]: Removed slice User Application Slice. Oct 14 05:38:29 localhost systemd[217558]: Reached target Shutdown. Oct 14 05:38:29 localhost systemd[217558]: Finished Exit the Session. Oct 14 05:38:29 localhost systemd[217558]: Reached target Exit the Session. Oct 14 05:38:29 localhost systemd[1]: user@0.service: Deactivated successfully. Oct 14 05:38:29 localhost systemd[1]: Stopped User Manager for UID 0. Oct 14 05:38:29 localhost systemd[1]: Stopping User Runtime Directory /run/user/0... Oct 14 05:38:29 localhost systemd[1]: run-user-0.mount: Deactivated successfully. Oct 14 05:38:29 localhost systemd[1]: user-runtime-dir@0.service: Deactivated successfully. Oct 14 05:38:29 localhost systemd[1]: Stopped User Runtime Directory /run/user/0. Oct 14 05:38:29 localhost systemd[1]: Removed slice User Slice of UID 0. 
Oct 14 05:38:29 localhost python3.9[218278]: ansible-community.general.modprobe Invoked with name=dm-multipath state=present params= persistent=disabled Oct 14 05:38:30 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6951 DF PROTO=TCP SPT=46666 DPT=9102 SEQ=668247626 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B08631A0000000001030307) Oct 14 05:38:31 localhost python3.9[218393]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/dm-multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True Oct 14 05:38:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f. Oct 14 05:38:31 localhost systemd[1]: tmp-crun.Md0X9z.mount: Deactivated successfully. Oct 14 05:38:31 localhost podman[218481]: 2025-10-14 09:38:31.978162026 +0000 UTC m=+0.069438276 container health_status 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, config_id=ovn_controller, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS) Oct 14 05:38:32 localhost podman[218481]: 2025-10-14 09:38:32.044254168 +0000 UTC m=+0.135530428 container exec_died 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251009, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible) Oct 14 05:38:32 localhost systemd[1]: 
1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f.service: Deactivated successfully. Oct 14 05:38:32 localhost python3.9[218482]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/dm-multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1760434710.0511444-1258-207415798376785/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=065061c60917e4f67cecc70d12ce55e42f9d0b3f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:38:32 localhost python3.9[218617]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=dm-multipath mode=0644 state=present path=/etc/modules backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:38:33 localhost python3.9[218727]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None Oct 14 05:38:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857. Oct 14 05:38:33 localhost systemd[1]: systemd-modules-load.service: Deactivated successfully. Oct 14 05:38:33 localhost systemd[1]: Stopped Load Kernel Modules. Oct 14 05:38:33 localhost systemd[1]: Stopping Load Kernel Modules... Oct 14 05:38:33 localhost systemd[1]: Starting Load Kernel Modules... Oct 14 05:38:33 localhost systemd-modules-load[218742]: Module 'msr' is built in Oct 14 05:38:33 localhost systemd[1]: Finished Load Kernel Modules. 
Oct 14 05:38:33 localhost podman[218729]: 2025-10-14 09:38:33.974515748 +0000 UTC m=+0.085297478 container health_status 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent) Oct 14 05:38:34 localhost podman[218729]: 2025-10-14 09:38:34.009118998 +0000 UTC 
m=+0.119900717 container exec_died 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image) Oct 14 05:38:34 localhost systemd[1]: 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857.service: Deactivated successfully. 
Oct 14 05:38:34 localhost systemd[1]: virtnodedevd.service: Deactivated successfully. Oct 14 05:38:34 localhost python3.9[218860]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None Oct 14 05:38:35 localhost systemd[1]: virtproxyd.service: Deactivated successfully. Oct 14 05:38:35 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58081 DF PROTO=TCP SPT=46370 DPT=9882 SEQ=1297104196 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B0875DA0000000001030307) Oct 14 05:38:35 localhost python3.9[218971]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Oct 14 05:38:36 localhost python3.9[219081]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Oct 14 05:38:37 localhost python3.9[219191]: ansible-ansible.legacy.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True Oct 14 05:38:37 localhost python3.9[219279]: ansible-ansible.legacy.copy Invoked with dest=/etc/multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1760434716.573032-1433-96864603670557/.source.conf _original_basename=multipath.conf follow=False checksum=bf02ab264d3d648048a81f3bacec8bc58db93162 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None 
directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:38:38 localhost python3.9[219389]: ansible-ansible.legacy.command Invoked with _raw_params=grep -q '^blacklist\s*{' /etc/multipath.conf _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Oct 14 05:38:38 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18833 DF PROTO=TCP SPT=43656 DPT=9105 SEQ=949674638 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B0882FF0000000001030307) Oct 14 05:38:39 localhost python3.9[219500]: ansible-ansible.builtin.lineinfile Invoked with line=blacklist { path=/etc/multipath.conf state=present backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:38:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18834 DF PROTO=TCP SPT=43656 DPT=9105 SEQ=949674638 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B08871A0000000001030307) Oct 14 05:38:40 localhost python3.9[219610]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^(blacklist {) replace=\1\n} backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:38:40 localhost systemd-journald[47488]: Field hash table of /run/log/journal/8e1d5208cffec42b50976967e1d1cfd0/system.journal has a fill 
level at 75.4 (251 of 333 items), suggesting rotation. Oct 14 05:38:40 localhost systemd-journald[47488]: /run/log/journal/8e1d5208cffec42b50976967e1d1cfd0/system.journal: Journal header limits reached or header out-of-date, rotating. Oct 14 05:38:40 localhost rsyslogd[759]: imjournal: journal files changed, reloading... [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ] Oct 14 05:38:40 localhost rsyslogd[759]: imjournal: journal files changed, reloading... [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ] Oct 14 05:38:40 localhost python3.9[219721]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^blacklist\s*{\n[\s]+devnode \"\.\*\" replace=blacklist { backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:38:41 localhost python3.9[219831]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line= find_multipaths yes path=/etc/multipath.conf regexp=^\s+find_multipaths state=present backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:38:41 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18835 DF PROTO=TCP SPT=43656 DPT=9105 SEQ=949674638 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B088F1B0000000001030307) Oct 14 05:38:42 localhost python3.9[219941]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line= recheck_wwid yes path=/etc/multipath.conf regexp=^\s+recheck_wwid state=present backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None 
group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:38:42 localhost python3.9[220051]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line= skip_kpartx yes path=/etc/multipath.conf regexp=^\s+skip_kpartx state=present backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:38:43 localhost python3.9[220161]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line= user_friendly_names no path=/etc/multipath.conf regexp=^\s+user_friendly_names state=present backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:38:44 localhost python3.9[220271]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Oct 14 05:38:45 localhost python3.9[220383]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/multipath/.multipath_restart_required state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:38:45 localhost systemd[1]: virtsecretd.service: Deactivated successfully. 
Oct 14 05:38:45 localhost python3.9[220494]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None Oct 14 05:38:45 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18836 DF PROTO=TCP SPT=43656 DPT=9105 SEQ=949674638 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B089EDA0000000001030307) Oct 14 05:38:46 localhost python3.9[220604]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True Oct 14 05:38:47 localhost python3.9[220661]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Oct 14 05:38:47 localhost python3.9[220771]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True Oct 14 05:38:48 localhost python3.9[220828]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t 
dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Oct 14 05:38:48 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42069 DF PROTO=TCP SPT=50734 DPT=9101 SEQ=1107452387 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B08A79A0000000001030307) Oct 14 05:38:48 localhost python3.9[220938]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:38:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29. 
Oct 14 05:38:49 localhost podman[221049]: 2025-10-14 09:38:49.518340111 +0000 UTC m=+0.079757932 container health_status fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=starting, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image) Oct 14 05:38:49 localhost podman[221049]: 2025-10-14 09:38:49.525721183 +0000 UTC m=+0.087139024 container exec_died fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 
(image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=iscsid, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, org.label-schema.license=GPLv2) Oct 14 05:38:49 localhost systemd[1]: fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29.service: Deactivated successfully. 
Oct 14 05:38:49 localhost python3.9[221048]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True Oct 14 05:38:51 localhost python3.9[221122]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:38:51 localhost python3.9[221232]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True Oct 14 05:38:52 localhost python3.9[221289]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:38:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55295 DF PROTO=TCP SPT=58360 DPT=9102 SEQ=1369929549 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B08BC840000000001030307) Oct 14 05:38:53 localhost 
python3.9[221399]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None Oct 14 05:38:53 localhost systemd[1]: Reloading. Oct 14 05:38:53 localhost systemd-rc-local-generator[221426]: /etc/rc.d/rc.local is not marked executable, skipping. Oct 14 05:38:53 localhost systemd-sysv-generator[221430]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Oct 14 05:38:53 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Oct 14 05:38:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55296 DF PROTO=TCP SPT=58360 DPT=9102 SEQ=1369929549 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B08C09A0000000001030307) Oct 14 05:38:54 localhost python3.9[221546]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True Oct 14 05:38:55 localhost python3.9[221603]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:38:55 
localhost python3.9[221713]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True Oct 14 05:38:56 localhost python3.9[221770]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:38:57 localhost python3.9[221880]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None Oct 14 05:38:57 localhost systemd[1]: Reloading. Oct 14 05:38:57 localhost systemd-rc-local-generator[221904]: /etc/rc.d/rc.local is not marked executable, skipping. Oct 14 05:38:57 localhost systemd-sysv-generator[221911]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Oct 14 05:38:57 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Oct 14 05:38:57 localhost systemd[1]: Starting Create netns directory... Oct 14 05:38:57 localhost systemd[1]: run-netns-placeholder.mount: Deactivated successfully. Oct 14 05:38:57 localhost systemd[1]: netns-placeholder.service: Deactivated successfully. 
Oct 14 05:38:57 localhost systemd[1]: Finished Create netns directory. Oct 14 05:38:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:38:57.744 163055 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Oct 14 05:38:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:38:57.746 163055 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Oct 14 05:38:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:38:57.747 163055 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Oct 14 05:38:58 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40455 DF PROTO=TCP SPT=37574 DPT=9882 SEQ=1877320727 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B08CF4C0000000001030307) Oct 14 05:38:59 localhost python3.9[222032]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Oct 14 05:39:00 localhost python3.9[222142]: ansible-ansible.legacy.stat Invoked with 
path=/var/lib/openstack/healthchecks/multipathd/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True Oct 14 05:39:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55298 DF PROTO=TCP SPT=58360 DPT=9102 SEQ=1369929549 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B08D85A0000000001030307) Oct 14 05:39:00 localhost python3.9[222230]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/multipathd/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1760434739.8011487-2054-98734362484216/.source _original_basename=healthcheck follow=False checksum=af9d0c1c8f3cb0e30ce9609be9d5b01924d0d23f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None Oct 14 05:39:01 localhost python3.9[222340]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None Oct 14 05:39:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f. Oct 14 05:39:02 localhost systemd[1]: tmp-crun.LoZuFo.mount: Deactivated successfully. 
Oct 14 05:39:02 localhost podman[222451]: 2025-10-14 09:39:02.609537166 +0000 UTC m=+0.100657220 container health_status 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, io.buildah.version=1.41.3, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251009, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS) Oct 14 05:39:02 localhost podman[222451]: 2025-10-14 09:39:02.657142605 +0000 UTC m=+0.148262659 container exec_died 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 
'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5) Oct 14 05:39:02 localhost systemd[1]: 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f.service: Deactivated successfully. 
Oct 14 05:39:02 localhost python3.9[222450]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/multipathd.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True Oct 14 05:39:03 localhost python3.9[222564]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/multipathd.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1760434742.2295647-2128-198327978512477/.source.json _original_basename=.dg5df7ie follow=False checksum=3f7959ee8ac9757398adcc451c3b416c957d7c14 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:39:04 localhost python3.9[222674]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/multipathd state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:39:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857. Oct 14 05:39:04 localhost systemd[1]: tmp-crun.E0OVYr.mount: Deactivated successfully. 
Oct 14 05:39:04 localhost podman[222785]: 2025-10-14 09:39:04.699244947 +0000 UTC m=+0.083803200 container health_status 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent) Oct 14 05:39:04 localhost podman[222785]: 2025-10-14 09:39:04.734017691 +0000 UTC 
m=+0.118575944 container exec_died 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible) Oct 14 05:39:04 localhost systemd[1]: 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857.service: Deactivated successfully. 
Oct 14 05:39:05 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40458 DF PROTO=TCP SPT=37574 DPT=9882 SEQ=1877320727 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B08EB1A0000000001030307) Oct 14 05:39:07 localhost python3.9[223000]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/multipathd config_pattern=*.json debug=False Oct 14 05:39:08 localhost python3.9[223177]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data Oct 14 05:39:08 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28995 DF PROTO=TCP SPT=33298 DPT=9105 SEQ=1349144027 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B08F82E0000000001030307) Oct 14 05:39:09 localhost python3.9[223287]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None Oct 14 05:39:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28996 DF PROTO=TCP SPT=33298 DPT=9105 SEQ=1349144027 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B08FC1B0000000001030307) Oct 14 05:39:11 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28997 DF PROTO=TCP SPT=33298 DPT=9105 SEQ=1349144027 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B09041A0000000001030307) Oct 14 05:39:13 localhost python3[223442]: ansible-edpm_container_manage Invoked with concurrency=1 
config_dir=/var/lib/edpm-config/container-startup-config/multipathd config_id=multipathd config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False Oct 14 05:39:14 localhost podman[223456]: 2025-10-14 09:39:13.371470457 +0000 UTC m=+0.027379284 image pull quay.io/podified-antelope-centos9/openstack-multipathd:current-podified Oct 14 05:39:15 localhost podman[223503]: Oct 14 05:39:15 localhost podman[223503]: 2025-10-14 09:39:15.104882211 +0000 UTC m=+0.073253807 container create 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, managed_by=edpm_ansible, tcib_managed=true, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base 
Image, org.label-schema.license=GPLv2, config_id=multipathd, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0) Oct 14 05:39:15 localhost podman[223503]: 2025-10-14 09:39:15.076324602 +0000 UTC m=+0.044696218 image pull quay.io/podified-antelope-centos9/openstack-multipathd:current-podified Oct 14 05:39:15 localhost python3[223442]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name multipathd --conmon-pidfile /run/multipathd.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --healthcheck-command /openstack/healthcheck --label config_id=multipathd --label container_name=multipathd --label managed_by=edpm_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --volume /etc/hosts:/etc/hosts:ro 
--volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro --volume /dev:/dev --volume /run/udev:/run/udev --volume /sys:/sys --volume /lib/modules:/lib/modules:ro --volume /etc/iscsi:/etc/iscsi:ro --volume /var/lib/iscsi:/var/lib/iscsi:z --volume /etc/multipath:/etc/multipath:z --volume /etc/multipath.conf:/etc/multipath.conf:ro --volume /var/lib/openstack/healthchecks/multipathd:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-multipathd:current-podified Oct 14 05:39:15 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28998 DF PROTO=TCP SPT=33298 DPT=9105 SEQ=1349144027 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B0913DA0000000001030307) Oct 14 05:39:16 localhost python3.9[223651]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Oct 14 05:39:17 localhost python3.9[223763]: ansible-file Invoked with path=/etc/systemd/system/edpm_multipathd.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:39:18 localhost kernel: DROPPING: IN=br-ex 
OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2624 DF PROTO=TCP SPT=38902 DPT=9101 SEQ=3693500446 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B091CDB0000000001030307) Oct 14 05:39:18 localhost python3.9[223818]: ansible-stat Invoked with path=/etc/systemd/system/edpm_multipathd_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Oct 14 05:39:19 localhost python3.9[223927]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1760434758.5456266-2392-244333312380091/source dest=/etc/systemd/system/edpm_multipathd.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:39:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29. 
Oct 14 05:39:19 localhost podman[223928]: 2025-10-14 09:39:19.733115603 +0000 UTC m=+0.074009706 container health_status fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, org.label-schema.license=GPLv2, container_name=iscsid, io.buildah.version=1.41.3, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true, org.label-schema.vendor=CentOS, config_id=iscsid, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.build-date=20251009, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0) Oct 14 05:39:19 localhost podman[223928]: 2025-10-14 09:39:19.74610865 +0000 UTC m=+0.087002763 container exec_died fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 
(image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, tcib_managed=true, container_name=iscsid, config_id=iscsid, managed_by=edpm_ansible, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3) Oct 14 05:39:19 localhost systemd[1]: fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29.service: Deactivated successfully. Oct 14 05:39:20 localhost python3.9[224000]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None Oct 14 05:39:20 localhost systemd[1]: Reloading. 
Oct 14 05:39:20 localhost systemd-sysv-generator[224026]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Oct 14 05:39:20 localhost systemd-rc-local-generator[224023]: /etc/rc.d/rc.local is not marked executable, skipping. Oct 14 05:39:20 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Oct 14 05:39:21 localhost python3.9[224091]: ansible-systemd Invoked with state=restarted name=edpm_multipathd.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Oct 14 05:39:21 localhost systemd[1]: Reloading. Oct 14 05:39:21 localhost systemd-sysv-generator[224119]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Oct 14 05:39:21 localhost systemd-rc-local-generator[224115]: /etc/rc.d/rc.local is not marked executable, skipping. Oct 14 05:39:21 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Oct 14 05:39:21 localhost systemd[1]: Starting multipathd container... Oct 14 05:39:21 localhost systemd[1]: Started libcrun container. 
Oct 14 05:39:21 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/76c6be5611369003add7cfad6d47367448cfde14b22e1e37ab3b6a1f703828b2/merged/etc/multipath supports timestamps until 2038 (0x7fffffff) Oct 14 05:39:21 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/76c6be5611369003add7cfad6d47367448cfde14b22e1e37ab3b6a1f703828b2/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff) Oct 14 05:39:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad. Oct 14 05:39:21 localhost podman[224131]: 2025-10-14 09:39:21.976419402 +0000 UTC m=+0.112400437 container init 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_id=multipathd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team) Oct 14 05:39:21 localhost multipathd[224146]: + sudo -E kolla_set_configs Oct 14 05:39:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad. Oct 14 05:39:22 localhost podman[224131]: 2025-10-14 09:39:22.00752328 +0000 UTC m=+0.143504325 container start 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team) Oct 14 05:39:22 localhost podman[224131]: multipathd Oct 14 05:39:22 localhost systemd[1]: Started multipathd container. Oct 14 05:39:22 localhost multipathd[224146]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json Oct 14 05:39:22 localhost multipathd[224146]: INFO:__main__:Validating config file Oct 14 05:39:22 localhost multipathd[224146]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS Oct 14 05:39:22 localhost multipathd[224146]: INFO:__main__:Writing out command to execute Oct 14 05:39:22 localhost multipathd[224146]: ++ cat /run_command Oct 14 05:39:22 localhost multipathd[224146]: + CMD='/usr/sbin/multipathd -d' Oct 14 05:39:22 localhost multipathd[224146]: + ARGS= Oct 14 05:39:22 localhost multipathd[224146]: + sudo kolla_copy_cacerts Oct 14 05:39:22 localhost podman[224155]: 2025-10-14 09:39:22.064289671 +0000 UTC m=+0.055577384 container health_status 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=starting, tcib_managed=true, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251009, config_id=multipathd, 
config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}) Oct 14 05:39:22 localhost multipathd[224146]: + [[ ! -n '' ]] Oct 14 05:39:22 localhost multipathd[224146]: + . 
kolla_extend_start Oct 14 05:39:22 localhost multipathd[224146]: Running command: '/usr/sbin/multipathd -d' Oct 14 05:39:22 localhost multipathd[224146]: + echo 'Running command: '\''/usr/sbin/multipathd -d'\''' Oct 14 05:39:22 localhost multipathd[224146]: + umask 0022 Oct 14 05:39:22 localhost multipathd[224146]: + exec /usr/sbin/multipathd -d Oct 14 05:39:22 localhost multipathd[224146]: 10778.266983 | --------start up-------- Oct 14 05:39:22 localhost multipathd[224146]: 10778.267003 | read /etc/multipath.conf Oct 14 05:39:22 localhost multipathd[224146]: 10778.269790 | path checkers start up Oct 14 05:39:22 localhost podman[224155]: 2025-10-14 09:39:22.081963649 +0000 UTC m=+0.073251382 container exec_died 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251009, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', 
'/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS) Oct 14 05:39:22 localhost podman[224155]: unhealthy Oct 14 05:39:22 localhost systemd[1]: 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad.service: Main process exited, code=exited, status=1/FAILURE Oct 14 05:39:22 localhost systemd[1]: 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad.service: Failed with result 'exit-code'. Oct 14 05:39:22 localhost python3.9[224293]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath/.multipath_restart_required follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Oct 14 05:39:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32504 DF PROTO=TCP SPT=33546 DPT=9102 SEQ=1386250759 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B0931B50000000001030307) Oct 14 05:39:23 localhost python3.9[224405]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps --filter volume=/etc/multipath.conf --format {{.Names}} _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Oct 14 05:39:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32505 DF PROTO=TCP SPT=33546 DPT=9102 SEQ=1386250759 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT 
(020405500402080A2B0935DA0000000001030307) Oct 14 05:39:24 localhost python3.9[224528]: ansible-ansible.builtin.systemd Invoked with name=edpm_multipathd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None Oct 14 05:39:24 localhost systemd[1]: Stopping multipathd container... Oct 14 05:39:24 localhost systemd[1]: tmp-crun.7S2Tu4.mount: Deactivated successfully. Oct 14 05:39:24 localhost multipathd[224146]: 10781.053899 | exit (signal) Oct 14 05:39:24 localhost multipathd[224146]: 10781.053956 | --------shut down------- Oct 14 05:39:24 localhost systemd[1]: libpod-02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad.scope: Deactivated successfully. Oct 14 05:39:24 localhost podman[224532]: 2025-10-14 09:39:24.892511028 +0000 UTC m=+0.096316943 container died 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, managed_by=edpm_ansible, org.label-schema.build-date=20251009, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=multipathd) Oct 14 05:39:24 localhost systemd[1]: 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad.timer: Deactivated successfully. Oct 14 05:39:24 localhost systemd[1]: Stopped /usr/bin/podman healthcheck run 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad. Oct 14 05:39:24 localhost systemd[1]: tmp-crun.2hSBoB.mount: Deactivated successfully. Oct 14 05:39:24 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad-userdata-shm.mount: Deactivated successfully. 
Oct 14 05:39:25 localhost podman[224532]: 2025-10-14 09:39:25.037550492 +0000 UTC m=+0.241356376 container cleanup 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_id=multipathd) Oct 14 05:39:25 localhost podman[224532]: multipathd Oct 14 05:39:25 localhost podman[224558]: 2025-10-14 09:39:25.11850774 +0000 UTC m=+0.050160641 container cleanup 
02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251009, tcib_managed=true, config_id=multipathd, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}) Oct 14 05:39:25 localhost podman[224558]: multipathd Oct 14 05:39:25 localhost systemd[1]: edpm_multipathd.service: Deactivated successfully. Oct 14 05:39:25 localhost systemd[1]: Stopped multipathd container. Oct 14 05:39:25 localhost systemd[1]: Starting multipathd container... 
Oct 14 05:39:25 localhost systemd[1]: Started libcrun container. Oct 14 05:39:25 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/76c6be5611369003add7cfad6d47367448cfde14b22e1e37ab3b6a1f703828b2/merged/etc/multipath supports timestamps until 2038 (0x7fffffff) Oct 14 05:39:25 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/76c6be5611369003add7cfad6d47367448cfde14b22e1e37ab3b6a1f703828b2/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff) Oct 14 05:39:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad. Oct 14 05:39:25 localhost podman[224570]: 2025-10-14 09:39:25.287053698 +0000 UTC m=+0.140625023 container init 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible) Oct 14 05:39:25 localhost multipathd[224585]: + sudo -E kolla_set_configs Oct 14 05:39:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad. Oct 14 05:39:25 localhost podman[224570]: 2025-10-14 09:39:25.324459158 +0000 UTC m=+0.178030503 container start 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, container_name=multipathd, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0) Oct 14 05:39:25 localhost podman[224570]: multipathd Oct 14 05:39:25 localhost systemd[1]: Started multipathd container. Oct 14 05:39:25 localhost multipathd[224585]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json Oct 14 05:39:25 localhost multipathd[224585]: INFO:__main__:Validating config file Oct 14 05:39:25 localhost multipathd[224585]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS Oct 14 05:39:25 localhost multipathd[224585]: INFO:__main__:Writing out command to execute Oct 14 05:39:25 localhost multipathd[224585]: ++ cat /run_command Oct 14 05:39:25 localhost multipathd[224585]: + CMD='/usr/sbin/multipathd -d' Oct 14 05:39:25 localhost multipathd[224585]: + ARGS= Oct 14 05:39:25 localhost multipathd[224585]: + sudo kolla_copy_cacerts Oct 14 05:39:25 localhost multipathd[224585]: + [[ ! -n '' ]] Oct 14 05:39:25 localhost multipathd[224585]: + . 
kolla_extend_start Oct 14 05:39:25 localhost multipathd[224585]: + echo 'Running command: '\''/usr/sbin/multipathd -d'\''' Oct 14 05:39:25 localhost multipathd[224585]: Running command: '/usr/sbin/multipathd -d' Oct 14 05:39:25 localhost multipathd[224585]: + umask 0022 Oct 14 05:39:25 localhost multipathd[224585]: + exec /usr/sbin/multipathd -d Oct 14 05:39:25 localhost podman[224593]: 2025-10-14 09:39:25.420962375 +0000 UTC m=+0.090013510 container health_status 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=starting, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', 
'/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2) Oct 14 05:39:25 localhost multipathd[224585]: 10781.615865 | --------start up-------- Oct 14 05:39:25 localhost multipathd[224585]: 10781.615886 | read /etc/multipath.conf Oct 14 05:39:25 localhost multipathd[224585]: 10781.620087 | path checkers start up Oct 14 05:39:25 localhost podman[224593]: 2025-10-14 09:39:25.439914952 +0000 UTC m=+0.108966077 container exec_died 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', 
'/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251009, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, org.label-schema.vendor=CentOS, container_name=multipathd) Oct 14 05:39:25 localhost systemd[1]: 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad.service: Deactivated successfully. Oct 14 05:39:26 localhost python3.9[224734]: ansible-ansible.builtin.file Invoked with path=/etc/multipath/.multipath_restart_required state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:39:27 localhost python3.9[224844]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None Oct 14 05:39:27 localhost python3.9[224954]: ansible-community.general.modprobe Invoked with name=nvme-fabrics state=present params= persistent=disabled Oct 14 05:39:28 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61735 DF PROTO=TCP SPT=36976 DPT=9882 SEQ=2206311522 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B09447A0000000001030307) Oct 14 05:39:28 localhost python3.9[225072]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/nvme-fabrics.conf follow=False 
get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True Oct 14 05:39:29 localhost python3.9[225160]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/nvme-fabrics.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1760434768.1149218-2632-10977527510052/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=783c778f0c68cc414f35486f234cbb1cf3f9bbff backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:39:30 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32507 DF PROTO=TCP SPT=33546 DPT=9102 SEQ=1386250759 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B094D9A0000000001030307) Oct 14 05:39:30 localhost python3.9[225270]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=nvme-fabrics mode=0644 state=present path=/etc/modules backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:39:31 localhost python3.9[225380]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None Oct 14 05:39:31 localhost systemd[1]: systemd-modules-load.service: Deactivated successfully. Oct 14 05:39:31 localhost systemd[1]: Stopped Load Kernel Modules. Oct 14 05:39:31 localhost systemd[1]: Stopping Load Kernel Modules... Oct 14 05:39:31 localhost systemd[1]: Starting Load Kernel Modules... 
Oct 14 05:39:31 localhost systemd-modules-load[225384]: Module 'msr' is built in Oct 14 05:39:31 localhost systemd[1]: Finished Load Kernel Modules. Oct 14 05:39:32 localhost python3.9[225494]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d Oct 14 05:39:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f. Oct 14 05:39:33 localhost podman[225555]: 2025-10-14 09:39:33.771307486 +0000 UTC m=+0.099666059 container health_status 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team) Oct 14 05:39:33 localhost podman[225555]: 
2025-10-14 09:39:33.811280499 +0000 UTC m=+0.139639042 container exec_died 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, container_name=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, managed_by=edpm_ansible, io.buildah.version=1.41.3, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team) Oct 14 05:39:33 localhost systemd[1]: 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f.service: Deactivated successfully. 
Oct 14 05:39:33 localhost python3.9[225567]: ansible-ansible.legacy.dnf Invoked with name=['nvme-cli'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None Oct 14 05:39:35 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61738 DF PROTO=TCP SPT=36976 DPT=9882 SEQ=2206311522 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B09605A0000000001030307) Oct 14 05:39:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857. 
Oct 14 05:39:35 localhost podman[225584]: 2025-10-14 09:39:35.744002052 +0000 UTC m=+0.083057465 container health_status 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.build-date=20251009, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image) Oct 14 05:39:35 localhost podman[225584]: 2025-10-14 09:39:35.754081126 +0000 UTC 
m=+0.093136539 container exec_died 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent) Oct 14 05:39:35 localhost systemd[1]: 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857.service: Deactivated successfully. 
Oct 14 05:39:38 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21197 DF PROTO=TCP SPT=39964 DPT=9105 SEQ=385786622 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B096D5F0000000001030307) Oct 14 05:39:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21198 DF PROTO=TCP SPT=39964 DPT=9105 SEQ=385786622 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B09715A0000000001030307) Oct 14 05:39:41 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21199 DF PROTO=TCP SPT=39964 DPT=9105 SEQ=385786622 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B09795A0000000001030307) Oct 14 05:39:41 localhost systemd[1]: Reloading. Oct 14 05:39:41 localhost systemd-sysv-generator[225634]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Oct 14 05:39:41 localhost systemd-rc-local-generator[225631]: /etc/rc.d/rc.local is not marked executable, skipping. Oct 14 05:39:42 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Oct 14 05:39:42 localhost systemd[1]: Reloading. Oct 14 05:39:42 localhost systemd-rc-local-generator[225667]: /etc/rc.d/rc.local is not marked executable, skipping. Oct 14 05:39:42 localhost systemd-sysv-generator[225674]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. 
Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Oct 14 05:39:42 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Oct 14 05:39:42 localhost systemd-logind[760]: Watching system buttons on /dev/input/event0 (Power Button) Oct 14 05:39:42 localhost systemd-logind[760]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard) Oct 14 05:39:42 localhost lvm[225719]: PV /dev/loop4 online, VG ceph_vg1 is complete. Oct 14 05:39:42 localhost lvm[225719]: VG ceph_vg1 finished Oct 14 05:39:42 localhost lvm[225718]: PV /dev/loop3 online, VG ceph_vg0 is complete. Oct 14 05:39:42 localhost lvm[225718]: VG ceph_vg0 finished Oct 14 05:39:42 localhost systemd[1]: Started /usr/bin/systemctl start man-db-cache-update. Oct 14 05:39:42 localhost systemd[1]: Starting man-db-cache-update.service... Oct 14 05:39:42 localhost systemd[1]: Reloading. Oct 14 05:39:42 localhost systemd-rc-local-generator[225767]: /etc/rc.d/rc.local is not marked executable, skipping. Oct 14 05:39:42 localhost systemd-sysv-generator[225770]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Oct 14 05:39:42 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Oct 14 05:39:43 localhost systemd[1]: Queuing reload/restart jobs for marked units… Oct 14 05:39:43 localhost systemd[1]: man-db-cache-update.service: Deactivated successfully. Oct 14 05:39:43 localhost systemd[1]: Finished man-db-cache-update.service. 
Oct 14 05:39:43 localhost systemd[1]: man-db-cache-update.service: Consumed 1.242s CPU time. Oct 14 05:39:43 localhost systemd[1]: run-r6fb7a78bbe69440187e5743644abeebb.service: Deactivated successfully. Oct 14 05:39:45 localhost python3.9[227015]: ansible-ansible.builtin.file Invoked with mode=0600 path=/etc/iscsi/.iscsid_restart_required state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:39:45 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21200 DF PROTO=TCP SPT=39964 DPT=9105 SEQ=385786622 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B09891A0000000001030307) Oct 14 05:39:46 localhost python3.9[227123]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Oct 14 05:39:47 localhost python3.9[227237]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/ssh/ssh_known_hosts state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:39:48 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38579 DF PROTO=TCP SPT=42460 DPT=9101 SEQ=2623369605 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B0991DA0000000001030307) Oct 14 05:39:49 localhost 
python3.9[227347]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None Oct 14 05:39:49 localhost systemd[1]: Reloading. Oct 14 05:39:49 localhost systemd-sysv-generator[227377]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Oct 14 05:39:49 localhost systemd-rc-local-generator[227374]: /etc/rc.d/rc.local is not marked executable, skipping. Oct 14 05:39:49 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Oct 14 05:39:50 localhost python3.9[227490]: ansible-ansible.builtin.service_facts Invoked Oct 14 05:39:50 localhost network[227507]: You are using 'network' service provided by 'network-scripts', which are now deprecated. Oct 14 05:39:50 localhost network[227508]: 'network-scripts' will be removed from distribution in near future. Oct 14 05:39:50 localhost network[227509]: It is advised to switch to 'NetworkManager' instead for network management. Oct 14 05:39:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29. 
Oct 14 05:39:50 localhost podman[227515]: 2025-10-14 09:39:50.672015928 +0000 UTC m=+0.083957065 container health_status fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, container_name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, org.label-schema.build-date=20251009, tcib_managed=true, managed_by=edpm_ansible, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team) Oct 14 05:39:50 localhost podman[227515]: 2025-10-14 09:39:50.684035244 +0000 UTC m=+0.095976381 container exec_died fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 
(image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=iscsid, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, container_name=iscsid) Oct 14 05:39:51 localhost systemd[1]: fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29.service: Deactivated successfully. Oct 14 05:39:51 localhost systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Oct 14 05:39:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37887 DF PROTO=TCP SPT=59474 DPT=9102 SEQ=2985983419 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B09A6E50000000001030307) Oct 14 05:39:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37888 DF PROTO=TCP SPT=59474 DPT=9102 SEQ=2985983419 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B09AADA0000000001030307) Oct 14 05:39:55 localhost python3.9[227762]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Oct 14 05:39:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad. 
Oct 14 05:39:55 localhost podman[227764]: 2025-10-14 09:39:55.649342562 +0000 UTC m=+0.059719827 container health_status 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, tcib_managed=true, config_id=multipathd, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, container_name=multipathd, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team) Oct 14 05:39:55 localhost podman[227764]: 2025-10-14 09:39:55.65987741 +0000 UTC m=+0.070254665 container exec_died 
02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_id=multipathd) Oct 14 05:39:55 localhost systemd[1]: 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad.service: Deactivated successfully. 
Oct 14 05:39:56 localhost python3.9[227892]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_migration_target.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Oct 14 05:39:57 localhost python3.9[228003]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api_cron.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Oct 14 05:39:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:39:57.744 163055 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Oct 14 05:39:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:39:57.745 163055 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Oct 14 05:39:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:39:57.746 163055 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Oct 14 05:39:58 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=50606 DF PROTO=TCP SPT=38860 DPT=9882 SEQ=3394693823 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B09B9AB0000000001030307) Oct 14 05:39:58 localhost python3.9[228114]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api.service 
state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Oct 14 05:39:59 localhost python3.9[228225]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_conductor.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Oct 14 05:40:00 localhost python3.9[228336]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_metadata.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Oct 14 05:40:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37890 DF PROTO=TCP SPT=59474 DPT=9102 SEQ=2985983419 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B09C29A0000000001030307) Oct 14 05:40:00 localhost python3.9[228447]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_scheduler.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Oct 14 05:40:01 localhost python3.9[228558]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_vnc_proxy.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Oct 14 05:40:03 localhost sshd[228577]: main: sshd: ssh-rsa algorithm is disabled Oct 14 05:40:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f. Oct 14 05:40:04 localhost systemd[1]: tmp-crun.7Nr2S5.mount: Deactivated successfully. 
Oct 14 05:40:04 localhost podman[228579]: 2025-10-14 09:40:04.461665847 +0000 UTC m=+0.091245169 container health_status 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0) Oct 14 05:40:04 localhost podman[228579]: 2025-10-14 09:40:04.503084305 +0000 UTC m=+0.132663677 container exec_died 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_id=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': 
'/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3) Oct 14 05:40:04 localhost systemd[1]: 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f.service: Deactivated successfully. 
Oct 14 05:40:05 localhost python3.9[228697]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:40:05 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=50609 DF PROTO=TCP SPT=38860 DPT=9882 SEQ=3394693823 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B09D55A0000000001030307) Oct 14 05:40:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857. Oct 14 05:40:05 localhost podman[228808]: 2025-10-14 09:40:05.911418384 +0000 UTC m=+0.096351223 container health_status 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.license=GPLv2, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 
'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible) Oct 14 05:40:05 localhost podman[228808]: 2025-10-14 09:40:05.916495867 +0000 UTC m=+0.101428606 container exec_died 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251009, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 
'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5) Oct 14 05:40:05 localhost systemd[1]: 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857.service: Deactivated successfully. Oct 14 05:40:06 localhost python3.9[228807]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:40:06 localhost python3.9[228935]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:40:07 localhost python3.9[229045]: ansible-ansible.builtin.file Invoked with 
path=/usr/lib/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:40:07 localhost python3.9[229155]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:40:08 localhost python3.9[229265]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:40:08 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22407 DF PROTO=TCP SPT=42756 DPT=9105 SEQ=3949573316 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B09E28E0000000001030307) Oct 14 05:40:09 localhost python3.9[229375]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None 
src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:40:09 localhost python3.9[229519]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:40:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22408 DF PROTO=TCP SPT=42756 DPT=9105 SEQ=3949573316 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B09E69A0000000001030307) Oct 14 05:40:11 localhost python3.9[229680]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:40:11 localhost python3.9[229790]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:40:11 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b 
MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22409 DF PROTO=TCP SPT=42756 DPT=9105 SEQ=3949573316 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B09EE9A0000000001030307) Oct 14 05:40:12 localhost python3.9[229900]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:40:13 localhost python3.9[230010]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:40:14 localhost python3.9[230120]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:40:14 localhost python3.9[230230]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None 
modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:40:15 localhost python3.9[230340]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:40:15 localhost python3.9[230450]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:40:15 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22410 DF PROTO=TCP SPT=42756 DPT=9105 SEQ=3949573316 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B09FE5A0000000001030307) Oct 14 05:40:17 localhost python3.9[230560]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then#012 systemctl disable --now certmonger.service#012 test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service#012fi#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Oct 14 05:40:17 localhost python3.9[230670]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True 
paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None Oct 14 05:40:18 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32972 DF PROTO=TCP SPT=54686 DPT=9101 SEQ=3788472641 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B0A071A0000000001030307) Oct 14 05:40:18 localhost python3.9[230780]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None Oct 14 05:40:18 localhost systemd[1]: Reloading. Oct 14 05:40:19 localhost systemd-rc-local-generator[230801]: /etc/rc.d/rc.local is not marked executable, skipping. Oct 14 05:40:19 localhost systemd-sysv-generator[230804]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Oct 14 05:40:19 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Oct 14 05:40:19 localhost python3.9[230925]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Oct 14 05:40:20 localhost python3.9[231036]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_migration_target.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Oct 14 05:40:21 localhost python3.9[231147]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api_cron.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Oct 14 05:40:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29. 
Oct 14 05:40:21 localhost podman[231258]: 2025-10-14 09:40:21.713712175 +0000 UTC m=+0.085045743 container health_status fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.build-date=20251009, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0) Oct 14 05:40:21 localhost podman[231258]: 2025-10-14 09:40:21.724121288 +0000 UTC m=+0.095454846 container exec_died fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 
(image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, tcib_managed=true, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, container_name=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251009) Oct 14 05:40:21 localhost systemd[1]: fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29.service: Deactivated successfully. 
Oct 14 05:40:21 localhost python3.9[231259]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Oct 14 05:40:22 localhost python3.9[231389]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_conductor.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Oct 14 05:40:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17212 DF PROTO=TCP SPT=42272 DPT=9102 SEQ=62740474 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B0A1C150000000001030307) Oct 14 05:40:24 localhost python3.9[231500]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_metadata.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Oct 14 05:40:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17213 DF PROTO=TCP SPT=42272 DPT=9102 SEQ=62740474 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B0A201A0000000001030307) Oct 14 05:40:24 localhost python3.9[231611]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_scheduler.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None 
stdin=None Oct 14 05:40:25 localhost python3.9[231722]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_vnc_proxy.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Oct 14 05:40:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad. Oct 14 05:40:26 localhost podman[231728]: 2025-10-14 09:40:26.563126633 +0000 UTC m=+0.063512841 container health_status 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd) Oct 14 05:40:26 localhost podman[231728]: 2025-10-14 09:40:26.572939048 +0000 UTC m=+0.073325266 container exec_died 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', 
'/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.vendor=CentOS, config_id=multipathd, org.label-schema.build-date=20251009, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible) Oct 14 05:40:26 localhost systemd[1]: 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad.service: Deactivated successfully. Oct 14 05:40:28 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5490 DF PROTO=TCP SPT=41200 DPT=9882 SEQ=4029959749 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B0A2EDB0000000001030307) Oct 14 05:40:28 localhost python3.9[231853]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Oct 14 05:40:29 localhost python3.9[231963]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/containers setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Oct 14 05:40:30 localhost python3.9[232073]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova_nvme_cleaner setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S 
access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Oct 14 05:40:30 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17215 DF PROTO=TCP SPT=42272 DPT=9102 SEQ=62740474 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B0A37DA0000000001030307) Oct 14 05:40:30 localhost python3.9[232183]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Oct 14 05:40:31 localhost python3.9[232293]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/_nova_secontext setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Oct 14 05:40:31 localhost python3.9[232403]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova/instances setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Oct 14 05:40:32 localhost python3.9[232513]: ansible-ansible.builtin.file Invoked with group=root 
mode=0750 owner=root path=/etc/ceph setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Oct 14 05:40:32 localhost python3.9[232623]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None Oct 14 05:40:33 localhost python3.9[232733]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/iscsi setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None Oct 14 05:40:34 localhost python3.9[232843]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/var/lib/iscsi setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None Oct 14 05:40:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f. 
Oct 14 05:40:34 localhost python3.9[232953]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/nvme setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None Oct 14 05:40:34 localhost systemd[1]: tmp-crun.svD47M.mount: Deactivated successfully. Oct 14 05:40:34 localhost podman[232954]: 2025-10-14 09:40:34.738084334 +0000 UTC m=+0.078704987 container health_status 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251009, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, 
tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5) Oct 14 05:40:34 localhost podman[232954]: 2025-10-14 09:40:34.825920685 +0000 UTC m=+0.166541308 container exec_died 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true) Oct 14 05:40:34 localhost systemd[1]: 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f.service: Deactivated successfully. 
Oct 14 05:40:35 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5493 DF PROTO=TCP SPT=41200 DPT=9882 SEQ=4029959749 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B0A4A9A0000000001030307) Oct 14 05:40:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857. Oct 14 05:40:36 localhost systemd[1]: tmp-crun.KsWZtl.mount: Deactivated successfully. Oct 14 05:40:36 localhost podman[233088]: 2025-10-14 09:40:36.183977525 +0000 UTC m=+0.085968185 container health_status 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2) Oct 14 05:40:36 localhost podman[233088]: 2025-10-14 09:40:36.189051558 +0000 UTC m=+0.091042208 container exec_died 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_metadata_agent, io.buildah.version=1.41.3) Oct 14 05:40:36 localhost systemd[1]: 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857.service: Deactivated successfully. Oct 14 05:40:36 localhost python3.9[233087]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/run/openvswitch setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None Oct 14 05:40:38 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52380 DF PROTO=TCP SPT=59562 DPT=9105 SEQ=4074252119 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B0A57BE0000000001030307) Oct 14 05:40:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52381 DF PROTO=TCP SPT=59562 DPT=9105 SEQ=4074252119 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B0A5BDA0000000001030307) Oct 14 05:40:41 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52382 DF PROTO=TCP SPT=59562 DPT=9105 SEQ=4074252119 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT 
(020405500402080A2B0A63DA0000000001030307) Oct 14 05:40:43 localhost python3.9[233215]: ansible-ansible.builtin.getent Invoked with database=passwd key=nova fail_key=True service=None split=None Oct 14 05:40:44 localhost python3.9[233326]: ansible-ansible.builtin.group Invoked with gid=42436 name=nova state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None Oct 14 05:40:45 localhost python3.9[233442]: ansible-ansible.builtin.user Invoked with comment=nova user group=nova groups=['libvirt'] name=nova shell=/bin/sh state=present uid=42436 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005486733.localdomain update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None Oct 14 05:40:45 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52383 DF PROTO=TCP SPT=59562 DPT=9105 SEQ=4074252119 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B0A739B0000000001030307) Oct 14 05:40:46 localhost sshd[233468]: main: sshd: ssh-rsa algorithm is disabled Oct 14 05:40:46 localhost systemd-logind[760]: New session 56 of user zuul. Oct 14 05:40:46 localhost systemd[1]: Started Session 56 of User zuul. Oct 14 05:40:46 localhost systemd[1]: session-56.scope: Deactivated successfully. Oct 14 05:40:46 localhost systemd-logind[760]: Session 56 logged out. Waiting for processes to exit. 
Oct 14 05:40:46 localhost systemd-logind[760]: Removed session 56. Oct 14 05:40:47 localhost python3.9[233579]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/config.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True Oct 14 05:40:48 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12026 DF PROTO=TCP SPT=39058 DPT=9101 SEQ=2808759224 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B0A7C5B0000000001030307) Oct 14 05:40:48 localhost python3.9[233665]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/config.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1760434847.2068775-4270-210686046954456/.source.json follow=False _original_basename=config.json.j2 checksum=2c2474b5f24ef7c9ed37f49680082593e0d1100b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None Oct 14 05:40:48 localhost python3.9[233773]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova-blank.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True Oct 14 05:40:49 localhost python3.9[233828]: ansible-ansible.legacy.file Invoked with mode=0644 setype=container_file_t dest=/var/lib/openstack/config/nova/nova-blank.conf _original_basename=nova-blank.conf recurse=False state=file path=/var/lib/openstack/config/nova/nova-blank.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None 
Oct 14 05:40:49 localhost python3.9[233936]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/ssh-config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True Oct 14 05:40:50 localhost python3.9[234022]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/ssh-config mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1760434849.258708-4270-223071901125476/.source follow=False _original_basename=ssh-config checksum=4297f735c41bdc1ff52d72e6f623a02242f37958 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None Oct 14 05:40:51 localhost python3.9[234130]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/02-nova-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True Oct 14 05:40:51 localhost python3.9[234216]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/02-nova-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1760434850.9703515-4270-262085375976180/.source.conf follow=False _original_basename=02-nova-host-specific.conf.j2 checksum=0ec8d5fb830c2e963175e9158df8fb7429fe888d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None Oct 14 05:40:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29. Oct 14 05:40:52 localhost systemd[1]: tmp-crun.VhSaid.mount: Deactivated successfully. 
Oct 14 05:40:52 localhost podman[234288]: 2025-10-14 09:40:52.726333122 +0000 UTC m=+0.069188401 container health_status fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_id=iscsid, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=iscsid, org.label-schema.license=GPLv2) Oct 14 05:40:52 localhost podman[234288]: 2025-10-14 09:40:52.737403555 +0000 UTC m=+0.080258794 container exec_died fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 
(image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, tcib_managed=true, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=iscsid, managed_by=edpm_ansible, org.label-schema.license=GPLv2) Oct 14 05:40:52 localhost systemd[1]: fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29.service: Deactivated successfully. 
Oct 14 05:40:53 localhost python3.9[234343]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova_statedir_ownership.py follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True Oct 14 05:40:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17841 DF PROTO=TCP SPT=52716 DPT=9102 SEQ=866431589 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B0A91450000000001030307) Oct 14 05:40:53 localhost python3.9[234429]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/nova_statedir_ownership.py mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1760434852.1357088-4270-151939780562243/.source.py follow=False _original_basename=nova_statedir_ownership.py checksum=c6c8a3cfefa5efd60ceb1408c4e977becedb71e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None Oct 14 05:40:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17842 DF PROTO=TCP SPT=52716 DPT=9102 SEQ=866431589 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B0A955B0000000001030307) Oct 14 05:40:55 localhost python3.9[234539]: ansible-ansible.builtin.file Invoked with group=nova mode=0700 owner=nova path=/home/nova/.ssh state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:40:56 localhost 
python3.9[234649]: ansible-ansible.legacy.copy Invoked with dest=/home/nova/.ssh/authorized_keys group=nova mode=0600 owner=nova remote_src=True src=/var/lib/openstack/config/nova/ssh-publickey backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:40:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad. Oct 14 05:40:56 localhost python3.9[234759]: ansible-ansible.builtin.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Oct 14 05:40:56 localhost podman[234760]: 2025-10-14 09:40:56.70359605 +0000 UTC m=+0.052318044 container health_status 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_id=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}) Oct 14 05:40:56 localhost podman[234760]: 2025-10-14 09:40:56.715172998 +0000 UTC m=+0.063895002 container exec_died 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', 
'/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=multipathd, managed_by=edpm_ansible, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team) Oct 14 05:40:56 localhost systemd[1]: 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad.service: Deactivated successfully. Oct 14 05:40:57 localhost python3.9[234890]: ansible-ansible.builtin.file Invoked with group=nova mode=0400 owner=nova path=/var/lib/nova/compute_id state=file recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:40:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:40:57.745 163055 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Oct 14 05:40:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:40:57.746 163055 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Oct 14 05:40:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:40:57.747 163055 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" 
:: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Oct 14 05:40:58 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28851 DF PROTO=TCP SPT=34882 DPT=9882 SEQ=3452394144 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B0AA40B0000000001030307) Oct 14 05:40:58 localhost python3.9[234998]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Oct 14 05:40:59 localhost python3.9[235108]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True Oct 14 05:40:59 localhost python3.9[235194]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1760434858.6326845-4603-109733374724552/.source.json follow=False _original_basename=nova_compute.json.j2 checksum=f022386746472553146d29f689b545df70fa8a60 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None Oct 14 05:41:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17844 DF PROTO=TCP SPT=52716 DPT=9102 SEQ=866431589 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B0AAD1A0000000001030307) Oct 14 05:41:00 localhost python3.9[235302]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute_init.json 
follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True Oct 14 05:41:01 localhost python3.9[235388]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute_init.json mode=0700 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1760434860.1388803-4649-3801587671435/.source.json follow=False _original_basename=nova_compute_init.json.j2 checksum=60b024e6db49dc6e700fc0d50263944d98d4c034 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None Oct 14 05:41:02 localhost python3.9[235498]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute_init.json debug=False Oct 14 05:41:02 localhost python3.9[235608]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data Oct 14 05:41:04 localhost python3[235718]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute_init.json log_base_path=/var/log/containers/stdouts debug=False Oct 14 05:41:05 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28854 DF PROTO=TCP SPT=34882 DPT=9882 SEQ=3452394144 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B0ABFDA0000000001030307) Oct 14 05:41:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f. 
Oct 14 05:41:05 localhost podman[235744]: 2025-10-14 09:41:05.736823085 +0000 UTC m=+0.077688027 container health_status 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20251009, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller) Oct 14 05:41:05 localhost podman[235744]: 2025-10-14 09:41:05.83278984 +0000 UTC m=+0.173654772 container exec_died 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': 
'/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251009, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true) Oct 14 05:41:05 localhost systemd[1]: 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f.service: Deactivated successfully. Oct 14 05:41:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857. 
Oct 14 05:41:06 localhost podman[235770]: 2025-10-14 09:41:06.719869811 +0000 UTC m=+0.067441419 container health_status 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent) Oct 14 05:41:06 localhost podman[235770]: 2025-10-14 09:41:06.752081598 +0000 UTC 
m=+0.099653226 container exec_died 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, org.label-schema.build-date=20251009, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2) Oct 14 05:41:06 localhost systemd[1]: 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857.service: Deactivated successfully. 
Oct 14 05:41:08 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43884 DF PROTO=TCP SPT=51322 DPT=9105 SEQ=2882698617 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B0ACCEE0000000001030307) Oct 14 05:41:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43885 DF PROTO=TCP SPT=51322 DPT=9105 SEQ=2882698617 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B0AD0DA0000000001030307) Oct 14 05:41:11 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43886 DF PROTO=TCP SPT=51322 DPT=9105 SEQ=2882698617 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B0AD8DB0000000001030307) Oct 14 05:41:13 localhost podman[235731]: 2025-10-14 09:41:04.468225835 +0000 UTC m=+0.042565241 image pull quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified Oct 14 05:41:14 localhost podman[235898]: Oct 14 05:41:14 localhost podman[235898]: 2025-10-14 09:41:14.13744859 +0000 UTC m=+0.051994463 container create 6c4d7cae210e0acc65e1c4fc410517ec41d3f7534328f58bee37a38122214c4b (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, tcib_managed=true, managed_by=edpm_ansible, io.buildah.version=1.41.3, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, container_name=nova_compute_init, org.label-schema.build-date=20251009, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': 
False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, org.label-schema.schema-version=1.0) Oct 14 05:41:14 localhost podman[235898]: 2025-10-14 09:41:14.114216492 +0000 UTC m=+0.028762385 image pull quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified Oct 14 05:41:14 localhost python3[235718]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute_init --conmon-pidfile /run/nova_compute_init.pid --env NOVA_STATEDIR_OWNERSHIP_SKIP=/var/lib/nova/compute_id --env __OS_DEBUG=False --label config_id=edpm --label container_name=nova_compute_init --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']} --log-driver journald --log-level info --network none --privileged=False --security-opt label=disable --user root --volume /dev/log:/dev/log --volume /var/lib/nova:/var/lib/nova:shared --volume 
/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z --volume /var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init Oct 14 05:41:15 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43887 DF PROTO=TCP SPT=51322 DPT=9105 SEQ=2882698617 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B0AE89A0000000001030307) Oct 14 05:41:16 localhost python3.9[236069]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Oct 14 05:41:18 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41505 DF PROTO=TCP SPT=50216 DPT=9101 SEQ=3502383415 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B0AF19A0000000001030307) Oct 14 05:41:18 localhost python3.9[236181]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute.json debug=False Oct 14 05:41:19 localhost python3.9[236291]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data Oct 14 05:41:21 localhost python3[236401]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute.json log_base_path=/var/log/containers/stdouts debug=False Oct 14 05:41:21 localhost python3[236401]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: [#012 {#012 "Id": 
"b5b57d3572ac74b7c41332c066527d5039dbd47e134e43d7cb5d76b7732d99f5",#012 "Digest": "sha256:6cdce1b6b9f1175545fa217f885c1a3360bebe7d9975584481a6ff221f3ad48f",#012 "RepoTags": [#012 "quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified"#012 ],#012 "RepoDigests": [#012 "quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:6cdce1b6b9f1175545fa217f885c1a3360bebe7d9975584481a6ff221f3ad48f"#012 ],#012 "Parent": "",#012 "Comment": "",#012 "Created": "2025-10-13T12:50:19.385564198Z",#012 "Config": {#012 "User": "nova",#012 "Env": [#012 "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin",#012 "LANG=en_US.UTF-8",#012 "TZ=UTC",#012 "container=oci"#012 ],#012 "Entrypoint": [#012 "dumb-init",#012 "--single-child",#012 "--"#012 ],#012 "Cmd": [#012 "kolla_start"#012 ],#012 "Labels": {#012 "io.buildah.version": "1.41.3",#012 "maintainer": "OpenStack Kubernetes Operator team",#012 "org.label-schema.build-date": "20251009",#012 "org.label-schema.license": "GPLv2",#012 "org.label-schema.name": "CentOS Stream 9 Base Image",#012 "org.label-schema.schema-version": "1.0",#012 "org.label-schema.vendor": "CentOS",#012 "tcib_build_tag": "1e4eeec18f8da2b364b39b7a7358aef5",#012 "tcib_managed": "true"#012 },#012 "StopSignal": "SIGTERM"#012 },#012 "Version": "",#012 "Author": "",#012 "Architecture": "amd64",#012 "Os": "linux",#012 "Size": 1207014273,#012 "VirtualSize": 1207014273,#012 "GraphDriver": {#012 "Name": "overlay",#012 "Data": {#012 "LowerDir": "/var/lib/containers/storage/overlay/512b226761ef17c0044cb14b83718aa6f9984afb51b1aeb63112d22d2fdccb36/diff:/var/lib/containers/storage/overlay/0accaf46e2ca98f20a95b21cea4fb623de0e5378cb14b163bca0a8771d84c861/diff:/var/lib/containers/storage/overlay/ab64777085904da680013c790c3f2c65f0b954578737ec4d7fa836f56655c34a/diff:/var/lib/containers/storage/overlay/f3f40f6483bf6d587286da9e86e40878c2aaaf723da5aa2364fff24f5ea28424/diff",#012 "UpperDir": 
"/var/lib/containers/storage/overlay/5ce6c5d0cc60f856680938093014249abcf9a107a94355720d953b1d1e7f1bfe/diff",#012 "WorkDir": "/var/lib/containers/storage/overlay/5ce6c5d0cc60f856680938093014249abcf9a107a94355720d953b1d1e7f1bfe/work"#012 }#012 },#012 "RootFS": {#012 "Type": "layers",#012 "Layers": [#012 "sha256:f3f40f6483bf6d587286da9e86e40878c2aaaf723da5aa2364fff24f5ea28424",#012 "sha256:2c35d1af0a6e73cbcf6c04a576d2e6a150aeaa6ae9408c81b2003edd71d6ae59",#012 "sha256:3ad61591f8d467f7db4e096e1991f274fe1d4f8ad685b553dacb57c5e894eab0",#012 "sha256:e0ba9b00dd1340fa4eba9e9cd5f316c11381d47a31460e5b834a6ca56f60033f",#012 "sha256:731e9354c974a424a2f6724faa85f84baef270eb006be0de18bbdc87ff420f97"#012 ]#012 },#012 "Labels": {#012 "io.buildah.version": "1.41.3",#012 "maintainer": "OpenStack Kubernetes Operator team",#012 "org.label-schema.build-date": "20251009",#012 "org.label-schema.license": "GPLv2",#012 "org.label-schema.name": "CentOS Stream 9 Base Image",#012 "org.label-schema.schema-version": "1.0",#012 "org.label-schema.vendor": "CentOS",#012 "tcib_build_tag": "1e4eeec18f8da2b364b39b7a7358aef5",#012 "tcib_managed": "true"#012 },#012 "Annotations": {},#012 "ManifestType": "application/vnd.docker.distribution.manifest.v2+json",#012 "User": "nova",#012 "History": [#012 {#012 "created": "2025-10-09T00:18:03.867908726Z",#012 "created_by": "/bin/sh -c #(nop) ADD file:b2e608b9da8e087a764c2aebbd9c2cc9181047f5b301f1dab77fdf098a28268b in / ",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-10-09T00:18:03.868015697Z",#012 "created_by": "/bin/sh -c #(nop) LABEL org.label-schema.schema-version=\"1.0\" org.label-schema.name=\"CentOS Stream 9 Base Image\" org.label-schema.vendor=\"CentOS\" org.label-schema.license=\"GPLv2\" org.label-schema.build-date=\"20251009\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-10-09T00:18:07.890794359Z",#012 "created_by": "/bin/sh -c #(nop) CMD [\"/bin/bash\"]"#012 },#012 {#012 "created": "2025-10-13T12:28:42.843286399Z",#012 
"created_by": "/bin/sh -c #(nop) LABEL maintainer=\"OpenStack Kubernetes Operator team\"",#012 "comment": "FROM quay.io/centos/centos:stream9",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-10-13T12:28:42.843354051Z",#012 "created_by": "/bin/sh -c #(nop) LABEL tcib_managed=true",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-10-13T12:28:42.843394192Z",#012 "created_by": "/bin/sh -c #(nop) ENV LANG=\"en_US.UTF-8\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-10-13T12:28:42.843417133Z",#012 "created_by": "/bin/sh -c #(nop) ENV TZ=\"UTC\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-10-13T12:28:42.843442193Z",#012 "created_by": "/bin/sh -c #(nop) ENV container=\"oci\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-10-13T12:28:42.843461914Z",#012 "created_by": "/bin/sh -c #(nop) USER root",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-10-13T12:28:43.236856724Z",#012 "created_by": "/bin/sh -c if [ -f \"/etc/yum.repos.d/ubi.repo\" ]; then rm -f /etc/yum.repos.d/ubi.repo && dnf clean all && rm -rf /var/cache/dnf; fi",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-10-13T12:29:17.539596691Z",#012 "created_by": "/bin/sh -c dnf install -y crudini && crudini --del /etc/dnf/dnf.conf main override_install_langs && crudini --set /etc/dnf/dnf.conf main clean_requirements_on_remove True && crudini --set /etc/dnf/dnf.conf main exactarch 1 && crudini --set /etc/dnf/dnf.conf main gpgcheck 1 && crudini --set /etc/dnf/dnf.conf main install_weak_deps False && if [ 'centos' == 'centos' ];then crudini --set /etc/dnf/dnf.conf main best False; fi && crudini --set /etc/dnf/dnf.conf main installonly_limit 0 && crudini --set /etc/dnf/dnf.conf main keepcache 0 && crudini --set /etc/dnf/dnf.conf main obsoletes 1 && crudini --set /etc/dnf/dnf.conf main plugins 1 && crudini --set /etc/dnf/dnf.conf main skip_missing_names_on_install False && crudini --set /etc/dnf/dnf.conf main tsflags nodocs",#012 
"empty_layer": true#012 },#012 {#012 Oct 14 05:41:21 localhost podman[236452]: 2025-10-14 09:41:21.389543297 +0000 UTC m=+0.084969915 container remove ec629309e953c690b6ab5976cacbe5657f9b5cb392f4eb76057feec6a3b7c0a2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, batch=17.1_20250721.1, release=1, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '49c4309af9a4fea3d3f53b6222780f5a-4d186a6228facd5bcddf9bcc145eb470'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, config_id=tripleo_step5, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, build-date=2025-07-21T14:48:37, managed_by=tripleo_ansible, container_name=nova_compute, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc., version=17.1.9, name=rhosp17/openstack-nova-compute) Oct 14 05:41:21 localhost python3[236401]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman rm --force nova_compute Oct 14 05:41:21 localhost podman[236465]: Oct 14 05:41:21 localhost podman[236465]: 2025-10-14 09:41:21.506089711 +0000 UTC m=+0.098405459 container create b5bca3e82ce7d7a3523a446d3f76940f7a8c2f4ab9ab6051e2ae66a8045d2048 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, config_id=edpm, container_name=nova_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 
'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.build-date=20251009, managed_by=edpm_ansible, io.buildah.version=1.41.3) Oct 14 05:41:21 localhost podman[236465]: 2025-10-14 09:41:21.445399556 +0000 UTC m=+0.037715334 image pull quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified Oct 14 05:41:21 localhost python3[236401]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute --conmon-pidfile /run/nova_compute.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --label config_id=edpm --label container_name=nova_compute --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi:z', 
'/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']} --log-driver journald --log-level info --network host --privileged=True --user nova --volume /var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro --volume /var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /etc/localtime:/etc/localtime:ro --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /var/lib/libvirt:/var/lib/libvirt --volume /run/libvirt:/run/libvirt:shared --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/iscsi:/var/lib/iscsi:z --volume /etc/multipath:/etc/multipath:z --volume /etc/multipath.conf:/etc/multipath.conf:ro --volume /etc/iscsi:/etc/iscsi:ro --volume /etc/nvme:/etc/nvme --volume /var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified kolla_start Oct 14 05:41:22 localhost python3.9[236611]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Oct 14 05:41:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29. Oct 14 05:41:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11509 DF PROTO=TCP SPT=33168 DPT=9102 SEQ=2071518500 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B0B06750000000001030307) Oct 14 05:41:23 localhost systemd[1]: tmp-crun.DTB0rp.mount: Deactivated successfully. 
Oct 14 05:41:23 localhost podman[236723]: 2025-10-14 09:41:23.55390586 +0000 UTC m=+0.092422779 container health_status fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, container_name=iscsid, org.label-schema.license=GPLv2, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=iscsid, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image) Oct 14 05:41:23 localhost podman[236723]: 2025-10-14 09:41:23.564920401 +0000 UTC m=+0.103437320 container exec_died fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 
(image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=iscsid, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0) Oct 14 05:41:23 localhost systemd[1]: fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29.service: Deactivated successfully. 
Oct 14 05:41:23 localhost python3.9[236724]: ansible-file Invoked with path=/etc/systemd/system/edpm_nova_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:41:24 localhost python3.9[236851]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1760434883.737679-4923-248238471718398/source dest=/etc/systemd/system/edpm_nova_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:41:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11510 DF PROTO=TCP SPT=33168 DPT=9102 SEQ=2071518500 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B0B0A9A0000000001030307) Oct 14 05:41:25 localhost python3.9[236906]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None Oct 14 05:41:25 localhost systemd[1]: Reloading. Oct 14 05:41:25 localhost systemd-rc-local-generator[236931]: /etc/rc.d/rc.local is not marked executable, skipping. Oct 14 05:41:25 localhost systemd-sysv-generator[236934]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. 
Oct 14 05:41:25 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Oct 14 05:41:26 localhost python3.9[236997]: ansible-systemd Invoked with state=restarted name=edpm_nova_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Oct 14 05:41:26 localhost systemd[1]: Reloading. Oct 14 05:41:26 localhost systemd-rc-local-generator[237026]: /etc/rc.d/rc.local is not marked executable, skipping. Oct 14 05:41:26 localhost systemd-sysv-generator[237031]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Oct 14 05:41:26 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Oct 14 05:41:26 localhost systemd[1]: Starting nova_compute container... Oct 14 05:41:26 localhost systemd[1]: Started libcrun container. 
Oct 14 05:41:26 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/533fa5474117d9c6b7dd39987146fbd30daa3403ff237c50bdb261ff835bc940/merged/etc/multipath supports timestamps until 2038 (0x7fffffff) Oct 14 05:41:26 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/533fa5474117d9c6b7dd39987146fbd30daa3403ff237c50bdb261ff835bc940/merged/etc/nvme supports timestamps until 2038 (0x7fffffff) Oct 14 05:41:26 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/533fa5474117d9c6b7dd39987146fbd30daa3403ff237c50bdb261ff835bc940/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff) Oct 14 05:41:26 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/533fa5474117d9c6b7dd39987146fbd30daa3403ff237c50bdb261ff835bc940/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff) Oct 14 05:41:26 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/533fa5474117d9c6b7dd39987146fbd30daa3403ff237c50bdb261ff835bc940/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff) Oct 14 05:41:26 localhost podman[237038]: 2025-10-14 09:41:26.614345068 +0000 UTC m=+0.127184603 container init b5bca3e82ce7d7a3523a446d3f76940f7a8c2f4ab9ab6051e2ae66a8045d2048 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', 
'/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, container_name=nova_compute, config_id=edpm, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251009) Oct 14 05:41:26 localhost podman[237038]: 2025-10-14 09:41:26.621554129 +0000 UTC m=+0.134393704 container start b5bca3e82ce7d7a3523a446d3f76940f7a8c2f4ab9ab6051e2ae66a8045d2048 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', 
'/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=nova_compute, org.label-schema.build-date=20251009) Oct 14 05:41:26 localhost podman[237038]: nova_compute Oct 14 05:41:26 localhost nova_compute[237052]: + sudo -E kolla_set_configs Oct 14 05:41:26 localhost systemd[1]: Started nova_compute container. Oct 14 05:41:26 localhost nova_compute[237052]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json Oct 14 05:41:26 localhost nova_compute[237052]: INFO:__main__:Validating config file Oct 14 05:41:26 localhost nova_compute[237052]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS Oct 14 05:41:26 localhost nova_compute[237052]: INFO:__main__:Copying service configuration files Oct 14 05:41:26 localhost nova_compute[237052]: INFO:__main__:Deleting /etc/nova/nova.conf Oct 14 05:41:26 localhost nova_compute[237052]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf Oct 14 05:41:26 localhost nova_compute[237052]: INFO:__main__:Setting permission for /etc/nova/nova.conf Oct 14 05:41:26 localhost nova_compute[237052]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf Oct 14 05:41:26 localhost nova_compute[237052]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf Oct 14 05:41:26 localhost nova_compute[237052]: INFO:__main__:Copying /var/lib/kolla/config_files/03-ceph-nova.conf to /etc/nova/nova.conf.d/03-ceph-nova.conf Oct 14 05:41:26 localhost nova_compute[237052]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/03-ceph-nova.conf Oct 
14 05:41:26 localhost nova_compute[237052]: INFO:__main__:Copying /var/lib/kolla/config_files/99-nova-compute-cells-workarounds.conf to /etc/nova/nova.conf.d/99-nova-compute-cells-workarounds.conf Oct 14 05:41:26 localhost nova_compute[237052]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/99-nova-compute-cells-workarounds.conf Oct 14 05:41:26 localhost nova_compute[237052]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf Oct 14 05:41:26 localhost nova_compute[237052]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf Oct 14 05:41:26 localhost nova_compute[237052]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf Oct 14 05:41:26 localhost nova_compute[237052]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf Oct 14 05:41:26 localhost nova_compute[237052]: INFO:__main__:Deleting /etc/ceph Oct 14 05:41:26 localhost nova_compute[237052]: INFO:__main__:Creating directory /etc/ceph Oct 14 05:41:26 localhost nova_compute[237052]: INFO:__main__:Setting permission for /etc/ceph Oct 14 05:41:26 localhost nova_compute[237052]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.conf to /etc/ceph/ceph.conf Oct 14 05:41:26 localhost nova_compute[237052]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf Oct 14 05:41:26 localhost nova_compute[237052]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.client.openstack.keyring to /etc/ceph/ceph.client.openstack.keyring Oct 14 05:41:26 localhost nova_compute[237052]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring Oct 14 05:41:26 localhost nova_compute[237052]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey Oct 14 05:41:26 localhost nova_compute[237052]: INFO:__main__:Setting permission for 
/var/lib/nova/.ssh/ssh-privatekey Oct 14 05:41:26 localhost nova_compute[237052]: INFO:__main__:Deleting /var/lib/nova/.ssh/config Oct 14 05:41:26 localhost nova_compute[237052]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config Oct 14 05:41:26 localhost nova_compute[237052]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config Oct 14 05:41:26 localhost nova_compute[237052]: INFO:__main__:Writing out command to execute Oct 14 05:41:26 localhost nova_compute[237052]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf Oct 14 05:41:26 localhost nova_compute[237052]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring Oct 14 05:41:26 localhost nova_compute[237052]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ Oct 14 05:41:26 localhost nova_compute[237052]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey Oct 14 05:41:26 localhost nova_compute[237052]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config Oct 14 05:41:26 localhost nova_compute[237052]: ++ cat /run_command Oct 14 05:41:26 localhost nova_compute[237052]: + CMD=nova-compute Oct 14 05:41:26 localhost nova_compute[237052]: + ARGS= Oct 14 05:41:26 localhost nova_compute[237052]: + sudo kolla_copy_cacerts Oct 14 05:41:26 localhost nova_compute[237052]: + [[ ! -n '' ]] Oct 14 05:41:26 localhost nova_compute[237052]: + . kolla_extend_start Oct 14 05:41:26 localhost nova_compute[237052]: + echo 'Running command: '\''nova-compute'\''' Oct 14 05:41:26 localhost nova_compute[237052]: Running command: 'nova-compute' Oct 14 05:41:26 localhost nova_compute[237052]: + umask 0022 Oct 14 05:41:26 localhost nova_compute[237052]: + exec nova-compute Oct 14 05:41:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad. Oct 14 05:41:28 localhost systemd[1]: tmp-crun.da8nY8.mount: Deactivated successfully. 
Oct 14 05:41:28 localhost podman[237198]: 2025-10-14 09:41:28.178330433 +0000 UTC m=+0.518337742 container health_status 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=multipathd, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0) Oct 14 05:41:28 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 
DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23720 DF PROTO=TCP SPT=43830 DPT=9882 SEQ=2353291800 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B0B193A0000000001030307) Oct 14 05:41:28 localhost podman[237198]: 2025-10-14 09:41:28.34612021 +0000 UTC m=+0.686127499 container exec_died 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.schema-version=1.0, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, container_name=multipathd, 
tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d) Oct 14 05:41:28 localhost nova_compute[237052]: 2025-10-14 09:41:28.381 2 DEBUG os_vif [-] Loaded VIF plugin class '' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m Oct 14 05:41:28 localhost nova_compute[237052]: 2025-10-14 09:41:28.382 2 DEBUG os_vif [-] Loaded VIF plugin class '' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m Oct 14 05:41:28 localhost nova_compute[237052]: 2025-10-14 09:41:28.382 2 DEBUG os_vif [-] Loaded VIF plugin class '' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m Oct 14 05:41:28 localhost nova_compute[237052]: 2025-10-14 09:41:28.382 2 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs#033[00m Oct 14 05:41:28 localhost nova_compute[237052]: 2025-10-14 09:41:28.506 2 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Oct 14 05:41:28 localhost nova_compute[237052]: 2025-10-14 09:41:28.514 2 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 0 in 0.008s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Oct 14 05:41:28 localhost systemd[1]: 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad.service: Deactivated successfully. Oct 14 05:41:28 localhost nova_compute[237052]: 2025-10-14 09:41:28.989 2 INFO nova.virt.driver [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.102 2 INFO nova.compute.provider_config [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] No provider configs found in /etc/nova/provider_config/. 
If files are present, ensure the Nova process has access.#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.110 2 WARNING nova.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] Current Nova version does not support computes older than Yoga but the minimum compute service level in your cell is 57 and the oldest supported service level is 61.: nova.exception.TooOldComputeService: Current Nova version does not support computes older than Yoga but the minimum compute service level in your cell is 57 and the oldest supported service level is 61.#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.111 2 DEBUG oslo_concurrency.lockutils [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.112 2 DEBUG oslo_concurrency.lockutils [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.112 2 DEBUG oslo_concurrency.lockutils [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.112 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.112 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] ******************************************************************************** log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.112 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.112 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.113 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.113 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.113 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] allow_resize_to_same_host = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.113 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] arq_binding_timeout = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.113 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] backdoor_port = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.113 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] backdoor_socket = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.113 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] block_device_allocate_retries = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.114 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.114 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] cert = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.114 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] compute_driver = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.114 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] compute_monitors = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.114 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] config_dir = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:29 localhost 
nova_compute[237052]: 2025-10-14 09:41:29.114 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] config_drive_format = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.114 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] config_file = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.115 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] config_source = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.115 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] console_host = np0005486733.localdomain log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.115 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] control_exchange = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.115 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] cpu_allocation_ratio = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.115 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] daemon = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.115 2 DEBUG oslo_service.service [None 
req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] debug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.115 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.116 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] default_availability_zone = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.116 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] default_ephemeral_format = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.116 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] default_log_levels = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.116 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] default_schedule_zone = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.116 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] disk_allocation_ratio = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.116 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] enable_new_services = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.117 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] enabled_apis = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.117 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] enabled_ssl_apis = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.117 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] flat_injected = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.117 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] force_config_drive = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.117 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] force_raw_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 
2025-10-14 09:41:29.117 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] graceful_shutdown_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.117 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.118 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] host = np0005486733.localdomain log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.118 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] initial_cpu_allocation_ratio = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.118 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] initial_disk_allocation_ratio = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.118 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] initial_ram_allocation_ratio = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.118 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] injected_network_template = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 
09:41:29.118 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] instance_build_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.118 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] instance_delete_interval = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.119 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] instance_format = [instance: %(uuid)s] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.119 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] instance_name_template = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.119 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] instance_usage_audit = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.119 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] instance_usage_audit_period = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.119 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] instance_uuid_format = [instance: %(uuid)s] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.119 2 DEBUG oslo_service.service [None 
req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] instances_path = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.120 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.120 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] key = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.120 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] live_migration_retry_count = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.120 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] log_config_append = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.120 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] log_date_format = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.120 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] log_dir = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.120 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] log_file = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.121 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] log_options = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.121 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] log_rotate_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.121 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] log_rotate_interval_type = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.121 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] log_rotation_type = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.121 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] logging_context_format_string = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.121 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] logging_debug_format_suffix = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.121 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - 
- - - - -] logging_default_format_string = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.121 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] logging_exception_prefix = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.122 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] logging_user_identity_format = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.122 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] long_rpc_timeout = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.122 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] max_concurrent_builds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.122 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.122 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] max_concurrent_snapshots = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:29 
localhost nova_compute[237052]: 2025-10-14 09:41:29.122 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] max_local_block_devices = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.122 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] max_logfile_count = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.123 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] max_logfile_size_mb = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.123 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.123 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] metadata_listen = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.123 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] metadata_listen_port = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.123 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] metadata_workers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.124 2 DEBUG oslo_service.service [None 
req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] migrate_max_retries = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.124 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] mkisofs_cmd = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.124 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] my_block_storage_ip = 192.168.122.108 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.124 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] my_ip = 192.168.122.108 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.124 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] network_allocate_retries = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.124 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.125 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] osapi_compute_listen = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.125 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] 
osapi_compute_listen_port = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.125 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] osapi_compute_unique_server_name_scope = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.125 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] osapi_compute_workers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.125 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] password_length = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.125 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] periodic_enable = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.125 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] periodic_fuzzy_delay = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.126 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] pointer_model = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.126 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] preallocate_images = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 
05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.126 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] publish_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.126 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] pybasedir = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.126 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] ram_allocation_ratio = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.126 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] rate_limit_burst = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.126 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] rate_limit_except_level = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.127 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] rate_limit_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.127 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] reboot_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.127 2 DEBUG oslo_service.service [None 
req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] reclaim_instance_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.127 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] record = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.127 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] reimage_timeout_per_gb = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.127 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] report_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.128 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] rescue_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.128 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] reserved_host_cpus = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.128 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] reserved_host_disk_mb = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.128 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] reserved_host_memory_mb = 512 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.128 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] reserved_huge_pages = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.129 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] resize_confirm_window = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.129 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] resize_fs_using_block_device = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.129 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.129 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] rootwrap_config = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.129 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] rpc_response_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.129 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] run_external_periodic_tasks = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 
05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.129 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.130 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.130 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.130 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.130 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] service_down_time = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.130 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] servicegroup_driver = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.130 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] shelved_offload_time = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.130 2 DEBUG 
oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] shelved_poll_interval = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.130 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] shutdown_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.131 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] source_is_ipv6 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.131 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] ssl_only = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.131 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] state_path = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.131 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] sync_power_state_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.131 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] sync_power_state_pool_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.131 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] syslog_log_facility = LOG_USER log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.131 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] tempdir = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.132 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] timeout_nbd = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.132 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.132 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] update_resources_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.132 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] use_cow_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.132 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] use_eventlog = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.132 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] use_journal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.132 2 DEBUG oslo_service.service 
[None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] use_json = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.133 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] use_rootwrap_daemon = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.133 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] use_stderr = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.133 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] use_syslog = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.133 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] vcpu_pin_set = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.133 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] vif_plugging_is_fatal = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.133 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] vif_plugging_timeout = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.133 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] virt_mkfs = [] log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.134 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] volume_usage_poll_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.134 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] watch_log_file = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.134 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] web = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.134 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.134 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] oslo_concurrency.lock_path = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.134 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.134 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.134 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] oslo_messaging_metrics.metrics_process_name = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.135 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.135 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.135 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] api.auth_strategy = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.135 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] api.compute_link_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.135 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.135 2 DEBUG oslo_service.service [None 
req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] api.dhcp_domain = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.135 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] api.enable_instance_password = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.136 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] api.glance_link_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.136 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.136 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.136 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.137 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.137 2 DEBUG oslo_service.service [None 
req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] api.local_metadata_per_cell = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.137 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] api.max_limit = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.137 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] api.metadata_cache_expiration = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.137 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] api.neutron_default_tenant_id = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.137 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] api.use_forwarded_for = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.137 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] api.use_neutron_default_nets = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.138 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.138 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] 
api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.138 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.138 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] api.vendordata_dynamic_ssl_certfile = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.138 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.138 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] api.vendordata_jsonfile_path = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.138 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] api.vendordata_providers = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.139 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] cache.backend = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.139 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] cache.backend_argument = **** 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.139 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] cache.config_prefix = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.139 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] cache.dead_timeout = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.139 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] cache.debug_cache_backend = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.139 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] cache.enable_retry_client = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.139 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] cache.enable_socket_keepalive = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.140 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] cache.enabled = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.140 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] cache.expiration_time = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 
localhost nova_compute[237052]: 2025-10-14 09:41:29.140 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.140 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] cache.hashclient_retry_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.140 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] cache.memcache_dead_retry = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.140 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] cache.memcache_password = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.140 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.141 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.141 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] cache.memcache_pool_maxsize = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.141 
2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.141 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] cache.memcache_sasl_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.141 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] cache.memcache_servers = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.141 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] cache.memcache_socket_timeout = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.141 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] cache.memcache_username = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.141 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] cache.proxies = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.142 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] cache.retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.142 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - 
-] cache.retry_delay = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.142 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] cache.socket_keepalive_count = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.142 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] cache.socket_keepalive_idle = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.142 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.142 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] cache.tls_allowed_ciphers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.143 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] cache.tls_cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.143 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] cache.tls_certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.143 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] cache.tls_enabled = False log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.143 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] cache.tls_keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.144 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] cinder.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.144 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] cinder.auth_type = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.144 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] cinder.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.144 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] cinder.catalog_info = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.144 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] cinder.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.145 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] cinder.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost 
nova_compute[237052]: 2025-10-14 09:41:29.145 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] cinder.cross_az_attach = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.145 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] cinder.debug = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.145 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] cinder.endpoint_template = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.145 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] cinder.http_retries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.145 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] cinder.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.145 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] cinder.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.146 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] cinder.os_region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.146 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - 
- -] cinder.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.146 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] cinder.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.146 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.146 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] compute.cpu_dedicated_set = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.146 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] compute.cpu_shared_set = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.147 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.147 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.147 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.147 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.147 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.147 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.148 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.148 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.148 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] compute.vmdk_allowed_types = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.148 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] 
conductor.workers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.148 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] console.allowed_origins = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.148 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] console.ssl_ciphers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.149 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] console.ssl_minimum_version = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.149 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] consoleauth.token_ttl = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.149 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] cyborg.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.149 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] cyborg.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.149 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] cyborg.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 
05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.149 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] cyborg.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.149 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] cyborg.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.149 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] cyborg.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.150 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] cyborg.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.150 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] cyborg.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.150 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] cyborg.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.150 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] cyborg.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.150 2 DEBUG oslo_service.service [None 
req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] cyborg.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.150 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] cyborg.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.150 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] cyborg.service_type = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.151 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] cyborg.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.151 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] cyborg.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.151 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.151 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] cyborg.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.151 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] cyborg.valid_interfaces = ['internal', 'public'] 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.151 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] cyborg.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.152 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] database.backend = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.152 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] database.connection = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.152 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] database.connection_debug = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.152 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] database.connection_parameters = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.152 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.152 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] database.connection_trace = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.152 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.153 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] database.db_max_retries = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.153 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.153 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.153 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] database.max_overflow = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.153 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] database.max_pool_size = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.153 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] database.max_retries = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.153 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] database.mysql_enable_ndb = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.154 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] database.mysql_sql_mode = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.154 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.154 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] database.pool_timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.154 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] database.retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.154 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] database.slave_connection = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.154 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.154 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] api_database.backend = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.155 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] api_database.connection = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.155 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] api_database.connection_debug = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.155 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] api_database.connection_parameters = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.155 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.155 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] api_database.connection_trace = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.155 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.155 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] api_database.db_max_retries = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.156 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.156 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.156 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] api_database.max_overflow = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.156 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] api_database.max_pool_size = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.156 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] api_database.max_retries = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.156 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] api_database.mysql_enable_ndb = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.156 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] api_database.mysql_sql_mode = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.157 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.157 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] api_database.pool_timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.157 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] api_database.retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.157 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] api_database.slave_connection = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.157 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.157 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] devices.enabled_mdev_types = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.157 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.158 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.158 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.158 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] glance.api_servers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.158 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] glance.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.158 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] glance.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.158 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] glance.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.158 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] glance.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.159 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] glance.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.159 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] glance.debug = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.159 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.159 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.159 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] glance.enable_rbd_download = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.159 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] glance.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.159 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] glance.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.159 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] glance.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.160 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] glance.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.160 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] glance.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.160 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] glance.num_retries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.160 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] glance.rbd_ceph_conf = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.160 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] glance.rbd_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.160 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] glance.rbd_pool = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.161 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] glance.rbd_user = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.161 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] glance.region_name = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.161 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] glance.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.161 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] glance.service_type = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.161 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] glance.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.161 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] glance.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.162 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.162 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] glance.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.162 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] glance.valid_interfaces = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.162 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.162 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] glance.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.162 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] guestfs.debug = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.162 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] hyperv.config_drive_cdrom = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.163 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.163 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] hyperv.dynamic_memory_ratio = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.163 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.163 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] hyperv.enable_remotefx = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.163 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] hyperv.instances_path_share = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.163 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] hyperv.iscsi_initiator_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.163 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] hyperv.limit_cpu_features = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.164 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.164 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.164 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.164 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.164 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] hyperv.qemu_img_cmd = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.164 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] hyperv.use_multipath_io = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.164 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.164 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.165 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] hyperv.vswitch_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.165 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.165 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] mks.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.165 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] mks.mksproxy_base_url = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.165 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] image_cache.manager_interval = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.165 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.166 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.166 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.166 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.166 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] image_cache.subdirectory_name = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.166 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] ironic.api_max_retries = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.166 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] ironic.api_retry_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.166 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] ironic.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.167 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] ironic.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.167 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] ironic.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.167 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] ironic.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.167 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] ironic.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.167 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] ironic.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.167 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] ironic.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.167 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] ironic.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.168 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] ironic.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.168 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] ironic.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.168 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] ironic.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.168 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] ironic.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.168 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] ironic.partition_key = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.168 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] ironic.peer_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.168 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] ironic.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.169 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.169 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] ironic.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.169 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] ironic.service_type = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.169 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] ironic.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.169 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] ironic.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.169 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.169 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] ironic.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.170 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] ironic.valid_interfaces = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.170 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] ironic.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.170 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] key_manager.backend = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.170 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] key_manager.fixed_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.170 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] barbican.auth_endpoint = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.170 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] barbican.barbican_api_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.171 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] barbican.barbican_endpoint = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.171 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.171 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] barbican.barbican_region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.171 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] barbican.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.171 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] barbican.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.171 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] barbican.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.172 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] barbican.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.172 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] barbican.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.172 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] barbican.number_of_retries = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.172 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] barbican.retry_delay = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.172 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.173 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] barbican.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.173 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] barbican.timeout = None log_opt_values
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.173 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] barbican.verify_ssl = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.173 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] barbican.verify_ssl_path = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.173 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.173 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.173 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] barbican_service_user.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.174 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.174 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] barbican_service_user.collect_timing = False log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.174 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.174 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] barbican_service_user.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.174 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.174 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] barbican_service_user.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.174 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] vault.approle_role_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.174 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] vault.approle_secret_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.175 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] vault.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 
05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.175 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] vault.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.175 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] vault.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.175 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] vault.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.175 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] vault.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.175 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] vault.kv_mountpoint = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.176 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] vault.kv_version = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.176 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] vault.namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.176 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - 
- - - -] vault.root_token_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.176 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] vault.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.176 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] vault.ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.176 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] vault.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.176 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] vault.use_ssl = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.177 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] vault.vault_url = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.177 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] keystone.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.177 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] keystone.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 
05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.177 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] keystone.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.177 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] keystone.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.177 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] keystone.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.177 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] keystone.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.178 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] keystone.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.178 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] keystone.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.178 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] keystone.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.178 2 DEBUG oslo_service.service [None 
req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] keystone.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.178 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] keystone.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.178 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] keystone.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.178 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] keystone.service_type = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.179 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] keystone.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.179 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] keystone.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.179 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.179 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] keystone.timeout = None 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.179 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] keystone.valid_interfaces = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.179 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] keystone.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.179 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] libvirt.connection_uri = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.180 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] libvirt.cpu_mode = host-model log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.180 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] libvirt.cpu_model_extra_flags = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.180 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] libvirt.cpu_models = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.180 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.180 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.180 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] libvirt.cpu_power_management = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.180 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.180 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.181 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] libvirt.device_detach_timeout = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.181 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] libvirt.disk_cachemodes = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.181 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] libvirt.disk_prefix = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.181 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] libvirt.enabled_perf_events = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.181 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] libvirt.file_backed_memory = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.181 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] libvirt.gid_maps = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.181 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] libvirt.hw_disk_discard = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.182 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] libvirt.hw_machine_type = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.182 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] libvirt.images_rbd_ceph_conf = /etc/ceph/ceph.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.182 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.182 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.182 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] libvirt.images_rbd_glance_store_name = default_backend log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.183 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] libvirt.images_rbd_pool = vms log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.183 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] libvirt.images_type = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.183 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] libvirt.images_volume_group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.183 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] libvirt.inject_key = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.183 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] libvirt.inject_partition = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m 
Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.183 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] libvirt.inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.184 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] libvirt.iscsi_iface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.184 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] libvirt.iser_use_multipath = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.184 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.184 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.184 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.184 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 
2025-10-14 09:41:29.184 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.185 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.185 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.185 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.185 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] libvirt.live_migration_scheme = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.185 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.186 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost 
nova_compute[237052]: 2025-10-14 09:41:29.186 2 WARNING oslo_config.cfg [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal ( Oct 14 05:41:29 localhost nova_compute[237052]: live_migration_uri is deprecated for removal in favor of two other options that Oct 14 05:41:29 localhost nova_compute[237052]: allow to change live migration scheme and target URI: ``live_migration_scheme`` Oct 14 05:41:29 localhost nova_compute[237052]: and ``live_migration_inbound_addr`` respectively. Oct 14 05:41:29 localhost nova_compute[237052]: ). Its value may be silently ignored in the future.#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.186 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] libvirt.live_migration_uri = qemu+ssh://nova@%s/system?keyfile=/var/lib/nova/.ssh/ssh-privatekey log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.186 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] libvirt.live_migration_with_native_tls = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.186 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] libvirt.max_queues = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.187 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.187 2 DEBUG oslo_service.service [None 
req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] libvirt.nfs_mount_options = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.187 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] libvirt.nfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.187 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.187 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] libvirt.num_iser_scan_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.187 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.188 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.188 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] libvirt.num_pcie_ports = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.188 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] 
libvirt.num_volume_scan_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.188 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] libvirt.pmem_namespaces = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.188 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] libvirt.quobyte_client_cfg = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.189 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.189 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] libvirt.rbd_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.189 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.189 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.189 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] libvirt.rbd_secret_uuid = 
fcadf6e2-9176-5818-a8d0-37b19acf8eaf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.189 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] libvirt.rbd_user = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.189 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.189 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.190 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] libvirt.rescue_image_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.190 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] libvirt.rescue_kernel_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.190 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] libvirt.rescue_ramdisk_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.190 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] libvirt.rng_dev_path = /dev/urandom log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.190 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] libvirt.rx_queue_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.190 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] libvirt.smbfs_mount_options = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.191 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.191 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] libvirt.snapshot_compression = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.191 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] libvirt.snapshot_image_format = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.191 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] libvirt.snapshots_directory = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.191 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.191 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] libvirt.swtpm_enabled = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.192 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] libvirt.swtpm_group = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.192 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] libvirt.swtpm_user = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.192 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] libvirt.sysinfo_serial = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.192 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] libvirt.tx_queue_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.192 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] libvirt.uid_maps = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.192 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 
2025-10-14 09:41:29.192 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] libvirt.virt_type = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.193 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] libvirt.volume_clear = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.193 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] libvirt.volume_clear_size = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.193 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] libvirt.volume_use_multipath = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.193 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] libvirt.vzstorage_cache_path = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.193 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] libvirt.vzstorage_log_path = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.193 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] libvirt.vzstorage_mount_group = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.193 2 DEBUG 
oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] libvirt.vzstorage_mount_opts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.193 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] libvirt.vzstorage_mount_perms = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.194 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.194 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] libvirt.vzstorage_mount_user = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.194 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.194 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] neutron.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.194 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] neutron.auth_type = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.194 2 DEBUG oslo_service.service [None 
req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] neutron.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.195 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] neutron.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.195 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] neutron.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.195 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] neutron.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.195 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] neutron.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.195 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] neutron.default_floating_pool = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.195 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] neutron.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.195 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] neutron.extension_sync_interval = 600 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.195 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] neutron.http_retries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.196 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] neutron.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.196 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] neutron.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.196 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] neutron.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.196 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.196 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] neutron.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.196 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] neutron.ovs_bridge = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost 
nova_compute[237052]: 2025-10-14 09:41:29.197 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] neutron.physnets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.197 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] neutron.region_name = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.197 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.197 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] neutron.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.197 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] neutron.service_type = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.197 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] neutron.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.197 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] neutron.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.198 2 DEBUG oslo_service.service [None 
req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.198 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] neutron.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.198 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] neutron.valid_interfaces = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.198 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] neutron.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.198 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.199 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] notifications.default_level = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.199 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.199 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] 
notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.199 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.199 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] pci.alias = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.199 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] pci.device_spec = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.200 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] pci.report_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.200 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] placement.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.200 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] placement.auth_type = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.200 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] placement.auth_url = http://keystone-internal.openstack.svc:5000 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.200 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] placement.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.201 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] placement.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.201 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] placement.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.201 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] placement.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.201 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] placement.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.201 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] placement.default_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.201 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] placement.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 
05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.201 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] placement.domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.202 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] placement.domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.202 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] placement.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.202 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] placement.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.202 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] placement.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.202 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] placement.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.202 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] placement.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.202 2 DEBUG oslo_service.service [None 
req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] placement.password = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.203 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] placement.project_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.203 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] placement.project_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.203 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] placement.project_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.203 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] placement.project_name = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.203 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] placement.region_name = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.203 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] placement.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.203 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] placement.service_type = 
placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.204 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] placement.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.204 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] placement.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.204 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.204 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] placement.system_scope = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.204 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] placement.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.204 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] placement.trust_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.204 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] placement.user_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m 
Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.205 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] placement.user_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.205 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] placement.user_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.205 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] placement.username = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.205 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] placement.valid_interfaces = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.205 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] placement.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.205 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] quota.cores = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.205 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.206 2 DEBUG 
oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] quota.driver = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.206 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.206 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.206 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] quota.injected_files = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.206 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] quota.instances = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.206 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] quota.key_pairs = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.206 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] quota.metadata_items = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.207 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] quota.ram = 
51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.207 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] quota.recheck_quota = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.207 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] quota.server_group_members = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.207 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] quota.server_groups = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.207 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] rdp.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.208 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] rdp.html5_proxy_base_url = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.208 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.208 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.208 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.208 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.208 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] scheduler.max_attempts = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.208 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.209 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.209 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.209 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] scheduler.query_placement_for_image_type_support 
= False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.209 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.209 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] scheduler.workers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.209 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.209 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.210 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.210 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.210 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.210 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.210 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.210 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 
05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.210 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.211 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.211 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.211 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.211 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.211 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.211 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 
05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.211 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.212 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.212 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.212 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.212 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.212 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.212 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.212 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.213 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.213 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] metrics.required = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.213 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] metrics.weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.213 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] metrics.weight_of_unavailable = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.213 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] metrics.weight_setting = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.213 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] serial_console.base_url = ws://127.0.0.1:6083/ log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.213 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] serial_console.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.214 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] serial_console.port_range = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.214 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.214 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.214 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.214 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.214 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] service_user.auth_type = password log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.215 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] service_user.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.215 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.215 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.215 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.215 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] service_user.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.215 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.215 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 
localhost nova_compute[237052]: 2025-10-14 09:41:29.216 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] service_user.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.216 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] spice.agent_enabled = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.216 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] spice.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.216 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] spice.html5proxy_base_url = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.216 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] spice.html5proxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.216 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] spice.html5proxy_port = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.217 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] spice.image_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.217 2 DEBUG 
oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] spice.jpeg_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.217 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] spice.playback_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.217 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] spice.server_listen = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.217 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.217 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] spice.streaming_mode = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.217 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] spice.zlib_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.218 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] upgrade_levels.baseapi = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.218 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] 
upgrade_levels.cert = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.218 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] upgrade_levels.compute = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.218 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] upgrade_levels.conductor = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.218 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] upgrade_levels.scheduler = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.218 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.218 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.219 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.219 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.219 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.219 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.219 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.219 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.219 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.220 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] vmware.api_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.220 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] vmware.ca_file = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.220 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] vmware.cache_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.220 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] vmware.cluster_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.220 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] vmware.connection_pool_size = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.220 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] vmware.console_delay_seconds = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.220 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] vmware.datastore_regex = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.221 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] vmware.host_ip = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.221 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] vmware.host_password = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost 
nova_compute[237052]: 2025-10-14 09:41:29.221 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] vmware.host_port = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.221 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] vmware.host_username = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.221 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] vmware.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.221 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] vmware.integration_bridge = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.221 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] vmware.maximum_objects = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.221 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] vmware.pbm_default_policy = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.222 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] vmware.pbm_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.222 2 DEBUG oslo_service.service [None 
req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] vmware.pbm_wsdl_location = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.222 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] vmware.serial_log_dir = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.222 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] vmware.serial_port_proxy_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.222 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.222 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] vmware.task_poll_interval = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.222 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] vmware.use_linked_clone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.223 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] vmware.vnc_keymap = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.223 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] vmware.vnc_port = 5900 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.223 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] vmware.vnc_port_total = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.223 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] vnc.auth_schemes = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.223 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] vnc.enabled = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.223 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] vnc.novncproxy_base_url = http://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.224 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] vnc.novncproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.224 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] vnc.novncproxy_port = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.224 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] vnc.server_listen = ::0 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.224 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] vnc.server_proxyclient_address = 192.168.122.108 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.224 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] vnc.vencrypt_ca_certs = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.224 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] vnc.vencrypt_client_cert = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.224 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] vnc.vencrypt_client_key = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.225 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] workarounds.disable_compute_service_check_for_ffu = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.225 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.225 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.225 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.225 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.225 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] workarounds.disable_rootwrap = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.226 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.226 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.226 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.226 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] 
workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.226 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.227 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.227 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.227 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.227 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.227 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.227 2 DEBUG oslo_service.service [None 
req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.228 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.228 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.228 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.228 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.228 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] wsgi.api_paste_config = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.228 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] wsgi.client_socket_timeout = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.229 2 
DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] wsgi.default_pool_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.229 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] wsgi.keep_alive = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.229 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] wsgi.max_header_line = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.229 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] wsgi.secure_proxy_ssl_header = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.229 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] wsgi.ssl_ca_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.229 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] wsgi.ssl_cert_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.229 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] wsgi.ssl_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.229 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] wsgi.tcp_keepidle = 600 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.230 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] wsgi.wsgi_log_format = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.230 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] zvm.ca_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.230 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] zvm.cloud_connector_url = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.230 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] zvm.image_tmp_path = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.230 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] zvm.reachable_timeout = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.230 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.230 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] oslo_policy.enforce_scope 
= True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.231 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.231 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] oslo_policy.policy_dirs = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.231 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] oslo_policy.policy_file = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.231 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.231 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.231 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.231 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] 
oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.232 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.232 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.232 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.232 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] remote_debug.host = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.232 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] remote_debug.port = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.233 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.233 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] 
oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.233 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.233 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.233 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.233 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.234 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.234 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.234 2 DEBUG oslo_service.service [None 
req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.234 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.234 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.234 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.234 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.235 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.235 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 
2025-10-14 09:41:29.235 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.235 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.235 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.235 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.236 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.236 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.236 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.236 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.236 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.236 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.236 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] oslo_messaging_rabbit.ssl = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.236 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] oslo_messaging_rabbit.ssl_ca_file = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.237 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] oslo_messaging_rabbit.ssl_cert_file = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.237 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.237 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] oslo_messaging_rabbit.ssl_key_file = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.237 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] oslo_messaging_rabbit.ssl_version = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.237 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.237 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.237 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.238 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.238 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] oslo_limit.auth_section = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.238 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] oslo_limit.auth_type = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.238 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] oslo_limit.auth_url = http://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.238 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] oslo_limit.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.238 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] oslo_limit.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.238 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] oslo_limit.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.239 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] oslo_limit.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.239 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.239 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] oslo_limit.default_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.239 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.239 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] oslo_limit.domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.239 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] oslo_limit.domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.239 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] oslo_limit.endpoint_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.240 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] oslo_limit.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.240 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] oslo_limit.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 
localhost nova_compute[237052]: 2025-10-14 09:41:29.240 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] oslo_limit.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.240 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] oslo_limit.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.240 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] oslo_limit.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.240 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] oslo_limit.password = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.241 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] oslo_limit.project_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.241 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.241 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] oslo_limit.project_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.241 2 DEBUG oslo_service.service [None 
req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] oslo_limit.project_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.241 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] oslo_limit.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.241 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] oslo_limit.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.241 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] oslo_limit.service_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.241 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] oslo_limit.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.242 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.242 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.242 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] 
oslo_limit.system_scope = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.242 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] oslo_limit.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.242 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] oslo_limit.trust_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.242 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] oslo_limit.user_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.243 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] oslo_limit.user_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.243 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] oslo_limit.user_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.243 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] oslo_limit.username = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.243 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] oslo_limit.valid_interfaces = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.243 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] oslo_limit.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.243 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.243 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.243 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] oslo_reports.log_dir = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.244 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.244 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.244 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.244 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.244 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.244 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.244 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.245 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] vif_plug_ovs_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.245 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.245 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] 
vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.245 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.245 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] vif_plug_ovs_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.245 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.245 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.246 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] os_vif_linux_bridge.iptables_bottom_regex = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.246 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.246 2 DEBUG oslo_service.service [None 
req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] os_vif_linux_bridge.iptables_top_regex = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.246 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.246 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] os_vif_linux_bridge.use_ipv6 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.246 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.247 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] os_vif_ovs.isolate_vif = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.247 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] os_vif_ovs.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.247 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] os_vif_ovs.ovs_vsctl_timeout = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.247 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - 
- - - - -] os_vif_ovs.ovsdb_connection = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.247 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] os_vif_ovs.ovsdb_interface = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.247 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] os_vif_ovs.per_port_bridge = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.247 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] os_brick.lock_path = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.248 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.248 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.248 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] privsep_osbrick.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.248 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] privsep_osbrick.group = None 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.248 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.248 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] privsep_osbrick.logger_name = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.248 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.249 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] privsep_osbrick.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.249 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] nova_sys_admin.capabilities = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.249 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] nova_sys_admin.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.249 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] nova_sys_admin.helper_command = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.249 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] nova_sys_admin.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.249 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.249 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] nova_sys_admin.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.249 2 DEBUG oslo_service.service [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.250 2 INFO nova.service [-] Starting compute node (version 27.5.2-0.20250829104910.6f8decf.el9)#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.262 2 INFO nova.virt.node [None req-1f5a91af-59e4-4791-a59a-d3dbd2fd180c - - - - - -] Determined node identity 18c24273-aca2-4f08-be57-3188d558235e from /var/lib/nova/compute_id#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.263 2 DEBUG nova.virt.libvirt.host [None req-1f5a91af-59e4-4791-a59a-d3dbd2fd180c - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.263 2 
DEBUG nova.virt.libvirt.host [None req-1f5a91af-59e4-4791-a59a-d3dbd2fd180c - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.263 2 DEBUG nova.virt.libvirt.host [None req-1f5a91af-59e4-4791-a59a-d3dbd2fd180c - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.263 2 DEBUG nova.virt.libvirt.host [None req-1f5a91af-59e4-4791-a59a-d3dbd2fd180c - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.273 2 DEBUG nova.virt.libvirt.host [None req-1f5a91af-59e4-4791-a59a-d3dbd2fd180c - - - - - -] Registering for lifecycle events _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.276 2 DEBUG nova.virt.libvirt.host [None req-1f5a91af-59e4-4791-a59a-d3dbd2fd180c - - - - - -] Registering for connection events: _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.277 2 INFO nova.virt.libvirt.driver [None req-1f5a91af-59e4-4791-a59a-d3dbd2fd180c - - - - - -] Connection event '1' reason 'None'#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.317 2 DEBUG nova.virt.libvirt.volume.mount [None req-1f5a91af-59e4-4791-a59a-d3dbd2fd180c - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.317 2 INFO nova.virt.libvirt.host [None 
req-1f5a91af-59e4-4791-a59a-d3dbd2fd180c - - - - - -] Libvirt host capabilities
[libvirt capabilities XML elided; tag markup was lost in capture. Recoverable host data: UUID 1e17686e-e9d9-4f56-ae5b-e175ec048439; arch x86_64; CPU model EPYC-Rome-v4, vendor AMD; migration transports tcp and rdma; memory 16116612 KiB (4029153 pages); security models selinux (doi 0, base labels system_u:system_r:svirt_t:s0 and system_u:system_r:svirt_tcg_t:s0) and dac (doi 0, +107:+107); hvm guests at wordsize 32 and 64 via /usr/libexec/qemu-kvm with machine types pc-i440fx-rhel7.6.0 (alias pc), pc-q35-rhel9.6.0 (alias q35), pc-q35-rhel7.6.0, pc-q35-rhel8.0.0, pc-q35-rhel8.1.0, pc-q35-rhel8.2.0, pc-q35-rhel8.3.0, pc-q35-rhel8.4.0, pc-q35-rhel8.5.0, pc-q35-rhel8.6.0, pc-q35-rhel9.0.0, pc-q35-rhel9.2.0, pc-q35-rhel9.4.0.]#033[00m
Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.324 2 DEBUG nova.virt.libvirt.host [None req-1f5a91af-59e4-4791-a59a-d3dbd2fd180c - - - - - -] Getting domain capabilities for i686 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952#033[00m
Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.343 2 DEBUG nova.virt.libvirt.host [None req-1f5a91af-59e4-4791-a59a-d3dbd2fd180c - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
[libvirt domainCapabilities XML elided; tag markup was lost in capture. Recoverable data: emulator /usr/libexec/qemu-kvm, domain type kvm, machine pc-q35-rhel9.6.0, arch i686; firmware loader /usr/share/OVMF/OVMF_CODE.secboot.fd (types rom and pflash); host-model CPU EPYC-Rome, vendor AMD; custom CPU models include 486, 486-v1, Broadwell, Broadwell-IBRS, Broadwell-noTSX, Broadwell-noTSX-IBRS, Broadwell-v1 through Broadwell-v4, Cascadelake-Server, Cascadelake-Server-noTSX, Cascadelake-Server-v1 through Cascadelake-Server-v5, Conroe, Conroe-v1, Cooperlake; list continues past this capture.]
05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Cooperlake-v1 Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Cooperlake-v2 Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 
05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Denverton Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Denverton-v1 Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Denverton-v2 Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Denverton-v3 Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Dhyana Oct 14 05:41:29 localhost nova_compute[237052]: Dhyana-v1 Oct 14 05:41:29 localhost nova_compute[237052]: Dhyana-v2 Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: EPYC Oct 14 05:41:29 localhost nova_compute[237052]: EPYC-Genoa Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost 
nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: EPYC-Genoa-v1 Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost 
nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: EPYC-IBPB Oct 14 05:41:29 localhost nova_compute[237052]: EPYC-Milan Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: EPYC-Milan-v1 Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: EPYC-Milan-v2 Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 
05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: EPYC-Rome Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: EPYC-Rome-v1 Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: EPYC-Rome-v2 Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: EPYC-Rome-v3 Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: EPYC-Rome-v4 Oct 14 05:41:29 localhost nova_compute[237052]: EPYC-v1 Oct 14 05:41:29 localhost nova_compute[237052]: EPYC-v2 Oct 14 05:41:29 localhost nova_compute[237052]: EPYC-v3 Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: EPYC-v4 Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: GraniteRapids Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost 
nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: GraniteRapids-v1 Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost 
nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 
05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: GraniteRapids-v2 Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 
localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Haswell Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Haswell-IBRS Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Haswell-noTSX Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Haswell-noTSX-IBRS Oct 14 05:41:29 localhost 
nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Haswell-v1 Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Haswell-v2 Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Haswell-v3 Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Haswell-v4 Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Icelake-Server Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost 
nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Icelake-Server-noTSX Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Icelake-Server-v1 Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 
localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Icelake-Server-v2 Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Icelake-Server-v3 Oct 14 
05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Icelake-Server-v4 Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 
Oct 14 05:41:29 localhost nova_compute[237052]: [libvirt domain capabilities XML was split one element per syslog record here and the markup was lost in capture; the recoverable element text, grouped per the domainCapabilities schema, follows]
Oct 14 05:41:29 localhost nova_compute[237052]:   cpu models: Icelake-Server-v5, Icelake-Server-v6, Icelake-Server-v7, IvyBridge, IvyBridge-IBRS, IvyBridge-v1, IvyBridge-v2, KnightsMill, KnightsMill-v1, Nehalem, Nehalem-IBRS, Nehalem-v1, Nehalem-v2, Opteron_G1, Opteron_G1-v1, Opteron_G2, Opteron_G2-v1, Opteron_G3, Opteron_G3-v1, Opteron_G4, Opteron_G4-v1, Opteron_G5, Opteron_G5-v1, Penryn, Penryn-v1, SandyBridge, SandyBridge-IBRS, SandyBridge-v1, SandyBridge-v2, SapphireRapids, SapphireRapids-v1, SapphireRapids-v2, SapphireRapids-v3, SierraForest, SierraForest-v1, Skylake-Client, Skylake-Client-IBRS, Skylake-Client-noTSX-IBRS, Skylake-Client-v1, Skylake-Client-v2, Skylake-Client-v3, Skylake-Client-v4, Skylake-Server, Skylake-Server-IBRS, Skylake-Server-noTSX-IBRS, Skylake-Server-v1, Skylake-Server-v2, Skylake-Server-v3, Skylake-Server-v4, Skylake-Server-v5, Snowridge, Snowridge-v1, Snowridge-v2, Snowridge-v3, Snowridge-v4, Westmere, Westmere-IBRS, Westmere-v1, Westmere-v2, athlon, athlon-v1, core2duo, core2duo-v1, coreduo, coreduo-v1, kvm32, kvm32-v1, kvm64, kvm64-v1, n270, n270-v1, pentium, pentium-v1, pentium2, pentium2-v1, pentium3, pentium3-v1, phenom, phenom-v1, qemu32, qemu32-v1, qemu64, qemu64-v1
Oct 14 05:41:29 localhost nova_compute[237052]:   memoryBacking sourceType: file, anonymous, memfd
Oct 14 05:41:29 localhost nova_compute[237052]:   disk diskDevice: disk, cdrom, floppy, lun; bus: fdc, scsi, virtio, usb, sata; model: virtio, virtio-transitional, virtio-non-transitional
Oct 14 05:41:29 localhost nova_compute[237052]:   graphics type: vnc, egl-headless, dbus
Oct 14 05:41:29 localhost nova_compute[237052]:   hostdev mode: subsystem; startupPolicy: default, mandatory, requisite, optional; subsysType: usb, pci, scsi
Oct 14 05:41:29 localhost nova_compute[237052]:   rng model: virtio, virtio-transitional, virtio-non-transitional; backendModel: random, egd, builtin
Oct 14 05:41:29 localhost nova_compute[237052]:   filesystem driverType: path, handle, virtiofs
Oct 14 05:41:29 localhost nova_compute[237052]:   tpm model: tpm-tis, tpm-crb; backendModel: emulator, external; backendVersion: 2.0
Oct 14 05:41:29 localhost nova_compute[237052]:   redirdev bus: usb
Oct 14 05:41:29 localhost nova_compute[237052]:   channel type: pty, unix
Oct 14 05:41:29 localhost nova_compute[237052]:   crypto type: qemu; backendModel: builtin
Oct 14 05:41:29 localhost nova_compute[237052]:   interface backendType: default, passt
Oct 14 05:41:29 localhost nova_compute[237052]:   panic model: isa, hyperv
Oct 14 05:41:29 localhost nova_compute[237052]:   features hyperv: relaxed, vapic, spinlocks, vpindex, runtime, synic, stimer, reset, vendor_id, frequencies, reenlightenment, tlbflush, ipi, avic, emsr_bitmap, xmm_input
Oct 14 05:41:29 localhost nova_compute[237052]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.350 2 DEBUG nova.virt.libvirt.host [None req-1f5a91af-59e4-4791-a59a-d3dbd2fd180c - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Oct 14 05:41:29 localhost nova_compute[237052]:   path: /usr/libexec/qemu-kvm; domain: kvm; machine: pc-i440fx-rhel7.6.0; arch: i686; loader value: /usr/share/OVMF/OVMF_CODE.secboot.fd; loader type: rom, pflash
14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: yes Oct 14 05:41:29 localhost nova_compute[237052]: no Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: no Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: on Oct 14 05:41:29 localhost nova_compute[237052]: off Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: on Oct 14 05:41:29 localhost nova_compute[237052]: off Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: EPYC-Rome Oct 14 05:41:29 localhost nova_compute[237052]: AMD Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost 
nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: 486 Oct 14 05:41:29 localhost nova_compute[237052]: 486-v1 Oct 14 05:41:29 localhost nova_compute[237052]: Broadwell Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Broadwell-IBRS Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Broadwell-noTSX Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Broadwell-noTSX-IBRS Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost 
nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Broadwell-v1 Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Broadwell-v2 Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Broadwell-v3 Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Broadwell-v4 Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Cascadelake-Server Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost 
nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Cascadelake-Server-noTSX Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Cascadelake-Server-v1 Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Cascadelake-Server-v2 Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost 
nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Cascadelake-Server-v3 Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Cascadelake-Server-v4 Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: 
Cascadelake-Server-v5 Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Conroe Oct 14 05:41:29 localhost nova_compute[237052]: Conroe-v1 Oct 14 05:41:29 localhost nova_compute[237052]: Cooperlake Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Cooperlake-v1 Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 
localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Cooperlake-v2 Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Denverton Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Denverton-v1 Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 
05:41:29 localhost nova_compute[237052]: Denverton-v2 Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Denverton-v3 Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Dhyana Oct 14 05:41:29 localhost nova_compute[237052]: Dhyana-v1 Oct 14 05:41:29 localhost nova_compute[237052]: Dhyana-v2 Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: EPYC Oct 14 05:41:29 localhost nova_compute[237052]: EPYC-Genoa Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 
localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: EPYC-Genoa-v1 Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: EPYC-IBPB Oct 14 05:41:29 localhost nova_compute[237052]: EPYC-Milan Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 
05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: EPYC-Milan-v1 Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: EPYC-Milan-v2 Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: EPYC-Rome Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: EPYC-Rome-v1 Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: EPYC-Rome-v2 Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 
localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: EPYC-Rome-v3 Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: EPYC-Rome-v4 Oct 14 05:41:29 localhost nova_compute[237052]: EPYC-v1 Oct 14 05:41:29 localhost nova_compute[237052]: EPYC-v2 Oct 14 05:41:29 localhost nova_compute[237052]: EPYC-v3 Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: EPYC-v4 Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: GraniteRapids Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 
localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: GraniteRapids-v1 Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost 
nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: GraniteRapids-v2 Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost 
Oct 14 05:41:29 localhost nova_compute[237052]: Haswell
Oct 14 05:41:29 localhost nova_compute[237052]: Haswell-IBRS
Oct 14 05:41:29 localhost nova_compute[237052]: Haswell-noTSX
Oct 14 05:41:29 localhost nova_compute[237052]: Haswell-noTSX-IBRS
Oct 14 05:41:29 localhost nova_compute[237052]: Haswell-v1
Oct 14 05:41:29 localhost nova_compute[237052]: Haswell-v2
Oct 14 05:41:29 localhost nova_compute[237052]: Haswell-v3
Oct 14 05:41:29 localhost nova_compute[237052]: Haswell-v4
Oct 14 05:41:29 localhost nova_compute[237052]: Icelake-Server
Oct 14 05:41:29 localhost nova_compute[237052]: Icelake-Server-noTSX
Oct 14 05:41:29 localhost nova_compute[237052]: Icelake-Server-v1
Oct 14 05:41:29 localhost nova_compute[237052]: Icelake-Server-v2
Oct 14 05:41:29 localhost nova_compute[237052]: Icelake-Server-v3
Oct 14 05:41:29 localhost nova_compute[237052]: Icelake-Server-v4
Oct 14 05:41:29 localhost nova_compute[237052]: Icelake-Server-v5
Oct 14 05:41:29 localhost nova_compute[237052]: Icelake-Server-v6
Oct 14 05:41:29 localhost nova_compute[237052]: Icelake-Server-v7
Oct 14 05:41:29 localhost nova_compute[237052]: IvyBridge
Oct 14 05:41:29 localhost nova_compute[237052]: IvyBridge-IBRS
Oct 14 05:41:29 localhost nova_compute[237052]: IvyBridge-v1
Oct 14 05:41:29 localhost nova_compute[237052]: IvyBridge-v2
Oct 14 05:41:29 localhost nova_compute[237052]: KnightsMill
Oct 14 05:41:29 localhost nova_compute[237052]: KnightsMill-v1
Oct 14 05:41:29 localhost nova_compute[237052]: Nehalem
Oct 14 05:41:29 localhost nova_compute[237052]: Nehalem-IBRS
Oct 14 05:41:29 localhost nova_compute[237052]: Nehalem-v1
Oct 14 05:41:29 localhost nova_compute[237052]: Nehalem-v2
Oct 14 05:41:29 localhost nova_compute[237052]: Opteron_G1
Oct 14 05:41:29 localhost nova_compute[237052]: Opteron_G1-v1
Oct 14 05:41:29 localhost nova_compute[237052]: Opteron_G2
Oct 14 05:41:29 localhost nova_compute[237052]: Opteron_G2-v1
Oct 14 05:41:29 localhost nova_compute[237052]: Opteron_G3
Oct 14 05:41:29 localhost nova_compute[237052]: Opteron_G3-v1
Oct 14 05:41:29 localhost nova_compute[237052]: Opteron_G4
Oct 14 05:41:29 localhost nova_compute[237052]: Opteron_G4-v1
Oct 14 05:41:29 localhost nova_compute[237052]: Opteron_G5
Oct 14 05:41:29 localhost nova_compute[237052]: Opteron_G5-v1
Oct 14 05:41:29 localhost nova_compute[237052]: Penryn
Oct 14 05:41:29 localhost nova_compute[237052]: Penryn-v1
Oct 14 05:41:29 localhost nova_compute[237052]: SandyBridge
Oct 14 05:41:29 localhost nova_compute[237052]: SandyBridge-IBRS
Oct 14 05:41:29 localhost nova_compute[237052]: SandyBridge-v1
Oct 14 05:41:29 localhost nova_compute[237052]: SandyBridge-v2
Oct 14 05:41:29 localhost nova_compute[237052]: SapphireRapids
Oct 14 05:41:29 localhost nova_compute[237052]: SapphireRapids-v1
Oct 14 05:41:29 localhost nova_compute[237052]: SapphireRapids-v2
Oct 14 05:41:29 localhost nova_compute[237052]: SapphireRapids-v3
Oct 14 05:41:29 localhost nova_compute[237052]: SierraForest
Oct 14 05:41:29 localhost nova_compute[237052]: SierraForest-v1
Oct 14 05:41:29 localhost nova_compute[237052]: Skylake-Client
Oct 14 05:41:29 localhost nova_compute[237052]: Skylake-Client-IBRS
Oct 14 05:41:29 localhost nova_compute[237052]: Skylake-Client-noTSX-IBRS
Oct 14 05:41:29 localhost nova_compute[237052]: Skylake-Client-v1
Oct 14 05:41:29 localhost nova_compute[237052]: Skylake-Client-v2
Oct 14 05:41:29 localhost nova_compute[237052]: Skylake-Client-v3
Oct 14 05:41:29 localhost nova_compute[237052]: Skylake-Client-v4
Oct 14 05:41:29 localhost nova_compute[237052]: Skylake-Server
Oct 14 05:41:29 localhost nova_compute[237052]: Skylake-Server-IBRS
Oct 14 05:41:29 localhost nova_compute[237052]: Skylake-Server-noTSX-IBRS
Oct 14 05:41:29 localhost nova_compute[237052]: Skylake-Server-v1
Oct 14 05:41:29 localhost nova_compute[237052]: Skylake-Server-v2
Oct 14 05:41:29 localhost nova_compute[237052]: Skylake-Server-v3
Oct 14 05:41:29 localhost nova_compute[237052]: Skylake-Server-v4
Oct 14 05:41:29 localhost nova_compute[237052]: Skylake-Server-v5
Oct 14 05:41:29 localhost nova_compute[237052]: Snowridge
Oct 14 05:41:29 localhost nova_compute[237052]: Snowridge-v1
Oct 14 05:41:29 localhost nova_compute[237052]: Snowridge-v2
Oct 14 05:41:29 localhost nova_compute[237052]: Snowridge-v3 Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Snowridge-v4 Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Westmere Oct 14 05:41:29 localhost nova_compute[237052]: Westmere-IBRS Oct 14 05:41:29 localhost nova_compute[237052]: Westmere-v1 Oct 14 05:41:29 localhost nova_compute[237052]: Westmere-v2 Oct 14 05:41:29 localhost nova_compute[237052]: athlon Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: athlon-v1 Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: core2duo Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: core2duo-v1 Oct 14 05:41:29 
localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: coreduo Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: coreduo-v1 Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: kvm32 Oct 14 05:41:29 localhost nova_compute[237052]: kvm32-v1 Oct 14 05:41:29 localhost nova_compute[237052]: kvm64 Oct 14 05:41:29 localhost nova_compute[237052]: kvm64-v1 Oct 14 05:41:29 localhost nova_compute[237052]: n270 Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: n270-v1 Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: pentium Oct 14 05:41:29 localhost nova_compute[237052]: pentium-v1 Oct 14 05:41:29 localhost nova_compute[237052]: pentium2 Oct 14 05:41:29 localhost nova_compute[237052]: pentium2-v1 Oct 14 05:41:29 localhost nova_compute[237052]: pentium3 Oct 14 05:41:29 localhost nova_compute[237052]: pentium3-v1 Oct 14 05:41:29 localhost nova_compute[237052]: phenom Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: phenom-v1 Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 
14 05:41:29 localhost nova_compute[237052]: qemu32 Oct 14 05:41:29 localhost nova_compute[237052]: qemu32-v1 Oct 14 05:41:29 localhost nova_compute[237052]: qemu64 Oct 14 05:41:29 localhost nova_compute[237052]: qemu64-v1 Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: file Oct 14 05:41:29 localhost nova_compute[237052]: anonymous Oct 14 05:41:29 localhost nova_compute[237052]: memfd Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: disk Oct 14 05:41:29 localhost nova_compute[237052]: cdrom Oct 14 05:41:29 localhost nova_compute[237052]: floppy Oct 14 05:41:29 localhost nova_compute[237052]: lun Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: ide Oct 14 05:41:29 localhost nova_compute[237052]: fdc Oct 14 05:41:29 localhost nova_compute[237052]: scsi Oct 14 05:41:29 localhost nova_compute[237052]: virtio Oct 14 05:41:29 localhost nova_compute[237052]: usb Oct 14 05:41:29 localhost nova_compute[237052]: sata Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: virtio Oct 14 05:41:29 localhost nova_compute[237052]: virtio-transitional Oct 14 05:41:29 localhost nova_compute[237052]: virtio-non-transitional Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: vnc Oct 14 05:41:29 
localhost nova_compute[237052]: egl-headless Oct 14 05:41:29 localhost nova_compute[237052]: dbus Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: subsystem Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: default Oct 14 05:41:29 localhost nova_compute[237052]: mandatory Oct 14 05:41:29 localhost nova_compute[237052]: requisite Oct 14 05:41:29 localhost nova_compute[237052]: optional Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: usb Oct 14 05:41:29 localhost nova_compute[237052]: pci Oct 14 05:41:29 localhost nova_compute[237052]: scsi Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: virtio Oct 14 05:41:29 localhost nova_compute[237052]: virtio-transitional Oct 14 05:41:29 localhost nova_compute[237052]: virtio-non-transitional Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: random Oct 14 05:41:29 localhost nova_compute[237052]: egd Oct 14 05:41:29 localhost nova_compute[237052]: builtin Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: path Oct 14 05:41:29 localhost 
nova_compute[237052]: handle Oct 14 05:41:29 localhost nova_compute[237052]: virtiofs Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: tpm-tis Oct 14 05:41:29 localhost nova_compute[237052]: tpm-crb Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: emulator Oct 14 05:41:29 localhost nova_compute[237052]: external Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: 2.0 Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: usb Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: pty Oct 14 05:41:29 localhost nova_compute[237052]: unix Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: qemu Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: builtin Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: default Oct 14 05:41:29 
localhost nova_compute[237052]: passt Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: isa Oct 14 05:41:29 localhost nova_compute[237052]: hyperv Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: relaxed Oct 14 05:41:29 localhost nova_compute[237052]: vapic Oct 14 05:41:29 localhost nova_compute[237052]: spinlocks Oct 14 05:41:29 localhost nova_compute[237052]: vpindex Oct 14 05:41:29 localhost nova_compute[237052]: runtime Oct 14 05:41:29 localhost nova_compute[237052]: synic Oct 14 05:41:29 localhost nova_compute[237052]: stimer Oct 14 05:41:29 localhost nova_compute[237052]: reset Oct 14 05:41:29 localhost nova_compute[237052]: vendor_id Oct 14 05:41:29 localhost nova_compute[237052]: frequencies Oct 14 05:41:29 localhost nova_compute[237052]: reenlightenment Oct 14 05:41:29 localhost nova_compute[237052]: tlbflush Oct 14 05:41:29 localhost nova_compute[237052]: ipi Oct 14 05:41:29 localhost nova_compute[237052]: avic Oct 14 05:41:29 localhost nova_compute[237052]: emsr_bitmap Oct 14 05:41:29 localhost nova_compute[237052]: xmm_input Oct 14 05:41:29 localhost 
nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.373 2 DEBUG nova.virt.libvirt.host [None req-1f5a91af-59e4-4791-a59a-d3dbd2fd180c - - - - - -] Getting domain capabilities for x86_64 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.379 2 DEBUG nova.virt.libvirt.host [None req-1f5a91af-59e4-4791-a59a-d3dbd2fd180c - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: /usr/libexec/qemu-kvm Oct 14 05:41:29 localhost nova_compute[237052]: kvm Oct 14 05:41:29 localhost nova_compute[237052]: pc-q35-rhel9.6.0 Oct 14 05:41:29 localhost nova_compute[237052]: x86_64 Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: efi Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: /usr/share/edk2/ovmf/OVMF_CODE.secboot.fd Oct 14 05:41:29 localhost nova_compute[237052]: /usr/share/edk2/ovmf/OVMF_CODE.fd Oct 14 05:41:29 localhost nova_compute[237052]: /usr/share/edk2/ovmf/OVMF.amdsev.fd Oct 14 05:41:29 localhost nova_compute[237052]: /usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: 
rom Oct 14 05:41:29 localhost nova_compute[237052]: pflash Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: yes Oct 14 05:41:29 localhost nova_compute[237052]: no Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: yes Oct 14 05:41:29 localhost nova_compute[237052]: no Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: on Oct 14 05:41:29 localhost nova_compute[237052]: off Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: on Oct 14 05:41:29 localhost nova_compute[237052]: off Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: EPYC-Rome Oct 14 05:41:29 localhost nova_compute[237052]: AMD Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost 
nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: 486 Oct 14 05:41:29 localhost nova_compute[237052]: 486-v1 Oct 14 05:41:29 localhost nova_compute[237052]: Broadwell Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Broadwell-IBRS Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Broadwell-noTSX Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 
14 05:41:29 localhost nova_compute[237052]: Broadwell-noTSX-IBRS Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Broadwell-v1 Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Broadwell-v2 Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Broadwell-v3 Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Broadwell-v4 Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Cascadelake-Server Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost 
nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Cascadelake-Server-noTSX Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Cascadelake-Server-v1 Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Cascadelake-Server-v2 Oct 14 05:41:29 localhost 
nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Cascadelake-Server-v3 Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Cascadelake-Server-v4 Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 
05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Cascadelake-Server-v5 Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Conroe Oct 14 05:41:29 localhost nova_compute[237052]: Conroe-v1 Oct 14 05:41:29 localhost nova_compute[237052]: Cooperlake Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Cooperlake-v1 Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 
localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Cooperlake-v2 Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Denverton Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Denverton-v1 Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 
Oct 14 05:41:29 localhost nova_compute[237052]: [supported CPU models reported by libvirt:] Denverton-v2 Denverton-v3 Dhyana Dhyana-v1 Dhyana-v2 EPYC EPYC-Genoa EPYC-Genoa-v1 EPYC-IBPB EPYC-Milan EPYC-Milan-v1 EPYC-Milan-v2 EPYC-Rome EPYC-Rome-v1 EPYC-Rome-v2 EPYC-Rome-v3 EPYC-Rome-v4 EPYC-v1 EPYC-v2 EPYC-v3 EPYC-v4 GraniteRapids GraniteRapids-v1 GraniteRapids-v2 Haswell Haswell-IBRS Haswell-noTSX Haswell-noTSX-IBRS Haswell-v1 Haswell-v2 Haswell-v3 Haswell-v4 Icelake-Server Icelake-Server-noTSX Icelake-Server-v1 Icelake-Server-v2 Icelake-Server-v3 Icelake-Server-v4 Icelake-Server-v5 Icelake-Server-v6 Icelake-Server-v7 IvyBridge IvyBridge-IBRS IvyBridge-v1 IvyBridge-v2 KnightsMill KnightsMill-v1 Nehalem Nehalem-IBRS Nehalem-v1 Nehalem-v2 Opteron_G1 Opteron_G1-v1 Opteron_G2 Opteron_G2-v1 Opteron_G3 Opteron_G3-v1 Opteron_G4 Opteron_G4-v1 Opteron_G5 Opteron_G5-v1 Penryn Penryn-v1 SandyBridge SandyBridge-IBRS SandyBridge-v1 SandyBridge-v2 SapphireRapids SapphireRapids-v1 SapphireRapids-v2 SapphireRapids-v3 SierraForest SierraForest-v1
05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Skylake-Client Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Skylake-Client-IBRS Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Skylake-Client-noTSX-IBRS Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Skylake-Client-v1 Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 
05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Skylake-Client-v2 Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Skylake-Client-v3 Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Skylake-Client-v4 Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Skylake-Server Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 
localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Skylake-Server-IBRS Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Skylake-Server-noTSX-IBRS Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Skylake-Server-v1 Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost 
nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Skylake-Server-v2 Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Skylake-Server-v3 Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Skylake-Server-v4 Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Skylake-Server-v5 Oct 14 05:41:29 localhost 
nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Snowridge Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Snowridge-v1 Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Snowridge-v2 Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 
localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Snowridge-v3 Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Snowridge-v4 Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Westmere Oct 14 05:41:29 localhost nova_compute[237052]: Westmere-IBRS Oct 14 05:41:29 localhost nova_compute[237052]: Westmere-v1 Oct 14 05:41:29 localhost nova_compute[237052]: Westmere-v2 Oct 14 05:41:29 localhost nova_compute[237052]: athlon Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: athlon-v1 Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: core2duo Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost 
nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: core2duo-v1 Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: coreduo Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: coreduo-v1 Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: kvm32 Oct 14 05:41:29 localhost nova_compute[237052]: kvm32-v1 Oct 14 05:41:29 localhost nova_compute[237052]: kvm64 Oct 14 05:41:29 localhost nova_compute[237052]: kvm64-v1 Oct 14 05:41:29 localhost nova_compute[237052]: n270 Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: n270-v1 Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: pentium Oct 14 05:41:29 localhost nova_compute[237052]: pentium-v1 Oct 14 05:41:29 localhost nova_compute[237052]: pentium2 Oct 14 05:41:29 localhost nova_compute[237052]: pentium2-v1 Oct 14 05:41:29 localhost nova_compute[237052]: pentium3 Oct 14 05:41:29 localhost nova_compute[237052]: pentium3-v1 Oct 14 05:41:29 localhost nova_compute[237052]: phenom Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: phenom-v1 Oct 14 05:41:29 localhost nova_compute[237052]: 
Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: qemu32 Oct 14 05:41:29 localhost nova_compute[237052]: qemu32-v1 Oct 14 05:41:29 localhost nova_compute[237052]: qemu64 Oct 14 05:41:29 localhost nova_compute[237052]: qemu64-v1 Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: file Oct 14 05:41:29 localhost nova_compute[237052]: anonymous Oct 14 05:41:29 localhost nova_compute[237052]: memfd Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: disk Oct 14 05:41:29 localhost nova_compute[237052]: cdrom Oct 14 05:41:29 localhost nova_compute[237052]: floppy Oct 14 05:41:29 localhost nova_compute[237052]: lun Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: fdc Oct 14 05:41:29 localhost nova_compute[237052]: scsi Oct 14 05:41:29 localhost nova_compute[237052]: virtio Oct 14 05:41:29 localhost nova_compute[237052]: usb Oct 14 05:41:29 localhost nova_compute[237052]: sata Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: virtio Oct 14 05:41:29 localhost nova_compute[237052]: virtio-transitional Oct 14 05:41:29 localhost nova_compute[237052]: virtio-non-transitional Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 
localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: vnc Oct 14 05:41:29 localhost nova_compute[237052]: egl-headless Oct 14 05:41:29 localhost nova_compute[237052]: dbus Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: subsystem Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: default Oct 14 05:41:29 localhost nova_compute[237052]: mandatory Oct 14 05:41:29 localhost nova_compute[237052]: requisite Oct 14 05:41:29 localhost nova_compute[237052]: optional Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: usb Oct 14 05:41:29 localhost nova_compute[237052]: pci Oct 14 05:41:29 localhost nova_compute[237052]: scsi Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: virtio Oct 14 05:41:29 localhost nova_compute[237052]: virtio-transitional Oct 14 05:41:29 localhost nova_compute[237052]: virtio-non-transitional Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: random Oct 14 05:41:29 localhost nova_compute[237052]: egd Oct 14 05:41:29 localhost nova_compute[237052]: builtin Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost 
nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: path Oct 14 05:41:29 localhost nova_compute[237052]: handle Oct 14 05:41:29 localhost nova_compute[237052]: virtiofs Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: tpm-tis Oct 14 05:41:29 localhost nova_compute[237052]: tpm-crb Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: emulator Oct 14 05:41:29 localhost nova_compute[237052]: external Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: 2.0 Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: usb Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: pty Oct 14 05:41:29 localhost nova_compute[237052]: unix Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: qemu Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: builtin Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 
localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: default Oct 14 05:41:29 localhost nova_compute[237052]: passt Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: isa Oct 14 05:41:29 localhost nova_compute[237052]: hyperv Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: relaxed Oct 14 05:41:29 localhost nova_compute[237052]: vapic Oct 14 05:41:29 localhost nova_compute[237052]: spinlocks Oct 14 05:41:29 localhost nova_compute[237052]: vpindex Oct 14 05:41:29 localhost nova_compute[237052]: runtime Oct 14 05:41:29 localhost nova_compute[237052]: synic Oct 14 05:41:29 localhost nova_compute[237052]: stimer Oct 14 05:41:29 localhost nova_compute[237052]: reset Oct 14 05:41:29 localhost nova_compute[237052]: vendor_id Oct 14 05:41:29 localhost nova_compute[237052]: frequencies Oct 14 05:41:29 localhost nova_compute[237052]: reenlightenment Oct 14 05:41:29 localhost nova_compute[237052]: tlbflush Oct 14 05:41:29 localhost nova_compute[237052]: ipi Oct 14 05:41:29 localhost nova_compute[237052]: avic Oct 14 05:41:29 localhost nova_compute[237052]: 
emsr_bitmap Oct 14 05:41:29 localhost nova_compute[237052]: xmm_input Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.432 2 DEBUG nova.virt.libvirt.host [None req-1f5a91af-59e4-4791-a59a-d3dbd2fd180c - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: /usr/libexec/qemu-kvm Oct 14 05:41:29 localhost nova_compute[237052]: kvm Oct 14 05:41:29 localhost nova_compute[237052]: pc-i440fx-rhel7.6.0 Oct 14 05:41:29 localhost nova_compute[237052]: x86_64 Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: /usr/share/OVMF/OVMF_CODE.secboot.fd Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: rom Oct 14 05:41:29 localhost nova_compute[237052]: pflash Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: yes Oct 14 05:41:29 localhost nova_compute[237052]: no Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: no Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost 
nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: on Oct 14 05:41:29 localhost nova_compute[237052]: off Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: on Oct 14 05:41:29 localhost nova_compute[237052]: off Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: EPYC-Rome Oct 14 05:41:29 localhost nova_compute[237052]: AMD Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost 
Oct 14 05:41:29 localhost nova_compute[237052]: supported CPU models: 486 486-v1 Broadwell Broadwell-IBRS Broadwell-noTSX Broadwell-noTSX-IBRS Broadwell-v1 Broadwell-v2 Broadwell-v3 Broadwell-v4 Cascadelake-Server Cascadelake-Server-noTSX Cascadelake-Server-v1 Cascadelake-Server-v2 Cascadelake-Server-v3 Cascadelake-Server-v4 Cascadelake-Server-v5 Conroe Conroe-v1 Cooperlake Cooperlake-v1 Cooperlake-v2 Denverton Denverton-v1 Denverton-v2 Denverton-v3 Dhyana Dhyana-v1 Dhyana-v2 EPYC EPYC-Genoa EPYC-Genoa-v1 EPYC-IBPB EPYC-Milan EPYC-Milan-v1 EPYC-Milan-v2 EPYC-Rome EPYC-Rome-v1 EPYC-Rome-v2 EPYC-Rome-v3 EPYC-Rome-v4 EPYC-v1 EPYC-v2 EPYC-v3 EPYC-v4 GraniteRapids GraniteRapids-v1 GraniteRapids-v2 Haswell Haswell-IBRS Haswell-noTSX Haswell-noTSX-IBRS Haswell-v1 Haswell-v2 Haswell-v3 Haswell-v4 Icelake-Server Icelake-Server-noTSX Icelake-Server-v1 Icelake-Server-v2 Icelake-Server-v3 Icelake-Server-v4 Icelake-Server-v5 Icelake-Server-v6 Icelake-Server-v7 IvyBridge IvyBridge-IBRS IvyBridge-v1 IvyBridge-v2 KnightsMill KnightsMill-v1 Nehalem Nehalem-IBRS Nehalem-v1 Nehalem-v2 Opteron_G1 Opteron_G1-v1 Opteron_G2 Opteron_G2-v1 Opteron_G3 Opteron_G3-v1 Opteron_G4 Opteron_G4-v1
nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Opteron_G5 Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Opteron_G5-v1 Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Penryn Oct 14 05:41:29 localhost nova_compute[237052]: Penryn-v1 Oct 14 05:41:29 localhost nova_compute[237052]: SandyBridge Oct 14 05:41:29 localhost nova_compute[237052]: SandyBridge-IBRS Oct 14 05:41:29 localhost nova_compute[237052]: SandyBridge-v1 Oct 14 05:41:29 localhost nova_compute[237052]: SandyBridge-v2 Oct 14 05:41:29 localhost nova_compute[237052]: SapphireRapids Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost 
nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: SapphireRapids-v1 Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost 
nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: SapphireRapids-v2 Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost 
nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: SapphireRapids-v3 Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost 
nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: SierraForest Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost 
nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: SierraForest-v1 Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost 
nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Skylake-Client Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Skylake-Client-IBRS Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Skylake-Client-noTSX-IBRS Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Skylake-Client-v1 Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Skylake-Client-v2 Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 
05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Skylake-Client-v3 Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Skylake-Client-v4 Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Skylake-Server Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Skylake-Server-IBRS Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 
localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Skylake-Server-noTSX-IBRS Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Skylake-Server-v1 Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Skylake-Server-v2 Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost 
nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Skylake-Server-v3 Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Skylake-Server-v4 Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Skylake-Server-v5 Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: 
Snowridge Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Snowridge-v1 Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Snowridge-v2 Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Snowridge-v3 Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost 
nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Snowridge-v4 Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Westmere Oct 14 05:41:29 localhost nova_compute[237052]: Westmere-IBRS Oct 14 05:41:29 localhost nova_compute[237052]: Westmere-v1 Oct 14 05:41:29 localhost nova_compute[237052]: Westmere-v2 Oct 14 05:41:29 localhost nova_compute[237052]: athlon Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: athlon-v1 Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: core2duo Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: core2duo-v1 Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: coreduo Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: coreduo-v1 Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost 
nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: kvm32 Oct 14 05:41:29 localhost nova_compute[237052]: kvm32-v1 Oct 14 05:41:29 localhost nova_compute[237052]: kvm64 Oct 14 05:41:29 localhost nova_compute[237052]: kvm64-v1 Oct 14 05:41:29 localhost nova_compute[237052]: n270 Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: n270-v1 Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: pentium Oct 14 05:41:29 localhost nova_compute[237052]: pentium-v1 Oct 14 05:41:29 localhost nova_compute[237052]: pentium2 Oct 14 05:41:29 localhost nova_compute[237052]: pentium2-v1 Oct 14 05:41:29 localhost nova_compute[237052]: pentium3 Oct 14 05:41:29 localhost nova_compute[237052]: pentium3-v1 Oct 14 05:41:29 localhost nova_compute[237052]: phenom Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: phenom-v1 Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: qemu32 Oct 14 05:41:29 localhost nova_compute[237052]: qemu32-v1 Oct 14 05:41:29 localhost nova_compute[237052]: qemu64 Oct 14 05:41:29 localhost nova_compute[237052]: qemu64-v1 Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost nova_compute[237052]: Oct 14 05:41:29 localhost 
nova_compute[237052]: [libvirt domain capabilities output, logged here with its XML markup stripped; the recoverable values are: memory backing sources file, anonymous, memfd; disk device types disk, cdrom, floppy, lun; disk buses ide, fdc, scsi, virtio, usb, sata; virtio model variants virtio, virtio-transitional, virtio-non-transitional; graphics types vnc, egl-headless, dbus; hostdev mode subsystem; startup policies default, mandatory, requisite, optional; hostdev subsystem types usb, pci, scsi; rng models virtio, virtio-transitional, virtio-non-transitional with backends random, egd, builtin; filesystem drivers path, handle, virtiofs; TPM models tpm-tis, tpm-crb with backends emulator, external at version 2.0; redirdev bus usb; char device types pty, unix; crypto backends qemu, builtin; interface backends default, passt; panic models isa, hyperv; Hyper-V enlightenments relaxed, vapic, spinlocks, vpindex, runtime, synic, stimer, reset, vendor_id, frequencies, reenlightenment, tlbflush, ipi, avic, emsr_bitmap, xmm_input] _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14
09:41:29.485 2 DEBUG nova.virt.libvirt.host [None req-1f5a91af-59e4-4791-a59a-d3dbd2fd180c - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.486 2 INFO nova.virt.libvirt.host [None req-1f5a91af-59e4-4791-a59a-d3dbd2fd180c - - - - - -] Secure Boot support detected#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.488 2 INFO nova.virt.libvirt.driver [None req-1f5a91af-59e4-4791-a59a-d3dbd2fd180c - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.499 2 DEBUG nova.virt.libvirt.driver [None req-1f5a91af-59e4-4791-a59a-d3dbd2fd180c - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.541 2 INFO nova.virt.node [None req-1f5a91af-59e4-4791-a59a-d3dbd2fd180c - - - - - -] Determined node identity 18c24273-aca2-4f08-be57-3188d558235e from /var/lib/nova/compute_id#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.558 2 DEBUG nova.compute.manager [None req-1f5a91af-59e4-4791-a59a-d3dbd2fd180c - - - - - -] Verified node 18c24273-aca2-4f08-be57-3188d558235e matches my host np0005486733.localdomain _check_for_host_rename /usr/lib/python3.9/site-packages/nova/compute/manager.py:1568#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14
09:41:29.602 2 DEBUG nova.compute.manager [None req-1f5a91af-59e4-4791-a59a-d3dbd2fd180c - - - - - -] [instance: 88c4e366-b765-47a6-96bf-f7677f2ce67c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.606 2 DEBUG nova.virt.libvirt.vif [None req-1f5a91af-59e4-4791-a59a-d3dbd2fd180c - - - - - -] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-14T08:37:13Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='test',display_name='test',ec2_ids=,ephemeral_gb=1,ephemeral_key_uuid=None,fault=,flavor=,hidden=False,host='np0005486733.localdomain',hostname='test',id=2,image_ref='0c25fd0b-0cde-472d-a2dc-9e548eac7c4d',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=,launch_index=0,launched_at=2025-10-14T08:37:23Z,launched_on='np0005486733.localdomain',locked=False,locked_by=None,memory_mb=512,metadata={},migration_context=,new_flavor=,node='np0005486733.localdomain',numa_topology=None,old_flavor=,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='41187b090f3d4818a32baa37ce8a3991',ramdisk_id='',reservation_id='r-aao7l1tg',resources=,root_device_name='/dev/vda',root_gb=1,security_groups=,services=,shutdown_terminate=False,system_metadata=,tags=,task_state=None,terminated_at=None,trusted_certs=,updated_at=2025-10-14T08:37:23Z,user_data=None,user_id='9d85e6ce130c46ec855f37147dbb08b4',uuid=88c4e366-b765-47a6-96bf-f7677f2ce67c,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "3ec9b060-f43d-4698-9c76-6062c70911d5", "address": "fa:16:3e:84:5e:e5", "network": {"id": "7d0cd696-bdd7-4e70-9512-eb0d23640314", "bridge": "br-int", "label": 
"private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.46", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.0.1"}}], "meta": {"injected": false, "tenant_id": "41187b090f3d4818a32baa37ce8a3991", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system"}, "devname": "tap3ec9b060-f4", "ovs_interfaceid": "3ec9b060-f43d-4698-9c76-6062c70911d5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.606 2 DEBUG nova.network.os_vif_util [None req-1f5a91af-59e4-4791-a59a-d3dbd2fd180c - - - - - -] Converting VIF {"id": "3ec9b060-f43d-4698-9c76-6062c70911d5", "address": "fa:16:3e:84:5e:e5", "network": {"id": "7d0cd696-bdd7-4e70-9512-eb0d23640314", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.46", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.0.1"}}], "meta": {"injected": false, "tenant_id": "41187b090f3d4818a32baa37ce8a3991", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system"}, "devname": "tap3ec9b060-f4", "ovs_interfaceid": 
"3ec9b060-f43d-4698-9c76-6062c70911d5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.607 2 DEBUG nova.network.os_vif_util [None req-1f5a91af-59e4-4791-a59a-d3dbd2fd180c - - - - - -] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:84:5e:e5,bridge_name='br-int',has_traffic_filtering=True,id=3ec9b060-f43d-4698-9c76-6062c70911d5,network=Network(7d0cd696-bdd7-4e70-9512-eb0d23640314),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3ec9b060-f4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.608 2 DEBUG os_vif [None req-1f5a91af-59e4-4791-a59a-d3dbd2fd180c - - - - - -] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:84:5e:e5,bridge_name='br-int',has_traffic_filtering=True,id=3ec9b060-f43d-4698-9c76-6062c70911d5,network=Network(7d0cd696-bdd7-4e70-9512-eb0d23640314),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3ec9b060-f4') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.643 2 DEBUG ovsdbapp.backend.ovs_idl [None req-1f5a91af-59e4-4791-a59a-d3dbd2fd180c - - - - - -] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.643 2 DEBUG ovsdbapp.backend.ovs_idl [None req-1f5a91af-59e4-4791-a59a-d3dbd2fd180c - - - - - -] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m Oct 14 
05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.643 2 DEBUG ovsdbapp.backend.ovs_idl [None req-1f5a91af-59e4-4791-a59a-d3dbd2fd180c - - - - - -] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.644 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-1f5a91af-59e4-4791-a59a-d3dbd2fd180c - - - - - -] tcp:127.0.0.1:6640: entering CONNECTING _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.644 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-1f5a91af-59e4-4791-a59a-d3dbd2fd180c - - - - - -] [POLLOUT] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.645 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-1f5a91af-59e4-4791-a59a-d3dbd2fd180c - - - - - -] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.645 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-1f5a91af-59e4-4791-a59a-d3dbd2fd180c - - - - - -] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.647 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-1f5a91af-59e4-4791-a59a-d3dbd2fd180c - - - - - -] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.650 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-1f5a91af-59e4-4791-a59a-d3dbd2fd180c - - - - - -] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.666 2 DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.666 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.666 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m Oct 14 05:41:29 localhost nova_compute[237052]: 2025-10-14 09:41:29.667 2 INFO oslo.privsep.daemon [None req-1f5a91af-59e4-4791-a59a-d3dbd2fd180c - - - - - -] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'vif_plug_ovs.privsep.vif_plug', '--privsep_sock_path', '/tmp/tmp5lo3ek_2/privsep.sock']#033[00m Oct 14 05:41:30 localhost nova_compute[237052]: 2025-10-14 09:41:30.290 2 INFO oslo.privsep.daemon [None req-1f5a91af-59e4-4791-a59a-d3dbd2fd180c - - - - - -] Spawned new privsep daemon via rootwrap#033[00m Oct 14 05:41:30 localhost nova_compute[237052]: 2025-10-14 09:41:30.200 40 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m Oct 14 05:41:30 localhost nova_compute[237052]: 2025-10-14 09:41:30.205 40 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m Oct 14 05:41:30 localhost nova_compute[237052]: 2025-10-14 09:41:30.209 40 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_NET_ADMIN/CAP_DAC_OVERRIDE|CAP_NET_ADMIN/none#033[00m Oct 14 05:41:30 localhost nova_compute[237052]: 2025-10-14 
09:41:30.209 40 INFO oslo.privsep.daemon [-] privsep daemon running as pid 40#033[00m Oct 14 05:41:30 localhost nova_compute[237052]: 2025-10-14 09:41:30.564 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:41:30 localhost nova_compute[237052]: 2025-10-14 09:41:30.564 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3ec9b060-f4, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Oct 14 05:41:30 localhost nova_compute[237052]: 2025-10-14 09:41:30.564 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap3ec9b060-f4, col_values=(('external_ids', {'iface-id': '3ec9b060-f43d-4698-9c76-6062c70911d5', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:84:5e:e5', 'vm-uuid': '88c4e366-b765-47a6-96bf-f7677f2ce67c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Oct 14 05:41:30 localhost nova_compute[237052]: 2025-10-14 09:41:30.565 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m Oct 14 05:41:30 localhost nova_compute[237052]: 2025-10-14 09:41:30.566 2 INFO os_vif [None req-1f5a91af-59e4-4791-a59a-d3dbd2fd180c - - - - - -] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:84:5e:e5,bridge_name='br-int',has_traffic_filtering=True,id=3ec9b060-f43d-4698-9c76-6062c70911d5,network=Network(7d0cd696-bdd7-4e70-9512-eb0d23640314),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3ec9b060-f4')#033[00m Oct 14 05:41:30 localhost nova_compute[237052]: 2025-10-14 09:41:30.566 2 DEBUG nova.compute.manager [None 
req-1f5a91af-59e4-4791-a59a-d3dbd2fd180c - - - - - -] [instance: 88c4e366-b765-47a6-96bf-f7677f2ce67c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m Oct 14 05:41:30 localhost nova_compute[237052]: 2025-10-14 09:41:30.569 2 DEBUG nova.compute.manager [None req-1f5a91af-59e4-4791-a59a-d3dbd2fd180c - - - - - -] [instance: 88c4e366-b765-47a6-96bf-f7677f2ce67c] Current state is 1, state in DB is 1. _init_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:1304#033[00m Oct 14 05:41:30 localhost nova_compute[237052]: 2025-10-14 09:41:30.569 2 INFO nova.compute.manager [None req-1f5a91af-59e4-4791-a59a-d3dbd2fd180c - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host#033[00m Oct 14 05:41:30 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11512 DF PROTO=TCP SPT=33168 DPT=9102 SEQ=2071518500 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B0B225A0000000001030307) Oct 14 05:41:30 localhost python3.9[237403]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner_healthcheck.service follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Oct 14 05:41:31 localhost nova_compute[237052]: 2025-10-14 09:41:31.300 2 INFO nova.service [None req-1f5a91af-59e4-4791-a59a-d3dbd2fd180c - - - - - -] Updating service version for nova-compute on np0005486733.localdomain from 57 to 66#033[00m Oct 14 05:41:31 localhost nova_compute[237052]: 2025-10-14 09:41:31.333 2 DEBUG oslo_concurrency.lockutils [None req-1f5a91af-59e4-4791-a59a-d3dbd2fd180c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Oct 14 05:41:31 
localhost nova_compute[237052]: 2025-10-14 09:41:31.333 2 DEBUG oslo_concurrency.lockutils [None req-1f5a91af-59e4-4791-a59a-d3dbd2fd180c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Oct 14 05:41:31 localhost nova_compute[237052]: 2025-10-14 09:41:31.333 2 DEBUG oslo_concurrency.lockutils [None req-1f5a91af-59e4-4791-a59a-d3dbd2fd180c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Oct 14 05:41:31 localhost nova_compute[237052]: 2025-10-14 09:41:31.333 2 DEBUG nova.compute.resource_tracker [None req-1f5a91af-59e4-4791-a59a-d3dbd2fd180c - - - - - -] Auditing locally available compute resources for np0005486733.localdomain (node: np0005486733.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Oct 14 05:41:31 localhost nova_compute[237052]: 2025-10-14 09:41:31.334 2 DEBUG oslo_concurrency.processutils [None req-1f5a91af-59e4-4791-a59a-d3dbd2fd180c - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Oct 14 05:41:31 localhost nova_compute[237052]: 2025-10-14 09:41:31.788 2 DEBUG oslo_concurrency.processutils [None req-1f5a91af-59e4-4791-a59a-d3dbd2fd180c - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.454s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Oct 14 05:41:31 localhost nova_compute[237052]: 2025-10-14 09:41:31.875 2 DEBUG nova.virt.libvirt.driver [None req-1f5a91af-59e4-4791-a59a-d3dbd2fd180c - - - - - -] skipping disk for 
instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Oct 14 05:41:31 localhost nova_compute[237052]: 2025-10-14 09:41:31.876 2 DEBUG nova.virt.libvirt.driver [None req-1f5a91af-59e4-4791-a59a-d3dbd2fd180c - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Oct 14 05:41:31 localhost systemd[1]: Starting libvirt nodedev daemon... Oct 14 05:41:31 localhost systemd[1]: Started libvirt nodedev daemon. Oct 14 05:41:32 localhost nova_compute[237052]: 2025-10-14 09:41:32.134 2 WARNING nova.virt.libvirt.driver [None req-1f5a91af-59e4-4791-a59a-d3dbd2fd180c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m Oct 14 05:41:32 localhost nova_compute[237052]: 2025-10-14 09:41:32.135 2 DEBUG nova.compute.resource_tracker [None req-1f5a91af-59e4-4791-a59a-d3dbd2fd180c - - - - - -] Hypervisor/Node resource view: name=np0005486733.localdomain free_ram=12896MB free_disk=41.83725357055664GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": 
"1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Oct 14 05:41:32 localhost nova_compute[237052]: 2025-10-14 09:41:32.135 2 DEBUG oslo_concurrency.lockutils [None req-1f5a91af-59e4-4791-a59a-d3dbd2fd180c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Oct 14 05:41:32 localhost nova_compute[237052]: 2025-10-14 09:41:32.136 2 DEBUG oslo_concurrency.lockutils [None req-1f5a91af-59e4-4791-a59a-d3dbd2fd180c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Oct 14 
05:41:32 localhost nova_compute[237052]: 2025-10-14 09:41:32.349 2 DEBUG nova.compute.resource_tracker [None req-1f5a91af-59e4-4791-a59a-d3dbd2fd180c - - - - - -] Instance 88c4e366-b765-47a6-96bf-f7677f2ce67c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Oct 14 05:41:32 localhost nova_compute[237052]: 2025-10-14 09:41:32.349 2 DEBUG nova.compute.resource_tracker [None req-1f5a91af-59e4-4791-a59a-d3dbd2fd180c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Oct 14 05:41:32 localhost nova_compute[237052]: 2025-10-14 09:41:32.350 2 DEBUG nova.compute.resource_tracker [None req-1f5a91af-59e4-4791-a59a-d3dbd2fd180c - - - - - -] Final resource view: name=np0005486733.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Oct 14 05:41:32 localhost python3.9[237589]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Oct 14 05:41:32 localhost nova_compute[237052]: 2025-10-14 09:41:32.562 2 DEBUG nova.scheduler.client.report [None req-1f5a91af-59e4-4791-a59a-d3dbd2fd180c - - - - - -] Refreshing inventories for resource provider 18c24273-aca2-4f08-be57-3188d558235e _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m Oct 14 05:41:32 localhost nova_compute[237052]: 2025-10-14 09:41:32.593 2 DEBUG nova.scheduler.client.report [None req-1f5a91af-59e4-4791-a59a-d3dbd2fd180c - - - - - -] Updating ProviderTree inventory for 
provider 18c24273-aca2-4f08-be57-3188d558235e from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 0, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m Oct 14 05:41:32 localhost nova_compute[237052]: 2025-10-14 09:41:32.594 2 DEBUG nova.compute.provider_tree [None req-1f5a91af-59e4-4791-a59a-d3dbd2fd180c - - - - - -] Updating inventory in ProviderTree for provider 18c24273-aca2-4f08-be57-3188d558235e with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 0, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m Oct 14 05:41:32 localhost nova_compute[237052]: 2025-10-14 09:41:32.609 2 DEBUG nova.scheduler.client.report [None req-1f5a91af-59e4-4791-a59a-d3dbd2fd180c - - - - - -] Refreshing aggregate associations for resource provider 18c24273-aca2-4f08-be57-3188d558235e, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m Oct 14 05:41:32 localhost nova_compute[237052]: 2025-10-14 09:41:32.635 2 DEBUG nova.scheduler.client.report [None req-1f5a91af-59e4-4791-a59a-d3dbd2fd180c - - - - - -] Refreshing trait associations for resource provider 18c24273-aca2-4f08-be57-3188d558235e, traits: 
COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_LAN9118,HW_CPU_X86_SSE2,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_AMD_SVM,COMPUTE_STORAGE_BUS_IDE,HW_CPU_X86_FMA3,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_ACCELERATORS,HW_CPU_X86_SHA,HW_CPU_X86_SSE41,HW_CPU_X86_SSE42,HW_CPU_X86_F16C,COMPUTE_STORAGE_BUS_USB,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SSE,HW_CPU_X86_SSE4A,HW_CPU_X86_ABM,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_BMI,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_TRUSTED_CERTS,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_MMX,HW_CPU_X86_AVX,COMPUTE_RESCUE_BFV,COMPUTE_NODE,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_SSSE3,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_IMAGE_TYPE_QCOW2,HW_CPU_X86_BMI2,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_SVM,HW_CPU_X86_CLMUL,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_AVX2,HW_CPU_X86_AESNI,COMPUTE_NET_VIF_MODEL_VMXNET3 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m Oct 14 05:41:32 localhost nova_compute[237052]: 2025-10-14 09:41:32.684 2 DEBUG oslo_concurrency.processutils [None req-1f5a91af-59e4-4791-a59a-d3dbd2fd180c - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Oct 14 05:41:33 localhost nova_compute[237052]: 2025-10-14 09:41:33.083 2 DEBUG oslo_concurrency.processutils [None req-1f5a91af-59e4-4791-a59a-d3dbd2fd180c - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.399s 
execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Oct 14 05:41:33 localhost nova_compute[237052]: 2025-10-14 09:41:33.089 2 DEBUG nova.virt.libvirt.host [None req-1f5a91af-59e4-4791-a59a-d3dbd2fd180c - - - - - -] /sys/module/kvm_amd/parameters/sev contains [N Oct 14 05:41:33 localhost nova_compute[237052]: ] _kernel_supports_amd_sev /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1803#033[00m Oct 14 05:41:33 localhost nova_compute[237052]: 2025-10-14 09:41:33.090 2 INFO nova.virt.libvirt.host [None req-1f5a91af-59e4-4791-a59a-d3dbd2fd180c - - - - - -] kernel doesn't support AMD SEV#033[00m Oct 14 05:41:33 localhost nova_compute[237052]: 2025-10-14 09:41:33.091 2 DEBUG nova.compute.provider_tree [None req-1f5a91af-59e4-4791-a59a-d3dbd2fd180c - - - - - -] Updating inventory in ProviderTree for provider 18c24273-aca2-4f08-be57-3188d558235e with inventory: {'MEMORY_MB': {'total': 15738, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0, 'reserved': 0}, 'DISK_GB': {'total': 41, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 1}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m Oct 14 05:41:33 localhost nova_compute[237052]: 2025-10-14 09:41:33.092 2 DEBUG nova.virt.libvirt.driver [None req-1f5a91af-59e4-4791-a59a-d3dbd2fd180c - - - - - -] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m Oct 14 05:41:33 localhost nova_compute[237052]: 2025-10-14 09:41:33.173 2 DEBUG nova.scheduler.client.report [None req-1f5a91af-59e4-4791-a59a-d3dbd2fd180c - - - - - -] Updated inventory for provider 18c24273-aca2-4f08-be57-3188d558235e with generation 3 in Placement from set_inventory_for_provider using data: 
{'MEMORY_MB': {'total': 15738, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0, 'reserved': 0}, 'DISK_GB': {'total': 41, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 1}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:957#033[00m Oct 14 05:41:33 localhost nova_compute[237052]: 2025-10-14 09:41:33.174 2 DEBUG nova.compute.provider_tree [None req-1f5a91af-59e4-4791-a59a-d3dbd2fd180c - - - - - -] Updating resource provider 18c24273-aca2-4f08-be57-3188d558235e generation from 3 to 4 during operation: update_inventory _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164#033[00m Oct 14 05:41:33 localhost nova_compute[237052]: 2025-10-14 09:41:33.174 2 DEBUG nova.compute.provider_tree [None req-1f5a91af-59e4-4791-a59a-d3dbd2fd180c - - - - - -] Updating inventory in ProviderTree for provider 18c24273-aca2-4f08-be57-3188d558235e with inventory: {'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m Oct 14 05:41:33 localhost python3.9[237717]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service.requires follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Oct 14 05:41:33 localhost nova_compute[237052]: 2025-10-14 09:41:33.274 2 DEBUG nova.compute.provider_tree [None req-1f5a91af-59e4-4791-a59a-d3dbd2fd180c - - - - - -] Updating resource provider 
18c24273-aca2-4f08-be57-3188d558235e generation from 4 to 5 during operation: update_traits _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164#033[00m Oct 14 05:41:33 localhost nova_compute[237052]: 2025-10-14 09:41:33.296 2 DEBUG nova.compute.resource_tracker [None req-1f5a91af-59e4-4791-a59a-d3dbd2fd180c - - - - - -] Compute_service record updated for np0005486733.localdomain:np0005486733.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Oct 14 05:41:33 localhost nova_compute[237052]: 2025-10-14 09:41:33.296 2 DEBUG oslo_concurrency.lockutils [None req-1f5a91af-59e4-4791-a59a-d3dbd2fd180c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.160s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Oct 14 05:41:33 localhost nova_compute[237052]: 2025-10-14 09:41:33.296 2 DEBUG nova.service [None req-1f5a91af-59e4-4791-a59a-d3dbd2fd180c - - - - - -] Creating RPC server for service compute start /usr/lib/python3.9/site-packages/nova/service.py:182#033[00m Oct 14 05:41:33 localhost nova_compute[237052]: 2025-10-14 09:41:33.362 2 DEBUG nova.service [None req-1f5a91af-59e4-4791-a59a-d3dbd2fd180c - - - - - -] Join ServiceGroup membership for this service compute start /usr/lib/python3.9/site-packages/nova/service.py:199#033[00m Oct 14 05:41:33 localhost nova_compute[237052]: 2025-10-14 09:41:33.363 2 DEBUG nova.servicegroup.drivers.db [None req-1f5a91af-59e4-4791-a59a-d3dbd2fd180c - - - - - -] DB_Driver: join new ServiceGroup member np0005486733.localdomain to the compute group, service = join /usr/lib/python3.9/site-packages/nova/servicegroup/drivers/db.py:44#033[00m Oct 14 05:41:34 localhost nova_compute[237052]: 2025-10-14 09:41:34.320 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:41:34 localhost nova_compute[237052]: 2025-10-14 09:41:34.648 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:41:34 localhost python3.9[237829]: ansible-containers.podman.podman_container Invoked with name=nova_nvme_cleaner state=absent executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None 
network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None Oct 14 05:41:34 localhost systemd-journald[47488]: Field hash table of /run/log/journal/8e1d5208cffec42b50976967e1d1cfd0/system.journal has a fill level at 116.2 (387 of 333 items), suggesting rotation. Oct 14 05:41:34 localhost systemd-journald[47488]: /run/log/journal/8e1d5208cffec42b50976967e1d1cfd0/system.journal: Journal header limits reached or header out-of-date, rotating. Oct 14 05:41:34 localhost rsyslogd[759]: imjournal: journal files changed, reloading... 
[v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ] Oct 14 05:41:35 localhost nova_compute[237052]: 2025-10-14 09:41:35.365 2 DEBUG oslo_service.periodic_task [None req-c1ce1b0e-798d-4d22-9194-75e6967d4a0d - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 05:41:35 localhost nova_compute[237052]: 2025-10-14 09:41:35.398 2 DEBUG nova.compute.manager [None req-c1ce1b0e-798d-4d22-9194-75e6967d4a0d - - - - - -] Triggering sync for uuid 88c4e366-b765-47a6-96bf-f7677f2ce67c _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m Oct 14 05:41:35 localhost nova_compute[237052]: 2025-10-14 09:41:35.398 2 DEBUG oslo_concurrency.lockutils [None req-c1ce1b0e-798d-4d22-9194-75e6967d4a0d - - - - - -] Acquiring lock "88c4e366-b765-47a6-96bf-f7677f2ce67c" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Oct 14 05:41:35 localhost nova_compute[237052]: 2025-10-14 09:41:35.399 2 DEBUG oslo_concurrency.lockutils [None req-c1ce1b0e-798d-4d22-9194-75e6967d4a0d - - - - - -] Lock "88c4e366-b765-47a6-96bf-f7677f2ce67c" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Oct 14 05:41:35 localhost nova_compute[237052]: 2025-10-14 09:41:35.399 2 DEBUG oslo_service.periodic_task [None req-c1ce1b0e-798d-4d22-9194-75e6967d4a0d - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 05:41:35 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 
DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23723 DF PROTO=TCP SPT=43830 DPT=9882 SEQ=2353291800 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B0B351A0000000001030307) Oct 14 05:41:35 localhost nova_compute[237052]: 2025-10-14 09:41:35.501 2 DEBUG oslo_concurrency.lockutils [None req-c1ce1b0e-798d-4d22-9194-75e6967d4a0d - - - - - -] Lock "88c4e366-b765-47a6-96bf-f7677f2ce67c" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.102s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Oct 14 05:41:35 localhost python3.9[237965]: ansible-ansible.builtin.systemd Invoked with name=edpm_nova_compute.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None Oct 14 05:41:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f. Oct 14 05:41:35 localhost systemd[1]: Stopping nova_compute container... 
Oct 14 05:41:36 localhost podman[237967]: 2025-10-14 09:41:35.994618417 +0000 UTC m=+0.085203509 container health_status 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.build-date=20251009, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team) Oct 14 05:41:36 localhost podman[237967]: 2025-10-14 09:41:36.075192399 +0000 UTC m=+0.165777541 container exec_died 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': 
['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.build-date=20251009, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS) Oct 14 05:41:36 localhost systemd[1]: 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f.service: Deactivated successfully. Oct 14 05:41:37 localhost nova_compute[237052]: 2025-10-14 09:41:37.479 2 WARNING amqp [-] Received method (60, 30) during closing channel 1. 
This method will be ignored#033[00m Oct 14 05:41:37 localhost nova_compute[237052]: 2025-10-14 09:41:37.481 2 DEBUG oslo_concurrency.lockutils [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Oct 14 05:41:37 localhost nova_compute[237052]: 2025-10-14 09:41:37.481 2 DEBUG oslo_concurrency.lockutils [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Oct 14 05:41:37 localhost nova_compute[237052]: 2025-10-14 09:41:37.481 2 DEBUG oslo_concurrency.lockutils [None req-922dfad0-72af-4b6b-89cc-4fe9233e2928 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Oct 14 05:41:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857. 
Oct 14 05:41:37 localhost podman[238007]: 2025-10-14 09:41:37.741119248 +0000 UTC m=+0.080096818 container health_status 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2) Oct 14 05:41:37 localhost podman[238007]: 2025-10-14 09:41:37.749133042 +0000 UTC 
m=+0.088110602 container exec_died 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251009, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team) Oct 14 05:41:37 localhost systemd[1]: 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857.service: Deactivated successfully. 
Oct 14 05:41:37 localhost journal[206742]: libvirt version: 10.10.0, package: 15.el9 (builder@centos.org, 2025-08-18-13:22:20, ) Oct 14 05:41:37 localhost journal[206742]: hostname: np0005486733.localdomain Oct 14 05:41:37 localhost journal[206742]: End of file while reading data: Input/output error Oct 14 05:41:37 localhost systemd[1]: libpod-b5bca3e82ce7d7a3523a446d3f76940f7a8c2f4ab9ab6051e2ae66a8045d2048.scope: Deactivated successfully. Oct 14 05:41:37 localhost systemd[1]: libpod-b5bca3e82ce7d7a3523a446d3f76940f7a8c2f4ab9ab6051e2ae66a8045d2048.scope: Consumed 4.698s CPU time. Oct 14 05:41:37 localhost podman[237975]: 2025-10-14 09:41:37.883935227 +0000 UTC m=+1.946630794 container died b5bca3e82ce7d7a3523a446d3f76940f7a8c2f4ab9ab6051e2ae66a8045d2048 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.vendor=CentOS, config_id=edpm, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=nova_compute, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', 
'/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, io.buildah.version=1.41.3) Oct 14 05:41:37 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b5bca3e82ce7d7a3523a446d3f76940f7a8c2f4ab9ab6051e2ae66a8045d2048-userdata-shm.mount: Deactivated successfully. Oct 14 05:41:37 localhost systemd[1]: var-lib-containers-storage-overlay-533fa5474117d9c6b7dd39987146fbd30daa3403ff237c50bdb261ff835bc940-merged.mount: Deactivated successfully. Oct 14 05:41:37 localhost podman[237975]: 2025-10-14 09:41:37.940270172 +0000 UTC m=+2.002965709 container cleanup b5bca3e82ce7d7a3523a446d3f76940f7a8c2f4ab9ab6051e2ae66a8045d2048 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, container_name=nova_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', 
'/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, tcib_managed=true, managed_by=edpm_ansible, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team) Oct 14 05:41:37 localhost podman[237975]: nova_compute Oct 14 05:41:37 localhost podman[238025]: 2025-10-14 09:41:37.941587451 +0000 UTC m=+0.048730894 container cleanup b5bca3e82ce7d7a3523a446d3f76940f7a8c2f4ab9ab6051e2ae66a8045d2048 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, container_name=nova_compute, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, io.buildah.version=1.41.3, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2) Oct 14 05:41:37 localhost podman[238040]: 2025-10-14 09:41:37.993233088 +0000 UTC m=+0.029659646 container cleanup 
b5bca3e82ce7d7a3523a446d3f76940f7a8c2f4ab9ab6051e2ae66a8045d2048 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.build-date=20251009, io.buildah.version=1.41.3, container_name=nova_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, config_id=edpm, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS) Oct 14 05:41:37 localhost podman[238040]: nova_compute Oct 14 05:41:37 localhost systemd[1]: edpm_nova_compute.service: Deactivated successfully. Oct 14 05:41:37 localhost systemd[1]: Stopped nova_compute container. Oct 14 05:41:38 localhost systemd[1]: Starting nova_compute container... Oct 14 05:41:38 localhost systemd[1]: Started libcrun container. 
Oct 14 05:41:38 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/533fa5474117d9c6b7dd39987146fbd30daa3403ff237c50bdb261ff835bc940/merged/etc/multipath supports timestamps until 2038 (0x7fffffff) Oct 14 05:41:38 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/533fa5474117d9c6b7dd39987146fbd30daa3403ff237c50bdb261ff835bc940/merged/etc/nvme supports timestamps until 2038 (0x7fffffff) Oct 14 05:41:38 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/533fa5474117d9c6b7dd39987146fbd30daa3403ff237c50bdb261ff835bc940/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff) Oct 14 05:41:38 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/533fa5474117d9c6b7dd39987146fbd30daa3403ff237c50bdb261ff835bc940/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff) Oct 14 05:41:38 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/533fa5474117d9c6b7dd39987146fbd30daa3403ff237c50bdb261ff835bc940/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff) Oct 14 05:41:38 localhost podman[238053]: 2025-10-14 09:41:38.11184703 +0000 UTC m=+0.091957564 container init b5bca3e82ce7d7a3523a446d3f76940f7a8c2f4ab9ab6051e2ae66a8045d2048 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, maintainer=OpenStack Kubernetes Operator team, container_name=nova_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', 
'/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=edpm, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5) Oct 14 05:41:38 localhost podman[238053]: 2025-10-14 09:41:38.12004973 +0000 UTC m=+0.100160264 container start b5bca3e82ce7d7a3523a446d3f76940f7a8c2f4ab9ab6051e2ae66a8045d2048 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', 
'/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=nova_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, tcib_managed=true) Oct 14 05:41:38 localhost podman[238053]: nova_compute Oct 14 05:41:38 localhost nova_compute[238069]: + sudo -E kolla_set_configs Oct 14 05:41:38 localhost systemd[1]: Started nova_compute container. Oct 14 05:41:38 localhost nova_compute[238069]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json Oct 14 05:41:38 localhost nova_compute[238069]: INFO:__main__:Validating config file Oct 14 05:41:38 localhost nova_compute[238069]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS Oct 14 05:41:38 localhost nova_compute[238069]: INFO:__main__:Copying service configuration files Oct 14 05:41:38 localhost nova_compute[238069]: INFO:__main__:Deleting /etc/nova/nova.conf Oct 14 05:41:38 localhost nova_compute[238069]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf Oct 14 05:41:38 localhost nova_compute[238069]: INFO:__main__:Setting permission for /etc/nova/nova.conf Oct 14 05:41:38 localhost nova_compute[238069]: INFO:__main__:Deleting /etc/nova/nova.conf.d/01-nova.conf Oct 14 05:41:38 localhost nova_compute[238069]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf Oct 14 05:41:38 localhost nova_compute[238069]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf Oct 14 05:41:38 localhost nova_compute[238069]: INFO:__main__:Deleting /etc/nova/nova.conf.d/03-ceph-nova.conf Oct 14 05:41:38 localhost nova_compute[238069]: INFO:__main__:Copying /var/lib/kolla/config_files/03-ceph-nova.conf to 
/etc/nova/nova.conf.d/03-ceph-nova.conf Oct 14 05:41:38 localhost nova_compute[238069]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/03-ceph-nova.conf Oct 14 05:41:38 localhost nova_compute[238069]: INFO:__main__:Deleting /etc/nova/nova.conf.d/99-nova-compute-cells-workarounds.conf Oct 14 05:41:38 localhost nova_compute[238069]: INFO:__main__:Copying /var/lib/kolla/config_files/99-nova-compute-cells-workarounds.conf to /etc/nova/nova.conf.d/99-nova-compute-cells-workarounds.conf Oct 14 05:41:38 localhost nova_compute[238069]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/99-nova-compute-cells-workarounds.conf Oct 14 05:41:38 localhost nova_compute[238069]: INFO:__main__:Deleting /etc/nova/nova.conf.d/nova-blank.conf Oct 14 05:41:38 localhost nova_compute[238069]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf Oct 14 05:41:38 localhost nova_compute[238069]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf Oct 14 05:41:38 localhost nova_compute[238069]: INFO:__main__:Deleting /etc/nova/nova.conf.d/02-nova-host-specific.conf Oct 14 05:41:38 localhost nova_compute[238069]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf Oct 14 05:41:38 localhost nova_compute[238069]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf Oct 14 05:41:38 localhost nova_compute[238069]: INFO:__main__:Deleting /etc/ceph Oct 14 05:41:38 localhost nova_compute[238069]: INFO:__main__:Creating directory /etc/ceph Oct 14 05:41:38 localhost nova_compute[238069]: INFO:__main__:Setting permission for /etc/ceph Oct 14 05:41:38 localhost nova_compute[238069]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.conf to /etc/ceph/ceph.conf Oct 14 05:41:38 localhost nova_compute[238069]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf Oct 14 05:41:38 localhost 
nova_compute[238069]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.client.openstack.keyring to /etc/ceph/ceph.client.openstack.keyring Oct 14 05:41:38 localhost nova_compute[238069]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring Oct 14 05:41:38 localhost nova_compute[238069]: INFO:__main__:Deleting /var/lib/nova/.ssh/ssh-privatekey Oct 14 05:41:38 localhost nova_compute[238069]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey Oct 14 05:41:38 localhost nova_compute[238069]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey Oct 14 05:41:38 localhost nova_compute[238069]: INFO:__main__:Deleting /var/lib/nova/.ssh/config Oct 14 05:41:38 localhost nova_compute[238069]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config Oct 14 05:41:38 localhost nova_compute[238069]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config Oct 14 05:41:38 localhost nova_compute[238069]: INFO:__main__:Writing out command to execute Oct 14 05:41:38 localhost nova_compute[238069]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf Oct 14 05:41:38 localhost nova_compute[238069]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring Oct 14 05:41:38 localhost nova_compute[238069]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ Oct 14 05:41:38 localhost nova_compute[238069]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey Oct 14 05:41:38 localhost nova_compute[238069]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config Oct 14 05:41:38 localhost nova_compute[238069]: ++ cat /run_command Oct 14 05:41:38 localhost nova_compute[238069]: + CMD=nova-compute Oct 14 05:41:38 localhost nova_compute[238069]: + ARGS= Oct 14 05:41:38 localhost nova_compute[238069]: + sudo kolla_copy_cacerts Oct 14 05:41:38 localhost nova_compute[238069]: + [[ ! 
-n '' ]] Oct 14 05:41:38 localhost nova_compute[238069]: + . kolla_extend_start Oct 14 05:41:38 localhost nova_compute[238069]: Running command: 'nova-compute' Oct 14 05:41:38 localhost nova_compute[238069]: + echo 'Running command: '\''nova-compute'\''' Oct 14 05:41:38 localhost nova_compute[238069]: + umask 0022 Oct 14 05:41:38 localhost nova_compute[238069]: + exec nova-compute Oct 14 05:41:38 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52087 DF PROTO=TCP SPT=58602 DPT=9105 SEQ=2252466973 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B0B421E0000000001030307) Oct 14 05:41:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52088 DF PROTO=TCP SPT=58602 DPT=9105 SEQ=2252466973 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B0B461A0000000001030307) Oct 14 05:41:39 localhost nova_compute[238069]: 2025-10-14 09:41:39.880 2 DEBUG os_vif [-] Loaded VIF plugin class '' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m Oct 14 05:41:39 localhost nova_compute[238069]: 2025-10-14 09:41:39.881 2 DEBUG os_vif [-] Loaded VIF plugin class '' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m Oct 14 05:41:39 localhost nova_compute[238069]: 2025-10-14 09:41:39.881 2 DEBUG os_vif [-] Loaded VIF plugin class '' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m Oct 14 05:41:39 localhost nova_compute[238069]: 2025-10-14 09:41:39.881 2 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs#033[00m Oct 14 05:41:39 localhost nova_compute[238069]: 2025-10-14 09:41:39.993 2 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan 
/sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.015 2 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 0 in 0.022s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.612 2 INFO nova.virt.driver [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.727 2 INFO nova.compute.provider_config [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.738 2 WARNING nova.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] Current Nova version does not support computes older than Yoga but the minimum compute service level in your cell is 57 and the oldest supported service level is 61.: nova.exception.TooOldComputeService: Current Nova version does not support computes older than Yoga but the minimum compute service level in your cell is 57 and the oldest supported service level is 61.#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.738 2 DEBUG oslo_concurrency.lockutils [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.739 2 DEBUG oslo_concurrency.lockutils [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Oct 14 05:41:40 localhost 
nova_compute[238069]: 2025-10-14 09:41:40.739 2 DEBUG oslo_concurrency.lockutils [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.739 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.739 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.739 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.739 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.740 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.740 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] ================================================================================ log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.740 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] allow_resize_to_same_host = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.740 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] arq_binding_timeout = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.740 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] backdoor_port = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.740 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] backdoor_socket = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.740 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] block_device_allocate_retries = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.741 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.741 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] cert = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 
2025-10-14 09:41:40.741 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] compute_driver = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.741 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] compute_monitors = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.741 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] config_dir = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.741 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] config_drive_format = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.742 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] config_file = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.742 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] config_source = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.742 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] console_host = np0005486733.localdomain log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.742 2 DEBUG oslo_service.service 
[None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] control_exchange = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.742 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] cpu_allocation_ratio = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.742 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] daemon = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.742 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] debug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.742 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.743 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] default_availability_zone = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.743 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] default_ephemeral_format = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.743 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] default_log_levels = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 
'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.743 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] default_schedule_zone = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.743 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] disk_allocation_ratio = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.743 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] enable_new_services = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.744 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] enabled_apis = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.744 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] enabled_ssl_apis = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 
09:41:40.744 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] flat_injected = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.744 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] force_config_drive = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.744 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] force_raw_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.744 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] graceful_shutdown_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.745 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.745 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] host = np0005486733.localdomain log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.745 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] initial_cpu_allocation_ratio = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.745 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] 
initial_disk_allocation_ratio = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.745 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] initial_ram_allocation_ratio = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.745 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] injected_network_template = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.745 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] instance_build_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.746 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] instance_delete_interval = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.746 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] instance_format = [instance: %(uuid)s] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.746 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] instance_name_template = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.746 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] 
instance_usage_audit = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.746 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] instance_usage_audit_period = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.746 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] instance_uuid_format = [instance: %(uuid)s] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.746 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] instances_path = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.747 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.747 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] key = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.747 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] live_migration_retry_count = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.747 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] log_config_append = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.747 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] log_date_format = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.747 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] log_dir = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.747 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] log_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.748 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] log_options = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.748 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] log_rotate_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.748 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] log_rotate_interval_type = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.748 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] log_rotation_type = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.748 2 DEBUG 
oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] logging_context_format_string = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.748 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] logging_debug_format_suffix = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.748 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] logging_default_format_string = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.749 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] logging_exception_prefix = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.749 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] logging_user_identity_format = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.749 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] long_rpc_timeout = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:40 localhost 
nova_compute[238069]: 2025-10-14 09:41:40.749 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] max_concurrent_builds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.749 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.749 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] max_concurrent_snapshots = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.749 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] max_local_block_devices = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.750 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] max_logfile_count = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.750 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] max_logfile_size_mb = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.750 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.750 2 DEBUG oslo_service.service [None 
req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] metadata_listen = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.750 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] metadata_listen_port = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.750 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] metadata_workers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.750 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] migrate_max_retries = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.751 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] mkisofs_cmd = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.751 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] my_block_storage_ip = 192.168.122.108 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.751 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] my_ip = 192.168.122.108 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.751 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] network_allocate_retries = 0 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.751 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.751 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] osapi_compute_listen = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.751 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] osapi_compute_listen_port = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.751 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] osapi_compute_unique_server_name_scope = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.752 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] osapi_compute_workers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.752 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] password_length = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.752 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] periodic_enable = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m 
Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.752 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] periodic_fuzzy_delay = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.752 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] pointer_model = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.752 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] preallocate_images = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.752 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] publish_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.753 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] pybasedir = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.753 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] ram_allocation_ratio = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.753 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] rate_limit_burst = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.753 2 DEBUG oslo_service.service [None 
req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] rate_limit_except_level = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.753 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] rate_limit_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.753 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] reboot_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.753 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] reclaim_instance_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.754 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] record = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.754 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] reimage_timeout_per_gb = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.754 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] report_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.754 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] rescue_timeout = 0 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.754 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] reserved_host_cpus = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.754 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] reserved_host_disk_mb = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.754 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] reserved_host_memory_mb = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.755 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] reserved_huge_pages = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.755 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] resize_confirm_window = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.755 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] resize_fs_using_block_device = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.755 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:40 localhost 
nova_compute[238069]: 2025-10-14 09:41:40.755 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] rootwrap_config = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.755 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] rpc_response_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.755 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] run_external_periodic_tasks = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.755 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.756 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.756 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.756 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.756 2 
DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] service_down_time = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.756 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] servicegroup_driver = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.756 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] shelved_offload_time = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.756 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] shelved_poll_interval = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.757 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] shutdown_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.757 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] source_is_ipv6 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.757 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] ssl_only = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.757 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] state_path = /var/lib/nova log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.757 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] sync_power_state_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.757 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] sync_power_state_pool_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.757 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] syslog_log_facility = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.757 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] tempdir = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.758 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] timeout_nbd = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.758 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.758 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] update_resources_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 
09:41:40.758 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] use_cow_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.758 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] use_eventlog = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.758 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] use_journal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.758 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] use_json = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.759 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] use_rootwrap_daemon = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.759 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] use_stderr = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.759 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] use_syslog = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.759 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] vcpu_pin_set = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.759 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] vif_plugging_is_fatal = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.759 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] vif_plugging_timeout = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.759 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] virt_mkfs = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.760 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] volume_usage_poll_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.760 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] watch_log_file = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.760 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] web = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.760 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost 
nova_compute[238069]: 2025-10-14 09:41:40.760 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] oslo_concurrency.lock_path = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.760 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.760 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.761 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] oslo_messaging_metrics.metrics_process_name = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.761 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.761 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.761 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] api.auth_strategy = keystone log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.761 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] api.compute_link_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.761 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.761 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] api.dhcp_domain = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.762 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] api.enable_instance_password = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.762 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] api.glance_link_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.762 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.762 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] api.instance_list_cells_batch_strategy = 
distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.762 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.762 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.762 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] api.local_metadata_per_cell = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.763 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] api.max_limit = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.763 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] api.metadata_cache_expiration = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.763 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] api.neutron_default_tenant_id = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.763 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] api.use_forwarded_for = False log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.763 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] api.use_neutron_default_nets = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.763 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.763 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.763 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.764 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] api.vendordata_dynamic_ssl_certfile = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.764 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.764 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] api.vendordata_jsonfile_path = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.764 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] api.vendordata_providers = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.764 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] cache.backend = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.764 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] cache.backend_argument = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.765 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] cache.config_prefix = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.765 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] cache.dead_timeout = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.765 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] cache.debug_cache_backend = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.765 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] cache.enable_retry_client = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 
localhost nova_compute[238069]: 2025-10-14 09:41:40.765 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] cache.enable_socket_keepalive = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.765 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] cache.enabled = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.765 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] cache.expiration_time = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.765 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.766 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] cache.hashclient_retry_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.766 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] cache.memcache_dead_retry = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.766 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] cache.memcache_password = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.766 2 DEBUG oslo_service.service [None 
req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.766 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.766 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] cache.memcache_pool_maxsize = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.766 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.767 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] cache.memcache_sasl_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.767 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] cache.memcache_servers = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.767 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] cache.memcache_socket_timeout = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.767 2 DEBUG oslo_service.service [None 
req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] cache.memcache_username = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.767 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] cache.proxies = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.767 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] cache.retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.767 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] cache.retry_delay = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.768 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] cache.socket_keepalive_count = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.768 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] cache.socket_keepalive_idle = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.768 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.768 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] cache.tls_allowed_ciphers = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.768 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] cache.tls_cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.768 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] cache.tls_certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.768 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] cache.tls_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.769 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] cache.tls_keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.769 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] cinder.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.769 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] cinder.auth_type = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.769 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] cinder.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 
09:41:40.769 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] cinder.catalog_info = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.769 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] cinder.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.769 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] cinder.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.770 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] cinder.cross_az_attach = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.770 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] cinder.debug = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.770 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] cinder.endpoint_template = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.770 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] cinder.http_retries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.770 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - 
-] cinder.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.770 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] cinder.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.770 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] cinder.os_region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.771 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] cinder.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.771 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] cinder.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.771 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.771 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] compute.cpu_dedicated_set = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.771 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] compute.cpu_shared_set = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.771 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.771 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.771 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.772 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.772 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.772 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.772 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] 
compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.772 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.772 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] compute.vmdk_allowed_types = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.772 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] conductor.workers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.773 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] console.allowed_origins = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.773 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] console.ssl_ciphers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.773 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] console.ssl_minimum_version = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.773 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] consoleauth.token_ttl = 600 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.773 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] cyborg.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.773 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] cyborg.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.773 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] cyborg.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.774 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] cyborg.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.774 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] cyborg.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.774 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] cyborg.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.774 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] cyborg.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost 
nova_compute[238069]: 2025-10-14 09:41:40.774 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] cyborg.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.774 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] cyborg.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.775 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] cyborg.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.775 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] cyborg.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.775 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] cyborg.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.775 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] cyborg.service_type = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.775 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] cyborg.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.775 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b 
- - - - - -] cyborg.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.775 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.776 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] cyborg.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.776 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] cyborg.valid_interfaces = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.776 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] cyborg.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.776 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] database.backend = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.776 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] database.connection = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.776 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] database.connection_debug = 0 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.776 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] database.connection_parameters = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.777 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.777 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] database.connection_trace = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.777 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.777 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] database.db_max_retries = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.777 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.777 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 
05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.777 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] database.max_overflow = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.778 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] database.max_pool_size = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.778 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] database.max_retries = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.778 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] database.mysql_enable_ndb = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.778 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] database.mysql_sql_mode = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.778 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.778 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] database.pool_timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.778 2 DEBUG oslo_service.service 
[None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] database.retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.779 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] database.slave_connection = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.779 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.779 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] api_database.backend = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.779 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] api_database.connection = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.779 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] api_database.connection_debug = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.779 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] api_database.connection_parameters = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.779 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] 
api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.780 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] api_database.connection_trace = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.780 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.780 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] api_database.db_max_retries = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.780 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.780 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.780 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] api_database.max_overflow = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.780 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] api_database.max_pool_size = 5 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.781 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] api_database.max_retries = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.781 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] api_database.mysql_enable_ndb = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.781 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] api_database.mysql_sql_mode = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.781 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.781 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] api_database.pool_timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.781 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] api_database.retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.781 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] api_database.slave_connection = **** log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.782 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.782 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] devices.enabled_mdev_types = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.782 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.782 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.782 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.782 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] glance.api_servers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.782 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] glance.cafile = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.783 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] glance.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.783 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] glance.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.783 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] glance.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.783 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] glance.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.783 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] glance.debug = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.783 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.783 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 
localhost nova_compute[238069]: 2025-10-14 09:41:40.784 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] glance.enable_rbd_download = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.784 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] glance.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.784 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] glance.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.784 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] glance.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.784 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] glance.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.784 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] glance.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.784 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] glance.num_retries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.784 2 DEBUG oslo_service.service [None 
req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] glance.rbd_ceph_conf = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.785 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] glance.rbd_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.785 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] glance.rbd_pool = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.785 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] glance.rbd_user = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.785 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] glance.region_name = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.785 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] glance.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.785 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] glance.service_type = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.785 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] glance.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.786 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] glance.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.786 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.786 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] glance.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.786 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] glance.valid_interfaces = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.786 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.786 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] glance.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.786 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] guestfs.debug = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.787 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] hyperv.config_drive_cdrom = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.787 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.787 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] hyperv.dynamic_memory_ratio = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.787 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.787 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] hyperv.enable_remotefx = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.787 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] hyperv.instances_path_share = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.787 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] hyperv.iscsi_initiator_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.787 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] hyperv.limit_cpu_features = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.788 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.788 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.788 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.788 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.788 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] hyperv.qemu_img_cmd = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.788 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] hyperv.use_multipath_io = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.788 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.789 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.789 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] hyperv.vswitch_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.789 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.789 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] mks.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.789 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] mks.mksproxy_base_url = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.790 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] image_cache.manager_interval = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.790 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.790 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.790 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.790 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.790 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] image_cache.subdirectory_name = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.791 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] ironic.api_max_retries = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.791 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] ironic.api_retry_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.791 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] ironic.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.791 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] ironic.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.791 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] ironic.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.791 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] ironic.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.791 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] ironic.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.792 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] ironic.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.792 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] ironic.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.792 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] ironic.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.792 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] ironic.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.792 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] ironic.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.792 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] ironic.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.792 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] ironic.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.792 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] ironic.partition_key = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.793 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] ironic.peer_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.793 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] ironic.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.793 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.793 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] ironic.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.793 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] ironic.service_type = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.793 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] ironic.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.793 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] ironic.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.794 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.794 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] ironic.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.794 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] ironic.valid_interfaces = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.794 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] ironic.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.794 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] key_manager.backend = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.794 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] key_manager.fixed_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.794 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] barbican.auth_endpoint = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.795 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] barbican.barbican_api_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.795 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] barbican.barbican_endpoint = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.795 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.795 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] barbican.barbican_region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.795 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] barbican.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.795 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] barbican.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.795 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] barbican.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.796 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] barbican.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.796 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] barbican.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.796 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] barbican.number_of_retries = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.796 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] barbican.retry_delay = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.796 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.796 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] barbican.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.796 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] barbican.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.796 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] barbican.verify_ssl = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.797 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] barbican.verify_ssl_path = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.797 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.797 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.797 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] barbican_service_user.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.797 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.797 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.797 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.798 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] barbican_service_user.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.798 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.798 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] barbican_service_user.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.798 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] vault.approle_role_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.798 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] vault.approle_secret_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.798 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] vault.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.798 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] vault.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.799 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] vault.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.799 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] vault.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.799 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] vault.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.799 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] vault.kv_mountpoint = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.799 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] vault.kv_version = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.799 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] vault.namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.799 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] vault.root_token_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.799 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] vault.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.800 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] vault.ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.800 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] vault.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.800 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] vault.use_ssl = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.800 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] vault.vault_url = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.800 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] keystone.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.800 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] keystone.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.801 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] keystone.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.801 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] keystone.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.801 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] keystone.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.801 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] keystone.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.801 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] keystone.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.801 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] keystone.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.801 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] keystone.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.802 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] keystone.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.802 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] keystone.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.802 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] keystone.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.802 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] keystone.service_type = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.802 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] keystone.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.802 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] keystone.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.802 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.802 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] keystone.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.803 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] keystone.valid_interfaces = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.803 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] keystone.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.803 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] libvirt.connection_uri = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.803 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] libvirt.cpu_mode = host-model log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.803 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] libvirt.cpu_model_extra_flags = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.803 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] libvirt.cpu_models = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.804 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.804 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.804 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] libvirt.cpu_power_management = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.804 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.804 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.804 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] libvirt.device_detach_timeout = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.805 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] libvirt.disk_cachemodes = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.805 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] libvirt.disk_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.805 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] libvirt.enabled_perf_events = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.805 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] libvirt.file_backed_memory = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.805 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] libvirt.gid_maps = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.805 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] libvirt.hw_disk_discard = None log_opt_values
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.805 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] libvirt.hw_machine_type = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.806 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] libvirt.images_rbd_ceph_conf = /etc/ceph/ceph.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.806 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.806 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.806 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] libvirt.images_rbd_glance_store_name = default_backend log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.806 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] libvirt.images_rbd_pool = vms log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.806 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] libvirt.images_type = rbd log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.806 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] libvirt.images_volume_group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.807 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] libvirt.inject_key = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.807 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] libvirt.inject_partition = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.807 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] libvirt.inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.807 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] libvirt.iscsi_iface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.807 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] libvirt.iser_use_multipath = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.807 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 
localhost nova_compute[238069]: 2025-10-14 09:41:40.807 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.808 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.808 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.808 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.808 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.808 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.808 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 
localhost nova_compute[238069]: 2025-10-14 09:41:40.808 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] libvirt.live_migration_scheme = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.809 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.809 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.809 2 WARNING oslo_config.cfg [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal ( Oct 14 05:41:40 localhost nova_compute[238069]: live_migration_uri is deprecated for removal in favor of two other options that Oct 14 05:41:40 localhost nova_compute[238069]: allow to change live migration scheme and target URI: ``live_migration_scheme`` Oct 14 05:41:40 localhost nova_compute[238069]: and ``live_migration_inbound_addr`` respectively. Oct 14 05:41:40 localhost nova_compute[238069]: ). 
Its value may be silently ignored in the future.#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.809 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] libvirt.live_migration_uri = qemu+ssh://nova@%s/system?keyfile=/var/lib/nova/.ssh/ssh-privatekey log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.810 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] libvirt.live_migration_with_native_tls = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.810 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] libvirt.max_queues = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.810 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.810 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] libvirt.nfs_mount_options = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.810 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] libvirt.nfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.810 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] libvirt.num_aoe_discover_tries = 3 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.811 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] libvirt.num_iser_scan_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.811 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.811 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.811 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] libvirt.num_pcie_ports = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.811 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] libvirt.num_volume_scan_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.811 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] libvirt.pmem_namespaces = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.811 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] libvirt.quobyte_client_cfg = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.812 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.812 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] libvirt.rbd_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.812 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.812 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.812 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] libvirt.rbd_secret_uuid = fcadf6e2-9176-5818-a8d0-37b19acf8eaf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.812 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] libvirt.rbd_user = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.812 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.813 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.813 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] libvirt.rescue_image_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.813 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] libvirt.rescue_kernel_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.813 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] libvirt.rescue_ramdisk_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.813 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] libvirt.rng_dev_path = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.813 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] libvirt.rx_queue_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.813 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] libvirt.smbfs_mount_options = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 
localhost nova_compute[238069]: 2025-10-14 09:41:40.814 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.814 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] libvirt.snapshot_compression = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.814 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] libvirt.snapshot_image_format = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.814 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] libvirt.snapshots_directory = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.814 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.814 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] libvirt.swtpm_enabled = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.815 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] libvirt.swtpm_group = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost 
nova_compute[238069]: 2025-10-14 09:41:40.815 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] libvirt.swtpm_user = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.815 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] libvirt.sysinfo_serial = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.815 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] libvirt.tx_queue_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.815 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] libvirt.uid_maps = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.815 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.815 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] libvirt.virt_type = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.816 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] libvirt.volume_clear = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.816 2 DEBUG oslo_service.service [None 
req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] libvirt.volume_clear_size = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.816 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] libvirt.volume_use_multipath = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.816 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] libvirt.vzstorage_cache_path = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.816 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] libvirt.vzstorage_log_path = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.816 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] libvirt.vzstorage_mount_group = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.817 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] libvirt.vzstorage_mount_opts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.817 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] libvirt.vzstorage_mount_perms = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.817 2 DEBUG oslo_service.service [None 
req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.817 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] libvirt.vzstorage_mount_user = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.817 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.817 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] neutron.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.818 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] neutron.auth_type = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.818 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] neutron.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.818 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] neutron.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.818 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] neutron.collect_timing = 
False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.818 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] neutron.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.818 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] neutron.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.818 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] neutron.default_floating_pool = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.819 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] neutron.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.819 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.819 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] neutron.http_retries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.819 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] neutron.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m 
Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.819 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] neutron.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.819 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] neutron.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.819 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.820 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] neutron.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.820 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] neutron.ovs_bridge = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.820 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] neutron.physnets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.820 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] neutron.region_name = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.820 2 DEBUG oslo_service.service [None 
req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.820 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] neutron.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.820 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] neutron.service_type = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.820 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] neutron.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.821 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] neutron.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.821 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.821 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] neutron.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.821 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] neutron.valid_interfaces = 
['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.821 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] neutron.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.821 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.821 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] notifications.default_level = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.822 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.822 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.822 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.822 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] pci.alias = [] log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.822 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] pci.device_spec = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.822 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] pci.report_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.822 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] placement.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.823 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] placement.auth_type = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.823 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] placement.auth_url = http://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.823 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] placement.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.823 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] placement.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 
localhost nova_compute[238069]: 2025-10-14 09:41:40.823 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] placement.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.823 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] placement.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.823 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] placement.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.824 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] placement.default_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.824 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] placement.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.824 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] placement.domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.824 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] placement.domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.824 2 DEBUG 
oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] placement.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.824 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] placement.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.825 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] placement.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.825 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] placement.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.825 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] placement.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.825 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] placement.password = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.825 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] placement.project_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.825 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] 
placement.project_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.825 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] placement.project_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.825 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] placement.project_name = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.826 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] placement.region_name = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.826 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] placement.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.826 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] placement.service_type = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.826 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] placement.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.826 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] placement.status_code_retries = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.827 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.827 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] placement.system_scope = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.827 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] placement.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.827 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] placement.trust_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.827 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] placement.user_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.827 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] placement.user_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.827 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] placement.user_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost 
nova_compute[238069]: 2025-10-14 09:41:40.828 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] placement.username = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.828 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] placement.valid_interfaces = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.828 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] placement.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.828 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] quota.cores = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.828 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.828 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] quota.driver = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.828 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.829 2 DEBUG oslo_service.service 
[None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.829 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] quota.injected_files = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.829 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] quota.instances = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.829 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] quota.key_pairs = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.829 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] quota.metadata_items = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.829 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] quota.ram = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.829 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] quota.recheck_quota = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.830 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] quota.server_group_members = 10 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.830 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] quota.server_groups = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.830 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] rdp.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.830 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] rdp.html5_proxy_base_url = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.830 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.830 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.831 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.831 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.831 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] scheduler.max_attempts = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.831 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.831 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.831 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.831 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.832 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.832 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] scheduler.workers = None 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.832 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.832 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.832 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.832 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.832 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.833 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.833 2 DEBUG 
oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.833 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.833 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.833 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.833 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.833 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.834 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] filter_scheduler.isolated_images = [] log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.834 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.834 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.834 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.834 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.834 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.834 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.835 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] 
filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.835 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.835 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.835 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.835 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.835 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] metrics.required = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.835 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] metrics.weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.835 2 DEBUG oslo_service.service [None 
req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] metrics.weight_of_unavailable = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.836 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] metrics.weight_setting = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.836 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] serial_console.base_url = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.836 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] serial_console.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.836 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] serial_console.port_range = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.836 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.836 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.837 2 DEBUG oslo_service.service [None 
req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.837 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.837 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] service_user.auth_type = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.837 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] service_user.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.837 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.837 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.837 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.838 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] service_user.keyfile = None 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.838 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.838 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.838 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] service_user.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.838 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] spice.agent_enabled = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.838 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] spice.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.838 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] spice.html5proxy_base_url = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.839 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] spice.html5proxy_host = 0.0.0.0 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.839 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] spice.html5proxy_port = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.839 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] spice.image_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.839 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] spice.jpeg_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.839 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] spice.playback_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.839 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] spice.server_listen = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.839 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.840 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] spice.streaming_mode = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 
localhost nova_compute[238069]: 2025-10-14 09:41:40.840 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] spice.zlib_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.840 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] upgrade_levels.baseapi = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.840 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] upgrade_levels.cert = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.840 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] upgrade_levels.compute = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.840 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] upgrade_levels.conductor = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.840 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] upgrade_levels.scheduler = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.841 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.841 2 DEBUG oslo_service.service 
[None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.841 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.841 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.841 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.841 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.841 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.841 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.842 2 DEBUG oslo_service.service [None 
req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.842 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] vmware.api_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.842 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] vmware.ca_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.842 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] vmware.cache_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.842 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] vmware.cluster_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.842 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] vmware.connection_pool_size = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.842 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] vmware.console_delay_seconds = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.843 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] vmware.datastore_regex = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.843 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] vmware.host_ip = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.843 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] vmware.host_password = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.843 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] vmware.host_port = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.843 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] vmware.host_username = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.843 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] vmware.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.843 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] vmware.integration_bridge = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.844 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] vmware.maximum_objects = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 
09:41:40.844 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] vmware.pbm_default_policy = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.844 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] vmware.pbm_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.844 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] vmware.pbm_wsdl_location = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.844 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] vmware.serial_log_dir = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.844 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] vmware.serial_port_proxy_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.844 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.845 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] vmware.task_poll_interval = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.845 2 DEBUG oslo_service.service [None 
req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] vmware.use_linked_clone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.845 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] vmware.vnc_keymap = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.845 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] vmware.vnc_port = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.845 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] vmware.vnc_port_total = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.845 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] vnc.auth_schemes = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.845 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] vnc.enabled = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.846 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] vnc.novncproxy_base_url = http://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.846 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - 
-] vnc.novncproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.846 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] vnc.novncproxy_port = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.846 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] vnc.server_listen = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.846 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] vnc.server_proxyclient_address = 192.168.122.108 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.847 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] vnc.vencrypt_ca_certs = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.847 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] vnc.vencrypt_client_cert = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.847 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] vnc.vencrypt_client_key = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.847 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] workarounds.disable_compute_service_check_for_ffu = True log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.847 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.847 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.847 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.848 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.848 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] workarounds.disable_rootwrap = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.848 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.848 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] 
workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.848 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.848 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.848 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.848 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.849 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.849 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.849 2 DEBUG oslo_service.service [None 
req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.849 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.849 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.849 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.849 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.850 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.850 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost 
nova_compute[238069]: 2025-10-14 09:41:40.850 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] wsgi.api_paste_config = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.850 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] wsgi.client_socket_timeout = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.850 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] wsgi.default_pool_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.850 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] wsgi.keep_alive = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.850 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] wsgi.max_header_line = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.851 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] wsgi.secure_proxy_ssl_header = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.851 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] wsgi.ssl_ca_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.851 2 DEBUG oslo_service.service [None 
req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] wsgi.ssl_cert_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.851 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] wsgi.ssl_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.851 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] wsgi.tcp_keepidle = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.851 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] wsgi.wsgi_log_format = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.851 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] zvm.ca_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.852 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] zvm.cloud_connector_url = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.852 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] zvm.image_tmp_path = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.852 2 DEBUG oslo_service.service [None 
req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] zvm.reachable_timeout = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.852 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.852 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] oslo_policy.enforce_scope = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.852 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.852 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] oslo_policy.policy_dirs = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.853 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] oslo_policy.policy_file = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.853 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.853 2 DEBUG oslo_service.service [None 
req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.853 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.853 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.853 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.853 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.854 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.854 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] remote_debug.host = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.854 2 DEBUG 
oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] remote_debug.port = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.854 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.854 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.854 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.854 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.855 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.855 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.855 2 
DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.855 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.855 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.855 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.855 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.856 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.856 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 
localhost nova_compute[238069]: 2025-10-14 09:41:40.856 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.856 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.856 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.857 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.857 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.857 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.857 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.857 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.857 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.857 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.858 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.858 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.858 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] oslo_messaging_rabbit.ssl = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.858 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] oslo_messaging_rabbit.ssl_ca_file = 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.858 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] oslo_messaging_rabbit.ssl_cert_file = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.858 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.859 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] oslo_messaging_rabbit.ssl_key_file = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.859 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] oslo_messaging_rabbit.ssl_version = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.859 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.859 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.859 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] oslo_messaging_notifications.topics = ['notifications'] 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.859 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.859 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] oslo_limit.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.860 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] oslo_limit.auth_type = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.860 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] oslo_limit.auth_url = http://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.860 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] oslo_limit.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.860 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] oslo_limit.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.860 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] oslo_limit.collect_timing = False log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.860 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] oslo_limit.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.860 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.861 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] oslo_limit.default_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.861 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.861 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] oslo_limit.domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.861 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] oslo_limit.domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.861 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] oslo_limit.endpoint_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 
05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.861 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] oslo_limit.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.861 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] oslo_limit.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.861 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] oslo_limit.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.862 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] oslo_limit.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.862 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] oslo_limit.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.862 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] oslo_limit.password = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.862 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] oslo_limit.project_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.862 2 DEBUG oslo_service.service [None 
req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.862 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] oslo_limit.project_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.862 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] oslo_limit.project_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.863 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] oslo_limit.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.863 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] oslo_limit.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.863 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] oslo_limit.service_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.863 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] oslo_limit.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.863 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] oslo_limit.status_code_retries = 
None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.863 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.863 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] oslo_limit.system_scope = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.864 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] oslo_limit.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.864 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] oslo_limit.trust_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.864 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] oslo_limit.user_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.864 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] oslo_limit.user_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.864 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] oslo_limit.user_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 
14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.864 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] oslo_limit.username = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.865 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] oslo_limit.valid_interfaces = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.865 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] oslo_limit.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.865 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.865 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.865 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] oslo_reports.log_dir = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.865 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 
2025-10-14 09:41:40.865 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.866 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.866 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.866 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.866 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.866 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.866 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] vif_plug_ovs_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 
05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.866 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.867 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.867 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.867 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] vif_plug_ovs_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.867 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.867 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.867 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] os_vif_linux_bridge.iptables_bottom_regex = log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.868 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.868 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] os_vif_linux_bridge.iptables_top_regex = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.868 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.868 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] os_vif_linux_bridge.use_ipv6 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.868 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.868 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] os_vif_ovs.isolate_vif = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.868 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] os_vif_ovs.network_device_mtu = 1500 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.869 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] os_vif_ovs.ovs_vsctl_timeout = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.869 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] os_vif_ovs.ovsdb_connection = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.869 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] os_vif_ovs.ovsdb_interface = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.869 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] os_vif_ovs.per_port_bridge = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.869 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] os_brick.lock_path = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.869 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.870 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.870 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] privsep_osbrick.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.870 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] privsep_osbrick.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.870 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.870 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] privsep_osbrick.logger_name = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.870 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.871 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] privsep_osbrick.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.871 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] nova_sys_admin.capabilities = [0, 1, 2, 3, 12, 21] log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.871 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] nova_sys_admin.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.871 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] nova_sys_admin.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.871 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] nova_sys_admin.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.871 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.872 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] nova_sys_admin.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.872 2 DEBUG oslo_service.service [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.873 2 INFO nova.service [-] Starting compute node (version 27.5.2-0.20250829104910.6f8decf.el9)#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 
09:41:40.888 2 INFO nova.virt.node [None req-e884ec6a-dc7f-475b-964b-d7fa652c3f78 - - - - - -] Determined node identity 18c24273-aca2-4f08-be57-3188d558235e from /var/lib/nova/compute_id#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.889 2 DEBUG nova.virt.libvirt.host [None req-e884ec6a-dc7f-475b-964b-d7fa652c3f78 - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.889 2 DEBUG nova.virt.libvirt.host [None req-e884ec6a-dc7f-475b-964b-d7fa652c3f78 - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.889 2 DEBUG nova.virt.libvirt.host [None req-e884ec6a-dc7f-475b-964b-d7fa652c3f78 - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.890 2 DEBUG nova.virt.libvirt.host [None req-e884ec6a-dc7f-475b-964b-d7fa652c3f78 - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.898 2 DEBUG nova.virt.libvirt.host [None req-e884ec6a-dc7f-475b-964b-d7fa652c3f78 - - - - - -] Registering for lifecycle events _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.900 2 DEBUG nova.virt.libvirt.host [None req-e884ec6a-dc7f-475b-964b-d7fa652c3f78 - - - - - -] Registering for connection events: _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.901 2 INFO 
nova.virt.libvirt.driver [None req-e884ec6a-dc7f-475b-964b-d7fa652c3f78 - - - - - -] Connection event '1' reason 'None'#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.908 2 INFO nova.virt.libvirt.host [None req-e884ec6a-dc7f-475b-964b-d7fa652c3f78 - - - - - -] Libvirt host capabilities [capabilities XML dump; markup stripped in capture, leaving only text nodes interleaved with repeated syslog prefixes. Recoverable fields: host UUID 1e17686e-e9d9-4f56-ae5b-e175ec048439; arch x86_64; CPU model EPYC-Rome-v4, vendor AMD; migration transports tcp, rdma; memory 16116612, pages 4029153; secmodels selinux (doi 0, baselabels system_u:system_r:svirt_t:s0, system_u:system_r:svirt_tcg_t:s0) and dac (doi 0, +107:+107); hvm guests at wordsize 32 and 64 via emulator /usr/libexec/qemu-kvm, machine types pc-i440fx-rhel7.6.0 (alias pc) and pc-q35-rhel7.6.0 through pc-q35-rhel9.6.0 (alias q35), plus intermediate pc-q35-rhel8.x/9.x variants; remainder of dump truncated] Oct 14 05:41:40 localhost
nova_compute[238069]: q35 Oct 14 05:41:40 localhost nova_compute[238069]: pc-q35-rhel8.6.0 Oct 14 05:41:40 localhost nova_compute[238069]: pc-q35-rhel9.4.0 Oct 14 05:41:40 localhost nova_compute[238069]: pc-q35-rhel8.5.0 Oct 14 05:41:40 localhost nova_compute[238069]: pc-q35-rhel8.3.0 Oct 14 05:41:40 localhost nova_compute[238069]: pc-q35-rhel7.6.0 Oct 14 05:41:40 localhost nova_compute[238069]: pc-q35-rhel8.4.0 Oct 14 05:41:40 localhost nova_compute[238069]: pc-q35-rhel9.2.0 Oct 14 05:41:40 localhost nova_compute[238069]: pc-q35-rhel8.2.0 Oct 14 05:41:40 localhost nova_compute[238069]: pc-q35-rhel9.0.0 Oct 14 05:41:40 localhost nova_compute[238069]: pc-q35-rhel8.0.0 Oct 14 05:41:40 localhost nova_compute[238069]: pc-q35-rhel8.1.0 Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: #033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.913 2 DEBUG nova.virt.libvirt.volume.mount [None req-e884ec6a-dc7f-475b-964b-d7fa652c3f78 - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.916 2 DEBUG nova.virt.libvirt.host [None req-e884ec6a-dc7f-475b-964b-d7fa652c3f78 - - - - - -] Getting domain capabilities for i686 via machine types: {'q35', 
'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.919 2 DEBUG nova.virt.libvirt.host [None req-e884ec6a-dc7f-475b-964b-d7fa652c3f78 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: /usr/libexec/qemu-kvm Oct 14 05:41:40 localhost nova_compute[238069]: kvm Oct 14 05:41:40 localhost nova_compute[238069]: pc-q35-rhel9.6.0 Oct 14 05:41:40 localhost nova_compute[238069]: i686 Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: /usr/share/OVMF/OVMF_CODE.secboot.fd Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: rom Oct 14 05:41:40 localhost nova_compute[238069]: pflash Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: yes Oct 14 05:41:40 localhost nova_compute[238069]: no Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: no Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: on Oct 14 05:41:40 localhost nova_compute[238069]: off Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 
localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: on Oct 14 05:41:40 localhost nova_compute[238069]: off Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: EPYC-Rome Oct 14 05:41:40 localhost nova_compute[238069]: AMD Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: 486 Oct 14 05:41:40 localhost nova_compute[238069]: 486-v1 Oct 14 05:41:40 localhost nova_compute[238069]: Broadwell Oct 14 05:41:40 localhost nova_compute[238069]: Oct 
14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Broadwell-IBRS Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Broadwell-noTSX Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Broadwell-noTSX-IBRS Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Broadwell-v1 Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Broadwell-v2 Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 
14 05:41:40 localhost nova_compute[238069]: Broadwell-v3 Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Broadwell-v4 Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Cascadelake-Server Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Cascadelake-Server-noTSX Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 
05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Cascadelake-Server-v1 Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Cascadelake-Server-v2 Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Cascadelake-Server-v3 Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 
localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Cascadelake-Server-v4 Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Cascadelake-Server-v5 Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Conroe Oct 14 05:41:40 localhost nova_compute[238069]: Conroe-v1 Oct 14 05:41:40 localhost nova_compute[238069]: Cooperlake Oct 14 05:41:40 
localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Cooperlake-v1 Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Cooperlake-v2 Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 
localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Denverton Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Denverton-v1 Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Denverton-v2 Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Denverton-v3 Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Dhyana Oct 14 05:41:40 localhost nova_compute[238069]: Dhyana-v1 Oct 14 05:41:40 localhost nova_compute[238069]: Dhyana-v2 Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: EPYC Oct 14 05:41:40 localhost nova_compute[238069]: EPYC-Genoa Oct 
14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: EPYC-Genoa-v1 Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 
localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: EPYC-IBPB Oct 14 05:41:40 localhost nova_compute[238069]: EPYC-Milan Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: EPYC-Milan-v1 Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: EPYC-Milan-v2 Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost 
nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: EPYC-Rome Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: EPYC-Rome-v1 Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: EPYC-Rome-v2 Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: EPYC-Rome-v3 Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: EPYC-Rome-v4 Oct 14 05:41:40 localhost nova_compute[238069]: EPYC-v1 Oct 14 05:41:40 localhost nova_compute[238069]: EPYC-v2 Oct 14 05:41:40 localhost nova_compute[238069]: EPYC-v3 Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: EPYC-v4 Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: 
GraniteRapids Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 
Oct 14 05:41:40 localhost nova_compute[238069]: GraniteRapids-v1
Oct 14 05:41:40 localhost nova_compute[238069]: GraniteRapids-v2
Oct 14 05:41:40 localhost nova_compute[238069]: Haswell
Oct 14 05:41:40 localhost nova_compute[238069]: Haswell-IBRS
Oct 14 05:41:40 localhost nova_compute[238069]: Haswell-noTSX
Oct 14 05:41:40 localhost nova_compute[238069]: Haswell-noTSX-IBRS
Oct 14 05:41:40 localhost nova_compute[238069]: Haswell-v1
Oct 14 05:41:40 localhost nova_compute[238069]: Haswell-v2
Oct 14 05:41:40 localhost nova_compute[238069]: Haswell-v3
Oct 14 05:41:40 localhost nova_compute[238069]: Haswell-v4
Oct 14 05:41:40 localhost nova_compute[238069]: Icelake-Server
Oct 14 05:41:40 localhost nova_compute[238069]: Icelake-Server-noTSX
Oct 14 05:41:40 localhost nova_compute[238069]: Icelake-Server-v1
Oct 14 05:41:40 localhost nova_compute[238069]: Icelake-Server-v2
Oct 14 05:41:40 localhost nova_compute[238069]: Icelake-Server-v3
Oct 14 05:41:40 localhost nova_compute[238069]: Icelake-Server-v4
Oct 14 05:41:40 localhost nova_compute[238069]: Icelake-Server-v5
Oct 14 05:41:40 localhost nova_compute[238069]: Icelake-Server-v6
Oct 14 05:41:40 localhost nova_compute[238069]: Icelake-Server-v7
Oct 14 05:41:40 localhost nova_compute[238069]: IvyBridge
Oct 14 05:41:40 localhost nova_compute[238069]: IvyBridge-IBRS
Oct 14 05:41:40 localhost nova_compute[238069]: IvyBridge-v1
Oct 14 05:41:40 localhost nova_compute[238069]: IvyBridge-v2
Oct 14 05:41:40 localhost nova_compute[238069]: KnightsMill
Oct 14 05:41:40 localhost nova_compute[238069]: KnightsMill-v1
Oct 14 05:41:40 localhost nova_compute[238069]: Nehalem
Oct 14 05:41:40 localhost nova_compute[238069]: Nehalem-IBRS
Oct 14 05:41:40 localhost nova_compute[238069]: Nehalem-v1
Oct 14 05:41:40 localhost nova_compute[238069]: Nehalem-v2
Oct 14 05:41:40 localhost nova_compute[238069]: Opteron_G1
Oct 14 05:41:40 localhost nova_compute[238069]: Opteron_G1-v1
Oct 14 05:41:40 localhost nova_compute[238069]: Opteron_G2
Oct 14 05:41:40 localhost nova_compute[238069]: Opteron_G2-v1
Oct 14 05:41:40 localhost nova_compute[238069]: Opteron_G3
Oct 14 05:41:40 localhost nova_compute[238069]: Opteron_G3-v1
Oct 14 05:41:40 localhost nova_compute[238069]: Opteron_G4
Oct 14 05:41:40 localhost nova_compute[238069]: Opteron_G4-v1
Oct 14 05:41:40 localhost nova_compute[238069]: Opteron_G5
Oct 14 05:41:40 localhost nova_compute[238069]: Opteron_G5-v1
Oct 14 05:41:40 localhost nova_compute[238069]: Penryn
Oct 14 05:41:40 localhost nova_compute[238069]: Penryn-v1
Oct 14 05:41:40 localhost nova_compute[238069]: SandyBridge
Oct 14 05:41:40 localhost nova_compute[238069]: SandyBridge-IBRS
Oct 14 05:41:40 localhost nova_compute[238069]: SandyBridge-v1
Oct 14 05:41:40 localhost nova_compute[238069]: SandyBridge-v2
Oct 14 05:41:40 localhost nova_compute[238069]: SapphireRapids
Oct 14 05:41:40 localhost nova_compute[238069]: SapphireRapids-v1
Oct 14 05:41:40 localhost nova_compute[238069]: SapphireRapids-v2
Oct 14 05:41:40 localhost nova_compute[238069]: SapphireRapids-v3
Oct 14 05:41:40 localhost nova_compute[238069]: SierraForest
Oct 14 05:41:40 localhost nova_compute[238069]: SierraForest-v1
Oct 14 05:41:40 localhost nova_compute[238069]: Skylake-Client
Oct 14 05:41:40 localhost nova_compute[238069]: Skylake-Client-IBRS
Oct 14 05:41:40 localhost nova_compute[238069]: Skylake-Client-noTSX-IBRS
Oct 14 05:41:40 localhost nova_compute[238069]: Skylake-Client-v1
Oct 14 05:41:40 localhost nova_compute[238069]: Skylake-Client-v2
Oct 14 05:41:40 localhost nova_compute[238069]: Skylake-Client-v3
Oct 14 05:41:40 localhost nova_compute[238069]: Skylake-Client-v4
Oct 14 05:41:40 localhost nova_compute[238069]: Skylake-Server
Oct 14 05:41:40 localhost nova_compute[238069]: Skylake-Server-IBRS
Oct 14 05:41:40 localhost nova_compute[238069]: Skylake-Server-noTSX-IBRS
Oct 14 05:41:40 localhost nova_compute[238069]: Skylake-Server-v1
Oct 14 05:41:40 localhost nova_compute[238069]: Skylake-Server-v2
Oct 14 05:41:40 localhost nova_compute[238069]: Skylake-Server-v3
Oct 14 05:41:40 localhost nova_compute[238069]: Skylake-Server-v4
Oct 14 05:41:40 localhost nova_compute[238069]: Skylake-Server-v5
Oct 14 05:41:40 localhost nova_compute[238069]: Snowridge
05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Snowridge-v1 Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Snowridge-v2 Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Snowridge-v3 Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Snowridge-v4 Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost 
nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Westmere Oct 14 05:41:40 localhost nova_compute[238069]: Westmere-IBRS Oct 14 05:41:40 localhost nova_compute[238069]: Westmere-v1 Oct 14 05:41:40 localhost nova_compute[238069]: Westmere-v2 Oct 14 05:41:40 localhost nova_compute[238069]: athlon Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: athlon-v1 Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: core2duo Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: core2duo-v1 Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: coreduo Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: coreduo-v1 Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: kvm32 Oct 14 05:41:40 localhost nova_compute[238069]: kvm32-v1 Oct 14 05:41:40 localhost nova_compute[238069]: kvm64 Oct 14 05:41:40 localhost nova_compute[238069]: kvm64-v1 Oct 14 05:41:40 localhost nova_compute[238069]: n270 Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 
05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: n270-v1 Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: pentium Oct 14 05:41:40 localhost nova_compute[238069]: pentium-v1 Oct 14 05:41:40 localhost nova_compute[238069]: pentium2 Oct 14 05:41:40 localhost nova_compute[238069]: pentium2-v1 Oct 14 05:41:40 localhost nova_compute[238069]: pentium3 Oct 14 05:41:40 localhost nova_compute[238069]: pentium3-v1 Oct 14 05:41:40 localhost nova_compute[238069]: phenom Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: phenom-v1 Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: qemu32 Oct 14 05:41:40 localhost nova_compute[238069]: qemu32-v1 Oct 14 05:41:40 localhost nova_compute[238069]: qemu64 Oct 14 05:41:40 localhost nova_compute[238069]: qemu64-v1 Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: file Oct 14 05:41:40 localhost nova_compute[238069]: anonymous Oct 14 05:41:40 localhost nova_compute[238069]: memfd Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost 
nova_compute[238069]: disk Oct 14 05:41:40 localhost nova_compute[238069]: cdrom Oct 14 05:41:40 localhost nova_compute[238069]: floppy Oct 14 05:41:40 localhost nova_compute[238069]: lun Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: fdc Oct 14 05:41:40 localhost nova_compute[238069]: scsi Oct 14 05:41:40 localhost nova_compute[238069]: virtio Oct 14 05:41:40 localhost nova_compute[238069]: usb Oct 14 05:41:40 localhost nova_compute[238069]: sata Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: virtio Oct 14 05:41:40 localhost nova_compute[238069]: virtio-transitional Oct 14 05:41:40 localhost nova_compute[238069]: virtio-non-transitional Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: vnc Oct 14 05:41:40 localhost nova_compute[238069]: egl-headless Oct 14 05:41:40 localhost nova_compute[238069]: dbus Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: subsystem Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: default Oct 14 05:41:40 localhost nova_compute[238069]: mandatory Oct 14 05:41:40 localhost nova_compute[238069]: requisite Oct 14 05:41:40 localhost nova_compute[238069]: optional Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: usb Oct 14 05:41:40 localhost 
nova_compute[238069]: pci Oct 14 05:41:40 localhost nova_compute[238069]: scsi Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: virtio Oct 14 05:41:40 localhost nova_compute[238069]: virtio-transitional Oct 14 05:41:40 localhost nova_compute[238069]: virtio-non-transitional Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: random Oct 14 05:41:40 localhost nova_compute[238069]: egd Oct 14 05:41:40 localhost nova_compute[238069]: builtin Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: path Oct 14 05:41:40 localhost nova_compute[238069]: handle Oct 14 05:41:40 localhost nova_compute[238069]: virtiofs Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: tpm-tis Oct 14 05:41:40 localhost nova_compute[238069]: tpm-crb Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: emulator Oct 14 05:41:40 localhost nova_compute[238069]: external Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: 2.0 Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 
localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: usb Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: pty Oct 14 05:41:40 localhost nova_compute[238069]: unix Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: qemu Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: builtin Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: default Oct 14 05:41:40 localhost nova_compute[238069]: passt Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: isa Oct 14 05:41:40 localhost nova_compute[238069]: hyperv Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost 
nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: relaxed Oct 14 05:41:40 localhost nova_compute[238069]: vapic Oct 14 05:41:40 localhost nova_compute[238069]: spinlocks Oct 14 05:41:40 localhost nova_compute[238069]: vpindex Oct 14 05:41:40 localhost nova_compute[238069]: runtime Oct 14 05:41:40 localhost nova_compute[238069]: synic Oct 14 05:41:40 localhost nova_compute[238069]: stimer Oct 14 05:41:40 localhost nova_compute[238069]: reset Oct 14 05:41:40 localhost nova_compute[238069]: vendor_id Oct 14 05:41:40 localhost nova_compute[238069]: frequencies Oct 14 05:41:40 localhost nova_compute[238069]: reenlightenment Oct 14 05:41:40 localhost nova_compute[238069]: tlbflush Oct 14 05:41:40 localhost nova_compute[238069]: ipi Oct 14 05:41:40 localhost nova_compute[238069]: avic Oct 14 05:41:40 localhost nova_compute[238069]: emsr_bitmap Oct 14 05:41:40 localhost nova_compute[238069]: xmm_input Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m Oct 14 05:41:40 localhost nova_compute[238069]: 2025-10-14 09:41:40.925 2 DEBUG nova.virt.libvirt.host [None req-e884ec6a-dc7f-475b-964b-d7fa652c3f78 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: /usr/libexec/qemu-kvm Oct 14 05:41:40 localhost nova_compute[238069]: kvm Oct 14 05:41:40 localhost nova_compute[238069]: pc-i440fx-rhel7.6.0 Oct 14 05:41:40 localhost nova_compute[238069]: i686 Oct 14 05:41:40 
localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: /usr/share/OVMF/OVMF_CODE.secboot.fd Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: rom Oct 14 05:41:40 localhost nova_compute[238069]: pflash Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: yes Oct 14 05:41:40 localhost nova_compute[238069]: no Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: no Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: on Oct 14 05:41:40 localhost nova_compute[238069]: off Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: on Oct 14 05:41:40 localhost nova_compute[238069]: off Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: EPYC-Rome Oct 14 05:41:40 localhost nova_compute[238069]: AMD Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 
05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: 486 Oct 14 05:41:40 localhost nova_compute[238069]: 486-v1 Oct 14 05:41:40 localhost nova_compute[238069]: Broadwell Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Broadwell-IBRS Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: 
Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Broadwell-noTSX Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Broadwell-noTSX-IBRS Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Broadwell-v1 Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Broadwell-v2 Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Broadwell-v3 Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Broadwell-v4 Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost 
nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Cascadelake-Server Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Cascadelake-Server-noTSX Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Cascadelake-Server-v1 Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost 
nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Cascadelake-Server-v2 Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Cascadelake-Server-v3 Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Cascadelake-Server-v4 Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost 
nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Cascadelake-Server-v5 Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Conroe Oct 14 05:41:40 localhost nova_compute[238069]: Conroe-v1 Oct 14 05:41:40 localhost nova_compute[238069]: Cooperlake Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 05:41:40 localhost nova_compute[238069]: Oct 14 
Oct 14 05:41:40 localhost nova_compute[238069]: Cooperlake-v1
Oct 14 05:41:40 localhost nova_compute[238069]: Cooperlake-v2
Oct 14 05:41:40 localhost nova_compute[238069]: Denverton
Oct 14 05:41:40 localhost nova_compute[238069]: Denverton-v1
Oct 14 05:41:40 localhost nova_compute[238069]: Denverton-v2
Oct 14 05:41:40 localhost nova_compute[238069]: Denverton-v3
Oct 14 05:41:40 localhost nova_compute[238069]: Dhyana
Oct 14 05:41:40 localhost nova_compute[238069]: Dhyana-v1
Oct 14 05:41:40 localhost nova_compute[238069]: Dhyana-v2
Oct 14 05:41:40 localhost nova_compute[238069]: EPYC
Oct 14 05:41:40 localhost nova_compute[238069]: EPYC-Genoa
Oct 14 05:41:40 localhost nova_compute[238069]: EPYC-Genoa-v1
Oct 14 05:41:40 localhost nova_compute[238069]: EPYC-IBPB
Oct 14 05:41:40 localhost nova_compute[238069]: EPYC-Milan
Oct 14 05:41:40 localhost nova_compute[238069]: EPYC-Milan-v1
Oct 14 05:41:40 localhost nova_compute[238069]: EPYC-Milan-v2
Oct 14 05:41:40 localhost nova_compute[238069]: EPYC-Rome
Oct 14 05:41:40 localhost nova_compute[238069]: EPYC-Rome-v1
Oct 14 05:41:40 localhost nova_compute[238069]: EPYC-Rome-v2
Oct 14 05:41:40 localhost nova_compute[238069]: EPYC-Rome-v3
Oct 14 05:41:40 localhost nova_compute[238069]: EPYC-Rome-v4
Oct 14 05:41:40 localhost nova_compute[238069]: EPYC-v1
Oct 14 05:41:40 localhost nova_compute[238069]: EPYC-v2
Oct 14 05:41:40 localhost nova_compute[238069]: EPYC-v3
Oct 14 05:41:40 localhost nova_compute[238069]: EPYC-v4
Oct 14 05:41:40 localhost nova_compute[238069]: GraniteRapids
Oct 14 05:41:40 localhost nova_compute[238069]: GraniteRapids-v1
Oct 14 05:41:40 localhost nova_compute[238069]: GraniteRapids-v2
Oct 14 05:41:40 localhost nova_compute[238069]: Haswell
Oct 14 05:41:40 localhost nova_compute[238069]: Haswell-IBRS
Oct 14 05:41:40 localhost nova_compute[238069]: Haswell-noTSX
Oct 14 05:41:40 localhost nova_compute[238069]: Haswell-noTSX-IBRS
Oct 14 05:41:40 localhost nova_compute[238069]: Haswell-v1
Oct 14 05:41:40 localhost nova_compute[238069]: Haswell-v2
Oct 14 05:41:40 localhost nova_compute[238069]: Haswell-v3
Oct 14 05:41:40 localhost nova_compute[238069]: Haswell-v4
Oct 14 05:41:40 localhost nova_compute[238069]: Icelake-Server
Oct 14 05:41:40 localhost nova_compute[238069]: Icelake-Server-noTSX
Oct 14 05:41:40 localhost nova_compute[238069]: Icelake-Server-v1
Oct 14 05:41:40 localhost nova_compute[238069]: Icelake-Server-v2
Oct 14 05:41:40 localhost nova_compute[238069]: Icelake-Server-v3
Oct 14 05:41:40 localhost nova_compute[238069]: Icelake-Server-v4
Oct 14 05:41:40 localhost nova_compute[238069]: Icelake-Server-v5
Oct 14 05:41:40 localhost nova_compute[238069]: Icelake-Server-v6
Oct 14 05:41:40 localhost nova_compute[238069]: Icelake-Server-v7
Oct 14 05:41:40 localhost nova_compute[238069]: IvyBridge
Oct 14 05:41:41 localhost nova_compute[238069]: IvyBridge-IBRS
Oct 14 05:41:41 localhost nova_compute[238069]: IvyBridge-v1
Oct 14 05:41:41 localhost nova_compute[238069]: IvyBridge-v2
Oct 14 05:41:41 localhost nova_compute[238069]: KnightsMill
Oct 14 05:41:41 localhost nova_compute[238069]: KnightsMill-v1
Oct 14 05:41:41 localhost nova_compute[238069]: Nehalem
Oct 14 05:41:41 localhost nova_compute[238069]: Nehalem-IBRS
Oct 14 05:41:41 localhost nova_compute[238069]: Nehalem-v1
Oct 14 05:41:41 localhost nova_compute[238069]: Nehalem-v2
Oct 14 05:41:41 localhost nova_compute[238069]: Opteron_G1
Oct 14 05:41:41 localhost nova_compute[238069]: Opteron_G1-v1
Oct 14 05:41:41 localhost nova_compute[238069]: Opteron_G2
Oct 14 05:41:41 localhost nova_compute[238069]: Opteron_G2-v1
Oct 14 05:41:41 localhost nova_compute[238069]: Opteron_G3
Oct 14 05:41:41 localhost nova_compute[238069]: Opteron_G3-v1
Oct 14 05:41:41 localhost nova_compute[238069]: Opteron_G4
Oct 14 05:41:41 localhost nova_compute[238069]: Opteron_G4-v1
Oct 14 05:41:41 localhost nova_compute[238069]: Opteron_G5
Oct 14 05:41:41 localhost nova_compute[238069]: Opteron_G5-v1
Oct 14 05:41:41 localhost nova_compute[238069]: Penryn
Oct 14 05:41:41 localhost nova_compute[238069]: Penryn-v1
Oct 14 05:41:41 localhost nova_compute[238069]: SandyBridge
Oct 14 05:41:41 localhost nova_compute[238069]: SandyBridge-IBRS
Oct 14 05:41:41 localhost nova_compute[238069]: SandyBridge-v1
Oct 14 05:41:41 localhost nova_compute[238069]: SandyBridge-v2
Oct 14 05:41:41 localhost nova_compute[238069]: SapphireRapids
Oct 14 05:41:41 localhost nova_compute[238069]: SapphireRapids-v1
Oct 14 05:41:41 localhost nova_compute[238069]: SapphireRapids-v2
Oct 14 05:41:41 localhost nova_compute[238069]: SapphireRapids-v3
nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: SierraForest Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: SierraForest-v1 Oct 14 05:41:41 localhost 
nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Skylake-Client Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Skylake-Client-IBRS Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 
localhost nova_compute[238069]: Skylake-Client-noTSX-IBRS Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Skylake-Client-v1 Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Skylake-Client-v2 Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Skylake-Client-v3 Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Skylake-Client-v4 Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Skylake-Server Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost 
nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Skylake-Server-IBRS Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Skylake-Server-noTSX-IBRS Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Skylake-Server-v1 Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost 
nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Skylake-Server-v2 Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Skylake-Server-v3 Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Skylake-Server-v4 Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 
14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Skylake-Server-v5 Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Snowridge Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Snowridge-v1 Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost 
nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Snowridge-v2 Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Snowridge-v3 Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Snowridge-v4 Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Westmere Oct 14 05:41:41 localhost nova_compute[238069]: Westmere-IBRS Oct 14 05:41:41 localhost nova_compute[238069]: Westmere-v1 Oct 14 05:41:41 localhost nova_compute[238069]: Westmere-v2 Oct 14 05:41:41 localhost nova_compute[238069]: athlon Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 
05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: athlon-v1 Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: core2duo Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: core2duo-v1 Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: coreduo Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: coreduo-v1 Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: kvm32 Oct 14 05:41:41 localhost nova_compute[238069]: kvm32-v1 Oct 14 05:41:41 localhost nova_compute[238069]: kvm64 Oct 14 05:41:41 localhost nova_compute[238069]: kvm64-v1 Oct 14 05:41:41 localhost nova_compute[238069]: n270 Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: n270-v1 Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: pentium Oct 14 05:41:41 localhost nova_compute[238069]: pentium-v1 Oct 14 05:41:41 localhost nova_compute[238069]: pentium2 Oct 14 05:41:41 localhost nova_compute[238069]: pentium2-v1 Oct 14 05:41:41 localhost nova_compute[238069]: 
pentium3 Oct 14 05:41:41 localhost nova_compute[238069]: pentium3-v1 Oct 14 05:41:41 localhost nova_compute[238069]: phenom Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: phenom-v1 Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: qemu32 Oct 14 05:41:41 localhost nova_compute[238069]: qemu32-v1 Oct 14 05:41:41 localhost nova_compute[238069]: qemu64 Oct 14 05:41:41 localhost nova_compute[238069]: qemu64-v1 Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: file Oct 14 05:41:41 localhost nova_compute[238069]: anonymous Oct 14 05:41:41 localhost nova_compute[238069]: memfd Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: disk Oct 14 05:41:41 localhost nova_compute[238069]: cdrom Oct 14 05:41:41 localhost nova_compute[238069]: floppy Oct 14 05:41:41 localhost nova_compute[238069]: lun Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: ide Oct 14 05:41:41 localhost nova_compute[238069]: fdc Oct 14 05:41:41 localhost nova_compute[238069]: scsi Oct 14 05:41:41 localhost nova_compute[238069]: virtio Oct 14 05:41:41 localhost nova_compute[238069]: usb Oct 14 05:41:41 localhost 
nova_compute[238069]: sata Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: virtio Oct 14 05:41:41 localhost nova_compute[238069]: virtio-transitional Oct 14 05:41:41 localhost nova_compute[238069]: virtio-non-transitional Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: vnc Oct 14 05:41:41 localhost nova_compute[238069]: egl-headless Oct 14 05:41:41 localhost nova_compute[238069]: dbus Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: subsystem Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: default Oct 14 05:41:41 localhost nova_compute[238069]: mandatory Oct 14 05:41:41 localhost nova_compute[238069]: requisite Oct 14 05:41:41 localhost nova_compute[238069]: optional Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: usb Oct 14 05:41:41 localhost nova_compute[238069]: pci Oct 14 05:41:41 localhost nova_compute[238069]: scsi Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: virtio Oct 14 05:41:41 localhost nova_compute[238069]: virtio-transitional Oct 14 05:41:41 localhost 
nova_compute[238069]: virtio-non-transitional Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: random Oct 14 05:41:41 localhost nova_compute[238069]: egd Oct 14 05:41:41 localhost nova_compute[238069]: builtin Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: path Oct 14 05:41:41 localhost nova_compute[238069]: handle Oct 14 05:41:41 localhost nova_compute[238069]: virtiofs Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: tpm-tis Oct 14 05:41:41 localhost nova_compute[238069]: tpm-crb Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: emulator Oct 14 05:41:41 localhost nova_compute[238069]: external Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: 2.0 Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: usb Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: pty Oct 14 05:41:41 localhost nova_compute[238069]: unix Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost 
nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: qemu Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: builtin Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: default Oct 14 05:41:41 localhost nova_compute[238069]: passt Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: isa Oct 14 05:41:41 localhost nova_compute[238069]: hyperv Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: relaxed Oct 14 05:41:41 localhost nova_compute[238069]: vapic Oct 14 05:41:41 localhost nova_compute[238069]: spinlocks Oct 14 05:41:41 localhost nova_compute[238069]: vpindex Oct 14 05:41:41 localhost nova_compute[238069]: runtime Oct 14 05:41:41 localhost nova_compute[238069]: synic Oct 14 05:41:41 
localhost nova_compute[238069]: stimer Oct 14 05:41:41 localhost nova_compute[238069]: reset Oct 14 05:41:41 localhost nova_compute[238069]: vendor_id Oct 14 05:41:41 localhost nova_compute[238069]: frequencies Oct 14 05:41:41 localhost nova_compute[238069]: reenlightenment Oct 14 05:41:41 localhost nova_compute[238069]: tlbflush Oct 14 05:41:41 localhost nova_compute[238069]: ipi Oct 14 05:41:41 localhost nova_compute[238069]: avic Oct 14 05:41:41 localhost nova_compute[238069]: emsr_bitmap Oct 14 05:41:41 localhost nova_compute[238069]: xmm_input Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m Oct 14 05:41:41 localhost nova_compute[238069]: 2025-10-14 09:41:40.948 2 DEBUG nova.virt.libvirt.host [None req-e884ec6a-dc7f-475b-964b-d7fa652c3f78 - - - - - -] Getting domain capabilities for x86_64 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952#033[00m Oct 14 05:41:41 localhost nova_compute[238069]: 2025-10-14 09:41:40.953 2 DEBUG nova.virt.libvirt.host [None req-e884ec6a-dc7f-475b-964b-d7fa652c3f78 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: /usr/libexec/qemu-kvm Oct 14 05:41:41 localhost nova_compute[238069]: kvm Oct 14 05:41:41 localhost nova_compute[238069]: pc-q35-rhel9.6.0 Oct 14 05:41:41 localhost nova_compute[238069]: x86_64 Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 
05:41:41 localhost nova_compute[238069]: firmware: efi | loader paths: /usr/share/edk2/ovmf/OVMF_CODE.secboot.fd /usr/share/edk2/ovmf/OVMF_CODE.fd /usr/share/edk2/ovmf/OVMF.amdsev.fd /usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd | loader types: rom pflash | yes no | yes no | on off | on off
Oct 14 05:41:41 localhost nova_compute[238069]: host CPU model: EPYC-Rome | vendor: AMD
Oct 14 05:41:41 localhost nova_compute[238069]: CPU models: 486 486-v1 Broadwell Broadwell-IBRS Broadwell-noTSX Broadwell-noTSX-IBRS Broadwell-v1 Broadwell-v2 Broadwell-v3 Broadwell-v4 Cascadelake-Server Cascadelake-Server-noTSX Cascadelake-Server-v1
05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Cascadelake-Server-v2 Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Cascadelake-Server-v3 Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost 
nova_compute[238069]: Cascadelake-Server-v4 Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Cascadelake-Server-v5 Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Conroe Oct 14 05:41:41 localhost nova_compute[238069]: Conroe-v1 Oct 14 05:41:41 localhost nova_compute[238069]: Cooperlake Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost 
nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Cooperlake-v1 Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Cooperlake-v2 Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost 
nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Denverton Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Denverton-v1 Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Denverton-v2 Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Denverton-v3 Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Dhyana Oct 14 05:41:41 localhost nova_compute[238069]: Dhyana-v1 Oct 14 05:41:41 localhost nova_compute[238069]: Dhyana-v2 Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: EPYC Oct 14 05:41:41 localhost nova_compute[238069]: EPYC-Genoa Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 
05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: EPYC-Genoa-v1 Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 
localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: EPYC-IBPB Oct 14 05:41:41 localhost nova_compute[238069]: EPYC-Milan Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: EPYC-Milan-v1 Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: EPYC-Milan-v2 Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost 
nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: EPYC-Rome Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: EPYC-Rome-v1 Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: EPYC-Rome-v2 Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: EPYC-Rome-v3 Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: EPYC-Rome-v4 Oct 14 05:41:41 localhost nova_compute[238069]: EPYC-v1 Oct 14 05:41:41 localhost nova_compute[238069]: EPYC-v2 Oct 14 05:41:41 localhost nova_compute[238069]: EPYC-v3 Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: EPYC-v4 Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: GraniteRapids Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost 
nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: GraniteRapids-v1 Oct 14 05:41:41 localhost 
nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 
05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: GraniteRapids-v2 Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 
localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Haswell Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Haswell-IBRS Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Haswell-noTSX Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 
05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Haswell-noTSX-IBRS Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Haswell-v1 Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Haswell-v2 Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Haswell-v3 Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Haswell-v4 Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Icelake-Server Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 
05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Icelake-Server-noTSX Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Icelake-Server-v1 Oct 14 05:41:41 localhost nova_compute[238069]: 
Oct 14 05:41:41 localhost nova_compute[238069]: [log capture stripped the XML markup from a libvirt domain-capabilities dump; only the text values below are recoverable, and the group labels are inferred from the libvirt domainCapabilities schema]
Oct 14 05:41:41 localhost nova_compute[238069]: CPU models: Icelake-Server-v2 Icelake-Server-v3 Icelake-Server-v4 Icelake-Server-v5 Icelake-Server-v6 Icelake-Server-v7 IvyBridge IvyBridge-IBRS IvyBridge-v1 IvyBridge-v2 KnightsMill KnightsMill-v1 Nehalem Nehalem-IBRS Nehalem-v1 Nehalem-v2 Opteron_G1 Opteron_G1-v1 Opteron_G2 Opteron_G2-v1 Opteron_G3 Opteron_G3-v1 Opteron_G4 Opteron_G4-v1 Opteron_G5 Opteron_G5-v1 Penryn Penryn-v1 SandyBridge SandyBridge-IBRS SandyBridge-v1 SandyBridge-v2 SapphireRapids SapphireRapids-v1 SapphireRapids-v2 SapphireRapids-v3 SierraForest SierraForest-v1 Skylake-Client Skylake-Client-IBRS Skylake-Client-noTSX-IBRS Skylake-Client-v1 Skylake-Client-v2 Skylake-Client-v3 Skylake-Client-v4 Skylake-Server Skylake-Server-IBRS Skylake-Server-noTSX-IBRS Skylake-Server-v1 Skylake-Server-v2 Skylake-Server-v3 Skylake-Server-v4 Skylake-Server-v5 Snowridge Snowridge-v1 Snowridge-v2 Snowridge-v3 Snowridge-v4 Westmere Westmere-IBRS Westmere-v1 Westmere-v2 athlon athlon-v1 core2duo core2duo-v1 coreduo coreduo-v1 kvm32 kvm32-v1 kvm64 kvm64-v1 n270 n270-v1 pentium pentium-v1 pentium2 pentium2-v1 pentium3 pentium3-v1 phenom phenom-v1 qemu32 qemu32-v1 qemu64 qemu64-v1
Oct 14 05:41:41 localhost nova_compute[238069]: memory backing source types: file anonymous memfd
Oct 14 05:41:41 localhost nova_compute[238069]: disk device types: disk cdrom floppy lun; bus types: fdc scsi virtio usb sata; models: virtio virtio-transitional virtio-non-transitional
Oct 14 05:41:41 localhost nova_compute[238069]: graphics types: vnc egl-headless dbus
Oct 14 05:41:41 localhost nova_compute[238069]: hostdev mode: subsystem; startupPolicy: default mandatory requisite optional; subsystem types: usb pci scsi; models: virtio virtio-transitional virtio-non-transitional
Oct 14 05:41:41 localhost nova_compute[238069]: rng backend models: random egd builtin
Oct 14 05:41:41 localhost nova_compute[238069]: filesystem driver types: path handle virtiofs
Oct 14 05:41:41 localhost nova_compute[238069]: tpm models: tpm-tis tpm-crb; backends: emulator external; backend version: 2.0
Oct 14 05:41:41 localhost nova_compute[238069]: redirdev bus: usb
Oct 14 05:41:41 localhost nova_compute[238069]: console/serial types: pty unix
Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: qemu Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: builtin Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: default Oct 14 05:41:41 localhost nova_compute[238069]: passt Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: isa Oct 14 05:41:41 localhost nova_compute[238069]: hyperv Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: relaxed Oct 14 05:41:41 localhost nova_compute[238069]: vapic Oct 14 05:41:41 localhost nova_compute[238069]: spinlocks Oct 14 05:41:41 localhost nova_compute[238069]: vpindex Oct 14 
05:41:41 localhost nova_compute[238069]: runtime Oct 14 05:41:41 localhost nova_compute[238069]: synic Oct 14 05:41:41 localhost nova_compute[238069]: stimer Oct 14 05:41:41 localhost nova_compute[238069]: reset Oct 14 05:41:41 localhost nova_compute[238069]: vendor_id Oct 14 05:41:41 localhost nova_compute[238069]: frequencies Oct 14 05:41:41 localhost nova_compute[238069]: reenlightenment Oct 14 05:41:41 localhost nova_compute[238069]: tlbflush Oct 14 05:41:41 localhost nova_compute[238069]: ipi Oct 14 05:41:41 localhost nova_compute[238069]: avic Oct 14 05:41:41 localhost nova_compute[238069]: emsr_bitmap Oct 14 05:41:41 localhost nova_compute[238069]: xmm_input Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m Oct 14 05:41:41 localhost nova_compute[238069]: 2025-10-14 09:41:41.003 2 DEBUG nova.virt.libvirt.host [None req-e884ec6a-dc7f-475b-964b-d7fa652c3f78 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: /usr/libexec/qemu-kvm Oct 14 05:41:41 localhost nova_compute[238069]: kvm Oct 14 05:41:41 localhost nova_compute[238069]: pc-i440fx-rhel7.6.0 Oct 14 05:41:41 localhost nova_compute[238069]: x86_64 Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: /usr/share/OVMF/OVMF_CODE.secboot.fd Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost 
nova_compute[238069]: rom Oct 14 05:41:41 localhost nova_compute[238069]: pflash Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: yes Oct 14 05:41:41 localhost nova_compute[238069]: no Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: no Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: on Oct 14 05:41:41 localhost nova_compute[238069]: off Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: on Oct 14 05:41:41 localhost nova_compute[238069]: off Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: EPYC-Rome Oct 14 05:41:41 localhost nova_compute[238069]: AMD Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 
05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: 486 Oct 14 05:41:41 localhost nova_compute[238069]: 486-v1 Oct 14 05:41:41 localhost nova_compute[238069]: Broadwell Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Broadwell-IBRS Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Broadwell-noTSX Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost 
nova_compute[238069]: Broadwell-noTSX-IBRS Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Broadwell-v1 Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Broadwell-v2 Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Broadwell-v3 Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Broadwell-v4 Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Cascadelake-Server Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 
05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Cascadelake-Server-noTSX Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Cascadelake-Server-v1 Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Cascadelake-Server-v2 Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 
localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Cascadelake-Server-v3 Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Cascadelake-Server-v4 Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 
14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Cascadelake-Server-v5 Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Conroe Oct 14 05:41:41 localhost nova_compute[238069]: Conroe-v1 Oct 14 05:41:41 localhost nova_compute[238069]: Cooperlake Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Cooperlake-v1 Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 
localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Cooperlake-v2 Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Denverton Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Denverton-v1 Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 
05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Denverton-v2 Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Denverton-v3 Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Dhyana Oct 14 05:41:41 localhost nova_compute[238069]: Dhyana-v1 Oct 14 05:41:41 localhost nova_compute[238069]: Dhyana-v2 Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: EPYC Oct 14 05:41:41 localhost nova_compute[238069]: EPYC-Genoa Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 
localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: EPYC-Genoa-v1 Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: EPYC-IBPB Oct 14 05:41:41 localhost nova_compute[238069]: EPYC-Milan Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 
05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: EPYC-Milan-v1 Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: EPYC-Milan-v2 Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: EPYC-Rome Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: EPYC-Rome-v1 Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: EPYC-Rome-v2 Oct 14 05:41:41 
localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: EPYC-Rome-v3 Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: EPYC-Rome-v4 Oct 14 05:41:41 localhost nova_compute[238069]: EPYC-v1 Oct 14 05:41:41 localhost nova_compute[238069]: EPYC-v2 Oct 14 05:41:41 localhost nova_compute[238069]: EPYC-v3 Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: EPYC-v4 Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: GraniteRapids Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 
Oct 14 05:41:41 localhost nova_compute[238069]: [multi-line libvirt capabilities XML; markup lost in log capture — recoverable supported CPU model names follow]
Oct 14 05:41:41 localhost nova_compute[238069]: GraniteRapids-v1 GraniteRapids-v2 Haswell Haswell-IBRS Haswell-noTSX Haswell-noTSX-IBRS Haswell-v1 Haswell-v2 Haswell-v3 Haswell-v4
Oct 14 05:41:41 localhost nova_compute[238069]: Icelake-Server Icelake-Server-noTSX Icelake-Server-v1 Icelake-Server-v2 Icelake-Server-v3 Icelake-Server-v4 Icelake-Server-v5 Icelake-Server-v6 Icelake-Server-v7
Oct 14 05:41:41 localhost nova_compute[238069]: IvyBridge IvyBridge-IBRS IvyBridge-v1 IvyBridge-v2 KnightsMill KnightsMill-v1 Nehalem Nehalem-IBRS Nehalem-v1 Nehalem-v2
Oct 14 05:41:41 localhost nova_compute[238069]: Opteron_G1 Opteron_G1-v1 Opteron_G2 Opteron_G2-v1 Opteron_G3 Opteron_G3-v1 Opteron_G4 Opteron_G4-v1 Opteron_G5 Opteron_G5-v1
Oct 14 05:41:41 localhost nova_compute[238069]: Penryn Penryn-v1 SandyBridge SandyBridge-IBRS SandyBridge-v1 SandyBridge-v2
Oct 14 05:41:41 localhost nova_compute[238069]: SapphireRapids SapphireRapids-v1 SapphireRapids-v2 SapphireRapids-v3 SierraForest SierraForest-v1
Oct 14 05:41:41 localhost nova_compute[238069]: Skylake-Client Skylake-Client-IBRS Skylake-Client-noTSX-IBRS Skylake-Client-v1 Skylake-Client-v2 Skylake-Client-v3 Skylake-Client-v4
Oct 14 05:41:41 localhost nova_compute[238069]: Skylake-Server Skylake-Server-IBRS Skylake-Server-noTSX-IBRS Skylake-Server-v1 Skylake-Server-v2 Skylake-Server-v3 Skylake-Server-v4 Skylake-Server-v5
nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Snowridge Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Snowridge-v1 Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Snowridge-v2 Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 
localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Snowridge-v3 Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Snowridge-v4 Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Westmere Oct 14 05:41:41 localhost nova_compute[238069]: Westmere-IBRS Oct 14 05:41:41 localhost nova_compute[238069]: Westmere-v1 Oct 14 05:41:41 localhost nova_compute[238069]: Westmere-v2 Oct 14 05:41:41 localhost nova_compute[238069]: athlon Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: athlon-v1 Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: core2duo Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost 
nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: core2duo-v1 Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: coreduo Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: coreduo-v1 Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: kvm32 Oct 14 05:41:41 localhost nova_compute[238069]: kvm32-v1 Oct 14 05:41:41 localhost nova_compute[238069]: kvm64 Oct 14 05:41:41 localhost nova_compute[238069]: kvm64-v1 Oct 14 05:41:41 localhost nova_compute[238069]: n270 Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: n270-v1 Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: pentium Oct 14 05:41:41 localhost nova_compute[238069]: pentium-v1 Oct 14 05:41:41 localhost nova_compute[238069]: pentium2 Oct 14 05:41:41 localhost nova_compute[238069]: pentium2-v1 Oct 14 05:41:41 localhost nova_compute[238069]: pentium3 Oct 14 05:41:41 localhost nova_compute[238069]: pentium3-v1 Oct 14 05:41:41 localhost nova_compute[238069]: phenom Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: phenom-v1 Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: 
Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: qemu32 Oct 14 05:41:41 localhost nova_compute[238069]: qemu32-v1 Oct 14 05:41:41 localhost nova_compute[238069]: qemu64 Oct 14 05:41:41 localhost nova_compute[238069]: qemu64-v1 Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: file Oct 14 05:41:41 localhost nova_compute[238069]: anonymous Oct 14 05:41:41 localhost nova_compute[238069]: memfd Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: disk Oct 14 05:41:41 localhost nova_compute[238069]: cdrom Oct 14 05:41:41 localhost nova_compute[238069]: floppy Oct 14 05:41:41 localhost nova_compute[238069]: lun Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: ide Oct 14 05:41:41 localhost nova_compute[238069]: fdc Oct 14 05:41:41 localhost nova_compute[238069]: scsi Oct 14 05:41:41 localhost nova_compute[238069]: virtio Oct 14 05:41:41 localhost nova_compute[238069]: usb Oct 14 05:41:41 localhost nova_compute[238069]: sata Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: virtio Oct 14 05:41:41 localhost nova_compute[238069]: virtio-transitional Oct 14 05:41:41 localhost nova_compute[238069]: virtio-non-transitional Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 
localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: vnc Oct 14 05:41:41 localhost nova_compute[238069]: egl-headless Oct 14 05:41:41 localhost nova_compute[238069]: dbus Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: subsystem Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: default Oct 14 05:41:41 localhost nova_compute[238069]: mandatory Oct 14 05:41:41 localhost nova_compute[238069]: requisite Oct 14 05:41:41 localhost nova_compute[238069]: optional Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: usb Oct 14 05:41:41 localhost nova_compute[238069]: pci Oct 14 05:41:41 localhost nova_compute[238069]: scsi Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: virtio Oct 14 05:41:41 localhost nova_compute[238069]: virtio-transitional Oct 14 05:41:41 localhost nova_compute[238069]: virtio-non-transitional Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: random Oct 14 05:41:41 localhost nova_compute[238069]: egd Oct 14 05:41:41 localhost nova_compute[238069]: builtin Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost 
nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: path Oct 14 05:41:41 localhost nova_compute[238069]: handle Oct 14 05:41:41 localhost nova_compute[238069]: virtiofs Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: tpm-tis Oct 14 05:41:41 localhost nova_compute[238069]: tpm-crb Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: emulator Oct 14 05:41:41 localhost nova_compute[238069]: external Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: 2.0 Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: usb Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: pty Oct 14 05:41:41 localhost nova_compute[238069]: unix Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: qemu Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: builtin Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 
localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: default Oct 14 05:41:41 localhost nova_compute[238069]: passt Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: isa Oct 14 05:41:41 localhost nova_compute[238069]: hyperv Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: relaxed Oct 14 05:41:41 localhost nova_compute[238069]: vapic Oct 14 05:41:41 localhost nova_compute[238069]: spinlocks Oct 14 05:41:41 localhost nova_compute[238069]: vpindex Oct 14 05:41:41 localhost nova_compute[238069]: runtime Oct 14 05:41:41 localhost nova_compute[238069]: synic Oct 14 05:41:41 localhost nova_compute[238069]: stimer Oct 14 05:41:41 localhost nova_compute[238069]: reset Oct 14 05:41:41 localhost nova_compute[238069]: vendor_id Oct 14 05:41:41 localhost nova_compute[238069]: frequencies Oct 14 05:41:41 localhost nova_compute[238069]: reenlightenment Oct 14 05:41:41 localhost nova_compute[238069]: tlbflush Oct 14 05:41:41 localhost nova_compute[238069]: ipi Oct 14 05:41:41 localhost nova_compute[238069]: avic Oct 14 05:41:41 localhost nova_compute[238069]: 
emsr_bitmap Oct 14 05:41:41 localhost nova_compute[238069]: xmm_input Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: Oct 14 05:41:41 localhost nova_compute[238069]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m Oct 14 05:41:41 localhost nova_compute[238069]: 2025-10-14 09:41:41.051 2 DEBUG nova.virt.libvirt.host [None req-e884ec6a-dc7f-475b-964b-d7fa652c3f78 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782#033[00m Oct 14 05:41:41 localhost nova_compute[238069]: 2025-10-14 09:41:41.052 2 INFO nova.virt.libvirt.host [None req-e884ec6a-dc7f-475b-964b-d7fa652c3f78 - - - - - -] Secure Boot support detected#033[00m Oct 14 05:41:41 localhost nova_compute[238069]: 2025-10-14 09:41:41.054 2 INFO nova.virt.libvirt.driver [None req-e884ec6a-dc7f-475b-964b-d7fa652c3f78 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.#033[00m Oct 14 05:41:41 localhost nova_compute[238069]: 2025-10-14 09:41:41.055 2 INFO nova.virt.libvirt.driver [None req-e884ec6a-dc7f-475b-964b-d7fa652c3f78 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.#033[00m Oct 14 05:41:41 localhost nova_compute[238069]: 2025-10-14 09:41:41.068 2 DEBUG nova.virt.libvirt.driver [None req-e884ec6a-dc7f-475b-964b-d7fa652c3f78 - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097#033[00m Oct 14 05:41:41 localhost nova_compute[238069]: 2025-10-14 09:41:41.108 2 INFO nova.virt.node [None req-e884ec6a-dc7f-475b-964b-d7fa652c3f78 - - 
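For reference, nova's `_get_domain_capabilities` (the call site cited at the end of the capabilities dump above) parses libvirt's `<domainCapabilities>` XML. A minimal sketch of that kind of parse, against a hypothetical trimmed fragment rather than the host's real (much larger) document:

```python
import xml.etree.ElementTree as ET

# Hypothetical trimmed fragment shaped like libvirt's <domainCapabilities>
# output; the real document also lists devices, features, TPM, etc.
XML = """
<domainCapabilities>
  <cpu>
    <mode name='custom' supported='yes'>
      <model usable='yes'>Skylake-Client</model>
      <model usable='no'>Skylake-Server-v5</model>
      <model usable='yes'>Westmere</model>
    </mode>
  </cpu>
</domainCapabilities>
"""

def usable_cpu_models(xml_text):
    """Return CPU model names marked usable='yes' in custom mode."""
    root = ET.fromstring(xml_text)
    return [
        m.text
        for m in root.findall("./cpu/mode[@name='custom']/model")
        if m.get("usable") == "yes"
    ]

print(usable_cpu_models(XML))  # ['Skylake-Client', 'Westmere']
```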
- - - -] Determined node identity 18c24273-aca2-4f08-be57-3188d558235e from /var/lib/nova/compute_id#033[00m Oct 14 05:41:41 localhost nova_compute[238069]: 2025-10-14 09:41:41.131 2 DEBUG nova.compute.manager [None req-e884ec6a-dc7f-475b-964b-d7fa652c3f78 - - - - - -] Verified node 18c24273-aca2-4f08-be57-3188d558235e matches my host np0005486733.localdomain _check_for_host_rename /usr/lib/python3.9/site-packages/nova/compute/manager.py:1568#033[00m Oct 14 05:41:41 localhost nova_compute[238069]: 2025-10-14 09:41:41.160 2 DEBUG nova.compute.manager [None req-e884ec6a-dc7f-475b-964b-d7fa652c3f78 - - - - - -] [instance: 88c4e366-b765-47a6-96bf-f7677f2ce67c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m Oct 14 05:41:41 localhost nova_compute[238069]: 2025-10-14 09:41:41.165 2 DEBUG nova.virt.libvirt.vif [None req-e884ec6a-dc7f-475b-964b-d7fa652c3f78 - - - - - -] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-14T08:37:13Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='test',display_name='test',ec2_ids=,ephemeral_gb=1,ephemeral_key_uuid=None,fault=,flavor=,hidden=False,host='np0005486733.localdomain',hostname='test',id=2,image_ref='0c25fd0b-0cde-472d-a2dc-9e548eac7c4d',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=,launch_index=0,launched_at=2025-10-14T08:37:23Z,launched_on='np0005486733.localdomain',locked=False,locked_by=None,memory_mb=512,metadata={},migration_context=,new_flavor=,node='np0005486733.localdomain',numa_topology=None,old_flavor=,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='41187b090f3d4818a32baa37ce8a3991',ramdisk_id='',reservation_id='r-aao7l1tg',resources=,root_device_
name='/dev/vda',root_gb=1,security_groups=,services=,shutdown_terminate=False,system_metadata=,tags=,task_state=None,terminated_at=None,trusted_certs=,updated_at=2025-10-14T08:37:23Z,user_data=None,user_id='9d85e6ce130c46ec855f37147dbb08b4',uuid=88c4e366-b765-47a6-96bf-f7677f2ce67c,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "3ec9b060-f43d-4698-9c76-6062c70911d5", "address": "fa:16:3e:84:5e:e5", "network": {"id": "7d0cd696-bdd7-4e70-9512-eb0d23640314", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.46", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.0.1"}}], "meta": {"injected": false, "tenant_id": "41187b090f3d4818a32baa37ce8a3991", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system"}, "devname": "tap3ec9b060-f4", "ovs_interfaceid": "3ec9b060-f43d-4698-9c76-6062c70911d5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m Oct 14 05:41:41 localhost nova_compute[238069]: 2025-10-14 09:41:41.165 2 DEBUG nova.network.os_vif_util [None req-e884ec6a-dc7f-475b-964b-d7fa652c3f78 - - - - - -] Converting VIF {"id": "3ec9b060-f43d-4698-9c76-6062c70911d5", "address": "fa:16:3e:84:5e:e5", "network": {"id": "7d0cd696-bdd7-4e70-9512-eb0d23640314", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.46", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.0.1"}}], "meta": {"injected": false, "tenant_id": "41187b090f3d4818a32baa37ce8a3991", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system"}, "devname": "tap3ec9b060-f4", "ovs_interfaceid": "3ec9b060-f43d-4698-9c76-6062c70911d5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m Oct 14 05:41:41 localhost nova_compute[238069]: 2025-10-14 09:41:41.166 2 DEBUG nova.network.os_vif_util [None req-e884ec6a-dc7f-475b-964b-d7fa652c3f78 - - - - - -] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:84:5e:e5,bridge_name='br-int',has_traffic_filtering=True,id=3ec9b060-f43d-4698-9c76-6062c70911d5,network=Network(7d0cd696-bdd7-4e70-9512-eb0d23640314),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3ec9b060-f4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m Oct 14 05:41:41 localhost nova_compute[238069]: 2025-10-14 09:41:41.167 2 DEBUG os_vif [None req-e884ec6a-dc7f-475b-964b-d7fa652c3f78 - - - - - -] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:84:5e:e5,bridge_name='br-int',has_traffic_filtering=True,id=3ec9b060-f43d-4698-9c76-6062c70911d5,network=Network(7d0cd696-bdd7-4e70-9512-eb0d23640314),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3ec9b060-f4') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m Oct 14 05:41:41 localhost nova_compute[238069]: 2025-10-14 09:41:41.202 2 DEBUG ovsdbapp.backend.ovs_idl [None 
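The VIF record logged above is plain JSON once the syslog prefixes are stripped. A sketch of pulling out the operator-relevant fields, using a hypothetical trimmed copy of that record (same key names, most fields dropped):

```python
import json

# Hypothetical trimmed copy of the VIF dict nova logs before plugging.
vif_json = """
{"id": "3ec9b060-f43d-4698-9c76-6062c70911d5",
 "address": "fa:16:3e:84:5e:e5",
 "devname": "tap3ec9b060-f4",
 "type": "ovs",
 "network": {"bridge": "br-int",
   "subnets": [{"ips": [{"address": "192.168.0.46", "type": "fixed",
     "floating_ips": [{"address": "192.168.122.20"}]}]}]}}
"""

def summarize_vif(raw):
    """Extract tap device, MAC, bridge, and addresses from a logged VIF."""
    vif = json.loads(raw)
    subnets = vif["network"]["subnets"]
    fixed = [ip["address"] for s in subnets for ip in s["ips"]]
    floating = [fip["address"]
                for s in subnets
                for ip in s["ips"]
                for fip in ip.get("floating_ips", [])]
    return {"tap": vif["devname"], "mac": vif["address"],
            "bridge": vif["network"]["bridge"],
            "fixed_ips": fixed, "floating_ips": floating}

print(summarize_vif(vif_json))
```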
req-e884ec6a-dc7f-475b-964b-d7fa652c3f78 - - - - - -] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m Oct 14 05:41:41 localhost nova_compute[238069]: 2025-10-14 09:41:41.202 2 DEBUG ovsdbapp.backend.ovs_idl [None req-e884ec6a-dc7f-475b-964b-d7fa652c3f78 - - - - - -] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m Oct 14 05:41:41 localhost nova_compute[238069]: 2025-10-14 09:41:41.202 2 DEBUG ovsdbapp.backend.ovs_idl [None req-e884ec6a-dc7f-475b-964b-d7fa652c3f78 - - - - - -] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m Oct 14 05:41:41 localhost nova_compute[238069]: 2025-10-14 09:41:41.203 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-e884ec6a-dc7f-475b-964b-d7fa652c3f78 - - - - - -] tcp:127.0.0.1:6640: entering CONNECTING _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Oct 14 05:41:41 localhost nova_compute[238069]: 2025-10-14 09:41:41.203 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-e884ec6a-dc7f-475b-964b-d7fa652c3f78 - - - - - -] [POLLOUT] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:41:41 localhost nova_compute[238069]: 2025-10-14 09:41:41.203 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-e884ec6a-dc7f-475b-964b-d7fa652c3f78 - - - - - -] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Oct 14 05:41:41 localhost nova_compute[238069]: 2025-10-14 09:41:41.204 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-e884ec6a-dc7f-475b-964b-d7fa652c3f78 - - - - - -] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:41:41 localhost nova_compute[238069]: 2025-10-14 09:41:41.205 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [None 
req-e884ec6a-dc7f-475b-964b-d7fa652c3f78 - - - - - -] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:41:41 localhost nova_compute[238069]: 2025-10-14 09:41:41.209 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-e884ec6a-dc7f-475b-964b-d7fa652c3f78 - - - - - -] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:41:41 localhost nova_compute[238069]: 2025-10-14 09:41:41.226 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:41:41 localhost nova_compute[238069]: 2025-10-14 09:41:41.226 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Oct 14 05:41:41 localhost nova_compute[238069]: 2025-10-14 09:41:41.227 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m Oct 14 05:41:41 localhost nova_compute[238069]: 2025-10-14 09:41:41.228 2 INFO oslo.privsep.daemon [None req-e884ec6a-dc7f-475b-964b-d7fa652c3f78 - - - - - -] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'vif_plug_ovs.privsep.vif_plug', '--privsep_sock_path', '/tmp/tmp1fc7pxyi/privsep.sock']#033[00m Oct 14 05:41:41 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52089 DF PROTO=TCP SPT=58602 DPT=9105 SEQ=2252466973 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT 
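The kernel `DROPPING:` entry above is a standard netfilter LOG line of `KEY=value` tokens. A sketch of splitting such a line into a dict; `LINE` is a trimmed copy of the entry above, and bare flags like `DF`/`SYN` are simply skipped:

```python
# Trimmed copy of the netfilter LOG line from the kernel above.
LINE = ("DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b "
        "MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 "
        "DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52089 DF "
        "PROTO=TCP SPT=58602 DPT=9105 SEQ=2252466973 ACK=0 WINDOW=32640")

def parse_nflog(line):
    """Split a netfilter LOG line into KEY=value pairs.

    Tokens without '=' (the prefix, bare flags such as DF or SYN) are
    skipped; keys with an empty value (OUT=) map to ''.
    """
    fields = {}
    for token in line.split():
        if "=" in token:
            key, _, value = token.partition("=")
            fields[key] = value
    return fields

fields = parse_nflog(LINE)
print(fields["SRC"], fields["DST"], fields["PROTO"], fields["DPT"])
# 192.168.122.10 192.168.122.108 TCP 9105
```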
(020405500402080A2B0B4E1B0000000001030307) Oct 14 05:41:41 localhost nova_compute[238069]: 2025-10-14 09:41:41.867 2 INFO oslo.privsep.daemon [None req-e884ec6a-dc7f-475b-964b-d7fa652c3f78 - - - - - -] Spawned new privsep daemon via rootwrap#033[00m Oct 14 05:41:41 localhost nova_compute[238069]: 2025-10-14 09:41:41.757 40 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m Oct 14 05:41:41 localhost nova_compute[238069]: 2025-10-14 09:41:41.760 40 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m Oct 14 05:41:41 localhost nova_compute[238069]: 2025-10-14 09:41:41.763 40 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_NET_ADMIN/CAP_DAC_OVERRIDE|CAP_NET_ADMIN/none#033[00m Oct 14 05:41:41 localhost nova_compute[238069]: 2025-10-14 09:41:41.763 40 INFO oslo.privsep.daemon [-] privsep daemon running as pid 40#033[00m Oct 14 05:41:42 localhost nova_compute[238069]: 2025-10-14 09:41:42.165 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:41:42 localhost nova_compute[238069]: 2025-10-14 09:41:42.166 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3ec9b060-f4, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Oct 14 05:41:42 localhost nova_compute[238069]: 2025-10-14 09:41:42.166 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap3ec9b060-f4, col_values=(('external_ids', {'iface-id': '3ec9b060-f43d-4698-9c76-6062c70911d5', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:84:5e:e5', 'vm-uuid': '88c4e366-b765-47a6-96bf-f7677f2ce67c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Oct 14 05:41:42 localhost 
nova_compute[238069]: 2025-10-14 09:41:42.167 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m Oct 14 05:41:42 localhost nova_compute[238069]: 2025-10-14 09:41:42.168 2 INFO os_vif [None req-e884ec6a-dc7f-475b-964b-d7fa652c3f78 - - - - - -] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:84:5e:e5,bridge_name='br-int',has_traffic_filtering=True,id=3ec9b060-f43d-4698-9c76-6062c70911d5,network=Network(7d0cd696-bdd7-4e70-9512-eb0d23640314),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3ec9b060-f4')#033[00m Oct 14 05:41:42 localhost nova_compute[238069]: 2025-10-14 09:41:42.169 2 DEBUG nova.compute.manager [None req-e884ec6a-dc7f-475b-964b-d7fa652c3f78 - - - - - -] [instance: 88c4e366-b765-47a6-96bf-f7677f2ce67c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m Oct 14 05:41:42 localhost nova_compute[238069]: 2025-10-14 09:41:42.173 2 DEBUG nova.compute.manager [None req-e884ec6a-dc7f-475b-964b-d7fa652c3f78 - - - - - -] [instance: 88c4e366-b765-47a6-96bf-f7677f2ce67c] Current state is 1, state in DB is 1. 
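The `AddPortCommand`/`DbSetCommand` transaction logged above is roughly what `ovs-vsctl --may-exist add-port` followed by `set Interface ... external_ids:...` does on the command line. A sketch that only builds the equivalent argument list (nothing is executed; the `external_ids` values are taken from the log):

```python
def ovs_plug_cmd(bridge, port, external_ids):
    """Build the ovs-vsctl equivalent of the logged ovsdbapp transaction:
    idempotently add the port, then set its Interface external_ids."""
    cmd = ["ovs-vsctl", "--may-exist", "add-port", bridge, port, "--",
           "set", "Interface", port]
    cmd += ["external_ids:%s=%s" % (k, v)
            for k, v in sorted(external_ids.items())]
    return cmd

cmd = ovs_plug_cmd("br-int", "tap3ec9b060-f4",
                   {"iface-id": "3ec9b060-f43d-4698-9c76-6062c70911d5",
                    "attached-mac": "fa:16:3e:84:5e:e5"})
print(" ".join(cmd))
```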
_init_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:1304#033[00m Oct 14 05:41:42 localhost nova_compute[238069]: 2025-10-14 09:41:42.173 2 INFO nova.compute.manager [None req-e884ec6a-dc7f-475b-964b-d7fa652c3f78 - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host#033[00m Oct 14 05:41:42 localhost nova_compute[238069]: 2025-10-14 09:41:42.401 2 DEBUG oslo_concurrency.lockutils [None req-e884ec6a-dc7f-475b-964b-d7fa652c3f78 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Oct 14 05:41:42 localhost nova_compute[238069]: 2025-10-14 09:41:42.402 2 DEBUG oslo_concurrency.lockutils [None req-e884ec6a-dc7f-475b-964b-d7fa652c3f78 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Oct 14 05:41:42 localhost nova_compute[238069]: 2025-10-14 09:41:42.402 2 DEBUG oslo_concurrency.lockutils [None req-e884ec6a-dc7f-475b-964b-d7fa652c3f78 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Oct 14 05:41:42 localhost nova_compute[238069]: 2025-10-14 09:41:42.402 2 DEBUG nova.compute.resource_tracker [None req-e884ec6a-dc7f-475b-964b-d7fa652c3f78 - - - - - -] Auditing locally available compute resources for np0005486733.localdomain (node: np0005486733.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Oct 14 05:41:42 localhost nova_compute[238069]: 2025-10-14 09:41:42.402 2 DEBUG oslo_concurrency.processutils [None req-e884ec6a-dc7f-475b-964b-d7fa652c3f78 - - - - - 
-] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Oct 14 05:41:42 localhost nova_compute[238069]: 2025-10-14 09:41:42.824 2 DEBUG oslo_concurrency.processutils [None req-e884ec6a-dc7f-475b-964b-d7fa652c3f78 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.422s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Oct 14 05:41:42 localhost nova_compute[238069]: 2025-10-14 09:41:42.886 2 DEBUG nova.virt.libvirt.driver [None req-e884ec6a-dc7f-475b-964b-d7fa652c3f78 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Oct 14 05:41:42 localhost nova_compute[238069]: 2025-10-14 09:41:42.886 2 DEBUG nova.virt.libvirt.driver [None req-e884ec6a-dc7f-475b-964b-d7fa652c3f78 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Oct 14 05:41:43 localhost nova_compute[238069]: 2025-10-14 09:41:43.082 2 WARNING nova.virt.libvirt.driver [None req-e884ec6a-dc7f-475b-964b-d7fa652c3f78 - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Oct 14 05:41:43 localhost nova_compute[238069]: 2025-10-14 09:41:43.084 2 DEBUG nova.compute.resource_tracker [None req-e884ec6a-dc7f-475b-964b-d7fa652c3f78 - - - - - -] Hypervisor/Node resource view: name=np0005486733.localdomain free_ram=12890MB free_disk=41.83725357055664GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": 
"1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Oct 14 05:41:43 localhost nova_compute[238069]: 2025-10-14 09:41:43.085 2 DEBUG oslo_concurrency.lockutils [None req-e884ec6a-dc7f-475b-964b-d7fa652c3f78 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Oct 14 05:41:43 localhost nova_compute[238069]: 2025-10-14 09:41:43.085 2 DEBUG oslo_concurrency.lockutils [None req-e884ec6a-dc7f-475b-964b-d7fa652c3f78 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Oct 14 05:41:43 localhost python3.9[238246]: ansible-containers.podman.podman_container Invoked with name=nova_compute_init state=started executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None 
device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None Oct 14 05:41:43 localhost nova_compute[238069]: 2025-10-14 09:41:43.500 2 DEBUG 
nova.compute.resource_tracker [None req-e884ec6a-dc7f-475b-964b-d7fa652c3f78 - - - - - -] Instance 88c4e366-b765-47a6-96bf-f7677f2ce67c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Oct 14 05:41:43 localhost nova_compute[238069]: 2025-10-14 09:41:43.501 2 DEBUG nova.compute.resource_tracker [None req-e884ec6a-dc7f-475b-964b-d7fa652c3f78 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Oct 14 05:41:43 localhost nova_compute[238069]: 2025-10-14 09:41:43.501 2 DEBUG nova.compute.resource_tracker [None req-e884ec6a-dc7f-475b-964b-d7fa652c3f78 - - - - - -] Final resource view: name=np0005486733.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Oct 14 05:41:43 localhost systemd[1]: Started libpod-conmon-6c4d7cae210e0acc65e1c4fc410517ec41d3f7534328f58bee37a38122214c4b.scope. Oct 14 05:41:43 localhost systemd[1]: Started libcrun container. 
Oct 14 05:41:43 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4ac6a4b7d3fd9b3800ade157df8bbc87609a0105a3f06e00de1c30bcdbf261df/merged/usr/sbin/nova_statedir_ownership.py supports timestamps until 2038 (0x7fffffff) Oct 14 05:41:43 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4ac6a4b7d3fd9b3800ade157df8bbc87609a0105a3f06e00de1c30bcdbf261df/merged/var/lib/_nova_secontext supports timestamps until 2038 (0x7fffffff) Oct 14 05:41:43 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4ac6a4b7d3fd9b3800ade157df8bbc87609a0105a3f06e00de1c30bcdbf261df/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff) Oct 14 05:41:43 localhost podman[238272]: 2025-10-14 09:41:43.529725013 +0000 UTC m=+0.103325936 container init 6c4d7cae210e0acc65e1c4fc410517ec41d3f7534328f58bee37a38122214c4b (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, container_name=nova_compute_init, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, 
config_id=edpm, io.buildah.version=1.41.3, org.label-schema.build-date=20251009) Oct 14 05:41:43 localhost podman[238272]: 2025-10-14 09:41:43.543185746 +0000 UTC m=+0.116786669 container start 6c4d7cae210e0acc65e1c4fc410517ec41d3f7534328f58bee37a38122214c4b (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, container_name=nova_compute_init, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, io.buildah.version=1.41.3, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5) Oct 14 05:41:43 localhost python3.9[238246]: ansible-containers.podman.podman_container PODMAN-CONTAINER-DEBUG: podman start nova_compute_init Oct 14 05:41:43 localhost nova_compute_init[238290]: INFO:nova_statedir:Applying nova statedir ownership Oct 14 05:41:43 localhost nova_compute_init[238290]: INFO:nova_statedir:Target ownership for /var/lib/nova: 42436:42436 Oct 14 05:41:43 localhost nova_compute_init[238290]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/ Oct 14 05:41:43 localhost nova_compute_init[238290]: INFO:nova_statedir:Changing 
ownership of /var/lib/nova from 1000:1000 to 42436:42436 Oct 14 05:41:43 localhost nova_compute_init[238290]: INFO:nova_statedir:Setting selinux context of /var/lib/nova to system_u:object_r:container_file_t:s0 Oct 14 05:41:43 localhost nova_compute_init[238290]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/instances/ Oct 14 05:41:43 localhost nova_compute_init[238290]: INFO:nova_statedir:Changing ownership of /var/lib/nova/instances from 1000:1000 to 42436:42436 Oct 14 05:41:43 localhost nova_compute_init[238290]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances to system_u:object_r:container_file_t:s0 Oct 14 05:41:43 localhost nova_compute_init[238290]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/88c4e366-b765-47a6-96bf-f7677f2ce67c/ Oct 14 05:41:43 localhost nova_compute_init[238290]: INFO:nova_statedir:Ownership of /var/lib/nova/instances/88c4e366-b765-47a6-96bf-f7677f2ce67c already 42436:42436 Oct 14 05:41:43 localhost nova_compute_init[238290]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances/88c4e366-b765-47a6-96bf-f7677f2ce67c to system_u:object_r:container_file_t:s0 Oct 14 05:41:43 localhost nova_compute_init[238290]: INFO:nova_statedir:Checking uid: 0 gid: 0 path: /var/lib/nova/instances/88c4e366-b765-47a6-96bf-f7677f2ce67c/console.log Oct 14 05:41:43 localhost nova_compute_init[238290]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/_base/ Oct 14 05:41:43 localhost nova_compute_init[238290]: INFO:nova_statedir:Ownership of /var/lib/nova/instances/_base already 42436:42436 Oct 14 05:41:43 localhost nova_compute_init[238290]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances/_base to system_u:object_r:container_file_t:s0 Oct 14 05:41:43 localhost nova_compute_init[238290]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/_base/42de92eaa427bd35ce2c758a3a1fba782a57128e Oct 14 
05:41:43 localhost nova_compute_init[238290]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/_base/ephemeral_1_0706d66 Oct 14 05:41:43 localhost nova_compute_init[238290]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/locks/ Oct 14 05:41:43 localhost nova_compute_init[238290]: INFO:nova_statedir:Ownership of /var/lib/nova/instances/locks already 42436:42436 Oct 14 05:41:43 localhost nova_compute_init[238290]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances/locks to system_u:object_r:container_file_t:s0 Oct 14 05:41:43 localhost nova_compute_init[238290]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/locks/nova-42de92eaa427bd35ce2c758a3a1fba782a57128e Oct 14 05:41:43 localhost nova_compute_init[238290]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/locks/nova-ephemeral_1_0706d66 Oct 14 05:41:43 localhost nova_compute_init[238290]: INFO:nova_statedir:Checking uid: 0 gid: 0 path: /var/lib/nova/delay-nova-compute Oct 14 05:41:43 localhost nova_compute_init[238290]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/ Oct 14 05:41:43 localhost nova_compute_init[238290]: INFO:nova_statedir:Ownership of /var/lib/nova/.ssh already 42436:42436 Oct 14 05:41:43 localhost nova_compute_init[238290]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.ssh to system_u:object_r:container_file_t:s0 Oct 14 05:41:43 localhost nova_compute_init[238290]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/ssh-privatekey Oct 14 05:41:43 localhost nova_compute_init[238290]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/config Oct 14 05:41:43 localhost nova_compute_init[238290]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.cache/ Oct 14 05:41:43 localhost nova_compute_init[238290]: INFO:nova_statedir:Ownership of 
/var/lib/nova/.cache already 42436:42436 Oct 14 05:41:43 localhost nova_compute_init[238290]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.cache to system_u:object_r:container_file_t:s0 Oct 14 05:41:43 localhost nova_compute_init[238290]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.cache/python-entrypoints/ Oct 14 05:41:43 localhost nova_compute_init[238290]: INFO:nova_statedir:Ownership of /var/lib/nova/.cache/python-entrypoints already 42436:42436 Oct 14 05:41:43 localhost nova_compute_init[238290]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.cache/python-entrypoints to system_u:object_r:container_file_t:s0 Oct 14 05:41:43 localhost nova_compute_init[238290]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.cache/python-entrypoints/7dbe5bae7bc27ef07490c629ec1f09edaa9e8c135ff89c3f08f1e44f39cf5928 Oct 14 05:41:43 localhost nova_compute_init[238290]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.cache/python-entrypoints/9469aff02825a9e3dcdb3ceeb358f8d540dc07c8b6e9cd975f170399051d29c3 Oct 14 05:41:43 localhost nova_compute_init[238290]: INFO:nova_statedir:Nova statedir ownership complete Oct 14 05:41:43 localhost systemd[1]: libpod-6c4d7cae210e0acc65e1c4fc410517ec41d3f7534328f58bee37a38122214c4b.scope: Deactivated successfully. 
Oct 14 05:41:43 localhost podman[238304]: 2025-10-14 09:41:43.64231168 +0000 UTC m=+0.031590494 container died 6c4d7cae210e0acc65e1c4fc410517ec41d3f7534328f58bee37a38122214c4b (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, container_name=nova_compute_init, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=edpm, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5) Oct 14 05:41:43 localhost podman[238304]: 2025-10-14 09:41:43.672819031 +0000 UTC m=+0.062097815 container cleanup 6c4d7cae210e0acc65e1c4fc410517ec41d3f7534328f58bee37a38122214c4b (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.build-date=20251009, config_data={'image': 
'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, container_name=nova_compute_init, io.buildah.version=1.41.3) Oct 14 05:41:43 localhost systemd[1]: libpod-conmon-6c4d7cae210e0acc65e1c4fc410517ec41d3f7534328f58bee37a38122214c4b.scope: Deactivated successfully. Oct 14 05:41:43 localhost nova_compute[238069]: 2025-10-14 09:41:43.741 2 DEBUG nova.scheduler.client.report [None req-e884ec6a-dc7f-475b-964b-d7fa652c3f78 - - - - - -] Refreshing inventories for resource provider 18c24273-aca2-4f08-be57-3188d558235e _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m Oct 14 05:41:43 localhost nova_compute[238069]: 2025-10-14 09:41:43.759 2 DEBUG nova.scheduler.client.report [None req-e884ec6a-dc7f-475b-964b-d7fa652c3f78 - - - - - -] Updating ProviderTree inventory for provider 18c24273-aca2-4f08-be57-3188d558235e from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m Oct 14 05:41:43 localhost nova_compute[238069]: 2025-10-14 
09:41:43.759 2 DEBUG nova.compute.provider_tree [None req-e884ec6a-dc7f-475b-964b-d7fa652c3f78 - - - - - -] Updating inventory in ProviderTree for provider 18c24273-aca2-4f08-be57-3188d558235e with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m Oct 14 05:41:43 localhost nova_compute[238069]: 2025-10-14 09:41:43.782 2 DEBUG nova.scheduler.client.report [None req-e884ec6a-dc7f-475b-964b-d7fa652c3f78 - - - - - -] Refreshing aggregate associations for resource provider 18c24273-aca2-4f08-be57-3188d558235e, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m Oct 14 05:41:43 localhost nova_compute[238069]: 2025-10-14 09:41:43.800 2 DEBUG nova.scheduler.client.report [None req-e884ec6a-dc7f-475b-964b-d7fa652c3f78 - - - - - -] Refreshing trait associations for resource provider 18c24273-aca2-4f08-be57-3188d558235e, traits: 
COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_STORAGE_BUS_IDE,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_STORAGE_BUS_USB,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_CLMUL,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_F16C,HW_CPU_X86_AESNI,COMPUTE_SECURITY_TPM_2_0,HW_CPU_X86_SSE,HW_CPU_X86_ABM,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_IMAGE_TYPE_QCOW2,HW_CPU_X86_BMI,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_SSE4A,HW_CPU_X86_SVM,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_SECURITY_TPM_1_2,COMPUTE_ACCELERATORS,HW_CPU_X86_AMD_SVM,HW_CPU_X86_AVX2,COMPUTE_DEVICE_TAGGING,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_STORAGE_BUS_FDC,COMPUTE_STORAGE_BUS_SATA,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_SHA,COMPUTE_NET_VIF_MODEL_VIRTIO,HW_CPU_X86_FMA3,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_SSE2,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_BMI2,COMPUTE_NODE,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_MMX,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_SSE41,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_AVX _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m Oct 14 05:41:43 localhost nova_compute[238069]: 2025-10-14 09:41:43.836 2 DEBUG oslo_concurrency.processutils [None req-e884ec6a-dc7f-475b-964b-d7fa652c3f78 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Oct 14 05:41:44 localhost systemd[1]: session-54.scope: Deactivated successfully. 
Oct 14 05:41:44 localhost systemd[1]: session-54.scope: Consumed 2min 23.838s CPU time. Oct 14 05:41:44 localhost systemd-logind[760]: Session 54 logged out. Waiting for processes to exit. Oct 14 05:41:44 localhost systemd-logind[760]: Removed session 54. Oct 14 05:41:44 localhost nova_compute[238069]: 2025-10-14 09:41:44.324 2 DEBUG oslo_concurrency.processutils [None req-e884ec6a-dc7f-475b-964b-d7fa652c3f78 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.488s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Oct 14 05:41:44 localhost nova_compute[238069]: 2025-10-14 09:41:44.330 2 DEBUG nova.virt.libvirt.host [None req-e884ec6a-dc7f-475b-964b-d7fa652c3f78 - - - - - -] /sys/module/kvm_amd/parameters/sev contains [N Oct 14 05:41:44 localhost nova_compute[238069]: ] _kernel_supports_amd_sev /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1803#033[00m Oct 14 05:41:44 localhost nova_compute[238069]: 2025-10-14 09:41:44.331 2 INFO nova.virt.libvirt.host [None req-e884ec6a-dc7f-475b-964b-d7fa652c3f78 - - - - - -] kernel doesn't support AMD SEV#033[00m Oct 14 05:41:44 localhost nova_compute[238069]: 2025-10-14 09:41:44.332 2 DEBUG nova.compute.provider_tree [None req-e884ec6a-dc7f-475b-964b-d7fa652c3f78 - - - - - -] Inventory has not changed in ProviderTree for provider: 18c24273-aca2-4f08-be57-3188d558235e update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Oct 14 05:41:44 localhost nova_compute[238069]: 2025-10-14 09:41:44.332 2 DEBUG nova.virt.libvirt.driver [None req-e884ec6a-dc7f-475b-964b-d7fa652c3f78 - - - - - -] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m Oct 14 05:41:44 localhost nova_compute[238069]: 2025-10-14 09:41:44.352 2 DEBUG nova.scheduler.client.report [None 
req-e884ec6a-dc7f-475b-964b-d7fa652c3f78 - - - - - -] Inventory has not changed for provider 18c24273-aca2-4f08-be57-3188d558235e based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Oct 14 05:41:44 localhost nova_compute[238069]: 2025-10-14 09:41:44.376 2 DEBUG nova.compute.resource_tracker [None req-e884ec6a-dc7f-475b-964b-d7fa652c3f78 - - - - - -] Compute_service record updated for np0005486733.localdomain:np0005486733.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Oct 14 05:41:44 localhost nova_compute[238069]: 2025-10-14 09:41:44.377 2 DEBUG oslo_concurrency.lockutils [None req-e884ec6a-dc7f-475b-964b-d7fa652c3f78 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.292s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Oct 14 05:41:44 localhost nova_compute[238069]: 2025-10-14 09:41:44.377 2 DEBUG nova.service [None req-e884ec6a-dc7f-475b-964b-d7fa652c3f78 - - - - - -] Creating RPC server for service compute start /usr/lib/python3.9/site-packages/nova/service.py:182#033[00m Oct 14 05:41:44 localhost nova_compute[238069]: 2025-10-14 09:41:44.391 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:41:44 localhost nova_compute[238069]: 2025-10-14 09:41:44.405 2 DEBUG nova.service [None req-e884ec6a-dc7f-475b-964b-d7fa652c3f78 - - - - - -] Join ServiceGroup membership for this 
service compute start /usr/lib/python3.9/site-packages/nova/service.py:199#033[00m Oct 14 05:41:44 localhost nova_compute[238069]: 2025-10-14 09:41:44.405 2 DEBUG nova.servicegroup.drivers.db [None req-e884ec6a-dc7f-475b-964b-d7fa652c3f78 - - - - - -] DB_Driver: join new ServiceGroup member np0005486733.localdomain to the compute group, service = join /usr/lib/python3.9/site-packages/nova/servicegroup/drivers/db.py:44#033[00m Oct 14 05:41:44 localhost systemd[1]: var-lib-containers-storage-overlay-4ac6a4b7d3fd9b3800ade157df8bbc87609a0105a3f06e00de1c30bcdbf261df-merged.mount: Deactivated successfully. Oct 14 05:41:44 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-6c4d7cae210e0acc65e1c4fc410517ec41d3f7534328f58bee37a38122214c4b-userdata-shm.mount: Deactivated successfully. Oct 14 05:41:45 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52090 DF PROTO=TCP SPT=58602 DPT=9105 SEQ=2252466973 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B0B5DDA0000000001030307) Oct 14 05:41:46 localhost nova_compute[238069]: 2025-10-14 09:41:46.209 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:41:48 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61928 DF PROTO=TCP SPT=44562 DPT=9101 SEQ=1661449627 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B0B669B0000000001030307) Oct 14 05:41:49 localhost nova_compute[238069]: 2025-10-14 09:41:49.394 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:41:50 localhost sshd[238369]: main: sshd: ssh-rsa algorithm is disabled Oct 14 05:41:50 localhost 
systemd-logind[760]: New session 57 of user zuul. Oct 14 05:41:50 localhost systemd[1]: Started Session 57 of User zuul. Oct 14 05:41:51 localhost nova_compute[238069]: 2025-10-14 09:41:51.211 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:41:51 localhost python3.9[238480]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Oct 14 05:41:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44590 DF PROTO=TCP SPT=41180 DPT=9102 SEQ=3926377806 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B0B7BA50000000001030307) Oct 14 05:41:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29. Oct 14 05:41:53 localhost systemd[1]: tmp-crun.drDHli.mount: Deactivated successfully. 
Oct 14 05:41:53 localhost podman[238595]: 2025-10-14 09:41:53.732232834 +0000 UTC m=+0.071964951 container health_status fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_id=iscsid, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=iscsid, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}) Oct 14 05:41:53 localhost python3.9[238594]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None Oct 14 05:41:53 localhost 
podman[238595]: 2025-10-14 09:41:53.74611753 +0000 UTC m=+0.085849667 container exec_died fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, container_name=iscsid, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible) Oct 14 05:41:53 localhost systemd[1]: Reloading. Oct 14 05:41:53 localhost systemd-rc-local-generator[238634]: /etc/rc.d/rc.local is not marked executable, skipping. Oct 14 05:41:53 localhost systemd-sysv-generator[238641]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. 
Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Oct 14 05:41:53 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Oct 14 05:41:54 localhost systemd[1]: fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29.service: Deactivated successfully. Oct 14 05:41:54 localhost nova_compute[238069]: 2025-10-14 09:41:54.437 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:41:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44591 DF PROTO=TCP SPT=41180 DPT=9102 SEQ=3926377806 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B0B7F9A0000000001030307) Oct 14 05:41:54 localhost python3.9[238757]: ansible-ansible.builtin.service_facts Invoked Oct 14 05:41:55 localhost network[238774]: You are using 'network' service provided by 'network-scripts', which are now deprecated. Oct 14 05:41:55 localhost network[238775]: 'network-scripts' will be removed from distribution in near future. Oct 14 05:41:55 localhost network[238776]: It is advised to switch to 'NetworkManager' instead for network management. Oct 14 05:41:56 localhost nova_compute[238069]: 2025-10-14 09:41:56.214 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:41:56 localhost systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Oct 14 05:41:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:41:57.746 163055 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Oct 14 05:41:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:41:57.747 163055 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Oct 14 05:41:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:41:57.748 163055 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Oct 14 05:41:58 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32535 DF PROTO=TCP SPT=43200 DPT=9882 SEQ=2698793669 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B0B8E6B0000000001030307) Oct 14 05:41:59 localhost nova_compute[238069]: 2025-10-14 09:41:59.436 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:41:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad. Oct 14 05:41:59 localhost systemd[1]: tmp-crun.XvF585.mount: Deactivated successfully. 
Oct 14 05:41:59 localhost podman[239000]: 2025-10-14 09:41:59.613584936 +0000 UTC m=+0.074264699 container health_status 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, config_id=multipathd, org.label-schema.build-date=20251009, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3) Oct 14 05:41:59 localhost podman[239000]: 2025-10-14 09:41:59.623919718 +0000 UTC m=+0.084599511 container exec_died 
02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, container_name=multipathd, org.label-schema.vendor=CentOS, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251009, io.buildah.version=1.41.3) Oct 14 05:41:59 localhost systemd[1]: 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad.service: Deactivated successfully. 
Oct 14 05:41:59 localhost python3.9[239022]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_ceilometer_agent_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Oct 14 05:42:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44593 DF PROTO=TCP SPT=41180 DPT=9102 SEQ=3926377806 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B0B975A0000000001030307) Oct 14 05:42:01 localhost nova_compute[238069]: 2025-10-14 09:42:01.251 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:42:01 localhost python3.9[239142]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:42:01 localhost systemd-journald[47488]: Field hash table of /run/log/journal/8e1d5208cffec42b50976967e1d1cfd0/system.journal has a fill level at 76.3 (254 of 333 items), suggesting rotation. Oct 14 05:42:01 localhost systemd-journald[47488]: /run/log/journal/8e1d5208cffec42b50976967e1d1cfd0/system.journal: Journal header limits reached or header out-of-date, rotating. Oct 14 05:42:01 localhost rsyslogd[759]: imjournal: journal files changed, reloading... [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ] Oct 14 05:42:01 localhost rsyslogd[759]: imjournal: journal files changed, reloading... 
[v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ] Oct 14 05:42:02 localhost python3.9[239253]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:42:03 localhost python3.9[239363]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then#012 systemctl disable --now certmonger.service#012 test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service#012fi#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Oct 14 05:42:04 localhost python3.9[239473]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None Oct 14 05:42:04 localhost nova_compute[238069]: 2025-10-14 09:42:04.408 2 DEBUG oslo_service.periodic_task [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 05:42:04 localhost nova_compute[238069]: 2025-10-14 09:42:04.441 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:42:04 localhost nova_compute[238069]: 2025-10-14 09:42:04.453 2 DEBUG 
nova.compute.manager [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Triggering sync for uuid 88c4e366-b765-47a6-96bf-f7677f2ce67c _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m Oct 14 05:42:04 localhost nova_compute[238069]: 2025-10-14 09:42:04.454 2 DEBUG oslo_concurrency.lockutils [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Acquiring lock "88c4e366-b765-47a6-96bf-f7677f2ce67c" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Oct 14 05:42:04 localhost nova_compute[238069]: 2025-10-14 09:42:04.454 2 DEBUG oslo_concurrency.lockutils [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Lock "88c4e366-b765-47a6-96bf-f7677f2ce67c" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Oct 14 05:42:04 localhost nova_compute[238069]: 2025-10-14 09:42:04.455 2 DEBUG oslo_service.periodic_task [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 05:42:04 localhost nova_compute[238069]: 2025-10-14 09:42:04.494 2 DEBUG oslo_concurrency.lockutils [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Lock "88c4e366-b765-47a6-96bf-f7677f2ce67c" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.040s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Oct 14 05:42:05 localhost python3.9[239583]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None 
enabled=None force=None masked=None Oct 14 05:42:05 localhost systemd[1]: Reloading. Oct 14 05:42:05 localhost systemd-rc-local-generator[239610]: /etc/rc.d/rc.local is not marked executable, skipping. Oct 14 05:42:05 localhost systemd-sysv-generator[239613]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Oct 14 05:42:05 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Oct 14 05:42:05 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32538 DF PROTO=TCP SPT=43200 DPT=9882 SEQ=2698793669 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B0BAA1A0000000001030307) Oct 14 05:42:06 localhost python3.9[239728]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_ceilometer_agent_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Oct 14 05:42:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f. 
Oct 14 05:42:06 localhost nova_compute[238069]: 2025-10-14 09:42:06.254 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:42:06 localhost podman[239730]: 2025-10-14 09:42:06.349859313 +0000 UTC m=+0.087043142 container health_status 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, container_name=ovn_controller, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2) Oct 14 05:42:06 localhost podman[239730]: 2025-10-14 09:42:06.406541558 +0000 UTC m=+0.143725397 container exec_died 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, 
org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, maintainer=OpenStack Kubernetes Operator team) Oct 14 05:42:06 localhost systemd[1]: 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f.service: Deactivated successfully. 
Oct 14 05:42:07 localhost python3.9[239864]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/openstack/config/telemetry recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Oct 14 05:42:07 localhost python3.9[239972]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Oct 14 05:42:08 localhost python3.9[240082]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True Oct 14 05:42:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857. 
Oct 14 05:42:08 localhost podman[240083]: 2025-10-14 09:42:08.749272715 +0000 UTC m=+0.083611042 container health_status 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent) Oct 14 05:42:08 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b 
MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26430 DF PROTO=TCP SPT=35326 DPT=9105 SEQ=170115553 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B0BB74E0000000001030307) Oct 14 05:42:08 localhost podman[240083]: 2025-10-14 09:42:08.78407698 +0000 UTC m=+0.118415267 container exec_died 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.name=CentOS 
Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2)
Oct 14 05:42:08 localhost systemd[1]: 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857.service: Deactivated successfully.
Oct 14 05:42:09 localhost python3.9[240185]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1760434928.1995206-362-40198004532221/.source.conf follow=False _original_basename=ceilometer-host-specific.conf.j2 checksum=fcdef34c7526fb72bcc01a044986b0e4adc9a3c5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 14 05:42:09 localhost nova_compute[238069]: 2025-10-14 09:42:09.475 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 14 05:42:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26431 DF PROTO=TCP SPT=35326 DPT=9105 SEQ=170115553 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B0BBB5A0000000001030307)
Oct 14 05:42:10 localhost python3.9[240295]: ansible-ansible.builtin.group Invoked with name=libvirt state=present force=False system=False local=False non_unique=False gid=None gid_min=None gid_max=None
Oct 14 05:42:11 localhost nova_compute[238069]: 2025-10-14 09:42:11.258 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 14 05:42:11 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26432 DF PROTO=TCP SPT=35326 DPT=9105 SEQ=170115553 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B0BC35A0000000001030307)
Oct 14 05:42:12 localhost python3.9[240405]: ansible-ansible.builtin.getent Invoked with database=passwd key=ceilometer fail_key=True service=None split=None
Oct 14 05:42:12 localhost python3.9[240516]: ansible-ansible.builtin.group Invoked with gid=42405 name=ceilometer state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Oct 14 05:42:14 localhost python3.9[240632]: ansible-ansible.builtin.user Invoked with comment=ceilometer user group=ceilometer groups=['libvirt'] name=ceilometer shell=/sbin/nologin state=present uid=42405 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005486733.localdomain update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Oct 14 05:42:14 localhost nova_compute[238069]: 2025-10-14 09:42:14.532 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 14 05:42:15 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26433 DF PROTO=TCP SPT=35326 DPT=9105 SEQ=170115553 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B0BD31A0000000001030307)
Oct 14 05:42:15 localhost python3.9[240748]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 05:42:16 localhost nova_compute[238069]: 2025-10-14 09:42:16.260 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 14 05:42:16 localhost python3.9[240851]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer.conf mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1760434935.5260413-566-208973718493107/.source.conf _original_basename=ceilometer.conf follow=False checksum=035860cf668b88822e0cefeecfa174979afca855 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 05:42:16 localhost python3.9[240993]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/polling.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 05:42:17 localhost python3.9[241096]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/polling.yaml mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1760434936.5943587-566-28341891585227/.source.yaml _original_basename=polling.yaml follow=False checksum=6c8680a286285f2e0ef9fa528ca754765e5ed0e5 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 05:42:18 localhost python3.9[241222]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/custom.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 05:42:18 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43523 DF PROTO=TCP SPT=33244 DPT=9101 SEQ=1640739443 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B0BDBDB0000000001030307)
Oct 14 05:42:18 localhost python3.9[241308]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/custom.conf mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1760434937.6183147-566-128605349655275/.source.conf _original_basename=custom.conf follow=False checksum=838b8b0a7d7f72e55ab67d39f32e3cb3eca2139b backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 05:42:19 localhost python3.9[241416]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.crt follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 14 05:42:19 localhost nova_compute[238069]: 2025-10-14 09:42:19.564 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 14 05:42:20 localhost python3.9[241524]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.key follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 14 05:42:20 localhost python3.9[241632]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 05:42:21 localhost nova_compute[238069]: 2025-10-14 09:42:21.262 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 14 05:42:21 localhost python3.9[241718]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1760434940.4640663-743-274868934615690/.source.json follow=False _original_basename=ceilometer-agent-compute.json.j2 checksum=264d11e8d3809e7ef745878dce7edd46098e25b2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 05:42:22 localhost python3.9[241826]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 05:42:22 localhost python3.9[241881]: ansible-ansible.legacy.file Invoked with mode=420 dest=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf _original_basename=ceilometer-host-specific.conf.j2 recurse=False state=file path=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 05:42:23 localhost python3.9[241989]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer_agent_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 05:42:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2964 DF PROTO=TCP SPT=35038 DPT=9102 SEQ=3421225403 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B0BF0D50000000001030307)
Oct 14 05:42:23 localhost python3.9[242075]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer_agent_compute.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1760434942.6593544-743-96925235918002/.source.json follow=False _original_basename=ceilometer_agent_compute.json.j2 checksum=d15068604cf730dd6e7b88a19d62f57d3a39f94f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 05:42:24 localhost python3.9[242183]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 05:42:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2965 DF PROTO=TCP SPT=35038 DPT=9102 SEQ=3421225403 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B0BF4DA0000000001030307)
Oct 14 05:42:24 localhost nova_compute[238069]: 2025-10-14 09:42:24.599 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 14 05:42:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29.
Oct 14 05:42:24 localhost python3.9[242269]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1760434943.672501-743-192728246471531/.source.yaml follow=False _original_basename=ceilometer_prom_exporter.yaml.j2 checksum=10157c879411ee6023e506dc85a343cedc52700f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 05:42:24 localhost podman[242270]: 2025-10-14 09:42:24.726591307 +0000 UTC m=+0.063805742 container health_status fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, config_id=iscsid, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, managed_by=edpm_ansible, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Oct 14 05:42:24 localhost podman[242270]: 2025-10-14 09:42:24.735958271 +0000 UTC m=+0.073172756 container exec_died fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_id=iscsid, io.buildah.version=1.41.3)
Oct 14 05:42:24 localhost systemd[1]: fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29.service: Deactivated successfully.
Oct 14 05:42:25 localhost python3.9[242396]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/firewall.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 05:42:25 localhost python3.9[242482]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/firewall.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1760434944.807285-743-153687319389929/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=d942d984493b214bda2913f753ff68cdcedff00e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 05:42:26 localhost nova_compute[238069]: 2025-10-14 09:42:26.264 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 14 05:42:26 localhost python3.9[242590]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/node_exporter.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 05:42:26 localhost python3.9[242676]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/node_exporter.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1760434945.9591131-743-114097084833596/.source.json follow=False _original_basename=node_exporter.json.j2 checksum=7e5ab36b7368c1d4a00810e02af11a7f7d7c84e8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 05:42:27 localhost python3.9[242784]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/node_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 05:42:28 localhost python3.9[242870]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/node_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1760434947.1300693-743-89959187277280/.source.yaml follow=False _original_basename=node_exporter.yaml.j2 checksum=81d906d3e1e8c4f8367276f5d3a67b80ca7e989e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 05:42:28 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64204 DF PROTO=TCP SPT=35758 DPT=9882 SEQ=3336112850 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B0C039B0000000001030307)
Oct 14 05:42:28 localhost python3.9[242978]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/openstack_network_exporter.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 05:42:29 localhost python3.9[243064]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/openstack_network_exporter.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1760434948.1879058-743-193465488305431/.source.json follow=False _original_basename=openstack_network_exporter.json.j2 checksum=0e4ea521b0035bea70b7a804346a5c89364dcbc3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 05:42:29 localhost nova_compute[238069]: 2025-10-14 09:42:29.640 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 14 05:42:29 localhost python3.9[243172]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 05:42:30 localhost python3.9[243258]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1760434949.2367826-743-7199697955144/.source.yaml follow=False _original_basename=openstack_network_exporter.yaml.j2 checksum=b056dcaaba7624b93826bb95ee9e82f81bde6c72 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 05:42:30 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2967 DF PROTO=TCP SPT=35038 DPT=9102 SEQ=3421225403 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B0C0C9A0000000001030307)
Oct 14 05:42:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad.
Oct 14 05:42:30 localhost python3.9[243366]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/podman_exporter.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 05:42:30 localhost podman[243367]: 2025-10-14 09:42:30.755580246 +0000 UTC m=+0.098022546 container health_status 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d)
Oct 14 05:42:30 localhost podman[243367]: 2025-10-14 09:42:30.7656574 +0000 UTC m=+0.108099670 container exec_died 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251009, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible)
Oct 14 05:42:30 localhost systemd[1]: 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad.service: Deactivated successfully.
Oct 14 05:42:31 localhost python3.9[243471]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/podman_exporter.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1760434950.2505765-743-1121361383965/.source.json follow=False _original_basename=podman_exporter.json.j2 checksum=885ccc6f5edd8803cb385bdda5648d0b3017b4e4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 05:42:31 localhost nova_compute[238069]: 2025-10-14 09:42:31.266 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 14 05:42:31 localhost python3.9[243579]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/podman_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 05:42:32 localhost python3.9[243665]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/podman_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1760434951.3877077-743-35977267905629/.source.yaml follow=False _original_basename=podman_exporter.yaml.j2 checksum=7ccb5eca2ff1dc337c3f3ecbbff5245af7149c47 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 05:42:33 localhost python3.9[243775]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 14 05:42:34 localhost nova_compute[238069]: 2025-10-14 09:42:34.684 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 14 05:42:34 localhost python3.9[243885]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=podman.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 14 05:42:34 localhost systemd[1]: Reloading.
Oct 14 05:42:34 localhost systemd-rc-local-generator[243914]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 05:42:34 localhost systemd-sysv-generator[243917]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 05:42:34 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 05:42:35 localhost systemd[1]: Listening on Podman API Socket.
Oct 14 05:42:35 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64207 DF PROTO=TCP SPT=35758 DPT=9882 SEQ=3336112850 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B0C1F5A0000000001030307)
Oct 14 05:42:35 localhost python3.9[244034]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ceilometer_agent_compute/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 05:42:36 localhost nova_compute[238069]: 2025-10-14 09:42:36.268 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 14 05:42:36 localhost python3.9[244122]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ceilometer_agent_compute/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1760434955.4744964-1258-64049009286097/.source _original_basename=healthcheck follow=False checksum=ebb343c21fce35a02591a9351660cb7035a47d42 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Oct 14 05:42:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f.
Oct 14 05:42:36 localhost podman[244131]: 2025-10-14 09:42:36.762916104 +0000 UTC m=+0.098005297 container health_status 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Oct 14 05:42:36 localhost podman[244131]: 2025-10-14 09:42:36.798981545 +0000 UTC m=+0.134070708 container exec_died 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_id=ovn_controller, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3)
Oct 14 05:42:36 localhost systemd[1]: 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f.service: Deactivated successfully.
Oct 14 05:42:37 localhost python3.9[244200]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ceilometer_agent_compute/healthcheck.future follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 05:42:37 localhost python3.9[244288]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ceilometer_agent_compute/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1760434955.4744964-1258-64049009286097/.source.future _original_basename=healthcheck.future follow=False checksum=d500a98192f4ddd70b4dfdc059e2d81aed36a294 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Oct 14 05:42:38 localhost python3.9[244398]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/telemetry config_pattern=ceilometer_agent_compute.json debug=False
Oct 14 05:42:38 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51913 DF PROTO=TCP SPT=33394 DPT=9105 SEQ=1381172159 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B0C2C7F0000000001030307)
Oct 14 05:42:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857.
Oct 14 05:42:39 localhost systemd[1]: tmp-crun.1CPUlb.mount: Deactivated successfully.
Oct 14 05:42:39 localhost podman[244509]: 2025-10-14 09:42:39.654913584 +0000 UTC m=+0.068520867 container health_status 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Oct 14 05:42:39 localhost podman[244509]: 2025-10-14 09:42:39.686901587 +0000 UTC m=+0.100508820 container exec_died 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team)
Oct 14 05:42:39 localhost nova_compute[238069]: 2025-10-14 09:42:39.682 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 14 05:42:39 localhost systemd[1]: 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857.service: Deactivated successfully.
Oct 14 05:42:39 localhost python3.9[244508]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Oct 14 05:42:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51914 DF PROTO=TCP SPT=33394 DPT=9105 SEQ=1381172159 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B0C309A0000000001030307)
Oct 14 05:42:40 localhost nova_compute[238069]: 2025-10-14 09:42:40.086 2 DEBUG oslo_service.periodic_task [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 14 05:42:40 localhost nova_compute[238069]: 2025-10-14 09:42:40.086 2 DEBUG oslo_service.periodic_task [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 14 05:42:40 localhost nova_compute[238069]: 2025-10-14 09:42:40.087 2 DEBUG nova.compute.manager [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct 14 05:42:40 localhost nova_compute[238069]: 2025-10-14 09:42:40.087 2 DEBUG nova.compute.manager [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct 14 05:42:40 localhost python3[244636]: ansible-edpm_container_manage Invoked
with concurrency=1 config_dir=/var/lib/openstack/config/telemetry config_id=edpm config_overrides={} config_patterns=ceilometer_agent_compute.json log_base_path=/var/log/containers/stdouts debug=False Oct 14 05:42:41 localhost python3[244636]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: [#012 {#012 "Id": "ff8aaa87a0dadf978d112c753603163797c5ab8a31d9fdfbc1412a1a3cc6baaa",#012 "Digest": "sha256:fdfe6c13298281d9bde0044bcf6e037d1a31c741234642f0584858e76761296b",#012 "RepoTags": [#012 "quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified"#012 ],#012 "RepoDigests": [#012 "quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:fdfe6c13298281d9bde0044bcf6e037d1a31c741234642f0584858e76761296b"#012 ],#012 "Parent": "",#012 "Comment": "",#012 "Created": "2025-10-14T06:21:17.025659624Z",#012 "Config": {#012 "User": "root",#012 "Env": [#012 "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin",#012 "LANG=en_US.UTF-8",#012 "TZ=UTC",#012 "container=oci"#012 ],#012 "Entrypoint": [#012 "dumb-init",#012 "--single-child",#012 "--"#012 ],#012 "Cmd": [#012 "kolla_start"#012 ],#012 "Labels": {#012 "io.buildah.version": "1.41.3",#012 "maintainer": "OpenStack Kubernetes Operator team",#012 "org.label-schema.build-date": "20251009",#012 "org.label-schema.license": "GPLv2",#012 "org.label-schema.name": "CentOS Stream 9 Base Image",#012 "org.label-schema.schema-version": "1.0",#012 "org.label-schema.vendor": "CentOS",#012 "tcib_build_tag": "0468cb21803d466b2abfe00835cf1d2d",#012 "tcib_managed": "true"#012 },#012 "StopSignal": "SIGTERM"#012 },#012 "Version": "",#012 "Author": "",#012 "Architecture": "amd64",#012 "Os": "linux",#012 "Size": 505004291,#012 "VirtualSize": 505004291,#012 "GraphDriver": {#012 "Name": "overlay",#012 "Data": {#012 "LowerDir": 
"/var/lib/containers/storage/overlay/56898ab6d39b47764ac69f563001cff1a6e38a16fd0080c65298dff54892d790/diff:/var/lib/containers/storage/overlay/1b94024f0eaacdff3ae200e2172324d7aec107282443f6fc22fe2f0287bc90ec/diff:/var/lib/containers/storage/overlay/0b52816892c0967aea6a33893e73899adbf76e3ca055f6670535905d8ddf2b2c/diff:/var/lib/containers/storage/overlay/f3f40f6483bf6d587286da9e86e40878c2aaaf723da5aa2364fff24f5ea28424/diff",#012 "UpperDir": "/var/lib/containers/storage/overlay/3a5231add129a89d0adead7ab11bea3dfa286b532e456cc25a1ad81207e8880c/diff",#012 "WorkDir": "/var/lib/containers/storage/overlay/3a5231add129a89d0adead7ab11bea3dfa286b532e456cc25a1ad81207e8880c/work"#012 }#012 },#012 "RootFS": {#012 "Type": "layers",#012 "Layers": [#012 "sha256:f3f40f6483bf6d587286da9e86e40878c2aaaf723da5aa2364fff24f5ea28424",#012 "sha256:2896905ce9321c1f2feb1f3ada413e86eda3444455358ab965478a041351b392",#012 "sha256:f640179b0564dc7abbe22bd39fc8810d5bbb8e54094fe7ebc5b3c45b658c4983",#012 "sha256:a244c51d91c7fa48dd864b4fedb26f2afb3cd16eb13faecea61eec45f3182851",#012 "sha256:4da4e1be651faf4cb682c510a475353c690bc8308e24a4b892f317b994e706e4"#012 ]#012 },#012 "Labels": {#012 "io.buildah.version": "1.41.3",#012 "maintainer": "OpenStack Kubernetes Operator team",#012 "org.label-schema.build-date": "20251009",#012 "org.label-schema.license": "GPLv2",#012 "org.label-schema.name": "CentOS Stream 9 Base Image",#012 "org.label-schema.schema-version": "1.0",#012 "org.label-schema.vendor": "CentOS",#012 "tcib_build_tag": "0468cb21803d466b2abfe00835cf1d2d",#012 "tcib_managed": "true"#012 },#012 "Annotations": {},#012 "ManifestType": "application/vnd.docker.distribution.manifest.v2+json",#012 "User": "root",#012 "History": [#012 {#012 "created": "2025-10-09T00:18:03.867908726Z",#012 "created_by": "/bin/sh -c #(nop) ADD file:b2e608b9da8e087a764c2aebbd9c2cc9181047f5b301f1dab77fdf098a28268b in / ",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-10-09T00:18:03.868015697Z",#012 "created_by": 
"/bin/sh -c #(nop) LABEL org.label-schema.schema-version=\"1.0\" org.label-schema.name=\"CentOS Stream 9 Base Image\" org.label-schema.vendor=\"CentOS\" org.label-schema.license=\"GPLv2\" org.label-schema.build-date=\"20251009\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-10-09T00:18:07.890794359Z",#012 "created_by": "/bin/sh -c #(nop) CMD [\"/bin/bash\"]"#012 },#012 {#012 "created": "2025-10-14T06:08:54.969219151Z",#012 "created_by": "/bin/sh -c #(nop) LABEL maintainer=\"OpenStack Kubernetes Operator team\"",#012 "comment": "FROM quay.io/centos/centos:stream9",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-10-14T06:08:54.969253522Z",#012 "created_by": "/bin/sh -c #(nop) LABEL tcib_managed=true",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-10-14T06:08:54.969285133Z",#012 "created_by": "/bin/sh -c #(nop) ENV LANG=\"en_US.UTF-8\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-10-14T06:08:54.969308103Z",#012 "created_by": "/bin/sh -c #(nop) ENV TZ=\"UTC\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-10-14T06:08:54.969342284Z",#012 "created_by": "/bin/sh -c #(nop) ENV container=\"oci\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-10-14T06:08:54.969363945Z",#012 "created_by": "/bin/sh -c #(nop) USER root",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-10-14T06:08:55.340499198Z",#012 "created_by": "/bin/sh -c if [ -f \"/etc/yum.repos.d/ubi.repo\" ]; then rm -f /etc/yum.repos.d/ubi.repo && dnf clean all && rm -rf /var/cache/dnf; fi",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-10-14T06:09:32.389605838Z",#012 "created_by": "/bin/sh -c dnf install -y crudini && crudini --del /etc/dnf/dnf.conf main override_install_langs && crudini --set /etc/dnf/dnf.conf main clean_requirements_on_remove True && crudini --set /etc/dnf/dnf.conf main exactarch 1 && crudini --set /etc/dnf/dnf.conf main gpgcheck 1 && crudini --set /etc/dnf/dnf.conf main install_weak_deps False && if [ 
'centos' == 'centos' ];then crudini --set /etc/dnf/dnf.conf main best False; fi && crudini --set /etc/dnf/dnf.conf main installonly_limit 0 && crudini --set /etc/dnf/dnf.conf main keepcache 0 && crudini --set /etc/dnf/dnf.conf main obsoletes 1 && crudini --set /etc/dnf/dnf.conf main plugins 1 && crudini --set /etc/dnf/dnf.conf main skip_missing_names_on_install False && crudini --set /etc/dnf/dnf.conf main tsflags nodocs",#012 "empty_layer": true#012 },#012 Oct 14 05:42:41 localhost ceph-osd[31500]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Oct 14 05:42:41 localhost nova_compute[238069]: 2025-10-14 09:42:41.271 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:42:41 localhost ceph-osd[31500]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 6600.1 total, 600.0 interval#012Cumulative writes: 4960 writes, 22K keys, 4960 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s#012Cumulative WAL: 4960 writes, 649 syncs, 7.64 writes per sync, written: 0.02 GB, 0.00 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s#012Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent Oct 14 05:42:41 localhost podman[244687]: 2025-10-14 09:42:41.279794257 +0000 UTC m=+0.089601402 container remove def781417ff6f60ca0bb795ee153f9cd4716e075e3949752f45553ce2f08ff6e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, version=17.1.9, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0fa4c62fe8881d1f7112b22e9fd9421c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., name=rhosp17/openstack-ceilometer-compute, container_name=ceilometer_agent_compute, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20250721.1, managed_by=tripleo_ansible, architecture=x86_64, build-date=2025-07-21T14:45:33, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, io.openshift.expose-services=, release=1, com.redhat.component=openstack-ceilometer-compute-container) Oct 14 05:42:41 localhost python3[244636]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman rm --force 
ceilometer_agent_compute Oct 14 05:42:41 localhost podman[244703]: Oct 14 05:42:41 localhost podman[244703]: 2025-10-14 09:42:41.350491328 +0000 UTC m=+0.056147168 container create 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, maintainer=OpenStack Kubernetes Operator team, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251009, io.buildah.version=1.41.3, config_id=edpm) Oct 14 05:42:41 localhost podman[244703]: 2025-10-14 09:42:41.326639172 +0000 UTC m=+0.032295022 image 
pull quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified Oct 14 05:42:41 localhost python3[244636]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ceilometer_agent_compute --conmon-pidfile /run/ceilometer_agent_compute.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env OS_ENDPOINT_TYPE=internal --healthcheck-command /openstack/healthcheck compute --label config_id=edpm --label container_name=ceilometer_agent_compute --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']} --log-driver journald --log-level info --network host --security-opt label:type:ceilometer_polling_t --user ceilometer --volume /var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z --volume /var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z --volume /run/libvirt:/run/libvirt:shared,ro --volume 
/etc/hosts:/etc/hosts:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /dev/log:/dev/log --volume /var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified kolla_start Oct 14 05:42:41 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51915 DF PROTO=TCP SPT=33394 DPT=9105 SEQ=1381172159 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B0C389A0000000001030307) Oct 14 05:42:42 localhost nova_compute[238069]: 2025-10-14 09:42:42.121 2 DEBUG oslo_concurrency.lockutils [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Acquiring lock "refresh_cache-88c4e366-b765-47a6-96bf-f7677f2ce67c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Oct 14 05:42:42 localhost nova_compute[238069]: 2025-10-14 09:42:42.122 2 DEBUG oslo_concurrency.lockutils [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Acquired lock "refresh_cache-88c4e366-b765-47a6-96bf-f7677f2ce67c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Oct 14 05:42:42 localhost nova_compute[238069]: 2025-10-14 09:42:42.122 2 DEBUG nova.network.neutron [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] [instance: 88c4e366-b765-47a6-96bf-f7677f2ce67c] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Oct 14 05:42:42 localhost nova_compute[238069]: 2025-10-14 09:42:42.122 2 DEBUG nova.objects.instance [None 
req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 88c4e366-b765-47a6-96bf-f7677f2ce67c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Oct 14 05:42:42 localhost python3.9[244851]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Oct 14 05:42:43 localhost python3.9[244963]: ansible-file Invoked with path=/etc/systemd/system/edpm_ceilometer_agent_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:42:43 localhost nova_compute[238069]: 2025-10-14 09:42:43.466 2 DEBUG nova.network.neutron [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] [instance: 88c4e366-b765-47a6-96bf-f7677f2ce67c] Updating instance_info_cache with network_info: [{"id": "3ec9b060-f43d-4698-9c76-6062c70911d5", "address": "fa:16:3e:84:5e:e5", "network": {"id": "7d0cd696-bdd7-4e70-9512-eb0d23640314", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.46", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "41187b090f3d4818a32baa37ce8a3991", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3ec9b060-f4", 
"ovs_interfaceid": "3ec9b060-f43d-4698-9c76-6062c70911d5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Oct 14 05:42:43 localhost nova_compute[238069]: 2025-10-14 09:42:43.484 2 DEBUG oslo_concurrency.lockutils [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Releasing lock "refresh_cache-88c4e366-b765-47a6-96bf-f7677f2ce67c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Oct 14 05:42:43 localhost nova_compute[238069]: 2025-10-14 09:42:43.485 2 DEBUG nova.compute.manager [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] [instance: 88c4e366-b765-47a6-96bf-f7677f2ce67c] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Oct 14 05:42:43 localhost nova_compute[238069]: 2025-10-14 09:42:43.486 2 DEBUG oslo_service.periodic_task [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 05:42:43 localhost nova_compute[238069]: 2025-10-14 09:42:43.486 2 DEBUG oslo_service.periodic_task [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 05:42:43 localhost nova_compute[238069]: 2025-10-14 09:42:43.487 2 DEBUG oslo_service.periodic_task [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 05:42:43 localhost 
nova_compute[238069]: 2025-10-14 09:42:43.487 2 DEBUG oslo_service.periodic_task [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 05:42:43 localhost nova_compute[238069]: 2025-10-14 09:42:43.487 2 DEBUG oslo_service.periodic_task [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 05:42:43 localhost nova_compute[238069]: 2025-10-14 09:42:43.488 2 DEBUG oslo_service.periodic_task [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 05:42:43 localhost nova_compute[238069]: 2025-10-14 09:42:43.489 2 DEBUG nova.compute.manager [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... 
_reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Oct 14 05:42:43 localhost nova_compute[238069]: 2025-10-14 09:42:43.489 2 DEBUG oslo_service.periodic_task [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 05:42:43 localhost nova_compute[238069]: 2025-10-14 09:42:43.505 2 DEBUG oslo_concurrency.lockutils [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Oct 14 05:42:43 localhost nova_compute[238069]: 2025-10-14 09:42:43.506 2 DEBUG oslo_concurrency.lockutils [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Oct 14 05:42:43 localhost nova_compute[238069]: 2025-10-14 09:42:43.506 2 DEBUG oslo_concurrency.lockutils [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Oct 14 05:42:43 localhost nova_compute[238069]: 2025-10-14 09:42:43.507 2 DEBUG nova.compute.resource_tracker [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Auditing locally available compute resources for np0005486733.localdomain (node: np0005486733.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Oct 14 05:42:43 localhost nova_compute[238069]: 2025-10-14 09:42:43.508 2 DEBUG 
oslo_concurrency.processutils [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Oct 14 05:42:43 localhost python3.9[245073]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1760434963.1316218-1450-148751264767425/source dest=/etc/systemd/system/edpm_ceilometer_agent_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:42:43 localhost nova_compute[238069]: 2025-10-14 09:42:43.902 2 DEBUG oslo_concurrency.processutils [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.395s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Oct 14 05:42:43 localhost nova_compute[238069]: 2025-10-14 09:42:43.954 2 DEBUG nova.virt.libvirt.driver [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Oct 14 05:42:43 localhost nova_compute[238069]: 2025-10-14 09:42:43.954 2 DEBUG nova.virt.libvirt.driver [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Oct 14 05:42:44 localhost nova_compute[238069]: 2025-10-14 09:42:44.150 2 WARNING nova.virt.libvirt.driver [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] This host appears to have 
multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m Oct 14 05:42:44 localhost nova_compute[238069]: 2025-10-14 09:42:44.151 2 DEBUG nova.compute.resource_tracker [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Hypervisor/Node resource view: name=np0005486733.localdomain free_ram=12885MB free_disk=41.83725357055664GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": 
"0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Oct 14 05:42:44 localhost nova_compute[238069]: 2025-10-14 09:42:44.152 2 DEBUG oslo_concurrency.lockutils [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Oct 14 05:42:44 localhost nova_compute[238069]: 2025-10-14 09:42:44.152 2 DEBUG oslo_concurrency.lockutils [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Oct 14 05:42:44 localhost nova_compute[238069]: 2025-10-14 09:42:44.235 2 DEBUG nova.compute.resource_tracker [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Instance 88c4e366-b765-47a6-96bf-f7677f2ce67c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Oct 14 05:42:44 localhost nova_compute[238069]: 2025-10-14 09:42:44.235 2 DEBUG nova.compute.resource_tracker [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Oct 14 05:42:44 localhost nova_compute[238069]: 2025-10-14 09:42:44.236 2 DEBUG nova.compute.resource_tracker [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Final resource view: name=np0005486733.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Oct 14 05:42:44 localhost nova_compute[238069]: 2025-10-14 09:42:44.281 2 DEBUG oslo_concurrency.processutils [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Oct 14 05:42:44 localhost nova_compute[238069]: 2025-10-14 09:42:44.730 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:42:44 localhost nova_compute[238069]: 2025-10-14 09:42:44.750 2 DEBUG oslo_concurrency.processutils [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.469s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Oct 14 05:42:44 localhost nova_compute[238069]: 2025-10-14 09:42:44.757 2 DEBUG nova.compute.provider_tree [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Inventory has not changed in ProviderTree for provider: 
18c24273-aca2-4f08-be57-3188d558235e update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Oct 14 05:42:44 localhost nova_compute[238069]: 2025-10-14 09:42:44.782 2 DEBUG nova.scheduler.client.report [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Inventory has not changed for provider 18c24273-aca2-4f08-be57-3188d558235e based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Oct 14 05:42:44 localhost nova_compute[238069]: 2025-10-14 09:42:44.785 2 DEBUG nova.compute.resource_tracker [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Compute_service record updated for np0005486733.localdomain:np0005486733.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Oct 14 05:42:44 localhost nova_compute[238069]: 2025-10-14 09:42:44.786 2 DEBUG oslo_concurrency.lockutils [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.634s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Oct 14 05:42:44 localhost python3.9[245169]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None Oct 14 05:42:44 localhost systemd[1]: Reloading. Oct 14 05:42:44 localhost systemd-rc-local-generator[245198]: /etc/rc.d/rc.local is not marked executable, skipping. 
Oct 14 05:42:44 localhost systemd-sysv-generator[245203]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Oct 14 05:42:45 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Oct 14 05:42:45 localhost ceph-osd[32440]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Oct 14 05:42:45 localhost ceph-osd[32440]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 6600.1 total, 600.0 interval#012Cumulative writes: 5551 writes, 24K keys, 5551 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s#012Cumulative WAL: 5551 writes, 763 syncs, 7.28 writes per sync, written: 0.02 GB, 0.00 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s#012Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent Oct 14 05:42:45 localhost python3.9[245263]: ansible-systemd Invoked with state=restarted name=edpm_ceilometer_agent_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Oct 14 05:42:45 localhost systemd[1]: Reloading. Oct 14 05:42:45 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51916 DF PROTO=TCP SPT=33394 DPT=9105 SEQ=1381172159 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B0C485A0000000001030307) Oct 14 05:42:45 localhost systemd-rc-local-generator[245288]: /etc/rc.d/rc.local is not marked executable, skipping. 
Oct 14 05:42:45 localhost systemd-sysv-generator[245294]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Oct 14 05:42:45 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Oct 14 05:42:46 localhost systemd[1]: Starting ceilometer_agent_compute container... Oct 14 05:42:46 localhost systemd[1]: Started libcrun container. Oct 14 05:42:46 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6c68ab3764f47ae337b1b49938a6a3338e47a4e5978133565cf0aefb9a2ae56a/merged/var/lib/openstack/config supports timestamps until 2038 (0x7fffffff) Oct 14 05:42:46 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6c68ab3764f47ae337b1b49938a6a3338e47a4e5978133565cf0aefb9a2ae56a/merged/var/lib/kolla/config_files/config.json supports timestamps until 2038 (0x7fffffff) Oct 14 05:42:46 localhost nova_compute[238069]: 2025-10-14 09:42:46.274 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:42:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d. 
Oct 14 05:42:46 localhost podman[245303]: 2025-10-14 09:42:46.278118102 +0000 UTC m=+0.116683792 container init 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, config_id=edpm) Oct 14 05:42:46 localhost ceilometer_agent_compute[245315]: + sudo -E kolla_set_configs Oct 14 05:42:46 localhost ceilometer_agent_compute[245315]: sudo: unable to send 
audit message: Operation not permitted Oct 14 05:42:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d. Oct 14 05:42:46 localhost podman[245303]: 2025-10-14 09:42:46.307777555 +0000 UTC m=+0.146343225 container start 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, config_id=edpm, 
io.buildah.version=1.41.3) Oct 14 05:42:46 localhost podman[245303]: ceilometer_agent_compute Oct 14 05:42:46 localhost systemd[1]: Started ceilometer_agent_compute container. Oct 14 05:42:46 localhost ceilometer_agent_compute[245315]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json Oct 14 05:42:46 localhost ceilometer_agent_compute[245315]: INFO:__main__:Validating config file Oct 14 05:42:46 localhost ceilometer_agent_compute[245315]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS Oct 14 05:42:46 localhost ceilometer_agent_compute[245315]: INFO:__main__:Copying service configuration files Oct 14 05:42:46 localhost ceilometer_agent_compute[245315]: INFO:__main__:Deleting /etc/ceilometer/ceilometer.conf Oct 14 05:42:46 localhost ceilometer_agent_compute[245315]: INFO:__main__:Copying /var/lib/openstack/config/ceilometer.conf to /etc/ceilometer/ceilometer.conf Oct 14 05:42:46 localhost ceilometer_agent_compute[245315]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf Oct 14 05:42:46 localhost ceilometer_agent_compute[245315]: INFO:__main__:Deleting /etc/ceilometer/polling.yaml Oct 14 05:42:46 localhost ceilometer_agent_compute[245315]: INFO:__main__:Copying /var/lib/openstack/config/polling.yaml to /etc/ceilometer/polling.yaml Oct 14 05:42:46 localhost ceilometer_agent_compute[245315]: INFO:__main__:Setting permission for /etc/ceilometer/polling.yaml Oct 14 05:42:46 localhost ceilometer_agent_compute[245315]: INFO:__main__:Copying /var/lib/openstack/config/custom.conf to /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf Oct 14 05:42:46 localhost ceilometer_agent_compute[245315]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf Oct 14 05:42:46 localhost ceilometer_agent_compute[245315]: INFO:__main__:Copying /var/lib/openstack/config/ceilometer-host-specific.conf to /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf Oct 14 05:42:46 
localhost ceilometer_agent_compute[245315]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf Oct 14 05:42:46 localhost ceilometer_agent_compute[245315]: INFO:__main__:Writing out command to execute Oct 14 05:42:46 localhost ceilometer_agent_compute[245315]: ++ cat /run_command Oct 14 05:42:46 localhost ceilometer_agent_compute[245315]: + CMD='/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout' Oct 14 05:42:46 localhost ceilometer_agent_compute[245315]: + ARGS= Oct 14 05:42:46 localhost ceilometer_agent_compute[245315]: + sudo kolla_copy_cacerts Oct 14 05:42:46 localhost ceilometer_agent_compute[245315]: sudo: unable to send audit message: Operation not permitted Oct 14 05:42:46 localhost ceilometer_agent_compute[245315]: Running command: '/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout' Oct 14 05:42:46 localhost ceilometer_agent_compute[245315]: + [[ ! -n '' ]] Oct 14 05:42:46 localhost ceilometer_agent_compute[245315]: + . 
kolla_extend_start Oct 14 05:42:46 localhost ceilometer_agent_compute[245315]: + echo 'Running command: '\''/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'\''' Oct 14 05:42:46 localhost ceilometer_agent_compute[245315]: + umask 0022 Oct 14 05:42:46 localhost ceilometer_agent_compute[245315]: + exec /usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout Oct 14 05:42:46 localhost podman[245324]: 2025-10-14 09:42:46.406043529 +0000 UTC m=+0.088737817 container health_status 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=starting, org.label-schema.schema-version=1.0, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d) Oct 14 05:42:46 localhost podman[245324]: 2025-10-14 09:42:46.445158569 +0000 UTC m=+0.127852897 container exec_died 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, 
managed_by=edpm_ansible, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true) Oct 14 05:42:46 localhost podman[245324]: unhealthy Oct 14 05:42:46 localhost systemd[1]: 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d.service: Main process exited, code=exited, status=1/FAILURE Oct 14 05:42:46 localhost systemd[1]: 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d.service: Failed with result 'exit-code'. Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.119 2 DEBUG cotyledon.oslo_config_glue [-] Full set of CONF: _load_service_manager_options /usr/lib/python3.9/site-packages/cotyledon/oslo_config_glue.py:40 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.119 2 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.119 2 DEBUG cotyledon.oslo_config_glue [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.119 2 DEBUG cotyledon.oslo_config_glue [-] command line args: ['--polling-namespaces', 'compute', '--logfile', '/dev/stdout'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.120 2 DEBUG cotyledon.oslo_config_glue [-] config files: ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.120 2 DEBUG cotyledon.oslo_config_glue [-] ================================================================================ log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.120 2 DEBUG cotyledon.oslo_config_glue [-] batch_size = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.120 2 DEBUG cotyledon.oslo_config_glue [-] cfg_file = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.120 2 DEBUG cotyledon.oslo_config_glue [-] config_dir = ['/etc/ceilometer/ceilometer.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.120 2 DEBUG cotyledon.oslo_config_glue [-] config_file = ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.120 2 DEBUG cotyledon.oslo_config_glue [-] config_source = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.120 2 DEBUG cotyledon.oslo_config_glue [-] debug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.120 2 DEBUG cotyledon.oslo_config_glue [-] default_log_levels = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 
'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'futurist=INFO', 'neutronclient=INFO', 'keystoneclient=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.120 2 DEBUG cotyledon.oslo_config_glue [-] event_pipeline_cfg_file = event_pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.121 2 DEBUG cotyledon.oslo_config_glue [-] graceful_shutdown_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.121 2 DEBUG cotyledon.oslo_config_glue [-] host = np0005486733.localdomain log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.121 2 DEBUG cotyledon.oslo_config_glue [-] http_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.121 2 DEBUG cotyledon.oslo_config_glue [-] hypervisor_inspector = libvirt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.121 2 DEBUG cotyledon.oslo_config_glue [-] instance_format = [instance: %(uuid)s] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.121 2 DEBUG cotyledon.oslo_config_glue [-] instance_uuid_format = [instance: %(uuid)s] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.121 2 DEBUG cotyledon.oslo_config_glue [-] libvirt_type = kvm log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.121 2 DEBUG cotyledon.oslo_config_glue [-] libvirt_uri = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.121 2 DEBUG cotyledon.oslo_config_glue [-] log_config_append = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.121 2 DEBUG cotyledon.oslo_config_glue [-] log_date_format = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.121 2 DEBUG cotyledon.oslo_config_glue [-] log_dir = /var/log/ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.122 2 DEBUG cotyledon.oslo_config_glue [-] log_file = /dev/stdout log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.122 2 DEBUG cotyledon.oslo_config_glue [-] log_options = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.122 2 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.122 2 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval_type = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.122 2 DEBUG cotyledon.oslo_config_glue [-] log_rotation_type = none log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.122 2 DEBUG cotyledon.oslo_config_glue [-] logging_context_format_string = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.122 2 DEBUG cotyledon.oslo_config_glue [-] logging_debug_format_suffix = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.122 2 DEBUG cotyledon.oslo_config_glue [-] logging_default_format_string = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.122 2 DEBUG cotyledon.oslo_config_glue [-] logging_exception_prefix = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.122 2 DEBUG cotyledon.oslo_config_glue [-] logging_user_identity_format = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.122 2 DEBUG cotyledon.oslo_config_glue [-] max_logfile_count = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.122 2 DEBUG cotyledon.oslo_config_glue [-] max_logfile_size_mb = 200 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.123 2 DEBUG cotyledon.oslo_config_glue [-] max_parallel_requests = 64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.123 2 DEBUG cotyledon.oslo_config_glue [-] partitioning_group_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.123 2 DEBUG cotyledon.oslo_config_glue [-] pipeline_cfg_file = pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.123 2 DEBUG cotyledon.oslo_config_glue [-] polling_namespaces = ['compute'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.123 2 DEBUG cotyledon.oslo_config_glue [-] pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.123 2 DEBUG cotyledon.oslo_config_glue [-] publish_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.123 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_burst = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.123 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_except_level = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.123 2 DEBUG 
cotyledon.oslo_config_glue [-] rate_limit_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.123 2 DEBUG cotyledon.oslo_config_glue [-] reseller_prefix = AUTH_ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.124 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_keys = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.124 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_length = 256 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.124 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_namespace = ['metering.'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.124 2 DEBUG cotyledon.oslo_config_glue [-] rootwrap_config = /etc/ceilometer/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.124 2 DEBUG cotyledon.oslo_config_glue [-] sample_source = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.124 2 DEBUG cotyledon.oslo_config_glue [-] syslog_log_facility = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.124 2 DEBUG cotyledon.oslo_config_glue [-] tenant_name_discovery = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Oct 14 05:42:47 localhost 
ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.124 2 DEBUG cotyledon.oslo_config_glue [-] use_eventlog = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.124 2 DEBUG cotyledon.oslo_config_glue [-] use_journal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.125 2 DEBUG cotyledon.oslo_config_glue [-] use_json = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.125 2 DEBUG cotyledon.oslo_config_glue [-] use_stderr = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.125 2 DEBUG cotyledon.oslo_config_glue [-] use_syslog = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.125 2 DEBUG cotyledon.oslo_config_glue [-] watch_log_file = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.125 2 DEBUG cotyledon.oslo_config_glue [-] compute.instance_discovery_method = libvirt_metadata log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.125 2 DEBUG cotyledon.oslo_config_glue [-] compute.resource_cache_expiry = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.125 2 DEBUG cotyledon.oslo_config_glue [-] compute.resource_update_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 
localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.125 2 DEBUG cotyledon.oslo_config_glue [-] coordination.backend_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.125 2 DEBUG cotyledon.oslo_config_glue [-] event.definitions_cfg_file = event_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.125 2 DEBUG cotyledon.oslo_config_glue [-] event.drop_unmatched_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.125 2 DEBUG cotyledon.oslo_config_glue [-] event.store_raw = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.125 2 DEBUG cotyledon.oslo_config_glue [-] ipmi.node_manager_init_retry = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.126 2 DEBUG cotyledon.oslo_config_glue [-] ipmi.polling_retry = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.126 2 DEBUG cotyledon.oslo_config_glue [-] meter.meter_definitions_dirs = ['/etc/ceilometer/meters.d', '/usr/lib/python3.9/site-packages/ceilometer/data/meters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.126 2 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_on_failure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.126 2 DEBUG 
cotyledon.oslo_config_glue [-] monasca.archive_path = mon_pub_failures.txt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.126 2 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_section = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.126 2 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.126 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_count = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.126 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_max_retries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.126 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_mode = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.126 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_polling_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.126 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_timeout = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.127 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost 
ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.127 2 DEBUG cotyledon.oslo_config_glue [-] monasca.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.127 2 DEBUG cotyledon.oslo_config_glue [-] monasca.client_max_retries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.127 2 DEBUG cotyledon.oslo_config_glue [-] monasca.client_retry_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.127 2 DEBUG cotyledon.oslo_config_glue [-] monasca.clientapi_version = 2_0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.127 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cloud_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.127 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cluster = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.127 2 DEBUG cotyledon.oslo_config_glue [-] monasca.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.127 2 DEBUG cotyledon.oslo_config_glue [-] monasca.control_plane = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.127 2 DEBUG cotyledon.oslo_config_glue [-] monasca.enable_api_pagination = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 
Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.127 2 DEBUG cotyledon.oslo_config_glue [-] monasca.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.128 2 DEBUG cotyledon.oslo_config_glue [-] monasca.interface = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.128 2 DEBUG cotyledon.oslo_config_glue [-] monasca.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.128 2 DEBUG cotyledon.oslo_config_glue [-] monasca.monasca_mappings = /etc/ceilometer/monasca_field_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.128 2 DEBUG cotyledon.oslo_config_glue [-] monasca.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.128 2 DEBUG cotyledon.oslo_config_glue [-] monasca.retry_on_failure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.128 2 DEBUG cotyledon.oslo_config_glue [-] monasca.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.128 2 DEBUG cotyledon.oslo_config_glue [-] monasca.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.128 2 DEBUG cotyledon.oslo_config_glue [-] notification.ack_on_event_error = True 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.128 2 DEBUG cotyledon.oslo_config_glue [-] notification.batch_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.128 2 DEBUG cotyledon.oslo_config_glue [-] notification.batch_timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.128 2 DEBUG cotyledon.oslo_config_glue [-] notification.messaging_urls = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.128 2 DEBUG cotyledon.oslo_config_glue [-] notification.notification_control_exchanges = ['nova', 'glance', 'neutron', 'cinder', 'heat', 'keystone', 'sahara', 'trove', 'zaqar', 'swift', 'ceilometer', 'magnum', 'dns', 'ironic', 'aodh'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.129 2 DEBUG cotyledon.oslo_config_glue [-] notification.pipelines = ['meter', 'event'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.129 2 DEBUG cotyledon.oslo_config_glue [-] notification.workers = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.129 2 DEBUG cotyledon.oslo_config_glue [-] polling.batch_size = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.129 2 DEBUG cotyledon.oslo_config_glue [-] polling.cfg_file = polling.yaml log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.129 2 DEBUG cotyledon.oslo_config_glue [-] polling.partitioning_group_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.129 2 DEBUG cotyledon.oslo_config_glue [-] polling.pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.129 2 DEBUG cotyledon.oslo_config_glue [-] polling.tenant_name_discovery = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.129 2 DEBUG cotyledon.oslo_config_glue [-] publisher.telemetry_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.129 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.event_topic = event log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.129 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.metering_topic = metering log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.129 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.telemetry_driver = messagingv2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.130 2 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.access_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost 
ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.130 2 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.secret_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.130 2 DEBUG cotyledon.oslo_config_glue [-] rgw_client.implicit_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.130 2 DEBUG cotyledon.oslo_config_glue [-] service_types.cinder = volumev3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.130 2 DEBUG cotyledon.oslo_config_glue [-] service_types.glance = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.130 2 DEBUG cotyledon.oslo_config_glue [-] service_types.neutron = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.130 2 DEBUG cotyledon.oslo_config_glue [-] service_types.nova = compute log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.130 2 DEBUG cotyledon.oslo_config_glue [-] service_types.radosgw = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.130 2 DEBUG cotyledon.oslo_config_glue [-] service_types.swift = object-store log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.130 2 DEBUG cotyledon.oslo_config_glue [-] vmware.api_retry_count = 10 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.130 2 DEBUG cotyledon.oslo_config_glue [-] vmware.ca_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.130 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_ip = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.131 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_password = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.131 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_port = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.131 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_username = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.131 2 DEBUG cotyledon.oslo_config_glue [-] vmware.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.131 2 DEBUG cotyledon.oslo_config_glue [-] vmware.task_poll_interval = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.131 2 DEBUG cotyledon.oslo_config_glue [-] vmware.wsdl_location = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.131 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_section = None 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.131 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_type = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.131 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.131 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.131 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.131 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.132 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.interface = internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.132 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.132 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost 
ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.132 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.132 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.132 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_section = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.132 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.132 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.132 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.132 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.132 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.132 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.interface = internal log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.132 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.133 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.133 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.133 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.133 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_section = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.133 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.133 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.133 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.133 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.collect_timing = False log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.133 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.133 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.interface = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.133 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.133 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.134 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.134 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.134 2 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.150 12 INFO ceilometer.polling.manager [-] Looking for dynamic pollsters configurations at [['/etc/ceilometer/pollsters.d']]. 
Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.151 12 INFO ceilometer.polling.manager [-] No dynamic pollsters found in folder [/etc/ceilometer/pollsters.d]. Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.152 12 INFO ceilometer.polling.manager [-] No dynamic pollsters file found in dirs [['/etc/ceilometer/pollsters.d']]. Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.257 12 DEBUG ceilometer.compute.virt.libvirt.utils [-] Connecting to libvirt: qemu:///system new_libvirt_connection /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/utils.py:93 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.325 12 DEBUG cotyledon.oslo_config_glue [-] Full set of CONF: _load_service_options /usr/lib/python3.9/site-packages/cotyledon/oslo_config_glue.py:48 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.325 12 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.325 12 DEBUG cotyledon.oslo_config_glue [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.325 12 DEBUG cotyledon.oslo_config_glue [-] command line args: ['--polling-namespaces', 'compute', '--logfile', '/dev/stdout'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.325 12 DEBUG cotyledon.oslo_config_glue [-] config files: ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 
09:42:47.325 12 DEBUG cotyledon.oslo_config_glue [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.326 12 DEBUG cotyledon.oslo_config_glue [-] batch_size = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.326 12 DEBUG cotyledon.oslo_config_glue [-] cfg_file = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.326 12 DEBUG cotyledon.oslo_config_glue [-] config_dir = ['/etc/ceilometer/ceilometer.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.326 12 DEBUG cotyledon.oslo_config_glue [-] config_file = ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.326 12 DEBUG cotyledon.oslo_config_glue [-] config_source = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.326 12 DEBUG cotyledon.oslo_config_glue [-] control_exchange = ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.326 12 DEBUG cotyledon.oslo_config_glue [-] debug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.326 12 DEBUG cotyledon.oslo_config_glue [-] default_log_levels = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 
'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'futurist=INFO', 'neutronclient=INFO', 'keystoneclient=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.326 12 DEBUG cotyledon.oslo_config_glue [-] event_pipeline_cfg_file = event_pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.326 12 DEBUG cotyledon.oslo_config_glue [-] graceful_shutdown_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.327 12 DEBUG cotyledon.oslo_config_glue [-] host = np0005486733.localdomain log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.327 12 DEBUG cotyledon.oslo_config_glue [-] http_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.327 12 DEBUG cotyledon.oslo_config_glue [-] hypervisor_inspector = libvirt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.327 12 DEBUG cotyledon.oslo_config_glue [-] instance_format = [instance: %(uuid)s] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Oct 14 05:42:47 localhost 
ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.327 12 DEBUG cotyledon.oslo_config_glue [-] instance_uuid_format = [instance: %(uuid)s] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.327 12 DEBUG cotyledon.oslo_config_glue [-] libvirt_type = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.327 12 DEBUG cotyledon.oslo_config_glue [-] libvirt_uri = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.327 12 DEBUG cotyledon.oslo_config_glue [-] log_config_append = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.327 12 DEBUG cotyledon.oslo_config_glue [-] log_date_format = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.327 12 DEBUG cotyledon.oslo_config_glue [-] log_dir = /var/log/ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.327 12 DEBUG cotyledon.oslo_config_glue [-] log_file = /dev/stdout log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.327 12 DEBUG cotyledon.oslo_config_glue [-] log_options = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.328 12 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Oct 14 05:42:47 localhost 
ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.328 12 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval_type = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.328 12 DEBUG cotyledon.oslo_config_glue [-] log_rotation_type = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.328 12 DEBUG cotyledon.oslo_config_glue [-] logging_context_format_string = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.328 12 DEBUG cotyledon.oslo_config_glue [-] logging_debug_format_suffix = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.328 12 DEBUG cotyledon.oslo_config_glue [-] logging_default_format_string = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.328 12 DEBUG cotyledon.oslo_config_glue [-] logging_exception_prefix = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.328 12 DEBUG cotyledon.oslo_config_glue [-] logging_user_identity_format = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Oct 14 05:42:47 localhost 
ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.328 12 DEBUG cotyledon.oslo_config_glue [-] max_logfile_count = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.328 12 DEBUG cotyledon.oslo_config_glue [-] max_logfile_size_mb = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.328 12 DEBUG cotyledon.oslo_config_glue [-] max_parallel_requests = 64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.328 12 DEBUG cotyledon.oslo_config_glue [-] partitioning_group_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.328 12 DEBUG cotyledon.oslo_config_glue [-] pipeline_cfg_file = pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.329 12 DEBUG cotyledon.oslo_config_glue [-] polling_namespaces = ['compute'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.329 12 DEBUG cotyledon.oslo_config_glue [-] pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.329 12 DEBUG cotyledon.oslo_config_glue [-] publish_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.329 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_burst = 0 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.329 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_except_level = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.329 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.329 12 DEBUG cotyledon.oslo_config_glue [-] reseller_prefix = AUTH_ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.329 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_keys = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.329 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_length = 256 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.329 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_namespace = ['metering.'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.329 12 DEBUG cotyledon.oslo_config_glue [-] rootwrap_config = /etc/ceilometer/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.330 12 DEBUG cotyledon.oslo_config_glue [-] sample_source = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.330 12 DEBUG 
cotyledon.oslo_config_glue [-] syslog_log_facility = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.330 12 DEBUG cotyledon.oslo_config_glue [-] tenant_name_discovery = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.330 12 DEBUG cotyledon.oslo_config_glue [-] transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.330 12 DEBUG cotyledon.oslo_config_glue [-] use_eventlog = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.330 12 DEBUG cotyledon.oslo_config_glue [-] use_journal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.330 12 DEBUG cotyledon.oslo_config_glue [-] use_json = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.330 12 DEBUG cotyledon.oslo_config_glue [-] use_stderr = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.330 12 DEBUG cotyledon.oslo_config_glue [-] use_syslog = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.330 12 DEBUG cotyledon.oslo_config_glue [-] watch_log_file = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.330 12 DEBUG cotyledon.oslo_config_glue [-] 
compute.instance_discovery_method = libvirt_metadata log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.330 12 DEBUG cotyledon.oslo_config_glue [-] compute.resource_cache_expiry = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.331 12 DEBUG cotyledon.oslo_config_glue [-] compute.resource_update_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.331 12 DEBUG cotyledon.oslo_config_glue [-] coordination.backend_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.331 12 DEBUG cotyledon.oslo_config_glue [-] event.definitions_cfg_file = event_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.331 12 DEBUG cotyledon.oslo_config_glue [-] event.drop_unmatched_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.331 12 DEBUG cotyledon.oslo_config_glue [-] event.store_raw = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.331 12 DEBUG cotyledon.oslo_config_glue [-] ipmi.node_manager_init_retry = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.331 12 DEBUG cotyledon.oslo_config_glue [-] ipmi.polling_retry = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost 
ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.331 12 DEBUG cotyledon.oslo_config_glue [-] meter.meter_definitions_dirs = ['/etc/ceilometer/meters.d', '/usr/lib/python3.9/site-packages/ceilometer/data/meters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.331 12 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_on_failure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.331 12 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_path = mon_pub_failures.txt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.331 12 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_section = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.332 12 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.332 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_count = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.332 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_max_retries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.332 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_mode = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.332 12 DEBUG 
cotyledon.oslo_config_glue [-] monasca.batch_polling_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.332 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_timeout = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.332 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.332 12 DEBUG cotyledon.oslo_config_glue [-] monasca.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.332 12 DEBUG cotyledon.oslo_config_glue [-] monasca.client_max_retries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.332 12 DEBUG cotyledon.oslo_config_glue [-] monasca.client_retry_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.332 12 DEBUG cotyledon.oslo_config_glue [-] monasca.clientapi_version = 2_0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.332 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cloud_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.333 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cluster = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 
09:42:47.333 12 DEBUG cotyledon.oslo_config_glue [-] monasca.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.333 12 DEBUG cotyledon.oslo_config_glue [-] monasca.control_plane = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.333 12 DEBUG cotyledon.oslo_config_glue [-] monasca.enable_api_pagination = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.333 12 DEBUG cotyledon.oslo_config_glue [-] monasca.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.333 12 DEBUG cotyledon.oslo_config_glue [-] monasca.interface = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.333 12 DEBUG cotyledon.oslo_config_glue [-] monasca.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.333 12 DEBUG cotyledon.oslo_config_glue [-] monasca.monasca_mappings = /etc/ceilometer/monasca_field_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.333 12 DEBUG cotyledon.oslo_config_glue [-] monasca.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.333 12 DEBUG cotyledon.oslo_config_glue [-] monasca.retry_on_failure = False log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.333 12 DEBUG cotyledon.oslo_config_glue [-] monasca.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.334 12 DEBUG cotyledon.oslo_config_glue [-] monasca.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.334 12 DEBUG cotyledon.oslo_config_glue [-] notification.ack_on_event_error = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.334 12 DEBUG cotyledon.oslo_config_glue [-] notification.batch_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.334 12 DEBUG cotyledon.oslo_config_glue [-] notification.batch_timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.334 12 DEBUG cotyledon.oslo_config_glue [-] notification.messaging_urls = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.334 12 DEBUG cotyledon.oslo_config_glue [-] notification.notification_control_exchanges = ['nova', 'glance', 'neutron', 'cinder', 'heat', 'keystone', 'sahara', 'trove', 'zaqar', 'swift', 'ceilometer', 'magnum', 'dns', 'ironic', 'aodh'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.334 12 DEBUG cotyledon.oslo_config_glue [-] notification.pipelines = ['meter', 'event'] log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.334 12 DEBUG cotyledon.oslo_config_glue [-] notification.workers = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.334 12 DEBUG cotyledon.oslo_config_glue [-] polling.batch_size = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.334 12 DEBUG cotyledon.oslo_config_glue [-] polling.cfg_file = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.335 12 DEBUG cotyledon.oslo_config_glue [-] polling.partitioning_group_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.335 12 DEBUG cotyledon.oslo_config_glue [-] polling.pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.335 12 DEBUG cotyledon.oslo_config_glue [-] polling.tenant_name_discovery = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.335 12 DEBUG cotyledon.oslo_config_glue [-] publisher.telemetry_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.335 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.event_topic = event log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 
09:42:47.335 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.metering_topic = metering log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.335 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.telemetry_driver = messagingv2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.335 12 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.access_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.335 12 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.secret_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.335 12 DEBUG cotyledon.oslo_config_glue [-] rgw_client.implicit_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.335 12 DEBUG cotyledon.oslo_config_glue [-] service_types.cinder = volumev3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.336 12 DEBUG cotyledon.oslo_config_glue [-] service_types.glance = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.336 12 DEBUG cotyledon.oslo_config_glue [-] service_types.neutron = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.336 12 DEBUG cotyledon.oslo_config_glue [-] service_types.nova = compute log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.336 12 DEBUG cotyledon.oslo_config_glue [-] service_types.radosgw = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.336 12 DEBUG cotyledon.oslo_config_glue [-] service_types.swift = object-store log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.336 12 DEBUG cotyledon.oslo_config_glue [-] vmware.api_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.336 12 DEBUG cotyledon.oslo_config_glue [-] vmware.ca_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.336 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_ip = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.336 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_password = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.336 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_port = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.337 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_username = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.337 12 DEBUG cotyledon.oslo_config_glue [-] vmware.insecure = False 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.337 12 DEBUG cotyledon.oslo_config_glue [-] vmware.task_poll_interval = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.337 12 DEBUG cotyledon.oslo_config_glue [-] vmware.wsdl_location = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.337 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.337 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_type = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.337 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_url = http://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.337 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.337 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.337 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost 
ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.337 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.default_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.337 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.338 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.338 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.338 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.338 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.interface = internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.338 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.338 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.password = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.338 12 DEBUG cotyledon.oslo_config_glue [-] 
service_credentials.project_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.338 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.338 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.338 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_name = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.338 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.339 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.339 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.system_scope = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.339 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.339 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.trust_id = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.339 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.339 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.339 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.339 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.username = ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.339 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_section = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.339 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.339 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.340 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.340 12 DEBUG 
cotyledon.oslo_config_glue [-] gnocchi.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.340 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.340 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.interface = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.340 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.340 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.340 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.340 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.340 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_section = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.340 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 
09:42:47.340 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.340 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.341 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.341 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.341 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.interface = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.341 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.341 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.341 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.341 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.341 12 
DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.341 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.341 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.341 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.341 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.342 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.342 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.342 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.342 12 DEBUG cotyledon.oslo_config_glue [-] 
oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.342 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.342 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.342 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.342 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.342 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.342 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.343 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.343 12 DEBUG cotyledon.oslo_config_glue [-] 
oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.343 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.343 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.343 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.343 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.343 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.343 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.343 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.343 12 DEBUG cotyledon.oslo_config_glue [-] 
oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.343 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.344 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.344 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.344 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.344 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.344 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_ca_file = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.344 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_cert_file = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.344 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.344 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_key_file = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.344 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_version = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.344 12 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.344 12 DEBUG cotyledon._service [-] Run service AgentManager(0) [12] wait_forever /usr/lib/python3.9/site-packages/cotyledon/_service.py:241 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.347 12 DEBUG ceilometer.agent [-] Config file: {'sources': [{'name': 'pollsters', 'interval': 120, 'meters': ['power.state', 'cpu', 'memory.usage', 'disk.*', 'network.*']}]} load_config /usr/lib/python3.9/site-packages/ceilometer/agent.py:64 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.351 12 DEBUG ceilometer.compute.virt.libvirt.utils [-] Connecting to libvirt: qemu:///system new_libvirt_connection /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/utils.py:93 Oct 14 05:42:47 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:47.875 12 DEBUG novaclient.v2.client [-] REQ: curl -g -i -X GET http://nova-internal.openstack.svc:8774/v2.1/flavors?is_public=None -H "Accept: application/json" -H "User-Agent: python-novaclient" -H "X-Auth-Token: {SHA256}eb7092bd4dce9c0d70fcefff2c37bdc7c7d42c9f94f1ec507bb3b362687057e9" -H 
"X-OpenStack-Nova-API-Version: 2.1" _http_log_request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:519 Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.104 12 DEBUG novaclient.v2.client [-] RESP: [200] Connection: Keep-Alive Content-Length: 327 Content-Type: application/json Date: Tue, 14 Oct 2025 09:42:47 GMT Keep-Alive: timeout=5, max=100 OpenStack-API-Version: compute 2.1 Server: Apache Vary: OpenStack-API-Version,X-OpenStack-Nova-API-Version X-OpenStack-Nova-API-Version: 2.1 x-compute-request-id: req-2d53a95e-839e-4cd1-836d-c9cf86f261b5 x-openstack-request-id: req-2d53a95e-839e-4cd1-836d-c9cf86f261b5 _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:550 Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.104 12 DEBUG novaclient.v2.client [-] RESP BODY: {"flavors": [{"id": "36e4c2a8-ca99-4c45-8719-dd5129265531", "name": "m1.small", "links": [{"rel": "self", "href": "http://nova-internal.openstack.svc:8774/v2.1/flavors/36e4c2a8-ca99-4c45-8719-dd5129265531"}, {"rel": "bookmark", "href": "http://nova-internal.openstack.svc:8774/flavors/36e4c2a8-ca99-4c45-8719-dd5129265531"}]}]} _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:582 Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.104 12 DEBUG novaclient.v2.client [-] GET call to compute for http://nova-internal.openstack.svc:8774/v2.1/flavors?is_public=None used request id req-2d53a95e-839e-4cd1-836d-c9cf86f261b5 request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:954 Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.106 12 DEBUG novaclient.v2.client [-] REQ: curl -g -i -X GET http://nova-internal.openstack.svc:8774/v2.1/flavors/36e4c2a8-ca99-4c45-8719-dd5129265531 -H "Accept: application/json" -H "User-Agent: python-novaclient" -H "X-Auth-Token: {SHA256}eb7092bd4dce9c0d70fcefff2c37bdc7c7d42c9f94f1ec507bb3b362687057e9" 
-H "X-OpenStack-Nova-API-Version: 2.1" _http_log_request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:519 Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.138 12 DEBUG novaclient.v2.client [-] RESP: [200] Connection: Keep-Alive Content-Length: 494 Content-Type: application/json Date: Tue, 14 Oct 2025 09:42:48 GMT Keep-Alive: timeout=5, max=99 OpenStack-API-Version: compute 2.1 Server: Apache Vary: OpenStack-API-Version,X-OpenStack-Nova-API-Version X-OpenStack-Nova-API-Version: 2.1 x-compute-request-id: req-2dc49e05-b31f-41ce-aa80-acd138ac1375 x-openstack-request-id: req-2dc49e05-b31f-41ce-aa80-acd138ac1375 _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:550 Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.139 12 DEBUG novaclient.v2.client [-] RESP BODY: {"flavor": {"id": "36e4c2a8-ca99-4c45-8719-dd5129265531", "name": "m1.small", "ram": 512, "disk": 1, "swap": "", "OS-FLV-EXT-DATA:ephemeral": 1, "OS-FLV-DISABLED:disabled": false, "vcpus": 1, "os-flavor-access:is_public": true, "rxtx_factor": 1.0, "links": [{"rel": "self", "href": "http://nova-internal.openstack.svc:8774/v2.1/flavors/36e4c2a8-ca99-4c45-8719-dd5129265531"}, {"rel": "bookmark", "href": "http://nova-internal.openstack.svc:8774/flavors/36e4c2a8-ca99-4c45-8719-dd5129265531"}]}} _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:582 Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.139 12 DEBUG novaclient.v2.client [-] GET call to compute for http://nova-internal.openstack.svc:8774/v2.1/flavors/36e4c2a8-ca99-4c45-8719-dd5129265531 used request id req-2dc49e05-b31f-41ce-aa80-acd138ac1375 request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:954 Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.140 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'name': 
'test', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005486733.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '41187b090f3d4818a32baa37ce8a3991', 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'hostId': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228 Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.140 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters Oct 14 05:42:48 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16768 DF PROTO=TCP SPT=59170 DPT=9101 SEQ=3289142554 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B0C511A0000000001030307) Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.159 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.160 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.165 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '5c3679f6-b54d-41ed-9451-35cf87953d5f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T09:42:48.140813', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '24c73fca-a8e2-11f0-b617-fa163e99780b', 'monotonic_time': 10984.333574252, 'message_signature': '2d36a44469ccf68124a6dcb04c75f942e742c072d96d0d9ba242e5719e5ac75a'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T09:42:48.140813', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 
'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '24c7517c-a8e2-11f0-b617-fa163e99780b', 'monotonic_time': 10984.333574252, 'message_signature': '5c8c5cf20f8e50cb77bab5e0861c39803cf304f238bfa273ba869f5021bcbb90'}]}, 'timestamp': '2025-10-14 09:42:48.160467', '_unique_id': 'ff25a267919b40ef811e0eb56eac74ed'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.165 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.165 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.165 12 ERROR oslo_messaging.notify.messaging yield Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.165 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.165 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.165 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.165 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.165 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.165 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.165 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.165 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.165 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.165 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.165 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.165 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.165 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 05:42:48 localhost 
ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.165 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.165 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.165 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.165 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.165 12 ERROR oslo_messaging.notify.messaging Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.165 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.165 12 ERROR oslo_messaging.notify.messaging Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.165 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.165 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.165 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.165 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 
09:42:48.165 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.165 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.165 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.165 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.165 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.165 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.165 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.165 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.165 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.165 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 05:42:48 localhost 
ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.165 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.165 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.165 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.165 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.165 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.165 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.165 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.165 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.165 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.165 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 
09:42:48.165 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.165 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.165 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.165 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.165 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.165 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.165 12 ERROR oslo_messaging.notify.messaging Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.167 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.170 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 88c4e366-b765-47a6-96bf-f7677f2ce67c / tap3ec9b060-f4 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136 Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.170 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 05:42:48 localhost 
ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.172 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bd3b8b40-7668-4a93-87aa-17df8000ce23', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T09:42:48.167911', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': '24c8f586-a8e2-11f0-b617-fa163e99780b', 'monotonic_time': 10984.360722674, 'message_signature': '90e32590d7c0bf69713c0be7046232b6b722de4947e12034e202d558dda9293c'}]}, 'timestamp': '2025-10-14 09:42:48.171248', '_unique_id': '88cff9fed8b9433eb16c8bccf89799e5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.172 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.172 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.172 12 ERROR oslo_messaging.notify.messaging yield
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.172 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.172 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.172 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.172 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.172 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.172 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.172 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.172 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.172 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.172 12 ERROR oslo_messaging.notify.messaging conn.connect()
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.172 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.172 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.172 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.172 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.172 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.172 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.172 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.172 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.172 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.172 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.172 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.172 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.172 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.172 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.172 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.172 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.172 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.172 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.172 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.172 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.172 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.172 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.172 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.172 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.172 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.172 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.172 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.172 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.172 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.172 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.172 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.172 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.172 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.172 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.172 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.172 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.172 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.172 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.172 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.172 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.172 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.172 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.172 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.outgoing.packets volume: 129 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.173 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '97b9335a-2ca1-4328-943b-fc0422a0967c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 129, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T09:42:48.172906', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': '24c9445a-a8e2-11f0-b617-fa163e99780b', 'monotonic_time': 10984.360722674, 'message_signature': '8468c9d0eeffd2321fb238822fccf19d380aaaae9e9e54b8649540f475278f2f'}]}, 'timestamp': '2025-10-14 09:42:48.173230', '_unique_id': 'e3663e37f9b94af48432ec7fab76e6a1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.173 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.173 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.173 12 ERROR oslo_messaging.notify.messaging yield
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.173 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.173 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.173 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.173 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.173 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.173 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.173 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.173 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.173 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.173 12 ERROR oslo_messaging.notify.messaging conn.connect()
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.173 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.173 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.173 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.173 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.173 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.173 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.173 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.173 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.173 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.173 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.173 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.173 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.173 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.173 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.173 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.173 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.173 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.173 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.173 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.173 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.173 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.173 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.173 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.173 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.173 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.173 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.173 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.173 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.173 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.173 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.173 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.173 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.173 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.173 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.173 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.173 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.173 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.173 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.173 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.173 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.173 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.174 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Oct 14 05:42:48 localhost python3.9[245462]: ansible-ansible.builtin.systemd Invoked with name=edpm_ceilometer_agent_compute.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.193 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.write.latency volume: 217051898 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.194 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.write.latency volume: 24279644 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.195 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '97357096-7659-4046-8780-270714947afe', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 217051898, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T09:42:48.174706', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '24cc703a-a8e2-11f0-b617-fa163e99780b', 'monotonic_time': 10984.367475811, 'message_signature': '55c544a9b6ce5c78093dbfb852097a8dbdc43c09e54bfed4d6fa28edb71cd638'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 24279644, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T09:42:48.174706', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '24cc7e4a-a8e2-11f0-b617-fa163e99780b', 'monotonic_time': 10984.367475811, 'message_signature': '6656e98c6c8e92f1b92c77f1ca3a9ac71632734bd9ccd521a0cbc5b086098e40'}]}, 'timestamp': '2025-10-14 09:42:48.194371', '_unique_id': 'b64888abbdc243c884078741d04bbddf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.195 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.195 12 ERROR oslo_messaging.notify.messaging yield
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.195 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.195 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.195 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.195 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.195 12 ERROR oslo_messaging.notify.messaging conn.connect()
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.195 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.195 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.195 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.195 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.195 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.195 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.195 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.195 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.195 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.195 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.195 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.195 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.195 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.195 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.195 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.195 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.195 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.195 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.195 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.195 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.195 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.195 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.195 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.195 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.196 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.196 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.197 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications.
Payload={'message_id': 'da0c9c62-e494-4d69-ba71-24c882852589', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T09:42:48.196394', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': '24ccda16-a8e2-11f0-b617-fa163e99780b', 'monotonic_time': 10984.360722674, 'message_signature': 'fb17c601728b3fd9f40bc6a838e11a1d5da0838a2e3e35a4cb6e12f19979d900'}]}, 'timestamp': '2025-10-14 09:42:48.196749', '_unique_id': 'bd379eb8c6df4789acddb098ceda1363'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.197 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 
2025-10-14 09:42:48.197 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.197 12 ERROR oslo_messaging.notify.messaging yield Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.197 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.197 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.197 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.197 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.197 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.197 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.197 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.197 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.197 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.197 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.197 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.197 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.197 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.197 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.197 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.197 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.197 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.197 12 ERROR oslo_messaging.notify.messaging Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.197 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.197 12 ERROR oslo_messaging.notify.messaging Oct 14 05:42:48 localhost 
ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.197 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.197 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.197 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.197 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.197 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.197 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.197 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.197 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.197 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.197 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.197 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.197 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.197 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.197 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.197 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.197 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.197 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.197 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.197 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.197 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 05:42:48 
localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.197 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.197 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.197 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.197 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.197 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.197 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.197 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.197 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.197 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.197 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.197 12 ERROR oslo_messaging.notify.messaging Oct 14 05:42:48 localhost 
ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.198 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.198 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.199 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd4c5a721-826b-427e-8a1f-52a992dab195', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T09:42:48.198219', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 
'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': '24cd2084-a8e2-11f0-b617-fa163e99780b', 'monotonic_time': 10984.360722674, 'message_signature': '44e4b766a1c5511eee9c3b95402148c8a8a55db8ce7163113aefa71849ca7ab4'}]}, 'timestamp': '2025-10-14 09:42:48.198541', '_unique_id': 'bfb1deb48b5945dcbbae1e7467c4e01a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.199 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.199 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.199 12 ERROR oslo_messaging.notify.messaging yield Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.199 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.199 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.199 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.199 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.199 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 
09:42:48.199 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.199 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.199 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.199 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.199 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.199 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.199 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.199 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.199 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.199 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.199 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 05:42:48 localhost 
ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.199 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.199 12 ERROR oslo_messaging.notify.messaging Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.199 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.199 12 ERROR oslo_messaging.notify.messaging Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.199 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.199 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.199 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.199 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.199 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.199 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.199 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 05:42:48 
localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.199 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.199 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.199 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.199 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.199 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.199 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.199 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.199 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.199 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.199 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) 
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.199 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.199 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.199 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.199 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.199 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.199 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.199 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.199 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.199 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.199 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.199 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.199 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.199 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.199 12 ERROR oslo_messaging.notify.messaging Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.199 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.200 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.read.bytes volume: 29130240 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.200 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.read.bytes volume: 4300800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.201 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'b1d53473-9ba3-40d1-9527-1179350230c0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 29130240, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T09:42:48.200029', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '24cd67ba-a8e2-11f0-b617-fa163e99780b', 'monotonic_time': 10984.367475811, 'message_signature': '1c2641831034593d936c22eb0bee51e4fe1edc0873013d162319d884c85d799b'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 4300800, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T09:42:48.200029', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '24cd735e-a8e2-11f0-b617-fa163e99780b', 'monotonic_time': 10984.367475811, 'message_signature': 'e83f689c9d6457ef7980bb0cf93c3c0253ce34fe58170ddd713d00d7d835121b'}]}, 'timestamp': '2025-10-14 09:42:48.200632', '_unique_id': 'ce1132baebeb4fa0852cf677a78a6bf3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.201 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.201 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.201 12 ERROR oslo_messaging.notify.messaging yield Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.201 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.201 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.201 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.201 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.201 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.201 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.201 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.201 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.201 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.201 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.201 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.201 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.201 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 
05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.201 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.201 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.201 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.201 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.201 12 ERROR oslo_messaging.notify.messaging Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.201 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.201 12 ERROR oslo_messaging.notify.messaging Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.201 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.201 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.201 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.201 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 05:42:48 localhost 
ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.201 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.201 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.201 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.201 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.201 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.201 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.201 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.201 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.201 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.201 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.201 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.201 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.201 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.201 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.201 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.201 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.201 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.201 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.201 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.201 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:42:48 localhost 
ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.201 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.201 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.201 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.201 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.201 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.201 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.201 12 ERROR oslo_messaging.notify.messaging Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.202 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.202 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.203 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '34006ecd-027e-4c4f-9660-06a65da58957', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T09:42:48.202137', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': '24cdbabc-a8e2-11f0-b617-fa163e99780b', 'monotonic_time': 10984.360722674, 'message_signature': 'edfcb709b943b5dbd91a17c1716560b69d37b1ce5ff83ce1637d848099945d51'}]}, 'timestamp': '2025-10-14 09:42:48.202473', '_unique_id': 'a085d0d43af14a3d846770f3139aaa1d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.203 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 
2025-10-14 09:42:48.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.203 12 ERROR oslo_messaging.notify.messaging yield Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.203 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.203 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.203 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.203 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.203 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.203 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.203 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.203 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.203 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.203 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.203 12 ERROR oslo_messaging.notify.messaging Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.203 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.203 12 ERROR oslo_messaging.notify.messaging Oct 14 05:42:48 localhost 
ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.203 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.203 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.203 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.203 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.203 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.203 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.203 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.203 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.203 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.203 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 05:42:48 
localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.203 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.203 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.203 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.203 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.203 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.203 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.203 12 ERROR oslo_messaging.notify.messaging Oct 14 05:42:48 localhost 
ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.203 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.203 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.write.bytes volume: 73895936 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.204 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.205 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '826b6f3c-2049-4d20-b4da-72bd388de5b4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 73895936, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T09:42:48.203914', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '24ce04cc-a8e2-11f0-b617-fa163e99780b', 'monotonic_time': 10984.367475811, 'message_signature': 'e3fb69251feb2da1f7ab06a11f847b26864659a54ba3a79175ca775769a09833'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T09:42:48.203914', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '24ce0fda-a8e2-11f0-b617-fa163e99780b', 'monotonic_time': 10984.367475811, 'message_signature': '96d8dbd53d884de6d0290476de83160dd24081788d057a0011175c8f31abccf0'}]}, 'timestamp': '2025-10-14 09:42:48.204627', '_unique_id': '8f9bfc73b12d4768b6216f7e8378bc3d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.205 12 ERROR 
oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.205 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.205 12 ERROR oslo_messaging.notify.messaging yield Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.205 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.205 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.205 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.205 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.205 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.205 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.205 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.205 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 05:42:48 localhost 
ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.205 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.205 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.205 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.205 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.205 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.205 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.205 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.205 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.205 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.205 12 ERROR oslo_messaging.notify.messaging Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.205 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 
2025-10-14 09:42:48.205 12 ERROR oslo_messaging.notify.messaging Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.205 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.205 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.205 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.205 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.205 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.205 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.205 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.205 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.205 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.205 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.205 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.205 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.205 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.205 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.205 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.205 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.205 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.205 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.205 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.205 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.205 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.205 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.205 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.205 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.205 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.205 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.205 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.205 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.205 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.205 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:42:48 localhost 
ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.205 12 ERROR oslo_messaging.notify.messaging Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.205 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters Oct 14 05:42:48 localhost systemd[1]: Stopping ceilometer_agent_compute container... Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.229 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/memory.usage volume: 52.32421875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.231 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2f908c12-1aca-4704-9bb2-b2e5f40e2dbc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 52.32421875, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'timestamp': '2025-10-14T09:42:48.205968', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 
'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '24d1f046-a8e2-11f0-b617-fa163e99780b', 'monotonic_time': 10984.421876326, 'message_signature': '23dda7dfa30553924db69d1b2fad227cdaab0f1957e84c9f0250deb07c9f63cb'}]}, 'timestamp': '2025-10-14 09:42:48.230103', '_unique_id': 'cd9d9bc6e45f4344a2681de7e033663e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.231 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.231 12 ERROR oslo_messaging.notify.messaging yield Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.231 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.231 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 
2025-10-14 09:42:48.231 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.231 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.231 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.231 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.231 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.231 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 05:42:48 localhost 
ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.231 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.231 12 ERROR oslo_messaging.notify.messaging Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.231 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.231 12 ERROR oslo_messaging.notify.messaging Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.231 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.231 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.231 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.231 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 05:42:48 
localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.231 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.231 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.231 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.231 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.231 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) 
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.231 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.231 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.231 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.231 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.231 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.231 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.231 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.231 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.231 12 ERROR oslo_messaging.notify.messaging Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.232 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.232 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163 Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.232 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [] Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.232 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.233 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.read.requests volume: 1064 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.233 12 DEBUG ceilometer.compute.pollsters [-] 
88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.read.requests volume: 222 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.234 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '798d6806-fe8f-4009-a0c0-448e9b35d5b3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1064, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T09:42:48.233010', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '24d2700c-a8e2-11f0-b617-fa163e99780b', 'monotonic_time': 10984.367475811, 'message_signature': '93f71048818c4c15128cf6ec80a30ce37a5a5589e4895448904b833251df8b96'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 222, 
'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T09:42:48.233010', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '24d27d54-a8e2-11f0-b617-fa163e99780b', 'monotonic_time': 10984.367475811, 'message_signature': 'e7d21de78b5c41054a0e92b491877f980d4efc7e8dd220f1a78369d50bb20b85'}]}, 'timestamp': '2025-10-14 09:42:48.233654', '_unique_id': 'ca1d0edf04134f5db069ae610a954ac5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.234 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.234 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.234 12 ERROR oslo_messaging.notify.messaging yield Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.234 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.234 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.234 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.234 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.234 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.234 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.234 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.234 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.234 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.234 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.234 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 05:42:48 localhost 
ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.234 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.234 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.234 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.234 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.234 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.234 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.234 12 ERROR oslo_messaging.notify.messaging Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.234 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.234 12 ERROR oslo_messaging.notify.messaging Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.234 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.234 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.234 12 ERROR oslo_messaging.notify.messaging 
self.transport._send_notification(target, ctxt, message, Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.234 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.234 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.234 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.234 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.234 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.234 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.234 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.234 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.234 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 
09:42:48.234 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.234 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.234 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.234 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.234 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.234 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.234 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.234 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.234 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.234 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.234 12 ERROR 
oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.234 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.234 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.234 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.234 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.234 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.234 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.234 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.234 12 ERROR oslo_messaging.notify.messaging Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.235 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.235 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163 Oct 14 05:42:48 
localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.235 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [] Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.235 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.235 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.236 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1f961ebe-6ce4-4dde-9c6c-f149646cceb2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T09:42:48.235821', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 
'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': '24d2de34-a8e2-11f0-b617-fa163e99780b', 'monotonic_time': 10984.360722674, 'message_signature': '614b3e5a81ade86ec0927cb7eeacdcf4fba984f256d4ad935164cf2ab47feb06'}]}, 'timestamp': '2025-10-14 09:42:48.236144', '_unique_id': '869485501ca848f8b6973782f5451790'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.236 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.236 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.236 12 ERROR oslo_messaging.notify.messaging yield Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.236 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.236 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.236 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.236 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 05:42:48 
localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.236 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.236 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.236 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.236 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.236 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.236 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.236 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.236 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.236 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.236 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.236 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.236 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.236 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.236 12 ERROR oslo_messaging.notify.messaging Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.236 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.236 12 ERROR oslo_messaging.notify.messaging Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.236 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.236 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.236 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.236 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.236 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.236 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.236 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.236 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.236 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.236 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.236 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.236 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.236 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.236 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.236 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.236 12 ERROR oslo_messaging.notify.messaging 
File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.236 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.236 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.236 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.236 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.236 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.236 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.236 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.236 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.236 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.236 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in 
__exit__ Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.236 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.236 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.236 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.236 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.236 12 ERROR oslo_messaging.notify.messaging Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.237 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.237 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.read.latency volume: 1400578039 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.237 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.read.latency volume: 174873109 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.238 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '0744e64b-6260-4e55-b8bc-6fd58191836a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1400578039, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T09:42:48.237651', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '24d3260a-a8e2-11f0-b617-fa163e99780b', 'monotonic_time': 10984.367475811, 'message_signature': '004490101740acbf8e0c57b2a15852119d1e2481f38f5caa2e2a473ce9caf8d1'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 174873109, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T09:42:48.237651', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '24d33168-a8e2-11f0-b617-fa163e99780b', 'monotonic_time': 10984.367475811, 'message_signature': '2a3782d1ceed87bad6b99711e9c915ffd3efedf51d8a3649172778653f86dd13'}]}, 'timestamp': '2025-10-14 09:42:48.238264', '_unique_id': 'f02b32f3bb7d48a0a1595fc8e4e95b57'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.238 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.238 12 ERROR oslo_messaging.notify.messaging yield Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.238 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.238 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.238 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.238 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.238 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.238 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.238 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 
05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.238 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.238 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.238 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.238 12 ERROR oslo_messaging.notify.messaging Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.238 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.238 12 ERROR oslo_messaging.notify.messaging Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.238 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.238 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 05:42:48 localhost 
ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.238 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.238 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.238 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.238 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.238 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.238 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.238 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.238 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.238 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.238 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:42:48 localhost 
ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.238 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.238 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.238 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.238 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.238 12 ERROR oslo_messaging.notify.messaging Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.239 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.239 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.outgoing.bytes volume: 11272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.240 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'c2a7d98a-3658-4fcf-9031-a0d7bdf459b0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 11272, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T09:42:48.239824', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': '24d37a24-a8e2-11f0-b617-fa163e99780b', 'monotonic_time': 10984.360722674, 'message_signature': 'b1e8b87312c02b0a47de88b407f6c4e3d724e08363afee5f4f7806878ea26ea2'}]}, 'timestamp': '2025-10-14 09:42:48.240139', '_unique_id': '1e6d8051126f47d08eacccd54d69cd2c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.240 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:42:48 localhost 
ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.240 12 ERROR oslo_messaging.notify.messaging yield Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.240 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.240 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.240 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.240 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.240 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.240 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.240 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.240 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.240 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.240 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.240 12 ERROR oslo_messaging.notify.messaging Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.240 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.240 12 ERROR oslo_messaging.notify.messaging Oct 14 05:42:48 localhost 
ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.240 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.240 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.240 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.240 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.240 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.240 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.240 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.240 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.240 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.240 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.240 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.240 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.240 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.240 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.240 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.240 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.240 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.241 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.241 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.incoming.bytes volume: 8696 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.242 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '015d31b7-3d9a-4d69-aaa4-a7d6a6a309af', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 8696, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T09:42:48.241889', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': '24d3cace-a8e2-11f0-b617-fa163e99780b', 'monotonic_time': 10984.360722674, 'message_signature': 'ceccb4f2c1f92104a90788a9f129379a2583cdd448814e2c040dc0a373b280be'}]}, 'timestamp': '2025-10-14 09:42:48.242202', '_unique_id': '623f075a5d61455e919daae04b39fa08'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.242 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.242 12 ERROR oslo_messaging.notify.messaging yield
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.242 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.242 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.242 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.242 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.242 12 ERROR oslo_messaging.notify.messaging conn.connect()
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.242 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.242 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.242 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.242 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.242 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.242 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.242 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.242 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.242 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.242 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.242 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.242 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.242 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.242 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.242 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.242 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.242 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.242 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.242 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.242 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.242 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.242 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.242 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.242 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.243 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.243 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.244 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '32491d4b-de8b-4b26-bdb7-fc38853d5aed', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T09:42:48.243718', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': '24d412f4-a8e2-11f0-b617-fa163e99780b', 'monotonic_time': 10984.360722674, 'message_signature': '7cb591721c677f4a4d575636f4bfcf4ef64a9587ae365dfb88b3cf594c0fc657'}]}, 'timestamp': '2025-10-14 09:42:48.244050', '_unique_id': 'b82d0699c2904189951c6bbced037f5a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.244 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.244 12 ERROR oslo_messaging.notify.messaging yield
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.244 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.244 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.244 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.244 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.244 12 ERROR oslo_messaging.notify.messaging conn.connect()
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.244 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.244 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.244 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.244 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.244 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.244 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.244 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.244 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.244 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.244 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.244 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.244 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.244 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.244 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.244 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.244 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.244 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.244 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.244 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.244 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.244 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.244 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.244 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.244 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.245 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.245 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.incoming.packets volume: 81 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.246 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd4c7c80c-5d2b-4b6f-9fe4-53b224f72b71', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 81, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T09:42:48.245497', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': '24d457aa-a8e2-11f0-b617-fa163e99780b', 'monotonic_time': 10984.360722674, 'message_signature': 'e94a86e289bc6091d84b7e495e0681d00c96a05abebc6a48685a08aa470c026b'}]}, 'timestamp': '2025-10-14 09:42:48.245873', '_unique_id': 'c74fb5ce0d114cddbf2b31e48af74f01'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.246 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.246 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.246 12 ERROR oslo_messaging.notify.messaging yield
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.246 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.246 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.246 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.246 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.246 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.246 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.246 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.246 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.246 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.246 12 ERROR oslo_messaging.notify.messaging conn.connect()
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.246 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.246 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.246 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.246 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.246 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.246 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.246 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.246 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.246 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.246 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.246 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.246 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.246 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.246 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.246 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.246 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.246 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.246 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.246 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.246 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.246 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.246 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.246 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.246 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.246 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.246 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.246 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14
05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.246 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.246 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.246 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.246 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.246 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.246 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.246 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.246 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.246 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.246 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.246 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.246 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.246 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.246 12 ERROR oslo_messaging.notify.messaging Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.247 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.247 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.247 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.allocation volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.248 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '46c1784e-d225-49c9-b673-2a056d4a661d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T09:42:48.247364', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '24d4a0c0-a8e2-11f0-b617-fa163e99780b', 'monotonic_time': 10984.333574252, 'message_signature': '9c4595df1a1ccda8195ef77a7c6dd9b67f2bc4c8541b5ca5551e0e1522c4b096'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T09:42:48.247364', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 
'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '24d4aef8-a8e2-11f0-b617-fa163e99780b', 'monotonic_time': 10984.333574252, 'message_signature': '8a9849cadca3fc5eda300142c0d71d6cb31a210d12e437c64b59396016a1a180'}]}, 'timestamp': '2025-10-14 09:42:48.248028', '_unique_id': '4ca5a668fd924358871f7fa57012c8e5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.248 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.248 12 ERROR oslo_messaging.notify.messaging yield Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.248 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.248 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.248 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.248 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.248 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.248 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.248 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 05:42:48 localhost 
ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.248 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.248 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.248 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.248 12 ERROR oslo_messaging.notify.messaging Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.248 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.248 12 ERROR oslo_messaging.notify.messaging Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.248 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.248 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 
09:42:48.248 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.248 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.248 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.248 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.248 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 05:42:48 localhost 
ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.248 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.248 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.248 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.248 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.248 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 
09:42:48.248 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.248 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.248 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.248 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.248 12 ERROR oslo_messaging.notify.messaging Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.249 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.249 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.249 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 
09:42:48.250 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7ec39c1f-d407-4a2b-8776-e7c1f2fc5970', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T09:42:48.249496', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '24d4f3f4-a8e2-11f0-b617-fa163e99780b', 'monotonic_time': 10984.333574252, 'message_signature': 'ffda4062f45b2adcc2072de3272bdf4a9a882dcba2b2e7143728c7c2885f21fd'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T09:42:48.249496', 'resource_metadata': 
{'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '24d5022c-a8e2-11f0-b617-fa163e99780b', 'monotonic_time': 10984.333574252, 'message_signature': 'f20c9c976ca4a99d019f2ba8a47c87f2ae7426ffab3816c7a8ab4c5cb5d27b02'}]}, 'timestamp': '2025-10-14 09:42:48.250161', '_unique_id': 'bc7656ec48ee44edb0321d3d06b53be4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.250 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.250 12 ERROR oslo_messaging.notify.messaging yield Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.250 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:42:48 localhost 
ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.250 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.250 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.250 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.250 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.250 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.250 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.250 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.250 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.250 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.250 12 ERROR oslo_messaging.notify.messaging Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.250 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.250 12 ERROR oslo_messaging.notify.messaging Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.250 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.250 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 
134, in _send_notification Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.250 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.250 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.250 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.250 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.250 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.250 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.250 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.250 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.250 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.250 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.250 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", 
line 433, in _ensure_connection Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.250 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.250 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.250 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.250 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.250 12 ERROR oslo_messaging.notify.messaging Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.251 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.251 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/cpu volume: 59040000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.252 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'a6624553-9727-40a4-98e9-6aa3e81752e6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 59040000000, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'timestamp': '2025-10-14T09:42:48.251658', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '24d54930-a8e2-11f0-b617-fa163e99780b', 'monotonic_time': 10984.421876326, 'message_signature': 'eae926a663a35493bc8fd084c0d0c9bab18c80aa10bce31549f3613e46e9413c'}]}, 'timestamp': '2025-10-14 09:42:48.252029', '_unique_id': 'd69019772a984980b877ab8e48e8a74f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.252 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in 
_reraise_as_library_errors Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.252 12 ERROR oslo_messaging.notify.messaging yield Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.252 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.252 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.252 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.252 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.252 12 
ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.252 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.252 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.252 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.252 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.252 12 ERROR oslo_messaging.notify.messaging Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.252 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.252 12 ERROR oslo_messaging.notify.messaging Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.252 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 
2025-10-14 09:42:48.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.252 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.252 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.252 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.252 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.252 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 
05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.252 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.252 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.252 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.252 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.252 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 05:42:48 localhost 
ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.252 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.252 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.252 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.252 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.252 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.252 12 ERROR oslo_messaging.notify.messaging Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.253 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters Oct 14 05:42:48 
localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.253 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163 Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.253 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [] Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.253 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.253 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163 Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.254 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [] Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.254 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.254 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.write.requests volume: 525 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.254 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.write.requests volume: 1 _stats_to_sample 
/usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.255 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b68c8404-91f3-49e9-bc36-874c32534c6b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 525, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T09:42:48.254267', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '24d5ae66-a8e2-11f0-b617-fa163e99780b', 'monotonic_time': 10984.367475811, 'message_signature': 'c007fe8e02d172a47826ced331ecfaf1e63723656867c4da33dc8845a3e2895a'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': 
'41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T09:42:48.254267', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '24d5bf00-a8e2-11f0-b617-fa163e99780b', 'monotonic_time': 10984.367475811, 'message_signature': '7b80e51d85ac013192dc5d83bb035c87086e7f61dc3923935b27350a6314d80f'}]}, 'timestamp': '2025-10-14 09:42:48.255068', '_unique_id': '39da69bc80214de4beae5f2e15f30036'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.255 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.255 12 ERROR oslo_messaging.notify.messaging yield Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:42:48 
localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.255 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.255 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.255 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.255 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.255 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.255 12 ERROR oslo_messaging.notify.messaging 
self.transport.connect() Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.255 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.255 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.255 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.255 12 ERROR oslo_messaging.notify.messaging Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.255 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.255 12 ERROR oslo_messaging.notify.messaging Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.255 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.255 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 05:42:48 localhost 
ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.255 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.255 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.255 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.255 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.255 12 ERROR oslo_messaging.notify.messaging self.connection = 
connection_pool.get(retry=retry) Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.255 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.255 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.255 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.255 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.255 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 05:42:48 localhost 
ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.255 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.255 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.255 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.255 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.255 12 ERROR oslo_messaging.notify.messaging Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.281 2 INFO cotyledon._service_manager [-] Caught SIGTERM signal, graceful exiting of master process Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.382 2 DEBUG cotyledon._service_manager [-] Killing services with signal SIGTERM _shutdown /usr/lib/python3.9/site-packages/cotyledon/_service_manager.py:304 Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.382 2 DEBUG cotyledon._service_manager [-] Waiting services to terminate _shutdown 
/usr/lib/python3.9/site-packages/cotyledon/_service_manager.py:308 Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.382 12 INFO cotyledon._service [-] Caught SIGTERM signal, graceful exiting of service AgentManager(0) [12] Oct 14 05:42:48 localhost ceilometer_agent_compute[245315]: 2025-10-14 09:42:48.390 2 DEBUG cotyledon._service_manager [-] Shutdown finish _shutdown /usr/lib/python3.9/site-packages/cotyledon/_service_manager.py:320 Oct 14 05:42:48 localhost journal[206742]: End of file while reading data: Input/output error Oct 14 05:42:48 localhost journal[206742]: End of file while reading data: Input/output error Oct 14 05:42:48 localhost systemd[1]: libpod-89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d.scope: Deactivated successfully. Oct 14 05:42:48 localhost systemd[1]: libpod-89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d.scope: Consumed 1.283s CPU time. Oct 14 05:42:48 localhost podman[245466]: 2025-10-14 09:42:48.539170514 +0000 UTC m=+0.301353833 container died 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': 
['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible) Oct 14 05:42:48 localhost systemd[1]: 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d.timer: Deactivated successfully. Oct 14 05:42:48 localhost systemd[1]: Stopped /usr/bin/podman healthcheck run 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d. Oct 14 05:42:48 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d-userdata-shm.mount: Deactivated successfully. Oct 14 05:42:48 localhost systemd[1]: var-lib-containers-storage-overlay-6c68ab3764f47ae337b1b49938a6a3338e47a4e5978133565cf0aefb9a2ae56a-merged.mount: Deactivated successfully. 
Oct 14 05:42:48 localhost podman[245466]: 2025-10-14 09:42:48.590924642 +0000 UTC m=+0.353107911 container cleanup 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251009, tcib_managed=true, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, managed_by=edpm_ansible) Oct 14 05:42:48 localhost podman[245466]: ceilometer_agent_compute Oct 14 05:42:48 localhost podman[245492]: 2025-10-14 09:42:48.672633513 +0000 UTC m=+0.055613242 
container cleanup 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS) Oct 14 05:42:48 localhost podman[245492]: ceilometer_agent_compute Oct 14 05:42:48 localhost systemd[1]: edpm_ceilometer_agent_compute.service: Deactivated successfully. Oct 14 05:42:48 localhost systemd[1]: Stopped ceilometer_agent_compute container. 
Oct 14 05:42:48 localhost systemd[1]: Starting ceilometer_agent_compute container... Oct 14 05:42:48 localhost systemd[1]: Started libcrun container. Oct 14 05:42:48 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6c68ab3764f47ae337b1b49938a6a3338e47a4e5978133565cf0aefb9a2ae56a/merged/var/lib/openstack/config supports timestamps until 2038 (0x7fffffff) Oct 14 05:42:48 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6c68ab3764f47ae337b1b49938a6a3338e47a4e5978133565cf0aefb9a2ae56a/merged/var/lib/kolla/config_files/config.json supports timestamps until 2038 (0x7fffffff) Oct 14 05:42:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d. Oct 14 05:42:48 localhost podman[245503]: 2025-10-14 09:42:48.790885259 +0000 UTC m=+0.092388293 container init 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', 
'/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS, config_id=edpm, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image) Oct 14 05:42:48 localhost ceilometer_agent_compute[245517]: + sudo -E kolla_set_configs Oct 14 05:42:48 localhost ceilometer_agent_compute[245517]: sudo: unable to send audit message: Operation not permitted Oct 14 05:42:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d. 
Oct 14 05:42:48 localhost podman[245503]: 2025-10-14 09:42:48.80976766 +0000 UTC m=+0.111270704 container start 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2) Oct 14 05:42:48 localhost podman[245503]: ceilometer_agent_compute Oct 14 05:42:48 localhost systemd[1]: Started ceilometer_agent_compute container. 
Oct 14 05:42:48 localhost ceilometer_agent_compute[245517]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json Oct 14 05:42:48 localhost ceilometer_agent_compute[245517]: INFO:__main__:Validating config file Oct 14 05:42:48 localhost ceilometer_agent_compute[245517]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS Oct 14 05:42:48 localhost ceilometer_agent_compute[245517]: INFO:__main__:Copying service configuration files Oct 14 05:42:48 localhost ceilometer_agent_compute[245517]: INFO:__main__:Deleting /etc/ceilometer/ceilometer.conf Oct 14 05:42:48 localhost ceilometer_agent_compute[245517]: INFO:__main__:Copying /var/lib/openstack/config/ceilometer.conf to /etc/ceilometer/ceilometer.conf Oct 14 05:42:48 localhost ceilometer_agent_compute[245517]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf Oct 14 05:42:48 localhost ceilometer_agent_compute[245517]: INFO:__main__:Deleting /etc/ceilometer/polling.yaml Oct 14 05:42:48 localhost ceilometer_agent_compute[245517]: INFO:__main__:Copying /var/lib/openstack/config/polling.yaml to /etc/ceilometer/polling.yaml Oct 14 05:42:48 localhost ceilometer_agent_compute[245517]: INFO:__main__:Setting permission for /etc/ceilometer/polling.yaml Oct 14 05:42:48 localhost ceilometer_agent_compute[245517]: INFO:__main__:Deleting /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf Oct 14 05:42:48 localhost ceilometer_agent_compute[245517]: INFO:__main__:Copying /var/lib/openstack/config/custom.conf to /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf Oct 14 05:42:48 localhost ceilometer_agent_compute[245517]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf Oct 14 05:42:48 localhost ceilometer_agent_compute[245517]: INFO:__main__:Deleting /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf Oct 14 05:42:48 localhost ceilometer_agent_compute[245517]: INFO:__main__:Copying 
/var/lib/openstack/config/ceilometer-host-specific.conf to /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf Oct 14 05:42:48 localhost ceilometer_agent_compute[245517]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf Oct 14 05:42:48 localhost ceilometer_agent_compute[245517]: INFO:__main__:Writing out command to execute Oct 14 05:42:48 localhost ceilometer_agent_compute[245517]: ++ cat /run_command Oct 14 05:42:48 localhost ceilometer_agent_compute[245517]: + CMD='/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout' Oct 14 05:42:48 localhost ceilometer_agent_compute[245517]: + ARGS= Oct 14 05:42:48 localhost ceilometer_agent_compute[245517]: + sudo kolla_copy_cacerts Oct 14 05:42:48 localhost ceilometer_agent_compute[245517]: sudo: unable to send audit message: Operation not permitted Oct 14 05:42:48 localhost podman[245525]: 2025-10-14 09:42:48.869740328 +0000 UTC m=+0.055950902 container health_status 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=starting, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', 
'/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009) Oct 14 05:42:48 localhost ceilometer_agent_compute[245517]: Running command: '/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout' Oct 14 05:42:48 localhost ceilometer_agent_compute[245517]: + [[ ! -n '' ]] Oct 14 05:42:48 localhost ceilometer_agent_compute[245517]: + . 
kolla_extend_start Oct 14 05:42:48 localhost ceilometer_agent_compute[245517]: + echo 'Running command: '\''/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'\''' Oct 14 05:42:48 localhost ceilometer_agent_compute[245517]: + umask 0022 Oct 14 05:42:48 localhost ceilometer_agent_compute[245517]: + exec /usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout Oct 14 05:42:48 localhost podman[245525]: 2025-10-14 09:42:48.898818755 +0000 UTC m=+0.085029389 container exec_died 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251009, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=edpm, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team) Oct 14 05:42:48 localhost podman[245525]: unhealthy Oct 14 05:42:48 localhost systemd[1]: 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d.service: Main process exited, code=exited, status=1/FAILURE Oct 14 05:42:48 localhost systemd[1]: 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d.service: Failed with result 'exit-code'. Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.617 2 DEBUG cotyledon.oslo_config_glue [-] Full set of CONF: _load_service_manager_options /usr/lib/python3.9/site-packages/cotyledon/oslo_config_glue.py:40 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.617 2 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.617 2 DEBUG cotyledon.oslo_config_glue [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.617 2 DEBUG cotyledon.oslo_config_glue [-] command line args: ['--polling-namespaces', 'compute', '--logfile', '/dev/stdout'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.617 2 DEBUG cotyledon.oslo_config_glue [-] config files: ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.617 2 DEBUG 
cotyledon.oslo_config_glue [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.618 2 DEBUG cotyledon.oslo_config_glue [-] batch_size = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.618 2 DEBUG cotyledon.oslo_config_glue [-] cfg_file = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.618 2 DEBUG cotyledon.oslo_config_glue [-] config_dir = ['/etc/ceilometer/ceilometer.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.618 2 DEBUG cotyledon.oslo_config_glue [-] config_file = ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.618 2 DEBUG cotyledon.oslo_config_glue [-] config_source = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.618 2 DEBUG cotyledon.oslo_config_glue [-] debug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.618 2 DEBUG cotyledon.oslo_config_glue [-] default_log_levels = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 
'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'futurist=INFO', 'neutronclient=INFO', 'keystoneclient=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.618 2 DEBUG cotyledon.oslo_config_glue [-] event_pipeline_cfg_file = event_pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.618 2 DEBUG cotyledon.oslo_config_glue [-] graceful_shutdown_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.618 2 DEBUG cotyledon.oslo_config_glue [-] host = np0005486733.localdomain log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.618 2 DEBUG cotyledon.oslo_config_glue [-] http_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.618 2 DEBUG cotyledon.oslo_config_glue [-] hypervisor_inspector = libvirt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.619 2 DEBUG cotyledon.oslo_config_glue [-] instance_format = [instance: %(uuid)s] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.619 2 DEBUG cotyledon.oslo_config_glue [-] instance_uuid_format = [instance: %(uuid)s] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 
09:42:49.619 2 DEBUG cotyledon.oslo_config_glue [-] libvirt_type = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.619 2 DEBUG cotyledon.oslo_config_glue [-] libvirt_uri = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.619 2 DEBUG cotyledon.oslo_config_glue [-] log_config_append = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.619 2 DEBUG cotyledon.oslo_config_glue [-] log_date_format = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.619 2 DEBUG cotyledon.oslo_config_glue [-] log_dir = /var/log/ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.619 2 DEBUG cotyledon.oslo_config_glue [-] log_file = /dev/stdout log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.619 2 DEBUG cotyledon.oslo_config_glue [-] log_options = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.619 2 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.619 2 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval_type = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.619 2 DEBUG 
cotyledon.oslo_config_glue [-] log_rotation_type = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.619 2 DEBUG cotyledon.oslo_config_glue [-] logging_context_format_string = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.619 2 DEBUG cotyledon.oslo_config_glue [-] logging_debug_format_suffix = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.620 2 DEBUG cotyledon.oslo_config_glue [-] logging_default_format_string = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.620 2 DEBUG cotyledon.oslo_config_glue [-] logging_exception_prefix = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.620 2 DEBUG cotyledon.oslo_config_glue [-] logging_user_identity_format = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.620 2 DEBUG cotyledon.oslo_config_glue [-] max_logfile_count = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.620 2 DEBUG 
cotyledon.oslo_config_glue [-] max_logfile_size_mb = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.620 2 DEBUG cotyledon.oslo_config_glue [-] max_parallel_requests = 64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.620 2 DEBUG cotyledon.oslo_config_glue [-] partitioning_group_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.620 2 DEBUG cotyledon.oslo_config_glue [-] pipeline_cfg_file = pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.620 2 DEBUG cotyledon.oslo_config_glue [-] polling_namespaces = ['compute'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.620 2 DEBUG cotyledon.oslo_config_glue [-] pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.620 2 DEBUG cotyledon.oslo_config_glue [-] publish_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.620 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_burst = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.620 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_except_level = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Oct 14 05:42:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.620 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.621 2 DEBUG cotyledon.oslo_config_glue [-] reseller_prefix = AUTH_ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.621 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_keys = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.621 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_length = 256 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.621 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_namespace = ['metering.'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.621 2 DEBUG cotyledon.oslo_config_glue [-] rootwrap_config = /etc/ceilometer/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.621 2 DEBUG cotyledon.oslo_config_glue [-] sample_source = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.621 2 DEBUG cotyledon.oslo_config_glue [-] syslog_log_facility = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.621 2 DEBUG cotyledon.oslo_config_glue [-] tenant_name_discovery = False log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.621 2 DEBUG cotyledon.oslo_config_glue [-] use_eventlog = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.621 2 DEBUG cotyledon.oslo_config_glue [-] use_journal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.621 2 DEBUG cotyledon.oslo_config_glue [-] use_json = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.621 2 DEBUG cotyledon.oslo_config_glue [-] use_stderr = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.621 2 DEBUG cotyledon.oslo_config_glue [-] use_syslog = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.621 2 DEBUG cotyledon.oslo_config_glue [-] watch_log_file = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.622 2 DEBUG cotyledon.oslo_config_glue [-] compute.instance_discovery_method = libvirt_metadata log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.622 2 DEBUG cotyledon.oslo_config_glue [-] compute.resource_cache_expiry = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.622 2 DEBUG cotyledon.oslo_config_glue [-] compute.resource_update_interval = 0 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.622 2 DEBUG cotyledon.oslo_config_glue [-] coordination.backend_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.622 2 DEBUG cotyledon.oslo_config_glue [-] event.definitions_cfg_file = event_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.622 2 DEBUG cotyledon.oslo_config_glue [-] event.drop_unmatched_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.622 2 DEBUG cotyledon.oslo_config_glue [-] event.store_raw = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.622 2 DEBUG cotyledon.oslo_config_glue [-] ipmi.node_manager_init_retry = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.622 2 DEBUG cotyledon.oslo_config_glue [-] ipmi.polling_retry = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.622 2 DEBUG cotyledon.oslo_config_glue [-] meter.meter_definitions_dirs = ['/etc/ceilometer/meters.d', '/usr/lib/python3.9/site-packages/ceilometer/data/meters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.622 2 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_on_failure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.622 2 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_path = mon_pub_failures.txt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.622 2 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_section = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.623 2 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.623 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_count = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.623 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_max_retries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.623 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_mode = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.623 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_polling_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.623 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_timeout = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.623 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.623 2 DEBUG cotyledon.oslo_config_glue [-] monasca.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.623 2 DEBUG cotyledon.oslo_config_glue [-] monasca.client_max_retries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.623 2 DEBUG cotyledon.oslo_config_glue [-] monasca.client_retry_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.623 2 DEBUG cotyledon.oslo_config_glue [-] monasca.clientapi_version = 2_0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.623 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cloud_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.623 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cluster = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.624 2 DEBUG cotyledon.oslo_config_glue [-] monasca.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.624 2 DEBUG cotyledon.oslo_config_glue [-] monasca.control_plane = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.624 2 DEBUG cotyledon.oslo_config_glue [-] monasca.enable_api_pagination = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.624 2 DEBUG cotyledon.oslo_config_glue [-] monasca.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.624 2 DEBUG cotyledon.oslo_config_glue [-] monasca.interface = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.624 2 DEBUG cotyledon.oslo_config_glue [-] monasca.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.624 2 DEBUG cotyledon.oslo_config_glue [-] monasca.monasca_mappings = /etc/ceilometer/monasca_field_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.624 2 DEBUG cotyledon.oslo_config_glue [-] monasca.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.624 2 DEBUG cotyledon.oslo_config_glue [-] monasca.retry_on_failure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.624 2 DEBUG cotyledon.oslo_config_glue [-] monasca.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.624 2 DEBUG cotyledon.oslo_config_glue [-] monasca.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.624 2 DEBUG cotyledon.oslo_config_glue [-] notification.ack_on_event_error = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.624 2 DEBUG cotyledon.oslo_config_glue [-] notification.batch_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.625 2 DEBUG cotyledon.oslo_config_glue [-] notification.batch_timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.625 2 DEBUG cotyledon.oslo_config_glue [-] notification.messaging_urls = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.625 2 DEBUG cotyledon.oslo_config_glue [-] notification.notification_control_exchanges = ['nova', 'glance', 'neutron', 'cinder', 'heat', 'keystone', 'sahara', 'trove', 'zaqar', 'swift', 'ceilometer', 'magnum', 'dns', 'ironic', 'aodh'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.625 2 DEBUG cotyledon.oslo_config_glue [-] notification.pipelines = ['meter', 'event'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.625 2 DEBUG cotyledon.oslo_config_glue [-] notification.workers = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.625 2 DEBUG cotyledon.oslo_config_glue [-] polling.batch_size = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.625 2 DEBUG cotyledon.oslo_config_glue [-] polling.cfg_file = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.625 2 DEBUG cotyledon.oslo_config_glue [-] polling.partitioning_group_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.625 2 DEBUG cotyledon.oslo_config_glue [-] polling.pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.625 2 DEBUG cotyledon.oslo_config_glue [-] polling.tenant_name_discovery = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.625 2 DEBUG cotyledon.oslo_config_glue [-] publisher.telemetry_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.626 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.event_topic = event log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.626 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.metering_topic = metering log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.626 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.telemetry_driver = messagingv2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.626 2 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.access_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.626 2 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.secret_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.626 2 DEBUG cotyledon.oslo_config_glue [-] rgw_client.implicit_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.626 2 DEBUG cotyledon.oslo_config_glue [-] service_types.cinder = volumev3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.626 2 DEBUG cotyledon.oslo_config_glue [-] service_types.glance = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.626 2 DEBUG cotyledon.oslo_config_glue [-] service_types.neutron = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.626 2 DEBUG cotyledon.oslo_config_glue [-] service_types.nova = compute log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.626 2 DEBUG cotyledon.oslo_config_glue [-] service_types.radosgw = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.626 2 DEBUG cotyledon.oslo_config_glue [-] service_types.swift = object-store log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.626 2 DEBUG cotyledon.oslo_config_glue [-] vmware.api_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.627 2 DEBUG cotyledon.oslo_config_glue [-] vmware.ca_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.627 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_ip = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.627 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_password = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.627 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_port = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.627 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_username = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.627 2 DEBUG cotyledon.oslo_config_glue [-] vmware.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.627 2 DEBUG cotyledon.oslo_config_glue [-] vmware.task_poll_interval = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.627 2 DEBUG cotyledon.oslo_config_glue [-] vmware.wsdl_location = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.627 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.627 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_type = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.627 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.627 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.628 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.628 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.628 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.interface = internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.628 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.628 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.628 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.628 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.628 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_section = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.628 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.628 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.628 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.628 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.628 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.629 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.interface = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.629 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.629 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.629 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.629 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.629 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_section = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.629 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.629 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.629 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.629 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.629 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.629 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.interface = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.629 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.630 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.630 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.630 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.630 2 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.646 12 INFO ceilometer.polling.manager [-] Looking for dynamic pollsters configurations at [['/etc/ceilometer/pollsters.d']].
Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.646 12 INFO ceilometer.polling.manager [-] No dynamic pollsters found in folder [/etc/ceilometer/pollsters.d].
Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.647 12 INFO ceilometer.polling.manager [-] No dynamic pollsters file found in dirs [['/etc/ceilometer/pollsters.d']].
Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.656 12 DEBUG ceilometer.compute.virt.libvirt.utils [-] Connecting to libvirt: qemu:///system new_libvirt_connection /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/utils.py:93
Oct 14 05:42:49 localhost python3.9[245658]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/node_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 05:42:49 localhost nova_compute[238069]: 2025-10-14 09:42:49.766 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.777 12 DEBUG cotyledon.oslo_config_glue [-] Full set of CONF: _load_service_options /usr/lib/python3.9/site-packages/cotyledon/oslo_config_glue.py:48
Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.777 12 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.777 12 DEBUG cotyledon.oslo_config_glue [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.777 12 DEBUG cotyledon.oslo_config_glue [-] command line args: ['--polling-namespaces', 'compute', '--logfile', '/dev/stdout'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.778 12 DEBUG cotyledon.oslo_config_glue [-] config files: ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.778 12 DEBUG cotyledon.oslo_config_glue [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.778 12 DEBUG cotyledon.oslo_config_glue [-] batch_size = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.778 12 DEBUG cotyledon.oslo_config_glue [-] cfg_file = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.778 12 DEBUG cotyledon.oslo_config_glue [-] config_dir = ['/etc/ceilometer/ceilometer.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.778 12 DEBUG cotyledon.oslo_config_glue [-] config_file = ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.779 12 DEBUG cotyledon.oslo_config_glue [-] config_source = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.779 12 DEBUG cotyledon.oslo_config_glue [-] control_exchange = ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.779 12 DEBUG cotyledon.oslo_config_glue [-] debug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.779 12 DEBUG cotyledon.oslo_config_glue [-] default_log_levels = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'futurist=INFO', 'neutronclient=INFO', 'keystoneclient=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.779 12 DEBUG cotyledon.oslo_config_glue [-] event_pipeline_cfg_file = event_pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.779 12 DEBUG cotyledon.oslo_config_glue [-] graceful_shutdown_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.779 12 DEBUG cotyledon.oslo_config_glue [-] host = np0005486733.localdomain log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.779 12 DEBUG cotyledon.oslo_config_glue [-] http_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.780 12 DEBUG cotyledon.oslo_config_glue [-] hypervisor_inspector = libvirt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.780 12 DEBUG cotyledon.oslo_config_glue [-] instance_format = [instance: %(uuid)s] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.780 12 DEBUG cotyledon.oslo_config_glue [-] instance_uuid_format = [instance: %(uuid)s] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.780 12 DEBUG cotyledon.oslo_config_glue [-] libvirt_type = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.780 12 DEBUG cotyledon.oslo_config_glue [-] libvirt_uri = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.780 12 DEBUG cotyledon.oslo_config_glue [-] log_config_append = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.780 12 DEBUG cotyledon.oslo_config_glue [-] log_date_format = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.780 12 DEBUG cotyledon.oslo_config_glue [-] log_dir = /var/log/ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.780 12 DEBUG cotyledon.oslo_config_glue [-] log_file = /dev/stdout log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.780 12 DEBUG cotyledon.oslo_config_glue [-] log_options = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.780 12 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.780 12 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval_type = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.780 12 DEBUG cotyledon.oslo_config_glue [-] log_rotation_type = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.781 12 DEBUG cotyledon.oslo_config_glue [-] logging_context_format_string = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.781 12 DEBUG cotyledon.oslo_config_glue [-] logging_debug_format_suffix = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.781 12 DEBUG cotyledon.oslo_config_glue [-] logging_default_format_string = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.781 12 DEBUG cotyledon.oslo_config_glue [-] logging_exception_prefix = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.781 12 DEBUG cotyledon.oslo_config_glue [-] logging_user_identity_format = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.781 12 DEBUG cotyledon.oslo_config_glue [-] max_logfile_count = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.781 12 DEBUG cotyledon.oslo_config_glue [-] max_logfile_size_mb = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.781 12 DEBUG cotyledon.oslo_config_glue [-] max_parallel_requests = 64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.781 12 DEBUG cotyledon.oslo_config_glue [-] partitioning_group_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.781 12 DEBUG cotyledon.oslo_config_glue [-] pipeline_cfg_file = pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.781 12 DEBUG cotyledon.oslo_config_glue [-] polling_namespaces = ['compute'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.781 12 DEBUG cotyledon.oslo_config_glue [-] pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.781 12 DEBUG cotyledon.oslo_config_glue [-] publish_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.782 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_burst = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.782 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_except_level = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.782 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.782 12 DEBUG cotyledon.oslo_config_glue [-] reseller_prefix = AUTH_ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.782 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_keys = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.782 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_length = 256 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.782 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_namespace = ['metering.'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.782 12 DEBUG cotyledon.oslo_config_glue [-] rootwrap_config = /etc/ceilometer/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.782 12 DEBUG cotyledon.oslo_config_glue [-] sample_source = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.782 12 DEBUG cotyledon.oslo_config_glue [-] syslog_log_facility = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.782 12 DEBUG cotyledon.oslo_config_glue [-] tenant_name_discovery = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.782 12 DEBUG cotyledon.oslo_config_glue [-] transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.783 12 DEBUG cotyledon.oslo_config_glue [-] use_eventlog = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.783 12 DEBUG cotyledon.oslo_config_glue [-] use_journal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.783 12 DEBUG cotyledon.oslo_config_glue [-] use_json = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.783 12 DEBUG cotyledon.oslo_config_glue [-] use_stderr = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.783 12 DEBUG cotyledon.oslo_config_glue [-] use_syslog = False
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.783 12 DEBUG cotyledon.oslo_config_glue [-] watch_log_file = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.783 12 DEBUG cotyledon.oslo_config_glue [-] compute.instance_discovery_method = libvirt_metadata log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.783 12 DEBUG cotyledon.oslo_config_glue [-] compute.resource_cache_expiry = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.783 12 DEBUG cotyledon.oslo_config_glue [-] compute.resource_update_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.783 12 DEBUG cotyledon.oslo_config_glue [-] coordination.backend_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.783 12 DEBUG cotyledon.oslo_config_glue [-] event.definitions_cfg_file = event_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.783 12 DEBUG cotyledon.oslo_config_glue [-] event.drop_unmatched_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.784 12 DEBUG cotyledon.oslo_config_glue [-] event.store_raw = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 
2025-10-14 09:42:49.784 12 DEBUG cotyledon.oslo_config_glue [-] ipmi.node_manager_init_retry = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.784 12 DEBUG cotyledon.oslo_config_glue [-] ipmi.polling_retry = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.784 12 DEBUG cotyledon.oslo_config_glue [-] meter.meter_definitions_dirs = ['/etc/ceilometer/meters.d', '/usr/lib/python3.9/site-packages/ceilometer/data/meters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.784 12 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_on_failure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.784 12 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_path = mon_pub_failures.txt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.784 12 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_section = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.784 12 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.784 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_count = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.784 12 DEBUG cotyledon.oslo_config_glue [-] 
monasca.batch_max_retries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.784 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_mode = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.784 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_polling_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.785 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_timeout = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.785 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.785 12 DEBUG cotyledon.oslo_config_glue [-] monasca.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.785 12 DEBUG cotyledon.oslo_config_glue [-] monasca.client_max_retries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.785 12 DEBUG cotyledon.oslo_config_glue [-] monasca.client_retry_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.785 12 DEBUG cotyledon.oslo_config_glue [-] monasca.clientapi_version = 2_0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.785 12 DEBUG 
cotyledon.oslo_config_glue [-] monasca.cloud_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.785 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cluster = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.785 12 DEBUG cotyledon.oslo_config_glue [-] monasca.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.785 12 DEBUG cotyledon.oslo_config_glue [-] monasca.control_plane = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.785 12 DEBUG cotyledon.oslo_config_glue [-] monasca.enable_api_pagination = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.785 12 DEBUG cotyledon.oslo_config_glue [-] monasca.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.785 12 DEBUG cotyledon.oslo_config_glue [-] monasca.interface = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.786 12 DEBUG cotyledon.oslo_config_glue [-] monasca.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.786 12 DEBUG cotyledon.oslo_config_glue [-] monasca.monasca_mappings = /etc/ceilometer/monasca_field_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.786 12 DEBUG cotyledon.oslo_config_glue [-] monasca.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.786 12 DEBUG cotyledon.oslo_config_glue [-] monasca.retry_on_failure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.786 12 DEBUG cotyledon.oslo_config_glue [-] monasca.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.786 12 DEBUG cotyledon.oslo_config_glue [-] monasca.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.786 12 DEBUG cotyledon.oslo_config_glue [-] notification.ack_on_event_error = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.786 12 DEBUG cotyledon.oslo_config_glue [-] notification.batch_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.786 12 DEBUG cotyledon.oslo_config_glue [-] notification.batch_timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.786 12 DEBUG cotyledon.oslo_config_glue [-] notification.messaging_urls = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.786 12 DEBUG cotyledon.oslo_config_glue [-] notification.notification_control_exchanges = ['nova', 'glance', 'neutron', 
'cinder', 'heat', 'keystone', 'sahara', 'trove', 'zaqar', 'swift', 'ceilometer', 'magnum', 'dns', 'ironic', 'aodh'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.787 12 DEBUG cotyledon.oslo_config_glue [-] notification.pipelines = ['meter', 'event'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.787 12 DEBUG cotyledon.oslo_config_glue [-] notification.workers = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.787 12 DEBUG cotyledon.oslo_config_glue [-] polling.batch_size = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.787 12 DEBUG cotyledon.oslo_config_glue [-] polling.cfg_file = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.787 12 DEBUG cotyledon.oslo_config_glue [-] polling.partitioning_group_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.787 12 DEBUG cotyledon.oslo_config_glue [-] polling.pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.787 12 DEBUG cotyledon.oslo_config_glue [-] polling.tenant_name_discovery = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.787 12 DEBUG cotyledon.oslo_config_glue [-] publisher.telemetry_secret = **** 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.787 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.event_topic = event log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.787 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.metering_topic = metering log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.787 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.telemetry_driver = messagingv2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.788 12 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.access_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.788 12 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.secret_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.788 12 DEBUG cotyledon.oslo_config_glue [-] rgw_client.implicit_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.788 12 DEBUG cotyledon.oslo_config_glue [-] service_types.cinder = volumev3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.788 12 DEBUG cotyledon.oslo_config_glue [-] service_types.glance = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.788 12 DEBUG cotyledon.oslo_config_glue [-] service_types.neutron = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.788 12 DEBUG cotyledon.oslo_config_glue [-] service_types.nova = compute log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.788 12 DEBUG cotyledon.oslo_config_glue [-] service_types.radosgw = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.788 12 DEBUG cotyledon.oslo_config_glue [-] service_types.swift = object-store log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.788 12 DEBUG cotyledon.oslo_config_glue [-] vmware.api_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.788 12 DEBUG cotyledon.oslo_config_glue [-] vmware.ca_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.788 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_ip = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.789 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_password = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.789 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_port = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 
05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.789 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_username = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.789 12 DEBUG cotyledon.oslo_config_glue [-] vmware.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.789 12 DEBUG cotyledon.oslo_config_glue [-] vmware.task_poll_interval = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.789 12 DEBUG cotyledon.oslo_config_glue [-] vmware.wsdl_location = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.790 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.790 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_type = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.790 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_url = http://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.790 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.790 12 DEBUG cotyledon.oslo_config_glue [-] 
service_credentials.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.790 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.790 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.default_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.790 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.790 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.790 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.790 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.790 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.interface = internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.790 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.keyfile = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.791 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.password = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.791 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.791 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.791 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.791 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_name = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.791 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.791 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.792 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.system_scope = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.792 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.792 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.trust_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.792 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.792 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.792 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.792 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.username = ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.792 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_section = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.792 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.792 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.cafile = None 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.792 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.792 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.792 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.793 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.interface = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.793 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.793 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.793 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.793 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.793 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_section = 
service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.793 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.793 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.793 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.793 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.793 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.793 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.interface = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.793 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.794 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.794 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.split_loggers = 
False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.794 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.794 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.794 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.794 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.794 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.794 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.794 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.794 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.794 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.794 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.795 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.795 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.795 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.795 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.795 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.795 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.795 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.796 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.796 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.796 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.796 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.796 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.796 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.797 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.797 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.797 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.797 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.797 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.797 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.797 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.797 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.797 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_ca_file = log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.797 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_cert_file = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.797 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.797 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_key_file = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.798 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_version = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.798 12 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.798 12 DEBUG cotyledon._service [-] Run service AgentManager(0) [12] wait_forever /usr/lib/python3.9/site-packages/cotyledon/_service.py:241 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.801 12 DEBUG ceilometer.agent [-] Config file: {'sources': [{'name': 'pollsters', 'interval': 120, 'meters': ['power.state', 'cpu', 'memory.usage', 'disk.*', 'network.*']}]} load_config /usr/lib/python3.9/site-packages/ceilometer/agent.py:64 Oct 14 05:42:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:49.806 12 DEBUG ceilometer.compute.virt.libvirt.utils [-] Connecting to libvirt: qemu:///system 
new_libvirt_connection /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/utils.py:93 Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.262 12 DEBUG novaclient.v2.client [-] REQ: curl -g -i -X GET http://nova-internal.openstack.svc:8774/v2.1/flavors?is_public=None -H "Accept: application/json" -H "User-Agent: python-novaclient" -H "X-Auth-Token: {SHA256}27573e74c685270de8505c4e37234a781b8aee1a26dbe7c3805401d3a8925702" -H "X-OpenStack-Nova-API-Version: 2.1" _http_log_request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:519 Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.546 12 DEBUG novaclient.v2.client [-] RESP: [200] Connection: Keep-Alive Content-Length: 327 Content-Type: application/json Date: Tue, 14 Oct 2025 09:42:50 GMT Keep-Alive: timeout=5, max=100 OpenStack-API-Version: compute 2.1 Server: Apache Vary: OpenStack-API-Version,X-OpenStack-Nova-API-Version X-OpenStack-Nova-API-Version: 2.1 x-compute-request-id: req-6834a0a4-f07f-4cf3-a691-466a63d15d35 x-openstack-request-id: req-6834a0a4-f07f-4cf3-a691-466a63d15d35 _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:550 Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.547 12 DEBUG novaclient.v2.client [-] RESP BODY: {"flavors": [{"id": "36e4c2a8-ca99-4c45-8719-dd5129265531", "name": "m1.small", "links": [{"rel": "self", "href": "http://nova-internal.openstack.svc:8774/v2.1/flavors/36e4c2a8-ca99-4c45-8719-dd5129265531"}, {"rel": "bookmark", "href": "http://nova-internal.openstack.svc:8774/flavors/36e4c2a8-ca99-4c45-8719-dd5129265531"}]}]} _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:582 Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.547 12 DEBUG novaclient.v2.client [-] GET call to compute for http://nova-internal.openstack.svc:8774/v2.1/flavors?is_public=None used request id 
req-6834a0a4-f07f-4cf3-a691-466a63d15d35 request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:954 Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.549 12 DEBUG novaclient.v2.client [-] REQ: curl -g -i -X GET http://nova-internal.openstack.svc:8774/v2.1/flavors/36e4c2a8-ca99-4c45-8719-dd5129265531 -H "Accept: application/json" -H "User-Agent: python-novaclient" -H "X-Auth-Token: {SHA256}27573e74c685270de8505c4e37234a781b8aee1a26dbe7c3805401d3a8925702" -H "X-OpenStack-Nova-API-Version: 2.1" _http_log_request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:519 Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.577 12 DEBUG novaclient.v2.client [-] RESP: [200] Connection: Keep-Alive Content-Length: 494 Content-Type: application/json Date: Tue, 14 Oct 2025 09:42:50 GMT Keep-Alive: timeout=5, max=99 OpenStack-API-Version: compute 2.1 Server: Apache Vary: OpenStack-API-Version,X-OpenStack-Nova-API-Version X-OpenStack-Nova-API-Version: 2.1 x-compute-request-id: req-a14697b7-86c6-45d2-9cdc-0882672f18a8 x-openstack-request-id: req-a14697b7-86c6-45d2-9cdc-0882672f18a8 _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:550 Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.577 12 DEBUG novaclient.v2.client [-] RESP BODY: {"flavor": {"id": "36e4c2a8-ca99-4c45-8719-dd5129265531", "name": "m1.small", "ram": 512, "disk": 1, "swap": "", "OS-FLV-EXT-DATA:ephemeral": 1, "OS-FLV-DISABLED:disabled": false, "vcpus": 1, "os-flavor-access:is_public": true, "rxtx_factor": 1.0, "links": [{"rel": "self", "href": "http://nova-internal.openstack.svc:8774/v2.1/flavors/36e4c2a8-ca99-4c45-8719-dd5129265531"}, {"rel": "bookmark", "href": "http://nova-internal.openstack.svc:8774/flavors/36e4c2a8-ca99-4c45-8719-dd5129265531"}]}} _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:582 Oct 14 05:42:50 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.577 12 DEBUG novaclient.v2.client [-] GET call to compute for http://nova-internal.openstack.svc:8774/v2.1/flavors/36e4c2a8-ca99-4c45-8719-dd5129265531 used request id req-a14697b7-86c6-45d2-9cdc-0882672f18a8 request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:954 Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.579 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'name': 'test', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005486733.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '41187b090f3d4818a32baa37ce8a3991', 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'hostId': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228 Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.579 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.583 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 88c4e366-b765-47a6-96bf-f7677f2ce67c / tap3ec9b060-f4 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136 Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.584 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.incoming.bytes.delta volume: 0 _stats_to_sample 
/usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.594 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8b506f82-c29b-4e48-94ef-af6d8996ce16', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T09:42:50.580179', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': '26394bbe-a8e2-11f0-9707-fa163e99780b', 'monotonic_time': 10986.773070785, 'message_signature': '81e2e9cbdbc6b4e3e0d74fded752bad7e2de16cd17d6dc3f950a985eda11af00'}]}, 'timestamp': '2025-10-14 09:42:50.585399', '_unique_id': 'e670618bf8a94a5b8269929dff63ec5e'}: 
kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.594 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.594 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.594 12 ERROR oslo_messaging.notify.messaging yield Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.594 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.594 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.594 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.594 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.594 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.594 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.594 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 
2025-10-14 09:42:50.594 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.594 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.594 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.594 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.594 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.594 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.594 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.594 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.594 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.594 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.594 12 ERROR oslo_messaging.notify.messaging Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.594 12 ERROR 
oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.594 12 ERROR oslo_messaging.notify.messaging Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.594 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.594 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.594 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.594 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.594 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.594 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.594 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.594 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.594 12 ERROR oslo_messaging.notify.messaging with 
self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.594 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.594 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.594 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.594 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.594 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.594 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.594 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.594 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.594 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.594 12 ERROR 
oslo_messaging.notify.messaging self.ensure_connection() Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.594 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.594 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.594 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.594 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.594 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.594 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.594 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.594 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.594 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.594 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 05:42:50 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.594 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.594 12 ERROR oslo_messaging.notify.messaging Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.599 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.600 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.incoming.bytes volume: 8696 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.602 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '46a480b1-43ed-468d-8559-7b15b6cff8e1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 8696, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T09:42:50.600012', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 
'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': '263ba33c-a8e2-11f0-9707-fa163e99780b', 'monotonic_time': 10986.773070785, 'message_signature': '8f1e87e2186953d53221a233a38dd48c8af9e05b7b9612e06344e6c048cc0a5d'}]}, 'timestamp': '2025-10-14 09:42:50.600639', '_unique_id': 'b98e5e860197477091c1e97f22ad6d43'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.602 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.602 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.602 12 ERROR oslo_messaging.notify.messaging yield Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.602 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.602 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.602 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.602 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) 
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.602 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.602 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.602 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.602 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.602 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.602 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.602 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.602 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.602 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.602 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.602 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.602 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.602 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.602 12 ERROR oslo_messaging.notify.messaging Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.602 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.602 12 ERROR oslo_messaging.notify.messaging Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.602 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.602 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.602 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.602 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.602 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.602 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.602 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.602 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.602 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.602 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.602 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.602 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.602 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.602 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.602 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.602 12 ERROR oslo_messaging.notify.messaging 
File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.602 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.602 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.602 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.602 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.602 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.602 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.602 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.602 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.602 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.602 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in 
__exit__ Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.602 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.602 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.602 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.602 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.602 12 ERROR oslo_messaging.notify.messaging Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.603 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.621 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.621 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.623 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '8bb71f32-6647-477e-9a17-686cbf9a96ab', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T09:42:50.603614', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '263edd68-a8e2-11f0-9707-fa163e99780b', 'monotonic_time': 10986.79655224, 'message_signature': '1a4702b7791eb4e7ef599b3fdec2c6dcf20b90dd4ca84e66d20072b122d69943'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T09:42:50.603614', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 
'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '263ef7b2-a8e2-11f0-9707-fa163e99780b', 'monotonic_time': 10986.79655224, 'message_signature': '20c1eafd6060a0a316e3dc942609c289ee1db7f2c1878340af19207a4161f2a9'}]}, 'timestamp': '2025-10-14 09:42:50.622460', '_unique_id': '234a8bf258d442eb86f04f3a0a78cf66'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.623 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.623 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.623 12 ERROR oslo_messaging.notify.messaging yield Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.623 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.623 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.623 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.623 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.623 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.623 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.623 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.623 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.623 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.623 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.623 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.623 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.623 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 05:42:50 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.623 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.623 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.623 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.623 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.623 12 ERROR oslo_messaging.notify.messaging Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.623 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.623 12 ERROR oslo_messaging.notify.messaging Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.623 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.623 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.623 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.623 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 
09:42:50.623 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.623 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.623 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.623 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.623 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.623 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.623 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.623 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.623 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.623 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 05:42:50 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.623 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.623 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.623 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.623 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.623 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.623 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.623 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.623 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.623 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.623 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 
09:42:50.623 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.623 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.623 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.623 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.623 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.623 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.623 12 ERROR oslo_messaging.notify.messaging Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.625 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.625 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.627 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '6bfc53b7-f1d8-4524-ad52-37bc00f5f37b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T09:42:50.625395', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': '263f8560-a8e2-11f0-9707-fa163e99780b', 'monotonic_time': 10986.773070785, 'message_signature': 'e986db95cb143eb446c958792ed3cfb5b5af6b5bcf265a7c2875599b4b7ba63a'}]}, 'timestamp': '2025-10-14 09:42:50.626057', '_unique_id': '792de283d29c49748311c9c152955f3d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.627 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:42:50 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.627 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.627 12 ERROR oslo_messaging.notify.messaging yield Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.627 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.627 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.627 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.627 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.627 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.627 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.627 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.627 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.627 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.627 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.627 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.627 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.627 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.627 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.627 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.627 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.627 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.627 12 ERROR oslo_messaging.notify.messaging Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.627 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.627 12 ERROR oslo_messaging.notify.messaging Oct 14 05:42:50 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.627 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.627 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.627 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.627 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.627 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.627 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.627 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.627 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.627 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.627 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.627 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.627 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.627 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.627 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.627 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.627 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.627 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.627 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.627 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.627 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 05:42:50 
localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.627 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.627 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.627 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.627 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.627 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.627 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.627 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.627 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.627 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.627 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.627 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.628 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.628 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.628 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: []
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.629 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.649 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/cpu volume: 59050000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.651 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '77df5aa1-e7e4-4f5d-ba80-50add2cb6fb2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 59050000000, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'timestamp': '2025-10-14T09:42:50.629702', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '264329d6-a8e2-11f0-9707-fa163e99780b', 'monotonic_time': 10986.840870611, 'message_signature': 'e46039f843402d4e20be6db9b1678a2f60182912a541769bdf1f70926ab28726'}]}, 'timestamp': '2025-10-14 09:42:50.650106', '_unique_id': '502c47852c2647398ba9fd0d5fbd4c7f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.651 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.651 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.651 12 ERROR oslo_messaging.notify.messaging yield
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.651 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.651 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.651 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.651 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.651 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.651 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.651 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.651 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.651 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.651 12 ERROR oslo_messaging.notify.messaging conn.connect()
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.651 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.651 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.651 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.651 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.651 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.651 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.651 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.651 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.651 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.651 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.651 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.651 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.651 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.651 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.651 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.651 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.651 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.651 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.651 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.651 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.651 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.651 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.651 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.651 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.651 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.651 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.651 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.651 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.651 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.651 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.651 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.651 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.651 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.651 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.651 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.651 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.651 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.651 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.651 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.651 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.651 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.653 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.654 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.outgoing.packets volume: 129 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.656 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6349ed04-d0ee-485e-8ebb-73303df2f520', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 129, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T09:42:50.653991', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': '2643e290-a8e2-11f0-9707-fa163e99780b', 'monotonic_time': 10986.773070785, 'message_signature': '7834500a7c589a1d5004e5f372fc782a18bcfe9a2df3475cf0edc89c4e72762c'}]}, 'timestamp': '2025-10-14 09:42:50.654922', '_unique_id': 'bcf33d25be1d4f34910c758875d879cc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.656 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.656 12 ERROR oslo_messaging.notify.messaging yield
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.656 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.656 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.656 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.656 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.656 12 ERROR oslo_messaging.notify.messaging conn.connect()
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.656 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.656 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.656 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.656 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.656 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.656 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.656 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.656 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.656 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.656 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.656 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.656 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.656 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.656 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.656 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.656 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.656 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.656 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.656 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.656 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.656 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.656 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.656 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.656 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.658 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.658 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.661 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '54eeacc5-0667-4d5c-9861-3713590c3738', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T09:42:50.658568', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': '26449780-a8e2-11f0-9707-fa163e99780b', 'monotonic_time': 10986.773070785, 'message_signature': '70a9fb1e1f2e08444a3f15a75a7e30f68817aac754a10366c6968cbdeed592e7'}]}, 'timestamp': '2025-10-14 09:42:50.659419', '_unique_id': '8e947d0a91104f4d94d9e490ab41295c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.661 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.661 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.661 12 ERROR oslo_messaging.notify.messaging yield
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.661 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.661 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.661 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.661 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.661 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.661 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.661 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.661 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.661 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.661 12 ERROR oslo_messaging.notify.messaging conn.connect()
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.661 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.661 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.661 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.661 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.661 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.661 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.661 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.661 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.661 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.661 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.661 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.661 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.661 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.661 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.661 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.661 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.661 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.661 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.661 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.661 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.661 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.661 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.661 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.661 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.661 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.661 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.661 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.661 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.661 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.661 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.661 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.661 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.661 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.661 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.661 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.661 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.661 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.661 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.661 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.661 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.661 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:42:50 localhost
ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.662 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.684 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.read.bytes volume: 29130240 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.685 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.read.bytes volume: 4300800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.687 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b2ad40d3-0bde-4326-8f05-b35380d56ec6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 29130240, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T09:42:50.663081', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '26489a24-a8e2-11f0-9707-fa163e99780b', 'monotonic_time': 10986.856030303, 'message_signature': 'cba7028e2a583b619c120f79e792a86f0d0ffa77d2440aff685537bf9eebb6da'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 4300800, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T09:42:50.663081', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '2648b068-a8e2-11f0-9707-fa163e99780b', 'monotonic_time': 10986.856030303, 'message_signature': 'a85af271d2db7755e6e7a74ffcc41f5b72fbb17e6b9112bd3e443544a85895f1'}]}, 'timestamp': '2025-10-14 09:42:50.686119', '_unique_id': '44c9089b7170464fbe97d623afd794b4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.687 12 
ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.687 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.687 12 ERROR oslo_messaging.notify.messaging yield Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.687 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.687 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.687 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.687 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.687 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.687 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.687 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.687 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 05:42:50 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.687 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.687 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.687 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.687 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.687 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.687 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.687 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.687 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.687 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.687 12 ERROR oslo_messaging.notify.messaging Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.687 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 
2025-10-14 09:42:50.687 12 ERROR oslo_messaging.notify.messaging Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.687 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.687 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.687 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.687 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.687 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.687 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.687 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.687 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.687 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.687 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.687 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.687 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.687 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.687 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.687 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.687 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.687 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.687 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.687 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.687 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.687 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.687 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.687 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.687 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.687 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.687 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.687 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.687 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.687 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.687 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:42:50 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.687 12 ERROR oslo_messaging.notify.messaging Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.688 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.689 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163 Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.689 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [] Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.689 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.689 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163 Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.689 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [] Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.690 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.690 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.incoming.packets.drop volume: 0 _stats_to_sample 
/usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.692 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '506e52d4-1c6f-4365-a70b-d52aff3b9e6a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T09:42:50.690371', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': '26496af8-a8e2-11f0-9707-fa163e99780b', 'monotonic_time': 10986.773070785, 'message_signature': '3c6ebfbad63f67bb29fb02662f76f91b078ac9d5362429fa3a2d8111d0749c8a'}]}, 'timestamp': '2025-10-14 09:42:50.690955', '_unique_id': '949be2d57e2b4a89907b6d685cd1acb8'}: 
kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.692 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.692 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.692 12 ERROR oslo_messaging.notify.messaging yield Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.692 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.692 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.692 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.692 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.692 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.692 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.692 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 
2025-10-14 09:42:50.692 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.692 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.692 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.692 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.692 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.692 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.692 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.692 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.692 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.692 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.692 12 ERROR oslo_messaging.notify.messaging Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.692 12 ERROR 
oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.692 12 ERROR oslo_messaging.notify.messaging Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.692 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.692 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.692 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.692 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.692 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.692 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.692 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.692 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.692 12 ERROR oslo_messaging.notify.messaging with 
self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.692 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.692 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.692 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.692 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.692 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.692 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.692 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.692 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.692 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.692 12 ERROR 
oslo_messaging.notify.messaging self.ensure_connection() Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.692 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.692 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.692 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.692 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.692 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.692 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.692 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.692 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.692 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.692 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 05:42:50 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.692 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.692 12 ERROR oslo_messaging.notify.messaging Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.693 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.693 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.write.requests volume: 525 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.693 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.695 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '2fe17eb7-97db-4ffe-9f80-f907a971a20a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 525, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T09:42:50.693348', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '2649de84-a8e2-11f0-9707-fa163e99780b', 'monotonic_time': 10986.856030303, 'message_signature': '8c9dbbaa6134c285d5682f03c88d77aa119af36108bb31a11d0de2d0224ab864'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T09:42:50.693348', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '2649f1c6-a8e2-11f0-9707-fa163e99780b', 'monotonic_time': 10986.856030303, 'message_signature': '86c720713f9d1791080e3636c3711b45b386851e70dbe2d8f89339fda307a4f6'}]}, 'timestamp': '2025-10-14 09:42:50.694315', '_unique_id': '63c2c22e336b40a1b0495e6278052568'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.695 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.695 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.695 12 ERROR oslo_messaging.notify.messaging yield Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.695 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.695 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.695 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.695 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.695 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.695 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.695 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.695 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.695 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.695 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.695 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.695 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.695 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 
05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.695 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.695 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.695 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.695 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.695 12 ERROR oslo_messaging.notify.messaging Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.695 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.695 12 ERROR oslo_messaging.notify.messaging Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.695 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.695 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.695 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.695 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 05:42:50 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.695 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.695 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.695 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.695 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.695 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.695 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.695 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.695 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.695 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.695 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.695 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.695 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.695 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.695 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.695 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.695 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.695 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.695 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.695 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.695 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:42:50 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.695 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.695 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.695 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.695 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.695 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.695 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.695 12 ERROR oslo_messaging.notify.messaging Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.696 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.696 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.read.requests volume: 1064 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.697 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.read.requests volume: 222 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 05:42:50 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.698 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5c71729c-4075-4909-9863-73ce0834ec32', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1064, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T09:42:50.696851', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '264a66f6-a8e2-11f0-9707-fa163e99780b', 'monotonic_time': 10986.856030303, 'message_signature': '7569a4a9d23ccb1eb2d905e44de99e71fa441bd21b542701be086c2d6c8e7c0c'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 222, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 
'timestamp': '2025-10-14T09:42:50.696851', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '264a784e-a8e2-11f0-9707-fa163e99780b', 'monotonic_time': 10986.856030303, 'message_signature': '6b8643866db4fd743a19540d054751417db372339ffce96d908d1ef2e7e1e099'}]}, 'timestamp': '2025-10-14 09:42:50.697779', '_unique_id': 'a464b753c86248adb4a17329b2182581'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.698 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.698 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.698 12 ERROR oslo_messaging.notify.messaging yield Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.698 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.698 12 ERROR oslo_messaging.notify.messaging return 
retry_over_time( Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.698 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.698 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.698 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.698 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.698 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.698 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.698 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.698 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.698 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.698 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.698 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.698 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.698 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.698 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.698 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.698 12 ERROR oslo_messaging.notify.messaging Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.698 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.698 12 ERROR oslo_messaging.notify.messaging Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.698 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.698 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.698 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.698 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.698 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.698 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.698 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.698 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.698 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.698 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.698 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.698 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.698 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.698 
12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.698 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.698 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.698 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.698 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.698 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.698 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.698 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.698 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.698 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.698 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.698 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.698 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.698 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.698 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.698 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.698 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.698 12 ERROR oslo_messaging.notify.messaging Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.700 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.700 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.read.latency volume: 1400578039 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.700 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.read.latency volume: 174873109 
_stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.702 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f1bb50f7-833c-4a26-abfa-ec19a739b31f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1400578039, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T09:42:50.700184', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '264ae964-a8e2-11f0-9707-fa163e99780b', 'monotonic_time': 10986.856030303, 'message_signature': '57fe1a45109f45792ff80e92e06e9a12f3237578e95433f6443859a917c4a7e3'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 174873109, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': 
'41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T09:42:50.700184', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '264afc56-a8e2-11f0-9707-fa163e99780b', 'monotonic_time': 10986.856030303, 'message_signature': 'b56223f582388ecec17db0e0e7a52a34241cf357dd9feeb5407632a7189173d8'}]}, 'timestamp': '2025-10-14 09:42:50.701135', '_unique_id': 'f83a6b6b68314f818e0e2351d1b7f7cb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.702 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.702 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.702 12 ERROR oslo_messaging.notify.messaging yield Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.702 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:42:50 
localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.702 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.702 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.702 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.702 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.702 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.702 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.702 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.702 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.702 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.702 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.702 12 ERROR oslo_messaging.notify.messaging 
self.transport.connect() Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.702 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.702 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.702 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.702 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.702 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.702 12 ERROR oslo_messaging.notify.messaging Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.702 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.702 12 ERROR oslo_messaging.notify.messaging Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.702 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.702 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.702 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 05:42:50 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.702 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.702 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.702 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.702 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.702 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.702 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.702 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.702 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.702 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.702 12 ERROR oslo_messaging.notify.messaging self.connection = 
connection_pool.get(retry=retry) Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.702 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.702 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.702 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.702 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.702 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.702 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.702 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.702 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.702 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.702 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 05:42:50 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.702 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.702 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.702 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.702 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.702 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.702 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.702 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.702 12 ERROR oslo_messaging.notify.messaging Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.703 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.703 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.704 12 DEBUG 
ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.allocation volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.705 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f5bbfc55-b1d4-46a1-9d5b-231b14a085b3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T09:42:50.703722', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '264b741a-a8e2-11f0-9707-fa163e99780b', 'monotonic_time': 10986.79655224, 'message_signature': '893a8352d502e9335383c93f59ffe9490a869e66433cbfe04f4c7c200448af22'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 512, 
'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T09:42:50.703722', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '264b8612-a8e2-11f0-9707-fa163e99780b', 'monotonic_time': 10986.79655224, 'message_signature': 'e9ab4aa6808c107c73e05522a09431039d507c4176d1d630fcf32e0f4bb3697a'}]}, 'timestamp': '2025-10-14 09:42:50.704786', '_unique_id': '7f3542d836bd401fad495001ffa93a96'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.705 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.705 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.705 12 ERROR oslo_messaging.notify.messaging yield Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.705 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.705 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.705 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.705 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.705 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.705 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.705 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.705 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.705 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.705 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.705 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 05:42:50 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.705 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.705 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.705 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.705 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.705 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.705 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.705 12 ERROR oslo_messaging.notify.messaging Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.705 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.705 12 ERROR oslo_messaging.notify.messaging Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.705 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.705 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.705 12 ERROR oslo_messaging.notify.messaging 
self.transport._send_notification(target, ctxt, message, Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.705 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.705 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.705 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.705 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.705 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.705 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.705 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.705 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.705 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 
09:42:50.705 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.705 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.705 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.705 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.705 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.705 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.705 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.705 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.705 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.705 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.705 12 ERROR 
oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.705 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.705 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.705 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.705 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.705 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.705 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.705 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.705 12 ERROR oslo_messaging.notify.messaging Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.707 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.707 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.write.bytes volume: 73895936 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 
05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.707 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.709 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fdb8d46e-d0b2-4ff3-8394-a23ab9ef0de1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 73895936, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T09:42:50.707206', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '264bfbd8-a8e2-11f0-9707-fa163e99780b', 'monotonic_time': 10986.856030303, 'message_signature': '68f3b71aeb6630c530c7d151ba38a977899027b2ffabfa2928837dc49814a0b1'}, {'source': 'openstack', 'counter_name': 
'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T09:42:50.707206', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '264c11f4-a8e2-11f0-9707-fa163e99780b', 'monotonic_time': 10986.856030303, 'message_signature': 'a787d4903ece2c4dc417dcea0309fff59b7f91fd9be1b7aa64778b7085c034f0'}]}, 'timestamp': '2025-10-14 09:42:50.708247', '_unique_id': '74a938d98e644f569e0f8dd39e5feb91'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.709 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.709 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.709 12 ERROR oslo_messaging.notify.messaging yield Oct 14 05:42:50 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.709 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.709 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.709 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.709 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.709 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.709 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.709 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.709 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.709 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.709 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.709 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.709 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.709 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.709 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.709 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.709 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.709 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.709 12 ERROR oslo_messaging.notify.messaging Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.709 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.709 12 ERROR oslo_messaging.notify.messaging Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.709 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.709 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 05:42:50 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.709 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.709 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.709 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.709 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.709 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.709 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.709 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.709 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.709 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.709 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.709 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.709 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.709 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.709 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.709 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.709 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.709 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.709 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.709 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.709 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.709 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.709 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.709 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.709 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.709 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.709 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.709 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.709 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.709 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.710 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.710 12 DEBUG ceilometer.compute.pollsters
[-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.710 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: []
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.711 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.711 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.incoming.packets volume: 81 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.712 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bfac607f-7cf0-4f70-8231-c6faac539c2f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 81, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T09:42:50.711188', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': '264c9750-a8e2-11f0-9707-fa163e99780b', 'monotonic_time': 10986.773070785, 'message_signature': '590bd3da0519db1182cfa5c3b4c3aa1ebaa27c0d23aa8d4502b16e2b1747600b'}]}, 'timestamp': '2025-10-14 09:42:50.711804', '_unique_id': '4ffe5aa1fad148e89499cdbc75133531'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.712 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 05:42:50 localhost
ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.712 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.712 12 ERROR oslo_messaging.notify.messaging yield
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.712 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.712 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.712 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.712 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.712 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.712 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.712 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.712 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.712 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.712 12 ERROR oslo_messaging.notify.messaging conn.connect()
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.712 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.712 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.712 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.712 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.712 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.712 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.712 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.712 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.712 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.712 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.712 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.712 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.712 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.712 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.712 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.712 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.712 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.712 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.712 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.712 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.712 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.712 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.712 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.712 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.712 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.712 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.712 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.712 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.712 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.712 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.712 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.712 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.712 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.712 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.712 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.712 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.712 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.712 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.712 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.712 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.712 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:42:50 localhost
ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.714 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.714 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.714 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.716 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'aea169d4-a6d8-4924-bcd2-619db32333c6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T09:42:50.714169', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '264d0b36-a8e2-11f0-9707-fa163e99780b', 'monotonic_time': 10986.79655224, 'message_signature': '0369b3c65342fb4931233a688be779ff0a898b39b609fee7f9c21942daecba43'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T09:42:50.714169', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '264d1e00-a8e2-11f0-9707-fa163e99780b', 'monotonic_time': 10986.79655224, 'message_signature': '66b9109d1f90a5da3ea10ff033f7598818c04054fa0a9134e9c86a0d4c419b43'}]}, 'timestamp': '2025-10-14 09:42:50.715103', '_unique_id': 'b84b0c3c60744ac39ee1b8d3bad1a6bc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.716 12 ERROR
oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.716 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.716 12 ERROR oslo_messaging.notify.messaging yield
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.716 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.716 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.716 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.716 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.716 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.716 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.716 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.716 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.716 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.716 12 ERROR oslo_messaging.notify.messaging conn.connect()
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.716 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.716 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.716 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.716 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.716 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.716 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.716 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.716 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.716 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.716 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.716 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.716 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.716 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.716 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.716 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.716 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.716 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.716 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.716 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.716 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.716 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.716 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.716 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.716 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.716 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.716 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.716 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.716 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.716 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.716 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.716 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.716 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.716 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.716 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.716 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.716 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.716 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.716 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.716 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.716 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 05:42:50 localhost
ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.716 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.716 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.716 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.717 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '294e980a-e580-4536-bac1-5749b3b52663', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T09:42:50.716949', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': '264d7454-a8e2-11f0-9707-fa163e99780b', 'monotonic_time': 10986.773070785, 'message_signature': 'dcf35cb9cc2e3d177ca923f4f050c9798c27bd6e050a395e4c460a5ff058c657'}]}, 'timestamp': '2025-10-14 09:42:50.717235', '_unique_id': '48e62d995f9b46a68dd81800ac73830a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.717 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.717 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.717 12 ERROR oslo_messaging.notify.messaging yield
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.717 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.717 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.717 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.717 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.717 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.717 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.717 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.717 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.717 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.717 12 ERROR oslo_messaging.notify.messaging conn.connect()
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.717 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.717 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.717 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.717 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.717 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.717 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.717 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.717 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.717 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.717 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.717 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.717 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.717 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.717 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.717 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.717 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 05:42:50 localhost
ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.717 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.717 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.717 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.717 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.717 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.717 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.717 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.717 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.717 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.717 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 05:42:50 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.717 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.717 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.717 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.717 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.717 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.717 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.717 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.717 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.717 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.717 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.717 12 ERROR 
oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.717 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.717 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.717 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.717 12 ERROR oslo_messaging.notify.messaging Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.718 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.718 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/memory.usage volume: 52.32421875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.719 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '1062d1ef-ff6e-4f3f-9bf0-875e36acb270', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 52.32421875, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'timestamp': '2025-10-14T09:42:50.718613', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '264db6a8-a8e2-11f0-9707-fa163e99780b', 'monotonic_time': 10986.840870611, 'message_signature': 'd41364c149550ab5517e4af90614a62b28640fce98881ea8e9770930b7182f1c'}]}, 'timestamp': '2025-10-14 09:42:50.718927', '_unique_id': '281512a049b24898a910c24f7dc9bf40'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.719 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.719 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in 
_reraise_as_library_errors Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.719 12 ERROR oslo_messaging.notify.messaging yield Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.719 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.719 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.719 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.719 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.719 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.719 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.719 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.719 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.719 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.719 12 
ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.719 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.719 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.719 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.719 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.719 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.719 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.719 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.719 12 ERROR oslo_messaging.notify.messaging Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.719 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.719 12 ERROR oslo_messaging.notify.messaging Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.719 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 
2025-10-14 09:42:50.719 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.719 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.719 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.719 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.719 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.719 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.719 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.719 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.719 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.719 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 
05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.719 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.719 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.719 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.719 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.719 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.719 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.719 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.719 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.719 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.719 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 05:42:50 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.719 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.719 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.719 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.719 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.719 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.719 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.719 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.719 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.719 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.719 12 ERROR oslo_messaging.notify.messaging Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.720 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters Oct 14 05:42:50 
localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.720 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.721 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bbf07580-2426-4c88-80c5-02f4059eb656', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T09:42:50.720288', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': '264df6ae-a8e2-11f0-9707-fa163e99780b', 'monotonic_time': 10986.773070785, 
'message_signature': 'ae32bf8d775336af67321bf53a652a10558a74494fd688d66c7572bef3ab2a4c'}]}, 'timestamp': '2025-10-14 09:42:50.720573', '_unique_id': '6ed40ea5a3914f74aff365aa97830e5c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.721 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.721 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.721 12 ERROR oslo_messaging.notify.messaging yield Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.721 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.721 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.721 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.721 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.721 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.721 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.721 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.721 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.721 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.721 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.721 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.721 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.721 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.721 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.721 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.721 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.721 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 05:42:50 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.721 12 ERROR oslo_messaging.notify.messaging Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.721 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.721 12 ERROR oslo_messaging.notify.messaging Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.721 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.721 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.721 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.721 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.721 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.721 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.721 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.721 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", 
line 653, in _send Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.721 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.721 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.721 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.721 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.721 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.721 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.721 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.721 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.721 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.721 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.721 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.721 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.721 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.721 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.721 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.721 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.721 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.721 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.721 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.721 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 05:42:50 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.721 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.721 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.721 12 ERROR oslo_messaging.notify.messaging Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.721 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.721 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.outgoing.bytes volume: 11272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.722 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '56dd7bf3-f615-490f-a11f-9af3cd1e84ad', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 11272, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T09:42:50.721910', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': '264e3664-a8e2-11f0-9707-fa163e99780b', 'monotonic_time': 10986.773070785, 'message_signature': '84e94f04eb6550e5e472a86d03ca217070ca43dd9d948140c2d02ef4e881e6b3'}]}, 'timestamp': '2025-10-14 09:42:50.722208', '_unique_id': 'a35ad54fc0214e68bb4adcecf7567914'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.722 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:42:50 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.722 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.722 12 ERROR oslo_messaging.notify.messaging yield Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.722 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.722 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.722 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.722 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.722 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.722 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.722 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.722 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.722 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.722 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.722 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.722 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.722 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.722 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.722 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.722 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.722 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.722 12 ERROR oslo_messaging.notify.messaging Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.722 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.722 12 ERROR oslo_messaging.notify.messaging Oct 14 05:42:50 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.722 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.722 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.722 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.722 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.722 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.722 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.722 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.722 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.722 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.722 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.722 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.722 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.722 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.722 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.722 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.722 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.722 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.722 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.722 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.722 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 05:42:50 
localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.722 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.722 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.722 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.722 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.722 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.722 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.722 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.722 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.722 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.722 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.722 12 ERROR oslo_messaging.notify.messaging Oct 14 05:42:50 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.723 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.723 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.write.latency volume: 217051898 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.723 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.write.latency volume: 24279644 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.724 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '630f61d3-cba3-4809-a4c3-deb04dabbe99', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 217051898, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T09:42:50.723627', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': 
{'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '264e7a20-a8e2-11f0-9707-fa163e99780b', 'monotonic_time': 10986.856030303, 'message_signature': '30264a69cc1e216ea07362dd1bc802f61a4a190e49db2c6ee1c518ea7c682c00'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 24279644, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T09:42:50.723627', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '264e8448-a8e2-11f0-9707-fa163e99780b', 'monotonic_time': 10986.856030303, 'message_signature': '8abfcf16a47d83ec5afd0db328f4545ab4dec06c4dd5dbf8a0ffc38eb4271060'}]}, 'timestamp': '2025-10-14 09:42:50.724181', '_unique_id': '9c0793fe4e924461a2a63bfa82e73728'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 
09:42:50.724 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.724 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.724 12 ERROR oslo_messaging.notify.messaging yield Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.724 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.724 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.724 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.724 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.724 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.724 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.724 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.724 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 05:42:50 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.724 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.724 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.724 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.724 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.724 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.724 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.724 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.724 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.724 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.724 12 ERROR oslo_messaging.notify.messaging Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.724 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 
2025-10-14 09:42:50.724 12 ERROR oslo_messaging.notify.messaging Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.724 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.724 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.724 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.724 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.724 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.724 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.724 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.724 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.724 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.724 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.724 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.724 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.724 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.724 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.724 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.724 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.724 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.724 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.724 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.724 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.724 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.724 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.724 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.724 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.724 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.724 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.724 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.724 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.724 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 05:42:50 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.724 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:42:50 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:42:50.724 12 ERROR oslo_messaging.notify.messaging Oct 14 05:42:51 localhost python3.9[245752]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/node_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1760434969.187948-1546-87013859351729/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None Oct 14 05:42:51 localhost nova_compute[238069]: 2025-10-14 09:42:51.277 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:42:52 localhost python3.9[245862]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/telemetry config_pattern=node_exporter.json debug=False Oct 14 05:42:52 localhost python3.9[245972]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data Oct 14 05:42:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55301 DF PROTO=TCP SPT=41262 DPT=9102 SEQ=2552033162 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B0C66050000000001030307) Oct 14 05:42:54 localhost python3[246082]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/telemetry config_id=edpm config_overrides={} config_patterns=node_exporter.json log_base_path=/var/log/containers/stdouts debug=False Oct 14 05:42:54 localhost podman[246119]: Oct 14 05:42:54 localhost podman[246119]: 2025-10-14 09:42:54.422107007 +0000 UTC m=+0.092507486 container create 
aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, maintainer=The Prometheus Authors , config_id=edpm, container_name=node_exporter, managed_by=edpm_ansible) Oct 14 05:42:54 localhost podman[246119]: 2025-10-14 09:42:54.376020774 +0000 UTC m=+0.046421233 image pull quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c Oct 14 05:42:54 localhost python3[246082]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name node_exporter --conmon-pidfile /run/node_exporter.pid --env OS_ENDPOINT_TYPE=internal --healthcheck-command /openstack/healthcheck node_exporter --label config_id=edpm --label container_name=node_exporter --label managed_by=edpm_ansible --label config_data={'image': 
'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9100:9100 --user root --volume /var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw --volume /var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c --web.disable-exporter-metrics --collector.systemd --collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\.service --no-collector.dmi --no-collector.entropy --no-collector.thermal_zone --no-collector.time --no-collector.timex --no-collector.uname --no-collector.stat --no-collector.hwmon --no-collector.os --no-collector.selinux --no-collector.textfile --no-collector.powersupplyclass --no-collector.pressure --no-collector.rapl Oct 14 05:42:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 
SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55302 DF PROTO=TCP SPT=41262 DPT=9102 SEQ=2552033162 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B0C6A1A0000000001030307) Oct 14 05:42:54 localhost nova_compute[238069]: 2025-10-14 09:42:54.806 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:42:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29. Oct 14 05:42:55 localhost systemd[1]: tmp-crun.UHKARJ.mount: Deactivated successfully. Oct 14 05:42:55 localhost podman[246264]: 2025-10-14 09:42:55.248801469 +0000 UTC m=+0.073559265 container health_status fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, org.label-schema.vendor=CentOS, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, managed_by=edpm_ansible, container_name=iscsid, org.label-schema.schema-version=1.0, tcib_managed=true) Oct 14 05:42:55 localhost podman[246264]: 2025-10-14 09:42:55.260770608 +0000 UTC m=+0.085528354 container exec_died fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, 
tcib_managed=true, container_name=iscsid, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, maintainer=OpenStack Kubernetes Operator team) Oct 14 05:42:55 localhost systemd[1]: fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29.service: Deactivated successfully. Oct 14 05:42:55 localhost python3.9[246263]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Oct 14 05:42:56 localhost python3.9[246394]: ansible-file Invoked with path=/etc/systemd/system/edpm_node_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:42:56 localhost nova_compute[238069]: 2025-10-14 09:42:56.280 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:42:56 localhost python3.9[246503]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1760434976.2758687-1705-163690637276254/source dest=/etc/systemd/system/edpm_node_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:42:57 localhost python3.9[246558]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None Oct 14 05:42:57 localhost systemd[1]: Reloading. 
Oct 14 05:42:57 localhost systemd-sysv-generator[246585]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Oct 14 05:42:57 localhost systemd-rc-local-generator[246582]: /etc/rc.d/rc.local is not marked executable, skipping. Oct 14 05:42:57 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Oct 14 05:42:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:42:57.748 163055 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Oct 14 05:42:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:42:57.748 163055 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Oct 14 05:42:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:42:57.749 163055 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Oct 14 05:42:58 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31725 DF PROTO=TCP SPT=34552 DPT=9882 SEQ=1349285534 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B0C78CB0000000001030307) Oct 14 05:42:58 localhost python3.9[246649]: 
ansible-systemd Invoked with state=restarted name=edpm_node_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Oct 14 05:42:58 localhost systemd[1]: Reloading. Oct 14 05:42:58 localhost systemd-rc-local-generator[246679]: /etc/rc.d/rc.local is not marked executable, skipping. Oct 14 05:42:58 localhost systemd-sysv-generator[246683]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Oct 14 05:42:58 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Oct 14 05:42:58 localhost systemd[1]: Starting node_exporter container... Oct 14 05:42:58 localhost systemd[1]: Started libcrun container. Oct 14 05:42:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec. 
Oct 14 05:42:58 localhost podman[246691]: 2025-10-14 09:42:58.97226998 +0000 UTC m=+0.137073856 container init aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter) Oct 14 05:42:58 localhost node_exporter[246705]: ts=2025-10-14T09:42:58.986Z caller=node_exporter.go:180 level=info msg="Starting node_exporter" version="(version=1.5.0, branch=HEAD, revision=1b48970ffcf5630534fb00bb0687d73c66d1c959)" Oct 14 05:42:58 localhost node_exporter[246705]: ts=2025-10-14T09:42:58.986Z caller=node_exporter.go:181 level=info msg="Build context" build_context="(go=go1.19.3, user=root@6e7732a7b81b, date=20221129-18:59:09)" Oct 14 05:42:58 localhost node_exporter[246705]: 
ts=2025-10-14T09:42:58.986Z caller=node_exporter.go:183 level=warn msg="Node Exporter is running as root user. This exporter is designed to run as unprivileged user, root is not required." Oct 14 05:42:58 localhost node_exporter[246705]: ts=2025-10-14T09:42:58.987Z caller=diskstats_common.go:111 level=info collector=diskstats msg="Parsed flag --collector.diskstats.device-exclude" flag=^(ram|loop|fd|(h|s|v|xv)d[a-z]|nvme\d+n\d+p)\d+$ Oct 14 05:42:58 localhost node_exporter[246705]: ts=2025-10-14T09:42:58.987Z caller=diskstats_linux.go:264 level=error collector=diskstats msg="Failed to open directory, disabling udev device properties" path=/run/udev/data Oct 14 05:42:58 localhost node_exporter[246705]: ts=2025-10-14T09:42:58.987Z caller=systemd_linux.go:152 level=info collector=systemd msg="Parsed flag --collector.systemd.unit-include" flag=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\.service Oct 14 05:42:58 localhost node_exporter[246705]: ts=2025-10-14T09:42:58.988Z caller=systemd_linux.go:154 level=info collector=systemd msg="Parsed flag --collector.systemd.unit-exclude" flag=.+\.(automount|device|mount|scope|slice) Oct 14 05:42:58 localhost node_exporter[246705]: ts=2025-10-14T09:42:58.988Z caller=filesystem_common.go:111 level=info collector=filesystem msg="Parsed flag --collector.filesystem.mount-points-exclude" flag=^/(dev|proc|run/credentials/.+|sys|var/lib/docker/.+|var/lib/containers/storage/.+)($|/) Oct 14 05:42:58 localhost node_exporter[246705]: ts=2025-10-14T09:42:58.988Z caller=filesystem_common.go:113 level=info collector=filesystem msg="Parsed flag --collector.filesystem.fs-types-exclude" flag=^(autofs|binfmt_misc|bpf|cgroup2?|configfs|debugfs|devpts|devtmpfs|fusectl|hugetlbfs|iso9660|mqueue|nsfs|overlay|proc|procfs|pstore|rpc_pipefs|securityfs|selinuxfs|squashfs|sysfs|tracefs)$ Oct 14 05:42:58 localhost node_exporter[246705]: ts=2025-10-14T09:42:58.988Z caller=node_exporter.go:110 level=info msg="Enabled collectors" Oct 14 05:42:58 localhost 
node_exporter[246705]: ts=2025-10-14T09:42:58.988Z caller=node_exporter.go:117 level=info collector=arp Oct 14 05:42:58 localhost node_exporter[246705]: ts=2025-10-14T09:42:58.988Z caller=node_exporter.go:117 level=info collector=bcache Oct 14 05:42:58 localhost node_exporter[246705]: ts=2025-10-14T09:42:58.988Z caller=node_exporter.go:117 level=info collector=bonding Oct 14 05:42:58 localhost node_exporter[246705]: ts=2025-10-14T09:42:58.988Z caller=node_exporter.go:117 level=info collector=btrfs Oct 14 05:42:58 localhost node_exporter[246705]: ts=2025-10-14T09:42:58.988Z caller=node_exporter.go:117 level=info collector=conntrack Oct 14 05:42:58 localhost node_exporter[246705]: ts=2025-10-14T09:42:58.988Z caller=node_exporter.go:117 level=info collector=cpu Oct 14 05:42:58 localhost node_exporter[246705]: ts=2025-10-14T09:42:58.988Z caller=node_exporter.go:117 level=info collector=cpufreq Oct 14 05:42:58 localhost node_exporter[246705]: ts=2025-10-14T09:42:58.988Z caller=node_exporter.go:117 level=info collector=diskstats Oct 14 05:42:58 localhost node_exporter[246705]: ts=2025-10-14T09:42:58.988Z caller=node_exporter.go:117 level=info collector=edac Oct 14 05:42:58 localhost node_exporter[246705]: ts=2025-10-14T09:42:58.988Z caller=node_exporter.go:117 level=info collector=fibrechannel Oct 14 05:42:58 localhost node_exporter[246705]: ts=2025-10-14T09:42:58.988Z caller=node_exporter.go:117 level=info collector=filefd Oct 14 05:42:58 localhost node_exporter[246705]: ts=2025-10-14T09:42:58.988Z caller=node_exporter.go:117 level=info collector=filesystem Oct 14 05:42:58 localhost node_exporter[246705]: ts=2025-10-14T09:42:58.988Z caller=node_exporter.go:117 level=info collector=infiniband Oct 14 05:42:58 localhost node_exporter[246705]: ts=2025-10-14T09:42:58.988Z caller=node_exporter.go:117 level=info collector=ipvs Oct 14 05:42:58 localhost node_exporter[246705]: ts=2025-10-14T09:42:58.988Z caller=node_exporter.go:117 level=info collector=loadavg Oct 14 05:42:58 
localhost node_exporter[246705]: ts=2025-10-14T09:42:58.988Z caller=node_exporter.go:117 level=info collector=mdadm Oct 14 05:42:58 localhost node_exporter[246705]: ts=2025-10-14T09:42:58.988Z caller=node_exporter.go:117 level=info collector=meminfo Oct 14 05:42:58 localhost node_exporter[246705]: ts=2025-10-14T09:42:58.988Z caller=node_exporter.go:117 level=info collector=netclass Oct 14 05:42:58 localhost node_exporter[246705]: ts=2025-10-14T09:42:58.988Z caller=node_exporter.go:117 level=info collector=netdev Oct 14 05:42:58 localhost node_exporter[246705]: ts=2025-10-14T09:42:58.988Z caller=node_exporter.go:117 level=info collector=netstat Oct 14 05:42:58 localhost node_exporter[246705]: ts=2025-10-14T09:42:58.988Z caller=node_exporter.go:117 level=info collector=nfs Oct 14 05:42:58 localhost node_exporter[246705]: ts=2025-10-14T09:42:58.988Z caller=node_exporter.go:117 level=info collector=nfsd Oct 14 05:42:58 localhost node_exporter[246705]: ts=2025-10-14T09:42:58.988Z caller=node_exporter.go:117 level=info collector=nvme Oct 14 05:42:58 localhost node_exporter[246705]: ts=2025-10-14T09:42:58.988Z caller=node_exporter.go:117 level=info collector=schedstat Oct 14 05:42:58 localhost node_exporter[246705]: ts=2025-10-14T09:42:58.988Z caller=node_exporter.go:117 level=info collector=sockstat Oct 14 05:42:58 localhost node_exporter[246705]: ts=2025-10-14T09:42:58.988Z caller=node_exporter.go:117 level=info collector=softnet Oct 14 05:42:58 localhost node_exporter[246705]: ts=2025-10-14T09:42:58.988Z caller=node_exporter.go:117 level=info collector=systemd Oct 14 05:42:58 localhost node_exporter[246705]: ts=2025-10-14T09:42:58.988Z caller=node_exporter.go:117 level=info collector=tapestats Oct 14 05:42:58 localhost node_exporter[246705]: ts=2025-10-14T09:42:58.988Z caller=node_exporter.go:117 level=info collector=udp_queues Oct 14 05:42:58 localhost node_exporter[246705]: ts=2025-10-14T09:42:58.988Z caller=node_exporter.go:117 level=info collector=vmstat Oct 14 
05:42:58 localhost node_exporter[246705]: ts=2025-10-14T09:42:58.988Z caller=node_exporter.go:117 level=info collector=xfs Oct 14 05:42:58 localhost node_exporter[246705]: ts=2025-10-14T09:42:58.988Z caller=node_exporter.go:117 level=info collector=zfs Oct 14 05:42:58 localhost node_exporter[246705]: ts=2025-10-14T09:42:58.989Z caller=tls_config.go:232 level=info msg="Listening on" address=[::]:9100 Oct 14 05:42:58 localhost node_exporter[246705]: ts=2025-10-14T09:42:58.989Z caller=tls_config.go:235 level=info msg="TLS is disabled." http2=false address=[::]:9100 Oct 14 05:42:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec. Oct 14 05:42:59 localhost podman[246691]: 2025-10-14 09:42:59.011259016 +0000 UTC m=+0.176062852 container start aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck 
node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Oct 14 05:42:59 localhost podman[246691]: node_exporter Oct 14 05:42:59 localhost systemd[1]: Started node_exporter container. Oct 14 05:42:59 localhost podman[246714]: 2025-10-14 09:42:59.091751642 +0000 UTC m=+0.076014047 container health_status aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=starting, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Oct 14 05:42:59 localhost podman[246714]: 2025-10-14 09:42:59.150165674 +0000 UTC m=+0.134428099 container exec_died 
aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Oct 14 05:42:59 localhost systemd[1]: aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec.service: Deactivated successfully. 
Oct 14 05:42:59 localhost nova_compute[238069]: 2025-10-14 09:42:59.849 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:43:00 localhost python3.9[246845]: ansible-ansible.builtin.systemd Invoked with name=edpm_node_exporter.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None Oct 14 05:43:00 localhost systemd[1]: Stopping node_exporter container... Oct 14 05:43:00 localhost systemd[1]: libpod-aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec.scope: Deactivated successfully. Oct 14 05:43:00 localhost podman[246849]: 2025-10-14 09:43:00.237600715 +0000 UTC m=+0.066027656 container stop aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': 
['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter) Oct 14 05:43:00 localhost podman[246849]: 2025-10-14 09:43:00.265310591 +0000 UTC m=+0.093737552 container died aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Oct 14 05:43:00 localhost systemd[1]: aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec.timer: Deactivated successfully. Oct 14 05:43:00 localhost systemd[1]: Stopped /usr/bin/podman healthcheck run aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec. 
Oct 14 05:43:00 localhost systemd[1]: tmp-crun.qfkbQy.mount: Deactivated successfully. Oct 14 05:43:00 localhost podman[246849]: 2025-10-14 09:43:00.37917029 +0000 UTC m=+0.207597221 container cleanup aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm) Oct 14 05:43:00 localhost podman[246849]: node_exporter Oct 14 05:43:00 localhost systemd[1]: edpm_node_exporter.service: Main process exited, code=exited, status=2/INVALIDARGUMENT Oct 14 05:43:00 localhost podman[246876]: 2025-10-14 09:43:00.480962656 +0000 UTC m=+0.067375415 container cleanup aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec 
(image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Oct 14 05:43:00 localhost podman[246876]: node_exporter Oct 14 05:43:00 localhost systemd[1]: edpm_node_exporter.service: Failed with result 'exit-code'. Oct 14 05:43:00 localhost systemd[1]: Stopped node_exporter container. Oct 14 05:43:00 localhost systemd[1]: Starting node_exporter container... Oct 14 05:43:00 localhost systemd[1]: Started libcrun container. 
Oct 14 05:43:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55304 DF PROTO=TCP SPT=41262 DPT=9102 SEQ=2552033162 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B0C81DA0000000001030307) Oct 14 05:43:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec. Oct 14 05:43:00 localhost podman[246890]: 2025-10-14 09:43:00.635750057 +0000 UTC m=+0.125431366 container init aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Oct 14 05:43:00 
localhost node_exporter[246905]: ts=2025-10-14T09:43:00.649Z caller=node_exporter.go:180 level=info msg="Starting node_exporter" version="(version=1.5.0, branch=HEAD, revision=1b48970ffcf5630534fb00bb0687d73c66d1c959)" Oct 14 05:43:00 localhost node_exporter[246905]: ts=2025-10-14T09:43:00.649Z caller=node_exporter.go:181 level=info msg="Build context" build_context="(go=go1.19.3, user=root@6e7732a7b81b, date=20221129-18:59:09)" Oct 14 05:43:00 localhost node_exporter[246905]: ts=2025-10-14T09:43:00.649Z caller=node_exporter.go:183 level=warn msg="Node Exporter is running as root user. This exporter is designed to run as unprivileged user, root is not required." Oct 14 05:43:00 localhost node_exporter[246905]: ts=2025-10-14T09:43:00.650Z caller=filesystem_common.go:111 level=info collector=filesystem msg="Parsed flag --collector.filesystem.mount-points-exclude" flag=^/(dev|proc|run/credentials/.+|sys|var/lib/docker/.+|var/lib/containers/storage/.+)($|/) Oct 14 05:43:00 localhost node_exporter[246905]: ts=2025-10-14T09:43:00.650Z caller=filesystem_common.go:113 level=info collector=filesystem msg="Parsed flag --collector.filesystem.fs-types-exclude" flag=^(autofs|binfmt_misc|bpf|cgroup2?|configfs|debugfs|devpts|devtmpfs|fusectl|hugetlbfs|iso9660|mqueue|nsfs|overlay|proc|procfs|pstore|rpc_pipefs|securityfs|selinuxfs|squashfs|sysfs|tracefs)$ Oct 14 05:43:00 localhost node_exporter[246905]: ts=2025-10-14T09:43:00.651Z caller=systemd_linux.go:152 level=info collector=systemd msg="Parsed flag --collector.systemd.unit-include" flag=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\.service Oct 14 05:43:00 localhost node_exporter[246905]: ts=2025-10-14T09:43:00.651Z caller=systemd_linux.go:154 level=info collector=systemd msg="Parsed flag --collector.systemd.unit-exclude" flag=.+\.(automount|device|mount|scope|slice) Oct 14 05:43:00 localhost node_exporter[246905]: ts=2025-10-14T09:43:00.651Z caller=diskstats_common.go:111 level=info collector=diskstats msg="Parsed flag 
--collector.diskstats.device-exclude" flag=^(ram|loop|fd|(h|s|v|xv)d[a-z]|nvme\d+n\d+p)\d+$ Oct 14 05:43:00 localhost node_exporter[246905]: ts=2025-10-14T09:43:00.651Z caller=diskstats_linux.go:264 level=error collector=diskstats msg="Failed to open directory, disabling udev device properties" path=/run/udev/data Oct 14 05:43:00 localhost node_exporter[246905]: ts=2025-10-14T09:43:00.651Z caller=node_exporter.go:110 level=info msg="Enabled collectors" Oct 14 05:43:00 localhost node_exporter[246905]: ts=2025-10-14T09:43:00.651Z caller=node_exporter.go:117 level=info collector=arp Oct 14 05:43:00 localhost node_exporter[246905]: ts=2025-10-14T09:43:00.651Z caller=node_exporter.go:117 level=info collector=bcache Oct 14 05:43:00 localhost node_exporter[246905]: ts=2025-10-14T09:43:00.651Z caller=node_exporter.go:117 level=info collector=bonding Oct 14 05:43:00 localhost node_exporter[246905]: ts=2025-10-14T09:43:00.651Z caller=node_exporter.go:117 level=info collector=btrfs Oct 14 05:43:00 localhost node_exporter[246905]: ts=2025-10-14T09:43:00.651Z caller=node_exporter.go:117 level=info collector=conntrack Oct 14 05:43:00 localhost node_exporter[246905]: ts=2025-10-14T09:43:00.651Z caller=node_exporter.go:117 level=info collector=cpu Oct 14 05:43:00 localhost node_exporter[246905]: ts=2025-10-14T09:43:00.651Z caller=node_exporter.go:117 level=info collector=cpufreq Oct 14 05:43:00 localhost node_exporter[246905]: ts=2025-10-14T09:43:00.651Z caller=node_exporter.go:117 level=info collector=diskstats Oct 14 05:43:00 localhost node_exporter[246905]: ts=2025-10-14T09:43:00.652Z caller=node_exporter.go:117 level=info collector=edac Oct 14 05:43:00 localhost node_exporter[246905]: ts=2025-10-14T09:43:00.652Z caller=node_exporter.go:117 level=info collector=fibrechannel Oct 14 05:43:00 localhost node_exporter[246905]: ts=2025-10-14T09:43:00.652Z caller=node_exporter.go:117 level=info collector=filefd Oct 14 05:43:00 localhost node_exporter[246905]: 
ts=2025-10-14T09:43:00.652Z caller=node_exporter.go:117 level=info collector=filesystem Oct 14 05:43:00 localhost node_exporter[246905]: ts=2025-10-14T09:43:00.652Z caller=node_exporter.go:117 level=info collector=infiniband Oct 14 05:43:00 localhost node_exporter[246905]: ts=2025-10-14T09:43:00.652Z caller=node_exporter.go:117 level=info collector=ipvs Oct 14 05:43:00 localhost node_exporter[246905]: ts=2025-10-14T09:43:00.652Z caller=node_exporter.go:117 level=info collector=loadavg Oct 14 05:43:00 localhost node_exporter[246905]: ts=2025-10-14T09:43:00.652Z caller=node_exporter.go:117 level=info collector=mdadm Oct 14 05:43:00 localhost node_exporter[246905]: ts=2025-10-14T09:43:00.652Z caller=node_exporter.go:117 level=info collector=meminfo Oct 14 05:43:00 localhost node_exporter[246905]: ts=2025-10-14T09:43:00.652Z caller=node_exporter.go:117 level=info collector=netclass Oct 14 05:43:00 localhost node_exporter[246905]: ts=2025-10-14T09:43:00.652Z caller=node_exporter.go:117 level=info collector=netdev Oct 14 05:43:00 localhost node_exporter[246905]: ts=2025-10-14T09:43:00.652Z caller=node_exporter.go:117 level=info collector=netstat Oct 14 05:43:00 localhost node_exporter[246905]: ts=2025-10-14T09:43:00.652Z caller=node_exporter.go:117 level=info collector=nfs Oct 14 05:43:00 localhost node_exporter[246905]: ts=2025-10-14T09:43:00.652Z caller=node_exporter.go:117 level=info collector=nfsd Oct 14 05:43:00 localhost node_exporter[246905]: ts=2025-10-14T09:43:00.652Z caller=node_exporter.go:117 level=info collector=nvme Oct 14 05:43:00 localhost node_exporter[246905]: ts=2025-10-14T09:43:00.652Z caller=node_exporter.go:117 level=info collector=schedstat Oct 14 05:43:00 localhost node_exporter[246905]: ts=2025-10-14T09:43:00.652Z caller=node_exporter.go:117 level=info collector=sockstat Oct 14 05:43:00 localhost node_exporter[246905]: ts=2025-10-14T09:43:00.652Z caller=node_exporter.go:117 level=info collector=softnet Oct 14 05:43:00 localhost 
node_exporter[246905]: ts=2025-10-14T09:43:00.652Z caller=node_exporter.go:117 level=info collector=systemd Oct 14 05:43:00 localhost node_exporter[246905]: ts=2025-10-14T09:43:00.652Z caller=node_exporter.go:117 level=info collector=tapestats Oct 14 05:43:00 localhost node_exporter[246905]: ts=2025-10-14T09:43:00.652Z caller=node_exporter.go:117 level=info collector=udp_queues Oct 14 05:43:00 localhost node_exporter[246905]: ts=2025-10-14T09:43:00.652Z caller=node_exporter.go:117 level=info collector=vmstat Oct 14 05:43:00 localhost node_exporter[246905]: ts=2025-10-14T09:43:00.652Z caller=node_exporter.go:117 level=info collector=xfs Oct 14 05:43:00 localhost node_exporter[246905]: ts=2025-10-14T09:43:00.652Z caller=node_exporter.go:117 level=info collector=zfs Oct 14 05:43:00 localhost node_exporter[246905]: ts=2025-10-14T09:43:00.652Z caller=tls_config.go:232 level=info msg="Listening on" address=[::]:9100 Oct 14 05:43:00 localhost node_exporter[246905]: ts=2025-10-14T09:43:00.652Z caller=tls_config.go:235 level=info msg="TLS is disabled." http2=false address=[::]:9100 Oct 14 05:43:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec. 
Oct 14 05:43:00 localhost podman[246890]: 2025-10-14 09:43:00.680193393 +0000 UTC m=+0.169874652 container start aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors ) Oct 14 05:43:00 localhost podman[246890]: node_exporter Oct 14 05:43:00 localhost systemd[1]: Started node_exporter container. 
Oct 14 05:43:00 localhost podman[246914]: 2025-10-14 09:43:00.763641514 +0000 UTC m=+0.075955184 container health_status aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=starting, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Oct 14 05:43:00 localhost podman[246914]: 2025-10-14 09:43:00.772957356 +0000 UTC m=+0.085270966 container exec_died aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 
'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors ) Oct 14 05:43:00 localhost systemd[1]: aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec.service: Deactivated successfully. Oct 14 05:43:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad. Oct 14 05:43:00 localhost systemd[1]: tmp-crun.eJvvIu.mount: Deactivated successfully. 
Oct 14 05:43:00 localhost podman[246954]: 2025-10-14 09:43:00.887080861 +0000 UTC m=+0.078653053 container health_status 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.build-date=20251009, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image) Oct 14 05:43:00 localhost podman[246954]: 2025-10-14 09:43:00.903001675 +0000 UTC m=+0.094573847 container exec_died 
02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image) Oct 14 05:43:00 localhost systemd[1]: 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad.service: Deactivated successfully. 
Oct 14 05:43:01 localhost nova_compute[238069]: 2025-10-14 09:43:01.282 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:43:01 localhost python3.9[247065]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/podman_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True Oct 14 05:43:01 localhost python3.9[247153]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/podman_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1760434980.930734-1802-98700341511096/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None Oct 14 05:43:03 localhost python3.9[247263]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/telemetry config_pattern=podman_exporter.json debug=False Oct 14 05:43:04 localhost python3.9[247373]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data Oct 14 05:43:04 localhost nova_compute[238069]: 2025-10-14 09:43:04.851 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:43:05 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31728 DF PROTO=TCP SPT=34552 DPT=9882 SEQ=1349285534 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B0C949B0000000001030307) Oct 14 05:43:05 localhost python3[247483]: ansible-edpm_container_manage 
Invoked with concurrency=1 config_dir=/var/lib/openstack/config/telemetry config_id=edpm config_overrides={} config_patterns=podman_exporter.json log_base_path=/var/log/containers/stdouts debug=False Oct 14 05:43:06 localhost nova_compute[238069]: 2025-10-14 09:43:06.286 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:43:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f. Oct 14 05:43:07 localhost podman[247531]: 2025-10-14 09:43:07.000987584 +0000 UTC m=+0.162596659 container health_status 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base 
Image, config_id=ovn_controller) Oct 14 05:43:07 localhost podman[247531]: 2025-10-14 09:43:07.024684344 +0000 UTC m=+0.186293449 container exec_died 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251009, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible) Oct 14 05:43:07 localhost systemd[1]: 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f.service: Deactivated successfully. 
Oct 14 05:43:07 localhost podman[247497]: 2025-10-14 09:43:05.702208744 +0000 UTC m=+0.029428278 image pull quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd Oct 14 05:43:07 localhost podman[247589]: Oct 14 05:43:07 localhost podman[247589]: 2025-10-14 09:43:07.443774158 +0000 UTC m=+0.071200347 container create 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, maintainer=Navid Yaghoobi , config_id=edpm, container_name=podman_exporter) Oct 14 05:43:07 localhost podman[247589]: 2025-10-14 09:43:07.409074037 +0000 UTC m=+0.036500286 image pull quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd Oct 14 05:43:07 localhost python3[247483]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name podman_exporter --conmon-pidfile /run/podman_exporter.pid --env OS_ENDPOINT_TYPE=internal --env CONTAINER_HOST=unix:///run/podman/podman.sock --healthcheck-command /openstack/healthcheck podman_exporter --label config_id=edpm --label container_name=podman_exporter --label managed_by=edpm_ansible --label config_data={'image': 
'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9882:9882 --user root --volume /run/podman/podman.sock:/run/podman/podman.sock:rw,z --volume /var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd Oct 14 05:43:08 localhost python3.9[247737]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Oct 14 05:43:08 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23911 DF PROTO=TCP SPT=45558 DPT=9105 SEQ=3414082166 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B0CA1AE0000000001030307) Oct 14 05:43:09 localhost python3.9[247849]: ansible-file Invoked with path=/etc/systemd/system/edpm_podman_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:43:09 localhost kernel: DROPPING: 
IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23912 DF PROTO=TCP SPT=45558 DPT=9105 SEQ=3414082166 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B0CA59A0000000001030307) Oct 14 05:43:09 localhost nova_compute[238069]: 2025-10-14 09:43:09.899 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:43:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857. Oct 14 05:43:10 localhost systemd[1]: tmp-crun.sQLIZf.mount: Deactivated successfully. Oct 14 05:43:10 localhost podman[247958]: 2025-10-14 09:43:10.013759093 +0000 UTC m=+0.089868030 container health_status 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, container_name=ovn_metadata_agent, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', 
'/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS) Oct 14 05:43:10 localhost podman[247958]: 2025-10-14 09:43:10.021958152 +0000 UTC m=+0.098067209 container exec_died 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, managed_by=edpm_ansible, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image) Oct 14 05:43:10 localhost systemd[1]: 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857.service: Deactivated successfully. Oct 14 05:43:10 localhost python3.9[247959]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1760434989.4406047-1960-61642625024801/source dest=/etc/systemd/system/edpm_podman_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:43:10 localhost python3.9[248030]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None Oct 14 05:43:10 localhost systemd[1]: Reloading. Oct 14 05:43:10 localhost systemd-sysv-generator[248050]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Oct 14 05:43:10 localhost systemd-rc-local-generator[248047]: /etc/rc.d/rc.local is not marked executable, skipping. 
Oct 14 05:43:10 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Oct 14 05:43:11 localhost nova_compute[238069]: 2025-10-14 09:43:11.288 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:43:11 localhost python3.9[248121]: ansible-systemd Invoked with state=restarted name=edpm_podman_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Oct 14 05:43:11 localhost systemd[1]: Reloading. Oct 14 05:43:11 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23913 DF PROTO=TCP SPT=45558 DPT=9105 SEQ=3414082166 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B0CAD9A0000000001030307) Oct 14 05:43:11 localhost systemd-sysv-generator[248153]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Oct 14 05:43:11 localhost systemd-rc-local-generator[248148]: /etc/rc.d/rc.local is not marked executable, skipping. Oct 14 05:43:11 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Oct 14 05:43:12 localhost systemd[1]: Starting podman_exporter container... Oct 14 05:43:12 localhost systemd[1]: tmp-crun.B3ehgJ.mount: Deactivated successfully. Oct 14 05:43:12 localhost systemd[1]: Started libcrun container. Oct 14 05:43:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041. 
Oct 14 05:43:12 localhost podman[248162]: 2025-10-14 09:43:12.344950029 +0000 UTC m=+0.162185577 container init 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm) Oct 14 05:43:12 localhost podman_exporter[248177]: ts=2025-10-14T09:43:12.360Z caller=exporter.go:68 level=info msg="Starting podman-prometheus-exporter" version="(version=1.10.1, branch=HEAD, revision=1)" Oct 14 05:43:12 localhost podman_exporter[248177]: ts=2025-10-14T09:43:12.360Z caller=exporter.go:69 level=info msg=metrics enhanced=false Oct 14 05:43:12 localhost podman_exporter[248177]: ts=2025-10-14T09:43:12.360Z caller=handler.go:94 level=info msg="enabled collectors" Oct 14 05:43:12 localhost podman_exporter[248177]: ts=2025-10-14T09:43:12.360Z caller=handler.go:105 level=info collector=container Oct 14 05:43:12 localhost systemd[1]: Starting Podman API Service... Oct 14 05:43:12 localhost systemd[1]: Started Podman API Service. Oct 14 05:43:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041. 
Oct 14 05:43:12 localhost podman[248162]: 2025-10-14 09:43:12.397711717 +0000 UTC m=+0.214947225 container start 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Oct 14 05:43:12 localhost podman[248162]: podman_exporter Oct 14 05:43:12 localhost systemd[1]: Started podman_exporter container. Oct 14 05:43:12 localhost podman[248187]: time="2025-10-14T09:43:12Z" level=info msg="/usr/bin/podman filtering at log level info" Oct 14 05:43:12 localhost podman[248187]: time="2025-10-14T09:43:12Z" level=info msg="Not using native diff for overlay, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" Oct 14 05:43:12 localhost podman[248187]: time="2025-10-14T09:43:12Z" level=info msg="Setting parallel job count to 25" Oct 14 05:43:12 localhost podman[248187]: time="2025-10-14T09:43:12Z" level=info msg="Using systemd socket activation to determine API endpoint" Oct 14 05:43:12 localhost podman[248187]: time="2025-10-14T09:43:12Z" level=info msg="API service listening on \"/run/podman/podman.sock\". 
URI: \"/run/podman/podman.sock\"" Oct 14 05:43:12 localhost podman[248187]: @ - - [14/Oct/2025:09:43:12 +0000] "GET /v4.9.3/libpod/_ping HTTP/1.1" 200 2 "" "Go-http-client/1.1" Oct 14 05:43:12 localhost podman[248187]: time="2025-10-14T09:43:12Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Oct 14 05:43:12 localhost podman[248193]: 2025-10-14 09:43:12.543525076 +0000 UTC m=+0.139128525 container health_status 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=starting, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Oct 14 05:43:12 localhost podman[248193]: 2025-10-14 09:43:12.552877769 +0000 UTC m=+0.148481258 container exec_died 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 
'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Oct 14 05:43:12 localhost podman[248193]: unhealthy Oct 14 05:43:13 localhost systemd[1]: 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041.service: Main process exited, code=exited, status=1/FAILURE Oct 14 05:43:13 localhost systemd[1]: 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041.service: Failed with result 'exit-code'. Oct 14 05:43:13 localhost systemd[1]: var-lib-containers-storage-overlay-f3f40f6483bf6d587286da9e86e40878c2aaaf723da5aa2364fff24f5ea28424-merged.mount: Deactivated successfully. Oct 14 05:43:13 localhost systemd[1]: var-lib-containers-storage-overlay-0b52816892c0967aea6a33893e73899adbf76e3ca055f6670535905d8ddf2b2c-merged.mount: Deactivated successfully. Oct 14 05:43:13 localhost systemd[1]: var-lib-containers-storage-overlay-0b52816892c0967aea6a33893e73899adbf76e3ca055f6670535905d8ddf2b2c-merged.mount: Deactivated successfully. Oct 14 05:43:14 localhost python3.9[248335]: ansible-ansible.builtin.systemd Invoked with name=edpm_podman_exporter.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None Oct 14 05:43:14 localhost systemd[1]: Stopping podman_exporter container... Oct 14 05:43:14 localhost systemd[1]: var-lib-containers-storage-overlay-b229675e52e0150c8f53be2f60bdcd02e09cc9ac91e9d7513ccf836c4fc95815-merged.mount: Deactivated successfully. 
Oct 14 05:43:14 localhost podman[248187]: @ - - [14/Oct/2025:09:43:12 +0000] "GET /v4.9.3/libpod/events?filters=%7B%7D&since=&stream=true&until= HTTP/1.1" 200 2790 "" "Go-http-client/1.1" Oct 14 05:43:14 localhost systemd[1]: libpod-0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041.scope: Deactivated successfully. Oct 14 05:43:14 localhost podman[248339]: 2025-10-14 09:43:14.450049877 +0000 UTC m=+0.082438014 container died 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Oct 14 05:43:14 localhost systemd[1]: 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041.timer: Deactivated successfully. Oct 14 05:43:14 localhost systemd[1]: Stopped /usr/bin/podman healthcheck run 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041. 
Oct 14 05:43:14 localhost nova_compute[238069]: 2025-10-14 09:43:14.939 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:43:15 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041-userdata-shm.mount: Deactivated successfully. Oct 14 05:43:15 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23914 DF PROTO=TCP SPT=45558 DPT=9105 SEQ=3414082166 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B0CBD5A0000000001030307) Oct 14 05:43:16 localhost nova_compute[238069]: 2025-10-14 09:43:16.291 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:43:16 localhost systemd[1]: var-lib-containers-storage-overlay-948d63d72c90238568600bb4ced3a347f3a772760aabfa54019ccce9078bd0ca-merged.mount: Deactivated successfully. Oct 14 05:43:16 localhost systemd[1]: var-lib-containers-storage-overlay-919c2496449756819846525fbfb351457636bf59d0964ccd47919cff1ec5dc94-merged.mount: Deactivated successfully. Oct 14 05:43:17 localhost systemd[1]: var-lib-containers-storage-overlay-919c2496449756819846525fbfb351457636bf59d0964ccd47919cff1ec5dc94-merged.mount: Deactivated successfully. Oct 14 05:43:17 localhost systemd[1]: var-lib-containers-storage-overlay-adff2c21e8c3962d58fc808322f02b071d63c28e2eb4db7ad39186ff8174b92c-merged.mount: Deactivated successfully. 
Oct 14 05:43:17 localhost podman[248339]: 2025-10-14 09:43:17.167566442 +0000 UTC m=+2.799954549 container cleanup 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Oct 14 05:43:17 localhost podman[248339]: podman_exporter Oct 14 05:43:17 localhost podman[248354]: 2025-10-14 09:43:17.185252827 +0000 UTC m=+2.733042648 container cleanup 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': 
['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter) Oct 14 05:43:18 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38819 DF PROTO=TCP SPT=56914 DPT=9101 SEQ=1333783027 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B0CC65A0000000001030307) Oct 14 05:43:18 localhost systemd[1]: var-lib-containers-storage-overlay-d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610-merged.mount: Deactivated successfully. Oct 14 05:43:18 localhost systemd[1]: var-lib-containers-storage-overlay-948d63d72c90238568600bb4ced3a347f3a772760aabfa54019ccce9078bd0ca-merged.mount: Deactivated successfully. Oct 14 05:43:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d. Oct 14 05:43:19 localhost systemd[1]: var-lib-containers-storage-overlay-948d63d72c90238568600bb4ced3a347f3a772760aabfa54019ccce9078bd0ca-merged.mount: Deactivated successfully. 
Oct 14 05:43:19 localhost systemd[1]: edpm_podman_exporter.service: Main process exited, code=exited, status=2/INVALIDARGUMENT Oct 14 05:43:19 localhost podman[248416]: 2025-10-14 09:43:19.317715623 +0000 UTC m=+0.256968300 container health_status 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=starting, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251009, container_name=ceilometer_agent_compute, tcib_managed=true, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d) Oct 14 
05:43:19 localhost podman[248416]: 2025-10-14 09:43:19.346925074 +0000 UTC m=+0.286177771 container exec_died 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=edpm, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251009, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d) Oct 14 05:43:19 localhost podman[248416]: unhealthy Oct 14 05:43:19 localhost nova_compute[238069]: 2025-10-14 09:43:19.980 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 
[POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:43:20 localhost systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully. Oct 14 05:43:20 localhost systemd[1]: var-lib-containers-storage-overlay-d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610-merged.mount: Deactivated successfully. Oct 14 05:43:20 localhost systemd[1]: var-lib-containers-storage-overlay-d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610-merged.mount: Deactivated successfully. Oct 14 05:43:20 localhost systemd[1]: 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d.service: Main process exited, code=exited, status=1/FAILURE Oct 14 05:43:20 localhost systemd[1]: 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d.service: Failed with result 'exit-code'. Oct 14 05:43:20 localhost podman[248427]: 2025-10-14 09:43:20.605634995 +0000 UTC m=+1.296553396 container cleanup 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Oct 14 05:43:20 
localhost podman[248427]: podman_exporter Oct 14 05:43:21 localhost systemd[1]: var-lib-containers-storage-overlay-e0f86229f02c4331620c9ec8e21be769ac9cff125fc1f01f8404fcae9b59e3df-merged.mount: Deactivated successfully. Oct 14 05:43:21 localhost systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully. Oct 14 05:43:21 localhost nova_compute[238069]: 2025-10-14 09:43:21.296 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:43:21 localhost systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully. Oct 14 05:43:21 localhost systemd[1]: edpm_podman_exporter.service: Failed with result 'exit-code'. Oct 14 05:43:21 localhost systemd[1]: Stopped podman_exporter container. Oct 14 05:43:21 localhost systemd[1]: Starting podman_exporter container... Oct 14 05:43:21 localhost systemd[1]: Started libcrun container. Oct 14 05:43:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041. 
Oct 14 05:43:21 localhost podman[248465]: 2025-10-14 09:43:21.816658168 +0000 UTC m=+0.421918518 container init 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Oct 14 05:43:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041. 
Oct 14 05:43:21 localhost podman_exporter[248479]: ts=2025-10-14T09:43:21.848Z caller=exporter.go:68 level=info msg="Starting podman-prometheus-exporter" version="(version=1.10.1, branch=HEAD, revision=1)" Oct 14 05:43:21 localhost podman_exporter[248479]: ts=2025-10-14T09:43:21.849Z caller=exporter.go:69 level=info msg=metrics enhanced=false Oct 14 05:43:21 localhost podman_exporter[248479]: ts=2025-10-14T09:43:21.849Z caller=handler.go:94 level=info msg="enabled collectors" Oct 14 05:43:21 localhost podman_exporter[248479]: ts=2025-10-14T09:43:21.849Z caller=handler.go:105 level=info collector=container Oct 14 05:43:21 localhost podman[248465]: 2025-10-14 09:43:21.850819133 +0000 UTC m=+0.456079483 container start 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Oct 14 05:43:21 localhost podman[248465]: podman_exporter Oct 14 05:43:21 localhost podman[248187]: @ - - [14/Oct/2025:09:43:21 +0000] "GET /v4.9.3/libpod/_ping HTTP/1.1" 200 2 "" "Go-http-client/1.1" Oct 14 05:43:21 localhost podman[248187]: time="2025-10-14T09:43:21Z" level=info msg="List containers: received `last` parameter - overwriting 
`limit`" Oct 14 05:43:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61311 DF PROTO=TCP SPT=35236 DPT=9102 SEQ=3353758926 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B0CDB350000000001030307) Oct 14 05:43:24 localhost systemd[1]: var-lib-containers-storage-overlay-919c2496449756819846525fbfb351457636bf59d0964ccd47919cff1ec5dc94-merged.mount: Deactivated successfully. Oct 14 05:43:24 localhost systemd[1]: var-lib-containers-storage-overlay-14e63e4b374fcf1d9cd6a42836a9c1d845dc465af9c3574a672d16caea6c12c8-merged.mount: Deactivated successfully. Oct 14 05:43:24 localhost systemd[1]: Started podman_exporter container. Oct 14 05:43:24 localhost podman[248488]: 2025-10-14 09:43:24.457025334 +0000 UTC m=+2.603770361 container health_status 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=starting, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Oct 14 05:43:24 localhost podman[248488]: 2025-10-14 09:43:24.46787654 +0000 UTC m=+2.614621527 container exec_died 
0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Oct 14 05:43:24 localhost podman[248488]: unhealthy Oct 14 05:43:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61312 DF PROTO=TCP SPT=35236 DPT=9102 SEQ=3353758926 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B0CDF5A0000000001030307) Oct 14 05:43:25 localhost nova_compute[238069]: 2025-10-14 09:43:25.025 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:43:25 localhost systemd[1]: var-lib-containers-storage-overlay-0b52816892c0967aea6a33893e73899adbf76e3ca055f6670535905d8ddf2b2c-merged.mount: Deactivated successfully. Oct 14 05:43:25 localhost systemd[1]: var-lib-containers-storage-overlay-b229675e52e0150c8f53be2f60bdcd02e09cc9ac91e9d7513ccf836c4fc95815-merged.mount: Deactivated successfully. 
Oct 14 05:43:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29. Oct 14 05:43:25 localhost python3.9[248641]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/openstack_network_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True Oct 14 05:43:26 localhost nova_compute[238069]: 2025-10-14 09:43:26.298 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:43:26 localhost python3.9[248737]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/openstack_network_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1760435005.2926562-2058-66174014020706/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None Oct 14 05:43:27 localhost systemd[1]: var-lib-containers-storage-overlay-948d63d72c90238568600bb4ced3a347f3a772760aabfa54019ccce9078bd0ca-merged.mount: Deactivated successfully. Oct 14 05:43:27 localhost systemd[1]: var-lib-containers-storage-overlay-182f4b56e6e8809f2ffde261aea7a82f597fbc875533d1efd7f59fe7c8a139ed-merged.mount: Deactivated successfully. Oct 14 05:43:27 localhost python3.9[248847]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/telemetry config_pattern=openstack_network_exporter.json debug=False Oct 14 05:43:27 localhost systemd[1]: var-lib-containers-storage-overlay-182f4b56e6e8809f2ffde261aea7a82f597fbc875533d1efd7f59fe7c8a139ed-merged.mount: Deactivated successfully. 
Oct 14 05:43:27 localhost systemd[1]: 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041.service: Main process exited, code=exited, status=1/FAILURE Oct 14 05:43:27 localhost systemd[1]: 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041.service: Failed with result 'exit-code'. Oct 14 05:43:27 localhost podman[248640]: 2025-10-14 09:43:27.454539578 +0000 UTC m=+1.833183054 container health_status fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true, config_id=iscsid, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', 
'/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']})
Oct 14 05:43:27 localhost podman[248640]: 2025-10-14 09:43:27.45906051 +0000 UTC m=+1.837704016 container exec_died fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, tcib_managed=true, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid)
Oct 14 05:43:28 localhost systemd[1]: var-lib-containers-storage-overlay-f3f40f6483bf6d587286da9e86e40878c2aaaf723da5aa2364fff24f5ea28424-merged.mount: Deactivated successfully.
Oct 14 05:43:28 localhost systemd[1]: var-lib-containers-storage-overlay-0b52816892c0967aea6a33893e73899adbf76e3ca055f6670535905d8ddf2b2c-merged.mount: Deactivated successfully.
Oct 14 05:43:28 localhost python3.9[248967]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Oct 14 05:43:28 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48025 DF PROTO=TCP SPT=56998 DPT=9882 SEQ=2417577993 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B0CEDFB0000000001030307)
Oct 14 05:43:29 localhost systemd[1]: var-lib-containers-storage-overlay-d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610-merged.mount: Deactivated successfully.
Oct 14 05:43:29 localhost systemd[1]: var-lib-containers-storage-overlay-948d63d72c90238568600bb4ced3a347f3a772760aabfa54019ccce9078bd0ca-merged.mount: Deactivated successfully.
Oct 14 05:43:29 localhost python3[249077]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/telemetry config_id=edpm config_overrides={} config_patterns=openstack_network_exporter.json log_base_path=/var/log/containers/stdouts debug=False
Oct 14 05:43:29 localhost systemd[1]: var-lib-containers-storage-overlay-948d63d72c90238568600bb4ced3a347f3a772760aabfa54019ccce9078bd0ca-merged.mount: Deactivated successfully.
Oct 14 05:43:29 localhost systemd[1]: fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29.service: Deactivated successfully.
Oct 14 05:43:30 localhost nova_compute[238069]: 2025-10-14 09:43:30.027 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 14 05:43:30 localhost systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully.
Oct 14 05:43:30 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61314 DF PROTO=TCP SPT=35236 DPT=9102 SEQ=3353758926 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B0CF71A0000000001030307)
Oct 14 05:43:30 localhost systemd[1]: var-lib-containers-storage-overlay-d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610-merged.mount: Deactivated successfully.
Oct 14 05:43:30 localhost systemd[1]: var-lib-containers-storage-overlay-d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610-merged.mount: Deactivated successfully.
Oct 14 05:43:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec.
Oct 14 05:43:31 localhost podman[249090]: 2025-10-14 09:43:31.031097788 +0000 UTC m=+0.128138665 container health_status aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'},
'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible)
Oct 14 05:43:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad.
Oct 14 05:43:31 localhost podman[249090]: 2025-10-14 09:43:31.068453456 +0000 UTC m=+0.165494353 container exec_died aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw',
'/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Oct 14 05:43:31 localhost nova_compute[238069]: 2025-10-14 09:43:31.340 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 14 05:43:31 localhost systemd[1]: var-lib-containers-storage-overlay-b229675e52e0150c8f53be2f60bdcd02e09cc9ac91e9d7513ccf836c4fc95815-merged.mount: Deactivated successfully.
Oct 14 05:43:31 localhost systemd[1]: var-lib-containers-storage-overlay-e0f86229f02c4331620c9ec8e21be769ac9cff125fc1f01f8404fcae9b59e3df-merged.mount: Deactivated successfully.
Oct 14 05:43:31 localhost systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully.
Oct 14 05:43:31 localhost systemd[1]: aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec.service: Deactivated successfully.
Oct 14 05:43:31 localhost podman[249124]: 2025-10-14 09:43:31.782144555 +0000 UTC m=+0.721038954 container health_status 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro',
'/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Oct 14 05:43:31 localhost podman[249124]: 2025-10-14 09:43:31.800121209 +0000 UTC m=+0.739015648 container exec_died 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log',
'/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true, io.buildah.version=1.41.3, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Oct 14 05:43:32 localhost systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully.
Oct 14 05:43:32 localhost systemd[1]: var-lib-containers-storage-overlay-e0f86229f02c4331620c9ec8e21be769ac9cff125fc1f01f8404fcae9b59e3df-merged.mount: Deactivated successfully.
Oct 14 05:43:34 localhost systemd[1]: var-lib-containers-storage-overlay-948d63d72c90238568600bb4ced3a347f3a772760aabfa54019ccce9078bd0ca-merged.mount: Deactivated successfully.
Oct 14 05:43:34 localhost systemd[1]: var-lib-containers-storage-overlay-919c2496449756819846525fbfb351457636bf59d0964ccd47919cff1ec5dc94-merged.mount: Deactivated successfully.
Oct 14 05:43:34 localhost systemd[1]: var-lib-containers-storage-overlay-919c2496449756819846525fbfb351457636bf59d0964ccd47919cff1ec5dc94-merged.mount: Deactivated successfully.
Oct 14 05:43:34 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Oct 14 05:43:34 localhost systemd[1]: 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad.service: Deactivated successfully.
Oct 14 05:43:35 localhost nova_compute[238069]: 2025-10-14 09:43:35.057 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 14 05:43:35 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48028 DF PROTO=TCP SPT=56998 DPT=9882 SEQ=2417577993 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B0D09DB0000000001030307)
Oct 14 05:43:36 localhost systemd[1]: var-lib-containers-storage-overlay-d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610-merged.mount: Deactivated successfully.
Oct 14 05:43:36 localhost nova_compute[238069]: 2025-10-14 09:43:36.381 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 14 05:43:36 localhost systemd[1]: var-lib-containers-storage-overlay-948d63d72c90238568600bb4ced3a347f3a772760aabfa54019ccce9078bd0ca-merged.mount: Deactivated successfully.
Oct 14 05:43:36 localhost systemd[1]: var-lib-containers-storage-overlay-182f4b56e6e8809f2ffde261aea7a82f597fbc875533d1efd7f59fe7c8a139ed-merged.mount: Deactivated successfully.
Oct 14 05:43:37 localhost systemd[1]: var-lib-containers-storage-overlay-948d63d72c90238568600bb4ced3a347f3a772760aabfa54019ccce9078bd0ca-merged.mount: Deactivated successfully.
Oct 14 05:43:37 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Oct 14 05:43:37 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Oct 14 05:43:37 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Oct 14 05:43:37 localhost systemd[1]: var-lib-containers-storage-overlay-9b088732740630f4022bff32bb793337565a1d5042b99108823757a338ad7c00-merged.mount: Deactivated successfully.
Oct 14 05:43:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f.
Oct 14 05:43:37 localhost podman[249144]: 2025-10-14 09:43:37.460083025 +0000 UTC m=+0.068068369 container health_status 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, managed_by=edpm_ansible, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, config_id=ovn_controller, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct 14 05:43:37 localhost podman[249144]: 2025-10-14 09:43:37.534972517 +0000 UTC m=+0.142957831 container exec_died 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f
(image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS)
Oct 14 05:43:38 localhost systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully.
Oct 14 05:43:38 localhost systemd[1]: var-lib-containers-storage-overlay-d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610-merged.mount: Deactivated successfully.
Oct 14 05:43:38 localhost systemd[1]: var-lib-containers-storage-overlay-d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610-merged.mount: Deactivated successfully.
Oct 14 05:43:38 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2566 DF PROTO=TCP SPT=55738 DPT=9105 SEQ=1041651934 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B0D16DE0000000001030307)
Oct 14 05:43:38 localhost systemd[1]: var-lib-containers-storage-overlay-ee47c660ea26d21ce84215704612469c43166e04b223dbf8f0a2a895de34e216-merged.mount: Deactivated successfully.
Oct 14 05:43:38 localhost systemd[1]: var-lib-containers-storage-overlay-beb1941435aa71e3442bb0ecaccd1897b68b01e215767a88dee6f86d4122e113-merged.mount: Deactivated successfully.
Oct 14 05:43:39 localhost systemd[1]: var-lib-containers-storage-overlay-beb1941435aa71e3442bb0ecaccd1897b68b01e215767a88dee6f86d4122e113-merged.mount: Deactivated successfully.
Oct 14 05:43:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2567 DF PROTO=TCP SPT=55738 DPT=9105 SEQ=1041651934 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B0D1ADA0000000001030307)
Oct 14 05:43:39 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Oct 14 05:43:39 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Oct 14 05:43:39 localhost systemd[1]: 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f.service: Deactivated successfully.
Oct 14 05:43:40 localhost nova_compute[238069]: 2025-10-14 09:43:40.104 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 14 05:43:40 localhost systemd[1]: var-lib-containers-storage-overlay-e0f86229f02c4331620c9ec8e21be769ac9cff125fc1f01f8404fcae9b59e3df-merged.mount: Deactivated successfully.
Oct 14 05:43:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857.
Oct 14 05:43:40 localhost systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully.
Oct 14 05:43:40 localhost podman[249181]: 2025-10-14 09:43:40.989453537 +0000 UTC m=+0.374712411 container health_status 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro',
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Oct 14 05:43:41 localhost podman[249181]: 2025-10-14 09:43:41.171049608 +0000 UTC m=+0.556308512 container exec_died 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z',
'/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Oct 14 05:43:41 localhost systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully.
Oct 14 05:43:41 localhost systemd[1]: var-lib-containers-storage-overlay-d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610-merged.mount: Deactivated successfully.
Oct 14 05:43:41 localhost nova_compute[238069]: 2025-10-14 09:43:41.384 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 14 05:43:41 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Oct 14 05:43:41 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Oct 14 05:43:41 localhost systemd[1]: 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857.service: Deactivated successfully.
Oct 14 05:43:41 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2568 DF PROTO=TCP SPT=55738 DPT=9105 SEQ=1041651934 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B0D22DA0000000001030307)
Oct 14 05:43:41 localhost systemd[1]: var-lib-containers-storage-overlay-ee47c660ea26d21ce84215704612469c43166e04b223dbf8f0a2a895de34e216-merged.mount: Deactivated successfully.
Oct 14 05:43:42 localhost systemd[1]: var-lib-containers-storage-overlay-e0f86229f02c4331620c9ec8e21be769ac9cff125fc1f01f8404fcae9b59e3df-merged.mount: Deactivated successfully.
Oct 14 05:43:42 localhost systemd[1]: var-lib-containers-storage-overlay-e0f86229f02c4331620c9ec8e21be769ac9cff125fc1f01f8404fcae9b59e3df-merged.mount: Deactivated successfully.
Oct 14 05:43:42 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Oct 14 05:43:42 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Oct 14 05:43:42 localhost systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully.
Oct 14 05:43:43 localhost systemd[1]: var-lib-containers-storage-overlay-d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610-merged.mount: Deactivated successfully.
Oct 14 05:43:43 localhost systemd[1]: var-lib-containers-storage-overlay-e0f86229f02c4331620c9ec8e21be769ac9cff125fc1f01f8404fcae9b59e3df-merged.mount: Deactivated successfully.
Oct 14 05:43:43 localhost systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully.
Oct 14 05:43:44 localhost systemd[1]: var-lib-containers-storage-overlay-e0f86229f02c4331620c9ec8e21be769ac9cff125fc1f01f8404fcae9b59e3df-merged.mount: Deactivated successfully.
Oct 14 05:43:44 localhost systemd[1]: var-lib-containers-storage-overlay-e0f86229f02c4331620c9ec8e21be769ac9cff125fc1f01f8404fcae9b59e3df-merged.mount: Deactivated successfully.
Oct 14 05:43:44 localhost nova_compute[238069]: 2025-10-14 09:43:44.721 2 DEBUG oslo_service.periodic_task [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 14 05:43:44 localhost nova_compute[238069]: 2025-10-14 09:43:44.722 2 DEBUG oslo_service.periodic_task [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 14 05:43:44 localhost nova_compute[238069]: 2025-10-14 09:43:44.754 2 DEBUG oslo_service.periodic_task [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 14 05:43:44 localhost nova_compute[238069]: 2025-10-14 09:43:44.754 2 DEBUG nova.compute.manager [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct 14 05:43:44 localhost nova_compute[238069]: 2025-10-14 09:43:44.755 2 DEBUG nova.compute.manager [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct 14 05:43:44 localhost systemd[1]:
var-lib-containers-storage-overlay-919c2496449756819846525fbfb351457636bf59d0964ccd47919cff1ec5dc94-merged.mount: Deactivated successfully.
Oct 14 05:43:45 localhost systemd[1]: var-lib-containers-storage-overlay-14e63e4b374fcf1d9cd6a42836a9c1d845dc465af9c3574a672d16caea6c12c8-merged.mount: Deactivated successfully.
Oct 14 05:43:45 localhost nova_compute[238069]: 2025-10-14 09:43:45.145 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 14 05:43:45 localhost nova_compute[238069]: 2025-10-14 09:43:45.450 2 DEBUG oslo_concurrency.lockutils [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Acquiring lock "refresh_cache-88c4e366-b765-47a6-96bf-f7677f2ce67c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct 14 05:43:45 localhost nova_compute[238069]: 2025-10-14 09:43:45.450 2 DEBUG oslo_concurrency.lockutils [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Acquired lock "refresh_cache-88c4e366-b765-47a6-96bf-f7677f2ce67c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct 14 05:43:45 localhost nova_compute[238069]: 2025-10-14 09:43:45.450 2 DEBUG nova.network.neutron [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] [instance: 88c4e366-b765-47a6-96bf-f7677f2ce67c] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Oct 14 05:43:45 localhost nova_compute[238069]: 2025-10-14 09:43:45.450 2 DEBUG nova.objects.instance [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 88c4e366-b765-47a6-96bf-f7677f2ce67c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct 14 05:43:45 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108
LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2569 DF PROTO=TCP SPT=55738 DPT=9105 SEQ=1041651934 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B0D329A0000000001030307) Oct 14 05:43:45 localhost systemd[1]: var-lib-containers-storage-overlay-ab5ae649c8d55ad3b61d788cb1580bb9641b83eaaaa1a012a2f15a639ad5659a-merged.mount: Deactivated successfully. Oct 14 05:43:46 localhost nova_compute[238069]: 2025-10-14 09:43:46.433 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:43:46 localhost nova_compute[238069]: 2025-10-14 09:43:46.642 2 DEBUG nova.network.neutron [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] [instance: 88c4e366-b765-47a6-96bf-f7677f2ce67c] Updating instance_info_cache with network_info: [{"id": "3ec9b060-f43d-4698-9c76-6062c70911d5", "address": "fa:16:3e:84:5e:e5", "network": {"id": "7d0cd696-bdd7-4e70-9512-eb0d23640314", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.46", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "41187b090f3d4818a32baa37ce8a3991", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3ec9b060-f4", "ovs_interfaceid": "3ec9b060-f43d-4698-9c76-6062c70911d5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Oct 14 05:43:46 
localhost nova_compute[238069]: 2025-10-14 09:43:46.655 2 DEBUG oslo_concurrency.lockutils [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Releasing lock "refresh_cache-88c4e366-b765-47a6-96bf-f7677f2ce67c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Oct 14 05:43:46 localhost nova_compute[238069]: 2025-10-14 09:43:46.655 2 DEBUG nova.compute.manager [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] [instance: 88c4e366-b765-47a6-96bf-f7677f2ce67c] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Oct 14 05:43:46 localhost nova_compute[238069]: 2025-10-14 09:43:46.656 2 DEBUG oslo_service.periodic_task [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 05:43:46 localhost nova_compute[238069]: 2025-10-14 09:43:46.656 2 DEBUG oslo_service.periodic_task [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 05:43:46 localhost nova_compute[238069]: 2025-10-14 09:43:46.657 2 DEBUG oslo_service.periodic_task [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 05:43:46 localhost nova_compute[238069]: 2025-10-14 09:43:46.657 2 DEBUG oslo_service.periodic_task [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 05:43:46 localhost nova_compute[238069]: 2025-10-14 
09:43:46.657 2 DEBUG oslo_service.periodic_task [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 05:43:46 localhost nova_compute[238069]: 2025-10-14 09:43:46.658 2 DEBUG oslo_service.periodic_task [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 05:43:46 localhost nova_compute[238069]: 2025-10-14 09:43:46.658 2 DEBUG nova.compute.manager [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Oct 14 05:43:46 localhost nova_compute[238069]: 2025-10-14 09:43:46.658 2 DEBUG oslo_service.periodic_task [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 05:43:46 localhost nova_compute[238069]: 2025-10-14 09:43:46.673 2 DEBUG oslo_concurrency.lockutils [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Oct 14 05:43:46 localhost nova_compute[238069]: 2025-10-14 09:43:46.674 2 DEBUG oslo_concurrency.lockutils [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Oct 14 05:43:46 localhost nova_compute[238069]: 
2025-10-14 09:43:46.674 2 DEBUG oslo_concurrency.lockutils [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Oct 14 05:43:46 localhost nova_compute[238069]: 2025-10-14 09:43:46.674 2 DEBUG nova.compute.resource_tracker [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Auditing locally available compute resources for np0005486733.localdomain (node: np0005486733.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Oct 14 05:43:46 localhost nova_compute[238069]: 2025-10-14 09:43:46.675 2 DEBUG oslo_concurrency.processutils [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Oct 14 05:43:47 localhost nova_compute[238069]: 2025-10-14 09:43:47.143 2 DEBUG oslo_concurrency.processutils [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.469s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Oct 14 05:43:47 localhost nova_compute[238069]: 2025-10-14 09:43:47.206 2 DEBUG nova.virt.libvirt.driver [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Oct 14 05:43:47 localhost nova_compute[238069]: 2025-10-14 09:43:47.207 2 DEBUG nova.virt.libvirt.driver [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] skipping disk for instance-00000002 as it does not have a path 
_get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Oct 14 05:43:47 localhost systemd[1]: var-lib-containers-storage-overlay-ee47c660ea26d21ce84215704612469c43166e04b223dbf8f0a2a895de34e216-merged.mount: Deactivated successfully. Oct 14 05:43:47 localhost nova_compute[238069]: 2025-10-14 09:43:47.355 2 WARNING nova.virt.libvirt.driver [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m Oct 14 05:43:47 localhost nova_compute[238069]: 2025-10-14 09:43:47.356 2 DEBUG nova.compute.resource_tracker [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Hypervisor/Node resource view: name=np0005486733.localdomain free_ram=12388MB free_disk=41.83725357055664GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", 
"vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Oct 14 05:43:47 localhost nova_compute[238069]: 2025-10-14 09:43:47.357 2 DEBUG oslo_concurrency.lockutils [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Oct 14 05:43:47 localhost nova_compute[238069]: 2025-10-14 09:43:47.357 2 DEBUG oslo_concurrency.lockutils [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Oct 14 05:43:47 localhost systemd[1]: var-lib-containers-storage-overlay-beb1941435aa71e3442bb0ecaccd1897b68b01e215767a88dee6f86d4122e113-merged.mount: Deactivated successfully. 
Oct 14 05:43:47 localhost nova_compute[238069]: 2025-10-14 09:43:47.438 2 DEBUG nova.compute.resource_tracker [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Instance 88c4e366-b765-47a6-96bf-f7677f2ce67c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Oct 14 05:43:47 localhost nova_compute[238069]: 2025-10-14 09:43:47.439 2 DEBUG nova.compute.resource_tracker [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Oct 14 05:43:47 localhost nova_compute[238069]: 2025-10-14 09:43:47.439 2 DEBUG nova.compute.resource_tracker [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Final resource view: name=np0005486733.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Oct 14 05:43:47 localhost systemd[1]: var-lib-containers-storage-overlay-beb1941435aa71e3442bb0ecaccd1897b68b01e215767a88dee6f86d4122e113-merged.mount: Deactivated successfully. 
Oct 14 05:43:47 localhost nova_compute[238069]: 2025-10-14 09:43:47.493 2 DEBUG oslo_concurrency.processutils [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Oct 14 05:43:47 localhost nova_compute[238069]: 2025-10-14 09:43:47.915 2 DEBUG oslo_concurrency.processutils [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.422s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Oct 14 05:43:47 localhost nova_compute[238069]: 2025-10-14 09:43:47.921 2 DEBUG nova.compute.provider_tree [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Inventory has not changed in ProviderTree for provider: 18c24273-aca2-4f08-be57-3188d558235e update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Oct 14 05:43:47 localhost nova_compute[238069]: 2025-10-14 09:43:47.938 2 DEBUG nova.scheduler.client.report [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Inventory has not changed for provider 18c24273-aca2-4f08-be57-3188d558235e based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Oct 14 05:43:47 localhost nova_compute[238069]: 2025-10-14 09:43:47.940 2 DEBUG nova.compute.resource_tracker [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Compute_service record updated for 
np0005486733.localdomain:np0005486733.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Oct 14 05:43:47 localhost nova_compute[238069]: 2025-10-14 09:43:47.940 2 DEBUG oslo_concurrency.lockutils [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.583s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Oct 14 05:43:48 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27002 DF PROTO=TCP SPT=34090 DPT=9101 SEQ=2726987699 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B0D3B5A0000000001030307) Oct 14 05:43:48 localhost systemd[1]: var-lib-containers-storage-overlay-182f4b56e6e8809f2ffde261aea7a82f597fbc875533d1efd7f59fe7c8a139ed-merged.mount: Deactivated successfully. Oct 14 05:43:48 localhost systemd[1]: var-lib-containers-storage-overlay-ee47c660ea26d21ce84215704612469c43166e04b223dbf8f0a2a895de34e216-merged.mount: Deactivated successfully. Oct 14 05:43:49 localhost systemd[1]: var-lib-containers-storage-overlay-ee47c660ea26d21ce84215704612469c43166e04b223dbf8f0a2a895de34e216-merged.mount: Deactivated successfully. Oct 14 05:43:49 localhost systemd[1]: var-lib-containers-storage-overlay-948d63d72c90238568600bb4ced3a347f3a772760aabfa54019ccce9078bd0ca-merged.mount: Deactivated successfully. Oct 14 05:43:50 localhost systemd[1]: var-lib-containers-storage-overlay-948d63d72c90238568600bb4ced3a347f3a772760aabfa54019ccce9078bd0ca-merged.mount: Deactivated successfully. Oct 14 05:43:50 localhost systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully. 
Oct 14 05:43:50 localhost nova_compute[238069]: 2025-10-14 09:43:50.184 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:43:50 localhost podman[248187]: time="2025-10-14T09:43:50Z" level=error msg="Getting root fs size for \"0a931392ffb1dd9dc1600def18a62aecf9fb9e17a5158a034ff1b31f874cb92d\": getting diffsize of layer \"d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610\" and its parent \"19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8\": unmounting layer 19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8: replacing mount point \"/var/lib/containers/storage/overlay/19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8/merged\": device or resource busy" Oct 14 05:43:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d. Oct 14 05:43:50 localhost podman[249255]: 2025-10-14 09:43:50.765508337 +0000 UTC m=+0.108076261 container health_status 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=starting, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': 
['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team) Oct 14 05:43:50 localhost podman[249255]: 2025-10-14 09:43:50.794303884 +0000 UTC m=+0.136871848 container exec_died 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image) Oct 14 05:43:50 localhost podman[249255]: unhealthy Oct 14 05:43:51 localhost systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully. Oct 14 05:43:51 localhost systemd[1]: var-lib-containers-storage-overlay-d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610-merged.mount: Deactivated successfully. Oct 14 05:43:51 localhost systemd[1]: var-lib-containers-storage-overlay-d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610-merged.mount: Deactivated successfully. Oct 14 05:43:51 localhost nova_compute[238069]: 2025-10-14 09:43:51.482 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:43:51 localhost systemd[1]: var-lib-containers-storage-overlay-beb1941435aa71e3442bb0ecaccd1897b68b01e215767a88dee6f86d4122e113-merged.mount: Deactivated successfully. Oct 14 05:43:51 localhost systemd[1]: var-lib-containers-storage-overlay-9d54c722c2284b3655acb982a0155c0f83e97b8ed3151318b123cb4230b9dfcf-merged.mount: Deactivated successfully. 
Oct 14 05:43:52 localhost systemd[1]: var-lib-containers-storage-overlay-9d54c722c2284b3655acb982a0155c0f83e97b8ed3151318b123cb4230b9dfcf-merged.mount: Deactivated successfully. Oct 14 05:43:52 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Oct 14 05:43:52 localhost systemd[1]: 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d.service: Main process exited, code=exited, status=1/FAILURE Oct 14 05:43:52 localhost systemd[1]: 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d.service: Failed with result 'exit-code'. Oct 14 05:43:52 localhost systemd[1]: var-lib-containers-storage-overlay-e0f86229f02c4331620c9ec8e21be769ac9cff125fc1f01f8404fcae9b59e3df-merged.mount: Deactivated successfully. Oct 14 05:43:53 localhost systemd[1]: var-lib-containers-storage-overlay-e0f86229f02c4331620c9ec8e21be769ac9cff125fc1f01f8404fcae9b59e3df-merged.mount: Deactivated successfully. Oct 14 05:43:53 localhost systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully. Oct 14 05:43:53 localhost systemd[1]: var-lib-containers-storage-overlay-8035b846284d335d9393ab62c801f2456eb25851b24c50a7b13196117676086c-merged.mount: Deactivated successfully. Oct 14 05:43:53 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Oct 14 05:43:53 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Oct 14 05:43:53 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. 
Oct 14 05:43:53 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Oct 14 05:43:53 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Oct 14 05:43:53 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Oct 14 05:43:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60154 DF PROTO=TCP SPT=41208 DPT=9102 SEQ=1438458739 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B0D50650000000001030307) Oct 14 05:43:53 localhost systemd[1]: var-lib-containers-storage-overlay-e0f86229f02c4331620c9ec8e21be769ac9cff125fc1f01f8404fcae9b59e3df-merged.mount: Deactivated successfully. Oct 14 05:43:54 localhost systemd[1]: var-lib-containers-storage-overlay-e0f86229f02c4331620c9ec8e21be769ac9cff125fc1f01f8404fcae9b59e3df-merged.mount: Deactivated successfully. Oct 14 05:43:54 localhost systemd[1]: var-lib-containers-storage-overlay-e0f86229f02c4331620c9ec8e21be769ac9cff125fc1f01f8404fcae9b59e3df-merged.mount: Deactivated successfully. Oct 14 05:43:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60155 DF PROTO=TCP SPT=41208 DPT=9102 SEQ=1438458739 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B0D545B0000000001030307) Oct 14 05:43:55 localhost systemd[1]: var-lib-containers-storage-overlay-8035b846284d335d9393ab62c801f2456eb25851b24c50a7b13196117676086c-merged.mount: Deactivated successfully. 
Oct 14 05:43:55 localhost systemd[1]: var-lib-containers-storage-overlay-bf7b7bb08b691d902a723a4644d1ae132381580429bddbb6a2334ee503c366a0-merged.mount: Deactivated successfully. Oct 14 05:43:55 localhost nova_compute[238069]: 2025-10-14 09:43:55.186 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:43:55 localhost systemd[1]: var-lib-containers-storage-overlay-182f4b56e6e8809f2ffde261aea7a82f597fbc875533d1efd7f59fe7c8a139ed-merged.mount: Deactivated successfully. Oct 14 05:43:55 localhost systemd[1]: var-lib-containers-storage-overlay-9b088732740630f4022bff32bb793337565a1d5042b99108823757a338ad7c00-merged.mount: Deactivated successfully. Oct 14 05:43:56 localhost nova_compute[238069]: 2025-10-14 09:43:56.521 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:43:57 localhost systemd[1]: var-lib-containers-storage-overlay-93c0822e715760ae283b5dfa3c054d7f162a497c51033e354a5256453c1ce67c-merged.mount: Deactivated successfully. Oct 14 05:43:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041. Oct 14 05:43:57 localhost systemd[1]: var-lib-containers-storage-overlay-ee47c660ea26d21ce84215704612469c43166e04b223dbf8f0a2a895de34e216-merged.mount: Deactivated successfully. Oct 14 05:43:57 localhost systemd[1]: var-lib-containers-storage-overlay-b6fff9c8e433cbfc969f016d7c00977424b6e0fe3f5e8a6774343b30e6ab0953-merged.mount: Deactivated successfully. 
Oct 14 05:43:57 localhost podman[249283]: 2025-10-14 09:43:57.642749572 +0000 UTC m=+0.079944315 container health_status 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=starting, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Oct 14 05:43:57 localhost podman[249283]: 2025-10-14 09:43:57.651444863 +0000 UTC m=+0.088639606 container exec_died 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': 
['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm) Oct 14 05:43:57 localhost podman[249283]: unhealthy Oct 14 05:43:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:43:57.749 163055 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Oct 14 05:43:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:43:57.749 163055 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Oct 14 05:43:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:43:57.750 163055 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Oct 14 05:43:57 localhost podman[249095]: 2025-10-14 09:43:30.959313886 +0000 UTC m=+0.042397407 image pull quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7 Oct 14 05:43:57 localhost systemd[1]: 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041.service: Main process exited, code=exited, status=1/FAILURE Oct 14 05:43:57 localhost systemd[1]: 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041.service: Failed with result 'exit-code'. 
Oct 14 05:43:58 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63772 DF PROTO=TCP SPT=48836 DPT=9882 SEQ=2716931976 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B0D632A0000000001030307)
Oct 14 05:43:58 localhost systemd[1]: var-lib-containers-storage-overlay-beb1941435aa71e3442bb0ecaccd1897b68b01e215767a88dee6f86d4122e113-merged.mount: Deactivated successfully.
Oct 14 05:43:59 localhost systemd[1]: var-lib-containers-storage-overlay-d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610-merged.mount: Deactivated successfully.
Oct 14 05:43:59 localhost systemd[1]: var-lib-containers-storage-overlay-ee47c660ea26d21ce84215704612469c43166e04b223dbf8f0a2a895de34e216-merged.mount: Deactivated successfully.
Oct 14 05:43:59 localhost systemd[1]: var-lib-containers-storage-overlay-ee47c660ea26d21ce84215704612469c43166e04b223dbf8f0a2a895de34e216-merged.mount: Deactivated successfully.
Oct 14 05:43:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29.
Oct 14 05:43:59 localhost systemd[1]: var-lib-containers-storage-overlay-8d123e2bf97cc7b3622c68162b04c29912e1822cdbe31a1ddf70016995925bac-merged.mount: Deactivated successfully.
Oct 14 05:44:00 localhost systemd[1]: var-lib-containers-storage-overlay-93c0822e715760ae283b5dfa3c054d7f162a497c51033e354a5256453c1ce67c-merged.mount: Deactivated successfully.
Oct 14 05:44:00 localhost systemd[1]: var-lib-containers-storage-overlay-93c0822e715760ae283b5dfa3c054d7f162a497c51033e354a5256453c1ce67c-merged.mount: Deactivated successfully.
Oct 14 05:44:00 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Oct 14 05:44:00 localhost nova_compute[238069]: 2025-10-14 09:44:00.199 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 14 05:44:00 localhost podman[249304]: 2025-10-14 09:44:00.218726719 +0000 UTC m=+0.558593532 container health_status fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, config_id=iscsid, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid)
Oct 14 05:44:00 localhost podman[249304]: 2025-10-14 09:44:00.226063549 +0000 UTC m=+0.565930412 container exec_died fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=iscsid, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, container_name=iscsid, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Oct 14 05:44:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60157 DF PROTO=TCP SPT=41208 DPT=9102 SEQ=1438458739 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B0D6C1A0000000001030307)
Oct 14 05:44:01 localhost systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully.
Oct 14 05:44:01 localhost nova_compute[238069]: 2025-10-14 09:44:01.562 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 14 05:44:02 localhost systemd[1]: var-lib-containers-storage-overlay-d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610-merged.mount: Deactivated successfully.
Oct 14 05:44:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec.
Oct 14 05:44:02 localhost systemd[1]: var-lib-containers-storage-overlay-8d123e2bf97cc7b3622c68162b04c29912e1822cdbe31a1ddf70016995925bac-merged.mount: Deactivated successfully.
Oct 14 05:44:02 localhost systemd[1]: var-lib-containers-storage-overlay-8d123e2bf97cc7b3622c68162b04c29912e1822cdbe31a1ddf70016995925bac-merged.mount: Deactivated successfully.
Oct 14 05:44:02 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Oct 14 05:44:02 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Oct 14 05:44:02 localhost systemd[1]: fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29.service: Deactivated successfully.
Oct 14 05:44:02 localhost podman[249333]: 2025-10-14 09:44:02.479356541 +0000 UTC m=+0.374574667 container health_status aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible)
Oct 14 05:44:02 localhost podman[249333]: 2025-10-14 09:44:02.493927639 +0000 UTC m=+0.389145775 container exec_died aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Oct 14 05:44:03 localhost systemd[1]: var-lib-containers-storage-overlay-e0f86229f02c4331620c9ec8e21be769ac9cff125fc1f01f8404fcae9b59e3df-merged.mount: Deactivated successfully.
Oct 14 05:44:03 localhost podman[248187]: time="2025-10-14T09:44:03Z" level=error msg="Getting root fs size for \"07e01fcfd802362023d73f61bf070f7faaf8dc7d785d721ab2a2a49b7eabe638\": getting diffsize of layer \"19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8\" and its parent \"e0f86229f02c4331620c9ec8e21be769ac9cff125fc1f01f8404fcae9b59e3df\": unmounting layer 19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8: replacing mount point \"/var/lib/containers/storage/overlay/19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8/merged\": device or resource busy"
Oct 14 05:44:03 localhost systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully.
Oct 14 05:44:03 localhost systemd[1]: var-lib-containers-storage-overlay-d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610-merged.mount: Deactivated successfully.
Oct 14 05:44:03 localhost systemd[1]: aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec.service: Deactivated successfully.
Oct 14 05:44:03 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Oct 14 05:44:03 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Oct 14 05:44:04 localhost systemd[1]: var-lib-containers-storage-overlay-e0f86229f02c4331620c9ec8e21be769ac9cff125fc1f01f8404fcae9b59e3df-merged.mount: Deactivated successfully.
Oct 14 05:44:04 localhost systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully.
Oct 14 05:44:04 localhost systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully.
Oct 14 05:44:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad.
Oct 14 05:44:05 localhost systemd[1]: var-lib-containers-storage-overlay-beb1941435aa71e3442bb0ecaccd1897b68b01e215767a88dee6f86d4122e113-merged.mount: Deactivated successfully.
Oct 14 05:44:05 localhost systemd[1]: var-lib-containers-storage-overlay-ab5ae649c8d55ad3b61d788cb1580bb9641b83eaaaa1a012a2f15a639ad5659a-merged.mount: Deactivated successfully.
Oct 14 05:44:05 localhost nova_compute[238069]: 2025-10-14 09:44:05.253 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 14 05:44:05 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Oct 14 05:44:05 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Oct 14 05:44:05 localhost podman[249368]: 2025-10-14 09:44:03.76871174 +0000 UTC m=+0.038237151 image pull quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7
Oct 14 05:44:05 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63775 DF PROTO=TCP SPT=48836 DPT=9882 SEQ=2716931976 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B0D7EDB0000000001030307)
Oct 14 05:44:05 localhost podman[249379]: 2025-10-14 09:44:05.401266802 +0000 UTC m=+0.828520780 container health_status 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251009, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd)
Oct 14 05:44:05 localhost podman[249379]: 2025-10-14 09:44:05.444024658 +0000 UTC m=+0.871278636 container exec_died 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, container_name=multipathd, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct 14 05:44:06 localhost systemd[1]: var-lib-containers-storage-overlay-e0f86229f02c4331620c9ec8e21be769ac9cff125fc1f01f8404fcae9b59e3df-merged.mount: Deactivated successfully.
Oct 14 05:44:06 localhost nova_compute[238069]: 2025-10-14 09:44:06.598 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 14 05:44:07 localhost systemd[1]: var-lib-containers-storage-overlay-ee47c660ea26d21ce84215704612469c43166e04b223dbf8f0a2a895de34e216-merged.mount: Deactivated successfully.
Oct 14 05:44:07 localhost systemd[1]: var-lib-containers-storage-overlay-beb1941435aa71e3442bb0ecaccd1897b68b01e215767a88dee6f86d4122e113-merged.mount: Deactivated successfully.
Oct 14 05:44:07 localhost systemd[1]: var-lib-containers-storage-overlay-beb1941435aa71e3442bb0ecaccd1897b68b01e215767a88dee6f86d4122e113-merged.mount: Deactivated successfully.
Oct 14 05:44:07 localhost systemd[1]: 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad.service: Deactivated successfully.
Oct 14 05:44:08 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58907 DF PROTO=TCP SPT=60984 DPT=9105 SEQ=735767667 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B0D8C0E0000000001030307)
Oct 14 05:44:08 localhost systemd[1]: var-lib-containers-storage-overlay-d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610-merged.mount: Deactivated successfully.
Oct 14 05:44:08 localhost systemd[1]: var-lib-containers-storage-overlay-ee47c660ea26d21ce84215704612469c43166e04b223dbf8f0a2a895de34e216-merged.mount: Deactivated successfully.
Oct 14 05:44:09 localhost systemd[1]: var-lib-containers-storage-overlay-ee47c660ea26d21ce84215704612469c43166e04b223dbf8f0a2a895de34e216-merged.mount: Deactivated successfully.
Oct 14 05:44:09 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Oct 14 05:44:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58908 DF PROTO=TCP SPT=60984 DPT=9105 SEQ=735767667 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B0D901B0000000001030307)
Oct 14 05:44:09 localhost systemd[1]: var-lib-containers-storage-overlay-b6fff9c8e433cbfc969f016d7c00977424b6e0fe3f5e8a6774343b30e6ab0953-merged.mount: Deactivated successfully.
Oct 14 05:44:09 localhost systemd[1]: var-lib-containers-storage-overlay-9627ceeeadda540cd5b7624a5c18b2d6257f00b841a3ce165dd1352e6e7c53ee-merged.mount: Deactivated successfully.
Oct 14 05:44:10 localhost systemd[1]: var-lib-containers-storage-overlay-d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610-merged.mount: Deactivated successfully.
Oct 14 05:44:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f.
Oct 14 05:44:10 localhost podman[249401]: 2025-10-14 09:44:10.167890305 +0000 UTC m=+0.087692118 container health_status 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0)
Oct 14 05:44:10 localhost podman[249401]: 2025-10-14 09:44:10.205045884 +0000 UTC m=+0.124847227 container exec_died 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_id=ovn_controller, org.label-schema.schema-version=1.0, container_name=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009)
Oct 14 05:44:10 localhost nova_compute[238069]: 2025-10-14 09:44:10.301 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 14 05:44:10 localhost podman[249368]:
Oct 14 05:44:10 localhost podman[249368]: 2025-10-14 09:44:10.559766982 +0000 UTC m=+6.829292393 container create 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, maintainer=Red Hat, Inc., managed_by=edpm_ansible, release=1755695350, distribution-scope=public, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.buildah.version=1.33.7, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41)
Oct 14 05:44:10 localhost systemd[1]: 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f.service: Deactivated successfully.
Oct 14 05:44:10 localhost systemd[1]: var-lib-containers-storage-overlay-d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610-merged.mount: Deactivated successfully.
Oct 14 05:44:10 localhost systemd[1]: var-lib-containers-storage-overlay-f479750c98f4a67ffae355a1e79b3c9a76d56699a79b842b4363e69f089cca49-merged.mount: Deactivated successfully.
Oct 14 05:44:11 localhost systemd[1]: var-lib-containers-storage-overlay-e0f86229f02c4331620c9ec8e21be769ac9cff125fc1f01f8404fcae9b59e3df-merged.mount: Deactivated successfully.
Oct 14 05:44:11 localhost systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully.
Oct 14 05:44:11 localhost systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully.
Oct 14 05:44:11 localhost python3[249077]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name openstack_network_exporter --conmon-pidfile /run/openstack_network_exporter.pid --env OS_ENDPOINT_TYPE=internal --env OPENSTACK_NETWORK_EXPORTER_YAML=/etc/openstack_network_exporter/openstack_network_exporter.yaml --healthcheck-command /openstack/healthcheck openstack-netwo --label config_id=edpm --label container_name=openstack_network_exporter --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9105:9105 --volume /var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z --volume /var/run/openvswitch:/run/openvswitch:rw,z --volume /var/lib/openvswitch/ovn:/run/ovn:rw,z --volume /proc:/host/proc:ro --volume /var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7
Oct 14 05:44:11 localhost nova_compute[238069]: 2025-10-14 09:44:11.640 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 14 05:44:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857.
Oct 14 05:44:11 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58909 DF PROTO=TCP SPT=60984 DPT=9105 SEQ=735767667 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B0D981A0000000001030307)
Oct 14 05:44:12 localhost systemd[1]: var-lib-containers-storage-overlay-f479750c98f4a67ffae355a1e79b3c9a76d56699a79b842b4363e69f089cca49-merged.mount: Deactivated successfully.
Oct 14 05:44:12 localhost systemd[1]: var-lib-containers-storage-overlay-168207db095cdd373b28e32e9bd8a2aa29e7cbcdf9040af1b44bb5a093e7f31e-merged.mount: Deactivated successfully.
Oct 14 05:44:12 localhost podman[249437]: 2025-10-14 09:44:12.729383917 +0000 UTC m=+0.865220214 container health_status 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, container_name=ovn_metadata_agent, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 14 05:44:12 localhost podman[249437]: 2025-10-14 09:44:12.735950454 +0000 UTC m=+0.871786761 container exec_died 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct 14 05:44:14 localhost systemd[1]: var-lib-containers-storage-overlay-beb1941435aa71e3442bb0ecaccd1897b68b01e215767a88dee6f86d4122e113-merged.mount: Deactivated successfully.
Oct 14 05:44:14 localhost systemd[1]: var-lib-containers-storage-overlay-9d54c722c2284b3655acb982a0155c0f83e97b8ed3151318b123cb4230b9dfcf-merged.mount: Deactivated successfully. Oct 14 05:44:15 localhost systemd[1]: var-lib-containers-storage-overlay-948d63d72c90238568600bb4ced3a347f3a772760aabfa54019ccce9078bd0ca-merged.mount: Deactivated successfully. Oct 14 05:44:15 localhost nova_compute[238069]: 2025-10-14 09:44:15.347 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:44:15 localhost systemd[1]: var-lib-containers-storage-overlay-182f4b56e6e8809f2ffde261aea7a82f597fbc875533d1efd7f59fe7c8a139ed-merged.mount: Deactivated successfully. Oct 14 05:44:15 localhost systemd[1]: var-lib-containers-storage-overlay-182f4b56e6e8809f2ffde261aea7a82f597fbc875533d1efd7f59fe7c8a139ed-merged.mount: Deactivated successfully. Oct 14 05:44:15 localhost systemd[1]: 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857.service: Deactivated successfully. Oct 14 05:44:15 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58910 DF PROTO=TCP SPT=60984 DPT=9105 SEQ=735767667 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B0DA7DB0000000001030307) Oct 14 05:44:16 localhost systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully. Oct 14 05:44:16 localhost systemd[1]: var-lib-containers-storage-overlay-8035b846284d335d9393ab62c801f2456eb25851b24c50a7b13196117676086c-merged.mount: Deactivated successfully. Oct 14 05:44:16 localhost systemd[1]: var-lib-containers-storage-overlay-8035b846284d335d9393ab62c801f2456eb25851b24c50a7b13196117676086c-merged.mount: Deactivated successfully. 
Oct 14 05:44:16 localhost nova_compute[238069]: 2025-10-14 09:44:16.684 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:44:17 localhost systemd[1]: var-lib-containers-storage-overlay-d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610-merged.mount: Deactivated successfully. Oct 14 05:44:17 localhost systemd[1]: var-lib-containers-storage-overlay-948d63d72c90238568600bb4ced3a347f3a772760aabfa54019ccce9078bd0ca-merged.mount: Deactivated successfully. Oct 14 05:44:17 localhost systemd[1]: var-lib-containers-storage-overlay-948d63d72c90238568600bb4ced3a347f3a772760aabfa54019ccce9078bd0ca-merged.mount: Deactivated successfully. Oct 14 05:44:17 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Oct 14 05:44:18 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53698 DF PROTO=TCP SPT=54936 DPT=9101 SEQ=3596984804 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B0DB09A0000000001030307) Oct 14 05:44:18 localhost python3.9[249573]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Oct 14 05:44:18 localhost systemd[1]: var-lib-containers-storage-overlay-e0f86229f02c4331620c9ec8e21be769ac9cff125fc1f01f8404fcae9b59e3df-merged.mount: Deactivated successfully. Oct 14 05:44:18 localhost systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully. Oct 14 05:44:18 localhost systemd[1]: var-lib-containers-storage-overlay-d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610-merged.mount: Deactivated successfully. 
Oct 14 05:44:18 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Oct 14 05:44:18 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Oct 14 05:44:18 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Oct 14 05:44:18 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Oct 14 05:44:18 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Oct 14 05:44:18 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Oct 14 05:44:19 localhost systemd[1]: var-lib-containers-storage-overlay-d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610-merged.mount: Deactivated successfully. Oct 14 05:44:19 localhost systemd[1]: var-lib-containers-storage-overlay-e0f86229f02c4331620c9ec8e21be769ac9cff125fc1f01f8404fcae9b59e3df-merged.mount: Deactivated successfully. Oct 14 05:44:19 localhost systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully. Oct 14 05:44:19 localhost systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully. Oct 14 05:44:19 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. 
Oct 14 05:44:20 localhost nova_compute[238069]: 2025-10-14 09:44:20.385 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:44:20 localhost python3.9[249685]: ansible-file Invoked with path=/etc/systemd/system/edpm_openstack_network_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:44:20 localhost systemd[1]: var-lib-containers-storage-overlay-8035b846284d335d9393ab62c801f2456eb25851b24c50a7b13196117676086c-merged.mount: Deactivated successfully. Oct 14 05:44:20 localhost systemd[1]: var-lib-containers-storage-overlay-bf7b7bb08b691d902a723a4644d1ae132381580429bddbb6a2334ee503c366a0-merged.mount: Deactivated successfully. Oct 14 05:44:21 localhost python3.9[249794]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1760435060.4654148-2215-16606445812478/source dest=/etc/systemd/system/edpm_openstack_network_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:44:21 localhost nova_compute[238069]: 2025-10-14 09:44:21.725 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:44:21 localhost python3.9[249849]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None Oct 14 05:44:21 localhost systemd[1]: Reloading. 
Oct 14 05:44:21 localhost systemd-rc-local-generator[249872]: /etc/rc.d/rc.local is not marked executable, skipping. Oct 14 05:44:21 localhost systemd-sysv-generator[249877]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Oct 14 05:44:21 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Oct 14 05:44:22 localhost systemd[1]: var-lib-containers-storage-overlay-93c0822e715760ae283b5dfa3c054d7f162a497c51033e354a5256453c1ce67c-merged.mount: Deactivated successfully. Oct 14 05:44:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d. Oct 14 05:44:22 localhost systemd[1]: var-lib-containers-storage-overlay-182f4b56e6e8809f2ffde261aea7a82f597fbc875533d1efd7f59fe7c8a139ed-merged.mount: Deactivated successfully. Oct 14 05:44:22 localhost systemd[1]: tmp-crun.rUBhjP.mount: Deactivated successfully. 
Oct 14 05:44:22 localhost podman[249905]: 2025-10-14 09:44:22.476383385 +0000 UTC m=+0.083260536 container health_status 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=unhealthy, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251009, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d) Oct 14 05:44:22 localhost podman[249905]: 2025-10-14 09:44:22.507091338 +0000 UTC m=+0.113968539 container exec_died 
89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251009, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm) Oct 14 05:44:22 localhost podman[249905]: unhealthy Oct 14 05:44:22 localhost systemd[1]: 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d.service: Main process exited, code=exited, status=1/FAILURE Oct 14 05:44:22 localhost systemd[1]: 
89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d.service: Failed with result 'exit-code'. Oct 14 05:44:22 localhost python3.9[249957]: ansible-systemd Invoked with state=restarted name=edpm_openstack_network_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Oct 14 05:44:22 localhost systemd[1]: Reloading. Oct 14 05:44:23 localhost systemd-rc-local-generator[249982]: /etc/rc.d/rc.local is not marked executable, skipping. Oct 14 05:44:23 localhost systemd-sysv-generator[249986]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Oct 14 05:44:23 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Oct 14 05:44:23 localhost systemd[1]: var-lib-containers-storage-overlay-b6fff9c8e433cbfc969f016d7c00977424b6e0fe3f5e8a6774343b30e6ab0953-merged.mount: Deactivated successfully. Oct 14 05:44:23 localhost systemd[1]: var-lib-containers-storage-overlay-9742bc003ef67d7b0fcfbbfc43cb220e80e11c29322c24576aca8c567b6093e5-merged.mount: Deactivated successfully. Oct 14 05:44:23 localhost systemd[1]: Starting openstack_network_exporter container... Oct 14 05:44:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21975 DF PROTO=TCP SPT=37980 DPT=9102 SEQ=99064392 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B0DC5940000000001030307) Oct 14 05:44:23 localhost systemd[1]: var-lib-containers-storage-overlay-9353b4c9b77a60c02d5cd3c8f9b94918c7a607156d2f7e1365b30ffe1fa49c89-merged.mount: Deactivated successfully. 
Oct 14 05:44:23 localhost systemd[1]: var-lib-containers-storage-overlay-41d6d78d48a59c2a92b7ebbd672b507950bf0a9c199b961ef8dec56e0bf4d10d-merged.mount: Deactivated successfully. Oct 14 05:44:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21976 DF PROTO=TCP SPT=37980 DPT=9102 SEQ=99064392 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B0DC99A0000000001030307) Oct 14 05:44:25 localhost systemd[1]: var-lib-containers-storage-overlay-8d123e2bf97cc7b3622c68162b04c29912e1822cdbe31a1ddf70016995925bac-merged.mount: Deactivated successfully. Oct 14 05:44:25 localhost systemd[1]: var-lib-containers-storage-overlay-93c0822e715760ae283b5dfa3c054d7f162a497c51033e354a5256453c1ce67c-merged.mount: Deactivated successfully. Oct 14 05:44:25 localhost systemd[1]: var-lib-containers-storage-overlay-93c0822e715760ae283b5dfa3c054d7f162a497c51033e354a5256453c1ce67c-merged.mount: Deactivated successfully. Oct 14 05:44:25 localhost systemd[1]: Started libcrun container. Oct 14 05:44:25 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6505aa9ea2feda4aed4b7e257cac417e957260e4232e4ac48799ee01e6a84209/merged/run/ovn supports timestamps until 2038 (0x7fffffff) Oct 14 05:44:25 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6505aa9ea2feda4aed4b7e257cac417e957260e4232e4ac48799ee01e6a84209/merged/etc/openstack_network_exporter/openstack_network_exporter.yaml supports timestamps until 2038 (0x7fffffff) Oct 14 05:44:25 localhost nova_compute[238069]: 2025-10-14 09:44:25.388 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:44:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917. 
Oct 14 05:44:25 localhost podman[249998]: 2025-10-14 09:44:25.416438791 +0000 UTC m=+2.106470665 container init 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, config_id=edpm, name=ubi9-minimal, version=9.6, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, vcs-type=git, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1755695350, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal) Oct 14 05:44:25 localhost openstack_network_exporter[250061]: INFO 09:44:25 main.go:48: registering *bridge.Collector Oct 14 05:44:25 localhost openstack_network_exporter[250061]: INFO 09:44:25 main.go:48: registering *coverage.Collector Oct 14 05:44:25 localhost openstack_network_exporter[250061]: INFO 09:44:25 main.go:48: registering *datapath.Collector Oct 14 05:44:25 localhost 
openstack_network_exporter[250061]: INFO 09:44:25 main.go:48: registering *iface.Collector Oct 14 05:44:25 localhost openstack_network_exporter[250061]: INFO 09:44:25 main.go:48: registering *memory.Collector Oct 14 05:44:25 localhost openstack_network_exporter[250061]: INFO 09:44:25 main.go:48: registering *ovnnorthd.Collector Oct 14 05:44:25 localhost openstack_network_exporter[250061]: INFO 09:44:25 main.go:48: registering *ovn.Collector Oct 14 05:44:25 localhost openstack_network_exporter[250061]: INFO 09:44:25 main.go:48: registering *ovsdbserver.Collector Oct 14 05:44:25 localhost openstack_network_exporter[250061]: INFO 09:44:25 main.go:48: registering *pmd_perf.Collector Oct 14 05:44:25 localhost openstack_network_exporter[250061]: INFO 09:44:25 main.go:48: registering *pmd_rxq.Collector Oct 14 05:44:25 localhost openstack_network_exporter[250061]: INFO 09:44:25 main.go:48: registering *vswitch.Collector Oct 14 05:44:25 localhost openstack_network_exporter[250061]: NOTICE 09:44:25 main.go:82: listening on http://:9105/metrics Oct 14 05:44:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917. Oct 14 05:44:25 localhost podman[249998]: 2025-10-14 09:44:25.457443875 +0000 UTC m=+2.147475749 container start 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, name=ubi9-minimal, vcs-type=git, distribution-scope=public, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, config_id=edpm, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, version=9.6, managed_by=edpm_ansible, release=1755695350) Oct 14 05:44:25 localhost podman[249998]: openstack_network_exporter Oct 14 05:44:25 localhost systemd[1]: 
var-lib-containers-storage-overlay-ab64777085904da680013c790c3f2c65f0b954578737ec4d7fa836f56655c34a-merged.mount: Deactivated successfully. Oct 14 05:44:26 localhost systemd[1]: Started openstack_network_exporter container. Oct 14 05:44:26 localhost podman[250071]: 2025-10-14 09:44:26.037209702 +0000 UTC m=+0.571021385 container health_status 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=starting, container_name=openstack_network_exporter, distribution-scope=public, io.buildah.version=1.33.7, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1755695350, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, architecture=x86_64, config_id=edpm, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}) Oct 14 05:44:26 localhost podman[250071]: 2025-10-14 09:44:26.07205527 +0000 UTC m=+0.605866903 container exec_died 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': 
'/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, architecture=x86_64, managed_by=edpm_ansible, release=1755695350, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, vcs-type=git, config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Oct 14 05:44:26 localhost systemd[1]: var-lib-containers-storage-overlay-9353b4c9b77a60c02d5cd3c8f9b94918c7a607156d2f7e1365b30ffe1fa49c89-merged.mount: Deactivated successfully. 
Oct 14 05:44:26 localhost systemd[1]: var-lib-containers-storage-overlay-f3f40f6483bf6d587286da9e86e40878c2aaaf723da5aa2364fff24f5ea28424-merged.mount: Deactivated successfully. Oct 14 05:44:26 localhost systemd[1]: var-lib-containers-storage-overlay-ab64777085904da680013c790c3f2c65f0b954578737ec4d7fa836f56655c34a-merged.mount: Deactivated successfully. Oct 14 05:44:26 localhost systemd[1]: var-lib-containers-storage-overlay-ab64777085904da680013c790c3f2c65f0b954578737ec4d7fa836f56655c34a-merged.mount: Deactivated successfully. Oct 14 05:44:26 localhost nova_compute[238069]: 2025-10-14 09:44:26.762 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:44:27 localhost systemd[1]: var-lib-containers-storage-overlay-d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610-merged.mount: Deactivated successfully. Oct 14 05:44:27 localhost systemd[1]: var-lib-containers-storage-overlay-8d123e2bf97cc7b3622c68162b04c29912e1822cdbe31a1ddf70016995925bac-merged.mount: Deactivated successfully. Oct 14 05:44:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041. Oct 14 05:44:28 localhost systemd[1]: var-lib-containers-storage-overlay-8d123e2bf97cc7b3622c68162b04c29912e1822cdbe31a1ddf70016995925bac-merged.mount: Deactivated successfully. Oct 14 05:44:28 localhost systemd[1]: 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917.service: Deactivated successfully. 
Oct 14 05:44:28 localhost podman[250146]: 2025-10-14 09:44:28.177378951 +0000 UTC m=+0.277272930 container health_status 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=starting, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Oct 14 05:44:28 localhost podman[250146]: 2025-10-14 09:44:28.212239819 +0000 UTC m=+0.312133818 container exec_died 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': 
['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter) Oct 14 05:44:28 localhost podman[250146]: unhealthy Oct 14 05:44:28 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21705 DF PROTO=TCP SPT=44570 DPT=9882 SEQ=2164781238 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B0DD85A0000000001030307) Oct 14 05:44:28 localhost python3.9[250259]: ansible-ansible.builtin.systemd Invoked with name=edpm_openstack_network_exporter.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None Oct 14 05:44:28 localhost systemd[1]: Stopping openstack_network_exporter container... Oct 14 05:44:28 localhost systemd[1]: var-lib-containers-storage-overlay-f3f40f6483bf6d587286da9e86e40878c2aaaf723da5aa2364fff24f5ea28424-merged.mount: Deactivated successfully. Oct 14 05:44:29 localhost systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully. Oct 14 05:44:29 localhost systemd[1]: var-lib-containers-storage-overlay-d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610-merged.mount: Deactivated successfully. Oct 14 05:44:29 localhost systemd[1]: var-lib-containers-storage-overlay-d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610-merged.mount: Deactivated successfully. Oct 14 05:44:29 localhost systemd[1]: 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041.service: Main process exited, code=exited, status=1/FAILURE Oct 14 05:44:29 localhost systemd[1]: 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041.service: Failed with result 'exit-code'. 
Oct 14 05:44:29 localhost systemd[1]: libpod-799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917.scope: Deactivated successfully. Oct 14 05:44:29 localhost podman[250263]: 2025-10-14 09:44:29.408562982 +0000 UTC m=+0.714071988 container died 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, name=ubi9-minimal, config_id=edpm, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.buildah.version=1.33.7, io.openshift.expose-services=, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, managed_by=edpm_ansible) Oct 14 05:44:29 localhost systemd[1]: 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917.timer: Deactivated successfully. Oct 14 05:44:29 localhost systemd[1]: Stopped /usr/bin/podman healthcheck run 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917. 
Oct 14 05:44:29 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917-userdata-shm.mount: Deactivated successfully. Oct 14 05:44:29 localhost systemd[1]: var-lib-containers-storage-overlay-e0f86229f02c4331620c9ec8e21be769ac9cff125fc1f01f8404fcae9b59e3df-merged.mount: Deactivated successfully. Oct 14 05:44:29 localhost systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully. Oct 14 05:44:30 localhost systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully. Oct 14 05:44:30 localhost podman[250263]: 2025-10-14 09:44:30.20880637 +0000 UTC m=+1.514315396 container cleanup 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, container_name=openstack_network_exporter, distribution-scope=public, io.openshift.expose-services=, vendor=Red Hat, Inc., config_id=edpm, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': 
{'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., name=ubi9-minimal, release=1755695350, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.) Oct 14 05:44:30 localhost podman[250263]: openstack_network_exporter Oct 14 05:44:30 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Oct 14 05:44:30 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. 
Oct 14 05:44:30 localhost podman[250277]: 2025-10-14 09:44:30.275134595 +0000 UTC m=+0.868169133 container cleanup 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, container_name=openstack_network_exporter, managed_by=edpm_ansible, config_id=edpm, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, version=9.6, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, io.openshift.expose-services=, vendor=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.) Oct 14 05:44:30 localhost nova_compute[238069]: 2025-10-14 09:44:30.419 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:44:30 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21978 DF PROTO=TCP SPT=37980 DPT=9102 SEQ=99064392 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B0DE15A0000000001030307) Oct 14 05:44:30 localhost systemd[1]: var-lib-containers-storage-overlay-41d6d78d48a59c2a92b7ebbd672b507950bf0a9c199b961ef8dec56e0bf4d10d-merged.mount: Deactivated successfully. Oct 14 05:44:30 localhost systemd[1]: var-lib-containers-storage-overlay-6505aa9ea2feda4aed4b7e257cac417e957260e4232e4ac48799ee01e6a84209-merged.mount: Deactivated successfully. Oct 14 05:44:31 localhost nova_compute[238069]: 2025-10-14 09:44:31.773 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:44:32 localhost systemd[1]: var-lib-containers-storage-overlay-93c0822e715760ae283b5dfa3c054d7f162a497c51033e354a5256453c1ce67c-merged.mount: Deactivated successfully. 
Oct 14 05:44:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29. Oct 14 05:44:32 localhost systemd[1]: var-lib-containers-storage-overlay-b6fff9c8e433cbfc969f016d7c00977424b6e0fe3f5e8a6774343b30e6ab0953-merged.mount: Deactivated successfully. Oct 14 05:44:32 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Oct 14 05:44:32 localhost systemd[1]: edpm_openstack_network_exporter.service: Main process exited, code=exited, status=2/INVALIDARGUMENT Oct 14 05:44:32 localhost podman[250290]: 2025-10-14 09:44:32.794323963 +0000 UTC m=+0.237346349 container health_status fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_id=iscsid, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', 
'/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true) Oct 14 05:44:32 localhost podman[250290]: 2025-10-14 09:44:32.833810631 +0000 UTC m=+0.276833047 container exec_died fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_id=iscsid, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', 
'/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible) Oct 14 05:44:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec. Oct 14 05:44:34 localhost systemd[1]: var-lib-containers-storage-overlay-b6fff9c8e433cbfc969f016d7c00977424b6e0fe3f5e8a6774343b30e6ab0953-merged.mount: Deactivated successfully. Oct 14 05:44:34 localhost systemd[1]: var-lib-containers-storage-overlay-8d123e2bf97cc7b3622c68162b04c29912e1822cdbe31a1ddf70016995925bac-merged.mount: Deactivated successfully. Oct 14 05:44:35 localhost systemd[1]: var-lib-containers-storage-overlay-93c0822e715760ae283b5dfa3c054d7f162a497c51033e354a5256453c1ce67c-merged.mount: Deactivated successfully. Oct 14 05:44:35 localhost systemd[1]: var-lib-containers-storage-overlay-9627ceeeadda540cd5b7624a5c18b2d6257f00b841a3ce165dd1352e6e7c53ee-merged.mount: Deactivated successfully. Oct 14 05:44:35 localhost systemd[1]: fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29.service: Deactivated successfully. Oct 14 05:44:35 localhost podman[250301]: 2025-10-14 09:44:35.318044778 +0000 UTC m=+2.529641644 container cleanup 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, vcs-type=git, distribution-scope=public, io.openshift.expose-services=, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, architecture=x86_64, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, container_name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}) Oct 14 05:44:35 localhost podman[250301]: openstack_network_exporter Oct 14 05:44:35 localhost podman[250320]: 2025-10-14 09:44:35.382135216 +0000 UTC m=+0.720202812 container health_status 
aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Oct 14 05:44:35 localhost podman[250320]: 2025-10-14 09:44:35.39023915 +0000 UTC m=+0.728306776 container exec_died aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': 
['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter) Oct 14 05:44:35 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21708 DF PROTO=TCP SPT=44570 DPT=9882 SEQ=2164781238 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B0DF41A0000000001030307) Oct 14 05:44:35 localhost nova_compute[238069]: 2025-10-14 09:44:35.419 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:44:35 localhost systemd[1]: var-lib-containers-storage-overlay-93c0822e715760ae283b5dfa3c054d7f162a497c51033e354a5256453c1ce67c-merged.mount: Deactivated successfully. Oct 14 05:44:36 localhost systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully. Oct 14 05:44:36 localhost systemd[1]: var-lib-containers-storage-overlay-f479750c98f4a67ffae355a1e79b3c9a76d56699a79b842b4363e69f089cca49-merged.mount: Deactivated successfully. 
Oct 14 05:44:36 localhost systemd[1]: var-lib-containers-storage-overlay-f479750c98f4a67ffae355a1e79b3c9a76d56699a79b842b4363e69f089cca49-merged.mount: Deactivated successfully. Oct 14 05:44:36 localhost nova_compute[238069]: 2025-10-14 09:44:36.815 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:44:37 localhost systemd[1]: var-lib-containers-storage-overlay-d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610-merged.mount: Deactivated successfully. Oct 14 05:44:37 localhost systemd[1]: var-lib-containers-storage-overlay-8d123e2bf97cc7b3622c68162b04c29912e1822cdbe31a1ddf70016995925bac-merged.mount: Deactivated successfully. Oct 14 05:44:37 localhost systemd[1]: var-lib-containers-storage-overlay-8d123e2bf97cc7b3622c68162b04c29912e1822cdbe31a1ddf70016995925bac-merged.mount: Deactivated successfully. Oct 14 05:44:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad. Oct 14 05:44:37 localhost systemd[1]: aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec.service: Deactivated successfully. Oct 14 05:44:37 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Oct 14 05:44:37 localhost systemd[1]: edpm_openstack_network_exporter.service: Failed with result 'exit-code'. Oct 14 05:44:37 localhost systemd[1]: Stopped openstack_network_exporter container. Oct 14 05:44:37 localhost systemd[1]: Starting openstack_network_exporter container... 
Oct 14 05:44:37 localhost podman[250340]: 2025-10-14 09:44:37.590887608 +0000 UTC m=+0.088701388 container health_status 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251009, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=multipathd, org.label-schema.vendor=CentOS) Oct 14 05:44:37 localhost podman[250340]: 2025-10-14 09:44:37.602167317 +0000 UTC m=+0.099981127 container exec_died 
02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.schema-version=1.0, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251009, tcib_managed=true) Oct 14 05:44:38 localhost systemd[1]: var-lib-containers-storage-overlay-e0f86229f02c4331620c9ec8e21be769ac9cff125fc1f01f8404fcae9b59e3df-merged.mount: Deactivated successfully. 
Oct 14 05:44:38 localhost systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully. Oct 14 05:44:38 localhost systemd[1]: var-lib-containers-storage-overlay-d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610-merged.mount: Deactivated successfully. Oct 14 05:44:38 localhost systemd[1]: 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad.service: Deactivated successfully. Oct 14 05:44:38 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Oct 14 05:44:38 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26278 DF PROTO=TCP SPT=58244 DPT=9105 SEQ=445706597 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B0E013E0000000001030307) Oct 14 05:44:39 localhost systemd[1]: var-lib-containers-storage-overlay-d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610-merged.mount: Deactivated successfully. Oct 14 05:44:39 localhost systemd[1]: var-lib-containers-storage-overlay-e0f86229f02c4331620c9ec8e21be769ac9cff125fc1f01f8404fcae9b59e3df-merged.mount: Deactivated successfully. Oct 14 05:44:39 localhost systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully. Oct 14 05:44:39 localhost systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully. Oct 14 05:44:39 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. 
Oct 14 05:44:39 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Oct 14 05:44:39 localhost systemd[1]: Started libcrun container. Oct 14 05:44:39 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6505aa9ea2feda4aed4b7e257cac417e957260e4232e4ac48799ee01e6a84209/merged/run/ovn supports timestamps until 2038 (0x7fffffff) Oct 14 05:44:39 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6505aa9ea2feda4aed4b7e257cac417e957260e4232e4ac48799ee01e6a84209/merged/etc/openstack_network_exporter/openstack_network_exporter.yaml supports timestamps until 2038 (0x7fffffff) Oct 14 05:44:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917. Oct 14 05:44:39 localhost podman[250357]: 2025-10-14 09:44:39.591139631 +0000 UTC m=+1.997318556 container init 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_id=edpm, release=1755695350, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., name=ubi9-minimal, distribution-scope=public, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, version=9.6, com.redhat.component=ubi9-minimal-container) Oct 14 05:44:39 localhost openstack_network_exporter[250374]: INFO 09:44:39 main.go:48: registering *bridge.Collector Oct 14 05:44:39 localhost openstack_network_exporter[250374]: INFO 09:44:39 main.go:48: registering *coverage.Collector Oct 14 05:44:39 localhost 
openstack_network_exporter[250374]: INFO 09:44:39 main.go:48: registering *datapath.Collector Oct 14 05:44:39 localhost openstack_network_exporter[250374]: INFO 09:44:39 main.go:48: registering *iface.Collector Oct 14 05:44:39 localhost openstack_network_exporter[250374]: INFO 09:44:39 main.go:48: registering *memory.Collector Oct 14 05:44:39 localhost openstack_network_exporter[250374]: INFO 09:44:39 main.go:48: registering *ovnnorthd.Collector Oct 14 05:44:39 localhost openstack_network_exporter[250374]: INFO 09:44:39 main.go:48: registering *ovn.Collector Oct 14 05:44:39 localhost openstack_network_exporter[250374]: INFO 09:44:39 main.go:48: registering *ovsdbserver.Collector Oct 14 05:44:39 localhost openstack_network_exporter[250374]: INFO 09:44:39 main.go:48: registering *pmd_perf.Collector Oct 14 05:44:39 localhost openstack_network_exporter[250374]: INFO 09:44:39 main.go:48: registering *pmd_rxq.Collector Oct 14 05:44:39 localhost openstack_network_exporter[250374]: INFO 09:44:39 main.go:48: registering *vswitch.Collector Oct 14 05:44:39 localhost openstack_network_exporter[250374]: NOTICE 09:44:39 main.go:82: listening on http://:9105/metrics Oct 14 05:44:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917. 
Oct 14 05:44:39 localhost podman[250357]: 2025-10-14 09:44:39.635606222 +0000 UTC m=+2.041785137 container start 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, build-date=2025-08-20T13:12:41, name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, config_id=edpm, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., version=9.6, com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, release=1755695350, architecture=x86_64, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=) Oct 14 05:44:39 localhost podman[250357]: openstack_network_exporter Oct 14 05:44:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26279 DF PROTO=TCP SPT=58244 DPT=9105 SEQ=445706597 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B0E055A0000000001030307) Oct 14 05:44:40 localhost systemd[1]: var-lib-containers-storage-overlay-f479750c98f4a67ffae355a1e79b3c9a76d56699a79b842b4363e69f089cca49-merged.mount: Deactivated successfully. Oct 14 05:44:40 localhost systemd[1]: var-lib-containers-storage-overlay-168207db095cdd373b28e32e9bd8a2aa29e7cbcdf9040af1b44bb5a093e7f31e-merged.mount: Deactivated successfully. Oct 14 05:44:40 localhost systemd[1]: Started openstack_network_exporter container. Oct 14 05:44:40 localhost podman[250384]: 2025-10-14 09:44:40.399828254 +0000 UTC m=+0.761085155 container health_status 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=starting, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, distribution-scope=public, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, architecture=x86_64, vcs-type=git, io.buildah.version=1.33.7, io.openshift.expose-services=, managed_by=edpm_ansible, version=9.6, config_id=edpm, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, name=ubi9-minimal, vendor=Red Hat, Inc., release=1755695350) Oct 14 05:44:40 localhost podman[250384]: 2025-10-14 09:44:40.420203271 +0000 UTC m=+0.781460102 container exec_died 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, distribution-scope=public, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat 
Universal Base Image 9., config_id=edpm, maintainer=Red Hat, Inc., vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, release=1755695350, container_name=openstack_network_exporter, io.buildah.version=1.33.7, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64) Oct 14 05:44:40 localhost nova_compute[238069]: 2025-10-14 09:44:40.431 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:44:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f. 
Oct 14 05:44:41 localhost python3.9[250520]: ansible-ansible.builtin.find Invoked with file_type=directory paths=['/var/lib/openstack/healthchecks/'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None Oct 14 05:44:41 localhost nova_compute[238069]: 2025-10-14 09:44:41.850 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:44:41 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26280 DF PROTO=TCP SPT=58244 DPT=9105 SEQ=445706597 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B0E0D5A0000000001030307) Oct 14 05:44:42 localhost systemd[1]: var-lib-containers-storage-overlay-b6fff9c8e433cbfc969f016d7c00977424b6e0fe3f5e8a6774343b30e6ab0953-merged.mount: Deactivated successfully. Oct 14 05:44:42 localhost systemd[1]: var-lib-containers-storage-overlay-948d63d72c90238568600bb4ced3a347f3a772760aabfa54019ccce9078bd0ca-merged.mount: Deactivated successfully. Oct 14 05:44:42 localhost systemd[1]: var-lib-containers-storage-overlay-182f4b56e6e8809f2ffde261aea7a82f597fbc875533d1efd7f59fe7c8a139ed-merged.mount: Deactivated successfully. Oct 14 05:44:43 localhost systemd[1]: var-lib-containers-storage-overlay-182f4b56e6e8809f2ffde261aea7a82f597fbc875533d1efd7f59fe7c8a139ed-merged.mount: Deactivated successfully. Oct 14 05:44:43 localhost systemd[1]: 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917.service: Deactivated successfully. 
Oct 14 05:44:43 localhost podman[250457]: 2025-10-14 09:44:43.302070481 +0000 UTC m=+2.644437622 container health_status 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team) Oct 14 05:44:43 localhost podman[250457]: 2025-10-14 09:44:43.344952782 +0000 UTC m=+2.687319843 container exec_died 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, 
org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}) Oct 14 05:44:45 localhost systemd[1]: var-lib-containers-storage-overlay-d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610-merged.mount: Deactivated successfully. Oct 14 05:44:45 localhost systemd[1]: var-lib-containers-storage-overlay-948d63d72c90238568600bb4ced3a347f3a772760aabfa54019ccce9078bd0ca-merged.mount: Deactivated successfully. Oct 14 05:44:45 localhost nova_compute[238069]: 2025-10-14 09:44:45.433 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:44:45 localhost systemd[1]: var-lib-containers-storage-overlay-3d32571c90c517218e75b400153bfe2946f348989aeee2613f1e17f32183ce41-merged.mount: Deactivated successfully. Oct 14 05:44:45 localhost systemd[1]: var-lib-containers-storage-overlay-948d63d72c90238568600bb4ced3a347f3a772760aabfa54019ccce9078bd0ca-merged.mount: Deactivated successfully. 
Oct 14 05:44:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857. Oct 14 05:44:45 localhost systemd[1]: 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f.service: Deactivated successfully. Oct 14 05:44:45 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26281 DF PROTO=TCP SPT=58244 DPT=9105 SEQ=445706597 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B0E1D1A0000000001030307) Oct 14 05:44:46 localhost systemd[1]: var-lib-containers-storage-overlay-9dce2160573984ba54f17e563b839daf8c243479b9d2f49c1195fe30690bd2c9-merged.mount: Deactivated successfully. Oct 14 05:44:46 localhost systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully. Oct 14 05:44:46 localhost systemd[1]: var-lib-containers-storage-overlay-d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610-merged.mount: Deactivated successfully. Oct 14 05:44:46 localhost nova_compute[238069]: 2025-10-14 09:44:46.895 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:44:46 localhost systemd[1]: var-lib-containers-storage-overlay-d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610-merged.mount: Deactivated successfully. Oct 14 05:44:47 localhost systemd[1]: var-lib-containers-storage-overlay-e0f86229f02c4331620c9ec8e21be769ac9cff125fc1f01f8404fcae9b59e3df-merged.mount: Deactivated successfully. Oct 14 05:44:47 localhost systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully. 
Oct 14 05:44:47 localhost systemd[1]: var-lib-containers-storage-overlay-a10bb81cada1063fdd09337579a73ba5c07dabd1b81c2bfe70924b91722bf534-merged.mount: Deactivated successfully. Oct 14 05:44:47 localhost nova_compute[238069]: 2025-10-14 09:44:47.943 2 DEBUG oslo_service.periodic_task [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 05:44:47 localhost nova_compute[238069]: 2025-10-14 09:44:47.943 2 DEBUG oslo_service.periodic_task [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 05:44:47 localhost nova_compute[238069]: 2025-10-14 09:44:47.943 2 DEBUG nova.compute.manager [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Oct 14 05:44:47 localhost nova_compute[238069]: 2025-10-14 09:44:47.943 2 DEBUG nova.compute.manager [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Oct 14 05:44:48 localhost systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully. Oct 14 05:44:48 localhost systemd[1]: var-lib-containers-storage-overlay-3d32571c90c517218e75b400153bfe2946f348989aeee2613f1e17f32183ce41-merged.mount: Deactivated successfully. 
Oct 14 05:44:48 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12445 DF PROTO=TCP SPT=43092 DPT=9101 SEQ=689520805 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B0E25DA0000000001030307) Oct 14 05:44:48 localhost systemd[1]: var-lib-containers-storage-overlay-e0f86229f02c4331620c9ec8e21be769ac9cff125fc1f01f8404fcae9b59e3df-merged.mount: Deactivated successfully. Oct 14 05:44:48 localhost nova_compute[238069]: 2025-10-14 09:44:48.529 2 DEBUG oslo_concurrency.lockutils [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Acquiring lock "refresh_cache-88c4e366-b765-47a6-96bf-f7677f2ce67c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Oct 14 05:44:48 localhost nova_compute[238069]: 2025-10-14 09:44:48.529 2 DEBUG oslo_concurrency.lockutils [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Acquired lock "refresh_cache-88c4e366-b765-47a6-96bf-f7677f2ce67c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Oct 14 05:44:48 localhost nova_compute[238069]: 2025-10-14 09:44:48.529 2 DEBUG nova.network.neutron [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] [instance: 88c4e366-b765-47a6-96bf-f7677f2ce67c] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Oct 14 05:44:48 localhost nova_compute[238069]: 2025-10-14 09:44:48.529 2 DEBUG nova.objects.instance [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 88c4e366-b765-47a6-96bf-f7677f2ce67c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Oct 14 05:44:49 localhost systemd[1]: var-lib-containers-storage-overlay-e0f86229f02c4331620c9ec8e21be769ac9cff125fc1f01f8404fcae9b59e3df-merged.mount: Deactivated 
successfully. Oct 14 05:44:49 localhost nova_compute[238069]: 2025-10-14 09:44:49.598 2 DEBUG nova.network.neutron [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] [instance: 88c4e366-b765-47a6-96bf-f7677f2ce67c] Updating instance_info_cache with network_info: [{"id": "3ec9b060-f43d-4698-9c76-6062c70911d5", "address": "fa:16:3e:84:5e:e5", "network": {"id": "7d0cd696-bdd7-4e70-9512-eb0d23640314", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.46", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "41187b090f3d4818a32baa37ce8a3991", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3ec9b060-f4", "ovs_interfaceid": "3ec9b060-f43d-4698-9c76-6062c70911d5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Oct 14 05:44:49 localhost systemd[1]: var-lib-containers-storage-overlay-0accaf46e2ca98f20a95b21cea4fb623de0e5378cb14b163bca0a8771d84c861-merged.mount: Deactivated successfully. 
Oct 14 05:44:49 localhost nova_compute[238069]: 2025-10-14 09:44:49.615 2 DEBUG oslo_concurrency.lockutils [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Releasing lock "refresh_cache-88c4e366-b765-47a6-96bf-f7677f2ce67c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Oct 14 05:44:49 localhost nova_compute[238069]: 2025-10-14 09:44:49.616 2 DEBUG nova.compute.manager [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] [instance: 88c4e366-b765-47a6-96bf-f7677f2ce67c] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Oct 14 05:44:49 localhost nova_compute[238069]: 2025-10-14 09:44:49.616 2 DEBUG oslo_service.periodic_task [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 05:44:49 localhost nova_compute[238069]: 2025-10-14 09:44:49.617 2 DEBUG oslo_service.periodic_task [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 05:44:49 localhost nova_compute[238069]: 2025-10-14 09:44:49.617 2 DEBUG oslo_service.periodic_task [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 05:44:49 localhost nova_compute[238069]: 2025-10-14 09:44:49.617 2 DEBUG oslo_service.periodic_task [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 05:44:49 localhost 
nova_compute[238069]: 2025-10-14 09:44:49.618 2 DEBUG oslo_service.periodic_task [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 05:44:49 localhost nova_compute[238069]: 2025-10-14 09:44:49.618 2 DEBUG oslo_service.periodic_task [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 05:44:49 localhost nova_compute[238069]: 2025-10-14 09:44:49.618 2 DEBUG nova.compute.manager [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Oct 14 05:44:49 localhost nova_compute[238069]: 2025-10-14 09:44:49.618 2 DEBUG oslo_service.periodic_task [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 05:44:49 localhost nova_compute[238069]: 2025-10-14 09:44:49.638 2 DEBUG oslo_concurrency.lockutils [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Oct 14 05:44:49 localhost nova_compute[238069]: 2025-10-14 09:44:49.638 2 DEBUG oslo_concurrency.lockutils [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Oct 14 05:44:49 
localhost nova_compute[238069]: 2025-10-14 09:44:49.639 2 DEBUG oslo_concurrency.lockutils [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Oct 14 05:44:49 localhost nova_compute[238069]: 2025-10-14 09:44:49.639 2 DEBUG nova.compute.resource_tracker [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Auditing locally available compute resources for np0005486733.localdomain (node: np0005486733.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Oct 14 05:44:49 localhost nova_compute[238069]: 2025-10-14 09:44:49.639 2 DEBUG oslo_concurrency.processutils [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Oct 14 05:44:49 localhost systemd[1]: var-lib-containers-storage-overlay-a10bb81cada1063fdd09337579a73ba5c07dabd1b81c2bfe70924b91722bf534-merged.mount: Deactivated successfully. 
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.810 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'name': 'test', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005486733.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '41187b090f3d4818a32baa37ce8a3991', 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'hostId': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228 Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.812 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.817 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.820 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '2113af2e-9276-4afb-a4b6-0970a29f6a1f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T09:44:49.812267', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': '6d4accb2-a8e2-11f0-9707-fa163e99780b', 'monotonic_time': 11106.005038114, 'message_signature': '6cf390d76b521b21f80e6242ef9fe73a2966cf28165e441668f50e410981b62f'}]}, 'timestamp': '2025-10-14 09:44:49.818321', '_unique_id': 'cf325cc2a37b44708365dd71409b698e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.820 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:44:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.820 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.820 12 ERROR oslo_messaging.notify.messaging yield Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.820 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.820 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.820 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.820 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.820 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.820 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.820 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.820 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.820 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.820 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.820 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.820 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.820 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.820 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.820 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.820 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.820 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.820 12 ERROR oslo_messaging.notify.messaging Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.820 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.820 12 ERROR oslo_messaging.notify.messaging Oct 14 05:44:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.820 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.820 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.820 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.820 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.820 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.820 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.820 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.820 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.820 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.820 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.820 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.820 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.820 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.820 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.820 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.820 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.820 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.820 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.820 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.820 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 05:44:49 
localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.820 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.820 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.820 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.820 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.820 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.820 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.820 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.820 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.820 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.820 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.820 12 ERROR oslo_messaging.notify.messaging Oct 14 05:44:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.821 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.822 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.outgoing.bytes volume: 11272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.824 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b3215414-4b23-4903-bc2a-c7d605bb5799', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 11272, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T09:44:49.822124', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 
'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': '6d4b7c98-a8e2-11f0-9707-fa163e99780b', 'monotonic_time': 11106.005038114, 'message_signature': '32381c14044bf5773d67a5d386098735f2438237c6849e71dfe1e20684cd0d7f'}]}, 'timestamp': '2025-10-14 09:44:49.822720', '_unique_id': '08ae09eb277849878cfa0db5713e224b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.824 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.824 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.824 12 ERROR oslo_messaging.notify.messaging yield Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.824 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.824 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.824 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.824 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.824 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.824 12 ERROR 
oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.824 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.824 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.824 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.824 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.824 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.824 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.824 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.824 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.824 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.824 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 
2025-10-14 09:44:49.824 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.824 12 ERROR oslo_messaging.notify.messaging Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.824 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.824 12 ERROR oslo_messaging.notify.messaging Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.824 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.824 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.824 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.824 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.824 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.824 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.824 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 05:44:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.824 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.824 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.824 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.824 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.824 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.824 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.824 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.824 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.824 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.824 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 
05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.824 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.824 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.824 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.824 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.824 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.824 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.824 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.824 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.824 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.824 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.824 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.824 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.824 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.824 12 ERROR oslo_messaging.notify.messaging Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.825 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.825 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.827 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '77b4e1ba-b356-466d-bd34-b0a022fc42de', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T09:44:49.825571', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': '6d4c021c-a8e2-11f0-9707-fa163e99780b', 'monotonic_time': 11106.005038114, 'message_signature': 'eb31a0fa6f3fb467083212c0d74545b5cf694ef5165fda65b9c40c37edc0a36e'}]}, 'timestamp': '2025-10-14 09:44:49.826172', '_unique_id': 'a68632957b5543f6aa815892b587dbc5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.827 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 
2025-10-14 09:44:49.827 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.827 12 ERROR oslo_messaging.notify.messaging yield
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.827 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.827 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.827 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.827 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.827 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.827 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.827 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.827 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.827 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.827 12 ERROR oslo_messaging.notify.messaging conn.connect()
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.827 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.827 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.827 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.827 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.827 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.827 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.827 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.827 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.827 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.827 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.827 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.827 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.827 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.827 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.827 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.827 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.827 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.827 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.827 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.827 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.827 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.827 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.827 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.827 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.827 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.827 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.827 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.827 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.827 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.827 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.827 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.827 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.827 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.827 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.827 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.827 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.827 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.827 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.827 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.827 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.827 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.828 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.828 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.830 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8cf3da82-d749-410d-809a-a919b51c76f8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T09:44:49.828852', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': '6d4c80b6-a8e2-11f0-9707-fa163e99780b', 'monotonic_time': 11106.005038114, 'message_signature': '96373c812fe92a53af6d23038a115f86302fa9282607e741e5e72b3a4410ea21'}]}, 'timestamp': '2025-10-14 09:44:49.829438', '_unique_id': '466ad033f88e4394afeae2e28d20c3ef'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.830 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.830 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.830 12 ERROR oslo_messaging.notify.messaging yield
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.830 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.830 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.830 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.830 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.830 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.830 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.830 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.830 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.830 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.830 12 ERROR oslo_messaging.notify.messaging conn.connect()
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.830 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.830 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.830 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.830 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.830 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.830 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.830 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.830 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.830 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.830 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.830 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.830 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.830 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.830 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.830 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.830 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.830 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.830 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.830 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.830 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.830 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.830 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.830 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.830 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.830 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.830 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.830 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.830 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.830 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.830 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.830 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.830 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.830 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.830 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.830 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.830 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.830 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.830 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.830 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.830 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.830 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.831 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.848 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.write.requests volume: 525 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.849 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.852 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6210829e-353c-43d3-b5ff-6feb57d7ef1c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 525, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T09:44:49.831920', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '6d4f9dfa-a8e2-11f0-9707-fa163e99780b', 'monotonic_time': 11106.024665028, 'message_signature': '3eaa835d6eed1aeacd2cdbcc437496449e42611ba73d952555127787360d2af6'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T09:44:49.831920', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '6d4fb452-a8e2-11f0-9707-fa163e99780b', 'monotonic_time': 11106.024665028, 'message_signature': 'af5b489990032669e128a516db4fa9fb3a3db53a65b84f66275834e5c88206ff'}]}, 'timestamp': '2025-10-14 09:44:49.850299', '_unique_id': 'b19922ab757045ddbc0ea6ef8dfafe23'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.852 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.852 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.852 12 ERROR oslo_messaging.notify.messaging yield
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.852 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.852 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.852 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.852 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.852 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.852 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.852 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.852 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.852 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.852 12 ERROR oslo_messaging.notify.messaging conn.connect()
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.852 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.852 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.852 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.852 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.852 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.852 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.852 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.852 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.852 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.852 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.852 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.852 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.852 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.852 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.852 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.852 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.852 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.852 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.852 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.852 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.852 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.852 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.852 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.852 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.852 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.852 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.852 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.852 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.852 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.852 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.852 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.852 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.852 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.852 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.852 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.852 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.852 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.852 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.852 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.852 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.852 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.853 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.857 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.860 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c179245f-a978-4ee8-a78f-8c33341e567c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T09:44:49.857747', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': '6d50f22c-a8e2-11f0-9707-fa163e99780b', 'monotonic_time': 11106.005038114, 'message_signature': '04453e1d982b6e7057416b9e0c3035cfe2881cc8488c0e5e3a2150798adc07f1'}]}, 'timestamp': '2025-10-14 09:44:49.858568', '_unique_id': '1283eaf0431148389681419e26b527d4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.860 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.860 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.860 12 ERROR oslo_messaging.notify.messaging yield
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.860 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.860 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.860 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.860 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.860 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.860 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.860 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.860 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.860 12 ERROR oslo_messaging.notify.messaging File
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.860 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.860 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.860 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.860 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.860 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.860 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.860 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.860 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.860 12 ERROR oslo_messaging.notify.messaging Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.860 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.860 12 ERROR oslo_messaging.notify.messaging Oct 14 05:44:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.860 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.860 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.860 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.860 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.860 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.860 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.860 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.860 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.860 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.860 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.860 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.860 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.860 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.860 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.860 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.860 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.860 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.860 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.860 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.860 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 05:44:49 
localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.860 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.860 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.860 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.860 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.860 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.860 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.860 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.860 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.860 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.860 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.860 12 ERROR oslo_messaging.notify.messaging Oct 14 05:44:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.861 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.889 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.890 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.892 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '12229fb0-0101-48e0-9d6c-ba2069f1bde4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T09:44:49.861491', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '6d55e354-a8e2-11f0-9707-fa163e99780b', 'monotonic_time': 11106.054244314, 'message_signature': 'f229f743791cfb9ef80906ea7711e7dbfb012b231bd6afbba2d03e90e0668053'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T09:44:49.861491', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '6d55f948-a8e2-11f0-9707-fa163e99780b', 'monotonic_time': 11106.054244314, 'message_signature': '628c8f77768b99f0e62453ee49eec5c3fa40c8c5bf05daa1d654c7bd9be075ad'}]}, 'timestamp': '2025-10-14 09:44:49.891384', '_unique_id': '2bc1b177e5554b3d94380b4bb1e3af5a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.892 12 ERROR 
oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.892 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.892 12 ERROR oslo_messaging.notify.messaging yield Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.892 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.892 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.892 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.892 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.892 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.892 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.892 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.892 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 05:44:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.892 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.892 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.892 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.892 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.892 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.892 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.892 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.892 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.892 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.892 12 ERROR oslo_messaging.notify.messaging Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.892 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 
2025-10-14 09:44:49.892 12 ERROR oslo_messaging.notify.messaging Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.892 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.892 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.892 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.892 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.892 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.892 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.892 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.892 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.892 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.892 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.892 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.892 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.892 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.892 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.892 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.892 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.892 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.892 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.892 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.892 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.892 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.892 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.892 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.892 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.892 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.892 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.892 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.892 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.892 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.892 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:44:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.892 12 ERROR oslo_messaging.notify.messaging Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.894 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.894 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.incoming.bytes volume: 8696 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.895 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2e0e1476-62f8-4a43-8441-6a31d88324bf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 8696, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T09:44:49.894388', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 
'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': '6d56816a-a8e2-11f0-9707-fa163e99780b', 'monotonic_time': 11106.005038114, 'message_signature': '3ee472206e887a5b1b174b73cae9dce23b1b06062e0c8c439e31d43066a960b3'}]}, 'timestamp': '2025-10-14 09:44:49.894847', '_unique_id': '22b900c8422e46719ca49fa8a571098d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.895 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.895 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.895 12 ERROR oslo_messaging.notify.messaging yield Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.895 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.895 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.895 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.895 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.895 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, 
in _connection_factory
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.895 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.895 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.895 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.895 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.895 12 ERROR oslo_messaging.notify.messaging conn.connect()
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.895 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.895 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.895 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.895 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.895 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.895 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.895 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.895 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.895 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.895 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.895 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.895 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.895 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.895 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.895 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.895 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.895 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.895 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.895 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.895 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.895 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.895 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.895 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.895 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.895 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.895 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.895 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.895 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.895 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.895 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.895 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.895 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.895 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.895 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.895 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.895 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.895 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.895 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.895 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.895 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.895 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.896 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.896 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.897 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e5005fa6-e56e-4992-bf65-f2519bad645f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T09:44:49.896253', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': '6d56c684-a8e2-11f0-9707-fa163e99780b', 'monotonic_time': 11106.005038114, 'message_signature': 'f04908c247135ef3e95b40ec0e2e2bd54c088ef17eec67a561936782dda581c4'}]}, 'timestamp': '2025-10-14 09:44:49.896566', '_unique_id': 'd5fa276d0f7f45c4a1774ade860424b4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.897 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.897 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.897 12 ERROR oslo_messaging.notify.messaging yield
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.897 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.897 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.897 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.897 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.897 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.897 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.897 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.897 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.897 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.897 12 ERROR oslo_messaging.notify.messaging conn.connect()
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.897 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.897 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.897 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.897 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.897 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.897 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.897 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.897 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.897 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.897 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.897 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.897 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.897 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.897 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.897 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.897 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.897 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.897 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.897 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.897 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.897 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.897 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.897 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.897 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.897 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.897 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.897 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.897 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.897 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.897 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.897 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.897 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.897 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.897 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.897 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.897 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.897 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.897 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.897 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.897 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.897 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.897 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.897 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.898 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bea5f9d0-42cc-4bf7-90a9-d4a2cad104bd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T09:44:49.897965', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': '6d570950-a8e2-11f0-9707-fa163e99780b', 'monotonic_time': 11106.005038114, 'message_signature': '7ab865c983af79cfe5f262ef41e3930682594d87bb0971f34a84a8f830c8b949'}]}, 'timestamp': '2025-10-14 09:44:49.898295', '_unique_id': '625dfdf609754cf9974cf97461d4375e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.898 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.898 12 ERROR oslo_messaging.notify.messaging yield
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.898 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.898 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.898 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.898 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.898 12 ERROR oslo_messaging.notify.messaging conn.connect()
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.898 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.898 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.898 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.898 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.898 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.898 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.898 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.898 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.898 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.898 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.898 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.898 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.898 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.898 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.898 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.898 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.898 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.898 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.898 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.898 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.898 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.898 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.898 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.898 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.899 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.918 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/memory.usage volume: 52.32421875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.921 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e0c59efc-5204-476a-8e86-4fc0a6a9ba6d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 52.32421875, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'timestamp': '2025-10-14T09:44:49.899799', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '6d5a40d4-a8e2-11f0-9707-fa163e99780b', 'monotonic_time': 11106.111245317, 'message_signature': 'ae5ebafbbcfc33a06c1da7cf10646e51c7b3ce2735163158658e6386f715c5f9'}]}, 'timestamp': '2025-10-14 09:44:49.919530', '_unique_id': 'fbf1d9bbdb3946fc9260182c9cf554c3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.921 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.921 12 ERROR oslo_messaging.notify.messaging yield
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.921 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.921 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.921 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.921 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.921 12 ERROR oslo_messaging.notify.messaging conn.connect()
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.921 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.921 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.921 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.921 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.921 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.921 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.921 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.921 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]:
2025-10-14 09:44:49.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.921 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.921 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.921 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.921 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.921 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 
05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.921 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.921 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.921 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.921 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.921 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 05:44:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.921 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.921 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.921 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.921 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.921 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.921 12 ERROR oslo_messaging.notify.messaging Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.922 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters Oct 14 05:44:49 
localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.922 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.922 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.allocation volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.924 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '652d205b-fe76-42af-89f7-f8b64bdfe0d1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T09:44:49.922496', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 
'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '6d5acbe4-a8e2-11f0-9707-fa163e99780b', 'monotonic_time': 11106.054244314, 'message_signature': '36dab16d881cc7e7b8fd403dec78e9d73d52cda07c06152237f5995325cfcfaf'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T09:44:49.922496', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '6d5adbb6-a8e2-11f0-9707-fa163e99780b', 'monotonic_time': 11106.054244314, 'message_signature': '8cb30e7933e65ad942945b93b01fde0fa2a566194d40e57a9bbd0d05d396887c'}]}, 'timestamp': '2025-10-14 09:44:49.923389', '_unique_id': 'a9b3b1926d9744268351e5f85149c68b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.924 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.924 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.924 12 ERROR oslo_messaging.notify.messaging yield Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.924 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.924 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.924 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.924 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 05:44:49 
localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.924 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.924 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.924 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.924 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.924 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.924 12 ERROR oslo_messaging.notify.messaging Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.924 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.924 12 ERROR oslo_messaging.notify.messaging Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.924 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call 
last): Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.924 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.924 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.924 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.924 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.924 12 ERROR oslo_messaging.notify.messaging 
return rpc_common.ConnectionContext(self._connection_pool, Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.924 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.924 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.924 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.924 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.924 12 ERROR oslo_messaging.notify.messaging 
self.connection.ensure_connection( Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.924 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.924 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.924 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.924 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.924 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.924 12 ERROR oslo_messaging.notify.messaging Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.925 12 DEBUG ceilometer.polling.manager [-] Skip pollster 
network.outgoing.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.925 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.925 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.read.latency volume: 1400578039 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.926 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.read.latency volume: 174873109 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.927 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '4d635604-f7fc-4466-a6df-eefb1f96e70f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1400578039, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T09:44:49.925922', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '6d5b503c-a8e2-11f0-9707-fa163e99780b', 'monotonic_time': 11106.024665028, 'message_signature': '9388e8299a23b89729ec2bea16d2ec5ae1579726317dbe06600af8e05de3024b'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 174873109, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T09:44:49.925922', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '6d5b6162-a8e2-11f0-9707-fa163e99780b', 'monotonic_time': 11106.024665028, 'message_signature': 'e7a185fa96f4709afe88c2449c66e09e437f4bc21b28d4966c558af9a0b01015'}]}, 'timestamp': '2025-10-14 09:44:49.926778', '_unique_id': '63c4532b512a4a52a56ad236c452c324'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.927 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.927 12 ERROR oslo_messaging.notify.messaging yield Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.927 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.927 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.927 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.927 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.927 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.927 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.927 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 
05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.927 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.927 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.927 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.927 12 ERROR oslo_messaging.notify.messaging Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.927 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.927 12 ERROR oslo_messaging.notify.messaging Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.927 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.927 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 05:44:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.927 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.927 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.927 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.927 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.927 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.927 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.927 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.927 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.927 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.927 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:44:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.927 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.927 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.927 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.927 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.927 12 ERROR oslo_messaging.notify.messaging Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.928 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.928 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.read.bytes volume: 29130240 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.928 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.read.bytes volume: 4300800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 05:44:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.929 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ae6a189d-87c7-4dde-9c4c-c1d1e0c94107', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 29130240, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T09:44:49.928244', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '6d5ba7d0-a8e2-11f0-9707-fa163e99780b', 'monotonic_time': 11106.024665028, 'message_signature': 'c4efbb1dbc851db4aa37f29df90603a3f1cfa1269d8939ddb4365e52b4cdb55f'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 4300800, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': 
'2025-10-14T09:44:49.928244', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '6d5bb270-a8e2-11f0-9707-fa163e99780b', 'monotonic_time': 11106.024665028, 'message_signature': '1551b9dcff0ed7630e2b56452954baf2a8021b1886c220d0850d5e69e19d9ce0'}]}, 'timestamp': '2025-10-14 09:44:49.928820', '_unique_id': 'e17caa1197e74d10944cfbb9d966eddf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.929 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.929 12 ERROR oslo_messaging.notify.messaging yield Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.929 12 ERROR oslo_messaging.notify.messaging return retry_over_time( 
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.929 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.929 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.929 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.929 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.929 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.929 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.929 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.929 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.929 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.929 12 ERROR oslo_messaging.notify.messaging Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.929 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.929 12 ERROR oslo_messaging.notify.messaging Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.929 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.929 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.929 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.929 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.929 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.929 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.929 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.929 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.929 
12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.929 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.929 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.929 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.929 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.929 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.929 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.929 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.929 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.929 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.929 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.929 12 ERROR oslo_messaging.notify.messaging Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.930 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.930 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.write.latency volume: 217051898 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.930 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.write.latency volume: 24279644 
_stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.931 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f20e0712-e3a5-46d8-bafc-8b8efd711afc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 217051898, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T09:44:49.930362', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '6d5bfad2-a8e2-11f0-9707-fa163e99780b', 'monotonic_time': 11106.024665028, 'message_signature': 'd3d0870fec0299d304bcc2b9a416d586cb4f420b1ec37ddb9bf3c323796977f5'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 24279644, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': 
'41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T09:44:49.930362', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '6d5c069e-a8e2-11f0-9707-fa163e99780b', 'monotonic_time': 11106.024665028, 'message_signature': '557d0db827fa479d27eeace998ed7c99c4f6da14354cef1cf2c94476a7ebcce7'}]}, 'timestamp': '2025-10-14 09:44:49.930956', '_unique_id': '6d627801700749108bf34d2a8c37f1e2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.931 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.931 12 ERROR oslo_messaging.notify.messaging yield Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:44:49 
localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.931 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.931 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.931 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.931 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.931 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.931 12 ERROR oslo_messaging.notify.messaging 
self.transport.connect() Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.931 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.931 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.931 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.931 12 ERROR oslo_messaging.notify.messaging Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.931 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.931 12 ERROR oslo_messaging.notify.messaging Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.931 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.931 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 05:44:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.931 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.931 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.931 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.931 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.931 12 ERROR oslo_messaging.notify.messaging self.connection = 
connection_pool.get(retry=retry)
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.931 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.931 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.931 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.931 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.931 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.931 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.931 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.931 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.931 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.931 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.932 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.932 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.write.bytes volume: 73895936 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.932 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.933 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '702822ca-13f2-48fb-8a4f-47e06c5c6d75', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 73895936, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T09:44:49.932397', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '6d5c4a64-a8e2-11f0-9707-fa163e99780b', 'monotonic_time': 11106.024665028, 'message_signature': '3d19ad8ed08f4b1cb0dc4d4135bea384de7e85f0454d598acdc4596665661f23'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T09:44:49.932397', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '6d5c5626-a8e2-11f0-9707-fa163e99780b', 'monotonic_time': 11106.024665028, 'message_signature': '48bcb1ccd5cfb0e19b9eda29e52acddf611f13cb039ef13d17951f7fd8d9f29e'}]}, 'timestamp': '2025-10-14 09:44:49.932992', '_unique_id': 'b160123902e04c8a97e83d5da09466ae'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.933 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.933 12 ERROR oslo_messaging.notify.messaging yield
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.933 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.933 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.933 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.933 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.933 12 ERROR oslo_messaging.notify.messaging conn.connect()
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.933 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.933 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.933 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.933 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.933 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.933 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.933 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.933 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.933 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.933 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.933 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.933 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.933 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.933 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.933 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.933 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.933 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.933 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.933 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.933 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.933 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.933 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.933 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.933 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.934 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.934 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.934 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.935 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '823680a7-b420-44ec-8eb3-21ebb9f9994a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T09:44:49.934491', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '6d5ca158-a8e2-11f0-9707-fa163e99780b', 'monotonic_time': 11106.054244314, 'message_signature': 'c28de1ee0e9165cfe468342c450bcbbe61ecacf49247e29d717a184572011b84'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T09:44:49.934491', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '6d5cac2a-a8e2-11f0-9707-fa163e99780b', 'monotonic_time': 11106.054244314, 'message_signature': '3711851c45b0e9d2aa7a0375f9823a945615801ad8747fde747df71e9c9e353e'}]}, 'timestamp': '2025-10-14 09:44:49.935194', '_unique_id': '9d5d7d90ba564d808e0d961dd2f5f90c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.935 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.935 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.935 12 ERROR oslo_messaging.notify.messaging yield
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.935 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.935 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.935 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.935 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.935 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.935 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.935 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.935 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.935 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.935 12 ERROR oslo_messaging.notify.messaging conn.connect()
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.935 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.935 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.935 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.935 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.935 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.935 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.935 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.935 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.935 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.935 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.935 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.935 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.935 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.935 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.935 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.935 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.935 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.935 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.935 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.935 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.935 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.935 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.935 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.935 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.935 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.935 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.935 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.935 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.935 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.935 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.935 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.935 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.935 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.935 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.935 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.935 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.935 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.935 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.935 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.935 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.935 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.936 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.936 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.incoming.packets volume: 81 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.937 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ab8175ec-1836-4ad5-b229-9da430f07244', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 81, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T09:44:49.936709', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': '6d5cf3ce-a8e2-11f0-9707-fa163e99780b', 'monotonic_time': 11106.005038114, 'message_signature': '8fe28cdae49c3ebd4ffcfbf1fb28e33463e0614ad80a838d99e0502b2a440854'}]}, 'timestamp': '2025-10-14 09:44:49.937054', '_unique_id': '966f9672063d47618f50654b528116bc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.937 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.937 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.937 12 ERROR oslo_messaging.notify.messaging yield
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.937 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.937 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.937 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.937 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.937 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.937 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.937 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.937 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.937 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.937 12 ERROR oslo_messaging.notify.messaging conn.connect()
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.937 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.937 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.937 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.937 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.937 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.937 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.937 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.937 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.937 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.937 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.937 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.937 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.937 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.937 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.937 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.937 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.937 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.937 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.937 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.937 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.937 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.937 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.937 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.937 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.937 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.937 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.937 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.937 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14
09:44:49.937 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.937 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.937 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.937 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.937 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.937 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.937 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.937 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.937 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.937 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.937 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 05:44:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.937 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.937 12 ERROR oslo_messaging.notify.messaging Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.938 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.938 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.938 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.read.requests volume: 1064 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.938 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.read.requests volume: 222 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.939 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '153ea3be-5939-4fae-9742-68ffa42173a8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1064, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T09:44:49.938591', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '6d5d3d02-a8e2-11f0-9707-fa163e99780b', 'monotonic_time': 11106.024665028, 'message_signature': '35bd3ab3eccdd45b81eabaf6daab34a4ab3d48791509d5f9bcf751f5c38aea02'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 222, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T09:44:49.938591', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '6d5d47b6-a8e2-11f0-9707-fa163e99780b', 'monotonic_time': 11106.024665028, 'message_signature': '12201d9dc54bc40b81fbe8b7fccd85d02f7aa016bffeb3a8203ecbac3ddc27e7'}]}, 'timestamp': '2025-10-14 09:44:49.939174', '_unique_id': '32b6aa925f9c4948a5aec0732c9d55bb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.939 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.939 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.939 12 ERROR oslo_messaging.notify.messaging yield Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.939 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.939 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.939 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.939 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.939 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.939 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.939 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.939 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.939 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.939 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.939 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.939 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.939 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 
05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.939 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.939 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.939 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.939 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.939 12 ERROR oslo_messaging.notify.messaging Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.939 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.939 12 ERROR oslo_messaging.notify.messaging Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.939 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.939 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.939 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.939 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 05:44:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.939 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.939 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.939 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.939 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.939 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.939 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.939 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.939 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.939 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.939 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.939 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.939 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.939 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.939 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.939 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.939 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.939 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.939 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.939 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.939 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:44:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.939 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.939 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.939 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.939 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.939 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.939 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.939 12 ERROR oslo_messaging.notify.messaging Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.940 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.940 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.940 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.940 12 DEBUG 
ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/cpu volume: 60190000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.941 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'efcec54a-20a2-4e09-aa14-918c851b1728', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 60190000000, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'timestamp': '2025-10-14T09:44:49.940864', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '6d5d9540-a8e2-11f0-9707-fa163e99780b', 'monotonic_time': 11106.111245317, 'message_signature': 'ae664e0517f62ff13846f984211ab91adf30bc1e00bcc1e45bee2944076ea2a5'}]}, 'timestamp': '2025-10-14 09:44:49.941175', '_unique_id': '610f408187cf49db98617da0561f822a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 
14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.941 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.941 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.941 12 ERROR oslo_messaging.notify.messaging yield Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.941 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.941 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.941 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.941 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.941 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.941 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.941 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.941 12 ERROR oslo_messaging.notify.messaging conn = 
self.transport.establish_connection() Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.941 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.941 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.941 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.941 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.941 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.941 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.941 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.941 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.941 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.941 12 ERROR oslo_messaging.notify.messaging Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.941 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: 
Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.941 12 ERROR oslo_messaging.notify.messaging Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.941 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.941 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.941 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.941 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.941 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.941 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.941 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.941 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.941 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 
09:44:49.941 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.941 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.941 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.941 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.941 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.941 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.941 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.941 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.941 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.941 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.941 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.941 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.941 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.941 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.941 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.941 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.941 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.941 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.941 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.941 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.941 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 
05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.941 12 ERROR oslo_messaging.notify.messaging Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.942 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.942 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.outgoing.packets volume: 129 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.943 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cb678663-86c7-4c2d-8e36-6eae57ba32c2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 129, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T09:44:49.942609', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 
'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': '6d5dda6e-a8e2-11f0-9707-fa163e99780b', 'monotonic_time': 11106.005038114, 'message_signature': '9ed8feafed13b7b13bea87b8af19a7cb44f2b0c63db01b49e3299b511f927c95'}]}, 'timestamp': '2025-10-14 09:44:49.942953', '_unique_id': '7ce1e20a740f4b52b5e588c559d7d1d2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.943 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.943 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.943 12 ERROR oslo_messaging.notify.messaging yield Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.943 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.943 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.943 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.943 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.943 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.943 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.943 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.943 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.943 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.943 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.943 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.943 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.943 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.943 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.943 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 05:44:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.943 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.943 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.943 12 ERROR oslo_messaging.notify.messaging Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.943 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.943 12 ERROR oslo_messaging.notify.messaging Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.943 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.943 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.943 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.943 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.943 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.943 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 05:44:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.943 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.943 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.943 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.943 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.943 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.943 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.943 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.943 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.943 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.943 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 05:44:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.943 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.943 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.943 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.943 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.943 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.943 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.943 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.943 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.943 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.943 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.943 12 ERROR 
oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.943 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.943 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.943 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:44:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:44:49.943 12 ERROR oslo_messaging.notify.messaging Oct 14 05:44:49 localhost systemd[1]: var-lib-containers-storage-overlay-a10bb81cada1063fdd09337579a73ba5c07dabd1b81c2bfe70924b91722bf534-merged.mount: Deactivated successfully. Oct 14 05:44:50 localhost nova_compute[238069]: 2025-10-14 09:44:50.059 2 DEBUG oslo_concurrency.processutils [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.420s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Oct 14 05:44:50 localhost nova_compute[238069]: 2025-10-14 09:44:50.140 2 DEBUG nova.virt.libvirt.driver [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Oct 14 05:44:50 localhost nova_compute[238069]: 2025-10-14 09:44:50.140 2 DEBUG nova.virt.libvirt.driver [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Oct 14 05:44:50 
localhost nova_compute[238069]: 2025-10-14 09:44:50.334 2 WARNING nova.virt.libvirt.driver [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m Oct 14 05:44:50 localhost nova_compute[238069]: 2025-10-14 09:44:50.335 2 DEBUG nova.compute.resource_tracker [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Hypervisor/Node resource view: name=np0005486733.localdomain free_ram=12419MB free_disk=41.83725357055664GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", 
"address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Oct 14 05:44:50 localhost nova_compute[238069]: 2025-10-14 09:44:50.335 2 DEBUG oslo_concurrency.lockutils [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Oct 14 05:44:50 localhost nova_compute[238069]: 2025-10-14 09:44:50.336 2 DEBUG oslo_concurrency.lockutils [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Oct 14 05:44:50 localhost nova_compute[238069]: 2025-10-14 09:44:50.426 2 DEBUG nova.compute.resource_tracker [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Instance 88c4e366-b765-47a6-96bf-f7677f2ce67c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Oct 14 05:44:50 localhost nova_compute[238069]: 2025-10-14 09:44:50.427 2 DEBUG nova.compute.resource_tracker [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Oct 14 05:44:50 localhost nova_compute[238069]: 2025-10-14 09:44:50.427 2 DEBUG nova.compute.resource_tracker [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Final resource view: name=np0005486733.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Oct 14 05:44:50 localhost nova_compute[238069]: 2025-10-14 09:44:50.441 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:44:50 localhost nova_compute[238069]: 2025-10-14 09:44:50.468 2 DEBUG oslo_concurrency.processutils [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Oct 14 05:44:50 localhost systemd[1]: var-lib-containers-storage-overlay-182f4b56e6e8809f2ffde261aea7a82f597fbc875533d1efd7f59fe7c8a139ed-merged.mount: Deactivated successfully. Oct 14 05:44:50 localhost systemd[1]: var-lib-containers-storage-overlay-ab64777085904da680013c790c3f2c65f0b954578737ec4d7fa836f56655c34a-merged.mount: Deactivated successfully. Oct 14 05:44:50 localhost systemd[1]: var-lib-containers-storage-overlay-0accaf46e2ca98f20a95b21cea4fb623de0e5378cb14b163bca0a8771d84c861-merged.mount: Deactivated successfully. 
Oct 14 05:44:50 localhost nova_compute[238069]: 2025-10-14 09:44:50.926 2 DEBUG oslo_concurrency.processutils [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.457s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Oct 14 05:44:50 localhost nova_compute[238069]: 2025-10-14 09:44:50.933 2 DEBUG nova.compute.provider_tree [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Inventory has not changed in ProviderTree for provider: 18c24273-aca2-4f08-be57-3188d558235e update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Oct 14 05:44:50 localhost nova_compute[238069]: 2025-10-14 09:44:50.948 2 DEBUG nova.scheduler.client.report [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Inventory has not changed for provider 18c24273-aca2-4f08-be57-3188d558235e based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Oct 14 05:44:50 localhost nova_compute[238069]: 2025-10-14 09:44:50.949 2 DEBUG nova.compute.resource_tracker [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Compute_service record updated for np0005486733.localdomain:np0005486733.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Oct 14 05:44:50 localhost nova_compute[238069]: 2025-10-14 09:44:50.949 2 DEBUG oslo_concurrency.lockutils [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Lock "compute_resources" "released" by 
"nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.614s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Oct 14 05:44:51 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Oct 14 05:44:51 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Oct 14 05:44:51 localhost systemd[1]: var-lib-containers-storage-overlay-0accaf46e2ca98f20a95b21cea4fb623de0e5378cb14b163bca0a8771d84c861-merged.mount: Deactivated successfully. Oct 14 05:44:51 localhost systemd[1]: var-lib-containers-storage-overlay-9742bc003ef67d7b0fcfbbfc43cb220e80e11c29322c24576aca8c567b6093e5-merged.mount: Deactivated successfully. Oct 14 05:44:51 localhost systemd[1]: var-lib-containers-storage-overlay-f3f40f6483bf6d587286da9e86e40878c2aaaf723da5aa2364fff24f5ea28424-merged.mount: Deactivated successfully. Oct 14 05:44:51 localhost systemd[1]: var-lib-containers-storage-overlay-ab64777085904da680013c790c3f2c65f0b954578737ec4d7fa836f56655c34a-merged.mount: Deactivated successfully. Oct 14 05:44:51 localhost systemd[1]: var-lib-containers-storage-overlay-ab64777085904da680013c790c3f2c65f0b954578737ec4d7fa836f56655c34a-merged.mount: Deactivated successfully. Oct 14 05:44:51 localhost nova_compute[238069]: 2025-10-14 09:44:51.929 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:44:52 localhost systemd[1]: var-lib-containers-storage-overlay-41d6d78d48a59c2a92b7ebbd672b507950bf0a9c199b961ef8dec56e0bf4d10d-merged.mount: Deactivated successfully. Oct 14 05:44:52 localhost systemd[1]: var-lib-containers-storage-overlay-ab64777085904da680013c790c3f2c65f0b954578737ec4d7fa836f56655c34a-merged.mount: Deactivated successfully. 
Oct 14 05:44:52 localhost systemd[1]: var-lib-containers-storage-overlay-9353b4c9b77a60c02d5cd3c8f9b94918c7a607156d2f7e1365b30ffe1fa49c89-merged.mount: Deactivated successfully. Oct 14 05:44:52 localhost systemd[1]: var-lib-containers-storage-overlay-9353b4c9b77a60c02d5cd3c8f9b94918c7a607156d2f7e1365b30ffe1fa49c89-merged.mount: Deactivated successfully. Oct 14 05:44:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d. Oct 14 05:44:52 localhost podman[250609]: 2025-10-14 09:44:52.996262078 +0000 UTC m=+0.084403422 container health_status 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=unhealthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, managed_by=edpm_ansible, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS, config_id=edpm, org.label-schema.schema-version=1.0, tcib_managed=true) Oct 14 05:44:53 localhost podman[250609]: 2025-10-14 09:44:53.025811093 +0000 UTC m=+0.113952417 container exec_died 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}) Oct 14 05:44:53 localhost podman[250609]: unhealthy Oct 14 05:44:53 localhost systemd[1]: var-lib-containers-storage-overlay-f3f40f6483bf6d587286da9e86e40878c2aaaf723da5aa2364fff24f5ea28424-merged.mount: Deactivated successfully. Oct 14 05:44:53 localhost systemd[1]: var-lib-containers-storage-overlay-f3f40f6483bf6d587286da9e86e40878c2aaaf723da5aa2364fff24f5ea28424-merged.mount: Deactivated successfully. Oct 14 05:44:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48249 DF PROTO=TCP SPT=37338 DPT=9102 SEQ=3995250166 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B0E3AC50000000001030307) Oct 14 05:44:53 localhost systemd[1]: var-lib-containers-storage-overlay-9dce2160573984ba54f17e563b839daf8c243479b9d2f49c1195fe30690bd2c9-merged.mount: Deactivated successfully. Oct 14 05:44:53 localhost systemd[1]: 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d.service: Main process exited, code=exited, status=1/FAILURE Oct 14 05:44:53 localhost systemd[1]: 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d.service: Failed with result 'exit-code'. 
Oct 14 05:44:53 localhost podman[250553]: 2025-10-14 09:44:53.896038512 +0000 UTC m=+8.231601887 container health_status 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, managed_by=edpm_ansible, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true) Oct 14 05:44:53 localhost podman[250553]: 2025-10-14 09:44:53.900841101 +0000 UTC 
m=+8.236404496 container exec_died 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2) Oct 14 05:44:54 localhost systemd[1]: var-lib-containers-storage-overlay-41d6d78d48a59c2a92b7ebbd672b507950bf0a9c199b961ef8dec56e0bf4d10d-merged.mount: Deactivated successfully. 
Oct 14 05:44:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48250 DF PROTO=TCP SPT=37338 DPT=9102 SEQ=3995250166 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B0E3EDA0000000001030307) Oct 14 05:44:55 localhost nova_compute[238069]: 2025-10-14 09:44:55.444 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:44:56 localhost systemd[1]: var-lib-containers-storage-overlay-93c0822e715760ae283b5dfa3c054d7f162a497c51033e354a5256453c1ce67c-merged.mount: Deactivated successfully. Oct 14 05:44:56 localhost systemd[1]: var-lib-containers-storage-overlay-b6fff9c8e433cbfc969f016d7c00977424b6e0fe3f5e8a6774343b30e6ab0953-merged.mount: Deactivated successfully. Oct 14 05:44:56 localhost systemd[1]: var-lib-containers-storage-overlay-b6fff9c8e433cbfc969f016d7c00977424b6e0fe3f5e8a6774343b30e6ab0953-merged.mount: Deactivated successfully. Oct 14 05:44:56 localhost systemd[1]: 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857.service: Deactivated successfully. 
Oct 14 05:44:56 localhost nova_compute[238069]: 2025-10-14 09:44:56.977 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:44:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:44:57.750 163055 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Oct 14 05:44:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:44:57.751 163055 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Oct 14 05:44:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:44:57.753 163055 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Oct 14 05:44:58 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48646 DF PROTO=TCP SPT=44758 DPT=9882 SEQ=3152027736 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B0E4D8A0000000001030307) Oct 14 05:44:58 localhost systemd[1]: var-lib-containers-storage-overlay-8d123e2bf97cc7b3622c68162b04c29912e1822cdbe31a1ddf70016995925bac-merged.mount: Deactivated successfully. Oct 14 05:44:58 localhost systemd[1]: var-lib-containers-storage-overlay-b6fff9c8e433cbfc969f016d7c00977424b6e0fe3f5e8a6774343b30e6ab0953-merged.mount: Deactivated successfully. 
Oct 14 05:44:58 localhost systemd[1]: var-lib-containers-storage-overlay-93c0822e715760ae283b5dfa3c054d7f162a497c51033e354a5256453c1ce67c-merged.mount: Deactivated successfully. Oct 14 05:44:58 localhost systemd[1]: var-lib-containers-storage-overlay-b6fff9c8e433cbfc969f016d7c00977424b6e0fe3f5e8a6774343b30e6ab0953-merged.mount: Deactivated successfully. Oct 14 05:44:58 localhost podman[248187]: time="2025-10-14T09:44:58Z" level=error msg="Getting root fs size for \"2ad1916047d01d96a60b1f5a470426e33b626a18b065b35f6dd0e1af52e6181e\": getting diffsize of layer \"93c0822e715760ae283b5dfa3c054d7f162a497c51033e354a5256453c1ce67c\" and its parent \"8d123e2bf97cc7b3622c68162b04c29912e1822cdbe31a1ddf70016995925bac\": unmounting layer 93c0822e715760ae283b5dfa3c054d7f162a497c51033e354a5256453c1ce67c: replacing mount point \"/var/lib/containers/storage/overlay/93c0822e715760ae283b5dfa3c054d7f162a497c51033e354a5256453c1ce67c/merged\": device or resource busy" Oct 14 05:44:58 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Oct 14 05:44:58 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Oct 14 05:44:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041. 
Oct 14 05:44:59 localhost podman[250631]: 2025-10-14 09:44:59.75822118 +0000 UTC m=+0.094813497 container health_status 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=unhealthy, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm) Oct 14 05:44:59 localhost podman[250631]: 2025-10-14 09:44:59.792502584 +0000 UTC m=+0.129094841 container exec_died 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': 
['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Oct 14 05:44:59 localhost podman[250631]: unhealthy Oct 14 05:45:00 localhost nova_compute[238069]: 2025-10-14 09:45:00.452 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:45:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48252 DF PROTO=TCP SPT=37338 DPT=9102 SEQ=3995250166 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B0E569A0000000001030307) Oct 14 05:45:01 localhost systemd[1]: var-lib-containers-storage-overlay-8d123e2bf97cc7b3622c68162b04c29912e1822cdbe31a1ddf70016995925bac-merged.mount: Deactivated successfully. Oct 14 05:45:01 localhost systemd[1]: var-lib-containers-storage-overlay-b6fff9c8e433cbfc969f016d7c00977424b6e0fe3f5e8a6774343b30e6ab0953-merged.mount: Deactivated successfully. Oct 14 05:45:01 localhost systemd[1]: var-lib-containers-storage-overlay-26c7b014151266d081bb0d73c08f9962548726db348c3c8122e5e1462f16ca73-merged.mount: Deactivated successfully. Oct 14 05:45:01 localhost systemd[1]: var-lib-containers-storage-overlay-93c0822e715760ae283b5dfa3c054d7f162a497c51033e354a5256453c1ce67c-merged.mount: Deactivated successfully. Oct 14 05:45:01 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Oct 14 05:45:01 localhost systemd[1]: 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041.service: Main process exited, code=exited, status=1/FAILURE Oct 14 05:45:01 localhost systemd[1]: 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041.service: Failed with result 'exit-code'. 
Oct 14 05:45:02 localhost nova_compute[238069]: 2025-10-14 09:45:02.024 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:45:02 localhost systemd[1]: var-lib-containers-storage-overlay-26c7b014151266d081bb0d73c08f9962548726db348c3c8122e5e1462f16ca73-merged.mount: Deactivated successfully. Oct 14 05:45:03 localhost systemd[1]: var-lib-containers-storage-overlay-d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610-merged.mount: Deactivated successfully. Oct 14 05:45:03 localhost systemd[1]: var-lib-containers-storage-overlay-8d123e2bf97cc7b3622c68162b04c29912e1822cdbe31a1ddf70016995925bac-merged.mount: Deactivated successfully. Oct 14 05:45:03 localhost systemd[1]: var-lib-containers-storage-overlay-8d123e2bf97cc7b3622c68162b04c29912e1822cdbe31a1ddf70016995925bac-merged.mount: Deactivated successfully. Oct 14 05:45:04 localhost systemd[1]: var-lib-containers-storage-overlay-948d63d72c90238568600bb4ced3a347f3a772760aabfa54019ccce9078bd0ca-merged.mount: Deactivated successfully. Oct 14 05:45:04 localhost systemd[1]: var-lib-containers-storage-overlay-919c2496449756819846525fbfb351457636bf59d0964ccd47919cff1ec5dc94-merged.mount: Deactivated successfully. Oct 14 05:45:04 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Oct 14 05:45:04 localhost systemd[1]: var-lib-containers-storage-overlay-919c2496449756819846525fbfb351457636bf59d0964ccd47919cff1ec5dc94-merged.mount: Deactivated successfully. Oct 14 05:45:04 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Oct 14 05:45:04 localhost systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully. 
Oct 14 05:45:05 localhost systemd[1]: var-lib-containers-storage-overlay-e0f86229f02c4331620c9ec8e21be769ac9cff125fc1f01f8404fcae9b59e3df-merged.mount: Deactivated successfully. Oct 14 05:45:05 localhost systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully. Oct 14 05:45:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29. Oct 14 05:45:05 localhost systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully. Oct 14 05:45:05 localhost podman[250654]: 2025-10-14 09:45:05.416479179 +0000 UTC m=+0.085884148 container health_status fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, org.label-schema.build-date=20251009, managed_by=edpm_ansible, config_id=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', 
'/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image) Oct 14 05:45:05 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48649 DF PROTO=TCP SPT=44758 DPT=9882 SEQ=3152027736 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B0E695B0000000001030307) Oct 14 05:45:05 localhost podman[250654]: 2025-10-14 09:45:05.42928552 +0000 UTC m=+0.098690529 container exec_died fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, org.label-schema.license=GPLv2, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, managed_by=edpm_ansible, config_id=iscsid, container_name=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS) Oct 14 05:45:05 localhost nova_compute[238069]: 2025-10-14 09:45:05.457 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:45:06 localhost systemd[1]: var-lib-containers-storage-overlay-948d63d72c90238568600bb4ced3a347f3a772760aabfa54019ccce9078bd0ca-merged.mount: Deactivated successfully. Oct 14 05:45:06 localhost systemd[1]: var-lib-containers-storage-overlay-d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610-merged.mount: Deactivated successfully. Oct 14 05:45:06 localhost systemd[1]: var-lib-containers-storage-overlay-948d63d72c90238568600bb4ced3a347f3a772760aabfa54019ccce9078bd0ca-merged.mount: Deactivated successfully. Oct 14 05:45:06 localhost systemd[1]: fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29.service: Deactivated successfully. Oct 14 05:45:07 localhost nova_compute[238069]: 2025-10-14 09:45:07.070 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:45:07 localhost systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully. Oct 14 05:45:07 localhost systemd[1]: var-lib-containers-storage-overlay-d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610-merged.mount: Deactivated successfully. 
Oct 14 05:45:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec. Oct 14 05:45:07 localhost podman[250673]: 2025-10-14 09:45:07.630838944 +0000 UTC m=+0.051828363 container health_status aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Oct 14 05:45:07 localhost podman[250673]: 2025-10-14 09:45:07.665643012 +0000 UTC m=+0.086632451 container exec_died aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, 
config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Oct 14 05:45:08 localhost systemd[1]: var-lib-containers-storage-overlay-d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610-merged.mount: Deactivated successfully. Oct 14 05:45:08 localhost systemd[1]: var-lib-containers-storage-overlay-b6fff9c8e433cbfc969f016d7c00977424b6e0fe3f5e8a6774343b30e6ab0953-merged.mount: Deactivated successfully. Oct 14 05:45:08 localhost systemd[1]: aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec.service: Deactivated successfully. 
Oct 14 05:45:08 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=4914 DF PROTO=TCP SPT=39274 DPT=9105 SEQ=3975453222 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B0E766E0000000001030307) Oct 14 05:45:09 localhost systemd[1]: var-lib-containers-storage-overlay-e0f86229f02c4331620c9ec8e21be769ac9cff125fc1f01f8404fcae9b59e3df-merged.mount: Deactivated successfully. Oct 14 05:45:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad. Oct 14 05:45:09 localhost systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully. Oct 14 05:45:09 localhost systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully. 
Oct 14 05:45:09 localhost podman[250695]: 2025-10-14 09:45:09.317814447 +0000 UTC m=+0.101788067 container health_status 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, managed_by=edpm_ansible) Oct 14 05:45:09 localhost podman[250695]: 2025-10-14 09:45:09.327073546 +0000 UTC m=+0.111047146 container exec_died 
02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=multipathd, container_name=multipathd) Oct 14 05:45:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=4915 DF PROTO=TCP SPT=39274 DPT=9105 SEQ=3975453222 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT 
(020405500402080A2B0E7A5B0000000001030307) Oct 14 05:45:10 localhost systemd[1]: tmp-crun.RfAYYS.mount: Deactivated successfully. Oct 14 05:45:10 localhost nova_compute[238069]: 2025-10-14 09:45:10.460 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:45:10 localhost systemd[1]: var-lib-containers-storage-overlay-3d32571c90c517218e75b400153bfe2946f348989aeee2613f1e17f32183ce41-merged.mount: Deactivated successfully. Oct 14 05:45:10 localhost systemd[1]: var-lib-containers-storage-overlay-9dce2160573984ba54f17e563b839daf8c243479b9d2f49c1195fe30690bd2c9-merged.mount: Deactivated successfully. Oct 14 05:45:10 localhost systemd[1]: var-lib-containers-storage-overlay-9dce2160573984ba54f17e563b839daf8c243479b9d2f49c1195fe30690bd2c9-merged.mount: Deactivated successfully. Oct 14 05:45:10 localhost systemd[1]: 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad.service: Deactivated successfully. Oct 14 05:45:11 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=4916 DF PROTO=TCP SPT=39274 DPT=9105 SEQ=3975453222 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B0E825A0000000001030307) Oct 14 05:45:12 localhost nova_compute[238069]: 2025-10-14 09:45:12.107 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:45:12 localhost systemd[1]: var-lib-containers-storage-overlay-a10bb81cada1063fdd09337579a73ba5c07dabd1b81c2bfe70924b91722bf534-merged.mount: Deactivated successfully. Oct 14 05:45:12 localhost systemd[1]: var-lib-containers-storage-overlay-3d32571c90c517218e75b400153bfe2946f348989aeee2613f1e17f32183ce41-merged.mount: Deactivated successfully. 
Oct 14 05:45:12 localhost systemd[1]: var-lib-containers-storage-overlay-3d32571c90c517218e75b400153bfe2946f348989aeee2613f1e17f32183ce41-merged.mount: Deactivated successfully. Oct 14 05:45:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917. Oct 14 05:45:13 localhost podman[250711]: 2025-10-14 09:45:13.488115089 +0000 UTC m=+0.075125931 container health_status 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, managed_by=edpm_ansible, release=1755695350, config_id=edpm, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc.) 
Oct 14 05:45:13 localhost podman[250711]: 2025-10-14 09:45:13.504185292 +0000 UTC m=+0.091196224 container exec_died 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, architecture=x86_64, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, release=1755695350, io.buildah.version=1.33.7, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, version=9.6, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container) Oct 14 05:45:13 localhost systemd[1]: var-lib-containers-storage-overlay-919c2496449756819846525fbfb351457636bf59d0964ccd47919cff1ec5dc94-merged.mount: Deactivated successfully. Oct 14 05:45:13 localhost systemd[1]: var-lib-containers-storage-overlay-14055c4da6e79e5c651346093268013a82ff4f993f6084d00241e7b8936c8586-merged.mount: Deactivated successfully. Oct 14 05:45:14 localhost systemd[1]: var-lib-containers-storage-overlay-0accaf46e2ca98f20a95b21cea4fb623de0e5378cb14b163bca0a8771d84c861-merged.mount: Deactivated successfully. Oct 14 05:45:14 localhost systemd[1]: var-lib-containers-storage-overlay-a10bb81cada1063fdd09337579a73ba5c07dabd1b81c2bfe70924b91722bf534-merged.mount: Deactivated successfully. Oct 14 05:45:14 localhost systemd[1]: var-lib-containers-storage-overlay-a10bb81cada1063fdd09337579a73ba5c07dabd1b81c2bfe70924b91722bf534-merged.mount: Deactivated successfully. Oct 14 05:45:14 localhost systemd[1]: 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917.service: Deactivated successfully. 
Oct 14 05:45:15 localhost nova_compute[238069]: 2025-10-14 09:45:15.462 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:45:15 localhost systemd[1]: var-lib-containers-storage-overlay-55d5530fe8468c8c9907e0aa1de030811941604fa5f46de3db6dc15ec40906dd-merged.mount: Deactivated successfully. Oct 14 05:45:15 localhost systemd[1]: var-lib-containers-storage-overlay-ae0ebe7656e29542866ff018f5be9a3d02c88268a65814cf045e1b6c30ffd352-merged.mount: Deactivated successfully. Oct 14 05:45:15 localhost systemd[1]: var-lib-containers-storage-overlay-ab64777085904da680013c790c3f2c65f0b954578737ec4d7fa836f56655c34a-merged.mount: Deactivated successfully. Oct 14 05:45:15 localhost systemd[1]: var-lib-containers-storage-overlay-0accaf46e2ca98f20a95b21cea4fb623de0e5378cb14b163bca0a8771d84c861-merged.mount: Deactivated successfully. Oct 14 05:45:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f. 
Oct 14 05:45:15 localhost podman[250731]: 2025-10-14 09:45:15.807112877 +0000 UTC m=+0.066723209 container health_status 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, io.buildah.version=1.41.3) Oct 14 05:45:15 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=4917 DF PROTO=TCP SPT=39274 DPT=9105 SEQ=3975453222 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B0E921B0000000001030307) Oct 14 05:45:15 localhost podman[250731]: 2025-10-14 09:45:15.871033947 +0000 UTC m=+0.130644269 container exec_died 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f 
(image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5) Oct 14 05:45:16 localhost systemd[1]: 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f.service: Deactivated successfully. Oct 14 05:45:16 localhost systemd[1]: var-lib-containers-storage-overlay-ab64777085904da680013c790c3f2c65f0b954578737ec4d7fa836f56655c34a-merged.mount: Deactivated successfully. Oct 14 05:45:16 localhost systemd[1]: var-lib-containers-storage-overlay-ab64777085904da680013c790c3f2c65f0b954578737ec4d7fa836f56655c34a-merged.mount: Deactivated successfully. 
Oct 14 05:45:17 localhost nova_compute[238069]: 2025-10-14 09:45:17.108 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:45:17 localhost systemd[1]: var-lib-containers-storage-overlay-f3f40f6483bf6d587286da9e86e40878c2aaaf723da5aa2364fff24f5ea28424-merged.mount: Deactivated successfully. Oct 14 05:45:17 localhost systemd[1]: var-lib-containers-storage-overlay-ae0ebe7656e29542866ff018f5be9a3d02c88268a65814cf045e1b6c30ffd352-merged.mount: Deactivated successfully. Oct 14 05:45:18 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24650 DF PROTO=TCP SPT=43492 DPT=9101 SEQ=334800655 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B0E9B1B0000000001030307) Oct 14 05:45:19 localhost systemd[1]: var-lib-containers-storage-overlay-9dce2160573984ba54f17e563b839daf8c243479b9d2f49c1195fe30690bd2c9-merged.mount: Deactivated successfully. Oct 14 05:45:20 localhost systemd[1]: var-lib-containers-storage-overlay-948d63d72c90238568600bb4ced3a347f3a772760aabfa54019ccce9078bd0ca-merged.mount: Deactivated successfully. Oct 14 05:45:20 localhost systemd[1]: var-lib-containers-storage-overlay-919c2496449756819846525fbfb351457636bf59d0964ccd47919cff1ec5dc94-merged.mount: Deactivated successfully. Oct 14 05:45:20 localhost nova_compute[238069]: 2025-10-14 09:45:20.470 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:45:20 localhost systemd[1]: var-lib-containers-storage-overlay-919c2496449756819846525fbfb351457636bf59d0964ccd47919cff1ec5dc94-merged.mount: Deactivated successfully. Oct 14 05:45:21 localhost systemd[1]: var-lib-containers-storage-overlay-93c0822e715760ae283b5dfa3c054d7f162a497c51033e354a5256453c1ce67c-merged.mount: Deactivated successfully. 
Oct 14 05:45:21 localhost systemd[1]: var-lib-containers-storage-overlay-b6fff9c8e433cbfc969f016d7c00977424b6e0fe3f5e8a6774343b30e6ab0953-merged.mount: Deactivated successfully. Oct 14 05:45:21 localhost systemd[1]: var-lib-containers-storage-overlay-b6fff9c8e433cbfc969f016d7c00977424b6e0fe3f5e8a6774343b30e6ab0953-merged.mount: Deactivated successfully. Oct 14 05:45:22 localhost nova_compute[238069]: 2025-10-14 09:45:22.159 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:45:22 localhost systemd[1]: var-lib-containers-storage-overlay-d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610-merged.mount: Deactivated successfully. Oct 14 05:45:22 localhost systemd[1]: var-lib-containers-storage-overlay-948d63d72c90238568600bb4ced3a347f3a772760aabfa54019ccce9078bd0ca-merged.mount: Deactivated successfully. Oct 14 05:45:22 localhost systemd[1]: var-lib-containers-storage-overlay-948d63d72c90238568600bb4ced3a347f3a772760aabfa54019ccce9078bd0ca-merged.mount: Deactivated successfully. Oct 14 05:45:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27688 DF PROTO=TCP SPT=42216 DPT=9102 SEQ=2289261163 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B0EAFF50000000001030307) Oct 14 05:45:24 localhost systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully. Oct 14 05:45:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d. Oct 14 05:45:24 localhost systemd[1]: var-lib-containers-storage-overlay-d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610-merged.mount: Deactivated successfully. 
Oct 14 05:45:24 localhost systemd[1]: var-lib-containers-storage-overlay-8d123e2bf97cc7b3622c68162b04c29912e1822cdbe31a1ddf70016995925bac-merged.mount: Deactivated successfully. Oct 14 05:45:24 localhost podman[250757]: 2025-10-14 09:45:24.175963117 +0000 UTC m=+0.090397730 container health_status 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=unhealthy, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, org.label-schema.license=GPLv2, 
tcib_managed=true, org.label-schema.schema-version=1.0) Oct 14 05:45:24 localhost podman[250757]: 2025-10-14 09:45:24.185222856 +0000 UTC m=+0.099657489 container exec_died 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_id=edpm) Oct 14 05:45:24 localhost podman[250757]: unhealthy Oct 14 05:45:24 localhost systemd[1]: 
var-lib-containers-storage-overlay-93c0822e715760ae283b5dfa3c054d7f162a497c51033e354a5256453c1ce67c-merged.mount: Deactivated successfully. Oct 14 05:45:24 localhost systemd[1]: var-lib-containers-storage-overlay-93c0822e715760ae283b5dfa3c054d7f162a497c51033e354a5256453c1ce67c-merged.mount: Deactivated successfully. Oct 14 05:45:24 localhost systemd[1]: 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d.service: Main process exited, code=exited, status=1/FAILURE Oct 14 05:45:24 localhost systemd[1]: 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d.service: Failed with result 'exit-code'. Oct 14 05:45:24 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Oct 14 05:45:24 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Oct 14 05:45:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27689 DF PROTO=TCP SPT=42216 DPT=9102 SEQ=2289261163 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B0EB41A0000000001030307) Oct 14 05:45:25 localhost systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully. Oct 14 05:45:25 localhost nova_compute[238069]: 2025-10-14 09:45:25.471 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:45:25 localhost systemd[1]: var-lib-containers-storage-overlay-e0f86229f02c4331620c9ec8e21be769ac9cff125fc1f01f8404fcae9b59e3df-merged.mount: Deactivated successfully. 
Oct 14 05:45:25 localhost systemd[1]: var-lib-containers-storage-overlay-e0f86229f02c4331620c9ec8e21be769ac9cff125fc1f01f8404fcae9b59e3df-merged.mount: Deactivated successfully. Oct 14 05:45:26 localhost systemd[1]: var-lib-containers-storage-overlay-d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610-merged.mount: Deactivated successfully. Oct 14 05:45:26 localhost systemd[1]: var-lib-containers-storage-overlay-8d123e2bf97cc7b3622c68162b04c29912e1822cdbe31a1ddf70016995925bac-merged.mount: Deactivated successfully. Oct 14 05:45:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857. Oct 14 05:45:26 localhost podman[250774]: 2025-10-14 09:45:26.5884476 +0000 UTC m=+0.105369788 container health_status 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, tcib_managed=true) Oct 14 05:45:26 localhost podman[250774]: 2025-10-14 09:45:26.593769317 +0000 UTC m=+0.110691505 container exec_died 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251009, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2) Oct 14 05:45:27 localhost nova_compute[238069]: 2025-10-14 09:45:27.197 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:45:27 localhost systemd[1]: tmp-crun.qiB5uj.mount: Deactivated successfully. Oct 14 05:45:28 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41762 DF PROTO=TCP SPT=60532 DPT=9882 SEQ=1963461340 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B0EC2BB0000000001030307) Oct 14 05:45:28 localhost systemd[1]: var-lib-containers-storage-overlay-919c2496449756819846525fbfb351457636bf59d0964ccd47919cff1ec5dc94-merged.mount: Deactivated successfully. Oct 14 05:45:28 localhost systemd[1]: var-lib-containers-storage-overlay-705239d69edbb97c498d74570c74cd8434e37024eb25add7e89f19b22fc90898-merged.mount: Deactivated successfully. Oct 14 05:45:28 localhost systemd[1]: var-lib-containers-storage-overlay-705239d69edbb97c498d74570c74cd8434e37024eb25add7e89f19b22fc90898-merged.mount: Deactivated successfully. Oct 14 05:45:28 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. 
Oct 14 05:45:28 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Oct 14 05:45:28 localhost systemd[1]: 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857.service: Deactivated successfully. Oct 14 05:45:29 localhost systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully. Oct 14 05:45:29 localhost systemd[1]: var-lib-containers-storage-overlay-d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610-merged.mount: Deactivated successfully. Oct 14 05:45:29 localhost systemd[1]: var-lib-containers-storage-overlay-d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610-merged.mount: Deactivated successfully. Oct 14 05:45:30 localhost nova_compute[238069]: 2025-10-14 09:45:30.474 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:45:30 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27691 DF PROTO=TCP SPT=42216 DPT=9102 SEQ=2289261163 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B0ECBDA0000000001030307) Oct 14 05:45:31 localhost systemd[1]: var-lib-containers-storage-overlay-948d63d72c90238568600bb4ced3a347f3a772760aabfa54019ccce9078bd0ca-merged.mount: Deactivated successfully. Oct 14 05:45:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041. Oct 14 05:45:31 localhost systemd[1]: var-lib-containers-storage-overlay-919c2496449756819846525fbfb351457636bf59d0964ccd47919cff1ec5dc94-merged.mount: Deactivated successfully. 
Oct 14 05:45:31 localhost systemd[1]: var-lib-containers-storage-overlay-919c2496449756819846525fbfb351457636bf59d0964ccd47919cff1ec5dc94-merged.mount: Deactivated successfully. Oct 14 05:45:31 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Oct 14 05:45:31 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Oct 14 05:45:31 localhost podman[250841]: 2025-10-14 09:45:31.776766945 +0000 UTC m=+0.414193281 container health_status 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=unhealthy, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Oct 14 05:45:31 localhost podman[250841]: 2025-10-14 09:45:31.815257819 +0000 UTC m=+0.452684175 container exec_died 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 
'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Oct 14 05:45:31 localhost podman[250841]: unhealthy Oct 14 05:45:32 localhost nova_compute[238069]: 2025-10-14 09:45:32.247 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:45:32 localhost systemd[1]: var-lib-containers-storage-overlay-e0f86229f02c4331620c9ec8e21be769ac9cff125fc1f01f8404fcae9b59e3df-merged.mount: Deactivated successfully. Oct 14 05:45:32 localhost systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully. Oct 14 05:45:32 localhost systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully. Oct 14 05:45:33 localhost systemd[1]: var-lib-containers-storage-overlay-d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610-merged.mount: Deactivated successfully. Oct 14 05:45:33 localhost systemd[1]: var-lib-containers-storage-overlay-948d63d72c90238568600bb4ced3a347f3a772760aabfa54019ccce9078bd0ca-merged.mount: Deactivated successfully. 
Oct 14 05:45:33 localhost systemd[1]: var-lib-containers-storage-overlay-948d63d72c90238568600bb4ced3a347f3a772760aabfa54019ccce9078bd0ca-merged.mount: Deactivated successfully. Oct 14 05:45:33 localhost systemd[1]: 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041.service: Main process exited, code=exited, status=1/FAILURE Oct 14 05:45:33 localhost systemd[1]: 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041.service: Failed with result 'exit-code'. Oct 14 05:45:33 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Oct 14 05:45:33 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Oct 14 05:45:34 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Oct 14 05:45:34 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Oct 14 05:45:34 localhost systemd[1]: var-lib-containers-storage-overlay-e0f86229f02c4331620c9ec8e21be769ac9cff125fc1f01f8404fcae9b59e3df-merged.mount: Deactivated successfully. Oct 14 05:45:35 localhost systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully. Oct 14 05:45:35 localhost systemd[1]: var-lib-containers-storage-overlay-d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610-merged.mount: Deactivated successfully. Oct 14 05:45:35 localhost systemd[1]: var-lib-containers-storage-overlay-d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610-merged.mount: Deactivated successfully. 
Oct 14 05:45:35 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41765 DF PROTO=TCP SPT=60532 DPT=9882 SEQ=1963461340 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B0EDE9A0000000001030307) Oct 14 05:45:35 localhost nova_compute[238069]: 2025-10-14 09:45:35.477 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:45:35 localhost systemd[1]: var-lib-containers-storage-overlay-e0f86229f02c4331620c9ec8e21be769ac9cff125fc1f01f8404fcae9b59e3df-merged.mount: Deactivated successfully. Oct 14 05:45:35 localhost systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully. Oct 14 05:45:36 localhost systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully. Oct 14 05:45:36 localhost systemd[1]: var-lib-containers-storage-overlay-b6fff9c8e433cbfc969f016d7c00977424b6e0fe3f5e8a6774343b30e6ab0953-merged.mount: Deactivated successfully. Oct 14 05:45:36 localhost systemd[1]: var-lib-containers-storage-overlay-26c7b014151266d081bb0d73c08f9962548726db348c3c8122e5e1462f16ca73-merged.mount: Deactivated successfully. Oct 14 05:45:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29. Oct 14 05:45:36 localhost systemd[1]: var-lib-containers-storage-overlay-e0f86229f02c4331620c9ec8e21be769ac9cff125fc1f01f8404fcae9b59e3df-merged.mount: Deactivated successfully. 
Oct 14 05:45:36 localhost podman[250901]: 2025-10-14 09:45:36.60712473 +0000 UTC m=+0.071213429 container health_status fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, org.label-schema.schema-version=1.0, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=iscsid, org.label-schema.build-date=20251009, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image) Oct 14 05:45:36 localhost podman[250901]: 2025-10-14 09:45:36.616140622 +0000 UTC m=+0.080229321 container exec_died fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 
(image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, container_name=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.build-date=20251009, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=iscsid) Oct 14 05:45:37 localhost nova_compute[238069]: 2025-10-14 09:45:37.249 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:45:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec. 
Oct 14 05:45:38 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21779 DF PROTO=TCP SPT=35756 DPT=9105 SEQ=1773838706 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B0EEB9F0000000001030307) Oct 14 05:45:39 localhost systemd[1]: var-lib-containers-storage-overlay-919c2496449756819846525fbfb351457636bf59d0964ccd47919cff1ec5dc94-merged.mount: Deactivated successfully. Oct 14 05:45:39 localhost systemd[1]: var-lib-containers-storage-overlay-082ddc67d34506bb0770407c13ba7bb6b21421c42c2e601c85c5744cdf0ac287-merged.mount: Deactivated successfully. Oct 14 05:45:39 localhost systemd[1]: var-lib-containers-storage-overlay-082ddc67d34506bb0770407c13ba7bb6b21421c42c2e601c85c5744cdf0ac287-merged.mount: Deactivated successfully. Oct 14 05:45:39 localhost systemd[1]: fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29.service: Deactivated successfully. 
Oct 14 05:45:39 localhost podman[250921]: 2025-10-14 09:45:39.325136282 +0000 UTC m=+0.663038066 container health_status aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter) Oct 14 05:45:39 localhost podman[250921]: 2025-10-14 09:45:39.338232742 +0000 UTC m=+0.676134596 container exec_died aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 
'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors ) Oct 14 05:45:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21780 DF PROTO=TCP SPT=35756 DPT=9105 SEQ=1773838706 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B0EEF9A0000000001030307) Oct 14 05:45:40 localhost nova_compute[238069]: 2025-10-14 09:45:40.479 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:45:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad. 
Oct 14 05:45:41 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21781 DF PROTO=TCP SPT=35756 DPT=9105 SEQ=1773838706 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B0EF79A0000000001030307) Oct 14 05:45:42 localhost systemd[1]: var-lib-containers-storage-overlay-948d63d72c90238568600bb4ced3a347f3a772760aabfa54019ccce9078bd0ca-merged.mount: Deactivated successfully. Oct 14 05:45:42 localhost nova_compute[238069]: 2025-10-14 09:45:42.251 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:45:42 localhost systemd[1]: var-lib-containers-storage-overlay-919c2496449756819846525fbfb351457636bf59d0964ccd47919cff1ec5dc94-merged.mount: Deactivated successfully. Oct 14 05:45:42 localhost systemd[1]: var-lib-containers-storage-overlay-919c2496449756819846525fbfb351457636bf59d0964ccd47919cff1ec5dc94-merged.mount: Deactivated successfully. Oct 14 05:45:42 localhost systemd[1]: aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec.service: Deactivated successfully. Oct 14 05:45:42 localhost kernel: overlayfs: upperdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Oct 14 05:45:42 localhost kernel: overlayfs: workdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. 
Oct 14 05:45:42 localhost podman[250945]: 2025-10-14 09:45:42.604469828 +0000 UTC m=+0.945855436 container health_status 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS) Oct 14 05:45:42 localhost podman[250945]: 2025-10-14 09:45:42.61827962 +0000 UTC m=+0.959665278 container exec_died 
02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.license=GPLv2, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251009, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true) Oct 14 05:45:43 localhost nova_compute[238069]: 2025-10-14 09:45:43.026 2 DEBUG oslo_service.periodic_task [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks 
/usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 05:45:43 localhost nova_compute[238069]: 2025-10-14 09:45:43.074 2 DEBUG oslo_service.periodic_task [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 05:45:43 localhost nova_compute[238069]: 2025-10-14 09:45:43.074 2 DEBUG nova.compute.manager [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Oct 14 05:45:43 localhost nova_compute[238069]: 2025-10-14 09:45:43.074 2 DEBUG nova.compute.manager [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Oct 14 05:45:43 localhost nova_compute[238069]: 2025-10-14 09:45:43.389 2 DEBUG oslo_concurrency.lockutils [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Acquiring lock "refresh_cache-88c4e366-b765-47a6-96bf-f7677f2ce67c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Oct 14 05:45:43 localhost nova_compute[238069]: 2025-10-14 09:45:43.389 2 DEBUG oslo_concurrency.lockutils [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Acquired lock "refresh_cache-88c4e366-b765-47a6-96bf-f7677f2ce67c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Oct 14 05:45:43 localhost nova_compute[238069]: 2025-10-14 09:45:43.390 2 DEBUG nova.network.neutron [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] [instance: 88c4e366-b765-47a6-96bf-f7677f2ce67c] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Oct 14 05:45:43 localhost 
nova_compute[238069]: 2025-10-14 09:45:43.390 2 DEBUG nova.objects.instance [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 88c4e366-b765-47a6-96bf-f7677f2ce67c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Oct 14 05:45:43 localhost nova_compute[238069]: 2025-10-14 09:45:43.893 2 DEBUG nova.network.neutron [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] [instance: 88c4e366-b765-47a6-96bf-f7677f2ce67c] Updating instance_info_cache with network_info: [{"id": "3ec9b060-f43d-4698-9c76-6062c70911d5", "address": "fa:16:3e:84:5e:e5", "network": {"id": "7d0cd696-bdd7-4e70-9512-eb0d23640314", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.46", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "41187b090f3d4818a32baa37ce8a3991", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3ec9b060-f4", "ovs_interfaceid": "3ec9b060-f43d-4698-9c76-6062c70911d5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Oct 14 05:45:43 localhost nova_compute[238069]: 2025-10-14 09:45:43.910 2 DEBUG oslo_concurrency.lockutils [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Releasing lock "refresh_cache-88c4e366-b765-47a6-96bf-f7677f2ce67c" lock 
/usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Oct 14 05:45:43 localhost nova_compute[238069]: 2025-10-14 09:45:43.911 2 DEBUG nova.compute.manager [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] [instance: 88c4e366-b765-47a6-96bf-f7677f2ce67c] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Oct 14 05:45:43 localhost nova_compute[238069]: 2025-10-14 09:45:43.911 2 DEBUG oslo_service.periodic_task [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 05:45:43 localhost nova_compute[238069]: 2025-10-14 09:45:43.912 2 DEBUG oslo_service.periodic_task [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 05:45:43 localhost nova_compute[238069]: 2025-10-14 09:45:43.912 2 DEBUG oslo_service.periodic_task [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 05:45:43 localhost nova_compute[238069]: 2025-10-14 09:45:43.912 2 DEBUG oslo_service.periodic_task [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 05:45:43 localhost nova_compute[238069]: 2025-10-14 09:45:43.932 2 DEBUG oslo_concurrency.lockutils [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" 
inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Oct 14 05:45:43 localhost nova_compute[238069]: 2025-10-14 09:45:43.933 2 DEBUG oslo_concurrency.lockutils [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Oct 14 05:45:43 localhost nova_compute[238069]: 2025-10-14 09:45:43.933 2 DEBUG oslo_concurrency.lockutils [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Oct 14 05:45:43 localhost nova_compute[238069]: 2025-10-14 09:45:43.934 2 DEBUG nova.compute.resource_tracker [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Auditing locally available compute resources for np0005486733.localdomain (node: np0005486733.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Oct 14 05:45:43 localhost nova_compute[238069]: 2025-10-14 09:45:43.934 2 DEBUG oslo_concurrency.processutils [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Oct 14 05:45:44 localhost nova_compute[238069]: 2025-10-14 09:45:44.386 2 DEBUG oslo_concurrency.processutils [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.452s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Oct 14 05:45:44 localhost nova_compute[238069]: 2025-10-14 09:45:44.460 2 DEBUG 
nova.virt.libvirt.driver [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Oct 14 05:45:44 localhost nova_compute[238069]: 2025-10-14 09:45:44.461 2 DEBUG nova.virt.libvirt.driver [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Oct 14 05:45:44 localhost systemd[1]: var-lib-containers-storage-overlay-d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610-merged.mount: Deactivated successfully. Oct 14 05:45:44 localhost systemd[1]: var-lib-containers-storage-overlay-948d63d72c90238568600bb4ced3a347f3a772760aabfa54019ccce9078bd0ca-merged.mount: Deactivated successfully. Oct 14 05:45:44 localhost podman[248187]: time="2025-10-14T09:45:44Z" level=error msg="Getting root fs size for \"2d882b1ac5380c199c3aff4015df0158283c9f5ad4c3e7c158e231d24ef57d7f\": getting diffsize of layer \"948d63d72c90238568600bb4ced3a347f3a772760aabfa54019ccce9078bd0ca\" and its parent \"d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610\": unmounting layer 948d63d72c90238568600bb4ced3a347f3a772760aabfa54019ccce9078bd0ca: replacing mount point \"/var/lib/containers/storage/overlay/948d63d72c90238568600bb4ced3a347f3a772760aabfa54019ccce9078bd0ca/merged\": device or resource busy" Oct 14 05:45:44 localhost nova_compute[238069]: 2025-10-14 09:45:44.667 2 WARNING nova.virt.libvirt.driver [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Oct 14 05:45:44 localhost nova_compute[238069]: 2025-10-14 09:45:44.668 2 DEBUG nova.compute.resource_tracker [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Hypervisor/Node resource view: name=np0005486733.localdomain free_ram=12330MB free_disk=41.83725357055664GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": 
"1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Oct 14 05:45:44 localhost nova_compute[238069]: 2025-10-14 09:45:44.669 2 DEBUG oslo_concurrency.lockutils [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Oct 14 05:45:44 localhost nova_compute[238069]: 2025-10-14 09:45:44.669 2 DEBUG oslo_concurrency.lockutils [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Oct 14 05:45:44 localhost nova_compute[238069]: 2025-10-14 09:45:44.744 2 DEBUG nova.compute.resource_tracker [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Instance 88c4e366-b765-47a6-96bf-f7677f2ce67c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Oct 14 05:45:44 localhost nova_compute[238069]: 2025-10-14 09:45:44.745 2 DEBUG nova.compute.resource_tracker [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Oct 14 05:45:44 localhost nova_compute[238069]: 2025-10-14 09:45:44.745 2 DEBUG nova.compute.resource_tracker [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Final resource view: name=np0005486733.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Oct 14 05:45:44 localhost nova_compute[238069]: 2025-10-14 09:45:44.798 2 DEBUG oslo_concurrency.processutils [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Oct 14 05:45:44 localhost systemd[1]: var-lib-containers-storage-overlay-948d63d72c90238568600bb4ced3a347f3a772760aabfa54019ccce9078bd0ca-merged.mount: Deactivated successfully. Oct 14 05:45:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917. Oct 14 05:45:44 localhost systemd[1]: 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad.service: Deactivated successfully. 
Oct 14 05:45:44 localhost podman[250987]: 2025-10-14 09:45:44.986748374 +0000 UTC m=+0.108771355 container health_status 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, vcs-type=git, container_name=openstack_network_exporter, name=ubi9-minimal, io.buildah.version=1.33.7, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, build-date=2025-08-20T13:12:41, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, config_id=edpm, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container) Oct 14 05:45:45 localhost podman[250987]: 2025-10-14 09:45:45.009203348 +0000 UTC m=+0.131226349 container exec_died 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, io.buildah.version=1.33.7, config_id=edpm, release=1755695350, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, build-date=2025-08-20T13:12:41, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., version=9.6, io.openshift.tags=minimal rhel9) Oct 14 05:45:45 localhost nova_compute[238069]: 2025-10-14 09:45:45.275 2 DEBUG oslo_concurrency.processutils [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.477s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Oct 14 05:45:45 localhost nova_compute[238069]: 2025-10-14 09:45:45.283 2 DEBUG 
nova.compute.provider_tree [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Inventory has not changed in ProviderTree for provider: 18c24273-aca2-4f08-be57-3188d558235e update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Oct 14 05:45:45 localhost nova_compute[238069]: 2025-10-14 09:45:45.305 2 DEBUG nova.scheduler.client.report [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Inventory has not changed for provider 18c24273-aca2-4f08-be57-3188d558235e based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Oct 14 05:45:45 localhost nova_compute[238069]: 2025-10-14 09:45:45.308 2 DEBUG nova.compute.resource_tracker [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Compute_service record updated for np0005486733.localdomain:np0005486733.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Oct 14 05:45:45 localhost nova_compute[238069]: 2025-10-14 09:45:45.308 2 DEBUG oslo_concurrency.lockutils [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.640s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Oct 14 05:45:45 localhost nova_compute[238069]: 2025-10-14 09:45:45.421 2 DEBUG oslo_service.periodic_task [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks 
/usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 05:45:45 localhost nova_compute[238069]: 2025-10-14 09:45:45.422 2 DEBUG oslo_service.periodic_task [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 05:45:45 localhost nova_compute[238069]: 2025-10-14 09:45:45.482 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:45:45 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21782 DF PROTO=TCP SPT=35756 DPT=9105 SEQ=1773838706 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B0F075A0000000001030307) Oct 14 05:45:45 localhost systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully. Oct 14 05:45:45 localhost systemd[1]: var-lib-containers-storage-overlay-d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610-merged.mount: Deactivated successfully. 
Oct 14 05:45:46 localhost nova_compute[238069]: 2025-10-14 09:45:46.024 2 DEBUG oslo_service.periodic_task [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 05:45:46 localhost nova_compute[238069]: 2025-10-14 09:45:46.025 2 DEBUG oslo_service.periodic_task [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 05:45:46 localhost nova_compute[238069]: 2025-10-14 09:45:46.025 2 DEBUG nova.compute.manager [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Oct 14 05:45:46 localhost systemd[1]: var-lib-containers-storage-overlay-d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610-merged.mount: Deactivated successfully. Oct 14 05:45:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f. Oct 14 05:45:47 localhost nova_compute[238069]: 2025-10-14 09:45:47.255 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:45:47 localhost systemd[1]: var-lib-containers-storage-overlay-919c2496449756819846525fbfb351457636bf59d0964ccd47919cff1ec5dc94-merged.mount: Deactivated successfully. Oct 14 05:45:47 localhost systemd[1]: var-lib-containers-storage-overlay-14055c4da6e79e5c651346093268013a82ff4f993f6084d00241e7b8936c8586-merged.mount: Deactivated successfully. Oct 14 05:45:47 localhost systemd[1]: 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917.service: Deactivated successfully. 
Oct 14 05:45:47 localhost podman[251026]: 2025-10-14 09:45:47.75735035 +0000 UTC m=+1.558169050 container health_status 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, managed_by=edpm_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5) Oct 14 05:45:47 localhost podman[251026]: 2025-10-14 09:45:47.79378161 +0000 UTC m=+1.594600260 container exec_died 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, config_id=ovn_controller, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator 
team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller) Oct 14 05:45:48 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45037 DF PROTO=TCP SPT=54130 DPT=9101 SEQ=2104763980 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B0F101A0000000001030307) Oct 14 05:45:48 localhost systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully. Oct 14 05:45:48 localhost systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully. Oct 14 05:45:48 localhost systemd[1]: var-lib-containers-storage-overlay-55d5530fe8468c8c9907e0aa1de030811941604fa5f46de3db6dc15ec40906dd-merged.mount: Deactivated successfully. Oct 14 05:45:48 localhost systemd[1]: var-lib-containers-storage-overlay-ae0ebe7656e29542866ff018f5be9a3d02c88268a65814cf045e1b6c30ffd352-merged.mount: Deactivated successfully. 
Oct 14 05:45:48 localhost systemd[1]: 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f.service: Deactivated successfully. Oct 14 05:45:49 localhost systemd[1]: var-lib-containers-storage-overlay-e0f86229f02c4331620c9ec8e21be769ac9cff125fc1f01f8404fcae9b59e3df-merged.mount: Deactivated successfully. Oct 14 05:45:50 localhost systemd[1]: var-lib-containers-storage-overlay-ae0ebe7656e29542866ff018f5be9a3d02c88268a65814cf045e1b6c30ffd352-merged.mount: Deactivated successfully. Oct 14 05:45:50 localhost nova_compute[238069]: 2025-10-14 09:45:50.484 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:45:52 localhost nova_compute[238069]: 2025-10-14 09:45:52.258 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:45:52 localhost systemd[1]: var-lib-containers-storage-overlay-a6c55f01d04ba30e2a7871446cca9a16a6473cee6ca9a477923fc0329ccb78a4-merged.mount: Deactivated successfully. Oct 14 05:45:52 localhost systemd[1]: var-lib-containers-storage-overlay-a6c55f01d04ba30e2a7871446cca9a16a6473cee6ca9a477923fc0329ccb78a4-merged.mount: Deactivated successfully. Oct 14 05:45:52 localhost systemd[1]: var-lib-containers-storage-overlay-948d63d72c90238568600bb4ced3a347f3a772760aabfa54019ccce9078bd0ca-merged.mount: Deactivated successfully. Oct 14 05:45:52 localhost systemd[1]: var-lib-containers-storage-overlay-919c2496449756819846525fbfb351457636bf59d0964ccd47919cff1ec5dc94-merged.mount: Deactivated successfully. 
Oct 14 05:45:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36540 DF PROTO=TCP SPT=42744 DPT=9102 SEQ=182524126 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B0F25250000000001030307) Oct 14 05:45:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36541 DF PROTO=TCP SPT=42744 DPT=9102 SEQ=182524126 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B0F291A0000000001030307) Oct 14 05:45:54 localhost systemd[1]: var-lib-containers-storage-overlay-512b226761ef17c0044cb14b83718aa6f9984afb51b1aeb63112d22d2fdccb36-merged.mount: Deactivated successfully. Oct 14 05:45:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d. Oct 14 05:45:54 localhost systemd[1]: var-lib-containers-storage-overlay-5ce6c5d0cc60f856680938093014249abcf9a107a94355720d953b1d1e7f1bfe-merged.mount: Deactivated successfully. 
Oct 14 05:45:54 localhost podman[251051]: 2025-10-14 09:45:54.724964748 +0000 UTC m=+0.081270137 container health_status 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=unhealthy, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251009, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image) Oct 14 05:45:54 localhost podman[251051]: 2025-10-14 09:45:54.75456641 +0000 UTC m=+0.110871829 container exec_died 
89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, org.label-schema.build-date=20251009, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=edpm) Oct 14 05:45:54 localhost podman[251051]: unhealthy Oct 14 05:45:55 localhost systemd[1]: 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d.service: Main process exited, code=exited, status=1/FAILURE Oct 14 05:45:55 localhost systemd[1]: 
89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d.service: Failed with result 'exit-code'. Oct 14 05:45:55 localhost nova_compute[238069]: 2025-10-14 09:45:55.488 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:45:55 localhost systemd[1]: var-lib-containers-storage-overlay-5ce6c5d0cc60f856680938093014249abcf9a107a94355720d953b1d1e7f1bfe-merged.mount: Deactivated successfully. Oct 14 05:45:55 localhost systemd[1]: var-lib-containers-storage-overlay-948d63d72c90238568600bb4ced3a347f3a772760aabfa54019ccce9078bd0ca-merged.mount: Deactivated successfully. Oct 14 05:45:56 localhost systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully. Oct 14 05:45:56 localhost systemd[1]: var-lib-containers-storage-overlay-d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610-merged.mount: Deactivated successfully. Oct 14 05:45:56 localhost systemd[1]: var-lib-containers-storage-overlay-d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610-merged.mount: Deactivated successfully. Oct 14 05:45:56 localhost systemd[1]: var-lib-containers-storage-overlay-0accaf46e2ca98f20a95b21cea4fb623de0e5378cb14b163bca0a8771d84c861-merged.mount: Deactivated successfully. Oct 14 05:45:56 localhost systemd[1]: var-lib-containers-storage-overlay-512b226761ef17c0044cb14b83718aa6f9984afb51b1aeb63112d22d2fdccb36-merged.mount: Deactivated successfully. Oct 14 05:45:57 localhost nova_compute[238069]: 2025-10-14 09:45:57.290 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:45:57 localhost systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully. 
Oct 14 05:45:57 localhost systemd[1]: var-lib-containers-storage-overlay-e0f86229f02c4331620c9ec8e21be769ac9cff125fc1f01f8404fcae9b59e3df-merged.mount: Deactivated successfully. Oct 14 05:45:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:45:57.752 163055 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Oct 14 05:45:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:45:57.753 163055 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Oct 14 05:45:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:45:57.754 163055 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Oct 14 05:45:57 localhost systemd[1]: var-lib-containers-storage-overlay-e0f86229f02c4331620c9ec8e21be769ac9cff125fc1f01f8404fcae9b59e3df-merged.mount: Deactivated successfully. Oct 14 05:45:58 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48644 DF PROTO=TCP SPT=40528 DPT=9882 SEQ=3297151167 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B0F37EB0000000001030307) Oct 14 05:45:58 localhost systemd[1]: var-lib-containers-storage-overlay-ab64777085904da680013c790c3f2c65f0b954578737ec4d7fa836f56655c34a-merged.mount: Deactivated successfully. 
Oct 14 05:45:58 localhost systemd[1]: var-lib-containers-storage-overlay-0accaf46e2ca98f20a95b21cea4fb623de0e5378cb14b163bca0a8771d84c861-merged.mount: Deactivated successfully. Oct 14 05:45:58 localhost systemd[1]: var-lib-containers-storage-overlay-ab64777085904da680013c790c3f2c65f0b954578737ec4d7fa836f56655c34a-merged.mount: Deactivated successfully. Oct 14 05:45:58 localhost systemd[1]: var-lib-containers-storage-overlay-f3f40f6483bf6d587286da9e86e40878c2aaaf723da5aa2364fff24f5ea28424-merged.mount: Deactivated successfully. Oct 14 05:45:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857. Oct 14 05:45:58 localhost podman[251069]: 2025-10-14 09:45:58.961061392 +0000 UTC m=+0.083406592 container health_status 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true) Oct 14 05:45:58 localhost podman[251069]: 2025-10-14 09:45:58.993042636 +0000 UTC m=+0.115387836 container exec_died 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5) Oct 14 05:46:00 localhost systemd[1]: var-lib-containers-storage-overlay-919c2496449756819846525fbfb351457636bf59d0964ccd47919cff1ec5dc94-merged.mount: Deactivated successfully. Oct 14 05:46:00 localhost nova_compute[238069]: 2025-10-14 09:46:00.491 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:46:00 localhost systemd[1]: var-lib-containers-storage-overlay-705239d69edbb97c498d74570c74cd8434e37024eb25add7e89f19b22fc90898-merged.mount: Deactivated successfully. Oct 14 05:46:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36543 DF PROTO=TCP SPT=42744 DPT=9102 SEQ=182524126 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B0F40DA0000000001030307) Oct 14 05:46:00 localhost systemd[1]: var-lib-containers-storage-overlay-705239d69edbb97c498d74570c74cd8434e37024eb25add7e89f19b22fc90898-merged.mount: Deactivated successfully. Oct 14 05:46:01 localhost systemd[1]: var-lib-containers-storage-overlay-5ce6c5d0cc60f856680938093014249abcf9a107a94355720d953b1d1e7f1bfe-merged.mount: Deactivated successfully. Oct 14 05:46:01 localhost systemd[1]: var-lib-containers-storage-overlay-4ac6a4b7d3fd9b3800ade157df8bbc87609a0105a3f06e00de1c30bcdbf261df-merged.mount: Deactivated successfully. 
Oct 14 05:46:01 localhost systemd[1]: session-57.scope: Deactivated successfully. Oct 14 05:46:01 localhost systemd[1]: session-57.scope: Consumed 57.174s CPU time. Oct 14 05:46:01 localhost systemd-logind[760]: Session 57 logged out. Waiting for processes to exit. Oct 14 05:46:01 localhost systemd-logind[760]: Removed session 57. Oct 14 05:46:01 localhost kernel: overlayfs: upperdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Oct 14 05:46:01 localhost kernel: overlayfs: workdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Oct 14 05:46:01 localhost systemd[1]: 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857.service: Deactivated successfully. Oct 14 05:46:02 localhost nova_compute[238069]: 2025-10-14 09:46:02.292 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:46:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041. Oct 14 05:46:04 localhost systemd[1]: var-lib-containers-storage-overlay-948d63d72c90238568600bb4ced3a347f3a772760aabfa54019ccce9078bd0ca-merged.mount: Deactivated successfully. 
Oct 14 05:46:04 localhost podman[251087]: 2025-10-14 09:46:04.106382294 +0000 UTC m=+0.050726707 container health_status 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=unhealthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Oct 14 05:46:04 localhost podman[251087]: 2025-10-14 09:46:04.11149656 +0000 UTC m=+0.055841013 container exec_died 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': 
['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter) Oct 14 05:46:04 localhost podman[251087]: unhealthy Oct 14 05:46:04 localhost systemd[1]: var-lib-containers-storage-overlay-919c2496449756819846525fbfb351457636bf59d0964ccd47919cff1ec5dc94-merged.mount: Deactivated successfully. Oct 14 05:46:04 localhost podman[248187]: time="2025-10-14T09:46:04Z" level=error msg="Getting root fs size for \"6c754c652c411ed47e35a6e105f0d60abb4aaec91c448c734ac7e917f9367261\": getting diffsize of layer \"919c2496449756819846525fbfb351457636bf59d0964ccd47919cff1ec5dc94\" and its parent \"948d63d72c90238568600bb4ced3a347f3a772760aabfa54019ccce9078bd0ca\": unmounting layer 919c2496449756819846525fbfb351457636bf59d0964ccd47919cff1ec5dc94: replacing mount point \"/var/lib/containers/storage/overlay/919c2496449756819846525fbfb351457636bf59d0964ccd47919cff1ec5dc94/merged\": device or resource busy" Oct 14 05:46:04 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Oct 14 05:46:04 localhost systemd[1]: 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041.service: Main process exited, code=exited, status=1/FAILURE Oct 14 05:46:04 localhost systemd[1]: 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041.service: Failed with result 'exit-code'. Oct 14 05:46:04 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. 
Oct 14 05:46:05 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48647 DF PROTO=TCP SPT=40528 DPT=9882 SEQ=3297151167 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B0F539A0000000001030307) Oct 14 05:46:05 localhost nova_compute[238069]: 2025-10-14 09:46:05.494 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:46:06 localhost systemd[1]: var-lib-containers-storage-overlay-d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610-merged.mount: Deactivated successfully. Oct 14 05:46:06 localhost systemd[1]: var-lib-containers-storage-overlay-948d63d72c90238568600bb4ced3a347f3a772760aabfa54019ccce9078bd0ca-merged.mount: Deactivated successfully. Oct 14 05:46:06 localhost systemd[1]: var-lib-containers-storage-overlay-948d63d72c90238568600bb4ced3a347f3a772760aabfa54019ccce9078bd0ca-merged.mount: Deactivated successfully. Oct 14 05:46:07 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Oct 14 05:46:07 localhost systemd[1]: var-lib-containers-storage-overlay-5f427123442eee41b986e1fae04c3a7af8dd46dc41159db5ec6876f25e2a2dde-merged.mount: Deactivated successfully. Oct 14 05:46:07 localhost systemd[1]: var-lib-containers-storage-overlay-919c2496449756819846525fbfb351457636bf59d0964ccd47919cff1ec5dc94-merged.mount: Deactivated successfully. Oct 14 05:46:07 localhost nova_compute[238069]: 2025-10-14 09:46:07.342 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:46:07 localhost systemd[1]: var-lib-containers-storage-overlay-d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610-merged.mount: Deactivated successfully. 
Oct 14 05:46:07 localhost systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully. Oct 14 05:46:07 localhost systemd[1]: var-lib-containers-storage-overlay-f098c0017d0da3f1457e04ccb48f16a39779d6b090c6b44cae8dda4d8a38938b-merged.mount: Deactivated successfully. Oct 14 05:46:07 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Oct 14 05:46:08 localhost systemd[1]: var-lib-containers-storage-overlay-d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610-merged.mount: Deactivated successfully. Oct 14 05:46:08 localhost systemd[1]: var-lib-containers-storage-overlay-e0f86229f02c4331620c9ec8e21be769ac9cff125fc1f01f8404fcae9b59e3df-merged.mount: Deactivated successfully. Oct 14 05:46:08 localhost systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully. Oct 14 05:46:08 localhost podman[248187]: time="2025-10-14T09:46:08Z" level=error msg="Getting root fs size for \"6da84e96fa92b609f00406101e268dd1331c0e896b503ade54cf93e0e5d81b9c\": getting diffsize of layer \"19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8\" and its parent \"e0f86229f02c4331620c9ec8e21be769ac9cff125fc1f01f8404fcae9b59e3df\": unmounting layer e0f86229f02c4331620c9ec8e21be769ac9cff125fc1f01f8404fcae9b59e3df: replacing mount point \"/var/lib/containers/storage/overlay/e0f86229f02c4331620c9ec8e21be769ac9cff125fc1f01f8404fcae9b59e3df/merged\": device or resource busy" Oct 14 05:46:08 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. 
Oct 14 05:46:08 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Oct 14 05:46:08 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57165 DF PROTO=TCP SPT=35464 DPT=9105 SEQ=3386688231 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B0F60CE0000000001030307) Oct 14 05:46:09 localhost systemd[1]: var-lib-containers-storage-overlay-edfe04f9a55df360a7193df528cae2e7de5655e253d67fa05d14b0067469a682-merged.mount: Deactivated successfully. Oct 14 05:46:09 localhost systemd[1]: var-lib-containers-storage-overlay-e0f86229f02c4331620c9ec8e21be769ac9cff125fc1f01f8404fcae9b59e3df-merged.mount: Deactivated successfully. Oct 14 05:46:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29. 
Oct 14 05:46:09 localhost podman[251110]: 2025-10-14 09:46:09.732412571 +0000 UTC m=+0.075547842 container health_status fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, container_name=iscsid, config_id=iscsid, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3) Oct 14 05:46:09 localhost podman[251110]: 2025-10-14 09:46:09.768093238 +0000 UTC m=+0.111228469 container exec_died fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 
(image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, org.label-schema.vendor=CentOS, config_id=iscsid, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20251009, io.buildah.version=1.41.3, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true) Oct 14 05:46:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57166 DF PROTO=TCP SPT=35464 DPT=9105 SEQ=3386688231 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B0F64DA0000000001030307) Oct 14 05:46:10 localhost nova_compute[238069]: 2025-10-14 09:46:10.497 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] 
on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:46:11 localhost systemd[1]: var-lib-containers-storage-overlay-919c2496449756819846525fbfb351457636bf59d0964ccd47919cff1ec5dc94-merged.mount: Deactivated successfully. Oct 14 05:46:11 localhost systemd[1]: var-lib-containers-storage-overlay-948d63d72c90238568600bb4ced3a347f3a772760aabfa54019ccce9078bd0ca-merged.mount: Deactivated successfully. Oct 14 05:46:11 localhost systemd[1]: var-lib-containers-storage-overlay-082ddc67d34506bb0770407c13ba7bb6b21421c42c2e601c85c5744cdf0ac287-merged.mount: Deactivated successfully. Oct 14 05:46:11 localhost systemd[1]: var-lib-containers-storage-overlay-182f4b56e6e8809f2ffde261aea7a82f597fbc875533d1efd7f59fe7c8a139ed-merged.mount: Deactivated successfully. Oct 14 05:46:11 localhost systemd[1]: fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29.service: Deactivated successfully. Oct 14 05:46:11 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Oct 14 05:46:11 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57167 DF PROTO=TCP SPT=35464 DPT=9105 SEQ=3386688231 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B0F6CDA0000000001030307) Oct 14 05:46:12 localhost systemd[1]: var-lib-containers-storage-overlay-082ddc67d34506bb0770407c13ba7bb6b21421c42c2e601c85c5744cdf0ac287-merged.mount: Deactivated successfully. Oct 14 05:46:12 localhost nova_compute[238069]: 2025-10-14 09:46:12.348 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:46:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec. 
Oct 14 05:46:12 localhost podman[251127]: 2025-10-14 09:46:12.74863243 +0000 UTC m=+0.076424239 container health_status aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter) Oct 14 05:46:12 localhost podman[251127]: 2025-10-14 09:46:12.760221843 +0000 UTC m=+0.088013642 container exec_died aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 
'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Oct 14 05:46:13 localhost systemd[1]: var-lib-containers-storage-overlay-d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610-merged.mount: Deactivated successfully. 
Oct 14 05:46:13 localhost podman[248187]: time="2025-10-14T09:46:13Z" level=error msg="Getting root fs size for \"7297840f7ec543f88004134368565122cac8a0653a039601f9270ac72f8a5884\": getting diffsize of layer \"948d63d72c90238568600bb4ced3a347f3a772760aabfa54019ccce9078bd0ca\" and its parent \"d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610\": unmounting layer 948d63d72c90238568600bb4ced3a347f3a772760aabfa54019ccce9078bd0ca: replacing mount point \"/var/lib/containers/storage/overlay/948d63d72c90238568600bb4ced3a347f3a772760aabfa54019ccce9078bd0ca/merged\": device or resource busy" Oct 14 05:46:14 localhost systemd[1]: var-lib-containers-storage-overlay-948d63d72c90238568600bb4ced3a347f3a772760aabfa54019ccce9078bd0ca-merged.mount: Deactivated successfully. Oct 14 05:46:14 localhost systemd[1]: var-lib-containers-storage-overlay-919c2496449756819846525fbfb351457636bf59d0964ccd47919cff1ec5dc94-merged.mount: Deactivated successfully. Oct 14 05:46:14 localhost systemd[1]: aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec.service: Deactivated successfully. Oct 14 05:46:14 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Oct 14 05:46:14 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Oct 14 05:46:14 localhost systemd[1]: var-lib-containers-storage-overlay-919c2496449756819846525fbfb351457636bf59d0964ccd47919cff1ec5dc94-merged.mount: Deactivated successfully. Oct 14 05:46:15 localhost nova_compute[238069]: 2025-10-14 09:46:15.500 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:46:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad. 
Oct 14 05:46:15 localhost systemd[1]: tmp-crun.XYrEMp.mount: Deactivated successfully. Oct 14 05:46:15 localhost podman[251150]: 2025-10-14 09:46:15.734542286 +0000 UTC m=+0.073032376 container health_status 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251009, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d) Oct 14 05:46:15 localhost podman[251150]: 2025-10-14 
09:46:15.745084657 +0000 UTC m=+0.083574747 container exec_died 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251009, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, org.label-schema.schema-version=1.0) Oct 14 05:46:15 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57168 DF PROTO=TCP SPT=35464 DPT=9105 
SEQ=3386688231 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B0F7C9A0000000001030307) Oct 14 05:46:16 localhost systemd[1]: var-lib-containers-storage-overlay-d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610-merged.mount: Deactivated successfully. Oct 14 05:46:16 localhost systemd[1]: var-lib-containers-storage-overlay-948d63d72c90238568600bb4ced3a347f3a772760aabfa54019ccce9078bd0ca-merged.mount: Deactivated successfully. Oct 14 05:46:16 localhost systemd[1]: var-lib-containers-storage-overlay-182f4b56e6e8809f2ffde261aea7a82f597fbc875533d1efd7f59fe7c8a139ed-merged.mount: Deactivated successfully. Oct 14 05:46:16 localhost systemd[1]: var-lib-containers-storage-overlay-0f85b4a61f0484c7d6d1230e2bc736bd8398f5346eb8306d97c0ce215dfc5ab2-merged.mount: Deactivated successfully. Oct 14 05:46:16 localhost systemd[1]: 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad.service: Deactivated successfully. Oct 14 05:46:17 localhost nova_compute[238069]: 2025-10-14 09:46:17.351 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:46:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917. Oct 14 05:46:17 localhost systemd[1]: var-lib-containers-storage-overlay-f479750c98f4a67ffae355a1e79b3c9a76d56699a79b842b4363e69f089cca49-merged.mount: Deactivated successfully. Oct 14 05:46:17 localhost systemd[1]: var-lib-containers-storage-overlay-f479750c98f4a67ffae355a1e79b3c9a76d56699a79b842b4363e69f089cca49-merged.mount: Deactivated successfully. 
Oct 14 05:46:17 localhost podman[251166]: 2025-10-14 09:46:17.835747769 +0000 UTC m=+0.080778992 container health_status 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, version=9.6, architecture=x86_64, io.openshift.expose-services=, config_id=edpm, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, vcs-type=git, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, vendor=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal) Oct 14 05:46:17 localhost podman[251166]: 2025-10-14 09:46:17.847062704 +0000 UTC m=+0.092093927 container exec_died 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, name=ubi9-minimal, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': 
'/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, build-date=2025-08-20T13:12:41, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible) Oct 14 05:46:18 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35416 DF PROTO=TCP SPT=51612 DPT=9101 SEQ=4166686289 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B0F855B0000000001030307) Oct 14 05:46:18 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Oct 14 05:46:18 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Oct 14 05:46:18 localhost systemd[1]: 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917.service: Deactivated successfully. Oct 14 05:46:18 localhost systemd[1]: var-lib-containers-storage-overlay-d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610-merged.mount: Deactivated successfully. Oct 14 05:46:18 localhost podman[248187]: time="2025-10-14T09:46:18Z" level=error msg="Getting root fs size for \"72e96aa9b3b4eabf50086dc66fce56a2ef659f8c2feec7b5024978482d4db6fb\": unmounting layer e0f86229f02c4331620c9ec8e21be769ac9cff125fc1f01f8404fcae9b59e3df: replacing mount point \"/var/lib/containers/storage/overlay/e0f86229f02c4331620c9ec8e21be769ac9cff125fc1f01f8404fcae9b59e3df/merged\": device or resource busy" Oct 14 05:46:18 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Oct 14 05:46:18 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. 
Oct 14 05:46:18 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Oct 14 05:46:18 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Oct 14 05:46:19 localhost systemd[1]: var-lib-containers-storage-overlay-e0f86229f02c4331620c9ec8e21be769ac9cff125fc1f01f8404fcae9b59e3df-merged.mount: Deactivated successfully. Oct 14 05:46:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f. Oct 14 05:46:19 localhost podman[251185]: 2025-10-14 09:46:19.249242801 +0000 UTC m=+0.091625952 container health_status 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, 
container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5) Oct 14 05:46:19 localhost podman[251185]: 2025-10-14 09:46:19.317390097 +0000 UTC m=+0.159773268 container exec_died 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0) Oct 14 05:46:19 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Oct 14 05:46:19 localhost systemd[1]: 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f.service: Deactivated successfully. 
Oct 14 05:46:19 localhost systemd[1]: var-lib-containers-storage-overlay-f479750c98f4a67ffae355a1e79b3c9a76d56699a79b842b4363e69f089cca49-merged.mount: Deactivated successfully. Oct 14 05:46:19 localhost systemd[1]: var-lib-containers-storage-overlay-7318240cee041ece4858dfb7aae8a7f672f3a389f0d73c3ccdde9a227d0af2bb-merged.mount: Deactivated successfully. Oct 14 05:46:19 localhost systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully. Oct 14 05:46:20 localhost nova_compute[238069]: 2025-10-14 09:46:20.500 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:46:21 localhost systemd[1]: var-lib-containers-storage-overlay-8d123e2bf97cc7b3622c68162b04c29912e1822cdbe31a1ddf70016995925bac-merged.mount: Deactivated successfully. Oct 14 05:46:22 localhost systemd[1]: var-lib-containers-storage-overlay-6b7ccf96a28197636c7a5b8f45056e04db2357d7c2dc122633e916788515691d-merged.mount: Deactivated successfully. Oct 14 05:46:22 localhost systemd[1]: var-lib-containers-storage-overlay-919c2496449756819846525fbfb351457636bf59d0964ccd47919cff1ec5dc94-merged.mount: Deactivated successfully. Oct 14 05:46:22 localhost systemd[1]: var-lib-containers-storage-overlay-6b7ccf96a28197636c7a5b8f45056e04db2357d7c2dc122633e916788515691d-merged.mount: Deactivated successfully. Oct 14 05:46:22 localhost nova_compute[238069]: 2025-10-14 09:46:22.353 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:46:22 localhost systemd[1]: var-lib-containers-storage-overlay-a6c55f01d04ba30e2a7871446cca9a16a6473cee6ca9a477923fc0329ccb78a4-merged.mount: Deactivated successfully. 
Oct 14 05:46:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61617 DF PROTO=TCP SPT=45110 DPT=9102 SEQ=3235867548 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B0F9A550000000001030307) Oct 14 05:46:23 localhost systemd[1]: var-lib-containers-storage-overlay-d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610-merged.mount: Deactivated successfully. Oct 14 05:46:24 localhost systemd[1]: var-lib-containers-storage-overlay-8d123e2bf97cc7b3622c68162b04c29912e1822cdbe31a1ddf70016995925bac-merged.mount: Deactivated successfully. Oct 14 05:46:24 localhost systemd[1]: var-lib-containers-storage-overlay-8d123e2bf97cc7b3622c68162b04c29912e1822cdbe31a1ddf70016995925bac-merged.mount: Deactivated successfully. Oct 14 05:46:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61618 DF PROTO=TCP SPT=45110 DPT=9102 SEQ=3235867548 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B0F9E5A0000000001030307) Oct 14 05:46:24 localhost systemd[1]: var-lib-containers-storage-overlay-5ce6c5d0cc60f856680938093014249abcf9a107a94355720d953b1d1e7f1bfe-merged.mount: Deactivated successfully. Oct 14 05:46:25 localhost systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully. Oct 14 05:46:25 localhost systemd[1]: var-lib-containers-storage-overlay-d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610-merged.mount: Deactivated successfully. Oct 14 05:46:25 localhost systemd[1]: var-lib-containers-storage-overlay-d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610-merged.mount: Deactivated successfully. 
Oct 14 05:46:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d. Oct 14 05:46:25 localhost nova_compute[238069]: 2025-10-14 09:46:25.502 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:46:25 localhost podman[251210]: 2025-10-14 09:46:25.562424751 +0000 UTC m=+0.072045485 container health_status 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=unhealthy, org.label-schema.schema-version=1.0, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251009, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, org.label-schema.license=GPLv2, tcib_managed=true) Oct 14 05:46:25 localhost podman[251210]: 2025-10-14 09:46:25.598326515 +0000 UTC m=+0.107947259 container exec_died 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, org.label-schema.license=GPLv2, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251009, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0) Oct 14 05:46:25 localhost podman[251210]: unhealthy Oct 14 05:46:26 localhost systemd[1]: var-lib-containers-storage-overlay-e0f86229f02c4331620c9ec8e21be769ac9cff125fc1f01f8404fcae9b59e3df-merged.mount: Deactivated successfully. Oct 14 05:46:26 localhost systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully. Oct 14 05:46:26 localhost systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully. Oct 14 05:46:26 localhost systemd[1]: 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d.service: Main process exited, code=exited, status=1/FAILURE Oct 14 05:46:26 localhost systemd[1]: 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d.service: Failed with result 'exit-code'. Oct 14 05:46:26 localhost systemd[1]: var-lib-containers-storage-overlay-512b226761ef17c0044cb14b83718aa6f9984afb51b1aeb63112d22d2fdccb36-merged.mount: Deactivated successfully. Oct 14 05:46:27 localhost systemd[1]: var-lib-containers-storage-overlay-e0f86229f02c4331620c9ec8e21be769ac9cff125fc1f01f8404fcae9b59e3df-merged.mount: Deactivated successfully. Oct 14 05:46:27 localhost nova_compute[238069]: 2025-10-14 09:46:27.355 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:46:27 localhost systemd[1]: var-lib-containers-storage-overlay-0accaf46e2ca98f20a95b21cea4fb623de0e5378cb14b163bca0a8771d84c861-merged.mount: Deactivated successfully. Oct 14 05:46:28 localhost systemd[1]: var-lib-containers-storage-overlay-f3f40f6483bf6d587286da9e86e40878c2aaaf723da5aa2364fff24f5ea28424-merged.mount: Deactivated successfully. 
Oct 14 05:46:28 localhost systemd[1]: var-lib-containers-storage-overlay-ab64777085904da680013c790c3f2c65f0b954578737ec4d7fa836f56655c34a-merged.mount: Deactivated successfully. Oct 14 05:46:28 localhost systemd[1]: var-lib-containers-storage-overlay-ab64777085904da680013c790c3f2c65f0b954578737ec4d7fa836f56655c34a-merged.mount: Deactivated successfully. Oct 14 05:46:28 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35740 DF PROTO=TCP SPT=60756 DPT=9882 SEQ=447092872 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B0FAD1B0000000001030307) Oct 14 05:46:29 localhost systemd[1]: var-lib-containers-storage-overlay-6b7ccf96a28197636c7a5b8f45056e04db2357d7c2dc122633e916788515691d-merged.mount: Deactivated successfully. Oct 14 05:46:29 localhost systemd[1]: var-lib-containers-storage-overlay-18fed1bc5c055eece5466d40a513df73328df93a77e2aad253cd120d7b08bd42-merged.mount: Deactivated successfully. Oct 14 05:46:29 localhost systemd[1]: var-lib-containers-storage-overlay-18fed1bc5c055eece5466d40a513df73328df93a77e2aad253cd120d7b08bd42-merged.mount: Deactivated successfully. Oct 14 05:46:30 localhost systemd[1]: var-lib-containers-storage-overlay-5ce6c5d0cc60f856680938093014249abcf9a107a94355720d953b1d1e7f1bfe-merged.mount: Deactivated successfully. Oct 14 05:46:30 localhost systemd[1]: var-lib-containers-storage-overlay-4ac6a4b7d3fd9b3800ade157df8bbc87609a0105a3f06e00de1c30bcdbf261df-merged.mount: Deactivated successfully. Oct 14 05:46:30 localhost nova_compute[238069]: 2025-10-14 09:46:30.509 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:46:30 localhost systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully. 
Oct 14 05:46:30 localhost systemd[1]: var-lib-containers-storage-overlay-f49b9fcb7527e4e06386bb74b403d49154983873c705746d0322d416fcfe3182-merged.mount: Deactivated successfully. Oct 14 05:46:30 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61620 DF PROTO=TCP SPT=45110 DPT=9102 SEQ=3235867548 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B0FB61B0000000001030307) Oct 14 05:46:31 localhost systemd[1]: var-lib-containers-storage-overlay-e0f86229f02c4331620c9ec8e21be769ac9cff125fc1f01f8404fcae9b59e3df-merged.mount: Deactivated successfully. Oct 14 05:46:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857. Oct 14 05:46:31 localhost systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully. 
Oct 14 05:46:31 localhost podman[251228]: 2025-10-14 09:46:31.501784243 +0000 UTC m=+0.092458367 container health_status 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image) Oct 14 05:46:31 localhost podman[251228]: 2025-10-14 09:46:31.511956854 +0000 UTC 
m=+0.102630918 container exec_died 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, tcib_managed=true, org.label-schema.build-date=20251009, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.schema-version=1.0) Oct 14 05:46:31 localhost systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully. 
Oct 14 05:46:32 localhost nova_compute[238069]: 2025-10-14 09:46:32.357 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:46:32 localhost systemd[1]: var-lib-containers-storage-overlay-948d63d72c90238568600bb4ced3a347f3a772760aabfa54019ccce9078bd0ca-merged.mount: Deactivated successfully. Oct 14 05:46:33 localhost systemd[1]: var-lib-containers-storage-overlay-919c2496449756819846525fbfb351457636bf59d0964ccd47919cff1ec5dc94-merged.mount: Deactivated successfully. Oct 14 05:46:33 localhost systemd[1]: var-lib-containers-storage-overlay-919c2496449756819846525fbfb351457636bf59d0964ccd47919cff1ec5dc94-merged.mount: Deactivated successfully. Oct 14 05:46:33 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Oct 14 05:46:33 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Oct 14 05:46:33 localhost systemd[1]: 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857.service: Deactivated successfully. Oct 14 05:46:34 localhost systemd[1]: var-lib-containers-storage-overlay-e0f86229f02c4331620c9ec8e21be769ac9cff125fc1f01f8404fcae9b59e3df-merged.mount: Deactivated successfully. Oct 14 05:46:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041. Oct 14 05:46:34 localhost systemd[1]: var-lib-containers-storage-overlay-d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610-merged.mount: Deactivated successfully. Oct 14 05:46:35 localhost systemd[1]: var-lib-containers-storage-overlay-948d63d72c90238568600bb4ced3a347f3a772760aabfa54019ccce9078bd0ca-merged.mount: Deactivated successfully. 
Oct 14 05:46:35 localhost systemd[1]: var-lib-containers-storage-overlay-948d63d72c90238568600bb4ced3a347f3a772760aabfa54019ccce9078bd0ca-merged.mount: Deactivated successfully. Oct 14 05:46:35 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35743 DF PROTO=TCP SPT=60756 DPT=9882 SEQ=447092872 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B0FC8DA0000000001030307) Oct 14 05:46:35 localhost podman[251296]: 2025-10-14 09:46:35.491534542 +0000 UTC m=+0.832977838 container health_status 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=unhealthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Oct 14 05:46:35 localhost nova_compute[238069]: 2025-10-14 09:46:35.516 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:46:35 localhost podman[251296]: 2025-10-14 09:46:35.530079376 +0000 UTC m=+0.871522702 container exec_died 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041 
(image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Oct 14 05:46:35 localhost podman[251296]: unhealthy Oct 14 05:46:36 localhost systemd[1]: var-lib-containers-storage-overlay-f49b9fcb7527e4e06386bb74b403d49154983873c705746d0322d416fcfe3182-merged.mount: Deactivated successfully. Oct 14 05:46:36 localhost systemd[1]: var-lib-containers-storage-overlay-279af57fe640a81799041eadaf076a38dc293fb9fa2d8ceac0fa223bb06cffab-merged.mount: Deactivated successfully. Oct 14 05:46:36 localhost systemd[1]: var-lib-containers-storage-overlay-279af57fe640a81799041eadaf076a38dc293fb9fa2d8ceac0fa223bb06cffab-merged.mount: Deactivated successfully. Oct 14 05:46:36 localhost systemd[1]: 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041.service: Main process exited, code=exited, status=1/FAILURE Oct 14 05:46:36 localhost systemd[1]: 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041.service: Failed with result 'exit-code'. Oct 14 05:46:37 localhost systemd[1]: var-lib-containers-storage-overlay-d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610-merged.mount: Deactivated successfully. 
Oct 14 05:46:37 localhost systemd[1]: var-lib-containers-storage-overlay-e0f86229f02c4331620c9ec8e21be769ac9cff125fc1f01f8404fcae9b59e3df-merged.mount: Deactivated successfully. Oct 14 05:46:37 localhost systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully. Oct 14 05:46:37 localhost systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully. Oct 14 05:46:37 localhost nova_compute[238069]: 2025-10-14 09:46:37.360 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:46:38 localhost systemd[1]: var-lib-containers-storage-overlay-3a5231add129a89d0adead7ab11bea3dfa286b532e456cc25a1ad81207e8880c-merged.mount: Deactivated successfully. Oct 14 05:46:38 localhost systemd[1]: var-lib-containers-storage-overlay-e0f86229f02c4331620c9ec8e21be769ac9cff125fc1f01f8404fcae9b59e3df-merged.mount: Deactivated successfully. Oct 14 05:46:38 localhost systemd[1]: var-lib-containers-storage-overlay-e0f86229f02c4331620c9ec8e21be769ac9cff125fc1f01f8404fcae9b59e3df-merged.mount: Deactivated successfully. Oct 14 05:46:38 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61586 DF PROTO=TCP SPT=57364 DPT=9105 SEQ=717485439 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B0FD5FE0000000001030307) Oct 14 05:46:39 localhost systemd[1]: var-lib-containers-storage-overlay-1b94024f0eaacdff3ae200e2172324d7aec107282443f6fc22fe2f0287bc90ec-merged.mount: Deactivated successfully. Oct 14 05:46:39 localhost systemd[1]: var-lib-containers-storage-overlay-56898ab6d39b47764ac69f563001cff1a6e38a16fd0080c65298dff54892d790-merged.mount: Deactivated successfully. 
Oct 14 05:46:39 localhost systemd[1]: var-lib-containers-storage-overlay-56898ab6d39b47764ac69f563001cff1a6e38a16fd0080c65298dff54892d790-merged.mount: Deactivated successfully. Oct 14 05:46:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61587 DF PROTO=TCP SPT=57364 DPT=9105 SEQ=717485439 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B0FDA1A0000000001030307) Oct 14 05:46:40 localhost nova_compute[238069]: 2025-10-14 09:46:40.024 2 DEBUG oslo_service.periodic_task [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 05:46:40 localhost nova_compute[238069]: 2025-10-14 09:46:40.025 2 DEBUG nova.compute.manager [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m Oct 14 05:46:40 localhost nova_compute[238069]: 2025-10-14 09:46:40.041 2 DEBUG nova.compute.manager [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m Oct 14 05:46:40 localhost nova_compute[238069]: 2025-10-14 09:46:40.041 2 DEBUG oslo_service.periodic_task [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 05:46:40 localhost nova_compute[238069]: 2025-10-14 09:46:40.041 2 DEBUG nova.compute.manager [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Cleaning up deleted instances with incomplete migration _cleanup_incomplete_migrations 
/usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m Oct 14 05:46:40 localhost nova_compute[238069]: 2025-10-14 09:46:40.053 2 DEBUG oslo_service.periodic_task [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 05:46:40 localhost systemd[1]: var-lib-containers-storage-overlay-0b52816892c0967aea6a33893e73899adbf76e3ca055f6670535905d8ddf2b2c-merged.mount: Deactivated successfully. Oct 14 05:46:40 localhost systemd[1]: var-lib-containers-storage-overlay-1b94024f0eaacdff3ae200e2172324d7aec107282443f6fc22fe2f0287bc90ec-merged.mount: Deactivated successfully. Oct 14 05:46:40 localhost systemd[1]: var-lib-containers-storage-overlay-1b94024f0eaacdff3ae200e2172324d7aec107282443f6fc22fe2f0287bc90ec-merged.mount: Deactivated successfully. Oct 14 05:46:40 localhost nova_compute[238069]: 2025-10-14 09:46:40.524 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:46:41 localhost systemd[1]: var-lib-containers-storage-overlay-919c2496449756819846525fbfb351457636bf59d0964ccd47919cff1ec5dc94-merged.mount: Deactivated successfully. Oct 14 05:46:41 localhost systemd[1]: var-lib-containers-storage-overlay-5f427123442eee41b986e1fae04c3a7af8dd46dc41159db5ec6876f25e2a2dde-merged.mount: Deactivated successfully. Oct 14 05:46:41 localhost systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully. Oct 14 05:46:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29. 
Oct 14 05:46:41 localhost systemd[1]: var-lib-containers-storage-overlay-f098c0017d0da3f1457e04ccb48f16a39779d6b090c6b44cae8dda4d8a38938b-merged.mount: Deactivated successfully. Oct 14 05:46:41 localhost systemd[1]: var-lib-containers-storage-overlay-f098c0017d0da3f1457e04ccb48f16a39779d6b090c6b44cae8dda4d8a38938b-merged.mount: Deactivated successfully. Oct 14 05:46:41 localhost podman[251413]: 2025-10-14 09:46:41.871131996 +0000 UTC m=+0.086137935 container health_status fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.build-date=20251009, 
config_id=iscsid, io.buildah.version=1.41.3, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true) Oct 14 05:46:41 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61588 DF PROTO=TCP SPT=57364 DPT=9105 SEQ=717485439 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B0FE21A0000000001030307) Oct 14 05:46:41 localhost podman[251413]: 2025-10-14 09:46:41.885108202 +0000 UTC m=+0.100114171 container exec_died fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, org.label-schema.schema-version=1.0, config_id=iscsid, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', 
'/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}) Oct 14 05:46:42 localhost systemd[1]: var-lib-containers-storage-overlay-3a5231add129a89d0adead7ab11bea3dfa286b532e456cc25a1ad81207e8880c-merged.mount: Deactivated successfully. Oct 14 05:46:42 localhost systemd[1]: var-lib-containers-storage-overlay-e0f86229f02c4331620c9ec8e21be769ac9cff125fc1f01f8404fcae9b59e3df-merged.mount: Deactivated successfully. Oct 14 05:46:42 localhost nova_compute[238069]: 2025-10-14 09:46:42.401 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:46:42 localhost systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully. Oct 14 05:46:42 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Oct 14 05:46:42 localhost systemd[1]: fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29.service: Deactivated successfully. Oct 14 05:46:43 localhost systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully. 
Oct 14 05:46:43 localhost nova_compute[238069]: 2025-10-14 09:46:43.061 2 DEBUG oslo_service.periodic_task [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 05:46:43 localhost nova_compute[238069]: 2025-10-14 09:46:43.061 2 DEBUG oslo_service.periodic_task [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 05:46:43 localhost nova_compute[238069]: 2025-10-14 09:46:43.061 2 DEBUG oslo_service.periodic_task [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 05:46:43 localhost systemd[1]: var-lib-containers-storage-overlay-f098c0017d0da3f1457e04ccb48f16a39779d6b090c6b44cae8dda4d8a38938b-merged.mount: Deactivated successfully. Oct 14 05:46:43 localhost systemd[1]: var-lib-containers-storage-overlay-edfe04f9a55df360a7193df528cae2e7de5655e253d67fa05d14b0067469a682-merged.mount: Deactivated successfully. Oct 14 05:46:43 localhost systemd[1]: var-lib-containers-storage-overlay-edfe04f9a55df360a7193df528cae2e7de5655e253d67fa05d14b0067469a682-merged.mount: Deactivated successfully. 
Oct 14 05:46:44 localhost nova_compute[238069]: 2025-10-14 09:46:44.019 2 DEBUG oslo_service.periodic_task [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 05:46:44 localhost nova_compute[238069]: 2025-10-14 09:46:44.023 2 DEBUG oslo_service.periodic_task [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 05:46:44 localhost nova_compute[238069]: 2025-10-14 09:46:44.024 2 DEBUG nova.compute.manager [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Oct 14 05:46:44 localhost nova_compute[238069]: 2025-10-14 09:46:44.024 2 DEBUG nova.compute.manager [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Oct 14 05:46:44 localhost nova_compute[238069]: 2025-10-14 09:46:44.487 2 DEBUG oslo_concurrency.lockutils [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Acquiring lock "refresh_cache-88c4e366-b765-47a6-96bf-f7677f2ce67c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Oct 14 05:46:44 localhost nova_compute[238069]: 2025-10-14 09:46:44.487 2 DEBUG oslo_concurrency.lockutils [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Acquired lock "refresh_cache-88c4e366-b765-47a6-96bf-f7677f2ce67c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Oct 14 05:46:44 localhost nova_compute[238069]: 2025-10-14 09:46:44.488 2 DEBUG nova.network.neutron [None 
req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] [instance: 88c4e366-b765-47a6-96bf-f7677f2ce67c] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Oct 14 05:46:44 localhost nova_compute[238069]: 2025-10-14 09:46:44.488 2 DEBUG nova.objects.instance [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 88c4e366-b765-47a6-96bf-f7677f2ce67c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Oct 14 05:46:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec. Oct 14 05:46:44 localhost podman[251431]: 2025-10-14 09:46:44.70793835 +0000 UTC m=+0.055791581 container health_status aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 
'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter) Oct 14 05:46:44 localhost podman[251431]: 2025-10-14 09:46:44.711710955 +0000 UTC m=+0.059564126 container exec_died aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Oct 14 05:46:44 localhost nova_compute[238069]: 2025-10-14 09:46:44.836 2 DEBUG nova.network.neutron [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] [instance: 88c4e366-b765-47a6-96bf-f7677f2ce67c] Updating instance_info_cache with 
network_info: [{"id": "3ec9b060-f43d-4698-9c76-6062c70911d5", "address": "fa:16:3e:84:5e:e5", "network": {"id": "7d0cd696-bdd7-4e70-9512-eb0d23640314", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.46", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "41187b090f3d4818a32baa37ce8a3991", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3ec9b060-f4", "ovs_interfaceid": "3ec9b060-f43d-4698-9c76-6062c70911d5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Oct 14 05:46:44 localhost nova_compute[238069]: 2025-10-14 09:46:44.848 2 DEBUG oslo_concurrency.lockutils [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Releasing lock "refresh_cache-88c4e366-b765-47a6-96bf-f7677f2ce67c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Oct 14 05:46:44 localhost nova_compute[238069]: 2025-10-14 09:46:44.849 2 DEBUG nova.compute.manager [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] [instance: 88c4e366-b765-47a6-96bf-f7677f2ce67c] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Oct 14 05:46:44 localhost nova_compute[238069]: 2025-10-14 09:46:44.849 2 DEBUG oslo_service.periodic_task [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 
- - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 05:46:44 localhost nova_compute[238069]: 2025-10-14 09:46:44.849 2 DEBUG oslo_service.periodic_task [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 05:46:44 localhost nova_compute[238069]: 2025-10-14 09:46:44.862 2 DEBUG oslo_concurrency.lockutils [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Oct 14 05:46:44 localhost nova_compute[238069]: 2025-10-14 09:46:44.862 2 DEBUG oslo_concurrency.lockutils [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Oct 14 05:46:44 localhost nova_compute[238069]: 2025-10-14 09:46:44.863 2 DEBUG oslo_concurrency.lockutils [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Oct 14 05:46:44 localhost nova_compute[238069]: 2025-10-14 09:46:44.863 2 DEBUG nova.compute.resource_tracker [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Auditing locally available compute resources for np0005486733.localdomain (node: np0005486733.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Oct 14 05:46:44 
localhost nova_compute[238069]: 2025-10-14 09:46:44.863 2 DEBUG oslo_concurrency.processutils [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Oct 14 05:46:45 localhost systemd[1]: var-lib-containers-storage-overlay-919c2496449756819846525fbfb351457636bf59d0964ccd47919cff1ec5dc94-merged.mount: Deactivated successfully. Oct 14 05:46:45 localhost nova_compute[238069]: 2025-10-14 09:46:45.264 2 DEBUG oslo_concurrency.processutils [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.401s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Oct 14 05:46:45 localhost systemd[1]: var-lib-containers-storage-overlay-919c2496449756819846525fbfb351457636bf59d0964ccd47919cff1ec5dc94-merged.mount: Deactivated successfully. Oct 14 05:46:45 localhost nova_compute[238069]: 2025-10-14 09:46:45.323 2 DEBUG nova.virt.libvirt.driver [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Oct 14 05:46:45 localhost nova_compute[238069]: 2025-10-14 09:46:45.324 2 DEBUG nova.virt.libvirt.driver [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Oct 14 05:46:45 localhost nova_compute[238069]: 2025-10-14 09:46:45.484 2 WARNING nova.virt.libvirt.driver [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Oct 14 05:46:45 localhost nova_compute[238069]: 2025-10-14 09:46:45.486 2 DEBUG nova.compute.resource_tracker [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Hypervisor/Node resource view: name=np0005486733.localdomain free_ram=12233MB free_disk=41.83725357055664GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": 
"1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Oct 14 05:46:45 localhost nova_compute[238069]: 2025-10-14 09:46:45.486 2 DEBUG oslo_concurrency.lockutils [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Oct 14 05:46:45 localhost nova_compute[238069]: 2025-10-14 09:46:45.487 2 DEBUG oslo_concurrency.lockutils [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Oct 14 05:46:45 localhost nova_compute[238069]: 2025-10-14 09:46:45.525 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:46:45 localhost systemd[1]: var-lib-containers-storage-overlay-948d63d72c90238568600bb4ced3a347f3a772760aabfa54019ccce9078bd0ca-merged.mount: Deactivated successfully. Oct 14 05:46:45 localhost systemd[1]: var-lib-containers-storage-overlay-182f4b56e6e8809f2ffde261aea7a82f597fbc875533d1efd7f59fe7c8a139ed-merged.mount: Deactivated successfully. 
Oct 14 05:46:45 localhost nova_compute[238069]: 2025-10-14 09:46:45.808 2 DEBUG nova.compute.resource_tracker [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Instance 88c4e366-b765-47a6-96bf-f7677f2ce67c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Oct 14 05:46:45 localhost nova_compute[238069]: 2025-10-14 09:46:45.808 2 DEBUG nova.compute.resource_tracker [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Oct 14 05:46:45 localhost nova_compute[238069]: 2025-10-14 09:46:45.808 2 DEBUG nova.compute.resource_tracker [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Final resource view: name=np0005486733.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Oct 14 05:46:45 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61589 DF PROTO=TCP SPT=57364 DPT=9105 SEQ=717485439 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B0FF1DA0000000001030307) Oct 14 05:46:46 localhost systemd[1]: aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec.service: Deactivated successfully. 
Oct 14 05:46:46 localhost nova_compute[238069]: 2025-10-14 09:46:46.083 2 DEBUG nova.scheduler.client.report [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Refreshing inventories for resource provider 18c24273-aca2-4f08-be57-3188d558235e _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m Oct 14 05:46:46 localhost nova_compute[238069]: 2025-10-14 09:46:46.100 2 DEBUG nova.scheduler.client.report [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Updating ProviderTree inventory for provider 18c24273-aca2-4f08-be57-3188d558235e from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m Oct 14 05:46:46 localhost nova_compute[238069]: 2025-10-14 09:46:46.100 2 DEBUG nova.compute.provider_tree [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Updating inventory in ProviderTree for provider 18c24273-aca2-4f08-be57-3188d558235e with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m Oct 14 05:46:46 localhost nova_compute[238069]: 2025-10-14 09:46:46.119 2 DEBUG nova.scheduler.client.report [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Refreshing aggregate 
associations for resource provider 18c24273-aca2-4f08-be57-3188d558235e, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m Oct 14 05:46:46 localhost nova_compute[238069]: 2025-10-14 09:46:46.140 2 DEBUG nova.scheduler.client.report [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Refreshing trait associations for resource provider 18c24273-aca2-4f08-be57-3188d558235e, traits: COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_STORAGE_BUS_IDE,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_STORAGE_BUS_USB,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_CLMUL,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_F16C,HW_CPU_X86_AESNI,COMPUTE_SECURITY_TPM_2_0,HW_CPU_X86_SSE,HW_CPU_X86_ABM,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_IMAGE_TYPE_QCOW2,HW_CPU_X86_BMI,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_SSE4A,HW_CPU_X86_SVM,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_SECURITY_TPM_1_2,COMPUTE_ACCELERATORS,HW_CPU_X86_AMD_SVM,HW_CPU_X86_AVX2,COMPUTE_DEVICE_TAGGING,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_STORAGE_BUS_FDC,COMPUTE_STORAGE_BUS_SATA,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_SHA,COMPUTE_NET_VIF_MODEL_VIRTIO,HW_CPU_X86_FMA3,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_SSE2,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_BMI2,COMPUTE_NODE,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_MMX,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_SSE41,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_AVX _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m Oct 14 05:46:46 
localhost systemd[1]: var-lib-containers-storage-overlay-182f4b56e6e8809f2ffde261aea7a82f597fbc875533d1efd7f59fe7c8a139ed-merged.mount: Deactivated successfully. Oct 14 05:46:46 localhost nova_compute[238069]: 2025-10-14 09:46:46.297 2 DEBUG oslo_concurrency.processutils [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Oct 14 05:46:46 localhost nova_compute[238069]: 2025-10-14 09:46:46.688 2 DEBUG oslo_concurrency.processutils [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.391s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Oct 14 05:46:46 localhost nova_compute[238069]: 2025-10-14 09:46:46.697 2 DEBUG nova.compute.provider_tree [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Inventory has not changed in ProviderTree for provider: 18c24273-aca2-4f08-be57-3188d558235e update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Oct 14 05:46:46 localhost nova_compute[238069]: 2025-10-14 09:46:46.713 2 DEBUG nova.scheduler.client.report [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Inventory has not changed for provider 18c24273-aca2-4f08-be57-3188d558235e based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Oct 14 05:46:46 localhost nova_compute[238069]: 2025-10-14 09:46:46.715 
2 DEBUG nova.compute.resource_tracker [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Compute_service record updated for np0005486733.localdomain:np0005486733.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Oct 14 05:46:46 localhost nova_compute[238069]: 2025-10-14 09:46:46.715 2 DEBUG oslo_concurrency.lockutils [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.228s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Oct 14 05:46:47 localhost nova_compute[238069]: 2025-10-14 09:46:47.441 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:46:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad. Oct 14 05:46:47 localhost systemd[1]: tmp-crun.h9ifsD.mount: Deactivated successfully. 
Oct 14 05:46:47 localhost podman[251499]: 2025-10-14 09:46:47.721544599 +0000 UTC m=+0.070715835 container health_status 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=multipathd, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d) Oct 14 05:46:47 localhost podman[251499]: 2025-10-14 09:46:47.7310852 +0000 UTC m=+0.080256426 container exec_died 
02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true) Oct 14 05:46:47 localhost systemd[1]: var-lib-containers-storage-overlay-d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610-merged.mount: Deactivated successfully. 
Oct 14 05:46:47 localhost nova_compute[238069]: 2025-10-14 09:46:47.890 2 DEBUG oslo_service.periodic_task [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 05:46:47 localhost nova_compute[238069]: 2025-10-14 09:46:47.891 2 DEBUG oslo_service.periodic_task [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 05:46:47 localhost nova_compute[238069]: 2025-10-14 09:46:47.891 2 DEBUG nova.compute.manager [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Oct 14 05:46:48 localhost systemd[1]: var-lib-containers-storage-overlay-948d63d72c90238568600bb4ced3a347f3a772760aabfa54019ccce9078bd0ca-merged.mount: Deactivated successfully. Oct 14 05:46:48 localhost kernel: overlayfs: upperdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Oct 14 05:46:48 localhost kernel: overlayfs: workdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Oct 14 05:46:48 localhost systemd[1]: 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad.service: Deactivated successfully. 
Oct 14 05:46:48 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61057 DF PROTO=TCP SPT=48220 DPT=9101 SEQ=4020743946 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B0FFA9B0000000001030307) Oct 14 05:46:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917. Oct 14 05:46:48 localhost podman[251519]: 2025-10-14 09:46:48.722568406 +0000 UTC m=+0.062265199 container health_status 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal 
rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, config_id=edpm, container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, architecture=x86_64, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., managed_by=edpm_ansible, release=1755695350) Oct 14 05:46:48 localhost podman[251519]: 2025-10-14 09:46:48.739056398 +0000 UTC m=+0.078753221 container exec_died 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, container_name=openstack_network_exporter, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, managed_by=edpm_ansible, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, vendor=Red Hat, Inc., release=1755695350, 
io.openshift.tags=minimal rhel9, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.openshift.expose-services=, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}) Oct 14 05:46:49 localhost systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully. 
Oct 14 05:46:49 localhost podman[248187]: time="2025-10-14T09:46:49Z" level=error msg="Getting root fs size for \"99bc33bcb09deb13e3026a7596b5a2cca2bae64aca6ed031d39492bb093d9aa2\": getting diffsize of layer \"d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610\" and its parent \"19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8\": unmounting layer d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610: replacing mount point \"/var/lib/containers/storage/overlay/d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610/merged\": device or resource busy" Oct 14 05:46:49 localhost kernel: overlayfs: upperdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Oct 14 05:46:49 localhost kernel: overlayfs: workdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Oct 14 05:46:49 localhost systemd[1]: 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917.service: Deactivated successfully. Oct 14 05:46:49 localhost systemd[1]: var-lib-containers-storage-overlay-d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610-merged.mount: Deactivated successfully. Oct 14 05:46:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f. Oct 14 05:46:49 localhost systemd[1]: var-lib-containers-storage-overlay-e0f86229f02c4331620c9ec8e21be769ac9cff125fc1f01f8404fcae9b59e3df-merged.mount: Deactivated successfully. Oct 14 05:46:49 localhost systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully. 
Oct 14 05:46:49 localhost podman[251540]: 2025-10-14 09:46:49.81906078 +0000 UTC m=+0.089736975 container health_status 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS) Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.818 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'name': 'test', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 
'np0005486733.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '41187b090f3d4818a32baa37ce8a3991', 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'hostId': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228 Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.819 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.819 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.825 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.outgoing.packets volume: 129 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.828 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '3fe9a420-1f1d-46c8-903c-efd546f8f40f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 129, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T09:46:49.819813', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': 'b4d2a38e-a8e2-11f0-9707-fa163e99780b', 'monotonic_time': 11226.012495608, 'message_signature': 'fbd7db979b7d024f13d9aaed00219cd443d1817d52ba7017c3d9cb8cd26f0ab4'}]}, 'timestamp': '2025-10-14 09:46:49.826610', '_unique_id': '67517ca06ba946f98ed19f80d37b2808'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.828 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:46:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.828 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.828 12 ERROR oslo_messaging.notify.messaging yield Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.828 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.828 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.828 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.828 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.828 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.828 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.828 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.828 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.828 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.828 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.828 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.828 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.828 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.828 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.828 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.828 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.828 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.828 12 ERROR oslo_messaging.notify.messaging Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.828 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.828 12 ERROR oslo_messaging.notify.messaging Oct 14 05:46:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.828 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.828 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.828 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.828 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.828 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.828 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.828 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.828 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.828 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.828 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.828 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.828 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.828 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.828 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.828 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.828 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.828 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.828 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.828 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.828 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 05:46:49 
localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.828 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.828 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.828 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.828 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.828 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.828 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.828 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.828 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.828 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.828 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.828 12 ERROR oslo_messaging.notify.messaging Oct 14 05:46:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.828 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.828 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.901 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.write.requests volume: 525 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 05:46:49 localhost podman[251540]: 2025-10-14 09:46:49.90175655 +0000 UTC m=+0.172432755 container exec_died 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251009, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3) Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.901 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.903 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c3522f96-eae9-45ab-b1e7-91e0e94ed3ae', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 525, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T09:46:49.828869', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 
'message_id': 'b4de2484-a8e2-11f0-9707-fa163e99780b', 'monotonic_time': 11226.021560465, 'message_signature': 'aa48df8f56e7205f48273c1f1434125704b20fa1f1166c71e9f0bc92809e1f20'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T09:46:49.828869', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'b4de30c8-a8e2-11f0-9707-fa163e99780b', 'monotonic_time': 11226.021560465, 'message_signature': '7c003447ed8c04be539f0eadd08686ac0db745789c0107f8af10d17f486cd24a'}]}, 'timestamp': '2025-10-14 09:46:49.902214', '_unique_id': 'e660184be9b14b16a7ec03f326f147be'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.903 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.903 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", 
line 446, in _reraise_as_library_errors Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.903 12 ERROR oslo_messaging.notify.messaging yield Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.903 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.903 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.903 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.903 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.903 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.903 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.903 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.903 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.903 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 
09:46:49.903 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.903 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.903 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.903 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.903 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.903 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.903 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.903 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.903 12 ERROR oslo_messaging.notify.messaging Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.903 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.903 12 ERROR oslo_messaging.notify.messaging Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.903 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:46:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.903 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.903 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.903 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.903 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.903 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.903 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.903 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.903 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.903 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.903 12 ERROR oslo_messaging.notify.messaging return 
rpc_common.ConnectionContext(self._connection_pool,
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.903 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.903 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.903 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.903 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.903 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.903 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.903 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.903 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.903 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.903 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.903 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.903 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.903 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.903 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.903 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.903 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.903 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.903 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.903 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.903 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.903 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.904 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.904 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '56b9264a-6bc9-4e80-acd9-e82f2d061ffa', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T09:46:49.904061', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': 'b4de8492-a8e2-11f0-9707-fa163e99780b', 'monotonic_time': 11226.012495608, 'message_signature': '7ff70aec9b5bf77e2469e9a645d57aabed9ec81563e487eeaf5f28ae81259939'}]}, 'timestamp': '2025-10-14 09:46:49.904380', '_unique_id': 'a55f7d1bdf98428ea89730acafeb8cd0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.904 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.904 12 ERROR oslo_messaging.notify.messaging yield
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.904 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.904 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.904 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.904 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.904 12 ERROR oslo_messaging.notify.messaging conn.connect()
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.904 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.904 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.904 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.904 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.904 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.904 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.904 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.904 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.904 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.904 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.904 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.904 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.904 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.904 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.904 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.904 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.904 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.904 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.904 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.904 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.904 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.904 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.904 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.904 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.905 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.905 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.906 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8248620f-05ce-4cfa-b695-9b886665bf10', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T09:46:49.905471', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': 'b4deb8d6-a8e2-11f0-9707-fa163e99780b', 'monotonic_time': 11226.012495608, 'message_signature': '11df98c232e2e474096d4d2423f80f7456c006ef36300908ac25ce11601b93bd'}]}, 'timestamp': '2025-10-14 09:46:49.905738', '_unique_id': '38805903034d4799820acccd4d13d4f1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.906 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.906 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.906 12 ERROR oslo_messaging.notify.messaging yield
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.906 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.906 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.906 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.906 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.906 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.906 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.906 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.906 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.906 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.906 12 ERROR oslo_messaging.notify.messaging conn.connect()
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.906 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.906 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.906 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.906 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.906 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.906 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.906 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.906 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.906 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.906 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.906 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.906 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.906 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.906 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.906 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.906 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.906 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.906 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.906 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.906 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.906 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.906 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.906 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.906 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.906 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.906 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.906 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.906 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.906 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.906 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.906 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.906 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.906 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.906 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.906 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.906 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.906 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.906 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.906 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.906 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.906 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.906 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.907 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.write.bytes volume: 73895936 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.907 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.907 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '627627e8-acab-4f0d-aed8-79f5b09be8fe', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 73895936, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T09:46:49.906991', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'b4def422-a8e2-11f0-9707-fa163e99780b', 'monotonic_time': 11226.021560465, 'message_signature': '112bbfa9ebbbd9f74ef55853271e0d8560b92052d62649ee38d865efe1616d1e'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T09:46:49.906991', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'b4defb98-a8e2-11f0-9707-fa163e99780b', 'monotonic_time': 11226.021560465, 'message_signature': '2ef1a45e2d4e105544cd231b101f9e005eeabb26d67d34c4f836e363ba3decc4'}]}, 'timestamp': '2025-10-14 09:46:49.907429', '_unique_id': '5e3b3770cee740019e3c05088d8e8844'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.907 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.907 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.907 12 ERROR oslo_messaging.notify.messaging yield
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.907 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.907 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.907 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.907 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.907 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.907 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.907 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.907 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.907 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.907 12 ERROR oslo_messaging.notify.messaging conn.connect()
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.907 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.907 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.907 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.907 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.907 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.907 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.907 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.907 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.907 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.907 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.907 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.907 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.907 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.907 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.907 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.907 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.907 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.907 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.907 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.907 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.907 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.907 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.907 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.907 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.907 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.907 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.907 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.907 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.907 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.907 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.907 12 ERROR oslo_messaging.notify.messaging File
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.907 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.907 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.907 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.907 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.907 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.907 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.907 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.907 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.907 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.907 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:46:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.907 12 ERROR oslo_messaging.notify.messaging Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.908 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.908 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.908 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.incoming.packets volume: 81 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.909 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '8efd9b5d-a26b-442c-a3bd-9482cfb8ac80', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 81, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T09:46:49.908655', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': 'b4df35cc-a8e2-11f0-9707-fa163e99780b', 'monotonic_time': 11226.012495608, 'message_signature': 'eb92027abe7bfb886c1243eddce62ef19eae32852b211b003e8c27fc3d768f36'}]}, 'timestamp': '2025-10-14 09:46:49.908892', '_unique_id': 'd05dc1da8aa942399930ac995681757d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.909 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:46:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.909 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.909 12 ERROR oslo_messaging.notify.messaging yield Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.909 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.909 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.909 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.909 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.909 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.909 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.909 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.909 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.909 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.909 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.909 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.909 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.909 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.909 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.909 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.909 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.909 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.909 12 ERROR oslo_messaging.notify.messaging Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.909 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.909 12 ERROR oslo_messaging.notify.messaging Oct 14 05:46:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.909 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.909 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.909 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.909 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.909 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.909 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.909 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.909 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.909 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.909 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.909 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.909 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.909 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.909 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.909 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.909 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.909 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.909 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.909 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.909 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 05:46:49 
localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.909 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.909 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.909 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.909 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.909 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.909 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.909 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.909 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.909 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.909 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.909 12 ERROR oslo_messaging.notify.messaging Oct 14 05:46:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.909 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.910 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.write.latency volume: 217051898 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.910 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.write.latency volume: 24279644 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.910 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '98483e7e-8a4c-4bc7-82fc-037a7694142e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 217051898, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T09:46:49.910003', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': 
{'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'b4df6984-a8e2-11f0-9707-fa163e99780b', 'monotonic_time': 11226.021560465, 'message_signature': '1d066582a3ed34d41e3519e3ac509f63a065438644147e9c7a6c903f89d23683'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 24279644, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T09:46:49.910003', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'b4df726c-a8e2-11f0-9707-fa163e99780b', 'monotonic_time': 11226.021560465, 'message_signature': '449581d2322e55b1525f4f01efa7032ea30dae41feb235b71d547cabab553f5e'}]}, 'timestamp': '2025-10-14 09:46:49.910429', '_unique_id': '232b23050540414b98a249c7d193eade'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 
09:46:49.910 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.910 12 ERROR oslo_messaging.notify.messaging yield Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.910 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.910 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.910 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.910 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 05:46:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.910 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.910 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.910 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.910 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.910 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.910 12 ERROR oslo_messaging.notify.messaging Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.910 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 
2025-10-14 09:46:49.910 12 ERROR oslo_messaging.notify.messaging Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.910 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.910 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.910 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.910 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.910 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.910 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.910 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.910 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.910 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.910 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.910 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.910 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.910 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.910 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.910 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.910 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.910 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.910 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:46:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.910 12 ERROR oslo_messaging.notify.messaging Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.911 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.911 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.912 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'eacf1c26-add7-4697-964d-96eaa8e9169b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T09:46:49.911532', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 
'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': 'b4dfa55c-a8e2-11f0-9707-fa163e99780b', 'monotonic_time': 11226.012495608, 'message_signature': '5b0dcd862df40c51c1a63dd29bd4e58984381c8c9004dd3da612b13ccc213a3a'}]}, 'timestamp': '2025-10-14 09:46:49.911763', '_unique_id': '3249dd86bc1f4c899a3229c72f37f20c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.912 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.912 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.912 12 ERROR oslo_messaging.notify.messaging yield Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.912 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.912 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.912 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.912 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.912 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", 
line 877, in _connection_factory Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.912 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.912 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.912 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.912 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.912 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.912 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.912 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.912 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.912 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.912 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.912 12 
ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.912 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.912 12 ERROR oslo_messaging.notify.messaging Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.912 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.912 12 ERROR oslo_messaging.notify.messaging Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.912 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.912 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.912 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.912 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.912 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.912 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.912 12 ERROR 
oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.912 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.912 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.912 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.912 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.912 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.912 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.912 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.912 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.912 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.912 12 ERROR 
oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.912 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.912 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.912 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.912 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.912 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.912 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.912 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.912 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.912 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.912 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 05:46:49 
localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.912 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.912 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.912 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.912 12 ERROR oslo_messaging.notify.messaging Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.912 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.912 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.outgoing.bytes volume: 11272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.913 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'c1867082-ba30-40d5-9ee2-df6644d9e615', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 11272, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T09:46:49.912948', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': 'b4dfdcca-a8e2-11f0-9707-fa163e99780b', 'monotonic_time': 11226.012495608, 'message_signature': 'f3ef907d1f9a2ad1fa61a32e4838e2954a47642158afaa4b35a4ce4234dca0e1'}]}, 'timestamp': '2025-10-14 09:46:49.913172', '_unique_id': '03459bd5b0b4491aa53fad2d9dc1f8fe'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.913 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:46:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.913 12 ERROR oslo_messaging.notify.messaging yield Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.913 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.913 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.913 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.913 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.913 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.913 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.913 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.913 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.913 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.913 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.913 12 ERROR oslo_messaging.notify.messaging Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.913 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.913 12 ERROR oslo_messaging.notify.messaging Oct 14 05:46:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.913 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.913 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.913 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.913 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.913 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.913 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.913 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.913 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.913 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.913 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 05:46:49 
localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.913 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.913 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.913 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.913 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.913 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.913 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.913 12 ERROR oslo_messaging.notify.messaging Oct 14 05:46:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.914 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.929 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/cpu volume: 61260000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.931 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4e9fa25b-edb6-4778-a1b8-2493b80e8a4c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 61260000000, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'timestamp': '2025-10-14T09:46:49.914237', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': 'b4e27566-a8e2-11f0-9707-fa163e99780b', 'monotonic_time': 11226.122254763, 
'message_signature': '4d7c7d78323bea0b5376b2afd41080e9b60cf9b47735620ce2de4fd2e96994ad'}]}, 'timestamp': '2025-10-14 09:46:49.930334', '_unique_id': 'd2aff814adbb43e49a530c4c372ff192'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.931 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.931 12 ERROR oslo_messaging.notify.messaging yield Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.931 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.931 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.931 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.931 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.931 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.931 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.931 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.931 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.931 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.931 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 05:46:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.931 12 ERROR oslo_messaging.notify.messaging Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.931 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.931 12 ERROR oslo_messaging.notify.messaging Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.931 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.931 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.931 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.931 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", 
line 653, in _send Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.931 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.931 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.931 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.931 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.931 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.931 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.931 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.931 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.931 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.931 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.931 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 05:46:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.931 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.931 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.931 12 ERROR oslo_messaging.notify.messaging Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.931 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.931 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.read.requests volume: 1064 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.932 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.read.requests volume: 222 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.932 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '760f19f8-3e72-4e34-92f5-96ba1bc76d9b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1064, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T09:46:49.931766', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'b4e2bcb0-a8e2-11f0-9707-fa163e99780b', 'monotonic_time': 11226.021560465, 'message_signature': '2cd56b53dbaec89d3006c65f1b974b9cf760475ac20a8f185c3145fabd968280'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 222, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T09:46:49.931766', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'b4e2c4a8-a8e2-11f0-9707-fa163e99780b', 'monotonic_time': 11226.021560465, 'message_signature': 'fd7e58a51dd2583349bd6ee7579eeff42414640d2e7cc43e462e7ef97f7a780e'}]}, 'timestamp': '2025-10-14 09:46:49.932209', '_unique_id': '065b4c1f3b0447cd8a5994fc8e1ac987'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.932 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.932 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.932 12 ERROR oslo_messaging.notify.messaging yield Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.932 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.932 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.932 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.932 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.932 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.932 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.932 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.932 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.932 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.932 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.932 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.932 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.932 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 
05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.932 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.932 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.932 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.932 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.932 12 ERROR oslo_messaging.notify.messaging Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.932 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.932 12 ERROR oslo_messaging.notify.messaging Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.932 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.932 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.932 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.932 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 05:46:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.932 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.932 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.932 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.932 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.932 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.932 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.932 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.932 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.932 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.932 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.932 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.932 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.932 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.932 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.932 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.932 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.932 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.932 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.932 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.932 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:46:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.932 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.932 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.932 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.932 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.932 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.932 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.932 12 ERROR oslo_messaging.notify.messaging Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.933 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.945 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.946 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.allocation volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 05:46:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.946 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b42fe987-705b-42c6-ac20-adf11736c6c4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T09:46:49.933417', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'b4e4e0f8-a8e2-11f0-9707-fa163e99780b', 'monotonic_time': 11226.1261194, 'message_signature': '764acf9103275f7071dc8bf8a6d6d2b2a0b66bb2c4b3606c4658fbad61295129'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': 
'2025-10-14T09:46:49.933417', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'b4e4eba2-a8e2-11f0-9707-fa163e99780b', 'monotonic_time': 11226.1261194, 'message_signature': 'd129e1e38ed942884fa5ffca6fc8e0d70b32e19544ccf54e493dde045aac9b03'}]}, 'timestamp': '2025-10-14 09:46:49.946308', '_unique_id': 'f94ba414a3fb4f309dda416384958846'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.946 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.946 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.946 12 ERROR oslo_messaging.notify.messaging yield Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.946 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.946 12 ERROR oslo_messaging.notify.messaging return retry_over_time( 
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.946 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.946 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.946 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.946 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.946 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.946 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.946 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.946 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.946 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.946 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.946 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.946 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.946 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.946 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.946 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.946 12 ERROR oslo_messaging.notify.messaging Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.946 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.946 12 ERROR oslo_messaging.notify.messaging Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.946 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.946 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.946 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.946 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.946 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.946 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.946 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.946 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.946 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.946 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.946 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.946 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.946 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.946 
12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.946 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.946 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.946 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.946 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.946 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.946 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.946 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.946 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.946 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.946 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.946 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.946 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.946 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.946 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.946 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.946 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.946 12 ERROR oslo_messaging.notify.messaging Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.947 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.947 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.948 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'd996af2f-dcb7-45c2-a140-39f754ba1d0a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T09:46:49.947619', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': 'b4e527fc-a8e2-11f0-9707-fa163e99780b', 'monotonic_time': 11226.012495608, 'message_signature': '2e5e7c02b52d745dd5d02dd5cbf17ebaba8b254405001245039307322bd6ce3c'}]}, 'timestamp': '2025-10-14 09:46:49.947864', '_unique_id': 'b700d751c45a4cbf8c7d7bcf957ce802'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.948 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:46:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.948 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.948 12 ERROR oslo_messaging.notify.messaging yield Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.948 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.948 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.948 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.948 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.948 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.948 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.948 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.948 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.948 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.948 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.948 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.948 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.948 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.948 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.948 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.948 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.948 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.948 12 ERROR oslo_messaging.notify.messaging Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.948 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.948 12 ERROR oslo_messaging.notify.messaging Oct 14 05:46:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.948 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.948 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.948 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.948 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.948 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.948 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.948 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.948 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.948 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.948 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.948 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.948 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.948 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.948 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.948 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.948 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.948 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.948 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.948 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.948 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 05:46:49 
localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.948 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.948 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.948 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.948 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.948 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.948 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.948 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.948 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.948 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.948 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.948 12 ERROR oslo_messaging.notify.messaging Oct 14 05:46:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.948 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.948 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.949 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5de6cff2-48f9-4b9c-a1db-49faf67c4cea', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T09:46:49.948901', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 
'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': 'b4e55916-a8e2-11f0-9707-fa163e99780b', 'monotonic_time': 11226.012495608, 'message_signature': '3f48559422afacfc091b991c37b55956d2f7bad605168debd84e01ed9df8cad7'}]}, 'timestamp': '2025-10-14 09:46:49.949118', '_unique_id': '8852bd6813744a028479560703e48732'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.949 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.949 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.949 12 ERROR oslo_messaging.notify.messaging yield Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.949 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.949 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.949 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.949 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.949 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 
09:46:49.949 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.949 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.949 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.949 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.949 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.949 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.949 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.949 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.949 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.949 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.949 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 05:46:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.949 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.949 12 ERROR oslo_messaging.notify.messaging Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.949 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.949 12 ERROR oslo_messaging.notify.messaging Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.949 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.949 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.949 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.949 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.949 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.949 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.949 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 05:46:49 
localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.949 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.949 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.949 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.949 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.949 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.949 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.949 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.949 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.949 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.949 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) 
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.949 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.949 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.949 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.949 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.949 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.949 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.949 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.949 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.949 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.949 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.949 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.949 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.949 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.949 12 ERROR oslo_messaging.notify.messaging Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.950 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.950 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.read.latency volume: 1400578039 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.950 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.read.latency volume: 174873109 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.951 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '2837a1cc-a8cf-474c-b9be-d6c96fef7516', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1400578039, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T09:46:49.950179', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'b4e58ada-a8e2-11f0-9707-fa163e99780b', 'monotonic_time': 11226.021560465, 'message_signature': '37c8e14347ed7a4791bc6f35f25d770b04f51c72c92dea5ccd4d571e1172f320'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 174873109, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T09:46:49.950179', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'b4e59390-a8e2-11f0-9707-fa163e99780b', 'monotonic_time': 11226.021560465, 'message_signature': 'ba9e6c7ce7ea9856085178fed5d268cb6599c5b9cc755ad891303545c6599f27'}]}, 'timestamp': '2025-10-14 09:46:49.950598', '_unique_id': '40b4ccd3cded45209ffa358eea901e51'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.951 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.951 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.951 12 ERROR oslo_messaging.notify.messaging yield Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.951 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.951 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.951 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.951 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.951 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.951 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.951 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.951 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.951 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.951 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.951 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.951 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.951 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 
05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.951 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.951 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.951 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.951 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.951 12 ERROR oslo_messaging.notify.messaging Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.951 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.951 12 ERROR oslo_messaging.notify.messaging Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.951 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.951 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.951 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.951 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 05:46:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.951 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.951 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.951 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.951 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.951 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.951 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.951 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.951 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.951 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.951 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.951 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.951 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.951 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.951 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.951 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.951 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.951 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.951 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.951 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.951 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:46:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.951 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.951 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.951 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.951 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.951 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.951 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.951 12 ERROR oslo_messaging.notify.messaging Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.951 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.951 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.951 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 05:46:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.952 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f28cb6b8-cf2c-4c73-922e-8322e40d01db', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T09:46:49.951704', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'b4e5c694-a8e2-11f0-9707-fa163e99780b', 'monotonic_time': 11226.1261194, 'message_signature': '82c5fba76dd8e8e0d75f5b0a808f233e88a8721b043b66338d9d0afabacc8aff'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': 
'2025-10-14T09:46:49.951704', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'b4e5cdba-a8e2-11f0-9707-fa163e99780b', 'monotonic_time': 11226.1261194, 'message_signature': 'eef7d45d539da1fe63d8ddfbfeee93c7ee6b726e72f5b27ac8581e895f4b82d1'}]}, 'timestamp': '2025-10-14 09:46:49.952085', '_unique_id': 'd0b3b169621e4326a313116472068ebb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.952 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.952 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.952 12 ERROR oslo_messaging.notify.messaging yield Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.952 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.952 12 ERROR oslo_messaging.notify.messaging return retry_over_time( 
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.952 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.952 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.952 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.952 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.952 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.952 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.952 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.952 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.952 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.952 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.952 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.952 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.952 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.952 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.952 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.952 12 ERROR oslo_messaging.notify.messaging Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.952 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.952 12 ERROR oslo_messaging.notify.messaging Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.952 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.952 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.952 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.952 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.952 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.952 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.952 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.952 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.952 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.952 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.952 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.952 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.952 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.952 
12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.952 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.952 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.952 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.952 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.952 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.952 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.952 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.952 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.952 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.952 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.952 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.952 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.952 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.952 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.952 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.952 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.952 12 ERROR oslo_messaging.notify.messaging Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.953 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.953 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.953 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '2fa0d1f1-c4eb-453e-9019-0c588aeffc3a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T09:46:49.953105', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': 'b4e5fd4e-a8e2-11f0-9707-fa163e99780b', 'monotonic_time': 11226.012495608, 'message_signature': '2c3e610a96406b39b0cd5a8be0bf9003bd35c182032dde740b9704e4759df6c6'}]}, 'timestamp': '2025-10-14 09:46:49.953318', '_unique_id': '12f131016e2c4176a38fa82c9eb5d8ec'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.953 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 
2025-10-14 09:46:49.953 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.953 12 ERROR oslo_messaging.notify.messaging yield Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.953 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.953 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.953 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.953 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.953 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.953 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.953 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.953 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.953 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.953 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.953 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.953 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.953 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.953 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.953 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.953 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.953 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.953 12 ERROR oslo_messaging.notify.messaging Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.953 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.953 12 ERROR oslo_messaging.notify.messaging Oct 14 05:46:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.953 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.953 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.953 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.953 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.953 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.953 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.953 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.953 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.953 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.953 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.953 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.953 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.953 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.953 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.953 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.953 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.953 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.953 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.953 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.953 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 05:46:49 
localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.953 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.953 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.953 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.953 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.953 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.953 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.953 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.953 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.953 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.953 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.953 12 ERROR oslo_messaging.notify.messaging Oct 14 05:46:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.954 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.954 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/memory.usage volume: 52.32421875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.954 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e5c3bbf5-0312-4229-bd35-bb188e0eeefa', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 52.32421875, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'timestamp': '2025-10-14T09:46:49.954313', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': 'b4e62c60-a8e2-11f0-9707-fa163e99780b', 'monotonic_time': 11226.122254763, 
'message_signature': 'd987f4fa1fc3436d642a622fbcf2c195965faa71a608be5f9770749154dc0293'}]}, 'timestamp': '2025-10-14 09:46:49.954514', '_unique_id': '5bd95e6a1f4147f79ea6f8a182b56a4a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.954 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.954 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.954 12 ERROR oslo_messaging.notify.messaging yield Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.954 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.954 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.954 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.954 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.954 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.954 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.954 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.954 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.954 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.954 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.954 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.954 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.954 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.954 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.954 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.954 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.954 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 05:46:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.954 12 ERROR oslo_messaging.notify.messaging Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.954 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.954 12 ERROR oslo_messaging.notify.messaging Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.954 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.954 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.954 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.954 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.954 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.954 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.954 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.954 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", 
line 653, in _send Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.954 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.954 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.954 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.954 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.954 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.954 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.954 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.954 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.954 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.954 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.954 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.954 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.954 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.954 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.954 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.954 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.954 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.954 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.954 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.954 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 05:46:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.954 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.954 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.954 12 ERROR oslo_messaging.notify.messaging Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.955 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.955 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.955 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.956 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '9cd8ef73-f7e5-49c4-8a0b-d779a5dbc374', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T09:46:49.955503', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'b4e65ac8-a8e2-11f0-9707-fa163e99780b', 'monotonic_time': 11226.1261194, 'message_signature': '6a9b2aba66dba94acb976c160e7810ef9ff325e13901c49e6e5cd7367c8dc06e'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T09:46:49.955503', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 
'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'b4e662fc-a8e2-11f0-9707-fa163e99780b', 'monotonic_time': 11226.1261194, 'message_signature': '26b91147806e7e99a8e0c68b2ecc4dcadb82ba017ed375df27d1cc7292b53262'}]}, 'timestamp': '2025-10-14 09:46:49.955907', '_unique_id': '5d69ff40f13a4affb530a22654d9ec1a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.956 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.956 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.956 12 ERROR oslo_messaging.notify.messaging yield Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.956 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.956 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.956 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.956 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.956 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.956 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.956 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.956 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.956 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.956 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.956 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.956 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.956 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 05:46:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.956 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.956 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.956 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.956 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.956 12 ERROR oslo_messaging.notify.messaging Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.956 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.956 12 ERROR oslo_messaging.notify.messaging Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.956 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.956 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.956 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.956 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 
09:46:49.956 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.956 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.956 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.956 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.956 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.956 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.956 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.956 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.956 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.956 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 05:46:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.956 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.956 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.956 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.956 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.956 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.956 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.956 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.956 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.956 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.956 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 
09:46:49.956 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.956 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.956 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.956 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.956 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.956 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.956 12 ERROR oslo_messaging.notify.messaging Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.956 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.956 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.incoming.bytes volume: 8696 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.957 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'a0e634a6-020a-471a-9395-9d9884249395', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 8696, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T09:46:49.956949', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': 'b4e69376-a8e2-11f0-9707-fa163e99780b', 'monotonic_time': 11226.012495608, 'message_signature': '38050333e8c5db903324057ceed495ad9eb8bef83853719216ba2567bab22ef8'}]}, 'timestamp': '2025-10-14 09:46:49.957162', '_unique_id': 'e344f4f9f20047598f437331848fa78d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.957 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:46:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.957 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.957 12 ERROR oslo_messaging.notify.messaging yield Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.957 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.957 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.957 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.957 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.957 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.957 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.957 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.957 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.957 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.957 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.957 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.957 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.957 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.957 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.957 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.957 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.957 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.957 12 ERROR oslo_messaging.notify.messaging Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.957 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.957 12 ERROR oslo_messaging.notify.messaging Oct 14 05:46:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.957 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.957 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.957 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.957 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.957 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.957 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.957 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.957 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.957 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.957 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.957 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.957 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.957 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.957 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.957 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.957 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.957 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.957 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.957 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.957 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.957 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.957 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.957 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.957 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.957 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.957 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.957 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.957 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.957 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.957 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.957 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.958 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.958 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.read.bytes volume: 29130240 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.958 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.read.bytes volume: 4300800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.959 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '88409ec8-e95f-4019-8930-2a63d2ee61d9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 29130240, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T09:46:49.958202', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'b4e6c45e-a8e2-11f0-9707-fa163e99780b', 'monotonic_time': 11226.021560465, 'message_signature': '339d8e0d8742a810dcf4ff3ac570d329a4bb7bb5b1daa601d88753fd9857b4b8'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 4300800, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T09:46:49.958202', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'b4e6cbac-a8e2-11f0-9707-fa163e99780b', 'monotonic_time': 11226.021560465, 'message_signature': '3b4ad88b864f55967a6979c51f9153f6d5e1cc5e872d0520e07838e0fc15a069'}]}, 'timestamp': '2025-10-14 09:46:49.958586', '_unique_id': '23009e2630b94d2f98fa8946ec8202f5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.959 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.959 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.959 12 ERROR oslo_messaging.notify.messaging yield
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.959 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.959 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.959 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.959 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.959 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.959 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.959 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.959 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.959 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.959 12 ERROR oslo_messaging.notify.messaging conn.connect()
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.959 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.959 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.959 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.959 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.959 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.959 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.959 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.959 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.959 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.959 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.959 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.959 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.959 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.959 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.959 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.959 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.959 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.959 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.959 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.959 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.959 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.959 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.959 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.959 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.959 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.959 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.959 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.959 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.959 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.959 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.959 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.959 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.959 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.959 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.959 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.959 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.959 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.959 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.959 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.959 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.959 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:46:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:46:49.959 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 14 05:46:50 localhost nova_compute[238069]: 2025-10-14 09:46:50.525 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 14 05:46:51 localhost systemd[1]: var-lib-containers-storage-overlay-919c2496449756819846525fbfb351457636bf59d0964ccd47919cff1ec5dc94-merged.mount: Deactivated successfully.
Oct 14 05:46:51 localhost systemd[1]: var-lib-containers-storage-overlay-55b86a5ddc64363ea624c80cfd4e28da33e700e176dc8af3450feb34716db754-merged.mount: Deactivated successfully.
Oct 14 05:46:51 localhost kernel: overlayfs: upperdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Oct 14 05:46:51 localhost kernel: overlayfs: workdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Oct 14 05:46:51 localhost systemd[1]: var-lib-containers-storage-overlay-55b86a5ddc64363ea624c80cfd4e28da33e700e176dc8af3450feb34716db754-merged.mount: Deactivated successfully.
Oct 14 05:46:51 localhost systemd[1]: 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f.service: Deactivated successfully.
Oct 14 05:46:51 localhost systemd[1]: var-lib-containers-storage-overlay-a1185e7325783fe8cba63270bc6e59299386d7c73e4bc34c560a1fbc9e6d7e2c-merged.mount: Deactivated successfully.
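The repeated kombu.exceptions.OperationalError: [Errno 111] Connection refused above means the ceilometer agent's TCP connect to its RabbitMQ endpoint is being refused at the socket level (the innermost frame is amqp's self.sock.connect(sa)). A minimal, illustrative probe for that failure mode; the host/port below are assumptions, since the real broker address comes from the transport_url in the agent's configuration, not from this log:

```python
import socket

def amqp_reachable(host, port=5672, timeout=2.0):
    """Return True if a plain TCP connection to host:port succeeds.

    ECONNREFUSED ([Errno 111] in the log) surfaces here as OSError /
    ConnectionRefusedError, the same way it does in amqp.transport._connect.
    5672 is only the conventional AMQP port, an assumption for this sketch.
    """
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# A localhost port with no listener is refused immediately, mirroring
# the failure in the traceback above.
print(amqp_reachable("127.0.0.1", 1))
```

If this probe returns False for the configured broker address, the error is a plain reachability/service-down problem (broker not running, wrong address, or a firewall) rather than anything specific to oslo.messaging.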
Oct 14 05:46:52 localhost nova_compute[238069]: 2025-10-14 09:46:52.490 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 14 05:46:52 localhost systemd[1]: var-lib-containers-storage-overlay-2cd9444c84550fbd551e3826a8110fcc009757858b99e84f1119041f2325189b-merged.mount: Deactivated successfully.
Oct 14 05:46:53 localhost systemd[1]: var-lib-containers-storage-overlay-d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610-merged.mount: Deactivated successfully.
Oct 14 05:46:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46680 DF PROTO=TCP SPT=60252 DPT=9102 SEQ=2310199596 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B100F850000000001030307)
Oct 14 05:46:53 localhost systemd[1]: var-lib-containers-storage-overlay-46476cb54c317ede576986c939135db930b5a6eeb4db9b988aa8d7ddee484bf8-merged.mount: Deactivated successfully.
Oct 14 05:46:53 localhost systemd[1]: var-lib-containers-storage-overlay-46476cb54c317ede576986c939135db930b5a6eeb4db9b988aa8d7ddee484bf8-merged.mount: Deactivated successfully.
Oct 14 05:46:54 localhost systemd[1]: var-lib-containers-storage-overlay-182f4b56e6e8809f2ffde261aea7a82f597fbc875533d1efd7f59fe7c8a139ed-merged.mount: Deactivated successfully.
Oct 14 05:46:54 localhost systemd[1]: var-lib-containers-storage-overlay-0f85b4a61f0484c7d6d1230e2bc736bd8398f5346eb8306d97c0ce215dfc5ab2-merged.mount: Deactivated successfully.
Oct 14 05:46:54 localhost systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully.
Oct 14 05:46:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46681 DF PROTO=TCP SPT=60252 DPT=9102 SEQ=2310199596 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B10139A0000000001030307)
Oct 14 05:46:54 localhost systemd[1]: var-lib-containers-storage-overlay-d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610-merged.mount: Deactivated successfully.
Oct 14 05:46:54 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Oct 14 05:46:55 localhost systemd[1]: var-lib-containers-storage-overlay-e0f86229f02c4331620c9ec8e21be769ac9cff125fc1f01f8404fcae9b59e3df-merged.mount: Deactivated successfully.
Oct 14 05:46:55 localhost systemd[1]: var-lib-containers-storage-overlay-f479750c98f4a67ffae355a1e79b3c9a76d56699a79b842b4363e69f089cca49-merged.mount: Deactivated successfully.
Oct 14 05:46:55 localhost systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully.
Oct 14 05:46:55 localhost podman[248187]: time="2025-10-14T09:46:55Z" level=error msg="Getting root fs size for \"acd1c4ab6c1d8932ceea5284c5afe13514a697ecaf4356ac8c447e4780adbed8\": getting diffsize of layer \"19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8\" and its parent \"e0f86229f02c4331620c9ec8e21be769ac9cff125fc1f01f8404fcae9b59e3df\": unmounting layer 19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8: replacing mount point \"/var/lib/containers/storage/overlay/19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8/merged\": device or resource busy"
Oct 14 05:46:55 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Oct 14 05:46:55 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Oct 14 05:46:55 localhost nova_compute[238069]: 2025-10-14 09:46:55.529 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 14 05:46:55 localhost systemd[1]: var-lib-containers-storage-overlay-f479750c98f4a67ffae355a1e79b3c9a76d56699a79b842b4363e69f089cca49-merged.mount: Deactivated successfully.
Oct 14 05:46:56 localhost systemd[1]: var-lib-containers-storage-overlay-e0f86229f02c4331620c9ec8e21be769ac9cff125fc1f01f8404fcae9b59e3df-merged.mount: Deactivated successfully.
Oct 14 05:46:56 localhost systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully.
Oct 14 05:46:56 localhost systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully.
Oct 14 05:46:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d.
Oct 14 05:46:56 localhost podman[251563]: 2025-10-14 09:46:56.728711673 +0000 UTC m=+0.074199892 container health_status 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=unhealthy, config_id=edpm, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d)
Oct 14 05:46:56 localhost podman[251563]: 2025-10-14 09:46:56.761048187 +0000 UTC m=+0.106536326 container exec_died 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, config_id=edpm, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2)
Oct 14 05:46:56 localhost podman[251563]: unhealthy
Oct 14 05:46:56 localhost systemd[1]: var-lib-containers-storage-overlay-46476cb54c317ede576986c939135db930b5a6eeb4db9b988aa8d7ddee484bf8-merged.mount: Deactivated successfully.
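The podman event above carries health_status=unhealthy inside its parenthesized attribute list, and the unit failure that follows shortly after (status=1/FAILURE) is how that result surfaces in systemd. A small, purely illustrative extractor for pulling that attribute out of such an event line when scanning logs like this one (the function name and the assumption that the attribute appears as a bare key=value token are mine, not podman's):

```python
import re

def health_status(event_line):
    """Pull the health_status attribute out of a podman container event line,
    e.g. '... container health_status 89f6... (image=..., health_status=unhealthy, ...)'.
    Returns None if the line has no such attribute."""
    m = re.search(r"health_status=(\w+)", event_line)
    return m.group(1) if m else None

line = ("container health_status 89f6 (name=ceilometer_agent_compute, "
        "health_status=unhealthy, config_id=edpm)")
print(health_status(line))
```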
Oct 14 05:46:56 localhost systemd[1]: var-lib-containers-storage-overlay-02ae85124e4959ea5e505d3d23ffa956e944453d3a9644fd9b15fb0c07f7fbc0-merged.mount: Deactivated successfully. Oct 14 05:46:57 localhost systemd[1]: var-lib-containers-storage-overlay-f479750c98f4a67ffae355a1e79b3c9a76d56699a79b842b4363e69f089cca49-merged.mount: Deactivated successfully. Oct 14 05:46:57 localhost systemd[1]: var-lib-containers-storage-overlay-7318240cee041ece4858dfb7aae8a7f672f3a389f0d73c3ccdde9a227d0af2bb-merged.mount: Deactivated successfully. Oct 14 05:46:57 localhost systemd[1]: 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d.service: Main process exited, code=exited, status=1/FAILURE Oct 14 05:46:57 localhost systemd[1]: 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d.service: Failed with result 'exit-code'. Oct 14 05:46:57 localhost nova_compute[238069]: 2025-10-14 09:46:57.528 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:46:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:46:57.753 163055 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Oct 14 05:46:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:46:57.754 163055 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Oct 14 05:46:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:46:57.755 163055 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner 
/usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Oct 14 05:46:58 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57519 DF PROTO=TCP SPT=34940 DPT=9882 SEQ=3358641641 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B10224B0000000001030307) Oct 14 05:46:58 localhost systemd[1]: var-lib-containers-storage-overlay-55d5530fe8468c8c9907e0aa1de030811941604fa5f46de3db6dc15ec40906dd-merged.mount: Deactivated successfully. Oct 14 05:46:58 localhost systemd[1]: var-lib-containers-storage-overlay-ae0ebe7656e29542866ff018f5be9a3d02c88268a65814cf045e1b6c30ffd352-merged.mount: Deactivated successfully. Oct 14 05:46:58 localhost systemd[1]: var-lib-containers-storage-overlay-ae0ebe7656e29542866ff018f5be9a3d02c88268a65814cf045e1b6c30ffd352-merged.mount: Deactivated successfully. Oct 14 05:46:58 localhost systemd[1]: var-lib-containers-storage-overlay-55d5530fe8468c8c9907e0aa1de030811941604fa5f46de3db6dc15ec40906dd-merged.mount: Deactivated successfully. Oct 14 05:46:59 localhost systemd[1]: var-lib-containers-storage-overlay-55d5530fe8468c8c9907e0aa1de030811941604fa5f46de3db6dc15ec40906dd-merged.mount: Deactivated successfully. Oct 14 05:46:59 localhost systemd[1]: var-lib-containers-storage-overlay-8d123e2bf97cc7b3622c68162b04c29912e1822cdbe31a1ddf70016995925bac-merged.mount: Deactivated successfully. Oct 14 05:47:00 localhost systemd[1]: var-lib-containers-storage-overlay-6b7ccf96a28197636c7a5b8f45056e04db2357d7c2dc122633e916788515691d-merged.mount: Deactivated successfully. Oct 14 05:47:00 localhost systemd[1]: var-lib-containers-storage-overlay-ae0ebe7656e29542866ff018f5be9a3d02c88268a65814cf045e1b6c30ffd352-merged.mount: Deactivated successfully. 
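The kernel DROPPING entries scattered through this log use netfilter's KEY=VALUE log convention, which makes them easy to decode mechanically, e.g. to see which destination ports (9102, 9882 above) are being refused by the br-ex rules. A small illustrative parser; the helper name and regex are assumptions of this sketch, not part of any netfilter tooling:

```python
import re

# KEY=VALUE tokens; bare flags such as SYN/DF and the OPT (...) tail carry no
# '=' and are intentionally skipped.
_KV = re.compile(r"(\w+)=(\S*)")

def parse_drop(line):
    """Extract KEY=VALUE pairs from a kernel DROPPING log line into a dict."""
    return dict(_KV.findall(line))

fields = parse_drop(
    "DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b SRC=192.168.122.10 "
    "DST=192.168.122.108 PROTO=TCP SPT=60252 DPT=9102"
)
print(fields["DST"], fields["DPT"])  # → 192.168.122.108 9102
```

Grouping the parsed lines by DPT quickly shows whether the drops target one service (here, repeated SYNs from 192.168.122.10 to ports 9102 and 9882) or are scattered noise.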
Oct 14 05:47:00 localhost systemd[1]: var-lib-containers-storage-overlay-6b7ccf96a28197636c7a5b8f45056e04db2357d7c2dc122633e916788515691d-merged.mount: Deactivated successfully.
Oct 14 05:47:00 localhost nova_compute[238069]: 2025-10-14 09:47:00.532 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 14 05:47:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46683 DF PROTO=TCP SPT=60252 DPT=9102 SEQ=2310199596 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B102B5A0000000001030307)
Oct 14 05:47:02 localhost systemd[1]: var-lib-containers-storage-overlay-d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610-merged.mount: Deactivated successfully.
Oct 14 05:47:02 localhost systemd[1]: var-lib-containers-storage-overlay-8d123e2bf97cc7b3622c68162b04c29912e1822cdbe31a1ddf70016995925bac-merged.mount: Deactivated successfully.
Oct 14 05:47:02 localhost systemd[1]: var-lib-containers-storage-overlay-8d123e2bf97cc7b3622c68162b04c29912e1822cdbe31a1ddf70016995925bac-merged.mount: Deactivated successfully.
Oct 14 05:47:02 localhost nova_compute[238069]: 2025-10-14 09:47:02.572 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 14 05:47:03 localhost systemd[1]: var-lib-containers-storage-overlay-5ce6c5d0cc60f856680938093014249abcf9a107a94355720d953b1d1e7f1bfe-merged.mount: Deactivated successfully.
Oct 14 05:47:03 localhost systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully.
Oct 14 05:47:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857.
Oct 14 05:47:03 localhost systemd[1]: var-lib-containers-storage-overlay-d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610-merged.mount: Deactivated successfully.
Oct 14 05:47:03 localhost podman[251579]: 2025-10-14 09:47:03.632432524 +0000 UTC m=+0.097554073 container health_status 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct 14 05:47:03 localhost podman[251579]: 2025-10-14 09:47:03.662620844 +0000 UTC m=+0.127742393 container exec_died 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5)
Oct 14 05:47:04 localhost systemd[1]: var-lib-containers-storage-overlay-d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610-merged.mount: Deactivated successfully.
Oct 14 05:47:04 localhost systemd[1]: var-lib-containers-storage-overlay-0accaf46e2ca98f20a95b21cea4fb623de0e5378cb14b163bca0a8771d84c861-merged.mount: Deactivated successfully.
Oct 14 05:47:04 localhost systemd[1]: var-lib-containers-storage-overlay-512b226761ef17c0044cb14b83718aa6f9984afb51b1aeb63112d22d2fdccb36-merged.mount: Deactivated successfully.
Oct 14 05:47:04 localhost systemd[1]: var-lib-containers-storage-overlay-512b226761ef17c0044cb14b83718aa6f9984afb51b1aeb63112d22d2fdccb36-merged.mount: Deactivated successfully.
Oct 14 05:47:04 localhost systemd[1]: 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857.service: Deactivated successfully.
Oct 14 05:47:05 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57522 DF PROTO=TCP SPT=34940 DPT=9882 SEQ=3358641641 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B103E1A0000000001030307)
Oct 14 05:47:05 localhost nova_compute[238069]: 2025-10-14 09:47:05.535 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 14 05:47:05 localhost systemd[1]: var-lib-containers-storage-overlay-e0f86229f02c4331620c9ec8e21be769ac9cff125fc1f01f8404fcae9b59e3df-merged.mount: Deactivated successfully.
Oct 14 05:47:05 localhost systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully.
Oct 14 05:47:05 localhost systemd[1]: var-lib-containers-storage-overlay-ab64777085904da680013c790c3f2c65f0b954578737ec4d7fa836f56655c34a-merged.mount: Deactivated successfully.
Oct 14 05:47:06 localhost systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully.
Oct 14 05:47:06 localhost systemd[1]: var-lib-containers-storage-overlay-0accaf46e2ca98f20a95b21cea4fb623de0e5378cb14b163bca0a8771d84c861-merged.mount: Deactivated successfully.
Oct 14 05:47:06 localhost systemd[1]: var-lib-containers-storage-overlay-e0f86229f02c4331620c9ec8e21be769ac9cff125fc1f01f8404fcae9b59e3df-merged.mount: Deactivated successfully.
Oct 14 05:47:06 localhost systemd[1]: var-lib-containers-storage-overlay-f3f40f6483bf6d587286da9e86e40878c2aaaf723da5aa2364fff24f5ea28424-merged.mount: Deactivated successfully.
Oct 14 05:47:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041.
Oct 14 05:47:06 localhost podman[251597]: 2025-10-14 09:47:06.768492613 +0000 UTC m=+0.107414253 container health_status 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=unhealthy, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct 14 05:47:06 localhost podman[251597]: 2025-10-14 09:47:06.810307017 +0000 UTC m=+0.149228727 container exec_died 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible)
Oct 14 05:47:06 localhost podman[251597]: unhealthy
Oct 14 05:47:07 localhost systemd[1]: var-lib-containers-storage-overlay-ab64777085904da680013c790c3f2c65f0b954578737ec4d7fa836f56655c34a-merged.mount: Deactivated successfully.
Oct 14 05:47:07 localhost nova_compute[238069]: 2025-10-14 09:47:07.613 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 14 05:47:08 localhost systemd[1]: var-lib-containers-storage-overlay-6b7ccf96a28197636c7a5b8f45056e04db2357d7c2dc122633e916788515691d-merged.mount: Deactivated successfully.
Oct 14 05:47:08 localhost systemd[1]: var-lib-containers-storage-overlay-18fed1bc5c055eece5466d40a513df73328df93a77e2aad253cd120d7b08bd42-merged.mount: Deactivated successfully.
Oct 14 05:47:08 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26993 DF PROTO=TCP SPT=45194 DPT=9105 SEQ=3877322304 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B104B2E0000000001030307)
Oct 14 05:47:08 localhost systemd[1]: var-lib-containers-storage-overlay-5ce6c5d0cc60f856680938093014249abcf9a107a94355720d953b1d1e7f1bfe-merged.mount: Deactivated successfully.
Oct 14 05:47:08 localhost systemd[1]: var-lib-containers-storage-overlay-18fed1bc5c055eece5466d40a513df73328df93a77e2aad253cd120d7b08bd42-merged.mount: Deactivated successfully.
Oct 14 05:47:08 localhost systemd[1]: 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 05:47:08 localhost systemd[1]: 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041.service: Failed with result 'exit-code'.
Oct 14 05:47:09 localhost systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully.
Oct 14 05:47:09 localhost systemd[1]: var-lib-containers-storage-overlay-f49b9fcb7527e4e06386bb74b403d49154983873c705746d0322d416fcfe3182-merged.mount: Deactivated successfully.
Oct 14 05:47:09 localhost systemd[1]: var-lib-containers-storage-overlay-f49b9fcb7527e4e06386bb74b403d49154983873c705746d0322d416fcfe3182-merged.mount: Deactivated successfully.
Oct 14 05:47:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26994 DF PROTO=TCP SPT=45194 DPT=9105 SEQ=3877322304 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B104F1A0000000001030307)
Oct 14 05:47:10 localhost nova_compute[238069]: 2025-10-14 09:47:10.538 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 14 05:47:10 localhost systemd[1]: var-lib-containers-storage-overlay-e0f86229f02c4331620c9ec8e21be769ac9cff125fc1f01f8404fcae9b59e3df-merged.mount: Deactivated successfully.
Oct 14 05:47:10 localhost systemd[1]: var-lib-containers-storage-overlay-e0f86229f02c4331620c9ec8e21be769ac9cff125fc1f01f8404fcae9b59e3df-merged.mount: Deactivated successfully.
Oct 14 05:47:11 localhost systemd[1]: var-lib-containers-storage-overlay-f49b9fcb7527e4e06386bb74b403d49154983873c705746d0322d416fcfe3182-merged.mount: Deactivated successfully.
Oct 14 05:47:11 localhost systemd[1]: var-lib-containers-storage-overlay-279af57fe640a81799041eadaf076a38dc293fb9fa2d8ceac0fa223bb06cffab-merged.mount: Deactivated successfully.
Oct 14 05:47:11 localhost systemd[1]: var-lib-containers-storage-overlay-948d63d72c90238568600bb4ced3a347f3a772760aabfa54019ccce9078bd0ca-merged.mount: Deactivated successfully.
Oct 14 05:47:11 localhost systemd[1]: var-lib-containers-storage-overlay-919c2496449756819846525fbfb351457636bf59d0964ccd47919cff1ec5dc94-merged.mount: Deactivated successfully.
Oct 14 05:47:11 localhost systemd[1]: var-lib-containers-storage-overlay-279af57fe640a81799041eadaf076a38dc293fb9fa2d8ceac0fa223bb06cffab-merged.mount: Deactivated successfully.
Oct 14 05:47:11 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26995 DF PROTO=TCP SPT=45194 DPT=9105 SEQ=3877322304 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B10571A0000000001030307)
Oct 14 05:47:12 localhost systemd[1]: var-lib-containers-storage-overlay-919c2496449756819846525fbfb351457636bf59d0964ccd47919cff1ec5dc94-merged.mount: Deactivated successfully.
Oct 14 05:47:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29.
Oct 14 05:47:12 localhost systemd[1]: tmp-crun.jZ2O43.mount: Deactivated successfully.
Oct 14 05:47:12 localhost nova_compute[238069]: 2025-10-14 09:47:12.649 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 14 05:47:12 localhost podman[251620]: 2025-10-14 09:47:12.669745715 +0000 UTC m=+0.134659733 container health_status fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=iscsid, org.label-schema.build-date=20251009, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=iscsid, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']})
Oct 14 05:47:12 localhost podman[251620]: 2025-10-14 09:47:12.679312436 +0000 UTC m=+0.144226464 container exec_died fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, container_name=iscsid, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct 14 05:47:12 localhost systemd[1]: var-lib-containers-storage-overlay-56898ab6d39b47764ac69f563001cff1a6e38a16fd0080c65298dff54892d790-merged.mount: Deactivated successfully.
Oct 14 05:47:13 localhost systemd[1]: var-lib-containers-storage-overlay-d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610-merged.mount: Deactivated successfully.
Oct 14 05:47:13 localhost systemd[1]: var-lib-containers-storage-overlay-3a5231add129a89d0adead7ab11bea3dfa286b532e456cc25a1ad81207e8880c-merged.mount: Deactivated successfully.
Oct 14 05:47:13 localhost systemd[1]: var-lib-containers-storage-overlay-948d63d72c90238568600bb4ced3a347f3a772760aabfa54019ccce9078bd0ca-merged.mount: Deactivated successfully.
Oct 14 05:47:13 localhost systemd[1]: var-lib-containers-storage-overlay-948d63d72c90238568600bb4ced3a347f3a772760aabfa54019ccce9078bd0ca-merged.mount: Deactivated successfully.
Oct 14 05:47:13 localhost systemd[1]: fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29.service: Deactivated successfully.
Oct 14 05:47:14 localhost systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully.
Oct 14 05:47:14 localhost systemd[1]: var-lib-containers-storage-overlay-d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610-merged.mount: Deactivated successfully.
Oct 14 05:47:15 localhost systemd[1]: var-lib-containers-storage-overlay-d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610-merged.mount: Deactivated successfully.
Oct 14 05:47:15 localhost nova_compute[238069]: 2025-10-14 09:47:15.540 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 14 05:47:15 localhost systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully.
Oct 14 05:47:15 localhost systemd[1]: var-lib-containers-storage-overlay-56898ab6d39b47764ac69f563001cff1a6e38a16fd0080c65298dff54892d790-merged.mount: Deactivated successfully.
Oct 14 05:47:15 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26996 DF PROTO=TCP SPT=45194 DPT=9105 SEQ=3877322304 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B1066DA0000000001030307)
Oct 14 05:47:16 localhost systemd[1]: var-lib-containers-storage-overlay-e0f86229f02c4331620c9ec8e21be769ac9cff125fc1f01f8404fcae9b59e3df-merged.mount: Deactivated successfully.
Oct 14 05:47:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec.
Oct 14 05:47:16 localhost podman[251638]: 2025-10-14 09:47:16.155398996 +0000 UTC m=+0.085253058 container health_status aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible)
Oct 14 05:47:16 localhost systemd[1]: var-lib-containers-storage-overlay-e0f86229f02c4331620c9ec8e21be769ac9cff125fc1f01f8404fcae9b59e3df-merged.mount: Deactivated successfully.
Oct 14 05:47:16 localhost podman[251638]: 2025-10-14 09:47:16.18900611 +0000 UTC m=+0.118860072 container exec_died aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Oct 14 05:47:16 localhost systemd[1]: var-lib-containers-storage-overlay-0b52816892c0967aea6a33893e73899adbf76e3ca055f6670535905d8ddf2b2c-merged.mount: Deactivated successfully.
Oct 14 05:47:16 localhost systemd[1]: var-lib-containers-storage-overlay-1b94024f0eaacdff3ae200e2172324d7aec107282443f6fc22fe2f0287bc90ec-merged.mount: Deactivated successfully.
Oct 14 05:47:17 localhost nova_compute[238069]: 2025-10-14 09:47:17.680 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 14 05:47:18 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2316 DF PROTO=TCP SPT=33982 DPT=9101 SEQ=1938115895 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B106FDB0000000001030307)
Oct 14 05:47:18 localhost systemd[1]: var-lib-containers-storage-overlay-919c2496449756819846525fbfb351457636bf59d0964ccd47919cff1ec5dc94-merged.mount: Deactivated successfully.
Oct 14 05:47:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad.
Oct 14 05:47:18 localhost systemd[1]: var-lib-containers-storage-overlay-d651fdeeacd3d56019b1d31627c266ca6522613e8d6238e23e02ec783ccfc1eb-merged.mount: Deactivated successfully.
Oct 14 05:47:18 localhost systemd[1]: aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec.service: Deactivated successfully.
Oct 14 05:47:18 localhost podman[251661]: 2025-10-14 09:47:18.796764565 +0000 UTC m=+0.253604557 container health_status 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=multipathd, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d)
Oct 14 05:47:18 localhost podman[251661]: 2025-10-14 09:47:18.804655646 +0000 UTC m=+0.261495598 container exec_died 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Oct 14 05:47:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917.
Oct 14 05:47:19 localhost systemd[1]: var-lib-containers-storage-overlay-0b52816892c0967aea6a33893e73899adbf76e3ca055f6670535905d8ddf2b2c-merged.mount: Deactivated successfully.
Oct 14 05:47:20 localhost nova_compute[238069]: 2025-10-14 09:47:20.544 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 14 05:47:21 localhost systemd[1]: var-lib-containers-storage-overlay-948d63d72c90238568600bb4ced3a347f3a772760aabfa54019ccce9078bd0ca-merged.mount: Deactivated successfully.
Oct 14 05:47:21 localhost systemd[1]: var-lib-containers-storage-overlay-182f4b56e6e8809f2ffde261aea7a82f597fbc875533d1efd7f59fe7c8a139ed-merged.mount: Deactivated successfully.
Oct 14 05:47:21 localhost systemd[1]: var-lib-containers-storage-overlay-182f4b56e6e8809f2ffde261aea7a82f597fbc875533d1efd7f59fe7c8a139ed-merged.mount: Deactivated successfully.
Oct 14 05:47:21 localhost systemd[1]: 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad.service: Deactivated successfully.
Oct 14 05:47:21 localhost podman[251678]: 2025-10-14 09:47:21.400091975 +0000 UTC m=+1.984194499 container health_status 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., architecture=x86_64, config_id=edpm, io.buildah.version=1.33.7, name=ubi9-minimal, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, vcs-type=git, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Oct 14 05:47:21 localhost podman[251678]: 2025-10-14 09:47:21.415164325 +0000 UTC m=+1.999266899 container exec_died 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.buildah.version=1.33.7, vendor=Red Hat, Inc., release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, io.openshift.expose-services=, managed_by=edpm_ansible, name=ubi9-minimal, architecture=x86_64, vcs-type=git, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., distribution-scope=public, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Oct 14 05:47:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f.
Oct 14 05:47:22 localhost systemd[1]: var-lib-containers-storage-overlay-f3f40f6483bf6d587286da9e86e40878c2aaaf723da5aa2364fff24f5ea28424-merged.mount: Deactivated successfully.
Oct 14 05:47:22 localhost nova_compute[238069]: 2025-10-14 09:47:22.720 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 14 05:47:22 localhost systemd[1]: var-lib-containers-storage-overlay-d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610-merged.mount: Deactivated successfully.
Oct 14 05:47:23 localhost systemd[1]: var-lib-containers-storage-overlay-948d63d72c90238568600bb4ced3a347f3a772760aabfa54019ccce9078bd0ca-merged.mount: Deactivated successfully.
Oct 14 05:47:23 localhost systemd[1]: var-lib-containers-storage-overlay-948d63d72c90238568600bb4ced3a347f3a772760aabfa54019ccce9078bd0ca-merged.mount: Deactivated successfully.
Oct 14 05:47:23 localhost systemd[1]: 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917.service: Deactivated successfully.
Oct 14 05:47:23 localhost podman[251698]: 2025-10-14 09:47:23.367439781 +0000 UTC m=+1.459172095 container health_status 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Oct 14 05:47:23 localhost podman[251698]: 2025-10-14 09:47:23.408207053 +0000 UTC m=+1.499939377 container exec_died 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Oct 14 05:47:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23922 DF PROTO=TCP SPT=45610 DPT=9102 SEQ=3300775857 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B1084B50000000001030307)
Oct 14 05:47:24 localhost systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully.
Oct 14 05:47:24 localhost systemd[1]: var-lib-containers-storage-overlay-d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610-merged.mount: Deactivated successfully.
Oct 14 05:47:24 localhost systemd[1]: var-lib-containers-storage-overlay-3a5231add129a89d0adead7ab11bea3dfa286b532e456cc25a1ad81207e8880c-merged.mount: Deactivated successfully.
Oct 14 05:47:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23923 DF PROTO=TCP SPT=45610 DPT=9102 SEQ=3300775857 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B1088DA0000000001030307)
Oct 14 05:47:24 localhost systemd[1]: 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f.service: Deactivated successfully.
Oct 14 05:47:24 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Oct 14 05:47:25 localhost systemd[1]: var-lib-containers-storage-overlay-d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610-merged.mount: Deactivated successfully.
Oct 14 05:47:25 localhost systemd[1]: var-lib-containers-storage-overlay-e0f86229f02c4331620c9ec8e21be769ac9cff125fc1f01f8404fcae9b59e3df-merged.mount: Deactivated successfully.
Oct 14 05:47:25 localhost systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully.
Oct 14 05:47:25 localhost systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully.
Oct 14 05:47:25 localhost nova_compute[238069]: 2025-10-14 09:47:25.546 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 14 05:47:27 localhost systemd[1]: var-lib-containers-storage-overlay-948d63d72c90238568600bb4ced3a347f3a772760aabfa54019ccce9078bd0ca-merged.mount: Deactivated successfully.
Oct 14 05:47:27 localhost systemd[1]: var-lib-containers-storage-overlay-919c2496449756819846525fbfb351457636bf59d0964ccd47919cff1ec5dc94-merged.mount: Deactivated successfully.
Oct 14 05:47:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d.
Oct 14 05:47:27 localhost podman[251723]: 2025-10-14 09:47:27.684311215 +0000 UTC m=+0.053251584 container health_status 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=unhealthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=edpm, org.label-schema.build-date=20251009, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, io.buildah.version=1.41.3)
Oct 14 05:47:27 localhost podman[251723]: 2025-10-14 09:47:27.716118183 +0000 UTC m=+0.085058572 container exec_died 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.vendor=CentOS, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251009)
Oct 14 05:47:27 localhost podman[251723]: unhealthy
Oct 14 05:47:27 localhost systemd[1]: var-lib-containers-storage-overlay-919c2496449756819846525fbfb351457636bf59d0964ccd47919cff1ec5dc94-merged.mount: Deactivated successfully.
Oct 14 05:47:27 localhost nova_compute[238069]: 2025-10-14 09:47:27.729 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 14 05:47:28 localhost systemd[1]: var-lib-containers-storage-overlay-182f4b56e6e8809f2ffde261aea7a82f597fbc875533d1efd7f59fe7c8a139ed-merged.mount: Deactivated successfully.
Oct 14 05:47:28 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26548 DF PROTO=TCP SPT=38372 DPT=9882 SEQ=1326827804 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B10977B0000000001030307)
Oct 14 05:47:28 localhost systemd[1]: var-lib-containers-storage-overlay-edfc0b06a4a796de9d2029c8d0ee2f6200965b91068a8e771289702817852d05-merged.mount: Deactivated successfully.
Oct 14 05:47:28 localhost systemd[1]: 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 05:47:28 localhost systemd[1]: 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d.service: Failed with result 'exit-code'.
Oct 14 05:47:29 localhost systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully.
Oct 14 05:47:29 localhost systemd[1]: var-lib-containers-storage-overlay-f098c0017d0da3f1457e04ccb48f16a39779d6b090c6b44cae8dda4d8a38938b-merged.mount: Deactivated successfully.
Oct 14 05:47:29 localhost systemd[1]: var-lib-containers-storage-overlay-f098c0017d0da3f1457e04ccb48f16a39779d6b090c6b44cae8dda4d8a38938b-merged.mount: Deactivated successfully.
Oct 14 05:47:30 localhost systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully.
Oct 14 05:47:30 localhost nova_compute[238069]: 2025-10-14 09:47:30.552 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 14 05:47:30 localhost systemd[1]: var-lib-containers-storage-overlay-e0f86229f02c4331620c9ec8e21be769ac9cff125fc1f01f8404fcae9b59e3df-merged.mount: Deactivated successfully.
Oct 14 05:47:30 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23925 DF PROTO=TCP SPT=45610 DPT=9102 SEQ=3300775857 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B10A09A0000000001030307)
Oct 14 05:47:31 localhost systemd[1]: var-lib-containers-storage-overlay-e0f86229f02c4331620c9ec8e21be769ac9cff125fc1f01f8404fcae9b59e3df-merged.mount: Deactivated successfully.
Oct 14 05:47:31 localhost systemd[1]: var-lib-containers-storage-overlay-948d63d72c90238568600bb4ced3a347f3a772760aabfa54019ccce9078bd0ca-merged.mount: Deactivated successfully.
Oct 14 05:47:31 localhost systemd[1]: var-lib-containers-storage-overlay-f098c0017d0da3f1457e04ccb48f16a39779d6b090c6b44cae8dda4d8a38938b-merged.mount: Deactivated successfully.
Oct 14 05:47:31 localhost systemd[1]: var-lib-containers-storage-overlay-3f2142f0cf6c862446ab1f2dfb59e029b67fc4461f58078ef3c91fbbb00953f4-merged.mount: Deactivated successfully.
Oct 14 05:47:31 localhost systemd[1]: var-lib-containers-storage-overlay-3f2142f0cf6c862446ab1f2dfb59e029b67fc4461f58078ef3c91fbbb00953f4-merged.mount: Deactivated successfully.
Oct 14 05:47:32 localhost systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully.
Oct 14 05:47:32 localhost systemd[1]: var-lib-containers-storage-overlay-d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610-merged.mount: Deactivated successfully.
Oct 14 05:47:32 localhost systemd[1]: var-lib-containers-storage-overlay-5e0d5b365d1d4f2cbdec218bcecccb17a52487dea7c1e0a1ce7e4461f7c3a058-merged.mount: Deactivated successfully.
Oct 14 05:47:32 localhost nova_compute[238069]: 2025-10-14 09:47:32.732 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 14 05:47:32 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Oct 14 05:47:32 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Oct 14 05:47:33 localhost systemd[1]: var-lib-containers-storage-overlay-5c6de20ee9f73151254b053a0024fcbdd9b55691492d339c494637f80bb81826-merged.mount: Deactivated successfully.
Oct 14 05:47:33 localhost systemd[1]: var-lib-containers-storage-overlay-e0f86229f02c4331620c9ec8e21be769ac9cff125fc1f01f8404fcae9b59e3df-merged.mount: Deactivated successfully.
Oct 14 05:47:33 localhost systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully.
Oct 14 05:47:33 localhost systemd[1]: var-lib-containers-storage-overlay-5e0d5b365d1d4f2cbdec218bcecccb17a52487dea7c1e0a1ce7e4461f7c3a058-merged.mount: Deactivated successfully.
Oct 14 05:47:33 localhost kernel: overlayfs: upperdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Oct 14 05:47:33 localhost kernel: overlayfs: workdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Oct 14 05:47:34 localhost systemd[1]: var-lib-containers-storage-overlay-e0f86229f02c4331620c9ec8e21be769ac9cff125fc1f01f8404fcae9b59e3df-merged.mount: Deactivated successfully.
Oct 14 05:47:34 localhost systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully.
Oct 14 05:47:34 localhost systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully.
Oct 14 05:47:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857.
Oct 14 05:47:35 localhost podman[251740]: 2025-10-14 09:47:35.259806622 +0000 UTC m=+0.098244114 container health_status 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, org.label-schema.build-date=20251009)
Oct 14 05:47:35 localhost podman[251740]: 2025-10-14 09:47:35.269384294 +0000 UTC m=+0.107821786 container exec_died 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251009, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct 14 05:47:35 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26551 DF PROTO=TCP SPT=38372 DPT=9882 SEQ=1326827804 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B10B35A0000000001030307)
Oct 14 05:47:35 localhost nova_compute[238069]: 2025-10-14 09:47:35.553 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 14 05:47:35 localhost systemd[1]: var-lib-containers-storage-overlay-5c6de20ee9f73151254b053a0024fcbdd9b55691492d339c494637f80bb81826-merged.mount: Deactivated successfully.
Oct 14 05:47:35 localhost systemd[1]: var-lib-containers-storage-overlay-f2e190ad6809837b5cab304a57a4ee2a4332703e3d64c552aa0dce906fb85119-merged.mount: Deactivated successfully.
Oct 14 05:47:35 localhost systemd[1]: var-lib-containers-storage-overlay-f2e190ad6809837b5cab304a57a4ee2a4332703e3d64c552aa0dce906fb85119-merged.mount: Deactivated successfully.
Oct 14 05:47:36 localhost systemd[1]: var-lib-containers-storage-overlay-919c2496449756819846525fbfb351457636bf59d0964ccd47919cff1ec5dc94-merged.mount: Deactivated successfully.
Oct 14 05:47:36 localhost systemd[1]: var-lib-containers-storage-overlay-55b86a5ddc64363ea624c80cfd4e28da33e700e176dc8af3450feb34716db754-merged.mount: Deactivated successfully.
Oct 14 05:47:36 localhost kernel: overlayfs: upperdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Oct 14 05:47:36 localhost kernel: overlayfs: workdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Oct 14 05:47:36 localhost systemd[1]: var-lib-containers-storage-overlay-55b86a5ddc64363ea624c80cfd4e28da33e700e176dc8af3450feb34716db754-merged.mount: Deactivated successfully.
Oct 14 05:47:36 localhost systemd[1]: 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857.service: Deactivated successfully.
Oct 14 05:47:37 localhost systemd[1]: var-lib-containers-storage-overlay-a1185e7325783fe8cba63270bc6e59299386d7c73e4bc34c560a1fbc9e6d7e2c-merged.mount: Deactivated successfully.
Oct 14 05:47:37 localhost systemd[1]: var-lib-containers-storage-overlay-2cd9444c84550fbd551e3826a8110fcc009757858b99e84f1119041f2325189b-merged.mount: Deactivated successfully.
Oct 14 05:47:37 localhost systemd[1]: var-lib-containers-storage-overlay-55d5530fe8468c8c9907e0aa1de030811941604fa5f46de3db6dc15ec40906dd-merged.mount: Deactivated successfully.
Oct 14 05:47:37 localhost systemd[1]: var-lib-containers-storage-overlay-ae0ebe7656e29542866ff018f5be9a3d02c88268a65814cf045e1b6c30ffd352-merged.mount: Deactivated successfully.
Oct 14 05:47:37 localhost nova_compute[238069]: 2025-10-14 09:47:37.735 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 14 05:47:37 localhost systemd[1]: var-lib-containers-storage-overlay-ae0ebe7656e29542866ff018f5be9a3d02c88268a65814cf045e1b6c30ffd352-merged.mount: Deactivated successfully.
Oct 14 05:47:38 localhost systemd[1]: var-lib-containers-storage-overlay-d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610-merged.mount: Deactivated successfully.
Oct 14 05:47:38 localhost systemd[1]: var-lib-containers-storage-overlay-46476cb54c317ede576986c939135db930b5a6eeb4db9b988aa8d7ddee484bf8-merged.mount: Deactivated successfully.
Oct 14 05:47:38 localhost systemd[1]: var-lib-containers-storage-overlay-46476cb54c317ede576986c939135db930b5a6eeb4db9b988aa8d7ddee484bf8-merged.mount: Deactivated successfully.
Oct 14 05:47:38 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63749 DF PROTO=TCP SPT=36412 DPT=9105 SEQ=3053841550 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B10C05F0000000001030307)
Oct 14 05:47:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041.
Oct 14 05:47:39 localhost systemd[1]: var-lib-containers-storage-overlay-ae0ebe7656e29542866ff018f5be9a3d02c88268a65814cf045e1b6c30ffd352-merged.mount: Deactivated successfully.
Oct 14 05:47:39 localhost systemd[1]: tmp-crun.pJ5Pc0.mount: Deactivated successfully.
Oct 14 05:47:39 localhost podman[251755]: 2025-10-14 09:47:39.388181264 +0000 UTC m=+0.120693429 container health_status 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=unhealthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible)
Oct 14 05:47:39 localhost podman[251755]: 2025-10-14 09:47:39.424121638 +0000 UTC m=+0.156633833 container exec_died 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible)
Oct 14 05:47:39 localhost podman[251755]: unhealthy
Oct 14 05:47:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63750 DF PROTO=TCP SPT=36412 DPT=9105 SEQ=3053841550 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B10C45A0000000001030307)
Oct 14 05:47:40 localhost systemd[1]: 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 05:47:40 localhost systemd[1]: 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041.service: Failed with result 'exit-code'.
Oct 14 05:47:40 localhost systemd[1]: var-lib-containers-storage-overlay-f49b9fcb7527e4e06386bb74b403d49154983873c705746d0322d416fcfe3182-merged.mount: Deactivated successfully.
Oct 14 05:47:40 localhost systemd[1]: var-lib-containers-storage-overlay-d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610-merged.mount: Deactivated successfully.
Oct 14 05:47:40 localhost nova_compute[238069]: 2025-10-14 09:47:40.555 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:47:40 localhost podman[248187]: time="2025-10-14T09:47:40Z" level=error msg="Getting root fs size for \"d28a6621d713cf99fe1f4c64def479347c43741f4edc6c6184b2e3e7dfa3c11f\": getting diffsize of layer \"19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8\" and its parent \"e0f86229f02c4331620c9ec8e21be769ac9cff125fc1f01f8404fcae9b59e3df\": unmounting layer e0f86229f02c4331620c9ec8e21be769ac9cff125fc1f01f8404fcae9b59e3df: replacing mount point \"/var/lib/containers/storage/overlay/e0f86229f02c4331620c9ec8e21be769ac9cff125fc1f01f8404fcae9b59e3df/merged\": device or resource busy" Oct 14 05:47:40 localhost systemd[1]: var-lib-containers-storage-overlay-e0f86229f02c4331620c9ec8e21be769ac9cff125fc1f01f8404fcae9b59e3df-merged.mount: Deactivated successfully. Oct 14 05:47:40 localhost systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully. Oct 14 05:47:40 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Oct 14 05:47:40 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. 
Oct 14 05:47:41 localhost nova_compute[238069]: 2025-10-14 09:47:41.020 2 DEBUG oslo_service.periodic_task [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 05:47:41 localhost systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully. Oct 14 05:47:41 localhost systemd[1]: var-lib-containers-storage-overlay-f49b9fcb7527e4e06386bb74b403d49154983873c705746d0322d416fcfe3182-merged.mount: Deactivated successfully. Oct 14 05:47:41 localhost systemd[1]: var-lib-containers-storage-overlay-93e7bd8fd2f06fab388e6a7e1d321b8be1a1e6e35c116f25c67dac1cc2084007-merged.mount: Deactivated successfully. Oct 14 05:47:41 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63751 DF PROTO=TCP SPT=36412 DPT=9105 SEQ=3053841550 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B10CC5A0000000001030307) Oct 14 05:47:42 localhost systemd[1]: var-lib-containers-storage-overlay-46476cb54c317ede576986c939135db930b5a6eeb4db9b988aa8d7ddee484bf8-merged.mount: Deactivated successfully. Oct 14 05:47:42 localhost systemd[1]: var-lib-containers-storage-overlay-02ae85124e4959ea5e505d3d23ffa956e944453d3a9644fd9b15fb0c07f7fbc0-merged.mount: Deactivated successfully. Oct 14 05:47:42 localhost systemd[1]: var-lib-containers-storage-overlay-02ae85124e4959ea5e505d3d23ffa956e944453d3a9644fd9b15fb0c07f7fbc0-merged.mount: Deactivated successfully. 
Oct 14 05:47:42 localhost nova_compute[238069]: 2025-10-14 09:47:42.737 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:47:42 localhost sshd[251863]: main: sshd: ssh-rsa algorithm is disabled Oct 14 05:47:42 localhost systemd-logind[760]: New session 58 of user zuul. Oct 14 05:47:42 localhost systemd[1]: Started Session 58 of User zuul. Oct 14 05:47:43 localhost systemd[1]: var-lib-containers-storage-overlay-55d5530fe8468c8c9907e0aa1de030811941604fa5f46de3db6dc15ec40906dd-merged.mount: Deactivated successfully. Oct 14 05:47:43 localhost systemd[1]: var-lib-containers-storage-overlay-f58f2b4f8f560729736f5941b846f416eb5c90f8a03f52e63e224ade26f2e564-merged.mount: Deactivated successfully. Oct 14 05:47:43 localhost systemd[1]: var-lib-containers-storage-overlay-ae0ebe7656e29542866ff018f5be9a3d02c88268a65814cf045e1b6c30ffd352-merged.mount: Deactivated successfully. Oct 14 05:47:43 localhost python3.9[251959]: ansible-containers.podman.podman_container_info Invoked with name=['ovn_controller'] executable=podman Oct 14 05:47:43 localhost systemd[1]: var-lib-containers-storage-overlay-55d5530fe8468c8c9907e0aa1de030811941604fa5f46de3db6dc15ec40906dd-merged.mount: Deactivated successfully. 
Oct 14 05:47:44 localhost nova_compute[238069]: 2025-10-14 09:47:44.023 2 DEBUG oslo_service.periodic_task [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 05:47:44 localhost nova_compute[238069]: 2025-10-14 09:47:44.024 2 DEBUG oslo_service.periodic_task [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 05:47:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29. Oct 14 05:47:44 localhost systemd[1]: var-lib-containers-storage-overlay-55d5530fe8468c8c9907e0aa1de030811941604fa5f46de3db6dc15ec40906dd-merged.mount: Deactivated successfully. Oct 14 05:47:44 localhost systemd[1]: var-lib-containers-storage-overlay-d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610-merged.mount: Deactivated successfully. Oct 14 05:47:44 localhost systemd[1]: var-lib-containers-storage-overlay-ee47c660ea26d21ce84215704612469c43166e04b223dbf8f0a2a895de34e216-merged.mount: Deactivated successfully. Oct 14 05:47:44 localhost systemd[1]: var-lib-containers-storage-overlay-ee47c660ea26d21ce84215704612469c43166e04b223dbf8f0a2a895de34e216-merged.mount: Deactivated successfully. 
Oct 14 05:47:44 localhost podman[251973]: 2025-10-14 09:47:44.971915472 +0000 UTC m=+0.806013716 container health_status fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, container_name=iscsid, org.label-schema.build-date=20251009, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0) Oct 14 05:47:44 localhost podman[251973]: 2025-10-14 09:47:44.985118024 +0000 UTC m=+0.819216308 container exec_died fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 
(image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, container_name=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team) Oct 14 05:47:45 localhost nova_compute[238069]: 2025-10-14 09:47:45.025 2 DEBUG oslo_service.periodic_task [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 05:47:45 localhost nova_compute[238069]: 2025-10-14 09:47:45.025 2 DEBUG oslo_service.periodic_task [None 
req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 05:47:45 localhost nova_compute[238069]: 2025-10-14 09:47:45.559 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:47:45 localhost python3.9[252102]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ovn_controller detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None Oct 14 05:47:45 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63752 DF PROTO=TCP SPT=36412 DPT=9105 SEQ=3053841550 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B10DC1A0000000001030307) Oct 14 05:47:46 localhost systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully. 
Oct 14 05:47:46 localhost nova_compute[238069]: 2025-10-14 09:47:46.020 2 DEBUG oslo_service.periodic_task [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 05:47:46 localhost nova_compute[238069]: 2025-10-14 09:47:46.023 2 DEBUG oslo_service.periodic_task [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 05:47:46 localhost nova_compute[238069]: 2025-10-14 09:47:46.024 2 DEBUG nova.compute.manager [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Oct 14 05:47:46 localhost nova_compute[238069]: 2025-10-14 09:47:46.024 2 DEBUG nova.compute.manager [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Oct 14 05:47:46 localhost systemd[1]: var-lib-containers-storage-overlay-d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610-merged.mount: Deactivated successfully. Oct 14 05:47:46 localhost systemd[1]: var-lib-containers-storage-overlay-ae0ebe7656e29542866ff018f5be9a3d02c88268a65814cf045e1b6c30ffd352-merged.mount: Deactivated successfully. Oct 14 05:47:46 localhost systemd[1]: fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29.service: Deactivated successfully. Oct 14 05:47:46 localhost systemd[1]: var-lib-containers-storage-overlay-d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610-merged.mount: Deactivated successfully. 
Oct 14 05:47:46 localhost systemd[1]: Started libpod-conmon-1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f.scope. Oct 14 05:47:46 localhost podman[252103]: 2025-10-14 09:47:46.334755351 +0000 UTC m=+0.667459945 container exec 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0) Oct 14 05:47:46 localhost podman[252103]: 2025-10-14 09:47:46.36459698 +0000 UTC m=+0.697301564 container exec_died 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.license=GPLv2, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0) Oct 14 05:47:46 localhost nova_compute[238069]: 2025-10-14 09:47:46.615 2 DEBUG oslo_concurrency.lockutils [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Acquiring lock "refresh_cache-88c4e366-b765-47a6-96bf-f7677f2ce67c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Oct 14 05:47:46 localhost nova_compute[238069]: 2025-10-14 09:47:46.615 2 DEBUG oslo_concurrency.lockutils [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Acquired lock "refresh_cache-88c4e366-b765-47a6-96bf-f7677f2ce67c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Oct 14 05:47:46 localhost nova_compute[238069]: 2025-10-14 09:47:46.616 2 DEBUG nova.network.neutron [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] [instance: 88c4e366-b765-47a6-96bf-f7677f2ce67c] Forcefully refreshing network info cache for instance _get_instance_nw_info 
/usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Oct 14 05:47:46 localhost nova_compute[238069]: 2025-10-14 09:47:46.616 2 DEBUG nova.objects.instance [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 88c4e366-b765-47a6-96bf-f7677f2ce67c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Oct 14 05:47:46 localhost systemd[1]: var-lib-containers-storage-overlay-e0f86229f02c4331620c9ec8e21be769ac9cff125fc1f01f8404fcae9b59e3df-merged.mount: Deactivated successfully. Oct 14 05:47:46 localhost systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully. Oct 14 05:47:47 localhost nova_compute[238069]: 2025-10-14 09:47:47.004 2 DEBUG nova.network.neutron [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] [instance: 88c4e366-b765-47a6-96bf-f7677f2ce67c] Updating instance_info_cache with network_info: [{"id": "3ec9b060-f43d-4698-9c76-6062c70911d5", "address": "fa:16:3e:84:5e:e5", "network": {"id": "7d0cd696-bdd7-4e70-9512-eb0d23640314", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.46", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "41187b090f3d4818a32baa37ce8a3991", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3ec9b060-f4", "ovs_interfaceid": "3ec9b060-f43d-4698-9c76-6062c70911d5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", 
"profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Oct 14 05:47:47 localhost nova_compute[238069]: 2025-10-14 09:47:47.025 2 DEBUG oslo_concurrency.lockutils [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Releasing lock "refresh_cache-88c4e366-b765-47a6-96bf-f7677f2ce67c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Oct 14 05:47:47 localhost nova_compute[238069]: 2025-10-14 09:47:47.025 2 DEBUG nova.compute.manager [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] [instance: 88c4e366-b765-47a6-96bf-f7677f2ce67c] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Oct 14 05:47:47 localhost nova_compute[238069]: 2025-10-14 09:47:47.026 2 DEBUG oslo_service.periodic_task [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 05:47:47 localhost nova_compute[238069]: 2025-10-14 09:47:47.026 2 DEBUG oslo_service.periodic_task [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 05:47:47 localhost nova_compute[238069]: 2025-10-14 09:47:47.026 2 DEBUG nova.compute.manager [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... 
_reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Oct 14 05:47:47 localhost nova_compute[238069]: 2025-10-14 09:47:47.027 2 DEBUG oslo_service.periodic_task [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 05:47:47 localhost nova_compute[238069]: 2025-10-14 09:47:47.047 2 DEBUG oslo_concurrency.lockutils [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Oct 14 05:47:47 localhost nova_compute[238069]: 2025-10-14 09:47:47.047 2 DEBUG oslo_concurrency.lockutils [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Oct 14 05:47:47 localhost nova_compute[238069]: 2025-10-14 09:47:47.047 2 DEBUG oslo_concurrency.lockutils [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Oct 14 05:47:47 localhost nova_compute[238069]: 2025-10-14 09:47:47.048 2 DEBUG nova.compute.resource_tracker [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Auditing locally available compute resources for np0005486733.localdomain (node: np0005486733.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Oct 14 05:47:47 localhost nova_compute[238069]: 2025-10-14 09:47:47.048 2 DEBUG 
oslo_concurrency.processutils [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Oct 14 05:47:47 localhost nova_compute[238069]: 2025-10-14 09:47:47.512 2 DEBUG oslo_concurrency.processutils [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.464s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Oct 14 05:47:47 localhost nova_compute[238069]: 2025-10-14 09:47:47.601 2 DEBUG nova.virt.libvirt.driver [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Oct 14 05:47:47 localhost nova_compute[238069]: 2025-10-14 09:47:47.601 2 DEBUG nova.virt.libvirt.driver [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Oct 14 05:47:47 localhost nova_compute[238069]: 2025-10-14 09:47:47.778 2 WARNING nova.virt.libvirt.driver [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Oct 14 05:47:47 localhost nova_compute[238069]: 2025-10-14 09:47:47.779 2 DEBUG nova.compute.resource_tracker [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Hypervisor/Node resource view: name=np0005486733.localdomain free_ram=12224MB free_disk=41.83725357055664GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": 
"1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Oct 14 05:47:47 localhost nova_compute[238069]: 2025-10-14 09:47:47.779 2 DEBUG oslo_concurrency.lockutils [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Oct 14 05:47:47 localhost nova_compute[238069]: 2025-10-14 09:47:47.780 2 DEBUG oslo_concurrency.lockutils [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Oct 14 05:47:47 localhost nova_compute[238069]: 2025-10-14 09:47:47.782 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:47:47 localhost nova_compute[238069]: 2025-10-14 09:47:47.862 2 DEBUG nova.compute.resource_tracker [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Instance 88c4e366-b765-47a6-96bf-f7677f2ce67c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Oct 14 05:47:47 localhost nova_compute[238069]: 2025-10-14 09:47:47.862 2 DEBUG nova.compute.resource_tracker [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Oct 14 05:47:47 localhost nova_compute[238069]: 2025-10-14 09:47:47.862 2 DEBUG nova.compute.resource_tracker [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Final resource view: name=np0005486733.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Oct 14 05:47:47 localhost nova_compute[238069]: 2025-10-14 09:47:47.912 2 DEBUG oslo_concurrency.processutils [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Oct 14 05:47:48 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18768 DF PROTO=TCP SPT=34492 DPT=9101 SEQ=58814580 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B10E4DA0000000001030307) Oct 14 05:47:48 localhost systemd[1]: var-lib-containers-storage-overlay-512b226761ef17c0044cb14b83718aa6f9984afb51b1aeb63112d22d2fdccb36-merged.mount: Deactivated successfully. Oct 14 05:47:48 localhost systemd[1]: var-lib-containers-storage-overlay-5ce6c5d0cc60f856680938093014249abcf9a107a94355720d953b1d1e7f1bfe-merged.mount: Deactivated successfully. 
Oct 14 05:47:48 localhost nova_compute[238069]: 2025-10-14 09:47:48.336 2 DEBUG oslo_concurrency.processutils [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.424s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Oct 14 05:47:48 localhost nova_compute[238069]: 2025-10-14 09:47:48.342 2 DEBUG nova.compute.provider_tree [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Inventory has not changed in ProviderTree for provider: 18c24273-aca2-4f08-be57-3188d558235e update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Oct 14 05:47:48 localhost nova_compute[238069]: 2025-10-14 09:47:48.364 2 DEBUG nova.scheduler.client.report [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Inventory has not changed for provider 18c24273-aca2-4f08-be57-3188d558235e based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Oct 14 05:47:48 localhost nova_compute[238069]: 2025-10-14 09:47:48.366 2 DEBUG nova.compute.resource_tracker [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Compute_service record updated for np0005486733.localdomain:np0005486733.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Oct 14 05:47:48 localhost nova_compute[238069]: 2025-10-14 09:47:48.366 2 DEBUG oslo_concurrency.lockutils [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Lock "compute_resources" "released" by 
"nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.587s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Oct 14 05:47:48 localhost systemd[1]: var-lib-containers-storage-overlay-5ce6c5d0cc60f856680938093014249abcf9a107a94355720d953b1d1e7f1bfe-merged.mount: Deactivated successfully. Oct 14 05:47:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec. Oct 14 05:47:49 localhost systemd[1]: var-lib-containers-storage-overlay-e0f86229f02c4331620c9ec8e21be769ac9cff125fc1f01f8404fcae9b59e3df-merged.mount: Deactivated successfully. Oct 14 05:47:49 localhost python3.9[252300]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ovn_controller detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None Oct 14 05:47:50 localhost systemd[1]: var-lib-containers-storage-overlay-0accaf46e2ca98f20a95b21cea4fb623de0e5378cb14b163bca0a8771d84c861-merged.mount: Deactivated successfully. Oct 14 05:47:50 localhost systemd[1]: var-lib-containers-storage-overlay-512b226761ef17c0044cb14b83718aa6f9984afb51b1aeb63112d22d2fdccb36-merged.mount: Deactivated successfully. Oct 14 05:47:50 localhost systemd[1]: var-lib-containers-storage-overlay-512b226761ef17c0044cb14b83718aa6f9984afb51b1aeb63112d22d2fdccb36-merged.mount: Deactivated successfully. Oct 14 05:47:50 localhost systemd[1]: libpod-conmon-1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f.scope: Deactivated successfully. 
Oct 14 05:47:50 localhost podman[252196]: 2025-10-14 09:47:50.413053676 +0000 UTC m=+1.485680932 container health_status aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Oct 14 05:47:50 localhost podman[252196]: 2025-10-14 09:47:50.425863217 +0000 UTC m=+1.498490463 container exec_died aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 
'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors ) Oct 14 05:47:50 localhost systemd[1]: Started libpod-conmon-1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f.scope. 
Oct 14 05:47:50 localhost podman[252301]: 2025-10-14 09:47:50.522864732 +0000 UTC m=+0.973115547 container exec 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251009, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.license=GPLv2) Oct 14 05:47:50 localhost podman[252301]: 2025-10-14 09:47:50.552715361 +0000 UTC m=+1.002966156 container exec_died 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 
'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2) Oct 14 05:47:50 localhost nova_compute[238069]: 2025-10-14 09:47:50.601 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:47:51 localhost systemd[1]: var-lib-containers-storage-overlay-ab64777085904da680013c790c3f2c65f0b954578737ec4d7fa836f56655c34a-merged.mount: Deactivated successfully. Oct 14 05:47:51 localhost systemd[1]: var-lib-containers-storage-overlay-0accaf46e2ca98f20a95b21cea4fb623de0e5378cb14b163bca0a8771d84c861-merged.mount: Deactivated successfully. Oct 14 05:47:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad. Oct 14 05:47:51 localhost systemd[1]: var-lib-containers-storage-overlay-f58f2b4f8f560729736f5941b846f416eb5c90f8a03f52e63e224ade26f2e564-merged.mount: Deactivated successfully. Oct 14 05:47:52 localhost systemd[1]: var-lib-containers-storage-overlay-215025152e7486dca6aa506e7e941c98eca167be4a4853b2a3771ef4f2b39afc-merged.mount: Deactivated successfully. 
Oct 14 05:47:52 localhost systemd[1]: var-lib-containers-storage-overlay-215025152e7486dca6aa506e7e941c98eca167be4a4853b2a3771ef4f2b39afc-merged.mount: Deactivated successfully. Oct 14 05:47:52 localhost systemd[1]: aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec.service: Deactivated successfully. Oct 14 05:47:52 localhost podman[252337]: 2025-10-14 09:47:52.24706709 +0000 UTC m=+0.587862190 container health_status 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, tcib_managed=true, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, maintainer=OpenStack Kubernetes Operator team, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, 
org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS) Oct 14 05:47:52 localhost podman[252337]: 2025-10-14 09:47:52.28779273 +0000 UTC m=+0.628587830 container exec_died 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=multipathd, managed_by=edpm_ansible, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team) Oct 14 
05:47:52 localhost systemd[1]: var-lib-containers-storage-overlay-f3f40f6483bf6d587286da9e86e40878c2aaaf723da5aa2364fff24f5ea28424-merged.mount: Deactivated successfully. Oct 14 05:47:52 localhost systemd[1]: var-lib-containers-storage-overlay-ab64777085904da680013c790c3f2c65f0b954578737ec4d7fa836f56655c34a-merged.mount: Deactivated successfully. Oct 14 05:47:52 localhost systemd[1]: var-lib-containers-storage-overlay-ab64777085904da680013c790c3f2c65f0b954578737ec4d7fa836f56655c34a-merged.mount: Deactivated successfully. Oct 14 05:47:52 localhost nova_compute[238069]: 2025-10-14 09:47:52.785 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:47:52 localhost python3.9[252469]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/ovn_controller recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:47:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917. 
Oct 14 05:47:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48334 DF PROTO=TCP SPT=55678 DPT=9102 SEQ=2541828246 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B10F9E50000000001030307) Oct 14 05:47:53 localhost python3.9[252579]: ansible-containers.podman.podman_container_info Invoked with name=['ovn_metadata_agent'] executable=podman Oct 14 05:47:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48335 DF PROTO=TCP SPT=55678 DPT=9102 SEQ=2541828246 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B10FDDA0000000001030307) Oct 14 05:47:54 localhost systemd[1]: var-lib-containers-storage-overlay-948d63d72c90238568600bb4ced3a347f3a772760aabfa54019ccce9078bd0ca-merged.mount: Deactivated successfully. Oct 14 05:47:54 localhost systemd[1]: var-lib-containers-storage-overlay-919c2496449756819846525fbfb351457636bf59d0964ccd47919cff1ec5dc94-merged.mount: Deactivated successfully. Oct 14 05:47:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f. Oct 14 05:47:55 localhost systemd[1]: var-lib-containers-storage-overlay-919c2496449756819846525fbfb351457636bf59d0964ccd47919cff1ec5dc94-merged.mount: Deactivated successfully. Oct 14 05:47:55 localhost systemd[1]: libpod-conmon-1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f.scope: Deactivated successfully. Oct 14 05:47:55 localhost systemd[1]: 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad.service: Deactivated successfully. 
Oct 14 05:47:55 localhost podman[252603]: 2025-10-14 09:47:55.238855272 +0000 UTC m=+0.465011682 container health_status 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251009, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_id=ovn_controller) Oct 14 05:47:55 localhost podman[252580]: 2025-10-14 09:47:55.292906467 +0000 UTC m=+1.808469032 container health_status 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 
Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, maintainer=Red Hat, Inc., release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, config_id=edpm) Oct 14 05:47:55 localhost podman[252580]: 2025-10-14 09:47:55.314025117 +0000 UTC m=+1.829587662 container exec_died 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, name=ubi9-minimal, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, io.buildah.version=1.33.7, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_id=edpm) Oct 14 05:47:55 localhost podman[252603]: 2025-10-14 09:47:55.321929985 +0000 UTC m=+0.548086415 container exec_died 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_id=ovn_controller, container_name=ovn_controller) Oct 14 05:47:55 localhost systemd[1]: var-lib-containers-storage-overlay-f3f40f6483bf6d587286da9e86e40878c2aaaf723da5aa2364fff24f5ea28424-merged.mount: Deactivated successfully. 
Oct 14 05:47:55 localhost nova_compute[238069]: 2025-10-14 09:47:55.644 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:47:56 localhost systemd[1]: var-lib-containers-storage-overlay-d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610-merged.mount: Deactivated successfully. Oct 14 05:47:57 localhost systemd[1]: var-lib-containers-storage-overlay-948d63d72c90238568600bb4ced3a347f3a772760aabfa54019ccce9078bd0ca-merged.mount: Deactivated successfully. Oct 14 05:47:57 localhost systemd[1]: var-lib-containers-storage-overlay-948d63d72c90238568600bb4ced3a347f3a772760aabfa54019ccce9078bd0ca-merged.mount: Deactivated successfully. Oct 14 05:47:57 localhost systemd[1]: 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917.service: Deactivated successfully. Oct 14 05:47:57 localhost systemd[1]: 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f.service: Deactivated successfully. 
Oct 14 05:47:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:47:57.755 163055 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Oct 14 05:47:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:47:57.755 163055 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Oct 14 05:47:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:47:57.757 163055 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Oct 14 05:47:57 localhost nova_compute[238069]: 2025-10-14 09:47:57.786 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:47:58 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61262 DF PROTO=TCP SPT=38978 DPT=9882 SEQ=2248588387 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B110CAB0000000001030307) Oct 14 05:47:58 localhost systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully. Oct 14 05:47:58 localhost systemd[1]: var-lib-containers-storage-overlay-d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610-merged.mount: Deactivated successfully. 
Oct 14 05:47:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d. Oct 14 05:47:58 localhost systemd[1]: var-lib-containers-storage-overlay-d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610-merged.mount: Deactivated successfully. Oct 14 05:47:59 localhost systemd[1]: var-lib-containers-storage-overlay-5ce6c5d0cc60f856680938093014249abcf9a107a94355720d953b1d1e7f1bfe-merged.mount: Deactivated successfully. Oct 14 05:47:59 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Oct 14 05:47:59 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Oct 14 05:48:00 localhost podman[252639]: 2025-10-14 09:48:00.032446547 +0000 UTC m=+1.387220225 container health_status 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=unhealthy, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ceilometer_agent_compute, config_id=edpm, managed_by=edpm_ansible, org.label-schema.build-date=20251009) Oct 14 05:48:00 localhost podman[252639]: 2025-10-14 09:48:00.065054513 +0000 UTC m=+1.419828181 container exec_died 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251009, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true, io.buildah.version=1.41.3, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', 
'/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, maintainer=OpenStack Kubernetes Operator team) Oct 14 05:48:00 localhost podman[252639]: unhealthy Oct 14 05:48:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48337 DF PROTO=TCP SPT=55678 DPT=9102 SEQ=2541828246 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B11159A0000000001030307) Oct 14 05:48:00 localhost nova_compute[238069]: 2025-10-14 09:48:00.685 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:48:00 localhost python3.9[252765]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ovn_metadata_agent detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None Oct 14 05:48:00 localhost systemd[1]: var-lib-containers-storage-overlay-e0f86229f02c4331620c9ec8e21be769ac9cff125fc1f01f8404fcae9b59e3df-merged.mount: Deactivated successfully. Oct 14 05:48:00 localhost systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully. 
Oct 14 05:48:02 localhost systemd[1]: var-lib-containers-storage-overlay-948d63d72c90238568600bb4ced3a347f3a772760aabfa54019ccce9078bd0ca-merged.mount: Deactivated successfully. Oct 14 05:48:02 localhost systemd[1]: var-lib-containers-storage-overlay-919c2496449756819846525fbfb351457636bf59d0964ccd47919cff1ec5dc94-merged.mount: Deactivated successfully. Oct 14 05:48:02 localhost nova_compute[238069]: 2025-10-14 09:48:02.788 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:48:02 localhost systemd[1]: var-lib-containers-storage-overlay-919c2496449756819846525fbfb351457636bf59d0964ccd47919cff1ec5dc94-merged.mount: Deactivated successfully. Oct 14 05:48:02 localhost systemd[1]: 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d.service: Main process exited, code=exited, status=1/FAILURE Oct 14 05:48:02 localhost systemd[1]: 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d.service: Failed with result 'exit-code'. Oct 14 05:48:02 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Oct 14 05:48:02 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Oct 14 05:48:02 localhost systemd[1]: Started libpod-conmon-28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857.scope. 
Oct 14 05:48:03 localhost podman[252766]: 2025-10-14 09:48:03.013729635 +0000 UTC m=+2.286081853 container exec 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.build-date=20251009, container_name=ovn_metadata_agent) Oct 14 05:48:03 localhost podman[252766]: 2025-10-14 09:48:03.049119536 +0000 UTC m=+2.321471754 container 
exec_died 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20251009, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent) Oct 14 05:48:03 localhost systemd[1]: var-lib-containers-storage-overlay-e0f86229f02c4331620c9ec8e21be769ac9cff125fc1f01f8404fcae9b59e3df-merged.mount: Deactivated successfully. 
Oct 14 05:48:04 localhost systemd[1]: var-lib-containers-storage-overlay-d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610-merged.mount: Deactivated successfully. Oct 14 05:48:04 localhost systemd[1]: var-lib-containers-storage-overlay-948d63d72c90238568600bb4ced3a347f3a772760aabfa54019ccce9078bd0ca-merged.mount: Deactivated successfully. Oct 14 05:48:05 localhost systemd[1]: var-lib-containers-storage-overlay-948d63d72c90238568600bb4ced3a347f3a772760aabfa54019ccce9078bd0ca-merged.mount: Deactivated successfully. Oct 14 05:48:05 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Oct 14 05:48:05 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Oct 14 05:48:05 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61265 DF PROTO=TCP SPT=38978 DPT=9882 SEQ=2248588387 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B11285B0000000001030307) Oct 14 05:48:05 localhost auditd[722]: Audit daemon rotating log files Oct 14 05:48:05 localhost nova_compute[238069]: 2025-10-14 09:48:05.734 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:48:06 localhost python3.9[252904]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ovn_metadata_agent detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None Oct 14 05:48:06 localhost systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully. 
Oct 14 05:48:06 localhost systemd[1]: var-lib-containers-storage-overlay-d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610-merged.mount: Deactivated successfully. Oct 14 05:48:06 localhost systemd[1]: var-lib-containers-storage-overlay-d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610-merged.mount: Deactivated successfully. Oct 14 05:48:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857. Oct 14 05:48:07 localhost systemd[1]: var-lib-containers-storage-overlay-919c2496449756819846525fbfb351457636bf59d0964ccd47919cff1ec5dc94-merged.mount: Deactivated successfully. Oct 14 05:48:07 localhost nova_compute[238069]: 2025-10-14 09:48:07.793 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:48:07 localhost systemd[1]: var-lib-containers-storage-overlay-b323c7b12cc075908cfc59295640d329582d5902bacc1e83223968c79062b1a1-merged.mount: Deactivated successfully. Oct 14 05:48:08 localhost systemd[1]: var-lib-containers-storage-overlay-b323c7b12cc075908cfc59295640d329582d5902bacc1e83223968c79062b1a1-merged.mount: Deactivated successfully. Oct 14 05:48:08 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Oct 14 05:48:08 localhost systemd[1]: libpod-conmon-28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857.scope: Deactivated successfully. Oct 14 05:48:08 localhost systemd[1]: Started libpod-conmon-28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857.scope. 
Oct 14 05:48:08 localhost podman[252905]: 2025-10-14 09:48:08.115684771 +0000 UTC m=+2.066344044 container exec 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5) Oct 14 05:48:08 localhost podman[252915]: 2025-10-14 09:48:08.204787377 +0000 UTC m=+0.535969468 container 
health_status 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_managed=true) Oct 14 05:48:08 localhost podman[252932]: 2025-10-14 09:48:08.239096525 +0000 UTC m=+0.109509014 container exec_died 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 
(image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}) Oct 14 05:48:08 localhost podman[252905]: 2025-10-14 09:48:08.243891631 +0000 UTC m=+2.194550984 container exec_died 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, 
name=ovn_metadata_agent, org.label-schema.build-date=20251009, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}) Oct 14 05:48:08 localhost podman[252915]: 2025-10-14 09:48:08.25014428 +0000 UTC m=+0.581326371 container exec_died 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 
'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5) Oct 14 05:48:08 localhost systemd[1]: var-lib-containers-storage-overlay-e0f86229f02c4331620c9ec8e21be769ac9cff125fc1f01f8404fcae9b59e3df-merged.mount: Deactivated successfully. 
Oct 14 05:48:08 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32136 DF PROTO=TCP SPT=43022 DPT=9105 SEQ=100674595 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B11358E0000000001030307) Oct 14 05:48:08 localhost systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully. Oct 14 05:48:08 localhost systemd[1]: var-lib-containers-storage-overlay-f479750c98f4a67ffae355a1e79b3c9a76d56699a79b842b4363e69f089cca49-merged.mount: Deactivated successfully. Oct 14 05:48:08 localhost systemd[1]: libpod-conmon-28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857.scope: Deactivated successfully. Oct 14 05:48:09 localhost systemd[1]: 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857.service: Deactivated successfully. Oct 14 05:48:09 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Oct 14 05:48:09 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Oct 14 05:48:09 localhost systemd[1]: var-lib-containers-storage-overlay-f479750c98f4a67ffae355a1e79b3c9a76d56699a79b842b4363e69f089cca49-merged.mount: Deactivated successfully. 
Oct 14 05:48:09 localhost python3.9[253062]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/ovn_metadata_agent recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:48:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32137 DF PROTO=TCP SPT=43022 DPT=9105 SEQ=100674595 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B11399A0000000001030307) Oct 14 05:48:10 localhost systemd[1]: var-lib-containers-storage-overlay-e0f86229f02c4331620c9ec8e21be769ac9cff125fc1f01f8404fcae9b59e3df-merged.mount: Deactivated successfully. Oct 14 05:48:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041. Oct 14 05:48:10 localhost systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully. 
Oct 14 05:48:10 localhost podman[253134]: 2025-10-14 09:48:10.306049597 +0000 UTC m=+0.081634431 container health_status 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=unhealthy, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm) Oct 14 05:48:10 localhost podman[253134]: 2025-10-14 09:48:10.318877576 +0000 UTC m=+0.094462410 container exec_died 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', 
'/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Oct 14 05:48:10 localhost podman[253134]: unhealthy Oct 14 05:48:10 localhost python3.9[253195]: ansible-containers.podman.podman_container_info Invoked with name=['iscsid'] executable=podman Oct 14 05:48:10 localhost systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully. Oct 14 05:48:10 localhost nova_compute[238069]: 2025-10-14 09:48:10.741 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:48:11 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32138 DF PROTO=TCP SPT=43022 DPT=9105 SEQ=100674595 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B11419A0000000001030307) Oct 14 05:48:12 localhost systemd[1]: var-lib-containers-storage-overlay-919c2496449756819846525fbfb351457636bf59d0964ccd47919cff1ec5dc94-merged.mount: Deactivated successfully. Oct 14 05:48:12 localhost systemd[1]: var-lib-containers-storage-overlay-d651fdeeacd3d56019b1d31627c266ca6522613e8d6238e23e02ec783ccfc1eb-merged.mount: Deactivated successfully. Oct 14 05:48:12 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Oct 14 05:48:12 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. 
Oct 14 05:48:12 localhost systemd[1]: 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041.service: Main process exited, code=exited, status=1/FAILURE Oct 14 05:48:12 localhost systemd[1]: 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041.service: Failed with result 'exit-code'. Oct 14 05:48:12 localhost nova_compute[238069]: 2025-10-14 09:48:12.831 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:48:13 localhost systemd[1]: var-lib-containers-storage-overlay-e0f86229f02c4331620c9ec8e21be769ac9cff125fc1f01f8404fcae9b59e3df-merged.mount: Deactivated successfully. Oct 14 05:48:14 localhost systemd[1]: var-lib-containers-storage-overlay-948d63d72c90238568600bb4ced3a347f3a772760aabfa54019ccce9078bd0ca-merged.mount: Deactivated successfully. Oct 14 05:48:15 localhost systemd[1]: var-lib-containers-storage-overlay-182f4b56e6e8809f2ffde261aea7a82f597fbc875533d1efd7f59fe7c8a139ed-merged.mount: Deactivated successfully. Oct 14 05:48:15 localhost systemd[1]: var-lib-containers-storage-overlay-182f4b56e6e8809f2ffde261aea7a82f597fbc875533d1efd7f59fe7c8a139ed-merged.mount: Deactivated successfully. Oct 14 05:48:15 localhost nova_compute[238069]: 2025-10-14 09:48:15.748 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:48:15 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32139 DF PROTO=TCP SPT=43022 DPT=9105 SEQ=100674595 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B11515A0000000001030307) Oct 14 05:48:16 localhost systemd[1]: var-lib-containers-storage-overlay-f479750c98f4a67ffae355a1e79b3c9a76d56699a79b842b4363e69f089cca49-merged.mount: Deactivated successfully. 
Oct 14 05:48:16 localhost systemd[1]: var-lib-containers-storage-overlay-d1668dbabecea61a717977938a99d4a46ffa99afa4505047a6e5a86838675946-merged.mount: Deactivated successfully. Oct 14 05:48:16 localhost systemd[1]: var-lib-containers-storage-overlay-d1668dbabecea61a717977938a99d4a46ffa99afa4505047a6e5a86838675946-merged.mount: Deactivated successfully. Oct 14 05:48:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29. Oct 14 05:48:17 localhost systemd[1]: var-lib-containers-storage-overlay-d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610-merged.mount: Deactivated successfully. Oct 14 05:48:17 localhost systemd[1]: var-lib-containers-storage-overlay-948d63d72c90238568600bb4ced3a347f3a772760aabfa54019ccce9078bd0ca-merged.mount: Deactivated successfully. Oct 14 05:48:17 localhost systemd[1]: var-lib-containers-storage-overlay-948d63d72c90238568600bb4ced3a347f3a772760aabfa54019ccce9078bd0ca-merged.mount: Deactivated successfully. 
Oct 14 05:48:17 localhost nova_compute[238069]: 2025-10-14 09:48:17.833 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:48:17 localhost podman[253210]: 2025-10-14 09:48:17.847349903 +0000 UTC m=+1.553566508 container health_status fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=iscsid, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS) Oct 14 05:48:17 localhost podman[253210]: 
2025-10-14 09:48:17.886154488 +0000 UTC m=+1.592371113 container exec_died fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true, config_id=iscsid, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, container_name=iscsid, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team) Oct 14 05:48:18 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54079 DF PROTO=TCP SPT=47504 DPT=9101 SEQ=381231375 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT 
(020405500402080A2B115A1A0000000001030307) Oct 14 05:48:18 localhost systemd[1]: var-lib-containers-storage-overlay-8035b846284d335d9393ab62c801f2456eb25851b24c50a7b13196117676086c-merged.mount: Deactivated successfully. Oct 14 05:48:18 localhost systemd[1]: var-lib-containers-storage-overlay-8035b846284d335d9393ab62c801f2456eb25851b24c50a7b13196117676086c-merged.mount: Deactivated successfully. Oct 14 05:48:18 localhost systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully. Oct 14 05:48:19 localhost kernel: overlayfs: upperdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Oct 14 05:48:19 localhost kernel: overlayfs: workdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Oct 14 05:48:19 localhost systemd[1]: fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29.service: Deactivated successfully. Oct 14 05:48:19 localhost systemd[1]: var-lib-containers-storage-overlay-d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610-merged.mount: Deactivated successfully. Oct 14 05:48:19 localhost python3.9[253337]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=iscsid detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None Oct 14 05:48:19 localhost systemd[1]: Started libpod-conmon-fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29.scope. 
Oct 14 05:48:19 localhost podman[253339]: 2025-10-14 09:48:19.472571829 +0000 UTC m=+0.152583337 container exec fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, org.label-schema.license=GPLv2, container_name=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251009, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, org.label-schema.vendor=CentOS) Oct 14 05:48:19 localhost podman[253339]: 2025-10-14 09:48:19.507087125 +0000 UTC m=+0.187098643 container exec_died fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, 
org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, io.buildah.version=1.41.3, config_id=iscsid, container_name=iscsid, managed_by=edpm_ansible) Oct 14 05:48:19 localhost systemd[1]: var-lib-containers-storage-overlay-e0f86229f02c4331620c9ec8e21be769ac9cff125fc1f01f8404fcae9b59e3df-merged.mount: Deactivated successfully. Oct 14 05:48:19 localhost systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully. Oct 14 05:48:19 localhost kernel: overlayfs: upperdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. 
Oct 14 05:48:19 localhost kernel: overlayfs: workdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Oct 14 05:48:19 localhost podman[248187]: time="2025-10-14T09:48:19Z" level=error msg="Unmounting /var/lib/containers/storage/overlay/19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8/merged: device or resource busy" Oct 14 05:48:19 localhost podman[248187]: time="2025-10-14T09:48:19Z" level=error msg="Getting root fs size for \"b97a8257ba2511805a046403dbba16e8733667e02309c88bf2fb88d0470c2f9b\": getting diffsize of layer \"19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8\" and its parent \"e0f86229f02c4331620c9ec8e21be769ac9cff125fc1f01f8404fcae9b59e3df\": creating overlay mount to /var/lib/containers/storage/overlay/19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8/merged, mount_data=\"lowerdir=/var/lib/containers/storage/overlay/l/GCRKPK2XCK5PXZUBSEJGX3CA5D,upperdir=/var/lib/containers/storage/overlay/19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8/diff,workdir=/var/lib/containers/storage/overlay/19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8/work,nodev,metacopy=on\": no such file or directory" Oct 14 05:48:20 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Oct 14 05:48:20 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Oct 14 05:48:20 localhost systemd[1]: var-lib-containers-storage-overlay-e0f86229f02c4331620c9ec8e21be769ac9cff125fc1f01f8404fcae9b59e3df-merged.mount: Deactivated successfully. Oct 14 05:48:20 localhost systemd[1]: var-lib-containers-storage-overlay-e0f86229f02c4331620c9ec8e21be769ac9cff125fc1f01f8404fcae9b59e3df-merged.mount: Deactivated successfully. 
Oct 14 05:48:20 localhost nova_compute[238069]: 2025-10-14 09:48:20.752 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:48:21 localhost python3.9[253477]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=iscsid detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None Oct 14 05:48:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec. Oct 14 05:48:22 localhost systemd[1]: var-lib-containers-storage-overlay-182f4b56e6e8809f2ffde261aea7a82f597fbc875533d1efd7f59fe7c8a139ed-merged.mount: Deactivated successfully. Oct 14 05:48:22 localhost systemd[1]: var-lib-containers-storage-overlay-edfc0b06a4a796de9d2029c8d0ee2f6200965b91068a8e771289702817852d05-merged.mount: Deactivated successfully. Oct 14 05:48:22 localhost nova_compute[238069]: 2025-10-14 09:48:22.836 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:48:22 localhost systemd[1]: libpod-conmon-fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29.scope: Deactivated successfully. Oct 14 05:48:23 localhost systemd[1]: Started libpod-conmon-fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29.scope. 
Oct 14 05:48:23 localhost podman[253478]: 2025-10-14 09:48:23.061098402 +0000 UTC m=+1.587747933 container exec fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS) Oct 14 05:48:23 localhost podman[253488]: 2025-10-14 09:48:23.097328828 +0000 UTC m=+0.434484198 container health_status aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec 
(image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter) Oct 14 05:48:23 localhost podman[253488]: 2025-10-14 09:48:23.134100271 +0000 UTC m=+0.471255621 container exec_died aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', 
'--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter) Oct 14 05:48:23 localhost podman[253508]: 2025-10-14 09:48:23.19619533 +0000 UTC m=+0.121819887 container exec_died fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, config_id=iscsid, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true, container_name=iscsid, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team) Oct 14 05:48:23 localhost podman[253478]: 2025-10-14 09:48:23.202095659 +0000 UTC m=+1.728745200 container exec_died fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, container_name=iscsid, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', 
'/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image) Oct 14 05:48:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28319 DF PROTO=TCP SPT=51588 DPT=9102 SEQ=54880976 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B116F150000000001030307) Oct 14 05:48:23 localhost systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully. Oct 14 05:48:23 localhost systemd[1]: var-lib-containers-storage-overlay-f098c0017d0da3f1457e04ccb48f16a39779d6b090c6b44cae8dda4d8a38938b-merged.mount: Deactivated successfully. Oct 14 05:48:23 localhost systemd[1]: var-lib-containers-storage-overlay-f098c0017d0da3f1457e04ccb48f16a39779d6b090c6b44cae8dda4d8a38938b-merged.mount: Deactivated successfully. Oct 14 05:48:24 localhost systemd[1]: aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec.service: Deactivated successfully. Oct 14 05:48:24 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Oct 14 05:48:24 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Oct 14 05:48:24 localhost systemd[1]: libpod-conmon-fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29.scope: Deactivated successfully. 
Oct 14 05:48:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28320 DF PROTO=TCP SPT=51588 DPT=9102 SEQ=54880976 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B11731B0000000001030307) Oct 14 05:48:24 localhost systemd[1]: var-lib-containers-storage-overlay-8035b846284d335d9393ab62c801f2456eb25851b24c50a7b13196117676086c-merged.mount: Deactivated successfully. Oct 14 05:48:24 localhost systemd[1]: var-lib-containers-storage-overlay-66141e0355e434a1428da5b2027ef6192344d1c6afa950636647476e8925671b-merged.mount: Deactivated successfully. Oct 14 05:48:24 localhost systemd[1]: var-lib-containers-storage-overlay-e0f86229f02c4331620c9ec8e21be769ac9cff125fc1f01f8404fcae9b59e3df-merged.mount: Deactivated successfully. Oct 14 05:48:24 localhost systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully. Oct 14 05:48:24 localhost python3.9[253644]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/iscsid recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:48:24 localhost systemd[1]: var-lib-containers-storage-overlay-5e0d5b365d1d4f2cbdec218bcecccb17a52487dea7c1e0a1ce7e4461f7c3a058-merged.mount: Deactivated successfully. Oct 14 05:48:25 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. 
Oct 14 05:48:25 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Oct 14 05:48:25 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Oct 14 05:48:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad. Oct 14 05:48:25 localhost podman[253754]: 2025-10-14 09:48:25.5939058 +0000 UTC m=+0.114000320 container health_status 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.vendor=CentOS, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=multipathd, container_name=multipathd) Oct 14 05:48:25 localhost podman[253754]: 2025-10-14 09:48:25.609006737 +0000 UTC m=+0.129101257 container exec_died 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, config_id=multipathd, org.label-schema.license=GPLv2, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, 
managed_by=edpm_ansible, container_name=multipathd, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image) Oct 14 05:48:25 localhost systemd[1]: var-lib-containers-storage-overlay-5c6de20ee9f73151254b053a0024fcbdd9b55691492d339c494637f80bb81826-merged.mount: Deactivated successfully. Oct 14 05:48:25 localhost python3.9[253755]: ansible-containers.podman.podman_container_info Invoked with name=['multipathd'] executable=podman Oct 14 05:48:25 localhost nova_compute[238069]: 2025-10-14 09:48:25.757 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:48:26 localhost systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully. Oct 14 05:48:26 localhost systemd[1]: var-lib-containers-storage-overlay-5e0d5b365d1d4f2cbdec218bcecccb17a52487dea7c1e0a1ce7e4461f7c3a058-merged.mount: Deactivated successfully. Oct 14 05:48:26 localhost systemd[1]: var-lib-containers-storage-overlay-f098c0017d0da3f1457e04ccb48f16a39779d6b090c6b44cae8dda4d8a38938b-merged.mount: Deactivated successfully. Oct 14 05:48:26 localhost systemd[1]: var-lib-containers-storage-overlay-3f2142f0cf6c862446ab1f2dfb59e029b67fc4461f58078ef3c91fbbb00953f4-merged.mount: Deactivated successfully. Oct 14 05:48:26 localhost systemd[1]: 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad.service: Deactivated successfully. Oct 14 05:48:26 localhost systemd[1]: var-lib-containers-storage-overlay-e0f86229f02c4331620c9ec8e21be769ac9cff125fc1f01f8404fcae9b59e3df-merged.mount: Deactivated successfully. Oct 14 05:48:26 localhost systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully. 
Oct 14 05:48:27 localhost systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully. Oct 14 05:48:27 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Oct 14 05:48:27 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Oct 14 05:48:27 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Oct 14 05:48:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f. Oct 14 05:48:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917. Oct 14 05:48:27 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. 
Oct 14 05:48:27 localhost podman[253895]: 2025-10-14 09:48:27.73478753 +0000 UTC m=+0.084936742 container health_status 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_id=ovn_controller, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0) Oct 14 05:48:27 localhost podman[253895]: 2025-10-14 09:48:27.786997239 +0000 UTC m=+0.137146451 container exec_died 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_id=ovn_controller, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 
Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}) Oct 14 05:48:27 localhost podman[253896]: 2025-10-14 09:48:27.806108197 +0000 UTC m=+0.152368591 container health_status 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': 
['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., version=9.6, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, container_name=openstack_network_exporter, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, maintainer=Red Hat, Inc.) 
Oct 14 05:48:27 localhost podman[253896]: 2025-10-14 09:48:27.820980847 +0000 UTC m=+0.167241251 container exec_died 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, release=1755695350, vcs-type=git, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, version=9.6, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, distribution-scope=public) Oct 14 05:48:27 localhost nova_compute[238069]: 2025-10-14 09:48:27.841 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:48:27 localhost python3.9[253897]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=multipathd detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None Oct 14 05:48:28 localhost systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully. Oct 14 05:48:28 localhost systemd[1]: var-lib-containers-storage-overlay-5e0d5b365d1d4f2cbdec218bcecccb17a52487dea7c1e0a1ce7e4461f7c3a058-merged.mount: Deactivated successfully. 
Oct 14 05:48:28 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28657 DF PROTO=TCP SPT=51594 DPT=9882 SEQ=164272800 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B1181DA0000000001030307) Oct 14 05:48:28 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Oct 14 05:48:28 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Oct 14 05:48:28 localhost systemd[1]: 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f.service: Deactivated successfully. Oct 14 05:48:28 localhost systemd[1]: 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917.service: Deactivated successfully. Oct 14 05:48:28 localhost systemd[1]: var-lib-containers-storage-overlay-3c56646706fff247676980ac78d7924c31221bb364af528e25b8eedf875d177e-merged.mount: Deactivated successfully. Oct 14 05:48:28 localhost systemd[1]: var-lib-containers-storage-overlay-5c6de20ee9f73151254b053a0024fcbdd9b55691492d339c494637f80bb81826-merged.mount: Deactivated successfully. Oct 14 05:48:28 localhost systemd[1]: Started libpod-conmon-02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad.scope. 
Oct 14 05:48:28 localhost podman[253937]: 2025-10-14 09:48:28.723157485 +0000 UTC m=+0.842577335 container exec 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=multipathd) Oct 14 05:48:28 localhost podman[253937]: 2025-10-14 09:48:28.756057221 +0000 UTC m=+0.875477011 container exec_died 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad 
(image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.build-date=20251009, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible) Oct 14 05:48:29 localhost systemd[1]: var-lib-containers-storage-overlay-e0f86229f02c4331620c9ec8e21be769ac9cff125fc1f01f8404fcae9b59e3df-merged.mount: Deactivated successfully. Oct 14 05:48:29 localhost systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully. 
Oct 14 05:48:30 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28322 DF PROTO=TCP SPT=51588 DPT=9102 SEQ=54880976 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B118ADA0000000001030307) Oct 14 05:48:30 localhost nova_compute[238069]: 2025-10-14 09:48:30.759 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:48:31 localhost systemd[1]: var-lib-containers-storage-overlay-948d63d72c90238568600bb4ced3a347f3a772760aabfa54019ccce9078bd0ca-merged.mount: Deactivated successfully. Oct 14 05:48:31 localhost systemd[1]: var-lib-containers-storage-overlay-919c2496449756819846525fbfb351457636bf59d0964ccd47919cff1ec5dc94-merged.mount: Deactivated successfully. Oct 14 05:48:31 localhost systemd[1]: var-lib-containers-storage-overlay-919c2496449756819846525fbfb351457636bf59d0964ccd47919cff1ec5dc94-merged.mount: Deactivated successfully. Oct 14 05:48:31 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Oct 14 05:48:32 localhost systemd[1]: var-lib-containers-storage-overlay-e0f86229f02c4331620c9ec8e21be769ac9cff125fc1f01f8404fcae9b59e3df-merged.mount: Deactivated successfully. 
Oct 14 05:48:32 localhost nova_compute[238069]: 2025-10-14 09:48:32.844 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:48:32 localhost python3.9[254075]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=multipathd detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None Oct 14 05:48:33 localhost systemd[1]: var-lib-containers-storage-overlay-d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610-merged.mount: Deactivated successfully. Oct 14 05:48:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d. Oct 14 05:48:33 localhost systemd[1]: var-lib-containers-storage-overlay-948d63d72c90238568600bb4ced3a347f3a772760aabfa54019ccce9078bd0ca-merged.mount: Deactivated successfully. Oct 14 05:48:34 localhost systemd[1]: var-lib-containers-storage-overlay-948d63d72c90238568600bb4ced3a347f3a772760aabfa54019ccce9078bd0ca-merged.mount: Deactivated successfully. Oct 14 05:48:34 localhost systemd[1]: libpod-conmon-02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad.scope: Deactivated successfully. Oct 14 05:48:34 localhost systemd[1]: Started libpod-conmon-02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad.scope. 
Oct 14 05:48:34 localhost podman[254076]: 2025-10-14 09:48:34.135868765 +0000 UTC m=+1.274138425 container exec 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true, container_name=multipathd, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=multipathd, managed_by=edpm_ansible) Oct 14 05:48:34 localhost podman[254087]: 2025-10-14 09:48:34.156417536 +0000 UTC m=+0.561646166 container health_status 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d 
(image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=unhealthy, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_id=edpm, managed_by=edpm_ansible, org.label-schema.build-date=20251009, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d) Oct 14 05:48:34 localhost podman[254087]: 2025-10-14 09:48:34.198055726 +0000 UTC m=+0.603284356 container exec_died 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_id=edpm, 
container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251009, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d) Oct 14 05:48:34 localhost podman[254087]: unhealthy Oct 14 05:48:34 localhost podman[254076]: 2025-10-14 09:48:34.218580897 +0000 UTC m=+1.356850497 container exec_died 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': 
'/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251009, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, org.label-schema.schema-version=1.0, config_id=multipathd) Oct 14 05:48:34 localhost systemd[1]: var-lib-containers-storage-overlay-5c6de20ee9f73151254b053a0024fcbdd9b55691492d339c494637f80bb81826-merged.mount: Deactivated successfully. Oct 14 05:48:35 localhost systemd[1]: var-lib-containers-storage-overlay-f2e190ad6809837b5cab304a57a4ee2a4332703e3d64c552aa0dce906fb85119-merged.mount: Deactivated successfully. Oct 14 05:48:35 localhost systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully. 
Oct 14 05:48:35 localhost systemd[1]: 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d.service: Main process exited, code=exited, status=1/FAILURE Oct 14 05:48:35 localhost systemd[1]: 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d.service: Failed with result 'exit-code'. Oct 14 05:48:35 localhost systemd[1]: libpod-conmon-02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad.scope: Deactivated successfully. Oct 14 05:48:35 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28660 DF PROTO=TCP SPT=51594 DPT=9882 SEQ=164272800 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B119D9A0000000001030307) Oct 14 05:48:35 localhost systemd[1]: var-lib-containers-storage-overlay-f2e190ad6809837b5cab304a57a4ee2a4332703e3d64c552aa0dce906fb85119-merged.mount: Deactivated successfully. Oct 14 05:48:35 localhost systemd[1]: var-lib-containers-storage-overlay-d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610-merged.mount: Deactivated successfully. Oct 14 05:48:35 localhost nova_compute[238069]: 2025-10-14 09:48:35.762 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:48:36 localhost python3.9[254233]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/multipathd recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:48:36 localhost systemd[1]: var-lib-containers-storage-overlay-e0f86229f02c4331620c9ec8e21be769ac9cff125fc1f01f8404fcae9b59e3df-merged.mount: Deactivated successfully. 
Oct 14 05:48:36 localhost systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully. Oct 14 05:48:36 localhost systemd[1]: var-lib-containers-storage-overlay-55d5530fe8468c8c9907e0aa1de030811941604fa5f46de3db6dc15ec40906dd-merged.mount: Deactivated successfully. Oct 14 05:48:36 localhost systemd[1]: var-lib-containers-storage-overlay-ae0ebe7656e29542866ff018f5be9a3d02c88268a65814cf045e1b6c30ffd352-merged.mount: Deactivated successfully. Oct 14 05:48:36 localhost systemd[1]: var-lib-containers-storage-overlay-55d5530fe8468c8c9907e0aa1de030811941604fa5f46de3db6dc15ec40906dd-merged.mount: Deactivated successfully. Oct 14 05:48:36 localhost systemd[1]: var-lib-containers-storage-overlay-e0f86229f02c4331620c9ec8e21be769ac9cff125fc1f01f8404fcae9b59e3df-merged.mount: Deactivated successfully. Oct 14 05:48:36 localhost systemd[1]: var-lib-containers-storage-overlay-55d5530fe8468c8c9907e0aa1de030811941604fa5f46de3db6dc15ec40906dd-merged.mount: Deactivated successfully. Oct 14 05:48:36 localhost python3.9[254343]: ansible-containers.podman.podman_container_info Invoked with name=['ceilometer_agent_compute'] executable=podman Oct 14 05:48:37 localhost nova_compute[238069]: 2025-10-14 09:48:37.847 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:48:37 localhost systemd[1]: var-lib-containers-storage-overlay-ae0ebe7656e29542866ff018f5be9a3d02c88268a65814cf045e1b6c30ffd352-merged.mount: Deactivated successfully. 
Oct 14 05:48:38 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58828 DF PROTO=TCP SPT=53496 DPT=9105 SEQ=3188024918 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B11AABF0000000001030307) Oct 14 05:48:39 localhost systemd[1]: var-lib-containers-storage-overlay-919c2496449756819846525fbfb351457636bf59d0964ccd47919cff1ec5dc94-merged.mount: Deactivated successfully. Oct 14 05:48:39 localhost systemd[1]: var-lib-containers-storage-overlay-32472dd13182638cf4e918dd146bf2d7453b4f75ed169a2fe8fd3591fc0a1be9-merged.mount: Deactivated successfully. Oct 14 05:48:39 localhost systemd[1]: var-lib-containers-storage-overlay-32472dd13182638cf4e918dd146bf2d7453b4f75ed169a2fe8fd3591fc0a1be9-merged.mount: Deactivated successfully. Oct 14 05:48:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857. Oct 14 05:48:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58829 DF PROTO=TCP SPT=53496 DPT=9105 SEQ=3188024918 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B11AEDA0000000001030307) Oct 14 05:48:40 localhost systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully. Oct 14 05:48:40 localhost systemd[1]: var-lib-containers-storage-overlay-f49b9fcb7527e4e06386bb74b403d49154983873c705746d0322d416fcfe3182-merged.mount: Deactivated successfully. Oct 14 05:48:40 localhost systemd[1]: var-lib-containers-storage-overlay-f49b9fcb7527e4e06386bb74b403d49154983873c705746d0322d416fcfe3182-merged.mount: Deactivated successfully. 
Oct 14 05:48:40 localhost nova_compute[238069]: 2025-10-14 09:48:40.805 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:48:40 localhost podman[254356]: 2025-10-14 09:48:40.876408071 +0000 UTC m=+1.210415466 container health_status 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, 
config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true) Oct 14 05:48:40 localhost podman[254356]: 2025-10-14 09:48:40.887373743 +0000 UTC m=+1.221381108 container exec_died 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.license=GPLv2) Oct 14 05:48:41 localhost systemd[1]: var-lib-containers-storage-overlay-9c7bc0417a3c6c9361659b5f2f41d814b152f8a47a3821564971debd2b788997-merged.mount: Deactivated successfully. Oct 14 05:48:41 localhost python3.9[254483]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ceilometer_agent_compute detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None Oct 14 05:48:41 localhost systemd[1]: var-lib-containers-storage-overlay-e0f86229f02c4331620c9ec8e21be769ac9cff125fc1f01f8404fcae9b59e3df-merged.mount: Deactivated successfully. Oct 14 05:48:41 localhost systemd[1]: var-lib-containers-storage-overlay-0b52816892c0967aea6a33893e73899adbf76e3ca055f6670535905d8ddf2b2c-merged.mount: Deactivated successfully. Oct 14 05:48:41 localhost systemd[1]: var-lib-containers-storage-overlay-1b94024f0eaacdff3ae200e2172324d7aec107282443f6fc22fe2f0287bc90ec-merged.mount: Deactivated successfully. Oct 14 05:48:41 localhost systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully. Oct 14 05:48:41 localhost systemd[1]: 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857.service: Deactivated successfully. Oct 14 05:48:41 localhost systemd[1]: Started libpod-conmon-89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d.scope. 
Oct 14 05:48:41 localhost podman[254484]: 2025-10-14 09:48:41.868349775 +0000 UTC m=+0.370291125 container exec 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}) Oct 14 05:48:41 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 
TOS=0x00 PREC=0x00 TTL=62 ID=58830 DF PROTO=TCP SPT=53496 DPT=9105 SEQ=3188024918 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B11B6DB0000000001030307) Oct 14 05:48:41 localhost podman[254484]: 2025-10-14 09:48:41.904167499 +0000 UTC m=+0.406108869 container exec_died 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, 
org.label-schema.license=GPLv2) Oct 14 05:48:42 localhost systemd[1]: var-lib-containers-storage-overlay-e0f86229f02c4331620c9ec8e21be769ac9cff125fc1f01f8404fcae9b59e3df-merged.mount: Deactivated successfully. Oct 14 05:48:42 localhost systemd[1]: var-lib-containers-storage-overlay-f3f40f6483bf6d587286da9e86e40878c2aaaf723da5aa2364fff24f5ea28424-merged.mount: Deactivated successfully. Oct 14 05:48:42 localhost systemd[1]: var-lib-containers-storage-overlay-0b52816892c0967aea6a33893e73899adbf76e3ca055f6670535905d8ddf2b2c-merged.mount: Deactivated successfully. Oct 14 05:48:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041. Oct 14 05:48:42 localhost nova_compute[238069]: 2025-10-14 09:48:42.849 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:48:43 localhost python3.9[254667]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ceilometer_agent_compute detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None Oct 14 05:48:43 localhost systemd[1]: libpod-conmon-89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d.scope: Deactivated successfully. 
Oct 14 05:48:43 localhost podman[254584]: 2025-10-14 09:48:43.104272141 +0000 UTC m=+0.439776267 container health_status 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=unhealthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Oct 14 05:48:43 localhost systemd[1]: var-lib-containers-storage-overlay-e0f86229f02c4331620c9ec8e21be769ac9cff125fc1f01f8404fcae9b59e3df-merged.mount: Deactivated successfully. Oct 14 05:48:43 localhost systemd[1]: Started libpod-conmon-89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d.scope. 
Oct 14 05:48:43 localhost podman[254669]: 2025-10-14 09:48:43.18714548 +0000 UTC m=+0.141082771 container exec 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d) Oct 14 05:48:43 localhost podman[254584]: 2025-10-14 09:48:43.194520102 +0000 UTC m=+0.530024278 container exec_died 
0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Oct 14 05:48:43 localhost podman[254584]: unhealthy Oct 14 05:48:43 localhost podman[254669]: 2025-10-14 09:48:43.225986844 +0000 UTC m=+0.179924125 container exec_died 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', 
'/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0) Oct 14 05:48:44 localhost systemd[1]: var-lib-containers-storage-overlay-9c7bc0417a3c6c9361659b5f2f41d814b152f8a47a3821564971debd2b788997-merged.mount: Deactivated successfully. Oct 14 05:48:44 localhost podman[248187]: time="2025-10-14T09:48:44Z" level=error msg="Unable to write json: \"write unix /run/podman/podman.sock->@: write: broken pipe\"" Oct 14 05:48:44 localhost podman[248187]: @ - - [14/Oct/2025:09:43:12 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=true&sync=false HTTP/1.1" 200 4096 "" "Go-http-client/1.1" Oct 14 05:48:44 localhost systemd[1]: var-lib-containers-storage-overlay-ee47c660ea26d21ce84215704612469c43166e04b223dbf8f0a2a895de34e216-merged.mount: Deactivated successfully. Oct 14 05:48:44 localhost systemd[1]: var-lib-containers-storage-overlay-f58f2b4f8f560729736f5941b846f416eb5c90f8a03f52e63e224ade26f2e564-merged.mount: Deactivated successfully. Oct 14 05:48:44 localhost systemd[1]: var-lib-containers-storage-overlay-f58f2b4f8f560729736f5941b846f416eb5c90f8a03f52e63e224ade26f2e564-merged.mount: Deactivated successfully. 
Oct 14 05:48:44 localhost systemd[1]: 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041.service: Main process exited, code=exited, status=1/FAILURE Oct 14 05:48:44 localhost systemd[1]: 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041.service: Failed with result 'exit-code'. Oct 14 05:48:45 localhost nova_compute[238069]: 2025-10-14 09:48:45.366 2 DEBUG oslo_service.periodic_task [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 05:48:45 localhost nova_compute[238069]: 2025-10-14 09:48:45.367 2 DEBUG oslo_service.periodic_task [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 05:48:45 localhost nova_compute[238069]: 2025-10-14 09:48:45.367 2 DEBUG oslo_service.periodic_task [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 05:48:45 localhost nova_compute[238069]: 2025-10-14 09:48:45.810 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:48:45 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58831 DF PROTO=TCP SPT=53496 DPT=9105 SEQ=3188024918 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B11C69A0000000001030307) Oct 14 05:48:46 localhost systemd[1]: var-lib-containers-storage-overlay-d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610-merged.mount: Deactivated successfully. 
Oct 14 05:48:46 localhost systemd[1]: var-lib-containers-storage-overlay-ee47c660ea26d21ce84215704612469c43166e04b223dbf8f0a2a895de34e216-merged.mount: Deactivated successfully. Oct 14 05:48:46 localhost python3.9[254845]: ansible-ansible.builtin.file Invoked with group=42405 mode=0700 owner=42405 path=/var/lib/openstack/healthchecks/ceilometer_agent_compute recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:48:46 localhost systemd[1]: var-lib-containers-storage-overlay-ee47c660ea26d21ce84215704612469c43166e04b223dbf8f0a2a895de34e216-merged.mount: Deactivated successfully. Oct 14 05:48:46 localhost systemd[1]: libpod-conmon-89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d.scope: Deactivated successfully. Oct 14 05:48:47 localhost nova_compute[238069]: 2025-10-14 09:48:47.024 2 DEBUG oslo_service.periodic_task [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 05:48:47 localhost nova_compute[238069]: 2025-10-14 09:48:47.025 2 DEBUG nova.compute.manager [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Oct 14 05:48:47 localhost nova_compute[238069]: 2025-10-14 09:48:47.025 2 DEBUG nova.compute.manager [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Oct 14 05:48:47 localhost python3.9[254955]: ansible-containers.podman.podman_container_info 
Invoked with name=['node_exporter'] executable=podman Oct 14 05:48:47 localhost nova_compute[238069]: 2025-10-14 09:48:47.492 2 DEBUG oslo_concurrency.lockutils [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Acquiring lock "refresh_cache-88c4e366-b765-47a6-96bf-f7677f2ce67c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Oct 14 05:48:47 localhost nova_compute[238069]: 2025-10-14 09:48:47.492 2 DEBUG oslo_concurrency.lockutils [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Acquired lock "refresh_cache-88c4e366-b765-47a6-96bf-f7677f2ce67c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Oct 14 05:48:47 localhost nova_compute[238069]: 2025-10-14 09:48:47.492 2 DEBUG nova.network.neutron [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] [instance: 88c4e366-b765-47a6-96bf-f7677f2ce67c] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Oct 14 05:48:47 localhost nova_compute[238069]: 2025-10-14 09:48:47.492 2 DEBUG nova.objects.instance [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 88c4e366-b765-47a6-96bf-f7677f2ce67c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Oct 14 05:48:47 localhost systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully. Oct 14 05:48:47 localhost systemd[1]: var-lib-containers-storage-overlay-d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610-merged.mount: Deactivated successfully. Oct 14 05:48:47 localhost systemd[1]: var-lib-containers-storage-overlay-d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610-merged.mount: Deactivated successfully. 
Oct 14 05:48:47 localhost nova_compute[238069]: 2025-10-14 09:48:47.852 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:48:47 localhost nova_compute[238069]: 2025-10-14 09:48:47.911 2 DEBUG nova.network.neutron [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] [instance: 88c4e366-b765-47a6-96bf-f7677f2ce67c] Updating instance_info_cache with network_info: [{"id": "3ec9b060-f43d-4698-9c76-6062c70911d5", "address": "fa:16:3e:84:5e:e5", "network": {"id": "7d0cd696-bdd7-4e70-9512-eb0d23640314", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.46", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "41187b090f3d4818a32baa37ce8a3991", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3ec9b060-f4", "ovs_interfaceid": "3ec9b060-f43d-4698-9c76-6062c70911d5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Oct 14 05:48:47 localhost nova_compute[238069]: 2025-10-14 09:48:47.931 2 DEBUG oslo_concurrency.lockutils [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Releasing lock "refresh_cache-88c4e366-b765-47a6-96bf-f7677f2ce67c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Oct 14 05:48:47 localhost nova_compute[238069]: 
2025-10-14 09:48:47.931 2 DEBUG nova.compute.manager [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] [instance: 88c4e366-b765-47a6-96bf-f7677f2ce67c] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Oct 14 05:48:47 localhost nova_compute[238069]: 2025-10-14 09:48:47.932 2 DEBUG oslo_service.periodic_task [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 05:48:48 localhost nova_compute[238069]: 2025-10-14 09:48:48.023 2 DEBUG oslo_service.periodic_task [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 05:48:48 localhost nova_compute[238069]: 2025-10-14 09:48:48.023 2 DEBUG oslo_service.periodic_task [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 05:48:48 localhost nova_compute[238069]: 2025-10-14 09:48:48.024 2 DEBUG oslo_service.periodic_task [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 05:48:48 localhost nova_compute[238069]: 2025-10-14 09:48:48.050 2 DEBUG oslo_concurrency.lockutils [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Oct 14 05:48:48 localhost nova_compute[238069]: 
2025-10-14 09:48:48.051 2 DEBUG oslo_concurrency.lockutils [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Oct 14 05:48:48 localhost nova_compute[238069]: 2025-10-14 09:48:48.051 2 DEBUG oslo_concurrency.lockutils [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Oct 14 05:48:48 localhost nova_compute[238069]: 2025-10-14 09:48:48.052 2 DEBUG nova.compute.resource_tracker [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Auditing locally available compute resources for np0005486733.localdomain (node: np0005486733.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Oct 14 05:48:48 localhost nova_compute[238069]: 2025-10-14 09:48:48.052 2 DEBUG oslo_concurrency.processutils [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Oct 14 05:48:48 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34421 DF PROTO=TCP SPT=34350 DPT=9101 SEQ=1986728761 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B11CF5A0000000001030307) Oct 14 05:48:48 localhost systemd[1]: var-lib-containers-storage-overlay-e0f86229f02c4331620c9ec8e21be769ac9cff125fc1f01f8404fcae9b59e3df-merged.mount: Deactivated successfully. 
Oct 14 05:48:48 localhost systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully. Oct 14 05:48:48 localhost systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully. Oct 14 05:48:48 localhost nova_compute[238069]: 2025-10-14 09:48:48.539 2 DEBUG oslo_concurrency.processutils [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.487s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Oct 14 05:48:48 localhost nova_compute[238069]: 2025-10-14 09:48:48.619 2 DEBUG nova.virt.libvirt.driver [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Oct 14 05:48:48 localhost nova_compute[238069]: 2025-10-14 09:48:48.619 2 DEBUG nova.virt.libvirt.driver [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Oct 14 05:48:48 localhost nova_compute[238069]: 2025-10-14 09:48:48.769 2 WARNING nova.virt.libvirt.driver [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Oct 14 05:48:48 localhost nova_compute[238069]: 2025-10-14 09:48:48.769 2 DEBUG nova.compute.resource_tracker [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Hypervisor/Node resource view: name=np0005486733.localdomain free_ram=12213MB free_disk=41.83725357055664GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": 
"1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Oct 14 05:48:48 localhost nova_compute[238069]: 2025-10-14 09:48:48.770 2 DEBUG oslo_concurrency.lockutils [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Oct 14 05:48:48 localhost nova_compute[238069]: 2025-10-14 09:48:48.770 2 DEBUG oslo_concurrency.lockutils [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Oct 14 05:48:48 localhost nova_compute[238069]: 2025-10-14 09:48:48.866 2 DEBUG nova.compute.resource_tracker [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Instance 88c4e366-b765-47a6-96bf-f7677f2ce67c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Oct 14 05:48:48 localhost nova_compute[238069]: 2025-10-14 09:48:48.866 2 DEBUG nova.compute.resource_tracker [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Oct 14 05:48:48 localhost nova_compute[238069]: 2025-10-14 09:48:48.867 2 DEBUG nova.compute.resource_tracker [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Final resource view: name=np0005486733.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Oct 14 05:48:48 localhost nova_compute[238069]: 2025-10-14 09:48:48.923 2 DEBUG oslo_concurrency.processutils [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Oct 14 05:48:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29. 
Oct 14 05:48:49 localhost nova_compute[238069]: 2025-10-14 09:48:49.331 2 DEBUG oslo_concurrency.processutils [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.409s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Oct 14 05:48:49 localhost nova_compute[238069]: 2025-10-14 09:48:49.340 2 DEBUG nova.compute.provider_tree [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Inventory has not changed in ProviderTree for provider: 18c24273-aca2-4f08-be57-3188d558235e update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Oct 14 05:48:49 localhost nova_compute[238069]: 2025-10-14 09:48:49.362 2 DEBUG nova.scheduler.client.report [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Inventory has not changed for provider 18c24273-aca2-4f08-be57-3188d558235e based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Oct 14 05:48:49 localhost nova_compute[238069]: 2025-10-14 09:48:49.365 2 DEBUG nova.compute.resource_tracker [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Compute_service record updated for np0005486733.localdomain:np0005486733.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Oct 14 05:48:49 localhost nova_compute[238069]: 2025-10-14 09:48:49.365 2 DEBUG oslo_concurrency.lockutils [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Lock "compute_resources" "released" by 
"nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.595s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Oct 14 05:48:49 localhost podman[255077]: 2025-10-14 09:48:49.362819154 +0000 UTC m=+0.088056365 container health_status fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, org.label-schema.vendor=CentOS, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, container_name=iscsid, org.label-schema.license=GPLv2, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009) Oct 14 05:48:49 localhost podman[255077]: 2025-10-14 09:48:49.397225804 +0000 
UTC m=+0.122463025 container exec_died fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_managed=true, config_id=iscsid, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=iscsid, managed_by=edpm_ansible) Oct 14 05:48:49 localhost python3.9[255141]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=node_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.813 12 DEBUG 
ceilometer.compute.discovery [-] instance data: {'id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'name': 'test', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005486733.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '41187b090f3d4818a32baa37ce8a3991', 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'hostId': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228 Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.814 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.815 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.841 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.write.latency volume: 217051898 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.842 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.write.latency volume: 24279644 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.844 12 ERROR oslo_messaging.notify.messaging [-] Could not 
send notification to notifications. Payload={'message_id': 'dc808e4e-fd22-4052-b708-ec9f0d3f8713', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 217051898, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T09:48:49.815166', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'fc5b8824-a8e2-11f0-9707-fa163e99780b', 'monotonic_time': 11346.007873517, 'message_signature': '2624d18d3556faa8d3643f21e52f3db26b4b36624cb3196a21105700d71041eb'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 24279644, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T09:48:49.815166', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 
'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'fc5b9d1e-a8e2-11f0-9707-fa163e99780b', 'monotonic_time': 11346.007873517, 'message_signature': '258af1e6922708f9c6ebe6fcb9470d01a7c3ed5d254912ca1589e17734c2cded'}]}, 'timestamp': '2025-10-14 09:48:49.842558', '_unique_id': '44cdb9edfe184a0d9410386e4fc3a82a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.844 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.844 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.844 12 ERROR oslo_messaging.notify.messaging yield Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.844 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.844 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.844 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.844 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.844 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.844 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.844 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.844 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.844 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.844 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.844 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.844 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.844 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 
05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.844 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.844 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.844 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.844 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.844 12 ERROR oslo_messaging.notify.messaging Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.844 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.844 12 ERROR oslo_messaging.notify.messaging Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.844 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.844 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.844 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.844 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 05:48:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.844 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.844 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.844 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.844 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.844 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.844 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.844 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.844 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.844 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.844 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.844 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.844 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.844 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.844 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.844 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.844 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.844 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.844 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.844 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.844 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:48:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.844 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.844 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.844 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.844 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.844 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.844 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.844 12 ERROR oslo_messaging.notify.messaging Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.845 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.852 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.854 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'd80b0258-c561-4308-b1b4-dad76c69c267', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T09:48:49.845564', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': 'fc5d407e-a8e2-11f0-9707-fa163e99780b', 'monotonic_time': 11346.038295907, 'message_signature': 'f64312fe7b1ac95882c6e1df93197eeec545509e3566c09bcd2212056b4142c0'}]}, 'timestamp': '2025-10-14 09:48:49.853293', '_unique_id': '9c22754020de40aeadae6523a22d45b4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.854 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 
2025-10-14 09:48:49.854 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.854 12 ERROR oslo_messaging.notify.messaging yield Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.854 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.854 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.854 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.854 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.854 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.854 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.854 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.854 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.854 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.854 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.854 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.854 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.854 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.854 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.854 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.854 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.854 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.854 12 ERROR oslo_messaging.notify.messaging Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.854 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.854 12 ERROR oslo_messaging.notify.messaging Oct 14 05:48:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.854 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.854 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.854 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.854 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.854 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.854 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.854 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.854 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.854 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.854 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.854 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.854 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.854 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.854 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.854 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.854 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.854 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.854 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.854 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.854 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 05:48:49 
localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.854 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.854 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.854 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.854 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.854 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.854 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.854 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.854 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.854 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.854 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.854 12 ERROR oslo_messaging.notify.messaging Oct 14 05:48:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.855 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.855 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.read.latency volume: 1400578039 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.856 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.read.latency volume: 174873109 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.857 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f7a30d42-2132-40fd-8125-0fad24726e84', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1400578039, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T09:48:49.855934', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': 
{'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'fc5dbafe-a8e2-11f0-9707-fa163e99780b', 'monotonic_time': 11346.007873517, 'message_signature': '095b309586f1ea48a350d86341cfeb22766787b57460001d4e59b3e63abfc994'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 174873109, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T09:48:49.855934', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'fc5dcbc0-a8e2-11f0-9707-fa163e99780b', 'monotonic_time': 11346.007873517, 'message_signature': 'c274bbcff51b7c784d809eb35603698e6e017afdcea5ac5fb7c130630fca40c2'}]}, 'timestamp': '2025-10-14 09:48:49.856847', '_unique_id': '4974f8fb304545d1aa07a6771b0088f4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 
09:48:49.857 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.857 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.857 12 ERROR oslo_messaging.notify.messaging yield Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.857 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.857 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.857 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.857 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.857 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.857 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.857 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.857 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 05:48:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.857 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.857 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.857 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.857 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.857 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.857 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.857 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.857 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.857 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.857 12 ERROR oslo_messaging.notify.messaging Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.857 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 
2025-10-14 09:48:49.857 12 ERROR oslo_messaging.notify.messaging Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.857 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.857 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.857 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.857 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.857 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.857 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.857 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.857 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.857 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.857 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.857 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.857 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.857 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.857 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.857 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.857 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.857 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.857 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.857 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.857 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.857 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.857 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.857 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.857 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.857 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.857 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.857 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.857 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.857 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.857 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:48:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.857 12 ERROR oslo_messaging.notify.messaging Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.859 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.859 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.860 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c10c6403-14e5-43ed-bd0d-63570bf8a6e3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T09:48:49.859181', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 
'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': 'fc5e39fc-a8e2-11f0-9707-fa163e99780b', 'monotonic_time': 11346.038295907, 'message_signature': 'b110ec7af872f61849bfe7b1dbb6aa30b6738d3d2e87f4c8a9f5c8f2db6623ab'}]}, 'timestamp': '2025-10-14 09:48:49.859696', '_unique_id': '8e9c295c711f4881a211fd502242b109'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.860 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.860 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.860 12 ERROR oslo_messaging.notify.messaging yield Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.860 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.860 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.860 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.860 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.860 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.860 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.860 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.860 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.860 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.860 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.860 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.860 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.860 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.860 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.860 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 05:48:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.860 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.860 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.860 12 ERROR oslo_messaging.notify.messaging Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.860 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.860 12 ERROR oslo_messaging.notify.messaging Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.860 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.860 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.860 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.860 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.860 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.860 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 05:48:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.860 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.860 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.860 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.860 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.860 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.860 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.860 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.860 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.860 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.860 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 05:48:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.860 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.860 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.860 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.860 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.860 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.860 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.860 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.860 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.860 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.860 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.860 12 ERROR 
oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.860 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.860 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.860 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.860 12 ERROR oslo_messaging.notify.messaging Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.861 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.861 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.outgoing.packets volume: 129 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.863 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '8162ff8a-6c12-4b9e-99aa-975b4234a0ce', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 129, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T09:48:49.861886', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': 'fc5ea36a-a8e2-11f0-9707-fa163e99780b', 'monotonic_time': 11346.038295907, 'message_signature': 'cb23c562d15e3830405be66aa3fa4d7fb0c0df6c6d9598866fe0ff354584f027'}]}, 'timestamp': '2025-10-14 09:48:49.862432', '_unique_id': '953df99610ab43bf9900a45f71c36c1a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.863 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:48:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.863 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.863 12 ERROR oslo_messaging.notify.messaging yield Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.863 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.863 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.863 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.863 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.863 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.863 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.863 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.863 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.863 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.863 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.863 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.863 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.863 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.863 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.863 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.863 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.863 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.863 12 ERROR oslo_messaging.notify.messaging Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.863 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.863 12 ERROR oslo_messaging.notify.messaging Oct 14 05:48:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.863 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.863 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.863 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.863 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.863 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.863 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.863 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.863 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.863 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.863 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.863 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.863 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.863 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.863 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.863 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.863 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.863 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.863 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.863 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.863 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.863 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.863 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.863 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.863 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.863 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.863 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.863 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.863 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.863 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.863 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.863 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.864 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.864 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.incoming.packets volume: 81 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.866 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c23a690c-1b12-400c-a9ad-6f6b162ec3a3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 81, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T09:48:49.864745', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': 'fc5f1340-a8e2-11f0-9707-fa163e99780b', 'monotonic_time': 11346.038295907, 'message_signature': 'f2995b9185ecc92bbe868177ef8ab8e1566bb38a652e42fe9a24fb711c0b7a58'}]}, 'timestamp': '2025-10-14 09:48:49.865230', '_unique_id': '8ee5a4d7cf444618aacf1be34b167ad4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.866 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.866 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.866 12 ERROR oslo_messaging.notify.messaging yield
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.866 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.866 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.866 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.866 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.866 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.866 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.866 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.866 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.866 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.866 12 ERROR oslo_messaging.notify.messaging conn.connect()
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.866 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.866 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.866 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.866 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.866 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.866 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.866 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.866 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.866 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.866 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.866 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.866 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.866 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.866 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.866 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.866 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.866 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.866 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.866 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.866 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.866 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.866 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.866 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.866 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.866 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.866 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.866 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.866 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.866 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.866 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.866 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.866 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.866 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.866 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.866 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.866 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.866 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.866 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.866 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.866 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.866 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.867 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.867 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.890 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.891 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.893 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7dcd90d3-5619-4280-a1a3-84c719521219', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T09:48:49.867641', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'fc6313aa-a8e2-11f0-9707-fa163e99780b', 'monotonic_time': 11346.060378406, 'message_signature': '61476ca43bbb0201e085f503d6b639cbbd4f7298651e010d231b39525936273c'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T09:48:49.867641', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'fc63280e-a8e2-11f0-9707-fa163e99780b', 'monotonic_time': 11346.060378406, 'message_signature': '605b7381431ffb922e313cacac212d9d7a0a4bdd79a94c001277d723fe48ff39'}]}, 'timestamp': '2025-10-14 09:48:49.891964', '_unique_id': '54f0d57226ac46f19908b26f7c208e39'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.893 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.893 12 ERROR oslo_messaging.notify.messaging yield
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.893 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.893 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.893 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.893 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.893 12 ERROR oslo_messaging.notify.messaging conn.connect()
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.893 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.893 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.893 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.893 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.893 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.893 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.893 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.893 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.893 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.893 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.893 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.893 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.893 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.893 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.893 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.893 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.893 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.893 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.893 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.893 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.893 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.893 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.893 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.893 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.894 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.894 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.write.requests volume: 525 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.895 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.896 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7d4b5cdc-12a1-4e6d-89e7-6c89d0462fb6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 525, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T09:48:49.894716', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'fc63a66c-a8e2-11f0-9707-fa163e99780b', 'monotonic_time': 11346.007873517, 'message_signature': 'b3ea1d6797437bb8e3326b9e3ae9706b117c6269c2f5fc247420533c94b3bb87'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T09:48:49.894716', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'fc63b6e8-a8e2-11f0-9707-fa163e99780b', 'monotonic_time': 11346.007873517, 'message_signature': 'e7f0cd59334276305552316ee96aef1a1d574c7d5f750ecb8aa9e16e2bf272be'}]}, 'timestamp': '2025-10-14 09:48:49.895603', '_unique_id': 'ffa0451074394bb884cbc6fcc9492f20'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.896 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.896 12 ERROR oslo_messaging.notify.messaging yield
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.896 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.896 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.896 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.896 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.896 12 ERROR oslo_messaging.notify.messaging conn.connect()
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.896 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.896 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.896 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.896 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.896 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.896 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.896 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.896 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.896 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.896 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.896 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.896 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.896 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.896 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.896 12 ERROR oslo_messaging.notify.messaging File
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.896 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.896 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.896 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.896 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.896 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", 
line 433, in _ensure_connection Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.896 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.896 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.896 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.896 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.896 12 ERROR oslo_messaging.notify.messaging Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.897 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.898 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.898 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.write.bytes volume: 73895936 _stats_to_sample 
/usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.898 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.900 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bb613ca4-b605-4500-af6f-d726d1b0166e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 73895936, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T09:48:49.898204', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'fc642dbc-a8e2-11f0-9707-fa163e99780b', 'monotonic_time': 11346.007873517, 'message_signature': 
'e445c770b63ec3f60fe584378c567a370a97ac8f55018ecf72f19c01b1ce53a5'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T09:48:49.898204', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'fc643f50-a8e2-11f0-9707-fa163e99780b', 'monotonic_time': 11346.007873517, 'message_signature': 'aefcefe002a1a2bed0b3d93ce4df8ac441ae0d97eb3edbd109fd7789db991ac7'}]}, 'timestamp': '2025-10-14 09:48:49.899097', '_unique_id': 'bff2b336945d4b2a9ae3897401f6eaa2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.900 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.900 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 
09:48:49.900 12 ERROR oslo_messaging.notify.messaging yield Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.900 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.900 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.900 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.900 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.900 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.900 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.900 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.900 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.900 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.900 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 05:48:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.900 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.900 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.900 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.900 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.900 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.900 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.900 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.900 12 ERROR oslo_messaging.notify.messaging Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.900 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.900 12 ERROR oslo_messaging.notify.messaging Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.900 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.900 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.900 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.900 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.900 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.900 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.900 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.900 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.900 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.900 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.900 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 
09:48:49.900 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.900 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.900 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.900 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.900 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.900 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.900 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.900 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.900 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.900 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.900 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.900 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.900 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.900 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.900 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.900 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.900 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.900 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.900 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.900 12 ERROR oslo_messaging.notify.messaging Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.901 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.901 
12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.902 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'de984787-eb32-48a8-a2db-8fe58b6706d3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T09:48:49.901301', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': 'fc64a724-a8e2-11f0-9707-fa163e99780b', 'monotonic_time': 11346.038295907, 'message_signature': 
'b0dc3f128304481c67b6fd475a88f81cea55db49d1142e3fb272dcbb8e2052cc'}]}, 'timestamp': '2025-10-14 09:48:49.901811', '_unique_id': '23de0c29260b402da8ba41c19cfc1745'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.902 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.902 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.902 12 ERROR oslo_messaging.notify.messaging yield Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.902 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.902 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.902 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.902 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.902 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.902 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.902 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.902 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.902 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.902 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.902 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.902 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.902 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.902 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.902 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.902 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.902 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 05:48:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.902 12 ERROR oslo_messaging.notify.messaging Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.902 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.902 12 ERROR oslo_messaging.notify.messaging Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.902 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.902 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.902 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.902 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.902 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.902 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.902 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.902 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", 
line 653, in _send
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.902 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.902 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.902 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.902 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.902 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.902 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.902 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.902 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.902 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.902 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.902 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.902 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.902 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.902 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.902 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.902 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.902 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.902 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.902 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.902 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.902 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.902 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.902 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.904 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.934 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/cpu volume: 62340000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.936 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0453ed30-b07a-4a3d-a527-8586125eb639', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 62340000000, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'timestamp': '2025-10-14T09:48:49.904540', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': 'fc69be62-a8e2-11f0-9707-fa163e99780b', 'monotonic_time': 11346.126905209, 'message_signature': '0e8ec856d371acdf2ffc735ba99521ac9027c88616220e5213dff3ecd2b59eb7'}]}, 'timestamp': '2025-10-14 09:48:49.935287', '_unique_id': 'f4c03b377f194a8aab90b50070c7f2a3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.936 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.936 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.936 12 ERROR oslo_messaging.notify.messaging yield
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.936 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.936 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.936 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.936 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.936 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.936 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.936 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.936 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.936 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.936 12 ERROR oslo_messaging.notify.messaging conn.connect()
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.936 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.936 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.936 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.936 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.936 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.936 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.936 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.936 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.936 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.936 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.936 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.936 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.936 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.936 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.936 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.936 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.936 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.936 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.936 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.936 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.936 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.936 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.936 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.936 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.936 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.936 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.936 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.936 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.936 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.936 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.936 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.936 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.936 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.936 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.936 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.936 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.936 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.936 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.936 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.936 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.936 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.938 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.938 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.938 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.allocation volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.940 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4acc0fd7-fcbc-48b7-9e81-33b49f3186e3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T09:48:49.938178', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'fc6a479c-a8e2-11f0-9707-fa163e99780b', 'monotonic_time': 11346.060378406, 'message_signature': '49d5d34ee6a448f2fd98653a7127a053515cbd6a1a7f3dd8e9b47737790fa58a'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T09:48:49.938178', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'fc6a5976-a8e2-11f0-9707-fa163e99780b', 'monotonic_time': 11346.060378406, 'message_signature': '29216fa6eb309acb19b1760da94499dcf79d70b54a0f309c07c08cc7a1c7d7ee'}]}, 'timestamp': '2025-10-14 09:48:49.939089', '_unique_id': '63b03b39c5214d6bb7961b912a37fa61'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.940 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.940 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.940 12 ERROR oslo_messaging.notify.messaging yield
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.940 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.940 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.940 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.940 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.940 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.940 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.940 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.940 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.940 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.940 12 ERROR oslo_messaging.notify.messaging conn.connect()
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.940 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.940 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.940 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.940 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.940 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.940 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.940 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.940 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.940 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.940 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.940 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.940 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.940 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.940 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.940 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.940 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.940 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.940 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.940 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.940 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.940 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.940 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.940 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.940 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.940 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.940 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.940 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.940 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.940 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.940 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.940 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.940 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.940 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.940 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.940 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.940 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.940 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.940 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.940 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.940 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.940 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.941 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.941 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.read.requests volume: 1064 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.941 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.read.requests volume: 222 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.943 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a80ab305-59f5-4a08-b37a-03b27f9d3f42', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1064, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T09:48:49.941289', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'fc6ac0b4-a8e2-11f0-9707-fa163e99780b', 'monotonic_time': 11346.007873517, 'message_signature': '55d25768f1d8c81adb0a4beb94f6a54d502276bbd7f85085384665aa92694cff'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 222, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T09:48:49.941289', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'fc6ad374-a8e2-11f0-9707-fa163e99780b', 'monotonic_time': 11346.007873517, 'message_signature': '13d15700b90ceb0e816622b56fe44fb9080dab89caacb892bde01e366e484eb6'}]}, 'timestamp': '2025-10-14 09:48:49.942215', '_unique_id': '5bbb7962f93949859573f4e72ba80c45'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.943 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.943 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.943 12 ERROR oslo_messaging.notify.messaging yield
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.943 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.943 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.943 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.943 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.943 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.943 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.943 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.943 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.943 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.943 12 ERROR oslo_messaging.notify.messaging conn.connect()
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.943 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.943 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.943 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.943 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.943 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.943 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.943 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.943 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.943 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.943 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.943 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.943 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.943 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.943 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.943 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.943 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.943 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.943 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.943 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.943 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.943 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.943 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.943 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.943 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.943 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.943 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.943 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.943 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.943 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.943 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 05:48:49 
localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.943 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.943 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.943 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.943 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.943 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.943 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.943 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.943 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.943 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.943 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.943 12 ERROR oslo_messaging.notify.messaging Oct 14 05:48:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.944 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.944 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.read.bytes volume: 29130240 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.945 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.read.bytes volume: 4300800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.946 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a58886cd-579a-4b5f-98fd-669aa763baef', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 29130240, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T09:48:49.944753', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'fc6b48cc-a8e2-11f0-9707-fa163e99780b', 'monotonic_time': 11346.007873517, 'message_signature': 'bbfa6a96ed86c362b970eb66752c9fb3e7cb532b5c513c63d51c799441192b3e'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 4300800, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T09:48:49.944753', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'fc6b5ca4-a8e2-11f0-9707-fa163e99780b', 'monotonic_time': 11346.007873517, 'message_signature': '8c0faf13c52bfe6913571a06c74062b150ece4ad6873f082e41b0b14dc8a7317'}]}, 'timestamp': '2025-10-14 09:48:49.945789', '_unique_id': 'ef6a32c19eca4c33bcc6e6ac63ded75d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.946 12 
ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.946 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.946 12 ERROR oslo_messaging.notify.messaging yield Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.946 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.946 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.946 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.946 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.946 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.946 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.946 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.946 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 05:48:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.946 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.946 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.946 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.946 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.946 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.946 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.946 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.946 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.946 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.946 12 ERROR oslo_messaging.notify.messaging Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.946 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 
2025-10-14 09:48:49.946 12 ERROR oslo_messaging.notify.messaging Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.946 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.946 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.946 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.946 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.946 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.946 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.946 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.946 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.946 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.946 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.946 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.946 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.946 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.946 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.946 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.946 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.946 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.946 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.946 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.946 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.946 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.946 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.946 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.946 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.946 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.946 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.946 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.946 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.946 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.946 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:48:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.946 12 ERROR oslo_messaging.notify.messaging Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.947 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.948 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/memory.usage volume: 52.32421875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.949 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bf790cbf-ad68-4fb0-b63e-63640d8cabdd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 52.32421875, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'timestamp': '2025-10-14T09:48:49.948039', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 
'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': 'fc6bc892-a8e2-11f0-9707-fa163e99780b', 'monotonic_time': 11346.126905209, 'message_signature': '25de3f86047e811e91ee7fe2b82ba4d438da07c24e3232ef1fea89e4d16d0390'}]}, 'timestamp': '2025-10-14 09:48:49.948496', '_unique_id': '5a3628c5b3504243b1ea45e8ee7732f8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.949 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.949 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.949 12 ERROR oslo_messaging.notify.messaging yield Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.949 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.949 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.949 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.949 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.949 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.949 12 ERROR oslo_messaging.notify.messaging self._connection = 
self._establish_connection() Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.949 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.949 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.949 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.949 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.949 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.949 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.949 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.949 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.949 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.949 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.949 12 ERROR 
oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.949 12 ERROR oslo_messaging.notify.messaging Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.949 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.949 12 ERROR oslo_messaging.notify.messaging Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.949 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.949 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.949 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.949 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.949 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.949 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.949 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.949 12 
ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.949 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.949 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.949 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.949 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.949 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.949 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.949 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.949 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.949 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.949 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.949 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.949 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.949 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.949 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.949 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.949 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.949 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.949 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.949 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.949 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.949 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.949 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.949 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.950 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.950 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.952 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8006529b-314a-4f5f-87e0-ca044fe41651', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T09:48:49.950641', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': 'fc6c2fa8-a8e2-11f0-9707-fa163e99780b', 'monotonic_time': 11346.038295907, 'message_signature': '921da275f4bef21cbe552174a0a882f3651becec8f58c37110bbe7db65675999'}]}, 'timestamp': '2025-10-14 09:48:49.951152', '_unique_id': '830c28c3e12a497b93b96d0f113d2da8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.952 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.952 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.952 12 ERROR oslo_messaging.notify.messaging yield
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.952 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.952 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.952 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.952 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.952 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.952 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.952 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.952 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.952 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.952 12 ERROR oslo_messaging.notify.messaging conn.connect()
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.952 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.952 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.952 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.952 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.952 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.952 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.952 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.952 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.952 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.952 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.952 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.952 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.952 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.952 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.952 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.952 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.952 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.952 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.952 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.952 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.952 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.952 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.952 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.952 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.952 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.952 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.952 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.952 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.952 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.952 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.952 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.952 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.952 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.952 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.952 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.952 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.952 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.952 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.952 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.952 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.952 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.953 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.953 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.954 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '800bec39-bd9a-4ece-97a8-a929351d3bd0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T09:48:49.953286', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': 'fc6c9588-a8e2-11f0-9707-fa163e99780b', 'monotonic_time': 11346.038295907, 'message_signature': '3567a93aea1e5bba0bd57234f9891c4d16faaa47f71008e7ded6f98abffd5010'}]}, 'timestamp': '2025-10-14 09:48:49.953788', '_unique_id': 'fa12627bf6a742ccb8abe0564572a670'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.954 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.954 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.954 12 ERROR oslo_messaging.notify.messaging yield
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.954 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.954 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.954 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.954 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.954 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.954 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.954 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.954 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.954 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.954 12 ERROR oslo_messaging.notify.messaging conn.connect()
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.954 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.954 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.954 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.954 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.954 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.954 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.954 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.954 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.954 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.954 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.954 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.954 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.954 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.954 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.954 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.954 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.954 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.954 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.954 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.954 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.954 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.954 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.954 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.954 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.954 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.954 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.954 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.954 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.954 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.954 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.954 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.954 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.954 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.954 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.954 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.954 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.954 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.954 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.954 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.954 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.954 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.955 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.955 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.956 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.957 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2d91d324-d3d4-4b69-86ad-f61014f8b30b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T09:48:49.956113', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': 'fc6d03ce-a8e2-11f0-9707-fa163e99780b', 'monotonic_time': 11346.038295907, 'message_signature': '6f438af0f7fb8fd3c49aeac88f4fe96fbaf69decbd5a13e50666fb90cf3a0183'}]}, 'timestamp': '2025-10-14 09:48:49.956583', '_unique_id': 'afa8df037cb8480b87ec35fc6c7800cf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.957 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.957 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.957 12 ERROR oslo_messaging.notify.messaging yield
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.957 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.957 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.957 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.957 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.957 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.957 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.957 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.957 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.957 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.957 12 ERROR oslo_messaging.notify.messaging conn.connect()
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.957 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.957 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.957 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.957 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.957 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.957 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.957 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.957 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.957 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.957 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.957 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.957 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.957 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.957 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.957 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.957 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.957 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.957 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.957 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.957 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.957 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.957 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.957 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.957 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.957 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.957 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.957 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.957 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.957 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.957 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 05:48:49 
localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.957 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.957 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.957 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.957 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.957 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.957 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.957 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.957 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.957 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.957 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.957 12 ERROR oslo_messaging.notify.messaging Oct 14 05:48:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.958 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.958 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.outgoing.bytes volume: 11272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.960 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd6a58df5-0c31-43ff-9a88-24a7417ba645', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 11272, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T09:48:49.958763', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 
'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': 'fc6d6b70-a8e2-11f0-9707-fa163e99780b', 'monotonic_time': 11346.038295907, 'message_signature': '128587ec92e3b5018316b7d6e89a26313136d9dbabaa012a6414c56e8ae53460'}]}, 'timestamp': '2025-10-14 09:48:49.959233', '_unique_id': '998fd64dfc574ad5a3dcdb47a2c570f7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.960 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.960 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.960 12 ERROR oslo_messaging.notify.messaging yield Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.960 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.960 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.960 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.960 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.960 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.960 12 ERROR 
oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.960 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.960 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.960 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.960 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.960 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.960 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.960 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.960 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.960 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.960 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 
2025-10-14 09:48:49.960 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.960 12 ERROR oslo_messaging.notify.messaging Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.960 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.960 12 ERROR oslo_messaging.notify.messaging Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.960 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.960 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.960 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.960 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.960 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.960 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.960 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 05:48:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.960 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.960 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.960 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.960 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.960 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.960 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.960 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.960 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.960 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.960 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 
05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.960 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.960 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.960 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.960 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.960 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.960 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.960 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.960 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.960 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.960 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.960 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.960 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.960 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.960 12 ERROR oslo_messaging.notify.messaging Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.961 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.961 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.961 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.963 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '5f8f054e-b479-4807-9fed-c8ce835ddc40', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T09:48:49.961362', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'fc6dd092-a8e2-11f0-9707-fa163e99780b', 'monotonic_time': 11346.060378406, 'message_signature': '96c3159b966ce524a97dede9591dbf7a23cf50c00797df3e0be2999a68b57151'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T09:48:49.961362', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 
'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'fc6def5a-a8e2-11f0-9707-fa163e99780b', 'monotonic_time': 11346.060378406, 'message_signature': '5b10e43588a954d0fe24aa3594887561db255e7eaf422c974ee626f1b1bb3ce9'}]}, 'timestamp': '2025-10-14 09:48:49.962644', '_unique_id': 'eeba5a253e89448981b267b955d6d906'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.963 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.963 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.963 12 ERROR oslo_messaging.notify.messaging yield Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.963 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.963 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.963 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.963 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.963 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.963 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.963 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.963 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.963 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.963 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.963 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.963 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.963 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 05:48:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.963 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.963 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.963 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.963 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.963 12 ERROR oslo_messaging.notify.messaging Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.963 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.963 12 ERROR oslo_messaging.notify.messaging Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.963 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.963 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.963 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.963 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 
09:48:49.963 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.963 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.963 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.963 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.963 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.963 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.963 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.963 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.963 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.963 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 05:48:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.963 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.963 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.963 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.963 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.963 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.963 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.963 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.963 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.963 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.963 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.963 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.963 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.963 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.963 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.963 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.963 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.963 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.965 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.965 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.incoming.bytes volume: 8696 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.966 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bdc1c4ec-26c8-4e56-8943-3399aa2842f4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 8696, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T09:48:49.965093', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': 'fc6e60ca-a8e2-11f0-9707-fa163e99780b', 'monotonic_time': 11346.038295907, 'message_signature': 'ad87679e48282e0044acefc17d150743d7a70741efe8604472684b8b007a4cc6'}]}, 'timestamp': '2025-10-14 09:48:49.965436', '_unique_id': '446a0453a0d2490da6a69cd4373b4b1f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.966 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.966 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.966 12 ERROR oslo_messaging.notify.messaging yield
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.966 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.966 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.966 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.966 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.966 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.966 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.966 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.966 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.966 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.966 12 ERROR oslo_messaging.notify.messaging conn.connect()
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.966 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.966 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.966 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.966 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.966 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.966 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.966 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.966 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.966 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.966 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.966 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.966 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.966 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.966 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.966 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.966 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.966 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.966 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.966 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.966 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.966 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.966 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.966 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.966 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.966 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.966 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.966 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.966 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.966 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.966 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.966 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.966 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.966 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.966 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.966 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.966 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.966 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.966 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.966 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.966 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 05:48:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:48:49.966 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:48:50 localhost nova_compute[238069]: 2025-10-14 09:48:50.366 2 DEBUG oslo_service.periodic_task [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 14 05:48:50 localhost nova_compute[238069]: 2025-10-14 09:48:50.366 2 DEBUG nova.compute.manager [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct 14 05:48:50 localhost systemd[1]: var-lib-containers-storage-overlay-f58f2b4f8f560729736f5941b846f416eb5c90f8a03f52e63e224ade26f2e564-merged.mount: Deactivated successfully.
Oct 14 05:48:50 localhost systemd[1]: var-lib-containers-storage-overlay-215025152e7486dca6aa506e7e941c98eca167be4a4853b2a3771ef4f2b39afc-merged.mount: Deactivated successfully.
Oct 14 05:48:50 localhost systemd[1]: var-lib-containers-storage-overlay-215025152e7486dca6aa506e7e941c98eca167be4a4853b2a3771ef4f2b39afc-merged.mount: Deactivated successfully.
Oct 14 05:48:50 localhost systemd[1]: fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29.service: Deactivated successfully.
Oct 14 05:48:50 localhost systemd[1]: Started libpod-conmon-aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec.scope.
Oct 14 05:48:50 localhost podman[255142]: 2025-10-14 09:48:50.698298023 +0000 UTC m=+1.024704887 container exec aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors )
Oct 14 05:48:50 localhost podman[255142]: 2025-10-14 09:48:50.729424285 +0000 UTC m=+1.055831179 container exec_died aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Oct 14 05:48:50 localhost nova_compute[238069]: 2025-10-14 09:48:50.815 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 14 05:48:52 localhost nova_compute[238069]: 2025-10-14 09:48:52.854 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 14 05:48:53 localhost systemd[1]: var-lib-containers-storage-overlay-948d63d72c90238568600bb4ced3a347f3a772760aabfa54019ccce9078bd0ca-merged.mount: Deactivated successfully.
Oct 14 05:48:53 localhost systemd[1]: var-lib-containers-storage-overlay-919c2496449756819846525fbfb351457636bf59d0964ccd47919cff1ec5dc94-merged.mount: Deactivated successfully.
Oct 14 05:48:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8696 DF PROTO=TCP SPT=40962 DPT=9102 SEQ=3058975961 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B11E4440000000001030307)
Oct 14 05:48:53 localhost systemd[1]: var-lib-containers-storage-overlay-919c2496449756819846525fbfb351457636bf59d0964ccd47919cff1ec5dc94-merged.mount: Deactivated successfully.
Oct 14 05:48:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec.
Oct 14 05:48:54 localhost python3.9[255292]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=node_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Oct 14 05:48:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8697 DF PROTO=TCP SPT=40962 DPT=9102 SEQ=3058975961 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B11E85A0000000001030307)
Oct 14 05:48:55 localhost systemd[1]: var-lib-containers-storage-overlay-d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610-merged.mount: Deactivated successfully.
Oct 14 05:48:55 localhost systemd[1]: var-lib-containers-storage-overlay-948d63d72c90238568600bb4ced3a347f3a772760aabfa54019ccce9078bd0ca-merged.mount: Deactivated successfully.
Oct 14 05:48:55 localhost nova_compute[238069]: 2025-10-14 09:48:55.816 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 14 05:48:55 localhost systemd[1]: var-lib-containers-storage-overlay-948d63d72c90238568600bb4ced3a347f3a772760aabfa54019ccce9078bd0ca-merged.mount: Deactivated successfully.
Oct 14 05:48:55 localhost systemd[1]: libpod-conmon-aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec.scope: Deactivated successfully.
Oct 14 05:48:56 localhost systemd[1]: Started libpod-conmon-aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec.scope.
Oct 14 05:48:56 localhost podman[255293]: 2025-10-14 09:48:56.041966072 +0000 UTC m=+1.638560371 container exec aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible)
Oct 14 05:48:56 localhost podman[255278]: 2025-10-14 09:48:56.073715393 +0000 UTC m=+1.908890101 container health_status aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible)
Oct 14 05:48:56 localhost podman[255293]: 2025-10-14 09:48:56.087136779 +0000 UTC m=+1.683731158 container exec_died aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors )
Oct 14 05:48:56 localhost podman[255278]: 2025-10-14 09:48:56.116534979 +0000 UTC m=+1.951709687 container exec_died aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors )
Oct 14 05:48:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad.
Oct 14 05:48:57 localhost systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully.
Oct 14 05:48:57 localhost systemd[1]: var-lib-containers-storage-overlay-d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610-merged.mount: Deactivated successfully.
Oct 14 05:48:57 localhost systemd[1]: var-lib-containers-storage-overlay-d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610-merged.mount: Deactivated successfully.
Oct 14 05:48:57 localhost systemd[1]: aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec.service: Deactivated successfully.
Oct 14 05:48:57 localhost podman[255333]: 2025-10-14 09:48:57.308432883 +0000 UTC m=+0.897081375 container health_status 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, container_name=multipathd, org.label-schema.vendor=CentOS, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251009, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct 14 05:48:57 localhost podman[255333]: 2025-10-14 09:48:57.32089549 +0000 UTC m=+0.909543952 container exec_died 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, container_name=multipathd, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true)
Oct 14 05:48:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:48:57.756 163055 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct 14 05:48:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:48:57.757 163055 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct 14 05:48:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:48:57.759 163055 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct 14 05:48:57 localhost nova_compute[238069]: 2025-10-14 09:48:57.856 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 14 05:48:57 localhost systemd[1]: var-lib-containers-storage-overlay-e0f86229f02c4331620c9ec8e21be769ac9cff125fc1f01f8404fcae9b59e3df-merged.mount: Deactivated successfully.
Oct 14 05:48:57 localhost systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully.
Oct 14 05:48:58 localhost systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully.
Oct 14 05:48:58 localhost systemd[1]: libpod-conmon-aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec.scope: Deactivated successfully.
Oct 14 05:48:58 localhost systemd[1]: 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad.service: Deactivated successfully.
Oct 14 05:48:58 localhost python3.9[255462]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/node_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 05:48:58 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63400 DF PROTO=TCP SPT=43218 DPT=9882 SEQ=2410713525 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B11F70A0000000001030307)
Oct 14 05:48:58 localhost python3.9[255573]: ansible-containers.podman.podman_container_info Invoked with name=['podman_exporter'] executable=podman
Oct 14 05:48:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f.
Oct 14 05:48:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917.
Oct 14 05:49:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8699 DF PROTO=TCP SPT=40962 DPT=9102 SEQ=3058975961 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B12001A0000000001030307)
Oct 14 05:49:00 localhost nova_compute[238069]: 2025-10-14 09:49:00.822 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 14 05:49:00 localhost systemd[1]: var-lib-containers-storage-overlay-919c2496449756819846525fbfb351457636bf59d0964ccd47919cff1ec5dc94-merged.mount: Deactivated successfully.
Oct 14 05:49:01 localhost systemd[1]: var-lib-containers-storage-overlay-b323c7b12cc075908cfc59295640d329582d5902bacc1e83223968c79062b1a1-merged.mount: Deactivated successfully. Oct 14 05:49:01 localhost systemd[1]: var-lib-containers-storage-overlay-b323c7b12cc075908cfc59295640d329582d5902bacc1e83223968c79062b1a1-merged.mount: Deactivated successfully. Oct 14 05:49:01 localhost podman[255587]: 2025-10-14 09:49:01.341910909 +0000 UTC m=+1.670961031 container health_status 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, architecture=x86_64, build-date=2025-08-20T13:12:41, config_id=edpm, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., distribution-scope=public, version=9.6, managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, vendor=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1755695350, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9) Oct 14 05:49:01 localhost podman[255587]: 2025-10-14 09:49:01.35812997 +0000 UTC m=+1.687180122 container exec_died 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, version=9.6, name=ubi9-minimal, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package 
manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b) Oct 14 05:49:01 localhost podman[255586]: 2025-10-14 09:49:01.437378847 +0000 UTC m=+1.766570713 container health_status 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, 
org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2) Oct 14 05:49:01 localhost podman[255586]: 2025-10-14 09:49:01.511346486 +0000 UTC m=+1.840538382 container exec_died 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, container_name=ovn_controller, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}) Oct 14 05:49:02 localhost systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully. 
Oct 14 05:49:02 localhost systemd[1]: var-lib-containers-storage-overlay-f479750c98f4a67ffae355a1e79b3c9a76d56699a79b842b4363e69f089cca49-merged.mount: Deactivated successfully. Oct 14 05:49:02 localhost systemd[1]: var-lib-containers-storage-overlay-f479750c98f4a67ffae355a1e79b3c9a76d56699a79b842b4363e69f089cca49-merged.mount: Deactivated successfully. Oct 14 05:49:02 localhost systemd[1]: 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917.service: Deactivated successfully. Oct 14 05:49:02 localhost systemd[1]: 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f.service: Deactivated successfully. Oct 14 05:49:02 localhost nova_compute[238069]: 2025-10-14 09:49:02.859 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:49:03 localhost systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully. Oct 14 05:49:03 localhost systemd[1]: var-lib-containers-storage-overlay-e0f86229f02c4331620c9ec8e21be769ac9cff125fc1f01f8404fcae9b59e3df-merged.mount: Deactivated successfully. Oct 14 05:49:03 localhost python3.9[255741]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=podman_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None Oct 14 05:49:03 localhost systemd[1]: Started libpod-conmon-0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041.scope. 
Oct 14 05:49:03 localhost podman[255742]: 2025-10-14 09:49:03.723359747 +0000 UTC m=+0.116723242 container exec 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Oct 14 05:49:03 localhost podman[255742]: 2025-10-14 09:49:03.755594823 +0000 UTC m=+0.148958288 container exec_died 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', 
'/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Oct 14 05:49:03 localhost systemd[1]: var-lib-containers-storage-overlay-e0f86229f02c4331620c9ec8e21be769ac9cff125fc1f01f8404fcae9b59e3df-merged.mount: Deactivated successfully. Oct 14 05:49:04 localhost systemd[1]: var-lib-containers-storage-overlay-f479750c98f4a67ffae355a1e79b3c9a76d56699a79b842b4363e69f089cca49-merged.mount: Deactivated successfully. Oct 14 05:49:04 localhost systemd[1]: var-lib-containers-storage-overlay-d1668dbabecea61a717977938a99d4a46ffa99afa4505047a6e5a86838675946-merged.mount: Deactivated successfully. Oct 14 05:49:04 localhost systemd[1]: var-lib-containers-storage-overlay-d1668dbabecea61a717977938a99d4a46ffa99afa4505047a6e5a86838675946-merged.mount: Deactivated successfully. Oct 14 05:49:05 localhost python3.9[255880]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=podman_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None Oct 14 05:49:05 localhost systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully. Oct 14 05:49:05 localhost systemd[1]: var-lib-containers-storage-overlay-8035b846284d335d9393ab62c801f2456eb25851b24c50a7b13196117676086c-merged.mount: Deactivated successfully. Oct 14 05:49:05 localhost systemd[1]: var-lib-containers-storage-overlay-8035b846284d335d9393ab62c801f2456eb25851b24c50a7b13196117676086c-merged.mount: Deactivated successfully. Oct 14 05:49:05 localhost systemd[1]: libpod-conmon-0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041.scope: Deactivated successfully. Oct 14 05:49:05 localhost systemd[1]: Started libpod-conmon-0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041.scope. 
Oct 14 05:49:05 localhost podman[255881]: 2025-10-14 09:49:05.308793399 +0000 UTC m=+0.244754456 container exec 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm) Oct 14 05:49:05 localhost podman[255881]: 2025-10-14 09:49:05.342314303 +0000 UTC m=+0.278275320 container exec_died 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, 
config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Oct 14 05:49:05 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63403 DF PROTO=TCP SPT=43218 DPT=9882 SEQ=2410713525 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B1212DA0000000001030307) Oct 14 05:49:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d. Oct 14 05:49:05 localhost nova_compute[238069]: 2025-10-14 09:49:05.830 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:49:06 localhost podman[255911]: 2025-10-14 09:49:06.044895053 +0000 UTC m=+0.380333200 container health_status 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=unhealthy, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': 
['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3) Oct 14 05:49:06 localhost podman[255911]: 2025-10-14 09:49:06.056149513 +0000 UTC m=+0.391587650 container exec_died 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', 
'/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}) Oct 14 05:49:06 localhost systemd[1]: var-lib-containers-storage-overlay-e0f86229f02c4331620c9ec8e21be769ac9cff125fc1f01f8404fcae9b59e3df-merged.mount: Deactivated successfully. Oct 14 05:49:06 localhost systemd[1]: var-lib-containers-storage-overlay-e0f86229f02c4331620c9ec8e21be769ac9cff125fc1f01f8404fcae9b59e3df-merged.mount: Deactivated successfully. Oct 14 05:49:06 localhost systemd[1]: libpod-conmon-0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041.scope: Deactivated successfully. Oct 14 05:49:06 localhost systemd[1]: 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d.service: Deactivated successfully. Oct 14 05:49:06 localhost python3.9[256038]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/podman_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:49:07 localhost systemd[1]: var-lib-containers-storage-overlay-8035b846284d335d9393ab62c801f2456eb25851b24c50a7b13196117676086c-merged.mount: Deactivated successfully. 
Oct 14 05:49:07 localhost systemd[1]: var-lib-containers-storage-overlay-66141e0355e434a1428da5b2027ef6192344d1c6afa950636647476e8925671b-merged.mount: Deactivated successfully. Oct 14 05:49:07 localhost systemd[1]: var-lib-containers-storage-overlay-66141e0355e434a1428da5b2027ef6192344d1c6afa950636647476e8925671b-merged.mount: Deactivated successfully. Oct 14 05:49:07 localhost python3.9[256162]: ansible-containers.podman.podman_container_info Invoked with name=['openstack_network_exporter'] executable=podman Oct 14 05:49:07 localhost nova_compute[238069]: 2025-10-14 09:49:07.861 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:49:08 localhost systemd[1]: var-lib-containers-storage-overlay-5e0d5b365d1d4f2cbdec218bcecccb17a52487dea7c1e0a1ce7e4461f7c3a058-merged.mount: Deactivated successfully. Oct 14 05:49:08 localhost systemd[1]: var-lib-containers-storage-overlay-5c6de20ee9f73151254b053a0024fcbdd9b55691492d339c494637f80bb81826-merged.mount: Deactivated successfully. Oct 14 05:49:08 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29316 DF PROTO=TCP SPT=59202 DPT=9105 SEQ=1763317719 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B121FEF0000000001030307) Oct 14 05:49:09 localhost systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully. Oct 14 05:49:09 localhost systemd[1]: var-lib-containers-storage-overlay-5e0d5b365d1d4f2cbdec218bcecccb17a52487dea7c1e0a1ce7e4461f7c3a058-merged.mount: Deactivated successfully. Oct 14 05:49:09 localhost systemd[1]: var-lib-containers-storage-overlay-5e0d5b365d1d4f2cbdec218bcecccb17a52487dea7c1e0a1ce7e4461f7c3a058-merged.mount: Deactivated successfully. 
Oct 14 05:49:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29317 DF PROTO=TCP SPT=59202 DPT=9105 SEQ=1763317719 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B1223DB0000000001030307) Oct 14 05:49:10 localhost systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully. Oct 14 05:49:10 localhost python3.9[256285]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=openstack_network_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None Oct 14 05:49:10 localhost systemd[1]: Started libpod-conmon-799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917.scope. Oct 14 05:49:10 localhost systemd[1]: var-lib-containers-storage-overlay-e0f86229f02c4331620c9ec8e21be769ac9cff125fc1f01f8404fcae9b59e3df-merged.mount: Deactivated successfully. Oct 14 05:49:10 localhost podman[256286]: 2025-10-14 09:49:10.304255973 +0000 UTC m=+0.117448594 container exec 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.buildah.version=1.33.7, vendor=Red Hat, Inc., managed_by=edpm_ansible, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, version=9.6, architecture=x86_64, name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, config_id=edpm) Oct 14 05:49:10 localhost podman[256286]: 2025-10-14 09:49:10.339295773 +0000 UTC m=+0.152488364 container exec_died 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, distribution-scope=public, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, io.buildah.version=1.33.7, managed_by=edpm_ansible, vcs-type=git, config_id=edpm, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., architecture=x86_64, name=ubi9-minimal) Oct 14 05:49:10 localhost systemd[1]: libpod-conmon-799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917.scope: Deactivated successfully. Oct 14 05:49:10 localhost nova_compute[238069]: 2025-10-14 09:49:10.838 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:49:11 localhost python3.9[256425]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=openstack_network_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None Oct 14 05:49:11 localhost systemd[1]: Started libpod-conmon-799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917.scope. 
Oct 14 05:49:11 localhost podman[256426]: 2025-10-14 09:49:11.176513416 +0000 UTC m=+0.085732865 container exec 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, vendor=Red Hat, Inc., io.buildah.version=1.33.7, maintainer=Red Hat, Inc., version=9.6, architecture=x86_64, io.openshift.expose-services=, vcs-type=git, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal) Oct 14 05:49:11 localhost systemd[1]: var-lib-containers-storage-overlay-5c6de20ee9f73151254b053a0024fcbdd9b55691492d339c494637f80bb81826-merged.mount: Deactivated successfully. Oct 14 05:49:11 localhost podman[256426]: 2025-10-14 09:49:11.204668788 +0000 UTC m=+0.113888267 container exec_died 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., vcs-type=git, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, distribution-scope=public, vendor=Red Hat, Inc., version=9.6, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, config_id=edpm) Oct 14 05:49:11 localhost systemd[1]: var-lib-containers-storage-overlay-3c56646706fff247676980ac78d7924c31221bb364af528e25b8eedf875d177e-merged.mount: Deactivated successfully. 
Oct 14 05:49:11 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29318 DF PROTO=TCP SPT=59202 DPT=9105 SEQ=1763317719 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B122BDA0000000001030307) Oct 14 05:49:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857. Oct 14 05:49:12 localhost python3.9[256582]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/openstack_network_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:49:12 localhost nova_compute[238069]: 2025-10-14 09:49:12.863 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:49:13 localhost python3.9[256702]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall/ state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:49:13 localhost systemd[1]: var-lib-containers-storage-overlay-948d63d72c90238568600bb4ced3a347f3a772760aabfa54019ccce9078bd0ca-merged.mount: Deactivated successfully. Oct 14 05:49:13 localhost systemd[1]: var-lib-containers-storage-overlay-919c2496449756819846525fbfb351457636bf59d0964ccd47919cff1ec5dc94-merged.mount: Deactivated successfully. 
Oct 14 05:49:14 localhost python3.9[256812]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/telemetry.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True Oct 14 05:49:14 localhost systemd[1]: var-lib-containers-storage-overlay-919c2496449756819846525fbfb351457636bf59d0964ccd47919cff1ec5dc94-merged.mount: Deactivated successfully. Oct 14 05:49:14 localhost systemd[1]: libpod-conmon-799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917.scope: Deactivated successfully. Oct 14 05:49:14 localhost podman[256581]: 2025-10-14 09:49:14.286125158 +0000 UTC m=+2.399225778 container health_status 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5) Oct 14 05:49:14 localhost podman[256581]: 2025-10-14 09:49:14.321202338 +0000 UTC m=+2.434302968 container exec_died 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, container_name=ovn_metadata_agent, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible) Oct 14 05:49:14 localhost python3.9[256907]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/telemetry.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1760435353.5109878-3170-175408482883955/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=d942d984493b214bda2913f753ff68cdcedff00e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:49:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041. Oct 14 05:49:15 localhost nova_compute[238069]: 2025-10-14 09:49:15.846 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:49:15 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29319 DF PROTO=TCP SPT=59202 DPT=9105 SEQ=1763317719 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B123B9A0000000001030307) Oct 14 05:49:15 localhost systemd[1]: var-lib-containers-storage-overlay-d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610-merged.mount: Deactivated successfully. 
Oct 14 05:49:16 localhost systemd[1]: var-lib-containers-storage-overlay-948d63d72c90238568600bb4ced3a347f3a772760aabfa54019ccce9078bd0ca-merged.mount: Deactivated successfully. Oct 14 05:49:16 localhost python3.9[257029]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:49:16 localhost systemd[1]: var-lib-containers-storage-overlay-948d63d72c90238568600bb4ced3a347f3a772760aabfa54019ccce9078bd0ca-merged.mount: Deactivated successfully. Oct 14 05:49:16 localhost systemd[1]: 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857.service: Deactivated successfully. Oct 14 05:49:16 localhost podman[256996]: 2025-10-14 09:49:16.496216161 +0000 UTC m=+0.834327017 container health_status 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=unhealthy, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', 
'/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm) Oct 14 05:49:16 localhost podman[256996]: 2025-10-14 09:49:16.512976068 +0000 UTC m=+0.851086944 container exec_died 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Oct 14 05:49:16 localhost podman[256996]: unhealthy Oct 14 05:49:17 localhost python3.9[257150]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True Oct 14 05:49:17 localhost systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully. Oct 14 05:49:17 localhost systemd[1]: var-lib-containers-storage-overlay-d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610-merged.mount: Deactivated successfully. Oct 14 05:49:17 localhost systemd[1]: var-lib-containers-storage-overlay-d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610-merged.mount: Deactivated successfully. 
Oct 14 05:49:17 localhost python3.9[257207]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:49:17 localhost systemd[1]: 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041.service: Main process exited, code=exited, status=1/FAILURE Oct 14 05:49:17 localhost systemd[1]: 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041.service: Failed with result 'exit-code'. Oct 14 05:49:17 localhost nova_compute[238069]: 2025-10-14 09:49:17.865 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:49:18 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26772 DF PROTO=TCP SPT=57426 DPT=9101 SEQ=2476972697 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B12449A0000000001030307) Oct 14 05:49:18 localhost systemd[1]: var-lib-containers-storage-overlay-e0f86229f02c4331620c9ec8e21be769ac9cff125fc1f01f8404fcae9b59e3df-merged.mount: Deactivated successfully. Oct 14 05:49:18 localhost systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully. Oct 14 05:49:18 localhost systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully. 
Oct 14 05:49:18 localhost podman[257295]: 2025-10-14 09:49:18.952570446 +0000 UTC m=+0.102446782 container exec b19a91314e045cbe5465f6abdeb6b420108fcda7860b498b01dcd4e65236d2c8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-crash-np0005486733, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , vendor=Red Hat, Inc., version=7, vcs-type=git, CEPH_POINT_RELEASE=, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_CLEAN=True, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-09-24T08:57:55, architecture=x86_64, distribution-scope=public, name=rhceph, com.redhat.component=rhceph-container, release=553, io.buildah.version=1.33.12, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d) Oct 14 05:49:19 localhost podman[257295]: 2025-10-14 09:49:19.031632388 +0000 UTC m=+0.181508744 container exec_died b19a91314e045cbe5465f6abdeb6b420108fcda7860b498b01dcd4e65236d2c8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-crash-np0005486733, distribution-scope=public, description=Red Hat Ceph Storage 7, ceph=True, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, architecture=x86_64, maintainer=Guillaume Abrioux , GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, 
org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.license_terms=https://www.redhat.com/agreements, CEPH_POINT_RELEASE=, RELEASE=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, vendor=Red Hat, Inc., GIT_CLEAN=True, version=7, build-date=2025-09-24T08:57:55, io.buildah.version=1.33.12, release=553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553) Oct 14 05:49:19 localhost python3.9[257348]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True Oct 14 05:49:19 localhost python3.9[257417]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.n_431mib recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:49:20 localhost python3.9[257527]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True Oct 14 05:49:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29. 
Oct 14 05:49:20 localhost nova_compute[238069]: 2025-10-14 09:49:20.851 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:49:20 localhost python3.9[257585]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:49:21 localhost systemd[1]: var-lib-containers-storage-overlay-919c2496449756819846525fbfb351457636bf59d0964ccd47919cff1ec5dc94-merged.mount: Deactivated successfully. Oct 14 05:49:21 localhost systemd[1]: var-lib-containers-storage-overlay-32472dd13182638cf4e918dd146bf2d7453b4f75ed169a2fe8fd3591fc0a1be9-merged.mount: Deactivated successfully. Oct 14 05:49:21 localhost python3.9[257723]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Oct 14 05:49:22 localhost systemd[1]: var-lib-containers-storage-overlay-1b94024f0eaacdff3ae200e2172324d7aec107282443f6fc22fe2f0287bc90ec-merged.mount: Deactivated successfully. Oct 14 05:49:22 localhost systemd[1]: var-lib-containers-storage-overlay-9c7bc0417a3c6c9361659b5f2f41d814b152f8a47a3821564971debd2b788997-merged.mount: Deactivated successfully. Oct 14 05:49:22 localhost python3[257834]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall Oct 14 05:49:22 localhost systemd[1]: var-lib-containers-storage-overlay-9c7bc0417a3c6c9361659b5f2f41d814b152f8a47a3821564971debd2b788997-merged.mount: Deactivated successfully. 
Oct 14 05:49:22 localhost nova_compute[238069]: 2025-10-14 09:49:22.867 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:49:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34179 DF PROTO=TCP SPT=36362 DPT=9102 SEQ=2244789403 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B1259750000000001030307) Oct 14 05:49:23 localhost python3.9[257944]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True Oct 14 05:49:23 localhost systemd[1]: var-lib-containers-storage-overlay-0b52816892c0967aea6a33893e73899adbf76e3ca055f6670535905d8ddf2b2c-merged.mount: Deactivated successfully. Oct 14 05:49:23 localhost systemd[1]: var-lib-containers-storage-overlay-1b94024f0eaacdff3ae200e2172324d7aec107282443f6fc22fe2f0287bc90ec-merged.mount: Deactivated successfully. Oct 14 05:49:23 localhost systemd[1]: var-lib-containers-storage-overlay-1b94024f0eaacdff3ae200e2172324d7aec107282443f6fc22fe2f0287bc90ec-merged.mount: Deactivated successfully. Oct 14 05:49:24 localhost python3.9[258001]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:49:24 localhost systemd[1]: var-lib-containers-storage-overlay-0b52816892c0967aea6a33893e73899adbf76e3ca055f6670535905d8ddf2b2c-merged.mount: Deactivated successfully. 
Oct 14 05:49:24 localhost systemd[1]: var-lib-containers-storage-overlay-0b52816892c0967aea6a33893e73899adbf76e3ca055f6670535905d8ddf2b2c-merged.mount: Deactivated successfully. Oct 14 05:49:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34180 DF PROTO=TCP SPT=36362 DPT=9102 SEQ=2244789403 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B125D9B0000000001030307) Oct 14 05:49:24 localhost python3.9[258127]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True Oct 14 05:49:25 localhost systemd[1]: var-lib-containers-storage-overlay-f3f40f6483bf6d587286da9e86e40878c2aaaf723da5aa2364fff24f5ea28424-merged.mount: Deactivated successfully. Oct 14 05:49:25 localhost python3.9[258184]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-update-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-update-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:49:25 localhost systemd[1]: var-lib-containers-storage-overlay-9c7bc0417a3c6c9361659b5f2f41d814b152f8a47a3821564971debd2b788997-merged.mount: Deactivated successfully. 
Oct 14 05:49:25 localhost podman[248187]: @ - - [14/Oct/2025:09:43:21 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=true&sync=false HTTP/1.1" 200 132430 "" "Go-http-client/1.1" Oct 14 05:49:25 localhost podman_exporter[248479]: ts=2025-10-14T09:49:25.462Z caller=exporter.go:96 level=info msg="Listening on" address=:9882 Oct 14 05:49:25 localhost podman_exporter[248479]: ts=2025-10-14T09:49:25.463Z caller=tls_config.go:313 level=info msg="Listening on" address=[::]:9882 Oct 14 05:49:25 localhost podman_exporter[248479]: ts=2025-10-14T09:49:25.463Z caller=tls_config.go:316 level=info msg="TLS is disabled." http2=false address=[::]:9882 Oct 14 05:49:25 localhost podman[257584]: 2025-10-14 09:49:25.491759909 +0000 UTC m=+4.700906062 container health_status fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, config_id=iscsid, container_name=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d) Oct 14 05:49:25 localhost podman[257584]: 2025-10-14 09:49:25.527218872 +0000 UTC m=+4.736365015 container exec_died fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, io.buildah.version=1.41.3, config_id=iscsid, container_name=iscsid, managed_by=edpm_ansible, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.build-date=20251009, 
org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team) Oct 14 05:49:25 localhost systemd[1]: fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29.service: Deactivated successfully. Oct 14 05:49:25 localhost nova_compute[238069]: 2025-10-14 09:49:25.853 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:49:25 localhost python3.9[258338]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True Oct 14 05:49:26 localhost python3.9[258419]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:49:27 localhost python3.9[258555]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True Oct 14 05:49:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec. 
Oct 14 05:49:27 localhost podman[258612]: 2025-10-14 09:49:27.561947279 +0000 UTC m=+0.091612303 container health_status aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm) Oct 14 05:49:27 localhost podman[258612]: 2025-10-14 09:49:27.59634754 +0000 UTC m=+0.126012624 container exec_died aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 
'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors ) Oct 14 05:49:27 localhost systemd[1]: aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec.service: Deactivated successfully. 
Oct 14 05:49:27 localhost python3.9[258613]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:49:27 localhost nova_compute[238069]: 2025-10-14 09:49:27.906 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:49:28 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2694 DF PROTO=TCP SPT=50096 DPT=9882 SEQ=1701187361 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B126C3B0000000001030307) Oct 14 05:49:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad. 
Oct 14 05:49:28 localhost podman[258652]: 2025-10-14 09:49:28.744057828 +0000 UTC m=+0.080099575 container health_status 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=multipathd) Oct 14 05:49:28 localhost podman[258652]: 2025-10-14 09:49:28.760501585 +0000 UTC m=+0.096543312 container exec_died 
02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2) Oct 14 05:49:28 localhost systemd[1]: 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad.service: Deactivated successfully. 
Oct 14 05:49:29 localhost python3.9[258762]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True Oct 14 05:49:29 localhost python3.9[258852]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1760435368.6900873-3544-160377471913081/.source.nft follow=False _original_basename=ruleset.j2 checksum=953266ca5f7d82d2777a0a437bd7feceb9259ee8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:49:30 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34182 DF PROTO=TCP SPT=36362 DPT=9102 SEQ=2244789403 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B12755A0000000001030307) Oct 14 05:49:30 localhost nova_compute[238069]: 2025-10-14 09:49:30.861 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:49:31 localhost python3.9[258962]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:49:32 localhost python3.9[259072]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft 
/etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Oct 14 05:49:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f. Oct 14 05:49:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917. Oct 14 05:49:32 localhost podman[259132]: 2025-10-14 09:49:32.754153665 +0000 UTC m=+0.090038015 container health_status 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, config_id=edpm, maintainer=Red Hat, Inc., io.openshift.expose-services=, architecture=x86_64, distribution-scope=public, vcs-type=git, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, vendor=Red Hat, Inc., container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41) Oct 14 05:49:32 localhost podman[259131]: 2025-10-14 09:49:32.806236652 +0000 UTC m=+0.144077651 container health_status 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=ovn_controller, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, 
tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image) Oct 14 05:49:32 localhost podman[259132]: 2025-10-14 09:49:32.817284966 +0000 UTC m=+0.153169316 container exec_died 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': 
['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, vcs-type=git, version=9.6, container_name=openstack_network_exporter, io.openshift.expose-services=, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b) Oct 14 05:49:32 localhost systemd[1]: 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917.service: Deactivated successfully. 
Oct 14 05:49:32 localhost podman[259131]: 2025-10-14 09:49:32.908043562 +0000 UTC m=+0.245884561 container exec_died 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251009, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image) Oct 14 05:49:32 localhost nova_compute[238069]: 2025-10-14 09:49:32.908 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:49:32 localhost systemd[1]: 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f.service: Deactivated successfully. 
Oct 14 05:49:33 localhost python3.9[259227]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:49:33 localhost python3.9[259337]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Oct 14 05:49:34 localhost python3.9[259448]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Oct 14 05:49:35 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2697 DF PROTO=TCP SPT=50096 DPT=9882 SEQ=1701187361 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B12881A0000000001030307) Oct 14 05:49:35 localhost python3.9[259560]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Oct 14 05:49:35 localhost nova_compute[238069]: 2025-10-14 09:49:35.864 2 DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:49:36 localhost python3.9[259673]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:49:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d. Oct 14 05:49:36 localhost systemd[1]: tmp-crun.oxGflp.mount: Deactivated successfully. Oct 14 05:49:36 localhost podman[259691]: 2025-10-14 09:49:36.750626272 +0000 UTC m=+0.085361395 container health_status 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, config_id=edpm, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true) Oct 14 05:49:36 localhost podman[259691]: 2025-10-14 09:49:36.758124859 +0000 UTC m=+0.092859972 container exec_died 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=edpm, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2) Oct 14 05:49:36 localhost systemd[1]: 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d.service: Deactivated successfully. Oct 14 05:49:36 localhost systemd[1]: session-58.scope: Deactivated successfully. Oct 14 05:49:36 localhost systemd[1]: session-58.scope: Consumed 32.989s CPU time. Oct 14 05:49:36 localhost systemd-logind[760]: Session 58 logged out. Waiting for processes to exit. Oct 14 05:49:36 localhost systemd-logind[760]: Removed session 58. 
Oct 14 05:49:37 localhost nova_compute[238069]: 2025-10-14 09:49:37.910 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:49:38 localhost openstack_network_exporter[250374]: ERROR 09:49:38 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Oct 14 05:49:38 localhost openstack_network_exporter[250374]: ERROR 09:49:38 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Oct 14 05:49:38 localhost openstack_network_exporter[250374]: ERROR 09:49:38 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Oct 14 05:49:38 localhost openstack_network_exporter[250374]: Oct 14 05:49:38 localhost openstack_network_exporter[250374]: ERROR 09:49:38 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Oct 14 05:49:38 localhost openstack_network_exporter[250374]: ERROR 09:49:38 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Oct 14 05:49:38 localhost openstack_network_exporter[250374]: Oct 14 05:49:40 localhost nova_compute[238069]: 2025-10-14 09:49:40.868 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:49:41 localhost nova_compute[238069]: 2025-10-14 09:49:41.020 2 DEBUG oslo_service.periodic_task [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 05:49:42 localhost nova_compute[238069]: 2025-10-14 09:49:42.954 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:49:44 localhost nova_compute[238069]: 2025-10-14 09:49:44.024 2 DEBUG 
oslo_service.periodic_task [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 05:49:44 localhost sshd[259715]: main: sshd: ssh-rsa algorithm is disabled Oct 14 05:49:44 localhost systemd-logind[760]: New session 59 of user zuul. Oct 14 05:49:44 localhost systemd[1]: Started Session 59 of User zuul. Oct 14 05:49:45 localhost python3.9[259828]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/containers setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Oct 14 05:49:45 localhost python3.9[259938]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Oct 14 05:49:45 localhost nova_compute[238069]: 2025-10-14 09:49:45.871 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:49:46 localhost nova_compute[238069]: 2025-10-14 09:49:46.025 2 DEBUG oslo_service.periodic_task [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 05:49:46 localhost python3.9[260048]: 
ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/config-data/ansible-generated/neutron-sriov-agent setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Oct 14 05:49:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857. Oct 14 05:49:46 localhost podman[260066]: 2025-10-14 09:49:46.737082273 +0000 UTC m=+0.079199678 container health_status 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team) Oct 14 05:49:46 localhost podman[260066]: 2025-10-14 09:49:46.747133597 +0000 UTC m=+0.089251002 container exec_died 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, managed_by=edpm_ansible, org.label-schema.build-date=20251009) Oct 14 05:49:46 localhost systemd[1]: 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857.service: Deactivated successfully. Oct 14 05:49:47 localhost nova_compute[238069]: 2025-10-14 09:49:47.024 2 DEBUG oslo_service.periodic_task [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 05:49:47 localhost nova_compute[238069]: 2025-10-14 09:49:47.024 2 DEBUG oslo_service.periodic_task [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 05:49:47 localhost python3.9[260175]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/neutron_sriov_agent.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True Oct 14 05:49:47 localhost nova_compute[238069]: 2025-10-14 09:49:47.994 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:49:48 localhost nova_compute[238069]: 2025-10-14 09:49:48.024 2 DEBUG oslo_service.periodic_task [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks 
/usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 05:49:48 localhost nova_compute[238069]: 2025-10-14 09:49:48.025 2 DEBUG nova.compute.manager [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Oct 14 05:49:48 localhost nova_compute[238069]: 2025-10-14 09:49:48.025 2 DEBUG nova.compute.manager [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Oct 14 05:49:48 localhost python3.9[260261]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/neutron_sriov_agent.yaml mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1760435386.698587-106-19245034402264/.source.yaml follow=False _original_basename=neutron_sriov_agent.yaml.j2 checksum=d3942d8476d006ea81540d2a1d96dd9d67f33f5f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None Oct 14 05:49:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041. 
Oct 14 05:49:48 localhost nova_compute[238069]: 2025-10-14 09:49:48.669 2 DEBUG oslo_concurrency.lockutils [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Acquiring lock "refresh_cache-88c4e366-b765-47a6-96bf-f7677f2ce67c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Oct 14 05:49:48 localhost nova_compute[238069]: 2025-10-14 09:49:48.670 2 DEBUG oslo_concurrency.lockutils [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Acquired lock "refresh_cache-88c4e366-b765-47a6-96bf-f7677f2ce67c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Oct 14 05:49:48 localhost nova_compute[238069]: 2025-10-14 09:49:48.670 2 DEBUG nova.network.neutron [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] [instance: 88c4e366-b765-47a6-96bf-f7677f2ce67c] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Oct 14 05:49:48 localhost nova_compute[238069]: 2025-10-14 09:49:48.671 2 DEBUG nova.objects.instance [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 88c4e366-b765-47a6-96bf-f7677f2ce67c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Oct 14 05:49:48 localhost systemd[1]: tmp-crun.3AcXBu.mount: Deactivated successfully. 
Oct 14 05:49:48 localhost podman[260370]: 2025-10-14 09:49:48.744655227 +0000 UTC m=+0.085119246 container health_status 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=unhealthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Oct 14 05:49:48 localhost podman[260370]: 2025-10-14 09:49:48.753939539 +0000 UTC m=+0.094403598 container exec_died 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', 
'/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Oct 14 05:49:48 localhost systemd[1]: 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041.service: Deactivated successfully. Oct 14 05:49:48 localhost python3.9[260369]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-sriov-agent/01-neutron.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True Oct 14 05:49:49 localhost python3.9[260478]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-sriov-agent/01-neutron.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1760435388.3377948-151-7725950792159/.source.conf follow=False _original_basename=neutron.conf.j2 checksum=24e013b64eb8be4a13596c6ffccbd94df7442bd2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None Oct 14 05:49:49 localhost nova_compute[238069]: 2025-10-14 09:49:49.682 2 DEBUG nova.network.neutron [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] [instance: 88c4e366-b765-47a6-96bf-f7677f2ce67c] Updating instance_info_cache with network_info: [{"id": "3ec9b060-f43d-4698-9c76-6062c70911d5", "address": "fa:16:3e:84:5e:e5", "network": {"id": "7d0cd696-bdd7-4e70-9512-eb0d23640314", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.46", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "41187b090f3d4818a32baa37ce8a3991", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3ec9b060-f4", "ovs_interfaceid": "3ec9b060-f43d-4698-9c76-6062c70911d5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Oct 14 05:49:49 localhost nova_compute[238069]: 2025-10-14 09:49:49.707 2 DEBUG oslo_concurrency.lockutils [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Releasing lock "refresh_cache-88c4e366-b765-47a6-96bf-f7677f2ce67c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Oct 14 05:49:49 localhost nova_compute[238069]: 2025-10-14 09:49:49.707 2 DEBUG nova.compute.manager [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] [instance: 88c4e366-b765-47a6-96bf-f7677f2ce67c] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Oct 14 05:49:49 localhost nova_compute[238069]: 2025-10-14 09:49:49.708 2 DEBUG oslo_service.periodic_task [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 05:49:49 localhost nova_compute[238069]: 2025-10-14 09:49:49.708 2 DEBUG oslo_service.periodic_task [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 05:49:49 localhost nova_compute[238069]: 2025-10-14 09:49:49.725 2 DEBUG 
oslo_concurrency.lockutils [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Oct 14 05:49:49 localhost nova_compute[238069]: 2025-10-14 09:49:49.726 2 DEBUG oslo_concurrency.lockutils [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Oct 14 05:49:49 localhost nova_compute[238069]: 2025-10-14 09:49:49.726 2 DEBUG oslo_concurrency.lockutils [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Oct 14 05:49:49 localhost nova_compute[238069]: 2025-10-14 09:49:49.727 2 DEBUG nova.compute.resource_tracker [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Auditing locally available compute resources for np0005486733.localdomain (node: np0005486733.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Oct 14 05:49:49 localhost nova_compute[238069]: 2025-10-14 09:49:49.728 2 DEBUG oslo_concurrency.processutils [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Oct 14 05:49:49 localhost python3.9[260587]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-sriov-agent/01-rootwrap.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 
get_mime=True get_attributes=True Oct 14 05:49:50 localhost nova_compute[238069]: 2025-10-14 09:49:50.128 2 DEBUG oslo_concurrency.processutils [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.401s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Oct 14 05:49:50 localhost nova_compute[238069]: 2025-10-14 09:49:50.193 2 DEBUG nova.virt.libvirt.driver [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Oct 14 05:49:50 localhost nova_compute[238069]: 2025-10-14 09:49:50.193 2 DEBUG nova.virt.libvirt.driver [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Oct 14 05:49:50 localhost nova_compute[238069]: 2025-10-14 09:49:50.391 2 WARNING nova.virt.libvirt.driver [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Oct 14 05:49:50 localhost nova_compute[238069]: 2025-10-14 09:49:50.392 2 DEBUG nova.compute.resource_tracker [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Hypervisor/Node resource view: name=np0005486733.localdomain free_ram=12283MB free_disk=41.83725357055664GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": 
"1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Oct 14 05:49:50 localhost nova_compute[238069]: 2025-10-14 09:49:50.392 2 DEBUG oslo_concurrency.lockutils [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Oct 14 05:49:50 localhost nova_compute[238069]: 2025-10-14 09:49:50.392 2 DEBUG oslo_concurrency.lockutils [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Oct 14 05:49:50 localhost nova_compute[238069]: 2025-10-14 09:49:50.464 2 DEBUG nova.compute.resource_tracker [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Instance 88c4e366-b765-47a6-96bf-f7677f2ce67c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Oct 14 05:49:50 localhost nova_compute[238069]: 2025-10-14 09:49:50.464 2 DEBUG nova.compute.resource_tracker [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Oct 14 05:49:50 localhost nova_compute[238069]: 2025-10-14 09:49:50.465 2 DEBUG nova.compute.resource_tracker [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Final resource view: name=np0005486733.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Oct 14 05:49:50 localhost nova_compute[238069]: 2025-10-14 09:49:50.500 2 DEBUG oslo_concurrency.processutils [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Oct 14 05:49:50 localhost python3.9[260694]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-sriov-agent/01-rootwrap.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1760435389.5506783-151-31877558058541/.source.conf follow=False _original_basename=rootwrap.conf.j2 checksum=11f2cfb4b7d97b2cef3c2c2d88089e6999cffe22 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None Oct 14 05:49:50 localhost nova_compute[238069]: 2025-10-14 09:49:50.873 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:49:50 localhost nova_compute[238069]: 2025-10-14 09:49:50.980 2 DEBUG oslo_concurrency.processutils [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.479s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Oct 14 05:49:50 localhost nova_compute[238069]: 2025-10-14 09:49:50.986 2 DEBUG nova.compute.provider_tree [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Inventory has not changed in ProviderTree for provider: 18c24273-aca2-4f08-be57-3188d558235e update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Oct 14 05:49:51 localhost nova_compute[238069]: 2025-10-14 09:49:51.004 2 DEBUG nova.scheduler.client.report [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Inventory has not changed for provider 18c24273-aca2-4f08-be57-3188d558235e based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Oct 14 05:49:51 localhost nova_compute[238069]: 2025-10-14 09:49:51.007 2 DEBUG nova.compute.resource_tracker [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Compute_service record updated for np0005486733.localdomain:np0005486733.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Oct 14 05:49:51 localhost nova_compute[238069]: 2025-10-14 09:49:51.008 2 DEBUG oslo_concurrency.lockutils [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - 
- - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.615s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Oct 14 05:49:51 localhost python3.9[260822]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-sriov-agent/01-neutron-sriov-agent.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True Oct 14 05:49:51 localhost python3.9[260910]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-sriov-agent/01-neutron-sriov-agent.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1760435390.6824806-151-54645619138085/.source.conf follow=False _original_basename=neutron-sriov-agent.conf.j2 checksum=b3dce1947cf595f51351b3ff30764668abddbff4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None Oct 14 05:49:52 localhost nova_compute[238069]: 2025-10-14 09:49:52.004 2 DEBUG oslo_service.periodic_task [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 05:49:52 localhost nova_compute[238069]: 2025-10-14 09:49:52.005 2 DEBUG oslo_service.periodic_task [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 05:49:52 localhost nova_compute[238069]: 2025-10-14 09:49:52.005 2 DEBUG nova.compute.manager [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... 
_reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Oct 14 05:49:52 localhost python3.9[261018]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-sriov-agent/10-neutron-sriov.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True Oct 14 05:49:53 localhost nova_compute[238069]: 2025-10-14 09:49:53.037 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:49:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10544 DF PROTO=TCP SPT=59042 DPT=9102 SEQ=400685229 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B12CEA40000000001030307) Oct 14 05:49:53 localhost python3.9[261104]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-sriov-agent/10-neutron-sriov.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1760435392.491539-325-149765226210304/.source.conf _original_basename=10-neutron-sriov.conf follow=False checksum=401f2db3441c75ad5886350294091560f714495b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None Oct 14 05:49:54 localhost python3.9[261212]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Oct 14 05:49:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10545 DF PROTO=TCP SPT=59042 
DPT=9102 SEQ=400685229 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B12D29A0000000001030307) Oct 14 05:49:55 localhost python3.9[261324]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None Oct 14 05:49:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29. Oct 14 05:49:55 localhost systemd[1]: tmp-crun.9GNYt2.mount: Deactivated successfully. Oct 14 05:49:55 localhost podman[261435]: 2025-10-14 09:49:55.759888076 +0000 UTC m=+0.123552140 container health_status fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, org.label-schema.name=CentOS Stream 9 Base Image, config_id=iscsid, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true) Oct 14 05:49:55 localhost podman[261435]: 2025-10-14 09:49:55.769922919 +0000 UTC m=+0.133586943 container exec_died fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, org.label-schema.vendor=CentOS, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.license=GPLv2, 
org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, config_id=iscsid, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, container_name=iscsid, io.buildah.version=1.41.3) Oct 14 05:49:55 localhost systemd[1]: fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29.service: Deactivated successfully. Oct 14 05:49:55 localhost python3.9[261434]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True Oct 14 05:49:55 localhost nova_compute[238069]: 2025-10-14 09:49:55.876 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:49:56 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10546 DF PROTO=TCP SPT=59042 DPT=9102 SEQ=400685229 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B12DA9A0000000001030307) Oct 14 05:49:57 localhost python3.9[261508]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Oct 14 05:49:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec. 
Oct 14 05:49:57 localhost podman[261588]: 2025-10-14 09:49:57.727814971 +0000 UTC m=+0.064538414 container health_status aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Oct 14 05:49:57 localhost podman[261588]: 2025-10-14 09:49:57.74004143 +0000 UTC m=+0.076764873 container exec_died aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 
'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Oct 14 05:49:57 localhost systemd[1]: aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec.service: Deactivated successfully. 
Oct 14 05:49:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:49:57.757 163055 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Oct 14 05:49:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:49:57.757 163055 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Oct 14 05:49:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:49:57.759 163055 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Oct 14 05:49:57 localhost python3.9[261641]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True Oct 14 05:49:58 localhost nova_compute[238069]: 2025-10-14 09:49:58.084 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:49:58 localhost podman[248187]: time="2025-10-14T09:49:58Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Oct 14 05:49:58 localhost podman[248187]: @ - - [14/Oct/2025:09:49:58 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 134052 "" "Go-http-client/1.1" Oct 14 05:49:58 localhost podman[248187]: @ - - [14/Oct/2025:09:49:58 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 16465 "" 
"Go-http-client/1.1" Oct 14 05:49:58 localhost python3.9[261699]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Oct 14 05:49:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad. Oct 14 05:49:59 localhost systemd[1]: tmp-crun.FKYfb0.mount: Deactivated successfully. Oct 14 05:49:59 localhost podman[261756]: 2025-10-14 09:49:59.757151584 +0000 UTC m=+0.098837071 container health_status 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251009, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}) Oct 14 05:49:59 localhost podman[261756]: 2025-10-14 09:49:59.767105146 +0000 UTC m=+0.108790663 container exec_died 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, managed_by=edpm_ansible, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_managed=true, io.buildah.version=1.41.3) Oct 14 05:49:59 localhost systemd[1]: 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad.service: Deactivated successfully. Oct 14 05:50:00 localhost python3.9[261827]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:50:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10547 DF PROTO=TCP SPT=59042 DPT=9102 SEQ=400685229 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B12EA5B0000000001030307) Oct 14 05:50:00 localhost python3.9[261937]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True Oct 14 05:50:00 localhost nova_compute[238069]: 2025-10-14 09:50:00.879 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:50:01 localhost python3.9[261994]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root 
dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:50:01 localhost python3.9[262104]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True Oct 14 05:50:02 localhost python3.9[262161]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:50:03 localhost nova_compute[238069]: 2025-10-14 09:50:03.138 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:50:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f. Oct 14 05:50:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917. 
Oct 14 05:50:03 localhost podman[262272]: 2025-10-14 09:50:03.451777569 +0000 UTC m=+0.084397836 container health_status 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251009, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0) Oct 14 05:50:03 localhost systemd[1]: tmp-crun.VsR4ZL.mount: Deactivated successfully. 
Oct 14 05:50:03 localhost podman[262272]: 2025-10-14 09:50:03.515914466 +0000 UTC m=+0.148534773 container exec_died 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, managed_by=edpm_ansible, config_id=ovn_controller, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true) Oct 14 05:50:03 localhost podman[262273]: 2025-10-14 09:50:03.517885207 +0000 UTC m=+0.143947786 container health_status 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as 
a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, vcs-type=git, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, release=1755695350, build-date=2025-08-20T13:12:41, 
io.openshift.expose-services=, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, config_id=edpm) Oct 14 05:50:03 localhost podman[262273]: 2025-10-14 09:50:03.532634134 +0000 UTC m=+0.158696753 container exec_died 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, release=1755695350, container_name=openstack_network_exporter, distribution-scope=public, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, version=9.6, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, vcs-type=git, io.buildah.version=1.33.7, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=) Oct 14 05:50:03 localhost systemd[1]: 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917.service: Deactivated successfully. Oct 14 05:50:03 localhost systemd[1]: 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f.service: Deactivated successfully. Oct 14 05:50:03 localhost python3.9[262271]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None Oct 14 05:50:03 localhost systemd[1]: Reloading. Oct 14 05:50:03 localhost systemd-rc-local-generator[262338]: /etc/rc.d/rc.local is not marked executable, skipping. Oct 14 05:50:03 localhost systemd-sysv-generator[262341]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Oct 14 05:50:03 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Oct 14 05:50:04 localhost python3.9[262462]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True Oct 14 05:50:05 localhost python3.9[262519]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:50:05 localhost nova_compute[238069]: 2025-10-14 09:50:05.880 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:50:06 localhost python3.9[262629]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True Oct 14 05:50:06 localhost python3.9[262686]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:50:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d. 
Oct 14 05:50:07 localhost systemd[1]: tmp-crun.mw8G05.mount: Deactivated successfully. Oct 14 05:50:07 localhost podman[262797]: 2025-10-14 09:50:07.26584681 +0000 UTC m=+0.079316088 container health_status 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_managed=true, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2) Oct 14 05:50:07 localhost podman[262797]: 2025-10-14 
09:50:07.305017453 +0000 UTC m=+0.118486681 container exec_died 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0) Oct 14 05:50:07 localhost systemd[1]: 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d.service: Deactivated successfully. 
Oct 14 05:50:07 localhost python3.9[262796]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None Oct 14 05:50:07 localhost systemd[1]: Reloading. Oct 14 05:50:07 localhost systemd-sysv-generator[262844]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Oct 14 05:50:07 localhost systemd-rc-local-generator[262839]: /etc/rc.d/rc.local is not marked executable, skipping. Oct 14 05:50:07 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Oct 14 05:50:07 localhost systemd[1]: Starting Create netns directory... Oct 14 05:50:07 localhost systemd[1]: run-netns-placeholder.mount: Deactivated successfully. Oct 14 05:50:07 localhost systemd[1]: netns-placeholder.service: Deactivated successfully. Oct 14 05:50:07 localhost systemd[1]: Finished Create netns directory. 
Oct 14 05:50:08 localhost nova_compute[238069]: 2025-10-14 09:50:08.176 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:50:08 localhost openstack_network_exporter[250374]: ERROR 09:50:08 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Oct 14 05:50:08 localhost openstack_network_exporter[250374]: ERROR 09:50:08 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Oct 14 05:50:08 localhost openstack_network_exporter[250374]: ERROR 09:50:08 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Oct 14 05:50:08 localhost openstack_network_exporter[250374]: ERROR 09:50:08 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Oct 14 05:50:08 localhost openstack_network_exporter[250374]: Oct 14 05:50:08 localhost openstack_network_exporter[250374]: ERROR 09:50:08 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Oct 14 05:50:08 localhost openstack_network_exporter[250374]: Oct 14 05:50:09 localhost python3.9[262969]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None Oct 14 05:50:09 localhost python3.9[263079]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/neutron_sriov_agent.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True Oct 14 05:50:10 localhost nova_compute[238069]: 2025-10-14 09:50:10.881 2 DEBUG ovsdbapp.backend.ovs_idl.vlog 
[-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:50:11 localhost python3.9[263167]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/neutron_sriov_agent.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1760435409.4833362-736-78549915860605/.source.json _original_basename=.meio9c8n follow=False checksum=a32073fdba4733b9ffe872cfb91708eff83a585a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:50:11 localhost python3.9[263277]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/neutron_sriov_agent state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:50:13 localhost nova_compute[238069]: 2025-10-14 09:50:13.179 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:50:14 localhost python3.9[263585]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/neutron_sriov_agent config_pattern=*.json debug=False Oct 14 05:50:15 localhost nova_compute[238069]: 2025-10-14 09:50:15.884 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:50:15 localhost python3.9[263695]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data Oct 14 05:50:15 localhost sshd[263696]: main: sshd: ssh-rsa algorithm 
is disabled Oct 14 05:50:16 localhost python3.9[263807]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None Oct 14 05:50:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857. Oct 14 05:50:17 localhost systemd[1]: tmp-crun.rtfScx.mount: Deactivated successfully. Oct 14 05:50:17 localhost podman[263852]: 2025-10-14 09:50:17.755925189 +0000 UTC m=+0.095122207 container health_status 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, tcib_managed=true) Oct 14 05:50:17 localhost podman[263852]: 2025-10-14 09:50:17.790170951 +0000 UTC m=+0.129367949 container exec_died 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, 
config_id=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5) Oct 14 05:50:17 localhost systemd[1]: 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857.service: Deactivated successfully. Oct 14 05:50:18 localhost nova_compute[238069]: 2025-10-14 09:50:18.227 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:50:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041. Oct 14 05:50:19 localhost podman[263870]: 2025-10-14 09:50:19.730663737 +0000 UTC m=+0.071821795 container health_status 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Oct 14 05:50:19 localhost podman[263870]: 2025-10-14 09:50:19.738730117 +0000 UTC m=+0.079888155 container exec_died 
0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Oct 14 05:50:19 localhost systemd[1]: 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041.service: Deactivated successfully. 
Oct 14 05:50:20 localhost nova_compute[238069]: 2025-10-14 09:50:20.887 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:50:21 localhost python3[263986]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/neutron_sriov_agent config_id=neutron_sriov_agent config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False Oct 14 05:50:21 localhost podman[264023]: Oct 14 05:50:21 localhost podman[264023]: 2025-10-14 09:50:21.406573218 +0000 UTC m=+0.080668029 container create 8f9cdcd5edae45e7d072e5c639159683e77a8fbdfa0c68690957d6d5078c4c25 (image=quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified, name=neutron_sriov_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, container_name=neutron_sriov_agent, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, config_id=neutron_sriov_agent, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df53db26f88be0dba7e9a5fdb411951bc56c96f08d3e6ab49de68c3b55962253'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'neutron', 'volumes': ['/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/config-data/ansible-generated/neutron-sriov-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_sriov_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, managed_by=edpm_ansible) 
Oct 14 05:50:21 localhost podman[264023]: 2025-10-14 09:50:21.362125372 +0000 UTC m=+0.036220233 image pull quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified Oct 14 05:50:21 localhost python3[263986]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name neutron_sriov_agent --conmon-pidfile /run/neutron_sriov_agent.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=df53db26f88be0dba7e9a5fdb411951bc56c96f08d3e6ab49de68c3b55962253 --label config_id=neutron_sriov_agent --label container_name=neutron_sriov_agent --label managed_by=edpm_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df53db26f88be0dba7e9a5fdb411951bc56c96f08d3e6ab49de68c3b55962253'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'neutron', 'volumes': ['/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/config-data/ansible-generated/neutron-sriov-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_sriov_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']} --log-driver journald --log-level info --network host --privileged=True --user neutron --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /var/lib/config-data/ansible-generated/neutron-sriov-agent:/etc/neutron.conf.d:z --volume /var/lib/neutron:/var/lib/neutron:shared,z --volume /var/lib/kolla/config_files/neutron_sriov_agent.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified Oct 14 05:50:22 localhost python3.9[264170]: 
ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Oct 14 05:50:23 localhost nova_compute[238069]: 2025-10-14 09:50:23.255 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:50:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12538 DF PROTO=TCP SPT=35948 DPT=9102 SEQ=2215421264 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B1343D50000000001030307) Oct 14 05:50:23 localhost python3.9[264282]: ansible-file Invoked with path=/etc/systemd/system/edpm_neutron_sriov_agent.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:50:24 localhost python3.9[264337]: ansible-stat Invoked with path=/etc/systemd/system/edpm_neutron_sriov_agent_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Oct 14 05:50:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12539 DF PROTO=TCP SPT=35948 DPT=9102 SEQ=2215421264 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B1347DA0000000001030307) Oct 14 05:50:24 localhost python3.9[264446]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1760435424.3463147-1000-152149787323486/source dest=/etc/systemd/system/edpm_neutron_sriov_agent.service mode=0644 owner=root group=root backup=False 
force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:50:25 localhost nova_compute[238069]: 2025-10-14 09:50:25.889 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:50:26 localhost python3.9[264501]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None Oct 14 05:50:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29. Oct 14 05:50:26 localhost systemd[1]: Reloading. Oct 14 05:50:26 localhost podman[264503]: 2025-10-14 09:50:26.104976131 +0000 UTC m=+0.058756951 container health_status fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.build-date=20251009, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=iscsid, tcib_managed=true, config_id=iscsid, managed_by=edpm_ansible) Oct 14 05:50:26 localhost podman[264503]: 2025-10-14 09:50:26.118038695 +0000 UTC m=+0.071819525 container exec_died fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', 
'/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, container_name=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible) Oct 14 05:50:26 localhost systemd-rc-local-generator[264542]: /etc/rc.d/rc.local is not marked executable, skipping. Oct 14 05:50:26 localhost systemd-sysv-generator[264546]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Oct 14 05:50:26 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Oct 14 05:50:26 localhost systemd[1]: fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29.service: Deactivated successfully. 
Oct 14 05:50:26 localhost kernel: DROPPING: IN=eth0 OUT= MACSRC=c6:e7:bc:23:0b:06 MACDST=fa:16:3e:99:78:0b MACPROTO=0800 SRC=206.168.34.212 DST=38.102.83.143 LEN=60 TOS=0x00 PREC=0x00 TTL=52 ID=279 DF PROTO=TCP SPT=60444 DPT=19885 SEQ=2396804070 ACK=0 WINDOW=21900 RES=0x00 SYN URGP=0 OPT (020405B40402080A41D7321A000000000103030A) Oct 14 05:50:26 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12540 DF PROTO=TCP SPT=35948 DPT=9102 SEQ=2215421264 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B134FDA0000000001030307) Oct 14 05:50:27 localhost python3.9[264610]: ansible-systemd Invoked with state=restarted name=edpm_neutron_sriov_agent.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Oct 14 05:50:27 localhost systemd[1]: Reloading. Oct 14 05:50:27 localhost systemd-sysv-generator[264679]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Oct 14 05:50:27 localhost systemd-rc-local-generator[264673]: /etc/rc.d/rc.local is not marked executable, skipping. Oct 14 05:50:27 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Oct 14 05:50:27 localhost kernel: DROPPING: IN=eth0 OUT= MACSRC=c6:e7:bc:23:0b:06 MACDST=fa:16:3e:99:78:0b MACPROTO=0800 SRC=206.168.34.212 DST=38.102.83.143 LEN=60 TOS=0x00 PREC=0x00 TTL=52 ID=280 DF PROTO=TCP SPT=60444 DPT=19885 SEQ=2396804070 ACK=0 WINDOW=21900 RES=0x00 SYN URGP=0 OPT (020405B40402080A41D73614000000000103030A) Oct 14 05:50:27 localhost kernel: DROPPING: IN=eth0 OUT= MACSRC=c6:e7:bc:23:0b:06 MACDST=fa:16:3e:99:78:0b MACPROTO=0800 SRC=206.168.34.212 DST=38.102.83.143 LEN=60 TOS=0x00 PREC=0x00 TTL=52 ID=40439 DF PROTO=TCP SPT=60454 DPT=19885 SEQ=1853600707 ACK=0 WINDOW=21900 RES=0x00 SYN URGP=0 OPT (020405B40402080A41D7363C000000000103030A) Oct 14 05:50:27 localhost systemd[1]: Starting neutron_sriov_agent container... Oct 14 05:50:27 localhost systemd[1]: Started libcrun container. Oct 14 05:50:27 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/931a73aa55636b75941b7771d88c4c8c0fb3b55750697da85c1c97a731f58c4b/merged/etc/neutron.conf.d supports timestamps until 2038 (0x7fffffff) Oct 14 05:50:27 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/931a73aa55636b75941b7771d88c4c8c0fb3b55750697da85c1c97a731f58c4b/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Oct 14 05:50:27 localhost podman[264705]: 2025-10-14 09:50:27.698085597 +0000 UTC m=+0.134169837 container init 8f9cdcd5edae45e7d072e5c639159683e77a8fbdfa0c68690957d6d5078c4c25 (image=quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified, name=neutron_sriov_agent, org.label-schema.build-date=20251009, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df53db26f88be0dba7e9a5fdb411951bc56c96f08d3e6ab49de68c3b55962253'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'neutron', 'volumes': ['/lib/modules:/lib/modules:ro', '/dev:/dev', 
'/var/lib/config-data/ansible-generated/neutron-sriov-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_sriov_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_id=neutron_sriov_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=neutron_sriov_agent, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5) Oct 14 05:50:27 localhost podman[264705]: 2025-10-14 09:50:27.708945624 +0000 UTC m=+0.145029864 container start 8f9cdcd5edae45e7d072e5c639159683e77a8fbdfa0c68690957d6d5078c4c25 (image=quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified, name=neutron_sriov_agent, org.label-schema.name=CentOS Stream 9 Base Image, config_id=neutron_sriov_agent, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df53db26f88be0dba7e9a5fdb411951bc56c96f08d3e6ab49de68c3b55962253'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'neutron', 'volumes': ['/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/config-data/ansible-generated/neutron-sriov-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_sriov_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, container_name=neutron_sriov_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, 
managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5) Oct 14 05:50:27 localhost podman[264705]: neutron_sriov_agent Oct 14 05:50:27 localhost neutron_sriov_agent[264729]: + sudo -E kolla_set_configs Oct 14 05:50:27 localhost systemd[1]: Started neutron_sriov_agent container. Oct 14 05:50:27 localhost neutron_sriov_agent[264729]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json Oct 14 05:50:27 localhost neutron_sriov_agent[264729]: INFO:__main__:Validating config file Oct 14 05:50:27 localhost neutron_sriov_agent[264729]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS Oct 14 05:50:27 localhost neutron_sriov_agent[264729]: INFO:__main__:Copying service configuration files Oct 14 05:50:27 localhost neutron_sriov_agent[264729]: INFO:__main__:Deleting /etc/neutron/rootwrap.conf Oct 14 05:50:27 localhost neutron_sriov_agent[264729]: INFO:__main__:Copying /etc/neutron.conf.d/01-rootwrap.conf to /etc/neutron/rootwrap.conf Oct 14 05:50:27 localhost neutron_sriov_agent[264729]: INFO:__main__:Setting permission for /etc/neutron/rootwrap.conf Oct 14 05:50:27 localhost neutron_sriov_agent[264729]: INFO:__main__:Writing out command to execute Oct 14 05:50:27 localhost neutron_sriov_agent[264729]: INFO:__main__:Setting permission for /var/lib/neutron Oct 14 05:50:27 localhost neutron_sriov_agent[264729]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts Oct 14 05:50:27 localhost neutron_sriov_agent[264729]: INFO:__main__:Setting permission for /var/lib/neutron/.cache Oct 14 05:50:27 localhost neutron_sriov_agent[264729]: INFO:__main__:Setting permission for /var/lib/neutron/external Oct 14 05:50:27 localhost neutron_sriov_agent[264729]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy Oct 14 05:50:27 localhost neutron_sriov_agent[264729]: INFO:__main__:Setting permission for 
/var/lib/neutron/ovn_metadata_haproxy_wrapper Oct 14 05:50:27 localhost neutron_sriov_agent[264729]: INFO:__main__:Setting permission for /var/lib/neutron/metadata_proxy Oct 14 05:50:27 localhost neutron_sriov_agent[264729]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/haproxy-kill Oct 14 05:50:27 localhost neutron_sriov_agent[264729]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints Oct 14 05:50:27 localhost neutron_sriov_agent[264729]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints/333254bb87316156e96cebc0941f89c4b6bf7d0c72b62f2bd2e3f232ec27cb23 Oct 14 05:50:27 localhost neutron_sriov_agent[264729]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids Oct 14 05:50:27 localhost neutron_sriov_agent[264729]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids/7d0cd696-bdd7-4e70-9512-eb0d23640314.pid.haproxy Oct 14 05:50:27 localhost neutron_sriov_agent[264729]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy/7d0cd696-bdd7-4e70-9512-eb0d23640314.conf Oct 14 05:50:27 localhost neutron_sriov_agent[264729]: ++ cat /run_command Oct 14 05:50:27 localhost neutron_sriov_agent[264729]: + CMD=/usr/bin/neutron-sriov-nic-agent Oct 14 05:50:27 localhost neutron_sriov_agent[264729]: + ARGS= Oct 14 05:50:27 localhost neutron_sriov_agent[264729]: + sudo kolla_copy_cacerts Oct 14 05:50:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec. Oct 14 05:50:27 localhost neutron_sriov_agent[264729]: + [[ ! -n '' ]] Oct 14 05:50:27 localhost neutron_sriov_agent[264729]: + . 
kolla_extend_start Oct 14 05:50:27 localhost neutron_sriov_agent[264729]: Running command: '/usr/bin/neutron-sriov-nic-agent' Oct 14 05:50:27 localhost neutron_sriov_agent[264729]: + echo 'Running command: '\''/usr/bin/neutron-sriov-nic-agent'\''' Oct 14 05:50:27 localhost neutron_sriov_agent[264729]: + umask 0022 Oct 14 05:50:27 localhost neutron_sriov_agent[264729]: + exec /usr/bin/neutron-sriov-nic-agent Oct 14 05:50:27 localhost podman[264758]: 2025-10-14 09:50:27.879463255 +0000 UTC m=+0.067593874 container health_status aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter) Oct 14 05:50:27 localhost podman[264758]: 2025-10-14 
09:50:27.913789728 +0000 UTC m=+0.101920317 container exec_died aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors ) Oct 14 05:50:27 localhost systemd[1]: aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec.service: Deactivated successfully. 
Oct 14 05:50:28 localhost nova_compute[238069]: 2025-10-14 09:50:28.297 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:50:28 localhost podman[248187]: time="2025-10-14T09:50:28Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Oct 14 05:50:28 localhost podman[248187]: @ - - [14/Oct/2025:09:50:28 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 136012 "" "Go-http-client/1.1" Oct 14 05:50:28 localhost podman[248187]: @ - - [14/Oct/2025:09:50:28 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 16891 "" "Go-http-client/1.1" Oct 14 05:50:28 localhost kernel: DROPPING: IN=eth0 OUT= MACSRC=c6:e7:bc:23:0b:06 MACDST=fa:16:3e:99:78:0b MACPROTO=0800 SRC=206.168.34.212 DST=38.102.83.143 LEN=60 TOS=0x00 PREC=0x00 TTL=52 ID=40440 DF PROTO=TCP SPT=60454 DPT=19885 SEQ=1853600707 ACK=0 WINDOW=21900 RES=0x00 SYN URGP=0 OPT (020405B40402080A41D73A54000000000103030A) Oct 14 05:50:28 localhost kernel: DROPPING: IN=eth0 OUT= MACSRC=c6:e7:bc:23:0b:06 MACDST=fa:16:3e:99:78:0b MACPROTO=0800 SRC=206.168.34.212 DST=38.102.83.143 LEN=60 TOS=0x00 PREC=0x00 TTL=52 ID=61019 DF PROTO=TCP SPT=60468 DPT=19885 SEQ=2749478681 ACK=0 WINDOW=21900 RES=0x00 SYN URGP=0 OPT (020405B40402080A41D73AFD000000000103030A) Oct 14 05:50:29 localhost neutron_sriov_agent[264729]: 2025-10-14 09:50:29.507 2 INFO neutron.common.config [-] Logging enabled!#033[00m Oct 14 05:50:29 localhost neutron_sriov_agent[264729]: 2025-10-14 09:50:29.507 2 INFO neutron.common.config [-] /usr/bin/neutron-sriov-nic-agent version 22.2.2.dev43#033[00m Oct 14 05:50:29 localhost neutron_sriov_agent[264729]: 2025-10-14 09:50:29.508 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [-] Physical Devices mappings: {'dummy_sriov_net': ['dummy-dev']}#033[00m Oct 14 05:50:29 
localhost neutron_sriov_agent[264729]: 2025-10-14 09:50:29.508 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [-] Exclude Devices: {}#033[00m Oct 14 05:50:29 localhost neutron_sriov_agent[264729]: 2025-10-14 09:50:29.508 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [-] Resource provider bandwidths: {}#033[00m Oct 14 05:50:29 localhost neutron_sriov_agent[264729]: 2025-10-14 09:50:29.508 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [-] Resource provider inventory defaults: {'allocation_ratio': 1.0, 'min_unit': 1, 'step_size': 1, 'reserved': 0}#033[00m Oct 14 05:50:29 localhost neutron_sriov_agent[264729]: 2025-10-14 09:50:29.508 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [-] Resource provider hypervisors: {'dummy-dev': 'np0005486733.localdomain'}#033[00m Oct 14 05:50:29 localhost neutron_sriov_agent[264729]: 2025-10-14 09:50:29.509 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [None req-c02b9450-e56b-4226-bd33-f53d2cc2a766 - - - - - -] RPC agent_id: nic-switch-agent.np0005486733.localdomain#033[00m Oct 14 05:50:29 localhost neutron_sriov_agent[264729]: 2025-10-14 09:50:29.513 2 INFO neutron.agent.agent_extensions_manager [None req-c02b9450-e56b-4226-bd33-f53d2cc2a766 - - - - - -] Loaded agent extensions: ['qos']#033[00m Oct 14 05:50:29 localhost neutron_sriov_agent[264729]: 2025-10-14 09:50:29.513 2 INFO neutron.agent.agent_extensions_manager [None req-c02b9450-e56b-4226-bd33-f53d2cc2a766 - - - - - -] Initializing agent extension 'qos'#033[00m Oct 14 05:50:29 localhost python3.9[264896]: ansible-ansible.builtin.systemd Invoked with name=edpm_neutron_sriov_agent.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None Oct 14 05:50:29 localhost systemd[1]: Stopping neutron_sriov_agent container... Oct 14 05:50:29 localhost systemd[1]: tmp-crun.MZHZOd.mount: Deactivated successfully. 
Oct 14 05:50:29 localhost systemd[1]: libpod-8f9cdcd5edae45e7d072e5c639159683e77a8fbdfa0c68690957d6d5078c4c25.scope: Deactivated successfully. Oct 14 05:50:29 localhost systemd[1]: libpod-8f9cdcd5edae45e7d072e5c639159683e77a8fbdfa0c68690957d6d5078c4c25.scope: Consumed 1.819s CPU time. Oct 14 05:50:29 localhost podman[264901]: 2025-10-14 09:50:29.751155031 +0000 UTC m=+0.072128055 container died 8f9cdcd5edae45e7d072e5c639159683e77a8fbdfa0c68690957d6d5078c4c25 (image=quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified, name=neutron_sriov_agent, config_id=neutron_sriov_agent, container_name=neutron_sriov_agent, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df53db26f88be0dba7e9a5fdb411951bc56c96f08d3e6ab49de68c3b55962253'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'neutron', 'volumes': ['/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/config-data/ansible-generated/neutron-sriov-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_sriov_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3) Oct 14 05:50:29 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-8f9cdcd5edae45e7d072e5c639159683e77a8fbdfa0c68690957d6d5078c4c25-userdata-shm.mount: Deactivated successfully. 
Oct 14 05:50:29 localhost podman[264901]: 2025-10-14 09:50:29.799724166 +0000 UTC m=+0.120697130 container cleanup 8f9cdcd5edae45e7d072e5c639159683e77a8fbdfa0c68690957d6d5078c4c25 (image=quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified, name=neutron_sriov_agent, config_id=neutron_sriov_agent, container_name=neutron_sriov_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251009, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df53db26f88be0dba7e9a5fdb411951bc56c96f08d3e6ab49de68c3b55962253'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'neutron', 'volumes': ['/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/config-data/ansible-generated/neutron-sriov-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_sriov_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.license=GPLv2) Oct 14 05:50:29 localhost podman[264901]: neutron_sriov_agent Oct 14 05:50:29 localhost podman[264913]: 2025-10-14 09:50:29.801698067 +0000 UTC m=+0.047111861 container cleanup 8f9cdcd5edae45e7d072e5c639159683e77a8fbdfa0c68690957d6d5078c4c25 (image=quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified, name=neutron_sriov_agent, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df53db26f88be0dba7e9a5fdb411951bc56c96f08d3e6ab49de68c3b55962253'}, 'image': 
'quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'neutron', 'volumes': ['/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/config-data/ansible-generated/neutron-sriov-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_sriov_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251009, tcib_managed=true, config_id=neutron_sriov_agent, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, container_name=neutron_sriov_agent, io.buildah.version=1.41.3) Oct 14 05:50:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad. 
Oct 14 05:50:29 localhost podman[264927]: 2025-10-14 09:50:29.876880356 +0000 UTC m=+0.049456024 container health_status 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, tcib_managed=true) Oct 14 05:50:29 localhost podman[264928]: 2025-10-14 09:50:29.896609617 +0000 UTC m=+0.062343382 container cleanup 
8f9cdcd5edae45e7d072e5c639159683e77a8fbdfa0c68690957d6d5078c4c25 (image=quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified, name=neutron_sriov_agent, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_id=neutron_sriov_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df53db26f88be0dba7e9a5fdb411951bc56c96f08d3e6ab49de68c3b55962253'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'neutron', 'volumes': ['/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/config-data/ansible-generated/neutron-sriov-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_sriov_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=neutron_sriov_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2) Oct 14 05:50:29 localhost podman[264928]: neutron_sriov_agent Oct 14 05:50:29 localhost systemd[1]: edpm_neutron_sriov_agent.service: Deactivated successfully. Oct 14 05:50:29 localhost systemd[1]: Stopped neutron_sriov_agent container. 
Oct 14 05:50:29 localhost podman[264927]: 2025-10-14 09:50:29.911987463 +0000 UTC m=+0.084563141 container exec_died 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.schema-version=1.0, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team) Oct 14 05:50:29 localhost systemd[1]: Starting neutron_sriov_agent container... 
Oct 14 05:50:29 localhost systemd[1]: 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad.service: Deactivated successfully. Oct 14 05:50:30 localhost systemd[1]: Started libcrun container. Oct 14 05:50:30 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/931a73aa55636b75941b7771d88c4c8c0fb3b55750697da85c1c97a731f58c4b/merged/etc/neutron.conf.d supports timestamps until 2038 (0x7fffffff) Oct 14 05:50:30 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/931a73aa55636b75941b7771d88c4c8c0fb3b55750697da85c1c97a731f58c4b/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Oct 14 05:50:30 localhost podman[264960]: 2025-10-14 09:50:30.008329697 +0000 UTC m=+0.084091635 container init 8f9cdcd5edae45e7d072e5c639159683e77a8fbdfa0c68690957d6d5078c4c25 (image=quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified, name=neutron_sriov_agent, config_id=neutron_sriov_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=neutron_sriov_agent, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df53db26f88be0dba7e9a5fdb411951bc56c96f08d3e6ab49de68c3b55962253'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'neutron', 'volumes': ['/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/config-data/ansible-generated/neutron-sriov-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_sriov_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, 
managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5) Oct 14 05:50:30 localhost podman[264960]: 2025-10-14 09:50:30.013628281 +0000 UTC m=+0.089390219 container start 8f9cdcd5edae45e7d072e5c639159683e77a8fbdfa0c68690957d6d5078c4c25 (image=quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified, name=neutron_sriov_agent, config_id=neutron_sriov_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, container_name=neutron_sriov_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df53db26f88be0dba7e9a5fdb411951bc56c96f08d3e6ab49de68c3b55962253'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'neutron', 'volumes': ['/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/config-data/ansible-generated/neutron-sriov-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_sriov_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, io.buildah.version=1.41.3) Oct 14 05:50:30 localhost podman[264960]: neutron_sriov_agent Oct 14 05:50:30 localhost neutron_sriov_agent[264974]: + sudo -E kolla_set_configs Oct 14 05:50:30 localhost systemd[1]: Started neutron_sriov_agent container. 
Oct 14 05:50:30 localhost neutron_sriov_agent[264974]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json Oct 14 05:50:30 localhost neutron_sriov_agent[264974]: INFO:__main__:Validating config file Oct 14 05:50:30 localhost neutron_sriov_agent[264974]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS Oct 14 05:50:30 localhost neutron_sriov_agent[264974]: INFO:__main__:Copying service configuration files Oct 14 05:50:30 localhost neutron_sriov_agent[264974]: INFO:__main__:Deleting /etc/neutron/rootwrap.conf Oct 14 05:50:30 localhost neutron_sriov_agent[264974]: INFO:__main__:Copying /etc/neutron.conf.d/01-rootwrap.conf to /etc/neutron/rootwrap.conf Oct 14 05:50:30 localhost neutron_sriov_agent[264974]: INFO:__main__:Setting permission for /etc/neutron/rootwrap.conf Oct 14 05:50:30 localhost neutron_sriov_agent[264974]: INFO:__main__:Writing out command to execute Oct 14 05:50:30 localhost neutron_sriov_agent[264974]: INFO:__main__:Setting permission for /var/lib/neutron Oct 14 05:50:30 localhost neutron_sriov_agent[264974]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts Oct 14 05:50:30 localhost neutron_sriov_agent[264974]: INFO:__main__:Setting permission for /var/lib/neutron/.cache Oct 14 05:50:30 localhost neutron_sriov_agent[264974]: INFO:__main__:Setting permission for /var/lib/neutron/external Oct 14 05:50:30 localhost neutron_sriov_agent[264974]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy Oct 14 05:50:30 localhost neutron_sriov_agent[264974]: INFO:__main__:Setting permission for /var/lib/neutron/ovn_metadata_haproxy_wrapper Oct 14 05:50:30 localhost neutron_sriov_agent[264974]: INFO:__main__:Setting permission for /var/lib/neutron/metadata_proxy Oct 14 05:50:30 localhost neutron_sriov_agent[264974]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/haproxy-kill Oct 14 05:50:30 localhost neutron_sriov_agent[264974]: INFO:__main__:Setting permission for 
/var/lib/neutron/.cache/python-entrypoints Oct 14 05:50:30 localhost neutron_sriov_agent[264974]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints/333254bb87316156e96cebc0941f89c4b6bf7d0c72b62f2bd2e3f232ec27cb23 Oct 14 05:50:30 localhost neutron_sriov_agent[264974]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints/cd0de74397aa76b626744172300028943e2372ca220b3e27b1c7d2b66ff2832c Oct 14 05:50:30 localhost neutron_sriov_agent[264974]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids Oct 14 05:50:30 localhost neutron_sriov_agent[264974]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids/7d0cd696-bdd7-4e70-9512-eb0d23640314.pid.haproxy Oct 14 05:50:30 localhost neutron_sriov_agent[264974]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy/7d0cd696-bdd7-4e70-9512-eb0d23640314.conf Oct 14 05:50:30 localhost neutron_sriov_agent[264974]: ++ cat /run_command Oct 14 05:50:30 localhost neutron_sriov_agent[264974]: + CMD=/usr/bin/neutron-sriov-nic-agent Oct 14 05:50:30 localhost neutron_sriov_agent[264974]: + ARGS= Oct 14 05:50:30 localhost neutron_sriov_agent[264974]: + sudo kolla_copy_cacerts Oct 14 05:50:30 localhost neutron_sriov_agent[264974]: + [[ ! -n '' ]] Oct 14 05:50:30 localhost neutron_sriov_agent[264974]: + . kolla_extend_start Oct 14 05:50:30 localhost neutron_sriov_agent[264974]: + echo 'Running command: '\''/usr/bin/neutron-sriov-nic-agent'\''' Oct 14 05:50:30 localhost neutron_sriov_agent[264974]: Running command: '/usr/bin/neutron-sriov-nic-agent' Oct 14 05:50:30 localhost neutron_sriov_agent[264974]: + umask 0022 Oct 14 05:50:30 localhost neutron_sriov_agent[264974]: + exec /usr/bin/neutron-sriov-nic-agent Oct 14 05:50:30 localhost systemd[1]: session-59.scope: Deactivated successfully. Oct 14 05:50:30 localhost systemd[1]: session-59.scope: Consumed 24.082s CPU time. Oct 14 05:50:30 localhost systemd-logind[760]: Session 59 logged out. 
Waiting for processes to exit. Oct 14 05:50:30 localhost systemd-logind[760]: Removed session 59. Oct 14 05:50:30 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12541 DF PROTO=TCP SPT=35948 DPT=9102 SEQ=2215421264 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B135F9B0000000001030307) Oct 14 05:50:30 localhost nova_compute[238069]: 2025-10-14 09:50:30.894 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:50:31 localhost neutron_sriov_agent[264974]: 2025-10-14 09:50:31.648 2 INFO neutron.common.config [-] Logging enabled!#033[00m Oct 14 05:50:31 localhost neutron_sriov_agent[264974]: 2025-10-14 09:50:31.648 2 INFO neutron.common.config [-] /usr/bin/neutron-sriov-nic-agent version 22.2.2.dev43#033[00m Oct 14 05:50:31 localhost neutron_sriov_agent[264974]: 2025-10-14 09:50:31.649 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [-] Physical Devices mappings: {'dummy_sriov_net': ['dummy-dev']}#033[00m Oct 14 05:50:31 localhost neutron_sriov_agent[264974]: 2025-10-14 09:50:31.649 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [-] Exclude Devices: {}#033[00m Oct 14 05:50:31 localhost neutron_sriov_agent[264974]: 2025-10-14 09:50:31.649 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [-] Resource provider bandwidths: {}#033[00m Oct 14 05:50:31 localhost neutron_sriov_agent[264974]: 2025-10-14 09:50:31.649 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [-] Resource provider inventory defaults: {'allocation_ratio': 1.0, 'min_unit': 1, 'step_size': 1, 'reserved': 0}#033[00m Oct 14 05:50:31 localhost neutron_sriov_agent[264974]: 2025-10-14 09:50:31.649 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [-] Resource provider hypervisors: 
{'dummy-dev': 'np0005486733.localdomain'}#033[00m Oct 14 05:50:31 localhost neutron_sriov_agent[264974]: 2025-10-14 09:50:31.650 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [None req-fd4401a4-f5af-4e9b-838b-50ec0a8924c6 - - - - - -] RPC agent_id: nic-switch-agent.np0005486733.localdomain#033[00m Oct 14 05:50:31 localhost neutron_sriov_agent[264974]: 2025-10-14 09:50:31.654 2 INFO neutron.agent.agent_extensions_manager [None req-fd4401a4-f5af-4e9b-838b-50ec0a8924c6 - - - - - -] Loaded agent extensions: ['qos']#033[00m Oct 14 05:50:31 localhost neutron_sriov_agent[264974]: 2025-10-14 09:50:31.655 2 INFO neutron.agent.agent_extensions_manager [None req-fd4401a4-f5af-4e9b-838b-50ec0a8924c6 - - - - - -] Initializing agent extension 'qos'#033[00m Oct 14 05:50:31 localhost neutron_sriov_agent[264974]: 2025-10-14 09:50:31.875 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [None req-fd4401a4-f5af-4e9b-838b-50ec0a8924c6 - - - - - -] Agent initialized successfully, now running... #033[00m Oct 14 05:50:31 localhost neutron_sriov_agent[264974]: 2025-10-14 09:50:31.875 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [None req-fd4401a4-f5af-4e9b-838b-50ec0a8924c6 - - - - - -] SRIOV NIC Agent RPC Daemon Started!#033[00m Oct 14 05:50:31 localhost neutron_sriov_agent[264974]: 2025-10-14 09:50:31.875 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [None req-fd4401a4-f5af-4e9b-838b-50ec0a8924c6 - - - - - -] Agent out of sync with plugin!#033[00m Oct 14 05:50:33 localhost nova_compute[238069]: 2025-10-14 09:50:33.358 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:50:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f. 
Oct 14 05:50:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917. Oct 14 05:50:33 localhost podman[265008]: 2025-10-14 09:50:33.740627715 +0000 UTC m=+0.076887732 container health_status 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, vcs-type=git, io.openshift.expose-services=, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, version=9.6, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., distribution-scope=public, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350) Oct 14 05:50:33 localhost podman[265008]: 2025-10-14 09:50:33.755506927 +0000 UTC m=+0.091766924 container exec_died 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, release=1755695350, managed_by=edpm_ansible, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., config_id=edpm, io.buildah.version=1.33.7, vcs-type=git, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, io.openshift.expose-services=) Oct 14 05:50:33 localhost systemd[1]: 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917.service: Deactivated successfully. Oct 14 05:50:33 localhost podman[265007]: 2025-10-14 09:50:33.80564146 +0000 UTC m=+0.143334662 container health_status 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_managed=true, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, config_id=ovn_controller, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0) Oct 14 05:50:33 localhost podman[265007]: 2025-10-14 
09:50:33.844362998 +0000 UTC m=+0.182056210 container exec_died 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251009) Oct 14 05:50:33 localhost systemd[1]: 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f.service: Deactivated successfully. Oct 14 05:50:35 localhost nova_compute[238069]: 2025-10-14 09:50:35.894 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:50:37 localhost sshd[265051]: main: sshd: ssh-rsa algorithm is disabled Oct 14 05:50:37 localhost systemd-logind[760]: New session 60 of user zuul. Oct 14 05:50:37 localhost systemd[1]: Started Session 60 of User zuul. 
Oct 14 05:50:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d. Oct 14 05:50:37 localhost podman[265054]: 2025-10-14 09:50:37.435739682 +0000 UTC m=+0.096744877 container health_status 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, config_id=edpm, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, 
tcib_managed=true) Oct 14 05:50:37 localhost podman[265054]: 2025-10-14 09:50:37.4492257 +0000 UTC m=+0.110230835 container exec_died 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251009, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, org.label-schema.vendor=CentOS) Oct 14 05:50:37 localhost systemd[1]: 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d.service: Deactivated successfully. 
Oct 14 05:50:38 localhost nova_compute[238069]: 2025-10-14 09:50:38.361 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:50:38 localhost python3.9[265181]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Oct 14 05:50:38 localhost openstack_network_exporter[250374]: ERROR 09:50:38 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Oct 14 05:50:38 localhost openstack_network_exporter[250374]: ERROR 09:50:38 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Oct 14 05:50:38 localhost openstack_network_exporter[250374]: Oct 14 05:50:38 localhost openstack_network_exporter[250374]: ERROR 09:50:38 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Oct 14 05:50:38 localhost openstack_network_exporter[250374]: Oct 14 05:50:38 localhost openstack_network_exporter[250374]: ERROR 09:50:38 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Oct 14 05:50:38 localhost openstack_network_exporter[250374]: ERROR 09:50:38 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Oct 14 05:50:40 localhost python3.9[265295]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d Oct 14 05:50:40 localhost nova_compute[238069]: 2025-10-14 09:50:40.898 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:50:41 localhost python3.9[265358]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch3.3'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False 
disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None Oct 14 05:50:43 localhost nova_compute[238069]: 2025-10-14 09:50:43.410 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:50:45 localhost nova_compute[238069]: 2025-10-14 09:50:45.025 2 DEBUG oslo_service.periodic_task [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 05:50:45 localhost python3.9[265470]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None Oct 14 05:50:45 localhost nova_compute[238069]: 2025-10-14 09:50:45.901 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:50:46 localhost python3.9[265583]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/openstack/config/containers setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Oct 14 05:50:46 localhost python3.9[265693]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/neutron 
setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Oct 14 05:50:47 localhost nova_compute[238069]: 2025-10-14 09:50:47.024 2 DEBUG oslo_service.periodic_task [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 05:50:47 localhost nova_compute[238069]: 2025-10-14 09:50:47.025 2 DEBUG oslo_service.periodic_task [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 05:50:47 localhost python3.9[265803]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/config-data/ansible-generated/neutron-dhcp-agent setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Oct 14 05:50:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857. 
Oct 14 05:50:48 localhost nova_compute[238069]: 2025-10-14 09:50:48.024 2 DEBUG oslo_service.periodic_task [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 05:50:48 localhost podman[265914]: 2025-10-14 09:50:48.047786779 +0000 UTC m=+0.087123799 container health_status 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true) Oct 14 05:50:48 localhost podman[265914]: 2025-10-14 09:50:48.057974895 +0000 UTC m=+0.097311935 container exec_died 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, 
container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3) Oct 14 05:50:48 localhost systemd[1]: 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857.service: Deactivated successfully. Oct 14 05:50:48 localhost python3.9[265913]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Oct 14 05:50:48 localhost nova_compute[238069]: 2025-10-14 09:50:48.448 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:50:48 localhost python3.9[266042]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/kill_scripts setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Oct 14 05:50:49 localhost nova_compute[238069]: 2025-10-14 09:50:49.019 2 DEBUG oslo_service.periodic_task [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 05:50:49 localhost nova_compute[238069]: 2025-10-14 09:50:49.023 2 DEBUG oslo_service.periodic_task [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks 
/usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 05:50:49 localhost nova_compute[238069]: 2025-10-14 09:50:49.023 2 DEBUG nova.compute.manager [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Oct 14 05:50:49 localhost nova_compute[238069]: 2025-10-14 09:50:49.023 2 DEBUG nova.compute.manager [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Oct 14 05:50:49 localhost python3.9[266152]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/ns-metadata-proxy setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Oct 14 05:50:49 localhost nova_compute[238069]: 2025-10-14 09:50:49.678 2 DEBUG oslo_concurrency.lockutils [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Acquiring lock "refresh_cache-88c4e366-b765-47a6-96bf-f7677f2ce67c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Oct 14 05:50:49 localhost nova_compute[238069]: 2025-10-14 09:50:49.678 2 DEBUG oslo_concurrency.lockutils [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Acquired lock "refresh_cache-88c4e366-b765-47a6-96bf-f7677f2ce67c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Oct 14 05:50:49 localhost nova_compute[238069]: 2025-10-14 09:50:49.679 2 DEBUG nova.network.neutron [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] [instance: 88c4e366-b765-47a6-96bf-f7677f2ce67c] Forcefully refreshing 
network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Oct 14 05:50:49 localhost nova_compute[238069]: 2025-10-14 09:50:49.679 2 DEBUG nova.objects.instance [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 88c4e366-b765-47a6-96bf-f7677f2ce67c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Oct 14 05:50:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041. Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.813 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'name': 'test', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005486733.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '41187b090f3d4818a32baa37ce8a3991', 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'hostId': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228 Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.814 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.838 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 05:50:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.838 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.840 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c4e6db8d-5320-464c-ac64-4b369e431189', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T09:50:49.814979', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '43e19512-a8e3-11f0-9707-fa163e99780b', 'monotonic_time': 11466.007684464, 'message_signature': '0b3e981223d40ac934a754bb7ad3d148ecac6688ae14b6d3377508eb662b8cbf'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 
'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T09:50:49.814979', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '43e1aa5c-a8e3-11f0-9707-fa163e99780b', 'monotonic_time': 11466.007684464, 'message_signature': '1f418ecfc6117967b64e4429f8373211d30cf9bcefa5337b9e76068df0bffe81'}]}, 'timestamp': '2025-10-14 09:50:49.839291', '_unique_id': '0346957f20fb48cbb391f34bed951275'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.840 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.840 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.840 12 ERROR oslo_messaging.notify.messaging yield Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 
09:50:49.840 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.840 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.840 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.840 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.840 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.840 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.840 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.840 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.840 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.840 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.840 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.840 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.840 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.840 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.840 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.840 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.840 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.840 12 ERROR oslo_messaging.notify.messaging Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.840 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.840 12 ERROR oslo_messaging.notify.messaging Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.840 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.840 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 05:50:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.840 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.840 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.840 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.840 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.840 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.840 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.840 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.840 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.840 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.840 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.840 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.840 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.840 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.840 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.840 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.840 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.840 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.840 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.840 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.840 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.840 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.840 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.840 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.840 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.840 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.840 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.840 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.840 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.840 12 ERROR oslo_messaging.notify.messaging Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.842 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.875 12 DEBUG ceilometer.compute.pollsters [-] 
88c4e366-b765-47a6-96bf-f7677f2ce67c/cpu volume: 63380000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.877 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '19d47096-9492-448e-83d0-a44781e02128', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 63380000000, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'timestamp': '2025-10-14T09:50:49.842174', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '43e73d50-a8e3-11f0-9707-fa163e99780b', 'monotonic_time': 11466.0678973, 'message_signature': 'cbafb8041b7b7e6bf1577ee6595dfa1ea875be9a860d29450d42a3ee4e2d06b8'}]}, 'timestamp': '2025-10-14 09:50:49.875882', '_unique_id': '805a089bfbe04fea8ce2a9f2c7b344b6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:50:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.877 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.877 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.877 12 ERROR oslo_messaging.notify.messaging yield Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.877 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.877 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.877 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.877 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.877 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.877 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.877 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.877 12 ERROR oslo_messaging.notify.messaging conn = 
self.transport.establish_connection() Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.877 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.877 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.877 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.877 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.877 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.877 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.877 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.877 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.877 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.877 12 ERROR oslo_messaging.notify.messaging Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.877 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: 
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.877 12 ERROR oslo_messaging.notify.messaging Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.877 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.877 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.877 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.877 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.877 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.877 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.877 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.877 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.877 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 
09:50:49.877 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.877 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.877 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.877 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.877 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.877 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.877 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.877 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.877 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.877 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.877 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.877 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.877 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.877 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.877 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.877 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.877 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.877 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.877 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.877 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.877 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 
05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.877 12 ERROR oslo_messaging.notify.messaging Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.879 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.879 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters Oct 14 05:50:49 localhost podman[266262]: 2025-10-14 09:50:49.894729729 +0000 UTC m=+0.093952872 container health_status 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter) Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.905 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.read.requests volume: 1064 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 05:50:49 localhost 
podman[266262]: 2025-10-14 09:50:49.90606 +0000 UTC m=+0.105283193 container exec_died 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.906 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.read.requests volume: 222 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.908 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '7db65d13-a30c-424a-a3f0-41c78c40c54a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1064, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T09:50:49.879626', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '43ebd2d4-a8e3-11f0-9707-fa163e99780b', 'monotonic_time': 11466.072410989, 'message_signature': 'd0ab5c72e43d52aa916412b9245b497db55023c96d4579790ae01230fc068def'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 222, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T09:50:49.879626', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '43ebf2be-a8e3-11f0-9707-fa163e99780b', 'monotonic_time': 11466.072410989, 'message_signature': '098fd48f8813a15ee674a8ac0440001576d11d51d77855daed8b53a5efeaab78'}]}, 'timestamp': '2025-10-14 09:50:49.906754', '_unique_id': '5fe844b673974971aab52c88f993d466'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.908 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.908 12 ERROR oslo_messaging.notify.messaging yield Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.908 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.908 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.908 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.908 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.908 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.908 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.908 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 
05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.908 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.908 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.908 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.908 12 ERROR oslo_messaging.notify.messaging Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.908 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.908 12 ERROR oslo_messaging.notify.messaging Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.908 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.908 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 05:50:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.908 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.908 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.908 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.908 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.908 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.908 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.908 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.908 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.908 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.908 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:50:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.908 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.908 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.908 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.908 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.908 12 ERROR oslo_messaging.notify.messaging Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.909 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.910 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.write.bytes volume: 73895936 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.910 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 05:50:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.911 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'aa07fe86-42ff-4bed-8afe-6c8b80e2da19', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 73895936, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T09:50:49.909968', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '43ec863e-a8e3-11f0-9707-fa163e99780b', 'monotonic_time': 11466.072410989, 'message_signature': 'f7e9b1b4aea21e77879be5b6636a84f450cee313a0cc5b7ec1b1af128ac0d3a6'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': 
'2025-10-14T09:50:49.909968', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '43ec970a-a8e3-11f0-9707-fa163e99780b', 'monotonic_time': 11466.072410989, 'message_signature': '073cc86c2e69ef09b29d2e981584fba693b7c2199533e44f72ffe98b671c6df4'}]}, 'timestamp': '2025-10-14 09:50:49.910915', '_unique_id': 'b48da18dab79402d93222106ccea0c02'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.911 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.911 12 ERROR oslo_messaging.notify.messaging yield
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.911 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.911 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.911 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.911 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.911 12 ERROR oslo_messaging.notify.messaging conn.connect()
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.911 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.911 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.911 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.911 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.911 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.911 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.911 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.911 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.911 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.911 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.911 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.911 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.911 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.911 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.911 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.911 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.911 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.911 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.911 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.911 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.911 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.911 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.911 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.911 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.913 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.917 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.incoming.packets volume: 81 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 05:50:49 localhost systemd[1]: 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041.service: Deactivated successfully.
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.920 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e5872b77-b944-4909-b3e1-e459a34e17ec', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 81, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T09:50:49.913293', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': '43edbb9e-a8e3-11f0-9707-fa163e99780b', 'monotonic_time': 11466.106028771, 'message_signature': '4b47b4822a2734ec1ae581348937ed5a346040f7569ae489781dd0a79c43462b'}]}, 'timestamp': '2025-10-14 09:50:49.918550', '_unique_id': '069ecc775f2e4e80b0df5d1c5d3f154b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.920 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.920 12 ERROR oslo_messaging.notify.messaging yield
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.920 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.920 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.920 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.920 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.920 12 ERROR oslo_messaging.notify.messaging conn.connect()
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.920 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.920 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.920 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.920 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.920 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.920 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.920 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.920 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.920 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.920 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.920 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.920 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.920 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.920 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.920 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.920 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.920 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.920 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.920 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.920 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.920 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.920 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.920 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.920 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.921 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.921 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.921 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.write.latency volume: 217051898 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.922 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.write.latency volume: 24279644 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.923 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '86288870-d9f9-4983-9428-c5bd22460595', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 217051898, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T09:50:49.921859', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '43ee569e-a8e3-11f0-9707-fa163e99780b', 'monotonic_time': 11466.072410989, 'message_signature': '56eee3ec1daae4abffe7923ceb0818c62a23607349e6216255e13dc855116955'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 24279644, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T09:50:49.921859', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '43ee672e-a8e3-11f0-9707-fa163e99780b', 'monotonic_time': 11466.072410989, 'message_signature': 'fb495dd6d6a208e06b0629610628ec35bcf04d4abacd3d90f7f35657e968232c'}]}, 'timestamp': '2025-10-14 09:50:49.922800', '_unique_id': '4c9cd68dbb5647cbb08458f80c7d7204'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.923 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.923 12 ERROR oslo_messaging.notify.messaging yield
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.923 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.923 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.923 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.923 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.923 12 ERROR oslo_messaging.notify.messaging conn.connect()
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.923 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.923 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.923 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.923 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.923 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.923 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.923 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.923 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.923 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.923 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.923 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.923 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.923 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.923 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.923 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.923 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.923 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.923 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.923 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.923 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.923 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.923 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.923 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.923 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.924 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.925 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.925 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 05:50:49 localhost
ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.927 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c62fc50e-78d3-404e-bead-84ee7aa31793', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T09:50:49.925137', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '43eed66e-a8e3-11f0-9707-fa163e99780b', 'monotonic_time': 11466.007684464, 'message_signature': '19f73588a2a892518b92e84f0d02dcd0ed0e39d52444580b333e8714932d334f'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': 
'2025-10-14T09:50:49.925137', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '43eeea96-a8e3-11f0-9707-fa163e99780b', 'monotonic_time': 11466.007684464, 'message_signature': 'e1b0b938d902738032959ce36be24d0ec3dd1e2b78cdcabc19f4da21020cf8bd'}]}, 'timestamp': '2025-10-14 09:50:49.926121', '_unique_id': 'feb832162f0548bba2fad061f6126c4a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.927 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.927 12 ERROR oslo_messaging.notify.messaging yield Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.927 12 ERROR oslo_messaging.notify.messaging return retry_over_time( 
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.927 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.927 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.927 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.927 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.927 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.927 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.927 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.927 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.927 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.927 12 ERROR oslo_messaging.notify.messaging Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.927 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.927 12 ERROR oslo_messaging.notify.messaging Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.927 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.927 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.927 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.927 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.927 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.927 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.927 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.927 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.927 
12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.927 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.927 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.927 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.927 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.927 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.927 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.927 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.927 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.927 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.927 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.927 12 ERROR oslo_messaging.notify.messaging Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.928 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.928 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.write.requests volume: 525 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.929 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.write.requests volume: 1 
_stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.930 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a45144c7-8d21-4b0b-b85e-efec539e88f2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 525, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T09:50:49.928538', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '43ef5da0-a8e3-11f0-9707-fa163e99780b', 'monotonic_time': 11466.072410989, 'message_signature': '35c7c08e648e798432bab8e9715517a19e7c743e7e09b55dff09516679d52f8e'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': 
'41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T09:50:49.928538', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '43ef6e94-a8e3-11f0-9707-fa163e99780b', 'monotonic_time': 11466.072410989, 'message_signature': '854768b57f90d887823a694f07467ab27d13aca4927c96e2828c32448ba9caa5'}]}, 'timestamp': '2025-10-14 09:50:49.929488', '_unique_id': '59c2d082e00a4fa79c624de29980f8ec'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.930 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.930 12 ERROR oslo_messaging.notify.messaging yield Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:50:49 
localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.930 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.930 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.930 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.930 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.930 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.930 12 ERROR oslo_messaging.notify.messaging 
self.transport.connect() Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.930 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.930 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.930 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.930 12 ERROR oslo_messaging.notify.messaging Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.930 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.930 12 ERROR oslo_messaging.notify.messaging Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.930 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.930 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 05:50:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.930 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.930 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.930 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.930 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.930 12 ERROR oslo_messaging.notify.messaging self.connection = 
connection_pool.get(retry=retry) Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.930 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.930 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.930 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.930 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.930 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 05:50:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.930 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.930 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.930 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.930 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.930 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.931 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.932 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.933 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cc20696c-b3ef-4eb7-bf91-809b4f92b661', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T09:50:49.932074', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': '43efe5c2-a8e3-11f0-9707-fa163e99780b', 'monotonic_time': 11466.106028771, 'message_signature': 'ae0ada8d5f83f6307328cfc69183122772268ca52521d7be1901343d677f4113'}]}, 'timestamp': '2025-10-14 09:50:49.932574', '_unique_id': '4e8a21a1e1f640e9a10e844db508d4a4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.933 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.933 12 ERROR oslo_messaging.notify.messaging yield
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.933 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.933 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.933 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.933 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.933 12 ERROR oslo_messaging.notify.messaging conn.connect()
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.933 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.933 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.933 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.933 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.933 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.933 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.933 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.933 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.933 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.933 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.933 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.933 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.933 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.933 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.933 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.933 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.933 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.933 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.933 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.933 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.933 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.933 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.933 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.933 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.934 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.934 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.outgoing.bytes volume: 11272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.936 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '753483e6-9c82-4f47-b400-37cb3d07ab9d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 11272, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T09:50:49.934908', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': '43f05430-a8e3-11f0-9707-fa163e99780b', 'monotonic_time': 11466.106028771, 'message_signature': '66071e51b5938d97c325d18f131e5d92484c383a88d058b2bcb82f62488307bb'}]}, 'timestamp': '2025-10-14 09:50:49.935400', '_unique_id': '80106919eb834f929166e045df593ccf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.936 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.936 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.936 12 ERROR oslo_messaging.notify.messaging yield
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.936 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.936 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.936 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.936 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.936 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.936 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.936 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.936 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.936 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.936 12 ERROR oslo_messaging.notify.messaging conn.connect()
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.936 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.936 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.936 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.936 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.936 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.936 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.936 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.936 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.936 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.936 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.936 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.936 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.936 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.936 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.936 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.936 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.936 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.936 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.936 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.936 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.936 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.936 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.936 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.936 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.936 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.936 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.936 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.936 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.936 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.936 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.936 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.936 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.936 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.936 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.936 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.936 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.936 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.936 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.936 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.936 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.936 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.938 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.938 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.940 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '76ab5e9d-b143-46bf-9b6a-970d385fddff', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T09:50:49.938622', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': '43f0e7ce-a8e3-11f0-9707-fa163e99780b', 'monotonic_time': 11466.106028771, 'message_signature': '42fd7a380fb7f6a6c083d7278e0f02be1e134185c3245851f47398b34ef1be86'}]}, 'timestamp': '2025-10-14 09:50:49.939184', '_unique_id': 'd2b30a0492e94ce4968c5e0f392a7bdb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.940 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.940 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.940 12 ERROR oslo_messaging.notify.messaging yield
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.940 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.940 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.940 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.940 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.940 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.940 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.940 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.940 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.940 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.940 12 ERROR oslo_messaging.notify.messaging conn.connect()
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.940 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.940 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.940 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.940 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.940 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.940 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.940 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.940 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.940 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.940 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.940 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.940 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.940 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.940 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.940 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.940 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.940 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.940 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.940 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.940 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.940 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.940 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.940 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.940 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.940 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.940 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.940 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.940 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.940 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.940 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.940 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.940 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.940 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.940 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.940 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.940 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.940 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.940 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.940 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.940 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.940 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.941 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.941 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.941 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.incoming.bytes volume: 8696 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.943 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8061ab13-c6c1-414e-9539-faf85a3dae47', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 8696, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T09:50:49.941633', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id':
'0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': '43f15e02-a8e3-11f0-9707-fa163e99780b', 'monotonic_time': 11466.106028771, 'message_signature': '357b1ac3cdfa1bed7557be70518097f70d5172905afd70952c86a96bc5f53ee4'}]}, 'timestamp': '2025-10-14 09:50:49.942228', '_unique_id': '23aa1180d537411b8d7f61ba0bcf9d50'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.943 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.943 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.943 12 ERROR oslo_messaging.notify.messaging yield Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.943 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.943 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.943 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.943 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 05:50:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.943 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.943 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.943 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.943 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.943 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.943 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.943 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.943 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.943 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.943 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.943 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.943 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.943 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.943 12 ERROR oslo_messaging.notify.messaging Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.943 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.943 12 ERROR oslo_messaging.notify.messaging Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.943 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.943 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.943 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.943 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.943 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.943 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.943 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.943 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.943 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.943 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.943 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.943 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.943 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.943 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.943 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.943 12 ERROR oslo_messaging.notify.messaging 
File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.943 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.943 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.943 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.943 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.943 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.943 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.943 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.943 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.943 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.943 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in 
__exit__ Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.943 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.943 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.943 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.943 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.943 12 ERROR oslo_messaging.notify.messaging Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.943 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.943 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.read.bytes volume: 29130240 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.944 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.read.bytes volume: 4300800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.945 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'b3ec58ec-e152-4f08-aebf-5e5c5314dbbb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 29130240, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T09:50:49.943821', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '43f1ac9a-a8e3-11f0-9707-fa163e99780b', 'monotonic_time': 11466.072410989, 'message_signature': '673d4e0041a44a4af99406847b54c5279a458f672ab970f30ad73e9f7403a728'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 4300800, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T09:50:49.943821', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '43f1b708-a8e3-11f0-9707-fa163e99780b', 'monotonic_time': 11466.072410989, 'message_signature': '68f357fd7860044417f31300e57f1024e011053b2da54d4b0c9ff7939788af37'}]}, 'timestamp': '2025-10-14 09:50:49.944375', '_unique_id': '1f65d4c062f94bb9a7093bb86f7b5759'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.945 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.945 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.945 12 ERROR oslo_messaging.notify.messaging yield Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.945 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.945 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.945 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.945 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.945 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.945 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.945 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.945 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.945 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.945 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.945 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.945 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.945 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 
05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.945 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.945 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.945 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.945 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.945 12 ERROR oslo_messaging.notify.messaging Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.945 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.945 12 ERROR oslo_messaging.notify.messaging Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.945 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.945 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.945 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.945 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 05:50:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.945 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.945 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.945 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.945 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.945 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.945 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.945 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.945 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.945 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.945 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.945 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.945 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.945 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.945 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.945 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.945 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.945 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.945 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.945 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.945 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:50:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.945 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.945 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.945 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.945 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.945 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.945 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.945 12 ERROR oslo_messaging.notify.messaging Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.945 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.945 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.946 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '55715679-caa4-4069-9f49-94e9b2303778', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T09:50:49.945822', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': '43f1fbc8-a8e3-11f0-9707-fa163e99780b', 'monotonic_time': 11466.106028771, 'message_signature': 'f20b660f514957f7cd225d5091c8f79e4c4d1bdeae6b8373a20a6c11df3b51a3'}]}, 'timestamp': '2025-10-14 09:50:49.946155', '_unique_id': 'fec6b27e487f4d359dd8c3295690bf8e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.946 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:50:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.946 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.946 12 ERROR oslo_messaging.notify.messaging yield Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.946 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.946 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.946 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.946 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.946 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.946 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.946 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.946 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.946 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.946 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.946 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.946 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.946 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.946 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.946 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.946 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.946 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.946 12 ERROR oslo_messaging.notify.messaging Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.946 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.946 12 ERROR oslo_messaging.notify.messaging Oct 14 05:50:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.946 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.946 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.946 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.946 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.946 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.946 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.946 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.946 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.946 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.946 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.946 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.946 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.946 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.946 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.946 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.946 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.946 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.946 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.946 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.946 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 05:50:49 
localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.946 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.946 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.946 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.946 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.946 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.946 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.946 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.946 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.946 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.946 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.946 12 ERROR oslo_messaging.notify.messaging Oct 14 05:50:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.947 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.947 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/memory.usage volume: 52.32421875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.948 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd19c9ebd-04cc-4cd7-8499-2d85d0d08ad2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 52.32421875, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'timestamp': '2025-10-14T09:50:49.947514', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '43f23d90-a8e3-11f0-9707-fa163e99780b', 'monotonic_time': 11466.0678973, 
'message_signature': 'db98e938bcdf8441c34610d87020584ddc9a82bc83039be7ac6e26f7cb1db6ce'}]}, 'timestamp': '2025-10-14 09:50:49.947836', '_unique_id': 'ef251a6bc08248dcae7dd94024692871'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.948 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.948 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.948 12 ERROR oslo_messaging.notify.messaging yield Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.948 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.948 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.948 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.948 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.948 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.948 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.948 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.948 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.948 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.948 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.948 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.948 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.948 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.948 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.948 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.948 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.948 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 05:50:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.948 12 ERROR oslo_messaging.notify.messaging Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.948 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.948 12 ERROR oslo_messaging.notify.messaging Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.948 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.948 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.948 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.948 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.948 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.948 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.948 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.948 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", 
line 653, in _send Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.948 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.948 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.948 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.948 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.948 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.948 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.948 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.948 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.948 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.948 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.948 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.948 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.948 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.948 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.948 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.948 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.948 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.948 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.948 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.948 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 05:50:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.948 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.948 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.948 12 ERROR oslo_messaging.notify.messaging Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.949 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.949 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.950 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '920c642e-ec22-4649-a008-041405c1c23c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T09:50:49.949272', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': '43f2817e-a8e3-11f0-9707-fa163e99780b', 'monotonic_time': 11466.106028771, 'message_signature': '50e9053e213cd2d750ddcf763aa115548669785e066619304bfbef057ec2e893'}]}, 'timestamp': '2025-10-14 09:50:49.949577', '_unique_id': '6be764c615684607857cd5b2692490f7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.950 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:50:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.950 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.950 12 ERROR oslo_messaging.notify.messaging yield Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.950 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.950 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.950 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.950 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.950 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.950 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.950 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.950 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.950 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.950 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.950 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.950 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.950 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.950 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.950 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.950 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.950 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.950 12 ERROR oslo_messaging.notify.messaging Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.950 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.950 12 ERROR oslo_messaging.notify.messaging Oct 14 05:50:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.950 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.950 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.950 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.950 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.950 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.950 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.950 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.950 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.950 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.950 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.950 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.950 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.950 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.950 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.950 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.950 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.950 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.950 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.950 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.950 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.950 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.950 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.950 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.950 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.950 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.950 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.950 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.950 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.950 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.950 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.950 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.950 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.951 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.read.latency volume: 1400578039 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.951 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.read.latency volume: 174873109 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.952 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '56c1d910-8eb1-486c-a47a-d6aec34a4c83', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1400578039, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T09:50:49.951088', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '43f2c864-a8e3-11f0-9707-fa163e99780b', 'monotonic_time': 11466.072410989, 'message_signature': 'e769cd3e25a40a448e05038d5206bf5a1ef9c2a521d579e02f17bce9dd47fd3a'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 174873109, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T09:50:49.951088', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '43f2d2aa-a8e3-11f0-9707-fa163e99780b', 'monotonic_time': 11466.072410989, 'message_signature': '2acbaf7d9f843ca4fac8a5b215b9324bd97524b4bb97ef95f3e73a774d06923a'}]}, 'timestamp': '2025-10-14 09:50:49.951658', '_unique_id': '165e6ad89f9a474d868f1fdceef17e63'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.952 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.952 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.952 12 ERROR oslo_messaging.notify.messaging yield
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.952 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.952 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.952 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.952 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.952 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.952 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.952 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.952 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.952 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.952 12 ERROR oslo_messaging.notify.messaging conn.connect()
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.952 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.952 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.952 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.952 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.952 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.952 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.952 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.952 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.952 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.952 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.952 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.952 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.952 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.952 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.952 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.952 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.952 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.952 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.952 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.952 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.952 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.952 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.952 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.952 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.952 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.952 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.952 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.952 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.952 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.952 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.952 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.952 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.952 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.952 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.952 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.952 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.952 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.952 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.952 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.952 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.952 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.953 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.953 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.954 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5b122d07-19e3-4714-a091-ecb915c77955', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T09:50:49.953101', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': '43f31710-a8e3-11f0-9707-fa163e99780b', 'monotonic_time': 11466.106028771, 'message_signature': 'fa735a8cbda6adcf1c3dfdd12248ee67a3c737abd71ffefa376d7b83ee294652'}]}, 'timestamp': '2025-10-14 09:50:49.953697', '_unique_id': 'ee49216e48a0448db2ad9715d36862ff'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.954 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.954 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.954 12 ERROR oslo_messaging.notify.messaging yield
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.954 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.954 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.954 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.954 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.954 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.954 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.954 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.954 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.954 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.954 12 ERROR oslo_messaging.notify.messaging conn.connect()
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.954 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.954 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.954 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.954 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.954 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.954 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.954 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.954 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.954 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.954 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.954 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.954 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.954 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.954 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.954 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.954 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.954 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.954 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.954 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.954 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.954 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.954 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.954 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.954 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.954 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.954 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.954 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.954 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.954 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.954 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.954 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.954 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.954 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.954 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.954 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.954 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.954 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.954 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.954 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.954 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.954 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.955 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.955 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.955 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.956 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '56371548-a30c-46ce-8418-609efe7e396d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T09:50:49.955228', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': '43f36a12-a8e3-11f0-9707-fa163e99780b', 'monotonic_time': 11466.106028771, 'message_signature': 'dca4e447b6351f6360b0e9984c5980e2ef2b1ab5634c2b0209f97f79378392d7'}]}, 'timestamp': '2025-10-14 09:50:49.955529', '_unique_id': '5a1c8fb8635f437d807c8a0060ae3586'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.956 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.956 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.956 12 ERROR oslo_messaging.notify.messaging yield
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.956 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.956 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.956 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.956 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.956 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.956 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.956 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.956 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.956 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.956 12 ERROR oslo_messaging.notify.messaging conn.connect()
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.956 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.956 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.956 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.956 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.956 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.956 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.956 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.956 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.956 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.956 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.956 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.956 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.956 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.956 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.956 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.956 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.956 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.956 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.956 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.956 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.956 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.956 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.956 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.956 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.956 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.956 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.956 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.956 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.956 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.956 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 05:50:49 
localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.956 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.956 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.956 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.956 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.956 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.956 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.956 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.956 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.956 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.956 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.956 12 ERROR oslo_messaging.notify.messaging Oct 14 05:50:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.956 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.956 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.outgoing.packets volume: 129 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.957 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cb25a593-46fc-4d0c-aae8-19c8e6630b70', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 129, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T09:50:49.956940', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 
'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': '43f3acf2-a8e3-11f0-9707-fa163e99780b', 'monotonic_time': 11466.106028771, 'message_signature': 'b01076399645b8b585afed6a0ea657f69c7e0bd4b551020b637fbe51b1db4a4a'}]}, 'timestamp': '2025-10-14 09:50:49.957242', '_unique_id': 'dc951d481f1a4ba7a07253861ff4a27e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.957 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.957 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.957 12 ERROR oslo_messaging.notify.messaging yield Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.957 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.957 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.957 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.957 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.957 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.957 12 ERROR 
oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.957 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.957 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.957 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.957 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.957 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.957 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.957 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.957 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.957 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.957 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 
2025-10-14 09:50:49.957 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.957 12 ERROR oslo_messaging.notify.messaging Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.957 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.957 12 ERROR oslo_messaging.notify.messaging Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.957 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.957 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.957 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.957 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.957 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.957 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.957 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 05:50:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.957 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.957 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.957 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.957 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.957 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.957 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.957 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.957 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.957 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.957 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 
05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.957 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.957 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.957 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.957 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.957 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.957 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.957 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.957 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.957 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.957 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.957 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.957 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.957 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.957 12 ERROR oslo_messaging.notify.messaging Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.958 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.958 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.958 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.allocation volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.959 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '0679a25a-dfee-461a-8602-7d6faaac6c69', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T09:50:49.958648', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '43f3f19e-a8e3-11f0-9707-fa163e99780b', 'monotonic_time': 11466.007684464, 'message_signature': '1fd2ec4ddeb12a438059ea0d59a03315f1ef4211b514bfd62811be09c351480b'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T09:50:49.958648', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 
'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '43f3fc02-a8e3-11f0-9707-fa163e99780b', 'monotonic_time': 11466.007684464, 'message_signature': 'adcfc207cb1ab55bf5c07cced44a1103c6a86b22e28669f66e7385a46e849f95'}]}, 'timestamp': '2025-10-14 09:50:49.959248', '_unique_id': '0d11f58e5ff54dc89fdedd9adfc67961'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.959 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.959 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.959 12 ERROR oslo_messaging.notify.messaging yield Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.959 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.959 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.959 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.959 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.959 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.959 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.959 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.959 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.959 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.959 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.959 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.959 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.959 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 05:50:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.959 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.959 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.959 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.959 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.959 12 ERROR oslo_messaging.notify.messaging Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.959 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.959 12 ERROR oslo_messaging.notify.messaging Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.959 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.959 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.959 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.959 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 
09:50:49.959 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.959 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.959 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.959 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.959 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.959 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.959 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.959 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.959 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.959 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 05:50:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.959 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.959 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.959 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.959 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.959 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.959 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.959 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.959 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.959 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.959 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.959 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.959 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.959 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.959 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.959 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.959 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 05:50:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:50:49.959 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:50:50 localhost python3.9[266263]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/external/pids setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 14 05:50:50 localhost nova_compute[238069]: 2025-10-14 09:50:50.732 2 DEBUG nova.network.neutron [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] [instance: 88c4e366-b765-47a6-96bf-f7677f2ce67c] Updating instance_info_cache with network_info: [{"id": "3ec9b060-f43d-4698-9c76-6062c70911d5", "address": "fa:16:3e:84:5e:e5", "network": {"id": "7d0cd696-bdd7-4e70-9512-eb0d23640314", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.46", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "41187b090f3d4818a32baa37ce8a3991", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3ec9b060-f4", "ovs_interfaceid": "3ec9b060-f43d-4698-9c76-6062c70911d5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct 14 05:50:50 localhost nova_compute[238069]: 2025-10-14 09:50:50.757 2 DEBUG oslo_concurrency.lockutils [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Releasing lock "refresh_cache-88c4e366-b765-47a6-96bf-f7677f2ce67c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct 14 05:50:50 localhost nova_compute[238069]: 2025-10-14 09:50:50.757 2 DEBUG nova.compute.manager [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] [instance: 88c4e366-b765-47a6-96bf-f7677f2ce67c] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Oct 14 05:50:50 localhost nova_compute[238069]: 2025-10-14 09:50:50.758 2 DEBUG oslo_service.periodic_task [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 14 05:50:50 localhost nova_compute[238069]: 2025-10-14 09:50:50.905 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 14 05:50:51 localhost nova_compute[238069]: 2025-10-14 09:50:51.024 2 DEBUG oslo_service.periodic_task [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 14 05:50:51 localhost nova_compute[238069]: 2025-10-14 09:50:51.045 2 DEBUG oslo_concurrency.lockutils [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct 14 05:50:51 localhost nova_compute[238069]: 2025-10-14 09:50:51.046 2 DEBUG oslo_concurrency.lockutils [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct 14 05:50:51 localhost nova_compute[238069]: 2025-10-14 09:50:51.046 2 DEBUG oslo_concurrency.lockutils [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct 14 05:50:51 localhost nova_compute[238069]: 2025-10-14 09:50:51.046 2 DEBUG nova.compute.resource_tracker [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Auditing locally available compute resources for np0005486733.localdomain (node: np0005486733.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct 14 05:50:51 localhost nova_compute[238069]: 2025-10-14 09:50:51.047 2 DEBUG oslo_concurrency.processutils [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct 14 05:50:51 localhost python3.9[266395]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/neutron_dhcp_agent.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 05:50:51 localhost nova_compute[238069]: 2025-10-14 09:50:51.543 2 DEBUG oslo_concurrency.processutils [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.496s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct 14 05:50:51 localhost nova_compute[238069]: 2025-10-14 09:50:51.614 2 DEBUG nova.virt.libvirt.driver [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Oct 14 05:50:51 localhost nova_compute[238069]: 2025-10-14 09:50:51.615 2 DEBUG nova.virt.libvirt.driver [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Oct 14 05:50:51 localhost nova_compute[238069]: 2025-10-14 09:50:51.807 2 WARNING nova.virt.libvirt.driver [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct 14 05:50:51 localhost nova_compute[238069]: 2025-10-14 09:50:51.810 2 DEBUG nova.compute.resource_tracker [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Hypervisor/Node resource view: name=np0005486733.localdomain free_ram=12184MB free_disk=41.83725357055664GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct 14 05:50:51 localhost nova_compute[238069]: 2025-10-14 09:50:51.810 2 DEBUG oslo_concurrency.lockutils [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct 14 05:50:51 localhost nova_compute[238069]: 2025-10-14 09:50:51.811 2 DEBUG oslo_concurrency.lockutils [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct 14 05:50:51 localhost nova_compute[238069]: 2025-10-14 09:50:51.913 2 DEBUG nova.compute.resource_tracker [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Instance 88c4e366-b765-47a6-96bf-f7677f2ce67c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct 14 05:50:51 localhost nova_compute[238069]: 2025-10-14 09:50:51.914 2 DEBUG nova.compute.resource_tracker [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct 14 05:50:51 localhost nova_compute[238069]: 2025-10-14 09:50:51.914 2 DEBUG nova.compute.resource_tracker [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Final resource view: name=np0005486733.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct 14 05:50:51 localhost nova_compute[238069]: 2025-10-14 09:50:51.978 2 DEBUG oslo_concurrency.processutils [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct 14 05:50:52 localhost nova_compute[238069]: 2025-10-14 09:50:52.392 2 DEBUG oslo_concurrency.processutils [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.414s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct 14 05:50:52 localhost nova_compute[238069]: 2025-10-14 09:50:52.398 2 DEBUG nova.compute.provider_tree [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Inventory has not changed in ProviderTree for provider: 18c24273-aca2-4f08-be57-3188d558235e update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct 14 05:50:52 localhost nova_compute[238069]: 2025-10-14 09:50:52.431 2 DEBUG nova.scheduler.client.report [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Inventory has not changed for provider 18c24273-aca2-4f08-be57-3188d558235e based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct 14 05:50:52 localhost nova_compute[238069]: 2025-10-14 09:50:52.434 2 DEBUG nova.compute.resource_tracker [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Compute_service record updated for np0005486733.localdomain:np0005486733.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct 14 05:50:52 localhost nova_compute[238069]: 2025-10-14 09:50:52.434 2 DEBUG oslo_concurrency.lockutils [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.624s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct 14 05:50:53 localhost python3.9[266527]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/neutron_dhcp_agent.yaml mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1760435450.528989-281-240888662878354/.source.yaml follow=False _original_basename=neutron_dhcp_agent.yaml.j2 checksum=3ebfe8ab1da42a1c6ca52429f61716009c5fd177 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 14 05:50:53 localhost nova_compute[238069]: 2025-10-14 09:50:53.436 2 DEBUG oslo_service.periodic_task [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 14 05:50:53 localhost nova_compute[238069]: 2025-10-14 09:50:53.437 2 DEBUG nova.compute.manager [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct 14 05:50:53 localhost nova_compute[238069]: 2025-10-14 09:50:53.487 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 14 05:50:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23487 DF PROTO=TCP SPT=40836 DPT=9102 SEQ=1628618658 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B13B9050000000001030307)
Oct 14 05:50:53 localhost python3.9[266635]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-dhcp-agent/01-neutron.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 05:50:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23488 DF PROTO=TCP SPT=40836 DPT=9102 SEQ=1628618658 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B13BD1B0000000001030307)
Oct 14 05:50:55 localhost python3.9[266721]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-dhcp-agent/01-neutron.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1760435453.2488034-325-265096243258506/.source.conf follow=False _original_basename=neutron.conf.j2 checksum=24e013b64eb8be4a13596c6ffccbd94df7442bd2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 14 05:50:55 localhost python3.9[266829]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-dhcp-agent/01-rootwrap.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 05:50:55 localhost nova_compute[238069]: 2025-10-14 09:50:55.907 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 14 05:50:56 localhost python3.9[266915]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-dhcp-agent/01-rootwrap.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1760435455.308362-325-156759897070317/.source.conf follow=False _original_basename=rootwrap.conf.j2 checksum=11f2cfb4b7d97b2cef3c2c2d88089e6999cffe22 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 14 05:50:56 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23489 DF PROTO=TCP SPT=40836 DPT=9102 SEQ=1628618658 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B13C51B0000000001030307)
Oct 14 05:50:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29.
Oct 14 05:50:56 localhost systemd[1]: tmp-crun.NkL6za.mount: Deactivated successfully.
Oct 14 05:50:56 localhost podman[267024]: 2025-10-14 09:50:56.761095054 +0000 UTC m=+0.101071322 container health_status fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, io.buildah.version=1.41.3, managed_by=edpm_ansible, container_name=iscsid, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid)
Oct 14 05:50:56 localhost podman[267024]: 2025-10-14 09:50:56.772250679 +0000 UTC m=+0.112226997 container exec_died fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=iscsid, config_id=iscsid, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Oct 14 05:50:56 localhost systemd[1]: fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29.service: Deactivated successfully.
Oct 14 05:50:56 localhost python3.9[267023]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-dhcp-agent/01-neutron-dhcp-agent.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 05:50:57 localhost python3.9[267129]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-dhcp-agent/01-neutron-dhcp-agent.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1760435456.3542638-325-15125070487507/.source.conf follow=False _original_basename=neutron-dhcp-agent.conf.j2 checksum=84bf6421c6362464baf3d1e3cd5d2659cd7f6cd3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 14 05:50:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:50:57.757 163055 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct 14 05:50:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:50:57.758 163055 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct 14 05:50:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:50:57.759 163055 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct 14 05:50:58 localhost podman[248187]: time="2025-10-14T09:50:58Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 14 05:50:58 localhost podman[248187]: @ - - [14/Oct/2025:09:50:58 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 136010 "" "Go-http-client/1.1"
Oct 14 05:50:58 localhost podman[248187]: @ - - [14/Oct/2025:09:50:58 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 16912 "" "Go-http-client/1.1"
Oct 14 05:50:58 localhost nova_compute[238069]: 2025-10-14 09:50:58.533 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 14 05:50:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec.
Oct 14 05:50:58 localhost podman[267227]: 2025-10-14 09:50:58.75255541 +0000 UTC m=+0.085585603 container health_status aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Oct 14 05:50:58 localhost podman[267227]: 2025-10-14 09:50:58.760525187 +0000 UTC m=+0.093555420 container exec_died aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible)
Oct 14 05:50:58 localhost systemd[1]: aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec.service: Deactivated successfully.
Oct 14 05:50:58 localhost python3.9[267243]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-dhcp-agent/10-neutron-dhcp.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 05:50:59 localhost python3.9[267346]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-dhcp-agent/10-neutron-dhcp.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1760435458.4385748-499-260775816788321/.source.conf _original_basename=10-neutron-dhcp.conf follow=False checksum=401f2db3441c75ad5886350294091560f714495b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 14 05:51:00 localhost python3.9[267454]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/dhcp_agent_haproxy_wrapper follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 05:51:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23490 DF PROTO=TCP SPT=40836 DPT=9102 SEQ=1628618658 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B13D4DA0000000001030307)
Oct 14 05:51:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad.
Oct 14 05:51:00 localhost python3.9[267540]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/dhcp_agent_haproxy_wrapper mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1760435459.663786-544-118289725051775/.source follow=False _original_basename=haproxy.j2 checksum=e4288860049c1baef23f6e1bb6c6f91acb5432e7 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 14 05:51:00 localhost podman[267541]: 2025-10-14 09:51:00.728888846 +0000 UTC m=+0.069632768 container health_status 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true, org.label-schema.build-date=20251009, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Oct 14 05:51:00 localhost podman[267541]: 2025-10-14 09:51:00.76709231 +0000 UTC m=+0.107836252 container exec_died 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true, config_id=multipathd, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Oct 14 05:51:00 localhost systemd[1]: 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad.service: Deactivated successfully.
Oct 14 05:51:00 localhost nova_compute[238069]: 2025-10-14 09:51:00.911 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 14 05:51:01 localhost python3.9[267667]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/dhcp_agent_dnsmasq_wrapper follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 05:51:02 localhost python3.9[267753]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/dhcp_agent_dnsmasq_wrapper mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1760435460.8592901-544-153684285741074/.source follow=False _original_basename=dnsmasq.j2 checksum=efc19f376a79c40570368e9c2b979cde746f1ea8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 14 05:51:02 localhost python3.9[267861]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/kill_scripts/haproxy-kill follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 05:51:03 localhost python3.9[267916]: ansible-ansible.legacy.file Invoked with mode=0755 setype=container_file_t dest=/var/lib/neutron/kill_scripts/haproxy-kill _original_basename=kill-script.j2 recurse=False state=file path=/var/lib/neutron/kill_scripts/haproxy-kill force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 14 05:51:03 localhost nova_compute[238069]: 2025-10-14 09:51:03.534 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 14 05:51:03 localhost python3.9[268024]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/kill_scripts/dnsmasq-kill follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 05:51:04 localhost python3.9[268110]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/kill_scripts/dnsmasq-kill mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1760435463.3502064-631-91343984071849/.source follow=False _original_basename=kill-script.j2 checksum=2dfb5489f491f61b95691c3bf95fa1fe48ff3700 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 14 05:51:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f.
Oct 14 05:51:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917.
Oct 14 05:51:04 localhost podman[268145]: 2025-10-14 09:51:04.757861563 +0000 UTC m=+0.092316821 container health_status 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, distribution-scope=public, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, release=1755695350, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, version=9.6, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, vendor=Red Hat, Inc.) Oct 14 05:51:04 localhost systemd[1]: tmp-crun.LTaQwO.mount: Deactivated successfully. Oct 14 05:51:04 localhost podman[268144]: 2025-10-14 09:51:04.796556331 +0000 UTC m=+0.134896569 container health_status 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', 
'/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}) Oct 14 05:51:04 localhost podman[268145]: 2025-10-14 09:51:04.822879067 +0000 UTC m=+0.157334305 container exec_died 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, architecture=x86_64, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, config_id=edpm, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, distribution-scope=public, io.openshift.expose-services=, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., container_name=openstack_network_exporter) Oct 14 05:51:04 localhost systemd[1]: 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917.service: Deactivated successfully. 
Oct 14 05:51:04 localhost podman[268144]: 2025-10-14 09:51:04.868757188 +0000 UTC m=+0.207097346 container exec_died 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5) Oct 14 05:51:04 localhost systemd[1]: 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f.service: Deactivated successfully. 
Oct 14 05:51:05 localhost python3.9[268261]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Oct 14 05:51:05 localhost nova_compute[238069]: 2025-10-14 09:51:05.935 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:51:06 localhost python3.9[268373]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None Oct 14 05:51:07 localhost python3.9[268483]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True Oct 14 05:51:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d. 
Oct 14 05:51:07 localhost podman[268486]: 2025-10-14 09:51:07.747484157 +0000 UTC m=+0.085249602 container health_status 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, container_name=ceilometer_agent_compute, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true) Oct 14 05:51:07 localhost podman[268486]: 2025-10-14 09:51:07.788109355 +0000 UTC m=+0.125874780 container exec_died 
89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251009, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS) Oct 14 05:51:07 localhost systemd[1]: 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d.service: Deactivated successfully. 
Oct 14 05:51:08 localhost python3.9[268559]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Oct 14 05:51:08 localhost nova_compute[238069]: 2025-10-14 09:51:08.566 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:51:08 localhost python3.9[268669]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True Oct 14 05:51:08 localhost openstack_network_exporter[250374]: ERROR 09:51:08 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Oct 14 05:51:08 localhost openstack_network_exporter[250374]: ERROR 09:51:08 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Oct 14 05:51:08 localhost openstack_network_exporter[250374]: ERROR 09:51:08 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Oct 14 05:51:08 localhost openstack_network_exporter[250374]: ERROR 09:51:08 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Oct 14 05:51:08 localhost openstack_network_exporter[250374]: Oct 14 05:51:08 localhost openstack_network_exporter[250374]: ERROR 09:51:08 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Oct 14 05:51:08 localhost openstack_network_exporter[250374]: Oct 14 05:51:10 localhost 
python3.9[268726]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Oct 14 05:51:10 localhost python3.9[268836]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:51:10 localhost nova_compute[238069]: 2025-10-14 09:51:10.938 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:51:11 localhost python3.9[268946]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True Oct 14 05:51:11 localhost python3.9[269003]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:51:12 
localhost python3.9[269113]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True Oct 14 05:51:13 localhost python3.9[269170]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:51:13 localhost nova_compute[238069]: 2025-10-14 09:51:13.614 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:51:14 localhost python3.9[269280]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None Oct 14 05:51:14 localhost systemd[1]: Reloading. Oct 14 05:51:14 localhost systemd-rc-local-generator[269304]: /etc/rc.d/rc.local is not marked executable, skipping. Oct 14 05:51:14 localhost systemd-sysv-generator[269310]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Oct 14 05:51:14 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Oct 14 05:51:15 localhost python3.9[269427]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True Oct 14 05:51:15 localhost python3.9[269484]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:51:15 localhost nova_compute[238069]: 2025-10-14 09:51:15.941 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:51:16 localhost python3.9[269594]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True Oct 14 05:51:16 localhost python3.9[269651]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:51:17 localhost python3.9[269761]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False 
force=None masked=None Oct 14 05:51:17 localhost systemd[1]: Reloading. Oct 14 05:51:17 localhost systemd-sysv-generator[269790]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Oct 14 05:51:17 localhost systemd-rc-local-generator[269787]: /etc/rc.d/rc.local is not marked executable, skipping. Oct 14 05:51:17 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Oct 14 05:51:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857. Oct 14 05:51:18 localhost systemd[1]: Starting Create netns directory... Oct 14 05:51:18 localhost systemd[1]: run-netns-placeholder.mount: Deactivated successfully. Oct 14 05:51:18 localhost systemd[1]: netns-placeholder.service: Deactivated successfully. Oct 14 05:51:18 localhost systemd[1]: Finished Create netns directory. 
Oct 14 05:51:18 localhost podman[269799]: 2025-10-14 09:51:18.238869927 +0000 UTC m=+0.066793300 container health_status 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image) Oct 14 05:51:18 localhost podman[269799]: 2025-10-14 09:51:18.275100189 +0000 UTC 
m=+0.103023562 container exec_died 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.build-date=20251009, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_metadata_agent, io.buildah.version=1.41.3) Oct 14 05:51:18 localhost systemd[1]: 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857.service: Deactivated successfully. 
Oct 14 05:51:18 localhost nova_compute[238069]: 2025-10-14 09:51:18.616 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 14 05:51:19 localhost python3.9[269933]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 14 05:51:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041.
Oct 14 05:51:20 localhost systemd[1]: tmp-crun.ARjKEZ.mount: Deactivated successfully.
Oct 14 05:51:20 localhost podman[270044]: 2025-10-14 09:51:20.037319644 +0000 UTC m=+0.082787665 container health_status 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct 14 05:51:20 localhost podman[270044]: 2025-10-14 09:51:20.047336204 +0000 UTC m=+0.092804215 container exec_died 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible)
Oct 14 05:51:20 localhost systemd[1]: 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041.service: Deactivated successfully.
Oct 14 05:51:20 localhost python3.9[270043]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/neutron_dhcp_agent.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 05:51:20 localhost python3.9[270154]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/neutron_dhcp_agent.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1760435479.6393862-1075-52800884096978/.source.json _original_basename=.exto0kd9 follow=False checksum=c62829c98c0f9e788d62f52aa71fba276cd98270 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 05:51:20 localhost nova_compute[238069]: 2025-10-14 09:51:20.943 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 14 05:51:22 localhost python3.9[270264]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/neutron_dhcp state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 05:51:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16074 DF PROTO=TCP SPT=36808 DPT=9102 SEQ=862389494 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B142E350000000001030307)
Oct 14 05:51:23 localhost nova_compute[238069]: 2025-10-14 09:51:23.649 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 14 05:51:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16075 DF PROTO=TCP SPT=36808 DPT=9102 SEQ=862389494 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B14325A0000000001030307)
Oct 14 05:51:25 localhost nova_compute[238069]: 2025-10-14 09:51:25.945 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 14 05:51:26 localhost python3.9[270572]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/neutron_dhcp config_pattern=*.json debug=False
Oct 14 05:51:26 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16076 DF PROTO=TCP SPT=36808 DPT=9102 SEQ=862389494 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B143A5A0000000001030307)
Oct 14 05:51:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29.
Oct 14 05:51:26 localhost podman[270683]: 2025-10-14 09:51:26.977451143 +0000 UTC m=+0.079578735 container health_status fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, container_name=iscsid, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, config_id=iscsid, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 14 05:51:26 localhost podman[270683]: 2025-10-14 09:51:26.991111916 +0000 UTC m=+0.093239548 container exec_died fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=iscsid, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251009)
Oct 14 05:51:27 localhost systemd[1]: fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29.service: Deactivated successfully.
Oct 14 05:51:27 localhost python3.9[270682]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Oct 14 05:51:28 localhost python3.9[270811]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Oct 14 05:51:28 localhost podman[248187]: time="2025-10-14T09:51:28Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 14 05:51:28 localhost podman[248187]: @ - - [14/Oct/2025:09:51:28 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 136010 "" "Go-http-client/1.1"
Oct 14 05:51:28 localhost podman[248187]: @ - - [14/Oct/2025:09:51:28 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 16921 "" "Go-http-client/1.1"
Oct 14 05:51:28 localhost nova_compute[238069]: 2025-10-14 09:51:28.679 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 14 05:51:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec.
Oct 14 05:51:28 localhost podman[270892]: 2025-10-14 09:51:28.903599236 +0000 UTC m=+0.077317227 container health_status aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors )
Oct 14 05:51:28 localhost podman[270892]: 2025-10-14 09:51:28.936767722 +0000 UTC m=+0.110485703 container exec_died aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Oct 14 05:51:28 localhost systemd[1]: aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec.service: Deactivated successfully.
Oct 14 05:51:30 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16077 DF PROTO=TCP SPT=36808 DPT=9102 SEQ=862389494 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B144A1A0000000001030307)
Oct 14 05:51:30 localhost nova_compute[238069]: 2025-10-14 09:51:30.949 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 14 05:51:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad.
Oct 14 05:51:31 localhost podman[270947]: 2025-10-14 09:51:31.732518391 +0000 UTC m=+0.077962565 container health_status 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, container_name=multipathd, config_id=multipathd, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Oct 14 05:51:31 localhost podman[270947]: 2025-10-14 09:51:31.744320458 +0000 UTC m=+0.089764702 container exec_died 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Oct 14 05:51:31 localhost systemd[1]: 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad.service: Deactivated successfully.
Oct 14 05:51:32 localhost python3[271057]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/neutron_dhcp config_id=neutron_dhcp config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Oct 14 05:51:32 localhost podman[271094]:
Oct 14 05:51:32 localhost podman[271094]: 2025-10-14 09:51:32.757977485 +0000 UTC m=+0.080285417 container create 945dbb010a03f342fd36db5f1523fc333434e451d334f43332b02d97a2f8c6c8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron_dhcp_agent, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b8dff637c609f29731aa082e582f8de0f51fc64353d59ddee41fa9e82c2b18d3'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/netns:/run/netns:shared', '/var/lib/config-data/ansible-generated/neutron-dhcp-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_dhcp_agent.json:/var/lib/kolla/config_files/config.json:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron/dhcp_agent_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/dhcp_agent_dnsmasq_wrapper:/usr/local/bin/dnsmasq:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, config_id=neutron_dhcp, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=neutron_dhcp_agent, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, managed_by=edpm_ansible)
Oct 14 05:51:32 localhost podman[271094]: 2025-10-14 09:51:32.708216054 +0000 UTC m=+0.030524096 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Oct 14 05:51:32 localhost python3[271057]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name neutron_dhcp_agent --cgroupns=host --conmon-pidfile /run/neutron_dhcp_agent.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=b8dff637c609f29731aa082e582f8de0f51fc64353d59ddee41fa9e82c2b18d3 --label config_id=neutron_dhcp --label container_name=neutron_dhcp_agent --label managed_by=edpm_ansible --label config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b8dff637c609f29731aa082e582f8de0f51fc64353d59ddee41fa9e82c2b18d3'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/netns:/run/netns:shared', '/var/lib/config-data/ansible-generated/neutron-dhcp-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_dhcp_agent.json:/var/lib/kolla/config_files/config.json:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron/dhcp_agent_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/dhcp_agent_dnsmasq_wrapper:/usr/local/bin/dnsmasq:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']} --log-driver journald --log-level info --network host --pid host --privileged=True --user root --volume /run/netns:/run/netns:shared --volume /var/lib/config-data/ansible-generated/neutron-dhcp-agent:/etc/neutron.conf.d:z --volume /var/lib/neutron:/var/lib/neutron:shared,z --volume /var/lib/kolla/config_files/neutron_dhcp_agent.json:/var/lib/kolla/config_files/config.json:ro --volume /run/openvswitch:/run/openvswitch:shared,z --volume /var/lib/neutron/dhcp_agent_haproxy_wrapper:/usr/local/bin/haproxy:ro --volume /var/lib/neutron/dhcp_agent_dnsmasq_wrapper:/usr/local/bin/dnsmasq:ro --volume /var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro --volume /var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Oct 14 05:51:33 localhost nova_compute[238069]: 2025-10-14 09:51:33.698 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 14 05:51:34 localhost python3.9[271258]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 14 05:51:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f.
Oct 14 05:51:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917.
Oct 14 05:51:35 localhost systemd[1]: tmp-crun.sKcNuN.mount: Deactivated successfully.
Oct 14 05:51:35 localhost podman[271370]: 2025-10-14 09:51:35.460373641 +0000 UTC m=+0.078977377 container health_status 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, io.buildah.version=1.41.3)
Oct 14 05:51:35 localhost podman[271370]: 2025-10-14 09:51:35.485339675 +0000 UTC m=+0.103943421 container exec_died 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_controller, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Oct 14 05:51:35 localhost systemd[1]: 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f.service: Deactivated successfully.
Oct 14 05:51:35 localhost podman[271372]: 2025-10-14 09:51:35.434220792 +0000 UTC m=+0.053439847 container health_status 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, io.openshift.expose-services=, maintainer=Red Hat, Inc., name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, com.redhat.component=ubi9-minimal-container, build-date=2025-08-20T13:12:41)
Oct 14 05:51:35 localhost podman[271372]: 2025-10-14 09:51:35.564649401 +0000 UTC m=+0.183868476 container exec_died 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, io.openshift.tags=minimal rhel9, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., name=ubi9-minimal, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, config_id=edpm, managed_by=edpm_ansible, io.openshift.expose-services=, version=9.6)
Oct 14 05:51:35 localhost systemd[1]: 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917.service: Deactivated successfully.
Oct 14 05:51:35 localhost python3.9[271371]: ansible-file Invoked with path=/etc/systemd/system/edpm_neutron_dhcp_agent.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 05:51:35 localhost nova_compute[238069]: 2025-10-14 09:51:35.951 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 14 05:51:36 localhost python3.9[271468]: ansible-stat Invoked with path=/etc/systemd/system/edpm_neutron_dhcp_agent_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 14 05:51:37 localhost python3.9[271577]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1760435496.691304-1339-92956763561276/source dest=/etc/systemd/system/edpm_neutron_dhcp_agent.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 05:51:37 localhost python3.9[271632]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct 14 05:51:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d.
Oct 14 05:51:37 localhost systemd[1]: Reloading.
Oct 14 05:51:38 localhost podman[271634]: 2025-10-14 09:51:38.055443113 +0000 UTC m=+0.110381590 container health_status 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, config_id=edpm)
Oct 14 05:51:38 localhost systemd-rc-local-generator[271674]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 05:51:38 localhost systemd-sysv-generator[271679]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 05:51:38 localhost podman[271634]: 2025-10-14 09:51:38.088009632 +0000 UTC m=+0.142948059 container exec_died 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251009, tcib_managed=true, config_id=edpm, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d,
container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3) Oct 14 05:51:38 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Oct 14 05:51:38 localhost systemd[1]: 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d.service: Deactivated successfully. Oct 14 05:51:38 localhost nova_compute[238069]: 2025-10-14 09:51:38.729 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:51:38 localhost openstack_network_exporter[250374]: ERROR 09:51:38 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Oct 14 05:51:38 localhost openstack_network_exporter[250374]: ERROR 09:51:38 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Oct 14 05:51:38 localhost openstack_network_exporter[250374]: ERROR 09:51:38 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Oct 14 05:51:38 localhost openstack_network_exporter[250374]: ERROR 09:51:38 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Oct 14 05:51:38 localhost openstack_network_exporter[250374]: Oct 14 05:51:38 localhost openstack_network_exporter[250374]: ERROR 09:51:38 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Oct 14 05:51:38 localhost openstack_network_exporter[250374]: Oct 14 05:51:39 localhost python3.9[271740]: ansible-systemd Invoked with state=restarted name=edpm_neutron_dhcp_agent.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Oct 14 05:51:39 localhost systemd[1]: Reloading. 
Oct 14 05:51:39 localhost systemd-rc-local-generator[271763]: /etc/rc.d/rc.local is not marked executable, skipping. Oct 14 05:51:39 localhost systemd-sysv-generator[271767]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Oct 14 05:51:39 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Oct 14 05:51:40 localhost systemd[1]: Starting neutron_dhcp_agent container... Oct 14 05:51:40 localhost systemd[1]: Started libcrun container. Oct 14 05:51:40 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cb7f4064898a0807c6ba6ea01b9a7533125b8c1271a59860818c881a6ea63b44/merged/etc/neutron.conf.d supports timestamps until 2038 (0x7fffffff) Oct 14 05:51:40 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cb7f4064898a0807c6ba6ea01b9a7533125b8c1271a59860818c881a6ea63b44/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Oct 14 05:51:40 localhost podman[271781]: 2025-10-14 09:51:40.327599844 +0000 UTC m=+0.126612413 container init 945dbb010a03f342fd36db5f1523fc333434e451d334f43332b02d97a2f8c6c8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron_dhcp_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b8dff637c609f29731aa082e582f8de0f51fc64353d59ddee41fa9e82c2b18d3'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified', 'net': 'host', 'pid': 
'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/netns:/run/netns:shared', '/var/lib/config-data/ansible-generated/neutron-dhcp-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_dhcp_agent.json:/var/lib/kolla/config_files/config.json:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron/dhcp_agent_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/dhcp_agent_dnsmasq_wrapper:/usr/local/bin/dnsmasq:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=neutron_dhcp, container_name=neutron_dhcp_agent, org.label-schema.build-date=20251009, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5) Oct 14 05:51:40 localhost podman[271781]: 2025-10-14 09:51:40.337849741 +0000 UTC m=+0.136862330 container start 945dbb010a03f342fd36db5f1523fc333434e451d334f43332b02d97a2f8c6c8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron_dhcp_agent, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b8dff637c609f29731aa082e582f8de0f51fc64353d59ddee41fa9e82c2b18d3'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/netns:/run/netns:shared', '/var/lib/config-data/ansible-generated/neutron-dhcp-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_dhcp_agent.json:/var/lib/kolla/config_files/config.json:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron/dhcp_agent_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/dhcp_agent_dnsmasq_wrapper:/usr/local/bin/dnsmasq:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_id=neutron_dhcp, container_name=neutron_dhcp_agent) Oct 14 05:51:40 localhost podman[271781]: neutron_dhcp_agent Oct 14 05:51:40 localhost neutron_dhcp_agent[271795]: + sudo -E kolla_set_configs Oct 14 05:51:40 localhost systemd[1]: Started neutron_dhcp_agent container. Oct 14 05:51:40 localhost neutron_dhcp_agent[271795]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json Oct 14 05:51:40 localhost neutron_dhcp_agent[271795]: INFO:__main__:Validating config file Oct 14 05:51:40 localhost neutron_dhcp_agent[271795]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS Oct 14 05:51:40 localhost neutron_dhcp_agent[271795]: INFO:__main__:Copying service configuration files Oct 14 05:51:40 localhost neutron_dhcp_agent[271795]: INFO:__main__:Deleting /etc/neutron/rootwrap.conf Oct 14 05:51:40 localhost neutron_dhcp_agent[271795]: INFO:__main__:Copying /etc/neutron.conf.d/01-rootwrap.conf to /etc/neutron/rootwrap.conf Oct 14 05:51:40 localhost neutron_dhcp_agent[271795]: INFO:__main__:Setting permission for /etc/neutron/rootwrap.conf Oct 14 05:51:40 localhost neutron_dhcp_agent[271795]: INFO:__main__:Writing out command to execute Oct 14 05:51:40 localhost neutron_dhcp_agent[271795]: INFO:__main__:Setting permission for /var/lib/neutron Oct 14 05:51:40 localhost neutron_dhcp_agent[271795]: INFO:__main__:Setting permission for 
/var/lib/neutron/kill_scripts Oct 14 05:51:40 localhost neutron_dhcp_agent[271795]: INFO:__main__:Setting permission for /var/lib/neutron/.cache Oct 14 05:51:40 localhost neutron_dhcp_agent[271795]: INFO:__main__:Setting permission for /var/lib/neutron/external Oct 14 05:51:40 localhost neutron_dhcp_agent[271795]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy Oct 14 05:51:40 localhost neutron_dhcp_agent[271795]: INFO:__main__:Setting permission for /var/lib/neutron/ns-metadata-proxy Oct 14 05:51:40 localhost neutron_dhcp_agent[271795]: INFO:__main__:Setting permission for /var/lib/neutron/ovn_metadata_haproxy_wrapper Oct 14 05:51:40 localhost neutron_dhcp_agent[271795]: INFO:__main__:Setting permission for /var/lib/neutron/metadata_proxy Oct 14 05:51:40 localhost neutron_dhcp_agent[271795]: INFO:__main__:Setting permission for /var/lib/neutron/dhcp_agent_haproxy_wrapper Oct 14 05:51:40 localhost neutron_dhcp_agent[271795]: INFO:__main__:Setting permission for /var/lib/neutron/dhcp_agent_dnsmasq_wrapper Oct 14 05:51:40 localhost neutron_dhcp_agent[271795]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/haproxy-kill Oct 14 05:51:40 localhost neutron_dhcp_agent[271795]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/dnsmasq-kill Oct 14 05:51:40 localhost neutron_dhcp_agent[271795]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints Oct 14 05:51:40 localhost neutron_dhcp_agent[271795]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints/333254bb87316156e96cebc0941f89c4b6bf7d0c72b62f2bd2e3f232ec27cb23 Oct 14 05:51:40 localhost neutron_dhcp_agent[271795]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints/cd0de74397aa76b626744172300028943e2372ca220b3e27b1c7d2b66ff2832c Oct 14 05:51:40 localhost neutron_dhcp_agent[271795]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids Oct 14 05:51:40 localhost 
neutron_dhcp_agent[271795]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids/7d0cd696-bdd7-4e70-9512-eb0d23640314.pid.haproxy Oct 14 05:51:40 localhost neutron_dhcp_agent[271795]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy/7d0cd696-bdd7-4e70-9512-eb0d23640314.conf Oct 14 05:51:40 localhost neutron_dhcp_agent[271795]: ++ cat /run_command Oct 14 05:51:40 localhost neutron_dhcp_agent[271795]: + CMD=/usr/bin/neutron-dhcp-agent Oct 14 05:51:40 localhost neutron_dhcp_agent[271795]: + ARGS= Oct 14 05:51:40 localhost neutron_dhcp_agent[271795]: + sudo kolla_copy_cacerts Oct 14 05:51:40 localhost neutron_dhcp_agent[271795]: + [[ ! -n '' ]] Oct 14 05:51:40 localhost neutron_dhcp_agent[271795]: + . kolla_extend_start Oct 14 05:51:40 localhost neutron_dhcp_agent[271795]: Running command: '/usr/bin/neutron-dhcp-agent' Oct 14 05:51:40 localhost neutron_dhcp_agent[271795]: + echo 'Running command: '\''/usr/bin/neutron-dhcp-agent'\''' Oct 14 05:51:40 localhost neutron_dhcp_agent[271795]: + umask 0022 Oct 14 05:51:40 localhost neutron_dhcp_agent[271795]: + exec /usr/bin/neutron-dhcp-agent Oct 14 05:51:40 localhost nova_compute[238069]: 2025-10-14 09:51:40.955 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:51:41 localhost nova_compute[238069]: 2025-10-14 09:51:41.020 2 DEBUG oslo_service.periodic_task [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 05:51:41 localhost python3.9[271919]: ansible-ansible.builtin.systemd Invoked with name=edpm_neutron_dhcp_agent.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None Oct 14 05:51:41 localhost neutron_dhcp_agent[271795]: 2025-10-14 09:51:41.696 
271799 INFO neutron.common.config [-] Logging enabled!#033[00m Oct 14 05:51:41 localhost neutron_dhcp_agent[271795]: 2025-10-14 09:51:41.696 271799 INFO neutron.common.config [-] /usr/bin/neutron-dhcp-agent version 22.2.2.dev43#033[00m Oct 14 05:51:42 localhost neutron_dhcp_agent[271795]: 2025-10-14 09:51:42.073 271799 INFO neutron.agent.dhcp.agent [-] Synchronizing state#033[00m Oct 14 05:51:42 localhost systemd[1]: Stopping neutron_dhcp_agent container... Oct 14 05:51:42 localhost systemd[1]: libpod-945dbb010a03f342fd36db5f1523fc333434e451d334f43332b02d97a2f8c6c8.scope: Deactivated successfully. Oct 14 05:51:42 localhost systemd[1]: libpod-945dbb010a03f342fd36db5f1523fc333434e451d334f43332b02d97a2f8c6c8.scope: Consumed 2.109s CPU time. Oct 14 05:51:42 localhost podman[271924]: 2025-10-14 09:51:42.905046791 +0000 UTC m=+0.446438000 container died 945dbb010a03f342fd36db5f1523fc333434e451d334f43332b02d97a2f8c6c8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron_dhcp_agent, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20251009, tcib_managed=true, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=neutron_dhcp_agent, config_id=neutron_dhcp, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b8dff637c609f29731aa082e582f8de0f51fc64353d59ddee41fa9e82c2b18d3'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/netns:/run/netns:shared', '/var/lib/config-data/ansible-generated/neutron-dhcp-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', 
'/var/lib/kolla/config_files/neutron_dhcp_agent.json:/var/lib/kolla/config_files/config.json:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron/dhcp_agent_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/dhcp_agent_dnsmasq_wrapper:/usr/local/bin/dnsmasq:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}) Oct 14 05:51:42 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-945dbb010a03f342fd36db5f1523fc333434e451d334f43332b02d97a2f8c6c8-userdata-shm.mount: Deactivated successfully. Oct 14 05:51:43 localhost podman[271924]: 2025-10-14 09:51:43.015838682 +0000 UTC m=+0.557229911 container cleanup 945dbb010a03f342fd36db5f1523fc333434e451d334f43332b02d97a2f8c6c8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron_dhcp_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b8dff637c609f29731aa082e582f8de0f51fc64353d59ddee41fa9e82c2b18d3'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/netns:/run/netns:shared', '/var/lib/config-data/ansible-generated/neutron-dhcp-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_dhcp_agent.json:/var/lib/kolla/config_files/config.json:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron/dhcp_agent_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/dhcp_agent_dnsmasq_wrapper:/usr/local/bin/dnsmasq:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, config_id=neutron_dhcp, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251009, tcib_managed=true, container_name=neutron_dhcp_agent, org.label-schema.license=GPLv2) Oct 14 05:51:43 localhost podman[271924]: neutron_dhcp_agent Oct 14 05:51:43 localhost podman[271966]: error opening file `/run/crun/945dbb010a03f342fd36db5f1523fc333434e451d334f43332b02d97a2f8c6c8/status`: No such file or directory Oct 14 05:51:43 localhost podman[271954]: 2025-10-14 09:51:43.124496338 +0000 UTC m=+0.074615822 container cleanup 945dbb010a03f342fd36db5f1523fc333434e451d334f43332b02d97a2f8c6c8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron_dhcp_agent, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, config_id=neutron_dhcp, container_name=neutron_dhcp_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b8dff637c609f29731aa082e582f8de0f51fc64353d59ddee41fa9e82c2b18d3'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/netns:/run/netns:shared', '/var/lib/config-data/ansible-generated/neutron-dhcp-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_dhcp_agent.json:/var/lib/kolla/config_files/config.json:ro', '/run/openvswitch:/run/openvswitch:shared,z', 
'/var/lib/neutron/dhcp_agent_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/dhcp_agent_dnsmasq_wrapper:/usr/local/bin/dnsmasq:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, tcib_managed=true) Oct 14 05:51:43 localhost podman[271954]: neutron_dhcp_agent Oct 14 05:51:43 localhost systemd[1]: edpm_neutron_dhcp_agent.service: Deactivated successfully. Oct 14 05:51:43 localhost systemd[1]: Stopped neutron_dhcp_agent container. Oct 14 05:51:43 localhost systemd[1]: Starting neutron_dhcp_agent container... Oct 14 05:51:43 localhost systemd[1]: Started libcrun container. Oct 14 05:51:43 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cb7f4064898a0807c6ba6ea01b9a7533125b8c1271a59860818c881a6ea63b44/merged/etc/neutron.conf.d supports timestamps until 2038 (0x7fffffff) Oct 14 05:51:43 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cb7f4064898a0807c6ba6ea01b9a7533125b8c1271a59860818c881a6ea63b44/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Oct 14 05:51:43 localhost podman[271968]: 2025-10-14 09:51:43.274487123 +0000 UTC m=+0.109881534 container init 945dbb010a03f342fd36db5f1523fc333434e451d334f43332b02d97a2f8c6c8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron_dhcp_agent, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_id=neutron_dhcp, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b8dff637c609f29731aa082e582f8de0f51fc64353d59ddee41fa9e82c2b18d3'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': 
['/run/netns:/run/netns:shared', '/var/lib/config-data/ansible-generated/neutron-dhcp-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_dhcp_agent.json:/var/lib/kolla/config_files/config.json:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron/dhcp_agent_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/dhcp_agent_dnsmasq_wrapper:/usr/local/bin/dnsmasq:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, container_name=neutron_dhcp_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2) Oct 14 05:51:43 localhost podman[271968]: 2025-10-14 09:51:43.281664776 +0000 UTC m=+0.117059227 container start 945dbb010a03f342fd36db5f1523fc333434e451d334f43332b02d97a2f8c6c8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron_dhcp_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b8dff637c609f29731aa082e582f8de0f51fc64353d59ddee41fa9e82c2b18d3'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/netns:/run/netns:shared', '/var/lib/config-data/ansible-generated/neutron-dhcp-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_dhcp_agent.json:/var/lib/kolla/config_files/config.json:ro', '/run/openvswitch:/run/openvswitch:shared,z', 
'/var/lib/neutron/dhcp_agent_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/dhcp_agent_dnsmasq_wrapper:/usr/local/bin/dnsmasq:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_id=neutron_dhcp, tcib_managed=true, container_name=neutron_dhcp_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS) Oct 14 05:51:43 localhost podman[271968]: neutron_dhcp_agent Oct 14 05:51:43 localhost neutron_dhcp_agent[271983]: + sudo -E kolla_set_configs Oct 14 05:51:43 localhost systemd[1]: Started neutron_dhcp_agent container. Oct 14 05:51:43 localhost neutron_dhcp_agent[271983]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json Oct 14 05:51:43 localhost neutron_dhcp_agent[271983]: INFO:__main__:Validating config file Oct 14 05:51:43 localhost neutron_dhcp_agent[271983]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS Oct 14 05:51:43 localhost neutron_dhcp_agent[271983]: INFO:__main__:Copying service configuration files Oct 14 05:51:43 localhost neutron_dhcp_agent[271983]: INFO:__main__:Deleting /etc/neutron/rootwrap.conf Oct 14 05:51:43 localhost neutron_dhcp_agent[271983]: INFO:__main__:Copying /etc/neutron.conf.d/01-rootwrap.conf to /etc/neutron/rootwrap.conf Oct 14 05:51:43 localhost neutron_dhcp_agent[271983]: INFO:__main__:Setting permission for /etc/neutron/rootwrap.conf Oct 14 05:51:43 localhost neutron_dhcp_agent[271983]: INFO:__main__:Writing out command to execute Oct 14 05:51:43 localhost neutron_dhcp_agent[271983]: INFO:__main__:Setting permission for /var/lib/neutron Oct 14 05:51:43 localhost neutron_dhcp_agent[271983]: INFO:__main__:Setting permission for 
/var/lib/neutron/kill_scripts Oct 14 05:51:43 localhost neutron_dhcp_agent[271983]: INFO:__main__:Setting permission for /var/lib/neutron/.cache Oct 14 05:51:43 localhost neutron_dhcp_agent[271983]: INFO:__main__:Setting permission for /var/lib/neutron/external Oct 14 05:51:43 localhost neutron_dhcp_agent[271983]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy Oct 14 05:51:43 localhost neutron_dhcp_agent[271983]: INFO:__main__:Setting permission for /var/lib/neutron/ns-metadata-proxy Oct 14 05:51:43 localhost neutron_dhcp_agent[271983]: INFO:__main__:Setting permission for /var/lib/neutron/dhcp Oct 14 05:51:43 localhost neutron_dhcp_agent[271983]: INFO:__main__:Setting permission for /var/lib/neutron/ovn_metadata_haproxy_wrapper Oct 14 05:51:43 localhost neutron_dhcp_agent[271983]: INFO:__main__:Setting permission for /var/lib/neutron/metadata_proxy Oct 14 05:51:43 localhost neutron_dhcp_agent[271983]: INFO:__main__:Setting permission for /var/lib/neutron/dhcp_agent_haproxy_wrapper Oct 14 05:51:43 localhost neutron_dhcp_agent[271983]: INFO:__main__:Setting permission for /var/lib/neutron/dhcp_agent_dnsmasq_wrapper Oct 14 05:51:43 localhost neutron_dhcp_agent[271983]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/haproxy-kill Oct 14 05:51:43 localhost neutron_dhcp_agent[271983]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/dnsmasq-kill Oct 14 05:51:43 localhost neutron_dhcp_agent[271983]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints Oct 14 05:51:43 localhost neutron_dhcp_agent[271983]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints/333254bb87316156e96cebc0941f89c4b6bf7d0c72b62f2bd2e3f232ec27cb23 Oct 14 05:51:43 localhost neutron_dhcp_agent[271983]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints/cd0de74397aa76b626744172300028943e2372ca220b3e27b1c7d2b66ff2832c Oct 14 05:51:43 localhost 
neutron_dhcp_agent[271983]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids Oct 14 05:51:43 localhost neutron_dhcp_agent[271983]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids/7d0cd696-bdd7-4e70-9512-eb0d23640314.pid.haproxy Oct 14 05:51:43 localhost neutron_dhcp_agent[271983]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy/7d0cd696-bdd7-4e70-9512-eb0d23640314.conf Oct 14 05:51:43 localhost neutron_dhcp_agent[271983]: ++ cat /run_command Oct 14 05:51:43 localhost neutron_dhcp_agent[271983]: + CMD=/usr/bin/neutron-dhcp-agent Oct 14 05:51:43 localhost neutron_dhcp_agent[271983]: + ARGS= Oct 14 05:51:43 localhost neutron_dhcp_agent[271983]: + sudo kolla_copy_cacerts Oct 14 05:51:43 localhost neutron_dhcp_agent[271983]: + [[ ! -n '' ]] Oct 14 05:51:43 localhost neutron_dhcp_agent[271983]: + . kolla_extend_start Oct 14 05:51:43 localhost neutron_dhcp_agent[271983]: + echo 'Running command: '\''/usr/bin/neutron-dhcp-agent'\''' Oct 14 05:51:43 localhost neutron_dhcp_agent[271983]: Running command: '/usr/bin/neutron-dhcp-agent' Oct 14 05:51:43 localhost neutron_dhcp_agent[271983]: + umask 0022 Oct 14 05:51:43 localhost neutron_dhcp_agent[271983]: + exec /usr/bin/neutron-dhcp-agent Oct 14 05:51:43 localhost nova_compute[238069]: 2025-10-14 09:51:43.733 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:51:43 localhost systemd[1]: session-60.scope: Deactivated successfully. Oct 14 05:51:43 localhost systemd[1]: session-60.scope: Consumed 34.739s CPU time. Oct 14 05:51:43 localhost systemd-logind[760]: Session 60 logged out. Waiting for processes to exit. Oct 14 05:51:43 localhost systemd-logind[760]: Removed session 60. 
Oct 14 05:51:44 localhost neutron_dhcp_agent[271983]: 2025-10-14 09:51:44.517 271987 INFO neutron.common.config [-] Logging enabled!#033[00m Oct 14 05:51:44 localhost neutron_dhcp_agent[271983]: 2025-10-14 09:51:44.517 271987 INFO neutron.common.config [-] /usr/bin/neutron-dhcp-agent version 22.2.2.dev43#033[00m Oct 14 05:51:44 localhost neutron_dhcp_agent[271983]: 2025-10-14 09:51:44.903 271987 INFO neutron.agent.dhcp.agent [-] Synchronizing state#033[00m Oct 14 05:51:45 localhost nova_compute[238069]: 2025-10-14 09:51:45.024 2 DEBUG oslo_service.periodic_task [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 05:51:45 localhost nova_compute[238069]: 2025-10-14 09:51:45.958 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:51:46 localhost nova_compute[238069]: 2025-10-14 09:51:46.024 2 DEBUG oslo_service.periodic_task [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 05:51:46 localhost nova_compute[238069]: 2025-10-14 09:51:46.025 2 DEBUG nova.compute.manager [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Cleaning up deleted instances with incomplete migration _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m Oct 14 05:51:46 localhost neutron_dhcp_agent[271983]: 2025-10-14 09:51:46.824 271987 INFO neutron.agent.dhcp.agent [None req-db140434-d2e0-4a45-bfca-6b968590ea0b - - - - - -] All active networks have been fetched through RPC.#033[00m Oct 14 05:51:46 localhost neutron_dhcp_agent[271983]: 2025-10-14 09:51:46.825 271987 INFO neutron.agent.dhcp.agent [-] 
Starting network c0145816-4627-44f2-af00-ccc9ef0436ed dhcp configuration#033[00m Oct 14 05:51:46 localhost neutron_dhcp_agent[271983]: 2025-10-14 09:51:46.877 271987 INFO neutron.agent.dhcp.agent [-] Starting network 7d0cd696-bdd7-4e70-9512-eb0d23640314 dhcp configuration#033[00m Oct 14 05:51:47 localhost nova_compute[238069]: 2025-10-14 09:51:47.024 2 DEBUG oslo_service.periodic_task [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 05:51:47 localhost ovn_metadata_agent[163050]: 2025-10-14 09:51:47.330 163055 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=4, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'b6:6b:50', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '6a:59:81:01:bc:8b'}, ipsec=False) old=SB_Global(nb_cfg=3) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Oct 14 05:51:47 localhost ovn_metadata_agent[163050]: 2025-10-14 09:51:47.331 163055 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m Oct 14 05:51:47 localhost ovn_metadata_agent[163050]: 2025-10-14 09:51:47.332 163055 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=9e4b0f79-1220-4c7d-a18d-fa1a88dab362, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '4'}),), if_exists=True) do_commit 
/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Oct 14 05:51:47 localhost nova_compute[238069]: 2025-10-14 09:51:47.333 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:51:47 localhost neutron_dhcp_agent[271983]: 2025-10-14 09:51:47.659 271987 INFO oslo.privsep.daemon [None req-240277d7-24a3-4f27-8c2d-33fd9a0b94e2 - - - - - -] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.default', '--privsep_sock_path', '/tmp/tmpegy9k_jk/privsep.sock']#033[00m Oct 14 05:51:48 localhost nova_compute[238069]: 2025-10-14 09:51:48.042 2 DEBUG oslo_service.periodic_task [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 05:51:48 localhost neutron_dhcp_agent[271983]: 2025-10-14 09:51:48.249 271987 INFO oslo.privsep.daemon [None req-240277d7-24a3-4f27-8c2d-33fd9a0b94e2 - - - - - -] Spawned new privsep daemon via rootwrap#033[00m Oct 14 05:51:48 localhost neutron_dhcp_agent[271983]: 2025-10-14 09:51:48.135 272020 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m Oct 14 05:51:48 localhost neutron_dhcp_agent[271983]: 2025-10-14 09:51:48.141 272020 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m Oct 14 05:51:48 localhost neutron_dhcp_agent[271983]: 2025-10-14 09:51:48.145 272020 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/none#033[00m Oct 14 05:51:48 localhost 
neutron_dhcp_agent[271983]: 2025-10-14 09:51:48.145 272020 INFO oslo.privsep.daemon [-] privsep daemon running as pid 272020#033[00m Oct 14 05:51:48 localhost neutron_dhcp_agent[271983]: 2025-10-14 09:51:48.252 271987 WARNING oslo_privsep.priv_context [None req-1f72d2a1-ed46-4f5f-8246-2120d1f1191a - - - - - -] privsep daemon already running#033[00m Oct 14 05:51:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857. Oct 14 05:51:48 localhost podman[272026]: 2025-10-14 09:51:48.717980006 +0000 UTC m=+0.063617912 container health_status 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251009, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}) Oct 14 05:51:48 localhost podman[272026]: 2025-10-14 09:51:48.725996654 +0000 UTC m=+0.071634630 container exec_died 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251009, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent) Oct 14 05:51:48 localhost nova_compute[238069]: 2025-10-14 09:51:48.735 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:51:48 localhost systemd[1]: 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857.service: Deactivated successfully. Oct 14 05:51:48 localhost neutron_dhcp_agent[271983]: 2025-10-14 09:51:48.806 271987 INFO oslo.privsep.daemon [None req-240277d7-24a3-4f27-8c2d-33fd9a0b94e2 - - - - - -] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.namespace_cmd', '--privsep_sock_path', '/tmp/tmpu_lkpli3/privsep.sock']#033[00m Oct 14 05:51:49 localhost nova_compute[238069]: 2025-10-14 09:51:49.024 2 DEBUG oslo_service.periodic_task [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 05:51:49 localhost nova_compute[238069]: 2025-10-14 09:51:49.024 2 DEBUG nova.compute.manager [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Oct 14 05:51:49 localhost nova_compute[238069]: 2025-10-14 09:51:49.025 2 DEBUG nova.compute.manager [None 
req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Oct 14 05:51:49 localhost neutron_dhcp_agent[271983]: 2025-10-14 09:51:49.377 271987 INFO oslo.privsep.daemon [None req-240277d7-24a3-4f27-8c2d-33fd9a0b94e2 - - - - - -] Spawned new privsep daemon via rootwrap#033[00m Oct 14 05:51:49 localhost neutron_dhcp_agent[271983]: 2025-10-14 09:51:49.281 272048 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m Oct 14 05:51:49 localhost neutron_dhcp_agent[271983]: 2025-10-14 09:51:49.284 272048 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m Oct 14 05:51:49 localhost neutron_dhcp_agent[271983]: 2025-10-14 09:51:49.286 272048 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_SYS_ADMIN/CAP_SYS_ADMIN/none#033[00m Oct 14 05:51:49 localhost neutron_dhcp_agent[271983]: 2025-10-14 09:51:49.286 272048 INFO oslo.privsep.daemon [-] privsep daemon running as pid 272048#033[00m Oct 14 05:51:49 localhost neutron_dhcp_agent[271983]: 2025-10-14 09:51:49.381 271987 WARNING oslo_privsep.priv_context [None req-1f72d2a1-ed46-4f5f-8246-2120d1f1191a - - - - - -] privsep daemon already running#033[00m Oct 14 05:51:49 localhost nova_compute[238069]: 2025-10-14 09:51:49.700 2 DEBUG oslo_concurrency.lockutils [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Acquiring lock "refresh_cache-88c4e366-b765-47a6-96bf-f7677f2ce67c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Oct 14 05:51:49 localhost nova_compute[238069]: 2025-10-14 09:51:49.701 2 DEBUG oslo_concurrency.lockutils [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Acquired lock "refresh_cache-88c4e366-b765-47a6-96bf-f7677f2ce67c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Oct 14 05:51:49 localhost nova_compute[238069]: 2025-10-14 
09:51:49.701 2 DEBUG nova.network.neutron [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] [instance: 88c4e366-b765-47a6-96bf-f7677f2ce67c] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Oct 14 05:51:49 localhost nova_compute[238069]: 2025-10-14 09:51:49.702 2 DEBUG nova.objects.instance [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 88c4e366-b765-47a6-96bf-f7677f2ce67c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Oct 14 05:51:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041. Oct 14 05:51:50 localhost neutron_dhcp_agent[271983]: 2025-10-14 09:51:50.284 271987 INFO oslo.privsep.daemon [None req-1f72d2a1-ed46-4f5f-8246-2120d1f1191a - - - - - -] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.link_cmd', '--privsep_sock_path', '/tmp/tmpcijdrle9/privsep.sock']#033[00m Oct 14 05:51:50 localhost podman[272060]: 2025-10-14 09:51:50.38234684 +0000 UTC m=+0.086903013 container health_status 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': 
'/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Oct 14 05:51:50 localhost podman[272060]: 2025-10-14 09:51:50.412949488 +0000 UTC m=+0.117505631 container exec_died 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Oct 14 05:51:50 localhost systemd[1]: 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041.service: Deactivated successfully. 
Oct 14 05:51:50 localhost nova_compute[238069]: 2025-10-14 09:51:50.840 2 DEBUG nova.network.neutron [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] [instance: 88c4e366-b765-47a6-96bf-f7677f2ce67c] Updating instance_info_cache with network_info: [{"id": "3ec9b060-f43d-4698-9c76-6062c70911d5", "address": "fa:16:3e:84:5e:e5", "network": {"id": "7d0cd696-bdd7-4e70-9512-eb0d23640314", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.46", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "41187b090f3d4818a32baa37ce8a3991", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3ec9b060-f4", "ovs_interfaceid": "3ec9b060-f43d-4698-9c76-6062c70911d5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Oct 14 05:51:50 localhost nova_compute[238069]: 2025-10-14 09:51:50.860 2 DEBUG oslo_concurrency.lockutils [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Releasing lock "refresh_cache-88c4e366-b765-47a6-96bf-f7677f2ce67c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Oct 14 05:51:50 localhost nova_compute[238069]: 2025-10-14 09:51:50.861 2 DEBUG nova.compute.manager [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] [instance: 88c4e366-b765-47a6-96bf-f7677f2ce67c] Updated the network 
info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Oct 14 05:51:50 localhost nova_compute[238069]: 2025-10-14 09:51:50.861 2 DEBUG oslo_service.periodic_task [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 05:51:50 localhost nova_compute[238069]: 2025-10-14 09:51:50.861 2 DEBUG oslo_service.periodic_task [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 05:51:50 localhost nova_compute[238069]: 2025-10-14 09:51:50.862 2 DEBUG oslo_service.periodic_task [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 05:51:50 localhost neutron_dhcp_agent[271983]: 2025-10-14 09:51:50.896 271987 INFO oslo.privsep.daemon [None req-1f72d2a1-ed46-4f5f-8246-2120d1f1191a - - - - - -] Spawned new privsep daemon via rootwrap#033[00m Oct 14 05:51:50 localhost neutron_dhcp_agent[271983]: 2025-10-14 09:51:50.806 272087 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m Oct 14 05:51:50 localhost neutron_dhcp_agent[271983]: 2025-10-14 09:51:50.810 272087 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m Oct 14 05:51:50 localhost neutron_dhcp_agent[271983]: 2025-10-14 09:51:50.813 272087 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_NET_ADMIN|CAP_SYS_ADMIN/none#033[00m Oct 14 05:51:50 localhost neutron_dhcp_agent[271983]: 2025-10-14 09:51:50.813 272087 INFO oslo.privsep.daemon [-] privsep daemon running as pid 
272087#033[00m Oct 14 05:51:50 localhost neutron_dhcp_agent[271983]: 2025-10-14 09:51:50.901 271987 WARNING oslo_privsep.priv_context [None req-240277d7-24a3-4f27-8c2d-33fd9a0b94e2 - - - - - -] privsep daemon already running#033[00m Oct 14 05:51:50 localhost nova_compute[238069]: 2025-10-14 09:51:50.961 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:51:51 localhost nova_compute[238069]: 2025-10-14 09:51:51.023 2 DEBUG oslo_service.periodic_task [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 05:51:51 localhost nova_compute[238069]: 2025-10-14 09:51:51.024 2 DEBUG oslo_service.periodic_task [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 05:51:51 localhost nova_compute[238069]: 2025-10-14 09:51:51.024 2 DEBUG nova.compute.manager [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m Oct 14 05:51:51 localhost nova_compute[238069]: 2025-10-14 09:51:51.044 2 DEBUG nova.compute.manager [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m Oct 14 05:51:52 localhost nova_compute[238069]: 2025-10-14 09:51:52.044 2 DEBUG oslo_service.periodic_task [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 05:51:52 localhost 
nova_compute[238069]: 2025-10-14 09:51:52.045 2 DEBUG nova.compute.manager [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Oct 14 05:51:52 localhost neutron_dhcp_agent[271983]: 2025-10-14 09:51:52.409 271987 INFO neutron.agent.linux.ip_lib [None req-1f72d2a1-ed46-4f5f-8246-2120d1f1191a - - - - - -] Device tapa4b30293-43 cannot be used as it has no MAC address#033[00m Oct 14 05:51:52 localhost neutron_dhcp_agent[271983]: 2025-10-14 09:51:52.410 271987 INFO neutron.agent.linux.ip_lib [None req-240277d7-24a3-4f27-8c2d-33fd9a0b94e2 - - - - - -] Device tap3e87d18d-bf cannot be used as it has no MAC address#033[00m Oct 14 05:51:52 localhost nova_compute[238069]: 2025-10-14 09:51:52.546 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:51:52 localhost kernel: device tapa4b30293-43 entered promiscuous mode Oct 14 05:51:52 localhost ovn_controller[157396]: 2025-10-14T09:51:52Z|00047|binding|INFO|Claiming lport a4b30293-434d-4d8b-b6ad-840c82777955 for this chassis. Oct 14 05:51:52 localhost ovn_controller[157396]: 2025-10-14T09:51:52Z|00048|binding|INFO|a4b30293-434d-4d8b-b6ad-840c82777955: Claiming unknown Oct 14 05:51:52 localhost NetworkManager[5977]: [1760435512.5608] manager: (tapa4b30293-43): new Generic device (/org/freedesktop/NetworkManager/Devices/15) Oct 14 05:51:52 localhost nova_compute[238069]: 2025-10-14 09:51:52.561 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:51:52 localhost systemd-udevd[272111]: Network interface NamePolicy= disabled on kernel command line. 
Oct 14 05:51:52 localhost ovn_controller[157396]: 2025-10-14T09:51:52Z|00049|binding|INFO|Setting lport a4b30293-434d-4d8b-b6ad-840c82777955 ovn-installed in OVS Oct 14 05:51:52 localhost ovn_controller[157396]: 2025-10-14T09:51:52Z|00050|binding|INFO|Setting lport a4b30293-434d-4d8b-b6ad-840c82777955 up in Southbound Oct 14 05:51:52 localhost nova_compute[238069]: 2025-10-14 09:51:52.566 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:51:52 localhost ovn_metadata_agent[163050]: 2025-10-14 09:51:52.574 163055 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005486733.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '192.168.122.172/24', 'neutron:device_id': 'dhcpe1893814-19d7-5b97-af05-19e2aafa5382-c0145816-4627-44f2-af00-ccc9ef0436ed', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c0145816-4627-44f2-af00-ccc9ef0436ed', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '41187b090f3d4818a32baa37ce8a3991', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a4a79b2d-2081-4037-8963-a49d853ec2ea, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[], logical_port=a4b30293-434d-4d8b-b6ad-840c82777955) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Oct 14 05:51:52 localhost 
ovn_metadata_agent[163050]: 2025-10-14 09:51:52.577 163055 INFO neutron.agent.ovn.metadata.agent [-] Port a4b30293-434d-4d8b-b6ad-840c82777955 in datapath c0145816-4627-44f2-af00-ccc9ef0436ed bound to our chassis#033[00m Oct 14 05:51:52 localhost kernel: device tap3e87d18d-bf entered promiscuous mode Oct 14 05:51:52 localhost NetworkManager[5977]: [1760435512.5823] manager: (tap3e87d18d-bf): new Generic device (/org/freedesktop/NetworkManager/Devices/16) Oct 14 05:51:52 localhost ovn_metadata_agent[163050]: 2025-10-14 09:51:52.581 163055 DEBUG neutron.agent.ovn.metadata.agent [-] Port 87ba2fb9-204a-489c-af46-1632ef587df4 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Oct 14 05:51:52 localhost ovn_metadata_agent[163050]: 2025-10-14 09:51:52.581 163055 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c0145816-4627-44f2-af00-ccc9ef0436ed, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Oct 14 05:51:52 localhost nova_compute[238069]: 2025-10-14 09:51:52.582 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:51:52 localhost ovn_controller[157396]: 2025-10-14T09:51:52Z|00051|binding|INFO|Claiming lport 3e87d18d-bfa7-4c40-a7bd-121db3f2a44a for this chassis. 
Oct 14 05:51:52 localhost ovn_controller[157396]: 2025-10-14T09:51:52Z|00052|binding|INFO|3e87d18d-bfa7-4c40-a7bd-121db3f2a44a: Claiming unknown Oct 14 05:51:52 localhost ovn_metadata_agent[163050]: 2025-10-14 09:51:52.587 163159 DEBUG oslo.privsep.daemon [-] privsep: reply[aed54398-d1db-4332-aeeb-641384bf55c4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Oct 14 05:51:52 localhost ovn_metadata_agent[163050]: 2025-10-14 09:51:52.595 163055 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005486733.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '192.168.0.3/24', 'neutron:device_id': 'dhcpe1893814-19d7-5b97-af05-19e2aafa5382-7d0cd696-bdd7-4e70-9512-eb0d23640314', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7d0cd696-bdd7-4e70-9512-eb0d23640314', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '41187b090f3d4818a32baa37ce8a3991', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0d31a249-7ee5-4da6-a9d1-dab19bbf097c, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[], logical_port=3e87d18d-bfa7-4c40-a7bd-121db3f2a44a) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Oct 14 05:51:52 localhost systemd-udevd[272114]: Network interface NamePolicy= disabled on kernel command line. 
Oct 14 05:51:52 localhost ovn_metadata_agent[163050]: 2025-10-14 09:51:52.600 163055 INFO neutron.agent.ovn.metadata.agent [-] Port 3e87d18d-bfa7-4c40-a7bd-121db3f2a44a in datapath 7d0cd696-bdd7-4e70-9512-eb0d23640314 bound to our chassis#033[00m Oct 14 05:51:52 localhost ovn_metadata_agent[163050]: 2025-10-14 09:51:52.603 163055 DEBUG neutron.agent.ovn.metadata.agent [-] Port 85b3ca3e-8aac-4a2f-8ce5-3542f4a390a8 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Oct 14 05:51:52 localhost ovn_metadata_agent[163050]: 2025-10-14 09:51:52.604 163055 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 7d0cd696-bdd7-4e70-9512-eb0d23640314#033[00m Oct 14 05:51:52 localhost journal[237477]: libvirt version: 10.10.0, package: 15.el9 (builder@centos.org, 2025-08-18-13:22:20, ) Oct 14 05:51:52 localhost journal[237477]: hostname: np0005486733.localdomain Oct 14 05:51:52 localhost journal[237477]: ethtool ioctl error on tapa4b30293-43: No such device Oct 14 05:51:52 localhost ovn_controller[157396]: 2025-10-14T09:51:52Z|00053|binding|INFO|Setting lport 3e87d18d-bfa7-4c40-a7bd-121db3f2a44a ovn-installed in OVS Oct 14 05:51:52 localhost ovn_controller[157396]: 2025-10-14T09:51:52Z|00054|binding|INFO|Setting lport 3e87d18d-bfa7-4c40-a7bd-121db3f2a44a up in Southbound Oct 14 05:51:52 localhost nova_compute[238069]: 2025-10-14 09:51:52.611 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:51:52 localhost journal[237477]: ethtool ioctl error on tapa4b30293-43: No such device Oct 14 05:51:52 localhost journal[237477]: ethtool ioctl error on tapa4b30293-43: No such device Oct 14 05:51:52 localhost nova_compute[238069]: 2025-10-14 09:51:52.632 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:51:52 localhost ovn_metadata_agent[163050]: 2025-10-14 09:51:52.634 163159 DEBUG oslo.privsep.daemon [-] privsep: reply[103073d4-8015-4998-88da-40929396de06]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Oct 14 05:51:52 localhost journal[237477]: ethtool ioctl error on tapa4b30293-43: No such device Oct 14 05:51:52 localhost journal[237477]: ethtool ioctl error on tapa4b30293-43: No such device Oct 14 05:51:52 localhost journal[237477]: ethtool ioctl error on tapa4b30293-43: No such device Oct 14 05:51:52 localhost nova_compute[238069]: 2025-10-14 09:51:52.650 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:51:52 localhost journal[237477]: ethtool ioctl error on tapa4b30293-43: No such device Oct 14 05:51:52 localhost journal[237477]: ethtool ioctl error on tapa4b30293-43: No such device Oct 14 05:51:52 localhost ovn_metadata_agent[163050]: 2025-10-14 09:51:52.671 163170 DEBUG oslo.privsep.daemon [-] privsep: reply[7ac647e6-bf4c-4a48-897b-cda637c3a72f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Oct 14 05:51:52 localhost ovn_metadata_agent[163050]: 2025-10-14 09:51:52.677 163170 DEBUG oslo.privsep.daemon [-] privsep: reply[e20451db-cfc9-4d48-b923-86c449d8e7c6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Oct 14 05:51:52 localhost nova_compute[238069]: 2025-10-14 09:51:52.696 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:51:52 localhost ovn_metadata_agent[163050]: 2025-10-14 09:51:52.710 163170 DEBUG oslo.privsep.daemon [-] privsep: reply[c5372f39-c1aa-475e-94e6-ec792ddf1d32]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Oct 14 
05:51:52 localhost ovn_metadata_agent[163050]: 2025-10-14 09:51:52.728 163159 DEBUG oslo.privsep.daemon [-] privsep: reply[cb827f81-2c87-44b5-abcf-c43ede644729]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7d0cd696-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_QDISC', 'noqueue'], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['IFLA_ADDRESS', 'fa:16:3e:7e:3c:60'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 104, 'tx_packets': 73, 'rx_bytes': 8926, 'tx_bytes': 7516, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 104, 'tx_packets': 73, 'rx_bytes': 8926, 'tx_bytes': 7516, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', 
{'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 14], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 1, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483664], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 705865, 'reachable_time': 24254, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 37, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 20, 'outoctets': 1412, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 
'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 20, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 1412, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 20, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}]], 'header': {'length': 1400, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 272150, 'error': None, 'target': 'ovnmeta-7d0cd696-bdd7-4e70-9512-eb0d23640314', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Oct 14 05:51:52 localhost nova_compute[238069]: 2025-10-14 09:51:52.733 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:51:52 localhost ovn_metadata_agent[163050]: 2025-10-14 09:51:52.746 163159 DEBUG oslo.privsep.daemon [-] privsep: reply[402a361a-7f4c-4e3b-882f-5aa450bdcaa8]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap7d0cd696-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 705872, 'tstamp': 705872}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 272153, 'error': None, 'target': 'ovnmeta-7d0cd696-bdd7-4e70-9512-eb0d23640314', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 24, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '192.168.0.2'], ['IFA_LOCAL', '192.168.0.2'], ['IFA_BROADCAST', '192.168.0.255'], ['IFA_LABEL', 'tap7d0cd696-b1'], ['IFA_FLAGS', 128], 
['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 705876, 'tstamp': 705876}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 272153, 'error': None, 'target': 'ovnmeta-7d0cd696-bdd7-4e70-9512-eb0d23640314', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Oct 14 05:51:52 localhost ovn_metadata_agent[163050]: 2025-10-14 09:51:52.748 163055 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7d0cd696-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Oct 14 05:51:52 localhost nova_compute[238069]: 2025-10-14 09:51:52.750 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:51:52 localhost ovn_metadata_agent[163050]: 2025-10-14 09:51:52.753 163055 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7d0cd696-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Oct 14 05:51:52 localhost ovn_metadata_agent[163050]: 2025-10-14 09:51:52.754 163055 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m Oct 14 05:51:52 localhost ovn_metadata_agent[163050]: 2025-10-14 09:51:52.755 163055 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap7d0cd696-b0, col_values=(('external_ids', {'iface-id': '25c6586a-239c-451b-aac2-e0a3ee5c3145'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Oct 14 05:51:52 localhost 
ovn_metadata_agent[163050]: 2025-10-14 09:51:52.755 163055 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m Oct 14 05:51:53 localhost nova_compute[238069]: 2025-10-14 09:51:53.024 2 DEBUG oslo_service.periodic_task [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 05:51:53 localhost nova_compute[238069]: 2025-10-14 09:51:53.052 2 DEBUG oslo_concurrency.lockutils [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Oct 14 05:51:53 localhost nova_compute[238069]: 2025-10-14 09:51:53.053 2 DEBUG oslo_concurrency.lockutils [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Oct 14 05:51:53 localhost nova_compute[238069]: 2025-10-14 09:51:53.053 2 DEBUG oslo_concurrency.lockutils [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Oct 14 05:51:53 localhost nova_compute[238069]: 2025-10-14 09:51:53.054 2 DEBUG nova.compute.resource_tracker [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Auditing locally available compute resources for np0005486733.localdomain (node: np0005486733.localdomain) update_available_resource 
/usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Oct 14 05:51:53 localhost nova_compute[238069]: 2025-10-14 09:51:53.055 2 DEBUG oslo_concurrency.processutils [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Oct 14 05:51:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42447 DF PROTO=TCP SPT=54474 DPT=9102 SEQ=1965947629 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B14A3650000000001030307) Oct 14 05:51:53 localhost nova_compute[238069]: 2025-10-14 09:51:53.509 2 DEBUG oslo_concurrency.processutils [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.454s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Oct 14 05:51:53 localhost nova_compute[238069]: 2025-10-14 09:51:53.579 2 DEBUG nova.virt.libvirt.driver [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Oct 14 05:51:53 localhost nova_compute[238069]: 2025-10-14 09:51:53.579 2 DEBUG nova.virt.libvirt.driver [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Oct 14 05:51:53 localhost podman[272258]: Oct 14 05:51:53 localhost podman[272258]: 2025-10-14 09:51:53.697268879 +0000 UTC m=+0.129716018 container create 
dc7995a96eb33a899d0ea27196ac30e618af75f5efc17eb05fba799996c3a8d9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d0cd696-bdd7-4e70-9512-eb0d23640314, org.label-schema.build-date=20251009, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5) Oct 14 05:51:53 localhost podman[272278]: Oct 14 05:51:53 localhost podman[272278]: 2025-10-14 09:51:53.749355613 +0000 UTC m=+0.111221476 container create 373c3d29f2bfadc43d54cb6930a55bb623f8f4f5957bebc53e3e7158ee295964 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0145816-4627-44f2-af00-ccc9ef0436ed, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team) Oct 14 05:51:53 localhost podman[272258]: 2025-10-14 09:51:53.661806651 +0000 UTC m=+0.094253820 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Oct 14 05:51:53 localhost nova_compute[238069]: 2025-10-14 09:51:53.765 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:51:53 localhost systemd[1]: Started libpod-conmon-dc7995a96eb33a899d0ea27196ac30e618af75f5efc17eb05fba799996c3a8d9.scope. Oct 14 05:51:53 localhost systemd[1]: Started libpod-conmon-373c3d29f2bfadc43d54cb6930a55bb623f8f4f5957bebc53e3e7158ee295964.scope. Oct 14 05:51:53 localhost systemd[1]: Started libcrun container. 
Oct 14 05:51:53 localhost nova_compute[238069]: 2025-10-14 09:51:53.790 2 WARNING nova.virt.libvirt.driver [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m Oct 14 05:51:53 localhost nova_compute[238069]: 2025-10-14 09:51:53.792 2 DEBUG nova.compute.resource_tracker [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Hypervisor/Node resource view: name=np0005486733.localdomain free_ram=11733MB free_disk=41.83725357055664GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": 
"pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Oct 14 05:51:53 localhost nova_compute[238069]: 2025-10-14 09:51:53.792 2 DEBUG oslo_concurrency.lockutils [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Oct 14 05:51:53 localhost nova_compute[238069]: 2025-10-14 09:51:53.792 2 DEBUG oslo_concurrency.lockutils [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Oct 14 05:51:53 localhost podman[272278]: 2025-10-14 09:51:53.698453416 +0000 UTC m=+0.060319369 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Oct 14 05:51:53 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b341f4353a3eb315605769a701c701f5f97c9cd62469f940ac96dc81deed34eb/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Oct 14 05:51:53 localhost systemd[1]: Started libcrun container. 
Oct 14 05:51:53 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b2a8e08ec2da19ae228015c891a79ab4164456cccb64d0ef1eafa1149b2a4dca/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Oct 14 05:51:53 localhost podman[272258]: 2025-10-14 09:51:53.809303639 +0000 UTC m=+0.241750758 container init dc7995a96eb33a899d0ea27196ac30e618af75f5efc17eb05fba799996c3a8d9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d0cd696-bdd7-4e70-9512-eb0d23640314, org.label-schema.license=GPLv2, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5) Oct 14 05:51:53 localhost podman[272278]: 2025-10-14 09:51:53.814908523 +0000 UTC m=+0.176774376 container init 373c3d29f2bfadc43d54cb6930a55bb623f8f4f5957bebc53e3e7158ee295964 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0145816-4627-44f2-af00-ccc9ef0436ed, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0) Oct 14 05:51:53 localhost podman[272278]: 2025-10-14 09:51:53.823323264 +0000 UTC m=+0.185189127 container start 373c3d29f2bfadc43d54cb6930a55bb623f8f4f5957bebc53e3e7158ee295964 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0145816-4627-44f2-af00-ccc9ef0436ed, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.build-date=20251009, 
org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true) Oct 14 05:51:53 localhost dnsmasq[272302]: started, version 2.85 cachesize 150 Oct 14 05:51:53 localhost dnsmasq[272302]: DNS service limited to local subnets Oct 14 05:51:53 localhost dnsmasq[272302]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Oct 14 05:51:53 localhost dnsmasq[272302]: warning: no upstream servers configured Oct 14 05:51:53 localhost dnsmasq-dhcp[272302]: DHCP, static leases only on 192.168.0.0, lease time 1d Oct 14 05:51:53 localhost dnsmasq[272302]: read /var/lib/neutron/dhcp/7d0cd696-bdd7-4e70-9512-eb0d23640314/addn_hosts - 2 addresses Oct 14 05:51:53 localhost dnsmasq-dhcp[272302]: read /var/lib/neutron/dhcp/7d0cd696-bdd7-4e70-9512-eb0d23640314/host Oct 14 05:51:53 localhost dnsmasq-dhcp[272302]: read /var/lib/neutron/dhcp/7d0cd696-bdd7-4e70-9512-eb0d23640314/opts Oct 14 05:51:53 localhost dnsmasq[272303]: started, version 2.85 cachesize 150 Oct 14 05:51:53 localhost dnsmasq[272303]: DNS service limited to local subnets Oct 14 05:51:53 localhost dnsmasq[272303]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Oct 14 05:51:53 localhost dnsmasq[272303]: warning: no upstream servers configured Oct 14 05:51:53 localhost dnsmasq-dhcp[272303]: DHCP, static leases only on 192.168.122.0, lease time 1d Oct 14 05:51:53 localhost dnsmasq[272303]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/addn_hosts - 1 addresses Oct 14 05:51:53 localhost dnsmasq-dhcp[272303]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/host Oct 14 05:51:53 localhost 
dnsmasq-dhcp[272303]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/opts Oct 14 05:51:53 localhost podman[272258]: 2025-10-14 09:51:53.865969634 +0000 UTC m=+0.298416773 container start dc7995a96eb33a899d0ea27196ac30e618af75f5efc17eb05fba799996c3a8d9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d0cd696-bdd7-4e70-9512-eb0d23640314, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20251009, tcib_managed=true, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team) Oct 14 05:51:53 localhost neutron_dhcp_agent[271983]: 2025-10-14 09:51:53.872 271987 INFO neutron.agent.dhcp.agent [None req-0d26d90a-8516-408f-8964-0dfc74130a57 - - - - - -] Finished network c0145816-4627-44f2-af00-ccc9ef0436ed dhcp configuration#033[00m Oct 14 05:51:54 localhost podman[272321]: 2025-10-14 09:51:54.079366764 +0000 UTC m=+0.063873558 container kill 21da58e6954cfab73a5a811ca85c2c1bedc94e2e2a6445d96b7ae4c16b2e98df (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=neutron-haproxy-ovnmeta-7d0cd696-bdd7-4e70-9512-eb0d23640314, build-date=2025-07-21T16:28:53, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, release=1, summary=Red Hat OpenStack Platform 17.1 
neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-neutron-metadata-agent-ovn, batch=17.1_20250721.1, io.buildah.version=1.33.12, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, distribution-scope=public, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, vcs-type=git, version=17.1.9) Oct 14 05:51:54 localhost nova_compute[238069]: 2025-10-14 09:51:54.082 2 DEBUG nova.compute.resource_tracker [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Instance 88c4e366-b765-47a6-96bf-f7677f2ce67c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Oct 14 05:51:54 localhost nova_compute[238069]: 2025-10-14 09:51:54.082 2 DEBUG nova.compute.resource_tracker [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Oct 14 05:51:54 localhost nova_compute[238069]: 2025-10-14 09:51:54.082 2 DEBUG nova.compute.resource_tracker [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Final resource view: name=np0005486733.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Oct 14 05:51:54 localhost systemd[1]: libpod-21da58e6954cfab73a5a811ca85c2c1bedc94e2e2a6445d96b7ae4c16b2e98df.scope: Deactivated successfully. 
Oct 14 05:51:54 localhost podman[272337]: 2025-10-14 09:51:54.164466651 +0000 UTC m=+0.060091463 container died 21da58e6954cfab73a5a811ca85c2c1bedc94e2e2a6445d96b7ae4c16b2e98df (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=neutron-haproxy-ovnmeta-7d0cd696-bdd7-4e70-9512-eb0d23640314, maintainer=OpenStack TripleO Team, batch=17.1_20250721.1, build-date=2025-07-21T16:28:53, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.9, tcib_managed=true, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-type=git, vendor=Red Hat, Inc., distribution-scope=public, architecture=x86_64) Oct 14 05:51:54 localhost podman[272337]: 2025-10-14 09:51:54.298762951 +0000 UTC m=+0.194387713 container cleanup 21da58e6954cfab73a5a811ca85c2c1bedc94e2e2a6445d96b7ae4c16b2e98df (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=neutron-haproxy-ovnmeta-7d0cd696-bdd7-4e70-9512-eb0d23640314, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, distribution-scope=public, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, 
build-date=2025-07-21T16:28:53, vcs-type=git, vendor=Red Hat, Inc., com.redhat.component=openstack-neutron-metadata-agent-ovn-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, batch=17.1_20250721.1, release=1, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.9, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/agreements) Oct 14 05:51:54 localhost systemd[1]: libpod-conmon-21da58e6954cfab73a5a811ca85c2c1bedc94e2e2a6445d96b7ae4c16b2e98df.scope: Deactivated successfully. Oct 14 05:51:54 localhost podman[272336]: 2025-10-14 09:51:54.319294516 +0000 UTC m=+0.214547967 container remove 21da58e6954cfab73a5a811ca85c2c1bedc94e2e2a6445d96b7ae4c16b2e98df (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=neutron-haproxy-ovnmeta-7d0cd696-bdd7-4e70-9512-eb0d23640314, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-neutron-metadata-agent-ovn, version=17.1.9, distribution-scope=public, io.openshift.expose-services=, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, release=1, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, io.buildah.version=1.33.12, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-07-21T16:28:53, vendor=Red Hat, Inc., 
tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, batch=17.1_20250721.1) Oct 14 05:51:54 localhost neutron_dhcp_agent[271983]: 2025-10-14 09:51:54.329 271987 INFO neutron.agent.dhcp.agent [None req-bcad1db8-1751-4586-a076-496a92f07029 - - - - - -] Finished network 7d0cd696-bdd7-4e70-9512-eb0d23640314 dhcp configuration#033[00m Oct 14 05:51:54 localhost nova_compute[238069]: 2025-10-14 09:51:54.329 2 DEBUG nova.scheduler.client.report [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Refreshing inventories for resource provider 18c24273-aca2-4f08-be57-3188d558235e _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m Oct 14 05:51:54 localhost neutron_dhcp_agent[271983]: 2025-10-14 09:51:54.329 271987 INFO neutron.agent.dhcp.agent [None req-db140434-d2e0-4a45-bfca-6b968590ea0b - - - - - -] Synchronizing state complete#033[00m Oct 14 05:51:54 localhost neutron_dhcp_agent[271983]: 2025-10-14 09:51:54.398 271987 INFO neutron.agent.dhcp.agent [None req-db140434-d2e0-4a45-bfca-6b968590ea0b - - - - - -] DHCP agent started#033[00m Oct 14 05:51:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42448 DF PROTO=TCP SPT=54474 DPT=9102 SEQ=1965947629 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B14A75B0000000001030307) Oct 14 05:51:54 localhost nova_compute[238069]: 2025-10-14 09:51:54.568 2 DEBUG nova.scheduler.client.report [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Updating ProviderTree inventory for provider 18c24273-aca2-4f08-be57-3188d558235e from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 
'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m Oct 14 05:51:54 localhost nova_compute[238069]: 2025-10-14 09:51:54.569 2 DEBUG nova.compute.provider_tree [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Updating inventory in ProviderTree for provider 18c24273-aca2-4f08-be57-3188d558235e with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m Oct 14 05:51:54 localhost nova_compute[238069]: 2025-10-14 09:51:54.598 2 DEBUG nova.scheduler.client.report [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Refreshing aggregate associations for resource provider 18c24273-aca2-4f08-be57-3188d558235e, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m Oct 14 05:51:54 localhost nova_compute[238069]: 2025-10-14 09:51:54.625 2 DEBUG nova.scheduler.client.report [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Refreshing trait associations for resource provider 18c24273-aca2-4f08-be57-3188d558235e, traits: 
COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_STORAGE_BUS_IDE,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_STORAGE_BUS_USB,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_CLMUL,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_F16C,HW_CPU_X86_AESNI,COMPUTE_SECURITY_TPM_2_0,HW_CPU_X86_SSE,HW_CPU_X86_ABM,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_IMAGE_TYPE_QCOW2,HW_CPU_X86_BMI,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_SSE4A,HW_CPU_X86_SVM,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_SECURITY_TPM_1_2,COMPUTE_ACCELERATORS,HW_CPU_X86_AMD_SVM,HW_CPU_X86_AVX2,COMPUTE_DEVICE_TAGGING,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_STORAGE_BUS_FDC,COMPUTE_STORAGE_BUS_SATA,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_SHA,COMPUTE_NET_VIF_MODEL_VIRTIO,HW_CPU_X86_FMA3,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_SSE2,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_BMI2,COMPUTE_NODE,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_MMX,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_SSE41,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_AVX _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m Oct 14 05:51:54 localhost nova_compute[238069]: 2025-10-14 09:51:54.683 2 DEBUG oslo_concurrency.processutils [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Oct 14 05:51:54 localhost neutron_dhcp_agent[271983]: 2025-10-14 09:51:54.695 271987 INFO neutron.agent.dhcp.agent [None 
req-93084be3-f6a1-4cad-b4ac-a7795b09b301 - - - - - -] DHCP configuration for ports {'62f47f8a-76e6-4e1f-aab4-b3ec4b9f5cf9', 'c5061e05-fbdf-4d81-b1d8-4bfaaa73263c', 'c8a1e507-d02b-46f2-ba97-01ab899e151c'} is completed#033[00m Oct 14 05:51:54 localhost systemd[1]: tmp-crun.97LQHd.mount: Deactivated successfully. Oct 14 05:51:54 localhost systemd[1]: var-lib-containers-storage-overlay-796a4e695fd07aa1ca28abde2b1fa6fa4df60d3155a5cda32e2dcba101be602d-merged.mount: Deactivated successfully. Oct 14 05:51:54 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-21da58e6954cfab73a5a811ca85c2c1bedc94e2e2a6445d96b7ae4c16b2e98df-userdata-shm.mount: Deactivated successfully. Oct 14 05:51:55 localhost nova_compute[238069]: 2025-10-14 09:51:55.131 2 DEBUG oslo_concurrency.processutils [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.449s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Oct 14 05:51:55 localhost nova_compute[238069]: 2025-10-14 09:51:55.137 2 DEBUG nova.compute.provider_tree [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Inventory has not changed in ProviderTree for provider: 18c24273-aca2-4f08-be57-3188d558235e update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Oct 14 05:51:55 localhost nova_compute[238069]: 2025-10-14 09:51:55.173 2 DEBUG nova.scheduler.client.report [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Inventory has not changed for provider 18c24273-aca2-4f08-be57-3188d558235e based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 
1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Oct 14 05:51:55 localhost nova_compute[238069]: 2025-10-14 09:51:55.174 2 DEBUG nova.compute.resource_tracker [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Compute_service record updated for np0005486733.localdomain:np0005486733.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Oct 14 05:51:55 localhost nova_compute[238069]: 2025-10-14 09:51:55.174 2 DEBUG oslo_concurrency.lockutils [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.382s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Oct 14 05:51:55 localhost neutron_dhcp_agent[271983]: 2025-10-14 09:51:55.707 271987 INFO neutron.agent.dhcp.agent [None req-b131411d-0312-4f31-b659-66133b1363e2 - - - - - -] DHCP configuration for ports {'3ec9b060-f43d-4698-9c76-6062c70911d5', '25c6586a-239c-451b-aac2-e0a3ee5c3145', 'c5061e05-fbdf-4d81-b1d8-4bfaaa73263c', 'c8a1e507-d02b-46f2-ba97-01ab899e151c', '62f47f8a-76e6-4e1f-aab4-b3ec4b9f5cf9', '4903d7fa-d866-49ab-b620-a59d3bb57acf'} is completed#033[00m Oct 14 05:51:55 localhost nova_compute[238069]: 2025-10-14 09:51:55.965 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:51:56 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42449 DF PROTO=TCP SPT=54474 DPT=9102 SEQ=1965947629 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B14AF5A0000000001030307) Oct 14 05:51:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29. 
Oct 14 05:51:57 localhost podman[272385]: 2025-10-14 09:51:57.743043587 +0000 UTC m=+0.077950796 container health_status fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, container_name=iscsid, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d) Oct 14 05:51:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:51:57.758 163055 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner 
/usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Oct 14 05:51:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:51:57.759 163055 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Oct 14 05:51:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:51:57.759 163055 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/7d0cd696-bdd7-4e70-9512-eb0d23640314.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/7d0cd696-bdd7-4e70-9512-eb0d23640314.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m Oct 14 05:51:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:51:57.760 163055 ERROR neutron.agent.linux.external_process [-] metadata-proxy for metadata with uuid 7d0cd696-bdd7-4e70-9512-eb0d23640314 not found. 
The process should not have died#033[00m Oct 14 05:51:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:51:57.760 163055 WARNING neutron.agent.linux.external_process [-] Respawning metadata-proxy for uuid 7d0cd696-bdd7-4e70-9512-eb0d23640314#033[00m Oct 14 05:51:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:51:57.760 163055 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/7d0cd696-bdd7-4e70-9512-eb0d23640314.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/7d0cd696-bdd7-4e70-9512-eb0d23640314.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m Oct 14 05:51:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:51:57.761 163159 DEBUG oslo.privsep.daemon [-] privsep: reply[b444da28-e7da-42a0-ab84-52c9bdd01084]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Oct 14 05:51:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:51:57.762 163055 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = Oct 14 05:51:57 localhost ovn_metadata_agent[163050]: global Oct 14 05:51:57 localhost ovn_metadata_agent[163050]: log /dev/log local0 debug Oct 14 05:51:57 localhost ovn_metadata_agent[163050]: log-tag haproxy-metadata-proxy-7d0cd696-bdd7-4e70-9512-eb0d23640314 Oct 14 05:51:57 localhost ovn_metadata_agent[163050]: user root Oct 14 05:51:57 localhost ovn_metadata_agent[163050]: group root Oct 14 05:51:57 localhost ovn_metadata_agent[163050]: maxconn 1024 Oct 14 05:51:57 localhost ovn_metadata_agent[163050]: pidfile /var/lib/neutron/external/pids/7d0cd696-bdd7-4e70-9512-eb0d23640314.pid.haproxy Oct 14 05:51:57 localhost ovn_metadata_agent[163050]: daemon Oct 14 05:51:57 localhost ovn_metadata_agent[163050]: Oct 14 05:51:57 localhost ovn_metadata_agent[163050]: defaults Oct 14 05:51:57 localhost ovn_metadata_agent[163050]: log global Oct 14 05:51:57 localhost ovn_metadata_agent[163050]: mode 
http Oct 14 05:51:57 localhost ovn_metadata_agent[163050]: option httplog Oct 14 05:51:57 localhost ovn_metadata_agent[163050]: option dontlognull Oct 14 05:51:57 localhost ovn_metadata_agent[163050]: option http-server-close Oct 14 05:51:57 localhost ovn_metadata_agent[163050]: option forwardfor Oct 14 05:51:57 localhost ovn_metadata_agent[163050]: retries 3 Oct 14 05:51:57 localhost ovn_metadata_agent[163050]: timeout http-request 30s Oct 14 05:51:57 localhost ovn_metadata_agent[163050]: timeout connect 30s Oct 14 05:51:57 localhost ovn_metadata_agent[163050]: timeout client 32s Oct 14 05:51:57 localhost ovn_metadata_agent[163050]: timeout server 32s Oct 14 05:51:57 localhost ovn_metadata_agent[163050]: timeout http-keep-alive 30s Oct 14 05:51:57 localhost ovn_metadata_agent[163050]: Oct 14 05:51:57 localhost ovn_metadata_agent[163050]: Oct 14 05:51:57 localhost ovn_metadata_agent[163050]: listen listener Oct 14 05:51:57 localhost ovn_metadata_agent[163050]: bind 169.254.169.254:80 Oct 14 05:51:57 localhost ovn_metadata_agent[163050]: server metadata /var/lib/neutron/metadata_proxy Oct 14 05:51:57 localhost ovn_metadata_agent[163050]: http-request add-header X-OVN-Network-ID 7d0cd696-bdd7-4e70-9512-eb0d23640314 Oct 14 05:51:57 localhost ovn_metadata_agent[163050]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m Oct 14 05:51:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:51:57.763 163055 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-7d0cd696-bdd7-4e70-9512-eb0d23640314', 'env', 'PROCESS_TAG=haproxy-7d0cd696-bdd7-4e70-9512-eb0d23640314', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/7d0cd696-bdd7-4e70-9512-eb0d23640314.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m Oct 14 05:51:57 localhost podman[272385]: 2025-10-14 09:51:57.775795321 +0000 UTC 
m=+0.110702580 container exec_died fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3) Oct 14 05:51:57 localhost systemd[1]: fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29.service: Deactivated successfully. 
Oct 14 05:51:58 localhost podman[272431]: Oct 14 05:51:58 localhost podman[272431]: 2025-10-14 09:51:58.201106465 +0000 UTC m=+0.095067935 container create 4b838135b4c72f9f42d47a726a45b40fda069a31a29065794f51235ffa0efcb9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7d0cd696-bdd7-4e70-9512-eb0d23640314, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251009) Oct 14 05:51:58 localhost systemd[1]: Started libpod-conmon-4b838135b4c72f9f42d47a726a45b40fda069a31a29065794f51235ffa0efcb9.scope. Oct 14 05:51:58 localhost podman[272431]: 2025-10-14 09:51:58.156923477 +0000 UTC m=+0.050885007 image pull quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified Oct 14 05:51:58 localhost systemd[1]: Started libcrun container. 
Oct 14 05:51:58 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f85672eda8adebaf08be6b9a7103250b5ec8f68390658a7a5c3d072f58e6ea08/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Oct 14 05:51:58 localhost podman[272431]: 2025-10-14 09:51:58.278729439 +0000 UTC m=+0.172690939 container init 4b838135b4c72f9f42d47a726a45b40fda069a31a29065794f51235ffa0efcb9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7d0cd696-bdd7-4e70-9512-eb0d23640314, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5) Oct 14 05:51:58 localhost podman[272431]: 2025-10-14 09:51:58.288307516 +0000 UTC m=+0.182269026 container start 4b838135b4c72f9f42d47a726a45b40fda069a31a29065794f51235ffa0efcb9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7d0cd696-bdd7-4e70-9512-eb0d23640314, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0) Oct 14 05:51:58 localhost podman[248187]: time="2025-10-14T09:51:58Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Oct 14 05:51:58 localhost neutron-haproxy-ovnmeta-7d0cd696-bdd7-4e70-9512-eb0d23640314[272445]: [NOTICE] (272449) : New worker (272451) forked Oct 14 05:51:58 localhost neutron-haproxy-ovnmeta-7d0cd696-bdd7-4e70-9512-eb0d23640314[272445]: [NOTICE] (272449) : Loading 
success. Oct 14 05:51:58 localhost podman[248187]: @ - - [14/Oct/2025:09:51:58 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 141160 "" "Go-http-client/1.1" Oct 14 05:51:58 localhost ovn_metadata_agent[163050]: 2025-10-14 09:51:58.353 163055 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.594s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Oct 14 05:51:58 localhost podman[248187]: @ - - [14/Oct/2025:09:51:58 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18311 "" "Go-http-client/1.1" Oct 14 05:51:58 localhost nova_compute[238069]: 2025-10-14 09:51:58.797 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:51:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec. 
Oct 14 05:51:59 localhost podman[272460]: 2025-10-14 09:51:59.74864562 +0000 UTC m=+0.087705197 container health_status aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm) Oct 14 05:51:59 localhost podman[272460]: 2025-10-14 09:51:59.782270542 +0000 UTC m=+0.121330119 container exec_died aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 
'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors ) Oct 14 05:51:59 localhost systemd[1]: aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec.service: Deactivated successfully. Oct 14 05:52:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42450 DF PROTO=TCP SPT=54474 DPT=9102 SEQ=1965947629 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B14BF1A0000000001030307) Oct 14 05:52:00 localhost nova_compute[238069]: 2025-10-14 09:52:00.968 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:52:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad. 
Oct 14 05:52:02 localhost podman[272485]: 2025-10-14 09:52:02.749100658 +0000 UTC m=+0.084221269 container health_status 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image) Oct 14 05:52:02 localhost podman[272485]: 2025-10-14 09:52:02.790205382 +0000 UTC m=+0.125325923 container exec_died 
02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, container_name=multipathd, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true, org.label-schema.build-date=20251009, config_id=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}) Oct 14 05:52:02 localhost systemd[1]: 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad.service: Deactivated successfully. 
Oct 14 05:52:03 localhost nova_compute[238069]: 2025-10-14 09:52:03.799 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:52:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f. Oct 14 05:52:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917. Oct 14 05:52:05 localhost systemd[1]: tmp-crun.hiMyST.mount: Deactivated successfully. Oct 14 05:52:05 localhost podman[272504]: 2025-10-14 09:52:05.743097857 +0000 UTC m=+0.086437299 container health_status 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251009, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.schema-version=1.0, config_id=ovn_controller) Oct 14 05:52:05 localhost podman[272505]: 2025-10-14 09:52:05.7841765 +0000 UTC m=+0.124124307 container health_status 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, architecture=x86_64, vendor=Red Hat, Inc., release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, distribution-scope=public, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41) Oct 14 05:52:05 localhost podman[272505]: 2025-10-14 09:52:05.799117082 +0000 UTC m=+0.139064899 container exec_died 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.buildah.version=1.33.7, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, io.openshift.expose-services=, managed_by=edpm_ansible, 
description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, container_name=openstack_network_exporter, version=9.6, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., config_id=edpm, io.openshift.tags=minimal rhel9) Oct 14 05:52:05 localhost systemd[1]: 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917.service: Deactivated successfully. Oct 14 05:52:05 localhost podman[272504]: 2025-10-14 09:52:05.854966882 +0000 UTC m=+0.198306284 container exec_died 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS) Oct 14 05:52:05 localhost systemd[1]: 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f.service: Deactivated successfully. Oct 14 05:52:05 localhost nova_compute[238069]: 2025-10-14 09:52:05.970 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:52:06 localhost nova_compute[238069]: 2025-10-14 09:52:06.561 2 DEBUG oslo_service.periodic_task [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 05:52:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d. Oct 14 05:52:08 localhost systemd[1]: tmp-crun.LQjMZE.mount: Deactivated successfully. 
Oct 14 05:52:08 localhost podman[272548]: 2025-10-14 09:52:08.745722402 +0000 UTC m=+0.091300139 container health_status 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=edpm, org.label-schema.vendor=CentOS) Oct 14 05:52:08 localhost podman[272548]: 2025-10-14 09:52:08.755611688 +0000 UTC m=+0.101189405 container exec_died 
89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0) Oct 14 05:52:08 localhost systemd[1]: 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d.service: Deactivated successfully. 
Oct 14 05:52:08 localhost openstack_network_exporter[250374]: ERROR 09:52:08 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Oct 14 05:52:08 localhost openstack_network_exporter[250374]: ERROR 09:52:08 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Oct 14 05:52:08 localhost openstack_network_exporter[250374]: ERROR 09:52:08 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Oct 14 05:52:08 localhost openstack_network_exporter[250374]: ERROR 09:52:08 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Oct 14 05:52:08 localhost openstack_network_exporter[250374]: Oct 14 05:52:08 localhost openstack_network_exporter[250374]: ERROR 09:52:08 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Oct 14 05:52:08 localhost openstack_network_exporter[250374]: Oct 14 05:52:08 localhost nova_compute[238069]: 2025-10-14 09:52:08.815 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:52:09 localhost nova_compute[238069]: 2025-10-14 09:52:09.817 2 DEBUG nova.compute.manager [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Triggering sync for uuid 88c4e366-b765-47a6-96bf-f7677f2ce67c _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m Oct 14 05:52:09 localhost nova_compute[238069]: 2025-10-14 09:52:09.818 2 DEBUG oslo_concurrency.lockutils [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Acquiring lock "88c4e366-b765-47a6-96bf-f7677f2ce67c" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Oct 14 05:52:09 localhost nova_compute[238069]: 2025-10-14 09:52:09.819 2 DEBUG 
oslo_concurrency.lockutils [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Lock "88c4e366-b765-47a6-96bf-f7677f2ce67c" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Oct 14 05:52:09 localhost nova_compute[238069]: 2025-10-14 09:52:09.864 2 DEBUG oslo_concurrency.lockutils [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Lock "88c4e366-b765-47a6-96bf-f7677f2ce67c" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.046s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Oct 14 05:52:10 localhost nova_compute[238069]: 2025-10-14 09:52:10.974 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:52:13 localhost nova_compute[238069]: 2025-10-14 09:52:13.844 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:52:15 localhost nova_compute[238069]: 2025-10-14 09:52:15.976 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:52:18 localhost nova_compute[238069]: 2025-10-14 09:52:18.890 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:52:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857. 
Oct 14 05:52:19 localhost podman[272566]: 2025-10-14 09:52:19.74138609 +0000 UTC m=+0.085678398 container health_status 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true) Oct 14 05:52:19 localhost podman[272566]: 2025-10-14 09:52:19.750570001 +0000 UTC 
m=+0.094862339 container exec_died 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251009, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team) Oct 14 05:52:19 localhost systemd[1]: 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857.service: Deactivated successfully. 
Oct 14 05:52:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041. Oct 14 05:52:20 localhost podman[272584]: 2025-10-14 09:52:20.736210752 +0000 UTC m=+0.073428622 container health_status 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Oct 14 05:52:20 localhost podman[272584]: 2025-10-14 09:52:20.772038151 +0000 UTC m=+0.109255961 container exec_died 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': 
'/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter) Oct 14 05:52:20 localhost systemd[1]: 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041.service: Deactivated successfully. Oct 14 05:52:20 localhost nova_compute[238069]: 2025-10-14 09:52:20.977 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:52:22 localhost ovn_controller[157396]: 2025-10-14T09:52:22Z|00055|memory_trim|INFO|Detected inactivity (last active 30015 ms ago): trimming memory Oct 14 05:52:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24800 DF PROTO=TCP SPT=51084 DPT=9102 SEQ=32477701 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B1518950000000001030307) Oct 14 05:52:23 localhost nova_compute[238069]: 2025-10-14 09:52:23.937 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:52:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24801 DF PROTO=TCP SPT=51084 DPT=9102 SEQ=32477701 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B151C9A0000000001030307) Oct 14 05:52:25 localhost nova_compute[238069]: 2025-10-14 09:52:25.980 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:52:26 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 
SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24802 DF PROTO=TCP SPT=51084 DPT=9102 SEQ=32477701 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B15249A0000000001030307) Oct 14 05:52:28 localhost podman[248187]: time="2025-10-14T09:52:28Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Oct 14 05:52:28 localhost podman[248187]: @ - - [14/Oct/2025:09:52:28 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 141160 "" "Go-http-client/1.1" Oct 14 05:52:28 localhost podman[248187]: @ - - [14/Oct/2025:09:52:28 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18325 "" "Go-http-client/1.1" Oct 14 05:52:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29. Oct 14 05:52:28 localhost podman[272607]: 2025-10-14 09:52:28.725338459 +0000 UTC m=+0.067349915 container health_status fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=iscsid, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0) Oct 14 05:52:28 localhost podman[272607]: 2025-10-14 09:52:28.762259621 +0000 UTC m=+0.104270917 container exec_died fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', 
'/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, org.label-schema.build-date=20251009, container_name=iscsid, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team) Oct 14 05:52:28 localhost systemd[1]: fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29.service: Deactivated successfully. Oct 14 05:52:28 localhost nova_compute[238069]: 2025-10-14 09:52:28.979 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:52:30 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24803 DF PROTO=TCP SPT=51084 DPT=9102 SEQ=32477701 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B15345B0000000001030307) Oct 14 05:52:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec. 
Oct 14 05:52:30 localhost podman[272626]: 2025-10-14 09:52:30.735000717 +0000 UTC m=+0.077681823 container health_status aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Oct 14 05:52:30 localhost podman[272626]: 2025-10-14 09:52:30.771262879 +0000 UTC m=+0.113943965 container exec_died aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 
'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter) Oct 14 05:52:30 localhost systemd[1]: aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec.service: Deactivated successfully. Oct 14 05:52:30 localhost nova_compute[238069]: 2025-10-14 09:52:30.985 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:52:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad. 
Oct 14 05:52:33 localhost podman[272666]: 2025-10-14 09:52:33.471218643 +0000 UTC m=+0.094977893 container health_status 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, container_name=multipathd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20251009, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3) Oct 14 05:52:33 localhost podman[272666]: 2025-10-14 09:52:33.492029191 +0000 UTC m=+0.115788451 container exec_died 
02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, io.buildah.version=1.41.3, org.label-schema.build-date=20251009) Oct 14 05:52:33 localhost systemd[1]: 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad.service: Deactivated successfully. 
Oct 14 05:52:34 localhost nova_compute[238069]: 2025-10-14 09:52:34.019 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:52:34 localhost sshd[272752]: main: sshd: ssh-rsa algorithm is disabled Oct 14 05:52:34 localhost systemd-logind[760]: New session 61 of user zuul. Oct 14 05:52:34 localhost systemd[1]: Started Session 61 of User zuul. Oct 14 05:52:35 localhost python3.9[272901]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Oct 14 05:52:35 localhost nova_compute[238069]: 2025-10-14 09:52:35.986 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:52:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f. Oct 14 05:52:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917. 
Oct 14 05:52:36 localhost podman[272978]: 2025-10-14 09:52:36.770937597 +0000 UTC m=+0.100144802 container health_status 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9-minimal, vendor=Red Hat, Inc., vcs-type=git, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers) Oct 14 05:52:36 localhost podman[272977]: 2025-10-14 09:52:36.819868376 +0000 UTC m=+0.149702400 container health_status 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_id=ovn_controller, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251009, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team) Oct 14 05:52:36 localhost podman[272978]: 2025-10-14 09:52:36.835520757 +0000 UTC m=+0.164727962 container exec_died 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, managed_by=edpm_ansible, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1755695350, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a 
stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., version=9.6, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, io.openshift.tags=minimal rhel9, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public) Oct 14 05:52:36 localhost systemd[1]: 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917.service: Deactivated successfully. 
Oct 14 05:52:36 localhost podman[272977]: 2025-10-14 09:52:36.860595886 +0000 UTC m=+0.190429910 container exec_died 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, container_name=ovn_controller, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009) Oct 14 05:52:36 localhost systemd[1]: 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f.service: Deactivated successfully. 
Oct 14 05:52:37 localhost python3.9[273053]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/iscsi setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None Oct 14 05:52:37 localhost python3.9[273170]: ansible-ansible.builtin.file Invoked with path=/etc/target setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None Oct 14 05:52:38 localhost python3.9[273298]: ansible-ansible.builtin.file Invoked with path=/var/lib/iscsi setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None Oct 14 05:52:38 localhost openstack_network_exporter[250374]: ERROR 09:52:38 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Oct 14 05:52:38 localhost openstack_network_exporter[250374]: ERROR 09:52:38 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Oct 14 05:52:38 localhost openstack_network_exporter[250374]: ERROR 09:52:38 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Oct 14 05:52:38 localhost openstack_network_exporter[250374]: ERROR 09:52:38 appctl.go:174: call(dpif-netdev/pmd-perf-show): 
please specify an existing datapath Oct 14 05:52:38 localhost openstack_network_exporter[250374]: Oct 14 05:52:38 localhost openstack_network_exporter[250374]: ERROR 09:52:38 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Oct 14 05:52:38 localhost openstack_network_exporter[250374]: Oct 14 05:52:38 localhost python3.9[273408]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/config-data selevel=s0 setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None Oct 14 05:52:39 localhost nova_compute[238069]: 2025-10-14 09:52:39.021 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:52:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d. 
Oct 14 05:52:39 localhost podman[273518]: 2025-10-14 09:52:39.388341019 +0000 UTC m=+0.090548387 container health_status 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute) Oct 14 05:52:39 localhost podman[273518]: 2025-10-14 09:52:39.400984727 +0000 UTC m=+0.103192075 container exec_died 
89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_id=edpm, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image) Oct 14 05:52:39 localhost systemd[1]: 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d.service: Deactivated successfully. 
Oct 14 05:52:39 localhost python3.9[273519]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/config-data/ansible-generated/iscsid setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None Oct 14 05:52:40 localhost python3.9[273647]: ansible-ansible.builtin.stat Invoked with path=/lib/systemd/system/iscsid.socket follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Oct 14 05:52:40 localhost nova_compute[238069]: 2025-10-14 09:52:40.990 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:52:41 localhost ceph-osd[31500]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Oct 14 05:52:41 localhost ceph-osd[31500]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 7200.1 total, 600.0 interval#012Cumulative writes: 4970 writes, 22K keys, 4970 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s#012Cumulative WAL: 4970 writes, 654 syncs, 7.60 writes per sync, written: 0.02 GB, 0.00 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 10 writes, 20 keys, 10 commit groups, 1.0 writes per commit group, ingest: 0.01 MB, 0.00 MB/s#012Interval WAL: 10 writes, 5 syncs, 2.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent Oct 14 05:52:41 localhost python3.9[273759]: ansible-ansible.builtin.systemd Invoked with enabled=False name=iscsid.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Oct 14 05:52:42 localhost python3.9[273871]: ansible-ansible.builtin.service_facts Invoked Oct 
14 05:52:42 localhost network[273888]: You are using 'network' service provided by 'network-scripts', which are now deprecated. Oct 14 05:52:42 localhost network[273889]: 'network-scripts' will be removed from distribution in near future. Oct 14 05:52:42 localhost network[273890]: It is advised to switch to 'NetworkManager' instead for network management. Oct 14 05:52:44 localhost nova_compute[238069]: 2025-10-14 09:52:44.036 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:52:44 localhost systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Oct 14 05:52:45 localhost ceph-osd[32440]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Oct 14 05:52:45 localhost ceph-osd[32440]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 7200.1 total, 600.0 interval#012Cumulative writes: 5561 writes, 24K keys, 5561 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s#012Cumulative WAL: 5561 writes, 768 syncs, 7.24 writes per sync, written: 0.02 GB, 0.00 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 10 writes, 20 keys, 10 commit groups, 1.0 writes per commit group, ingest: 0.01 MB, 0.00 MB/s#012Interval WAL: 10 writes, 5 syncs, 2.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent Oct 14 05:52:45 localhost nova_compute[238069]: 2025-10-14 09:52:45.993 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:52:49 localhost nova_compute[238069]: 2025-10-14 09:52:49.082 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:52:49 localhost nova_compute[238069]: 
2025-10-14 09:52:49.282 2 DEBUG oslo_service.periodic_task [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 05:52:49 localhost nova_compute[238069]: 2025-10-14 09:52:49.282 2 DEBUG oslo_service.periodic_task [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 05:52:49 localhost python3.9[274123]: ansible-ansible.builtin.stat Invoked with path=/etc/iscsi/.initiator_reset follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.814 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'name': 'test', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005486733.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '41187b090f3d4818a32baa37ce8a3991', 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'hostId': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228 Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.815 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.820 12 DEBUG 
ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.822 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f1aa9783-4924-4eea-8582-c9d34ab81e20', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T09:52:49.815877', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': '8b655dc4-a8e3-11f0-9707-fa163e99780b', 'monotonic_time': 11586.008594075, 'message_signature': 
'f7278f55d644492cc414390ca7a17da89425f8acb40e82560b35bf2f38659df9'}]}, 'timestamp': '2025-10-14 09:52:49.820742', '_unique_id': 'd2f678e4dd0a436baddc1b3ffde1f645'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.822 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.822 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.822 12 ERROR oslo_messaging.notify.messaging yield Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.822 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.822 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.822 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.822 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.822 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.822 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.822 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.822 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.822 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.822 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.822 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.822 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.822 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.822 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.822 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.822 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.822 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 05:52:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.822 12 ERROR oslo_messaging.notify.messaging Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.822 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.822 12 ERROR oslo_messaging.notify.messaging Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.822 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.822 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.822 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.822 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.822 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.822 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.822 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.822 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", 
line 653, in _send
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.822 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.822 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.822 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.822 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.822 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.822 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.822 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.822 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.822 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.822 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.822 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.822 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.822 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.822 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.822 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.822 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.822 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.822 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.822 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.822 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.822 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.822 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.822 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.823 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.823 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.outgoing.packets volume: 129 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.825 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c69d1772-464e-4b28-948f-983b7edd7dbc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 129, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T09:52:49.823538', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': '8b65e2bc-a8e3-11f0-9707-fa163e99780b', 'monotonic_time': 11586.008594075, 'message_signature': '0f8ff2ae7f5ed718237a44a773d2c9dbdb009e6e3040641fb53bd441ca3afa90'}]}, 'timestamp': '2025-10-14 09:52:49.824064', '_unique_id': '18f7eff4469a4e81b0f80ad84faf7176'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.825 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.825 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.825 12 ERROR oslo_messaging.notify.messaging yield
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.825 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.825 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.825 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.825 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.825 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.825 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.825 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.825 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.825 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.825 12 ERROR oslo_messaging.notify.messaging conn.connect()
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.825 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.825 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.825 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.825 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.825 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.825 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.825 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.825 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.825 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.825 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.825 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.825 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.825 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.825 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.825 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.825 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.825 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.825 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.825 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.825 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.825 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.825 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.825 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.825 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.825 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.825 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.825 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.825 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.825 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.825 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.825 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.825 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.825 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.825 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.825 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.825 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.825 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.825 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.825 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.825 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.825 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.826 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.851 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.read.latency volume: 1400578039 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.852 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.read.latency volume: 174873109 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.853 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c5f5dc74-88c5-4a76-9c58-211e1045ee83', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1400578039, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T09:52:49.826551', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '8b6a3592-a8e3-11f0-9707-fa163e99780b', 'monotonic_time': 11586.019262223, 'message_signature': '0839f3ce732456487754fe7447c1e3bfe1e69ba6499cc6ce81090fa8a20d8ba0'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 174873109, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T09:52:49.826551', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '8b6a41e0-a8e3-11f0-9707-fa163e99780b', 'monotonic_time': 11586.019262223, 'message_signature': 'f94aab643e845d8a02beb904b30c94ed71b46c89d8e502d456c5549950a18074'}]}, 'timestamp': '2025-10-14 09:52:49.852609', '_unique_id': '8bb0c74754d649adb953c8445c7706d2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.853 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.853 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.853 12 ERROR oslo_messaging.notify.messaging yield
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.853 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.853 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.853 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.853 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.853 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.853 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.853 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.853 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.853 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.853 12 ERROR oslo_messaging.notify.messaging conn.connect()
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.853 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.853 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.853 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.853 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.853 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.853 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.853 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.853 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.853 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.853 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.853 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.853 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.853 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.853 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.853 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.853 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.853 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.853 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.853 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.853 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.853 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.853 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.853 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.853 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.853 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.853 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.853 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.853 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.853 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.853 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.853 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.853 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.853 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.853 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.853 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.853 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.853 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.853 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.853 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.853 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.853 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.854 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.854 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.855 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0a782675-38c1-427b-b736-4bfd41932a2a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T09:52:49.854322', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': '8b6a90e6-a8e3-11f0-9707-fa163e99780b', 'monotonic_time': 11586.008594075, 'message_signature': 'd9d4b62d2f75fe9fc31f24032cd82346b2f7930c2be64b243107187b9de69fd4'}]}, 'timestamp': '2025-10-14 09:52:49.854653', '_unique_id': '072b11b1ff55422c86185f1a026c7e56'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.855 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.855 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.855 12 ERROR oslo_messaging.notify.messaging yield
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.855 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.855 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.855 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.855 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.855 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.855 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.855 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.855 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.855 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.855 12 ERROR oslo_messaging.notify.messaging conn.connect()
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.855 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.855 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.855 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.855 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.855 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.855 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.855 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.855 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.855 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.855 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.855 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.855 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.855 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.855 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.855 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.855 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.855 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.855 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.855 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.855 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.855 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.855 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.855 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.855 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.855 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.855 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.855 12 ERROR
oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.855 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.855 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.855 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.855 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.855 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.855 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.855 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.855 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.855 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.855 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 05:52:49 
localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.855 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.855 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.855 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.855 12 ERROR oslo_messaging.notify.messaging Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.856 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.856 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.outgoing.bytes volume: 11272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.857 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'f221c08f-6cb8-47bf-9b42-cfdd000f5c78', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 11272, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T09:52:49.856179', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': '8b6ad916-a8e3-11f0-9707-fa163e99780b', 'monotonic_time': 11586.008594075, 'message_signature': '62a1a44ca2d5a8ebae3d3cbe53ec05952065dd91d5f3a656345a1b6c895b8609'}]}, 'timestamp': '2025-10-14 09:52:49.856499', '_unique_id': '0bd29a5718c645b0a5cbd80d3e032d68'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.857 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:52:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.857 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.857 12 ERROR oslo_messaging.notify.messaging yield Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.857 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.857 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.857 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.857 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.857 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.857 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.857 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.857 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.857 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.857 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.857 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.857 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.857 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.857 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.857 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.857 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.857 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.857 12 ERROR oslo_messaging.notify.messaging Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.857 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.857 12 ERROR oslo_messaging.notify.messaging Oct 14 05:52:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.857 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.857 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.857 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.857 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.857 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.857 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.857 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.857 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.857 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.857 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.857 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.857 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.857 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.857 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.857 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.857 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.857 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.857 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.857 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.857 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 05:52:49 
localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.857 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.857 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.857 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.857 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.857 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.857 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.857 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.857 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.857 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.857 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.857 12 ERROR oslo_messaging.notify.messaging Oct 14 05:52:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.857 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.858 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.incoming.packets volume: 88 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.859 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '98cae376-dfb3-4598-a82e-e58b9a437e15', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 88, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T09:52:49.858048', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 
'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': '8b6b222c-a8e3-11f0-9707-fa163e99780b', 'monotonic_time': 11586.008594075, 'message_signature': 'e0c9d8fccb7b40d7a03e55eddc1905a60b0770501e5cf4672c642a139fd00e04'}]}, 'timestamp': '2025-10-14 09:52:49.858369', '_unique_id': '9e30d54d319a4a9a87176cecc2a28e1f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.859 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.859 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.859 12 ERROR oslo_messaging.notify.messaging yield Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.859 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.859 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.859 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.859 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.859 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.859 12 ERROR 
oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.859 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.859 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.859 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.859 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.859 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.859 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.859 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.859 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.859 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.859 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 
2025-10-14 09:52:49.859 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.859 12 ERROR oslo_messaging.notify.messaging Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.859 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.859 12 ERROR oslo_messaging.notify.messaging Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.859 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.859 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.859 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.859 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.859 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.859 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.859 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 05:52:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.859 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.859 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.859 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.859 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.859 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.859 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.859 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.859 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.859 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.859 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 
05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.859 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.859 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.859 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.859 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.859 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.859 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.859 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.859 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.859 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.859 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.859 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.859 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.859 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.859 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.859 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.859 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.incoming.bytes volume: 9226 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.860 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'dad20425-af23-4c50-969f-a7623e1e665d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9226, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T09:52:49.859906', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': '8b6b6ac0-a8e3-11f0-9707-fa163e99780b', 'monotonic_time': 11586.008594075, 'message_signature': 'ed6a7e4c9fdfde1730c80722b1cf3ea7f34fe97f77655906729a0432bb13e17b'}]}, 'timestamp': '2025-10-14 09:52:49.860226', '_unique_id': '0fda84e2dac140d090d9906dfa2279a7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.860 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.860 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.860 12 ERROR oslo_messaging.notify.messaging yield
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.860 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.860 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.860 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.860 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.860 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.860 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.860 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.860 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.860 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.860 12 ERROR oslo_messaging.notify.messaging conn.connect()
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.860 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.860 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.860 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.860 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.860 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.860 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.860 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.860 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.860 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.860 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.860 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.860 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.860 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.860 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.860 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.860 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.860 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.860 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.860 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.860 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.860 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.860 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.860 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.860 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.860 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.860 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.860 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.860 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.860 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.860 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.860 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.860 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.860 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.860 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.860 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.860 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.860 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.860 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.860 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.860 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.860 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.861 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.879 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.880 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.881 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '92ad0c65-b5b0-4463-a060-ce99f82c7d46', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T09:52:49.861657', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '8b6e6bb2-a8e3-11f0-9707-fa163e99780b', 'monotonic_time': 11586.0543715, 'message_signature': '0b075ff76cf12b8e25832c8365e102c04724f3091c5400aa2580dfdc0e8fb781'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T09:52:49.861657', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '8b6e7c38-a8e3-11f0-9707-fa163e99780b', 'monotonic_time': 11586.0543715, 'message_signature': '2f765a8a3160bcc3f55f8c11f8c3591e456bd8e58efc8b81466607a2ccfa8273'}]}, 'timestamp': '2025-10-14 09:52:49.880376', '_unique_id': '517e67ba397747cc86b68913317c5e63'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.881 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.881 12 ERROR oslo_messaging.notify.messaging yield
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.881 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.881 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.881 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.881 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.881 12 ERROR oslo_messaging.notify.messaging conn.connect()
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.881 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.881 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.881 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.881 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.881 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.881 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.881 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.881 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.881 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.881 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.881 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.881 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.881 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.881 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.881 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.881 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.881 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.881 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.881 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.881 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.881 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.881 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.881 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.881 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.882 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.882 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.907 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/memory.usage volume: 52.32421875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.909 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c61ebd92-3786-42f5-9dd2-2a9d3177fcd9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 52.32421875, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'timestamp': '2025-10-14T09:52:49.882565', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '8b72bb40-a8e3-11f0-9707-fa163e99780b', 'monotonic_time': 11586.10005413, 'message_signature': '0631bcafe3571f91a4f5ffed9dfbba39f51dd3f684d186650c1fe8f9163bfd9a'}]}, 'timestamp': '2025-10-14 09:52:49.908171', '_unique_id': 'fd022c0085394e91a285c34244800d16'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.909 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.909 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.909 12 ERROR oslo_messaging.notify.messaging yield
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.909 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.909 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.909 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.909 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.909 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.909 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.909 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.909 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.909 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.909 12 ERROR oslo_messaging.notify.messaging conn.connect()
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.909 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.909 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.909 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.909 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.909 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.909 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.909 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.909 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.909 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.909 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.909 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.909 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.909 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.909 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.909 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.909 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.909 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.909 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.909 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.909 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.909 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.909 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.909 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.909 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.909 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.909 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.909 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.909 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.909 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.909 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.909 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Oct 14 05:52:49 localhost
ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.909 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.909 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.909 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.909 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.909 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.909 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.909 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.909 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.909 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.909 12 ERROR oslo_messaging.notify.messaging Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.909 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters Oct 14 05:52:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.909 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.910 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.910 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'db7a6a5e-b1aa-4e08-8115-16a732840262', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T09:52:49.909793', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 
'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '8b7307f8-a8e3-11f0-9707-fa163e99780b', 'monotonic_time': 11586.0543715, 'message_signature': '3fc617ee6cfaa9a6a6870cf392f8f8a584a30dafec75d4e90cc45e494ce430ff'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T09:52:49.909793', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '8b730fbe-a8e3-11f0-9707-fa163e99780b', 'monotonic_time': 11586.0543715, 'message_signature': '51b64e0fb70df61aec16cacee699b15179903375324c0571b9c6146eb5de5bb1'}]}, 'timestamp': '2025-10-14 09:52:49.910256', '_unique_id': '9e5c106c56da4481a06c219f013cc2bb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.910 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.910 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.910 12 ERROR oslo_messaging.notify.messaging yield Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.910 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.910 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.910 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.910 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 05:52:49 
localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.910 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.910 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.910 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.910 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.910 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.910 12 ERROR oslo_messaging.notify.messaging Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.910 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.910 12 ERROR oslo_messaging.notify.messaging Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.910 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call 
last): Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.910 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.910 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.910 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.910 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.910 12 ERROR oslo_messaging.notify.messaging 
return rpc_common.ConnectionContext(self._connection_pool, Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.910 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.910 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.910 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.910 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.910 12 ERROR oslo_messaging.notify.messaging 
self.connection.ensure_connection( Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.910 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.910 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.910 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.910 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.910 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.910 12 ERROR oslo_messaging.notify.messaging Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.911 12 INFO ceilometer.polling.manager [-] Polling pollster 
disk.device.allocation in the context of pollsters Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.911 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.911 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.allocation volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.912 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fdcfd80f-f92e-4edc-95fd-409899dc7bc6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T09:52:49.911485', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 
'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '8b734808-a8e3-11f0-9707-fa163e99780b', 'monotonic_time': 11586.0543715, 'message_signature': '2336f9ef27f64e5c304d347d47d1fcb38a45a3ad8efa1fc851e5868f2759f6da'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T09:52:49.911485', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '8b7351d6-a8e3-11f0-9707-fa163e99780b', 'monotonic_time': 11586.0543715, 'message_signature': 'd884c469b7c287fb0360309859755cfa91b1ab4fec8d94d2ce8fbc8383c8f273'}]}, 'timestamp': '2025-10-14 09:52:49.911997', '_unique_id': '892d0cb84f204fefab90c8e190a20c62'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.912 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 
2025-10-14 09:52:49.912 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.912 12 ERROR oslo_messaging.notify.messaging yield Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.912 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.912 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.912 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.912 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.912 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.912 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.912 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.912 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.912 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.912 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.912 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.912 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.912 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.912 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.912 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.912 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.912 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.912 12 ERROR oslo_messaging.notify.messaging Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.912 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.912 12 ERROR oslo_messaging.notify.messaging Oct 14 05:52:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.912 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.912 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.912 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.912 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.912 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.912 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.912 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.912 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.912 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.912 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.912 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.912 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.912 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.912 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.912 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.912 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.912 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.912 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.912 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.912 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.912 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.912 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.912 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.912 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.912 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.912 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.912 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.912 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.912 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.912 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.912 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.913 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.913 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.write.latency volume: 217051898 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.913 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.write.latency volume: 24279644 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.914 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '367dd3ff-8162-4055-898e-1d71fcda6bfd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 217051898, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T09:52:49.913086', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '8b738638-a8e3-11f0-9707-fa163e99780b', 'monotonic_time': 11586.019262223, 'message_signature': '423636f3929f006ad3850058f668aa82a03d47637dd525fbd0ed1e3844405aed'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 24279644, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T09:52:49.913086', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '8b738e9e-a8e3-11f0-9707-fa163e99780b', 'monotonic_time': 11586.019262223, 'message_signature': 'cada3335e6f19bf47aeb9bf6bd78131fdb38170766237a51f5da0a3280a72169'}]}, 'timestamp': '2025-10-14 09:52:49.913523', '_unique_id': '2ee862ae852a414ca4ac9467c19bc290'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.914 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.914 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.914 12 ERROR oslo_messaging.notify.messaging yield
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.914 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.914 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.914 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.914 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.914 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.914 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.914 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.914 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.914 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.914 12 ERROR oslo_messaging.notify.messaging conn.connect()
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.914 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.914 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.914 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.914 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.914 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.914 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.914 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.914 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.914 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.914 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.914 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.914 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.914 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.914 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.914 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.914 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.914 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.914 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.914 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.914 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.914 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.914 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.914 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.914 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.914 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.914 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.914 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.914 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.914 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.914 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.914 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.914 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.914 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.914 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.914 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.914 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.914 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.914 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.914 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.914 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.914 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.914 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.914 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.read.bytes volume: 29130240 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.914 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.read.bytes volume: 4300800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.915 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd2474034-a110-45be-80d6-fbebb475adaf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 29130240, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T09:52:49.914596', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '8b73c1ca-a8e3-11f0-9707-fa163e99780b', 'monotonic_time': 11586.019262223, 'message_signature': '40cdfcb53da931c4fef790acc6d3b58d2a6d5773753108107af96e9a84b56ffc'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 4300800, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T09:52:49.914596', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '8b73c936-a8e3-11f0-9707-fa163e99780b', 'monotonic_time': 11586.019262223, 'message_signature': 'd37957d59d0a7b61ac23dc0f33041e071c615491e923e1cd056a131d64b9a85e'}]}, 'timestamp': '2025-10-14 09:52:49.915069', '_unique_id': 'f91cb5a0a2fd4bc4a9d6c92d165096aa'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.915 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.915 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.915 12 ERROR oslo_messaging.notify.messaging yield
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.915 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.915 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.915 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.915 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.915 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.915 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.915 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.915 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.915 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.915 12 ERROR oslo_messaging.notify.messaging conn.connect()
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.915 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.915 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.915 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.915 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.915 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.915 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.915 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.915 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.915 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.915 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.915 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.915 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.915 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.915 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.915 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.915 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.915 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.915 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.915 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.915 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.915 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.915 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.915 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.915 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.915 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.915 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.915 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.915 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.915 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.915 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.915 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.915 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.915 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.915 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.915 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.915 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.915 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.915 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.915 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.915 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.915 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.916 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.916 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.916 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '076f6ad2-dde0-4b17-8996-fb7d57a11048', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T09:52:49.916170', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': '8b73ff00-a8e3-11f0-9707-fa163e99780b', 'monotonic_time': 11586.008594075, 'message_signature': '14f84167a39b3766ff80e58cece7085513e678dbb14ca983c306733effdc11a1'}]}, 'timestamp': '2025-10-14 09:52:49.916398', '_unique_id': '952d7aeb0ba341389d1f623893228636'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.916 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.916 12 ERROR oslo_messaging.notify.messaging yield
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.916 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.916 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.916 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.916 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.916 12 ERROR oslo_messaging.notify.messaging conn.connect()
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.916 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.916 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.916 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.916 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.916 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.916 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.916 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.916 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.916 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.916 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.916 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.916 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.916 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.916 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.916 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.916 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.916 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 05:52:49 
localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.916 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.916 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.916 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.916 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.916 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.916 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.916 12 ERROR oslo_messaging.notify.messaging Oct 14 05:52:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.917 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.917 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.917 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/cpu volume: 64400000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.918 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0f0b3c25-a153-4076-826e-bafed64e221e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 64400000000, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'timestamp': '2025-10-14T09:52:49.917532', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': 
'0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '8b7433c6-a8e3-11f0-9707-fa163e99780b', 'monotonic_time': 11586.10005413, 'message_signature': 'd546a4e646858f70cbfb546d7a14cb7d4813a64c89f7cd44f76a14a4845ae1d5'}]}, 'timestamp': '2025-10-14 09:52:49.917763', '_unique_id': 'cc9fc7516c154c45bbacf327c283587a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.918 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.918 12 ERROR oslo_messaging.notify.messaging yield Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.918 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.918 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in 
_connection_factory Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.918 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.918 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.918 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.918 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.918 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.918 12 ERROR 
oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.918 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.918 12 ERROR oslo_messaging.notify.messaging Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.918 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.918 12 ERROR oslo_messaging.notify.messaging Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.918 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.918 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.918 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.918 12 ERROR 
oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.918 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.918 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.918 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.918 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.918 12 ERROR 
oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.918 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.918 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.918 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.918 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.918 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 05:52:49 
localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.918 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.918 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.918 12 ERROR oslo_messaging.notify.messaging Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.918 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.918 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.919 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '7e053d07-0a33-4376-8343-567b259e38c5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T09:52:49.918875', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': '8b7468dc-a8e3-11f0-9707-fa163e99780b', 'monotonic_time': 11586.008594075, 'message_signature': 'd1aee055947db2feff8cf2b19b3ef506d9fe6cbba6649e71bd22849e3bcf1574'}]}, 'timestamp': '2025-10-14 09:52:49.919187', '_unique_id': '9b3a767b98814efa9246f35914f1d659'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.919 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:52:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.919 12 ERROR oslo_messaging.notify.messaging yield Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.919 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.919 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.919 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.919 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.919 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.919 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.919 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.919 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.919 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.919 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.919 12 ERROR oslo_messaging.notify.messaging Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.919 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.919 12 ERROR oslo_messaging.notify.messaging Oct 14 05:52:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.919 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.919 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.919 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.919 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.919 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.919 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.919 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.919 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.919 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.919 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.919 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.919 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.919 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.919 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.919 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.919 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.919 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.920 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.920 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.write.bytes volume: 73895936 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.920 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.921 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b3cce1e5-b2b5-483c-b53e-5c509ce93aab', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 73895936, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T09:52:49.920346', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '8b74a19e-a8e3-11f0-9707-fa163e99780b', 'monotonic_time': 11586.019262223, 'message_signature': '582757aeb590bbbc9857b03afd28e0bfb0ba6427a72d9a36027498d14dd2a3e0'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T09:52:49.920346', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '8b74ac0c-a8e3-11f0-9707-fa163e99780b', 'monotonic_time': 11586.019262223, 'message_signature': '5fc1322c7ef02f8aef38d18845e955cc02387561878fa2e1714916c30be9d18c'}]}, 'timestamp': '2025-10-14 09:52:49.920823', '_unique_id': 'c187ca4639684ca7b89c6652477bd495'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.921 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.921 12 ERROR oslo_messaging.notify.messaging yield
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.921 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.921 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.921 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.921 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.921 12 ERROR oslo_messaging.notify.messaging conn.connect()
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.921 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.921 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.921 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.921 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.921 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.921 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.921 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.921 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.921 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.921 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.921 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.921 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.921 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.921 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.921 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.921 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.921 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.921 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.921 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.921 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.921 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.921 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.921 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.921 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.921 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.921 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.read.requests volume: 1064 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.922 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.read.requests volume: 222 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.922 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8f48045e-f41f-4d99-ad81-1ea8e9d4d5ad', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1064, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T09:52:49.921929', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '8b74df9c-a8e3-11f0-9707-fa163e99780b', 'monotonic_time': 11586.019262223, 'message_signature': '38a76b9d80747f376b1a449079666486523c00432c71f0875000c5319079fb03'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 222, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T09:52:49.921929', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '8b74e8de-a8e3-11f0-9707-fa163e99780b', 'monotonic_time': 11586.019262223, 'message_signature': '16099d1654abee6856b0af000e9fa22c1b8c43760bcad108d173750d3d6251bc'}]}, 'timestamp': '2025-10-14 09:52:49.922370', '_unique_id': 'dbcf39b9ed46406eafe6cd895833f6dd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.922 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.922 12 ERROR oslo_messaging.notify.messaging yield
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.922 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.922 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.922 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.922 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.922 12 ERROR oslo_messaging.notify.messaging conn.connect()
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.922 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.922 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.922 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.922 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.922 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.922 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.922 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.922 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.922 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.922 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.922 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.922 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.922 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.922 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.922 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.922 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.922 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.922 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.922 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.922 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.922 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.922 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.922 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.922 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.923 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.923 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.incoming.bytes.delta volume: 530 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.924 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b802706e-0b26-4de4-beb0-2d691451a025', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 530, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T09:52:49.923492', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': '8b751d40-a8e3-11f0-9707-fa163e99780b', 'monotonic_time': 11586.008594075, 'message_signature': '76bc6041fc7ae7e82cf9358d0571b43653ab8c8ade0c2d322fec67bc2fe3b26a'}]}, 'timestamp': '2025-10-14 09:52:49.923779', '_unique_id': '65abfbcb55614c0eb9ef4951a7ff1134'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.924 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.924 12 ERROR oslo_messaging.notify.messaging yield
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.924 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.924 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.924 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.924 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.924 12 ERROR oslo_messaging.notify.messaging conn.connect()
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.924 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.924 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.924 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.924 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.924 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.924 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.924 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.924 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.924 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.924 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.924 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.924 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.924 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.924 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.924 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.924 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.924 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 05:52:49 
localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.924 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.924 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.924 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.924 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.924 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.924 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.924 12 ERROR oslo_messaging.notify.messaging Oct 14 05:52:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.925 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.925 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.925 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.926 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5ac109a1-c9f6-4c91-9e30-a64f627c72b5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T09:52:49.925222', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': 
{'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': '8b756340-a8e3-11f0-9707-fa163e99780b', 'monotonic_time': 11586.008594075, 'message_signature': 'de08ff0ae9b3219652b62cfadf04ad2b5f553f4a7a97aef8fe8b6795bf8ddeec'}]}, 'timestamp': '2025-10-14 09:52:49.925535', '_unique_id': '28d65ccd759345eab5d72fcebdf89d40'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.926 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.926 12 ERROR oslo_messaging.notify.messaging yield Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.926 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.926 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 05:52:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.926 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.926 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.926 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.926 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.926 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.926 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.926 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.926 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.926 12 ERROR oslo_messaging.notify.messaging Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.926 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.926 12 ERROR oslo_messaging.notify.messaging Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.926 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.926 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.926 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.926 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.926 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.926 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.926 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.926 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.926 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.926 12 ERROR oslo_messaging.notify.messaging 
File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.926 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.926 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.926 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.926 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.926 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in 
__exit__ Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.926 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.926 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.926 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.926 12 ERROR oslo_messaging.notify.messaging Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.926 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.926 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.write.requests volume: 525 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.926 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.927 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '8d4dfba2-5ecb-4b9c-a7a3-484d2d03e2d7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 525, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T09:52:49.926613', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '8b7597ac-a8e3-11f0-9707-fa163e99780b', 'monotonic_time': 11586.019262223, 'message_signature': '6cc52a1a21effceb0a2302179462796ab5b0ec33d1af951ca12cb6ff8563da57'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T09:52:49.926613', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '8b75a1c0-a8e3-11f0-9707-fa163e99780b', 'monotonic_time': 11586.019262223, 'message_signature': '61cdf26561827c45a6d243c5ddb7eaef64e0c034eb6665730c6f65161e7412d8'}]}, 'timestamp': '2025-10-14 09:52:49.927123', '_unique_id': '8e5d8a2a2e1b49d5a41073562ba026af'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.927 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.927 12 ERROR oslo_messaging.notify.messaging yield Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.927 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.927 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.927 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.927 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.927 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.927 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.927 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 
05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.927 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.927 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.927 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.927 12 ERROR oslo_messaging.notify.messaging Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.927 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.927 12 ERROR oslo_messaging.notify.messaging Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.927 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.927 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 05:52:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.927 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.927 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.927 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.927 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.927 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.927 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.927 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.927 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.927 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.927 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:52:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.927 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.927 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.927 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.927 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.927 12 ERROR oslo_messaging.notify.messaging Oct 14 05:52:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:52:49.928 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Oct 14 05:52:50 localhost nova_compute[238069]: 2025-10-14 09:52:50.023 2 DEBUG oslo_service.periodic_task [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 05:52:50 localhost nova_compute[238069]: 2025-10-14 09:52:50.024 2 DEBUG nova.compute.manager [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Starting heal instance info cache 
_heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Oct 14 05:52:50 localhost nova_compute[238069]: 2025-10-14 09:52:50.024 2 DEBUG nova.compute.manager [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Oct 14 05:52:50 localhost python3.9[274233]: ansible-ansible.builtin.stat Invoked with path=/etc/iscsi/iscsid.conf follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Oct 14 05:52:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857. Oct 14 05:52:50 localhost nova_compute[238069]: 2025-10-14 09:52:50.716 2 DEBUG oslo_concurrency.lockutils [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Acquiring lock "refresh_cache-88c4e366-b765-47a6-96bf-f7677f2ce67c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Oct 14 05:52:50 localhost nova_compute[238069]: 2025-10-14 09:52:50.717 2 DEBUG oslo_concurrency.lockutils [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Acquired lock "refresh_cache-88c4e366-b765-47a6-96bf-f7677f2ce67c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Oct 14 05:52:50 localhost nova_compute[238069]: 2025-10-14 09:52:50.717 2 DEBUG nova.network.neutron [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] [instance: 88c4e366-b765-47a6-96bf-f7677f2ce67c] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Oct 14 05:52:50 localhost nova_compute[238069]: 2025-10-14 09:52:50.717 2 DEBUG nova.objects.instance [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 88c4e366-b765-47a6-96bf-f7677f2ce67c obj_load_attr 
/usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Oct 14 05:52:50 localhost podman[274291]: 2025-10-14 09:52:50.735426464 +0000 UTC m=+0.077550509 container health_status 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.license=GPLv2, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}) Oct 14 
05:52:50 localhost podman[274291]: 2025-10-14 09:52:50.772312225 +0000 UTC m=+0.114436270 container exec_died 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true) Oct 14 05:52:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run 
0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041. Oct 14 05:52:50 localhost systemd[1]: 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857.service: Deactivated successfully. Oct 14 05:52:50 localhost systemd[1]: tmp-crun.13CiPW.mount: Deactivated successfully. Oct 14 05:52:50 localhost podman[274325]: 2025-10-14 09:52:50.866659037 +0000 UTC m=+0.061377492 container health_status 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Oct 14 05:52:50 localhost podman[274325]: 2025-10-14 09:52:50.875918972 +0000 UTC m=+0.070637427 container exec_died 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 
'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Oct 14 05:52:50 localhost systemd[1]: 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041.service: Deactivated successfully. Oct 14 05:52:50 localhost nova_compute[238069]: 2025-10-14 09:52:50.995 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:52:51 localhost python3.9[274386]: ansible-ansible.builtin.lineinfile Invoked with insertafter=^#node.session.auth.chap.algs line=node.session.auth.chap_algs = SHA3-256,SHA256,SHA1,MD5 path=/etc/iscsi/iscsid.conf regexp=^node.session.auth.chap_algs state=present backrefs=False create=False backup=False firstmatch=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:52:51 localhost nova_compute[238069]: 2025-10-14 09:52:51.303 2 DEBUG nova.network.neutron [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] [instance: 88c4e366-b765-47a6-96bf-f7677f2ce67c] Updating instance_info_cache with network_info: [{"id": "3ec9b060-f43d-4698-9c76-6062c70911d5", "address": "fa:16:3e:84:5e:e5", "network": {"id": "7d0cd696-bdd7-4e70-9512-eb0d23640314", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.46", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": 
{}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "41187b090f3d4818a32baa37ce8a3991", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3ec9b060-f4", "ovs_interfaceid": "3ec9b060-f43d-4698-9c76-6062c70911d5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Oct 14 05:52:51 localhost nova_compute[238069]: 2025-10-14 09:52:51.332 2 DEBUG oslo_concurrency.lockutils [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Releasing lock "refresh_cache-88c4e366-b765-47a6-96bf-f7677f2ce67c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Oct 14 05:52:51 localhost nova_compute[238069]: 2025-10-14 09:52:51.333 2 DEBUG nova.compute.manager [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] [instance: 88c4e366-b765-47a6-96bf-f7677f2ce67c] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Oct 14 05:52:51 localhost nova_compute[238069]: 2025-10-14 09:52:51.333 2 DEBUG oslo_service.periodic_task [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 05:52:51 localhost nova_compute[238069]: 2025-10-14 09:52:51.334 2 DEBUG oslo_service.periodic_task [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks 
/usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 05:52:51 localhost nova_compute[238069]: 2025-10-14 09:52:51.334 2 DEBUG oslo_service.periodic_task [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 05:52:51 localhost python3.9[274496]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None Oct 14 05:52:52 localhost nova_compute[238069]: 2025-10-14 09:52:52.024 2 DEBUG oslo_service.periodic_task [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 05:52:52 localhost nova_compute[238069]: 2025-10-14 09:52:52.024 2 DEBUG nova.compute.manager [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... 
_reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Oct 14 05:52:52 localhost python3.9[274606]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True Oct 14 05:52:53 localhost nova_compute[238069]: 2025-10-14 09:52:53.020 2 DEBUG oslo_service.periodic_task [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 05:52:53 localhost python3.9[274663]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Oct 14 05:52:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54790 DF PROTO=TCP SPT=59770 DPT=9102 SEQ=1748829389 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B158DC50000000001030307) Oct 14 05:52:53 localhost python3.9[274773]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True Oct 14 05:52:54 localhost nova_compute[238069]: 2025-10-14 09:52:54.024 2 DEBUG oslo_service.periodic_task [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Running periodic task 
ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 05:52:54 localhost nova_compute[238069]: 2025-10-14 09:52:54.054 2 DEBUG oslo_concurrency.lockutils [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Oct 14 05:52:54 localhost nova_compute[238069]: 2025-10-14 09:52:54.055 2 DEBUG oslo_concurrency.lockutils [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Oct 14 05:52:54 localhost nova_compute[238069]: 2025-10-14 09:52:54.056 2 DEBUG oslo_concurrency.lockutils [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Oct 14 05:52:54 localhost nova_compute[238069]: 2025-10-14 09:52:54.056 2 DEBUG nova.compute.resource_tracker [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Auditing locally available compute resources for np0005486733.localdomain (node: np0005486733.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Oct 14 05:52:54 localhost nova_compute[238069]: 2025-10-14 09:52:54.056 2 DEBUG oslo_concurrency.processutils [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Oct 14 05:52:54 localhost 
nova_compute[238069]: 2025-10-14 09:52:54.126 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:52:54 localhost python3.9[274831]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Oct 14 05:52:54 localhost nova_compute[238069]: 2025-10-14 09:52:54.499 2 DEBUG oslo_concurrency.processutils [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.442s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Oct 14 05:52:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54791 DF PROTO=TCP SPT=59770 DPT=9102 SEQ=1748829389 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B1591DA0000000001030307) Oct 14 05:52:54 localhost nova_compute[238069]: 2025-10-14 09:52:54.590 2 DEBUG nova.virt.libvirt.driver [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Oct 14 05:52:54 localhost nova_compute[238069]: 2025-10-14 09:52:54.591 2 DEBUG nova.virt.libvirt.driver [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] skipping disk for instance-00000002 as it does not have a path 
_get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Oct 14 05:52:54 localhost nova_compute[238069]: 2025-10-14 09:52:54.823 2 WARNING nova.virt.libvirt.driver [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m Oct 14 05:52:54 localhost nova_compute[238069]: 2025-10-14 09:52:54.824 2 DEBUG nova.compute.resource_tracker [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Hypervisor/Node resource view: name=np0005486733.localdomain free_ram=11787MB free_disk=41.83725357055664GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", 
"vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Oct 14 05:52:54 localhost nova_compute[238069]: 2025-10-14 09:52:54.825 2 DEBUG oslo_concurrency.lockutils [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Oct 14 05:52:54 localhost nova_compute[238069]: 2025-10-14 09:52:54.825 2 DEBUG oslo_concurrency.lockutils [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Oct 14 05:52:54 localhost nova_compute[238069]: 2025-10-14 09:52:54.893 2 DEBUG nova.compute.resource_tracker [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Instance 88c4e366-b765-47a6-96bf-f7677f2ce67c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Oct 14 05:52:54 localhost nova_compute[238069]: 2025-10-14 09:52:54.893 2 DEBUG nova.compute.resource_tracker [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Oct 14 05:52:54 localhost nova_compute[238069]: 2025-10-14 09:52:54.894 2 DEBUG nova.compute.resource_tracker [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Final resource view: name=np0005486733.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Oct 14 05:52:54 localhost nova_compute[238069]: 2025-10-14 09:52:54.950 2 DEBUG oslo_concurrency.processutils [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Oct 14 05:52:55 localhost nova_compute[238069]: 2025-10-14 09:52:55.381 2 DEBUG oslo_concurrency.processutils [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.431s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Oct 14 05:52:55 localhost nova_compute[238069]: 2025-10-14 09:52:55.388 2 DEBUG nova.compute.provider_tree [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Inventory has not changed in ProviderTree for provider: 18c24273-aca2-4f08-be57-3188d558235e update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Oct 14 05:52:55 localhost nova_compute[238069]: 2025-10-14 09:52:55.404 2 DEBUG 
nova.scheduler.client.report [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Inventory has not changed for provider 18c24273-aca2-4f08-be57-3188d558235e based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Oct 14 05:52:55 localhost nova_compute[238069]: 2025-10-14 09:52:55.406 2 DEBUG nova.compute.resource_tracker [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Compute_service record updated for np0005486733.localdomain:np0005486733.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Oct 14 05:52:55 localhost nova_compute[238069]: 2025-10-14 09:52:55.407 2 DEBUG oslo_concurrency.lockutils [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.581s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Oct 14 05:52:55 localhost nova_compute[238069]: 2025-10-14 09:52:55.997 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:52:56 localhost python3.9[274984]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None 
selevel=None setype=None attributes=None Oct 14 05:52:56 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54792 DF PROTO=TCP SPT=59770 DPT=9102 SEQ=1748829389 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B1599DA0000000001030307) Oct 14 05:52:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:52:57.759 163055 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Oct 14 05:52:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:52:57.760 163055 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Oct 14 05:52:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:52:57.761 163055 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Oct 14 05:52:58 localhost python3.9[275094]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True Oct 14 05:52:58 localhost podman[248187]: time="2025-10-14T09:52:58Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Oct 14 05:52:58 localhost podman[248187]: @ - - [14/Oct/2025:09:52:58 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 141160 "" 
"Go-http-client/1.1" Oct 14 05:52:58 localhost podman[248187]: @ - - [14/Oct/2025:09:52:58 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18334 "" "Go-http-client/1.1" Oct 14 05:52:58 localhost python3.9[275151]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:52:59 localhost nova_compute[238069]: 2025-10-14 09:52:59.167 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:52:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29. Oct 14 05:52:59 localhost systemd[1]: tmp-crun.kBJOCd.mount: Deactivated successfully. 
Oct 14 05:52:59 localhost podman[275262]: 2025-10-14 09:52:59.269204359 +0000 UTC m=+0.080088117 container health_status fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251009, tcib_managed=true, container_name=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, io.buildah.version=1.41.3) Oct 14 05:52:59 localhost podman[275262]: 2025-10-14 09:52:59.303064327 +0000 UTC m=+0.113948105 container exec_died fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 
(image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, org.label-schema.schema-version=1.0, container_name=iscsid, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3) Oct 14 05:52:59 localhost systemd[1]: fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29.service: Deactivated successfully. 
Oct 14 05:52:59 localhost python3.9[275261]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True Oct 14 05:52:59 localhost python3.9[275337]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:53:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54793 DF PROTO=TCP SPT=59770 DPT=9102 SEQ=1748829389 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B15A99A0000000001030307) Oct 14 05:53:00 localhost python3.9[275447]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None Oct 14 05:53:00 localhost systemd[1]: Reloading. Oct 14 05:53:00 localhost systemd-sysv-generator[275475]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Oct 14 05:53:00 localhost systemd-rc-local-generator[275472]: /etc/rc.d/rc.local is not marked executable, skipping. 
Oct 14 05:53:00 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Oct 14 05:53:01 localhost nova_compute[238069]: 2025-10-14 09:53:00.999 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:53:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec. Oct 14 05:53:01 localhost podman[275485]: 2025-10-14 09:53:01.143533688 +0000 UTC m=+0.091163437 container health_status aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, 
config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors ) Oct 14 05:53:01 localhost podman[275485]: 2025-10-14 09:53:01.15471408 +0000 UTC m=+0.102343779 container exec_died aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Oct 14 05:53:01 localhost systemd[1]: aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec.service: Deactivated successfully. 
Oct 14 05:53:01 localhost python3.9[275618]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True Oct 14 05:53:02 localhost python3.9[275675]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:53:03 localhost python3.9[275785]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True Oct 14 05:53:03 localhost python3.9[275842]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:53:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad. Oct 14 05:53:03 localhost systemd[1]: tmp-crun.rqGTUP.mount: Deactivated successfully. 
Oct 14 05:53:03 localhost podman[275860]: 2025-10-14 09:53:03.753010727 +0000 UTC m=+0.087738081 container health_status 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.license=GPLv2, config_id=multipathd, managed_by=edpm_ansible, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}) Oct 14 05:53:03 localhost podman[275860]: 2025-10-14 09:53:03.79419831 +0000 UTC m=+0.128925654 container exec_died 
02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_id=multipathd, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251009) Oct 14 05:53:03 localhost systemd[1]: 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad.service: Deactivated successfully. 
Oct 14 05:53:04 localhost nova_compute[238069]: 2025-10-14 09:53:04.203 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:53:04 localhost python3.9[275971]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None Oct 14 05:53:04 localhost systemd[1]: Reloading. Oct 14 05:53:04 localhost systemd-rc-local-generator[275993]: /etc/rc.d/rc.local is not marked executable, skipping. Oct 14 05:53:04 localhost systemd-sysv-generator[275997]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Oct 14 05:53:04 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Oct 14 05:53:05 localhost systemd[1]: Starting Create netns directory... Oct 14 05:53:05 localhost systemd[1]: run-netns-placeholder.mount: Deactivated successfully. Oct 14 05:53:05 localhost systemd[1]: netns-placeholder.service: Deactivated successfully. Oct 14 05:53:05 localhost systemd[1]: Finished Create netns directory. 
Oct 14 05:53:06 localhost nova_compute[238069]: 2025-10-14 09:53:06.003 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:53:06 localhost sshd[276031]: main: sshd: ssh-rsa algorithm is disabled Oct 14 05:53:06 localhost python3.9[276124]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Oct 14 05:53:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f. Oct 14 05:53:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917. 
Oct 14 05:53:07 localhost podman[276142]: 2025-10-14 09:53:07.768523208 +0000 UTC m=+0.101025699 container health_status 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_id=ovn_controller, org.label-schema.build-date=20251009) Oct 14 05:53:07 localhost podman[276142]: 2025-10-14 09:53:07.845408745 +0000 UTC m=+0.177911266 container exec_died 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, 
org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.build-date=20251009, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}) Oct 14 05:53:07 localhost systemd[1]: 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f.service: Deactivated successfully. 
Oct 14 05:53:07 localhost podman[276143]: 2025-10-14 09:53:07.866702568 +0000 UTC m=+0.198090375 container health_status 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, distribution-scope=public, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, vcs-type=git, vendor=Red Hat, Inc., architecture=x86_64, config_id=edpm, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, release=1755695350) Oct 14 05:53:07 localhost podman[276143]: 2025-10-14 09:53:07.91016067 +0000 UTC m=+0.241548397 container exec_died 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, release=1755695350, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.buildah.version=1.33.7, version=9.6, config_id=edpm, architecture=x86_64, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, 
vcs-type=git, maintainer=Red Hat, Inc.) Oct 14 05:53:07 localhost systemd[1]: 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917.service: Deactivated successfully. Oct 14 05:53:08 localhost python3.9[276280]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/iscsid/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True Oct 14 05:53:08 localhost openstack_network_exporter[250374]: ERROR 09:53:08 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Oct 14 05:53:08 localhost openstack_network_exporter[250374]: ERROR 09:53:08 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Oct 14 05:53:08 localhost openstack_network_exporter[250374]: ERROR 09:53:08 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Oct 14 05:53:08 localhost openstack_network_exporter[250374]: ERROR 09:53:08 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Oct 14 05:53:08 localhost openstack_network_exporter[250374]: Oct 14 05:53:08 localhost openstack_network_exporter[250374]: ERROR 09:53:08 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Oct 14 05:53:08 localhost openstack_network_exporter[250374]: Oct 14 05:53:08 localhost python3.9[276337]: ansible-ansible.legacy.file Invoked with group=zuul mode=0700 owner=zuul setype=container_file_t dest=/var/lib/openstack/healthchecks/iscsid/ _original_basename=healthcheck recurse=False state=file path=/var/lib/openstack/healthchecks/iscsid/ force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Oct 14 05:53:09 localhost nova_compute[238069]: 2025-10-14 09:53:09.205 2 DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:53:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d. Oct 14 05:53:09 localhost podman[276355]: 2025-10-14 09:53:09.756223173 +0000 UTC m=+0.097054687 container health_status 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.build-date=20251009, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, 
org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image) Oct 14 05:53:09 localhost podman[276355]: 2025-10-14 09:53:09.768111448 +0000 UTC m=+0.108942942 container exec_died 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, org.label-schema.name=CentOS Stream 9 Base Image) 
Oct 14 05:53:09 localhost systemd[1]: 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d.service: Deactivated successfully. Oct 14 05:53:10 localhost python3.9[276466]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None Oct 14 05:53:11 localhost nova_compute[238069]: 2025-10-14 09:53:11.007 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:53:11 localhost python3.9[276576]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/iscsid.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True Oct 14 05:53:11 localhost python3.9[276633]: ansible-ansible.legacy.file Invoked with mode=0600 dest=/var/lib/kolla/config_files/iscsid.json _original_basename=.3swwp0xu recurse=False state=file path=/var/lib/kolla/config_files/iscsid.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:53:12 localhost python3.9[276743]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/iscsid state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None 
setype=None attributes=None Oct 14 05:53:14 localhost nova_compute[238069]: 2025-10-14 09:53:14.242 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:53:14 localhost python3.9[277020]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/iscsid config_pattern=*.json debug=False Oct 14 05:53:15 localhost python3.9[277130]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data Oct 14 05:53:16 localhost nova_compute[238069]: 2025-10-14 09:53:16.010 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:53:16 localhost python3.9[277240]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None Oct 14 05:53:19 localhost nova_compute[238069]: 2025-10-14 09:53:19.279 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:53:21 localhost nova_compute[238069]: 2025-10-14 09:53:21.042 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:53:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041. Oct 14 05:53:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857. 
Oct 14 05:53:21 localhost podman[277364]: 2025-10-14 09:53:21.579456747 +0000 UTC m=+0.097297444 container health_status 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm) Oct 14 05:53:21 localhost podman[277364]: 2025-10-14 09:53:21.592089384 +0000 UTC m=+0.109930131 container exec_died 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': 
['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter) Oct 14 05:53:21 localhost systemd[1]: 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041.service: Deactivated successfully. Oct 14 05:53:21 localhost podman[277367]: 2025-10-14 09:53:21.680235977 +0000 UTC m=+0.196355611 container health_status 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5) Oct 14 05:53:21 localhost podman[277367]: 2025-10-14 09:53:21.714037883 +0000 UTC m=+0.230157467 container exec_died 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, 
org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team) Oct 14 05:53:21 localhost systemd[1]: 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857.service: Deactivated successfully. Oct 14 05:53:21 localhost python3[277399]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/iscsid config_id=iscsid config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False Oct 14 05:53:22 localhost python3[277399]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: [#012 {#012 "Id": "4f44a4f5e0315c0d3dbd533e21d0927bf0518cf452942382901ff1ff9d621cbd",#012 "Digest": "sha256:2975c6e807fa09f0e2062da08d3a0bb209ca055d73011ebb91164def554f60aa",#012 "RepoTags": [#012 "quay.io/podified-antelope-centos9/openstack-iscsid:current-podified"#012 ],#012 "RepoDigests": [#012 "quay.io/podified-antelope-centos9/openstack-iscsid@sha256:2975c6e807fa09f0e2062da08d3a0bb209ca055d73011ebb91164def554f60aa"#012 ],#012 "Parent": "",#012 "Comment": "",#012 "Created": "2025-10-14T06:14:08.154480843Z",#012 "Config": {#012 "User": "root",#012 "Env": [#012 "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin",#012 "LANG=en_US.UTF-8",#012 "TZ=UTC",#012 "container=oci"#012 ],#012 "Entrypoint": [#012 "dumb-init",#012 "--single-child",#012 "--"#012 ],#012 "Cmd": [#012 "kolla_start"#012 ],#012 "Labels": {#012 "io.buildah.version": "1.41.3",#012 "maintainer": "OpenStack Kubernetes Operator team",#012 "org.label-schema.build-date": "20251009",#012 "org.label-schema.license": "GPLv2",#012 "org.label-schema.name": "CentOS Stream 9 Base Image",#012 "org.label-schema.schema-version": "1.0",#012 "org.label-schema.vendor": "CentOS",#012 "tcib_build_tag": "0468cb21803d466b2abfe00835cf1d2d",#012 "tcib_managed": "true"#012 },#012 
"StopSignal": "SIGTERM"#012 },#012 "Version": "",#012 "Author": "",#012 "Architecture": "amd64",#012 "Os": "linux",#012 "Size": 403858061,#012 "VirtualSize": 403858061,#012 "GraphDriver": {#012 "Name": "overlay",#012 "Data": {#012 "LowerDir": "/var/lib/containers/storage/overlay/1b94024f0eaacdff3ae200e2172324d7aec107282443f6fc22fe2f0287bc90ec/diff:/var/lib/containers/storage/overlay/0b52816892c0967aea6a33893e73899adbf76e3ca055f6670535905d8ddf2b2c/diff:/var/lib/containers/storage/overlay/f3f40f6483bf6d587286da9e86e40878c2aaaf723da5aa2364fff24f5ea28424/diff",#012 "UpperDir": "/var/lib/containers/storage/overlay/9c7bc0417a3c6c9361659b5f2f41d814b152f8a47a3821564971debd2b788997/diff",#012 "WorkDir": "/var/lib/containers/storage/overlay/9c7bc0417a3c6c9361659b5f2f41d814b152f8a47a3821564971debd2b788997/work"#012 }#012 },#012 "RootFS": {#012 "Type": "layers",#012 "Layers": [#012 "sha256:f3f40f6483bf6d587286da9e86e40878c2aaaf723da5aa2364fff24f5ea28424",#012 "sha256:2896905ce9321c1f2feb1f3ada413e86eda3444455358ab965478a041351b392",#012 "sha256:f640179b0564dc7abbe22bd39fc8810d5bbb8e54094fe7ebc5b3c45b658c4983",#012 "sha256:f004953af60f7a99c360488169b0781a154164be09dce508bd68d57932c60f8f"#012 ]#012 },#012 "Labels": {#012 "io.buildah.version": "1.41.3",#012 "maintainer": "OpenStack Kubernetes Operator team",#012 "org.label-schema.build-date": "20251009",#012 "org.label-schema.license": "GPLv2",#012 "org.label-schema.name": "CentOS Stream 9 Base Image",#012 "org.label-schema.schema-version": "1.0",#012 "org.label-schema.vendor": "CentOS",#012 "tcib_build_tag": "0468cb21803d466b2abfe00835cf1d2d",#012 "tcib_managed": "true"#012 },#012 "Annotations": {},#012 "ManifestType": "application/vnd.docker.distribution.manifest.v2+json",#012 "User": "root",#012 "History": [#012 {#012 "created": "2025-10-09T00:18:03.867908726Z",#012 "created_by": "/bin/sh -c #(nop) ADD file:b2e608b9da8e087a764c2aebbd9c2cc9181047f5b301f1dab77fdf098a28268b in / ",#012 "empty_layer": true#012 },#012 {#012 
"created": "2025-10-09T00:18:03.868015697Z",#012 "created_by": "/bin/sh -c #(nop) LABEL org.label-schema.schema-version=\"1.0\" org.label-schema.name=\"CentOS Stream 9 Base Image\" org.label-schema.vendor=\"CentOS\" org.label-schema.license=\"GPLv2\" org.label-schema.build-date=\"20251009\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-10-09T00:18:07.890794359Z",#012 "created_by": "/bin/sh -c #(nop) CMD [\"/bin/bash\"]"#012 },#012 {#012 "created": "2025-10-14T06:08:54.969219151Z",#012 "created_by": "/bin/sh -c #(nop) LABEL maintainer=\"OpenStack Kubernetes Operator team\"",#012 "comment": "FROM quay.io/centos/centos:stream9",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-10-14T06:08:54.969253522Z",#012 "created_by": "/bin/sh -c #(nop) LABEL tcib_managed=true",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-10-14T06:08:54.969285133Z",#012 "created_by": "/bin/sh -c #(nop) ENV LANG=\"en_US.UTF-8\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-10-14T06:08:54.969308103Z",#012 "created_by": "/bin/sh -c #(nop) ENV TZ=\"UTC\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-10-14T06:08:54.969342284Z",#012 "created_by": "/bin/sh -c #(nop) ENV container=\"oci\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-10-14T06:08:54.969363945Z",#012 "created_by": "/bin/sh -c #(nop) USER root",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-10-14T06:08:55.340499198Z",#012 "created_by": "/bin/sh -c if [ -f \"/etc/yum.repos.d/ubi.repo\" ]; then rm -f /etc/yum.repos.d/ubi.repo && dnf clean all && rm -rf /var/cache/dnf; fi",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-10-14T06:09:32.389605838Z",#012 "created_by": "/bin/sh -c dnf install -y crudini && crudini --del /etc/dnf/dnf.conf main override_install_langs && crudini --set /etc/dnf/dnf.conf main clean_requirements_on_remove True && crudini --set /etc/dnf/dnf.conf main exactarch 1 && crudini --set /etc/dnf/dnf.conf main gpgcheck 1 && 
crudini --set /etc/dnf/dnf.conf main install_weak_deps False && if [ 'centos' == 'centos' ];then crudini --set /etc/dnf/dnf.conf main best False; fi && crudini --set /etc/dnf/dnf.conf main installonly_limit 0 && crudini --set /etc/dnf/dnf.conf main keepcache 0 && crudini --set /etc/dnf/dnf.conf main obsoletes 1 && crudini --set /etc/dnf/dnf.conf main plugins 1 && crudini --set /etc/dnf/dnf.conf main skip_missing_names_on_install False && crudini --set /etc/dnf/dnf.conf main tsflags nodocs",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-10-14T06:09:35.587912811Z",#012 "created_by": "/bin/sh -c dnf install -y ca-certificates dumb-init glibc-langpack-en procps-ng python3 sudo util-linux-user which Oct 14 05:53:23 localhost python3.9[277587]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Oct 14 05:53:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9631 DF PROTO=TCP SPT=57070 DPT=9102 SEQ=234676103 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B1602F60000000001030307) Oct 14 05:53:24 localhost python3.9[277699]: ansible-file Invoked with path=/etc/systemd/system/edpm_iscsid.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:53:24 localhost nova_compute[238069]: 2025-10-14 09:53:24.313 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:53:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b 
MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9632 DF PROTO=TCP SPT=57070 DPT=9102 SEQ=234676103 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B16071A0000000001030307) Oct 14 05:53:24 localhost python3.9[277754]: ansible-stat Invoked with path=/etc/systemd/system/edpm_iscsid_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Oct 14 05:53:25 localhost python3.9[277863]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1760435604.7988477-988-41006625397526/source dest=/etc/systemd/system/edpm_iscsid.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:53:26 localhost nova_compute[238069]: 2025-10-14 09:53:26.072 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:53:26 localhost python3.9[277918]: ansible-systemd Invoked with state=started name=edpm_iscsid.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Oct 14 05:53:26 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9633 DF PROTO=TCP SPT=57070 DPT=9102 SEQ=234676103 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B160F1A0000000001030307) Oct 14 05:53:28 localhost python3.9[278028]: ansible-ansible.builtin.stat Invoked with path=/etc/iscsi/.iscsid_restart_required follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Oct 14 05:53:28 localhost podman[248187]: 
time="2025-10-14T09:53:28Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Oct 14 05:53:28 localhost podman[248187]: @ - - [14/Oct/2025:09:53:28 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 141160 "" "Go-http-client/1.1" Oct 14 05:53:28 localhost podman[248187]: @ - - [14/Oct/2025:09:53:28 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18331 "" "Go-http-client/1.1" Oct 14 05:53:29 localhost python3.9[278140]: ansible-ansible.builtin.systemd Invoked with name=edpm_iscsid state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None Oct 14 05:53:29 localhost systemd[1]: Stopping iscsid container... Oct 14 05:53:29 localhost iscsid[217542]: iscsid shutting down. Oct 14 05:53:29 localhost systemd[1]: libpod-fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29.scope: Deactivated successfully. 
Oct 14 05:53:29 localhost podman[278144]: 2025-10-14 09:53:29.226095872 +0000 UTC m=+0.079220550 container died fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, io.buildah.version=1.41.3, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_id=iscsid, org.label-schema.name=CentOS Stream 9 Base Image) Oct 14 05:53:29 localhost systemd[1]: fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29.timer: Deactivated successfully. Oct 14 05:53:29 localhost systemd[1]: Stopped /usr/bin/podman healthcheck run fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29. 
Oct 14 05:53:29 localhost systemd[1]: fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29.service: Failed to open /run/systemd/transient/fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29.service: No such file or directory Oct 14 05:53:29 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29-userdata-shm.mount: Deactivated successfully. Oct 14 05:53:29 localhost systemd[1]: var-lib-containers-storage-overlay-49c20d678c288528670e909b37629f6fe8db82db90dafb25b7a74d603708ca24-merged.mount: Deactivated successfully. Oct 14 05:53:29 localhost nova_compute[238069]: 2025-10-14 09:53:29.357 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:53:29 localhost podman[278144]: 2025-10-14 09:53:29.372832511 +0000 UTC m=+0.225957129 container cleanup fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, org.label-schema.vendor=CentOS, container_name=iscsid, org.label-schema.build-date=20251009, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image) Oct 14 05:53:29 localhost podman[278144]: iscsid Oct 14 05:53:29 localhost systemd[1]: fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29.timer: Failed to open /run/systemd/transient/fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29.timer: No such file or directory Oct 14 05:53:29 localhost systemd[1]: fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29.service: Failed to open /run/systemd/transient/fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29.service: No such file or directory Oct 14 05:53:29 localhost podman[278171]: 2025-10-14 09:53:29.477850971 +0000 UTC m=+0.073117613 container cleanup fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true, config_id=iscsid, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS) Oct 14 05:53:29 localhost podman[278171]: iscsid Oct 14 05:53:29 localhost systemd[1]: edpm_iscsid.service: Deactivated successfully. Oct 14 05:53:29 localhost systemd[1]: Stopped iscsid container. Oct 14 05:53:29 localhost systemd[1]: Starting iscsid container... Oct 14 05:53:29 localhost systemd[1]: Started libcrun container. 
Oct 14 05:53:29 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/49c20d678c288528670e909b37629f6fe8db82db90dafb25b7a74d603708ca24/merged/etc/target supports timestamps until 2038 (0x7fffffff) Oct 14 05:53:29 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/49c20d678c288528670e909b37629f6fe8db82db90dafb25b7a74d603708ca24/merged/etc/iscsi supports timestamps until 2038 (0x7fffffff) Oct 14 05:53:29 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/49c20d678c288528670e909b37629f6fe8db82db90dafb25b7a74d603708ca24/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff) Oct 14 05:53:29 localhost systemd[1]: fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29.timer: Failed to open /run/systemd/transient/fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29.timer: No such file or directory Oct 14 05:53:29 localhost systemd[1]: fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29.service: Failed to open /run/systemd/transient/fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29.service: No such file or directory Oct 14 05:53:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29. 
Oct 14 05:53:29 localhost podman[278184]: 2025-10-14 09:53:29.657624214 +0000 UTC m=+0.143584415 container init fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, container_name=iscsid, org.label-schema.build-date=20251009, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_id=iscsid) Oct 14 05:53:29 localhost iscsid[278198]: + sudo -E kolla_set_configs Oct 14 05:53:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29. 
Oct 14 05:53:29 localhost podman[278184]: 2025-10-14 09:53:29.699128915 +0000 UTC m=+0.185089066 container start fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true, container_name=iscsid, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3) Oct 14 05:53:29 localhost podman[278184]: iscsid Oct 14 05:53:29 localhost systemd[1]: Started iscsid container. Oct 14 05:53:29 localhost systemd[1]: Created slice User Slice of UID 0. Oct 14 05:53:29 localhost systemd[1]: Starting User Runtime Directory /run/user/0... 
Oct 14 05:53:29 localhost systemd[1]: Finished User Runtime Directory /run/user/0. Oct 14 05:53:29 localhost systemd[1]: Starting User Manager for UID 0... Oct 14 05:53:29 localhost podman[278206]: 2025-10-14 09:53:29.778928893 +0000 UTC m=+0.084323407 container health_status fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=starting, container_name=iscsid, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, tcib_managed=true) Oct 14 05:53:29 localhost podman[278206]: 2025-10-14 09:53:29.817198325 +0000 UTC 
m=+0.122592799 container exec_died fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, config_id=iscsid, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d) Oct 14 05:53:29 localhost podman[278206]: unhealthy Oct 14 05:53:29 localhost systemd[1]: fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29.service: Main process exited, code=exited, status=1/FAILURE Oct 14 05:53:29 localhost systemd[1]: fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29.service: Failed with result 'exit-code'. 
Oct 14 05:53:29 localhost systemd[278216]: Queued start job for default target Main User Target. Oct 14 05:53:29 localhost systemd[278216]: Created slice User Application Slice. Oct 14 05:53:29 localhost systemd-journald[47488]: Field hash table of /run/log/journal/8e1d5208cffec42b50976967e1d1cfd0/system.journal has a fill level at 75.1 (250 of 333 items), suggesting rotation. Oct 14 05:53:29 localhost systemd-journald[47488]: /run/log/journal/8e1d5208cffec42b50976967e1d1cfd0/system.journal: Journal header limits reached or header out-of-date, rotating. Oct 14 05:53:29 localhost systemd[278216]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system). Oct 14 05:53:29 localhost rsyslogd[759]: imjournal: journal files changed, reloading... [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ] Oct 14 05:53:29 localhost systemd[278216]: Started Daily Cleanup of User's Temporary Directories. Oct 14 05:53:29 localhost systemd[278216]: Reached target Paths. Oct 14 05:53:29 localhost systemd[278216]: Reached target Timers. Oct 14 05:53:29 localhost systemd[278216]: Starting D-Bus User Message Bus Socket... Oct 14 05:53:29 localhost systemd[278216]: Starting Create User's Volatile Files and Directories... Oct 14 05:53:29 localhost systemd[278216]: Listening on D-Bus User Message Bus Socket. Oct 14 05:53:29 localhost systemd[278216]: Reached target Sockets. Oct 14 05:53:29 localhost systemd[278216]: Finished Create User's Volatile Files and Directories. Oct 14 05:53:29 localhost systemd[278216]: Reached target Basic System. Oct 14 05:53:29 localhost systemd[278216]: Reached target Main User Target. Oct 14 05:53:29 localhost systemd[278216]: Startup finished in 164ms. Oct 14 05:53:29 localhost systemd[1]: Started User Manager for UID 0. Oct 14 05:53:29 localhost systemd[1]: Started Session c16 of User root. Oct 14 05:53:30 localhost rsyslogd[759]: imjournal: journal files changed, reloading... 
[v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ] Oct 14 05:53:30 localhost iscsid[278198]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json Oct 14 05:53:30 localhost iscsid[278198]: INFO:__main__:Validating config file Oct 14 05:53:30 localhost iscsid[278198]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS Oct 14 05:53:30 localhost iscsid[278198]: INFO:__main__:Writing out command to execute Oct 14 05:53:30 localhost systemd[1]: session-c16.scope: Deactivated successfully. Oct 14 05:53:30 localhost iscsid[278198]: ++ cat /run_command Oct 14 05:53:30 localhost iscsid[278198]: + CMD='/usr/sbin/iscsid -f' Oct 14 05:53:30 localhost iscsid[278198]: + ARGS= Oct 14 05:53:30 localhost iscsid[278198]: + sudo kolla_copy_cacerts Oct 14 05:53:30 localhost systemd[1]: Started Session c17 of User root. Oct 14 05:53:30 localhost systemd[1]: session-c17.scope: Deactivated successfully. Oct 14 05:53:30 localhost iscsid[278198]: + [[ ! -n '' ]] Oct 14 05:53:30 localhost iscsid[278198]: + . kolla_extend_start Oct 14 05:53:30 localhost iscsid[278198]: ++ [[ ! 
-f /etc/iscsi/initiatorname.iscsi ]] Oct 14 05:53:30 localhost iscsid[278198]: Running command: '/usr/sbin/iscsid -f' Oct 14 05:53:30 localhost iscsid[278198]: + echo 'Running command: '\''/usr/sbin/iscsid -f'\''' Oct 14 05:53:30 localhost iscsid[278198]: + umask 0022 Oct 14 05:53:30 localhost iscsid[278198]: + exec /usr/sbin/iscsid -f Oct 14 05:53:30 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9634 DF PROTO=TCP SPT=57070 DPT=9102 SEQ=234676103 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B161EDA0000000001030307) Oct 14 05:53:31 localhost nova_compute[238069]: 2025-10-14 09:53:31.116 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:53:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec. 
Oct 14 05:53:31 localhost podman[278355]: 2025-10-14 09:53:31.34238196 +0000 UTC m=+0.084891314 container health_status aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Oct 14 05:53:31 localhost podman[278355]: 2025-10-14 09:53:31.350512519 +0000 UTC m=+0.093021923 container exec_died aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 
'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm) Oct 14 05:53:31 localhost systemd[1]: aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec.service: Deactivated successfully. Oct 14 05:53:31 localhost python3.9[278354]: ansible-ansible.builtin.file Invoked with path=/etc/iscsi/.iscsid_restart_required state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:53:32 localhost python3.9[278487]: ansible-ansible.builtin.service_facts Invoked Oct 14 05:53:32 localhost network[278504]: You are using 'network' service provided by 'network-scripts', which are now deprecated. 
Oct 14 05:53:32 localhost network[278505]: 'network-scripts' will be removed from distribution in near future. Oct 14 05:53:32 localhost network[278506]: It is advised to switch to 'NetworkManager' instead for network management. Oct 14 05:53:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad. Oct 14 05:53:33 localhost podman[278537]: 2025-10-14 09:53:33.924772588 +0000 UTC m=+0.086707549 container health_status 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_id=multipathd, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd) Oct 14 05:53:33 localhost podman[278537]: 2025-10-14 09:53:33.940301484 +0000 UTC m=+0.102236455 container exec_died 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', 
'/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}) Oct 14 05:53:33 localhost systemd[1]: 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad.service: Deactivated successfully. Oct 14 05:53:34 localhost systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Oct 14 05:53:34 localhost nova_compute[238069]: 2025-10-14 09:53:34.390 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:53:36 localhost nova_compute[238069]: 2025-10-14 09:53:36.120 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:53:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f. Oct 14 05:53:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917. 
Oct 14 05:53:38 localhost podman[278686]: 2025-10-14 09:53:38.031413042 +0000 UTC m=+0.074458804 container health_status 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}) Oct 14 05:53:38 localhost podman[278686]: 2025-10-14 09:53:38.11974684 +0000 UTC m=+0.162792642 container exec_died 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.license=GPLv2, 
org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller) Oct 14 05:53:38 localhost systemd[1]: 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f.service: Deactivated successfully. Oct 14 05:53:38 localhost podman[278687]: 2025-10-14 09:53:38.119436821 +0000 UTC m=+0.156714317 container health_status 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, release=1755695350, managed_by=edpm_ansible, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, 
io.openshift.expose-services=) Oct 14 05:53:38 localhost podman[278687]: 2025-10-14 09:53:38.202178428 +0000 UTC m=+0.239456014 container exec_died 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, config_id=edpm, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., vcs-type=git, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, container_name=openstack_network_exporter, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., distribution-scope=public) Oct 14 05:53:38 localhost systemd[1]: 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917.service: Deactivated successfully. 
Oct 14 05:53:38 localhost openstack_network_exporter[250374]: ERROR 09:53:38 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Oct 14 05:53:38 localhost openstack_network_exporter[250374]: ERROR 09:53:38 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Oct 14 05:53:38 localhost openstack_network_exporter[250374]: ERROR 09:53:38 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Oct 14 05:53:38 localhost openstack_network_exporter[250374]: ERROR 09:53:38 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Oct 14 05:53:38 localhost openstack_network_exporter[250374]: Oct 14 05:53:38 localhost openstack_network_exporter[250374]: ERROR 09:53:38 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Oct 14 05:53:38 localhost openstack_network_exporter[250374]: Oct 14 05:53:38 localhost python3.9[278868]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None Oct 14 05:53:39 localhost nova_compute[238069]: 2025-10-14 09:53:39.431 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:53:39 localhost podman[279055]: Oct 14 05:53:39 localhost podman[279055]: 2025-10-14 09:53:39.691871363 +0000 UTC m=+0.069700377 container create 394a138278bdef50eb575cb4019dbfe36bd1f9b3ad430b4daf97c9315cde78b1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=stupefied_austin, build-date=2025-09-24T08:57:55, CEPH_POINT_RELEASE=, ceph=True, 
maintainer=Guillaume Abrioux , release=553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhceph, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_BRANCH=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.buildah.version=1.33.12, vendor=Red Hat, Inc., distribution-scope=public, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, architecture=x86_64) Oct 14 05:53:39 localhost systemd[1]: Started libpod-conmon-394a138278bdef50eb575cb4019dbfe36bd1f9b3ad430b4daf97c9315cde78b1.scope. Oct 14 05:53:39 localhost systemd[1]: Started libcrun container. 
Oct 14 05:53:39 localhost podman[279055]: 2025-10-14 09:53:39.659005136 +0000 UTC m=+0.036834230 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Oct 14 05:53:39 localhost podman[279055]: 2025-10-14 09:53:39.768087841 +0000 UTC m=+0.145916885 container init 394a138278bdef50eb575cb4019dbfe36bd1f9b3ad430b4daf97c9315cde78b1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=stupefied_austin, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux , architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.expose-services=, RELEASE=main, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.buildah.version=1.33.12, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, version=7, release=553, distribution-scope=public, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., build-date=2025-09-24T08:57:55, CEPH_POINT_RELEASE=, GIT_CLEAN=True, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git) Oct 14 05:53:39 localhost podman[279055]: 2025-10-14 09:53:39.780536352 +0000 UTC m=+0.158365386 container start 394a138278bdef50eb575cb4019dbfe36bd1f9b3ad430b4daf97c9315cde78b1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=stupefied_austin, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, vcs-type=git, 
org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_BRANCH=main, maintainer=Guillaume Abrioux , version=7, CEPH_POINT_RELEASE=, build-date=2025-09-24T08:57:55, RELEASE=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, release=553, io.openshift.expose-services=, vendor=Red Hat, Inc., ceph=True, name=rhceph) Oct 14 05:53:39 localhost podman[279055]: 2025-10-14 09:53:39.780959415 +0000 UTC m=+0.158788489 container attach 394a138278bdef50eb575cb4019dbfe36bd1f9b3ad430b4daf97c9315cde78b1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=stupefied_austin, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, build-date=2025-09-24T08:57:55, vcs-type=git, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, architecture=x86_64, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, release=553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, maintainer=Guillaume Abrioux , io.openshift.expose-services=, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., RELEASE=main, distribution-scope=public, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, 
description=Red Hat Ceph Storage 7) Oct 14 05:53:39 localhost stupefied_austin[279071]: 167 167 Oct 14 05:53:39 localhost systemd[1]: libpod-394a138278bdef50eb575cb4019dbfe36bd1f9b3ad430b4daf97c9315cde78b1.scope: Deactivated successfully. Oct 14 05:53:39 localhost podman[279055]: 2025-10-14 09:53:39.792779447 +0000 UTC m=+0.170608551 container died 394a138278bdef50eb575cb4019dbfe36bd1f9b3ad430b4daf97c9315cde78b1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=stupefied_austin, RELEASE=main, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, architecture=x86_64, maintainer=Guillaume Abrioux , version=7, io.openshift.expose-services=, build-date=2025-09-24T08:57:55, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, io.buildah.version=1.33.12, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, CEPH_POINT_RELEASE=, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, release=553, distribution-scope=public) Oct 14 05:53:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d. 
Oct 14 05:53:39 localhost python3.9[279065]: ansible-community.general.modprobe Invoked with name=dm-multipath state=present params= persistent=disabled Oct 14 05:53:39 localhost podman[279076]: 2025-10-14 09:53:39.908384532 +0000 UTC m=+0.103823224 container remove 394a138278bdef50eb575cb4019dbfe36bd1f9b3ad430b4daf97c9315cde78b1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=stupefied_austin, GIT_BRANCH=main, architecture=x86_64, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.33.12, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, RELEASE=main, ceph=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, build-date=2025-09-24T08:57:55, GIT_CLEAN=True, release=553, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, io.openshift.expose-services=, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=Guillaume Abrioux , name=rhceph) Oct 14 05:53:39 localhost systemd[1]: libpod-conmon-394a138278bdef50eb575cb4019dbfe36bd1f9b3ad430b4daf97c9315cde78b1.scope: Deactivated successfully. 
Oct 14 05:53:39 localhost podman[279077]: 2025-10-14 09:53:39.973813698 +0000 UTC m=+0.153210389 container health_status 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.build-date=20251009, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true) Oct 14 05:53:40 localhost podman[279077]: 2025-10-14 09:53:40.011354119 +0000 UTC m=+0.190750790 container exec_died 
89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=edpm, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute) Oct 14 05:53:40 localhost systemd[1]: 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d.service: Deactivated successfully. 
Oct 14 05:53:40 localhost podman[279135]: Oct 14 05:53:40 localhost podman[279135]: 2025-10-14 09:53:40.092640451 +0000 UTC m=+0.060564788 container create f9ecb36b8ee454e7db9a5353221b57aef6cd1efa4cd41f372e8b6941a3d30ac8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nifty_hermann, io.openshift.expose-services=, name=rhceph, RELEASE=main, io.buildah.version=1.33.12, build-date=2025-09-24T08:57:55, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, distribution-scope=public, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, vcs-type=git, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux , vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, release=553, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/agreements, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553) Oct 14 05:53:40 localhost systemd[1]: Started libpod-conmon-f9ecb36b8ee454e7db9a5353221b57aef6cd1efa4cd41f372e8b6941a3d30ac8.scope. Oct 14 05:53:40 localhost systemd[1]: Stopping User Manager for UID 0... Oct 14 05:53:40 localhost systemd[278216]: Activating special unit Exit the Session... Oct 14 05:53:40 localhost systemd[278216]: Stopped target Main User Target. Oct 14 05:53:40 localhost systemd[278216]: Stopped target Basic System. Oct 14 05:53:40 localhost systemd[278216]: Stopped target Paths. Oct 14 05:53:40 localhost systemd[278216]: Stopped target Sockets. Oct 14 05:53:40 localhost systemd[278216]: Stopped target Timers. 
Oct 14 05:53:40 localhost systemd[278216]: Stopped Daily Cleanup of User's Temporary Directories. Oct 14 05:53:40 localhost systemd[278216]: Closed D-Bus User Message Bus Socket. Oct 14 05:53:40 localhost systemd[278216]: Stopped Create User's Volatile Files and Directories. Oct 14 05:53:40 localhost systemd[278216]: Removed slice User Application Slice. Oct 14 05:53:40 localhost systemd[278216]: Reached target Shutdown. Oct 14 05:53:40 localhost systemd[278216]: Finished Exit the Session. Oct 14 05:53:40 localhost systemd[278216]: Reached target Exit the Session. Oct 14 05:53:40 localhost systemd[1]: Started libcrun container. Oct 14 05:53:40 localhost systemd[1]: user@0.service: Deactivated successfully. Oct 14 05:53:40 localhost systemd[1]: Stopped User Manager for UID 0. Oct 14 05:53:40 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/97aee8bedd4dbf47f2ec22b5eb60af0b6828578726cd68bc804a2046bc0175a4/merged/rootfs supports timestamps until 2038 (0x7fffffff) Oct 14 05:53:40 localhost systemd[1]: Stopping User Runtime Directory /run/user/0... 
Oct 14 05:53:40 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/97aee8bedd4dbf47f2ec22b5eb60af0b6828578726cd68bc804a2046bc0175a4/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff) Oct 14 05:53:40 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/97aee8bedd4dbf47f2ec22b5eb60af0b6828578726cd68bc804a2046bc0175a4/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff) Oct 14 05:53:40 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/97aee8bedd4dbf47f2ec22b5eb60af0b6828578726cd68bc804a2046bc0175a4/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff) Oct 14 05:53:40 localhost podman[279135]: 2025-10-14 09:53:40.065961513 +0000 UTC m=+0.033885760 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Oct 14 05:53:40 localhost podman[279135]: 2025-10-14 09:53:40.169656263 +0000 UTC m=+0.137580530 container init f9ecb36b8ee454e7db9a5353221b57aef6cd1efa4cd41f372e8b6941a3d30ac8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nifty_hermann, distribution-scope=public, description=Red Hat Ceph Storage 7, vcs-type=git, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., release=553, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.buildah.version=1.33.12, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, GIT_CLEAN=True, ceph=True, maintainer=Guillaume Abrioux , org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, 
build-date=2025-09-24T08:57:55, io.openshift.expose-services=, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0) Oct 14 05:53:40 localhost systemd[1]: user-runtime-dir@0.service: Deactivated successfully. Oct 14 05:53:40 localhost systemd[1]: Stopped User Runtime Directory /run/user/0. Oct 14 05:53:40 localhost systemd[1]: Removed slice User Slice of UID 0. Oct 14 05:53:40 localhost podman[279135]: 2025-10-14 09:53:40.183061744 +0000 UTC m=+0.150985991 container start f9ecb36b8ee454e7db9a5353221b57aef6cd1efa4cd41f372e8b6941a3d30ac8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nifty_hermann, architecture=x86_64, name=rhceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, release=553, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.component=rhceph-container, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, distribution-scope=public, io.openshift.expose-services=, GIT_CLEAN=True, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, maintainer=Guillaume Abrioux , build-date=2025-09-24T08:57:55, description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, ceph=True) Oct 14 05:53:40 localhost podman[279135]: 2025-10-14 09:53:40.183497707 +0000 UTC m=+0.151422014 container attach f9ecb36b8ee454e7db9a5353221b57aef6cd1efa4cd41f372e8b6941a3d30ac8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nifty_hermann, build-date=2025-09-24T08:57:55, GIT_REPO=https://github.com/ceph/ceph-container.git, 
version=7, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, release=553, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, GIT_CLEAN=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_BRANCH=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vcs-type=git, maintainer=Guillaume Abrioux , com.redhat.component=rhceph-container, ceph=True, architecture=x86_64, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main) Oct 14 05:53:40 localhost systemd[1]: var-lib-containers-storage-overlay-02a4bd200eeddc4052d2ac504b72d5312d11a3aae9e79ce05e24b079b890910e-merged.mount: Deactivated successfully. Oct 14 05:53:40 localhost systemd[1]: run-user-0.mount: Deactivated successfully. 
Oct 14 05:53:40 localhost python3.9[279252]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/dm-multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True Oct 14 05:53:41 localhost nova_compute[238069]: 2025-10-14 09:53:41.156 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:53:41 localhost python3.9[280392]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/etc/modules-load.d/dm-multipath.conf _original_basename=module-load.conf.j2 recurse=False state=file path=/etc/modules-load.d/dm-multipath.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:53:41 localhost nifty_hermann[279162]: [ Oct 14 05:53:41 localhost nifty_hermann[279162]: { Oct 14 05:53:41 localhost nifty_hermann[279162]: "available": false, Oct 14 05:53:41 localhost nifty_hermann[279162]: "ceph_device": false, Oct 14 05:53:41 localhost nifty_hermann[279162]: "device_id": "QEMU_DVD-ROM_QM00001", Oct 14 05:53:41 localhost nifty_hermann[279162]: "lsm_data": {}, Oct 14 05:53:41 localhost nifty_hermann[279162]: "lvs": [], Oct 14 05:53:41 localhost nifty_hermann[279162]: "path": "/dev/sr0", Oct 14 05:53:41 localhost nifty_hermann[279162]: "rejected_reasons": [ Oct 14 05:53:41 localhost nifty_hermann[279162]: "Insufficient space (<5GB)", Oct 14 05:53:41 localhost nifty_hermann[279162]: "Has a FileSystem" Oct 14 05:53:41 localhost nifty_hermann[279162]: ], Oct 14 05:53:41 localhost nifty_hermann[279162]: "sys_api": { Oct 14 05:53:41 localhost nifty_hermann[279162]: "actuators": null, Oct 14 05:53:41 localhost nifty_hermann[279162]: "device_nodes": "sr0", Oct 14 05:53:41 localhost nifty_hermann[279162]: 
"human_readable_size": "482.00 KB", Oct 14 05:53:41 localhost nifty_hermann[279162]: "id_bus": "ata", Oct 14 05:53:41 localhost nifty_hermann[279162]: "model": "QEMU DVD-ROM", Oct 14 05:53:41 localhost nifty_hermann[279162]: "nr_requests": "2", Oct 14 05:53:41 localhost nifty_hermann[279162]: "partitions": {}, Oct 14 05:53:41 localhost nifty_hermann[279162]: "path": "/dev/sr0", Oct 14 05:53:41 localhost nifty_hermann[279162]: "removable": "1", Oct 14 05:53:41 localhost nifty_hermann[279162]: "rev": "2.5+", Oct 14 05:53:41 localhost nifty_hermann[279162]: "ro": "0", Oct 14 05:53:41 localhost nifty_hermann[279162]: "rotational": "1", Oct 14 05:53:41 localhost nifty_hermann[279162]: "sas_address": "", Oct 14 05:53:41 localhost nifty_hermann[279162]: "sas_device_handle": "", Oct 14 05:53:41 localhost nifty_hermann[279162]: "scheduler_mode": "mq-deadline", Oct 14 05:53:41 localhost nifty_hermann[279162]: "sectors": 0, Oct 14 05:53:41 localhost nifty_hermann[279162]: "sectorsize": "2048", Oct 14 05:53:41 localhost nifty_hermann[279162]: "size": 493568.0, Oct 14 05:53:41 localhost nifty_hermann[279162]: "support_discard": "0", Oct 14 05:53:41 localhost nifty_hermann[279162]: "type": "disk", Oct 14 05:53:41 localhost nifty_hermann[279162]: "vendor": "QEMU" Oct 14 05:53:41 localhost nifty_hermann[279162]: } Oct 14 05:53:41 localhost nifty_hermann[279162]: } Oct 14 05:53:41 localhost nifty_hermann[279162]: ] Oct 14 05:53:41 localhost systemd[1]: libpod-f9ecb36b8ee454e7db9a5353221b57aef6cd1efa4cd41f372e8b6941a3d30ac8.scope: Deactivated successfully. Oct 14 05:53:41 localhost systemd[1]: libpod-f9ecb36b8ee454e7db9a5353221b57aef6cd1efa4cd41f372e8b6941a3d30ac8.scope: Consumed 1.074s CPU time. 
Oct 14 05:53:41 localhost podman[279135]: 2025-10-14 09:53:41.266353828 +0000 UTC m=+1.234278065 container died f9ecb36b8ee454e7db9a5353221b57aef6cd1efa4cd41f372e8b6941a3d30ac8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nifty_hermann, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.tags=rhceph ceph, distribution-scope=public, CEPH_POINT_RELEASE=, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, com.redhat.component=rhceph-container, version=7, build-date=2025-09-24T08:57:55, name=rhceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, vcs-type=git, release=553, io.buildah.version=1.33.12, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=Guillaume Abrioux , vendor=Red Hat, Inc., io.openshift.expose-services=, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9) Oct 14 05:53:41 localhost systemd[1]: tmp-crun.fJJSXl.mount: Deactivated successfully. Oct 14 05:53:41 localhost systemd[1]: var-lib-containers-storage-overlay-97aee8bedd4dbf47f2ec22b5eb60af0b6828578726cd68bc804a2046bc0175a4-merged.mount: Deactivated successfully. 
Oct 14 05:53:41 localhost podman[281472]: 2025-10-14 09:53:41.394742535 +0000 UTC m=+0.112607404 container remove f9ecb36b8ee454e7db9a5353221b57aef6cd1efa4cd41f372e8b6941a3d30ac8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nifty_hermann, io.buildah.version=1.33.12, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, release=553, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, CEPH_POINT_RELEASE=, RELEASE=main, architecture=x86_64, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, maintainer=Guillaume Abrioux , summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/agreements, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, build-date=2025-09-24T08:57:55, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, version=7, GIT_CLEAN=True, distribution-scope=public, name=rhceph) Oct 14 05:53:41 localhost systemd[1]: libpod-conmon-f9ecb36b8ee454e7db9a5353221b57aef6cd1efa4cd41f372e8b6941a3d30ac8.scope: Deactivated successfully. 
Oct 14 05:53:42 localhost python3.9[281580]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=dm-multipath mode=0644 state=present path=/etc/modules backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:53:43 localhost python3.9[281708]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None Oct 14 05:53:44 localhost python3.9[281818]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Oct 14 05:53:44 localhost nova_compute[238069]: 2025-10-14 09:53:44.474 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:53:45 localhost python3.9[281930]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Oct 14 05:53:46 localhost nova_compute[238069]: 2025-10-14 09:53:46.194 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:53:47 localhost python3.9[282042]: ansible-ansible.legacy.command Invoked with _raw_params=grep -q '^blacklist\s*{' /etc/multipath.conf _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None 
Oct 14 05:53:47 localhost nova_compute[238069]: 2025-10-14 09:53:47.403 2 DEBUG oslo_service.periodic_task [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 05:53:47 localhost nova_compute[238069]: 2025-10-14 09:53:47.520 2 DEBUG oslo_service.periodic_task [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 05:53:47 localhost python3.9[282153]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^blacklist\s*{\n[\s]+devnode \"\.\*\" replace=blacklist { backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:53:48 localhost nova_compute[238069]: 2025-10-14 09:53:48.024 2 DEBUG oslo_service.periodic_task [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 05:53:48 localhost python3.9[282263]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line= find_multipaths yes path=/etc/multipath.conf regexp=^\s+find_multipaths state=present backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:53:49 localhost nova_compute[238069]: 2025-10-14 09:53:49.510 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:53:49 
localhost python3.9[282373]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line= recheck_wwid yes path=/etc/multipath.conf regexp=^\s+recheck_wwid state=present backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:53:50 localhost nova_compute[238069]: 2025-10-14 09:53:50.024 2 DEBUG oslo_service.periodic_task [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 05:53:50 localhost nova_compute[238069]: 2025-10-14 09:53:50.024 2 DEBUG nova.compute.manager [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Oct 14 05:53:50 localhost nova_compute[238069]: 2025-10-14 09:53:50.024 2 DEBUG nova.compute.manager [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Oct 14 05:53:50 localhost python3.9[282483]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line= skip_kpartx yes path=/etc/multipath.conf regexp=^\s+skip_kpartx state=present backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:53:50 localhost nova_compute[238069]: 2025-10-14 09:53:50.764 2 DEBUG oslo_concurrency.lockutils [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Acquiring lock "refresh_cache-88c4e366-b765-47a6-96bf-f7677f2ce67c" lock 
/usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Oct 14 05:53:50 localhost nova_compute[238069]: 2025-10-14 09:53:50.765 2 DEBUG oslo_concurrency.lockutils [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Acquired lock "refresh_cache-88c4e366-b765-47a6-96bf-f7677f2ce67c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Oct 14 05:53:50 localhost nova_compute[238069]: 2025-10-14 09:53:50.765 2 DEBUG nova.network.neutron [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] [instance: 88c4e366-b765-47a6-96bf-f7677f2ce67c] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Oct 14 05:53:50 localhost nova_compute[238069]: 2025-10-14 09:53:50.765 2 DEBUG nova.objects.instance [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 88c4e366-b765-47a6-96bf-f7677f2ce67c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Oct 14 05:53:50 localhost python3.9[282593]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line= user_friendly_names no path=/etc/multipath.conf regexp=^\s+user_friendly_names state=present backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:53:51 localhost nova_compute[238069]: 2025-10-14 09:53:51.196 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:53:51 localhost python3.9[282703]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Oct 14 05:53:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run 
0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041. Oct 14 05:53:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857. Oct 14 05:53:51 localhost podman[282717]: 2025-10-14 09:53:51.773478849 +0000 UTC m=+0.098544003 container health_status 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter) Oct 14 05:53:51 localhost podman[282717]: 2025-10-14 09:53:51.787764847 +0000 UTC m=+0.112830041 container exec_died 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': 
'/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Oct 14 05:53:51 localhost systemd[1]: 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041.service: Deactivated successfully. Oct 14 05:53:51 localhost podman[282741]: 2025-10-14 09:53:51.869882455 +0000 UTC m=+0.086582516 container health_status 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, io.buildah.version=1.41.3, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, org.label-schema.build-date=20251009, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS) Oct 14 05:53:51 localhost podman[282741]: 2025-10-14 09:53:51.903227007 +0000 UTC m=+0.119927058 container exec_died 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_id=ovn_metadata_agent, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, container_name=ovn_metadata_agent, managed_by=edpm_ansible) Oct 14 05:53:51 localhost systemd[1]: 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857.service: Deactivated successfully. Oct 14 05:53:52 localhost nova_compute[238069]: 2025-10-14 09:53:52.091 2 DEBUG nova.network.neutron [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] [instance: 88c4e366-b765-47a6-96bf-f7677f2ce67c] Updating instance_info_cache with network_info: [{"id": "3ec9b060-f43d-4698-9c76-6062c70911d5", "address": "fa:16:3e:84:5e:e5", "network": {"id": "7d0cd696-bdd7-4e70-9512-eb0d23640314", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.46", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "41187b090f3d4818a32baa37ce8a3991", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3ec9b060-f4", "ovs_interfaceid": "3ec9b060-f43d-4698-9c76-6062c70911d5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info 
/usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Oct 14 05:53:52 localhost nova_compute[238069]: 2025-10-14 09:53:52.181 2 DEBUG oslo_concurrency.lockutils [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Releasing lock "refresh_cache-88c4e366-b765-47a6-96bf-f7677f2ce67c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Oct 14 05:53:52 localhost nova_compute[238069]: 2025-10-14 09:53:52.182 2 DEBUG nova.compute.manager [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] [instance: 88c4e366-b765-47a6-96bf-f7677f2ce67c] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Oct 14 05:53:52 localhost nova_compute[238069]: 2025-10-14 09:53:52.183 2 DEBUG oslo_service.periodic_task [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 05:53:52 localhost nova_compute[238069]: 2025-10-14 09:53:52.183 2 DEBUG oslo_service.periodic_task [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 05:53:52 localhost nova_compute[238069]: 2025-10-14 09:53:52.183 2 DEBUG oslo_service.periodic_task [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 05:53:52 localhost nova_compute[238069]: 2025-10-14 09:53:52.184 2 DEBUG oslo_service.periodic_task [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks 
/usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 05:53:52 localhost nova_compute[238069]: 2025-10-14 09:53:52.184 2 DEBUG nova.compute.manager [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Oct 14 05:53:52 localhost python3.9[282855]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None Oct 14 05:53:53 localhost python3.9[282965]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True Oct 14 05:53:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7265 DF PROTO=TCP SPT=51762 DPT=9102 SEQ=1621284766 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B1678250000000001030307) Oct 14 05:53:53 localhost python3.9[283022]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Oct 14 05:53:54 localhost nova_compute[238069]: 
2025-10-14 09:53:54.024 2 DEBUG oslo_service.periodic_task [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 05:53:54 localhost nova_compute[238069]: 2025-10-14 09:53:54.044 2 DEBUG oslo_concurrency.lockutils [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Oct 14 05:53:54 localhost nova_compute[238069]: 2025-10-14 09:53:54.045 2 DEBUG oslo_concurrency.lockutils [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Oct 14 05:53:54 localhost nova_compute[238069]: 2025-10-14 09:53:54.045 2 DEBUG oslo_concurrency.lockutils [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Oct 14 05:53:54 localhost nova_compute[238069]: 2025-10-14 09:53:54.046 2 DEBUG nova.compute.resource_tracker [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Auditing locally available compute resources for np0005486733.localdomain (node: np0005486733.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Oct 14 05:53:54 localhost nova_compute[238069]: 2025-10-14 09:53:54.046 2 DEBUG oslo_concurrency.processutils [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack 
--conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Oct 14 05:53:54 localhost python3.9[283152]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True Oct 14 05:53:54 localhost nova_compute[238069]: 2025-10-14 09:53:54.480 2 DEBUG oslo_concurrency.processutils [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.434s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Oct 14 05:53:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7266 DF PROTO=TCP SPT=51762 DPT=9102 SEQ=1621284766 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B167C1A0000000001030307) Oct 14 05:53:54 localhost nova_compute[238069]: 2025-10-14 09:53:54.546 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:53:54 localhost nova_compute[238069]: 2025-10-14 09:53:54.551 2 DEBUG nova.virt.libvirt.driver [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Oct 14 05:53:54 localhost nova_compute[238069]: 2025-10-14 09:53:54.552 2 DEBUG nova.virt.libvirt.driver [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Oct 14 05:53:54 localhost nova_compute[238069]: 2025-10-14 09:53:54.760 2 
WARNING nova.virt.libvirt.driver [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m Oct 14 05:53:54 localhost nova_compute[238069]: 2025-10-14 09:53:54.761 2 DEBUG nova.compute.resource_tracker [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Hypervisor/Node resource view: name=np0005486733.localdomain free_ram=11774MB free_disk=41.83725357055664GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": 
"8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Oct 14 05:53:54 localhost nova_compute[238069]: 2025-10-14 09:53:54.761 2 DEBUG oslo_concurrency.lockutils [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Oct 14 05:53:54 localhost nova_compute[238069]: 2025-10-14 09:53:54.762 2 DEBUG oslo_concurrency.lockutils [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Oct 14 05:53:54 localhost nova_compute[238069]: 2025-10-14 09:53:54.828 2 DEBUG nova.compute.resource_tracker [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Instance 88c4e366-b765-47a6-96bf-f7677f2ce67c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Oct 14 05:53:54 localhost nova_compute[238069]: 2025-10-14 09:53:54.828 2 DEBUG nova.compute.resource_tracker [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Oct 14 05:53:54 localhost nova_compute[238069]: 2025-10-14 09:53:54.829 2 DEBUG nova.compute.resource_tracker [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Final resource view: name=np0005486733.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Oct 14 05:53:54 localhost nova_compute[238069]: 2025-10-14 09:53:54.864 2 DEBUG oslo_concurrency.processutils [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Oct 14 05:53:55 localhost nova_compute[238069]: 2025-10-14 09:53:55.305 2 DEBUG oslo_concurrency.processutils [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.441s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Oct 14 05:53:55 localhost nova_compute[238069]: 2025-10-14 09:53:55.312 2 DEBUG nova.compute.provider_tree [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Inventory has not changed in ProviderTree for provider: 18c24273-aca2-4f08-be57-3188d558235e update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Oct 14 05:53:55 localhost nova_compute[238069]: 2025-10-14 09:53:55.333 2 DEBUG 
nova.scheduler.client.report [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Inventory has not changed for provider 18c24273-aca2-4f08-be57-3188d558235e based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Oct 14 05:53:55 localhost nova_compute[238069]: 2025-10-14 09:53:55.336 2 DEBUG nova.compute.resource_tracker [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Compute_service record updated for np0005486733.localdomain:np0005486733.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Oct 14 05:53:55 localhost nova_compute[238069]: 2025-10-14 09:53:55.336 2 DEBUG oslo_concurrency.lockutils [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.575s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Oct 14 05:53:55 localhost python3.9[283231]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Oct 14 05:53:56 localhost python3.9[283343]: 
ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:53:56 localhost nova_compute[238069]: 2025-10-14 09:53:56.237 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:53:56 localhost nova_compute[238069]: 2025-10-14 09:53:56.332 2 DEBUG oslo_service.periodic_task [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 05:53:56 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7267 DF PROTO=TCP SPT=51762 DPT=9102 SEQ=1621284766 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B16841A0000000001030307) Oct 14 05:53:56 localhost python3.9[283453]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True Oct 14 05:53:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:53:57.762 163055 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Oct 14 05:53:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:53:57.762 163055 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" 
acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Oct 14 05:53:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:53:57.763 163055 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Oct 14 05:53:57 localhost python3.9[283510]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:53:58 localhost podman[248187]: time="2025-10-14T09:53:58Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Oct 14 05:53:58 localhost podman[248187]: @ - - [14/Oct/2025:09:53:58 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 141159 "" "Go-http-client/1.1" Oct 14 05:53:58 localhost podman[248187]: @ - - [14/Oct/2025:09:53:58 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18330 "" "Go-http-client/1.1" Oct 14 05:53:58 localhost python3.9[283620]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True Oct 14 05:53:59 localhost python3.9[283677]: ansible-ansible.legacy.file Invoked with 
group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:53:59 localhost nova_compute[238069]: 2025-10-14 09:53:59.578 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:54:00 localhost python3.9[283787]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None Oct 14 05:54:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29. Oct 14 05:54:00 localhost systemd[1]: Reloading. 
Oct 14 05:54:00 localhost podman[283789]: 2025-10-14 09:54:00.14242193 +0000 UTC m=+0.083916294 container health_status fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=starting, tcib_managed=true, org.label-schema.schema-version=1.0, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251009, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d) Oct 14 05:54:00 localhost systemd-rc-local-generator[283828]: /etc/rc.d/rc.local is not marked executable, skipping. Oct 14 05:54:00 localhost systemd-sysv-generator[283832]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. 
Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Oct 14 05:54:00 localhost podman[283789]: 2025-10-14 09:54:00.191148254 +0000 UTC m=+0.132642578 container exec_died fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, container_name=iscsid, io.buildah.version=1.41.3, tcib_managed=true) Oct 14 05:54:00 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; 
please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Oct 14 05:54:00 localhost systemd[1]: fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29.service: Deactivated successfully. Oct 14 05:54:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7268 DF PROTO=TCP SPT=51762 DPT=9102 SEQ=1621284766 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B1693DA0000000001030307) Oct 14 05:54:01 localhost python3.9[283953]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True Oct 14 05:54:01 localhost nova_compute[238069]: 2025-10-14 09:54:01.294 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:54:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec. 
Oct 14 05:54:01 localhost podman[284011]: 2025-10-14 09:54:01.640182902 +0000 UTC m=+0.099218563 container health_status aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm) Oct 14 05:54:01 localhost podman[284011]: 2025-10-14 09:54:01.65316281 +0000 UTC m=+0.112198451 container exec_died aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 
'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter) Oct 14 05:54:01 localhost systemd[1]: aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec.service: Deactivated successfully. 
Oct 14 05:54:01 localhost python3.9[284010]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:54:02 localhost python3.9[284143]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True Oct 14 05:54:02 localhost python3.9[284200]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:54:03 localhost python3.9[284310]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None Oct 14 05:54:03 localhost systemd[1]: Reloading. Oct 14 05:54:03 localhost systemd-rc-local-generator[284335]: /etc/rc.d/rc.local is not marked executable, skipping. Oct 14 05:54:03 localhost systemd-sysv-generator[284339]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. 
Please update package to include a native systemd unit file, in order to make it more safe and robust. Oct 14 05:54:04 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Oct 14 05:54:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad. Oct 14 05:54:04 localhost systemd[1]: Starting Create netns directory... Oct 14 05:54:04 localhost systemd[1]: run-netns-placeholder.mount: Deactivated successfully. Oct 14 05:54:04 localhost systemd[1]: netns-placeholder.service: Deactivated successfully. Oct 14 05:54:04 localhost systemd[1]: Finished Create netns directory. Oct 14 05:54:04 localhost podman[284347]: 2025-10-14 09:54:04.289456582 +0000 UTC m=+0.081864891 container health_status 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, tcib_managed=true) Oct 14 05:54:04 localhost podman[284347]: 2025-10-14 09:54:04.325738085 +0000 UTC m=+0.118146444 container exec_died 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=multipathd, container_name=multipathd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251009, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, managed_by=edpm_ansible) Oct 14 05:54:04 localhost systemd[1]: 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad.service: Deactivated successfully. Oct 14 05:54:04 localhost nova_compute[238069]: 2025-10-14 09:54:04.611 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:54:05 localhost python3.9[284479]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Oct 14 05:54:06 localhost python3.9[284589]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/multipathd/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True Oct 14 05:54:06 localhost nova_compute[238069]: 2025-10-14 09:54:06.295 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:54:07 localhost python3.9[284646]: ansible-ansible.legacy.file Invoked with group=zuul mode=0700 owner=zuul setype=container_file_t dest=/var/lib/openstack/healthchecks/multipathd/ _original_basename=healthcheck recurse=False state=file 
path=/var/lib/openstack/healthchecks/multipathd/ force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Oct 14 05:54:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f. Oct 14 05:54:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917. Oct 14 05:54:08 localhost podman[284718]: 2025-10-14 09:54:08.771119674 +0000 UTC m=+0.100557554 container health_status 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_id=ovn_controller, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team) Oct 14 05:54:08 localhost openstack_network_exporter[250374]: ERROR 09:54:08 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Oct 14 05:54:08 localhost openstack_network_exporter[250374]: ERROR 09:54:08 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Oct 14 05:54:08 localhost openstack_network_exporter[250374]: ERROR 09:54:08 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Oct 14 05:54:08 localhost openstack_network_exporter[250374]: ERROR 09:54:08 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Oct 14 05:54:08 localhost openstack_network_exporter[250374]: Oct 14 05:54:08 localhost openstack_network_exporter[250374]: ERROR 09:54:08 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Oct 14 05:54:08 localhost openstack_network_exporter[250374]: Oct 14 05:54:08 localhost podman[284719]: 2025-10-14 09:54:08.81762766 +0000 UTC m=+0.146632757 container health_status 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.openshift.expose-services=, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, vcs-type=git, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is 
a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, version=9.6, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}) Oct 14 05:54:08 localhost podman[284719]: 2025-10-14 09:54:08.835079634 +0000 UTC m=+0.164084731 container exec_died 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917 
(image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, vcs-type=git, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, release=1755695350, container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, 
description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, managed_by=edpm_ansible) Oct 14 05:54:08 localhost systemd[1]: 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917.service: Deactivated successfully. Oct 14 05:54:08 localhost podman[284718]: 2025-10-14 09:54:08.868965684 +0000 UTC m=+0.198403564 container exec_died 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_id=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.build-date=20251009, 
org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true) Oct 14 05:54:08 localhost systemd[1]: 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f.service: Deactivated successfully. Oct 14 05:54:09 localhost python3.9[284801]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None Oct 14 05:54:09 localhost nova_compute[238069]: 2025-10-14 09:54:09.652 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:54:10 localhost python3.9[284911]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/multipathd.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True Oct 14 05:54:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d. 
Oct 14 05:54:10 localhost podman[284930]: 2025-10-14 09:54:10.437800597 +0000 UTC m=+0.097451649 container health_status 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_managed=true, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS) Oct 14 05:54:10 localhost podman[284930]: 2025-10-14 09:54:10.473847731 +0000 UTC m=+0.133498733 container exec_died 
89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_managed=true, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d) Oct 14 05:54:10 localhost systemd[1]: 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d.service: Deactivated successfully. 
Oct 14 05:54:10 localhost python3.9[284987]: ansible-ansible.legacy.file Invoked with mode=0600 dest=/var/lib/kolla/config_files/multipathd.json _original_basename=.ut7wkrm9 recurse=False state=file path=/var/lib/kolla/config_files/multipathd.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:54:11 localhost nova_compute[238069]: 2025-10-14 09:54:11.336 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:54:11 localhost python3.9[285097]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/multipathd state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:54:13 localhost python3.9[285374]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/multipathd config_pattern=*.json debug=False Oct 14 05:54:14 localhost nova_compute[238069]: 2025-10-14 09:54:14.701 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:54:14 localhost python3.9[285484]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data Oct 14 05:54:15 localhost python3.9[285594]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None Oct 14 05:54:16 localhost nova_compute[238069]: 2025-10-14 09:54:16.375 2 DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:54:19 localhost nova_compute[238069]: 2025-10-14 09:54:19.737 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:54:20 localhost python3[285729]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/multipathd config_id=multipathd config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False Oct 14 05:54:20 localhost python3[285729]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: [#012 {#012 "Id": "0cc989a5ef996507b0f9d8ef7fc230c93fad4ad33debd19bbe24250b85566285",#012 "Digest": "sha256:7b5e7d0bff1c705215946e167be50eac031a93886d33e2e88e389776e8e13e70",#012 "RepoTags": [#012 "quay.io/podified-antelope-centos9/openstack-multipathd:current-podified"#012 ],#012 "RepoDigests": [#012 "quay.io/podified-antelope-centos9/openstack-multipathd@sha256:7b5e7d0bff1c705215946e167be50eac031a93886d33e2e88e389776e8e13e70"#012 ],#012 "Parent": "",#012 "Comment": "",#012 "Created": "2025-10-14T06:10:30.956277521Z",#012 "Config": {#012 "User": "root",#012 "Env": [#012 "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin",#012 "LANG=en_US.UTF-8",#012 "TZ=UTC",#012 "container=oci"#012 ],#012 "Entrypoint": [#012 "dumb-init",#012 "--single-child",#012 "--"#012 ],#012 "Cmd": [#012 "kolla_start"#012 ],#012 "Labels": {#012 "io.buildah.version": "1.41.3",#012 "maintainer": "OpenStack Kubernetes Operator team",#012 "org.label-schema.build-date": "20251009",#012 "org.label-schema.license": "GPLv2",#012 "org.label-schema.name": "CentOS Stream 9 Base Image",#012 "org.label-schema.schema-version": "1.0",#012 "org.label-schema.vendor": "CentOS",#012 "tcib_build_tag": "0468cb21803d466b2abfe00835cf1d2d",#012 "tcib_managed": "true"#012 },#012 "StopSignal": 
"SIGTERM"#012 },#012 "Version": "",#012 "Author": "",#012 "Architecture": "amd64",#012 "Os": "linux",#012 "Size": 249351661,#012 "VirtualSize": 249351661,#012 "GraphDriver": {#012 "Name": "overlay",#012 "Data": {#012 "LowerDir": "/var/lib/containers/storage/overlay/0b52816892c0967aea6a33893e73899adbf76e3ca055f6670535905d8ddf2b2c/diff:/var/lib/containers/storage/overlay/f3f40f6483bf6d587286da9e86e40878c2aaaf723da5aa2364fff24f5ea28424/diff",#012 "UpperDir": "/var/lib/containers/storage/overlay/b229675e52e0150c8f53be2f60bdcd02e09cc9ac91e9d7513ccf836c4fc95815/diff",#012 "WorkDir": "/var/lib/containers/storage/overlay/b229675e52e0150c8f53be2f60bdcd02e09cc9ac91e9d7513ccf836c4fc95815/work"#012 }#012 },#012 "RootFS": {#012 "Type": "layers",#012 "Layers": [#012 "sha256:f3f40f6483bf6d587286da9e86e40878c2aaaf723da5aa2364fff24f5ea28424",#012 "sha256:2896905ce9321c1f2feb1f3ada413e86eda3444455358ab965478a041351b392",#012 "sha256:3be5c7cbc12431945afa672da84f6330a9da4cc765276b49a4ad90cf80ae26d7"#012 ]#012 },#012 "Labels": {#012 "io.buildah.version": "1.41.3",#012 "maintainer": "OpenStack Kubernetes Operator team",#012 "org.label-schema.build-date": "20251009",#012 "org.label-schema.license": "GPLv2",#012 "org.label-schema.name": "CentOS Stream 9 Base Image",#012 "org.label-schema.schema-version": "1.0",#012 "org.label-schema.vendor": "CentOS",#012 "tcib_build_tag": "0468cb21803d466b2abfe00835cf1d2d",#012 "tcib_managed": "true"#012 },#012 "Annotations": {},#012 "ManifestType": "application/vnd.docker.distribution.manifest.v2+json",#012 "User": "root",#012 "History": [#012 {#012 "created": "2025-10-09T00:18:03.867908726Z",#012 "created_by": "/bin/sh -c #(nop) ADD file:b2e608b9da8e087a764c2aebbd9c2cc9181047f5b301f1dab77fdf098a28268b in / ",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-10-09T00:18:03.868015697Z",#012 "created_by": "/bin/sh -c #(nop) LABEL org.label-schema.schema-version=\"1.0\" org.label-schema.name=\"CentOS Stream 9 Base Image\" 
org.label-schema.vendor=\"CentOS\" org.label-schema.license=\"GPLv2\" org.label-schema.build-date=\"20251009\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-10-09T00:18:07.890794359Z",#012 "created_by": "/bin/sh -c #(nop) CMD [\"/bin/bash\"]"#012 },#012 {#012 "created": "2025-10-14T06:08:54.969219151Z",#012 "created_by": "/bin/sh -c #(nop) LABEL maintainer=\"OpenStack Kubernetes Operator team\"",#012 "comment": "FROM quay.io/centos/centos:stream9",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-10-14T06:08:54.969253522Z",#012 "created_by": "/bin/sh -c #(nop) LABEL tcib_managed=true",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-10-14T06:08:54.969285133Z",#012 "created_by": "/bin/sh -c #(nop) ENV LANG=\"en_US.UTF-8\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-10-14T06:08:54.969308103Z",#012 "created_by": "/bin/sh -c #(nop) ENV TZ=\"UTC\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-10-14T06:08:54.969342284Z",#012 "created_by": "/bin/sh -c #(nop) ENV container=\"oci\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-10-14T06:08:54.969363945Z",#012 "created_by": "/bin/sh -c #(nop) USER root",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-10-14T06:08:55.340499198Z",#012 "created_by": "/bin/sh -c if [ -f \"/etc/yum.repos.d/ubi.repo\" ]; then rm -f /etc/yum.repos.d/ubi.repo && dnf clean all && rm -rf /var/cache/dnf; fi",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-10-14T06:09:32.389605838Z",#012 "created_by": "/bin/sh -c dnf install -y crudini && crudini --del /etc/dnf/dnf.conf main override_install_langs && crudini --set /etc/dnf/dnf.conf main clean_requirements_on_remove True && crudini --set /etc/dnf/dnf.conf main exactarch 1 && crudini --set /etc/dnf/dnf.conf main gpgcheck 1 && crudini --set /etc/dnf/dnf.conf main install_weak_deps False && if [ 'centos' == 'centos' ];then crudini --set /etc/dnf/dnf.conf main best False; fi && crudini --set /etc/dnf/dnf.conf 
main installonly_limit 0 && crudini --set /etc/dnf/dnf.conf main keepcache 0 && crudini --set /etc/dnf/dnf.conf main obsoletes 1 && crudini --set /etc/dnf/dnf.conf main plugins 1 && crudini --set /etc/dnf/dnf.conf main skip_missing_names_on_install False && crudini --set /etc/dnf/dnf.conf main tsflags nodocs",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-10-14T06:09:35.587912811Z",#012 "created_by": "/bin/sh -c dnf install -y ca-certificates dumb-init glibc-langpack-en procps-ng python3 sudo util-linux-user which python-tcib-containers",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-10-14T06:09:35.976619634Z",#012 Oct 14 05:54:21 localhost nova_compute[238069]: 2025-10-14 09:54:21.413 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:54:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041. Oct 14 05:54:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857. 
Oct 14 05:54:22 localhost podman[285902]: 2025-10-14 09:54:22.331276364 +0000 UTC m=+0.091275760 container health_status 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_metadata_agent) Oct 14 05:54:22 localhost podman[285902]: 2025-10-14 09:54:22.365163602 +0000 UTC 
m=+0.125163018 container exec_died 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, tcib_managed=true) Oct 14 05:54:22 localhost systemd[1]: 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857.service: Deactivated successfully. 
Oct 14 05:54:22 localhost podman[285901]: 2025-10-14 09:54:22.388939852 +0000 UTC m=+0.151092434 container health_status 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Oct 14 05:54:22 localhost podman[285901]: 2025-10-14 09:54:22.426035709 +0000 UTC m=+0.188188311 container exec_died 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': 
['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter) Oct 14 05:54:22 localhost systemd[1]: 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041.service: Deactivated successfully. Oct 14 05:54:22 localhost python3.9[285900]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Oct 14 05:54:23 localhost python3.9[286053]: ansible-file Invoked with path=/etc/systemd/system/edpm_multipathd.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:54:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5668 DF PROTO=TCP SPT=37418 DPT=9102 SEQ=3968490608 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B16ED540000000001030307) Oct 14 05:54:23 localhost python3.9[286108]: ansible-stat Invoked with path=/etc/systemd/system/edpm_multipathd_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Oct 14 05:54:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5669 DF PROTO=TCP SPT=37418 DPT=9102 SEQ=3968490608 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B16F15A0000000001030307) Oct 14 05:54:24 localhost python3.9[286217]: ansible-copy Invoked with 
src=/home/zuul/.ansible/tmp/ansible-tmp-1760435663.9088752-2191-30047893800672/source dest=/etc/systemd/system/edpm_multipathd.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:54:24 localhost nova_compute[238069]: 2025-10-14 09:54:24.739 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:54:25 localhost python3.9[286272]: ansible-systemd Invoked with state=started name=edpm_multipathd.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Oct 14 05:54:25 localhost python3.9[286382]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath/.multipath_restart_required follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Oct 14 05:54:26 localhost nova_compute[238069]: 2025-10-14 09:54:26.453 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:54:26 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5670 DF PROTO=TCP SPT=37418 DPT=9102 SEQ=3968490608 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B16F95A0000000001030307) Oct 14 05:54:26 localhost python3.9[286492]: ansible-ansible.builtin.file Invoked with path=/etc/multipath/.multipath_restart_required state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None 
owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:54:27 localhost python3.9[286602]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None Oct 14 05:54:28 localhost podman[248187]: time="2025-10-14T09:54:28Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Oct 14 05:54:28 localhost podman[248187]: @ - - [14/Oct/2025:09:54:28 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 141158 "" "Go-http-client/1.1" Oct 14 05:54:28 localhost podman[248187]: @ - - [14/Oct/2025:09:54:28 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18329 "" "Go-http-client/1.1" Oct 14 05:54:28 localhost python3.9[286712]: ansible-community.general.modprobe Invoked with name=nvme-fabrics state=present params= persistent=disabled Oct 14 05:54:29 localhost python3.9[286822]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/nvme-fabrics.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True Oct 14 05:54:29 localhost nova_compute[238069]: 2025-10-14 09:54:29.785 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:54:30 localhost python3.9[286879]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/etc/modules-load.d/nvme-fabrics.conf _original_basename=module-load.conf.j2 recurse=False state=file path=/etc/modules-load.d/nvme-fabrics.conf force=False follow=True 
modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:54:30 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5671 DF PROTO=TCP SPT=37418 DPT=9102 SEQ=3968490608 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B17091A0000000001030307) Oct 14 05:54:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29. Oct 14 05:54:30 localhost podman[286951]: 2025-10-14 09:54:30.737203469 +0000 UTC m=+0.081559883 container health_status fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, managed_by=edpm_ansible, org.label-schema.build-date=20251009, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_id=iscsid, container_name=iscsid) Oct 14 05:54:30 localhost podman[286951]: 2025-10-14 09:54:30.747944869 +0000 UTC m=+0.092301233 container exec_died fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.build-date=20251009, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=iscsid, 
io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=iscsid, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d) Oct 14 05:54:30 localhost systemd[1]: fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29.service: Deactivated successfully. Oct 14 05:54:31 localhost nova_compute[238069]: 2025-10-14 09:54:31.492 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:54:31 localhost python3.9[287009]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=nvme-fabrics mode=0644 state=present path=/etc/modules backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:54:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec. Oct 14 05:54:32 localhost systemd[1]: tmp-crun.ysZKr9.mount: Deactivated successfully. 
Oct 14 05:54:32 localhost podman[287120]: 2025-10-14 09:54:32.511536298 +0000 UTC m=+0.101426394 container health_status aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Oct 14 05:54:32 localhost podman[287120]: 2025-10-14 09:54:32.549200809 +0000 UTC m=+0.139090895 container exec_died aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 
'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter) Oct 14 05:54:32 localhost systemd[1]: aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec.service: Deactivated successfully. 
Oct 14 05:54:32 localhost python3.9[287119]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d Oct 14 05:54:34 localhost python3.9[287203]: ansible-ansible.legacy.dnf Invoked with name=['nvme-cli'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None Oct 14 05:54:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad. Oct 14 05:54:34 localhost podman[287206]: 2025-10-14 09:54:34.745546725 +0000 UTC m=+0.090425016 container health_status 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251009, tcib_managed=true, org.label-schema.schema-version=1.0, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team) Oct 14 05:54:34 localhost podman[287206]: 2025-10-14 09:54:34.755956315 +0000 UTC m=+0.100834536 container exec_died 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', 
'/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, container_name=multipathd, org.label-schema.license=GPLv2) Oct 14 05:54:34 localhost systemd[1]: 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad.service: Deactivated successfully. Oct 14 05:54:34 localhost nova_compute[238069]: 2025-10-14 09:54:34.824 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:54:36 localhost nova_compute[238069]: 2025-10-14 09:54:36.495 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:54:38 localhost python3.9[287331]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Oct 14 05:54:38 localhost openstack_network_exporter[250374]: ERROR 09:54:38 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Oct 14 05:54:38 localhost openstack_network_exporter[250374]: ERROR 09:54:38 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Oct 14 05:54:38 localhost openstack_network_exporter[250374]: ERROR 09:54:38 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Oct 14 05:54:38 localhost openstack_network_exporter[250374]: ERROR 09:54:38 appctl.go:174: 
call(dpif-netdev/pmd-perf-show): please specify an existing datapath Oct 14 05:54:38 localhost openstack_network_exporter[250374]: Oct 14 05:54:38 localhost openstack_network_exporter[250374]: ERROR 09:54:38 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Oct 14 05:54:38 localhost openstack_network_exporter[250374]: Oct 14 05:54:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f. Oct 14 05:54:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917. Oct 14 05:54:39 localhost podman[287447]: 2025-10-14 09:54:39.342100145 +0000 UTC m=+0.050636520 container health_status 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, io.buildah.version=1.41.3, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, managed_by=edpm_ansible) Oct 14 05:54:39 localhost podman[287448]: 2025-10-14 09:54:39.376871606 +0000 UTC m=+0.079907022 container health_status 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.openshift.expose-services=, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, distribution-scope=public, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, name=ubi9-minimal, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, maintainer=Red Hat, Inc.) 
Oct 14 05:54:39 localhost podman[287448]: 2025-10-14 09:54:39.392033563 +0000 UTC m=+0.095069029 container exec_died 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, vendor=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal, distribution-scope=public, io.openshift.expose-services=, release=1755695350, maintainer=Red Hat, Inc., managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, io.k8s.description=The Universal 
Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal) Oct 14 05:54:39 localhost systemd[1]: 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917.service: Deactivated successfully. Oct 14 05:54:39 localhost python3.9[287446]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/ssh/ssh_known_hosts state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:54:39 localhost podman[287447]: 2025-10-14 09:54:39.476069571 +0000 UTC m=+0.184605956 container exec_died 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.build-date=20251009, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': 
True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3) Oct 14 05:54:39 localhost systemd[1]: 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f.service: Deactivated successfully. Oct 14 05:54:39 localhost nova_compute[238069]: 2025-10-14 09:54:39.863 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:54:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d. 
Oct 14 05:54:40 localhost podman[287601]: 2025-10-14 09:54:40.688758966 +0000 UTC m=+0.081650735 container health_status 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2) Oct 14 05:54:40 localhost podman[287601]: 2025-10-14 09:54:40.702032015 +0000 UTC m=+0.094923744 container exec_died 
89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251009) Oct 14 05:54:40 localhost systemd[1]: 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d.service: Deactivated successfully. 
Oct 14 05:54:40 localhost python3.9[287600]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None Oct 14 05:54:40 localhost systemd[1]: Reloading. Oct 14 05:54:41 localhost systemd-rc-local-generator[287645]: /etc/rc.d/rc.local is not marked executable, skipping. Oct 14 05:54:41 localhost systemd-sysv-generator[287649]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Oct 14 05:54:41 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Oct 14 05:54:41 localhost nova_compute[238069]: 2025-10-14 09:54:41.535 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:54:42 localhost python3.9[287762]: ansible-ansible.builtin.service_facts Invoked Oct 14 05:54:42 localhost network[287779]: You are using 'network' service provided by 'network-scripts', which are now deprecated. Oct 14 05:54:42 localhost network[287780]: 'network-scripts' will be removed from distribution in near future. Oct 14 05:54:42 localhost network[287781]: It is advised to switch to 'NetworkManager' instead for network management. Oct 14 05:54:43 localhost systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Oct 14 05:54:44 localhost nova_compute[238069]: 2025-10-14 09:54:44.894 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:54:46 localhost nova_compute[238069]: 2025-10-14 09:54:46.559 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:54:48 localhost nova_compute[238069]: 2025-10-14 09:54:48.024 2 DEBUG oslo_service.periodic_task [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 05:54:48 localhost nova_compute[238069]: 2025-10-14 09:54:48.024 2 DEBUG oslo_service.periodic_task [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 05:54:48 localhost python3.9[288100]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Oct 14 05:54:48 localhost python3.9[288211]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_migration_target.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Oct 14 05:54:49 localhost python3.9[288322]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api_cron.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.815 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 
'88c4e366-b765-47a6-96bf-f7677f2ce67c', 'name': 'test', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005486733.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '41187b090f3d4818a32baa37ce8a3991', 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'hostId': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228 Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.816 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.844 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.write.requests volume: 525 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.845 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.847 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'b86be2b6-ea45-4ca2-9124-2969b8d20022', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 525, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T09:54:49.816879', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'd2efb9a0-a8e3-11f0-9707-fa163e99780b', 'monotonic_time': 11706.009572612, 'message_signature': '0907ab33ac01532c0bb92d9dcd1bbdee1fdbe1bd0d25af035a1edd614e91792f'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T09:54:49.816879', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'd2efc986-a8e3-11f0-9707-fa163e99780b', 'monotonic_time': 11706.009572612, 'message_signature': '85256b2302b3f030aed8ac8df5722674e98c19dff460159866b7d7b373522e52'}]}, 'timestamp': '2025-10-14 09:54:49.845959', '_unique_id': 'f561b7bc70224cd7945e51780c51f472'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.847 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.847 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.847 12 ERROR oslo_messaging.notify.messaging yield Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.847 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.847 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.847 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.847 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.847 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.847 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.847 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.847 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.847 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.847 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.847 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.847 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.847 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 
05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.847 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.847 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.847 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.847 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.847 12 ERROR oslo_messaging.notify.messaging Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.847 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.847 12 ERROR oslo_messaging.notify.messaging Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.847 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.847 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.847 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.847 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 05:54:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.847 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.847 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.847 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.847 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.847 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.847 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.847 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.847 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.847 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.847 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.847 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.847 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.847 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.847 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.847 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.847 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.847 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.847 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.847 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.847 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:54:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.847 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.847 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.847 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.847 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.847 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.847 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.847 12 ERROR oslo_messaging.notify.messaging Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.848 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.852 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.853 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '00c6ffa3-a29a-4e53-a5a3-7a875e88bb53', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T09:54:49.848499', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': 'd2f0c70a-a8e3-11f0-9707-fa163e99780b', 'monotonic_time': 11706.041193976, 'message_signature': '930e0342ebf4f363337bec7311e010dda9220cdac09b59fc4656c62e4c43d61e'}]}, 'timestamp': '2025-10-14 09:54:49.852453', '_unique_id': 'a6abb8de0e8c4ad1ab1c9280063fc69b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.853 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 
2025-10-14 09:54:49.853 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.853 12 ERROR oslo_messaging.notify.messaging yield Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.853 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.853 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.853 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.853 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.853 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.853 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.853 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.853 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.853 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.853 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.853 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.853 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.853 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.853 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.853 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.853 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.853 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.853 12 ERROR oslo_messaging.notify.messaging Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.853 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.853 12 ERROR oslo_messaging.notify.messaging Oct 14 05:54:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.853 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.853 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.853 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.853 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.853 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.853 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.853 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.853 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.853 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.853 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.853 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.853 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.853 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.853 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.853 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.853 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.853 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.853 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.853 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.853 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.853 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.853 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.853 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.853 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.853 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.853 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.853 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.853 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.853 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.853 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.853 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.854 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.873 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.873 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.874 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '43e304f5-0742-4ac6-80b2-c2688b64879d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T09:54:49.854236', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'd2f4092e-a8e3-11f0-9707-fa163e99780b', 'monotonic_time': 11706.046920762, 'message_signature': '7422fa21c4e9c264976e42ff93b40eaf6c3fe7438e65113fb16f7b61bf113c02'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T09:54:49.854236', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'd2f41630-a8e3-11f0-9707-fa163e99780b', 'monotonic_time': 11706.046920762, 'message_signature': '2e851f963b6ab58bfeec597f9429b65e0fceca3503bc6ff3e5bbdafc9a048e14'}]}, 'timestamp': '2025-10-14 09:54:49.874110', '_unique_id': '2816a7c1b9a2417fb35c0697e945f5c9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.874 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.874 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.874 12 ERROR oslo_messaging.notify.messaging yield
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.874 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.874 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.874 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.874 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.874 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.874 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.874 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.874 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.874 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.874 12 ERROR oslo_messaging.notify.messaging conn.connect()
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.874 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.874 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.874 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.874 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.874 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.874 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.874 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.874 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.874 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.874 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.874 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.874 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.874 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.874 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.874 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.874 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.874 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.874 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.874 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.874 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.874 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.874 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.874 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.874 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.874 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.874 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.874 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.874 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.874 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.874 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.874 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.874 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.874 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.874 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.874 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.874 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.874 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.874 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.874 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.874 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.874 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.875 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.875 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.875 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.outgoing.packets volume: 129 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.876 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'efc31ab2-d52d-4b18-b688-7ce6b43cd276', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 129, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T09:54:49.875895', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': 'd2f4687e-a8e3-11f0-9707-fa163e99780b', 'monotonic_time': 11706.041193976, 'message_signature': '6297110c7bd8ac9780df1d35133868f19ff3f4390462d14b353a20ade9a81f55'}]}, 'timestamp': '2025-10-14 09:54:49.876230', '_unique_id': '4a1824280e514a82a7ef61cd6cdfa863'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.876 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.876 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.876 12 ERROR oslo_messaging.notify.messaging yield
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.876 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.876 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.876 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.876 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.876 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.876 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.876 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.876 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.876 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.876 12 ERROR oslo_messaging.notify.messaging conn.connect()
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.876 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.876 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.876 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.876 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.876 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.876 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.876 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.876 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.876 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.876 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.876 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.876 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.876 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.876 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.876 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.876 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.876 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.876 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.876 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.876 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.876 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.876 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.876 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.876 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.876 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.876 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.876 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.876 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.876 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.876 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.876 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.876 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.876 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.876 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.876 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.876 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.876 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.876 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.876 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.876 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.876 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.877 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.877 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.incoming.packets volume: 88 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.878 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c14cffef-28b5-479e-81e8-6ea6e4d0bdcc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 88, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T09:54:49.877607', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': 'd2f4aac8-a8e3-11f0-9707-fa163e99780b', 'monotonic_time': 11706.041193976, 'message_signature': '094b5796138df7dcde157bf8b9dcfa41ba8cf9e3e175bfc2c14b71ef4bea3730'}]}, 'timestamp': '2025-10-14 09:54:49.877926', '_unique_id': 'd2a56f075de944ae874b6a43babb6cd5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.878 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.878 12 ERROR oslo_messaging.notify.messaging yield
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.878 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.878 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.878 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.878 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.878 12 ERROR oslo_messaging.notify.messaging conn.connect()
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.878 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.878 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.878 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]:
2025-10-14 09:54:49.878 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.878 12 ERROR oslo_messaging.notify.messaging Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.878 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.878 12 ERROR oslo_messaging.notify.messaging Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.878 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.878 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.878 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.878 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 05:54:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.878 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.878 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.878 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.878 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.878 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 
05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.878 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.878 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.878 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.878 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.878 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.878 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.878 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.878 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.878 12 ERROR oslo_messaging.notify.messaging Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.879 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.879 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.write.latency volume: 217051898 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.879 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.write.latency volume: 24279644 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.880 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '54a031e3-c48f-4757-8d96-5e1e60798199', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 217051898, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T09:54:49.879270', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'd2f4eae2-a8e3-11f0-9707-fa163e99780b', 'monotonic_time': 11706.009572612, 'message_signature': '37b99a3992d4020d5ebc8eaf09c1c18dbb418ad456318426e3c511befeaf4a19'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 24279644, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T09:54:49.879270', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'd2f4f654-a8e3-11f0-9707-fa163e99780b', 'monotonic_time': 11706.009572612, 'message_signature': '1d03087250ea215165b3aa53eb763f714fbbdc1f6b0749c307d143265b83d079'}]}, 'timestamp': '2025-10-14 09:54:49.879842', '_unique_id': '34e4ea5b80af4759b95b8b8cf760152b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.880 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.880 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.880 12 ERROR oslo_messaging.notify.messaging yield Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.880 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.880 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.880 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.880 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.880 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.880 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.880 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.880 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.880 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.880 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.880 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.880 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.880 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 
05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.880 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.880 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.880 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.880 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.880 12 ERROR oslo_messaging.notify.messaging Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.880 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.880 12 ERROR oslo_messaging.notify.messaging Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.880 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.880 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.880 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.880 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 05:54:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.880 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.880 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.880 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.880 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.880 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.880 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.880 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.880 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.880 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.880 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.880 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.880 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.880 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.880 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.880 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.880 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.880 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.880 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.880 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.880 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:54:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.880 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.880 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.880 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.880 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.880 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.880 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.880 12 ERROR oslo_messaging.notify.messaging Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.881 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters Oct 14 05:54:49 localhost nova_compute[238069]: 2025-10-14 09:54:49.934 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.941 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/memory.usage volume: 52.32421875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.943 12 ERROR oslo_messaging.notify.messaging [-] Could 
not send notification to notifications. Payload={'message_id': '96f13dc3-7a5c-49e1-9c55-589195df031c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 52.32421875, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'timestamp': '2025-10-14T09:54:49.881256', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': 'd2fe78aa-a8e3-11f0-9707-fa163e99780b', 'monotonic_time': 11706.134222671, 'message_signature': '130add5ee96956e197ea76ee889410db90e708260cb7eca419843c6a975e3c77'}]}, 'timestamp': '2025-10-14 09:54:49.942241', '_unique_id': '075fa03ac7d94ee594f96200dbe5215e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.943 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.943 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.943 12 ERROR oslo_messaging.notify.messaging yield Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.943 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.943 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.943 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.943 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.943 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.943 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.943 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.943 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.943 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 05:54:49 
localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.943 12 ERROR oslo_messaging.notify.messaging conn.connect()
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.943 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.943 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.943 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.943 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.943 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.943 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.943 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.943 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.943 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.943 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.943 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.943 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.943 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.943 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.943 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.943 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.943 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.943 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.943 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.943 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.943 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.943 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.943 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.943 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.943 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.943 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.943 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.943 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.943 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.943 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.943 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.943 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.943 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.943 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.943 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.943 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.943 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.943 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.943 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.943 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.943 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.944 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.944 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.944 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.incoming.bytes volume: 9226 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.945 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f1a89265-c603-44d9-a855-cb64efde33dd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9226, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T09:54:49.944307', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': 'd2fed7aa-a8e3-11f0-9707-fa163e99780b', 'monotonic_time': 11706.041193976, 'message_signature': 'af9cb0ae0750efb2dcef977aa26d247a21edde9c227c6299fb428ea774d2f4d5'}]}, 'timestamp': '2025-10-14 09:54:49.944625', '_unique_id': 'adea1189a68e42c1862b600968753b8a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.945 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.945 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.945 12 ERROR oslo_messaging.notify.messaging yield
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.945 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.945 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.945 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.945 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.945 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.945 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.945 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.945 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.945 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.945 12 ERROR oslo_messaging.notify.messaging conn.connect()
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.945 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.945 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.945 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.945 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.945 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.945 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.945 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.945 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.945 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.945 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.945 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.945 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.945 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.945 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.945 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.945 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.945 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.945 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.945 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.945 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.945 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.945 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.945 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.945 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.945 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.945 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.945 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.945 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.945 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.945 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.945 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.945 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.945 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.945 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.945 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.945 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.945 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.945 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.945 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.945 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.945 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.945 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.946 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.946 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.947 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'deab6bdc-975b-4e2b-8d30-fa04eae9d1db', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T09:54:49.946149', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': 'd2ff1f6c-a8e3-11f0-9707-fa163e99780b', 'monotonic_time': 11706.041193976, 'message_signature': '060f096f1c1427c4ddc5a61ab0289f3520186bb4ac1b3d388874ae938d899527'}]}, 'timestamp': '2025-10-14 09:54:49.946458', '_unique_id': 'eb8e42605b7d4e41b720018acfd23853'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.947 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.947 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.947 12 ERROR oslo_messaging.notify.messaging yield
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.947 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.947 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.947 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.947 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.947 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.947 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.947 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.947 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.947 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.947 12 ERROR oslo_messaging.notify.messaging conn.connect()
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.947 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.947 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.947 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.947 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.947 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.947 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.947 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.947 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.947 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.947 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.947 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.947 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.947 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.947 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.947 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.947 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.947 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.947 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.947 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.947 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.947 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.947 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.947 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.947 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.947 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.947 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.947 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.947 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.947 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.947 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.947 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.947 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.947 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.947 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.947 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.947 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.947 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.947 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.947 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.947 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.947 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.947 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.947 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.948 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.allocation volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.949 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd8370566-0102-414b-b591-0ec602f41bd1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T09:54:49.947854', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'd2ff61fc-a8e3-11f0-9707-fa163e99780b', 'monotonic_time': 11706.046920762, 'message_signature': '15910911df47f5e3b92177b1d25c5bd63b97b52613e0123868247e1c5a901a5a'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T09:54:49.947854', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'd2ff6dbe-a8e3-11f0-9707-fa163e99780b', 'monotonic_time': 11706.046920762, 'message_signature': '22fddd7f42503424ce47253c273b56520449d7d0657e6b4cf3bfc0c32fa4f53b'}]}, 'timestamp': '2025-10-14 09:54:49.948439', '_unique_id': '9aeb6d0b6276412c8517ff605d27fbeb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.949 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.949 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.949 12 ERROR oslo_messaging.notify.messaging yield
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.949 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.949 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.949 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.949 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.949 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.949 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.949 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.949 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.949 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.949 12 ERROR oslo_messaging.notify.messaging conn.connect()
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.949 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.949 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.949 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.949 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.949 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.949 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.949 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.949 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.949 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]:
2025-10-14 09:54:49.949 12 ERROR oslo_messaging.notify.messaging Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.949 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.949 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.949 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.949 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.949 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.949 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.949 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.949 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.949 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.949 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.949 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.949 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.949 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.949 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.949 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.949 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.949 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.949 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.949 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.949 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.949 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.949 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.949 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.949 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.949 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.949 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.949 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.949 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.949 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.949 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:54:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.949 12 ERROR oslo_messaging.notify.messaging Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.949 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.949 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.read.requests volume: 1064 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.950 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.read.requests volume: 222 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.951 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'd9fe793f-68b6-42c1-a322-edaf76cca92c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1064, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T09:54:49.949905', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'd2ffb1f2-a8e3-11f0-9707-fa163e99780b', 'monotonic_time': 11706.009572612, 'message_signature': '8ec63dcc76bfe0f9181b78e31d35e7c35295ca029d5f9f56ea688523e5564e1f'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 222, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T09:54:49.949905', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'd2ffbc38-a8e3-11f0-9707-fa163e99780b', 'monotonic_time': 11706.009572612, 'message_signature': '0a963ad188271c99dc33d69db1e3cbebbc95a67afb046b862ed5e26a76408383'}]}, 'timestamp': '2025-10-14 09:54:49.950446', '_unique_id': '9e6d9be871014966bb07c8e21601ff1b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.951 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.951 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.951 12 ERROR oslo_messaging.notify.messaging yield Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.951 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.951 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.951 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.951 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.951 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.951 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.951 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.951 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.951 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.951 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.951 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.951 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.951 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 
05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.951 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.951 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.951 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.951 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.951 12 ERROR oslo_messaging.notify.messaging Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.951 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.951 12 ERROR oslo_messaging.notify.messaging Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.951 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.951 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.951 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.951 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 05:54:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.951 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.951 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.951 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.951 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.951 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.951 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.951 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.951 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.951 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.951 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.951 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.951 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.951 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.951 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.951 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.951 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.951 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.951 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.951 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.951 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:54:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.951 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.951 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.951 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.951 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.951 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.951 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.951 12 ERROR oslo_messaging.notify.messaging Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.951 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.951 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.outgoing.bytes volume: 11272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.952 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'b44f9be2-299f-4989-ae47-7fdafcde95b6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 11272, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T09:54:49.951873', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': 'd2ffff04-a8e3-11f0-9707-fa163e99780b', 'monotonic_time': 11706.041193976, 'message_signature': 'e5ed35f9aecf396074e0653639d8add954895cdf0ebdbb040ad9e70282322596'}]}, 'timestamp': '2025-10-14 09:54:49.952209', '_unique_id': 'e2093399f90c4279afffb80ec89c7149'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.952 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:54:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.952 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.952 12 ERROR oslo_messaging.notify.messaging yield Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.952 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.952 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.952 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.952 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.952 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.952 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.952 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.952 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.952 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.952 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.952 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.952 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.952 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.952 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.952 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.952 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.952 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.952 12 ERROR oslo_messaging.notify.messaging Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.952 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.952 12 ERROR oslo_messaging.notify.messaging Oct 14 05:54:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.952 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.952 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.952 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.952 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.952 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.952 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.952 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.952 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.952 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.952 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.952 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.952 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.952 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.952 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.952 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.952 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.952 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.952 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.952 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.952 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 05:54:49 
localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.952 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.952 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.952 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.952 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.952 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.952 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.952 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.952 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.952 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.952 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.952 12 ERROR oslo_messaging.notify.messaging Oct 14 05:54:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.953 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.953 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.954 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e4d6772d-71b7-4cbf-b7cd-afa1369a1009', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T09:54:49.953587', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 
'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': 'd30042e8-a8e3-11f0-9707-fa163e99780b', 'monotonic_time': 11706.041193976, 'message_signature': '8e00180f5accb77225c41d1410f1c459a470dfc15ccd8eaccf851c0b1e066d00'}]}, 'timestamp': '2025-10-14 09:54:49.953920', '_unique_id': '89b4f528e7b04faba42c251fa8b30233'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.954 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.954 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.954 12 ERROR oslo_messaging.notify.messaging yield Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.954 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.954 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.954 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.954 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.954 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 
09:54:49.954 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.954 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.954 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.954 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.954 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.954 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.954 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.954 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.954 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.954 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.954 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 05:54:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.954 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.954 12 ERROR oslo_messaging.notify.messaging Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.954 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.954 12 ERROR oslo_messaging.notify.messaging Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.954 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.954 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.954 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.954 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.954 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.954 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.954 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 05:54:49 
localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.954 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.954 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.954 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.954 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.954 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.954 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.954 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.954 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.954 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.954 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) 
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.954 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.954 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.954 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.954 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.954 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.954 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.954 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.954 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.954 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.954 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.954 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.954 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.954 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.954 12 ERROR oslo_messaging.notify.messaging Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.955 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.955 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.956 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'da198536-aa0c-4a7a-ab0d-921c0c602cad', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T09:54:49.955285', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': 'd3008424-a8e3-11f0-9707-fa163e99780b', 'monotonic_time': 11706.041193976, 'message_signature': '26ece711ba63ccb75c1f300bbc4142b2820d014864976e32e6b4581137bae86d'}]}, 'timestamp': '2025-10-14 09:54:49.955585', '_unique_id': '8b309eb64184473baebfd5471b522938'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.956 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:54:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.956 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.956 12 ERROR oslo_messaging.notify.messaging yield Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.956 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.956 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.956 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.956 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.956 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.956 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.956 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.956 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.956 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.956 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.956 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.956 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.956 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.956 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.956 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.956 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.956 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.956 12 ERROR oslo_messaging.notify.messaging Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.956 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.956 12 ERROR oslo_messaging.notify.messaging Oct 14 05:54:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.956 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.956 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.956 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.956 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.956 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.956 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.956 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.956 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.956 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.956 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.956 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.956 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.956 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.956 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.956 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.956 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.956 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.956 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.956 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.956 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 05:54:49 
localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.956 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.956 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.956 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.956 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.956 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.956 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.956 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.956 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.956 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.956 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.956 12 ERROR oslo_messaging.notify.messaging Oct 14 05:54:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.956 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.957 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.read.latency volume: 1400578039 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.957 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.read.latency volume: 174873109 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.958 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1e2860ce-d0d3-4ef7-bca2-80cee84e4ddf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1400578039, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T09:54:49.956989', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': 
{'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'd300c696-a8e3-11f0-9707-fa163e99780b', 'monotonic_time': 11706.009572612, 'message_signature': '417d56afa6207db4e5eeae0373dcaddae75a5e603a639133c4ea824fb2e90c32'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 174873109, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T09:54:49.956989', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'd300d190-a8e3-11f0-9707-fa163e99780b', 'monotonic_time': 11706.009572612, 'message_signature': '217d3f1235e334e66e83deb4a03849a0871c70752d8c81efcf909cee8d64e7b1'}]}, 'timestamp': '2025-10-14 09:54:49.957551', '_unique_id': 'f279cdaa6d4c4dc291fd86e89afb02f2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 
09:54:49.958 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.958 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.958 12 ERROR oslo_messaging.notify.messaging yield Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.958 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.958 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.958 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.958 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.958 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.958 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.958 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.958 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 05:54:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.958 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.958 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.958 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.958 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.958 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.958 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.958 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.958 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.958 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.958 12 ERROR oslo_messaging.notify.messaging Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.958 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 
2025-10-14 09:54:49.958 12 ERROR oslo_messaging.notify.messaging Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.958 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.958 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.958 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.958 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.958 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.958 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.958 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.958 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.958 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.958 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.958 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.958 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.958 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.958 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.958 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.958 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.958 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.958 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.958 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.958 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.958 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.958 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.958 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.958 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.958 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.958 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.958 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.958 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.958 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.958 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:54:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.958 12 ERROR oslo_messaging.notify.messaging Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.958 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.959 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.959 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f8f7a87d-f776-4b0e-9d76-c1295e3f89e8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T09:54:49.958988', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 
'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': 'd301148e-a8e3-11f0-9707-fa163e99780b', 'monotonic_time': 11706.041193976, 'message_signature': '81e322ebc7f72aff690736e05db9ca8decd2e10d61c430d3000805bee1fa3068'}]}, 'timestamp': '2025-10-14 09:54:49.959281', '_unique_id': '359531c39b10497683dc26c436d97808'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.959 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.959 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.959 12 ERROR oslo_messaging.notify.messaging yield Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.959 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.959 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.959 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.959 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.959 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.959 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.959 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.959 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.959 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.959 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.959 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.959 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.959 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.959 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.959 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 05:54:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.959 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.959 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.959 12 ERROR oslo_messaging.notify.messaging Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.959 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.959 12 ERROR oslo_messaging.notify.messaging Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.959 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.959 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.959 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.959 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.959 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.959 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 05:54:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.959 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.959 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.959 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.959 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.959 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.959 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.959 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.959 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.959 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.959 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 05:54:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.959 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.959 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.959 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.959 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.959 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.959 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.959 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.959 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.959 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.959 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.959 12 ERROR 
oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.959 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.959 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.959 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.959 12 ERROR oslo_messaging.notify.messaging Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.960 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.960 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/cpu volume: 65470000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.961 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '74a38749-61b7-4e80-adb6-53715db2fd50', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 65470000000, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'timestamp': '2025-10-14T09:54:49.960666', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': 'd3015778-a8e3-11f0-9707-fa163e99780b', 'monotonic_time': 11706.134222671, 'message_signature': '25273ffee25aed840c8ab57b095e3f1191a3e5c3b6484f33c514b6be99ca9686'}]}, 'timestamp': '2025-10-14 09:54:49.960984', '_unique_id': '3668e319f5514f8a9726080a598e9261'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.961 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.961 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in 
_reraise_as_library_errors Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.961 12 ERROR oslo_messaging.notify.messaging yield Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.961 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.961 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.961 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.961 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.961 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.961 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.961 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.961 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.961 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.961 12 
ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.961 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.961 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.961 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.961 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.961 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.961 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.961 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.961 12 ERROR oslo_messaging.notify.messaging Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.961 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.961 12 ERROR oslo_messaging.notify.messaging Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.961 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 
2025-10-14 09:54:49.961 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.961 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.961 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.961 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.961 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.961 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.961 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.961 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.961 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.961 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 
05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.961 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.961 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.961 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.961 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.961 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.961 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.961 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.961 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.961 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.961 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 05:54:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.961 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.961 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.961 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.961 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.961 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.961 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.961 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.961 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.961 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.961 12 ERROR oslo_messaging.notify.messaging Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.962 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters Oct 14 05:54:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.962 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.962 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.963 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '554bdbe2-6f67-4b24-85a2-0e2f1e37810c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T09:54:49.962354', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 
'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'd3019800-a8e3-11f0-9707-fa163e99780b', 'monotonic_time': 11706.046920762, 'message_signature': 'e000ac8b4dc4a6040b76c36905bee4417471c4d4610f636a1f2e4b421866abe0'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T09:54:49.962354', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'd301a2dc-a8e3-11f0-9707-fa163e99780b', 'monotonic_time': 11706.046920762, 'message_signature': '779b910e9bc07abdd6e3b48002cf78ae5ec19e3d3d4965453a0fad316ed9c5cd'}]}, 'timestamp': '2025-10-14 09:54:49.962909', '_unique_id': 'd84fdb55888c4c41891bd513931ce1b7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.963 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.963 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.963 12 ERROR oslo_messaging.notify.messaging yield Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.963 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.963 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.963 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.963 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.963 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.963 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.963 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.963 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.963 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 05:54:49 
localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.963 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.963 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.963 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.963 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.963 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.963 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.963 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.963 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.963 12 ERROR oslo_messaging.notify.messaging Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.963 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.963 12 ERROR oslo_messaging.notify.messaging Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.963 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call 
last): Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.963 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.963 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.963 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.963 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.963 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.963 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.963 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.963 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.963 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.963 12 ERROR oslo_messaging.notify.messaging 
return rpc_common.ConnectionContext(self._connection_pool, Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.963 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.963 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.963 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.963 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.963 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.963 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.963 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.963 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.963 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.963 12 ERROR oslo_messaging.notify.messaging 
self.connection.ensure_connection( Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.963 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.963 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.963 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.963 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.963 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.963 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.963 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.963 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.963 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.963 12 ERROR oslo_messaging.notify.messaging Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.964 12 INFO ceilometer.polling.manager [-] Polling pollster 
disk.device.read.bytes in the context of pollsters Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.964 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.read.bytes volume: 29130240 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.964 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.read.bytes volume: 4300800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.965 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c0471115-ff91-477a-8056-0157f6950f83', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 29130240, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T09:54:49.964364', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 
'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'd301e742-a8e3-11f0-9707-fa163e99780b', 'monotonic_time': 11706.009572612, 'message_signature': 'f662a23a5f91bf391cb2bdd023dfd81ea3e92dcddfb24ceda05d9f1f552eb18b'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 4300800, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T09:54:49.964364', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'd301f32c-a8e3-11f0-9707-fa163e99780b', 'monotonic_time': 11706.009572612, 'message_signature': 'b5dd3af047db0f08525d6fc33d130ed1bfa3d8950444b01354e2b3dfcaecd1e8'}]}, 'timestamp': '2025-10-14 09:54:49.964971', '_unique_id': 'eaca168ba483473597278373739e97eb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.965 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:54:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.965 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.965 12 ERROR oslo_messaging.notify.messaging yield Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.965 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.965 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.965 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.965 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.965 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.965 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.965 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.965 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.965 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.965 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.965 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.965 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.965 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.965 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.965 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.965 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.965 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.965 12 ERROR oslo_messaging.notify.messaging Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.965 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.965 12 ERROR oslo_messaging.notify.messaging Oct 14 05:54:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.965 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.965 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.965 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.965 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.965 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.965 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.965 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.965 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.965 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.965 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.965 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.965 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.965 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.965 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.965 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.965 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.965 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.965 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.965 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.965 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 05:54:49 
localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.965 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.965 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.965 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.965 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.965 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.965 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.965 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.965 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.965 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.965 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.965 12 ERROR oslo_messaging.notify.messaging Oct 14 05:54:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.966 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.966 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.write.bytes volume: 73895936 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.966 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.967 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8e84e58a-ad39-4a78-b8ed-f0d3da66e6db', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 73895936, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T09:54:49.966444', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'd302379c-a8e3-11f0-9707-fa163e99780b', 'monotonic_time': 11706.009572612, 'message_signature': '189c9340d6c52911b915d498b153ca0c67eeba937cf48dce4bd33d11b951a8e0'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T09:54:49.966444', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'd302423c-a8e3-11f0-9707-fa163e99780b', 'monotonic_time': 11706.009572612, 'message_signature': 'fd97bacc1335c4c44c55dc6bff98fd2a7845181429f2e736cd1d57dfb5bb96d7'}]}, 'timestamp': '2025-10-14 09:54:49.966983', '_unique_id': 'a6932bf117dc40e196ada107138b097e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.967 12 ERROR 
oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.967 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.967 12 ERROR oslo_messaging.notify.messaging yield Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.967 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.967 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.967 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.967 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.967 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.967 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.967 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.967 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 05:54:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.967 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.967 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.967 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.967 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.967 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.967 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.967 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.967 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.967 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.967 12 ERROR oslo_messaging.notify.messaging Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.967 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 
2025-10-14 09:54:49.967 12 ERROR oslo_messaging.notify.messaging Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.967 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.967 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.967 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.967 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.967 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.967 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.967 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.967 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.967 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.967 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.967 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.967 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.967 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.967 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.967 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.967 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.967 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.967 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.967 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.967 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.967 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.967 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.967 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.967 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.967 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.967 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.967 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.967 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.967 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.967 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:54:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.967 12 ERROR oslo_messaging.notify.messaging Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.968 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.968 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.968 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.969 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'fb684b05-ec1a-4a4f-8f18-85ea2927d815', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T09:54:49.968461', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': 'd30286a2-a8e3-11f0-9707-fa163e99780b', 'monotonic_time': 11706.041193976, 'message_signature': '4c15d8e49d23f3a797703d5de2aa3ad586b7b56015e38811fc2201ec63a21a9c'}]}, 'timestamp': '2025-10-14 09:54:49.968766', '_unique_id': 'f0fbcd0860a547b68808a449ea773a6c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.969 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:54:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.969 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.969 12 ERROR oslo_messaging.notify.messaging yield Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.969 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.969 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.969 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.969 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.969 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.969 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.969 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.969 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.969 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.969 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.969 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.969 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.969 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.969 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.969 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.969 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.969 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.969 12 ERROR oslo_messaging.notify.messaging Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.969 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.969 12 ERROR oslo_messaging.notify.messaging Oct 14 05:54:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.969 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.969 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.969 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.969 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.969 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.969 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.969 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.969 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.969 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.969 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.969 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.969 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.969 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.969 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.969 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.969 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.969 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.969 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.969 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.969 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 05:54:49 
localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.969 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.969 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.969 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.969 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.969 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.969 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.969 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.969 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.969 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.969 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:54:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:54:49.969 12 ERROR oslo_messaging.notify.messaging Oct 14 05:54:50 localhost 
nova_compute[238069]: 2025-10-14 09:54:50.025 2 DEBUG oslo_service.periodic_task [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 05:54:50 localhost nova_compute[238069]: 2025-10-14 09:54:50.025 2 DEBUG nova.compute.manager [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Oct 14 05:54:50 localhost nova_compute[238069]: 2025-10-14 09:54:50.026 2 DEBUG nova.compute.manager [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Oct 14 05:54:50 localhost python3.9[288433]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Oct 14 05:54:50 localhost nova_compute[238069]: 2025-10-14 09:54:50.882 2 DEBUG oslo_concurrency.lockutils [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Acquiring lock "refresh_cache-88c4e366-b765-47a6-96bf-f7677f2ce67c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Oct 14 05:54:50 localhost nova_compute[238069]: 2025-10-14 09:54:50.882 2 DEBUG oslo_concurrency.lockutils [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Acquired lock "refresh_cache-88c4e366-b765-47a6-96bf-f7677f2ce67c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Oct 14 05:54:50 localhost nova_compute[238069]: 2025-10-14 09:54:50.883 2 DEBUG nova.network.neutron [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] [instance: 88c4e366-b765-47a6-96bf-f7677f2ce67c] Forcefully refreshing 
network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Oct 14 05:54:50 localhost nova_compute[238069]: 2025-10-14 09:54:50.883 2 DEBUG nova.objects.instance [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 88c4e366-b765-47a6-96bf-f7677f2ce67c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Oct 14 05:54:51 localhost python3.9[288544]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_conductor.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Oct 14 05:54:51 localhost nova_compute[238069]: 2025-10-14 09:54:51.602 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:54:52 localhost python3.9[288655]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_metadata.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Oct 14 05:54:52 localhost nova_compute[238069]: 2025-10-14 09:54:52.119 2 DEBUG nova.network.neutron [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] [instance: 88c4e366-b765-47a6-96bf-f7677f2ce67c] Updating instance_info_cache with network_info: [{"id": "3ec9b060-f43d-4698-9c76-6062c70911d5", "address": "fa:16:3e:84:5e:e5", "network": {"id": "7d0cd696-bdd7-4e70-9512-eb0d23640314", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.46", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], 
"meta": {"injected": false, "tenant_id": "41187b090f3d4818a32baa37ce8a3991", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3ec9b060-f4", "ovs_interfaceid": "3ec9b060-f43d-4698-9c76-6062c70911d5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Oct 14 05:54:52 localhost nova_compute[238069]: 2025-10-14 09:54:52.135 2 DEBUG oslo_concurrency.lockutils [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Releasing lock "refresh_cache-88c4e366-b765-47a6-96bf-f7677f2ce67c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Oct 14 05:54:52 localhost nova_compute[238069]: 2025-10-14 09:54:52.135 2 DEBUG nova.compute.manager [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] [instance: 88c4e366-b765-47a6-96bf-f7677f2ce67c] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Oct 14 05:54:52 localhost nova_compute[238069]: 2025-10-14 09:54:52.136 2 DEBUG oslo_service.periodic_task [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 05:54:52 localhost nova_compute[238069]: 2025-10-14 09:54:52.136 2 DEBUG oslo_service.periodic_task [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 05:54:52 localhost nova_compute[238069]: 2025-10-14 09:54:52.136 2 
DEBUG oslo_service.periodic_task [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 05:54:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041. Oct 14 05:54:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857. Oct 14 05:54:52 localhost systemd[1]: tmp-crun.oRDflm.mount: Deactivated successfully. Oct 14 05:54:52 localhost podman[288768]: 2025-10-14 09:54:52.687532257 +0000 UTC m=+0.121494942 container health_status 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, managed_by=edpm_ansible, org.label-schema.build-date=20251009, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.name=CentOS Stream 9 Base Image) Oct 14 05:54:52 localhost podman[288767]: 2025-10-14 09:54:52.713255889 +0000 UTC m=+0.147356628 container health_status 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Oct 14 05:54:52 localhost podman[288767]: 2025-10-14 09:54:52.754135899 +0000 UTC m=+0.188236678 container exec_died 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, 
config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Oct 14 05:54:52 localhost systemd[1]: 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041.service: Deactivated successfully. Oct 14 05:54:52 localhost podman[288768]: 2025-10-14 09:54:52.771181324 +0000 UTC m=+0.205143979 container exec_died 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', 
'/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.build-date=20251009, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3) Oct 14 05:54:52 localhost systemd[1]: 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857.service: Deactivated successfully. Oct 14 05:54:52 localhost python3.9[288766]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_scheduler.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Oct 14 05:54:53 localhost nova_compute[238069]: 2025-10-14 09:54:53.024 2 DEBUG oslo_service.periodic_task [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 05:54:53 localhost nova_compute[238069]: 2025-10-14 09:54:53.025 2 DEBUG nova.compute.manager [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... 
_reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Oct 14 05:54:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=991 DF PROTO=TCP SPT=46376 DPT=9102 SEQ=2269981404 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B1762840000000001030307) Oct 14 05:54:53 localhost python3.9[288919]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_vnc_proxy.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Oct 14 05:54:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=992 DF PROTO=TCP SPT=46376 DPT=9102 SEQ=2269981404 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B17669A0000000001030307) Oct 14 05:54:54 localhost nova_compute[238069]: 2025-10-14 09:54:54.972 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:54:55 localhost nova_compute[238069]: 2025-10-14 09:54:55.024 2 DEBUG oslo_service.periodic_task [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 05:54:55 localhost nova_compute[238069]: 2025-10-14 09:54:55.063 2 DEBUG oslo_concurrency.lockutils [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Oct 14 05:54:55 localhost nova_compute[238069]: 2025-10-14 09:54:55.064 2 DEBUG 
oslo_concurrency.lockutils [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Oct 14 05:54:55 localhost nova_compute[238069]: 2025-10-14 09:54:55.064 2 DEBUG oslo_concurrency.lockutils [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Oct 14 05:54:55 localhost nova_compute[238069]: 2025-10-14 09:54:55.064 2 DEBUG nova.compute.resource_tracker [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Auditing locally available compute resources for np0005486733.localdomain (node: np0005486733.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Oct 14 05:54:55 localhost nova_compute[238069]: 2025-10-14 09:54:55.065 2 DEBUG oslo_concurrency.processutils [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Oct 14 05:54:55 localhost nova_compute[238069]: 2025-10-14 09:54:55.531 2 DEBUG oslo_concurrency.processutils [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.466s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Oct 14 05:54:55 localhost nova_compute[238069]: 2025-10-14 09:54:55.617 2 DEBUG nova.virt.libvirt.driver [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] skipping disk for instance-00000002 as it does not have a path 
_get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Oct 14 05:54:55 localhost nova_compute[238069]: 2025-10-14 09:54:55.618 2 DEBUG nova.virt.libvirt.driver [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Oct 14 05:54:55 localhost nova_compute[238069]: 2025-10-14 09:54:55.809 2 WARNING nova.virt.libvirt.driver [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m Oct 14 05:54:55 localhost nova_compute[238069]: 2025-10-14 09:54:55.810 2 DEBUG nova.compute.resource_tracker [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Hypervisor/Node resource view: name=np0005486733.localdomain free_ram=11773MB free_disk=41.83725357055664GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": 
"1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Oct 14 05:54:55 localhost nova_compute[238069]: 2025-10-14 09:54:55.811 2 DEBUG oslo_concurrency.lockutils [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Oct 14 05:54:55 localhost nova_compute[238069]: 2025-10-14 09:54:55.811 2 DEBUG oslo_concurrency.lockutils [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Oct 14 05:54:55 localhost nova_compute[238069]: 2025-10-14 09:54:55.882 2 DEBUG nova.compute.resource_tracker [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Instance 
88c4e366-b765-47a6-96bf-f7677f2ce67c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Oct 14 05:54:55 localhost nova_compute[238069]: 2025-10-14 09:54:55.882 2 DEBUG nova.compute.resource_tracker [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Oct 14 05:54:55 localhost nova_compute[238069]: 2025-10-14 09:54:55.883 2 DEBUG nova.compute.resource_tracker [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Final resource view: name=np0005486733.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Oct 14 05:54:55 localhost nova_compute[238069]: 2025-10-14 09:54:55.969 2 DEBUG oslo_concurrency.processutils [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Oct 14 05:54:56 localhost nova_compute[238069]: 2025-10-14 09:54:56.431 2 DEBUG oslo_concurrency.processutils [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.462s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Oct 14 05:54:56 localhost nova_compute[238069]: 2025-10-14 09:54:56.438 2 DEBUG nova.compute.provider_tree [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Inventory has not changed in ProviderTree for provider: 18c24273-aca2-4f08-be57-3188d558235e 
update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Oct 14 05:54:56 localhost nova_compute[238069]: 2025-10-14 09:54:56.488 2 DEBUG nova.scheduler.client.report [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Inventory has not changed for provider 18c24273-aca2-4f08-be57-3188d558235e based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Oct 14 05:54:56 localhost nova_compute[238069]: 2025-10-14 09:54:56.491 2 DEBUG nova.compute.resource_tracker [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Compute_service record updated for np0005486733.localdomain:np0005486733.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Oct 14 05:54:56 localhost nova_compute[238069]: 2025-10-14 09:54:56.492 2 DEBUG oslo_concurrency.lockutils [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.681s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Oct 14 05:54:56 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=993 DF PROTO=TCP SPT=46376 DPT=9102 SEQ=2269981404 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B176E9B0000000001030307) Oct 14 05:54:56 localhost nova_compute[238069]: 2025-10-14 09:54:56.639 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 
[POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:54:57 localhost python3.9[289074]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:54:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:54:57.763 163055 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Oct 14 05:54:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:54:57.764 163055 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Oct 14 05:54:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:54:57.765 163055 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Oct 14 05:54:58 localhost python3.9[289184]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None 
serole=None selevel=None setype=None attributes=None Oct 14 05:54:58 localhost podman[248187]: time="2025-10-14T09:54:58Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Oct 14 05:54:58 localhost podman[248187]: @ - - [14/Oct/2025:09:54:58 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 141158 "" "Go-http-client/1.1" Oct 14 05:54:58 localhost podman[248187]: @ - - [14/Oct/2025:09:54:58 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18341 "" "Go-http-client/1.1" Oct 14 05:54:58 localhost nova_compute[238069]: 2025-10-14 09:54:58.489 2 DEBUG oslo_service.periodic_task [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 05:54:58 localhost python3.9[289294]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:54:59 localhost python3.9[289404]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:55:00 localhost nova_compute[238069]: 2025-10-14 09:55:00.015 2 DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:55:00 localhost python3.9[289514]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:55:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=994 DF PROTO=TCP SPT=46376 DPT=9102 SEQ=2269981404 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B177E5A0000000001030307) Oct 14 05:55:00 localhost python3.9[289624]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:55:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29. 
Oct 14 05:55:01 localhost podman[289735]: 2025-10-14 09:55:01.202979641 +0000 UTC m=+0.084226514 container health_status fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, managed_by=edpm_ansible, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, org.label-schema.name=CentOS Stream 9 Base Image, config_id=iscsid, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, tcib_managed=true) Oct 14 05:55:01 localhost podman[289735]: 2025-10-14 09:55:01.213884357 +0000 UTC m=+0.095131290 container exec_died fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 
(image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image) Oct 14 05:55:01 localhost systemd[1]: fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29.service: Deactivated successfully. 
Oct 14 05:55:01 localhost python3.9[289734]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:55:01 localhost nova_compute[238069]: 2025-10-14 09:55:01.685 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:55:01 localhost python3.9[289863]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:55:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec. 
Oct 14 05:55:02 localhost podman[289974]: 2025-10-14 09:55:02.703077796 +0000 UTC m=+0.046841883 container health_status aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Oct 14 05:55:02 localhost python3.9[289973]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None 
Oct 14 05:55:02 localhost podman[289974]: 2025-10-14 09:55:02.708452032 +0000 UTC m=+0.052216129 container exec_died aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Oct 14 05:55:02 localhost systemd[1]: aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec.service: Deactivated successfully. 
Oct 14 05:55:03 localhost python3.9[290106]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:55:04 localhost python3.9[290216]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:55:04 localhost python3.9[290326]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:55:05 localhost nova_compute[238069]: 2025-10-14 09:55:05.059 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:55:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad. Oct 14 05:55:05 localhost systemd[1]: tmp-crun.Vo6f42.mount: Deactivated successfully. 
Oct 14 05:55:05 localhost podman[290437]: 2025-10-14 09:55:05.229073354 +0000 UTC m=+0.074980710 container health_status 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true, container_name=multipathd, org.label-schema.build-date=20251009, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team) Oct 14 05:55:05 localhost podman[290437]: 2025-10-14 09:55:05.240353952 +0000 UTC m=+0.086261288 container exec_died 
02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS) Oct 14 05:55:05 localhost systemd[1]: 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad.service: Deactivated successfully. 
Oct 14 05:55:05 localhost python3.9[290436]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:55:05 localhost python3.9[290566]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:55:06 localhost nova_compute[238069]: 2025-10-14 09:55:06.738 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:55:07 localhost python3.9[290676]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:55:07 localhost python3.9[290786]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None 
mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:55:08 localhost openstack_network_exporter[250374]: ERROR 09:55:08 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Oct 14 05:55:08 localhost openstack_network_exporter[250374]: ERROR 09:55:08 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Oct 14 05:55:08 localhost openstack_network_exporter[250374]: ERROR 09:55:08 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Oct 14 05:55:08 localhost openstack_network_exporter[250374]: ERROR 09:55:08 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Oct 14 05:55:08 localhost openstack_network_exporter[250374]: Oct 14 05:55:08 localhost openstack_network_exporter[250374]: ERROR 09:55:08 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Oct 14 05:55:08 localhost openstack_network_exporter[250374]: Oct 14 05:55:09 localhost python3.9[290896]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then#012 systemctl disable --now certmonger.service#012 test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service#012fi#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Oct 14 05:55:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f. Oct 14 05:55:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917. Oct 14 05:55:09 localhost systemd[1]: tmp-crun.IQ2al5.mount: Deactivated successfully. 
Oct 14 05:55:09 localhost podman[290932]: 2025-10-14 09:55:09.737029847 +0000 UTC m=+0.073166154 container health_status 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, container_name=ovn_controller) Oct 14 05:55:09 localhost podman[290933]: 2025-10-14 09:55:09.801192813 +0000 UTC m=+0.130204230 container health_status 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, vendor=Red Hat, Inc., io.buildah.version=1.33.7, io.openshift.expose-services=, config_id=edpm, managed_by=edpm_ansible, description=The 
Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, release=1755695350, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers) Oct 14 05:55:09 localhost podman[290932]: 2025-10-14 09:55:09.826112011 +0000 UTC m=+0.162248318 container exec_died 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251009, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_controller) Oct 14 05:55:09 localhost systemd[1]: 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f.service: Deactivated successfully. 
Oct 14 05:55:09 localhost podman[290933]: 2025-10-14 09:55:09.838155491 +0000 UTC m=+0.167166878 container exec_died 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, distribution-scope=public, vcs-type=git, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, name=ubi9-minimal, io.openshift.expose-services=, container_name=openstack_network_exporter, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, io.buildah.version=1.33.7, config_id=edpm, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350) Oct 14 05:55:09 localhost systemd[1]: 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917.service: Deactivated successfully. Oct 14 05:55:10 localhost nova_compute[238069]: 2025-10-14 09:55:10.061 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:55:10 localhost python3.9[291050]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None Oct 14 05:55:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d. 
Oct 14 05:55:11 localhost podman[291160]: 2025-10-14 09:55:11.041133197 +0000 UTC m=+0.079750388 container health_status 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009) Oct 14 05:55:11 localhost podman[291160]: 2025-10-14 09:55:11.051824726 +0000 UTC m=+0.090441917 container exec_died 
89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_managed=true, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d) Oct 14 05:55:11 localhost systemd[1]: 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d.service: Deactivated successfully. 
Oct 14 05:55:11 localhost python3.9[291161]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None Oct 14 05:55:11 localhost systemd[1]: Reloading. Oct 14 05:55:11 localhost systemd-rc-local-generator[291205]: /etc/rc.d/rc.local is not marked executable, skipping. Oct 14 05:55:11 localhost systemd-sysv-generator[291210]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Oct 14 05:55:11 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Oct 14 05:55:11 localhost nova_compute[238069]: 2025-10-14 09:55:11.766 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:55:12 localhost python3.9[291326]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Oct 14 05:55:13 localhost python3.9[291437]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_migration_target.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Oct 14 05:55:13 localhost python3.9[291548]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api_cron.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True 
strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Oct 14 05:55:14 localhost python3.9[291659]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Oct 14 05:55:15 localhost python3.9[291770]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_conductor.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Oct 14 05:55:15 localhost nova_compute[238069]: 2025-10-14 09:55:15.103 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:55:15 localhost python3.9[291881]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_metadata.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Oct 14 05:55:16 localhost python3.9[291992]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_scheduler.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Oct 14 05:55:16 localhost nova_compute[238069]: 2025-10-14 09:55:16.807 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:55:16 localhost python3.9[292103]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed 
tripleo_nova_vnc_proxy.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Oct 14 05:55:20 localhost nova_compute[238069]: 2025-10-14 09:55:20.146 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:55:21 localhost python3.9[292214]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Oct 14 05:55:21 localhost python3.9[292325]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/containers setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Oct 14 05:55:21 localhost nova_compute[238069]: 2025-10-14 09:55:21.852 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:55:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041. Oct 14 05:55:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857. 
Oct 14 05:55:23 localhost podman[292436]: 2025-10-14 09:55:23.08182615 +0000 UTC m=+0.088462786 container health_status 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter) Oct 14 05:55:23 localhost podman[292436]: 2025-10-14 09:55:23.090545208 +0000 UTC m=+0.097181854 container exec_died 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': 
['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter) Oct 14 05:55:23 localhost systemd[1]: 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041.service: Deactivated successfully. Oct 14 05:55:23 localhost systemd[1]: tmp-crun.OZ2hNg.mount: Deactivated successfully. Oct 14 05:55:23 localhost podman[292437]: 2025-10-14 09:55:23.18965264 +0000 UTC m=+0.194558522 container health_status 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5) Oct 14 05:55:23 localhost podman[292437]: 2025-10-14 09:55:23.197069699 +0000 UTC m=+0.201975611 container exec_died 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20251009, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, 
io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image) Oct 14 05:55:23 localhost python3.9[292435]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova_nvme_cleaner setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Oct 14 05:55:23 localhost systemd[1]: 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857.service: Deactivated successfully. Oct 14 05:55:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10353 DF PROTO=TCP SPT=57690 DPT=9102 SEQ=185111240 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B17D7B50000000001030307) Oct 14 05:55:23 localhost python3.9[292583]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Oct 14 05:55:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10354 DF PROTO=TCP SPT=57690 DPT=9102 SEQ=185111240 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B17DBDB0000000001030307) Oct 14 05:55:24 
localhost python3.9[292693]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/_nova_secontext setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Oct 14 05:55:25 localhost nova_compute[238069]: 2025-10-14 09:55:25.152 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:55:25 localhost python3.9[292803]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova/instances setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Oct 14 05:55:25 localhost python3.9[292913]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/etc/ceph setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Oct 14 05:55:26 localhost python3.9[293023]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None 
attributes=None Oct 14 05:55:26 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10355 DF PROTO=TCP SPT=57690 DPT=9102 SEQ=185111240 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B17E3DA0000000001030307) Oct 14 05:55:26 localhost nova_compute[238069]: 2025-10-14 09:55:26.894 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:55:27 localhost python3.9[293133]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/iscsi setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None Oct 14 05:55:27 localhost python3.9[293243]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/var/lib/iscsi setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None Oct 14 05:55:28 localhost podman[248187]: time="2025-10-14T09:55:28Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Oct 14 05:55:28 localhost podman[248187]: @ - - [14/Oct/2025:09:55:28 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 141158 "" "Go-http-client/1.1" Oct 14 05:55:28 localhost podman[248187]: @ - - [14/Oct/2025:09:55:28 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 
18349 "" "Go-http-client/1.1" Oct 14 05:55:28 localhost python3.9[293353]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/nvme setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None Oct 14 05:55:29 localhost python3.9[293463]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/run/openvswitch setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None Oct 14 05:55:30 localhost nova_compute[238069]: 2025-10-14 09:55:30.190 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:55:30 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10356 DF PROTO=TCP SPT=57690 DPT=9102 SEQ=185111240 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B17F39A0000000001030307) Oct 14 05:55:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29. 
Oct 14 05:55:31 localhost podman[293481]: 2025-10-14 09:55:31.737655437 +0000 UTC m=+0.084832963 container health_status fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=iscsid, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, managed_by=edpm_ansible, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, maintainer=OpenStack Kubernetes Operator team) Oct 14 05:55:31 localhost podman[293481]: 2025-10-14 09:55:31.751341399 +0000 UTC m=+0.098518955 container exec_died fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 
(image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, container_name=iscsid, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d) Oct 14 05:55:31 localhost systemd[1]: fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29.service: Deactivated successfully. 
Oct 14 05:55:31 localhost nova_compute[238069]: 2025-10-14 09:55:31.921 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:55:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec. Oct 14 05:55:33 localhost podman[293498]: 2025-10-14 09:55:33.73870497 +0000 UTC m=+0.083047869 container health_status aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors ) Oct 14 05:55:33 localhost podman[293498]: 2025-10-14 09:55:33.750009518 +0000 UTC m=+0.094352367 container exec_died 
aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter) Oct 14 05:55:33 localhost systemd[1]: aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec.service: Deactivated successfully. 
Oct 14 05:55:35 localhost python3.9[293613]: ansible-ansible.builtin.getent Invoked with database=passwd key=nova fail_key=True service=None split=None Oct 14 05:55:35 localhost nova_compute[238069]: 2025-10-14 09:55:35.224 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:55:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad. Oct 14 05:55:35 localhost podman[293632]: 2025-10-14 09:55:35.743113007 +0000 UTC m=+0.080573423 container health_status 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, 
org.label-schema.build-date=20251009, container_name=multipathd, managed_by=edpm_ansible, tcib_managed=true, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS) Oct 14 05:55:35 localhost podman[293632]: 2025-10-14 09:55:35.785185392 +0000 UTC m=+0.122645808 container exec_died 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, 
tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_id=multipathd, org.label-schema.vendor=CentOS) Oct 14 05:55:35 localhost systemd[1]: 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad.service: Deactivated successfully. Oct 14 05:55:36 localhost sshd[293651]: main: sshd: ssh-rsa algorithm is disabled Oct 14 05:55:36 localhost systemd-logind[760]: New session 63 of user zuul. Oct 14 05:55:36 localhost systemd[1]: Started Session 63 of User zuul. Oct 14 05:55:36 localhost systemd[1]: session-63.scope: Deactivated successfully. Oct 14 05:55:36 localhost systemd-logind[760]: Session 63 logged out. Waiting for processes to exit. Oct 14 05:55:36 localhost systemd-logind[760]: Removed session 63. Oct 14 05:55:36 localhost nova_compute[238069]: 2025-10-14 09:55:36.968 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:55:37 localhost python3.9[293762]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/config.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True Oct 14 05:55:37 localhost python3.9[293848]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/config.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1760435736.7591038-3919-212469066947968/.source.json follow=False _original_basename=config.json.j2 checksum=2c2474b5f24ef7c9ed37f49680082593e0d1100b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None Oct 14 05:55:38 localhost python3.9[293956]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova-blank.conf follow=False 
get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True Oct 14 05:55:38 localhost openstack_network_exporter[250374]: ERROR 09:55:38 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Oct 14 05:55:38 localhost openstack_network_exporter[250374]: ERROR 09:55:38 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Oct 14 05:55:38 localhost openstack_network_exporter[250374]: ERROR 09:55:38 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Oct 14 05:55:38 localhost openstack_network_exporter[250374]: ERROR 09:55:38 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Oct 14 05:55:38 localhost openstack_network_exporter[250374]: Oct 14 05:55:38 localhost openstack_network_exporter[250374]: ERROR 09:55:38 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Oct 14 05:55:38 localhost openstack_network_exporter[250374]: Oct 14 05:55:39 localhost python3.9[294011]: ansible-ansible.legacy.file Invoked with mode=0644 setype=container_file_t dest=/var/lib/openstack/config/nova/nova-blank.conf _original_basename=nova-blank.conf recurse=False state=file path=/var/lib/openstack/config/nova/nova-blank.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None Oct 14 05:55:39 localhost python3.9[294119]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/ssh-config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True Oct 14 05:55:40 localhost nova_compute[238069]: 2025-10-14 09:55:40.255 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:55:40 localhost python3.9[294205]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/ssh-config mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1760435739.2033973-3919-235027647537829/.source follow=False _original_basename=ssh-config checksum=4297f735c41bdc1ff52d72e6f623a02242f37958 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None Oct 14 05:55:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f. Oct 14 05:55:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917. Oct 14 05:55:40 localhost systemd[1]: tmp-crun.NsHDWs.mount: Deactivated successfully. 
Oct 14 05:55:40 localhost podman[294206]: 2025-10-14 09:55:40.41381508 +0000 UTC m=+0.081454529 container health_status 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, tcib_managed=true, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, managed_by=edpm_ansible) Oct 14 05:55:40 localhost podman[294207]: 2025-10-14 09:55:40.447145317 +0000 UTC m=+0.115100826 container health_status 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, distribution-scope=public, maintainer=Red Hat, Inc., release=1755695350, version=9.6, io.openshift.tags=minimal rhel9, 
url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9-minimal, vcs-type=git, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, config_id=edpm, 
container_name=openstack_network_exporter, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64) Oct 14 05:55:40 localhost podman[294207]: 2025-10-14 09:55:40.463523611 +0000 UTC m=+0.131479100 container exec_died 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, version=9.6, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., architecture=x86_64, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., managed_by=edpm_ansible, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, config_id=edpm) Oct 14 05:55:40 localhost systemd[1]: 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917.service: Deactivated successfully. 
Oct 14 05:55:40 localhost podman[294206]: 2025-10-14 09:55:40.48035669 +0000 UTC m=+0.147996109 container exec_died 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller) Oct 14 05:55:40 localhost systemd[1]: 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f.service: Deactivated successfully. Oct 14 05:55:41 localhost python3.9[294360]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/02-nova-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True Oct 14 05:55:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d. 
Oct 14 05:55:41 localhost podman[294361]: 2025-10-14 09:55:41.724604096 +0000 UTC m=+0.071155223 container health_status 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, managed_by=edpm_ansible) Oct 14 05:55:41 localhost podman[294361]: 2025-10-14 09:55:41.734954024 +0000 UTC m=+0.081505111 container exec_died 
89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=edpm, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, managed_by=edpm_ansible) Oct 14 05:55:41 localhost systemd[1]: 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d.service: Deactivated successfully. 
Oct 14 05:55:41 localhost nova_compute[238069]: 2025-10-14 09:55:41.969 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:55:42 localhost python3.9[294465]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/02-nova-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1760435741.077871-3919-154151274595970/.source.conf follow=False _original_basename=02-nova-host-specific.conf.j2 checksum=0ec8d5fb830c2e963175e9158df8fb7429fe888d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None Oct 14 05:55:44 localhost python3.9[294589]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova_statedir_ownership.py follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True Oct 14 05:55:44 localhost python3.9[294712]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/nova_statedir_ownership.py mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1760435742.654768-3919-253338264696981/.source.py follow=False _original_basename=nova_statedir_ownership.py checksum=c6c8a3cfefa5efd60ceb1408c4e977becedb71e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None Oct 14 05:55:45 localhost nova_compute[238069]: 2025-10-14 09:55:45.288 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:55:45 localhost python3.9[294839]: ansible-ansible.builtin.file Invoked with group=nova mode=0700 owner=nova 
path=/home/nova/.ssh state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:55:46 localhost nova_compute[238069]: 2025-10-14 09:55:46.019 2 DEBUG oslo_service.periodic_task [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 05:55:46 localhost python3.9[294949]: ansible-ansible.legacy.copy Invoked with dest=/home/nova/.ssh/authorized_keys group=nova mode=0600 owner=nova remote_src=True src=/var/lib/openstack/config/nova/ssh-publickey backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:55:46 localhost nova_compute[238069]: 2025-10-14 09:55:46.973 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:55:47 localhost python3.9[295059]: ansible-ansible.builtin.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Oct 14 05:55:47 localhost python3.9[295171]: ansible-ansible.builtin.file Invoked with group=nova mode=0400 owner=nova path=/var/lib/nova/compute_id state=file recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:55:48 localhost 
python3.9[295297]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Oct 14 05:55:49 localhost python3.9[295407]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True Oct 14 05:55:49 localhost python3.9[295462]: ansible-ansible.legacy.file Invoked with mode=0644 setype=container_file_t dest=/var/lib/openstack/config/containers/nova_compute.json _original_basename=nova_compute.json.j2 recurse=False state=file path=/var/lib/openstack/config/containers/nova_compute.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None Oct 14 05:55:50 localhost nova_compute[238069]: 2025-10-14 09:55:50.023 2 DEBUG oslo_service.periodic_task [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 05:55:50 localhost nova_compute[238069]: 2025-10-14 09:55:50.024 2 DEBUG nova.compute.manager [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Oct 14 05:55:50 localhost nova_compute[238069]: 2025-10-14 09:55:50.024 2 DEBUG nova.compute.manager [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Oct 14 05:55:50 localhost nova_compute[238069]: 2025-10-14 
09:55:50.341 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:55:50 localhost python3.9[295570]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute_init.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True Oct 14 05:55:50 localhost nova_compute[238069]: 2025-10-14 09:55:50.945 2 DEBUG oslo_concurrency.lockutils [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Acquiring lock "refresh_cache-88c4e366-b765-47a6-96bf-f7677f2ce67c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Oct 14 05:55:50 localhost nova_compute[238069]: 2025-10-14 09:55:50.945 2 DEBUG oslo_concurrency.lockutils [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Acquired lock "refresh_cache-88c4e366-b765-47a6-96bf-f7677f2ce67c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Oct 14 05:55:50 localhost nova_compute[238069]: 2025-10-14 09:55:50.945 2 DEBUG nova.network.neutron [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] [instance: 88c4e366-b765-47a6-96bf-f7677f2ce67c] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Oct 14 05:55:50 localhost nova_compute[238069]: 2025-10-14 09:55:50.945 2 DEBUG nova.objects.instance [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 88c4e366-b765-47a6-96bf-f7677f2ce67c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Oct 14 05:55:51 localhost python3.9[295625]: ansible-ansible.legacy.file Invoked with mode=0700 setype=container_file_t dest=/var/lib/openstack/config/containers/nova_compute_init.json _original_basename=nova_compute_init.json.j2 recurse=False state=file 
path=/var/lib/openstack/config/containers/nova_compute_init.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None Oct 14 05:55:52 localhost nova_compute[238069]: 2025-10-14 09:55:52.014 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:55:52 localhost python3.9[295735]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute_init.json debug=False Oct 14 05:55:52 localhost nova_compute[238069]: 2025-10-14 09:55:52.292 2 DEBUG nova.network.neutron [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] [instance: 88c4e366-b765-47a6-96bf-f7677f2ce67c] Updating instance_info_cache with network_info: [{"id": "3ec9b060-f43d-4698-9c76-6062c70911d5", "address": "fa:16:3e:84:5e:e5", "network": {"id": "7d0cd696-bdd7-4e70-9512-eb0d23640314", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.46", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "41187b090f3d4818a32baa37ce8a3991", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3ec9b060-f4", "ovs_interfaceid": "3ec9b060-f43d-4698-9c76-6062c70911d5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": 
"normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Oct 14 05:55:52 localhost nova_compute[238069]: 2025-10-14 09:55:52.320 2 DEBUG oslo_concurrency.lockutils [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Releasing lock "refresh_cache-88c4e366-b765-47a6-96bf-f7677f2ce67c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Oct 14 05:55:52 localhost nova_compute[238069]: 2025-10-14 09:55:52.320 2 DEBUG nova.compute.manager [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] [instance: 88c4e366-b765-47a6-96bf-f7677f2ce67c] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Oct 14 05:55:52 localhost nova_compute[238069]: 2025-10-14 09:55:52.321 2 DEBUG oslo_service.periodic_task [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 05:55:52 localhost nova_compute[238069]: 2025-10-14 09:55:52.322 2 DEBUG oslo_service.periodic_task [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 05:55:52 localhost nova_compute[238069]: 2025-10-14 09:55:52.323 2 DEBUG oslo_service.periodic_task [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 05:55:52 localhost python3.9[295845]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data Oct 14 05:55:53 localhost 
nova_compute[238069]: 2025-10-14 09:55:53.023 2 DEBUG oslo_service.periodic_task [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 05:55:53 localhost nova_compute[238069]: 2025-10-14 09:55:53.024 2 DEBUG oslo_service.periodic_task [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 05:55:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23255 DF PROTO=TCP SPT=52354 DPT=9102 SEQ=4186257601 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B184CE50000000001030307) Oct 14 05:55:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041. Oct 14 05:55:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857. Oct 14 05:55:53 localhost systemd[1]: tmp-crun.AYzAdr.mount: Deactivated successfully. 
Oct 14 05:55:53 localhost podman[295957]: 2025-10-14 09:55:53.665647882 +0000 UTC m=+0.077290580 container health_status 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.build-date=20251009, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, container_name=ovn_metadata_agent) Oct 14 05:55:53 localhost podman[295957]: 2025-10-14 09:55:53.701171747 +0000 UTC 
m=+0.112814485 container exec_died 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true) Oct 14 05:55:53 localhost systemd[1]: 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857.service: Deactivated successfully. 
Oct 14 05:55:53 localhost systemd[1]: tmp-crun.qMFewA.mount: Deactivated successfully. Oct 14 05:55:53 localhost podman[295956]: 2025-10-14 09:55:53.727123855 +0000 UTC m=+0.139882159 container health_status 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Oct 14 05:55:53 localhost podman[295956]: 2025-10-14 09:55:53.739152077 +0000 UTC m=+0.151910390 container exec_died 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': 
['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Oct 14 05:55:53 localhost systemd[1]: 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041.service: Deactivated successfully. Oct 14 05:55:53 localhost python3[295955]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute_init.json log_base_path=/var/log/containers/stdouts debug=False Oct 14 05:55:54 localhost python3[295955]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: [#012 {#012 "Id": "b5b57d3572ac74b7c41332c066527d5039dbd47e134e43d7cb5d76b7732d99f5",#012 "Digest": "sha256:6cdce1b6b9f1175545fa217f885c1a3360bebe7d9975584481a6ff221f3ad48f",#012 "RepoTags": [#012 "quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified"#012 ],#012 "RepoDigests": [#012 "quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:6cdce1b6b9f1175545fa217f885c1a3360bebe7d9975584481a6ff221f3ad48f"#012 ],#012 "Parent": "",#012 "Comment": "",#012 "Created": "2025-10-13T12:50:19.385564198Z",#012 "Config": {#012 "User": "nova",#012 "Env": [#012 "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin",#012 "LANG=en_US.UTF-8",#012 "TZ=UTC",#012 "container=oci"#012 ],#012 "Entrypoint": [#012 "dumb-init",#012 "--single-child",#012 "--"#012 ],#012 "Cmd": [#012 "kolla_start"#012 ],#012 "Labels": {#012 "io.buildah.version": "1.41.3",#012 "maintainer": "OpenStack Kubernetes Operator team",#012 "org.label-schema.build-date": "20251009",#012 "org.label-schema.license": "GPLv2",#012 "org.label-schema.name": "CentOS Stream 9 Base Image",#012 "org.label-schema.schema-version": "1.0",#012 "org.label-schema.vendor": "CentOS",#012 "tcib_build_tag": "1e4eeec18f8da2b364b39b7a7358aef5",#012 "tcib_managed": "true"#012 
},#012 "StopSignal": "SIGTERM"#012 },#012 "Version": "",#012 "Author": "",#012 "Architecture": "amd64",#012 "Os": "linux",#012 "Size": 1207014273,#012 "VirtualSize": 1207014273,#012 "GraphDriver": {#012 "Name": "overlay",#012 "Data": {#012 "LowerDir": "/var/lib/containers/storage/overlay/512b226761ef17c0044cb14b83718aa6f9984afb51b1aeb63112d22d2fdccb36/diff:/var/lib/containers/storage/overlay/0accaf46e2ca98f20a95b21cea4fb623de0e5378cb14b163bca0a8771d84c861/diff:/var/lib/containers/storage/overlay/ab64777085904da680013c790c3f2c65f0b954578737ec4d7fa836f56655c34a/diff:/var/lib/containers/storage/overlay/f3f40f6483bf6d587286da9e86e40878c2aaaf723da5aa2364fff24f5ea28424/diff",#012 "UpperDir": "/var/lib/containers/storage/overlay/5ce6c5d0cc60f856680938093014249abcf9a107a94355720d953b1d1e7f1bfe/diff",#012 "WorkDir": "/var/lib/containers/storage/overlay/5ce6c5d0cc60f856680938093014249abcf9a107a94355720d953b1d1e7f1bfe/work"#012 }#012 },#012 "RootFS": {#012 "Type": "layers",#012 "Layers": [#012 "sha256:f3f40f6483bf6d587286da9e86e40878c2aaaf723da5aa2364fff24f5ea28424",#012 "sha256:2c35d1af0a6e73cbcf6c04a576d2e6a150aeaa6ae9408c81b2003edd71d6ae59",#012 "sha256:3ad61591f8d467f7db4e096e1991f274fe1d4f8ad685b553dacb57c5e894eab0",#012 "sha256:e0ba9b00dd1340fa4eba9e9cd5f316c11381d47a31460e5b834a6ca56f60033f",#012 "sha256:731e9354c974a424a2f6724faa85f84baef270eb006be0de18bbdc87ff420f97"#012 ]#012 },#012 "Labels": {#012 "io.buildah.version": "1.41.3",#012 "maintainer": "OpenStack Kubernetes Operator team",#012 "org.label-schema.build-date": "20251009",#012 "org.label-schema.license": "GPLv2",#012 "org.label-schema.name": "CentOS Stream 9 Base Image",#012 "org.label-schema.schema-version": "1.0",#012 "org.label-schema.vendor": "CentOS",#012 "tcib_build_tag": "1e4eeec18f8da2b364b39b7a7358aef5",#012 "tcib_managed": "true"#012 },#012 "Annotations": {},#012 "ManifestType": "application/vnd.docker.distribution.manifest.v2+json",#012 "User": "nova",#012 "History": [#012 {#012 "created": 
"2025-10-09T00:18:03.867908726Z",#012 "created_by": "/bin/sh -c #(nop) ADD file:b2e608b9da8e087a764c2aebbd9c2cc9181047f5b301f1dab77fdf098a28268b in / ",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-10-09T00:18:03.868015697Z",#012 "created_by": "/bin/sh -c #(nop) LABEL org.label-schema.schema-version=\"1.0\" org.label-schema.name=\"CentOS Stream 9 Base Image\" org.label-schema.vendor=\"CentOS\" org.label-schema.license=\"GPLv2\" org.label-schema.build-date=\"20251009\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-10-09T00:18:07.890794359Z",#012 "created_by": "/bin/sh -c #(nop) CMD [\"/bin/bash\"]"#012 },#012 {#012 "created": "2025-10-13T12:28:42.843286399Z",#012 "created_by": "/bin/sh -c #(nop) LABEL maintainer=\"OpenStack Kubernetes Operator team\"",#012 "comment": "FROM quay.io/centos/centos:stream9",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-10-13T12:28:42.843354051Z",#012 "created_by": "/bin/sh -c #(nop) LABEL tcib_managed=true",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-10-13T12:28:42.843394192Z",#012 "created_by": "/bin/sh -c #(nop) ENV LANG=\"en_US.UTF-8\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-10-13T12:28:42.843417133Z",#012 "created_by": "/bin/sh -c #(nop) ENV TZ=\"UTC\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-10-13T12:28:42.843442193Z",#012 "created_by": "/bin/sh -c #(nop) ENV container=\"oci\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-10-13T12:28:42.843461914Z",#012 "created_by": "/bin/sh -c #(nop) USER root",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-10-13T12:28:43.236856724Z",#012 "created_by": "/bin/sh -c if [ -f \"/etc/yum.repos.d/ubi.repo\" ]; then rm -f /etc/yum.repos.d/ubi.repo && dnf clean all && rm -rf /var/cache/dnf; fi",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-10-13T12:29:17.539596691Z",#012 "created_by": "/bin/sh -c dnf install -y crudini && crudini --del /etc/dnf/dnf.conf main 
override_install_langs && crudini --set /etc/dnf/dnf.conf main clean_requirements_on_remove True && crudini --set /etc/dnf/dnf.conf main exactarch 1 && crudini --set /etc/dnf/dnf.conf main gpgcheck 1 && crudini --set /etc/dnf/dnf.conf main install_weak_deps False && if [ 'centos' == 'centos' ];then crudini --set /etc/dnf/dnf.conf main best False; fi && crudini --set /etc/dnf/dnf.conf main installonly_limit 0 && crudini --set /etc/dnf/dnf.conf main keepcache 0 && crudini --set /etc/dnf/dnf.conf main obsoletes 1 && crudini --set /etc/dnf/dnf.conf main plugins 1 && crudini --set /etc/dnf/dnf.conf main skip_missing_names_on_install False && crudini --set /etc/dnf/dnf.conf main tsflags nodocs",#012 "empty_layer": true#012 },#012 {#012 Oct 14 05:55:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23256 DF PROTO=TCP SPT=52354 DPT=9102 SEQ=4186257601 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B1850DA0000000001030307) Oct 14 05:55:54 localhost python3.9[296171]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Oct 14 05:55:55 localhost nova_compute[238069]: 2025-10-14 09:55:55.024 2 DEBUG oslo_service.periodic_task [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 05:55:55 localhost nova_compute[238069]: 2025-10-14 09:55:55.024 2 DEBUG nova.compute.manager [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... 
_reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Oct 14 05:55:55 localhost nova_compute[238069]: 2025-10-14 09:55:55.343 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:55:56 localhost python3.9[296283]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute.json debug=False Oct 14 05:55:56 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23257 DF PROTO=TCP SPT=52354 DPT=9102 SEQ=4186257601 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B1858DA0000000001030307) Oct 14 05:55:57 localhost nova_compute[238069]: 2025-10-14 09:55:57.023 2 DEBUG oslo_service.periodic_task [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 05:55:57 localhost nova_compute[238069]: 2025-10-14 09:55:57.037 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:55:57 localhost nova_compute[238069]: 2025-10-14 09:55:57.057 2 DEBUG oslo_concurrency.lockutils [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Oct 14 05:55:57 localhost nova_compute[238069]: 2025-10-14 09:55:57.057 2 DEBUG oslo_concurrency.lockutils [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Lock "compute_resources" acquired by 
"nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Oct 14 05:55:57 localhost nova_compute[238069]: 2025-10-14 09:55:57.057 2 DEBUG oslo_concurrency.lockutils [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Oct 14 05:55:57 localhost nova_compute[238069]: 2025-10-14 09:55:57.058 2 DEBUG nova.compute.resource_tracker [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Auditing locally available compute resources for np0005486733.localdomain (node: np0005486733.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Oct 14 05:55:57 localhost nova_compute[238069]: 2025-10-14 09:55:57.058 2 DEBUG oslo_concurrency.processutils [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Oct 14 05:55:57 localhost python3.9[296393]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data Oct 14 05:55:57 localhost nova_compute[238069]: 2025-10-14 09:55:57.491 2 DEBUG oslo_concurrency.processutils [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.434s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Oct 14 05:55:57 localhost nova_compute[238069]: 2025-10-14 09:55:57.559 2 DEBUG nova.virt.libvirt.driver [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] skipping disk for instance-00000002 as it does not have a path 
_get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Oct 14 05:55:57 localhost nova_compute[238069]: 2025-10-14 09:55:57.561 2 DEBUG nova.virt.libvirt.driver [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Oct 14 05:55:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:55:57.764 163055 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Oct 14 05:55:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:55:57.765 163055 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Oct 14 05:55:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:55:57.766 163055 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Oct 14 05:55:57 localhost nova_compute[238069]: 2025-10-14 09:55:57.776 2 WARNING nova.virt.libvirt.driver [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Oct 14 05:55:57 localhost nova_compute[238069]: 2025-10-14 09:55:57.777 2 DEBUG nova.compute.resource_tracker [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Hypervisor/Node resource view: name=np0005486733.localdomain free_ram=11765MB free_disk=41.83725357055664GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": 
"1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Oct 14 05:55:57 localhost nova_compute[238069]: 2025-10-14 09:55:57.778 2 DEBUG oslo_concurrency.lockutils [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Oct 14 05:55:57 localhost nova_compute[238069]: 2025-10-14 09:55:57.779 2 DEBUG oslo_concurrency.lockutils [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Oct 14 05:55:57 localhost nova_compute[238069]: 2025-10-14 09:55:57.852 2 DEBUG nova.compute.resource_tracker [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Instance 88c4e366-b765-47a6-96bf-f7677f2ce67c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Oct 14 05:55:57 localhost nova_compute[238069]: 2025-10-14 09:55:57.853 2 DEBUG nova.compute.resource_tracker [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Oct 14 05:55:57 localhost nova_compute[238069]: 2025-10-14 09:55:57.853 2 DEBUG nova.compute.resource_tracker [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Final resource view: name=np0005486733.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Oct 14 05:55:57 localhost nova_compute[238069]: 2025-10-14 09:55:57.908 2 DEBUG oslo_concurrency.processutils [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Oct 14 05:55:58 localhost python3[296525]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute.json log_base_path=/var/log/containers/stdouts debug=False Oct 14 05:55:58 localhost python3[296525]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: [#012 {#012 "Id": "b5b57d3572ac74b7c41332c066527d5039dbd47e134e43d7cb5d76b7732d99f5",#012 "Digest": "sha256:6cdce1b6b9f1175545fa217f885c1a3360bebe7d9975584481a6ff221f3ad48f",#012 "RepoTags": [#012 "quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified"#012 ],#012 "RepoDigests": [#012 
"quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:6cdce1b6b9f1175545fa217f885c1a3360bebe7d9975584481a6ff221f3ad48f"#012 ],#012 "Parent": "",#012 "Comment": "",#012 "Created": "2025-10-13T12:50:19.385564198Z",#012 "Config": {#012 "User": "nova",#012 "Env": [#012 "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin",#012 "LANG=en_US.UTF-8",#012 "TZ=UTC",#012 "container=oci"#012 ],#012 "Entrypoint": [#012 "dumb-init",#012 "--single-child",#012 "--"#012 ],#012 "Cmd": [#012 "kolla_start"#012 ],#012 "Labels": {#012 "io.buildah.version": "1.41.3",#012 "maintainer": "OpenStack Kubernetes Operator team",#012 "org.label-schema.build-date": "20251009",#012 "org.label-schema.license": "GPLv2",#012 "org.label-schema.name": "CentOS Stream 9 Base Image",#012 "org.label-schema.schema-version": "1.0",#012 "org.label-schema.vendor": "CentOS",#012 "tcib_build_tag": "1e4eeec18f8da2b364b39b7a7358aef5",#012 "tcib_managed": "true"#012 },#012 "StopSignal": "SIGTERM"#012 },#012 "Version": "",#012 "Author": "",#012 "Architecture": "amd64",#012 "Os": "linux",#012 "Size": 1207014273,#012 "VirtualSize": 1207014273,#012 "GraphDriver": {#012 "Name": "overlay",#012 "Data": {#012 "LowerDir": "/var/lib/containers/storage/overlay/512b226761ef17c0044cb14b83718aa6f9984afb51b1aeb63112d22d2fdccb36/diff:/var/lib/containers/storage/overlay/0accaf46e2ca98f20a95b21cea4fb623de0e5378cb14b163bca0a8771d84c861/diff:/var/lib/containers/storage/overlay/ab64777085904da680013c790c3f2c65f0b954578737ec4d7fa836f56655c34a/diff:/var/lib/containers/storage/overlay/f3f40f6483bf6d587286da9e86e40878c2aaaf723da5aa2364fff24f5ea28424/diff",#012 "UpperDir": "/var/lib/containers/storage/overlay/5ce6c5d0cc60f856680938093014249abcf9a107a94355720d953b1d1e7f1bfe/diff",#012 "WorkDir": "/var/lib/containers/storage/overlay/5ce6c5d0cc60f856680938093014249abcf9a107a94355720d953b1d1e7f1bfe/work"#012 }#012 },#012 "RootFS": {#012 "Type": "layers",#012 "Layers": [#012 
"sha256:f3f40f6483bf6d587286da9e86e40878c2aaaf723da5aa2364fff24f5ea28424",#012 "sha256:2c35d1af0a6e73cbcf6c04a576d2e6a150aeaa6ae9408c81b2003edd71d6ae59",#012 "sha256:3ad61591f8d467f7db4e096e1991f274fe1d4f8ad685b553dacb57c5e894eab0",#012 "sha256:e0ba9b00dd1340fa4eba9e9cd5f316c11381d47a31460e5b834a6ca56f60033f",#012 "sha256:731e9354c974a424a2f6724faa85f84baef270eb006be0de18bbdc87ff420f97"#012 ]#012 },#012 "Labels": {#012 "io.buildah.version": "1.41.3",#012 "maintainer": "OpenStack Kubernetes Operator team",#012 "org.label-schema.build-date": "20251009",#012 "org.label-schema.license": "GPLv2",#012 "org.label-schema.name": "CentOS Stream 9 Base Image",#012 "org.label-schema.schema-version": "1.0",#012 "org.label-schema.vendor": "CentOS",#012 "tcib_build_tag": "1e4eeec18f8da2b364b39b7a7358aef5",#012 "tcib_managed": "true"#012 },#012 "Annotations": {},#012 "ManifestType": "application/vnd.docker.distribution.manifest.v2+json",#012 "User": "nova",#012 "History": [#012 {#012 "created": "2025-10-09T00:18:03.867908726Z",#012 "created_by": "/bin/sh -c #(nop) ADD file:b2e608b9da8e087a764c2aebbd9c2cc9181047f5b301f1dab77fdf098a28268b in / ",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-10-09T00:18:03.868015697Z",#012 "created_by": "/bin/sh -c #(nop) LABEL org.label-schema.schema-version=\"1.0\" org.label-schema.name=\"CentOS Stream 9 Base Image\" org.label-schema.vendor=\"CentOS\" org.label-schema.license=\"GPLv2\" org.label-schema.build-date=\"20251009\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-10-09T00:18:07.890794359Z",#012 "created_by": "/bin/sh -c #(nop) CMD [\"/bin/bash\"]"#012 },#012 {#012 "created": "2025-10-13T12:28:42.843286399Z",#012 "created_by": "/bin/sh -c #(nop) LABEL maintainer=\"OpenStack Kubernetes Operator team\"",#012 "comment": "FROM quay.io/centos/centos:stream9",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-10-13T12:28:42.843354051Z",#012 "created_by": "/bin/sh -c #(nop) LABEL tcib_managed=true",#012 
"empty_layer": true#012 },#012 {#012 "created": "2025-10-13T12:28:42.843394192Z",#012 "created_by": "/bin/sh -c #(nop) ENV LANG=\"en_US.UTF-8\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-10-13T12:28:42.843417133Z",#012 "created_by": "/bin/sh -c #(nop) ENV TZ=\"UTC\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-10-13T12:28:42.843442193Z",#012 "created_by": "/bin/sh -c #(nop) ENV container=\"oci\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-10-13T12:28:42.843461914Z",#012 "created_by": "/bin/sh -c #(nop) USER root",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-10-13T12:28:43.236856724Z",#012 "created_by": "/bin/sh -c if [ -f \"/etc/yum.repos.d/ubi.repo\" ]; then rm -f /etc/yum.repos.d/ubi.repo && dnf clean all && rm -rf /var/cache/dnf; fi",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-10-13T12:29:17.539596691Z",#012 "created_by": "/bin/sh -c dnf install -y crudini && crudini --del /etc/dnf/dnf.conf main override_install_langs && crudini --set /etc/dnf/dnf.conf main clean_requirements_on_remove True && crudini --set /etc/dnf/dnf.conf main exactarch 1 && crudini --set /etc/dnf/dnf.conf main gpgcheck 1 && crudini --set /etc/dnf/dnf.conf main install_weak_deps False && if [ 'centos' == 'centos' ];then crudini --set /etc/dnf/dnf.conf main best False; fi && crudini --set /etc/dnf/dnf.conf main installonly_limit 0 && crudini --set /etc/dnf/dnf.conf main keepcache 0 && crudini --set /etc/dnf/dnf.conf main obsoletes 1 && crudini --set /etc/dnf/dnf.conf main plugins 1 && crudini --set /etc/dnf/dnf.conf main skip_missing_names_on_install False && crudini --set /etc/dnf/dnf.conf main tsflags nodocs",#012 "empty_layer": true#012 },#012 {#012 Oct 14 05:55:58 localhost podman[248187]: time="2025-10-14T09:55:58Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Oct 14 05:55:58 localhost podman[248187]: @ - - [14/Oct/2025:09:55:58 +0000] "GET 
/v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 141158 "" "Go-http-client/1.1" Oct 14 05:55:58 localhost podman[248187]: @ - - [14/Oct/2025:09:55:58 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18352 "" "Go-http-client/1.1" Oct 14 05:55:58 localhost nova_compute[238069]: 2025-10-14 09:55:58.391 2 DEBUG oslo_concurrency.processutils [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.483s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Oct 14 05:55:58 localhost nova_compute[238069]: 2025-10-14 09:55:58.396 2 DEBUG nova.compute.provider_tree [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Inventory has not changed in ProviderTree for provider: 18c24273-aca2-4f08-be57-3188d558235e update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Oct 14 05:55:58 localhost nova_compute[238069]: 2025-10-14 09:55:58.411 2 DEBUG nova.scheduler.client.report [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Inventory has not changed for provider 18c24273-aca2-4f08-be57-3188d558235e based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Oct 14 05:55:58 localhost nova_compute[238069]: 2025-10-14 09:55:58.412 2 DEBUG nova.compute.resource_tracker [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Compute_service record updated for 
np0005486733.localdomain:np0005486733.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Oct 14 05:55:58 localhost nova_compute[238069]: 2025-10-14 09:55:58.412 2 DEBUG oslo_concurrency.lockutils [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.634s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Oct 14 05:55:59 localhost python3.9[296720]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Oct 14 05:56:00 localhost python3.9[296832]: ansible-file Invoked with path=/etc/systemd/system/edpm_nova_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:56:00 localhost nova_compute[238069]: 2025-10-14 09:56:00.346 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:56:00 localhost nova_compute[238069]: 2025-10-14 09:56:00.409 2 DEBUG oslo_service.periodic_task [None req-6724c12e-f66e-42db-9277-35ea9ecdd8c1 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 05:56:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23258 DF PROTO=TCP SPT=52354 DPT=9102 SEQ=4186257601 ACK=0 WINDOW=32640 
RES=0x00 SYN URGP=0 OPT (020405500402080A2B18689A0000000001030307) Oct 14 05:56:00 localhost python3.9[296941]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1760435760.185972-4554-156036735472379/source dest=/etc/systemd/system/edpm_nova_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:56:01 localhost python3.9[296996]: ansible-systemd Invoked with state=started name=edpm_nova_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Oct 14 05:56:02 localhost nova_compute[238069]: 2025-10-14 09:56:02.036 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:56:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29. 
Oct 14 05:56:02 localhost podman[297016]: 2025-10-14 09:56:02.760471638 +0000 UTC m=+0.098139473 container health_status fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=iscsid, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.build-date=20251009, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team) Oct 14 05:56:02 localhost podman[297016]: 2025-10-14 09:56:02.771058304 +0000 UTC m=+0.108726129 container exec_died fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 
(image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_id=iscsid, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS) Oct 14 05:56:02 localhost systemd[1]: fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29.service: Deactivated successfully. 
Oct 14 05:56:03 localhost python3.9[297125]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner_healthcheck.service follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Oct 14 05:56:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec. Oct 14 05:56:04 localhost python3.9[297233]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Oct 14 05:56:04 localhost podman[297234]: 2025-10-14 09:56:04.74335254 +0000 UTC m=+0.081123079 container health_status aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', 
'/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors ) Oct 14 05:56:04 localhost podman[297234]: 2025-10-14 09:56:04.752918584 +0000 UTC m=+0.090689163 container exec_died aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Oct 14 05:56:04 localhost systemd[1]: aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec.service: Deactivated successfully. 
Oct 14 05:56:05 localhost nova_compute[238069]: 2025-10-14 09:56:05.348 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:56:05 localhost python3.9[297364]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service.requires follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Oct 14 05:56:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad. Oct 14 05:56:06 localhost podman[297420]: 2025-10-14 09:56:06.739197852 +0000 UTC m=+0.079969674 container health_status 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', 
'/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.vendor=CentOS, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image) Oct 14 05:56:06 localhost podman[297420]: 2025-10-14 09:56:06.750011685 +0000 UTC m=+0.090783487 container exec_died 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, 
managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true, config_id=multipathd) Oct 14 05:56:06 localhost systemd[1]: 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad.service: Deactivated successfully. Oct 14 05:56:07 localhost nova_compute[238069]: 2025-10-14 09:56:07.070 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:56:07 localhost python3.9[297492]: ansible-containers.podman.podman_container Invoked with name=nova_nvme_cleaner state=absent executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None 
hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None Oct 14 05:56:07 localhost systemd-journald[47488]: Field hash table of /run/log/journal/8e1d5208cffec42b50976967e1d1cfd0/system.journal has a fill level at 105.1 (350 of 333 items), suggesting rotation. Oct 14 05:56:07 localhost systemd-journald[47488]: /run/log/journal/8e1d5208cffec42b50976967e1d1cfd0/system.journal: Journal header limits reached or header out-of-date, rotating. Oct 14 05:56:07 localhost rsyslogd[759]: imjournal: journal files changed, reloading... [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ] Oct 14 05:56:07 localhost rsyslogd[759]: imjournal: journal files changed, reloading... 
[v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ] Oct 14 05:56:07 localhost rsyslogd[759]: imjournal: journal files changed, reloading... [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ] Oct 14 05:56:08 localhost python3.9[297627]: ansible-ansible.builtin.systemd Invoked with name=edpm_nova_compute.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None Oct 14 05:56:08 localhost systemd[1]: Stopping nova_compute container... Oct 14 05:56:08 localhost openstack_network_exporter[250374]: ERROR 09:56:08 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Oct 14 05:56:08 localhost openstack_network_exporter[250374]: ERROR 09:56:08 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Oct 14 05:56:08 localhost openstack_network_exporter[250374]: Oct 14 05:56:08 localhost openstack_network_exporter[250374]: ERROR 09:56:08 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Oct 14 05:56:08 localhost openstack_network_exporter[250374]: ERROR 09:56:08 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Oct 14 05:56:08 localhost openstack_network_exporter[250374]: ERROR 09:56:08 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Oct 14 05:56:08 localhost openstack_network_exporter[250374]: Oct 14 05:56:09 localhost nova_compute[238069]: 2025-10-14 09:56:09.434 2 WARNING amqp [-] Received method (60, 30) during closing channel 1. 
This method will be ignored#033[00m Oct 14 05:56:09 localhost nova_compute[238069]: 2025-10-14 09:56:09.436 2 DEBUG oslo_concurrency.lockutils [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Oct 14 05:56:09 localhost nova_compute[238069]: 2025-10-14 09:56:09.436 2 DEBUG oslo_concurrency.lockutils [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Oct 14 05:56:09 localhost nova_compute[238069]: 2025-10-14 09:56:09.437 2 DEBUG oslo_concurrency.lockutils [None req-dd525213-dec9-4431-b417-bf2680a8c04b - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Oct 14 05:56:09 localhost journal[206742]: End of file while reading data: Input/output error Oct 14 05:56:09 localhost systemd[1]: libpod-b5bca3e82ce7d7a3523a446d3f76940f7a8c2f4ab9ab6051e2ae66a8045d2048.scope: Deactivated successfully. Oct 14 05:56:09 localhost systemd[1]: libpod-b5bca3e82ce7d7a3523a446d3f76940f7a8c2f4ab9ab6051e2ae66a8045d2048.scope: Consumed 22.832s CPU time. 
Oct 14 05:56:09 localhost podman[297631]: 2025-10-14 09:56:09.8396368 +0000 UTC m=+1.129749661 container died b5bca3e82ce7d7a3523a446d3f76940f7a8c2f4ab9ab6051e2ae66a8045d2048 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=edpm, container_name=nova_compute, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251009) Oct 14 05:56:09 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b5bca3e82ce7d7a3523a446d3f76940f7a8c2f4ab9ab6051e2ae66a8045d2048-userdata-shm.mount: Deactivated successfully. Oct 14 05:56:09 localhost systemd[1]: var-lib-containers-storage-overlay-533fa5474117d9c6b7dd39987146fbd30daa3403ff237c50bdb261ff835bc940-merged.mount: Deactivated successfully. 
Oct 14 05:56:09 localhost podman[297631]: 2025-10-14 09:56:09.999646118 +0000 UTC m=+1.289758959 container cleanup b5bca3e82ce7d7a3523a446d3f76940f7a8c2f4ab9ab6051e2ae66a8045d2048 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, container_name=nova_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251009, config_id=edpm, org.label-schema.schema-version=1.0) Oct 14 05:56:10 localhost podman[297631]: nova_compute Oct 14 05:56:10 localhost podman[297643]: 2025-10-14 09:56:10.002617719 +0000 UTC m=+0.157550722 container cleanup b5bca3e82ce7d7a3523a446d3f76940f7a8c2f4ab9ab6051e2ae66a8045d2048 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, 
config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=edpm, org.label-schema.build-date=20251009, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=nova_compute) Oct 14 05:56:10 localhost podman[297658]: 2025-10-14 09:56:10.10070724 +0000 UTC m=+0.055848421 container cleanup b5bca3e82ce7d7a3523a446d3f76940f7a8c2f4ab9ab6051e2ae66a8045d2048 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.build-date=20251009, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', 
'/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.schema-version=1.0, container_name=nova_compute, io.buildah.version=1.41.3, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true) Oct 14 05:56:10 localhost podman[297658]: nova_compute Oct 14 05:56:10 localhost systemd[1]: edpm_nova_compute.service: Deactivated successfully. Oct 14 05:56:10 localhost systemd[1]: Stopped nova_compute container. Oct 14 05:56:10 localhost systemd[1]: Starting nova_compute container... Oct 14 05:56:10 localhost systemd[1]: Started libcrun container. 
Oct 14 05:56:10 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/533fa5474117d9c6b7dd39987146fbd30daa3403ff237c50bdb261ff835bc940/merged/etc/multipath supports timestamps until 2038 (0x7fffffff) Oct 14 05:56:10 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/533fa5474117d9c6b7dd39987146fbd30daa3403ff237c50bdb261ff835bc940/merged/etc/nvme supports timestamps until 2038 (0x7fffffff) Oct 14 05:56:10 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/533fa5474117d9c6b7dd39987146fbd30daa3403ff237c50bdb261ff835bc940/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff) Oct 14 05:56:10 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/533fa5474117d9c6b7dd39987146fbd30daa3403ff237c50bdb261ff835bc940/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff) Oct 14 05:56:10 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/533fa5474117d9c6b7dd39987146fbd30daa3403ff237c50bdb261ff835bc940/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff) Oct 14 05:56:10 localhost podman[297671]: 2025-10-14 09:56:10.250053899 +0000 UTC m=+0.117528191 container init b5bca3e82ce7d7a3523a446d3f76940f7a8c2f4ab9ab6051e2ae66a8045d2048 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, config_id=edpm, container_name=nova_compute, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 
'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251009) Oct 14 05:56:10 localhost podman[297671]: 2025-10-14 09:56:10.259777999 +0000 UTC m=+0.127252301 container start b5bca3e82ce7d7a3523a446d3f76940f7a8c2f4ab9ab6051e2ae66a8045d2048 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', 
'/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=nova_compute, org.label-schema.name=CentOS Stream 9 Base Image) Oct 14 05:56:10 localhost podman[297671]: nova_compute Oct 14 05:56:10 localhost nova_compute[297686]: + sudo -E kolla_set_configs Oct 14 05:56:10 localhost systemd[1]: Started nova_compute container. Oct 14 05:56:10 localhost nova_compute[297686]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json Oct 14 05:56:10 localhost nova_compute[297686]: INFO:__main__:Validating config file Oct 14 05:56:10 localhost nova_compute[297686]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS Oct 14 05:56:10 localhost nova_compute[297686]: INFO:__main__:Copying service configuration files Oct 14 05:56:10 localhost nova_compute[297686]: INFO:__main__:Deleting /etc/nova/nova.conf Oct 14 05:56:10 localhost nova_compute[297686]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf Oct 14 05:56:10 localhost nova_compute[297686]: INFO:__main__:Setting permission for /etc/nova/nova.conf Oct 14 05:56:10 localhost nova_compute[297686]: INFO:__main__:Deleting /etc/nova/nova.conf.d/01-nova.conf Oct 14 05:56:10 localhost nova_compute[297686]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf Oct 14 05:56:10 localhost nova_compute[297686]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf Oct 14 05:56:10 localhost nova_compute[297686]: INFO:__main__:Deleting /etc/nova/nova.conf.d/03-ceph-nova.conf Oct 14 05:56:10 localhost nova_compute[297686]: INFO:__main__:Copying /var/lib/kolla/config_files/03-ceph-nova.conf to /etc/nova/nova.conf.d/03-ceph-nova.conf Oct 14 
05:56:10 localhost nova_compute[297686]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/03-ceph-nova.conf Oct 14 05:56:10 localhost nova_compute[297686]: INFO:__main__:Deleting /etc/nova/nova.conf.d/99-nova-compute-cells-workarounds.conf Oct 14 05:56:10 localhost nova_compute[297686]: INFO:__main__:Copying /var/lib/kolla/config_files/99-nova-compute-cells-workarounds.conf to /etc/nova/nova.conf.d/99-nova-compute-cells-workarounds.conf Oct 14 05:56:10 localhost nova_compute[297686]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/99-nova-compute-cells-workarounds.conf Oct 14 05:56:10 localhost nova_compute[297686]: INFO:__main__:Deleting /etc/nova/nova.conf.d/nova-blank.conf Oct 14 05:56:10 localhost nova_compute[297686]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf Oct 14 05:56:10 localhost nova_compute[297686]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf Oct 14 05:56:10 localhost nova_compute[297686]: INFO:__main__:Deleting /etc/nova/nova.conf.d/02-nova-host-specific.conf Oct 14 05:56:10 localhost nova_compute[297686]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf Oct 14 05:56:10 localhost nova_compute[297686]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf Oct 14 05:56:10 localhost nova_compute[297686]: INFO:__main__:Deleting /etc/ceph Oct 14 05:56:10 localhost nova_compute[297686]: INFO:__main__:Creating directory /etc/ceph Oct 14 05:56:10 localhost nova_compute[297686]: INFO:__main__:Setting permission for /etc/ceph Oct 14 05:56:10 localhost nova_compute[297686]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.conf to /etc/ceph/ceph.conf Oct 14 05:56:10 localhost nova_compute[297686]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf Oct 14 05:56:10 localhost nova_compute[297686]: INFO:__main__:Copying 
/var/lib/kolla/config_files/ceph/ceph.client.openstack.keyring to /etc/ceph/ceph.client.openstack.keyring Oct 14 05:56:10 localhost nova_compute[297686]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring Oct 14 05:56:10 localhost nova_compute[297686]: INFO:__main__:Deleting /var/lib/nova/.ssh/ssh-privatekey Oct 14 05:56:10 localhost nova_compute[297686]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey Oct 14 05:56:10 localhost nova_compute[297686]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey Oct 14 05:56:10 localhost nova_compute[297686]: INFO:__main__:Deleting /var/lib/nova/.ssh/config Oct 14 05:56:10 localhost nova_compute[297686]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config Oct 14 05:56:10 localhost nova_compute[297686]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config Oct 14 05:56:10 localhost nova_compute[297686]: INFO:__main__:Writing out command to execute Oct 14 05:56:10 localhost nova_compute[297686]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf Oct 14 05:56:10 localhost nova_compute[297686]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring Oct 14 05:56:10 localhost nova_compute[297686]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ Oct 14 05:56:10 localhost nova_compute[297686]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey Oct 14 05:56:10 localhost nova_compute[297686]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config Oct 14 05:56:10 localhost nova_compute[297686]: ++ cat /run_command Oct 14 05:56:10 localhost nova_compute[297686]: + CMD=nova-compute Oct 14 05:56:10 localhost nova_compute[297686]: + ARGS= Oct 14 05:56:10 localhost nova_compute[297686]: + sudo kolla_copy_cacerts Oct 14 05:56:10 localhost nova_compute[297686]: + [[ ! -n '' ]] Oct 14 05:56:10 localhost nova_compute[297686]: + . 
kolla_extend_start Oct 14 05:56:10 localhost nova_compute[297686]: + echo 'Running command: '\''nova-compute'\''' Oct 14 05:56:10 localhost nova_compute[297686]: Running command: 'nova-compute' Oct 14 05:56:10 localhost nova_compute[297686]: + umask 0022 Oct 14 05:56:10 localhost nova_compute[297686]: + exec nova-compute Oct 14 05:56:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f. Oct 14 05:56:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917. Oct 14 05:56:10 localhost podman[297715]: 2025-10-14 09:56:10.728906905 +0000 UTC m=+0.067529780 container health_status 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, org.label-schema.vendor=CentOS, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, 
managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0) Oct 14 05:56:10 localhost podman[297715]: 2025-10-14 09:56:10.761730866 +0000 UTC m=+0.100353711 container exec_died 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}) Oct 14 05:56:10 localhost systemd[1]: 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f.service: Deactivated successfully. 
Oct 14 05:56:10 localhost podman[297716]: 2025-10-14 09:56:10.84823397 +0000 UTC m=+0.180693226 container health_status 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.openshift.tags=minimal rhel9, vcs-type=git, architecture=x86_64, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, config_id=edpm, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b) Oct 14 05:56:10 localhost podman[297716]: 2025-10-14 09:56:10.863113188 +0000 UTC m=+0.195572494 container exec_died 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, version=9.6, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, config_id=edpm, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7) Oct 14 05:56:10 localhost systemd[1]: 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917.service: Deactivated successfully. Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.109 2 DEBUG os_vif [-] Loaded VIF plugin class '' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.109 2 DEBUG os_vif [-] Loaded VIF plugin class '' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.110 2 DEBUG os_vif [-] Loaded VIF plugin class '' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.110 2 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs#033[00m Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.225 2 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.246 2 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 0 in 0.021s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Oct 14 05:56:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d. Oct 14 05:56:12 localhost systemd[1]: tmp-crun.FsNbkx.mount: Deactivated successfully. 
Oct 14 05:56:12 localhost podman[297766]: 2025-10-14 09:56:12.740879875 +0000 UTC m=+0.085261937 container health_status 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=edpm, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, container_name=ceilometer_agent_compute, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0) Oct 14 05:56:12 localhost podman[297766]: 2025-10-14 09:56:12.77647148 +0000 UTC m=+0.120853532 container exec_died 
89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, managed_by=edpm_ansible, container_name=ceilometer_agent_compute, config_id=edpm, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}) Oct 14 05:56:12 localhost systemd[1]: 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d.service: Deactivated successfully. 
Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.809 2 INFO nova.virt.driver [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'#033[00m Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.936 2 INFO nova.compute.provider_config [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.#033[00m Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.944 2 DEBUG oslo_concurrency.lockutils [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.945 2 DEBUG oslo_concurrency.lockutils [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.945 2 DEBUG oslo_concurrency.lockutils [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.945 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362#033[00m Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.946 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 
09:56:12.946 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.946 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.947 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.947 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.947 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] allow_resize_to_same_host = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.947 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] arq_binding_timeout = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.947 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] backdoor_port = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.948 2 DEBUG 
oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] backdoor_socket = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.948 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] block_device_allocate_retries = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.948 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.948 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] cert = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.949 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] compute_driver = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.949 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] compute_monitors = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.949 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] config_dir = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.949 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] 
config_drive_format = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.949 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] config_file = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.950 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] config_source = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.950 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] console_host = np0005486733.localdomain log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.950 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] control_exchange = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.951 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] cpu_allocation_ratio = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.951 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] daemon = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.951 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] debug = True log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.951 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.951 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] default_availability_zone = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.952 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] default_ephemeral_format = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.952 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] default_log_levels = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.952 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] default_schedule_zone = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:56:12 localhost 
nova_compute[297686]: 2025-10-14 09:56:12.952 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] disk_allocation_ratio = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.953 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] enable_new_services = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.953 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] enabled_apis = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.953 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] enabled_ssl_apis = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.953 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] flat_injected = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.954 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] force_config_drive = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.954 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] force_raw_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.954 2 DEBUG oslo_service.service [None 
req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] graceful_shutdown_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.954 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.955 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] host = np0005486733.localdomain log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.955 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] initial_cpu_allocation_ratio = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.955 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] initial_disk_allocation_ratio = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.955 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] initial_ram_allocation_ratio = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.956 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] injected_network_template = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.956 2 DEBUG oslo_service.service [None 
req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] instance_build_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.956 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] instance_delete_interval = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.956 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] instance_format = [instance: %(uuid)s] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.957 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] instance_name_template = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.957 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] instance_usage_audit = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.957 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] instance_usage_audit_period = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.957 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] instance_uuid_format = [instance: %(uuid)s] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.958 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] 
instances_path = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.958 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.958 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] key = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.958 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] live_migration_retry_count = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.959 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] log_config_append = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.959 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] log_date_format = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.959 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] log_dir = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.959 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] log_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 
05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.959 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] log_options = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.960 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] log_rotate_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.960 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] log_rotate_interval_type = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.960 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] log_rotation_type = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.960 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] logging_context_format_string = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.961 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] logging_debug_format_suffix = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.961 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] logging_default_format_string = %(asctime)s.%(msecs)03d 
%(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.961 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] logging_exception_prefix = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.961 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] logging_user_identity_format = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.962 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] long_rpc_timeout = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.962 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] max_concurrent_builds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.962 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.962 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] max_concurrent_snapshots = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.962 2 DEBUG 
oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] max_local_block_devices = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.963 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] max_logfile_count = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.963 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] max_logfile_size_mb = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.963 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.963 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] metadata_listen = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.964 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] metadata_listen_port = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.964 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] metadata_workers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.964 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] migrate_max_retries = -1 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.964 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] mkisofs_cmd = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.965 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] my_block_storage_ip = 192.168.122.108 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.965 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] my_ip = 192.168.122.108 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.965 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] network_allocate_retries = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.965 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.965 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] osapi_compute_listen = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.966 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] osapi_compute_listen_port = 8774 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.966 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] osapi_compute_unique_server_name_scope = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.966 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] osapi_compute_workers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.966 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] password_length = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.967 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] periodic_enable = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.967 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] periodic_fuzzy_delay = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.967 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] pointer_model = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.967 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] preallocate_images = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Oct 14 05:56:12 localhost nova_compute[297686]: 
2025-10-14 09:56:12.968 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] publish_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.968 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] pybasedir = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.968 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] ram_allocation_ratio = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.968 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] rate_limit_burst = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.968 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] rate_limit_except_level = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.969 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] rate_limit_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.969 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] reboot_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.969 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] reclaim_instance_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.970 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] record = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.970 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] reimage_timeout_per_gb = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.970 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] report_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.970 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] rescue_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.970 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] reserved_host_cpus = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.971 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] reserved_host_disk_mb = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.971 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] reserved_host_memory_mb = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.971 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] reserved_huge_pages = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.971 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] resize_confirm_window = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.972 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] resize_fs_using_block_device = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.972 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.972 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] rootwrap_config = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.972 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] rpc_response_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.973 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] run_external_periodic_tasks = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.973 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.973 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.973 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.973 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.974 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] service_down_time = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.974 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] servicegroup_driver = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.974 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] shelved_offload_time = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.974 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] shelved_poll_interval = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.975 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] shutdown_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.975 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] source_is_ipv6 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.975 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] ssl_only = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.975 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] state_path = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.975 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] sync_power_state_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.976 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] sync_power_state_pool_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.976 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] syslog_log_facility = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.976 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] tempdir = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.976 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] timeout_nbd = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.977 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.977 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] update_resources_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.977 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] use_cow_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.977 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] use_eventlog = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.977 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] use_journal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.978 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] use_json = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.978 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] use_rootwrap_daemon = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.978 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] use_stderr = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.978 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] use_syslog = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.979 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] vcpu_pin_set = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.979 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] vif_plugging_is_fatal = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.979 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] vif_plugging_timeout = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.979 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] virt_mkfs = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.980 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] volume_usage_poll_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.980 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] watch_log_file = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.980 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] web = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.980 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.981 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] oslo_concurrency.lock_path = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.981 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.981 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.981 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] oslo_messaging_metrics.metrics_process_name = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.982 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.982 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.982 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] api.auth_strategy = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.982 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] api.compute_link_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.982 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.983 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] api.dhcp_domain = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.983 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] api.enable_instance_password = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.983 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] api.glance_link_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.983 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.983 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.984 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.984 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.984 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] api.local_metadata_per_cell = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.984 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] api.max_limit = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.985 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] api.metadata_cache_expiration = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.985 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] api.neutron_default_tenant_id = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.985 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] api.use_forwarded_for = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.985 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] api.use_neutron_default_nets = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.985 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.986 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.986 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.986 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] api.vendordata_dynamic_ssl_certfile = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.986 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.986 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] api.vendordata_jsonfile_path = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.987 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] api.vendordata_providers = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.987 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] cache.backend = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.987 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] cache.backend_argument = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.987 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] cache.config_prefix = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.987 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] cache.dead_timeout = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.988 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] cache.debug_cache_backend = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.988 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] cache.enable_retry_client = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.988 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] cache.enable_socket_keepalive = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.988 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] cache.enabled = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.988 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] cache.expiration_time = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.989 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.989 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] cache.hashclient_retry_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.989 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] cache.memcache_dead_retry = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.989 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] cache.memcache_password = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.989 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.990 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.990 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] cache.memcache_pool_maxsize = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.990 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.990 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] cache.memcache_sasl_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.990 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] cache.memcache_servers = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.991 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] cache.memcache_socket_timeout = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.991 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] cache.memcache_username = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.991 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] cache.proxies = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.991 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] cache.retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.991 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] cache.retry_delay = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.992 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] cache.socket_keepalive_count = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.992 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] cache.socket_keepalive_idle = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.992 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.992 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] cache.tls_allowed_ciphers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.992 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] cache.tls_cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.993 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] cache.tls_certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.993 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] cache.tls_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.993 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] cache.tls_keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.993 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] cinder.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.993 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] cinder.auth_type = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.994 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] cinder.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.994 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] cinder.catalog_info = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.994 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] cinder.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.994 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] cinder.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.994 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] cinder.cross_az_attach = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.995 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] cinder.debug = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.995 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] cinder.endpoint_template = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.995 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] cinder.http_retries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.995 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] cinder.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.995 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] cinder.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.996 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] cinder.os_region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.996 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] cinder.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.996 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] cinder.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.996 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.996 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] compute.cpu_dedicated_set = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.997 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] compute.cpu_shared_set = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.997 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.997 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.997 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.997 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.998 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.998 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.998 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.998 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.999 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] compute.vmdk_allowed_types = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.999 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] conductor.workers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.999 2
DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] console.allowed_origins = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:12 localhost nova_compute[297686]: 2025-10-14 09:56:12.999 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] console.ssl_ciphers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:12.999 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] console.ssl_minimum_version = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.000 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] consoleauth.token_ttl = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.000 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] cyborg.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.000 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] cyborg.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.000 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] cyborg.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.001 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] cyborg.connect_retries = None 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.001 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] cyborg.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.001 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] cyborg.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.001 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] cyborg.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.001 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] cyborg.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.002 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] cyborg.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.002 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] cyborg.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.002 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] cyborg.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost 
nova_compute[297686]: 2025-10-14 09:56:13.002 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] cyborg.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.003 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] cyborg.service_type = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.003 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] cyborg.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.003 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] cyborg.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.003 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.003 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] cyborg.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.004 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] cyborg.valid_interfaces = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.004 2 DEBUG oslo_service.service 
[None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] cyborg.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.004 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] database.backend = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.004 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] database.connection = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.004 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] database.connection_debug = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.005 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] database.connection_parameters = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.005 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.005 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] database.connection_trace = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.005 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] database.db_inc_retry_interval = 
True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.005 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] database.db_max_retries = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.006 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.006 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.006 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] database.max_overflow = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.006 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] database.max_pool_size = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.006 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] database.max_retries = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.007 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] database.mysql_enable_ndb = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 
localhost nova_compute[297686]: 2025-10-14 09:56:13.007 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] database.mysql_sql_mode = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.007 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.007 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] database.pool_timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.008 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] database.retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.008 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] database.slave_connection = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.008 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.008 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] api_database.backend = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.008 2 DEBUG 
oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] api_database.connection = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.009 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] api_database.connection_debug = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.009 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] api_database.connection_parameters = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.009 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.009 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] api_database.connection_trace = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.009 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.010 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] api_database.db_max_retries = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.010 2 DEBUG oslo_service.service [None 
req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.010 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.010 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] api_database.max_overflow = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.010 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] api_database.max_pool_size = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.011 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] api_database.max_retries = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.011 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] api_database.mysql_enable_ndb = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.011 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] api_database.mysql_sql_mode = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.011 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] 
api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.012 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] api_database.pool_timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.012 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] api_database.retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.012 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] api_database.slave_connection = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.012 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.012 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] devices.enabled_mdev_types = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.013 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.013 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] ephemeral_storage_encryption.enabled = 
False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.013 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.013 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] glance.api_servers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.013 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] glance.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.014 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] glance.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.014 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] glance.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.014 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] glance.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.014 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] glance.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 
localhost nova_compute[297686]: 2025-10-14 09:56:13.015 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] glance.debug = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.015 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.015 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.015 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] glance.enable_rbd_download = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.015 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] glance.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.016 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] glance.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.016 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] glance.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.016 2 DEBUG oslo_service.service 
[None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] glance.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.016 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] glance.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.016 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] glance.num_retries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.017 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] glance.rbd_ceph_conf = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.017 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] glance.rbd_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.017 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] glance.rbd_pool = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.017 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] glance.rbd_user = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.017 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] glance.region_name = regionOne log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.018 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] glance.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.018 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] glance.service_type = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.018 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] glance.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.018 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] glance.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.019 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.019 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] glance.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.019 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] glance.valid_interfaces = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost 
nova_compute[297686]: 2025-10-14 09:56:13.019 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.019 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] glance.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.020 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] guestfs.debug = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.020 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] hyperv.config_drive_cdrom = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.020 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.020 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] hyperv.dynamic_memory_ratio = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.021 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.021 2 DEBUG 
oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] hyperv.enable_remotefx = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.021 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] hyperv.instances_path_share = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.021 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] hyperv.iscsi_initiator_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.021 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] hyperv.limit_cpu_features = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.022 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.022 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.022 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.022 2 DEBUG oslo_service.service [None 
req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.022 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] hyperv.qemu_img_cmd = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.023 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] hyperv.use_multipath_io = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.023 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.023 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.023 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] hyperv.vswitch_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.023 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.024 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] 
mks.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.024 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] mks.mksproxy_base_url = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.024 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] image_cache.manager_interval = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.025 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.025 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.025 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.025 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.025 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] 
image_cache.subdirectory_name = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.026 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] ironic.api_max_retries = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.026 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] ironic.api_retry_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.026 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] ironic.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.026 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] ironic.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.027 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] ironic.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.027 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] ironic.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.028 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] ironic.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 
05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.028 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] ironic.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.028 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] ironic.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.028 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] ironic.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.028 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] ironic.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.029 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] ironic.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.029 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] ironic.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.029 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] ironic.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.029 2 DEBUG oslo_service.service [None 
req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] ironic.partition_key = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.029 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] ironic.peer_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.030 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] ironic.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.030 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.030 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] ironic.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.030 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] ironic.service_type = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.030 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] ironic.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.031 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] ironic.status_code_retries = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.031 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.031 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] ironic.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.031 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] ironic.valid_interfaces = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.031 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] ironic.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.032 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] key_manager.backend = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.032 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] key_manager.fixed_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.032 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] barbican.auth_endpoint = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m 
Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.032 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] barbican.barbican_api_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.033 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] barbican.barbican_endpoint = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.033 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.033 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] barbican.barbican_region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.033 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] barbican.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.033 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] barbican.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.034 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] barbican.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.034 2 DEBUG 
oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] barbican.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.034 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] barbican.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.034 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] barbican.number_of_retries = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.034 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] barbican.retry_delay = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.035 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.035 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] barbican.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.035 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] barbican.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.035 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] barbican.verify_ssl = True 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.035 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] barbican.verify_ssl_path = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.036 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.036 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.036 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] barbican_service_user.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.036 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.036 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.037 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] barbican_service_user.insecure = False log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.037 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] barbican_service_user.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.037 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.037 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] barbican_service_user.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.037 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] vault.approle_role_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.038 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] vault.approle_secret_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.038 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] vault.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.038 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] vault.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost 
nova_compute[297686]: 2025-10-14 09:56:13.038 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] vault.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.038 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] vault.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.039 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] vault.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.039 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] vault.kv_mountpoint = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.039 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] vault.kv_version = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.039 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] vault.namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.039 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] vault.root_token_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.040 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] 
vault.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.040 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] vault.ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.040 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] vault.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.040 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] vault.use_ssl = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.040 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] vault.vault_url = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.041 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] keystone.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.041 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] keystone.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.041 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] keystone.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 
05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.041 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] keystone.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.041 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] keystone.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.042 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] keystone.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.042 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] keystone.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.042 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] keystone.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.042 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] keystone.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.042 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] keystone.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.043 2 DEBUG oslo_service.service [None 
req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] keystone.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.043 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] keystone.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.043 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] keystone.service_type = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.043 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] keystone.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.043 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] keystone.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.044 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.044 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] keystone.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.044 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] keystone.valid_interfaces = 
['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.044 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] keystone.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.044 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] libvirt.connection_uri = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.045 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] libvirt.cpu_mode = host-model log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.045 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] libvirt.cpu_model_extra_flags = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.045 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] libvirt.cpu_models = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.045 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.045 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.046 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] libvirt.cpu_power_management = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.046 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.046 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.046 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] libvirt.device_detach_timeout = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.046 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] libvirt.disk_cachemodes = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.047 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] libvirt.disk_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.047 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] libvirt.enabled_perf_events = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m 
Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.047 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] libvirt.file_backed_memory = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.047 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] libvirt.gid_maps = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.047 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] libvirt.hw_disk_discard = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.048 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] libvirt.hw_machine_type = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.048 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] libvirt.images_rbd_ceph_conf = /etc/ceph/ceph.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.048 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.048 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.048 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] libvirt.images_rbd_glance_store_name = default_backend log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.048 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] libvirt.images_rbd_pool = vms log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.048 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] libvirt.images_type = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.049 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] libvirt.images_volume_group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.049 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] libvirt.inject_key = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.049 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] libvirt.inject_partition = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.049 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] libvirt.inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.049 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] libvirt.iscsi_iface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.049 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] libvirt.iser_use_multipath = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.049 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.050 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.050 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.050 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.050 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.050 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.050 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.050 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.051 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] libvirt.live_migration_scheme = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.051 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.051 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.051 2 WARNING oslo_config.cfg [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Oct 14 05:56:13 localhost nova_compute[297686]: live_migration_uri is deprecated for removal in favor of two other options that
Oct 14 05:56:13 localhost nova_compute[297686]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Oct 14 05:56:13 localhost nova_compute[297686]: and ``live_migration_inbound_addr`` respectively.
Oct 14 05:56:13 localhost nova_compute[297686]: ). Its value may be silently ignored in the future.
Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.051 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] libvirt.live_migration_uri = qemu+ssh://nova@%s/system?keyfile=/var/lib/nova/.ssh/ssh-privatekey log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.051 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] libvirt.live_migration_with_native_tls = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.052 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] libvirt.max_queues = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.052 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.052 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] libvirt.nfs_mount_options = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.052 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] libvirt.nfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.052 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.052 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] libvirt.num_iser_scan_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.052 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.053 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.053 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] libvirt.num_pcie_ports = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.053 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] libvirt.num_volume_scan_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.053 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] libvirt.pmem_namespaces = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.053 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] libvirt.quobyte_client_cfg = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.053 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.053 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] libvirt.rbd_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.054 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.054 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.054 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] libvirt.rbd_secret_uuid = fcadf6e2-9176-5818-a8d0-37b19acf8eaf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.054 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] libvirt.rbd_user = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.054 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.054 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.054 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] libvirt.rescue_image_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.055 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] libvirt.rescue_kernel_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.055 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] libvirt.rescue_ramdisk_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.055 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] libvirt.rng_dev_path = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.055 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] libvirt.rx_queue_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.055 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] libvirt.smbfs_mount_options = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.055 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.055 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] libvirt.snapshot_compression = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.056 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] libvirt.snapshot_image_format = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.057 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] libvirt.snapshots_directory = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.057 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.057 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] libvirt.swtpm_enabled = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.057 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] libvirt.swtpm_group = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.057 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] libvirt.swtpm_user = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.057 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] libvirt.sysinfo_serial = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.058 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] libvirt.tx_queue_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.058 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] libvirt.uid_maps = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.058 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.058 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] libvirt.virt_type = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.058 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] libvirt.volume_clear = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.058 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] libvirt.volume_clear_size = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.058 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] libvirt.volume_use_multipath = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.059 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] libvirt.vzstorage_cache_path = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.059 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] libvirt.vzstorage_log_path = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.059 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] libvirt.vzstorage_mount_group = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.059 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] libvirt.vzstorage_mount_opts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.059 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] libvirt.vzstorage_mount_perms = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.059 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.059 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] libvirt.vzstorage_mount_user = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.060 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.060 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] neutron.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.060 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] neutron.auth_type = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.060 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] neutron.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.060 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] neutron.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.060 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] neutron.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.060 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] neutron.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.061 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] neutron.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.061 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] neutron.default_floating_pool = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.061 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] neutron.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.061 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.061 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] neutron.http_retries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.061 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] neutron.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.061 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] neutron.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.062 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] neutron.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.062 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.062 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] neutron.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.062 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] neutron.ovs_bridge = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.062 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] neutron.physnets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.062 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] neutron.region_name = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.062 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.063 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] neutron.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.063 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] neutron.service_type = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.063 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] neutron.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.063 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] neutron.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.063 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.063 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] neutron.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.063 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] neutron.valid_interfaces = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.064 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] neutron.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.064 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.064 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] notifications.default_level = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.064 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.064 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.064 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.064 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] pci.alias = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.065 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] pci.device_spec = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.065 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] pci.report_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.065 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] placement.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.065 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] placement.auth_type = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.065 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] placement.auth_url = http://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.065 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] placement.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.065 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] placement.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.066 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] placement.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.066 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] placement.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.066 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] placement.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.066 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] placement.default_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.066 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] placement.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.066 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] placement.domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.066 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] placement.domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.067 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] placement.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.067 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] placement.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.067 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] placement.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.067 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] placement.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.067 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] placement.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.067 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] placement.password = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.067 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] placement.project_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.067 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] placement.project_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.068 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] placement.project_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.068 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] placement.project_name = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.068 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] placement.region_name = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.068 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] placement.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.068 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] placement.service_type = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.068 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] placement.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.069 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] placement.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.069 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.069 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] placement.system_scope = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.069 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] placement.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.069 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] placement.trust_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.069 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] placement.user_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.069 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] placement.user_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.069 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -]
placement.user_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.070 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] placement.username = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.070 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] placement.valid_interfaces = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.070 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] placement.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.070 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] quota.cores = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.070 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.070 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] quota.driver = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.070 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.071 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.071 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] quota.injected_files = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.071 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] quota.instances = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.071 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] quota.key_pairs = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.071 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] quota.metadata_items = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.071 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] quota.ram = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.071 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] quota.recheck_quota = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 
09:56:13.072 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] quota.server_group_members = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.072 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] quota.server_groups = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.072 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] rdp.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.072 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] rdp.html5_proxy_base_url = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.072 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.072 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.073 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.073 2 DEBUG 
oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.073 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] scheduler.max_attempts = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.073 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.073 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.073 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.073 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.074 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost 
nova_compute[297686]: 2025-10-14 09:56:13.074 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] scheduler.workers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.074 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.074 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.074 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.074 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.074 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.075 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.075 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.075 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.075 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.075 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.075 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.075 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.076 
2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.076 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.076 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.076 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.076 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.076 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.076 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 
09:56:13.076 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.077 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.077 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.077 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.077 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.077 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] metrics.required = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.077 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] metrics.weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 
localhost nova_compute[297686]: 2025-10-14 09:56:13.077 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] metrics.weight_of_unavailable = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.078 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] metrics.weight_setting = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.078 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] serial_console.base_url = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.078 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] serial_console.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.078 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] serial_console.port_range = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.078 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.078 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 
2025-10-14 09:56:13.079 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.079 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.079 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] service_user.auth_type = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.079 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] service_user.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.079 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.079 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.079 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.079 2 DEBUG oslo_service.service [None 
req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] service_user.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.080 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.080 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.080 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] service_user.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.080 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] spice.agent_enabled = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.080 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] spice.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.080 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] spice.html5proxy_base_url = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.081 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] 
spice.html5proxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.081 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] spice.html5proxy_port = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.081 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] spice.image_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.081 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] spice.jpeg_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.081 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] spice.playback_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.081 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] spice.server_listen = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.081 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.082 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] spice.streaming_mode = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.082 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] spice.zlib_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.082 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] upgrade_levels.baseapi = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.082 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] upgrade_levels.cert = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.082 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] upgrade_levels.compute = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.082 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] upgrade_levels.conductor = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.082 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] upgrade_levels.scheduler = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.083 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 
localhost nova_compute[297686]: 2025-10-14 09:56:13.083 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.083 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.083 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.083 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.083 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.083 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.084 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost 
nova_compute[297686]: 2025-10-14 09:56:13.084 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.084 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] vmware.api_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.084 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] vmware.ca_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.084 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] vmware.cache_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.084 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] vmware.cluster_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.084 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] vmware.connection_pool_size = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.084 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] vmware.console_delay_seconds = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.085 2 DEBUG oslo_service.service [None 
req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] vmware.datastore_regex = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.085 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] vmware.host_ip = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.085 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] vmware.host_password = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.085 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] vmware.host_port = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.085 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] vmware.host_username = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.085 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] vmware.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.085 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] vmware.integration_bridge = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.086 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] vmware.maximum_objects = 100 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.086 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] vmware.pbm_default_policy = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.086 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] vmware.pbm_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.086 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] vmware.pbm_wsdl_location = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.086 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] vmware.serial_log_dir = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.086 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] vmware.serial_port_proxy_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.086 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.087 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] vmware.task_poll_interval = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 
05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.087 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] vmware.use_linked_clone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.087 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] vmware.vnc_keymap = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.087 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] vmware.vnc_port = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.087 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] vmware.vnc_port_total = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.087 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] vnc.auth_schemes = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.087 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] vnc.enabled = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.088 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] vnc.novncproxy_base_url = http://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 
09:56:13.088 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] vnc.novncproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.088 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] vnc.novncproxy_port = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.088 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] vnc.server_listen = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.088 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] vnc.server_proxyclient_address = 192.168.122.108 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.088 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] vnc.vencrypt_ca_certs = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.089 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] vnc.vencrypt_client_cert = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.089 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] vnc.vencrypt_client_key = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.089 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - 
- - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.089 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.089 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.089 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.089 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.090 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] workarounds.disable_rootwrap = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.090 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.090 2 DEBUG oslo_service.service [None 
req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.090 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.090 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.090 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.090 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.090 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.091 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 
09:56:13.091 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.091 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.091 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.091 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.091 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.091 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.092 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.092 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] wsgi.api_paste_config = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.092 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] wsgi.client_socket_timeout = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.092 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] wsgi.default_pool_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.092 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] wsgi.keep_alive = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.092 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] wsgi.max_header_line = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.092 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] wsgi.secure_proxy_ssl_header = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.093 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] wsgi.ssl_ca_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost 
nova_compute[297686]: 2025-10-14 09:56:13.093 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] wsgi.ssl_cert_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.093 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] wsgi.ssl_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.093 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] wsgi.tcp_keepidle = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.093 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] wsgi.wsgi_log_format = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.093 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] zvm.ca_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.093 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] zvm.cloud_connector_url = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.094 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] zvm.image_tmp_path = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost 
nova_compute[297686]: 2025-10-14 09:56:13.094 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] zvm.reachable_timeout = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.094 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.094 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] oslo_policy.enforce_scope = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.094 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.094 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] oslo_policy.policy_dirs = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.094 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] oslo_policy.policy_file = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.095 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 
2025-10-14 09:56:13.095 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.095 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.095 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.095 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.095 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.095 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.096 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] remote_debug.host = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost 
nova_compute[297686]: 2025-10-14 09:56:13.096 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] remote_debug.port = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.096 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.096 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.096 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.096 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.096 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.097 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 
localhost nova_compute[297686]: 2025-10-14 09:56:13.097 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.097 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.097 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.097 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.097 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.097 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.098 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.098 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.098 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.098 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.098 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.098 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.098 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.099 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] 
oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.099 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.099 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.099 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.099 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.099 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.099 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] oslo_messaging_rabbit.ssl = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.100 2 DEBUG oslo_service.service [None 
req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] oslo_messaging_rabbit.ssl_ca_file = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.100 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] oslo_messaging_rabbit.ssl_cert_file = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.100 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.100 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] oslo_messaging_rabbit.ssl_key_file = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.100 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] oslo_messaging_rabbit.ssl_version = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.100 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.101 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.101 2 DEBUG oslo_service.service [None 
req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.101 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.101 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] oslo_limit.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.101 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] oslo_limit.auth_type = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.101 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] oslo_limit.auth_url = http://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.101 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] oslo_limit.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.102 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] oslo_limit.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.102 2 DEBUG oslo_service.service [None 
req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] oslo_limit.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.102 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] oslo_limit.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.102 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.102 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] oslo_limit.default_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.102 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.102 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] oslo_limit.domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.102 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] oslo_limit.domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.103 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] 
oslo_limit.endpoint_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.103 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] oslo_limit.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.103 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] oslo_limit.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.103 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] oslo_limit.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.103 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] oslo_limit.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.103 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] oslo_limit.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.103 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] oslo_limit.password = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.104 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] oslo_limit.project_domain_id = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.104 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.104 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] oslo_limit.project_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.104 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] oslo_limit.project_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.104 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] oslo_limit.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.104 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] oslo_limit.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.104 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] oslo_limit.service_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.105 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] oslo_limit.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 
localhost nova_compute[297686]: 2025-10-14 09:56:13.105 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.105 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.105 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] oslo_limit.system_scope = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.105 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] oslo_limit.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.105 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] oslo_limit.trust_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.105 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] oslo_limit.user_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.105 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] oslo_limit.user_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.106 2 DEBUG 
oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] oslo_limit.user_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.106 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] oslo_limit.username = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.106 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] oslo_limit.valid_interfaces = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.106 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] oslo_limit.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.106 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.106 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.106 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] oslo_reports.log_dir = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.107 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - 
-] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.107 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.107 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.107 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.107 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.107 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.107 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.108 2 DEBUG oslo_service.service 
[None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] vif_plug_ovs_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.108 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.108 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.108 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.108 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] vif_plug_ovs_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.108 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.108 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.109 2 DEBUG 
oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] os_vif_linux_bridge.iptables_bottom_regex = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.109 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.109 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] os_vif_linux_bridge.iptables_top_regex = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.109 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.109 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] os_vif_linux_bridge.use_ipv6 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.109 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.109 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] os_vif_ovs.isolate_vif = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.109 2 DEBUG oslo_service.service 
[None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] os_vif_ovs.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.110 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] os_vif_ovs.ovs_vsctl_timeout = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.110 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] os_vif_ovs.ovsdb_connection = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.110 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] os_vif_ovs.ovsdb_interface = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.110 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] os_vif_ovs.per_port_bridge = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.110 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] os_brick.lock_path = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.110 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.110 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - 
- -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.111 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] privsep_osbrick.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.111 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] privsep_osbrick.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.111 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.111 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] privsep_osbrick.logger_name = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.111 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.111 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] privsep_osbrick.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.111 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] nova_sys_admin.capabilities = [0, 1, 2, 3, 12, 
21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.112 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] nova_sys_admin.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.112 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] nova_sys_admin.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.112 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] nova_sys_admin.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.112 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.112 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] nova_sys_admin.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.112 2 DEBUG oslo_service.service [None req-cfec8768-a6b3-475a-a67c-759e73f9b3b3 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.113 2 INFO nova.service [-] Starting compute node (version 27.5.2-0.20250829104910.6f8decf.el9)#033[00m Oct 14 05:56:13 localhost 
nova_compute[297686]: 2025-10-14 09:56:13.126 2 INFO nova.virt.node [None req-f54e50da-17e5-40f4-aab2-053c4dced060 - - - - - -] Determined node identity 18c24273-aca2-4f08-be57-3188d558235e from /var/lib/nova/compute_id#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.127 2 DEBUG nova.virt.libvirt.host [None req-f54e50da-17e5-40f4-aab2-053c4dced060 - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.127 2 DEBUG nova.virt.libvirt.host [None req-f54e50da-17e5-40f4-aab2-053c4dced060 - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.128 2 DEBUG nova.virt.libvirt.host [None req-f54e50da-17e5-40f4-aab2-053c4dced060 - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.128 2 DEBUG nova.virt.libvirt.host [None req-f54e50da-17e5-40f4-aab2-053c4dced060 - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.138 2 DEBUG nova.virt.libvirt.host [None req-f54e50da-17e5-40f4-aab2-053c4dced060 - - - - - -] Registering for lifecycle events _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.140 2 DEBUG nova.virt.libvirt.host [None req-f54e50da-17e5-40f4-aab2-053c4dced060 - - - - - -] Registering for connection events: _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 
09:56:13.141 2 INFO nova.virt.libvirt.driver [None req-f54e50da-17e5-40f4-aab2-053c4dced060 - - - - - -] Connection event '1' reason 'None'#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.144 2 INFO nova.virt.libvirt.host [None req-f54e50da-17e5-40f4-aab2-053c4dced060 - - - - - -] Libvirt host capabilities [multi-line capabilities XML; tags stripped during log extraction — recoverable values: host UUID 1e17686e-e9d9-4f56-ae5b-e175ec048439; arch x86_64; CPU model EPYC-Rome-v4, vendor AMD; migration transports tcp, rdma; memory values 16116612, 4029153, 0, 0; secmodels selinux (labels system_u:system_r:svirt_t:s0, system_u:system_r:svirt_tcg_t:s0) and dac (+107:+107); hvm guest support for 32-bit and 64-bit via emulator /usr/libexec/qemu-kvm with machine types pc-i440fx-rhel7.6.0 (pc) and pc-q35-rhel7.6.0 through pc-q35-rhel9.6.0 (q35)]#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.151 2 DEBUG nova.virt.libvirt.host [None req-f54e50da-17e5-40f4-aab2-053c4dced060 - - - - - -] Getting domain capabilities for i686 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.155 2 DEBUG nova.virt.libvirt.volume.mount [None req-f54e50da-17e5-40f4-aab2-053c4dced060 - - - - - -] Initialising
_HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.158 2 DEBUG nova.virt.libvirt.host [None req-f54e50da-17e5-40f4-aab2-053c4dced060 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: /usr/libexec/qemu-kvm Oct 14 05:56:13 localhost nova_compute[297686]: kvm Oct 14 05:56:13 localhost nova_compute[297686]: pc-i440fx-rhel7.6.0 Oct 14 05:56:13 localhost nova_compute[297686]: i686 Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: /usr/share/OVMF/OVMF_CODE.secboot.fd Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: rom Oct 14 05:56:13 localhost nova_compute[297686]: pflash Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: yes Oct 14 05:56:13 localhost nova_compute[297686]: no Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: no Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: on Oct 14 05:56:13 localhost nova_compute[297686]: off Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost 
nova_compute[297686]: [CPU section of the capabilities XML, tags stripped. Host CPU: model EPYC-Rome, vendor AMD. Reported CPU models: 486, 486-v1, Broadwell, Broadwell-IBRS, Broadwell-noTSX, Broadwell-noTSX-IBRS, Broadwell-v1 through Broadwell-v4, Cascadelake-Server, Cascadelake-Server-noTSX, Cascadelake-Server-v1 through Cascadelake-Server-v5, Conroe, Conroe-v1, Cooperlake, Cooperlake-v1, Cooperlake-v2, Denverton, Denverton-v1 through Denverton-v3, Dhyana, Dhyana-v1, Dhyana-v2, EPYC, EPYC-Genoa, EPYC-Genoa-v1, EPYC-IBPB, EPYC-Milan, EPYC-Milan-v1, EPYC-Milan-v2, EPYC-Rome, EPYC-Rome-v1 through EPYC-Rome-v4, EPYC-v1 through EPYC-v4, GraniteRapids, GraniteRapids-v1, GraniteRapids-v2, Haswell, Haswell-IBRS, Haswell-noTSX, Haswell-noTSX-IBRS, Haswell-v1 through Haswell-v4, Icelake-Server, Icelake-Server-noTSX, Icelake-Server-v1 through Icelake-Server-v4; model list truncated in this capture.] Oct 14 05:56:13 localhost nova_compute[297686]: 
Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Icelake-Server-v5 Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 
05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Icelake-Server-v6 Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Icelake-Server-v7 Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: 
Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: IvyBridge Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: IvyBridge-IBRS Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: IvyBridge-v1 Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: IvyBridge-v2 Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: KnightsMill Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 
localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: KnightsMill-v1 Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Nehalem Oct 14 05:56:13 localhost nova_compute[297686]: Nehalem-IBRS Oct 14 05:56:13 localhost nova_compute[297686]: Nehalem-v1 Oct 14 05:56:13 localhost nova_compute[297686]: Nehalem-v2 Oct 14 05:56:13 localhost nova_compute[297686]: Opteron_G1 Oct 14 05:56:13 localhost nova_compute[297686]: Opteron_G1-v1 Oct 14 05:56:13 localhost nova_compute[297686]: Opteron_G2 Oct 14 05:56:13 localhost nova_compute[297686]: Opteron_G2-v1 Oct 14 05:56:13 localhost nova_compute[297686]: Opteron_G3 Oct 14 05:56:13 localhost nova_compute[297686]: Opteron_G3-v1 Oct 14 05:56:13 localhost nova_compute[297686]: Opteron_G4 Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Opteron_G4-v1 Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Opteron_G5 Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 
localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Opteron_G5-v1 Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Penryn Oct 14 05:56:13 localhost nova_compute[297686]: Penryn-v1 Oct 14 05:56:13 localhost nova_compute[297686]: SandyBridge Oct 14 05:56:13 localhost nova_compute[297686]: SandyBridge-IBRS Oct 14 05:56:13 localhost nova_compute[297686]: SandyBridge-v1 Oct 14 05:56:13 localhost nova_compute[297686]: SandyBridge-v2 Oct 14 05:56:13 localhost nova_compute[297686]: SapphireRapids Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost 
nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: SapphireRapids-v1 Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost 
nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: SapphireRapids-v2 Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost 
nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: SapphireRapids-v3 Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost 
nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: SierraForest Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost 
nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: SierraForest-v1 Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Skylake-Client Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost 
nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Skylake-Client-IBRS Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Skylake-Client-noTSX-IBRS Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Skylake-Client-v1 Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Skylake-Client-v2 Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Skylake-Client-v3 Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 
14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Skylake-Client-v4 Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Skylake-Server Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Skylake-Server-IBRS Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Skylake-Server-noTSX-IBRS Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 
14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Skylake-Server-v1 Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Skylake-Server-v2 Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Skylake-Server-v3 Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost 
nova_compute[297686]: [libvirt domainCapabilities XML, continued; element tags were lost in log capture — surviving text values grouped below, group labels inferred from standard domainCapabilities layout]
Oct 14 05:56:13 localhost nova_compute[297686]:     CPU models (cont.): Skylake-Server-v4 Skylake-Server-v5 Snowridge Snowridge-v1 Snowridge-v2 Snowridge-v3 Snowridge-v4 Westmere Westmere-IBRS Westmere-v1 Westmere-v2 athlon athlon-v1 core2duo core2duo-v1 coreduo coreduo-v1 kvm32 kvm32-v1 kvm64 kvm64-v1 n270 n270-v1 pentium pentium-v1 pentium2 pentium2-v1 pentium3 pentium3-v1 phenom phenom-v1 qemu32 qemu32-v1 qemu64 qemu64-v1
Oct 14 05:56:13 localhost nova_compute[297686]:     memory backing source types: file anonymous memfd
Oct 14 05:56:13 localhost nova_compute[297686]:     disk devices: disk cdrom floppy lun; buses: ide fdc scsi virtio usb sata; models: virtio virtio-transitional virtio-non-transitional
Oct 14 05:56:13 localhost nova_compute[297686]:     graphics types: vnc egl-headless dbus
Oct 14 05:56:13 localhost nova_compute[297686]:     hostdev modes: subsystem; startupPolicy: default mandatory requisite optional; subsys types: usb pci scsi; models: virtio virtio-transitional virtio-non-transitional
Oct 14 05:56:13 localhost nova_compute[297686]:     rng backend models: random egd builtin
Oct 14 05:56:13 localhost nova_compute[297686]:     filesystem driver types: path handle virtiofs
Oct 14 05:56:13 localhost nova_compute[297686]:     tpm models: tpm-tis tpm-crb; backend models: emulator external; backend version: 2.0
Oct 14 05:56:13 localhost nova_compute[297686]:     redirdev bus: usb; chardev types: pty unix; crypto backend models: qemu builtin; interface backends: default passt; panic models: isa hyperv
Oct 14 05:56:13 localhost nova_compute[297686]:     hyperv features: relaxed vapic spinlocks vpindex runtime synic stimer reset vendor_id frequencies reenlightenment tlbflush ipi avic emsr_bitmap xmm_input
Oct 14 05:56:13 localhost nova_compute[297686]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m
Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.165 2 DEBUG nova.virt.libvirt.host [None req-f54e50da-17e5-40f4-aab2-053c4dced060 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Oct 14 05:56:13 localhost nova_compute[297686]: [domainCapabilities XML for arch=i686, machine_type=q35; element tags again lost in log capture — surviving values grouped below, group labels inferred]
Oct 14 05:56:13 localhost nova_compute[297686]:     emulator: /usr/libexec/qemu-kvm; domain: kvm; machine: pc-q35-rhel9.6.0; arch: i686
Oct 14 05:56:13 localhost nova_compute[297686]:     loader: /usr/share/OVMF/OVMF_CODE.secboot.fd; types: rom pflash; further enum values: yes no / no / on off / on off
Oct 14 05:56:13 localhost nova_compute[297686]:     host-model CPU: EPYC-Rome; vendor: AMD
Oct 14 05:56:13 localhost nova_compute[297686]:     CPU models: 486 486-v1 Broadwell Broadwell-IBRS Broadwell-noTSX Broadwell-noTSX-IBRS Broadwell-v1 Broadwell-v2 Broadwell-v3 Broadwell-v4 Cascadelake-Server Cascadelake-Server-noTSX Cascadelake-Server-v1 Cascadelake-Server-v2 Cascadelake-Server-v3 Cascadelake-Server-v4 Cascadelake-Server-v5 Conroe Conroe-v1 Cooperlake Cooperlake-v1 Cooperlake-v2 Denverton Denverton-v1 Denverton-v2 Denverton-v3 Dhyana Dhyana-v1 Dhyana-v2 EPYC EPYC-Genoa EPYC-Genoa-v1 EPYC-IBPB EPYC-Milan EPYC-Milan-v1 EPYC-Milan-v2 EPYC-Rome EPYC-Rome-v1 EPYC-Rome-v2 EPYC-Rome-v3 EPYC-Rome-v4 EPYC-v1 EPYC-v2 EPYC-v3 EPYC-v4 GraniteRapids GraniteRapids-v1 [model list continues] Oct 14
05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: GraniteRapids-v2 Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 
localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Haswell Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Haswell-IBRS Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Haswell-noTSX Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Haswell-noTSX-IBRS Oct 14 05:56:13 localhost 
nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Haswell-v1 Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Haswell-v2 Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Haswell-v3 Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Haswell-v4 Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Icelake-Server Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost 
nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Icelake-Server-noTSX Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Icelake-Server-v1 Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 
localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Icelake-Server-v2 Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Icelake-Server-v3 Oct 14 
05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Icelake-Server-v4 Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 
localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Icelake-Server-v5 Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Icelake-Server-v6 Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 
05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Icelake-Server-v7 Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 
localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: IvyBridge Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: IvyBridge-IBRS Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: IvyBridge-v1 Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: IvyBridge-v2 Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: KnightsMill Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: KnightsMill-v1 Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 
localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Nehalem Oct 14 05:56:13 localhost nova_compute[297686]: Nehalem-IBRS Oct 14 05:56:13 localhost nova_compute[297686]: Nehalem-v1 Oct 14 05:56:13 localhost nova_compute[297686]: Nehalem-v2 Oct 14 05:56:13 localhost nova_compute[297686]: Opteron_G1 Oct 14 05:56:13 localhost nova_compute[297686]: Opteron_G1-v1 Oct 14 05:56:13 localhost nova_compute[297686]: Opteron_G2 Oct 14 05:56:13 localhost nova_compute[297686]: Opteron_G2-v1 Oct 14 05:56:13 localhost nova_compute[297686]: Opteron_G3 Oct 14 05:56:13 localhost nova_compute[297686]: Opteron_G3-v1 Oct 14 05:56:13 localhost nova_compute[297686]: Opteron_G4 Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Opteron_G4-v1 Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Opteron_G5 Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Opteron_G5-v1 Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Penryn Oct 14 05:56:13 localhost nova_compute[297686]: Penryn-v1 Oct 14 
05:56:13 localhost nova_compute[297686]: SandyBridge Oct 14 05:56:13 localhost nova_compute[297686]: SandyBridge-IBRS Oct 14 05:56:13 localhost nova_compute[297686]: SandyBridge-v1 Oct 14 05:56:13 localhost nova_compute[297686]: SandyBridge-v2 Oct 14 05:56:13 localhost nova_compute[297686]: SapphireRapids Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 
05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: SapphireRapids-v1 Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 
localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: SapphireRapids-v2 Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost 
Oct 14 05:56:13 localhost nova_compute[297686]: [libvirt domain-capabilities XML dump; element markup lost to log line-wrapping, recovered text values condensed below]
Oct 14 05:56:13 localhost nova_compute[297686]: CPU models (continued): SapphireRapids-v3, SierraForest, SierraForest-v1, Skylake-Client, Skylake-Client-IBRS, Skylake-Client-noTSX-IBRS, Skylake-Client-v1, Skylake-Client-v2, Skylake-Client-v3, Skylake-Client-v4, Skylake-Server, Skylake-Server-IBRS, Skylake-Server-noTSX-IBRS, Skylake-Server-v1, Skylake-Server-v2, Skylake-Server-v3, Skylake-Server-v4, Skylake-Server-v5, Snowridge, Snowridge-v1, Snowridge-v2, Snowridge-v3, Snowridge-v4, Westmere, Westmere-IBRS, Westmere-v1, Westmere-v2, athlon, athlon-v1, core2duo, core2duo-v1, coreduo, coreduo-v1, kvm32, kvm32-v1, kvm64, kvm64-v1, n270, n270-v1, pentium, pentium-v1, pentium2, pentium2-v1, pentium3, pentium3-v1, phenom, phenom-v1, qemu32, qemu32-v1, qemu64, qemu64-v1
Oct 14 05:56:13 localhost nova_compute[297686]: memory backing source types: file, anonymous, memfd
Oct 14 05:56:13 localhost nova_compute[297686]: disk device types: disk, cdrom, floppy, lun; buses: fdc, scsi, virtio, usb, sata; models: virtio, virtio-transitional, virtio-non-transitional
Oct 14 05:56:13 localhost nova_compute[297686]: graphics types: vnc, egl-headless, dbus
Oct 14 05:56:13 localhost nova_compute[297686]: hostdev mode: subsystem; startup policies: default, mandatory, requisite, optional; subsystem types: usb, pci, scsi
Oct 14 05:56:13 localhost nova_compute[297686]: rng models: virtio, virtio-transitional, virtio-non-transitional; backends: random, egd, builtin
Oct 14 05:56:13 localhost nova_compute[297686]: filesystem drivers: path, handle, virtiofs
Oct 14 05:56:13 localhost nova_compute[297686]: TPM models: tpm-tis, tpm-crb; backends: emulator, external; backend version: 2.0
Oct 14 05:56:13 localhost nova_compute[297686]: redirdev bus: usb; console/channel types: pty, unix; further backend values: qemu, builtin
Oct 14 05:56:13 localhost nova_compute[297686]: interface backends: default, passt; panic models: isa, hyperv
Oct 14 05:56:13 localhost nova_compute[297686]: Hyper-V enlightenments: relaxed, vapic, spinlocks, vpindex, runtime, synic, stimer, reset, vendor_id, frequencies, reenlightenment, tlbflush, ipi, avic, emsr_bitmap, xmm_input
Oct 14 05:56:13 localhost nova_compute[297686]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.217 2 DEBUG nova.virt.libvirt.host [None req-f54e50da-17e5-40f4-aab2-053c4dced060 - - - - - -] Getting domain capabilities for x86_64 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.222 2 DEBUG nova.virt.libvirt.host [None req-f54e50da-17e5-40f4-aab2-053c4dced060 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Oct 14 05:56:13 localhost nova_compute[297686]: [domain-capabilities XML dump for q35; markup likewise lost, recovered values condensed below]
Oct 14 05:56:13 localhost nova_compute[297686]: emulator: /usr/libexec/qemu-kvm; domain type: kvm; machine: pc-q35-rhel9.6.0; arch: x86_64
Oct 14 05:56:13 localhost nova_compute[297686]: os firmware: efi; loader values: /usr/share/edk2/ovmf/OVMF_CODE.secboot.fd, /usr/share/edk2/ovmf/OVMF_CODE.fd, /usr/share/edk2/ovmf/OVMF.amdsev.fd, /usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd; loader types: rom, pflash; yes/no and on/off feature toggles follow in the original
Oct 14 05:56:13 localhost nova_compute[297686]: host CPU model: EPYC-Rome; vendor: AMD
Oct 14 05:56:13 localhost nova_compute[297686]: CPU models: 486, 486-v1, Broadwell, Broadwell-IBRS, Broadwell-noTSX, Broadwell-noTSX-IBRS, Broadwell-v1, Broadwell-v2, Broadwell-v3, Broadwell-v4, Cascadelake-Server, Cascadelake-Server-noTSX, Cascadelake-Server-v1, Cascadelake-Server-v2, Cascadelake-Server-v3, Cascadelake-Server-v4, Cascadelake-Server-v5, Conroe, Conroe-v1, Cooperlake, Cooperlake-v1, Cooperlake-v2 (list continues)
nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Denverton Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Denverton-v1 Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Denverton-v2 Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Denverton-v3 Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Dhyana Oct 14 05:56:13 localhost nova_compute[297686]: Dhyana-v1 Oct 14 05:56:13 localhost nova_compute[297686]: Dhyana-v2 Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: EPYC Oct 14 05:56:13 localhost nova_compute[297686]: EPYC-Genoa Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 
05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: EPYC-Genoa-v1 Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 
localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: EPYC-IBPB Oct 14 05:56:13 localhost nova_compute[297686]: EPYC-Milan Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: EPYC-Milan-v1 Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: EPYC-Milan-v2 Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost 
nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: EPYC-Rome Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: EPYC-Rome-v1 Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: EPYC-Rome-v2 Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: EPYC-Rome-v3 Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: EPYC-Rome-v4 Oct 14 05:56:13 localhost nova_compute[297686]: EPYC-v1 Oct 14 05:56:13 localhost nova_compute[297686]: EPYC-v2 Oct 14 05:56:13 localhost nova_compute[297686]: EPYC-v3 Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: EPYC-v4 Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: GraniteRapids Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost 
nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: GraniteRapids-v1 Oct 14 05:56:13 localhost 
nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 
05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: GraniteRapids-v2 Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 
localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Haswell Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Haswell-IBRS Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Haswell-noTSX Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 
05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Haswell-noTSX-IBRS Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Haswell-v1 Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Haswell-v2 Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Haswell-v3 Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Haswell-v4 Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Icelake-Server Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 
05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Icelake-Server-noTSX Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Icelake-Server-v1 Oct 14 05:56:13 localhost nova_compute[297686]: 
Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Icelake-Server-v2 Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 
05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Icelake-Server-v3 Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Icelake-Server-v4 Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: 
Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Icelake-Server-v5 Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Icelake-Server-v6 Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost 
Oct 14 05:56:13 localhost nova_compute[297686]: [libvirt domain capabilities reply, reconstructed: repeated per-line journal prefixes removed; the XML markup was stripped by the logger, so only element values survive, grouped below in their original order]
Oct 14 05:56:13 localhost nova_compute[297686]: CPU models: Icelake-Server-v7 IvyBridge IvyBridge-IBRS IvyBridge-v1 IvyBridge-v2 KnightsMill KnightsMill-v1 Nehalem Nehalem-IBRS Nehalem-v1 Nehalem-v2 Opteron_G1 Opteron_G1-v1 Opteron_G2 Opteron_G2-v1 Opteron_G3 Opteron_G3-v1 Opteron_G4 Opteron_G4-v1 Opteron_G5 Opteron_G5-v1 Penryn Penryn-v1 SandyBridge SandyBridge-IBRS SandyBridge-v1 SandyBridge-v2 SapphireRapids SapphireRapids-v1 SapphireRapids-v2 SapphireRapids-v3 SierraForest SierraForest-v1 Skylake-Client Skylake-Client-IBRS Skylake-Client-noTSX-IBRS Skylake-Client-v1 Skylake-Client-v2 Skylake-Client-v3 Skylake-Client-v4 Skylake-Server Skylake-Server-IBRS Skylake-Server-noTSX-IBRS Skylake-Server-v1 Skylake-Server-v2 Skylake-Server-v3 Skylake-Server-v4 Skylake-Server-v5 Snowridge Snowridge-v1 Snowridge-v2 Snowridge-v3 Snowridge-v4 Westmere Westmere-IBRS Westmere-v1 Westmere-v2 athlon athlon-v1 core2duo core2duo-v1 coreduo coreduo-v1 kvm32 kvm32-v1 kvm64 kvm64-v1 n270 n270-v1 pentium pentium-v1 pentium2 pentium2-v1 pentium3 pentium3-v1 phenom phenom-v1 qemu32 qemu32-v1 qemu64 qemu64-v1
Oct 14 05:56:13 localhost nova_compute[297686]: memory backing source types: file anonymous memfd
Oct 14 05:56:13 localhost nova_compute[297686]: disk: device disk cdrom floppy lun; bus fdc scsi virtio usb sata; model virtio virtio-transitional virtio-non-transitional
Oct 14 05:56:13 localhost nova_compute[297686]: graphics types: vnc egl-headless dbus
Oct 14 05:56:13 localhost nova_compute[297686]: hostdev: mode subsystem; startupPolicy default mandatory requisite optional; subsystem types usb pci scsi
Oct 14 05:56:13 localhost nova_compute[297686]: further enum values (parent elements not recoverable from the stripped dump): virtio virtio-transitional virtio-non-transitional; random egd builtin; path handle virtiofs; tpm-tis tpm-crb; emulator external; 2.0; usb; pty unix; qemu builtin; default passt; isa hyperv
Oct 14 05:56:13 localhost nova_compute[297686]: hyperv features: relaxed vapic spinlocks vpindex runtime synic stimer reset vendor_id frequencies reenlightenment tlbflush ipi avic emsr_bitmap xmm_input
Oct 14 05:56:13 localhost nova_compute[297686]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m
Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.296 2 DEBUG nova.virt.libvirt.host [None req-f54e50da-17e5-40f4-aab2-053c4dced060 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Oct 14 05:56:13 localhost nova_compute[297686]: [second capabilities dump, reconstructed the same way: emulator /usr/libexec/qemu-kvm; domain kvm; machine pc-i440fx-rhel7.6.0; arch x86_64; loader /usr/share/OVMF/OVMF_CODE.secboot.fd, types rom pflash, readonly yes no, secure no; on/off mode toggles; host CPU model EPYC-Rome, vendor AMD; dump truncated here]
05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: 486 Oct 14 05:56:13 localhost nova_compute[297686]: 486-v1 Oct 14 05:56:13 localhost nova_compute[297686]: Broadwell Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Broadwell-IBRS Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Broadwell-noTSX Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost 
nova_compute[297686]: Broadwell-noTSX-IBRS Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Broadwell-v1 Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Broadwell-v2 Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Broadwell-v3 Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Broadwell-v4 Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Cascadelake-Server Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 
05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Cascadelake-Server-noTSX Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Cascadelake-Server-v1 Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Cascadelake-Server-v2 Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 
localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Cascadelake-Server-v3 Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Cascadelake-Server-v4 Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 
14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Cascadelake-Server-v5 Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Conroe Oct 14 05:56:13 localhost nova_compute[297686]: Conroe-v1 Oct 14 05:56:13 localhost nova_compute[297686]: Cooperlake Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Cooperlake-v1 Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 
localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Cooperlake-v2 Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Denverton Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Denverton-v1 Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 
05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Denverton-v2 Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Denverton-v3 Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Dhyana Oct 14 05:56:13 localhost nova_compute[297686]: Dhyana-v1 Oct 14 05:56:13 localhost nova_compute[297686]: Dhyana-v2 Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: EPYC Oct 14 05:56:13 localhost nova_compute[297686]: EPYC-Genoa Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 
localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: EPYC-Genoa-v1 Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: EPYC-IBPB Oct 14 05:56:13 localhost nova_compute[297686]: EPYC-Milan Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 
05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: EPYC-Milan-v1 Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: EPYC-Milan-v2 Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: EPYC-Rome Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: EPYC-Rome-v1 Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: EPYC-Rome-v2 Oct 14 05:56:13 
localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: EPYC-Rome-v3 Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: EPYC-Rome-v4 Oct 14 05:56:13 localhost nova_compute[297686]: EPYC-v1 Oct 14 05:56:13 localhost nova_compute[297686]: EPYC-v2 Oct 14 05:56:13 localhost nova_compute[297686]: EPYC-v3 Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: EPYC-v4 Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: GraniteRapids Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 
localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: GraniteRapids-v1 Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost 
nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: GraniteRapids-v2 Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost 
nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 
05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Haswell Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Haswell-IBRS Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Haswell-noTSX Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Haswell-noTSX-IBRS Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Haswell-v1 Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 
Oct 14 05:56:13 localhost nova_compute[297686]: Haswell-v2
Oct 14 05:56:13 localhost nova_compute[297686]: Haswell-v3
Oct 14 05:56:13 localhost nova_compute[297686]: Haswell-v4
Oct 14 05:56:13 localhost nova_compute[297686]: Icelake-Server
Oct 14 05:56:13 localhost nova_compute[297686]: Icelake-Server-noTSX
Oct 14 05:56:13 localhost nova_compute[297686]: Icelake-Server-v1
Oct 14 05:56:13 localhost nova_compute[297686]: Icelake-Server-v2
Oct 14 05:56:13 localhost nova_compute[297686]: Icelake-Server-v3
Oct 14 05:56:13 localhost nova_compute[297686]: Icelake-Server-v4
Oct 14 05:56:13 localhost nova_compute[297686]: Icelake-Server-v5
Oct 14 05:56:13 localhost nova_compute[297686]: Icelake-Server-v6
Oct 14 05:56:13 localhost nova_compute[297686]: Icelake-Server-v7
Oct 14 05:56:13 localhost nova_compute[297686]: IvyBridge
Oct 14 05:56:13 localhost nova_compute[297686]: IvyBridge-IBRS
Oct 14 05:56:13 localhost nova_compute[297686]: IvyBridge-v1
Oct 14 05:56:13 localhost nova_compute[297686]: IvyBridge-v2
Oct 14 05:56:13 localhost nova_compute[297686]: KnightsMill
Oct 14 05:56:13 localhost nova_compute[297686]: KnightsMill-v1
Oct 14 05:56:13 localhost nova_compute[297686]: Nehalem
Oct 14 05:56:13 localhost nova_compute[297686]: Nehalem-IBRS
Oct 14 05:56:13 localhost nova_compute[297686]: Nehalem-v1
Oct 14 05:56:13 localhost nova_compute[297686]: Nehalem-v2
Oct 14 05:56:13 localhost nova_compute[297686]: Opteron_G1
Oct 14 05:56:13 localhost nova_compute[297686]: Opteron_G1-v1
Oct 14 05:56:13 localhost nova_compute[297686]: Opteron_G2
Oct 14 05:56:13 localhost nova_compute[297686]: Opteron_G2-v1
Oct 14 05:56:13 localhost nova_compute[297686]: Opteron_G3
Oct 14 05:56:13 localhost nova_compute[297686]: Opteron_G3-v1
Oct 14 05:56:13 localhost nova_compute[297686]: Opteron_G4
Oct 14 05:56:13 localhost nova_compute[297686]: Opteron_G4-v1
Oct 14 05:56:13 localhost nova_compute[297686]: Opteron_G5
Oct 14 05:56:13 localhost nova_compute[297686]: Opteron_G5-v1
Oct 14 05:56:13 localhost nova_compute[297686]: Penryn
Oct 14 05:56:13 localhost nova_compute[297686]: Penryn-v1
Oct 14 05:56:13 localhost nova_compute[297686]: SandyBridge
Oct 14 05:56:13 localhost nova_compute[297686]: SandyBridge-IBRS
Oct 14 05:56:13 localhost nova_compute[297686]: SandyBridge-v1
Oct 14 05:56:13 localhost nova_compute[297686]: SandyBridge-v2
Oct 14 05:56:13 localhost nova_compute[297686]: SapphireRapids
Oct 14 05:56:13 localhost nova_compute[297686]: SapphireRapids-v1
Oct 14 05:56:13 localhost nova_compute[297686]: SapphireRapids-v2
Oct 14 05:56:13 localhost nova_compute[297686]: SapphireRapids-v3
Oct 14 05:56:13 localhost nova_compute[297686]: SierraForest
Oct 14 05:56:13 localhost nova_compute[297686]: SierraForest-v1
Oct 14 05:56:13 localhost nova_compute[297686]: Skylake-Client
Oct 14 05:56:13 localhost nova_compute[297686]: Skylake-Client-IBRS
Oct 14 05:56:13 localhost nova_compute[297686]: Skylake-Client-noTSX-IBRS
Oct 14 05:56:13 localhost nova_compute[297686]: Skylake-Client-v1
Oct 14 05:56:13 localhost nova_compute[297686]: Skylake-Client-v2
Oct 14 05:56:13 localhost nova_compute[297686]: Skylake-Client-v3
Oct 14 05:56:13 localhost nova_compute[297686]: Skylake-Client-v4
Oct 14 05:56:13 localhost nova_compute[297686]: Skylake-Server
Oct 14 05:56:13 localhost nova_compute[297686]: Skylake-Server-IBRS
Oct 14 05:56:13 localhost nova_compute[297686]: Skylake-Server-noTSX-IBRS
Oct 14 05:56:13 localhost nova_compute[297686]: Skylake-Server-v1
Oct 14 05:56:13 localhost nova_compute[297686]: Skylake-Server-v2
Oct 14 05:56:13 localhost nova_compute[297686]: Skylake-Server-v3
Oct 14 05:56:13 localhost nova_compute[297686]: Skylake-Server-v4
Oct 14 05:56:13 localhost nova_compute[297686]: Skylake-Server-v5
Oct 14 05:56:13 localhost nova_compute[297686]: Snowridge
Oct 14 05:56:13 localhost nova_compute[297686]: Snowridge-v1
Oct 14 05:56:13 localhost nova_compute[297686]: Snowridge-v2
Oct 14 05:56:13 localhost nova_compute[297686]: Snowridge-v3
Oct 14 05:56:13 localhost nova_compute[297686]: Snowridge-v4
Oct 14 05:56:13 localhost nova_compute[297686]: Westmere
Oct 14 05:56:13 localhost nova_compute[297686]: Westmere-IBRS
Oct 14 05:56:13 localhost nova_compute[297686]: Westmere-v1
Oct 14 05:56:13 localhost nova_compute[297686]: Westmere-v2
Oct 14 05:56:13 localhost nova_compute[297686]: athlon
Oct 14 05:56:13 localhost nova_compute[297686]: athlon-v1
Oct 14 05:56:13 localhost nova_compute[297686]: core2duo
Oct 14 05:56:13 localhost nova_compute[297686]: core2duo-v1
Oct 14 05:56:13 localhost nova_compute[297686]: coreduo
Oct 14 05:56:13 localhost nova_compute[297686]: coreduo-v1
Oct 14 05:56:13 localhost nova_compute[297686]: kvm32
Oct 14 05:56:13 localhost nova_compute[297686]: kvm32-v1
Oct 14 05:56:13 localhost nova_compute[297686]: kvm64
Oct 14 05:56:13 localhost nova_compute[297686]: kvm64-v1
Oct 14 05:56:13 localhost nova_compute[297686]: n270
Oct 14 05:56:13 localhost nova_compute[297686]: n270-v1
Oct 14 05:56:13 localhost nova_compute[297686]: pentium
Oct 14 05:56:13 localhost nova_compute[297686]: pentium-v1
Oct 14 05:56:13 localhost nova_compute[297686]: pentium2
Oct 14 05:56:13 localhost nova_compute[297686]: pentium2-v1
Oct 14 05:56:13 localhost nova_compute[297686]: pentium3
Oct 14 05:56:13 localhost nova_compute[297686]: pentium3-v1
Oct 14 05:56:13 localhost nova_compute[297686]: phenom
Oct 14 05:56:13 localhost nova_compute[297686]: phenom-v1
Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: qemu32 Oct 14 05:56:13 localhost nova_compute[297686]: qemu32-v1 Oct 14 05:56:13 localhost nova_compute[297686]: qemu64 Oct 14 05:56:13 localhost nova_compute[297686]: qemu64-v1 Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: file Oct 14 05:56:13 localhost nova_compute[297686]: anonymous Oct 14 05:56:13 localhost nova_compute[297686]: memfd Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: disk Oct 14 05:56:13 localhost nova_compute[297686]: cdrom Oct 14 05:56:13 localhost nova_compute[297686]: floppy Oct 14 05:56:13 localhost nova_compute[297686]: lun Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: ide Oct 14 05:56:13 localhost nova_compute[297686]: fdc Oct 14 05:56:13 localhost nova_compute[297686]: scsi Oct 14 05:56:13 localhost nova_compute[297686]: virtio Oct 14 05:56:13 localhost nova_compute[297686]: usb Oct 14 05:56:13 localhost nova_compute[297686]: sata Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: virtio Oct 14 05:56:13 localhost nova_compute[297686]: virtio-transitional Oct 14 05:56:13 localhost nova_compute[297686]: virtio-non-transitional Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 
localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: vnc Oct 14 05:56:13 localhost nova_compute[297686]: egl-headless Oct 14 05:56:13 localhost nova_compute[297686]: dbus Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: subsystem Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: default Oct 14 05:56:13 localhost nova_compute[297686]: mandatory Oct 14 05:56:13 localhost nova_compute[297686]: requisite Oct 14 05:56:13 localhost nova_compute[297686]: optional Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: usb Oct 14 05:56:13 localhost nova_compute[297686]: pci Oct 14 05:56:13 localhost nova_compute[297686]: scsi Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: virtio Oct 14 05:56:13 localhost nova_compute[297686]: virtio-transitional Oct 14 05:56:13 localhost nova_compute[297686]: virtio-non-transitional Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: random Oct 14 05:56:13 localhost nova_compute[297686]: egd Oct 14 05:56:13 localhost nova_compute[297686]: builtin Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost 
nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: path Oct 14 05:56:13 localhost nova_compute[297686]: handle Oct 14 05:56:13 localhost nova_compute[297686]: virtiofs Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: tpm-tis Oct 14 05:56:13 localhost nova_compute[297686]: tpm-crb Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: emulator Oct 14 05:56:13 localhost nova_compute[297686]: external Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: 2.0 Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: usb Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: pty Oct 14 05:56:13 localhost nova_compute[297686]: unix Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: qemu Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: builtin Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 
localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: default Oct 14 05:56:13 localhost nova_compute[297686]: passt Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: isa Oct 14 05:56:13 localhost nova_compute[297686]: hyperv Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: relaxed Oct 14 05:56:13 localhost nova_compute[297686]: vapic Oct 14 05:56:13 localhost nova_compute[297686]: spinlocks Oct 14 05:56:13 localhost nova_compute[297686]: vpindex Oct 14 05:56:13 localhost nova_compute[297686]: runtime Oct 14 05:56:13 localhost nova_compute[297686]: synic Oct 14 05:56:13 localhost nova_compute[297686]: stimer Oct 14 05:56:13 localhost nova_compute[297686]: reset Oct 14 05:56:13 localhost nova_compute[297686]: vendor_id Oct 14 05:56:13 localhost nova_compute[297686]: frequencies Oct 14 05:56:13 localhost nova_compute[297686]: reenlightenment Oct 14 05:56:13 localhost nova_compute[297686]: tlbflush Oct 14 05:56:13 localhost nova_compute[297686]: ipi Oct 14 05:56:13 localhost nova_compute[297686]: avic Oct 14 05:56:13 localhost nova_compute[297686]: 
emsr_bitmap Oct 14 05:56:13 localhost nova_compute[297686]: xmm_input Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: Oct 14 05:56:13 localhost nova_compute[297686]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.345 2 DEBUG nova.virt.libvirt.host [None req-f54e50da-17e5-40f4-aab2-053c4dced060 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.345 2 INFO nova.virt.libvirt.host [None req-f54e50da-17e5-40f4-aab2-053c4dced060 - - - - - -] Secure Boot support detected#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.347 2 INFO nova.virt.libvirt.driver [None req-f54e50da-17e5-40f4-aab2-053c4dced060 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.347 2 INFO nova.virt.libvirt.driver [None req-f54e50da-17e5-40f4-aab2-053c4dced060 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.355 2 DEBUG nova.virt.libvirt.driver [None req-f54e50da-17e5-40f4-aab2-053c4dced060 - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.402 2 INFO nova.virt.node [None req-f54e50da-17e5-40f4-aab2-053c4dced060 - - 
- - - -] Determined node identity 18c24273-aca2-4f08-be57-3188d558235e from /var/lib/nova/compute_id#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.433 2 DEBUG nova.compute.manager [None req-f54e50da-17e5-40f4-aab2-053c4dced060 - - - - - -] Verified node 18c24273-aca2-4f08-be57-3188d558235e matches my host np0005486733.localdomain _check_for_host_rename /usr/lib/python3.9/site-packages/nova/compute/manager.py:1568#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.481 2 DEBUG nova.compute.manager [None req-f54e50da-17e5-40f4-aab2-053c4dced060 - - - - - -] [instance: 88c4e366-b765-47a6-96bf-f7677f2ce67c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.486 2 DEBUG nova.virt.libvirt.vif [None req-f54e50da-17e5-40f4-aab2-053c4dced060 - - - - - -] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-14T08:37:13Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='test',display_name='test',ec2_ids=,ephemeral_gb=1,ephemeral_key_uuid=None,fault=,flavor=,hidden=False,host='np0005486733.localdomain',hostname='test',id=2,image_ref='0c25fd0b-0cde-472d-a2dc-9e548eac7c4d',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=,launch_index=0,launched_at=2025-10-14T08:37:23Z,launched_on='np0005486733.localdomain',locked=False,locked_by=None,memory_mb=512,metadata={},migration_context=,new_flavor=,node='np0005486733.localdomain',numa_topology=None,old_flavor=,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='41187b090f3d4818a32baa37ce8a3991',ramdisk_id='',reservation_id='r-aao7l1tg',resources=,root_device_
name='/dev/vda',root_gb=1,security_groups=,services=,shutdown_terminate=False,system_metadata=,tags=,task_state=None,terminated_at=None,trusted_certs=,updated_at=2025-10-14T08:37:23Z,user_data=None,user_id='9d85e6ce130c46ec855f37147dbb08b4',uuid=88c4e366-b765-47a6-96bf-f7677f2ce67c,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "3ec9b060-f43d-4698-9c76-6062c70911d5", "address": "fa:16:3e:84:5e:e5", "network": {"id": "7d0cd696-bdd7-4e70-9512-eb0d23640314", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.46", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "41187b090f3d4818a32baa37ce8a3991", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3ec9b060-f4", "ovs_interfaceid": "3ec9b060-f43d-4698-9c76-6062c70911d5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.487 2 DEBUG nova.network.os_vif_util [None req-f54e50da-17e5-40f4-aab2-053c4dced060 - - - - - -] Converting VIF {"id": "3ec9b060-f43d-4698-9c76-6062c70911d5", "address": "fa:16:3e:84:5e:e5", "network": {"id": "7d0cd696-bdd7-4e70-9512-eb0d23640314", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, 
"ips": [{"address": "192.168.0.46", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "41187b090f3d4818a32baa37ce8a3991", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3ec9b060-f4", "ovs_interfaceid": "3ec9b060-f43d-4698-9c76-6062c70911d5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.488 2 DEBUG nova.network.os_vif_util [None req-f54e50da-17e5-40f4-aab2-053c4dced060 - - - - - -] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:84:5e:e5,bridge_name='br-int',has_traffic_filtering=True,id=3ec9b060-f43d-4698-9c76-6062c70911d5,network=Network(7d0cd696-bdd7-4e70-9512-eb0d23640314),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3ec9b060-f4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.489 2 DEBUG os_vif [None req-f54e50da-17e5-40f4-aab2-053c4dced060 - - - - - -] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:84:5e:e5,bridge_name='br-int',has_traffic_filtering=True,id=3ec9b060-f43d-4698-9c76-6062c70911d5,network=Network(7d0cd696-bdd7-4e70-9512-eb0d23640314),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3ec9b060-f4') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m Oct 14 05:56:13 localhost 
nova_compute[297686]: 2025-10-14 09:56:13.573 2 DEBUG ovsdbapp.backend.ovs_idl [None req-f54e50da-17e5-40f4-aab2-053c4dced060 - - - - - -] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.573 2 DEBUG ovsdbapp.backend.ovs_idl [None req-f54e50da-17e5-40f4-aab2-053c4dced060 - - - - - -] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.573 2 DEBUG ovsdbapp.backend.ovs_idl [None req-f54e50da-17e5-40f4-aab2-053c4dced060 - - - - - -] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.573 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-f54e50da-17e5-40f4-aab2-053c4dced060 - - - - - -] tcp:127.0.0.1:6640: entering CONNECTING _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.573 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-f54e50da-17e5-40f4-aab2-053c4dced060 - - - - - -] [POLLOUT] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.574 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-f54e50da-17e5-40f4-aab2-053c4dced060 - - - - - -] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.574 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-f54e50da-17e5-40f4-aab2-053c4dced060 - - - - - -] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:56:13 localhost 
nova_compute[297686]: 2025-10-14 09:56:13.575 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-f54e50da-17e5-40f4-aab2-053c4dced060 - - - - - -] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.577 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-f54e50da-17e5-40f4-aab2-053c4dced060 - - - - - -] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.589 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.589 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.589 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m Oct 14 05:56:13 localhost nova_compute[297686]: 2025-10-14 09:56:13.590 2 INFO oslo.privsep.daemon [None req-f54e50da-17e5-40f4-aab2-053c4dced060 - - - - - -] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'vif_plug_ovs.privsep.vif_plug', '--privsep_sock_path', '/tmp/tmp42jsah3t/privsep.sock']#033[00m Oct 14 05:56:14 localhost nova_compute[297686]: 2025-10-14 09:56:14.223 2 INFO oslo.privsep.daemon [None req-f54e50da-17e5-40f4-aab2-053c4dced060 - - - - - -] Spawned new privsep daemon via rootwrap#033[00m Oct 14 
05:56:14 localhost nova_compute[297686]: 2025-10-14 09:56:14.118 40 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m Oct 14 05:56:14 localhost nova_compute[297686]: 2025-10-14 09:56:14.124 40 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m Oct 14 05:56:14 localhost nova_compute[297686]: 2025-10-14 09:56:14.128 40 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_NET_ADMIN/CAP_DAC_OVERRIDE|CAP_NET_ADMIN/none#033[00m Oct 14 05:56:14 localhost nova_compute[297686]: 2025-10-14 09:56:14.128 40 INFO oslo.privsep.daemon [-] privsep daemon running as pid 40#033[00m Oct 14 05:56:14 localhost python3.9[297904]: ansible-containers.podman.podman_container Invoked with name=nova_compute_init state=started executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None 
healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None Oct 14 05:56:14 localhost nova_compute[297686]: 2025-10-14 09:56:14.487 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:56:14 localhost nova_compute[297686]: 2025-10-14 09:56:14.488 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3ec9b060-f4, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Oct 14 05:56:14 
localhost nova_compute[297686]: 2025-10-14 09:56:14.489 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap3ec9b060-f4, col_values=(('external_ids', {'iface-id': '3ec9b060-f43d-4698-9c76-6062c70911d5', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:84:5e:e5', 'vm-uuid': '88c4e366-b765-47a6-96bf-f7677f2ce67c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Oct 14 05:56:14 localhost nova_compute[297686]: 2025-10-14 09:56:14.490 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m Oct 14 05:56:14 localhost nova_compute[297686]: 2025-10-14 09:56:14.491 2 INFO os_vif [None req-f54e50da-17e5-40f4-aab2-053c4dced060 - - - - - -] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:84:5e:e5,bridge_name='br-int',has_traffic_filtering=True,id=3ec9b060-f43d-4698-9c76-6062c70911d5,network=Network(7d0cd696-bdd7-4e70-9512-eb0d23640314),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3ec9b060-f4')#033[00m Oct 14 05:56:14 localhost nova_compute[297686]: 2025-10-14 09:56:14.491 2 DEBUG nova.compute.manager [None req-f54e50da-17e5-40f4-aab2-053c4dced060 - - - - - -] [instance: 88c4e366-b765-47a6-96bf-f7677f2ce67c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m Oct 14 05:56:14 localhost nova_compute[297686]: 2025-10-14 09:56:14.495 2 DEBUG nova.compute.manager [None req-f54e50da-17e5-40f4-aab2-053c4dced060 - - - - - -] [instance: 88c4e366-b765-47a6-96bf-f7677f2ce67c] Current state is 1, state in DB is 1. 
_init_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:1304#033[00m Oct 14 05:56:14 localhost nova_compute[297686]: 2025-10-14 09:56:14.495 2 INFO nova.compute.manager [None req-f54e50da-17e5-40f4-aab2-053c4dced060 - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host#033[00m Oct 14 05:56:14 localhost systemd[1]: Started libpod-conmon-6c4d7cae210e0acc65e1c4fc410517ec41d3f7534328f58bee37a38122214c4b.scope. Oct 14 05:56:14 localhost systemd[1]: Started libcrun container. Oct 14 05:56:14 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4ac6a4b7d3fd9b3800ade157df8bbc87609a0105a3f06e00de1c30bcdbf261df/merged/usr/sbin/nova_statedir_ownership.py supports timestamps until 2038 (0x7fffffff) Oct 14 05:56:14 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4ac6a4b7d3fd9b3800ade157df8bbc87609a0105a3f06e00de1c30bcdbf261df/merged/var/lib/_nova_secontext supports timestamps until 2038 (0x7fffffff) Oct 14 05:56:14 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4ac6a4b7d3fd9b3800ade157df8bbc87609a0105a3f06e00de1c30bcdbf261df/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff) Oct 14 05:56:14 localhost podman[297931]: 2025-10-14 09:56:14.747118847 +0000 UTC m=+0.125508606 container init 6c4d7cae210e0acc65e1c4fc410517ec41d3f7534328f58bee37a38122214c4b (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, container_name=nova_compute_init, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.build-date=20251009, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval 
python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible) Oct 14 05:56:14 localhost podman[297931]: 2025-10-14 09:56:14.756588148 +0000 UTC m=+0.134977917 container start 6c4d7cae210e0acc65e1c4fc410517ec41d3f7534328f58bee37a38122214c4b (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, container_name=nova_compute_init, maintainer=OpenStack Kubernetes Operator team, config_id=edpm, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}) Oct 14 05:56:14 localhost 
python3.9[297904]: ansible-containers.podman.podman_container PODMAN-CONTAINER-DEBUG: podman start nova_compute_init Oct 14 05:56:14 localhost nova_compute_init[297951]: INFO:nova_statedir:Applying nova statedir ownership Oct 14 05:56:14 localhost nova_compute_init[297951]: INFO:nova_statedir:Target ownership for /var/lib/nova: 42436:42436 Oct 14 05:56:14 localhost nova_compute_init[297951]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/ Oct 14 05:56:14 localhost nova_compute_init[297951]: INFO:nova_statedir:Changing ownership of /var/lib/nova from 1000:1000 to 42436:42436 Oct 14 05:56:14 localhost nova_compute_init[297951]: INFO:nova_statedir:Setting selinux context of /var/lib/nova to system_u:object_r:container_file_t:s0 Oct 14 05:56:14 localhost nova_compute_init[297951]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/instances/ Oct 14 05:56:14 localhost nova_compute_init[297951]: INFO:nova_statedir:Changing ownership of /var/lib/nova/instances from 1000:1000 to 42436:42436 Oct 14 05:56:14 localhost nova_compute_init[297951]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances to system_u:object_r:container_file_t:s0 Oct 14 05:56:14 localhost nova_compute_init[297951]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/88c4e366-b765-47a6-96bf-f7677f2ce67c/ Oct 14 05:56:14 localhost nova_compute_init[297951]: INFO:nova_statedir:Ownership of /var/lib/nova/instances/88c4e366-b765-47a6-96bf-f7677f2ce67c already 42436:42436 Oct 14 05:56:14 localhost nova_compute_init[297951]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances/88c4e366-b765-47a6-96bf-f7677f2ce67c to system_u:object_r:container_file_t:s0 Oct 14 05:56:14 localhost nova_compute_init[297951]: INFO:nova_statedir:Checking uid: 0 gid: 0 path: /var/lib/nova/instances/88c4e366-b765-47a6-96bf-f7677f2ce67c/console.log Oct 14 05:56:14 localhost nova_compute_init[297951]: INFO:nova_statedir:Checking uid: 
42436 gid: 42436 path: /var/lib/nova/instances/_base/ Oct 14 05:56:14 localhost nova_compute_init[297951]: INFO:nova_statedir:Ownership of /var/lib/nova/instances/_base already 42436:42436 Oct 14 05:56:14 localhost nova_compute_init[297951]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances/_base to system_u:object_r:container_file_t:s0 Oct 14 05:56:14 localhost nova_compute_init[297951]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/_base/42de92eaa427bd35ce2c758a3a1fba782a57128e Oct 14 05:56:14 localhost nova_compute_init[297951]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/_base/ephemeral_1_0706d66 Oct 14 05:56:14 localhost nova_compute_init[297951]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/locks/ Oct 14 05:56:14 localhost nova_compute_init[297951]: INFO:nova_statedir:Ownership of /var/lib/nova/instances/locks already 42436:42436 Oct 14 05:56:14 localhost nova_compute_init[297951]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances/locks to system_u:object_r:container_file_t:s0 Oct 14 05:56:14 localhost nova_compute_init[297951]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/locks/nova-42de92eaa427bd35ce2c758a3a1fba782a57128e Oct 14 05:56:14 localhost nova_compute_init[297951]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/locks/nova-ephemeral_1_0706d66 Oct 14 05:56:14 localhost nova_compute_init[297951]: INFO:nova_statedir:Checking uid: 0 gid: 0 path: /var/lib/nova/delay-nova-compute Oct 14 05:56:14 localhost nova_compute_init[297951]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/ Oct 14 05:56:14 localhost nova_compute_init[297951]: INFO:nova_statedir:Ownership of /var/lib/nova/.ssh already 42436:42436 Oct 14 05:56:14 localhost nova_compute_init[297951]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.ssh to 
system_u:object_r:container_file_t:s0 Oct 14 05:56:14 localhost nova_compute_init[297951]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/ssh-privatekey Oct 14 05:56:14 localhost nova_compute_init[297951]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/config Oct 14 05:56:14 localhost nova_compute_init[297951]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.cache/ Oct 14 05:56:14 localhost nova_compute_init[297951]: INFO:nova_statedir:Ownership of /var/lib/nova/.cache already 42436:42436 Oct 14 05:56:14 localhost nova_compute_init[297951]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.cache to system_u:object_r:container_file_t:s0 Oct 14 05:56:14 localhost nova_compute_init[297951]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.cache/python-entrypoints/ Oct 14 05:56:14 localhost nova_compute_init[297951]: INFO:nova_statedir:Ownership of /var/lib/nova/.cache/python-entrypoints already 42436:42436 Oct 14 05:56:14 localhost nova_compute_init[297951]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.cache/python-entrypoints to system_u:object_r:container_file_t:s0 Oct 14 05:56:14 localhost nova_compute_init[297951]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.cache/python-entrypoints/7dbe5bae7bc27ef07490c629ec1f09edaa9e8c135ff89c3f08f1e44f39cf5928 Oct 14 05:56:14 localhost nova_compute_init[297951]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.cache/python-entrypoints/9469aff02825a9e3dcdb3ceeb358f8d540dc07c8b6e9cd975f170399051d29c3 Oct 14 05:56:14 localhost nova_compute_init[297951]: INFO:nova_statedir:Nova statedir ownership complete Oct 14 05:56:14 localhost systemd[1]: libpod-6c4d7cae210e0acc65e1c4fc410517ec41d3f7534328f58bee37a38122214c4b.scope: Deactivated successfully. 
Oct 14 05:56:14 localhost podman[297952]: 2025-10-14 09:56:14.833354612 +0000 UTC m=+0.058176322 container died 6c4d7cae210e0acc65e1c4fc410517ec41d3f7534328f58bee37a38122214c4b (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=nova_compute_init, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_id=edpm, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, maintainer=OpenStack Kubernetes Operator team, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2) Oct 14 05:56:14 localhost nova_compute[297686]: 2025-10-14 09:56:14.899 2 DEBUG oslo_concurrency.lockutils [None req-f54e50da-17e5-40f4-aab2-053c4dced060 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Oct 14 05:56:14 localhost nova_compute[297686]: 2025-10-14 09:56:14.900 2 DEBUG oslo_concurrency.lockutils [None req-f54e50da-17e5-40f4-aab2-053c4dced060 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s 
inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Oct 14 05:56:14 localhost nova_compute[297686]: 2025-10-14 09:56:14.901 2 DEBUG oslo_concurrency.lockutils [None req-f54e50da-17e5-40f4-aab2-053c4dced060 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Oct 14 05:56:14 localhost nova_compute[297686]: 2025-10-14 09:56:14.901 2 DEBUG nova.compute.resource_tracker [None req-f54e50da-17e5-40f4-aab2-053c4dced060 - - - - - -] Auditing locally available compute resources for np0005486733.localdomain (node: np0005486733.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Oct 14 05:56:14 localhost nova_compute[297686]: 2025-10-14 09:56:14.902 2 DEBUG oslo_concurrency.processutils [None req-f54e50da-17e5-40f4-aab2-053c4dced060 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Oct 14 05:56:14 localhost podman[297965]: 2025-10-14 09:56:14.907434294 +0000 UTC m=+0.067405837 container cleanup 6c4d7cae210e0acc65e1c4fc410517ec41d3f7534328f58bee37a38122214c4b (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', 
'/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, managed_by=edpm_ansible, tcib_managed=true, container_name=nova_compute_init, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5) Oct 14 05:56:14 localhost systemd[1]: libpod-conmon-6c4d7cae210e0acc65e1c4fc410517ec41d3f7534328f58bee37a38122214c4b.scope: Deactivated successfully. Oct 14 05:56:15 localhost nova_compute[297686]: 2025-10-14 09:56:15.334 2 DEBUG oslo_concurrency.processutils [None req-f54e50da-17e5-40f4-aab2-053c4dced060 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.432s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Oct 14 05:56:15 localhost nova_compute[297686]: 2025-10-14 09:56:15.391 2 DEBUG nova.virt.libvirt.driver [None req-f54e50da-17e5-40f4-aab2-053c4dced060 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Oct 14 05:56:15 localhost nova_compute[297686]: 2025-10-14 09:56:15.391 2 DEBUG nova.virt.libvirt.driver [None req-f54e50da-17e5-40f4-aab2-053c4dced060 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Oct 14 05:56:15 localhost nova_compute[297686]: 2025-10-14 09:56:15.589 2 WARNING nova.virt.libvirt.driver [None req-f54e50da-17e5-40f4-aab2-053c4dced060 - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Oct 14 05:56:15 localhost nova_compute[297686]: 2025-10-14 09:56:15.590 2 DEBUG nova.compute.resource_tracker [None req-f54e50da-17e5-40f4-aab2-053c4dced060 - - - - - -] Hypervisor/Node resource view: name=np0005486733.localdomain free_ram=11798MB free_disk=41.83725357055664GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": 
"1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Oct 14 05:56:15 localhost nova_compute[297686]: 2025-10-14 09:56:15.590 2 DEBUG oslo_concurrency.lockutils [None req-f54e50da-17e5-40f4-aab2-053c4dced060 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Oct 14 05:56:15 localhost nova_compute[297686]: 2025-10-14 09:56:15.590 2 DEBUG oslo_concurrency.lockutils [None req-f54e50da-17e5-40f4-aab2-053c4dced060 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Oct 14 05:56:15 localhost systemd[1]: var-lib-containers-storage-overlay-4ac6a4b7d3fd9b3800ade157df8bbc87609a0105a3f06e00de1c30bcdbf261df-merged.mount: Deactivated successfully. Oct 14 05:56:15 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-6c4d7cae210e0acc65e1c4fc410517ec41d3f7534328f58bee37a38122214c4b-userdata-shm.mount: Deactivated successfully. Oct 14 05:56:15 localhost systemd[1]: session-61.scope: Deactivated successfully. Oct 14 05:56:15 localhost systemd[1]: session-61.scope: Consumed 1min 50.994s CPU time. Oct 14 05:56:15 localhost systemd-logind[760]: Session 61 logged out. Waiting for processes to exit. Oct 14 05:56:15 localhost systemd-logind[760]: Removed session 61. 
Oct 14 05:56:15 localhost nova_compute[297686]: 2025-10-14 09:56:15.876 2 DEBUG nova.compute.resource_tracker [None req-f54e50da-17e5-40f4-aab2-053c4dced060 - - - - - -] Instance 88c4e366-b765-47a6-96bf-f7677f2ce67c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Oct 14 05:56:15 localhost nova_compute[297686]: 2025-10-14 09:56:15.876 2 DEBUG nova.compute.resource_tracker [None req-f54e50da-17e5-40f4-aab2-053c4dced060 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Oct 14 05:56:15 localhost nova_compute[297686]: 2025-10-14 09:56:15.876 2 DEBUG nova.compute.resource_tracker [None req-f54e50da-17e5-40f4-aab2-053c4dced060 - - - - - -] Final resource view: name=np0005486733.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Oct 14 05:56:16 localhost nova_compute[297686]: 2025-10-14 09:56:16.226 2 DEBUG nova.scheduler.client.report [None req-f54e50da-17e5-40f4-aab2-053c4dced060 - - - - - -] Refreshing inventories for resource provider 18c24273-aca2-4f08-be57-3188d558235e _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m Oct 14 05:56:16 localhost nova_compute[297686]: 2025-10-14 09:56:16.250 2 DEBUG nova.scheduler.client.report [None req-f54e50da-17e5-40f4-aab2-053c4dced060 - - - - - -] Updating ProviderTree inventory for provider 18c24273-aca2-4f08-be57-3188d558235e from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 
15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m Oct 14 05:56:16 localhost nova_compute[297686]: 2025-10-14 09:56:16.250 2 DEBUG nova.compute.provider_tree [None req-f54e50da-17e5-40f4-aab2-053c4dced060 - - - - - -] Updating inventory in ProviderTree for provider 18c24273-aca2-4f08-be57-3188d558235e with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m Oct 14 05:56:16 localhost nova_compute[297686]: 2025-10-14 09:56:16.266 2 DEBUG nova.scheduler.client.report [None req-f54e50da-17e5-40f4-aab2-053c4dced060 - - - - - -] Refreshing aggregate associations for resource provider 18c24273-aca2-4f08-be57-3188d558235e, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m Oct 14 05:56:16 localhost nova_compute[297686]: 2025-10-14 09:56:16.285 2 DEBUG nova.scheduler.client.report [None req-f54e50da-17e5-40f4-aab2-053c4dced060 - - - - - -] Refreshing trait associations for resource provider 18c24273-aca2-4f08-be57-3188d558235e, traits: 
COMPUTE_VOLUME_EXTEND,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE,HW_CPU_X86_BMI,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_MMX,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_SECURITY_TPM_2_0,HW_CPU_X86_FMA3,COMPUTE_GRAPHICS_MODEL_BOCHS,HW_CPU_X86_AVX2,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_SECURITY_TPM_1_2,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_AMD_SVM,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_ACCELERATORS,HW_CPU_X86_AESNI,COMPUTE_STORAGE_BUS_FDC,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_F16C,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SVM,COMPUTE_STORAGE_BUS_IDE,HW_CPU_X86_SSE4A,HW_CPU_X86_SSSE3,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NODE,HW_CPU_X86_BMI2,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_TRUSTED_CERTS,COMPUTE_STORAGE_BUS_SATA,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_STORAGE_BUS_USB,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_ABM,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_SSE2,HW_CPU_X86_AVX,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_CLMUL,HW_CPU_X86_SHA,COMPUTE_IMAGE_TYPE_RAW _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m Oct 14 05:56:16 localhost nova_compute[297686]: 2025-10-14 09:56:16.316 2 DEBUG oslo_concurrency.processutils [None req-f54e50da-17e5-40f4-aab2-053c4dced060 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Oct 14 05:56:16 localhost nova_compute[297686]: 2025-10-14 09:56:16.763 2 DEBUG oslo_concurrency.processutils [None 
req-f54e50da-17e5-40f4-aab2-053c4dced060 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.447s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Oct 14 05:56:16 localhost nova_compute[297686]: 2025-10-14 09:56:16.768 2 DEBUG nova.virt.libvirt.host [None req-f54e50da-17e5-40f4-aab2-053c4dced060 - - - - - -] /sys/module/kvm_amd/parameters/sev contains [N Oct 14 05:56:16 localhost nova_compute[297686]: ] _kernel_supports_amd_sev /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1803#033[00m Oct 14 05:56:16 localhost nova_compute[297686]: 2025-10-14 09:56:16.769 2 INFO nova.virt.libvirt.host [None req-f54e50da-17e5-40f4-aab2-053c4dced060 - - - - - -] kernel doesn't support AMD SEV#033[00m Oct 14 05:56:16 localhost nova_compute[297686]: 2025-10-14 09:56:16.771 2 DEBUG nova.compute.provider_tree [None req-f54e50da-17e5-40f4-aab2-053c4dced060 - - - - - -] Inventory has not changed in ProviderTree for provider: 18c24273-aca2-4f08-be57-3188d558235e update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Oct 14 05:56:16 localhost nova_compute[297686]: 2025-10-14 09:56:16.771 2 DEBUG nova.virt.libvirt.driver [None req-f54e50da-17e5-40f4-aab2-053c4dced060 - - - - - -] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m Oct 14 05:56:16 localhost nova_compute[297686]: 2025-10-14 09:56:16.795 2 DEBUG nova.scheduler.client.report [None req-f54e50da-17e5-40f4-aab2-053c4dced060 - - - - - -] Inventory has not changed for provider 18c24273-aca2-4f08-be57-3188d558235e based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 
41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Oct 14 05:56:16 localhost nova_compute[297686]: 2025-10-14 09:56:16.822 2 DEBUG nova.compute.resource_tracker [None req-f54e50da-17e5-40f4-aab2-053c4dced060 - - - - - -] Compute_service record updated for np0005486733.localdomain:np0005486733.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Oct 14 05:56:16 localhost nova_compute[297686]: 2025-10-14 09:56:16.823 2 DEBUG oslo_concurrency.lockutils [None req-f54e50da-17e5-40f4-aab2-053c4dced060 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.232s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Oct 14 05:56:16 localhost nova_compute[297686]: 2025-10-14 09:56:16.823 2 DEBUG nova.service [None req-f54e50da-17e5-40f4-aab2-053c4dced060 - - - - - -] Creating RPC server for service compute start /usr/lib/python3.9/site-packages/nova/service.py:182#033[00m Oct 14 05:56:16 localhost nova_compute[297686]: 2025-10-14 09:56:16.850 2 DEBUG nova.service [None req-f54e50da-17e5-40f4-aab2-053c4dced060 - - - - - -] Join ServiceGroup membership for this service compute start /usr/lib/python3.9/site-packages/nova/service.py:199#033[00m Oct 14 05:56:16 localhost nova_compute[297686]: 2025-10-14 09:56:16.850 2 DEBUG nova.servicegroup.drivers.db [None req-f54e50da-17e5-40f4-aab2-053c4dced060 - - - - - -] DB_Driver: join new ServiceGroup member np0005486733.localdomain to the compute group, service = join /usr/lib/python3.9/site-packages/nova/servicegroup/drivers/db.py:44#033[00m Oct 14 05:56:17 localhost nova_compute[297686]: 2025-10-14 09:56:17.144 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:56:18 localhost nova_compute[297686]: 2025-10-14 09:56:18.577 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:56:22 localhost nova_compute[297686]: 2025-10-14 09:56:22.187 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:56:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2848 DF PROTO=TCP SPT=53840 DPT=9102 SEQ=879401917 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B18C2160000000001030307) Oct 14 05:56:23 localhost nova_compute[297686]: 2025-10-14 09:56:23.616 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:56:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2849 DF PROTO=TCP SPT=53840 DPT=9102 SEQ=879401917 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B18C61A0000000001030307) Oct 14 05:56:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041. Oct 14 05:56:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857. 
Oct 14 05:56:24 localhost podman[298052]: 2025-10-14 09:56:24.729735214 +0000 UTC m=+0.069374638 container health_status 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Oct 14 05:56:24 localhost podman[298052]: 2025-10-14 09:56:24.739187715 +0000 UTC m=+0.078827189 container exec_died 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', 
'/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Oct 14 05:56:24 localhost systemd[1]: 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041.service: Deactivated successfully. Oct 14 05:56:24 localhost podman[298053]: 2025-10-14 09:56:24.791140885 +0000 UTC m=+0.128066735 container health_status 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2) Oct 14 05:56:24 localhost podman[298053]: 2025-10-14 09:56:24.827098342 +0000 UTC m=+0.164024222 container exec_died 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, 
org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_managed=true) Oct 14 05:56:24 localhost systemd[1]: 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857.service: Deactivated successfully. Oct 14 05:56:26 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2850 DF PROTO=TCP SPT=53840 DPT=9102 SEQ=879401917 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B18CE1A0000000001030307) Oct 14 05:56:27 localhost nova_compute[297686]: 2025-10-14 09:56:27.219 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:56:28 localhost podman[248187]: time="2025-10-14T09:56:28Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Oct 14 05:56:28 localhost podman[248187]: @ - - [14/Oct/2025:09:56:28 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 141158 "" "Go-http-client/1.1" Oct 14 05:56:28 localhost podman[248187]: @ - - [14/Oct/2025:09:56:28 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18353 "" "Go-http-client/1.1" Oct 14 05:56:28 localhost nova_compute[297686]: 2025-10-14 09:56:28.653 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:56:30 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2851 DF PROTO=TCP SPT=53840 DPT=9102 SEQ=879401917 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B18DDDA0000000001030307) Oct 14 05:56:32 localhost 
nova_compute[297686]: 2025-10-14 09:56:32.220 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:56:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29. Oct 14 05:56:33 localhost nova_compute[297686]: 2025-10-14 09:56:33.655 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:56:33 localhost podman[298093]: 2025-10-14 09:56:33.733460904 +0000 UTC m=+0.078358875 container health_status fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_id=iscsid, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}) Oct 14 05:56:33 localhost podman[298093]: 2025-10-14 09:56:33.742552791 +0000 UTC m=+0.087450772 container exec_died fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_id=iscsid, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, io.buildah.version=1.41.3, 
org.label-schema.build-date=20251009, tcib_managed=true) Oct 14 05:56:33 localhost systemd[1]: fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29.service: Deactivated successfully. Oct 14 05:56:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec. Oct 14 05:56:35 localhost podman[298112]: 2025-10-14 09:56:35.742701618 +0000 UTC m=+0.080665337 container health_status aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors ) Oct 14 05:56:35 localhost podman[298112]: 2025-10-14 09:56:35.780167586 +0000 UTC m=+0.118131315 container exec_died 
aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors ) Oct 14 05:56:35 localhost systemd[1]: aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec.service: Deactivated successfully. Oct 14 05:56:37 localhost nova_compute[297686]: 2025-10-14 09:56:37.223 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:56:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad. 
Oct 14 05:56:37 localhost podman[298135]: 2025-10-14 09:56:37.737946999 +0000 UTC m=+0.074132291 container health_status 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0) Oct 14 05:56:37 localhost podman[298135]: 2025-10-14 09:56:37.752022025 +0000 UTC m=+0.088207367 container exec_died 
02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, container_name=multipathd, org.label-schema.build-date=20251009, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0) Oct 14 05:56:37 localhost systemd[1]: 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad.service: Deactivated successfully. 
Oct 14 05:56:38 localhost nova_compute[297686]: 2025-10-14 09:56:38.687 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:56:38 localhost openstack_network_exporter[250374]: ERROR 09:56:38 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Oct 14 05:56:38 localhost openstack_network_exporter[250374]: ERROR 09:56:38 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Oct 14 05:56:38 localhost openstack_network_exporter[250374]: ERROR 09:56:38 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Oct 14 05:56:38 localhost openstack_network_exporter[250374]: ERROR 09:56:38 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Oct 14 05:56:38 localhost openstack_network_exporter[250374]: Oct 14 05:56:38 localhost openstack_network_exporter[250374]: ERROR 09:56:38 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Oct 14 05:56:38 localhost openstack_network_exporter[250374]: Oct 14 05:56:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f. Oct 14 05:56:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917. 
Oct 14 05:56:41 localhost podman[298154]: 2025-10-14 09:56:41.728221844 +0000 UTC m=+0.074738960 container health_status 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0) Oct 14 05:56:41 localhost podman[298154]: 2025-10-14 09:56:41.818249127 +0000 UTC m=+0.164766233 container exec_died 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, managed_by=edpm_ansible, 
tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}) Oct 14 05:56:41 localhost systemd[1]: tmp-crun.vP5df4.mount: Deactivated successfully. Oct 14 05:56:41 localhost systemd[1]: 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f.service: Deactivated successfully. 
Oct 14 05:56:41 localhost podman[298155]: 2025-10-14 09:56:41.825021072 +0000 UTC m=+0.168674477 container health_status 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, build-date=2025-08-20T13:12:41, distribution-scope=public, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, config_id=edpm, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, vendor=Red Hat, Inc., container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., version=9.6, architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.) 
Oct 14 05:56:41 localhost podman[298155]: 2025-10-14 09:56:41.909081136 +0000 UTC m=+0.252734531 container exec_died 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.buildah.version=1.33.7, version=9.6, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, io.openshift.expose-services=) Oct 14 05:56:41 localhost systemd[1]: 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917.service: Deactivated successfully. Oct 14 05:56:42 localhost nova_compute[297686]: 2025-10-14 09:56:42.225 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:56:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d. 
Oct 14 05:56:43 localhost nova_compute[297686]: 2025-10-14 09:56:43.689 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:56:43 localhost podman[298199]: 2025-10-14 09:56:43.730408275 +0000 UTC m=+0.077061814 container health_status 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251009) Oct 14 05:56:43 localhost podman[298199]: 2025-10-14 09:56:43.767927514 +0000 UTC m=+0.114581043 container exec_died 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3) Oct 14 05:56:43 localhost systemd[1]: 
89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d.service: Deactivated successfully. Oct 14 05:56:44 localhost nova_compute[297686]: 2025-10-14 09:56:44.288 2 DEBUG nova.compute.manager [None req-d15b3fde-d553-4b7c-b094-26acfc2f7765 9d85e6ce130c46ec855f37147dbb08b4 41187b090f3d4818a32baa37ce8a3991 - - default default] [instance: 88c4e366-b765-47a6-96bf-f7677f2ce67c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m Oct 14 05:56:44 localhost nova_compute[297686]: 2025-10-14 09:56:44.293 2 INFO nova.compute.manager [None req-d15b3fde-d553-4b7c-b094-26acfc2f7765 9d85e6ce130c46ec855f37147dbb08b4 41187b090f3d4818a32baa37ce8a3991 - - default default] [instance: 88c4e366-b765-47a6-96bf-f7677f2ce67c] Retrieving diagnostics#033[00m Oct 14 05:56:47 localhost nova_compute[297686]: 2025-10-14 09:56:47.228 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:56:48 localhost nova_compute[297686]: 2025-10-14 09:56:48.692 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.817 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'name': 'test', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005486733.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '41187b090f3d4818a32baa37ce8a3991', 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'hostId': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'status': 'active', 'metadata': {}} 
discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.817 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.818 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.841 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.read.latency volume: 1400578039 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.842 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.read.latency volume: 174873109 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.843 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '568d6f68-1687-48d6-a869-70b1565081d2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1400578039, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T09:56:49.818142', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '1a75bcca-a8e4-11f0-9707-fa163e99780b', 'monotonic_time': 11826.010838457, 'message_signature': '3792bf3840fbd7d5191036569422bb5f40d3b813f12bb7336acab41eb2943708'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 174873109, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T09:56:49.818142', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '1a75ccb0-a8e4-11f0-9707-fa163e99780b', 'monotonic_time': 11826.010838457, 'message_signature': '7a2109fc965b9ec320d505f930142ec69f22b9e5f7723695b7623f618325e7fd'}]}, 'timestamp': '2025-10-14 09:56:49.842503', '_unique_id': '25126b48f5fe45868001ef9ea97daf7a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.843 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.843 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.843 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.843 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.843 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.843 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.843 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.843 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.843 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.843 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.843 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.843 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.843 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.843 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.843 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.843 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.843 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.843 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.843 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.843 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.843 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.843 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.843 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.843 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.843 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.843 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.843 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.843 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.843 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.843 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.843 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.843 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.843 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.843 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.843 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.843 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.843 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.843 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.843 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.843 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.843 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.843 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.843 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.843 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.843 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.843 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.843 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.843 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.843 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.843 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.843 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.843 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.843 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.843 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.844 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.844 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.write.bytes volume: 73895936 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.845 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.846 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0ad7bb39-f400-41c1-af4f-9aaa88771fbf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 73895936, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T09:56:49.844635', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '1a763358-a8e4-11f0-9707-fa163e99780b', 'monotonic_time': 11826.010838457, 'message_signature': 'c7d2b89da2cb50a7e3fcc46761cb974cd2b374b631c0d3b74d6874b8f4f8e1e6'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T09:56:49.844635', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '1a764398-a8e4-11f0-9707-fa163e99780b', 'monotonic_time': 11826.010838457, 'message_signature': '4356b2b1afcbed19993d61d8b52be200f1319b7f20eaabc8d8fceeb2b217475d'}]}, 'timestamp': '2025-10-14 09:56:49.845547', '_unique_id': '6aedea5183b24ed8995498a7d44e1632'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.846 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.846 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.846 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.846 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.846 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.846 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.846 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.846 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.846 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.846 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.846 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.846 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.846 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.846 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.846 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.846 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.846 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.846 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.846 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.846 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.846 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.846 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.846 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.846 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.846 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.846 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.846 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.846 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.846 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.846 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.846 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.847 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.851 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.incoming.bytes volume: 9226 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.852 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8b11bebe-409c-49b9-acbe-02ffcc144642', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9226, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T09:56:49.847897', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': '1a7734c4-a8e4-11f0-9707-fa163e99780b', 'monotonic_time': 11826.040601589, 'message_signature': '85bea1628a23d2d6abe69fae43205a7dc4ccb08caf7a33da0ba94cb98c0adf88'}]}, 'timestamp': '2025-10-14 09:56:49.851750', '_unique_id': '6f4870fa3ea24ed2aa3c096d94acec46'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.852 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.852 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.852 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.852 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.852 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.852 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.852 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.852 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.852 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.852 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.852 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.852 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.852 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.852 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.852 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.852 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.852 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.852 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.852 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.852 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.852 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.852 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.852 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.852 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.852 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.852 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.852 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.852 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.852 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.852 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.852 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.852 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.852 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.852 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.852 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.852 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.852 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.852 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.852 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.852 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.852 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.852 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.852 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.852 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.852 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.852 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.852 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.852 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.852 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.852 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.852 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.852 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.852 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.852 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:56:49 localhost
ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.853 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.876 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/cpu volume: 66490000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.877 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a1eb2964-6334-4fd6-9163-4fc392d8d2cc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 66490000000, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'timestamp': '2025-10-14T09:56:49.853226', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '1a7b0004-a8e4-11f0-9707-fa163e99780b', 'monotonic_time': 11826.068672839, 
'message_signature': 'abf3b3bedfc6a3c7cb2217c6e2eff487b9f2ec2fa29fc5a22c94e5abc83801a4'}]}, 'timestamp': '2025-10-14 09:56:49.876693', '_unique_id': '72de88ea006c4cbda7b6b1bd6e6b56ff'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.877 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.877 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.877 12 ERROR oslo_messaging.notify.messaging yield Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.877 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.877 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.877 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.877 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.877 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.877 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.877 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.877 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.877 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.877 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.877 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.877 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.877 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.877 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.877 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.877 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.877 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 05:56:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.877 12 ERROR oslo_messaging.notify.messaging Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.877 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.877 12 ERROR oslo_messaging.notify.messaging Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.877 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.877 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.877 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.877 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.877 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.877 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.877 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.877 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", 
line 653, in _send Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.877 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.877 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.877 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.877 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.877 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.877 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.877 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.877 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.877 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.877 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.877 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.877 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.877 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.877 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.877 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.877 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.877 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.877 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.877 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.877 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 05:56:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.877 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.877 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.877 12 ERROR oslo_messaging.notify.messaging Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.878 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.878 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.878 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.880 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '0c7db842-b6b5-4fff-9e22-2c29e3e08234', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T09:56:49.878796', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': '1a7b7296-a8e4-11f0-9707-fa163e99780b', 'monotonic_time': 11826.040601589, 'message_signature': '8e3e0875aca051ab5f372cee81a1dcb493c79d9e004ee04c9fbec9bbf65b6e7d'}]}, 'timestamp': '2025-10-14 09:56:49.879614', '_unique_id': 'b5167fa6153743048de370eb20772c9d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.880 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:56:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.880 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.880 12 ERROR oslo_messaging.notify.messaging yield Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.880 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.880 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.880 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.880 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.880 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.880 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.880 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.880 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.880 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.880 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.880 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.880 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.880 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.880 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.880 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.880 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.880 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.880 12 ERROR oslo_messaging.notify.messaging Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.880 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.880 12 ERROR oslo_messaging.notify.messaging Oct 14 05:56:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.880 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.880 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.880 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.880 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.880 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.880 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.880 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.880 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.880 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.880 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.880 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.880 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.880 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.880 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.880 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.880 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.880 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.880 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.880 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.880 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 05:56:49 
localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.880 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.880 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.880 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.880 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.880 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.880 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.880 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.880 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.880 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.880 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.880 12 ERROR oslo_messaging.notify.messaging Oct 14 05:56:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.881 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.881 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.883 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e2718552-0f25-4dac-a2ce-3e053429f639', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T09:56:49.881825', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 
'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': '1a7bdd3a-a8e4-11f0-9707-fa163e99780b', 'monotonic_time': 11826.040601589, 'message_signature': '2ddcd3ca12a1684dfe392878cf0e9f8e7fd5c59eca0a9189b27a99941aac7c7e'}]}, 'timestamp': '2025-10-14 09:56:49.882264', '_unique_id': '1b6c97a4ee9247c98047d062ae095fc0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.883 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.883 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.883 12 ERROR oslo_messaging.notify.messaging yield Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.883 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.883 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.883 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.883 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.883 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.883 12 ERROR 
oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.883 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.883 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.883 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.883 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.883 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.883 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.883 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.883 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.883 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.883 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 
2025-10-14 09:56:49.883 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.883 12 ERROR oslo_messaging.notify.messaging Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.883 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.883 12 ERROR oslo_messaging.notify.messaging Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.883 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.883 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.883 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.883 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.883 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.883 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.883 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 05:56:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.883 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.883 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.883 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.883 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.883 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.883 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.883 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.883 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.883 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.883 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 
05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.883 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.883 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.883 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.883 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.883 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.883 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.883 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.883 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.883 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.883 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.883 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.883 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.883 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.883 12 ERROR oslo_messaging.notify.messaging Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.884 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.899 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.900 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.allocation volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.901 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '1b82a3d5-e6a2-4444-aec5-62f266bca19c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T09:56:49.884236', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '1a7e9a02-a8e4-11f0-9707-fa163e99780b', 'monotonic_time': 11826.076941031, 'message_signature': '81944f06f9823de14390a9d0dd3e2c4f571d6bd5b4b4f331d1c1fbb4e61d2eef'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T09:56:49.884236', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 
'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '1a7ea894-a8e4-11f0-9707-fa163e99780b', 'monotonic_time': 11826.076941031, 'message_signature': 'f186500650f0bbb4b19e47dab7e5142d59ec45f6e1608814777d9324e13fc128'}]}, 'timestamp': '2025-10-14 09:56:49.900549', '_unique_id': 'e2e6e7cb52db4df2a3d827e2a64a3440'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.901 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.901 12 ERROR oslo_messaging.notify.messaging yield Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.901 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.901 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.901 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.901 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.901 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.901 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.901 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 05:56:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.901 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.901 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.901 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.901 12 ERROR oslo_messaging.notify.messaging Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.901 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.901 12 ERROR oslo_messaging.notify.messaging Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.901 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.901 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 
09:56:49.901 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.901 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.901 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.901 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.901 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 05:56:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.901 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.901 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.901 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.901 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.901 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 
09:56:49.901 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.901 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.901 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.901 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.901 12 ERROR oslo_messaging.notify.messaging Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.902 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.902 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.903 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'e31093dc-19a6-40a5-8d71-762ba9f4996e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T09:56:49.902329', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': '1a7efe48-a8e4-11f0-9707-fa163e99780b', 'monotonic_time': 11826.040601589, 'message_signature': '2166921409ac0cdc3a58f355467ede7c54cbe554a7963a97622b795c12c2841b'}]}, 'timestamp': '2025-10-14 09:56:49.902816', '_unique_id': 'd77e0d77067b45b3b685eaa0eb984d72'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.903 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 
2025-10-14 09:56:49.903 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.903 12 ERROR oslo_messaging.notify.messaging yield Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.903 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.903 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.903 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.903 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.903 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.903 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.903 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.903 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.903 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.903 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.903 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.903 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.903 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.903 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.903 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.903 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.903 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.903 12 ERROR oslo_messaging.notify.messaging Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.903 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.903 12 ERROR oslo_messaging.notify.messaging Oct 14 05:56:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.903 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.903 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.903 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.903 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.903 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.903 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.903 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.903 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.903 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.903 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.903 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.903 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.903 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.903 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.903 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.903 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.903 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.903 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.903 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.903 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.903 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.903 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.903 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.903 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.903 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.903 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.903 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.903 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.903 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.903 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.903 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.904 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.904 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.write.latency volume: 217051898 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.905 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.write.latency volume: 24279644 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.906 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'dd99b59d-866a-45ad-bea7-919d9b2f6f80', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 217051898, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T09:56:49.904862', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '1a7f60fe-a8e4-11f0-9707-fa163e99780b', 'monotonic_time': 11826.010838457, 'message_signature': 'a6f04c2b9fb78979b9e01f745d37dc1a4cd1faf08e86b3ac2a25d301cb27fdf8'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 24279644, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T09:56:49.904862', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '1a7f7094-a8e4-11f0-9707-fa163e99780b', 'monotonic_time': 11826.010838457, 'message_signature': '4abd73fee2fcd21b25c062e1bd6df50059602f969d2b6c5dc4cbe893d759da51'}]}, 'timestamp': '2025-10-14 09:56:49.905701', '_unique_id': '2dad655069484a7996ecfc764110faa7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.906 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.906 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.906 12 ERROR oslo_messaging.notify.messaging yield
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.906 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.906 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.906 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.906 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.906 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.906 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.906 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.906 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.906 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.906 12 ERROR oslo_messaging.notify.messaging conn.connect()
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.906 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.906 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.906 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.906 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.906 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.906 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.906 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.906 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.906 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.906 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.906 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.906 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.906 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.906 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.906 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.906 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.906 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.906 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.906 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.906 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.906 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.906 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.906 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.906 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.906 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.906 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.906 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.906 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.906 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.906 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.906 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.906 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.906 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.906 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.906 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.906 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.906 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.906 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.906 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.906 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.906 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.907 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.907 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.909 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'de3e3c81-a5e5-4545-8f62-c52883a4136c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T09:56:49.907697', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': '1a7fd016-a8e4-11f0-9707-fa163e99780b', 'monotonic_time': 11826.040601589, 'message_signature': '9ef1a87a3efc1af0af59eb62c05afd3b5cd96f4e0a6f32d957d573b75b7bad18'}]}, 'timestamp': '2025-10-14 09:56:49.908139', '_unique_id': '0a0f141445df45d18ca2c4a15621ad9b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.909 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.909 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.909 12 ERROR oslo_messaging.notify.messaging yield
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.909 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.909 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.909 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.909 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.909 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.909 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.909 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.909 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.909 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.909 12 ERROR oslo_messaging.notify.messaging conn.connect()
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.909 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.909 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.909 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.909 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.909 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.909 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.909 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.909 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.909 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.909 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.909 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.909 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.909 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.909 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.909 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.909 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.909 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.909 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.909 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.909 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.909 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.909 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.909 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.909 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.909 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.909 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.909 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.909 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.909 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.909 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.909 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.909 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.909 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.909 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.909 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.909 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.909 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.909 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.909 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.909 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.909 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.910 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.910 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.outgoing.bytes volume: 11272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.912 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '984c9da2-241e-4530-a67d-0d93371154c7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 11272, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T09:56:49.910532', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': '1a8040c8-a8e4-11f0-9707-fa163e99780b', 'monotonic_time': 11826.040601589, 'message_signature': '02f344fdefa62cbef85cd82bfd2f50ee41c9c287f2018d83e751a0e9eec19028'}]}, 'timestamp': '2025-10-14 09:56:49.911067', '_unique_id': '843d2bfed924467795701253b5fcf94a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.912 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.912 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.912 12 ERROR oslo_messaging.notify.messaging yield
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.912 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.912 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.912 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.912 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.912 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.912 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.912 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.912 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.912 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.912 12 ERROR oslo_messaging.notify.messaging conn.connect()
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.912 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.912 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.912 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.912 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.912 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.912 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.912 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.912 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.912 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.912 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.912 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.912 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.912 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.912 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.912 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.912 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.912 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.912 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.912 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.912 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.912 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.912 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.912 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.912 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.912 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.912 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.912 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.912 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.912 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.912 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 05:56:49 
localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.912 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.912 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.912 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.912 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.912 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.912 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.912 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.912 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.912 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.912 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.912 12 ERROR oslo_messaging.notify.messaging Oct 14 05:56:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.913 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.913 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.913 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.read.requests volume: 1064 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.914 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.read.requests volume: 222 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.915 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'f9ed4709-b8d1-4e7f-969a-2dab43ac9070', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1064, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T09:56:49.913563', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '1a80b738-a8e4-11f0-9707-fa163e99780b', 'monotonic_time': 11826.010838457, 'message_signature': 'bcab2e02a19f2e9c4b8b5830f2998edce463b0847962b26378db832982dc5759'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 222, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T09:56:49.913563', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '1a80c85e-a8e4-11f0-9707-fa163e99780b', 'monotonic_time': 11826.010838457, 'message_signature': '5dff1452fb1a2d616ef7ce89a4f280df0b3c4827b0d2d1d853c81bb8e3b5f2c1'}]}, 'timestamp': '2025-10-14 09:56:49.914494', '_unique_id': '1c103064bb55470d859df9662477329a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.915 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.915 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.915 12 ERROR oslo_messaging.notify.messaging yield Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.915 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.915 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.915 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.915 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.915 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.915 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.915 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.915 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.915 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.915 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.915 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.915 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.915 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 
05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.915 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.915 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.915 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.915 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.915 12 ERROR oslo_messaging.notify.messaging Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.915 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.915 12 ERROR oslo_messaging.notify.messaging Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.915 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.915 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.915 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.915 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 05:56:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.915 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.915 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.915 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.915 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.915 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.915 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.915 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.915 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.915 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.915 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.915 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.915 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.915 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.915 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.915 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.915 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.915 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.915 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.915 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.915 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:56:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.915 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.915 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.915 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.915 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.915 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.915 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.915 12 ERROR oslo_messaging.notify.messaging Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.916 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.917 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/memory.usage volume: 52.32421875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.918 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '704d7ccd-6a29-4b55-8ae6-b882a8bf11a0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 52.32421875, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'timestamp': '2025-10-14T09:56:49.916974', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '1a813b68-a8e4-11f0-9707-fa163e99780b', 'monotonic_time': 11826.068672839, 'message_signature': 'f94bb945a52128f2f429d5c4d799361f694762122ae0cba096f2b1b80768b3aa'}]}, 'timestamp': '2025-10-14 09:56:49.917463', '_unique_id': '6432e713cff44805974d1b3f262b6aaa'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.918 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in 
_reraise_as_library_errors Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.918 12 ERROR oslo_messaging.notify.messaging yield Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.918 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.918 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.918 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.918 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.918 12 
ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.918 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.918 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.918 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.918 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.918 12 ERROR oslo_messaging.notify.messaging Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.918 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.918 12 ERROR oslo_messaging.notify.messaging Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.918 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 
2025-10-14 09:56:49.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.918 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.918 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.918 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.918 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.918 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.918 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.918 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.918 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.918 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.918 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.918 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.918 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.918 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.918 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.918 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.918 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.920 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.920 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.920 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.922 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a36e5b1a-5716-48b0-9d1f-e9df9f9c25e7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T09:56:49.920168', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '1a81b818-a8e4-11f0-9707-fa163e99780b', 'monotonic_time': 11826.076941031, 'message_signature': '5f183a31d8e4b6506b97237774976784f0db310a4fcc3f620f4ef6a216ad5068'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T09:56:49.920168', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '1a81cf2e-a8e4-11f0-9707-fa163e99780b', 'monotonic_time': 11826.076941031, 'message_signature': 'ac5242912ced1cac7aba4c1e937af5f9767975b84e0fd58f902bcadfdc54da68'}]}, 'timestamp': '2025-10-14 09:56:49.921234', '_unique_id': '66a2c4dc18ca493797ae9c410487fa84'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.922 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.922 12 ERROR oslo_messaging.notify.messaging yield
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.922 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.922 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.922 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.922 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.922 12 ERROR oslo_messaging.notify.messaging conn.connect()
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.922 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.922 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.922 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.922 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.922 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.922 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.922 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.922 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.922 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.922 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.922 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.922 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.922 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.922 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.922 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.922 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.922 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.922 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.922 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.922 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.922 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.922 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.922 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.922 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.923 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.923 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.925 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e8957974-8911-4a3c-a060-3bff2f9c4564', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T09:56:49.923589', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': '1a823f0e-a8e4-11f0-9707-fa163e99780b', 'monotonic_time': 11826.040601589, 'message_signature': 'b29415b6e8e225e672e170a7eb7a0d0fe86ab9c0f3b62321df035a3587b7b873'}]}, 'timestamp': '2025-10-14 09:56:49.924123', '_unique_id': 'fee223984d4f4d91a4dc94e12142b1a1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.925 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.925 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.925 12 ERROR oslo_messaging.notify.messaging yield
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.925 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.925 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.925 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.925 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.925 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.925 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.925 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.925 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.925 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.925 12 ERROR oslo_messaging.notify.messaging conn.connect()
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.925 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.925 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.925 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.925 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.925 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.925 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.925 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.925 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.925 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.925 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.925 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.925 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.925 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.925 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.925 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.925 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.925 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.925 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.925 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.925 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.925 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.925 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.925 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.925 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.925 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.925 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.925 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.925 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.925 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.925 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.925 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.925 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.925 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.925 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.925 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.925 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.925 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.925 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.925 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.925 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.925 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.925 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.926 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.outgoing.packets volume: 129 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.926 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6e6a9014-3c75-48e5-9666-d5f12ff8128f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 129, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T09:56:49.926042', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': '1a8299a4-a8e4-11f0-9707-fa163e99780b', 'monotonic_time': 11826.040601589, 'message_signature': '5c1c5ac8c9cdea8eb302a5f7f6a9ae34df2b738556f4e42134f96af06c63fbdc'}]}, 'timestamp': '2025-10-14 09:56:49.926334', '_unique_id': '0ca3e31e65db4020a33367f6d9c4c81b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.926 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.926 12 ERROR oslo_messaging.notify.messaging yield
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.926 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.926 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.926 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.926 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.926 12 ERROR oslo_messaging.notify.messaging conn.connect()
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.926 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.926 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.926 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.926 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.926 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.926 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.926 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.926 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.926 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.926 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.926 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.926 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.926 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.926 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.926 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.926 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.926 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 05:56:49 
localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.926 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.926 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.926 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.926 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.926 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.926 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.926 12 ERROR oslo_messaging.notify.messaging Oct 14 05:56:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.927 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.927 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.927 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.read.bytes volume: 29130240 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.928 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.read.bytes volume: 4300800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.929 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '83ecdefb-3f02-4344-b95b-ce05a75ecaf0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 29130240, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T09:56:49.927916', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '1a82e27e-a8e4-11f0-9707-fa163e99780b', 'monotonic_time': 11826.010838457, 'message_signature': '63be7f3203aa6d08a8a687df55e808ba0603e60f48489faf772be5f36fbd0cad'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 4300800, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T09:56:49.927916', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '1a82ecb0-a8e4-11f0-9707-fa163e99780b', 'monotonic_time': 11826.010838457, 'message_signature': '50f11aac4cb1794c587ddd321c2e8ef75a9298f8cce5eb8a3067121cf2ec169d'}]}, 'timestamp': '2025-10-14 09:56:49.928440', '_unique_id': '3d8f1b034a92450c8757eacf4b2c1f36'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.929 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.929 12 ERROR oslo_messaging.notify.messaging yield Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.929 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.929 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.929 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.929 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.929 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.929 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.929 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 
05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.929 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.929 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.929 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.929 12 ERROR oslo_messaging.notify.messaging Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.929 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.929 12 ERROR oslo_messaging.notify.messaging Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.929 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.929 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 05:56:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.929 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.929 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.929 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.929 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.929 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.929 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.929 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.929 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.929 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.929 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:56:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.929 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.929 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.929 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.929 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.929 12 ERROR oslo_messaging.notify.messaging Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.929 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.929 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.930 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '93a7e0c9-5769-4e6b-a065-f09d09e51ba6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T09:56:49.929793', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': '1a832c0c-a8e4-11f0-9707-fa163e99780b', 'monotonic_time': 11826.040601589, 'message_signature': '5ef496ee6ae8141e12e9d05cd40b2b62886ecc2a65de6991621f120579a13dce'}]}, 'timestamp': '2025-10-14 09:56:49.930079', '_unique_id': 'b23a8afa52c646548a57cbae0192b0d0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.930 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:56:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.930 12 ERROR oslo_messaging.notify.messaging yield Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.930 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.930 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.930 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.930 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.930 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.930 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.930 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.930 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.930 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.930 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.930 12 ERROR oslo_messaging.notify.messaging Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.930 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.930 12 ERROR oslo_messaging.notify.messaging Oct 14 05:56:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.930 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.930 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.930 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.930 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.930 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.930 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.930 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.930 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.930 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.930 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 05:56:49 
localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.930 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.930 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.930 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.930 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.930 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.930 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.930 12 ERROR oslo_messaging.notify.messaging Oct 14 05:56:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.931 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.931 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.931 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.932 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '164a2f87-a3d3-431e-bdb4-74187db3a5e4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T09:56:49.931361', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '1a83691a-a8e4-11f0-9707-fa163e99780b', 'monotonic_time': 11826.076941031, 'message_signature': '0af634d2d88619efe5645c46edbc648e6046d6d75b9154a24bc698c0e020717a'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T09:56:49.931361', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '1a8373e2-a8e4-11f0-9707-fa163e99780b', 'monotonic_time': 11826.076941031, 'message_signature': '9333a0644be89ad3c977e77cfa31f05ca65ccc7a2a54025f2db8b0107d7fbbd7'}]}, 'timestamp': '2025-10-14 09:56:49.931901', '_unique_id': '442f60eac2024851beccd3d9e071888e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.932 12 ERROR 
oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.932 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.932 12 ERROR oslo_messaging.notify.messaging yield Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.932 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.932 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.932 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.932 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.932 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.932 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.932 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.932 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 05:56:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.932 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.932 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.932 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.932 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.932 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.932 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.932 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.932 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.932 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.932 12 ERROR oslo_messaging.notify.messaging Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.932 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 
2025-10-14 09:56:49.932 12 ERROR oslo_messaging.notify.messaging Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.932 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.932 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.932 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.932 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.932 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.932 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.932 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.932 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.932 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.932 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.932 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.932 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.932 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.932 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.932 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.932 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.932 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.932 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.932 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.932 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.932 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.932 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.932 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.932 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.932 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.932 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.932 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.932 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.932 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.932 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:56:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.932 12 ERROR oslo_messaging.notify.messaging Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.933 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.933 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.write.requests volume: 525 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.933 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.934 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '95460965-f270-49ed-91f0-3b11624425ff', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 525, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T09:56:49.933235', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '1a83b24e-a8e4-11f0-9707-fa163e99780b', 'monotonic_time': 11826.010838457, 'message_signature': '7ac9ea6435b929d9dfdd66f19ae8a0a9f6d570e48cd7032062b0d1a2a2d13f38'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T09:56:49.933235', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '1a83bbea-a8e4-11f0-9707-fa163e99780b', 'monotonic_time': 11826.010838457, 'message_signature': 'c86ba4586052c1505523009467de2c525cccd37746b982e6daca25b833e48be3'}]}, 'timestamp': '2025-10-14 09:56:49.933763', '_unique_id': '5cb2d51320e34012a4f833643b928f28'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.934 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.934 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.934 12 ERROR oslo_messaging.notify.messaging yield Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.934 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.934 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.934 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.934 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.934 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.934 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.934 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.934 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.934 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.934 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.934 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.934 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.934 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 
05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.934 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.934 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.934 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.934 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.934 12 ERROR oslo_messaging.notify.messaging Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.934 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.934 12 ERROR oslo_messaging.notify.messaging Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.934 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.934 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.934 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.934 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 05:56:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.934 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.934 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.934 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.934 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.934 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.934 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.934 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.934 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.934 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.934 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.934 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.934 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.934 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.934 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.934 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.934 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.934 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.934 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.934 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.934 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:56:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.934 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.934 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.934 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.934 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.934 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.934 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.934 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.934 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.935 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.incoming.packets volume: 88 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.935 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '232f274e-66f3-4ebd-9ee8-56e53ec84d0e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 88, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T09:56:49.935077', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': '1a83fa74-a8e4-11f0-9707-fa163e99780b', 'monotonic_time': 11826.040601589, 'message_signature': 'ce0ee8f1ec829d6b15fbaa7bc176cfabe1111d2e1f74e34eaa0ffc21b010711a'}]}, 'timestamp': '2025-10-14 09:56:49.935366', '_unique_id': 'a6aaf9f839734b3083e970855919af04'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.935 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.935 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.935 12 ERROR oslo_messaging.notify.messaging yield
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.935 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.935 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.935 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.935 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.935 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.935 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.935 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.935 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.935 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.935 12 ERROR oslo_messaging.notify.messaging conn.connect()
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.935 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.935 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.935 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.935 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.935 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.935 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.935 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.935 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.935 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.935 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.935 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.935 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.935 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.935 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.935 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.935 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.935 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.935 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.935 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.935 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.935 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.935 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.935 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.935 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.935 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.935 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.935 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.935 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.935 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.935 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.935 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.935 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.935 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.935 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.935 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.935 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.935 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.935 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.935 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.935 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 05:56:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:56:49.935 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:56:50 localhost nova_compute[297686]: 2025-10-14 09:56:50.180 2 DEBUG oslo_concurrency.lockutils [None req-8869782d-a6f0-4e99-8359-221982f385b5 9d85e6ce130c46ec855f37147dbb08b4 41187b090f3d4818a32baa37ce8a3991 - - default default] Acquiring lock "88c4e366-b765-47a6-96bf-f7677f2ce67c" by "nova.compute.manager.ComputeManager.stop_instance..do_stop_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct 14 05:56:50 localhost nova_compute[297686]: 2025-10-14 09:56:50.180 2 DEBUG oslo_concurrency.lockutils [None req-8869782d-a6f0-4e99-8359-221982f385b5 9d85e6ce130c46ec855f37147dbb08b4 41187b090f3d4818a32baa37ce8a3991 - - default default] Lock "88c4e366-b765-47a6-96bf-f7677f2ce67c" acquired by "nova.compute.manager.ComputeManager.stop_instance..do_stop_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct 14 05:56:50 localhost nova_compute[297686]: 2025-10-14 09:56:50.181 2 DEBUG nova.compute.manager [None req-8869782d-a6f0-4e99-8359-221982f385b5 9d85e6ce130c46ec855f37147dbb08b4 41187b090f3d4818a32baa37ce8a3991 - - default default] [instance: 88c4e366-b765-47a6-96bf-f7677f2ce67c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct 14 05:56:50 localhost nova_compute[297686]: 2025-10-14 09:56:50.185 2 DEBUG nova.compute.manager [None req-8869782d-a6f0-4e99-8359-221982f385b5 9d85e6ce130c46ec855f37147dbb08b4 41187b090f3d4818a32baa37ce8a3991 - - default default] [instance: 88c4e366-b765-47a6-96bf-f7677f2ce67c] Stopping instance; current vm_state: active, current task_state: powering-off, current DB power_state: 1, current VM power_state: 1 do_stop_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3338#033[00m
Oct 14 05:56:50 localhost nova_compute[297686]: 2025-10-14 09:56:50.189 2 DEBUG nova.objects.instance [None req-8869782d-a6f0-4e99-8359-221982f385b5 9d85e6ce130c46ec855f37147dbb08b4 41187b090f3d4818a32baa37ce8a3991 - - default default] Lazy-loading 'flavor' on Instance uuid 88c4e366-b765-47a6-96bf-f7677f2ce67c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct 14 05:56:50 localhost nova_compute[297686]: 2025-10-14 09:56:50.238 2 DEBUG nova.virt.libvirt.driver [None req-8869782d-a6f0-4e99-8359-221982f385b5 9d85e6ce130c46ec855f37147dbb08b4 41187b090f3d4818a32baa37ce8a3991 - - default default] [instance: 88c4e366-b765-47a6-96bf-f7677f2ce67c] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Oct 14 05:56:52 localhost nova_compute[297686]: 2025-10-14 09:56:52.231 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 14 05:56:52 localhost kernel: device tap3ec9b060-f4 left promiscuous mode
Oct 14 05:56:52 localhost NetworkManager[5977]: [1760435812.7249] device (tap3ec9b060-f4): state change: disconnected -> unmanaged (reason 'unmanaged', sys-iface-state: 'removed')
Oct 14 05:56:52 localhost ovn_controller[157396]: 2025-10-14T09:56:52Z|00056|binding|INFO|Releasing lport 3ec9b060-f43d-4698-9c76-6062c70911d5 from this chassis (sb_readonly=0)
Oct 14 05:56:52 localhost ovn_controller[157396]: 2025-10-14T09:56:52Z|00057|binding|INFO|Setting lport 3ec9b060-f43d-4698-9c76-6062c70911d5 down in Southbound
Oct 14 05:56:52 localhost ovn_controller[157396]: 2025-10-14T09:56:52Z|00058|binding|INFO|Removing iface tap3ec9b060-f4 ovn-installed in OVS
Oct 14 05:56:52 localhost nova_compute[297686]: 2025-10-14 09:56:52.734 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 14 05:56:52 localhost nova_compute[297686]: 2025-10-14 09:56:52.737 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 14 05:56:52 localhost ovn_metadata_agent[163050]: 2025-10-14 09:56:52.748 163055 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:84:5e:e5 192.168.0.46'], port_security=['fa:16:3e:84:5e:e5 192.168.0.46'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005486733.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '192.168.0.46/24', 'neutron:device_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'neutron:device_owner': 'compute:nova', 'neutron:host_id': 'np0005486733.localdomain', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7d0cd696-bdd7-4e70-9512-eb0d23640314', 'neutron:port_capabilities': '', 'neutron:port_fip': '192.168.122.20', 'neutron:port_name': '', 'neutron:project_id': '41187b090f3d4818a32baa37ce8a3991', 'neutron:revision_number': '6', 'neutron:security_group_ids': '313d605c-14d3-4f16-b913-a4f55afa256e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0d31a249-7ee5-4da6-a9d1-dab19bbf097c, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[], logical_port=3ec9b060-f43d-4698-9c76-6062c70911d5) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct 14 05:56:52 localhost systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000002.scope: Deactivated successfully.
Oct 14 05:56:52 localhost nova_compute[297686]: 2025-10-14 09:56:52.750 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 14 05:56:52 localhost systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000002.scope: Consumed 4min 14.846s CPU time.
Oct 14 05:56:52 localhost ovn_metadata_agent[163050]: 2025-10-14 09:56:52.751 163055 INFO neutron.agent.ovn.metadata.agent [-] Port 3ec9b060-f43d-4698-9c76-6062c70911d5 in datapath 7d0cd696-bdd7-4e70-9512-eb0d23640314 unbound from our chassis#033[00m
Oct 14 05:56:52 localhost systemd-machined[84684]: Machine qemu-1-instance-00000002 terminated.
Oct 14 05:56:52 localhost ovn_metadata_agent[163050]: 2025-10-14 09:56:52.754 163055 DEBUG neutron.agent.ovn.metadata.agent [-] Port 85b3ca3e-8aac-4a2f-8ce5-3542f4a390a8 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m
Oct 14 05:56:52 localhost ovn_metadata_agent[163050]: 2025-10-14 09:56:52.754 163055 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7d0cd696-bdd7-4e70-9512-eb0d23640314, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct 14 05:56:52 localhost ovn_metadata_agent[163050]: 2025-10-14 09:56:52.756 163159 DEBUG oslo.privsep.daemon [-] privsep: reply[7d261fed-c36f-433c-8c09-2aa227d9a7c2]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct 14 05:56:52 localhost ovn_metadata_agent[163050]: 2025-10-14 09:56:52.758 163055 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-7d0cd696-bdd7-4e70-9512-eb0d23640314 namespace which is not needed anymore#033[00m
Oct 14 05:56:52 localhost neutron-haproxy-ovnmeta-7d0cd696-bdd7-4e70-9512-eb0d23640314[272445]: [NOTICE] (272449) : haproxy version is 2.8.14-c23fe91
Oct 14 05:56:52 localhost neutron-haproxy-ovnmeta-7d0cd696-bdd7-4e70-9512-eb0d23640314[272445]: [NOTICE] (272449) : path to executable is /usr/sbin/haproxy
Oct 14 05:56:52 localhost neutron-haproxy-ovnmeta-7d0cd696-bdd7-4e70-9512-eb0d23640314[272445]: [WARNING] (272449) : Exiting Master process...
Oct 14 05:56:52 localhost neutron-haproxy-ovnmeta-7d0cd696-bdd7-4e70-9512-eb0d23640314[272445]: [ALERT] (272449) : Current worker (272451) exited with code 143 (Terminated)
Oct 14 05:56:52 localhost neutron-haproxy-ovnmeta-7d0cd696-bdd7-4e70-9512-eb0d23640314[272445]: [WARNING] (272449) : All workers exited. Exiting... (0)
Oct 14 05:56:52 localhost systemd[1]: libpod-4b838135b4c72f9f42d47a726a45b40fda069a31a29065794f51235ffa0efcb9.scope: Deactivated successfully.
Oct 14 05:56:52 localhost podman[298386]: 2025-10-14 09:56:52.933209199 +0000 UTC m=+0.069394180 container died 4b838135b4c72f9f42d47a726a45b40fda069a31a29065794f51235ffa0efcb9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7d0cd696-bdd7-4e70-9512-eb0d23640314, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Oct 14 05:56:52 localhost nova_compute[297686]: 2025-10-14 09:56:52.949 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 14 05:56:52 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-4b838135b4c72f9f42d47a726a45b40fda069a31a29065794f51235ffa0efcb9-userdata-shm.mount: Deactivated successfully.
Oct 14 05:56:52 localhost nova_compute[297686]: 2025-10-14 09:56:52.957 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 14 05:56:52 localhost systemd[1]: var-lib-containers-storage-overlay-f85672eda8adebaf08be6b9a7103250b5ec8f68390658a7a5c3d072f58e6ea08-merged.mount: Deactivated successfully.
Oct 14 05:56:52 localhost podman[298386]: 2025-10-14 09:56:52.982833141 +0000 UTC m=+0.119018042 container cleanup 4b838135b4c72f9f42d47a726a45b40fda069a31a29065794f51235ffa0efcb9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7d0cd696-bdd7-4e70-9512-eb0d23640314, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_managed=true, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct 14 05:56:53 localhost podman[298400]: 2025-10-14 09:56:53.00865058 +0000 UTC m=+0.071995563 container cleanup 4b838135b4c72f9f42d47a726a45b40fda069a31a29065794f51235ffa0efcb9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7d0cd696-bdd7-4e70-9512-eb0d23640314, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 14 05:56:53 localhost systemd[1]: libpod-conmon-4b838135b4c72f9f42d47a726a45b40fda069a31a29065794f51235ffa0efcb9.scope: Deactivated successfully.
Oct 14 05:56:53 localhost podman[298424]: 2025-10-14 09:56:53.063408285 +0000 UTC m=+0.062143080 container remove 4b838135b4c72f9f42d47a726a45b40fda069a31a29065794f51235ffa0efcb9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7d0cd696-bdd7-4e70-9512-eb0d23640314, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct 14 05:56:53 localhost ovn_metadata_agent[163050]: 2025-10-14 09:56:53.067 163159 DEBUG oslo.privsep.daemon [-] privsep: reply[ce6bf64e-0c1b-484d-9ae7-849a7c505cc9]: (4, ('Tue Oct 14 09:56:52 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-7d0cd696-bdd7-4e70-9512-eb0d23640314 (4b838135b4c72f9f42d47a726a45b40fda069a31a29065794f51235ffa0efcb9)\n4b838135b4c72f9f42d47a726a45b40fda069a31a29065794f51235ffa0efcb9\nTue Oct 14 09:56:52 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-7d0cd696-bdd7-4e70-9512-eb0d23640314 (4b838135b4c72f9f42d47a726a45b40fda069a31a29065794f51235ffa0efcb9)\n4b838135b4c72f9f42d47a726a45b40fda069a31a29065794f51235ffa0efcb9\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct 14 05:56:53 localhost ovn_metadata_agent[163050]: 2025-10-14 09:56:53.069 163159 DEBUG oslo.privsep.daemon [-] privsep: reply[bc7531a4-e4a3-451b-a244-35f3f80519d9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct 14 05:56:53 localhost ovn_metadata_agent[163050]: 2025-10-14 09:56:53.070 163055 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7d0cd696-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct 14 05:56:53 localhost nova_compute[297686]: 2025-10-14 09:56:53.071 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 14 05:56:53 localhost kernel: device tap7d0cd696-b0 left promiscuous mode
Oct 14 05:56:53 localhost nova_compute[297686]: 2025-10-14 09:56:53.083 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 14 05:56:53 localhost ovn_metadata_agent[163050]: 2025-10-14 09:56:53.088 163159 DEBUG oslo.privsep.daemon [-] privsep: reply[2e0699e4-bc6e-495e-96f4-a0a81b5daf14]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct 14 05:56:53 localhost ovn_metadata_agent[163050]: 2025-10-14 09:56:53.100 163159 DEBUG oslo.privsep.daemon [-] privsep: reply[5f6deb8e-c079-4a13-af1e-dd94f4aed244]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct 14 05:56:53 localhost ovn_metadata_agent[163050]: 2025-10-14 09:56:53.102 163159 DEBUG oslo.privsep.daemon [-] privsep: reply[5d360b1b-56e8-42ce-91f2-80509df8809c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct 14 05:56:53 localhost ovn_metadata_agent[163050]: 2025-10-14 09:56:53.113 163159 DEBUG oslo.privsep.daemon [-] privsep: reply[73cc987f-3625-4184-ad67-36853391c6ed]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_QDISC', 'noqueue'], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 1, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 705859, 'reachable_time': 22067, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 37, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}]], 'header': {'length': 1356, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 298445, 'error': None, 'target': 'ovnmeta-7d0cd696-bdd7-4e70-9512-eb0d23640314', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct 14 05:56:53 localhost ovn_metadata_agent[163050]: 2025-10-14 09:56:53.122 163190 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-7d0cd696-bdd7-4e70-9512-eb0d23640314 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct 14 05:56:53 localhost ovn_metadata_agent[163050]: 2025-10-14 09:56:53.123 163190 DEBUG oslo.privsep.daemon [-] privsep: reply[0999b7a3-3ad0-45ad-8e9d-646f0ef0991d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct 14 05:56:53 localhost nova_compute[297686]: 2025-10-14 09:56:53.253 2 INFO nova.virt.libvirt.driver [None req-8869782d-a6f0-4e99-8359-221982f385b5 9d85e6ce130c46ec855f37147dbb08b4 41187b090f3d4818a32baa37ce8a3991 - - default default] [instance: 88c4e366-b765-47a6-96bf-f7677f2ce67c] Instance shutdown successfully after 3 seconds.#033[00m
Oct 14 05:56:53 localhost nova_compute[297686]: 2025-10-14 09:56:53.261 2 INFO nova.virt.libvirt.driver [-] [instance: 88c4e366-b765-47a6-96bf-f7677f2ce67c] Instance destroyed successfully.#033[00m
Oct 14 05:56:53 localhost nova_compute[297686]: 2025-10-14 09:56:53.262 2 DEBUG nova.objects.instance [None req-8869782d-a6f0-4e99-8359-221982f385b5 9d85e6ce130c46ec855f37147dbb08b4 41187b090f3d4818a32baa37ce8a3991 - - default default] Lazy-loading 'numa_topology' on Instance uuid 88c4e366-b765-47a6-96bf-f7677f2ce67c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct 14 05:56:53 localhost ovn_metadata_agent[163050]: 2025-10-14 09:56:53.305 163055 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=5, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'b6:6b:50', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '6a:59:81:01:bc:8b'}, ipsec=False) old=SB_Global(nb_cfg=4) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct 14 05:56:53 localhost nova_compute[297686]: 2025-10-14 09:56:53.305 2 DEBUG nova.compute.manager [None req-8869782d-a6f0-4e99-8359-221982f385b5 9d85e6ce130c46ec855f37147dbb08b4 41187b090f3d4818a32baa37ce8a3991 - - default default] [instance: 88c4e366-b765-47a6-96bf-f7677f2ce67c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct 14 05:56:53 localhost ovn_metadata_agent[163050]: 2025-10-14 09:56:53.306 163055 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct 14 05:56:53 localhost nova_compute[297686]: 2025-10-14 09:56:53.307 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 14 05:56:53 localhost nova_compute[297686]: 2025-10-14 09:56:53.391 2 DEBUG oslo_concurrency.lockutils [None req-8869782d-a6f0-4e99-8359-221982f385b5 9d85e6ce130c46ec855f37147dbb08b4 41187b090f3d4818a32baa37ce8a3991 - - default default] Lock "88c4e366-b765-47a6-96bf-f7677f2ce67c" "released" by "nova.compute.manager.ComputeManager.stop_instance..do_stop_instance" :: held 3.210s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct 14 05:56:53 localhost nova_compute[297686]: 2025-10-14 09:56:53.394 2 DEBUG nova.compute.manager [req-54728aaf-d553-4938-86c5-87db6f9c5be0 req-d1d27e97-ad62-4542-b627-4ec49edb579a da5827fb8ee54b95a0a3cf62fcdcc49a f669ac1a1893421f91ae49881790edbc - - default default] [instance: 88c4e366-b765-47a6-96bf-f7677f2ce67c] Received event network-vif-unplugged-3ec9b060-f43d-4698-9c76-6062c70911d5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct 14 05:56:53 localhost nova_compute[297686]: 2025-10-14 09:56:53.394 2 DEBUG oslo_concurrency.lockutils [req-54728aaf-d553-4938-86c5-87db6f9c5be0 req-d1d27e97-ad62-4542-b627-4ec49edb579a da5827fb8ee54b95a0a3cf62fcdcc49a f669ac1a1893421f91ae49881790edbc - - default default] Acquiring lock "88c4e366-b765-47a6-96bf-f7677f2ce67c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct 14 05:56:53 localhost nova_compute[297686]: 2025-10-14 09:56:53.394 2 DEBUG oslo_concurrency.lockutils [req-54728aaf-d553-4938-86c5-87db6f9c5be0 req-d1d27e97-ad62-4542-b627-4ec49edb579a da5827fb8ee54b95a0a3cf62fcdcc49a f669ac1a1893421f91ae49881790edbc - - default default] Lock "88c4e366-b765-47a6-96bf-f7677f2ce67c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct 14 05:56:53 localhost nova_compute[297686]: 2025-10-14 09:56:53.395 2 DEBUG oslo_concurrency.lockutils [req-54728aaf-d553-4938-86c5-87db6f9c5be0 req-d1d27e97-ad62-4542-b627-4ec49edb579a da5827fb8ee54b95a0a3cf62fcdcc49a f669ac1a1893421f91ae49881790edbc - - default default] Lock "88c4e366-b765-47a6-96bf-f7677f2ce67c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct 14 05:56:53 localhost nova_compute[297686]: 2025-10-14 09:56:53.395 2 DEBUG nova.compute.manager [req-54728aaf-d553-4938-86c5-87db6f9c5be0 req-d1d27e97-ad62-4542-b627-4ec49edb579a da5827fb8ee54b95a0a3cf62fcdcc49a f669ac1a1893421f91ae49881790edbc - - default default] [instance: 88c4e366-b765-47a6-96bf-f7677f2ce67c] No waiting events found dispatching network-vif-unplugged-3ec9b060-f43d-4698-9c76-6062c70911d5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct 14 05:56:53 localhost nova_compute[297686]: 2025-10-14 09:56:53.395 2 WARNING nova.compute.manager [req-54728aaf-d553-4938-86c5-87db6f9c5be0 req-d1d27e97-ad62-4542-b627-4ec49edb579a da5827fb8ee54b95a0a3cf62fcdcc49a f669ac1a1893421f91ae49881790edbc - - default default] [instance: 88c4e366-b765-47a6-96bf-f7677f2ce67c] Received unexpected event network-vif-unplugged-3ec9b060-f43d-4698-9c76-6062c70911d5 for instance with vm_state stopped and task_state None.#033[00m
Oct 14 05:56:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64715 DF PROTO=TCP SPT=54114 DPT=9102 SEQ=2983141316 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B1937450000000001030307)
Oct 14 05:56:53 localhost nova_compute[297686]: 2025-10-14 09:56:53.695 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 14 05:56:53 localhost systemd[1]: run-netns-ovnmeta\x2d7d0cd696\x2dbdd7\x2d4e70\x2d9512\x2deb0d23640314.mount: Deactivated successfully.
Oct 14 05:56:54 localhost ovn_metadata_agent[163050]: 2025-10-14 09:56:54.309 163055 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=9e4b0f79-1220-4c7d-a18d-fa1a88dab362, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '5'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Oct 14 05:56:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64716 DF PROTO=TCP SPT=54114 DPT=9102 SEQ=2983141316 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B193B5A0000000001030307) Oct 14 05:56:55 localhost nova_compute[297686]: 2025-10-14 09:56:55.446 2 DEBUG nova.compute.manager [req-7900b832-4b18-40b7-883d-6738e115ea29 req-e7067360-e9c8-4e84-9213-622367616952 da5827fb8ee54b95a0a3cf62fcdcc49a f669ac1a1893421f91ae49881790edbc - - default default] [instance: 88c4e366-b765-47a6-96bf-f7677f2ce67c] Received event network-vif-plugged-3ec9b060-f43d-4698-9c76-6062c70911d5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m Oct 14 05:56:55 localhost nova_compute[297686]: 2025-10-14 09:56:55.447 2 DEBUG oslo_concurrency.lockutils [req-7900b832-4b18-40b7-883d-6738e115ea29 req-e7067360-e9c8-4e84-9213-622367616952 da5827fb8ee54b95a0a3cf62fcdcc49a f669ac1a1893421f91ae49881790edbc - - default default] Acquiring lock "88c4e366-b765-47a6-96bf-f7677f2ce67c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Oct 14 05:56:55 localhost nova_compute[297686]: 2025-10-14 09:56:55.447 2 DEBUG oslo_concurrency.lockutils [req-7900b832-4b18-40b7-883d-6738e115ea29 req-e7067360-e9c8-4e84-9213-622367616952 da5827fb8ee54b95a0a3cf62fcdcc49a 
f669ac1a1893421f91ae49881790edbc - - default default] Lock "88c4e366-b765-47a6-96bf-f7677f2ce67c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Oct 14 05:56:55 localhost nova_compute[297686]: 2025-10-14 09:56:55.447 2 DEBUG oslo_concurrency.lockutils [req-7900b832-4b18-40b7-883d-6738e115ea29 req-e7067360-e9c8-4e84-9213-622367616952 da5827fb8ee54b95a0a3cf62fcdcc49a f669ac1a1893421f91ae49881790edbc - - default default] Lock "88c4e366-b765-47a6-96bf-f7677f2ce67c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Oct 14 05:56:55 localhost nova_compute[297686]: 2025-10-14 09:56:55.448 2 DEBUG nova.compute.manager [req-7900b832-4b18-40b7-883d-6738e115ea29 req-e7067360-e9c8-4e84-9213-622367616952 da5827fb8ee54b95a0a3cf62fcdcc49a f669ac1a1893421f91ae49881790edbc - - default default] [instance: 88c4e366-b765-47a6-96bf-f7677f2ce67c] No waiting events found dispatching network-vif-plugged-3ec9b060-f43d-4698-9c76-6062c70911d5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m Oct 14 05:56:55 localhost nova_compute[297686]: 2025-10-14 09:56:55.448 2 WARNING nova.compute.manager [req-7900b832-4b18-40b7-883d-6738e115ea29 req-e7067360-e9c8-4e84-9213-622367616952 da5827fb8ee54b95a0a3cf62fcdcc49a f669ac1a1893421f91ae49881790edbc - - default default] [instance: 88c4e366-b765-47a6-96bf-f7677f2ce67c] Received unexpected event network-vif-plugged-3ec9b060-f43d-4698-9c76-6062c70911d5 for instance with vm_state stopped and task_state None.#033[00m Oct 14 05:56:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041. 
Oct 14 05:56:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857. Oct 14 05:56:55 localhost systemd[1]: tmp-crun.uFO7K9.mount: Deactivated successfully. Oct 14 05:56:55 localhost podman[298447]: 2025-10-14 09:56:55.756362197 +0000 UTC m=+0.096266202 container health_status 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Oct 14 05:56:55 localhost podman[298448]: 2025-10-14 09:56:55.786897904 +0000 UTC m=+0.124493796 container health_status 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 
'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS) Oct 14 05:56:55 localhost podman[298448]: 2025-10-14 09:56:55.792619905 +0000 UTC m=+0.130215777 container exec_died 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 
'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, container_name=ovn_metadata_agent, org.label-schema.build-date=20251009) Oct 14 05:56:55 localhost systemd[1]: 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857.service: Deactivated successfully. 
Oct 14 05:56:55 localhost podman[298447]: 2025-10-14 09:56:55.813650902 +0000 UTC m=+0.153554897 container exec_died 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm) Oct 14 05:56:55 localhost systemd[1]: 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041.service: Deactivated successfully. Oct 14 05:56:56 localhost nova_compute[297686]: 2025-10-14 09:56:56.318 2 DEBUG nova.compute.manager [None req-e6d9e092-78a6-45ec-9fdd-974c9a9e867e 9d85e6ce130c46ec855f37147dbb08b4 41187b090f3d4818a32baa37ce8a3991 - - default default] [instance: 88c4e366-b765-47a6-96bf-f7677f2ce67c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m Oct 14 05:56:56 localhost nova_compute[297686]: 2025-10-14 09:56:56.339 2 ERROR oslo_messaging.rpc.server [None req-e6d9e092-78a6-45ec-9fdd-974c9a9e867e 9d85e6ce130c46ec855f37147dbb08b4 41187b090f3d4818a32baa37ce8a3991 - - default default] Exception during message handling: nova.exception.InstanceInvalidState: Instance 88c4e366-b765-47a6-96bf-f7677f2ce67c in power state shutdown. 
Cannot get_diagnostics while the instance is in this state. Oct 14 05:56:56 localhost nova_compute[297686]: 2025-10-14 09:56:56.339 2 ERROR oslo_messaging.rpc.server Traceback (most recent call last): Oct 14 05:56:56 localhost nova_compute[297686]: 2025-10-14 09:56:56.339 2 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/server.py", line 165, in _process_incoming Oct 14 05:56:56 localhost nova_compute[297686]: 2025-10-14 09:56:56.339 2 ERROR oslo_messaging.rpc.server res = self.dispatcher.dispatch(message) Oct 14 05:56:56 localhost nova_compute[297686]: 2025-10-14 09:56:56.339 2 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/dispatcher.py", line 309, in dispatch Oct 14 05:56:56 localhost nova_compute[297686]: 2025-10-14 09:56:56.339 2 ERROR oslo_messaging.rpc.server return self._do_dispatch(endpoint, method, ctxt, args) Oct 14 05:56:56 localhost nova_compute[297686]: 2025-10-14 09:56:56.339 2 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/dispatcher.py", line 229, in _do_dispatch Oct 14 05:56:56 localhost nova_compute[297686]: 2025-10-14 09:56:56.339 2 ERROR oslo_messaging.rpc.server result = func(ctxt, **new_args) Oct 14 05:56:56 localhost nova_compute[297686]: 2025-10-14 09:56:56.339 2 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.9/site-packages/nova/exception_wrapper.py", line 71, in wrapped Oct 14 05:56:56 localhost nova_compute[297686]: 2025-10-14 09:56:56.339 2 ERROR oslo_messaging.rpc.server _emit_versioned_exception_notification( Oct 14 05:56:56 localhost nova_compute[297686]: 2025-10-14 09:56:56.339 2 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 227, in __exit__ Oct 14 05:56:56 localhost nova_compute[297686]: 2025-10-14 09:56:56.339 2 ERROR oslo_messaging.rpc.server self.force_reraise() Oct 14 05:56:56 localhost nova_compute[297686]: 2025-10-14 09:56:56.339 2 ERROR 
oslo_messaging.rpc.server File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 200, in force_reraise Oct 14 05:56:56 localhost nova_compute[297686]: 2025-10-14 09:56:56.339 2 ERROR oslo_messaging.rpc.server raise self.value Oct 14 05:56:56 localhost nova_compute[297686]: 2025-10-14 09:56:56.339 2 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.9/site-packages/nova/exception_wrapper.py", line 63, in wrapped Oct 14 05:56:56 localhost nova_compute[297686]: 2025-10-14 09:56:56.339 2 ERROR oslo_messaging.rpc.server return f(self, context, *args, **kw) Oct 14 05:56:56 localhost nova_compute[297686]: 2025-10-14 09:56:56.339 2 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 214, in decorated_function Oct 14 05:56:56 localhost nova_compute[297686]: 2025-10-14 09:56:56.339 2 ERROR oslo_messaging.rpc.server compute_utils.add_instance_fault_from_exc(context, Oct 14 05:56:56 localhost nova_compute[297686]: 2025-10-14 09:56:56.339 2 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 227, in __exit__ Oct 14 05:56:56 localhost nova_compute[297686]: 2025-10-14 09:56:56.339 2 ERROR oslo_messaging.rpc.server self.force_reraise() Oct 14 05:56:56 localhost nova_compute[297686]: 2025-10-14 09:56:56.339 2 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 200, in force_reraise Oct 14 05:56:56 localhost nova_compute[297686]: 2025-10-14 09:56:56.339 2 ERROR oslo_messaging.rpc.server raise self.value Oct 14 05:56:56 localhost nova_compute[297686]: 2025-10-14 09:56:56.339 2 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 203, in decorated_function Oct 14 05:56:56 localhost nova_compute[297686]: 2025-10-14 09:56:56.339 2 ERROR oslo_messaging.rpc.server return function(self, context, *args, **kwargs) Oct 14 05:56:56 localhost nova_compute[297686]: 2025-10-14 
09:56:56.339 2 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 6739, in get_instance_diagnostics Oct 14 05:56:56 localhost nova_compute[297686]: 2025-10-14 09:56:56.339 2 ERROR oslo_messaging.rpc.server raise exception.InstanceInvalidState( Oct 14 05:56:56 localhost nova_compute[297686]: 2025-10-14 09:56:56.339 2 ERROR oslo_messaging.rpc.server nova.exception.InstanceInvalidState: Instance 88c4e366-b765-47a6-96bf-f7677f2ce67c in power state shutdown. Cannot get_diagnostics while the instance is in this state. Oct 14 05:56:56 localhost nova_compute[297686]: 2025-10-14 09:56:56.339 2 ERROR oslo_messaging.rpc.server #033[00m Oct 14 05:56:56 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64717 DF PROTO=TCP SPT=54114 DPT=9102 SEQ=2983141316 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B19435A0000000001030307) Oct 14 05:56:57 localhost nova_compute[297686]: 2025-10-14 09:56:57.234 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:56:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:56:57.765 163055 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Oct 14 05:56:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:56:57.766 163055 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Oct 14 05:56:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:56:57.766 163055 DEBUG 
oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Oct 14 05:56:58 localhost podman[248187]: time="2025-10-14T09:56:58Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Oct 14 05:56:58 localhost podman[248187]: @ - - [14/Oct/2025:09:56:58 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 139971 "" "Go-http-client/1.1" Oct 14 05:56:58 localhost podman[248187]: @ - - [14/Oct/2025:09:56:58 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17866 "" "Go-http-client/1.1" Oct 14 05:56:58 localhost nova_compute[297686]: 2025-10-14 09:56:58.733 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:57:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64718 DF PROTO=TCP SPT=54114 DPT=9102 SEQ=2983141316 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B19531A0000000001030307) Oct 14 05:57:02 localhost nova_compute[297686]: 2025-10-14 09:57:02.236 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:57:03 localhost nova_compute[297686]: 2025-10-14 09:57:03.735 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:57:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29. 
Oct 14 05:57:04 localhost podman[298489]: 2025-10-14 09:57:04.725500605 +0000 UTC m=+0.070897107 container health_status fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, container_name=iscsid, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, tcib_managed=true, org.label-schema.vendor=CentOS) Oct 14 05:57:04 localhost podman[298489]: 2025-10-14 09:57:04.738239069 +0000 UTC m=+0.083635631 container exec_died fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 
(image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.build-date=20251009, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=iscsid, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, io.buildah.version=1.41.3) Oct 14 05:57:04 localhost systemd[1]: fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29.service: Deactivated successfully. Oct 14 05:57:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec. 
Oct 14 05:57:06 localhost podman[298507]: 2025-10-14 09:57:06.734061898 +0000 UTC m=+0.079844801 container health_status aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Oct 14 05:57:06 localhost podman[298507]: 2025-10-14 09:57:06.745082428 +0000 UTC m=+0.090865411 container exec_died aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 
'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Oct 14 05:57:06 localhost systemd[1]: aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec.service: Deactivated successfully. 
Oct 14 05:57:07 localhost nova_compute[297686]: 2025-10-14 09:57:07.282 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:57:07 localhost nova_compute[297686]: 2025-10-14 09:57:07.964 2 DEBUG nova.virt.driver [-] Emitting event Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m Oct 14 05:57:07 localhost nova_compute[297686]: 2025-10-14 09:57:07.964 2 INFO nova.compute.manager [-] [instance: 88c4e366-b765-47a6-96bf-f7677f2ce67c] VM Stopped (Lifecycle Event)#033[00m Oct 14 05:57:07 localhost nova_compute[297686]: 2025-10-14 09:57:07.990 2 DEBUG nova.compute.manager [None req-e35c919b-d958-4955-894b-ee01826f5378 - - - - - -] [instance: 88c4e366-b765-47a6-96bf-f7677f2ce67c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m Oct 14 05:57:07 localhost nova_compute[297686]: 2025-10-14 09:57:07.993 2 DEBUG nova.compute.manager [None req-e35c919b-d958-4955-894b-ee01826f5378 - - - - - -] [instance: 88c4e366-b765-47a6-96bf-f7677f2ce67c] Synchronizing instance power state after lifecycle event "Stopped"; current vm_state: stopped, current task_state: None, current DB power_state: 4, VM power_state: 4 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m Oct 14 05:57:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad. 
Oct 14 05:57:08 localhost podman[298531]: 2025-10-14 09:57:08.702910483 +0000 UTC m=+0.051615296 container health_status 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, tcib_managed=true, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, container_name=multipathd, io.buildah.version=1.41.3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible) Oct 14 05:57:08 localhost podman[298531]: 2025-10-14 09:57:08.718402524 +0000 UTC m=+0.067107387 container exec_died 
02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, org.label-schema.build-date=20251009) Oct 14 05:57:08 localhost systemd[1]: 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad.service: Deactivated successfully. 
Oct 14 05:57:08 localhost openstack_network_exporter[250374]: ERROR 09:57:08 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Oct 14 05:57:08 localhost openstack_network_exporter[250374]: ERROR 09:57:08 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Oct 14 05:57:08 localhost openstack_network_exporter[250374]: ERROR 09:57:08 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Oct 14 05:57:08 localhost openstack_network_exporter[250374]: ERROR 09:57:08 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Oct 14 05:57:08 localhost openstack_network_exporter[250374]: Oct 14 05:57:08 localhost openstack_network_exporter[250374]: ERROR 09:57:08 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Oct 14 05:57:08 localhost openstack_network_exporter[250374]: Oct 14 05:57:08 localhost nova_compute[297686]: 2025-10-14 09:57:08.783 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:57:08 localhost nova_compute[297686]: 2025-10-14 09:57:08.853 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 05:57:08 localhost nova_compute[297686]: 2025-10-14 09:57:08.870 2 DEBUG nova.compute.manager [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Triggering sync for uuid 88c4e366-b765-47a6-96bf-f7677f2ce67c _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m Oct 14 05:57:08 localhost nova_compute[297686]: 2025-10-14 09:57:08.871 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Acquiring lock 
"88c4e366-b765-47a6-96bf-f7677f2ce67c" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Oct 14 05:57:08 localhost nova_compute[297686]: 2025-10-14 09:57:08.872 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Lock "88c4e366-b765-47a6-96bf-f7677f2ce67c" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Oct 14 05:57:08 localhost nova_compute[297686]: 2025-10-14 09:57:08.872 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 05:57:08 localhost nova_compute[297686]: 2025-10-14 09:57:08.895 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Lock "88c4e366-b765-47a6-96bf-f7677f2ce67c" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.024s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Oct 14 05:57:12 localhost nova_compute[297686]: 2025-10-14 09:57:12.306 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 05:57:12 localhost nova_compute[297686]: 2025-10-14 09:57:12.306 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks 
/usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 05:57:12 localhost nova_compute[297686]: 2025-10-14 09:57:12.307 2 DEBUG nova.compute.manager [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Oct 14 05:57:12 localhost nova_compute[297686]: 2025-10-14 09:57:12.307 2 DEBUG nova.compute.manager [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Oct 14 05:57:12 localhost nova_compute[297686]: 2025-10-14 09:57:12.321 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:57:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f. Oct 14 05:57:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917. 
Oct 14 05:57:12 localhost podman[298551]: 2025-10-14 09:57:12.741313993 +0000 UTC m=+0.080735750 container health_status 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.build-date=20251009, tcib_managed=true, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.license=GPLv2, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3) Oct 14 05:57:12 localhost podman[298552]: 2025-10-14 09:57:12.795867601 +0000 UTC m=+0.130518787 container health_status 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, vcs-type=git, architecture=x86_64, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, 
url=https://catalog.redhat.com/en/search?searchType=containers, release=1755695350, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, config_id=edpm, name=ubi9-minimal, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., distribution-scope=public, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b) Oct 14 05:57:12 localhost podman[298551]: 2025-10-14 09:57:12.809173033 +0000 UTC m=+0.148594730 container exec_died 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, io.buildah.version=1.41.3, tcib_managed=true) Oct 14 05:57:12 localhost podman[298552]: 2025-10-14 09:57:12.81097199 +0000 UTC m=+0.145623226 container exec_died 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917 
(image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.buildah.version=1.33.7, release=1755695350, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, version=9.6, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., maintainer=Red Hat, Inc., architecture=x86_64, container_name=openstack_network_exporter, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git) Oct 14 05:57:12 localhost systemd[1]: 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917.service: Deactivated successfully. Oct 14 05:57:12 localhost systemd[1]: 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f.service: Deactivated successfully. Oct 14 05:57:13 localhost nova_compute[297686]: 2025-10-14 09:57:13.822 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:57:14 localhost nova_compute[297686]: 2025-10-14 09:57:13.999 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Acquiring lock "refresh_cache-88c4e366-b765-47a6-96bf-f7677f2ce67c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Oct 14 05:57:14 localhost nova_compute[297686]: 2025-10-14 09:57:14.000 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Acquired lock "refresh_cache-88c4e366-b765-47a6-96bf-f7677f2ce67c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Oct 14 05:57:14 localhost nova_compute[297686]: 2025-10-14 09:57:14.000 2 DEBUG nova.network.neutron [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] [instance: 
88c4e366-b765-47a6-96bf-f7677f2ce67c] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Oct 14 05:57:14 localhost nova_compute[297686]: 2025-10-14 09:57:14.000 2 DEBUG nova.objects.instance [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Lazy-loading 'info_cache' on Instance uuid 88c4e366-b765-47a6-96bf-f7677f2ce67c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Oct 14 05:57:14 localhost nova_compute[297686]: 2025-10-14 09:57:14.476 2 DEBUG nova.network.neutron [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] [instance: 88c4e366-b765-47a6-96bf-f7677f2ce67c] Updating instance_info_cache with network_info: [{"id": "3ec9b060-f43d-4698-9c76-6062c70911d5", "address": "fa:16:3e:84:5e:e5", "network": {"id": "7d0cd696-bdd7-4e70-9512-eb0d23640314", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.46", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "41187b090f3d4818a32baa37ce8a3991", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3ec9b060-f4", "ovs_interfaceid": "3ec9b060-f43d-4698-9c76-6062c70911d5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Oct 14 05:57:14 localhost 
nova_compute[297686]: 2025-10-14 09:57:14.505 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Releasing lock "refresh_cache-88c4e366-b765-47a6-96bf-f7677f2ce67c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Oct 14 05:57:14 localhost nova_compute[297686]: 2025-10-14 09:57:14.506 2 DEBUG nova.compute.manager [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] [instance: 88c4e366-b765-47a6-96bf-f7677f2ce67c] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Oct 14 05:57:14 localhost nova_compute[297686]: 2025-10-14 09:57:14.506 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 05:57:14 localhost nova_compute[297686]: 2025-10-14 09:57:14.507 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 05:57:14 localhost nova_compute[297686]: 2025-10-14 09:57:14.507 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 05:57:14 localhost nova_compute[297686]: 2025-10-14 09:57:14.508 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 05:57:14 localhost nova_compute[297686]: 2025-10-14 
09:57:14.508 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 05:57:14 localhost nova_compute[297686]: 2025-10-14 09:57:14.509 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 05:57:14 localhost nova_compute[297686]: 2025-10-14 09:57:14.509 2 DEBUG nova.compute.manager [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Oct 14 05:57:14 localhost nova_compute[297686]: 2025-10-14 09:57:14.510 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 05:57:14 localhost nova_compute[297686]: 2025-10-14 09:57:14.534 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Oct 14 05:57:14 localhost nova_compute[297686]: 2025-10-14 09:57:14.534 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Oct 14 05:57:14 localhost nova_compute[297686]: 
2025-10-14 09:57:14.535 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Oct 14 05:57:14 localhost nova_compute[297686]: 2025-10-14 09:57:14.535 2 DEBUG nova.compute.resource_tracker [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Auditing locally available compute resources for np0005486733.localdomain (node: np0005486733.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Oct 14 05:57:14 localhost nova_compute[297686]: 2025-10-14 09:57:14.536 2 DEBUG oslo_concurrency.processutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Oct 14 05:57:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d. 
Oct 14 05:57:14 localhost podman[298596]: 2025-10-14 09:57:14.751848538 +0000 UTC m=+0.095875139 container health_status 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image) Oct 14 05:57:14 localhost podman[298596]: 2025-10-14 09:57:14.759702847 +0000 UTC m=+0.103729408 container exec_died 
89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image) Oct 14 05:57:14 localhost systemd[1]: 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d.service: Deactivated successfully. 
Oct 14 05:57:14 localhost nova_compute[297686]: 2025-10-14 09:57:14.990 2 DEBUG oslo_concurrency.processutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.454s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Oct 14 05:57:15 localhost nova_compute[297686]: 2025-10-14 09:57:15.062 2 DEBUG nova.virt.libvirt.driver [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Oct 14 05:57:15 localhost nova_compute[297686]: 2025-10-14 09:57:15.063 2 DEBUG nova.virt.libvirt.driver [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Oct 14 05:57:15 localhost nova_compute[297686]: 2025-10-14 09:57:15.255 2 WARNING nova.virt.libvirt.driver [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Oct 14 05:57:15 localhost nova_compute[297686]: 2025-10-14 09:57:15.257 2 DEBUG nova.compute.resource_tracker [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Hypervisor/Node resource view: name=np0005486733.localdomain free_ram=12230MB free_disk=41.837093353271484GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": 
"1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Oct 14 05:57:15 localhost nova_compute[297686]: 2025-10-14 09:57:15.257 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Oct 14 05:57:15 localhost nova_compute[297686]: 2025-10-14 09:57:15.258 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Oct 14 05:57:15 localhost nova_compute[297686]: 2025-10-14 09:57:15.328 2 DEBUG nova.compute.resource_tracker [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Instance 88c4e366-b765-47a6-96bf-f7677f2ce67c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Oct 14 05:57:15 localhost nova_compute[297686]: 2025-10-14 09:57:15.329 2 DEBUG nova.compute.resource_tracker [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Oct 14 05:57:15 localhost nova_compute[297686]: 2025-10-14 09:57:15.329 2 DEBUG nova.compute.resource_tracker [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Final resource view: name=np0005486733.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Oct 14 05:57:15 localhost nova_compute[297686]: 2025-10-14 09:57:15.376 2 DEBUG oslo_concurrency.processutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Oct 14 05:57:15 localhost nova_compute[297686]: 2025-10-14 09:57:15.837 2 DEBUG oslo_concurrency.processutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.460s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Oct 14 05:57:15 localhost nova_compute[297686]: 2025-10-14 09:57:15.844 2 DEBUG nova.compute.provider_tree [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Inventory has not changed in ProviderTree for provider: 18c24273-aca2-4f08-be57-3188d558235e update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Oct 14 05:57:15 localhost nova_compute[297686]: 2025-10-14 09:57:15.864 2 DEBUG 
nova.scheduler.client.report [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Inventory has not changed for provider 18c24273-aca2-4f08-be57-3188d558235e based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Oct 14 05:57:15 localhost nova_compute[297686]: 2025-10-14 09:57:15.907 2 DEBUG nova.compute.resource_tracker [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Compute_service record updated for np0005486733.localdomain:np0005486733.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Oct 14 05:57:15 localhost nova_compute[297686]: 2025-10-14 09:57:15.908 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.650s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Oct 14 05:57:17 localhost nova_compute[297686]: 2025-10-14 09:57:17.355 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:57:17 localhost nova_compute[297686]: 2025-10-14 09:57:17.727 2 DEBUG nova.compute.manager [None req-491607c6-a58e-444e-8bbc-cc2cdcf0791e 9d85e6ce130c46ec855f37147dbb08b4 41187b090f3d4818a32baa37ce8a3991 - - default default] [instance: 88c4e366-b765-47a6-96bf-f7677f2ce67c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m Oct 14 05:57:17 localhost 
nova_compute[297686]: 2025-10-14 09:57:17.757 2 ERROR oslo_messaging.rpc.server [None req-491607c6-a58e-444e-8bbc-cc2cdcf0791e 9d85e6ce130c46ec855f37147dbb08b4 41187b090f3d4818a32baa37ce8a3991 - - default default] Exception during message handling: nova.exception.InstanceInvalidState: Instance 88c4e366-b765-47a6-96bf-f7677f2ce67c in power state shutdown. Cannot get_diagnostics while the instance is in this state. Oct 14 05:57:17 localhost nova_compute[297686]: 2025-10-14 09:57:17.757 2 ERROR oslo_messaging.rpc.server Traceback (most recent call last): Oct 14 05:57:17 localhost nova_compute[297686]: 2025-10-14 09:57:17.757 2 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/server.py", line 165, in _process_incoming Oct 14 05:57:17 localhost nova_compute[297686]: 2025-10-14 09:57:17.757 2 ERROR oslo_messaging.rpc.server res = self.dispatcher.dispatch(message) Oct 14 05:57:17 localhost nova_compute[297686]: 2025-10-14 09:57:17.757 2 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/dispatcher.py", line 309, in dispatch Oct 14 05:57:17 localhost nova_compute[297686]: 2025-10-14 09:57:17.757 2 ERROR oslo_messaging.rpc.server return self._do_dispatch(endpoint, method, ctxt, args) Oct 14 05:57:17 localhost nova_compute[297686]: 2025-10-14 09:57:17.757 2 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/dispatcher.py", line 229, in _do_dispatch Oct 14 05:57:17 localhost nova_compute[297686]: 2025-10-14 09:57:17.757 2 ERROR oslo_messaging.rpc.server result = func(ctxt, **new_args) Oct 14 05:57:17 localhost nova_compute[297686]: 2025-10-14 09:57:17.757 2 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.9/site-packages/nova/exception_wrapper.py", line 71, in wrapped Oct 14 05:57:17 localhost nova_compute[297686]: 2025-10-14 09:57:17.757 2 ERROR oslo_messaging.rpc.server _emit_versioned_exception_notification( Oct 14 05:57:17 localhost 
nova_compute[297686]: 2025-10-14 09:57:17.757 2 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 227, in __exit__ Oct 14 05:57:17 localhost nova_compute[297686]: 2025-10-14 09:57:17.757 2 ERROR oslo_messaging.rpc.server self.force_reraise() Oct 14 05:57:17 localhost nova_compute[297686]: 2025-10-14 09:57:17.757 2 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 200, in force_reraise Oct 14 05:57:17 localhost nova_compute[297686]: 2025-10-14 09:57:17.757 2 ERROR oslo_messaging.rpc.server raise self.value Oct 14 05:57:17 localhost nova_compute[297686]: 2025-10-14 09:57:17.757 2 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.9/site-packages/nova/exception_wrapper.py", line 63, in wrapped Oct 14 05:57:17 localhost nova_compute[297686]: 2025-10-14 09:57:17.757 2 ERROR oslo_messaging.rpc.server return f(self, context, *args, **kw) Oct 14 05:57:17 localhost nova_compute[297686]: 2025-10-14 09:57:17.757 2 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 214, in decorated_function Oct 14 05:57:17 localhost nova_compute[297686]: 2025-10-14 09:57:17.757 2 ERROR oslo_messaging.rpc.server compute_utils.add_instance_fault_from_exc(context, Oct 14 05:57:17 localhost nova_compute[297686]: 2025-10-14 09:57:17.757 2 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 227, in __exit__ Oct 14 05:57:17 localhost nova_compute[297686]: 2025-10-14 09:57:17.757 2 ERROR oslo_messaging.rpc.server self.force_reraise() Oct 14 05:57:17 localhost nova_compute[297686]: 2025-10-14 09:57:17.757 2 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 200, in force_reraise Oct 14 05:57:17 localhost nova_compute[297686]: 2025-10-14 09:57:17.757 2 ERROR oslo_messaging.rpc.server raise self.value Oct 14 05:57:17 localhost nova_compute[297686]: 
2025-10-14 09:57:17.757 2 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 203, in decorated_function Oct 14 05:57:17 localhost nova_compute[297686]: 2025-10-14 09:57:17.757 2 ERROR oslo_messaging.rpc.server return function(self, context, *args, **kwargs) Oct 14 05:57:17 localhost nova_compute[297686]: 2025-10-14 09:57:17.757 2 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 6739, in get_instance_diagnostics Oct 14 05:57:17 localhost nova_compute[297686]: 2025-10-14 09:57:17.757 2 ERROR oslo_messaging.rpc.server raise exception.InstanceInvalidState( Oct 14 05:57:17 localhost nova_compute[297686]: 2025-10-14 09:57:17.757 2 ERROR oslo_messaging.rpc.server nova.exception.InstanceInvalidState: Instance 88c4e366-b765-47a6-96bf-f7677f2ce67c in power state shutdown. Cannot get_diagnostics while the instance is in this state. Oct 14 05:57:17 localhost nova_compute[297686]: 2025-10-14 09:57:17.757 2 ERROR oslo_messaging.rpc.server #033[00m Oct 14 05:57:18 localhost nova_compute[297686]: 2025-10-14 09:57:18.824 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:57:22 localhost nova_compute[297686]: 2025-10-14 09:57:22.395 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:57:22 localhost ovn_controller[157396]: 2025-10-14T09:57:22Z|00059|memory_trim|INFO|Detected inactivity (last active 30010 ms ago): trimming memory Oct 14 05:57:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16386 DF PROTO=TCP SPT=55182 DPT=9102 SEQ=3320539066 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B19AC740000000001030307) Oct 14 05:57:23 localhost 
nova_compute[297686]: 2025-10-14 09:57:23.861 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:57:24 localhost nova_compute[297686]: 2025-10-14 09:57:24.193 2 DEBUG nova.objects.instance [None req-5fe0563a-1db9-40fc-85a9-d7358c604741 9d85e6ce130c46ec855f37147dbb08b4 41187b090f3d4818a32baa37ce8a3991 - - default default] Lazy-loading 'flavor' on Instance uuid 88c4e366-b765-47a6-96bf-f7677f2ce67c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Oct 14 05:57:24 localhost nova_compute[297686]: 2025-10-14 09:57:24.211 2 DEBUG oslo_concurrency.lockutils [None req-5fe0563a-1db9-40fc-85a9-d7358c604741 9d85e6ce130c46ec855f37147dbb08b4 41187b090f3d4818a32baa37ce8a3991 - - default default] Acquiring lock "refresh_cache-88c4e366-b765-47a6-96bf-f7677f2ce67c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Oct 14 05:57:24 localhost nova_compute[297686]: 2025-10-14 09:57:24.212 2 DEBUG oslo_concurrency.lockutils [None req-5fe0563a-1db9-40fc-85a9-d7358c604741 9d85e6ce130c46ec855f37147dbb08b4 41187b090f3d4818a32baa37ce8a3991 - - default default] Acquired lock "refresh_cache-88c4e366-b765-47a6-96bf-f7677f2ce67c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Oct 14 05:57:24 localhost nova_compute[297686]: 2025-10-14 09:57:24.212 2 DEBUG nova.network.neutron [None req-5fe0563a-1db9-40fc-85a9-d7358c604741 9d85e6ce130c46ec855f37147dbb08b4 41187b090f3d4818a32baa37ce8a3991 - - default default] [instance: 88c4e366-b765-47a6-96bf-f7677f2ce67c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m Oct 14 05:57:24 localhost nova_compute[297686]: 2025-10-14 09:57:24.213 2 DEBUG nova.objects.instance [None req-5fe0563a-1db9-40fc-85a9-d7358c604741 9d85e6ce130c46ec855f37147dbb08b4 41187b090f3d4818a32baa37ce8a3991 - - default 
default] Lazy-loading 'info_cache' on Instance uuid 88c4e366-b765-47a6-96bf-f7677f2ce67c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Oct 14 05:57:24 localhost nova_compute[297686]: 2025-10-14 09:57:24.532 2 DEBUG nova.network.neutron [None req-5fe0563a-1db9-40fc-85a9-d7358c604741 9d85e6ce130c46ec855f37147dbb08b4 41187b090f3d4818a32baa37ce8a3991 - - default default] [instance: 88c4e366-b765-47a6-96bf-f7677f2ce67c] Updating instance_info_cache with network_info: [{"id": "3ec9b060-f43d-4698-9c76-6062c70911d5", "address": "fa:16:3e:84:5e:e5", "network": {"id": "7d0cd696-bdd7-4e70-9512-eb0d23640314", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.46", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "41187b090f3d4818a32baa37ce8a3991", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3ec9b060-f4", "ovs_interfaceid": "3ec9b060-f43d-4698-9c76-6062c70911d5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Oct 14 05:57:24 localhost nova_compute[297686]: 2025-10-14 09:57:24.556 2 DEBUG oslo_concurrency.lockutils [None req-5fe0563a-1db9-40fc-85a9-d7358c604741 9d85e6ce130c46ec855f37147dbb08b4 41187b090f3d4818a32baa37ce8a3991 - - default default] Releasing lock 
"refresh_cache-88c4e366-b765-47a6-96bf-f7677f2ce67c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Oct 14 05:57:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16387 DF PROTO=TCP SPT=55182 DPT=9102 SEQ=3320539066 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B19B09B0000000001030307) Oct 14 05:57:24 localhost nova_compute[297686]: 2025-10-14 09:57:24.580 2 INFO nova.virt.libvirt.driver [-] [instance: 88c4e366-b765-47a6-96bf-f7677f2ce67c] Instance destroyed successfully.#033[00m Oct 14 05:57:24 localhost nova_compute[297686]: 2025-10-14 09:57:24.580 2 DEBUG nova.objects.instance [None req-5fe0563a-1db9-40fc-85a9-d7358c604741 9d85e6ce130c46ec855f37147dbb08b4 41187b090f3d4818a32baa37ce8a3991 - - default default] Lazy-loading 'numa_topology' on Instance uuid 88c4e366-b765-47a6-96bf-f7677f2ce67c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Oct 14 05:57:24 localhost nova_compute[297686]: 2025-10-14 09:57:24.591 2 DEBUG nova.objects.instance [None req-5fe0563a-1db9-40fc-85a9-d7358c604741 9d85e6ce130c46ec855f37147dbb08b4 41187b090f3d4818a32baa37ce8a3991 - - default default] Lazy-loading 'resources' on Instance uuid 88c4e366-b765-47a6-96bf-f7677f2ce67c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Oct 14 05:57:24 localhost nova_compute[297686]: 2025-10-14 09:57:24.601 2 DEBUG nova.virt.libvirt.vif [None req-5fe0563a-1db9-40fc-85a9-d7358c604741 9d85e6ce130c46ec855f37147dbb08b4 41187b090f3d4818a32baa37ce8a3991 - - default default] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-14T08:37:13Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='test',display_name='test',ec2_ids=,ephemeral_gb=1,ephemeral_key_uuid=None,fault=,flavor=Flavor(2),hidden=False,host='np0005486733.localdomain',hostname='test',id=2,image_ref='0c25fd0b-0cde-472d-a2dc-9e548eac7c4d',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=,launch_index=0,launched_at=2025-10-14T08:37:23Z,launched_on='np0005486733.localdomain',locked=False,locked_by=None,memory_mb=512,metadata={},migration_context=,new_flavor=None,node='np0005486733.localdomain',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=4,progress=0,project_id='41187b090f3d4818a32baa37ce8a3991',ramdisk_id='',reservation_id='r-aao7l1tg',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member,admin',image_base_image_ref='0c25fd0b-0cde-472d-a2dc-9e548eac7c4d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='pc-q35-rhel9.0.0',image_hw_pointer_model='usbtablet',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros',image_owner_specified.openstack.sha256='',owner_project_name='admin',owner_user_name='admin'},tags=,task_state='powering-on',terminated_at=None,trusted_certs=,updated_at=2025-10-14T09:56:53Z,user_data=None,user_id='9d85e6ce130c46ec855f37147dbb08b4',uuid=88c4e366-b765-47a6-96bf-f7677f2ce67c,vcpu_model=,vcpus=1,vm_mode=None,vm_state='
stopped') vif={"id": "3ec9b060-f43d-4698-9c76-6062c70911d5", "address": "fa:16:3e:84:5e:e5", "network": {"id": "7d0cd696-bdd7-4e70-9512-eb0d23640314", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.46", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "41187b090f3d4818a32baa37ce8a3991", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3ec9b060-f4", "ovs_interfaceid": "3ec9b060-f43d-4698-9c76-6062c70911d5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m Oct 14 05:57:24 localhost nova_compute[297686]: 2025-10-14 09:57:24.601 2 DEBUG nova.network.os_vif_util [None req-5fe0563a-1db9-40fc-85a9-d7358c604741 9d85e6ce130c46ec855f37147dbb08b4 41187b090f3d4818a32baa37ce8a3991 - - default default] Converting VIF {"id": "3ec9b060-f43d-4698-9c76-6062c70911d5", "address": "fa:16:3e:84:5e:e5", "network": {"id": "7d0cd696-bdd7-4e70-9512-eb0d23640314", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.46", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": 
"192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "41187b090f3d4818a32baa37ce8a3991", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3ec9b060-f4", "ovs_interfaceid": "3ec9b060-f43d-4698-9c76-6062c70911d5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m Oct 14 05:57:24 localhost nova_compute[297686]: 2025-10-14 09:57:24.603 2 DEBUG nova.network.os_vif_util [None req-5fe0563a-1db9-40fc-85a9-d7358c604741 9d85e6ce130c46ec855f37147dbb08b4 41187b090f3d4818a32baa37ce8a3991 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:84:5e:e5,bridge_name='br-int',has_traffic_filtering=True,id=3ec9b060-f43d-4698-9c76-6062c70911d5,network=Network(7d0cd696-bdd7-4e70-9512-eb0d23640314),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3ec9b060-f4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m Oct 14 05:57:24 localhost nova_compute[297686]: 2025-10-14 09:57:24.603 2 DEBUG os_vif [None req-5fe0563a-1db9-40fc-85a9-d7358c604741 9d85e6ce130c46ec855f37147dbb08b4 41187b090f3d4818a32baa37ce8a3991 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:84:5e:e5,bridge_name='br-int',has_traffic_filtering=True,id=3ec9b060-f43d-4698-9c76-6062c70911d5,network=Network(7d0cd696-bdd7-4e70-9512-eb0d23640314),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3ec9b060-f4') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m Oct 14 05:57:24 localhost nova_compute[297686]: 2025-10-14 09:57:24.606 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] 
on fd 18 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:57:24 localhost nova_compute[297686]: 2025-10-14 09:57:24.606 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3ec9b060-f4, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Oct 14 05:57:24 localhost nova_compute[297686]: 2025-10-14 09:57:24.609 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:57:24 localhost nova_compute[297686]: 2025-10-14 09:57:24.610 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:57:24 localhost nova_compute[297686]: 2025-10-14 09:57:24.614 2 INFO os_vif [None req-5fe0563a-1db9-40fc-85a9-d7358c604741 9d85e6ce130c46ec855f37147dbb08b4 41187b090f3d4818a32baa37ce8a3991 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:84:5e:e5,bridge_name='br-int',has_traffic_filtering=True,id=3ec9b060-f43d-4698-9c76-6062c70911d5,network=Network(7d0cd696-bdd7-4e70-9512-eb0d23640314),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3ec9b060-f4')#033[00m Oct 14 05:57:24 localhost nova_compute[297686]: 2025-10-14 09:57:24.617 2 DEBUG nova.virt.libvirt.host [None req-5fe0563a-1db9-40fc-85a9-d7358c604741 9d85e6ce130c46ec855f37147dbb08b4 41187b090f3d4818a32baa37ce8a3991 - - default default] Checking UEFI support for host arch (x86_64) supports_uefi /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1754#033[00m Oct 14 05:57:24 localhost nova_compute[297686]: 2025-10-14 09:57:24.617 2 INFO nova.virt.libvirt.host [None req-5fe0563a-1db9-40fc-85a9-d7358c604741 9d85e6ce130c46ec855f37147dbb08b4 41187b090f3d4818a32baa37ce8a3991 - - default default] UEFI support 
detected#033[00m Oct 14 05:57:24 localhost nova_compute[297686]: 2025-10-14 09:57:24.625 2 DEBUG nova.virt.libvirt.driver [None req-5fe0563a-1db9-40fc-85a9-d7358c604741 9d85e6ce130c46ec855f37147dbb08b4 41187b090f3d4818a32baa37ce8a3991 - - default default] [instance: 88c4e366-b765-47a6-96bf-f7677f2ce67c] Start _get_guest_xml network_info=[{"id": "3ec9b060-f43d-4698-9c76-6062c70911d5", "address": "fa:16:3e:84:5e:e5", "network": {"id": "7d0cd696-bdd7-4e70-9512-eb0d23640314", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.46", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "41187b090f3d4818a32baa37ce8a3991", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3ec9b060-f4", "ovs_interfaceid": "3ec9b060-f43d-4698-9c76-6062c70911d5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.eph0': {'bus': 'virtio', 'dev': 'vdb', 'type': 'disk'}}} image_meta=ImageMeta(checksum=,container_format='bare',created_at=,direct_url=,disk_format='qcow2',id=0c25fd0b-0cde-472d-a2dc-9e548eac7c4d,min_disk=1,min_ram=0,name=,owner=,properties=ImageMetaProps,protected=,size=,status=,tags=,updated_at=,virtual_size=,visibility=) rescue=None 
block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'encrypted': False, 'size': 0, 'boot_index': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_secret_uuid': None, 'guest_format': None, 'device_name': '/dev/vda', 'image_id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}], 'ephemerals': [{'encryption_options': None, 'encryption_format': None, 'encrypted': False, 'size': 1, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_secret_uuid': None, 'guest_format': None, 'device_name': '/dev/vdb'}], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m Oct 14 05:57:24 localhost nova_compute[297686]: 2025-10-14 09:57:24.629 2 WARNING nova.virt.libvirt.driver [None req-5fe0563a-1db9-40fc-85a9-d7358c604741 9d85e6ce130c46ec855f37147dbb08b4 41187b090f3d4818a32baa37ce8a3991 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m Oct 14 05:57:24 localhost nova_compute[297686]: 2025-10-14 09:57:24.632 2 DEBUG nova.virt.libvirt.host [None req-5fe0563a-1db9-40fc-85a9-d7358c604741 9d85e6ce130c46ec855f37147dbb08b4 41187b090f3d4818a32baa37ce8a3991 - - default default] Searching host: 'np0005486733.localdomain' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m Oct 14 05:57:24 localhost nova_compute[297686]: 2025-10-14 09:57:24.633 2 DEBUG nova.virt.libvirt.host [None req-5fe0563a-1db9-40fc-85a9-d7358c604741 9d85e6ce130c46ec855f37147dbb08b4 41187b090f3d4818a32baa37ce8a3991 - - default default] CPU controller missing on host. 
_has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m Oct 14 05:57:24 localhost nova_compute[297686]: 2025-10-14 09:57:24.634 2 DEBUG nova.virt.libvirt.host [None req-5fe0563a-1db9-40fc-85a9-d7358c604741 9d85e6ce130c46ec855f37147dbb08b4 41187b090f3d4818a32baa37ce8a3991 - - default default] Searching host: 'np0005486733.localdomain' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m Oct 14 05:57:24 localhost nova_compute[297686]: 2025-10-14 09:57:24.635 2 DEBUG nova.virt.libvirt.host [None req-5fe0563a-1db9-40fc-85a9-d7358c604741 9d85e6ce130c46ec855f37147dbb08b4 41187b090f3d4818a32baa37ce8a3991 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m Oct 14 05:57:24 localhost nova_compute[297686]: 2025-10-14 09:57:24.636 2 DEBUG nova.virt.libvirt.driver [None req-5fe0563a-1db9-40fc-85a9-d7358c604741 9d85e6ce130c46ec855f37147dbb08b4 41187b090f3d4818a32baa37ce8a3991 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m Oct 14 05:57:24 localhost nova_compute[297686]: 2025-10-14 09:57:24.636 2 DEBUG nova.virt.hardware [None req-5fe0563a-1db9-40fc-85a9-d7358c604741 9d85e6ce130c46ec855f37147dbb08b4 41187b090f3d4818a32baa37ce8a3991 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-14T08:36:19Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=1,extra_specs={},flavorid='36e4c2a8-ca99-4c45-8719-dd5129265531',id=2,is_public=True,memory_mb=512,name='m1.small',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta 
ImageMeta(checksum=,container_format='bare',created_at=,direct_url=,disk_format='qcow2',id=0c25fd0b-0cde-472d-a2dc-9e548eac7c4d,min_disk=1,min_ram=0,name=,owner=,properties=ImageMetaProps,protected=,size=,status=,tags=,updated_at=,virtual_size=,visibility=), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m Oct 14 05:57:24 localhost nova_compute[297686]: 2025-10-14 09:57:24.637 2 DEBUG nova.virt.hardware [None req-5fe0563a-1db9-40fc-85a9-d7358c604741 9d85e6ce130c46ec855f37147dbb08b4 41187b090f3d4818a32baa37ce8a3991 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m Oct 14 05:57:24 localhost nova_compute[297686]: 2025-10-14 09:57:24.641 2 DEBUG nova.virt.hardware [None req-5fe0563a-1db9-40fc-85a9-d7358c604741 9d85e6ce130c46ec855f37147dbb08b4 41187b090f3d4818a32baa37ce8a3991 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m Oct 14 05:57:24 localhost nova_compute[297686]: 2025-10-14 09:57:24.642 2 DEBUG nova.virt.hardware [None req-5fe0563a-1db9-40fc-85a9-d7358c604741 9d85e6ce130c46ec855f37147dbb08b4 41187b090f3d4818a32baa37ce8a3991 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m Oct 14 05:57:24 localhost nova_compute[297686]: 2025-10-14 09:57:24.643 2 DEBUG nova.virt.hardware [None req-5fe0563a-1db9-40fc-85a9-d7358c604741 9d85e6ce130c46ec855f37147dbb08b4 41187b090f3d4818a32baa37ce8a3991 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m Oct 14 05:57:24 localhost nova_compute[297686]: 2025-10-14 09:57:24.643 2 DEBUG nova.virt.hardware [None req-5fe0563a-1db9-40fc-85a9-d7358c604741 9d85e6ce130c46ec855f37147dbb08b4 41187b090f3d4818a32baa37ce8a3991 - - default default] 
Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m Oct 14 05:57:24 localhost nova_compute[297686]: 2025-10-14 09:57:24.644 2 DEBUG nova.virt.hardware [None req-5fe0563a-1db9-40fc-85a9-d7358c604741 9d85e6ce130c46ec855f37147dbb08b4 41187b090f3d4818a32baa37ce8a3991 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m Oct 14 05:57:24 localhost nova_compute[297686]: 2025-10-14 09:57:24.644 2 DEBUG nova.virt.hardware [None req-5fe0563a-1db9-40fc-85a9-d7358c604741 9d85e6ce130c46ec855f37147dbb08b4 41187b090f3d4818a32baa37ce8a3991 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m Oct 14 05:57:24 localhost nova_compute[297686]: 2025-10-14 09:57:24.644 2 DEBUG nova.virt.hardware [None req-5fe0563a-1db9-40fc-85a9-d7358c604741 9d85e6ce130c46ec855f37147dbb08b4 41187b090f3d4818a32baa37ce8a3991 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m Oct 14 05:57:24 localhost nova_compute[297686]: 2025-10-14 09:57:24.645 2 DEBUG nova.virt.hardware [None req-5fe0563a-1db9-40fc-85a9-d7358c604741 9d85e6ce130c46ec855f37147dbb08b4 41187b090f3d4818a32baa37ce8a3991 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m Oct 14 05:57:24 localhost nova_compute[297686]: 2025-10-14 09:57:24.645 2 DEBUG nova.virt.hardware [None req-5fe0563a-1db9-40fc-85a9-d7358c604741 9d85e6ce130c46ec855f37147dbb08b4 41187b090f3d4818a32baa37ce8a3991 - - default default] 
Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m Oct 14 05:57:24 localhost nova_compute[297686]: 2025-10-14 09:57:24.646 2 DEBUG nova.objects.instance [None req-5fe0563a-1db9-40fc-85a9-d7358c604741 9d85e6ce130c46ec855f37147dbb08b4 41187b090f3d4818a32baa37ce8a3991 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 88c4e366-b765-47a6-96bf-f7677f2ce67c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Oct 14 05:57:24 localhost nova_compute[297686]: 2025-10-14 09:57:24.664 2 DEBUG nova.privsep.utils [None req-5fe0563a-1db9-40fc-85a9-d7358c604741 9d85e6ce130c46ec855f37147dbb08b4 41187b090f3d4818a32baa37ce8a3991 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m Oct 14 05:57:24 localhost nova_compute[297686]: 2025-10-14 09:57:24.664 2 DEBUG oslo_concurrency.processutils [None req-5fe0563a-1db9-40fc-85a9-d7358c604741 9d85e6ce130c46ec855f37147dbb08b4 41187b090f3d4818a32baa37ce8a3991 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Oct 14 05:57:25 localhost nova_compute[297686]: 2025-10-14 09:57:25.107 2 DEBUG oslo_concurrency.processutils [None req-5fe0563a-1db9-40fc-85a9-d7358c604741 9d85e6ce130c46ec855f37147dbb08b4 41187b090f3d4818a32baa37ce8a3991 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.442s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Oct 14 05:57:25 localhost nova_compute[297686]: 2025-10-14 09:57:25.109 2 DEBUG oslo_concurrency.processutils [None req-5fe0563a-1db9-40fc-85a9-d7358c604741 9d85e6ce130c46ec855f37147dbb08b4 
41187b090f3d4818a32baa37ce8a3991 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Oct 14 05:57:25 localhost nova_compute[297686]: 2025-10-14 09:57:25.520 2 DEBUG oslo_concurrency.processutils [None req-5fe0563a-1db9-40fc-85a9-d7358c604741 9d85e6ce130c46ec855f37147dbb08b4 41187b090f3d4818a32baa37ce8a3991 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.412s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Oct 14 05:57:25 localhost nova_compute[297686]: 2025-10-14 09:57:25.522 2 DEBUG nova.virt.libvirt.vif [None req-5fe0563a-1db9-40fc-85a9-d7358c604741 9d85e6ce130c46ec855f37147dbb08b4 41187b090f3d4818a32baa37ce8a3991 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-14T08:37:13Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='test',display_name='test',ec2_ids=,ephemeral_gb=1,ephemeral_key_uuid=None,fault=,flavor=Flavor(2),hidden=False,host='np0005486733.localdomain',hostname='test',id=2,image_ref='0c25fd0b-0cde-472d-a2dc-9e548eac7c4d',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=,launch_index=0,launched_at=2025-10-14T08:37:23Z,launched_on='np0005486733.localdomain',locked=False,locked_by=None,memory_mb=512,metadata={},migration_context=,new_flavor=None,node='np0005486733.localdomain',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=4,progress=0,project_id='41187b090f3d4818a32baa37ce8a3991',ramdisk_id='',reservation_id='r-aao7l1tg',resources=None,root_device_name='/d
ev/vda',root_gb=1,security_groups=,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member,admin',image_base_image_ref='0c25fd0b-0cde-472d-a2dc-9e548eac7c4d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='pc-q35-rhel9.0.0',image_hw_pointer_model='usbtablet',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros',image_owner_specified.openstack.sha256='',owner_project_name='admin',owner_user_name='admin'},tags=,task_state='powering-on',terminated_at=None,trusted_certs=,updated_at=2025-10-14T09:56:53Z,user_data=None,user_id='9d85e6ce130c46ec855f37147dbb08b4',uuid=88c4e366-b765-47a6-96bf-f7677f2ce67c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "3ec9b060-f43d-4698-9c76-6062c70911d5", "address": "fa:16:3e:84:5e:e5", "network": {"id": "7d0cd696-bdd7-4e70-9512-eb0d23640314", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.46", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "41187b090f3d4818a32baa37ce8a3991", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3ec9b060-f4", "ovs_interfaceid": "3ec9b060-f43d-4698-9c76-6062c70911d5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, 
"delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m Oct 14 05:57:25 localhost nova_compute[297686]: 2025-10-14 09:57:25.523 2 DEBUG nova.network.os_vif_util [None req-5fe0563a-1db9-40fc-85a9-d7358c604741 9d85e6ce130c46ec855f37147dbb08b4 41187b090f3d4818a32baa37ce8a3991 - - default default] Converting VIF {"id": "3ec9b060-f43d-4698-9c76-6062c70911d5", "address": "fa:16:3e:84:5e:e5", "network": {"id": "7d0cd696-bdd7-4e70-9512-eb0d23640314", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.46", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "41187b090f3d4818a32baa37ce8a3991", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3ec9b060-f4", "ovs_interfaceid": "3ec9b060-f43d-4698-9c76-6062c70911d5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m Oct 14 05:57:25 localhost nova_compute[297686]: 2025-10-14 09:57:25.524 2 DEBUG nova.network.os_vif_util [None req-5fe0563a-1db9-40fc-85a9-d7358c604741 9d85e6ce130c46ec855f37147dbb08b4 41187b090f3d4818a32baa37ce8a3991 - - default default] Converted object 
VIFOpenVSwitch(active=False,address=fa:16:3e:84:5e:e5,bridge_name='br-int',has_traffic_filtering=True,id=3ec9b060-f43d-4698-9c76-6062c70911d5,network=Network(7d0cd696-bdd7-4e70-9512-eb0d23640314),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3ec9b060-f4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m Oct 14 05:57:25 localhost nova_compute[297686]: 2025-10-14 09:57:25.525 2 DEBUG nova.objects.instance [None req-5fe0563a-1db9-40fc-85a9-d7358c604741 9d85e6ce130c46ec855f37147dbb08b4 41187b090f3d4818a32baa37ce8a3991 - - default default] Lazy-loading 'pci_devices' on Instance uuid 88c4e366-b765-47a6-96bf-f7677f2ce67c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Oct 14 05:57:25 localhost nova_compute[297686]: 2025-10-14 09:57:25.539 2 DEBUG nova.virt.libvirt.driver [None req-5fe0563a-1db9-40fc-85a9-d7358c604741 9d85e6ce130c46ec855f37147dbb08b4 41187b090f3d4818a32baa37ce8a3991 - - default default] [instance: 88c4e366-b765-47a6-96bf-f7677f2ce67c] End _get_guest_xml xml= [multi-line libvirt guest domain XML elided: each XML line was emitted as its own "Oct 14 05:57:25 localhost nova_compute[297686]:" journald record, and the XML markup was stripped in this capture; surviving element text includes uuid 88c4e366-b765-47a6-96bf-f7677f2ce67c, name instance-00000002, memory 524288, currentMemory 512, 1 vCPU, display name 'test', creation time 2025-10-14 09:57:24, owner/project 'admin'/'admin', sysinfo manufacturer 'RDO', product 'OpenStack Compute', version '27.5.2-0.20250829104910.6f8decf.el9', serial/uuid 88c4e366-b765-47a6-96bf-f7677f2ce67c, family 'Virtual Machine', os type 'hvm', and rng backend /dev/urandom] _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m Oct 14 05:57:25 localhost nova_compute[297686]: 2025-10-14 09:57:25.540 2 DEBUG nova.virt.libvirt.driver [None req-5fe0563a-1db9-40fc-85a9-d7358c604741 9d85e6ce130c46ec855f37147dbb08b4 41187b090f3d4818a32baa37ce8a3991 - - default default] skipping disk for
instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Oct 14 05:57:25 localhost nova_compute[297686]: 2025-10-14 09:57:25.541 2 DEBUG nova.virt.libvirt.driver [None req-5fe0563a-1db9-40fc-85a9-d7358c604741 9d85e6ce130c46ec855f37147dbb08b4 41187b090f3d4818a32baa37ce8a3991 - - default default] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Oct 14 05:57:25 localhost nova_compute[297686]: 2025-10-14 09:57:25.541 2 DEBUG nova.virt.libvirt.vif [None req-5fe0563a-1db9-40fc-85a9-d7358c604741 9d85e6ce130c46ec855f37147dbb08b4 41187b090f3d4818a32baa37ce8a3991 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-14T08:37:13Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='test',display_name='test',ec2_ids=,ephemeral_gb=1,ephemeral_key_uuid=None,fault=,flavor=Flavor(2),hidden=False,host='np0005486733.localdomain',hostname='test',id=2,image_ref='0c25fd0b-0cde-472d-a2dc-9e548eac7c4d',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=,launch_index=0,launched_at=2025-10-14T08:37:23Z,launched_on='np0005486733.localdomain',locked=False,locked_by=None,memory_mb=512,metadata={},migration_context=,new_flavor=None,node='np0005486733.localdomain',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=,power_state=4,progress=0,project_id='41187b090f3d4818a32baa37ce8a3991',ramdisk_id='',reservation_id='r-aao7l1tg',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=,services=,shutdown_terminate=False,system_metad
ata={boot_roles='reader,member,admin',image_base_image_ref='0c25fd0b-0cde-472d-a2dc-9e548eac7c4d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='pc-q35-rhel9.0.0',image_hw_pointer_model='usbtablet',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros',image_owner_specified.openstack.sha256='',owner_project_name='admin',owner_user_name='admin'},tags=,task_state='powering-on',terminated_at=None,trusted_certs=,updated_at=2025-10-14T09:56:53Z,user_data=None,user_id='9d85e6ce130c46ec855f37147dbb08b4',uuid=88c4e366-b765-47a6-96bf-f7677f2ce67c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "3ec9b060-f43d-4698-9c76-6062c70911d5", "address": "fa:16:3e:84:5e:e5", "network": {"id": "7d0cd696-bdd7-4e70-9512-eb0d23640314", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.46", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "41187b090f3d4818a32baa37ce8a3991", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3ec9b060-f4", "ovs_interfaceid": "3ec9b060-f43d-4698-9c76-6062c70911d5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug 
/usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m Oct 14 05:57:25 localhost nova_compute[297686]: 2025-10-14 09:57:25.542 2 DEBUG nova.network.os_vif_util [None req-5fe0563a-1db9-40fc-85a9-d7358c604741 9d85e6ce130c46ec855f37147dbb08b4 41187b090f3d4818a32baa37ce8a3991 - - default default] Converting VIF {"id": "3ec9b060-f43d-4698-9c76-6062c70911d5", "address": "fa:16:3e:84:5e:e5", "network": {"id": "7d0cd696-bdd7-4e70-9512-eb0d23640314", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.46", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "41187b090f3d4818a32baa37ce8a3991", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3ec9b060-f4", "ovs_interfaceid": "3ec9b060-f43d-4698-9c76-6062c70911d5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m Oct 14 05:57:25 localhost nova_compute[297686]: 2025-10-14 09:57:25.542 2 DEBUG nova.network.os_vif_util [None req-5fe0563a-1db9-40fc-85a9-d7358c604741 9d85e6ce130c46ec855f37147dbb08b4 41187b090f3d4818a32baa37ce8a3991 - - default default] Converted object 
VIFOpenVSwitch(active=False,address=fa:16:3e:84:5e:e5,bridge_name='br-int',has_traffic_filtering=True,id=3ec9b060-f43d-4698-9c76-6062c70911d5,network=Network(7d0cd696-bdd7-4e70-9512-eb0d23640314),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3ec9b060-f4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m Oct 14 05:57:25 localhost nova_compute[297686]: 2025-10-14 09:57:25.543 2 DEBUG os_vif [None req-5fe0563a-1db9-40fc-85a9-d7358c604741 9d85e6ce130c46ec855f37147dbb08b4 41187b090f3d4818a32baa37ce8a3991 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:84:5e:e5,bridge_name='br-int',has_traffic_filtering=True,id=3ec9b060-f43d-4698-9c76-6062c70911d5,network=Network(7d0cd696-bdd7-4e70-9512-eb0d23640314),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3ec9b060-f4') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m Oct 14 05:57:25 localhost nova_compute[297686]: 2025-10-14 09:57:25.543 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:57:25 localhost nova_compute[297686]: 2025-10-14 09:57:25.544 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Oct 14 05:57:25 localhost nova_compute[297686]: 2025-10-14 09:57:25.544 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m Oct 14 05:57:25 localhost nova_compute[297686]: 2025-10-14 09:57:25.547 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:57:25 
localhost nova_compute[297686]: 2025-10-14 09:57:25.547 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3ec9b060-f4, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Oct 14 05:57:25 localhost nova_compute[297686]: 2025-10-14 09:57:25.547 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap3ec9b060-f4, col_values=(('external_ids', {'iface-id': '3ec9b060-f43d-4698-9c76-6062c70911d5', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:84:5e:e5', 'vm-uuid': '88c4e366-b765-47a6-96bf-f7677f2ce67c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Oct 14 05:57:25 localhost nova_compute[297686]: 2025-10-14 09:57:25.549 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:57:25 localhost nova_compute[297686]: 2025-10-14 09:57:25.551 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Oct 14 05:57:25 localhost nova_compute[297686]: 2025-10-14 09:57:25.555 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:57:25 localhost nova_compute[297686]: 2025-10-14 09:57:25.555 2 INFO os_vif [None req-5fe0563a-1db9-40fc-85a9-d7358c604741 9d85e6ce130c46ec855f37147dbb08b4 41187b090f3d4818a32baa37ce8a3991 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:84:5e:e5,bridge_name='br-int',has_traffic_filtering=True,id=3ec9b060-f43d-4698-9c76-6062c70911d5,network=Network(7d0cd696-bdd7-4e70-9512-eb0d23640314),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3ec9b060-f4')#033[00m Oct 14 
05:57:25 localhost systemd[1]: Starting libvirt secret daemon... Oct 14 05:57:25 localhost systemd[1]: Started libvirt secret daemon. Oct 14 05:57:25 localhost kernel: device tap3ec9b060-f4 entered promiscuous mode Oct 14 05:57:25 localhost NetworkManager[5977]: [1760435845.6591] manager: (tap3ec9b060-f4): new Tun device (/org/freedesktop/NetworkManager/Devices/17) Oct 14 05:57:25 localhost ovn_controller[157396]: 2025-10-14T09:57:25Z|00060|binding|INFO|Claiming lport 3ec9b060-f43d-4698-9c76-6062c70911d5 for this chassis. Oct 14 05:57:25 localhost ovn_controller[157396]: 2025-10-14T09:57:25Z|00061|binding|INFO|3ec9b060-f43d-4698-9c76-6062c70911d5: Claiming fa:16:3e:84:5e:e5 192.168.0.46 Oct 14 05:57:25 localhost nova_compute[297686]: 2025-10-14 09:57:25.659 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:57:25 localhost systemd-udevd[298727]: Network interface NamePolicy= disabled on kernel command line. 
Oct 14 05:57:25 localhost ovn_controller[157396]: 2025-10-14T09:57:25Z|00062|binding|INFO|Setting lport 3ec9b060-f43d-4698-9c76-6062c70911d5 ovn-installed in OVS Oct 14 05:57:25 localhost nova_compute[297686]: 2025-10-14 09:57:25.671 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:57:25 localhost ovn_controller[157396]: 2025-10-14T09:57:25Z|00063|binding|INFO|Setting lport 3ec9b060-f43d-4698-9c76-6062c70911d5 up in Southbound Oct 14 05:57:25 localhost nova_compute[297686]: 2025-10-14 09:57:25.674 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:57:25 localhost ovn_metadata_agent[163050]: 2025-10-14 09:57:25.674 163055 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:84:5e:e5 192.168.0.46'], port_security=['fa:16:3e:84:5e:e5 192.168.0.46'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005486733.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '192.168.0.46/24', 'neutron:device_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7d0cd696-bdd7-4e70-9512-eb0d23640314', 'neutron:port_capabilities': '', 'neutron:port_fip': '192.168.122.20', 'neutron:port_name': '', 'neutron:project_id': '41187b090f3d4818a32baa37ce8a3991', 'neutron:revision_number': '7', 'neutron:security_group_ids': '313d605c-14d3-4f16-b913-a4f55afa256e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], 
datapath=0d31a249-7ee5-4da6-a9d1-dab19bbf097c, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[], logical_port=3ec9b060-f43d-4698-9c76-6062c70911d5) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Oct 14 05:57:25 localhost NetworkManager[5977]: [1760435845.6777] device (tap3ec9b060-f4): state change: unmanaged -> unavailable (reason 'connection-assumed', sys-iface-state: 'external') Oct 14 05:57:25 localhost NetworkManager[5977]: [1760435845.6787] device (tap3ec9b060-f4): state change: unavailable -> disconnected (reason 'none', sys-iface-state: 'external') Oct 14 05:57:25 localhost ovn_metadata_agent[163050]: 2025-10-14 09:57:25.677 163055 INFO neutron.agent.ovn.metadata.agent [-] Port 3ec9b060-f43d-4698-9c76-6062c70911d5 in datapath 7d0cd696-bdd7-4e70-9512-eb0d23640314 bound to our chassis#033[00m Oct 14 05:57:25 localhost ovn_metadata_agent[163050]: 2025-10-14 09:57:25.679 163055 DEBUG neutron.agent.ovn.metadata.agent [-] Port 85b3ca3e-8aac-4a2f-8ce5-3542f4a390a8 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Oct 14 05:57:25 localhost ovn_metadata_agent[163050]: 2025-10-14 09:57:25.680 163055 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 7d0cd696-bdd7-4e70-9512-eb0d23640314#033[00m Oct 14 05:57:25 localhost ovn_metadata_agent[163050]: 2025-10-14 09:57:25.687 163159 DEBUG oslo.privsep.daemon [-] privsep: reply[b15e5477-69df-4189-91d2-fd458d29bce6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Oct 14 05:57:25 localhost ovn_metadata_agent[163050]: 2025-10-14 09:57:25.690 163055 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap7d0cd696-b1 in ovnmeta-7d0cd696-bdd7-4e70-9512-eb0d23640314 namespace provision_datapath 
/usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m Oct 14 05:57:25 localhost ovn_metadata_agent[163050]: 2025-10-14 09:57:25.691 163159 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap7d0cd696-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m Oct 14 05:57:25 localhost ovn_metadata_agent[163050]: 2025-10-14 09:57:25.691 163159 DEBUG oslo.privsep.daemon [-] privsep: reply[039d8f7e-8fe2-47e0-a32b-84dc5c3bc9b3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Oct 14 05:57:25 localhost ovn_metadata_agent[163050]: 2025-10-14 09:57:25.692 163159 DEBUG oslo.privsep.daemon [-] privsep: reply[d41b8fe5-d73c-4e7b-94c2-07d993edce61]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Oct 14 05:57:25 localhost ovn_metadata_agent[163050]: 2025-10-14 09:57:25.708 163190 DEBUG oslo.privsep.daemon [-] privsep: reply[1c8f1b5a-053b-4034-8a12-8553c6007d2a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Oct 14 05:57:25 localhost nova_compute[297686]: 2025-10-14 09:57:25.720 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:57:25 localhost ovn_metadata_agent[163050]: 2025-10-14 09:57:25.721 163159 DEBUG oslo.privsep.daemon [-] privsep: reply[fffe5fe7-9365-4d6a-bc30-9cbacc17880e]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Oct 14 05:57:25 localhost nova_compute[297686]: 2025-10-14 09:57:25.733 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:57:25 localhost systemd-machined[84684]: New machine qemu-2-instance-00000002. 
Oct 14 05:57:25 localhost ovn_metadata_agent[163050]: 2025-10-14 09:57:25.752 163170 DEBUG oslo.privsep.daemon [-] privsep: reply[cadd047b-4947-4d9f-a184-6585f6bfca44]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Oct 14 05:57:25 localhost systemd[1]: Started Virtual Machine qemu-2-instance-00000002. Oct 14 05:57:25 localhost ovn_metadata_agent[163050]: 2025-10-14 09:57:25.758 163159 DEBUG oslo.privsep.daemon [-] privsep: reply[887e150a-9360-432b-a7c8-00b4ab91846a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Oct 14 05:57:25 localhost NetworkManager[5977]: [1760435845.7598] manager: (tap7d0cd696-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/18) Oct 14 05:57:25 localhost ovn_metadata_agent[163050]: 2025-10-14 09:57:25.789 163170 DEBUG oslo.privsep.daemon [-] privsep: reply[e78e9ea0-2b35-418f-9eca-2337f724b850]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Oct 14 05:57:25 localhost ovn_metadata_agent[163050]: 2025-10-14 09:57:25.793 163170 DEBUG oslo.privsep.daemon [-] privsep: reply[6179b03a-c0b5-4640-9a71-5795cdb54226]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Oct 14 05:57:25 localhost NetworkManager[5977]: [1760435845.8154] device (tap7d0cd696-b0): carrier: link connected Oct 14 05:57:25 localhost kernel: IPv6: ADDRCONF(NETDEV_CHANGE): tap7d0cd696-b1: link becomes ready Oct 14 05:57:25 localhost kernel: IPv6: ADDRCONF(NETDEV_CHANGE): tap7d0cd696-b0: link becomes ready Oct 14 05:57:25 localhost ovn_metadata_agent[163050]: 2025-10-14 09:57:25.821 163170 DEBUG oslo.privsep.daemon [-] privsep: reply[37a8b171-c11f-4388-b378-1a9c11acc600]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Oct 14 05:57:25 localhost ovn_metadata_agent[163050]: 2025-10-14 09:57:25.842 163159 DEBUG oslo.privsep.daemon [-] privsep: 
reply[0a7b70a5-8a32-40ea-9d63-606e5d084049]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7d0cd696-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_QDISC', 'noqueue'], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['IFLA_ADDRESS', 'fa:16:3e:7e:3c:60'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], 
['IFLA_LINK', 19], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 1, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1186193, 'reachable_time': 34788, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 37, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 
'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}]], 'header': {'length': 1400, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 298763, 'error': None, 'target': 'ovnmeta-7d0cd696-bdd7-4e70-9512-eb0d23640314', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Oct 14 05:57:25 localhost ovn_metadata_agent[163050]: 2025-10-14 09:57:25.858 163159 DEBUG oslo.privsep.daemon [-] privsep: reply[e5637325-0144-458e-bbbf-e0cf8134bf42]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe7e:3c60'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 1186193, 'tstamp': 1186193}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 298765, 'error': None, 'target': 'ovnmeta-7d0cd696-bdd7-4e70-9512-eb0d23640314', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Oct 14 05:57:25 localhost ovn_metadata_agent[163050]: 2025-10-14 09:57:25.877 163159 DEBUG oslo.privsep.daemon [-] privsep: reply[080ad0e8-094e-4715-9ee6-24deecb562f2]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7d0cd696-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], 
['IFLA_GRO_MAX_SIZE', 65536], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_QDISC', 'noqueue'], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['IFLA_ADDRESS', 'fa:16:3e:7e:3c:60'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 19], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 1, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 
'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1186193, 'reachable_time': 34788, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 37, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}]], 'header': {'length': 1400, 'type': 16, 'flags': 0, 'sequence_number': 
255, 'pid': 298766, 'error': None, 'target': 'ovnmeta-7d0cd696-bdd7-4e70-9512-eb0d23640314', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Oct 14 05:57:25 localhost ovn_metadata_agent[163050]: 2025-10-14 09:57:25.910 163159 DEBUG oslo.privsep.daemon [-] privsep: reply[b348ad96-f6d8-4ca9-803e-58d98597d696]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Oct 14 05:57:25 localhost ovn_metadata_agent[163050]: 2025-10-14 09:57:25.978 163159 DEBUG oslo.privsep.daemon [-] privsep: reply[7dc2f5ef-728e-44cf-b8fd-9ce2648d4c9e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Oct 14 05:57:25 localhost ovn_metadata_agent[163050]: 2025-10-14 09:57:25.980 163055 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7d0cd696-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Oct 14 05:57:25 localhost ovn_metadata_agent[163050]: 2025-10-14 09:57:25.981 163055 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m Oct 14 05:57:25 localhost ovn_metadata_agent[163050]: 2025-10-14 09:57:25.981 163055 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7d0cd696-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Oct 14 05:57:25 localhost nova_compute[297686]: 2025-10-14 09:57:25.984 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:57:25 localhost kernel: device tap7d0cd696-b0 entered promiscuous mode Oct 14 05:57:25 localhost 
nova_compute[297686]: 2025-10-14 09:57:25.988 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:57:25 localhost ovn_metadata_agent[163050]: 2025-10-14 09:57:25.991 163055 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap7d0cd696-b0, col_values=(('external_ids', {'iface-id': '25c6586a-239c-451b-aac2-e0a3ee5c3145'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Oct 14 05:57:25 localhost nova_compute[297686]: 2025-10-14 09:57:25.992 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:57:25 localhost ovn_controller[157396]: 2025-10-14T09:57:25Z|00064|binding|INFO|Releasing lport 25c6586a-239c-451b-aac2-e0a3ee5c3145 from this chassis (sb_readonly=0) Oct 14 05:57:25 localhost nova_compute[297686]: 2025-10-14 09:57:25.994 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:57:25 localhost ovn_metadata_agent[163050]: 2025-10-14 09:57:25.996 163055 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/7d0cd696-bdd7-4e70-9512-eb0d23640314.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/7d0cd696-bdd7-4e70-9512-eb0d23640314.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m Oct 14 05:57:26 localhost nova_compute[297686]: 2025-10-14 09:57:26.000 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:57:26 localhost ovn_metadata_agent[163050]: 2025-10-14 09:57:26.000 163159 DEBUG oslo.privsep.daemon [-] privsep: reply[11232471-acfd-417e-bb94-d664d301d600]: (4, 
None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Oct 14 05:57:26 localhost ovn_metadata_agent[163050]: 2025-10-14 09:57:26.001 163055 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = Oct 14 05:57:26 localhost ovn_metadata_agent[163050]: global Oct 14 05:57:26 localhost ovn_metadata_agent[163050]: log /dev/log local0 debug Oct 14 05:57:26 localhost ovn_metadata_agent[163050]: log-tag haproxy-metadata-proxy-7d0cd696-bdd7-4e70-9512-eb0d23640314 Oct 14 05:57:26 localhost ovn_metadata_agent[163050]: user root Oct 14 05:57:26 localhost ovn_metadata_agent[163050]: group root Oct 14 05:57:26 localhost ovn_metadata_agent[163050]: maxconn 1024 Oct 14 05:57:26 localhost ovn_metadata_agent[163050]: pidfile /var/lib/neutron/external/pids/7d0cd696-bdd7-4e70-9512-eb0d23640314.pid.haproxy Oct 14 05:57:26 localhost ovn_metadata_agent[163050]: daemon Oct 14 05:57:26 localhost ovn_metadata_agent[163050]: Oct 14 05:57:26 localhost ovn_metadata_agent[163050]: defaults Oct 14 05:57:26 localhost ovn_metadata_agent[163050]: log global Oct 14 05:57:26 localhost ovn_metadata_agent[163050]: mode http Oct 14 05:57:26 localhost ovn_metadata_agent[163050]: option httplog Oct 14 05:57:26 localhost ovn_metadata_agent[163050]: option dontlognull Oct 14 05:57:26 localhost ovn_metadata_agent[163050]: option http-server-close Oct 14 05:57:26 localhost ovn_metadata_agent[163050]: option forwardfor Oct 14 05:57:26 localhost ovn_metadata_agent[163050]: retries 3 Oct 14 05:57:26 localhost ovn_metadata_agent[163050]: timeout http-request 30s Oct 14 05:57:26 localhost ovn_metadata_agent[163050]: timeout connect 30s Oct 14 05:57:26 localhost ovn_metadata_agent[163050]: timeout client 32s Oct 14 05:57:26 localhost ovn_metadata_agent[163050]: timeout server 32s Oct 14 05:57:26 localhost ovn_metadata_agent[163050]: timeout http-keep-alive 30s Oct 14 05:57:26 localhost ovn_metadata_agent[163050]: Oct 14 05:57:26 localhost ovn_metadata_agent[163050]: Oct 14 05:57:26 
localhost ovn_metadata_agent[163050]: listen listener Oct 14 05:57:26 localhost ovn_metadata_agent[163050]: bind 169.254.169.254:80 Oct 14 05:57:26 localhost ovn_metadata_agent[163050]: server metadata /var/lib/neutron/metadata_proxy Oct 14 05:57:26 localhost ovn_metadata_agent[163050]: http-request add-header X-OVN-Network-ID 7d0cd696-bdd7-4e70-9512-eb0d23640314 Oct 14 05:57:26 localhost ovn_metadata_agent[163050]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m Oct 14 05:57:26 localhost ovn_metadata_agent[163050]: 2025-10-14 09:57:26.003 163055 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-7d0cd696-bdd7-4e70-9512-eb0d23640314', 'env', 'PROCESS_TAG=haproxy-7d0cd696-bdd7-4e70-9512-eb0d23640314', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/7d0cd696-bdd7-4e70-9512-eb0d23640314.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m Oct 14 05:57:26 localhost nova_compute[297686]: 2025-10-14 09:57:26.130 2 DEBUG nova.compute.manager [req-deeb902c-e6dc-4583-a44e-9bb3cdbe4f3c req-13a21088-ab53-46e7-9989-0476b7d1e0de da5827fb8ee54b95a0a3cf62fcdcc49a f669ac1a1893421f91ae49881790edbc - - default default] [instance: 88c4e366-b765-47a6-96bf-f7677f2ce67c] Received event network-vif-plugged-3ec9b060-f43d-4698-9c76-6062c70911d5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m Oct 14 05:57:26 localhost nova_compute[297686]: 2025-10-14 09:57:26.133 2 DEBUG oslo_concurrency.lockutils [req-deeb902c-e6dc-4583-a44e-9bb3cdbe4f3c req-13a21088-ab53-46e7-9989-0476b7d1e0de da5827fb8ee54b95a0a3cf62fcdcc49a f669ac1a1893421f91ae49881790edbc - - default default] Acquiring lock "88c4e366-b765-47a6-96bf-f7677f2ce67c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" inner 
/usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Oct 14 05:57:26 localhost nova_compute[297686]: 2025-10-14 09:57:26.133 2 DEBUG oslo_concurrency.lockutils [req-deeb902c-e6dc-4583-a44e-9bb3cdbe4f3c req-13a21088-ab53-46e7-9989-0476b7d1e0de da5827fb8ee54b95a0a3cf62fcdcc49a f669ac1a1893421f91ae49881790edbc - - default default] Lock "88c4e366-b765-47a6-96bf-f7677f2ce67c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Oct 14 05:57:26 localhost nova_compute[297686]: 2025-10-14 09:57:26.134 2 DEBUG oslo_concurrency.lockutils [req-deeb902c-e6dc-4583-a44e-9bb3cdbe4f3c req-13a21088-ab53-46e7-9989-0476b7d1e0de da5827fb8ee54b95a0a3cf62fcdcc49a f669ac1a1893421f91ae49881790edbc - - default default] Lock "88c4e366-b765-47a6-96bf-f7677f2ce67c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Oct 14 05:57:26 localhost nova_compute[297686]: 2025-10-14 09:57:26.134 2 DEBUG nova.compute.manager [req-deeb902c-e6dc-4583-a44e-9bb3cdbe4f3c req-13a21088-ab53-46e7-9989-0476b7d1e0de da5827fb8ee54b95a0a3cf62fcdcc49a f669ac1a1893421f91ae49881790edbc - - default default] [instance: 88c4e366-b765-47a6-96bf-f7677f2ce67c] No waiting events found dispatching network-vif-plugged-3ec9b060-f43d-4698-9c76-6062c70911d5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m Oct 14 05:57:26 localhost nova_compute[297686]: 2025-10-14 09:57:26.134 2 WARNING nova.compute.manager [req-deeb902c-e6dc-4583-a44e-9bb3cdbe4f3c req-13a21088-ab53-46e7-9989-0476b7d1e0de da5827fb8ee54b95a0a3cf62fcdcc49a f669ac1a1893421f91ae49881790edbc - - default default] [instance: 88c4e366-b765-47a6-96bf-f7677f2ce67c] Received unexpected event network-vif-plugged-3ec9b060-f43d-4698-9c76-6062c70911d5 for 
instance with vm_state stopped and task_state powering-on.#033[00m Oct 14 05:57:26 localhost podman[298841]: Oct 14 05:57:26 localhost podman[298841]: 2025-10-14 09:57:26.464488801 +0000 UTC m=+0.091629384 container create cb6758272cd67ba79bd3e5b1ede700d07ed80d5ab32556608cb3d94835e43d7c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7d0cd696-bdd7-4e70-9512-eb0d23640314, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0) Oct 14 05:57:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041. Oct 14 05:57:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857. Oct 14 05:57:26 localhost snmpd[68005]: IfIndex of an interface changed. Such interfaces will appear multiple times in IF-MIB. 
Oct 14 05:57:26 localhost nova_compute[297686]: 2025-10-14 09:57:26.495 2 DEBUG nova.compute.manager [None req-5fe0563a-1db9-40fc-85a9-d7358c604741 9d85e6ce130c46ec855f37147dbb08b4 41187b090f3d4818a32baa37ce8a3991 - - default default] [instance: 88c4e366-b765-47a6-96bf-f7677f2ce67c] Instance event wait completed in 0 seconds for wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m Oct 14 05:57:26 localhost nova_compute[297686]: 2025-10-14 09:57:26.498 2 DEBUG nova.virt.driver [None req-f54e50da-17e5-40f4-aab2-053c4dced060 - - - - - -] Emitting event Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m Oct 14 05:57:26 localhost nova_compute[297686]: 2025-10-14 09:57:26.499 2 INFO nova.compute.manager [None req-f54e50da-17e5-40f4-aab2-053c4dced060 - - - - - -] [instance: 88c4e366-b765-47a6-96bf-f7677f2ce67c] VM Resumed (Lifecycle Event)#033[00m Oct 14 05:57:26 localhost nova_compute[297686]: 2025-10-14 09:57:26.511 2 INFO nova.virt.libvirt.driver [-] [instance: 88c4e366-b765-47a6-96bf-f7677f2ce67c] Instance rebooted successfully.#033[00m Oct 14 05:57:26 localhost nova_compute[297686]: 2025-10-14 09:57:26.511 2 DEBUG nova.compute.manager [None req-5fe0563a-1db9-40fc-85a9-d7358c604741 9d85e6ce130c46ec855f37147dbb08b4 41187b090f3d4818a32baa37ce8a3991 - - default default] [instance: 88c4e366-b765-47a6-96bf-f7677f2ce67c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m Oct 14 05:57:26 localhost podman[298841]: 2025-10-14 09:57:26.420945202 +0000 UTC m=+0.048085835 image pull quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified Oct 14 05:57:26 localhost systemd[1]: Started libpod-conmon-cb6758272cd67ba79bd3e5b1ede700d07ed80d5ab32556608cb3d94835e43d7c.scope. 
Oct 14 05:57:26 localhost nova_compute[297686]: 2025-10-14 09:57:26.523 2 DEBUG nova.compute.manager [None req-f54e50da-17e5-40f4-aab2-053c4dced060 - - - - - -] [instance: 88c4e366-b765-47a6-96bf-f7677f2ce67c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m Oct 14 05:57:26 localhost nova_compute[297686]: 2025-10-14 09:57:26.541 2 DEBUG nova.compute.manager [None req-f54e50da-17e5-40f4-aab2-053c4dced060 - - - - - -] [instance: 88c4e366-b765-47a6-96bf-f7677f2ce67c] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: stopped, current task_state: powering-on, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m Oct 14 05:57:26 localhost systemd[1]: Started libcrun container. Oct 14 05:57:26 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b2c07139809f0ee18cac27c7933d45edb3cc28ee782e6f062a940c4a2b8fbde1/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Oct 14 05:57:26 localhost nova_compute[297686]: 2025-10-14 09:57:26.571 2 DEBUG nova.virt.driver [None req-f54e50da-17e5-40f4-aab2-053c4dced060 - - - - - -] Emitting event Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m Oct 14 05:57:26 localhost nova_compute[297686]: 2025-10-14 09:57:26.571 2 INFO nova.compute.manager [None req-f54e50da-17e5-40f4-aab2-053c4dced060 - - - - - -] [instance: 88c4e366-b765-47a6-96bf-f7677f2ce67c] VM Started (Lifecycle Event)#033[00m Oct 14 05:57:26 localhost podman[298853]: 2025-10-14 09:57:26.593244212 +0000 UTC m=+0.105482064 container health_status 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 
'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Oct 14 05:57:26 localhost nova_compute[297686]: 2025-10-14 09:57:26.607 2 DEBUG nova.compute.manager [None req-f54e50da-17e5-40f4-aab2-053c4dced060 - - - - - -] [instance: 88c4e366-b765-47a6-96bf-f7677f2ce67c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m Oct 14 05:57:26 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16388 DF PROTO=TCP SPT=55182 DPT=9102 SEQ=3320539066 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B19B89B0000000001030307) Oct 14 05:57:26 localhost nova_compute[297686]: 2025-10-14 09:57:26.612 2 DEBUG nova.compute.manager [None req-f54e50da-17e5-40f4-aab2-053c4dced060 - - - - - -] [instance: 88c4e366-b765-47a6-96bf-f7677f2ce67c] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m Oct 14 05:57:26 localhost podman[298841]: 2025-10-14 09:57:26.620615729 +0000 UTC m=+0.247756302 container init cb6758272cd67ba79bd3e5b1ede700d07ed80d5ab32556608cb3d94835e43d7c 
(image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7d0cd696-bdd7-4e70-9512-eb0d23640314, io.buildah.version=1.41.3, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team) Oct 14 05:57:26 localhost podman[298854]: 2025-10-14 09:57:26.622874171 +0000 UTC m=+0.133325857 container health_status 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3) Oct 14 05:57:26 localhost podman[298841]: 2025-10-14 09:57:26.62979399 +0000 UTC m=+0.256934563 container start cb6758272cd67ba79bd3e5b1ede700d07ed80d5ab32556608cb3d94835e43d7c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7d0cd696-bdd7-4e70-9512-eb0d23640314, tcib_managed=true, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5) Oct 14 05:57:26 localhost podman[298854]: 2025-10-14 09:57:26.631108842 +0000 UTC m=+0.141560588 container exec_died 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': 
'/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2) Oct 14 05:57:26 localhost systemd[1]: 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857.service: Deactivated successfully. 
Oct 14 05:57:26 localhost podman[298853]: 2025-10-14 09:57:26.656447375 +0000 UTC m=+0.168685277 container exec_died 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter) Oct 14 05:57:26 localhost neutron-haproxy-ovnmeta-7d0cd696-bdd7-4e70-9512-eb0d23640314[298874]: [NOTICE] (298898) : New worker (298900) forked Oct 14 05:57:26 localhost neutron-haproxy-ovnmeta-7d0cd696-bdd7-4e70-9512-eb0d23640314[298874]: [NOTICE] (298898) : Loading success. Oct 14 05:57:26 localhost systemd[1]: 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041.service: Deactivated successfully. 
Oct 14 05:57:27 localhost nova_compute[297686]: 2025-10-14 09:57:27.405 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:57:28 localhost nova_compute[297686]: 2025-10-14 09:57:28.174 2 DEBUG nova.compute.manager [req-b242dae4-029a-488f-9ca5-4e97dd417966 req-a8fa3fbc-70af-48e3-81e0-dee8378aeacf da5827fb8ee54b95a0a3cf62fcdcc49a f669ac1a1893421f91ae49881790edbc - - default default] [instance: 88c4e366-b765-47a6-96bf-f7677f2ce67c] Received event network-vif-plugged-3ec9b060-f43d-4698-9c76-6062c70911d5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m Oct 14 05:57:28 localhost nova_compute[297686]: 2025-10-14 09:57:28.175 2 DEBUG oslo_concurrency.lockutils [req-b242dae4-029a-488f-9ca5-4e97dd417966 req-a8fa3fbc-70af-48e3-81e0-dee8378aeacf da5827fb8ee54b95a0a3cf62fcdcc49a f669ac1a1893421f91ae49881790edbc - - default default] Acquiring lock "88c4e366-b765-47a6-96bf-f7677f2ce67c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Oct 14 05:57:28 localhost nova_compute[297686]: 2025-10-14 09:57:28.176 2 DEBUG oslo_concurrency.lockutils [req-b242dae4-029a-488f-9ca5-4e97dd417966 req-a8fa3fbc-70af-48e3-81e0-dee8378aeacf da5827fb8ee54b95a0a3cf62fcdcc49a f669ac1a1893421f91ae49881790edbc - - default default] Lock "88c4e366-b765-47a6-96bf-f7677f2ce67c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Oct 14 05:57:28 localhost nova_compute[297686]: 2025-10-14 09:57:28.177 2 DEBUG oslo_concurrency.lockutils [req-b242dae4-029a-488f-9ca5-4e97dd417966 req-a8fa3fbc-70af-48e3-81e0-dee8378aeacf da5827fb8ee54b95a0a3cf62fcdcc49a f669ac1a1893421f91ae49881790edbc - - default default] Lock 
"88c4e366-b765-47a6-96bf-f7677f2ce67c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Oct 14 05:57:28 localhost nova_compute[297686]: 2025-10-14 09:57:28.177 2 DEBUG nova.compute.manager [req-b242dae4-029a-488f-9ca5-4e97dd417966 req-a8fa3fbc-70af-48e3-81e0-dee8378aeacf da5827fb8ee54b95a0a3cf62fcdcc49a f669ac1a1893421f91ae49881790edbc - - default default] [instance: 88c4e366-b765-47a6-96bf-f7677f2ce67c] No waiting events found dispatching network-vif-plugged-3ec9b060-f43d-4698-9c76-6062c70911d5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m Oct 14 05:57:28 localhost nova_compute[297686]: 2025-10-14 09:57:28.178 2 WARNING nova.compute.manager [req-b242dae4-029a-488f-9ca5-4e97dd417966 req-a8fa3fbc-70af-48e3-81e0-dee8378aeacf da5827fb8ee54b95a0a3cf62fcdcc49a f669ac1a1893421f91ae49881790edbc - - default default] [instance: 88c4e366-b765-47a6-96bf-f7677f2ce67c] Received unexpected event network-vif-plugged-3ec9b060-f43d-4698-9c76-6062c70911d5 for instance with vm_state active and task_state None.#033[00m Oct 14 05:57:28 localhost podman[248187]: time="2025-10-14T09:57:28Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Oct 14 05:57:28 localhost podman[248187]: @ - - [14/Oct/2025:09:57:28 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 141158 "" "Go-http-client/1.1" Oct 14 05:57:28 localhost podman[248187]: @ - - [14/Oct/2025:09:57:28 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18337 "" "Go-http-client/1.1" Oct 14 05:57:30 localhost nova_compute[297686]: 2025-10-14 09:57:30.549 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:57:30 localhost 
kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16389 DF PROTO=TCP SPT=55182 DPT=9102 SEQ=3320539066 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B19C85A0000000001030307) Oct 14 05:57:32 localhost nova_compute[297686]: 2025-10-14 09:57:32.406 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:57:35 localhost nova_compute[297686]: 2025-10-14 09:57:35.552 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:57:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29. Oct 14 05:57:35 localhost systemd[1]: tmp-crun.LPkjLD.mount: Deactivated successfully. Oct 14 05:57:35 localhost podman[298909]: 2025-10-14 09:57:35.74694734 +0000 UTC m=+0.085115659 container health_status fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, container_name=iscsid, managed_by=edpm_ansible, config_id=iscsid) Oct 14 05:57:35 localhost podman[298909]: 2025-10-14 09:57:35.761882813 +0000 UTC m=+0.100051102 container exec_died fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true, config_id=iscsid, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS) Oct 14 05:57:35 localhost systemd[1]: fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29.service: Deactivated successfully. Oct 14 05:57:37 localhost nova_compute[297686]: 2025-10-14 09:57:37.444 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:57:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec. Oct 14 05:57:37 localhost podman[298926]: 2025-10-14 09:57:37.765896441 +0000 UTC m=+0.099502913 container health_status aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', 
'--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter) Oct 14 05:57:37 localhost podman[298926]: 2025-10-14 09:57:37.77658632 +0000 UTC m=+0.110192772 container exec_died aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter) Oct 14 05:57:37 localhost 
systemd[1]: aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec.service: Deactivated successfully. Oct 14 05:57:38 localhost ovn_controller[157396]: 2025-10-14T09:57:38Z|00004|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:84:5e:e5 192.168.0.46 Oct 14 05:57:38 localhost openstack_network_exporter[250374]: ERROR 09:57:38 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Oct 14 05:57:38 localhost openstack_network_exporter[250374]: ERROR 09:57:38 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Oct 14 05:57:38 localhost openstack_network_exporter[250374]: ERROR 09:57:38 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Oct 14 05:57:38 localhost openstack_network_exporter[250374]: ERROR 09:57:38 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Oct 14 05:57:38 localhost openstack_network_exporter[250374]: Oct 14 05:57:38 localhost openstack_network_exporter[250374]: ERROR 09:57:38 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Oct 14 05:57:38 localhost openstack_network_exporter[250374]: Oct 14 05:57:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad. Oct 14 05:57:39 localhost systemd[1]: tmp-crun.Db3TWY.mount: Deactivated successfully. 
Oct 14 05:57:39 localhost podman[298949]: 2025-10-14 09:57:39.749473953 +0000 UTC m=+0.094631680 container health_status 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, managed_by=edpm_ansible, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true) Oct 14 05:57:39 localhost podman[298949]: 2025-10-14 09:57:39.760773441 +0000 UTC m=+0.105931168 container exec_died 
02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d) Oct 14 05:57:39 localhost systemd[1]: 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad.service: Deactivated successfully. 
Oct 14 05:57:40 localhost nova_compute[297686]: 2025-10-14 09:57:40.554 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:57:42 localhost nova_compute[297686]: 2025-10-14 09:57:42.446 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:57:43 localhost ovn_metadata_agent[163050]: 2025-10-14 09:57:43.638 163154 DEBUG eventlet.wsgi.server [-] (163154) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004#033[00m Oct 14 05:57:43 localhost ovn_metadata_agent[163050]: 2025-10-14 09:57:43.640 163154 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-id HTTP/1.0#015 Oct 14 05:57:43 localhost ovn_metadata_agent[163050]: Accept: */*#015 Oct 14 05:57:43 localhost ovn_metadata_agent[163050]: Connection: close#015 Oct 14 05:57:43 localhost ovn_metadata_agent[163050]: Content-Type: text/plain#015 Oct 14 05:57:43 localhost ovn_metadata_agent[163050]: Host: 169.254.169.254#015 Oct 14 05:57:43 localhost ovn_metadata_agent[163050]: User-Agent: curl/7.84.0#015 Oct 14 05:57:43 localhost ovn_metadata_agent[163050]: X-Forwarded-For: 192.168.0.46#015 Oct 14 05:57:43 localhost ovn_metadata_agent[163050]: X-Ovn-Network-Id: 7d0cd696-bdd7-4e70-9512-eb0d23640314 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82#033[00m Oct 14 05:57:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f. Oct 14 05:57:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917. 
Oct 14 05:57:43 localhost podman[298969]: 2025-10-14 09:57:43.744403526 +0000 UTC m=+0.078215200 container health_status 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, config_id=edpm, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, name=ubi9-minimal, release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., io.buildah.version=1.33.7, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, io.openshift.expose-services=) Oct 14 05:57:43 localhost podman[298969]: 2025-10-14 09:57:43.788632997 +0000 UTC m=+0.122444721 container exec_died 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': 
['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, release=1755695350, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, managed_by=edpm_ansible, version=9.6, io.openshift.tags=minimal rhel9, config_id=edpm, distribution-scope=public, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, container_name=openstack_network_exporter) Oct 14 05:57:43 localhost systemd[1]: 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917.service: Deactivated successfully. 
Oct 14 05:57:43 localhost podman[298968]: 2025-10-14 09:57:43.790855717 +0000 UTC m=+0.128791892 container health_status 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_managed=true, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image) Oct 14 05:57:43 localhost podman[298968]: 2025-10-14 09:57:43.870645016 +0000 UTC m=+0.208581151 container exec_died 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, 
org.label-schema.license=GPLv2, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251009) Oct 14 05:57:43 localhost systemd[1]: 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f.service: Deactivated successfully. 
Oct 14 05:57:45 localhost ovn_metadata_agent[163050]: 2025-10-14 09:57:45.456 163154 DEBUG neutron.agent.ovn.metadata.server [-] _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161#033[00m Oct 14 05:57:45 localhost ovn_metadata_agent[163050]: 2025-10-14 09:57:45.457 163154 INFO eventlet.wsgi.server [-] 192.168.0.46, "GET /2009-04-04/meta-data/instance-id HTTP/1.1" status: 200 len: 146 time: 1.8163598#033[00m Oct 14 05:57:45 localhost haproxy-metadata-proxy-7d0cd696-bdd7-4e70-9512-eb0d23640314[298900]: 192.168.0.46:39936 [14/Oct/2025:09:57:43.637] listener listener/metadata 0/0/0/1819/1819 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1" Oct 14 05:57:45 localhost ovn_metadata_agent[163050]: 2025-10-14 09:57:45.474 163154 DEBUG eventlet.wsgi.server [-] (163154) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004#033[00m Oct 14 05:57:45 localhost ovn_metadata_agent[163050]: 2025-10-14 09:57:45.475 163154 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-keys HTTP/1.0#015 Oct 14 05:57:45 localhost ovn_metadata_agent[163050]: Accept: */*#015 Oct 14 05:57:45 localhost ovn_metadata_agent[163050]: Connection: close#015 Oct 14 05:57:45 localhost ovn_metadata_agent[163050]: Content-Type: text/plain#015 Oct 14 05:57:45 localhost ovn_metadata_agent[163050]: Host: 169.254.169.254#015 Oct 14 05:57:45 localhost ovn_metadata_agent[163050]: User-Agent: curl/7.84.0#015 Oct 14 05:57:45 localhost ovn_metadata_agent[163050]: X-Forwarded-For: 192.168.0.46#015 Oct 14 05:57:45 localhost ovn_metadata_agent[163050]: X-Ovn-Network-Id: 7d0cd696-bdd7-4e70-9512-eb0d23640314 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82#033[00m Oct 14 05:57:45 localhost nova_compute[297686]: 2025-10-14 09:57:45.556 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 
05:57:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d. Oct 14 05:57:45 localhost podman[299011]: 2025-10-14 09:57:45.743623162 +0000 UTC m=+0.088316319 container health_status 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251009, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.license=GPLv2) 
Oct 14 05:57:45 localhost podman[299011]: 2025-10-14 09:57:45.783058072 +0000 UTC m=+0.127751219 container exec_died 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}) Oct 14 05:57:45 localhost systemd[1]: 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d.service: Deactivated successfully. 
Oct 14 05:57:45 localhost nova_compute[297686]: 2025-10-14 09:57:45.983 2 DEBUG nova.compute.manager [None req-c9b350f3-1335-45fb-92c5-96903dd9b35e 9d85e6ce130c46ec855f37147dbb08b4 41187b090f3d4818a32baa37ce8a3991 - - default default] [instance: 88c4e366-b765-47a6-96bf-f7677f2ce67c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m Oct 14 05:57:45 localhost nova_compute[297686]: 2025-10-14 09:57:45.988 2 INFO nova.compute.manager [None req-c9b350f3-1335-45fb-92c5-96903dd9b35e 9d85e6ce130c46ec855f37147dbb08b4 41187b090f3d4818a32baa37ce8a3991 - - default default] [instance: 88c4e366-b765-47a6-96bf-f7677f2ce67c] Retrieving diagnostics#033[00m Oct 14 05:57:47 localhost ovn_metadata_agent[163050]: 2025-10-14 09:57:47.110 163154 INFO eventlet.wsgi.server [-] 192.168.0.46, "GET /2009-04-04/meta-data/public-keys HTTP/1.1" status: 404 len: 297 time: 1.6343222#033[00m Oct 14 05:57:47 localhost haproxy-metadata-proxy-7d0cd696-bdd7-4e70-9512-eb0d23640314[298900]: 192.168.0.46:39946 [14/Oct/2025:09:57:45.474] listener listener/metadata 0/0/0/1635/1635 404 281 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-keys HTTP/1.1" Oct 14 05:57:47 localhost ovn_metadata_agent[163050]: 2025-10-14 09:57:47.126 163154 DEBUG eventlet.wsgi.server [-] (163154) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004#033[00m Oct 14 05:57:47 localhost ovn_metadata_agent[163050]: 2025-10-14 09:57:47.127 163154 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-id HTTP/1.0#015 Oct 14 05:57:47 localhost ovn_metadata_agent[163050]: Accept: */*#015 Oct 14 05:57:47 localhost ovn_metadata_agent[163050]: Connection: close#015 Oct 14 05:57:47 localhost ovn_metadata_agent[163050]: Content-Type: text/plain#015 Oct 14 05:57:47 localhost ovn_metadata_agent[163050]: Host: 169.254.169.254#015 Oct 14 05:57:47 localhost ovn_metadata_agent[163050]: User-Agent: curl/7.84.0#015 Oct 14 05:57:47 localhost 
ovn_metadata_agent[163050]: X-Forwarded-For: 192.168.0.46#015 Oct 14 05:57:47 localhost ovn_metadata_agent[163050]: X-Ovn-Network-Id: 7d0cd696-bdd7-4e70-9512-eb0d23640314 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82#033[00m Oct 14 05:57:47 localhost ovn_metadata_agent[163050]: 2025-10-14 09:57:47.266 163154 DEBUG neutron.agent.ovn.metadata.server [-] _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161#033[00m Oct 14 05:57:47 localhost ovn_metadata_agent[163050]: 2025-10-14 09:57:47.266 163154 INFO eventlet.wsgi.server [-] 192.168.0.46, "GET /2009-04-04/meta-data/instance-id HTTP/1.1" status: 200 len: 146 time: 0.1396623#033[00m Oct 14 05:57:47 localhost haproxy-metadata-proxy-7d0cd696-bdd7-4e70-9512-eb0d23640314[298900]: 192.168.0.46:39948 [14/Oct/2025:09:57:47.125] listener listener/metadata 0/0/0/141/141 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1" Oct 14 05:57:47 localhost ovn_metadata_agent[163050]: 2025-10-14 09:57:47.274 163154 DEBUG eventlet.wsgi.server [-] (163154) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004#033[00m Oct 14 05:57:47 localhost ovn_metadata_agent[163050]: 2025-10-14 09:57:47.275 163154 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/ami-launch-index HTTP/1.0#015 Oct 14 05:57:47 localhost ovn_metadata_agent[163050]: Accept: */*#015 Oct 14 05:57:47 localhost ovn_metadata_agent[163050]: Connection: close#015 Oct 14 05:57:47 localhost ovn_metadata_agent[163050]: Content-Type: text/plain#015 Oct 14 05:57:47 localhost ovn_metadata_agent[163050]: Host: 169.254.169.254#015 Oct 14 05:57:47 localhost ovn_metadata_agent[163050]: User-Agent: curl/7.84.0#015 Oct 14 05:57:47 localhost ovn_metadata_agent[163050]: X-Forwarded-For: 192.168.0.46#015 Oct 14 05:57:47 localhost ovn_metadata_agent[163050]: X-Ovn-Network-Id: 7d0cd696-bdd7-4e70-9512-eb0d23640314 __call__ 
/usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82#033[00m Oct 14 05:57:47 localhost ovn_metadata_agent[163050]: 2025-10-14 09:57:47.429 163154 DEBUG neutron.agent.ovn.metadata.server [-] _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161#033[00m Oct 14 05:57:47 localhost ovn_metadata_agent[163050]: 2025-10-14 09:57:47.429 163154 INFO eventlet.wsgi.server [-] 192.168.0.46, "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1" status: 200 len: 136 time: 0.1546555#033[00m Oct 14 05:57:47 localhost haproxy-metadata-proxy-7d0cd696-bdd7-4e70-9512-eb0d23640314[298900]: 192.168.0.46:39964 [14/Oct/2025:09:57:47.273] listener listener/metadata 0/0/0/155/155 200 120 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1" Oct 14 05:57:47 localhost ovn_metadata_agent[163050]: 2025-10-14 09:57:47.437 163154 DEBUG eventlet.wsgi.server [-] (163154) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004#033[00m Oct 14 05:57:47 localhost ovn_metadata_agent[163050]: 2025-10-14 09:57:47.438 163154 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-type HTTP/1.0#015 Oct 14 05:57:47 localhost ovn_metadata_agent[163050]: Accept: */*#015 Oct 14 05:57:47 localhost ovn_metadata_agent[163050]: Connection: close#015 Oct 14 05:57:47 localhost ovn_metadata_agent[163050]: Content-Type: text/plain#015 Oct 14 05:57:47 localhost ovn_metadata_agent[163050]: Host: 169.254.169.254#015 Oct 14 05:57:47 localhost ovn_metadata_agent[163050]: User-Agent: curl/7.84.0#015 Oct 14 05:57:47 localhost ovn_metadata_agent[163050]: X-Forwarded-For: 192.168.0.46#015 Oct 14 05:57:47 localhost ovn_metadata_agent[163050]: X-Ovn-Network-Id: 7d0cd696-bdd7-4e70-9512-eb0d23640314 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82#033[00m Oct 14 05:57:47 localhost nova_compute[297686]: 2025-10-14 09:57:47.449 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 
[POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:57:47 localhost ovn_metadata_agent[163050]: 2025-10-14 09:57:47.891 163154 DEBUG neutron.agent.ovn.metadata.server [-] _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161#033[00m Oct 14 05:57:47 localhost ovn_metadata_agent[163050]: 2025-10-14 09:57:47.892 163154 INFO eventlet.wsgi.server [-] 192.168.0.46, "GET /2009-04-04/meta-data/instance-type HTTP/1.1" status: 200 len: 143 time: 0.4540217#033[00m Oct 14 05:57:47 localhost haproxy-metadata-proxy-7d0cd696-bdd7-4e70-9512-eb0d23640314[298900]: 192.168.0.46:39974 [14/Oct/2025:09:57:47.436] listener listener/metadata 0/0/0/455/455 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-type HTTP/1.1" Oct 14 05:57:47 localhost ovn_metadata_agent[163050]: 2025-10-14 09:57:47.899 163154 DEBUG eventlet.wsgi.server [-] (163154) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004#033[00m Oct 14 05:57:47 localhost ovn_metadata_agent[163050]: 2025-10-14 09:57:47.901 163154 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/local-ipv4 HTTP/1.0#015 Oct 14 05:57:47 localhost ovn_metadata_agent[163050]: Accept: */*#015 Oct 14 05:57:47 localhost ovn_metadata_agent[163050]: Connection: close#015 Oct 14 05:57:47 localhost ovn_metadata_agent[163050]: Content-Type: text/plain#015 Oct 14 05:57:47 localhost ovn_metadata_agent[163050]: Host: 169.254.169.254#015 Oct 14 05:57:47 localhost ovn_metadata_agent[163050]: User-Agent: curl/7.84.0#015 Oct 14 05:57:47 localhost ovn_metadata_agent[163050]: X-Forwarded-For: 192.168.0.46#015 Oct 14 05:57:47 localhost ovn_metadata_agent[163050]: X-Ovn-Network-Id: 7d0cd696-bdd7-4e70-9512-eb0d23640314 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82#033[00m Oct 14 05:57:48 localhost ovn_metadata_agent[163050]: 2025-10-14 09:57:48.128 163154 DEBUG 
neutron.agent.ovn.metadata.server [-] _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161#033[00m Oct 14 05:57:48 localhost ovn_metadata_agent[163050]: 2025-10-14 09:57:48.129 163154 INFO eventlet.wsgi.server [-] 192.168.0.46, "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1" status: 200 len: 148 time: 0.2286742#033[00m Oct 14 05:57:48 localhost haproxy-metadata-proxy-7d0cd696-bdd7-4e70-9512-eb0d23640314[298900]: 192.168.0.46:39980 [14/Oct/2025:09:57:47.899] listener listener/metadata 0/0/0/230/230 200 132 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1" Oct 14 05:57:48 localhost ovn_metadata_agent[163050]: 2025-10-14 09:57:48.137 163154 DEBUG eventlet.wsgi.server [-] (163154) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004#033[00m Oct 14 05:57:48 localhost ovn_metadata_agent[163050]: 2025-10-14 09:57:48.138 163154 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-ipv4 HTTP/1.0#015 Oct 14 05:57:48 localhost ovn_metadata_agent[163050]: Accept: */*#015 Oct 14 05:57:48 localhost ovn_metadata_agent[163050]: Connection: close#015 Oct 14 05:57:48 localhost ovn_metadata_agent[163050]: Content-Type: text/plain#015 Oct 14 05:57:48 localhost ovn_metadata_agent[163050]: Host: 169.254.169.254#015 Oct 14 05:57:48 localhost ovn_metadata_agent[163050]: User-Agent: curl/7.84.0#015 Oct 14 05:57:48 localhost ovn_metadata_agent[163050]: X-Forwarded-For: 192.168.0.46#015 Oct 14 05:57:48 localhost ovn_metadata_agent[163050]: X-Ovn-Network-Id: 7d0cd696-bdd7-4e70-9512-eb0d23640314 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82#033[00m Oct 14 05:57:48 localhost ovn_metadata_agent[163050]: 2025-10-14 09:57:48.337 163154 DEBUG neutron.agent.ovn.metadata.server [-] _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161#033[00m Oct 14 05:57:48 localhost ovn_metadata_agent[163050]: 2025-10-14 09:57:48.339 
163154 INFO eventlet.wsgi.server [-] 192.168.0.46, "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1" status: 200 len: 150 time: 0.2008083#033[00m Oct 14 05:57:48 localhost haproxy-metadata-proxy-7d0cd696-bdd7-4e70-9512-eb0d23640314[298900]: 192.168.0.46:39996 [14/Oct/2025:09:57:48.136] listener listener/metadata 0/0/0/202/202 200 134 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1" Oct 14 05:57:48 localhost ovn_metadata_agent[163050]: 2025-10-14 09:57:48.347 163154 DEBUG eventlet.wsgi.server [-] (163154) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004#033[00m Oct 14 05:57:48 localhost ovn_metadata_agent[163050]: 2025-10-14 09:57:48.348 163154 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/hostname HTTP/1.0#015 Oct 14 05:57:48 localhost ovn_metadata_agent[163050]: Accept: */*#015 Oct 14 05:57:48 localhost ovn_metadata_agent[163050]: Connection: close#015 Oct 14 05:57:48 localhost ovn_metadata_agent[163050]: Content-Type: text/plain#015 Oct 14 05:57:48 localhost ovn_metadata_agent[163050]: Host: 169.254.169.254#015 Oct 14 05:57:48 localhost ovn_metadata_agent[163050]: User-Agent: curl/7.84.0#015 Oct 14 05:57:48 localhost ovn_metadata_agent[163050]: X-Forwarded-For: 192.168.0.46#015 Oct 14 05:57:48 localhost ovn_metadata_agent[163050]: X-Ovn-Network-Id: 7d0cd696-bdd7-4e70-9512-eb0d23640314 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82#033[00m Oct 14 05:57:48 localhost ovn_metadata_agent[163050]: 2025-10-14 09:57:48.460 163154 DEBUG neutron.agent.ovn.metadata.server [-] _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161#033[00m Oct 14 05:57:48 localhost haproxy-metadata-proxy-7d0cd696-bdd7-4e70-9512-eb0d23640314[298900]: 192.168.0.46:40002 [14/Oct/2025:09:57:48.346] listener listener/metadata 0/0/0/114/114 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/hostname HTTP/1.1" Oct 14 05:57:48 localhost 
ovn_metadata_agent[163050]: 2025-10-14 09:57:48.461 163154 INFO eventlet.wsgi.server [-] 192.168.0.46, "GET /2009-04-04/meta-data/hostname HTTP/1.1" status: 200 len: 139 time: 0.1134231#033[00m Oct 14 05:57:48 localhost ovn_metadata_agent[163050]: 2025-10-14 09:57:48.468 163154 DEBUG eventlet.wsgi.server [-] (163154) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004#033[00m Oct 14 05:57:48 localhost ovn_metadata_agent[163050]: 2025-10-14 09:57:48.469 163154 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/local-hostname HTTP/1.0#015 Oct 14 05:57:48 localhost ovn_metadata_agent[163050]: Accept: */*#015 Oct 14 05:57:48 localhost ovn_metadata_agent[163050]: Connection: close#015 Oct 14 05:57:48 localhost ovn_metadata_agent[163050]: Content-Type: text/plain#015 Oct 14 05:57:48 localhost ovn_metadata_agent[163050]: Host: 169.254.169.254#015 Oct 14 05:57:48 localhost ovn_metadata_agent[163050]: User-Agent: curl/7.84.0#015 Oct 14 05:57:48 localhost ovn_metadata_agent[163050]: X-Forwarded-For: 192.168.0.46#015 Oct 14 05:57:48 localhost ovn_metadata_agent[163050]: X-Ovn-Network-Id: 7d0cd696-bdd7-4e70-9512-eb0d23640314 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82#033[00m Oct 14 05:57:48 localhost ovn_metadata_agent[163050]: 2025-10-14 09:57:48.634 163154 DEBUG neutron.agent.ovn.metadata.server [-] _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161#033[00m Oct 14 05:57:48 localhost ovn_metadata_agent[163050]: 2025-10-14 09:57:48.635 163154 INFO eventlet.wsgi.server [-] 192.168.0.46, "GET /2009-04-04/meta-data/local-hostname HTTP/1.1" status: 200 len: 139 time: 0.1660132#033[00m Oct 14 05:57:48 localhost haproxy-metadata-proxy-7d0cd696-bdd7-4e70-9512-eb0d23640314[298900]: 192.168.0.46:40016 [14/Oct/2025:09:57:48.467] listener listener/metadata 0/0/0/167/167 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-hostname HTTP/1.1" Oct 14 
05:57:48 localhost ovn_metadata_agent[163050]: 2025-10-14 09:57:48.661 163154 DEBUG eventlet.wsgi.server [-] (163154) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004#033[00m Oct 14 05:57:48 localhost ovn_metadata_agent[163050]: 2025-10-14 09:57:48.662 163154 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/user-data HTTP/1.0#015 Oct 14 05:57:48 localhost ovn_metadata_agent[163050]: Accept: */*#015 Oct 14 05:57:48 localhost ovn_metadata_agent[163050]: Connection: close#015 Oct 14 05:57:48 localhost ovn_metadata_agent[163050]: Content-Type: text/plain#015 Oct 14 05:57:48 localhost ovn_metadata_agent[163050]: Host: 169.254.169.254#015 Oct 14 05:57:48 localhost ovn_metadata_agent[163050]: User-Agent: curl/7.84.0#015 Oct 14 05:57:48 localhost ovn_metadata_agent[163050]: X-Forwarded-For: 192.168.0.46#015 Oct 14 05:57:48 localhost ovn_metadata_agent[163050]: X-Ovn-Network-Id: 7d0cd696-bdd7-4e70-9512-eb0d23640314 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82#033[00m Oct 14 05:57:49 localhost ovn_metadata_agent[163050]: 2025-10-14 09:57:49.056 163055 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=6, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'b6:6b:50', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '6a:59:81:01:bc:8b'}, ipsec=False) old=SB_Global(nb_cfg=5) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Oct 14 05:57:49 localhost ovn_metadata_agent[163050]: 2025-10-14 09:57:49.058 163055 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run 
/usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m Oct 14 05:57:49 localhost nova_compute[297686]: 2025-10-14 09:57:49.058 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:57:49 localhost haproxy-metadata-proxy-7d0cd696-bdd7-4e70-9512-eb0d23640314[298900]: 192.168.0.46:40018 [14/Oct/2025:09:57:48.660] listener listener/metadata 0/0/0/454/454 404 281 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/user-data HTTP/1.1" Oct 14 05:57:49 localhost ovn_metadata_agent[163050]: 2025-10-14 09:57:49.115 163154 INFO eventlet.wsgi.server [-] 192.168.0.46, "GET /2009-04-04/user-data HTTP/1.1" status: 404 len: 297 time: 0.4531450#033[00m Oct 14 05:57:49 localhost ovn_metadata_agent[163050]: 2025-10-14 09:57:49.130 163154 DEBUG eventlet.wsgi.server [-] (163154) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004#033[00m Oct 14 05:57:49 localhost ovn_metadata_agent[163050]: 2025-10-14 09:57:49.131 163154 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping HTTP/1.0#015 Oct 14 05:57:49 localhost ovn_metadata_agent[163050]: Accept: */*#015 Oct 14 05:57:49 localhost ovn_metadata_agent[163050]: Connection: close#015 Oct 14 05:57:49 localhost ovn_metadata_agent[163050]: Content-Type: text/plain#015 Oct 14 05:57:49 localhost ovn_metadata_agent[163050]: Host: 169.254.169.254#015 Oct 14 05:57:49 localhost ovn_metadata_agent[163050]: User-Agent: curl/7.84.0#015 Oct 14 05:57:49 localhost ovn_metadata_agent[163050]: X-Forwarded-For: 192.168.0.46#015 Oct 14 05:57:49 localhost ovn_metadata_agent[163050]: X-Ovn-Network-Id: 7d0cd696-bdd7-4e70-9512-eb0d23640314 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82#033[00m Oct 14 05:57:49 localhost ovn_metadata_agent[163050]: 2025-10-14 09:57:49.320 163154 DEBUG neutron.agent.ovn.metadata.server [-] _proxy_request 
/usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161#033[00m Oct 14 05:57:49 localhost ovn_metadata_agent[163050]: 2025-10-14 09:57:49.321 163154 INFO eventlet.wsgi.server [-] 192.168.0.46, "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1" status: 200 len: 155 time: 0.1898005#033[00m Oct 14 05:57:49 localhost haproxy-metadata-proxy-7d0cd696-bdd7-4e70-9512-eb0d23640314[298900]: 192.168.0.46:40034 [14/Oct/2025:09:57:49.130] listener listener/metadata 0/0/0/191/191 200 139 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1" Oct 14 05:57:49 localhost ovn_metadata_agent[163050]: 2025-10-14 09:57:49.331 163154 DEBUG eventlet.wsgi.server [-] (163154) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004#033[00m Oct 14 05:57:49 localhost ovn_metadata_agent[163050]: 2025-10-14 09:57:49.332 163154 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.0#015 Oct 14 05:57:49 localhost ovn_metadata_agent[163050]: Accept: */*#015 Oct 14 05:57:49 localhost ovn_metadata_agent[163050]: Connection: close#015 Oct 14 05:57:49 localhost ovn_metadata_agent[163050]: Content-Type: text/plain#015 Oct 14 05:57:49 localhost ovn_metadata_agent[163050]: Host: 169.254.169.254#015 Oct 14 05:57:49 localhost ovn_metadata_agent[163050]: User-Agent: curl/7.84.0#015 Oct 14 05:57:49 localhost ovn_metadata_agent[163050]: X-Forwarded-For: 192.168.0.46#015 Oct 14 05:57:49 localhost ovn_metadata_agent[163050]: X-Ovn-Network-Id: 7d0cd696-bdd7-4e70-9512-eb0d23640314 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82#033[00m Oct 14 05:57:49 localhost ovn_metadata_agent[163050]: 2025-10-14 09:57:49.465 163154 DEBUG neutron.agent.ovn.metadata.server [-] _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161#033[00m Oct 14 05:57:49 localhost ovn_metadata_agent[163050]: 2025-10-14 09:57:49.466 163154 INFO 
eventlet.wsgi.server [-] 192.168.0.46, "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1" status: 200 len: 138 time: 0.1336513#033[00m Oct 14 05:57:49 localhost haproxy-metadata-proxy-7d0cd696-bdd7-4e70-9512-eb0d23640314[298900]: 192.168.0.46:40046 [14/Oct/2025:09:57:49.331] listener listener/metadata 0/0/0/134/134 200 122 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1" Oct 14 05:57:49 localhost ovn_metadata_agent[163050]: 2025-10-14 09:57:49.472 163154 DEBUG eventlet.wsgi.server [-] (163154) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004#033[00m Oct 14 05:57:49 localhost ovn_metadata_agent[163050]: 2025-10-14 09:57:49.473 163154 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/ephemeral0 HTTP/1.0#015 Oct 14 05:57:49 localhost ovn_metadata_agent[163050]: Accept: */*#015 Oct 14 05:57:49 localhost ovn_metadata_agent[163050]: Connection: close#015 Oct 14 05:57:49 localhost ovn_metadata_agent[163050]: Content-Type: text/plain#015 Oct 14 05:57:49 localhost ovn_metadata_agent[163050]: Host: 169.254.169.254#015 Oct 14 05:57:49 localhost ovn_metadata_agent[163050]: User-Agent: curl/7.84.0#015 Oct 14 05:57:49 localhost ovn_metadata_agent[163050]: X-Forwarded-For: 192.168.0.46#015 Oct 14 05:57:49 localhost ovn_metadata_agent[163050]: X-Ovn-Network-Id: 7d0cd696-bdd7-4e70-9512-eb0d23640314 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82#033[00m Oct 14 05:57:49 localhost ovn_metadata_agent[163050]: 2025-10-14 09:57:49.632 163154 DEBUG neutron.agent.ovn.metadata.server [-] _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161#033[00m Oct 14 05:57:49 localhost ovn_metadata_agent[163050]: 2025-10-14 09:57:49.633 163154 INFO eventlet.wsgi.server [-] 192.168.0.46, "GET /2009-04-04/meta-data/block-device-mapping/ephemeral0 HTTP/1.1" status: 200 len: 143 time: 0.1597190#033[00m Oct 14 05:57:49 
localhost haproxy-metadata-proxy-7d0cd696-bdd7-4e70-9512-eb0d23640314[298900]: 192.168.0.46:40052 [14/Oct/2025:09:57:49.472] listener listener/metadata 0/0/0/161/161 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/ephemeral0 HTTP/1.1" Oct 14 05:57:49 localhost ovn_metadata_agent[163050]: 2025-10-14 09:57:49.639 163154 DEBUG eventlet.wsgi.server [-] (163154) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004#033[00m Oct 14 05:57:49 localhost ovn_metadata_agent[163050]: 2025-10-14 09:57:49.640 163154 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.0#015 Oct 14 05:57:49 localhost ovn_metadata_agent[163050]: Accept: */*#015 Oct 14 05:57:49 localhost ovn_metadata_agent[163050]: Connection: close#015 Oct 14 05:57:49 localhost ovn_metadata_agent[163050]: Content-Type: text/plain#015 Oct 14 05:57:49 localhost ovn_metadata_agent[163050]: Host: 169.254.169.254#015 Oct 14 05:57:49 localhost ovn_metadata_agent[163050]: User-Agent: curl/7.84.0#015 Oct 14 05:57:49 localhost ovn_metadata_agent[163050]: X-Forwarded-For: 192.168.0.46#015 Oct 14 05:57:49 localhost ovn_metadata_agent[163050]: X-Ovn-Network-Id: 7d0cd696-bdd7-4e70-9512-eb0d23640314 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82#033[00m Oct 14 05:57:50 localhost ovn_metadata_agent[163050]: 2025-10-14 09:57:50.081 163154 DEBUG neutron.agent.ovn.metadata.server [-] _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161#033[00m Oct 14 05:57:50 localhost ovn_metadata_agent[163050]: 2025-10-14 09:57:50.083 163154 INFO eventlet.wsgi.server [-] 192.168.0.46, "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1" status: 200 len: 143 time: 0.4426503#033[00m Oct 14 05:57:50 localhost haproxy-metadata-proxy-7d0cd696-bdd7-4e70-9512-eb0d23640314[298900]: 192.168.0.46:40068 [14/Oct/2025:09:57:49.639] listener listener/metadata 0/0/0/444/444 
200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1" Oct 14 05:57:50 localhost ovn_metadata_agent[163050]: 2025-10-14 09:57:50.088 163154 DEBUG eventlet.wsgi.server [-] (163154) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004#033[00m Oct 14 05:57:50 localhost ovn_metadata_agent[163050]: 2025-10-14 09:57:50.089 163154 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-hostname HTTP/1.0#015 Oct 14 05:57:50 localhost ovn_metadata_agent[163050]: Accept: */*#015 Oct 14 05:57:50 localhost ovn_metadata_agent[163050]: Connection: close#015 Oct 14 05:57:50 localhost ovn_metadata_agent[163050]: Content-Type: text/plain#015 Oct 14 05:57:50 localhost ovn_metadata_agent[163050]: Host: 169.254.169.254#015 Oct 14 05:57:50 localhost ovn_metadata_agent[163050]: User-Agent: curl/7.84.0#015 Oct 14 05:57:50 localhost ovn_metadata_agent[163050]: X-Forwarded-For: 192.168.0.46#015 Oct 14 05:57:50 localhost ovn_metadata_agent[163050]: X-Ovn-Network-Id: 7d0cd696-bdd7-4e70-9512-eb0d23640314 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82#033[00m Oct 14 05:57:50 localhost ovn_metadata_agent[163050]: 2025-10-14 09:57:50.287 163154 DEBUG neutron.agent.ovn.metadata.server [-] _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161#033[00m Oct 14 05:57:50 localhost ovn_metadata_agent[163050]: 2025-10-14 09:57:50.288 163154 INFO eventlet.wsgi.server [-] 192.168.0.46, "GET /2009-04-04/meta-data/public-hostname HTTP/1.1" status: 200 len: 139 time: 0.1990857#033[00m Oct 14 05:57:50 localhost haproxy-metadata-proxy-7d0cd696-bdd7-4e70-9512-eb0d23640314[298900]: 192.168.0.46:40076 [14/Oct/2025:09:57:50.087] listener listener/metadata 0/0/0/200/200 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-hostname HTTP/1.1" Oct 14 05:57:50 localhost ovn_metadata_agent[163050]: 2025-10-14 09:57:50.296 163154 DEBUG 
eventlet.wsgi.server [-] (163154) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004#033[00m Oct 14 05:57:50 localhost ovn_metadata_agent[163050]: 2025-10-14 09:57:50.297 163154 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.0#015 Oct 14 05:57:50 localhost ovn_metadata_agent[163050]: Accept: */*#015 Oct 14 05:57:50 localhost ovn_metadata_agent[163050]: Connection: close#015 Oct 14 05:57:50 localhost ovn_metadata_agent[163050]: Content-Type: text/plain#015 Oct 14 05:57:50 localhost ovn_metadata_agent[163050]: Host: 169.254.169.254#015 Oct 14 05:57:50 localhost ovn_metadata_agent[163050]: User-Agent: curl/7.84.0#015 Oct 14 05:57:50 localhost ovn_metadata_agent[163050]: X-Forwarded-For: 192.168.0.46#015 Oct 14 05:57:50 localhost ovn_metadata_agent[163050]: X-Ovn-Network-Id: 7d0cd696-bdd7-4e70-9512-eb0d23640314 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82#033[00m Oct 14 05:57:50 localhost haproxy-metadata-proxy-7d0cd696-bdd7-4e70-9512-eb0d23640314[298900]: 192.168.0.46:40080 [14/Oct/2025:09:57:50.295] listener listener/metadata 0/0/0/160/160 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1" Oct 14 05:57:50 localhost ovn_metadata_agent[163050]: 2025-10-14 09:57:50.456 163154 DEBUG neutron.agent.ovn.metadata.server [-] _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161#033[00m Oct 14 05:57:50 localhost ovn_metadata_agent[163050]: 2025-10-14 09:57:50.456 163154 INFO eventlet.wsgi.server [-] 192.168.0.46, "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1" status: 200 len: 139 time: 0.1592517#033[00m Oct 14 05:57:50 localhost nova_compute[297686]: 2025-10-14 09:57:50.558 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:57:51 localhost 
ovn_metadata_agent[163050]: 2025-10-14 09:57:51.061 163055 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=9e4b0f79-1220-4c7d-a18d-fa1a88dab362, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '6'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Oct 14 05:57:52 localhost nova_compute[297686]: 2025-10-14 09:57:52.496 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:57:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8500 DF PROTO=TCP SPT=60448 DPT=9102 SEQ=416576420 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B1A21A40000000001030307) Oct 14 05:57:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8501 DF PROTO=TCP SPT=60448 DPT=9102 SEQ=416576420 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B1A259A0000000001030307) Oct 14 05:57:55 localhost nova_compute[297686]: 2025-10-14 09:57:55.559 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:57:55 localhost ovn_controller[157396]: 2025-10-14T09:57:55Z|00065|memory_trim|INFO|Detected inactivity (last active 30003 ms ago): trimming memory Oct 14 05:57:56 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8502 DF PROTO=TCP SPT=60448 DPT=9102 SEQ=416576420 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B1A2D9A0000000001030307) Oct 14 
05:57:57 localhost nova_compute[297686]: 2025-10-14 09:57:57.531 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:57:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041. Oct 14 05:57:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857. Oct 14 05:57:57 localhost podman[299116]: 2025-10-14 09:57:57.744278692 +0000 UTC m=+0.083572890 container health_status 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Oct 14 05:57:57 localhost podman[299116]: 2025-10-14 09:57:57.754229868 +0000 UTC m=+0.093524066 container exec_died 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 
'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter) Oct 14 05:57:57 localhost systemd[1]: 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041.service: Deactivated successfully. Oct 14 05:57:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:57:57.766 163055 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Oct 14 05:57:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:57:57.767 163055 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Oct 14 05:57:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:57:57.768 163055 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Oct 14 05:57:57 localhost podman[299117]: 2025-10-14 09:57:57.851633104 +0000 UTC m=+0.190057005 container health_status 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 
(image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, tcib_managed=true, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, managed_by=edpm_ansible, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009) Oct 14 05:57:57 localhost podman[299117]: 2025-10-14 09:57:57.856363644 +0000 UTC m=+0.194787515 container exec_died 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 
(image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, container_name=ovn_metadata_agent, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, io.buildah.version=1.41.3) Oct 14 05:57:57 localhost systemd[1]: 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857.service: Deactivated successfully. 
Oct 14 05:57:58 localhost podman[248187]: time="2025-10-14T09:57:58Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Oct 14 05:57:58 localhost podman[248187]: @ - - [14/Oct/2025:09:57:58 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 141158 "" "Go-http-client/1.1" Oct 14 05:57:58 localhost podman[248187]: @ - - [14/Oct/2025:09:57:58 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18357 "" "Go-http-client/1.1" Oct 14 05:58:00 localhost nova_compute[297686]: 2025-10-14 09:58:00.561 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:58:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8503 DF PROTO=TCP SPT=60448 DPT=9102 SEQ=416576420 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B1A3D5A0000000001030307) Oct 14 05:58:02 localhost nova_compute[297686]: 2025-10-14 09:58:02.573 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:58:05 localhost nova_compute[297686]: 2025-10-14 09:58:05.563 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:58:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29. 
Oct 14 05:58:06 localhost podman[299158]: 2025-10-14 09:58:06.736382837 +0000 UTC m=+0.080838083 container health_status fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, container_name=iscsid, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=iscsid, org.label-schema.build-date=20251009) Oct 14 05:58:06 localhost podman[299158]: 2025-10-14 09:58:06.750156884 +0000 UTC m=+0.094612180 container exec_died fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 
(image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, container_name=iscsid, io.buildah.version=1.41.3, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS) Oct 14 05:58:06 localhost systemd[1]: fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29.service: Deactivated successfully. 
Oct 14 05:58:07 localhost nova_compute[297686]: 2025-10-14 09:58:07.619 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:58:08 localhost ceph-osd[32440]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #43. Immutable memtables: 0. Oct 14 05:58:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec. Oct 14 05:58:08 localhost systemd[1]: tmp-crun.5Cr6L7.mount: Deactivated successfully. Oct 14 05:58:08 localhost podman[299178]: 2025-10-14 09:58:08.7389475 +0000 UTC m=+0.080636997 container health_status aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', 
'/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors ) Oct 14 05:58:08 localhost podman[299178]: 2025-10-14 09:58:08.775320143 +0000 UTC m=+0.117009670 container exec_died aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Oct 14 05:58:08 localhost openstack_network_exporter[250374]: ERROR 09:58:08 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Oct 14 05:58:08 localhost openstack_network_exporter[250374]: ERROR 09:58:08 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for 
ovn-northd Oct 14 05:58:08 localhost openstack_network_exporter[250374]: ERROR 09:58:08 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Oct 14 05:58:08 localhost openstack_network_exporter[250374]: ERROR 09:58:08 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Oct 14 05:58:08 localhost openstack_network_exporter[250374]: Oct 14 05:58:08 localhost openstack_network_exporter[250374]: ERROR 09:58:08 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Oct 14 05:58:08 localhost openstack_network_exporter[250374]: Oct 14 05:58:08 localhost systemd[1]: aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec.service: Deactivated successfully. Oct 14 05:58:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad. Oct 14 05:58:10 localhost podman[299201]: 2025-10-14 09:58:10.431950222 +0000 UTC m=+0.089333142 container health_status 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS) Oct 14 05:58:10 localhost podman[299201]: 2025-10-14 09:58:10.445020717 +0000 UTC m=+0.102403647 container exec_died 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_id=multipathd, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}) Oct 14 05:58:10 localhost systemd[1]: 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad.service: Deactivated successfully. Oct 14 05:58:10 localhost nova_compute[297686]: 2025-10-14 09:58:10.564 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:58:12 localhost nova_compute[297686]: 2025-10-14 09:58:12.621 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:58:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f. Oct 14 05:58:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917. 
Oct 14 05:58:14 localhost podman[299220]: 2025-10-14 09:58:14.731710994 +0000 UTC m=+0.071548228 container health_status 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, container_name=ovn_controller, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5) Oct 14 05:58:14 localhost podman[299220]: 2025-10-14 09:58:14.791576071 +0000 UTC m=+0.131413345 container exec_died 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': 
'/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251009, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible) Oct 14 05:58:14 localhost systemd[1]: tmp-crun.vsUt0o.mount: Deactivated successfully. Oct 14 05:58:14 localhost systemd[1]: 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f.service: Deactivated successfully. Oct 14 05:58:14 localhost podman[299221]: 2025-10-14 09:58:14.813727493 +0000 UTC m=+0.148022162 container health_status 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., config_id=edpm, io.openshift.tags=minimal rhel9, name=ubi9-minimal, distribution-scope=public, version=9.6, io.openshift.expose-services=, vcs-type=git, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, architecture=x86_64) Oct 14 05:58:14 localhost podman[299221]: 2025-10-14 09:58:14.831197727 +0000 UTC m=+0.165492426 container exec_died 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses 
microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., io.openshift.expose-services=, name=ubi9-minimal, version=9.6, vcs-type=git, architecture=x86_64, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, vendor=Red Hat, Inc., distribution-scope=public, config_id=edpm, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible) Oct 14 05:58:14 localhost systemd[1]: 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917.service: Deactivated successfully. 
Oct 14 05:58:15 localhost nova_compute[297686]: 2025-10-14 09:58:15.566 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:58:15 localhost nova_compute[297686]: 2025-10-14 09:58:15.852 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 05:58:15 localhost nova_compute[297686]: 2025-10-14 09:58:15.853 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 05:58:15 localhost nova_compute[297686]: 2025-10-14 09:58:15.894 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 05:58:15 localhost nova_compute[297686]: 2025-10-14 09:58:15.895 2 DEBUG nova.compute.manager [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Oct 14 05:58:15 localhost nova_compute[297686]: 2025-10-14 09:58:15.895 2 DEBUG nova.compute.manager [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Oct 14 05:58:16 localhost nova_compute[297686]: 2025-10-14 09:58:16.076 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Acquiring lock 
"refresh_cache-88c4e366-b765-47a6-96bf-f7677f2ce67c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Oct 14 05:58:16 localhost nova_compute[297686]: 2025-10-14 09:58:16.077 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Acquired lock "refresh_cache-88c4e366-b765-47a6-96bf-f7677f2ce67c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Oct 14 05:58:16 localhost nova_compute[297686]: 2025-10-14 09:58:16.077 2 DEBUG nova.network.neutron [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] [instance: 88c4e366-b765-47a6-96bf-f7677f2ce67c] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Oct 14 05:58:16 localhost nova_compute[297686]: 2025-10-14 09:58:16.077 2 DEBUG nova.objects.instance [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Lazy-loading 'info_cache' on Instance uuid 88c4e366-b765-47a6-96bf-f7677f2ce67c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Oct 14 05:58:16 localhost nova_compute[297686]: 2025-10-14 09:58:16.531 2 DEBUG nova.network.neutron [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] [instance: 88c4e366-b765-47a6-96bf-f7677f2ce67c] Updating instance_info_cache with network_info: [{"id": "3ec9b060-f43d-4698-9c76-6062c70911d5", "address": "fa:16:3e:84:5e:e5", "network": {"id": "7d0cd696-bdd7-4e70-9512-eb0d23640314", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.46", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, 
"tenant_id": "41187b090f3d4818a32baa37ce8a3991", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3ec9b060-f4", "ovs_interfaceid": "3ec9b060-f43d-4698-9c76-6062c70911d5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Oct 14 05:58:16 localhost nova_compute[297686]: 2025-10-14 09:58:16.579 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Releasing lock "refresh_cache-88c4e366-b765-47a6-96bf-f7677f2ce67c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Oct 14 05:58:16 localhost nova_compute[297686]: 2025-10-14 09:58:16.579 2 DEBUG nova.compute.manager [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] [instance: 88c4e366-b765-47a6-96bf-f7677f2ce67c] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Oct 14 05:58:16 localhost nova_compute[297686]: 2025-10-14 09:58:16.580 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 05:58:16 localhost nova_compute[297686]: 2025-10-14 09:58:16.580 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 05:58:16 localhost nova_compute[297686]: 2025-10-14 09:58:16.581 2 DEBUG 
oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 05:58:16 localhost nova_compute[297686]: 2025-10-14 09:58:16.581 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 05:58:16 localhost nova_compute[297686]: 2025-10-14 09:58:16.581 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 05:58:16 localhost nova_compute[297686]: 2025-10-14 09:58:16.581 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 05:58:16 localhost nova_compute[297686]: 2025-10-14 09:58:16.582 2 DEBUG nova.compute.manager [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... 
_reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Oct 14 05:58:16 localhost nova_compute[297686]: 2025-10-14 09:58:16.582 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 05:58:16 localhost nova_compute[297686]: 2025-10-14 09:58:16.601 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Oct 14 05:58:16 localhost nova_compute[297686]: 2025-10-14 09:58:16.601 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Oct 14 05:58:16 localhost nova_compute[297686]: 2025-10-14 09:58:16.601 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Oct 14 05:58:16 localhost nova_compute[297686]: 2025-10-14 09:58:16.602 2 DEBUG nova.compute.resource_tracker [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Auditing locally available compute resources for np0005486733.localdomain (node: np0005486733.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Oct 14 05:58:16 localhost nova_compute[297686]: 2025-10-14 09:58:16.602 2 DEBUG 
oslo_concurrency.processutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Oct 14 05:58:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d. Oct 14 05:58:16 localhost systemd[1]: tmp-crun.aFziNg.mount: Deactivated successfully. Oct 14 05:58:16 localhost podman[299269]: 2025-10-14 09:58:16.758987891 +0000 UTC m=+0.099039311 container health_status 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251009, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team) Oct 14 05:58:16 localhost podman[299269]: 2025-10-14 09:58:16.777055693 +0000 UTC m=+0.117107153 container exec_died 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, config_id=edpm) Oct 14 05:58:16 localhost systemd[1]: 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d.service: Deactivated successfully. Oct 14 05:58:17 localhost nova_compute[297686]: 2025-10-14 09:58:17.071 2 DEBUG oslo_concurrency.processutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.469s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Oct 14 05:58:17 localhost nova_compute[297686]: 2025-10-14 09:58:17.147 2 DEBUG nova.virt.libvirt.driver [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Oct 14 05:58:17 localhost nova_compute[297686]: 2025-10-14 09:58:17.148 2 DEBUG nova.virt.libvirt.driver [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Oct 14 05:58:17 localhost nova_compute[297686]: 2025-10-14 09:58:17.369 2 WARNING nova.virt.libvirt.driver [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Oct 14 05:58:17 localhost nova_compute[297686]: 2025-10-14 09:58:17.371 2 DEBUG nova.compute.resource_tracker [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Hypervisor/Node resource view: name=np0005486733.localdomain free_ram=11951MB free_disk=41.83695602416992GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": 
"1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Oct 14 05:58:17 localhost nova_compute[297686]: 2025-10-14 09:58:17.371 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Oct 14 05:58:17 localhost nova_compute[297686]: 2025-10-14 09:58:17.371 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Oct 14 05:58:17 localhost nova_compute[297686]: 2025-10-14 09:58:17.464 2 DEBUG nova.compute.resource_tracker [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Instance 88c4e366-b765-47a6-96bf-f7677f2ce67c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Oct 14 05:58:17 localhost nova_compute[297686]: 2025-10-14 09:58:17.464 2 DEBUG nova.compute.resource_tracker [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Oct 14 05:58:17 localhost nova_compute[297686]: 2025-10-14 09:58:17.465 2 DEBUG nova.compute.resource_tracker [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Final resource view: name=np0005486733.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Oct 14 05:58:17 localhost nova_compute[297686]: 2025-10-14 09:58:17.513 2 DEBUG oslo_concurrency.processutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Oct 14 05:58:17 localhost nova_compute[297686]: 2025-10-14 09:58:17.658 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:58:17 localhost nova_compute[297686]: 2025-10-14 09:58:17.960 2 DEBUG oslo_concurrency.processutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.447s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Oct 14 05:58:17 localhost nova_compute[297686]: 2025-10-14 09:58:17.966 2 DEBUG nova.compute.provider_tree [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Inventory has not changed in ProviderTree for provider: 
18c24273-aca2-4f08-be57-3188d558235e update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Oct 14 05:58:17 localhost nova_compute[297686]: 2025-10-14 09:58:17.996 2 DEBUG nova.scheduler.client.report [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Inventory has not changed for provider 18c24273-aca2-4f08-be57-3188d558235e based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Oct 14 05:58:18 localhost nova_compute[297686]: 2025-10-14 09:58:18.018 2 DEBUG nova.compute.resource_tracker [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Compute_service record updated for np0005486733.localdomain:np0005486733.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Oct 14 05:58:18 localhost nova_compute[297686]: 2025-10-14 09:58:18.018 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.647s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Oct 14 05:58:20 localhost nova_compute[297686]: 2025-10-14 09:58:20.568 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:58:22 localhost nova_compute[297686]: 2025-10-14 09:58:22.697 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m 
Oct 14 05:58:23 localhost snmpd[68005]: empty variable list in _query Oct 14 05:58:23 localhost snmpd[68005]: empty variable list in _query Oct 14 05:58:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1463 DF PROTO=TCP SPT=52272 DPT=9102 SEQ=4262702922 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B1A96D50000000001030307) Oct 14 05:58:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1464 DF PROTO=TCP SPT=52272 DPT=9102 SEQ=4262702922 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B1A9ADA0000000001030307) Oct 14 05:58:25 localhost nova_compute[297686]: 2025-10-14 09:58:25.569 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:58:26 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1465 DF PROTO=TCP SPT=52272 DPT=9102 SEQ=4262702922 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B1AA2DA0000000001030307) Oct 14 05:58:27 localhost nova_compute[297686]: 2025-10-14 09:58:27.732 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:58:28 localhost podman[248187]: time="2025-10-14T09:58:28Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Oct 14 05:58:28 localhost podman[248187]: @ - - [14/Oct/2025:09:58:28 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 141158 "" "Go-http-client/1.1" Oct 14 05:58:28 localhost podman[248187]: @ - - 
[14/Oct/2025:09:58:28 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18349 "" "Go-http-client/1.1" Oct 14 05:58:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041. Oct 14 05:58:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857. Oct 14 05:58:28 localhost podman[299333]: 2025-10-14 09:58:28.72676744 +0000 UTC m=+0.067635494 container health_status 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter) Oct 14 05:58:28 localhost podman[299333]: 2025-10-14 09:58:28.738404039 +0000 UTC m=+0.079272073 container exec_died 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 
'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm) Oct 14 05:58:28 localhost systemd[1]: 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041.service: Deactivated successfully. Oct 14 05:58:28 localhost systemd[1]: tmp-crun.GWVCdD.mount: Deactivated successfully. Oct 14 05:58:28 localhost podman[299334]: 2025-10-14 09:58:28.793434092 +0000 UTC m=+0.130555248 container health_status 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', 
'/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5) Oct 14 05:58:28 localhost podman[299334]: 2025-10-14 09:58:28.802029065 +0000 UTC m=+0.139150221 container exec_died 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', 
'/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5) Oct 14 05:58:28 localhost systemd[1]: 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857.service: Deactivated successfully. Oct 14 05:58:30 localhost nova_compute[297686]: 2025-10-14 09:58:30.571 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:58:30 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1466 DF PROTO=TCP SPT=52272 DPT=9102 SEQ=4262702922 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B1AB29A0000000001030307) Oct 14 05:58:32 localhost nova_compute[297686]: 2025-10-14 09:58:32.765 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:58:35 localhost nova_compute[297686]: 2025-10-14 09:58:35.574 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:58:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29. 
Oct 14 05:58:37 localhost podman[299374]: 2025-10-14 09:58:37.72464311 +0000 UTC m=+0.071751965 container health_status fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_id=iscsid, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0) Oct 14 05:58:37 localhost podman[299374]: 2025-10-14 09:58:37.758928256 +0000 UTC m=+0.106037111 container exec_died fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 
(image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=iscsid, org.label-schema.vendor=CentOS, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d) Oct 14 05:58:37 localhost systemd[1]: fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29.service: Deactivated successfully. 
Oct 14 05:58:37 localhost nova_compute[297686]: 2025-10-14 09:58:37.802 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:58:38 localhost openstack_network_exporter[250374]: ERROR 09:58:38 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Oct 14 05:58:38 localhost openstack_network_exporter[250374]: ERROR 09:58:38 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Oct 14 05:58:38 localhost openstack_network_exporter[250374]: ERROR 09:58:38 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Oct 14 05:58:38 localhost openstack_network_exporter[250374]: ERROR 09:58:38 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Oct 14 05:58:38 localhost openstack_network_exporter[250374]: Oct 14 05:58:38 localhost openstack_network_exporter[250374]: ERROR 09:58:38 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Oct 14 05:58:38 localhost openstack_network_exporter[250374]: Oct 14 05:58:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec. Oct 14 05:58:39 localhost systemd[1]: tmp-crun.N3y6QS.mount: Deactivated successfully. 
Oct 14 05:58:39 localhost podman[299395]: 2025-10-14 09:58:39.747888418 +0000 UTC m=+0.084586282 container health_status aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Oct 14 05:58:39 localhost podman[299395]: 2025-10-14 09:58:39.783236427 +0000 UTC m=+0.119934361 container exec_died aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 
'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Oct 14 05:58:39 localhost systemd[1]: aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec.service: Deactivated successfully. Oct 14 05:58:40 localhost nova_compute[297686]: 2025-10-14 09:58:40.575 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:58:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad. Oct 14 05:58:40 localhost systemd[1]: tmp-crun.ucOLQI.mount: Deactivated successfully. 
Oct 14 05:58:40 localhost podman[299419]: 2025-10-14 09:58:40.747048381 +0000 UTC m=+0.088148114 container health_status 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251009, config_id=multipathd, tcib_managed=true) Oct 14 05:58:40 localhost podman[299419]: 2025-10-14 09:58:40.783404804 +0000 UTC m=+0.124504477 container exec_died 
02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=multipathd) Oct 14 05:58:40 localhost systemd[1]: 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad.service: Deactivated successfully. 
Oct 14 05:58:42 localhost nova_compute[297686]: 2025-10-14 09:58:42.804 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:58:45 localhost nova_compute[297686]: 2025-10-14 09:58:45.577 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:58:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f. Oct 14 05:58:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917. Oct 14 05:58:45 localhost systemd[1]: tmp-crun.cnPhdB.mount: Deactivated successfully. Oct 14 05:58:45 localhost podman[299439]: 2025-10-14 09:58:45.734343664 +0000 UTC m=+0.075118262 container health_status 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', 
'/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, vcs-type=git, vendor=Red Hat, Inc., architecture=x86_64, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, io.openshift.expose-services=, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., distribution-scope=public, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter) Oct 14 05:58:45 localhost podman[299439]: 2025-10-14 09:58:45.748003587 +0000 UTC m=+0.088778215 container exec_died 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, container_name=openstack_network_exporter, distribution-scope=public, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, vendor=Red Hat, Inc., config_id=edpm, 
summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, architecture=x86_64, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, io.openshift.expose-services=, managed_by=edpm_ansible, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers) Oct 14 05:58:45 localhost systemd[1]: 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917.service: Deactivated successfully. Oct 14 05:58:45 localhost podman[299438]: 2025-10-14 09:58:45.827601519 +0000 UTC m=+0.173307844 container health_status 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251009, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0) Oct 14 05:58:45 localhost podman[299438]: 2025-10-14 09:58:45.863966171 +0000 UTC m=+0.209672526 container exec_died 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team) Oct 14 05:58:45 localhost systemd[1]: 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f.service: Deactivated successfully. 
Oct 14 05:58:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d. Oct 14 05:58:47 localhost podman[299482]: 2025-10-14 09:58:47.735723169 +0000 UTC m=+0.081215365 container health_status 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, managed_by=edpm_ansible, org.label-schema.build-date=20251009, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_id=edpm, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.schema-version=1.0) Oct 14 05:58:47 localhost podman[299482]: 2025-10-14 09:58:47.745181159 +0000 UTC m=+0.090673345 container exec_died 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=edpm, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_managed=true, org.label-schema.build-date=20251009) Oct 14 05:58:47 localhost systemd[1]: 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d.service: Deactivated 
successfully. Oct 14 05:58:47 localhost nova_compute[297686]: 2025-10-14 09:58:47.807 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.816 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'name': 'test', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005486733.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '41187b090f3d4818a32baa37ce8a3991', 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'hostId': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228 Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.817 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.821 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.incoming.bytes.delta volume: 8190 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.824 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '6ffb713d-bdcc-42a3-9ec6-e516e9b30de9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 8190, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T09:58:49.817885', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': '61f94b8e-a8e4-11f0-9707-fa163e99780b', 'monotonic_time': 11946.01059037, 'message_signature': 'd8bbc72c73e8038b2ee91e9aacf4507d4d0e27748582207e74836b2caba099c7'}]}, 'timestamp': '2025-10-14 09:58:49.822658', '_unique_id': 'f90a50685df44a5c8b1aea8870152325'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.824 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:58:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.824 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.824 12 ERROR oslo_messaging.notify.messaging yield Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.824 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.824 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.824 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.824 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.824 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.824 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.824 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.824 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.824 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.824 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.824 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.824 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.824 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.824 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.824 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.824 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.824 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.824 12 ERROR oslo_messaging.notify.messaging Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.824 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.824 12 ERROR oslo_messaging.notify.messaging Oct 14 05:58:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.824 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.824 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.824 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.824 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.824 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.824 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.824 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.824 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.824 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.824 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.824 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.824 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.824 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.824 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.824 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.824 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.824 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.824 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.824 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.824 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 05:58:49 
localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.824 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.824 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.824 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.824 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.824 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.824 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.824 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.824 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.824 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.824 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.824 12 ERROR oslo_messaging.notify.messaging Oct 14 05:58:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.825 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.826 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.826 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.outgoing.packets volume: 118 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.827 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c6745c30-b8ee-4714-b674-f896929b31dd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 118, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T09:58:49.826239', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': 
{'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': '61f9ee22-a8e4-11f0-9707-fa163e99780b', 'monotonic_time': 11946.01059037, 'message_signature': '71616ed9e4f72ba7a1558e6865d703125e6be2d5ab3e3d3da4bd9b33bd219ddb'}]}, 'timestamp': '2025-10-14 09:58:49.826738', '_unique_id': '3f08d257aaa6490e9940812d6c8ab320'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.827 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.827 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.827 12 ERROR oslo_messaging.notify.messaging yield Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.827 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.827 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.827 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.827 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 05:58:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.827 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.827 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.827 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.827 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.827 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.827 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.827 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.827 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.827 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.827 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.827 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.827 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.827 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.827 12 ERROR oslo_messaging.notify.messaging Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.827 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.827 12 ERROR oslo_messaging.notify.messaging Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.827 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.827 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.827 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.827 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.827 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.827 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.827 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.827 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.827 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.827 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.827 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.827 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.827 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.827 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.827 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.827 12 ERROR oslo_messaging.notify.messaging 
File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.827 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.827 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.827 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.827 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.827 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.827 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.827 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.827 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.827 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.827 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in 
__exit__ Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.827 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.827 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.827 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.827 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.827 12 ERROR oslo_messaging.notify.messaging Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.829 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.829 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.850 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.read.requests volume: 1272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.850 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 05:58:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.852 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c0378129-2807-4b56-bf3e-5e09dd436402', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1272, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T09:58:49.829301', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '61fda2d8-a8e4-11f0-9707-fa163e99780b', 'monotonic_time': 11946.022012873, 'message_signature': '1b9d7ae8c6572da554987fc7d851c8457d22d31cdf7dbfff3028833a3f7a7f57'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 
'timestamp': '2025-10-14T09:58:49.829301', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '61fdb3d6-a8e4-11f0-9707-fa163e99780b', 'monotonic_time': 11946.022012873, 'message_signature': 'd410ce98400752a7a7866eb59b56e980d786f1c8fbb00b011082842bd49b1ac0'}]}, 'timestamp': '2025-10-14 09:58:49.851391', '_unique_id': 'bed5830f617548538a7514f3b7bfedbb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.852 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.852 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.852 12 ERROR oslo_messaging.notify.messaging yield Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.852 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.852 12 ERROR oslo_messaging.notify.messaging return 
retry_over_time( Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.852 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.852 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.852 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.852 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.852 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.852 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.852 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.852 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.852 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.852 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.852 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.852 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.852 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.852 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.852 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.852 12 ERROR oslo_messaging.notify.messaging Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.852 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.852 12 ERROR oslo_messaging.notify.messaging Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.852 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.852 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.852 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.852 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.852 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.852 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.852 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.852 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.852 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.852 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.852 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.852 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.852 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.852 
12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.852 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.852 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.852 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.852 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.852 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.852 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.852 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.852 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.852 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.852 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.852 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.852 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.852 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.852 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.852 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.852 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.852 12 ERROR oslo_messaging.notify.messaging Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.853 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.853 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.outgoing.bytes.delta volume: 10162 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.855 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '72772c66-768c-40ab-8a59-fee170c5879e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 10162, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T09:58:49.853788', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': '61fe2294-a8e4-11f0-9707-fa163e99780b', 'monotonic_time': 11946.01059037, 'message_signature': '7042385ec444a61047503089f9b48fc2d1def8eefbec223c5d4dc99f1c75c5ab'}]}, 'timestamp': '2025-10-14 09:58:49.854254', '_unique_id': 'b272e60017ce497e94bd74ddc02669c9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.855 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:58:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.855 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.855 12 ERROR oslo_messaging.notify.messaging yield Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.855 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.855 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.855 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.855 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.855 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.855 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.855 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.855 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.855 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.855 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.855 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.855 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.855 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.855 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.855 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.855 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.855 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.855 12 ERROR oslo_messaging.notify.messaging Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.855 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.855 12 ERROR oslo_messaging.notify.messaging Oct 14 05:58:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.855 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.855 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.855 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.855 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.855 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.855 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.855 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.855 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.855 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.855 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.855 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.855 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.855 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.855 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.855 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.855 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.855 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.855 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.855 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.855 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 05:58:49 
localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.855 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.855 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.855 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.855 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.855 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.855 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.855 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.855 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.855 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.855 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.855 12 ERROR oslo_messaging.notify.messaging Oct 14 05:58:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.856 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.874 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/cpu volume: 11250000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.876 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2346f74c-cce0-4cfd-a709-0a6c052a35cc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11250000000, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'timestamp': '2025-10-14T09:58:49.856355', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '62015b76-a8e4-11f0-9707-fa163e99780b', 'monotonic_time': 11946.067483914, 
'message_signature': 'cb5e4c7218662b2306976a535f3c77218018b469f47b743be61006cc3e3697f2'}]}, 'timestamp': '2025-10-14 09:58:49.875388', '_unique_id': '419fa88375d84ab4a8bc25bfe5aa3043'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.876 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.876 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.876 12 ERROR oslo_messaging.notify.messaging yield Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.876 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.876 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.876 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.876 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.876 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.876 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.876 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.876 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.876 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.876 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.876 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.876 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.876 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.876 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.876 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.876 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.876 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 05:58:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.876 12 ERROR oslo_messaging.notify.messaging Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.876 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.876 12 ERROR oslo_messaging.notify.messaging Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.876 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.876 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.876 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.876 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.876 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.876 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.876 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.876 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", 
line 653, in _send Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.876 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.876 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.876 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.876 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.876 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.876 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.876 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.876 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.876 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.876 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.876 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.876 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.876 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.876 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.876 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.876 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.876 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.876 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.876 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.876 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 05:58:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.876 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.876 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.876 12 ERROR oslo_messaging.notify.messaging Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.877 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.877 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.878 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '4430f9dc-50fb-420c-aed5-fe271a2c12cf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T09:58:49.877595', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': '6201c606-a8e4-11f0-9707-fa163e99780b', 'monotonic_time': 11946.01059037, 'message_signature': 'fb3ba808e79a56a7690c8579b93fb58e4733e1cfb35749e3da531d9ad58c9a98'}]}, 'timestamp': '2025-10-14 09:58:49.878098', '_unique_id': '42dff44b6c2e4f1b888b5f1dbdcfc8a8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.878 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:58:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.878 12 ERROR oslo_messaging.notify.messaging yield Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.878 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.878 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.878 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.878 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.878 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.878 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.878 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.878 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.878 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.878 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.878 12 ERROR oslo_messaging.notify.messaging Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.878 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.878 12 ERROR oslo_messaging.notify.messaging Oct 14 05:58:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.878 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.878 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.878 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.878 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.878 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.878 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.878 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.878 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.878 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.878 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.878 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.878 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.878 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.878 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.878 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.878 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.878 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.880 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.880 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.outgoing.bytes volume: 10162 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.881 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f901dff5-2a36-4ed7-b53c-2a6b0feafffd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 10162, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T09:58:49.880198', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': '62022a56-a8e4-11f0-9707-fa163e99780b', 'monotonic_time': 11946.01059037, 'message_signature': '08edcaf60be3a8f96fc7494e8ce96d34abcc335a7d97026ddf9745db24fe63ae'}]}, 'timestamp': '2025-10-14 09:58:49.880664', '_unique_id': '5ae5bd85717a47b2a990f1a569e03e3e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.881 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.881 12 ERROR oslo_messaging.notify.messaging yield
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.881 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.881 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.881 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.881 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.881 12 ERROR oslo_messaging.notify.messaging conn.connect()
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.881 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.881 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.881 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.881 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.881 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.881 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.881 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.881 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.881 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.881 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.881 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.881 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.881 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.881 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.881 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.881 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.881 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.881 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.881 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.881 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.881 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.881 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.881 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.881 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.882 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.893 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.894 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.895 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd5c0c3ae-4c72-4977-9056-c23d3e2ce93d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T09:58:49.882803', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '62043a6c-a8e4-11f0-9707-fa163e99780b', 'monotonic_time': 11946.075501758, 'message_signature': '4fb0e94cf7bf6b6b5eda365fe657a34b964a85e57f39faf6107ad8f609ba3992'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T09:58:49.882803', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '62045894-a8e4-11f0-9707-fa163e99780b', 'monotonic_time': 11946.075501758, 'message_signature': 'f6e4734f31ba3bbddd099d8e190aa6ebb4d33c1c392e294edb65312f5b02e20f'}]}, 'timestamp': '2025-10-14 09:58:49.894929', '_unique_id': '79538301841e47148f3ba858d740c987'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.895 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.895 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.895 12 ERROR oslo_messaging.notify.messaging yield
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.895 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.895 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.895 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.895 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.895 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.895 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.895 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.895 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.895 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.895 12 ERROR oslo_messaging.notify.messaging conn.connect()
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.895 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.895 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.895 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.895 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.895 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.895 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.895 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.895 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.895 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.895 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.895 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.895 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.895 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.895 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.895 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.895 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.895 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.895 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.895 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.895 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.895 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.895 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.895 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.895 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.895 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.895 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.895 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.895 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.895 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.895 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.895 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.895 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.895 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.895 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.895 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.895 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.895 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.895 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.895 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.895 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.895 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.897 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.897 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.incoming.packets volume: 79 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.898 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '285c948a-4049-49de-ab1a-b777fd424c3d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 79, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T09:58:49.897192', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': '6204c216-a8e4-11f0-9707-fa163e99780b', 'monotonic_time': 11946.01059037, 'message_signature': '5356c28bb70a97ec770d1edaffe5d54178ef27ab9f05a764998699cf6105968c'}]}, 'timestamp': '2025-10-14 09:58:49.897655', '_unique_id': '2852cacc8c764e4baf1087985bb3b1f9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.898 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.898 12 ERROR oslo_messaging.notify.messaging yield
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.898 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.898 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.898 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.898 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.898 12 ERROR oslo_messaging.notify.messaging conn.connect()
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.898 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.898 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.898 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.898 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.898 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.898 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.898 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.898 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.898 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.898 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.898 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.898 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.898 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.898 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.898 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.898 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.898 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 05:58:49 
localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.898 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.898 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.898 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.898 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.898 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.898 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.898 12 ERROR oslo_messaging.notify.messaging Oct 14 05:58:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.899 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.899 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.899 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.900 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/memory.usage volume: 51.81640625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.901 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '4bca7bfe-c521-4ed3-b9ba-ed6695c57c4a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.81640625, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'timestamp': '2025-10-14T09:58:49.900063', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '620531b0-a8e4-11f0-9707-fa163e99780b', 'monotonic_time': 11946.067483914, 'message_signature': '08db608f25faba267d8f26914b7f4dea7ed87c17b1e1df03a8cd63f683878afe'}]}, 'timestamp': '2025-10-14 09:58:49.900495', '_unique_id': 'acad678e8eec4df7a6992e16d8b4b538'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.901 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in 
_reraise_as_library_errors Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.901 12 ERROR oslo_messaging.notify.messaging yield Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.901 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.901 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.901 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.901 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.901 12 
ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.901 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.901 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.901 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.901 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.901 12 ERROR oslo_messaging.notify.messaging Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.901 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.901 12 ERROR oslo_messaging.notify.messaging Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.901 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 
2025-10-14 09:58:49.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.901 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.901 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.901 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.901 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.901 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 
05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.901 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.901 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.901 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.901 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.901 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 05:58:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.901 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.901 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.901 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.901 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.901 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.901 12 ERROR oslo_messaging.notify.messaging Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.902 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters Oct 14 05:58:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.902 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.903 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.904 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9d6f58ad-dbc0-4e24-ada0-e6f257903345', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T09:58:49.902591', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 
'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '620595b0-a8e4-11f0-9707-fa163e99780b', 'monotonic_time': 11946.075501758, 'message_signature': '89c50bc6a93cd8e6227f6a09beb80d2ff435542e9621045d099711a871b19d49'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T09:58:49.902591', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '6205a55a-a8e4-11f0-9707-fa163e99780b', 'monotonic_time': 11946.075501758, 'message_signature': '2b93ebad0c768a7c38f8067cfdb252c8b67b862a53cc68dbe064287d23af251e'}]}, 'timestamp': '2025-10-14 09:58:49.903440', '_unique_id': 'e0ddc64a2913451190b3ca62e12fbc76'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.904 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.904 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.904 12 ERROR oslo_messaging.notify.messaging yield Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.904 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.904 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.904 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.904 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 05:58:49 
localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.904 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.904 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.904 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.904 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.904 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.904 12 ERROR oslo_messaging.notify.messaging Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.904 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.904 12 ERROR oslo_messaging.notify.messaging Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.904 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call 
last): Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.904 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.904 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.904 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.904 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.904 12 ERROR oslo_messaging.notify.messaging 
return rpc_common.ConnectionContext(self._connection_pool, Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.904 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.904 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.904 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.904 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.904 12 ERROR oslo_messaging.notify.messaging 
self.connection.ensure_connection( Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.904 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.904 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.904 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.904 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.904 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.904 12 ERROR oslo_messaging.notify.messaging Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.905 12 INFO ceilometer.polling.manager [-] Polling pollster 
disk.device.write.requests in the context of pollsters Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.905 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.write.requests volume: 53 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.906 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.907 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '87b65b14-761b-44f5-84ef-5c4f9b167967', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 53, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T09:58:49.905579', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 
'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '62060ac2-a8e4-11f0-9707-fa163e99780b', 'monotonic_time': 11946.022012873, 'message_signature': '56e88af40702140282a5de414f593d6e7507e41054c95e01ec4e9dd475547af8'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T09:58:49.905579', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '62061a80-a8e4-11f0-9707-fa163e99780b', 'monotonic_time': 11946.022012873, 'message_signature': '5e425dd99ae90c0e3bcbc493ce037abb5e212bca255371838f2002fb679b2397'}]}, 'timestamp': '2025-10-14 09:58:49.906438', '_unique_id': 'd83c325bcc08436f8dfe5be932df55eb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.907 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:58:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.907 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.907 12 ERROR oslo_messaging.notify.messaging yield Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.907 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.907 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.907 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.907 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.907 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.907 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.907 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.907 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.907 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.907 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.907 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.907 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.907 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.907 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.907 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.907 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.907 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.907 12 ERROR oslo_messaging.notify.messaging Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.907 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.907 12 ERROR oslo_messaging.notify.messaging Oct 14 05:58:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.907 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.907 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.907 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.907 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.907 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.907 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.907 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.907 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.907 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.907 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.907 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.907 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.907 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.907 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.907 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.907 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.907 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.907 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.907 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.907 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 05:58:49 
localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.907 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.907 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.907 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.907 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.907 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.907 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.907 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.907 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.907 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.907 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.907 12 ERROR oslo_messaging.notify.messaging Oct 14 05:58:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.908 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.908 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.read.latency volume: 1739521236 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.909 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.read.latency volume: 110324003 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.910 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cb9ad1bd-87b2-40a5-bc5f-62a24cb0b074', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1739521236, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T09:58:49.908553', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': 
{'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '62067ec6-a8e4-11f0-9707-fa163e99780b', 'monotonic_time': 11946.022012873, 'message_signature': '66821df244c0ad2299309d10378398a3fa8f5dd50990cb20cc816e9a4b313d3d'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 110324003, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T09:58:49.908553', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '62068fce-a8e4-11f0-9707-fa163e99780b', 'monotonic_time': 11946.022012873, 'message_signature': '12a80cd90a5821865179028f2b1b89467cad2c364ec3ba89bc37f5c67ffd9b95'}]}, 'timestamp': '2025-10-14 09:58:49.909446', '_unique_id': '679fa8a7b3c04ce1b8b25517584b00c2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 
09:58:49.910 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.910 12 ERROR oslo_messaging.notify.messaging yield Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.910 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.910 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.910 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.910 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 05:58:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.910 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.910 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.910 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.910 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.910 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.910 12 ERROR oslo_messaging.notify.messaging Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.910 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 
2025-10-14 09:58:49.910 12 ERROR oslo_messaging.notify.messaging Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.910 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.910 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.910 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.910 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.910 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.910 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.910 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.910 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.910 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.910 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.910 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.910 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.910 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.910 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.910 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.910 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.910 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.910 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:58:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.910 12 ERROR oslo_messaging.notify.messaging Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.911 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.911 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.write.bytes volume: 454656 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.912 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.913 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '101721c4-7201-4c24-ac63-450dcd54253a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 454656, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T09:58:49.911549', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '6206f4dc-a8e4-11f0-9707-fa163e99780b', 'monotonic_time': 11946.022012873, 'message_signature': '992b8b3dae32443552e1b8d9bef2ef421e59aac923a45431b1e6a0ff2f42e8ce'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T09:58:49.911549', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '620704a4-a8e4-11f0-9707-fa163e99780b', 'monotonic_time': 11946.022012873, 'message_signature': 'ec358bf136ff32a3ddc6593bd6456f45e6c642aa0fde98ad61be8135d9d757b5'}]}, 'timestamp': '2025-10-14 09:58:49.912433', '_unique_id': '77161c750c8a420da59feaa76fdb5fd6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.913 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.913 12 ERROR oslo_messaging.notify.messaging yield Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.913 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.913 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.913 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.913 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.913 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.913 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.913 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 
05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.913 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.913 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.913 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.913 12 ERROR oslo_messaging.notify.messaging Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.913 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.913 12 ERROR oslo_messaging.notify.messaging Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.913 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.913 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 05:58:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.913 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.913 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.913 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.913 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.913 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.913 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.913 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.913 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.913 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.913 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:58:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.913 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.913 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.913 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.913 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.913 12 ERROR oslo_messaging.notify.messaging Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.914 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.914 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.915 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 05:58:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.916 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c076c6cc-069d-4a42-a393-ddeb4a8a4118', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T09:58:49.914558', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '62076926-a8e4-11f0-9707-fa163e99780b', 'monotonic_time': 11946.075501758, 'message_signature': '13e38bdb4925f19bb457f9a1cb1c417be57f57dd56da98fc82f5a810fad70a45'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': 
'2025-10-14T09:58:49.914558', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '620778da-a8e4-11f0-9707-fa163e99780b', 'monotonic_time': 11946.075501758, 'message_signature': 'accb2fea12953a1f0fde4d3f67e41232a1997d319c3d9b2d0eec651b58575443'}]}, 'timestamp': '2025-10-14 09:58:49.915409', '_unique_id': '92a8aca9d3774eed81aae8e76033095c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.916 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.916 12 ERROR oslo_messaging.notify.messaging yield Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.916 12 ERROR oslo_messaging.notify.messaging return retry_over_time( 
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.916 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.916 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.916 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.916 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.916 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.916 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.916 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.916 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.916 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.916 12 ERROR oslo_messaging.notify.messaging Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.916 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.916 12 ERROR oslo_messaging.notify.messaging Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.916 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.916 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.916 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.916 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.916 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.916 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.916 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.916 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.916 
12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.916 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.916 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.916 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.916 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.916 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.916 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.916 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.916 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.916 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.916 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.916 12 ERROR oslo_messaging.notify.messaging Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.917 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.917 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.write.latency volume: 1178650666 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.918 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.write.latency volume: 
22746151 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.919 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fb998180-78f5-47b7-b0f2-689325bb06cf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1178650666, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T09:58:49.917535', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '6207dd7a-a8e4-11f0-9707-fa163e99780b', 'monotonic_time': 11946.022012873, 'message_signature': 'db0e5d15372d679b4792b9aa289464fd30097120233033af45441219f25f9e7b'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 22746151, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 
'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T09:58:49.917535', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '6207ed24-a8e4-11f0-9707-fa163e99780b', 'monotonic_time': 11946.022012873, 'message_signature': '725aec0f77619a3c9642cfa5971c7fe20a493992e0efb5d3c83e0fd55f1a8977'}]}, 'timestamp': '2025-10-14 09:58:49.918387', '_unique_id': 'af462322d6a94deb8b47ae69475a238b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.919 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.919 12 ERROR oslo_messaging.notify.messaging yield Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 
14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.919 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.919 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.919 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.919 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.919 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.919 12 ERROR oslo_messaging.notify.messaging 
self.transport.connect() Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.919 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.919 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.919 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.919 12 ERROR oslo_messaging.notify.messaging Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.919 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.919 12 ERROR oslo_messaging.notify.messaging Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.919 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.919 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 05:58:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.919 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.919 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.919 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.919 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.919 12 ERROR oslo_messaging.notify.messaging self.connection = 
connection_pool.get(retry=retry) Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.919 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.919 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.919 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.919 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.919 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 05:58:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.919 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.919 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.919 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.919 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.919 12 ERROR oslo_messaging.notify.messaging Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.920 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.920 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.921 12 ERROR 
oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b0f65356-ff02-48f9-be1a-a6d4c96bf229', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T09:58:49.920512', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': '620851ec-a8e4-11f0-9707-fa163e99780b', 'monotonic_time': 11946.01059037, 'message_signature': '0fd54f5465bc997720e996adbd3302c41478a2255d24307336bf60b8b454b21f'}]}, 'timestamp': '2025-10-14 09:58:49.921022', '_unique_id': 'c028c38664e645c2b5b2f55fe1357fb3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.921 12 ERROR oslo_messaging.notify.messaging 
Traceback (most recent call last): Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.921 12 ERROR oslo_messaging.notify.messaging yield Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.921 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.921 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.921 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.921 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.921 
12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.921 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.921 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.921 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.921 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.921 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.921 12 ERROR oslo_messaging.notify.messaging Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.921 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.921 12 ERROR 
oslo_messaging.notify.messaging Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.921 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.921 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.921 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.921 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.921 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.921 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.921 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.921 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.921 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.921 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.921 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.921 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.921 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.921 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.921 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.921 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.921 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.921 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:58:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.921 12 ERROR oslo_messaging.notify.messaging Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.922 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.923 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.read.bytes volume: 35560448 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.923 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.924 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '2cf4473e-70cc-4bfe-93a0-0c38f433842e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35560448, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T09:58:49.923078', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '6208b4b6-a8e4-11f0-9707-fa163e99780b', 'monotonic_time': 11946.022012873, 'message_signature': '1736f51d0ab3233820e18a7094e507c718024179de2563d8efbc59b62dfb8774'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T09:58:49.923078', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '6208c582-a8e4-11f0-9707-fa163e99780b', 'monotonic_time': 11946.022012873, 'message_signature': 'f8a0b0e22a55905f416357868e9d4be0b2bc88f2548e3c98778560d2239515a6'}]}, 'timestamp': '2025-10-14 09:58:49.923928', '_unique_id': '97dd2e3f662d4f50adf05b5f95165a2c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.924 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.924 12 ERROR oslo_messaging.notify.messaging yield Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.924 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.924 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.924 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.924 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.924 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.924 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.924 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 
05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.924 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.924 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.924 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.924 12 ERROR oslo_messaging.notify.messaging Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.924 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.924 12 ERROR oslo_messaging.notify.messaging Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.924 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.924 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 05:58:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.924 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.924 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.924 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.924 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.924 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.924 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.924 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.924 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.924 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.924 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 05:58:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.924 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.924 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.924 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.924 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.924 12 ERROR oslo_messaging.notify.messaging Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.926 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.926 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.927 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'a29ec124-80a7-4e58-a308-124993d17c38', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T09:58:49.926150', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': '62092d1a-a8e4-11f0-9707-fa163e99780b', 'monotonic_time': 11946.01059037, 'message_signature': 'b425431f871942882b5d5ba5716c49ad2fbdf12236d3704ebba56643be837586'}]}, 'timestamp': '2025-10-14 09:58:49.926608', '_unique_id': '121e6305d1fd48908592591fb36495a4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.927 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 05:58:49 localhost
ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.927 12 ERROR oslo_messaging.notify.messaging yield
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.927 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.927 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.927 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.927 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.927 12 ERROR oslo_messaging.notify.messaging conn.connect()
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.927 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.927 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.927 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.927 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.927 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.927 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.927 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.927 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.927 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.927 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.927 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.927 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.927 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.927 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.927 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.927 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.927 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.927 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.927 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.927 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.927 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.927 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.927 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.927 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:58:49 localhost
ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.928 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.928 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.incoming.bytes volume: 8190 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.929 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '482bab70-9623-4fd0-aa7a-a0ebf8084029', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 8190, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T09:58:49.928655', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': '62099020-a8e4-11f0-9707-fa163e99780b', 'monotonic_time': 11946.01059037, 'message_signature': '7ee0e6aca0c8d77d7f842bc664e2c58529eb537cae9d527fa185c8638e96fae3'}]}, 'timestamp': '2025-10-14 09:58:49.929169', '_unique_id': '60b8205bca7f4714a8f5314e2b5870b0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.929 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.929 12 ERROR oslo_messaging.notify.messaging yield
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.929 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.929 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.929 12 ERROR
oslo_messaging.notify.messaging self._connection = self._establish_connection()
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.929 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.929 12 ERROR oslo_messaging.notify.messaging conn.connect()
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.929 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.929 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.929 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.929 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.929 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.929 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.929 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.929 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.929 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.929 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.929 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.929 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.929 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.929 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.929 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.929 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.929 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.929 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.929 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.929 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.929 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.929 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.929 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.929 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.930 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.930 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.931 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications.
Payload={'message_id': 'a74d4747-ed3d-4c23-9d7f-3ab5d401d3d0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T09:58:49.930554', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': '6209d6a2-a8e4-11f0-9707-fa163e99780b', 'monotonic_time': 11946.01059037, 'message_signature': 'a13d93297d3a2c89ee78d180b5d7f21025cb8b5c2bf5fbada43d0d48fbb2de5b'}]}, 'timestamp': '2025-10-14 09:58:49.930872', '_unique_id': '7c0793cd8fbe4b538a25953b8cb26136'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.931 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 05:58:49 localhost
ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.931 12 ERROR oslo_messaging.notify.messaging yield
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.931 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.931 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.931 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.931 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.931 12 ERROR oslo_messaging.notify.messaging conn.connect()
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.931 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.931 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.931 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.931 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.931 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.931 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.931 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.931 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.931 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.931 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.931 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.931 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.931 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.931 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.931 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.931 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.931 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.931 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.931 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.931 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.931 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.931 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.931 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 05:58:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 09:58:49.931 12 ERROR oslo_messaging.notify.messaging
Oct 14 05:58:50 localhost
nova_compute[297686]: 2025-10-14 09:58:50.578 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 14 05:58:52 localhost nova_compute[297686]: 2025-10-14 09:58:52.849 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 14 05:58:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12891 DF PROTO=TCP SPT=44720 DPT=9102 SEQ=2670847253 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B1B0C050000000001030307)
Oct 14 05:58:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12892 DF PROTO=TCP SPT=44720 DPT=9102 SEQ=2670847253 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B1B101A0000000001030307)
Oct 14 05:58:55 localhost nova_compute[297686]: 2025-10-14 09:58:55.619 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 14 05:58:56 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12893 DF PROTO=TCP SPT=44720 DPT=9102 SEQ=2670847253 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B1B181A0000000001030307)
Oct 14 05:58:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:58:57.767 163055 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct 14 05:58:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:58:57.767 163055 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct 14 05:58:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:58:57.769 163055 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct 14 05:58:57 localhost nova_compute[297686]: 2025-10-14 09:58:57.901 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 14 05:58:58 localhost podman[248187]: time="2025-10-14T09:58:58Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 14 05:58:58 localhost podman[248187]: @ - - [14/Oct/2025:09:58:58 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 141158 "" "Go-http-client/1.1"
Oct 14 05:58:58 localhost podman[248187]: @ - - [14/Oct/2025:09:58:58 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18358 "" "Go-http-client/1.1"
Oct 14 05:58:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041.
Oct 14 05:58:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857.
Oct 14 05:58:59 localhost podman[299588]: 2025-10-14 09:58:59.75790491 +0000 UTC m=+0.092356678 container health_status 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, managed_by=edpm_ansible, org.label-schema.build-date=20251009, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team) Oct 14 05:58:59 localhost podman[299587]: 2025-10-14 09:58:59.728801938 +0000 UTC 
m=+0.068221263 container health_status 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Oct 14 05:58:59 localhost podman[299588]: 2025-10-14 09:58:59.792354282 +0000 UTC m=+0.126806040 container exec_died 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251009) Oct 14 05:58:59 localhost systemd[1]: 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857.service: Deactivated successfully. 
Oct 14 05:58:59 localhost podman[299587]: 2025-10-14 09:58:59.814153563 +0000 UTC m=+0.153572858 container exec_died 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Oct 14 05:58:59 localhost systemd[1]: 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041.service: Deactivated successfully. 
Oct 14 05:59:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12894 DF PROTO=TCP SPT=44720 DPT=9102 SEQ=2670847253 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B1B27DA0000000001030307)
Oct 14 05:59:00 localhost nova_compute[297686]: 2025-10-14 09:59:00.655 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 14 05:59:02 localhost nova_compute[297686]: 2025-10-14 09:59:02.939 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 14 05:59:05 localhost nova_compute[297686]: 2025-10-14 09:59:05.681 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 14 05:59:07 localhost nova_compute[297686]: 2025-10-14 09:59:07.983 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 14 05:59:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29.
Oct 14 05:59:08 localhost podman[299628]: 2025-10-14 09:59:08.741356023 +0000 UTC m=+0.079605743 container health_status fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, config_id=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.license=GPLv2, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS) Oct 14 05:59:08 localhost openstack_network_exporter[250374]: ERROR 09:59:08 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Oct 14 05:59:08 localhost openstack_network_exporter[250374]: ERROR 09:59:08 
appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Oct 14 05:59:08 localhost openstack_network_exporter[250374]: ERROR 09:59:08 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Oct 14 05:59:08 localhost podman[299628]: 2025-10-14 09:59:08.777752227 +0000 UTC m=+0.116001947 container exec_died fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, maintainer=OpenStack Kubernetes Operator team, config_id=iscsid, container_name=iscsid, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image) 
Oct 14 05:59:08 localhost openstack_network_exporter[250374]: ERROR 09:59:08 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Oct 14 05:59:08 localhost openstack_network_exporter[250374]: Oct 14 05:59:08 localhost openstack_network_exporter[250374]: ERROR 09:59:08 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Oct 14 05:59:08 localhost openstack_network_exporter[250374]: Oct 14 05:59:08 localhost systemd[1]: fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29.service: Deactivated successfully. Oct 14 05:59:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec. Oct 14 05:59:10 localhost podman[299647]: 2025-10-14 09:59:10.418177073 +0000 UTC m=+0.073868281 container health_status aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 
'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Oct 14 05:59:10 localhost podman[299647]: 2025-10-14 09:59:10.430145223 +0000 UTC m=+0.085836471 container exec_died aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Oct 14 05:59:10 localhost systemd[1]: aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec.service: Deactivated successfully. 
Oct 14 05:59:10 localhost nova_compute[297686]: 2025-10-14 09:59:10.728 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:59:10 localhost sshd[299671]: main: sshd: ssh-rsa algorithm is disabled Oct 14 05:59:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad. Oct 14 05:59:10 localhost systemd-logind[760]: New session 64 of user zuul. Oct 14 05:59:10 localhost systemd[1]: Started Session 64 of User zuul. Oct 14 05:59:10 localhost podman[299673]: 2025-10-14 09:59:10.964855188 +0000 UTC m=+0.089359553 container health_status 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team) Oct 14 05:59:10 localhost podman[299673]: 2025-10-14 09:59:10.975484135 +0000 UTC m=+0.099988490 container exec_died 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', 
'/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.build-date=20251009, managed_by=edpm_ansible) Oct 14 05:59:10 localhost systemd[1]: 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad.service: Deactivated successfully. Oct 14 05:59:11 localhost python3[299711]: ansible-ansible.legacy.command Invoked with _raw_params=subscription-manager unregister _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Oct 14 05:59:11 localhost subscription-manager[299712]: Unregistered machine with identity: 2fbb0dcc-d71d-422c-902e-d4c1db889f51 Oct 14 05:59:11 localhost systemd-journald[47488]: Field hash table of /run/log/journal/8e1d5208cffec42b50976967e1d1cfd0/system.journal has a fill level at 75.4 (251 of 333 items), suggesting rotation. Oct 14 05:59:11 localhost systemd-journald[47488]: /run/log/journal/8e1d5208cffec42b50976967e1d1cfd0/system.journal: Journal header limits reached or header out-of-date, rotating. Oct 14 05:59:11 localhost rsyslogd[759]: imjournal: journal files changed, reloading... [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ] Oct 14 05:59:11 localhost rsyslogd[759]: imjournal: journal files changed, reloading... 
[v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ] Oct 14 05:59:13 localhost nova_compute[297686]: 2025-10-14 09:59:13.015 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:59:15 localhost nova_compute[297686]: 2025-10-14 09:59:15.762 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:59:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f. Oct 14 05:59:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917. Oct 14 05:59:16 localhost podman[299715]: 2025-10-14 09:59:16.760277713 +0000 UTC m=+0.090228925 container health_status 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, 
config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image) Oct 14 05:59:16 localhost podman[299716]: 2025-10-14 09:59:16.85317954 +0000 UTC m=+0.182605686 container health_status 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.openshift.expose-services=, vendor=Red Hat, Inc., distribution-scope=public, vcs-type=git, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, release=1755695350, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., name=ubi9-minimal, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}) Oct 14 05:59:16 localhost podman[299715]: 2025-10-14 09:59:16.857857913 +0000 UTC m=+0.187809165 container exec_died 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 
'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller) Oct 14 05:59:16 localhost systemd[1]: 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f.service: Deactivated successfully. Oct 14 05:59:16 localhost podman[299716]: 2025-10-14 09:59:16.893308235 +0000 UTC m=+0.222734361 container exec_died 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, managed_by=edpm_ansible, vcs-type=git, io.openshift.tags=minimal rhel9, release=1755695350, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public) Oct 14 05:59:16 localhost systemd[1]: 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917.service: Deactivated 
successfully. Oct 14 05:59:18 localhost nova_compute[297686]: 2025-10-14 09:59:18.019 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 05:59:18 localhost nova_compute[297686]: 2025-10-14 09:59:18.020 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 05:59:18 localhost nova_compute[297686]: 2025-10-14 09:59:18.020 2 DEBUG nova.compute.manager [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Oct 14 05:59:18 localhost nova_compute[297686]: 2025-10-14 09:59:18.020 2 DEBUG nova.compute.manager [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Oct 14 05:59:18 localhost nova_compute[297686]: 2025-10-14 09:59:18.058 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:59:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d. 
Oct 14 05:59:18 localhost podman[299757]: 2025-10-14 09:59:18.739015499 +0000 UTC m=+0.079435386 container health_status 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.vendor=CentOS, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3) Oct 14 05:59:18 localhost podman[299757]: 2025-10-14 09:59:18.751194611 +0000 UTC m=+0.091614498 container exec_died 
89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=edpm, org.label-schema.vendor=CentOS, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009) Oct 14 05:59:18 localhost systemd[1]: 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d.service: Deactivated successfully. 
Oct 14 05:59:19 localhost nova_compute[297686]: 2025-10-14 09:59:19.102 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Acquiring lock "refresh_cache-88c4e366-b765-47a6-96bf-f7677f2ce67c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Oct 14 05:59:19 localhost nova_compute[297686]: 2025-10-14 09:59:19.103 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Acquired lock "refresh_cache-88c4e366-b765-47a6-96bf-f7677f2ce67c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Oct 14 05:59:19 localhost nova_compute[297686]: 2025-10-14 09:59:19.103 2 DEBUG nova.network.neutron [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] [instance: 88c4e366-b765-47a6-96bf-f7677f2ce67c] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Oct 14 05:59:19 localhost nova_compute[297686]: 2025-10-14 09:59:19.104 2 DEBUG nova.objects.instance [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Lazy-loading 'info_cache' on Instance uuid 88c4e366-b765-47a6-96bf-f7677f2ce67c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Oct 14 05:59:20 localhost nova_compute[297686]: 2025-10-14 09:59:20.797 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:59:22 localhost nova_compute[297686]: 2025-10-14 09:59:22.141 2 DEBUG nova.network.neutron [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] [instance: 88c4e366-b765-47a6-96bf-f7677f2ce67c] Updating instance_info_cache with network_info: [{"id": "3ec9b060-f43d-4698-9c76-6062c70911d5", "address": "fa:16:3e:84:5e:e5", "network": {"id": "7d0cd696-bdd7-4e70-9512-eb0d23640314", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": 
[], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.46", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "41187b090f3d4818a32baa37ce8a3991", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3ec9b060-f4", "ovs_interfaceid": "3ec9b060-f43d-4698-9c76-6062c70911d5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Oct 14 05:59:22 localhost nova_compute[297686]: 2025-10-14 09:59:22.160 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Releasing lock "refresh_cache-88c4e366-b765-47a6-96bf-f7677f2ce67c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Oct 14 05:59:22 localhost nova_compute[297686]: 2025-10-14 09:59:22.160 2 DEBUG nova.compute.manager [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] [instance: 88c4e366-b765-47a6-96bf-f7677f2ce67c] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Oct 14 05:59:22 localhost nova_compute[297686]: 2025-10-14 09:59:22.161 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 05:59:22 localhost 
nova_compute[297686]: 2025-10-14 09:59:22.161 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 05:59:22 localhost nova_compute[297686]: 2025-10-14 09:59:22.162 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 05:59:22 localhost nova_compute[297686]: 2025-10-14 09:59:22.162 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 05:59:22 localhost nova_compute[297686]: 2025-10-14 09:59:22.163 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 05:59:22 localhost nova_compute[297686]: 2025-10-14 09:59:22.163 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 05:59:22 localhost nova_compute[297686]: 2025-10-14 09:59:22.164 2 DEBUG nova.compute.manager [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... 
_reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Oct 14 05:59:22 localhost nova_compute[297686]: 2025-10-14 09:59:22.164 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 05:59:22 localhost nova_compute[297686]: 2025-10-14 09:59:22.187 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Oct 14 05:59:22 localhost nova_compute[297686]: 2025-10-14 09:59:22.187 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Oct 14 05:59:22 localhost nova_compute[297686]: 2025-10-14 09:59:22.188 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Oct 14 05:59:22 localhost nova_compute[297686]: 2025-10-14 09:59:22.188 2 DEBUG nova.compute.resource_tracker [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Auditing locally available compute resources for np0005486733.localdomain (node: np0005486733.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Oct 14 05:59:22 localhost nova_compute[297686]: 2025-10-14 09:59:22.189 2 DEBUG 
oslo_concurrency.processutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Oct 14 05:59:22 localhost nova_compute[297686]: 2025-10-14 09:59:22.647 2 DEBUG oslo_concurrency.processutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.458s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Oct 14 05:59:22 localhost nova_compute[297686]: 2025-10-14 09:59:22.709 2 DEBUG nova.virt.libvirt.driver [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Oct 14 05:59:22 localhost nova_compute[297686]: 2025-10-14 09:59:22.709 2 DEBUG nova.virt.libvirt.driver [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Oct 14 05:59:22 localhost nova_compute[297686]: 2025-10-14 09:59:22.915 2 WARNING nova.virt.libvirt.driver [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Oct 14 05:59:22 localhost nova_compute[297686]: 2025-10-14 09:59:22.916 2 DEBUG nova.compute.resource_tracker [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Hypervisor/Node resource view: name=np0005486733.localdomain free_ram=11950MB free_disk=41.83695602416992GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": 
"1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Oct 14 05:59:22 localhost nova_compute[297686]: 2025-10-14 09:59:22.917 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Oct 14 05:59:22 localhost nova_compute[297686]: 2025-10-14 09:59:22.917 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Oct 14 05:59:22 localhost nova_compute[297686]: 2025-10-14 09:59:22.994 2 DEBUG nova.compute.resource_tracker [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Instance 88c4e366-b765-47a6-96bf-f7677f2ce67c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Oct 14 05:59:22 localhost nova_compute[297686]: 2025-10-14 09:59:22.994 2 DEBUG nova.compute.resource_tracker [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Oct 14 05:59:22 localhost nova_compute[297686]: 2025-10-14 09:59:22.995 2 DEBUG nova.compute.resource_tracker [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Final resource view: name=np0005486733.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Oct 14 05:59:23 localhost nova_compute[297686]: 2025-10-14 09:59:23.035 2 DEBUG oslo_concurrency.processutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Oct 14 05:59:23 localhost nova_compute[297686]: 2025-10-14 09:59:23.085 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:59:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20102 DF PROTO=TCP SPT=58620 DPT=9102 SEQ=4182440142 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B1B81340000000001030307) Oct 14 05:59:23 localhost nova_compute[297686]: 2025-10-14 09:59:23.515 2 DEBUG oslo_concurrency.processutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" 
returned: 0 in 0.481s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Oct 14 05:59:23 localhost nova_compute[297686]: 2025-10-14 09:59:23.520 2 DEBUG nova.compute.provider_tree [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Inventory has not changed in ProviderTree for provider: 18c24273-aca2-4f08-be57-3188d558235e update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Oct 14 05:59:23 localhost nova_compute[297686]: 2025-10-14 09:59:23.536 2 DEBUG nova.scheduler.client.report [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Inventory has not changed for provider 18c24273-aca2-4f08-be57-3188d558235e based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Oct 14 05:59:23 localhost nova_compute[297686]: 2025-10-14 09:59:23.537 2 DEBUG nova.compute.resource_tracker [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Compute_service record updated for np0005486733.localdomain:np0005486733.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Oct 14 05:59:23 localhost nova_compute[297686]: 2025-10-14 09:59:23.537 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.620s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Oct 14 05:59:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b 
MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20103 DF PROTO=TCP SPT=58620 DPT=9102 SEQ=4182440142 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B1B855A0000000001030307) Oct 14 05:59:25 localhost systemd[1]: virtsecretd.service: Deactivated successfully. Oct 14 05:59:25 localhost nova_compute[297686]: 2025-10-14 09:59:25.826 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:59:26 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20104 DF PROTO=TCP SPT=58620 DPT=9102 SEQ=4182440142 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B1B8D5A0000000001030307) Oct 14 05:59:28 localhost nova_compute[297686]: 2025-10-14 09:59:28.121 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:59:28 localhost podman[248187]: time="2025-10-14T09:59:28Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Oct 14 05:59:28 localhost podman[248187]: @ - - [14/Oct/2025:09:59:28 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 141158 "" "Go-http-client/1.1" Oct 14 05:59:28 localhost podman[248187]: @ - - [14/Oct/2025:09:59:28 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18361 "" "Go-http-client/1.1" Oct 14 05:59:30 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:6c:d8:2b MACDST=fa:16:3e:c0:7d:39 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20105 DF PROTO=TCP SPT=58620 DPT=9102 SEQ=4182440142 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A2B1B9D1A0000000001030307) Oct 14 
05:59:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041. Oct 14 05:59:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857. Oct 14 05:59:30 localhost systemd[1]: tmp-crun.NBwmFT.mount: Deactivated successfully. Oct 14 05:59:30 localhost podman[299822]: 2025-10-14 09:59:30.746740761 +0000 UTC m=+0.084701168 container health_status 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5) Oct 14 05:59:30 localhost podman[299822]: 2025-10-14 09:59:30.755081596 +0000 UTC m=+0.093042023 container exec_died 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, 
org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_id=ovn_metadata_agent) Oct 14 05:59:30 localhost systemd[1]: 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857.service: Deactivated successfully. Oct 14 05:59:30 localhost nova_compute[297686]: 2025-10-14 09:59:30.828 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:59:30 localhost podman[299821]: 2025-10-14 09:59:30.849460957 +0000 UTC m=+0.186302549 container health_status 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Oct 14 05:59:30 localhost podman[299821]: 2025-10-14 09:59:30.857721749 +0000 UTC m=+0.194563391 container exec_died 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, 
config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Oct 14 05:59:30 localhost systemd[1]: 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041.service: Deactivated successfully. Oct 14 05:59:33 localhost nova_compute[297686]: 2025-10-14 09:59:33.124 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:59:35 localhost nova_compute[297686]: 2025-10-14 09:59:35.831 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:59:38 localhost nova_compute[297686]: 2025-10-14 09:59:38.127 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:59:38 localhost openstack_network_exporter[250374]: ERROR 09:59:38 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Oct 14 05:59:38 localhost openstack_network_exporter[250374]: ERROR 09:59:38 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Oct 14 05:59:38 localhost openstack_network_exporter[250374]: ERROR 09:59:38 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Oct 14 05:59:38 
localhost openstack_network_exporter[250374]: ERROR 09:59:38 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Oct 14 05:59:38 localhost openstack_network_exporter[250374]: Oct 14 05:59:38 localhost openstack_network_exporter[250374]: ERROR 09:59:38 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Oct 14 05:59:38 localhost openstack_network_exporter[250374]: Oct 14 05:59:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29. Oct 14 05:59:39 localhost systemd[1]: tmp-crun.gDkn2R.mount: Deactivated successfully. Oct 14 05:59:39 localhost podman[299881]: 2025-10-14 09:59:39.064898134 +0000 UTC m=+0.076897350 container health_status fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, container_name=iscsid, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.build-date=20251009, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true) Oct 14 05:59:39 localhost podman[299881]: 2025-10-14 09:59:39.072654961 +0000 UTC m=+0.084654227 container exec_died fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_id=iscsid, org.label-schema.vendor=CentOS, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, 
io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=iscsid) Oct 14 05:59:39 localhost systemd[1]: fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29.service: Deactivated successfully. Oct 14 05:59:39 localhost podman[299986]: 2025-10-14 09:59:39.850745947 +0000 UTC m=+0.093187116 container exec b19a91314e045cbe5465f6abdeb6b420108fcda7860b498b01dcd4e65236d2c8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-crash-np0005486733, RELEASE=main, GIT_CLEAN=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, architecture=x86_64, build-date=2025-09-24T08:57:55, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, io.openshift.expose-services=, com.redhat.component=rhceph-container, version=7, GIT_BRANCH=main, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, CEPH_POINT_RELEASE=, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux , release=553, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, ceph=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements) Oct 14 05:59:39 localhost podman[299986]: 2025-10-14 09:59:39.958208919 +0000 UTC m=+0.200650068 container exec_died b19a91314e045cbe5465f6abdeb6b420108fcda7860b498b01dcd4e65236d2c8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-crash-np0005486733, GIT_BRANCH=main, vendor=Red Hat, Inc., 
GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/agreements, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, architecture=x86_64, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, release=553, name=rhceph, GIT_CLEAN=True, ceph=True, maintainer=Guillaume Abrioux , build-date=2025-09-24T08:57:55, vcs-type=git, io.buildah.version=1.33.12, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, RELEASE=main, CEPH_POINT_RELEASE=, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, io.openshift.tags=rhceph ceph) Oct 14 05:59:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec. 
Oct 14 05:59:40 localhost podman[300053]: 2025-10-14 09:59:40.753226243 +0000 UTC m=+0.083329806 container health_status aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Oct 14 05:59:40 localhost podman[300053]: 2025-10-14 09:59:40.76135508 +0000 UTC m=+0.091458663 container exec_died aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 
'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Oct 14 05:59:40 localhost systemd[1]: aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec.service: Deactivated successfully. Oct 14 05:59:40 localhost nova_compute[297686]: 2025-10-14 09:59:40.880 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:59:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad. 
Oct 14 05:59:41 localhost podman[300112]: 2025-10-14 09:59:41.387463377 +0000 UTC m=+0.068334587 container health_status 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.build-date=20251009, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_id=multipathd) Oct 14 05:59:41 localhost podman[300112]: 2025-10-14 09:59:41.404037153 +0000 UTC m=+0.084908413 container exec_died 
02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=multipathd, org.label-schema.build-date=20251009) Oct 14 05:59:41 localhost systemd[1]: 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad.service: Deactivated successfully. 
Oct 14 05:59:43 localhost nova_compute[297686]: 2025-10-14 09:59:43.130 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:59:45 localhost nova_compute[297686]: 2025-10-14 09:59:45.884 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:59:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f. Oct 14 05:59:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917. Oct 14 05:59:47 localhost systemd[1]: tmp-crun.BlO9nu.mount: Deactivated successfully. Oct 14 05:59:47 localhost podman[300150]: 2025-10-14 09:59:47.74577888 +0000 UTC m=+0.083498830 container health_status 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, release=1755695350, io.openshift.expose-services=, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 
'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, vcs-type=git, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7) Oct 14 05:59:47 localhost systemd[1]: tmp-crun.rsuKxh.mount: Deactivated successfully. 
Oct 14 05:59:47 localhost podman[300149]: 2025-10-14 09:59:47.796426156 +0000 UTC m=+0.136806378 container health_status 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251009) Oct 14 05:59:47 localhost podman[300150]: 2025-10-14 09:59:47.814525099 +0000 UTC m=+0.152245039 container exec_died 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 
'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, config_id=edpm, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., name=ubi9-minimal, architecture=x86_64, com.redhat.component=ubi9-minimal-container, distribution-scope=public, version=9.6, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git) Oct 14 05:59:47 localhost systemd[1]: 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917.service: Deactivated successfully. Oct 14 05:59:47 localhost podman[300149]: 2025-10-14 09:59:47.865184085 +0000 UTC m=+0.205564327 container exec_died 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_controller, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}) Oct 14 05:59:47 localhost systemd[1]: 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f.service: Deactivated successfully. Oct 14 05:59:48 localhost nova_compute[297686]: 2025-10-14 09:59:48.133 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:59:48 localhost sshd[300195]: main: sshd: ssh-rsa algorithm is disabled Oct 14 05:59:48 localhost systemd[1]: Created slice User Slice of UID 1003. Oct 14 05:59:48 localhost systemd[1]: Starting User Runtime Directory /run/user/1003... Oct 14 05:59:48 localhost systemd-logind[760]: New session 65 of user tripleo-admin. Oct 14 05:59:48 localhost systemd[1]: Finished User Runtime Directory /run/user/1003. Oct 14 05:59:48 localhost systemd[1]: Starting User Manager for UID 1003... Oct 14 05:59:48 localhost systemd[300199]: Queued start job for default target Main User Target. Oct 14 05:59:48 localhost systemd[300199]: Created slice User Application Slice. Oct 14 05:59:48 localhost systemd[300199]: Started Mark boot as successful after the user session has run 2 minutes. Oct 14 05:59:48 localhost systemd[300199]: Started Daily Cleanup of User's Temporary Directories. Oct 14 05:59:48 localhost systemd[300199]: Reached target Paths. Oct 14 05:59:48 localhost systemd[300199]: Reached target Timers. Oct 14 05:59:48 localhost systemd[300199]: Starting D-Bus User Message Bus Socket... Oct 14 05:59:48 localhost systemd[300199]: Starting Create User's Volatile Files and Directories... Oct 14 05:59:48 localhost systemd[300199]: Listening on D-Bus User Message Bus Socket. Oct 14 05:59:48 localhost systemd[300199]: Reached target Sockets. Oct 14 05:59:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d. 
Oct 14 05:59:48 localhost systemd[300199]: Finished Create User's Volatile Files and Directories. Oct 14 05:59:48 localhost systemd[300199]: Reached target Basic System. Oct 14 05:59:48 localhost systemd[300199]: Reached target Main User Target. Oct 14 05:59:48 localhost systemd[300199]: Startup finished in 170ms. Oct 14 05:59:48 localhost systemd[1]: Started User Manager for UID 1003. Oct 14 05:59:48 localhost systemd[1]: Started Session 65 of User tripleo-admin. Oct 14 05:59:48 localhost podman[300215]: 2025-10-14 09:59:48.881504966 +0000 UTC m=+0.086935626 container health_status 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS) Oct 14 05:59:48 localhost podman[300215]: 2025-10-14 09:59:48.894996638 +0000 UTC m=+0.100427228 container exec_died 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, config_id=edpm, managed_by=edpm_ansible, org.label-schema.vendor=CentOS) Oct 14 05:59:48 localhost systemd[1]: 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d.service: Deactivated successfully. Oct 14 05:59:49 localhost python3[300360]: ansible-ansible.builtin.systemd Invoked with name=iptables state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None Oct 14 05:59:50 localhost python3[300505]: ansible-ansible.builtin.systemd Invoked with name=nftables state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None Oct 14 05:59:50 localhost nova_compute[297686]: 2025-10-14 09:59:50.887 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:59:51 localhost systemd[1]: Stopping Netfilter Tables... Oct 14 05:59:51 localhost systemd[1]: nftables.service: Deactivated successfully. Oct 14 05:59:51 localhost systemd[1]: Stopped Netfilter Tables. 
Oct 14 05:59:52 localhost python3[300654]: ansible-ansible.builtin.blockinfile Invoked with marker_begin=BEGIN ceph firewall rules marker_end=END ceph firewall rules path=/etc/nftables/tripleo-rules.nft block=# 100 ceph_alertmanager {'dport': [9093]}#012add rule inet filter TRIPLEO_INPUT tcp dport { 9093 } ct state new counter accept comment "100 ceph_alertmanager"#012# 100 ceph_dashboard {'dport': [8443]}#012add rule inet filter TRIPLEO_INPUT tcp dport { 8443 } ct state new counter accept comment "100 ceph_dashboard"#012# 100 ceph_grafana {'dport': [3100]}#012add rule inet filter TRIPLEO_INPUT tcp dport { 3100 } ct state new counter accept comment "100 ceph_grafana"#012# 100 ceph_prometheus {'dport': [9092]}#012add rule inet filter TRIPLEO_INPUT tcp dport { 9092 } ct state new counter accept comment "100 ceph_prometheus"#012# 100 ceph_rgw {'dport': ['8080']}#012add rule inet filter TRIPLEO_INPUT tcp dport { 8080 } ct state new counter accept comment "100 ceph_rgw"#012# 110 ceph_mon {'dport': [6789, 3300, '9100']}#012add rule inet filter TRIPLEO_INPUT tcp dport { 6789,3300,9100 } ct state new counter accept comment "110 ceph_mon"#012# 112 ceph_mds {'dport': ['6800-7300', '9100']}#012add rule inet filter TRIPLEO_INPUT tcp dport { 6800-7300,9100 } ct state new counter accept comment "112 ceph_mds"#012# 113 ceph_mgr {'dport': ['6800-7300', 8444]}#012add rule inet filter TRIPLEO_INPUT tcp dport { 6800-7300,8444 } ct state new counter accept comment "113 ceph_mgr"#012# 120 ceph_nfs {'dport': ['12049', '2049']}#012add rule inet filter TRIPLEO_INPUT tcp dport { 2049 } ct state new counter accept comment "120 ceph_nfs"#012# 122 ceph rgw {'dport': ['8080', '8080', '9100']}#012add rule inet filter TRIPLEO_INPUT tcp dport { 8080,8080,9100 } ct state new counter accept comment "122 ceph rgw"#012# 123 ceph_dashboard {'dport': [3100, 9090, 9092, 9093, 9094, 9100, 9283]}#012add rule inet filter TRIPLEO_INPUT tcp dport { 3100,9090,9092,9093,9094,9100,9283 } ct state new counter 
accept comment "123 ceph_dashboard"#012 state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False backup=False unsafe_writes=False insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Oct 14 05:59:53 localhost nova_compute[297686]: 2025-10-14 09:59:53.165 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:59:55 localhost nova_compute[297686]: 2025-10-14 09:59:55.891 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:59:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:59:57.768 163055 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Oct 14 05:59:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:59:57.768 163055 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Oct 14 05:59:57 localhost ovn_metadata_agent[163050]: 2025-10-14 09:59:57.770 163055 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Oct 14 05:59:58 localhost nova_compute[297686]: 2025-10-14 09:59:58.217 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 05:59:58 localhost podman[248187]: time="2025-10-14T09:59:58Z" level=info msg="List 
containers: received `last` parameter - overwriting `limit`" Oct 14 05:59:58 localhost podman[248187]: @ - - [14/Oct/2025:09:59:58 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 141158 "" "Go-http-client/1.1" Oct 14 05:59:58 localhost podman[248187]: @ - - [14/Oct/2025:09:59:58 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18362 "" "Go-http-client/1.1" Oct 14 06:00:00 localhost nova_compute[297686]: 2025-10-14 10:00:00.896 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:00:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041. Oct 14 06:00:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857. Oct 14 06:00:01 localhost podman[300849]: 2025-10-14 10:00:01.120040588 +0000 UTC m=+0.081197241 container health_status 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 
'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible) Oct 14 06:00:01 localhost podman[300849]: 2025-10-14 10:00:01.151957902 +0000 UTC m=+0.113114605 container exec_died 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', 
'/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0) Oct 14 06:00:01 localhost systemd[1]: 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857.service: Deactivated successfully. Oct 14 06:00:01 localhost podman[300848]: 2025-10-14 10:00:01.170827808 +0000 UTC m=+0.131382552 container health_status 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', 
'/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter) Oct 14 06:00:01 localhost podman[300848]: 2025-10-14 10:00:01.205023122 +0000 UTC m=+0.165577816 container exec_died 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Oct 14 06:00:01 localhost systemd[1]: 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041.service: Deactivated successfully. 
Oct 14 06:00:02 localhost podman[300970]: Oct 14 06:00:02 localhost podman[300970]: 2025-10-14 10:00:02.715098028 +0000 UTC m=+0.065835451 container create 3582c6270214ea9af2738ca5c7bc9fe8599148fd27a99c9a9339fc826523bb11 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=condescending_perlman, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, io.buildah.version=1.33.12, architecture=x86_64, GIT_BRANCH=main, name=rhceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.description=Red Hat Ceph Storage 7, release=553, io.openshift.tags=rhceph ceph, RELEASE=main, description=Red Hat Ceph Storage 7, version=7, GIT_CLEAN=True, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux , com.redhat.license_terms=https://www.redhat.com/agreements, ceph=True, vcs-type=git, build-date=2025-09-24T08:57:55) Oct 14 06:00:02 localhost systemd[1]: Started libpod-conmon-3582c6270214ea9af2738ca5c7bc9fe8599148fd27a99c9a9339fc826523bb11.scope. Oct 14 06:00:02 localhost systemd[1]: Started libcrun container. 
Oct 14 06:00:02 localhost podman[300970]: 2025-10-14 10:00:02.684267606 +0000 UTC m=+0.035005059 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Oct 14 06:00:02 localhost podman[300970]: 2025-10-14 10:00:02.786969512 +0000 UTC m=+0.137706935 container init 3582c6270214ea9af2738ca5c7bc9fe8599148fd27a99c9a9339fc826523bb11 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=condescending_perlman, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, name=rhceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, ceph=True, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, version=7, GIT_CLEAN=True, vendor=Red Hat, Inc., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, release=553, RELEASE=main, build-date=2025-09-24T08:57:55, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, io.buildah.version=1.33.12, vcs-type=git, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, maintainer=Guillaume Abrioux , CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git) Oct 14 06:00:02 localhost podman[300970]: 2025-10-14 10:00:02.798104202 +0000 UTC m=+0.148841585 container start 3582c6270214ea9af2738ca5c7bc9fe8599148fd27a99c9a9339fc826523bb11 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=condescending_perlman, com.redhat.component=rhceph-container, vcs-type=git, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, GIT_CLEAN=True, release=553, vendor=Red Hat, Inc., GIT_BRANCH=main, io.buildah.version=1.33.12, 
com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, RELEASE=main, maintainer=Guillaume Abrioux , architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-09-24T08:57:55, version=7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, distribution-scope=public, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, ceph=True, name=rhceph) Oct 14 06:00:02 localhost podman[300970]: 2025-10-14 10:00:02.798323699 +0000 UTC m=+0.149061122 container attach 3582c6270214ea9af2738ca5c7bc9fe8599148fd27a99c9a9339fc826523bb11 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=condescending_perlman, name=rhceph, version=7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vcs-type=git, CEPH_POINT_RELEASE=, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, maintainer=Guillaume Abrioux , com.redhat.license_terms=https://www.redhat.com/agreements, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-09-24T08:57:55, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, GIT_CLEAN=True, RELEASE=main, distribution-scope=public, vendor=Red Hat, Inc., ceph=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, 
release=553, com.redhat.component=rhceph-container) Oct 14 06:00:02 localhost condescending_perlman[300985]: 167 167 Oct 14 06:00:02 localhost systemd[1]: libpod-3582c6270214ea9af2738ca5c7bc9fe8599148fd27a99c9a9339fc826523bb11.scope: Deactivated successfully. Oct 14 06:00:02 localhost podman[300970]: 2025-10-14 10:00:02.806285852 +0000 UTC m=+0.157023295 container died 3582c6270214ea9af2738ca5c7bc9fe8599148fd27a99c9a9339fc826523bb11 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=condescending_perlman, build-date=2025-09-24T08:57:55, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhceph ceph, RELEASE=main, description=Red Hat Ceph Storage 7, architecture=x86_64, version=7, vcs-type=git, io.openshift.expose-services=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vendor=Red Hat, Inc., ceph=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=553, distribution-scope=public, io.buildah.version=1.33.12, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, name=rhceph, GIT_CLEAN=True, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container) Oct 14 06:00:02 localhost podman[300990]: 2025-10-14 10:00:02.903040196 +0000 UTC m=+0.087010608 container remove 3582c6270214ea9af2738ca5c7bc9fe8599148fd27a99c9a9339fc826523bb11 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=condescending_perlman, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, 
vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, version=7, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, com.redhat.component=rhceph-container, build-date=2025-09-24T08:57:55, release=553, io.openshift.expose-services=, io.buildah.version=1.33.12, maintainer=Guillaume Abrioux , GIT_CLEAN=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, distribution-scope=public, name=rhceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64) Oct 14 06:00:02 localhost systemd[1]: libpod-conmon-3582c6270214ea9af2738ca5c7bc9fe8599148fd27a99c9a9339fc826523bb11.scope: Deactivated successfully. Oct 14 06:00:02 localhost systemd[1]: Reloading. Oct 14 06:00:03 localhost systemd-sysv-generator[301035]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Oct 14 06:00:03 localhost systemd-rc-local-generator[301032]: /etc/rc.d/rc.local is not marked executable, skipping. Oct 14 06:00:03 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Oct 14 06:00:03 localhost nova_compute[297686]: 2025-10-14 10:00:03.222 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:00:03 localhost systemd[1]: var-lib-containers-storage-overlay-30c32db32c42c9a91f918cbaac3ff739296ce6eb069de2a52238ee20549c6502-merged.mount: Deactivated successfully. Oct 14 06:00:03 localhost systemd[1]: tmp-crun.CnnD1w.mount: Deactivated successfully. Oct 14 06:00:03 localhost systemd[1]: Reloading. Oct 14 06:00:03 localhost systemd-rc-local-generator[301073]: /etc/rc.d/rc.local is not marked executable, skipping. Oct 14 06:00:03 localhost systemd-sysv-generator[301076]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Oct 14 06:00:03 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Oct 14 06:00:03 localhost systemd[1]: Starting Ceph mds.mds.np0005486733.tvstmf for fcadf6e2-9176-5818-a8d0-37b19acf8eaf... 
Oct 14 06:00:04 localhost podman[301137]: Oct 14 06:00:04 localhost podman[301137]: 2025-10-14 10:00:04.156610971 +0000 UTC m=+0.088356740 container create 59f0a590df7a9347440f4d5cd0636041ae1c771d0f1fc06b5ca731477fe75af3 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-mds-mds-np0005486733-tvstmf, build-date=2025-09-24T08:57:55, name=rhceph, version=7, RELEASE=main, release=553, architecture=x86_64, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux , summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, distribution-scope=public, io.buildah.version=1.33.12, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.component=rhceph-container, GIT_BRANCH=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, vcs-type=git) Oct 14 06:00:04 localhost systemd[1]: tmp-crun.bhu3QM.mount: Deactivated successfully. 
Oct 14 06:00:04 localhost podman[301137]: 2025-10-14 10:00:04.11989418 +0000 UTC m=+0.051639999 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Oct 14 06:00:04 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2632972b8bade7b0f16e98c69ba8d49ea91c28949b93958f438dc3cbbb3863a1/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff) Oct 14 06:00:04 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2632972b8bade7b0f16e98c69ba8d49ea91c28949b93958f438dc3cbbb3863a1/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff) Oct 14 06:00:04 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2632972b8bade7b0f16e98c69ba8d49ea91c28949b93958f438dc3cbbb3863a1/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff) Oct 14 06:00:04 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2632972b8bade7b0f16e98c69ba8d49ea91c28949b93958f438dc3cbbb3863a1/merged/var/lib/ceph/mds/ceph-mds.np0005486733.tvstmf supports timestamps until 2038 (0x7fffffff) Oct 14 06:00:04 localhost podman[301137]: 2025-10-14 10:00:04.239192272 +0000 UTC m=+0.170938031 container init 59f0a590df7a9347440f4d5cd0636041ae1c771d0f1fc06b5ca731477fe75af3 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-mds-mds-np0005486733-tvstmf, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux , GIT_BRANCH=main, vcs-type=git, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, ceph=True, io.openshift.tags=rhceph ceph, architecture=x86_64, description=Red Hat Ceph Storage 7, distribution-scope=public, com.redhat.component=rhceph-container, 
io.openshift.expose-services=, version=7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.license_terms=https://www.redhat.com/agreements, CEPH_POINT_RELEASE=, build-date=2025-09-24T08:57:55, GIT_CLEAN=True, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, release=553, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9) Oct 14 06:00:04 localhost podman[301137]: 2025-10-14 10:00:04.248310821 +0000 UTC m=+0.180056580 container start 59f0a590df7a9347440f4d5cd0636041ae1c771d0f1fc06b5ca731477fe75af3 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-mds-mds-np0005486733-tvstmf, release=553, distribution-scope=public, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., build-date=2025-09-24T08:57:55, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_CLEAN=True, vcs-type=git, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, maintainer=Guillaume Abrioux , version=7, CEPH_POINT_RELEASE=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements) Oct 14 06:00:04 localhost bash[301137]: 59f0a590df7a9347440f4d5cd0636041ae1c771d0f1fc06b5ca731477fe75af3 Oct 14 06:00:04 localhost systemd[1]: Started Ceph mds.mds.np0005486733.tvstmf 
for fcadf6e2-9176-5818-a8d0-37b19acf8eaf. Oct 14 06:00:04 localhost ceph-mds[301155]: set uid:gid to 167:167 (ceph:ceph) Oct 14 06:00:04 localhost ceph-mds[301155]: ceph version 18.2.1-361.el9cp (439dcd6094d413840eb2ec590fe2194ec616687f) reef (stable), process ceph-mds, pid 2 Oct 14 06:00:04 localhost ceph-mds[301155]: main not setting numa affinity Oct 14 06:00:04 localhost ceph-mds[301155]: pidfile_write: ignore empty --pid-file Oct 14 06:00:04 localhost ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-mds-mds-np0005486733-tvstmf[301151]: starting mds.mds.np0005486733.tvstmf at Oct 14 06:00:04 localhost ceph-mds[301155]: mds.mds.np0005486733.tvstmf Updating MDS map to version 6 from mon.0 Oct 14 06:00:05 localhost ceph-mds[301155]: mds.mds.np0005486733.tvstmf Updating MDS map to version 7 from mon.0 Oct 14 06:00:05 localhost ceph-mds[301155]: mds.mds.np0005486733.tvstmf Monitors have assigned me to become a standby. Oct 14 06:00:05 localhost nova_compute[297686]: 2025-10-14 10:00:05.900 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:00:08 localhost nova_compute[297686]: 2025-10-14 10:00:08.227 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:00:08 localhost openstack_network_exporter[250374]: ERROR 10:00:08 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Oct 14 06:00:08 localhost openstack_network_exporter[250374]: ERROR 10:00:08 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Oct 14 06:00:08 localhost openstack_network_exporter[250374]: Oct 14 06:00:08 localhost openstack_network_exporter[250374]: ERROR 10:00:08 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Oct 14 06:00:08 localhost openstack_network_exporter[250374]: ERROR 10:00:08 
appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Oct 14 06:00:08 localhost openstack_network_exporter[250374]: ERROR 10:00:08 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Oct 14 06:00:08 localhost openstack_network_exporter[250374]: Oct 14 06:00:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29. Oct 14 06:00:09 localhost podman[301192]: 2025-10-14 10:00:09.752463877 +0000 UTC m=+0.086799982 container health_status fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=iscsid, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251009, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, container_name=iscsid, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', 
'/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}) Oct 14 06:00:09 localhost podman[301192]: 2025-10-14 10:00:09.785876837 +0000 UTC m=+0.120212902 container exec_died fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=iscsid, container_name=iscsid, org.label-schema.schema-version=1.0, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible) Oct 14 06:00:09 localhost systemd[1]: 
fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29.service: Deactivated successfully. Oct 14 06:00:10 localhost podman[301318]: 2025-10-14 10:00:10.639099148 +0000 UTC m=+0.096694824 container exec b19a91314e045cbe5465f6abdeb6b420108fcda7860b498b01dcd4e65236d2c8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-crash-np0005486733, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, architecture=x86_64, version=7, description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, vcs-type=git, RELEASE=main, com.redhat.component=rhceph-container, distribution-scope=public, GIT_BRANCH=main, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, maintainer=Guillaume Abrioux , GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, io.buildah.version=1.33.12, release=553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d) Oct 14 06:00:10 localhost podman[301318]: 2025-10-14 10:00:10.719569825 +0000 UTC m=+0.177165501 container exec_died b19a91314e045cbe5465f6abdeb6b420108fcda7860b498b01dcd4e65236d2c8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-crash-np0005486733, build-date=2025-09-24T08:57:55, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, version=7, RELEASE=main, io.buildah.version=1.33.12, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a 
fully featured and supported base image., maintainer=Guillaume Abrioux , com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vendor=Red Hat, Inc., architecture=x86_64, ceph=True, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, com.redhat.component=rhceph-container, release=553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=) Oct 14 06:00:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec. Oct 14 06:00:10 localhost nova_compute[297686]: 2025-10-14 10:00:10.903 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:00:10 localhost podman[301365]: 2025-10-14 10:00:10.908532254 +0000 UTC m=+0.081903761 container health_status aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', 
'--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Oct 14 06:00:10 localhost podman[301365]: 2025-10-14 10:00:10.941512641 +0000 UTC m=+0.114884128 container exec_died aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': 
['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter) Oct 14 06:00:10 localhost systemd[1]: aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec.service: Deactivated successfully. Oct 14 06:00:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad. Oct 14 06:00:11 localhost systemd[1]: tmp-crun.XJx91B.mount: Deactivated successfully. Oct 14 06:00:11 localhost podman[301422]: 2025-10-14 10:00:11.767991756 +0000 UTC m=+0.100335385 container health_status 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3) Oct 14 06:00:11 localhost podman[301422]: 2025-10-14 10:00:11.784222361 +0000 UTC m=+0.116566000 container exec_died 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, config_id=multipathd, 
org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, org.label-schema.schema-version=1.0, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team) Oct 14 06:00:11 localhost systemd[1]: 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad.service: Deactivated successfully. Oct 14 06:00:11 localhost systemd[1]: session-64.scope: Deactivated successfully. Oct 14 06:00:11 localhost systemd-logind[760]: Session 64 logged out. Waiting for processes to exit. Oct 14 06:00:11 localhost systemd-logind[760]: Removed session 64. Oct 14 06:00:13 localhost nova_compute[297686]: 2025-10-14 10:00:13.270 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:00:15 localhost nova_compute[297686]: 2025-10-14 10:00:15.907 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:00:18 localhost nova_compute[297686]: 2025-10-14 10:00:18.313 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:00:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f. Oct 14 06:00:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917. 
Oct 14 06:00:18 localhost podman[301477]: 2025-10-14 10:00:18.764875785 +0000 UTC m=+0.096467766 container health_status 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, distribution-scope=public, io.openshift.tags=minimal rhel9, release=1755695350, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, vendor=Red Hat, Inc., io.openshift.expose-services=, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal) Oct 14 06:00:18 localhost podman[301476]: 2025-10-14 10:00:18.740692557 +0000 UTC m=+0.078134716 container health_status 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, 
container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.build-date=20251009) Oct 14 06:00:18 localhost podman[301477]: 2025-10-14 10:00:18.805190727 +0000 UTC m=+0.136782638 container exec_died 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, version=9.6, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down 
image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, release=1755695350, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, vcs-type=git, config_id=edpm, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers) Oct 14 06:00:18 localhost systemd[1]: 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917.service: Deactivated successfully. Oct 14 06:00:18 localhost podman[301476]: 2025-10-14 10:00:18.826168697 +0000 UTC m=+0.163610806 container exec_died 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_id=ovn_controller, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, tcib_managed=true) Oct 14 06:00:18 localhost systemd[1]: 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f.service: Deactivated successfully. Oct 14 06:00:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d. Oct 14 06:00:19 localhost podman[301521]: 2025-10-14 10:00:19.738778501 +0000 UTC m=+0.082354205 container health_status 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', 
'/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image) Oct 14 06:00:19 localhost podman[301521]: 2025-10-14 10:00:19.752006435 +0000 UTC m=+0.095582159 container exec_died 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=edpm) Oct 14 06:00:19 localhost systemd[1]: 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d.service: Deactivated successfully. Oct 14 06:00:19 localhost nova_compute[297686]: 2025-10-14 10:00:19.769 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:00:19 localhost nova_compute[297686]: 2025-10-14 10:00:19.769 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:00:19 localhost nova_compute[297686]: 2025-10-14 10:00:19.795 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:00:19 localhost nova_compute[297686]: 2025-10-14 10:00:19.795 2 DEBUG nova.compute.manager [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Starting heal instance info cache _heal_instance_info_cache 
/usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Oct 14 06:00:19 localhost nova_compute[297686]: 2025-10-14 10:00:19.796 2 DEBUG nova.compute.manager [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Oct 14 06:00:20 localhost nova_compute[297686]: 2025-10-14 10:00:20.187 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Acquiring lock "refresh_cache-88c4e366-b765-47a6-96bf-f7677f2ce67c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Oct 14 06:00:20 localhost nova_compute[297686]: 2025-10-14 10:00:20.187 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Acquired lock "refresh_cache-88c4e366-b765-47a6-96bf-f7677f2ce67c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Oct 14 06:00:20 localhost nova_compute[297686]: 2025-10-14 10:00:20.187 2 DEBUG nova.network.neutron [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] [instance: 88c4e366-b765-47a6-96bf-f7677f2ce67c] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Oct 14 06:00:20 localhost nova_compute[297686]: 2025-10-14 10:00:20.188 2 DEBUG nova.objects.instance [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Lazy-loading 'info_cache' on Instance uuid 88c4e366-b765-47a6-96bf-f7677f2ce67c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Oct 14 06:00:20 localhost nova_compute[297686]: 2025-10-14 10:00:20.601 2 DEBUG nova.network.neutron [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] [instance: 88c4e366-b765-47a6-96bf-f7677f2ce67c] Updating instance_info_cache with network_info: [{"id": "3ec9b060-f43d-4698-9c76-6062c70911d5", "address": 
"fa:16:3e:84:5e:e5", "network": {"id": "7d0cd696-bdd7-4e70-9512-eb0d23640314", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.46", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "41187b090f3d4818a32baa37ce8a3991", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3ec9b060-f4", "ovs_interfaceid": "3ec9b060-f43d-4698-9c76-6062c70911d5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Oct 14 06:00:20 localhost nova_compute[297686]: 2025-10-14 10:00:20.626 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Releasing lock "refresh_cache-88c4e366-b765-47a6-96bf-f7677f2ce67c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Oct 14 06:00:20 localhost nova_compute[297686]: 2025-10-14 10:00:20.627 2 DEBUG nova.compute.manager [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] [instance: 88c4e366-b765-47a6-96bf-f7677f2ce67c] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Oct 14 06:00:20 localhost nova_compute[297686]: 2025-10-14 10:00:20.628 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task 
ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:00:20 localhost nova_compute[297686]: 2025-10-14 10:00:20.628 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:00:20 localhost nova_compute[297686]: 2025-10-14 10:00:20.628 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:00:20 localhost nova_compute[297686]: 2025-10-14 10:00:20.629 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:00:20 localhost nova_compute[297686]: 2025-10-14 10:00:20.629 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:00:20 localhost nova_compute[297686]: 2025-10-14 10:00:20.630 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:00:20 localhost nova_compute[297686]: 2025-10-14 10:00:20.630 2 DEBUG nova.compute.manager [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... 
_reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Oct 14 06:00:20 localhost nova_compute[297686]: 2025-10-14 10:00:20.630 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:00:20 localhost nova_compute[297686]: 2025-10-14 10:00:20.657 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Oct 14 06:00:20 localhost nova_compute[297686]: 2025-10-14 10:00:20.658 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Oct 14 06:00:20 localhost nova_compute[297686]: 2025-10-14 10:00:20.658 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Oct 14 06:00:20 localhost nova_compute[297686]: 2025-10-14 10:00:20.658 2 DEBUG nova.compute.resource_tracker [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Auditing locally available compute resources for np0005486733.localdomain (node: np0005486733.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Oct 14 06:00:20 localhost nova_compute[297686]: 2025-10-14 10:00:20.659 2 DEBUG 
oslo_concurrency.processutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Oct 14 06:00:20 localhost nova_compute[297686]: 2025-10-14 10:00:20.909 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:00:21 localhost nova_compute[297686]: 2025-10-14 10:00:21.121 2 DEBUG oslo_concurrency.processutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.462s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Oct 14 06:00:21 localhost nova_compute[297686]: 2025-10-14 10:00:21.206 2 DEBUG nova.virt.libvirt.driver [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Oct 14 06:00:21 localhost nova_compute[297686]: 2025-10-14 10:00:21.207 2 DEBUG nova.virt.libvirt.driver [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Oct 14 06:00:21 localhost nova_compute[297686]: 2025-10-14 10:00:21.399 2 WARNING nova.virt.libvirt.driver [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Oct 14 06:00:21 localhost nova_compute[297686]: 2025-10-14 10:00:21.400 2 DEBUG nova.compute.resource_tracker [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Hypervisor/Node resource view: name=np0005486733.localdomain free_ram=11932MB free_disk=41.83695602416992GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": 
"1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Oct 14 06:00:21 localhost nova_compute[297686]: 2025-10-14 10:00:21.401 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Oct 14 06:00:21 localhost nova_compute[297686]: 2025-10-14 10:00:21.401 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Oct 14 06:00:21 localhost nova_compute[297686]: 2025-10-14 10:00:21.457 2 DEBUG nova.compute.resource_tracker [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Instance 88c4e366-b765-47a6-96bf-f7677f2ce67c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Oct 14 06:00:21 localhost nova_compute[297686]: 2025-10-14 10:00:21.459 2 DEBUG nova.compute.resource_tracker [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Oct 14 06:00:21 localhost nova_compute[297686]: 2025-10-14 10:00:21.459 2 DEBUG nova.compute.resource_tracker [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Final resource view: name=np0005486733.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Oct 14 06:00:21 localhost nova_compute[297686]: 2025-10-14 10:00:21.497 2 DEBUG oslo_concurrency.processutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Oct 14 06:00:21 localhost nova_compute[297686]: 2025-10-14 10:00:21.943 2 DEBUG oslo_concurrency.processutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.446s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Oct 14 06:00:21 localhost nova_compute[297686]: 2025-10-14 10:00:21.949 2 DEBUG nova.compute.provider_tree [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Inventory has not changed in ProviderTree for provider: 18c24273-aca2-4f08-be57-3188d558235e update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Oct 14 06:00:22 localhost nova_compute[297686]: 2025-10-14 10:00:22.003 2 DEBUG 
nova.scheduler.client.report [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Inventory has not changed for provider 18c24273-aca2-4f08-be57-3188d558235e based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Oct 14 06:00:22 localhost nova_compute[297686]: 2025-10-14 10:00:22.005 2 DEBUG nova.compute.resource_tracker [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Compute_service record updated for np0005486733.localdomain:np0005486733.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Oct 14 06:00:22 localhost nova_compute[297686]: 2025-10-14 10:00:22.005 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.603s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Oct 14 06:00:23 localhost nova_compute[297686]: 2025-10-14 10:00:23.354 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:00:25 localhost nova_compute[297686]: 2025-10-14 10:00:25.913 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:00:28 localhost podman[248187]: time="2025-10-14T10:00:28Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Oct 14 06:00:28 localhost podman[248187]: @ - - 
[14/Oct/2025:10:00:28 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 143236 "" "Go-http-client/1.1" Oct 14 06:00:28 localhost podman[248187]: @ - - [14/Oct/2025:10:00:28 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18858 "" "Go-http-client/1.1" Oct 14 06:00:28 localhost nova_compute[297686]: 2025-10-14 10:00:28.405 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:00:30 localhost nova_compute[297686]: 2025-10-14 10:00:30.917 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:00:30 localhost sshd[301584]: main: sshd: ssh-rsa algorithm is disabled Oct 14 06:00:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041. Oct 14 06:00:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857. 
Oct 14 06:00:31 localhost podman[301587]: 2025-10-14 10:00:31.515651886 +0000 UTC m=+0.083949284 container health_status 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.3) Oct 14 06:00:31 localhost podman[301586]: 2025-10-14 10:00:31.571002036 +0000 UTC 
m=+0.144609446 container health_status 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Oct 14 06:00:31 localhost podman[301587]: 2025-10-14 10:00:31.596276558 +0000 UTC m=+0.164573956 container exec_died 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 
'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent) Oct 14 06:00:31 localhost podman[301586]: 2025-10-14 10:00:31.601847898 +0000 UTC m=+0.175455358 container exec_died 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Oct 14 
06:00:31 localhost systemd[1]: 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041.service: Deactivated successfully. Oct 14 06:00:31 localhost systemd[1]: 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857.service: Deactivated successfully. Oct 14 06:00:33 localhost nova_compute[297686]: 2025-10-14 10:00:33.407 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:00:35 localhost nova_compute[297686]: 2025-10-14 10:00:35.919 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:00:38 localhost nova_compute[297686]: 2025-10-14 10:00:38.409 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:00:38 localhost openstack_network_exporter[250374]: ERROR 10:00:38 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Oct 14 06:00:38 localhost openstack_network_exporter[250374]: Oct 14 06:00:38 localhost openstack_network_exporter[250374]: ERROR 10:00:38 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Oct 14 06:00:38 localhost openstack_network_exporter[250374]: ERROR 10:00:38 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Oct 14 06:00:38 localhost openstack_network_exporter[250374]: ERROR 10:00:38 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Oct 14 06:00:38 localhost openstack_network_exporter[250374]: ERROR 10:00:38 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Oct 14 06:00:38 localhost openstack_network_exporter[250374]: Oct 14 06:00:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run 
fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29. Oct 14 06:00:40 localhost podman[301627]: 2025-10-14 10:00:40.399861821 +0000 UTC m=+0.072240706 container health_status fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, org.label-schema.build-date=20251009, config_id=iscsid, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image) Oct 14 06:00:40 localhost podman[301627]: 2025-10-14 10:00:40.412069554 +0000 UTC m=+0.084448429 container exec_died fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 
(image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251009, container_name=iscsid, tcib_managed=true, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.vendor=CentOS) Oct 14 06:00:40 localhost systemd[1]: fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29.service: Deactivated successfully. 
Oct 14 06:00:40 localhost nova_compute[297686]: 2025-10-14 10:00:40.922 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:00:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec. Oct 14 06:00:41 localhost systemd[1]: tmp-crun.XYimek.mount: Deactivated successfully. Oct 14 06:00:41 localhost podman[301648]: 2025-10-14 10:00:41.72135043 +0000 UTC m=+0.069111552 container health_status aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors ) Oct 14 06:00:41 localhost 
podman[301648]: 2025-10-14 10:00:41.730153828 +0000 UTC m=+0.077914930 container exec_died aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Oct 14 06:00:41 localhost systemd[1]: aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec.service: Deactivated successfully. Oct 14 06:00:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad. 
Oct 14 06:00:42 localhost podman[301671]: 2025-10-14 10:00:42.718701411 +0000 UTC m=+0.063782468 container health_status 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, container_name=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true) Oct 14 06:00:42 localhost podman[301671]: 2025-10-14 10:00:42.730990646 +0000 UTC m=+0.076071713 container exec_died 
02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, managed_by=edpm_ansible, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=multipathd) Oct 14 06:00:42 localhost systemd[1]: 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad.service: Deactivated successfully. 
Oct 14 06:00:43 localhost nova_compute[297686]: 2025-10-14 10:00:43.461 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:00:45 localhost nova_compute[297686]: 2025-10-14 10:00:45.924 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:00:48 localhost nova_compute[297686]: 2025-10-14 10:00:48.490 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:00:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f. Oct 14 06:00:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917. Oct 14 06:00:49 localhost systemd[1]: tmp-crun.7J9hja.mount: Deactivated successfully. Oct 14 06:00:49 localhost podman[301710]: 2025-10-14 10:00:49.743321339 +0000 UTC m=+0.080112957 container health_status 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, vendor=Red Hat, Inc., release=1755695350, distribution-scope=public, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck 
openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, name=ubi9-minimal, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.) 
Oct 14 06:00:49 localhost podman[301710]: 2025-10-14 10:00:49.752623113 +0000 UTC m=+0.089414791 container exec_died 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.buildah.version=1.33.7, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, managed_by=edpm_ansible, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, version=9.6, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal) Oct 14 06:00:49 localhost systemd[1]: 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917.service: Deactivated successfully. Oct 14 06:00:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d. Oct 14 06:00:49 localhost podman[301709]: 2025-10-14 10:00:49.804380883 +0000 UTC m=+0.138482919 container health_status 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0) Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.817 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'name': 'test', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005486733.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '41187b090f3d4818a32baa37ce8a3991', 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'hostId': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228 Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.818 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.823 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.826 12 ERROR 
oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'dce8f170-14e8-4c98-9146-6472919e2a35', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T10:00:49.818280', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': 'a9801f5a-a8e4-11f0-9707-fa163e99780b', 'monotonic_time': 12066.010986213, 'message_signature': '3738689d5865ec4a85e86d3f7adc12dd5c79a2b28cbd40b29ed20f435a8eef2d'}]}, 'timestamp': '2025-10-14 10:00:49.824473', '_unique_id': '4bd1a5cefd6245a58aab55b2bbc75971'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.826 12 ERROR oslo_messaging.notify.messaging Traceback 
(most recent call last): Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.826 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.826 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.826 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.826 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.826 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.826 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.826 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.826 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.826 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.826 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.826 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.826 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.826 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.826 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.826 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.826 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.826 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.826 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.826 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.826 12 ERROR oslo_messaging.notify.messaging Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.826 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.826 12 ERROR oslo_messaging.notify.messaging Oct 
14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.826 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.826 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.826 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.826 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.826 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.826 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.826 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.826 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.826 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.826 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 
605, in _get_connection Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.826 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.826 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.826 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.826 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.826 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.826 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.826 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.826 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.826 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.826 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in 
ensure_connection Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.826 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.826 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.826 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.826 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.826 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.826 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.826 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.826 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.826 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.826 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.826 12 ERROR oslo_messaging.notify.messaging Oct 14 
06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.827 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.845 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.845 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.847 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '11868d0e-e1e2-435e-b64e-b254d3ef49f2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T10:00:49.827825', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 
'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'a9836408-a8e4-11f0-9707-fa163e99780b', 'monotonic_time': 12066.020536695, 'message_signature': 'b44949b7b5d428c6a2795442da59afabfea1b3723c19d4aaff949e7f8c2fee7e'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T10:00:49.827825', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'a9837c5e-a8e4-11f0-9707-fa163e99780b', 'monotonic_time': 12066.020536695, 'message_signature': 'ce3b66dad41128337148037d798c64e436cacb575feb88375d6c19b6cb00376f'}]}, 'timestamp': '2025-10-14 10:00:49.846415', '_unique_id': '876ce42b7dcf49a7b5320d222141329c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 
10:00:49.847 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.847 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.847 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.847 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.847 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.847 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.847 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.847 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.847 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.847 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.847 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:00:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.847 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.847 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.847 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.847 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.847 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.847 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.847 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.847 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.847 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.847 12 ERROR oslo_messaging.notify.messaging Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.847 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 
2025-10-14 10:00:49.847 12 ERROR oslo_messaging.notify.messaging Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.847 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.847 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.847 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.847 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.847 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.847 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.847 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.847 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.847 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.847 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.847 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.847 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.847 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.847 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.847 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.847 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.847 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.847 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.847 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.847 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.847 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.847 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.847 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.847 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.847 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.847 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.847 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.847 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.847 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.847 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:00:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.847 12 ERROR oslo_messaging.notify.messaging Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.849 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.849 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.850 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.851 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '42b84603-9507-4081-9389-d9538b66e5cb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T10:00:49.849617', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'a9840d54-a8e4-11f0-9707-fa163e99780b', 'monotonic_time': 12066.020536695, 'message_signature': 'ebb6ccfd61317670235189520f0061b4221f7dc07361ccd7942c26b69fa628cf'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T10:00:49.849617', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 
'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'a9841e20-a8e4-11f0-9707-fa163e99780b', 'monotonic_time': 12066.020536695, 'message_signature': '320da78c2dc547a9b6028fc87b0dddf85c8907d76e5bb23064751fa1c535c5f7'}]}, 'timestamp': '2025-10-14 10:00:49.850534', '_unique_id': '0ee76ab4ec1d4b3484f0cb38b6502a3d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.851 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.851 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.851 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.851 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.851 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.851 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.851 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.851 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.851 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.851 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.851 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.851 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.851 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.851 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.851 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.851 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 06:00:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.851 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.851 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.851 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.851 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.851 12 ERROR oslo_messaging.notify.messaging Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.851 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.851 12 ERROR oslo_messaging.notify.messaging Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.851 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.851 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.851 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.851 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 
10:00:49.851 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.851 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.851 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.851 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.851 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.851 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.851 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.851 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.851 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.851 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 06:00:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.851 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.851 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.851 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.851 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.851 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.851 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.851 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.851 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.851 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.851 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 
10:00:49.851 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.851 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.851 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.851 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.851 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.851 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.851 12 ERROR oslo_messaging.notify.messaging Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.852 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters Oct 14 06:00:49 localhost podman[301709]: 2025-10-14 10:00:49.866161319 +0000 UTC m=+0.200263395 container exec_died 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.schema-version=1.0, container_name=ovn_controller, 
io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true) Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.880 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.write.requests volume: 53 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.880 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:00:49 localhost systemd[1]: 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f.service: Deactivated successfully. Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.882 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'd782fa1e-ce83-4469-b5d7-64753c5c64c0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 53, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T10:00:49.852908', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'a988ba34-a8e4-11f0-9707-fa163e99780b', 'monotonic_time': 12066.045645032, 'message_signature': 'b556a032408ce6473cb7116a1bb543087d75bc0585076c32f720c25208cec276'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T10:00:49.852908', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'a988cede-a8e4-11f0-9707-fa163e99780b', 'monotonic_time': 12066.045645032, 'message_signature': 'f420f33ce2df3f0b9272f88ddd6056ef9ab5187c764ebd3fcfa4a9e0534d7cf8'}]}, 'timestamp': '2025-10-14 10:00:49.881282', '_unique_id': '9c7d3ba2535c40139c7ae1499fb5afdf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.882 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.882 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.882 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.882 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.882 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.882 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.882 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.882 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.882 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.882 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.882 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.882 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.882 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.882 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.882 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.882 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 
06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.882 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.882 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.882 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.882 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.882 12 ERROR oslo_messaging.notify.messaging Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.882 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.882 12 ERROR oslo_messaging.notify.messaging Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.882 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.882 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.882 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.882 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:00:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.882 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.882 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.882 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.882 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.882 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.882 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.882 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.882 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.882 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.882 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.882 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.882 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.882 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.882 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.882 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.882 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.882 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.882 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.882 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.882 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:00:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.882 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.882 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.882 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.882 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.882 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.882 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.882 12 ERROR oslo_messaging.notify.messaging Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.883 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.884 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.885 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '547fb5a9-8350-4fa2-bbe0-fec990c5f74d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T10:00:49.883997', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': 'a9894b34-a8e4-11f0-9707-fa163e99780b', 'monotonic_time': 12066.010986213, 'message_signature': '2a17cfa8b7a9dabbb6842a361dd68a295d6a34ab7146abc4a02ef84f09b84d95'}]}, 'timestamp': '2025-10-14 10:00:49.884517', '_unique_id': '3b656e3eeb424cff8f0137b065e91c08'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.885 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:00:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.885 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.885 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.885 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.885 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.885 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.885 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.885 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.885 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.885 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.885 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.885 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.885 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.885 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.885 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.885 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.885 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.885 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.885 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.885 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.885 12 ERROR oslo_messaging.notify.messaging Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.885 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.885 12 ERROR oslo_messaging.notify.messaging Oct 14 06:00:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.885 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.885 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.885 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.885 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.885 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.885 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.885 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.885 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.885 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.885 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.885 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.885 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.885 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.885 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.885 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.885 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.885 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.885 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.885 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.885 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:00:49 
localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.885 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.885 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.885 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.885 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.885 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.885 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.885 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.885 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.885 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.885 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.885 12 ERROR oslo_messaging.notify.messaging Oct 14 06:00:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.886 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.886 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.write.latency volume: 1178650666 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.887 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.write.latency volume: 22746151 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.888 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7bbd9d75-c778-4dcb-b71d-725529ed9ee7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1178650666, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T10:00:49.886734', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': 
{'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'a989b5ce-a8e4-11f0-9707-fa163e99780b', 'monotonic_time': 12066.045645032, 'message_signature': 'b8945799f79a6a2f424fedc97d7ec0a8d75ff731596a5caa19d773150180c8c9'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 22746151, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T10:00:49.886734', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'a989c622-a8e4-11f0-9707-fa163e99780b', 'monotonic_time': 12066.045645032, 'message_signature': '60ad4f32398223fc10773b3f4051f10de4a7f23dc2b5eb5c65f0315528b26a85'}]}, 'timestamp': '2025-10-14 10:00:49.887596', '_unique_id': '97a832a3fd1e4e85a5238a22bdd6a51f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 
10:00:49.888 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.888 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.888 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.888 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.888 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.888 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:00:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.888 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.888 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.888 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.888 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.888 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.888 12 ERROR oslo_messaging.notify.messaging Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.888 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 
2025-10-14 10:00:49.888 12 ERROR oslo_messaging.notify.messaging Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.888 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.888 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.888 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.888 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.888 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.888 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.888 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.888 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.888 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.888 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.888 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.888 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.888 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.888 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.888 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.888 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.888 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.888 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:00:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.888 12 ERROR oslo_messaging.notify.messaging Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.889 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.889 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.write.bytes volume: 454656 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.890 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.891 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'a222c77e-8f44-42fc-b62f-232082ac1e69', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 454656, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T10:00:49.889827', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'a98a2e32-a8e4-11f0-9707-fa163e99780b', 'monotonic_time': 12066.045645032, 'message_signature': '3d6c39a2638fc81688996624abb3b1022e96967fa0fa2ea01c81db43db0354a3'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T10:00:49.889827', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'a98a3e90-a8e4-11f0-9707-fa163e99780b', 'monotonic_time': 12066.045645032, 'message_signature': '4ef7274d5d84ed5eed60650d6a1cf038a7f97e064ffcba5971d327b1ecd03870'}]}, 'timestamp': '2025-10-14 10:00:49.890734', '_unique_id': '0696a2eb3e7240ef95ae0d6f63c6c654'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.891 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.891 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.891 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.891 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.891 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.891 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.891 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.891 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.891 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.891 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.891 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.891 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.891 12 ERROR oslo_messaging.notify.messaging conn.connect()
Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.891 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.891 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.891 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.891 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.891 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.891 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.891 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.891 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.891 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.891 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.891 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.891 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.891 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.891 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.891 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.891 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.891 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.891 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.891 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.891 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.891 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.891 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.891 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.891 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.891 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.891 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.891 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.891 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.891 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.891 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.891 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.891 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.891 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.891 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.891 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.891 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.891 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.891 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.891 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.891 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.891 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.892 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.893 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.read.latency volume: 1739521236 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.893 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.read.latency volume: 110324003 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.894 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1806d921-97cb-4bb6-aed7-ace33ab67f71', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1739521236, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T10:00:49.892974', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'a98aa92a-a8e4-11f0-9707-fa163e99780b', 'monotonic_time': 12066.045645032, 'message_signature': 'a2ee8d8c311db80aa8623841f058409f658c8c8030427600674dabd96f9e85c4'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 110324003, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T10:00:49.892974', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'a98ab97e-a8e4-11f0-9707-fa163e99780b', 'monotonic_time': 12066.045645032, 'message_signature': '928d3ca94ceafb59a5960187e99cc305a48b18b2bd76023b29bc9ae2d70e2034'}]}, 'timestamp': '2025-10-14 10:00:49.893854', '_unique_id': '2dd60953692447f7aabbcd287351ce6b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.894 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.894 12 ERROR oslo_messaging.notify.messaging yield
Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.894 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.894 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.894 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.894 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.894 12 ERROR oslo_messaging.notify.messaging conn.connect()
Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.894 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.894 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.894 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.894 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.894 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.894 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.894 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.894 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.894 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.894 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.894 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.894 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.894 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.894 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.894 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.894 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.894 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.894 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.894 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.894 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.894 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.894 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.894 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.894 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.895 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.896 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.898 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0bd739d4-f3cd-42fa-87aa-c4314483e199', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T10:00:49.896126', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': 'a98b2436-a8e4-11f0-9707-fa163e99780b', 'monotonic_time': 12066.010986213, 'message_signature': 'e6fae027c3c4107685af50dc41d7258671492e0e8de040fbd076cd0f844d2ed4'}]}, 'timestamp': '2025-10-14 10:00:49.897816', '_unique_id': '0a68acc154cb4efda9707df6c3b9b2f5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.898 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.898 12 ERROR oslo_messaging.notify.messaging yield
Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.898 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.898 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.898 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.898 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.898 12 ERROR oslo_messaging.notify.messaging conn.connect()
Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.898 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.898 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.898 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.898 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.898 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.898 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.898 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.898 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.898 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.898 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.898 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.898 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.898 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.898 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.898 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.898 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.898 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.898 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.898 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.898 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.898 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.898 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.898 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.898 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.900 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.915 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/cpu volume: 11870000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.916 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e3dbe4f1-3862-4b53-b68f-8910bc8ba352', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11870000000, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'timestamp': '2025-10-14T10:00:49.900184', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': 'a98e13f8-a8e4-11f0-9707-fa163e99780b', 'monotonic_time': 12066.108027246, 'message_signature': '69dab8e8680cfc36b6088ac8d151ca6585d6c570ae671b3cfa955fb4326e1b66'}]}, 'timestamp': '2025-10-14 10:00:49.915781', '_unique_id': '5d3617bef7844715965c0ac1709a85e3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.916 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.916 12 ERROR oslo_messaging.notify.messaging yield
Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.916 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.916 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.916 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.916 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.916 12 ERROR oslo_messaging.notify.messaging conn.connect()
Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.916 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.916 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.916 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.916 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 06:00:49 localhost
ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.916 12 ERROR oslo_messaging.notify.messaging Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.916 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.916 12 ERROR oslo_messaging.notify.messaging Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.916 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.916 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.916 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.916 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", 
line 653, in _send Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.916 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.916 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.916 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.916 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.916 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.916 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.916 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.916 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.916 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.916 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.916 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:00:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.916 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.916 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.916 12 ERROR oslo_messaging.notify.messaging Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.917 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.917 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.outgoing.bytes volume: 10162 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.918 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'fb5fb641-7c77-41ec-96a9-78afa4c27976', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 10162, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T10:00:49.917291', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': 'a98e5bd8-a8e4-11f0-9707-fa163e99780b', 'monotonic_time': 12066.010986213, 'message_signature': '7c39cd592434f429ab6b0b10ce9932fbefb29acce15e8d12e8218a036ce7b0ca'}]}, 'timestamp': '2025-10-14 10:00:49.917585', '_unique_id': '7916975f516243d39b16cac6673af840'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.918 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:00:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.918 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.918 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.918 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.918 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.918 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.918 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.918 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.918 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.918 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.918 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.918 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.918 12 ERROR oslo_messaging.notify.messaging Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.918 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.918 12 ERROR oslo_messaging.notify.messaging Oct 14 06:00:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.918 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.918 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.918 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.918 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.918 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.918 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.918 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.918 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.918 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.918 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:00:49 
localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.918 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.918 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.918 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.918 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.918 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.918 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.918 12 ERROR oslo_messaging.notify.messaging Oct 14 06:00:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.918 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.918 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.919 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.919 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3e58c9f4-73f4-416f-a134-1283e32f992f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T10:00:49.919050', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': 
{'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': 'a98ea05c-a8e4-11f0-9707-fa163e99780b', 'monotonic_time': 12066.010986213, 'message_signature': '049cafd629ed61c52f2760735d06c6538f715502022b15a310e4428ce2e9a9c2'}]}, 'timestamp': '2025-10-14 10:00:49.919341', '_unique_id': 'a5b094ac463a4880b74b96a059b949b9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.919 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.919 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.919 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.919 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:00:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.919 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.919 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.919 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.919 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.919 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.919 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.919 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.919 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.919 12 ERROR oslo_messaging.notify.messaging Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.919 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.919 12 ERROR oslo_messaging.notify.messaging Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.919 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.919 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.919 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.919 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.919 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.919 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.919 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.919 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.919 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.919 12 ERROR oslo_messaging.notify.messaging 
File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.919 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.919 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.919 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.919 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.919 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in 
__exit__ Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.919 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.919 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.919 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.919 12 ERROR oslo_messaging.notify.messaging Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.920 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.920 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.read.bytes volume: 35560448 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.921 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.921 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '3ecc58f2-dfba-47f5-b2c6-f04405e1c98c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35560448, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T10:00:49.920742', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'a98ee2ba-a8e4-11f0-9707-fa163e99780b', 'monotonic_time': 12066.045645032, 'message_signature': 'f7e2ba8016e7741322bd81e18324058dd41d7e8696351d0de34824457ab17283'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T10:00:49.920742', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'a98eed3c-a8e4-11f0-9707-fa163e99780b', 'monotonic_time': 12066.045645032, 'message_signature': '680bfb9e2212d9f5bd56006d16b8021270490fca2b6491b6d9347d38fa80a8d1'}]}, 'timestamp': '2025-10-14 10:00:49.921289', '_unique_id': 'ebb5b24d89b14bcfa1651d0b7b079dd3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.921 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.921 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.921 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.921 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.921 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.921 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.921 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.921 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.921 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 
06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.921 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.921 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.921 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.921 12 ERROR oslo_messaging.notify.messaging Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.921 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.921 12 ERROR oslo_messaging.notify.messaging Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.921 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.921 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:00:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.921 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.921 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.921 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.921 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.921 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.921 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.921 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.921 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.921 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.921 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:00:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.921 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.921 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.921 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.921 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.921 12 ERROR oslo_messaging.notify.messaging Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.922 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.922 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.923 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'd589275c-b3c0-4f58-941e-f8339faa1f8b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T10:00:49.922725', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': 'a98f30e4-a8e4-11f0-9707-fa163e99780b', 'monotonic_time': 12066.010986213, 'message_signature': '24f56c4264742624dcb9bbedfdc21e505e22ebb84152639ba8e85fda588df559'}]}, 'timestamp': '2025-10-14 10:00:49.923044', '_unique_id': '140493ddd65242ce8c3daa952f795fd1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.923 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:00:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.923 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.923 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.923 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.923 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.923 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.923 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.923 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.923 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.923 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.923 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.923 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.923 12 ERROR oslo_messaging.notify.messaging Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.923 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.923 12 ERROR oslo_messaging.notify.messaging Oct 14 06:00:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.923 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.923 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.923 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.923 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.923 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.923 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.923 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.923 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.923 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.923 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:00:49 
localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.923 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.923 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.923 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.923 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.923 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.923 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.923 12 ERROR oslo_messaging.notify.messaging Oct 14 06:00:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.924 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.924 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.925 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a27c5d03-2c81-4538-90f6-d2447b4a7bdb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T10:00:49.924402', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 
'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': 'a98f71c6-a8e4-11f0-9707-fa163e99780b', 'monotonic_time': 12066.010986213, 'message_signature': '022fad71c2aaf1064825a0fc479f1ae85ba248a434bb648183b26da096f84a8d'}]}, 'timestamp': '2025-10-14 10:00:49.924736', '_unique_id': '4d206c29a3994335acf6f231db464a18'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.925 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.925 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.925 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.925 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.925 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.925 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.925 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.925 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.925 12 ERROR 
oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.925 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.925 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.925 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.925 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.925 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.925 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.925 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.925 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.925 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.925 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 
2025-10-14 10:00:49.925 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.925 12 ERROR oslo_messaging.notify.messaging Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.925 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.925 12 ERROR oslo_messaging.notify.messaging Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.925 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.925 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.925 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.925 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.925 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.925 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.925 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:00:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.925 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.925 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.925 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.925 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.925 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.925 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.925 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.925 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.925 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.925 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 
06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.925 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.925 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.925 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.925 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.925 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.925 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.925 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.925 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.925 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.925 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.925 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.925 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.925 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.925 12 ERROR oslo_messaging.notify.messaging Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.925 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.926 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.926 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.926 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:00:49 localhost podman[301741]: 2025-10-14 10:00:49.873937627 +0000 UTC m=+0.089163714 container health_status 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, 
org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d) Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.927 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '18798e0e-953a-4a88-9409-2bd7510b39ec', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T10:00:49.926156', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'a98fb618-a8e4-11f0-9707-fa163e99780b', 'monotonic_time': 12066.020536695, 'message_signature': '4d52fc5831661efa7aa6c11593e7f8929b82c2f448038fa627d930a6565ceb4e'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T10:00:49.926156', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 
'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'a98fc0cc-a8e4-11f0-9707-fa163e99780b', 'monotonic_time': 12066.020536695, 'message_signature': '3d27fe8a1d9bc1716c452f60bb7e4f63e689294b7fd196bfa7290c88ac77da94'}]}, 'timestamp': '2025-10-14 10:00:49.926729', '_unique_id': '46942bbc87124015ad9094f6fff230c7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.927 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.927 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.927 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.927 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.927 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.927 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.927 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.927 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.927 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 06:00:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.927 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.927 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.927 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.927 12 ERROR oslo_messaging.notify.messaging Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.927 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.927 12 ERROR oslo_messaging.notify.messaging Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.927 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.927 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 
10:00:49.927 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.927 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.927 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.927 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.927 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 06:00:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.927 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.927 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.927 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.927 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.927 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 
10:00:49.927 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.927 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.927 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.927 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.927 12 ERROR oslo_messaging.notify.messaging Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.928 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.928 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.928 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.outgoing.packets volume: 118 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.929 12 
ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5f3db461-9abc-4eb2-a601-608232218b20', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 118, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T10:00:49.928176', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': 'a990080c-a8e4-11f0-9707-fa163e99780b', 'monotonic_time': 12066.010986213, 'message_signature': '9104353f12c13cd979cf102b94a4b7c0f3ba7fec4d3fed12d9895987758ab302'}]}, 'timestamp': '2025-10-14 10:00:49.928581', '_unique_id': '1815d51084ef43dd9572630649adb638'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.929 12 ERROR oslo_messaging.notify.messaging 
Traceback (most recent call last): Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.929 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.929 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.929 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.929 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.929 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.929 
12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.929 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.929 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.929 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.929 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.929 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.929 12 ERROR oslo_messaging.notify.messaging Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.929 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.929 12 ERROR 
oslo_messaging.notify.messaging Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.929 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.929 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.929 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.929 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.929 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.929 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.929 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.929 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.929 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.929 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.929 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.929 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.929 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.929 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.929 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.929 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.929 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.929 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:00:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.929 12 ERROR oslo_messaging.notify.messaging Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.929 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.930 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.930 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/memory.usage volume: 51.81640625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.931 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '2a8eebeb-6b14-4830-b30a-f93f313f7962', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.81640625, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'timestamp': '2025-10-14T10:00:49.930206', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': 'a9905438-a8e4-11f0-9707-fa163e99780b', 'monotonic_time': 12066.108027246, 'message_signature': '56ef23a60ae761e4a49f1bf539d913f7c88815b17f22e3fb5c0f25cec715f7e9'}]}, 'timestamp': '2025-10-14 10:00:49.930487', '_unique_id': '14b385a66c2d4b88a2be6f29c78d28f6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.931 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in 
_reraise_as_library_errors Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.931 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.931 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.931 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.931 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.931 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.931 12 
ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.931 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.931 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.931 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.931 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.931 12 ERROR oslo_messaging.notify.messaging Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.931 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.931 12 ERROR oslo_messaging.notify.messaging Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.931 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 
2025-10-14 10:00:49.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.931 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.931 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.931 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.931 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.931 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 
06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.931 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.931 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.931 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.931 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.931 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:00:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.931 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.931 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.931 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.931 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.931 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.931 12 ERROR oslo_messaging.notify.messaging Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.931 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters Oct 14 06:00:49 
localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.931 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.read.requests volume: 1272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.932 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.933 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'de3485ac-2fa8-4cf4-afed-6709aee0ccf6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1272, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T10:00:49.931823', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 
512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'a9909420-a8e4-11f0-9707-fa163e99780b', 'monotonic_time': 12066.045645032, 'message_signature': '7fe62f60cd4c1c113422f1f7d223ddfd40725b4c9bdce354107946ca06e2e143'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T10:00:49.931823', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'a9909f10-a8e4-11f0-9707-fa163e99780b', 'monotonic_time': 12066.045645032, 'message_signature': 'e1295870ff3f2da9aba58ababfb9d5ddc82bd30c53986e92b84c4c44918d3f81'}]}, 'timestamp': '2025-10-14 10:00:49.932419', '_unique_id': 'c7b99e1670d14464ac83e30d0b1047cf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.933 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.933 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.933 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.933 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.933 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.933 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.933 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in 
establish_connection Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.933 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.933 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.933 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.933 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.933 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.933 12 ERROR oslo_messaging.notify.messaging Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.933 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.933 12 ERROR oslo_messaging.notify.messaging Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.933 12 ERROR 
oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.933 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.933 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.933 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.933 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 
2025-10-14 10:00:49.933 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.933 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.933 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.933 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.933 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 
10:00:49.933 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.933 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.933 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.933 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.933 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.933 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.933 12 ERROR oslo_messaging.notify.messaging Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.933 12 INFO 
ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.933 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.incoming.bytes volume: 8190 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.934 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'feb5468f-c397-4175-9e0b-298ee5f049a5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 8190, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T10:00:49.933799', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 
'tap3ec9b060-f4'}, 'message_id': 'a990e0b0-a8e4-11f0-9707-fa163e99780b', 'monotonic_time': 12066.010986213, 'message_signature': 'bfeb8cc6a9edb495405085ec0cdc5301f8f0df05a48b2a513d7b4c688a065bd4'}]}, 'timestamp': '2025-10-14 10:00:49.934093', '_unique_id': '412f851b11cb416b95d5d4f1dffb3988'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.934 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.934 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.934 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.934 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.934 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.934 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.934 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.934 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.934 12 ERROR oslo_messaging.notify.messaging self._connection = 
self._establish_connection() Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.934 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.934 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.934 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.934 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.934 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.934 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.934 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.934 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.934 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.934 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.934 12 ERROR 
oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.934 12 ERROR oslo_messaging.notify.messaging Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.934 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.934 12 ERROR oslo_messaging.notify.messaging Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.934 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.934 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.934 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.934 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.934 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.934 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.934 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.934 12 
ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.934 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.934 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.934 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.934 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.934 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.934 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.934 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.934 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.934 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 
10:00:49.934 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.934 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.934 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.934 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.934 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.934 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.934 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.934 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.934 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.934 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.934 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 
450, in _reraise_as_library_errors Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.934 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.934 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.934 12 ERROR oslo_messaging.notify.messaging Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.935 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.935 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.incoming.packets volume: 79 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.936 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'a45a7549-e67b-414d-8db4-9149a0d1f356', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 79, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T10:00:49.935404', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': 'a9911f3a-a8e4-11f0-9707-fa163e99780b', 'monotonic_time': 12066.010986213, 'message_signature': '5cc142120d46d8a68b94279b2013c1630044138d079d75177853a8de5eb24050'}]}, 'timestamp': '2025-10-14 10:00:49.935714', '_unique_id': 'a359ea5352e64b74b795112f0622e8ef'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.936 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:00:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.936 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.936 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.936 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.936 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.936 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.936 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.936 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.936 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.936 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.936 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.936 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.936 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.936 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.936 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.936 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.936 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.936 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.936 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.936 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.936 12 ERROR oslo_messaging.notify.messaging Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.936 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.936 12 ERROR oslo_messaging.notify.messaging Oct 14 06:00:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.936 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.936 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.936 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.936 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.936 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.936 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.936 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.936 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.936 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.936 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.936 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.936 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.936 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.936 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.936 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.936 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.936 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.936 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.936 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.936 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:00:49 
localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.936 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.936 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.936 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.936 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.936 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.936 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.936 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.936 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.936 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.936 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:00:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:00:49.936 12 ERROR oslo_messaging.notify.messaging Oct 14 06:00:49 localhost podman[301741]: 
2025-10-14 10:00:49.955881589 +0000 UTC m=+0.171107726 container exec_died 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d) Oct 14 06:00:49 localhost systemd[1]: 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d.service: Deactivated successfully. 
Oct 14 06:00:50 localhost nova_compute[297686]: 2025-10-14 10:00:50.930 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:00:51 localhost systemd[1]: session-65.scope: Deactivated successfully. Oct 14 06:00:51 localhost systemd[1]: session-65.scope: Consumed 1.888s CPU time. Oct 14 06:00:51 localhost systemd-logind[760]: Session 65 logged out. Waiting for processes to exit. Oct 14 06:00:51 localhost systemd-logind[760]: Removed session 65. Oct 14 06:00:53 localhost nova_compute[297686]: 2025-10-14 10:00:53.532 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:00:55 localhost nova_compute[297686]: 2025-10-14 10:00:55.975 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:00:57 localhost ovn_metadata_agent[163050]: 2025-10-14 10:00:57.769 163055 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Oct 14 06:00:57 localhost ovn_metadata_agent[163050]: 2025-10-14 10:00:57.769 163055 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Oct 14 06:00:57 localhost ovn_metadata_agent[163050]: 2025-10-14 10:00:57.771 163055 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Oct 14 06:00:58 localhost podman[248187]: 
time="2025-10-14T10:00:58Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Oct 14 06:00:58 localhost podman[248187]: @ - - [14/Oct/2025:10:00:58 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 143236 "" "Go-http-client/1.1" Oct 14 06:00:58 localhost podman[248187]: @ - - [14/Oct/2025:10:00:58 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18850 "" "Go-http-client/1.1" Oct 14 06:00:58 localhost nova_compute[297686]: 2025-10-14 10:00:58.573 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:01:01 localhost nova_compute[297686]: 2025-10-14 10:01:01.001 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:01:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041. Oct 14 06:01:01 localhost systemd[1]: Stopping User Manager for UID 1003... Oct 14 06:01:01 localhost systemd[300199]: Activating special unit Exit the Session... Oct 14 06:01:01 localhost systemd[300199]: Stopped target Main User Target. Oct 14 06:01:01 localhost systemd[300199]: Stopped target Basic System. Oct 14 06:01:01 localhost systemd[300199]: Stopped target Paths. Oct 14 06:01:01 localhost systemd[300199]: Stopped target Sockets. Oct 14 06:01:01 localhost systemd[300199]: Stopped target Timers. Oct 14 06:01:01 localhost systemd[300199]: Stopped Mark boot as successful after the user session has run 2 minutes. Oct 14 06:01:01 localhost systemd[300199]: Stopped Daily Cleanup of User's Temporary Directories. Oct 14 06:01:01 localhost systemd[300199]: Closed D-Bus User Message Bus Socket. Oct 14 06:01:01 localhost systemd[300199]: Stopped Create User's Volatile Files and Directories. 
Oct 14 06:01:01 localhost systemd[300199]: Removed slice User Application Slice. Oct 14 06:01:01 localhost systemd[300199]: Reached target Shutdown. Oct 14 06:01:01 localhost systemd[300199]: Finished Exit the Session. Oct 14 06:01:01 localhost systemd[300199]: Reached target Exit the Session. Oct 14 06:01:01 localhost systemd[1]: user@1003.service: Deactivated successfully. Oct 14 06:01:01 localhost systemd[1]: Stopped User Manager for UID 1003. Oct 14 06:01:01 localhost systemd[1]: Stopping User Runtime Directory /run/user/1003... Oct 14 06:01:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857. Oct 14 06:01:01 localhost systemd[1]: run-user-1003.mount: Deactivated successfully. Oct 14 06:01:01 localhost systemd[1]: user-runtime-dir@1003.service: Deactivated successfully. Oct 14 06:01:01 localhost systemd[1]: Stopped User Runtime Directory /run/user/1003. Oct 14 06:01:01 localhost systemd[1]: Removed slice User Slice of UID 1003. Oct 14 06:01:01 localhost systemd[1]: user-1003.slice: Consumed 2.302s CPU time. 
Oct 14 06:01:01 localhost podman[301911]: 2025-10-14 10:01:01.753213418 +0000 UTC m=+0.058901689 container health_status 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251009, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image) Oct 14 06:01:01 localhost podman[301911]: 2025-10-14 10:01:01.786265137 +0000 UTC 
m=+0.091953428 container exec_died 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5) Oct 14 06:01:01 localhost systemd[1]: 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857.service: Deactivated successfully. 
Oct 14 06:01:01 localhost podman[301904]: 2025-10-14 10:01:01.73756273 +0000 UTC m=+0.077061683 container health_status 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Oct 14 06:01:01 localhost podman[301904]: 2025-10-14 10:01:01.867978192 +0000 UTC m=+0.207477195 container exec_died 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': 
['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Oct 14 06:01:01 localhost systemd[1]: 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041.service: Deactivated successfully. Oct 14 06:01:03 localhost nova_compute[297686]: 2025-10-14 10:01:03.577 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:01:06 localhost nova_compute[297686]: 2025-10-14 10:01:06.004 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:01:08 localhost nova_compute[297686]: 2025-10-14 10:01:08.609 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:01:08 localhost openstack_network_exporter[250374]: ERROR 10:01:08 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Oct 14 06:01:08 localhost openstack_network_exporter[250374]: ERROR 10:01:08 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Oct 14 06:01:08 localhost openstack_network_exporter[250374]: ERROR 10:01:08 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Oct 14 06:01:08 localhost openstack_network_exporter[250374]: ERROR 10:01:08 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Oct 14 06:01:08 localhost openstack_network_exporter[250374]: Oct 14 06:01:08 localhost openstack_network_exporter[250374]: ERROR 10:01:08 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Oct 14 06:01:08 localhost openstack_network_exporter[250374]: Oct 14 06:01:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29. 
Oct 14 06:01:10 localhost podman[301943]: 2025-10-14 10:01:10.737913458 +0000 UTC m=+0.079014563 container health_status fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251009, container_name=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_id=iscsid, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image) Oct 14 06:01:10 localhost podman[301943]: 2025-10-14 10:01:10.751197544 +0000 UTC m=+0.092298629 container exec_died fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 
(image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=iscsid, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, tcib_managed=true) Oct 14 06:01:10 localhost systemd[1]: fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29.service: Deactivated successfully. 
Oct 14 06:01:11 localhost nova_compute[297686]: 2025-10-14 10:01:11.055 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:01:12 localhost nova_compute[297686]: 2025-10-14 10:01:12.255 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:01:12 localhost nova_compute[297686]: 2025-10-14 10:01:12.256 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:01:12 localhost nova_compute[297686]: 2025-10-14 10:01:12.256 2 DEBUG nova.compute.manager [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m Oct 14 06:01:12 localhost nova_compute[297686]: 2025-10-14 10:01:12.275 2 DEBUG nova.compute.manager [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m Oct 14 06:01:12 localhost nova_compute[297686]: 2025-10-14 10:01:12.276 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:01:12 localhost nova_compute[297686]: 2025-10-14 10:01:12.276 2 DEBUG nova.compute.manager [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Cleaning up deleted instances with incomplete migration _cleanup_incomplete_migrations 
/usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m Oct 14 06:01:12 localhost nova_compute[297686]: 2025-10-14 10:01:12.291 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:01:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec. Oct 14 06:01:12 localhost systemd[1]: tmp-crun.2WrdkG.mount: Deactivated successfully. Oct 14 06:01:12 localhost podman[301962]: 2025-10-14 10:01:12.745873887 +0000 UTC m=+0.086682917 container health_status aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': 
'/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm) Oct 14 06:01:12 localhost podman[301962]: 2025-10-14 10:01:12.766860077 +0000 UTC m=+0.107669137 container exec_died aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Oct 14 06:01:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad. Oct 14 06:01:12 localhost systemd[1]: aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec.service: Deactivated successfully. 
Oct 14 06:01:12 localhost podman[301985]: 2025-10-14 10:01:12.853925775 +0000 UTC m=+0.079313102 container health_status 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible) Oct 14 06:01:12 localhost podman[301985]: 2025-10-14 10:01:12.869448109 +0000 UTC m=+0.094835376 container exec_died 
02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, config_id=multipathd, container_name=multipathd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2) Oct 14 06:01:12 localhost systemd[1]: 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad.service: Deactivated successfully. 
Oct 14 06:01:13 localhost nova_compute[297686]: 2025-10-14 10:01:13.653 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:01:14 localhost nova_compute[297686]: 2025-10-14 10:01:14.303 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:01:16 localhost nova_compute[297686]: 2025-10-14 10:01:16.105 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:01:16 localhost nova_compute[297686]: 2025-10-14 10:01:16.251 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:01:16 localhost nova_compute[297686]: 2025-10-14 10:01:16.255 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:01:16 localhost nova_compute[297686]: 2025-10-14 10:01:16.255 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:01:16 localhost nova_compute[297686]: 2025-10-14 10:01:16.255 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks 
/usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:01:16 localhost nova_compute[297686]: 2025-10-14 10:01:16.255 2 DEBUG nova.compute.manager [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Oct 14 06:01:17 localhost nova_compute[297686]: 2025-10-14 10:01:17.256 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:01:17 localhost nova_compute[297686]: 2025-10-14 10:01:17.276 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Oct 14 06:01:17 localhost nova_compute[297686]: 2025-10-14 10:01:17.277 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Oct 14 06:01:17 localhost nova_compute[297686]: 2025-10-14 10:01:17.277 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Oct 14 06:01:17 localhost nova_compute[297686]: 2025-10-14 10:01:17.277 2 DEBUG nova.compute.resource_tracker [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Auditing locally 
available compute resources for np0005486733.localdomain (node: np0005486733.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Oct 14 06:01:17 localhost nova_compute[297686]: 2025-10-14 10:01:17.278 2 DEBUG oslo_concurrency.processutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Oct 14 06:01:17 localhost nova_compute[297686]: 2025-10-14 10:01:17.755 2 DEBUG oslo_concurrency.processutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.477s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Oct 14 06:01:17 localhost nova_compute[297686]: 2025-10-14 10:01:17.840 2 DEBUG nova.virt.libvirt.driver [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Oct 14 06:01:17 localhost nova_compute[297686]: 2025-10-14 10:01:17.841 2 DEBUG nova.virt.libvirt.driver [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Oct 14 06:01:18 localhost nova_compute[297686]: 2025-10-14 10:01:18.053 2 WARNING nova.virt.libvirt.driver [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Oct 14 06:01:18 localhost nova_compute[297686]: 2025-10-14 10:01:18.055 2 DEBUG nova.compute.resource_tracker [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Hypervisor/Node resource view: name=np0005486733.localdomain free_ram=11931MB free_disk=41.83695602416992GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": 
"1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Oct 14 06:01:18 localhost nova_compute[297686]: 2025-10-14 10:01:18.055 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Oct 14 06:01:18 localhost nova_compute[297686]: 2025-10-14 10:01:18.056 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Oct 14 06:01:18 localhost nova_compute[297686]: 2025-10-14 10:01:18.447 2 DEBUG nova.compute.resource_tracker [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Instance 88c4e366-b765-47a6-96bf-f7677f2ce67c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Oct 14 06:01:18 localhost nova_compute[297686]: 2025-10-14 10:01:18.447 2 DEBUG nova.compute.resource_tracker [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Oct 14 06:01:18 localhost nova_compute[297686]: 2025-10-14 10:01:18.448 2 DEBUG nova.compute.resource_tracker [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Final resource view: name=np0005486733.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Oct 14 06:01:18 localhost nova_compute[297686]: 2025-10-14 10:01:18.656 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:01:18 localhost nova_compute[297686]: 2025-10-14 10:01:18.665 2 DEBUG nova.scheduler.client.report [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Refreshing inventories for resource provider 18c24273-aca2-4f08-be57-3188d558235e _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m Oct 14 06:01:18 localhost nova_compute[297686]: 2025-10-14 10:01:18.885 2 DEBUG nova.scheduler.client.report [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Updating ProviderTree inventory for provider 18c24273-aca2-4f08-be57-3188d558235e from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 
'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m Oct 14 06:01:18 localhost nova_compute[297686]: 2025-10-14 10:01:18.886 2 DEBUG nova.compute.provider_tree [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Updating inventory in ProviderTree for provider 18c24273-aca2-4f08-be57-3188d558235e with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m Oct 14 06:01:18 localhost nova_compute[297686]: 2025-10-14 10:01:18.903 2 DEBUG nova.scheduler.client.report [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Refreshing aggregate associations for resource provider 18c24273-aca2-4f08-be57-3188d558235e, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m Oct 14 06:01:18 localhost nova_compute[297686]: 2025-10-14 10:01:18.923 2 DEBUG nova.scheduler.client.report [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Refreshing trait associations for resource provider 18c24273-aca2-4f08-be57-3188d558235e, traits: 
COMPUTE_VOLUME_EXTEND,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE,HW_CPU_X86_BMI,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_MMX,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_SECURITY_TPM_2_0,HW_CPU_X86_FMA3,COMPUTE_GRAPHICS_MODEL_BOCHS,HW_CPU_X86_AVX2,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_SECURITY_TPM_1_2,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_AMD_SVM,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_ACCELERATORS,HW_CPU_X86_AESNI,COMPUTE_STORAGE_BUS_FDC,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_F16C,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SVM,COMPUTE_STORAGE_BUS_IDE,HW_CPU_X86_SSE4A,HW_CPU_X86_SSSE3,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NODE,HW_CPU_X86_BMI2,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_TRUSTED_CERTS,COMPUTE_STORAGE_BUS_SATA,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_STORAGE_BUS_USB,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_ABM,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_SSE2,HW_CPU_X86_AVX,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_CLMUL,HW_CPU_X86_SHA,COMPUTE_IMAGE_TYPE_RAW _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m Oct 14 06:01:18 localhost nova_compute[297686]: 2025-10-14 10:01:18.985 2 DEBUG oslo_concurrency.processutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Oct 14 06:01:19 localhost nova_compute[297686]: 2025-10-14 10:01:19.459 2 DEBUG oslo_concurrency.processutils [None 
req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.474s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Oct 14 06:01:19 localhost nova_compute[297686]: 2025-10-14 10:01:19.466 2 DEBUG nova.compute.provider_tree [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Inventory has not changed in ProviderTree for provider: 18c24273-aca2-4f08-be57-3188d558235e update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Oct 14 06:01:19 localhost nova_compute[297686]: 2025-10-14 10:01:19.480 2 DEBUG nova.scheduler.client.report [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Inventory has not changed for provider 18c24273-aca2-4f08-be57-3188d558235e based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Oct 14 06:01:19 localhost nova_compute[297686]: 2025-10-14 10:01:19.482 2 DEBUG nova.compute.resource_tracker [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Compute_service record updated for np0005486733.localdomain:np0005486733.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Oct 14 06:01:19 localhost nova_compute[297686]: 2025-10-14 10:01:19.483 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.427s inner 
/usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Oct 14 06:01:20 localhost nova_compute[297686]: 2025-10-14 10:01:20.483 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:01:20 localhost nova_compute[297686]: 2025-10-14 10:01:20.483 2 DEBUG nova.compute.manager [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Oct 14 06:01:20 localhost nova_compute[297686]: 2025-10-14 10:01:20.484 2 DEBUG nova.compute.manager [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Oct 14 06:01:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f. Oct 14 06:01:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917. Oct 14 06:01:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d. Oct 14 06:01:20 localhost systemd[1]: tmp-crun.q7bRDL.mount: Deactivated successfully. 
Oct 14 06:01:20 localhost podman[302048]: 2025-10-14 10:01:20.760936649 +0000 UTC m=+0.097329122 container health_status 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.license=GPLv2, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3) Oct 14 06:01:20 localhost podman[302049]: 2025-10-14 10:01:20.873756433 +0000 UTC m=+0.206429783 container health_status 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, com.redhat.component=ubi9-minimal-container, config_data={'image': 
'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, config_id=edpm, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, io.openshift.expose-services=, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., architecture=x86_64, name=ubi9-minimal, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, container_name=openstack_network_exporter) Oct 14 06:01:20 localhost podman[302049]: 2025-10-14 10:01:20.88608672 +0000 UTC m=+0.218760010 container exec_died 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, release=1755695350, vendor=Red Hat, Inc., managed_by=edpm_ansible, vcs-type=git, io.openshift.expose-services=, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped 
down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, com.redhat.component=ubi9-minimal-container) Oct 14 06:01:20 localhost podman[302050]: 2025-10-14 10:01:20.847290095 +0000 UTC m=+0.178014045 container health_status 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=edpm, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 
'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}) Oct 14 06:01:20 localhost systemd[1]: 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917.service: Deactivated successfully. 
Oct 14 06:01:20 localhost podman[302050]: 2025-10-14 10:01:20.930371641 +0000 UTC m=+0.261095511 container exec_died 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible, container_name=ceilometer_agent_compute, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team) Oct 14 06:01:20 localhost systemd[1]: 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d.service: Deactivated successfully. 
Oct 14 06:01:20 localhost podman[302048]: 2025-10-14 10:01:20.987253478 +0000 UTC m=+0.323645991 container exec_died 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0) Oct 14 06:01:20 localhost systemd[1]: 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f.service: Deactivated successfully. 
Oct 14 06:01:21 localhost nova_compute[297686]: 2025-10-14 10:01:21.107 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:01:21 localhost nova_compute[297686]: 2025-10-14 10:01:21.405 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Acquiring lock "refresh_cache-88c4e366-b765-47a6-96bf-f7677f2ce67c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Oct 14 06:01:21 localhost nova_compute[297686]: 2025-10-14 10:01:21.406 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Acquired lock "refresh_cache-88c4e366-b765-47a6-96bf-f7677f2ce67c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Oct 14 06:01:21 localhost nova_compute[297686]: 2025-10-14 10:01:21.406 2 DEBUG nova.network.neutron [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] [instance: 88c4e366-b765-47a6-96bf-f7677f2ce67c] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Oct 14 06:01:21 localhost nova_compute[297686]: 2025-10-14 10:01:21.407 2 DEBUG nova.objects.instance [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Lazy-loading 'info_cache' on Instance uuid 88c4e366-b765-47a6-96bf-f7677f2ce67c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Oct 14 06:01:22 localhost nova_compute[297686]: 2025-10-14 10:01:22.558 2 DEBUG nova.network.neutron [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] [instance: 88c4e366-b765-47a6-96bf-f7677f2ce67c] Updating instance_info_cache with network_info: [{"id": "3ec9b060-f43d-4698-9c76-6062c70911d5", "address": "fa:16:3e:84:5e:e5", "network": {"id": "7d0cd696-bdd7-4e70-9512-eb0d23640314", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": 
[], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.46", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "41187b090f3d4818a32baa37ce8a3991", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3ec9b060-f4", "ovs_interfaceid": "3ec9b060-f43d-4698-9c76-6062c70911d5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Oct 14 06:01:22 localhost nova_compute[297686]: 2025-10-14 10:01:22.573 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Releasing lock "refresh_cache-88c4e366-b765-47a6-96bf-f7677f2ce67c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Oct 14 06:01:22 localhost nova_compute[297686]: 2025-10-14 10:01:22.574 2 DEBUG nova.compute.manager [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] [instance: 88c4e366-b765-47a6-96bf-f7677f2ce67c] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Oct 14 06:01:22 localhost nova_compute[297686]: 2025-10-14 10:01:22.576 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:01:23 localhost 
nova_compute[297686]: 2025-10-14 10:01:23.695 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:01:26 localhost nova_compute[297686]: 2025-10-14 10:01:26.140 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:01:28 localhost podman[248187]: time="2025-10-14T10:01:28Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Oct 14 06:01:28 localhost podman[248187]: @ - - [14/Oct/2025:10:01:28 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 143236 "" "Go-http-client/1.1" Oct 14 06:01:28 localhost podman[248187]: @ - - [14/Oct/2025:10:01:28 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18856 "" "Go-http-client/1.1" Oct 14 06:01:28 localhost nova_compute[297686]: 2025-10-14 10:01:28.740 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:01:31 localhost nova_compute[297686]: 2025-10-14 10:01:31.175 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:01:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041. Oct 14 06:01:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857. 
Oct 14 06:01:32 localhost podman[302168]: 2025-10-14 10:01:32.712499651 +0000 UTC m=+0.055600739 container health_status 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Oct 14 06:01:32 localhost podman[302169]: 2025-10-14 10:01:32.760719843 +0000 UTC m=+0.104051118 container health_status 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 
'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5) Oct 14 06:01:32 localhost podman[302169]: 2025-10-14 10:01:32.792085511 +0000 UTC m=+0.135416776 container exec_died 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 
'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0) Oct 14 06:01:32 localhost systemd[1]: 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857.service: Deactivated successfully. 
Oct 14 06:01:32 localhost podman[302168]: 2025-10-14 10:01:32.845980756 +0000 UTC m=+0.189081814 container exec_died 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Oct 14 06:01:32 localhost systemd[1]: 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041.service: Deactivated successfully. 
Oct 14 06:01:33 localhost nova_compute[297686]: 2025-10-14 10:01:33.783 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:01:34 localhost podman[302287]: Oct 14 06:01:34 localhost podman[302287]: 2025-10-14 10:01:34.943176328 +0000 UTC m=+0.063629303 container create 0522b5d15bccfb4399805e75e7f280e15b7fab260bf9145fa8e10e057905cd3c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=cool_elion, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, ceph=True, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, version=7, build-date=2025-09-24T08:57:55, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, release=553, vendor=Red Hat, Inc., io.openshift.expose-services=, RELEASE=main, architecture=x86_64, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, GIT_BRANCH=main) Oct 14 06:01:34 localhost systemd[1]: Started libpod-conmon-0522b5d15bccfb4399805e75e7f280e15b7fab260bf9145fa8e10e057905cd3c.scope. Oct 14 06:01:35 localhost podman[302287]: 2025-10-14 10:01:34.912286785 +0000 UTC m=+0.032739790 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Oct 14 06:01:35 localhost systemd[1]: Started libcrun container. 
Oct 14 06:01:35 localhost podman[302287]: 2025-10-14 10:01:35.037142227 +0000 UTC m=+0.157595192 container init 0522b5d15bccfb4399805e75e7f280e15b7fab260bf9145fa8e10e057905cd3c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=cool_elion, release=553, vendor=Red Hat, Inc., ceph=True, architecture=x86_64, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-09-24T08:57:55, GIT_BRANCH=main, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, version=7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-type=git, io.buildah.version=1.33.12, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux , url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, name=rhceph) Oct 14 06:01:35 localhost podman[302287]: 2025-10-14 10:01:35.045491342 +0000 UTC m=+0.165944337 container start 0522b5d15bccfb4399805e75e7f280e15b7fab260bf9145fa8e10e057905cd3c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=cool_elion, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, io.openshift.expose-services=, maintainer=Guillaume Abrioux , url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, version=7, release=553, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, io.buildah.version=1.33.12, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vendor=Red Hat, Inc., 
name=rhceph, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, com.redhat.component=rhceph-container, build-date=2025-09-24T08:57:55, com.redhat.license_terms=https://www.redhat.com/agreements, RELEASE=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.description=Red Hat Ceph Storage 7) Oct 14 06:01:35 localhost podman[302287]: 2025-10-14 10:01:35.046740059 +0000 UTC m=+0.167193084 container attach 0522b5d15bccfb4399805e75e7f280e15b7fab260bf9145fa8e10e057905cd3c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=cool_elion, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, maintainer=Guillaume Abrioux , vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, CEPH_POINT_RELEASE=, GIT_BRANCH=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, build-date=2025-09-24T08:57:55, RELEASE=main, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, name=rhceph, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, architecture=x86_64, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, io.buildah.version=1.33.12, release=553) Oct 14 06:01:35 localhost cool_elion[302302]: 167 167 Oct 14 06:01:35 localhost systemd[1]: libpod-0522b5d15bccfb4399805e75e7f280e15b7fab260bf9145fa8e10e057905cd3c.scope: Deactivated successfully. 
Oct 14 06:01:35 localhost podman[302287]: 2025-10-14 10:01:35.052467334 +0000 UTC m=+0.172920319 container died 0522b5d15bccfb4399805e75e7f280e15b7fab260bf9145fa8e10e057905cd3c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=cool_elion, release=553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.buildah.version=1.33.12, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-type=git, maintainer=Guillaume Abrioux , distribution-scope=public, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, io.openshift.expose-services=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, ceph=True, CEPH_POINT_RELEASE=, version=7, build-date=2025-09-24T08:57:55, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0) Oct 14 06:01:35 localhost podman[302307]: 2025-10-14 10:01:35.129226028 +0000 UTC m=+0.072127263 container remove 0522b5d15bccfb4399805e75e7f280e15b7fab260bf9145fa8e10e057905cd3c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=cool_elion, com.redhat.component=rhceph-container, name=rhceph, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, maintainer=Guillaume Abrioux , distribution-scope=public, GIT_BRANCH=main, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, release=553, version=7, vcs-type=git, vendor=Red Hat, Inc., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, 
com.redhat.license_terms=https://www.redhat.com/agreements, RELEASE=main, architecture=x86_64, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-09-24T08:57:55, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=) Oct 14 06:01:35 localhost systemd[1]: libpod-conmon-0522b5d15bccfb4399805e75e7f280e15b7fab260bf9145fa8e10e057905cd3c.scope: Deactivated successfully. Oct 14 06:01:35 localhost systemd[1]: Reloading. Oct 14 06:01:35 localhost systemd-rc-local-generator[302346]: /etc/rc.d/rc.local is not marked executable, skipping. Oct 14 06:01:35 localhost systemd-sysv-generator[302350]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Oct 14 06:01:35 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Oct 14 06:01:35 localhost systemd[1]: var-lib-containers-storage-overlay-53c8c68413bf038ec41630b245bfbc1ba9dfb55dd4af8f02bf01dd3e9820cf37-merged.mount: Deactivated successfully. Oct 14 06:01:35 localhost systemd[1]: Reloading. Oct 14 06:01:35 localhost systemd-sysv-generator[302397]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Oct 14 06:01:35 localhost systemd-rc-local-generator[302394]: /etc/rc.d/rc.local is not marked executable, skipping. 
Oct 14 06:01:35 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Oct 14 06:01:35 localhost systemd[1]: Starting Ceph mgr.np0005486733.primvu for fcadf6e2-9176-5818-a8d0-37b19acf8eaf... Oct 14 06:01:36 localhost nova_compute[297686]: 2025-10-14 10:01:36.178 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:01:36 localhost podman[302453]: Oct 14 06:01:36 localhost podman[302453]: 2025-10-14 10:01:36.24566623 +0000 UTC m=+0.076662301 container create e17b402d5e21710846cee92a390888733413c30560bd4ea5453156f28b0d4d3b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-mgr-np0005486733-primvu, description=Red Hat Ceph Storage 7, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/agreements, CEPH_POINT_RELEASE=, io.buildah.version=1.33.12, maintainer=Guillaume Abrioux , release=553, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., architecture=x86_64, vcs-type=git, ceph=True, GIT_CLEAN=True, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, name=rhceph, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, build-date=2025-09-24T08:57:55, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553) Oct 14 06:01:36 localhost kernel: xfs filesystem being remounted at 
/var/lib/containers/storage/overlay/974a69176d9c755e3d8cc8e9d3a1bca7a7f6f79c5f15d9677e9cfc14a7ebfbee/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff) Oct 14 06:01:36 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/974a69176d9c755e3d8cc8e9d3a1bca7a7f6f79c5f15d9677e9cfc14a7ebfbee/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff) Oct 14 06:01:36 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/974a69176d9c755e3d8cc8e9d3a1bca7a7f6f79c5f15d9677e9cfc14a7ebfbee/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff) Oct 14 06:01:36 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/974a69176d9c755e3d8cc8e9d3a1bca7a7f6f79c5f15d9677e9cfc14a7ebfbee/merged/var/lib/ceph/mgr/ceph-np0005486733.primvu supports timestamps until 2038 (0x7fffffff) Oct 14 06:01:36 localhost podman[302453]: 2025-10-14 10:01:36.306791136 +0000 UTC m=+0.137787207 container init e17b402d5e21710846cee92a390888733413c30560bd4ea5453156f28b0d4d3b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-mgr-np0005486733-primvu, architecture=x86_64, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-09-24T08:57:55, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, maintainer=Guillaume Abrioux , distribution-scope=public, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.buildah.version=1.33.12, ceph=True, GIT_CLEAN=True, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/agreements, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, release=553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, 
com.redhat.component=rhceph-container, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, name=rhceph) Oct 14 06:01:36 localhost podman[302453]: 2025-10-14 10:01:36.31283615 +0000 UTC m=+0.143832181 container start e17b402d5e21710846cee92a390888733413c30560bd4ea5453156f28b0d4d3b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-mgr-np0005486733-primvu, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.33.12, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, maintainer=Guillaume Abrioux , io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, version=7, vcs-type=git, ceph=True, architecture=x86_64, io.openshift.expose-services=, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, io.openshift.tags=rhceph ceph, build-date=2025-09-24T08:57:55, RELEASE=main, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/agreements, release=553, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container) Oct 14 06:01:36 localhost bash[302453]: e17b402d5e21710846cee92a390888733413c30560bd4ea5453156f28b0d4d3b Oct 14 06:01:36 localhost podman[302453]: 2025-10-14 10:01:36.218187862 +0000 UTC m=+0.049183923 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Oct 14 06:01:36 localhost systemd[1]: Started Ceph mgr.np0005486733.primvu for fcadf6e2-9176-5818-a8d0-37b19acf8eaf. 
Oct 14 06:01:36 localhost ceph-mgr[302471]: set uid:gid to 167:167 (ceph:ceph) Oct 14 06:01:36 localhost ceph-mgr[302471]: ceph version 18.2.1-361.el9cp (439dcd6094d413840eb2ec590fe2194ec616687f) reef (stable), process ceph-mgr, pid 2 Oct 14 06:01:36 localhost ceph-mgr[302471]: pidfile_write: ignore empty --pid-file Oct 14 06:01:36 localhost ceph-mgr[302471]: mgr[py] Loading python module 'alerts' Oct 14 06:01:36 localhost ceph-mgr[302471]: mgr[py] Module alerts has missing NOTIFY_TYPES member Oct 14 06:01:36 localhost ceph-mgr[302471]: mgr[py] Loading python module 'balancer' Oct 14 06:01:36 localhost ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-mgr-np0005486733-primvu[302467]: 2025-10-14T10:01:36.507+0000 7f148dbdd140 -1 mgr[py] Module alerts has missing NOTIFY_TYPES member Oct 14 06:01:36 localhost ceph-mgr[302471]: mgr[py] Module balancer has missing NOTIFY_TYPES member Oct 14 06:01:36 localhost ceph-mgr[302471]: mgr[py] Loading python module 'cephadm' Oct 14 06:01:36 localhost ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-mgr-np0005486733-primvu[302467]: 2025-10-14T10:01:36.576+0000 7f148dbdd140 -1 mgr[py] Module balancer has missing NOTIFY_TYPES member Oct 14 06:01:37 localhost ceph-mgr[302471]: mgr[py] Loading python module 'crash' Oct 14 06:01:37 localhost ceph-mgr[302471]: mgr[py] Module crash has missing NOTIFY_TYPES member Oct 14 06:01:37 localhost ceph-mgr[302471]: mgr[py] Loading python module 'dashboard' Oct 14 06:01:37 localhost ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-mgr-np0005486733-primvu[302467]: 2025-10-14T10:01:37.229+0000 7f148dbdd140 -1 mgr[py] Module crash has missing NOTIFY_TYPES member Oct 14 06:01:37 localhost ceph-mgr[302471]: mgr[py] Loading python module 'devicehealth' Oct 14 06:01:37 localhost ceph-mgr[302471]: mgr[py] Module devicehealth has missing NOTIFY_TYPES member Oct 14 06:01:37 localhost ceph-mgr[302471]: mgr[py] Loading python module 'diskprediction_local' Oct 14 06:01:37 localhost 
ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-mgr-np0005486733-primvu[302467]: 2025-10-14T10:01:37.797+0000 7f148dbdd140 -1 mgr[py] Module devicehealth has missing NOTIFY_TYPES member Oct 14 06:01:37 localhost systemd[1]: tmp-crun.ipDyd3.mount: Deactivated successfully. Oct 14 06:01:37 localhost podman[302630]: 2025-10-14 10:01:37.900852979 +0000 UTC m=+0.104329326 container exec b19a91314e045cbe5465f6abdeb6b420108fcda7860b498b01dcd4e65236d2c8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-crash-np0005486733, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux , com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vendor=Red Hat, Inc., io.buildah.version=1.33.12, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.description=Red Hat Ceph Storage 7, version=7, release=553, build-date=2025-09-24T08:57:55, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, name=rhceph, CEPH_POINT_RELEASE=, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, ceph=True, GIT_CLEAN=True, vcs-type=git) Oct 14 06:01:37 localhost ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-mgr-np0005486733-primvu[302467]: /lib64/python3.9/site-packages/scipy/__init__.py:73: UserWarning: NumPy was imported from a Python sub-interpreter but NumPy does not properly support sub-interpreters. This will likely work for most users but might cause hard to track down issues or subtle bugs. 
A common user of the rare sub-interpreter feature is wsgi which also allows single-interpreter mode. Oct 14 06:01:37 localhost ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-mgr-np0005486733-primvu[302467]: Improvements in the case of bugs are welcome, but is not on the NumPy roadmap, and full support may require significant effort to achieve. Oct 14 06:01:37 localhost ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-mgr-np0005486733-primvu[302467]: from numpy import show_config as show_numpy_config Oct 14 06:01:37 localhost ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-mgr-np0005486733-primvu[302467]: 2025-10-14T10:01:37.939+0000 7f148dbdd140 -1 mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member Oct 14 06:01:37 localhost ceph-mgr[302471]: mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member Oct 14 06:01:37 localhost ceph-mgr[302471]: mgr[py] Loading python module 'influx' Oct 14 06:01:38 localhost ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-mgr-np0005486733-primvu[302467]: 2025-10-14T10:01:37.999+0000 7f148dbdd140 -1 mgr[py] Module influx has missing NOTIFY_TYPES member Oct 14 06:01:38 localhost ceph-mgr[302471]: mgr[py] Module influx has missing NOTIFY_TYPES member Oct 14 06:01:38 localhost ceph-mgr[302471]: mgr[py] Loading python module 'insights' Oct 14 06:01:38 localhost podman[302630]: 2025-10-14 10:01:38.008917128 +0000 UTC m=+0.212393505 container exec_died b19a91314e045cbe5465f6abdeb6b420108fcda7860b498b01dcd4e65236d2c8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-crash-np0005486733, RELEASE=main, vendor=Red Hat, Inc., GIT_CLEAN=True, release=553, io.openshift.tags=rhceph ceph, vcs-type=git, version=7, GIT_BRANCH=main, distribution-scope=public, io.openshift.expose-services=, com.redhat.component=rhceph-container, name=rhceph, maintainer=Guillaume Abrioux , description=Red Hat Ceph Storage 7, architecture=x86_64, CEPH_POINT_RELEASE=, 
com.redhat.license_terms=https://www.redhat.com/agreements, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-09-24T08:57:55, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9) Oct 14 06:01:38 localhost ceph-mgr[302471]: mgr[py] Loading python module 'iostat' Oct 14 06:01:38 localhost ceph-mgr[302471]: mgr[py] Module iostat has missing NOTIFY_TYPES member Oct 14 06:01:38 localhost ceph-mgr[302471]: mgr[py] Loading python module 'k8sevents' Oct 14 06:01:38 localhost ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-mgr-np0005486733-primvu[302467]: 2025-10-14T10:01:38.118+0000 7f148dbdd140 -1 mgr[py] Module iostat has missing NOTIFY_TYPES member Oct 14 06:01:38 localhost ceph-mgr[302471]: mgr[py] Loading python module 'localpool' Oct 14 06:01:38 localhost ceph-mgr[302471]: mgr[py] Loading python module 'mds_autoscaler' Oct 14 06:01:38 localhost ceph-mgr[302471]: mgr[py] Loading python module 'mirroring' Oct 14 06:01:38 localhost ceph-mgr[302471]: mgr[py] Loading python module 'nfs' Oct 14 06:01:38 localhost openstack_network_exporter[250374]: ERROR 10:01:38 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Oct 14 06:01:38 localhost openstack_network_exporter[250374]: ERROR 10:01:38 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Oct 14 06:01:38 localhost openstack_network_exporter[250374]: ERROR 10:01:38 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Oct 
14 06:01:38 localhost openstack_network_exporter[250374]: ERROR 10:01:38 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Oct 14 06:01:38 localhost openstack_network_exporter[250374]: Oct 14 06:01:38 localhost openstack_network_exporter[250374]: ERROR 10:01:38 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Oct 14 06:01:38 localhost openstack_network_exporter[250374]: Oct 14 06:01:38 localhost nova_compute[297686]: 2025-10-14 10:01:38.809 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:01:38 localhost ceph-mgr[302471]: mgr[py] Module nfs has missing NOTIFY_TYPES member Oct 14 06:01:38 localhost ceph-mgr[302471]: mgr[py] Loading python module 'orchestrator' Oct 14 06:01:38 localhost ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-mgr-np0005486733-primvu[302467]: 2025-10-14T10:01:38.916+0000 7f148dbdd140 -1 mgr[py] Module nfs has missing NOTIFY_TYPES member Oct 14 06:01:39 localhost ceph-mgr[302471]: mgr[py] Module orchestrator has missing NOTIFY_TYPES member Oct 14 06:01:39 localhost ceph-mgr[302471]: mgr[py] Loading python module 'osd_perf_query' Oct 14 06:01:39 localhost ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-mgr-np0005486733-primvu[302467]: 2025-10-14T10:01:39.081+0000 7f148dbdd140 -1 mgr[py] Module orchestrator has missing NOTIFY_TYPES member Oct 14 06:01:39 localhost ceph-mgr[302471]: mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member Oct 14 06:01:39 localhost ceph-mgr[302471]: mgr[py] Loading python module 'osd_support' Oct 14 06:01:39 localhost ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-mgr-np0005486733-primvu[302467]: 2025-10-14T10:01:39.150+0000 7f148dbdd140 -1 mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member Oct 14 06:01:39 localhost ceph-mgr[302471]: mgr[py] Module osd_support has missing NOTIFY_TYPES member Oct 14 06:01:39 localhost ceph-mgr[302471]: mgr[py] Loading 
python module 'pg_autoscaler' Oct 14 06:01:39 localhost ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-mgr-np0005486733-primvu[302467]: 2025-10-14T10:01:39.208+0000 7f148dbdd140 -1 mgr[py] Module osd_support has missing NOTIFY_TYPES member Oct 14 06:01:39 localhost ceph-mgr[302471]: mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member Oct 14 06:01:39 localhost ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-mgr-np0005486733-primvu[302467]: 2025-10-14T10:01:39.280+0000 7f148dbdd140 -1 mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member Oct 14 06:01:39 localhost ceph-mgr[302471]: mgr[py] Loading python module 'progress' Oct 14 06:01:39 localhost ceph-mgr[302471]: mgr[py] Module progress has missing NOTIFY_TYPES member Oct 14 06:01:39 localhost ceph-mgr[302471]: mgr[py] Loading python module 'prometheus' Oct 14 06:01:39 localhost ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-mgr-np0005486733-primvu[302467]: 2025-10-14T10:01:39.345+0000 7f148dbdd140 -1 mgr[py] Module progress has missing NOTIFY_TYPES member Oct 14 06:01:39 localhost ceph-mgr[302471]: mgr[py] Module prometheus has missing NOTIFY_TYPES member Oct 14 06:01:39 localhost ceph-mgr[302471]: mgr[py] Loading python module 'rbd_support' Oct 14 06:01:39 localhost ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-mgr-np0005486733-primvu[302467]: 2025-10-14T10:01:39.676+0000 7f148dbdd140 -1 mgr[py] Module prometheus has missing NOTIFY_TYPES member Oct 14 06:01:39 localhost ceph-mgr[302471]: mgr[py] Module rbd_support has missing NOTIFY_TYPES member Oct 14 06:01:39 localhost ceph-mgr[302471]: mgr[py] Loading python module 'restful' Oct 14 06:01:39 localhost ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-mgr-np0005486733-primvu[302467]: 2025-10-14T10:01:39.760+0000 7f148dbdd140 -1 mgr[py] Module rbd_support has missing NOTIFY_TYPES member Oct 14 06:01:39 localhost ceph-mgr[302471]: mgr[py] Loading python module 'rgw' Oct 14 06:01:40 localhost ceph-mgr[302471]: mgr[py] Module rgw has missing NOTIFY_TYPES member Oct 14 06:01:40 
localhost ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-mgr-np0005486733-primvu[302467]: 2025-10-14T10:01:40.124+0000 7f148dbdd140 -1 mgr[py] Module rgw has missing NOTIFY_TYPES member Oct 14 06:01:40 localhost ceph-mgr[302471]: mgr[py] Loading python module 'rook' Oct 14 06:01:40 localhost ceph-mgr[302471]: mgr[py] Module rook has missing NOTIFY_TYPES member Oct 14 06:01:40 localhost ceph-mgr[302471]: mgr[py] Loading python module 'selftest' Oct 14 06:01:40 localhost ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-mgr-np0005486733-primvu[302467]: 2025-10-14T10:01:40.620+0000 7f148dbdd140 -1 mgr[py] Module rook has missing NOTIFY_TYPES member Oct 14 06:01:40 localhost ceph-mgr[302471]: mgr[py] Module selftest has missing NOTIFY_TYPES member Oct 14 06:01:40 localhost ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-mgr-np0005486733-primvu[302467]: 2025-10-14T10:01:40.686+0000 7f148dbdd140 -1 mgr[py] Module selftest has missing NOTIFY_TYPES member Oct 14 06:01:40 localhost ceph-mgr[302471]: mgr[py] Loading python module 'snap_schedule' Oct 14 06:01:40 localhost ceph-mgr[302471]: mgr[py] Loading python module 'stats' Oct 14 06:01:40 localhost ceph-mgr[302471]: mgr[py] Loading python module 'status' Oct 14 06:01:40 localhost ceph-mgr[302471]: mgr[py] Module status has missing NOTIFY_TYPES member Oct 14 06:01:40 localhost ceph-mgr[302471]: mgr[py] Loading python module 'telegraf' Oct 14 06:01:40 localhost ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-mgr-np0005486733-primvu[302467]: 2025-10-14T10:01:40.887+0000 7f148dbdd140 -1 mgr[py] Module status has missing NOTIFY_TYPES member Oct 14 06:01:40 localhost ceph-mgr[302471]: mgr[py] Module telegraf has missing NOTIFY_TYPES member Oct 14 06:01:40 localhost ceph-mgr[302471]: mgr[py] Loading python module 'telemetry' Oct 14 06:01:40 localhost ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-mgr-np0005486733-primvu[302467]: 2025-10-14T10:01:40.949+0000 7f148dbdd140 -1 mgr[py] Module telegraf has missing NOTIFY_TYPES member Oct 14 06:01:41 localhost 
ceph-mgr[302471]: mgr[py] Module telemetry has missing NOTIFY_TYPES member Oct 14 06:01:41 localhost ceph-mgr[302471]: mgr[py] Loading python module 'test_orchestrator' Oct 14 06:01:41 localhost ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-mgr-np0005486733-primvu[302467]: 2025-10-14T10:01:41.087+0000 7f148dbdd140 -1 mgr[py] Module telemetry has missing NOTIFY_TYPES member Oct 14 06:01:41 localhost nova_compute[297686]: 2025-10-14 10:01:41.209 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:01:41 localhost ceph-mgr[302471]: mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member Oct 14 06:01:41 localhost ceph-mgr[302471]: mgr[py] Loading python module 'volumes' Oct 14 06:01:41 localhost ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-mgr-np0005486733-primvu[302467]: 2025-10-14T10:01:41.240+0000 7f148dbdd140 -1 mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member Oct 14 06:01:41 localhost ceph-mgr[302471]: mgr[py] Module volumes has missing NOTIFY_TYPES member Oct 14 06:01:41 localhost ceph-mgr[302471]: mgr[py] Loading python module 'zabbix' Oct 14 06:01:41 localhost ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-mgr-np0005486733-primvu[302467]: 2025-10-14T10:01:41.429+0000 7f148dbdd140 -1 mgr[py] Module volumes has missing NOTIFY_TYPES member Oct 14 06:01:41 localhost ceph-mgr[302471]: mgr[py] Module zabbix has missing NOTIFY_TYPES member Oct 14 06:01:41 localhost ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-mgr-np0005486733-primvu[302467]: 2025-10-14T10:01:41.491+0000 7f148dbdd140 -1 mgr[py] Module zabbix has missing NOTIFY_TYPES member Oct 14 06:01:41 localhost ceph-mgr[302471]: ms_deliver_dispatch: unhandled message 0x55da44b97600 mon_map magic: 0 from mon.0 v2:172.18.0.103:3300/0 Oct 14 06:01:41 localhost ceph-mgr[302471]: client.0 ms_handle_reset on v2:172.18.0.103:6800/3165030492 Oct 14 06:01:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run 
fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29. Oct 14 06:01:41 localhost podman[302786]: 2025-10-14 10:01:41.740456573 +0000 UTC m=+0.080837399 container health_status fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, config_id=iscsid, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, container_name=iscsid, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}) Oct 14 06:01:41 localhost podman[302786]: 2025-10-14 10:01:41.75804463 +0000 UTC m=+0.098425396 container exec_died fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 
(image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, container_name=iscsid, config_id=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251009, tcib_managed=true, org.label-schema.license=GPLv2) Oct 14 06:01:41 localhost systemd[1]: fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29.service: Deactivated successfully. Oct 14 06:01:42 localhost ceph-mgr[302471]: client.0 ms_handle_reset on v2:172.18.0.103:6800/3165030492 Oct 14 06:01:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad. 
Oct 14 06:01:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec. Oct 14 06:01:43 localhost podman[302805]: 2025-10-14 10:01:43.717367412 +0000 UTC m=+0.065493959 container health_status 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251009, config_id=multipathd, container_name=multipathd, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d) Oct 14 
06:01:43 localhost podman[302805]: 2025-10-14 10:01:43.727880774 +0000 UTC m=+0.076007251 container exec_died 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251009, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, container_name=multipathd, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true) Oct 14 06:01:43 localhost systemd[1]: 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad.service: Deactivated successfully. 
Oct 14 06:01:43 localhost podman[302806]: 2025-10-14 10:01:43.770252307 +0000 UTC m=+0.116751874 container health_status aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm) Oct 14 06:01:43 localhost podman[302806]: 2025-10-14 10:01:43.774653382 +0000 UTC m=+0.121152939 container exec_died aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 
'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm) Oct 14 06:01:43 localhost systemd[1]: aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec.service: Deactivated successfully. Oct 14 06:01:43 localhost nova_compute[297686]: 2025-10-14 10:01:43.809 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:01:46 localhost nova_compute[297686]: 2025-10-14 10:01:46.252 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:01:48 localhost nova_compute[297686]: 2025-10-14 10:01:48.810 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:01:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f. 
Oct 14 06:01:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917. Oct 14 06:01:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d. Oct 14 06:01:51 localhost podman[303575]: 2025-10-14 10:01:51.179504244 +0000 UTC m=+0.095089294 container health_status 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS) Oct 14 06:01:51 localhost podman[303577]: 2025-10-14 10:01:51.236777533 +0000 UTC m=+0.144687809 container health_status 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d 
(image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=edpm, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3) Oct 14 06:01:51 localhost podman[303575]: 2025-10-14 10:01:51.241058993 +0000 UTC m=+0.156644023 container exec_died 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, 
container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3) Oct 14 06:01:51 localhost podman[303577]: 2025-10-14 10:01:51.24815334 +0000 UTC m=+0.156063596 container exec_died 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 
'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute) Oct 14 06:01:51 localhost systemd[1]: 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f.service: Deactivated successfully. Oct 14 06:01:51 localhost nova_compute[297686]: 2025-10-14 10:01:51.253 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:01:51 localhost systemd[1]: 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d.service: Deactivated successfully. Oct 14 06:01:51 localhost podman[303576]: 2025-10-14 10:01:51.344625374 +0000 UTC m=+0.257495511 container health_status 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, io.openshift.tags=minimal rhel9, version=9.6, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., architecture=x86_64, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, managed_by=edpm_ansible, 
com.redhat.component=ubi9-minimal-container, release=1755695350, maintainer=Red Hat, Inc., vcs-type=git, io.buildah.version=1.33.7, container_name=openstack_network_exporter) Oct 14 06:01:51 localhost podman[303576]: 2025-10-14 10:01:51.355812476 +0000 UTC m=+0.268682633 container exec_died 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, vcs-type=git, name=ubi9-minimal, config_id=edpm, version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., architecture=x86_64, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, managed_by=edpm_ansible, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41) Oct 14 06:01:51 localhost systemd[1]: 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917.service: Deactivated successfully. 
Oct 14 06:01:51 localhost podman[303658]: Oct 14 06:01:51 localhost podman[303658]: 2025-10-14 10:01:51.428588648 +0000 UTC m=+0.121777708 container create 445fced27f5334a596c8588b0e7fa5c5f8ab06267585e1e6c8a4723fd22e5553 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=heuristic_joliot, build-date=2025-09-24T08:57:55, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, architecture=x86_64, release=553, RELEASE=main, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, io.buildah.version=1.33.12, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_BRANCH=main, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, version=7, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, maintainer=Guillaume Abrioux , GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git) Oct 14 06:01:51 localhost podman[303658]: 2025-10-14 10:01:51.342331574 +0000 UTC m=+0.035520634 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Oct 14 06:01:51 localhost systemd[1]: Started libpod-conmon-445fced27f5334a596c8588b0e7fa5c5f8ab06267585e1e6c8a4723fd22e5553.scope. Oct 14 06:01:51 localhost systemd[1]: Started libcrun container. 
Oct 14 06:01:51 localhost podman[303658]: 2025-10-14 10:01:51.490216759 +0000 UTC m=+0.183405829 container init 445fced27f5334a596c8588b0e7fa5c5f8ab06267585e1e6c8a4723fd22e5553 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=heuristic_joliot, io.buildah.version=1.33.12, maintainer=Guillaume Abrioux , distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_CLEAN=True, release=553, name=rhceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, ceph=True, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., GIT_BRANCH=main, RELEASE=main, build-date=2025-09-24T08:57:55, CEPH_POINT_RELEASE=, architecture=x86_64) Oct 14 06:01:51 localhost podman[303658]: 2025-10-14 10:01:51.499649577 +0000 UTC m=+0.192838627 container start 445fced27f5334a596c8588b0e7fa5c5f8ab06267585e1e6c8a4723fd22e5553 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=heuristic_joliot, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, GIT_CLEAN=True, release=553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, ceph=True, distribution-scope=public, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, description=Red Hat Ceph Storage 7, RELEASE=main, 
maintainer=Guillaume Abrioux , name=rhceph, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-09-24T08:57:55, io.openshift.expose-services=, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, io.buildah.version=1.33.12, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, architecture=x86_64, version=7) Oct 14 06:01:51 localhost podman[303658]: 2025-10-14 10:01:51.499860744 +0000 UTC m=+0.193049844 container attach 445fced27f5334a596c8588b0e7fa5c5f8ab06267585e1e6c8a4723fd22e5553 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=heuristic_joliot, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , RELEASE=main, name=rhceph, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., build-date=2025-09-24T08:57:55, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vcs-type=git, com.redhat.component=rhceph-container, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, architecture=x86_64, io.buildah.version=1.33.12, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, distribution-scope=public, release=553, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553) Oct 14 06:01:51 localhost heuristic_joliot[303680]: 167 167 Oct 14 06:01:51 localhost systemd[1]: 
libpod-445fced27f5334a596c8588b0e7fa5c5f8ab06267585e1e6c8a4723fd22e5553.scope: Deactivated successfully. Oct 14 06:01:51 localhost podman[303658]: 2025-10-14 10:01:51.5069425 +0000 UTC m=+0.200131550 container died 445fced27f5334a596c8588b0e7fa5c5f8ab06267585e1e6c8a4723fd22e5553 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=heuristic_joliot, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, version=7, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.33.12, name=rhceph, description=Red Hat Ceph Storage 7, RELEASE=main, maintainer=Guillaume Abrioux , release=553, vcs-type=git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, CEPH_POINT_RELEASE=, architecture=x86_64, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, build-date=2025-09-24T08:57:55, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=) Oct 14 06:01:51 localhost podman[303685]: 2025-10-14 10:01:51.615477903 +0000 UTC m=+0.100057786 container remove 445fced27f5334a596c8588b0e7fa5c5f8ab06267585e1e6c8a4723fd22e5553 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=heuristic_joliot, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux , CEPH_POINT_RELEASE=, build-date=2025-09-24T08:57:55, vendor=Red Hat, Inc., GIT_BRANCH=main, ceph=True, name=rhceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, description=Red Hat Ceph Storage 7, distribution-scope=public, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, architecture=x86_64, io.openshift.expose-services=, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, RELEASE=main, release=553, GIT_CLEAN=True, vcs-type=git) Oct 14 06:01:51 localhost systemd[1]: libpod-conmon-445fced27f5334a596c8588b0e7fa5c5f8ab06267585e1e6c8a4723fd22e5553.scope: Deactivated successfully. Oct 14 06:01:51 localhost podman[303701]: Oct 14 06:01:51 localhost podman[303701]: 2025-10-14 10:01:51.732044531 +0000 UTC m=+0.079761835 container create 41e0e823b399c1e7da1edd9dac5622d0820fde97f9008285882516b42a8aec73 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=strange_satoshi, RELEASE=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, version=7, name=rhceph, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, ceph=True, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-09-24T08:57:55, com.redhat.component=rhceph-container, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.buildah.version=1.33.12, GIT_CLEAN=True, CEPH_POINT_RELEASE=, architecture=x86_64, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux , release=553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, 
org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, distribution-scope=public, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7) Oct 14 06:01:51 localhost systemd[1]: Started libpod-conmon-41e0e823b399c1e7da1edd9dac5622d0820fde97f9008285882516b42a8aec73.scope. Oct 14 06:01:51 localhost systemd[1]: Started libcrun container. Oct 14 06:01:51 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8c549c3d93ee3de03eb58b6abf5687fde9038bb2f9fdfa137ff38eaf58be52ab/merged/tmp/config supports timestamps until 2038 (0x7fffffff) Oct 14 06:01:51 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8c549c3d93ee3de03eb58b6abf5687fde9038bb2f9fdfa137ff38eaf58be52ab/merged/tmp/keyring supports timestamps until 2038 (0x7fffffff) Oct 14 06:01:51 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8c549c3d93ee3de03eb58b6abf5687fde9038bb2f9fdfa137ff38eaf58be52ab/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff) Oct 14 06:01:51 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8c549c3d93ee3de03eb58b6abf5687fde9038bb2f9fdfa137ff38eaf58be52ab/merged/var/lib/ceph/mon/ceph-np0005486733 supports timestamps until 2038 (0x7fffffff) Oct 14 06:01:51 localhost podman[303701]: 2025-10-14 10:01:51.798739327 +0000 UTC m=+0.146456621 container init 41e0e823b399c1e7da1edd9dac5622d0820fde97f9008285882516b42a8aec73 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=strange_satoshi, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, release=553, distribution-scope=public, vendor=Red Hat, Inc., io.openshift.expose-services=, io.openshift.tags=rhceph ceph, version=7, build-date=2025-09-24T08:57:55, architecture=x86_64, GIT_CLEAN=True, maintainer=Guillaume Abrioux , io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph 
Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, GIT_BRANCH=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, CEPH_POINT_RELEASE=, ceph=True, io.buildah.version=1.33.12, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0) Oct 14 06:01:51 localhost podman[303701]: 2025-10-14 10:01:51.700376285 +0000 UTC m=+0.048093599 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Oct 14 06:01:51 localhost podman[303701]: 2025-10-14 10:01:51.808947019 +0000 UTC m=+0.156664313 container start 41e0e823b399c1e7da1edd9dac5622d0820fde97f9008285882516b42a8aec73 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=strange_satoshi, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vcs-type=git, maintainer=Guillaume Abrioux , release=553, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.buildah.version=1.33.12, name=rhceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vendor=Red Hat, Inc., architecture=x86_64, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, CEPH_POINT_RELEASE=, build-date=2025-09-24T08:57:55, ceph=True, GIT_BRANCH=main, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, 
io.openshift.tags=rhceph ceph) Oct 14 06:01:51 localhost podman[303701]: 2025-10-14 10:01:51.809168416 +0000 UTC m=+0.156885710 container attach 41e0e823b399c1e7da1edd9dac5622d0820fde97f9008285882516b42a8aec73 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=strange_satoshi, CEPH_POINT_RELEASE=, distribution-scope=public, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.buildah.version=1.33.12, name=rhceph, ceph=True, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, RELEASE=main, maintainer=Guillaume Abrioux , release=553, GIT_CLEAN=True, com.redhat.component=rhceph-container, build-date=2025-09-24T08:57:55, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., io.openshift.expose-services=, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3) Oct 14 06:01:51 localhost systemd[1]: libpod-41e0e823b399c1e7da1edd9dac5622d0820fde97f9008285882516b42a8aec73.scope: Deactivated successfully. 
Oct 14 06:01:51 localhost podman[303701]: 2025-10-14 10:01:51.895135651 +0000 UTC m=+0.242852945 container died 41e0e823b399c1e7da1edd9dac5622d0820fde97f9008285882516b42a8aec73 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=strange_satoshi, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, distribution-scope=public, io.openshift.tags=rhceph ceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., GIT_BRANCH=main, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, io.buildah.version=1.33.12, io.k8s.description=Red Hat Ceph Storage 7, release=553, build-date=2025-09-24T08:57:55, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=Guillaume Abrioux , GIT_CLEAN=True, version=7, name=rhceph, architecture=x86_64, description=Red Hat Ceph Storage 7) Oct 14 06:01:51 localhost podman[303743]: 2025-10-14 10:01:51.982644902 +0000 UTC m=+0.078869029 container remove 41e0e823b399c1e7da1edd9dac5622d0820fde97f9008285882516b42a8aec73 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=strange_satoshi, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.license_terms=https://www.redhat.com/agreements, RELEASE=main, CEPH_POINT_RELEASE=, io.openshift.expose-services=, ceph=True, 
GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.33.12, build-date=2025-09-24T08:57:55, vcs-type=git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, version=7, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, GIT_BRANCH=main, maintainer=Guillaume Abrioux , com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, release=553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d) Oct 14 06:01:51 localhost systemd[1]: libpod-conmon-41e0e823b399c1e7da1edd9dac5622d0820fde97f9008285882516b42a8aec73.scope: Deactivated successfully. Oct 14 06:01:52 localhost systemd[1]: Reloading. Oct 14 06:01:52 localhost systemd-rc-local-generator[303779]: /etc/rc.d/rc.local is not marked executable, skipping. Oct 14 06:01:52 localhost systemd-sysv-generator[303782]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Oct 14 06:01:52 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Oct 14 06:01:52 localhost systemd[1]: Reloading. Oct 14 06:01:52 localhost systemd-rc-local-generator[303824]: /etc/rc.d/rc.local is not marked executable, skipping. Oct 14 06:01:52 localhost systemd-sysv-generator[303829]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Oct 14 06:01:52 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. 
Support for MemoryLimit= will be removed soon. Oct 14 06:01:52 localhost systemd[1]: Starting Ceph mon.np0005486733 for fcadf6e2-9176-5818-a8d0-37b19acf8eaf... Oct 14 06:01:53 localhost podman[303888]: Oct 14 06:01:53 localhost podman[303888]: 2025-10-14 10:01:53.183863252 +0000 UTC m=+0.080632832 container create 294d8462825af3565a6306a21ed744f62ac8682a5af619c3ad636b16a763a986 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-mon-np0005486733, vcs-type=git, io.openshift.tags=rhceph ceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, distribution-scope=public, io.openshift.expose-services=, version=7, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/agreements, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.33.12, name=rhceph, GIT_BRANCH=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, release=553, CEPH_POINT_RELEASE=, build-date=2025-09-24T08:57:55, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main) Oct 14 06:01:53 localhost systemd[1]: tmp-crun.R37CB7.mount: Deactivated successfully. 
Oct 14 06:01:53 localhost podman[303888]: 2025-10-14 10:01:53.152931148 +0000 UTC m=+0.049700758 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Oct 14 06:01:53 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/794e3357b9201828e8ded51a8263ea3f0ad4db588fac4a39a3be526ce9d784ee/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff) Oct 14 06:01:53 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/794e3357b9201828e8ded51a8263ea3f0ad4db588fac4a39a3be526ce9d784ee/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff) Oct 14 06:01:53 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/794e3357b9201828e8ded51a8263ea3f0ad4db588fac4a39a3be526ce9d784ee/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff) Oct 14 06:01:53 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/794e3357b9201828e8ded51a8263ea3f0ad4db588fac4a39a3be526ce9d784ee/merged/var/lib/ceph/mon/ceph-np0005486733 supports timestamps until 2038 (0x7fffffff) Oct 14 06:01:53 localhost podman[303888]: 2025-10-14 10:01:53.269596299 +0000 UTC m=+0.166365879 container init 294d8462825af3565a6306a21ed744f62ac8682a5af619c3ad636b16a763a986 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-mon-np0005486733, name=rhceph, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/agreements, RELEASE=main, io.openshift.expose-services=, vcs-type=git, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, version=7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=553, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, GIT_CLEAN=True, 
GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, build-date=2025-09-24T08:57:55, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux , architecture=x86_64, com.redhat.component=rhceph-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.buildah.version=1.33.12, distribution-scope=public, vendor=Red Hat, Inc.) Oct 14 06:01:53 localhost podman[303888]: 2025-10-14 10:01:53.284983899 +0000 UTC m=+0.181753449 container start 294d8462825af3565a6306a21ed744f62ac8682a5af619c3ad636b16a763a986 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-mon-np0005486733, maintainer=Guillaume Abrioux , vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, name=rhceph, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, release=553, CEPH_POINT_RELEASE=, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.buildah.version=1.33.12, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-type=git) Oct 14 06:01:53 localhost bash[303888]: 294d8462825af3565a6306a21ed744f62ac8682a5af619c3ad636b16a763a986 Oct 14 06:01:53 localhost systemd[1]: Started Ceph mon.np0005486733 for 
fcadf6e2-9176-5818-a8d0-37b19acf8eaf. Oct 14 06:01:53 localhost ceph-mon[303906]: set uid:gid to 167:167 (ceph:ceph) Oct 14 06:01:53 localhost ceph-mon[303906]: ceph version 18.2.1-361.el9cp (439dcd6094d413840eb2ec590fe2194ec616687f) reef (stable), process ceph-mon, pid 2 Oct 14 06:01:53 localhost ceph-mon[303906]: pidfile_write: ignore empty --pid-file Oct 14 06:01:53 localhost ceph-mon[303906]: load: jerasure load: lrc Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: RocksDB version: 7.9.2 Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: Git sha 0 Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: Compile date 2025-09-23 00:00:00 Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: DB SUMMARY Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: DB Session ID: ABXPLKSCGGN31DHJT8DW Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: CURRENT file: CURRENT Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: IDENTITY file: IDENTITY Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: MANIFEST file: MANIFEST-000005 size: 59 Bytes Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: SST files in /var/lib/ceph/mon/ceph-np0005486733/store.db dir, Total Num: 0, files: Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: Write Ahead Log file in /var/lib/ceph/mon/ceph-np0005486733/store.db: 000004.log size: 761 ; Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: Options.error_if_exists: 0 Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: Options.create_if_missing: 0 Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: Options.paranoid_checks: 1 Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: Options.flush_verify_memtable_count: 1 Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: Options.track_and_verify_wals_in_manifest: 0 Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: Options.verify_sst_unique_id_in_manifest: 1 Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: Options.env: 0x5581e33149e0 Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: 
Options.fs: PosixFileSystem Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: Options.info_log: 0x5581e5250d20 Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: Options.max_file_opening_threads: 16 Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: Options.statistics: (nil) Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: Options.use_fsync: 0 Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: Options.max_log_file_size: 0 Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: Options.max_manifest_file_size: 1073741824 Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: Options.log_file_time_to_roll: 0 Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: Options.keep_log_file_num: 1000 Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: Options.recycle_log_file_num: 0 Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: Options.allow_fallocate: 1 Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: Options.allow_mmap_reads: 0 Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: Options.allow_mmap_writes: 0 Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: Options.use_direct_reads: 0 Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: Options.use_direct_io_for_flush_and_compaction: 0 Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: Options.create_missing_column_families: 0 Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: Options.db_log_dir: Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: Options.wal_dir: Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: Options.table_cache_numshardbits: 6 Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: Options.WAL_ttl_seconds: 0 Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: Options.WAL_size_limit_MB: 0 Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: Options.max_write_batch_group_size_bytes: 1048576 Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: Options.manifest_preallocation_size: 4194304 Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: Options.is_fd_close_on_exec: 1 Oct 14 
06:01:53 localhost ceph-mon[303906]: rocksdb: Options.advise_random_on_open: 1 Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: Options.db_write_buffer_size: 0 Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: Options.write_buffer_manager: 0x5581e5261540 Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: Options.access_hint_on_compaction_start: 1 Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: Options.random_access_max_buffer_size: 1048576 Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: Options.use_adaptive_mutex: 0 Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: Options.rate_limiter: (nil) Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: Options.sst_file_manager.rate_bytes_per_sec: 0 Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: Options.wal_recovery_mode: 2 Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: Options.enable_thread_tracking: 0 Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: Options.enable_pipelined_write: 0 Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: Options.unordered_write: 0 Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: Options.allow_concurrent_memtable_write: 1 Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: Options.enable_write_thread_adaptive_yield: 1 Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: Options.write_thread_max_yield_usec: 100 Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: Options.write_thread_slow_yield_usec: 3 Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: Options.row_cache: None Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: Options.wal_filter: None Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: Options.avoid_flush_during_recovery: 0 Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: Options.allow_ingest_behind: 0 Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: Options.two_write_queues: 0 Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: Options.manual_wal_flush: 0 Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: 
Options.wal_compression: 0 Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: Options.atomic_flush: 0 Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: Options.avoid_unnecessary_blocking_io: 0 Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: Options.persist_stats_to_disk: 0 Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: Options.write_dbid_to_manifest: 0 Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: Options.log_readahead_size: 0 Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: Options.file_checksum_gen_factory: Unknown Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: Options.best_efforts_recovery: 0 Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: Options.max_bgerror_resume_count: 2147483647 Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: Options.bgerror_resume_retry_interval: 1000000 Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: Options.allow_data_in_errors: 0 Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: Options.db_host_id: __hostname__ Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: Options.enforce_single_del_contracts: true Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: Options.max_background_jobs: 2 Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: Options.max_background_compactions: -1 Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: Options.max_subcompactions: 1 Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: Options.avoid_flush_during_shutdown: 0 Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: Options.writable_file_max_buffer_size: 1048576 Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: Options.delayed_write_rate : 16777216 Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: Options.max_total_wal_size: 0 Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: Options.delete_obsolete_files_period_micros: 21600000000 Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: Options.stats_dump_period_sec: 600 Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: 
Options.stats_persist_period_sec: 600 Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: Options.stats_history_buffer_size: 1048576 Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: Options.max_open_files: -1 Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: Options.bytes_per_sync: 0 Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: Options.wal_bytes_per_sync: 0 Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: Options.strict_bytes_per_sync: 0 Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: Options.compaction_readahead_size: 0 Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: Options.max_background_flushes: -1 Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: Compression algorithms supported: Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: #011kZSTD supported: 0 Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: #011kXpressCompression supported: 0 Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: #011kBZip2Compression supported: 0 Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: #011kZSTDNotFinalCompression supported: 0 Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: #011kLZ4Compression supported: 1 Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: #011kZlibCompression supported: 1 Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: #011kLZ4HCCompression supported: 1 Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: #011kSnappyCompression supported: 1 Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: Fast CRC32 supported: Supported on x86 Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: DMutex implementation: pthread_mutex_t Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: /var/lib/ceph/mon/ceph-np0005486733/store.db/MANIFEST-000005 Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]: Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: Options.comparator: 
leveldb.BytewiseComparator Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: Options.merge_operator: Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: Options.compaction_filter: None Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: Options.compaction_filter_factory: None Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: Options.sst_partitioner_factory: None Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: Options.memtable_factory: SkipListFactory Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: Options.table_factory: BlockBasedTable Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5581e5250980)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x5581e524d350#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 536870912#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2 Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: Options.write_buffer_size: 33554432 Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: 
Options.max_write_buffer_number: 2 Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: Options.compression: NoCompression Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: Options.bottommost_compression: Disabled Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: Options.prefix_extractor: nullptr Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: Options.num_levels: 7 Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: Options.min_write_buffer_number_to_merge: 1 Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: Options.bottommost_compression_opts.level: 32767 Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: Options.bottommost_compression_opts.strategy: 0 Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: Options.bottommost_compression_opts.enabled: false Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: Options.compression_opts.window_bits: -14 Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: Options.compression_opts.level: 32767 Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: 
Options.compression_opts.strategy: 0 Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: Options.compression_opts.max_dict_bytes: 0 Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: Options.compression_opts.parallel_threads: 1 Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: Options.compression_opts.enabled: false Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: Options.level0_file_num_compaction_trigger: 4 Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: Options.level0_slowdown_writes_trigger: 20 Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: Options.level0_stop_writes_trigger: 36 Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: Options.target_file_size_base: 67108864 Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: Options.target_file_size_multiplier: 1 Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: Options.max_bytes_for_level_base: 268435456 Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: Options.level_compaction_dynamic_level_bytes: 1 Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: Options.max_bytes_for_level_multiplier: 10.000000 Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: 
Options.max_bytes_for_level_multiplier_addtl[5]: 1 Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: Options.max_sequential_skip_in_iterations: 8 Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: Options.max_compaction_bytes: 1677721600 Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: Options.arena_block_size: 1048576 Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: Options.disable_auto_compactions: 0 Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: Options.compaction_style: kCompactionStyleLevel Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: Options.compaction_options_universal.size_ratio: 1 Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Oct 14 06:01:53 
localhost ceph-mon[303906]: rocksdb: Options.table_properties_collectors: Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: Options.inplace_update_support: 0 Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: Options.inplace_update_num_locks: 10000 Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: Options.memtable_whole_key_filtering: 0 Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: Options.memtable_huge_page_size: 0 Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: Options.bloom_locality: 0 Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: Options.max_successive_merges: 0 Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: Options.optimize_filters_for_hits: 0 Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: Options.paranoid_file_checks: 0 Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: Options.force_consistency_checks: 1 Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: Options.report_bg_io_stats: 0 Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: Options.ttl: 2592000 Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: Options.periodic_compaction_seconds: 0 Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: Options.preclude_last_level_data_seconds: 0 Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: Options.preserve_internal_time_seconds: 0 Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: Options.enable_blob_files: false Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: Options.min_blob_size: 0 Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: Options.blob_file_size: 268435456 Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: Options.blob_compression_type: NoCompression Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: Options.enable_blob_garbage_collection: false Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 Oct 14 06:01:53 localhost 
ceph-mon[303906]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: Options.blob_compaction_readahead_size: 0
Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: Options.blob_file_starting_level: 0
Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:/var/lib/ceph/mon/ceph-np0005486733/store.db/MANIFEST-000005 succeeded,manifest_file_number is 5, next_file_number is 7, last_sequence is 0, log_number is 0,prev_log_number is 0,max_column_family is 0,min_log_number_to_keep is 0
Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 0
Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: 84a9ba57-7643-4e19-a68b-d3c5f7942ded
Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760436113344838, "job": 1, "event": "recovery_started", "wal_files": [4]}
Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #4 mode 2
Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760436113347479, "cf_name": "default", "job": 1, "event": "table_file_creation", "file_number": 8, "file_size": 1887, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 1, "largest_seqno": 5, "table_properties": {"data_size": 773, "index_size": 31, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 115, "raw_average_key_size": 23, "raw_value_size": 651, "raw_average_value_size": 130, "num_data_blocks": 1, "num_entries": 5, "num_filter_entries": 5, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1760436113, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "84a9ba57-7643-4e19-a68b-d3c5f7942ded", "db_session_id": "ABXPLKSCGGN31DHJT8DW", "orig_file_number": 8, "seqno_to_time_mapping": "N/A"}}
Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760436113347595, "job": 1, "event": "recovery_finished"}
Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: [db/version_set.cc:5047] Creating manifest 10
Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005486733/store.db/000004.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: [db/db_impl/db_impl_open.cc:1987] SstFileManager instance 0x5581e5274e00
Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: DB pointer 0x5581e536a000
Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Oct 14 06:01:53 localhost ceph-mon[303906]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 0.0 total, 0.0 interval#012Cumulative writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 GB, 0.00 MB/s#012Cumulative WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s#012Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 L0 1/0 1.84 KB 0.2 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.7 0.00 0.00 1 0.003 0 0 0.0 0.0#012 Sum 1/0 1.84 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.7 0.00 0.00 1 0.003 0 0 0.0 0.0#012 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.7 0.00 0.00 1 0.003 0 0 0.0 0.0#012#012** Compaction Stats [default] **#012Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012User 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.7 0.00 0.00 1 0.003 0 0 0.0 0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 0.0 total, 0.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.12 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.12 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x5581e524d350#2 capacity: 512.00 MB usage: 0.22 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 0 last_secs: 2.3e-05 secs_since: 0#012Block cache entry stats(count,size,portion): FilterBlock(1,0.11 KB,2.08616e-05%) IndexBlock(1,0.11 KB,2.08616e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **
Oct 14 06:01:53 localhost ceph-mon[303906]: mon.np0005486733 does not exist in monmap, will attempt to join an existing cluster
Oct 14 06:01:53 localhost ceph-mon[303906]: using public_addr v2:172.18.0.108:0/0 -> [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0]
Oct 14 06:01:53 localhost ceph-mon[303906]: starting mon.np0005486733 rank -1 at public addrs [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] at bind addrs [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] mon_data /var/lib/ceph/mon/ceph-np0005486733 fsid fcadf6e2-9176-5818-a8d0-37b19acf8eaf
Oct 14 06:01:53 localhost ceph-mon[303906]: mon.np0005486733@-1(???) e0 preinit fsid fcadf6e2-9176-5818-a8d0-37b19acf8eaf
Oct 14 06:01:53 localhost ceph-mon[303906]: mon.np0005486733@-1(synchronizing) e3 sync_obtain_latest_monmap
Oct 14 06:01:53 localhost ceph-mon[303906]: mon.np0005486733@-1(synchronizing) e3 sync_obtain_latest_monmap obtained monmap e3
Oct 14 06:01:53 localhost systemd[1]: tmp-crun.ox0Fem.mount: Deactivated successfully.
Oct 14 06:01:53 localhost ceph-mon[303906]: mon.np0005486733@-1(synchronizing).mds e16 new map
Oct 14 06:01:53 localhost ceph-mon[303906]: mon.np0005486733@-1(synchronizing).mds e16 print_map#012e16#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,12=quiesce subvolumes}#012legacy client fscid: 1#012 #012Filesystem 'cephfs' (1)#012fs_name#011cephfs#012epoch#01115#012flags#01112 joinable allow_snaps allow_multimds_snaps#012created#0112025-10-14T08:11:54.831494+0000#012modified#0112025-10-14T10:00:48.835986+0000#012tableserver#0110#012root#0110#012session_timeout#01160#012session_autoclose#011300#012max_file_size#0111099511627776#012required_client_features#011{}#012last_failure#0110#012last_failure_osd_epoch#01178#012compat#011compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,12=quiesce subvolumes}#012max_mds#0111#012in#0110#012up#011{0=26888}#012failed#011#012damaged#011#012stopped#011#012data_pools#011[6]#012metadata_pool#0117#012inline_data#011disabled#012balancer#011#012bal_rank_mask#011-1#012standby_count_wanted#0111#012qdb_cluster#011leader: 26888 members: 26888#012[mds.mds.np0005486732.xkownj{0:26888} state up:active seq 13 addr [v2:172.18.0.107:6808/1205328170,v1:172.18.0.107:6809/1205328170] compat {c=[1],r=[1],i=[17ff]}]#012 #012 #012Standby daemons:#012 #012[mds.mds.np0005486733.tvstmf{-1:17244} state up:standby seq 1 addr [v2:172.18.0.108:6808/3626555326,v1:172.18.0.108:6809/3626555326] compat {c=[1],r=[1],i=[17ff]}]#012[mds.mds.np0005486731.onyaog{-1:17256} state up:standby seq 1 addr [v2:172.18.0.106:6808/799411272,v1:172.18.0.106:6809/799411272] compat {c=[1],r=[1],i=[17ff]}]
Oct 14 06:01:53 localhost ceph-mon[303906]: mon.np0005486733@-1(synchronizing).osd e79 crush map has features 3314933000852226048, adjusting msgr requires
Oct 14 06:01:53 localhost ceph-mon[303906]: mon.np0005486733@-1(synchronizing).osd e79 crush map has features 288514051259236352, adjusting msgr requires
Oct 14 06:01:53 localhost ceph-mon[303906]: mon.np0005486733@-1(synchronizing).osd e79 crush map has features 288514051259236352, adjusting msgr requires
Oct 14 06:01:53 localhost ceph-mon[303906]: mon.np0005486733@-1(synchronizing).osd e79 crush map has features 288514051259236352, adjusting msgr requires
Oct 14 06:01:53 localhost ceph-mon[303906]: Removing key for mds.mds.np0005486730.hzolgi
Oct 14 06:01:53 localhost ceph-mon[303906]: Removing daemon mds.mds.np0005486729.iznaug from np0005486729.localdomain -- ports []
Oct 14 06:01:53 localhost ceph-mon[303906]: from='mgr.14120 172.18.0.103:0/1668635823' entity='mgr.np0005486728.giajub' cmd={"prefix": "auth rm", "entity": "mds.mds.np0005486729.iznaug"} : dispatch
Oct 14 06:01:53 localhost ceph-mon[303906]: from='mgr.14120 172.18.0.103:0/1668635823' entity='mgr.np0005486728.giajub' cmd='[{"prefix": "auth rm", "entity": "mds.mds.np0005486729.iznaug"}]': finished
Oct 14 06:01:53 localhost ceph-mon[303906]: from='mgr.14120 172.18.0.103:0/1668635823' entity='mgr.np0005486728.giajub'
Oct 14 06:01:53 localhost ceph-mon[303906]: from='mgr.14120 172.18.0.103:0/1668635823' entity='mgr.np0005486728.giajub'
Oct 14 06:01:53 localhost ceph-mon[303906]: Removing key for mds.mds.np0005486729.iznaug
Oct 14 06:01:53 localhost ceph-mon[303906]: from='mgr.14120 172.18.0.103:0/1668635823' entity='mgr.np0005486728.giajub'
Oct 14 06:01:53 localhost ceph-mon[303906]: from='mgr.14120 172.18.0.103:0/1668635823' entity='mgr.np0005486728.giajub'
Oct 14 06:01:53 localhost ceph-mon[303906]: from='mgr.14120 172.18.0.103:0/1668635823' entity='mgr.np0005486728.giajub'
Oct 14 06:01:53 localhost ceph-mon[303906]: from='mgr.14120 172.18.0.103:0/1668635823' entity='mgr.np0005486728.giajub'
Oct 14 06:01:53 localhost ceph-mon[303906]: from='mgr.14120 172.18.0.103:0/1668635823' entity='mgr.np0005486728.giajub'
Oct 14 06:01:53 localhost ceph-mon[303906]: from='mgr.14120 172.18.0.103:0/1668635823' entity='mgr.np0005486728.giajub' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Oct 14 06:01:53 localhost ceph-mon[303906]: from='mgr.14120 172.18.0.103:0/1668635823' entity='mgr.np0005486728.giajub'
Oct 14 06:01:53 localhost ceph-mon[303906]: from='mgr.14120 172.18.0.103:0/1668635823' entity='mgr.np0005486728.giajub' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Oct 14 06:01:53 localhost ceph-mon[303906]: from='mgr.14120 172.18.0.103:0/1668635823' entity='mgr.np0005486728.giajub'
Oct 14 06:01:53 localhost ceph-mon[303906]: from='mgr.14120 172.18.0.103:0/1668635823' entity='mgr.np0005486728.giajub'
Oct 14 06:01:53 localhost ceph-mon[303906]: from='mgr.14120 172.18.0.103:0/1668635823' entity='mgr.np0005486728.giajub'
Oct 14 06:01:53 localhost ceph-mon[303906]: Added label mgr to host np0005486731.localdomain
Oct 14 06:01:53 localhost ceph-mon[303906]: from='mgr.14120 172.18.0.103:0/1668635823' entity='mgr.np0005486728.giajub' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Oct 14 06:01:53 localhost ceph-mon[303906]: from='mgr.14120 172.18.0.103:0/1668635823' entity='mgr.np0005486728.giajub'
Oct 14 06:01:53 localhost ceph-mon[303906]: from='mgr.14120 172.18.0.103:0/1668635823' entity='mgr.np0005486728.giajub'
Oct 14 06:01:53 localhost ceph-mon[303906]: Added label mgr to host np0005486732.localdomain
Oct 14 06:01:53 localhost ceph-mon[303906]: from='mgr.14120 172.18.0.103:0/1668635823' entity='mgr.np0005486728.giajub' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Oct 14 06:01:53 localhost ceph-mon[303906]: from='mgr.14120 172.18.0.103:0/1668635823' entity='mgr.np0005486728.giajub'
Oct 14 06:01:53 localhost ceph-mon[303906]: from='mgr.14120 172.18.0.103:0/1668635823' entity='mgr.np0005486728.giajub'
Oct 14 06:01:53 localhost ceph-mon[303906]: Added label mgr to host np0005486733.localdomain
Oct 14 06:01:53 localhost ceph-mon[303906]: from='mgr.14120 172.18.0.103:0/1668635823' entity='mgr.np0005486728.giajub' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Oct 14 06:01:53 localhost ceph-mon[303906]: from='mgr.14120 172.18.0.103:0/1668635823' entity='mgr.np0005486728.giajub'
Oct 14 06:01:53 localhost ceph-mon[303906]: from='mgr.14120 172.18.0.103:0/1668635823' entity='mgr.np0005486728.giajub'
Oct 14 06:01:53 localhost ceph-mon[303906]: Saving service mgr spec with placement label:mgr
Oct 14 06:01:53 localhost ceph-mon[303906]: from='mgr.14120 172.18.0.103:0/1668635823' entity='mgr.np0005486728.giajub'
Oct 14 06:01:53 localhost ceph-mon[303906]: from='mgr.14120 172.18.0.103:0/1668635823' entity='mgr.np0005486728.giajub' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Oct 14 06:01:53 localhost ceph-mon[303906]: from='mgr.14120 172.18.0.103:0/1668635823' entity='mgr.np0005486728.giajub'
Oct 14 06:01:53 localhost ceph-mon[303906]: from='mgr.14120 172.18.0.103:0/1668635823' entity='mgr.np0005486728.giajub' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005486731.swasqz", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Oct 14 06:01:53 localhost ceph-mon[303906]: from='mgr.14120 172.18.0.103:0/1668635823' entity='mgr.np0005486728.giajub' cmd='[{"prefix": "auth get-or-create", "entity": "mgr.np0005486731.swasqz", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]': finished
Oct 14 06:01:53 localhost ceph-mon[303906]: Deploying daemon mgr.np0005486731.swasqz on np0005486731.localdomain
Oct 14 06:01:53 localhost ceph-mon[303906]: from='mgr.14120 172.18.0.103:0/1668635823' entity='mgr.np0005486728.giajub'
Oct 14 06:01:53 localhost ceph-mon[303906]: from='mgr.14120 172.18.0.103:0/1668635823' entity='mgr.np0005486728.giajub'
Oct 14 06:01:53 localhost ceph-mon[303906]: from='mgr.14120 172.18.0.103:0/1668635823' entity='mgr.np0005486728.giajub'
Oct 14 06:01:53 localhost ceph-mon[303906]: from='mgr.14120 172.18.0.103:0/1668635823' entity='mgr.np0005486728.giajub' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005486732.pasqzz", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Oct 14 06:01:53 localhost ceph-mon[303906]: from='mgr.14120 172.18.0.103:0/1668635823' entity='mgr.np0005486728.giajub' cmd='[{"prefix": "auth get-or-create", "entity": "mgr.np0005486732.pasqzz", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]': finished
Oct 14 06:01:53 localhost ceph-mon[303906]: Deploying daemon mgr.np0005486732.pasqzz on np0005486732.localdomain
Oct 14 06:01:53 localhost ceph-mon[303906]: from='mgr.14120 172.18.0.103:0/1668635823' entity='mgr.np0005486728.giajub'
Oct 14 06:01:53 localhost ceph-mon[303906]: Added label mon to host np0005486728.localdomain
Oct 14 06:01:53 localhost ceph-mon[303906]: from='mgr.14120 172.18.0.103:0/1668635823' entity='mgr.np0005486728.giajub'
Oct 14 06:01:53 localhost ceph-mon[303906]: Added label _admin to host np0005486728.localdomain
Oct 14 06:01:53 localhost ceph-mon[303906]: from='mgr.14120 172.18.0.103:0/1668635823' entity='mgr.np0005486728.giajub'
Oct 14 06:01:53 localhost ceph-mon[303906]: from='mgr.14120 172.18.0.103:0/1668635823' entity='mgr.np0005486728.giajub'
Oct 14 06:01:53 localhost ceph-mon[303906]: from='mgr.14120 172.18.0.103:0/1668635823' entity='mgr.np0005486728.giajub'
Oct 14 06:01:53 localhost ceph-mon[303906]: from='mgr.14120 172.18.0.103:0/1668635823' entity='mgr.np0005486728.giajub' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005486733.primvu", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Oct 14 06:01:53 localhost ceph-mon[303906]: from='mgr.14120 172.18.0.103:0/1668635823' entity='mgr.np0005486728.giajub' cmd='[{"prefix": "auth get-or-create", "entity": "mgr.np0005486733.primvu", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]': finished
Oct 14 06:01:53 localhost ceph-mon[303906]: Deploying daemon mgr.np0005486733.primvu on np0005486733.localdomain
Oct 14 06:01:53 localhost ceph-mon[303906]: from='mgr.14120 172.18.0.103:0/1668635823' entity='mgr.np0005486728.giajub'
Oct 14 06:01:53 localhost ceph-mon[303906]: Added label mon to host np0005486729.localdomain
Oct 14 06:01:53 localhost ceph-mon[303906]: from='mgr.14120 172.18.0.103:0/1668635823' entity='mgr.np0005486728.giajub'
Oct 14 06:01:53 localhost ceph-mon[303906]: Added label _admin to host np0005486729.localdomain
Oct 14 06:01:53 localhost ceph-mon[303906]: from='mgr.14120 172.18.0.103:0/1668635823' entity='mgr.np0005486728.giajub'
Oct 14 06:01:53 localhost ceph-mon[303906]: from='mgr.14120 172.18.0.103:0/1668635823' entity='mgr.np0005486728.giajub'
Oct 14 06:01:53 localhost ceph-mon[303906]: from='mgr.14120 172.18.0.103:0/1668635823' entity='mgr.np0005486728.giajub'
Oct 14 06:01:53 localhost ceph-mon[303906]: from='mgr.14120 172.18.0.103:0/1668635823' entity='mgr.np0005486728.giajub'
Oct 14 06:01:53 localhost ceph-mon[303906]: from='mgr.14120 172.18.0.103:0/1668635823' entity='mgr.np0005486728.giajub'
Oct 14 06:01:53 localhost ceph-mon[303906]: Added label mon to host np0005486730.localdomain
Oct 14 06:01:53 localhost ceph-mon[303906]: from='mgr.14120 172.18.0.103:0/1668635823' entity='mgr.np0005486728.giajub'
Oct 14 06:01:53 localhost ceph-mon[303906]: from='mgr.14120 172.18.0.103:0/1668635823' entity='mgr.np0005486728.giajub'
Oct 14 06:01:53 localhost ceph-mon[303906]: from='mgr.14120 172.18.0.103:0/1668635823' entity='mgr.np0005486728.giajub'
Oct 14 06:01:53 localhost ceph-mon[303906]: from='mgr.14120 172.18.0.103:0/1668635823' entity='mgr.np0005486728.giajub'
Oct 14 06:01:53 localhost ceph-mon[303906]: from='mgr.14120 172.18.0.103:0/1668635823' entity='mgr.np0005486728.giajub'
Oct 14 06:01:53 localhost ceph-mon[303906]: from='mgr.14120 172.18.0.103:0/1668635823' entity='mgr.np0005486728.giajub'
Oct 14 06:01:53 localhost ceph-mon[303906]: from='mgr.14120 172.18.0.103:0/1668635823' entity='mgr.np0005486728.giajub'
Oct 14 06:01:53 localhost ceph-mon[303906]: from='mgr.14120 172.18.0.103:0/1668635823' entity='mgr.np0005486728.giajub' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Oct 14 06:01:53 localhost ceph-mon[303906]: from='mgr.14120 172.18.0.103:0/1668635823' entity='mgr.np0005486728.giajub'
Oct 14 06:01:53 localhost ceph-mon[303906]: from='mgr.14120 172.18.0.103:0/1668635823' entity='mgr.np0005486728.giajub'
Oct 14 06:01:53 localhost ceph-mon[303906]: from='mgr.14120 172.18.0.103:0/1668635823' entity='mgr.np0005486728.giajub' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Oct 14 06:01:53 localhost ceph-mon[303906]: from='mgr.14120 172.18.0.103:0/1668635823' entity='mgr.np0005486728.giajub'
Oct 14 06:01:53 localhost ceph-mon[303906]: Added label _admin to host np0005486730.localdomain
Oct 14 06:01:53 localhost ceph-mon[303906]: from='mgr.14120 172.18.0.103:0/1668635823' entity='mgr.np0005486728.giajub'
Oct 14 06:01:53 localhost ceph-mon[303906]: from='mgr.14120 172.18.0.103:0/1668635823' entity='mgr.np0005486728.giajub' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Oct 14 06:01:53 localhost ceph-mon[303906]: from='mgr.14120 172.18.0.103:0/1668635823' entity='mgr.np0005486728.giajub'
Oct 14 06:01:53 localhost ceph-mon[303906]: Added label mon to host np0005486731.localdomain
Oct 14 06:01:53 localhost ceph-mon[303906]: from='mgr.14120 172.18.0.103:0/1668635823' entity='mgr.np0005486728.giajub'
Oct 14 06:01:53 localhost ceph-mon[303906]: from='mgr.14120 172.18.0.103:0/1668635823' entity='mgr.np0005486728.giajub' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Oct 14 06:01:53 localhost ceph-mon[303906]: Added label _admin to host np0005486731.localdomain
Oct 14 06:01:53 localhost ceph-mon[303906]: Updating np0005486731.localdomain:/etc/ceph/ceph.conf
Oct 14 06:01:53 localhost ceph-mon[303906]: Updating np0005486731.localdomain:/var/lib/ceph/fcadf6e2-9176-5818-a8d0-37b19acf8eaf/config/ceph.conf
Oct 14 06:01:53 localhost ceph-mon[303906]: from='mgr.14120 172.18.0.103:0/1668635823' entity='mgr.np0005486728.giajub'
Oct 14 06:01:53 localhost ceph-mon[303906]: from='mgr.14120 172.18.0.103:0/1668635823' entity='mgr.np0005486728.giajub'
Oct 14 06:01:53 localhost ceph-mon[303906]: Added label mon to host np0005486732.localdomain
Oct 14 06:01:53 localhost ceph-mon[303906]: Updating np0005486731.localdomain:/etc/ceph/ceph.client.admin.keyring
Oct 14 06:01:53 localhost ceph-mon[303906]: Updating np0005486731.localdomain:/var/lib/ceph/fcadf6e2-9176-5818-a8d0-37b19acf8eaf/config/ceph.client.admin.keyring
Oct 14 06:01:53 localhost ceph-mon[303906]: from='mgr.14120 172.18.0.103:0/1668635823' entity='mgr.np0005486728.giajub'
Oct 14 06:01:53 localhost ceph-mon[303906]: from='mgr.14120 172.18.0.103:0/1668635823' entity='mgr.np0005486728.giajub'
Oct 14 06:01:53 localhost ceph-mon[303906]: from='mgr.14120 172.18.0.103:0/1668635823' entity='mgr.np0005486728.giajub'
Oct 14 06:01:53 localhost ceph-mon[303906]: from='mgr.14120 172.18.0.103:0/1668635823' entity='mgr.np0005486728.giajub'
Oct 14 06:01:53 localhost ceph-mon[303906]: Added label _admin to host np0005486732.localdomain
Oct 14 06:01:53 localhost ceph-mon[303906]: from='mgr.14120 172.18.0.103:0/1668635823' entity='mgr.np0005486728.giajub' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Oct 14 06:01:53 localhost ceph-mon[303906]: from='mgr.14120 172.18.0.103:0/1668635823' entity='mgr.np0005486728.giajub'
Oct 14 06:01:53 localhost ceph-mon[303906]: Updating np0005486732.localdomain:/etc/ceph/ceph.conf
Oct 14 06:01:53 localhost ceph-mon[303906]: Added label mon to host np0005486733.localdomain
Oct 14 06:01:53 localhost ceph-mon[303906]: Updating np0005486732.localdomain:/var/lib/ceph/fcadf6e2-9176-5818-a8d0-37b19acf8eaf/config/ceph.conf
Oct 14 06:01:53 localhost ceph-mon[303906]: from='mgr.14120 172.18.0.103:0/1668635823' entity='mgr.np0005486728.giajub'
Oct 14 06:01:53 localhost ceph-mon[303906]: Updating np0005486732.localdomain:/etc/ceph/ceph.client.admin.keyring
Oct 14 06:01:53 localhost ceph-mon[303906]: Added label _admin to host np0005486733.localdomain
Oct 14 06:01:53 localhost ceph-mon[303906]: Updating np0005486732.localdomain:/var/lib/ceph/fcadf6e2-9176-5818-a8d0-37b19acf8eaf/config/ceph.client.admin.keyring
Oct 14 06:01:53 localhost ceph-mon[303906]: from='mgr.14120 172.18.0.103:0/1668635823' entity='mgr.np0005486728.giajub'
Oct 14 06:01:53 localhost ceph-mon[303906]: from='mgr.14120 172.18.0.103:0/1668635823' entity='mgr.np0005486728.giajub'
Oct 14 06:01:53 localhost ceph-mon[303906]: from='mgr.14120 172.18.0.103:0/1668635823' entity='mgr.np0005486728.giajub'
Oct 14 06:01:53 localhost ceph-mon[303906]: Saving service mon spec with placement label:mon
Oct 14 06:01:53 localhost ceph-mon[303906]: from='mgr.14120 172.18.0.103:0/1668635823' entity='mgr.np0005486728.giajub'
Oct 14 06:01:53 localhost ceph-mon[303906]: from='mgr.14120 172.18.0.103:0/1668635823' entity='mgr.np0005486728.giajub' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Oct 14 06:01:53 localhost ceph-mon[303906]: Updating np0005486733.localdomain:/etc/ceph/ceph.conf
Oct 14 06:01:53 localhost ceph-mon[303906]: from='mgr.14120 172.18.0.103:0/1668635823' entity='mgr.np0005486728.giajub'
Oct 14 06:01:53 localhost ceph-mon[303906]: Updating np0005486733.localdomain:/var/lib/ceph/fcadf6e2-9176-5818-a8d0-37b19acf8eaf/config/ceph.conf
Oct 14 06:01:53 localhost ceph-mon[303906]: Updating np0005486733.localdomain:/etc/ceph/ceph.client.admin.keyring
Oct 14 06:01:53 localhost ceph-mon[303906]: Updating np0005486733.localdomain:/var/lib/ceph/fcadf6e2-9176-5818-a8d0-37b19acf8eaf/config/ceph.client.admin.keyring
Oct 14 06:01:53 localhost ceph-mon[303906]: from='mgr.14120 172.18.0.103:0/1668635823' entity='mgr.np0005486728.giajub'
Oct 14 06:01:53 localhost ceph-mon[303906]: from='mgr.14120 172.18.0.103:0/1668635823' entity='mgr.np0005486728.giajub'
Oct 14 06:01:53 localhost ceph-mon[303906]: from='mgr.14120 172.18.0.103:0/1668635823' entity='mgr.np0005486728.giajub'
Oct 14 06:01:53 localhost ceph-mon[303906]: from='mgr.14120 172.18.0.103:0/1668635823' entity='mgr.np0005486728.giajub' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Oct 14 06:01:53 localhost ceph-mon[303906]: Deploying daemon mon.np0005486733 on np0005486733.localdomain
Oct 14 06:01:53 localhost ceph-mon[303906]: mon.np0005486733@-1(synchronizing).paxosservice(auth 1..34) refresh upgraded, format 0 -> 3
Oct 14 06:01:53 localhost ceph-mgr[302471]: ms_deliver_dispatch: unhandled message 0x55da44b96f20 mon_map magic: 0 from mon.0 v2:172.18.0.103:3300/0
Oct 14 06:01:53 localhost nova_compute[297686]: 2025-10-14 10:01:53.814 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 14 06:01:55 localhost ceph-mon[303906]: mon.np0005486733@-1(probing) e4 my rank is now 3 (was -1)
Oct 14 06:01:55 localhost ceph-mon[303906]: log_channel(cluster) log [INF] : mon.np0005486733 calling monitor election
Oct 14 06:01:55 localhost ceph-mon[303906]: paxos.3).electionLogic(0) init, first boot, initializing epoch at 1
Oct 14 06:01:55 localhost ceph-mon[303906]: mon.np0005486733@3(electing) e4 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial
Oct 14 06:01:56 localhost ceph-mon[303906]: mon.np0005486733@3(electing) e4 adding peer [v2:172.18.0.107:3300/0,v1:172.18.0.107:6789/0] to list of hints
Oct 14 06:01:56 localhost nova_compute[297686]: 2025-10-14 10:01:56.256 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 14 06:01:56 localhost ceph-mon[303906]: mon.np0005486733@3(electing) e4 adding peer [v2:172.18.0.107:3300/0,v1:172.18.0.107:6789/0] to list of hints
Oct 14 06:01:57 localhost ovn_metadata_agent[163050]: 2025-10-14 10:01:57.770 163055 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct 14 06:01:57 localhost ovn_metadata_agent[163050]: 2025-10-14 10:01:57.771 163055 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct 14 06:01:57 localhost ovn_metadata_agent[163050]: 2025-10-14 10:01:57.771 163055 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct 14 06:01:58 localhost podman[248187]: time="2025-10-14T10:01:58Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 14 06:01:58 localhost podman[248187]: @ - - [14/Oct/2025:10:01:58 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 147497 "" "Go-http-client/1.1"
Oct 14 06:01:58 localhost podman[248187]: @ - - [14/Oct/2025:10:01:58 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19836 "" "Go-http-client/1.1"
Oct 14 06:01:58 localhost ceph-mon[303906]: mon.np0005486733@3(electing) e4 adding peer [v2:172.18.0.107:3300/0,v1:172.18.0.107:6789/0] to list of hints
Oct 14 06:01:58 localhost ceph-mon[303906]: mon.np0005486733@3(electing) e4 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial
Oct 14 06:01:58 localhost ceph-mon[303906]: mon.np0005486733@3(peon) e4 _apply_compatset_features enabling new quorum features: compat={},rocompat={},incompat={4=support erasure code pools,5=new-style osdmap encoding,6=support isa/lrc erasure code,7=support shec erasure code}
Oct 14 06:01:58 localhost ceph-mon[303906]: mon.np0005486733@3(peon) e4 _apply_compatset_features enabling new quorum features: compat={},rocompat={},incompat={8=support monmap features,9=luminous ondisk layout,10=mimic ondisk layout,11=nautilus ondisk layout,12=octopus ondisk layout,13=pacific ondisk layout,14=quincy ondisk layout,15=reef ondisk layout}
Oct 14 06:01:58 localhost ceph-mon[303906]: from='mgr.14120 172.18.0.103:0/1668635823' entity='mgr.np0005486728.giajub'
Oct 14 06:01:58 localhost ceph-mon[303906]: from='mgr.14120 172.18.0.103:0/1668635823' entity='mgr.np0005486728.giajub'
Oct 14 06:01:58 localhost ceph-mon[303906]: from='mgr.14120 172.18.0.103:0/1668635823' entity='mgr.np0005486728.giajub'
Oct 14 06:01:58 localhost ceph-mon[303906]: from='mgr.14120 172.18.0.103:0/1668635823' entity='mgr.np0005486728.giajub' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Oct 14 06:01:58 localhost ceph-mon[303906]: mon.np0005486733@3(peon) e4 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial
Oct 14 06:01:58 localhost ceph-mon[303906]: mgrc update_daemon_metadata mon.np0005486733 metadata {addrs=[v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0],arch=x86_64,ceph_release=reef,ceph_version=ceph version 18.2.1-361.el9cp (439dcd6094d413840eb2ec590fe2194ec616687f) reef (stable),ceph_version_short=18.2.1-361.el9cp,compression_algorithms=none, snappy, zlib, zstd, lz4,container_hostname=np0005486733.localdomain,container_image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest,cpu=AMD EPYC-Rome Processor,device_ids=,device_paths=vda=/dev/disk/by-path/pci-0000:00:04.0,devices=vda,distro=rhel,distro_description=Red Hat Enterprise Linux 9.6 (Plow),distro_version=9.6,hostname=np0005486733.localdomain,kernel_description=#1 SMP PREEMPT_DYNAMIC Wed Apr 12 10:45:03 EDT 2023,kernel_version=5.14.0-284.11.1.el9_2.x86_64,mem_swap_kb=1048572,mem_total_kb=16116612,os=Linux}
Oct 14 06:01:58 localhost ceph-mon[303906]: mon.np0005486728 calling monitor election
Oct 14 06:01:58 localhost ceph-mon[303906]: mon.np0005486730 calling monitor election
Oct 14 06:01:58 localhost ceph-mon[303906]: mon.np0005486729 calling monitor election
Oct 14 06:01:58 localhost ceph-mon[303906]: mon.np0005486733 calling monitor election
Oct 14 06:01:58 localhost ceph-mon[303906]: mon.np0005486728 is new leader, mons np0005486728,np0005486730,np0005486729,np0005486733 in quorum (ranks 0,1,2,3)
Oct 14 06:01:58 localhost ceph-mon[303906]: overall HEALTH_OK
Oct 14 06:01:58 localhost ceph-mon[303906]: from='mgr.14120 172.18.0.103:0/1668635823' entity='mgr.np0005486728.giajub'
Oct 14 06:01:58 localhost ceph-mon[303906]: from='mgr.14120 172.18.0.103:0/1668635823' entity='mgr.np0005486728.giajub'
Oct 14 06:01:58 localhost nova_compute[297686]: 2025-10-14 10:01:58.841 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 14 06:01:59 localhost ceph-mon[303906]: from='mgr.14120 172.18.0.103:0/1668635823' entity='mgr.np0005486728.giajub'
Oct 14 06:01:59 localhost ceph-mon[303906]: from='mgr.14120 172.18.0.103:0/1668635823' entity='mgr.np0005486728.giajub' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Oct 14 06:01:59 localhost ceph-mon[303906]: Deploying daemon mon.np0005486731 on np0005486731.localdomain
Oct 14 06:02:00 localhost ceph-mon[303906]: mon.np0005486733@3(peon) e4 adding peer [v2:172.18.0.107:3300/0,v1:172.18.0.107:6789/0] to list of hints
Oct 14 06:02:00 localhost ceph-mgr[302471]: ms_deliver_dispatch: unhandled message 0x55da44b97080 mon_map magic: 0 from mon.0 v2:172.18.0.103:3300/0
Oct 14 06:02:00 localhost ceph-mon[303906]: log_channel(cluster) log [INF] : mon.np0005486733 calling monitor election
Oct 14 06:02:00 localhost ceph-mon[303906]: paxos.3).electionLogic(18) init, last seen epoch 18
Oct 14 06:02:00 localhost ceph-mon[303906]: mon.np0005486733@3(electing) e5 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial
Oct 14 06:02:00 localhost ceph-mon[303906]: mon.np0005486733@3(electing) e5 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial
Oct 14 06:02:01 localhost nova_compute[297686]: 2025-10-14 10:02:01.290 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 14 06:02:01 localhost ceph-mon[303906]: mon.np0005486733@3(electing) e5 adding peer [v2:172.18.0.106:3300/0,v1:172.18.0.106:6789/0] to list of hints
Oct 14 06:02:01 localhost ceph-mon[303906]: mon.np0005486733@3(electing) e5 adding peer [v2:172.18.0.106:3300/0,v1:172.18.0.106:6789/0] to list of hints
Oct 14 06:02:01 localhost ceph-mon[303906]: mon.np0005486733@3(electing) e5 adding peer [v2:172.18.0.106:3300/0,v1:172.18.0.106:6789/0] to list of hints
Oct 14 06:02:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041.
Oct 14 06:02:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857.
Oct 14 06:02:03 localhost ceph-mon[303906]: mon.np0005486733@3(electing) e5 adding peer [v2:172.18.0.106:3300/0,v1:172.18.0.106:6789/0] to list of hints
Oct 14 06:02:03 localhost systemd[1]: tmp-crun.2g9USG.mount: Deactivated successfully.
Oct 14 06:02:03 localhost podman[303945]: 2025-10-14 10:02:03.75482634 +0000 UTC m=+0.090298877 container health_status 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Oct 14 06:02:03 localhost podman[303945]: 2025-10-14 10:02:03.769017083 +0000 UTC m=+0.104489590 container exec_died 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': 
['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter) Oct 14 06:02:03 localhost systemd[1]: 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041.service: Deactivated successfully. Oct 14 06:02:03 localhost nova_compute[297686]: 2025-10-14 10:02:03.881 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:02:03 localhost podman[303946]: 2025-10-14 10:02:03.887932444 +0000 UTC m=+0.219785141 container health_status 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_managed=true) Oct 14 06:02:03 localhost podman[303946]: 2025-10-14 10:02:03.919723614 +0000 UTC m=+0.251576351 container exec_died 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, 
container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0) Oct 14 06:02:03 localhost systemd[1]: 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857.service: Deactivated successfully. Oct 14 06:02:05 localhost ceph-mon[303906]: mon.np0005486733@3(electing) e5 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Oct 14 06:02:05 localhost ceph-mon[303906]: mon.np0005486733@3(peon) e5 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Oct 14 06:02:05 localhost ceph-mon[303906]: mon.np0005486728 calling monitor election Oct 14 06:02:05 localhost ceph-mon[303906]: mon.np0005486730 calling monitor election Oct 14 06:02:05 localhost ceph-mon[303906]: mon.np0005486729 calling monitor election Oct 14 06:02:05 localhost ceph-mon[303906]: mon.np0005486733 calling monitor election Oct 14 06:02:05 localhost ceph-mon[303906]: mon.np0005486732 calling monitor election Oct 14 06:02:05 localhost ceph-mon[303906]: mon.np0005486728 is new leader, mons np0005486728,np0005486730,np0005486729,np0005486733,np0005486732 in quorum (ranks 0,1,2,3,4) Oct 14 06:02:05 localhost ceph-mon[303906]: overall HEALTH_OK Oct 14 06:02:05 localhost ceph-mon[303906]: from='mgr.14120 172.18.0.103:0/1668635823' entity='mgr.np0005486728.giajub' Oct 14 06:02:05 localhost ceph-mon[303906]: from='mgr.14120 172.18.0.103:0/1668635823' entity='mgr.np0005486728.giajub' Oct 14 06:02:05 localhost ceph-mon[303906]: mon.np0005486733@3(peon) e5 adding peer [v2:172.18.0.106:3300/0,v1:172.18.0.106:6789/0] to list of hints Oct 14 06:02:05 localhost ceph-mon[303906]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable 
created with log file: #13. Immutable memtables: 0. Oct 14 06:02:05 localhost ceph-mon[303906]: rocksdb: (Original Log Time 2025/10/14-10:02:05.666733) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0 Oct 14 06:02:05 localhost ceph-mon[303906]: rocksdb: [db/flush_job.cc:856] [default] [JOB 3] Flushing memtable with next log file: 13 Oct 14 06:02:05 localhost ceph-mon[303906]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760436125666841, "job": 3, "event": "flush_started", "num_memtables": 1, "num_entries": 9832, "num_deletes": 255, "total_data_size": 10795757, "memory_usage": 11021808, "flush_reason": "Manual Compaction"} Oct 14 06:02:05 localhost ceph-mon[303906]: rocksdb: [db/flush_job.cc:885] [default] [JOB 3] Level-0 flush table #14: started Oct 14 06:02:05 localhost ceph-mon[303906]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760436125718002, "cf_name": "default", "job": 3, "event": "table_file_creation", "file_number": 14, "file_size": 9548295, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 6, "largest_seqno": 9837, "table_properties": {"data_size": 9493761, "index_size": 29788, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 23685, "raw_key_size": 249313, "raw_average_key_size": 26, "raw_value_size": 9331408, "raw_average_value_size": 987, "num_data_blocks": 1146, "num_entries": 9453, "num_filter_entries": 9453, "num_deletions": 255, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": 
"window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1760436113, "oldest_key_time": 1760436113, "file_creation_time": 1760436125, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "84a9ba57-7643-4e19-a68b-d3c5f7942ded", "db_session_id": "ABXPLKSCGGN31DHJT8DW", "orig_file_number": 14, "seqno_to_time_mapping": "N/A"}} Oct 14 06:02:05 localhost ceph-mon[303906]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 3] Flush lasted 51339 microseconds, and 22680 cpu microseconds. Oct 14 06:02:05 localhost ceph-mgr[302471]: ms_deliver_dispatch: unhandled message 0x55da44b971e0 mon_map magic: 0 from mon.0 v2:172.18.0.103:3300/0 Oct 14 06:02:05 localhost ceph-mon[303906]: rocksdb: (Original Log Time 2025/10/14-10:02:05.718078) [db/flush_job.cc:967] [default] [JOB 3] Level-0 flush table #14: 9548295 bytes OK Oct 14 06:02:05 localhost ceph-mon[303906]: rocksdb: (Original Log Time 2025/10/14-10:02:05.718108) [db/memtable_list.cc:519] [default] Level-0 commit table #14 started Oct 14 06:02:05 localhost ceph-mon[303906]: rocksdb: (Original Log Time 2025/10/14-10:02:05.720105) [db/memtable_list.cc:722] [default] Level-0 commit table #14: memtable #1 done Oct 14 06:02:05 localhost ceph-mon[303906]: rocksdb: (Original Log Time 2025/10/14-10:02:05.720131) EVENT_LOG_v1 {"time_micros": 1760436125720125, "job": 3, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [2, 0, 0, 0, 0, 0, 0], "immutable_memtables": 0} Oct 14 06:02:05 localhost ceph-mon[303906]: rocksdb: (Original Log Time 2025/10/14-10:02:05.720152) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: files[2 0 0 0 0 0 0] max score 0.50 Oct 14 06:02:05 localhost ceph-mon[303906]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 3] Try to delete WAL files size 10727422, prev total WAL file size 10730691, number of live WAL files 2. 
Oct 14 06:02:05 localhost ceph-mon[303906]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005486733/store.db/000009.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Oct 14 06:02:05 localhost ceph-mon[303906]: rocksdb: (Original Log Time 2025/10/14-10:02:05.721990) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003130323931' seq:72057594037927935, type:22 .. '7061786F73003130353433' seq:0, type:0; will stop at (end) Oct 14 06:02:05 localhost ceph-mon[303906]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 4] Compacting 2@0 files to L6, score -1.00 Oct 14 06:02:05 localhost ceph-mon[303906]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 3 Base level 0, inputs: [14(9324KB) 8(1887B)] Oct 14 06:02:05 localhost ceph-mon[303906]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760436125722111, "job": 4, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [14, 8], "score": -1, "input_data_size": 9550182, "oldest_snapshot_seqno": -1} Oct 14 06:02:05 localhost ceph-mon[303906]: log_channel(cluster) log [INF] : mon.np0005486733 calling monitor election Oct 14 06:02:05 localhost ceph-mon[303906]: paxos.3).electionLogic(22) init, last seen epoch 22 Oct 14 06:02:05 localhost ceph-mon[303906]: mon.np0005486733@3(electing) e6 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Oct 14 06:02:05 localhost ceph-mon[303906]: mon.np0005486733@3(electing) e6 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Oct 14 06:02:05 localhost ceph-mon[303906]: mon.np0005486733@3(electing) e6 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Oct 14 06:02:05 localhost ceph-mon[303906]: mon.np0005486733@3(electing) e6 collect_metadata vda: no unique device id 
for vda: fallback method has no model nor serial Oct 14 06:02:05 localhost ceph-mon[303906]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 4] Generated table #15: 9201 keys, 9544387 bytes, temperature: kUnknown Oct 14 06:02:05 localhost ceph-mon[303906]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760436125781273, "cf_name": "default", "job": 4, "event": "table_file_creation", "file_number": 15, "file_size": 9544387, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 9490521, "index_size": 29765, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 23045, "raw_key_size": 244506, "raw_average_key_size": 26, "raw_value_size": 9331466, "raw_average_value_size": 1014, "num_data_blocks": 1145, "num_entries": 9201, "num_filter_entries": 9201, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1760436113, "oldest_key_time": 0, "file_creation_time": 1760436125, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "84a9ba57-7643-4e19-a68b-d3c5f7942ded", "db_session_id": "ABXPLKSCGGN31DHJT8DW", "orig_file_number": 15, "seqno_to_time_mapping": "N/A"}} Oct 14 06:02:05 localhost ceph-mon[303906]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. 
Oct 14 06:02:05 localhost ceph-mon[303906]: rocksdb: (Original Log Time 2025/10/14-10:02:05.781547) [db/compaction/compaction_job.cc:1663] [default] [JOB 4] Compacted 2@0 files to L6 => 9544387 bytes Oct 14 06:02:05 localhost ceph-mon[303906]: rocksdb: (Original Log Time 2025/10/14-10:02:05.783546) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 161.2 rd, 161.2 wr, level 6, files in(2, 0) out(1 +0 blob) MB in(9.1, 0.0 +0.0 blob) out(9.1 +0.0 blob), read-write-amplify(2.0) write-amplify(1.0) OK, records in: 9458, records dropped: 257 output_compression: NoCompression Oct 14 06:02:05 localhost ceph-mon[303906]: rocksdb: (Original Log Time 2025/10/14-10:02:05.783578) EVENT_LOG_v1 {"time_micros": 1760436125783565, "job": 4, "event": "compaction_finished", "compaction_time_micros": 59226, "compaction_time_cpu_micros": 30818, "output_level": 6, "num_output_files": 1, "total_output_size": 9544387, "num_input_records": 9458, "num_output_records": 9201, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]} Oct 14 06:02:05 localhost ceph-mon[303906]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005486733/store.db/000014.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Oct 14 06:02:05 localhost ceph-mon[303906]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760436125784942, "job": 4, "event": "table_file_deletion", "file_number": 14} Oct 14 06:02:05 localhost ceph-mon[303906]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005486733/store.db/000008.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Oct 14 06:02:05 localhost ceph-mon[303906]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760436125784998, "job": 4, "event": 
"table_file_deletion", "file_number": 8} Oct 14 06:02:05 localhost ceph-mon[303906]: rocksdb: (Original Log Time 2025/10/14-10:02:05.721893) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Oct 14 06:02:06 localhost nova_compute[297686]: 2025-10-14 10:02:06.343 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:02:07 localhost podman[304113]: 2025-10-14 10:02:07.258240002 +0000 UTC m=+0.073449684 container exec b19a91314e045cbe5465f6abdeb6b420108fcda7860b498b01dcd4e65236d2c8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-crash-np0005486733, maintainer=Guillaume Abrioux , io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, io.buildah.version=1.33.12, release=553, ceph=True, description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.expose-services=, architecture=x86_64, RELEASE=main, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhceph, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.) 
Oct 14 06:02:07 localhost podman[304113]: 2025-10-14 10:02:07.342074041 +0000 UTC m=+0.157283703 container exec_died b19a91314e045cbe5465f6abdeb6b420108fcda7860b498b01dcd4e65236d2c8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-crash-np0005486733, release=553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vcs-type=git, version=7, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, name=rhceph, CEPH_POINT_RELEASE=, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, maintainer=Guillaume Abrioux , GIT_BRANCH=main, ceph=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, RELEASE=main, io.openshift.expose-services=, build-date=2025-09-24T08:57:55, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc.) 
Oct 14 06:02:08 localhost openstack_network_exporter[250374]: ERROR 10:02:08 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Oct 14 06:02:08 localhost openstack_network_exporter[250374]: ERROR 10:02:08 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Oct 14 06:02:08 localhost openstack_network_exporter[250374]: ERROR 10:02:08 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Oct 14 06:02:08 localhost openstack_network_exporter[250374]: ERROR 10:02:08 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Oct 14 06:02:08 localhost openstack_network_exporter[250374]: Oct 14 06:02:08 localhost openstack_network_exporter[250374]: ERROR 10:02:08 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Oct 14 06:02:08 localhost openstack_network_exporter[250374]: Oct 14 06:02:08 localhost nova_compute[297686]: 2025-10-14 10:02:08.883 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:02:10 localhost ceph-mon[303906]: mon.np0005486733@3(peon) e6 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Oct 14 06:02:10 localhost ceph-mon[303906]: mon.np0005486728 calling monitor election Oct 14 06:02:10 localhost ceph-mon[303906]: mon.np0005486733 calling monitor election Oct 14 06:02:10 localhost ceph-mon[303906]: mon.np0005486729 calling monitor election Oct 14 06:02:10 localhost ceph-mon[303906]: mon.np0005486730 calling monitor election Oct 14 06:02:10 localhost ceph-mon[303906]: mon.np0005486728 is new leader, mons np0005486728,np0005486730,np0005486729,np0005486733 in quorum (ranks 0,1,2,3) Oct 14 06:02:10 localhost ceph-mon[303906]: Health check failed: 2/6 mons down, quorum np0005486728,np0005486730,np0005486729,np0005486733 (MON_DOWN) Oct 14 
06:02:10 localhost ceph-mon[303906]: Health detail: HEALTH_WARN 2/6 mons down, quorum np0005486728,np0005486730,np0005486729,np0005486733 Oct 14 06:02:10 localhost ceph-mon[303906]: [WRN] MON_DOWN: 2/6 mons down, quorum np0005486728,np0005486730,np0005486729,np0005486733 Oct 14 06:02:10 localhost ceph-mon[303906]: mon.np0005486732 (rank 4) addr [v2:172.18.0.107:3300/0,v1:172.18.0.107:6789/0] is down (out of quorum) Oct 14 06:02:10 localhost ceph-mon[303906]: mon.np0005486731 (rank 5) addr [v2:172.18.0.106:3300/0,v1:172.18.0.106:6789/0] is down (out of quorum) Oct 14 06:02:10 localhost ceph-mon[303906]: from='mgr.14120 172.18.0.103:0/1668635823' entity='mgr.np0005486728.giajub' Oct 14 06:02:10 localhost ceph-mon[303906]: from='mgr.14120 172.18.0.103:0/1668635823' entity='mgr.np0005486728.giajub' Oct 14 06:02:11 localhost ceph-mon[303906]: mon.np0005486733@3(peon) e6 handle_auth_request failed to assign global_id Oct 14 06:02:11 localhost nova_compute[297686]: 2025-10-14 10:02:11.385 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:02:11 localhost ceph-mon[303906]: log_channel(cluster) log [INF] : mon.np0005486733 calling monitor election Oct 14 06:02:11 localhost ceph-mon[303906]: paxos.3).electionLogic(25) init, last seen epoch 25, mid-election, bumping Oct 14 06:02:11 localhost ceph-mon[303906]: mon.np0005486733@3(electing) e6 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Oct 14 06:02:11 localhost ceph-mon[303906]: mon.np0005486733@3(electing) e6 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Oct 14 06:02:11 localhost ceph-mon[303906]: mon.np0005486733@3(peon) e6 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Oct 14 06:02:11 localhost ceph-mon[303906]: mon.np0005486732 calling monitor election Oct 14 06:02:11 localhost ceph-mon[303906]: 
mon.np0005486731 calling monitor election Oct 14 06:02:11 localhost ceph-mon[303906]: mon.np0005486728 calling monitor election Oct 14 06:02:11 localhost ceph-mon[303906]: mon.np0005486730 calling monitor election Oct 14 06:02:11 localhost ceph-mon[303906]: mon.np0005486733 calling monitor election Oct 14 06:02:11 localhost ceph-mon[303906]: mon.np0005486729 calling monitor election Oct 14 06:02:11 localhost ceph-mon[303906]: mon.np0005486728 is new leader, mons np0005486728,np0005486730,np0005486729,np0005486733,np0005486732,np0005486731 in quorum (ranks 0,1,2,3,4,5) Oct 14 06:02:11 localhost ceph-mon[303906]: Health check cleared: MON_DOWN (was: 2/6 mons down, quorum np0005486728,np0005486730,np0005486729,np0005486733) Oct 14 06:02:11 localhost ceph-mon[303906]: Cluster is now healthy Oct 14 06:02:11 localhost ceph-mon[303906]: overall HEALTH_OK Oct 14 06:02:11 localhost ceph-mon[303906]: from='mgr.14120 172.18.0.103:0/1668635823' entity='mgr.np0005486728.giajub' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Oct 14 06:02:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29. 
Oct 14 06:02:12 localhost podman[304321]: 2025-10-14 10:02:12.049359363 +0000 UTC m=+0.091365701 container health_status fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, tcib_managed=true, container_name=iscsid, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, config_id=iscsid) Oct 14 06:02:12 localhost podman[304321]: 2025-10-14 10:02:12.086052763 +0000 UTC m=+0.128059111 container exec_died fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 
(image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, config_id=iscsid, org.label-schema.schema-version=1.0) Oct 14 06:02:12 localhost systemd[1]: fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29.service: Deactivated successfully. 
Oct 14 06:02:12 localhost ceph-mon[303906]: Updating np0005486728.localdomain:/etc/ceph/ceph.conf Oct 14 06:02:12 localhost ceph-mon[303906]: Updating np0005486729.localdomain:/etc/ceph/ceph.conf Oct 14 06:02:12 localhost ceph-mon[303906]: Updating np0005486730.localdomain:/etc/ceph/ceph.conf Oct 14 06:02:12 localhost ceph-mon[303906]: Updating np0005486731.localdomain:/etc/ceph/ceph.conf Oct 14 06:02:12 localhost ceph-mon[303906]: Updating np0005486732.localdomain:/etc/ceph/ceph.conf Oct 14 06:02:12 localhost ceph-mon[303906]: Updating np0005486733.localdomain:/etc/ceph/ceph.conf Oct 14 06:02:12 localhost ceph-mon[303906]: Updating np0005486729.localdomain:/var/lib/ceph/fcadf6e2-9176-5818-a8d0-37b19acf8eaf/config/ceph.conf Oct 14 06:02:13 localhost nova_compute[297686]: 2025-10-14 10:02:13.256 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:02:13 localhost nova_compute[297686]: 2025-10-14 10:02:13.919 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:02:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad. Oct 14 06:02:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec. 
Oct 14 06:02:14 localhost podman[304662]: 2025-10-14 10:02:14.023583942 +0000 UTC m=+0.083697736 container health_status aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Oct 14 06:02:14 localhost podman[304662]: 2025-10-14 10:02:14.036127384 +0000 UTC m=+0.096241198 container exec_died aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 
'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm) Oct 14 06:02:14 localhost systemd[1]: aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec.service: Deactivated successfully. Oct 14 06:02:14 localhost systemd[1]: tmp-crun.TkSSMf.mount: Deactivated successfully. 
Oct 14 06:02:14 localhost podman[304661]: 2025-10-14 10:02:14.092919888 +0000 UTC m=+0.154034433 container health_status 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_managed=true, io.buildah.version=1.41.3, container_name=multipathd, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d) Oct 14 06:02:14 localhost podman[304661]: 2025-10-14 10:02:14.135237221 +0000 UTC m=+0.196351766 container exec_died 
02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team) Oct 14 06:02:14 localhost systemd[1]: 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad.service: Deactivated successfully. 
Oct 14 06:02:14 localhost ceph-mon[303906]: Updating np0005486730.localdomain:/var/lib/ceph/fcadf6e2-9176-5818-a8d0-37b19acf8eaf/config/ceph.conf Oct 14 06:02:14 localhost ceph-mon[303906]: Updating np0005486728.localdomain:/var/lib/ceph/fcadf6e2-9176-5818-a8d0-37b19acf8eaf/config/ceph.conf Oct 14 06:02:14 localhost ceph-mon[303906]: Updating np0005486732.localdomain:/var/lib/ceph/fcadf6e2-9176-5818-a8d0-37b19acf8eaf/config/ceph.conf Oct 14 06:02:14 localhost ceph-mon[303906]: Updating np0005486733.localdomain:/var/lib/ceph/fcadf6e2-9176-5818-a8d0-37b19acf8eaf/config/ceph.conf Oct 14 06:02:14 localhost ceph-mon[303906]: Updating np0005486731.localdomain:/var/lib/ceph/fcadf6e2-9176-5818-a8d0-37b19acf8eaf/config/ceph.conf Oct 14 06:02:14 localhost ceph-mon[303906]: from='mgr.14120 172.18.0.103:0/1668635823' entity='mgr.np0005486728.giajub' Oct 14 06:02:14 localhost ceph-mon[303906]: from='mgr.14120 172.18.0.103:0/1668635823' entity='mgr.np0005486728.giajub' Oct 14 06:02:14 localhost ceph-mon[303906]: from='mgr.14120 172.18.0.103:0/1668635823' entity='mgr.np0005486728.giajub' Oct 14 06:02:14 localhost ceph-mon[303906]: from='mgr.14120 172.18.0.103:0/1668635823' entity='mgr.np0005486728.giajub' Oct 14 06:02:14 localhost ceph-mon[303906]: from='mgr.14120 172.18.0.103:0/1668635823' entity='mgr.np0005486728.giajub' Oct 14 06:02:14 localhost ceph-mon[303906]: from='mgr.14120 172.18.0.103:0/1668635823' entity='mgr.np0005486728.giajub' Oct 14 06:02:14 localhost ceph-mon[303906]: from='mgr.14120 172.18.0.103:0/1668635823' entity='mgr.np0005486728.giajub' Oct 14 06:02:14 localhost ceph-mon[303906]: from='mgr.14120 172.18.0.103:0/1668635823' entity='mgr.np0005486728.giajub' Oct 14 06:02:14 localhost ceph-mon[303906]: from='mgr.14120 172.18.0.103:0/1668635823' entity='mgr.np0005486728.giajub' Oct 14 06:02:14 localhost ceph-mon[303906]: from='mgr.14120 172.18.0.103:0/1668635823' entity='mgr.np0005486728.giajub' Oct 14 06:02:14 localhost ceph-mon[303906]: from='mgr.14120 
172.18.0.103:0/1668635823' entity='mgr.np0005486728.giajub' Oct 14 06:02:14 localhost ceph-mon[303906]: from='mgr.14120 172.18.0.103:0/1668635823' entity='mgr.np0005486728.giajub' Oct 14 06:02:14 localhost ceph-mon[303906]: from='mgr.14120 172.18.0.103:0/1668635823' entity='mgr.np0005486728.giajub' Oct 14 06:02:14 localhost ceph-mon[303906]: from='mgr.14120 172.18.0.103:0/1668635823' entity='mgr.np0005486728.giajub' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Oct 14 06:02:15 localhost nova_compute[297686]: 2025-10-14 10:02:15.256 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:02:15 localhost ceph-mon[303906]: Reconfiguring mon.np0005486728 (monmap changed)... Oct 14 06:02:15 localhost ceph-mon[303906]: Reconfiguring daemon mon.np0005486728 on np0005486728.localdomain Oct 14 06:02:15 localhost ceph-mon[303906]: from='mgr.14120 172.18.0.103:0/1668635823' entity='mgr.np0005486728.giajub' Oct 14 06:02:15 localhost ceph-mon[303906]: from='mgr.14120 172.18.0.103:0/1668635823' entity='mgr.np0005486728.giajub' Oct 14 06:02:15 localhost ceph-mon[303906]: from='mgr.14120 172.18.0.103:0/1668635823' entity='mgr.np0005486728.giajub' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005486728.giajub", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Oct 14 06:02:16 localhost nova_compute[297686]: 2025-10-14 10:02:16.251 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:02:16 localhost nova_compute[297686]: 2025-10-14 10:02:16.252 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - 
- - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:02:16 localhost ceph-mon[303906]: Reconfiguring mgr.np0005486728.giajub (monmap changed)... Oct 14 06:02:16 localhost ceph-mon[303906]: Reconfiguring daemon mgr.np0005486728.giajub on np0005486728.localdomain Oct 14 06:02:16 localhost ceph-mon[303906]: from='mgr.14120 172.18.0.103:0/1668635823' entity='mgr.np0005486728.giajub' Oct 14 06:02:16 localhost ceph-mon[303906]: from='mgr.14120 172.18.0.103:0/1668635823' entity='mgr.np0005486728.giajub' Oct 14 06:02:16 localhost ceph-mon[303906]: from='mgr.14120 172.18.0.103:0/1668635823' entity='mgr.np0005486728.giajub' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005486728.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Oct 14 06:02:16 localhost ceph-mon[303906]: from='mgr.14120 172.18.0.103:0/1668635823' entity='mgr.np0005486728.giajub' Oct 14 06:02:16 localhost nova_compute[297686]: 2025-10-14 10:02:16.447 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:02:17 localhost nova_compute[297686]: 2025-10-14 10:02:17.256 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:02:17 localhost nova_compute[297686]: 2025-10-14 10:02:17.256 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:02:17 localhost nova_compute[297686]: 2025-10-14 10:02:17.257 2 DEBUG nova.compute.manager [None 
req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Oct 14 06:02:17 localhost nova_compute[297686]: 2025-10-14 10:02:17.257 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:02:17 localhost ceph-mon[303906]: mon.np0005486733@3(peon) e6 handle_command mon_command({"prefix": "mgr stat", "format": "json"} v 0) Oct 14 06:02:17 localhost ceph-mon[303906]: log_channel(audit) log [DBG] : from='client.? 172.18.0.103:0/1436413359' entity='client.admin' cmd={"prefix": "mgr stat", "format": "json"} : dispatch Oct 14 06:02:17 localhost nova_compute[297686]: 2025-10-14 10:02:17.303 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Oct 14 06:02:17 localhost nova_compute[297686]: 2025-10-14 10:02:17.303 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Oct 14 06:02:17 localhost nova_compute[297686]: 2025-10-14 10:02:17.303 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Oct 14 06:02:17 localhost 
nova_compute[297686]: 2025-10-14 10:02:17.304 2 DEBUG nova.compute.resource_tracker [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Auditing locally available compute resources for np0005486733.localdomain (node: np0005486733.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Oct 14 06:02:17 localhost nova_compute[297686]: 2025-10-14 10:02:17.304 2 DEBUG oslo_concurrency.processutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Oct 14 06:02:17 localhost ceph-mon[303906]: Reconfiguring crash.np0005486728 (monmap changed)... Oct 14 06:02:17 localhost ceph-mon[303906]: Reconfiguring daemon crash.np0005486728 on np0005486728.localdomain Oct 14 06:02:17 localhost ceph-mon[303906]: from='mgr.14120 172.18.0.103:0/1668635823' entity='mgr.np0005486728.giajub' Oct 14 06:02:17 localhost ceph-mon[303906]: from='mgr.14120 172.18.0.103:0/1668635823' entity='mgr.np0005486728.giajub' Oct 14 06:02:17 localhost ceph-mon[303906]: from='mgr.14120 172.18.0.103:0/1668635823' entity='mgr.np0005486728.giajub' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005486729.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Oct 14 06:02:17 localhost ceph-mon[303906]: mon.np0005486733@3(peon) e6 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Oct 14 06:02:17 localhost ceph-mon[303906]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.108:0/160518194' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Oct 14 06:02:17 localhost nova_compute[297686]: 2025-10-14 10:02:17.782 2 DEBUG oslo_concurrency.processutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.478s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Oct 14 06:02:17 localhost nova_compute[297686]: 2025-10-14 10:02:17.850 2 DEBUG nova.virt.libvirt.driver [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Oct 14 06:02:17 localhost nova_compute[297686]: 2025-10-14 10:02:17.850 2 DEBUG nova.virt.libvirt.driver [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Oct 14 06:02:17 localhost nova_compute[297686]: 2025-10-14 10:02:17.972 2 WARNING nova.virt.libvirt.driver [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Oct 14 06:02:17 localhost nova_compute[297686]: 2025-10-14 10:02:17.973 2 DEBUG nova.compute.resource_tracker [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Hypervisor/Node resource view: name=np0005486733.localdomain free_ram=11446MB free_disk=41.83695602416992GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": 
"1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Oct 14 06:02:17 localhost nova_compute[297686]: 2025-10-14 10:02:17.973 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Oct 14 06:02:17 localhost nova_compute[297686]: 2025-10-14 10:02:17.974 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Oct 14 06:02:18 localhost nova_compute[297686]: 2025-10-14 10:02:18.305 2 DEBUG nova.compute.resource_tracker [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Instance 88c4e366-b765-47a6-96bf-f7677f2ce67c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Oct 14 06:02:18 localhost nova_compute[297686]: 2025-10-14 10:02:18.305 2 DEBUG nova.compute.resource_tracker [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Oct 14 06:02:18 localhost nova_compute[297686]: 2025-10-14 10:02:18.306 2 DEBUG nova.compute.resource_tracker [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Final resource view: name=np0005486733.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Oct 14 06:02:18 localhost ceph-mon[303906]: Reconfiguring crash.np0005486729 (monmap changed)... Oct 14 06:02:18 localhost ceph-mon[303906]: Reconfiguring daemon crash.np0005486729 on np0005486729.localdomain Oct 14 06:02:18 localhost ceph-mon[303906]: from='mgr.14120 172.18.0.103:0/1668635823' entity='mgr.np0005486728.giajub' Oct 14 06:02:18 localhost ceph-mon[303906]: from='mgr.14120 172.18.0.103:0/1668635823' entity='mgr.np0005486728.giajub' Oct 14 06:02:18 localhost ceph-mon[303906]: Reconfiguring mon.np0005486729 (monmap changed)... 
Oct 14 06:02:18 localhost ceph-mon[303906]: from='mgr.14120 172.18.0.103:0/1668635823' entity='mgr.np0005486728.giajub' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Oct 14 06:02:18 localhost ceph-mon[303906]: Reconfiguring daemon mon.np0005486729 on np0005486729.localdomain Oct 14 06:02:18 localhost nova_compute[297686]: 2025-10-14 10:02:18.339 2 DEBUG oslo_concurrency.processutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Oct 14 06:02:18 localhost ceph-mon[303906]: mon.np0005486733@3(peon) e6 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Oct 14 06:02:18 localhost ceph-mon[303906]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/1922163583' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Oct 14 06:02:18 localhost nova_compute[297686]: 2025-10-14 10:02:18.799 2 DEBUG oslo_concurrency.processutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.461s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Oct 14 06:02:18 localhost nova_compute[297686]: 2025-10-14 10:02:18.806 2 DEBUG nova.compute.provider_tree [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Inventory has not changed in ProviderTree for provider: 18c24273-aca2-4f08-be57-3188d558235e update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Oct 14 06:02:18 localhost nova_compute[297686]: 2025-10-14 10:02:18.823 2 DEBUG nova.scheduler.client.report [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Inventory has not changed for provider 18c24273-aca2-4f08-be57-3188d558235e based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 
8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Oct 14 06:02:18 localhost nova_compute[297686]: 2025-10-14 10:02:18.826 2 DEBUG nova.compute.resource_tracker [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Compute_service record updated for np0005486733.localdomain:np0005486733.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Oct 14 06:02:18 localhost nova_compute[297686]: 2025-10-14 10:02:18.827 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.853s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Oct 14 06:02:18 localhost ceph-mon[303906]: mon.np0005486733@3(peon).osd e79 _set_cache_ratios kv ratio 0.25 inc ratio 0.375 full ratio 0.375 Oct 14 06:02:18 localhost ceph-mon[303906]: mon.np0005486733@3(peon).osd e79 register_cache_with_pcm pcm target: 2147483648 pcm max: 1020054732 pcm min: 134217728 inc_osd_cache size: 1 Oct 14 06:02:18 localhost ceph-mon[303906]: mon.np0005486733@3(peon).osd e80 e80: 6 total, 6 up, 6 in Oct 14 06:02:18 localhost systemd[1]: session-24.scope: Deactivated successfully. Oct 14 06:02:18 localhost systemd[1]: session-17.scope: Deactivated successfully. Oct 14 06:02:18 localhost systemd-logind[760]: Session 24 logged out. Waiting for processes to exit. Oct 14 06:02:18 localhost systemd[1]: session-20.scope: Deactivated successfully. Oct 14 06:02:18 localhost systemd[1]: session-26.scope: Deactivated successfully. 
Oct 14 06:02:18 localhost systemd[1]: session-26.scope: Consumed 3min 31.000s CPU time. Oct 14 06:02:18 localhost systemd-logind[760]: Session 17 logged out. Waiting for processes to exit. Oct 14 06:02:18 localhost systemd[1]: session-16.scope: Deactivated successfully. Oct 14 06:02:18 localhost systemd[1]: session-22.scope: Deactivated successfully. Oct 14 06:02:18 localhost systemd[1]: session-23.scope: Deactivated successfully. Oct 14 06:02:18 localhost systemd-logind[760]: Session 26 logged out. Waiting for processes to exit. Oct 14 06:02:18 localhost systemd-logind[760]: Session 20 logged out. Waiting for processes to exit. Oct 14 06:02:18 localhost systemd-logind[760]: Session 22 logged out. Waiting for processes to exit. Oct 14 06:02:18 localhost systemd-logind[760]: Session 16 logged out. Waiting for processes to exit. Oct 14 06:02:18 localhost systemd-logind[760]: Session 23 logged out. Waiting for processes to exit. Oct 14 06:02:18 localhost nova_compute[297686]: 2025-10-14 10:02:18.950 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:02:18 localhost systemd[1]: session-14.scope: Deactivated successfully. Oct 14 06:02:18 localhost systemd-logind[760]: Removed session 24. Oct 14 06:02:18 localhost systemd[1]: session-19.scope: Deactivated successfully. Oct 14 06:02:18 localhost systemd[1]: session-25.scope: Deactivated successfully. Oct 14 06:02:18 localhost systemd-logind[760]: Session 14 logged out. Waiting for processes to exit. Oct 14 06:02:18 localhost systemd[1]: session-21.scope: Deactivated successfully. Oct 14 06:02:18 localhost systemd-logind[760]: Session 25 logged out. Waiting for processes to exit. Oct 14 06:02:18 localhost systemd-logind[760]: Session 19 logged out. Waiting for processes to exit. Oct 14 06:02:18 localhost systemd-logind[760]: Session 21 logged out. Waiting for processes to exit. 
Oct 14 06:02:18 localhost systemd-logind[760]: Removed session 17. Oct 14 06:02:18 localhost systemd-logind[760]: Removed session 20. Oct 14 06:02:18 localhost systemd-logind[760]: Removed session 26. Oct 14 06:02:18 localhost systemd-logind[760]: Removed session 16. Oct 14 06:02:18 localhost systemd-logind[760]: Removed session 22. Oct 14 06:02:18 localhost systemd-logind[760]: Removed session 23. Oct 14 06:02:18 localhost systemd-logind[760]: Removed session 14. Oct 14 06:02:18 localhost systemd[1]: session-18.scope: Deactivated successfully. Oct 14 06:02:18 localhost systemd-logind[760]: Removed session 19. Oct 14 06:02:18 localhost systemd-logind[760]: Session 18 logged out. Waiting for processes to exit. Oct 14 06:02:18 localhost systemd-logind[760]: Removed session 25. Oct 14 06:02:18 localhost systemd-logind[760]: Removed session 21. Oct 14 06:02:18 localhost systemd-logind[760]: Removed session 18. Oct 14 06:02:19 localhost sshd[304747]: main: sshd: ssh-rsa algorithm is disabled Oct 14 06:02:19 localhost systemd-logind[760]: New session 67 of user ceph-admin. Oct 14 06:02:19 localhost systemd[1]: Started Session 67 of User ceph-admin. Oct 14 06:02:19 localhost ceph-mon[303906]: from='mgr.14120 172.18.0.103:0/1668635823' entity='mgr.np0005486728.giajub' Oct 14 06:02:19 localhost ceph-mon[303906]: from='mgr.14120 172.18.0.103:0/1668635823' entity='mgr.np0005486728.giajub' Oct 14 06:02:19 localhost ceph-mon[303906]: Reconfiguring mgr.np0005486729.xpybho (monmap changed)... Oct 14 06:02:19 localhost ceph-mon[303906]: from='mgr.14120 172.18.0.103:0/1668635823' entity='mgr.np0005486728.giajub' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005486729.xpybho", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Oct 14 06:02:19 localhost ceph-mon[303906]: Reconfiguring daemon mgr.np0005486729.xpybho on np0005486729.localdomain Oct 14 06:02:19 localhost ceph-mon[303906]: from='client.? 
' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch
Oct 14 06:02:19 localhost ceph-mon[303906]: Activating manager daemon np0005486730.ddfidc
Oct 14 06:02:19 localhost ceph-mon[303906]: from='client.? 172.18.0.103:0/3139066435' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch
Oct 14 06:02:19 localhost ceph-mon[303906]: from='client.? ' entity='client.admin' cmd='[{"prefix": "mgr fail"}]': finished
Oct 14 06:02:19 localhost ceph-mon[303906]: Manager daemon np0005486730.ddfidc is now available
Oct 14 06:02:19 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005486730.ddfidc/mirror_snapshot_schedule"} : dispatch
Oct 14 06:02:19 localhost ceph-mon[303906]: from='mgr.14184 ' entity='mgr.np0005486730.ddfidc' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005486730.ddfidc/mirror_snapshot_schedule"} : dispatch
Oct 14 06:02:19 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005486730.ddfidc/trash_purge_schedule"} : dispatch
Oct 14 06:02:19 localhost ceph-mon[303906]: from='mgr.14184 ' entity='mgr.np0005486730.ddfidc' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005486730.ddfidc/trash_purge_schedule"} : dispatch
Oct 14 06:02:19 localhost nova_compute[297686]: 2025-10-14 10:02:19.827 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 14 06:02:20 localhost nova_compute[297686]: 2025-10-14 10:02:20.256 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 14 06:02:20 localhost nova_compute[297686]: 2025-10-14 10:02:20.256 2 DEBUG nova.compute.manager [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct 14 06:02:20 localhost nova_compute[297686]: 2025-10-14 10:02:20.256 2 DEBUG nova.compute.manager [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct 14 06:02:20 localhost podman[304857]: 2025-10-14 10:02:20.387078473 +0000 UTC m=+0.074549936 container exec b19a91314e045cbe5465f6abdeb6b420108fcda7860b498b01dcd4e65236d2c8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-crash-np0005486733, architecture=x86_64, io.buildah.version=1.33.12, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, name=rhceph, maintainer=Guillaume Abrioux , build-date=2025-09-24T08:57:55, com.redhat.license_terms=https://www.redhat.com/agreements, release=553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vcs-type=git, io.openshift.tags=rhceph ceph, RELEASE=main, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, GIT_BRANCH=main, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container)
Oct 14 06:02:20 localhost podman[304857]: 2025-10-14 10:02:20.529205592 +0000 UTC m=+0.216677075 container exec_died b19a91314e045cbe5465f6abdeb6b420108fcda7860b498b01dcd4e65236d2c8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-crash-np0005486733, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux , release=553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, name=rhceph, io.openshift.expose-services=, GIT_BRANCH=main, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vcs-type=git, io.openshift.tags=rhceph ceph, build-date=2025-09-24T08:57:55, version=7, io.buildah.version=1.33.12, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, distribution-scope=public, GIT_CLEAN=True, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/agreements, ceph=True)
Oct 14 06:02:20 localhost ceph-mon[303906]: [14/Oct/2025:10:02:20] ENGINE Bus STARTING
Oct 14 06:02:20 localhost ceph-mon[303906]: [14/Oct/2025:10:02:20] ENGINE Serving on https://172.18.0.105:7150
Oct 14 06:02:20 localhost ceph-mon[303906]: [14/Oct/2025:10:02:20] ENGINE Client ('172.18.0.105', 34948) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)')
Oct 14 06:02:20 localhost ceph-mon[303906]: [14/Oct/2025:10:02:20] ENGINE Serving on http://172.18.0.105:8765
Oct 14 06:02:20 localhost ceph-mon[303906]: [14/Oct/2025:10:02:20] ENGINE Bus STARTED
Oct 14 06:02:20 localhost ceph-mon[303906]: from='mgr.14184 ' entity='mgr.np0005486730.ddfidc'
Oct 14 06:02:20 localhost ceph-mon[303906]: from='mgr.14184 ' entity='mgr.np0005486730.ddfidc'
Oct 14 06:02:20 localhost ceph-mon[303906]: from='mgr.14184 ' entity='mgr.np0005486730.ddfidc'
Oct 14 06:02:20 localhost ceph-mon[303906]: from='mgr.14184 ' entity='mgr.np0005486730.ddfidc'
Oct 14 06:02:21 localhost nova_compute[297686]: 2025-10-14 10:02:21.240 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Acquiring lock "refresh_cache-88c4e366-b765-47a6-96bf-f7677f2ce67c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct 14 06:02:21 localhost nova_compute[297686]: 2025-10-14 10:02:21.241 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Acquired lock "refresh_cache-88c4e366-b765-47a6-96bf-f7677f2ce67c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct 14 06:02:21 localhost nova_compute[297686]: 2025-10-14 10:02:21.242 2 DEBUG nova.network.neutron [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] [instance: 88c4e366-b765-47a6-96bf-f7677f2ce67c] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Oct 14 06:02:21 localhost nova_compute[297686]: 2025-10-14 10:02:21.243 2 DEBUG nova.objects.instance [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Lazy-loading 'info_cache' on Instance uuid 88c4e366-b765-47a6-96bf-f7677f2ce67c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct 14 06:02:21 localhost nova_compute[297686]: 2025-10-14 10:02:21.475 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 14 06:02:21 localhost nova_compute[297686]: 2025-10-14 10:02:21.650 2 DEBUG nova.network.neutron [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] [instance: 88c4e366-b765-47a6-96bf-f7677f2ce67c] Updating instance_info_cache with network_info: [{"id": "3ec9b060-f43d-4698-9c76-6062c70911d5", "address": "fa:16:3e:84:5e:e5", "network": {"id": "7d0cd696-bdd7-4e70-9512-eb0d23640314", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.46", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "41187b090f3d4818a32baa37ce8a3991", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3ec9b060-f4", "ovs_interfaceid": "3ec9b060-f43d-4698-9c76-6062c70911d5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct 14 06:02:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f.
Oct 14 06:02:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917.
Oct 14 06:02:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d.
Oct 14 06:02:21 localhost nova_compute[297686]: 2025-10-14 10:02:21.666 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Releasing lock "refresh_cache-88c4e366-b765-47a6-96bf-f7677f2ce67c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct 14 06:02:21 localhost nova_compute[297686]: 2025-10-14 10:02:21.667 2 DEBUG nova.compute.manager [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] [instance: 88c4e366-b765-47a6-96bf-f7677f2ce67c] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Oct 14 06:02:21 localhost nova_compute[297686]: 2025-10-14 10:02:21.668 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 14 06:02:21 localhost systemd[1]: tmp-crun.pDQnUV.mount: Deactivated successfully.
Oct 14 06:02:21 localhost podman[305043]: 2025-10-14 10:02:21.745865314 +0000 UTC m=+0.076078464 container health_status 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., managed_by=edpm_ansible, release=1755695350, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, config_id=edpm, io.openshift.expose-services=, distribution-scope=public)
Oct 14 06:02:21 localhost podman[305044]: 2025-10-14 10:02:21.796345785 +0000 UTC m=+0.120008414 container health_status 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 14 06:02:21 localhost podman[305043]: 2025-10-14 10:02:21.806966209 +0000 UTC m=+0.137179339 container exec_died 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.openshift.expose-services=, maintainer=Red Hat, Inc., managed_by=edpm_ansible, container_name=openstack_network_exporter, io.buildah.version=1.33.7, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, distribution-scope=public, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, name=ubi9-minimal, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41)
Oct 14 06:02:21 localhost systemd[1]: 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917.service: Deactivated successfully.
Oct 14 06:02:21 localhost podman[305042]: 2025-10-14 10:02:21.777230682 +0000 UTC m=+0.108678259 container health_status 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20251009, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5)
Oct 14 06:02:21 localhost podman[305044]: 2025-10-14 10:02:21.858828533 +0000 UTC m=+0.182491152 container exec_died 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, config_id=edpm, container_name=ceilometer_agent_compute, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 14 06:02:21 localhost podman[305042]: 2025-10-14 10:02:21.861282698 +0000 UTC m=+0.192730285 container exec_died 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.build-date=20251009, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct 14 06:02:21 localhost systemd[1]: 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f.service: Deactivated successfully.
Oct 14 06:02:21 localhost systemd[1]: 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d.service: Deactivated successfully.
Oct 14 06:02:21 localhost ceph-mon[303906]: from='mgr.14184 ' entity='mgr.np0005486730.ddfidc'
Oct 14 06:02:21 localhost ceph-mon[303906]: from='mgr.14184 ' entity='mgr.np0005486730.ddfidc'
Oct 14 06:02:21 localhost ceph-mon[303906]: from='mgr.14184 ' entity='mgr.np0005486730.ddfidc'
Oct 14 06:02:21 localhost ceph-mon[303906]: from='mgr.14184 ' entity='mgr.np0005486730.ddfidc'
Oct 14 06:02:21 localhost ceph-mon[303906]: from='mgr.14184 ' entity='mgr.np0005486730.ddfidc'
Oct 14 06:02:21 localhost ceph-mon[303906]: from='mgr.14184 ' entity='mgr.np0005486730.ddfidc'
Oct 14 06:02:21 localhost ceph-mon[303906]: from='mgr.14184 ' entity='mgr.np0005486730.ddfidc'
Oct 14 06:02:21 localhost ceph-mon[303906]: from='mgr.14184 ' entity='mgr.np0005486730.ddfidc'
Oct 14 06:02:23 localhost ceph-mon[303906]: from='mgr.14184 ' entity='mgr.np0005486730.ddfidc'
Oct 14 06:02:23 localhost ceph-mon[303906]: from='mgr.14184 ' entity='mgr.np0005486730.ddfidc'
Oct 14 06:02:23 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc' cmd={"prefix": "config rm", "who": "osd/host:np0005486728", "name": "osd_memory_target"} : dispatch
Oct 14 06:02:23 localhost ceph-mon[303906]: from='mgr.14184 ' entity='mgr.np0005486730.ddfidc' cmd={"prefix": "config rm", "who": "osd/host:np0005486728", "name": "osd_memory_target"} : dispatch
Oct 14 06:02:23 localhost ceph-mon[303906]: from='mgr.14184 ' entity='mgr.np0005486730.ddfidc'
Oct 14 06:02:23 localhost ceph-mon[303906]: from='mgr.14184 ' entity='mgr.np0005486730.ddfidc'
Oct 14 06:02:23 localhost ceph-mon[303906]: from='mgr.14184 ' entity='mgr.np0005486730.ddfidc'
Oct 14 06:02:23 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch
Oct 14 06:02:23 localhost ceph-mon[303906]: from='mgr.14184 ' entity='mgr.np0005486730.ddfidc'
Oct 14 06:02:23 localhost ceph-mon[303906]: from='mgr.14184 ' entity='mgr.np0005486730.ddfidc' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch
Oct 14 06:02:23 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc' cmd={"prefix": "config rm", "who": "osd/host:np0005486730", "name": "osd_memory_target"} : dispatch
Oct 14 06:02:23 localhost ceph-mon[303906]: from='mgr.14184 ' entity='mgr.np0005486730.ddfidc' cmd={"prefix": "config rm", "who": "osd/host:np0005486730", "name": "osd_memory_target"} : dispatch
Oct 14 06:02:23 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch
Oct 14 06:02:23 localhost ceph-mon[303906]: from='mgr.14184 ' entity='mgr.np0005486730.ddfidc' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch
Oct 14 06:02:23 localhost ceph-mon[303906]: Adjusting osd_memory_target on np0005486733.localdomain to 836.6M
Oct 14 06:02:23 localhost ceph-mon[303906]: Unable to set osd_memory_target on np0005486733.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Oct 14 06:02:23 localhost ceph-mon[303906]: from='mgr.14184 ' entity='mgr.np0005486730.ddfidc'
Oct 14 06:02:23 localhost ceph-mon[303906]: from='mgr.14184 ' entity='mgr.np0005486730.ddfidc'
Oct 14 06:02:23 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc' cmd={"prefix": "config rm", "who": "osd/host:np0005486729", "name": "osd_memory_target"} : dispatch
Oct 14 06:02:23 localhost ceph-mon[303906]: from='mgr.14184 ' entity='mgr.np0005486730.ddfidc' cmd={"prefix": "config rm", "who": "osd/host:np0005486729", "name": "osd_memory_target"} : dispatch
Oct 14 06:02:23 localhost ceph-mon[303906]: from='mgr.14184 ' entity='mgr.np0005486730.ddfidc'
Oct 14 06:02:23 localhost ceph-mon[303906]: from='mgr.14184 ' entity='mgr.np0005486730.ddfidc'
Oct 14 06:02:23 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch
Oct 14 06:02:23 localhost ceph-mon[303906]: from='mgr.14184 ' entity='mgr.np0005486730.ddfidc'
Oct 14 06:02:23 localhost ceph-mon[303906]: from='mgr.14184 ' entity='mgr.np0005486730.ddfidc' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch
Oct 14 06:02:23 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch
Oct 14 06:02:23 localhost ceph-mon[303906]: from='mgr.14184 ' entity='mgr.np0005486730.ddfidc' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch
Oct 14 06:02:23 localhost ceph-mon[303906]: from='mgr.14184 ' entity='mgr.np0005486730.ddfidc'
Oct 14 06:02:23 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch
Oct 14 06:02:23 localhost ceph-mon[303906]: Adjusting osd_memory_target on np0005486731.localdomain to 836.6M
Oct 14 06:02:23 localhost ceph-mon[303906]: from='mgr.14184 ' entity='mgr.np0005486730.ddfidc' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch
Oct 14 06:02:23 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch
Oct 14 06:02:23 localhost ceph-mon[303906]: Unable to set osd_memory_target on np0005486731.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Oct 14 06:02:23 localhost ceph-mon[303906]: from='mgr.14184 ' entity='mgr.np0005486730.ddfidc' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch
Oct 14 06:02:23 localhost ceph-mon[303906]: Adjusting osd_memory_target on np0005486732.localdomain to 836.6M
Oct 14 06:02:23 localhost ceph-mon[303906]: Unable to set osd_memory_target on np0005486732.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Oct 14 06:02:23 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Oct 14 06:02:23 localhost ceph-mon[303906]: Updating np0005486728.localdomain:/etc/ceph/ceph.conf
Oct 14 06:02:23 localhost ceph-mon[303906]: Updating np0005486729.localdomain:/etc/ceph/ceph.conf
Oct 14 06:02:23 localhost ceph-mon[303906]: Updating np0005486730.localdomain:/etc/ceph/ceph.conf
Oct 14 06:02:23 localhost ceph-mon[303906]: Updating np0005486731.localdomain:/etc/ceph/ceph.conf
Oct 14 06:02:23 localhost ceph-mon[303906]: Updating np0005486732.localdomain:/etc/ceph/ceph.conf
Oct 14 06:02:23 localhost ceph-mon[303906]: Updating np0005486733.localdomain:/etc/ceph/ceph.conf
Oct 14 06:02:23 localhost ceph-mon[303906]: mon.np0005486733@3(peon).osd e80 _set_new_cache_sizes cache_size:1019837851 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct 14 06:02:24 localhost nova_compute[297686]: 2025-10-14 10:02:24.004 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 14 06:02:24 localhost ceph-mon[303906]: Updating np0005486730.localdomain:/var/lib/ceph/fcadf6e2-9176-5818-a8d0-37b19acf8eaf/config/ceph.conf
Oct 14 06:02:24 localhost ceph-mon[303906]: Updating np0005486729.localdomain:/var/lib/ceph/fcadf6e2-9176-5818-a8d0-37b19acf8eaf/config/ceph.conf
Oct 14 06:02:24 localhost ceph-mon[303906]: Updating np0005486733.localdomain:/var/lib/ceph/fcadf6e2-9176-5818-a8d0-37b19acf8eaf/config/ceph.conf
Oct 14 06:02:24 localhost ceph-mon[303906]: Updating np0005486728.localdomain:/var/lib/ceph/fcadf6e2-9176-5818-a8d0-37b19acf8eaf/config/ceph.conf
Oct 14 06:02:24 localhost ceph-mon[303906]: Updating np0005486731.localdomain:/var/lib/ceph/fcadf6e2-9176-5818-a8d0-37b19acf8eaf/config/ceph.conf
Oct 14 06:02:24 localhost ceph-mon[303906]: Updating np0005486732.localdomain:/var/lib/ceph/fcadf6e2-9176-5818-a8d0-37b19acf8eaf/config/ceph.conf
Oct 14 06:02:25 localhost ceph-mon[303906]: Updating np0005486730.localdomain:/etc/ceph/ceph.client.admin.keyring
Oct 14 06:02:25 localhost ceph-mon[303906]: Updating np0005486729.localdomain:/etc/ceph/ceph.client.admin.keyring
Oct 14 06:02:25 localhost ceph-mon[303906]: Updating np0005486728.localdomain:/etc/ceph/ceph.client.admin.keyring
Oct 14 06:02:25 localhost ceph-mon[303906]: Updating np0005486733.localdomain:/etc/ceph/ceph.client.admin.keyring
Oct 14 06:02:25 localhost ceph-mon[303906]: Updating np0005486731.localdomain:/etc/ceph/ceph.client.admin.keyring
Oct 14 06:02:25 localhost ceph-mon[303906]: Updating np0005486732.localdomain:/etc/ceph/ceph.client.admin.keyring
Oct 14 06:02:25 localhost ceph-mon[303906]: Updating np0005486730.localdomain:/var/lib/ceph/fcadf6e2-9176-5818-a8d0-37b19acf8eaf/config/ceph.client.admin.keyring
Oct 14 06:02:25 localhost ceph-mon[303906]: Updating np0005486729.localdomain:/var/lib/ceph/fcadf6e2-9176-5818-a8d0-37b19acf8eaf/config/ceph.client.admin.keyring
Oct 14 06:02:25 localhost ceph-mon[303906]: Updating np0005486728.localdomain:/var/lib/ceph/fcadf6e2-9176-5818-a8d0-37b19acf8eaf/config/ceph.client.admin.keyring
Oct 14 06:02:25 localhost ceph-mon[303906]: from='mgr.14184 ' entity='mgr.np0005486730.ddfidc'
Oct 14 06:02:25 localhost ceph-mon[303906]: from='mgr.14184 ' entity='mgr.np0005486730.ddfidc'
Oct 14 06:02:25 localhost ceph-mon[303906]: from='mgr.14184 ' entity='mgr.np0005486730.ddfidc'
Oct 14 06:02:25 localhost ceph-mon[303906]: from='mgr.14184 ' entity='mgr.np0005486730.ddfidc'
Oct 14 06:02:25 localhost ceph-mon[303906]: from='mgr.14184 ' entity='mgr.np0005486730.ddfidc'
Oct 14 06:02:25 localhost ceph-mon[303906]: from='mgr.14184 ' entity='mgr.np0005486730.ddfidc'
Oct 14 06:02:25 localhost ceph-mon[303906]: from='mgr.14184 ' entity='mgr.np0005486730.ddfidc'
Oct 14 06:02:25 localhost ceph-mon[303906]: from='mgr.14184 ' entity='mgr.np0005486730.ddfidc'
Oct 14 06:02:25 localhost ceph-mon[303906]: from='mgr.14184 ' entity='mgr.np0005486730.ddfidc'
Oct 14 06:02:25 localhost ceph-mon[303906]: from='mgr.14184 ' entity='mgr.np0005486730.ddfidc'
Oct 14 06:02:25 localhost ceph-mon[303906]: from='mgr.14184 ' entity='mgr.np0005486730.ddfidc'
Oct 14 06:02:25 localhost ceph-mon[303906]: from='mgr.14184 ' entity='mgr.np0005486730.ddfidc'
Oct 14 06:02:25 localhost ceph-mon[303906]: from='mgr.14184 ' entity='mgr.np0005486730.ddfidc'
Oct 14 06:02:26 localhost nova_compute[297686]: 2025-10-14 10:02:26.523 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 14 06:02:26 localhost ceph-mon[303906]: Updating np0005486733.localdomain:/var/lib/ceph/fcadf6e2-9176-5818-a8d0-37b19acf8eaf/config/ceph.client.admin.keyring
Oct 14 06:02:26 localhost ceph-mon[303906]: Updating np0005486731.localdomain:/var/lib/ceph/fcadf6e2-9176-5818-a8d0-37b19acf8eaf/config/ceph.client.admin.keyring
Oct 14 06:02:26 localhost ceph-mon[303906]: Updating np0005486732.localdomain:/var/lib/ceph/fcadf6e2-9176-5818-a8d0-37b19acf8eaf/config/ceph.client.admin.keyring
Oct 14 06:02:26 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005486729.xpybho", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Oct 14 06:02:26 localhost ceph-mon[303906]: from='mgr.14184 ' entity='mgr.np0005486730.ddfidc' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005486729.xpybho", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Oct 14 06:02:27 localhost ceph-mon[303906]: Reconfiguring mgr.np0005486729.xpybho (monmap changed)...
Oct 14 06:02:27 localhost ceph-mon[303906]: Reconfiguring daemon mgr.np0005486729.xpybho on np0005486729.localdomain
Oct 14 06:02:27 localhost ceph-mon[303906]: from='mgr.14184 ' entity='mgr.np0005486730.ddfidc'
Oct 14 06:02:27 localhost ceph-mon[303906]: from='mgr.14184 ' entity='mgr.np0005486730.ddfidc'
Oct 14 06:02:27 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Oct 14 06:02:28 localhost podman[248187]: time="2025-10-14T10:02:28Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 14 06:02:28 localhost podman[248187]: @ - - [14/Oct/2025:10:02:28 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 147497 "" "Go-http-client/1.1"
Oct 14 06:02:28 localhost ceph-mon[303906]: mon.np0005486733@3(peon).osd e80 _set_new_cache_sizes cache_size:1020050806 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct 14 06:02:28 localhost podman[248187]: @ - - [14/Oct/2025:10:02:28 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19831 "" "Go-http-client/1.1"
Oct 14 06:02:28 localhost ceph-mon[303906]: Reconfiguring mon.np0005486730 (monmap changed)...
Oct 14 06:02:28 localhost ceph-mon[303906]: Reconfiguring daemon mon.np0005486730 on np0005486730.localdomain
Oct 14 06:02:28 localhost ceph-mon[303906]: from='mgr.14184 ' entity='mgr.np0005486730.ddfidc'
Oct 14 06:02:28 localhost ceph-mon[303906]: from='mgr.14184 ' entity='mgr.np0005486730.ddfidc'
Oct 14 06:02:28 localhost ceph-mon[303906]: Reconfiguring mgr.np0005486730.ddfidc (monmap changed)...
Oct 14 06:02:28 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005486730.ddfidc", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Oct 14 06:02:28 localhost ceph-mon[303906]: from='mgr.14184 ' entity='mgr.np0005486730.ddfidc' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005486730.ddfidc", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Oct 14 06:02:28 localhost ceph-mon[303906]: Reconfiguring daemon mgr.np0005486730.ddfidc on np0005486730.localdomain Oct 14 06:02:29 localhost nova_compute[297686]: 2025-10-14 10:02:29.033 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:02:30 localhost ceph-mon[303906]: from='mgr.14184 ' entity='mgr.np0005486730.ddfidc' Oct 14 06:02:30 localhost ceph-mon[303906]: from='mgr.14184 ' entity='mgr.np0005486730.ddfidc' Oct 14 06:02:30 localhost ceph-mon[303906]: from='mgr.14184 ' entity='mgr.np0005486730.ddfidc' Oct 14 06:02:30 localhost ceph-mon[303906]: Reconfiguring crash.np0005486730 (monmap changed)... 
Oct 14 06:02:30 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005486730.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Oct 14 06:02:30 localhost ceph-mon[303906]: from='mgr.14184 ' entity='mgr.np0005486730.ddfidc' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005486730.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Oct 14 06:02:30 localhost ceph-mon[303906]: Reconfiguring daemon crash.np0005486730 on np0005486730.localdomain Oct 14 06:02:31 localhost ceph-mon[303906]: from='mgr.14184 ' entity='mgr.np0005486730.ddfidc' Oct 14 06:02:31 localhost ceph-mon[303906]: from='mgr.14184 ' entity='mgr.np0005486730.ddfidc' Oct 14 06:02:31 localhost ceph-mon[303906]: Reconfiguring crash.np0005486731 (monmap changed)... Oct 14 06:02:31 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005486731.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Oct 14 06:02:31 localhost ceph-mon[303906]: from='mgr.14184 ' entity='mgr.np0005486730.ddfidc' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005486731.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Oct 14 06:02:31 localhost ceph-mon[303906]: Reconfiguring daemon crash.np0005486731 on np0005486731.localdomain Oct 14 06:02:31 localhost nova_compute[297686]: 2025-10-14 10:02:31.560 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:02:32 localhost ceph-mon[303906]: from='mgr.14184 ' entity='mgr.np0005486730.ddfidc' Oct 14 06:02:32 localhost ceph-mon[303906]: from='mgr.14184 ' entity='mgr.np0005486730.ddfidc' Oct 14 06:02:32 localhost 
ceph-mon[303906]: Reconfiguring osd.2 (monmap changed)... Oct 14 06:02:32 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch Oct 14 06:02:32 localhost ceph-mon[303906]: Reconfiguring daemon osd.2 on np0005486731.localdomain Oct 14 06:02:33 localhost ceph-mon[303906]: from='mgr.14184 ' entity='mgr.np0005486730.ddfidc' Oct 14 06:02:33 localhost ceph-mon[303906]: from='mgr.14184 ' entity='mgr.np0005486730.ddfidc' Oct 14 06:02:33 localhost ceph-mon[303906]: Reconfiguring osd.4 (monmap changed)... Oct 14 06:02:33 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc' cmd={"prefix": "auth get", "entity": "osd.4"} : dispatch Oct 14 06:02:33 localhost ceph-mon[303906]: Reconfiguring daemon osd.4 on np0005486731.localdomain Oct 14 06:02:33 localhost ceph-mon[303906]: mon.np0005486733@3(peon).osd e80 _set_new_cache_sizes cache_size:1020054659 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Oct 14 06:02:34 localhost nova_compute[297686]: 2025-10-14 10:02:34.040 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:02:34 localhost ceph-mon[303906]: from='mgr.14184 ' entity='mgr.np0005486730.ddfidc' Oct 14 06:02:34 localhost ceph-mon[303906]: from='mgr.14184 ' entity='mgr.np0005486730.ddfidc' Oct 14 06:02:34 localhost ceph-mon[303906]: Reconfiguring mds.mds.np0005486731.onyaog (monmap changed)... 
Oct 14 06:02:34 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005486731.onyaog", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Oct 14 06:02:34 localhost ceph-mon[303906]: from='mgr.14184 ' entity='mgr.np0005486730.ddfidc' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005486731.onyaog", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Oct 14 06:02:34 localhost ceph-mon[303906]: Reconfiguring daemon mds.mds.np0005486731.onyaog on np0005486731.localdomain Oct 14 06:02:34 localhost ceph-mgr[302471]: ms_deliver_dispatch: unhandled message 0x55da44b97600 mon_map magic: 0 from mon.0 v2:172.18.0.103:3300/0 Oct 14 06:02:34 localhost ceph-mon[303906]: mon.np0005486733@3(peon) e7 my rank is now 2 (was 3) Oct 14 06:02:34 localhost ceph-mgr[302471]: client.0 ms_handle_reset on v2:172.18.0.108:3300/0 Oct 14 06:02:34 localhost ceph-mgr[302471]: client.0 ms_handle_reset on v2:172.18.0.108:3300/0 Oct 14 06:02:34 localhost ceph-mgr[302471]: ms_deliver_dispatch: unhandled message 0x55da44b971e0 mon_map magic: 0 from mon.1 v2:172.18.0.104:3300/0 Oct 14 06:02:34 localhost ceph-mon[303906]: log_channel(cluster) log [INF] : mon.np0005486733 calling monitor election Oct 14 06:02:34 localhost ceph-mon[303906]: paxos.2).electionLogic(28) init, last seen epoch 28 Oct 14 06:02:34 localhost ceph-mon[303906]: mon.np0005486733@2(electing) e7 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Oct 14 06:02:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041. Oct 14 06:02:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857. 
Oct 14 06:02:34 localhost ceph-mon[303906]: mon.np0005486733@2(electing) e7 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Oct 14 06:02:34 localhost podman[305821]: 2025-10-14 10:02:34.748487973 +0000 UTC m=+0.086482871 container health_status 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251009, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, 
config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team) Oct 14 06:02:34 localhost podman[305820]: 2025-10-14 10:02:34.805348319 +0000 UTC m=+0.143801991 container health_status 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm) Oct 14 06:02:34 localhost podman[305820]: 2025-10-14 10:02:34.814534309 +0000 UTC m=+0.152987921 container exec_died 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': 
'/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Oct 14 06:02:34 localhost systemd[1]: 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041.service: Deactivated successfully. Oct 14 06:02:34 localhost podman[305821]: 2025-10-14 10:02:34.831493607 +0000 UTC m=+0.169488555 container exec_died 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, 
container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image) Oct 14 06:02:34 localhost systemd[1]: 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857.service: Deactivated successfully. Oct 14 06:02:36 localhost nova_compute[297686]: 2025-10-14 10:02:36.562 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:02:38 localhost openstack_network_exporter[250374]: ERROR 10:02:38 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Oct 14 06:02:38 localhost openstack_network_exporter[250374]: ERROR 10:02:38 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Oct 14 06:02:38 localhost openstack_network_exporter[250374]: ERROR 10:02:38 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Oct 14 06:02:38 localhost openstack_network_exporter[250374]: ERROR 10:02:38 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Oct 14 06:02:38 localhost openstack_network_exporter[250374]: Oct 14 06:02:38 localhost openstack_network_exporter[250374]: ERROR 10:02:38 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Oct 14 06:02:38 localhost openstack_network_exporter[250374]: Oct 14 06:02:39 localhost nova_compute[297686]: 2025-10-14 10:02:39.042 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:02:39 localhost ceph-mon[303906]: log_channel(cluster) log [INF] : mon.np0005486733 calling monitor election Oct 14 06:02:39 localhost ceph-mon[303906]: paxos.2).electionLogic(31) init, last seen 
epoch 31, mid-election, bumping Oct 14 06:02:39 localhost ceph-mon[303906]: mon.np0005486733@2(electing) e7 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Oct 14 06:02:39 localhost ceph-mon[303906]: mon.np0005486733@2(electing) e7 handle_timecheck drop unexpected msg Oct 14 06:02:39 localhost ceph-mon[303906]: mon.np0005486733@2(electing) e7 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Oct 14 06:02:39 localhost ceph-mon[303906]: mon.np0005486733@2(electing) e7 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Oct 14 06:02:39 localhost ceph-mon[303906]: mon.np0005486733@2(peon) e7 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Oct 14 06:02:39 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005486731.swasqz", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Oct 14 06:02:39 localhost ceph-mon[303906]: Reconfiguring daemon mgr.np0005486731.swasqz on np0005486731.localdomain Oct 14 06:02:39 localhost ceph-mon[303906]: Remove daemons mon.np0005486728 Oct 14 06:02:39 localhost ceph-mon[303906]: Safe to remove mon.np0005486728: new quorum should be ['np0005486730', 'np0005486729', 'np0005486733', 'np0005486732', 'np0005486731'] (from ['np0005486730', 'np0005486729', 'np0005486733', 'np0005486732', 'np0005486731']) Oct 14 06:02:39 localhost ceph-mon[303906]: Removing monitor np0005486728 from monmap... 
Oct 14 06:02:39 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc' cmd={"prefix": "mon rm", "name": "np0005486728"} : dispatch Oct 14 06:02:39 localhost ceph-mon[303906]: Removing daemon mon.np0005486728 from np0005486728.localdomain -- ports [] Oct 14 06:02:39 localhost ceph-mon[303906]: mon.np0005486731 calling monitor election Oct 14 06:02:39 localhost ceph-mon[303906]: mon.np0005486733 calling monitor election Oct 14 06:02:39 localhost ceph-mon[303906]: mon.np0005486730 calling monitor election Oct 14 06:02:39 localhost ceph-mon[303906]: mon.np0005486732 calling monitor election Oct 14 06:02:39 localhost ceph-mon[303906]: mon.np0005486729 calling monitor election Oct 14 06:02:39 localhost ceph-mon[303906]: mon.np0005486730 is new leader, mons np0005486730,np0005486729,np0005486733,np0005486731 in quorum (ranks 0,1,2,4) Oct 14 06:02:39 localhost ceph-mon[303906]: overall HEALTH_OK Oct 14 06:02:39 localhost ceph-mon[303906]: mon.np0005486730 calling monitor election Oct 14 06:02:39 localhost ceph-mon[303906]: mon.np0005486733 calling monitor election Oct 14 06:02:39 localhost ceph-mon[303906]: mon.np0005486729 calling monitor election Oct 14 06:02:39 localhost ceph-mon[303906]: mon.np0005486730 is new leader, mons np0005486730,np0005486729,np0005486733,np0005486732,np0005486731 in quorum (ranks 0,1,2,3,4) Oct 14 06:02:39 localhost ceph-mon[303906]: overall HEALTH_OK Oct 14 06:02:39 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc' Oct 14 06:02:39 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc' Oct 14 06:02:39 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Oct 14 06:02:40 localhost ceph-mon[303906]: Reconfiguring mon.np0005486731 (monmap changed)... 
Oct 14 06:02:40 localhost ceph-mon[303906]: Reconfiguring daemon mon.np0005486731 on np0005486731.localdomain Oct 14 06:02:40 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc' Oct 14 06:02:40 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc' Oct 14 06:02:40 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005486732.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Oct 14 06:02:41 localhost ceph-osd[31500]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Oct 14 06:02:41 localhost ceph-osd[31500]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 7800.1 total, 600.0 interval#012Cumulative writes: 5063 writes, 22K keys, 5063 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s#012Cumulative WAL: 5063 writes, 690 syncs, 7.34 writes per sync, written: 0.02 GB, 0.00 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 93 writes, 279 keys, 93 commit groups, 1.0 writes per commit group, ingest: 0.34 MB, 0.00 MB/s#012Interval WAL: 93 writes, 36 syncs, 2.58 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent Oct 14 06:02:41 localhost nova_compute[297686]: 2025-10-14 10:02:41.606 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:02:41 localhost ceph-mon[303906]: Reconfiguring crash.np0005486732 (monmap changed)... 
Oct 14 06:02:41 localhost ceph-mon[303906]: Reconfiguring daemon crash.np0005486732 on np0005486732.localdomain Oct 14 06:02:41 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc' Oct 14 06:02:41 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc' Oct 14 06:02:41 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc' Oct 14 06:02:41 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch Oct 14 06:02:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29. Oct 14 06:02:42 localhost podman[305861]: 2025-10-14 10:02:42.710728872 +0000 UTC m=+0.056097174 container health_status fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, tcib_managed=true, container_name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', 
'/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, managed_by=edpm_ansible, org.label-schema.build-date=20251009, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS) Oct 14 06:02:42 localhost podman[305861]: 2025-10-14 10:02:42.720932964 +0000 UTC m=+0.066301266 container exec_died fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true, container_name=iscsid, managed_by=edpm_ansible, 
org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image) Oct 14 06:02:42 localhost systemd[1]: fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29.service: Deactivated successfully. Oct 14 06:02:42 localhost ceph-mon[303906]: Removed label mon from host np0005486728.localdomain Oct 14 06:02:42 localhost ceph-mon[303906]: Reconfiguring osd.1 (monmap changed)... Oct 14 06:02:42 localhost ceph-mon[303906]: Reconfiguring daemon osd.1 on np0005486732.localdomain Oct 14 06:02:42 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc' Oct 14 06:02:42 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc' Oct 14 06:02:42 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc' Oct 14 06:02:42 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch Oct 14 06:02:43 localhost ceph-mon[303906]: mon.np0005486733@2(peon).osd e80 _set_new_cache_sizes cache_size:1020054730 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Oct 14 06:02:43 localhost ceph-mon[303906]: Removed label mgr from host np0005486728.localdomain Oct 14 06:02:43 localhost ceph-mon[303906]: Reconfiguring osd.5 (monmap changed)... 
Oct 14 06:02:43 localhost ceph-mon[303906]: Reconfiguring daemon osd.5 on np0005486732.localdomain Oct 14 06:02:43 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc' Oct 14 06:02:43 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc' Oct 14 06:02:43 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005486732.xkownj", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Oct 14 06:02:43 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc' Oct 14 06:02:44 localhost nova_compute[297686]: 2025-10-14 10:02:44.058 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:02:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad. Oct 14 06:02:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec. 
Oct 14 06:02:44 localhost podman[305880]: 2025-10-14 10:02:44.707299313 +0000 UTC m=+0.049269955 container health_status 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, container_name=multipathd, managed_by=edpm_ansible) Oct 14 06:02:44 localhost podman[305880]: 2025-10-14 10:02:44.72061355 +0000 UTC m=+0.062584192 container exec_died 
02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251009, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2) Oct 14 06:02:44 localhost systemd[1]: 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad.service: Deactivated successfully. 
Oct 14 06:02:44 localhost podman[305881]: 2025-10-14 10:02:44.762335583 +0000 UTC m=+0.099039154 container health_status aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter) Oct 14 06:02:44 localhost podman[305881]: 2025-10-14 10:02:44.801043095 +0000 UTC m=+0.137746596 container exec_died aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 
'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm) Oct 14 06:02:44 localhost systemd[1]: aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec.service: Deactivated successfully. Oct 14 06:02:44 localhost ceph-mon[303906]: Reconfiguring mds.mds.np0005486732.xkownj (monmap changed)... 
Oct 14 06:02:44 localhost ceph-mon[303906]: Reconfiguring daemon mds.mds.np0005486732.xkownj on np0005486732.localdomain Oct 14 06:02:44 localhost ceph-mon[303906]: Removed label _admin from host np0005486728.localdomain Oct 14 06:02:44 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc' Oct 14 06:02:44 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc' Oct 14 06:02:44 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005486732.pasqzz", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Oct 14 06:02:45 localhost ceph-osd[32440]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Oct 14 06:02:45 localhost ceph-osd[32440]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 7800.1 total, 600.0 interval#012Cumulative writes: 5676 writes, 24K keys, 5676 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s#012Cumulative WAL: 5676 writes, 824 syncs, 6.89 writes per sync, written: 0.02 GB, 0.00 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 115 writes, 319 keys, 115 commit groups, 1.0 writes per commit group, ingest: 0.46 MB, 0.00 MB/s#012Interval WAL: 115 writes, 56 syncs, 2.05 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent Oct 14 06:02:45 localhost ceph-mon[303906]: Reconfiguring mgr.np0005486732.pasqzz (monmap changed)... 
Oct 14 06:02:45 localhost ceph-mon[303906]: Reconfiguring daemon mgr.np0005486732.pasqzz on np0005486732.localdomain Oct 14 06:02:45 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc' Oct 14 06:02:45 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc' Oct 14 06:02:45 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Oct 14 06:02:46 localhost nova_compute[297686]: 2025-10-14 10:02:46.668 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:02:46 localhost podman[305973]: Oct 14 06:02:46 localhost podman[305973]: 2025-10-14 10:02:46.844659542 +0000 UTC m=+0.071582307 container create e7bd18dbfe6108117729c61cedce5d1c2919e9c5dc846861eb91ea6af80c01c5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=charming_noether, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/agreements, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, name=rhceph, RELEASE=main, io.buildah.version=1.33.12, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, ceph=True, distribution-scope=public, release=553, maintainer=Guillaume Abrioux , version=7, io.openshift.tags=rhceph ceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, GIT_BRANCH=main, com.redhat.component=rhceph-container, GIT_CLEAN=True, build-date=2025-09-24T08:57:55, CEPH_POINT_RELEASE=, 
GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0) Oct 14 06:02:46 localhost systemd[1]: Started libpod-conmon-e7bd18dbfe6108117729c61cedce5d1c2919e9c5dc846861eb91ea6af80c01c5.scope. Oct 14 06:02:46 localhost systemd[1]: Started libcrun container. Oct 14 06:02:46 localhost podman[305973]: 2025-10-14 10:02:46.811632834 +0000 UTC m=+0.038555639 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Oct 14 06:02:46 localhost ceph-mon[303906]: Reconfiguring mon.np0005486732 (monmap changed)... Oct 14 06:02:46 localhost ceph-mon[303906]: Reconfiguring daemon mon.np0005486732 on np0005486732.localdomain Oct 14 06:02:46 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc' Oct 14 06:02:46 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc' Oct 14 06:02:46 localhost ceph-mon[303906]: Reconfiguring crash.np0005486733 (monmap changed)... Oct 14 06:02:46 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005486733.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Oct 14 06:02:46 localhost ceph-mon[303906]: Reconfiguring daemon crash.np0005486733 on np0005486733.localdomain Oct 14 06:02:46 localhost podman[305973]: 2025-10-14 10:02:46.930878103 +0000 UTC m=+0.157800878 container init e7bd18dbfe6108117729c61cedce5d1c2919e9c5dc846861eb91ea6af80c01c5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=charming_noether, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, com.redhat.component=rhceph-container, build-date=2025-09-24T08:57:55, ceph=True, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 
9, name=rhceph, release=553, architecture=x86_64, distribution-scope=public, maintainer=Guillaume Abrioux , vcs-type=git, GIT_BRANCH=main, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/agreements, RELEASE=main, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, version=7) Oct 14 06:02:46 localhost podman[305973]: 2025-10-14 10:02:46.941432666 +0000 UTC m=+0.168355451 container start e7bd18dbfe6108117729c61cedce5d1c2919e9c5dc846861eb91ea6af80c01c5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=charming_noether, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, com.redhat.component=rhceph-container, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.buildah.version=1.33.12, release=553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux , GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., build-date=2025-09-24T08:57:55, distribution-scope=public, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, architecture=x86_64, GIT_BRANCH=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, vcs-type=git) Oct 14 06:02:46 localhost podman[305973]: 2025-10-14 10:02:46.941632412 +0000 UTC m=+0.168555187 container attach 
e7bd18dbfe6108117729c61cedce5d1c2919e9c5dc846861eb91ea6af80c01c5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=charming_noether, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, build-date=2025-09-24T08:57:55, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_CLEAN=True, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, ceph=True, io.buildah.version=1.33.12, release=553, name=rhceph, GIT_BRANCH=main, vcs-type=git, distribution-scope=public, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux , io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, version=7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.expose-services=) Oct 14 06:02:46 localhost charming_noether[305988]: 167 167 Oct 14 06:02:46 localhost systemd[1]: libpod-e7bd18dbfe6108117729c61cedce5d1c2919e9c5dc846861eb91ea6af80c01c5.scope: Deactivated successfully. 
Oct 14 06:02:46 localhost podman[305973]: 2025-10-14 10:02:46.952279487 +0000 UTC m=+0.179202262 container died e7bd18dbfe6108117729c61cedce5d1c2919e9c5dc846861eb91ea6af80c01c5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=charming_noether, CEPH_POINT_RELEASE=, RELEASE=main, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux , org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, distribution-scope=public, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.buildah.version=1.33.12, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=553, vcs-type=git, name=rhceph, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-09-24T08:57:55, description=Red Hat Ceph Storage 7, version=7) Oct 14 06:02:47 localhost podman[305993]: 2025-10-14 10:02:47.032085484 +0000 UTC m=+0.071757632 container remove e7bd18dbfe6108117729c61cedce5d1c2919e9c5dc846861eb91ea6af80c01c5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=charming_noether, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux , io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, io.openshift.expose-services=, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-09-24T08:57:55, 
vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, architecture=x86_64, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.buildah.version=1.33.12, name=rhceph, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, distribution-scope=public, vendor=Red Hat, Inc., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, release=553) Oct 14 06:02:47 localhost systemd[1]: libpod-conmon-e7bd18dbfe6108117729c61cedce5d1c2919e9c5dc846861eb91ea6af80c01c5.scope: Deactivated successfully. Oct 14 06:02:47 localhost podman[306062]: Oct 14 06:02:47 localhost podman[306062]: 2025-10-14 10:02:47.752600339 +0000 UTC m=+0.074857176 container create 808228dd4ecc166f00786ce34cbcbd20bd04840e4ee68cd17eadd0c14c5306e3 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=reverent_montalcini, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, distribution-scope=public, name=rhceph, build-date=2025-09-24T08:57:55, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_CLEAN=True, ceph=True, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.33.12, architecture=x86_64, CEPH_POINT_RELEASE=, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat Ceph Storage 7, release=553, 
com.redhat.component=rhceph-container, version=7) Oct 14 06:02:47 localhost systemd[1]: Started libpod-conmon-808228dd4ecc166f00786ce34cbcbd20bd04840e4ee68cd17eadd0c14c5306e3.scope. Oct 14 06:02:47 localhost systemd[1]: Started libcrun container. Oct 14 06:02:47 localhost podman[306062]: 2025-10-14 10:02:47.806247577 +0000 UTC m=+0.128504404 container init 808228dd4ecc166f00786ce34cbcbd20bd04840e4ee68cd17eadd0c14c5306e3 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=reverent_montalcini, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, ceph=True, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux , io.openshift.expose-services=, io.buildah.version=1.33.12, architecture=x86_64, GIT_BRANCH=main, vcs-type=git, CEPH_POINT_RELEASE=, RELEASE=main, release=553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.component=rhceph-container, build-date=2025-09-24T08:57:55, version=7, GIT_CLEAN=True, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7) Oct 14 06:02:47 localhost podman[306062]: 2025-10-14 10:02:47.814704285 +0000 UTC m=+0.136961112 container start 808228dd4ecc166f00786ce34cbcbd20bd04840e4ee68cd17eadd0c14c5306e3 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=reverent_montalcini, distribution-scope=public, maintainer=Guillaume Abrioux , build-date=2025-09-24T08:57:55, io.buildah.version=1.33.12, version=7, vendor=Red Hat, Inc., name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on 
RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, release=553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, CEPH_POINT_RELEASE=, ceph=True) Oct 14 06:02:47 localhost podman[306062]: 2025-10-14 10:02:47.815004214 +0000 UTC m=+0.137261061 container attach 808228dd4ecc166f00786ce34cbcbd20bd04840e4ee68cd17eadd0c14c5306e3 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=reverent_montalcini, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, ceph=True, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=rhceph-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, description=Red Hat Ceph Storage 7, release=553, distribution-scope=public, version=7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.expose-services=, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.33.12, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, GIT_CLEAN=True, RELEASE=main, architecture=x86_64, 
maintainer=Guillaume Abrioux , CEPH_POINT_RELEASE=, build-date=2025-09-24T08:57:55) Oct 14 06:02:47 localhost reverent_montalcini[306077]: 167 167 Oct 14 06:02:47 localhost systemd[1]: libpod-808228dd4ecc166f00786ce34cbcbd20bd04840e4ee68cd17eadd0c14c5306e3.scope: Deactivated successfully. Oct 14 06:02:47 localhost podman[306062]: 2025-10-14 10:02:47.817709037 +0000 UTC m=+0.139965944 container died 808228dd4ecc166f00786ce34cbcbd20bd04840e4ee68cd17eadd0c14c5306e3 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=reverent_montalcini, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, vendor=Red Hat, Inc., ceph=True, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_BRANCH=main, distribution-scope=public, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, release=553, maintainer=Guillaume Abrioux , GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, version=7, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, build-date=2025-09-24T08:57:55) Oct 14 06:02:47 localhost podman[306062]: 2025-10-14 10:02:47.724268754 +0000 UTC m=+0.046525662 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Oct 14 06:02:47 localhost systemd[1]: var-lib-containers-storage-overlay-2a4798a4696e0707e32eb32c05f3c12ec3717942203320f253ce6b46e6f9c8da-merged.mount: Deactivated successfully. Oct 14 06:02:47 localhost systemd[1]: tmp-crun.jKXEOf.mount: Deactivated successfully. 
Oct 14 06:02:47 localhost systemd[1]: var-lib-containers-storage-overlay-381117d89ec194b8f340f84e55bc6da6b85bb4433422241bf7bfd13f22334d61-merged.mount: Deactivated successfully. Oct 14 06:02:47 localhost podman[306082]: 2025-10-14 10:02:47.897597916 +0000 UTC m=+0.075040552 container remove 808228dd4ecc166f00786ce34cbcbd20bd04840e4ee68cd17eadd0c14c5306e3 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=reverent_montalcini, build-date=2025-09-24T08:57:55, maintainer=Guillaume Abrioux , architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, ceph=True, io.buildah.version=1.33.12, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, name=rhceph, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., RELEASE=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, release=553, io.k8s.description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, version=7, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=) Oct 14 06:02:47 localhost systemd[1]: libpod-conmon-808228dd4ecc166f00786ce34cbcbd20bd04840e4ee68cd17eadd0c14c5306e3.scope: Deactivated successfully. Oct 14 06:02:48 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc' Oct 14 06:02:48 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc' Oct 14 06:02:48 localhost ceph-mon[303906]: Reconfiguring osd.0 (monmap changed)... 
Oct 14 06:02:48 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc' cmd={"prefix": "auth get", "entity": "osd.0"} : dispatch Oct 14 06:02:48 localhost ceph-mon[303906]: Reconfiguring daemon osd.0 on np0005486733.localdomain Oct 14 06:02:48 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc' Oct 14 06:02:48 localhost ceph-mon[303906]: mon.np0005486733@2(peon).osd e80 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Oct 14 06:02:48 localhost podman[306159]: Oct 14 06:02:48 localhost podman[306159]: 2025-10-14 10:02:48.750993067 +0000 UTC m=+0.075753263 container create c2b16a38adeb858a50a4c5e90c3d8bc4dba2ac53699a8c01e5dd929877a3c4b7 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=beautiful_yalow, GIT_BRANCH=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.expose-services=, maintainer=Guillaume Abrioux , release=553, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, distribution-scope=public, vendor=Red Hat, Inc., name=rhceph, GIT_CLEAN=True, CEPH_POINT_RELEASE=, architecture=x86_64, build-date=2025-09-24T08:57:55, com.redhat.component=rhceph-container, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, ceph=True, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, description=Red Hat Ceph Storage 7) Oct 14 06:02:48 localhost systemd[1]: Started 
libpod-conmon-c2b16a38adeb858a50a4c5e90c3d8bc4dba2ac53699a8c01e5dd929877a3c4b7.scope. Oct 14 06:02:48 localhost systemd[1]: Started libcrun container. Oct 14 06:02:48 localhost podman[306159]: 2025-10-14 10:02:48.721431866 +0000 UTC m=+0.046192092 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Oct 14 06:02:48 localhost podman[306159]: 2025-10-14 10:02:48.835151657 +0000 UTC m=+0.159911853 container init c2b16a38adeb858a50a4c5e90c3d8bc4dba2ac53699a8c01e5dd929877a3c4b7 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=beautiful_yalow, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux , RELEASE=main, ceph=True, GIT_CLEAN=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, vendor=Red Hat, Inc., GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, version=7, build-date=2025-09-24T08:57:55, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, name=rhceph, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=rhceph-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-type=git, io.openshift.tags=rhceph ceph, release=553, io.buildah.version=1.33.12) Oct 14 06:02:48 localhost podman[306159]: 2025-10-14 10:02:48.847910626 +0000 UTC m=+0.172670832 container start c2b16a38adeb858a50a4c5e90c3d8bc4dba2ac53699a8c01e5dd929877a3c4b7 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=beautiful_yalow, io.buildah.version=1.33.12, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., 
com.redhat.component=rhceph-container, GIT_CLEAN=True, ceph=True, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, vcs-type=git, version=7, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, build-date=2025-09-24T08:57:55, distribution-scope=public, RELEASE=main, GIT_BRANCH=main, maintainer=Guillaume Abrioux , GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, release=553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, CEPH_POINT_RELEASE=, architecture=x86_64, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0) Oct 14 06:02:48 localhost podman[306159]: 2025-10-14 10:02:48.850923318 +0000 UTC m=+0.175683554 container attach c2b16a38adeb858a50a4c5e90c3d8bc4dba2ac53699a8c01e5dd929877a3c4b7 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=beautiful_yalow, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_BRANCH=main, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux , io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, version=7, io.openshift.tags=rhceph ceph, ceph=True, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=553, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, io.openshift.expose-services=, build-date=2025-09-24T08:57:55, vcs-type=git, RELEASE=main, description=Red Hat Ceph Storage 7, architecture=x86_64, 
com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public) Oct 14 06:02:48 localhost beautiful_yalow[306174]: 167 167 Oct 14 06:02:48 localhost systemd[1]: libpod-c2b16a38adeb858a50a4c5e90c3d8bc4dba2ac53699a8c01e5dd929877a3c4b7.scope: Deactivated successfully. Oct 14 06:02:48 localhost podman[306159]: 2025-10-14 10:02:48.856367925 +0000 UTC m=+0.181128151 container died c2b16a38adeb858a50a4c5e90c3d8bc4dba2ac53699a8c01e5dd929877a3c4b7 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=beautiful_yalow, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=553, ceph=True, description=Red Hat Ceph Storage 7, RELEASE=main, GIT_BRANCH=main, architecture=x86_64, vcs-type=git, maintainer=Guillaume Abrioux , io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/agreements, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, build-date=2025-09-24T08:57:55, GIT_CLEAN=True, distribution-scope=public, version=7, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, name=rhceph) Oct 14 06:02:48 localhost systemd[1]: var-lib-containers-storage-overlay-bc514c57ecef7d76d85a8f1eb5b5621852b5d7c28dca00eb77989851476d68a4-merged.mount: Deactivated successfully. 
Oct 14 06:02:48 localhost podman[306180]: 2025-10-14 10:02:48.955458409 +0000 UTC m=+0.092280118 container remove c2b16a38adeb858a50a4c5e90c3d8bc4dba2ac53699a8c01e5dd929877a3c4b7 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=beautiful_yalow, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/agreements, ceph=True, name=rhceph, architecture=x86_64, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.buildah.version=1.33.12, build-date=2025-09-24T08:57:55, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, version=7, maintainer=Guillaume Abrioux , vendor=Red Hat, Inc., io.openshift.expose-services=, release=553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.tags=rhceph ceph) Oct 14 06:02:48 localhost systemd[1]: libpod-conmon-c2b16a38adeb858a50a4c5e90c3d8bc4dba2ac53699a8c01e5dd929877a3c4b7.scope: Deactivated successfully. Oct 14 06:02:49 localhost nova_compute[297686]: 2025-10-14 10:02:49.105 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:02:49 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc' Oct 14 06:02:49 localhost ceph-mon[303906]: Reconfiguring osd.3 (monmap changed)... 
Oct 14 06:02:49 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc' cmd={"prefix": "auth get", "entity": "osd.3"} : dispatch Oct 14 06:02:49 localhost ceph-mon[303906]: Reconfiguring daemon osd.3 on np0005486733.localdomain Oct 14 06:02:49 localhost podman[306257]: Oct 14 06:02:49 localhost podman[306257]: 2025-10-14 10:02:49.787629873 +0000 UTC m=+0.078371363 container create e33a224e9fb139b580640412d6442b714872f3d3a0ff1dbf0a191e9675628e5d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=boring_lewin, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., io.openshift.expose-services=, release=553, vcs-type=git, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, name=rhceph, distribution-scope=public, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, maintainer=Guillaume Abrioux , build-date=2025-09-24T08:57:55, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/agreements, ceph=True, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, GIT_CLEAN=True) Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.817 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'name': 'test', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'os_type': 
'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005486733.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '41187b090f3d4818a32baa37ce8a3991', 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'hostId': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228 Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.817 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters Oct 14 06:02:49 localhost systemd[1]: Started libpod-conmon-e33a224e9fb139b580640412d6442b714872f3d3a0ff1dbf0a191e9675628e5d.scope. Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.839 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.read.requests volume: 1272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.840 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.843 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'f5abac4d-31d8-406c-90fa-835a5e3ad0a1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1272, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T10:02:49.818192', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'f1091e12-a8e4-11f0-9707-fa163e99780b', 'monotonic_time': 12186.010892892, 'message_signature': '30fda5d0702c6676902e416820d8082272c29e508aa2ea5e25ea8f7f6ffdac5d'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T10:02:49.818192', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'f10930f0-a8e4-11f0-9707-fa163e99780b', 'monotonic_time': 12186.010892892, 'message_signature': '98974cfed24969975cd72fe8b8d66bdb2fe44878c77aa0a7f7e662671b751a00'}]}, 'timestamp': '2025-10-14 10:02:49.840973', '_unique_id': '841cdc1bc8b944b3a030e1e1f0f802dc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.843 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.843 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.843 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.843 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.843 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.843 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.843 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.843 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.843 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.843 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.843 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.843 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.843 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.843 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.843 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.843 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 
06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.843 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.843 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.843 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.843 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.843 12 ERROR oslo_messaging.notify.messaging Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.843 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.843 12 ERROR oslo_messaging.notify.messaging Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.843 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.843 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.843 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.843 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:02:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.843 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.843 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.843 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.843 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.843 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.843 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.843 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.843 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.843 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.843 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.843 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.843 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.843 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.843 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.843 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.843 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.843 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.843 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.843 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.843 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:02:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.843 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.843 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.843 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.843 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.843 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.843 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.843 12 ERROR oslo_messaging.notify.messaging Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.844 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters Oct 14 06:02:49 localhost systemd[1]: Started libcrun container. Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.848 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.849 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'f331b1e0-aaa0-474d-890c-03094a4748e7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T10:02:49.844550', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': 'f10a625e-a8e4-11f0-9707-fa163e99780b', 'monotonic_time': 12186.037269048, 'message_signature': '3e4534bb4367554efbc11798a60d7e966b70d2916df3cea695b0293cb9e8ccdc'}]}, 'timestamp': '2025-10-14 10:02:49.848772', '_unique_id': 'ba3a950e04354ed3b0f3d1abaa92568b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.849 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:02:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.849 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.849 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.849 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.849 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.849 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.849 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.849 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.849 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.849 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.849 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.849 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.849 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.849 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.849 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.849 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.849 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.849 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.849 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.849 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.849 12 ERROR oslo_messaging.notify.messaging Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.849 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.849 12 ERROR oslo_messaging.notify.messaging Oct 14 06:02:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.849 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.849 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.849 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.849 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.849 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.849 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.849 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.849 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.849 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.849 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.849 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.849 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.849 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.849 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.849 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.849 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.849 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.849 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.849 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.849 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:02:49 
localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.849 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.849 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.849 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.849 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.849 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.849 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.849 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.849 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.849 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.849 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.849 12 ERROR oslo_messaging.notify.messaging Oct 14 06:02:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.851 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters Oct 14 06:02:49 localhost podman[306257]: 2025-10-14 10:02:49.755574505 +0000 UTC m=+0.046316045 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Oct 14 06:02:49 localhost podman[306257]: 2025-10-14 10:02:49.857997551 +0000 UTC m=+0.148739041 container init e33a224e9fb139b580640412d6442b714872f3d3a0ff1dbf0a191e9675628e5d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=boring_lewin, architecture=x86_64, distribution-scope=public, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/agreements, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, build-date=2025-09-24T08:57:55, release=553, ceph=True, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux , com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., GIT_BRANCH=main, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, vcs-type=git, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, version=7, io.k8s.description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553) Oct 14 06:02:49 localhost podman[306257]: 2025-10-14 10:02:49.869300116 +0000 UTC m=+0.160041626 container start e33a224e9fb139b580640412d6442b714872f3d3a0ff1dbf0a191e9675628e5d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=boring_lewin, name=rhceph, maintainer=Guillaume Abrioux , io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and 
supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_BRANCH=main, vcs-type=git, com.redhat.component=rhceph-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, distribution-scope=public, version=7, ceph=True, CEPH_POINT_RELEASE=, RELEASE=main, release=553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9) Oct 14 06:02:49 localhost podman[306257]: 2025-10-14 10:02:49.869609707 +0000 UTC m=+0.160351247 container attach e33a224e9fb139b580640412d6442b714872f3d3a0ff1dbf0a191e9675628e5d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=boring_lewin, architecture=x86_64, distribution-scope=public, maintainer=Guillaume Abrioux , com.redhat.license_terms=https://www.redhat.com/agreements, release=553, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, GIT_CLEAN=True, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, ceph=True, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., version=7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, RELEASE=main, GIT_BRANCH=main, build-date=2025-09-24T08:57:55, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 
in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.33.12) Oct 14 06:02:49 localhost boring_lewin[306272]: 167 167 Oct 14 06:02:49 localhost systemd[1]: libpod-e33a224e9fb139b580640412d6442b714872f3d3a0ff1dbf0a191e9675628e5d.scope: Deactivated successfully. Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.872 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/memory.usage volume: 51.81640625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:02:49 localhost podman[306257]: 2025-10-14 10:02:49.873848986 +0000 UTC m=+0.164590566 container died e33a224e9fb139b580640412d6442b714872f3d3a0ff1dbf0a191e9675628e5d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=boring_lewin, version=7, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-09-24T08:57:55, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/agreements, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, vcs-type=git, ceph=True, GIT_CLEAN=True, io.buildah.version=1.33.12, architecture=x86_64, release=553, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.expose-services=) Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.875 12 ERROR 
oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a89f64fc-2d77-4aa4-951d-d287b9beedf8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.81640625, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'timestamp': '2025-10-14T10:02:49.851447', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': 'f10e3316-a8e4-11f0-9707-fa163e99780b', 'monotonic_time': 12186.065565392, 'message_signature': '8ef39075812c0c121d779e40cc46554739a77576a223aca57dba1a88ee6f120e'}]}, 'timestamp': '2025-10-14 10:02:49.873899', '_unique_id': '95d0c6165b4744c1969a3ce7e929bf81'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.875 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.875 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.875 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.875 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.875 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.875 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.875 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.875 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.875 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.875 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.875 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.875 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:02:49 
localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.875 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.875 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.875 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.875 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.875 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.875 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.875 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.875 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.875 12 ERROR oslo_messaging.notify.messaging Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.875 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.875 12 ERROR oslo_messaging.notify.messaging Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.875 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call 
last): Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.875 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.875 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.875 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.875 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.875 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.875 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.875 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.875 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.875 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.875 12 ERROR oslo_messaging.notify.messaging 
return rpc_common.ConnectionContext(self._connection_pool, Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.875 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.875 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.875 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.875 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.875 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.875 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.875 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.875 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.875 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.875 12 ERROR oslo_messaging.notify.messaging 
self.connection.ensure_connection( Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.875 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.875 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.875 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.875 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.875 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.875 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.875 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.875 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.875 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.875 12 ERROR oslo_messaging.notify.messaging Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.877 12 INFO ceilometer.polling.manager [-] Polling pollster 
disk.device.write.latency in the context of pollsters Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.877 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.write.latency volume: 1178650666 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.878 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.write.latency volume: 22746151 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.880 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c441b631-5c99-4f28-bc6f-580b1e8eca4b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1178650666, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T10:02:49.877771', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 
'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'f10ee90a-a8e4-11f0-9707-fa163e99780b', 'monotonic_time': 12186.010892892, 'message_signature': '743bdea11beed2c68c0d5da46ba1c45ac5812c40da841905000c63a8e5653d0a'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 22746151, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T10:02:49.877771', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'f10efce2-a8e4-11f0-9707-fa163e99780b', 'monotonic_time': 12186.010892892, 'message_signature': 'ce57dc339fb90aec10307247c3d5cd15e70afeecc488d40b482ce4d0fb651936'}]}, 'timestamp': '2025-10-14 10:02:49.878943', '_unique_id': '9577d0dc3997458a8128b5cd301c28f9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.880 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:02:49 
localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.880 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.880 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.880 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.880 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.880 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.880 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.880 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.880 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.880 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.880 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.880 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.880 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.880 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.880 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.880 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.880 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.880 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.880 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.880 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.880 12 ERROR oslo_messaging.notify.messaging Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.880 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.880 12 ERROR oslo_messaging.notify.messaging Oct 14 06:02:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.880 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.880 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.880 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.880 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.880 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.880 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.880 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.880 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.880 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.880 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.880 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.880 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.880 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.880 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.880 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.880 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.880 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.880 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.880 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.880 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:02:49 
localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.880 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.880 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.880 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.880 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.880 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.880 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.880 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.880 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.880 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.880 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.880 12 ERROR oslo_messaging.notify.messaging Oct 14 06:02:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.881 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.891 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.891 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.893 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8db552eb-4d97-416a-a5a0-e27ebfb8ad8e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T10:02:49.881303', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'f110f4ca-a8e4-11f0-9707-fa163e99780b', 'monotonic_time': 12186.07400864, 'message_signature': '826c640aeef207515664505a29ff29c6ff7f899aa16ef6acd9f8c6e06d6c45c9'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T10:02:49.881303', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'f1110910-a8e4-11f0-9707-fa163e99780b', 'monotonic_time': 12186.07400864, 'message_signature': 'da5cde6cfd26ec7c03abcec55c3ad3f4ee9aeb795b1c384b2b6a64013a5fd7dd'}]}, 'timestamp': '2025-10-14 10:02:49.892287', '_unique_id': 'ed77125eb2804d079d4a08d0526f49a6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.893 12 ERROR 
oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.893 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.893 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.893 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.893 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.893 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:02:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.893 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.893 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.893 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.893 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.893 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.893 12 ERROR oslo_messaging.notify.messaging Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.893 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 
2025-10-14 10:02:49.893 12 ERROR oslo_messaging.notify.messaging Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.893 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.893 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.893 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.893 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.893 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.893 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.893 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.893 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.893 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.893 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.893 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.893 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.893 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.893 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.893 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.893 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.893 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.893 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:02:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.893 12 ERROR oslo_messaging.notify.messaging Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.894 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.894 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.896 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'eac639e7-2a19-45f3-809d-cb4f349e2bb8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T10:02:49.894879', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 
'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': 'f111808e-a8e4-11f0-9707-fa163e99780b', 'monotonic_time': 12186.037269048, 'message_signature': '6e8f619f830ccc0fe572d96740f157daf62d7fcbaf4f5319dd4410f057ea7157'}]}, 'timestamp': '2025-10-14 10:02:49.895375', '_unique_id': '729da52d2c194dce853ddf81811f76e1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.896 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.896 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.896 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.896 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", 
line 877, in _connection_factory Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.896 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.896 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.896 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.896 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.896 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.896 12 
ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.896 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.896 12 ERROR oslo_messaging.notify.messaging Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.896 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.896 12 ERROR oslo_messaging.notify.messaging Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.896 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.896 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.896 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.896 12 ERROR 
oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.896 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.896 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.896 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.896 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.896 12 ERROR 
oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.896 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.896 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.896 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.896 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.896 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:02:49 
localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.896 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.896 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.896 12 ERROR oslo_messaging.notify.messaging Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.897 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.897 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.write.bytes volume: 454656 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.898 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.899 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'fcc6d4c2-cab2-4831-acc8-ec5c9567aaa8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 454656, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T10:02:49.897645', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'f111ed58-a8e4-11f0-9707-fa163e99780b', 'monotonic_time': 12186.010892892, 'message_signature': 'ae0c07572af5f548344dc7a4fb93bd7b5c0010210fefd2a869093c067ca53334'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T10:02:49.897645', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'f111fe10-a8e4-11f0-9707-fa163e99780b', 'monotonic_time': 12186.010892892, 'message_signature': '83868653299004866e9acec8091f9f2823e5e5da64d989bf8a24ba83b7ac9aea'}]}, 'timestamp': '2025-10-14 10:02:49.898553', '_unique_id': '5fe2338780124f5b82dcfd43592efb27'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.899 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.899 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.899 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.899 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.899 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.899 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.899 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.899 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.899 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 
06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.899 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.899 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.899 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.899 12 ERROR oslo_messaging.notify.messaging Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.899 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.899 12 ERROR oslo_messaging.notify.messaging Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.899 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.899 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:02:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.899 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.899 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.899 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.899 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.899 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.899 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.899 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.899 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.899 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.899 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.899 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.899 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.899 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.899 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.899 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.900 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.901 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/cpu volume: 12490000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.902 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8a9da278-4831-4d46-bfa0-73be9c9ac938', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 12490000000, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'timestamp': '2025-10-14T10:02:49.901020', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': 'f1127034-a8e4-11f0-9707-fa163e99780b', 'monotonic_time': 12186.065565392, 'message_signature': '0c7a1e2340c0474d9cf27953d03e6c0d42cd57082682fd104ba7dc0e56d65149'}]}, 'timestamp': '2025-10-14 10:02:49.901492', '_unique_id': 'a639f13eba3240ba9b9334a23ff86f53'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.902 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.902 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.902 12 ERROR oslo_messaging.notify.messaging yield
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.902 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.902 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.902 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.902 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.902 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.902 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.902 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.902 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.902 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.902 12 ERROR oslo_messaging.notify.messaging conn.connect()
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.902 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.902 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.902 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.902 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.902 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.902 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.902 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.902 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.902 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.902 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.902 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.902 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.902 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.902 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.902 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.902 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.902 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.902 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.902 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.902 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.902 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.902 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.902 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.902 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.902 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.902 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.902 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.902 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.902 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.902 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.902 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.902 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.902 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.902 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.902 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.902 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.902 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.902 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.902 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.902 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.902 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.904 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.904 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.read.latency volume: 1739521236 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.904 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.read.latency volume: 110324003 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.905 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a7b26892-b4f6-474d-9953-58b9146b82c0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1739521236, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T10:02:49.904128', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'f112e5a0-a8e4-11f0-9707-fa163e99780b', 'monotonic_time': 12186.010892892, 'message_signature': '5e23fbd1b67ce2924a817d767f82e4726280e8565f4bb2e444af104930209671'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 110324003, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T10:02:49.904128', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'f112f022-a8e4-11f0-9707-fa163e99780b', 'monotonic_time': 12186.010892892, 'message_signature': '71b89e5147e1454d59269059ac141635533e2b5b5b95cc330a3276dbc51816c1'}]}, 'timestamp': '2025-10-14 10:02:49.904676', '_unique_id': '56dc6939265c4f1eaec1de368526247a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.905 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.905 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.905 12 ERROR oslo_messaging.notify.messaging yield
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.905 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.905 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.905 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.905 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.905 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.905 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.905 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.905 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.905 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.905 12 ERROR oslo_messaging.notify.messaging conn.connect()
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.905 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.905 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.905 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.905 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.905 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.905 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.905 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.905 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.905 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.905 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.905 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.905 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.905 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.905 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.905 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.905 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.905 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.905 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.905 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.905 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.905 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.905 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.905 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.905 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.905 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.905 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.905 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.905 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.905 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.905 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.905 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.905 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.905 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.905 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.905 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.905 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.905 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.905 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.905 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.905 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.905 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.905 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.906 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.read.bytes volume: 35560448 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.906 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.907 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '45da0fa4-4e62-412c-8d55-ea9a845afd92', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35560448, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T10:02:49.906049', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'f11330a0-a8e4-11f0-9707-fa163e99780b', 'monotonic_time': 12186.010892892, 'message_signature': 'bce5952be3fb891d4e3f83b846f340802e322f856c00eedad9f9bea6282f4114'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T10:02:49.906049', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'f1133ae6-a8e4-11f0-9707-fa163e99780b', 'monotonic_time': 12186.010892892, 'message_signature': '031dfbdae728c4cb7e8afbca113e499eb7ae238cfe0cff786e19425d0a944fd0'}]}, 'timestamp': '2025-10-14 10:02:49.906584', '_unique_id': '01864dd270e04209a1c5496996d9b70a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.907 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.907 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.907 12 ERROR oslo_messaging.notify.messaging yield
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.907 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.907 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.907 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.907 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.907 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.907 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.907 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.907 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.907 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.907 12 ERROR oslo_messaging.notify.messaging conn.connect()
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.907 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.907 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.907 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.907 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.907 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.907 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.907 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.907 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.907 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.907 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.907 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.907 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.907 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.907 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.907 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.907 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.907 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.907 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.907 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.907 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.907 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.907 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.907 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.907 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.907 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.907 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.907 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.907 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.907 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.907 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in
ensure_connection Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.907 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.907 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.907 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.907 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.907 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.907 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.907 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.907 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.907 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.907 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.907 12 ERROR oslo_messaging.notify.messaging Oct 14 
06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.907 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.907 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.908 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '77099fa3-14ec-48c9-812f-56be3026fbe5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T10:02:49.907971', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 
'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': 'f1137c04-a8e4-11f0-9707-fa163e99780b', 'monotonic_time': 12186.037269048, 'message_signature': '35093fb941f4bb892fb3449b9a83df469ddd26567b0b89189e8d4f456fcacd88'}]}, 'timestamp': '2025-10-14 10:02:49.908270', '_unique_id': '8bbc08eb96ba47e8a3498f38ce834ff3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.908 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.908 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.908 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.908 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 
2025-10-14 10:02:49.908 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.908 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.908 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.908 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.908 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.908 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:02:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.908 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.908 12 ERROR oslo_messaging.notify.messaging Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.908 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.908 12 ERROR oslo_messaging.notify.messaging Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.908 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.908 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.908 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.908 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:02:49 
localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.908 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.908 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.908 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.908 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.908 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) 
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.908 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.908 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.908 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.908 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.908 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.908 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.908 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.908 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.908 12 ERROR oslo_messaging.notify.messaging Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.909 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.909 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.outgoing.bytes volume: 10162 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.910 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '0bbfe88c-3039-4309-afeb-e3ca945c9428', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 10162, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T10:02:49.909586', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': 'f113bbba-a8e4-11f0-9707-fa163e99780b', 'monotonic_time': 12186.037269048, 'message_signature': 'd7ec5393b946d06fce106b47271a3c4d0517d3c4ba3c931112bce0477433b920'}]}, 'timestamp': '2025-10-14 10:02:49.909902', '_unique_id': '528419a320e74f0abd8c9f9141cb3bd6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.910 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:02:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.910 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.910 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.910 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.910 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.910 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.910 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.910 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.910 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.910 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.910 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.910 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.910 12 ERROR oslo_messaging.notify.messaging Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.910 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.910 12 ERROR oslo_messaging.notify.messaging Oct 14 06:02:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.910 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.910 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.910 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.910 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.910 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.910 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.910 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.910 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.910 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.910 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:02:49 
localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.910 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.910 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.910 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.910 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.910 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.910 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.910 12 ERROR oslo_messaging.notify.messaging Oct 14 06:02:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.911 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.911 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.incoming.bytes volume: 8190 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.912 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f786df25-5a6a-45a8-972d-70ff697f250f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 8190, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T10:02:49.911202', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 
'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': 'f113f9d6-a8e4-11f0-9707-fa163e99780b', 'monotonic_time': 12186.037269048, 'message_signature': '5029865c37408f8b073b16513b5ca991c9e7986ebe13a4eeaaec036b3078235d'}]}, 'timestamp': '2025-10-14 10:02:49.911491', '_unique_id': 'adb522bc858c4818904e30f0cb6f06e0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.912 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.912 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.912 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.912 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.912 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.912 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.912 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.912 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.912 12 ERROR 
oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.912 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.912 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.912 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.912 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.912 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.912 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.912 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.912 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.912 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.912 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 
2025-10-14 10:02:49.912 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.912 12 ERROR oslo_messaging.notify.messaging Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.912 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.912 12 ERROR oslo_messaging.notify.messaging Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.912 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.912 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.912 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.912 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.912 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.912 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.912 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:02:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.912 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.912 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.912 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.912 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.912 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.912 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.912 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.912 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.912 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.912 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 
06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.912 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.912 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.912 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.912 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.912 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.912 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.912 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.912 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.912 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.912 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.912 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.912 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.912 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.912 12 ERROR oslo_messaging.notify.messaging Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.912 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.912 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.912 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.write.requests volume: 53 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.913 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.914 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '2b6833c6-db09-474c-90be-8d5b19825bbe', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 53, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T10:02:49.912917', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'f1143cac-a8e4-11f0-9707-fa163e99780b', 'monotonic_time': 12186.010892892, 'message_signature': '3b56d518f7f05d73a2c44e319ffdbdb3bdb60ce1c07438805dddcbdf750f8aee'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T10:02:49.912917', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'f11446e8-a8e4-11f0-9707-fa163e99780b', 'monotonic_time': 12186.010892892, 'message_signature': '3d11d7008919d5cd2e080faad6277719b5fc10948e84cd8cbafb316e9467478d'}]}, 'timestamp': '2025-10-14 10:02:49.913446', '_unique_id': '411151234f684a7da73112df51107b91'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.914 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.914 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.914 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.914 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.914 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.914 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.914 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.914 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.914 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.914 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.914 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.914 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.914 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.914 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.914 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.914 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 
06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.914 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.914 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.914 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.914 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.914 12 ERROR oslo_messaging.notify.messaging Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.914 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.914 12 ERROR oslo_messaging.notify.messaging Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.914 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.914 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.914 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.914 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:02:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.914 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.914 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.914 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.914 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.914 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.914 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.914 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.914 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.914 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.914 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.914 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.914 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.914 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.914 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.914 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.914 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.914 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.914 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.914 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.914 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:02:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.914 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.914 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.914 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.914 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.914 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.914 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.914 12 ERROR oslo_messaging.notify.messaging Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.914 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.914 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.915 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '78428b5e-b0eb-474c-bf72-453356108ee4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T10:02:49.914927', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': 'f1148b8a-a8e4-11f0-9707-fa163e99780b', 'monotonic_time': 12186.037269048, 'message_signature': '81fe72d927e01d96c20af15f9baaa604f8eb5e1d19b89d07d5baebaa6f0e266b'}]}, 'timestamp': '2025-10-14 10:02:49.915224', '_unique_id': 'c328285784204f1db0f7c6d1aab2134a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.915 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:02:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.915 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.915 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.915 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.915 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.915 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.915 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.915 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.915 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.915 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.915 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.915 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.915 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.915 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.915 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.915 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.915 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.915 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.915 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.915 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.915 12 ERROR oslo_messaging.notify.messaging Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.915 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.915 12 ERROR oslo_messaging.notify.messaging Oct 14 06:02:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.915 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.915 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.915 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.915 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.915 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.915 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.915 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.915 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.915 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.915 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.915 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.915 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.915 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.915 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.915 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.915 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.915 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.915 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.915 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.915 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.915 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.915 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.915 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.915 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.915 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.915 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.915 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.915 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.915 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.915 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.915 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.916 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.916 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.916 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.916 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.916 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.incoming.packets volume: 79 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.917 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '51ed0b5f-208e-49e8-8920-21c5d0c9178d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 79, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T10:02:49.916855', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': 'f114d702-a8e4-11f0-9707-fa163e99780b', 'monotonic_time': 12186.037269048, 'message_signature': '6bef9b062b20e29084835252b5b895eec7bc53471fbe9ff26e3e5dcef77a14d5'}]}, 'timestamp': '2025-10-14 10:02:49.917157', '_unique_id': '385483e5a38445b5a8f489aced220502'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.917 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.917 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.917 12 ERROR oslo_messaging.notify.messaging yield
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.917 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.917 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.917 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.917 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.917 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.917 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.917 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.917 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.917 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.917 12 ERROR oslo_messaging.notify.messaging conn.connect()
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.917 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.917 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.917 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.917 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.917 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.917 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.917 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.917 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.917 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.917 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.917 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.917 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.917 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.917 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.917 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.917 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.917 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.917 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.917 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.917 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.917 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.917 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.917 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.917 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.917 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.917 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.917 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.917 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.917 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.917 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.917 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.917 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.917 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.917 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.917 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.917 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.917 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.917 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.917 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.917 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.917 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.918 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.918 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.918 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.919 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e31a6826-7846-45a2-80ae-0280a1c6f8ff', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T10:02:49.918528', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'f1151a6e-a8e4-11f0-9707-fa163e99780b', 'monotonic_time': 12186.07400864, 'message_signature': '0bd092bf54ffdb410f1fa8ce3de20fb9fd65fac8de087887450ceae39e9183c2'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T10:02:49.918528', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'f11528ec-a8e4-11f0-9707-fa163e99780b', 'monotonic_time': 12186.07400864, 'message_signature': '7bd2bfd228f42f0fc9d2942b6d1666f76521e46c576755d26561025ae5c36f96'}]}, 'timestamp': '2025-10-14 10:02:49.919240', '_unique_id': 'e17159ba281248b9a64a1f531c8278c3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.919 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.919 12 ERROR oslo_messaging.notify.messaging yield
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.919 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.919 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.919 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.919 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.919 12 ERROR oslo_messaging.notify.messaging conn.connect()
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.919 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.919 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.919 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.919 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.919 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.919 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.919 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.919 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.919 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.919 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.919 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.919 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.919 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.919 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.919 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.919 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.919 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.919 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.919 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.919 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.919 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.919 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.919 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.919 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.920 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.920 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.921 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'eccbd832-e08e-450d-9675-0f007eee656a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T10:02:49.920627', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': 'f1156b2c-a8e4-11f0-9707-fa163e99780b', 'monotonic_time': 12186.037269048, 'message_signature': '8306eb8fee0fe4f7b57877bf217c1b4d9d1321513c00dddcad8b51a18f4225ed'}]}, 'timestamp': '2025-10-14 10:02:49.920949', '_unique_id': 'da2752f886154d2e853dd013ce166ee2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.921 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.921 12 ERROR oslo_messaging.notify.messaging yield
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.921 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.921 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.921 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.921 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.921 12 ERROR oslo_messaging.notify.messaging conn.connect()
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.921 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.921 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.921 12
ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.921 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.921 12 ERROR oslo_messaging.notify.messaging Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.921 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.921 12 ERROR oslo_messaging.notify.messaging Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.921 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.921 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.921 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.921 12 ERROR 
oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.921 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.921 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.921 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.921 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.921 12 ERROR 
oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.921 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.921 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.921 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.921 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.921 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:02:49 
localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.921 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.921 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.921 12 ERROR oslo_messaging.notify.messaging Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.922 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.922 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.outgoing.packets volume: 118 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.923 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '529b8b75-99cb-4a72-bb2f-0b5057914390', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 118, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T10:02:49.922269', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': 'f115aa42-a8e4-11f0-9707-fa163e99780b', 'monotonic_time': 12186.037269048, 'message_signature': '36d8bc2c4d1d29c8726cdd01d7c27b8a2772e9405ea2ba926a99e89a5daf0b66'}]}, 'timestamp': '2025-10-14 10:02:49.922564', '_unique_id': 'cdc5489b7c174757947e58694bf2cee7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.923 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:02:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.923 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.923 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.923 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.923 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.923 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.923 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.923 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.923 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.923 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.923 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.923 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.923 12 ERROR oslo_messaging.notify.messaging Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.923 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.923 12 ERROR oslo_messaging.notify.messaging Oct 14 06:02:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.923 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.923 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.923 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.923 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.923 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.923 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.923 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.923 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.923 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.923 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:02:49 
localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.923 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.923 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.923 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.923 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.923 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.923 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.923 12 ERROR oslo_messaging.notify.messaging Oct 14 06:02:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.923 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.923 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.924 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd808e72c-3b9d-4b78-8a05-25de3c24bdc3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T10:02:49.923957', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 
'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': 'f115ec32-a8e4-11f0-9707-fa163e99780b', 'monotonic_time': 12186.037269048, 'message_signature': 'c7b2fb6c5a1fe02c356bf3226f84b902854eec561eb68e58e90691e79bd610c3'}]}, 'timestamp': '2025-10-14 10:02:49.924251', '_unique_id': 'd15cd732a931455eba1ede73dedfab51'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.924 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.924 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.924 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.924 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 
10:02:49.924 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.924 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.924 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.924 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.924 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.924 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:02:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.924 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.924 12 ERROR oslo_messaging.notify.messaging Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.924 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.924 12 ERROR oslo_messaging.notify.messaging Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.924 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.924 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.924 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.924 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:02:49 
localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.924 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.924 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.924 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.924 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.924 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) 
Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.924 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.924 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.924 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.924 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.924 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.924 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.924 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.924 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.924 12 ERROR oslo_messaging.notify.messaging Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.925 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.925 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.925 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.926 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'e0a77d17-e869-4f26-a273-544dd0217961', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T10:02:49.925553', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'f1162b8e-a8e4-11f0-9707-fa163e99780b', 'monotonic_time': 12186.07400864, 'message_signature': '0864b467d85a96977bef958a4e4f642b5a3e4efdc79cf7176990fe8f97dfac34'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T10:02:49.925553', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 
'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'f116361a-a8e4-11f0-9707-fa163e99780b', 'monotonic_time': 12186.07400864, 'message_signature': '4cba43a996de3174cd4eefb927219c3dba36fb584478482781b4a59cb96e3b6d'}]}, 'timestamp': '2025-10-14 10:02:49.926124', '_unique_id': '3e5ee91da4534917963cdf0a909ad0a0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.926 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.926 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.926 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.926 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.926 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.926 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.926 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.926 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.926 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 06:02:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.926 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.926 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.926 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.926 12 ERROR oslo_messaging.notify.messaging Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.926 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.926 12 ERROR oslo_messaging.notify.messaging Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.926 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.926 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 
10:02:49.926 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.926 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.926 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.926 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.926 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 06:02:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.926 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.926 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.926 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.926 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.926 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 
10:02:49.926 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.926 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.926 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.926 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:02:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:02:49.926 12 ERROR oslo_messaging.notify.messaging Oct 14 06:02:49 localhost systemd[1]: var-lib-containers-storage-overlay-558d87d2ee221c2de525c0c87613e54ad5fa442c4d79a30134824b786479ae0e-merged.mount: Deactivated successfully. 
Oct 14 06:02:49 localhost podman[306277]: 2025-10-14 10:02:49.986842205 +0000 UTC m=+0.100453777 container remove e33a224e9fb139b580640412d6442b714872f3d3a0ff1dbf0a191e9675628e5d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=boring_lewin, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux , vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_CLEAN=True, io.buildah.version=1.33.12, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, name=rhceph, io.openshift.expose-services=, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, version=7, distribution-scope=public, build-date=2025-09-24T08:57:55, release=553, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, vcs-type=git) Oct 14 06:02:49 localhost systemd[1]: libpod-conmon-e33a224e9fb139b580640412d6442b714872f3d3a0ff1dbf0a191e9675628e5d.scope: Deactivated successfully. Oct 14 06:02:50 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc' Oct 14 06:02:50 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc' Oct 14 06:02:50 localhost ceph-mon[303906]: Reconfiguring mds.mds.np0005486733.tvstmf (monmap changed)... 
Oct 14 06:02:50 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005486733.tvstmf", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Oct 14 06:02:50 localhost ceph-mon[303906]: Reconfiguring daemon mds.mds.np0005486733.tvstmf on np0005486733.localdomain Oct 14 06:02:50 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc' Oct 14 06:02:50 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc' Oct 14 06:02:50 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005486733.primvu", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Oct 14 06:02:50 localhost podman[306345]: Oct 14 06:02:50 localhost podman[306345]: 2025-10-14 10:02:50.709510476 +0000 UTC m=+0.067423569 container create d35f7cf5bcffec7f6146d5143fa14cc54985eb20a9991a975d708772e074376e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=jovial_chandrasekhar, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, build-date=2025-09-24T08:57:55, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, version=7, com.redhat.component=rhceph-container, vcs-type=git, distribution-scope=public, name=rhceph, architecture=x86_64, CEPH_POINT_RELEASE=, release=553, vendor=Red Hat, Inc., io.buildah.version=1.33.12, io.k8s.display-name=Red Hat Ceph Storage 7 
on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, maintainer=Guillaume Abrioux , io.openshift.expose-services=, RELEASE=main) Oct 14 06:02:50 localhost systemd[1]: Started libpod-conmon-d35f7cf5bcffec7f6146d5143fa14cc54985eb20a9991a975d708772e074376e.scope. Oct 14 06:02:50 localhost systemd[1]: Started libcrun container. Oct 14 06:02:50 localhost podman[306345]: 2025-10-14 10:02:50.770397025 +0000 UTC m=+0.128310108 container init d35f7cf5bcffec7f6146d5143fa14cc54985eb20a9991a975d708772e074376e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=jovial_chandrasekhar, build-date=2025-09-24T08:57:55, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, distribution-scope=public, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux , summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, io.openshift.tags=rhceph ceph, name=rhceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, CEPH_POINT_RELEASE=, GIT_BRANCH=main, io.buildah.version=1.33.12, vcs-type=git, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, com.redhat.component=rhceph-container) Oct 14 06:02:50 localhost podman[306345]: 2025-10-14 10:02:50.779640127 +0000 UTC m=+0.137553220 container start d35f7cf5bcffec7f6146d5143fa14cc54985eb20a9991a975d708772e074376e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=jovial_chandrasekhar, io.k8s.description=Red 
Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, release=553, RELEASE=main, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.expose-services=, com.redhat.component=rhceph-container, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, vendor=Red Hat, Inc., io.buildah.version=1.33.12, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_BRANCH=main, version=7, description=Red Hat Ceph Storage 7, ceph=True, maintainer=Guillaume Abrioux , build-date=2025-09-24T08:57:55, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9) Oct 14 06:02:50 localhost podman[306345]: 2025-10-14 10:02:50.779924586 +0000 UTC m=+0.137837719 container attach d35f7cf5bcffec7f6146d5143fa14cc54985eb20a9991a975d708772e074376e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=jovial_chandrasekhar, description=Red Hat Ceph Storage 7, distribution-scope=public, ceph=True, io.openshift.expose-services=, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, architecture=x86_64, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, maintainer=Guillaume Abrioux , RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, version=7, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, release=553, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, 
org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-09-24T08:57:55, GIT_BRANCH=main) Oct 14 06:02:50 localhost jovial_chandrasekhar[306360]: 167 167 Oct 14 06:02:50 localhost podman[306345]: 2025-10-14 10:02:50.68636449 +0000 UTC m=+0.044277573 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Oct 14 06:02:50 localhost systemd[1]: libpod-d35f7cf5bcffec7f6146d5143fa14cc54985eb20a9991a975d708772e074376e.scope: Deactivated successfully. Oct 14 06:02:50 localhost podman[306345]: 2025-10-14 10:02:50.79088878 +0000 UTC m=+0.148801943 container died d35f7cf5bcffec7f6146d5143fa14cc54985eb20a9991a975d708772e074376e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=jovial_chandrasekhar, GIT_CLEAN=True, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., RELEASE=main, io.openshift.tags=rhceph ceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, version=7, description=Red Hat Ceph Storage 7, release=553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-09-24T08:57:55, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=Guillaume Abrioux , distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, name=rhceph, io.buildah.version=1.33.12, vcs-type=git, architecture=x86_64, com.redhat.component=rhceph-container, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, ceph=True) Oct 14 06:02:50 localhost systemd[1]: 
var-lib-containers-storage-overlay-180849b6e08f7e39f090512c468f2de9fedc1eebaf1c8a041140e0564c24c8d1-merged.mount: Deactivated successfully. Oct 14 06:02:50 localhost podman[306365]: 2025-10-14 10:02:50.882183278 +0000 UTC m=+0.081052856 container remove d35f7cf5bcffec7f6146d5143fa14cc54985eb20a9991a975d708772e074376e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=jovial_chandrasekhar, release=553, io.buildah.version=1.33.12, CEPH_POINT_RELEASE=, version=7, architecture=x86_64, GIT_BRANCH=main, com.redhat.component=rhceph-container, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux , summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, build-date=2025-09-24T08:57:55, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, ceph=True, GIT_CLEAN=True, vcs-type=git, RELEASE=main, name=rhceph) Oct 14 06:02:50 localhost systemd[1]: libpod-conmon-d35f7cf5bcffec7f6146d5143fa14cc54985eb20a9991a975d708772e074376e.scope: Deactivated successfully. Oct 14 06:02:51 localhost ceph-mon[303906]: Reconfiguring mgr.np0005486733.primvu (monmap changed)... 
Oct 14 06:02:51 localhost ceph-mon[303906]: Reconfiguring daemon mgr.np0005486733.primvu on np0005486733.localdomain Oct 14 06:02:51 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc' Oct 14 06:02:51 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc' Oct 14 06:02:51 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Oct 14 06:02:51 localhost podman[306434]: Oct 14 06:02:51 localhost podman[306434]: 2025-10-14 10:02:51.598080093 +0000 UTC m=+0.073417113 container create 782b2494a648d1b3af299d41e546d9c9057be1ee762ac2d6a83511a81d918561 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=charming_heisenberg, io.openshift.expose-services=, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, com.redhat.component=rhceph-container, name=rhceph, release=553, version=7, io.buildah.version=1.33.12, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux , build-date=2025-09-24T08:57:55, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, GIT_CLEAN=True) Oct 14 06:02:51 localhost systemd[1]: Started libpod-conmon-782b2494a648d1b3af299d41e546d9c9057be1ee762ac2d6a83511a81d918561.scope. 
Oct 14 06:02:51 localhost systemd[1]: Started libcrun container. Oct 14 06:02:51 localhost podman[306434]: 2025-10-14 10:02:51.653545716 +0000 UTC m=+0.128882746 container init 782b2494a648d1b3af299d41e546d9c9057be1ee762ac2d6a83511a81d918561 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=charming_heisenberg, io.buildah.version=1.33.12, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, maintainer=Guillaume Abrioux , io.openshift.tags=rhceph ceph, GIT_BRANCH=main, name=rhceph, build-date=2025-09-24T08:57:55, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, architecture=x86_64, release=553, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, io.openshift.expose-services=) Oct 14 06:02:51 localhost podman[306434]: 2025-10-14 10:02:51.662893222 +0000 UTC m=+0.138230272 container start 782b2494a648d1b3af299d41e546d9c9057be1ee762ac2d6a83511a81d918561 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=charming_heisenberg, ceph=True, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a 
fully featured and supported base image., release=553, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-09-24T08:57:55, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, maintainer=Guillaume Abrioux , com.redhat.component=rhceph-container, version=7, vendor=Red Hat, Inc., io.buildah.version=1.33.12, architecture=x86_64, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public) Oct 14 06:02:51 localhost podman[306434]: 2025-10-14 10:02:51.663217761 +0000 UTC m=+0.138554821 container attach 782b2494a648d1b3af299d41e546d9c9057be1ee762ac2d6a83511a81d918561 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=charming_heisenberg, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, GIT_BRANCH=main, vendor=Red Hat, Inc., release=553, description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, io.openshift.expose-services=, build-date=2025-09-24T08:57:55, distribution-scope=public, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, maintainer=Guillaume Abrioux , com.redhat.component=rhceph-container, version=7, CEPH_POINT_RELEASE=, vcs-type=git, RELEASE=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, ceph=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0) Oct 14 06:02:51 localhost charming_heisenberg[306449]: 167 167 Oct 14 06:02:51 localhost systemd[1]: 
libpod-782b2494a648d1b3af299d41e546d9c9057be1ee762ac2d6a83511a81d918561.scope: Deactivated successfully. Oct 14 06:02:51 localhost podman[306434]: 2025-10-14 10:02:51.665720508 +0000 UTC m=+0.141057588 container died 782b2494a648d1b3af299d41e546d9c9057be1ee762ac2d6a83511a81d918561 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=charming_heisenberg, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, io.buildah.version=1.33.12, release=553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, distribution-scope=public, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, maintainer=Guillaume Abrioux , io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, io.openshift.expose-services=, version=7, GIT_BRANCH=main, architecture=x86_64, ceph=True, build-date=2025-09-24T08:57:55, GIT_CLEAN=True, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git) Oct 14 06:02:51 localhost podman[306434]: 2025-10-14 10:02:51.573996458 +0000 UTC m=+0.049333518 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Oct 14 06:02:51 localhost nova_compute[297686]: 2025-10-14 10:02:51.718 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:02:51 localhost podman[306454]: 2025-10-14 10:02:51.824655759 +0000 UTC m=+0.097869359 container remove 782b2494a648d1b3af299d41e546d9c9057be1ee762ac2d6a83511a81d918561 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, 
name=charming_heisenberg, architecture=x86_64, maintainer=Guillaume Abrioux , org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, version=7, io.buildah.version=1.33.12, distribution-scope=public, ceph=True, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-09-24T08:57:55, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, vcs-type=git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.component=rhceph-container, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, release=553, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main) Oct 14 06:02:51 localhost systemd[1]: libpod-conmon-782b2494a648d1b3af299d41e546d9c9057be1ee762ac2d6a83511a81d918561.scope: Deactivated successfully. Oct 14 06:02:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917. Oct 14 06:02:51 localhost systemd[1]: var-lib-containers-storage-overlay-1c5b97f4006f274be6ee1d9415439db8a7c7be71e5320acbf40a1c55db082d84-merged.mount: Deactivated successfully. 
Oct 14 06:02:51 localhost podman[306470]: 2025-10-14 10:02:51.93438485 +0000 UTC m=+0.074586139 container health_status 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, architecture=x86_64, io.openshift.expose-services=, managed_by=edpm_ansible, name=ubi9-minimal, config_id=edpm, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, distribution-scope=public) Oct 14 06:02:51 localhost podman[306470]: 2025-10-14 10:02:51.94784883 +0000 UTC m=+0.088050119 container exec_died 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.buildah.version=1.33.7, managed_by=edpm_ansible, name=ubi9-minimal, vendor=Red Hat, Inc., config_id=edpm, com.redhat.component=ubi9-minimal-container, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, vcs-type=git, distribution-scope=public, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, version=9.6) Oct 14 06:02:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f. Oct 14 06:02:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d. Oct 14 06:02:51 localhost systemd[1]: 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917.service: Deactivated successfully. 
Oct 14 06:02:52 localhost systemd[1]: tmp-crun.560CcO.mount: Deactivated successfully. Oct 14 06:02:52 localhost podman[306490]: 2025-10-14 10:02:52.032484695 +0000 UTC m=+0.075128375 container health_status 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller) Oct 14 06:02:52 localhost podman[306491]: 2025-10-14 10:02:52.088464283 +0000 UTC m=+0.125324987 container health_status 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, config_id=edpm, 
tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251009, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}) Oct 14 06:02:52 localhost podman[306490]: 2025-10-14 10:02:52.11620918 +0000 UTC m=+0.158852730 container exec_died 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, container_name=ovn_controller, config_id=ovn_controller, 
org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}) Oct 14 06:02:52 localhost podman[306491]: 2025-10-14 10:02:52.12930555 +0000 UTC m=+0.166166264 container exec_died 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', 
'/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251009, tcib_managed=true) Oct 14 06:02:52 localhost systemd[1]: 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f.service: Deactivated successfully. Oct 14 06:02:52 localhost systemd[1]: 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d.service: Deactivated successfully. Oct 14 06:02:52 localhost ceph-mon[303906]: Reconfiguring mon.np0005486733 (monmap changed)... 
Oct 14 06:02:52 localhost ceph-mon[303906]: Reconfiguring daemon mon.np0005486733 on np0005486733.localdomain Oct 14 06:02:52 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc' Oct 14 06:02:52 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc' Oct 14 06:02:53 localhost ceph-mon[303906]: mon.np0005486733@2(peon).osd e80 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Oct 14 06:02:54 localhost nova_compute[297686]: 2025-10-14 10:02:54.154 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:02:54 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc' Oct 14 06:02:54 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc' Oct 14 06:02:54 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Oct 14 06:02:54 localhost ceph-mon[303906]: Removing np0005486728.localdomain:/var/lib/ceph/fcadf6e2-9176-5818-a8d0-37b19acf8eaf/config/ceph.conf Oct 14 06:02:54 localhost ceph-mon[303906]: Updating np0005486729.localdomain:/etc/ceph/ceph.conf Oct 14 06:02:54 localhost ceph-mon[303906]: Updating np0005486730.localdomain:/etc/ceph/ceph.conf Oct 14 06:02:54 localhost ceph-mon[303906]: Updating np0005486731.localdomain:/etc/ceph/ceph.conf Oct 14 06:02:54 localhost ceph-mon[303906]: Updating np0005486732.localdomain:/etc/ceph/ceph.conf Oct 14 06:02:54 localhost ceph-mon[303906]: Updating np0005486733.localdomain:/etc/ceph/ceph.conf Oct 14 06:02:54 localhost ceph-mon[303906]: Removing np0005486728.localdomain:/etc/ceph/ceph.client.admin.keyring Oct 14 06:02:54 localhost ceph-mon[303906]: Removing 
np0005486728.localdomain:/var/lib/ceph/fcadf6e2-9176-5818-a8d0-37b19acf8eaf/config/ceph.client.admin.keyring Oct 14 06:02:54 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc' Oct 14 06:02:54 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc' Oct 14 06:02:55 localhost ceph-mon[303906]: Updating np0005486730.localdomain:/var/lib/ceph/fcadf6e2-9176-5818-a8d0-37b19acf8eaf/config/ceph.conf Oct 14 06:02:55 localhost ceph-mon[303906]: Updating np0005486733.localdomain:/var/lib/ceph/fcadf6e2-9176-5818-a8d0-37b19acf8eaf/config/ceph.conf Oct 14 06:02:55 localhost ceph-mon[303906]: Updating np0005486729.localdomain:/var/lib/ceph/fcadf6e2-9176-5818-a8d0-37b19acf8eaf/config/ceph.conf Oct 14 06:02:55 localhost ceph-mon[303906]: Updating np0005486731.localdomain:/var/lib/ceph/fcadf6e2-9176-5818-a8d0-37b19acf8eaf/config/ceph.conf Oct 14 06:02:55 localhost ceph-mon[303906]: Updating np0005486732.localdomain:/var/lib/ceph/fcadf6e2-9176-5818-a8d0-37b19acf8eaf/config/ceph.conf Oct 14 06:02:55 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc' Oct 14 06:02:55 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc' Oct 14 06:02:55 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc' Oct 14 06:02:55 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc' Oct 14 06:02:55 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc' Oct 14 06:02:55 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc' Oct 14 06:02:55 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc' Oct 14 06:02:55 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' 
entity='mgr.np0005486730.ddfidc' Oct 14 06:02:55 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc' Oct 14 06:02:55 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc' Oct 14 06:02:55 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc' Oct 14 06:02:55 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc' Oct 14 06:02:55 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc' Oct 14 06:02:56 localhost nova_compute[297686]: 2025-10-14 10:02:56.752 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:02:56 localhost ceph-mon[303906]: Removing daemon mgr.np0005486728.giajub from np0005486728.localdomain -- ports [9283, 8765] Oct 14 06:02:56 localhost ceph-mon[303906]: Added label _no_schedule to host np0005486728.localdomain Oct 14 06:02:56 localhost ceph-mon[303906]: Added label SpecialHostLabels.DRAIN_CONF_KEYRING to host np0005486728.localdomain Oct 14 06:02:57 localhost ovn_metadata_agent[163050]: 2025-10-14 10:02:57.772 163055 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Oct 14 06:02:57 localhost ovn_metadata_agent[163050]: 2025-10-14 10:02:57.772 163055 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Oct 14 06:02:57 localhost ovn_metadata_agent[163050]: 2025-10-14 10:02:57.773 163055 DEBUG oslo_concurrency.lockutils [-] Lock 
"_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Oct 14 06:02:57 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc' cmd={"prefix": "auth rm", "entity": "mgr.np0005486728.giajub"} : dispatch Oct 14 06:02:57 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc' cmd='[{"prefix": "auth rm", "entity": "mgr.np0005486728.giajub"}]': finished Oct 14 06:02:57 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc' Oct 14 06:02:57 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc' Oct 14 06:02:58 localhost podman[248187]: time="2025-10-14T10:02:58Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Oct 14 06:02:58 localhost podman[248187]: @ - - [14/Oct/2025:10:02:58 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 147497 "" "Go-http-client/1.1" Oct 14 06:02:58 localhost podman[248187]: @ - - [14/Oct/2025:10:02:58 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19834 "" "Go-http-client/1.1" Oct 14 06:02:58 localhost ceph-mon[303906]: mon.np0005486733@2(peon).osd e80 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Oct 14 06:02:58 localhost ceph-mon[303906]: Removing key for mgr.np0005486728.giajub Oct 14 06:02:58 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc' Oct 14 06:02:58 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005486728.localdomain"} : 
dispatch Oct 14 06:02:58 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/host.np0005486728.localdomain"}]': finished Oct 14 06:02:58 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Oct 14 06:02:58 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc' Oct 14 06:02:58 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005486729.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Oct 14 06:02:59 localhost nova_compute[297686]: 2025-10-14 10:02:59.192 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:02:59 localhost ceph-mon[303906]: host np0005486728.localdomain `cephadm ls` failed: Cannot decode JSON: #012Traceback (most recent call last):#012 File "/usr/share/ceph/mgr/cephadm/serve.py", line 1540, in _run_cephadm_json#012 return json.loads(''.join(out))#012 File "/lib64/python3.9/json/__init__.py", line 346, in loads#012 return _default_decoder.decode(s)#012 File "/lib64/python3.9/json/decoder.py", line 337, in decode#012 obj, end = self.raw_decode(s, idx=_w(s, 0).end())#012 File "/lib64/python3.9/json/decoder.py", line 355, in raw_decode#012 raise JSONDecodeError("Expecting value", s, err.value) from None#012json.decoder.JSONDecodeError: Expecting value: line 1 column 1 (char 0) Oct 14 06:02:59 localhost ceph-mon[303906]: Removed host np0005486728.localdomain Oct 14 06:02:59 localhost ceph-mon[303906]: executing refresh((['np0005486728.localdomain', 'np0005486729.localdomain', 'np0005486730.localdomain', 'np0005486731.localdomain', 'np0005486732.localdomain', 
'np0005486733.localdomain'],)) failed.#012Traceback (most recent call last):#012 File "/usr/share/ceph/mgr/cephadm/utils.py", line 94, in do_work#012 return f(*arg)#012 File "/usr/share/ceph/mgr/cephadm/serve.py", line 317, in refresh#012 and not self.mgr.inventory.has_label(host, SpecialHostLabels.NO_MEMORY_AUTOTUNE)#012 File "/usr/share/ceph/mgr/cephadm/inventory.py", line 253, in has_label#012 host = self._get_stored_name(host)#012 File "/usr/share/ceph/mgr/cephadm/inventory.py", line 181, in _get_stored_name#012 self.assert_host(host)#012 File "/usr/share/ceph/mgr/cephadm/inventory.py", line 209, in assert_host#012 raise OrchestratorError('host %s does not exist' % host)#012orchestrator._interface.OrchestratorError: host np0005486728.localdomain does not exist Oct 14 06:02:59 localhost ceph-mon[303906]: Reconfiguring crash.np0005486729 (monmap changed)... Oct 14 06:02:59 localhost ceph-mon[303906]: Reconfiguring daemon crash.np0005486729 on np0005486729.localdomain Oct 14 06:02:59 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc' Oct 14 06:02:59 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc' Oct 14 06:02:59 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc' Oct 14 06:02:59 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Oct 14 06:03:00 localhost ceph-mon[303906]: Reconfiguring mon.np0005486729 (monmap changed)... 
Oct 14 06:03:00 localhost ceph-mon[303906]: Reconfiguring daemon mon.np0005486729 on np0005486729.localdomain Oct 14 06:03:00 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc' Oct 14 06:03:00 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc' Oct 14 06:03:00 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005486729.xpybho", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Oct 14 06:03:01 localhost nova_compute[297686]: 2025-10-14 10:03:01.779 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:03:01 localhost ceph-mon[303906]: Reconfiguring mgr.np0005486729.xpybho (monmap changed)... Oct 14 06:03:01 localhost ceph-mon[303906]: Reconfiguring daemon mgr.np0005486729.xpybho on np0005486729.localdomain Oct 14 06:03:01 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc' Oct 14 06:03:01 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc' Oct 14 06:03:01 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Oct 14 06:03:02 localhost ceph-mon[303906]: Reconfiguring mon.np0005486730 (monmap changed)... 
Oct 14 06:03:02 localhost ceph-mon[303906]: Reconfiguring daemon mon.np0005486730 on np0005486730.localdomain Oct 14 06:03:02 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc' Oct 14 06:03:02 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc' Oct 14 06:03:02 localhost ceph-mon[303906]: Reconfiguring mgr.np0005486730.ddfidc (monmap changed)... Oct 14 06:03:02 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005486730.ddfidc", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Oct 14 06:03:02 localhost ceph-mon[303906]: Reconfiguring daemon mgr.np0005486730.ddfidc on np0005486730.localdomain Oct 14 06:03:03 localhost ceph-mon[303906]: mon.np0005486733@2(peon).osd e80 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Oct 14 06:03:04 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc' Oct 14 06:03:04 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc' Oct 14 06:03:04 localhost ceph-mon[303906]: Reconfiguring crash.np0005486730 (monmap changed)... 
Oct 14 06:03:04 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005486730.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Oct 14 06:03:04 localhost ceph-mon[303906]: Reconfiguring daemon crash.np0005486730 on np0005486730.localdomain Oct 14 06:03:04 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc' Oct 14 06:03:04 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc' Oct 14 06:03:04 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005486731.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Oct 14 06:03:04 localhost nova_compute[297686]: 2025-10-14 10:03:04.232 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:03:05 localhost ceph-mon[303906]: Reconfiguring crash.np0005486731 (monmap changed)... Oct 14 06:03:05 localhost ceph-mon[303906]: Reconfiguring daemon crash.np0005486731 on np0005486731.localdomain Oct 14 06:03:05 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc' Oct 14 06:03:05 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc' Oct 14 06:03:05 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch Oct 14 06:03:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041. 
Oct 14 06:03:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857. Oct 14 06:03:05 localhost podman[306891]: 2025-10-14 10:03:05.750253646 +0000 UTC m=+0.080075175 container health_status 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Oct 14 06:03:05 localhost podman[306891]: 2025-10-14 10:03:05.759353085 +0000 UTC m=+0.089174654 container exec_died 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': 
'/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Oct 14 06:03:05 localhost systemd[1]: 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041.service: Deactivated successfully. Oct 14 06:03:05 localhost podman[306892]: 2025-10-14 10:03:05.810525337 +0000 UTC m=+0.136701935 container health_status 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}) Oct 14 06:03:05 localhost podman[306892]: 2025-10-14 10:03:05.821217173 +0000 UTC m=+0.147393761 container exec_died 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, 
tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251009) Oct 14 06:03:05 localhost systemd[1]: 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857.service: Deactivated successfully. Oct 14 06:03:06 localhost ceph-mon[303906]: Reconfiguring osd.2 (monmap changed)... Oct 14 06:03:06 localhost ceph-mon[303906]: Reconfiguring daemon osd.2 on np0005486731.localdomain Oct 14 06:03:06 localhost ceph-mon[303906]: Saving service mon spec with placement label:mon Oct 14 06:03:06 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc' Oct 14 06:03:06 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc' Oct 14 06:03:06 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc' Oct 14 06:03:06 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc' cmd={"prefix": "auth get", "entity": "osd.4"} : dispatch Oct 14 06:03:06 localhost ceph-mon[303906]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #16. Immutable memtables: 0. 
Oct 14 06:03:06 localhost ceph-mon[303906]: rocksdb: (Original Log Time 2025/10/14-10:03:06.685707) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0 Oct 14 06:03:06 localhost ceph-mon[303906]: rocksdb: [db/flush_job.cc:856] [default] [JOB 5] Flushing memtable with next log file: 16 Oct 14 06:03:06 localhost ceph-mon[303906]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760436186685744, "job": 5, "event": "flush_started", "num_memtables": 1, "num_entries": 3193, "num_deletes": 511, "total_data_size": 9304504, "memory_usage": 9894240, "flush_reason": "Manual Compaction"} Oct 14 06:03:06 localhost ceph-mon[303906]: rocksdb: [db/flush_job.cc:885] [default] [JOB 5] Level-0 flush table #17: started Oct 14 06:03:06 localhost ceph-mon[303906]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760436186718601, "cf_name": "default", "job": 5, "event": "table_file_creation", "file_number": 17, "file_size": 5623997, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 9842, "largest_seqno": 13030, "table_properties": {"data_size": 5611070, "index_size": 7704, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 4165, "raw_key_size": 35973, "raw_average_key_size": 21, "raw_value_size": 5580643, "raw_average_value_size": 3355, "num_data_blocks": 332, "num_entries": 1663, "num_filter_entries": 1663, "num_deletions": 510, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; 
max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1760436125, "oldest_key_time": 1760436125, "file_creation_time": 1760436186, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "84a9ba57-7643-4e19-a68b-d3c5f7942ded", "db_session_id": "ABXPLKSCGGN31DHJT8DW", "orig_file_number": 17, "seqno_to_time_mapping": "N/A"}} Oct 14 06:03:06 localhost ceph-mon[303906]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 5] Flush lasted 33030 microseconds, and 12952 cpu microseconds. Oct 14 06:03:06 localhost ceph-mon[303906]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. Oct 14 06:03:06 localhost ceph-mon[303906]: rocksdb: (Original Log Time 2025/10/14-10:03:06.718726) [db/flush_job.cc:967] [default] [JOB 5] Level-0 flush table #17: 5623997 bytes OK Oct 14 06:03:06 localhost ceph-mon[303906]: rocksdb: (Original Log Time 2025/10/14-10:03:06.718763) [db/memtable_list.cc:519] [default] Level-0 commit table #17 started Oct 14 06:03:06 localhost ceph-mon[303906]: rocksdb: (Original Log Time 2025/10/14-10:03:06.721109) [db/memtable_list.cc:722] [default] Level-0 commit table #17: memtable #1 done Oct 14 06:03:06 localhost ceph-mon[303906]: rocksdb: (Original Log Time 2025/10/14-10:03:06.721134) EVENT_LOG_v1 {"time_micros": 1760436186721127, "job": 5, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0} Oct 14 06:03:06 localhost ceph-mon[303906]: rocksdb: (Original Log Time 2025/10/14-10:03:06.721158) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25 Oct 14 06:03:06 localhost ceph-mon[303906]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 5] Try to delete WAL files size 9288194, prev total WAL file size 
9288194, number of live WAL files 2. Oct 14 06:03:06 localhost ceph-mon[303906]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005486733/store.db/000013.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Oct 14 06:03:06 localhost ceph-mon[303906]: rocksdb: (Original Log Time 2025/10/14-10:03:06.723291) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003130353432' seq:72057594037927935, type:22 .. '7061786F73003130373934' seq:0, type:0; will stop at (end) Oct 14 06:03:06 localhost ceph-mon[303906]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 6] Compacting 1@0 + 1@6 files to L6, score -1.00 Oct 14 06:03:06 localhost ceph-mon[303906]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 5 Base level 0, inputs: [17(5492KB)], [15(9320KB)] Oct 14 06:03:06 localhost ceph-mon[303906]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760436186723342, "job": 6, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [17], "files_L6": [15], "score": -1, "input_data_size": 15168384, "oldest_snapshot_seqno": -1} Oct 14 06:03:06 localhost ceph-mon[303906]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 6] Generated table #18: 9795 keys, 13104519 bytes, temperature: kUnknown Oct 14 06:03:06 localhost ceph-mon[303906]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760436186796789, "cf_name": "default", "job": 6, "event": "table_file_creation", "file_number": 18, "file_size": 13104519, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 13046308, "index_size": 32638, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 24517, "raw_key_size": 261710, "raw_average_key_size": 26, "raw_value_size": 12876263, 
"raw_average_value_size": 1314, "num_data_blocks": 1250, "num_entries": 9795, "num_filter_entries": 9795, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1760436113, "oldest_key_time": 0, "file_creation_time": 1760436186, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "84a9ba57-7643-4e19-a68b-d3c5f7942ded", "db_session_id": "ABXPLKSCGGN31DHJT8DW", "orig_file_number": 18, "seqno_to_time_mapping": "N/A"}} Oct 14 06:03:06 localhost ceph-mon[303906]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. 
Oct 14 06:03:06 localhost ceph-mon[303906]: rocksdb: (Original Log Time 2025/10/14-10:03:06.797144) [db/compaction/compaction_job.cc:1663] [default] [JOB 6] Compacted 1@0 + 1@6 files to L6 => 13104519 bytes Oct 14 06:03:06 localhost ceph-mon[303906]: rocksdb: (Original Log Time 2025/10/14-10:03:06.801416) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 206.1 rd, 178.0 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(5.4, 9.1 +0.0 blob) out(12.5 +0.0 blob), read-write-amplify(5.0) write-amplify(2.3) OK, records in: 10864, records dropped: 1069 output_compression: NoCompression Oct 14 06:03:06 localhost ceph-mon[303906]: rocksdb: (Original Log Time 2025/10/14-10:03:06.801439) EVENT_LOG_v1 {"time_micros": 1760436186801429, "job": 6, "event": "compaction_finished", "compaction_time_micros": 73601, "compaction_time_cpu_micros": 41304, "output_level": 6, "num_output_files": 1, "total_output_size": 13104519, "num_input_records": 10864, "num_output_records": 9795, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]} Oct 14 06:03:06 localhost ceph-mon[303906]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005486733/store.db/000017.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Oct 14 06:03:06 localhost ceph-mon[303906]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760436186802365, "job": 6, "event": "table_file_deletion", "file_number": 17} Oct 14 06:03:06 localhost ceph-mon[303906]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005486733/store.db/000015.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Oct 14 06:03:06 localhost ceph-mon[303906]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760436186803384, "job": 
6, "event": "table_file_deletion", "file_number": 15} Oct 14 06:03:06 localhost ceph-mon[303906]: rocksdb: (Original Log Time 2025/10/14-10:03:06.723234) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Oct 14 06:03:06 localhost ceph-mon[303906]: rocksdb: (Original Log Time 2025/10/14-10:03:06.803464) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Oct 14 06:03:06 localhost ceph-mon[303906]: rocksdb: (Original Log Time 2025/10/14-10:03:06.803471) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Oct 14 06:03:06 localhost ceph-mon[303906]: rocksdb: (Original Log Time 2025/10/14-10:03:06.803474) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Oct 14 06:03:06 localhost ceph-mon[303906]: rocksdb: (Original Log Time 2025/10/14-10:03:06.803477) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Oct 14 06:03:06 localhost ceph-mon[303906]: rocksdb: (Original Log Time 2025/10/14-10:03:06.803480) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Oct 14 06:03:06 localhost nova_compute[297686]: 2025-10-14 10:03:06.822 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:03:07 localhost ceph-mon[303906]: Reconfiguring osd.4 (monmap changed)... 
Oct 14 06:03:07 localhost ceph-mon[303906]: Reconfiguring daemon osd.4 on np0005486731.localdomain Oct 14 06:03:07 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc' Oct 14 06:03:07 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc' Oct 14 06:03:07 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005486731.onyaog", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Oct 14 06:03:08 localhost ceph-mgr[302471]: ms_deliver_dispatch: unhandled message 0x55da44b96f20 mon_map magic: 0 from mon.1 v2:172.18.0.104:3300/0 Oct 14 06:03:08 localhost ceph-mon[303906]: log_channel(cluster) log [INF] : mon.np0005486733 calling monitor election Oct 14 06:03:08 localhost ceph-mon[303906]: paxos.2).electionLogic(34) init, last seen epoch 34 Oct 14 06:03:08 localhost ceph-mon[303906]: mon.np0005486733@2(electing) e8 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Oct 14 06:03:08 localhost ceph-mon[303906]: mon.np0005486733@2(electing) e8 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Oct 14 06:03:08 localhost ceph-mon[303906]: mon.np0005486733@2(peon) e8 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Oct 14 06:03:08 localhost ceph-mon[303906]: mon.np0005486733@2(peon).osd e80 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Oct 14 06:03:08 localhost openstack_network_exporter[250374]: ERROR 10:03:08 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Oct 14 06:03:08 localhost openstack_network_exporter[250374]: ERROR 10:03:08 appctl.go:144: Failed to get PID for ovn-northd: no control socket 
files found for ovn-northd Oct 14 06:03:08 localhost openstack_network_exporter[250374]: ERROR 10:03:08 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Oct 14 06:03:08 localhost openstack_network_exporter[250374]: ERROR 10:03:08 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Oct 14 06:03:08 localhost openstack_network_exporter[250374]: Oct 14 06:03:08 localhost openstack_network_exporter[250374]: ERROR 10:03:08 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Oct 14 06:03:08 localhost openstack_network_exporter[250374]: Oct 14 06:03:09 localhost nova_compute[297686]: 2025-10-14 10:03:09.270 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:03:09 localhost ceph-mon[303906]: Reconfiguring mgr.np0005486731.swasqz (monmap changed)... Oct 14 06:03:09 localhost ceph-mon[303906]: Reconfiguring daemon mgr.np0005486731.swasqz on np0005486731.localdomain Oct 14 06:03:09 localhost ceph-mon[303906]: Remove daemons mon.np0005486731 Oct 14 06:03:09 localhost ceph-mon[303906]: Safe to remove mon.np0005486731: new quorum should be ['np0005486730', 'np0005486729', 'np0005486733', 'np0005486732'] (from ['np0005486730', 'np0005486729', 'np0005486733', 'np0005486732']) Oct 14 06:03:09 localhost ceph-mon[303906]: Removing monitor np0005486731 from monmap... 
Oct 14 06:03:09 localhost ceph-mon[303906]: Removing daemon mon.np0005486731 from np0005486731.localdomain -- ports [] Oct 14 06:03:09 localhost ceph-mon[303906]: mon.np0005486732 calling monitor election Oct 14 06:03:09 localhost ceph-mon[303906]: mon.np0005486730 calling monitor election Oct 14 06:03:09 localhost ceph-mon[303906]: mon.np0005486733 calling monitor election Oct 14 06:03:09 localhost ceph-mon[303906]: mon.np0005486729 calling monitor election Oct 14 06:03:09 localhost ceph-mon[303906]: mon.np0005486730 is new leader, mons np0005486730,np0005486729,np0005486733,np0005486732 in quorum (ranks 0,1,2,3) Oct 14 06:03:09 localhost ceph-mon[303906]: overall HEALTH_OK Oct 14 06:03:09 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc' Oct 14 06:03:09 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc' Oct 14 06:03:09 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005486732.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Oct 14 06:03:10 localhost ceph-mon[303906]: Reconfiguring crash.np0005486732 (monmap changed)... 
Oct 14 06:03:10 localhost ceph-mon[303906]: Reconfiguring daemon crash.np0005486732 on np0005486732.localdomain Oct 14 06:03:10 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc' Oct 14 06:03:10 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc' Oct 14 06:03:10 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch Oct 14 06:03:11 localhost nova_compute[297686]: 2025-10-14 10:03:11.823 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:03:11 localhost ceph-mon[303906]: Reconfiguring osd.1 (monmap changed)... Oct 14 06:03:11 localhost ceph-mon[303906]: Reconfiguring daemon osd.1 on np0005486732.localdomain Oct 14 06:03:11 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc' Oct 14 06:03:11 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc' Oct 14 06:03:11 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch Oct 14 06:03:12 localhost ceph-mon[303906]: Reconfiguring osd.5 (monmap changed)... Oct 14 06:03:12 localhost ceph-mon[303906]: Reconfiguring daemon osd.5 on np0005486732.localdomain Oct 14 06:03:12 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc' Oct 14 06:03:12 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc' Oct 14 06:03:12 localhost ceph-mon[303906]: Reconfiguring mds.mds.np0005486732.xkownj (monmap changed)... 
Oct 14 06:03:12 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005486732.xkownj", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Oct 14 06:03:12 localhost ceph-mon[303906]: Reconfiguring daemon mds.mds.np0005486732.xkownj on np0005486732.localdomain Oct 14 06:03:12 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc' Oct 14 06:03:12 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc' Oct 14 06:03:12 localhost ceph-mon[303906]: Reconfiguring mgr.np0005486732.pasqzz (monmap changed)... Oct 14 06:03:12 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005486732.pasqzz", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Oct 14 06:03:12 localhost ceph-mon[303906]: Reconfiguring daemon mgr.np0005486732.pasqzz on np0005486732.localdomain Oct 14 06:03:13 localhost ceph-mon[303906]: mon.np0005486733@2(peon).osd e80 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Oct 14 06:03:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29. 
Oct 14 06:03:13 localhost podman[306932]: 2025-10-14 10:03:13.71099081 +0000 UTC m=+0.058236829 container health_status fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true, org.label-schema.vendor=CentOS, config_id=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, container_name=iscsid) Oct 14 06:03:13 localhost podman[306932]: 2025-10-14 10:03:13.717833689 +0000 UTC m=+0.065079718 container exec_died fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 
(image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, maintainer=OpenStack Kubernetes Operator team, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=iscsid, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d) Oct 14 06:03:13 localhost systemd[1]: fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29.service: Deactivated successfully. 
Oct 14 06:03:14 localhost nova_compute[297686]: 2025-10-14 10:03:14.255 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:03:14 localhost nova_compute[297686]: 2025-10-14 10:03:14.273 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:03:14 localhost podman[307004]: Oct 14 06:03:14 localhost podman[307004]: 2025-10-14 10:03:14.410489454 +0000 UTC m=+0.047334606 container create ddaebc146a8e2d72f8a57db53314231420f4c8f91ffec3fb53a37ad141d827cf (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=xenodochial_brown, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., vcs-type=git, build-date=2025-09-24T08:57:55, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, name=rhceph, distribution-scope=public, architecture=x86_64, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, release=553, maintainer=Guillaume Abrioux , org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.buildah.version=1.33.12, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/agreements, CEPH_POINT_RELEASE=, version=7) Oct 14 06:03:14 localhost systemd[1]: Started libpod-conmon-ddaebc146a8e2d72f8a57db53314231420f4c8f91ffec3fb53a37ad141d827cf.scope. 
Oct 14 06:03:14 localhost systemd[1]: Started libcrun container. Oct 14 06:03:14 localhost podman[307004]: 2025-10-14 10:03:14.4830561 +0000 UTC m=+0.119901252 container init ddaebc146a8e2d72f8a57db53314231420f4c8f91ffec3fb53a37ad141d827cf (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=xenodochial_brown, GIT_BRANCH=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.expose-services=, vcs-type=git, maintainer=Guillaume Abrioux , io.openshift.tags=rhceph ceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.33.12, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, name=rhceph, com.redhat.license_terms=https://www.redhat.com/agreements, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, vendor=Red Hat, Inc., version=7, description=Red Hat Ceph Storage 7, architecture=x86_64, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, build-date=2025-09-24T08:57:55, RELEASE=main, release=553) Oct 14 06:03:14 localhost podman[307004]: 2025-10-14 10:03:14.389289547 +0000 UTC m=+0.026134729 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Oct 14 06:03:14 localhost podman[307004]: 2025-10-14 10:03:14.490846617 +0000 UTC m=+0.127691789 container start ddaebc146a8e2d72f8a57db53314231420f4c8f91ffec3fb53a37ad141d827cf (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=xenodochial_brown, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 
9, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, architecture=x86_64, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, release=553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, version=7, vcs-type=git, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_BRANCH=main, maintainer=Guillaume Abrioux , name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.) Oct 14 06:03:14 localhost podman[307004]: 2025-10-14 10:03:14.491126756 +0000 UTC m=+0.127971908 container attach ddaebc146a8e2d72f8a57db53314231420f4c8f91ffec3fb53a37ad141d827cf (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=xenodochial_brown, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, RELEASE=main, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, maintainer=Guillaume Abrioux , architecture=x86_64, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, release=553, build-date=2025-09-24T08:57:55, GIT_BRANCH=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, CEPH_POINT_RELEASE=, distribution-scope=public, 
io.k8s.description=Red Hat Ceph Storage 7) Oct 14 06:03:14 localhost xenodochial_brown[307019]: 167 167 Oct 14 06:03:14 localhost systemd[1]: libpod-ddaebc146a8e2d72f8a57db53314231420f4c8f91ffec3fb53a37ad141d827cf.scope: Deactivated successfully. Oct 14 06:03:14 localhost podman[307004]: 2025-10-14 10:03:14.496333864 +0000 UTC m=+0.133179086 container died ddaebc146a8e2d72f8a57db53314231420f4c8f91ffec3fb53a37ad141d827cf (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=xenodochial_brown, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, RELEASE=main, distribution-scope=public, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, version=7, architecture=x86_64, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux , GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/agreements, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, CEPH_POINT_RELEASE=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, build-date=2025-09-24T08:57:55, io.openshift.tags=rhceph ceph, io.buildah.version=1.33.12, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, release=553) Oct 14 06:03:14 localhost podman[307024]: 2025-10-14 10:03:14.581564417 +0000 UTC m=+0.075556568 container remove ddaebc146a8e2d72f8a57db53314231420f4c8f91ffec3fb53a37ad141d827cf (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=xenodochial_brown, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, io.openshift.expose-services=, distribution-scope=public, maintainer=Guillaume Abrioux , io.k8s.description=Red Hat Ceph Storage 7, release=553, 
GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.buildah.version=1.33.12, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, vcs-type=git, build-date=2025-09-24T08:57:55, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., RELEASE=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, ceph=True, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, architecture=x86_64, GIT_CLEAN=True) Oct 14 06:03:14 localhost systemd[1]: libpod-conmon-ddaebc146a8e2d72f8a57db53314231420f4c8f91ffec3fb53a37ad141d827cf.scope: Deactivated successfully. Oct 14 06:03:14 localhost systemd[1]: var-lib-containers-storage-overlay-696ae045bb76549c46f0f4d5e0e544c1727c98591c09b458846b6a76caf43653-merged.mount: Deactivated successfully. Oct 14 06:03:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad. Oct 14 06:03:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec. 
Oct 14 06:03:14 localhost podman[307058]: 2025-10-14 10:03:14.875273023 +0000 UTC m=+0.097636382 container health_status 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.schema-version=1.0, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2) Oct 14 06:03:14 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc' Oct 14 06:03:14 localhost 
ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc' Oct 14 06:03:14 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005486733.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Oct 14 06:03:14 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc' Oct 14 06:03:14 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc' Oct 14 06:03:14 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc' cmd={"prefix": "auth get", "entity": "osd.0"} : dispatch Oct 14 06:03:14 localhost podman[307058]: 2025-10-14 10:03:14.920610117 +0000 UTC m=+0.142973506 container exec_died 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251009, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true) Oct 14 06:03:14 localhost systemd[1]: 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad.service: Deactivated successfully. Oct 14 06:03:14 localhost podman[307091]: 2025-10-14 10:03:14.971464049 +0000 UTC m=+0.095971310 container health_status aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck 
node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Oct 14 06:03:14 localhost podman[307091]: 2025-10-14 10:03:14.989922133 +0000 UTC m=+0.114429374 container exec_died aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Oct 14 06:03:15 localhost systemd[1]: aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec.service: Deactivated successfully. 
Oct 14 06:03:15 localhost podman[307134]: Oct 14 06:03:15 localhost podman[307134]: 2025-10-14 10:03:15.332513631 +0000 UTC m=+0.067474430 container create 944f5b9d72ebf58c366613d65b93ebb95dbdf27af751cad491bc2831edbc36a6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=competent_mayer, io.buildah.version=1.33.12, vcs-type=git, ceph=True, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, distribution-scope=public, release=553, CEPH_POINT_RELEASE=, name=rhceph, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux , GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_CLEAN=True, build-date=2025-09-24T08:57:55, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, vendor=Red Hat, Inc., RELEASE=main, GIT_BRANCH=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.component=rhceph-container) Oct 14 06:03:15 localhost systemd[1]: Started libpod-conmon-944f5b9d72ebf58c366613d65b93ebb95dbdf27af751cad491bc2831edbc36a6.scope. Oct 14 06:03:15 localhost systemd[1]: Started libcrun container. 
Oct 14 06:03:15 localhost podman[307134]: 2025-10-14 10:03:15.298993908 +0000 UTC m=+0.033954737 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Oct 14 06:03:15 localhost podman[307134]: 2025-10-14 10:03:15.405964613 +0000 UTC m=+0.140925402 container init 944f5b9d72ebf58c366613d65b93ebb95dbdf27af751cad491bc2831edbc36a6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=competent_mayer, distribution-scope=public, vendor=Red Hat, Inc., io.buildah.version=1.33.12, GIT_BRANCH=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, version=7, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , RELEASE=main, release=553, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, vcs-type=git, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, CEPH_POINT_RELEASE=, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0) Oct 14 06:03:15 localhost podman[307134]: 2025-10-14 10:03:15.417461364 +0000 UTC m=+0.152422143 container start 944f5b9d72ebf58c366613d65b93ebb95dbdf27af751cad491bc2831edbc36a6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=competent_mayer, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, version=7, 
org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_BRANCH=main, build-date=2025-09-24T08:57:55, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, ceph=True, io.openshift.expose-services=, name=rhceph, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux , description=Red Hat Ceph Storage 7, release=553, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, RELEASE=main, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph) Oct 14 06:03:15 localhost podman[307134]: 2025-10-14 10:03:15.417746093 +0000 UTC m=+0.152706892 container attach 944f5b9d72ebf58c366613d65b93ebb95dbdf27af751cad491bc2831edbc36a6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=competent_mayer, architecture=x86_64, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, GIT_BRANCH=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., io.buildah.version=1.33.12, version=7, build-date=2025-09-24T08:57:55, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux , GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, release=553, distribution-scope=public, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553) Oct 14 06:03:15 localhost competent_mayer[307149]: 167 167 Oct 14 06:03:15 localhost systemd[1]: libpod-944f5b9d72ebf58c366613d65b93ebb95dbdf27af751cad491bc2831edbc36a6.scope: Deactivated successfully. Oct 14 06:03:15 localhost podman[307134]: 2025-10-14 10:03:15.422052495 +0000 UTC m=+0.157013284 container died 944f5b9d72ebf58c366613d65b93ebb95dbdf27af751cad491bc2831edbc36a6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=competent_mayer, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, io.buildah.version=1.33.12, distribution-scope=public, maintainer=Guillaume Abrioux , ceph=True, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, GIT_BRANCH=main, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, build-date=2025-09-24T08:57:55, version=7, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, release=553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, architecture=x86_64, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_CLEAN=True) Oct 14 06:03:15 localhost podman[307154]: 2025-10-14 10:03:15.517664633 +0000 UTC m=+0.085869882 container remove 944f5b9d72ebf58c366613d65b93ebb95dbdf27af751cad491bc2831edbc36a6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=competent_mayer, CEPH_POINT_RELEASE=, io.buildah.version=1.33.12, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported 
base image., description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_CLEAN=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, architecture=x86_64, release=553, GIT_BRANCH=main, build-date=2025-09-24T08:57:55, maintainer=Guillaume Abrioux , com.redhat.component=rhceph-container, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, version=7, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, vcs-type=git) Oct 14 06:03:15 localhost systemd[1]: libpod-conmon-944f5b9d72ebf58c366613d65b93ebb95dbdf27af751cad491bc2831edbc36a6.scope: Deactivated successfully. Oct 14 06:03:15 localhost systemd[1]: var-lib-containers-storage-overlay-766774f37eeb3833352cab463ce00134409201777ca47d7ca9c7bd3fd250a189-merged.mount: Deactivated successfully. Oct 14 06:03:15 localhost ceph-mon[303906]: Reconfiguring crash.np0005486733 (monmap changed)... Oct 14 06:03:15 localhost ceph-mon[303906]: Reconfiguring daemon crash.np0005486733 on np0005486733.localdomain Oct 14 06:03:15 localhost ceph-mon[303906]: Reconfiguring osd.0 (monmap changed)... 
Oct 14 06:03:15 localhost ceph-mon[303906]: Reconfiguring daemon osd.0 on np0005486733.localdomain Oct 14 06:03:15 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc' Oct 14 06:03:15 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc' Oct 14 06:03:15 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc' cmd={"prefix": "auth get", "entity": "osd.3"} : dispatch Oct 14 06:03:16 localhost nova_compute[297686]: 2025-10-14 10:03:16.250 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:03:16 localhost nova_compute[297686]: 2025-10-14 10:03:16.254 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:03:16 localhost podman[307231]: Oct 14 06:03:16 localhost podman[307231]: 2025-10-14 10:03:16.361796372 +0000 UTC m=+0.077297570 container create 670bf4f5241b57decfddc94cb6c00832b9af2f62d5f4d0798db2ac586698e803 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=keen_bhabha, CEPH_POINT_RELEASE=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, release=553, io.openshift.tags=rhceph ceph, version=7, ceph=True, distribution-scope=public, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, name=rhceph, description=Red Hat Ceph Storage 7, vcs-type=git, architecture=x86_64, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.buildah.version=1.33.12, RELEASE=main, maintainer=Guillaume 
Abrioux , io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-09-24T08:57:55, GIT_BRANCH=main, GIT_CLEAN=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553) Oct 14 06:03:16 localhost systemd[1]: Started libpod-conmon-670bf4f5241b57decfddc94cb6c00832b9af2f62d5f4d0798db2ac586698e803.scope. Oct 14 06:03:16 localhost systemd[1]: Started libcrun container. Oct 14 06:03:16 localhost podman[307231]: 2025-10-14 10:03:16.429788349 +0000 UTC m=+0.145289537 container init 670bf4f5241b57decfddc94cb6c00832b9af2f62d5f4d0798db2ac586698e803 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=keen_bhabha, name=rhceph, io.buildah.version=1.33.12, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, ceph=True, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-09-24T08:57:55, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, version=7, io.openshift.tags=rhceph ceph, vcs-type=git, com.redhat.component=rhceph-container, GIT_BRANCH=main, maintainer=Guillaume Abrioux , distribution-scope=public, release=553, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3) Oct 14 06:03:16 localhost podman[307231]: 2025-10-14 
10:03:16.331919361 +0000 UTC m=+0.047420609 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Oct 14 06:03:16 localhost podman[307231]: 2025-10-14 10:03:16.480284 +0000 UTC m=+0.195785168 container start 670bf4f5241b57decfddc94cb6c00832b9af2f62d5f4d0798db2ac586698e803 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=keen_bhabha, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, name=rhceph, release=553, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, maintainer=Guillaume Abrioux , com.redhat.component=rhceph-container, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, io.buildah.version=1.33.12, build-date=2025-09-24T08:57:55, GIT_CLEAN=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, distribution-scope=public, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main) Oct 14 06:03:16 localhost podman[307231]: 2025-10-14 10:03:16.48059851 +0000 UTC m=+0.196099748 container attach 670bf4f5241b57decfddc94cb6c00832b9af2f62d5f4d0798db2ac586698e803 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=keen_bhabha, architecture=x86_64, version=7, distribution-scope=public, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, description=Red Hat Ceph Storage 7, 
RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux , GIT_CLEAN=True, ceph=True, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat Ceph Storage 7, release=553, io.openshift.expose-services=, io.buildah.version=1.33.12, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, name=rhceph, build-date=2025-09-24T08:57:55, io.openshift.tags=rhceph ceph) Oct 14 06:03:16 localhost keen_bhabha[307246]: 167 167 Oct 14 06:03:16 localhost systemd[1]: libpod-670bf4f5241b57decfddc94cb6c00832b9af2f62d5f4d0798db2ac586698e803.scope: Deactivated successfully. Oct 14 06:03:16 localhost podman[307231]: 2025-10-14 10:03:16.484100637 +0000 UTC m=+0.199601825 container died 670bf4f5241b57decfddc94cb6c00832b9af2f62d5f4d0798db2ac586698e803 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=keen_bhabha, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., GIT_CLEAN=True, release=553, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux , io.openshift.expose-services=, description=Red Hat Ceph Storage 7, vcs-type=git, architecture=x86_64, GIT_BRANCH=main, version=7, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.tags=rhceph ceph, build-date=2025-09-24T08:57:55, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, 
vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, ceph=True, distribution-scope=public, io.buildah.version=1.33.12) Oct 14 06:03:16 localhost podman[307251]: 2025-10-14 10:03:16.563755128 +0000 UTC m=+0.072100722 container remove 670bf4f5241b57decfddc94cb6c00832b9af2f62d5f4d0798db2ac586698e803 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=keen_bhabha, distribution-scope=public, ceph=True, build-date=2025-09-24T08:57:55, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, RELEASE=main, io.buildah.version=1.33.12, vendor=Red Hat, Inc., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, architecture=x86_64, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=rhceph-container, release=553, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_BRANCH=main, io.openshift.expose-services=, GIT_CLEAN=True, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux , name=rhceph, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3) Oct 14 06:03:16 localhost systemd[1]: libpod-conmon-670bf4f5241b57decfddc94cb6c00832b9af2f62d5f4d0798db2ac586698e803.scope: Deactivated successfully. Oct 14 06:03:16 localhost systemd[1]: tmp-crun.nYaYBq.mount: Deactivated successfully. Oct 14 06:03:16 localhost systemd[1]: var-lib-containers-storage-overlay-b527ebb84528621d325ae3b31b8426475f2c0c3dff9e1da1bc8951645ade1e8d-merged.mount: Deactivated successfully. 
Oct 14 06:03:16 localhost nova_compute[297686]: 2025-10-14 10:03:16.825 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:03:16 localhost ceph-mon[303906]: Reconfiguring osd.3 (monmap changed)... Oct 14 06:03:16 localhost ceph-mon[303906]: Reconfiguring daemon osd.3 on np0005486733.localdomain Oct 14 06:03:16 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc' Oct 14 06:03:16 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc' Oct 14 06:03:16 localhost ceph-mon[303906]: Reconfiguring mds.mds.np0005486733.tvstmf (monmap changed)... Oct 14 06:03:16 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005486733.tvstmf", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Oct 14 06:03:16 localhost ceph-mon[303906]: Reconfiguring daemon mds.mds.np0005486733.tvstmf on np0005486733.localdomain Oct 14 06:03:17 localhost podman[307328]: Oct 14 06:03:17 localhost podman[307328]: 2025-10-14 10:03:17.407568738 +0000 UTC m=+0.078050444 container create 6d77f9d700c9da21bc803e3aa7860b8cbb2d66008c56bbdc07ebcb149a1af81a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=eager_kirch, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.expose-services=, GIT_CLEAN=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, vcs-type=git, io.openshift.tags=rhceph ceph, version=7, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, release=553, name=rhceph, CEPH_POINT_RELEASE=, RELEASE=main, ceph=True, 
summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_BRANCH=main, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, build-date=2025-09-24T08:57:55, maintainer=Guillaume Abrioux , io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc.) Oct 14 06:03:17 localhost systemd[1]: Started libpod-conmon-6d77f9d700c9da21bc803e3aa7860b8cbb2d66008c56bbdc07ebcb149a1af81a.scope. Oct 14 06:03:17 localhost systemd[1]: Started libcrun container. Oct 14 06:03:17 localhost podman[307328]: 2025-10-14 10:03:17.372307441 +0000 UTC m=+0.042789177 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Oct 14 06:03:17 localhost podman[307328]: 2025-10-14 10:03:17.476589175 +0000 UTC m=+0.147070881 container init 6d77f9d700c9da21bc803e3aa7860b8cbb2d66008c56bbdc07ebcb149a1af81a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=eager_kirch, maintainer=Guillaume Abrioux , CEPH_POINT_RELEASE=, build-date=2025-09-24T08:57:55, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vendor=Red Hat, Inc., GIT_BRANCH=main, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.33.12, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, release=553, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_CLEAN=True, com.redhat.component=rhceph-container, 
architecture=x86_64, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git) Oct 14 06:03:17 localhost podman[307328]: 2025-10-14 10:03:17.487078315 +0000 UTC m=+0.157559981 container start 6d77f9d700c9da21bc803e3aa7860b8cbb2d66008c56bbdc07ebcb149a1af81a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=eager_kirch, GIT_CLEAN=True, RELEASE=main, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., io.buildah.version=1.33.12, distribution-scope=public, release=553, build-date=2025-09-24T08:57:55, maintainer=Guillaume Abrioux , io.openshift.expose-services=, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, version=7, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, description=Red Hat Ceph Storage 7, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, architecture=x86_64) Oct 14 06:03:17 localhost eager_kirch[307342]: 167 167 Oct 14 06:03:17 localhost systemd[1]: libpod-6d77f9d700c9da21bc803e3aa7860b8cbb2d66008c56bbdc07ebcb149a1af81a.scope: Deactivated successfully. 
Oct 14 06:03:17 localhost podman[307328]: 2025-10-14 10:03:17.48756593 +0000 UTC m=+0.158047696 container attach 6d77f9d700c9da21bc803e3aa7860b8cbb2d66008c56bbdc07ebcb149a1af81a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=eager_kirch, release=553, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.33.12, com.redhat.component=rhceph-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, architecture=x86_64, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_BRANCH=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, build-date=2025-09-24T08:57:55, RELEASE=main, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7) Oct 14 06:03:17 localhost podman[307328]: 2025-10-14 10:03:17.490638943 +0000 UTC m=+0.161120619 container died 6d77f9d700c9da21bc803e3aa7860b8cbb2d66008c56bbdc07ebcb149a1af81a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=eager_kirch, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , description=Red Hat Ceph Storage 7, RELEASE=main, io.buildah.version=1.33.12, release=553, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, 
io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, build-date=2025-09-24T08:57:55, ceph=True, vendor=Red Hat, Inc., architecture=x86_64, name=rhceph, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, version=7) Oct 14 06:03:17 localhost podman[307347]: 2025-10-14 10:03:17.587620824 +0000 UTC m=+0.084663216 container remove 6d77f9d700c9da21bc803e3aa7860b8cbb2d66008c56bbdc07ebcb149a1af81a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=eager_kirch, name=rhceph, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=553, version=7, build-date=2025-09-24T08:57:55, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, architecture=x86_64, maintainer=Guillaume Abrioux , distribution-scope=public, io.openshift.tags=rhceph ceph, vcs-type=git, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, io.openshift.expose-services=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_BRANCH=main, GIT_CLEAN=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.) Oct 14 06:03:17 localhost systemd[1]: libpod-conmon-6d77f9d700c9da21bc803e3aa7860b8cbb2d66008c56bbdc07ebcb149a1af81a.scope: Deactivated successfully. 
Oct 14 06:03:17 localhost systemd[1]: var-lib-containers-storage-overlay-495a80eb4373f8736b7b02109ddb20e04e498655b56a21f5c6dfa1dc18f48a43-merged.mount: Deactivated successfully. Oct 14 06:03:17 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc' Oct 14 06:03:17 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc' Oct 14 06:03:17 localhost ceph-mon[303906]: Reconfiguring mgr.np0005486733.primvu (monmap changed)... Oct 14 06:03:17 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005486733.primvu", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Oct 14 06:03:17 localhost ceph-mon[303906]: Reconfiguring daemon mgr.np0005486733.primvu on np0005486733.localdomain Oct 14 06:03:18 localhost nova_compute[297686]: 2025-10-14 10:03:18.255 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:03:18 localhost nova_compute[297686]: 2025-10-14 10:03:18.256 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:03:18 localhost nova_compute[297686]: 2025-10-14 10:03:18.256 2 DEBUG nova.compute.manager [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... 
_reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Oct 14 06:03:18 localhost nova_compute[297686]: 2025-10-14 10:03:18.256 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:03:18 localhost podman[307416]: Oct 14 06:03:18 localhost nova_compute[297686]: 2025-10-14 10:03:18.278 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Oct 14 06:03:18 localhost nova_compute[297686]: 2025-10-14 10:03:18.279 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Oct 14 06:03:18 localhost nova_compute[297686]: 2025-10-14 10:03:18.279 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Oct 14 06:03:18 localhost nova_compute[297686]: 2025-10-14 10:03:18.279 2 DEBUG nova.compute.resource_tracker [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Auditing locally available compute resources for np0005486733.localdomain (node: np0005486733.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Oct 14 06:03:18 localhost nova_compute[297686]: 
2025-10-14 10:03:18.280 2 DEBUG oslo_concurrency.processutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Oct 14 06:03:18 localhost podman[307416]: 2025-10-14 10:03:18.281664642 +0000 UTC m=+0.077460256 container create c38def4bb252766a0a155fdbbe32392dca6c2f29e2b0837ff81b36362afb8122 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=inspiring_lewin, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vendor=Red Hat, Inc., build-date=2025-09-24T08:57:55, maintainer=Guillaume Abrioux , url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, distribution-scope=public, release=553, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, name=rhceph, com.redhat.license_terms=https://www.redhat.com/agreements, version=7, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, RELEASE=main, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, com.redhat.component=rhceph-container, architecture=x86_64, io.buildah.version=1.33.12) Oct 14 06:03:18 localhost systemd[1]: Started libpod-conmon-c38def4bb252766a0a155fdbbe32392dca6c2f29e2b0837ff81b36362afb8122.scope. Oct 14 06:03:18 localhost systemd[1]: Started libcrun container. 
Oct 14 06:03:18 localhost podman[307416]: 2025-10-14 10:03:18.249438568 +0000 UTC m=+0.045234212 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Oct 14 06:03:18 localhost podman[307416]: 2025-10-14 10:03:18.35336378 +0000 UTC m=+0.149159394 container init c38def4bb252766a0a155fdbbe32392dca6c2f29e2b0837ff81b36362afb8122 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=inspiring_lewin, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_BRANCH=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, io.buildah.version=1.33.12, name=rhceph, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux , release=553, com.redhat.component=rhceph-container, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, vcs-type=git, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-09-24T08:57:55, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, distribution-scope=public, io.openshift.expose-services=, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=) Oct 14 06:03:18 localhost inspiring_lewin[307432]: 167 167 Oct 14 06:03:18 localhost systemd[1]: libpod-c38def4bb252766a0a155fdbbe32392dca6c2f29e2b0837ff81b36362afb8122.scope: Deactivated successfully. 
Oct 14 06:03:18 localhost podman[307416]: 2025-10-14 10:03:18.36515918 +0000 UTC m=+0.160954784 container start c38def4bb252766a0a155fdbbe32392dca6c2f29e2b0837ff81b36362afb8122 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=inspiring_lewin, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, io.openshift.tags=rhceph ceph, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux , vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=553, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, vendor=Red Hat, Inc., build-date=2025-09-24T08:57:55, CEPH_POINT_RELEASE=, ceph=True, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, com.redhat.component=rhceph-container, GIT_CLEAN=True, GIT_BRANCH=main, architecture=x86_64, name=rhceph) Oct 14 06:03:18 localhost podman[307416]: 2025-10-14 10:03:18.36545139 +0000 UTC m=+0.161247034 container attach c38def4bb252766a0a155fdbbe32392dca6c2f29e2b0837ff81b36362afb8122 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=inspiring_lewin, distribution-scope=public, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, ceph=True, io.openshift.tags=rhceph ceph, RELEASE=main, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, name=rhceph, GIT_CLEAN=True, release=553, 
summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, architecture=x86_64, build-date=2025-09-24T08:57:55, maintainer=Guillaume Abrioux , io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9) Oct 14 06:03:18 localhost podman[307416]: 2025-10-14 10:03:18.367410039 +0000 UTC m=+0.163205673 container died c38def4bb252766a0a155fdbbe32392dca6c2f29e2b0837ff81b36362afb8122 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=inspiring_lewin, maintainer=Guillaume Abrioux , release=553, CEPH_POINT_RELEASE=, GIT_BRANCH=main, name=rhceph, RELEASE=main, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.buildah.version=1.33.12, com.redhat.component=rhceph-container, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, build-date=2025-09-24T08:57:55, com.redhat.license_terms=https://www.redhat.com/agreements, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.) 
Oct 14 06:03:18 localhost ceph-mon[303906]: mon.np0005486733@2(peon).osd e80 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Oct 14 06:03:18 localhost podman[307439]: 2025-10-14 10:03:18.459353676 +0000 UTC m=+0.080131377 container remove c38def4bb252766a0a155fdbbe32392dca6c2f29e2b0837ff81b36362afb8122 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=inspiring_lewin, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.33.12, description=Red Hat Ceph Storage 7, release=553, maintainer=Guillaume Abrioux , ceph=True, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, architecture=x86_64, build-date=2025-09-24T08:57:55, distribution-scope=public, CEPH_POINT_RELEASE=, io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, version=7, vcs-type=git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553) Oct 14 06:03:18 localhost systemd[1]: libpod-conmon-c38def4bb252766a0a155fdbbe32392dca6c2f29e2b0837ff81b36362afb8122.scope: Deactivated successfully. Oct 14 06:03:18 localhost systemd[1]: tmp-crun.dvsZ5A.mount: Deactivated successfully. Oct 14 06:03:18 localhost systemd[1]: var-lib-containers-storage-overlay-93d59727d8b9add1bbc527328bae9657516235aa7a3ee7906aeabd3bd875d767-merged.mount: Deactivated successfully. 
Oct 14 06:03:18 localhost ceph-mon[303906]: mon.np0005486733@2(peon) e8 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Oct 14 06:03:18 localhost ceph-mon[303906]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/4086457277' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Oct 14 06:03:18 localhost nova_compute[297686]: 2025-10-14 10:03:18.759 2 DEBUG oslo_concurrency.processutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.480s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Oct 14 06:03:18 localhost nova_compute[297686]: 2025-10-14 10:03:18.839 2 DEBUG nova.virt.libvirt.driver [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Oct 14 06:03:18 localhost nova_compute[297686]: 2025-10-14 10:03:18.839 2 DEBUG nova.virt.libvirt.driver [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Oct 14 06:03:19 localhost nova_compute[297686]: 2025-10-14 10:03:19.024 2 WARNING nova.virt.libvirt.driver [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Oct 14 06:03:19 localhost nova_compute[297686]: 2025-10-14 10:03:19.026 2 DEBUG nova.compute.resource_tracker [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Hypervisor/Node resource view: name=np0005486733.localdomain free_ram=11493MB free_disk=41.83695602416992GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": 
"1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Oct 14 06:03:19 localhost nova_compute[297686]: 2025-10-14 10:03:19.026 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Oct 14 06:03:19 localhost nova_compute[297686]: 2025-10-14 10:03:19.026 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Oct 14 06:03:19 localhost nova_compute[297686]: 2025-10-14 10:03:19.116 2 DEBUG nova.compute.resource_tracker [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Instance 88c4e366-b765-47a6-96bf-f7677f2ce67c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Oct 14 06:03:19 localhost nova_compute[297686]: 2025-10-14 10:03:19.117 2 DEBUG nova.compute.resource_tracker [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Oct 14 06:03:19 localhost nova_compute[297686]: 2025-10-14 10:03:19.117 2 DEBUG nova.compute.resource_tracker [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Final resource view: name=np0005486733.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Oct 14 06:03:19 localhost nova_compute[297686]: 2025-10-14 10:03:19.181 2 DEBUG oslo_concurrency.processutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Oct 14 06:03:19 localhost nova_compute[297686]: 2025-10-14 10:03:19.308 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:03:19 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc' Oct 14 06:03:19 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc' Oct 14 06:03:19 localhost nova_compute[297686]: 2025-10-14 10:03:19.654 2 DEBUG oslo_concurrency.processutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.473s execute 
/usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Oct 14 06:03:19 localhost nova_compute[297686]: 2025-10-14 10:03:19.659 2 DEBUG nova.compute.provider_tree [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Inventory has not changed in ProviderTree for provider: 18c24273-aca2-4f08-be57-3188d558235e update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Oct 14 06:03:19 localhost nova_compute[297686]: 2025-10-14 10:03:19.673 2 DEBUG nova.scheduler.client.report [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Inventory has not changed for provider 18c24273-aca2-4f08-be57-3188d558235e based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Oct 14 06:03:19 localhost nova_compute[297686]: 2025-10-14 10:03:19.675 2 DEBUG nova.compute.resource_tracker [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Compute_service record updated for np0005486733.localdomain:np0005486733.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Oct 14 06:03:19 localhost nova_compute[297686]: 2025-10-14 10:03:19.675 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.649s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Oct 14 06:03:20 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' 
entity='mgr.np0005486730.ddfidc' Oct 14 06:03:20 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc' Oct 14 06:03:20 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Oct 14 06:03:21 localhost ceph-mon[303906]: Updating np0005486729.localdomain:/etc/ceph/ceph.conf Oct 14 06:03:21 localhost ceph-mon[303906]: Updating np0005486730.localdomain:/etc/ceph/ceph.conf Oct 14 06:03:21 localhost ceph-mon[303906]: Updating np0005486731.localdomain:/etc/ceph/ceph.conf Oct 14 06:03:21 localhost ceph-mon[303906]: Updating np0005486732.localdomain:/etc/ceph/ceph.conf Oct 14 06:03:21 localhost ceph-mon[303906]: Updating np0005486733.localdomain:/etc/ceph/ceph.conf Oct 14 06:03:21 localhost ceph-mon[303906]: Updating np0005486729.localdomain:/var/lib/ceph/fcadf6e2-9176-5818-a8d0-37b19acf8eaf/config/ceph.conf Oct 14 06:03:21 localhost ceph-mon[303906]: Updating np0005486730.localdomain:/var/lib/ceph/fcadf6e2-9176-5818-a8d0-37b19acf8eaf/config/ceph.conf Oct 14 06:03:21 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc' Oct 14 06:03:21 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc' Oct 14 06:03:21 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc' Oct 14 06:03:21 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc' Oct 14 06:03:21 localhost nova_compute[297686]: 2025-10-14 10:03:21.675 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:03:21 localhost nova_compute[297686]: 2025-10-14 10:03:21.676 2 DEBUG 
nova.compute.manager [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Oct 14 06:03:21 localhost nova_compute[297686]: 2025-10-14 10:03:21.676 2 DEBUG nova.compute.manager [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Oct 14 06:03:21 localhost nova_compute[297686]: 2025-10-14 10:03:21.827 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:03:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917. Oct 14 06:03:22 localhost podman[307833]: 2025-10-14 10:03:22.125861056 +0000 UTC m=+0.068304487 container health_status 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, maintainer=Red Hat, Inc., name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, vcs-type=git, version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, container_name=openstack_network_exporter, io.openshift.expose-services=, release=1755695350, vendor=Red Hat, Inc., distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, architecture=x86_64, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41) Oct 14 06:03:22 localhost podman[307833]: 2025-10-14 10:03:22.143010359 +0000 UTC m=+0.085453760 container exec_died 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, container_name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, architecture=x86_64, distribution-scope=public, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., io.openshift.expose-services=, name=ubi9-minimal) Oct 14 06:03:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f. Oct 14 06:03:22 localhost systemd[1]: 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917.service: Deactivated successfully. Oct 14 06:03:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d. 
Oct 14 06:03:22 localhost podman[307855]: 2025-10-14 10:03:22.250428758 +0000 UTC m=+0.079961561 container health_status 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team) Oct 14 06:03:22 localhost nova_compute[297686]: 2025-10-14 10:03:22.295 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Acquiring lock "refresh_cache-88c4e366-b765-47a6-96bf-f7677f2ce67c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Oct 14 06:03:22 localhost nova_compute[297686]: 2025-10-14 10:03:22.296 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Acquired lock 
"refresh_cache-88c4e366-b765-47a6-96bf-f7677f2ce67c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Oct 14 06:03:22 localhost nova_compute[297686]: 2025-10-14 10:03:22.296 2 DEBUG nova.network.neutron [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] [instance: 88c4e366-b765-47a6-96bf-f7677f2ce67c] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Oct 14 06:03:22 localhost nova_compute[297686]: 2025-10-14 10:03:22.297 2 DEBUG nova.objects.instance [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Lazy-loading 'info_cache' on Instance uuid 88c4e366-b765-47a6-96bf-f7677f2ce67c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Oct 14 06:03:22 localhost systemd[1]: tmp-crun.6EBwYr.mount: Deactivated successfully. Oct 14 06:03:22 localhost podman[307856]: 2025-10-14 10:03:22.330264096 +0000 UTC m=+0.156195599 container health_status 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': 
['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251009, container_name=ceilometer_agent_compute) Oct 14 06:03:22 localhost podman[307856]: 2025-10-14 10:03:22.347262415 +0000 UTC m=+0.173193928 container exec_died 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, tcib_managed=true) Oct 14 06:03:22 localhost systemd[1]: 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d.service: Deactivated successfully. Oct 14 06:03:22 localhost podman[307855]: 2025-10-14 10:03:22.398607922 +0000 UTC m=+0.228140685 container exec_died 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251009, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, 
org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5) Oct 14 06:03:22 localhost systemd[1]: 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f.service: Deactivated successfully. Oct 14 06:03:22 localhost ceph-mon[303906]: Updating np0005486731.localdomain:/var/lib/ceph/fcadf6e2-9176-5818-a8d0-37b19acf8eaf/config/ceph.conf Oct 14 06:03:22 localhost ceph-mon[303906]: Updating np0005486733.localdomain:/var/lib/ceph/fcadf6e2-9176-5818-a8d0-37b19acf8eaf/config/ceph.conf Oct 14 06:03:22 localhost ceph-mon[303906]: Updating np0005486732.localdomain:/var/lib/ceph/fcadf6e2-9176-5818-a8d0-37b19acf8eaf/config/ceph.conf Oct 14 06:03:22 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc' Oct 14 06:03:22 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc' Oct 14 06:03:22 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc' Oct 14 06:03:22 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc' Oct 14 06:03:22 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Oct 14 06:03:22 localhost ceph-mon[303906]: Deploying daemon mon.np0005486731 on np0005486731.localdomain Oct 14 06:03:22 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc' Oct 14 06:03:22 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc' Oct 14 06:03:22 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc' Oct 
14 06:03:22 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc' Oct 14 06:03:22 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005486729.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Oct 14 06:03:22 localhost nova_compute[297686]: 2025-10-14 10:03:22.647 2 DEBUG nova.network.neutron [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] [instance: 88c4e366-b765-47a6-96bf-f7677f2ce67c] Updating instance_info_cache with network_info: [{"id": "3ec9b060-f43d-4698-9c76-6062c70911d5", "address": "fa:16:3e:84:5e:e5", "network": {"id": "7d0cd696-bdd7-4e70-9512-eb0d23640314", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.46", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "41187b090f3d4818a32baa37ce8a3991", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3ec9b060-f4", "ovs_interfaceid": "3ec9b060-f43d-4698-9c76-6062c70911d5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Oct 14 06:03:22 localhost nova_compute[297686]: 2025-10-14 10:03:22.662 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce 
- - - - - -] Releasing lock "refresh_cache-88c4e366-b765-47a6-96bf-f7677f2ce67c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Oct 14 06:03:22 localhost nova_compute[297686]: 2025-10-14 10:03:22.663 2 DEBUG nova.compute.manager [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] [instance: 88c4e366-b765-47a6-96bf-f7677f2ce67c] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Oct 14 06:03:22 localhost nova_compute[297686]: 2025-10-14 10:03:22.664 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:03:22 localhost nova_compute[297686]: 2025-10-14 10:03:22.664 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:03:23 localhost ceph-mon[303906]: mon.np0005486733@2(peon).osd e80 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Oct 14 06:03:23 localhost ceph-mon[303906]: Reconfiguring crash.np0005486729 (monmap changed)... 
Oct 14 06:03:23 localhost ceph-mon[303906]: Reconfiguring daemon crash.np0005486729 on np0005486729.localdomain Oct 14 06:03:23 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc' Oct 14 06:03:23 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc' Oct 14 06:03:23 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005486729.xpybho", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Oct 14 06:03:24 localhost ceph-mon[303906]: mon.np0005486733@2(peon) e8 adding peer [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] to list of hints Oct 14 06:03:24 localhost ceph-mon[303906]: mon.np0005486733@2(peon) e8 adding peer [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] to list of hints Oct 14 06:03:24 localhost nova_compute[297686]: 2025-10-14 10:03:24.312 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:03:24 localhost ceph-mon[303906]: mon.np0005486733@2(peon) e8 adding peer [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] to list of hints Oct 14 06:03:24 localhost ceph-mgr[302471]: ms_deliver_dispatch: unhandled message 0x55da44b971e0 mon_map magic: 0 from mon.1 v2:172.18.0.104:3300/0 Oct 14 06:03:24 localhost ceph-mon[303906]: log_channel(cluster) log [INF] : mon.np0005486733 calling monitor election Oct 14 06:03:24 localhost ceph-mon[303906]: paxos.2).electionLogic(36) init, last seen epoch 36 Oct 14 06:03:24 localhost ceph-mon[303906]: mon.np0005486733@2(electing) e9 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Oct 14 06:03:24 localhost ceph-mon[303906]: mon.np0005486733@2(electing) e9 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Oct 14 06:03:26 localhost 
nova_compute[297686]: 2025-10-14 10:03:26.834 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:03:28 localhost podman[248187]: time="2025-10-14T10:03:28Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Oct 14 06:03:28 localhost podman[248187]: @ - - [14/Oct/2025:10:03:28 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 147497 "" "Go-http-client/1.1" Oct 14 06:03:28 localhost podman[248187]: @ - - [14/Oct/2025:10:03:28 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19834 "" "Go-http-client/1.1" Oct 14 06:03:29 localhost nova_compute[297686]: 2025-10-14 10:03:29.341 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:03:29 localhost ceph-mon[303906]: mon.np0005486733@2(peon) e9 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Oct 14 06:03:29 localhost ceph-mon[303906]: Reconfiguring mgr.np0005486730.ddfidc (monmap changed)... 
Oct 14 06:03:29 localhost ceph-mon[303906]: Reconfiguring daemon mgr.np0005486730.ddfidc on np0005486730.localdomain Oct 14 06:03:29 localhost ceph-mon[303906]: mon.np0005486730 calling monitor election Oct 14 06:03:29 localhost ceph-mon[303906]: mon.np0005486732 calling monitor election Oct 14 06:03:29 localhost ceph-mon[303906]: mon.np0005486733 calling monitor election Oct 14 06:03:29 localhost ceph-mon[303906]: mon.np0005486729 calling monitor election Oct 14 06:03:29 localhost ceph-mon[303906]: mon.np0005486730 is new leader, mons np0005486730,np0005486729,np0005486733,np0005486732 in quorum (ranks 0,1,2,3) Oct 14 06:03:29 localhost ceph-mon[303906]: Health check failed: 1/5 mons down, quorum np0005486730,np0005486729,np0005486733,np0005486732 (MON_DOWN) Oct 14 06:03:29 localhost ceph-mon[303906]: Health detail: HEALTH_WARN 1/5 mons down, quorum np0005486730,np0005486729,np0005486733,np0005486732 Oct 14 06:03:29 localhost ceph-mon[303906]: [WRN] MON_DOWN: 1/5 mons down, quorum np0005486730,np0005486729,np0005486733,np0005486732 Oct 14 06:03:29 localhost ceph-mon[303906]: mon.np0005486731 (rank 4) addr [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] is down (out of quorum) Oct 14 06:03:29 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc' Oct 14 06:03:29 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc' Oct 14 06:03:29 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005486730.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Oct 14 06:03:30 localhost ceph-mon[303906]: Reconfiguring crash.np0005486730 (monmap changed)... 
Oct 14 06:03:30 localhost ceph-mon[303906]: Reconfiguring daemon crash.np0005486730 on np0005486730.localdomain Oct 14 06:03:30 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc' Oct 14 06:03:30 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc' Oct 14 06:03:30 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005486731.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Oct 14 06:03:31 localhost ceph-mon[303906]: log_channel(cluster) log [INF] : mon.np0005486733 calling monitor election Oct 14 06:03:31 localhost ceph-mon[303906]: paxos.2).electionLogic(38) init, last seen epoch 38 Oct 14 06:03:31 localhost ceph-mon[303906]: mon.np0005486733@2(electing) e9 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Oct 14 06:03:31 localhost ceph-mon[303906]: mon.np0005486733@2(electing) e9 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Oct 14 06:03:31 localhost ceph-mon[303906]: mon.np0005486733@2(peon) e9 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Oct 14 06:03:31 localhost nova_compute[297686]: 2025-10-14 10:03:31.866 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:03:32 localhost ceph-mon[303906]: mon.np0005486731 calling monitor election Oct 14 06:03:32 localhost ceph-mon[303906]: Reconfiguring osd.2 (monmap changed)... 
Oct 14 06:03:32 localhost ceph-mon[303906]: Reconfiguring daemon osd.2 on np0005486731.localdomain Oct 14 06:03:32 localhost ceph-mon[303906]: mon.np0005486731 calling monitor election Oct 14 06:03:32 localhost ceph-mon[303906]: mon.np0005486732 calling monitor election Oct 14 06:03:32 localhost ceph-mon[303906]: mon.np0005486730 calling monitor election Oct 14 06:03:32 localhost ceph-mon[303906]: mon.np0005486733 calling monitor election Oct 14 06:03:32 localhost ceph-mon[303906]: mon.np0005486729 calling monitor election Oct 14 06:03:32 localhost ceph-mon[303906]: mon.np0005486730 is new leader, mons np0005486730,np0005486729,np0005486733,np0005486732,np0005486731 in quorum (ranks 0,1,2,3,4) Oct 14 06:03:32 localhost ceph-mon[303906]: Health check cleared: MON_DOWN (was: 1/5 mons down, quorum np0005486730,np0005486729,np0005486733,np0005486732) Oct 14 06:03:32 localhost ceph-mon[303906]: Cluster is now healthy Oct 14 06:03:32 localhost ceph-mon[303906]: overall HEALTH_OK Oct 14 06:03:32 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc' Oct 14 06:03:32 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc' Oct 14 06:03:32 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc' cmd={"prefix": "auth get", "entity": "osd.4"} : dispatch Oct 14 06:03:33 localhost ceph-mon[303906]: mon.np0005486733@2(peon).osd e80 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Oct 14 06:03:33 localhost ceph-mon[303906]: Reconfiguring osd.4 (monmap changed)... 
Oct 14 06:03:33 localhost ceph-mon[303906]: Reconfiguring daemon osd.4 on np0005486731.localdomain
Oct 14 06:03:34 localhost nova_compute[297686]: 2025-10-14 10:03:34.372 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 14 06:03:34 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc'
Oct 14 06:03:34 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc'
Oct 14 06:03:34 localhost ceph-mon[303906]: Reconfiguring mds.mds.np0005486731.onyaog (monmap changed)...
Oct 14 06:03:34 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005486731.onyaog", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Oct 14 06:03:34 localhost ceph-mon[303906]: Reconfiguring daemon mds.mds.np0005486731.onyaog on np0005486731.localdomain
Oct 14 06:03:34 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc'
Oct 14 06:03:34 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc'
Oct 14 06:03:34 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005486731.swasqz", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Oct 14 06:03:35 localhost ceph-mon[303906]: Reconfiguring mgr.np0005486731.swasqz (monmap changed)...
Oct 14 06:03:35 localhost ceph-mon[303906]: Reconfiguring daemon mgr.np0005486731.swasqz on np0005486731.localdomain
Oct 14 06:03:35 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc'
Oct 14 06:03:35 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc'
Oct 14 06:03:35 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005486732.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Oct 14 06:03:36 localhost ceph-mon[303906]: Reconfiguring crash.np0005486732 (monmap changed)...
Oct 14 06:03:36 localhost ceph-mon[303906]: Reconfiguring daemon crash.np0005486732 on np0005486732.localdomain
Oct 14 06:03:36 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc'
Oct 14 06:03:36 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc'
Oct 14 06:03:36 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch
Oct 14 06:03:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041.
Oct 14 06:03:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857.
Oct 14 06:03:36 localhost podman[307899]: 2025-10-14 10:03:36.752312227 +0000 UTC m=+0.087801341 container health_status 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct 14 06:03:36 localhost podman[307899]: 2025-10-14 10:03:36.764075536 +0000 UTC m=+0.099564650 container exec_died 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct 14 06:03:36 localhost systemd[1]: 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041.service: Deactivated successfully.
Oct 14 06:03:36 localhost systemd[1]: tmp-crun.Bz13ax.mount: Deactivated successfully.
Oct 14 06:03:36 localhost podman[307900]: 2025-10-14 10:03:36.826915365 +0000 UTC m=+0.158667125 container health_status 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251009, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5)
Oct 14 06:03:36 localhost podman[307900]: 2025-10-14 10:03:36.832280899 +0000 UTC m=+0.164032699 container exec_died 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251009)
Oct 14 06:03:36 localhost systemd[1]: 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857.service: Deactivated successfully.
Oct 14 06:03:36 localhost nova_compute[297686]: 2025-10-14 10:03:36.913 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 14 06:03:37 localhost ceph-mon[303906]: Reconfiguring osd.1 (monmap changed)...
Oct 14 06:03:37 localhost ceph-mon[303906]: Reconfiguring daemon osd.1 on np0005486732.localdomain
Oct 14 06:03:37 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc'
Oct 14 06:03:37 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc'
Oct 14 06:03:37 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc'
Oct 14 06:03:37 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc'
Oct 14 06:03:37 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc'
Oct 14 06:03:37 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc'
Oct 14 06:03:37 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc'
Oct 14 06:03:37 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc'
Oct 14 06:03:37 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc'
Oct 14 06:03:37 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc'
Oct 14 06:03:37 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc'
Oct 14 06:03:37 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc'
Oct 14 06:03:37 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc'
Oct 14 06:03:37 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc'
Oct 14 06:03:37 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc'
Oct 14 06:03:37 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc'
Oct 14 06:03:37 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch
Oct 14 06:03:38 localhost ceph-mon[303906]: mon.np0005486733@2(peon).osd e80 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct 14 06:03:38 localhost ceph-mon[303906]: mon.np0005486733@2(peon).osd e81 e81: 6 total, 6 up, 6 in
Oct 14 06:03:38 localhost ceph-mon[303906]: Reconfig service osd.default_drive_group
Oct 14 06:03:38 localhost ceph-mon[303906]: Reconfiguring osd.5 (monmap changed)...
Oct 14 06:03:38 localhost ceph-mon[303906]: Reconfiguring daemon osd.5 on np0005486732.localdomain
Oct 14 06:03:38 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc'
Oct 14 06:03:38 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc'
Oct 14 06:03:38 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc'
Oct 14 06:03:38 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc'
Oct 14 06:03:38 localhost ceph-mon[303906]: from='mgr.14184 172.18.0.105:0/819734915' entity='mgr.np0005486730.ddfidc' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005486732.xkownj", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Oct 14 06:03:38 localhost ceph-mon[303906]: from='client.? ' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch
Oct 14 06:03:38 localhost ceph-mon[303906]: Activating manager daemon np0005486731.swasqz
Oct 14 06:03:38 localhost ceph-mon[303906]: from='client.? 172.18.0.200:0/3558168517' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch
Oct 14 06:03:38 localhost ceph-mon[303906]: from='client.? ' entity='client.admin' cmd='[{"prefix": "mgr fail"}]': finished
Oct 14 06:03:38 localhost ceph-mon[303906]: mon.np0005486733@2(peon) e9 handle_command mon_command({"prefix": "mon metadata", "id": "np0005486729"} v 0)
Oct 14 06:03:38 localhost ceph-mon[303906]: log_channel(audit) log [DBG] : from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' cmd={"prefix": "mon metadata", "id": "np0005486729"} : dispatch
Oct 14 06:03:38 localhost ceph-mon[303906]: mon.np0005486733@2(peon) e9 handle_command mon_command({"prefix": "mon metadata", "id": "np0005486730"} v 0)
Oct 14 06:03:38 localhost ceph-mon[303906]: log_channel(audit) log [DBG] : from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' cmd={"prefix": "mon metadata", "id": "np0005486730"} : dispatch
Oct 14 06:03:38 localhost ceph-mon[303906]: mon.np0005486733@2(peon) e9 handle_command mon_command({"prefix": "mon metadata", "id": "np0005486731"} v 0)
Oct 14 06:03:38 localhost ceph-mon[303906]: log_channel(audit) log [DBG] : from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' cmd={"prefix": "mon metadata", "id": "np0005486731"} : dispatch
Oct 14 06:03:38 localhost ceph-mon[303906]: mon.np0005486733@2(peon) e9 handle_command mon_command({"prefix": "mon metadata", "id": "np0005486732"} v 0)
Oct 14 06:03:38 localhost ceph-mon[303906]: log_channel(audit) log [DBG] : from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' cmd={"prefix": "mon metadata", "id": "np0005486732"} : dispatch
Oct 14 06:03:38 localhost ceph-mon[303906]: mon.np0005486733@2(peon) e9 handle_command mon_command({"prefix": "mon metadata", "id": "np0005486733"} v 0)
Oct 14 06:03:38 localhost ceph-mon[303906]: log_channel(audit) log [DBG] : from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' cmd={"prefix": "mon metadata", "id": "np0005486733"} : dispatch
Oct 14 06:03:38 localhost ceph-mon[303906]: mon.np0005486733@2(peon) e9 handle_command mon_command({"prefix": "mds metadata", "who": "mds.np0005486733.tvstmf"} v 0)
Oct 14 06:03:38 localhost ceph-mon[303906]: log_channel(audit) log [DBG] : from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' cmd={"prefix": "mds metadata", "who": "mds.np0005486733.tvstmf"} : dispatch
Oct 14 06:03:38 localhost ceph-mon[303906]: mon.np0005486733@2(peon).mds e16 all = 0
Oct 14 06:03:38 localhost ceph-mon[303906]: mon.np0005486733@2(peon) e9 handle_command mon_command({"prefix": "mds metadata", "who": "mds.np0005486731.onyaog"} v 0)
Oct 14 06:03:38 localhost ceph-mon[303906]: log_channel(audit) log [DBG] : from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' cmd={"prefix": "mds metadata", "who": "mds.np0005486731.onyaog"} : dispatch
Oct 14 06:03:38 localhost ceph-mon[303906]: mon.np0005486733@2(peon).mds e16 all = 0
Oct 14 06:03:38 localhost ceph-mon[303906]: mon.np0005486733@2(peon) e9 handle_command mon_command({"prefix": "mds metadata", "who": "mds.np0005486732.xkownj"} v 0)
Oct 14 06:03:38 localhost ceph-mon[303906]: log_channel(audit) log [DBG] : from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' cmd={"prefix": "mds metadata", "who": "mds.np0005486732.xkownj"} : dispatch
Oct 14 06:03:38 localhost ceph-mon[303906]: mon.np0005486733@2(peon).mds e16 all = 0
Oct 14 06:03:38 localhost ceph-mon[303906]: mon.np0005486733@2(peon) e9 handle_command mon_command({"prefix": "mgr metadata", "who": "np0005486731.swasqz", "id": "np0005486731.swasqz"} v 0)
Oct 14 06:03:38 localhost ceph-mon[303906]: log_channel(audit) log [DBG] : from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' cmd={"prefix": "mgr metadata", "who": "np0005486731.swasqz", "id": "np0005486731.swasqz"} : dispatch
Oct 14 06:03:38 localhost ceph-mon[303906]: mon.np0005486733@2(peon) e9 handle_command mon_command({"prefix": "mgr metadata", "who": "np0005486732.pasqzz", "id": "np0005486732.pasqzz"} v 0)
Oct 14 06:03:38 localhost ceph-mon[303906]: log_channel(audit) log [DBG] : from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' cmd={"prefix": "mgr metadata", "who": "np0005486732.pasqzz", "id": "np0005486732.pasqzz"} : dispatch
Oct 14 06:03:38 localhost ceph-mon[303906]: mon.np0005486733@2(peon) e9 handle_command mon_command({"prefix": "mgr metadata", "who": "np0005486733.primvu", "id": "np0005486733.primvu"} v 0)
Oct 14 06:03:38 localhost ceph-mon[303906]: log_channel(audit) log [DBG] : from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' cmd={"prefix": "mgr metadata", "who": "np0005486733.primvu", "id": "np0005486733.primvu"} : dispatch
Oct 14 06:03:38 localhost ceph-mon[303906]: mon.np0005486733@2(peon) e9 handle_command mon_command({"prefix": "mgr metadata", "who": "np0005486728.giajub", "id": "np0005486728.giajub"} v 0)
Oct 14 06:03:38 localhost ceph-mon[303906]: log_channel(audit) log [DBG] : from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' cmd={"prefix": "mgr metadata", "who": "np0005486728.giajub", "id": "np0005486728.giajub"} : dispatch
Oct 14 06:03:38 localhost ceph-mon[303906]: mon.np0005486733@2(peon) e9 handle_command mon_command({"prefix": "mgr metadata", "who": "np0005486729.xpybho", "id": "np0005486729.xpybho"} v 0)
Oct 14 06:03:38 localhost ceph-mon[303906]: log_channel(audit) log [DBG] : from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' cmd={"prefix": "mgr metadata", "who": "np0005486729.xpybho", "id": "np0005486729.xpybho"} : dispatch
Oct 14 06:03:38 localhost systemd[1]: session-67.scope: Deactivated successfully.
Oct 14 06:03:38 localhost systemd[1]: session-67.scope: Consumed 16.422s CPU time.
Oct 14 06:03:38 localhost ceph-mon[303906]: mon.np0005486733@2(peon) e9 handle_command mon_command({"prefix": "osd metadata", "id": 0} v 0)
Oct 14 06:03:38 localhost ceph-mon[303906]: log_channel(audit) log [DBG] : from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' cmd={"prefix": "osd metadata", "id": 0} : dispatch
Oct 14 06:03:38 localhost systemd-logind[760]: Session 67 logged out. Waiting for processes to exit.
Oct 14 06:03:38 localhost ceph-mon[303906]: mon.np0005486733@2(peon) e9 handle_command mon_command({"prefix": "osd metadata", "id": 1} v 0)
Oct 14 06:03:38 localhost ceph-mon[303906]: log_channel(audit) log [DBG] : from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' cmd={"prefix": "osd metadata", "id": 1} : dispatch
Oct 14 06:03:38 localhost ceph-mon[303906]: mon.np0005486733@2(peon) e9 handle_command mon_command({"prefix": "osd metadata", "id": 2} v 0)
Oct 14 06:03:38 localhost ceph-mon[303906]: log_channel(audit) log [DBG] : from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' cmd={"prefix": "osd metadata", "id": 2} : dispatch
Oct 14 06:03:38 localhost ceph-mon[303906]: mon.np0005486733@2(peon) e9 handle_command mon_command({"prefix": "osd metadata", "id": 3} v 0)
Oct 14 06:03:38 localhost ceph-mon[303906]: log_channel(audit) log [DBG] : from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' cmd={"prefix": "osd metadata", "id": 3} : dispatch
Oct 14 06:03:38 localhost ceph-mon[303906]: mon.np0005486733@2(peon) e9 handle_command mon_command({"prefix": "osd metadata", "id": 4} v 0)
Oct 14 06:03:38 localhost ceph-mon[303906]: log_channel(audit) log [DBG] : from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' cmd={"prefix": "osd metadata", "id": 4} : dispatch
Oct 14 06:03:38 localhost ceph-mon[303906]: mon.np0005486733@2(peon) e9 handle_command mon_command({"prefix": "osd metadata", "id": 5} v 0)
Oct 14 06:03:38 localhost ceph-mon[303906]: log_channel(audit) log [DBG] : from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' cmd={"prefix": "osd metadata", "id": 5} : dispatch
Oct 14 06:03:38 localhost systemd-logind[760]: Removed session 67.
Oct 14 06:03:38 localhost ceph-mon[303906]: mon.np0005486733@2(peon) e9 handle_command mon_command({"prefix": "mds metadata"} v 0)
Oct 14 06:03:38 localhost ceph-mon[303906]: log_channel(audit) log [DBG] : from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' cmd={"prefix": "mds metadata"} : dispatch
Oct 14 06:03:38 localhost ceph-mon[303906]: mon.np0005486733@2(peon).mds e16 all = 1
Oct 14 06:03:38 localhost ceph-mon[303906]: mon.np0005486733@2(peon) e9 handle_command mon_command({"prefix": "osd metadata"} v 0)
Oct 14 06:03:38 localhost ceph-mon[303906]: log_channel(audit) log [DBG] : from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' cmd={"prefix": "osd metadata"} : dispatch
Oct 14 06:03:38 localhost ceph-mon[303906]: mon.np0005486733@2(peon) e9 handle_command mon_command({"prefix": "mon metadata"} v 0)
Oct 14 06:03:38 localhost ceph-mon[303906]: log_channel(audit) log [DBG] : from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' cmd={"prefix": "mon metadata"} : dispatch
Oct 14 06:03:38 localhost openstack_network_exporter[250374]: ERROR 10:03:38 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 14 06:03:38 localhost openstack_network_exporter[250374]: ERROR 10:03:38 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 14 06:03:38 localhost openstack_network_exporter[250374]: ERROR 10:03:38 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 14 06:03:38 localhost ceph-mon[303906]: mon.np0005486733@2(peon) e9 handle_command mon_command({"prefix":"config-key del","key":"mgr/cephadm/host.np0005486728.localdomain.devices.0"} v 0)
Oct 14 06:03:38 localhost ceph-mon[303906]: log_channel(audit) log [INF] : from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005486728.localdomain.devices.0"} : dispatch
Oct 14 06:03:38 localhost openstack_network_exporter[250374]: ERROR 10:03:38 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 14 06:03:38 localhost openstack_network_exporter[250374]:
Oct 14 06:03:38 localhost openstack_network_exporter[250374]: ERROR 10:03:38 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 14 06:03:38 localhost openstack_network_exporter[250374]:
Oct 14 06:03:38 localhost ceph-mon[303906]: mon.np0005486733@2(peon) e9 handle_command mon_command({"prefix":"config-key del","key":"mgr/cephadm/host.np0005486728.localdomain.devices.0"} v 0)
Oct 14 06:03:38 localhost ceph-mon[303906]: log_channel(audit) log [INF] : from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005486728.localdomain.devices.0"} : dispatch
Oct 14 06:03:38 localhost ceph-mon[303906]: mon.np0005486733@2(peon) e9 handle_command mon_command({"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005486731.swasqz/mirror_snapshot_schedule"} v 0)
Oct 14 06:03:38 localhost ceph-mon[303906]: log_channel(audit) log [INF] : from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005486731.swasqz/mirror_snapshot_schedule"} : dispatch
Oct 14 06:03:38 localhost ceph-mon[303906]: mon.np0005486733@2(peon) e9 handle_command mon_command({"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005486731.swasqz/trash_purge_schedule"} v 0)
Oct 14 06:03:38 localhost ceph-mon[303906]: log_channel(audit) log [INF] : from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005486731.swasqz/trash_purge_schedule"} : dispatch
Oct 14 06:03:39 localhost sshd[307940]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 06:03:39 localhost systemd-logind[760]: New session 68 of user ceph-admin.
Oct 14 06:03:39 localhost systemd[1]: Started Session 68 of User ceph-admin.
Oct 14 06:03:39 localhost nova_compute[297686]: 2025-10-14 10:03:39.407 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 14 06:03:39 localhost ceph-mon[303906]: Manager daemon np0005486731.swasqz is now available
Oct 14 06:03:39 localhost ceph-mon[303906]: removing stray HostCache host record np0005486728.localdomain.devices.0
Oct 14 06:03:39 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005486728.localdomain.devices.0"} : dispatch
Oct 14 06:03:39 localhost ceph-mon[303906]: from='mgr.17397 ' entity='mgr.np0005486731.swasqz' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005486728.localdomain.devices.0"} : dispatch
Oct 14 06:03:39 localhost ceph-mon[303906]: from='mgr.17397 ' entity='mgr.np0005486731.swasqz' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/host.np0005486728.localdomain.devices.0"}]': finished
Oct 14 06:03:39 localhost ceph-mon[303906]: from='mgr.17397 ' entity='mgr.np0005486731.swasqz' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005486728.localdomain.devices.0"} : dispatch
Oct 14 06:03:39 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005486728.localdomain.devices.0"} : dispatch
Oct 14 06:03:39 localhost ceph-mon[303906]: from='mgr.17397 ' entity='mgr.np0005486731.swasqz' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/host.np0005486728.localdomain.devices.0"}]': finished
Oct 14 06:03:39 localhost ceph-mon[303906]: from='mgr.17397 ' entity='mgr.np0005486731.swasqz' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005486731.swasqz/mirror_snapshot_schedule"} : dispatch
Oct 14 06:03:39 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005486731.swasqz/mirror_snapshot_schedule"} : dispatch
Oct 14 06:03:39 localhost ceph-mon[303906]: from='mgr.17397 ' entity='mgr.np0005486731.swasqz' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005486731.swasqz/trash_purge_schedule"} : dispatch
Oct 14 06:03:39 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005486731.swasqz/trash_purge_schedule"} : dispatch
Oct 14 06:03:40 localhost podman[308048]: 2025-10-14 10:03:40.281028861 +0000 UTC m=+0.097860838 container exec b19a91314e045cbe5465f6abdeb6b420108fcda7860b498b01dcd4e65236d2c8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-crash-np0005486733, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux , name=rhceph, io.buildah.version=1.33.12, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, release=553, version=7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.expose-services=, distribution-scope=public, RELEASE=main, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-09-24T08:57:55, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64)
Oct 14 06:03:40 localhost podman[308048]: 2025-10-14 10:03:40.410963498 +0000 UTC m=+0.227795475 container exec_died b19a91314e045cbe5465f6abdeb6b420108fcda7860b498b01dcd4e65236d2c8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-crash-np0005486733, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, distribution-scope=public, name=rhceph, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, version=7, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-09-24T08:57:55, maintainer=Guillaume Abrioux , io.buildah.version=1.33.12, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.expose-services=, vendor=Red Hat, Inc., release=553, com.redhat.component=rhceph-container)
Oct 14 06:03:40 localhost ceph-mon[303906]: mon.np0005486733@2(peon) e9 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005486729.localdomain.devices.0}] v 0)
Oct 14 06:03:40 localhost ceph-mon[303906]: mon.np0005486733@2(peon) e9 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005486729.localdomain}] v 0)
Oct 14 06:03:40 localhost ceph-mon[303906]: mon.np0005486733@2(peon) e9 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005486730.localdomain.devices.0}] v 0)
Oct 14 06:03:40 localhost ceph-mon[303906]: mon.np0005486733@2(peon) e9 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005486730.localdomain}] v 0)
Oct 14 06:03:41 localhost ceph-mon[303906]: mon.np0005486733@2(peon) e9 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005486733.localdomain.devices.0}] v 0)
Oct 14 06:03:41 localhost ceph-mon[303906]: mon.np0005486733@2(peon) e9 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005486733.localdomain}] v 0)
Oct 14 06:03:41 localhost ceph-mon[303906]: mon.np0005486733@2(peon) e9 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005486731.localdomain.devices.0}] v 0)
Oct 14 06:03:41 localhost ceph-mon[303906]: mon.np0005486733@2(peon) e9 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005486731.localdomain}] v 0)
Oct 14 06:03:41 localhost ceph-mon[303906]: mon.np0005486733@2(peon) e9 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005486732.localdomain.devices.0}] v 0)
Oct 14 06:03:41 localhost ceph-mon[303906]: mon.np0005486733@2(peon) e9 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005486732.localdomain}] v 0)
Oct 14 06:03:41 localhost ceph-mon[303906]: [14/Oct/2025:10:03:40] ENGINE Bus STARTING
Oct 14 06:03:41 localhost ceph-mon[303906]: [14/Oct/2025:10:03:40] ENGINE Serving on https://172.18.0.106:7150
Oct 14 06:03:41 localhost ceph-mon[303906]: [14/Oct/2025:10:03:40] ENGINE Client ('172.18.0.106', 58016) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)')
Oct 14 06:03:41 localhost ceph-mon[303906]: [14/Oct/2025:10:03:40] ENGINE Serving on http://172.18.0.106:8765
Oct 14 06:03:41 localhost ceph-mon[303906]: [14/Oct/2025:10:03:40] ENGINE Bus STARTED
Oct 14 06:03:41 localhost ceph-mon[303906]: from='mgr.17397 ' entity='mgr.np0005486731.swasqz'
Oct 14 06:03:41 localhost ceph-mon[303906]: from='mgr.17397 ' entity='mgr.np0005486731.swasqz'
Oct 14 06:03:41 localhost ceph-mon[303906]: from='mgr.17397 ' entity='mgr.np0005486731.swasqz'
Oct 14 06:03:41 localhost ceph-mon[303906]: from='mgr.17397 ' entity='mgr.np0005486731.swasqz'
Oct 14 06:03:41 localhost ceph-mon[303906]: from='mgr.17397 ' entity='mgr.np0005486731.swasqz'
Oct 14 06:03:41 localhost ceph-mon[303906]: from='mgr.17397 ' entity='mgr.np0005486731.swasqz'
Oct 14 06:03:41 localhost ceph-mon[303906]: from='mgr.17397 ' entity='mgr.np0005486731.swasqz'
Oct 14 06:03:41 localhost ceph-mon[303906]: from='mgr.17397 ' entity='mgr.np0005486731.swasqz'
Oct 14 06:03:41 localhost ceph-mon[303906]: from='mgr.17397 ' entity='mgr.np0005486731.swasqz'
Oct 14 06:03:41 localhost ceph-mon[303906]: from='mgr.17397 ' entity='mgr.np0005486731.swasqz'
Oct 14 06:03:41 localhost nova_compute[297686]: 2025-10-14 10:03:41.963 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 14 06:03:42 localhost ceph-mon[303906]: mon.np0005486733@2(peon) e9 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005486729.localdomain.devices.0}] v 0)
Oct 14 06:03:42 localhost ceph-mon[303906]: mon.np0005486733@2(peon) e9 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005486730.localdomain.devices.0}] v 0)
Oct 14 06:03:42 localhost ceph-mon[303906]: mon.np0005486733@2(peon) e9 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005486729.localdomain}] v 0)
Oct 14 06:03:42 localhost ceph-mon[303906]: mon.np0005486733@2(peon) e9 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005486730.localdomain}] v 0)
Oct 14 06:03:42 localhost ceph-mon[303906]: mon.np0005486733@2(peon) e9 handle_command mon_command({"prefix": "config rm", "who": "osd/host:np0005486729", "name": "osd_memory_target"} v 0)
Oct 14 06:03:42 localhost ceph-mon[303906]: log_channel(audit) log [INF] : from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' cmd={"prefix": "config rm", "who": "osd/host:np0005486729", "name": "osd_memory_target"} : dispatch
Oct 14 06:03:42 localhost ceph-mon[303906]: mon.np0005486733@2(peon) e9 handle_command mon_command({"prefix": "config rm", "who": "osd/host:np0005486730", "name": "osd_memory_target"} v 0)
Oct 14 06:03:42 localhost ceph-mon[303906]: log_channel(audit) log [INF] : from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' cmd={"prefix": "config rm", "who": "osd/host:np0005486730", "name": "osd_memory_target"} : dispatch
Oct 14 06:03:42 localhost ceph-mon[303906]: mon.np0005486733@2(peon) e9 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005486733.localdomain.devices.0}] v 0)
Oct 14 06:03:42 localhost ceph-mon[303906]: mon.np0005486733@2(peon) e9 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005486731.localdomain.devices.0}] v 0)
Oct 14 06:03:42 localhost ceph-mon[303906]: mon.np0005486733@2(peon) e9 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005486733.localdomain}] v 0)
Oct 14 06:03:42 localhost ceph-mon[303906]: mon.np0005486733@2(peon) e9 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005486731.localdomain}] v 0)
Oct 14 06:03:42 localhost ceph-mon[303906]: mon.np0005486733@2(peon) e9 handle_command mon_command({"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} v 0)
Oct 14 06:03:42 localhost ceph-mon[303906]: log_channel(audit) log [INF] : from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch
Oct 14 06:03:42 localhost ceph-mon[303906]: mon.np0005486733@2(peon) e9 handle_command mon_command({"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} v 0)
Oct 14 06:03:42 localhost ceph-mon[303906]: log_channel(audit) log [INF] : from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch
Oct 14 06:03:42 localhost ceph-mon[303906]: mon.np0005486733@2(peon) e9 handle_command mon_command({"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} v 0)
Oct 14 06:03:42 localhost ceph-mon[303906]: log_channel(audit) log [INF] : from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch
Oct 14 06:03:42 localhost ceph-mon[303906]: mon.np0005486733@2(peon) e9 handle_command mon_command({"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} v 0)
Oct 14 06:03:42 localhost ceph-mon[303906]: log_channel(audit) log [INF] : from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch
Oct 14 06:03:42 localhost ceph-mon[303906]: mon.np0005486733@2(peon) e9 handle_command mon_command([{prefix=config set, name=osd_memory_target}] v 0)
Oct 14 06:03:42 localhost ceph-mon[303906]: mon.np0005486733@2(peon) e9 handle_command mon_command([{prefix=config set, name=osd_memory_target}] v 0)
Oct 14 06:03:42 localhost ceph-mon[303906]: mon.np0005486733@2(peon) e9 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005486732.localdomain.devices.0}] v 0)
Oct 14 06:03:42 localhost ceph-mon[303906]: mon.np0005486733@2(peon) e9 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005486732.localdomain}] v 0)
Oct 14 06:03:42 localhost ceph-mon[303906]: mon.np0005486733@2(peon) e9 handle_command mon_command({"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} v 0)
Oct 14
06:03:42 localhost ceph-mon[303906]: log_channel(audit) log [INF] : from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch Oct 14 06:03:42 localhost ceph-mon[303906]: mon.np0005486733@2(peon) e9 handle_command mon_command({"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} v 0) Oct 14 06:03:42 localhost ceph-mon[303906]: log_channel(audit) log [INF] : from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch Oct 14 06:03:42 localhost ceph-mon[303906]: mon.np0005486733@2(peon) e9 handle_command mon_command([{prefix=config set, name=osd_memory_target}] v 0) Oct 14 06:03:42 localhost ceph-mon[303906]: mon.np0005486733@2(peon) e9 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) Oct 14 06:03:42 localhost ceph-mon[303906]: log_channel(audit) log [DBG] : from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' cmd={"prefix": "config generate-minimal-conf"} : dispatch Oct 14 06:03:42 localhost ceph-mon[303906]: mon.np0005486733@2(peon) e9 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0) Oct 14 06:03:42 localhost ceph-mon[303906]: log_channel(audit) log [INF] : from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Oct 14 06:03:43 localhost ceph-mon[303906]: from='mgr.17397 ' entity='mgr.np0005486731.swasqz' Oct 14 06:03:43 localhost ceph-mon[303906]: from='mgr.17397 ' entity='mgr.np0005486731.swasqz' Oct 14 06:03:43 localhost ceph-mon[303906]: from='mgr.17397 ' entity='mgr.np0005486731.swasqz' Oct 14 06:03:43 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' cmd={"prefix": "config rm", "who": "osd/host:np0005486729", "name": "osd_memory_target"} 
: dispatch Oct 14 06:03:43 localhost ceph-mon[303906]: from='mgr.17397 ' entity='mgr.np0005486731.swasqz' Oct 14 06:03:43 localhost ceph-mon[303906]: from='mgr.17397 ' entity='mgr.np0005486731.swasqz' cmd={"prefix": "config rm", "who": "osd/host:np0005486729", "name": "osd_memory_target"} : dispatch Oct 14 06:03:43 localhost ceph-mon[303906]: from='mgr.17397 ' entity='mgr.np0005486731.swasqz' cmd={"prefix": "config rm", "who": "osd/host:np0005486730", "name": "osd_memory_target"} : dispatch Oct 14 06:03:43 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' cmd={"prefix": "config rm", "who": "osd/host:np0005486730", "name": "osd_memory_target"} : dispatch Oct 14 06:03:43 localhost ceph-mon[303906]: from='mgr.17397 ' entity='mgr.np0005486731.swasqz' Oct 14 06:03:43 localhost ceph-mon[303906]: from='mgr.17397 ' entity='mgr.np0005486731.swasqz' Oct 14 06:03:43 localhost ceph-mon[303906]: from='mgr.17397 ' entity='mgr.np0005486731.swasqz' Oct 14 06:03:43 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch Oct 14 06:03:43 localhost ceph-mon[303906]: from='mgr.17397 ' entity='mgr.np0005486731.swasqz' Oct 14 06:03:43 localhost ceph-mon[303906]: from='mgr.17397 ' entity='mgr.np0005486731.swasqz' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch Oct 14 06:03:43 localhost ceph-mon[303906]: from='mgr.17397 ' entity='mgr.np0005486731.swasqz' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch Oct 14 06:03:43 localhost ceph-mon[303906]: from='mgr.17397 ' entity='mgr.np0005486731.swasqz' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch Oct 14 06:03:43 localhost ceph-mon[303906]: from='mgr.17397 ' entity='mgr.np0005486731.swasqz' cmd={"prefix": "config rm", "who": "osd.4", "name": 
"osd_memory_target"} : dispatch Oct 14 06:03:43 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch Oct 14 06:03:43 localhost ceph-mon[303906]: Adjusting osd_memory_target on np0005486733.localdomain to 836.6M Oct 14 06:03:43 localhost ceph-mon[303906]: Adjusting osd_memory_target on np0005486731.localdomain to 836.6M Oct 14 06:03:43 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch Oct 14 06:03:43 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch Oct 14 06:03:43 localhost ceph-mon[303906]: Unable to set osd_memory_target on np0005486731.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096 Oct 14 06:03:43 localhost ceph-mon[303906]: Unable to set osd_memory_target on np0005486733.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096 Oct 14 06:03:43 localhost ceph-mon[303906]: from='mgr.17397 ' entity='mgr.np0005486731.swasqz' Oct 14 06:03:43 localhost ceph-mon[303906]: from='mgr.17397 ' entity='mgr.np0005486731.swasqz' Oct 14 06:03:43 localhost ceph-mon[303906]: from='mgr.17397 ' entity='mgr.np0005486731.swasqz' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch Oct 14 06:03:43 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch Oct 14 06:03:43 localhost ceph-mon[303906]: from='mgr.17397 ' entity='mgr.np0005486731.swasqz' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch Oct 14 06:03:43 
localhost ceph-mon[303906]: Adjusting osd_memory_target on np0005486732.localdomain to 836.6M Oct 14 06:03:43 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch Oct 14 06:03:43 localhost ceph-mon[303906]: Unable to set osd_memory_target on np0005486732.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096 Oct 14 06:03:43 localhost ceph-mon[303906]: Updating np0005486729.localdomain:/etc/ceph/ceph.conf Oct 14 06:03:43 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Oct 14 06:03:43 localhost ceph-mon[303906]: Updating np0005486730.localdomain:/etc/ceph/ceph.conf Oct 14 06:03:43 localhost ceph-mon[303906]: Updating np0005486731.localdomain:/etc/ceph/ceph.conf Oct 14 06:03:43 localhost ceph-mon[303906]: Updating np0005486732.localdomain:/etc/ceph/ceph.conf Oct 14 06:03:43 localhost ceph-mon[303906]: Updating np0005486733.localdomain:/etc/ceph/ceph.conf Oct 14 06:03:43 localhost ceph-mon[303906]: mon.np0005486733@2(peon).osd e81 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Oct 14 06:03:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29. Oct 14 06:03:43 localhost systemd[1]: tmp-crun.oEih6b.mount: Deactivated successfully. 
Oct 14 06:03:43 localhost podman[308500]: 2025-10-14 10:03:43.891330095 +0000 UTC m=+0.091294988 container health_status fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251009, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, org.label-schema.schema-version=1.0) Oct 14 06:03:43 localhost podman[308500]: 2025-10-14 10:03:43.930076778 +0000 UTC m=+0.130041741 container exec_died fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 
(image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, config_id=iscsid, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible) Oct 14 06:03:43 localhost systemd[1]: fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29.service: Deactivated successfully. 
Oct 14 06:03:44 localhost nova_compute[297686]: 2025-10-14 10:03:44.435 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:03:44 localhost ceph-mon[303906]: Updating np0005486732.localdomain:/var/lib/ceph/fcadf6e2-9176-5818-a8d0-37b19acf8eaf/config/ceph.conf Oct 14 06:03:44 localhost ceph-mon[303906]: Updating np0005486729.localdomain:/var/lib/ceph/fcadf6e2-9176-5818-a8d0-37b19acf8eaf/config/ceph.conf Oct 14 06:03:44 localhost ceph-mon[303906]: Updating np0005486730.localdomain:/var/lib/ceph/fcadf6e2-9176-5818-a8d0-37b19acf8eaf/config/ceph.conf Oct 14 06:03:44 localhost ceph-mon[303906]: Updating np0005486733.localdomain:/var/lib/ceph/fcadf6e2-9176-5818-a8d0-37b19acf8eaf/config/ceph.conf Oct 14 06:03:44 localhost ceph-mon[303906]: Updating np0005486731.localdomain:/var/lib/ceph/fcadf6e2-9176-5818-a8d0-37b19acf8eaf/config/ceph.conf Oct 14 06:03:44 localhost ceph-mon[303906]: mon.np0005486733@2(peon) e9 handle_command mon_command({"prefix": "mgr metadata", "who": "np0005486730.ddfidc", "id": "np0005486730.ddfidc"} v 0) Oct 14 06:03:44 localhost ceph-mon[303906]: log_channel(audit) log [DBG] : from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' cmd={"prefix": "mgr metadata", "who": "np0005486730.ddfidc", "id": "np0005486730.ddfidc"} : dispatch Oct 14 06:03:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad. Oct 14 06:03:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec. 
Oct 14 06:03:45 localhost podman[308787]: 2025-10-14 10:03:45.081272481 +0000 UTC m=+0.086135201 container health_status 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=multipathd, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=multipathd, managed_by=edpm_ansible, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}) Oct 14 06:03:45 localhost podman[308787]: 2025-10-14 10:03:45.097099324 +0000 UTC m=+0.101962034 container exec_died 
02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, container_name=multipathd, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=multipathd, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2) Oct 14 06:03:45 localhost systemd[1]: 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad.service: Deactivated successfully. 
Oct 14 06:03:45 localhost podman[308822]: 2025-10-14 10:03:45.176997593 +0000 UTC m=+0.087766350 container health_status aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Oct 14 06:03:45 localhost podman[308822]: 2025-10-14 10:03:45.192386733 +0000 UTC m=+0.103155520 container exec_died aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 
'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm) Oct 14 06:03:45 localhost systemd[1]: aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec.service: Deactivated successfully. 
Oct 14 06:03:45 localhost ceph-mon[303906]: mon.np0005486733@2(peon) e9 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005486730.localdomain.devices.0}] v 0) Oct 14 06:03:45 localhost ceph-mon[303906]: mon.np0005486733@2(peon) e9 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005486730.localdomain}] v 0) Oct 14 06:03:45 localhost ceph-mon[303906]: mon.np0005486733@2(peon) e9 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005486729.localdomain.devices.0}] v 0) Oct 14 06:03:45 localhost ceph-mon[303906]: mon.np0005486733@2(peon) e9 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005486729.localdomain}] v 0) Oct 14 06:03:45 localhost ceph-mon[303906]: mon.np0005486733@2(peon) e9 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005486732.localdomain.devices.0}] v 0) Oct 14 06:03:45 localhost ceph-mon[303906]: mon.np0005486733@2(peon) e9 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005486732.localdomain}] v 0) Oct 14 06:03:45 localhost ceph-mon[303906]: mon.np0005486733@2(peon) e9 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005486731.localdomain.devices.0}] v 0) Oct 14 06:03:45 localhost ceph-mon[303906]: mon.np0005486733@2(peon) e9 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005486731.localdomain}] v 0) Oct 14 06:03:45 localhost ceph-mon[303906]: Updating np0005486730.localdomain:/etc/ceph/ceph.client.admin.keyring Oct 14 06:03:45 localhost ceph-mon[303906]: Updating np0005486729.localdomain:/etc/ceph/ceph.client.admin.keyring Oct 14 06:03:45 localhost ceph-mon[303906]: Updating np0005486732.localdomain:/etc/ceph/ceph.client.admin.keyring Oct 14 06:03:45 localhost ceph-mon[303906]: Updating np0005486731.localdomain:/etc/ceph/ceph.client.admin.keyring Oct 14 06:03:45 localhost ceph-mon[303906]: Updating 
np0005486733.localdomain:/etc/ceph/ceph.client.admin.keyring Oct 14 06:03:45 localhost ceph-mon[303906]: Updating np0005486730.localdomain:/var/lib/ceph/fcadf6e2-9176-5818-a8d0-37b19acf8eaf/config/ceph.client.admin.keyring Oct 14 06:03:45 localhost ceph-mon[303906]: Updating np0005486729.localdomain:/var/lib/ceph/fcadf6e2-9176-5818-a8d0-37b19acf8eaf/config/ceph.client.admin.keyring Oct 14 06:03:45 localhost ceph-mon[303906]: Updating np0005486732.localdomain:/var/lib/ceph/fcadf6e2-9176-5818-a8d0-37b19acf8eaf/config/ceph.client.admin.keyring Oct 14 06:03:45 localhost ceph-mon[303906]: from='mgr.17397 ' entity='mgr.np0005486731.swasqz' Oct 14 06:03:45 localhost ceph-mon[303906]: from='mgr.17397 ' entity='mgr.np0005486731.swasqz' Oct 14 06:03:45 localhost ceph-mon[303906]: from='mgr.17397 ' entity='mgr.np0005486731.swasqz' Oct 14 06:03:45 localhost ceph-mon[303906]: from='mgr.17397 ' entity='mgr.np0005486731.swasqz' Oct 14 06:03:45 localhost ceph-mon[303906]: from='mgr.17397 ' entity='mgr.np0005486731.swasqz' Oct 14 06:03:45 localhost ceph-mon[303906]: from='mgr.17397 ' entity='mgr.np0005486731.swasqz' Oct 14 06:03:45 localhost ceph-mon[303906]: from='mgr.17397 ' entity='mgr.np0005486731.swasqz' Oct 14 06:03:45 localhost ceph-mon[303906]: from='mgr.17397 ' entity='mgr.np0005486731.swasqz' Oct 14 06:03:45 localhost ceph-mon[303906]: mon.np0005486733@2(peon) e9 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005486733.localdomain.devices.0}] v 0) Oct 14 06:03:45 localhost ceph-mon[303906]: mon.np0005486733@2(peon) e9 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005486733.localdomain}] v 0) Oct 14 06:03:45 localhost ceph-mon[303906]: mon.np0005486733@2(peon) e9 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0) Oct 14 06:03:45 localhost ceph-mon[303906]: mon.np0005486733@2(peon) e9 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} 
v 0) Oct 14 06:03:45 localhost ceph-mon[303906]: log_channel(audit) log [DBG] : from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch Oct 14 06:03:46 localhost ceph-mon[303906]: mon.np0005486733@2(peon) e9 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.crash.np0005486729.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} v 0) Oct 14 06:03:46 localhost ceph-mon[303906]: log_channel(audit) log [INF] : from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005486729.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Oct 14 06:03:46 localhost ceph-mon[303906]: mon.np0005486733@2(peon) e9 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) Oct 14 06:03:46 localhost ceph-mon[303906]: log_channel(audit) log [DBG] : from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' cmd={"prefix": "config generate-minimal-conf"} : dispatch Oct 14 06:03:46 localhost ceph-mon[303906]: Updating np0005486731.localdomain:/var/lib/ceph/fcadf6e2-9176-5818-a8d0-37b19acf8eaf/config/ceph.client.admin.keyring Oct 14 06:03:46 localhost ceph-mon[303906]: Updating np0005486733.localdomain:/var/lib/ceph/fcadf6e2-9176-5818-a8d0-37b19acf8eaf/config/ceph.client.admin.keyring Oct 14 06:03:46 localhost ceph-mon[303906]: from='mgr.17397 ' entity='mgr.np0005486731.swasqz' Oct 14 06:03:46 localhost ceph-mon[303906]: from='mgr.17397 ' entity='mgr.np0005486731.swasqz' Oct 14 06:03:46 localhost ceph-mon[303906]: from='mgr.17397 ' entity='mgr.np0005486731.swasqz' Oct 14 06:03:46 localhost ceph-mon[303906]: from='mgr.17397 ' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005486729.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : 
dispatch Oct 14 06:03:46 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005486729.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Oct 14 06:03:47 localhost nova_compute[297686]: 2025-10-14 10:03:47.004 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:03:47 localhost ceph-mon[303906]: mon.np0005486733@2(peon) e9 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005486729.localdomain.devices.0}] v 0) Oct 14 06:03:47 localhost ceph-mon[303906]: mon.np0005486733@2(peon) e9 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005486729.localdomain}] v 0) Oct 14 06:03:47 localhost ceph-mon[303906]: mon.np0005486733@2(peon) e9 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mgr.np0005486729.xpybho", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} v 0) Oct 14 06:03:47 localhost ceph-mon[303906]: log_channel(audit) log [INF] : from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005486729.xpybho", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Oct 14 06:03:47 localhost ceph-mon[303906]: mon.np0005486733@2(peon) e9 handle_command mon_command({"prefix": "mgr services"} v 0) Oct 14 06:03:47 localhost ceph-mon[303906]: log_channel(audit) log [DBG] : from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' cmd={"prefix": "mgr services"} : dispatch Oct 14 06:03:47 localhost ceph-mon[303906]: mon.np0005486733@2(peon) e9 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) Oct 14 06:03:47 localhost ceph-mon[303906]: log_channel(audit) log [DBG] : from='mgr.17397 172.18.0.106:0/541940411' 
entity='mgr.np0005486731.swasqz' cmd={"prefix": "config generate-minimal-conf"} : dispatch Oct 14 06:03:47 localhost ceph-mon[303906]: Reconfiguring crash.np0005486729 (monmap changed)... Oct 14 06:03:47 localhost ceph-mon[303906]: Reconfiguring daemon crash.np0005486729 on np0005486729.localdomain Oct 14 06:03:47 localhost ceph-mon[303906]: Health check failed: 1 stray daemon(s) not managed by cephadm (CEPHADM_STRAY_DAEMON) Oct 14 06:03:47 localhost ceph-mon[303906]: Health check failed: 1 stray host(s) with 1 daemon(s) not managed by cephadm (CEPHADM_STRAY_HOST) Oct 14 06:03:47 localhost ceph-mon[303906]: from='mgr.17397 ' entity='mgr.np0005486731.swasqz' Oct 14 06:03:47 localhost ceph-mon[303906]: from='mgr.17397 ' entity='mgr.np0005486731.swasqz' Oct 14 06:03:47 localhost ceph-mon[303906]: from='mgr.17397 ' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005486729.xpybho", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Oct 14 06:03:47 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005486729.xpybho", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Oct 14 06:03:48 localhost ceph-mon[303906]: mon.np0005486733@2(peon) e9 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005486729.localdomain.devices.0}] v 0) Oct 14 06:03:48 localhost ceph-mon[303906]: mon.np0005486733@2(peon) e9 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005486729.localdomain}] v 0) Oct 14 06:03:48 localhost ceph-mon[303906]: mon.np0005486733@2(peon) e9 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mgr.np0005486730.ddfidc", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} v 0) Oct 14 06:03:48 localhost ceph-mon[303906]: log_channel(audit) log [INF] : from='mgr.17397 
172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005486730.ddfidc", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Oct 14 06:03:48 localhost ceph-mon[303906]: mon.np0005486733@2(peon) e9 handle_command mon_command({"prefix": "mgr services"} v 0) Oct 14 06:03:48 localhost ceph-mon[303906]: log_channel(audit) log [DBG] : from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' cmd={"prefix": "mgr services"} : dispatch Oct 14 06:03:48 localhost ceph-mon[303906]: mon.np0005486733@2(peon) e9 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) Oct 14 06:03:48 localhost ceph-mon[303906]: log_channel(audit) log [DBG] : from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' cmd={"prefix": "config generate-minimal-conf"} : dispatch Oct 14 06:03:48 localhost ceph-mon[303906]: mon.np0005486733@2(peon).osd e81 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Oct 14 06:03:48 localhost ceph-mon[303906]: Reconfiguring mgr.np0005486729.xpybho (monmap changed)... 
Oct 14 06:03:48 localhost ceph-mon[303906]: Reconfiguring daemon mgr.np0005486729.xpybho on np0005486729.localdomain Oct 14 06:03:48 localhost ceph-mon[303906]: from='mgr.17397 ' entity='mgr.np0005486731.swasqz' Oct 14 06:03:48 localhost ceph-mon[303906]: from='mgr.17397 ' entity='mgr.np0005486731.swasqz' Oct 14 06:03:48 localhost ceph-mon[303906]: from='mgr.17397 ' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005486730.ddfidc", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Oct 14 06:03:48 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005486730.ddfidc", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Oct 14 06:03:48 localhost ceph-mon[303906]: mon.np0005486733@2(peon) e9 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0) Oct 14 06:03:48 localhost ceph-mon[303906]: mon.np0005486733@2(peon) e9 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005486730.localdomain.devices.0}] v 0) Oct 14 06:03:48 localhost ceph-mon[303906]: mon.np0005486733@2(peon) e9 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005486730.localdomain}] v 0) Oct 14 06:03:49 localhost ceph-mon[303906]: mon.np0005486733@2(peon) e9 handle_command mon_command({"prefix": "auth get", "entity": "osd.2"} v 0) Oct 14 06:03:49 localhost ceph-mon[303906]: log_channel(audit) log [INF] : from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch Oct 14 06:03:49 localhost ceph-mon[303906]: mon.np0005486733@2(peon) e9 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) Oct 14 06:03:49 localhost ceph-mon[303906]: log_channel(audit) log [DBG] : from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' 
cmd={"prefix": "config generate-minimal-conf"} : dispatch Oct 14 06:03:49 localhost nova_compute[297686]: 2025-10-14 10:03:49.478 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:03:49 localhost ceph-mon[303906]: Reconfiguring mgr.np0005486730.ddfidc (monmap changed)... Oct 14 06:03:49 localhost ceph-mon[303906]: Reconfiguring daemon mgr.np0005486730.ddfidc on np0005486730.localdomain Oct 14 06:03:49 localhost ceph-mon[303906]: from='mgr.17397 ' entity='mgr.np0005486731.swasqz' Oct 14 06:03:49 localhost ceph-mon[303906]: from='mgr.17397 ' entity='mgr.np0005486731.swasqz' Oct 14 06:03:49 localhost ceph-mon[303906]: from='mgr.17397 ' entity='mgr.np0005486731.swasqz' Oct 14 06:03:49 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch Oct 14 06:03:50 localhost ceph-mon[303906]: mon.np0005486733@2(peon) e9 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005486731.localdomain.devices.0}] v 0) Oct 14 06:03:50 localhost ceph-mon[303906]: mon.np0005486733@2(peon) e9 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005486731.localdomain}] v 0) Oct 14 06:03:50 localhost ceph-mon[303906]: mon.np0005486733@2(peon) e9 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005486731.localdomain.devices.0}] v 0) Oct 14 06:03:50 localhost ceph-mon[303906]: mon.np0005486733@2(peon) e9 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005486731.localdomain}] v 0) Oct 14 06:03:50 localhost ceph-mon[303906]: mon.np0005486733@2(peon) e9 handle_command mon_command({"prefix": "auth get", "entity": "osd.4"} v 0) Oct 14 06:03:50 localhost ceph-mon[303906]: log_channel(audit) log [INF] : from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get", 
"entity": "osd.4"} : dispatch Oct 14 06:03:50 localhost ceph-mon[303906]: mon.np0005486733@2(peon) e9 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) Oct 14 06:03:50 localhost ceph-mon[303906]: log_channel(audit) log [DBG] : from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' cmd={"prefix": "config generate-minimal-conf"} : dispatch Oct 14 06:03:50 localhost ceph-mon[303906]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #19. Immutable memtables: 0. Oct 14 06:03:50 localhost ceph-mon[303906]: rocksdb: (Original Log Time 2025/10/14-10:03:50.593654) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0 Oct 14 06:03:50 localhost ceph-mon[303906]: rocksdb: [db/flush_job.cc:856] [default] [JOB 7] Flushing memtable with next log file: 19 Oct 14 06:03:50 localhost ceph-mon[303906]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760436230593786, "job": 7, "event": "flush_started", "num_memtables": 1, "num_entries": 2320, "num_deletes": 255, "total_data_size": 8361843, "memory_usage": 9290496, "flush_reason": "Manual Compaction"} Oct 14 06:03:50 localhost ceph-mon[303906]: rocksdb: [db/flush_job.cc:885] [default] [JOB 7] Level-0 flush table #20: started Oct 14 06:03:50 localhost ceph-mon[303906]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760436230621248, "cf_name": "default", "job": 7, "event": "table_file_creation", "file_number": 20, "file_size": 4772917, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 13035, "largest_seqno": 15350, "table_properties": {"data_size": 4763839, "index_size": 5327, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2693, "raw_key_size": 23831, "raw_average_key_size": 22, "raw_value_size": 
4743703, "raw_average_value_size": 4458, "num_data_blocks": 222, "num_entries": 1064, "num_filter_entries": 1064, "num_deletions": 253, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1760436187, "oldest_key_time": 1760436187, "file_creation_time": 1760436230, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "84a9ba57-7643-4e19-a68b-d3c5f7942ded", "db_session_id": "ABXPLKSCGGN31DHJT8DW", "orig_file_number": 20, "seqno_to_time_mapping": "N/A"}} Oct 14 06:03:50 localhost ceph-mon[303906]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 7] Flush lasted 27607 microseconds, and 9974 cpu microseconds. Oct 14 06:03:50 localhost ceph-mon[303906]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. 
Oct 14 06:03:50 localhost ceph-mon[303906]: rocksdb: (Original Log Time 2025/10/14-10:03:50.621309) [db/flush_job.cc:967] [default] [JOB 7] Level-0 flush table #20: 4772917 bytes OK Oct 14 06:03:50 localhost ceph-mon[303906]: rocksdb: (Original Log Time 2025/10/14-10:03:50.621335) [db/memtable_list.cc:519] [default] Level-0 commit table #20 started Oct 14 06:03:50 localhost ceph-mon[303906]: rocksdb: (Original Log Time 2025/10/14-10:03:50.623542) [db/memtable_list.cc:722] [default] Level-0 commit table #20: memtable #1 done Oct 14 06:03:50 localhost ceph-mon[303906]: rocksdb: (Original Log Time 2025/10/14-10:03:50.623563) EVENT_LOG_v1 {"time_micros": 1760436230623558, "job": 7, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0} Oct 14 06:03:50 localhost ceph-mon[303906]: rocksdb: (Original Log Time 2025/10/14-10:03:50.623584) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25 Oct 14 06:03:50 localhost ceph-mon[303906]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 7] Try to delete WAL files size 8350328, prev total WAL file size 8350328, number of live WAL files 2. Oct 14 06:03:50 localhost ceph-mon[303906]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005486733/store.db/000016.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Oct 14 06:03:50 localhost ceph-mon[303906]: rocksdb: (Original Log Time 2025/10/14-10:03:50.624992) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6B760031303130' seq:72057594037927935, type:22 .. 
'6B760031323633' seq:0, type:0; will stop at (end) Oct 14 06:03:50 localhost ceph-mon[303906]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 8] Compacting 1@0 + 1@6 files to L6, score -1.00 Oct 14 06:03:50 localhost ceph-mon[303906]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 7 Base level 0, inputs: [20(4661KB)], [18(12MB)] Oct 14 06:03:50 localhost ceph-mon[303906]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760436230625033, "job": 8, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [20], "files_L6": [18], "score": -1, "input_data_size": 17877436, "oldest_snapshot_seqno": -1} Oct 14 06:03:50 localhost ceph-mon[303906]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 8] Generated table #21: 10384 keys, 17090419 bytes, temperature: kUnknown Oct 14 06:03:50 localhost ceph-mon[303906]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760436230724818, "cf_name": "default", "job": 8, "event": "table_file_creation", "file_number": 21, "file_size": 17090419, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 17027210, "index_size": 36164, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 25989, "raw_key_size": 277306, "raw_average_key_size": 26, "raw_value_size": 16845709, "raw_average_value_size": 1622, "num_data_blocks": 1384, "num_entries": 10384, "num_filter_entries": 10384, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; 
max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1760436113, "oldest_key_time": 0, "file_creation_time": 1760436230, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "84a9ba57-7643-4e19-a68b-d3c5f7942ded", "db_session_id": "ABXPLKSCGGN31DHJT8DW", "orig_file_number": 21, "seqno_to_time_mapping": "N/A"}} Oct 14 06:03:50 localhost ceph-mon[303906]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. Oct 14 06:03:50 localhost ceph-mon[303906]: rocksdb: (Original Log Time 2025/10/14-10:03:50.725165) [db/compaction/compaction_job.cc:1663] [default] [JOB 8] Compacted 1@0 + 1@6 files to L6 => 17090419 bytes Oct 14 06:03:50 localhost ceph-mon[303906]: rocksdb: (Original Log Time 2025/10/14-10:03:50.728231) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 179.0 rd, 171.1 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(4.6, 12.5 +0.0 blob) out(16.3 +0.0 blob), read-write-amplify(7.3) write-amplify(3.6) OK, records in: 10859, records dropped: 475 output_compression: NoCompression Oct 14 06:03:50 localhost ceph-mon[303906]: rocksdb: (Original Log Time 2025/10/14-10:03:50.728252) EVENT_LOG_v1 {"time_micros": 1760436230728243, "job": 8, "event": "compaction_finished", "compaction_time_micros": 99866, "compaction_time_cpu_micros": 28782, "output_level": 6, "num_output_files": 1, "total_output_size": 17090419, "num_input_records": 10859, "num_output_records": 10384, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]} Oct 14 06:03:50 localhost ceph-mon[303906]: rocksdb: [file/delete_scheduler.cc:74] Deleted file 
/var/lib/ceph/mon/ceph-np0005486733/store.db/000020.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Oct 14 06:03:50 localhost ceph-mon[303906]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760436230728829, "job": 8, "event": "table_file_deletion", "file_number": 20} Oct 14 06:03:50 localhost ceph-mon[303906]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005486733/store.db/000018.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Oct 14 06:03:50 localhost ceph-mon[303906]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760436230730174, "job": 8, "event": "table_file_deletion", "file_number": 18} Oct 14 06:03:50 localhost ceph-mon[303906]: rocksdb: (Original Log Time 2025/10/14-10:03:50.624911) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Oct 14 06:03:50 localhost ceph-mon[303906]: rocksdb: (Original Log Time 2025/10/14-10:03:50.730279) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Oct 14 06:03:50 localhost ceph-mon[303906]: rocksdb: (Original Log Time 2025/10/14-10:03:50.730287) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Oct 14 06:03:50 localhost ceph-mon[303906]: rocksdb: (Original Log Time 2025/10/14-10:03:50.730290) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Oct 14 06:03:50 localhost ceph-mon[303906]: rocksdb: (Original Log Time 2025/10/14-10:03:50.730293) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Oct 14 06:03:50 localhost ceph-mon[303906]: rocksdb: (Original Log Time 2025/10/14-10:03:50.730296) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Oct 14 06:03:50 localhost ceph-mon[303906]: Reconfiguring daemon osd.2 on np0005486731.localdomain Oct 14 06:03:50 localhost ceph-mon[303906]: from='mgr.17397 ' entity='mgr.np0005486731.swasqz' Oct 14 
06:03:50 localhost ceph-mon[303906]: from='mgr.17397 ' entity='mgr.np0005486731.swasqz' Oct 14 06:03:50 localhost ceph-mon[303906]: from='mgr.17397 ' entity='mgr.np0005486731.swasqz' Oct 14 06:03:50 localhost ceph-mon[303906]: from='mgr.17397 ' entity='mgr.np0005486731.swasqz' Oct 14 06:03:50 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get", "entity": "osd.4"} : dispatch Oct 14 06:03:51 localhost ceph-mon[303906]: mon.np0005486733@2(peon) e9 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005486731.localdomain.devices.0}] v 0) Oct 14 06:03:51 localhost ceph-mon[303906]: mon.np0005486733@2(peon) e9 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005486731.localdomain}] v 0) Oct 14 06:03:51 localhost ceph-mon[303906]: mon.np0005486733@2(peon) e9 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005486731.localdomain.devices.0}] v 0) Oct 14 06:03:51 localhost ceph-mon[303906]: mon.np0005486733@2(peon) e9 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005486731.localdomain}] v 0) Oct 14 06:03:51 localhost ceph-mon[303906]: mon.np0005486733@2(peon) e9 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mds.mds.np0005486732.xkownj", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} v 0) Oct 14 06:03:51 localhost ceph-mon[303906]: log_channel(audit) log [INF] : from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005486732.xkownj", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Oct 14 06:03:51 localhost ceph-mon[303906]: mon.np0005486733@2(peon) e9 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) Oct 14 06:03:51 localhost ceph-mon[303906]: log_channel(audit) log [DBG] : 
from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' cmd={"prefix": "config generate-minimal-conf"} : dispatch Oct 14 06:03:51 localhost ceph-mon[303906]: Reconfiguring daemon osd.4 on np0005486731.localdomain Oct 14 06:03:51 localhost ceph-mon[303906]: from='mgr.17397 ' entity='mgr.np0005486731.swasqz' Oct 14 06:03:51 localhost ceph-mon[303906]: from='mgr.17397 ' entity='mgr.np0005486731.swasqz' Oct 14 06:03:51 localhost ceph-mon[303906]: from='mgr.17397 ' entity='mgr.np0005486731.swasqz' Oct 14 06:03:51 localhost ceph-mon[303906]: from='mgr.17397 ' entity='mgr.np0005486731.swasqz' Oct 14 06:03:51 localhost ceph-mon[303906]: from='mgr.17397 ' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005486732.xkownj", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Oct 14 06:03:51 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005486732.xkownj", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Oct 14 06:03:52 localhost nova_compute[297686]: 2025-10-14 10:03:52.051 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:03:52 localhost ceph-mon[303906]: mon.np0005486733@2(peon) e9 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005486732.localdomain.devices.0}] v 0) Oct 14 06:03:52 localhost ceph-mon[303906]: mon.np0005486733@2(peon) e9 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005486732.localdomain}] v 0) Oct 14 06:03:52 localhost ceph-mon[303906]: mon.np0005486733@2(peon) e9 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mgr.np0005486732.pasqzz", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} v 0) Oct 14 06:03:52 
localhost ceph-mon[303906]: log_channel(audit) log [INF] : from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005486732.pasqzz", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Oct 14 06:03:52 localhost ceph-mon[303906]: mon.np0005486733@2(peon) e9 handle_command mon_command({"prefix": "mgr services"} v 0) Oct 14 06:03:52 localhost ceph-mon[303906]: log_channel(audit) log [DBG] : from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' cmd={"prefix": "mgr services"} : dispatch Oct 14 06:03:52 localhost ceph-mon[303906]: mon.np0005486733@2(peon) e9 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) Oct 14 06:03:52 localhost ceph-mon[303906]: log_channel(audit) log [DBG] : from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' cmd={"prefix": "config generate-minimal-conf"} : dispatch Oct 14 06:03:52 localhost ceph-mon[303906]: mon.np0005486733@2(peon) e9 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/spec.mon}] v 0) Oct 14 06:03:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f. Oct 14 06:03:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917. Oct 14 06:03:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d. 
Oct 14 06:03:52 localhost podman[309007]: 2025-10-14 10:03:52.764079139 +0000 UTC m=+0.094726423 container health_status 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3, config_id=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, container_name=ovn_controller, org.label-schema.schema-version=1.0) Oct 14 06:03:52 localhost systemd[1]: tmp-crun.GLSHnZ.mount: Deactivated successfully. 
Oct 14 06:03:52 localhost podman[309009]: 2025-10-14 10:03:52.819935284 +0000 UTC m=+0.142184581 container health_status 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=edpm, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}) Oct 14 06:03:52 localhost ceph-mon[303906]: Reconfiguring mds.mds.np0005486732.xkownj (monmap changed)... 
Oct 14 06:03:52 localhost ceph-mon[303906]: Reconfiguring daemon mds.mds.np0005486732.xkownj on np0005486732.localdomain Oct 14 06:03:52 localhost ceph-mon[303906]: from='mgr.17397 ' entity='mgr.np0005486731.swasqz' Oct 14 06:03:52 localhost ceph-mon[303906]: from='mgr.17397 ' entity='mgr.np0005486731.swasqz' Oct 14 06:03:52 localhost ceph-mon[303906]: from='mgr.17397 ' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005486732.pasqzz", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Oct 14 06:03:52 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005486732.pasqzz", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Oct 14 06:03:52 localhost ceph-mon[303906]: from='mgr.17397 ' entity='mgr.np0005486731.swasqz' Oct 14 06:03:52 localhost podman[309007]: 2025-10-14 10:03:52.824347459 +0000 UTC m=+0.154994773 container exec_died 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, 
tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3) Oct 14 06:03:52 localhost systemd[1]: 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f.service: Deactivated successfully. Oct 14 06:03:52 localhost podman[309009]: 2025-10-14 10:03:52.839329507 +0000 UTC m=+0.161578774 container exec_died 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251009, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute) Oct 14 06:03:52 localhost systemd[1]: 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d.service: Deactivated successfully. Oct 14 06:03:52 localhost podman[309008]: 2025-10-14 10:03:52.736740044 +0000 UTC m=+0.067893063 container health_status 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, vendor=Red Hat, Inc., config_id=edpm, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, version=9.6, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.) 
Oct 14 06:03:52 localhost podman[309008]: 2025-10-14 10:03:52.917734039 +0000 UTC m=+0.248887118 container exec_died 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, name=ubi9-minimal, managed_by=edpm_ansible, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., distribution-scope=public, maintainer=Red Hat, Inc., version=9.6, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, vcs-type=git, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7) Oct 14 06:03:52 localhost systemd[1]: 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917.service: Deactivated successfully. 
Oct 14 06:03:53 localhost ceph-mon[303906]: mon.np0005486733@2(peon) e9 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005486732.localdomain.devices.0}] v 0) Oct 14 06:03:53 localhost ceph-mon[303906]: mon.np0005486733@2(peon) e9 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005486732.localdomain}] v 0) Oct 14 06:03:53 localhost ceph-mon[303906]: mon.np0005486733@2(peon) e9 handle_command mon_command({"prefix": "auth get", "entity": "mon."} v 0) Oct 14 06:03:53 localhost ceph-mon[303906]: log_channel(audit) log [INF] : from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Oct 14 06:03:53 localhost ceph-mon[303906]: mon.np0005486733@2(peon) e9 handle_command mon_command({"prefix": "config get", "who": "mon", "key": "public_network"} v 0) Oct 14 06:03:53 localhost ceph-mon[303906]: log_channel(audit) log [DBG] : from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch Oct 14 06:03:53 localhost ceph-mon[303906]: mon.np0005486733@2(peon) e9 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) Oct 14 06:03:53 localhost ceph-mon[303906]: log_channel(audit) log [DBG] : from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' cmd={"prefix": "config generate-minimal-conf"} : dispatch Oct 14 06:03:53 localhost ceph-mon[303906]: mon.np0005486733@2(peon).osd e81 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Oct 14 06:03:53 localhost ceph-mon[303906]: Reconfiguring mgr.np0005486732.pasqzz (monmap changed)... 
Oct 14 06:03:53 localhost ceph-mon[303906]: Reconfiguring daemon mgr.np0005486732.pasqzz on np0005486732.localdomain Oct 14 06:03:53 localhost ceph-mon[303906]: Saving service mon spec with placement label:mon Oct 14 06:03:53 localhost ceph-mon[303906]: from='mgr.17397 ' entity='mgr.np0005486731.swasqz' Oct 14 06:03:53 localhost ceph-mon[303906]: from='mgr.17397 ' entity='mgr.np0005486731.swasqz' Oct 14 06:03:53 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Oct 14 06:03:54 localhost ceph-mon[303906]: mon.np0005486733@2(peon) e9 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005486732.localdomain.devices.0}] v 0) Oct 14 06:03:54 localhost ceph-mon[303906]: mon.np0005486733@2(peon) e9 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005486732.localdomain}] v 0) Oct 14 06:03:54 localhost ceph-mon[303906]: mon.np0005486733@2(peon) e9 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.crash.np0005486733.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} v 0) Oct 14 06:03:54 localhost ceph-mon[303906]: log_channel(audit) log [INF] : from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005486733.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Oct 14 06:03:54 localhost ceph-mon[303906]: mon.np0005486733@2(peon) e9 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) Oct 14 06:03:54 localhost ceph-mon[303906]: log_channel(audit) log [DBG] : from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' cmd={"prefix": "config generate-minimal-conf"} : dispatch Oct 14 06:03:54 localhost nova_compute[297686]: 2025-10-14 10:03:54.519 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:03:54 localhost ceph-mon[303906]: Reconfiguring mon.np0005486732 (monmap changed)... Oct 14 06:03:54 localhost ceph-mon[303906]: Reconfiguring daemon mon.np0005486732 on np0005486732.localdomain Oct 14 06:03:54 localhost ceph-mon[303906]: from='mgr.17397 ' entity='mgr.np0005486731.swasqz' Oct 14 06:03:54 localhost ceph-mon[303906]: from='mgr.17397 ' entity='mgr.np0005486731.swasqz' Oct 14 06:03:54 localhost ceph-mon[303906]: from='mgr.17397 ' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005486733.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Oct 14 06:03:54 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005486733.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Oct 14 06:03:54 localhost podman[309123]: Oct 14 06:03:54 localhost podman[309123]: 2025-10-14 10:03:54.896718883 +0000 UTC m=+0.084904963 container create 6a875c4ac506ab113247078c1538068c32a43b95da3c27247114c96a52df694e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=quirky_jemison, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, distribution-scope=public, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, release=553, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, build-date=2025-09-24T08:57:55, version=7, ceph=True, name=rhceph, 
CEPH_POINT_RELEASE=, GIT_BRANCH=main, maintainer=Guillaume Abrioux , io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, RELEASE=main) Oct 14 06:03:54 localhost systemd[1]: Started libpod-conmon-6a875c4ac506ab113247078c1538068c32a43b95da3c27247114c96a52df694e.scope. Oct 14 06:03:54 localhost podman[309123]: 2025-10-14 10:03:54.86454703 +0000 UTC m=+0.052733140 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Oct 14 06:03:54 localhost systemd[1]: Started libcrun container. Oct 14 06:03:54 localhost podman[309123]: 2025-10-14 10:03:54.986097642 +0000 UTC m=+0.174283732 container init 6a875c4ac506ab113247078c1538068c32a43b95da3c27247114c96a52df694e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=quirky_jemison, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.33.12, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, version=7, release=553, vendor=Red Hat, Inc., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, ceph=True, architecture=x86_64, name=rhceph, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_BRANCH=main, maintainer=Guillaume Abrioux , vcs-type=git, build-date=2025-09-24T08:57:55) Oct 14 06:03:54 localhost podman[309123]: 2025-10-14 10:03:54.999699207 
+0000 UTC m=+0.187885297 container start 6a875c4ac506ab113247078c1538068c32a43b95da3c27247114c96a52df694e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=quirky_jemison, architecture=x86_64, CEPH_POINT_RELEASE=, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, GIT_BRANCH=main, io.openshift.expose-services=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.buildah.version=1.33.12, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.license_terms=https://www.redhat.com/agreements, version=7, distribution-scope=public, maintainer=Guillaume Abrioux , GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, RELEASE=main) Oct 14 06:03:55 localhost podman[309123]: 2025-10-14 10:03:55.000156301 +0000 UTC m=+0.188342401 container attach 6a875c4ac506ab113247078c1538068c32a43b95da3c27247114c96a52df694e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=quirky_jemison, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.buildah.version=1.33.12, architecture=x86_64, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=553, vcs-type=git, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, 
CEPH_POINT_RELEASE=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, build-date=2025-09-24T08:57:55, GIT_BRANCH=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, name=rhceph, version=7, ceph=True, maintainer=Guillaume Abrioux , description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, RELEASE=main, vendor=Red Hat, Inc.) Oct 14 06:03:55 localhost systemd[1]: libpod-6a875c4ac506ab113247078c1538068c32a43b95da3c27247114c96a52df694e.scope: Deactivated successfully. Oct 14 06:03:55 localhost quirky_jemison[309138]: 167 167 Oct 14 06:03:55 localhost podman[309123]: 2025-10-14 10:03:55.006877376 +0000 UTC m=+0.195063506 container died 6a875c4ac506ab113247078c1538068c32a43b95da3c27247114c96a52df694e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=quirky_jemison, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, ceph=True, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.tags=rhceph ceph, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, version=7, maintainer=Guillaume Abrioux , summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, description=Red Hat Ceph Storage 7, RELEASE=main, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, build-date=2025-09-24T08:57:55, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.33.12, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., distribution-scope=public, CEPH_POINT_RELEASE=, release=553, com.redhat.component=rhceph-container) Oct 14 06:03:55 localhost podman[309144]: 2025-10-14 
10:03:55.104702842 +0000 UTC m=+0.089778212 container remove 6a875c4ac506ab113247078c1538068c32a43b95da3c27247114c96a52df694e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=quirky_jemison, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.33.12, maintainer=Guillaume Abrioux , vendor=Red Hat, Inc., architecture=x86_64, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, release=553, GIT_BRANCH=main, io.openshift.expose-services=, name=rhceph, build-date=2025-09-24T08:57:55, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, distribution-scope=public, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, RELEASE=main) Oct 14 06:03:55 localhost systemd[1]: libpod-conmon-6a875c4ac506ab113247078c1538068c32a43b95da3c27247114c96a52df694e.scope: Deactivated successfully. 
Oct 14 06:03:55 localhost ceph-mon[303906]: mon.np0005486733@2(peon) e9 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005486733.localdomain.devices.0}] v 0) Oct 14 06:03:55 localhost ceph-mon[303906]: mon.np0005486733@2(peon) e9 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005486733.localdomain}] v 0) Oct 14 06:03:55 localhost ceph-mon[303906]: mon.np0005486733@2(peon) e9 handle_command mon_command({"prefix": "auth get", "entity": "osd.0"} v 0) Oct 14 06:03:55 localhost ceph-mon[303906]: log_channel(audit) log [INF] : from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get", "entity": "osd.0"} : dispatch Oct 14 06:03:55 localhost ceph-mon[303906]: mon.np0005486733@2(peon) e9 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) Oct 14 06:03:55 localhost ceph-mon[303906]: log_channel(audit) log [DBG] : from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' cmd={"prefix": "config generate-minimal-conf"} : dispatch Oct 14 06:03:55 localhost podman[309213]: Oct 14 06:03:55 localhost podman[309213]: 2025-10-14 10:03:55.831464259 +0000 UTC m=+0.074546697 container create b1cc6fcda1c65a25870055dbe3a39276a5004ea1b8492238d1a45e2bf1bd57a6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nostalgic_beaver, build-date=2025-09-24T08:57:55, GIT_CLEAN=True, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, io.buildah.version=1.33.12, CEPH_POINT_RELEASE=, release=553, io.openshift.expose-services=, architecture=x86_64, RELEASE=main, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, name=rhceph, maintainer=Guillaume Abrioux , GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, ceph=True, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git) Oct 14 06:03:55 localhost ceph-mon[303906]: Reconfiguring crash.np0005486733 (monmap changed)... Oct 14 06:03:55 localhost ceph-mon[303906]: Reconfiguring daemon crash.np0005486733 on np0005486733.localdomain Oct 14 06:03:55 localhost ceph-mon[303906]: from='mgr.17397 ' entity='mgr.np0005486731.swasqz' Oct 14 06:03:55 localhost ceph-mon[303906]: from='mgr.17397 ' entity='mgr.np0005486731.swasqz' Oct 14 06:03:55 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get", "entity": "osd.0"} : dispatch Oct 14 06:03:55 localhost systemd[1]: Started libpod-conmon-b1cc6fcda1c65a25870055dbe3a39276a5004ea1b8492238d1a45e2bf1bd57a6.scope. Oct 14 06:03:55 localhost systemd[1]: Started libcrun container. Oct 14 06:03:55 localhost podman[309213]: 2025-10-14 10:03:55.800276266 +0000 UTC m=+0.043358684 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Oct 14 06:03:55 localhost systemd[1]: tmp-crun.WSsGcS.mount: Deactivated successfully. Oct 14 06:03:55 localhost systemd[1]: var-lib-containers-storage-overlay-a4bf7d61b8567db8d747d656469b0e18d5344cba8d65f6beeaf46303312a9142-merged.mount: Deactivated successfully. 
Oct 14 06:03:55 localhost podman[309213]: 2025-10-14 10:03:55.907475499 +0000 UTC m=+0.150557947 container init b1cc6fcda1c65a25870055dbe3a39276a5004ea1b8492238d1a45e2bf1bd57a6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nostalgic_beaver, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, distribution-scope=public, name=rhceph, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-09-24T08:57:55, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, CEPH_POINT_RELEASE=, release=553, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.33.12, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.expose-services=, version=7, maintainer=Guillaume Abrioux , GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, com.redhat.component=rhceph-container) Oct 14 06:03:55 localhost podman[309213]: 2025-10-14 10:03:55.919176926 +0000 UTC m=+0.162259394 container start b1cc6fcda1c65a25870055dbe3a39276a5004ea1b8492238d1a45e2bf1bd57a6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nostalgic_beaver, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, release=553, ceph=True, maintainer=Guillaume Abrioux , url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, name=rhceph, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-09-24T08:57:55, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.33.12, 
org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, architecture=x86_64, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, CEPH_POINT_RELEASE=, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, vendor=Red Hat, Inc., io.openshift.expose-services=, distribution-scope=public) Oct 14 06:03:55 localhost podman[309213]: 2025-10-14 10:03:55.919660431 +0000 UTC m=+0.162742909 container attach b1cc6fcda1c65a25870055dbe3a39276a5004ea1b8492238d1a45e2bf1bd57a6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nostalgic_beaver, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vendor=Red Hat, Inc., build-date=2025-09-24T08:57:55, release=553, io.openshift.expose-services=, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, architecture=x86_64, distribution-scope=public, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, version=7, vcs-type=git, CEPH_POINT_RELEASE=, RELEASE=main, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git) Oct 14 06:03:55 localhost nostalgic_beaver[309228]: 167 167 Oct 14 06:03:55 localhost systemd[1]: libpod-b1cc6fcda1c65a25870055dbe3a39276a5004ea1b8492238d1a45e2bf1bd57a6.scope: 
Deactivated successfully. Oct 14 06:03:55 localhost podman[309213]: 2025-10-14 10:03:55.92388071 +0000 UTC m=+0.166963178 container died b1cc6fcda1c65a25870055dbe3a39276a5004ea1b8492238d1a45e2bf1bd57a6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nostalgic_beaver, ceph=True, build-date=2025-09-24T08:57:55, GIT_CLEAN=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, version=7, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, release=553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux , com.redhat.component=rhceph-container, name=rhceph, GIT_BRANCH=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, CEPH_POINT_RELEASE=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, distribution-scope=public, RELEASE=main, io.buildah.version=1.33.12, vcs-type=git) Oct 14 06:03:56 localhost systemd[1]: var-lib-containers-storage-overlay-4a0fa50379174890c217d09e59002aa2a775df6f5ecf3f33035366468df8317a-merged.mount: Deactivated successfully. 
Oct 14 06:03:56 localhost podman[309233]: 2025-10-14 10:03:56.024344176 +0000 UTC m=+0.071458442 container remove b1cc6fcda1c65a25870055dbe3a39276a5004ea1b8492238d1a45e2bf1bd57a6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nostalgic_beaver, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, GIT_CLEAN=True, GIT_BRANCH=main, CEPH_POINT_RELEASE=, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, RELEASE=main, release=553, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, build-date=2025-09-24T08:57:55, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, distribution-scope=public, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, version=7, maintainer=Guillaume Abrioux , io.openshift.expose-services=, io.buildah.version=1.33.12, architecture=x86_64, vcs-type=git) Oct 14 06:03:56 localhost systemd[1]: libpod-conmon-b1cc6fcda1c65a25870055dbe3a39276a5004ea1b8492238d1a45e2bf1bd57a6.scope: Deactivated successfully. 
Oct 14 06:03:56 localhost ceph-mon[303906]: mon.np0005486733@2(peon) e9 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005486733.localdomain.devices.0}] v 0) Oct 14 06:03:56 localhost ceph-mon[303906]: mon.np0005486733@2(peon) e9 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005486733.localdomain}] v 0) Oct 14 06:03:56 localhost ceph-mon[303906]: mon.np0005486733@2(peon) e9 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005486733.localdomain.devices.0}] v 0) Oct 14 06:03:56 localhost ceph-mon[303906]: mon.np0005486733@2(peon) e9 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005486733.localdomain}] v 0) Oct 14 06:03:56 localhost ceph-mon[303906]: mon.np0005486733@2(peon) e9 handle_command mon_command({"prefix": "auth get", "entity": "osd.3"} v 0) Oct 14 06:03:56 localhost ceph-mon[303906]: log_channel(audit) log [INF] : from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get", "entity": "osd.3"} : dispatch Oct 14 06:03:56 localhost ceph-mon[303906]: mon.np0005486733@2(peon) e9 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) Oct 14 06:03:56 localhost ceph-mon[303906]: log_channel(audit) log [DBG] : from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' cmd={"prefix": "config generate-minimal-conf"} : dispatch Oct 14 06:03:56 localhost ceph-mon[303906]: Reconfiguring osd.0 (monmap changed)... 
Oct 14 06:03:56 localhost ceph-mon[303906]: Reconfiguring daemon osd.0 on np0005486733.localdomain Oct 14 06:03:56 localhost ceph-mon[303906]: from='mgr.17397 ' entity='mgr.np0005486731.swasqz' Oct 14 06:03:56 localhost ceph-mon[303906]: from='mgr.17397 ' entity='mgr.np0005486731.swasqz' Oct 14 06:03:56 localhost ceph-mon[303906]: from='mgr.17397 ' entity='mgr.np0005486731.swasqz' Oct 14 06:03:56 localhost ceph-mon[303906]: from='mgr.17397 ' entity='mgr.np0005486731.swasqz' Oct 14 06:03:56 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get", "entity": "osd.3"} : dispatch Oct 14 06:03:56 localhost podman[309310]: Oct 14 06:03:56 localhost podman[309310]: 2025-10-14 10:03:56.901844855 +0000 UTC m=+0.067410259 container create 995a7263bb35de129d7731d82d7777e41f05d1e902559e379176dc96bdeaabed (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=mystifying_matsumoto, build-date=2025-09-24T08:57:55, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=553, GIT_BRANCH=main, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, architecture=x86_64, ceph=True, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, version=7, maintainer=Guillaume Abrioux , summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, name=rhceph, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, RELEASE=main, distribution-scope=public, io.openshift.tags=rhceph ceph, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_REPO=https://github.com/ceph/ceph-container.git) Oct 14 06:03:56 
localhost systemd[1]: Started libpod-conmon-995a7263bb35de129d7731d82d7777e41f05d1e902559e379176dc96bdeaabed.scope. Oct 14 06:03:56 localhost systemd[1]: Started libcrun container. Oct 14 06:03:56 localhost podman[309310]: 2025-10-14 10:03:56.966941152 +0000 UTC m=+0.132506546 container init 995a7263bb35de129d7731d82d7777e41f05d1e902559e379176dc96bdeaabed (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=mystifying_matsumoto, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, name=rhceph, maintainer=Guillaume Abrioux , vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.expose-services=, ceph=True, io.openshift.tags=rhceph ceph, release=553, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., RELEASE=main, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.33.12, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, build-date=2025-09-24T08:57:55, GIT_BRANCH=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, CEPH_POINT_RELEASE=) Oct 14 06:03:56 localhost podman[309310]: 2025-10-14 10:03:56.871289172 +0000 UTC m=+0.036854596 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Oct 14 06:03:56 localhost podman[309310]: 2025-10-14 10:03:56.979653219 +0000 UTC m=+0.145218613 container start 995a7263bb35de129d7731d82d7777e41f05d1e902559e379176dc96bdeaabed (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=mystifying_matsumoto, com.redhat.component=rhceph-container, ceph=True, name=rhceph, distribution-scope=public, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.tags=rhceph ceph, release=553, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, maintainer=Guillaume Abrioux , build-date=2025-09-24T08:57:55, RELEASE=main, io.buildah.version=1.33.12, GIT_CLEAN=True, vendor=Red Hat, Inc.) Oct 14 06:03:56 localhost podman[309310]: 2025-10-14 10:03:56.979995111 +0000 UTC m=+0.145560515 container attach 995a7263bb35de129d7731d82d7777e41f05d1e902559e379176dc96bdeaabed (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=mystifying_matsumoto, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, distribution-scope=public, ceph=True, build-date=2025-09-24T08:57:55, maintainer=Guillaume Abrioux , com.redhat.component=rhceph-container, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.tags=rhceph ceph, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, release=553, name=rhceph, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, description=Red Hat Ceph Storage 7, 
io.buildah.version=1.33.12, GIT_CLEAN=True, architecture=x86_64, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main) Oct 14 06:03:56 localhost mystifying_matsumoto[309326]: 167 167 Oct 14 06:03:56 localhost systemd[1]: libpod-995a7263bb35de129d7731d82d7777e41f05d1e902559e379176dc96bdeaabed.scope: Deactivated successfully. Oct 14 06:03:56 localhost podman[309310]: 2025-10-14 10:03:56.983002252 +0000 UTC m=+0.148567676 container died 995a7263bb35de129d7731d82d7777e41f05d1e902559e379176dc96bdeaabed (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=mystifying_matsumoto, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, distribution-scope=public, CEPH_POINT_RELEASE=, vcs-type=git, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, architecture=x86_64, name=rhceph, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, maintainer=Guillaume Abrioux , build-date=2025-09-24T08:57:55, io.openshift.expose-services=, release=553, io.buildah.version=1.33.12, ceph=True, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d) Oct 14 06:03:57 localhost nova_compute[297686]: 2025-10-14 10:03:57.083 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:03:57 localhost podman[309331]: 2025-10-14 10:03:57.103248473 +0000 UTC m=+0.111672420 
container remove 995a7263bb35de129d7731d82d7777e41f05d1e902559e379176dc96bdeaabed (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=mystifying_matsumoto, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, name=rhceph, GIT_CLEAN=True, version=7, RELEASE=main, io.buildah.version=1.33.12, ceph=True, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux , vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, release=553, GIT_BRANCH=main, vcs-type=git, build-date=2025-09-24T08:57:55, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=) Oct 14 06:03:57 localhost systemd[1]: libpod-conmon-995a7263bb35de129d7731d82d7777e41f05d1e902559e379176dc96bdeaabed.scope: Deactivated successfully. 
Oct 14 06:03:57 localhost ceph-mon[303906]: mon.np0005486733@2(peon) e9 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005486733.localdomain.devices.0}] v 0) Oct 14 06:03:57 localhost ceph-mon[303906]: mon.np0005486733@2(peon) e9 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005486733.localdomain}] v 0) Oct 14 06:03:57 localhost ceph-mon[303906]: mon.np0005486733@2(peon) e9 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005486733.localdomain.devices.0}] v 0) Oct 14 06:03:57 localhost ceph-mon[303906]: mon.np0005486733@2(peon) e9 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005486733.localdomain}] v 0) Oct 14 06:03:57 localhost ceph-mon[303906]: mon.np0005486733@2(peon) e9 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mds.mds.np0005486733.tvstmf", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} v 0) Oct 14 06:03:57 localhost ceph-mon[303906]: log_channel(audit) log [INF] : from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005486733.tvstmf", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Oct 14 06:03:57 localhost ceph-mon[303906]: mon.np0005486733@2(peon) e9 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) Oct 14 06:03:57 localhost ceph-mon[303906]: log_channel(audit) log [DBG] : from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' cmd={"prefix": "config generate-minimal-conf"} : dispatch Oct 14 06:03:57 localhost ovn_metadata_agent[163050]: 2025-10-14 10:03:57.773 163055 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Oct 14 
06:03:57 localhost ovn_metadata_agent[163050]: 2025-10-14 10:03:57.773 163055 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Oct 14 06:03:57 localhost ovn_metadata_agent[163050]: 2025-10-14 10:03:57.775 163055 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Oct 14 06:03:57 localhost ceph-mon[303906]: Reconfiguring osd.3 (monmap changed)... Oct 14 06:03:57 localhost ceph-mon[303906]: Reconfiguring daemon osd.3 on np0005486733.localdomain Oct 14 06:03:57 localhost ceph-mon[303906]: from='mgr.17397 ' entity='mgr.np0005486731.swasqz' Oct 14 06:03:57 localhost ceph-mon[303906]: from='mgr.17397 ' entity='mgr.np0005486731.swasqz' Oct 14 06:03:57 localhost ceph-mon[303906]: from='mgr.17397 ' entity='mgr.np0005486731.swasqz' Oct 14 06:03:57 localhost ceph-mon[303906]: from='mgr.17397 ' entity='mgr.np0005486731.swasqz' Oct 14 06:03:57 localhost ceph-mon[303906]: from='mgr.17397 ' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005486733.tvstmf", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Oct 14 06:03:57 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005486733.tvstmf", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Oct 14 06:03:57 localhost systemd[1]: var-lib-containers-storage-overlay-aa35d992af9339e831e6fc633994e2d55d20a6a2e473691e827b4e75518c6f19-merged.mount: Deactivated successfully. 
Oct 14 06:03:58 localhost podman[309407]: Oct 14 06:03:58 localhost podman[309407]: 2025-10-14 10:03:58.047494088 +0000 UTC m=+0.091955558 container create c93b195a34fe3d9331875cd0afaf4bcd3da15498a51cdeeca8eb0e344fc56ec1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=gracious_jang, CEPH_POINT_RELEASE=, vcs-type=git, release=553, description=Red Hat Ceph Storage 7, RELEASE=main, io.buildah.version=1.33.12, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, maintainer=Guillaume Abrioux , GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, GIT_CLEAN=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, com.redhat.component=rhceph-container, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., build-date=2025-09-24T08:57:55, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, GIT_BRANCH=main, architecture=x86_64) Oct 14 06:03:58 localhost systemd[1]: Started libpod-conmon-c93b195a34fe3d9331875cd0afaf4bcd3da15498a51cdeeca8eb0e344fc56ec1.scope. Oct 14 06:03:58 localhost systemd[1]: Started libcrun container. 
Oct 14 06:03:58 localhost podman[309407]: 2025-10-14 10:03:58.01086924 +0000 UTC m=+0.055330730 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Oct 14 06:03:58 localhost podman[309407]: 2025-10-14 10:03:58.116609268 +0000 UTC m=+0.161070738 container init c93b195a34fe3d9331875cd0afaf4bcd3da15498a51cdeeca8eb0e344fc56ec1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=gracious_jang, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vendor=Red Hat, Inc., RELEASE=main, com.redhat.license_terms=https://www.redhat.com/agreements, release=553, GIT_BRANCH=main, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux , GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.33.12, build-date=2025-09-24T08:57:55, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, ceph=True, distribution-scope=public, io.openshift.tags=rhceph ceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, name=rhceph, vcs-type=git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64) Oct 14 06:03:58 localhost podman[309407]: 2025-10-14 10:03:58.126649625 +0000 UTC m=+0.171111095 container start c93b195a34fe3d9331875cd0afaf4bcd3da15498a51cdeeca8eb0e344fc56ec1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=gracious_jang, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, release=553, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, 
name=rhceph, io.openshift.tags=rhceph ceph, version=7, io.openshift.expose-services=, RELEASE=main, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, GIT_BRANCH=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.component=rhceph-container, GIT_CLEAN=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux , build-date=2025-09-24T08:57:55) Oct 14 06:03:58 localhost podman[309407]: 2025-10-14 10:03:58.126998445 +0000 UTC m=+0.171459945 container attach c93b195a34fe3d9331875cd0afaf4bcd3da15498a51cdeeca8eb0e344fc56ec1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=gracious_jang, build-date=2025-09-24T08:57:55, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, RELEASE=main, com.redhat.component=rhceph-container, io.buildah.version=1.33.12, vcs-type=git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhceph, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux , vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, io.openshift.tags=rhceph ceph, release=553, architecture=x86_64) Oct 14 06:03:58 
localhost gracious_jang[309422]: 167 167 Oct 14 06:03:58 localhost systemd[1]: libpod-c93b195a34fe3d9331875cd0afaf4bcd3da15498a51cdeeca8eb0e344fc56ec1.scope: Deactivated successfully. Oct 14 06:03:58 localhost podman[309407]: 2025-10-14 10:03:58.130471731 +0000 UTC m=+0.174933191 container died c93b195a34fe3d9331875cd0afaf4bcd3da15498a51cdeeca8eb0e344fc56ec1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=gracious_jang, GIT_BRANCH=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, RELEASE=main, vendor=Red Hat, Inc., ceph=True, name=rhceph, release=553, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=Guillaume Abrioux , io.buildah.version=1.33.12, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-09-24T08:57:55, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, distribution-scope=public, CEPH_POINT_RELEASE=, GIT_CLEAN=True, vcs-type=git, version=7, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container) Oct 14 06:03:58 localhost podman[309427]: 2025-10-14 10:03:58.223223012 +0000 UTC m=+0.083511149 container remove c93b195a34fe3d9331875cd0afaf4bcd3da15498a51cdeeca8eb0e344fc56ec1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=gracious_jang, maintainer=Guillaume Abrioux , vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, build-date=2025-09-24T08:57:55, architecture=x86_64, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph 
Storage 7, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.expose-services=, name=rhceph, release=553, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, RELEASE=main, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., io.buildah.version=1.33.12, GIT_CLEAN=True, distribution-scope=public) Oct 14 06:03:58 localhost systemd[1]: libpod-conmon-c93b195a34fe3d9331875cd0afaf4bcd3da15498a51cdeeca8eb0e344fc56ec1.scope: Deactivated successfully. Oct 14 06:03:58 localhost ceph-mon[303906]: mon.np0005486733@2(peon) e9 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005486733.localdomain.devices.0}] v 0) Oct 14 06:03:58 localhost podman[248187]: time="2025-10-14T10:03:58Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Oct 14 06:03:58 localhost ceph-mon[303906]: mon.np0005486733@2(peon) e9 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005486733.localdomain}] v 0) Oct 14 06:03:58 localhost podman[248187]: @ - - [14/Oct/2025:10:03:58 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 147497 "" "Go-http-client/1.1" Oct 14 06:03:58 localhost ceph-mon[303906]: mon.np0005486733@2(peon) e9 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mgr.np0005486733.primvu", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} v 0) Oct 14 06:03:58 localhost ceph-mon[303906]: log_channel(audit) log [INF] : from='mgr.17397 172.18.0.106:0/541940411' 
entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005486733.primvu", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Oct 14 06:03:58 localhost ceph-mon[303906]: mon.np0005486733@2(peon) e9 handle_command mon_command({"prefix": "mgr services"} v 0) Oct 14 06:03:58 localhost ceph-mon[303906]: log_channel(audit) log [DBG] : from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' cmd={"prefix": "mgr services"} : dispatch Oct 14 06:03:58 localhost ceph-mon[303906]: mon.np0005486733@2(peon) e9 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) Oct 14 06:03:58 localhost ceph-mon[303906]: log_channel(audit) log [DBG] : from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' cmd={"prefix": "config generate-minimal-conf"} : dispatch Oct 14 06:03:58 localhost podman[248187]: @ - - [14/Oct/2025:10:03:58 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19841 "" "Go-http-client/1.1" Oct 14 06:03:58 localhost ceph-mon[303906]: mon.np0005486733@2(peon).osd e81 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Oct 14 06:03:58 localhost ceph-mon[303906]: Reconfiguring mds.mds.np0005486733.tvstmf (monmap changed)... 
Oct 14 06:03:58 localhost ceph-mon[303906]: Reconfiguring daemon mds.mds.np0005486733.tvstmf on np0005486733.localdomain Oct 14 06:03:58 localhost ceph-mon[303906]: from='mgr.17397 ' entity='mgr.np0005486731.swasqz' Oct 14 06:03:58 localhost ceph-mon[303906]: from='mgr.17397 ' entity='mgr.np0005486731.swasqz' Oct 14 06:03:58 localhost ceph-mon[303906]: from='mgr.17397 ' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005486733.primvu", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Oct 14 06:03:58 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005486733.primvu", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Oct 14 06:03:58 localhost systemd[1]: var-lib-containers-storage-overlay-f7c350b932951bbc1bd35658e590ca63ad3b2178902382aacf5e31bb2fcee6f2-merged.mount: Deactivated successfully. 
Oct 14 06:03:58 localhost podman[309498]: Oct 14 06:03:58 localhost podman[309498]: 2025-10-14 10:03:58.958615913 +0000 UTC m=+0.076954221 container create 54780c119b0d5ff8685def9468f2fa83824186ef50e28e58ccbefa59b63367a7 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=keen_hofstadter, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, distribution-scope=public, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, io.buildah.version=1.33.12, io.openshift.expose-services=, maintainer=Guillaume Abrioux , release=553, vcs-type=git, version=7, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-09-24T08:57:55, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, architecture=x86_64, ceph=True) Oct 14 06:03:58 localhost systemd[1]: Started libpod-conmon-54780c119b0d5ff8685def9468f2fa83824186ef50e28e58ccbefa59b63367a7.scope. Oct 14 06:03:59 localhost systemd[1]: Started libcrun container. 
Oct 14 06:03:59 localhost podman[309498]: 2025-10-14 10:03:59.026739362 +0000 UTC m=+0.145077670 container init 54780c119b0d5ff8685def9468f2fa83824186ef50e28e58ccbefa59b63367a7 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=keen_hofstadter, release=553, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, RELEASE=main, version=7, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux , org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_CLEAN=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, GIT_BRANCH=main, architecture=x86_64, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, io.buildah.version=1.33.12, io.openshift.tags=rhceph ceph, ceph=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9) Oct 14 06:03:59 localhost podman[309498]: 2025-10-14 10:03:58.927629367 +0000 UTC m=+0.045967705 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Oct 14 06:03:59 localhost podman[309498]: 2025-10-14 10:03:59.035698265 +0000 UTC m=+0.154036583 container start 54780c119b0d5ff8685def9468f2fa83824186ef50e28e58ccbefa59b63367a7 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=keen_hofstadter, release=553, CEPH_POINT_RELEASE=, version=7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.33.12, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, 
name=rhceph, architecture=x86_64, RELEASE=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-type=git, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, com.redhat.component=rhceph-container, build-date=2025-09-24T08:57:55, io.openshift.tags=rhceph ceph, ceph=True, distribution-scope=public, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, GIT_BRANCH=main, maintainer=Guillaume Abrioux ) Oct 14 06:03:59 localhost podman[309498]: 2025-10-14 10:03:59.036178801 +0000 UTC m=+0.154517189 container attach 54780c119b0d5ff8685def9468f2fa83824186ef50e28e58ccbefa59b63367a7 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=keen_hofstadter, GIT_CLEAN=True, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/agreements, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, maintainer=Guillaume Abrioux , RELEASE=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., name=rhceph, vcs-type=git, version=7, ceph=True, distribution-scope=public, com.redhat.component=rhceph-container, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=553, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, build-date=2025-09-24T08:57:55, architecture=x86_64) Oct 14 06:03:59 
localhost keen_hofstadter[309513]: 167 167 Oct 14 06:03:59 localhost systemd[1]: libpod-54780c119b0d5ff8685def9468f2fa83824186ef50e28e58ccbefa59b63367a7.scope: Deactivated successfully. Oct 14 06:03:59 localhost podman[309498]: 2025-10-14 10:03:59.040582415 +0000 UTC m=+0.158920723 container died 54780c119b0d5ff8685def9468f2fa83824186ef50e28e58ccbefa59b63367a7 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=keen_hofstadter, maintainer=Guillaume Abrioux , vendor=Red Hat, Inc., ceph=True, architecture=x86_64, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, vcs-type=git, io.openshift.expose-services=, version=7, name=rhceph, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.buildah.version=1.33.12, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, release=553, distribution-scope=public, build-date=2025-09-24T08:57:55, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0) Oct 14 06:03:59 localhost podman[309518]: 2025-10-14 10:03:59.149201721 +0000 UTC m=+0.100255092 container remove 54780c119b0d5ff8685def9468f2fa83824186ef50e28e58ccbefa59b63367a7 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=keen_hofstadter, release=553, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, 
org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vendor=Red Hat, Inc., io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux , description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, name=rhceph, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, build-date=2025-09-24T08:57:55, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, distribution-scope=public, architecture=x86_64, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d) Oct 14 06:03:59 localhost systemd[1]: libpod-conmon-54780c119b0d5ff8685def9468f2fa83824186ef50e28e58ccbefa59b63367a7.scope: Deactivated successfully. Oct 14 06:03:59 localhost ceph-mon[303906]: mon.np0005486733@2(peon) e9 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005486733.localdomain.devices.0}] v 0) Oct 14 06:03:59 localhost ceph-mon[303906]: mon.np0005486733@2(peon) e9 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005486733.localdomain}] v 0) Oct 14 06:03:59 localhost ceph-mon[303906]: mon.np0005486733@2(peon) e9 handle_command mon_command({"prefix": "auth get", "entity": "mon."} v 0) Oct 14 06:03:59 localhost ceph-mon[303906]: log_channel(audit) log [INF] : from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Oct 14 06:03:59 localhost ceph-mon[303906]: mon.np0005486733@2(peon) e9 handle_command mon_command({"prefix": "config get", "who": "mon", "key": "public_network"} v 0) Oct 14 06:03:59 localhost ceph-mon[303906]: log_channel(audit) log [DBG] : from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' cmd={"prefix": "config get", "who": 
"mon", "key": "public_network"} : dispatch Oct 14 06:03:59 localhost ceph-mon[303906]: mon.np0005486733@2(peon) e9 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) Oct 14 06:03:59 localhost ceph-mon[303906]: log_channel(audit) log [DBG] : from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' cmd={"prefix": "config generate-minimal-conf"} : dispatch Oct 14 06:03:59 localhost nova_compute[297686]: 2025-10-14 10:03:59.551 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:03:59 localhost podman[309588]: Oct 14 06:03:59 localhost podman[309588]: 2025-10-14 10:03:59.88848971 +0000 UTC m=+0.079712115 container create d0b13591f304a92b9da07a250255e32f91ce44c1e8c5d05569de97cf9c1f2b76 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=quizzical_banzai, version=7, com.redhat.component=rhceph-container, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, release=553, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux , url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_BRANCH=main, RELEASE=main, ceph=True, vendor=Red Hat, Inc., build-date=2025-09-24T08:57:55, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, distribution-scope=public, io.buildah.version=1.33.12, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9) Oct 14 06:03:59 localhost ceph-mon[303906]: Reconfiguring 
mgr.np0005486733.primvu (monmap changed)... Oct 14 06:03:59 localhost ceph-mon[303906]: Reconfiguring daemon mgr.np0005486733.primvu on np0005486733.localdomain Oct 14 06:03:59 localhost ceph-mon[303906]: from='mgr.17397 ' entity='mgr.np0005486731.swasqz' Oct 14 06:03:59 localhost ceph-mon[303906]: from='mgr.17397 ' entity='mgr.np0005486731.swasqz' Oct 14 06:03:59 localhost ceph-mon[303906]: Reconfiguring mon.np0005486733 (monmap changed)... Oct 14 06:03:59 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Oct 14 06:03:59 localhost ceph-mon[303906]: Reconfiguring daemon mon.np0005486733 on np0005486733.localdomain Oct 14 06:03:59 localhost systemd[1]: var-lib-containers-storage-overlay-2c8c284dbe5c08fe6f0e3ac90a2d9df747300e283109b8571c53c742bbcb240b-merged.mount: Deactivated successfully. Oct 14 06:03:59 localhost systemd[1]: Started libpod-conmon-d0b13591f304a92b9da07a250255e32f91ce44c1e8c5d05569de97cf9c1f2b76.scope. Oct 14 06:03:59 localhost systemd[1]: Started libcrun container. 
Oct 14 06:03:59 localhost podman[309588]: 2025-10-14 10:03:59.857851624 +0000 UTC m=+0.049074069 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Oct 14 06:03:59 localhost podman[309588]: 2025-10-14 10:03:59.959721884 +0000 UTC m=+0.150944299 container init d0b13591f304a92b9da07a250255e32f91ce44c1e8c5d05569de97cf9c1f2b76 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=quizzical_banzai, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.expose-services=, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, version=7, distribution-scope=public, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, ceph=True, GIT_CLEAN=True, vendor=Red Hat, Inc., name=rhceph, com.redhat.component=rhceph-container, build-date=2025-09-24T08:57:55, release=553, io.buildah.version=1.33.12, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, RELEASE=main, maintainer=Guillaume Abrioux ) Oct 14 06:03:59 localhost systemd[1]: tmp-crun.DeNlgC.mount: Deactivated successfully. Oct 14 06:03:59 localhost quizzical_banzai[309603]: 167 167 Oct 14 06:03:59 localhost systemd[1]: libpod-d0b13591f304a92b9da07a250255e32f91ce44c1e8c5d05569de97cf9c1f2b76.scope: Deactivated successfully. 
Oct 14 06:03:59 localhost podman[309588]: 2025-10-14 10:03:59.971797993 +0000 UTC m=+0.163020408 container start d0b13591f304a92b9da07a250255e32f91ce44c1e8c5d05569de97cf9c1f2b76 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=quizzical_banzai, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.component=rhceph-container, io.buildah.version=1.33.12, io.openshift.expose-services=, release=553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, ceph=True, RELEASE=main, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, build-date=2025-09-24T08:57:55, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, name=rhceph, maintainer=Guillaume Abrioux ) Oct 14 06:03:59 localhost podman[309588]: 2025-10-14 10:03:59.973977149 +0000 UTC m=+0.165199574 container attach d0b13591f304a92b9da07a250255e32f91ce44c1e8c5d05569de97cf9c1f2b76 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=quizzical_banzai, io.buildah.version=1.33.12, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, version=7, io.openshift.expose-services=, ceph=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, release=553, 
io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, GIT_BRANCH=main, build-date=2025-09-24T08:57:55, name=rhceph, maintainer=Guillaume Abrioux , architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, vendor=Red Hat, Inc., GIT_CLEAN=True, RELEASE=main, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-type=git) Oct 14 06:03:59 localhost podman[309588]: 2025-10-14 10:03:59.97661487 +0000 UTC m=+0.167837275 container died d0b13591f304a92b9da07a250255e32f91ce44c1e8c5d05569de97cf9c1f2b76 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=quizzical_banzai, distribution-scope=public, name=rhceph, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, io.openshift.tags=rhceph ceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.expose-services=, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, maintainer=Guillaume Abrioux , release=553, GIT_BRANCH=main, build-date=2025-09-24T08:57:55, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, RELEASE=main, io.buildah.version=1.33.12, GIT_CLEAN=True, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9) Oct 14 06:04:00 localhost podman[309608]: 2025-10-14 10:04:00.085484423 +0000 UTC m=+0.097182728 container remove 
d0b13591f304a92b9da07a250255e32f91ce44c1e8c5d05569de97cf9c1f2b76 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=quizzical_banzai, release=553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vcs-type=git, version=7, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-09-24T08:57:55, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.33.12, io.openshift.tags=rhceph ceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, RELEASE=main, com.redhat.component=rhceph-container, GIT_CLEAN=True, vendor=Red Hat, Inc., distribution-scope=public, maintainer=Guillaume Abrioux , name=rhceph, io.openshift.expose-services=, CEPH_POINT_RELEASE=, architecture=x86_64) Oct 14 06:04:00 localhost systemd[1]: libpod-conmon-d0b13591f304a92b9da07a250255e32f91ce44c1e8c5d05569de97cf9c1f2b76.scope: Deactivated successfully. 
Oct 14 06:04:00 localhost ceph-mon[303906]: mon.np0005486733@2(peon) e9 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005486733.localdomain.devices.0}] v 0) Oct 14 06:04:00 localhost ceph-mon[303906]: mon.np0005486733@2(peon) e9 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005486733.localdomain}] v 0) Oct 14 06:04:00 localhost ceph-mon[303906]: mon.np0005486733@2(peon) e9 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) Oct 14 06:04:00 localhost ceph-mon[303906]: log_channel(audit) log [DBG] : from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' cmd={"prefix": "config generate-minimal-conf"} : dispatch Oct 14 06:04:00 localhost ceph-mon[303906]: mon.np0005486733@2(peon) e9 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0) Oct 14 06:04:00 localhost ceph-mon[303906]: log_channel(audit) log [INF] : from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Oct 14 06:04:00 localhost ceph-mon[303906]: mon.np0005486733@2(peon) e9 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0) Oct 14 06:04:00 localhost ceph-mon[303906]: mon.np0005486733@2(peon) e9 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/spec.mon}] v 0) Oct 14 06:04:00 localhost ceph-mon[303906]: mon.np0005486733@2(peon) e9 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0) Oct 14 06:04:00 localhost ceph-mon[303906]: log_channel(audit) log [DBG] : from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch Oct 14 06:04:00 localhost ceph-mon[303906]: mon.np0005486733@2(peon) e9 handle_command mon_command({"prefix": "mgr stat", "format": "json"} v 0) Oct 14 06:04:00 localhost ceph-mon[303906]: 
log_channel(audit) log [DBG] : from='client.? 172.18.0.200:0/1392979788' entity='client.admin' cmd={"prefix": "mgr stat", "format": "json"} : dispatch Oct 14 06:04:00 localhost ceph-mon[303906]: mon.np0005486733@2(peon) e9 handle_command mon_command({"prefix": "auth get", "entity": "mon."} v 0) Oct 14 06:04:00 localhost ceph-mon[303906]: log_channel(audit) log [INF] : from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Oct 14 06:04:00 localhost ceph-mon[303906]: mon.np0005486733@2(peon) e9 handle_command mon_command({"prefix": "config get", "who": "mon", "key": "public_network"} v 0) Oct 14 06:04:00 localhost ceph-mon[303906]: log_channel(audit) log [DBG] : from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch Oct 14 06:04:00 localhost ceph-mon[303906]: mon.np0005486733@2(peon) e9 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) Oct 14 06:04:00 localhost ceph-mon[303906]: log_channel(audit) log [DBG] : from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' cmd={"prefix": "config generate-minimal-conf"} : dispatch Oct 14 06:04:00 localhost systemd[1]: var-lib-containers-storage-overlay-4fec9264c6560e2babf7aa4f4377d2ddc8a4dd5b5f5e0667eacea46dc5087b4b-merged.mount: Deactivated successfully. 
Oct 14 06:04:01 localhost ceph-mon[303906]: from='mgr.17397 ' entity='mgr.np0005486731.swasqz' Oct 14 06:04:01 localhost ceph-mon[303906]: from='mgr.17397 ' entity='mgr.np0005486731.swasqz' Oct 14 06:04:01 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Oct 14 06:04:01 localhost ceph-mon[303906]: from='mgr.17397 ' entity='mgr.np0005486731.swasqz' Oct 14 06:04:01 localhost ceph-mon[303906]: from='mgr.17397 ' entity='mgr.np0005486731.swasqz' Oct 14 06:04:01 localhost ceph-mon[303906]: Reconfiguring mon.np0005486729 (monmap changed)... Oct 14 06:04:01 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Oct 14 06:04:01 localhost ceph-mon[303906]: Reconfiguring daemon mon.np0005486729 on np0005486729.localdomain Oct 14 06:04:01 localhost ceph-mon[303906]: mon.np0005486733@2(peon) e9 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005486729.localdomain.devices.0}] v 0) Oct 14 06:04:01 localhost ceph-mon[303906]: mon.np0005486733@2(peon) e9 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005486729.localdomain}] v 0) Oct 14 06:04:01 localhost ceph-mon[303906]: mon.np0005486733@2(peon) e9 handle_command mon_command({"prefix": "auth get", "entity": "mon."} v 0) Oct 14 06:04:01 localhost ceph-mon[303906]: log_channel(audit) log [INF] : from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Oct 14 06:04:01 localhost ceph-mon[303906]: mon.np0005486733@2(peon) e9 handle_command mon_command({"prefix": "config get", "who": "mon", "key": "public_network"} v 0) Oct 14 06:04:01 localhost ceph-mon[303906]: log_channel(audit) log [DBG] : from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' cmd={"prefix": "config 
get", "who": "mon", "key": "public_network"} : dispatch Oct 14 06:04:01 localhost ceph-mon[303906]: mon.np0005486733@2(peon) e9 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) Oct 14 06:04:01 localhost ceph-mon[303906]: log_channel(audit) log [DBG] : from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' cmd={"prefix": "config generate-minimal-conf"} : dispatch Oct 14 06:04:02 localhost nova_compute[297686]: 2025-10-14 10:04:02.142 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:04:02 localhost ceph-mon[303906]: mon.np0005486733@2(peon) e9 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005486730.localdomain.devices.0}] v 0) Oct 14 06:04:02 localhost ceph-mon[303906]: mon.np0005486733@2(peon) e9 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005486730.localdomain}] v 0) Oct 14 06:04:02 localhost ceph-mon[303906]: mon.np0005486733@2(peon) e9 handle_command mon_command({"prefix": "auth get", "entity": "mon."} v 0) Oct 14 06:04:02 localhost ceph-mon[303906]: log_channel(audit) log [INF] : from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Oct 14 06:04:02 localhost ceph-mon[303906]: mon.np0005486733@2(peon) e9 handle_command mon_command({"prefix": "config get", "who": "mon", "key": "public_network"} v 0) Oct 14 06:04:02 localhost ceph-mon[303906]: log_channel(audit) log [DBG] : from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch Oct 14 06:04:02 localhost ceph-mon[303906]: mon.np0005486733@2(peon) e9 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) Oct 14 06:04:02 localhost ceph-mon[303906]: log_channel(audit) log [DBG] : from='mgr.17397 172.18.0.106:0/541940411' 
entity='mgr.np0005486731.swasqz' cmd={"prefix": "config generate-minimal-conf"} : dispatch Oct 14 06:04:02 localhost ceph-mon[303906]: from='mgr.17397 ' entity='mgr.np0005486731.swasqz' Oct 14 06:04:02 localhost ceph-mon[303906]: from='mgr.17397 ' entity='mgr.np0005486731.swasqz' Oct 14 06:04:02 localhost ceph-mon[303906]: Reconfiguring mon.np0005486730 (monmap changed)... Oct 14 06:04:02 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Oct 14 06:04:02 localhost ceph-mon[303906]: Reconfiguring daemon mon.np0005486730 on np0005486730.localdomain Oct 14 06:04:02 localhost ceph-mon[303906]: from='mgr.17397 ' entity='mgr.np0005486731.swasqz' Oct 14 06:04:02 localhost ceph-mon[303906]: from='mgr.17397 ' entity='mgr.np0005486731.swasqz' Oct 14 06:04:02 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Oct 14 06:04:03 localhost ceph-mon[303906]: mon.np0005486733@2(peon) e9 handle_command mon_command({"prefix": "quorum_status"} v 0) Oct 14 06:04:03 localhost ceph-mon[303906]: log_channel(audit) log [DBG] : from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' cmd={"prefix": "quorum_status"} : dispatch Oct 14 06:04:03 localhost ceph-mon[303906]: mon.np0005486733@2(peon) e9 handle_command mon_command({"prefix": "mon rm", "name": "np0005486729"} v 0) Oct 14 06:04:03 localhost ceph-mon[303906]: log_channel(audit) log [INF] : from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' cmd={"prefix": "mon rm", "name": "np0005486729"} : dispatch Oct 14 06:04:03 localhost ceph-mgr[302471]: ms_deliver_dispatch: unhandled message 0x55da44b97080 mon_map magic: 0 from mon.1 v2:172.18.0.104:3300/0 Oct 14 06:04:03 localhost ceph-mon[303906]: mon.np0005486733@2(peon) e10 my rank is now 1 (was 2) Oct 14 06:04:03 localhost 
ceph-mon[303906]: log_channel(cluster) log [INF] : mon.np0005486733 calling monitor election Oct 14 06:04:03 localhost ceph-mon[303906]: paxos.1).electionLogic(40) init, last seen epoch 40 Oct 14 06:04:03 localhost ceph-mon[303906]: mon.np0005486733@1(electing) e10 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Oct 14 06:04:03 localhost ceph-mgr[302471]: --2- 172.18.0.108:0/2728758967 >> [v2:172.18.0.107:3300/0,v1:172.18.0.107:6789/0] conn(0x55da4e463400 0x55da4e439600 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request get_initial_auth_request returned -2 Oct 14 06:04:03 localhost ceph-mgr[302471]: client.0 ms_handle_reset on v2:172.18.0.107:3300/0 Oct 14 06:04:03 localhost ceph-mgr[302471]: client.0 ms_handle_reset on v2:172.18.0.107:3300/0 Oct 14 06:04:03 localhost ceph-mgr[302471]: ms_deliver_dispatch: unhandled message 0x55da4e5d8000 mon_map magic: 0 from mon.1 v2:172.18.0.108:3300/0 Oct 14 06:04:04 localhost nova_compute[297686]: 2025-10-14 10:04:04.582 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:04:05 localhost ceph-mon[303906]: mon.np0005486733@1(electing) e10 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Oct 14 06:04:05 localhost ceph-mon[303906]: mon.np0005486733@1(peon) e10 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Oct 14 06:04:05 localhost ceph-mon[303906]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #22. Immutable memtables: 0. 
Oct 14 06:04:05 localhost ceph-mon[303906]: rocksdb: (Original Log Time 2025/10/14-10:04:05.420963) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0 Oct 14 06:04:05 localhost ceph-mon[303906]: rocksdb: [db/flush_job.cc:856] [default] [JOB 9] Flushing memtable with next log file: 22 Oct 14 06:04:05 localhost ceph-mon[303906]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760436245420998, "job": 9, "event": "flush_started", "num_memtables": 1, "num_entries": 820, "num_deletes": 251, "total_data_size": 1164679, "memory_usage": 1181152, "flush_reason": "Manual Compaction"} Oct 14 06:04:05 localhost ceph-mon[303906]: rocksdb: [db/flush_job.cc:885] [default] [JOB 9] Level-0 flush table #23: started Oct 14 06:04:05 localhost ceph-mon[303906]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760436245427131, "cf_name": "default", "job": 9, "event": "table_file_creation", "file_number": 23, "file_size": 674093, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 15355, "largest_seqno": 16170, "table_properties": {"data_size": 669845, "index_size": 1911, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1285, "raw_key_size": 11132, "raw_average_key_size": 22, "raw_value_size": 660837, "raw_average_value_size": 1306, "num_data_blocks": 77, "num_entries": 506, "num_filter_entries": 506, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; 
max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1760436230, "oldest_key_time": 1760436230, "file_creation_time": 1760436245, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "84a9ba57-7643-4e19-a68b-d3c5f7942ded", "db_session_id": "ABXPLKSCGGN31DHJT8DW", "orig_file_number": 23, "seqno_to_time_mapping": "N/A"}} Oct 14 06:04:05 localhost ceph-mon[303906]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 9] Flush lasted 6205 microseconds, and 1630 cpu microseconds. Oct 14 06:04:05 localhost ceph-mon[303906]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. Oct 14 06:04:05 localhost ceph-mon[303906]: rocksdb: (Original Log Time 2025/10/14-10:04:05.427169) [db/flush_job.cc:967] [default] [JOB 9] Level-0 flush table #23: 674093 bytes OK Oct 14 06:04:05 localhost ceph-mon[303906]: rocksdb: (Original Log Time 2025/10/14-10:04:05.427189) [db/memtable_list.cc:519] [default] Level-0 commit table #23 started Oct 14 06:04:05 localhost ceph-mon[303906]: rocksdb: (Original Log Time 2025/10/14-10:04:05.428709) [db/memtable_list.cc:722] [default] Level-0 commit table #23: memtable #1 done Oct 14 06:04:05 localhost ceph-mon[303906]: rocksdb: (Original Log Time 2025/10/14-10:04:05.428724) EVENT_LOG_v1 {"time_micros": 1760436245428719, "job": 9, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0} Oct 14 06:04:05 localhost ceph-mon[303906]: rocksdb: (Original Log Time 2025/10/14-10:04:05.428739) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25 Oct 14 06:04:05 localhost ceph-mon[303906]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 9] Try to delete WAL files size 1160128, prev total WAL file size 
1160128, number of live WAL files 2. Oct 14 06:04:05 localhost ceph-mon[303906]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005486733/store.db/000019.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Oct 14 06:04:05 localhost ceph-mon[303906]: rocksdb: (Original Log Time 2025/10/14-10:04:05.429444) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003130373933' seq:72057594037927935, type:22 .. '7061786F73003131303435' seq:0, type:0; will stop at (end) Oct 14 06:04:05 localhost ceph-mon[303906]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 10] Compacting 1@0 + 1@6 files to L6, score -1.00 Oct 14 06:04:05 localhost ceph-mon[303906]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 9 Base level 0, inputs: [23(658KB)], [21(16MB)] Oct 14 06:04:05 localhost ceph-mon[303906]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760436245429481, "job": 10, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [23], "files_L6": [21], "score": -1, "input_data_size": 17764512, "oldest_snapshot_seqno": -1} Oct 14 06:04:05 localhost ceph-mon[303906]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 10] Generated table #24: 10357 keys, 15824802 bytes, temperature: kUnknown Oct 14 06:04:05 localhost ceph-mon[303906]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760436245487798, "cf_name": "default", "job": 10, "event": "table_file_creation", "file_number": 24, "file_size": 15824802, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 15762870, "index_size": 34951, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 25925, "raw_key_size": 277621, "raw_average_key_size": 26, "raw_value_size": 15582874, 
"raw_average_value_size": 1504, "num_data_blocks": 1330, "num_entries": 10357, "num_filter_entries": 10357, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1760436113, "oldest_key_time": 0, "file_creation_time": 1760436245, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "84a9ba57-7643-4e19-a68b-d3c5f7942ded", "db_session_id": "ABXPLKSCGGN31DHJT8DW", "orig_file_number": 24, "seqno_to_time_mapping": "N/A"}} Oct 14 06:04:05 localhost ceph-mon[303906]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. 
Oct 14 06:04:05 localhost ceph-mon[303906]: rocksdb: (Original Log Time 2025/10/14-10:04:05.488033) [db/compaction/compaction_job.cc:1663] [default] [JOB 10] Compacted 1@0 + 1@6 files to L6 => 15824802 bytes Oct 14 06:04:05 localhost ceph-mon[303906]: rocksdb: (Original Log Time 2025/10/14-10:04:05.489337) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 304.1 rd, 270.9 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.6, 16.3 +0.0 blob) out(15.1 +0.0 blob), read-write-amplify(49.8) write-amplify(23.5) OK, records in: 10890, records dropped: 533 output_compression: NoCompression Oct 14 06:04:05 localhost ceph-mon[303906]: rocksdb: (Original Log Time 2025/10/14-10:04:05.489354) EVENT_LOG_v1 {"time_micros": 1760436245489346, "job": 10, "event": "compaction_finished", "compaction_time_micros": 58413, "compaction_time_cpu_micros": 28463, "output_level": 6, "num_output_files": 1, "total_output_size": 15824802, "num_input_records": 10890, "num_output_records": 10357, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]} Oct 14 06:04:05 localhost ceph-mon[303906]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005486733/store.db/000023.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Oct 14 06:04:05 localhost ceph-mon[303906]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760436245489491, "job": 10, "event": "table_file_deletion", "file_number": 23} Oct 14 06:04:05 localhost ceph-mon[303906]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005486733/store.db/000021.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Oct 14 06:04:05 localhost ceph-mon[303906]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760436245490875, 
"job": 10, "event": "table_file_deletion", "file_number": 21} Oct 14 06:04:05 localhost ceph-mon[303906]: rocksdb: (Original Log Time 2025/10/14-10:04:05.429379) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Oct 14 06:04:05 localhost ceph-mon[303906]: rocksdb: (Original Log Time 2025/10/14-10:04:05.490919) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Oct 14 06:04:05 localhost ceph-mon[303906]: rocksdb: (Original Log Time 2025/10/14-10:04:05.490925) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Oct 14 06:04:05 localhost ceph-mon[303906]: rocksdb: (Original Log Time 2025/10/14-10:04:05.490928) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Oct 14 06:04:05 localhost ceph-mon[303906]: rocksdb: (Original Log Time 2025/10/14-10:04:05.490931) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Oct 14 06:04:05 localhost ceph-mon[303906]: rocksdb: (Original Log Time 2025/10/14-10:04:05.490943) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Oct 14 06:04:05 localhost ceph-mon[303906]: Reconfiguring mon.np0005486731 (monmap changed)... Oct 14 06:04:05 localhost ceph-mon[303906]: Reconfiguring daemon mon.np0005486731 on np0005486731.localdomain Oct 14 06:04:05 localhost ceph-mon[303906]: Remove daemons mon.np0005486729 Oct 14 06:04:05 localhost ceph-mon[303906]: Safe to remove mon.np0005486729: new quorum should be ['np0005486730', 'np0005486733', 'np0005486732', 'np0005486731'] (from ['np0005486730', 'np0005486733', 'np0005486732', 'np0005486731']) Oct 14 06:04:05 localhost ceph-mon[303906]: Removing monitor np0005486729 from monmap... 
Oct 14 06:04:05 localhost ceph-mon[303906]: Removing daemon mon.np0005486729 from np0005486729.localdomain -- ports [] Oct 14 06:04:05 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' cmd={"prefix": "mon rm", "name": "np0005486729"} : dispatch Oct 14 06:04:05 localhost ceph-mon[303906]: mon.np0005486733 calling monitor election Oct 14 06:04:05 localhost ceph-mon[303906]: mon.np0005486732 calling monitor election Oct 14 06:04:05 localhost ceph-mon[303906]: mon.np0005486731 calling monitor election Oct 14 06:04:05 localhost ceph-mon[303906]: mon.np0005486730 calling monitor election Oct 14 06:04:05 localhost ceph-mon[303906]: mon.np0005486730 is new leader, mons np0005486730,np0005486733,np0005486732,np0005486731 in quorum (ranks 0,1,2,3) Oct 14 06:04:05 localhost ceph-mon[303906]: Health detail: HEALTH_WARN 1 stray daemon(s) not managed by cephadm; 1 stray host(s) with 1 daemon(s) not managed by cephadm Oct 14 06:04:05 localhost ceph-mon[303906]: [WRN] CEPHADM_STRAY_DAEMON: 1 stray daemon(s) not managed by cephadm Oct 14 06:04:05 localhost ceph-mon[303906]: stray daemon mgr.np0005486728.giajub on host np0005486728.localdomain not managed by cephadm Oct 14 06:04:05 localhost ceph-mon[303906]: [WRN] CEPHADM_STRAY_HOST: 1 stray host(s) with 1 daemon(s) not managed by cephadm Oct 14 06:04:05 localhost ceph-mon[303906]: stray host np0005486728.localdomain has 1 stray daemons: ['mgr.np0005486728.giajub'] Oct 14 06:04:05 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' Oct 14 06:04:05 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' Oct 14 06:04:06 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' Oct 14 06:04:06 localhost ceph-mon[303906]: Reconfiguring crash.np0005486732 (monmap changed)... 
Oct 14 06:04:06 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005486732.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Oct 14 06:04:06 localhost ceph-mon[303906]: Reconfiguring daemon crash.np0005486732 on np0005486732.localdomain Oct 14 06:04:06 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' Oct 14 06:04:06 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' Oct 14 06:04:06 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch Oct 14 06:04:07 localhost nova_compute[297686]: 2025-10-14 10:04:07.179 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:04:07 localhost ceph-mon[303906]: Reconfiguring osd.1 (monmap changed)... Oct 14 06:04:07 localhost ceph-mon[303906]: Reconfiguring daemon osd.1 on np0005486732.localdomain Oct 14 06:04:07 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' Oct 14 06:04:07 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' Oct 14 06:04:07 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch Oct 14 06:04:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041. Oct 14 06:04:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857. 
Oct 14 06:04:07 localhost podman[309644]: 2025-10-14 10:04:07.750554821 +0000 UTC m=+0.084778819 container health_status 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm) Oct 14 06:04:07 localhost podman[309644]: 2025-10-14 10:04:07.764969181 +0000 UTC m=+0.099193179 container exec_died 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': 
['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Oct 14 06:04:07 localhost systemd[1]: 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041.service: Deactivated successfully. Oct 14 06:04:07 localhost systemd[1]: tmp-crun.hONHa7.mount: Deactivated successfully. Oct 14 06:04:07 localhost podman[309645]: 2025-10-14 10:04:07.828006806 +0000 UTC m=+0.161699448 container health_status 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, 
managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, container_name=ovn_metadata_agent, org.label-schema.build-date=20251009, config_id=ovn_metadata_agent, io.buildah.version=1.41.3) Oct 14 06:04:07 localhost podman[309645]: 2025-10-14 10:04:07.86318958 +0000 UTC m=+0.196882232 container exec_died 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, managed_by=edpm_ansible, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent) Oct 14 06:04:07 localhost systemd[1]: 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857.service: Deactivated successfully. Oct 14 06:04:08 localhost ceph-mon[303906]: mon.np0005486733@1(peon).osd e81 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Oct 14 06:04:08 localhost ceph-mon[303906]: Reconfiguring osd.5 (monmap changed)... Oct 14 06:04:08 localhost ceph-mon[303906]: Reconfiguring daemon osd.5 on np0005486732.localdomain Oct 14 06:04:08 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' Oct 14 06:04:08 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' Oct 14 06:04:08 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' Oct 14 06:04:08 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005486732.xkownj", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Oct 14 06:04:08 localhost openstack_network_exporter[250374]: ERROR 10:04:08 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Oct 14 06:04:08 localhost openstack_network_exporter[250374]: ERROR 10:04:08 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Oct 14 06:04:08 localhost openstack_network_exporter[250374]: ERROR 10:04:08 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Oct 14 06:04:08 localhost openstack_network_exporter[250374]: ERROR 10:04:08 appctl.go:174: 
call(dpif-netdev/pmd-perf-show): please specify an existing datapath Oct 14 06:04:08 localhost openstack_network_exporter[250374]: Oct 14 06:04:08 localhost openstack_network_exporter[250374]: ERROR 10:04:08 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Oct 14 06:04:08 localhost openstack_network_exporter[250374]: Oct 14 06:04:09 localhost ceph-mon[303906]: Removed label mon from host np0005486729.localdomain Oct 14 06:04:09 localhost ceph-mon[303906]: Reconfiguring mds.mds.np0005486732.xkownj (monmap changed)... Oct 14 06:04:09 localhost ceph-mon[303906]: Reconfiguring daemon mds.mds.np0005486732.xkownj on np0005486732.localdomain Oct 14 06:04:09 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' Oct 14 06:04:09 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' Oct 14 06:04:09 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005486732.pasqzz", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Oct 14 06:04:09 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' Oct 14 06:04:09 localhost nova_compute[297686]: 2025-10-14 10:04:09.626 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:04:10 localhost ceph-mon[303906]: Reconfiguring mgr.np0005486732.pasqzz (monmap changed)... 
Oct 14 06:04:10 localhost ceph-mon[303906]: Reconfiguring daemon mgr.np0005486732.pasqzz on np0005486732.localdomain Oct 14 06:04:10 localhost ceph-mon[303906]: Removed label mgr from host np0005486729.localdomain Oct 14 06:04:10 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' Oct 14 06:04:10 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' Oct 14 06:04:10 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Oct 14 06:04:11 localhost podman[309740]: Oct 14 06:04:11 localhost podman[309740]: 2025-10-14 10:04:11.550487175 +0000 UTC m=+0.080198710 container create d64b124d3f12927e3b7fed029a393497942db75a738e97b3fda0fa0afd3719cb (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=peaceful_mirzakhani, description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, ceph=True, name=rhceph, io.openshift.expose-services=, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, version=7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., vcs-type=git, GIT_CLEAN=True, maintainer=Guillaume Abrioux , release=553, com.redhat.component=rhceph-container, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/agreements) Oct 14 06:04:11 localhost systemd[1]: Started 
libpod-conmon-d64b124d3f12927e3b7fed029a393497942db75a738e97b3fda0fa0afd3719cb.scope. Oct 14 06:04:11 localhost podman[309740]: 2025-10-14 10:04:11.516783586 +0000 UTC m=+0.046495161 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Oct 14 06:04:11 localhost systemd[1]: Started libcrun container. Oct 14 06:04:11 localhost podman[309740]: 2025-10-14 10:04:11.639804452 +0000 UTC m=+0.169515997 container init d64b124d3f12927e3b7fed029a393497942db75a738e97b3fda0fa0afd3719cb (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=peaceful_mirzakhani, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, architecture=x86_64, version=7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, build-date=2025-09-24T08:57:55, maintainer=Guillaume Abrioux , summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, name=rhceph, vendor=Red Hat, Inc., distribution-scope=public, release=553, io.openshift.expose-services=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, ceph=True, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/agreements) Oct 14 06:04:11 localhost podman[309740]: 2025-10-14 10:04:11.65155379 +0000 UTC m=+0.181265325 container start d64b124d3f12927e3b7fed029a393497942db75a738e97b3fda0fa0afd3719cb (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=peaceful_mirzakhani, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, 
vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, io.openshift.expose-services=, release=553, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_CLEAN=True, distribution-scope=public, version=7, ceph=True, architecture=x86_64, build-date=2025-09-24T08:57:55, maintainer=Guillaume Abrioux , vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0) Oct 14 06:04:11 localhost podman[309740]: 2025-10-14 10:04:11.654640004 +0000 UTC m=+0.184351539 container attach d64b124d3f12927e3b7fed029a393497942db75a738e97b3fda0fa0afd3719cb (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=peaceful_mirzakhani, build-date=2025-09-24T08:57:55, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, name=rhceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.component=rhceph-container, GIT_BRANCH=main, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, release=553, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, GIT_CLEAN=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, RELEASE=main, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, 
io.openshift.expose-services=, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, maintainer=Guillaume Abrioux , ceph=True) Oct 14 06:04:11 localhost peaceful_mirzakhani[309755]: 167 167 Oct 14 06:04:11 localhost systemd[1]: libpod-d64b124d3f12927e3b7fed029a393497942db75a738e97b3fda0fa0afd3719cb.scope: Deactivated successfully. Oct 14 06:04:11 localhost podman[309740]: 2025-10-14 10:04:11.659333688 +0000 UTC m=+0.189045263 container died d64b124d3f12927e3b7fed029a393497942db75a738e97b3fda0fa0afd3719cb (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=peaceful_mirzakhani, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, io.openshift.tags=rhceph ceph, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, maintainer=Guillaume Abrioux , com.redhat.license_terms=https://www.redhat.com/agreements, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=553, io.buildah.version=1.33.12, version=7, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, distribution-scope=public, name=rhceph, build-date=2025-09-24T08:57:55, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.component=rhceph-container, GIT_CLEAN=True, vcs-type=git, vendor=Red Hat, Inc., architecture=x86_64) Oct 14 06:04:11 localhost ceph-mon[303906]: Reconfiguring mon.np0005486732 (monmap changed)... 
Oct 14 06:04:11 localhost ceph-mon[303906]: Reconfiguring daemon mon.np0005486732 on np0005486732.localdomain Oct 14 06:04:11 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' Oct 14 06:04:11 localhost ceph-mon[303906]: Removed label _admin from host np0005486729.localdomain Oct 14 06:04:11 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' Oct 14 06:04:11 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' Oct 14 06:04:11 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005486733.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Oct 14 06:04:11 localhost podman[309760]: 2025-10-14 10:04:11.762875289 +0000 UTC m=+0.095685783 container remove d64b124d3f12927e3b7fed029a393497942db75a738e97b3fda0fa0afd3719cb (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=peaceful_mirzakhani, vendor=Red Hat, Inc., GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.component=rhceph-container, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, distribution-scope=public, name=rhceph, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, RELEASE=main, maintainer=Guillaume Abrioux , GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, io.openshift.tags=rhceph ceph, vcs-type=git, architecture=x86_64, release=553, 
com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, io.openshift.expose-services=) Oct 14 06:04:11 localhost systemd[1]: libpod-conmon-d64b124d3f12927e3b7fed029a393497942db75a738e97b3fda0fa0afd3719cb.scope: Deactivated successfully. Oct 14 06:04:12 localhost nova_compute[297686]: 2025-10-14 10:04:12.213 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:04:12 localhost podman[309829]: Oct 14 06:04:12 localhost podman[309829]: 2025-10-14 10:04:12.469975524 +0000 UTC m=+0.069027998 container create 93342111a4a93f5c09c099a4075e24c07847a406a35f2f8e30b4064ab98c1b7f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=agitated_shannon, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, ceph=True, maintainer=Guillaume Abrioux , version=7, GIT_BRANCH=main, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, release=553, GIT_CLEAN=True, RELEASE=main, name=rhceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-09-24T08:57:55, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vcs-type=git, io.buildah.version=1.33.12) Oct 14 06:04:12 localhost systemd[1]: Started libpod-conmon-93342111a4a93f5c09c099a4075e24c07847a406a35f2f8e30b4064ab98c1b7f.scope. 
Oct 14 06:04:12 localhost systemd[1]: Started libcrun container. Oct 14 06:04:12 localhost podman[309829]: 2025-10-14 10:04:12.437566275 +0000 UTC m=+0.036618779 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Oct 14 06:04:12 localhost podman[309829]: 2025-10-14 10:04:12.537712642 +0000 UTC m=+0.136765106 container init 93342111a4a93f5c09c099a4075e24c07847a406a35f2f8e30b4064ab98c1b7f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=agitated_shannon, RELEASE=main, build-date=2025-09-24T08:57:55, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, release=553, io.k8s.description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, architecture=x86_64, maintainer=Guillaume Abrioux , io.buildah.version=1.33.12, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, ceph=True, GIT_CLEAN=True, name=rhceph, vcs-type=git, CEPH_POINT_RELEASE=, version=7, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, GIT_BRANCH=main) Oct 14 06:04:12 localhost agitated_shannon[309844]: 167 167 Oct 14 06:04:12 localhost podman[309829]: 2025-10-14 10:04:12.543568981 +0000 UTC m=+0.142621465 container start 93342111a4a93f5c09c099a4075e24c07847a406a35f2f8e30b4064ab98c1b7f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=agitated_shannon, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-09-24T08:57:55, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, maintainer=Guillaume 
Abrioux , distribution-scope=public, RELEASE=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=553, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, vcs-type=git, io.openshift.tags=rhceph ceph, version=7, vendor=Red Hat, Inc., io.openshift.expose-services=, io.buildah.version=1.33.12, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_CLEAN=True) Oct 14 06:04:12 localhost podman[309829]: 2025-10-14 10:04:12.54387351 +0000 UTC m=+0.142925984 container attach 93342111a4a93f5c09c099a4075e24c07847a406a35f2f8e30b4064ab98c1b7f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=agitated_shannon, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhceph, io.openshift.expose-services=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, CEPH_POINT_RELEASE=, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, ceph=True, distribution-scope=public, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, build-date=2025-09-24T08:57:55, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.buildah.version=1.33.12, GIT_CLEAN=True, release=553, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux , vcs-type=git, 
GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7) Oct 14 06:04:12 localhost systemd[1]: libpod-93342111a4a93f5c09c099a4075e24c07847a406a35f2f8e30b4064ab98c1b7f.scope: Deactivated successfully. Oct 14 06:04:12 localhost podman[309829]: 2025-10-14 10:04:12.546432268 +0000 UTC m=+0.145484732 container died 93342111a4a93f5c09c099a4075e24c07847a406a35f2f8e30b4064ab98c1b7f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=agitated_shannon, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, release=553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_CLEAN=True, ceph=True, version=7, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.33.12, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, vcs-type=git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, maintainer=Guillaume Abrioux , vendor=Red Hat, Inc., com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7) Oct 14 06:04:12 localhost systemd[1]: var-lib-containers-storage-overlay-1b6d6fc14e6ca6656705fa6c65003277fda721b6083a5c20b73a23295e823920-merged.mount: Deactivated successfully. Oct 14 06:04:12 localhost systemd[1]: var-lib-containers-storage-overlay-820eaec14ee4d9f6ac330595d986e48fe19b7612b76a37fbda72e2c52ee8f4df-merged.mount: Deactivated successfully. 
Oct 14 06:04:12 localhost podman[309849]: 2025-10-14 10:04:12.62674768 +0000 UTC m=+0.076138315 container remove 93342111a4a93f5c09c099a4075e24c07847a406a35f2f8e30b4064ab98c1b7f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=agitated_shannon, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.component=rhceph-container, RELEASE=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, build-date=2025-09-24T08:57:55, ceph=True, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, release=553, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux , url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, description=Red Hat Ceph Storage 7, vcs-type=git, GIT_CLEAN=True) Oct 14 06:04:12 localhost systemd[1]: libpod-conmon-93342111a4a93f5c09c099a4075e24c07847a406a35f2f8e30b4064ab98c1b7f.scope: Deactivated successfully. Oct 14 06:04:12 localhost ceph-mon[303906]: Reconfiguring crash.np0005486733 (monmap changed)... Oct 14 06:04:12 localhost ceph-mon[303906]: Reconfiguring daemon crash.np0005486733 on np0005486733.localdomain Oct 14 06:04:12 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' Oct 14 06:04:12 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' Oct 14 06:04:12 localhost ceph-mon[303906]: Reconfiguring osd.0 (monmap changed)... 
Oct 14 06:04:12 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get", "entity": "osd.0"} : dispatch Oct 14 06:04:12 localhost ceph-mon[303906]: Reconfiguring daemon osd.0 on np0005486733.localdomain Oct 14 06:04:12 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' Oct 14 06:04:12 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' Oct 14 06:04:12 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get", "entity": "osd.3"} : dispatch Oct 14 06:04:13 localhost ceph-mon[303906]: mon.np0005486733@1(peon).osd e81 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Oct 14 06:04:13 localhost podman[309925]: Oct 14 06:04:13 localhost podman[309925]: 2025-10-14 10:04:13.425052831 +0000 UTC m=+0.071258147 container create c8117a35643a02f6fa4e3170272cb3522ec9c7d59e6d77ea8c19b110710820c4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vibrant_ride, io.buildah.version=1.33.12, version=7, name=rhceph, architecture=x86_64, build-date=2025-09-24T08:57:55, RELEASE=main, maintainer=Guillaume Abrioux , GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, release=553, CEPH_POINT_RELEASE=, vcs-type=git, GIT_BRANCH=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vendor=Red Hat, Inc., summary=Provides the latest 
Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3) Oct 14 06:04:13 localhost systemd[1]: Started libpod-conmon-c8117a35643a02f6fa4e3170272cb3522ec9c7d59e6d77ea8c19b110710820c4.scope. Oct 14 06:04:13 localhost systemd[1]: Started libcrun container. Oct 14 06:04:13 localhost podman[309925]: 2025-10-14 10:04:13.483090793 +0000 UTC m=+0.129296139 container init c8117a35643a02f6fa4e3170272cb3522ec9c7d59e6d77ea8c19b110710820c4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vibrant_ride, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-type=git, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , com.redhat.component=rhceph-container, GIT_BRANCH=main, version=7, CEPH_POINT_RELEASE=, GIT_CLEAN=True, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, release=553, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, ceph=True, build-date=2025-09-24T08:57:55, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vendor=Red Hat, Inc., RELEASE=main, name=rhceph, io.buildah.version=1.33.12, architecture=x86_64) Oct 14 06:04:13 localhost podman[309925]: 2025-10-14 10:04:13.493532271 +0000 UTC m=+0.139737617 container start c8117a35643a02f6fa4e3170272cb3522ec9c7d59e6d77ea8c19b110710820c4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vibrant_ride, build-date=2025-09-24T08:57:55, distribution-scope=public, release=553, GIT_CLEAN=True, 
summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux , io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, name=rhceph, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., GIT_BRANCH=main, version=7, ceph=True, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, architecture=x86_64) Oct 14 06:04:13 localhost podman[309925]: 2025-10-14 10:04:13.493910162 +0000 UTC m=+0.140115558 container attach c8117a35643a02f6fa4e3170272cb3522ec9c7d59e6d77ea8c19b110710820c4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vibrant_ride, vcs-type=git, GIT_BRANCH=main, distribution-scope=public, release=553, io.buildah.version=1.33.12, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, build-date=2025-09-24T08:57:55, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, maintainer=Guillaume 
Abrioux , org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, version=7, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, name=rhceph, CEPH_POINT_RELEASE=) Oct 14 06:04:13 localhost vibrant_ride[309941]: 167 167 Oct 14 06:04:13 localhost systemd[1]: libpod-c8117a35643a02f6fa4e3170272cb3522ec9c7d59e6d77ea8c19b110710820c4.scope: Deactivated successfully. Oct 14 06:04:13 localhost podman[309925]: 2025-10-14 10:04:13.499326858 +0000 UTC m=+0.145532244 container died c8117a35643a02f6fa4e3170272cb3522ec9c7d59e6d77ea8c19b110710820c4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vibrant_ride, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, architecture=x86_64, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux , org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, release=553, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-09-24T08:57:55, distribution-scope=public, io.openshift.tags=rhceph ceph, version=7, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, io.buildah.version=1.33.12, vcs-type=git, vendor=Red Hat, Inc., ceph=True) Oct 14 06:04:13 localhost podman[309925]: 2025-10-14 10:04:13.402383079 +0000 UTC m=+0.048588435 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Oct 14 06:04:13 localhost systemd[1]: var-lib-containers-storage-overlay-fc913c8fd28ffd5b6adbf3bef2a3d452d66d426a451d94b99792b3bc9dfd4c6f-merged.mount: Deactivated 
successfully. Oct 14 06:04:13 localhost podman[309946]: 2025-10-14 10:04:13.609180242 +0000 UTC m=+0.099914471 container remove c8117a35643a02f6fa4e3170272cb3522ec9c7d59e6d77ea8c19b110710820c4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vibrant_ride, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, com.redhat.component=rhceph-container, vcs-type=git, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vendor=Red Hat, Inc., distribution-scope=public, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, io.buildah.version=1.33.12, ceph=True, release=553, name=rhceph, maintainer=Guillaume Abrioux , CEPH_POINT_RELEASE=, GIT_CLEAN=True, build-date=2025-09-24T08:57:55, version=7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553) Oct 14 06:04:13 localhost systemd[1]: libpod-conmon-c8117a35643a02f6fa4e3170272cb3522ec9c7d59e6d77ea8c19b110710820c4.scope: Deactivated successfully. Oct 14 06:04:13 localhost ceph-mon[303906]: Reconfiguring osd.3 (monmap changed)... 
Oct 14 06:04:13 localhost ceph-mon[303906]: Reconfiguring daemon osd.3 on np0005486733.localdomain Oct 14 06:04:13 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' Oct 14 06:04:13 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' Oct 14 06:04:13 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005486733.tvstmf", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Oct 14 06:04:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29. Oct 14 06:04:14 localhost podman[310005]: 2025-10-14 10:04:14.143975738 +0000 UTC m=+0.103703717 container health_status fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=iscsid, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=iscsid, org.label-schema.schema-version=1.0, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, managed_by=edpm_ansible) Oct 14 06:04:14 localhost podman[310005]: 2025-10-14 10:04:14.156262323 +0000 UTC m=+0.115990262 container exec_died fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, config_id=iscsid, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3) Oct 14 06:04:14 localhost systemd[1]: fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29.service: Deactivated successfully. Oct 14 06:04:14 localhost podman[310040]: Oct 14 06:04:14 localhost podman[310040]: 2025-10-14 10:04:14.508547587 +0000 UTC m=+0.083798239 container create 184016dcaf85230b521f812311b331c02a27a9bf5e4e005b35a14e211fcb6803 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=optimistic_haslett, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, release=553, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, RELEASE=main, distribution-scope=public, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-09-24T08:57:55, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=Guillaume Abrioux , description=Red Hat Ceph Storage 7, vcs-type=git, io.buildah.version=1.33.12, ceph=True, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, name=rhceph, GIT_CLEAN=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7) Oct 14 06:04:14 localhost systemd[1]: Started libpod-conmon-184016dcaf85230b521f812311b331c02a27a9bf5e4e005b35a14e211fcb6803.scope. 
Oct 14 06:04:14 localhost podman[310040]: 2025-10-14 10:04:14.472427675 +0000 UTC m=+0.047678377 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Oct 14 06:04:14 localhost systemd[1]: Started libcrun container. Oct 14 06:04:14 localhost podman[310040]: 2025-10-14 10:04:14.595017127 +0000 UTC m=+0.170267769 container init 184016dcaf85230b521f812311b331c02a27a9bf5e4e005b35a14e211fcb6803 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=optimistic_haslett, name=rhceph, build-date=2025-09-24T08:57:55, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, version=7, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux , com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, io.openshift.expose-services=, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, distribution-scope=public, vcs-type=git, io.buildah.version=1.33.12, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, release=553, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_CLEAN=True) Oct 14 06:04:14 localhost podman[310040]: 2025-10-14 10:04:14.60494235 +0000 UTC m=+0.180193022 container start 184016dcaf85230b521f812311b331c02a27a9bf5e4e005b35a14e211fcb6803 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=optimistic_haslett, description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., 
GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, build-date=2025-09-24T08:57:55, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, GIT_BRANCH=main, ceph=True, architecture=x86_64, maintainer=Guillaume Abrioux , RELEASE=main, com.redhat.component=rhceph-container, io.openshift.expose-services=, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/agreements, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, distribution-scope=public, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, version=7, vcs-type=git, io.buildah.version=1.33.12, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=553) Oct 14 06:04:14 localhost podman[310040]: 2025-10-14 10:04:14.605194147 +0000 UTC m=+0.180444809 container attach 184016dcaf85230b521f812311b331c02a27a9bf5e4e005b35a14e211fcb6803 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=optimistic_haslett, distribution-scope=public, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux , GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, vendor=Red Hat, Inc., GIT_BRANCH=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, release=553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, build-date=2025-09-24T08:57:55, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, vcs-type=git, version=7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, 
io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container) Oct 14 06:04:14 localhost optimistic_haslett[310055]: 167 167 Oct 14 06:04:14 localhost systemd[1]: libpod-184016dcaf85230b521f812311b331c02a27a9bf5e4e005b35a14e211fcb6803.scope: Deactivated successfully. Oct 14 06:04:14 localhost podman[310040]: 2025-10-14 10:04:14.610646174 +0000 UTC m=+0.185896906 container died 184016dcaf85230b521f812311b331c02a27a9bf5e4e005b35a14e211fcb6803 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=optimistic_haslett, io.buildah.version=1.33.12, build-date=2025-09-24T08:57:55, com.redhat.component=rhceph-container, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, ceph=True, version=7, vendor=Red Hat, Inc., name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_BRANCH=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, RELEASE=main, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=553, io.openshift.tags=rhceph ceph, distribution-scope=public, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, CEPH_POINT_RELEASE=, io.openshift.expose-services=, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux ) Oct 14 06:04:14 localhost nova_compute[297686]: 2025-10-14 10:04:14.654 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:04:14 localhost podman[310060]: 2025-10-14 10:04:14.715734192 +0000 UTC m=+0.095667342 container remove 184016dcaf85230b521f812311b331c02a27a9bf5e4e005b35a14e211fcb6803 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, 
name=optimistic_haslett, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, architecture=x86_64, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, GIT_CLEAN=True, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., build-date=2025-09-24T08:57:55, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.33.12, distribution-scope=public, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, ceph=True, maintainer=Guillaume Abrioux , url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, release=553, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/agreements, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, CEPH_POINT_RELEASE=, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7) Oct 14 06:04:14 localhost systemd[1]: libpod-conmon-184016dcaf85230b521f812311b331c02a27a9bf5e4e005b35a14e211fcb6803.scope: Deactivated successfully. Oct 14 06:04:14 localhost ceph-mon[303906]: Reconfiguring mds.mds.np0005486733.tvstmf (monmap changed)... 
Oct 14 06:04:14 localhost ceph-mon[303906]: Reconfiguring daemon mds.mds.np0005486733.tvstmf on np0005486733.localdomain Oct 14 06:04:14 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' Oct 14 06:04:14 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' Oct 14 06:04:14 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005486733.primvu", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Oct 14 06:04:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad. Oct 14 06:04:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec. Oct 14 06:04:15 localhost podman[310127]: 2025-10-14 10:04:15.439119415 +0000 UTC m=+0.096252380 container health_status aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', 
'--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors ) Oct 14 06:04:15 localhost podman[310127]: 2025-10-14 10:04:15.446855372 +0000 UTC m=+0.103988357 container exec_died aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Oct 14 06:04:15 localhost 
podman[310147]: Oct 14 06:04:15 localhost systemd[1]: aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec.service: Deactivated successfully. Oct 14 06:04:15 localhost podman[310126]: 2025-10-14 10:04:15.490008709 +0000 UTC m=+0.147504455 container health_status 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, container_name=multipathd, config_id=multipathd, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true) 
Oct 14 06:04:15 localhost podman[310126]: 2025-10-14 10:04:15.499830989 +0000 UTC m=+0.157326825 container exec_died 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2) Oct 14 06:04:15 localhost systemd[1]: 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad.service: Deactivated successfully. 
Oct 14 06:04:15 localhost podman[310147]: 2025-10-14 10:04:15.523954255 +0000 UTC m=+0.148051961 container create 578891d543759d39ee57360e585a635674aab1c85872001c5ed95bbb59b9a831 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=youthful_black, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux , vcs-type=git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, name=rhceph, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, release=553, build-date=2025-09-24T08:57:55, com.redhat.license_terms=https://www.redhat.com/agreements, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, ceph=True, com.redhat.component=rhceph-container, distribution-scope=public, version=7, architecture=x86_64, io.buildah.version=1.33.12, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, io.openshift.tags=rhceph ceph) Oct 14 06:04:15 localhost podman[310147]: 2025-10-14 10:04:15.43697344 +0000 UTC m=+0.061071216 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Oct 14 06:04:15 localhost systemd[1]: Started libpod-conmon-578891d543759d39ee57360e585a635674aab1c85872001c5ed95bbb59b9a831.scope. Oct 14 06:04:15 localhost systemd[1]: var-lib-containers-storage-overlay-acae88a033fb46027508d142b0e72634fbaa5dbbbf8382a141c232b7b1e865c0-merged.mount: Deactivated successfully. Oct 14 06:04:15 localhost systemd[1]: Started libcrun container. 
Oct 14 06:04:15 localhost podman[310147]: 2025-10-14 10:04:15.586312828 +0000 UTC m=+0.210410604 container init 578891d543759d39ee57360e585a635674aab1c85872001c5ed95bbb59b9a831 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=youthful_black, io.buildah.version=1.33.12, description=Red Hat Ceph Storage 7, RELEASE=main, maintainer=Guillaume Abrioux , com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, vcs-type=git, CEPH_POINT_RELEASE=, name=rhceph, architecture=x86_64, release=553, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, distribution-scope=public, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-09-24T08:57:55, vendor=Red Hat, Inc., GIT_CLEAN=True, ceph=True) Oct 14 06:04:15 localhost systemd[1]: tmp-crun.SxyqKj.mount: Deactivated successfully. 
Oct 14 06:04:15 localhost podman[310147]: 2025-10-14 10:04:15.606239328 +0000 UTC m=+0.230337054 container start 578891d543759d39ee57360e585a635674aab1c85872001c5ed95bbb59b9a831 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=youthful_black, io.openshift.tags=rhceph ceph, release=553, GIT_BRANCH=main, version=7, RELEASE=main, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, build-date=2025-09-24T08:57:55, ceph=True, com.redhat.license_terms=https://www.redhat.com/agreements, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, name=rhceph, maintainer=Guillaume Abrioux , distribution-scope=public, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, io.buildah.version=1.33.12, io.openshift.expose-services=, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9) Oct 14 06:04:15 localhost podman[310147]: 2025-10-14 10:04:15.606582598 +0000 UTC m=+0.230680374 container attach 578891d543759d39ee57360e585a635674aab1c85872001c5ed95bbb59b9a831 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=youthful_black, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, 
ceph=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux , url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.buildah.version=1.33.12, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, release=553, distribution-scope=public, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, RELEASE=main, vendor=Red Hat, Inc., build-date=2025-09-24T08:57:55, GIT_CLEAN=True) Oct 14 06:04:15 localhost youthful_black[310184]: 167 167 Oct 14 06:04:15 localhost systemd[1]: libpod-578891d543759d39ee57360e585a635674aab1c85872001c5ed95bbb59b9a831.scope: Deactivated successfully. Oct 14 06:04:15 localhost podman[310147]: 2025-10-14 10:04:15.611387354 +0000 UTC m=+0.235485120 container died 578891d543759d39ee57360e585a635674aab1c85872001c5ed95bbb59b9a831 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=youthful_black, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux , org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, ceph=True, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, release=553, description=Red Hat Ceph Storage 7, vcs-type=git, architecture=x86_64, name=rhceph, RELEASE=main, com.redhat.component=rhceph-container, io.buildah.version=1.33.12, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, build-date=2025-09-24T08:57:55, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, GIT_BRANCH=main, 
GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0) Oct 14 06:04:15 localhost podman[310189]: 2025-10-14 10:04:15.714973426 +0000 UTC m=+0.094575027 container remove 578891d543759d39ee57360e585a635674aab1c85872001c5ed95bbb59b9a831 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=youthful_black, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, version=7, io.buildah.version=1.33.12, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, release=553, distribution-scope=public, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, ceph=True, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, RELEASE=main, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-09-24T08:57:55, maintainer=Guillaume Abrioux ) Oct 14 06:04:15 localhost systemd[1]: libpod-conmon-578891d543759d39ee57360e585a635674aab1c85872001c5ed95bbb59b9a831.scope: Deactivated successfully. Oct 14 06:04:15 localhost ceph-mon[303906]: Reconfiguring mgr.np0005486733.primvu (monmap changed)... 
Oct 14 06:04:15 localhost ceph-mon[303906]: Reconfiguring daemon mgr.np0005486733.primvu on np0005486733.localdomain Oct 14 06:04:15 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' Oct 14 06:04:15 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' Oct 14 06:04:15 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Oct 14 06:04:16 localhost nova_compute[297686]: 2025-10-14 10:04:16.256 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:04:16 localhost nova_compute[297686]: 2025-10-14 10:04:16.257 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:04:16 localhost nova_compute[297686]: 2025-10-14 10:04:16.279 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:04:16 localhost podman[310259]: Oct 14 06:04:16 localhost podman[310259]: 2025-10-14 10:04:16.430378866 +0000 UTC m=+0.078750484 container create 0998e3f3f9c2ece79d19d05a546a3de7446c8ebbb95ebe614631490ced51b4bc (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=objective_antonelli, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, build-date=2025-09-24T08:57:55, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully 
featured and supported base image., com.redhat.license_terms=https://www.redhat.com/agreements, ceph=True, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.expose-services=, release=553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux , name=rhceph, architecture=x86_64, description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, GIT_BRANCH=main, version=7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, distribution-scope=public) Oct 14 06:04:16 localhost systemd[1]: Started libpod-conmon-0998e3f3f9c2ece79d19d05a546a3de7446c8ebbb95ebe614631490ced51b4bc.scope. Oct 14 06:04:16 localhost systemd[1]: Started libcrun container. 
Oct 14 06:04:16 localhost podman[310259]: 2025-10-14 10:04:16.398101911 +0000 UTC m=+0.046473489 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Oct 14 06:04:16 localhost podman[310259]: 2025-10-14 10:04:16.502532718 +0000 UTC m=+0.150904276 container init 0998e3f3f9c2ece79d19d05a546a3de7446c8ebbb95ebe614631490ced51b4bc (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=objective_antonelli, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux , summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, CEPH_POINT_RELEASE=, ceph=True, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, distribution-scope=public, vcs-type=git, io.buildah.version=1.33.12, release=553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.expose-services=, name=rhceph, version=7, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-09-24T08:57:55, GIT_BRANCH=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7) Oct 14 06:04:16 localhost podman[310259]: 2025-10-14 10:04:16.512218605 +0000 UTC m=+0.160590153 container start 0998e3f3f9c2ece79d19d05a546a3de7446c8ebbb95ebe614631490ced51b4bc (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=objective_antonelli, io.buildah.version=1.33.12, architecture=x86_64, release=553, vendor=Red Hat, Inc., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux , 
GIT_BRANCH=main, RELEASE=main, io.openshift.expose-services=, CEPH_POINT_RELEASE=, build-date=2025-09-24T08:57:55, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, ceph=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, name=rhceph, GIT_CLEAN=True, version=7) Oct 14 06:04:16 localhost podman[310259]: 2025-10-14 10:04:16.512887255 +0000 UTC m=+0.161258893 container attach 0998e3f3f9c2ece79d19d05a546a3de7446c8ebbb95ebe614631490ced51b4bc (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=objective_antonelli, RELEASE=main, GIT_CLEAN=True, vcs-type=git, maintainer=Guillaume Abrioux , org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=553, architecture=x86_64, name=rhceph, vendor=Red Hat, Inc., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, build-date=2025-09-24T08:57:55, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.33.12, io.openshift.tags=rhceph ceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.expose-services=, ceph=True, com.redhat.component=rhceph-container, version=7, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public) Oct 
14 06:04:16 localhost objective_antonelli[310274]: 167 167 Oct 14 06:04:16 localhost systemd[1]: libpod-0998e3f3f9c2ece79d19d05a546a3de7446c8ebbb95ebe614631490ced51b4bc.scope: Deactivated successfully. Oct 14 06:04:16 localhost podman[310259]: 2025-10-14 10:04:16.518598769 +0000 UTC m=+0.166970327 container died 0998e3f3f9c2ece79d19d05a546a3de7446c8ebbb95ebe614631490ced51b4bc (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=objective_antonelli, GIT_CLEAN=True, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.component=rhceph-container, RELEASE=main, name=rhceph, CEPH_POINT_RELEASE=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux , url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-type=git, description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, distribution-scope=public, build-date=2025-09-24T08:57:55, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_REPO=https://github.com/ceph/ceph-container.git, release=553, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9) Oct 14 06:04:16 localhost systemd[1]: tmp-crun.2uL1hR.mount: Deactivated successfully. Oct 14 06:04:16 localhost systemd[1]: var-lib-containers-storage-overlay-d8ee85fc35f8ebd115733e615ae9eeb0ad6765f2080dbe3b8ca2b8c4504ecc25-merged.mount: Deactivated successfully. Oct 14 06:04:16 localhost systemd[1]: var-lib-containers-storage-overlay-ef3f437dcd7abf54ccb4f43076a244f4d2b7b5e83bfec966317105e8fdf4192a-merged.mount: Deactivated successfully. 
Oct 14 06:04:16 localhost podman[310279]: 2025-10-14 10:04:16.616663823 +0000 UTC m=+0.086906684 container remove 0998e3f3f9c2ece79d19d05a546a3de7446c8ebbb95ebe614631490ced51b4bc (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=objective_antonelli, GIT_CLEAN=True, io.buildah.version=1.33.12, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, version=7, vcs-type=git, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux , name=rhceph, release=553, io.openshift.expose-services=, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-09-24T08:57:55, com.redhat.component=rhceph-container, architecture=x86_64, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, ceph=True, description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553) Oct 14 06:04:16 localhost systemd[1]: libpod-conmon-0998e3f3f9c2ece79d19d05a546a3de7446c8ebbb95ebe614631490ced51b4bc.scope: Deactivated successfully. Oct 14 06:04:16 localhost ceph-mon[303906]: Reconfiguring mon.np0005486733 (monmap changed)... 
Oct 14 06:04:16 localhost ceph-mon[303906]: Reconfiguring daemon mon.np0005486733 on np0005486733.localdomain Oct 14 06:04:16 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' Oct 14 06:04:16 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' Oct 14 06:04:17 localhost nova_compute[297686]: 2025-10-14 10:04:17.250 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:04:18 localhost nova_compute[297686]: 2025-10-14 10:04:18.256 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:04:18 localhost nova_compute[297686]: 2025-10-14 10:04:18.256 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:04:18 localhost nova_compute[297686]: 2025-10-14 10:04:18.256 2 DEBUG nova.compute.manager [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... 
_reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Oct 14 06:04:18 localhost ceph-mon[303906]: mon.np0005486733@1(peon).osd e81 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Oct 14 06:04:19 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' Oct 14 06:04:19 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' Oct 14 06:04:19 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Oct 14 06:04:19 localhost ceph-mon[303906]: Removing np0005486729.localdomain:/var/lib/ceph/fcadf6e2-9176-5818-a8d0-37b19acf8eaf/config/ceph.conf Oct 14 06:04:19 localhost ceph-mon[303906]: Updating np0005486730.localdomain:/etc/ceph/ceph.conf Oct 14 06:04:19 localhost ceph-mon[303906]: Updating np0005486731.localdomain:/etc/ceph/ceph.conf Oct 14 06:04:19 localhost ceph-mon[303906]: Updating np0005486732.localdomain:/etc/ceph/ceph.conf Oct 14 06:04:19 localhost ceph-mon[303906]: Updating np0005486733.localdomain:/etc/ceph/ceph.conf Oct 14 06:04:19 localhost ceph-mon[303906]: Removing np0005486729.localdomain:/etc/ceph/ceph.client.admin.keyring Oct 14 06:04:19 localhost ceph-mon[303906]: Removing np0005486729.localdomain:/var/lib/ceph/fcadf6e2-9176-5818-a8d0-37b19acf8eaf/config/ceph.client.admin.keyring Oct 14 06:04:19 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' Oct 14 06:04:19 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' Oct 14 06:04:19 localhost ceph-mon[303906]: Updating np0005486730.localdomain:/var/lib/ceph/fcadf6e2-9176-5818-a8d0-37b19acf8eaf/config/ceph.conf Oct 14 06:04:19 localhost ceph-mon[303906]: Updating 
np0005486733.localdomain:/var/lib/ceph/fcadf6e2-9176-5818-a8d0-37b19acf8eaf/config/ceph.conf Oct 14 06:04:19 localhost nova_compute[297686]: 2025-10-14 10:04:19.697 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:04:20 localhost nova_compute[297686]: 2025-10-14 10:04:20.255 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:04:20 localhost nova_compute[297686]: 2025-10-14 10:04:20.255 2 DEBUG nova.compute.manager [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Oct 14 06:04:20 localhost nova_compute[297686]: 2025-10-14 10:04:20.255 2 DEBUG nova.compute.manager [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Oct 14 06:04:20 localhost ceph-mon[303906]: Updating np0005486732.localdomain:/var/lib/ceph/fcadf6e2-9176-5818-a8d0-37b19acf8eaf/config/ceph.conf Oct 14 06:04:20 localhost ceph-mon[303906]: Updating np0005486731.localdomain:/var/lib/ceph/fcadf6e2-9176-5818-a8d0-37b19acf8eaf/config/ceph.conf Oct 14 06:04:20 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' Oct 14 06:04:20 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' Oct 14 06:04:20 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' Oct 14 06:04:20 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' Oct 14 06:04:20 
localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' Oct 14 06:04:20 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' Oct 14 06:04:20 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' Oct 14 06:04:20 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' Oct 14 06:04:20 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' Oct 14 06:04:20 localhost ceph-mon[303906]: Removing daemon mgr.np0005486729.xpybho from np0005486729.localdomain -- ports [8765] Oct 14 06:04:21 localhost nova_compute[297686]: 2025-10-14 10:04:21.292 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Acquiring lock "refresh_cache-88c4e366-b765-47a6-96bf-f7677f2ce67c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Oct 14 06:04:21 localhost nova_compute[297686]: 2025-10-14 10:04:21.292 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Acquired lock "refresh_cache-88c4e366-b765-47a6-96bf-f7677f2ce67c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Oct 14 06:04:21 localhost nova_compute[297686]: 2025-10-14 10:04:21.293 2 DEBUG nova.network.neutron [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] [instance: 88c4e366-b765-47a6-96bf-f7677f2ce67c] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Oct 14 06:04:21 localhost nova_compute[297686]: 2025-10-14 10:04:21.293 2 DEBUG nova.objects.instance [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Lazy-loading 'info_cache' on Instance uuid 88c4e366-b765-47a6-96bf-f7677f2ce67c obj_load_attr 
/usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Oct 14 06:04:22 localhost nova_compute[297686]: 2025-10-14 10:04:22.289 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:04:22 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth rm", "entity": "mgr.np0005486729.xpybho"} : dispatch Oct 14 06:04:22 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' cmd='[{"prefix": "auth rm", "entity": "mgr.np0005486729.xpybho"}]': finished Oct 14 06:04:22 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' Oct 14 06:04:22 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' Oct 14 06:04:22 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' Oct 14 06:04:22 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' Oct 14 06:04:23 localhost ceph-mon[303906]: mon.np0005486733@1(peon).osd e81 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Oct 14 06:04:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f. Oct 14 06:04:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917. Oct 14 06:04:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d. 
Oct 14 06:04:23 localhost podman[310635]: 2025-10-14 10:04:23.752746851 +0000 UTC m=+0.084308066 container health_status 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.build-date=20251009, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible) Oct 14 06:04:23 localhost podman[310635]: 2025-10-14 10:04:23.788484011 +0000 UTC m=+0.120045176 container exec_died 
89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=edpm, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}) Oct 14 06:04:23 localhost systemd[1]: 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d.service: Deactivated successfully. Oct 14 06:04:23 localhost systemd[1]: tmp-crun.YHu8US.mount: Deactivated successfully. 
Oct 14 06:04:23 localhost systemd[1]: tmp-crun.6wEPzs.mount: Deactivated successfully. Oct 14 06:04:23 localhost ceph-mon[303906]: Removing key for mgr.np0005486729.xpybho Oct 14 06:04:23 localhost podman[310633]: 2025-10-14 10:04:23.859326925 +0000 UTC m=+0.194027315 container health_status 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image) Oct 14 06:04:23 localhost ceph-mon[303906]: Added label _no_schedule to host np0005486729.localdomain Oct 14 06:04:23 localhost ceph-mon[303906]: Added label SpecialHostLabels.DRAIN_CONF_KEYRING to host np0005486729.localdomain Oct 14 06:04:23 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' 
Oct 14 06:04:23 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' Oct 14 06:04:23 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Oct 14 06:04:23 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' Oct 14 06:04:23 localhost podman[310634]: 2025-10-14 10:04:23.833371722 +0000 UTC m=+0.168210976 container health_status 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, name=ubi9-minimal, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, container_name=openstack_network_exporter, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., distribution-scope=public, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6) Oct 14 06:04:23 localhost podman[310633]: 2025-10-14 10:04:23.911118545 +0000 UTC m=+0.245818925 container exec_died 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, io.buildah.version=1.41.3) Oct 14 06:04:23 localhost systemd[1]: 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f.service: Deactivated successfully. Oct 14 06:04:23 localhost podman[310634]: 2025-10-14 10:04:23.965174795 +0000 UTC m=+0.300014059 container exec_died 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, architecture=x86_64, config_id=edpm, build-date=2025-08-20T13:12:41, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, release=1755695350) Oct 14 06:04:23 localhost systemd[1]: 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917.service: Deactivated successfully. 
Oct 14 06:04:24 localhost nova_compute[297686]: 2025-10-14 10:04:24.346 2 DEBUG nova.network.neutron [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] [instance: 88c4e366-b765-47a6-96bf-f7677f2ce67c] Updating instance_info_cache with network_info: [{"id": "3ec9b060-f43d-4698-9c76-6062c70911d5", "address": "fa:16:3e:84:5e:e5", "network": {"id": "7d0cd696-bdd7-4e70-9512-eb0d23640314", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.46", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "41187b090f3d4818a32baa37ce8a3991", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3ec9b060-f4", "ovs_interfaceid": "3ec9b060-f43d-4698-9c76-6062c70911d5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Oct 14 06:04:24 localhost nova_compute[297686]: 2025-10-14 10:04:24.366 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Releasing lock "refresh_cache-88c4e366-b765-47a6-96bf-f7677f2ce67c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Oct 14 06:04:24 localhost nova_compute[297686]: 2025-10-14 10:04:24.367 2 DEBUG nova.compute.manager [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] [instance: 88c4e366-b765-47a6-96bf-f7677f2ce67c] Updated the network 
info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Oct 14 06:04:24 localhost nova_compute[297686]: 2025-10-14 10:04:24.368 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:04:24 localhost nova_compute[297686]: 2025-10-14 10:04:24.368 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:04:24 localhost nova_compute[297686]: 2025-10-14 10:04:24.369 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:04:24 localhost nova_compute[297686]: 2025-10-14 10:04:24.369 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:04:24 localhost nova_compute[297686]: 2025-10-14 10:04:24.408 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Oct 14 06:04:24 localhost nova_compute[297686]: 2025-10-14 10:04:24.408 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Lock "compute_resources" acquired by 
"nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Oct 14 06:04:24 localhost nova_compute[297686]: 2025-10-14 10:04:24.409 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Oct 14 06:04:24 localhost nova_compute[297686]: 2025-10-14 10:04:24.409 2 DEBUG nova.compute.resource_tracker [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Auditing locally available compute resources for np0005486733.localdomain (node: np0005486733.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Oct 14 06:04:24 localhost nova_compute[297686]: 2025-10-14 10:04:24.410 2 DEBUG oslo_concurrency.processutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Oct 14 06:04:24 localhost nova_compute[297686]: 2025-10-14 10:04:24.730 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:04:24 localhost ceph-mon[303906]: Removing daemon crash.np0005486729 from np0005486729.localdomain -- ports [] Oct 14 06:04:24 localhost nova_compute[297686]: 2025-10-14 10:04:24.927 2 DEBUG oslo_concurrency.processutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.517s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Oct 14 06:04:25 localhost nova_compute[297686]: 
2025-10-14 10:04:25.003 2 DEBUG nova.virt.libvirt.driver [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Oct 14 06:04:25 localhost nova_compute[297686]: 2025-10-14 10:04:25.004 2 DEBUG nova.virt.libvirt.driver [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Oct 14 06:04:25 localhost nova_compute[297686]: 2025-10-14 10:04:25.255 2 WARNING nova.virt.libvirt.driver [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m Oct 14 06:04:25 localhost nova_compute[297686]: 2025-10-14 10:04:25.256 2 DEBUG nova.compute.resource_tracker [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Hypervisor/Node resource view: name=np0005486733.localdomain free_ram=11440MB free_disk=41.83695602416992GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", 
"vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Oct 14 06:04:25 localhost nova_compute[297686]: 2025-10-14 10:04:25.257 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Oct 14 06:04:25 localhost nova_compute[297686]: 2025-10-14 10:04:25.257 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Oct 14 06:04:25 
localhost nova_compute[297686]: 2025-10-14 10:04:25.332 2 DEBUG nova.compute.resource_tracker [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Instance 88c4e366-b765-47a6-96bf-f7677f2ce67c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Oct 14 06:04:25 localhost nova_compute[297686]: 2025-10-14 10:04:25.333 2 DEBUG nova.compute.resource_tracker [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Oct 14 06:04:25 localhost nova_compute[297686]: 2025-10-14 10:04:25.333 2 DEBUG nova.compute.resource_tracker [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Final resource view: name=np0005486733.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Oct 14 06:04:25 localhost nova_compute[297686]: 2025-10-14 10:04:25.366 2 DEBUG oslo_concurrency.processutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Oct 14 06:04:25 localhost ceph-mon[303906]: mon.np0005486733@1(peon) e10 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Oct 14 06:04:25 localhost ceph-mon[303906]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.108:0/1324407786' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Oct 14 06:04:25 localhost nova_compute[297686]: 2025-10-14 10:04:25.800 2 DEBUG oslo_concurrency.processutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.435s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Oct 14 06:04:25 localhost nova_compute[297686]: 2025-10-14 10:04:25.807 2 DEBUG nova.compute.provider_tree [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Inventory has not changed in ProviderTree for provider: 18c24273-aca2-4f08-be57-3188d558235e update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Oct 14 06:04:25 localhost nova_compute[297686]: 2025-10-14 10:04:25.827 2 DEBUG nova.scheduler.client.report [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Inventory has not changed for provider 18c24273-aca2-4f08-be57-3188d558235e based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Oct 14 06:04:25 localhost nova_compute[297686]: 2025-10-14 10:04:25.830 2 DEBUG nova.compute.resource_tracker [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Compute_service record updated for np0005486733.localdomain:np0005486733.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Oct 14 06:04:25 localhost nova_compute[297686]: 2025-10-14 10:04:25.831 2 DEBUG oslo_concurrency.lockutils [None 
req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.574s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Oct 14 06:04:25 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' Oct 14 06:04:25 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005486729.localdomain"} : dispatch Oct 14 06:04:25 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/host.np0005486729.localdomain"}]': finished Oct 14 06:04:25 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth rm", "entity": "client.crash.np0005486729.localdomain"} : dispatch Oct 14 06:04:25 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' cmd='[{"prefix": "auth rm", "entity": "client.crash.np0005486729.localdomain"}]': finished Oct 14 06:04:25 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' Oct 14 06:04:25 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' Oct 14 06:04:25 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Oct 14 06:04:25 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' Oct 14 06:04:25 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' Oct 14 06:04:25 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' 
entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Oct 14 06:04:26 localhost ceph-mon[303906]: Removed host np0005486729.localdomain Oct 14 06:04:26 localhost ceph-mon[303906]: Removing key for client.crash.np0005486729.localdomain Oct 14 06:04:26 localhost ceph-mon[303906]: Reconfiguring mon.np0005486730 (monmap changed)... Oct 14 06:04:26 localhost ceph-mon[303906]: Reconfiguring daemon mon.np0005486730 on np0005486730.localdomain Oct 14 06:04:26 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' Oct 14 06:04:26 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' Oct 14 06:04:26 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005486730.ddfidc", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Oct 14 06:04:27 localhost nova_compute[297686]: 2025-10-14 10:04:27.333 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:04:27 localhost ceph-mon[303906]: Reconfiguring mgr.np0005486730.ddfidc (monmap changed)... 
Oct 14 06:04:27 localhost ceph-mon[303906]: Reconfiguring daemon mgr.np0005486730.ddfidc on np0005486730.localdomain Oct 14 06:04:27 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' Oct 14 06:04:27 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' Oct 14 06:04:27 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005486730.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Oct 14 06:04:28 localhost podman[248187]: time="2025-10-14T10:04:28Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Oct 14 06:04:28 localhost podman[248187]: @ - - [14/Oct/2025:10:04:28 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 147497 "" "Go-http-client/1.1" Oct 14 06:04:28 localhost podman[248187]: @ - - [14/Oct/2025:10:04:28 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19847 "" "Go-http-client/1.1" Oct 14 06:04:28 localhost ceph-mon[303906]: mon.np0005486733@1(peon).osd e81 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Oct 14 06:04:28 localhost ceph-mon[303906]: Reconfiguring crash.np0005486730 (monmap changed)... Oct 14 06:04:28 localhost ceph-mon[303906]: Reconfiguring daemon crash.np0005486730 on np0005486730.localdomain Oct 14 06:04:28 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' Oct 14 06:04:28 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' Oct 14 06:04:28 localhost ceph-mon[303906]: Reconfiguring crash.np0005486731 (monmap changed)... 
Oct 14 06:04:28 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005486731.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Oct 14 06:04:28 localhost ceph-mon[303906]: Reconfiguring daemon crash.np0005486731 on np0005486731.localdomain Oct 14 06:04:29 localhost nova_compute[297686]: 2025-10-14 10:04:29.773 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:04:30 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' Oct 14 06:04:30 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' Oct 14 06:04:30 localhost ceph-mon[303906]: Reconfiguring osd.2 (monmap changed)... Oct 14 06:04:30 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch Oct 14 06:04:30 localhost ceph-mon[303906]: Reconfiguring daemon osd.2 on np0005486731.localdomain Oct 14 06:04:30 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' Oct 14 06:04:31 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' Oct 14 06:04:31 localhost ceph-mon[303906]: Reconfiguring osd.4 (monmap changed)... 
Oct 14 06:04:31 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get", "entity": "osd.4"} : dispatch Oct 14 06:04:31 localhost ceph-mon[303906]: Reconfiguring daemon osd.4 on np0005486731.localdomain Oct 14 06:04:31 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' Oct 14 06:04:31 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' Oct 14 06:04:31 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005486731.onyaog", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Oct 14 06:04:32 localhost ceph-mon[303906]: Reconfiguring mds.mds.np0005486731.onyaog (monmap changed)... Oct 14 06:04:32 localhost ceph-mon[303906]: Reconfiguring daemon mds.mds.np0005486731.onyaog on np0005486731.localdomain Oct 14 06:04:32 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' Oct 14 06:04:32 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' Oct 14 06:04:32 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005486731.swasqz", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Oct 14 06:04:32 localhost nova_compute[297686]: 2025-10-14 10:04:32.336 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:04:33 localhost ceph-mon[303906]: mon.np0005486733@1(peon).osd e81 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Oct 14 06:04:33 localhost ceph-mon[303906]: Reconfiguring 
mgr.np0005486731.swasqz (monmap changed)... Oct 14 06:04:33 localhost ceph-mon[303906]: Reconfiguring daemon mgr.np0005486731.swasqz on np0005486731.localdomain Oct 14 06:04:33 localhost ceph-mon[303906]: Saving service mon spec with placement label:mon Oct 14 06:04:33 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' Oct 14 06:04:33 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' Oct 14 06:04:33 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' Oct 14 06:04:33 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Oct 14 06:04:33 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' Oct 14 06:04:34 localhost nova_compute[297686]: 2025-10-14 10:04:34.779 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:04:35 localhost ceph-mgr[302471]: ms_deliver_dispatch: unhandled message 0x55da4e7fa000 mon_map magic: 0 from mon.1 v2:172.18.0.108:3300/0 Oct 14 06:04:35 localhost ceph-mon[303906]: log_channel(cluster) log [INF] : mon.np0005486733 calling monitor election Oct 14 06:04:35 localhost ceph-mon[303906]: paxos.1).electionLogic(42) init, last seen epoch 42 Oct 14 06:04:35 localhost ceph-mon[303906]: mon.np0005486733@1(electing) e11 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Oct 14 06:04:35 localhost ceph-mon[303906]: mon.np0005486733@1(electing) e11 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Oct 14 06:04:37 localhost nova_compute[297686]: 2025-10-14 10:04:37.338 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:04:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041. Oct 14 06:04:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857. Oct 14 06:04:38 localhost podman[310791]: 2025-10-14 10:04:38.744131914 +0000 UTC m=+0.085415499 container health_status 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Oct 14 06:04:38 localhost podman[310791]: 2025-10-14 10:04:38.750396595 +0000 UTC m=+0.091680220 container exec_died 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 
'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Oct 14 06:04:38 localhost systemd[1]: 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041.service: Deactivated successfully. Oct 14 06:04:38 localhost openstack_network_exporter[250374]: ERROR 10:04:38 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Oct 14 06:04:38 localhost openstack_network_exporter[250374]: ERROR 10:04:38 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Oct 14 06:04:38 localhost openstack_network_exporter[250374]: ERROR 10:04:38 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Oct 14 06:04:38 localhost openstack_network_exporter[250374]: ERROR 10:04:38 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Oct 14 06:04:38 localhost openstack_network_exporter[250374]: Oct 14 06:04:38 localhost openstack_network_exporter[250374]: ERROR 10:04:38 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Oct 14 06:04:38 localhost openstack_network_exporter[250374]: Oct 14 06:04:38 localhost podman[310792]: 2025-10-14 10:04:38.800416342 +0000 UTC m=+0.137171559 container health_status 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.build-date=20251009, tcib_managed=true, 
container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5) Oct 14 06:04:38 localhost podman[310792]: 2025-10-14 10:04:38.833180523 +0000 UTC m=+0.169935810 container exec_died 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, 
org.label-schema.license=GPLv2, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true) Oct 14 06:04:38 localhost systemd[1]: 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857.service: Deactivated successfully. 
Oct 14 06:04:39 localhost nova_compute[297686]: 2025-10-14 10:04:39.793 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:04:40 localhost ceph-mds[301155]: mds.beacon.mds.np0005486733.tvstmf missed beacon ack from the monitors Oct 14 06:04:40 localhost ceph-mon[303906]: mon.np0005486733@1(peon) e11 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Oct 14 06:04:40 localhost ceph-mon[303906]: Remove daemons mon.np0005486732 Oct 14 06:04:40 localhost ceph-mon[303906]: Safe to remove mon.np0005486732: new quorum should be ['np0005486730', 'np0005486733', 'np0005486731'] (from ['np0005486730', 'np0005486733', 'np0005486731']) Oct 14 06:04:40 localhost ceph-mon[303906]: Removing monitor np0005486732 from monmap... Oct 14 06:04:40 localhost ceph-mon[303906]: Removing daemon mon.np0005486732 from np0005486732.localdomain -- ports [] Oct 14 06:04:40 localhost ceph-mon[303906]: mon.np0005486730 calling monitor election Oct 14 06:04:40 localhost ceph-mon[303906]: mon.np0005486733 calling monitor election Oct 14 06:04:40 localhost ceph-mon[303906]: mon.np0005486730 is new leader, mons np0005486730,np0005486733 in quorum (ranks 0,1) Oct 14 06:04:40 localhost ceph-mon[303906]: Health check failed: 1/3 mons down, quorum np0005486730,np0005486733 (MON_DOWN) Oct 14 06:04:40 localhost ceph-mon[303906]: Health detail: HEALTH_WARN 1 stray daemon(s) not managed by cephadm; 1 stray host(s) with 1 daemon(s) not managed by cephadm; 1/3 mons down, quorum np0005486730,np0005486733 Oct 14 06:04:40 localhost ceph-mon[303906]: [WRN] CEPHADM_STRAY_DAEMON: 1 stray daemon(s) not managed by cephadm Oct 14 06:04:40 localhost ceph-mon[303906]: stray daemon mgr.np0005486728.giajub on host np0005486728.localdomain not managed by cephadm Oct 14 06:04:40 localhost ceph-mon[303906]: [WRN] CEPHADM_STRAY_HOST: 1 stray host(s) with 1 daemon(s) not managed by cephadm Oct 14 
06:04:40 localhost ceph-mon[303906]: stray host np0005486728.localdomain has 1 stray daemons: ['mgr.np0005486728.giajub'] Oct 14 06:04:40 localhost ceph-mon[303906]: [WRN] MON_DOWN: 1/3 mons down, quorum np0005486730,np0005486733 Oct 14 06:04:40 localhost ceph-mon[303906]: mon.np0005486731 (rank 2) addr [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] is down (out of quorum) Oct 14 06:04:40 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' Oct 14 06:04:40 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' Oct 14 06:04:41 localhost ceph-mon[303906]: log_channel(cluster) log [INF] : mon.np0005486733 calling monitor election Oct 14 06:04:41 localhost ceph-mon[303906]: paxos.1).electionLogic(45) init, last seen epoch 45, mid-election, bumping Oct 14 06:04:41 localhost ceph-mon[303906]: mon.np0005486733@1(electing) e11 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Oct 14 06:04:41 localhost ceph-mon[303906]: mon.np0005486733@1(electing) e11 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Oct 14 06:04:41 localhost ceph-mon[303906]: mon.np0005486733@1(peon) e11 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Oct 14 06:04:41 localhost ceph-mon[303906]: mon.np0005486731 calling monitor election Oct 14 06:04:41 localhost ceph-mon[303906]: mon.np0005486730 calling monitor election Oct 14 06:04:41 localhost ceph-mon[303906]: mon.np0005486733 calling monitor election Oct 14 06:04:41 localhost ceph-mon[303906]: mon.np0005486730 is new leader, mons np0005486730,np0005486733,np0005486731 in quorum (ranks 0,1,2) Oct 14 06:04:41 localhost ceph-mon[303906]: Health check cleared: MON_DOWN (was: 1/3 mons down, quorum np0005486730,np0005486733) Oct 14 06:04:41 localhost ceph-mon[303906]: Health detail: HEALTH_WARN 1 stray daemon(s) not managed by cephadm; 1 
stray host(s) with 1 daemon(s) not managed by cephadm Oct 14 06:04:41 localhost ceph-mon[303906]: [WRN] CEPHADM_STRAY_DAEMON: 1 stray daemon(s) not managed by cephadm Oct 14 06:04:41 localhost ceph-mon[303906]: stray daemon mgr.np0005486728.giajub on host np0005486728.localdomain not managed by cephadm Oct 14 06:04:41 localhost ceph-mon[303906]: [WRN] CEPHADM_STRAY_HOST: 1 stray host(s) with 1 daemon(s) not managed by cephadm Oct 14 06:04:41 localhost ceph-mon[303906]: stray host np0005486728.localdomain has 1 stray daemons: ['mgr.np0005486728.giajub'] Oct 14 06:04:42 localhost nova_compute[297686]: 2025-10-14 10:04:42.378 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:04:42 localhost ceph-mon[303906]: Updating np0005486730.localdomain:/var/lib/ceph/fcadf6e2-9176-5818-a8d0-37b19acf8eaf/config/ceph.conf Oct 14 06:04:42 localhost ceph-mon[303906]: Updating np0005486731.localdomain:/var/lib/ceph/fcadf6e2-9176-5818-a8d0-37b19acf8eaf/config/ceph.conf Oct 14 06:04:42 localhost ceph-mon[303906]: Updating np0005486732.localdomain:/var/lib/ceph/fcadf6e2-9176-5818-a8d0-37b19acf8eaf/config/ceph.conf Oct 14 06:04:42 localhost ceph-mon[303906]: Updating np0005486733.localdomain:/var/lib/ceph/fcadf6e2-9176-5818-a8d0-37b19acf8eaf/config/ceph.conf Oct 14 06:04:42 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' Oct 14 06:04:42 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' Oct 14 06:04:42 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' Oct 14 06:04:42 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' Oct 14 06:04:42 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' Oct 14 06:04:42 localhost ceph-mon[303906]: 
from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' Oct 14 06:04:42 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' Oct 14 06:04:42 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' Oct 14 06:04:42 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' Oct 14 06:04:42 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005486730.ddfidc", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Oct 14 06:04:43 localhost ceph-mon[303906]: mon.np0005486733@1(peon).osd e81 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Oct 14 06:04:43 localhost ceph-mon[303906]: Reconfiguring mgr.np0005486730.ddfidc (monmap changed)... Oct 14 06:04:43 localhost ceph-mon[303906]: Reconfiguring daemon mgr.np0005486730.ddfidc on np0005486730.localdomain Oct 14 06:04:43 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' Oct 14 06:04:43 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' Oct 14 06:04:43 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005486730.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Oct 14 06:04:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29. 
Oct 14 06:04:44 localhost podman[311169]: 2025-10-14 10:04:44.756046904 +0000 UTC m=+0.091553186 container health_status fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_id=iscsid, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=iscsid, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team) Oct 14 06:04:44 localhost podman[311169]: 2025-10-14 10:04:44.768201975 +0000 UTC m=+0.103708247 container exec_died fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 
(image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=iscsid, container_name=iscsid, org.label-schema.license=GPLv2) Oct 14 06:04:44 localhost systemd[1]: fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29.service: Deactivated successfully. Oct 14 06:04:44 localhost nova_compute[297686]: 2025-10-14 10:04:44.831 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:04:44 localhost ceph-mon[303906]: Reconfiguring crash.np0005486730 (monmap changed)... 
Oct 14 06:04:44 localhost ceph-mon[303906]: Reconfiguring daemon crash.np0005486730 on np0005486730.localdomain Oct 14 06:04:44 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' Oct 14 06:04:44 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' Oct 14 06:04:44 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005486731.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Oct 14 06:04:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad. Oct 14 06:04:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec. Oct 14 06:04:45 localhost podman[311188]: 2025-10-14 10:04:45.756156695 +0000 UTC m=+0.094559098 container health_status aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 
'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Oct 14 06:04:45 localhost podman[311188]: 2025-10-14 10:04:45.76517037 +0000 UTC m=+0.103572803 container exec_died aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter) Oct 14 06:04:45 localhost systemd[1]: 
aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec.service: Deactivated successfully. Oct 14 06:04:45 localhost podman[311187]: 2025-10-14 10:04:45.855396734 +0000 UTC m=+0.197097718 container health_status 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, config_id=multipathd, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d) Oct 14 06:04:45 localhost podman[311187]: 2025-10-14 
10:04:45.869111933 +0000 UTC m=+0.210812917 container exec_died 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, org.label-schema.vendor=CentOS) Oct 14 06:04:45 localhost systemd[1]: 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad.service: Deactivated successfully. Oct 14 06:04:45 localhost ceph-mon[303906]: Reconfiguring crash.np0005486731 (monmap changed)... 
Oct 14 06:04:45 localhost ceph-mon[303906]: Reconfiguring daemon crash.np0005486731 on np0005486731.localdomain Oct 14 06:04:45 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' Oct 14 06:04:45 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' Oct 14 06:04:45 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch Oct 14 06:04:45 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' Oct 14 06:04:46 localhost ceph-mon[303906]: Reconfiguring osd.2 (monmap changed)... Oct 14 06:04:46 localhost ceph-mon[303906]: Reconfiguring daemon osd.2 on np0005486731.localdomain Oct 14 06:04:46 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' Oct 14 06:04:46 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' Oct 14 06:04:46 localhost ceph-mon[303906]: Reconfiguring osd.4 (monmap changed)... Oct 14 06:04:46 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get", "entity": "osd.4"} : dispatch Oct 14 06:04:46 localhost ceph-mon[303906]: Reconfiguring daemon osd.4 on np0005486731.localdomain Oct 14 06:04:47 localhost nova_compute[297686]: 2025-10-14 10:04:47.415 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:04:48 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' Oct 14 06:04:48 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' Oct 14 06:04:48 localhost ceph-mon[303906]: Reconfiguring mds.mds.np0005486731.onyaog (monmap changed)... 
Oct 14 06:04:48 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005486731.onyaog", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Oct 14 06:04:48 localhost ceph-mon[303906]: Reconfiguring daemon mds.mds.np0005486731.onyaog on np0005486731.localdomain Oct 14 06:04:48 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' Oct 14 06:04:48 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' Oct 14 06:04:48 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005486731.swasqz", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Oct 14 06:04:48 localhost ceph-mon[303906]: mon.np0005486733@1(peon).osd e81 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Oct 14 06:04:49 localhost ceph-mon[303906]: Reconfiguring mgr.np0005486731.swasqz (monmap changed)... 
Oct 14 06:04:49 localhost ceph-mon[303906]: Reconfiguring daemon mgr.np0005486731.swasqz on np0005486731.localdomain Oct 14 06:04:49 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' Oct 14 06:04:49 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Oct 14 06:04:49 localhost ceph-mon[303906]: Deploying daemon mon.np0005486732 on np0005486732.localdomain Oct 14 06:04:49 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' Oct 14 06:04:49 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' Oct 14 06:04:49 localhost ceph-mon[303906]: Reconfiguring crash.np0005486732 (monmap changed)... Oct 14 06:04:49 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005486732.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.818 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'name': 'test', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005486733.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '41187b090f3d4818a32baa37ce8a3991', 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'hostId': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'status': 'active', 'metadata': {}} discover_libvirt_polling 
/usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228 Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.819 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.839 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/cpu volume: 13150000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.842 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '892b5d2a-9ea0-4042-88b6-8ce217183d33', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 13150000000, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'timestamp': '2025-10-14T10:04:49.819890', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 
'message_id': '388fbd86-a8e5-11f0-9707-fa163e99780b', 'monotonic_time': 12306.032529166, 'message_signature': '8a775fbd492df3f1c3f1370e0498843b8f1c6dd1671c6047d2e7f2188125abe1'}]}, 'timestamp': '2025-10-14 10:04:49.841062', '_unique_id': '502a6991ed2f4b6ea4fbd059bf881d87'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.842 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.842 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.842 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.842 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.842 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.842 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.842 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.842 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.842 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 
06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.842 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.842 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.842 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.842 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.842 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.842 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.842 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.842 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.842 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.842 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.842 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: 
[Errno 111] Connection refused
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.842 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.842 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.842 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.842 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.842 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.842 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.842 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.842 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.842 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.842 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.842 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.842 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.842 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.842 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.842 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.842 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.842 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.842 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.842 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.842 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.842 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.842 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.842 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.842 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.842 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.842 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.842 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.842 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.842 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.842 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.842 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.842 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.842 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.842 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.844 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.844 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.864 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.write.bytes volume: 454656 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.865 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.867 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '10212dbe-9a60-49fa-b6c5-448472337b56', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 454656, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T10:04:49.844626', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '389382cc-a8e5-11f0-9707-fa163e99780b', 'monotonic_time': 12306.037424725, 'message_signature': '9817ba47c8589bb6759abf981ff5a7f1377605f8bd6f2394dfbd596c28a7fb9e'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T10:04:49.844626', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '38939a46-a8e5-11f0-9707-fa163e99780b', 'monotonic_time': 12306.037424725, 'message_signature': 'f76fadb7c0e4927db3e5490bfb3dd52187ba6b50e917e2996811a014a2342096'}]}, 'timestamp': '2025-10-14 10:04:49.866233', '_unique_id': '96862da3b1be44d7b5d931229b8f4453'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.867 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.867 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.867 12 ERROR oslo_messaging.notify.messaging yield
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.867 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.867 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.867 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.867 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.867 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.867 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.867 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.867 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.867 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.867 12 ERROR oslo_messaging.notify.messaging conn.connect()
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.867 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.867 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.867 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.867 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.867 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 06:04:49 localhost nova_compute[297686]: 2025-10-14 10:04:49.868 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.867 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.867 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.867 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.867 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.867 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.867 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.867 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.867 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.867 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.867 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.867 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.867 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.867 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.867 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.867 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.867 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.867 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.867 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.867 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.867 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.867 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.867 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.867 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.867 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.867 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.867 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.867 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.867 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.867 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.867 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.867 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.867 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.867 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.867 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.867 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.867 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.869 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.870 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.write.latency volume: 1178650666 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.870 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.write.latency volume: 22746151 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.872 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6cdbfe2c-5c1e-4d77-a658-79c800b2ce0e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1178650666, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T10:04:49.869986', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '38944162-a8e5-11f0-9707-fa163e99780b', 'monotonic_time': 12306.037424725, 'message_signature': 'eb783f41334434c96b2599e95a8575e76e9961841556754edfe652b8adc23aff'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 22746151, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T10:04:49.869986', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '389455b2-a8e5-11f0-9707-fa163e99780b', 'monotonic_time': 12306.037424725, 'message_signature': 'a39e786ca1618b119009a13c677c2b38d9cb79585a9dcf0139770dea2e5283f7'}]}, 'timestamp': '2025-10-14 10:04:49.871012', '_unique_id': '3827e931e6b34f3fb04f69e522d63183'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.872 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.872 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.872 12 ERROR oslo_messaging.notify.messaging yield
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.872 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.872 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.872 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.872 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.872 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.872 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.872 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.872 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.872 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.872 12 ERROR oslo_messaging.notify.messaging conn.connect()
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.872 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.872 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.872 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.872 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.872 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.872 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.872 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.872 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.872 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.872 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.872 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.872 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.872 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.872 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.872 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.872 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.872 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.872 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.872 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.872 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.872 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.872 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.872 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.872 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.872 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.872 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.872 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.872 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.872 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.872 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.872 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.872 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.872 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.872 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.872 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.872 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.872 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.872 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.872 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.872 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.872 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.873 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.873 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.884 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.885 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.886 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1130318a-ade7-4df0-9bff-d87fc2e422a8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T10:04:49.873825', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '38967cd4-a8e5-11f0-9707-fa163e99780b', 'monotonic_time': 12306.066559835, 'message_signature': '3994f8f90e23965b97a607534f5e7227f5086a3447197b821ad46f1822dc8df5'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T10:04:49.873825', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '38968a12-a8e5-11f0-9707-fa163e99780b', 'monotonic_time': 12306.066559835, 'message_signature': 'a6f12aeb4b0b2615c65d97dbe2bd50e358df932d20ad373e626d283a5d7b5f7e'}]}, 'timestamp': '2025-10-14 10:04:49.885385', '_unique_id': '72f2ad0770fe478280285a21315d7135'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.886 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.886 12 ERROR oslo_messaging.notify.messaging
File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.886 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.886 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.886 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.886 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.886 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.886 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.886 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.886 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.886 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.886 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:04:49 
localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.886 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.886 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.886 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.886 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.886 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.886 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.886 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.886 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.886 12 ERROR oslo_messaging.notify.messaging Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.886 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.886 12 ERROR oslo_messaging.notify.messaging Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.886 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call 
last): Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.886 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.886 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.886 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.886 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.886 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.886 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.886 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.886 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.886 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.886 12 ERROR oslo_messaging.notify.messaging 
return rpc_common.ConnectionContext(self._connection_pool, Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.886 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.886 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.886 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.886 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.886 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.886 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.886 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.886 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.886 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.886 12 ERROR oslo_messaging.notify.messaging 
self.connection.ensure_connection( Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.886 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.886 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.886 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.886 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.886 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.886 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.886 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.886 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.886 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.886 12 ERROR oslo_messaging.notify.messaging Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.887 12 INFO ceilometer.polling.manager [-] Polling pollster 
network.incoming.packets.drop in the context of pollsters Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.889 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.890 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b364729c-a5b7-4070-9b6b-65f3b6b4e520', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T10:04:49.887302', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': 
'3897440c-a8e5-11f0-9707-fa163e99780b', 'monotonic_time': 12306.080006376, 'message_signature': 'ed0a92b72ab71b50f1fbfffe49f83d1edd737dca4c459e43bb4554a251801d20'}]}, 'timestamp': '2025-10-14 10:04:49.890168', '_unique_id': '8a85880d1f8040fbba441867c362a95d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.890 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.890 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.890 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.890 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.890 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.890 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.890 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.890 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.890 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:04:49 
localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.890 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.890 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.890 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.890 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.890 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.890 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.890 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.890 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.890 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.890 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.890 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] 
Connection refused Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.890 12 ERROR oslo_messaging.notify.messaging Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.890 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.890 12 ERROR oslo_messaging.notify.messaging Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.890 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.890 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.890 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.890 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.890 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.890 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.890 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.890 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.890 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.890 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.890 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.890 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.890 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.890 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.890 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.890 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.890 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.890 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.890 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.890 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.890 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.890 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.890 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.890 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.890 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.890 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.890 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.890 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in 
_reraise_as_library_errors Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.890 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.890 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.890 12 ERROR oslo_messaging.notify.messaging Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.891 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.891 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.outgoing.bytes volume: 10162 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.892 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '2943afdc-877b-432a-92e3-1e3dbd3b31c0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 10162, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T10:04:49.891854', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': '3897927c-a8e5-11f0-9707-fa163e99780b', 'monotonic_time': 12306.080006376, 'message_signature': 'd1b15cc834d6548ea18ccfdb565797bd90982d49044aa9579080977af6381208'}]}, 'timestamp': '2025-10-14 10:04:49.892168', '_unique_id': 'bd406f761bef4f7a99cb7a240d6f9c4c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.892 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:04:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.892 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.892 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.892 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.892 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.892 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.892 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.892 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.892 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.892 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.892 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.892 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.892 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.892 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.892 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.892 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.892 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.892 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.892 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.892 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.892 12 ERROR oslo_messaging.notify.messaging Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.892 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.892 12 ERROR oslo_messaging.notify.messaging Oct 14 06:04:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.892 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.892 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.892 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.892 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.892 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.892 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.892 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.892 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.892 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.892 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.892 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.892 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.892 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.892 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.892 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.892 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.892 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.892 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.892 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.892 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 06:04:49
localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.892 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.892 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.892 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.892 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.892 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.892 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.892 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.892 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.892 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.892 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.892 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:04:49 localhost
ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.893 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.893 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.read.bytes volume: 35560448 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.894 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.895 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ed82ba86-505c-4ccd-991c-ffb3f08dc12a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35560448, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T10:04:49.893733', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id':
'0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '3897dc0a-a8e5-11f0-9707-fa163e99780b', 'monotonic_time': 12306.037424725, 'message_signature': 'a28123d822b15355b24565676938eef2d4447b0f17c01696ae5d08db6e3803fe'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T10:04:49.893733', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '3897e8da-a8e5-11f0-9707-fa163e99780b', 'monotonic_time': 12306.037424725, 'message_signature': 'cd119be1752f7ac3e2064d7ef1456d443de10ac0d06490196d079b0593060719'}]}, 'timestamp': '2025-10-14 10:04:49.894407', '_unique_id': 'd961ff876ae24925be1f6d5fa841b0b2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.895 12
ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.895 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.895 12 ERROR oslo_messaging.notify.messaging yield
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.895 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.895 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.895 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.895 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.895 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.895 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.895 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.895 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Oct 14 06:04:49 localhost
ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.895 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.895 12 ERROR oslo_messaging.notify.messaging conn.connect()
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.895 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.895 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.895 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.895 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.895 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.895 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.895 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.895 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.895 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]:
2025-10-14 10:04:49.895 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.895 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.895 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.895 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.895 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.895 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.895 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.895 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.895 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.895 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.895 12 ERROR oslo_messaging.notify.messaging File
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.895 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.895 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.895 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.895 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.895 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.895 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.895 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.895 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.895 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.895 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.895 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.895 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.895 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.895 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.895 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.895 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.895 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.895 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.895 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.895 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:04:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.895 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.896 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.896 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.incoming.packets volume: 79 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.897 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '835afcb3-f2fe-4d8a-8bb3-6bed3c25acf4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 79, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T10:04:49.896147', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type':
'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': '38983aec-a8e5-11f0-9707-fa163e99780b', 'monotonic_time': 12306.080006376, 'message_signature': '3c052e546edb06c2481f34df15481befe479b159b26a8e7fae9fa2ffd3fbc16b'}]}, 'timestamp': '2025-10-14 10:04:49.896499', '_unique_id': '8563a7f03bc64960bd154e0294fa6356'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.897 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.897 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.897 12 ERROR oslo_messaging.notify.messaging yield
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.897 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.897 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.897 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.897 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.897 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py",
line 877, in _connection_factory
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.897 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.897 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.897 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.897 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.897 12 ERROR oslo_messaging.notify.messaging conn.connect()
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.897 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.897 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.897 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.897 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.897 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.897 12
ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.897 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.897 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.897 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.897 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.897 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.897 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.897 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.897 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.897 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.897 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.897 12 ERROR
oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.897 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.897 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.897 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.897 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.897 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.897 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.897 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.897 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.897 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.897 12 ERROR
oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.897 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.897 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.897 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.897 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.897 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.897 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.897 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.897 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.897 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.897 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Oct 14 06:04:49
localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.897 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.897 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.897 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.897 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.898 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.898 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.899 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications.
Payload={'message_id': '92fb7f71-7d9c-4b0f-9c77-38dded0238da', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T10:04:49.898150', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': '38988916-a8e5-11f0-9707-fa163e99780b', 'monotonic_time': 12306.080006376, 'message_signature': '62be83ef2c6f3186969b3e23e9edcfde111407d9ced5efb6e9ab3c63a33b2b1c'}]}, 'timestamp': '2025-10-14 10:04:49.898504', '_unique_id': '0283316f4d9b46d0aadb16883b19a4a5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.899 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 06:04:49 localhost
ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.899 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.899 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.899 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.899 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.899 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.899 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.899 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.899 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.899 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.899 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.899 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.899 12 ERROR oslo_messaging.notify.messaging Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.899 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.899 12 ERROR oslo_messaging.notify.messaging Oct 14 06:04:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.899 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.899 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.899 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.899 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.899 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.899 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.899 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.899 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.899 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.899 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:04:49 
localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.899 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.899 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.899 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.899 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.899 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.899 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.899 12 ERROR oslo_messaging.notify.messaging Oct 14 06:04:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.900 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.900 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/memory.usage volume: 51.81640625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.901 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '457effb0-fe38-402d-912a-968b8809c17c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.81640625, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'timestamp': '2025-10-14T10:04:49.900214', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '3898d9de-a8e5-11f0-9707-fa163e99780b', 'monotonic_time': 12306.032529166, 
'message_signature': '31f07ab08c6f4d92e8e8ca612b96ac70cf2f151cfca0b175c5f9bc08f0a1edfc'}]}, 'timestamp': '2025-10-14 10:04:49.900568', '_unique_id': '1d884b15b6f34fd69c1942942817ea01'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.901 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.901 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.901 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.901 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.901 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.901 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.901 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.901 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.901 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.901 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.901 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.901 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:04:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.901 12 ERROR oslo_messaging.notify.messaging Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.901 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.901 12 ERROR oslo_messaging.notify.messaging Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.901 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.901 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.901 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.901 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", 
line 653, in _send Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.901 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.901 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.901 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.901 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.901 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.901 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.901 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.901 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.901 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.901 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.901 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:04:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.901 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.901 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.901 12 ERROR oslo_messaging.notify.messaging Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.902 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.902 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.read.requests volume: 1272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.902 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.903 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '627b46a5-a86e-4322-95f4-085b04373d60', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1272, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T10:04:49.902214', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '389926fa-a8e5-11f0-9707-fa163e99780b', 'monotonic_time': 12306.037424725, 'message_signature': 'c9933d0a1e39d6f67ffac1d58bc41345c9d942158a53a48413f405bc02790e98'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T10:04:49.902214', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '38993244-a8e5-11f0-9707-fa163e99780b', 'monotonic_time': 12306.037424725, 'message_signature': '8cbee74e4ec28de97cd6e26e70154faaa693e34f87ffafdd085288a3870ea0e3'}]}, 'timestamp': '2025-10-14 10:04:49.902805', '_unique_id': 'd9fa50fbbfb64e7ab142a2cb86b63638'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.903 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.903 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.903 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.903 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.903 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.903 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.903 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.903 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.903 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.903 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.903 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.903 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.903 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.903 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.903 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.903 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 
06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.903 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.903 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.903 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.903 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.903 12 ERROR oslo_messaging.notify.messaging Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.903 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.903 12 ERROR oslo_messaging.notify.messaging Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.903 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.903 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.903 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.903 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:04:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.903 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.903 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.903 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.903 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.903 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.903 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.903 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.903 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.903 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.903 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.903 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.903 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.903 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.903 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.903 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.903 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.903 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.903 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.903 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.903 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.903 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.903 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.903 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.903 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.903 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.903 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.903 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.904 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.904 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.write.requests volume: 53 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.904 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.905 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8075b74c-8b44-4106-9d1d-4f282004a9ab', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 53, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T10:04:49.904227', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '3899757e-a8e5-11f0-9707-fa163e99780b', 'monotonic_time': 12306.037424725, 'message_signature': 'd75c0afd2b7cfe3968540c69e07917ca0e5ee2e061b67857284906919ea9f243'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T10:04:49.904227', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '38998096-a8e5-11f0-9707-fa163e99780b', 'monotonic_time': 12306.037424725, 'message_signature': 'cecbe4f1c76d5ebbb28bf68bdd30091e1cc8a72ce7960382eaa3ffa0206a3bd3'}]}, 'timestamp': '2025-10-14 10:04:49.904836', '_unique_id': '4d39b82da2f344df9e027770fe7d2612'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.905 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.905 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.905 12 ERROR oslo_messaging.notify.messaging yield
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.905 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.905 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.905 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.905 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.905 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.905 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.905 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.905 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.905 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.905 12 ERROR oslo_messaging.notify.messaging conn.connect()
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.905 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.905 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.905 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.905 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.905 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.905 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.905 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.905 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.905 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.905 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.905 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.905 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.905 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.905 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.905 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.905 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.905 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.905 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.905 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.905 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.905 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.905 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.905 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.905 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.905 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.905 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.905 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.905 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.905 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.905 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.905 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.905 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.905 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.905 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.905 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.905 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.905 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.905 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.905 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.905 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.905 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.906 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.906 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.907 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.908 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b7a84ce0-951f-4337-b1a7-f867290fc564', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T10:04:49.906822', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '3899dd70-a8e5-11f0-9707-fa163e99780b', 'monotonic_time': 12306.066559835, 'message_signature': 'c3177bfaff20addc0edc23916f1ce21c62fcefcbdd45a41c68e7ff6cfabf3f55'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T10:04:49.906822', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '3899ec7a-a8e5-11f0-9707-fa163e99780b', 'monotonic_time': 12306.066559835, 'message_signature': '5ab0b0b2cab4b892e8386f72a0ab28a4a3a51b9674cd6e62d3cf92c1921e83a7'}]}, 'timestamp': '2025-10-14 10:04:49.907614', '_unique_id': '262dda92e130451f94035ece23483e39'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.908 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.908 12 ERROR oslo_messaging.notify.messaging yield
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.908 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.908 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.908 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.908 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.908 12 ERROR oslo_messaging.notify.messaging conn.connect()
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.908 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.908 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.908 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.908 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.908 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.908 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.908 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.908 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.908 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.908 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.908 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.908 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.908 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.908 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.908 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.908 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.908 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.908 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.908 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.908 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.908 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.908 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.908 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.908 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.909 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.909 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.910 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2b7220ca-ee3e-4b04-ab34-3f4108492a70', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T10:04:49.909619', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': '389a4c24-a8e5-11f0-9707-fa163e99780b', 'monotonic_time': 12306.080006376, 'message_signature': '95e2ef7a9593bc1fe5cb6974d447911b631873db5ba9d7381509c275f7ee7a51'}]}, 'timestamp': '2025-10-14 10:04:49.910083', '_unique_id': '24d396247ce74026aab5ab43cd897665'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.910 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.910 12 ERROR oslo_messaging.notify.messaging yield
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.910 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.910 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.910 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.910 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.910 12 ERROR
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.910 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.910 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.910 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.910 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.910 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.910 12 ERROR oslo_messaging.notify.messaging Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.910 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.910 12 ERROR oslo_messaging.notify.messaging Oct 
14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.910 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.910 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.910 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.910 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.910 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 
605, in _get_connection Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.910 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.910 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.910 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.910 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.910 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in 
ensure_connection Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.910 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.910 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.910 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.910 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.910 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.910 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.910 12 ERROR oslo_messaging.notify.messaging Oct 14 
06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.911 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.912 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.912 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.913 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '72d0ee36-f79e-4edd-b8b7-6b245e8141e2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T10:04:49.912041', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '389aa930-a8e5-11f0-9707-fa163e99780b', 'monotonic_time': 12306.066559835, 'message_signature': 'f239cdb52e6333c0653a2a63680f02fffb41152d678896bf1054a0a2d1039ec8'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T10:04:49.912041', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '389ab88a-a8e5-11f0-9707-fa163e99780b', 'monotonic_time': 12306.066559835, 'message_signature': 'f34c245e3167f1357018867c653a1db54f7c8d80c085752e39cbee03c106b13d'}]}, 'timestamp': '2025-10-14 10:04:49.912862', '_unique_id': 'a3b68c12f4b74d69a3831c5bc06394e2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.913 12 ERROR 
oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.913 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.913 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.913 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.913 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.913 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:04:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.913 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.913 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.913 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.913 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.913 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.913 12 ERROR oslo_messaging.notify.messaging Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.913 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 
2025-10-14 10:04:49.913 12 ERROR oslo_messaging.notify.messaging Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.913 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.913 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.913 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.913 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.913 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.913 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.913 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.913 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.913 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.913 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.913 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.913 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.913 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.913 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.913 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.913 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.913 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.913 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:04:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.913 12 ERROR oslo_messaging.notify.messaging Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.914 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.914 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.915 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.incoming.bytes volume: 8190 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.916 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '6d932546-e7c8-468e-8e0f-69397d1c1409', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 8190, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T10:04:49.915042', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': '389b1e9c-a8e5-11f0-9707-fa163e99780b', 'monotonic_time': 12306.080006376, 'message_signature': '0b69131768c81c8b57d4050774de41231b8b30827c869dbaecb8e9659f38e5a2'}]}, 'timestamp': '2025-10-14 10:04:49.915478', '_unique_id': 'fce4e05cdf1a4c10928786c2eaad55ee'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.916 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:04:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.916 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.916 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.916 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.916 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.916 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.916 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.916 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.916 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.916 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.916 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.916 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.916 12 ERROR oslo_messaging.notify.messaging Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.916 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.916 12 ERROR oslo_messaging.notify.messaging Oct 14 06:04:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.916 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.916 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.916 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.916 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.916 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.916 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.916 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.916 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.916 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.916 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.916 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.916 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.916 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.916 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.916 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.916 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.916 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.916 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.917 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.read.latency volume: 1739521236 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.917 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.read.latency volume: 110324003 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.918 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e0f07368-e441-488b-88c0-7479d8ec14c4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1739521236, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T10:04:49.917052', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '389b6a3c-a8e5-11f0-9707-fa163e99780b', 'monotonic_time': 12306.037424725, 'message_signature': '5744648cbfcf672934fefc41e72384865d4bd2eac5629b004b3b6d20a868bced'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 110324003, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T10:04:49.917052', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '389b7446-a8e5-11f0-9707-fa163e99780b', 'monotonic_time': 12306.037424725, 'message_signature': 'f5abad43465c0574d76e12ce7f4a5b84dc9b503ac1233af746b44251afea808f'}]}, 'timestamp': '2025-10-14 10:04:49.917572', '_unique_id': 'b603eb5fe833474aa4f3434e008a4e89'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.918 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.918 12 ERROR oslo_messaging.notify.messaging yield
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.918 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.918 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.918 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.918 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.918 12 ERROR oslo_messaging.notify.messaging conn.connect()
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.918 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.918 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.918 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.918 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.918 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.918 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.918 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.918 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.918 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.918 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.918 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.918 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.918 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.918 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.918 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.918 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.918 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.918 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.918 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.918 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.918 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.918 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.918 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.918 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.918 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.918 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.919 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.919 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6c5cbe2d-0c1c-42cb-b60b-9e5a6021d7fd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T10:04:49.919019', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': '389bb71c-a8e5-11f0-9707-fa163e99780b', 'monotonic_time': 12306.080006376, 'message_signature': 'c0bc945bcfd5949303b37b23e7cc3ce2e06ad2109469968ebbcee7f3075ed600'}]}, 'timestamp': '2025-10-14 10:04:49.919304', '_unique_id': '0e2d7cee143049d8982207852a7d4cee'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.919 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.919 12 ERROR oslo_messaging.notify.messaging yield
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.919 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.919 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.919 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.919 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.919 12 ERROR oslo_messaging.notify.messaging conn.connect()
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.919 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.919 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.919 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.919 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.919 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.919 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.919 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.919 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.919 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.919 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.919 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.919 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.919 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.919 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.919 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.919 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.919 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.919 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.919 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.919 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.919 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.919 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.919 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.919 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.920 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.920 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.921 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2648a8fe-648f-445c-aed1-87a79bc92461', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T10:04:49.920674', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': '389bf93e-a8e5-11f0-9707-fa163e99780b', 'monotonic_time': 12306.080006376, 'message_signature': '5efd6bd25db1cecd6f68a079571b14d1f406e40d2112ede85adfa575930cd793'}]}, 'timestamp': '2025-10-14 10:04:49.921003', '_unique_id': 'b9af6331333e400c893c78a85372805d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.921 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.921 12 ERROR oslo_messaging.notify.messaging yield
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.921 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.921 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.921 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.921 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.921 12 ERROR oslo_messaging.notify.messaging conn.connect()
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.921 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.921 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.921 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Oct 14 06:04:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.921 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.921 12 ERROR oslo_messaging.notify.messaging Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.921 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.921 12 ERROR oslo_messaging.notify.messaging Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.921 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.921 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.921 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.921 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:04:49 
localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.921 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.921 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.921 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.921 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.921 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) 
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.921 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.921 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.921 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.921 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.921 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.921 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.921 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.921 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.921 12 ERROR oslo_messaging.notify.messaging Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.922 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.922 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.923 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '2a8387e6-8170-4ac1-8329-52b2f935cd81', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T10:04:49.922312', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': '389c38cc-a8e5-11f0-9707-fa163e99780b', 'monotonic_time': 12306.080006376, 'message_signature': 'aad488b56bebf82daaaf6a315399261d560d320660616e8a5f55c9612f8d24aa'}]}, 'timestamp': '2025-10-14 10:04:49.922638', '_unique_id': '9c55c556673d460ea97e1be3ff713a6a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.923 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 
2025-10-14 10:04:49.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.923 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.923 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.923 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.923 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.923 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.923 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.923 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.923 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.923 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.923 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.923 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.923 12 ERROR oslo_messaging.notify.messaging Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.923 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.923 12 ERROR oslo_messaging.notify.messaging Oct 14 06:04:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.923 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.923 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.923 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.923 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.923 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.923 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.923 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.923 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.923 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.923 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:04:49 
localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.923 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.923 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.923 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.923 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.923 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.923 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.923 12 ERROR oslo_messaging.notify.messaging Oct 14 06:04:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.923 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.924 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.outgoing.packets volume: 118 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.924 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c78738e9-4a62-4d20-b6aa-4b709d77dfed', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 118, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T10:04:49.923985', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 
'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': '389c790e-a8e5-11f0-9707-fa163e99780b', 'monotonic_time': 12306.080006376, 'message_signature': 'b48646329468c67e1122e80d31d9f1c86fb554b578b18521c9fd55712d3fe310'}]}, 'timestamp': '2025-10-14 10:04:49.924292', '_unique_id': '8410efb0f97141d2a372ad9563cff347'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.924 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.924 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.924 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.924 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.924 12 ERROR 
oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.924 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.924 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.924 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.924 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.924 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 
2025-10-14 10:04:49.924 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.924 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.924 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.924 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.924 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.924 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.924 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.924 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.924 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.924 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.924 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.924 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.924 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.924 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.924 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.924 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.924 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.924 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.924 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.924 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 06:04:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:04:49.924 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:04:50 localhost ceph-mon[303906]: Reconfiguring daemon crash.np0005486732 on np0005486732.localdomain
Oct 14 06:04:51 localhost ceph-mon[303906]: mon.np0005486733@1(peon) e11 adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints
Oct 14 06:04:51 localhost ceph-mon[303906]: mon.np0005486733@1(peon) e11 adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints
Oct 14 06:04:51 localhost ceph-mon[303906]: mon.np0005486733@1(peon) e11 adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints
Oct 14 06:04:52 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz'
Oct 14 06:04:52 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz'
Oct 14 06:04:52 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz'
Oct 14 06:04:52 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz'
Oct 14 06:04:52 localhost ceph-mon[303906]: Reconfiguring osd.1 (monmap changed)...
Oct 14 06:04:52 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch
Oct 14 06:04:52 localhost ceph-mon[303906]: Reconfiguring daemon osd.1 on np0005486732.localdomain
Oct 14 06:04:52 localhost nova_compute[297686]: 2025-10-14 10:04:52.418 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 14 06:04:53 localhost ceph-mon[303906]: mon.np0005486733@1(peon).osd e81 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct 14 06:04:53 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz'
Oct 14 06:04:53 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz'
Oct 14 06:04:53 localhost ceph-mon[303906]: Reconfiguring osd.5 (monmap changed)...
Oct 14 06:04:53 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch
Oct 14 06:04:53 localhost ceph-mon[303906]: Reconfiguring daemon osd.5 on np0005486732.localdomain
Oct 14 06:04:53 localhost ceph-mon[303906]: mon.np0005486733@1(peon) e11 adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints
Oct 14 06:04:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f.
Oct 14 06:04:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917.
Oct 14 06:04:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d.
Oct 14 06:04:54 localhost systemd[1]: tmp-crun.cS0Edw.mount: Deactivated successfully.
Oct 14 06:04:54 localhost podman[311230]: 2025-10-14 10:04:54.757283709 +0000 UTC m=+0.088984708 container health_status 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, org.label-schema.license=GPLv2, managed_by=edpm_ansible, config_id=edpm, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image) Oct 14 06:04:54 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' Oct 14 06:04:54 
localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' Oct 14 06:04:54 localhost ceph-mon[303906]: Reconfiguring mds.mds.np0005486732.xkownj (monmap changed)... Oct 14 06:04:54 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005486732.xkownj", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Oct 14 06:04:54 localhost ceph-mon[303906]: Reconfiguring daemon mds.mds.np0005486732.xkownj on np0005486732.localdomain Oct 14 06:04:54 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' Oct 14 06:04:54 localhost podman[311228]: 2025-10-14 10:04:54.798860428 +0000 UTC m=+0.136609572 container health_status 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2) Oct 14 06:04:54 localhost podman[311230]: 2025-10-14 10:04:54.851790334 +0000 UTC m=+0.183491343 container exec_died 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_id=edpm, org.label-schema.build-date=20251009, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.vendor=CentOS, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d) Oct 14 06:04:54 localhost systemd[1]: 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d.service: Deactivated successfully. Oct 14 06:04:54 localhost podman[311228]: 2025-10-14 10:04:54.869002029 +0000 UTC m=+0.206751163 container exec_died 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_managed=true, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team) Oct 14 06:04:54 localhost podman[311229]: 2025-10-14 10:04:54.871120664 +0000 UTC m=+0.203279407 container health_status 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, 
name=openstack_network_exporter, health_status=healthy, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, vendor=Red Hat, Inc., version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, io.openshift.expose-services=, config_id=edpm, name=ubi9-minimal, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}) Oct 14 06:04:54 localhost nova_compute[297686]: 2025-10-14 10:04:54.874 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:04:54 localhost podman[311229]: 2025-10-14 10:04:54.885091581 +0000 UTC m=+0.217250394 container exec_died 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, name=ubi9-minimal, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package 
manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, build-date=2025-08-20T13:12:41, version=9.6, config_id=edpm, io.openshift.expose-services=, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal) Oct 14 06:04:54 localhost systemd[1]: 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917.service: Deactivated successfully. Oct 14 06:04:54 localhost systemd[1]: 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f.service: Deactivated successfully. Oct 14 06:04:55 localhost ceph-mon[303906]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #25. Immutable memtables: 0. Oct 14 06:04:55 localhost ceph-mon[303906]: rocksdb: (Original Log Time 2025/10/14-10:04:55.616506) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0 Oct 14 06:04:55 localhost ceph-mon[303906]: rocksdb: [db/flush_job.cc:856] [default] [JOB 11] Flushing memtable with next log file: 25 Oct 14 06:04:55 localhost ceph-mon[303906]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760436295616575, "job": 11, "event": "flush_started", "num_memtables": 1, "num_entries": 2201, "num_deletes": 251, "total_data_size": 3930160, "memory_usage": 3985520, "flush_reason": "Manual Compaction"} Oct 14 06:04:55 localhost ceph-mon[303906]: rocksdb: [db/flush_job.cc:885] [default] [JOB 11] Level-0 flush table #26: started Oct 14 06:04:55 localhost ceph-mon[303906]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760436295629412, "cf_name": "default", "job": 11, "event": "table_file_creation", "file_number": 26, "file_size": 2031772, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 16175, "largest_seqno": 18371, "table_properties": {"data_size": 2023019, "index_size": 5006, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2693, "raw_key_size": 24514, 
"raw_average_key_size": 23, "raw_value_size": 2003209, "raw_average_value_size": 1884, "num_data_blocks": 219, "num_entries": 1063, "num_filter_entries": 1063, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1760436245, "oldest_key_time": 1760436245, "file_creation_time": 1760436295, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "84a9ba57-7643-4e19-a68b-d3c5f7942ded", "db_session_id": "ABXPLKSCGGN31DHJT8DW", "orig_file_number": 26, "seqno_to_time_mapping": "N/A"}} Oct 14 06:04:55 localhost ceph-mon[303906]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 11] Flush lasted 12960 microseconds, and 6048 cpu microseconds. Oct 14 06:04:55 localhost ceph-mon[303906]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. 
Oct 14 06:04:55 localhost ceph-mon[303906]: rocksdb: (Original Log Time 2025/10/14-10:04:55.629462) [db/flush_job.cc:967] [default] [JOB 11] Level-0 flush table #26: 2031772 bytes OK Oct 14 06:04:55 localhost ceph-mon[303906]: rocksdb: (Original Log Time 2025/10/14-10:04:55.629491) [db/memtable_list.cc:519] [default] Level-0 commit table #26 started Oct 14 06:04:55 localhost ceph-mon[303906]: rocksdb: (Original Log Time 2025/10/14-10:04:55.632086) [db/memtable_list.cc:722] [default] Level-0 commit table #26: memtable #1 done Oct 14 06:04:55 localhost ceph-mon[303906]: rocksdb: (Original Log Time 2025/10/14-10:04:55.632151) EVENT_LOG_v1 {"time_micros": 1760436295632136, "job": 11, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0} Oct 14 06:04:55 localhost ceph-mon[303906]: rocksdb: (Original Log Time 2025/10/14-10:04:55.632188) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25 Oct 14 06:04:55 localhost ceph-mon[303906]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 11] Try to delete WAL files size 3919278, prev total WAL file size 3919602, number of live WAL files 2. Oct 14 06:04:55 localhost ceph-mon[303906]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005486733/store.db/000022.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Oct 14 06:04:55 localhost ceph-mon[303906]: rocksdb: (Original Log Time 2025/10/14-10:04:55.633492) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740033373535' seq:72057594037927935, type:22 .. 
'6D6772737461740034303036' seq:0, type:0; will stop at (end) Oct 14 06:04:55 localhost ceph-mon[303906]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 12] Compacting 1@0 + 1@6 files to L6, score -1.00 Oct 14 06:04:55 localhost ceph-mon[303906]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 11 Base level 0, inputs: [26(1984KB)], [24(15MB)] Oct 14 06:04:55 localhost ceph-mon[303906]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760436295633579, "job": 12, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [26], "files_L6": [24], "score": -1, "input_data_size": 17856574, "oldest_snapshot_seqno": -1} Oct 14 06:04:55 localhost ceph-mon[303906]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 12] Generated table #27: 10898 keys, 15796220 bytes, temperature: kUnknown Oct 14 06:04:55 localhost ceph-mon[303906]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760436295700892, "cf_name": "default", "job": 12, "event": "table_file_creation", "file_number": 27, "file_size": 15796220, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 15733591, "index_size": 34304, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 27269, "raw_key_size": 290876, "raw_average_key_size": 26, "raw_value_size": 15547074, "raw_average_value_size": 1426, "num_data_blocks": 1310, "num_entries": 10898, "num_filter_entries": 10898, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; 
strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1760436113, "oldest_key_time": 0, "file_creation_time": 1760436295, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "84a9ba57-7643-4e19-a68b-d3c5f7942ded", "db_session_id": "ABXPLKSCGGN31DHJT8DW", "orig_file_number": 27, "seqno_to_time_mapping": "N/A"}} Oct 14 06:04:55 localhost ceph-mon[303906]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. Oct 14 06:04:55 localhost ceph-mon[303906]: rocksdb: (Original Log Time 2025/10/14-10:04:55.701182) [db/compaction/compaction_job.cc:1663] [default] [JOB 12] Compacted 1@0 + 1@6 files to L6 => 15796220 bytes Oct 14 06:04:55 localhost ceph-mon[303906]: rocksdb: (Original Log Time 2025/10/14-10:04:55.702964) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 265.0 rd, 234.4 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.9, 15.1 +0.0 blob) out(15.1 +0.0 blob), read-write-amplify(16.6) write-amplify(7.8) OK, records in: 11420, records dropped: 522 output_compression: NoCompression Oct 14 06:04:55 localhost ceph-mon[303906]: rocksdb: (Original Log Time 2025/10/14-10:04:55.702997) EVENT_LOG_v1 {"time_micros": 1760436295702984, "job": 12, "event": "compaction_finished", "compaction_time_micros": 67380, "compaction_time_cpu_micros": 39904, "output_level": 6, "num_output_files": 1, "total_output_size": 15796220, "num_input_records": 11420, "num_output_records": 10898, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]} Oct 14 06:04:55 localhost ceph-mon[303906]: rocksdb: [file/delete_scheduler.cc:74] Deleted file 
/var/lib/ceph/mon/ceph-np0005486733/store.db/000026.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Oct 14 06:04:55 localhost ceph-mon[303906]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760436295703550, "job": 12, "event": "table_file_deletion", "file_number": 26} Oct 14 06:04:55 localhost ceph-mon[303906]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005486733/store.db/000024.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Oct 14 06:04:55 localhost ceph-mon[303906]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760436295706086, "job": 12, "event": "table_file_deletion", "file_number": 24} Oct 14 06:04:55 localhost ceph-mon[303906]: rocksdb: (Original Log Time 2025/10/14-10:04:55.633393) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Oct 14 06:04:55 localhost ceph-mon[303906]: rocksdb: (Original Log Time 2025/10/14-10:04:55.706153) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Oct 14 06:04:55 localhost ceph-mon[303906]: rocksdb: (Original Log Time 2025/10/14-10:04:55.706161) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Oct 14 06:04:55 localhost ceph-mon[303906]: rocksdb: (Original Log Time 2025/10/14-10:04:55.706164) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Oct 14 06:04:55 localhost ceph-mon[303906]: rocksdb: (Original Log Time 2025/10/14-10:04:55.706167) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Oct 14 06:04:55 localhost ceph-mon[303906]: rocksdb: (Original Log Time 2025/10/14-10:04:55.706170) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Oct 14 06:04:55 localhost ceph-mon[303906]: mon.np0005486733@1(peon) e11 adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints Oct 14 06:04:55 localhost ceph-mon[303906]: 
from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' Oct 14 06:04:55 localhost ceph-mon[303906]: Reconfiguring mgr.np0005486732.pasqzz (monmap changed)... Oct 14 06:04:55 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005486732.pasqzz", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Oct 14 06:04:55 localhost ceph-mon[303906]: Reconfiguring daemon mgr.np0005486732.pasqzz on np0005486732.localdomain Oct 14 06:04:55 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' Oct 14 06:04:55 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' Oct 14 06:04:55 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005486733.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Oct 14 06:04:55 localhost ceph-mon[303906]: mon.np0005486733@1(peon) e11 adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints Oct 14 06:04:56 localhost podman[311343]: Oct 14 06:04:56 localhost podman[311343]: 2025-10-14 10:04:56.290747071 +0000 UTC m=+0.079196768 container create 25e1462cd973cfa3c0fa7e02d00daf399db078a02945e81449f3d6157e27ed88 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=elegant_leavitt, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.33.12, build-date=2025-09-24T08:57:55, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, maintainer=Guillaume Abrioux , name=rhceph, 
io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_BRANCH=main, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, version=7, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, io.openshift.expose-services=, RELEASE=main, ceph=True, release=553) Oct 14 06:04:56 localhost systemd[1]: Started libpod-conmon-25e1462cd973cfa3c0fa7e02d00daf399db078a02945e81449f3d6157e27ed88.scope. Oct 14 06:04:56 localhost systemd[1]: Started libcrun container. Oct 14 06:04:56 localhost podman[311343]: 2025-10-14 10:04:56.256235868 +0000 UTC m=+0.044685545 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Oct 14 06:04:56 localhost podman[311343]: 2025-10-14 10:04:56.365060541 +0000 UTC m=+0.153510178 container init 25e1462cd973cfa3c0fa7e02d00daf399db078a02945e81449f3d6157e27ed88 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=elegant_leavitt, io.buildah.version=1.33.12, ceph=True, vcs-type=git, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=Guillaume Abrioux , CEPH_POINT_RELEASE=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., GIT_BRANCH=main, io.openshift.tags=rhceph ceph, release=553, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.component=rhceph-container, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, 
distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, architecture=x86_64, build-date=2025-09-24T08:57:55) Oct 14 06:04:56 localhost podman[311343]: 2025-10-14 10:04:56.377662255 +0000 UTC m=+0.166111892 container start 25e1462cd973cfa3c0fa7e02d00daf399db078a02945e81449f3d6157e27ed88 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=elegant_leavitt, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/agreements, CEPH_POINT_RELEASE=, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.component=rhceph-container, distribution-scope=public, vendor=Red Hat, Inc., release=553, GIT_BRANCH=main, maintainer=Guillaume Abrioux , vcs-type=git, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, ceph=True, build-date=2025-09-24T08:57:55, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553) Oct 14 06:04:56 localhost podman[311343]: 2025-10-14 10:04:56.37847364 +0000 UTC m=+0.166923317 container attach 25e1462cd973cfa3c0fa7e02d00daf399db078a02945e81449f3d6157e27ed88 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=elegant_leavitt, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, CEPH_POINT_RELEASE=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, 
com.redhat.license_terms=https://www.redhat.com/agreements, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, release=553, io.buildah.version=1.33.12, ceph=True, GIT_CLEAN=True, build-date=2025-09-24T08:57:55, io.openshift.expose-services=, name=rhceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux ) Oct 14 06:04:56 localhost elegant_leavitt[311358]: 167 167 Oct 14 06:04:56 localhost systemd[1]: libpod-25e1462cd973cfa3c0fa7e02d00daf399db078a02945e81449f3d6157e27ed88.scope: Deactivated successfully. Oct 14 06:04:56 localhost podman[311343]: 2025-10-14 10:04:56.383945797 +0000 UTC m=+0.172395484 container died 25e1462cd973cfa3c0fa7e02d00daf399db078a02945e81449f3d6157e27ed88 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=elegant_leavitt, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, vcs-type=git, CEPH_POINT_RELEASE=, architecture=x86_64, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, GIT_CLEAN=True, build-date=2025-09-24T08:57:55, vendor=Red Hat, Inc., GIT_BRANCH=main, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=553, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.33.12, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, ceph=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, 
RELEASE=main, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=Guillaume Abrioux , io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553) Oct 14 06:04:56 localhost podman[311363]: 2025-10-14 10:04:56.485564819 +0000 UTC m=+0.090100981 container remove 25e1462cd973cfa3c0fa7e02d00daf399db078a02945e81449f3d6157e27ed88 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=elegant_leavitt, com.redhat.component=rhceph-container, release=553, RELEASE=main, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.33.12, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, name=rhceph, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, build-date=2025-09-24T08:57:55, vcs-type=git, maintainer=Guillaume Abrioux , io.openshift.tags=rhceph ceph, GIT_CLEAN=True, architecture=x86_64, version=7, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True) Oct 14 06:04:56 localhost systemd[1]: libpod-conmon-25e1462cd973cfa3c0fa7e02d00daf399db078a02945e81449f3d6157e27ed88.scope: Deactivated successfully. Oct 14 06:04:56 localhost systemd[1]: var-lib-containers-storage-overlay-de1f65eb4ea4628b1aa578f313e4fdd8961bd57a7ad22c0c58b45aaf87860e13-merged.mount: Deactivated successfully. Oct 14 06:04:56 localhost ceph-mon[303906]: Reconfiguring crash.np0005486733 (monmap changed)... 
Oct 14 06:04:56 localhost ceph-mon[303906]: Reconfiguring daemon crash.np0005486733 on np0005486733.localdomain Oct 14 06:04:56 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' Oct 14 06:04:56 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' Oct 14 06:04:56 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get", "entity": "osd.0"} : dispatch Oct 14 06:04:57 localhost podman[311433]: Oct 14 06:04:57 localhost podman[311433]: 2025-10-14 10:04:57.202048912 +0000 UTC m=+0.079611062 container create 0ed4134b98a5fac098a3fcc95873171057c2822a070030e384550a5204cc7998 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=awesome_hertz, build-date=2025-09-24T08:57:55, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, io.openshift.expose-services=, architecture=x86_64, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.component=rhceph-container, release=553, CEPH_POINT_RELEASE=, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux , com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, distribution-scope=public, io.buildah.version=1.33.12, ceph=True, version=7, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, name=rhceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.) Oct 14 06:04:57 localhost systemd[1]: Started libpod-conmon-0ed4134b98a5fac098a3fcc95873171057c2822a070030e384550a5204cc7998.scope. 
Oct 14 06:04:57 localhost systemd[1]: Started libcrun container. Oct 14 06:04:57 localhost podman[311433]: 2025-10-14 10:04:57.260137635 +0000 UTC m=+0.137699795 container init 0ed4134b98a5fac098a3fcc95873171057c2822a070030e384550a5204cc7998 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=awesome_hertz, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, distribution-scope=public, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, build-date=2025-09-24T08:57:55, release=553, maintainer=Guillaume Abrioux , RELEASE=main, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, name=rhceph, io.buildah.version=1.33.12, ceph=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, io.openshift.expose-services=) Oct 14 06:04:57 localhost podman[311433]: 2025-10-14 10:04:57.166754324 +0000 UTC m=+0.044316524 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Oct 14 06:04:57 localhost podman[311433]: 2025-10-14 10:04:57.269327396 +0000 UTC m=+0.146889556 container start 0ed4134b98a5fac098a3fcc95873171057c2822a070030e384550a5204cc7998 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=awesome_hertz, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/agreements, version=7, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully 
featured and supported base image., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, ceph=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, build-date=2025-09-24T08:57:55, vendor=Red Hat, Inc., io.buildah.version=1.33.12, distribution-scope=public, name=rhceph, release=553, maintainer=Guillaume Abrioux , com.redhat.component=rhceph-container, GIT_BRANCH=main) Oct 14 06:04:57 localhost podman[311433]: 2025-10-14 10:04:57.269557743 +0000 UTC m=+0.147119943 container attach 0ed4134b98a5fac098a3fcc95873171057c2822a070030e384550a5204cc7998 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=awesome_hertz, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, vcs-type=git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, version=7, build-date=2025-09-24T08:57:55, release=553, GIT_CLEAN=True, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, RELEASE=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.component=rhceph-container, name=rhceph, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=Guillaume Abrioux , io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and 
supported base image., CEPH_POINT_RELEASE=) Oct 14 06:04:57 localhost awesome_hertz[311448]: 167 167 Oct 14 06:04:57 localhost systemd[1]: libpod-0ed4134b98a5fac098a3fcc95873171057c2822a070030e384550a5204cc7998.scope: Deactivated successfully. Oct 14 06:04:57 localhost podman[311433]: 2025-10-14 10:04:57.273994258 +0000 UTC m=+0.151556438 container died 0ed4134b98a5fac098a3fcc95873171057c2822a070030e384550a5204cc7998 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=awesome_hertz, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.33.12, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, maintainer=Guillaume Abrioux , RELEASE=main, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, io.openshift.expose-services=, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, com.redhat.component=rhceph-container, release=553, CEPH_POINT_RELEASE=, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhceph ceph, version=7, build-date=2025-09-24T08:57:55, name=rhceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, architecture=x86_64) Oct 14 06:04:57 localhost podman[311453]: 2025-10-14 10:04:57.382527481 +0000 UTC m=+0.097732665 container remove 0ed4134b98a5fac098a3fcc95873171057c2822a070030e384550a5204cc7998 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=awesome_hertz, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_BRANCH=main, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat Ceph Storage 7, version=7, RELEASE=main, 
maintainer=Guillaume Abrioux , CEPH_POINT_RELEASE=, io.openshift.expose-services=, ceph=True, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, name=rhceph, build-date=2025-09-24T08:57:55, io.openshift.tags=rhceph ceph, io.buildah.version=1.33.12, vcs-type=git, com.redhat.component=rhceph-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, release=553) Oct 14 06:04:57 localhost systemd[1]: libpod-conmon-0ed4134b98a5fac098a3fcc95873171057c2822a070030e384550a5204cc7998.scope: Deactivated successfully. Oct 14 06:04:57 localhost nova_compute[297686]: 2025-10-14 10:04:57.467 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:04:57 localhost systemd[1]: var-lib-containers-storage-overlay-ff2650ea01eaa0f114cdf76d2a562e5d0c851f6ff24e10748a549b22689a3a70-merged.mount: Deactivated successfully. 
Oct 14 06:04:57 localhost ovn_metadata_agent[163050]: 2025-10-14 10:04:57.774 163055 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Oct 14 06:04:57 localhost ovn_metadata_agent[163050]: 2025-10-14 10:04:57.774 163055 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Oct 14 06:04:57 localhost ovn_metadata_agent[163050]: 2025-10-14 10:04:57.775 163055 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Oct 14 06:04:57 localhost ceph-mon[303906]: mon.np0005486733@1(peon) e11 adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints Oct 14 06:04:57 localhost ceph-mon[303906]: Reconfiguring osd.0 (monmap changed)... 
Oct 14 06:04:57 localhost ceph-mon[303906]: Reconfiguring daemon osd.0 on np0005486733.localdomain Oct 14 06:04:57 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' Oct 14 06:04:57 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' Oct 14 06:04:57 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get", "entity": "osd.3"} : dispatch Oct 14 06:04:58 localhost podman[311529]: Oct 14 06:04:58 localhost podman[311529]: 2025-10-14 10:04:58.207032831 +0000 UTC m=+0.077669421 container create 36a653e916968f14563962e24ae9f5809d30b0df5485ae81aae26f4fc6f4b8d5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=recursing_hugle, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, build-date=2025-09-24T08:57:55, name=rhceph, version=7, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, release=553, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux , vcs-type=git, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.tags=rhceph ceph, ceph=True, io.buildah.version=1.33.12, distribution-scope=public, GIT_BRANCH=main) Oct 14 06:04:58 localhost systemd[1]: Started libpod-conmon-36a653e916968f14563962e24ae9f5809d30b0df5485ae81aae26f4fc6f4b8d5.scope. 
Oct 14 06:04:58 localhost systemd[1]: Started libcrun container. Oct 14 06:04:58 localhost podman[311529]: 2025-10-14 10:04:58.259085191 +0000 UTC m=+0.129721761 container init 36a653e916968f14563962e24ae9f5809d30b0df5485ae81aae26f4fc6f4b8d5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=recursing_hugle, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.license_terms=https://www.redhat.com/agreements, release=553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, distribution-scope=public, RELEASE=main, GIT_CLEAN=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, build-date=2025-09-24T08:57:55, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, maintainer=Guillaume Abrioux , ceph=True, CEPH_POINT_RELEASE=, io.openshift.expose-services=, version=7, GIT_BRANCH=main, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., architecture=x86_64, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.buildah.version=1.33.12) Oct 14 06:04:58 localhost podman[311529]: 2025-10-14 10:04:58.268494187 +0000 UTC m=+0.139130777 container start 36a653e916968f14563962e24ae9f5809d30b0df5485ae81aae26f4fc6f4b8d5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=recursing_hugle, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, version=7, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/agreements, RELEASE=main, com.redhat.component=rhceph-container, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, ceph=True, io.buildah.version=1.33.12, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, io.openshift.expose-services=, release=553, maintainer=Guillaume Abrioux , build-date=2025-09-24T08:57:55, architecture=x86_64, vendor=Red Hat, Inc., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public) Oct 14 06:04:58 localhost podman[311529]: 2025-10-14 10:04:58.268798708 +0000 UTC m=+0.139435288 container attach 36a653e916968f14563962e24ae9f5809d30b0df5485ae81aae26f4fc6f4b8d5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=recursing_hugle, maintainer=Guillaume Abrioux , summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, io.openshift.expose-services=, name=rhceph, version=7, build-date=2025-09-24T08:57:55, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, GIT_BRANCH=main, GIT_CLEAN=True, vcs-type=git, io.buildah.version=1.33.12, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., RELEASE=main, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, release=553, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, architecture=x86_64, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.license_terms=https://www.redhat.com/agreements) Oct 14 06:04:58 localhost recursing_hugle[311544]: 167 167 Oct 14 06:04:58 localhost 
systemd[1]: libpod-36a653e916968f14563962e24ae9f5809d30b0df5485ae81aae26f4fc6f4b8d5.scope: Deactivated successfully. Oct 14 06:04:58 localhost podman[311529]: 2025-10-14 10:04:58.271269752 +0000 UTC m=+0.141906392 container died 36a653e916968f14563962e24ae9f5809d30b0df5485ae81aae26f4fc6f4b8d5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=recursing_hugle, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, GIT_CLEAN=True, io.openshift.expose-services=, build-date=2025-09-24T08:57:55, distribution-scope=public, maintainer=Guillaume Abrioux , ceph=True, com.redhat.component=rhceph-container, GIT_BRANCH=main, release=553, io.openshift.tags=rhceph ceph, io.buildah.version=1.33.12, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/agreements, version=7, vcs-type=git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0) Oct 14 06:04:58 localhost podman[311529]: 2025-10-14 10:04:58.18012268 +0000 UTC m=+0.050759290 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Oct 14 06:04:58 localhost podman[248187]: time="2025-10-14T10:04:58Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Oct 14 06:04:58 localhost podman[248187]: @ - - [14/Oct/2025:10:04:58 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 149208 "" "Go-http-client/1.1" Oct 14 06:04:58 localhost podman[248187]: @ - - [14/Oct/2025:10:04:58 +0000] 
"GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 20159 "" "Go-http-client/1.1" Oct 14 06:04:58 localhost ceph-mon[303906]: mon.np0005486733@1(peon).osd e81 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Oct 14 06:04:58 localhost podman[311549]: 2025-10-14 10:04:58.410781651 +0000 UTC m=+0.129369280 container remove 36a653e916968f14563962e24ae9f5809d30b0df5485ae81aae26f4fc6f4b8d5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=recursing_hugle, name=rhceph, RELEASE=main, description=Red Hat Ceph Storage 7, distribution-scope=public, build-date=2025-09-24T08:57:55, ceph=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_CLEAN=True, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.33.12, io.openshift.expose-services=, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., version=7, release=553, vcs-type=git, maintainer=Guillaume Abrioux , architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553) Oct 14 06:04:58 localhost systemd[1]: libpod-conmon-36a653e916968f14563962e24ae9f5809d30b0df5485ae81aae26f4fc6f4b8d5.scope: Deactivated successfully. Oct 14 06:04:58 localhost systemd[1]: tmp-crun.JB8wEo.mount: Deactivated successfully. Oct 14 06:04:58 localhost systemd[1]: var-lib-containers-storage-overlay-be848f2222ddada70342d428ffa72cadaa293371207c3baeb894a527d9678103-merged.mount: Deactivated successfully. 
Oct 14 06:04:58 localhost ceph-mon[303906]: Reconfiguring osd.3 (monmap changed)... Oct 14 06:04:58 localhost ceph-mon[303906]: Reconfiguring daemon osd.3 on np0005486733.localdomain Oct 14 06:04:58 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' Oct 14 06:04:58 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' Oct 14 06:04:58 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005486733.tvstmf", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Oct 14 06:04:59 localhost podman[311626]: Oct 14 06:04:59 localhost podman[311626]: 2025-10-14 10:04:59.164988465 +0000 UTC m=+0.069527633 container create 870ee4d0a948728936e34d5ebb5f55706f45682d38d3dc00f4afe67240c1c50a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=suspicious_blackwell, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhceph, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, release=553, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, io.buildah.version=1.33.12, maintainer=Guillaume Abrioux , vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, build-date=2025-09-24T08:57:55, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, CEPH_POINT_RELEASE=, architecture=x86_64, version=7, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, 
vendor=Red Hat, Inc., RELEASE=main) Oct 14 06:04:59 localhost systemd[1]: Started libpod-conmon-870ee4d0a948728936e34d5ebb5f55706f45682d38d3dc00f4afe67240c1c50a.scope. Oct 14 06:04:59 localhost systemd[1]: Started libcrun container. Oct 14 06:04:59 localhost podman[311626]: 2025-10-14 10:04:59.224192213 +0000 UTC m=+0.128731401 container init 870ee4d0a948728936e34d5ebb5f55706f45682d38d3dc00f4afe67240c1c50a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=suspicious_blackwell, RELEASE=main, distribution-scope=public, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, release=553, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, io.buildah.version=1.33.12, maintainer=Guillaume Abrioux , vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, ceph=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, architecture=x86_64, name=rhceph, version=7, build-date=2025-09-24T08:57:55, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0) Oct 14 06:04:59 localhost systemd[1]: tmp-crun.ew8Rm8.mount: Deactivated successfully. 
Oct 14 06:04:59 localhost podman[311626]: 2025-10-14 10:04:59.237803288 +0000 UTC m=+0.142342486 container start 870ee4d0a948728936e34d5ebb5f55706f45682d38d3dc00f4afe67240c1c50a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=suspicious_blackwell, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.expose-services=, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, version=7, build-date=2025-09-24T08:57:55, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.33.12, com.redhat.component=rhceph-container, distribution-scope=public, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, ceph=True, release=553, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux , name=rhceph, GIT_CLEAN=True) Oct 14 06:04:59 localhost podman[311626]: 2025-10-14 10:04:59.239092388 +0000 UTC m=+0.143631626 container attach 870ee4d0a948728936e34d5ebb5f55706f45682d38d3dc00f4afe67240c1c50a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=suspicious_blackwell, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, maintainer=Guillaume Abrioux , io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, 
vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, name=rhceph, io.openshift.expose-services=, CEPH_POINT_RELEASE=, architecture=x86_64, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., GIT_BRANCH=main, build-date=2025-09-24T08:57:55, GIT_CLEAN=True, io.buildah.version=1.33.12, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/agreements, release=553, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, version=7, distribution-scope=public, ceph=True) Oct 14 06:04:59 localhost suspicious_blackwell[311641]: 167 167 Oct 14 06:04:59 localhost systemd[1]: libpod-870ee4d0a948728936e34d5ebb5f55706f45682d38d3dc00f4afe67240c1c50a.scope: Deactivated successfully. Oct 14 06:04:59 localhost podman[311626]: 2025-10-14 10:04:59.140777747 +0000 UTC m=+0.045316965 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Oct 14 06:04:59 localhost podman[311626]: 2025-10-14 10:04:59.242003867 +0000 UTC m=+0.146543075 container died 870ee4d0a948728936e34d5ebb5f55706f45682d38d3dc00f4afe67240c1c50a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=suspicious_blackwell, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, architecture=x86_64, vcs-type=git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., GIT_BRANCH=main, distribution-scope=public, version=7, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=Guillaume Abrioux , io.openshift.expose-services=, description=Red Hat Ceph Storage 7, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph 
ceph, RELEASE=main, build-date=2025-09-24T08:57:55, GIT_CLEAN=True, io.buildah.version=1.33.12, name=rhceph, release=553, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container) Oct 14 06:04:59 localhost podman[311646]: 2025-10-14 10:04:59.317803881 +0000 UTC m=+0.068457971 container remove 870ee4d0a948728936e34d5ebb5f55706f45682d38d3dc00f4afe67240c1c50a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=suspicious_blackwell, io.buildah.version=1.33.12, ceph=True, GIT_CLEAN=True, vcs-type=git, version=7, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux , name=rhceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, CEPH_POINT_RELEASE=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, architecture=x86_64, release=553, distribution-scope=public, build-date=2025-09-24T08:57:55) Oct 14 06:04:59 localhost systemd[1]: libpod-conmon-870ee4d0a948728936e34d5ebb5f55706f45682d38d3dc00f4afe67240c1c50a.scope: Deactivated successfully. Oct 14 06:04:59 localhost systemd[1]: var-lib-containers-storage-overlay-76894be856e7fd42b2ff216a6d34774ee989c965890fc0ac81abd7e4148dd6f7-merged.mount: Deactivated successfully. 
Oct 14 06:04:59 localhost ceph-mon[303906]: mon.np0005486733@1(peon) e11 adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints Oct 14 06:04:59 localhost ceph-mon[303906]: mon.np0005486733@1(peon) e11 adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints Oct 14 06:04:59 localhost ceph-mon[303906]: Reconfiguring mds.mds.np0005486733.tvstmf (monmap changed)... Oct 14 06:04:59 localhost ceph-mon[303906]: Reconfiguring daemon mds.mds.np0005486733.tvstmf on np0005486733.localdomain Oct 14 06:04:59 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' Oct 14 06:04:59 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' Oct 14 06:04:59 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005486733.primvu", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Oct 14 06:04:59 localhost nova_compute[297686]: 2025-10-14 10:04:59.921 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:05:00 localhost podman[311716]: Oct 14 06:05:00 localhost podman[311716]: 2025-10-14 10:05:00.090025395 +0000 UTC m=+0.076999932 container create 703dbd33d81e4a7691ca269bbbd76a810be0f629ea51bda83fdca157a0adcb7c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=laughing_turing, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, io.openshift.expose-services=, GIT_BRANCH=main, architecture=x86_64, io.openshift.tags=rhceph ceph, distribution-scope=public, vendor=Red Hat, Inc., vcs-type=git, release=553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.component=rhceph-container, 
name=rhceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, build-date=2025-09-24T08:57:55, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, ceph=True, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux , GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/agreements) Oct 14 06:05:00 localhost systemd[1]: Started libpod-conmon-703dbd33d81e4a7691ca269bbbd76a810be0f629ea51bda83fdca157a0adcb7c.scope. Oct 14 06:05:00 localhost podman[311716]: 2025-10-14 10:05:00.058884585 +0000 UTC m=+0.045859182 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Oct 14 06:05:00 localhost systemd[1]: Started libcrun container. Oct 14 06:05:00 localhost podman[311716]: 2025-10-14 10:05:00.184439217 +0000 UTC m=+0.171413744 container init 703dbd33d81e4a7691ca269bbbd76a810be0f629ea51bda83fdca157a0adcb7c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=laughing_turing, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, architecture=x86_64, build-date=2025-09-24T08:57:55, GIT_BRANCH=main, io.openshift.expose-services=, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, RELEASE=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.component=rhceph-container, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, distribution-scope=public, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux , io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, release=553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, io.openshift.tags=rhceph ceph, version=7) Oct 14 06:05:00 localhost podman[311716]: 2025-10-14 10:05:00.194560496 +0000 UTC m=+0.181535033 container start 703dbd33d81e4a7691ca269bbbd76a810be0f629ea51bda83fdca157a0adcb7c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=laughing_turing, name=rhceph, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.component=rhceph-container, GIT_BRANCH=main, release=553, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., io.openshift.expose-services=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, build-date=2025-09-24T08:57:55, version=7, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, io.buildah.version=1.33.12, GIT_CLEAN=True, vcs-type=git, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux ) Oct 14 06:05:00 localhost podman[311716]: 2025-10-14 10:05:00.194988619 +0000 UTC m=+0.181963276 container attach 703dbd33d81e4a7691ca269bbbd76a810be0f629ea51bda83fdca157a0adcb7c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=laughing_turing, io.openshift.expose-services=, com.redhat.component=rhceph-container, version=7, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, 
build-date=2025-09-24T08:57:55, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.tags=rhceph ceph, distribution-scope=public, RELEASE=main, GIT_CLEAN=True, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, io.buildah.version=1.33.12, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, release=553, maintainer=Guillaume Abrioux , com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d) Oct 14 06:05:00 localhost laughing_turing[311731]: 167 167 Oct 14 06:05:00 localhost systemd[1]: libpod-703dbd33d81e4a7691ca269bbbd76a810be0f629ea51bda83fdca157a0adcb7c.scope: Deactivated successfully. 
Oct 14 06:05:00 localhost podman[311716]: 2025-10-14 10:05:00.2002582 +0000 UTC m=+0.187232797 container died 703dbd33d81e4a7691ca269bbbd76a810be0f629ea51bda83fdca157a0adcb7c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=laughing_turing, GIT_BRANCH=main, release=553, version=7, io.buildah.version=1.33.12, GIT_CLEAN=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.component=rhceph-container, RELEASE=main, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vcs-type=git, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-09-24T08:57:55, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.) 
Oct 14 06:05:00 localhost podman[311736]: 2025-10-14 10:05:00.298738716 +0000 UTC m=+0.084252523 container remove 703dbd33d81e4a7691ca269bbbd76a810be0f629ea51bda83fdca157a0adcb7c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=laughing_turing, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-type=git, release=553, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., io.buildah.version=1.33.12, ceph=True, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, distribution-scope=public, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, com.redhat.component=rhceph-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux , version=7, io.openshift.expose-services=, build-date=2025-09-24T08:57:55, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.) Oct 14 06:05:00 localhost systemd[1]: libpod-conmon-703dbd33d81e4a7691ca269bbbd76a810be0f629ea51bda83fdca157a0adcb7c.scope: Deactivated successfully. Oct 14 06:05:00 localhost systemd[1]: var-lib-containers-storage-overlay-9f1e54890a21cf24fde6b18ab0f6d64214895ccc2a96785fc44f941210792d42-merged.mount: Deactivated successfully. Oct 14 06:05:00 localhost ceph-mon[303906]: Reconfiguring mgr.np0005486733.primvu (monmap changed)... 
Oct 14 06:05:00 localhost ceph-mon[303906]: Reconfiguring daemon mgr.np0005486733.primvu on np0005486733.localdomain Oct 14 06:05:00 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' Oct 14 06:05:00 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' Oct 14 06:05:01 localhost ceph-mon[303906]: mon.np0005486733@1(peon) e11 adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints Oct 14 06:05:02 localhost nova_compute[297686]: 2025-10-14 10:05:02.505 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:05:02 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' Oct 14 06:05:02 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' Oct 14 06:05:03 localhost ceph-mon[303906]: mon.np0005486733@1(peon).osd e81 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Oct 14 06:05:03 localhost ceph-mon[303906]: mon.np0005486733@1(peon) e11 adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints Oct 14 06:05:03 localhost ceph-mon[303906]: mon.np0005486733@1(peon) e11 adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints Oct 14 06:05:03 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Oct 14 06:05:03 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' Oct 14 06:05:03 localhost ceph-mgr[302471]: ms_deliver_dispatch: unhandled message 0x55da4e7fa160 mon_map magic: 0 from mon.1 v2:172.18.0.108:3300/0 Oct 14 06:05:03 localhost ceph-mon[303906]: log_channel(cluster) log [INF] : mon.np0005486733 calling monitor 
election Oct 14 06:05:03 localhost ceph-mon[303906]: paxos.1).electionLogic(48) init, last seen epoch 48 Oct 14 06:05:03 localhost ceph-mon[303906]: mon.np0005486733@1(electing) e12 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Oct 14 06:05:03 localhost ceph-mon[303906]: mon.np0005486733@1(electing) e12 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Oct 14 06:05:04 localhost nova_compute[297686]: 2025-10-14 10:05:04.924 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:05:07 localhost nova_compute[297686]: 2025-10-14 10:05:07.507 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:05:08 localhost ceph-mds[301155]: mds.beacon.mds.np0005486733.tvstmf missed beacon ack from the monitors Oct 14 06:05:08 localhost openstack_network_exporter[250374]: ERROR 10:05:08 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Oct 14 06:05:08 localhost openstack_network_exporter[250374]: ERROR 10:05:08 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Oct 14 06:05:08 localhost openstack_network_exporter[250374]: ERROR 10:05:08 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Oct 14 06:05:08 localhost openstack_network_exporter[250374]: Oct 14 06:05:08 localhost openstack_network_exporter[250374]: ERROR 10:05:08 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Oct 14 06:05:08 localhost openstack_network_exporter[250374]: ERROR 10:05:08 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Oct 14 06:05:08 localhost openstack_network_exporter[250374]: Oct 14 06:05:08 localhost ceph-mon[303906]: 
mon.np0005486733@1(peon) e12 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Oct 14 06:05:09 localhost ceph-mon[303906]: mon.np0005486730 calling monitor election Oct 14 06:05:09 localhost ceph-mon[303906]: mon.np0005486731 calling monitor election Oct 14 06:05:09 localhost ceph-mon[303906]: mon.np0005486733 calling monitor election Oct 14 06:05:09 localhost ceph-mon[303906]: mon.np0005486730 is new leader, mons np0005486730,np0005486733,np0005486731 in quorum (ranks 0,1,2) Oct 14 06:05:09 localhost ceph-mon[303906]: Health check failed: 1/4 mons down, quorum np0005486730,np0005486733,np0005486731 (MON_DOWN) Oct 14 06:05:09 localhost ceph-mon[303906]: Health detail: HEALTH_WARN 1 stray daemon(s) not managed by cephadm; 1 stray host(s) with 1 daemon(s) not managed by cephadm; 1/4 mons down, quorum np0005486730,np0005486733,np0005486731 Oct 14 06:05:09 localhost ceph-mon[303906]: [WRN] CEPHADM_STRAY_DAEMON: 1 stray daemon(s) not managed by cephadm Oct 14 06:05:09 localhost ceph-mon[303906]: stray daemon mgr.np0005486728.giajub on host np0005486728.localdomain not managed by cephadm Oct 14 06:05:09 localhost ceph-mon[303906]: [WRN] CEPHADM_STRAY_HOST: 1 stray host(s) with 1 daemon(s) not managed by cephadm Oct 14 06:05:09 localhost ceph-mon[303906]: stray host np0005486728.localdomain has 1 stray daemons: ['mgr.np0005486728.giajub'] Oct 14 06:05:09 localhost ceph-mon[303906]: [WRN] MON_DOWN: 1/4 mons down, quorum np0005486730,np0005486733,np0005486731 Oct 14 06:05:09 localhost ceph-mon[303906]: mon.np0005486732 (rank 3) addr [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] is down (out of quorum) Oct 14 06:05:09 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' Oct 14 06:05:09 localhost ceph-mon[303906]: mon.np0005486733@1(peon) e12 handle_command mon_command({"prefix": "status", "format": "json"} v 0) Oct 14 06:05:09 localhost ceph-mon[303906]: log_channel(audit) log [DBG] 
: from='client.? 172.18.0.200:0/4281847227' entity='client.admin' cmd={"prefix": "status", "format": "json"} : dispatch Oct 14 06:05:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041. Oct 14 06:05:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857. Oct 14 06:05:09 localhost podman[311840]: 2025-10-14 10:05:09.764207464 +0000 UTC m=+0.096810267 container health_status 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, io.buildah.version=1.41.3) Oct 14 06:05:09 localhost systemd[1]: tmp-crun.bJ9NGw.mount: Deactivated successfully. Oct 14 06:05:09 localhost podman[311839]: 2025-10-14 10:05:09.797280874 +0000 UTC m=+0.130015661 container health_status 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Oct 14 06:05:09 localhost podman[311840]: 2025-10-14 10:05:09.820204673 +0000 UTC m=+0.152807486 container exec_died 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 
'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2) Oct 14 06:05:09 localhost systemd[1]: 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857.service: Deactivated successfully. 
Oct 14 06:05:09 localhost podman[311839]: 2025-10-14 10:05:09.877064829 +0000 UTC m=+0.209799696 container exec_died 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Oct 14 06:05:09 localhost systemd[1]: 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041.service: Deactivated successfully. 
Oct 14 06:05:09 localhost nova_compute[297686]: 2025-10-14 10:05:09.928 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:05:10 localhost ceph-mon[303906]: log_channel(cluster) log [INF] : mon.np0005486733 calling monitor election Oct 14 06:05:10 localhost ceph-mon[303906]: paxos.1).electionLogic(50) init, last seen epoch 50 Oct 14 06:05:10 localhost ceph-mon[303906]: mon.np0005486733@1(electing) e12 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Oct 14 06:05:10 localhost ceph-mon[303906]: mon.np0005486733@1(electing) e12 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Oct 14 06:05:10 localhost ceph-mon[303906]: mon.np0005486733@1(peon) e12 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Oct 14 06:05:11 localhost ceph-mon[303906]: mon.np0005486732 calling monitor election Oct 14 06:05:11 localhost ceph-mon[303906]: Reconfig service osd.default_drive_group Oct 14 06:05:11 localhost ceph-mon[303906]: Updating np0005486730.localdomain:/etc/ceph/ceph.conf Oct 14 06:05:11 localhost ceph-mon[303906]: Updating np0005486731.localdomain:/etc/ceph/ceph.conf Oct 14 06:05:11 localhost ceph-mon[303906]: Updating np0005486732.localdomain:/etc/ceph/ceph.conf Oct 14 06:05:11 localhost ceph-mon[303906]: Updating np0005486733.localdomain:/etc/ceph/ceph.conf Oct 14 06:05:11 localhost ceph-mon[303906]: mon.np0005486732 calling monitor election Oct 14 06:05:11 localhost ceph-mon[303906]: mon.np0005486730 calling monitor election Oct 14 06:05:11 localhost ceph-mon[303906]: mon.np0005486731 calling monitor election Oct 14 06:05:11 localhost ceph-mon[303906]: mon.np0005486733 calling monitor election Oct 14 06:05:11 localhost ceph-mon[303906]: mon.np0005486730 is new leader, mons np0005486730,np0005486733,np0005486731,np0005486732 in quorum (ranks 0,1,2,3) Oct 14 
06:05:11 localhost ceph-mon[303906]: Health check cleared: MON_DOWN (was: 1/4 mons down, quorum np0005486730,np0005486733,np0005486731) Oct 14 06:05:11 localhost ceph-mon[303906]: Health detail: HEALTH_WARN 1 stray daemon(s) not managed by cephadm; 1 stray host(s) with 1 daemon(s) not managed by cephadm Oct 14 06:05:11 localhost ceph-mon[303906]: [WRN] CEPHADM_STRAY_DAEMON: 1 stray daemon(s) not managed by cephadm Oct 14 06:05:11 localhost ceph-mon[303906]: stray daemon mgr.np0005486728.giajub on host np0005486728.localdomain not managed by cephadm Oct 14 06:05:11 localhost ceph-mon[303906]: [WRN] CEPHADM_STRAY_HOST: 1 stray host(s) with 1 daemon(s) not managed by cephadm Oct 14 06:05:11 localhost ceph-mon[303906]: stray host np0005486728.localdomain has 1 stray daemons: ['mgr.np0005486728.giajub'] Oct 14 06:05:12 localhost ceph-mon[303906]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #28. Immutable memtables: 0. Oct 14 06:05:12 localhost ceph-mon[303906]: rocksdb: (Original Log Time 2025/10/14-10:05:12.117172) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0 Oct 14 06:05:12 localhost ceph-mon[303906]: rocksdb: [db/flush_job.cc:856] [default] [JOB 13] Flushing memtable with next log file: 28 Oct 14 06:05:12 localhost ceph-mon[303906]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760436312117219, "job": 13, "event": "flush_started", "num_memtables": 1, "num_entries": 894, "num_deletes": 251, "total_data_size": 2223091, "memory_usage": 2253552, "flush_reason": "Manual Compaction"} Oct 14 06:05:12 localhost ceph-mon[303906]: rocksdb: [db/flush_job.cc:885] [default] [JOB 13] Level-0 flush table #29: started Oct 14 06:05:12 localhost ceph-mon[303906]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760436312127892, "cf_name": "default", "job": 13, "event": 
"table_file_creation", "file_number": 29, "file_size": 1351053, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 18376, "largest_seqno": 19265, "table_properties": {"data_size": 1346673, "index_size": 1979, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1349, "raw_key_size": 11833, "raw_average_key_size": 22, "raw_value_size": 1337084, "raw_average_value_size": 2494, "num_data_blocks": 84, "num_entries": 536, "num_filter_entries": 536, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1760436295, "oldest_key_time": 1760436295, "file_creation_time": 1760436312, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "84a9ba57-7643-4e19-a68b-d3c5f7942ded", "db_session_id": "ABXPLKSCGGN31DHJT8DW", "orig_file_number": 29, "seqno_to_time_mapping": "N/A"}} Oct 14 06:05:12 localhost ceph-mon[303906]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 13] Flush lasted 10775 microseconds, and 4449 cpu microseconds. Oct 14 06:05:12 localhost ceph-mon[303906]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. 
Oct 14 06:05:12 localhost ceph-mon[303906]: rocksdb: (Original Log Time 2025/10/14-10:05:12.127943) [db/flush_job.cc:967] [default] [JOB 13] Level-0 flush table #29: 1351053 bytes OK Oct 14 06:05:12 localhost ceph-mon[303906]: rocksdb: (Original Log Time 2025/10/14-10:05:12.127968) [db/memtable_list.cc:519] [default] Level-0 commit table #29 started Oct 14 06:05:12 localhost ceph-mon[303906]: rocksdb: (Original Log Time 2025/10/14-10:05:12.130257) [db/memtable_list.cc:722] [default] Level-0 commit table #29: memtable #1 done Oct 14 06:05:12 localhost ceph-mon[303906]: rocksdb: (Original Log Time 2025/10/14-10:05:12.130274) EVENT_LOG_v1 {"time_micros": 1760436312130270, "job": 13, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0} Oct 14 06:05:12 localhost ceph-mon[303906]: rocksdb: (Original Log Time 2025/10/14-10:05:12.130292) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25 Oct 14 06:05:12 localhost ceph-mon[303906]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 13] Try to delete WAL files size 2218178, prev total WAL file size 2218470, number of live WAL files 2. Oct 14 06:05:12 localhost ceph-mon[303906]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005486733/store.db/000025.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Oct 14 06:05:12 localhost ceph-mon[303906]: rocksdb: (Original Log Time 2025/10/14-10:05:12.130938) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003131303434' seq:72057594037927935, type:22 .. 
'7061786F73003131323936' seq:0, type:0; will stop at (end) Oct 14 06:05:12 localhost ceph-mon[303906]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 14] Compacting 1@0 + 1@6 files to L6, score -1.00 Oct 14 06:05:12 localhost ceph-mon[303906]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 13 Base level 0, inputs: [29(1319KB)], [27(15MB)] Oct 14 06:05:12 localhost ceph-mon[303906]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760436312131008, "job": 14, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [29], "files_L6": [27], "score": -1, "input_data_size": 17147273, "oldest_snapshot_seqno": -1} Oct 14 06:05:12 localhost ceph-mon[303906]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 14] Generated table #30: 10897 keys, 13994260 bytes, temperature: kUnknown Oct 14 06:05:12 localhost ceph-mon[303906]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760436312217819, "cf_name": "default", "job": 14, "event": "table_file_creation", "file_number": 30, "file_size": 13994260, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 13933434, "index_size": 32501, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 27269, "raw_key_size": 291880, "raw_average_key_size": 26, "raw_value_size": 13748675, "raw_average_value_size": 1261, "num_data_blocks": 1231, "num_entries": 10897, "num_filter_entries": 10897, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; 
strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1760436113, "oldest_key_time": 0, "file_creation_time": 1760436312, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "84a9ba57-7643-4e19-a68b-d3c5f7942ded", "db_session_id": "ABXPLKSCGGN31DHJT8DW", "orig_file_number": 30, "seqno_to_time_mapping": "N/A"}} Oct 14 06:05:12 localhost ceph-mon[303906]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. Oct 14 06:05:12 localhost ceph-mon[303906]: rocksdb: (Original Log Time 2025/10/14-10:05:12.219157) [db/compaction/compaction_job.cc:1663] [default] [JOB 14] Compacted 1@0 + 1@6 files to L6 => 13994260 bytes Oct 14 06:05:12 localhost ceph-mon[303906]: rocksdb: (Original Log Time 2025/10/14-10:05:12.221278) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 197.3 rd, 161.1 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.3, 15.1 +0.0 blob) out(13.3 +0.0 blob), read-write-amplify(23.0) write-amplify(10.4) OK, records in: 11434, records dropped: 537 output_compression: NoCompression Oct 14 06:05:12 localhost ceph-mon[303906]: rocksdb: (Original Log Time 2025/10/14-10:05:12.221319) EVENT_LOG_v1 {"time_micros": 1760436312221303, "job": 14, "event": "compaction_finished", "compaction_time_micros": 86892, "compaction_time_cpu_micros": 41625, "output_level": 6, "num_output_files": 1, "total_output_size": 13994260, "num_input_records": 11434, "num_output_records": 10897, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]} Oct 14 06:05:12 localhost ceph-mon[303906]: rocksdb: [file/delete_scheduler.cc:74] Deleted file 
/var/lib/ceph/mon/ceph-np0005486733/store.db/000029.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Oct 14 06:05:12 localhost ceph-mon[303906]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760436312222254, "job": 14, "event": "table_file_deletion", "file_number": 29} Oct 14 06:05:12 localhost ceph-mon[303906]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005486733/store.db/000027.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Oct 14 06:05:12 localhost ceph-mon[303906]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760436312224876, "job": 14, "event": "table_file_deletion", "file_number": 27} Oct 14 06:05:12 localhost ceph-mon[303906]: rocksdb: (Original Log Time 2025/10/14-10:05:12.130825) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Oct 14 06:05:12 localhost ceph-mon[303906]: rocksdb: (Original Log Time 2025/10/14-10:05:12.224995) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Oct 14 06:05:12 localhost ceph-mon[303906]: rocksdb: (Original Log Time 2025/10/14-10:05:12.225001) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Oct 14 06:05:12 localhost ceph-mon[303906]: rocksdb: (Original Log Time 2025/10/14-10:05:12.225003) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Oct 14 06:05:12 localhost ceph-mon[303906]: rocksdb: (Original Log Time 2025/10/14-10:05:12.225006) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Oct 14 06:05:12 localhost ceph-mon[303906]: rocksdb: (Original Log Time 2025/10/14-10:05:12.225009) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Oct 14 06:05:12 localhost ceph-mon[303906]: mon.np0005486733@1(peon) e12 handle_command mon_command({"prefix": "mgr fail"} v 0) Oct 14 06:05:12 localhost ceph-mon[303906]: log_channel(audit) log [INF] : 
from='client.? 172.18.0.200:0/216892054' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch Oct 14 06:05:12 localhost ceph-mon[303906]: mon.np0005486733@1(peon).osd e82 e82: 6 total, 6 up, 6 in Oct 14 06:05:12 localhost systemd[1]: session-68.scope: Deactivated successfully. Oct 14 06:05:12 localhost systemd[1]: session-68.scope: Consumed 24.510s CPU time. Oct 14 06:05:12 localhost systemd-logind[760]: Session 68 logged out. Waiting for processes to exit. Oct 14 06:05:12 localhost systemd-logind[760]: Removed session 68. Oct 14 06:05:12 localhost nova_compute[297686]: 2025-10-14 10:05:12.544 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:05:12 localhost sshd[312200]: main: sshd: ssh-rsa algorithm is disabled Oct 14 06:05:12 localhost systemd-logind[760]: New session 69 of user ceph-admin. Oct 14 06:05:12 localhost systemd[1]: Started Session 69 of User ceph-admin. Oct 14 06:05:12 localhost ceph-mon[303906]: Updating np0005486731.localdomain:/var/lib/ceph/fcadf6e2-9176-5818-a8d0-37b19acf8eaf/config/ceph.conf Oct 14 06:05:12 localhost ceph-mon[303906]: Updating np0005486733.localdomain:/var/lib/ceph/fcadf6e2-9176-5818-a8d0-37b19acf8eaf/config/ceph.conf Oct 14 06:05:12 localhost ceph-mon[303906]: Updating np0005486730.localdomain:/var/lib/ceph/fcadf6e2-9176-5818-a8d0-37b19acf8eaf/config/ceph.conf Oct 14 06:05:12 localhost ceph-mon[303906]: Updating np0005486732.localdomain:/var/lib/ceph/fcadf6e2-9176-5818-a8d0-37b19acf8eaf/config/ceph.conf Oct 14 06:05:12 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' Oct 14 06:05:12 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' Oct 14 06:05:12 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' Oct 14 06:05:12 localhost ceph-mon[303906]: from='mgr.17397 
172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' Oct 14 06:05:12 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' Oct 14 06:05:12 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' Oct 14 06:05:12 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' Oct 14 06:05:12 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' Oct 14 06:05:12 localhost ceph-mon[303906]: from='mgr.17397 172.18.0.106:0/541940411' entity='mgr.np0005486731.swasqz' Oct 14 06:05:12 localhost ceph-mon[303906]: from='client.? ' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch Oct 14 06:05:12 localhost ceph-mon[303906]: Activating manager daemon np0005486732.pasqzz Oct 14 06:05:12 localhost ceph-mon[303906]: from='client.? 172.18.0.200:0/216892054' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch Oct 14 06:05:12 localhost ceph-mon[303906]: from='client.? 
' entity='client.admin' cmd='[{"prefix": "mgr fail"}]': finished Oct 14 06:05:12 localhost ceph-mon[303906]: Manager daemon np0005486732.pasqzz is now available Oct 14 06:05:12 localhost ceph-mon[303906]: from='mgr.17415 172.18.0.107:0/230210271' entity='mgr.np0005486732.pasqzz' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005486729.localdomain.devices.0"} : dispatch Oct 14 06:05:12 localhost ceph-mon[303906]: from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005486729.localdomain.devices.0"} : dispatch Oct 14 06:05:12 localhost ceph-mon[303906]: from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/host.np0005486729.localdomain.devices.0"}]': finished Oct 14 06:05:12 localhost ceph-mon[303906]: from='mgr.17415 172.18.0.107:0/230210271' entity='mgr.np0005486732.pasqzz' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005486729.localdomain.devices.0"} : dispatch Oct 14 06:05:12 localhost ceph-mon[303906]: from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005486729.localdomain.devices.0"} : dispatch Oct 14 06:05:12 localhost ceph-mon[303906]: from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/host.np0005486729.localdomain.devices.0"}]': finished Oct 14 06:05:12 localhost ceph-mon[303906]: from='mgr.17415 172.18.0.107:0/230210271' entity='mgr.np0005486732.pasqzz' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005486732.pasqzz/mirror_snapshot_schedule"} : dispatch Oct 14 06:05:12 localhost ceph-mon[303906]: from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005486732.pasqzz/mirror_snapshot_schedule"} : dispatch Oct 14 06:05:12 localhost ceph-mon[303906]: from='mgr.17415 172.18.0.107:0/230210271' entity='mgr.np0005486732.pasqzz' cmd={"prefix":"config 
rm","who":"mgr","name":"mgr/rbd_support/np0005486732.pasqzz/trash_purge_schedule"} : dispatch Oct 14 06:05:12 localhost ceph-mon[303906]: from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005486732.pasqzz/trash_purge_schedule"} : dispatch Oct 14 06:05:13 localhost ceph-mon[303906]: mon.np0005486733@1(peon).osd e82 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Oct 14 06:05:13 localhost podman[312311]: 2025-10-14 10:05:13.825076515 +0000 UTC m=+0.092343080 container exec b19a91314e045cbe5465f6abdeb6b420108fcda7860b498b01dcd4e65236d2c8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-crash-np0005486733, GIT_CLEAN=True, build-date=2025-09-24T08:57:55, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vendor=Red Hat, Inc., release=553, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_BRANCH=main, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux , io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, io.openshift.expose-services=, version=7, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.33.12, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, distribution-scope=public) Oct 14 06:05:13 localhost ceph-mon[303906]: removing stray HostCache host record np0005486729.localdomain.devices.0 Oct 14 06:05:13 localhost podman[312311]: 2025-10-14 10:05:13.936525978 +0000 
UTC m=+0.203792523 container exec_died b19a91314e045cbe5465f6abdeb6b420108fcda7860b498b01dcd4e65236d2c8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-crash-np0005486733, maintainer=Guillaume Abrioux , vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, architecture=x86_64, build-date=2025-09-24T08:57:55, vcs-type=git, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.license_terms=https://www.redhat.com/agreements, ceph=True, name=rhceph, CEPH_POINT_RELEASE=, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, version=7, io.buildah.version=1.33.12, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, GIT_BRANCH=main) Oct 14 06:05:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29. 
Oct 14 06:05:14 localhost podman[312471]: 2025-10-14 10:05:14.881348193 +0000 UTC m=+0.071533203 container health_status fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=iscsid, container_name=iscsid, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.build-date=20251009, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS) Oct 14 06:05:14 localhost ceph-mon[303906]: [14/Oct/2025:10:05:13] ENGINE Bus STARTING Oct 14 06:05:14 localhost ceph-mon[303906]: [14/Oct/2025:10:05:14] ENGINE Serving on http://172.18.0.107:8765 Oct 14 06:05:14 localhost ceph-mon[303906]: 
[14/Oct/2025:10:05:14] ENGINE Serving on https://172.18.0.107:7150 Oct 14 06:05:14 localhost ceph-mon[303906]: [14/Oct/2025:10:05:14] ENGINE Bus STARTED Oct 14 06:05:14 localhost ceph-mon[303906]: [14/Oct/2025:10:05:14] ENGINE Client ('172.18.0.107', 44626) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)') Oct 14 06:05:14 localhost ceph-mon[303906]: Health check cleared: CEPHADM_STRAY_DAEMON (was: 1 stray daemon(s) not managed by cephadm) Oct 14 06:05:14 localhost ceph-mon[303906]: Health check cleared: CEPHADM_STRAY_HOST (was: 1 stray host(s) with 1 daemon(s) not managed by cephadm) Oct 14 06:05:14 localhost ceph-mon[303906]: Cluster is now healthy Oct 14 06:05:14 localhost ceph-mon[303906]: from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' Oct 14 06:05:14 localhost ceph-mon[303906]: from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' Oct 14 06:05:14 localhost ceph-mon[303906]: from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' Oct 14 06:05:14 localhost ceph-mon[303906]: from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' Oct 14 06:05:14 localhost ceph-mon[303906]: from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' Oct 14 06:05:14 localhost ceph-mon[303906]: from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' Oct 14 06:05:14 localhost ceph-mon[303906]: from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' Oct 14 06:05:14 localhost ceph-mon[303906]: from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' Oct 14 06:05:14 localhost podman[312471]: 2025-10-14 10:05:14.922170763 +0000 UTC m=+0.112355843 container exec_died fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=iscsid, container_name=iscsid, tcib_managed=true, org.label-schema.vendor=CentOS, 
config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.build-date=20251009, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image) Oct 14 06:05:14 localhost nova_compute[297686]: 2025-10-14 10:05:14.942 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:05:14 localhost systemd[1]: fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29.service: Deactivated successfully. Oct 14 06:05:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad. Oct 14 06:05:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec. 
Oct 14 06:05:16 localhost podman[312597]: 2025-10-14 10:05:16.445102966 +0000 UTC m=+0.100186810 container health_status aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Oct 14 06:05:16 localhost podman[312597]: 2025-10-14 10:05:16.482270545 +0000 UTC m=+0.137354359 container exec_died aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 
'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Oct 14 06:05:16 localhost systemd[1]: aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec.service: Deactivated successfully. 
Oct 14 06:05:16 localhost podman[312596]: 2025-10-14 10:05:16.489738414 +0000 UTC m=+0.146873981 container health_status 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=multipathd, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=multipathd, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS) Oct 14 06:05:16 localhost podman[312596]: 2025-10-14 10:05:16.574306344 +0000 UTC m=+0.231441921 container exec_died 
02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.schema-version=1.0, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=multipathd) Oct 14 06:05:16 localhost systemd[1]: 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad.service: Deactivated successfully. 
Oct 14 06:05:16 localhost ceph-mon[303906]: from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' Oct 14 06:05:16 localhost ceph-mon[303906]: from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' Oct 14 06:05:16 localhost ceph-mon[303906]: from='mgr.17415 172.18.0.107:0/230210271' entity='mgr.np0005486732.pasqzz' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch Oct 14 06:05:16 localhost ceph-mon[303906]: from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch Oct 14 06:05:16 localhost ceph-mon[303906]: from='mgr.17415 172.18.0.107:0/230210271' entity='mgr.np0005486732.pasqzz' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch Oct 14 06:05:16 localhost ceph-mon[303906]: from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch Oct 14 06:05:16 localhost ceph-mon[303906]: Adjusting osd_memory_target on np0005486732.localdomain to 836.6M Oct 14 06:05:16 localhost ceph-mon[303906]: Unable to set osd_memory_target on np0005486732.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096 Oct 14 06:05:16 localhost ceph-mon[303906]: from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' Oct 14 06:05:16 localhost ceph-mon[303906]: from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' Oct 14 06:05:16 localhost ceph-mon[303906]: from='mgr.17415 172.18.0.107:0/230210271' entity='mgr.np0005486732.pasqzz' cmd={"prefix": "config rm", "who": "osd/host:np0005486730", "name": "osd_memory_target"} : dispatch Oct 14 06:05:16 localhost ceph-mon[303906]: from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' cmd={"prefix": "config rm", "who": "osd/host:np0005486730", "name": "osd_memory_target"} : dispatch Oct 14 06:05:16 localhost ceph-mon[303906]: from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' Oct 14 06:05:16 localhost ceph-mon[303906]: 
from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' Oct 14 06:05:16 localhost ceph-mon[303906]: from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' Oct 14 06:05:16 localhost ceph-mon[303906]: from='mgr.17415 172.18.0.107:0/230210271' entity='mgr.np0005486732.pasqzz' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch Oct 14 06:05:16 localhost ceph-mon[303906]: from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch Oct 14 06:05:16 localhost ceph-mon[303906]: from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' Oct 14 06:05:16 localhost ceph-mon[303906]: from='mgr.17415 172.18.0.107:0/230210271' entity='mgr.np0005486732.pasqzz' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch Oct 14 06:05:16 localhost ceph-mon[303906]: from='mgr.17415 172.18.0.107:0/230210271' entity='mgr.np0005486732.pasqzz' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch Oct 14 06:05:16 localhost ceph-mon[303906]: from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch Oct 14 06:05:16 localhost ceph-mon[303906]: from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch Oct 14 06:05:16 localhost ceph-mon[303906]: from='mgr.17415 172.18.0.107:0/230210271' entity='mgr.np0005486732.pasqzz' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch Oct 14 06:05:16 localhost ceph-mon[303906]: Adjusting osd_memory_target on np0005486733.localdomain to 836.6M Oct 14 06:05:16 localhost ceph-mon[303906]: from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch Oct 14 06:05:16 localhost ceph-mon[303906]: Adjusting osd_memory_target on np0005486731.localdomain to 836.6M Oct 14 
06:05:16 localhost ceph-mon[303906]: Unable to set osd_memory_target on np0005486733.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096 Oct 14 06:05:16 localhost ceph-mon[303906]: Unable to set osd_memory_target on np0005486731.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096 Oct 14 06:05:16 localhost ceph-mon[303906]: from='mgr.17415 172.18.0.107:0/230210271' entity='mgr.np0005486732.pasqzz' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Oct 14 06:05:16 localhost ceph-mon[303906]: Updating np0005486730.localdomain:/etc/ceph/ceph.conf Oct 14 06:05:16 localhost ceph-mon[303906]: Updating np0005486731.localdomain:/etc/ceph/ceph.conf Oct 14 06:05:17 localhost nova_compute[297686]: 2025-10-14 10:05:17.580 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:05:17 localhost ceph-mon[303906]: Updating np0005486732.localdomain:/etc/ceph/ceph.conf Oct 14 06:05:17 localhost ceph-mon[303906]: Updating np0005486733.localdomain:/etc/ceph/ceph.conf Oct 14 06:05:17 localhost ceph-mon[303906]: Updating np0005486732.localdomain:/var/lib/ceph/fcadf6e2-9176-5818-a8d0-37b19acf8eaf/config/ceph.conf Oct 14 06:05:17 localhost ceph-mon[303906]: Updating np0005486733.localdomain:/var/lib/ceph/fcadf6e2-9176-5818-a8d0-37b19acf8eaf/config/ceph.conf Oct 14 06:05:17 localhost ceph-mon[303906]: Updating np0005486730.localdomain:/var/lib/ceph/fcadf6e2-9176-5818-a8d0-37b19acf8eaf/config/ceph.conf Oct 14 06:05:17 localhost ceph-mon[303906]: Updating np0005486731.localdomain:/var/lib/ceph/fcadf6e2-9176-5818-a8d0-37b19acf8eaf/config/ceph.conf Oct 14 06:05:18 localhost ceph-mon[303906]: mon.np0005486733@1(peon).osd e82 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Oct 14 06:05:18 localhost nova_compute[297686]: 2025-10-14 10:05:18.720 2 DEBUG 
oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:05:18 localhost nova_compute[297686]: 2025-10-14 10:05:18.720 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:05:19 localhost ceph-mon[303906]: Updating np0005486733.localdomain:/etc/ceph/ceph.client.admin.keyring Oct 14 06:05:19 localhost ceph-mon[303906]: Updating np0005486732.localdomain:/etc/ceph/ceph.client.admin.keyring Oct 14 06:05:19 localhost ceph-mon[303906]: Updating np0005486730.localdomain:/etc/ceph/ceph.client.admin.keyring Oct 14 06:05:19 localhost ceph-mon[303906]: Updating np0005486731.localdomain:/etc/ceph/ceph.client.admin.keyring Oct 14 06:05:19 localhost ceph-mon[303906]: Updating np0005486730.localdomain:/var/lib/ceph/fcadf6e2-9176-5818-a8d0-37b19acf8eaf/config/ceph.client.admin.keyring Oct 14 06:05:19 localhost ceph-mon[303906]: from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' Oct 14 06:05:19 localhost ceph-mon[303906]: from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' Oct 14 06:05:19 localhost ceph-mon[303906]: from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' Oct 14 06:05:19 localhost ceph-mon[303906]: from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' Oct 14 06:05:19 localhost ceph-mon[303906]: from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' Oct 14 06:05:19 localhost ceph-mon[303906]: from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' Oct 14 06:05:19 localhost ceph-mon[303906]: from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' Oct 14 06:05:19 localhost ceph-mon[303906]: from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' Oct 14 06:05:19 localhost ceph-mon[303906]: 
from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' Oct 14 06:05:19 localhost nova_compute[297686]: 2025-10-14 10:05:19.256 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:05:19 localhost nova_compute[297686]: 2025-10-14 10:05:19.256 2 DEBUG nova.compute.manager [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Oct 14 06:05:19 localhost nova_compute[297686]: 2025-10-14 10:05:19.984 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:05:20 localhost ceph-mon[303906]: Updating np0005486732.localdomain:/var/lib/ceph/fcadf6e2-9176-5818-a8d0-37b19acf8eaf/config/ceph.client.admin.keyring Oct 14 06:05:20 localhost ceph-mon[303906]: Updating np0005486733.localdomain:/var/lib/ceph/fcadf6e2-9176-5818-a8d0-37b19acf8eaf/config/ceph.client.admin.keyring Oct 14 06:05:20 localhost ceph-mon[303906]: Updating np0005486731.localdomain:/var/lib/ceph/fcadf6e2-9176-5818-a8d0-37b19acf8eaf/config/ceph.client.admin.keyring Oct 14 06:05:20 localhost ceph-mon[303906]: Health check failed: 2 stray daemon(s) not managed by cephadm (CEPHADM_STRAY_DAEMON) Oct 14 06:05:20 localhost ceph-mon[303906]: Health check failed: 2 stray host(s) with 2 daemon(s) not managed by cephadm (CEPHADM_STRAY_HOST) Oct 14 06:05:20 localhost ceph-mon[303906]: from='mgr.17415 172.18.0.107:0/230210271' entity='mgr.np0005486732.pasqzz' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005486730.ddfidc", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Oct 14 06:05:20 localhost ceph-mon[303906]: from='mgr.17415 ' 
entity='mgr.np0005486732.pasqzz' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005486730.ddfidc", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Oct 14 06:05:20 localhost nova_compute[297686]: 2025-10-14 10:05:20.256 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:05:20 localhost nova_compute[297686]: 2025-10-14 10:05:20.256 2 DEBUG nova.compute.manager [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Oct 14 06:05:20 localhost nova_compute[297686]: 2025-10-14 10:05:20.257 2 DEBUG nova.compute.manager [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Oct 14 06:05:21 localhost ceph-mon[303906]: Reconfiguring mgr.np0005486730.ddfidc (monmap changed)... 
Oct 14 06:05:21 localhost ceph-mon[303906]: Reconfiguring daemon mgr.np0005486730.ddfidc on np0005486730.localdomain Oct 14 06:05:21 localhost ceph-mon[303906]: from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' Oct 14 06:05:21 localhost ceph-mon[303906]: from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' Oct 14 06:05:21 localhost ceph-mon[303906]: from='mgr.17415 172.18.0.107:0/230210271' entity='mgr.np0005486732.pasqzz' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005486730.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Oct 14 06:05:21 localhost ceph-mon[303906]: from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005486730.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Oct 14 06:05:21 localhost nova_compute[297686]: 2025-10-14 10:05:21.332 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Acquiring lock "refresh_cache-88c4e366-b765-47a6-96bf-f7677f2ce67c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Oct 14 06:05:21 localhost nova_compute[297686]: 2025-10-14 10:05:21.332 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Acquired lock "refresh_cache-88c4e366-b765-47a6-96bf-f7677f2ce67c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Oct 14 06:05:21 localhost nova_compute[297686]: 2025-10-14 10:05:21.332 2 DEBUG nova.network.neutron [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] [instance: 88c4e366-b765-47a6-96bf-f7677f2ce67c] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Oct 14 06:05:21 localhost nova_compute[297686]: 2025-10-14 10:05:21.333 2 DEBUG nova.objects.instance [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Lazy-loading 
'info_cache' on Instance uuid 88c4e366-b765-47a6-96bf-f7677f2ce67c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Oct 14 06:05:22 localhost ceph-mon[303906]: Reconfiguring crash.np0005486730 (monmap changed)... Oct 14 06:05:22 localhost ceph-mon[303906]: Reconfiguring daemon crash.np0005486730 on np0005486730.localdomain Oct 14 06:05:22 localhost ceph-mon[303906]: from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' Oct 14 06:05:22 localhost ceph-mon[303906]: from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' Oct 14 06:05:22 localhost ceph-mon[303906]: Reconfiguring crash.np0005486731 (monmap changed)... Oct 14 06:05:22 localhost ceph-mon[303906]: from='mgr.17415 172.18.0.107:0/230210271' entity='mgr.np0005486732.pasqzz' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005486731.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Oct 14 06:05:22 localhost ceph-mon[303906]: from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005486731.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Oct 14 06:05:22 localhost ceph-mon[303906]: Reconfiguring daemon crash.np0005486731 on np0005486731.localdomain Oct 14 06:05:22 localhost ceph-mon[303906]: from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' Oct 14 06:05:22 localhost ceph-mon[303906]: from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' Oct 14 06:05:22 localhost ceph-mon[303906]: Reconfiguring osd.2 (monmap changed)... 
Oct 14 06:05:22 localhost ceph-mon[303906]: from='mgr.17415 172.18.0.107:0/230210271' entity='mgr.np0005486732.pasqzz' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch Oct 14 06:05:22 localhost ceph-mon[303906]: Reconfiguring daemon osd.2 on np0005486731.localdomain Oct 14 06:05:22 localhost nova_compute[297686]: 2025-10-14 10:05:22.624 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:05:23 localhost ceph-mon[303906]: mon.np0005486733@1(peon).osd e82 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Oct 14 06:05:23 localhost ceph-mon[303906]: from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' Oct 14 06:05:23 localhost ceph-mon[303906]: from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' Oct 14 06:05:23 localhost ceph-mon[303906]: from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' Oct 14 06:05:23 localhost ceph-mon[303906]: from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' Oct 14 06:05:23 localhost ceph-mon[303906]: from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' Oct 14 06:05:23 localhost ceph-mon[303906]: Reconfiguring osd.4 (monmap changed)... 
Oct 14 06:05:23 localhost ceph-mon[303906]: from='mgr.17415 172.18.0.107:0/230210271' entity='mgr.np0005486732.pasqzz' cmd={"prefix": "auth get", "entity": "osd.4"} : dispatch Oct 14 06:05:23 localhost ceph-mon[303906]: Reconfiguring daemon osd.4 on np0005486731.localdomain Oct 14 06:05:24 localhost nova_compute[297686]: 2025-10-14 10:05:24.501 2 DEBUG nova.network.neutron [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] [instance: 88c4e366-b765-47a6-96bf-f7677f2ce67c] Updating instance_info_cache with network_info: [{"id": "3ec9b060-f43d-4698-9c76-6062c70911d5", "address": "fa:16:3e:84:5e:e5", "network": {"id": "7d0cd696-bdd7-4e70-9512-eb0d23640314", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.46", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "41187b090f3d4818a32baa37ce8a3991", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3ec9b060-f4", "ovs_interfaceid": "3ec9b060-f43d-4698-9c76-6062c70911d5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Oct 14 06:05:24 localhost nova_compute[297686]: 2025-10-14 10:05:24.526 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Releasing lock "refresh_cache-88c4e366-b765-47a6-96bf-f7677f2ce67c" lock 
/usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Oct 14 06:05:24 localhost nova_compute[297686]: 2025-10-14 10:05:24.526 2 DEBUG nova.compute.manager [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] [instance: 88c4e366-b765-47a6-96bf-f7677f2ce67c] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Oct 14 06:05:24 localhost nova_compute[297686]: 2025-10-14 10:05:24.527 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:05:24 localhost nova_compute[297686]: 2025-10-14 10:05:24.528 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:05:24 localhost nova_compute[297686]: 2025-10-14 10:05:24.528 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:05:24 localhost nova_compute[297686]: 2025-10-14 10:05:24.529 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:05:24 localhost nova_compute[297686]: 2025-10-14 10:05:24.530 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks 
/usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:05:24 localhost nova_compute[297686]: 2025-10-14 10:05:24.577 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Oct 14 06:05:24 localhost nova_compute[297686]: 2025-10-14 10:05:24.577 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Oct 14 06:05:24 localhost nova_compute[297686]: 2025-10-14 10:05:24.578 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Oct 14 06:05:24 localhost nova_compute[297686]: 2025-10-14 10:05:24.578 2 DEBUG nova.compute.resource_tracker [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Auditing locally available compute resources for np0005486733.localdomain (node: np0005486733.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Oct 14 06:05:24 localhost nova_compute[297686]: 2025-10-14 10:05:24.579 2 DEBUG oslo_concurrency.processutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Oct 14 06:05:24 localhost nova_compute[297686]: 2025-10-14 10:05:24.987 2 DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:05:25 localhost nova_compute[297686]: 2025-10-14 10:05:25.054 2 DEBUG oslo_concurrency.processutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.476s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Oct 14 06:05:25 localhost ceph-mon[303906]: from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' Oct 14 06:05:25 localhost ceph-mon[303906]: from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' Oct 14 06:05:25 localhost ceph-mon[303906]: from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' Oct 14 06:05:25 localhost ceph-mon[303906]: from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' Oct 14 06:05:25 localhost ceph-mon[303906]: Reconfiguring mds.mds.np0005486731.onyaog (monmap changed)... Oct 14 06:05:25 localhost ceph-mon[303906]: from='mgr.17415 172.18.0.107:0/230210271' entity='mgr.np0005486732.pasqzz' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005486731.onyaog", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Oct 14 06:05:25 localhost ceph-mon[303906]: from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005486731.onyaog", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Oct 14 06:05:25 localhost ceph-mon[303906]: Reconfiguring daemon mds.mds.np0005486731.onyaog on np0005486731.localdomain Oct 14 06:05:25 localhost ceph-mon[303906]: from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' Oct 14 06:05:25 localhost ceph-mon[303906]: from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' Oct 14 06:05:25 localhost ceph-mon[303906]: from='mgr.17415 172.18.0.107:0/230210271' entity='mgr.np0005486732.pasqzz' cmd={"prefix": "auth get-or-create", "entity": 
"mgr.np0005486731.swasqz", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Oct 14 06:05:25 localhost ceph-mon[303906]: from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005486731.swasqz", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Oct 14 06:05:25 localhost nova_compute[297686]: 2025-10-14 10:05:25.131 2 DEBUG nova.virt.libvirt.driver [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Oct 14 06:05:25 localhost nova_compute[297686]: 2025-10-14 10:05:25.132 2 DEBUG nova.virt.libvirt.driver [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Oct 14 06:05:25 localhost nova_compute[297686]: 2025-10-14 10:05:25.314 2 WARNING nova.virt.libvirt.driver [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Oct 14 06:05:25 localhost nova_compute[297686]: 2025-10-14 10:05:25.315 2 DEBUG nova.compute.resource_tracker [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Hypervisor/Node resource view: name=np0005486733.localdomain free_ram=11392MB free_disk=41.83695602416992GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": 
"1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Oct 14 06:05:25 localhost nova_compute[297686]: 2025-10-14 10:05:25.315 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Oct 14 06:05:25 localhost nova_compute[297686]: 2025-10-14 10:05:25.315 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Oct 14 06:05:25 localhost nova_compute[297686]: 2025-10-14 10:05:25.391 2 DEBUG nova.compute.resource_tracker [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Instance 88c4e366-b765-47a6-96bf-f7677f2ce67c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Oct 14 06:05:25 localhost nova_compute[297686]: 2025-10-14 10:05:25.392 2 DEBUG nova.compute.resource_tracker [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Oct 14 06:05:25 localhost nova_compute[297686]: 2025-10-14 10:05:25.392 2 DEBUG nova.compute.resource_tracker [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Final resource view: name=np0005486733.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Oct 14 06:05:25 localhost nova_compute[297686]: 2025-10-14 10:05:25.451 2 DEBUG oslo_concurrency.processutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Oct 14 06:05:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f. Oct 14 06:05:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917. Oct 14 06:05:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d. Oct 14 06:05:25 localhost systemd[1]: tmp-crun.o75SSd.mount: Deactivated successfully. 
Oct 14 06:05:25 localhost podman[313321]: 2025-10-14 10:05:25.78454355 +0000 UTC m=+0.115056415 container health_status 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_id=edpm, org.label-schema.vendor=CentOS) Oct 14 06:05:25 localhost podman[313320]: 2025-10-14 10:05:25.790569385 +0000 UTC m=+0.128569580 container health_status 
799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, io.buildah.version=1.33.7, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, architecture=x86_64, com.redhat.component=ubi9-minimal-container, vcs-type=git, version=9.6, vendor=Red Hat, Inc.) Oct 14 06:05:25 localhost podman[313320]: 2025-10-14 10:05:25.802074277 +0000 UTC m=+0.140074432 container exec_died 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, name=ubi9-minimal, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, vendor=Red Hat, Inc., config_id=edpm, container_name=openstack_network_exporter, vcs-type=git, build-date=2025-08-20T13:12:41, release=1755695350, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers) Oct 14 06:05:25 localhost systemd[1]: 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917.service: Deactivated successfully. Oct 14 06:05:25 localhost podman[313321]: 2025-10-14 10:05:25.825107962 +0000 UTC m=+0.155620847 container exec_died 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes 
Operator team, managed_by=edpm_ansible, tcib_managed=true, container_name=ceilometer_agent_compute) Oct 14 06:05:25 localhost systemd[1]: 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d.service: Deactivated successfully. Oct 14 06:05:25 localhost podman[313319]: 2025-10-14 10:05:25.752286522 +0000 UTC m=+0.089419251 container health_status 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, managed_by=edpm_ansible, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3) Oct 14 06:05:25 localhost podman[313319]: 2025-10-14 10:05:25.887078381 +0000 UTC m=+0.224211080 container exec_died 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, 
name=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.schema-version=1.0, managed_by=edpm_ansible) Oct 14 06:05:25 localhost systemd[1]: 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f.service: Deactivated successfully. Oct 14 06:05:25 localhost ceph-mon[303906]: mon.np0005486733@1(peon) e12 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Oct 14 06:05:25 localhost ceph-mon[303906]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.108:0/2967300860' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Oct 14 06:05:25 localhost nova_compute[297686]: 2025-10-14 10:05:25.935 2 DEBUG oslo_concurrency.processutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.483s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Oct 14 06:05:25 localhost nova_compute[297686]: 2025-10-14 10:05:25.941 2 DEBUG nova.compute.provider_tree [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Inventory has not changed in ProviderTree for provider: 18c24273-aca2-4f08-be57-3188d558235e update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Oct 14 06:05:25 localhost nova_compute[297686]: 2025-10-14 10:05:25.958 2 DEBUG nova.scheduler.client.report [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Inventory has not changed for provider 18c24273-aca2-4f08-be57-3188d558235e based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Oct 14 06:05:25 localhost nova_compute[297686]: 2025-10-14 10:05:25.961 2 DEBUG nova.compute.resource_tracker [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Compute_service record updated for np0005486733.localdomain:np0005486733.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Oct 14 06:05:25 localhost nova_compute[297686]: 2025-10-14 10:05:25.961 2 DEBUG oslo_concurrency.lockutils [None 
req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.646s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Oct 14 06:05:26 localhost ceph-mon[303906]: Reconfiguring mgr.np0005486731.swasqz (monmap changed)... Oct 14 06:05:26 localhost ceph-mon[303906]: Reconfiguring daemon mgr.np0005486731.swasqz on np0005486731.localdomain Oct 14 06:05:26 localhost ceph-mon[303906]: from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' Oct 14 06:05:26 localhost ceph-mon[303906]: from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' Oct 14 06:05:26 localhost ceph-mon[303906]: from='mgr.17415 172.18.0.107:0/230210271' entity='mgr.np0005486732.pasqzz' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005486732.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Oct 14 06:05:26 localhost ceph-mon[303906]: from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005486732.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Oct 14 06:05:27 localhost ceph-mon[303906]: Reconfiguring crash.np0005486732 (monmap changed)... 
Oct 14 06:05:27 localhost ceph-mon[303906]: Reconfiguring daemon crash.np0005486732 on np0005486732.localdomain Oct 14 06:05:27 localhost ceph-mon[303906]: from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' Oct 14 06:05:27 localhost ceph-mon[303906]: from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' Oct 14 06:05:27 localhost ceph-mon[303906]: from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' Oct 14 06:05:27 localhost ceph-mon[303906]: from='mgr.17415 172.18.0.107:0/230210271' entity='mgr.np0005486732.pasqzz' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch Oct 14 06:05:27 localhost nova_compute[297686]: 2025-10-14 10:05:27.665 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:05:28 localhost ceph-mon[303906]: Saving service mon spec with placement label:mon Oct 14 06:05:28 localhost ceph-mon[303906]: Reconfiguring osd.1 (monmap changed)... Oct 14 06:05:28 localhost ceph-mon[303906]: Reconfiguring daemon osd.1 on np0005486732.localdomain Oct 14 06:05:28 localhost ceph-mon[303906]: from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' Oct 14 06:05:28 localhost ceph-mon[303906]: from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' Oct 14 06:05:28 localhost ceph-mon[303906]: from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' Oct 14 06:05:28 localhost ceph-mon[303906]: from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' Oct 14 06:05:28 localhost ceph-mon[303906]: from='mgr.17415 172.18.0.107:0/230210271' entity='mgr.np0005486732.pasqzz' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch Oct 14 06:05:28 localhost podman[248187]: time="2025-10-14T10:05:28Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Oct 14 06:05:28 localhost podman[248187]: @ - - [14/Oct/2025:10:05:28 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 147497 "" "Go-http-client/1.1" Oct 14 
06:05:28 localhost podman[248187]: @ - - [14/Oct/2025:10:05:28 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19841 "" "Go-http-client/1.1" Oct 14 06:05:28 localhost ceph-mon[303906]: mon.np0005486733@1(peon).osd e82 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Oct 14 06:05:29 localhost ceph-mon[303906]: Reconfiguring osd.5 (monmap changed)... Oct 14 06:05:29 localhost ceph-mon[303906]: Reconfiguring daemon osd.5 on np0005486732.localdomain Oct 14 06:05:29 localhost ceph-mon[303906]: from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' Oct 14 06:05:29 localhost ceph-mon[303906]: from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' Oct 14 06:05:29 localhost ceph-mon[303906]: from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' Oct 14 06:05:29 localhost ceph-mon[303906]: from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' Oct 14 06:05:29 localhost ceph-mon[303906]: from='mgr.17415 172.18.0.107:0/230210271' entity='mgr.np0005486732.pasqzz' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005486732.xkownj", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Oct 14 06:05:29 localhost ceph-mon[303906]: from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005486732.xkownj", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Oct 14 06:05:29 localhost ceph-mon[303906]: mon.np0005486733@1(peon) e12 handle_command mon_command({"prefix": "status", "format": "json"} v 0) Oct 14 06:05:29 localhost ceph-mon[303906]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.200:0/915326867' entity='client.admin' cmd={"prefix": "status", "format": "json"} : dispatch Oct 14 06:05:30 localhost nova_compute[297686]: 2025-10-14 10:05:30.030 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:05:30 localhost ceph-mon[303906]: Reconfiguring mds.mds.np0005486732.xkownj (monmap changed)... Oct 14 06:05:30 localhost ceph-mon[303906]: Reconfiguring daemon mds.mds.np0005486732.xkownj on np0005486732.localdomain Oct 14 06:05:30 localhost ceph-mon[303906]: from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' Oct 14 06:05:30 localhost ceph-mon[303906]: from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' Oct 14 06:05:30 localhost ceph-mon[303906]: from='mgr.17415 172.18.0.107:0/230210271' entity='mgr.np0005486732.pasqzz' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005486732.pasqzz", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Oct 14 06:05:30 localhost ceph-mon[303906]: from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005486732.pasqzz", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Oct 14 06:05:31 localhost ceph-mon[303906]: Reconfiguring mgr.np0005486732.pasqzz (monmap changed)... 
Oct 14 06:05:31 localhost ceph-mon[303906]: Reconfiguring daemon mgr.np0005486732.pasqzz on np0005486732.localdomain Oct 14 06:05:31 localhost ceph-mon[303906]: from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' Oct 14 06:05:31 localhost ceph-mon[303906]: from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' Oct 14 06:05:31 localhost ceph-mon[303906]: from='mgr.17415 172.18.0.107:0/230210271' entity='mgr.np0005486732.pasqzz' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Oct 14 06:05:32 localhost podman[313435]: Oct 14 06:05:32 localhost podman[313435]: 2025-10-14 10:05:32.133562615 +0000 UTC m=+0.059291117 container create 43a575799a30bb4a6ae6f03376879212ff2d6eaa86868874d1628c0f8f7cdb81 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=zen_feynman, architecture=x86_64, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, vcs-type=git, release=553, io.openshift.expose-services=, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux , org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vendor=Red Hat, Inc., GIT_BRANCH=main, GIT_CLEAN=True, build-date=2025-09-24T08:57:55, name=rhceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, RELEASE=main, ceph=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, version=7, io.buildah.version=1.33.12) Oct 14 06:05:32 localhost systemd[1]: Started libpod-conmon-43a575799a30bb4a6ae6f03376879212ff2d6eaa86868874d1628c0f8f7cdb81.scope. Oct 14 06:05:32 localhost systemd[1]: Started libcrun container. 
Oct 14 06:05:32 localhost podman[313435]: 2025-10-14 10:05:32.191789969 +0000 UTC m=+0.117518451 container init 43a575799a30bb4a6ae6f03376879212ff2d6eaa86868874d1628c0f8f7cdb81 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=zen_feynman, maintainer=Guillaume Abrioux , io.openshift.expose-services=, GIT_BRANCH=main, CEPH_POINT_RELEASE=, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, vendor=Red Hat, Inc., name=rhceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_CLEAN=True, build-date=2025-09-24T08:57:55, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, release=553, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, distribution-scope=public, version=7, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9) Oct 14 06:05:32 localhost podman[313435]: 2025-10-14 10:05:32.101365749 +0000 UTC m=+0.027094281 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Oct 14 06:05:32 localhost systemd[1]: tmp-crun.dopXaT.mount: Deactivated successfully. 
Oct 14 06:05:32 localhost podman[313435]: 2025-10-14 10:05:32.205989274 +0000 UTC m=+0.131717806 container start 43a575799a30bb4a6ae6f03376879212ff2d6eaa86868874d1628c0f8f7cdb81 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=zen_feynman, com.redhat.component=rhceph-container, vcs-type=git, version=7, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, build-date=2025-09-24T08:57:55, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, release=553, maintainer=Guillaume Abrioux , url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.expose-services=, distribution-scope=public, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., ceph=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, architecture=x86_64, GIT_BRANCH=main) Oct 14 06:05:32 localhost podman[313435]: 2025-10-14 10:05:32.206424847 +0000 UTC m=+0.132153359 container attach 43a575799a30bb4a6ae6f03376879212ff2d6eaa86868874d1628c0f8f7cdb81 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=zen_feynman, vcs-type=git, version=7, description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_CLEAN=True, build-date=2025-09-24T08:57:55, com.redhat.component=rhceph-container, RELEASE=main, architecture=x86_64, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, ceph=True, maintainer=Guillaume Abrioux , 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3) Oct 14 06:05:32 localhost zen_feynman[313450]: 167 167 Oct 14 06:05:32 localhost systemd[1]: libpod-43a575799a30bb4a6ae6f03376879212ff2d6eaa86868874d1628c0f8f7cdb81.scope: Deactivated successfully. Oct 14 06:05:32 localhost podman[313435]: 2025-10-14 10:05:32.215600248 +0000 UTC m=+0.141328810 container died 43a575799a30bb4a6ae6f03376879212ff2d6eaa86868874d1628c0f8f7cdb81 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=zen_feynman, vendor=Red Hat, Inc., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_CLEAN=True, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-09-24T08:57:55, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.tags=rhceph ceph, vcs-type=git, io.openshift.expose-services=, GIT_BRANCH=main, name=rhceph, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux , summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, distribution-scope=public, CEPH_POINT_RELEASE=, release=553, io.buildah.version=1.33.12, 
description=Red Hat Ceph Storage 7) Oct 14 06:05:32 localhost ceph-mon[303906]: Reconfiguring mon.np0005486732 (monmap changed)... Oct 14 06:05:32 localhost ceph-mon[303906]: Reconfiguring daemon mon.np0005486732 on np0005486732.localdomain Oct 14 06:05:32 localhost ceph-mon[303906]: from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' Oct 14 06:05:32 localhost ceph-mon[303906]: from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' Oct 14 06:05:32 localhost ceph-mon[303906]: from='mgr.17415 172.18.0.107:0/230210271' entity='mgr.np0005486732.pasqzz' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005486733.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Oct 14 06:05:32 localhost ceph-mon[303906]: from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005486733.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Oct 14 06:05:32 localhost podman[313455]: 2025-10-14 10:05:32.318594643 +0000 UTC m=+0.089548354 container remove 43a575799a30bb4a6ae6f03376879212ff2d6eaa86868874d1628c0f8f7cdb81 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=zen_feynman, vendor=Red Hat, Inc., GIT_BRANCH=main, build-date=2025-09-24T08:57:55, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, distribution-scope=public, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vcs-type=git, CEPH_POINT_RELEASE=, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux , url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, RELEASE=main, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, release=553, 
GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7) Oct 14 06:05:32 localhost systemd[1]: libpod-conmon-43a575799a30bb4a6ae6f03376879212ff2d6eaa86868874d1628c0f8f7cdb81.scope: Deactivated successfully. Oct 14 06:05:32 localhost nova_compute[297686]: 2025-10-14 10:05:32.709 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:05:33 localhost podman[313523]: Oct 14 06:05:33 localhost podman[313523]: 2025-10-14 10:05:33.0777897 +0000 UTC m=+0.059974778 container create 51771cfd506e9ee78f9cab68a99ff6d95a71f22ceaa191159137b2ef84a62ae2 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=lucid_cartwright, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, maintainer=Guillaume Abrioux , version=7, com.redhat.component=rhceph-container, distribution-scope=public, name=rhceph, com.redhat.license_terms=https://www.redhat.com/agreements, RELEASE=main, build-date=2025-09-24T08:57:55, io.buildah.version=1.33.12, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, architecture=x86_64, GIT_BRANCH=main, release=553, io.openshift.expose-services=, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=) Oct 14 06:05:33 localhost systemd[1]: Started 
libpod-conmon-51771cfd506e9ee78f9cab68a99ff6d95a71f22ceaa191159137b2ef84a62ae2.scope. Oct 14 06:05:33 localhost systemd[1]: Started libcrun container. Oct 14 06:05:33 localhost systemd[1]: var-lib-containers-storage-overlay-b529dcae84d4272a27da2586eddbd70a65c44235a128d9ea10ca8066c5131293-merged.mount: Deactivated successfully. Oct 14 06:05:33 localhost podman[313523]: 2025-10-14 10:05:33.143504204 +0000 UTC m=+0.125689292 container init 51771cfd506e9ee78f9cab68a99ff6d95a71f22ceaa191159137b2ef84a62ae2 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=lucid_cartwright, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vendor=Red Hat, Inc., release=553, version=7, description=Red Hat Ceph Storage 7, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, io.openshift.expose-services=, RELEASE=main, maintainer=Guillaume Abrioux , vcs-type=git, CEPH_POINT_RELEASE=, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, ceph=True, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.tags=rhceph ceph, io.buildah.version=1.33.12, com.redhat.component=rhceph-container, name=rhceph, build-date=2025-09-24T08:57:55, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0) Oct 14 06:05:33 localhost podman[313523]: 2025-10-14 10:05:33.046905964 +0000 UTC m=+0.029091062 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Oct 14 06:05:33 localhost podman[313523]: 2025-10-14 10:05:33.153400586 +0000 UTC m=+0.135585664 container start 51771cfd506e9ee78f9cab68a99ff6d95a71f22ceaa191159137b2ef84a62ae2 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, 
name=lucid_cartwright, GIT_CLEAN=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, maintainer=Guillaume Abrioux , version=7, description=Red Hat Ceph Storage 7, architecture=x86_64, release=553, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., ceph=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, CEPH_POINT_RELEASE=, GIT_BRANCH=main, build-date=2025-09-24T08:57:55, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.33.12, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main) Oct 14 06:05:33 localhost podman[313523]: 2025-10-14 10:05:33.153753167 +0000 UTC m=+0.135938295 container attach 51771cfd506e9ee78f9cab68a99ff6d95a71f22ceaa191159137b2ef84a62ae2 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=lucid_cartwright, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, maintainer=Guillaume Abrioux , description=Red Hat Ceph Storage 7, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, release=553, io.buildah.version=1.33.12, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, ceph=True, 
com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-09-24T08:57:55, RELEASE=main, vcs-type=git, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, architecture=x86_64, vendor=Red Hat, Inc., com.redhat.component=rhceph-container) Oct 14 06:05:33 localhost lucid_cartwright[313538]: 167 167 Oct 14 06:05:33 localhost systemd[1]: libpod-51771cfd506e9ee78f9cab68a99ff6d95a71f22ceaa191159137b2ef84a62ae2.scope: Deactivated successfully. Oct 14 06:05:33 localhost podman[313523]: 2025-10-14 10:05:33.157349858 +0000 UTC m=+0.139534976 container died 51771cfd506e9ee78f9cab68a99ff6d95a71f22ceaa191159137b2ef84a62ae2 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=lucid_cartwright, CEPH_POINT_RELEASE=, release=553, io.buildah.version=1.33.12, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vcs-type=git, vendor=Red Hat, Inc., ceph=True, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, GIT_BRANCH=main, build-date=2025-09-24T08:57:55, com.redhat.license_terms=https://www.redhat.com/agreements, version=7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, GIT_CLEAN=True, maintainer=Guillaume Abrioux , io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7) Oct 14 06:05:33 localhost ceph-mon[303906]: Reconfiguring crash.np0005486733 (monmap changed)... 
Oct 14 06:05:33 localhost ceph-mon[303906]: Reconfiguring daemon crash.np0005486733 on np0005486733.localdomain Oct 14 06:05:33 localhost ceph-mon[303906]: from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' Oct 14 06:05:33 localhost ceph-mon[303906]: from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' Oct 14 06:05:33 localhost ceph-mon[303906]: from='mgr.17415 172.18.0.107:0/230210271' entity='mgr.np0005486732.pasqzz' cmd={"prefix": "auth get", "entity": "osd.0"} : dispatch Oct 14 06:05:33 localhost podman[313543]: 2025-10-14 10:05:33.26484146 +0000 UTC m=+0.096604260 container remove 51771cfd506e9ee78f9cab68a99ff6d95a71f22ceaa191159137b2ef84a62ae2 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=lucid_cartwright, vendor=Red Hat, Inc., GIT_CLEAN=True, CEPH_POINT_RELEASE=, RELEASE=main, description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux , summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, com.redhat.component=rhceph-container, version=7, io.openshift.expose-services=, release=553, name=rhceph, io.buildah.version=1.33.12, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, build-date=2025-09-24T08:57:55, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main) Oct 14 06:05:33 localhost systemd[1]: libpod-conmon-51771cfd506e9ee78f9cab68a99ff6d95a71f22ceaa191159137b2ef84a62ae2.scope: Deactivated successfully. 
Oct 14 06:05:33 localhost ceph-mon[303906]: mon.np0005486733@1(peon).osd e82 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Oct 14 06:05:34 localhost podman[313620]: Oct 14 06:05:34 localhost podman[313620]: 2025-10-14 10:05:34.071616375 +0000 UTC m=+0.042665638 container create 98018b2caed04aa9f492fd28dc5b4655600d7f0987aa31c78e1054a398b28fb4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=elegant_bhabha, io.openshift.expose-services=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_CLEAN=True, com.redhat.component=rhceph-container, distribution-scope=public, vendor=Red Hat, Inc., io.buildah.version=1.33.12, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, release=553, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , GIT_BRANCH=main, ceph=True, architecture=x86_64, build-date=2025-09-24T08:57:55, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553) Oct 14 06:05:34 localhost systemd[1]: Started libpod-conmon-98018b2caed04aa9f492fd28dc5b4655600d7f0987aa31c78e1054a398b28fb4.scope. Oct 14 06:05:34 localhost systemd[1]: Started libcrun container. 
Oct 14 06:05:34 localhost podman[313620]: 2025-10-14 10:05:34.12203141 +0000 UTC m=+0.093080633 container init 98018b2caed04aa9f492fd28dc5b4655600d7f0987aa31c78e1054a398b28fb4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=elegant_bhabha, release=553, CEPH_POINT_RELEASE=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , GIT_BRANCH=main, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, vcs-type=git, com.redhat.component=rhceph-container, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, name=rhceph, RELEASE=main, version=7, build-date=2025-09-24T08:57:55, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.33.12, architecture=x86_64) Oct 14 06:05:34 localhost podman[313620]: 2025-10-14 10:05:34.130926702 +0000 UTC m=+0.101975925 container start 98018b2caed04aa9f492fd28dc5b4655600d7f0987aa31c78e1054a398b28fb4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=elegant_bhabha, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, build-date=2025-09-24T08:57:55, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, CEPH_POINT_RELEASE=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, distribution-scope=public, maintainer=Guillaume Abrioux , vendor=Red Hat, Inc., com.redhat.component=rhceph-container, release=553, 
GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_CLEAN=True, version=7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, ceph=True, io.buildah.version=1.33.12, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=) Oct 14 06:05:34 localhost podman[313620]: 2025-10-14 10:05:34.131119808 +0000 UTC m=+0.102169071 container attach 98018b2caed04aa9f492fd28dc5b4655600d7f0987aa31c78e1054a398b28fb4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=elegant_bhabha, distribution-scope=public, description=Red Hat Ceph Storage 7, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_BRANCH=main, GIT_CLEAN=True, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, release=553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, version=7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.buildah.version=1.33.12, RELEASE=main, io.openshift.expose-services=, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, maintainer=Guillaume Abrioux , ceph=True, build-date=2025-09-24T08:57:55, GIT_REPO=https://github.com/ceph/ceph-container.git) Oct 14 06:05:34 localhost elegant_bhabha[313635]: 167 167 Oct 14 06:05:34 localhost systemd[1]: libpod-98018b2caed04aa9f492fd28dc5b4655600d7f0987aa31c78e1054a398b28fb4.scope: 
Deactivated successfully. Oct 14 06:05:34 localhost podman[313620]: 2025-10-14 10:05:34.1354294 +0000 UTC m=+0.106478623 container died 98018b2caed04aa9f492fd28dc5b4655600d7f0987aa31c78e1054a398b28fb4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=elegant_bhabha, maintainer=Guillaume Abrioux , GIT_BRANCH=main, GIT_CLEAN=True, vcs-type=git, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, build-date=2025-09-24T08:57:55, com.redhat.component=rhceph-container, io.buildah.version=1.33.12, version=7, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, ceph=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, architecture=x86_64, name=rhceph, io.openshift.tags=rhceph ceph, release=553) Oct 14 06:05:34 localhost systemd[1]: var-lib-containers-storage-overlay-a28f21747790e735914f2871beaf0522b9ef4a8916858cca9bf2903d41ab4bf8-merged.mount: Deactivated successfully. Oct 14 06:05:34 localhost podman[313620]: 2025-10-14 10:05:34.053612763 +0000 UTC m=+0.024661986 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Oct 14 06:05:34 localhost ceph-mon[303906]: mon.np0005486733@1(peon) e12 handle_command mon_command({"prefix": "mgr stat", "format": "json"} v 0) Oct 14 06:05:34 localhost ceph-mon[303906]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.200:0/53935379' entity='client.admin' cmd={"prefix": "mgr stat", "format": "json"} : dispatch Oct 14 06:05:34 localhost systemd[1]: var-lib-containers-storage-overlay-48506e04cd5727ad6cbf4e70e773228891c2a5eacffc317a4b978098012f05b2-merged.mount: Deactivated successfully. Oct 14 06:05:34 localhost podman[313640]: 2025-10-14 10:05:34.212106869 +0000 UTC m=+0.067552851 container remove 98018b2caed04aa9f492fd28dc5b4655600d7f0987aa31c78e1054a398b28fb4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=elegant_bhabha, io.openshift.expose-services=, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, maintainer=Guillaume Abrioux , RELEASE=main, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, version=7, name=rhceph, io.openshift.tags=rhceph ceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, release=553, build-date=2025-09-24T08:57:55, distribution-scope=public, vcs-type=git, GIT_CLEAN=True, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64) Oct 14 06:05:34 localhost systemd[1]: libpod-conmon-98018b2caed04aa9f492fd28dc5b4655600d7f0987aa31c78e1054a398b28fb4.scope: Deactivated successfully. Oct 14 06:05:34 localhost ceph-mon[303906]: Reconfiguring osd.0 (monmap changed)... 
Oct 14 06:05:34 localhost ceph-mon[303906]: Reconfiguring daemon osd.0 on np0005486733.localdomain Oct 14 06:05:34 localhost ceph-mon[303906]: from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' Oct 14 06:05:34 localhost ceph-mon[303906]: from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' Oct 14 06:05:34 localhost ceph-mon[303906]: from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' Oct 14 06:05:34 localhost ceph-mon[303906]: from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' Oct 14 06:05:34 localhost ceph-mon[303906]: from='mgr.17415 172.18.0.107:0/230210271' entity='mgr.np0005486732.pasqzz' cmd={"prefix": "auth get", "entity": "osd.3"} : dispatch Oct 14 06:05:35 localhost podman[313716]: Oct 14 06:05:35 localhost podman[313716]: 2025-10-14 10:05:35.03149233 +0000 UTC m=+0.054919114 container create dab0a7bce081229b6ed45780b0abeea1a892d669bfce12c11f708154d4bb850d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=blissful_thompson, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, ceph=True, version=7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=Guillaume Abrioux , summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.33.12, io.openshift.expose-services=, release=553, distribution-scope=public, build-date=2025-09-24T08:57:55, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, GIT_CLEAN=True, architecture=x86_64, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, RELEASE=main, vendor=Red Hat, Inc.) 
Oct 14 06:05:35 localhost nova_compute[297686]: 2025-10-14 10:05:35.032 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:05:35 localhost systemd[1]: Started libpod-conmon-dab0a7bce081229b6ed45780b0abeea1a892d669bfce12c11f708154d4bb850d.scope. Oct 14 06:05:35 localhost systemd[1]: Started libcrun container. Oct 14 06:05:35 localhost podman[313716]: 2025-10-14 10:05:35.102591748 +0000 UTC m=+0.126018542 container init dab0a7bce081229b6ed45780b0abeea1a892d669bfce12c11f708154d4bb850d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=blissful_thompson, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, build-date=2025-09-24T08:57:55, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, maintainer=Guillaume Abrioux , GIT_CLEAN=True, description=Red Hat Ceph Storage 7, version=7, vcs-type=git, CEPH_POINT_RELEASE=, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, release=553, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.tags=rhceph ceph, name=rhceph, RELEASE=main, ceph=True, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3) Oct 14 06:05:35 localhost podman[313716]: 2025-10-14 10:05:35.007745962 +0000 UTC m=+0.031172726 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Oct 14 06:05:35 localhost podman[313716]: 2025-10-14 10:05:35.109281152 +0000 UTC m=+0.132707926 container start 
dab0a7bce081229b6ed45780b0abeea1a892d669bfce12c11f708154d4bb850d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=blissful_thompson, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, io.buildah.version=1.33.12, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, build-date=2025-09-24T08:57:55, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vcs-type=git, maintainer=Guillaume Abrioux , distribution-scope=public, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., GIT_CLEAN=True, GIT_BRANCH=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, version=7, name=rhceph, architecture=x86_64, release=553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/agreements) Oct 14 06:05:35 localhost podman[313716]: 2025-10-14 10:05:35.109498789 +0000 UTC m=+0.132925613 container attach dab0a7bce081229b6ed45780b0abeea1a892d669bfce12c11f708154d4bb850d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=blissful_thompson, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, maintainer=Guillaume Abrioux , ceph=True, version=7, vcs-type=git, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, description=Red Hat Ceph Storage 7, 
io.openshift.expose-services=, release=553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, com.redhat.component=rhceph-container, io.buildah.version=1.33.12, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, build-date=2025-09-24T08:57:55, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=) Oct 14 06:05:35 localhost blissful_thompson[313730]: 167 167 Oct 14 06:05:35 localhost systemd[1]: libpod-dab0a7bce081229b6ed45780b0abeea1a892d669bfce12c11f708154d4bb850d.scope: Deactivated successfully. Oct 14 06:05:35 localhost podman[313716]: 2025-10-14 10:05:35.131237325 +0000 UTC m=+0.154664119 container died dab0a7bce081229b6ed45780b0abeea1a892d669bfce12c11f708154d4bb850d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=blissful_thompson, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, io.buildah.version=1.33.12, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, build-date=2025-09-24T08:57:55, ceph=True, architecture=x86_64, GIT_BRANCH=main, vendor=Red Hat, Inc., name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vcs-type=git, maintainer=Guillaume Abrioux , com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, release=553, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, version=7, io.openshift.tags=rhceph ceph, RELEASE=main) Oct 14 06:05:35 localhost systemd[1]: 
var-lib-containers-storage-overlay-1b9643fcb06a5211a914aac7d381e73c2cd9f3cc064a51fabd9068400584e1f7-merged.mount: Deactivated successfully. Oct 14 06:05:35 localhost podman[313737]: 2025-10-14 10:05:35.233730905 +0000 UTC m=+0.089193823 container remove dab0a7bce081229b6ed45780b0abeea1a892d669bfce12c11f708154d4bb850d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=blissful_thompson, io.buildah.version=1.33.12, maintainer=Guillaume Abrioux , description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, distribution-scope=public, GIT_BRANCH=main, io.openshift.expose-services=, name=rhceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-type=git, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=rhceph-container, build-date=2025-09-24T08:57:55, ceph=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, release=553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, vendor=Red Hat, Inc.) Oct 14 06:05:35 localhost systemd[1]: libpod-conmon-dab0a7bce081229b6ed45780b0abeea1a892d669bfce12c11f708154d4bb850d.scope: Deactivated successfully. Oct 14 06:05:35 localhost ceph-mon[303906]: Reconfiguring osd.3 (monmap changed)... 
Oct 14 06:05:35 localhost ceph-mon[303906]: Reconfiguring daemon osd.3 on np0005486733.localdomain Oct 14 06:05:35 localhost ceph-mon[303906]: from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' Oct 14 06:05:35 localhost ceph-mon[303906]: from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' Oct 14 06:05:35 localhost ceph-mon[303906]: from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' Oct 14 06:05:35 localhost ceph-mon[303906]: from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' Oct 14 06:05:35 localhost ceph-mon[303906]: Reconfiguring mds.mds.np0005486733.tvstmf (monmap changed)... Oct 14 06:05:35 localhost ceph-mon[303906]: from='mgr.17415 172.18.0.107:0/230210271' entity='mgr.np0005486732.pasqzz' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005486733.tvstmf", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Oct 14 06:05:35 localhost ceph-mon[303906]: from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005486733.tvstmf", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Oct 14 06:05:35 localhost ceph-mon[303906]: Reconfiguring daemon mds.mds.np0005486733.tvstmf on np0005486733.localdomain Oct 14 06:05:35 localhost ceph-mon[303906]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #31. Immutable memtables: 0. 
Oct 14 06:05:35 localhost ceph-mon[303906]: rocksdb: (Original Log Time 2025/10/14-10:05:35.671141) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0 Oct 14 06:05:35 localhost ceph-mon[303906]: rocksdb: [db/flush_job.cc:856] [default] [JOB 15] Flushing memtable with next log file: 31 Oct 14 06:05:35 localhost ceph-mon[303906]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760436335671220, "job": 15, "event": "flush_started", "num_memtables": 1, "num_entries": 1522, "num_deletes": 260, "total_data_size": 6663931, "memory_usage": 6907984, "flush_reason": "Manual Compaction"} Oct 14 06:05:35 localhost ceph-mon[303906]: rocksdb: [db/flush_job.cc:885] [default] [JOB 15] Level-0 flush table #32: started Oct 14 06:05:35 localhost ceph-mon[303906]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760436335683540, "cf_name": "default", "job": 15, "event": "table_file_creation", "file_number": 32, "file_size": 4066609, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 19270, "largest_seqno": 20787, "table_properties": {"data_size": 4059819, "index_size": 3679, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2053, "raw_key_size": 17022, "raw_average_key_size": 21, "raw_value_size": 4045169, "raw_average_value_size": 5088, "num_data_blocks": 154, "num_entries": 795, "num_filter_entries": 795, "num_deletions": 258, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; 
max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1760436312, "oldest_key_time": 1760436312, "file_creation_time": 1760436335, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "84a9ba57-7643-4e19-a68b-d3c5f7942ded", "db_session_id": "ABXPLKSCGGN31DHJT8DW", "orig_file_number": 32, "seqno_to_time_mapping": "N/A"}} Oct 14 06:05:35 localhost ceph-mon[303906]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 15] Flush lasted 12447 microseconds, and 5156 cpu microseconds. Oct 14 06:05:35 localhost ceph-mon[303906]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. Oct 14 06:05:35 localhost ceph-mon[303906]: rocksdb: (Original Log Time 2025/10/14-10:05:35.683594) [db/flush_job.cc:967] [default] [JOB 15] Level-0 flush table #32: 4066609 bytes OK Oct 14 06:05:35 localhost ceph-mon[303906]: rocksdb: (Original Log Time 2025/10/14-10:05:35.683617) [db/memtable_list.cc:519] [default] Level-0 commit table #32 started Oct 14 06:05:35 localhost ceph-mon[303906]: rocksdb: (Original Log Time 2025/10/14-10:05:35.685663) [db/memtable_list.cc:722] [default] Level-0 commit table #32: memtable #1 done Oct 14 06:05:35 localhost ceph-mon[303906]: rocksdb: (Original Log Time 2025/10/14-10:05:35.685690) EVENT_LOG_v1 {"time_micros": 1760436335685685, "job": 15, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0} Oct 14 06:05:35 localhost ceph-mon[303906]: rocksdb: (Original Log Time 2025/10/14-10:05:35.685711) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25 Oct 14 06:05:35 localhost ceph-mon[303906]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 15] Try to delete WAL files size 6656017, prev total WAL file 
size 6656017, number of live WAL files 2. Oct 14 06:05:35 localhost ceph-mon[303906]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005486733/store.db/000028.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Oct 14 06:05:35 localhost ceph-mon[303906]: rocksdb: (Original Log Time 2025/10/14-10:05:35.686578) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6B760031323632' seq:72057594037927935, type:22 .. '6B760031353230' seq:0, type:0; will stop at (end) Oct 14 06:05:35 localhost ceph-mon[303906]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 16] Compacting 1@0 + 1@6 files to L6, score -1.00 Oct 14 06:05:35 localhost ceph-mon[303906]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 15 Base level 0, inputs: [32(3971KB)], [30(13MB)] Oct 14 06:05:35 localhost ceph-mon[303906]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760436335686655, "job": 16, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [32], "files_L6": [30], "score": -1, "input_data_size": 18060869, "oldest_snapshot_seqno": -1} Oct 14 06:05:35 localhost ceph-mon[303906]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 16] Generated table #33: 11166 keys, 17010616 bytes, temperature: kUnknown Oct 14 06:05:35 localhost ceph-mon[303906]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760436335784348, "cf_name": "default", "job": 16, "event": "table_file_creation", "file_number": 33, "file_size": 17010616, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 16947802, "index_size": 33785, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 27973, "raw_key_size": 300103, "raw_average_key_size": 26, "raw_value_size": 16758049, 
"raw_average_value_size": 1500, "num_data_blocks": 1270, "num_entries": 11166, "num_filter_entries": 11166, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1760436113, "oldest_key_time": 0, "file_creation_time": 1760436335, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "84a9ba57-7643-4e19-a68b-d3c5f7942ded", "db_session_id": "ABXPLKSCGGN31DHJT8DW", "orig_file_number": 33, "seqno_to_time_mapping": "N/A"}} Oct 14 06:05:35 localhost ceph-mon[303906]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. 
Oct 14 06:05:35 localhost ceph-mon[303906]: rocksdb: (Original Log Time 2025/10/14-10:05:35.784801) [db/compaction/compaction_job.cc:1663] [default] [JOB 16] Compacted 1@0 + 1@6 files to L6 => 17010616 bytes Oct 14 06:05:35 localhost ceph-mon[303906]: rocksdb: (Original Log Time 2025/10/14-10:05:35.786523) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 184.7 rd, 174.0 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.9, 13.3 +0.0 blob) out(16.2 +0.0 blob), read-write-amplify(8.6) write-amplify(4.2) OK, records in: 11692, records dropped: 526 output_compression: NoCompression Oct 14 06:05:35 localhost ceph-mon[303906]: rocksdb: (Original Log Time 2025/10/14-10:05:35.786553) EVENT_LOG_v1 {"time_micros": 1760436335786539, "job": 16, "event": "compaction_finished", "compaction_time_micros": 97780, "compaction_time_cpu_micros": 53716, "output_level": 6, "num_output_files": 1, "total_output_size": 17010616, "num_input_records": 11692, "num_output_records": 11166, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]} Oct 14 06:05:35 localhost ceph-mon[303906]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005486733/store.db/000032.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Oct 14 06:05:35 localhost ceph-mon[303906]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760436335787286, "job": 16, "event": "table_file_deletion", "file_number": 32} Oct 14 06:05:35 localhost ceph-mon[303906]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005486733/store.db/000030.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Oct 14 06:05:35 localhost ceph-mon[303906]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760436335789872, 
"job": 16, "event": "table_file_deletion", "file_number": 30} Oct 14 06:05:35 localhost ceph-mon[303906]: rocksdb: (Original Log Time 2025/10/14-10:05:35.686502) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Oct 14 06:05:35 localhost ceph-mon[303906]: rocksdb: (Original Log Time 2025/10/14-10:05:35.789915) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Oct 14 06:05:35 localhost ceph-mon[303906]: rocksdb: (Original Log Time 2025/10/14-10:05:35.789922) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Oct 14 06:05:35 localhost ceph-mon[303906]: rocksdb: (Original Log Time 2025/10/14-10:05:35.789925) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Oct 14 06:05:35 localhost ceph-mon[303906]: rocksdb: (Original Log Time 2025/10/14-10:05:35.789928) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Oct 14 06:05:35 localhost ceph-mon[303906]: rocksdb: (Original Log Time 2025/10/14-10:05:35.789931) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Oct 14 06:05:35 localhost podman[313805]: Oct 14 06:05:35 localhost podman[313805]: 2025-10-14 10:05:35.99089184 +0000 UTC m=+0.078031092 container create 8043ae0a35642af29b521489061e0bac9719b673865001d58439318bafc14d97 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=distracted_dubinsky, io.openshift.expose-services=, vcs-type=git, version=7, ceph=True, io.buildah.version=1.33.12, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, release=553, build-date=2025-09-24T08:57:55, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux , RELEASE=main, 
com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc.) Oct 14 06:05:36 localhost systemd[1]: Started libpod-conmon-8043ae0a35642af29b521489061e0bac9719b673865001d58439318bafc14d97.scope. Oct 14 06:05:36 localhost systemd[1]: Started libcrun container. Oct 14 06:05:36 localhost podman[313805]: 2025-10-14 10:05:35.956639821 +0000 UTC m=+0.043779133 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Oct 14 06:05:36 localhost podman[313805]: 2025-10-14 10:05:36.059601305 +0000 UTC m=+0.146740647 container init 8043ae0a35642af29b521489061e0bac9719b673865001d58439318bafc14d97 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=distracted_dubinsky, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.buildah.version=1.33.12, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, build-date=2025-09-24T08:57:55, com.redhat.license_terms=https://www.redhat.com/agreements, CEPH_POINT_RELEASE=, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux , version=7, ceph=True, GIT_BRANCH=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, vendor=Red Hat, Inc., release=553, io.k8s.description=Red 
Hat Ceph Storage 7, RELEASE=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.expose-services=) Oct 14 06:05:36 localhost podman[313805]: 2025-10-14 10:05:36.072073217 +0000 UTC m=+0.159212479 container start 8043ae0a35642af29b521489061e0bac9719b673865001d58439318bafc14d97 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=distracted_dubinsky, RELEASE=main, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., io.openshift.expose-services=, GIT_CLEAN=True, release=553, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, CEPH_POINT_RELEASE=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux , GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/agreements, version=7, description=Red Hat Ceph Storage 7, ceph=True, com.redhat.component=rhceph-container, vcs-type=git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, name=rhceph, architecture=x86_64, build-date=2025-09-24T08:57:55) Oct 14 06:05:36 localhost podman[313805]: 2025-10-14 10:05:36.072389286 +0000 UTC m=+0.159528588 container attach 8043ae0a35642af29b521489061e0bac9719b673865001d58439318bafc14d97 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=distracted_dubinsky, ceph=True, com.redhat.component=rhceph-container, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, version=7, RELEASE=main, io.buildah.version=1.33.12, build-date=2025-09-24T08:57:55, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., 
distribution-scope=public, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, io.openshift.expose-services=, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux , io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=553) Oct 14 06:05:36 localhost distracted_dubinsky[313820]: 167 167 Oct 14 06:05:36 localhost systemd[1]: libpod-8043ae0a35642af29b521489061e0bac9719b673865001d58439318bafc14d97.scope: Deactivated successfully. Oct 14 06:05:36 localhost podman[313805]: 2025-10-14 10:05:36.07906591 +0000 UTC m=+0.166205172 container died 8043ae0a35642af29b521489061e0bac9719b673865001d58439318bafc14d97 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=distracted_dubinsky, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux , GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/agreements, version=7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, build-date=2025-09-24T08:57:55, io.buildah.version=1.33.12, ceph=True, vcs-type=git, RELEASE=main, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=553, distribution-scope=public, GIT_CLEAN=True, 
org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_BRANCH=main, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553) Oct 14 06:05:36 localhost systemd[1]: var-lib-containers-storage-overlay-c1b9933372e39730bfbe182183b97e73a14192a88534d7e5e21fec5ab03a6b38-merged.mount: Deactivated successfully. Oct 14 06:05:36 localhost podman[313825]: 2025-10-14 10:05:36.186845473 +0000 UTC m=+0.097511529 container remove 8043ae0a35642af29b521489061e0bac9719b673865001d58439318bafc14d97 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=distracted_dubinsky, maintainer=Guillaume Abrioux , version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, name=rhceph, release=553, architecture=x86_64, GIT_BRANCH=main, RELEASE=main, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.license_terms=https://www.redhat.com/agreements, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, CEPH_POINT_RELEASE=, build-date=2025-09-24T08:57:55, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, vcs-type=git, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.33.12) Oct 14 06:05:36 localhost systemd[1]: libpod-conmon-8043ae0a35642af29b521489061e0bac9719b673865001d58439318bafc14d97.scope: Deactivated successfully. 
Oct 14 06:05:36 localhost ceph-mon[303906]: from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' Oct 14 06:05:36 localhost ceph-mon[303906]: from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' Oct 14 06:05:36 localhost ceph-mon[303906]: Reconfiguring mgr.np0005486733.primvu (monmap changed)... Oct 14 06:05:36 localhost ceph-mon[303906]: from='mgr.17415 172.18.0.107:0/230210271' entity='mgr.np0005486732.pasqzz' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005486733.primvu", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Oct 14 06:05:36 localhost ceph-mon[303906]: from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005486733.primvu", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Oct 14 06:05:36 localhost ceph-mon[303906]: Reconfiguring daemon mgr.np0005486733.primvu on np0005486733.localdomain Oct 14 06:05:36 localhost ceph-mon[303906]: from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' Oct 14 06:05:36 localhost ceph-mon[303906]: from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' Oct 14 06:05:36 localhost ceph-mon[303906]: from='mgr.17415 172.18.0.107:0/230210271' entity='mgr.np0005486732.pasqzz' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Oct 14 06:05:36 localhost podman[313894]: Oct 14 06:05:36 localhost podman[313894]: 2025-10-14 10:05:36.950669022 +0000 UTC m=+0.083050715 container create c6362314c7670d3d57ff870063e2fe4b1442b44088cfcfd8393d16950f8fb22f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sad_bohr, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, distribution-scope=public, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, ceph=True, 
org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, CEPH_POINT_RELEASE=, architecture=x86_64, release=553, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-09-24T08:57:55, maintainer=Guillaume Abrioux , vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.expose-services=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, RELEASE=main, version=7, GIT_CLEAN=True, name=rhceph) Oct 14 06:05:36 localhost systemd[1]: Started libpod-conmon-c6362314c7670d3d57ff870063e2fe4b1442b44088cfcfd8393d16950f8fb22f.scope. Oct 14 06:05:37 localhost systemd[1]: Started libcrun container. Oct 14 06:05:37 localhost podman[313894]: 2025-10-14 10:05:36.918429504 +0000 UTC m=+0.050811237 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Oct 14 06:05:37 localhost podman[313894]: 2025-10-14 10:05:37.027384522 +0000 UTC m=+0.159766215 container init c6362314c7670d3d57ff870063e2fe4b1442b44088cfcfd8393d16950f8fb22f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sad_bohr, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, build-date=2025-09-24T08:57:55, release=553, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, maintainer=Guillaume Abrioux , GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_CLEAN=True, architecture=x86_64, io.openshift.expose-services=, CEPH_POINT_RELEASE=, distribution-scope=public, com.redhat.component=rhceph-container, name=rhceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, 
ceph=True, version=7, GIT_BRANCH=main, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.tags=rhceph ceph) Oct 14 06:05:37 localhost podman[313894]: 2025-10-14 10:05:37.037074628 +0000 UTC m=+0.169456331 container start c6362314c7670d3d57ff870063e2fe4b1442b44088cfcfd8393d16950f8fb22f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sad_bohr, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux , GIT_CLEAN=True, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, RELEASE=main, version=7, build-date=2025-09-24T08:57:55, release=553, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.expose-services=, io.buildah.version=1.33.12, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0) Oct 14 06:05:37 localhost podman[313894]: 2025-10-14 10:05:37.037362778 +0000 UTC m=+0.169744521 container attach c6362314c7670d3d57ff870063e2fe4b1442b44088cfcfd8393d16950f8fb22f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sad_bohr, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, com.redhat.component=rhceph-container, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, RELEASE=main, release=553, 
io.openshift.expose-services=, name=rhceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_CLEAN=True, distribution-scope=public, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., ceph=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, architecture=x86_64, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.33.12, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, maintainer=Guillaume Abrioux , CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git) Oct 14 06:05:37 localhost sad_bohr[313909]: 167 167 Oct 14 06:05:37 localhost systemd[1]: libpod-c6362314c7670d3d57ff870063e2fe4b1442b44088cfcfd8393d16950f8fb22f.scope: Deactivated successfully. 
Oct 14 06:05:37 localhost podman[313894]: 2025-10-14 10:05:37.039588205 +0000 UTC m=+0.171969948 container died c6362314c7670d3d57ff870063e2fe4b1442b44088cfcfd8393d16950f8fb22f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sad_bohr, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, version=7, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, maintainer=Guillaume Abrioux , name=rhceph, description=Red Hat Ceph Storage 7, RELEASE=main, build-date=2025-09-24T08:57:55, CEPH_POINT_RELEASE=, vcs-type=git, io.openshift.tags=rhceph ceph, distribution-scope=public, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, GIT_BRANCH=main, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=553, GIT_REPO=https://github.com/ceph/ceph-container.git) Oct 14 06:05:37 localhost systemd[1]: var-lib-containers-storage-overlay-891b1f6dd40cb251f54f1909b55dbf8a34d685ceb2969d0412e56686e920c6dc-merged.mount: Deactivated successfully. 
Oct 14 06:05:37 localhost podman[313914]: 2025-10-14 10:05:37.142983923 +0000 UTC m=+0.090979808 container remove c6362314c7670d3d57ff870063e2fe4b1442b44088cfcfd8393d16950f8fb22f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sad_bohr, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, GIT_BRANCH=main, build-date=2025-09-24T08:57:55, io.buildah.version=1.33.12, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux , url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_CLEAN=True, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, release=553, name=rhceph, RELEASE=main, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d) Oct 14 06:05:37 localhost systemd[1]: libpod-conmon-c6362314c7670d3d57ff870063e2fe4b1442b44088cfcfd8393d16950f8fb22f.scope: Deactivated successfully. 
Oct 14 06:05:37 localhost ceph-mgr[302471]: ms_deliver_dispatch: unhandled message 0x55da4e7fa420 mon_map magic: 0 from mon.1 v2:172.18.0.108:3300/0 Oct 14 06:05:37 localhost ceph-mon[303906]: mon.np0005486733@1(peon) e13 my rank is now 0 (was 1) Oct 14 06:05:37 localhost ceph-mgr[302471]: client.0 ms_handle_reset on v2:172.18.0.108:3300/0 Oct 14 06:05:37 localhost ceph-mgr[302471]: client.0 ms_handle_reset on v2:172.18.0.108:3300/0 Oct 14 06:05:37 localhost ceph-mgr[302471]: ms_deliver_dispatch: unhandled message 0x55da44b97080 mon_map magic: 0 from mon.0 v2:172.18.0.108:3300/0 Oct 14 06:05:37 localhost ceph-mon[303906]: log_channel(cluster) log [INF] : mon.np0005486733 calling monitor election Oct 14 06:05:37 localhost ceph-mon[303906]: paxos.0).electionLogic(52) init, last seen epoch 52 Oct 14 06:05:37 localhost ceph-mon[303906]: mon.np0005486733@0(electing) e13 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Oct 14 06:05:37 localhost nova_compute[297686]: 2025-10-14 10:05:37.713 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:05:38 localhost openstack_network_exporter[250374]: ERROR 10:05:38 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Oct 14 06:05:38 localhost openstack_network_exporter[250374]: ERROR 10:05:38 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Oct 14 06:05:38 localhost openstack_network_exporter[250374]: ERROR 10:05:38 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Oct 14 06:05:38 localhost openstack_network_exporter[250374]: ERROR 10:05:38 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Oct 14 06:05:38 localhost openstack_network_exporter[250374]: Oct 14 06:05:38 localhost openstack_network_exporter[250374]: ERROR 
10:05:38 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Oct 14 06:05:38 localhost openstack_network_exporter[250374]: Oct 14 06:05:40 localhost nova_compute[297686]: 2025-10-14 10:05:40.066 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:05:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041. Oct 14 06:05:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857. Oct 14 06:05:40 localhost podman[313931]: 2025-10-14 10:05:40.423214919 +0000 UTC m=+0.085697026 container health_status 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Oct 14 06:05:40 localhost podman[313932]: 2025-10-14 10:05:40.476983706 +0000 UTC m=+0.137059590 container health_status 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 
(image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image) Oct 14 06:05:40 localhost podman[313932]: 2025-10-14 10:05:40.507863312 +0000 UTC m=+0.167939176 container exec_died 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 
(image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, managed_by=edpm_ansible) Oct 14 06:05:40 localhost systemd[1]: 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857.service: Deactivated successfully. 
Oct 14 06:05:40 localhost podman[313931]: 2025-10-14 10:05:40.559103222 +0000 UTC m=+0.221585279 container exec_died 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Oct 14 06:05:40 localhost systemd[1]: 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041.service: Deactivated successfully. 
Oct 14 06:05:42 localhost ceph-mon[303906]: log_channel(cluster) log [INF] : mon.np0005486733 is new leader, mons np0005486733,np0005486731 in quorum (ranks 0,1) Oct 14 06:05:42 localhost ceph-mon[303906]: log_channel(cluster) log [DBG] : monmap epoch 13 Oct 14 06:05:42 localhost ceph-mon[303906]: log_channel(cluster) log [DBG] : fsid fcadf6e2-9176-5818-a8d0-37b19acf8eaf Oct 14 06:05:42 localhost ceph-mon[303906]: log_channel(cluster) log [DBG] : last_changed 2025-10-14T10:05:37.254667+0000 Oct 14 06:05:42 localhost ceph-mon[303906]: log_channel(cluster) log [DBG] : created 2025-10-14T07:49:51.150761+0000 Oct 14 06:05:42 localhost ceph-mon[303906]: log_channel(cluster) log [DBG] : min_mon_release 18 (reef) Oct 14 06:05:42 localhost ceph-mon[303906]: log_channel(cluster) log [DBG] : election_strategy: 1 Oct 14 06:05:42 localhost ceph-mon[303906]: log_channel(cluster) log [DBG] : 0: [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] mon.np0005486733 Oct 14 06:05:42 localhost ceph-mon[303906]: log_channel(cluster) log [DBG] : 1: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005486731 Oct 14 06:05:42 localhost ceph-mon[303906]: log_channel(cluster) log [DBG] : 2: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005486732 Oct 14 06:05:42 localhost ceph-mon[303906]: mon.np0005486733@0(leader) e13 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Oct 14 06:05:42 localhost ceph-mon[303906]: log_channel(cluster) log [DBG] : fsmap cephfs:1 {0=mds.np0005486732.xkownj=up:active} 2 up:standby Oct 14 06:05:42 localhost ceph-mon[303906]: log_channel(cluster) log [DBG] : osdmap e82: 6 total, 6 up, 6 in Oct 14 06:05:42 localhost ceph-mon[303906]: log_channel(cluster) log [DBG] : mgrmap e30: np0005486732.pasqzz(active, since 30s), standbys: np0005486733.primvu, np0005486728.giajub, np0005486729.xpybho, np0005486730.ddfidc, np0005486731.swasqz Oct 14 06:05:42 localhost ceph-mon[303906]: log_channel(cluster) log [WRN] : Health check 
failed: 1/3 mons down, quorum np0005486733,np0005486731 (MON_DOWN) Oct 14 06:05:42 localhost ceph-mon[303906]: mon.np0005486733@0(leader) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0) Oct 14 06:05:42 localhost ceph-mon[303906]: log_channel(cluster) log [WRN] : Health detail: HEALTH_WARN 2 stray daemon(s) not managed by cephadm; 2 stray host(s) with 2 daemon(s) not managed by cephadm; 1/3 mons down, quorum np0005486733,np0005486731 Oct 14 06:05:42 localhost ceph-mon[303906]: log_channel(cluster) log [WRN] : [WRN] CEPHADM_STRAY_DAEMON: 2 stray daemon(s) not managed by cephadm Oct 14 06:05:42 localhost ceph-mon[303906]: log_channel(cluster) log [WRN] : stray daemon mgr.np0005486728.giajub on host np0005486728.localdomain not managed by cephadm Oct 14 06:05:42 localhost ceph-mon[303906]: log_channel(cluster) log [WRN] : stray daemon mgr.np0005486729.xpybho on host np0005486729.localdomain not managed by cephadm Oct 14 06:05:42 localhost ceph-mon[303906]: log_channel(cluster) log [WRN] : [WRN] CEPHADM_STRAY_HOST: 2 stray host(s) with 2 daemon(s) not managed by cephadm Oct 14 06:05:42 localhost ceph-mon[303906]: log_channel(cluster) log [WRN] : stray host np0005486728.localdomain has 1 stray daemons: ['mgr.np0005486728.giajub'] Oct 14 06:05:42 localhost ceph-mon[303906]: log_channel(cluster) log [WRN] : stray host np0005486729.localdomain has 1 stray daemons: ['mgr.np0005486729.xpybho'] Oct 14 06:05:42 localhost ceph-mon[303906]: log_channel(cluster) log [WRN] : [WRN] MON_DOWN: 1/3 mons down, quorum np0005486733,np0005486731 Oct 14 06:05:42 localhost ceph-mon[303906]: log_channel(cluster) log [WRN] : mon.np0005486732 (rank 2) addr [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] is down (out of quorum) Oct 14 06:05:42 localhost ceph-mon[303906]: log_channel(audit) log [INF] : from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' Oct 14 06:05:42 localhost ceph-mon[303906]: Reconfiguring daemon mon.np0005486733 on 
np0005486733.localdomain Oct 14 06:05:42 localhost ceph-mon[303906]: Remove daemons mon.np0005486730 Oct 14 06:05:42 localhost ceph-mon[303906]: Safe to remove mon.np0005486730: new quorum should be ['np0005486733', 'np0005486731', 'np0005486732'] (from ['np0005486733', 'np0005486731', 'np0005486732']) Oct 14 06:05:42 localhost ceph-mon[303906]: Removing monitor np0005486730 from monmap... Oct 14 06:05:42 localhost ceph-mon[303906]: Removing daemon mon.np0005486730 from np0005486730.localdomain -- ports [] Oct 14 06:05:42 localhost ceph-mon[303906]: from='mgr.17415 172.18.0.107:0/230210271' entity='mgr.np0005486732.pasqzz' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Oct 14 06:05:42 localhost ceph-mon[303906]: mon.np0005486733 calling monitor election Oct 14 06:05:42 localhost ceph-mon[303906]: mon.np0005486731 calling monitor election Oct 14 06:05:42 localhost ceph-mon[303906]: mon.np0005486733 is new leader, mons np0005486733,np0005486731 in quorum (ranks 0,1) Oct 14 06:05:42 localhost ceph-mon[303906]: Health check failed: 1/3 mons down, quorum np0005486733,np0005486731 (MON_DOWN) Oct 14 06:05:42 localhost ceph-mon[303906]: Health detail: HEALTH_WARN 2 stray daemon(s) not managed by cephadm; 2 stray host(s) with 2 daemon(s) not managed by cephadm; 1/3 mons down, quorum np0005486733,np0005486731 Oct 14 06:05:42 localhost ceph-mon[303906]: [WRN] CEPHADM_STRAY_DAEMON: 2 stray daemon(s) not managed by cephadm Oct 14 06:05:42 localhost ceph-mon[303906]: stray daemon mgr.np0005486728.giajub on host np0005486728.localdomain not managed by cephadm Oct 14 06:05:42 localhost ceph-mon[303906]: stray daemon mgr.np0005486729.xpybho on host np0005486729.localdomain not managed by cephadm Oct 14 06:05:42 localhost ceph-mon[303906]: [WRN] CEPHADM_STRAY_HOST: 2 stray host(s) with 2 daemon(s) not managed by cephadm Oct 14 06:05:42 localhost ceph-mon[303906]: stray host np0005486728.localdomain has 1 stray daemons: ['mgr.np0005486728.giajub'] Oct 14 06:05:42 
localhost ceph-mon[303906]: stray host np0005486729.localdomain has 1 stray daemons: ['mgr.np0005486729.xpybho'] Oct 14 06:05:42 localhost ceph-mon[303906]: [WRN] MON_DOWN: 1/3 mons down, quorum np0005486733,np0005486731 Oct 14 06:05:42 localhost ceph-mon[303906]: mon.np0005486732 (rank 2) addr [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] is down (out of quorum) Oct 14 06:05:42 localhost ceph-mon[303906]: from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' Oct 14 06:05:42 localhost ceph-mon[303906]: from='mgr.17415 172.18.0.107:0/230210271' entity='mgr.np0005486732.pasqzz' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Oct 14 06:05:42 localhost nova_compute[297686]: 2025-10-14 10:05:42.747 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:05:43 localhost ceph-mon[303906]: mon.np0005486733@0(leader) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/inventory}] v 0) Oct 14 06:05:43 localhost ceph-mon[303906]: log_channel(audit) log [INF] : from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' Oct 14 06:05:43 localhost ceph-mon[303906]: Deploying daemon mon.np0005486730 on np0005486730.localdomain Oct 14 06:05:43 localhost ceph-mon[303906]: from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' Oct 14 06:05:43 localhost ceph-mon[303906]: mon.np0005486733@0(leader).osd e82 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Oct 14 06:05:43 localhost ceph-mon[303906]: log_channel(cluster) log [INF] : mon.np0005486733 calling monitor election Oct 14 06:05:43 localhost ceph-mon[303906]: paxos.0).electionLogic(55) init, last seen epoch 55, mid-election, bumping Oct 14 06:05:43 localhost ceph-mon[303906]: mon.np0005486733@0(electing) e13 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Oct 14 06:05:43 localhost ceph-mon[303906]: log_channel(cluster) log [INF] : 
mon.np0005486733 is new leader, mons np0005486733,np0005486731,np0005486732 in quorum (ranks 0,1,2) Oct 14 06:05:43 localhost ceph-mon[303906]: log_channel(cluster) log [DBG] : monmap epoch 13 Oct 14 06:05:43 localhost ceph-mon[303906]: log_channel(cluster) log [DBG] : fsid fcadf6e2-9176-5818-a8d0-37b19acf8eaf Oct 14 06:05:43 localhost ceph-mon[303906]: log_channel(cluster) log [DBG] : last_changed 2025-10-14T10:05:37.254667+0000 Oct 14 06:05:43 localhost ceph-mon[303906]: log_channel(cluster) log [DBG] : created 2025-10-14T07:49:51.150761+0000 Oct 14 06:05:43 localhost ceph-mon[303906]: log_channel(cluster) log [DBG] : min_mon_release 18 (reef) Oct 14 06:05:43 localhost ceph-mon[303906]: log_channel(cluster) log [DBG] : election_strategy: 1 Oct 14 06:05:43 localhost ceph-mon[303906]: log_channel(cluster) log [DBG] : 0: [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] mon.np0005486733 Oct 14 06:05:43 localhost ceph-mon[303906]: log_channel(cluster) log [DBG] : 1: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005486731 Oct 14 06:05:43 localhost ceph-mon[303906]: log_channel(cluster) log [DBG] : 2: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005486732 Oct 14 06:05:43 localhost ceph-mon[303906]: mon.np0005486733@0(leader) e13 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Oct 14 06:05:43 localhost ceph-mon[303906]: log_channel(cluster) log [DBG] : fsmap cephfs:1 {0=mds.np0005486732.xkownj=up:active} 2 up:standby Oct 14 06:05:43 localhost ceph-mon[303906]: log_channel(cluster) log [DBG] : osdmap e82: 6 total, 6 up, 6 in Oct 14 06:05:43 localhost ceph-mon[303906]: log_channel(cluster) log [DBG] : mgrmap e30: np0005486732.pasqzz(active, since 31s), standbys: np0005486733.primvu, np0005486728.giajub, np0005486729.xpybho, np0005486730.ddfidc, np0005486731.swasqz Oct 14 06:05:43 localhost ceph-mon[303906]: log_channel(cluster) log [INF] : Health check cleared: MON_DOWN (was: 1/3 mons down, quorum 
np0005486733,np0005486731) Oct 14 06:05:43 localhost ceph-mon[303906]: log_channel(cluster) log [WRN] : Health detail: HEALTH_WARN 2 stray daemon(s) not managed by cephadm; 2 stray host(s) with 2 daemon(s) not managed by cephadm Oct 14 06:05:43 localhost ceph-mon[303906]: log_channel(cluster) log [WRN] : [WRN] CEPHADM_STRAY_DAEMON: 2 stray daemon(s) not managed by cephadm Oct 14 06:05:43 localhost ceph-mon[303906]: log_channel(cluster) log [WRN] : stray daemon mgr.np0005486728.giajub on host np0005486728.localdomain not managed by cephadm Oct 14 06:05:43 localhost ceph-mon[303906]: log_channel(cluster) log [WRN] : stray daemon mgr.np0005486729.xpybho on host np0005486729.localdomain not managed by cephadm Oct 14 06:05:43 localhost ceph-mon[303906]: log_channel(cluster) log [WRN] : [WRN] CEPHADM_STRAY_HOST: 2 stray host(s) with 2 daemon(s) not managed by cephadm Oct 14 06:05:43 localhost ceph-mon[303906]: log_channel(cluster) log [WRN] : stray host np0005486728.localdomain has 1 stray daemons: ['mgr.np0005486728.giajub'] Oct 14 06:05:43 localhost ceph-mon[303906]: log_channel(cluster) log [WRN] : stray host np0005486729.localdomain has 1 stray daemons: ['mgr.np0005486729.xpybho'] Oct 14 06:05:44 localhost ceph-mon[303906]: from='mgr.17415 172.18.0.107:0/230210271' entity='mgr.np0005486732.pasqzz' cmd={"prefix": "mon rm", "name": "np0005486730"} : dispatch Oct 14 06:05:44 localhost ceph-mon[303906]: from='mgr.17415 172.18.0.107:0/230210271' entity='mgr.np0005486732.pasqzz' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Oct 14 06:05:44 localhost ceph-mon[303906]: mon.np0005486732 calling monitor election Oct 14 06:05:44 localhost ceph-mon[303906]: Removed label mon from host np0005486730.localdomain Oct 14 06:05:44 localhost ceph-mon[303906]: mon.np0005486731 calling monitor election Oct 14 06:05:44 localhost ceph-mon[303906]: mon.np0005486733 calling monitor election Oct 14 06:05:44 localhost ceph-mon[303906]: mon.np0005486733 is new leader, mons 
np0005486733,np0005486731,np0005486732 in quorum (ranks 0,1,2) Oct 14 06:05:44 localhost ceph-mon[303906]: Health check cleared: MON_DOWN (was: 1/3 mons down, quorum np0005486733,np0005486731) Oct 14 06:05:44 localhost ceph-mon[303906]: Health detail: HEALTH_WARN 2 stray daemon(s) not managed by cephadm; 2 stray host(s) with 2 daemon(s) not managed by cephadm Oct 14 06:05:44 localhost ceph-mon[303906]: [WRN] CEPHADM_STRAY_DAEMON: 2 stray daemon(s) not managed by cephadm Oct 14 06:05:44 localhost ceph-mon[303906]: stray daemon mgr.np0005486728.giajub on host np0005486728.localdomain not managed by cephadm Oct 14 06:05:44 localhost ceph-mon[303906]: stray daemon mgr.np0005486729.xpybho on host np0005486729.localdomain not managed by cephadm Oct 14 06:05:44 localhost ceph-mon[303906]: [WRN] CEPHADM_STRAY_HOST: 2 stray host(s) with 2 daemon(s) not managed by cephadm Oct 14 06:05:44 localhost ceph-mon[303906]: stray host np0005486728.localdomain has 1 stray daemons: ['mgr.np0005486728.giajub'] Oct 14 06:05:44 localhost ceph-mon[303906]: stray host np0005486729.localdomain has 1 stray daemons: ['mgr.np0005486729.xpybho'] Oct 14 06:05:44 localhost ceph-mon[303906]: mon.np0005486733@0(leader) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/inventory}] v 0) Oct 14 06:05:44 localhost ceph-mon[303906]: log_channel(audit) log [INF] : from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' Oct 14 06:05:45 localhost ceph-mon[303906]: mon.np0005486733@0(leader) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005486730.localdomain.devices.0}] v 0) Oct 14 06:05:45 localhost ceph-mon[303906]: log_channel(audit) log [INF] : from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' Oct 14 06:05:45 localhost ceph-mon[303906]: mon.np0005486733@0(leader) e13 adding peer [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] to list of hints Oct 14 06:05:45 localhost ceph-mon[303906]: mon.np0005486733@0(leader) e13 adding peer 
[v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] to list of hints Oct 14 06:05:45 localhost ceph-mon[303906]: mon.np0005486733@0(leader) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005486730.localdomain}] v 0) Oct 14 06:05:45 localhost ceph-mon[303906]: log_channel(audit) log [INF] : from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' Oct 14 06:05:45 localhost ceph-mon[303906]: mon.np0005486733@0(leader) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/spec.mon}] v 0) Oct 14 06:05:45 localhost ceph-mon[303906]: log_channel(audit) log [INF] : from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' Oct 14 06:05:45 localhost nova_compute[297686]: 2025-10-14 10:05:45.075 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:05:45 localhost ceph-mon[303906]: mon.np0005486733@0(leader) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/spec.mon}] v 0) Oct 14 06:05:45 localhost ceph-mon[303906]: log_channel(audit) log [INF] : from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' Oct 14 06:05:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29. 
Oct 14 06:05:45 localhost ceph-mon[303906]: Removed label mgr from host np0005486730.localdomain Oct 14 06:05:45 localhost ceph-mon[303906]: from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' Oct 14 06:05:45 localhost ceph-mon[303906]: from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' Oct 14 06:05:45 localhost ceph-mon[303906]: from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' Oct 14 06:05:45 localhost ceph-mon[303906]: from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' Oct 14 06:05:45 localhost ceph-mon[303906]: from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' Oct 14 06:05:45 localhost ceph-mon[303906]: mon.np0005486733@0(leader) e13 adding peer [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] to list of hints Oct 14 06:05:45 localhost ceph-mon[303906]: mon.np0005486733@0(leader).monmap v13 adding/updating np0005486730 at [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] to monitor cluster Oct 14 06:05:45 localhost systemd[1]: tmp-crun.HfAW0J.mount: Deactivated successfully. Oct 14 06:05:45 localhost podman[313990]: 2025-10-14 10:05:45.552352614 +0000 UTC m=+0.082265560 container health_status fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, config_id=iscsid, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009) Oct 14 06:05:45 localhost ceph-mgr[302471]: ms_deliver_dispatch: unhandled message 0x55da4e7fa2c0 mon_map magic: 0 from mon.0 v2:172.18.0.108:3300/0 Oct 14 06:05:45 localhost ceph-mon[303906]: log_channel(cluster) log [INF] : mon.np0005486733 calling monitor election Oct 14 06:05:45 localhost ceph-mon[303906]: paxos.0).electionLogic(58) init, last seen epoch 58 Oct 14 06:05:45 localhost ceph-mon[303906]: mon.np0005486733@0(electing) e14 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Oct 14 06:05:45 localhost podman[313990]: 2025-10-14 10:05:45.58419914 +0000 UTC m=+0.114112076 container exec_died fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, tcib_managed=true, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid) Oct 14 06:05:45 localhost systemd[1]: fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29.service: Deactivated successfully. Oct 14 06:05:45 localhost ceph-mon[303906]: mon.np0005486733@0(electing) e14 handle_auth_request failed to assign global_id Oct 14 06:05:45 localhost ceph-mon[303906]: mon.np0005486733@0(electing) e14 handle_auth_request failed to assign global_id Oct 14 06:05:46 localhost ceph-mon[303906]: mon.np0005486733@0(electing) e14 handle_auth_request failed to assign global_id Oct 14 06:05:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad. Oct 14 06:05:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec. 
Oct 14 06:05:46 localhost podman[314010]: 2025-10-14 10:05:46.734372244 +0000 UTC m=+0.078091633 container health_status 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d) Oct 14 06:05:46 localhost podman[314010]: 2025-10-14 10:05:46.751656064 +0000 UTC m=+0.095375453 container exec_died 
02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.build-date=20251009, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS) Oct 14 06:05:46 localhost systemd[1]: 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad.service: Deactivated successfully. 
Oct 14 06:05:46 localhost podman[314011]: 2025-10-14 10:05:46.799004734 +0000 UTC m=+0.133852532 container health_status aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter) Oct 14 06:05:46 localhost podman[314011]: 2025-10-14 10:05:46.810198597 +0000 UTC m=+0.145046355 container exec_died aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 
'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Oct 14 06:05:46 localhost systemd[1]: aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec.service: Deactivated successfully. 
Oct 14 06:05:47 localhost ceph-mon[303906]: mon.np0005486733@0(electing) e14 handle_auth_request failed to assign global_id Oct 14 06:05:47 localhost nova_compute[297686]: 2025-10-14 10:05:47.788 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:05:50 localhost nova_compute[297686]: 2025-10-14 10:05:50.112 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:05:50 localhost ceph-mon[303906]: paxos.0).electionLogic(59) init, last seen epoch 59, mid-election, bumping Oct 14 06:05:50 localhost ceph-mon[303906]: mon.np0005486733@0(electing) e14 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Oct 14 06:05:50 localhost ceph-mon[303906]: log_channel(cluster) log [INF] : mon.np0005486733 is new leader, mons np0005486733,np0005486731,np0005486732,np0005486730 in quorum (ranks 0,1,2,3) Oct 14 06:05:50 localhost ceph-mon[303906]: log_channel(cluster) log [DBG] : monmap epoch 14 Oct 14 06:05:50 localhost ceph-mon[303906]: log_channel(cluster) log [DBG] : fsid fcadf6e2-9176-5818-a8d0-37b19acf8eaf Oct 14 06:05:50 localhost ceph-mon[303906]: log_channel(cluster) log [DBG] : last_changed 2025-10-14T10:05:45.537208+0000 Oct 14 06:05:50 localhost ceph-mon[303906]: log_channel(cluster) log [DBG] : created 2025-10-14T07:49:51.150761+0000 Oct 14 06:05:50 localhost ceph-mon[303906]: log_channel(cluster) log [DBG] : min_mon_release 18 (reef) Oct 14 06:05:50 localhost ceph-mon[303906]: log_channel(cluster) log [DBG] : election_strategy: 1 Oct 14 06:05:50 localhost ceph-mon[303906]: log_channel(cluster) log [DBG] : 0: [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] mon.np0005486733 Oct 14 06:05:50 localhost ceph-mon[303906]: log_channel(cluster) log [DBG] : 1: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005486731 Oct 14 06:05:50 localhost 
ceph-mon[303906]: log_channel(cluster) log [DBG] : 2: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005486732 Oct 14 06:05:50 localhost ceph-mon[303906]: log_channel(cluster) log [DBG] : 3: [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] mon.np0005486730 Oct 14 06:05:50 localhost ceph-mon[303906]: mon.np0005486733@0(leader) e14 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Oct 14 06:05:50 localhost ceph-mon[303906]: log_channel(cluster) log [DBG] : fsmap cephfs:1 {0=mds.np0005486732.xkownj=up:active} 2 up:standby Oct 14 06:05:50 localhost ceph-mon[303906]: log_channel(cluster) log [DBG] : osdmap e82: 6 total, 6 up, 6 in Oct 14 06:05:50 localhost ceph-mon[303906]: log_channel(cluster) log [DBG] : mgrmap e30: np0005486732.pasqzz(active, since 38s), standbys: np0005486733.primvu, np0005486728.giajub, np0005486729.xpybho, np0005486730.ddfidc, np0005486731.swasqz Oct 14 06:05:50 localhost ceph-mon[303906]: mon.np0005486733@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005486730.localdomain.devices.0}] v 0) Oct 14 06:05:50 localhost ceph-mon[303906]: mon.np0005486733@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0) Oct 14 06:05:50 localhost ceph-mon[303906]: log_channel(cluster) log [WRN] : Health detail: HEALTH_WARN 2 stray daemon(s) not managed by cephadm; 2 stray host(s) with 2 daemon(s) not managed by cephadm Oct 14 06:05:50 localhost ceph-mon[303906]: log_channel(cluster) log [WRN] : [WRN] CEPHADM_STRAY_DAEMON: 2 stray daemon(s) not managed by cephadm Oct 14 06:05:50 localhost ceph-mon[303906]: log_channel(cluster) log [WRN] : stray daemon mgr.np0005486728.giajub on host np0005486728.localdomain not managed by cephadm Oct 14 06:05:50 localhost ceph-mon[303906]: log_channel(cluster) log [WRN] : stray daemon mgr.np0005486729.xpybho on host np0005486729.localdomain not managed by cephadm Oct 14 06:05:50 localhost 
ceph-mon[303906]: log_channel(cluster) log [WRN] : [WRN] CEPHADM_STRAY_HOST: 2 stray host(s) with 2 daemon(s) not managed by cephadm Oct 14 06:05:50 localhost ceph-mon[303906]: log_channel(cluster) log [WRN] : stray host np0005486728.localdomain has 1 stray daemons: ['mgr.np0005486728.giajub'] Oct 14 06:05:50 localhost ceph-mon[303906]: log_channel(cluster) log [WRN] : stray host np0005486729.localdomain has 1 stray daemons: ['mgr.np0005486729.xpybho'] Oct 14 06:05:50 localhost ceph-mon[303906]: log_channel(audit) log [INF] : from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' Oct 14 06:05:50 localhost ceph-mon[303906]: log_channel(audit) log [INF] : from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' Oct 14 06:05:50 localhost ceph-mon[303906]: mon.np0005486733@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005486730.localdomain}] v 0) Oct 14 06:05:50 localhost ceph-mon[303906]: mon.np0005486731 calling monitor election Oct 14 06:05:50 localhost ceph-mon[303906]: mon.np0005486732 calling monitor election Oct 14 06:05:50 localhost ceph-mon[303906]: mon.np0005486733 calling monitor election Oct 14 06:05:50 localhost ceph-mon[303906]: mon.np0005486730 calling monitor election Oct 14 06:05:50 localhost ceph-mon[303906]: mon.np0005486733 is new leader, mons np0005486733,np0005486731,np0005486732,np0005486730 in quorum (ranks 0,1,2,3) Oct 14 06:05:50 localhost ceph-mon[303906]: Health detail: HEALTH_WARN 2 stray daemon(s) not managed by cephadm; 2 stray host(s) with 2 daemon(s) not managed by cephadm Oct 14 06:05:50 localhost ceph-mon[303906]: [WRN] CEPHADM_STRAY_DAEMON: 2 stray daemon(s) not managed by cephadm Oct 14 06:05:50 localhost ceph-mon[303906]: stray daemon mgr.np0005486728.giajub on host np0005486728.localdomain not managed by cephadm Oct 14 06:05:50 localhost ceph-mon[303906]: stray daemon mgr.np0005486729.xpybho on host np0005486729.localdomain not managed by cephadm Oct 14 06:05:50 localhost ceph-mon[303906]: [WRN] 
CEPHADM_STRAY_HOST: 2 stray host(s) with 2 daemon(s) not managed by cephadm Oct 14 06:05:50 localhost ceph-mon[303906]: stray host np0005486728.localdomain has 1 stray daemons: ['mgr.np0005486728.giajub'] Oct 14 06:05:50 localhost ceph-mon[303906]: stray host np0005486729.localdomain has 1 stray daemons: ['mgr.np0005486729.xpybho'] Oct 14 06:05:50 localhost ceph-mon[303906]: from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' Oct 14 06:05:50 localhost ceph-mon[303906]: from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' Oct 14 06:05:50 localhost ceph-mon[303906]: log_channel(audit) log [INF] : from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' Oct 14 06:05:50 localhost ceph-mon[303906]: mon.np0005486733@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0) Oct 14 06:05:50 localhost ceph-mon[303906]: log_channel(audit) log [INF] : from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' Oct 14 06:05:50 localhost ceph-mon[303906]: mon.np0005486733@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/inventory}] v 0) Oct 14 06:05:50 localhost ceph-mon[303906]: log_channel(audit) log [INF] : from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' Oct 14 06:05:51 localhost ceph-mon[303906]: from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' Oct 14 06:05:51 localhost ceph-mon[303906]: from='mgr.17415 172.18.0.107:0/230210271' entity='mgr.np0005486732.pasqzz' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Oct 14 06:05:51 localhost ceph-mon[303906]: Removing daemon mgr.np0005486730.ddfidc from np0005486730.localdomain -- ports [8765] Oct 14 06:05:51 localhost ceph-mon[303906]: from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' Oct 14 06:05:51 localhost ceph-mon[303906]: from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' Oct 14 06:05:51 localhost ceph-mon[303906]: Removed label _admin from host np0005486730.localdomain Oct 14 06:05:52 localhost nova_compute[297686]: 2025-10-14 10:05:52.789 
2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:05:53 localhost ceph-mon[303906]: mon.np0005486733@0(leader) e14 handle_command mon_command({"prefix": "auth rm", "entity": "mgr.np0005486730.ddfidc"} v 0) Oct 14 06:05:53 localhost ceph-mon[303906]: log_channel(audit) log [INF] : from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' cmd={"prefix": "auth rm", "entity": "mgr.np0005486730.ddfidc"} : dispatch Oct 14 06:05:53 localhost ceph-mon[303906]: log_channel(audit) log [INF] : from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' cmd='[{"prefix": "auth rm", "entity": "mgr.np0005486730.ddfidc"}]': finished Oct 14 06:05:53 localhost ceph-mon[303906]: mon.np0005486733@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/spec.mgr}] v 0) Oct 14 06:05:53 localhost ceph-mon[303906]: log_channel(audit) log [INF] : from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' Oct 14 06:05:53 localhost ceph-mon[303906]: mon.np0005486733@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/spec.mgr}] v 0) Oct 14 06:05:53 localhost ceph-mon[303906]: log_channel(audit) log [INF] : from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' Oct 14 06:05:53 localhost ceph-mon[303906]: mon.np0005486733@0(leader) e14 handle_command mon_command({"prefix": "mon rm", "name": "np0005486730"} v 0) Oct 14 06:05:53 localhost ceph-mon[303906]: log_channel(audit) log [INF] : from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' cmd={"prefix": "mon rm", "name": "np0005486730"} : dispatch Oct 14 06:05:53 localhost ceph-mgr[302471]: ms_deliver_dispatch: unhandled message 0x55da4e7fa000 mon_map magic: 0 from mon.0 v2:172.18.0.108:3300/0 Oct 14 06:05:53 localhost ceph-mon[303906]: log_channel(cluster) log [INF] : mon.np0005486733 calling monitor election Oct 14 06:05:53 localhost ceph-mon[303906]: paxos.0).electionLogic(62) init, last seen epoch 62 Oct 14 06:05:53 
localhost ceph-mon[303906]: mon.np0005486733@0(electing) e15 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Oct 14 06:05:53 localhost ceph-mon[303906]: log_channel(cluster) log [INF] : mon.np0005486733 is new leader, mons np0005486733,np0005486731,np0005486732 in quorum (ranks 0,1,2) Oct 14 06:05:53 localhost ceph-mon[303906]: log_channel(cluster) log [DBG] : monmap epoch 15 Oct 14 06:05:53 localhost ceph-mon[303906]: log_channel(cluster) log [DBG] : fsid fcadf6e2-9176-5818-a8d0-37b19acf8eaf Oct 14 06:05:53 localhost ceph-mon[303906]: log_channel(cluster) log [DBG] : last_changed 2025-10-14T10:05:53.247163+0000 Oct 14 06:05:53 localhost ceph-mon[303906]: log_channel(cluster) log [DBG] : created 2025-10-14T07:49:51.150761+0000 Oct 14 06:05:53 localhost ceph-mon[303906]: log_channel(cluster) log [DBG] : min_mon_release 18 (reef) Oct 14 06:05:53 localhost ceph-mon[303906]: log_channel(cluster) log [DBG] : election_strategy: 1 Oct 14 06:05:53 localhost ceph-mon[303906]: log_channel(cluster) log [DBG] : 0: [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] mon.np0005486733 Oct 14 06:05:53 localhost ceph-mon[303906]: log_channel(cluster) log [DBG] : 1: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005486731 Oct 14 06:05:53 localhost ceph-mon[303906]: log_channel(cluster) log [DBG] : 2: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005486732 Oct 14 06:05:53 localhost ceph-mon[303906]: mon.np0005486733@0(leader) e15 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Oct 14 06:05:53 localhost ceph-mon[303906]: log_channel(cluster) log [DBG] : fsmap cephfs:1 {0=mds.np0005486732.xkownj=up:active} 2 up:standby Oct 14 06:05:53 localhost ceph-mon[303906]: log_channel(cluster) log [DBG] : osdmap e82: 6 total, 6 up, 6 in Oct 14 06:05:53 localhost ceph-mon[303906]: log_channel(cluster) log [DBG] : mgrmap e30: np0005486732.pasqzz(active, since 41s), standbys: np0005486733.primvu, 
np0005486728.giajub, np0005486729.xpybho, np0005486730.ddfidc, np0005486731.swasqz Oct 14 06:05:53 localhost ceph-mon[303906]: log_channel(cluster) log [WRN] : Health detail: HEALTH_WARN 2 stray daemon(s) not managed by cephadm; 2 stray host(s) with 2 daemon(s) not managed by cephadm Oct 14 06:05:53 localhost ceph-mon[303906]: log_channel(cluster) log [WRN] : [WRN] CEPHADM_STRAY_DAEMON: 2 stray daemon(s) not managed by cephadm Oct 14 06:05:53 localhost ceph-mon[303906]: log_channel(cluster) log [WRN] : stray daemon mgr.np0005486728.giajub on host np0005486728.localdomain not managed by cephadm Oct 14 06:05:53 localhost ceph-mon[303906]: log_channel(cluster) log [WRN] : stray daemon mgr.np0005486729.xpybho on host np0005486729.localdomain not managed by cephadm Oct 14 06:05:53 localhost ceph-mon[303906]: log_channel(cluster) log [WRN] : [WRN] CEPHADM_STRAY_HOST: 2 stray host(s) with 2 daemon(s) not managed by cephadm Oct 14 06:05:53 localhost ceph-mon[303906]: log_channel(cluster) log [WRN] : stray host np0005486728.localdomain has 1 stray daemons: ['mgr.np0005486728.giajub'] Oct 14 06:05:53 localhost ceph-mon[303906]: log_channel(cluster) log [WRN] : stray host np0005486729.localdomain has 1 stray daemons: ['mgr.np0005486729.xpybho'] Oct 14 06:05:53 localhost ceph-mon[303906]: mon.np0005486733@0(leader).osd e82 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Oct 14 06:05:54 localhost ceph-mon[303906]: Safe to remove mon.np0005486730: new quorum should be ['np0005486733', 'np0005486731', 'np0005486732'] (from ['np0005486733', 'np0005486731', 'np0005486732']) Oct 14 06:05:54 localhost ceph-mon[303906]: Removing monitor np0005486730 from monmap... 
Oct 14 06:05:54 localhost ceph-mon[303906]: Removing daemon mon.np0005486730 from np0005486730.localdomain -- ports [] Oct 14 06:05:54 localhost ceph-mon[303906]: mon.np0005486732 calling monitor election Oct 14 06:05:54 localhost ceph-mon[303906]: mon.np0005486731 calling monitor election Oct 14 06:05:54 localhost ceph-mon[303906]: mon.np0005486733 calling monitor election Oct 14 06:05:54 localhost ceph-mon[303906]: mon.np0005486733 is new leader, mons np0005486733,np0005486731,np0005486732 in quorum (ranks 0,1,2) Oct 14 06:05:54 localhost ceph-mon[303906]: Health detail: HEALTH_WARN 2 stray daemon(s) not managed by cephadm; 2 stray host(s) with 2 daemon(s) not managed by cephadm Oct 14 06:05:54 localhost ceph-mon[303906]: [WRN] CEPHADM_STRAY_DAEMON: 2 stray daemon(s) not managed by cephadm Oct 14 06:05:54 localhost ceph-mon[303906]: stray daemon mgr.np0005486728.giajub on host np0005486728.localdomain not managed by cephadm Oct 14 06:05:54 localhost ceph-mon[303906]: stray daemon mgr.np0005486729.xpybho on host np0005486729.localdomain not managed by cephadm Oct 14 06:05:54 localhost ceph-mon[303906]: [WRN] CEPHADM_STRAY_HOST: 2 stray host(s) with 2 daemon(s) not managed by cephadm Oct 14 06:05:54 localhost ceph-mon[303906]: stray host np0005486728.localdomain has 1 stray daemons: ['mgr.np0005486728.giajub'] Oct 14 06:05:54 localhost ceph-mon[303906]: stray host np0005486729.localdomain has 1 stray daemons: ['mgr.np0005486729.xpybho'] Oct 14 06:05:54 localhost ceph-mon[303906]: mon.np0005486733@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/spec.mon}] v 0) Oct 14 06:05:54 localhost ceph-mon[303906]: log_channel(audit) log [INF] : from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' Oct 14 06:05:54 localhost ceph-mon[303906]: mon.np0005486733@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/spec.mon}] v 0) Oct 14 06:05:54 localhost ceph-mon[303906]: log_channel(audit) log [INF] : from='mgr.17415 ' 
entity='mgr.np0005486732.pasqzz' Oct 14 06:05:55 localhost nova_compute[297686]: 2025-10-14 10:05:55.118 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:05:55 localhost ceph-mon[303906]: mon.np0005486733@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0) Oct 14 06:05:55 localhost ceph-mon[303906]: log_channel(audit) log [INF] : from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' Oct 14 06:05:55 localhost ceph-mon[303906]: from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' Oct 14 06:05:55 localhost ceph-mon[303906]: from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' Oct 14 06:05:55 localhost ceph-mon[303906]: from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' Oct 14 06:05:56 localhost ceph-mon[303906]: mon.np0005486733@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005486730.localdomain.devices.0}] v 0) Oct 14 06:05:56 localhost ceph-mon[303906]: log_channel(audit) log [INF] : from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' Oct 14 06:05:56 localhost ceph-mon[303906]: mon.np0005486733@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005486730.localdomain}] v 0) Oct 14 06:05:56 localhost ceph-mon[303906]: log_channel(audit) log [INF] : from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' Oct 14 06:05:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f. Oct 14 06:05:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917. Oct 14 06:05:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d. 
Oct 14 06:05:56 localhost podman[314071]: 2025-10-14 10:05:56.769708115 +0000 UTC m=+0.107895995 container health_status 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251009, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller) Oct 14 06:05:56 localhost podman[314072]: 2025-10-14 10:05:56.732834226 +0000 UTC m=+0.063603190 container health_status 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, vcs-type=git, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., distribution-scope=public, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, 
managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter) Oct 14 06:05:56 localhost podman[314071]: 2025-10-14 10:05:56.793922108 +0000 UTC m=+0.132109968 container exec_died 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251009, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible) Oct 14 06:05:56 localhost podman[314073]: 2025-10-14 10:05:56.751811268 +0000 UTC m=+0.081368104 container health_status 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base 
Image, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20251009, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible) Oct 14 06:05:56 localhost systemd[1]: 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f.service: Deactivated successfully. 
Oct 14 06:05:56 localhost podman[314072]: 2025-10-14 10:05:56.814287271 +0000 UTC m=+0.145056165 container exec_died 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, name=ubi9-minimal, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., managed_by=edpm_ansible, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, io.buildah.version=1.33.7, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', 
'/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers) Oct 14 06:05:56 localhost systemd[1]: 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917.service: Deactivated successfully. Oct 14 06:05:56 localhost podman[314073]: 2025-10-14 10:05:56.83904672 +0000 UTC m=+0.168603566 container exec_died 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.build-date=20251009, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0) Oct 14 06:05:56 localhost systemd[1]: 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d.service: Deactivated successfully. Oct 14 06:05:56 localhost ceph-mon[303906]: mon.np0005486733@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005486730.localdomain.devices.0}] v 0) Oct 14 06:05:56 localhost ceph-mon[303906]: log_channel(audit) log [INF] : from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' Oct 14 06:05:56 localhost ceph-mon[303906]: mon.np0005486733@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005486730.localdomain}] v 0) Oct 14 06:05:57 localhost ceph-mon[303906]: log_channel(audit) log [INF] : from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' Oct 14 06:05:57 localhost ceph-mon[303906]: from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' Oct 14 06:05:57 localhost ceph-mon[303906]: from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' Oct 14 06:05:57 localhost ceph-mon[303906]: from='mgr.17415 172.18.0.107:0/230210271' entity='mgr.np0005486732.pasqzz' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Oct 14 06:05:57 localhost ceph-mon[303906]: Removing np0005486730.localdomain:/var/lib/ceph/fcadf6e2-9176-5818-a8d0-37b19acf8eaf/config/ceph.conf Oct 14 06:05:57 localhost ceph-mon[303906]: Updating 
np0005486731.localdomain:/etc/ceph/ceph.conf Oct 14 06:05:57 localhost ceph-mon[303906]: Updating np0005486732.localdomain:/etc/ceph/ceph.conf Oct 14 06:05:57 localhost ceph-mon[303906]: Updating np0005486733.localdomain:/etc/ceph/ceph.conf Oct 14 06:05:57 localhost ceph-mon[303906]: Removing np0005486730.localdomain:/etc/ceph/ceph.client.admin.keyring Oct 14 06:05:57 localhost ceph-mon[303906]: Removing np0005486730.localdomain:/var/lib/ceph/fcadf6e2-9176-5818-a8d0-37b19acf8eaf/config/ceph.client.admin.keyring Oct 14 06:05:57 localhost ceph-mon[303906]: from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' Oct 14 06:05:57 localhost ceph-mon[303906]: from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' Oct 14 06:05:57 localhost ceph-mon[303906]: Updating np0005486733.localdomain:/var/lib/ceph/fcadf6e2-9176-5818-a8d0-37b19acf8eaf/config/ceph.conf Oct 14 06:05:57 localhost ovn_metadata_agent[163050]: 2025-10-14 10:05:57.774 163055 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Oct 14 06:05:57 localhost ovn_metadata_agent[163050]: 2025-10-14 10:05:57.775 163055 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Oct 14 06:05:57 localhost ovn_metadata_agent[163050]: 2025-10-14 10:05:57.775 163055 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Oct 14 06:05:57 localhost nova_compute[297686]: 2025-10-14 10:05:57.792 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:05:57 localhost ceph-mon[303906]: mon.np0005486733@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005486733.localdomain.devices.0}] v 0) Oct 14 06:05:57 localhost ceph-mon[303906]: log_channel(audit) log [INF] : from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' Oct 14 06:05:57 localhost ceph-mon[303906]: mon.np0005486733@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005486733.localdomain}] v 0) Oct 14 06:05:57 localhost ceph-mon[303906]: log_channel(audit) log [INF] : from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' Oct 14 06:05:58 localhost ceph-mon[303906]: mon.np0005486733@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005486731.localdomain.devices.0}] v 0) Oct 14 06:05:58 localhost ceph-mon[303906]: log_channel(audit) log [INF] : from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' Oct 14 06:05:58 localhost ceph-mon[303906]: mon.np0005486733@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005486731.localdomain}] v 0) Oct 14 06:05:58 localhost ceph-mon[303906]: log_channel(audit) log [INF] : from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' Oct 14 06:05:58 localhost ceph-mon[303906]: mon.np0005486733@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005486732.localdomain.devices.0}] v 0) Oct 14 06:05:58 localhost ceph-mon[303906]: log_channel(audit) log [INF] : from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' Oct 14 06:05:58 localhost ceph-mon[303906]: mon.np0005486733@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005486732.localdomain}] v 0) Oct 14 06:05:58 localhost ceph-mon[303906]: log_channel(audit) log [INF] : from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' Oct 14 06:05:58 localhost ceph-mon[303906]: mon.np0005486733@0(leader) 
e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0) Oct 14 06:05:58 localhost ceph-mon[303906]: log_channel(audit) log [INF] : from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' Oct 14 06:05:58 localhost podman[248187]: time="2025-10-14T10:05:58Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Oct 14 06:05:58 localhost podman[248187]: @ - - [14/Oct/2025:10:05:58 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 147497 "" "Go-http-client/1.1" Oct 14 06:05:58 localhost podman[248187]: @ - - [14/Oct/2025:10:05:58 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19842 "" "Go-http-client/1.1" Oct 14 06:05:58 localhost ceph-mon[303906]: mon.np0005486733@0(leader).osd e82 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Oct 14 06:05:58 localhost ceph-mon[303906]: mon.np0005486733@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.crash.np0005486730.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} v 0) Oct 14 06:05:58 localhost ceph-mon[303906]: log_channel(audit) log [INF] : from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005486730.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Oct 14 06:05:58 localhost ceph-mon[303906]: Updating np0005486731.localdomain:/var/lib/ceph/fcadf6e2-9176-5818-a8d0-37b19acf8eaf/config/ceph.conf Oct 14 06:05:58 localhost ceph-mon[303906]: Updating np0005486732.localdomain:/var/lib/ceph/fcadf6e2-9176-5818-a8d0-37b19acf8eaf/config/ceph.conf Oct 14 06:05:58 localhost ceph-mon[303906]: from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' Oct 14 06:05:58 localhost ceph-mon[303906]: from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' Oct 14 
06:05:58 localhost ceph-mon[303906]: from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' Oct 14 06:05:58 localhost ceph-mon[303906]: from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' Oct 14 06:05:58 localhost ceph-mon[303906]: from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' Oct 14 06:05:58 localhost ceph-mon[303906]: from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' Oct 14 06:05:58 localhost ceph-mon[303906]: from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' Oct 14 06:05:58 localhost ceph-mon[303906]: from='mgr.17415 172.18.0.107:0/230210271' entity='mgr.np0005486732.pasqzz' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005486730.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Oct 14 06:05:58 localhost ceph-mon[303906]: from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005486730.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Oct 14 06:05:59 localhost ceph-mon[303906]: mon.np0005486733@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005486730.localdomain.devices.0}] v 0) Oct 14 06:05:59 localhost ceph-mon[303906]: log_channel(audit) log [INF] : from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' Oct 14 06:05:59 localhost ceph-mon[303906]: mon.np0005486733@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005486730.localdomain}] v 0) Oct 14 06:05:59 localhost ceph-mon[303906]: log_channel(audit) log [INF] : from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' Oct 14 06:05:59 localhost ceph-mon[303906]: mon.np0005486733@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.crash.np0005486731.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} v 0) Oct 14 06:05:59 localhost ceph-mon[303906]: log_channel(audit) log [INF] : from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' 
cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005486731.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Oct 14 06:05:59 localhost ceph-mon[303906]: Reconfiguring crash.np0005486730 (monmap changed)... Oct 14 06:05:59 localhost ceph-mon[303906]: Reconfiguring daemon crash.np0005486730 on np0005486730.localdomain Oct 14 06:05:59 localhost ceph-mon[303906]: from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' Oct 14 06:05:59 localhost ceph-mon[303906]: from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' Oct 14 06:05:59 localhost ceph-mon[303906]: from='mgr.17415 172.18.0.107:0/230210271' entity='mgr.np0005486732.pasqzz' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005486731.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Oct 14 06:05:59 localhost ceph-mon[303906]: from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005486731.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Oct 14 06:06:00 localhost nova_compute[297686]: 2025-10-14 10:06:00.120 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:06:00 localhost ceph-mon[303906]: mon.np0005486733@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005486731.localdomain.devices.0}] v 0) Oct 14 06:06:00 localhost ceph-mon[303906]: log_channel(audit) log [INF] : from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' Oct 14 06:06:00 localhost ceph-mon[303906]: mon.np0005486733@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005486731.localdomain}] v 0) Oct 14 06:06:00 localhost ceph-mon[303906]: log_channel(audit) log [INF] : from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' Oct 14 06:06:00 localhost ceph-mon[303906]: mon.np0005486733@0(leader) e15 
handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0) Oct 14 06:06:00 localhost ceph-mon[303906]: log_channel(audit) log [INF] : from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' Oct 14 06:06:01 localhost ceph-mon[303906]: Reconfiguring crash.np0005486731 (monmap changed)... Oct 14 06:06:01 localhost ceph-mon[303906]: Reconfiguring daemon crash.np0005486731 on np0005486731.localdomain Oct 14 06:06:01 localhost ceph-mon[303906]: from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' Oct 14 06:06:01 localhost ceph-mon[303906]: Reconfiguring osd.2 (monmap changed)... Oct 14 06:06:01 localhost ceph-mon[303906]: from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' Oct 14 06:06:01 localhost ceph-mon[303906]: from='mgr.17415 172.18.0.107:0/230210271' entity='mgr.np0005486732.pasqzz' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch Oct 14 06:06:01 localhost ceph-mon[303906]: Reconfiguring daemon osd.2 on np0005486731.localdomain Oct 14 06:06:01 localhost ceph-mon[303906]: from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' Oct 14 06:06:01 localhost ceph-mon[303906]: mon.np0005486733@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005486731.localdomain.devices.0}] v 0) Oct 14 06:06:01 localhost ceph-mon[303906]: log_channel(audit) log [INF] : from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' Oct 14 06:06:01 localhost ceph-mon[303906]: mon.np0005486733@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005486731.localdomain}] v 0) Oct 14 06:06:01 localhost ceph-mon[303906]: log_channel(audit) log [INF] : from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' Oct 14 06:06:02 localhost ceph-mon[303906]: from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' Oct 14 06:06:02 localhost ceph-mon[303906]: from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' Oct 14 06:06:02 localhost ceph-mon[303906]: from='mgr.17415 172.18.0.107:0/230210271' entity='mgr.np0005486732.pasqzz' 
cmd={"prefix": "auth get", "entity": "osd.4"} : dispatch Oct 14 06:06:02 localhost ceph-mon[303906]: mon.np0005486733@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005486731.localdomain.devices.0}] v 0) Oct 14 06:06:02 localhost ceph-mon[303906]: log_channel(audit) log [INF] : from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' Oct 14 06:06:02 localhost ceph-mon[303906]: mon.np0005486733@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005486731.localdomain}] v 0) Oct 14 06:06:02 localhost ceph-mon[303906]: log_channel(audit) log [INF] : from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' Oct 14 06:06:02 localhost ceph-mon[303906]: mon.np0005486733@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mds.mds.np0005486731.onyaog", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} v 0) Oct 14 06:06:02 localhost ceph-mon[303906]: log_channel(audit) log [INF] : from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005486731.onyaog", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Oct 14 06:06:02 localhost ceph-mon[303906]: mon.np0005486733@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/inventory}] v 0) Oct 14 06:06:02 localhost ceph-mon[303906]: log_channel(audit) log [INF] : from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' Oct 14 06:06:02 localhost ceph-mon[303906]: mon.np0005486733@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/inventory}] v 0) Oct 14 06:06:02 localhost ceph-mon[303906]: log_channel(audit) log [INF] : from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' Oct 14 06:06:02 localhost nova_compute[297686]: 2025-10-14 10:06:02.823 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:06:03 localhost ceph-mon[303906]: mon.np0005486733@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005486731.localdomain.devices.0}] v 0) Oct 14 06:06:03 localhost ceph-mon[303906]: log_channel(audit) log [INF] : from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' Oct 14 06:06:03 localhost ceph-mon[303906]: mon.np0005486733@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005486731.localdomain}] v 0) Oct 14 06:06:03 localhost ceph-mon[303906]: log_channel(audit) log [INF] : from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' Oct 14 06:06:03 localhost ceph-mon[303906]: mon.np0005486733@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mgr.np0005486731.swasqz", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} v 0) Oct 14 06:06:03 localhost ceph-mon[303906]: log_channel(audit) log [INF] : from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005486731.swasqz", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Oct 14 06:06:03 localhost ceph-mon[303906]: Reconfiguring osd.4 (monmap changed)... 
Oct 14 06:06:03 localhost ceph-mon[303906]: Reconfiguring daemon osd.4 on np0005486731.localdomain Oct 14 06:06:03 localhost ceph-mon[303906]: from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' Oct 14 06:06:03 localhost ceph-mon[303906]: from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' Oct 14 06:06:03 localhost ceph-mon[303906]: from='mgr.17415 172.18.0.107:0/230210271' entity='mgr.np0005486732.pasqzz' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005486731.onyaog", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Oct 14 06:06:03 localhost ceph-mon[303906]: from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005486731.onyaog", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Oct 14 06:06:03 localhost ceph-mon[303906]: from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' Oct 14 06:06:03 localhost ceph-mon[303906]: from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' Oct 14 06:06:03 localhost ceph-mon[303906]: from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' Oct 14 06:06:03 localhost ceph-mon[303906]: from='mgr.17415 172.18.0.107:0/230210271' entity='mgr.np0005486732.pasqzz' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005486731.swasqz", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Oct 14 06:06:03 localhost ceph-mon[303906]: from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' Oct 14 06:06:03 localhost ceph-mon[303906]: from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005486731.swasqz", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Oct 14 06:06:03 localhost ceph-mon[303906]: mon.np0005486733@0(leader).osd e82 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Oct 14 06:06:03 localhost ceph-mon[303906]: rocksdb: 
[db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #34. Immutable memtables: 0.
Oct 14 06:06:03 localhost ceph-mon[303906]: rocksdb: (Original Log Time 2025/10/14-10:06:03.436407) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Oct 14 06:06:03 localhost ceph-mon[303906]: rocksdb: [db/flush_job.cc:856] [default] [JOB 17] Flushing memtable with next log file: 34
Oct 14 06:06:03 localhost ceph-mon[303906]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760436363436472, "job": 17, "event": "flush_started", "num_memtables": 1, "num_entries": 1114, "num_deletes": 257, "total_data_size": 1104732, "memory_usage": 1127680, "flush_reason": "Manual Compaction"}
Oct 14 06:06:03 localhost ceph-mon[303906]: rocksdb: [db/flush_job.cc:885] [default] [JOB 17] Level-0 flush table #35: started
Oct 14 06:06:03 localhost ceph-mon[303906]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760436363445364, "cf_name": "default", "job": 17, "event": "table_file_creation", "file_number": 35, "file_size": 930581, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 20792, "largest_seqno": 21901, "table_properties": {"data_size": 925409, "index_size": 2323, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1797, "raw_key_size": 15391, "raw_average_key_size": 22, "raw_value_size": 913365, "raw_average_value_size": 1317, "num_data_blocks": 98, "num_entries": 693, "num_filter_entries": 693, "num_deletions": 257, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1760436336, "oldest_key_time": 1760436336, "file_creation_time": 1760436363, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "84a9ba57-7643-4e19-a68b-d3c5f7942ded", "db_session_id": "ABXPLKSCGGN31DHJT8DW", "orig_file_number": 35, "seqno_to_time_mapping": "N/A"}}
Oct 14 06:06:03 localhost ceph-mon[303906]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 17] Flush lasted 9030 microseconds, and 3826 cpu microseconds.
Oct 14 06:06:03 localhost ceph-mon[303906]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct 14 06:06:03 localhost ceph-mon[303906]: rocksdb: (Original Log Time 2025/10/14-10:06:03.445431) [db/flush_job.cc:967] [default] [JOB 17] Level-0 flush table #35: 930581 bytes OK
Oct 14 06:06:03 localhost ceph-mon[303906]: rocksdb: (Original Log Time 2025/10/14-10:06:03.445462) [db/memtable_list.cc:519] [default] Level-0 commit table #35 started
Oct 14 06:06:03 localhost ceph-mon[303906]: rocksdb: (Original Log Time 2025/10/14-10:06:03.447599) [db/memtable_list.cc:722] [default] Level-0 commit table #35: memtable #1 done
Oct 14 06:06:03 localhost ceph-mon[303906]: rocksdb: (Original Log Time 2025/10/14-10:06:03.447624) EVENT_LOG_v1 {"time_micros": 1760436363447617, "job": 17, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Oct 14 06:06:03 localhost ceph-mon[303906]: rocksdb: (Original Log Time 2025/10/14-10:06:03.447648) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Oct 14 06:06:03 localhost ceph-mon[303906]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 17] Try to delete WAL files size 1098885, prev total WAL file size 1099209, number of live WAL files 2.
Oct 14 06:06:03 localhost ceph-mon[303906]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005486733/store.db/000031.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct 14 06:06:03 localhost ceph-mon[303906]: rocksdb: (Original Log Time 2025/10/14-10:06:03.448348) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0033373632' seq:72057594037927935, type:22 .. '6C6F676D0034303135' seq:0, type:0; will stop at (end)
Oct 14 06:06:03 localhost ceph-mon[303906]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 18] Compacting 1@0 + 1@6 files to L6, score -1.00
Oct 14 06:06:03 localhost ceph-mon[303906]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 17 Base level 0, inputs: [35(908KB)], [33(16MB)]
Oct 14 06:06:03 localhost ceph-mon[303906]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760436363448397, "job": 18, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [35], "files_L6": [33], "score": -1, "input_data_size": 17941197, "oldest_snapshot_seqno": -1}
Oct 14 06:06:03 localhost ceph-mon[303906]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 18] Generated table #36: 11310 keys, 17794311 bytes, temperature: kUnknown
Oct 14 06:06:03 localhost ceph-mon[303906]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760436363523606, "cf_name": "default", "job": 18, "event": "table_file_creation", "file_number": 36, "file_size": 17794311, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 17729496, "index_size": 35466, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 28293, "raw_key_size": 305365, "raw_average_key_size": 26, "raw_value_size": 17536103, "raw_average_value_size": 1550, "num_data_blocks": 1339, "num_entries": 11310, "num_filter_entries": 11310, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1760436113, "oldest_key_time": 0, "file_creation_time": 1760436363, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "84a9ba57-7643-4e19-a68b-d3c5f7942ded", "db_session_id": "ABXPLKSCGGN31DHJT8DW", "orig_file_number": 36, "seqno_to_time_mapping": "N/A"}}
Oct 14 06:06:03 localhost ceph-mon[303906]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct 14 06:06:03 localhost ceph-mon[303906]: rocksdb: (Original Log Time 2025/10/14-10:06:03.524038) [db/compaction/compaction_job.cc:1663] [default] [JOB 18] Compacted 1@0 + 1@6 files to L6 => 17794311 bytes
Oct 14 06:06:03 localhost ceph-mon[303906]: rocksdb: (Original Log Time 2025/10/14-10:06:03.525515) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 238.1 rd, 236.1 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.9, 16.2 +0.0 blob) out(17.0 +0.0 blob), read-write-amplify(38.4) write-amplify(19.1) OK, records in: 11859, records dropped: 549 output_compression: NoCompression
Oct 14 06:06:03 localhost ceph-mon[303906]: rocksdb: (Original Log Time 2025/10/14-10:06:03.525549) EVENT_LOG_v1 {"time_micros": 1760436363525532, "job": 18, "event": "compaction_finished", "compaction_time_micros": 75352, "compaction_time_cpu_micros": 37503, "output_level": 6, "num_output_files": 1, "total_output_size": 17794311, "num_input_records": 11859, "num_output_records": 11310, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Oct 14 06:06:03 localhost ceph-mon[303906]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005486733/store.db/000035.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct 14 06:06:03 localhost ceph-mon[303906]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760436363526033, "job": 18, "event": "table_file_deletion", "file_number": 35}
Oct 14 06:06:03 localhost ceph-mon[303906]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005486733/store.db/000033.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct 14 06:06:03 localhost ceph-mon[303906]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760436363528351, "job": 18, "event": "table_file_deletion", "file_number": 33}
Oct 14 06:06:03 localhost ceph-mon[303906]: rocksdb: (Original Log Time 2025/10/14-10:06:03.448280) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 14 06:06:03 localhost ceph-mon[303906]: rocksdb: (Original Log Time 2025/10/14-10:06:03.528496) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 14 06:06:03 localhost ceph-mon[303906]: rocksdb: (Original Log Time 2025/10/14-10:06:03.528504) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 14 06:06:03 localhost ceph-mon[303906]: rocksdb: (Original Log Time 2025/10/14-10:06:03.528508) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 14 06:06:03 localhost ceph-mon[303906]: rocksdb: (Original Log Time 2025/10/14-10:06:03.528511) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 14 06:06:03 localhost ceph-mon[303906]: rocksdb: (Original Log Time 2025/10/14-10:06:03.528514) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 14 06:06:04 localhost ceph-mon[303906]: mon.np0005486733@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005486731.localdomain.devices.0}] v 0)
Oct 14 06:06:04 localhost ceph-mon[303906]: log_channel(audit) log [INF] : from='mgr.17415 ' entity='mgr.np0005486732.pasqzz'
Oct 14 06:06:04 localhost ceph-mon[303906]: mon.np0005486733@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005486731.localdomain}] v 0)
Oct 14 06:06:04 localhost ceph-mon[303906]: log_channel(audit) log [INF] : from='mgr.17415 ' entity='mgr.np0005486732.pasqzz'
Oct 14 06:06:04 localhost ceph-mon[303906]: Reconfiguring mds.mds.np0005486731.onyaog (monmap changed)...
Oct 14 06:06:04 localhost ceph-mon[303906]: Reconfiguring daemon mds.mds.np0005486731.onyaog on np0005486731.localdomain
Oct 14 06:06:04 localhost ceph-mon[303906]: Added label _no_schedule to host np0005486730.localdomain
Oct 14 06:06:04 localhost ceph-mon[303906]: Added label SpecialHostLabels.DRAIN_CONF_KEYRING to host np0005486730.localdomain
Oct 14 06:06:04 localhost ceph-mon[303906]: Reconfiguring mgr.np0005486731.swasqz (monmap changed)...
Oct 14 06:06:04 localhost ceph-mon[303906]: Reconfiguring daemon mgr.np0005486731.swasqz on np0005486731.localdomain
Oct 14 06:06:04 localhost ceph-mon[303906]: from='mgr.17415 ' entity='mgr.np0005486732.pasqzz'
Oct 14 06:06:04 localhost ceph-mon[303906]: from='mgr.17415 172.18.0.107:0/230210271' entity='mgr.np0005486732.pasqzz' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Oct 14 06:06:04 localhost ceph-mon[303906]: from='mgr.17415 ' entity='mgr.np0005486732.pasqzz'
Oct 14 06:06:04 localhost ceph-mon[303906]: mon.np0005486733@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005486731.localdomain.devices.0}] v 0)
Oct 14 06:06:04 localhost ceph-mon[303906]: log_channel(audit) log [INF] : from='mgr.17415 ' entity='mgr.np0005486732.pasqzz'
Oct 14 06:06:04 localhost ceph-mon[303906]: mon.np0005486733@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005486731.localdomain}] v 0)
Oct 14 06:06:04 localhost ceph-mon[303906]: log_channel(audit) log [INF] : from='mgr.17415 ' entity='mgr.np0005486732.pasqzz'
Oct 14 06:06:04 localhost ceph-mon[303906]: mon.np0005486733@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.crash.np0005486732.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} v 0)
Oct 14 06:06:04 localhost ceph-mon[303906]: log_channel(audit) log [INF] : from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005486732.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Oct 14 06:06:05 localhost ceph-mon[303906]: mon.np0005486733@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/inventory}] v 0)
Oct 14 06:06:05 localhost ceph-mon[303906]: log_channel(audit) log [INF] : from='mgr.17415 ' entity='mgr.np0005486732.pasqzz'
Oct 14 06:06:05 localhost ceph-mon[303906]: mon.np0005486733@0(leader) e15 handle_command mon_command({"prefix":"config-key del","key":"mgr/cephadm/host.np0005486730.localdomain"} v 0)
Oct 14 06:06:05 localhost ceph-mon[303906]: log_channel(audit) log [INF] : from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005486730.localdomain"} : dispatch
Oct 14 06:06:05 localhost ceph-mon[303906]: log_channel(audit) log [INF] : from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/host.np0005486730.localdomain"}]': finished
Oct 14 06:06:05 localhost nova_compute[297686]: 2025-10-14 10:06:05.122 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 14 06:06:05 localhost ceph-mon[303906]: Reconfiguring mon.np0005486731 (monmap changed)...
Oct 14 06:06:05 localhost ceph-mon[303906]: Reconfiguring daemon mon.np0005486731 on np0005486731.localdomain
Oct 14 06:06:05 localhost ceph-mon[303906]: from='mgr.17415 ' entity='mgr.np0005486732.pasqzz'
Oct 14 06:06:05 localhost ceph-mon[303906]: from='mgr.17415 172.18.0.107:0/230210271' entity='mgr.np0005486732.pasqzz' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005486732.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Oct 14 06:06:05 localhost ceph-mon[303906]: from='mgr.17415 ' entity='mgr.np0005486732.pasqzz'
Oct 14 06:06:05 localhost ceph-mon[303906]: from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005486732.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Oct 14 06:06:05 localhost ceph-mon[303906]: from='mgr.17415 172.18.0.107:0/230210271' entity='mgr.np0005486732.pasqzz' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005486730.localdomain"} : dispatch
Oct 14 06:06:05 localhost ceph-mon[303906]: from='mgr.17415 ' entity='mgr.np0005486732.pasqzz'
Oct 14 06:06:05 localhost ceph-mon[303906]: from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005486730.localdomain"} : dispatch
Oct 14 06:06:05 localhost ceph-mon[303906]: from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/host.np0005486730.localdomain"}]': finished
Oct 14 06:06:05 localhost ceph-mon[303906]: mon.np0005486733@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005486732.localdomain.devices.0}] v 0)
Oct 14 06:06:05 localhost ceph-mon[303906]: log_channel(audit) log [INF] : from='mgr.17415 ' entity='mgr.np0005486732.pasqzz'
Oct 14 06:06:05 localhost ceph-mon[303906]: mon.np0005486733@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005486732.localdomain}] v 0)
Oct 14 06:06:05 localhost ceph-mon[303906]: log_channel(audit) log [INF] : from='mgr.17415 ' entity='mgr.np0005486732.pasqzz'
Oct 14 06:06:06 localhost ceph-mon[303906]: Reconfiguring crash.np0005486732 (monmap changed)...
Oct 14 06:06:06 localhost ceph-mon[303906]: Reconfiguring daemon crash.np0005486732 on np0005486732.localdomain
Oct 14 06:06:06 localhost ceph-mon[303906]: Removed host np0005486730.localdomain
Oct 14 06:06:06 localhost ceph-mon[303906]: from='mgr.17415 ' entity='mgr.np0005486732.pasqzz'
Oct 14 06:06:06 localhost ceph-mon[303906]: from='mgr.17415 ' entity='mgr.np0005486732.pasqzz'
Oct 14 06:06:06 localhost ceph-mon[303906]: Reconfiguring osd.1 (monmap changed)...
Oct 14 06:06:06 localhost ceph-mon[303906]: from='mgr.17415 172.18.0.107:0/230210271' entity='mgr.np0005486732.pasqzz' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch
Oct 14 06:06:06 localhost ceph-mon[303906]: Reconfiguring daemon osd.1 on np0005486732.localdomain
Oct 14 06:06:06 localhost ceph-mon[303906]: mon.np0005486733@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005486732.localdomain.devices.0}] v 0)
Oct 14 06:06:06 localhost ceph-mon[303906]: log_channel(audit) log [INF] : from='mgr.17415 ' entity='mgr.np0005486732.pasqzz'
Oct 14 06:06:06 localhost ceph-mon[303906]: mon.np0005486733@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005486732.localdomain}] v 0)
Oct 14 06:06:06 localhost ceph-mon[303906]: log_channel(audit) log [INF] : from='mgr.17415 ' entity='mgr.np0005486732.pasqzz'
Oct 14 06:06:07 localhost sshd[314474]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 06:06:07 localhost systemd[1]: Created slice User Slice of UID 1003.
Oct 14 06:06:07 localhost systemd[1]: Starting User Runtime Directory /run/user/1003...
Oct 14 06:06:07 localhost systemd-logind[760]: New session 70 of user tripleo-admin.
Oct 14 06:06:07 localhost systemd[1]: Finished User Runtime Directory /run/user/1003.
Oct 14 06:06:07 localhost systemd[1]: Starting User Manager for UID 1003...
Oct 14 06:06:07 localhost systemd[314478]: Queued start job for default target Main User Target.
Oct 14 06:06:07 localhost systemd[314478]: Created slice User Application Slice.
Oct 14 06:06:07 localhost systemd[314478]: Started Mark boot as successful after the user session has run 2 minutes.
Oct 14 06:06:07 localhost systemd[314478]: Started Daily Cleanup of User's Temporary Directories.
Oct 14 06:06:07 localhost systemd[314478]: Reached target Paths.
Oct 14 06:06:07 localhost systemd[314478]: Reached target Timers.
Oct 14 06:06:07 localhost systemd[314478]: Starting D-Bus User Message Bus Socket...
Oct 14 06:06:07 localhost systemd[314478]: Starting Create User's Volatile Files and Directories...
Oct 14 06:06:07 localhost systemd[314478]: Listening on D-Bus User Message Bus Socket.
Oct 14 06:06:07 localhost systemd[314478]: Reached target Sockets.
Oct 14 06:06:07 localhost systemd[314478]: Finished Create User's Volatile Files and Directories.
Oct 14 06:06:07 localhost systemd[314478]: Reached target Basic System.
Oct 14 06:06:07 localhost systemd[314478]: Reached target Main User Target.
Oct 14 06:06:07 localhost systemd[314478]: Startup finished in 149ms.
Oct 14 06:06:07 localhost systemd[1]: Started User Manager for UID 1003.
Oct 14 06:06:07 localhost systemd[1]: Started Session 70 of User tripleo-admin.
Oct 14 06:06:07 localhost nova_compute[297686]: 2025-10-14 10:06:07.825 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 14 06:06:07 localhost ceph-mon[303906]: from='mgr.17415 ' entity='mgr.np0005486732.pasqzz'
Oct 14 06:06:07 localhost ceph-mon[303906]: Reconfiguring osd.5 (monmap changed)...
Oct 14 06:06:07 localhost ceph-mon[303906]: from='mgr.17415 172.18.0.107:0/230210271' entity='mgr.np0005486732.pasqzz' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch
Oct 14 06:06:07 localhost ceph-mon[303906]: from='mgr.17415 ' entity='mgr.np0005486732.pasqzz'
Oct 14 06:06:07 localhost ceph-mon[303906]: Reconfiguring daemon osd.5 on np0005486732.localdomain
Oct 14 06:06:07 localhost ceph-mon[303906]: mon.np0005486733@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005486732.localdomain.devices.0}] v 0)
Oct 14 06:06:08 localhost ceph-mon[303906]: log_channel(audit) log [INF] : from='mgr.17415 ' entity='mgr.np0005486732.pasqzz'
Oct 14 06:06:08 localhost ceph-mon[303906]: mon.np0005486733@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005486732.localdomain}] v 0)
Oct 14 06:06:08 localhost ceph-mon[303906]: log_channel(audit) log [INF] : from='mgr.17415 ' entity='mgr.np0005486732.pasqzz'
Oct 14 06:06:08 localhost ceph-mon[303906]: mon.np0005486733@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mds.mds.np0005486732.xkownj", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} v 0)
Oct 14 06:06:08 localhost ceph-mon[303906]: log_channel(audit) log [INF] : from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005486732.xkownj", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Oct 14 06:06:08 localhost python3[314620]: ansible-ansible.builtin.lineinfile Invoked with dest=/etc/os-net-config/tripleo_config.yaml insertafter=172.18.0 line= - ip_netmask: 172.18.0.105/24 backup=True path=/etc/os-net-config/tripleo_config.yaml state=present backrefs=False create=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 06:06:08 localhost ceph-mon[303906]: mon.np0005486733@0(leader).osd e82 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct 14 06:06:08 localhost openstack_network_exporter[250374]: ERROR 10:06:08 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 14 06:06:08 localhost openstack_network_exporter[250374]: ERROR 10:06:08 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 14 06:06:08 localhost openstack_network_exporter[250374]: ERROR 10:06:08 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 14 06:06:08 localhost openstack_network_exporter[250374]: ERROR 10:06:08 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 14 06:06:08 localhost openstack_network_exporter[250374]:
Oct 14 06:06:08 localhost openstack_network_exporter[250374]: ERROR 10:06:08 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 14 06:06:08 localhost openstack_network_exporter[250374]:
Oct 14 06:06:08 localhost ceph-mon[303906]: mon.np0005486733@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005486732.localdomain.devices.0}] v 0)
Oct 14 06:06:08 localhost ceph-mon[303906]: log_channel(audit) log [INF] : from='mgr.17415 ' entity='mgr.np0005486732.pasqzz'
Oct 14 06:06:08 localhost ceph-mon[303906]: mon.np0005486733@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005486732.localdomain}] v 0)
Oct 14 06:06:08 localhost python3[314766]: ansible-ansible.legacy.command Invoked with _raw_params=ip a add 172.18.0.105/24 dev vlan21 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 06:06:08 localhost ceph-mon[303906]: log_channel(audit) log [INF] : from='mgr.17415 ' entity='mgr.np0005486732.pasqzz'
Oct 14 06:06:08 localhost ceph-mon[303906]: mon.np0005486733@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mgr.np0005486732.pasqzz", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} v 0)
Oct 14 06:06:08 localhost ceph-mon[303906]: log_channel(audit) log [INF] : from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005486732.pasqzz", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Oct 14 06:06:09 localhost ceph-mon[303906]: from='mgr.17415 ' entity='mgr.np0005486732.pasqzz'
Oct 14 06:06:09 localhost ceph-mon[303906]: Reconfiguring mds.mds.np0005486732.xkownj (monmap changed)...
Oct 14 06:06:09 localhost ceph-mon[303906]: from='mgr.17415 ' entity='mgr.np0005486732.pasqzz'
Oct 14 06:06:09 localhost ceph-mon[303906]: from='mgr.17415 172.18.0.107:0/230210271' entity='mgr.np0005486732.pasqzz' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005486732.xkownj", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Oct 14 06:06:09 localhost ceph-mon[303906]: from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005486732.xkownj", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Oct 14 06:06:09 localhost ceph-mon[303906]: Reconfiguring daemon mds.mds.np0005486732.xkownj on np0005486732.localdomain
Oct 14 06:06:09 localhost ceph-mon[303906]: from='mgr.17415 ' entity='mgr.np0005486732.pasqzz'
Oct 14 06:06:09 localhost ceph-mon[303906]: from='mgr.17415 ' entity='mgr.np0005486732.pasqzz'
Oct 14 06:06:09 localhost ceph-mon[303906]: from='mgr.17415 172.18.0.107:0/230210271' entity='mgr.np0005486732.pasqzz' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005486732.pasqzz", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Oct 14 06:06:09 localhost ceph-mon[303906]: from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005486732.pasqzz", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Oct 14 06:06:09 localhost python3[314911]: ansible-ansible.legacy.command Invoked with _raw_params=ping -W1 -c 3 172.18.0.105 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 06:06:09 localhost ceph-mon[303906]: mon.np0005486733@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005486732.localdomain.devices.0}] v 0)
Oct 14 06:06:09 localhost ceph-mon[303906]: log_channel(audit) log [INF] : from='mgr.17415 ' entity='mgr.np0005486732.pasqzz'
Oct 14 06:06:09 localhost ceph-mon[303906]: mon.np0005486733@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005486732.localdomain}] v 0)
Oct 14 06:06:09 localhost ceph-mon[303906]: log_channel(audit) log [INF] : from='mgr.17415 ' entity='mgr.np0005486732.pasqzz'
Oct 14 06:06:10 localhost ceph-mon[303906]: Reconfiguring mgr.np0005486732.pasqzz (monmap changed)...
Oct 14 06:06:10 localhost ceph-mon[303906]: Reconfiguring daemon mgr.np0005486732.pasqzz on np0005486732.localdomain
Oct 14 06:06:10 localhost ceph-mon[303906]: from='mgr.17415 ' entity='mgr.np0005486732.pasqzz'
Oct 14 06:06:10 localhost ceph-mon[303906]: from='mgr.17415 ' entity='mgr.np0005486732.pasqzz'
Oct 14 06:06:10 localhost ceph-mon[303906]: from='mgr.17415 172.18.0.107:0/230210271' entity='mgr.np0005486732.pasqzz' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Oct 14 06:06:10 localhost nova_compute[297686]: 2025-10-14 10:06:10.150 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 14 06:06:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041.
Oct 14 06:06:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857.
Oct 14 06:06:10 localhost podman[314914]: 2025-10-14 10:06:10.755473555 +0000 UTC m=+0.088135892 container health_status 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.build-date=20251009, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5)
Oct 14 06:06:10 localhost podman[314914]: 2025-10-14 10:06:10.760183538 +0000 UTC m=+0.092845855 container exec_died 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, container_name=ovn_metadata_agent)
Oct 14 06:06:10 localhost ceph-mon[303906]: mon.np0005486733@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005486732.localdomain.devices.0}] v 0)
Oct 14 06:06:10 localhost ceph-mon[303906]: log_channel(audit) log [INF] : from='mgr.17415 ' entity='mgr.np0005486732.pasqzz'
Oct 14 06:06:10 localhost systemd[1]: 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857.service: Deactivated successfully.
Oct 14 06:06:10 localhost ceph-mon[303906]: mon.np0005486733@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005486732.localdomain}] v 0)
Oct 14 06:06:10 localhost ceph-mon[303906]: log_channel(audit) log [INF] : from='mgr.17415 ' entity='mgr.np0005486732.pasqzz'
Oct 14 06:06:10 localhost ceph-mon[303906]: mon.np0005486733@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.crash.np0005486733.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} v 0)
Oct 14 06:06:10 localhost ceph-mon[303906]: log_channel(audit) log [INF] : from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005486733.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Oct 14 06:06:10 localhost podman[314913]: 2025-10-14 10:06:10.849932028 +0000 UTC m=+0.183012418 container health_status 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible)
Oct 14 06:06:10 localhost podman[314913]: 2025-10-14 10:06:10.88331321 +0000 UTC m=+0.216393680 container exec_died 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct 14 06:06:10 localhost systemd[1]: 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041.service: Deactivated successfully.
Oct 14 06:06:11 localhost ceph-mon[303906]: Reconfiguring mon.np0005486732 (monmap changed)...
Oct 14 06:06:11 localhost ceph-mon[303906]: Reconfiguring daemon mon.np0005486732 on np0005486732.localdomain Oct 14 06:06:11 localhost ceph-mon[303906]: from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' Oct 14 06:06:11 localhost ceph-mon[303906]: from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' Oct 14 06:06:11 localhost ceph-mon[303906]: from='mgr.17415 172.18.0.107:0/230210271' entity='mgr.np0005486732.pasqzz' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005486733.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Oct 14 06:06:11 localhost ceph-mon[303906]: from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005486733.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Oct 14 06:06:11 localhost podman[315008]: Oct 14 06:06:11 localhost podman[315008]: 2025-10-14 10:06:11.455584952 +0000 UTC m=+0.055990796 container create a8840a07051420c0e56c4e6fe2b204a59509779c9a81c7af54c66c8ab19c3d87 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=hungry_hertz, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, build-date=2025-09-24T08:57:55, release=553, io.buildah.version=1.33.12, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/agreements, CEPH_POINT_RELEASE=, vcs-type=git, io.openshift.expose-services=, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, name=rhceph, version=7, 
io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, maintainer=Guillaume Abrioux , RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7) Oct 14 06:06:11 localhost systemd[1]: Started libpod-conmon-a8840a07051420c0e56c4e6fe2b204a59509779c9a81c7af54c66c8ab19c3d87.scope. Oct 14 06:06:11 localhost systemd[1]: Started libcrun container. Oct 14 06:06:11 localhost podman[315008]: 2025-10-14 10:06:11.433019181 +0000 UTC m=+0.033425035 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Oct 14 06:06:11 localhost podman[315008]: 2025-10-14 10:06:11.581557841 +0000 UTC m=+0.181963685 container init a8840a07051420c0e56c4e6fe2b204a59509779c9a81c7af54c66c8ab19c3d87 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=hungry_hertz, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, build-date=2025-09-24T08:57:55, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.component=rhceph-container, distribution-scope=public, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, maintainer=Guillaume Abrioux , vendor=Red Hat, Inc., ceph=True, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=553, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, version=7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, architecture=x86_64, CEPH_POINT_RELEASE=, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhceph) Oct 14 06:06:11 localhost podman[315008]: 2025-10-14 10:06:11.59588505 +0000 UTC m=+0.196290894 container start a8840a07051420c0e56c4e6fe2b204a59509779c9a81c7af54c66c8ab19c3d87 
(image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=hungry_hertz, release=553, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, maintainer=Guillaume Abrioux , io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, name=rhceph, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.component=rhceph-container, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, RELEASE=main, io.openshift.tags=rhceph ceph, build-date=2025-09-24T08:57:55, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_CLEAN=True, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=) Oct 14 06:06:11 localhost podman[315008]: 2025-10-14 10:06:11.596137627 +0000 UTC m=+0.196543491 container attach a8840a07051420c0e56c4e6fe2b204a59509779c9a81c7af54c66c8ab19c3d87 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=hungry_hertz, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, version=7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-type=git, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, release=553, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat 
Ceph Storage 7 on RHEL 9, name=rhceph, build-date=2025-09-24T08:57:55, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, GIT_CLEAN=True, maintainer=Guillaume Abrioux , ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container) Oct 14 06:06:11 localhost hungry_hertz[315023]: 167 167 Oct 14 06:06:11 localhost systemd[1]: libpod-a8840a07051420c0e56c4e6fe2b204a59509779c9a81c7af54c66c8ab19c3d87.scope: Deactivated successfully. Oct 14 06:06:11 localhost podman[315008]: 2025-10-14 10:06:11.60372546 +0000 UTC m=+0.204131334 container died a8840a07051420c0e56c4e6fe2b204a59509779c9a81c7af54c66c8ab19c3d87 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=hungry_hertz, RELEASE=main, maintainer=Guillaume Abrioux , io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, GIT_BRANCH=main, io.buildah.version=1.33.12, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_CLEAN=True, name=rhceph, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, com.redhat.component=rhceph-container, release=553, architecture=x86_64, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-09-24T08:57:55, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-type=git) Oct 14 06:06:11 localhost podman[315028]: 2025-10-14 10:06:11.704881059 +0000 UTC m=+0.089847814 container remove 
a8840a07051420c0e56c4e6fe2b204a59509779c9a81c7af54c66c8ab19c3d87 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=hungry_hertz, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/agreements, release=553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, version=7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, architecture=x86_64, vcs-type=git, io.buildah.version=1.33.12, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, ceph=True, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux , build-date=2025-09-24T08:57:55, GIT_CLEAN=True, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, name=rhceph, vendor=Red Hat, Inc.) Oct 14 06:06:11 localhost systemd[1]: libpod-conmon-a8840a07051420c0e56c4e6fe2b204a59509779c9a81c7af54c66c8ab19c3d87.scope: Deactivated successfully. Oct 14 06:06:11 localhost systemd[1]: var-lib-containers-storage-overlay-4bd78442dd0966babc462291841e23e06df15f962a42293f49bd643300db183f-merged.mount: Deactivated successfully. 
Oct 14 06:06:11 localhost ceph-mon[303906]: mon.np0005486733@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005486733.localdomain.devices.0}] v 0) Oct 14 06:06:11 localhost ceph-mon[303906]: log_channel(audit) log [INF] : from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' Oct 14 06:06:11 localhost ceph-mon[303906]: mon.np0005486733@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005486733.localdomain}] v 0) Oct 14 06:06:11 localhost ceph-mon[303906]: log_channel(audit) log [INF] : from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' Oct 14 06:06:12 localhost ceph-mon[303906]: Reconfiguring crash.np0005486733 (monmap changed)... Oct 14 06:06:12 localhost ceph-mon[303906]: Reconfiguring daemon crash.np0005486733 on np0005486733.localdomain Oct 14 06:06:12 localhost ceph-mon[303906]: from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' Oct 14 06:06:12 localhost ceph-mon[303906]: from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' Oct 14 06:06:12 localhost ceph-mon[303906]: from='mgr.17415 172.18.0.107:0/230210271' entity='mgr.np0005486732.pasqzz' cmd={"prefix": "auth get", "entity": "osd.0"} : dispatch Oct 14 06:06:12 localhost podman[315114]: Oct 14 06:06:12 localhost podman[315114]: 2025-10-14 10:06:12.451466779 +0000 UTC m=+0.078220327 container create b896a2e33920652378bca10288bd6ff2beca3021e86b947bd292089cb20f7329 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=peaceful_mestorf, GIT_REPO=https://github.com/ceph/ceph-container.git, release=553, maintainer=Guillaume Abrioux , url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, version=7, GIT_BRANCH=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_CLEAN=True, com.redhat.component=rhceph-container, 
com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, ceph=True, build-date=2025-09-24T08:57:55, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.33.12, description=Red Hat Ceph Storage 7, architecture=x86_64, vcs-type=git, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3) Oct 14 06:06:12 localhost systemd[1]: Started libpod-conmon-b896a2e33920652378bca10288bd6ff2beca3021e86b947bd292089cb20f7329.scope. Oct 14 06:06:12 localhost systemd[1]: Started libcrun container. Oct 14 06:06:12 localhost podman[315114]: 2025-10-14 10:06:12.515824041 +0000 UTC m=+0.142577599 container init b896a2e33920652378bca10288bd6ff2beca3021e86b947bd292089cb20f7329 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=peaceful_mestorf, vendor=Red Hat, Inc., build-date=2025-09-24T08:57:55, maintainer=Guillaume Abrioux , summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=553, vcs-type=git, CEPH_POINT_RELEASE=, distribution-scope=public, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, name=rhceph, version=7, GIT_BRANCH=main, com.redhat.component=rhceph-container, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, ceph=True, RELEASE=main, io.k8s.description=Red Hat Ceph 
Storage 7) Oct 14 06:06:12 localhost podman[315114]: 2025-10-14 10:06:12.419721237 +0000 UTC m=+0.046474815 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Oct 14 06:06:12 localhost podman[315114]: 2025-10-14 10:06:12.527793587 +0000 UTC m=+0.154547145 container start b896a2e33920652378bca10288bd6ff2beca3021e86b947bd292089cb20f7329 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=peaceful_mestorf, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, ceph=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, release=553, architecture=x86_64, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-09-24T08:57:55, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., io.buildah.version=1.33.12, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, name=rhceph, GIT_BRANCH=main, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux , com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git) Oct 14 06:06:12 localhost peaceful_mestorf[315130]: 167 167 Oct 14 06:06:12 localhost podman[315114]: 2025-10-14 10:06:12.529110508 +0000 UTC m=+0.155864056 container attach b896a2e33920652378bca10288bd6ff2beca3021e86b947bd292089cb20f7329 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=peaceful_mestorf, RELEASE=main, build-date=2025-09-24T08:57:55, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, CEPH_POINT_RELEASE=, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, 
summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, release=553, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux , GIT_BRANCH=main, architecture=x86_64, vcs-type=git, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, com.redhat.component=rhceph-container, ceph=True, name=rhceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, distribution-scope=public) Oct 14 06:06:12 localhost systemd[1]: libpod-b896a2e33920652378bca10288bd6ff2beca3021e86b947bd292089cb20f7329.scope: Deactivated successfully. Oct 14 06:06:12 localhost podman[315114]: 2025-10-14 10:06:12.533907935 +0000 UTC m=+0.160661503 container died b896a2e33920652378bca10288bd6ff2beca3021e86b947bd292089cb20f7329 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=peaceful_mestorf, name=rhceph, GIT_BRANCH=main, ceph=True, GIT_CLEAN=True, io.openshift.expose-services=, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.33.12, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.component=rhceph-container, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, architecture=x86_64, vcs-type=git, version=7, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_REPO=https://github.com/ceph/ceph-container.git, release=553, build-date=2025-09-24T08:57:55, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, 
io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux , distribution-scope=public, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, description=Red Hat Ceph Storage 7) Oct 14 06:06:12 localhost podman[315135]: 2025-10-14 10:06:12.61240925 +0000 UTC m=+0.072192343 container remove b896a2e33920652378bca10288bd6ff2beca3021e86b947bd292089cb20f7329 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=peaceful_mestorf, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vcs-type=git, build-date=2025-09-24T08:57:55, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux , version=7, io.buildah.version=1.33.12, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, name=rhceph, distribution-scope=public, GIT_CLEAN=True, GIT_BRANCH=main, ceph=True, architecture=x86_64, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, release=553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc.) Oct 14 06:06:12 localhost systemd[1]: libpod-conmon-b896a2e33920652378bca10288bd6ff2beca3021e86b947bd292089cb20f7329.scope: Deactivated successfully. Oct 14 06:06:12 localhost systemd[1]: var-lib-containers-storage-overlay-d26fc96e301a1cda266faa61582dca3f22f262a4e20251f3f9fd2c8a4bb6cb26-merged.mount: Deactivated successfully. 
Oct 14 06:06:12 localhost ceph-mon[303906]: mon.np0005486733@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005486733.localdomain.devices.0}] v 0) Oct 14 06:06:12 localhost ceph-mon[303906]: log_channel(audit) log [INF] : from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' Oct 14 06:06:12 localhost ceph-mon[303906]: mon.np0005486733@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005486733.localdomain}] v 0) Oct 14 06:06:12 localhost ceph-mon[303906]: log_channel(audit) log [INF] : from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' Oct 14 06:06:12 localhost nova_compute[297686]: 2025-10-14 10:06:12.851 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:06:12 localhost ceph-mon[303906]: mon.np0005486733@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/spec.mon}] v 0) Oct 14 06:06:12 localhost ceph-mon[303906]: log_channel(audit) log [INF] : from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' Oct 14 06:06:13 localhost ceph-mon[303906]: Reconfiguring osd.0 (monmap changed)... 
Oct 14 06:06:13 localhost ceph-mon[303906]: Reconfiguring daemon osd.0 on np0005486733.localdomain Oct 14 06:06:13 localhost ceph-mon[303906]: from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' Oct 14 06:06:13 localhost ceph-mon[303906]: from='mgr.17415 172.18.0.107:0/230210271' entity='mgr.np0005486732.pasqzz' cmd={"prefix": "auth get", "entity": "osd.3"} : dispatch Oct 14 06:06:13 localhost ceph-mon[303906]: from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' Oct 14 06:06:13 localhost ceph-mon[303906]: from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' Oct 14 06:06:13 localhost podman[315211]: Oct 14 06:06:13 localhost podman[315211]: 2025-10-14 10:06:13.396267342 +0000 UTC m=+0.040125300 container create b168e82f6c7b38fe14910abd10c3aadfcff31e572f4555d3a9de857be1938ded (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=affectionate_blackwell, name=rhceph, build-date=2025-09-24T08:57:55, maintainer=Guillaume Abrioux , io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, vcs-type=git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, ceph=True, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, release=553, RELEASE=main, version=7, vendor=Red Hat, Inc., GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.buildah.version=1.33.12, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, io.openshift.expose-services=, GIT_CLEAN=True, distribution-scope=public) Oct 14 06:06:13 localhost systemd[1]: Started 
libpod-conmon-b168e82f6c7b38fe14910abd10c3aadfcff31e572f4555d3a9de857be1938ded.scope. Oct 14 06:06:13 localhost ceph-mon[303906]: mon.np0005486733@0(leader).osd e82 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Oct 14 06:06:13 localhost systemd[1]: Started libcrun container. Oct 14 06:06:13 localhost podman[315211]: 2025-10-14 10:06:13.451502644 +0000 UTC m=+0.095360642 container init b168e82f6c7b38fe14910abd10c3aadfcff31e572f4555d3a9de857be1938ded (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=affectionate_blackwell, architecture=x86_64, build-date=2025-09-24T08:57:55, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, release=553, vcs-type=git, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux , io.buildah.version=1.33.12, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, name=rhceph, com.redhat.component=rhceph-container, version=7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, GIT_BRANCH=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553) Oct 14 06:06:13 localhost podman[315211]: 2025-10-14 10:06:13.459587312 +0000 UTC m=+0.103445270 container start b168e82f6c7b38fe14910abd10c3aadfcff31e572f4555d3a9de857be1938ded (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=affectionate_blackwell, name=rhceph, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, 
io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, distribution-scope=public, ceph=True, io.buildah.version=1.33.12, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, vendor=Red Hat, Inc., release=553, com.redhat.license_terms=https://www.redhat.com/agreements, version=7, maintainer=Guillaume Abrioux , summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553) Oct 14 06:06:13 localhost podman[315211]: 2025-10-14 10:06:13.459872581 +0000 UTC m=+0.103730569 container attach b168e82f6c7b38fe14910abd10c3aadfcff31e572f4555d3a9de857be1938ded (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=affectionate_blackwell, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux , GIT_CLEAN=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, RELEASE=main, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, build-date=2025-09-24T08:57:55, io.buildah.version=1.33.12, ceph=True, version=7, name=rhceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, release=553, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, 
vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git) Oct 14 06:06:13 localhost affectionate_blackwell[315226]: 167 167 Oct 14 06:06:13 localhost systemd[1]: libpod-b168e82f6c7b38fe14910abd10c3aadfcff31e572f4555d3a9de857be1938ded.scope: Deactivated successfully. Oct 14 06:06:13 localhost podman[315211]: 2025-10-14 10:06:13.461754319 +0000 UTC m=+0.105612317 container died b168e82f6c7b38fe14910abd10c3aadfcff31e572f4555d3a9de857be1938ded (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=affectionate_blackwell, distribution-scope=public, GIT_CLEAN=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux , com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, vcs-type=git, description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, architecture=x86_64, CEPH_POINT_RELEASE=, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=553, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., GIT_BRANCH=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.tags=rhceph ceph, ceph=True, io.openshift.expose-services=, RELEASE=main, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-09-24T08:57:55) Oct 14 06:06:13 localhost podman[315211]: 2025-10-14 10:06:13.381019175 +0000 UTC m=+0.024877153 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Oct 14 06:06:13 localhost podman[315231]: 2025-10-14 10:06:13.531482305 +0000 UTC m=+0.062349342 container remove 
b168e82f6c7b38fe14910abd10c3aadfcff31e572f4555d3a9de857be1938ded (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=affectionate_blackwell, maintainer=Guillaume Abrioux , vendor=Red Hat, Inc., ceph=True, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, build-date=2025-09-24T08:57:55, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_BRANCH=main, io.buildah.version=1.33.12, version=7, com.redhat.component=rhceph-container, io.openshift.expose-services=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, architecture=x86_64) Oct 14 06:06:13 localhost systemd[1]: libpod-conmon-b168e82f6c7b38fe14910abd10c3aadfcff31e572f4555d3a9de857be1938ded.scope: Deactivated successfully. 
Oct 14 06:06:13 localhost ceph-mon[303906]: mon.np0005486733@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005486733.localdomain.devices.0}] v 0) Oct 14 06:06:13 localhost ceph-mon[303906]: log_channel(audit) log [INF] : from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' Oct 14 06:06:13 localhost ceph-mon[303906]: mon.np0005486733@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005486733.localdomain}] v 0) Oct 14 06:06:13 localhost ceph-mon[303906]: log_channel(audit) log [INF] : from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' Oct 14 06:06:13 localhost ceph-mon[303906]: mon.np0005486733@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mds.mds.np0005486733.tvstmf", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} v 0) Oct 14 06:06:13 localhost ceph-mon[303906]: log_channel(audit) log [INF] : from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005486733.tvstmf", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Oct 14 06:06:13 localhost systemd[1]: tmp-crun.r7z5OL.mount: Deactivated successfully. Oct 14 06:06:13 localhost systemd[1]: var-lib-containers-storage-overlay-37b59c5c37543f6a9083e6262171cdfdd11e984cb83d502b81afea31c3bf4c14-merged.mount: Deactivated successfully. Oct 14 06:06:14 localhost ceph-mon[303906]: Reconfiguring osd.3 (monmap changed)... 
Oct 14 06:06:14 localhost ceph-mon[303906]: Reconfiguring daemon osd.3 on np0005486733.localdomain Oct 14 06:06:14 localhost ceph-mon[303906]: Saving service mon spec with placement label:mon Oct 14 06:06:14 localhost ceph-mon[303906]: from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' Oct 14 06:06:14 localhost ceph-mon[303906]: from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' Oct 14 06:06:14 localhost ceph-mon[303906]: from='mgr.17415 172.18.0.107:0/230210271' entity='mgr.np0005486732.pasqzz' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005486733.tvstmf", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Oct 14 06:06:14 localhost ceph-mon[303906]: from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005486733.tvstmf", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Oct 14 06:06:14 localhost nova_compute[297686]: 2025-10-14 10:06:14.255 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:06:14 localhost nova_compute[297686]: 2025-10-14 10:06:14.255 2 DEBUG nova.compute.manager [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m Oct 14 06:06:14 localhost nova_compute[297686]: 2025-10-14 10:06:14.271 2 DEBUG nova.compute.manager [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m Oct 14 06:06:14 localhost podman[315309]: Oct 14 06:06:14 localhost podman[315309]: 2025-10-14 10:06:14.30564436 +0000 UTC m=+0.066518839 container create 
c0eff4cbee7e674226539b682d1bda17bd0bb412011c7a3839b19d6ee2c43e19 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=romantic_proskuriakova, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, release=553, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, ceph=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.component=rhceph-container, version=7, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-09-24T08:57:55, vendor=Red Hat, Inc., RELEASE=main, maintainer=Guillaume Abrioux , name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3) Oct 14 06:06:14 localhost systemd[1]: Started libpod-conmon-c0eff4cbee7e674226539b682d1bda17bd0bb412011c7a3839b19d6ee2c43e19.scope. Oct 14 06:06:14 localhost systemd[1]: Started libcrun container. 
Oct 14 06:06:14 localhost podman[315309]: 2025-10-14 10:06:14.361316725 +0000 UTC m=+0.122191214 container init c0eff4cbee7e674226539b682d1bda17bd0bb412011c7a3839b19d6ee2c43e19 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=romantic_proskuriakova, build-date=2025-09-24T08:57:55, GIT_BRANCH=main, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, GIT_CLEAN=True, maintainer=Guillaume Abrioux , url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, description=Red Hat Ceph Storage 7, release=553, io.buildah.version=1.33.12, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., vcs-type=git, distribution-scope=public, version=7, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, name=rhceph, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/agreements) Oct 14 06:06:14 localhost podman[315309]: 2025-10-14 10:06:14.371539929 +0000 UTC m=+0.132414438 container start c0eff4cbee7e674226539b682d1bda17bd0bb412011c7a3839b19d6ee2c43e19 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=romantic_proskuriakova, io.buildah.version=1.33.12, name=rhceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, maintainer=Guillaume Abrioux 
, vcs-type=git, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_CLEAN=True, release=553, build-date=2025-09-24T08:57:55, GIT_BRANCH=main, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, version=7, architecture=x86_64, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7) Oct 14 06:06:14 localhost podman[315309]: 2025-10-14 10:06:14.371839868 +0000 UTC m=+0.132714377 container attach c0eff4cbee7e674226539b682d1bda17bd0bb412011c7a3839b19d6ee2c43e19 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=romantic_proskuriakova, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, release=553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, build-date=2025-09-24T08:57:55, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , io.buildah.version=1.33.12, name=rhceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, RELEASE=main, io.openshift.expose-services=, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., ceph=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, version=7) Oct 14 06:06:14 localhost romantic_proskuriakova[315326]: 167 167 Oct 14 06:06:14 localhost systemd[1]: 
libpod-c0eff4cbee7e674226539b682d1bda17bd0bb412011c7a3839b19d6ee2c43e19.scope: Deactivated successfully. Oct 14 06:06:14 localhost podman[315309]: 2025-10-14 10:06:14.375223202 +0000 UTC m=+0.136097691 container died c0eff4cbee7e674226539b682d1bda17bd0bb412011c7a3839b19d6ee2c43e19 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=romantic_proskuriakova, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, build-date=2025-09-24T08:57:55, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, maintainer=Guillaume Abrioux , vcs-type=git, release=553, ceph=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.tags=rhceph ceph, distribution-scope=public, io.buildah.version=1.33.12, name=rhceph, com.redhat.component=rhceph-container, architecture=x86_64, vendor=Red Hat, Inc., version=7, io.k8s.description=Red Hat Ceph Storage 7) Oct 14 06:06:14 localhost podman[315309]: 2025-10-14 10:06:14.279395536 +0000 UTC m=+0.040270065 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Oct 14 06:06:14 localhost podman[315331]: 2025-10-14 10:06:14.445133593 +0000 UTC m=+0.062828206 container remove c0eff4cbee7e674226539b682d1bda17bd0bb412011c7a3839b19d6ee2c43e19 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=romantic_proskuriakova, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, io.openshift.expose-services=, RELEASE=main, version=7, 
maintainer=Guillaume Abrioux , release=553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, GIT_CLEAN=True, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-09-24T08:57:55, io.buildah.version=1.33.12, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, architecture=x86_64, io.openshift.tags=rhceph ceph, vcs-type=git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, name=rhceph, distribution-scope=public) Oct 14 06:06:14 localhost systemd[1]: libpod-conmon-c0eff4cbee7e674226539b682d1bda17bd0bb412011c7a3839b19d6ee2c43e19.scope: Deactivated successfully. Oct 14 06:06:14 localhost ceph-mon[303906]: mon.np0005486733@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005486733.localdomain.devices.0}] v 0) Oct 14 06:06:14 localhost ceph-mon[303906]: log_channel(audit) log [INF] : from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' Oct 14 06:06:14 localhost ceph-mon[303906]: mon.np0005486733@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005486733.localdomain}] v 0) Oct 14 06:06:14 localhost ceph-mon[303906]: log_channel(audit) log [INF] : from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' Oct 14 06:06:14 localhost ceph-mon[303906]: mon.np0005486733@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mgr.np0005486733.primvu", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} v 0) Oct 14 06:06:14 localhost ceph-mon[303906]: log_channel(audit) log [INF] : from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' cmd={"prefix": "auth 
get-or-create", "entity": "mgr.np0005486733.primvu", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Oct 14 06:06:14 localhost systemd[1]: tmp-crun.1JcWVq.mount: Deactivated successfully. Oct 14 06:06:14 localhost systemd[1]: var-lib-containers-storage-overlay-b91f92c2e273bb600eee905eff3c4dae651a7995fd5f52c61f1a682591f02a9c-merged.mount: Deactivated successfully. Oct 14 06:06:15 localhost ceph-mon[303906]: Reconfiguring mds.mds.np0005486733.tvstmf (monmap changed)... Oct 14 06:06:15 localhost ceph-mon[303906]: Reconfiguring daemon mds.mds.np0005486733.tvstmf on np0005486733.localdomain Oct 14 06:06:15 localhost ceph-mon[303906]: from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' Oct 14 06:06:15 localhost ceph-mon[303906]: from='mgr.17415 172.18.0.107:0/230210271' entity='mgr.np0005486732.pasqzz' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005486733.primvu", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Oct 14 06:06:15 localhost ceph-mon[303906]: from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' Oct 14 06:06:15 localhost ceph-mon[303906]: from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005486733.primvu", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Oct 14 06:06:15 localhost ceph-mon[303906]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #37. Immutable memtables: 0. 
Oct 14 06:06:15 localhost ceph-mon[303906]: rocksdb: (Original Log Time 2025/10/14-10:06:15.097037) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0 Oct 14 06:06:15 localhost ceph-mon[303906]: rocksdb: [db/flush_job.cc:856] [default] [JOB 19] Flushing memtable with next log file: 37 Oct 14 06:06:15 localhost ceph-mon[303906]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760436375097094, "job": 19, "event": "flush_started", "num_memtables": 1, "num_entries": 685, "num_deletes": 252, "total_data_size": 685484, "memory_usage": 698312, "flush_reason": "Manual Compaction"} Oct 14 06:06:15 localhost ceph-mon[303906]: rocksdb: [db/flush_job.cc:885] [default] [JOB 19] Level-0 flush table #38: started Oct 14 06:06:15 localhost ceph-mon[303906]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760436375103140, "cf_name": "default", "job": 19, "event": "table_file_creation", "file_number": 38, "file_size": 596345, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 21902, "largest_seqno": 22586, "table_properties": {"data_size": 592548, "index_size": 1524, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1221, "raw_key_size": 10166, "raw_average_key_size": 21, "raw_value_size": 584529, "raw_average_value_size": 1246, "num_data_blocks": 63, "num_entries": 469, "num_filter_entries": 469, "num_deletions": 252, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; 
max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1760436363, "oldest_key_time": 1760436363, "file_creation_time": 1760436375, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "84a9ba57-7643-4e19-a68b-d3c5f7942ded", "db_session_id": "ABXPLKSCGGN31DHJT8DW", "orig_file_number": 38, "seqno_to_time_mapping": "N/A"}} Oct 14 06:06:15 localhost ceph-mon[303906]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 19] Flush lasted 6150 microseconds, and 3004 cpu microseconds. Oct 14 06:06:15 localhost ceph-mon[303906]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. Oct 14 06:06:15 localhost ceph-mon[303906]: rocksdb: (Original Log Time 2025/10/14-10:06:15.103193) [db/flush_job.cc:967] [default] [JOB 19] Level-0 flush table #38: 596345 bytes OK Oct 14 06:06:15 localhost ceph-mon[303906]: rocksdb: (Original Log Time 2025/10/14-10:06:15.103217) [db/memtable_list.cc:519] [default] Level-0 commit table #38 started Oct 14 06:06:15 localhost ceph-mon[303906]: rocksdb: (Original Log Time 2025/10/14-10:06:15.106198) [db/memtable_list.cc:722] [default] Level-0 commit table #38: memtable #1 done Oct 14 06:06:15 localhost ceph-mon[303906]: rocksdb: (Original Log Time 2025/10/14-10:06:15.106222) EVENT_LOG_v1 {"time_micros": 1760436375106215, "job": 19, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0} Oct 14 06:06:15 localhost ceph-mon[303906]: rocksdb: (Original Log Time 2025/10/14-10:06:15.106246) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25 Oct 14 06:06:15 localhost ceph-mon[303906]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 19] Try to delete WAL files size 681613, prev total WAL file size 
681613, number of live WAL files 2. Oct 14 06:06:15 localhost ceph-mon[303906]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005486733/store.db/000034.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Oct 14 06:06:15 localhost ceph-mon[303906]: rocksdb: (Original Log Time 2025/10/14-10:06:15.106890) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003131323935' seq:72057594037927935, type:22 .. '7061786F73003131353437' seq:0, type:0; will stop at (end) Oct 14 06:06:15 localhost ceph-mon[303906]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 20] Compacting 1@0 + 1@6 files to L6, score -1.00 Oct 14 06:06:15 localhost ceph-mon[303906]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 19 Base level 0, inputs: [38(582KB)], [36(16MB)] Oct 14 06:06:15 localhost ceph-mon[303906]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760436375106948, "job": 20, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [38], "files_L6": [36], "score": -1, "input_data_size": 18390656, "oldest_snapshot_seqno": -1} Oct 14 06:06:15 localhost nova_compute[297686]: 2025-10-14 10:06:15.153 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:06:15 localhost podman[315401]: Oct 14 06:06:15 localhost podman[315401]: 2025-10-14 10:06:15.202281597 +0000 UTC m=+0.069438828 container create c98dcb266188e58a8c5463bb352c2b2f5c7c897f546c604a097ecfeed4828de8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=laughing_shirley, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.component=rhceph-container, ceph=True, 
com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, vcs-type=git, RELEASE=main, release=553, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, architecture=x86_64, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux , io.openshift.tags=rhceph ceph, GIT_CLEAN=True, CEPH_POINT_RELEASE=) Oct 14 06:06:15 localhost ceph-mon[303906]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 20] Generated table #39: 11253 keys, 15185806 bytes, temperature: kUnknown Oct 14 06:06:15 localhost ceph-mon[303906]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760436375203774, "cf_name": "default", "job": 20, "event": "table_file_creation", "file_number": 39, "file_size": 15185806, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 15123288, "index_size": 33297, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 28165, "raw_key_size": 305009, "raw_average_key_size": 27, "raw_value_size": 14932724, "raw_average_value_size": 1326, "num_data_blocks": 1246, "num_entries": 11253, "num_filter_entries": 11253, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": 
"NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1760436113, "oldest_key_time": 0, "file_creation_time": 1760436375, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "84a9ba57-7643-4e19-a68b-d3c5f7942ded", "db_session_id": "ABXPLKSCGGN31DHJT8DW", "orig_file_number": 39, "seqno_to_time_mapping": "N/A"}} Oct 14 06:06:15 localhost ceph-mon[303906]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. Oct 14 06:06:15 localhost ceph-mon[303906]: rocksdb: (Original Log Time 2025/10/14-10:06:15.203979) [db/compaction/compaction_job.cc:1663] [default] [JOB 20] Compacted 1@0 + 1@6 files to L6 => 15185806 bytes Oct 14 06:06:15 localhost ceph-mon[303906]: rocksdb: (Original Log Time 2025/10/14-10:06:15.207802) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 190.2 rd, 157.1 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.6, 17.0 +0.0 blob) out(14.5 +0.0 blob), read-write-amplify(56.3) write-amplify(25.5) OK, records in: 11779, records dropped: 526 output_compression: NoCompression Oct 14 06:06:15 localhost ceph-mon[303906]: rocksdb: (Original Log Time 2025/10/14-10:06:15.207819) EVENT_LOG_v1 {"time_micros": 1760436375207812, "job": 20, "event": "compaction_finished", "compaction_time_micros": 96691, "compaction_time_cpu_micros": 42885, "output_level": 6, "num_output_files": 1, "total_output_size": 15185806, "num_input_records": 11779, "num_output_records": 11253, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]} Oct 14 06:06:15 localhost ceph-mon[303906]: rocksdb: 
[file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005486733/store.db/000038.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Oct 14 06:06:15 localhost ceph-mon[303906]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760436375208056, "job": 20, "event": "table_file_deletion", "file_number": 38} Oct 14 06:06:15 localhost ceph-mon[303906]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005486733/store.db/000036.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Oct 14 06:06:15 localhost ceph-mon[303906]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760436375209380, "job": 20, "event": "table_file_deletion", "file_number": 36} Oct 14 06:06:15 localhost ceph-mon[303906]: rocksdb: (Original Log Time 2025/10/14-10:06:15.106794) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Oct 14 06:06:15 localhost ceph-mon[303906]: rocksdb: (Original Log Time 2025/10/14-10:06:15.209667) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Oct 14 06:06:15 localhost ceph-mon[303906]: rocksdb: (Original Log Time 2025/10/14-10:06:15.209689) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Oct 14 06:06:15 localhost ceph-mon[303906]: rocksdb: (Original Log Time 2025/10/14-10:06:15.209691) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Oct 14 06:06:15 localhost ceph-mon[303906]: rocksdb: (Original Log Time 2025/10/14-10:06:15.209693) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Oct 14 06:06:15 localhost ceph-mon[303906]: rocksdb: (Original Log Time 2025/10/14-10:06:15.209695) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Oct 14 06:06:15 localhost systemd[1]: Started libpod-conmon-c98dcb266188e58a8c5463bb352c2b2f5c7c897f546c604a097ecfeed4828de8.scope. 
Oct 14 06:06:15 localhost systemd[1]: Started libcrun container. Oct 14 06:06:15 localhost podman[315401]: 2025-10-14 10:06:15.162328074 +0000 UTC m=+0.029485335 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Oct 14 06:06:15 localhost podman[315401]: 2025-10-14 10:06:15.270992443 +0000 UTC m=+0.138149664 container init c98dcb266188e58a8c5463bb352c2b2f5c7c897f546c604a097ecfeed4828de8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=laughing_shirley, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-09-24T08:57:55, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.buildah.version=1.33.12, GIT_BRANCH=main, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, release=553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, ceph=True, description=Red Hat Ceph Storage 7, RELEASE=main, CEPH_POINT_RELEASE=, version=7, io.openshift.tags=rhceph ceph, vcs-type=git, architecture=x86_64, maintainer=Guillaume Abrioux , summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True) Oct 14 06:06:15 localhost laughing_shirley[315416]: 167 167 Oct 14 06:06:15 localhost podman[315401]: 2025-10-14 10:06:15.27940077 +0000 UTC m=+0.146558001 container start c98dcb266188e58a8c5463bb352c2b2f5c7c897f546c604a097ecfeed4828de8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=laughing_shirley, GIT_CLEAN=True, name=rhceph, com.redhat.license_terms=https://www.redhat.com/agreements, release=553, description=Red Hat Ceph Storage 7, distribution-scope=public, version=7, 
summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., vcs-type=git, maintainer=Guillaume Abrioux , vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.buildah.version=1.33.12, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, build-date=2025-09-24T08:57:55, architecture=x86_64, RELEASE=main, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, GIT_BRANCH=main, io.openshift.expose-services=) Oct 14 06:06:15 localhost podman[315401]: 2025-10-14 10:06:15.279610107 +0000 UTC m=+0.146767328 container attach c98dcb266188e58a8c5463bb352c2b2f5c7c897f546c604a097ecfeed4828de8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=laughing_shirley, name=rhceph, vendor=Red Hat, Inc., distribution-scope=public, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux , GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, RELEASE=main, release=553, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, ceph=True, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, version=7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, CEPH_POINT_RELEASE=, build-date=2025-09-24T08:57:55, summary=Provides the latest 
Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7) Oct 14 06:06:15 localhost systemd[1]: libpod-c98dcb266188e58a8c5463bb352c2b2f5c7c897f546c604a097ecfeed4828de8.scope: Deactivated successfully. Oct 14 06:06:15 localhost podman[315401]: 2025-10-14 10:06:15.280934557 +0000 UTC m=+0.148091768 container died c98dcb266188e58a8c5463bb352c2b2f5c7c897f546c604a097ecfeed4828de8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=laughing_shirley, ceph=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, RELEASE=main, io.openshift.expose-services=, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-09-24T08:57:55, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux , GIT_BRANCH=main, version=7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, release=553, io.buildah.version=1.33.12, distribution-scope=public, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, name=rhceph) Oct 14 06:06:15 localhost podman[315421]: 2025-10-14 10:06:15.347267799 +0000 UTC m=+0.062046872 container remove c98dcb266188e58a8c5463bb352c2b2f5c7c897f546c604a097ecfeed4828de8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=laughing_shirley, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux , version=7, distribution-scope=public, GIT_CLEAN=True, 
CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, com.redhat.license_terms=https://www.redhat.com/agreements, release=553, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, io.openshift.tags=rhceph ceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.component=rhceph-container, io.buildah.version=1.33.12, ceph=True, vcs-type=git, architecture=x86_64, build-date=2025-09-24T08:57:55, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, RELEASE=main, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, description=Red Hat Ceph Storage 7) Oct 14 06:06:15 localhost systemd[1]: libpod-conmon-c98dcb266188e58a8c5463bb352c2b2f5c7c897f546c604a097ecfeed4828de8.scope: Deactivated successfully. Oct 14 06:06:15 localhost ceph-mon[303906]: mon.np0005486733@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005486733.localdomain.devices.0}] v 0) Oct 14 06:06:15 localhost ceph-mon[303906]: log_channel(audit) log [INF] : from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' Oct 14 06:06:15 localhost ceph-mon[303906]: mon.np0005486733@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005486733.localdomain}] v 0) Oct 14 06:06:15 localhost ceph-mon[303906]: log_channel(audit) log [INF] : from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' Oct 14 06:06:15 localhost ceph-mon[303906]: mon.np0005486733@0(leader) e15 handle_command mon_command({"prefix": "mon rm", "name": "np0005486733"} v 0) Oct 14 06:06:15 localhost ceph-mon[303906]: log_channel(audit) log [INF] : from='mgr.17415 ' entity='mgr.np0005486732.pasqzz' cmd={"prefix": "mon rm", "name": "np0005486733"} : dispatch Oct 14 06:06:15 localhost ceph-mgr[302471]: ms_deliver_dispatch: unhandled message 
0x55da44b97080 mon_map magic: 0 from mon.0 v2:172.18.0.108:3300/0 Oct 14 06:06:15 localhost ceph-mon[303906]: mon.np0005486733@0(leader) e16 removed from monmap, suicide. Oct 14 06:06:15 localhost ceph-mgr[302471]: client.0 ms_handle_reset on v2:172.18.0.104:3300/0 Oct 14 06:06:15 localhost ceph-mgr[302471]: client.0 ms_handle_reset on v2:172.18.0.104:3300/0 Oct 14 06:06:15 localhost podman[315454]: 2025-10-14 10:06:15.600077413 +0000 UTC m=+0.062196916 container died 294d8462825af3565a6306a21ed744f62ac8682a5af619c3ad636b16a763a986 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-mon-np0005486733, architecture=x86_64, release=553, vendor=Red Hat, Inc., version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, build-date=2025-09-24T08:57:55, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, io.openshift.expose-services=, vcs-type=git, maintainer=Guillaume Abrioux , CEPH_POINT_RELEASE=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.buildah.version=1.33.12, name=rhceph, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=rhceph-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main) Oct 14 06:06:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29. 
Oct 14 06:06:15 localhost podman[315454]: 2025-10-14 10:06:15.638044107 +0000 UTC m=+0.100163540 container remove 294d8462825af3565a6306a21ed744f62ac8682a5af619c3ad636b16a763a986 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-mon-np0005486733, GIT_CLEAN=True, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, vcs-type=git, io.openshift.expose-services=, io.buildah.version=1.33.12, maintainer=Guillaume Abrioux , io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, distribution-scope=public, release=553, vendor=Red Hat, Inc., version=7, GIT_BRANCH=main, ceph=True, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d) Oct 14 06:06:15 localhost ceph-mgr[302471]: --2- 172.18.0.108:0/2728758967 >> [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] conn(0x55da4e463800 0x55da4e43a680 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request get_initial_auth_request returned -2 Oct 14 06:06:15 localhost ceph-mgr[302471]: client.0 ms_handle_reset on v2:172.18.0.103:3300/0 Oct 14 06:06:15 localhost ceph-mgr[302471]: client.0 ms_handle_reset on v2:172.18.0.103:3300/0 Oct 14 06:06:15 localhost ceph-mgr[302471]: ms_deliver_dispatch: unhandled message 0x55da44b97600 mon_map magic: 0 from mon.1 v2:172.18.0.104:3300/0 Oct 14 06:06:15 localhost podman[315487]: 
2025-10-14 10:06:15.72171391 +0000 UTC m=+0.075952969 container health_status fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=iscsid, config_id=iscsid, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image) Oct 14 06:06:15 localhost podman[315487]: 2025-10-14 10:06:15.736034309 +0000 UTC m=+0.090273308 container exec_died fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, 
org.label-schema.schema-version=1.0, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, io.buildah.version=1.41.3, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true) Oct 14 06:06:15 localhost systemd[1]: fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29.service: Deactivated successfully. Oct 14 06:06:15 localhost systemd[1]: var-lib-containers-storage-overlay-b0914f74994d9fdc724c13ecc355cddc457677a1f197adddbcf7bc212124500d-merged.mount: Deactivated successfully. Oct 14 06:06:15 localhost systemd[1]: var-lib-containers-storage-overlay-794e3357b9201828e8ded51a8263ea3f0ad4db588fac4a39a3be526ce9d784ee-merged.mount: Deactivated successfully. 
Oct 14 06:06:16 localhost nova_compute[297686]: 2025-10-14 10:06:16.255 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:06:16 localhost nova_compute[297686]: 2025-10-14 10:06:16.256 2 DEBUG nova.compute.manager [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Cleaning up deleted instances with incomplete migration _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m Oct 14 06:06:16 localhost systemd[1]: ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf@mon.np0005486733.service: Deactivated successfully. Oct 14 06:06:16 localhost systemd[1]: Stopped Ceph mon.np0005486733 for fcadf6e2-9176-5818-a8d0-37b19acf8eaf. Oct 14 06:06:16 localhost systemd[1]: ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf@mon.np0005486733.service: Consumed 12.437s CPU time. Oct 14 06:06:16 localhost systemd[1]: Reloading. Oct 14 06:06:16 localhost systemd-rc-local-generator[315695]: /etc/rc.d/rc.local is not marked executable, skipping. Oct 14 06:06:16 localhost systemd-sysv-generator[315698]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Oct 14 06:06:16 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Oct 14 06:06:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad. Oct 14 06:06:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec. 
Oct 14 06:06:17 localhost podman[315711]: 2025-10-14 10:06:17.059703157 +0000 UTC m=+0.063235907 container health_status aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter) Oct 14 06:06:17 localhost podman[315711]: 2025-10-14 10:06:17.072102898 +0000 UTC m=+0.075635668 container exec_died aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 
'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Oct 14 06:06:17 localhost systemd[1]: aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec.service: Deactivated successfully. 
Oct 14 06:06:17 localhost podman[315710]: 2025-10-14 10:06:17.144425503 +0000 UTC m=+0.146530200 container health_status 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, tcib_managed=true, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=multipathd) Oct 14 06:06:17 localhost podman[315710]: 2025-10-14 10:06:17.181777477 +0000 UTC m=+0.183882174 container exec_died 
02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, managed_by=edpm_ansible, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d) Oct 14 06:06:17 localhost systemd[1]: 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad.service: Deactivated successfully. 
Oct 14 06:06:17 localhost nova_compute[297686]: 2025-10-14 10:06:17.265 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:06:17 localhost nova_compute[297686]: 2025-10-14 10:06:17.860 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:06:18 localhost nova_compute[297686]: 2025-10-14 10:06:18.250 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:06:18 localhost nova_compute[297686]: 2025-10-14 10:06:18.274 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:06:19 localhost nova_compute[297686]: 2025-10-14 10:06:19.255 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:06:19 localhost nova_compute[297686]: 2025-10-14 10:06:19.256 2 DEBUG nova.compute.manager [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... 
_reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Oct 14 06:06:20 localhost nova_compute[297686]: 2025-10-14 10:06:20.204 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:06:20 localhost nova_compute[297686]: 2025-10-14 10:06:20.256 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:06:21 localhost nova_compute[297686]: 2025-10-14 10:06:21.255 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:06:21 localhost nova_compute[297686]: 2025-10-14 10:06:21.256 2 DEBUG nova.compute.manager [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Oct 14 06:06:21 localhost nova_compute[297686]: 2025-10-14 10:06:21.256 2 DEBUG nova.compute.manager [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Oct 14 06:06:22 localhost nova_compute[297686]: 2025-10-14 10:06:22.350 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Acquiring lock "refresh_cache-88c4e366-b765-47a6-96bf-f7677f2ce67c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Oct 14 06:06:22 localhost nova_compute[297686]: 2025-10-14 10:06:22.351 2 DEBUG oslo_concurrency.lockutils [None 
req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Acquired lock "refresh_cache-88c4e366-b765-47a6-96bf-f7677f2ce67c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Oct 14 06:06:22 localhost nova_compute[297686]: 2025-10-14 10:06:22.351 2 DEBUG nova.network.neutron [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] [instance: 88c4e366-b765-47a6-96bf-f7677f2ce67c] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Oct 14 06:06:22 localhost nova_compute[297686]: 2025-10-14 10:06:22.352 2 DEBUG nova.objects.instance [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Lazy-loading 'info_cache' on Instance uuid 88c4e366-b765-47a6-96bf-f7677f2ce67c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Oct 14 06:06:22 localhost nova_compute[297686]: 2025-10-14 10:06:22.890 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:06:25 localhost nova_compute[297686]: 2025-10-14 10:06:25.207 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:06:25 localhost nova_compute[297686]: 2025-10-14 10:06:25.412 2 DEBUG nova.network.neutron [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] [instance: 88c4e366-b765-47a6-96bf-f7677f2ce67c] Updating instance_info_cache with network_info: [{"id": "3ec9b060-f43d-4698-9c76-6062c70911d5", "address": "fa:16:3e:84:5e:e5", "network": {"id": "7d0cd696-bdd7-4e70-9512-eb0d23640314", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.46", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": 
"floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "41187b090f3d4818a32baa37ce8a3991", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3ec9b060-f4", "ovs_interfaceid": "3ec9b060-f43d-4698-9c76-6062c70911d5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Oct 14 06:06:25 localhost nova_compute[297686]: 2025-10-14 10:06:25.457 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Releasing lock "refresh_cache-88c4e366-b765-47a6-96bf-f7677f2ce67c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Oct 14 06:06:25 localhost nova_compute[297686]: 2025-10-14 10:06:25.457 2 DEBUG nova.compute.manager [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] [instance: 88c4e366-b765-47a6-96bf-f7677f2ce67c] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Oct 14 06:06:25 localhost nova_compute[297686]: 2025-10-14 10:06:25.458 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:06:25 localhost nova_compute[297686]: 2025-10-14 10:06:25.459 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks 
/usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:06:25 localhost nova_compute[297686]: 2025-10-14 10:06:25.459 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:06:25 localhost nova_compute[297686]: 2025-10-14 10:06:25.460 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:06:25 localhost nova_compute[297686]: 2025-10-14 10:06:25.489 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Oct 14 06:06:25 localhost nova_compute[297686]: 2025-10-14 10:06:25.490 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Oct 14 06:06:25 localhost nova_compute[297686]: 2025-10-14 10:06:25.490 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Oct 14 06:06:25 localhost nova_compute[297686]: 2025-10-14 10:06:25.491 2 DEBUG nova.compute.resource_tracker [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] 
Auditing locally available compute resources for np0005486733.localdomain (node: np0005486733.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Oct 14 06:06:25 localhost nova_compute[297686]: 2025-10-14 10:06:25.491 2 DEBUG oslo_concurrency.processutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Oct 14 06:06:25 localhost nova_compute[297686]: 2025-10-14 10:06:25.930 2 DEBUG oslo_concurrency.processutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.439s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Oct 14 06:06:25 localhost nova_compute[297686]: 2025-10-14 10:06:25.979 2 DEBUG nova.virt.libvirt.driver [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Oct 14 06:06:25 localhost nova_compute[297686]: 2025-10-14 10:06:25.979 2 DEBUG nova.virt.libvirt.driver [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Oct 14 06:06:26 localhost nova_compute[297686]: 2025-10-14 10:06:26.194 2 WARNING nova.virt.libvirt.driver [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Oct 14 06:06:26 localhost nova_compute[297686]: 2025-10-14 10:06:26.196 2 DEBUG nova.compute.resource_tracker [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Hypervisor/Node resource view: name=np0005486733.localdomain free_ram=11454MB free_disk=41.83695602416992GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": 
"1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Oct 14 06:06:26 localhost nova_compute[297686]: 2025-10-14 10:06:26.196 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Oct 14 06:06:26 localhost nova_compute[297686]: 2025-10-14 10:06:26.197 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Oct 14 06:06:26 localhost nova_compute[297686]: 2025-10-14 10:06:26.585 2 DEBUG nova.compute.resource_tracker [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Instance 88c4e366-b765-47a6-96bf-f7677f2ce67c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Oct 14 06:06:26 localhost nova_compute[297686]: 2025-10-14 10:06:26.586 2 DEBUG nova.compute.resource_tracker [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Oct 14 06:06:26 localhost nova_compute[297686]: 2025-10-14 10:06:26.586 2 DEBUG nova.compute.resource_tracker [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Final resource view: name=np0005486733.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Oct 14 06:06:26 localhost nova_compute[297686]: 2025-10-14 10:06:26.946 2 DEBUG nova.scheduler.client.report [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Refreshing inventories for resource provider 18c24273-aca2-4f08-be57-3188d558235e _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m Oct 14 06:06:27 localhost nova_compute[297686]: 2025-10-14 10:06:27.127 2 DEBUG nova.scheduler.client.report [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Updating ProviderTree inventory for provider 18c24273-aca2-4f08-be57-3188d558235e from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m Oct 14 06:06:27 localhost 
nova_compute[297686]: 2025-10-14 10:06:27.127 2 DEBUG nova.compute.provider_tree [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Updating inventory in ProviderTree for provider 18c24273-aca2-4f08-be57-3188d558235e with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m Oct 14 06:06:27 localhost nova_compute[297686]: 2025-10-14 10:06:27.145 2 DEBUG nova.scheduler.client.report [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Refreshing aggregate associations for resource provider 18c24273-aca2-4f08-be57-3188d558235e, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m Oct 14 06:06:27 localhost nova_compute[297686]: 2025-10-14 10:06:27.172 2 DEBUG nova.scheduler.client.report [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Refreshing trait associations for resource provider 18c24273-aca2-4f08-be57-3188d558235e, traits: 
COMPUTE_VOLUME_EXTEND,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE,HW_CPU_X86_BMI,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_MMX,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_SECURITY_TPM_2_0,HW_CPU_X86_FMA3,COMPUTE_GRAPHICS_MODEL_BOCHS,HW_CPU_X86_AVX2,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_SECURITY_TPM_1_2,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_AMD_SVM,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_ACCELERATORS,HW_CPU_X86_AESNI,COMPUTE_STORAGE_BUS_FDC,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_F16C,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SVM,COMPUTE_STORAGE_BUS_IDE,HW_CPU_X86_SSE4A,HW_CPU_X86_SSSE3,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NODE,HW_CPU_X86_BMI2,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_TRUSTED_CERTS,COMPUTE_STORAGE_BUS_SATA,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_STORAGE_BUS_USB,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_ABM,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_SSE2,HW_CPU_X86_AVX,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_CLMUL,HW_CPU_X86_SHA,COMPUTE_IMAGE_TYPE_RAW _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m Oct 14 06:06:27 localhost nova_compute[297686]: 2025-10-14 10:06:27.220 2 DEBUG oslo_concurrency.processutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Oct 14 06:06:27 localhost nova_compute[297686]: 2025-10-14 10:06:27.648 2 DEBUG oslo_concurrency.processutils [None 
req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.428s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Oct 14 06:06:27 localhost nova_compute[297686]: 2025-10-14 10:06:27.655 2 DEBUG nova.compute.provider_tree [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Inventory has not changed in ProviderTree for provider: 18c24273-aca2-4f08-be57-3188d558235e update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Oct 14 06:06:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f. Oct 14 06:06:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917. Oct 14 06:06:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d. Oct 14 06:06:27 localhost podman[316079]: 2025-10-14 10:06:27.739613363 +0000 UTC m=+0.070713688 container health_status 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, managed_by=edpm_ansible, name=ubi9-minimal, version=9.6, distribution-scope=public, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, config_id=edpm, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter) Oct 14 06:06:27 localhost systemd[1]: tmp-crun.Y9JfbW.mount: Deactivated successfully. 
Oct 14 06:06:27 localhost podman[316078]: 2025-10-14 10:06:27.758833031 +0000 UTC m=+0.089620656 container health_status 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.schema-version=1.0) Oct 14 06:06:27 localhost nova_compute[297686]: 2025-10-14 10:06:27.771 2 DEBUG nova.scheduler.client.report [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Inventory has not changed for provider 18c24273-aca2-4f08-be57-3188d558235e based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': 
{'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Oct 14 06:06:27 localhost nova_compute[297686]: 2025-10-14 10:06:27.774 2 DEBUG nova.compute.resource_tracker [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Compute_service record updated for np0005486733.localdomain:np0005486733.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Oct 14 06:06:27 localhost nova_compute[297686]: 2025-10-14 10:06:27.774 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.577s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Oct 14 06:06:27 localhost nova_compute[297686]: 2025-10-14 10:06:27.775 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:06:27 localhost podman[316079]: 2025-10-14 10:06:27.820840111 +0000 UTC m=+0.151940456 container exec_died 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, container_name=openstack_network_exporter, release=1755695350, maintainer=Red Hat, Inc., name=ubi9-minimal, version=9.6, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 
'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., distribution-scope=public) Oct 14 06:06:27 localhost podman[316080]: 2025-10-14 10:06:27.828796395 +0000 UTC m=+0.153107041 container health_status 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d) Oct 14 06:06:27 localhost systemd[1]: 
799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917.service: Deactivated successfully. Oct 14 06:06:27 localhost podman[316080]: 2025-10-14 10:06:27.844214458 +0000 UTC m=+0.168525124 container exec_died 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true) Oct 14 06:06:27 localhost systemd[1]: 
89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d.service: Deactivated successfully. Oct 14 06:06:27 localhost podman[316078]: 2025-10-14 10:06:27.862633441 +0000 UTC m=+0.193420996 container exec_died 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS) Oct 14 06:06:27 localhost systemd[1]: 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f.service: Deactivated successfully. 
Oct 14 06:06:27 localhost nova_compute[297686]: 2025-10-14 10:06:27.892 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:06:28 localhost podman[316174]: Oct 14 06:06:28 localhost podman[316174]: 2025-10-14 10:06:28.19193672 +0000 UTC m=+0.062153285 container create 0fd7778de9af47388223a62b8369698bc1da7ac1e27274cb52516c2915709858 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=frosty_borg, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux , architecture=x86_64, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, io.buildah.version=1.33.12, vendor=Red Hat, Inc., io.openshift.expose-services=, description=Red Hat Ceph Storage 7, release=553, version=7, com.redhat.license_terms=https://www.redhat.com/agreements, RELEASE=main, io.openshift.tags=rhceph ceph, build-date=2025-09-24T08:57:55, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, com.redhat.component=rhceph-container, GIT_BRANCH=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public) Oct 14 06:06:28 localhost systemd[1]: Started libpod-conmon-0fd7778de9af47388223a62b8369698bc1da7ac1e27274cb52516c2915709858.scope. Oct 14 06:06:28 localhost systemd[1]: Started libcrun container. 
Oct 14 06:06:28 localhost podman[316174]: 2025-10-14 10:06:28.258042606 +0000 UTC m=+0.128259171 container init 0fd7778de9af47388223a62b8369698bc1da7ac1e27274cb52516c2915709858 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=frosty_borg, ceph=True, version=7, maintainer=Guillaume Abrioux , architecture=x86_64, vendor=Red Hat, Inc., GIT_BRANCH=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, distribution-scope=public, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.33.12, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, release=553, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, GIT_CLEAN=True, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0) Oct 14 06:06:28 localhost podman[316174]: 2025-10-14 10:06:28.160460966 +0000 UTC m=+0.030677561 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Oct 14 06:06:28 localhost podman[316174]: 2025-10-14 10:06:28.267102923 +0000 UTC m=+0.137319468 container start 0fd7778de9af47388223a62b8369698bc1da7ac1e27274cb52516c2915709858 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=frosty_borg, io.openshift.expose-services=, io.buildah.version=1.33.12, name=rhceph, maintainer=Guillaume Abrioux , architecture=x86_64, vendor=Red Hat, Inc., build-date=2025-09-24T08:57:55, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, 
org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, version=7, distribution-scope=public, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, release=553, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container) Oct 14 06:06:28 localhost podman[316174]: 2025-10-14 10:06:28.267359701 +0000 UTC m=+0.137576296 container attach 0fd7778de9af47388223a62b8369698bc1da7ac1e27274cb52516c2915709858 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=frosty_borg, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vcs-type=git, CEPH_POINT_RELEASE=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.33.12, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, GIT_CLEAN=True, com.redhat.component=rhceph-container, build-date=2025-09-24T08:57:55, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, release=553, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, name=rhceph, vendor=Red Hat, Inc., architecture=x86_64, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_BRANCH=main, RELEASE=main, 
ceph=True, io.openshift.expose-services=) Oct 14 06:06:28 localhost frosty_borg[316189]: 167 167 Oct 14 06:06:28 localhost podman[316174]: 2025-10-14 10:06:28.27222161 +0000 UTC m=+0.142438185 container died 0fd7778de9af47388223a62b8369698bc1da7ac1e27274cb52516c2915709858 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=frosty_borg, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux , vendor=Red Hat, Inc., architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.expose-services=, ceph=True, build-date=2025-09-24T08:57:55, distribution-scope=public, GIT_CLEAN=True, vcs-type=git, io.buildah.version=1.33.12, GIT_BRANCH=main, CEPH_POINT_RELEASE=, release=553, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7) Oct 14 06:06:28 localhost systemd[1]: libpod-0fd7778de9af47388223a62b8369698bc1da7ac1e27274cb52516c2915709858.scope: Deactivated successfully. 
Oct 14 06:06:28 localhost podman[248187]: time="2025-10-14T10:06:28Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Oct 14 06:06:28 localhost podman[248187]: @ - - [14/Oct/2025:10:06:28 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 147008 "" "Go-http-client/1.1" Oct 14 06:06:28 localhost podman[248187]: @ - - [14/Oct/2025:10:06:28 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19668 "" "Go-http-client/1.1" Oct 14 06:06:28 localhost podman[316194]: 2025-10-14 10:06:28.436844922 +0000 UTC m=+0.151400040 container remove 0fd7778de9af47388223a62b8369698bc1da7ac1e27274cb52516c2915709858 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=frosty_borg, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, version=7, build-date=2025-09-24T08:57:55, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.buildah.version=1.33.12, GIT_BRANCH=main, name=rhceph, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , RELEASE=main, GIT_CLEAN=True, distribution-scope=public, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/agreements, release=553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vcs-type=git, CEPH_POINT_RELEASE=, architecture=x86_64, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0) Oct 14 06:06:28 localhost systemd[1]: libpod-conmon-0fd7778de9af47388223a62b8369698bc1da7ac1e27274cb52516c2915709858.scope: Deactivated successfully. 
Oct 14 06:06:28 localhost systemd[1]: tmp-crun.dWzhUY.mount: Deactivated successfully. Oct 14 06:06:29 localhost podman[316263]: Oct 14 06:06:29 localhost podman[316263]: 2025-10-14 10:06:29.118816195 +0000 UTC m=+0.066844079 container create 96564ec21a62ae65ef03c60f0160bc2a411c4e642fbae5b0c55b3ff7d6c8b206 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sad_margulis, version=7, GIT_BRANCH=main, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, RELEASE=main, vendor=Red Hat, Inc., GIT_CLEAN=True, release=553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, ceph=True, description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux , distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, build-date=2025-09-24T08:57:55, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, name=rhceph) Oct 14 06:06:29 localhost systemd[1]: Started libpod-conmon-96564ec21a62ae65ef03c60f0160bc2a411c4e642fbae5b0c55b3ff7d6c8b206.scope. Oct 14 06:06:29 localhost systemd[1]: Started libcrun container. 
Oct 14 06:06:29 localhost podman[316263]: 2025-10-14 10:06:29.181119193 +0000 UTC m=+0.129147117 container init 96564ec21a62ae65ef03c60f0160bc2a411c4e642fbae5b0c55b3ff7d6c8b206 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sad_margulis, distribution-scope=public, vendor=Red Hat, Inc., GIT_CLEAN=True, version=7, maintainer=Guillaume Abrioux , name=rhceph, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, release=553, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, build-date=2025-09-24T08:57:55, io.buildah.version=1.33.12, com.redhat.component=rhceph-container, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, architecture=x86_64, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-type=git, ceph=True) Oct 14 06:06:29 localhost podman[316263]: 2025-10-14 10:06:29.190750048 +0000 UTC m=+0.138777962 container start 96564ec21a62ae65ef03c60f0160bc2a411c4e642fbae5b0c55b3ff7d6c8b206 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sad_margulis, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., build-date=2025-09-24T08:57:55, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, version=7, maintainer=Guillaume Abrioux , 
vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.33.12, GIT_BRANCH=main, release=553, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, name=rhceph, vcs-type=git, distribution-scope=public, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, RELEASE=main) Oct 14 06:06:29 localhost podman[316263]: 2025-10-14 10:06:29.190939825 +0000 UTC m=+0.138967749 container attach 96564ec21a62ae65ef03c60f0160bc2a411c4e642fbae5b0c55b3ff7d6c8b206 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sad_margulis, GIT_BRANCH=main, name=rhceph, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, RELEASE=main, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux , release=553, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, build-date=2025-09-24T08:57:55, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vcs-type=git, architecture=x86_64, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, io.buildah.version=1.33.12) Oct 14 06:06:29 localhost sad_margulis[316278]: 167 167 Oct 14 06:06:29 localhost systemd[1]: 
libpod-96564ec21a62ae65ef03c60f0160bc2a411c4e642fbae5b0c55b3ff7d6c8b206.scope: Deactivated successfully. Oct 14 06:06:29 localhost podman[316263]: 2025-10-14 10:06:29.196845596 +0000 UTC m=+0.144873520 container died 96564ec21a62ae65ef03c60f0160bc2a411c4e642fbae5b0c55b3ff7d6c8b206 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sad_margulis, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux , architecture=x86_64, CEPH_POINT_RELEASE=, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/agreements, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, distribution-scope=public, GIT_CLEAN=True, build-date=2025-09-24T08:57:55, com.redhat.component=rhceph-container, version=7, vcs-type=git, release=553, GIT_BRANCH=main, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.) 
Oct 14 06:06:29 localhost podman[316263]: 2025-10-14 10:06:29.098060389 +0000 UTC m=+0.046088303 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Oct 14 06:06:29 localhost podman[316284]: 2025-10-14 10:06:29.295637592 +0000 UTC m=+0.087156491 container remove 96564ec21a62ae65ef03c60f0160bc2a411c4e642fbae5b0c55b3ff7d6c8b206 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sad_margulis, distribution-scope=public, io.openshift.tags=rhceph ceph, name=rhceph, CEPH_POINT_RELEASE=, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, build-date=2025-09-24T08:57:55, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=553, ceph=True, io.buildah.version=1.33.12, maintainer=Guillaume Abrioux , url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, version=7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vendor=Red Hat, Inc., vcs-type=git, architecture=x86_64) Oct 14 06:06:29 localhost systemd[1]: libpod-conmon-96564ec21a62ae65ef03c60f0160bc2a411c4e642fbae5b0c55b3ff7d6c8b206.scope: Deactivated successfully. Oct 14 06:06:29 localhost systemd[1]: var-lib-containers-storage-overlay-f21d5189a3fac97aa13bc9a512167868464cae96bbe65770e2232c383008586e-merged.mount: Deactivated successfully. 
Oct 14 06:06:30 localhost podman[316359]: Oct 14 06:06:30 localhost podman[316359]: 2025-10-14 10:06:30.153279044 +0000 UTC m=+0.083521909 container create e3e2a12fa0611c32aa3e46b3e7a964487ea3ace576b005380bc584be83497f1e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=priceless_benz, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-type=git, architecture=x86_64, com.redhat.component=rhceph-container, build-date=2025-09-24T08:57:55, version=7, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=553, CEPH_POINT_RELEASE=, distribution-scope=public, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhceph, GIT_CLEAN=True, RELEASE=main, io.openshift.expose-services=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.33.12, maintainer=Guillaume Abrioux ) Oct 14 06:06:30 localhost systemd[1]: Started libpod-conmon-e3e2a12fa0611c32aa3e46b3e7a964487ea3ace576b005380bc584be83497f1e.scope. Oct 14 06:06:30 localhost systemd[1]: Started libcrun container. 
Oct 14 06:06:30 localhost nova_compute[297686]: 2025-10-14 10:06:30.210 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:06:30 localhost podman[316359]: 2025-10-14 10:06:30.120660495 +0000 UTC m=+0.050903400 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Oct 14 06:06:30 localhost podman[316359]: 2025-10-14 10:06:30.227766937 +0000 UTC m=+0.158009802 container init e3e2a12fa0611c32aa3e46b3e7a964487ea3ace576b005380bc584be83497f1e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=priceless_benz, ceph=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-09-24T08:57:55, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_CLEAN=True, com.redhat.component=rhceph-container, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, distribution-scope=public, GIT_BRANCH=main, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, architecture=x86_64, io.openshift.expose-services=, maintainer=Guillaume Abrioux , RELEASE=main, vendor=Red Hat, Inc., vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhceph, release=553) Oct 14 06:06:30 localhost podman[316359]: 2025-10-14 10:06:30.24189935 +0000 UTC m=+0.172142205 container start e3e2a12fa0611c32aa3e46b3e7a964487ea3ace576b005380bc584be83497f1e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=priceless_benz, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, RELEASE=main, architecture=x86_64, build-date=2025-09-24T08:57:55, distribution-scope=public, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux , io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, version=7, com.redhat.component=rhceph-container, ceph=True, io.openshift.tags=rhceph ceph, io.buildah.version=1.33.12, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, GIT_CLEAN=True, vcs-type=git, release=553) Oct 14 06:06:30 localhost podman[316359]: 2025-10-14 10:06:30.242273881 +0000 UTC m=+0.172516736 container attach e3e2a12fa0611c32aa3e46b3e7a964487ea3ace576b005380bc584be83497f1e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=priceless_benz, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_BRANCH=main, name=rhceph, io.buildah.version=1.33.12, com.redhat.component=rhceph-container, ceph=True, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_CLEAN=True, distribution-scope=public, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, build-date=2025-09-24T08:57:55, RELEASE=main, version=7, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=553, CEPH_POINT_RELEASE=, architecture=x86_64, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-type=git, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.) Oct 14 06:06:30 localhost priceless_benz[316374]: 167 167 Oct 14 06:06:30 localhost systemd[1]: libpod-e3e2a12fa0611c32aa3e46b3e7a964487ea3ace576b005380bc584be83497f1e.scope: Deactivated successfully. Oct 14 06:06:30 localhost podman[316359]: 2025-10-14 10:06:30.245458788 +0000 UTC m=+0.175701713 container died e3e2a12fa0611c32aa3e46b3e7a964487ea3ace576b005380bc584be83497f1e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=priceless_benz, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, maintainer=Guillaume Abrioux , io.openshift.tags=rhceph ceph, io.buildah.version=1.33.12, architecture=x86_64, vcs-type=git, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, build-date=2025-09-24T08:57:55, CEPH_POINT_RELEASE=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_CLEAN=True, release=553, ceph=True, description=Red Hat Ceph Storage 7, RELEASE=main, GIT_BRANCH=main, version=7, vendor=Red Hat, Inc.) 
Oct 14 06:06:30 localhost podman[316379]: 2025-10-14 10:06:30.345533134 +0000 UTC m=+0.087605895 container remove e3e2a12fa0611c32aa3e46b3e7a964487ea3ace576b005380bc584be83497f1e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=priceless_benz, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , build-date=2025-09-24T08:57:55, vendor=Red Hat, Inc., io.buildah.version=1.33.12, io.openshift.expose-services=, GIT_CLEAN=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, release=553, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, GIT_BRANCH=main, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, version=7, ceph=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, architecture=x86_64, distribution-scope=public, vcs-type=git, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.) Oct 14 06:06:30 localhost systemd[1]: libpod-conmon-e3e2a12fa0611c32aa3e46b3e7a964487ea3ace576b005380bc584be83497f1e.scope: Deactivated successfully. Oct 14 06:06:30 localhost systemd[1]: tmp-crun.EUd0s6.mount: Deactivated successfully. Oct 14 06:06:30 localhost systemd[1]: var-lib-containers-storage-overlay-8d213e192d542e1b71c23fbd4204b3e725f6800a744f1f8c100f7eff229e7196-merged.mount: Deactivated successfully. 
Oct 14 06:06:31 localhost podman[316456]: Oct 14 06:06:31 localhost podman[316456]: 2025-10-14 10:06:31.150490053 +0000 UTC m=+0.050156768 container create 35dbd457cf5922c56762025e5961521d7fb9c5178a446eeff2bde40ff087fede (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=relaxed_pike, io.openshift.tags=rhceph ceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_BRANCH=main, architecture=x86_64, maintainer=Guillaume Abrioux , CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, release=553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, vcs-type=git, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, GIT_CLEAN=True, distribution-scope=public, io.buildah.version=1.33.12, description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, version=7, build-date=2025-09-24T08:57:55) Oct 14 06:06:31 localhost systemd[1]: Started libpod-conmon-35dbd457cf5922c56762025e5961521d7fb9c5178a446eeff2bde40ff087fede.scope. Oct 14 06:06:31 localhost systemd[1]: Started libcrun container. 
Oct 14 06:06:31 localhost podman[316456]: 2025-10-14 10:06:31.201956439 +0000 UTC m=+0.101623154 container init 35dbd457cf5922c56762025e5961521d7fb9c5178a446eeff2bde40ff087fede (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=relaxed_pike, com.redhat.component=rhceph-container, io.buildah.version=1.33.12, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/agreements, version=7, release=553, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.expose-services=, CEPH_POINT_RELEASE=, build-date=2025-09-24T08:57:55, RELEASE=main, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux , vcs-type=git, architecture=x86_64, name=rhceph, distribution-scope=public, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, ceph=True, GIT_CLEAN=True, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph) Oct 14 06:06:31 localhost podman[316456]: 2025-10-14 10:06:31.209184171 +0000 UTC m=+0.108850936 container start 35dbd457cf5922c56762025e5961521d7fb9c5178a446eeff2bde40ff087fede (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=relaxed_pike, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.tags=rhceph ceph, distribution-scope=public, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, version=7, description=Red Hat Ceph Storage 7, architecture=x86_64, name=rhceph, GIT_CLEAN=True, build-date=2025-09-24T08:57:55, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, 
vendor=Red Hat, Inc., com.redhat.component=rhceph-container, ceph=True, release=553, io.buildah.version=1.33.12, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux , vcs-type=git, GIT_BRANCH=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main) Oct 14 06:06:31 localhost podman[316456]: 2025-10-14 10:06:31.209456429 +0000 UTC m=+0.109123154 container attach 35dbd457cf5922c56762025e5961521d7fb9c5178a446eeff2bde40ff087fede (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=relaxed_pike, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, name=rhceph, io.openshift.expose-services=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, release=553, io.buildah.version=1.33.12, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.component=rhceph-container, distribution-scope=public, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, version=7, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, io.openshift.tags=rhceph ceph, architecture=x86_64, build-date=2025-09-24T08:57:55, vcs-type=git, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux , GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_CLEAN=True) Oct 14 06:06:31 localhost relaxed_pike[316471]: 167 167 Oct 14 06:06:31 localhost systemd[1]: libpod-35dbd457cf5922c56762025e5961521d7fb9c5178a446eeff2bde40ff087fede.scope: 
Deactivated successfully. Oct 14 06:06:31 localhost podman[316456]: 2025-10-14 10:06:31.21371203 +0000 UTC m=+0.113378825 container died 35dbd457cf5922c56762025e5961521d7fb9c5178a446eeff2bde40ff087fede (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=relaxed_pike, build-date=2025-09-24T08:57:55, description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, maintainer=Guillaume Abrioux , org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_CLEAN=True, CEPH_POINT_RELEASE=, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, vcs-type=git, distribution-scope=public, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, release=553, version=7, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, architecture=x86_64, RELEASE=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph) Oct 14 06:06:31 localhost podman[316456]: 2025-10-14 10:06:31.132186723 +0000 UTC m=+0.031853498 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Oct 14 06:06:31 localhost podman[316476]: 2025-10-14 10:06:31.289830912 +0000 UTC m=+0.068735687 container remove 35dbd457cf5922c56762025e5961521d7fb9c5178a446eeff2bde40ff087fede (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=relaxed_pike, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, io.buildah.version=1.33.12, io.openshift.expose-services=, build-date=2025-09-24T08:57:55, description=Red Hat Ceph 
Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_CLEAN=True, GIT_BRANCH=main, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., distribution-scope=public, maintainer=Guillaume Abrioux , org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.license_terms=https://www.redhat.com/agreements, release=553, name=rhceph, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.component=rhceph-container) Oct 14 06:06:31 localhost systemd[1]: libpod-conmon-35dbd457cf5922c56762025e5961521d7fb9c5178a446eeff2bde40ff087fede.scope: Deactivated successfully. Oct 14 06:06:31 localhost systemd[1]: var-lib-containers-storage-overlay-587df8fa99f53e33a28b6d6c926fe3c7ec4c9c1e928d78f74adc55e6ef72b034-merged.mount: Deactivated successfully. 
Oct 14 06:06:31 localhost podman[316580]: Oct 14 06:06:31 localhost podman[316580]: 2025-10-14 10:06:31.943134865 +0000 UTC m=+0.045361271 container create d615ee435803029aac209ed72086f2485f39a4565d140a10128282286d75145f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nostalgic_carver, name=rhceph, version=7, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, io.buildah.version=1.33.12, vendor=Red Hat, Inc., architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, distribution-scope=public, vcs-type=git, release=553, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, maintainer=Guillaume Abrioux , RELEASE=main, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/agreements, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-09-24T08:57:55, description=Red Hat Ceph Storage 7, GIT_CLEAN=True) Oct 14 06:06:31 localhost systemd[1]: Started libpod-conmon-d615ee435803029aac209ed72086f2485f39a4565d140a10128282286d75145f.scope. Oct 14 06:06:31 localhost systemd[1]: Started libcrun container. 
Oct 14 06:06:31 localhost podman[316580]: 2025-10-14 10:06:31.992298701 +0000 UTC m=+0.094525097 container init d615ee435803029aac209ed72086f2485f39a4565d140a10128282286d75145f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nostalgic_carver, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, release=553, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, GIT_BRANCH=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.tags=rhceph ceph, ceph=True, io.buildah.version=1.33.12, maintainer=Guillaume Abrioux , io.openshift.expose-services=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, distribution-scope=public, vcs-type=git, architecture=x86_64, build-date=2025-09-24T08:57:55, vendor=Red Hat, Inc.) 
Oct 14 06:06:31 localhost nostalgic_carver[316607]: 167 167 Oct 14 06:06:31 localhost podman[316580]: 2025-10-14 10:06:31.999010587 +0000 UTC m=+0.101237023 container start d615ee435803029aac209ed72086f2485f39a4565d140a10128282286d75145f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nostalgic_carver, io.openshift.tags=rhceph ceph, ceph=True, io.buildah.version=1.33.12, build-date=2025-09-24T08:57:55, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, release=553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_CLEAN=True, distribution-scope=public, GIT_BRANCH=main, name=rhceph, architecture=x86_64, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, maintainer=Guillaume Abrioux , RELEASE=main) Oct 14 06:06:31 localhost podman[316580]: 2025-10-14 10:06:31.999318836 +0000 UTC m=+0.101545252 container attach d615ee435803029aac209ed72086f2485f39a4565d140a10128282286d75145f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nostalgic_carver, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, build-date=2025-09-24T08:57:55, CEPH_POINT_RELEASE=, version=7, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhceph, vendor=Red Hat, 
Inc., distribution-scope=public, vcs-type=git, maintainer=Guillaume Abrioux , release=553, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.tags=rhceph ceph, ceph=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.33.12, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git) Oct 14 06:06:31 localhost systemd[1]: libpod-d615ee435803029aac209ed72086f2485f39a4565d140a10128282286d75145f.scope: Deactivated successfully. Oct 14 06:06:32 localhost podman[316580]: 2025-10-14 10:06:32.000215203 +0000 UTC m=+0.102441609 container died d615ee435803029aac209ed72086f2485f39a4565d140a10128282286d75145f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nostalgic_carver, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., name=rhceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, maintainer=Guillaume Abrioux , version=7, ceph=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, release=553, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, build-date=2025-09-24T08:57:55, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhceph ceph, io.buildah.version=1.33.12, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, GIT_BRANCH=main, RELEASE=main, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553) Oct 14 06:06:32 localhost podman[316580]: 2025-10-14 10:06:31.92471727 +0000 UTC m=+0.026943686 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Oct 14 06:06:32 localhost podman[316612]: 2025-10-14 10:06:32.069588968 +0000 UTC m=+0.060022309 container remove d615ee435803029aac209ed72086f2485f39a4565d140a10128282286d75145f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nostalgic_carver, com.redhat.license_terms=https://www.redhat.com/agreements, release=553, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, build-date=2025-09-24T08:57:55, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, io.openshift.expose-services=, distribution-scope=public, com.redhat.component=rhceph-container, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.buildah.version=1.33.12, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, name=rhceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux , GIT_CLEAN=True, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., RELEASE=main, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_BRANCH=main) Oct 14 06:06:32 localhost systemd[1]: libpod-conmon-d615ee435803029aac209ed72086f2485f39a4565d140a10128282286d75145f.scope: Deactivated successfully. 
Oct 14 06:06:32 localhost podman[316696]: Oct 14 06:06:32 localhost podman[316696]: 2025-10-14 10:06:32.366780643 +0000 UTC m=+0.071765939 container create 12651d7d157df19ad448b46069cd6ce06eb386b4e6dd402349aecdaece20b189 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=bold_stonebraker, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, GIT_CLEAN=True, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, release=553, version=7, build-date=2025-09-24T08:57:55, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, CEPH_POINT_RELEASE=, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, name=rhceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux , vcs-type=git) Oct 14 06:06:32 localhost systemd[1]: Started libpod-conmon-12651d7d157df19ad448b46069cd6ce06eb386b4e6dd402349aecdaece20b189.scope. Oct 14 06:06:32 localhost systemd[1]: Started libcrun container. 
Oct 14 06:06:32 localhost podman[316696]: 2025-10-14 10:06:32.412659549 +0000 UTC m=+0.117644845 container init 12651d7d157df19ad448b46069cd6ce06eb386b4e6dd402349aecdaece20b189 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=bold_stonebraker, release=553, io.buildah.version=1.33.12, com.redhat.component=rhceph-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.expose-services=, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, build-date=2025-09-24T08:57:55, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux , ceph=True, description=Red Hat Ceph Storage 7, vcs-type=git, architecture=x86_64, GIT_CLEAN=True) Oct 14 06:06:32 localhost podman[316696]: 2025-10-14 10:06:32.417845847 +0000 UTC m=+0.122831183 container start 12651d7d157df19ad448b46069cd6ce06eb386b4e6dd402349aecdaece20b189 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=bold_stonebraker, build-date=2025-09-24T08:57:55, io.openshift.expose-services=, CEPH_POINT_RELEASE=, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, name=rhceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., 
org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=553, io.buildah.version=1.33.12, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, version=7, io.openshift.tags=rhceph ceph, vcs-type=git, maintainer=Guillaume Abrioux , RELEASE=main, distribution-scope=public) Oct 14 06:06:32 localhost podman[316696]: 2025-10-14 10:06:32.418095265 +0000 UTC m=+0.123080561 container attach 12651d7d157df19ad448b46069cd6ce06eb386b4e6dd402349aecdaece20b189 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=bold_stonebraker, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-09-24T08:57:55, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, io.openshift.tags=rhceph ceph, io.buildah.version=1.33.12, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, GIT_CLEAN=True, vendor=Red Hat, Inc., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, architecture=x86_64, io.openshift.expose-services=, name=rhceph, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, release=553, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , RELEASE=main) Oct 14 06:06:32 localhost bold_stonebraker[316712]: 167 167 Oct 14 06:06:32 localhost systemd[1]: libpod-12651d7d157df19ad448b46069cd6ce06eb386b4e6dd402349aecdaece20b189.scope: 
Deactivated successfully. Oct 14 06:06:32 localhost podman[316696]: 2025-10-14 10:06:32.420922062 +0000 UTC m=+0.125907368 container died 12651d7d157df19ad448b46069cd6ce06eb386b4e6dd402349aecdaece20b189 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=bold_stonebraker, RELEASE=main, vendor=Red Hat, Inc., build-date=2025-09-24T08:57:55, version=7, io.buildah.version=1.33.12, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux , com.redhat.license_terms=https://www.redhat.com/agreements, ceph=True, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, release=553, io.openshift.expose-services=, name=rhceph, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.tags=rhceph ceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, description=Red Hat Ceph Storage 7) Oct 14 06:06:32 localhost podman[316696]: 2025-10-14 10:06:32.343616043 +0000 UTC m=+0.048601379 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Oct 14 06:06:32 localhost podman[316717]: 2025-10-14 10:06:32.500891671 +0000 UTC m=+0.069554952 container remove 12651d7d157df19ad448b46069cd6ce06eb386b4e6dd402349aecdaece20b189 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=bold_stonebraker, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, RELEASE=main, maintainer=Guillaume Abrioux , vendor=Red Hat, Inc., version=7, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, release=553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, distribution-scope=public, ceph=True, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhceph ceph, io.buildah.version=1.33.12, GIT_BRANCH=main, io.openshift.expose-services=, build-date=2025-09-24T08:57:55, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, vcs-type=git, GIT_CLEAN=True, CEPH_POINT_RELEASE=) Oct 14 06:06:32 localhost systemd[1]: libpod-conmon-12651d7d157df19ad448b46069cd6ce06eb386b4e6dd402349aecdaece20b189.scope: Deactivated successfully. Oct 14 06:06:32 localhost podman[316733]: Oct 14 06:06:32 localhost podman[316733]: 2025-10-14 10:06:32.586639168 +0000 UTC m=+0.056051148 container create 6d88d4d9cdc5d0f75e15a2d087f06a65087dc7b28ef8ada08f07e585b5d657b9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sweet_yalow, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_CLEAN=True, maintainer=Guillaume Abrioux , vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, build-date=2025-09-24T08:57:55, name=rhceph, io.buildah.version=1.33.12, version=7, release=553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully 
featured and supported base image., CEPH_POINT_RELEASE=, distribution-scope=public, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container) Oct 14 06:06:32 localhost systemd[1]: Started libpod-conmon-6d88d4d9cdc5d0f75e15a2d087f06a65087dc7b28ef8ada08f07e585b5d657b9.scope. Oct 14 06:06:32 localhost systemd[1]: Started libcrun container. Oct 14 06:06:32 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/efb0d03432210851604a8641529d2de9be9436561cfa0f8e2ba461dc3b7f0aad/merged/tmp/config supports timestamps until 2038 (0x7fffffff) Oct 14 06:06:32 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/efb0d03432210851604a8641529d2de9be9436561cfa0f8e2ba461dc3b7f0aad/merged/tmp/keyring supports timestamps until 2038 (0x7fffffff) Oct 14 06:06:32 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/efb0d03432210851604a8641529d2de9be9436561cfa0f8e2ba461dc3b7f0aad/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff) Oct 14 06:06:32 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/efb0d03432210851604a8641529d2de9be9436561cfa0f8e2ba461dc3b7f0aad/merged/var/lib/ceph/mon/ceph-np0005486733 supports timestamps until 2038 (0x7fffffff) Oct 14 06:06:32 localhost podman[316733]: 2025-10-14 10:06:32.637830256 +0000 UTC m=+0.107242206 container init 6d88d4d9cdc5d0f75e15a2d087f06a65087dc7b28ef8ada08f07e585b5d657b9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sweet_yalow, name=rhceph, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.expose-services=, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, build-date=2025-09-24T08:57:55, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph 
Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, description=Red Hat Ceph Storage 7, release=553, GIT_BRANCH=main, io.buildah.version=1.33.12, RELEASE=main, maintainer=Guillaume Abrioux , vendor=Red Hat, Inc., ceph=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.component=rhceph-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, version=7, GIT_CLEAN=True) Oct 14 06:06:32 localhost podman[316733]: 2025-10-14 10:06:32.643556571 +0000 UTC m=+0.112968521 container start 6d88d4d9cdc5d0f75e15a2d087f06a65087dc7b28ef8ada08f07e585b5d657b9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sweet_yalow, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.expose-services=, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.33.12, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, name=rhceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, architecture=x86_64, distribution-scope=public, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, build-date=2025-09-24T08:57:55, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, CEPH_POINT_RELEASE=, release=553, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, maintainer=Guillaume Abrioux ) Oct 14 06:06:32 localhost podman[316733]: 2025-10-14 10:06:32.645727798 +0000 UTC m=+0.115139768 container 
attach 6d88d4d9cdc5d0f75e15a2d087f06a65087dc7b28ef8ada08f07e585b5d657b9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sweet_yalow, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, ceph=True, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, GIT_BRANCH=main, io.buildah.version=1.33.12, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=553, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, CEPH_POINT_RELEASE=, distribution-scope=public, io.openshift.tags=rhceph ceph, build-date=2025-09-24T08:57:55, name=rhceph, maintainer=Guillaume Abrioux , description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, RELEASE=main) Oct 14 06:06:32 localhost podman[316733]: 2025-10-14 10:06:32.56385883 +0000 UTC m=+0.033270800 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Oct 14 06:06:32 localhost systemd[1]: libpod-6d88d4d9cdc5d0f75e15a2d087f06a65087dc7b28ef8ada08f07e585b5d657b9.scope: Deactivated successfully. Oct 14 06:06:32 localhost systemd[1]: var-lib-containers-storage-overlay-b7c7e598cd2837661495df1f32c5386cc2f7c3914c75c170ad9332faac596443-merged.mount: Deactivated successfully. 
Oct 14 06:06:32 localhost podman[316733]: 2025-10-14 10:06:32.822614807 +0000 UTC m=+0.292026787 container died 6d88d4d9cdc5d0f75e15a2d087f06a65087dc7b28ef8ada08f07e585b5d657b9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sweet_yalow, ceph=True, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, build-date=2025-09-24T08:57:55, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, version=7, release=553, GIT_BRANCH=main, io.buildah.version=1.33.12, name=rhceph, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, vcs-type=git, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux , io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_REPO=https://github.com/ceph/ceph-container.git) Oct 14 06:06:32 localhost systemd[1]: var-lib-containers-storage-overlay-efb0d03432210851604a8641529d2de9be9436561cfa0f8e2ba461dc3b7f0aad-merged.mount: Deactivated successfully. 
Oct 14 06:06:32 localhost podman[316809]: 2025-10-14 10:06:32.945726398 +0000 UTC m=+0.214428930 container remove 6d88d4d9cdc5d0f75e15a2d087f06a65087dc7b28ef8ada08f07e585b5d657b9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sweet_yalow, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-type=git, version=7, GIT_CLEAN=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.expose-services=, vendor=Red Hat, Inc., io.buildah.version=1.33.12, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-09-24T08:57:55, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, architecture=x86_64, com.redhat.component=rhceph-container, ceph=True, release=553, maintainer=Guillaume Abrioux , name=rhceph, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements) Oct 14 06:06:32 localhost nova_compute[297686]: 2025-10-14 10:06:32.946 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:06:32 localhost systemd[1]: libpod-conmon-6d88d4d9cdc5d0f75e15a2d087f06a65087dc7b28ef8ada08f07e585b5d657b9.scope: Deactivated successfully. Oct 14 06:06:32 localhost systemd[1]: Reloading. Oct 14 06:06:33 localhost systemd-sysv-generator[316899]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. 
Oct 14 06:06:33 localhost systemd-rc-local-generator[316894]: /etc/rc.d/rc.local is not marked executable, skipping. Oct 14 06:06:33 localhost podman[316862]: 2025-10-14 10:06:33.123193294 +0000 UTC m=+0.069456648 container exec b19a91314e045cbe5465f6abdeb6b420108fcda7860b498b01dcd4e65236d2c8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-crash-np0005486733, io.openshift.expose-services=, io.buildah.version=1.33.12, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, com.redhat.component=rhceph-container, GIT_CLEAN=True, ceph=True, architecture=x86_64, release=553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-09-24T08:57:55, maintainer=Guillaume Abrioux , io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, version=7, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, name=rhceph, vcs-type=git) Oct 14 06:06:33 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Oct 14 06:06:33 localhost podman[316862]: 2025-10-14 10:06:33.222895669 +0000 UTC m=+0.169159083 container exec_died b19a91314e045cbe5465f6abdeb6b420108fcda7860b498b01dcd4e65236d2c8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-crash-np0005486733, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, release=553, io.buildah.version=1.33.12, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.tags=rhceph ceph, ceph=True, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., RELEASE=main, com.redhat.component=rhceph-container, build-date=2025-09-24T08:57:55, vcs-type=git, GIT_CLEAN=True, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux , version=7, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, GIT_BRANCH=main) Oct 14 06:06:33 localhost systemd[1]: Reloading. Oct 14 06:06:33 localhost systemd-sysv-generator[316980]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Oct 14 06:06:33 localhost systemd-rc-local-generator[316975]: /etc/rc.d/rc.local is not marked executable, skipping. Oct 14 06:06:33 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Oct 14 06:06:33 localhost systemd[1]: Starting Ceph mon.np0005486733 for fcadf6e2-9176-5818-a8d0-37b19acf8eaf... Oct 14 06:06:34 localhost podman[317075]: Oct 14 06:06:34 localhost podman[317075]: 2025-10-14 10:06:34.069428552 +0000 UTC m=+0.083885061 container create 3b13a3d859b6556c5dfe3e5f7372f6243db59a8d62a954c568cfba8c70efda13 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-mon-np0005486733, ceph=True, description=Red Hat Ceph Storage 7, name=rhceph, io.openshift.tags=rhceph ceph, version=7, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, io.buildah.version=1.33.12, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, build-date=2025-09-24T08:57:55, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, GIT_CLEAN=True, maintainer=Guillaume Abrioux , RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, release=553, architecture=x86_64, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553) Oct 14 06:06:34 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/99caf85c12354a453a03a1c1c658271c3f9a239d2b027d997cf25a63187b29b6/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff) Oct 14 06:06:34 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/99caf85c12354a453a03a1c1c658271c3f9a239d2b027d997cf25a63187b29b6/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff) Oct 14 06:06:34 localhost kernel: xfs filesystem being 
remounted at /var/lib/containers/storage/overlay/99caf85c12354a453a03a1c1c658271c3f9a239d2b027d997cf25a63187b29b6/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff) Oct 14 06:06:34 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/99caf85c12354a453a03a1c1c658271c3f9a239d2b027d997cf25a63187b29b6/merged/var/lib/ceph/mon/ceph-np0005486733 supports timestamps until 2038 (0x7fffffff) Oct 14 06:06:34 localhost podman[317075]: 2025-10-14 10:06:34.033892983 +0000 UTC m=+0.048349492 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Oct 14 06:06:34 localhost podman[317075]: 2025-10-14 10:06:34.141159009 +0000 UTC m=+0.155615528 container init 3b13a3d859b6556c5dfe3e5f7372f6243db59a8d62a954c568cfba8c70efda13 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-mon-np0005486733, io.openshift.tags=rhceph ceph, release=553, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.33.12, com.redhat.component=rhceph-container, GIT_CLEAN=True, distribution-scope=public, description=Red Hat Ceph Storage 7, version=7, build-date=2025-09-24T08:57:55, ceph=True, RELEASE=main, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, maintainer=Guillaume Abrioux , architecture=x86_64, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_BRANCH=main, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements) Oct 14 06:06:34 localhost podman[317075]: 2025-10-14 10:06:34.152833237 +0000 UTC 
m=+0.167289746 container start 3b13a3d859b6556c5dfe3e5f7372f6243db59a8d62a954c568cfba8c70efda13 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-mon-np0005486733, ceph=True, distribution-scope=public, name=rhceph, GIT_BRANCH=main, RELEASE=main, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.license_terms=https://www.redhat.com/agreements, version=7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.expose-services=, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux , GIT_CLEAN=True, build-date=2025-09-24T08:57:55, vendor=Red Hat, Inc., io.buildah.version=1.33.12, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, release=553) Oct 14 06:06:34 localhost bash[317075]: 3b13a3d859b6556c5dfe3e5f7372f6243db59a8d62a954c568cfba8c70efda13 Oct 14 06:06:34 localhost systemd[1]: Started Ceph mon.np0005486733 for fcadf6e2-9176-5818-a8d0-37b19acf8eaf. 
Oct 14 06:06:34 localhost ceph-mon[317114]: set uid:gid to 167:167 (ceph:ceph) Oct 14 06:06:34 localhost ceph-mon[317114]: ceph version 18.2.1-361.el9cp (439dcd6094d413840eb2ec590fe2194ec616687f) reef (stable), process ceph-mon, pid 2 Oct 14 06:06:34 localhost ceph-mon[317114]: pidfile_write: ignore empty --pid-file Oct 14 06:06:34 localhost ceph-mon[317114]: load: jerasure load: lrc Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: RocksDB version: 7.9.2 Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: Git sha 0 Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: Compile date 2025-09-23 00:00:00 Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: DB SUMMARY Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: DB Session ID: ZS6W3A0Q266OBGAU6ARP Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: CURRENT file: CURRENT Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: IDENTITY file: IDENTITY Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: MANIFEST file: MANIFEST-000005 size: 59 Bytes Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: SST files in /var/lib/ceph/mon/ceph-np0005486733/store.db dir, Total Num: 0, files: Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: Write Ahead Log file in /var/lib/ceph/mon/ceph-np0005486733/store.db: 000004.log size: 636 ; Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: Options.error_if_exists: 0 Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: Options.create_if_missing: 0 Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: Options.paranoid_checks: 1 Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: Options.flush_verify_memtable_count: 1 Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: Options.track_and_verify_wals_in_manifest: 0 Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: Options.verify_sst_unique_id_in_manifest: 1 Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: Options.env: 0x55c2d41dd9e0 Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: Options.fs: PosixFileSystem Oct 14 
06:06:34 localhost ceph-mon[317114]: rocksdb: Options.info_log: 0x55c2d5d96d20 Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: Options.max_file_opening_threads: 16 Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: Options.statistics: (nil) Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: Options.use_fsync: 0 Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: Options.max_log_file_size: 0 Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: Options.max_manifest_file_size: 1073741824 Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: Options.log_file_time_to_roll: 0 Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: Options.keep_log_file_num: 1000 Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: Options.recycle_log_file_num: 0 Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: Options.allow_fallocate: 1 Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: Options.allow_mmap_reads: 0 Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: Options.allow_mmap_writes: 0 Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: Options.use_direct_reads: 0 Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: Options.use_direct_io_for_flush_and_compaction: 0 Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: Options.create_missing_column_families: 0 Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: Options.db_log_dir: Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: Options.wal_dir: Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: Options.table_cache_numshardbits: 6 Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: Options.WAL_ttl_seconds: 0 Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: Options.WAL_size_limit_MB: 0 Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: Options.max_write_batch_group_size_bytes: 1048576 Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: Options.manifest_preallocation_size: 4194304 Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: Options.is_fd_close_on_exec: 1 Oct 14 06:06:34 localhost 
ceph-mon[317114]: rocksdb: Options.advise_random_on_open: 1 Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: Options.db_write_buffer_size: 0 Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: Options.write_buffer_manager: 0x55c2d5da7540 Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: Options.access_hint_on_compaction_start: 1 Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: Options.random_access_max_buffer_size: 1048576 Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: Options.use_adaptive_mutex: 0 Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: Options.rate_limiter: (nil) Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: Options.sst_file_manager.rate_bytes_per_sec: 0 Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: Options.wal_recovery_mode: 2 Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: Options.enable_thread_tracking: 0 Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: Options.enable_pipelined_write: 0 Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: Options.unordered_write: 0 Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: Options.allow_concurrent_memtable_write: 1 Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: Options.enable_write_thread_adaptive_yield: 1 Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: Options.write_thread_max_yield_usec: 100 Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: Options.write_thread_slow_yield_usec: 3 Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: Options.row_cache: None Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: Options.wal_filter: None Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: Options.avoid_flush_during_recovery: 0 Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: Options.allow_ingest_behind: 0 Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: Options.two_write_queues: 0 Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: Options.manual_wal_flush: 0 Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: Options.wal_compression: 0 Oct 
14 06:06:34 localhost ceph-mon[317114]: rocksdb: Options.atomic_flush: 0 Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: Options.avoid_unnecessary_blocking_io: 0 Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: Options.persist_stats_to_disk: 0 Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: Options.write_dbid_to_manifest: 0 Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: Options.log_readahead_size: 0 Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: Options.file_checksum_gen_factory: Unknown Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: Options.best_efforts_recovery: 0 Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: Options.max_bgerror_resume_count: 2147483647 Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: Options.bgerror_resume_retry_interval: 1000000 Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: Options.allow_data_in_errors: 0 Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: Options.db_host_id: __hostname__ Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: Options.enforce_single_del_contracts: true Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: Options.max_background_jobs: 2 Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: Options.max_background_compactions: -1 Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: Options.max_subcompactions: 1 Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: Options.avoid_flush_during_shutdown: 0 Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: Options.writable_file_max_buffer_size: 1048576 Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: Options.delayed_write_rate : 16777216 Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: Options.max_total_wal_size: 0 Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: Options.delete_obsolete_files_period_micros: 21600000000 Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: Options.stats_dump_period_sec: 600 Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: Options.stats_persist_period_sec: 600 Oct 14 06:06:34 
localhost ceph-mon[317114]: rocksdb: Options.stats_history_buffer_size: 1048576 Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: Options.max_open_files: -1 Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: Options.bytes_per_sync: 0 Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: Options.wal_bytes_per_sync: 0 Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: Options.strict_bytes_per_sync: 0 Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: Options.compaction_readahead_size: 0 Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: Options.max_background_flushes: -1 Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: Compression algorithms supported: Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: #011kZSTD supported: 0 Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: #011kXpressCompression supported: 0 Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: #011kBZip2Compression supported: 0 Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: #011kZSTDNotFinalCompression supported: 0 Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: #011kLZ4Compression supported: 1 Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: #011kZlibCompression supported: 1 Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: #011kLZ4HCCompression supported: 1 Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: #011kSnappyCompression supported: 1 Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: Fast CRC32 supported: Supported on x86 Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: DMutex implementation: pthread_mutex_t Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: /var/lib/ceph/mon/ceph-np0005486733/store.db/MANIFEST-000005 Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]: Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: Options.comparator: leveldb.BytewiseComparator Oct 14 06:06:34 localhost 
ceph-mon[317114]: rocksdb: Options.merge_operator: Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: Options.compaction_filter: None Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: Options.compaction_filter_factory: None Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: Options.sst_partitioner_factory: None Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: Options.memtable_factory: SkipListFactory Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: Options.table_factory: BlockBasedTable Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55c2d5d96980)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x55c2d5d93350#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 536870912#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2 Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: Options.write_buffer_size: 33554432 Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: Options.max_write_buffer_number: 2 Oct 14 06:06:34 localhost 
ceph-mon[317114]: rocksdb: Options.compression: NoCompression Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: Options.bottommost_compression: Disabled Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: Options.prefix_extractor: nullptr Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: Options.num_levels: 7 Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: Options.min_write_buffer_number_to_merge: 1 Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: Options.bottommost_compression_opts.level: 32767 Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: Options.bottommost_compression_opts.strategy: 0 Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: Options.bottommost_compression_opts.enabled: false Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: Options.compression_opts.window_bits: -14 Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: Options.compression_opts.level: 32767 Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: Options.compression_opts.strategy: 0 Oct 14 06:06:34 
localhost ceph-mon[317114]: rocksdb: Options.compression_opts.max_dict_bytes: 0 Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: Options.compression_opts.parallel_threads: 1 Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: Options.compression_opts.enabled: false Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: Options.level0_file_num_compaction_trigger: 4 Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: Options.level0_slowdown_writes_trigger: 20 Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: Options.level0_stop_writes_trigger: 36 Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: Options.target_file_size_base: 67108864 Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: Options.target_file_size_multiplier: 1 Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: Options.max_bytes_for_level_base: 268435456 Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: Options.level_compaction_dynamic_level_bytes: 1 Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: Options.max_bytes_for_level_multiplier: 10.000000 Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 Oct 14 06:06:34 
localhost ceph-mon[317114]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: Options.max_sequential_skip_in_iterations: 8 Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: Options.max_compaction_bytes: 1677721600 Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: Options.arena_block_size: 1048576 Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: Options.disable_auto_compactions: 0 Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: Options.compaction_style: kCompactionStyleLevel Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: Options.compaction_options_universal.size_ratio: 1 Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: 
Options.table_properties_collectors: Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: Options.inplace_update_support: 0 Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: Options.inplace_update_num_locks: 10000 Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: Options.memtable_whole_key_filtering: 0 Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: Options.memtable_huge_page_size: 0 Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: Options.bloom_locality: 0 Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: Options.max_successive_merges: 0 Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: Options.optimize_filters_for_hits: 0 Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: Options.paranoid_file_checks: 0 Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: Options.force_consistency_checks: 1 Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: Options.report_bg_io_stats: 0 Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: Options.ttl: 2592000 Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: Options.periodic_compaction_seconds: 0 Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: Options.preclude_last_level_data_seconds: 0 Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: Options.preserve_internal_time_seconds: 0 Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: Options.enable_blob_files: false Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: Options.min_blob_size: 0 Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: Options.blob_file_size: 268435456 Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: Options.blob_compression_type: NoCompression Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: Options.enable_blob_garbage_collection: false Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: 
Options.blob_garbage_collection_force_threshold: 1.000000 Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: Options.blob_compaction_readahead_size: 0 Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: Options.blob_file_starting_level: 0 Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:/var/lib/ceph/mon/ceph-np0005486733/store.db/MANIFEST-000005 succeeded,manifest_file_number is 5, next_file_number is 7, last_sequence is 0, log_number is 0,prev_log_number is 0,max_column_family is 0,min_log_number_to_keep is 0 Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 0 Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: c006d4a7-7ae8-4cd3-a1b5-95f5bf3c427c Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760436394202965, "job": 1, "event": "recovery_started", "wal_files": [4]} Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #4 mode 2 Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760436394206010, "cf_name": "default", "job": 1, "event": "table_file_creation", "file_number": 8, "file_size": 1762, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 1, "largest_seqno": 5, "table_properties": {"data_size": 648, "index_size": 31, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 115, "raw_average_key_size": 23, "raw_value_size": 526, "raw_average_value_size": 105, "num_data_blocks": 1, "num_entries": 5, "num_filter_entries": 5, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": 
"bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1760436394, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "c006d4a7-7ae8-4cd3-a1b5-95f5bf3c427c", "db_session_id": "ZS6W3A0Q266OBGAU6ARP", "orig_file_number": 8, "seqno_to_time_mapping": "N/A"}} Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760436394206181, "job": 1, "event": "recovery_finished"} Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: [db/version_set.cc:5047] Creating manifest 10 Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005486733/store.db/000004.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: [db/db_impl/db_impl_open.cc:1987] SstFileManager instance 0x55c2d5dbae00 Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: DB pointer 0x55c2d5eb0000 Oct 14 06:06:34 localhost ceph-mon[317114]: mon.np0005486733 does not exist in monmap, will attempt to join an existing cluster Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Oct 14 06:06:34 localhost ceph-mon[317114]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 0.0 total, 0.0 interval#012Cumulative writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 GB, 0.00 MB/s#012Cumulative WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Cumulative stall: 
00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s#012Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 L0 1/0 1.72 KB 0.2 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.6 0.00 0.00 1 0.003 0 0 0.0 0.0#012 Sum 1/0 1.72 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.6 0.00 0.00 1 0.003 0 0 0.0 0.0#012 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.6 0.00 0.00 1 0.003 0 0 0.0 0.0#012#012** Compaction Stats [default] **#012Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012User 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.6 0.00 0.00 1 0.003 0 0 0.0 0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 0.0 total, 0.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.13 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.13 MB/s 
write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55c2d5d93350#2 capacity: 512.00 MB usage: 0.22 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 0 last_secs: 3.3e-05 secs_since: 0#012Block cache entry stats(count,size,portion): FilterBlock(1,0.11 KB,2.08616e-05%) IndexBlock(1,0.11 KB,2.08616e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] ** Oct 14 06:06:34 localhost ceph-mon[317114]: using public_addr v2:172.18.0.105:0/0 -> [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] Oct 14 06:06:34 localhost ceph-mon[317114]: starting mon.np0005486733 rank -1 at public addrs [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] at bind addrs [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] mon_data /var/lib/ceph/mon/ceph-np0005486733 fsid fcadf6e2-9176-5818-a8d0-37b19acf8eaf Oct 14 06:06:34 localhost ceph-mon[317114]: mon.np0005486733@-1(???) 
e0 preinit fsid fcadf6e2-9176-5818-a8d0-37b19acf8eaf Oct 14 06:06:35 localhost nova_compute[297686]: 2025-10-14 10:06:35.213 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:06:36 localhost ceph-mon[317114]: mon.np0005486733@-1(synchronizing) e16 sync_obtain_latest_monmap Oct 14 06:06:36 localhost ceph-mon[317114]: mon.np0005486733@-1(synchronizing) e16 sync_obtain_latest_monmap obtained monmap e16 Oct 14 06:06:36 localhost podman[317360]: 2025-10-14 10:06:36.385132011 +0000 UTC m=+0.095746065 container exec b19a91314e045cbe5465f6abdeb6b420108fcda7860b498b01dcd4e65236d2c8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-crash-np0005486733, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., vcs-type=git, RELEASE=main, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, distribution-scope=public, ceph=True, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , build-date=2025-09-24T08:57:55, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, release=553, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=rhceph-container, name=rhceph) Oct 14 06:06:36 localhost podman[317360]: 2025-10-14 10:06:36.489199769 +0000 UTC m=+0.199813833 container exec_died b19a91314e045cbe5465f6abdeb6b420108fcda7860b498b01dcd4e65236d2c8 
(image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-crash-np0005486733, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_CLEAN=True, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, ceph=True, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., build-date=2025-09-24T08:57:55, description=Red Hat Ceph Storage 7, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux , architecture=x86_64, io.openshift.expose-services=, io.buildah.version=1.33.12, name=rhceph, release=553, GIT_BRANCH=main, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhceph ceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553) Oct 14 06:06:36 localhost ceph-mon[317114]: mon.np0005486733@-1(synchronizing).mds e16 new map Oct 14 06:06:36 localhost ceph-mon[317114]: mon.np0005486733@-1(synchronizing).mds e16 print_map#012e16#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,12=quiesce subvolumes}#012legacy client fscid: 1#012 #012Filesystem 'cephfs' (1)#012fs_name#011cephfs#012epoch#01115#012flags#01112 joinable allow_snaps 
allow_multimds_snaps#012created#0112025-10-14T08:11:54.831494+0000#012modified#0112025-10-14T10:00:48.835986+0000#012tableserver#0110#012root#0110#012session_timeout#01160#012session_autoclose#011300#012max_file_size#0111099511627776#012required_client_features#011{}#012last_failure#0110#012last_failure_osd_epoch#01178#012compat#011compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,12=quiesce subvolumes}#012max_mds#0111#012in#0110#012up#011{0=26888}#012failed#011#012damaged#011#012stopped#011#012data_pools#011[6]#012metadata_pool#0117#012inline_data#011disabled#012balancer#011#012bal_rank_mask#011-1#012standby_count_wanted#0111#012qdb_cluster#011leader: 26888 members: 26888#012[mds.mds.np0005486732.xkownj{0:26888} state up:active seq 13 addr [v2:172.18.0.107:6808/1205328170,v1:172.18.0.107:6809/1205328170] compat {c=[1],r=[1],i=[17ff]}]#012 #012 #012Standby daemons:#012 #012[mds.mds.np0005486733.tvstmf{-1:17244} state up:standby seq 1 addr [v2:172.18.0.108:6808/3626555326,v1:172.18.0.108:6809/3626555326] compat {c=[1],r=[1],i=[17ff]}]#012[mds.mds.np0005486731.onyaog{-1:17256} state up:standby seq 1 addr [v2:172.18.0.106:6808/799411272,v1:172.18.0.106:6809/799411272] compat {c=[1],r=[1],i=[17ff]}] Oct 14 06:06:36 localhost ceph-mon[317114]: mon.np0005486733@-1(synchronizing).osd e82 crush map has features 3314933000852226048, adjusting msgr requires Oct 14 06:06:36 localhost ceph-mon[317114]: mon.np0005486733@-1(synchronizing).osd e82 crush map has features 288514051259236352, adjusting msgr requires Oct 14 06:06:36 localhost ceph-mon[317114]: mon.np0005486733@-1(synchronizing).osd e82 crush map has features 288514051259236352, adjusting msgr requires Oct 14 06:06:36 localhost ceph-mon[317114]: mon.np0005486733@-1(synchronizing).osd e82 crush map has 
features 288514051259236352, adjusting msgr requires Oct 14 06:06:36 localhost ceph-mon[317114]: from='mgr.17415 172.18.0.107:0/230210271' entity='mgr.np0005486732.pasqzz' Oct 14 06:06:36 localhost ceph-mon[317114]: from='mgr.17415 172.18.0.107:0/230210271' entity='mgr.np0005486732.pasqzz' Oct 14 06:06:36 localhost ceph-mon[317114]: Reconfiguring osd.0 (monmap changed)... Oct 14 06:06:36 localhost ceph-mon[317114]: from='mgr.17415 172.18.0.107:0/230210271' entity='mgr.np0005486732.pasqzz' cmd={"prefix": "auth get", "entity": "osd.0"} : dispatch Oct 14 06:06:36 localhost ceph-mon[317114]: Reconfiguring daemon osd.0 on np0005486733.localdomain Oct 14 06:06:36 localhost ceph-mon[317114]: from='mgr.17415 172.18.0.107:0/230210271' entity='mgr.np0005486732.pasqzz' Oct 14 06:06:36 localhost ceph-mon[317114]: from='mgr.17415 172.18.0.107:0/230210271' entity='mgr.np0005486732.pasqzz' Oct 14 06:06:36 localhost ceph-mon[317114]: Reconfiguring osd.3 (monmap changed)... Oct 14 06:06:36 localhost ceph-mon[317114]: from='mgr.17415 172.18.0.107:0/230210271' entity='mgr.np0005486732.pasqzz' cmd={"prefix": "auth get", "entity": "osd.3"} : dispatch Oct 14 06:06:36 localhost ceph-mon[317114]: Reconfiguring daemon osd.3 on np0005486733.localdomain Oct 14 06:06:36 localhost ceph-mon[317114]: from='mgr.17415 172.18.0.107:0/230210271' entity='mgr.np0005486732.pasqzz' Oct 14 06:06:36 localhost ceph-mon[317114]: from='mgr.17415 172.18.0.107:0/230210271' entity='mgr.np0005486732.pasqzz' Oct 14 06:06:36 localhost ceph-mon[317114]: Reconfiguring mds.mds.np0005486733.tvstmf (monmap changed)... 
Oct 14 06:06:36 localhost ceph-mon[317114]: from='mgr.17415 172.18.0.107:0/230210271' entity='mgr.np0005486732.pasqzz' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005486733.tvstmf", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Oct 14 06:06:36 localhost ceph-mon[317114]: Reconfiguring daemon mds.mds.np0005486733.tvstmf on np0005486733.localdomain Oct 14 06:06:36 localhost ceph-mon[317114]: from='mgr.17415 172.18.0.107:0/230210271' entity='mgr.np0005486732.pasqzz' Oct 14 06:06:36 localhost ceph-mon[317114]: from='mgr.17415 172.18.0.107:0/230210271' entity='mgr.np0005486732.pasqzz' Oct 14 06:06:36 localhost ceph-mon[317114]: from='mgr.17415 172.18.0.107:0/230210271' entity='mgr.np0005486732.pasqzz' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005486733.primvu", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Oct 14 06:06:36 localhost ceph-mon[317114]: from='mgr.17415 172.18.0.107:0/230210271' entity='mgr.np0005486732.pasqzz' Oct 14 06:06:36 localhost ceph-mon[317114]: from='mgr.17415 172.18.0.107:0/230210271' entity='mgr.np0005486732.pasqzz' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Oct 14 06:06:36 localhost ceph-mon[317114]: Reconfiguring mgr.np0005486733.primvu (monmap changed)... 
Oct 14 06:06:36 localhost ceph-mon[317114]: Reconfiguring daemon mgr.np0005486733.primvu on np0005486733.localdomain Oct 14 06:06:36 localhost ceph-mon[317114]: Deploying daemon mon.np0005486733 on np0005486733.localdomain Oct 14 06:06:36 localhost ceph-mon[317114]: from='mgr.17415 172.18.0.107:0/230210271' entity='mgr.np0005486732.pasqzz' Oct 14 06:06:36 localhost ceph-mon[317114]: from='mgr.17415 172.18.0.107:0/230210271' entity='mgr.np0005486732.pasqzz' Oct 14 06:06:36 localhost ceph-mon[317114]: from='mgr.17415 172.18.0.107:0/230210271' entity='mgr.np0005486732.pasqzz' Oct 14 06:06:36 localhost ceph-mon[317114]: from='mgr.17415 172.18.0.107:0/230210271' entity='mgr.np0005486732.pasqzz' Oct 14 06:06:36 localhost ceph-mon[317114]: from='mgr.17415 172.18.0.107:0/230210271' entity='mgr.np0005486732.pasqzz' Oct 14 06:06:36 localhost ceph-mon[317114]: from='mgr.17415 172.18.0.107:0/230210271' entity='mgr.np0005486732.pasqzz' Oct 14 06:06:36 localhost ceph-mon[317114]: from='mgr.17415 172.18.0.107:0/230210271' entity='mgr.np0005486732.pasqzz' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Oct 14 06:06:36 localhost ceph-mon[317114]: from='mgr.17415 172.18.0.107:0/230210271' entity='mgr.np0005486732.pasqzz' Oct 14 06:06:36 localhost ceph-mon[317114]: Health check failed: 1 failed cephadm daemon(s) (CEPHADM_FAILED_DAEMON) Oct 14 06:06:36 localhost ceph-mon[317114]: from='mgr.17415 172.18.0.107:0/230210271' entity='mgr.np0005486732.pasqzz' Oct 14 06:06:36 localhost ceph-mon[317114]: mon.np0005486733@-1(synchronizing).paxosservice(auth 1..41) refresh upgraded, format 0 -> 3 Oct 14 06:06:36 localhost ceph-mgr[302471]: ms_deliver_dispatch: unhandled message 0x55da44b971e0 mon_map magic: 0 from mon.1 v2:172.18.0.104:3300/0 Oct 14 06:06:37 localhost nova_compute[297686]: 2025-10-14 10:06:37.949 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:06:38 localhost 
ceph-mon[317114]: mon.np0005486733@-1(probing) e17 my rank is now 2 (was -1) Oct 14 06:06:38 localhost ceph-mon[317114]: log_channel(cluster) log [INF] : mon.np0005486733 calling monitor election Oct 14 06:06:38 localhost ceph-mon[317114]: paxos.2).electionLogic(0) init, first boot, initializing epoch at 1 Oct 14 06:06:38 localhost ceph-mon[317114]: mon.np0005486733@2(electing) e17 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Oct 14 06:06:38 localhost openstack_network_exporter[250374]: ERROR 10:06:38 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Oct 14 06:06:38 localhost openstack_network_exporter[250374]: ERROR 10:06:38 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Oct 14 06:06:38 localhost openstack_network_exporter[250374]: ERROR 10:06:38 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Oct 14 06:06:38 localhost openstack_network_exporter[250374]: ERROR 10:06:38 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Oct 14 06:06:38 localhost openstack_network_exporter[250374]: Oct 14 06:06:38 localhost openstack_network_exporter[250374]: ERROR 10:06:38 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Oct 14 06:06:38 localhost openstack_network_exporter[250374]: Oct 14 06:06:40 localhost nova_compute[297686]: 2025-10-14 10:06:40.216 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:06:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041. Oct 14 06:06:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857. 
Oct 14 06:06:41 localhost podman[317484]: 2025-10-14 10:06:41.788472146 +0000 UTC m=+0.122852064 container health_status 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.build-date=20251009, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true) Oct 14 06:06:41 localhost podman[317484]: 2025-10-14 10:06:41.798195904 +0000 UTC 
m=+0.132575782 container exec_died 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS) Oct 14 06:06:41 localhost systemd[1]: 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857.service: Deactivated successfully. 
Oct 14 06:06:41 localhost podman[317481]: 2025-10-14 10:06:41.765959946 +0000 UTC m=+0.107624808 container health_status 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm) Oct 14 06:06:41 localhost podman[317481]: 2025-10-14 10:06:41.852181958 +0000 UTC m=+0.193846740 container exec_died 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', 
'/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Oct 14 06:06:41 localhost systemd[1]: 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041.service: Deactivated successfully. Oct 14 06:06:42 localhost nova_compute[297686]: 2025-10-14 10:06:42.980 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:06:43 localhost ceph-mon[317114]: mon.np0005486732 calling monitor election Oct 14 06:06:43 localhost ceph-mon[317114]: mon.np0005486731 calling monitor election Oct 14 06:06:43 localhost ceph-mon[317114]: mon.np0005486731 is new leader, mons np0005486731,np0005486732 in quorum (ranks 0,1) Oct 14 06:06:43 localhost ceph-mon[317114]: Health check failed: 1/3 mons down, quorum np0005486731,np0005486732 (MON_DOWN) Oct 14 06:06:43 localhost ceph-mon[317114]: Health detail: HEALTH_WARN 1 failed cephadm daemon(s); 2 stray daemon(s) not managed by cephadm; 2 stray host(s) with 2 daemon(s) not managed by cephadm; 1/3 mons down, quorum np0005486731,np0005486732 Oct 14 06:06:43 localhost ceph-mon[317114]: [WRN] CEPHADM_FAILED_DAEMON: 1 failed cephadm daemon(s) Oct 14 06:06:43 localhost ceph-mon[317114]: daemon mon.np0005486733 on np0005486733.localdomain is in unknown state Oct 14 06:06:43 localhost ceph-mon[317114]: [WRN] CEPHADM_STRAY_DAEMON: 2 stray daemon(s) not managed by cephadm Oct 14 06:06:43 localhost ceph-mon[317114]: stray daemon mgr.np0005486728.giajub on host np0005486728.localdomain not managed by cephadm Oct 14 06:06:43 localhost ceph-mon[317114]: stray daemon mgr.np0005486729.xpybho on host np0005486729.localdomain not managed by cephadm Oct 14 06:06:43 localhost ceph-mon[317114]: [WRN] CEPHADM_STRAY_HOST: 2 stray host(s) with 2 daemon(s) not managed by cephadm Oct 14 06:06:43 localhost ceph-mon[317114]: stray host np0005486728.localdomain has 1 
stray daemons: ['mgr.np0005486728.giajub'] Oct 14 06:06:43 localhost ceph-mon[317114]: stray host np0005486729.localdomain has 1 stray daemons: ['mgr.np0005486729.xpybho'] Oct 14 06:06:43 localhost ceph-mon[317114]: [WRN] MON_DOWN: 1/3 mons down, quorum np0005486731,np0005486732 Oct 14 06:06:43 localhost ceph-mon[317114]: mon.np0005486733 (rank 2) addr [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] is down (out of quorum) Oct 14 06:06:43 localhost ceph-mon[317114]: from='mgr.17415 172.18.0.107:0/230210271' entity='mgr.np0005486732.pasqzz' Oct 14 06:06:43 localhost ceph-mon[317114]: from='mgr.17415 172.18.0.107:0/230210271' entity='mgr.np0005486732.pasqzz' Oct 14 06:06:43 localhost ceph-mon[317114]: from='mgr.17415 172.18.0.107:0/230210271' entity='mgr.np0005486732.pasqzz' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Oct 14 06:06:43 localhost ceph-mon[317114]: Updating np0005486731.localdomain:/etc/ceph/ceph.conf Oct 14 06:06:43 localhost ceph-mon[317114]: Updating np0005486732.localdomain:/etc/ceph/ceph.conf Oct 14 06:06:43 localhost ceph-mon[317114]: Updating np0005486733.localdomain:/etc/ceph/ceph.conf Oct 14 06:06:43 localhost ceph-mon[317114]: log_channel(cluster) log [INF] : mon.np0005486733 calling monitor election Oct 14 06:06:43 localhost ceph-mon[317114]: paxos.2).electionLogic(0) init, first boot, initializing epoch at 1 Oct 14 06:06:43 localhost ceph-mon[317114]: mon.np0005486733@2(electing) e17 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Oct 14 06:06:43 localhost ceph-mon[317114]: mon.np0005486733@2(electing) e17 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Oct 14 06:06:43 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 _apply_compatset_features enabling new quorum features: compat={},rocompat={},incompat={4=support erasure code pools,5=new-style osdmap encoding,6=support isa/lrc erasure code,7=support shec erasure code} Oct 14 
06:06:43 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 _apply_compatset_features enabling new quorum features: compat={},rocompat={},incompat={8=support monmap features,9=luminous ondisk layout,10=mimic ondisk layout,11=nautilus ondisk layout,12=octopus ondisk layout,13=pacific ondisk layout,14=quincy ondisk layout,15=reef ondisk layout} Oct 14 06:06:43 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Oct 14 06:06:43 localhost ceph-mon[317114]: mgrc update_daemon_metadata mon.np0005486733 metadata {addrs=[v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0],arch=x86_64,ceph_release=reef,ceph_version=ceph version 18.2.1-361.el9cp (439dcd6094d413840eb2ec590fe2194ec616687f) reef (stable),ceph_version_short=18.2.1-361.el9cp,compression_algorithms=none, snappy, zlib, zstd, lz4,container_hostname=np0005486733.localdomain,container_image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest,cpu=AMD EPYC-Rome Processor,device_ids=,device_paths=vda=/dev/disk/by-path/pci-0000:00:04.0,devices=vda,distro=rhel,distro_description=Red Hat Enterprise Linux 9.6 (Plow),distro_version=9.6,hostname=np0005486733.localdomain,kernel_description=#1 SMP PREEMPT_DYNAMIC Wed Apr 12 10:45:03 EDT 2023,kernel_version=5.14.0-284.11.1.el9_2.x86_64,mem_swap_kb=1048572,mem_total_kb=16116612,os=Linux} Oct 14 06:06:44 localhost ceph-mon[317114]: mon.np0005486733 calling monitor election Oct 14 06:06:44 localhost ceph-mon[317114]: mon.np0005486732 calling monitor election Oct 14 06:06:44 localhost ceph-mon[317114]: mon.np0005486731 calling monitor election Oct 14 06:06:44 localhost ceph-mon[317114]: mon.np0005486733 calling monitor election Oct 14 06:06:44 localhost ceph-mon[317114]: mon.np0005486731 is new leader, mons np0005486731,np0005486732,np0005486733 in quorum (ranks 0,1,2) Oct 14 06:06:44 localhost ceph-mon[317114]: Health check cleared: MON_DOWN (was: 1/3 mons down, quorum 
np0005486731,np0005486732) Oct 14 06:06:44 localhost ceph-mon[317114]: Health detail: HEALTH_WARN 1 failed cephadm daemon(s); 2 stray daemon(s) not managed by cephadm; 2 stray host(s) with 2 daemon(s) not managed by cephadm Oct 14 06:06:44 localhost ceph-mon[317114]: [WRN] CEPHADM_FAILED_DAEMON: 1 failed cephadm daemon(s) Oct 14 06:06:44 localhost ceph-mon[317114]: daemon mon.np0005486733 on np0005486733.localdomain is in unknown state Oct 14 06:06:44 localhost ceph-mon[317114]: [WRN] CEPHADM_STRAY_DAEMON: 2 stray daemon(s) not managed by cephadm Oct 14 06:06:44 localhost ceph-mon[317114]: stray daemon mgr.np0005486728.giajub on host np0005486728.localdomain not managed by cephadm Oct 14 06:06:44 localhost ceph-mon[317114]: stray daemon mgr.np0005486729.xpybho on host np0005486729.localdomain not managed by cephadm Oct 14 06:06:44 localhost ceph-mon[317114]: [WRN] CEPHADM_STRAY_HOST: 2 stray host(s) with 2 daemon(s) not managed by cephadm Oct 14 06:06:44 localhost ceph-mon[317114]: stray host np0005486728.localdomain has 1 stray daemons: ['mgr.np0005486728.giajub'] Oct 14 06:06:44 localhost ceph-mon[317114]: stray host np0005486729.localdomain has 1 stray daemons: ['mgr.np0005486729.xpybho'] Oct 14 06:06:45 localhost ceph-mon[317114]: Reconfiguring crash.np0005486731 (monmap changed)... 
Oct 14 06:06:45 localhost ceph-mon[317114]: Reconfiguring daemon crash.np0005486731 on np0005486731.localdomain Oct 14 06:06:45 localhost ceph-mon[317114]: from='mgr.17415 172.18.0.107:0/230210271' entity='mgr.np0005486732.pasqzz' Oct 14 06:06:45 localhost ceph-mon[317114]: from='mgr.17415 172.18.0.107:0/230210271' entity='mgr.np0005486732.pasqzz' Oct 14 06:06:45 localhost ceph-mon[317114]: from='mgr.17415 172.18.0.107:0/230210271' entity='mgr.np0005486732.pasqzz' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch Oct 14 06:06:45 localhost nova_compute[297686]: 2025-10-14 10:06:45.245 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:06:45 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix": "status", "format": "json"} v 0) Oct 14 06:06:45 localhost ceph-mon[317114]: log_channel(audit) log [DBG] : from='client.? 172.18.0.200:0/3191562138' entity='client.admin' cmd={"prefix": "status", "format": "json"} : dispatch Oct 14 06:06:46 localhost ceph-mon[317114]: Reconfiguring osd.2 (monmap changed)... 
Oct 14 06:06:46 localhost ceph-mon[317114]: Reconfiguring daemon osd.2 on np0005486731.localdomain Oct 14 06:06:46 localhost ceph-mon[317114]: Health check cleared: CEPHADM_FAILED_DAEMON (was: 1 failed cephadm daemon(s)) Oct 14 06:06:46 localhost ceph-mon[317114]: from='mgr.17415 172.18.0.107:0/230210271' entity='mgr.np0005486732.pasqzz' Oct 14 06:06:46 localhost ceph-mon[317114]: from='mgr.17415 172.18.0.107:0/230210271' entity='mgr.np0005486732.pasqzz' Oct 14 06:06:46 localhost ceph-mon[317114]: from='mgr.17415 172.18.0.107:0/230210271' entity='mgr.np0005486732.pasqzz' cmd={"prefix": "auth get", "entity": "osd.4"} : dispatch Oct 14 06:06:46 localhost ceph-mon[317114]: from='mgr.17415 172.18.0.107:0/230210271' entity='mgr.np0005486732.pasqzz' Oct 14 06:06:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29. Oct 14 06:06:46 localhost podman[317856]: 2025-10-14 10:06:46.734272805 +0000 UTC m=+0.074447622 container health_status fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, org.label-schema.build-date=20251009, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=iscsid, managed_by=edpm_ansible, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, io.buildah.version=1.41.3) Oct 14 06:06:46 localhost podman[317856]: 2025-10-14 10:06:46.744221579 +0000 UTC m=+0.084396386 container exec_died fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, org.label-schema.schema-version=1.0, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251009) Oct 14 06:06:46 localhost systemd[1]: fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29.service: Deactivated successfully. Oct 14 06:06:47 localhost ceph-mon[317114]: Reconfiguring osd.4 (monmap changed)... Oct 14 06:06:47 localhost ceph-mon[317114]: Reconfiguring daemon osd.4 on np0005486731.localdomain Oct 14 06:06:47 localhost ceph-mon[317114]: from='mgr.17415 172.18.0.107:0/230210271' entity='mgr.np0005486732.pasqzz' Oct 14 06:06:47 localhost ceph-mon[317114]: from='mgr.17415 172.18.0.107:0/230210271' entity='mgr.np0005486732.pasqzz' Oct 14 06:06:47 localhost ceph-mon[317114]: from='mgr.17415 172.18.0.107:0/230210271' entity='mgr.np0005486732.pasqzz' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005486731.onyaog", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Oct 14 06:06:47 localhost ceph-mon[317114]: from='mgr.17415 172.18.0.107:0/230210271' entity='mgr.np0005486732.pasqzz' Oct 14 06:06:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad. Oct 14 06:06:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec. 
Oct 14 06:06:47 localhost podman[317877]: 2025-10-14 10:06:47.73725237 +0000 UTC m=+0.078159295 container health_status aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors ) Oct 14 06:06:47 localhost podman[317877]: 2025-10-14 10:06:47.742311065 +0000 UTC m=+0.083218010 container exec_died aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 
'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Oct 14 06:06:47 localhost systemd[1]: aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec.service: Deactivated successfully. Oct 14 06:06:47 localhost systemd[1]: tmp-crun.eM2FsA.mount: Deactivated successfully. 
Oct 14 06:06:47 localhost podman[317876]: 2025-10-14 10:06:47.793576065 +0000 UTC m=+0.135713248 container health_status 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS) Oct 14 06:06:47 localhost podman[317876]: 2025-10-14 10:06:47.80580908 +0000 UTC m=+0.147946243 container exec_died 
02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_id=multipathd, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, io.buildah.version=1.41.3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true) Oct 14 06:06:47 localhost systemd[1]: 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad.service: Deactivated successfully. 
Oct 14 06:06:47 localhost nova_compute[297686]: 2025-10-14 10:06:47.982 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:06:48 localhost ceph-mon[317114]: Reconfiguring mds.mds.np0005486731.onyaog (monmap changed)... Oct 14 06:06:48 localhost ceph-mon[317114]: Reconfiguring daemon mds.mds.np0005486731.onyaog on np0005486731.localdomain Oct 14 06:06:48 localhost ceph-mon[317114]: Reconfig service osd.default_drive_group Oct 14 06:06:48 localhost ceph-mon[317114]: from='mgr.17415 172.18.0.107:0/230210271' entity='mgr.np0005486732.pasqzz' Oct 14 06:06:48 localhost ceph-mon[317114]: from='mgr.17415 172.18.0.107:0/230210271' entity='mgr.np0005486732.pasqzz' Oct 14 06:06:48 localhost ceph-mon[317114]: from='mgr.17415 172.18.0.107:0/230210271' entity='mgr.np0005486732.pasqzz' Oct 14 06:06:48 localhost ceph-mon[317114]: from='mgr.17415 172.18.0.107:0/230210271' entity='mgr.np0005486732.pasqzz' Oct 14 06:06:48 localhost ceph-mon[317114]: from='mgr.17415 172.18.0.107:0/230210271' entity='mgr.np0005486732.pasqzz' Oct 14 06:06:48 localhost ceph-mon[317114]: from='mgr.17415 172.18.0.107:0/230210271' entity='mgr.np0005486732.pasqzz' Oct 14 06:06:48 localhost ceph-mon[317114]: from='mgr.17415 172.18.0.107:0/230210271' entity='mgr.np0005486732.pasqzz' Oct 14 06:06:48 localhost ceph-mon[317114]: from='mgr.17415 172.18.0.107:0/230210271' entity='mgr.np0005486732.pasqzz' Oct 14 06:06:48 localhost ceph-mon[317114]: from='mgr.17415 172.18.0.107:0/230210271' entity='mgr.np0005486732.pasqzz' Oct 14 06:06:48 localhost ceph-mon[317114]: from='mgr.17415 172.18.0.107:0/230210271' entity='mgr.np0005486732.pasqzz' Oct 14 06:06:48 localhost ceph-mon[317114]: from='mgr.17415 172.18.0.107:0/230210271' entity='mgr.np0005486732.pasqzz' Oct 14 06:06:48 localhost ceph-mon[317114]: from='mgr.17415 172.18.0.107:0/230210271' entity='mgr.np0005486732.pasqzz' cmd={"prefix": "auth get-or-create", "entity": 
"mgr.np0005486731.swasqz", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Oct 14 06:06:48 localhost ceph-mon[317114]: from='mgr.17415 172.18.0.107:0/230210271' entity='mgr.np0005486732.pasqzz' Oct 14 06:06:48 localhost ceph-mon[317114]: from='mgr.17415 172.18.0.107:0/230210271' entity='mgr.np0005486732.pasqzz' Oct 14 06:06:48 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix": "mgr fail"} v 0) Oct 14 06:06:48 localhost ceph-mon[317114]: log_channel(audit) log [INF] : from='client.? 172.18.0.200:0/664587782' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch Oct 14 06:06:48 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e82 _set_cache_ratios kv ratio 0.25 inc ratio 0.375 full ratio 0.375 Oct 14 06:06:48 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e82 register_cache_with_pcm pcm target: 2147483648 pcm max: 1020054732 pcm min: 134217728 inc_osd_cache size: 1 Oct 14 06:06:48 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e83 e83: 6 total, 6 up, 6 in Oct 14 06:06:48 localhost ceph-mgr[302471]: mgr handle_mgr_map Activating! 
Oct 14 06:06:48 localhost ceph-mgr[302471]: mgr handle_mgr_map I am now activating
Oct 14 06:06:48 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix": "mon metadata", "id": "np0005486731"} v 0)
Oct 14 06:06:48 localhost ceph-mon[317114]: log_channel(audit) log [DBG] : from='mgr.17433 172.18.0.108:0/2728758967' entity='mgr.np0005486733.primvu' cmd={"prefix": "mon metadata", "id": "np0005486731"} : dispatch
Oct 14 06:06:48 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix": "mon metadata", "id": "np0005486732"} v 0)
Oct 14 06:06:48 localhost ceph-mon[317114]: log_channel(audit) log [DBG] : from='mgr.17433 172.18.0.108:0/2728758967' entity='mgr.np0005486733.primvu' cmd={"prefix": "mon metadata", "id": "np0005486732"} : dispatch
Oct 14 06:06:48 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix": "mon metadata", "id": "np0005486733"} v 0)
Oct 14 06:06:48 localhost ceph-mon[317114]: log_channel(audit) log [DBG] : from='mgr.17433 172.18.0.108:0/2728758967' entity='mgr.np0005486733.primvu' cmd={"prefix": "mon metadata", "id": "np0005486733"} : dispatch
Oct 14 06:06:48 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix": "mds metadata", "who": "mds.np0005486733.tvstmf"} v 0)
Oct 14 06:06:48 localhost ceph-mon[317114]: log_channel(audit) log [DBG] : from='mgr.17433 172.18.0.108:0/2728758967' entity='mgr.np0005486733.primvu' cmd={"prefix": "mds metadata", "who": "mds.np0005486733.tvstmf"} : dispatch
Oct 14 06:06:48 localhost ceph-mon[317114]: mon.np0005486733@2(peon).mds e16 all = 0
Oct 14 06:06:48 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix": "mds metadata", "who": "mds.np0005486731.onyaog"} v 0)
Oct 14 06:06:48 localhost ceph-mon[317114]: log_channel(audit) log [DBG] : from='mgr.17433 172.18.0.108:0/2728758967' entity='mgr.np0005486733.primvu' cmd={"prefix": "mds metadata", "who": "mds.np0005486731.onyaog"} : dispatch
Oct 14 06:06:48 localhost ceph-mon[317114]: mon.np0005486733@2(peon).mds e16 all = 0
Oct 14 06:06:48 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix": "mds metadata", "who": "mds.np0005486732.xkownj"} v 0)
Oct 14 06:06:48 localhost ceph-mon[317114]: log_channel(audit) log [DBG] : from='mgr.17433 172.18.0.108:0/2728758967' entity='mgr.np0005486733.primvu' cmd={"prefix": "mds metadata", "who": "mds.np0005486732.xkownj"} : dispatch
Oct 14 06:06:48 localhost ceph-mon[317114]: mon.np0005486733@2(peon).mds e16 all = 0
Oct 14 06:06:48 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix": "mgr metadata", "who": "np0005486733.primvu", "id": "np0005486733.primvu"} v 0)
Oct 14 06:06:48 localhost ceph-mon[317114]: log_channel(audit) log [DBG] : from='mgr.17433 172.18.0.108:0/2728758967' entity='mgr.np0005486733.primvu' cmd={"prefix": "mgr metadata", "who": "np0005486733.primvu", "id": "np0005486733.primvu"} : dispatch
Oct 14 06:06:48 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix": "mgr metadata", "who": "np0005486728.giajub", "id": "np0005486728.giajub"} v 0)
Oct 14 06:06:48 localhost ceph-mon[317114]: log_channel(audit) log [DBG] : from='mgr.17433 172.18.0.108:0/2728758967' entity='mgr.np0005486733.primvu' cmd={"prefix": "mgr metadata", "who": "np0005486728.giajub", "id": "np0005486728.giajub"} : dispatch
Oct 14 06:06:48 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix": "mgr metadata", "who": "np0005486729.xpybho", "id": "np0005486729.xpybho"} v 0)
Oct 14 06:06:48 localhost ceph-mon[317114]: log_channel(audit) log [DBG] : from='mgr.17433 172.18.0.108:0/2728758967' entity='mgr.np0005486733.primvu' cmd={"prefix": "mgr metadata", "who": "np0005486729.xpybho", "id": "np0005486729.xpybho"} : dispatch
Oct 14 06:06:48 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix": "mgr metadata", "who": "np0005486730.ddfidc", "id": "np0005486730.ddfidc"} v 0)
Oct 14 06:06:48 localhost ceph-mon[317114]: log_channel(audit) log [DBG] : from='mgr.17433 172.18.0.108:0/2728758967' entity='mgr.np0005486733.primvu' cmd={"prefix": "mgr metadata", "who": "np0005486730.ddfidc", "id": "np0005486730.ddfidc"} : dispatch
Oct 14 06:06:48 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix": "mgr metadata", "who": "np0005486731.swasqz", "id": "np0005486731.swasqz"} v 0)
Oct 14 06:06:48 localhost ceph-mon[317114]: log_channel(audit) log [DBG] : from='mgr.17433 172.18.0.108:0/2728758967' entity='mgr.np0005486733.primvu' cmd={"prefix": "mgr metadata", "who": "np0005486731.swasqz", "id": "np0005486731.swasqz"} : dispatch
Oct 14 06:06:48 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix": "osd metadata", "id": 0} v 0)
Oct 14 06:06:48 localhost ceph-mon[317114]: log_channel(audit) log [DBG] : from='mgr.17433 172.18.0.108:0/2728758967' entity='mgr.np0005486733.primvu' cmd={"prefix": "osd metadata", "id": 0} : dispatch
Oct 14 06:06:48 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix": "osd metadata", "id": 1} v 0)
Oct 14 06:06:48 localhost ceph-mon[317114]: log_channel(audit) log [DBG] : from='mgr.17433 172.18.0.108:0/2728758967' entity='mgr.np0005486733.primvu' cmd={"prefix": "osd metadata", "id": 1} : dispatch
Oct 14 06:06:48 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix": "osd metadata", "id": 2} v 0)
Oct 14 06:06:48 localhost ceph-mon[317114]: log_channel(audit) log [DBG] : from='mgr.17433 172.18.0.108:0/2728758967' entity='mgr.np0005486733.primvu' cmd={"prefix": "osd metadata", "id": 2} : dispatch
Oct 14 06:06:48 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix": "osd metadata", "id": 3} v 0)
Oct 14 06:06:48 localhost ceph-mon[317114]: log_channel(audit) log [DBG] : from='mgr.17433 172.18.0.108:0/2728758967' entity='mgr.np0005486733.primvu' cmd={"prefix": "osd metadata", "id": 3} : dispatch
Oct 14 06:06:48 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix": "osd metadata", "id": 4} v 0)
Oct 14 06:06:48 localhost ceph-mon[317114]: log_channel(audit) log [DBG] : from='mgr.17433 172.18.0.108:0/2728758967' entity='mgr.np0005486733.primvu' cmd={"prefix": "osd metadata", "id": 4} : dispatch
Oct 14 06:06:48 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix": "osd metadata", "id": 5} v 0)
Oct 14 06:06:48 localhost ceph-mon[317114]: log_channel(audit) log [DBG] : from='mgr.17433 172.18.0.108:0/2728758967' entity='mgr.np0005486733.primvu' cmd={"prefix": "osd metadata", "id": 5} : dispatch
Oct 14 06:06:48 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix": "mds metadata"} v 0)
Oct 14 06:06:48 localhost ceph-mon[317114]: log_channel(audit) log [DBG] : from='mgr.17433 172.18.0.108:0/2728758967' entity='mgr.np0005486733.primvu' cmd={"prefix": "mds metadata"} : dispatch
Oct 14 06:06:48 localhost ceph-mon[317114]: mon.np0005486733@2(peon).mds e16 all = 1
Oct 14 06:06:48 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix": "osd metadata"} v 0)
Oct 14 06:06:48 localhost ceph-mon[317114]: log_channel(audit) log [DBG] : from='mgr.17433 172.18.0.108:0/2728758967' entity='mgr.np0005486733.primvu' cmd={"prefix": "osd metadata"} : dispatch
Oct 14 06:06:48 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix": "mon metadata"} v 0)
Oct 14 06:06:48 localhost ceph-mon[317114]: log_channel(audit) log [DBG] : from='mgr.17433 172.18.0.108:0/2728758967' entity='mgr.np0005486733.primvu' cmd={"prefix": "mon metadata"} : dispatch
Oct 14 06:06:48 localhost ceph-mgr[302471]: [balancer DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Oct 14 06:06:48 localhost ceph-mgr[302471]: mgr load Constructed class from module: balancer
Oct 14 06:06:48 localhost ceph-mgr[302471]: [cephadm DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Oct 14 06:06:48 localhost ceph-mgr[302471]: [balancer INFO root] Starting
Oct 14 06:06:48 localhost ceph-mgr[302471]: [balancer INFO root] Optimize plan auto_2025-10-14_10:06:48
Oct 14 06:06:48 localhost ceph-mgr[302471]: [balancer INFO root] Mode upmap, max misplaced 0.050000
Oct 14 06:06:48 localhost ceph-mgr[302471]: [balancer INFO root] Some PGs (1.000000) are unknown; try again later
Oct 14 06:06:48 localhost ceph-mgr[302471]: [cephadm WARNING root] removing stray HostCache host record np0005486730.localdomain.devices.0
Oct 14 06:06:48 localhost ceph-mgr[302471]: log_channel(cephadm) log [WRN] : removing stray HostCache host record np0005486730.localdomain.devices.0
Oct 14 06:06:48 localhost systemd[1]: session-69.scope: Deactivated successfully.
Oct 14 06:06:48 localhost systemd[1]: session-69.scope: Consumed 28.430s CPU time.
Oct 14 06:06:48 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix":"config-key del","key":"mgr/cephadm/host.np0005486730.localdomain.devices.0"} v 0)
Oct 14 06:06:48 localhost ceph-mon[317114]: log_channel(audit) log [INF] : from='mgr.17433 172.18.0.108:0/2728758967' entity='mgr.np0005486733.primvu' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005486730.localdomain.devices.0"} : dispatch
Oct 14 06:06:48 localhost systemd-logind[760]: Session 69 logged out. Waiting for processes to exit.
Oct 14 06:06:48 localhost systemd-logind[760]: Removed session 69.
Oct 14 06:06:48 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix":"config-key del","key":"mgr/cephadm/host.np0005486730.localdomain.devices.0"} v 0)
Oct 14 06:06:48 localhost ceph-mon[317114]: log_channel(audit) log [INF] : from='mgr.17433 172.18.0.108:0/2728758967' entity='mgr.np0005486733.primvu' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005486730.localdomain.devices.0"} : dispatch
Oct 14 06:06:48 localhost ceph-mgr[302471]: mgr load Constructed class from module: cephadm
Oct 14 06:06:48 localhost ceph-mgr[302471]: [crash DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Oct 14 06:06:48 localhost ceph-mgr[302471]: mgr load Constructed class from module: crash
Oct 14 06:06:48 localhost ceph-mgr[302471]: [devicehealth DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Oct 14 06:06:48 localhost ceph-mgr[302471]: mgr load Constructed class from module: devicehealth
Oct 14 06:06:48 localhost ceph-mgr[302471]: [devicehealth INFO root] Starting
Oct 14 06:06:48 localhost ceph-mgr[302471]: [iostat DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Oct 14 06:06:48 localhost ceph-mgr[302471]: mgr load Constructed class from module: iostat
Oct 14 06:06:48 localhost ceph-mgr[302471]: [nfs DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Oct 14 06:06:48 localhost ceph-mgr[302471]: mgr load Constructed class from module: nfs
Oct 14 06:06:48 localhost ceph-mgr[302471]: [orchestrator DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Oct 14 06:06:48 localhost ceph-mgr[302471]: mgr load Constructed class from module: orchestrator
Oct 14 06:06:48 localhost ceph-mgr[302471]: [pg_autoscaler DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Oct 14 06:06:48 localhost ceph-mgr[302471]: mgr load Constructed class from module: pg_autoscaler
Oct 14 06:06:48 localhost ceph-mgr[302471]: [progress DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Oct 14 06:06:48 localhost ceph-mgr[302471]: mgr load Constructed class from module: progress
Oct 14 06:06:48 localhost ceph-mgr[302471]: [progress INFO root] Loading...
Oct 14 06:06:48 localhost ceph-mgr[302471]: [progress INFO root] Loaded [, , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , ] historic events
Oct 14 06:06:48 localhost ceph-mgr[302471]: [pg_autoscaler INFO root] _maybe_adjust
Oct 14 06:06:48 localhost ceph-mgr[302471]: [progress INFO root] Loaded OSDMap, ready.
Oct 14 06:06:48 localhost ceph-mgr[302471]: [rbd_support DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Oct 14 06:06:49 localhost ceph-mgr[302471]: [rbd_support INFO root] recovery thread starting
Oct 14 06:06:49 localhost ceph-mgr[302471]: [rbd_support INFO root] starting setup
Oct 14 06:06:49 localhost ceph-mgr[302471]: mgr load Constructed class from module: rbd_support
Oct 14 06:06:49 localhost ceph-mgr[302471]: [restful DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Oct 14 06:06:49 localhost ceph-mgr[302471]: mgr load Constructed class from module: restful
Oct 14 06:06:49 localhost ceph-mgr[302471]: [restful INFO root] server_addr: :: server_port: 8003
Oct 14 06:06:49 localhost ceph-mgr[302471]: [restful WARNING root] server not running: no certificate configured
Oct 14 06:06:49 localhost ceph-mgr[302471]: [status DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Oct 14 06:06:49 localhost ceph-mgr[302471]: mgr load Constructed class from module: status
Oct 14 06:06:49 localhost ceph-mgr[302471]: [telemetry DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Oct 14 06:06:49 localhost ceph-mgr[302471]: mgr load Constructed class from module: telemetry
Oct 14 06:06:49 localhost ceph-mgr[302471]: [volumes DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Oct 14 06:06:49 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005486733.primvu/mirror_snapshot_schedule"} v 0)
Oct 14 06:06:49 localhost ceph-mon[317114]: log_channel(audit) log [INF] : from='mgr.17433 172.18.0.108:0/2728758967' entity='mgr.np0005486733.primvu' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005486733.primvu/mirror_snapshot_schedule"} : dispatch
Oct 14 06:06:49 localhost ceph-mgr[302471]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: load_schedules
Oct 14 06:06:49 localhost ceph-mgr[302471]: [rbd_support INFO root] load_schedules: vms, start_after=
Oct 14 06:06:49 localhost ceph-mgr[302471]: [volumes INFO volumes.fs.async_job] queuing job for volume 'cephfs'
Oct 14 06:06:49 localhost ceph-mgr[302471]: [volumes INFO volumes.fs.async_job] queuing job for volume 'cephfs'
Oct 14 06:06:49 localhost ceph-mgr[302471]: mgr load Constructed class from module: volumes
Oct 14 06:06:49 localhost ceph-mgr[302471]: [rbd_support INFO root] load_schedules: volumes, start_after=
Oct 14 06:06:49 localhost ceph-mgr[302471]: [rbd_support INFO root] load_schedules: images, start_after=
Oct 14 06:06:49 localhost ceph-mgr[302471]: [rbd_support INFO root] load_schedules: backups, start_after=
Oct 14 06:06:49 localhost ceph-mgr[302471]: client.0 error registering admin socket command: (17) File exists
Oct 14 06:06:49 localhost ceph-mgr[302471]: client.0 error registering admin socket command: (17) File exists
Oct 14 06:06:49 localhost ceph-mgr[302471]: client.0 error registering admin socket command: (17) File exists
Oct 14 06:06:49 localhost ceph-mgr[302471]: client.0 error registering admin socket command: (17) File exists
Oct 14 06:06:49 localhost ceph-mgr[302471]: client.0 error registering admin socket command: (17) File exists
Oct 14 06:06:49 localhost ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-mgr-np0005486733-primvu[302467]: 2025-10-14T10:06:49.055+0000 7f13fe141640 -1 client.0 error registering admin socket command: (17) File exists
Oct 14 06:06:49 localhost ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-mgr-np0005486733-primvu[302467]: 2025-10-14T10:06:49.055+0000 7f13fe141640 -1 client.0 error registering admin socket command: (17) File exists
Oct 14 06:06:49 localhost ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-mgr-np0005486733-primvu[302467]: 2025-10-14T10:06:49.055+0000 7f13fe141640 -1 client.0 error registering admin socket command: (17) File exists
Oct 14 06:06:49 localhost ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-mgr-np0005486733-primvu[302467]: 2025-10-14T10:06:49.055+0000 7f13fe141640 -1 client.0 error registering admin socket command: (17) File exists
Oct 14 06:06:49 localhost ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-mgr-np0005486733-primvu[302467]: 2025-10-14T10:06:49.055+0000 7f13fe141640 -1 client.0 error registering admin socket command: (17) File exists
Oct 14 06:06:49 localhost ceph-mgr[302471]: client.0 error registering admin socket command: (17) File exists
Oct 14 06:06:49 localhost ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-mgr-np0005486733-primvu[302467]: 2025-10-14T10:06:49.061+0000 7f1401147640 -1 client.0 error registering admin socket command: (17) File exists
Oct 14 06:06:49 localhost ceph-mgr[302471]: client.0 error registering admin socket command: (17) File exists
Oct 14 06:06:49 localhost ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-mgr-np0005486733-primvu[302467]: 2025-10-14T10:06:49.061+0000 7f1401147640 -1 client.0 error registering admin socket command: (17) File exists
Oct 14 06:06:49 localhost ceph-mgr[302471]: client.0 error registering admin socket command: (17) File exists
Oct 14 06:06:49 localhost ceph-mgr[302471]: client.0 error registering admin socket command: (17) File exists
Oct 14 06:06:49 localhost ceph-mgr[302471]: client.0 error registering admin socket command: (17) File exists
Oct 14 06:06:49 localhost ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-mgr-np0005486733-primvu[302467]: 2025-10-14T10:06:49.061+0000 7f1401147640 -1 client.0 error registering admin socket command: (17) File exists
Oct 14 06:06:49 localhost ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-mgr-np0005486733-primvu[302467]: 2025-10-14T10:06:49.061+0000 7f1401147640 -1 client.0 error registering admin socket command: (17) File exists
Oct 14 06:06:49 localhost ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-mgr-np0005486733-primvu[302467]: 2025-10-14T10:06:49.061+0000 7f1401147640 -1 client.0 error registering admin socket command: (17) File exists
Oct 14 06:06:49 localhost ceph-mgr[302471]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: starting
Oct 14 06:06:49 localhost ceph-mgr[302471]: [rbd_support INFO root] PerfHandler: starting
Oct 14 06:06:49 localhost ceph-mgr[302471]: [rbd_support INFO root] load_task_task: vms, start_after=
Oct 14 06:06:49 localhost ceph-mgr[302471]: [rbd_support INFO root] load_task_task: volumes, start_after=
Oct 14 06:06:49 localhost ceph-mgr[302471]: [rbd_support INFO root] load_task_task: images, start_after=
Oct 14 06:06:49 localhost ceph-mgr[302471]: [rbd_support INFO root] load_task_task: backups, start_after=
Oct 14 06:06:49 localhost ceph-mgr[302471]: [rbd_support INFO root] TaskHandler: starting
Oct 14 06:06:49 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005486733.primvu/trash_purge_schedule"} v 0)
Oct 14 06:06:49 localhost ceph-mon[317114]: log_channel(audit) log [INF] : from='mgr.17433 172.18.0.108:0/2728758967' entity='mgr.np0005486733.primvu' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005486733.primvu/trash_purge_schedule"} : dispatch
Oct 14 06:06:49 localhost ceph-mgr[302471]: [rbd_support INFO root] TrashPurgeScheduleHandler: load_schedules
Oct 14 06:06:49 localhost ceph-mgr[302471]: [rbd_support INFO root] load_schedules: vms, start_after=
Oct 14 06:06:49 localhost ceph-mgr[302471]: [rbd_support INFO root] load_schedules: volumes, start_after=
Oct 14 06:06:49 localhost ceph-mgr[302471]: [rbd_support INFO root] load_schedules: images, start_after=
Oct 14 06:06:49 localhost ceph-mgr[302471]: [rbd_support INFO root] load_schedules: backups, start_after=
Oct 14 06:06:49 localhost ceph-mgr[302471]: [rbd_support INFO root] TrashPurgeScheduleHandler: starting
Oct 14 06:06:49 localhost ceph-mgr[302471]: [rbd_support INFO root] setup complete
Oct 14 06:06:49 localhost sshd[318058]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 06:06:49 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e83 _set_new_cache_sizes cache_size:1019370049 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct 14 06:06:49 localhost ceph-mon[317114]: Reconfiguring mgr.np0005486731.swasqz (monmap changed)...
Oct 14 06:06:49 localhost ceph-mon[317114]: Reconfiguring daemon mgr.np0005486731.swasqz on np0005486731.localdomain
Oct 14 06:06:49 localhost ceph-mon[317114]: from='mgr.17415 172.18.0.107:0/230210271' entity='mgr.np0005486732.pasqzz'
Oct 14 06:06:49 localhost ceph-mon[317114]: from='mgr.17415 172.18.0.107:0/230210271' entity='mgr.np0005486732.pasqzz'
Oct 14 06:06:49 localhost ceph-mon[317114]: Reconfiguring crash.np0005486732 (monmap changed)...
Oct 14 06:06:49 localhost ceph-mon[317114]: from='mgr.17415 172.18.0.107:0/230210271' entity='mgr.np0005486732.pasqzz' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005486732.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Oct 14 06:06:49 localhost ceph-mon[317114]: Reconfiguring daemon crash.np0005486732 on np0005486732.localdomain
Oct 14 06:06:49 localhost ceph-mon[317114]: from='client.? ' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch
Oct 14 06:06:49 localhost ceph-mon[317114]: Activating manager daemon np0005486733.primvu
Oct 14 06:06:49 localhost ceph-mon[317114]: from='client.? 172.18.0.200:0/664587782' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch
Oct 14 06:06:49 localhost ceph-mon[317114]: from='client.? ' entity='client.admin' cmd='[{"prefix": "mgr fail"}]': finished
Oct 14 06:06:49 localhost ceph-mon[317114]: Manager daemon np0005486733.primvu is now available
Oct 14 06:06:49 localhost ceph-mon[317114]: from='mgr.17433 ' entity='mgr.np0005486733.primvu' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005486730.localdomain.devices.0"} : dispatch
Oct 14 06:06:49 localhost ceph-mon[317114]: from='mgr.17433 172.18.0.108:0/2728758967' entity='mgr.np0005486733.primvu' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005486730.localdomain.devices.0"} : dispatch
Oct 14 06:06:49 localhost ceph-mon[317114]: from='mgr.17433 ' entity='mgr.np0005486733.primvu' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/host.np0005486730.localdomain.devices.0"}]': finished
Oct 14 06:06:49 localhost ceph-mon[317114]: from='mgr.17433 ' entity='mgr.np0005486733.primvu' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005486730.localdomain.devices.0"} : dispatch
Oct 14 06:06:49 localhost ceph-mon[317114]: from='mgr.17433 172.18.0.108:0/2728758967' entity='mgr.np0005486733.primvu' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005486730.localdomain.devices.0"} : dispatch
Oct 14 06:06:49 localhost ceph-mon[317114]: from='mgr.17433 ' entity='mgr.np0005486733.primvu' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/host.np0005486730.localdomain.devices.0"}]': finished
Oct 14 06:06:49 localhost ceph-mon[317114]: from='mgr.17433 ' entity='mgr.np0005486733.primvu' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005486733.primvu/mirror_snapshot_schedule"} : dispatch
Oct 14 06:06:49 localhost ceph-mon[317114]: from='mgr.17433 172.18.0.108:0/2728758967' entity='mgr.np0005486733.primvu' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005486733.primvu/mirror_snapshot_schedule"} : dispatch
Oct 14 06:06:49 localhost ceph-mon[317114]: from='mgr.17433 ' entity='mgr.np0005486733.primvu' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005486733.primvu/trash_purge_schedule"} : dispatch
Oct 14 06:06:49 localhost ceph-mon[317114]: from='mgr.17433 172.18.0.108:0/2728758967' entity='mgr.np0005486733.primvu' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005486733.primvu/trash_purge_schedule"} : dispatch
Oct 14 06:06:49 localhost systemd-logind[760]: New session 72 of user ceph-admin.
Oct 14 06:06:49 localhost systemd[1]: Started Session 72 of User ceph-admin.
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.819 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'name': 'test', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005486733.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '41187b090f3d4818a32baa37ce8a3991', 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'hostId': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.820 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.839 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.read.requests volume: 1272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.840 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.844 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fd3fff07-b567-46f0-b9a1-2af36f4b50b0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1272, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T10:06:49.820468', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '80163ba8-a8e5-11f0-9707-fa163e99780b', 'monotonic_time': 12426.013170323, 'message_signature': '26cff2f30de7f7572b90237e20c713c5a66333f9366aa13607791d8544e73668'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T10:06:49.820468', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '80164c9c-a8e5-11f0-9707-fa163e99780b', 'monotonic_time': 12426.013170323, 'message_signature': '5a5f3a5199239ad1724a285c0e7b2383678a98e4a6c588f843b248c1ebbc5529'}]}, 'timestamp': '2025-10-14 10:06:49.843760', '_unique_id': '1474307d61e743cfb3da24d5ab7a1fb1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.844 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.844 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.844 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.844 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.844 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.844 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.844 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.844 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.844 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.844 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.844 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.844 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.844 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.844 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.844 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.844 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.844 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.844 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.844 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.844 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.844 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.844 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.844 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.844 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.844 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.844 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.844 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.844 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.844 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.844 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.844 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.844 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.844 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.844 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.844 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.844 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.844 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.844 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.844 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.844 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.844 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.844 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.844 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.844 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.844 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.844 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.844 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.844 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.844 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.844 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.844 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.844 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.844 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.844 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.847 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.850 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.851 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1cb7527c-fcfd-4801-bede-a310eeced23f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T10:06:49.847289', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': '8017c7ac-a8e5-11f0-9707-fa163e99780b', 'monotonic_time': 12426.040002315, 'message_signature': '543d5aae4c4e5300ca96ce5df453e04eddbe691d513eb4c919f78720d434b131'}]}, 'timestamp': '2025-10-14 10:06:49.850711', '_unique_id': 'ce868fa770104bf79e7002f252e720f7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.851 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.851 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.851 12 ERROR oslo_messaging.notify.messaging yield
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.851 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.851 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.851 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.851 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.851 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.851 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.851 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.851 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.851 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.851 12 ERROR oslo_messaging.notify.messaging conn.connect()
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.851 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.851 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.851 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.851 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.851 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.851 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.851 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.851 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.851 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.851 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.851 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.851 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.851 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.851 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.851 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.851 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.851 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.851 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.851 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.851 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.851 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.851 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.851 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.851 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.851 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.851 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.851 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.851 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.851 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.851 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.851 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.851 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.851 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.851 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.851 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.851 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.851 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.851 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.851 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.851 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.851 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.856 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.856 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.857 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b0901a92-1af0-4154-81a0-a43db76e7e56', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T10:06:49.856514', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': '8018bedc-a8e5-11f0-9707-fa163e99780b', 'monotonic_time': 12426.040002315, 'message_signature': '375ce797415495dacd260dd7b583646da7e0e0e36c689d41506587775e387121'}]}, 'timestamp': '2025-10-14 10:06:49.857003', '_unique_id': '6e3ef80ac3f7485fad943f78cfe47605'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.857 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.857 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.857 12 ERROR oslo_messaging.notify.messaging yield
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.857 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.857 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.857 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.857 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.857 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.857 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.857 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.857 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.857 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.857 12 ERROR oslo_messaging.notify.messaging conn.connect()
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.857 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.857 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.857 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.857 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.857 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.857 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.857 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.857 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.857 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.857 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.857 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.857 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.857 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.857 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.857 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.857 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.857 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.857 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.857 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.857 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.857 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.857 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.857 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.857 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.857 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.857 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.857 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.857 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.857 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.857 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.857 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.857 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.857 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.857 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.857 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.857 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.857 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.857 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.857 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.857 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.857 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.859 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.868 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.869 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.870 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e38055ec-3241-4937-a702-35ec59497d4f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T10:06:49.859276', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '801a9f0e-a8e5-11f0-9707-fa163e99780b', 'monotonic_time': 12426.051983152, 'message_signature': '671c3910d039e56f797bcff15291f0c4eb7adc9898ac0c8662964344b46455d6'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T10:06:49.859276', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '801aae0e-a8e5-11f0-9707-fa163e99780b', 'monotonic_time': 12426.051983152, 'message_signature': 'bd41baf72bf27a540fab6b9a6c977f918ed8559600c7d61ff304bf3af50e906b'}]}, 'timestamp': '2025-10-14 10:06:49.869617', '_unique_id': 'c8825db121d546438c44f7e2bf4d906e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.870 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.870 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.870 12 ERROR oslo_messaging.notify.messaging yield
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.870 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.870 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.870 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.870 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.870 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.870 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.870 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.870 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.870 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.870 12 ERROR oslo_messaging.notify.messaging conn.connect()
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.870 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.870 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.870 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.870 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.870 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.870 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.870 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.870 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.870 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.870 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.870 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.870 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.870 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.870 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.870 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.870 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.870 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.870 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.870 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.870 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.870 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.870 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.870 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.870 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 06:06:49 localhost
ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.870 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.870 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.870 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.870 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.870 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.870 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.870 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.870 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.870 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.870 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 
10:06:49.870 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.870 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.870 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.870 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.870 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.870 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.870 12 ERROR oslo_messaging.notify.messaging Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.871 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.871 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.875 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.outgoing.bytes volume: 10162 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.877 12 ERROR 
oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f778625f-7faf-45ec-afb6-39efcde40107', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 10162, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T10:06:49.875173', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': '801bb4d4-a8e5-11f0-9707-fa163e99780b', 'monotonic_time': 12426.040002315, 'message_signature': 'b57d9b2b09a43cfb9c55ba81bda7e9fd8b1a30e4c7e7734102b212f4c5f603c5'}]}, 'timestamp': '2025-10-14 10:06:49.876381', '_unique_id': 'a45f3ae969e948368746b6bfd47e97ee'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.877 12 ERROR oslo_messaging.notify.messaging Traceback 
(most recent call last): Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.877 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.877 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.877 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.877 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.877 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.877 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.877 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.877 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.877 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.877 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.877 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.877 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.877 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.877 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.877 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.877 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.877 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.877 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.877 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.877 12 ERROR oslo_messaging.notify.messaging Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.877 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.877 12 ERROR oslo_messaging.notify.messaging Oct 
14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.877 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.877 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.877 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.877 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.877 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.877 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.877 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.877 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.877 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.877 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 
605, in _get_connection Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.877 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.877 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.877 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.877 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.877 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.877 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.877 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.877 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.877 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.877 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in 
ensure_connection Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.877 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.877 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.877 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.877 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.877 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.877 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.877 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.877 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.877 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.877 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.877 12 ERROR oslo_messaging.notify.messaging Oct 14 
06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.878 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.878 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.879 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.880 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '897cf74f-f8c5-4a6d-8718-36b8c25cfec7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T10:06:49.878705', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '801c20f4-a8e5-11f0-9707-fa163e99780b', 'monotonic_time': 12426.051983152, 'message_signature': 'f4094398fefb58daf1a4c73d66577011a258d6299a3bdae951a155c3c25fe249'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T10:06:49.878705', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '801c3576-a8e5-11f0-9707-fa163e99780b', 'monotonic_time': 12426.051983152, 'message_signature': 'c62bc20ac31830d38ca0b8f05064ed5bfa2e0ec82a3ec40d7ce2f4dd249d284e'}]}, 'timestamp': '2025-10-14 10:06:49.879658', '_unique_id': '68b63bf4deea4fa481bfab97c366978f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.880 12 ERROR 
oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.880 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.880 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.880 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.880 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.880 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.880 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.880 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.880 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.880 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.880 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:06:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.880 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.880 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.880 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.880 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.880 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.880 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.880 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.880 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.880 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.880 12 ERROR oslo_messaging.notify.messaging Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.880 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 
2025-10-14 10:06:49.880 12 ERROR oslo_messaging.notify.messaging Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.880 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.880 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.880 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.880 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.880 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.880 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.880 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.880 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.880 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.880 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.880 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.880 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.880 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.880 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.880 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.880 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.880 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.880 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.880 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.880 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.880 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.880 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.880 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.880 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.880 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.880 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.880 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.880 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.880 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.880 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:06:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.880 12 ERROR oslo_messaging.notify.messaging Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.881 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.881 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.883 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1895bbd6-fcc2-472b-8ee4-688f9465085a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T10:06:49.881959', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 
'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': '801c9e8a-a8e5-11f0-9707-fa163e99780b', 'monotonic_time': 12426.040002315, 'message_signature': '116fe30c71c154f808cd5ee2b68251986f0fa25977421d7304c24bda0271c51d'}]}, 'timestamp': '2025-10-14 10:06:49.882349', '_unique_id': '43a2c7c60d8e49e79f99a0d6ceae86b6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.883 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.883 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.883 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.883 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.883 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.883 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.883 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.883 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", 
line 877, in _connection_factory Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.883 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.883 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.883 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.883 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.883 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.883 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.883 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.883 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.883 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.883 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.883 12 
ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.883 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.883 12 ERROR oslo_messaging.notify.messaging Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.883 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.883 12 ERROR oslo_messaging.notify.messaging Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.883 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.883 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.883 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.883 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.883 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.883 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.883 12 ERROR 
oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.883 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.883 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.883 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.883 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.883 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.883 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.883 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.883 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.883 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.883 12 ERROR 
oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.883 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.883 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.883 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.883 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.883 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.883 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.883 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.883 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.883 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.883 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:06:49 
localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.883 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.883 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.883 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.883 12 ERROR oslo_messaging.notify.messaging Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.884 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.884 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.read.latency volume: 1739521236 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.888 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.read.latency volume: 110324003 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.889 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'b4ddc112-b060-45f8-ac7a-c281135a5934', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1739521236, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T10:06:49.884264', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '801d5370-a8e5-11f0-9707-fa163e99780b', 'monotonic_time': 12426.013170323, 'message_signature': '34639a3ab1686da0cb53c91f2c6695f0f562977ab4d349022698d9e46aaf0b49'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 110324003, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T10:06:49.884264', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '801d8f52-a8e5-11f0-9707-fa163e99780b', 'monotonic_time': 12426.013170323, 'message_signature': 'c72ae0b477959e009a8fe587609a4e7ceadde5be1a4830d6d696f64018cad808'}]}, 'timestamp': '2025-10-14 10:06:49.888493', '_unique_id': '616107fddc8f499aa683316b03f7651b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.889 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.889 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.889 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.889 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.889 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.889 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.889 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.889 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.889 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.889 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.889 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.889 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.889 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.889 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.889 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.889 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 
06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.889 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.889 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.889 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.889 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.889 12 ERROR oslo_messaging.notify.messaging Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.889 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.889 12 ERROR oslo_messaging.notify.messaging Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.889 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.889 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.889 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.889 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:06:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.889 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.889 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.889 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.889 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.889 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.889 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.889 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.889 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.889 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.889 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.889 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.889 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.889 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.889 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.889 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.889 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.889 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.889 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.889 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.889 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:06:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.889 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.889 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.889 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.889 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.889 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.889 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.889 12 ERROR oslo_messaging.notify.messaging Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.891 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.891 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.891 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.read.bytes volume: 35560448 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:06:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.891 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.893 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f3e4f4d9-f5c0-4469-a3ff-5cdd8890f9b9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35560448, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T10:06:49.891390', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '801e0f4a-a8e5-11f0-9707-fa163e99780b', 'monotonic_time': 12426.013170323, 'message_signature': '49ee222eed836002ab59cc046b39a9e52775c9369bb4f44adb73aeb2458abbf3'}, {'source': 'openstack', 'counter_name': 
'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T10:06:49.891390', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '801e1de6-a8e5-11f0-9707-fa163e99780b', 'monotonic_time': 12426.013170323, 'message_signature': '629a914e59d4ebab012a22eb02f98a126698cc2f1092137c6ee250d797fb0be5'}]}, 'timestamp': '2025-10-14 10:06:49.892140', '_unique_id': '54d2fc3fa45d4bd082109f8c2d23f22f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.893 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.893 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:06:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.893 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.893 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.893 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.893 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.893 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.893 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.893 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.893 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.893 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.893 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.893 12 ERROR oslo_messaging.notify.messaging Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.893 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.893 12 ERROR oslo_messaging.notify.messaging Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.893 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:06:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.893 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.893 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.893 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.893 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.893 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.893 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.893 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.893 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.893 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.893 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.893 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.893 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.893 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.893 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.893 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.893 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.893 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.893 12 ERROR oslo_messaging.notify.messaging Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.894 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.894 12 DEBUG 
ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.895 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd4ce3adf-d735-4505-9d56-fe0484398f16', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T10:06:49.894405', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': '801e8510-a8e5-11f0-9707-fa163e99780b', 'monotonic_time': 12426.040002315, 'message_signature': 
'99de288a0a6b0d7150e5bd601e0655faa7a871db8f57aade5835a14ea8fb9fb4'}]}, 'timestamp': '2025-10-14 10:06:49.894842', '_unique_id': '9975cd75c9294c03a2dd7fc91d120a66'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.897 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.897 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.incoming.packets volume: 79 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.898 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications.
Payload={'message_id': 'ec5948e4-ae4e-4503-88ca-310e4ef19da3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 79, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T10:06:49.897187', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': '801ef1f8-a8e5-11f0-9707-fa163e99780b', 'monotonic_time': 12426.040002315, 'message_signature': '18247ef6dd6578cfc87edb6b9325c2aefbb3675e3150839be4b15526374f2203'}]}, 'timestamp': '2025-10-14 10:06:49.897604', '_unique_id': 'a554b5a35d0f43f691bfb8068bb3e069'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:06:49 localhost
ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.899 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.900 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.incoming.bytes volume: 8190 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.901 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '34293135-1270-4244-a53a-5064afb472a8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 8190, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T10:06:49.900006', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 
'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': '801f6052-a8e5-11f0-9707-fa163e99780b', 'monotonic_time': 12426.040002315, 'message_signature': 'f9c829dba08600e290df55a510c091cc1c8443e3fbbc97c447fab57ff81754d0'}]}, 'timestamp': '2025-10-14 10:06:49.900439', '_unique_id': 'd5b7499381b0474f84d8e95c462b1e1e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.901 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.901 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.901 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.901 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.901 12 ERROR 
oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.901 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.901 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.901 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.901 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.901 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 
2025-10-14 10:06:49.901 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.901 12 ERROR oslo_messaging.notify.messaging Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.901 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.901 12 ERROR oslo_messaging.notify.messaging Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.901 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.901 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.901 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.901 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:06:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.901 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.901 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.901 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.901 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.901 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 
06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.901 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.901 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.901 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.901 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.901 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.901 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.901 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.901 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.901 12 ERROR oslo_messaging.notify.messaging Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.902 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.902 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.write.latency volume: 1178650666 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.902 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.write.latency volume: 22746151 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.904 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '0ee19082-5651-49a4-917c-ec4cc5d0adff', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1178650666, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T10:06:49.902554', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '801fc376-a8e5-11f0-9707-fa163e99780b', 'monotonic_time': 12426.013170323, 'message_signature': '6ffc5f53ec98a77e2b849493118d9531a89dacf55136446e6d5bd436367ee3ff'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 22746151, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T10:06:49.902554', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '801fd0d2-a8e5-11f0-9707-fa163e99780b', 'monotonic_time': 12426.013170323, 'message_signature': 'e90a5afd68324a5dbb8ebc89418f3a29fd30878de1abef8f0ca2c08e9882a039'}]}, 'timestamp': '2025-10-14 10:06:49.903270', '_unique_id': '234537b4605348728f9b8e2d10880aa7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.904 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.904 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.904 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.904 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.904 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.904 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.904 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.904 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.904 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 
06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.904 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.904 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.904 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.904 12 ERROR oslo_messaging.notify.messaging Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.904 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.904 12 ERROR oslo_messaging.notify.messaging Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.904 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.904 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:06:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.904 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.904 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.904 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.904 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.904 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.904 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.904 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.904 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.904 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.904 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:06:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.904 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.904 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.904 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.904 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.904 12 ERROR oslo_messaging.notify.messaging Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.905 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.919 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/cpu volume: 13760000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.921 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '0fc80bb0-1ee0-417f-b785-094ca5fb68ab', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 13760000000, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'timestamp': '2025-10-14T10:06:49.905356', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '80226f2c-a8e5-11f0-9707-fa163e99780b', 'monotonic_time': 12426.112286839, 'message_signature': '123325cd9c19346eadd72e42b97d71a6ae4f838f034b091a07cc3b306a83c7e0'}]}, 'timestamp': '2025-10-14 10:06:49.920520', '_unique_id': 'f1ae010ff5624e59b69d7c12eec35d8e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.921 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in 
_reraise_as_library_errors Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.921 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.921 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.921 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.921 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.921 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.921 12 
ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.921 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.921 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.921 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.921 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.921 12 ERROR oslo_messaging.notify.messaging Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.921 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.921 12 ERROR oslo_messaging.notify.messaging Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.921 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 
2025-10-14 10:06:49.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.921 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.921 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.921 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.921 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.921 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 
06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.921 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.921 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.921 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.921 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.921 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:06:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.921 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.921 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.921 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.921 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.921 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.921 12 ERROR oslo_messaging.notify.messaging Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.922 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters Oct 14 06:06:49 
localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.923 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:06:49 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005486733.localdomain.devices.0}] v 0) Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.924 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1ab542db-1ab7-4abd-9560-b3a6d211d677', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T10:06:49.923095', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 
'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': '8022e678-a8e5-11f0-9707-fa163e99780b', 'monotonic_time': 12426.040002315, 'message_signature': 'df635befe9578e4924dd945303f906c20d6d89ffd0f4bc032e6a7b95ff8f11de'}]}, 'timestamp': '2025-10-14 10:06:49.923532', '_unique_id': 'd6cb6914fb9f4696ac6ea48785478fac'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.924 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.924 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.924 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.924 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 
10:06:49.924 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.924 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.924 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.924 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.924 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.924 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:06:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.924 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.924 12 ERROR oslo_messaging.notify.messaging Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.924 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.924 12 ERROR oslo_messaging.notify.messaging Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.924 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.924 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.924 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.924 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:06:49 
localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.924 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.924 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.924 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.924 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.924 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) 
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.924 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.924 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.924 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.924 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.924 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.924 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.924 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.924 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.924 12 ERROR oslo_messaging.notify.messaging Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.925 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.926 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.926 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.927 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '1ef97d36-fc64-42a6-9a65-c9cdd4c526a9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T10:06:49.926226', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': '8023612a-a8e5-11f0-9707-fa163e99780b', 'monotonic_time': 12426.040002315, 'message_signature': 'b63b669787632e5f223deea2f18c1f9b283d3a5c71fcc53873073afd76b00009'}]}, 'timestamp': '2025-10-14 10:06:49.926776', '_unique_id': '72a92b5286754286a4bbe7eb61f4d78c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.927 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:06:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.927 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.927 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.927 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.927 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.927 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.927 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.927 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.927 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.927 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.927 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.927 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.927 12 ERROR oslo_messaging.notify.messaging Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.927 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.927 12 ERROR oslo_messaging.notify.messaging Oct 14 06:06:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.927 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.927 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.927 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.927 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.927 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.927 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.927 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.927 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.927 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.927 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:06:49 
localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.927 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.927 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.927 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.927 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.927 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.927 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.927 12 ERROR oslo_messaging.notify.messaging Oct 14 06:06:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.929 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.930 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.write.requests volume: 53 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.930 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.931 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e7c642f0-9b02-486a-bf64-047f24a3434c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 53, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T10:06:49.930121', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '8023f806-a8e5-11f0-9707-fa163e99780b', 'monotonic_time': 12426.013170323, 'message_signature': '3cb48252276e56231d4fb6ad4b2b95fad2281b615315810a993debdd283b7803'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T10:06:49.930121', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '80240602-a8e5-11f0-9707-fa163e99780b', 'monotonic_time': 12426.013170323, 'message_signature': 'fe70ee4bf13ac0d19a6bd4456079cc8a25f37638038877a395533d9152cb3611'}]}, 'timestamp': '2025-10-14 10:06:49.930850', '_unique_id': '16228aeae82f44e9a22af0998393dbe1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.931 12 
ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.931 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.931 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.931 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.931 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.931 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:06:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.931 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.931 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.931 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.931 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.931 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.931 12 ERROR oslo_messaging.notify.messaging Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.931 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 
2025-10-14 10:06:49.931 12 ERROR oslo_messaging.notify.messaging Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.931 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.931 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.931 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.931 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.931 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.931 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.931 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.931 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.931 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.931 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.931 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.931 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.931 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.931 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.931 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.931 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.931 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.931 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:06:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.931 12 ERROR oslo_messaging.notify.messaging Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.932 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.932 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.write.bytes volume: 454656 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.933 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.934 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '567db7ba-8197-45d1-973f-f85f354b4030', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 454656, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T10:06:49.932803', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '802460b6-a8e5-11f0-9707-fa163e99780b', 'monotonic_time': 12426.013170323, 'message_signature': '18e6293662c2b4ba9beb3a8b1c2884f69430ef3104a499ed3c41c6b719d0b68f'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T10:06:49.932803', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '80246e12-a8e5-11f0-9707-fa163e99780b', 'monotonic_time': 12426.013170323, 'message_signature': '5fcbcfb3b5ebf4e3bc9fae11128f660090797f92cdd94aa3534fcd531e9b1550'}]}, 'timestamp': '2025-10-14 10:06:49.933556', '_unique_id': '68dd472c576d4bc0abf6aea83fc9c38e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.934 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.934 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.934 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.934 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.934 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.934 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.934 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.934 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.934 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.934 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.934 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.934 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.934 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.934 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.934 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.934 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 
06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.934 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.934 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.934 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.934 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.934 12 ERROR oslo_messaging.notify.messaging Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.934 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.934 12 ERROR oslo_messaging.notify.messaging Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.934 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.934 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.934 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.934 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:06:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.934 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.934 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.934 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.934 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.934 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.934 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.934 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.934 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.934 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.934 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.934 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.934 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.934 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.934 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.934 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.934 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.934 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.934 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.934 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.934 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:06:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.934 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.934 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.934 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.934 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.934 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.934 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.934 12 ERROR oslo_messaging.notify.messaging Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.935 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.935 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/memory.usage volume: 51.81640625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.936 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '9751a464-f032-42b1-bea7-9d70bfbc956a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.81640625, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'timestamp': '2025-10-14T10:06:49.935439', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '8024c736-a8e5-11f0-9707-fa163e99780b', 'monotonic_time': 12426.112286839, 'message_signature': 'f3ddf6faa28a1ec6a56f64246f15a569b5134e98d0867a7ca51c7b2d675bc183'}]}, 'timestamp': '2025-10-14 10:06:49.935821', '_unique_id': '4e774f2c52a64ce8867b372ee6dff46e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.936 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.936 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in 
_reraise_as_library_errors Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.936 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.936 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.936 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.936 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.936 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.936 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.936 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.936 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.936 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.936 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.936 12 
ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.936 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.936 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.936 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.936 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.936 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.936 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.936 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.936 12 ERROR oslo_messaging.notify.messaging Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.936 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.936 12 ERROR oslo_messaging.notify.messaging Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.936 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 
2025-10-14 10:06:49.936 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.936 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.936 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.936 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.936 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.936 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.936 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.936 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.936 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.936 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 
06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.936 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.936 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.936 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.936 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.936 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.936 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.936 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.936 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.936 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.936 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:06:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.936 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.936 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.936 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.936 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.936 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.936 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.936 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.936 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.936 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.936 12 ERROR oslo_messaging.notify.messaging Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.938 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters Oct 14 06:06:49 
localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.938 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.outgoing.packets volume: 118 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.939 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '047182df-180b-42b2-9245-e3946cd31380', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 118, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T10:06:49.938316', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': '80253626-a8e5-11f0-9707-fa163e99780b', 'monotonic_time': 12426.040002315, 
'message_signature': 'f38258501a902d29d7be619e05e6173858a1fa3a9b513b0605ed88a4676bf7f3'}]}, 'timestamp': '2025-10-14 10:06:49.938657', '_unique_id': 'd9a27469cbf3400f973692aff5fa0ac8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.939 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.939 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.939 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.939 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.939 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.939 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.939 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.939 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.939 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.939 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.939 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.939 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.939 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.939 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.939 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.939 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.939 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.939 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.939 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.939 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:06:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.939 12 ERROR oslo_messaging.notify.messaging Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.939 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.939 12 ERROR oslo_messaging.notify.messaging Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.939 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.939 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.939 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.939 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.939 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.939 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.939 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.939 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", 
line 653, in _send Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.939 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.939 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.939 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.939 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.939 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.939 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.939 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.939 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.939 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.939 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.939 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.939 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.939 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.939 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.939 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.939 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.939 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.939 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.939 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.939 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:06:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.939 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.939 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.939 12 ERROR oslo_messaging.notify.messaging Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.940 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.940 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.940 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.941 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '3acb047b-f640-4f6e-8e0d-bc5c9ce7a3b8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T10:06:49.940160', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '80257d70-a8e5-11f0-9707-fa163e99780b', 'monotonic_time': 12426.051983152, 'message_signature': '6655723e62d743d1268aed63622636e7f4d7e044f6cfc74973bafa36f1e54f1a'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T10:06:49.940160', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 
'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '80258810-a8e5-11f0-9707-fa163e99780b', 'monotonic_time': 12426.051983152, 'message_signature': '8a4aa2b887868f57521287d977f7c1941c7e3f9998c1d6220adc399354157ad0'}]}, 'timestamp': '2025-10-14 10:06:49.940744', '_unique_id': 'f23bb27dab5840b3b7a818845d40e520'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.941 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.941 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.941 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.941 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.941 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.941 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.941 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.941 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.941 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.941 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.941 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.941 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.941 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.941 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.941 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.941 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 06:06:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.941 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.941 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.941 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.941 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.941 12 ERROR oslo_messaging.notify.messaging Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.941 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.941 12 ERROR oslo_messaging.notify.messaging Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.941 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.941 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.941 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.941 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 
10:06:49.941 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.941 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.941 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.941 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.941 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.941 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.941 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.941 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.941 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.941 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 06:06:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.941 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.941 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.941 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.941 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.941 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.941 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.941 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.941 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.941 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.941 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 
10:06:49.941 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.941 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.941 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.941 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.941 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.941 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.941 12 ERROR oslo_messaging.notify.messaging Oct 14 06:06:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:06:49.942 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Oct 14 06:06:49 localhost ceph-mgr[302471]: log_channel(cluster) log [DBG] : pgmap v3: 177 pgs: 177 active+clean; 104 MiB data, 587 MiB used, 41 GiB / 42 GiB avail Oct 14 06:06:49 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005486733.localdomain}] v 0) Oct 14 06:06:50 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005486732.localdomain.devices.0}] v 0) Oct 14 06:06:50 localhost ceph-mon[317114]: 
mon.np0005486733@2(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005486731.localdomain.devices.0}] v 0) Oct 14 06:06:50 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005486732.localdomain}] v 0) Oct 14 06:06:50 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005486731.localdomain}] v 0) Oct 14 06:06:50 localhost nova_compute[297686]: 2025-10-14 10:06:50.259 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:06:50 localhost ceph-mon[317114]: removing stray HostCache host record np0005486730.localdomain.devices.0 Oct 14 06:06:50 localhost ceph-mon[317114]: from='mgr.17433 ' entity='mgr.np0005486733.primvu' Oct 14 06:06:50 localhost ceph-mon[317114]: from='mgr.17433 ' entity='mgr.np0005486733.primvu' Oct 14 06:06:50 localhost ceph-mon[317114]: from='mgr.17433 ' entity='mgr.np0005486733.primvu' Oct 14 06:06:50 localhost ceph-mon[317114]: from='mgr.17433 ' entity='mgr.np0005486733.primvu' Oct 14 06:06:50 localhost ceph-mon[317114]: from='mgr.17433 ' entity='mgr.np0005486733.primvu' Oct 14 06:06:50 localhost ceph-mon[317114]: from='mgr.17433 ' entity='mgr.np0005486733.primvu' Oct 14 06:06:50 localhost ceph-mgr[302471]: log_channel(cluster) log [DBG] : pgmap v4: 177 pgs: 177 active+clean; 104 MiB data, 587 MiB used, 41 GiB / 42 GiB avail Oct 14 06:06:50 localhost systemd[1]: tmp-crun.TaDSMm.mount: Deactivated successfully. 
Oct 14 06:06:50 localhost podman[318228]: 2025-10-14 10:06:50.992562583 +0000 UTC m=+0.115387036 container exec b19a91314e045cbe5465f6abdeb6b420108fcda7860b498b01dcd4e65236d2c8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-crash-np0005486733, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=553, ceph=True, io.openshift.expose-services=, CEPH_POINT_RELEASE=, architecture=x86_64, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, version=7, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.33.12, maintainer=Guillaume Abrioux , GIT_CLEAN=True, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., RELEASE=main, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, distribution-scope=public) Oct 14 06:06:51 localhost podman[318228]: 2025-10-14 10:06:51.102288144 +0000 UTC m=+0.225112607 container exec_died b19a91314e045cbe5465f6abdeb6b420108fcda7860b498b01dcd4e65236d2c8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-crash-np0005486733, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, version=7, 
ceph=True, distribution-scope=public, com.redhat.component=rhceph-container, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, build-date=2025-09-24T08:57:55, io.buildah.version=1.33.12, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, release=553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, maintainer=Guillaume Abrioux , RELEASE=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, name=rhceph, GIT_BRANCH=main, vendor=Red Hat, Inc., GIT_CLEAN=True, io.openshift.tags=rhceph ceph, io.openshift.expose-services=) Oct 14 06:06:51 localhost ceph-mgr[302471]: [devicehealth INFO root] Check health Oct 14 06:06:51 localhost ceph-mon[317114]: Health check cleared: CEPHADM_STRAY_DAEMON (was: 2 stray daemon(s) not managed by cephadm) Oct 14 06:06:51 localhost ceph-mon[317114]: Health check cleared: CEPHADM_STRAY_HOST (was: 2 stray host(s) with 2 daemon(s) not managed by cephadm) Oct 14 06:06:51 localhost ceph-mon[317114]: Cluster is now healthy Oct 14 06:06:51 localhost ceph-mgr[302471]: [cephadm INFO cherrypy.error] [14/Oct/2025:10:06:51] ENGINE Bus STARTING Oct 14 06:06:51 localhost ceph-mgr[302471]: log_channel(cephadm) log [INF] : [14/Oct/2025:10:06:51] ENGINE Bus STARTING Oct 14 06:06:51 localhost ceph-mgr[302471]: [cephadm INFO cherrypy.error] [14/Oct/2025:10:06:51] ENGINE Serving on http://172.18.0.108:8765 Oct 14 06:06:51 localhost ceph-mgr[302471]: log_channel(cephadm) log [INF] : [14/Oct/2025:10:06:51] ENGINE Serving on http://172.18.0.108:8765 Oct 14 06:06:51 localhost ceph-mgr[302471]: [cephadm INFO cherrypy.error] [14/Oct/2025:10:06:51] ENGINE Serving on https://172.18.0.108:7150 Oct 14 06:06:51 localhost ceph-mgr[302471]: log_channel(cephadm) log [INF] : [14/Oct/2025:10:06:51] ENGINE Serving on https://172.18.0.108:7150 Oct 14 06:06:51 localhost ceph-mgr[302471]: [cephadm INFO cherrypy.error] [14/Oct/2025:10:06:51] ENGINE Bus STARTED Oct 14 
06:06:51 localhost ceph-mgr[302471]: log_channel(cephadm) log [INF] : [14/Oct/2025:10:06:51] ENGINE Bus STARTED Oct 14 06:06:51 localhost ceph-mgr[302471]: [cephadm INFO cherrypy.error] [14/Oct/2025:10:06:51] ENGINE Client ('172.18.0.108', 54724) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)') Oct 14 06:06:51 localhost ceph-mgr[302471]: log_channel(cephadm) log [INF] : [14/Oct/2025:10:06:51] ENGINE Client ('172.18.0.108', 54724) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)') Oct 14 06:06:51 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005486732.localdomain.devices.0}] v 0) Oct 14 06:06:51 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005486733.localdomain.devices.0}] v 0) Oct 14 06:06:51 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005486732.localdomain}] v 0) Oct 14 06:06:51 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005486733.localdomain}] v 0) Oct 14 06:06:51 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005486731.localdomain.devices.0}] v 0) Oct 14 06:06:51 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005486731.localdomain}] v 0) Oct 14 06:06:52 localhost ceph-mon[317114]: [14/Oct/2025:10:06:51] ENGINE Bus STARTING Oct 14 06:06:52 localhost ceph-mon[317114]: [14/Oct/2025:10:06:51] ENGINE Serving on http://172.18.0.108:8765 Oct 14 06:06:52 localhost ceph-mon[317114]: from='mgr.17433 ' 
entity='mgr.np0005486733.primvu' Oct 14 06:06:52 localhost ceph-mon[317114]: from='mgr.17433 ' entity='mgr.np0005486733.primvu' Oct 14 06:06:52 localhost ceph-mon[317114]: from='mgr.17433 ' entity='mgr.np0005486733.primvu' Oct 14 06:06:52 localhost ceph-mon[317114]: from='mgr.17433 ' entity='mgr.np0005486733.primvu' Oct 14 06:06:52 localhost ceph-mon[317114]: from='mgr.17433 ' entity='mgr.np0005486733.primvu' Oct 14 06:06:52 localhost ceph-mon[317114]: from='mgr.17433 ' entity='mgr.np0005486733.primvu' Oct 14 06:06:52 localhost ceph-mgr[302471]: log_channel(cluster) log [DBG] : pgmap v5: 177 pgs: 177 active+clean; 104 MiB data, 587 MiB used, 41 GiB / 42 GiB avail Oct 14 06:06:53 localhost nova_compute[297686]: 2025-10-14 10:06:53.018 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:06:53 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005486731.localdomain.devices.0}] v 0) Oct 14 06:06:53 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005486731.localdomain}] v 0) Oct 14 06:06:53 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005486732.localdomain.devices.0}] v 0) Oct 14 06:06:53 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} v 0) Oct 14 06:06:53 localhost ceph-mon[317114]: log_channel(audit) log [INF] : from='mgr.17433 172.18.0.108:0/2728758967' entity='mgr.np0005486733.primvu' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch Oct 14 06:06:53 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command([{prefix=config-key set, 
key=mgr/cephadm/host.np0005486732.localdomain}] v 0) Oct 14 06:06:53 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} v 0) Oct 14 06:06:53 localhost ceph-mon[317114]: log_channel(audit) log [INF] : from='mgr.17433 172.18.0.108:0/2728758967' entity='mgr.np0005486733.primvu' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch Oct 14 06:06:53 localhost ceph-mgr[302471]: [cephadm INFO root] Adjusting osd_memory_target on np0005486731.localdomain to 836.6M Oct 14 06:06:53 localhost ceph-mgr[302471]: log_channel(cephadm) log [INF] : Adjusting osd_memory_target on np0005486731.localdomain to 836.6M Oct 14 06:06:53 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} v 0) Oct 14 06:06:53 localhost ceph-mon[317114]: log_channel(audit) log [INF] : from='mgr.17433 172.18.0.108:0/2728758967' entity='mgr.np0005486733.primvu' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch Oct 14 06:06:53 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command([{prefix=config set, name=osd_memory_target}] v 0) Oct 14 06:06:53 localhost ceph-mgr[302471]: [cephadm WARNING cephadm.serve] Unable to set osd_memory_target on np0005486731.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096 Oct 14 06:06:53 localhost ceph-mgr[302471]: log_channel(cephadm) log [WRN] : Unable to set osd_memory_target on np0005486731.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096 Oct 14 06:06:53 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} v 0) Oct 14 06:06:53 localhost ceph-mon[317114]: log_channel(audit) log [INF] : from='mgr.17433 
172.18.0.108:0/2728758967' entity='mgr.np0005486733.primvu' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch Oct 14 06:06:53 localhost ceph-mgr[302471]: [cephadm INFO root] Adjusting osd_memory_target on np0005486732.localdomain to 836.6M Oct 14 06:06:53 localhost ceph-mgr[302471]: log_channel(cephadm) log [INF] : Adjusting osd_memory_target on np0005486732.localdomain to 836.6M Oct 14 06:06:53 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command([{prefix=config set, name=osd_memory_target}] v 0) Oct 14 06:06:53 localhost ceph-mgr[302471]: [cephadm WARNING cephadm.serve] Unable to set osd_memory_target on np0005486732.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096 Oct 14 06:06:53 localhost ceph-mgr[302471]: log_channel(cephadm) log [WRN] : Unable to set osd_memory_target on np0005486732.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096 Oct 14 06:06:53 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005486733.localdomain.devices.0}] v 0) Oct 14 06:06:53 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005486733.localdomain}] v 0) Oct 14 06:06:53 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} v 0) Oct 14 06:06:53 localhost ceph-mon[317114]: log_channel(audit) log [INF] : from='mgr.17433 172.18.0.108:0/2728758967' entity='mgr.np0005486733.primvu' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch Oct 14 06:06:53 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} v 0) Oct 14 06:06:53 localhost ceph-mon[317114]: 
log_channel(audit) log [INF] : from='mgr.17433 172.18.0.108:0/2728758967' entity='mgr.np0005486733.primvu' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch Oct 14 06:06:53 localhost ceph-mgr[302471]: [cephadm INFO root] Adjusting osd_memory_target on np0005486733.localdomain to 836.6M Oct 14 06:06:53 localhost ceph-mgr[302471]: log_channel(cephadm) log [INF] : Adjusting osd_memory_target on np0005486733.localdomain to 836.6M Oct 14 06:06:53 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command([{prefix=config set, name=osd_memory_target}] v 0) Oct 14 06:06:53 localhost ceph-mgr[302471]: [cephadm WARNING cephadm.serve] Unable to set osd_memory_target on np0005486733.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096 Oct 14 06:06:53 localhost ceph-mgr[302471]: log_channel(cephadm) log [WRN] : Unable to set osd_memory_target on np0005486733.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096 Oct 14 06:06:53 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) Oct 14 06:06:53 localhost ceph-mon[317114]: log_channel(audit) log [DBG] : from='mgr.17433 172.18.0.108:0/2728758967' entity='mgr.np0005486733.primvu' cmd={"prefix": "config generate-minimal-conf"} : dispatch Oct 14 06:06:53 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0) Oct 14 06:06:53 localhost ceph-mon[317114]: log_channel(audit) log [INF] : from='mgr.17433 172.18.0.108:0/2728758967' entity='mgr.np0005486733.primvu' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Oct 14 06:06:53 localhost ceph-mgr[302471]: [cephadm INFO cephadm.serve] Updating np0005486731.localdomain:/etc/ceph/ceph.conf Oct 14 06:06:53 localhost ceph-mgr[302471]: log_channel(cephadm) log [INF] : Updating 
np0005486731.localdomain:/etc/ceph/ceph.conf Oct 14 06:06:53 localhost ceph-mgr[302471]: [cephadm INFO cephadm.serve] Updating np0005486732.localdomain:/etc/ceph/ceph.conf Oct 14 06:06:53 localhost ceph-mgr[302471]: [cephadm INFO cephadm.serve] Updating np0005486733.localdomain:/etc/ceph/ceph.conf Oct 14 06:06:53 localhost ceph-mgr[302471]: log_channel(cephadm) log [INF] : Updating np0005486732.localdomain:/etc/ceph/ceph.conf Oct 14 06:06:53 localhost ceph-mgr[302471]: log_channel(cephadm) log [INF] : Updating np0005486733.localdomain:/etc/ceph/ceph.conf Oct 14 06:06:53 localhost ceph-mon[317114]: [14/Oct/2025:10:06:51] ENGINE Serving on https://172.18.0.108:7150 Oct 14 06:06:53 localhost ceph-mon[317114]: [14/Oct/2025:10:06:51] ENGINE Bus STARTED Oct 14 06:06:53 localhost ceph-mon[317114]: [14/Oct/2025:10:06:51] ENGINE Client ('172.18.0.108', 54724) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)') Oct 14 06:06:53 localhost ceph-mon[317114]: from='mgr.17433 ' entity='mgr.np0005486733.primvu' Oct 14 06:06:53 localhost ceph-mon[317114]: from='mgr.17433 ' entity='mgr.np0005486733.primvu' Oct 14 06:06:53 localhost ceph-mon[317114]: from='mgr.17433 172.18.0.108:0/2728758967' entity='mgr.np0005486733.primvu' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch Oct 14 06:06:53 localhost ceph-mon[317114]: from='mgr.17433 ' entity='mgr.np0005486733.primvu' Oct 14 06:06:53 localhost ceph-mon[317114]: from='mgr.17433 ' entity='mgr.np0005486733.primvu' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch Oct 14 06:06:53 localhost ceph-mon[317114]: from='mgr.17433 ' entity='mgr.np0005486733.primvu' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch Oct 14 06:06:53 localhost ceph-mon[317114]: from='mgr.17433 172.18.0.108:0/2728758967' entity='mgr.np0005486733.primvu' cmd={"prefix": "config rm", "who": 
"osd.4", "name": "osd_memory_target"} : dispatch Oct 14 06:06:53 localhost ceph-mon[317114]: from='mgr.17433 ' entity='mgr.np0005486733.primvu' Oct 14 06:06:53 localhost ceph-mon[317114]: from='mgr.17433 ' entity='mgr.np0005486733.primvu' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch Oct 14 06:06:53 localhost ceph-mon[317114]: from='mgr.17433 172.18.0.108:0/2728758967' entity='mgr.np0005486733.primvu' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch Oct 14 06:06:53 localhost ceph-mon[317114]: from='mgr.17433 ' entity='mgr.np0005486733.primvu' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch Oct 14 06:06:53 localhost ceph-mon[317114]: from='mgr.17433 172.18.0.108:0/2728758967' entity='mgr.np0005486733.primvu' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch Oct 14 06:06:53 localhost ceph-mon[317114]: from='mgr.17433 ' entity='mgr.np0005486733.primvu' Oct 14 06:06:53 localhost ceph-mon[317114]: from='mgr.17433 ' entity='mgr.np0005486733.primvu' Oct 14 06:06:53 localhost ceph-mon[317114]: from='mgr.17433 ' entity='mgr.np0005486733.primvu' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch Oct 14 06:06:53 localhost ceph-mon[317114]: from='mgr.17433 172.18.0.108:0/2728758967' entity='mgr.np0005486733.primvu' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch Oct 14 06:06:53 localhost ceph-mon[317114]: from='mgr.17433 ' entity='mgr.np0005486733.primvu' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch Oct 14 06:06:53 localhost ceph-mon[317114]: from='mgr.17433 172.18.0.108:0/2728758967' entity='mgr.np0005486733.primvu' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch Oct 14 06:06:53 localhost ceph-mon[317114]: from='mgr.17433 172.18.0.108:0/2728758967' entity='mgr.np0005486733.primvu' 
cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Oct 14 06:06:53 localhost ceph-mgr[302471]: mgr.server handle_open ignoring open from mgr.np0005486732.pasqzz 172.18.0.107:0/3115874782; not ready for session (expect reconnect) Oct 14 06:06:54 localhost ceph-mgr[302471]: [cephadm INFO cephadm.serve] Updating np0005486731.localdomain:/var/lib/ceph/fcadf6e2-9176-5818-a8d0-37b19acf8eaf/config/ceph.conf Oct 14 06:06:54 localhost ceph-mgr[302471]: log_channel(cephadm) log [INF] : Updating np0005486731.localdomain:/var/lib/ceph/fcadf6e2-9176-5818-a8d0-37b19acf8eaf/config/ceph.conf Oct 14 06:06:54 localhost ceph-mgr[302471]: [cephadm INFO cephadm.serve] Updating np0005486732.localdomain:/var/lib/ceph/fcadf6e2-9176-5818-a8d0-37b19acf8eaf/config/ceph.conf Oct 14 06:06:54 localhost ceph-mgr[302471]: log_channel(cephadm) log [INF] : Updating np0005486732.localdomain:/var/lib/ceph/fcadf6e2-9176-5818-a8d0-37b19acf8eaf/config/ceph.conf Oct 14 06:06:54 localhost ceph-mgr[302471]: [cephadm INFO cephadm.serve] Updating np0005486733.localdomain:/var/lib/ceph/fcadf6e2-9176-5818-a8d0-37b19acf8eaf/config/ceph.conf Oct 14 06:06:54 localhost ceph-mgr[302471]: log_channel(cephadm) log [INF] : Updating np0005486733.localdomain:/var/lib/ceph/fcadf6e2-9176-5818-a8d0-37b19acf8eaf/config/ceph.conf Oct 14 06:06:54 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e83 _set_new_cache_sizes cache_size:1020038593 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Oct 14 06:06:54 localhost ceph-mon[317114]: Adjusting osd_memory_target on np0005486731.localdomain to 836.6M Oct 14 06:06:54 localhost ceph-mon[317114]: Unable to set osd_memory_target on np0005486731.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096 Oct 14 06:06:54 localhost ceph-mon[317114]: Adjusting osd_memory_target on np0005486732.localdomain to 836.6M Oct 14 06:06:54 localhost ceph-mon[317114]: Unable to set osd_memory_target on np0005486732.localdomain to 
877246668: error parsing value: Value '877246668' is below minimum 939524096 Oct 14 06:06:54 localhost ceph-mon[317114]: Adjusting osd_memory_target on np0005486733.localdomain to 836.6M Oct 14 06:06:54 localhost ceph-mon[317114]: Unable to set osd_memory_target on np0005486733.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096 Oct 14 06:06:54 localhost ceph-mon[317114]: Updating np0005486731.localdomain:/etc/ceph/ceph.conf Oct 14 06:06:54 localhost ceph-mon[317114]: Updating np0005486732.localdomain:/etc/ceph/ceph.conf Oct 14 06:06:54 localhost ceph-mon[317114]: Updating np0005486733.localdomain:/etc/ceph/ceph.conf Oct 14 06:06:54 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix": "mgr metadata", "who": "np0005486732.pasqzz", "id": "np0005486732.pasqzz"} v 0) Oct 14 06:06:54 localhost ceph-mon[317114]: log_channel(audit) log [DBG] : from='mgr.17433 172.18.0.108:0/2728758967' entity='mgr.np0005486733.primvu' cmd={"prefix": "mgr metadata", "who": "np0005486732.pasqzz", "id": "np0005486732.pasqzz"} : dispatch Oct 14 06:06:54 localhost ceph-mgr[302471]: [cephadm INFO cephadm.serve] Updating np0005486731.localdomain:/etc/ceph/ceph.client.admin.keyring Oct 14 06:06:54 localhost ceph-mgr[302471]: log_channel(cephadm) log [INF] : Updating np0005486731.localdomain:/etc/ceph/ceph.client.admin.keyring Oct 14 06:06:54 localhost ceph-mgr[302471]: [cephadm INFO cephadm.serve] Updating np0005486732.localdomain:/etc/ceph/ceph.client.admin.keyring Oct 14 06:06:54 localhost ceph-mgr[302471]: log_channel(cephadm) log [INF] : Updating np0005486732.localdomain:/etc/ceph/ceph.client.admin.keyring Oct 14 06:06:54 localhost ceph-mgr[302471]: [cephadm INFO cephadm.serve] Updating np0005486733.localdomain:/etc/ceph/ceph.client.admin.keyring Oct 14 06:06:54 localhost ceph-mgr[302471]: log_channel(cephadm) log [INF] : Updating np0005486733.localdomain:/etc/ceph/ceph.client.admin.keyring Oct 14 06:06:54 
localhost ceph-mgr[302471]: log_channel(cluster) log [DBG] : pgmap v6: 177 pgs: 177 active+clean; 104 MiB data, 587 MiB used, 41 GiB / 42 GiB avail Oct 14 06:06:55 localhost nova_compute[297686]: 2025-10-14 10:06:55.302 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:06:55 localhost ceph-mgr[302471]: [cephadm INFO cephadm.serve] Updating np0005486731.localdomain:/var/lib/ceph/fcadf6e2-9176-5818-a8d0-37b19acf8eaf/config/ceph.client.admin.keyring Oct 14 06:06:55 localhost ceph-mgr[302471]: log_channel(cephadm) log [INF] : Updating np0005486731.localdomain:/var/lib/ceph/fcadf6e2-9176-5818-a8d0-37b19acf8eaf/config/ceph.client.admin.keyring Oct 14 06:06:55 localhost ceph-mon[317114]: Updating np0005486731.localdomain:/var/lib/ceph/fcadf6e2-9176-5818-a8d0-37b19acf8eaf/config/ceph.conf Oct 14 06:06:55 localhost ceph-mon[317114]: Updating np0005486732.localdomain:/var/lib/ceph/fcadf6e2-9176-5818-a8d0-37b19acf8eaf/config/ceph.conf Oct 14 06:06:55 localhost ceph-mon[317114]: Updating np0005486733.localdomain:/var/lib/ceph/fcadf6e2-9176-5818-a8d0-37b19acf8eaf/config/ceph.conf Oct 14 06:06:55 localhost ceph-mgr[302471]: [cephadm INFO cephadm.serve] Updating np0005486732.localdomain:/var/lib/ceph/fcadf6e2-9176-5818-a8d0-37b19acf8eaf/config/ceph.client.admin.keyring Oct 14 06:06:55 localhost ceph-mgr[302471]: log_channel(cephadm) log [INF] : Updating np0005486732.localdomain:/var/lib/ceph/fcadf6e2-9176-5818-a8d0-37b19acf8eaf/config/ceph.client.admin.keyring Oct 14 06:06:55 localhost ceph-mgr[302471]: [cephadm INFO cephadm.serve] Updating np0005486733.localdomain:/var/lib/ceph/fcadf6e2-9176-5818-a8d0-37b19acf8eaf/config/ceph.client.admin.keyring Oct 14 06:06:55 localhost ceph-mgr[302471]: log_channel(cephadm) log [INF] : Updating np0005486733.localdomain:/var/lib/ceph/fcadf6e2-9176-5818-a8d0-37b19acf8eaf/config/ceph.client.admin.keyring Oct 14 06:06:56 localhost ceph-mon[317114]: 
mon.np0005486733@2(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005486731.localdomain.devices.0}] v 0) Oct 14 06:06:56 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005486731.localdomain}] v 0) Oct 14 06:06:56 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005486732.localdomain.devices.0}] v 0) Oct 14 06:06:56 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005486732.localdomain}] v 0) Oct 14 06:06:56 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005486733.localdomain.devices.0}] v 0) Oct 14 06:06:56 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005486733.localdomain}] v 0) Oct 14 06:06:56 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0) Oct 14 06:06:56 localhost ceph-mgr[302471]: log_channel(cluster) log [DBG] : pgmap v7: 177 pgs: 177 active+clean; 104 MiB data, 587 MiB used, 41 GiB / 42 GiB avail; 33 KiB/s rd, 0 B/s wr, 18 op/s Oct 14 06:06:56 localhost ceph-mgr[302471]: [progress INFO root] update: starting ev d0ddbc9d-99cf-4afe-8ecb-a907c5a0077a (Updating node-proxy deployment (+3 -> 3)) Oct 14 06:06:56 localhost ceph-mgr[302471]: [progress INFO root] complete: finished ev d0ddbc9d-99cf-4afe-8ecb-a907c5a0077a (Updating node-proxy deployment (+3 -> 3)) Oct 14 06:06:56 localhost ceph-mgr[302471]: [progress INFO root] Completed event d0ddbc9d-99cf-4afe-8ecb-a907c5a0077a (Updating node-proxy deployment (+3 -> 3)) in 0 seconds Oct 14 06:06:56 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 
handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0) Oct 14 06:06:56 localhost ceph-mon[317114]: log_channel(audit) log [DBG] : from='mgr.17433 172.18.0.108:0/2728758967' entity='mgr.np0005486733.primvu' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch Oct 14 06:06:56 localhost ceph-mon[317114]: Updating np0005486731.localdomain:/etc/ceph/ceph.client.admin.keyring Oct 14 06:06:56 localhost ceph-mon[317114]: Updating np0005486732.localdomain:/etc/ceph/ceph.client.admin.keyring Oct 14 06:06:56 localhost ceph-mon[317114]: Updating np0005486733.localdomain:/etc/ceph/ceph.client.admin.keyring Oct 14 06:06:56 localhost ceph-mon[317114]: Updating np0005486731.localdomain:/var/lib/ceph/fcadf6e2-9176-5818-a8d0-37b19acf8eaf/config/ceph.client.admin.keyring Oct 14 06:06:56 localhost ceph-mon[317114]: from='mgr.17433 ' entity='mgr.np0005486733.primvu' Oct 14 06:06:56 localhost ceph-mon[317114]: from='mgr.17433 ' entity='mgr.np0005486733.primvu' Oct 14 06:06:56 localhost ceph-mon[317114]: from='mgr.17433 ' entity='mgr.np0005486733.primvu' Oct 14 06:06:56 localhost ceph-mon[317114]: from='mgr.17433 ' entity='mgr.np0005486733.primvu' Oct 14 06:06:56 localhost ceph-mon[317114]: from='mgr.17433 ' entity='mgr.np0005486733.primvu' Oct 14 06:06:56 localhost ceph-mon[317114]: from='mgr.17433 ' entity='mgr.np0005486733.primvu' Oct 14 06:06:56 localhost ceph-mon[317114]: from='mgr.17433 ' entity='mgr.np0005486733.primvu' Oct 14 06:06:56 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix": "auth get", "entity": "osd.2"} v 0) Oct 14 06:06:56 localhost ceph-mon[317114]: log_channel(audit) log [INF] : from='mgr.17433 172.18.0.108:0/2728758967' entity='mgr.np0005486733.primvu' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch Oct 14 06:06:56 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 
0)
Oct 14 06:06:56 localhost ceph-mon[317114]: log_channel(audit) log [DBG] : from='mgr.17433 172.18.0.108:0/2728758967' entity='mgr.np0005486733.primvu' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Oct 14 06:06:56 localhost ceph-mgr[302471]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.2 on np0005486731.localdomain
Oct 14 06:06:56 localhost ceph-mgr[302471]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.2 on np0005486731.localdomain
Oct 14 06:06:57 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005486731.localdomain.devices.0}] v 0)
Oct 14 06:06:57 localhost ceph-mon[317114]: Updating np0005486732.localdomain:/var/lib/ceph/fcadf6e2-9176-5818-a8d0-37b19acf8eaf/config/ceph.client.admin.keyring
Oct 14 06:06:57 localhost ceph-mon[317114]: Updating np0005486733.localdomain:/var/lib/ceph/fcadf6e2-9176-5818-a8d0-37b19acf8eaf/config/ceph.client.admin.keyring
Oct 14 06:06:57 localhost ceph-mon[317114]: from='mgr.17433 172.18.0.108:0/2728758967' entity='mgr.np0005486733.primvu' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch
Oct 14 06:06:57 localhost ceph-mon[317114]: Health check failed: 3 stray daemon(s) not managed by cephadm (CEPHADM_STRAY_DAEMON)
Oct 14 06:06:57 localhost ceph-mon[317114]: Health check failed: 3 stray host(s) with 3 daemon(s) not managed by cephadm (CEPHADM_STRAY_HOST)
Oct 14 06:06:57 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005486731.localdomain}] v 0)
Oct 14 06:06:57 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005486731.localdomain.devices.0}] v 0)
Oct 14 06:06:57 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005486731.localdomain}] v 0)
Oct 14 06:06:57 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix": "auth get", "entity": "osd.4"} v 0)
Oct 14 06:06:57 localhost ceph-mon[317114]: log_channel(audit) log [INF] : from='mgr.17433 172.18.0.108:0/2728758967' entity='mgr.np0005486733.primvu' cmd={"prefix": "auth get", "entity": "osd.4"} : dispatch
Oct 14 06:06:57 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Oct 14 06:06:57 localhost ceph-mon[317114]: log_channel(audit) log [DBG] : from='mgr.17433 172.18.0.108:0/2728758967' entity='mgr.np0005486733.primvu' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Oct 14 06:06:57 localhost ceph-mgr[302471]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.4 on np0005486731.localdomain
Oct 14 06:06:57 localhost ceph-mgr[302471]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.4 on np0005486731.localdomain
Oct 14 06:06:57 localhost ovn_metadata_agent[163050]: 2025-10-14 10:06:57.775 163055 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct 14 06:06:57 localhost ovn_metadata_agent[163050]: 2025-10-14 10:06:57.776 163055 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct 14 06:06:57 localhost ovn_metadata_agent[163050]: 2025-10-14 10:06:57.777 163055 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct 14 06:06:58 localhost nova_compute[297686]: 2025-10-14 10:06:58.055 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 14 06:06:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f.
Oct 14 06:06:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917.
Oct 14 06:06:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d.
Oct 14 06:06:58 localhost podman[319164]: 2025-10-14 10:06:58.184004284 +0000 UTC m=+0.088026827 container health_status 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251009)
Oct 14 06:06:58 localhost podman[319164]: 2025-10-14 10:06:58.225283169 +0000 UTC m=+0.129305772 container exec_died 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d)
Oct 14 06:06:58 localhost podman[319163]: 2025-10-14 10:06:58.233572313 +0000 UTC m=+0.142176396 container health_status 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, version=9.6, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, release=1755695350, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, name=ubi9-minimal)
Oct 14 06:06:58 localhost systemd[1]: 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d.service: Deactivated successfully.
Oct 14 06:06:58 localhost podman[319163]: 2025-10-14 10:06:58.249117789 +0000 UTC m=+0.157721882 container exec_died 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, name=ubi9-minimal, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, com.redhat.component=ubi9-minimal-container, distribution-scope=public, build-date=2025-08-20T13:12:41, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, architecture=x86_64)
Oct 14 06:06:58 localhost systemd[1]: 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917.service: Deactivated successfully.
Oct 14 06:06:58 localhost ceph-mgr[302471]: log_channel(cluster) log [DBG] : pgmap v8: 177 pgs: 177 active+clean; 104 MiB data, 587 MiB used, 41 GiB / 42 GiB avail; 25 KiB/s rd, 0 B/s wr, 13 op/s
Oct 14 06:06:58 localhost podman[248187]: time="2025-10-14T10:06:58Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 14 06:06:58 localhost podman[319162]: 2025-10-14 10:06:58.339962442 +0000 UTC m=+0.248076941 container health_status 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Oct 14 06:06:58 localhost podman[248187]: @ - - [14/Oct/2025:10:06:58 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 147497 "" "Go-http-client/1.1"
Oct 14 06:06:58 localhost podman[319162]: 2025-10-14 10:06:58.452062416 +0000 UTC m=+0.360176925 container exec_died 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5)
Oct 14 06:06:58 localhost systemd[1]: 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f.service: Deactivated successfully.
Oct 14 06:06:58 localhost podman[248187]: @ - - [14/Oct/2025:10:06:58 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19840 "" "Go-http-client/1.1"
Oct 14 06:06:58 localhost ceph-mon[317114]: Reconfiguring daemon osd.2 on np0005486731.localdomain
Oct 14 06:06:58 localhost ceph-mon[317114]: from='mgr.17433 ' entity='mgr.np0005486733.primvu'
Oct 14 06:06:58 localhost ceph-mon[317114]: from='mgr.17433 ' entity='mgr.np0005486733.primvu'
Oct 14 06:06:58 localhost ceph-mon[317114]: from='mgr.17433 ' entity='mgr.np0005486733.primvu'
Oct 14 06:06:58 localhost ceph-mon[317114]: from='mgr.17433 ' entity='mgr.np0005486733.primvu'
Oct 14 06:06:58 localhost ceph-mon[317114]: from='mgr.17433 172.18.0.108:0/2728758967' entity='mgr.np0005486733.primvu' cmd={"prefix": "auth get", "entity": "osd.4"} : dispatch
Oct 14 06:06:58 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005486731.localdomain.devices.0}] v 0)
Oct 14 06:06:58 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005486731.localdomain}] v 0)
Oct 14 06:06:58 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005486731.localdomain.devices.0}] v 0)
Oct 14 06:06:58 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005486731.localdomain}] v 0)
Oct 14 06:06:58 localhost ceph-mgr[302471]: [cephadm INFO cephadm.serve] Reconfiguring crash.np0005486732 (monmap changed)...
Oct 14 06:06:58 localhost ceph-mgr[302471]: log_channel(cephadm) log [INF] : Reconfiguring crash.np0005486732 (monmap changed)...
Oct 14 06:06:58 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.crash.np0005486732.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} v 0)
Oct 14 06:06:58 localhost ceph-mon[317114]: log_channel(audit) log [INF] : from='mgr.17433 172.18.0.108:0/2728758967' entity='mgr.np0005486733.primvu' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005486732.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Oct 14 06:06:58 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Oct 14 06:06:58 localhost ceph-mon[317114]: log_channel(audit) log [DBG] : from='mgr.17433 172.18.0.108:0/2728758967' entity='mgr.np0005486733.primvu' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Oct 14 06:06:58 localhost ceph-mgr[302471]: [cephadm INFO cephadm.serve] Reconfiguring daemon crash.np0005486732 on np0005486732.localdomain
Oct 14 06:06:58 localhost ceph-mgr[302471]: log_channel(cephadm) log [INF] : Reconfiguring daemon crash.np0005486732 on np0005486732.localdomain
Oct 14 06:06:58 localhost ceph-mgr[302471]: [progress INFO root] Writing back 50 completed events
Oct 14 06:06:58 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0)
Oct 14 06:06:59 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e83 _set_new_cache_sizes cache_size:1020054329 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct 14 06:06:59 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005486732.localdomain.devices.0}] v 0)
Oct 14 06:06:59 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005486732.localdomain}] v 0)
Oct 14 06:06:59 localhost ceph-mon[317114]: Reconfiguring daemon osd.4 on np0005486731.localdomain
Oct 14 06:06:59 localhost ceph-mon[317114]: from='mgr.17433 ' entity='mgr.np0005486733.primvu'
Oct 14 06:06:59 localhost ceph-mon[317114]: from='mgr.17433 ' entity='mgr.np0005486733.primvu'
Oct 14 06:06:59 localhost ceph-mon[317114]: from='mgr.17433 ' entity='mgr.np0005486733.primvu'
Oct 14 06:06:59 localhost ceph-mon[317114]: from='mgr.17433 ' entity='mgr.np0005486733.primvu'
Oct 14 06:06:59 localhost ceph-mon[317114]: from='mgr.17433 ' entity='mgr.np0005486733.primvu' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005486732.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Oct 14 06:06:59 localhost ceph-mon[317114]: from='mgr.17433 172.18.0.108:0/2728758967' entity='mgr.np0005486733.primvu' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005486732.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Oct 14 06:06:59 localhost ceph-mon[317114]: from='mgr.17433 ' entity='mgr.np0005486733.primvu'
Oct 14 06:06:59 localhost ceph-mon[317114]: from='mgr.17433 ' entity='mgr.np0005486733.primvu'
Oct 14 06:06:59 localhost ceph-mgr[302471]: [cephadm INFO cephadm.serve] Reconfiguring osd.1 (monmap changed)...
Oct 14 06:06:59 localhost ceph-mgr[302471]: log_channel(cephadm) log [INF] : Reconfiguring osd.1 (monmap changed)...
Oct 14 06:06:59 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix": "auth get", "entity": "osd.1"} v 0)
Oct 14 06:06:59 localhost ceph-mon[317114]: log_channel(audit) log [INF] : from='mgr.17433 172.18.0.108:0/2728758967' entity='mgr.np0005486733.primvu' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch
Oct 14 06:06:59 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Oct 14 06:06:59 localhost ceph-mon[317114]: log_channel(audit) log [DBG] : from='mgr.17433 172.18.0.108:0/2728758967' entity='mgr.np0005486733.primvu' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Oct 14 06:06:59 localhost ceph-mgr[302471]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.1 on np0005486732.localdomain
Oct 14 06:06:59 localhost ceph-mgr[302471]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.1 on np0005486732.localdomain
Oct 14 06:07:00 localhost ceph-mgr[302471]: log_channel(cluster) log [DBG] : pgmap v9: 177 pgs: 177 active+clean; 104 MiB data, 587 MiB used, 41 GiB / 42 GiB avail; 20 KiB/s rd, 0 B/s wr, 11 op/s
Oct 14 06:07:00 localhost nova_compute[297686]: 2025-10-14 10:07:00.337 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 14 06:07:00 localhost ceph-mgr[302471]: log_channel(audit) log [DBG] : from='client.64151 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Oct 14 06:07:00 localhost ceph-mon[317114]: Reconfiguring crash.np0005486732 (monmap changed)...
Oct 14 06:07:00 localhost ceph-mon[317114]: Reconfiguring daemon crash.np0005486732 on np0005486732.localdomain
Oct 14 06:07:00 localhost ceph-mon[317114]: from='mgr.17433 ' entity='mgr.np0005486733.primvu'
Oct 14 06:07:00 localhost ceph-mon[317114]: Reconfiguring osd.1 (monmap changed)...
Oct 14 06:07:00 localhost ceph-mon[317114]: from='mgr.17433 172.18.0.108:0/2728758967' entity='mgr.np0005486733.primvu' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch
Oct 14 06:07:00 localhost ceph-mon[317114]: Reconfiguring daemon osd.1 on np0005486732.localdomain
Oct 14 06:07:00 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005486732.localdomain.devices.0}] v 0)
Oct 14 06:07:00 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005486732.localdomain}] v 0)
Oct 14 06:07:00 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005486732.localdomain.devices.0}] v 0)
Oct 14 06:07:00 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005486732.localdomain}] v 0)
Oct 14 06:07:00 localhost ceph-mgr[302471]: [cephadm INFO cephadm.serve] Reconfiguring osd.5 (monmap changed)...
Oct 14 06:07:00 localhost ceph-mgr[302471]: log_channel(cephadm) log [INF] : Reconfiguring osd.5 (monmap changed)...
Oct 14 06:07:00 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix": "auth get", "entity": "osd.5"} v 0)
Oct 14 06:07:00 localhost ceph-mon[317114]: log_channel(audit) log [INF] : from='mgr.17433 172.18.0.108:0/2728758967' entity='mgr.np0005486733.primvu' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch
Oct 14 06:07:00 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Oct 14 06:07:00 localhost ceph-mon[317114]: log_channel(audit) log [DBG] : from='mgr.17433 172.18.0.108:0/2728758967' entity='mgr.np0005486733.primvu' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Oct 14 06:07:00 localhost ceph-mgr[302471]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.5 on np0005486732.localdomain
Oct 14 06:07:00 localhost ceph-mgr[302471]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.5 on np0005486732.localdomain
Oct 14 06:07:01 localhost ceph-mon[317114]: from='mgr.17433 ' entity='mgr.np0005486733.primvu'
Oct 14 06:07:01 localhost ceph-mon[317114]: from='mgr.17433 ' entity='mgr.np0005486733.primvu'
Oct 14 06:07:01 localhost ceph-mon[317114]: from='mgr.17433 ' entity='mgr.np0005486733.primvu'
Oct 14 06:07:01 localhost ceph-mon[317114]: from='mgr.17433 ' entity='mgr.np0005486733.primvu'
Oct 14 06:07:01 localhost ceph-mon[317114]: Reconfiguring osd.5 (monmap changed)...
Oct 14 06:07:01 localhost ceph-mon[317114]: from='mgr.17433 172.18.0.108:0/2728758967' entity='mgr.np0005486733.primvu' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch
Oct 14 06:07:01 localhost ceph-mon[317114]: Reconfiguring daemon osd.5 on np0005486732.localdomain
Oct 14 06:07:01 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005486732.localdomain.devices.0}] v 0)
Oct 14 06:07:01 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005486732.localdomain}] v 0)
Oct 14 06:07:01 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005486732.localdomain.devices.0}] v 0)
Oct 14 06:07:01 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005486732.localdomain}] v 0)
Oct 14 06:07:01 localhost ceph-mgr[302471]: [cephadm INFO cephadm.serve] Reconfiguring mds.mds.np0005486732.xkownj (monmap changed)...
Oct 14 06:07:01 localhost ceph-mgr[302471]: log_channel(cephadm) log [INF] : Reconfiguring mds.mds.np0005486732.xkownj (monmap changed)...
Oct 14 06:07:01 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mds.mds.np0005486732.xkownj", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} v 0)
Oct 14 06:07:01 localhost ceph-mon[317114]: log_channel(audit) log [INF] : from='mgr.17433 172.18.0.108:0/2728758967' entity='mgr.np0005486733.primvu' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005486732.xkownj", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Oct 14 06:07:01 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Oct 14 06:07:01 localhost ceph-mon[317114]: log_channel(audit) log [DBG] : from='mgr.17433 172.18.0.108:0/2728758967' entity='mgr.np0005486733.primvu' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Oct 14 06:07:01 localhost ceph-mgr[302471]: [cephadm INFO cephadm.serve] Reconfiguring daemon mds.mds.np0005486732.xkownj on np0005486732.localdomain
Oct 14 06:07:01 localhost ceph-mgr[302471]: log_channel(cephadm) log [INF] : Reconfiguring daemon mds.mds.np0005486732.xkownj on np0005486732.localdomain
Oct 14 06:07:02 localhost ceph-mgr[302471]: log_channel(cluster) log [DBG] : pgmap v10: 177 pgs: 177 active+clean; 104 MiB data, 587 MiB used, 41 GiB / 42 GiB avail; 18 KiB/s rd, 0 B/s wr, 10 op/s
Oct 14 06:07:02 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005486732.localdomain.devices.0}] v 0)
Oct 14 06:07:02 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005486732.localdomain}] v 0)
Oct 14 06:07:02 localhost ceph-mgr[302471]: [cephadm INFO cephadm.serve] Reconfiguring mgr.np0005486732.pasqzz (monmap changed)...
Oct 14 06:07:02 localhost ceph-mgr[302471]: log_channel(cephadm) log [INF] : Reconfiguring mgr.np0005486732.pasqzz (monmap changed)...
Oct 14 06:07:02 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mgr.np0005486732.pasqzz", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} v 0)
Oct 14 06:07:02 localhost ceph-mon[317114]: log_channel(audit) log [INF] : from='mgr.17433 172.18.0.108:0/2728758967' entity='mgr.np0005486733.primvu' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005486732.pasqzz", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Oct 14 06:07:02 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix": "mgr services"} v 0)
Oct 14 06:07:02 localhost ceph-mon[317114]: log_channel(audit) log [DBG] : from='mgr.17433 172.18.0.108:0/2728758967' entity='mgr.np0005486733.primvu' cmd={"prefix": "mgr services"} : dispatch
Oct 14 06:07:02 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Oct 14 06:07:02 localhost ceph-mon[317114]: log_channel(audit) log [DBG] : from='mgr.17433 172.18.0.108:0/2728758967' entity='mgr.np0005486733.primvu' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Oct 14 06:07:02 localhost ceph-mgr[302471]: [cephadm INFO cephadm.serve] Reconfiguring daemon mgr.np0005486732.pasqzz on np0005486732.localdomain
Oct 14 06:07:02 localhost ceph-mgr[302471]: log_channel(cephadm) log [INF] : Reconfiguring daemon mgr.np0005486732.pasqzz on np0005486732.localdomain
Oct 14 06:07:02 localhost ceph-mon[317114]: from='mgr.17433 ' entity='mgr.np0005486733.primvu'
Oct 14 06:07:02 localhost ceph-mon[317114]: from='mgr.17433 ' entity='mgr.np0005486733.primvu'
Oct 14 06:07:02 localhost ceph-mon[317114]: from='mgr.17433 ' entity='mgr.np0005486733.primvu'
Oct 14 06:07:02 localhost ceph-mon[317114]: from='mgr.17433 ' entity='mgr.np0005486733.primvu'
Oct 14 06:07:02 localhost ceph-mon[317114]: from='mgr.17433 ' entity='mgr.np0005486733.primvu' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005486732.xkownj", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Oct 14 06:07:02 localhost ceph-mon[317114]: Reconfiguring mds.mds.np0005486732.xkownj (monmap changed)...
Oct 14 06:07:02 localhost ceph-mon[317114]: from='mgr.17433 172.18.0.108:0/2728758967' entity='mgr.np0005486733.primvu' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005486732.xkownj", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Oct 14 06:07:02 localhost ceph-mon[317114]: Reconfiguring daemon mds.mds.np0005486732.xkownj on np0005486732.localdomain
Oct 14 06:07:02 localhost ceph-mon[317114]: from='mgr.17433 ' entity='mgr.np0005486733.primvu'
Oct 14 06:07:02 localhost ceph-mon[317114]: from='mgr.17433 ' entity='mgr.np0005486733.primvu'
Oct 14 06:07:02 localhost ceph-mon[317114]: from='mgr.17433 ' entity='mgr.np0005486733.primvu' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005486732.pasqzz", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Oct 14 06:07:02 localhost ceph-mon[317114]: from='mgr.17433 172.18.0.108:0/2728758967' entity='mgr.np0005486733.primvu' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005486732.pasqzz", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Oct 14 06:07:02 localhost ceph-mgr[302471]: log_channel(audit) log [DBG] : from='client.64154 -' entity='client.admin' cmd=[{"prefix": "orch apply", "target": ["mon-mgr", ""]}]: dispatch
Oct 14 06:07:02 localhost ceph-mgr[302471]: [cephadm INFO root] Saving service mon spec with placement label:mon
Oct 14 06:07:02 localhost ceph-mgr[302471]: log_channel(cephadm) log [INF] : Saving service mon spec with placement label:mon
Oct 14 06:07:02 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/spec.mon}] v 0)
Oct 14 06:07:03 localhost nova_compute[297686]: 2025-10-14 10:07:03.082 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 14 06:07:03 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005486732.localdomain.devices.0}] v 0)
Oct 14 06:07:03 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005486732.localdomain}] v 0)
Oct 14 06:07:03 localhost ceph-mgr[302471]: [cephadm INFO cephadm.serve] Reconfiguring mon.np0005486732 (monmap changed)...
Oct 14 06:07:03 localhost ceph-mgr[302471]: log_channel(cephadm) log [INF] : Reconfiguring mon.np0005486732 (monmap changed)...
Oct 14 06:07:03 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix": "auth get", "entity": "mon."} v 0)
Oct 14 06:07:03 localhost ceph-mon[317114]: log_channel(audit) log [INF] : from='mgr.17433 172.18.0.108:0/2728758967' entity='mgr.np0005486733.primvu' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Oct 14 06:07:03 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix": "config get", "who": "mon", "key": "public_network"} v 0)
Oct 14 06:07:03 localhost ceph-mon[317114]: log_channel(audit) log [DBG] : from='mgr.17433 172.18.0.108:0/2728758967' entity='mgr.np0005486733.primvu' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Oct 14 06:07:03 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Oct 14 06:07:03 localhost ceph-mon[317114]: log_channel(audit) log [DBG] : from='mgr.17433 172.18.0.108:0/2728758967' entity='mgr.np0005486733.primvu' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Oct 14 06:07:03 localhost ceph-mgr[302471]: [cephadm INFO cephadm.serve] Reconfiguring daemon mon.np0005486732 on np0005486732.localdomain
Oct 14 06:07:03 localhost ceph-mgr[302471]: log_channel(cephadm) log [INF] : Reconfiguring daemon mon.np0005486732 on np0005486732.localdomain
Oct 14 06:07:03 localhost ceph-mon[317114]: Reconfiguring mgr.np0005486732.pasqzz (monmap changed)...
Oct 14 06:07:03 localhost ceph-mon[317114]: Reconfiguring daemon mgr.np0005486732.pasqzz on np0005486732.localdomain
Oct 14 06:07:03 localhost ceph-mon[317114]: Saving service mon spec with placement label:mon
Oct 14 06:07:03 localhost ceph-mon[317114]: from='mgr.17433 ' entity='mgr.np0005486733.primvu'
Oct 14 06:07:03 localhost ceph-mon[317114]: from='mgr.17433 ' entity='mgr.np0005486733.primvu'
Oct 14 06:07:03 localhost ceph-mon[317114]: from='mgr.17433 ' entity='mgr.np0005486733.primvu'
Oct 14 06:07:03 localhost ceph-mon[317114]: Reconfiguring mon.np0005486732 (monmap changed)...
Oct 14 06:07:03 localhost ceph-mon[317114]: from='mgr.17433 172.18.0.108:0/2728758967' entity='mgr.np0005486733.primvu' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Oct 14 06:07:03 localhost ceph-mon[317114]: Reconfiguring daemon mon.np0005486732 on np0005486732.localdomain Oct 14 06:07:04 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e83 _set_new_cache_sizes cache_size:1020054721 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Oct 14 06:07:04 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005486732.localdomain.devices.0}] v 0) Oct 14 06:07:04 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005486732.localdomain}] v 0) Oct 14 06:07:04 localhost ceph-mgr[302471]: log_channel(cluster) log [DBG] : pgmap v11: 177 pgs: 177 active+clean; 104 MiB data, 587 MiB used, 41 GiB / 42 GiB avail; 18 KiB/s rd, 0 B/s wr, 10 op/s Oct 14 06:07:04 localhost ceph-mgr[302471]: [cephadm INFO cephadm.serve] Reconfiguring crash.np0005486733 (monmap changed)... Oct 14 06:07:04 localhost ceph-mgr[302471]: log_channel(cephadm) log [INF] : Reconfiguring crash.np0005486733 (monmap changed)... 
Oct 14 06:07:04 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.crash.np0005486733.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} v 0) Oct 14 06:07:04 localhost ceph-mon[317114]: log_channel(audit) log [INF] : from='mgr.17433 172.18.0.108:0/2728758967' entity='mgr.np0005486733.primvu' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005486733.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Oct 14 06:07:04 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) Oct 14 06:07:04 localhost ceph-mon[317114]: log_channel(audit) log [DBG] : from='mgr.17433 172.18.0.108:0/2728758967' entity='mgr.np0005486733.primvu' cmd={"prefix": "config generate-minimal-conf"} : dispatch Oct 14 06:07:04 localhost ceph-mgr[302471]: [cephadm INFO cephadm.serve] Reconfiguring daemon crash.np0005486733 on np0005486733.localdomain Oct 14 06:07:04 localhost ceph-mgr[302471]: log_channel(cephadm) log [INF] : Reconfiguring daemon crash.np0005486733 on np0005486733.localdomain Oct 14 06:07:04 localhost ceph-mgr[302471]: log_channel(audit) log [DBG] : from='client.64160 -' entity='client.admin' cmd=[{"prefix": "orch ps", "daemon_type": "mon", "daemon_id": "np0005486733", "target": ["mon-mgr", ""], "format": "json"}]: dispatch Oct 14 06:07:04 localhost podman[319278]: Oct 14 06:07:04 localhost podman[319278]: 2025-10-14 10:07:04.950740527 +0000 UTC m=+0.086359477 container create 2f371dc3bdbae011aa4b17ca9dd02a250a5a838d0ab79704f2135648059b39d9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=trusting_meitner, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, summary=Provides the 
latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, io.buildah.version=1.33.12, version=7, GIT_BRANCH=main, io.openshift.expose-services=, build-date=2025-09-24T08:57:55, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, RELEASE=main, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, maintainer=Guillaume Abrioux , vcs-type=git, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, distribution-scope=public) Oct 14 06:07:05 localhost systemd[1]: Started libpod-conmon-2f371dc3bdbae011aa4b17ca9dd02a250a5a838d0ab79704f2135648059b39d9.scope. Oct 14 06:07:05 localhost podman[319278]: 2025-10-14 10:07:04.911775023 +0000 UTC m=+0.047394013 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Oct 14 06:07:05 localhost systemd[1]: Started libcrun container. 
Oct 14 06:07:05 localhost podman[319278]: 2025-10-14 10:07:05.034242434 +0000 UTC m=+0.169861384 container init 2f371dc3bdbae011aa4b17ca9dd02a250a5a838d0ab79704f2135648059b39d9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=trusting_meitner, io.buildah.version=1.33.12, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, version=7, maintainer=Guillaume Abrioux , CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, RELEASE=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.component=rhceph-container, build-date=2025-09-24T08:57:55, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, release=553, name=rhceph, ceph=True, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git) Oct 14 06:07:05 localhost podman[319278]: 2025-10-14 10:07:05.045157168 +0000 UTC m=+0.180776108 container start 2f371dc3bdbae011aa4b17ca9dd02a250a5a838d0ab79704f2135648059b39d9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=trusting_meitner, description=Red Hat Ceph Storage 7, vcs-type=git, version=7, release=553, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, RELEASE=main, maintainer=Guillaume Abrioux , CEPH_POINT_RELEASE=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, GIT_BRANCH=main, 
io.buildah.version=1.33.12, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.component=rhceph-container, io.openshift.expose-services=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, build-date=2025-09-24T08:57:55, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, GIT_CLEAN=True) Oct 14 06:07:05 localhost podman[319278]: 2025-10-14 10:07:05.045407826 +0000 UTC m=+0.181026786 container attach 2f371dc3bdbae011aa4b17ca9dd02a250a5a838d0ab79704f2135648059b39d9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=trusting_meitner, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, build-date=2025-09-24T08:57:55, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, io.openshift.expose-services=, io.buildah.version=1.33.12, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, RELEASE=main, version=7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, name=rhceph, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , release=553, architecture=x86_64, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, ceph=True) Oct 14 06:07:05 localhost trusting_meitner[319293]: 167 167 Oct 14 06:07:05 localhost systemd[1]: 
libpod-2f371dc3bdbae011aa4b17ca9dd02a250a5a838d0ab79704f2135648059b39d9.scope: Deactivated successfully. Oct 14 06:07:05 localhost podman[319278]: 2025-10-14 10:07:05.050073559 +0000 UTC m=+0.185692549 container died 2f371dc3bdbae011aa4b17ca9dd02a250a5a838d0ab79704f2135648059b39d9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=trusting_meitner, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, version=7, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, vcs-type=git, ceph=True, description=Red Hat Ceph Storage 7, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_BRANCH=main, architecture=x86_64, io.buildah.version=1.33.12, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux , build-date=2025-09-24T08:57:55, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_CLEAN=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, release=553) Oct 14 06:07:05 localhost podman[319298]: 2025-10-14 10:07:05.141248052 +0000 UTC m=+0.080394234 container remove 2f371dc3bdbae011aa4b17ca9dd02a250a5a838d0ab79704f2135648059b39d9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=trusting_meitner, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhceph ceph, vcs-type=git, io.openshift.expose-services=, RELEASE=main, maintainer=Guillaume Abrioux , build-date=2025-09-24T08:57:55, release=553, io.k8s.description=Red Hat 
Ceph Storage 7, CEPH_POINT_RELEASE=, ceph=True, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, architecture=x86_64, io.buildah.version=1.33.12, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., name=rhceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, version=7) Oct 14 06:07:05 localhost systemd[1]: libpod-conmon-2f371dc3bdbae011aa4b17ca9dd02a250a5a838d0ab79704f2135648059b39d9.scope: Deactivated successfully. Oct 14 06:07:05 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005486733.localdomain.devices.0}] v 0) Oct 14 06:07:05 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005486733.localdomain}] v 0) Oct 14 06:07:05 localhost ceph-mgr[302471]: [cephadm INFO cephadm.serve] Reconfiguring osd.0 (monmap changed)... Oct 14 06:07:05 localhost ceph-mgr[302471]: log_channel(cephadm) log [INF] : Reconfiguring osd.0 (monmap changed)... 
Oct 14 06:07:05 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix": "auth get", "entity": "osd.0"} v 0) Oct 14 06:07:05 localhost ceph-mon[317114]: log_channel(audit) log [INF] : from='mgr.17433 172.18.0.108:0/2728758967' entity='mgr.np0005486733.primvu' cmd={"prefix": "auth get", "entity": "osd.0"} : dispatch Oct 14 06:07:05 localhost ceph-mon[317114]: from='mgr.17433 ' entity='mgr.np0005486733.primvu' Oct 14 06:07:05 localhost ceph-mon[317114]: from='mgr.17433 ' entity='mgr.np0005486733.primvu' Oct 14 06:07:05 localhost ceph-mon[317114]: from='mgr.17433 ' entity='mgr.np0005486733.primvu' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005486733.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Oct 14 06:07:05 localhost ceph-mon[317114]: Reconfiguring crash.np0005486733 (monmap changed)... Oct 14 06:07:05 localhost ceph-mon[317114]: from='mgr.17433 172.18.0.108:0/2728758967' entity='mgr.np0005486733.primvu' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005486733.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Oct 14 06:07:05 localhost ceph-mon[317114]: Reconfiguring daemon crash.np0005486733 on np0005486733.localdomain Oct 14 06:07:05 localhost ceph-mon[317114]: from='mgr.17433 ' entity='mgr.np0005486733.primvu' Oct 14 06:07:05 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) Oct 14 06:07:05 localhost ceph-mon[317114]: log_channel(audit) log [DBG] : from='mgr.17433 172.18.0.108:0/2728758967' entity='mgr.np0005486733.primvu' cmd={"prefix": "config generate-minimal-conf"} : dispatch Oct 14 06:07:05 localhost ceph-mgr[302471]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.0 on np0005486733.localdomain Oct 14 06:07:05 localhost ceph-mgr[302471]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.0 on np0005486733.localdomain Oct 
14 06:07:05 localhost nova_compute[297686]: 2025-10-14 10:07:05.376 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:07:05 localhost podman[319369]: Oct 14 06:07:05 localhost podman[319369]: 2025-10-14 10:07:05.944717885 +0000 UTC m=+0.051997373 container create d028ee72527d8e8d686d7a1e9dcdf0272410895d9d9f58e7949fe44cfe907bcc (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=eloquent_torvalds, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, distribution-scope=public, io.openshift.expose-services=, maintainer=Guillaume Abrioux , RELEASE=main, CEPH_POINT_RELEASE=, version=7, build-date=2025-09-24T08:57:55, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_BRANCH=main, ceph=True, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, release=553, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12) Oct 14 06:07:05 localhost systemd[1]: var-lib-containers-storage-overlay-1d02c829170f3e17e959c9768c8fe55920c87a161d3c3ec3e9e11a666e5298cf-merged.mount: Deactivated successfully. Oct 14 06:07:05 localhost systemd[1]: Started libpod-conmon-d028ee72527d8e8d686d7a1e9dcdf0272410895d9d9f58e7949fe44cfe907bcc.scope. Oct 14 06:07:05 localhost systemd[1]: Started libcrun container. 
Oct 14 06:07:06 localhost podman[319369]: 2025-10-14 10:07:06.009311864 +0000 UTC m=+0.116591352 container init d028ee72527d8e8d686d7a1e9dcdf0272410895d9d9f58e7949fe44cfe907bcc (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=eloquent_torvalds, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, GIT_BRANCH=main, RELEASE=main, maintainer=Guillaume Abrioux , GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, build-date=2025-09-24T08:57:55, name=rhceph, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., vcs-type=git, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.buildah.version=1.33.12, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_CLEAN=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=553) Oct 14 06:07:06 localhost systemd[1]: tmp-crun.oYKCLM.mount: Deactivated successfully. 
Oct 14 06:07:06 localhost podman[319369]: 2025-10-14 10:07:05.923264929 +0000 UTC m=+0.030544447 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Oct 14 06:07:06 localhost podman[319369]: 2025-10-14 10:07:06.023786388 +0000 UTC m=+0.131065926 container start d028ee72527d8e8d686d7a1e9dcdf0272410895d9d9f58e7949fe44cfe907bcc (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=eloquent_torvalds, distribution-scope=public, com.redhat.component=rhceph-container, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, version=7, vcs-type=git, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_CLEAN=True, release=553, name=rhceph, ceph=True, build-date=2025-09-24T08:57:55, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux ) Oct 14 06:07:06 localhost podman[319369]: 2025-10-14 10:07:06.024246962 +0000 UTC m=+0.131526640 container attach d028ee72527d8e8d686d7a1e9dcdf0272410895d9d9f58e7949fe44cfe907bcc (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=eloquent_torvalds, maintainer=Guillaume Abrioux , name=rhceph, RELEASE=main, version=7, com.redhat.component=rhceph-container, build-date=2025-09-24T08:57:55, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, 
org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_CLEAN=True, release=553, architecture=x86_64, io.buildah.version=1.33.12, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, GIT_BRANCH=main) Oct 14 06:07:06 localhost eloquent_torvalds[319384]: 167 167 Oct 14 06:07:06 localhost systemd[1]: libpod-d028ee72527d8e8d686d7a1e9dcdf0272410895d9d9f58e7949fe44cfe907bcc.scope: Deactivated successfully. Oct 14 06:07:06 localhost podman[319369]: 2025-10-14 10:07:06.030122872 +0000 UTC m=+0.137402420 container died d028ee72527d8e8d686d7a1e9dcdf0272410895d9d9f58e7949fe44cfe907bcc (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=eloquent_torvalds, io.openshift.expose-services=, version=7, GIT_CLEAN=True, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-09-24T08:57:55, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, name=rhceph, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, GIT_BRANCH=main, CEPH_POINT_RELEASE=, distribution-scope=public, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux , ceph=True, vcs-type=git, com.redhat.component=rhceph-container, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, description=Red Hat Ceph Storage 7, 
org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph) Oct 14 06:07:06 localhost podman[319389]: 2025-10-14 10:07:06.119872411 +0000 UTC m=+0.077776044 container remove d028ee72527d8e8d686d7a1e9dcdf0272410895d9d9f58e7949fe44cfe907bcc (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=eloquent_torvalds, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-09-24T08:57:55, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, ceph=True, io.buildah.version=1.33.12, vendor=Red Hat, Inc., GIT_CLEAN=True, version=7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, distribution-scope=public, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux , com.redhat.component=rhceph-container, vcs-type=git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_BRANCH=main, release=553, RELEASE=main, description=Red Hat Ceph Storage 7) Oct 14 06:07:06 localhost systemd[1]: libpod-conmon-d028ee72527d8e8d686d7a1e9dcdf0272410895d9d9f58e7949fe44cfe907bcc.scope: Deactivated successfully. 
Oct 14 06:07:06 localhost ceph-mgr[302471]: log_channel(cluster) log [DBG] : pgmap v12: 177 pgs: 177 active+clean; 104 MiB data, 587 MiB used, 41 GiB / 42 GiB avail; 18 KiB/s rd, 0 B/s wr, 10 op/s Oct 14 06:07:06 localhost ceph-mon[317114]: from='mgr.17433 ' entity='mgr.np0005486733.primvu' Oct 14 06:07:06 localhost ceph-mon[317114]: Reconfiguring osd.0 (monmap changed)... Oct 14 06:07:06 localhost ceph-mon[317114]: from='mgr.17433 172.18.0.108:0/2728758967' entity='mgr.np0005486733.primvu' cmd={"prefix": "auth get", "entity": "osd.0"} : dispatch Oct 14 06:07:06 localhost ceph-mon[317114]: Reconfiguring daemon osd.0 on np0005486733.localdomain Oct 14 06:07:06 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005486733.localdomain.devices.0}] v 0) Oct 14 06:07:06 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005486733.localdomain}] v 0) Oct 14 06:07:06 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005486733.localdomain.devices.0}] v 0) Oct 14 06:07:06 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005486733.localdomain}] v 0) Oct 14 06:07:06 localhost ceph-mgr[302471]: [cephadm INFO cephadm.serve] Reconfiguring osd.3 (monmap changed)... Oct 14 06:07:06 localhost ceph-mgr[302471]: log_channel(cephadm) log [INF] : Reconfiguring osd.3 (monmap changed)... 
Oct 14 06:07:06 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix": "auth get", "entity": "osd.3"} v 0) Oct 14 06:07:06 localhost ceph-mon[317114]: log_channel(audit) log [INF] : from='mgr.17433 172.18.0.108:0/2728758967' entity='mgr.np0005486733.primvu' cmd={"prefix": "auth get", "entity": "osd.3"} : dispatch Oct 14 06:07:06 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) Oct 14 06:07:06 localhost ceph-mon[317114]: log_channel(audit) log [DBG] : from='mgr.17433 172.18.0.108:0/2728758967' entity='mgr.np0005486733.primvu' cmd={"prefix": "config generate-minimal-conf"} : dispatch Oct 14 06:07:06 localhost ceph-mgr[302471]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.3 on np0005486733.localdomain Oct 14 06:07:06 localhost ceph-mgr[302471]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.3 on np0005486733.localdomain Oct 14 06:07:06 localhost podman[319466]: Oct 14 06:07:06 localhost podman[319466]: 2025-10-14 10:07:06.928537494 +0000 UTC m=+0.069222642 container create befa0e2bd8d6b74088a639d9b0213b0b7f406036bd59fd9040dfeadb44287a02 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=tender_joliot, name=rhceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, ceph=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, release=553, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux , io.buildah.version=1.33.12, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, vcs-type=git, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/agreements, RELEASE=main, version=7, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-09-24T08:57:55, GIT_BRANCH=main) Oct 14 06:07:06 localhost systemd[1]: Started libpod-conmon-befa0e2bd8d6b74088a639d9b0213b0b7f406036bd59fd9040dfeadb44287a02.scope. Oct 14 06:07:06 localhost systemd[1]: var-lib-containers-storage-overlay-5f8101c5605c7293453344dae46038cbeed87282fa8d7fedd9b874a1af22f8fb-merged.mount: Deactivated successfully. Oct 14 06:07:06 localhost systemd[1]: Started libcrun container. Oct 14 06:07:06 localhost podman[319466]: 2025-10-14 10:07:06.986178609 +0000 UTC m=+0.126863737 container init befa0e2bd8d6b74088a639d9b0213b0b7f406036bd59fd9040dfeadb44287a02 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=tender_joliot, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, maintainer=Guillaume Abrioux , vcs-type=git, io.buildah.version=1.33.12, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, GIT_CLEAN=True, RELEASE=main, version=7, build-date=2025-09-24T08:57:55, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, GIT_BRANCH=main, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=553, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhceph, distribution-scope=public) Oct 14 06:07:06 localhost podman[319466]: 2025-10-14 
10:07:06.896626656 +0000 UTC m=+0.037311834 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Oct 14 06:07:07 localhost podman[319466]: 2025-10-14 10:07:07.001874091 +0000 UTC m=+0.142559199 container start befa0e2bd8d6b74088a639d9b0213b0b7f406036bd59fd9040dfeadb44287a02 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=tender_joliot, io.openshift.expose-services=, GIT_CLEAN=True, ceph=True, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, name=rhceph, distribution-scope=public, version=7, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux , url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.component=rhceph-container, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., GIT_BRANCH=main, build-date=2025-09-24T08:57:55, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.license_terms=https://www.redhat.com/agreements, release=553) Oct 14 06:07:07 localhost tender_joliot[319481]: 167 167 Oct 14 06:07:07 localhost podman[319466]: 2025-10-14 10:07:07.002361205 +0000 UTC m=+0.143046373 container attach befa0e2bd8d6b74088a639d9b0213b0b7f406036bd59fd9040dfeadb44287a02 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=tender_joliot, release=553, vcs-type=git, name=rhceph, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, version=7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, maintainer=Guillaume Abrioux , architecture=x86_64, 
CEPH_POINT_RELEASE=, ceph=True, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-09-24T08:57:55, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553) Oct 14 06:07:07 localhost systemd[1]: libpod-befa0e2bd8d6b74088a639d9b0213b0b7f406036bd59fd9040dfeadb44287a02.scope: Deactivated successfully. Oct 14 06:07:07 localhost podman[319466]: 2025-10-14 10:07:07.00772098 +0000 UTC m=+0.148406138 container died befa0e2bd8d6b74088a639d9b0213b0b7f406036bd59fd9040dfeadb44287a02 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=tender_joliot, release=553, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.33.12, GIT_CLEAN=True, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_BRANCH=main, maintainer=Guillaume Abrioux , vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, build-date=2025-09-24T08:57:55, 
vendor=Red Hat, Inc., name=rhceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.component=rhceph-container, ceph=True) Oct 14 06:07:07 localhost podman[319486]: 2025-10-14 10:07:07.081810059 +0000 UTC m=+0.066490017 container remove befa0e2bd8d6b74088a639d9b0213b0b7f406036bd59fd9040dfeadb44287a02 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=tender_joliot, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, ceph=True, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, version=7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, io.openshift.expose-services=, architecture=x86_64, build-date=2025-09-24T08:57:55, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux , distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, RELEASE=main, io.buildah.version=1.33.12, com.redhat.component=rhceph-container, release=553, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_BRANCH=main) Oct 14 06:07:07 localhost systemd[1]: libpod-conmon-befa0e2bd8d6b74088a639d9b0213b0b7f406036bd59fd9040dfeadb44287a02.scope: Deactivated successfully. 
Oct 14 06:07:07 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005486733.localdomain.devices.0}] v 0) Oct 14 06:07:07 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005486733.localdomain}] v 0) Oct 14 06:07:07 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005486733.localdomain.devices.0}] v 0) Oct 14 06:07:07 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005486733.localdomain}] v 0) Oct 14 06:07:07 localhost ceph-mon[317114]: from='mgr.17433 ' entity='mgr.np0005486733.primvu' Oct 14 06:07:07 localhost ceph-mon[317114]: from='mgr.17433 ' entity='mgr.np0005486733.primvu' Oct 14 06:07:07 localhost ceph-mon[317114]: from='mgr.17433 ' entity='mgr.np0005486733.primvu' Oct 14 06:07:07 localhost ceph-mon[317114]: from='mgr.17433 ' entity='mgr.np0005486733.primvu' Oct 14 06:07:07 localhost ceph-mon[317114]: Reconfiguring osd.3 (monmap changed)... Oct 14 06:07:07 localhost ceph-mon[317114]: from='mgr.17433 172.18.0.108:0/2728758967' entity='mgr.np0005486733.primvu' cmd={"prefix": "auth get", "entity": "osd.3"} : dispatch Oct 14 06:07:07 localhost ceph-mon[317114]: Reconfiguring daemon osd.3 on np0005486733.localdomain Oct 14 06:07:07 localhost ceph-mon[317114]: from='mgr.17433 ' entity='mgr.np0005486733.primvu' Oct 14 06:07:07 localhost ceph-mon[317114]: from='mgr.17433 ' entity='mgr.np0005486733.primvu' Oct 14 06:07:07 localhost ceph-mgr[302471]: [cephadm INFO cephadm.serve] Reconfiguring mds.mds.np0005486733.tvstmf (monmap changed)... Oct 14 06:07:07 localhost ceph-mgr[302471]: log_channel(cephadm) log [INF] : Reconfiguring mds.mds.np0005486733.tvstmf (monmap changed)... 
Oct 14 06:07:07 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mds.mds.np0005486733.tvstmf", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} v 0) Oct 14 06:07:07 localhost ceph-mon[317114]: log_channel(audit) log [INF] : from='mgr.17433 172.18.0.108:0/2728758967' entity='mgr.np0005486733.primvu' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005486733.tvstmf", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Oct 14 06:07:07 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) Oct 14 06:07:07 localhost ceph-mon[317114]: log_channel(audit) log [DBG] : from='mgr.17433 172.18.0.108:0/2728758967' entity='mgr.np0005486733.primvu' cmd={"prefix": "config generate-minimal-conf"} : dispatch Oct 14 06:07:07 localhost ceph-mgr[302471]: [cephadm INFO cephadm.serve] Reconfiguring daemon mds.mds.np0005486733.tvstmf on np0005486733.localdomain Oct 14 06:07:07 localhost ceph-mgr[302471]: log_channel(cephadm) log [INF] : Reconfiguring daemon mds.mds.np0005486733.tvstmf on np0005486733.localdomain Oct 14 06:07:07 localhost systemd[1]: tmp-crun.2zDKpV.mount: Deactivated successfully. Oct 14 06:07:07 localhost systemd[1]: var-lib-containers-storage-overlay-2f126721ef9bbdcfd66230640b126b701f0edfff9ac09ad31070ecec0ef39ba2-merged.mount: Deactivated successfully. 
Oct 14 06:07:07 localhost podman[319563]: Oct 14 06:07:07 localhost podman[319563]: 2025-10-14 10:07:07.995781108 +0000 UTC m=+0.069567622 container create e574adddeaac75fd584f3b5ba6317053ea1f1b26d94c8d3ea720fa03784c6189 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=relaxed_heisenberg, io.buildah.version=1.33.12, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, RELEASE=main, GIT_CLEAN=True, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux , release=553, io.openshift.expose-services=, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, version=7, vendor=Red Hat, Inc., build-date=2025-09-24T08:57:55, ceph=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, io.openshift.tags=rhceph ceph, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64) Oct 14 06:07:08 localhost systemd[1]: Started libpod-conmon-e574adddeaac75fd584f3b5ba6317053ea1f1b26d94c8d3ea720fa03784c6189.scope. Oct 14 06:07:08 localhost systemd[1]: Started libcrun container. 
Oct 14 06:07:08 localhost podman[319563]: 2025-10-14 10:07:07.966720637 +0000 UTC m=+0.040507251 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Oct 14 06:07:08 localhost podman[319563]: 2025-10-14 10:07:08.070877308 +0000 UTC m=+0.144663852 container init e574adddeaac75fd584f3b5ba6317053ea1f1b26d94c8d3ea720fa03784c6189 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=relaxed_heisenberg, io.buildah.version=1.33.12, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, vcs-type=git, build-date=2025-09-24T08:57:55, version=7, ceph=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, io.openshift.tags=rhceph ceph, name=rhceph, GIT_BRANCH=main, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/agreements, RELEASE=main, maintainer=Guillaume Abrioux ) Oct 14 06:07:08 localhost podman[319563]: 2025-10-14 10:07:08.081027399 +0000 UTC m=+0.154813953 container start e574adddeaac75fd584f3b5ba6317053ea1f1b26d94c8d3ea720fa03784c6189 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=relaxed_heisenberg, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_CLEAN=True, vcs-type=git, release=553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, RELEASE=main, 
GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.33.12, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., build-date=2025-09-24T08:57:55, name=rhceph, maintainer=Guillaume Abrioux , vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, description=Red Hat Ceph Storage 7, ceph=True, architecture=x86_64, io.openshift.expose-services=, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, CEPH_POINT_RELEASE=) Oct 14 06:07:08 localhost podman[319563]: 2025-10-14 10:07:08.081341278 +0000 UTC m=+0.155127832 container attach e574adddeaac75fd584f3b5ba6317053ea1f1b26d94c8d3ea720fa03784c6189 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=relaxed_heisenberg, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, ceph=True, RELEASE=main, com.redhat.component=rhceph-container, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, release=553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, maintainer=Guillaume Abrioux , architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_BRANCH=main, build-date=2025-09-24T08:57:55, description=Red Hat Ceph Storage 7, name=rhceph, io.buildah.version=1.33.12, io.openshift.expose-services=) Oct 14 06:07:08 
localhost relaxed_heisenberg[319578]: 167 167 Oct 14 06:07:08 localhost systemd[1]: libpod-e574adddeaac75fd584f3b5ba6317053ea1f1b26d94c8d3ea720fa03784c6189.scope: Deactivated successfully. Oct 14 06:07:08 localhost podman[319563]: 2025-10-14 10:07:08.085240998 +0000 UTC m=+0.159027532 container died e574adddeaac75fd584f3b5ba6317053ea1f1b26d94c8d3ea720fa03784c6189 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=relaxed_heisenberg, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, version=7, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, build-date=2025-09-24T08:57:55, architecture=x86_64, vcs-type=git, io.openshift.expose-services=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, io.buildah.version=1.33.12, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, release=553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, name=rhceph, ceph=True, RELEASE=main, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, distribution-scope=public, maintainer=Guillaume Abrioux , vendor=Red Hat, Inc.) 
Oct 14 06:07:08 localhost nova_compute[297686]: 2025-10-14 10:07:08.148 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:07:08 localhost podman[319583]: 2025-10-14 10:07:08.225438753 +0000 UTC m=+0.072687027 container remove e574adddeaac75fd584f3b5ba6317053ea1f1b26d94c8d3ea720fa03784c6189 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=relaxed_heisenberg, maintainer=Guillaume Abrioux , io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., RELEASE=main, release=553, vcs-type=git, GIT_BRANCH=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.33.12, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, build-date=2025-09-24T08:57:55, architecture=x86_64, ceph=True, name=rhceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, distribution-scope=public, GIT_CLEAN=True, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9) Oct 14 06:07:08 localhost systemd[1]: libpod-conmon-e574adddeaac75fd584f3b5ba6317053ea1f1b26d94c8d3ea720fa03784c6189.scope: Deactivated successfully. 
Oct 14 06:07:08 localhost ceph-mgr[302471]: log_channel(cluster) log [DBG] : pgmap v13: 177 pgs: 177 active+clean; 104 MiB data, 587 MiB used, 41 GiB / 42 GiB avail Oct 14 06:07:08 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005486733.localdomain.devices.0}] v 0) Oct 14 06:07:08 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005486733.localdomain}] v 0) Oct 14 06:07:08 localhost ceph-mgr[302471]: [cephadm INFO cephadm.serve] Reconfiguring mgr.np0005486733.primvu (monmap changed)... Oct 14 06:07:08 localhost ceph-mgr[302471]: log_channel(cephadm) log [INF] : Reconfiguring mgr.np0005486733.primvu (monmap changed)... Oct 14 06:07:08 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mgr.np0005486733.primvu", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} v 0) Oct 14 06:07:08 localhost ceph-mon[317114]: log_channel(audit) log [INF] : from='mgr.17433 172.18.0.108:0/2728758967' entity='mgr.np0005486733.primvu' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005486733.primvu", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Oct 14 06:07:08 localhost ceph-mon[317114]: from='mgr.17433 ' entity='mgr.np0005486733.primvu' Oct 14 06:07:08 localhost ceph-mon[317114]: from='mgr.17433 ' entity='mgr.np0005486733.primvu' Oct 14 06:07:08 localhost ceph-mon[317114]: from='mgr.17433 ' entity='mgr.np0005486733.primvu' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005486733.tvstmf", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Oct 14 06:07:08 localhost ceph-mon[317114]: Reconfiguring mds.mds.np0005486733.tvstmf (monmap changed)... 
Oct 14 06:07:08 localhost ceph-mon[317114]: from='mgr.17433 172.18.0.108:0/2728758967' entity='mgr.np0005486733.primvu' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005486733.tvstmf", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Oct 14 06:07:08 localhost ceph-mon[317114]: Reconfiguring daemon mds.mds.np0005486733.tvstmf on np0005486733.localdomain Oct 14 06:07:08 localhost ceph-mon[317114]: from='mgr.17433 ' entity='mgr.np0005486733.primvu' Oct 14 06:07:08 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix": "mgr services"} v 0) Oct 14 06:07:08 localhost ceph-mon[317114]: log_channel(audit) log [DBG] : from='mgr.17433 172.18.0.108:0/2728758967' entity='mgr.np0005486733.primvu' cmd={"prefix": "mgr services"} : dispatch Oct 14 06:07:08 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) Oct 14 06:07:08 localhost ceph-mon[317114]: log_channel(audit) log [DBG] : from='mgr.17433 172.18.0.108:0/2728758967' entity='mgr.np0005486733.primvu' cmd={"prefix": "config generate-minimal-conf"} : dispatch Oct 14 06:07:08 localhost ceph-mgr[302471]: [cephadm INFO cephadm.serve] Reconfiguring daemon mgr.np0005486733.primvu on np0005486733.localdomain Oct 14 06:07:08 localhost ceph-mgr[302471]: log_channel(cephadm) log [INF] : Reconfiguring daemon mgr.np0005486733.primvu on np0005486733.localdomain Oct 14 06:07:08 localhost openstack_network_exporter[250374]: ERROR 10:07:08 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Oct 14 06:07:08 localhost openstack_network_exporter[250374]: ERROR 10:07:08 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Oct 14 06:07:08 localhost openstack_network_exporter[250374]: ERROR 10:07:08 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for 
ovn-northd Oct 14 06:07:08 localhost openstack_network_exporter[250374]: ERROR 10:07:08 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Oct 14 06:07:08 localhost openstack_network_exporter[250374]: Oct 14 06:07:08 localhost openstack_network_exporter[250374]: ERROR 10:07:08 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Oct 14 06:07:08 localhost openstack_network_exporter[250374]: Oct 14 06:07:08 localhost systemd[1]: var-lib-containers-storage-overlay-b33650a58afdd3dacffb0f7a53cdb7e7eee91af950947581ba375ba4029b775f-merged.mount: Deactivated successfully. Oct 14 06:07:09 localhost podman[319654]: Oct 14 06:07:09 localhost podman[319654]: 2025-10-14 10:07:09.014911557 +0000 UTC m=+0.085874741 container create c614cea349f3a1cca66cc351b33ded72b41d9c5332f00d32dd80b82c70f5c74d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=pedantic_dewdney, release=553, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhceph, maintainer=Guillaume Abrioux , CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, architecture=x86_64, io.buildah.version=1.33.12, description=Red Hat Ceph Storage 7, distribution-scope=public, GIT_CLEAN=True, ceph=True, vcs-type=git, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.component=rhceph-container, build-date=2025-09-24T08:57:55, io.openshift.tags=rhceph ceph, version=7, GIT_BRANCH=main) Oct 14 06:07:09 localhost systemd[1]: Started 
libpod-conmon-c614cea349f3a1cca66cc351b33ded72b41d9c5332f00d32dd80b82c70f5c74d.scope. Oct 14 06:07:09 localhost systemd[1]: Started libcrun container. Oct 14 06:07:09 localhost podman[319654]: 2025-10-14 10:07:08.982832815 +0000 UTC m=+0.053796059 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Oct 14 06:07:09 localhost podman[319654]: 2025-10-14 10:07:09.081874169 +0000 UTC m=+0.152837403 container init c614cea349f3a1cca66cc351b33ded72b41d9c5332f00d32dd80b82c70f5c74d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=pedantic_dewdney, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, CEPH_POINT_RELEASE=, release=553, version=7, name=rhceph, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , io.buildah.version=1.33.12, io.openshift.expose-services=, GIT_BRANCH=main, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, vcs-type=git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_CLEAN=True, build-date=2025-09-24T08:57:55, io.openshift.tags=rhceph ceph) Oct 14 06:07:09 localhost podman[319654]: 2025-10-14 10:07:09.093209106 +0000 UTC m=+0.164172320 container start c614cea349f3a1cca66cc351b33ded72b41d9c5332f00d32dd80b82c70f5c74d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=pedantic_dewdney, maintainer=Guillaume Abrioux , CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, 
com.redhat.license_terms=https://www.redhat.com/agreements, RELEASE=main, version=7, vcs-type=git, io.openshift.expose-services=, GIT_BRANCH=main, io.buildah.version=1.33.12, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, architecture=x86_64, build-date=2025-09-24T08:57:55, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, distribution-scope=public, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, release=553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7) Oct 14 06:07:09 localhost podman[319654]: 2025-10-14 10:07:09.093647539 +0000 UTC m=+0.164610743 container attach c614cea349f3a1cca66cc351b33ded72b41d9c5332f00d32dd80b82c70f5c74d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=pedantic_dewdney, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, io.openshift.expose-services=, release=553, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, description=Red Hat Ceph Storage 7, name=rhceph, GIT_CLEAN=True, io.buildah.version=1.33.12, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, version=7, 
GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, build-date=2025-09-24T08:57:55, vcs-type=git, maintainer=Guillaume Abrioux , GIT_BRANCH=main, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph) Oct 14 06:07:09 localhost pedantic_dewdney[319669]: 167 167 Oct 14 06:07:09 localhost systemd[1]: libpod-c614cea349f3a1cca66cc351b33ded72b41d9c5332f00d32dd80b82c70f5c74d.scope: Deactivated successfully. Oct 14 06:07:09 localhost podman[319654]: 2025-10-14 10:07:09.096727864 +0000 UTC m=+0.167691068 container died c614cea349f3a1cca66cc351b33ded72b41d9c5332f00d32dd80b82c70f5c74d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=pedantic_dewdney, release=553, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-type=git, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, ceph=True, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, version=7, distribution-scope=public, GIT_CLEAN=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, maintainer=Guillaume Abrioux , RELEASE=main, build-date=2025-09-24T08:57:55, io.buildah.version=1.33.12, name=rhceph, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.) 
Oct 14 06:07:09 localhost podman[319675]: 2025-10-14 10:07:09.205010801 +0000 UTC m=+0.090878675 container remove c614cea349f3a1cca66cc351b33ded72b41d9c5332f00d32dd80b82c70f5c74d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=pedantic_dewdney, io.buildah.version=1.33.12, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, maintainer=Guillaume Abrioux , release=553, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, CEPH_POINT_RELEASE=, version=7, distribution-scope=public, name=rhceph, vendor=Red Hat, Inc., build-date=2025-09-24T08:57:55, com.redhat.component=rhceph-container, RELEASE=main, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.license_terms=https://www.redhat.com/agreements, ceph=True) Oct 14 06:07:09 localhost systemd[1]: libpod-conmon-c614cea349f3a1cca66cc351b33ded72b41d9c5332f00d32dd80b82c70f5c74d.scope: Deactivated successfully. 
Oct 14 06:07:09 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e83 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Oct 14 06:07:09 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005486733.localdomain.devices.0}] v 0) Oct 14 06:07:09 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005486733.localdomain}] v 0) Oct 14 06:07:09 localhost ceph-mgr[302471]: [cephadm INFO cephadm.serve] Reconfiguring mon.np0005486733 (monmap changed)... Oct 14 06:07:09 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix": "auth get", "entity": "mon."} v 0) Oct 14 06:07:09 localhost ceph-mon[317114]: log_channel(audit) log [INF] : from='mgr.17433 172.18.0.108:0/2728758967' entity='mgr.np0005486733.primvu' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Oct 14 06:07:09 localhost ceph-mgr[302471]: log_channel(cephadm) log [INF] : Reconfiguring mon.np0005486733 (monmap changed)... 
Oct 14 06:07:09 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix": "config get", "who": "mon", "key": "public_network"} v 0) Oct 14 06:07:09 localhost ceph-mon[317114]: log_channel(audit) log [DBG] : from='mgr.17433 172.18.0.108:0/2728758967' entity='mgr.np0005486733.primvu' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch Oct 14 06:07:09 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) Oct 14 06:07:09 localhost ceph-mon[317114]: log_channel(audit) log [DBG] : from='mgr.17433 172.18.0.108:0/2728758967' entity='mgr.np0005486733.primvu' cmd={"prefix": "config generate-minimal-conf"} : dispatch Oct 14 06:07:09 localhost ceph-mgr[302471]: [cephadm INFO cephadm.serve] Reconfiguring daemon mon.np0005486733 on np0005486733.localdomain Oct 14 06:07:09 localhost ceph-mgr[302471]: log_channel(cephadm) log [INF] : Reconfiguring daemon mon.np0005486733 on np0005486733.localdomain Oct 14 06:07:09 localhost ceph-mon[317114]: from='mgr.17433 ' entity='mgr.np0005486733.primvu' Oct 14 06:07:09 localhost ceph-mon[317114]: Reconfiguring mgr.np0005486733.primvu (monmap changed)... 
Oct 14 06:07:09 localhost ceph-mon[317114]: from='mgr.17433 172.18.0.108:0/2728758967' entity='mgr.np0005486733.primvu' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005486733.primvu", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Oct 14 06:07:09 localhost ceph-mon[317114]: from='mgr.17433 ' entity='mgr.np0005486733.primvu' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005486733.primvu", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Oct 14 06:07:09 localhost ceph-mon[317114]: Reconfiguring daemon mgr.np0005486733.primvu on np0005486733.localdomain Oct 14 06:07:09 localhost ceph-mon[317114]: from='mgr.17433 ' entity='mgr.np0005486733.primvu' Oct 14 06:07:09 localhost ceph-mon[317114]: from='mgr.17433 ' entity='mgr.np0005486733.primvu' Oct 14 06:07:09 localhost ceph-mon[317114]: from='mgr.17433 172.18.0.108:0/2728758967' entity='mgr.np0005486733.primvu' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Oct 14 06:07:09 localhost systemd[1]: var-lib-containers-storage-overlay-70f911b2261c1afe348263d60644700f08fa5dd6b90ca9a739261824b2dfa554-merged.mount: Deactivated successfully. 
Oct 14 06:07:09 localhost podman[319744]: Oct 14 06:07:09 localhost podman[319744]: 2025-10-14 10:07:09.989721159 +0000 UTC m=+0.082503897 container create bafa42c04ac055fa6b03539ea159833a2a7a9414b184ac2f1528e93e186f5929 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=adoring_mclaren, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, RELEASE=main, release=553, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.buildah.version=1.33.12, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, description=Red Hat Ceph Storage 7, version=7, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, architecture=x86_64, maintainer=Guillaume Abrioux , url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, distribution-scope=public, build-date=2025-09-24T08:57:55, ceph=True, io.openshift.expose-services=, vendor=Red Hat, Inc., name=rhceph) Oct 14 06:07:10 localhost systemd[1]: Started libpod-conmon-bafa42c04ac055fa6b03539ea159833a2a7a9414b184ac2f1528e93e186f5929.scope. Oct 14 06:07:10 localhost systemd[1]: Started libcrun container. 
Oct 14 06:07:10 localhost podman[319744]: 2025-10-14 10:07:09.960250167 +0000 UTC m=+0.053032945 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Oct 14 06:07:10 localhost podman[319744]: 2025-10-14 10:07:10.069410621 +0000 UTC m=+0.162193349 container init bafa42c04ac055fa6b03539ea159833a2a7a9414b184ac2f1528e93e186f5929 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=adoring_mclaren, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, build-date=2025-09-24T08:57:55, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, com.redhat.component=rhceph-container, io.openshift.expose-services=, maintainer=Guillaume Abrioux , release=553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, vcs-type=git, io.buildah.version=1.33.12, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, architecture=x86_64, distribution-scope=public, name=rhceph) Oct 14 06:07:10 localhost podman[319744]: 2025-10-14 10:07:10.080117049 +0000 UTC m=+0.172899777 container start bafa42c04ac055fa6b03539ea159833a2a7a9414b184ac2f1528e93e186f5929 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=adoring_mclaren, vendor=Red Hat, Inc., ceph=True, io.buildah.version=1.33.12, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, 
io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux , vcs-type=git, RELEASE=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, release=553, description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, GIT_CLEAN=True, CEPH_POINT_RELEASE=, distribution-scope=public, name=rhceph, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=rhceph-container) Oct 14 06:07:10 localhost podman[319744]: 2025-10-14 10:07:10.08046262 +0000 UTC m=+0.173245418 container attach bafa42c04ac055fa6b03539ea159833a2a7a9414b184ac2f1528e93e186f5929 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=adoring_mclaren, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux , build-date=2025-09-24T08:57:55, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, RELEASE=main, vendor=Red Hat, Inc., GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, vcs-type=git, description=Red Hat Ceph Storage 7, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.33.12, CEPH_POINT_RELEASE=, release=553, distribution-scope=public) Oct 
14 06:07:10 localhost adoring_mclaren[319758]: 167 167 Oct 14 06:07:10 localhost systemd[1]: libpod-bafa42c04ac055fa6b03539ea159833a2a7a9414b184ac2f1528e93e186f5929.scope: Deactivated successfully. Oct 14 06:07:10 localhost podman[319744]: 2025-10-14 10:07:10.084936956 +0000 UTC m=+0.177719714 container died bafa42c04ac055fa6b03539ea159833a2a7a9414b184ac2f1528e93e186f5929 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=adoring_mclaren, vendor=Red Hat, Inc., architecture=x86_64, maintainer=Guillaume Abrioux , release=553, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-09-24T08:57:55, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, description=Red Hat Ceph Storage 7, version=7, RELEASE=main, vcs-type=git, io.buildah.version=1.33.12, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, io.openshift.expose-services=) Oct 14 06:07:10 localhost podman[319763]: 2025-10-14 10:07:10.183806396 +0000 UTC m=+0.085495981 container remove bafa42c04ac055fa6b03539ea159833a2a7a9414b184ac2f1528e93e186f5929 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=adoring_mclaren, build-date=2025-09-24T08:57:55, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, io.buildah.version=1.33.12, name=rhceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, 
GIT_BRANCH=main, version=7, architecture=x86_64, vcs-type=git, release=553, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, RELEASE=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, maintainer=Guillaume Abrioux , vendor=Red Hat, Inc., distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=) Oct 14 06:07:10 localhost systemd[1]: libpod-conmon-bafa42c04ac055fa6b03539ea159833a2a7a9414b184ac2f1528e93e186f5929.scope: Deactivated successfully. Oct 14 06:07:10 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005486733.localdomain.devices.0}] v 0) Oct 14 06:07:10 localhost ceph-mgr[302471]: log_channel(cluster) log [DBG] : pgmap v14: 177 pgs: 177 active+clean; 104 MiB data, 587 MiB used, 41 GiB / 42 GiB avail Oct 14 06:07:10 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005486733.localdomain}] v 0) Oct 14 06:07:10 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) Oct 14 06:07:10 localhost ceph-mon[317114]: log_channel(audit) log [DBG] : from='mgr.17433 172.18.0.108:0/2728758967' entity='mgr.np0005486733.primvu' cmd={"prefix": "config generate-minimal-conf"} : dispatch Oct 14 06:07:10 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0) Oct 14 06:07:10 localhost ceph-mon[317114]: 
log_channel(audit) log [INF] : from='mgr.17433 172.18.0.108:0/2728758967' entity='mgr.np0005486733.primvu' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Oct 14 06:07:10 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0) Oct 14 06:07:10 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/spec.mon}] v 0) Oct 14 06:07:10 localhost ceph-mgr[302471]: [progress INFO root] update: starting ev b7db3815-0559-4130-8df1-43873fa3e7a8 (Updating node-proxy deployment (+3 -> 3)) Oct 14 06:07:10 localhost ceph-mgr[302471]: [progress INFO root] complete: finished ev b7db3815-0559-4130-8df1-43873fa3e7a8 (Updating node-proxy deployment (+3 -> 3)) Oct 14 06:07:10 localhost ceph-mgr[302471]: [progress INFO root] Completed event b7db3815-0559-4130-8df1-43873fa3e7a8 (Updating node-proxy deployment (+3 -> 3)) in 0 seconds Oct 14 06:07:10 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0) Oct 14 06:07:10 localhost ceph-mon[317114]: log_channel(audit) log [DBG] : from='mgr.17433 172.18.0.108:0/2728758967' entity='mgr.np0005486733.primvu' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch Oct 14 06:07:10 localhost ceph-mon[317114]: Reconfiguring mon.np0005486733 (monmap changed)... 
Oct 14 06:07:10 localhost ceph-mon[317114]: Reconfiguring daemon mon.np0005486733 on np0005486733.localdomain Oct 14 06:07:10 localhost ceph-mon[317114]: from='mgr.17433 ' entity='mgr.np0005486733.primvu' Oct 14 06:07:10 localhost ceph-mon[317114]: from='mgr.17433 ' entity='mgr.np0005486733.primvu' Oct 14 06:07:10 localhost ceph-mon[317114]: from='mgr.17433 172.18.0.108:0/2728758967' entity='mgr.np0005486733.primvu' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Oct 14 06:07:10 localhost ceph-mon[317114]: from='mgr.17433 ' entity='mgr.np0005486733.primvu' Oct 14 06:07:10 localhost ceph-mon[317114]: from='mgr.17433 ' entity='mgr.np0005486733.primvu' Oct 14 06:07:10 localhost nova_compute[297686]: 2025-10-14 10:07:10.408 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:07:10 localhost ceph-mgr[302471]: [cephadm INFO cephadm.serve] Reconfiguring mon.np0005486731 (monmap changed)... Oct 14 06:07:10 localhost ceph-mgr[302471]: log_channel(cephadm) log [INF] : Reconfiguring mon.np0005486731 (monmap changed)... 
Oct 14 06:07:10 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix": "auth get", "entity": "mon."} v 0) Oct 14 06:07:10 localhost ceph-mon[317114]: log_channel(audit) log [INF] : from='mgr.17433 172.18.0.108:0/2728758967' entity='mgr.np0005486733.primvu' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Oct 14 06:07:10 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix": "config get", "who": "mon", "key": "public_network"} v 0) Oct 14 06:07:10 localhost ceph-mon[317114]: log_channel(audit) log [DBG] : from='mgr.17433 172.18.0.108:0/2728758967' entity='mgr.np0005486733.primvu' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch Oct 14 06:07:10 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) Oct 14 06:07:10 localhost ceph-mon[317114]: log_channel(audit) log [DBG] : from='mgr.17433 172.18.0.108:0/2728758967' entity='mgr.np0005486733.primvu' cmd={"prefix": "config generate-minimal-conf"} : dispatch Oct 14 06:07:10 localhost ceph-mgr[302471]: [cephadm INFO cephadm.serve] Reconfiguring daemon mon.np0005486731 on np0005486731.localdomain Oct 14 06:07:10 localhost ceph-mgr[302471]: log_channel(cephadm) log [INF] : Reconfiguring daemon mon.np0005486731 on np0005486731.localdomain Oct 14 06:07:10 localhost systemd[1]: var-lib-containers-storage-overlay-b0a1bff6705f2ee6b36fc9a7c7d4e90d2d975ed8a2ed24e007ab7e2c5f3850c0-merged.mount: Deactivated successfully. 
Oct 14 06:07:11 localhost nova_compute[297686]: 2025-10-14 10:07:11.206 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:07:11 localhost nova_compute[297686]: 2025-10-14 10:07:11.232 2 DEBUG nova.compute.manager [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Triggering sync for uuid 88c4e366-b765-47a6-96bf-f7677f2ce67c _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m Oct 14 06:07:11 localhost nova_compute[297686]: 2025-10-14 10:07:11.232 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Acquiring lock "88c4e366-b765-47a6-96bf-f7677f2ce67c" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Oct 14 06:07:11 localhost nova_compute[297686]: 2025-10-14 10:07:11.233 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Lock "88c4e366-b765-47a6-96bf-f7677f2ce67c" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Oct 14 06:07:11 localhost nova_compute[297686]: 2025-10-14 10:07:11.292 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Lock "88c4e366-b765-47a6-96bf-f7677f2ce67c" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.059s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Oct 14 06:07:11 localhost ceph-mon[317114]: from='mgr.17433 172.18.0.108:0/2728758967' entity='mgr.np0005486733.primvu' 
cmd={"prefix": "auth get", "entity": "mon."} : dispatch Oct 14 06:07:11 localhost systemd[1]: session-70.scope: Deactivated successfully. Oct 14 06:07:11 localhost systemd[1]: session-70.scope: Consumed 1.669s CPU time. Oct 14 06:07:11 localhost systemd-logind[760]: Session 70 logged out. Waiting for processes to exit. Oct 14 06:07:11 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005486731.localdomain.devices.0}] v 0) Oct 14 06:07:11 localhost systemd-logind[760]: Removed session 70. Oct 14 06:07:11 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005486731.localdomain}] v 0) Oct 14 06:07:12 localhost ceph-mgr[302471]: log_channel(cluster) log [DBG] : pgmap v15: 177 pgs: 177 active+clean; 104 MiB data, 587 MiB used, 41 GiB / 42 GiB avail Oct 14 06:07:12 localhost ceph-mon[317114]: Reconfiguring mon.np0005486731 (monmap changed)... Oct 14 06:07:12 localhost ceph-mon[317114]: Reconfiguring daemon mon.np0005486731 on np0005486731.localdomain Oct 14 06:07:12 localhost ceph-mon[317114]: from='mgr.17433 ' entity='mgr.np0005486733.primvu' Oct 14 06:07:12 localhost ceph-mon[317114]: from='mgr.17433 ' entity='mgr.np0005486733.primvu' Oct 14 06:07:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041. Oct 14 06:07:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857. 
Oct 14 06:07:12 localhost podman[319800]: 2025-10-14 10:07:12.739701622 +0000 UTC m=+0.075279077 container health_status 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=ovn_metadata_agent, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2) Oct 14 06:07:12 localhost podman[319800]: 2025-10-14 10:07:12.752015099 +0000 UTC 
m=+0.087592484 container exec_died 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2) Oct 14 06:07:12 localhost systemd[1]: 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857.service: Deactivated successfully. 
Oct 14 06:07:12 localhost podman[319799]: 2025-10-14 10:07:12.841875302 +0000 UTC m=+0.177034414 container health_status 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm) Oct 14 06:07:12 localhost podman[319799]: 2025-10-14 10:07:12.883232109 +0000 UTC m=+0.218391191 container exec_died 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': 
['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm) Oct 14 06:07:12 localhost systemd[1]: 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041.service: Deactivated successfully. Oct 14 06:07:13 localhost nova_compute[297686]: 2025-10-14 10:07:13.195 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:07:14 localhost ceph-mgr[302471]: [progress INFO root] Writing back 50 completed events Oct 14 06:07:14 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0) Oct 14 06:07:14 localhost ceph-mon[317114]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #13. Immutable memtables: 0. Oct 14 06:07:14 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:07:14.085723) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0 Oct 14 06:07:14 localhost ceph-mon[317114]: rocksdb: [db/flush_job.cc:856] [default] [JOB 3] Flushing memtable with next log file: 13 Oct 14 06:07:14 localhost ceph-mon[317114]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760436434086148, "job": 3, "event": "flush_started", "num_memtables": 1, "num_entries": 13527, "num_deletes": 260, "total_data_size": 24129328, "memory_usage": 25088336, "flush_reason": "Manual Compaction"} Oct 14 06:07:14 localhost ceph-mon[317114]: rocksdb: [db/flush_job.cc:885] [default] [JOB 3] Level-0 flush table #14: started Oct 14 06:07:14 localhost ceph-mon[317114]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760436434163892, "cf_name": "default", "job": 3, "event": "table_file_creation", "file_number": 14, "file_size": 18830728, "file_checksum": 
"", "file_checksum_func_name": "Unknown", "smallest_seqno": 6, "largest_seqno": 13532, "table_properties": {"data_size": 18761703, "index_size": 37175, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 30789, "raw_key_size": 328975, "raw_average_key_size": 26, "raw_value_size": 18553561, "raw_average_value_size": 1509, "num_data_blocks": 1410, "num_entries": 12295, "num_filter_entries": 12295, "num_deletions": 255, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1760436396, "oldest_key_time": 1760436396, "file_creation_time": 1760436434, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "c006d4a7-7ae8-4cd3-a1b5-95f5bf3c427c", "db_session_id": "ZS6W3A0Q266OBGAU6ARP", "orig_file_number": 14, "seqno_to_time_mapping": "N/A"}} Oct 14 06:07:14 localhost ceph-mon[317114]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 3] Flush lasted 77966 microseconds, and 37408 cpu microseconds. 
Oct 14 06:07:14 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:07:14.163972) [db/flush_job.cc:967] [default] [JOB 3] Level-0 flush table #14: 18830728 bytes OK Oct 14 06:07:14 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:07:14.164000) [db/memtable_list.cc:519] [default] Level-0 commit table #14 started Oct 14 06:07:14 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:07:14.165963) [db/memtable_list.cc:722] [default] Level-0 commit table #14: memtable #1 done Oct 14 06:07:14 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:07:14.165991) EVENT_LOG_v1 {"time_micros": 1760436434165982, "job": 3, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [2, 0, 0, 0, 0, 0, 0], "immutable_memtables": 0} Oct 14 06:07:14 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:07:14.166013) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: files[2 0 0 0 0 0 0] max score 0.50 Oct 14 06:07:14 localhost ceph-mon[317114]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 3] Try to delete WAL files size 24038555, prev total WAL file size 24038555, number of live WAL files 2. Oct 14 06:07:14 localhost ceph-mon[317114]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005486733/store.db/000009.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Oct 14 06:07:14 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:07:14.169933) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003131353436' seq:72057594037927935, type:22 .. 
'7061786F73003131373938' seq:0, type:0; will stop at (end) Oct 14 06:07:14 localhost ceph-mon[317114]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 4] Compacting 2@0 files to L6, score -1.00 Oct 14 06:07:14 localhost ceph-mon[317114]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 3 Base level 0, inputs: [14(17MB) 8(1762B)] Oct 14 06:07:14 localhost ceph-mon[317114]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760436434170041, "job": 4, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [14, 8], "score": -1, "input_data_size": 18832490, "oldest_snapshot_seqno": -1} Oct 14 06:07:14 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e83 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Oct 14 06:07:14 localhost ceph-mon[317114]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 4] Generated table #15: 12044 keys, 18827110 bytes, temperature: kUnknown Oct 14 06:07:14 localhost ceph-mon[317114]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760436434258371, "cf_name": "default", "job": 4, "event": "table_file_creation", "file_number": 15, "file_size": 18827110, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 18758779, "index_size": 37126, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 30149, "raw_key_size": 324132, "raw_average_key_size": 26, "raw_value_size": 18553966, "raw_average_value_size": 1540, "num_data_blocks": 1409, "num_entries": 12044, "num_filter_entries": 12044, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", 
"prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1760436394, "oldest_key_time": 0, "file_creation_time": 1760436434, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "c006d4a7-7ae8-4cd3-a1b5-95f5bf3c427c", "db_session_id": "ZS6W3A0Q266OBGAU6ARP", "orig_file_number": 15, "seqno_to_time_mapping": "N/A"}} Oct 14 06:07:14 localhost ceph-mon[317114]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. Oct 14 06:07:14 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:07:14.258784) [db/compaction/compaction_job.cc:1663] [default] [JOB 4] Compacted 2@0 files to L6 => 18827110 bytes Oct 14 06:07:14 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:07:14.260582) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 212.9 rd, 212.8 wr, level 6, files in(2, 0) out(1 +0 blob) MB in(18.0, 0.0 +0.0 blob) out(18.0 +0.0 blob), read-write-amplify(2.0) write-amplify(1.0) OK, records in: 12300, records dropped: 256 output_compression: NoCompression Oct 14 06:07:14 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:07:14.260622) EVENT_LOG_v1 {"time_micros": 1760436434260598, "job": 4, "event": "compaction_finished", "compaction_time_micros": 88466, "compaction_time_cpu_micros": 47393, "output_level": 6, "num_output_files": 1, "total_output_size": 18827110, "num_input_records": 12300, "num_output_records": 12044, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 
0, 0, 0, 0, 1]} Oct 14 06:07:14 localhost ceph-mon[317114]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005486733/store.db/000014.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Oct 14 06:07:14 localhost ceph-mon[317114]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760436434263861, "job": 4, "event": "table_file_deletion", "file_number": 14} Oct 14 06:07:14 localhost ceph-mon[317114]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005486733/store.db/000008.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Oct 14 06:07:14 localhost ceph-mon[317114]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760436434263939, "job": 4, "event": "table_file_deletion", "file_number": 8} Oct 14 06:07:14 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:07:14.169801) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Oct 14 06:07:14 localhost ceph-mgr[302471]: log_channel(cluster) log [DBG] : pgmap v16: 177 pgs: 177 active+clean; 104 MiB data, 587 MiB used, 41 GiB / 42 GiB avail Oct 14 06:07:15 localhost ceph-mon[317114]: from='mgr.17433 ' entity='mgr.np0005486733.primvu' Oct 14 06:07:15 localhost nova_compute[297686]: 2025-10-14 10:07:15.450 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:07:16 localhost ceph-mgr[302471]: log_channel(cluster) log [DBG] : pgmap v17: 177 pgs: 177 active+clean; 104 MiB data, 587 MiB used, 41 GiB / 42 GiB avail Oct 14 06:07:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29. 
Oct 14 06:07:17 localhost podman[319843]: 2025-10-14 10:07:17.740223748 +0000 UTC m=+0.073912265 container health_status fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, container_name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=iscsid) Oct 14 06:07:17 localhost podman[319843]: 2025-10-14 10:07:17.754190346 +0000 UTC m=+0.087878903 container exec_died fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 
(image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, managed_by=edpm_ansible, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=iscsid, org.label-schema.license=GPLv2) Oct 14 06:07:17 localhost systemd[1]: fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29.service: Deactivated successfully. Oct 14 06:07:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec. Oct 14 06:07:17 localhost systemd[1]: tmp-crun.nazPSS.mount: Deactivated successfully. 
Oct 14 06:07:17 localhost podman[319862]: 2025-10-14 10:07:17.866104244 +0000 UTC m=+0.076730631 container health_status aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Oct 14 06:07:17 localhost podman[319862]: 2025-10-14 10:07:17.871396756 +0000 UTC m=+0.082023143 container exec_died aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 
'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Oct 14 06:07:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad. Oct 14 06:07:17 localhost systemd[1]: aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec.service: Deactivated successfully. 
Oct 14 06:07:17 localhost podman[319885]: 2025-10-14 10:07:17.963542509 +0000 UTC m=+0.074346709 container health_status 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.schema-version=1.0, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image) Oct 14 06:07:17 localhost podman[319885]: 2025-10-14 10:07:17.978993072 +0000 UTC m=+0.089797232 container exec_died 
02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, container_name=multipathd, org.label-schema.license=GPLv2, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3) Oct 14 06:07:17 localhost systemd[1]: 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad.service: Deactivated successfully. 
Oct 14 06:07:18 localhost nova_compute[297686]: 2025-10-14 10:07:18.238 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:07:18 localhost ceph-mgr[302471]: log_channel(cluster) log [DBG] : pgmap v18: 177 pgs: 177 active+clean; 104 MiB data, 587 MiB used, 41 GiB / 42 GiB avail Oct 14 06:07:18 localhost nova_compute[297686]: 2025-10-14 10:07:18.278 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:07:19 localhost ceph-mgr[302471]: [volumes INFO mgr_util] scanning for idle connections.. Oct 14 06:07:19 localhost ceph-mgr[302471]: [volumes INFO mgr_util] cleaning up connections: [] Oct 14 06:07:19 localhost ceph-mgr[302471]: [volumes INFO mgr_util] scanning for idle connections.. Oct 14 06:07:19 localhost ceph-mgr[302471]: [volumes INFO mgr_util] cleaning up connections: [] Oct 14 06:07:19 localhost ceph-mgr[302471]: [volumes INFO mgr_util] scanning for idle connections.. 
Oct 14 06:07:19 localhost ceph-mgr[302471]: [volumes INFO mgr_util] cleaning up connections: [] Oct 14 06:07:19 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e83 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Oct 14 06:07:19 localhost nova_compute[297686]: 2025-10-14 10:07:19.255 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:07:20 localhost ceph-mgr[302471]: log_channel(cluster) log [DBG] : pgmap v19: 177 pgs: 177 active+clean; 104 MiB data, 587 MiB used, 41 GiB / 42 GiB avail Oct 14 06:07:20 localhost nova_compute[297686]: 2025-10-14 10:07:20.493 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:07:21 localhost nova_compute[297686]: 2025-10-14 10:07:21.255 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:07:21 localhost nova_compute[297686]: 2025-10-14 10:07:21.256 2 DEBUG nova.compute.manager [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Oct 14 06:07:21 localhost nova_compute[297686]: 2025-10-14 10:07:21.256 2 DEBUG nova.compute.manager [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Oct 14 06:07:21 localhost systemd[1]: Stopping User Manager for UID 1003... 
Oct 14 06:07:21 localhost systemd[314478]: Activating special unit Exit the Session... Oct 14 06:07:21 localhost systemd[314478]: Stopped target Main User Target. Oct 14 06:07:21 localhost systemd[314478]: Stopped target Basic System. Oct 14 06:07:21 localhost systemd[314478]: Stopped target Paths. Oct 14 06:07:21 localhost systemd[314478]: Stopped target Sockets. Oct 14 06:07:21 localhost systemd[314478]: Stopped target Timers. Oct 14 06:07:21 localhost systemd[314478]: Stopped Mark boot as successful after the user session has run 2 minutes. Oct 14 06:07:21 localhost systemd[314478]: Stopped Daily Cleanup of User's Temporary Directories. Oct 14 06:07:21 localhost systemd[314478]: Closed D-Bus User Message Bus Socket. Oct 14 06:07:21 localhost systemd[314478]: Stopped Create User's Volatile Files and Directories. Oct 14 06:07:21 localhost systemd[314478]: Removed slice User Application Slice. Oct 14 06:07:21 localhost systemd[314478]: Reached target Shutdown. Oct 14 06:07:21 localhost systemd[314478]: Finished Exit the Session. Oct 14 06:07:21 localhost systemd[314478]: Reached target Exit the Session. Oct 14 06:07:21 localhost systemd[1]: user@1003.service: Deactivated successfully. Oct 14 06:07:21 localhost systemd[1]: Stopped User Manager for UID 1003. Oct 14 06:07:21 localhost systemd[1]: Stopping User Runtime Directory /run/user/1003... Oct 14 06:07:21 localhost systemd[1]: run-user-1003.mount: Deactivated successfully. Oct 14 06:07:21 localhost systemd[1]: user-runtime-dir@1003.service: Deactivated successfully. Oct 14 06:07:21 localhost systemd[1]: Stopped User Runtime Directory /run/user/1003. Oct 14 06:07:21 localhost systemd[1]: Removed slice User Slice of UID 1003. Oct 14 06:07:21 localhost systemd[1]: user-1003.slice: Consumed 2.277s CPU time. 
Oct 14 06:07:22 localhost ceph-mgr[302471]: log_channel(cluster) log [DBG] : pgmap v20: 177 pgs: 177 active+clean; 104 MiB data, 587 MiB used, 41 GiB / 42 GiB avail Oct 14 06:07:22 localhost nova_compute[297686]: 2025-10-14 10:07:22.351 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Acquiring lock "refresh_cache-88c4e366-b765-47a6-96bf-f7677f2ce67c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Oct 14 06:07:22 localhost nova_compute[297686]: 2025-10-14 10:07:22.351 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Acquired lock "refresh_cache-88c4e366-b765-47a6-96bf-f7677f2ce67c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Oct 14 06:07:22 localhost nova_compute[297686]: 2025-10-14 10:07:22.352 2 DEBUG nova.network.neutron [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] [instance: 88c4e366-b765-47a6-96bf-f7677f2ce67c] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Oct 14 06:07:22 localhost nova_compute[297686]: 2025-10-14 10:07:22.352 2 DEBUG nova.objects.instance [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Lazy-loading 'info_cache' on Instance uuid 88c4e366-b765-47a6-96bf-f7677f2ce67c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Oct 14 06:07:22 localhost nova_compute[297686]: 2025-10-14 10:07:22.755 2 DEBUG nova.network.neutron [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] [instance: 88c4e366-b765-47a6-96bf-f7677f2ce67c] Updating instance_info_cache with network_info: [{"id": "3ec9b060-f43d-4698-9c76-6062c70911d5", "address": "fa:16:3e:84:5e:e5", "network": {"id": "7d0cd696-bdd7-4e70-9512-eb0d23640314", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": 
"192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.46", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "41187b090f3d4818a32baa37ce8a3991", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3ec9b060-f4", "ovs_interfaceid": "3ec9b060-f43d-4698-9c76-6062c70911d5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Oct 14 06:07:22 localhost nova_compute[297686]: 2025-10-14 10:07:22.770 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Releasing lock "refresh_cache-88c4e366-b765-47a6-96bf-f7677f2ce67c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Oct 14 06:07:22 localhost nova_compute[297686]: 2025-10-14 10:07:22.770 2 DEBUG nova.compute.manager [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] [instance: 88c4e366-b765-47a6-96bf-f7677f2ce67c] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Oct 14 06:07:22 localhost nova_compute[297686]: 2025-10-14 10:07:22.771 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:07:22 localhost nova_compute[297686]: 2025-10-14 
10:07:22.771 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:07:22 localhost nova_compute[297686]: 2025-10-14 10:07:22.771 2 DEBUG nova.compute.manager [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Oct 14 06:07:23 localhost nova_compute[297686]: 2025-10-14 10:07:23.239 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:07:23 localhost nova_compute[297686]: 2025-10-14 10:07:23.255 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:07:23 localhost nova_compute[297686]: 2025-10-14 10:07:23.255 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:07:24 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e83 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Oct 14 06:07:24 localhost nova_compute[297686]: 2025-10-14 10:07:24.256 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:07:24 localhost ceph-mgr[302471]: log_channel(cluster) log [DBG] : 
pgmap v21: 177 pgs: 177 active+clean; 104 MiB data, 587 MiB used, 41 GiB / 42 GiB avail Oct 14 06:07:25 localhost nova_compute[297686]: 2025-10-14 10:07:25.528 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:07:26 localhost nova_compute[297686]: 2025-10-14 10:07:26.255 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:07:26 localhost ceph-mgr[302471]: log_channel(cluster) log [DBG] : pgmap v22: 177 pgs: 177 active+clean; 104 MiB data, 587 MiB used, 41 GiB / 42 GiB avail Oct 14 06:07:26 localhost nova_compute[297686]: 2025-10-14 10:07:26.282 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Oct 14 06:07:26 localhost nova_compute[297686]: 2025-10-14 10:07:26.283 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Oct 14 06:07:26 localhost nova_compute[297686]: 2025-10-14 10:07:26.283 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Oct 14 06:07:26 localhost nova_compute[297686]: 2025-10-14 10:07:26.284 2 DEBUG 
nova.compute.resource_tracker [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Auditing locally available compute resources for np0005486733.localdomain (node: np0005486733.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Oct 14 06:07:26 localhost nova_compute[297686]: 2025-10-14 10:07:26.284 2 DEBUG oslo_concurrency.processutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Oct 14 06:07:26 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Oct 14 06:07:26 localhost ceph-mon[317114]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/2949507746' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Oct 14 06:07:26 localhost nova_compute[297686]: 2025-10-14 10:07:26.749 2 DEBUG oslo_concurrency.processutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.465s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Oct 14 06:07:26 localhost nova_compute[297686]: 2025-10-14 10:07:26.811 2 DEBUG nova.virt.libvirt.driver [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Oct 14 06:07:26 localhost nova_compute[297686]: 2025-10-14 10:07:26.812 2 DEBUG nova.virt.libvirt.driver [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config 
/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Oct 14 06:07:26 localhost nova_compute[297686]: 2025-10-14 10:07:26.972 2 WARNING nova.virt.libvirt.driver [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m Oct 14 06:07:26 localhost nova_compute[297686]: 2025-10-14 10:07:26.973 2 DEBUG nova.compute.resource_tracker [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Hypervisor/Node resource view: name=np0005486733.localdomain free_ram=11387MB free_disk=41.83695602416992GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Oct 14 06:07:26 localhost nova_compute[297686]: 2025-10-14 10:07:26.974 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Oct 14 06:07:26 localhost nova_compute[297686]: 2025-10-14 10:07:26.974 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Oct 14 06:07:27 localhost nova_compute[297686]: 2025-10-14 10:07:27.064 2 DEBUG nova.compute.resource_tracker [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Instance 88c4e366-b765-47a6-96bf-f7677f2ce67c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Oct 14 06:07:27 localhost nova_compute[297686]: 2025-10-14 10:07:27.064 2 DEBUG nova.compute.resource_tracker [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Oct 14 06:07:27 localhost nova_compute[297686]: 2025-10-14 10:07:27.065 2 DEBUG nova.compute.resource_tracker [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Final resource view: name=np0005486733.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Oct 14 06:07:27 localhost nova_compute[297686]: 2025-10-14 10:07:27.110 2 DEBUG oslo_concurrency.processutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Oct 14 06:07:27 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Oct 14 06:07:27 localhost ceph-mon[317114]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.108:0/385930853' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Oct 14 06:07:27 localhost nova_compute[297686]: 2025-10-14 10:07:27.558 2 DEBUG oslo_concurrency.processutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.449s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Oct 14 06:07:27 localhost nova_compute[297686]: 2025-10-14 10:07:27.565 2 DEBUG nova.compute.provider_tree [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Inventory has not changed in ProviderTree for provider: 18c24273-aca2-4f08-be57-3188d558235e update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Oct 14 06:07:27 localhost nova_compute[297686]: 2025-10-14 10:07:27.591 2 DEBUG nova.scheduler.client.report [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Inventory has not changed for provider 18c24273-aca2-4f08-be57-3188d558235e based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Oct 14 06:07:27 localhost nova_compute[297686]: 2025-10-14 10:07:27.593 2 DEBUG nova.compute.resource_tracker [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Compute_service record updated for np0005486733.localdomain:np0005486733.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Oct 14 06:07:27 localhost nova_compute[297686]: 2025-10-14 10:07:27.594 2 DEBUG oslo_concurrency.lockutils [None 
req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.620s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Oct 14 06:07:28 localhost nova_compute[297686]: 2025-10-14 10:07:28.270 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:07:28 localhost ceph-mgr[302471]: log_channel(cluster) log [DBG] : pgmap v23: 177 pgs: 177 active+clean; 104 MiB data, 587 MiB used, 41 GiB / 42 GiB avail Oct 14 06:07:28 localhost podman[248187]: time="2025-10-14T10:07:28Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Oct 14 06:07:28 localhost podman[248187]: @ - - [14/Oct/2025:10:07:28 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 147497 "" "Go-http-client/1.1" Oct 14 06:07:28 localhost podman[248187]: @ - - [14/Oct/2025:10:07:28 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19849 "" "Go-http-client/1.1" Oct 14 06:07:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f. Oct 14 06:07:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917. Oct 14 06:07:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d. 
Oct 14 06:07:28 localhost podman[319947]: 2025-10-14 10:07:28.749308259 +0000 UTC m=+0.085734218 container health_status 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5) Oct 14 06:07:28 localhost podman[319947]: 2025-10-14 10:07:28.793502853 +0000 UTC m=+0.129928802 container exec_died 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3) Oct 14 06:07:28 localhost systemd[1]: tmp-crun.9VlBRC.mount: Deactivated successfully. Oct 14 06:07:28 localhost systemd[1]: 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f.service: Deactivated successfully. 
Oct 14 06:07:28 localhost podman[319948]: 2025-10-14 10:07:28.863158716 +0000 UTC m=+0.198834102 container health_status 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., distribution-scope=public, vcs-type=git, vendor=Red Hat, Inc., version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc.) Oct 14 06:07:28 localhost podman[319949]: 2025-10-14 10:07:28.82770683 +0000 UTC m=+0.158539228 container health_status 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', 
'/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_id=edpm) Oct 14 06:07:28 localhost podman[319949]: 2025-10-14 10:07:28.911322262 +0000 UTC m=+0.242154700 container exec_died 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', 
'/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}) Oct 14 06:07:28 localhost systemd[1]: 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d.service: Deactivated successfully. Oct 14 06:07:28 localhost podman[319948]: 2025-10-14 10:07:28.931667695 +0000 UTC m=+0.267343051 container exec_died 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, config_id=edpm, release=1755695350, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, name=ubi9-minimal, build-date=2025-08-20T13:12:41, vendor=Red Hat, Inc., managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers) Oct 14 06:07:28 localhost systemd[1]: 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917.service: Deactivated successfully. 
Oct 14 06:07:29 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e83 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Oct 14 06:07:30 localhost ceph-mgr[302471]: log_channel(cluster) log [DBG] : pgmap v24: 177 pgs: 177 active+clean; 104 MiB data, 587 MiB used, 41 GiB / 42 GiB avail Oct 14 06:07:30 localhost nova_compute[297686]: 2025-10-14 10:07:30.578 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:07:32 localhost ceph-mgr[302471]: log_channel(cluster) log [DBG] : pgmap v25: 177 pgs: 177 active+clean; 104 MiB data, 587 MiB used, 41 GiB / 42 GiB avail Oct 14 06:07:33 localhost nova_compute[297686]: 2025-10-14 10:07:33.314 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:07:34 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e83 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Oct 14 06:07:34 localhost ceph-mgr[302471]: log_channel(cluster) log [DBG] : pgmap v26: 177 pgs: 177 active+clean; 104 MiB data, 587 MiB used, 41 GiB / 42 GiB avail Oct 14 06:07:35 localhost nova_compute[297686]: 2025-10-14 10:07:35.602 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:07:36 localhost ceph-mgr[302471]: log_channel(cluster) log [DBG] : pgmap v27: 177 pgs: 177 active+clean; 104 MiB data, 587 MiB used, 41 GiB / 42 GiB avail Oct 14 06:07:38 localhost ceph-mgr[302471]: log_channel(cluster) log [DBG] : pgmap v28: 177 pgs: 177 active+clean; 104 MiB data, 587 MiB used, 41 GiB / 42 GiB avail Oct 14 06:07:38 localhost nova_compute[297686]: 2025-10-14 10:07:38.316 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:07:38 localhost openstack_network_exporter[250374]: ERROR 10:07:38 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Oct 14 06:07:38 localhost openstack_network_exporter[250374]: ERROR 10:07:38 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Oct 14 06:07:38 localhost openstack_network_exporter[250374]: ERROR 10:07:38 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Oct 14 06:07:38 localhost openstack_network_exporter[250374]: ERROR 10:07:38 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Oct 14 06:07:38 localhost openstack_network_exporter[250374]: Oct 14 06:07:38 localhost openstack_network_exporter[250374]: ERROR 10:07:38 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Oct 14 06:07:38 localhost openstack_network_exporter[250374]: Oct 14 06:07:39 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e83 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Oct 14 06:07:40 localhost ceph-mgr[302471]: log_channel(cluster) log [DBG] : pgmap v29: 177 pgs: 177 active+clean; 104 MiB data, 587 MiB used, 41 GiB / 42 GiB avail Oct 14 06:07:40 localhost nova_compute[297686]: 2025-10-14 10:07:40.612 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:07:42 localhost ceph-mgr[302471]: log_channel(cluster) log [DBG] : pgmap v30: 177 pgs: 177 active+clean; 104 MiB data, 587 MiB used, 41 GiB / 42 GiB avail Oct 14 06:07:43 localhost nova_compute[297686]: 2025-10-14 10:07:43.317 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:07:43 localhost systemd[1]: 
Started /usr/bin/podman healthcheck run 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041. Oct 14 06:07:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857. Oct 14 06:07:43 localhost podman[320013]: 2025-10-14 10:07:43.739473616 +0000 UTC m=+0.079433015 container health_status 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251009, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}) Oct 14 06:07:43 localhost podman[320012]: 2025-10-14 10:07:43.791893061 +0000 UTC m=+0.132973644 container health_status 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Oct 14 06:07:43 localhost podman[320012]: 2025-10-14 10:07:43.795973217 +0000 UTC m=+0.137053780 container exec_died 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 
'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Oct 14 06:07:43 localhost systemd[1]: 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041.service: Deactivated successfully. Oct 14 06:07:43 localhost podman[320013]: 2025-10-14 10:07:43.825728579 +0000 UTC m=+0.165687968 container exec_died 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.build-date=20251009) Oct 14 06:07:43 localhost systemd[1]: 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857.service: Deactivated successfully. Oct 14 06:07:44 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e83 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Oct 14 06:07:44 localhost ceph-mgr[302471]: log_channel(cluster) log [DBG] : pgmap v31: 177 pgs: 177 active+clean; 104 MiB data, 587 MiB used, 41 GiB / 42 GiB avail Oct 14 06:07:45 localhost nova_compute[297686]: 2025-10-14 10:07:45.650 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:07:46 localhost ceph-mgr[302471]: log_channel(cluster) log [DBG] : pgmap v32: 177 pgs: 177 active+clean; 104 MiB data, 587 MiB used, 41 GiB / 42 GiB avail Oct 14 06:07:47 localhost ceph-mgr[302471]: log_channel(audit) log [DBG] : from='client.64181 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""], "format": "json"}]: dispatch Oct 14 06:07:48 localhost ceph-mgr[302471]: log_channel(cluster) log [DBG] : pgmap v33: 177 pgs: 177 active+clean; 104 MiB data, 587 MiB used, 41 GiB / 42 GiB avail Oct 14 06:07:48 localhost nova_compute[297686]: 2025-10-14 10:07:48.352 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:07:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad. 
Oct 14 06:07:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec. Oct 14 06:07:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29. Oct 14 06:07:48 localhost podman[320054]: 2025-10-14 10:07:48.758917092 +0000 UTC m=+0.100440788 container health_status 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251009, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, 
container_name=multipathd, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0) Oct 14 06:07:48 localhost podman[320055]: 2025-10-14 10:07:48.735412021 +0000 UTC m=+0.075988949 container health_status aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors ) Oct 14 06:07:48 localhost podman[320054]: 2025-10-14 10:07:48.799067121 +0000 UTC m=+0.140590827 container exec_died 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_data={'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true) Oct 14 06:07:48 localhost podman[320056]: 2025-10-14 10:07:48.807463298 +0000 UTC m=+0.141479815 container health_status fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, org.label-schema.build-date=20251009, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true, org.label-schema.vendor=CentOS, container_name=iscsid, maintainer=OpenStack Kubernetes 
Operator team, org.label-schema.license=GPLv2, config_id=iscsid, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image) Oct 14 06:07:48 localhost systemd[1]: 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad.service: Deactivated successfully. 
Oct 14 06:07:48 localhost podman[320055]: 2025-10-14 10:07:48.819232459 +0000 UTC m=+0.159809377 container exec_died aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Oct 14 06:07:48 localhost systemd[1]: aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec.service: Deactivated successfully. 
Oct 14 06:07:48 localhost podman[320056]: 2025-10-14 10:07:48.844198384 +0000 UTC m=+0.178214861 container exec_died fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, tcib_managed=true, config_id=iscsid, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=iscsid, org.label-schema.schema-version=1.0, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}) Oct 14 06:07:48 localhost systemd[1]: fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29.service: Deactivated successfully. 
Oct 14 06:07:48 localhost ceph-mgr[302471]: [balancer INFO root] Optimize plan auto_2025-10-14_10:07:48 Oct 14 06:07:48 localhost ceph-mgr[302471]: [balancer INFO root] Mode upmap, max misplaced 0.050000 Oct 14 06:07:48 localhost ceph-mgr[302471]: [balancer INFO root] do_upmap Oct 14 06:07:48 localhost ceph-mgr[302471]: [balancer INFO root] pools ['.mgr', 'manila_metadata', 'volumes', 'images', 'backups', 'vms', 'manila_data'] Oct 14 06:07:48 localhost ceph-mgr[302471]: [balancer INFO root] prepared 0/10 changes Oct 14 06:07:49 localhost ceph-mgr[302471]: [pg_autoscaler INFO root] _maybe_adjust Oct 14 06:07:49 localhost ceph-mgr[302471]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784 Oct 14 06:07:49 localhost ceph-mgr[302471]: [pg_autoscaler INFO root] Pool '.mgr' root_id -1 using 3.080724804578448e-05 of space, bias 1.0, pg target 0.006161449609156895 quantized to 1 (current 1) Oct 14 06:07:49 localhost ceph-mgr[302471]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784 Oct 14 06:07:49 localhost ceph-mgr[302471]: [pg_autoscaler INFO root] Pool 'vms' root_id -1 using 0.003325819636376326 of space, bias 1.0, pg target 0.6651639272752652 quantized to 32 (current 32) Oct 14 06:07:49 localhost ceph-mgr[302471]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784 Oct 14 06:07:49 localhost ceph-mgr[302471]: [pg_autoscaler INFO root] Pool 'volumes' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32) Oct 14 06:07:49 localhost ceph-mgr[302471]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784 Oct 14 06:07:49 localhost ceph-mgr[302471]: [pg_autoscaler INFO root] Pool 'images' root_id -1 using 0.0014449417225013959 of space, bias 1.0, pg target 0.2885066972594454 quantized to 32 (current 32) Oct 14 06:07:49 localhost ceph-mgr[302471]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784 Oct 14 06:07:49 localhost ceph-mgr[302471]: 
[pg_autoscaler INFO root] Pool 'backups' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32) Oct 14 06:07:49 localhost ceph-mgr[302471]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784 Oct 14 06:07:49 localhost ceph-mgr[302471]: [pg_autoscaler INFO root] Pool 'manila_data' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32) Oct 14 06:07:49 localhost ceph-mgr[302471]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784 Oct 14 06:07:49 localhost ceph-mgr[302471]: [pg_autoscaler INFO root] Pool 'manila_metadata' root_id -1 using 2.453674623115578e-06 of space, bias 4.0, pg target 0.0019596681323283084 quantized to 16 (current 16) Oct 14 06:07:49 localhost ceph-mgr[302471]: [volumes INFO mgr_util] scanning for idle connections.. Oct 14 06:07:49 localhost ceph-mgr[302471]: [volumes INFO mgr_util] cleaning up connections: [] Oct 14 06:07:49 localhost ceph-mgr[302471]: [volumes INFO mgr_util] scanning for idle connections.. Oct 14 06:07:49 localhost ceph-mgr[302471]: [volumes INFO mgr_util] cleaning up connections: [] Oct 14 06:07:49 localhost ceph-mgr[302471]: [volumes INFO mgr_util] scanning for idle connections.. 
Oct 14 06:07:49 localhost ceph-mgr[302471]: [volumes INFO mgr_util] cleaning up connections: [] Oct 14 06:07:49 localhost ceph-mgr[302471]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: load_schedules Oct 14 06:07:49 localhost ceph-mgr[302471]: [rbd_support INFO root] load_schedules: vms, start_after= Oct 14 06:07:49 localhost ceph-mgr[302471]: [rbd_support INFO root] load_schedules: volumes, start_after= Oct 14 06:07:49 localhost ceph-mgr[302471]: [rbd_support INFO root] load_schedules: images, start_after= Oct 14 06:07:49 localhost ceph-mgr[302471]: [rbd_support INFO root] load_schedules: backups, start_after= Oct 14 06:07:49 localhost ceph-mgr[302471]: [rbd_support INFO root] TrashPurgeScheduleHandler: load_schedules Oct 14 06:07:49 localhost ceph-mgr[302471]: [rbd_support INFO root] load_schedules: vms, start_after= Oct 14 06:07:49 localhost ceph-mgr[302471]: [rbd_support INFO root] load_schedules: volumes, start_after= Oct 14 06:07:49 localhost ceph-mgr[302471]: [rbd_support INFO root] load_schedules: images, start_after= Oct 14 06:07:49 localhost ceph-mgr[302471]: [rbd_support INFO root] load_schedules: backups, start_after= Oct 14 06:07:49 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e83 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Oct 14 06:07:50 localhost ceph-mgr[302471]: log_channel(cluster) log [DBG] : pgmap v34: 177 pgs: 177 active+clean; 104 MiB data, 587 MiB used, 41 GiB / 42 GiB avail Oct 14 06:07:50 localhost nova_compute[297686]: 2025-10-14 10:07:50.691 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:07:52 localhost ceph-mgr[302471]: log_channel(cluster) log [DBG] : pgmap v35: 177 pgs: 177 active+clean; 104 MiB data, 587 MiB used, 41 GiB / 42 GiB avail Oct 14 06:07:53 localhost nova_compute[297686]: 2025-10-14 10:07:53.385 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] 
on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:07:54 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix": "config dump", "format": "json"} v 0) Oct 14 06:07:54 localhost ceph-mon[317114]: log_channel(audit) log [DBG] : from='client.? 172.18.0.200:0/2911511386' entity='client.admin' cmd={"prefix": "config dump", "format": "json"} : dispatch Oct 14 06:07:54 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e83 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Oct 14 06:07:54 localhost ceph-mgr[302471]: log_channel(cluster) log [DBG] : pgmap v36: 177 pgs: 177 active+clean; 104 MiB data, 587 MiB used, 41 GiB / 42 GiB avail Oct 14 06:07:55 localhost nova_compute[297686]: 2025-10-14 10:07:55.738 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:07:56 localhost ceph-mgr[302471]: log_channel(cluster) log [DBG] : pgmap v37: 177 pgs: 177 active+clean; 104 MiB data, 587 MiB used, 41 GiB / 42 GiB avail Oct 14 06:07:57 localhost ovn_metadata_agent[163050]: 2025-10-14 10:07:57.776 163055 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Oct 14 06:07:57 localhost ovn_metadata_agent[163050]: 2025-10-14 10:07:57.777 163055 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Oct 14 06:07:57 localhost ovn_metadata_agent[163050]: 2025-10-14 10:07:57.777 163055 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by 
"neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Oct 14 06:07:58 localhost ceph-mgr[302471]: log_channel(cluster) log [DBG] : pgmap v38: 177 pgs: 177 active+clean; 104 MiB data, 587 MiB used, 41 GiB / 42 GiB avail Oct 14 06:07:58 localhost podman[248187]: time="2025-10-14T10:07:58Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Oct 14 06:07:58 localhost podman[248187]: @ - - [14/Oct/2025:10:07:58 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 147497 "" "Go-http-client/1.1" Oct 14 06:07:58 localhost podman[248187]: @ - - [14/Oct/2025:10:07:58 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19855 "" "Go-http-client/1.1" Oct 14 06:07:58 localhost nova_compute[297686]: 2025-10-14 10:07:58.435 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:07:59 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e83 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Oct 14 06:07:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f. Oct 14 06:07:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917. Oct 14 06:07:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d. 
Oct 14 06:07:59 localhost podman[320117]: 2025-10-14 10:07:59.757818361 +0000 UTC m=+0.093920369 container health_status 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=edpm, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2) Oct 14 06:07:59 localhost podman[320117]: 2025-10-14 10:07:59.766544948 +0000 UTC m=+0.102646946 container exec_died 
89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251009, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true) Oct 14 06:07:59 localhost systemd[1]: 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d.service: Deactivated successfully. 
Oct 14 06:07:59 localhost podman[320116]: 2025-10-14 10:07:59.855439742 +0000 UTC m=+0.193480608 container health_status 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.openshift.tags=minimal rhel9, version=9.6, vendor=Red Hat, Inc., container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, vcs-type=git, architecture=x86_64, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, release=1755695350, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., name=ubi9-minimal) Oct 14 06:07:59 localhost podman[320115]: 2025-10-14 10:07:59.908178016 +0000 UTC m=+0.248904455 container health_status 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, 
io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, config_id=ovn_controller) Oct 14 06:07:59 localhost podman[320116]: 2025-10-14 10:07:59.922877567 +0000 UTC m=+0.260918423 container exec_died 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, vendor=Red Hat, Inc., io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, 
vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, managed_by=edpm_ansible, distribution-scope=public, config_id=edpm, release=1755695350, maintainer=Red Hat, Inc., name=ubi9-minimal, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64) Oct 14 06:07:59 localhost systemd[1]: 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917.service: Deactivated successfully. 
Oct 14 06:07:59 localhost podman[320115]: 2025-10-14 10:07:59.971196007 +0000 UTC m=+0.311922486 container exec_died 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, managed_by=edpm_ansible) Oct 14 06:07:59 localhost systemd[1]: 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f.service: Deactivated successfully. 
Oct 14 06:08:00 localhost ceph-mgr[302471]: log_channel(audit) log [DBG] : from='client.64199 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""], "format": "json"}]: dispatch Oct 14 06:08:00 localhost ceph-mgr[302471]: log_channel(cluster) log [DBG] : pgmap v39: 177 pgs: 177 active+clean; 104 MiB data, 587 MiB used, 41 GiB / 42 GiB avail Oct 14 06:08:00 localhost nova_compute[297686]: 2025-10-14 10:08:00.775 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:08:02 localhost ceph-mgr[302471]: log_channel(cluster) log [DBG] : pgmap v40: 177 pgs: 177 active+clean; 104 MiB data, 587 MiB used, 41 GiB / 42 GiB avail Oct 14 06:08:03 localhost nova_compute[297686]: 2025-10-14 10:08:03.477 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:08:04 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e83 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Oct 14 06:08:04 localhost ceph-mgr[302471]: log_channel(cluster) log [DBG] : pgmap v41: 177 pgs: 177 active+clean; 104 MiB data, 587 MiB used, 41 GiB / 42 GiB avail Oct 14 06:08:05 localhost nova_compute[297686]: 2025-10-14 10:08:05.778 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:08:06 localhost ceph-mgr[302471]: log_channel(cluster) log [DBG] : pgmap v42: 177 pgs: 177 active+clean; 104 MiB data, 587 MiB used, 41 GiB / 42 GiB avail Oct 14 06:08:06 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Oct 14 06:08:06 localhost ceph-mon[317114]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.200:0/2777076442' entity='client.admin' cmd={"prefix": "mon dump", "format": "json"} : dispatch Oct 14 06:08:08 localhost ceph-mgr[302471]: log_channel(cluster) log [DBG] : pgmap v43: 177 pgs: 177 active+clean; 104 MiB data, 587 MiB used, 41 GiB / 42 GiB avail Oct 14 06:08:08 localhost nova_compute[297686]: 2025-10-14 10:08:08.510 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:08:08 localhost openstack_network_exporter[250374]: ERROR 10:08:08 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Oct 14 06:08:08 localhost openstack_network_exporter[250374]: Oct 14 06:08:08 localhost openstack_network_exporter[250374]: ERROR 10:08:08 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Oct 14 06:08:08 localhost openstack_network_exporter[250374]: ERROR 10:08:08 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Oct 14 06:08:08 localhost openstack_network_exporter[250374]: Oct 14 06:08:08 localhost openstack_network_exporter[250374]: ERROR 10:08:08 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Oct 14 06:08:08 localhost openstack_network_exporter[250374]: ERROR 10:08:08 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Oct 14 06:08:08 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix": "mgr fail"} v 0) Oct 14 06:08:08 localhost ceph-mon[317114]: log_channel(audit) log [INF] : from='client.? 
172.18.0.200:0/2791224686' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch Oct 14 06:08:08 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e84 e84: 6 total, 6 up, 6 in Oct 14 06:08:09 localhost ceph-mgr[302471]: mgr handle_mgr_map I was active but no longer am Oct 14 06:08:09 localhost ceph-mgr[302471]: mgr respawn e: '/usr/bin/ceph-mgr' Oct 14 06:08:09 localhost ceph-mgr[302471]: mgr respawn 0: '/usr/bin/ceph-mgr' Oct 14 06:08:09 localhost ceph-mgr[302471]: mgr respawn 1: '-n' Oct 14 06:08:09 localhost ceph-mgr[302471]: mgr respawn 2: 'mgr.np0005486733.primvu' Oct 14 06:08:09 localhost ceph-mgr[302471]: mgr respawn 3: '-f' Oct 14 06:08:09 localhost ceph-mgr[302471]: mgr respawn 4: '--setuser' Oct 14 06:08:09 localhost ceph-mgr[302471]: mgr respawn 5: 'ceph' Oct 14 06:08:09 localhost ceph-mgr[302471]: mgr respawn 6: '--setgroup' Oct 14 06:08:09 localhost ceph-mgr[302471]: mgr respawn 7: 'ceph' Oct 14 06:08:09 localhost ceph-mgr[302471]: mgr respawn 8: '--default-log-to-file=false' Oct 14 06:08:09 localhost ceph-mgr[302471]: mgr respawn 9: '--default-log-to-journald=true' Oct 14 06:08:09 localhost ceph-mgr[302471]: mgr respawn 10: '--default-log-to-stderr=false' Oct 14 06:08:09 localhost ceph-mgr[302471]: mgr respawn respawning with exe /usr/bin/ceph-mgr Oct 14 06:08:09 localhost ceph-mgr[302471]: mgr respawn exe_path /proc/self/exe Oct 14 06:08:09 localhost ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-mgr-np0005486733-primvu[302467]: 2025-10-14T10:08:09.010+0000 7f1488b6d640 -1 mgr handle_mgr_map I was active but no longer am Oct 14 06:08:09 localhost systemd[1]: session-72.scope: Deactivated successfully. Oct 14 06:08:09 localhost systemd[1]: session-72.scope: Consumed 11.742s CPU time. Oct 14 06:08:09 localhost systemd-logind[760]: Session 72 logged out. Waiting for processes to exit. Oct 14 06:08:09 localhost systemd-logind[760]: Removed session 72. 
Oct 14 06:08:09 localhost ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-mgr-np0005486733-primvu[302467]: ignoring --setuser ceph since I am not root Oct 14 06:08:09 localhost ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-mgr-np0005486733-primvu[302467]: ignoring --setgroup ceph since I am not root Oct 14 06:08:09 localhost ceph-mgr[302471]: ceph version 18.2.1-361.el9cp (439dcd6094d413840eb2ec590fe2194ec616687f) reef (stable), process ceph-mgr, pid 2 Oct 14 06:08:09 localhost ceph-mgr[302471]: pidfile_write: ignore empty --pid-file Oct 14 06:08:09 localhost ceph-mgr[302471]: mgr[py] Loading python module 'alerts' Oct 14 06:08:09 localhost ceph-mgr[302471]: mgr[py] Module alerts has missing NOTIFY_TYPES member Oct 14 06:08:09 localhost ceph-mgr[302471]: mgr[py] Loading python module 'balancer' Oct 14 06:08:09 localhost ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-mgr-np0005486733-primvu[302467]: 2025-10-14T10:08:09.211+0000 7f16228b1140 -1 mgr[py] Module alerts has missing NOTIFY_TYPES member Oct 14 06:08:09 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e84 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Oct 14 06:08:09 localhost ceph-mgr[302471]: mgr[py] Module balancer has missing NOTIFY_TYPES member Oct 14 06:08:09 localhost ceph-mgr[302471]: mgr[py] Loading python module 'cephadm' Oct 14 06:08:09 localhost ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-mgr-np0005486733-primvu[302467]: 2025-10-14T10:08:09.278+0000 7f16228b1140 -1 mgr[py] Module balancer has missing NOTIFY_TYPES member Oct 14 06:08:09 localhost sshd[320203]: main: sshd: ssh-rsa algorithm is disabled Oct 14 06:08:09 localhost systemd-logind[760]: New session 73 of user ceph-admin. Oct 14 06:08:09 localhost systemd[1]: Started Session 73 of User ceph-admin. Oct 14 06:08:09 localhost ceph-mon[317114]: from='client.? 
' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch Oct 14 06:08:09 localhost ceph-mon[317114]: Activating manager daemon np0005486731.swasqz Oct 14 06:08:09 localhost ceph-mon[317114]: from='client.? 172.18.0.200:0/2791224686' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch Oct 14 06:08:09 localhost ceph-mon[317114]: from='client.? ' entity='client.admin' cmd='[{"prefix": "mgr fail"}]': finished Oct 14 06:08:09 localhost ceph-mon[317114]: Manager daemon np0005486731.swasqz is now available Oct 14 06:08:09 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005486731.swasqz/mirror_snapshot_schedule"} : dispatch Oct 14 06:08:09 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005486731.swasqz/trash_purge_schedule"} : dispatch Oct 14 06:08:09 localhost ceph-mgr[302471]: mgr[py] Loading python module 'crash' Oct 14 06:08:09 localhost ceph-mgr[302471]: mgr[py] Module crash has missing NOTIFY_TYPES member Oct 14 06:08:09 localhost ceph-mgr[302471]: mgr[py] Loading python module 'dashboard' Oct 14 06:08:09 localhost ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-mgr-np0005486733-primvu[302467]: 2025-10-14T10:08:09.953+0000 7f16228b1140 -1 mgr[py] Module crash has missing NOTIFY_TYPES member Oct 14 06:08:10 localhost systemd[1]: tmp-crun.Z0I6Cf.mount: Deactivated successfully. 
Oct 14 06:08:10 localhost podman[320319]: 2025-10-14 10:08:10.428060412 +0000 UTC m=+0.102857442 container exec b19a91314e045cbe5465f6abdeb6b420108fcda7860b498b01dcd4e65236d2c8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-crash-np0005486733, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, io.openshift.expose-services=, maintainer=Guillaume Abrioux , GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, release=553, build-date=2025-09-24T08:57:55, vendor=Red Hat, Inc., GIT_CLEAN=True, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, distribution-scope=public, GIT_BRANCH=main, RELEASE=main, io.buildah.version=1.33.12, CEPH_POINT_RELEASE=, name=rhceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, description=Red Hat Ceph Storage 7, architecture=x86_64) Oct 14 06:08:10 localhost ceph-mgr[302471]: mgr[py] Loading python module 'devicehealth' Oct 14 06:08:10 localhost podman[320319]: 2025-10-14 10:08:10.521429282 +0000 UTC m=+0.196226382 container exec_died b19a91314e045cbe5465f6abdeb6b420108fcda7860b498b01dcd4e65236d2c8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-crash-np0005486733, distribution-scope=public, build-date=2025-09-24T08:57:55, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, CEPH_POINT_RELEASE=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, version=7, 
com.redhat.license_terms=https://www.redhat.com/agreements, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, vendor=Red Hat, Inc., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.tags=rhceph ceph, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, name=rhceph, RELEASE=main, io.buildah.version=1.33.12, io.k8s.description=Red Hat Ceph Storage 7, release=553, maintainer=Guillaume Abrioux , summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, architecture=x86_64, GIT_BRANCH=main, com.redhat.component=rhceph-container, ceph=True) Oct 14 06:08:10 localhost ceph-mgr[302471]: mgr[py] Module devicehealth has missing NOTIFY_TYPES member Oct 14 06:08:10 localhost ceph-mgr[302471]: mgr[py] Loading python module 'diskprediction_local' Oct 14 06:08:10 localhost ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-mgr-np0005486733-primvu[302467]: 2025-10-14T10:08:10.525+0000 7f16228b1140 -1 mgr[py] Module devicehealth has missing NOTIFY_TYPES member Oct 14 06:08:10 localhost ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-mgr-np0005486733-primvu[302467]: /lib64/python3.9/site-packages/scipy/__init__.py:73: UserWarning: NumPy was imported from a Python sub-interpreter but NumPy does not properly support sub-interpreters. This will likely work for most users but might cause hard to track down issues or subtle bugs. A common user of the rare sub-interpreter feature is wsgi which also allows single-interpreter mode. Oct 14 06:08:10 localhost ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-mgr-np0005486733-primvu[302467]: Improvements in the case of bugs are welcome, but is not on the NumPy roadmap, and full support may require significant effort to achieve. 
Oct 14 06:08:10 localhost ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-mgr-np0005486733-primvu[302467]: from numpy import show_config as show_numpy_config Oct 14 06:08:10 localhost ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-mgr-np0005486733-primvu[302467]: 2025-10-14T10:08:10.667+0000 7f16228b1140 -1 mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member Oct 14 06:08:10 localhost ceph-mgr[302471]: mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member Oct 14 06:08:10 localhost ceph-mgr[302471]: mgr[py] Loading python module 'influx' Oct 14 06:08:10 localhost ceph-mgr[302471]: mgr[py] Module influx has missing NOTIFY_TYPES member Oct 14 06:08:10 localhost ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-mgr-np0005486733-primvu[302467]: 2025-10-14T10:08:10.729+0000 7f16228b1140 -1 mgr[py] Module influx has missing NOTIFY_TYPES member Oct 14 06:08:10 localhost ceph-mgr[302471]: mgr[py] Loading python module 'insights' Oct 14 06:08:10 localhost ceph-mgr[302471]: mgr[py] Loading python module 'iostat' Oct 14 06:08:10 localhost nova_compute[297686]: 2025-10-14 10:08:10.813 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:08:10 localhost ceph-mgr[302471]: mgr[py] Module iostat has missing NOTIFY_TYPES member Oct 14 06:08:10 localhost ceph-mgr[302471]: mgr[py] Loading python module 'k8sevents' Oct 14 06:08:10 localhost ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-mgr-np0005486733-primvu[302467]: 2025-10-14T10:08:10.846+0000 7f16228b1140 -1 mgr[py] Module iostat has missing NOTIFY_TYPES member Oct 14 06:08:11 localhost ceph-mon[317114]: [14/Oct/2025:10:08:10] ENGINE Bus STARTING Oct 14 06:08:11 localhost ceph-mgr[302471]: mgr[py] Loading python module 'localpool' Oct 14 06:08:11 localhost ceph-mgr[302471]: mgr[py] Loading python module 'mds_autoscaler' Oct 14 06:08:11 localhost ceph-mgr[302471]: mgr[py] Loading python module 'mirroring' Oct 14 06:08:11 localhost 
ceph-mgr[302471]: mgr[py] Loading python module 'nfs' Oct 14 06:08:11 localhost ceph-mgr[302471]: mgr[py] Module nfs has missing NOTIFY_TYPES member Oct 14 06:08:11 localhost ceph-mgr[302471]: mgr[py] Loading python module 'orchestrator' Oct 14 06:08:11 localhost ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-mgr-np0005486733-primvu[302467]: 2025-10-14T10:08:11.629+0000 7f16228b1140 -1 mgr[py] Module nfs has missing NOTIFY_TYPES member Oct 14 06:08:11 localhost ceph-mgr[302471]: mgr[py] Module orchestrator has missing NOTIFY_TYPES member Oct 14 06:08:11 localhost ceph-mgr[302471]: mgr[py] Loading python module 'osd_perf_query' Oct 14 06:08:11 localhost ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-mgr-np0005486733-primvu[302467]: 2025-10-14T10:08:11.794+0000 7f16228b1140 -1 mgr[py] Module orchestrator has missing NOTIFY_TYPES member Oct 14 06:08:11 localhost ceph-mgr[302471]: mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member Oct 14 06:08:11 localhost ceph-mgr[302471]: mgr[py] Loading python module 'osd_support' Oct 14 06:08:11 localhost ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-mgr-np0005486733-primvu[302467]: 2025-10-14T10:08:11.860+0000 7f16228b1140 -1 mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member Oct 14 06:08:11 localhost ceph-mgr[302471]: mgr[py] Module osd_support has missing NOTIFY_TYPES member Oct 14 06:08:11 localhost ceph-mgr[302471]: mgr[py] Loading python module 'pg_autoscaler' Oct 14 06:08:11 localhost ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-mgr-np0005486733-primvu[302467]: 2025-10-14T10:08:11.917+0000 7f16228b1140 -1 mgr[py] Module osd_support has missing NOTIFY_TYPES member Oct 14 06:08:11 localhost ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-mgr-np0005486733-primvu[302467]: 2025-10-14T10:08:11.984+0000 7f16228b1140 -1 mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member Oct 14 06:08:11 localhost ceph-mgr[302471]: mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member Oct 14 06:08:11 localhost ceph-mgr[302471]: mgr[py] 
Loading python module 'progress' Oct 14 06:08:12 localhost ceph-mgr[302471]: mgr[py] Module progress has missing NOTIFY_TYPES member Oct 14 06:08:12 localhost ceph-mgr[302471]: mgr[py] Loading python module 'prometheus' Oct 14 06:08:12 localhost ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-mgr-np0005486733-primvu[302467]: 2025-10-14T10:08:12.046+0000 7f16228b1140 -1 mgr[py] Module progress has missing NOTIFY_TYPES member Oct 14 06:08:12 localhost ceph-mon[317114]: [14/Oct/2025:10:08:10] ENGINE Serving on http://172.18.0.106:8765 Oct 14 06:08:12 localhost ceph-mon[317114]: [14/Oct/2025:10:08:10] ENGINE Serving on https://172.18.0.106:7150 Oct 14 06:08:12 localhost ceph-mon[317114]: [14/Oct/2025:10:08:10] ENGINE Bus STARTED Oct 14 06:08:12 localhost ceph-mon[317114]: [14/Oct/2025:10:08:10] ENGINE Client ('172.18.0.106', 60988) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)') Oct 14 06:08:12 localhost ceph-mon[317114]: Health check cleared: CEPHADM_STRAY_DAEMON (was: 3 stray daemon(s) not managed by cephadm) Oct 14 06:08:12 localhost ceph-mon[317114]: Health check cleared: CEPHADM_STRAY_HOST (was: 3 stray host(s) with 3 daemon(s) not managed by cephadm) Oct 14 06:08:12 localhost ceph-mon[317114]: Cluster is now healthy Oct 14 06:08:12 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' Oct 14 06:08:12 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' Oct 14 06:08:12 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' Oct 14 06:08:12 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' Oct 14 06:08:12 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' Oct 14 06:08:12 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' 
entity='mgr.np0005486731.swasqz' Oct 14 06:08:12 localhost ceph-mgr[302471]: mgr[py] Module prometheus has missing NOTIFY_TYPES member Oct 14 06:08:12 localhost ceph-mgr[302471]: mgr[py] Loading python module 'rbd_support' Oct 14 06:08:12 localhost ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-mgr-np0005486733-primvu[302467]: 2025-10-14T10:08:12.358+0000 7f16228b1140 -1 mgr[py] Module prometheus has missing NOTIFY_TYPES member Oct 14 06:08:12 localhost ceph-mgr[302471]: mgr[py] Module rbd_support has missing NOTIFY_TYPES member Oct 14 06:08:12 localhost ceph-mgr[302471]: mgr[py] Loading python module 'restful' Oct 14 06:08:12 localhost ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-mgr-np0005486733-primvu[302467]: 2025-10-14T10:08:12.439+0000 7f16228b1140 -1 mgr[py] Module rbd_support has missing NOTIFY_TYPES member Oct 14 06:08:12 localhost ceph-mgr[302471]: mgr[py] Loading python module 'rgw' Oct 14 06:08:12 localhost ceph-mgr[302471]: mgr[py] Module rgw has missing NOTIFY_TYPES member Oct 14 06:08:12 localhost ceph-mgr[302471]: mgr[py] Loading python module 'rook' Oct 14 06:08:12 localhost ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-mgr-np0005486733-primvu[302467]: 2025-10-14T10:08:12.763+0000 7f16228b1140 -1 mgr[py] Module rgw has missing NOTIFY_TYPES member Oct 14 06:08:13 localhost ceph-mgr[302471]: mgr[py] Module rook has missing NOTIFY_TYPES member Oct 14 06:08:13 localhost ceph-mgr[302471]: mgr[py] Loading python module 'selftest' Oct 14 06:08:13 localhost ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-mgr-np0005486733-primvu[302467]: 2025-10-14T10:08:13.209+0000 7f16228b1140 -1 mgr[py] Module rook has missing NOTIFY_TYPES member Oct 14 06:08:13 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' Oct 14 06:08:13 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' Oct 14 06:08:13 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' 
entity='mgr.np0005486731.swasqz' Oct 14 06:08:13 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch Oct 14 06:08:13 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' Oct 14 06:08:13 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch Oct 14 06:08:13 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch Oct 14 06:08:13 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch Oct 14 06:08:13 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' Oct 14 06:08:13 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' Oct 14 06:08:13 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch Oct 14 06:08:13 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch Oct 14 06:08:13 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Oct 14 06:08:13 localhost ceph-mgr[302471]: mgr[py] Module selftest has missing NOTIFY_TYPES member Oct 14 06:08:13 localhost ceph-mgr[302471]: mgr[py] Loading python module 'snap_schedule' Oct 14 
06:08:13 localhost ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-mgr-np0005486733-primvu[302467]: 2025-10-14T10:08:13.274+0000 7f16228b1140 -1 mgr[py] Module selftest has missing NOTIFY_TYPES member Oct 14 06:08:13 localhost ceph-mgr[302471]: mgr[py] Loading python module 'stats' Oct 14 06:08:13 localhost ceph-mgr[302471]: mgr[py] Loading python module 'status' Oct 14 06:08:13 localhost ceph-mgr[302471]: mgr[py] Module status has missing NOTIFY_TYPES member Oct 14 06:08:13 localhost ceph-mgr[302471]: mgr[py] Loading python module 'telegraf' Oct 14 06:08:13 localhost ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-mgr-np0005486733-primvu[302467]: 2025-10-14T10:08:13.475+0000 7f16228b1140 -1 mgr[py] Module status has missing NOTIFY_TYPES member Oct 14 06:08:13 localhost ceph-mgr[302471]: mgr[py] Module telegraf has missing NOTIFY_TYPES member Oct 14 06:08:13 localhost ceph-mgr[302471]: mgr[py] Loading python module 'telemetry' Oct 14 06:08:13 localhost ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-mgr-np0005486733-primvu[302467]: 2025-10-14T10:08:13.533+0000 7f16228b1140 -1 mgr[py] Module telegraf has missing NOTIFY_TYPES member Oct 14 06:08:13 localhost nova_compute[297686]: 2025-10-14 10:08:13.552 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:08:13 localhost ceph-mgr[302471]: mgr[py] Module telemetry has missing NOTIFY_TYPES member Oct 14 06:08:13 localhost ceph-mgr[302471]: mgr[py] Loading python module 'test_orchestrator' Oct 14 06:08:13 localhost ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-mgr-np0005486733-primvu[302467]: 2025-10-14T10:08:13.665+0000 7f16228b1140 -1 mgr[py] Module telemetry has missing NOTIFY_TYPES member Oct 14 06:08:13 localhost ceph-mgr[302471]: mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member Oct 14 06:08:13 localhost ceph-mgr[302471]: mgr[py] Loading python module 'volumes' Oct 14 06:08:13 localhost 
ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-mgr-np0005486733-primvu[302467]: 2025-10-14T10:08:13.812+0000 7f16228b1140 -1 mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member Oct 14 06:08:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041. Oct 14 06:08:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857. Oct 14 06:08:14 localhost ceph-mgr[302471]: mgr[py] Module volumes has missing NOTIFY_TYPES member Oct 14 06:08:14 localhost ceph-mgr[302471]: mgr[py] Loading python module 'zabbix' Oct 14 06:08:14 localhost ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-mgr-np0005486733-primvu[302467]: 2025-10-14T10:08:14.007+0000 7f16228b1140 -1 mgr[py] Module volumes has missing NOTIFY_TYPES member Oct 14 06:08:14 localhost ceph-mgr[302471]: mgr[py] Module zabbix has missing NOTIFY_TYPES member Oct 14 06:08:14 localhost ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-mgr-np0005486733-primvu[302467]: 2025-10-14T10:08:14.073+0000 7f16228b1140 -1 mgr[py] Module zabbix has missing NOTIFY_TYPES member Oct 14 06:08:14 localhost ceph-mgr[302471]: ms_deliver_dispatch: unhandled message 0x55f2d4fb71e0 mon_map magic: 0 from mon.2 v2:172.18.0.105:3300/0 Oct 14 06:08:14 localhost ceph-mgr[302471]: client.0 ms_handle_reset on v2:172.18.0.106:6810/1783812466 Oct 14 06:08:14 localhost podman[320845]: 2025-10-14 10:08:14.086158173 +0000 UTC m=+0.094573947 container health_status 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': 
'/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251009, container_name=ovn_metadata_agent, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team) Oct 14 06:08:14 localhost podman[320845]: 2025-10-14 10:08:14.116063479 +0000 UTC m=+0.124479283 container exec_died 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 
'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0) Oct 14 06:08:14 localhost systemd[1]: 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857.service: Deactivated successfully. 
Oct 14 06:08:14 localhost podman[320844]: 2025-10-14 10:08:14.13077586 +0000 UTC m=+0.136763750 container health_status 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter) Oct 14 06:08:14 localhost podman[320844]: 2025-10-14 10:08:14.141646523 +0000 UTC m=+0.147634403 container exec_died 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': 
['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter) Oct 14 06:08:14 localhost systemd[1]: 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041.service: Deactivated successfully. Oct 14 06:08:14 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e84 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Oct 14 06:08:14 localhost ceph-mon[317114]: Adjusting osd_memory_target on np0005486731.localdomain to 836.6M Oct 14 06:08:14 localhost ceph-mon[317114]: Adjusting osd_memory_target on np0005486732.localdomain to 836.6M Oct 14 06:08:14 localhost ceph-mon[317114]: Unable to set osd_memory_target on np0005486731.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096 Oct 14 06:08:14 localhost ceph-mon[317114]: Unable to set osd_memory_target on np0005486732.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096 Oct 14 06:08:14 localhost ceph-mon[317114]: Adjusting osd_memory_target on np0005486733.localdomain to 836.6M Oct 14 06:08:14 localhost ceph-mon[317114]: Unable to set osd_memory_target on np0005486733.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096 Oct 14 06:08:14 localhost ceph-mon[317114]: Updating np0005486731.localdomain:/etc/ceph/ceph.conf Oct 14 06:08:14 localhost ceph-mon[317114]: Updating np0005486732.localdomain:/etc/ceph/ceph.conf Oct 14 06:08:14 localhost ceph-mon[317114]: Updating np0005486733.localdomain:/etc/ceph/ceph.conf Oct 14 06:08:14 localhost ceph-mon[317114]: Updating np0005486731.localdomain:/var/lib/ceph/fcadf6e2-9176-5818-a8d0-37b19acf8eaf/config/ceph.conf Oct 14 06:08:14 localhost ceph-mon[317114]: Updating np0005486732.localdomain:/var/lib/ceph/fcadf6e2-9176-5818-a8d0-37b19acf8eaf/config/ceph.conf Oct 14 06:08:14 localhost 
ceph-mon[317114]: Updating np0005486733.localdomain:/var/lib/ceph/fcadf6e2-9176-5818-a8d0-37b19acf8eaf/config/ceph.conf Oct 14 06:08:15 localhost ceph-mon[317114]: Updating np0005486732.localdomain:/etc/ceph/ceph.client.admin.keyring Oct 14 06:08:15 localhost ceph-mon[317114]: Updating np0005486731.localdomain:/etc/ceph/ceph.client.admin.keyring Oct 14 06:08:15 localhost ceph-mon[317114]: Updating np0005486733.localdomain:/etc/ceph/ceph.client.admin.keyring Oct 14 06:08:15 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' Oct 14 06:08:15 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' Oct 14 06:08:15 localhost nova_compute[297686]: 2025-10-14 10:08:15.814 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:08:16 localhost ceph-mon[317114]: Updating np0005486732.localdomain:/var/lib/ceph/fcadf6e2-9176-5818-a8d0-37b19acf8eaf/config/ceph.client.admin.keyring Oct 14 06:08:16 localhost ceph-mon[317114]: Updating np0005486731.localdomain:/var/lib/ceph/fcadf6e2-9176-5818-a8d0-37b19acf8eaf/config/ceph.client.admin.keyring Oct 14 06:08:16 localhost ceph-mon[317114]: Updating np0005486733.localdomain:/var/lib/ceph/fcadf6e2-9176-5818-a8d0-37b19acf8eaf/config/ceph.client.admin.keyring Oct 14 06:08:16 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' Oct 14 06:08:16 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' Oct 14 06:08:16 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' Oct 14 06:08:16 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' Oct 14 06:08:16 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' Oct 
14 06:08:16 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Oct 14 06:08:16 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' Oct 14 06:08:18 localhost nova_compute[297686]: 2025-10-14 10:08:18.555 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:08:19 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e84 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Oct 14 06:08:19 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' Oct 14 06:08:19 localhost nova_compute[297686]: 2025-10-14 10:08:19.591 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:08:19 localhost nova_compute[297686]: 2025-10-14 10:08:19.591 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:08:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad. Oct 14 06:08:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec. Oct 14 06:08:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29. 
Oct 14 06:08:19 localhost podman[321273]: 2025-10-14 10:08:19.740773057 +0000 UTC m=+0.083330524 container health_status aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Oct 14 06:08:19 localhost podman[321273]: 2025-10-14 10:08:19.753085923 +0000 UTC m=+0.095643390 container exec_died aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 
'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter) Oct 14 06:08:19 localhost podman[321272]: 2025-10-14 10:08:19.712174111 +0000 UTC m=+0.060118683 container health_status 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_id=multipathd, io.buildah.version=1.41.3) Oct 14 06:08:19 localhost systemd[1]: aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec.service: Deactivated successfully. 
Oct 14 06:08:19 localhost podman[321272]: 2025-10-14 10:08:19.791211492 +0000 UTC m=+0.139156104 container exec_died 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, config_id=multipathd, container_name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS) Oct 14 06:08:19 localhost systemd[1]: 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad.service: Deactivated successfully. 
Oct 14 06:08:19 localhost podman[321274]: 2025-10-14 10:08:19.84436529 +0000 UTC m=+0.185385250 container health_status fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, container_name=iscsid, org.label-schema.license=GPLv2, config_id=iscsid, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image) Oct 14 06:08:19 localhost podman[321274]: 2025-10-14 10:08:19.858114421 +0000 UTC m=+0.199134431 container exec_died fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 
(image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=iscsid, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=iscsid) Oct 14 06:08:19 localhost systemd[1]: fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29.service: Deactivated successfully. Oct 14 06:08:20 localhost systemd[1]: tmp-crun.8lUYcL.mount: Deactivated successfully. 
Oct 14 06:08:20 localhost nova_compute[297686]: 2025-10-14 10:08:20.851 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:08:21 localhost nova_compute[297686]: 2025-10-14 10:08:21.255 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:08:21 localhost nova_compute[297686]: 2025-10-14 10:08:21.256 2 DEBUG nova.compute.manager [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Oct 14 06:08:22 localhost nova_compute[297686]: 2025-10-14 10:08:22.250 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:08:22 localhost nova_compute[297686]: 2025-10-14 10:08:22.277 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:08:22 localhost nova_compute[297686]: 2025-10-14 10:08:22.278 2 DEBUG nova.compute.manager [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Oct 14 06:08:22 localhost nova_compute[297686]: 2025-10-14 10:08:22.278 2 DEBUG nova.compute.manager [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Rebuilding the list of instances to heal 
_heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Oct 14 06:08:22 localhost nova_compute[297686]: 2025-10-14 10:08:22.361 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Acquiring lock "refresh_cache-88c4e366-b765-47a6-96bf-f7677f2ce67c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Oct 14 06:08:22 localhost nova_compute[297686]: 2025-10-14 10:08:22.361 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Acquired lock "refresh_cache-88c4e366-b765-47a6-96bf-f7677f2ce67c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Oct 14 06:08:22 localhost nova_compute[297686]: 2025-10-14 10:08:22.362 2 DEBUG nova.network.neutron [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] [instance: 88c4e366-b765-47a6-96bf-f7677f2ce67c] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Oct 14 06:08:22 localhost nova_compute[297686]: 2025-10-14 10:08:22.362 2 DEBUG nova.objects.instance [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Lazy-loading 'info_cache' on Instance uuid 88c4e366-b765-47a6-96bf-f7677f2ce67c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Oct 14 06:08:22 localhost nova_compute[297686]: 2025-10-14 10:08:22.836 2 DEBUG nova.network.neutron [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] [instance: 88c4e366-b765-47a6-96bf-f7677f2ce67c] Updating instance_info_cache with network_info: [{"id": "3ec9b060-f43d-4698-9c76-6062c70911d5", "address": "fa:16:3e:84:5e:e5", "network": {"id": "7d0cd696-bdd7-4e70-9512-eb0d23640314", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"192.168.0.46", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "41187b090f3d4818a32baa37ce8a3991", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3ec9b060-f4", "ovs_interfaceid": "3ec9b060-f43d-4698-9c76-6062c70911d5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Oct 14 06:08:22 localhost nova_compute[297686]: 2025-10-14 10:08:22.860 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Releasing lock "refresh_cache-88c4e366-b765-47a6-96bf-f7677f2ce67c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Oct 14 06:08:22 localhost nova_compute[297686]: 2025-10-14 10:08:22.860 2 DEBUG nova.compute.manager [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] [instance: 88c4e366-b765-47a6-96bf-f7677f2ce67c] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Oct 14 06:08:23 localhost nova_compute[297686]: 2025-10-14 10:08:23.254 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:08:23 localhost nova_compute[297686]: 2025-10-14 10:08:23.588 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:08:24 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e84 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Oct 14 06:08:24 localhost nova_compute[297686]: 2025-10-14 10:08:24.255 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:08:25 localhost nova_compute[297686]: 2025-10-14 10:08:25.255 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:08:25 localhost nova_compute[297686]: 2025-10-14 10:08:25.256 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:08:25 localhost nova_compute[297686]: 2025-10-14 10:08:25.901 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:08:27 localhost nova_compute[297686]: 2025-10-14 10:08:27.255 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:08:27 localhost nova_compute[297686]: 2025-10-14 10:08:27.339 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Acquiring lock "compute_resources" by 
"nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Oct 14 06:08:27 localhost nova_compute[297686]: 2025-10-14 10:08:27.339 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Oct 14 06:08:27 localhost nova_compute[297686]: 2025-10-14 10:08:27.340 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Oct 14 06:08:27 localhost nova_compute[297686]: 2025-10-14 10:08:27.340 2 DEBUG nova.compute.resource_tracker [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Auditing locally available compute resources for np0005486733.localdomain (node: np0005486733.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Oct 14 06:08:27 localhost nova_compute[297686]: 2025-10-14 10:08:27.340 2 DEBUG oslo_concurrency.processutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Oct 14 06:08:27 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Oct 14 06:08:27 localhost ceph-mon[317114]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.108:0/2944009308' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Oct 14 06:08:27 localhost nova_compute[297686]: 2025-10-14 10:08:27.756 2 DEBUG oslo_concurrency.processutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.415s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Oct 14 06:08:27 localhost nova_compute[297686]: 2025-10-14 10:08:27.810 2 DEBUG nova.virt.libvirt.driver [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Oct 14 06:08:27 localhost nova_compute[297686]: 2025-10-14 10:08:27.810 2 DEBUG nova.virt.libvirt.driver [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Oct 14 06:08:28 localhost nova_compute[297686]: 2025-10-14 10:08:28.026 2 WARNING nova.virt.libvirt.driver [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Oct 14 06:08:28 localhost nova_compute[297686]: 2025-10-14 10:08:28.028 2 DEBUG nova.compute.resource_tracker [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Hypervisor/Node resource view: name=np0005486733.localdomain free_ram=11419MB free_disk=41.83695602416992GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": 
"1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Oct 14 06:08:28 localhost nova_compute[297686]: 2025-10-14 10:08:28.028 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Oct 14 06:08:28 localhost nova_compute[297686]: 2025-10-14 10:08:28.029 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Oct 14 06:08:28 localhost nova_compute[297686]: 2025-10-14 10:08:28.082 2 DEBUG nova.compute.resource_tracker [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Instance 88c4e366-b765-47a6-96bf-f7677f2ce67c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Oct 14 06:08:28 localhost nova_compute[297686]: 2025-10-14 10:08:28.083 2 DEBUG nova.compute.resource_tracker [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Oct 14 06:08:28 localhost nova_compute[297686]: 2025-10-14 10:08:28.083 2 DEBUG nova.compute.resource_tracker [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Final resource view: name=np0005486733.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Oct 14 06:08:28 localhost nova_compute[297686]: 2025-10-14 10:08:28.118 2 DEBUG oslo_concurrency.processutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Oct 14 06:08:28 localhost podman[248187]: time="2025-10-14T10:08:28Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Oct 14 06:08:28 localhost podman[248187]: @ - - [14/Oct/2025:10:08:28 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 147497 "" "Go-http-client/1.1" Oct 14 06:08:28 localhost podman[248187]: @ - - [14/Oct/2025:10:08:28 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19857 "" "Go-http-client/1.1" Oct 14 06:08:28 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Oct 14 06:08:28 localhost ceph-mon[317114]: log_channel(audit) log 
[DBG] : from='client.? 172.18.0.108:0/2842275260' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Oct 14 06:08:28 localhost nova_compute[297686]: 2025-10-14 10:08:28.576 2 DEBUG oslo_concurrency.processutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.458s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Oct 14 06:08:28 localhost nova_compute[297686]: 2025-10-14 10:08:28.583 2 DEBUG nova.compute.provider_tree [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Inventory has not changed in ProviderTree for provider: 18c24273-aca2-4f08-be57-3188d558235e update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Oct 14 06:08:28 localhost nova_compute[297686]: 2025-10-14 10:08:28.590 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:08:28 localhost nova_compute[297686]: 2025-10-14 10:08:28.598 2 DEBUG nova.scheduler.client.report [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Inventory has not changed for provider 18c24273-aca2-4f08-be57-3188d558235e based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Oct 14 06:08:28 localhost nova_compute[297686]: 2025-10-14 10:08:28.600 2 DEBUG nova.compute.resource_tracker [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Compute_service record updated for 
np0005486733.localdomain:np0005486733.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Oct 14 06:08:28 localhost nova_compute[297686]: 2025-10-14 10:08:28.600 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.571s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Oct 14 06:08:29 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e84 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Oct 14 06:08:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f. Oct 14 06:08:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917. Oct 14 06:08:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d. 
Oct 14 06:08:30 localhost podman[321377]: 2025-10-14 10:08:30.750871569 +0000 UTC m=+0.072515363 container health_status 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute) Oct 14 06:08:30 localhost podman[321376]: 2025-10-14 10:08:30.814309122 +0000 UTC m=+0.139790273 container health_status 
799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, managed_by=edpm_ansible, config_id=edpm, vendor=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1755695350, container_name=openstack_network_exporter, name=ubi9-minimal, version=9.6, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, architecture=x86_64, io.buildah.version=1.33.7, vcs-type=git, maintainer=Red Hat, Inc.) Oct 14 06:08:30 localhost podman[321376]: 2025-10-14 10:08:30.829052314 +0000 UTC m=+0.154533485 container exec_died 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, maintainer=Red Hat, Inc., release=1755695350, architecture=x86_64, distribution-scope=public, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, vcs-type=git, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 
'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal) Oct 14 06:08:30 localhost podman[321377]: 2025-10-14 10:08:30.838045629 +0000 UTC m=+0.159689423 container exec_died 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, tcib_managed=true, 
tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d) Oct 14 06:08:30 localhost systemd[1]: 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917.service: Deactivated successfully. Oct 14 06:08:30 localhost systemd[1]: 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d.service: Deactivated successfully. Oct 14 06:08:30 localhost nova_compute[297686]: 2025-10-14 10:08:30.943 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:08:30 localhost podman[321375]: 2025-10-14 10:08:30.951802124 +0000 UTC m=+0.279709650 container health_status 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, 
config_id=ovn_controller) Oct 14 06:08:30 localhost podman[321375]: 2025-10-14 10:08:30.984077672 +0000 UTC m=+0.311985148 container exec_died 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251009, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team) Oct 14 06:08:30 localhost systemd[1]: 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f.service: Deactivated successfully. 
Oct 14 06:08:33 localhost nova_compute[297686]: 2025-10-14 10:08:33.591 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:08:34 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e84 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Oct 14 06:08:35 localhost nova_compute[297686]: 2025-10-14 10:08:35.977 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:08:38 localhost nova_compute[297686]: 2025-10-14 10:08:38.625 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:08:38 localhost openstack_network_exporter[250374]: ERROR 10:08:38 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Oct 14 06:08:38 localhost openstack_network_exporter[250374]: ERROR 10:08:38 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Oct 14 06:08:38 localhost openstack_network_exporter[250374]: ERROR 10:08:38 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Oct 14 06:08:38 localhost openstack_network_exporter[250374]: ERROR 10:08:38 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Oct 14 06:08:38 localhost openstack_network_exporter[250374]: Oct 14 06:08:38 localhost openstack_network_exporter[250374]: ERROR 10:08:38 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Oct 14 06:08:38 localhost openstack_network_exporter[250374]: Oct 14 06:08:39 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e84 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Oct 14 06:08:41 localhost 
nova_compute[297686]: 2025-10-14 10:08:41.008 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:08:43 localhost nova_compute[297686]: 2025-10-14 10:08:43.657 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:08:44 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e84 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Oct 14 06:08:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041. Oct 14 06:08:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857. Oct 14 06:08:44 localhost podman[321434]: 2025-10-14 10:08:44.743558209 +0000 UTC m=+0.083436117 container health_status 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Oct 14 06:08:44 localhost podman[321434]: 2025-10-14 
10:08:44.780030036 +0000 UTC m=+0.119907944 container exec_died 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter) Oct 14 06:08:44 localhost systemd[1]: 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041.service: Deactivated successfully. 
Oct 14 06:08:44 localhost podman[321435]: 2025-10-14 10:08:44.821879498 +0000 UTC m=+0.155415431 container health_status 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image) Oct 14 06:08:44 localhost podman[321435]: 2025-10-14 10:08:44.856132957 +0000 UTC 
m=+0.189668810 container exec_died 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team) Oct 14 06:08:44 localhost systemd[1]: 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857.service: Deactivated successfully. 
Oct 14 06:08:46 localhost nova_compute[297686]: 2025-10-14 10:08:46.033 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:08:48 localhost nova_compute[297686]: 2025-10-14 10:08:48.659 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:08:49 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e84 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.818 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'name': 'test', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005486733.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '41187b090f3d4818a32baa37ce8a3991', 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'hostId': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228 Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.819 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.823 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:08:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.826 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '437c0452-3db9-4dea-bb91-b0f4bd5d6a5e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T10:08:49.819956', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': 'c79a4c44-a8e5-11f0-9707-fa163e99780b', 'monotonic_time': 12546.012655655, 'message_signature': 'b40562dec5e192079cacba4b800ebb2cc0cae9f93d0bb3bb584fb184117021c1'}]}, 'timestamp': '2025-10-14 10:08:49.824372', '_unique_id': '63050526234346d78240503a822b4ca4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 
2025-10-14 10:08:49.826 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.826 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.826 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.826 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.826 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.826 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.826 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.826 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.826 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.826 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.826 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:08:49 
localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.826 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.826 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.826 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.826 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.826 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.826 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.826 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.826 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.826 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.826 12 ERROR oslo_messaging.notify.messaging Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.826 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:08:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.826 12 ERROR oslo_messaging.notify.messaging Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.826 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.826 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.826 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.826 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.826 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.826 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.826 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.826 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.826 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.826 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.826 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.826 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.826 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.826 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.826 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.826 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.826 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.826 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.826 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.826 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.826 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.826 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.826 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.826 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.826 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.826 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.826 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.826 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.826 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.826 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 
06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.826 12 ERROR oslo_messaging.notify.messaging Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.827 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.848 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.read.requests volume: 1272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.848 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.850 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '33d8426b-0ed8-499a-8d45-5d9f7e42f102', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1272, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T10:08:49.827534', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'c79e0ae6-a8e5-11f0-9707-fa163e99780b', 'monotonic_time': 12546.020256068, 'message_signature': 'f013f414af841e621d8353f3051313deb7fe963e22797ba7c21f97466fab945b'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T10:08:49.827534', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'c79e1ee6-a8e5-11f0-9707-fa163e99780b', 'monotonic_time': 12546.020256068, 'message_signature': '1cfcd92a1b63f14b6d0647f290f91b7e6e9dcf44bdfcdc8c9bca9c733c103b19'}]}, 'timestamp': '2025-10-14 10:08:49.849328', '_unique_id': '1cd5a60be49a4d58a66a7dfc69ef9602'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.850 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.850 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.850 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.850 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.850 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.850 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.850 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.850 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.850 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.850 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.850 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.850 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.850 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.850 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.850 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.850 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 
06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.850 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.850 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.850 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.850 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.850 12 ERROR oslo_messaging.notify.messaging Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.850 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.850 12 ERROR oslo_messaging.notify.messaging Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.850 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.850 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.850 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.850 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:08:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.850 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.850 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.850 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.850 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.850 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.850 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.850 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.850 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.850 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.850 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.850 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.850 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.850 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.850 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.850 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.850 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.850 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.850 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.850 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.850 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:08:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.850 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.850 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.850 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.850 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.850 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.850 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.850 12 ERROR oslo_messaging.notify.messaging Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.852 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.852 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.852 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.write.latency volume: 1178650666 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:08:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.853 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.write.latency volume: 22746151 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.854 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7b00ee98-4dc6-4811-8428-cb83fdaa9404', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1178650666, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T10:08:49.852490', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'c79eb1bc-a8e5-11f0-9707-fa163e99780b', 'monotonic_time': 12546.020256068, 'message_signature': '907b625a4d9edb93cae523c98308ec9cef7d35dd19948a319859dd8d6533d630'}, {'source': 'openstack', 'counter_name': 
'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 22746151, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T10:08:49.852490', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'c79ec292-a8e5-11f0-9707-fa163e99780b', 'monotonic_time': 12546.020256068, 'message_signature': 'c79f5c81ce1bb3e71e08a0f87a2d20808c2674f73e544f50badbdde493ebfb70'}]}, 'timestamp': '2025-10-14 10:08:49.853537', '_unique_id': '51d0ee0ab53040a1a40c599f47530f9b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.854 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.854 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.854 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:08:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.854 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.854 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.854 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.854 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.854 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.854 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.854 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.854 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.854 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.854 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.854 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.854 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.854 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.854 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.854 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.854 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.854 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.854 12 ERROR oslo_messaging.notify.messaging Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.854 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.854 12 ERROR oslo_messaging.notify.messaging Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.854 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.854 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:08:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.854 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.854 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.854 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.854 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.854 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.854 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.854 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.854 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.854 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.854 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.854 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.854 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.854 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.854 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.854 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.854 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.854 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.854 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.854 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.854 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.854 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.854 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.854 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.854 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.854 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.854 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.854 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.854 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.854 12 ERROR oslo_messaging.notify.messaging Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.855 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.855 12 DEBUG ceilometer.compute.pollsters [-] 
88c4e366-b765-47a6-96bf-f7677f2ce67c/network.incoming.bytes volume: 8190 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.857 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '68ce8c74-b9be-4fb1-9d60-b5b2894bdddc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 8190, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T10:08:49.855909', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': 'c79f3164-a8e5-11f0-9707-fa163e99780b', 'monotonic_time': 12546.012655655, 'message_signature': 'c3a9e0474d3219c5c13d0b630196edb0c3ce7d189a55193fb840167607c1d985'}]}, 'timestamp': '2025-10-14 
10:08:49.856376', '_unique_id': 'aaab444831864bca866b1112b6e5c3d4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.857 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.857 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.857 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.857 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.857 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.857 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.857 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.857 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.857 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.857 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in 
_establish_connection Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.857 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.857 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.857 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.857 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.857 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.857 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.857 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.857 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.857 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.857 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.857 12 ERROR oslo_messaging.notify.messaging Oct 14 06:08:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.857 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.857 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.857 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.857 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.857 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.857 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.857 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.857 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.857 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.857 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.857 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.857 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.857 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.857 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.857 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.857 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.857 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.857 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.857 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.857 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.857 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.857 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.857 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.857 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.857 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.857 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.857 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.857 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.857 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.857 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.857 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.857 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.857 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.858 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.876 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/memory.usage volume: 51.81640625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.878 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '248ebf56-d6cb-43d9-89ee-79b7715bf7c9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.81640625, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'timestamp': '2025-10-14T10:08:49.858629', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': 'c7a26276-a8e5-11f0-9707-fa163e99780b', 'monotonic_time': 12546.069157546, 'message_signature': '91d474920c46ba912c1e1fa02d184eb74cb807464e23feb66be20e4969a98254'}]}, 'timestamp': '2025-10-14 10:08:49.877356', '_unique_id': 'c539d5a0cbb34dbdb12b4cf03a6a8831'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.878 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.878 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.878 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.878 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.878 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.878 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.878 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.878 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.878 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.878 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.878 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.878 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.878 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.878 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.878 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.878 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.878 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.878 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.878 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.878 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.878 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.878 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.878 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.878 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.878 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.878 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.878 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.878 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.878 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.878 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.878 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.880 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.880 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.write.requests volume: 53 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.880 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.882 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1199041e-a860-479f-9f7e-24ea0a6cff0a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 53, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T10:08:49.880281', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'c7a2e9f8-a8e5-11f0-9707-fa163e99780b', 'monotonic_time': 12546.020256068, 'message_signature': '74d99913eda130c0496c4994535dbd67db4d2406cedf0eaa87f03056751fbe99'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T10:08:49.880281', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'c7a2fefc-a8e5-11f0-9707-fa163e99780b', 'monotonic_time': 12546.020256068, 'message_signature': '7bf846ac27aaa8314a11e602c169d4f5115585234d4a2774f06b52532fd46249'}]}, 'timestamp': '2025-10-14 10:08:49.881277', '_unique_id': 'e6908f5aaa244827a10b8ea865824370'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.882 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.882 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.882 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.882 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.882 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.882 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.882 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.882 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.882 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.882 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.882 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.882 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.882 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.882 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.882 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.882 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.882 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.882 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.882 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.882 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.882 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.882 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.882 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.882 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.882 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.882 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.882 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.882 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.882 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.882 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.882 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.883 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.883 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.885 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3934900b-73db-4d10-bdcc-d4ad644be2ca', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T10:08:49.883874', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': 'c7a37710-a8e5-11f0-9707-fa163e99780b', 'monotonic_time': 12546.012655655, 'message_signature': 'da5acbff6eabb5080da77e98fce179cb169d118a3708bf471c67c4543b5f4381'}]}, 'timestamp': '2025-10-14 10:08:49.884417', '_unique_id': '34d2fa22232f4ff8bdf8c0e41bc2399a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.885 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.885 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.885 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.885 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.885 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.885 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.885 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.885 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.885 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.885 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.885 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.885 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.885 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.885 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.885 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.885 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.885 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.885 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.885 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.885 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.885 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.885 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.885 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.885 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.885 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.885 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.885 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.885 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.885 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:08:49 
localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.885 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.885 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.885 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.885 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.885 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.885 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.885 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.885 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.885 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.885 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.885 12 ERROR oslo_messaging.notify.messaging Oct 14 06:08:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.887 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.887 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.889 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e7037c7f-50df-4c8d-bf10-9df2f7f3c730', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T10:08:49.887558', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 
'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': 'c7a40afe-a8e5-11f0-9707-fa163e99780b', 'monotonic_time': 12546.012655655, 'message_signature': '9bc82a3c6d59aad38bdab2dbb08d1df843b378c2f6dd77b459b46b5c1d9b3ccd'}]}, 'timestamp': '2025-10-14 10:08:49.888233', '_unique_id': '9a1e6843e79a4f6cb7887de68c5da6ea'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.889 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.889 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.889 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.889 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.889 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.889 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.889 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.889 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.889 12 ERROR 
oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.889 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.889 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.889 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.889 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.889 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.889 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.889 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.889 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.889 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.889 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 
2025-10-14 10:08:49.889 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.889 12 ERROR oslo_messaging.notify.messaging Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.889 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.889 12 ERROR oslo_messaging.notify.messaging Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.889 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.889 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.889 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.889 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.889 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.889 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.889 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:08:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.889 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.889 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.889 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.889 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.889 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.889 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.889 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.889 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.889 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.889 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 
06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.889 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.889 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.889 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.889 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.889 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.889 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.889 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.889 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.889 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.889 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.889 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.889 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.889 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.889 12 ERROR oslo_messaging.notify.messaging Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.890 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.891 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.outgoing.packets volume: 118 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.892 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '7a554577-0448-41a8-bbf7-4e51d31922d4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 118, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T10:08:49.890976', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': 'c7a48ba0-a8e5-11f0-9707-fa163e99780b', 'monotonic_time': 12546.012655655, 'message_signature': 'fc0010179407773506f5cba1dbe8b2ab057f182253ff2f8752ecb0b673c01d96'}]}, 'timestamp': '2025-10-14 10:08:49.891463', '_unique_id': '4741eb61d0554512b2c0ab9669316b1b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.892 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:08:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.892 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.892 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.892 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.892 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.892 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.892 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.892 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.892 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.892 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.892 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.892 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.892 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.892 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.892 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.892 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.892 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.892 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.892 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.892 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.892 12 ERROR oslo_messaging.notify.messaging Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.892 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.892 12 ERROR oslo_messaging.notify.messaging Oct 14 06:08:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.892 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.892 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.892 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.892 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.892 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.892 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.892 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.892 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.892 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.892 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.892 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.892 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.892 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.892 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.892 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.892 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.892 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.892 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.892 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.892 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.892 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.892 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.892 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.892 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.892 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.892 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.892 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.892 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.892 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.892 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.892 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.893 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.893 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.read.latency volume: 1739521236 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.894 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.read.latency volume: 110324003 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.895 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e1038aee-4ac4-4822-bc80-e106af8f6ac9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1739521236, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T10:08:49.893924', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'c7a50076-a8e5-11f0-9707-fa163e99780b', 'monotonic_time': 12546.020256068, 'message_signature': 'b8d27ac4b1bdf00479e987453e55182436e1b8213ec381dc75a7c8b54877a92f'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 110324003, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T10:08:49.893924', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'c7a5112e-a8e5-11f0-9707-fa163e99780b', 'monotonic_time': 12546.020256068, 'message_signature': '26d4ca7d7713cd41e9604ba764553cb2856ceb1aae64cdd3afa7d8ef9c4b855d'}]}, 'timestamp': '2025-10-14 10:08:49.894878', '_unique_id': '0dc5f9f2ea52430c8617a5693a4fa5be'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.895 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.895 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.895 12 ERROR oslo_messaging.notify.messaging yield
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.895 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.895 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.895 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.895 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.895 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.895 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.895 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.895 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.895 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.895 12 ERROR oslo_messaging.notify.messaging conn.connect()
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.895 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.895 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.895 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.895 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.895 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.895 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.895 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.895 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.895 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.895 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.895 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.895 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.895 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.895 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.895 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.895 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.895 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.895 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.895 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.895 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.895 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.895 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.895 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.895 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.895 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.895 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.895 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.895 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.895 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.895 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.895 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.895 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.895 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.895 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.895 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.895 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.895 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.895 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.895 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.895 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.895 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.897 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.897 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.read.bytes volume: 35560448 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.897 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.899 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1b516e6f-477e-4f0f-a03c-ef39d7269b68', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35560448, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T10:08:49.897283', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'c7a583d4-a8e5-11f0-9707-fa163e99780b', 'monotonic_time': 12546.020256068, 'message_signature': 'b7f183dbc93872afbb62f1f718c988c06c992429dbfe4f0312dd1d2201390a8e'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T10:08:49.897283', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'c7a5961c-a8e5-11f0-9707-fa163e99780b', 'monotonic_time': 12546.020256068, 'message_signature': '568bccb83fc82a4028878eaa9bc7a369584d1330077b7c5ff2e6382ae1b045e5'}]}, 'timestamp': '2025-10-14 10:08:49.898253', '_unique_id': '590d4dda10c1401ea29abfb8c120c039'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.899 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.899 12 ERROR oslo_messaging.notify.messaging yield
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.899 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.899 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.899 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.899 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.899 12 ERROR oslo_messaging.notify.messaging conn.connect()
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.899 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.899 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.899 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.899 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.899 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.899 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.899 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.899 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.899 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.899 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.899 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.899 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.899 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.899 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.899 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.899 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.899 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.899 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.899 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.899 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.899 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.899 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.899 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.899 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.900 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.900 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.901 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '306af2f5-68f9-4d8d-a38a-aea5c4d3eee4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T10:08:49.900615', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': 'c7a605de-a8e5-11f0-9707-fa163e99780b', 'monotonic_time': 12546.012655655, 'message_signature': '74cafad832e51e56d5f2627d4707b2050cd936ec14ed28800e310e91ac9db88b'}]}, 'timestamp': '2025-10-14 10:08:49.901136', '_unique_id': 'fc077dff197f4674822818ae079dc775'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.901 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.901 12 ERROR oslo_messaging.notify.messaging yield
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.901 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.901 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.901 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.901 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.901 12 ERROR oslo_messaging.notify.messaging conn.connect()
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.901 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.901 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.901 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.901 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.901 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.901 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.901 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.901 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.901 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.901 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.901 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.901 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.901 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.901 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.901 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.901 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.901 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:08:49 
localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.901 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.901 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.901 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.901 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.901 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.901 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.901 12 ERROR oslo_messaging.notify.messaging Oct 14 06:08:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.902 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.913 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.913 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.915 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ce213a5c-beb1-4607-9593-9878f19d3648', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T10:08:49.902769', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'c7a7f1c8-a8e5-11f0-9707-fa163e99780b', 'monotonic_time': 12546.095480523, 'message_signature': '631caceb927cbef4e27f05012537a502cc64ed78cc607e5c371c0395931cbab7'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T10:08:49.902769', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'c7a80028-a8e5-11f0-9707-fa163e99780b', 'monotonic_time': 12546.095480523, 'message_signature': '1dea5180b49db597f686e6b0c33ac3c9ffbd7d013e53eee0726432a5dd6dee67'}]}, 'timestamp': '2025-10-14 10:08:49.914010', '_unique_id': 'fa10fc06f8554fd1a4f88c2f5c8f286e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.915 12 ERROR 
oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.915 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.915 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.915 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.915 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.915 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.915 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.915 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.915 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.915 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.915 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:08:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.915 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.915 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.915 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.915 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.915 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.915 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.915 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.915 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.915 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.915 12 ERROR oslo_messaging.notify.messaging Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.915 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 
2025-10-14 10:08:49.915 12 ERROR oslo_messaging.notify.messaging Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.915 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.915 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.915 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.915 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.915 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.915 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.915 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.915 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.915 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.915 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.915 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.915 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.915 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.915 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.915 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.915 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.915 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.915 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.915 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.915 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.915 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.915 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.915 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.915 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.915 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.915 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.915 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.915 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.915 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.915 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:08:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.915 12 ERROR oslo_messaging.notify.messaging Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.916 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.916 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.917 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f98196cf-df5d-4844-ab70-ee1dfc894e17', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T10:08:49.916109', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 
'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': 'c7a85e38-a8e5-11f0-9707-fa163e99780b', 'monotonic_time': 12546.012655655, 'message_signature': 'f4aa5c73ed2ba36309ec0f1dad5c01ee666b245b6a92e80c797072b13ababa3e'}]}, 'timestamp': '2025-10-14 10:08:49.916420', '_unique_id': '6235a1b66e824ff9bfe4159ff6c3f010'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.917 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.917 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.917 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.917 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.917 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.917 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.917 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.917 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.917 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.917 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.917 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.917 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.917 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.917 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.917 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.917 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.917 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.917 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:08:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.917 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.917 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.917 12 ERROR oslo_messaging.notify.messaging Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.917 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.917 12 ERROR oslo_messaging.notify.messaging Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.917 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.917 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.917 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.917 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.917 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.917 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:08:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.917 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.917 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.917 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.917 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.917 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.917 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.917 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.917 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.917 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.917 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.917 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.917 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.917 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.917 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.917 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.917 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.917 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.917 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.917 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.917 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.917 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.917 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.917 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.917 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.917 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.917 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.917 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.918 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/cpu volume: 14380000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.918 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a3cee61e-1147-4318-ab44-5d42809cb77c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 14380000000, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'timestamp': '2025-10-14T10:08:49.917997', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': 'c7a8a762-a8e5-11f0-9707-fa163e99780b', 'monotonic_time': 12546.069157546, 'message_signature': 'a31014a728c6981e3501ba6b68051726415da05717a24237cfb1bb8d4499507f'}]}, 'timestamp': '2025-10-14 10:08:49.918284', '_unique_id': 'eb3f99a41a2d4cd28c4a84d2fc34dcb7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.918 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.918 12 ERROR oslo_messaging.notify.messaging yield
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.918 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.918 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.918 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.918 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.918 12 ERROR oslo_messaging.notify.messaging conn.connect()
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.918 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.918 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.918 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.918 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.918 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.918 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.918 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.918 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.918 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.918 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.918 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.918 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.918 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.918 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.918 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.918 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.918 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.918 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.918 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.918 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.918 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.918 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.918 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.918 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.919 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.919 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.919 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.incoming.packets volume: 79 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.920 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f2fb1dc4-9f8c-48cc-b66d-fadd48783fbf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 79, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T10:08:49.919760', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': 'c7a8ec5e-a8e5-11f0-9707-fa163e99780b', 'monotonic_time': 12546.012655655, 'message_signature': '700821d3bbac4c78f5e231af941578cc33b953653883d83d6208b98486f21173'}]}, 'timestamp': '2025-10-14 10:08:49.920082', '_unique_id': '02bffaf63bc34c4db2e41542cca08b91'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.920 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.920 12 ERROR oslo_messaging.notify.messaging yield
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.920 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.920 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.920 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.920 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.920 12 ERROR oslo_messaging.notify.messaging conn.connect()
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.920 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.920 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.920 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.920 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.920 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.920 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.920 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.920 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.920 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.920 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.920 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.920 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.920 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.920 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.920 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.920 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.920 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.920 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.920 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.920 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.920 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.920 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.920 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.920 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.921 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.921 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.outgoing.bytes volume: 10162 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.922 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '62b80e96-f39a-4f81-a8a5-58bf29d40c9c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 10162, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T10:08:49.921468', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': 'c7a92f02-a8e5-11f0-9707-fa163e99780b', 'monotonic_time': 12546.012655655, 'message_signature': '47f66fb2239cd2d7e51aabe14995e6d3bef4b78300486bfa41852efbbed940e7'}]}, 'timestamp': '2025-10-14 10:08:49.921805', '_unique_id': '06dd9285042c496eaefe5978baa7959a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.922 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.922 12 ERROR oslo_messaging.notify.messaging yield
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.922 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.922 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.922 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.922 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.922 12 ERROR oslo_messaging.notify.messaging conn.connect()
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.922 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.922 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.922 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.922 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.922 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.922 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.922 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.922 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.922 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.922 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.922 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.922 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.922 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.922 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.922 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.922 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.922 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:08:49 
localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.922 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.922 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.922 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.922 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.922 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.922 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.922 12 ERROR oslo_messaging.notify.messaging Oct 14 06:08:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.923 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.923 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.923 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.924 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3c0896ee-4503-4f75-b638-40f0e9fe4532', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T10:08:49.923208', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'c7a97340-a8e5-11f0-9707-fa163e99780b', 'monotonic_time': 12546.095480523, 'message_signature': '55c598df43dd7e4865659e948ca5f1f9ef847fb23b8dbc9f3778f9fa76669338'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T10:08:49.923208', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'c7a97dc2-a8e5-11f0-9707-fa163e99780b', 'monotonic_time': 12546.095480523, 'message_signature': '8bf7c61cb7caaeab65373021c537efa267aaedf104b726a448bf967025ccb780'}]}, 'timestamp': '2025-10-14 10:08:49.923787', '_unique_id': 'a4dbef540318455da5b54e00cc5e2b97'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.924 12 ERROR 
oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.924 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.924 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.924 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.924 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.924 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:08:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.924 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.924 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.924 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.924 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.924 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.924 12 ERROR oslo_messaging.notify.messaging Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.924 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 
2025-10-14 10:08:49.924 12 ERROR oslo_messaging.notify.messaging Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.924 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.924 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.924 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.924 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.924 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.924 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.924 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.924 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.924 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.924 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.924 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.924 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.924 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.924 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.924 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.924 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.924 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.924 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:08:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.924 12 ERROR oslo_messaging.notify.messaging Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.925 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.925 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.925 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.write.bytes volume: 454656 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.925 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.926 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'f6160456-90fd-4db6-bb21-f41e277677aa', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 454656, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T10:08:49.925289', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'c7a9c5ac-a8e5-11f0-9707-fa163e99780b', 'monotonic_time': 12546.020256068, 'message_signature': '43e28c1c38c182a4aa8a9745f723399f9f37997e59e7e9bc03d9666345929a90'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T10:08:49.925289', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'c7a9d268-a8e5-11f0-9707-fa163e99780b', 'monotonic_time': 12546.020256068, 'message_signature': '00c1c5ca8d593d5af530d8d6cbcb7de5bf731c3fefddf6292d6faaa565352671'}]}, 'timestamp': '2025-10-14 10:08:49.925935', '_unique_id': 'e0ea303518534a63842c3569414492b3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.926 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.926 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.926 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.926 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.926 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.926 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.926 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.926 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.926 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 
06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.926 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.926 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.926 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.926 12 ERROR oslo_messaging.notify.messaging Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.926 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.926 12 ERROR oslo_messaging.notify.messaging Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.926 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.926 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:08:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.926 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.926 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.926 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.926 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.926 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.926 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.926 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.926 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.926 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.926 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.926 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.926 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.926 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.926 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.926 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.927 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.927 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.928 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c9045f10-58ee-4b86-bffc-43310f555044', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T10:08:49.927333', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': 'c7aa157a-a8e5-11f0-9707-fa163e99780b', 'monotonic_time': 12546.012655655, 'message_signature': '7e5913ddd379996e21816d44ebe2f7b21b2786ecb6cb84eb2e42b862afa565d6'}]}, 'timestamp': '2025-10-14 10:08:49.927663', '_unique_id': 'cf4581af907c4682b90c7129783fdd75'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.928 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.928 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.928 12 ERROR oslo_messaging.notify.messaging yield
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.928 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.928 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.928 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.928 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.928 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.928 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.928 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.928 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.928 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.928 12 ERROR oslo_messaging.notify.messaging conn.connect()
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.928 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.928 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.928 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.928 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.928 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.928 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.928 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.928 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.928 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.928 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.928 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.928 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.928 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.928 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.928 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.928 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.928 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.928 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.928 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.928 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.928 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.928 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.928 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.928 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.928 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.928 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.928 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.928 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.928 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.928 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.928 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.928 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.928 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.928 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.928 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.928 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.928 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.928 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.928 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.928 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.928 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.928 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.929 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.929 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.930 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '37d4d3e0-7441-430d-824f-8f523d0553ab', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T10:08:49.929060', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'c7aa5760-a8e5-11f0-9707-fa163e99780b', 'monotonic_time': 12546.095480523, 'message_signature': 'f77ca230dfd224290826dd743ca7f20b2435c65ceb505b615947a1242a34a220'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T10:08:49.929060', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'c7aa62fa-a8e5-11f0-9707-fa163e99780b', 'monotonic_time': 12546.095480523, 'message_signature': 'f1da9882678e4777ff5408ee5e22af3cc27dac473ffbfed57bd5d9417f735253'}]}, 'timestamp': '2025-10-14 10:08:49.929653', '_unique_id': 'f9e334ce12b447ceaa9aae81200039a9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.930 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.930 12 ERROR oslo_messaging.notify.messaging yield
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.930 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.930 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.930 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.930 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.930 12 ERROR oslo_messaging.notify.messaging conn.connect()
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.930 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.930 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.930 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.930 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.930 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.930 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.930 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.930 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.930 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.930 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.930 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.930 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.930 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.930 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.930 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.930 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.930 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.930 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.930 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.930 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.930 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.930 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.930 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 06:08:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:08:49.930 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:08:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad.
Oct 14 06:08:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec.
Oct 14 06:08:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29.
Oct 14 06:08:50 localhost podman[321475]: 2025-10-14 10:08:50.756889001 +0000 UTC m=+0.084975885 container health_status fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, container_name=iscsid, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, tcib_managed=true, config_id=iscsid)
Oct 14 06:08:50 localhost podman[321473]: 2025-10-14 10:08:50.805991125 +0000 UTC m=+0.143619681 container health_status 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=multipathd, container_name=multipathd)
Oct 14 06:08:50 localhost podman[321473]: 2025-10-14 10:08:50.815703133 +0000 UTC m=+0.153331709 container exec_died 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d)
Oct 14 06:08:50 localhost systemd[1]: 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad.service: Deactivated successfully.
Oct 14 06:08:50 localhost podman[321474]: 2025-10-14 10:08:50.727179821 +0000 UTC m=+0.062777844 container health_status aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible)
Oct 14 06:08:50 localhost podman[321475]: 2025-10-14 10:08:50.842250276 +0000 UTC m=+0.170337180 container exec_died fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, container_name=iscsid, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, managed_by=edpm_ansible)
Oct 14 06:08:50 localhost systemd[1]: fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29.service: Deactivated successfully.
Oct 14 06:08:50 localhost podman[321474]: 2025-10-14 10:08:50.857616427 +0000 UTC m=+0.193214540 container exec_died aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter) Oct 14 06:08:50 localhost systemd[1]: aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec.service: Deactivated successfully. 
Oct 14 06:08:51 localhost nova_compute[297686]: 2025-10-14 10:08:51.071 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:08:53 localhost nova_compute[297686]: 2025-10-14 10:08:53.692 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:08:54 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e84 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Oct 14 06:08:56 localhost nova_compute[297686]: 2025-10-14 10:08:56.120 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:08:57 localhost ovn_metadata_agent[163050]: 2025-10-14 10:08:57.778 163055 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Oct 14 06:08:57 localhost ovn_metadata_agent[163050]: 2025-10-14 10:08:57.778 163055 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Oct 14 06:08:57 localhost ovn_metadata_agent[163050]: 2025-10-14 10:08:57.779 163055 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Oct 14 06:08:58 localhost podman[248187]: time="2025-10-14T10:08:58Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Oct 14 06:08:58 localhost podman[248187]: @ - 
- [14/Oct/2025:10:08:58 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 147497 "" "Go-http-client/1.1" Oct 14 06:08:58 localhost podman[248187]: @ - - [14/Oct/2025:10:08:58 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19847 "" "Go-http-client/1.1" Oct 14 06:08:58 localhost nova_compute[297686]: 2025-10-14 10:08:58.738 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:08:58 localhost sshd[321534]: main: sshd: ssh-rsa algorithm is disabled Oct 14 06:08:59 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e84 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Oct 14 06:09:01 localhost nova_compute[297686]: 2025-10-14 10:09:01.174 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:09:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f. Oct 14 06:09:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917. Oct 14 06:09:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d. 
Oct 14 06:09:01 localhost podman[321536]: 2025-10-14 10:09:01.746498296 +0000 UTC m=+0.085502881 container health_status 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.build-date=20251009) Oct 14 06:09:01 localhost podman[321537]: 2025-10-14 10:09:01.799237791 +0000 UTC m=+0.134078338 container health_status 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, 
vendor=Red Hat, Inc., io.buildah.version=1.33.7, io.openshift.expose-services=, version=9.6, config_id=edpm, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-08-20T13:12:41, vcs-type=git, release=1755695350, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9) Oct 14 06:09:01 localhost podman[321537]: 2025-10-14 10:09:01.814028545 +0000 UTC m=+0.148869092 container exec_died 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, release=1755695350, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 
'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, vcs-type=git, build-date=2025-08-20T13:12:41, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.) Oct 14 06:09:01 localhost systemd[1]: 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917.service: Deactivated successfully. 
Oct 14 06:09:01 localhost podman[321536]: 2025-10-14 10:09:01.835180042 +0000 UTC m=+0.174184687 container exec_died 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.build-date=20251009) Oct 14 06:09:01 localhost systemd[1]: 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f.service: Deactivated successfully. 
Oct 14 06:09:01 localhost podman[321538]: 2025-10-14 10:09:01.90725564 +0000 UTC m=+0.239672033 container health_status 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.vendor=CentOS, config_id=edpm, org.label-schema.build-date=20251009, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3) Oct 14 06:09:01 localhost podman[321538]: 2025-10-14 10:09:01.921147286 +0000 UTC m=+0.253563699 container exec_died 
89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.license=GPLv2) Oct 14 06:09:01 localhost systemd[1]: 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d.service: Deactivated successfully. 
Oct 14 06:09:03 localhost nova_compute[297686]: 2025-10-14 10:09:03.768 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:09:04 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e84 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Oct 14 06:09:06 localhost nova_compute[297686]: 2025-10-14 10:09:06.240 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:09:08 localhost openstack_network_exporter[250374]: ERROR 10:09:08 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Oct 14 06:09:08 localhost openstack_network_exporter[250374]: ERROR 10:09:08 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Oct 14 06:09:08 localhost openstack_network_exporter[250374]: ERROR 10:09:08 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Oct 14 06:09:08 localhost openstack_network_exporter[250374]: ERROR 10:09:08 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Oct 14 06:09:08 localhost openstack_network_exporter[250374]: Oct 14 06:09:08 localhost openstack_network_exporter[250374]: ERROR 10:09:08 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Oct 14 06:09:08 localhost openstack_network_exporter[250374]: Oct 14 06:09:08 localhost nova_compute[297686]: 2025-10-14 10:09:08.811 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:09:09 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e84 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Oct 14 06:09:11 localhost 
nova_compute[297686]: 2025-10-14 10:09:11.279 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:09:13 localhost nova_compute[297686]: 2025-10-14 10:09:13.855 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:09:14 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e84 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Oct 14 06:09:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041. Oct 14 06:09:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857. Oct 14 06:09:15 localhost podman[321600]: 2025-10-14 10:09:15.740843837 +0000 UTC m=+0.079163356 container health_status 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5) Oct 14 06:09:15 localhost podman[321600]: 2025-10-14 10:09:15.775522269 +0000 UTC m=+0.113841808 container exec_died 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20251009, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5) Oct 14 06:09:15 localhost systemd[1]: 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857.service: Deactivated successfully. Oct 14 06:09:15 localhost podman[321599]: 2025-10-14 10:09:15.780257365 +0000 UTC m=+0.120833023 container health_status 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Oct 14 06:09:15 localhost podman[321599]: 2025-10-14 10:09:15.860139692 +0000 UTC m=+0.200715370 
container exec_died 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Oct 14 06:09:15 localhost systemd[1]: 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041.service: Deactivated successfully. 
Oct 14 06:09:16 localhost nova_compute[297686]: 2025-10-14 10:09:16.311 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:09:17 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Oct 14 06:09:17 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' Oct 14 06:09:18 localhost nova_compute[297686]: 2025-10-14 10:09:18.858 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:09:19 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e84 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Oct 14 06:09:19 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' Oct 14 06:09:19 localhost nova_compute[297686]: 2025-10-14 10:09:19.596 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:09:19 localhost nova_compute[297686]: 2025-10-14 10:09:19.597 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:09:21 localhost nova_compute[297686]: 2025-10-14 10:09:21.356 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:09:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run 
02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad. Oct 14 06:09:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec. Oct 14 06:09:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29. Oct 14 06:09:21 localhost podman[321726]: 2025-10-14 10:09:21.734165596 +0000 UTC m=+0.076284838 container health_status 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, 
managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true) Oct 14 06:09:21 localhost podman[321727]: 2025-10-14 10:09:21.797588239 +0000 UTC m=+0.134533442 container health_status aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Oct 14 06:09:21 localhost podman[321727]: 2025-10-14 10:09:21.80902952 +0000 UTC m=+0.145974723 container exec_died aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec 
(image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm) Oct 14 06:09:21 localhost podman[321726]: 2025-10-14 10:09:21.819111519 +0000 UTC m=+0.161230721 container exec_died 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 
'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd) Oct 14 06:09:21 localhost systemd[1]: aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec.service: Deactivated successfully. Oct 14 06:09:21 localhost systemd[1]: 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad.service: Deactivated successfully. 
Oct 14 06:09:21 localhost podman[321728]: 2025-10-14 10:09:21.764190016 +0000 UTC m=+0.096938890 container health_status fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, container_name=iscsid, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true) Oct 14 06:09:21 localhost podman[321728]: 2025-10-14 10:09:21.901304927 +0000 UTC m=+0.234053861 container exec_died fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 
(image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_id=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, org.label-schema.build-date=20251009, container_name=iscsid, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team) Oct 14 06:09:21 localhost systemd[1]: fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29.service: Deactivated successfully. 
Oct 14 06:09:22 localhost nova_compute[297686]: 2025-10-14 10:09:22.255 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:09:22 localhost nova_compute[297686]: 2025-10-14 10:09:22.255 2 DEBUG nova.compute.manager [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Oct 14 06:09:23 localhost nova_compute[297686]: 2025-10-14 10:09:23.256 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:09:23 localhost nova_compute[297686]: 2025-10-14 10:09:23.256 2 DEBUG nova.compute.manager [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Oct 14 06:09:23 localhost nova_compute[297686]: 2025-10-14 10:09:23.257 2 DEBUG nova.compute.manager [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Oct 14 06:09:23 localhost nova_compute[297686]: 2025-10-14 10:09:23.406 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Acquiring lock "refresh_cache-88c4e366-b765-47a6-96bf-f7677f2ce67c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Oct 14 06:09:23 localhost nova_compute[297686]: 2025-10-14 10:09:23.407 2 DEBUG oslo_concurrency.lockutils [None 
req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Acquired lock "refresh_cache-88c4e366-b765-47a6-96bf-f7677f2ce67c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Oct 14 06:09:23 localhost nova_compute[297686]: 2025-10-14 10:09:23.407 2 DEBUG nova.network.neutron [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] [instance: 88c4e366-b765-47a6-96bf-f7677f2ce67c] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Oct 14 06:09:23 localhost nova_compute[297686]: 2025-10-14 10:09:23.408 2 DEBUG nova.objects.instance [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Lazy-loading 'info_cache' on Instance uuid 88c4e366-b765-47a6-96bf-f7677f2ce67c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Oct 14 06:09:23 localhost nova_compute[297686]: 2025-10-14 10:09:23.768 2 DEBUG nova.network.neutron [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] [instance: 88c4e366-b765-47a6-96bf-f7677f2ce67c] Updating instance_info_cache with network_info: [{"id": "3ec9b060-f43d-4698-9c76-6062c70911d5", "address": "fa:16:3e:84:5e:e5", "network": {"id": "7d0cd696-bdd7-4e70-9512-eb0d23640314", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.46", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "41187b090f3d4818a32baa37ce8a3991", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": 
"tap3ec9b060-f4", "ovs_interfaceid": "3ec9b060-f43d-4698-9c76-6062c70911d5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Oct 14 06:09:23 localhost nova_compute[297686]: 2025-10-14 10:09:23.784 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Releasing lock "refresh_cache-88c4e366-b765-47a6-96bf-f7677f2ce67c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Oct 14 06:09:23 localhost nova_compute[297686]: 2025-10-14 10:09:23.784 2 DEBUG nova.compute.manager [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] [instance: 88c4e366-b765-47a6-96bf-f7677f2ce67c] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Oct 14 06:09:23 localhost nova_compute[297686]: 2025-10-14 10:09:23.785 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:09:23 localhost nova_compute[297686]: 2025-10-14 10:09:23.902 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:09:24 localhost nova_compute[297686]: 2025-10-14 10:09:24.255 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:09:24 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e84 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 
full_alloc: 348127232 kv_alloc: 322961408 Oct 14 06:09:25 localhost nova_compute[297686]: 2025-10-14 10:09:25.255 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:09:26 localhost nova_compute[297686]: 2025-10-14 10:09:26.394 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:09:27 localhost nova_compute[297686]: 2025-10-14 10:09:27.255 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:09:27 localhost nova_compute[297686]: 2025-10-14 10:09:27.255 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:09:27 localhost nova_compute[297686]: 2025-10-14 10:09:27.273 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Oct 14 06:09:27 localhost nova_compute[297686]: 2025-10-14 10:09:27.274 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Oct 14 06:09:27 localhost nova_compute[297686]: 
2025-10-14 10:09:27.274 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Oct 14 06:09:27 localhost nova_compute[297686]: 2025-10-14 10:09:27.274 2 DEBUG nova.compute.resource_tracker [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Auditing locally available compute resources for np0005486733.localdomain (node: np0005486733.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Oct 14 06:09:27 localhost nova_compute[297686]: 2025-10-14 10:09:27.275 2 DEBUG oslo_concurrency.processutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Oct 14 06:09:27 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Oct 14 06:09:27 localhost ceph-mon[317114]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.108:0/8682216' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Oct 14 06:09:27 localhost nova_compute[297686]: 2025-10-14 10:09:27.727 2 DEBUG oslo_concurrency.processutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.452s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Oct 14 06:09:27 localhost nova_compute[297686]: 2025-10-14 10:09:27.802 2 DEBUG nova.virt.libvirt.driver [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Oct 14 06:09:27 localhost nova_compute[297686]: 2025-10-14 10:09:27.802 2 DEBUG nova.virt.libvirt.driver [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Oct 14 06:09:28 localhost nova_compute[297686]: 2025-10-14 10:09:27.999 2 WARNING nova.virt.libvirt.driver [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Oct 14 06:09:28 localhost nova_compute[297686]: 2025-10-14 10:09:28.001 2 DEBUG nova.compute.resource_tracker [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Hypervisor/Node resource view: name=np0005486733.localdomain free_ram=11410MB free_disk=41.83695602416992GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": 
"1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Oct 14 06:09:28 localhost nova_compute[297686]: 2025-10-14 10:09:28.002 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Oct 14 06:09:28 localhost nova_compute[297686]: 2025-10-14 10:09:28.002 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Oct 14 06:09:28 localhost nova_compute[297686]: 2025-10-14 10:09:28.074 2 DEBUG nova.compute.resource_tracker [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Instance 88c4e366-b765-47a6-96bf-f7677f2ce67c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Oct 14 06:09:28 localhost nova_compute[297686]: 2025-10-14 10:09:28.074 2 DEBUG nova.compute.resource_tracker [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Oct 14 06:09:28 localhost nova_compute[297686]: 2025-10-14 10:09:28.075 2 DEBUG nova.compute.resource_tracker [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Final resource view: name=np0005486733.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Oct 14 06:09:28 localhost nova_compute[297686]: 2025-10-14 10:09:28.122 2 DEBUG oslo_concurrency.processutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Oct 14 06:09:28 localhost podman[248187]: time="2025-10-14T10:09:28Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Oct 14 06:09:28 localhost podman[248187]: @ - - [14/Oct/2025:10:09:28 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 147497 "" "Go-http-client/1.1" Oct 14 06:09:28 localhost podman[248187]: @ - - [14/Oct/2025:10:09:28 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19854 "" "Go-http-client/1.1" Oct 14 06:09:28 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Oct 14 06:09:28 localhost ceph-mon[317114]: log_channel(audit) log 
[DBG] : from='client.? 172.18.0.108:0/508173685' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Oct 14 06:09:28 localhost nova_compute[297686]: 2025-10-14 10:09:28.615 2 DEBUG oslo_concurrency.processutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.493s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Oct 14 06:09:28 localhost nova_compute[297686]: 2025-10-14 10:09:28.621 2 DEBUG nova.compute.provider_tree [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Inventory has not changed in ProviderTree for provider: 18c24273-aca2-4f08-be57-3188d558235e update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Oct 14 06:09:28 localhost nova_compute[297686]: 2025-10-14 10:09:28.635 2 DEBUG nova.scheduler.client.report [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Inventory has not changed for provider 18c24273-aca2-4f08-be57-3188d558235e based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Oct 14 06:09:28 localhost nova_compute[297686]: 2025-10-14 10:09:28.637 2 DEBUG nova.compute.resource_tracker [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Compute_service record updated for np0005486733.localdomain:np0005486733.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Oct 14 06:09:28 localhost nova_compute[297686]: 2025-10-14 10:09:28.637 2 DEBUG 
oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.635s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Oct 14 06:09:28 localhost nova_compute[297686]: 2025-10-14 10:09:28.944 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:09:29 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e84 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Oct 14 06:09:31 localhost nova_compute[297686]: 2025-10-14 10:09:31.398 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:09:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f. Oct 14 06:09:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917. Oct 14 06:09:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d. 
Oct 14 06:09:32 localhost podman[321832]: 2025-10-14 10:09:32.76854431 +0000 UTC m=+0.104181761 container health_status 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.openshift.tags=minimal rhel9, distribution-scope=public, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm, io.openshift.expose-services=, release=1755695350) Oct 14 06:09:32 localhost podman[321832]: 2025-10-14 10:09:32.779214358 +0000 UTC m=+0.114851759 container exec_died 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, release=1755695350, config_id=edpm, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., vcs-type=git, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.) Oct 14 06:09:32 localhost systemd[1]: 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917.service: Deactivated successfully. 
Oct 14 06:09:32 localhost podman[321831]: 2025-10-14 10:09:32.865836319 +0000 UTC m=+0.205457022 container health_status 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.build-date=20251009, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}) Oct 14 06:09:32 localhost podman[321833]: 2025-10-14 10:09:32.830409781 +0000 UTC m=+0.163318788 container health_status 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, 
managed_by=edpm_ansible, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251009, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute) Oct 14 06:09:32 localhost podman[321831]: 2025-10-14 10:09:32.90392082 +0000 UTC m=+0.243541553 container exec_died 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': 
True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, container_name=ovn_controller, config_id=ovn_controller, io.buildah.version=1.41.3) Oct 14 06:09:32 localhost podman[321833]: 2025-10-14 10:09:32.918127956 +0000 UTC m=+0.251037003 container exec_died 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251009) Oct 14 06:09:32 localhost systemd[1]: 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f.service: Deactivated successfully. Oct 14 06:09:32 localhost systemd[1]: 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d.service: Deactivated successfully. Oct 14 06:09:33 localhost nova_compute[297686]: 2025-10-14 10:09:33.946 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:09:34 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e84 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Oct 14 06:09:34 localhost ovn_metadata_agent[163050]: 2025-10-14 10:09:34.552 163055 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=7, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'b6:6b:50', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '6a:59:81:01:bc:8b'}, ipsec=False) old=SB_Global(nb_cfg=6) matches 
/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Oct 14 06:09:34 localhost nova_compute[297686]: 2025-10-14 10:09:34.552 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:09:34 localhost ovn_metadata_agent[163050]: 2025-10-14 10:09:34.553 163055 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m Oct 14 06:09:36 localhost nova_compute[297686]: 2025-10-14 10:09:36.437 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:09:38 localhost openstack_network_exporter[250374]: ERROR 10:09:38 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Oct 14 06:09:38 localhost openstack_network_exporter[250374]: ERROR 10:09:38 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Oct 14 06:09:38 localhost openstack_network_exporter[250374]: ERROR 10:09:38 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Oct 14 06:09:38 localhost openstack_network_exporter[250374]: ERROR 10:09:38 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Oct 14 06:09:38 localhost openstack_network_exporter[250374]: Oct 14 06:09:38 localhost openstack_network_exporter[250374]: ERROR 10:09:38 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Oct 14 06:09:38 localhost openstack_network_exporter[250374]: Oct 14 06:09:38 localhost nova_compute[297686]: 2025-10-14 10:09:38.946 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:09:39 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd 
e84 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Oct 14 06:09:41 localhost nova_compute[297686]: 2025-10-14 10:09:41.463 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:09:41 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e85 e85: 6 total, 6 up, 6 in Oct 14 06:09:43 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e86 e86: 6 total, 6 up, 6 in Oct 14 06:09:43 localhost ovn_metadata_agent[163050]: 2025-10-14 10:09:43.555 163055 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=9e4b0f79-1220-4c7d-a18d-fa1a88dab362, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '7'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Oct 14 06:09:43 localhost nova_compute[297686]: 2025-10-14 10:09:43.948 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:09:44 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e86 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Oct 14 06:09:46 localhost nova_compute[297686]: 2025-10-14 10:09:46.466 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:09:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041. Oct 14 06:09:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857. Oct 14 06:09:46 localhost systemd[1]: tmp-crun.LlAGa2.mount: Deactivated successfully. 
Oct 14 06:09:46 localhost podman[321897]: 2025-10-14 10:09:46.760715202 +0000 UTC m=+0.096446654 container health_status 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible) Oct 14 06:09:46 localhost podman[321896]: 2025-10-14 10:09:46.80198356 +0000 UTC 
m=+0.139066884 container health_status 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter) Oct 14 06:09:46 localhost podman[321896]: 2025-10-14 10:09:46.835413147 +0000 UTC m=+0.172496451 container exec_died 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, 
config_id=edpm, container_name=podman_exporter) Oct 14 06:09:46 localhost systemd[1]: 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041.service: Deactivated successfully. Oct 14 06:09:46 localhost podman[321897]: 2025-10-14 10:09:46.891550551 +0000 UTC m=+0.227282003 container exec_died 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, 
org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image) Oct 14 06:09:46 localhost systemd[1]: 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857.service: Deactivated successfully. Oct 14 06:09:48 localhost nova_compute[297686]: 2025-10-14 10:09:48.951 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:09:49 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e86 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Oct 14 06:09:51 localhost nova_compute[297686]: 2025-10-14 10:09:51.506 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:09:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad. Oct 14 06:09:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec. Oct 14 06:09:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29. Oct 14 06:09:52 localhost systemd[1]: tmp-crun.em1gtw.mount: Deactivated successfully. 
Oct 14 06:09:52 localhost podman[321939]: 2025-10-14 10:09:52.737430463 +0000 UTC m=+0.072072075 container health_status aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter) Oct 14 06:09:52 localhost podman[321940]: 2025-10-14 10:09:52.744608183 +0000 UTC m=+0.071631381 container health_status fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, tcib_managed=true, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, container_name=iscsid, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, 
org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0) Oct 14 06:09:52 localhost podman[321939]: 2025-10-14 10:09:52.752188756 +0000 UTC m=+0.086830368 container exec_died aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', 
'--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm) Oct 14 06:09:52 localhost systemd[1]: aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec.service: Deactivated successfully. Oct 14 06:09:52 localhost podman[321940]: 2025-10-14 10:09:52.784134177 +0000 UTC m=+0.111157455 container exec_died fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, org.label-schema.vendor=CentOS, config_id=iscsid, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d) Oct 14 06:09:52 localhost podman[321938]: 2025-10-14 10:09:52.844669497 +0000 UTC m=+0.178579367 container health_status 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS) Oct 14 06:09:52 localhost podman[321938]: 2025-10-14 10:09:52.858042898 +0000 UTC m=+0.191952768 container exec_died 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', 
'/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251009, config_id=multipathd, org.label-schema.schema-version=1.0) Oct 14 06:09:52 localhost systemd[1]: 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad.service: Deactivated successfully. Oct 14 06:09:52 localhost systemd[1]: fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29.service: Deactivated successfully. Oct 14 06:09:53 localhost nova_compute[297686]: 2025-10-14 10:09:53.981 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:09:54 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e86 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Oct 14 06:09:56 localhost nova_compute[297686]: 2025-10-14 10:09:56.544 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:09:57 localhost ovn_metadata_agent[163050]: 2025-10-14 10:09:57.779 163055 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Oct 14 06:09:57 localhost ovn_metadata_agent[163050]: 2025-10-14 10:09:57.780 163055 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Oct 14 06:09:57 localhost ovn_metadata_agent[163050]: 2025-10-14 10:09:57.780 163055 DEBUG 
oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Oct 14 06:09:58 localhost podman[248187]: time="2025-10-14T10:09:58Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Oct 14 06:09:58 localhost podman[248187]: @ - - [14/Oct/2025:10:09:58 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 147497 "" "Go-http-client/1.1" Oct 14 06:09:58 localhost podman[248187]: @ - - [14/Oct/2025:10:09:58 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19855 "" "Go-http-client/1.1" Oct 14 06:09:59 localhost nova_compute[297686]: 2025-10-14 10:09:59.009 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:09:59 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e86 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Oct 14 06:10:00 localhost ceph-mon[317114]: overall HEALTH_OK Oct 14 06:10:00 localhost ceph-mon[317114]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #16. Immutable memtables: 0. 
Oct 14 06:10:00 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:10:00.453054) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0 Oct 14 06:10:00 localhost ceph-mon[317114]: rocksdb: [db/flush_job.cc:856] [default] [JOB 5] Flushing memtable with next log file: 16 Oct 14 06:10:00 localhost ceph-mon[317114]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760436600453097, "job": 5, "event": "flush_started", "num_memtables": 1, "num_entries": 2498, "num_deletes": 255, "total_data_size": 5761757, "memory_usage": 6041920, "flush_reason": "Manual Compaction"} Oct 14 06:10:00 localhost ceph-mon[317114]: rocksdb: [db/flush_job.cc:885] [default] [JOB 5] Level-0 flush table #17: started Oct 14 06:10:00 localhost ceph-mon[317114]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760436600472335, "cf_name": "default", "job": 5, "event": "table_file_creation", "file_number": 17, "file_size": 3712266, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 13537, "largest_seqno": 16030, "table_properties": {"data_size": 3703076, "index_size": 5695, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2437, "raw_key_size": 20778, "raw_average_key_size": 21, "raw_value_size": 3683929, "raw_average_value_size": 3790, "num_data_blocks": 244, "num_entries": 972, "num_filter_entries": 972, "num_deletions": 254, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; 
max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1760436434, "oldest_key_time": 1760436434, "file_creation_time": 1760436600, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "c006d4a7-7ae8-4cd3-a1b5-95f5bf3c427c", "db_session_id": "ZS6W3A0Q266OBGAU6ARP", "orig_file_number": 17, "seqno_to_time_mapping": "N/A"}} Oct 14 06:10:00 localhost ceph-mon[317114]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 5] Flush lasted 19351 microseconds, and 6658 cpu microseconds. Oct 14 06:10:00 localhost ceph-mon[317114]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. Oct 14 06:10:00 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:10:00.472398) [db/flush_job.cc:967] [default] [JOB 5] Level-0 flush table #17: 3712266 bytes OK Oct 14 06:10:00 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:10:00.472425) [db/memtable_list.cc:519] [default] Level-0 commit table #17 started Oct 14 06:10:00 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:10:00.474498) [db/memtable_list.cc:722] [default] Level-0 commit table #17: memtable #1 done Oct 14 06:10:00 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:10:00.474525) EVENT_LOG_v1 {"time_micros": 1760436600474518, "job": 5, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0} Oct 14 06:10:00 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:10:00.474549) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25 Oct 14 06:10:00 localhost ceph-mon[317114]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 5] Try to delete WAL files size 5750472, prev total WAL file size 
5750472, number of live WAL files 2. Oct 14 06:10:00 localhost ceph-mon[317114]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005486733/store.db/000013.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Oct 14 06:10:00 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:10:00.475802) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003131373937' seq:72057594037927935, type:22 .. '7061786F73003132303439' seq:0, type:0; will stop at (end) Oct 14 06:10:00 localhost ceph-mon[317114]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 6] Compacting 1@0 + 1@6 files to L6, score -1.00 Oct 14 06:10:00 localhost ceph-mon[317114]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 5 Base level 0, inputs: [17(3625KB)], [15(17MB)] Oct 14 06:10:00 localhost ceph-mon[317114]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760436600475859, "job": 6, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [17], "files_L6": [15], "score": -1, "input_data_size": 22539376, "oldest_snapshot_seqno": -1} Oct 14 06:10:00 localhost ceph-mon[317114]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 6] Generated table #18: 12476 keys, 19266081 bytes, temperature: kUnknown Oct 14 06:10:00 localhost ceph-mon[317114]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760436600559644, "cf_name": "default", "job": 6, "event": "table_file_creation", "file_number": 18, "file_size": 19266081, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 19194492, "index_size": 39296, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 31237, "raw_key_size": 333994, "raw_average_key_size": 26, "raw_value_size": 18981654, 
"raw_average_value_size": 1521, "num_data_blocks": 1501, "num_entries": 12476, "num_filter_entries": 12476, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1760436394, "oldest_key_time": 0, "file_creation_time": 1760436600, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "c006d4a7-7ae8-4cd3-a1b5-95f5bf3c427c", "db_session_id": "ZS6W3A0Q266OBGAU6ARP", "orig_file_number": 18, "seqno_to_time_mapping": "N/A"}} Oct 14 06:10:00 localhost ceph-mon[317114]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. 
Oct 14 06:10:00 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:10:00.560022) [db/compaction/compaction_job.cc:1663] [default] [JOB 6] Compacted 1@0 + 1@6 files to L6 => 19266081 bytes Oct 14 06:10:00 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:10:00.562063) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 268.5 rd, 229.5 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.5, 18.0 +0.0 blob) out(18.4 +0.0 blob), read-write-amplify(11.3) write-amplify(5.2) OK, records in: 13016, records dropped: 540 output_compression: NoCompression Oct 14 06:10:00 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:10:00.562094) EVENT_LOG_v1 {"time_micros": 1760436600562081, "job": 6, "event": "compaction_finished", "compaction_time_micros": 83932, "compaction_time_cpu_micros": 47703, "output_level": 6, "num_output_files": 1, "total_output_size": 19266081, "num_input_records": 13016, "num_output_records": 12476, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]} Oct 14 06:10:00 localhost ceph-mon[317114]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005486733/store.db/000017.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Oct 14 06:10:00 localhost ceph-mon[317114]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760436600562740, "job": 6, "event": "table_file_deletion", "file_number": 17} Oct 14 06:10:00 localhost ceph-mon[317114]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005486733/store.db/000015.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Oct 14 06:10:00 localhost ceph-mon[317114]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760436600565441, "job": 
6, "event": "table_file_deletion", "file_number": 15} Oct 14 06:10:00 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:10:00.475663) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Oct 14 06:10:00 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:10:00.565548) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Oct 14 06:10:00 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:10:00.565555) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Oct 14 06:10:00 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:10:00.565557) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Oct 14 06:10:00 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:10:00.565558) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Oct 14 06:10:00 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:10:00.565560) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Oct 14 06:10:01 localhost nova_compute[297686]: 2025-10-14 10:10:01.593 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:10:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f. Oct 14 06:10:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917. Oct 14 06:10:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d. 
Oct 14 06:10:03 localhost podman[321998]: 2025-10-14 10:10:03.747034724 +0000 UTC m=+0.088264533 container health_status 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, container_name=ovn_controller, org.label-schema.build-date=20251009, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team) Oct 14 06:10:03 localhost podman[321998]: 2025-10-14 10:10:03.791154239 +0000 UTC m=+0.132384048 container exec_died 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251009, 
org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller) Oct 14 06:10:03 localhost systemd[1]: 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f.service: Deactivated successfully. 
Oct 14 06:10:03 localhost podman[322000]: 2025-10-14 10:10:03.807201332 +0000 UTC m=+0.140904840 container health_status 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0) Oct 14 06:10:03 localhost podman[321999]: 2025-10-14 10:10:03.853573527 +0000 UTC m=+0.190208314 container health_status 
799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, name=ubi9-minimal, io.buildah.version=1.33.7, release=1755695350, container_name=openstack_network_exporter, architecture=x86_64, distribution-scope=public, vendor=Red Hat, Inc., config_id=edpm, io.openshift.expose-services=, vcs-type=git, version=9.6, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b) Oct 14 06:10:03 localhost podman[321999]: 2025-10-14 10:10:03.871321292 +0000 UTC m=+0.207956039 container exec_died 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, version=9.6, 
io.openshift.expose-services=, name=ubi9-minimal, vcs-type=git, architecture=x86_64, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, distribution-scope=public, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., config_id=edpm, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, 
url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter) Oct 14 06:10:03 localhost systemd[1]: 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917.service: Deactivated successfully. Oct 14 06:10:03 localhost podman[322000]: 2025-10-14 10:10:03.92073224 +0000 UTC m=+0.254435698 container exec_died 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes 
Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm) Oct 14 06:10:03 localhost systemd[1]: 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d.service: Deactivated successfully. Oct 14 06:10:04 localhost nova_compute[297686]: 2025-10-14 10:10:04.044 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:10:04 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e86 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Oct 14 06:10:06 localhost nova_compute[297686]: 2025-10-14 10:10:06.647 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:10:08 localhost ovn_controller[157396]: 2025-10-14T10:10:08Z|00066|memory_trim|INFO|Detected inactivity (last active 30006 ms ago): trimming memory Oct 14 06:10:08 localhost openstack_network_exporter[250374]: ERROR 10:10:08 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Oct 14 06:10:08 localhost openstack_network_exporter[250374]: ERROR 10:10:08 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Oct 14 06:10:08 localhost openstack_network_exporter[250374]: ERROR 10:10:08 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Oct 14 06:10:08 localhost openstack_network_exporter[250374]: ERROR 10:10:08 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Oct 14 06:10:08 localhost openstack_network_exporter[250374]: Oct 14 06:10:08 localhost openstack_network_exporter[250374]: ERROR 10:10:08 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Oct 14 06:10:08 localhost openstack_network_exporter[250374]: Oct 14 06:10:09 localhost nova_compute[297686]: 
2025-10-14 10:10:09.085 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:10:09 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e86 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Oct 14 06:10:11 localhost nova_compute[297686]: 2025-10-14 10:10:11.651 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:10:14 localhost nova_compute[297686]: 2025-10-14 10:10:14.086 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:10:14 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e86 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Oct 14 06:10:16 localhost nova_compute[297686]: 2025-10-14 10:10:16.654 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:10:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041. Oct 14 06:10:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857. 
Oct 14 06:10:17 localhost podman[322083]: 2025-10-14 10:10:17.432587744 +0000 UTC m=+0.099383104 container health_status 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_id=ovn_metadata_agent) Oct 14 06:10:17 localhost podman[322081]: 2025-10-14 10:10:17.396212527 +0000 UTC 
m=+0.068796625 container health_status 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm) Oct 14 06:10:17 localhost podman[322083]: 2025-10-14 10:10:17.461855823 +0000 UTC m=+0.128651203 container exec_died 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', 
'/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.build-date=20251009, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, io.buildah.version=1.41.3) Oct 14 06:10:17 localhost systemd[1]: 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857.service: Deactivated successfully. 
Oct 14 06:10:17 localhost podman[322081]: 2025-10-14 10:10:17.480157115 +0000 UTC m=+0.152741273 container exec_died 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm) Oct 14 06:10:17 localhost systemd[1]: 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041.service: Deactivated successfully. 
Oct 14 06:10:18 localhost nova_compute[297686]: 2025-10-14 10:10:18.156 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:10:18 localhost ovn_metadata_agent[163050]: 2025-10-14 10:10:18.155 163055 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=8, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'b6:6b:50', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '6a:59:81:01:bc:8b'}, ipsec=False) old=SB_Global(nb_cfg=7) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Oct 14 06:10:18 localhost ovn_metadata_agent[163050]: 2025-10-14 10:10:18.158 163055 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m Oct 14 06:10:18 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Oct 14 06:10:18 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' Oct 14 06:10:19 localhost nova_compute[297686]: 2025-10-14 10:10:19.127 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:10:19 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e86 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Oct 14 06:10:19 localhost ceph-mon[317114]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #19. 
Immutable memtables: 0. Oct 14 06:10:19 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:10:19.462483) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0 Oct 14 06:10:19 localhost ceph-mon[317114]: rocksdb: [db/flush_job.cc:856] [default] [JOB 7] Flushing memtable with next log file: 19 Oct 14 06:10:19 localhost ceph-mon[317114]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760436619462524, "job": 7, "event": "flush_started", "num_memtables": 1, "num_entries": 471, "num_deletes": 250, "total_data_size": 343814, "memory_usage": 354232, "flush_reason": "Manual Compaction"} Oct 14 06:10:19 localhost ceph-mon[317114]: rocksdb: [db/flush_job.cc:885] [default] [JOB 7] Level-0 flush table #20: started Oct 14 06:10:19 localhost ceph-mon[317114]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760436619466804, "cf_name": "default", "job": 7, "event": "table_file_creation", "file_number": 20, "file_size": 225050, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 16035, "largest_seqno": 16501, "table_properties": {"data_size": 222628, "index_size": 533, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 837, "raw_key_size": 5441, "raw_average_key_size": 16, "raw_value_size": 217667, "raw_average_value_size": 667, "num_data_blocks": 24, "num_entries": 326, "num_filter_entries": 326, "num_deletions": 250, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; 
strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1760436601, "oldest_key_time": 1760436601, "file_creation_time": 1760436619, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "c006d4a7-7ae8-4cd3-a1b5-95f5bf3c427c", "db_session_id": "ZS6W3A0Q266OBGAU6ARP", "orig_file_number": 20, "seqno_to_time_mapping": "N/A"}} Oct 14 06:10:19 localhost ceph-mon[317114]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 7] Flush lasted 4379 microseconds, and 1708 cpu microseconds. Oct 14 06:10:19 localhost ceph-mon[317114]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. Oct 14 06:10:19 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:10:19.466857) [db/flush_job.cc:967] [default] [JOB 7] Level-0 flush table #20: 225050 bytes OK Oct 14 06:10:19 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:10:19.466883) [db/memtable_list.cc:519] [default] Level-0 commit table #20 started Oct 14 06:10:19 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:10:19.468562) [db/memtable_list.cc:722] [default] Level-0 commit table #20: memtable #1 done Oct 14 06:10:19 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:10:19.468586) EVENT_LOG_v1 {"time_micros": 1760436619468579, "job": 7, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0} Oct 14 06:10:19 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:10:19.468611) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25 Oct 14 06:10:19 localhost ceph-mon[317114]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 7] Try to delete WAL files size 340939, prev total WAL 
file size 340939, number of live WAL files 2. Oct 14 06:10:19 localhost ceph-mon[317114]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005486733/store.db/000016.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Oct 14 06:10:19 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:10:19.469415) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6B760031353139' seq:72057594037927935, type:22 .. '6B760031373730' seq:0, type:0; will stop at (end) Oct 14 06:10:19 localhost ceph-mon[317114]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 8] Compacting 1@0 + 1@6 files to L6, score -1.00 Oct 14 06:10:19 localhost ceph-mon[317114]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 7 Base level 0, inputs: [20(219KB)], [18(18MB)] Oct 14 06:10:19 localhost ceph-mon[317114]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760436619469474, "job": 8, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [20], "files_L6": [18], "score": -1, "input_data_size": 19491131, "oldest_snapshot_seqno": -1} Oct 14 06:10:19 localhost ceph-mon[317114]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 8] Generated table #21: 12286 keys, 18451200 bytes, temperature: kUnknown Oct 14 06:10:19 localhost ceph-mon[317114]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760436619566035, "cf_name": "default", "job": 8, "event": "table_file_creation", "file_number": 21, "file_size": 18451200, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 18381950, "index_size": 37469, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 30725, "raw_key_size": 331487, "raw_average_key_size": 26, "raw_value_size": 18173337, 
"raw_average_value_size": 1479, "num_data_blocks": 1407, "num_entries": 12286, "num_filter_entries": 12286, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1760436394, "oldest_key_time": 0, "file_creation_time": 1760436619, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "c006d4a7-7ae8-4cd3-a1b5-95f5bf3c427c", "db_session_id": "ZS6W3A0Q266OBGAU6ARP", "orig_file_number": 21, "seqno_to_time_mapping": "N/A"}} Oct 14 06:10:19 localhost ceph-mon[317114]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. 
Oct 14 06:10:19 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:10:19.566487) [db/compaction/compaction_job.cc:1663] [default] [JOB 8] Compacted 1@0 + 1@6 files to L6 => 18451200 bytes Oct 14 06:10:19 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:10:19.568064) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 201.6 rd, 190.9 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.2, 18.4 +0.0 blob) out(17.6 +0.0 blob), read-write-amplify(168.6) write-amplify(82.0) OK, records in: 12802, records dropped: 516 output_compression: NoCompression Oct 14 06:10:19 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:10:19.568104) EVENT_LOG_v1 {"time_micros": 1760436619568088, "job": 8, "event": "compaction_finished", "compaction_time_micros": 96674, "compaction_time_cpu_micros": 55716, "output_level": 6, "num_output_files": 1, "total_output_size": 18451200, "num_input_records": 12802, "num_output_records": 12286, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]} Oct 14 06:10:19 localhost ceph-mon[317114]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005486733/store.db/000020.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Oct 14 06:10:19 localhost ceph-mon[317114]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760436619568401, "job": 8, "event": "table_file_deletion", "file_number": 20} Oct 14 06:10:19 localhost ceph-mon[317114]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005486733/store.db/000018.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Oct 14 06:10:19 localhost ceph-mon[317114]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760436619571345, 
"job": 8, "event": "table_file_deletion", "file_number": 18} Oct 14 06:10:19 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:10:19.469297) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Oct 14 06:10:19 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:10:19.571482) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Oct 14 06:10:19 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:10:19.571488) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Oct 14 06:10:19 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:10:19.571490) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Oct 14 06:10:19 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:10:19.571492) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Oct 14 06:10:19 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:10:19.571494) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Oct 14 06:10:20 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' Oct 14 06:10:21 localhost nova_compute[297686]: 2025-10-14 10:10:21.634 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:10:21 localhost nova_compute[297686]: 2025-10-14 10:10:21.635 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:10:21 localhost nova_compute[297686]: 2025-10-14 10:10:21.694 2 
DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:10:22 localhost nova_compute[297686]: 2025-10-14 10:10:22.255 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:10:22 localhost nova_compute[297686]: 2025-10-14 10:10:22.256 2 DEBUG nova.compute.manager [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Oct 14 06:10:23 localhost nova_compute[297686]: 2025-10-14 10:10:23.255 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:10:23 localhost nova_compute[297686]: 2025-10-14 10:10:23.256 2 DEBUG nova.compute.manager [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Oct 14 06:10:23 localhost nova_compute[297686]: 2025-10-14 10:10:23.256 2 DEBUG nova.compute.manager [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Oct 14 06:10:23 localhost nova_compute[297686]: 2025-10-14 10:10:23.345 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Acquiring lock "refresh_cache-88c4e366-b765-47a6-96bf-f7677f2ce67c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Oct 14 
06:10:23 localhost nova_compute[297686]: 2025-10-14 10:10:23.345 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Acquired lock "refresh_cache-88c4e366-b765-47a6-96bf-f7677f2ce67c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Oct 14 06:10:23 localhost nova_compute[297686]: 2025-10-14 10:10:23.346 2 DEBUG nova.network.neutron [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] [instance: 88c4e366-b765-47a6-96bf-f7677f2ce67c] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Oct 14 06:10:23 localhost nova_compute[297686]: 2025-10-14 10:10:23.346 2 DEBUG nova.objects.instance [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Lazy-loading 'info_cache' on Instance uuid 88c4e366-b765-47a6-96bf-f7677f2ce67c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Oct 14 06:10:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad. Oct 14 06:10:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec. Oct 14 06:10:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29. 
Oct 14 06:10:23 localhost podman[322188]: 2025-10-14 10:10:23.765655661 +0000 UTC m=+0.089360546 container health_status aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter) Oct 14 06:10:23 localhost podman[322188]: 2025-10-14 10:10:23.772004966 +0000 UTC m=+0.095709861 container exec_died aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 
'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Oct 14 06:10:23 localhost systemd[1]: aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec.service: Deactivated successfully. 
Oct 14 06:10:23 localhost podman[322187]: 2025-10-14 10:10:23.81055328 +0000 UTC m=+0.136173724 container health_status 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=multipathd, org.label-schema.schema-version=1.0) Oct 14 06:10:23 localhost podman[322187]: 2025-10-14 10:10:23.817520734 +0000 UTC m=+0.143141188 container exec_died 
02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd) Oct 14 06:10:23 localhost systemd[1]: 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad.service: Deactivated successfully. 
Oct 14 06:10:23 localhost nova_compute[297686]: 2025-10-14 10:10:23.854 2 DEBUG nova.network.neutron [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] [instance: 88c4e366-b765-47a6-96bf-f7677f2ce67c] Updating instance_info_cache with network_info: [{"id": "3ec9b060-f43d-4698-9c76-6062c70911d5", "address": "fa:16:3e:84:5e:e5", "network": {"id": "7d0cd696-bdd7-4e70-9512-eb0d23640314", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.46", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "41187b090f3d4818a32baa37ce8a3991", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3ec9b060-f4", "ovs_interfaceid": "3ec9b060-f43d-4698-9c76-6062c70911d5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Oct 14 06:10:23 localhost podman[322189]: 2025-10-14 10:10:23.870842522 +0000 UTC m=+0.188476061 container health_status fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': 
'/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, container_name=iscsid) Oct 14 06:10:23 localhost nova_compute[297686]: 2025-10-14 10:10:23.872 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Releasing lock "refresh_cache-88c4e366-b765-47a6-96bf-f7677f2ce67c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Oct 14 06:10:23 localhost nova_compute[297686]: 2025-10-14 10:10:23.873 2 DEBUG nova.compute.manager [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] [instance: 88c4e366-b765-47a6-96bf-f7677f2ce67c] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Oct 14 06:10:23 localhost podman[322189]: 2025-10-14 10:10:23.877531467 +0000 UTC m=+0.195164996 container exec_died 
fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, config_id=iscsid, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, io.buildah.version=1.41.3, org.label-schema.build-date=20251009) Oct 14 06:10:23 localhost systemd[1]: fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29.service: Deactivated successfully. 
Oct 14 06:10:24 localhost nova_compute[297686]: 2025-10-14 10:10:24.166 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:10:24 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e86 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Oct 14 06:10:25 localhost nova_compute[297686]: 2025-10-14 10:10:25.254 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:10:25 localhost nova_compute[297686]: 2025-10-14 10:10:25.255 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:10:25 localhost nova_compute[297686]: 2025-10-14 10:10:25.255 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:10:26 localhost nova_compute[297686]: 2025-10-14 10:10:26.251 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:10:26 localhost nova_compute[297686]: 2025-10-14 10:10:26.734 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:10:27 localhost ovn_metadata_agent[163050]: 2025-10-14 10:10:27.161 163055 DEBUG 
ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=9e4b0f79-1220-4c7d-a18d-fa1a88dab362, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '8'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Oct 14 06:10:27 localhost nova_compute[297686]: 2025-10-14 10:10:27.255 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:10:27 localhost nova_compute[297686]: 2025-10-14 10:10:27.275 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Oct 14 06:10:27 localhost nova_compute[297686]: 2025-10-14 10:10:27.276 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Oct 14 06:10:27 localhost nova_compute[297686]: 2025-10-14 10:10:27.276 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Oct 14 06:10:27 localhost nova_compute[297686]: 2025-10-14 10:10:27.276 2 DEBUG nova.compute.resource_tracker [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Auditing locally available compute resources for 
np0005486733.localdomain (node: np0005486733.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Oct 14 06:10:27 localhost nova_compute[297686]: 2025-10-14 10:10:27.277 2 DEBUG oslo_concurrency.processutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Oct 14 06:10:27 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Oct 14 06:10:27 localhost ceph-mon[317114]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/3477406920' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Oct 14 06:10:27 localhost nova_compute[297686]: 2025-10-14 10:10:27.693 2 DEBUG oslo_concurrency.processutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.416s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Oct 14 06:10:27 localhost nova_compute[297686]: 2025-10-14 10:10:27.759 2 DEBUG nova.virt.libvirt.driver [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Oct 14 06:10:27 localhost nova_compute[297686]: 2025-10-14 10:10:27.760 2 DEBUG nova.virt.libvirt.driver [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Oct 14 06:10:27 localhost nova_compute[297686]: 2025-10-14 10:10:27.981 2 WARNING nova.virt.libvirt.driver [None 
req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m Oct 14 06:10:27 localhost nova_compute[297686]: 2025-10-14 10:10:27.983 2 DEBUG nova.compute.resource_tracker [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Hypervisor/Node resource view: name=np0005486733.localdomain free_ram=11406MB free_disk=41.83695602416992GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Oct 14 06:10:27 localhost nova_compute[297686]: 2025-10-14 10:10:27.984 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Oct 14 06:10:27 localhost nova_compute[297686]: 2025-10-14 10:10:27.984 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Oct 14 06:10:28 localhost nova_compute[297686]: 2025-10-14 10:10:28.090 2 DEBUG nova.compute.resource_tracker [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Instance 88c4e366-b765-47a6-96bf-f7677f2ce67c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Oct 14 06:10:28 localhost nova_compute[297686]: 2025-10-14 10:10:28.090 2 DEBUG nova.compute.resource_tracker [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Oct 14 06:10:28 localhost nova_compute[297686]: 2025-10-14 10:10:28.091 2 DEBUG nova.compute.resource_tracker [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Final resource view: name=np0005486733.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Oct 14 06:10:28 localhost nova_compute[297686]: 2025-10-14 10:10:28.126 2 DEBUG oslo_concurrency.processutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Oct 14 06:10:28 localhost podman[248187]: time="2025-10-14T10:10:28Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Oct 14 06:10:28 localhost podman[248187]: @ - - [14/Oct/2025:10:10:28 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 147497 "" "Go-http-client/1.1" Oct 14 06:10:28 localhost podman[248187]: @ - - [14/Oct/2025:10:10:28 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19859 "" "Go-http-client/1.1" Oct 14 06:10:28 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:10:28.574 271987 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], 
binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-10-14T10:10:28Z, description=, device_id=3f1dc2f3-1eae-461b-9a32-ecb7d3d13915, device_owner=network:router_gateway, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=ff0a5784-057b-403d-80dc-8bc17663b0d4, ip_allocation=immediate, mac_address=fa:16:3e:38:45:b3, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-10-14T08:36:38Z, description=, dns_domain=, id=c0145816-4627-44f2-af00-ccc9ef0436ed, ipv4_address_scope=None, ipv6_address_scope=None, is_default=False, l2_adjacency=True, mtu=1350, name=public, port_security_enabled=True, project_id=41187b090f3d4818a32baa37ce8a3991, provider:network_type=flat, provider:physical_network=datacentre, provider:segmentation_id=None, qos_policy_id=None, revision_number=2, router:external=True, shared=False, standard_attr_id=29, status=ACTIVE, subnets=['bbd40dc1-97d8-4ed3-9d4f-9b3af758b526'], tags=[], tenant_id=41187b090f3d4818a32baa37ce8a3991, updated_at=2025-10-14T08:36:44Z, vlan_transparent=None, network_id=c0145816-4627-44f2-af00-ccc9ef0436ed, port_security_enabled=False, project_id=, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=193, status=DOWN, tags=[], tenant_id=, updated_at=2025-10-14T10:10:28Z on network c0145816-4627-44f2-af00-ccc9ef0436ed#033[00m Oct 14 06:10:28 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Oct 14 06:10:28 localhost ceph-mon[317114]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.108:0/3445580418' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Oct 14 06:10:28 localhost nova_compute[297686]: 2025-10-14 10:10:28.596 2 DEBUG oslo_concurrency.processutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.469s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Oct 14 06:10:28 localhost nova_compute[297686]: 2025-10-14 10:10:28.603 2 DEBUG nova.compute.provider_tree [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Inventory has not changed in ProviderTree for provider: 18c24273-aca2-4f08-be57-3188d558235e update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Oct 14 06:10:28 localhost nova_compute[297686]: 2025-10-14 10:10:28.633 2 DEBUG nova.scheduler.client.report [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Inventory has not changed for provider 18c24273-aca2-4f08-be57-3188d558235e based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Oct 14 06:10:28 localhost nova_compute[297686]: 2025-10-14 10:10:28.636 2 DEBUG nova.compute.resource_tracker [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Compute_service record updated for np0005486733.localdomain:np0005486733.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Oct 14 06:10:28 localhost nova_compute[297686]: 2025-10-14 10:10:28.637 2 DEBUG oslo_concurrency.lockutils [None 
req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.653s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Oct 14 06:10:28 localhost dnsmasq[272303]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/addn_hosts - 2 addresses Oct 14 06:10:28 localhost podman[322306]: 2025-10-14 10:10:28.785982543 +0000 UTC m=+0.045949204 container kill 373c3d29f2bfadc43d54cb6930a55bb623f8f4f5957bebc53e3e7158ee295964 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0145816-4627-44f2-af00-ccc9ef0436ed, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true) Oct 14 06:10:28 localhost dnsmasq-dhcp[272303]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/host Oct 14 06:10:28 localhost dnsmasq-dhcp[272303]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/opts Oct 14 06:10:28 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:10:28.995 271987 INFO neutron.agent.dhcp.agent [None req-46a2f823-2ff4-48e2-b176-bbde695fe70f - - - - - -] DHCP configuration for ports {'ff0a5784-057b-403d-80dc-8bc17663b0d4'} is completed#033[00m Oct 14 06:10:29 localhost nova_compute[297686]: 2025-10-14 10:10:29.199 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:10:29 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e86 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Oct 14 06:10:29 localhost nova_compute[297686]: 
2025-10-14 10:10:29.504 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:10:29 localhost nova_compute[297686]: 2025-10-14 10:10:29.637 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:10:31 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:10:31.274 271987 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-10-14T10:10:31Z, description=, device_id=556acacf-a623-4c83-8f30-47e4c7fdd166, device_owner=network:router_gateway, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=ca26f015-4694-433b-8a8a-7d74a2706d9d, ip_allocation=immediate, mac_address=fa:16:3e:cb:9e:06, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-10-14T08:36:38Z, description=, dns_domain=, id=c0145816-4627-44f2-af00-ccc9ef0436ed, ipv4_address_scope=None, ipv6_address_scope=None, is_default=False, l2_adjacency=True, mtu=1350, name=public, port_security_enabled=True, project_id=41187b090f3d4818a32baa37ce8a3991, provider:network_type=flat, provider:physical_network=datacentre, provider:segmentation_id=None, qos_policy_id=None, revision_number=2, router:external=True, shared=False, standard_attr_id=29, status=ACTIVE, subnets=['bbd40dc1-97d8-4ed3-9d4f-9b3af758b526'], tags=[], tenant_id=41187b090f3d4818a32baa37ce8a3991, updated_at=2025-10-14T08:36:44Z, vlan_transparent=None, network_id=c0145816-4627-44f2-af00-ccc9ef0436ed, port_security_enabled=False, project_id=, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, 
revision_number=1, security_groups=[], standard_attr_id=204, status=DOWN, tags=[], tenant_id=, updated_at=2025-10-14T10:10:31Z on network c0145816-4627-44f2-af00-ccc9ef0436ed#033[00m Oct 14 06:10:31 localhost dnsmasq[272303]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/addn_hosts - 3 addresses Oct 14 06:10:31 localhost dnsmasq-dhcp[272303]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/host Oct 14 06:10:31 localhost dnsmasq-dhcp[272303]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/opts Oct 14 06:10:31 localhost podman[322343]: 2025-10-14 10:10:31.501167511 +0000 UTC m=+0.060303124 container kill 373c3d29f2bfadc43d54cb6930a55bb623f8f4f5957bebc53e3e7158ee295964 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0145816-4627-44f2-af00-ccc9ef0436ed, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3) Oct 14 06:10:31 localhost nova_compute[297686]: 2025-10-14 10:10:31.737 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:10:32 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:10:32.206 271987 INFO neutron.agent.dhcp.agent [None req-43c84101-0c14-433f-8046-a3546e769f37 - - - - - -] DHCP configuration for ports {'ca26f015-4694-433b-8a8a-7d74a2706d9d'} is completed#033[00m Oct 14 06:10:32 localhost nova_compute[297686]: 2025-10-14 10:10:32.715 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:10:34 localhost nova_compute[297686]: 2025-10-14 10:10:34.212 2 DEBUG ovsdbapp.backend.ovs_idl.vlog 
[-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:10:34 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e86 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Oct 14 06:10:34 localhost nova_compute[297686]: 2025-10-14 10:10:34.296 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:10:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f. Oct 14 06:10:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917. Oct 14 06:10:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d. Oct 14 06:10:34 localhost podman[322364]: 2025-10-14 10:10:34.760534167 +0000 UTC m=+0.090454870 container health_status 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, release=1755695350, config_id=edpm, io.buildah.version=1.33.7, version=9.6, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., architecture=x86_64, maintainer=Red Hat, Inc., vcs-type=git) Oct 14 06:10:34 localhost podman[322363]: 2025-10-14 10:10:34.803097205 +0000 UTC m=+0.136590547 container health_status 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, container_name=ovn_controller) Oct 14 06:10:34 localhost systemd[1]: tmp-crun.X4x87p.mount: Deactivated successfully. 
Oct 14 06:10:34 localhost podman[322365]: 2025-10-14 10:10:34.867525944 +0000 UTC m=+0.193980741 container health_status 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_id=edpm, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.build-date=20251009, container_name=ceilometer_agent_compute) Oct 14 06:10:34 localhost podman[322363]: 2025-10-14 10:10:34.87230914 +0000 UTC m=+0.205802442 container exec_died 
1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.build-date=20251009, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible) Oct 14 06:10:34 localhost podman[322364]: 2025-10-14 10:10:34.875256211 +0000 UTC m=+0.205176894 container exec_died 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, release=1755695350, vcs-type=git, config_id=edpm, io.openshift.tags=minimal rhel9, name=ubi9-minimal, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., version=9.6, architecture=x86_64, managed_by=edpm_ansible, container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red 
Hat, Inc.) Oct 14 06:10:34 localhost systemd[1]: 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f.service: Deactivated successfully. Oct 14 06:10:34 localhost podman[322365]: 2025-10-14 10:10:34.905283694 +0000 UTC m=+0.231738501 container exec_died 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=edpm, tcib_managed=true, io.buildah.version=1.41.3) Oct 14 06:10:34 
localhost systemd[1]: 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d.service: Deactivated successfully. Oct 14 06:10:34 localhost systemd[1]: 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917.service: Deactivated successfully. Oct 14 06:10:36 localhost nova_compute[297686]: 2025-10-14 10:10:36.023 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:10:36 localhost nova_compute[297686]: 2025-10-14 10:10:36.762 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:10:38 localhost openstack_network_exporter[250374]: ERROR 10:10:38 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Oct 14 06:10:38 localhost openstack_network_exporter[250374]: ERROR 10:10:38 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Oct 14 06:10:38 localhost openstack_network_exporter[250374]: ERROR 10:10:38 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Oct 14 06:10:38 localhost openstack_network_exporter[250374]: ERROR 10:10:38 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Oct 14 06:10:38 localhost openstack_network_exporter[250374]: Oct 14 06:10:38 localhost openstack_network_exporter[250374]: ERROR 10:10:38 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Oct 14 06:10:38 localhost openstack_network_exporter[250374]: Oct 14 06:10:39 localhost nova_compute[297686]: 2025-10-14 10:10:39.251 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:10:39 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e86 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 
full_alloc: 348127232 kv_alloc: 322961408 Oct 14 06:10:40 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:10:40.033 271987 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-10-14T10:10:39Z, description=, device_id=4b1079c3-2a61-46a2-aa77-b2bf94fe0111, device_owner=network:router_gateway, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=afc220f0-fbbc-4e63-bc40-ba28bc3afd43, ip_allocation=immediate, mac_address=fa:16:3e:ee:5a:f4, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-10-14T08:36:38Z, description=, dns_domain=, id=c0145816-4627-44f2-af00-ccc9ef0436ed, ipv4_address_scope=None, ipv6_address_scope=None, is_default=False, l2_adjacency=True, mtu=1350, name=public, port_security_enabled=True, project_id=41187b090f3d4818a32baa37ce8a3991, provider:network_type=flat, provider:physical_network=datacentre, provider:segmentation_id=None, qos_policy_id=None, revision_number=2, router:external=True, shared=False, standard_attr_id=29, status=ACTIVE, subnets=['bbd40dc1-97d8-4ed3-9d4f-9b3af758b526'], tags=[], tenant_id=41187b090f3d4818a32baa37ce8a3991, updated_at=2025-10-14T08:36:44Z, vlan_transparent=None, network_id=c0145816-4627-44f2-af00-ccc9ef0436ed, port_security_enabled=False, project_id=, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=247, status=DOWN, tags=[], tenant_id=, updated_at=2025-10-14T10:10:39Z on network c0145816-4627-44f2-af00-ccc9ef0436ed#033[00m Oct 14 06:10:40 localhost dnsmasq[272303]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/addn_hosts - 4 addresses Oct 14 06:10:40 localhost dnsmasq-dhcp[272303]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/host Oct 14 
06:10:40 localhost systemd[1]: tmp-crun.tvony4.mount: Deactivated successfully. Oct 14 06:10:40 localhost dnsmasq-dhcp[272303]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/opts Oct 14 06:10:40 localhost podman[322439]: 2025-10-14 10:10:40.260925197 +0000 UTC m=+0.067904317 container kill 373c3d29f2bfadc43d54cb6930a55bb623f8f4f5957bebc53e3e7158ee295964 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0145816-4627-44f2-af00-ccc9ef0436ed, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, org.label-schema.build-date=20251009, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image) Oct 14 06:10:40 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:10:40.501 271987 INFO neutron.agent.dhcp.agent [None req-d2d8f87f-fcb5-4ed0-a225-d6e9d23e92a6 - - - - - -] DHCP configuration for ports {'afc220f0-fbbc-4e63-bc40-ba28bc3afd43'} is completed#033[00m Oct 14 06:10:41 localhost nova_compute[297686]: 2025-10-14 10:10:41.308 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:10:41 localhost nova_compute[297686]: 2025-10-14 10:10:41.810 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:10:42 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:10:42.797 271987 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-10-14T10:10:42Z, description=, device_id=ce8920b2-dd67-4ff0-bb92-8d428de8525a, device_owner=network:router_gateway, dns_assignment=[], 
dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=6c87a1c8-1378-40bc-b1da-3d746c07ba5e, ip_allocation=immediate, mac_address=fa:16:3e:96:f4:fa, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-10-14T08:36:38Z, description=, dns_domain=, id=c0145816-4627-44f2-af00-ccc9ef0436ed, ipv4_address_scope=None, ipv6_address_scope=None, is_default=False, l2_adjacency=True, mtu=1350, name=public, port_security_enabled=True, project_id=41187b090f3d4818a32baa37ce8a3991, provider:network_type=flat, provider:physical_network=datacentre, provider:segmentation_id=None, qos_policy_id=None, revision_number=2, router:external=True, shared=False, standard_attr_id=29, status=ACTIVE, subnets=['bbd40dc1-97d8-4ed3-9d4f-9b3af758b526'], tags=[], tenant_id=41187b090f3d4818a32baa37ce8a3991, updated_at=2025-10-14T08:36:44Z, vlan_transparent=None, network_id=c0145816-4627-44f2-af00-ccc9ef0436ed, port_security_enabled=False, project_id=, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=271, status=DOWN, tags=[], tenant_id=, updated_at=2025-10-14T10:10:42Z on network c0145816-4627-44f2-af00-ccc9ef0436ed#033[00m Oct 14 06:10:43 localhost dnsmasq[272303]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/addn_hosts - 5 addresses Oct 14 06:10:43 localhost dnsmasq-dhcp[272303]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/host Oct 14 06:10:43 localhost dnsmasq-dhcp[272303]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/opts Oct 14 06:10:43 localhost podman[322476]: 2025-10-14 10:10:43.046325852 +0000 UTC m=+0.061269803 container kill 373c3d29f2bfadc43d54cb6930a55bb623f8f4f5957bebc53e3e7158ee295964 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0145816-4627-44f2-af00-ccc9ef0436ed, org.label-schema.build-date=20251009, 
org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team) Oct 14 06:10:43 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:10:43.320 271987 INFO neutron.agent.dhcp.agent [None req-0b96b0d6-fcc0-459a-ad30-e0cd15eec0d1 - - - - - -] DHCP configuration for ports {'6c87a1c8-1378-40bc-b1da-3d746c07ba5e'} is completed#033[00m Oct 14 06:10:43 localhost nova_compute[297686]: 2025-10-14 10:10:43.884 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:10:44 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e86 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Oct 14 06:10:44 localhost nova_compute[297686]: 2025-10-14 10:10:44.299 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:10:45 localhost nova_compute[297686]: 2025-10-14 10:10:45.716 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:10:46 localhost nova_compute[297686]: 2025-10-14 10:10:46.852 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:10:47 localhost neutron_sriov_agent[264974]: 2025-10-14 10:10:47.603 2 INFO neutron.agent.securitygroups_rpc [None req-a54e3f06-187e-448e-927e-770f804e5356 4a2c72478a7c4747a73158cd8119b6ba d6e7f435b24646ecaa54e485b818329f - - default default] Security group member updated ['08e02d40-7eb0-493a-bf38-79869188d51f']#033[00m Oct 14 06:10:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run 
0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041. Oct 14 06:10:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857. Oct 14 06:10:47 localhost podman[322499]: 2025-10-14 10:10:47.751427429 +0000 UTC m=+0.089835371 container health_status 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm) Oct 14 06:10:47 localhost systemd[1]: tmp-crun.mJcSMO.mount: Deactivated successfully. 
Oct 14 06:10:47 localhost podman[322500]: 2025-10-14 10:10:47.811861465 +0000 UTC m=+0.145696467 container health_status 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true) Oct 14 06:10:47 localhost podman[322499]: 2025-10-14 10:10:47.816217239 +0000 UTC 
m=+0.154625231 container exec_died 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Oct 14 06:10:47 localhost systemd[1]: 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041.service: Deactivated successfully. 
Oct 14 06:10:47 localhost podman[322500]: 2025-10-14 10:10:47.84522592 +0000 UTC m=+0.179060922 container exec_died 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251009, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent) Oct 14 06:10:47 localhost systemd[1]: 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857.service: 
Deactivated successfully. Oct 14 06:10:48 localhost nova_compute[297686]: 2025-10-14 10:10:48.129 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:10:48 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Oct 14 06:10:48 localhost ceph-mon[317114]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2727635995' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Oct 14 06:10:48 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Oct 14 06:10:48 localhost ceph-mon[317114]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2727635995' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Oct 14 06:10:49 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e86 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Oct 14 06:10:49 localhost nova_compute[297686]: 2025-10-14 10:10:49.349 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.819 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'name': 'test', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005486733.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '41187b090f3d4818a32baa37ce8a3991', 'user_id': 
'9d85e6ce130c46ec855f37147dbb08b4', 'hostId': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228 Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.820 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.843 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.read.requests volume: 1272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.844 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.846 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '8163bc6f-46e7-4005-9fd6-ac1689f7a9fe', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1272, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T10:10:49.820767', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '0f23f254-a8e6-11f0-9707-fa163e99780b', 'monotonic_time': 12666.013472254, 'message_signature': 'bfb0892371454d7a0d1b2d18f8eabd0c3995d1511fc8749121c0c4b1aaebf024'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T10:10:49.820767', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '0f240a1e-a8e6-11f0-9707-fa163e99780b', 'monotonic_time': 12666.013472254, 'message_signature': 'd614552296a54130e1376d3d8af01abc11bd6d0860d3b33230f70dc6ed2d691b'}]}, 'timestamp': '2025-10-14 10:10:49.845223', '_unique_id': 'aa026dabc8b44bc0a9e70c3e174727d2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.846 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.846 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.846 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.846 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.846 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.846 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.846 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.846 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.846 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.846 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.846 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.846 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.846 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.846 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.846 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.846 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 
06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.846 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.846 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.846 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.846 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.846 12 ERROR oslo_messaging.notify.messaging Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.846 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.846 12 ERROR oslo_messaging.notify.messaging Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.846 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.846 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.846 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.846 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:10:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.846 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.846 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.846 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.846 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.846 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.846 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.846 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.846 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.846 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.846 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.846 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.846 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.846 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.846 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.846 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.846 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.846 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.846 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.846 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.846 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.846 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.846 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.846 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.846 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.846 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.846 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.846 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.848 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.852 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.854 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '23ba679f-f71d-4cc9-bad9-a86f87058a8d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T10:10:49.848463', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': '0f253d3a-a8e6-11f0-9707-fa163e99780b', 'monotonic_time': 12666.041180395, 'message_signature': '7e51106011e2b236ba755eb14b2f6a52b5ed3778a9b3620ec5e93f29d0f053b3'}]}, 'timestamp': '2025-10-14 10:10:49.853104', '_unique_id': 'f4ea945153794e44b6ced06879148908'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.854 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.854 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.854 12 ERROR oslo_messaging.notify.messaging yield
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.854 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.854 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.854 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.854 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.854 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.854 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.854 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.854 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.854 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.854 12 ERROR oslo_messaging.notify.messaging conn.connect()
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.854 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.854 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.854 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.854 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.854 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.854 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.854 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.854 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.854 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.854 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.854 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.854 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.854 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.854 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.854 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.854 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.854 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.854 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.854 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.854 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.854 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.854 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.854 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.854 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.854 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.854 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.854 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.854 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.854 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.854 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.854 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.854 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.854 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.854 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.854 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.854 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.854 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.854 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.854 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.854 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.854 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.855 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.872 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/memory.usage volume: 51.81640625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.873 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4c856e5a-6f66-40b0-9dba-3aded7a82ce8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.81640625, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'timestamp': '2025-10-14T10:10:49.855767', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '0f283706-a8e6-11f0-9707-fa163e99780b', 'monotonic_time': 12666.064686428, 'message_signature': '8d96ed4a79c4356d70931a4ac881731f47043e5aa92ff88b770d6e2a720cd63b'}]}, 'timestamp': '2025-10-14 10:10:49.872589', '_unique_id': '6e96bf159a8f4e32a5467ef09a47dfb5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.873 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.873 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.873 12 ERROR oslo_messaging.notify.messaging yield
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.873 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.873 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.873 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.873 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.873 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.873 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.873 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.873 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.873 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.873 12 ERROR oslo_messaging.notify.messaging conn.connect()
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.873 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.873 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.873 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.873 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.873 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.873 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.873 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.873 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.873 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.873 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.873 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.873 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.873 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.873 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.873 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.873 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.873 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.873 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.873 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.873 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.873 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.873 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.873 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.873 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.873 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.873 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.873 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.873 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.873 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.873 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.873 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.873 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.873 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.873 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.873 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.873 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.873 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.873 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.873 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.873 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.873 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.874 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.874 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.875 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/cpu volume: 14990000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.876 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0d42e873-a45a-4d1b-90b9-639e4a2e6621', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 14990000000, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'timestamp': '2025-10-14T10:10:49.875115', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '0f28ac04-a8e6-11f0-9707-fa163e99780b', 'monotonic_time': 12666.064686428, 'message_signature': 'eca62dbe2a90b1e7472029c9b735279357f024e7c5c1410d6951fe8164199041'}]}, 'timestamp': '2025-10-14 10:10:49.875627', '_unique_id': '3a73c675db974ede913734d542b563ab'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.876 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.876 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.876 12 ERROR oslo_messaging.notify.messaging yield
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.876 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.876 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.876 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.876 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.876 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.876 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.876 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.876 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.876 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.876 12 ERROR oslo_messaging.notify.messaging conn.connect()
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.876 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.876 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.876 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.876 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.876 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.876 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.876 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.876 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.876 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.876 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.876 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.876 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.876 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.876 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.876 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.876 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.876 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.876 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.876 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.876 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.876 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.876 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.876 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.876 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.876 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.876 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.876 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.876 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.876 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.876 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.876 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.876 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.876 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.876 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.876 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.876 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.876 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.876 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.876 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.876 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.876 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.877 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Oct 14 06:10:49
localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.877 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.write.requests volume: 53 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.878 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.879 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fae0e5f8-e3ee-4f71-a74f-d474ac70fffe', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 53, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T10:10:49.877841', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 
'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '0f2915ea-a8e6-11f0-9707-fa163e99780b', 'monotonic_time': 12666.013472254, 'message_signature': '9730727c8ee00649be06ef4b398d87e9712fc7f1484a7be7b2b58d5efb10efc2'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T10:10:49.877841', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '0f292634-a8e6-11f0-9707-fa163e99780b', 'monotonic_time': 12666.013472254, 'message_signature': '8fcccb303e761aefba883a85c5897741dcbf442be4efbf2ac2c81420338d69e4'}]}, 'timestamp': '2025-10-14 10:10:49.878718', '_unique_id': '6785544719984979b53914dae82dbb6a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.879 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.879 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.879 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.879 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.879 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.879 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.879 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.879 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.879 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.879 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.879 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.879 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in 
establish_connection Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.879 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.879 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.879 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.879 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.879 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.879 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.879 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.879 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.879 12 ERROR oslo_messaging.notify.messaging Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.879 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.879 12 ERROR oslo_messaging.notify.messaging Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.879 12 ERROR 
oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.879 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.879 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.879 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.879 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.879 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.879 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.879 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.879 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.879 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 
2025-10-14 10:10:49.879 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.879 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.879 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.879 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.879 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.879 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.879 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.879 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.879 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.879 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 
10:10:49.879 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.879 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.879 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.879 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.879 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.879 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.879 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.879 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.879 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.879 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.879 12 ERROR oslo_messaging.notify.messaging Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.880 12 INFO 
ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.892 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.892 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.894 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '50f610c0-13de-4c01-89d0-33e190b15849', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T10:10:49.880936', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 
'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '0f2b45a4-a8e6-11f0-9707-fa163e99780b', 'monotonic_time': 12666.073640873, 'message_signature': '1217f1a82f9a7ce7fdc661dfad7d5b3b1eaa18a9e2d5ca709b588c283f0141a7'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T10:10:49.880936', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '0f2b5882-a8e6-11f0-9707-fa163e99780b', 'monotonic_time': 12666.073640873, 'message_signature': 'ad2ce0d67517ef464491343030b6346c2005fbdc6a70d3b73af9283794b5b779'}]}, 'timestamp': '2025-10-14 10:10:49.893089', '_unique_id': 'a5f7d6046f594b528a189535c119d431'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.894 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:10:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.894 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.894 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.894 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.894 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.894 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.894 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.894 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.894 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.894 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.894 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.894 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.894 12 ERROR oslo_messaging.notify.messaging Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.894 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.894 12 ERROR oslo_messaging.notify.messaging Oct 14 06:10:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.894 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.894 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.894 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.894 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.894 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.894 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.894 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.894 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.894 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.894 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:10:49 
localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.894 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.894 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.894 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.894 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.894 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.894 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.894 12 ERROR oslo_messaging.notify.messaging Oct 14 06:10:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.895 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.895 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.897 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'aef71d5e-2c6b-4453-96cd-91135d7cc69b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T10:10:49.895467', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 
'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': '0f2bc8a8-a8e6-11f0-9707-fa163e99780b', 'monotonic_time': 12666.041180395, 'message_signature': '4b31481fb3a75666ca78ef2ced99ba3e0ecd1dd2dbc6e82a66c43271c1ac4864'}]}, 'timestamp': '2025-10-14 10:10:49.896036', '_unique_id': '5dbb2986abad4d29923d9f682ce898cd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.897 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.897 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.897 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.897 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.897 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.897 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.897 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.897 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 
10:10:49.897 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.897 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.897 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.897 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.897 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.897 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.897 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.897 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.897 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.897 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.897 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:10:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.897 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.897 12 ERROR oslo_messaging.notify.messaging Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.897 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.897 12 ERROR oslo_messaging.notify.messaging Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.897 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.897 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.897 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.897 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.897 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.897 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.897 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:10:49 
localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.897 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.897 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.897 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.897 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.897 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.897 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.897 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.897 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.897 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.897 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) 
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.897 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.897 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.897 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.897 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.897 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.897 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.897 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.897 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.897 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.897 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.897 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.897 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.897 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.897 12 ERROR oslo_messaging.notify.messaging Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.898 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.898 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.898 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.900 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '32ed23fe-f665-4650-9643-a84c4f99cb73', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T10:10:49.898328', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '0f2c37d4-a8e6-11f0-9707-fa163e99780b', 'monotonic_time': 12666.073640873, 'message_signature': 'a1b784a280a68cc131f690816b9e4f48263d7556a1b1af0e84c9e211bc1bbc7b'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T10:10:49.898328', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 
'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '0f2c49d6-a8e6-11f0-9707-fa163e99780b', 'monotonic_time': 12666.073640873, 'message_signature': '98a676a325874e0c6a6e001f61cd6f7405aba7a496e62b298b443879deba755e'}]}, 'timestamp': '2025-10-14 10:10:49.899260', '_unique_id': '8c11c33007774a5f88c4412d16fd3e71'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.900 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.900 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.900 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.900 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.900 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.900 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.900 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.900 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.900 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.900 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.900 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.900 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.900 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.900 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.900 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.900 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 06:10:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.900 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.900 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.900 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.900 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.900 12 ERROR oslo_messaging.notify.messaging Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.900 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.900 12 ERROR oslo_messaging.notify.messaging Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.900 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.900 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.900 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.900 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 
10:10:49.900 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.900 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.900 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.900 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.900 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.900 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.900 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.900 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.900 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.900 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 06:10:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.900 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.900 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.900 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.900 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.900 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.900 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.900 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.900 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.900 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.900 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 
10:10:49.900 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.900 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.900 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.900 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.900 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.900 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.900 12 ERROR oslo_messaging.notify.messaging Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.901 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.901 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.incoming.packets volume: 79 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.903 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '2b660f55-33a6-4f08-8fce-e1914f33af82', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 79, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T10:10:49.901496', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': '0f2cb358-a8e6-11f0-9707-fa163e99780b', 'monotonic_time': 12666.041180395, 'message_signature': 'a4a3a95f717ce7d7708cdaa136679ab7a66c79447885ee69db935936e4431913'}]}, 'timestamp': '2025-10-14 10:10:49.901993', '_unique_id': 'a53b1608719a47e08133829fabae0f30'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.903 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:10:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.903 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.903 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.903 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.903 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.903 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.903 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.903 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.903 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.903 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.903 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.903 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.903 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.903 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.903 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.903 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.903 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.903 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.903 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.903 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.903 12 ERROR oslo_messaging.notify.messaging Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.903 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.903 12 ERROR oslo_messaging.notify.messaging Oct 14 06:10:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.903 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.903 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.903 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.903 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.903 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.903 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.903 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.903 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.903 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.903 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.903 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.903 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.903 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.903 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.903 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.903 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.903 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.903 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.903 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.903 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:10:49 
localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.903 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.903 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.903 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.903 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.903 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.903 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.903 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.903 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.903 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.903 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.903 12 ERROR oslo_messaging.notify.messaging Oct 14 06:10:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.904 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.904 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.outgoing.bytes volume: 10162 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.905 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6a84aff5-12b0-4a06-8b89-fbf298bc0dad', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 10162, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T10:10:49.904274', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 
'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': '0f2d1e60-a8e6-11f0-9707-fa163e99780b', 'monotonic_time': 12666.041180395, 'message_signature': '630ac8183063f4ba367a4ee293a0c2cbc0ac393ace4b48360b06c1b4cfea3671'}]}, 'timestamp': '2025-10-14 10:10:49.904767', '_unique_id': '79868714f7094146b0265d6d6c3cca32'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.905 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.905 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.905 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.905 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.905 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.905 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.905 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.905 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.905 12 ERROR 
oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.905 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.905 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.905 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.905 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.905 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.905 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.905 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.905 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.905 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.905 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 
2025-10-14 10:10:49.905 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.905 12 ERROR oslo_messaging.notify.messaging Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.905 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.905 12 ERROR oslo_messaging.notify.messaging Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.905 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.905 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.905 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.905 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.905 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.905 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.905 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:10:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.905 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.905 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.905 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.905 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.905 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.905 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.905 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.905 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.905 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.905 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 
06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.905 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.905 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.905 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.905 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.905 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.905 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.905 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.905 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.905 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.905 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.905 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.905 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.905 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.905 12 ERROR oslo_messaging.notify.messaging Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.906 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.906 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.908 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'bdda24a3-38cf-4ab1-9cda-5a33a57f68fe', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T10:10:49.906919', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': '0f2d85da-a8e6-11f0-9707-fa163e99780b', 'monotonic_time': 12666.041180395, 'message_signature': '71363e39cd171e0fe7e6a206b2503d727acfa398be1d9484be73951bbe0ac51e'}]}, 'timestamp': '2025-10-14 10:10:49.907382', '_unique_id': 'b992672d845f4cf185d36de6cb38e4ad'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.908 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:10:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.908 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.908 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.908 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.908 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.908 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.908 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.908 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.908 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.908 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.908 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.908 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.908 12 ERROR oslo_messaging.notify.messaging Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.908 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.908 12 ERROR oslo_messaging.notify.messaging Oct 14 06:10:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.908 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.908 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.908 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.908 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.908 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.908 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.908 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.908 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.908 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.908 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:10:49 
localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.908 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.908 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.908 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.908 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.908 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.908 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.908 12 ERROR oslo_messaging.notify.messaging Oct 14 06:10:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.909 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.909 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.911 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a4450f5b-362b-4b79-8465-a52fc11a8aee', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T10:10:49.909616', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': '0f2df074-a8e6-11f0-9707-fa163e99780b', 'monotonic_time': 12666.041180395, 'message_signature': '54eb51296a8ea743423cc33563845c8aeb3e30a4d4c1dfe57df260ba039d2760'}]}, 'timestamp': '2025-10-14 10:10:49.910111', '_unique_id': '9f4d0ff8a62c456185c5a89b55e80b62'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.911 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.911 12 ERROR oslo_messaging.notify.messaging yield
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.911 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.911 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.911 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.911 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.911 12 ERROR oslo_messaging.notify.messaging conn.connect()
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.911 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.911 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.911 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.911 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.911 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.911 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.911 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.911 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.911 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.911 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.911 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.911 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.911 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.911 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.911 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.911 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.911 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.911 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.911 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.911 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.911 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.911 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.911 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.911 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.912 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.912 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.outgoing.packets volume: 118 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.913 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b062fef0-7083-4e9f-bb41-47f95fb3ab10', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 118, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T10:10:49.912326', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': '0f2e5960-a8e6-11f0-9707-fa163e99780b', 'monotonic_time': 12666.041180395, 'message_signature': 'c2cbe174ac6ad0aabcf1daed6e1328a5c719814c501d16e8bc604f1b9f79dc24'}]}, 'timestamp': '2025-10-14 10:10:49.912837', '_unique_id': 'd341294530b745ba8ac07fd2491dac16'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.913 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.913 12 ERROR oslo_messaging.notify.messaging yield
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.913 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.913 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.913 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.913 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.913 12 ERROR oslo_messaging.notify.messaging conn.connect()
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.913 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.913 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.913 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.913 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.913 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.913 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.913 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.913 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.913 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.913 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.913 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.913 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.913 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.913 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.913 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.913 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.913 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.913 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.913 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.913 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.913 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.913 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.913 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.913 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.914 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.915 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.write.latency volume: 1178650666 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.915 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.write.latency volume: 22746151 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.916 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c7d10690-697b-4108-b4ca-f1b9610f74c1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1178650666, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T10:10:49.914983', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '0f2ec102-a8e6-11f0-9707-fa163e99780b', 'monotonic_time': 12666.013472254, 'message_signature': '04dbec9c75bd2bf285ecaa5d2b033d3b4d681641d712326ebc4adcc9dc844eee'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 22746151, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T10:10:49.914983', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '0f2ed1a6-a8e6-11f0-9707-fa163e99780b', 'monotonic_time': 12666.013472254, 'message_signature': '18c16e24d599672f2ca11aafb7619e861e2e9bd9f7d1c77c2eecb3b61dc03311'}]}, 'timestamp': '2025-10-14 10:10:49.915906', '_unique_id': 'afbadc1af7054a2ea129a334352be7b5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.916 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.916 12 ERROR oslo_messaging.notify.messaging yield
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.916 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.916 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.916 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.916 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.916 12 ERROR oslo_messaging.notify.messaging conn.connect()
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.916 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.916 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.916 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.916 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.916 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.916 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.916 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.916 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.916 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.916 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.916 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.916 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.916 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.916 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.916 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.916 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.916 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.916 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.916 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.916 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.916 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.916 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.916 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.916 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.917 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.917 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.917 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.918 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications.
Payload={'message_id': 'bd350be9-8547-46cf-a164-91fa7ab1b7a2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T10:10:49.917586', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': '0f2f24c6-a8e6-11f0-9707-fa163e99780b', 'monotonic_time': 12666.041180395, 'message_signature': '06719ded66ecd524e5cdb692332901027c6d58c1fbc0b3bd371ada4c8d092a04'}]}, 'timestamp': '2025-10-14 10:10:49.917921', '_unique_id': '8322cbfe65ac4c7da1ae0ec976a8cf92'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.918 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:10:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.918 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.918 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.918 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.918 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.918 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.918 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.918 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.918 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.918 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.918 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.918 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.918 12 ERROR oslo_messaging.notify.messaging Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.918 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.918 12 ERROR oslo_messaging.notify.messaging Oct 14 06:10:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.918 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.918 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.918 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.918 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.918 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.918 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.918 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.918 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.918 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.918 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:10:49 
localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.918 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.918 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.918 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.918 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.918 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.918 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.918 12 ERROR oslo_messaging.notify.messaging Oct 14 06:10:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.919 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.919 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.919 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.920 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a29b7e87-65a0-416b-afa0-289656b31295', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T10:10:49.919255', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '0f2f6486-a8e6-11f0-9707-fa163e99780b', 'monotonic_time': 12666.073640873, 'message_signature': '025c3349027ee846368db457c6c4d6cea9e4589e257c9427a6c16af5c45f013d'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T10:10:49.919255', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '0f2f6eb8-a8e6-11f0-9707-fa163e99780b', 'monotonic_time': 12666.073640873, 'message_signature': '692e7f85079cc9b707a8e589e2afcd8962a328e0f0c1f0f3f10a2799aa456007'}]}, 'timestamp': '2025-10-14 10:10:49.919813', '_unique_id': 'cad1af2ed6d640df81724b0be7a00737'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.920 12 ERROR 
oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.920 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.920 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.920 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.920 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.920 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:10:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.920 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.920 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.920 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.920 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.920 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.920 12 ERROR oslo_messaging.notify.messaging Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.920 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 
2025-10-14 10:10:49.920 12 ERROR oslo_messaging.notify.messaging Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.920 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.920 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.920 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.920 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.920 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.920 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.920 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.920 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.920 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.920 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.920 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.920 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.920 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.920 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.920 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.920 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.920 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.920 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:10:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.920 12 ERROR oslo_messaging.notify.messaging Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.921 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.921 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.921 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.read.latency volume: 1739521236 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.921 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.read.latency volume: 110324003 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.922 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'ca179dd9-20d5-4751-9c5f-ff8ad316e216', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1739521236, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T10:10:49.921279', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '0f2fb350-a8e6-11f0-9707-fa163e99780b', 'monotonic_time': 12666.013472254, 'message_signature': '45124924429c04a50c2a903574ef820a51a6bc82cb92cd004630ee7138f19c43'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 110324003, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T10:10:49.921279', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '0f2fbe40-a8e6-11f0-9707-fa163e99780b', 'monotonic_time': 12666.013472254, 'message_signature': 'd72efae2a72c23c73ecb307909c3dfb62affcd45878d1f585f573e3e0f519c4e'}]}, 'timestamp': '2025-10-14 10:10:49.921829', '_unique_id': 'c68d930e4c1a428abf98611ae80e6afe'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.922 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.922 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.922 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.922 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.922 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.922 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.922 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.922 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.922 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 
06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.922 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.922 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.922 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.922 12 ERROR oslo_messaging.notify.messaging Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.922 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.922 12 ERROR oslo_messaging.notify.messaging Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.922 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.922 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:10:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.922 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.922 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.922 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.922 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.922 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.922 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.922 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.922 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.922 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.922 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:10:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.922 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.922 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.922 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.922 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.922 12 ERROR oslo_messaging.notify.messaging Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.923 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.923 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.read.bytes volume: 35560448 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.923 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:10:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.924 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f285f6b2-0429-470f-9665-8258f941173d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35560448, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T10:10:49.923480', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '0f300990-a8e6-11f0-9707-fa163e99780b', 'monotonic_time': 12666.013472254, 'message_signature': '85e2840e65709aa46b2ed0a15e768961caed5c1cd79c4acdcfd34e49d4a73d62'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': 
'2025-10-14T10:10:49.923480', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '0f30158e-a8e6-11f0-9707-fa163e99780b', 'monotonic_time': 12666.013472254, 'message_signature': '9fc818ca75b123e59e867f5c3826789418eb8cc9e69dbf72f42f8037036841cb'}]}, 'timestamp': '2025-10-14 10:10:49.924113', '_unique_id': '60c5f0b462c94b818f824fe17e5ff82b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.924 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.924 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.924 12 ERROR oslo_messaging.notify.messaging return retry_over_time( 
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.924 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.924 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.924 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.924 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.924 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.924 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.924 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.924 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.924 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.924 12 ERROR oslo_messaging.notify.messaging Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.924 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.924 12 ERROR oslo_messaging.notify.messaging Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.924 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.924 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.924 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.924 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.924 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.924 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.924 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.924 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.924 
12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.924 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.924 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.924 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.924 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.924 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.924 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.924 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.924 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.924 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.924 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.924 12 ERROR oslo_messaging.notify.messaging Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.925 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.925 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.write.bytes volume: 454656 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.925 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.write.bytes volume: 512 
_stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.926 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fee9f8a2-0a0d-4b6b-af0c-7ecc6a2a3bf9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 454656, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T10:10:49.925472', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '0f305760-a8e6-11f0-9707-fa163e99780b', 'monotonic_time': 12666.013472254, 'message_signature': 'bbce93015b036bacad104eb24097009a0d2af9839b9c09f91799daa47bb8f18d'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': 
'41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T10:10:49.925472', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '0f306566-a8e6-11f0-9707-fa163e99780b', 'monotonic_time': 12666.013472254, 'message_signature': '9d50a7edf771af9b38c1e51ba7ce266e0c3b8a03eb921c8091c6e5246f2f0504'}]}, 'timestamp': '2025-10-14 10:10:49.926109', '_unique_id': '9f47815bcdcc4febbaaa242ccae0ce00'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.926 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.926 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:10:49 
localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.926 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.926 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.926 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.926 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.926 12 ERROR oslo_messaging.notify.messaging conn.connect()
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.926 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.926 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.926 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.926 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.926 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.926 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.926 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.926 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.926 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.926 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.926 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.926 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.926 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.926 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.926 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.926 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.926 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.926 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.926 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.926 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.926 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.926 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.926 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.926 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.927 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.927 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.incoming.bytes volume: 8190 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.928 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd2db4486-3362-4bc3-8bb0-704a9819a0de', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 8190, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T10:10:49.927539', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': '0f30a8f0-a8e6-11f0-9707-fa163e99780b', 'monotonic_time': 12666.041180395, 'message_signature': '5f7ad52a293775e556323a35341cda22d3a16767b548cb3850ca7dee4059c0db'}]}, 'timestamp': '2025-10-14 10:10:49.927859', '_unique_id': '77d8baf84fff43709eeddd61b32c6998'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.928 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.928 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.928 12 ERROR oslo_messaging.notify.messaging yield
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.928 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.928 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.928 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.928 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.928 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.928 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.928 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.928 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.928 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.928 12 ERROR oslo_messaging.notify.messaging conn.connect()
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.928 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.928 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.928 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.928 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.928 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.928 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.928 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.928 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.928 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.928 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.928 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.928 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.928 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.928 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.928 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.928 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.928 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.928 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.928 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.928 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.928 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.928 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.928 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.928 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.928 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.928 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.928 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.928 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.928 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.928 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.928 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.928 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.928 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.928 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.928 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.928 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.928 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.928 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.928 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.928 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.928 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.929 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.929 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.929 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.930 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1e977f74-a84c-4fcd-ae02-3084291e0112', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T10:10:49.929303', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': '0f30ecd4-a8e6-11f0-9707-fa163e99780b', 'monotonic_time': 12666.041180395, 'message_signature': '3739a2cda52c3de66d8aa7af78a32ba2ea4e22057ddb8ec2950bcb2fedec8764'}]}, 'timestamp': '2025-10-14 10:10:49.929591', '_unique_id': 'ecfea2d551a24036b23c97dfe64cc28e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.930 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.930 12 ERROR oslo_messaging.notify.messaging yield
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.930 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.930 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.930 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.930 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.930 12 ERROR oslo_messaging.notify.messaging conn.connect()
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.930 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.930 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.930 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.930 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.930 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.930 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.930 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.930 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.930 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.930 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.930 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.930 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.930 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.930 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.930 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.930 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.930 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.930 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.930 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.930 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.930 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.930 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.930 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 06:10:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:10:49.930 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:10:50 localhost neutron_sriov_agent[264974]: 2025-10-14 10:10:50.755 2 INFO neutron.agent.securitygroups_rpc [None req-824c6cec-0c8a-4d5b-950c-077e64945e6c d6d06f9c969f4b25a388e6b1f8e79df2 4a912863089b4050b50010417538a2b4 - - default default] Security group member updated ['f4a71cc4-401e-4fd9-a76d-664285c1f988']
Oct 14 06:10:51 localhost neutron_sriov_agent[264974]: 2025-10-14 10:10:51.543 2 INFO neutron.agent.securitygroups_rpc [None req-1fdb7ab6-e039-42d0-a00d-20226c0980d9 4a2c72478a7c4747a73158cd8119b6ba d6e7f435b24646ecaa54e485b818329f - - default default] Security group member updated ['08e02d40-7eb0-493a-bf38-79869188d51f']
Oct 14 06:10:51 localhost nova_compute[297686]: 2025-10-14 10:10:51.897 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 06:10:54 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e86 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct 14 06:10:54 localhost nova_compute[297686]: 2025-10-14 10:10:54.382 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 06:10:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad.
Oct 14 06:10:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec.
Oct 14 06:10:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29.
Oct 14 06:10:54 localhost podman[322539]: 2025-10-14 10:10:54.741121387 +0000 UTC m=+0.080032830 container health_status 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS)
Oct 14 06:10:54 localhost podman[322539]: 2025-10-14 10:10:54.756078076 +0000 UTC m=+0.094989509 container exec_died 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0,
container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd) Oct 14 06:10:54 localhost systemd[1]: 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad.service: Deactivated successfully. Oct 14 06:10:54 localhost podman[322541]: 2025-10-14 10:10:54.801704447 +0000 UTC m=+0.132334546 container health_status fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true, config_id=iscsid, org.label-schema.vendor=CentOS, container_name=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', 
'/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0) Oct 14 06:10:54 localhost podman[322541]: 2025-10-14 10:10:54.837338772 +0000 UTC m=+0.167968851 container exec_died fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, managed_by=edpm_ansible, config_id=iscsid, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, io.buildah.version=1.41.3) Oct 14 06:10:54 localhost systemd[1]: fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29.service: 
Deactivated successfully. Oct 14 06:10:54 localhost neutron_sriov_agent[264974]: 2025-10-14 10:10:54.849 2 INFO neutron.agent.securitygroups_rpc [None req-913b626a-cd3c-47c0-b3fd-4256ea7d0f27 d6d06f9c969f4b25a388e6b1f8e79df2 4a912863089b4050b50010417538a2b4 - - default default] Security group member updated ['f4a71cc4-401e-4fd9-a76d-664285c1f988']#033[00m Oct 14 06:10:54 localhost podman[322540]: 2025-10-14 10:10:54.838092505 +0000 UTC m=+0.173736408 container health_status aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm) Oct 14 06:10:54 localhost podman[322540]: 2025-10-14 10:10:54.91931496 +0000 UTC m=+0.254958923 container 
exec_died aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors ) Oct 14 06:10:54 localhost systemd[1]: aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec.service: Deactivated successfully. 
Oct 14 06:10:56 localhost nova_compute[297686]: 2025-10-14 10:10:56.946 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:10:57 localhost ovn_metadata_agent[163050]: 2025-10-14 10:10:57.780 163055 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Oct 14 06:10:57 localhost ovn_metadata_agent[163050]: 2025-10-14 10:10:57.781 163055 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Oct 14 06:10:57 localhost ovn_metadata_agent[163050]: 2025-10-14 10:10:57.781 163055 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Oct 14 06:10:57 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:10:57.836 271987 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-10-14T10:10:57Z, description=, device_id=441b9482-507a-4462-bd9c-cff58309f1a4, device_owner=network:router_gateway, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=823a1e2e-ea05-4cb4-b006-8e12a7b081f2, ip_allocation=immediate, mac_address=fa:16:3e:8a:44:fc, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-10-14T08:36:38Z, description=, dns_domain=, 
id=c0145816-4627-44f2-af00-ccc9ef0436ed, ipv4_address_scope=None, ipv6_address_scope=None, is_default=False, l2_adjacency=True, mtu=1350, name=public, port_security_enabled=True, project_id=41187b090f3d4818a32baa37ce8a3991, provider:network_type=flat, provider:physical_network=datacentre, provider:segmentation_id=None, qos_policy_id=None, revision_number=2, router:external=True, shared=False, standard_attr_id=29, status=ACTIVE, subnets=['bbd40dc1-97d8-4ed3-9d4f-9b3af758b526'], tags=[], tenant_id=41187b090f3d4818a32baa37ce8a3991, updated_at=2025-10-14T08:36:44Z, vlan_transparent=None, network_id=c0145816-4627-44f2-af00-ccc9ef0436ed, port_security_enabled=False, project_id=, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=361, status=DOWN, tags=[], tenant_id=, updated_at=2025-10-14T10:10:57Z on network c0145816-4627-44f2-af00-ccc9ef0436ed#033[00m Oct 14 06:10:57 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:10:57.962 271987 INFO neutron.agent.linux.ip_lib [None req-c49495b3-08bb-42e7-964d-6947f30e6770 - - - - - -] Device tapdc3b8dd3-96 cannot be used as it has no MAC address#033[00m Oct 14 06:10:58 localhost nova_compute[297686]: 2025-10-14 10:10:58.030 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:10:58 localhost kernel: device tapdc3b8dd3-96 entered promiscuous mode Oct 14 06:10:58 localhost nova_compute[297686]: 2025-10-14 10:10:58.045 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:10:58 localhost systemd-udevd[322640]: Network interface NamePolicy= disabled on kernel command line. 
Oct 14 06:10:58 localhost dnsmasq[272303]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/addn_hosts - 6 addresses Oct 14 06:10:58 localhost dnsmasq-dhcp[272303]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/host Oct 14 06:10:58 localhost dnsmasq-dhcp[272303]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/opts Oct 14 06:10:58 localhost ovn_controller[157396]: 2025-10-14T10:10:58Z|00067|binding|INFO|Claiming lport dc3b8dd3-960d-42ca-9beb-c95bf3287cde for this chassis. Oct 14 06:10:58 localhost ovn_controller[157396]: 2025-10-14T10:10:58Z|00068|binding|INFO|dc3b8dd3-960d-42ca-9beb-c95bf3287cde: Claiming unknown Oct 14 06:10:58 localhost NetworkManager[5977]: [1760436658.0542] manager: (tapdc3b8dd3-96): new Generic device (/org/freedesktop/NetworkManager/Devices/19) Oct 14 06:10:58 localhost podman[322626]: 2025-10-14 10:10:58.048026122 +0000 UTC m=+0.062203442 container kill 373c3d29f2bfadc43d54cb6930a55bb623f8f4f5957bebc53e3e7158ee295964 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0145816-4627-44f2-af00-ccc9ef0436ed, tcib_managed=true, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2) Oct 14 06:10:58 localhost ovn_controller[157396]: 2025-10-14T10:10:58Z|00069|binding|INFO|Setting lport dc3b8dd3-960d-42ca-9beb-c95bf3287cde ovn-installed in OVS Oct 14 06:10:58 localhost ovn_controller[157396]: 2025-10-14T10:10:58Z|00070|binding|INFO|Setting lport dc3b8dd3-960d-42ca-9beb-c95bf3287cde up in Southbound Oct 14 06:10:58 localhost ovn_metadata_agent[163050]: 2025-10-14 10:10:58.063 163055 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), 
table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005486733.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpe1893814-19d7-5b97-af05-19e2aafa5382-0e5431a1-78f9-434e-8ef9-801516788dc4', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0e5431a1-78f9-434e-8ef9-801516788dc4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '675ac6c145fa461ebd6a2d48e6cc697b', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=dd1e5439-1a39-4e65-bfc8-95441e65eaf8, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=dc3b8dd3-960d-42ca-9beb-c95bf3287cde) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Oct 14 06:10:58 localhost ovn_metadata_agent[163050]: 2025-10-14 10:10:58.065 163055 INFO neutron.agent.ovn.metadata.agent [-] Port dc3b8dd3-960d-42ca-9beb-c95bf3287cde in datapath 0e5431a1-78f9-434e-8ef9-801516788dc4 bound to our chassis#033[00m Oct 14 06:10:58 localhost ovn_metadata_agent[163050]: 2025-10-14 10:10:58.069 163055 DEBUG neutron.agent.ovn.metadata.agent [-] Port 55323669-86d5-4360-8c8e-e4e99d5c0c24 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Oct 14 06:10:58 localhost ovn_metadata_agent[163050]: 2025-10-14 10:10:58.069 163055 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for 
network 0e5431a1-78f9-434e-8ef9-801516788dc4, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Oct 14 06:10:58 localhost ovn_metadata_agent[163050]: 2025-10-14 10:10:58.071 163159 DEBUG oslo.privsep.daemon [-] privsep: reply[b6c66c6b-dbdb-41e0-8350-af42f491559e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Oct 14 06:10:58 localhost nova_compute[297686]: 2025-10-14 10:10:58.072 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:10:58 localhost nova_compute[297686]: 2025-10-14 10:10:58.092 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:10:58 localhost nova_compute[297686]: 2025-10-14 10:10:58.147 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:10:58 localhost nova_compute[297686]: 2025-10-14 10:10:58.178 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:10:58 localhost podman[248187]: time="2025-10-14T10:10:58Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Oct 14 06:10:58 localhost podman[248187]: @ - - [14/Oct/2025:10:10:58 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 147497 "" "Go-http-client/1.1" Oct 14 06:10:58 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:10:58.343 271987 INFO neutron.agent.dhcp.agent [None req-41d09a28-15e0-4871-84c3-b21021ce6441 - - - - - -] DHCP configuration for ports {'823a1e2e-ea05-4cb4-b006-8e12a7b081f2'} is completed#033[00m Oct 14 06:10:58 localhost podman[248187]: @ - - [14/Oct/2025:10:10:58 +0000] "GET 
/v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19858 "" "Go-http-client/1.1" Oct 14 06:10:58 localhost podman[322704]: Oct 14 06:10:58 localhost podman[322704]: 2025-10-14 10:10:58.994892919 +0000 UTC m=+0.076162520 container create 173aa5dded75fe3fa83d7cc278e1951d15a95c4abee938cc9036a47356306678 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0e5431a1-78f9-434e-8ef9-801516788dc4, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251009, io.buildah.version=1.41.3, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5) Oct 14 06:10:59 localhost systemd[1]: Started libpod-conmon-173aa5dded75fe3fa83d7cc278e1951d15a95c4abee938cc9036a47356306678.scope. Oct 14 06:10:59 localhost systemd[1]: Started libcrun container. 
Oct 14 06:10:59 localhost podman[322704]: 2025-10-14 10:10:58.960493593 +0000 UTC m=+0.041763174 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Oct 14 06:10:59 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/747b431766382947c8d7c65b21fb42c1256bfdecb9c98504d6ef52da38531de9/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Oct 14 06:10:59 localhost podman[322704]: 2025-10-14 10:10:59.071628427 +0000 UTC m=+0.152898028 container init 173aa5dded75fe3fa83d7cc278e1951d15a95c4abee938cc9036a47356306678 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0e5431a1-78f9-434e-8ef9-801516788dc4, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, org.label-schema.schema-version=1.0) Oct 14 06:10:59 localhost podman[322704]: 2025-10-14 10:10:59.081109348 +0000 UTC m=+0.162378909 container start 173aa5dded75fe3fa83d7cc278e1951d15a95c4abee938cc9036a47356306678 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0e5431a1-78f9-434e-8ef9-801516788dc4, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251009, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image) Oct 14 06:10:59 localhost dnsmasq[322722]: started, version 2.85 cachesize 150 Oct 14 06:10:59 localhost dnsmasq[322722]: DNS service limited to local subnets Oct 14 06:10:59 localhost dnsmasq[322722]: compile time options: IPv6 GNU-getopt DBus no-UBus 
no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Oct 14 06:10:59 localhost dnsmasq[322722]: warning: no upstream servers configured Oct 14 06:10:59 localhost dnsmasq-dhcp[322722]: DHCP, static leases only on 10.100.0.0, lease time 1d Oct 14 06:10:59 localhost dnsmasq[322722]: read /var/lib/neutron/dhcp/0e5431a1-78f9-434e-8ef9-801516788dc4/addn_hosts - 0 addresses Oct 14 06:10:59 localhost dnsmasq-dhcp[322722]: read /var/lib/neutron/dhcp/0e5431a1-78f9-434e-8ef9-801516788dc4/host Oct 14 06:10:59 localhost dnsmasq-dhcp[322722]: read /var/lib/neutron/dhcp/0e5431a1-78f9-434e-8ef9-801516788dc4/opts Oct 14 06:10:59 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:10:59.236 271987 INFO neutron.agent.dhcp.agent [None req-d7350957-5046-45c1-974c-b17e962c43ff - - - - - -] DHCP configuration for ports {'8cd54cdf-8251-4e03-bc14-b8633e33d946'} is completed#033[00m Oct 14 06:10:59 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e86 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Oct 14 06:10:59 localhost nova_compute[297686]: 2025-10-14 10:10:59.281 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:10:59 localhost nova_compute[297686]: 2025-10-14 10:10:59.383 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:11:00 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:11:00.449 271987 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-10-14T10:10:59Z, description=, device_id=441b9482-507a-4462-bd9c-cff58309f1a4, device_owner=network:router_interface, dns_assignment=[], dns_domain=, 
dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=0fae7ba5-dddb-4bbf-9c27-de4582d951bd, ip_allocation=immediate, mac_address=fa:16:3e:c9:26:8f, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-10-14T10:10:55Z, description=, dns_domain=, id=0e5431a1-78f9-434e-8ef9-801516788dc4, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-FloatingIPDetailsNegativeTestJSON-66249174-network, port_security_enabled=True, project_id=675ac6c145fa461ebd6a2d48e6cc697b, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=54998, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=348, status=ACTIVE, subnets=['3b4b8ebd-6a77-4859-80f8-5e7763b1fc45'], tags=[], tenant_id=675ac6c145fa461ebd6a2d48e6cc697b, updated_at=2025-10-14T10:10:56Z, vlan_transparent=None, network_id=0e5431a1-78f9-434e-8ef9-801516788dc4, port_security_enabled=False, project_id=675ac6c145fa461ebd6a2d48e6cc697b, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=382, status=DOWN, tags=[], tenant_id=675ac6c145fa461ebd6a2d48e6cc697b, updated_at=2025-10-14T10:11:00Z on network 0e5431a1-78f9-434e-8ef9-801516788dc4#033[00m Oct 14 06:11:00 localhost dnsmasq[322722]: read /var/lib/neutron/dhcp/0e5431a1-78f9-434e-8ef9-801516788dc4/addn_hosts - 1 addresses Oct 14 06:11:00 localhost dnsmasq-dhcp[322722]: read /var/lib/neutron/dhcp/0e5431a1-78f9-434e-8ef9-801516788dc4/host Oct 14 06:11:00 localhost dnsmasq-dhcp[322722]: read /var/lib/neutron/dhcp/0e5431a1-78f9-434e-8ef9-801516788dc4/opts Oct 14 06:11:00 localhost podman[322739]: 2025-10-14 10:11:00.662764886 +0000 UTC m=+0.060589603 container kill 173aa5dded75fe3fa83d7cc278e1951d15a95c4abee938cc9036a47356306678 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, 
name=neutron-dnsmasq-qdhcp-0e5431a1-78f9-434e-8ef9-801516788dc4, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS) Oct 14 06:11:00 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:11:00.874 271987 INFO neutron.agent.dhcp.agent [None req-1b293df0-6131-4356-aa63-61a808c6032e - - - - - -] DHCP configuration for ports {'0fae7ba5-dddb-4bbf-9c27-de4582d951bd'} is completed#033[00m Oct 14 06:11:01 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:11:01.935 271987 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-10-14T10:10:59Z, description=, device_id=441b9482-507a-4462-bd9c-cff58309f1a4, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=0fae7ba5-dddb-4bbf-9c27-de4582d951bd, ip_allocation=immediate, mac_address=fa:16:3e:c9:26:8f, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-10-14T10:10:55Z, description=, dns_domain=, id=0e5431a1-78f9-434e-8ef9-801516788dc4, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-FloatingIPDetailsNegativeTestJSON-66249174-network, port_security_enabled=True, project_id=675ac6c145fa461ebd6a2d48e6cc697b, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=54998, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=348, status=ACTIVE, subnets=['3b4b8ebd-6a77-4859-80f8-5e7763b1fc45'], tags=[], tenant_id=675ac6c145fa461ebd6a2d48e6cc697b, 
updated_at=2025-10-14T10:10:56Z, vlan_transparent=None, network_id=0e5431a1-78f9-434e-8ef9-801516788dc4, port_security_enabled=False, project_id=675ac6c145fa461ebd6a2d48e6cc697b, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=382, status=DOWN, tags=[], tenant_id=675ac6c145fa461ebd6a2d48e6cc697b, updated_at=2025-10-14T10:11:00Z on network 0e5431a1-78f9-434e-8ef9-801516788dc4#033[00m Oct 14 06:11:01 localhost nova_compute[297686]: 2025-10-14 10:11:01.979 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:11:02 localhost dnsmasq[322722]: read /var/lib/neutron/dhcp/0e5431a1-78f9-434e-8ef9-801516788dc4/addn_hosts - 1 addresses Oct 14 06:11:02 localhost dnsmasq-dhcp[322722]: read /var/lib/neutron/dhcp/0e5431a1-78f9-434e-8ef9-801516788dc4/host Oct 14 06:11:02 localhost systemd[1]: tmp-crun.tScs2J.mount: Deactivated successfully. 
Oct 14 06:11:02 localhost dnsmasq-dhcp[322722]: read /var/lib/neutron/dhcp/0e5431a1-78f9-434e-8ef9-801516788dc4/opts Oct 14 06:11:02 localhost podman[322777]: 2025-10-14 10:11:02.122926441 +0000 UTC m=+0.050939096 container kill 173aa5dded75fe3fa83d7cc278e1951d15a95c4abee938cc9036a47356306678 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0e5431a1-78f9-434e-8ef9-801516788dc4, tcib_managed=true, org.label-schema.build-date=20251009, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2) Oct 14 06:11:02 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:11:02.316 271987 INFO neutron.agent.dhcp.agent [None req-a7ffd733-e321-4352-a7bd-9234e57e1f5d - - - - - -] DHCP configuration for ports {'0fae7ba5-dddb-4bbf-9c27-de4582d951bd'} is completed#033[00m Oct 14 06:11:04 localhost ceph-osd[31500]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #43. Immutable memtables: 0. Oct 14 06:11:04 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e86 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Oct 14 06:11:04 localhost nova_compute[297686]: 2025-10-14 10:11:04.389 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:11:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f. Oct 14 06:11:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917. 
Oct 14 06:11:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d. Oct 14 06:11:05 localhost systemd[1]: tmp-crun.qNGJ1x.mount: Deactivated successfully. Oct 14 06:11:05 localhost podman[322800]: 2025-10-14 10:11:05.746130443 +0000 UTC m=+0.079784131 container health_status 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, vcs-type=git, io.openshift.expose-services=, name=ubi9-minimal, release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
distribution-scope=public, managed_by=edpm_ansible, architecture=x86_64, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b) Oct 14 06:11:05 localhost podman[322800]: 2025-10-14 10:11:05.754763368 +0000 UTC m=+0.088417066 container exec_died 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, distribution-scope=public, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, architecture=x86_64, name=ubi9-minimal, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, io.buildah.version=1.33.7) Oct 14 06:11:05 localhost systemd[1]: 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917.service: Deactivated successfully. 
Oct 14 06:11:05 localhost podman[322801]: 2025-10-14 10:11:05.805894699 +0000 UTC m=+0.135977958 container health_status 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_id=edpm, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251009, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team) Oct 14 06:11:05 localhost podman[322801]: 2025-10-14 10:11:05.844084712 +0000 UTC m=+0.174167961 container exec_died 
89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0) Oct 14 06:11:05 localhost systemd[1]: 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d.service: Deactivated successfully. 
Oct 14 06:11:05 localhost podman[322799]: 2025-10-14 10:11:05.864293763 +0000 UTC m=+0.201111659 container health_status 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2) Oct 14 06:11:05 localhost podman[322799]: 2025-10-14 10:11:05.903134306 +0000 UTC m=+0.239952182 container exec_died 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251009, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': 
'/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS) Oct 14 06:11:05 localhost systemd[1]: 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f.service: Deactivated successfully. 
Oct 14 06:11:06 localhost dnsmasq[322722]: read /var/lib/neutron/dhcp/0e5431a1-78f9-434e-8ef9-801516788dc4/addn_hosts - 0 addresses Oct 14 06:11:06 localhost dnsmasq-dhcp[322722]: read /var/lib/neutron/dhcp/0e5431a1-78f9-434e-8ef9-801516788dc4/host Oct 14 06:11:06 localhost podman[322880]: 2025-10-14 10:11:06.381167021 +0000 UTC m=+0.064066419 container kill 173aa5dded75fe3fa83d7cc278e1951d15a95c4abee938cc9036a47356306678 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0e5431a1-78f9-434e-8ef9-801516788dc4, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team) Oct 14 06:11:06 localhost dnsmasq-dhcp[322722]: read /var/lib/neutron/dhcp/0e5431a1-78f9-434e-8ef9-801516788dc4/opts Oct 14 06:11:06 localhost nova_compute[297686]: 2025-10-14 10:11:06.535 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:11:06 localhost ovn_controller[157396]: 2025-10-14T10:11:06Z|00071|binding|INFO|Releasing lport dc3b8dd3-960d-42ca-9beb-c95bf3287cde from this chassis (sb_readonly=0) Oct 14 06:11:06 localhost kernel: device tapdc3b8dd3-96 left promiscuous mode Oct 14 06:11:06 localhost ovn_controller[157396]: 2025-10-14T10:11:06Z|00072|binding|INFO|Setting lport dc3b8dd3-960d-42ca-9beb-c95bf3287cde down in Southbound Oct 14 06:11:06 localhost ovn_metadata_agent[163050]: 2025-10-14 10:11:06.546 163055 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], 
virtual_parent=[], up=[False], options={'requested-chassis': 'np0005486733.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpe1893814-19d7-5b97-af05-19e2aafa5382-0e5431a1-78f9-434e-8ef9-801516788dc4', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0e5431a1-78f9-434e-8ef9-801516788dc4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '675ac6c145fa461ebd6a2d48e6cc697b', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005486733.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=dd1e5439-1a39-4e65-bfc8-95441e65eaf8, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=dc3b8dd3-960d-42ca-9beb-c95bf3287cde) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Oct 14 06:11:06 localhost ovn_metadata_agent[163050]: 2025-10-14 10:11:06.547 163055 INFO neutron.agent.ovn.metadata.agent [-] Port dc3b8dd3-960d-42ca-9beb-c95bf3287cde in datapath 0e5431a1-78f9-434e-8ef9-801516788dc4 unbound from our chassis#033[00m Oct 14 06:11:06 localhost ovn_metadata_agent[163050]: 2025-10-14 10:11:06.550 163055 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 0e5431a1-78f9-434e-8ef9-801516788dc4, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Oct 14 06:11:06 localhost ovn_metadata_agent[163050]: 2025-10-14 10:11:06.551 163159 DEBUG oslo.privsep.daemon [-] privsep: reply[12a9dab5-6c16-4786-9028-6a6b9216ec84]: (4, False) _call_back 
/usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Oct 14 06:11:06 localhost nova_compute[297686]: 2025-10-14 10:11:06.560 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:11:06 localhost systemd[1]: tmp-crun.gon5Um.mount: Deactivated successfully. Oct 14 06:11:07 localhost nova_compute[297686]: 2025-10-14 10:11:07.025 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:11:08 localhost openstack_network_exporter[250374]: ERROR 10:11:08 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Oct 14 06:11:08 localhost openstack_network_exporter[250374]: ERROR 10:11:08 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Oct 14 06:11:08 localhost openstack_network_exporter[250374]: ERROR 10:11:08 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Oct 14 06:11:08 localhost openstack_network_exporter[250374]: ERROR 10:11:08 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Oct 14 06:11:08 localhost openstack_network_exporter[250374]: Oct 14 06:11:08 localhost openstack_network_exporter[250374]: ERROR 10:11:08 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Oct 14 06:11:08 localhost openstack_network_exporter[250374]: Oct 14 06:11:08 localhost ovn_controller[157396]: 2025-10-14T10:11:08Z|00073|binding|INFO|Releasing lport 25c6586a-239c-451b-aac2-e0a3ee5c3145 from this chassis (sb_readonly=0) Oct 14 06:11:08 localhost systemd[1]: tmp-crun.ep2gnE.mount: Deactivated successfully. 
Oct 14 06:11:08 localhost podman[322919]: 2025-10-14 10:11:08.92803788 +0000 UTC m=+0.048244903 container kill 373c3d29f2bfadc43d54cb6930a55bb623f8f4f5957bebc53e3e7158ee295964 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0145816-4627-44f2-af00-ccc9ef0436ed, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true) Oct 14 06:11:08 localhost dnsmasq[272303]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/addn_hosts - 5 addresses Oct 14 06:11:08 localhost dnsmasq-dhcp[272303]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/host Oct 14 06:11:08 localhost dnsmasq-dhcp[272303]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/opts Oct 14 06:11:08 localhost nova_compute[297686]: 2025-10-14 10:11:08.972 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:11:09 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e86 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Oct 14 06:11:09 localhost nova_compute[297686]: 2025-10-14 10:11:09.390 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:11:09 localhost dnsmasq[322722]: exiting on receipt of SIGTERM Oct 14 06:11:09 localhost podman[322958]: 2025-10-14 10:11:09.714057365 +0000 UTC m=+0.048197872 container kill 173aa5dded75fe3fa83d7cc278e1951d15a95c4abee938cc9036a47356306678 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, 
name=neutron-dnsmasq-qdhcp-0e5431a1-78f9-434e-8ef9-801516788dc4, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, tcib_managed=true) Oct 14 06:11:09 localhost systemd[1]: libpod-173aa5dded75fe3fa83d7cc278e1951d15a95c4abee938cc9036a47356306678.scope: Deactivated successfully. Oct 14 06:11:09 localhost podman[322970]: 2025-10-14 10:11:09.755141077 +0000 UTC m=+0.033556692 container died 173aa5dded75fe3fa83d7cc278e1951d15a95c4abee938cc9036a47356306678 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0e5431a1-78f9-434e-8ef9-801516788dc4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.build-date=20251009, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS) Oct 14 06:11:09 localhost podman[322970]: 2025-10-14 10:11:09.778740063 +0000 UTC m=+0.057155648 container cleanup 173aa5dded75fe3fa83d7cc278e1951d15a95c4abee938cc9036a47356306678 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0e5431a1-78f9-434e-8ef9-801516788dc4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251009, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, io.buildah.version=1.41.3) Oct 14 06:11:09 localhost systemd[1]: libpod-conmon-173aa5dded75fe3fa83d7cc278e1951d15a95c4abee938cc9036a47356306678.scope: Deactivated 
successfully. Oct 14 06:11:09 localhost podman[322977]: 2025-10-14 10:11:09.843058369 +0000 UTC m=+0.110656481 container remove 173aa5dded75fe3fa83d7cc278e1951d15a95c4abee938cc9036a47356306678 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0e5431a1-78f9-434e-8ef9-801516788dc4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0) Oct 14 06:11:09 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:11:09.873 271987 INFO neutron.agent.dhcp.agent [None req-9560e0a2-024d-400f-98dc-562bd88e9267 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Oct 14 06:11:09 localhost systemd[1]: var-lib-containers-storage-overlay-747b431766382947c8d7c65b21fb42c1256bfdecb9c98504d6ef52da38531de9-merged.mount: Deactivated successfully. Oct 14 06:11:09 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-173aa5dded75fe3fa83d7cc278e1951d15a95c4abee938cc9036a47356306678-userdata-shm.mount: Deactivated successfully. Oct 14 06:11:09 localhost systemd[1]: run-netns-qdhcp\x2d0e5431a1\x2d78f9\x2d434e\x2d8ef9\x2d801516788dc4.mount: Deactivated successfully. 
Oct 14 06:11:10 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:11:10.052 271987 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Oct 14 06:11:12 localhost nova_compute[297686]: 2025-10-14 10:11:12.028 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:11:14 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e86 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Oct 14 06:11:14 localhost nova_compute[297686]: 2025-10-14 10:11:14.393 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:11:17 localhost nova_compute[297686]: 2025-10-14 10:11:17.032 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:11:18 localhost nova_compute[297686]: 2025-10-14 10:11:18.265 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:11:18 localhost ovn_metadata_agent[163050]: 2025-10-14 10:11:18.265 163055 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=9, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'b6:6b:50', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '6a:59:81:01:bc:8b'}, ipsec=False) old=SB_Global(nb_cfg=8) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Oct 14 06:11:18 localhost ovn_metadata_agent[163050]: 2025-10-14 10:11:18.268 163055 DEBUG 
neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m Oct 14 06:11:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041. Oct 14 06:11:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857. Oct 14 06:11:18 localhost podman[323016]: 2025-10-14 10:11:18.664851216 +0000 UTC m=+0.090560363 container health_status 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Oct 14 06:11:18 localhost podman[323016]: 2025-10-14 10:11:18.672833511 +0000 UTC m=+0.098542648 container exec_died 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 
'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Oct 14 06:11:18 localhost systemd[1]: 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041.service: Deactivated successfully. Oct 14 06:11:18 localhost podman[323017]: 2025-10-14 10:11:18.724641923 +0000 UTC m=+0.145345746 container health_status 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', 
'/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent) Oct 14 06:11:18 localhost podman[323017]: 2025-10-14 10:11:18.759105301 +0000 UTC m=+0.179809084 container exec_died 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image) Oct 14 06:11:18 localhost systemd[1]: 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857.service: Deactivated successfully. Oct 14 06:11:19 localhost nova_compute[297686]: 2025-10-14 10:11:19.255 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:11:19 localhost nova_compute[297686]: 2025-10-14 10:11:19.257 2 DEBUG nova.compute.manager [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Cleaning up deleted instances with incomplete migration _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m Oct 14 06:11:19 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e86 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Oct 14 06:11:19 localhost nova_compute[297686]: 2025-10-14 10:11:19.396 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:11:20 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:11:20.012 271987 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port 
admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-10-14T10:11:19Z, description=, device_id=c0b41014-4438-4c04-87db-17aca2fa351b, device_owner=network:router_gateway, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=243429a1-371e-4481-a50d-e94b4345b000, ip_allocation=immediate, mac_address=fa:16:3e:ad:02:f1, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-10-14T08:36:38Z, description=, dns_domain=, id=c0145816-4627-44f2-af00-ccc9ef0436ed, ipv4_address_scope=None, ipv6_address_scope=None, is_default=False, l2_adjacency=True, mtu=1350, name=public, port_security_enabled=True, project_id=41187b090f3d4818a32baa37ce8a3991, provider:network_type=flat, provider:physical_network=datacentre, provider:segmentation_id=None, qos_policy_id=None, revision_number=2, router:external=True, shared=False, standard_attr_id=29, status=ACTIVE, subnets=['bbd40dc1-97d8-4ed3-9d4f-9b3af758b526'], tags=[], tenant_id=41187b090f3d4818a32baa37ce8a3991, updated_at=2025-10-14T08:36:44Z, vlan_transparent=None, network_id=c0145816-4627-44f2-af00-ccc9ef0436ed, port_security_enabled=False, project_id=, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=477, status=DOWN, tags=[], tenant_id=, updated_at=2025-10-14T10:11:19Z on network c0145816-4627-44f2-af00-ccc9ef0436ed#033[00m Oct 14 06:11:20 localhost dnsmasq[272303]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/addn_hosts - 6 addresses Oct 14 06:11:20 localhost dnsmasq-dhcp[272303]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/host Oct 14 06:11:20 localhost podman[323142]: 2025-10-14 10:11:20.224859038 +0000 UTC m=+0.046953023 container kill 373c3d29f2bfadc43d54cb6930a55bb623f8f4f5957bebc53e3e7158ee295964 
(image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0145816-4627-44f2-af00-ccc9ef0436ed, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true) Oct 14 06:11:20 localhost dnsmasq-dhcp[272303]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/opts Oct 14 06:11:20 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Oct 14 06:11:20 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' Oct 14 06:11:20 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:11:20.364 271987 INFO neutron.agent.linux.ip_lib [None req-c7ac5375-bb46-41f7-9e20-d62530079a14 - - - - - -] Device tapd04cf974-c1 cannot be used as it has no MAC address#033[00m Oct 14 06:11:20 localhost nova_compute[297686]: 2025-10-14 10:11:20.443 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:11:20 localhost kernel: device tapd04cf974-c1 entered promiscuous mode Oct 14 06:11:20 localhost ovn_controller[157396]: 2025-10-14T10:11:20Z|00074|binding|INFO|Claiming lport d04cf974-c1dc-44ed-a9a5-2e9c296629d5 for this chassis. 
Oct 14 06:11:20 localhost ovn_controller[157396]: 2025-10-14T10:11:20Z|00075|binding|INFO|d04cf974-c1dc-44ed-a9a5-2e9c296629d5: Claiming unknown Oct 14 06:11:20 localhost NetworkManager[5977]: [1760436680.4549] manager: (tapd04cf974-c1): new Generic device (/org/freedesktop/NetworkManager/Devices/20) Oct 14 06:11:20 localhost nova_compute[297686]: 2025-10-14 10:11:20.457 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:11:20 localhost systemd-udevd[323174]: Network interface NamePolicy= disabled on kernel command line. Oct 14 06:11:20 localhost ovn_metadata_agent[163050]: 2025-10-14 10:11:20.470 163055 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005486733.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpe1893814-19d7-5b97-af05-19e2aafa5382-f1f43916-0af0-4af9-8f19-f6fc2753bec8', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f1f43916-0af0-4af9-8f19-f6fc2753bec8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9044d9d601704eeba0d285167c4f4d39', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e99faa9c-58fc-4480-8d7e-2c721382ff66, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=d04cf974-c1dc-44ed-a9a5-2e9c296629d5) old=Port_Binding(chassis=[]) matches 
/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Oct 14 06:11:20 localhost ovn_metadata_agent[163050]: 2025-10-14 10:11:20.471 163055 INFO neutron.agent.ovn.metadata.agent [-] Port d04cf974-c1dc-44ed-a9a5-2e9c296629d5 in datapath f1f43916-0af0-4af9-8f19-f6fc2753bec8 bound to our chassis#033[00m Oct 14 06:11:20 localhost ovn_metadata_agent[163050]: 2025-10-14 10:11:20.478 163055 DEBUG neutron.agent.ovn.metadata.agent [-] Port fba467bd-2c97-4700-b3b1-ada7ea10d5e0 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Oct 14 06:11:20 localhost ovn_metadata_agent[163050]: 2025-10-14 10:11:20.478 163055 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f1f43916-0af0-4af9-8f19-f6fc2753bec8, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Oct 14 06:11:20 localhost ovn_metadata_agent[163050]: 2025-10-14 10:11:20.480 163159 DEBUG oslo.privsep.daemon [-] privsep: reply[697b70c4-df6d-40a2-903d-c50b13d769d2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Oct 14 06:11:20 localhost journal[237477]: ethtool ioctl error on tapd04cf974-c1: No such device Oct 14 06:11:20 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:11:20.491 271987 INFO neutron.agent.dhcp.agent [None req-a86f76cb-6e04-4318-8717-36001b0afc7c - - - - - -] DHCP configuration for ports {'243429a1-371e-4481-a50d-e94b4345b000'} is completed#033[00m Oct 14 06:11:20 localhost ovn_controller[157396]: 2025-10-14T10:11:20Z|00076|binding|INFO|Setting lport d04cf974-c1dc-44ed-a9a5-2e9c296629d5 ovn-installed in OVS Oct 14 06:11:20 localhost ovn_controller[157396]: 2025-10-14T10:11:20Z|00077|binding|INFO|Setting lport d04cf974-c1dc-44ed-a9a5-2e9c296629d5 up in Southbound Oct 14 06:11:20 localhost journal[237477]: 
ethtool ioctl error on tapd04cf974-c1: No such device Oct 14 06:11:20 localhost nova_compute[297686]: 2025-10-14 10:11:20.497 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:11:20 localhost journal[237477]: ethtool ioctl error on tapd04cf974-c1: No such device Oct 14 06:11:20 localhost journal[237477]: ethtool ioctl error on tapd04cf974-c1: No such device Oct 14 06:11:20 localhost journal[237477]: ethtool ioctl error on tapd04cf974-c1: No such device Oct 14 06:11:20 localhost journal[237477]: ethtool ioctl error on tapd04cf974-c1: No such device Oct 14 06:11:20 localhost journal[237477]: ethtool ioctl error on tapd04cf974-c1: No such device Oct 14 06:11:20 localhost journal[237477]: ethtool ioctl error on tapd04cf974-c1: No such device Oct 14 06:11:20 localhost nova_compute[297686]: 2025-10-14 10:11:20.534 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:11:20 localhost nova_compute[297686]: 2025-10-14 10:11:20.562 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:11:20 localhost nova_compute[297686]: 2025-10-14 10:11:20.861 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:11:21 localhost nova_compute[297686]: 2025-10-14 10:11:21.266 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:11:21 localhost nova_compute[297686]: 2025-10-14 10:11:21.266 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task 
ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:11:21 localhost nova_compute[297686]: 2025-10-14 10:11:21.266 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:11:21 localhost nova_compute[297686]: 2025-10-14 10:11:21.267 2 DEBUG nova.compute.manager [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m Oct 14 06:11:21 localhost nova_compute[297686]: 2025-10-14 10:11:21.376 2 DEBUG nova.compute.manager [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m Oct 14 06:11:21 localhost podman[323246]: Oct 14 06:11:21 localhost podman[323246]: 2025-10-14 10:11:21.424883333 +0000 UTC m=+0.092783522 container create f9c9f20b58b3424a48f62d277f95536e88dae3f742d2cc9ad31d5c53c340eb0b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f1f43916-0af0-4af9-8f19-f6fc2753bec8, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251009, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, io.buildah.version=1.41.3, org.label-schema.license=GPLv2) Oct 14 06:11:21 localhost systemd[1]: Started libpod-conmon-f9c9f20b58b3424a48f62d277f95536e88dae3f742d2cc9ad31d5c53c340eb0b.scope. 
Oct 14 06:11:21 localhost podman[323246]: 2025-10-14 10:11:21.380972083 +0000 UTC m=+0.048872312 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Oct 14 06:11:21 localhost systemd[1]: Started libcrun container. Oct 14 06:11:21 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/84fa18b9115ffa7fbc3a0ebcc4ddc79bc8ef5ed35430398b56fb2af8f7f9c889/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Oct 14 06:11:21 localhost podman[323246]: 2025-10-14 10:11:21.50845543 +0000 UTC m=+0.176355629 container init f9c9f20b58b3424a48f62d277f95536e88dae3f742d2cc9ad31d5c53c340eb0b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f1f43916-0af0-4af9-8f19-f6fc2753bec8, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, io.buildah.version=1.41.3, org.label-schema.build-date=20251009) Oct 14 06:11:21 localhost podman[323246]: 2025-10-14 10:11:21.51857335 +0000 UTC m=+0.186473549 container start f9c9f20b58b3424a48f62d277f95536e88dae3f742d2cc9ad31d5c53c340eb0b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f1f43916-0af0-4af9-8f19-f6fc2753bec8, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true) Oct 14 06:11:21 localhost dnsmasq[323264]: started, version 2.85 cachesize 150 Oct 14 06:11:21 localhost dnsmasq[323264]: DNS service limited to local subnets Oct 14 06:11:21 localhost 
dnsmasq[323264]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Oct 14 06:11:21 localhost dnsmasq[323264]: warning: no upstream servers configured Oct 14 06:11:21 localhost dnsmasq-dhcp[323264]: DHCP, static leases only on 10.100.0.0, lease time 1d Oct 14 06:11:21 localhost dnsmasq[323264]: read /var/lib/neutron/dhcp/f1f43916-0af0-4af9-8f19-f6fc2753bec8/addn_hosts - 0 addresses Oct 14 06:11:21 localhost dnsmasq-dhcp[323264]: read /var/lib/neutron/dhcp/f1f43916-0af0-4af9-8f19-f6fc2753bec8/host Oct 14 06:11:21 localhost dnsmasq-dhcp[323264]: read /var/lib/neutron/dhcp/f1f43916-0af0-4af9-8f19-f6fc2753bec8/opts Oct 14 06:11:21 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:11:21.634 271987 INFO neutron.agent.dhcp.agent [None req-a22ff6f3-5180-4655-acba-a0622aadc28e - - - - - -] DHCP configuration for ports {'a1312e85-be4f-4d42-acf8-f8e545f2ac9f'} is completed#033[00m Oct 14 06:11:22 localhost nova_compute[297686]: 2025-10-14 10:11:22.035 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:11:22 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:11:22.429 271987 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-10-14T10:11:21Z, description=, device_id=c0b41014-4438-4c04-87db-17aca2fa351b, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=8a21048e-541e-4051-8bbd-c9ff64b7a0c7, ip_allocation=immediate, mac_address=fa:16:3e:4c:bc:c6, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-10-14T10:11:17Z, description=, dns_domain=, 
id=f1f43916-0af0-4af9-8f19-f6fc2753bec8, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-FloatingIPsNegativeTestJSON-802293793-network, port_security_enabled=True, project_id=9044d9d601704eeba0d285167c4f4d39, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=21810, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=455, status=ACTIVE, subnets=['adabb10c-5c8a-4de7-aa9b-b2fc11cbbad5'], tags=[], tenant_id=9044d9d601704eeba0d285167c4f4d39, updated_at=2025-10-14T10:11:18Z, vlan_transparent=None, network_id=f1f43916-0af0-4af9-8f19-f6fc2753bec8, port_security_enabled=False, project_id=9044d9d601704eeba0d285167c4f4d39, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=483, status=DOWN, tags=[], tenant_id=9044d9d601704eeba0d285167c4f4d39, updated_at=2025-10-14T10:11:22Z on network f1f43916-0af0-4af9-8f19-f6fc2753bec8#033[00m Oct 14 06:11:22 localhost dnsmasq[323264]: read /var/lib/neutron/dhcp/f1f43916-0af0-4af9-8f19-f6fc2753bec8/addn_hosts - 1 addresses Oct 14 06:11:22 localhost dnsmasq-dhcp[323264]: read /var/lib/neutron/dhcp/f1f43916-0af0-4af9-8f19-f6fc2753bec8/host Oct 14 06:11:22 localhost dnsmasq-dhcp[323264]: read /var/lib/neutron/dhcp/f1f43916-0af0-4af9-8f19-f6fc2753bec8/opts Oct 14 06:11:22 localhost podman[323282]: 2025-10-14 10:11:22.677352717 +0000 UTC m=+0.063825791 container kill f9c9f20b58b3424a48f62d277f95536e88dae3f742d2cc9ad31d5c53c340eb0b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f1f43916-0af0-4af9-8f19-f6fc2753bec8, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, 
org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2) Oct 14 06:11:22 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:11:22.899 271987 INFO neutron.agent.dhcp.agent [None req-714db0bb-a08f-47db-bd5c-ae3657e6204d - - - - - -] DHCP configuration for ports {'8a21048e-541e-4051-8bbd-c9ff64b7a0c7'} is completed#033[00m Oct 14 06:11:23 localhost nova_compute[297686]: 2025-10-14 10:11:23.366 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:11:23 localhost nova_compute[297686]: 2025-10-14 10:11:23.366 2 DEBUG nova.compute.manager [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Oct 14 06:11:23 localhost nova_compute[297686]: 2025-10-14 10:11:23.366 2 DEBUG nova.compute.manager [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Oct 14 06:11:23 localhost nova_compute[297686]: 2025-10-14 10:11:23.450 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Acquiring lock "refresh_cache-88c4e366-b765-47a6-96bf-f7677f2ce67c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Oct 14 06:11:23 localhost nova_compute[297686]: 2025-10-14 10:11:23.450 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Acquired lock "refresh_cache-88c4e366-b765-47a6-96bf-f7677f2ce67c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Oct 14 06:11:23 localhost nova_compute[297686]: 2025-10-14 10:11:23.451 2 DEBUG nova.network.neutron [None 
req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] [instance: 88c4e366-b765-47a6-96bf-f7677f2ce67c] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Oct 14 06:11:23 localhost nova_compute[297686]: 2025-10-14 10:11:23.451 2 DEBUG nova.objects.instance [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Lazy-loading 'info_cache' on Instance uuid 88c4e366-b765-47a6-96bf-f7677f2ce67c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Oct 14 06:11:23 localhost nova_compute[297686]: 2025-10-14 10:11:23.909 2 DEBUG nova.network.neutron [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] [instance: 88c4e366-b765-47a6-96bf-f7677f2ce67c] Updating instance_info_cache with network_info: [{"id": "3ec9b060-f43d-4698-9c76-6062c70911d5", "address": "fa:16:3e:84:5e:e5", "network": {"id": "7d0cd696-bdd7-4e70-9512-eb0d23640314", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.46", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "41187b090f3d4818a32baa37ce8a3991", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3ec9b060-f4", "ovs_interfaceid": "3ec9b060-f43d-4698-9c76-6062c70911d5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info 
/usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Oct 14 06:11:23 localhost nova_compute[297686]: 2025-10-14 10:11:23.927 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Releasing lock "refresh_cache-88c4e366-b765-47a6-96bf-f7677f2ce67c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Oct 14 06:11:23 localhost nova_compute[297686]: 2025-10-14 10:11:23.927 2 DEBUG nova.compute.manager [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] [instance: 88c4e366-b765-47a6-96bf-f7677f2ce67c] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Oct 14 06:11:24 localhost nova_compute[297686]: 2025-10-14 10:11:24.255 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:11:24 localhost nova_compute[297686]: 2025-10-14 10:11:24.255 2 DEBUG nova.compute.manager [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... 
_reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Oct 14 06:11:24 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' Oct 14 06:11:24 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e86 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Oct 14 06:11:24 localhost nova_compute[297686]: 2025-10-14 10:11:24.448 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:11:24 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:11:24.475 271987 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-10-14T10:11:21Z, description=, device_id=c0b41014-4438-4c04-87db-17aca2fa351b, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=8a21048e-541e-4051-8bbd-c9ff64b7a0c7, ip_allocation=immediate, mac_address=fa:16:3e:4c:bc:c6, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-10-14T10:11:17Z, description=, dns_domain=, id=f1f43916-0af0-4af9-8f19-f6fc2753bec8, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-FloatingIPsNegativeTestJSON-802293793-network, port_security_enabled=True, project_id=9044d9d601704eeba0d285167c4f4d39, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=21810, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=455, status=ACTIVE, subnets=['adabb10c-5c8a-4de7-aa9b-b2fc11cbbad5'], tags=[], tenant_id=9044d9d601704eeba0d285167c4f4d39, updated_at=2025-10-14T10:11:18Z, vlan_transparent=None, 
network_id=f1f43916-0af0-4af9-8f19-f6fc2753bec8, port_security_enabled=False, project_id=9044d9d601704eeba0d285167c4f4d39, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=483, status=DOWN, tags=[], tenant_id=9044d9d601704eeba0d285167c4f4d39, updated_at=2025-10-14T10:11:22Z on network f1f43916-0af0-4af9-8f19-f6fc2753bec8#033[00m Oct 14 06:11:24 localhost dnsmasq[323264]: read /var/lib/neutron/dhcp/f1f43916-0af0-4af9-8f19-f6fc2753bec8/addn_hosts - 1 addresses Oct 14 06:11:24 localhost dnsmasq-dhcp[323264]: read /var/lib/neutron/dhcp/f1f43916-0af0-4af9-8f19-f6fc2753bec8/host Oct 14 06:11:24 localhost dnsmasq-dhcp[323264]: read /var/lib/neutron/dhcp/f1f43916-0af0-4af9-8f19-f6fc2753bec8/opts Oct 14 06:11:24 localhost podman[323320]: 2025-10-14 10:11:24.680217825 +0000 UTC m=+0.042240509 container kill f9c9f20b58b3424a48f62d277f95536e88dae3f742d2cc9ad31d5c53c340eb0b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f1f43916-0af0-4af9-8f19-f6fc2753bec8, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251009) Oct 14 06:11:24 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:11:24.972 271987 INFO neutron.agent.dhcp.agent [None req-41e4b4ec-1be0-4939-98e0-314a327e849a - - - - - -] DHCP configuration for ports {'8a21048e-541e-4051-8bbd-c9ff64b7a0c7'} is completed#033[00m Oct 14 06:11:25 localhost nova_compute[297686]: 2025-10-14 10:11:25.255 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks 
/usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:11:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad. Oct 14 06:11:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec. Oct 14 06:11:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29. Oct 14 06:11:25 localhost systemd[1]: tmp-crun.lNxzuY.mount: Deactivated successfully. Oct 14 06:11:25 localhost podman[323343]: 2025-10-14 10:11:25.773596592 +0000 UTC m=+0.103867672 container health_status 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', 
'/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251009, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, container_name=multipathd, org.label-schema.license=GPLv2, io.buildah.version=1.41.3) Oct 14 06:11:25 localhost podman[323343]: 2025-10-14 10:11:25.810987731 +0000 UTC m=+0.141258821 container exec_died 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_managed=true, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', 
'/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible) Oct 14 06:11:25 localhost systemd[1]: 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad.service: Deactivated successfully. Oct 14 06:11:25 localhost podman[323344]: 2025-10-14 10:11:25.829218401 +0000 UTC m=+0.155008852 container health_status aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Oct 14 06:11:25 localhost 
podman[323345]: 2025-10-14 10:11:25.867107445 +0000 UTC m=+0.192139713 container health_status fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, org.label-schema.license=GPLv2, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, container_name=iscsid, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image) Oct 14 06:11:25 localhost podman[323345]: 2025-10-14 10:11:25.877724871 +0000 UTC m=+0.202757159 container exec_died fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, 
container_name=iscsid, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, managed_by=edpm_ansible) Oct 14 06:11:25 localhost podman[323344]: 2025-10-14 10:11:25.887737909 +0000 UTC m=+0.213528400 container exec_died aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 
'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Oct 14 06:11:25 localhost systemd[1]: fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29.service: Deactivated successfully. Oct 14 06:11:25 localhost systemd[1]: aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec.service: Deactivated successfully. Oct 14 06:11:26 localhost nova_compute[297686]: 2025-10-14 10:11:26.254 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:11:26 localhost systemd[1]: tmp-crun.nAjCKb.mount: Deactivated successfully. 
Oct 14 06:11:27 localhost nova_compute[297686]: 2025-10-14 10:11:27.069 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:11:27 localhost nova_compute[297686]: 2025-10-14 10:11:27.255 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:11:28 localhost nova_compute[297686]: 2025-10-14 10:11:28.255 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:11:28 localhost ovn_metadata_agent[163050]: 2025-10-14 10:11:28.270 163055 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=9e4b0f79-1220-4c7d-a18d-fa1a88dab362, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '9'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Oct 14 06:11:28 localhost nova_compute[297686]: 2025-10-14 10:11:28.277 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Oct 14 06:11:28 localhost nova_compute[297686]: 2025-10-14 10:11:28.277 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner 
/usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Oct 14 06:11:28 localhost nova_compute[297686]: 2025-10-14 10:11:28.278 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Oct 14 06:11:28 localhost nova_compute[297686]: 2025-10-14 10:11:28.278 2 DEBUG nova.compute.resource_tracker [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Auditing locally available compute resources for np0005486733.localdomain (node: np0005486733.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Oct 14 06:11:28 localhost nova_compute[297686]: 2025-10-14 10:11:28.278 2 DEBUG oslo_concurrency.processutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Oct 14 06:11:28 localhost podman[248187]: time="2025-10-14T10:11:28Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Oct 14 06:11:28 localhost podman[248187]: @ - - [14/Oct/2025:10:11:28 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 149321 "" "Go-http-client/1.1" Oct 14 06:11:28 localhost podman[248187]: @ - - [14/Oct/2025:10:11:28 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 20334 "" "Go-http-client/1.1" Oct 14 06:11:28 localhost dnsmasq[323264]: read /var/lib/neutron/dhcp/f1f43916-0af0-4af9-8f19-f6fc2753bec8/addn_hosts - 0 addresses Oct 14 06:11:28 localhost dnsmasq-dhcp[323264]: read /var/lib/neutron/dhcp/f1f43916-0af0-4af9-8f19-f6fc2753bec8/host 
Oct 14 06:11:28 localhost podman[323440]: 2025-10-14 10:11:28.565333014 +0000 UTC m=+0.043916591 container kill f9c9f20b58b3424a48f62d277f95536e88dae3f742d2cc9ad31d5c53c340eb0b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f1f43916-0af0-4af9-8f19-f6fc2753bec8, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2) Oct 14 06:11:28 localhost dnsmasq-dhcp[323264]: read /var/lib/neutron/dhcp/f1f43916-0af0-4af9-8f19-f6fc2753bec8/opts Oct 14 06:11:28 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Oct 14 06:11:28 localhost ceph-mon[317114]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.108:0/3010111796' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Oct 14 06:11:28 localhost nova_compute[297686]: 2025-10-14 10:11:28.721 2 DEBUG oslo_concurrency.processutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.443s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Oct 14 06:11:28 localhost kernel: device tapd04cf974-c1 left promiscuous mode Oct 14 06:11:28 localhost ovn_controller[157396]: 2025-10-14T10:11:28Z|00078|binding|INFO|Releasing lport d04cf974-c1dc-44ed-a9a5-2e9c296629d5 from this chassis (sb_readonly=0) Oct 14 06:11:28 localhost ovn_controller[157396]: 2025-10-14T10:11:28Z|00079|binding|INFO|Setting lport d04cf974-c1dc-44ed-a9a5-2e9c296629d5 down in Southbound Oct 14 06:11:28 localhost nova_compute[297686]: 2025-10-14 10:11:28.774 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:11:28 localhost ovn_metadata_agent[163050]: 2025-10-14 10:11:28.786 163055 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005486733.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpe1893814-19d7-5b97-af05-19e2aafa5382-f1f43916-0af0-4af9-8f19-f6fc2753bec8', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f1f43916-0af0-4af9-8f19-f6fc2753bec8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9044d9d601704eeba0d285167c4f4d39', 'neutron:revision_number': '3', 
'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005486733.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e99faa9c-58fc-4480-8d7e-2c721382ff66, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=d04cf974-c1dc-44ed-a9a5-2e9c296629d5) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Oct 14 06:11:28 localhost ovn_metadata_agent[163050]: 2025-10-14 10:11:28.788 163055 INFO neutron.agent.ovn.metadata.agent [-] Port d04cf974-c1dc-44ed-a9a5-2e9c296629d5 in datapath f1f43916-0af0-4af9-8f19-f6fc2753bec8 unbound from our chassis#033[00m Oct 14 06:11:28 localhost ovn_metadata_agent[163050]: 2025-10-14 10:11:28.792 163055 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f1f43916-0af0-4af9-8f19-f6fc2753bec8, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Oct 14 06:11:28 localhost nova_compute[297686]: 2025-10-14 10:11:28.792 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:11:28 localhost ovn_metadata_agent[163050]: 2025-10-14 10:11:28.793 163159 DEBUG oslo.privsep.daemon [-] privsep: reply[c2437315-190b-481a-935e-60c9d0b917be]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Oct 14 06:11:28 localhost nova_compute[297686]: 2025-10-14 10:11:28.809 2 DEBUG nova.virt.libvirt.driver [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Oct 14 06:11:28 localhost 
nova_compute[297686]: 2025-10-14 10:11:28.809 2 DEBUG nova.virt.libvirt.driver [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Oct 14 06:11:28 localhost nova_compute[297686]: 2025-10-14 10:11:28.996 2 WARNING nova.virt.libvirt.driver [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m Oct 14 06:11:28 localhost nova_compute[297686]: 2025-10-14 10:11:28.997 2 DEBUG nova.compute.resource_tracker [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Hypervisor/Node resource view: name=np0005486733.localdomain free_ram=11318MB free_disk=41.43317413330078GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", 
"product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Oct 14 06:11:28 localhost nova_compute[297686]: 2025-10-14 10:11:28.998 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Oct 14 06:11:28 localhost nova_compute[297686]: 2025-10-14 10:11:28.998 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Oct 14 06:11:29 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e86 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Oct 14 06:11:29 localhost nova_compute[297686]: 2025-10-14 10:11:29.383 2 DEBUG nova.compute.resource_tracker [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - 
- - - - -] Instance 88c4e366-b765-47a6-96bf-f7677f2ce67c actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 512, 'DISK_GB': 2}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Oct 14 06:11:29 localhost nova_compute[297686]: 2025-10-14 10:11:29.384 2 DEBUG nova.compute.resource_tracker [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Oct 14 06:11:29 localhost nova_compute[297686]: 2025-10-14 10:11:29.384 2 DEBUG nova.compute.resource_tracker [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Final resource view: name=np0005486733.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Oct 14 06:11:29 localhost nova_compute[297686]: 2025-10-14 10:11:29.450 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:11:29 localhost nova_compute[297686]: 2025-10-14 10:11:29.653 2 DEBUG nova.scheduler.client.report [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Refreshing inventories for resource provider 18c24273-aca2-4f08-be57-3188d558235e _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m Oct 14 06:11:30 localhost nova_compute[297686]: 2025-10-14 10:11:30.075 2 DEBUG nova.scheduler.client.report [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Updating ProviderTree inventory for provider 18c24273-aca2-4f08-be57-3188d558235e from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 
'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m Oct 14 06:11:30 localhost nova_compute[297686]: 2025-10-14 10:11:30.076 2 DEBUG nova.compute.provider_tree [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Updating inventory in ProviderTree for provider 18c24273-aca2-4f08-be57-3188d558235e with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m Oct 14 06:11:30 localhost nova_compute[297686]: 2025-10-14 10:11:30.101 2 DEBUG nova.scheduler.client.report [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Refreshing aggregate associations for resource provider 18c24273-aca2-4f08-be57-3188d558235e, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m Oct 14 06:11:30 localhost nova_compute[297686]: 2025-10-14 10:11:30.147 2 DEBUG nova.scheduler.client.report [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Refreshing trait associations for resource provider 18c24273-aca2-4f08-be57-3188d558235e, traits: 
COMPUTE_VOLUME_EXTEND,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE,HW_CPU_X86_BMI,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_MMX,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_SECURITY_TPM_2_0,HW_CPU_X86_FMA3,COMPUTE_GRAPHICS_MODEL_BOCHS,HW_CPU_X86_AVX2,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_SECURITY_TPM_1_2,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_AMD_SVM,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_ACCELERATORS,HW_CPU_X86_AESNI,COMPUTE_STORAGE_BUS_FDC,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_F16C,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SVM,COMPUTE_STORAGE_BUS_IDE,HW_CPU_X86_SSE4A,HW_CPU_X86_SSSE3,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NODE,HW_CPU_X86_BMI2,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_TRUSTED_CERTS,COMPUTE_STORAGE_BUS_SATA,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_STORAGE_BUS_USB,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_ABM,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_SSE2,HW_CPU_X86_AVX,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_CLMUL,HW_CPU_X86_SHA,COMPUTE_IMAGE_TYPE_RAW _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m Oct 14 06:11:30 localhost nova_compute[297686]: 2025-10-14 10:11:30.199 2 DEBUG oslo_concurrency.processutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Oct 14 06:11:30 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix": "df", "format": 
"json"} v 0) Oct 14 06:11:30 localhost ceph-mon[317114]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/1431080657' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Oct 14 06:11:30 localhost nova_compute[297686]: 2025-10-14 10:11:30.673 2 DEBUG oslo_concurrency.processutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.474s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Oct 14 06:11:30 localhost nova_compute[297686]: 2025-10-14 10:11:30.679 2 DEBUG nova.compute.provider_tree [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Inventory has not changed in ProviderTree for provider: 18c24273-aca2-4f08-be57-3188d558235e update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Oct 14 06:11:30 localhost nova_compute[297686]: 2025-10-14 10:11:30.698 2 DEBUG nova.scheduler.client.report [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Inventory has not changed for provider 18c24273-aca2-4f08-be57-3188d558235e based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Oct 14 06:11:30 localhost nova_compute[297686]: 2025-10-14 10:11:30.731 2 DEBUG nova.compute.resource_tracker [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Compute_service record updated for np0005486733.localdomain:np0005486733.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Oct 14 
06:11:30 localhost nova_compute[297686]: 2025-10-14 10:11:30.731 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.733s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Oct 14 06:11:31 localhost nova_compute[297686]: 2025-10-14 10:11:31.255 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:11:31 localhost nova_compute[297686]: 2025-10-14 10:11:31.256 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:11:32 localhost nova_compute[297686]: 2025-10-14 10:11:32.102 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:11:32 localhost systemd[1]: tmp-crun.tNgLJi.mount: Deactivated successfully. 
Oct 14 06:11:32 localhost podman[323506]: 2025-10-14 10:11:32.505782632 +0000 UTC m=+0.061609584 container kill 373c3d29f2bfadc43d54cb6930a55bb623f8f4f5957bebc53e3e7158ee295964 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0145816-4627-44f2-af00-ccc9ef0436ed, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0) Oct 14 06:11:32 localhost dnsmasq[272303]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/addn_hosts - 5 addresses Oct 14 06:11:32 localhost dnsmasq-dhcp[272303]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/host Oct 14 06:11:32 localhost dnsmasq-dhcp[272303]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/opts Oct 14 06:11:33 localhost ovn_controller[157396]: 2025-10-14T10:11:33Z|00080|binding|INFO|Releasing lport 25c6586a-239c-451b-aac2-e0a3ee5c3145 from this chassis (sb_readonly=0) Oct 14 06:11:33 localhost nova_compute[297686]: 2025-10-14 10:11:33.060 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:11:33 localhost dnsmasq[323264]: exiting on receipt of SIGTERM Oct 14 06:11:33 localhost podman[323545]: 2025-10-14 10:11:33.584558941 +0000 UTC m=+0.066201704 container kill f9c9f20b58b3424a48f62d277f95536e88dae3f742d2cc9ad31d5c53c340eb0b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f1f43916-0af0-4af9-8f19-f6fc2753bec8, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, 
org.label-schema.schema-version=1.0, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5) Oct 14 06:11:33 localhost systemd[1]: libpod-f9c9f20b58b3424a48f62d277f95536e88dae3f742d2cc9ad31d5c53c340eb0b.scope: Deactivated successfully. Oct 14 06:11:33 localhost podman[323560]: 2025-10-14 10:11:33.654947444 +0000 UTC m=+0.052505714 container died f9c9f20b58b3424a48f62d277f95536e88dae3f742d2cc9ad31d5c53c340eb0b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f1f43916-0af0-4af9-8f19-f6fc2753bec8, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image) Oct 14 06:11:33 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f9c9f20b58b3424a48f62d277f95536e88dae3f742d2cc9ad31d5c53c340eb0b-userdata-shm.mount: Deactivated successfully. Oct 14 06:11:33 localhost podman[323560]: 2025-10-14 10:11:33.686440772 +0000 UTC m=+0.083999032 container cleanup f9c9f20b58b3424a48f62d277f95536e88dae3f742d2cc9ad31d5c53c340eb0b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f1f43916-0af0-4af9-8f19-f6fc2753bec8, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true) Oct 14 06:11:33 localhost systemd[1]: libpod-conmon-f9c9f20b58b3424a48f62d277f95536e88dae3f742d2cc9ad31d5c53c340eb0b.scope: Deactivated successfully. 
Oct 14 06:11:33 localhost podman[323561]: 2025-10-14 10:11:33.726464671 +0000 UTC m=+0.120484023 container remove f9c9f20b58b3424a48f62d277f95536e88dae3f742d2cc9ad31d5c53c340eb0b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f1f43916-0af0-4af9-8f19-f6fc2753bec8, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3) Oct 14 06:11:33 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:11:33.750 271987 INFO neutron.agent.dhcp.agent [None req-fa28be39-32b5-4296-8d22-e60f5722e63a - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Oct 14 06:11:33 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:11:33.784 271987 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Oct 14 06:11:34 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e86 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Oct 14 06:11:34 localhost nova_compute[297686]: 2025-10-14 10:11:34.452 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:11:34 localhost systemd[1]: var-lib-containers-storage-overlay-84fa18b9115ffa7fbc3a0ebcc4ddc79bc8ef5ed35430398b56fb2af8f7f9c889-merged.mount: Deactivated successfully. Oct 14 06:11:34 localhost systemd[1]: run-netns-qdhcp\x2df1f43916\x2d0af0\x2d4af9\x2d8f19\x2df6fc2753bec8.mount: Deactivated successfully. 
Oct 14 06:11:34 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:11:34.687 271987 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-10-14T10:11:34Z, description=, device_id=3ecc4fdf-a970-4380-830d-18ac6b44cf0b, device_owner=network:router_gateway, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=17f467c4-795d-40b8-a578-33018a2ee51a, ip_allocation=immediate, mac_address=fa:16:3e:cc:2a:81, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-10-14T08:36:38Z, description=, dns_domain=, id=c0145816-4627-44f2-af00-ccc9ef0436ed, ipv4_address_scope=None, ipv6_address_scope=None, is_default=False, l2_adjacency=True, mtu=1350, name=public, port_security_enabled=True, project_id=41187b090f3d4818a32baa37ce8a3991, provider:network_type=flat, provider:physical_network=datacentre, provider:segmentation_id=None, qos_policy_id=None, revision_number=2, router:external=True, shared=False, standard_attr_id=29, status=ACTIVE, subnets=['bbd40dc1-97d8-4ed3-9d4f-9b3af758b526'], tags=[], tenant_id=41187b090f3d4818a32baa37ce8a3991, updated_at=2025-10-14T08:36:44Z, vlan_transparent=None, network_id=c0145816-4627-44f2-af00-ccc9ef0436ed, port_security_enabled=False, project_id=, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=533, status=DOWN, tags=[], tenant_id=, updated_at=2025-10-14T10:11:34Z on network c0145816-4627-44f2-af00-ccc9ef0436ed#033[00m Oct 14 06:11:34 localhost dnsmasq[272303]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/addn_hosts - 6 addresses Oct 14 06:11:34 localhost dnsmasq-dhcp[272303]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/host Oct 14 06:11:34 localhost dnsmasq-dhcp[272303]: 
read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/opts Oct 14 06:11:34 localhost podman[323604]: 2025-10-14 10:11:34.896775952 +0000 UTC m=+0.044901550 container kill 373c3d29f2bfadc43d54cb6930a55bb623f8f4f5957bebc53e3e7158ee295964 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0145816-4627-44f2-af00-ccc9ef0436ed, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5) Oct 14 06:11:35 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:11:35.113 271987 INFO neutron.agent.dhcp.agent [None req-9e201f3a-991a-450d-8ee8-dd885df24ab1 - - - - - -] DHCP configuration for ports {'17f467c4-795d-40b8-a578-33018a2ee51a'} is completed#033[00m Oct 14 06:11:35 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e87 e87: 6 total, 6 up, 6 in Oct 14 06:11:35 localhost nova_compute[297686]: 2025-10-14 10:11:35.695 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:11:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f. Oct 14 06:11:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917. Oct 14 06:11:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d. Oct 14 06:11:36 localhost systemd[1]: tmp-crun.lcwu6C.mount: Deactivated successfully. 
Oct 14 06:11:36 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:11:36.527 271987 INFO neutron.agent.linux.ip_lib [None req-9ad0b1c8-7e74-4465-b9f3-4800a4c2856c - - - - - -] Device tapa1a14cfe-e5 cannot be used as it has no MAC address#033[00m Oct 14 06:11:36 localhost podman[323630]: 2025-10-14 10:11:36.543696715 +0000 UTC m=+0.077685728 container health_status 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_managed=true, container_name=ceilometer_agent_compute, config_id=edpm, maintainer=OpenStack Kubernetes 
Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible) Oct 14 06:11:36 localhost nova_compute[297686]: 2025-10-14 10:11:36.548 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:11:36 localhost kernel: device tapa1a14cfe-e5 entered promiscuous mode Oct 14 06:11:36 localhost podman[323630]: 2025-10-14 10:11:36.553448924 +0000 UTC m=+0.087437917 container exec_died 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, 
org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS) Oct 14 06:11:36 localhost NetworkManager[5977]: [1760436696.5589] manager: (tapa1a14cfe-e5): new Generic device (/org/freedesktop/NetworkManager/Devices/21) Oct 14 06:11:36 localhost ovn_controller[157396]: 2025-10-14T10:11:36Z|00081|binding|INFO|Claiming lport a1a14cfe-e5bf-4bfd-a032-20e19d47a859 for this chassis. Oct 14 06:11:36 localhost ovn_controller[157396]: 2025-10-14T10:11:36Z|00082|binding|INFO|a1a14cfe-e5bf-4bfd-a032-20e19d47a859: Claiming unknown Oct 14 06:11:36 localhost nova_compute[297686]: 2025-10-14 10:11:36.562 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:11:36 localhost podman[323629]: 2025-10-14 10:11:36.563490113 +0000 UTC m=+0.093792213 container health_status 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, architecture=x86_64, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-type=git, release=1755695350, version=9.6, distribution-scope=public, io.openshift.expose-services=, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.) Oct 14 06:11:36 localhost systemd-udevd[323682]: Network interface NamePolicy= disabled on kernel command line. Oct 14 06:11:36 localhost systemd[1]: 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d.service: Deactivated successfully. 
Oct 14 06:11:36 localhost ovn_metadata_agent[163050]: 2025-10-14 10:11:36.569 163055 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005486733.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpe1893814-19d7-5b97-af05-19e2aafa5382-79feba7d-100b-4e64-b831-3f4a57dbb424', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-79feba7d-100b-4e64-b831-3f4a57dbb424', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '560705c462d642e4bd06d383d87d76c3', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6664e9ef-d580-4f24-ad72-eb8f813819f2, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=a1a14cfe-e5bf-4bfd-a032-20e19d47a859) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Oct 14 06:11:36 localhost ovn_metadata_agent[163050]: 2025-10-14 10:11:36.570 163055 INFO neutron.agent.ovn.metadata.agent [-] Port a1a14cfe-e5bf-4bfd-a032-20e19d47a859 in datapath 79feba7d-100b-4e64-b831-3f4a57dbb424 bound to our chassis#033[00m Oct 14 06:11:36 localhost ovn_metadata_agent[163050]: 2025-10-14 10:11:36.573 163055 DEBUG neutron.agent.ovn.metadata.agent [-] Port d1c0bad1-051d-473f-95fb-52d6d559d1ad IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips 
/usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Oct 14 06:11:36 localhost ovn_metadata_agent[163050]: 2025-10-14 10:11:36.573 163055 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 79feba7d-100b-4e64-b831-3f4a57dbb424, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Oct 14 06:11:36 localhost ovn_metadata_agent[163050]: 2025-10-14 10:11:36.574 163159 DEBUG oslo.privsep.daemon [-] privsep: reply[88a4a7d0-3496-4981-99ac-cbda7dff11b6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Oct 14 06:11:36 localhost journal[237477]: ethtool ioctl error on tapa1a14cfe-e5: No such device Oct 14 06:11:36 localhost ovn_controller[157396]: 2025-10-14T10:11:36Z|00083|binding|INFO|Setting lport a1a14cfe-e5bf-4bfd-a032-20e19d47a859 ovn-installed in OVS Oct 14 06:11:36 localhost ovn_controller[157396]: 2025-10-14T10:11:36Z|00084|binding|INFO|Setting lport a1a14cfe-e5bf-4bfd-a032-20e19d47a859 up in Southbound Oct 14 06:11:36 localhost journal[237477]: ethtool ioctl error on tapa1a14cfe-e5: No such device Oct 14 06:11:36 localhost nova_compute[297686]: 2025-10-14 10:11:36.591 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:11:36 localhost journal[237477]: ethtool ioctl error on tapa1a14cfe-e5: No such device Oct 14 06:11:36 localhost journal[237477]: ethtool ioctl error on tapa1a14cfe-e5: No such device Oct 14 06:11:36 localhost podman[323629]: 2025-10-14 10:11:36.599057416 +0000 UTC m=+0.129359526 container exec_died 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, vendor=Red Hat, Inc., version=9.6, 
build-date=2025-08-20T13:12:41, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, managed_by=edpm_ansible, config_id=edpm, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, io.openshift.expose-services=, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b) Oct 14 06:11:36 localhost journal[237477]: ethtool ioctl error on tapa1a14cfe-e5: No such device Oct 14 06:11:36 localhost journal[237477]: ethtool ioctl error on tapa1a14cfe-e5: No such device Oct 14 06:11:36 localhost journal[237477]: ethtool ioctl error on tapa1a14cfe-e5: No such device Oct 14 06:11:36 localhost systemd[1]: 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917.service: Deactivated successfully. 
Oct 14 06:11:36 localhost journal[237477]: ethtool ioctl error on tapa1a14cfe-e5: No such device Oct 14 06:11:36 localhost nova_compute[297686]: 2025-10-14 10:11:36.620 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:11:36 localhost podman[323628]: 2025-10-14 10:11:36.634583736 +0000 UTC m=+0.165796744 container health_status 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}) Oct 14 06:11:36 localhost nova_compute[297686]: 2025-10-14 10:11:36.648 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 
06:11:36 localhost podman[323628]: 2025-10-14 10:11:36.692032492 +0000 UTC m=+0.223245490 container exec_died 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251009, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller) Oct 14 06:11:36 localhost systemd[1]: 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f.service: Deactivated successfully. 
Oct 14 06:11:37 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:11:37.036 271987 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-10-14T10:11:36Z, description=, device_id=19b78374-7670-4b0d-83ca-dff512635194, device_owner=network:router_gateway, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=e79f26e3-191c-44ea-ba56-ba6f45e5d49c, ip_allocation=immediate, mac_address=fa:16:3e:02:6c:02, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-10-14T08:36:38Z, description=, dns_domain=, id=c0145816-4627-44f2-af00-ccc9ef0436ed, ipv4_address_scope=None, ipv6_address_scope=None, is_default=False, l2_adjacency=True, mtu=1350, name=public, port_security_enabled=True, project_id=41187b090f3d4818a32baa37ce8a3991, provider:network_type=flat, provider:physical_network=datacentre, provider:segmentation_id=None, qos_policy_id=None, revision_number=2, router:external=True, shared=False, standard_attr_id=29, status=ACTIVE, subnets=['bbd40dc1-97d8-4ed3-9d4f-9b3af758b526'], tags=[], tenant_id=41187b090f3d4818a32baa37ce8a3991, updated_at=2025-10-14T08:36:44Z, vlan_transparent=None, network_id=c0145816-4627-44f2-af00-ccc9ef0436ed, port_security_enabled=False, project_id=, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=539, status=DOWN, tags=[], tenant_id=, updated_at=2025-10-14T10:11:36Z on network c0145816-4627-44f2-af00-ccc9ef0436ed#033[00m Oct 14 06:11:37 localhost nova_compute[297686]: 2025-10-14 10:11:37.145 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:11:37 localhost dnsmasq[272303]: read 
/var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/addn_hosts - 7 addresses Oct 14 06:11:37 localhost dnsmasq-dhcp[272303]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/host Oct 14 06:11:37 localhost podman[323761]: 2025-10-14 10:11:37.262778734 +0000 UTC m=+0.057203908 container kill 373c3d29f2bfadc43d54cb6930a55bb623f8f4f5957bebc53e3e7158ee295964 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0145816-4627-44f2-af00-ccc9ef0436ed, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.vendor=CentOS) Oct 14 06:11:37 localhost dnsmasq-dhcp[272303]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/opts Oct 14 06:11:37 localhost systemd[1]: tmp-crun.5H2nCu.mount: Deactivated successfully. 
Oct 14 06:11:37 localhost podman[323805]: Oct 14 06:11:37 localhost podman[323805]: 2025-10-14 10:11:37.534804411 +0000 UTC m=+0.088710706 container create 1130219dc0a3ce5264181ad72c68ae1294af021486bd8755969cfaec3a0c1552 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-79feba7d-100b-4e64-b831-3f4a57dbb424, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251009) Oct 14 06:11:37 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e88 e88: 6 total, 6 up, 6 in Oct 14 06:11:37 localhost podman[323805]: 2025-10-14 10:11:37.484535837 +0000 UTC m=+0.038442212 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Oct 14 06:11:37 localhost systemd[1]: Started libpod-conmon-1130219dc0a3ce5264181ad72c68ae1294af021486bd8755969cfaec3a0c1552.scope. Oct 14 06:11:37 localhost systemd[1]: Started libcrun container. 
Oct 14 06:11:37 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7eeabbf162c87724632e694f0871b1c4b2b0eedcb0891391b8c90b8326e4158d/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Oct 14 06:11:37 localhost podman[323805]: 2025-10-14 10:11:37.621775713 +0000 UTC m=+0.175682018 container init 1130219dc0a3ce5264181ad72c68ae1294af021486bd8755969cfaec3a0c1552 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-79feba7d-100b-4e64-b831-3f4a57dbb424, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009) Oct 14 06:11:37 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:11:37.630 271987 INFO neutron.agent.dhcp.agent [None req-9dd7a84c-553f-4ee1-90e6-d7248c846853 - - - - - -] DHCP configuration for ports {'e79f26e3-191c-44ea-ba56-ba6f45e5d49c'} is completed#033[00m Oct 14 06:11:37 localhost podman[323805]: 2025-10-14 10:11:37.63212775 +0000 UTC m=+0.186034055 container start 1130219dc0a3ce5264181ad72c68ae1294af021486bd8755969cfaec3a0c1552 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-79feba7d-100b-4e64-b831-3f4a57dbb424, org.label-schema.license=GPLv2, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image) Oct 14 06:11:37 localhost dnsmasq[323823]: started, version 2.85 cachesize 150 Oct 14 06:11:37 localhost dnsmasq[323823]: DNS service limited to local subnets Oct 14 06:11:37 
localhost dnsmasq[323823]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Oct 14 06:11:37 localhost dnsmasq[323823]: warning: no upstream servers configured Oct 14 06:11:37 localhost dnsmasq-dhcp[323823]: DHCP, static leases only on 10.100.0.0, lease time 1d Oct 14 06:11:37 localhost dnsmasq[323823]: read /var/lib/neutron/dhcp/79feba7d-100b-4e64-b831-3f4a57dbb424/addn_hosts - 0 addresses Oct 14 06:11:37 localhost dnsmasq-dhcp[323823]: read /var/lib/neutron/dhcp/79feba7d-100b-4e64-b831-3f4a57dbb424/host Oct 14 06:11:37 localhost dnsmasq-dhcp[323823]: read /var/lib/neutron/dhcp/79feba7d-100b-4e64-b831-3f4a57dbb424/opts Oct 14 06:11:37 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:11:37.783 271987 INFO neutron.agent.dhcp.agent [None req-bf7a86c7-dabf-4425-8e2d-d0f6387ae9f4 - - - - - -] DHCP configuration for ports {'db9053da-bc04-45ad-a46a-8d2a604064e7'} is completed#033[00m Oct 14 06:11:38 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e89 e89: 6 total, 6 up, 6 in Oct 14 06:11:38 localhost openstack_network_exporter[250374]: ERROR 10:11:38 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Oct 14 06:11:38 localhost openstack_network_exporter[250374]: ERROR 10:11:38 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Oct 14 06:11:38 localhost openstack_network_exporter[250374]: ERROR 10:11:38 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Oct 14 06:11:38 localhost openstack_network_exporter[250374]: ERROR 10:11:38 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Oct 14 06:11:38 localhost openstack_network_exporter[250374]: Oct 14 06:11:38 localhost openstack_network_exporter[250374]: ERROR 10:11:38 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an 
existing datapath Oct 14 06:11:38 localhost openstack_network_exporter[250374]: Oct 14 06:11:39 localhost nova_compute[297686]: 2025-10-14 10:11:39.045 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:11:39 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e89 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Oct 14 06:11:39 localhost nova_compute[297686]: 2025-10-14 10:11:39.497 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:11:42 localhost nova_compute[297686]: 2025-10-14 10:11:42.182 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:11:42 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:11:42.268 271987 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-10-14T10:11:41Z, description=, device_id=19b78374-7670-4b0d-83ca-dff512635194, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=1f8ed24e-7d2d-44ba-beee-3ba9d6328c89, ip_allocation=immediate, mac_address=fa:16:3e:e7:e0:ab, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-10-14T10:11:33Z, description=, dns_domain=, id=79feba7d-100b-4e64-b831-3f4a57dbb424, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-ImagesNegativeTestJSON-1009524233-network, port_security_enabled=True, project_id=560705c462d642e4bd06d383d87d76c3, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=41899, 
qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=529, status=ACTIVE, subnets=['6cd20b20-6d6e-4955-a059-e79f1c3db6db'], tags=[], tenant_id=560705c462d642e4bd06d383d87d76c3, updated_at=2025-10-14T10:11:34Z, vlan_transparent=None, network_id=79feba7d-100b-4e64-b831-3f4a57dbb424, port_security_enabled=False, project_id=560705c462d642e4bd06d383d87d76c3, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=555, status=DOWN, tags=[], tenant_id=560705c462d642e4bd06d383d87d76c3, updated_at=2025-10-14T10:11:41Z on network 79feba7d-100b-4e64-b831-3f4a57dbb424#033[00m Oct 14 06:11:42 localhost systemd[1]: tmp-crun.PICMea.mount: Deactivated successfully. Oct 14 06:11:42 localhost podman[323841]: 2025-10-14 10:11:42.669172173 +0000 UTC m=+0.073506969 container kill 1130219dc0a3ce5264181ad72c68ae1294af021486bd8755969cfaec3a0c1552 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-79feba7d-100b-4e64-b831-3f4a57dbb424, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image) Oct 14 06:11:42 localhost dnsmasq[323823]: read /var/lib/neutron/dhcp/79feba7d-100b-4e64-b831-3f4a57dbb424/addn_hosts - 1 addresses Oct 14 06:11:42 localhost dnsmasq-dhcp[323823]: read /var/lib/neutron/dhcp/79feba7d-100b-4e64-b831-3f4a57dbb424/host Oct 14 06:11:42 localhost dnsmasq-dhcp[323823]: read /var/lib/neutron/dhcp/79feba7d-100b-4e64-b831-3f4a57dbb424/opts Oct 14 06:11:43 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:11:43.048 271987 INFO neutron.agent.dhcp.agent [None req-159512d2-15b2-45cd-ab60-18bc17254422 - - - - - -] DHCP configuration for ports 
{'1f8ed24e-7d2d-44ba-beee-3ba9d6328c89'} is completed#033[00m Oct 14 06:11:44 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e89 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Oct 14 06:11:44 localhost nova_compute[297686]: 2025-10-14 10:11:44.501 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:11:44 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e90 e90: 6 total, 6 up, 6 in Oct 14 06:11:45 localhost neutron_sriov_agent[264974]: 2025-10-14 10:11:45.302 2 INFO neutron.agent.securitygroups_rpc [None req-11c3923f-ead6-4142-90fb-c63bc24f49b8 4a2c72478a7c4747a73158cd8119b6ba d6e7f435b24646ecaa54e485b818329f - - default default] Security group member updated ['08e02d40-7eb0-493a-bf38-79869188d51f']#033[00m Oct 14 06:11:45 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:11:45.473 271987 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-10-14T10:11:41Z, description=, device_id=19b78374-7670-4b0d-83ca-dff512635194, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=1f8ed24e-7d2d-44ba-beee-3ba9d6328c89, ip_allocation=immediate, mac_address=fa:16:3e:e7:e0:ab, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-10-14T10:11:33Z, description=, dns_domain=, id=79feba7d-100b-4e64-b831-3f4a57dbb424, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-ImagesNegativeTestJSON-1009524233-network, port_security_enabled=True, project_id=560705c462d642e4bd06d383d87d76c3, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=41899, 
qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=529, status=ACTIVE, subnets=['6cd20b20-6d6e-4955-a059-e79f1c3db6db'], tags=[], tenant_id=560705c462d642e4bd06d383d87d76c3, updated_at=2025-10-14T10:11:34Z, vlan_transparent=None, network_id=79feba7d-100b-4e64-b831-3f4a57dbb424, port_security_enabled=False, project_id=560705c462d642e4bd06d383d87d76c3, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=555, status=DOWN, tags=[], tenant_id=560705c462d642e4bd06d383d87d76c3, updated_at=2025-10-14T10:11:41Z on network 79feba7d-100b-4e64-b831-3f4a57dbb424#033[00m Oct 14 06:11:45 localhost podman[323879]: 2025-10-14 10:11:45.718959271 +0000 UTC m=+0.068487735 container kill 1130219dc0a3ce5264181ad72c68ae1294af021486bd8755969cfaec3a0c1552 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-79feba7d-100b-4e64-b831-3f4a57dbb424, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, io.buildah.version=1.41.3) Oct 14 06:11:45 localhost systemd[1]: tmp-crun.ft4abh.mount: Deactivated successfully. 
Oct 14 06:11:45 localhost dnsmasq[323823]: read /var/lib/neutron/dhcp/79feba7d-100b-4e64-b831-3f4a57dbb424/addn_hosts - 1 addresses Oct 14 06:11:45 localhost dnsmasq-dhcp[323823]: read /var/lib/neutron/dhcp/79feba7d-100b-4e64-b831-3f4a57dbb424/host Oct 14 06:11:45 localhost dnsmasq-dhcp[323823]: read /var/lib/neutron/dhcp/79feba7d-100b-4e64-b831-3f4a57dbb424/opts Oct 14 06:11:45 localhost podman[323917]: 2025-10-14 10:11:45.975668527 +0000 UTC m=+0.053188985 container kill 373c3d29f2bfadc43d54cb6930a55bb623f8f4f5957bebc53e3e7158ee295964 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0145816-4627-44f2-af00-ccc9ef0436ed, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, org.label-schema.build-date=20251009) Oct 14 06:11:45 localhost dnsmasq[272303]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/addn_hosts - 6 addresses Oct 14 06:11:45 localhost dnsmasq-dhcp[272303]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/host Oct 14 06:11:45 localhost dnsmasq-dhcp[272303]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/opts Oct 14 06:11:46 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:11:46.027 271987 INFO neutron.agent.dhcp.agent [None req-41aa263d-4b6d-4de1-ab46-2835c2b8ad75 - - - - - -] DHCP configuration for ports {'1f8ed24e-7d2d-44ba-beee-3ba9d6328c89'} is completed#033[00m Oct 14 06:11:46 localhost ovn_controller[157396]: 2025-10-14T10:11:46Z|00085|binding|INFO|Releasing lport 25c6586a-239c-451b-aac2-e0a3ee5c3145 from this chassis (sb_readonly=0) Oct 14 06:11:46 localhost nova_compute[297686]: 2025-10-14 10:11:46.385 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:11:47 localhost nova_compute[297686]: 2025-10-14 10:11:47.185 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:11:47 localhost neutron_sriov_agent[264974]: 2025-10-14 10:11:47.994 2 INFO neutron.agent.securitygroups_rpc [None req-9ee942bf-f520-45b5-876d-66b5a8e4d8b6 4a2c72478a7c4747a73158cd8119b6ba d6e7f435b24646ecaa54e485b818329f - - default default] Security group member updated ['08e02d40-7eb0-493a-bf38-79869188d51f']#033[00m Oct 14 06:11:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041. Oct 14 06:11:48 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:11:48.804 271987 INFO neutron.agent.linux.ip_lib [None req-37f49419-0333-4eba-a47a-6b9d5d78f46c - - - - - -] Device tap700880b2-60 cannot be used as it has no MAC address#033[00m Oct 14 06:11:48 localhost nova_compute[297686]: 2025-10-14 10:11:48.882 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:11:48 localhost podman[323942]: 2025-10-14 10:11:48.884333068 +0000 UTC m=+0.138281028 container health_status 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': 
'/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Oct 14 06:11:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857. Oct 14 06:11:48 localhost kernel: device tap700880b2-60 entered promiscuous mode Oct 14 06:11:48 localhost nova_compute[297686]: 2025-10-14 10:11:48.891 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:11:48 localhost NetworkManager[5977]: [1760436708.8922] manager: (tap700880b2-60): new Generic device (/org/freedesktop/NetworkManager/Devices/22) Oct 14 06:11:48 localhost ovn_controller[157396]: 2025-10-14T10:11:48Z|00086|binding|INFO|Claiming lport 700880b2-6060-40b9-8de0-d7a947508225 for this chassis. Oct 14 06:11:48 localhost ovn_controller[157396]: 2025-10-14T10:11:48Z|00087|binding|INFO|700880b2-6060-40b9-8de0-d7a947508225: Claiming unknown Oct 14 06:11:48 localhost systemd-udevd[323972]: Network interface NamePolicy= disabled on kernel command line. 
Oct 14 06:11:48 localhost ovn_metadata_agent[163050]: 2025-10-14 10:11:48.903 163055 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005486733.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpe1893814-19d7-5b97-af05-19e2aafa5382-3e9ef511-c810-4cae-9585-029b9e5f4f80', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3e9ef511-c810-4cae-9585-029b9e5f4f80', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8c98ade2ddf84e768525191c30cded08', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=044c5ddd-6d9f-4b58-84de-547195ca389f, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=700880b2-6060-40b9-8de0-d7a947508225) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Oct 14 06:11:48 localhost ovn_metadata_agent[163050]: 2025-10-14 10:11:48.905 163055 INFO neutron.agent.ovn.metadata.agent [-] Port 700880b2-6060-40b9-8de0-d7a947508225 in datapath 3e9ef511-c810-4cae-9585-029b9e5f4f80 bound to our chassis#033[00m Oct 14 06:11:48 localhost ovn_metadata_agent[163050]: 2025-10-14 10:11:48.907 163055 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 3e9ef511-c810-4cae-9585-029b9e5f4f80 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params 
/usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Oct 14 06:11:48 localhost ovn_metadata_agent[163050]: 2025-10-14 10:11:48.908 163159 DEBUG oslo.privsep.daemon [-] privsep: reply[da47d158-1ed4-4345-b6d6-2861ff9941e1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Oct 14 06:11:48 localhost nova_compute[297686]: 2025-10-14 10:11:48.922 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:11:48 localhost ovn_controller[157396]: 2025-10-14T10:11:48Z|00088|binding|INFO|Setting lport 700880b2-6060-40b9-8de0-d7a947508225 ovn-installed in OVS Oct 14 06:11:48 localhost ovn_controller[157396]: 2025-10-14T10:11:48Z|00089|binding|INFO|Setting lport 700880b2-6060-40b9-8de0-d7a947508225 up in Southbound Oct 14 06:11:48 localhost nova_compute[297686]: 2025-10-14 10:11:48.923 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:11:48 localhost podman[323942]: 2025-10-14 10:11:48.932150397 +0000 UTC m=+0.186098407 container exec_died 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': 
['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm) Oct 14 06:11:48 localhost systemd[1]: 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041.service: Deactivated successfully. Oct 14 06:11:48 localhost nova_compute[297686]: 2025-10-14 10:11:48.956 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:11:48 localhost nova_compute[297686]: 2025-10-14 10:11:48.989 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:11:49 localhost podman[323970]: 2025-10-14 10:11:49.032880682 +0000 UTC m=+0.136393051 container health_status 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_managed=true, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', 
'/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent) Oct 14 06:11:49 localhost podman[323970]: 2025-10-14 10:11:49.068204017 +0000 UTC m=+0.171716386 container exec_died 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2) Oct 14 06:11:49 localhost systemd[1]: 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857.service: Deactivated successfully. Oct 14 06:11:49 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Oct 14 06:11:49 localhost nova_compute[297686]: 2025-10-14 10:11:49.504 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:11:49 localhost dnsmasq[323823]: read /var/lib/neutron/dhcp/79feba7d-100b-4e64-b831-3f4a57dbb424/addn_hosts - 0 addresses Oct 14 06:11:49 localhost dnsmasq-dhcp[323823]: read /var/lib/neutron/dhcp/79feba7d-100b-4e64-b831-3f4a57dbb424/host Oct 14 06:11:49 localhost dnsmasq-dhcp[323823]: read /var/lib/neutron/dhcp/79feba7d-100b-4e64-b831-3f4a57dbb424/opts Oct 14 06:11:49 localhost podman[324037]: 2025-10-14 10:11:49.677635698 +0000 UTC m=+0.089326565 container kill 1130219dc0a3ce5264181ad72c68ae1294af021486bd8755969cfaec3a0c1552 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-79feba7d-100b-4e64-b831-3f4a57dbb424, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251009, 
tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2) Oct 14 06:11:49 localhost podman[324081]: Oct 14 06:11:49 localhost podman[324081]: 2025-10-14 10:11:49.967546184 +0000 UTC m=+0.082505056 container create f9afd4b5673de68a042e150cd0e844ab370730a0d74ab895aee423053e5aff81 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-3e9ef511-c810-4cae-9585-029b9e5f4f80, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image) Oct 14 06:11:50 localhost nova_compute[297686]: 2025-10-14 10:11:50.015 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:11:50 localhost kernel: device tapa1a14cfe-e5 left promiscuous mode Oct 14 06:11:50 localhost ovn_controller[157396]: 2025-10-14T10:11:50Z|00090|binding|INFO|Releasing lport a1a14cfe-e5bf-4bfd-a032-20e19d47a859 from this chassis (sb_readonly=0) Oct 14 06:11:50 localhost ovn_controller[157396]: 2025-10-14T10:11:50Z|00091|binding|INFO|Setting lport a1a14cfe-e5bf-4bfd-a032-20e19d47a859 down in Southbound Oct 14 06:11:50 localhost podman[324081]: 2025-10-14 10:11:49.923336626 +0000 UTC m=+0.038295468 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Oct 14 06:11:50 localhost ovn_metadata_agent[163050]: 2025-10-14 10:11:50.026 163055 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], 
up=[False], options={'requested-chassis': 'np0005486733.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpe1893814-19d7-5b97-af05-19e2aafa5382-79feba7d-100b-4e64-b831-3f4a57dbb424', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-79feba7d-100b-4e64-b831-3f4a57dbb424', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '560705c462d642e4bd06d383d87d76c3', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005486733.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6664e9ef-d580-4f24-ad72-eb8f813819f2, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=a1a14cfe-e5bf-4bfd-a032-20e19d47a859) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Oct 14 06:11:50 localhost ovn_metadata_agent[163050]: 2025-10-14 10:11:50.027 163055 INFO neutron.agent.ovn.metadata.agent [-] Port a1a14cfe-e5bf-4bfd-a032-20e19d47a859 in datapath 79feba7d-100b-4e64-b831-3f4a57dbb424 unbound from our chassis#033[00m Oct 14 06:11:50 localhost ovn_metadata_agent[163050]: 2025-10-14 10:11:50.030 163055 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 79feba7d-100b-4e64-b831-3f4a57dbb424, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Oct 14 06:11:50 localhost ovn_metadata_agent[163050]: 2025-10-14 10:11:50.031 163159 DEBUG oslo.privsep.daemon [-] privsep: reply[4b98b5bf-d7eb-4935-a235-81e0f4af739b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Oct 14 
06:11:50 localhost nova_compute[297686]: 2025-10-14 10:11:50.035 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:11:50 localhost systemd[1]: Started libpod-conmon-f9afd4b5673de68a042e150cd0e844ab370730a0d74ab895aee423053e5aff81.scope. Oct 14 06:11:50 localhost systemd[1]: Started libcrun container. Oct 14 06:11:50 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ac115fb0935abbb3ca181d30347e4084bbf570c8b20395f2462cd287aa203687/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Oct 14 06:11:50 localhost podman[324081]: 2025-10-14 10:11:50.077520083 +0000 UTC m=+0.192478945 container init f9afd4b5673de68a042e150cd0e844ab370730a0d74ab895aee423053e5aff81 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-3e9ef511-c810-4cae-9585-029b9e5f4f80, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251009) Oct 14 06:11:50 localhost podman[324081]: 2025-10-14 10:11:50.086806067 +0000 UTC m=+0.201764929 container start f9afd4b5673de68a042e150cd0e844ab370730a0d74ab895aee423053e5aff81 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-3e9ef511-c810-4cae-9585-029b9e5f4f80, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2) Oct 14 06:11:50 localhost dnsmasq[324101]: 
started, version 2.85 cachesize 150 Oct 14 06:11:50 localhost dnsmasq[324101]: DNS service limited to local subnets Oct 14 06:11:50 localhost dnsmasq[324101]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Oct 14 06:11:50 localhost dnsmasq[324101]: warning: no upstream servers configured Oct 14 06:11:50 localhost dnsmasq-dhcp[324101]: DHCP, static leases only on 10.100.0.0, lease time 1d Oct 14 06:11:50 localhost dnsmasq[324101]: read /var/lib/neutron/dhcp/3e9ef511-c810-4cae-9585-029b9e5f4f80/addn_hosts - 0 addresses Oct 14 06:11:50 localhost dnsmasq-dhcp[324101]: read /var/lib/neutron/dhcp/3e9ef511-c810-4cae-9585-029b9e5f4f80/host Oct 14 06:11:50 localhost dnsmasq-dhcp[324101]: read /var/lib/neutron/dhcp/3e9ef511-c810-4cae-9585-029b9e5f4f80/opts Oct 14 06:11:50 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:11:50.212 271987 INFO neutron.agent.dhcp.agent [None req-1f7a1a87-fccf-4043-aef4-e149259bec1e - - - - - -] DHCP configuration for ports {'ac0cfc43-2d0c-4fba-a590-063a8131bd86'} is completed#033[00m Oct 14 06:11:50 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:11:50.507 271987 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-10-14T10:11:50Z, description=, device_id=d7f83b60-33c8-46bc-9877-cdf253390e07, device_owner=network:router_gateway, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=5a23c93f-6748-429d-a5c0-758fd6b27470, ip_allocation=immediate, mac_address=fa:16:3e:71:75:23, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-10-14T08:36:38Z, description=, dns_domain=, id=c0145816-4627-44f2-af00-ccc9ef0436ed, ipv4_address_scope=None, ipv6_address_scope=None, 
is_default=False, l2_adjacency=True, mtu=1350, name=public, port_security_enabled=True, project_id=41187b090f3d4818a32baa37ce8a3991, provider:network_type=flat, provider:physical_network=datacentre, provider:segmentation_id=None, qos_policy_id=None, revision_number=2, router:external=True, shared=False, standard_attr_id=29, status=ACTIVE, subnets=['bbd40dc1-97d8-4ed3-9d4f-9b3af758b526'], tags=[], tenant_id=41187b090f3d4818a32baa37ce8a3991, updated_at=2025-10-14T08:36:44Z, vlan_transparent=None, network_id=c0145816-4627-44f2-af00-ccc9ef0436ed, port_security_enabled=False, project_id=, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=577, status=DOWN, tags=[], tenant_id=, updated_at=2025-10-14T10:11:50Z on network c0145816-4627-44f2-af00-ccc9ef0436ed#033[00m Oct 14 06:11:50 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e91 e91: 6 total, 6 up, 6 in Oct 14 06:11:50 localhost dnsmasq[272303]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/addn_hosts - 7 addresses Oct 14 06:11:50 localhost dnsmasq-dhcp[272303]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/host Oct 14 06:11:50 localhost dnsmasq-dhcp[272303]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/opts Oct 14 06:11:50 localhost podman[324119]: 2025-10-14 10:11:50.70214355 +0000 UTC m=+0.044497108 container kill 373c3d29f2bfadc43d54cb6930a55bb623f8f4f5957bebc53e3e7158ee295964 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0145816-4627-44f2-af00-ccc9ef0436ed, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251009, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS) Oct 14 06:11:50 localhost 
neutron_dhcp_agent[271983]: 2025-10-14 10:11:50.856 271987 INFO neutron.agent.dhcp.agent [None req-f26567da-c819-4d8f-99ef-5ea059987df9 - - - - - -] DHCP configuration for ports {'5a23c93f-6748-429d-a5c0-758fd6b27470'} is completed#033[00m Oct 14 06:11:51 localhost nova_compute[297686]: 2025-10-14 10:11:51.989 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:11:52 localhost dnsmasq[272303]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/addn_hosts - 6 addresses Oct 14 06:11:52 localhost dnsmasq-dhcp[272303]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/host Oct 14 06:11:52 localhost podman[324156]: 2025-10-14 10:11:52.030462086 +0000 UTC m=+0.093120082 container kill 373c3d29f2bfadc43d54cb6930a55bb623f8f4f5957bebc53e3e7158ee295964 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0145816-4627-44f2-af00-ccc9ef0436ed, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251009) Oct 14 06:11:52 localhost dnsmasq-dhcp[272303]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/opts Oct 14 06:11:52 localhost ovn_controller[157396]: 2025-10-14T10:11:52Z|00092|binding|INFO|Releasing lport 25c6586a-239c-451b-aac2-e0a3ee5c3145 from this chassis (sb_readonly=0) Oct 14 06:11:52 localhost nova_compute[297686]: 2025-10-14 10:11:52.142 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:11:52 localhost nova_compute[297686]: 2025-10-14 10:11:52.188 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 
__log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:11:52 localhost ovn_controller[157396]: 2025-10-14T10:11:52Z|00093|binding|INFO|Releasing lport 25c6586a-239c-451b-aac2-e0a3ee5c3145 from this chassis (sb_readonly=0) Oct 14 06:11:52 localhost systemd[1]: tmp-crun.ertYWg.mount: Deactivated successfully. Oct 14 06:11:52 localhost dnsmasq[272303]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/addn_hosts - 5 addresses Oct 14 06:11:52 localhost dnsmasq-dhcp[272303]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/host Oct 14 06:11:52 localhost dnsmasq-dhcp[272303]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/opts Oct 14 06:11:52 localhost podman[324194]: 2025-10-14 10:11:52.504839078 +0000 UTC m=+0.065330878 container kill 373c3d29f2bfadc43d54cb6930a55bb623f8f4f5957bebc53e3e7158ee295964 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0145816-4627-44f2-af00-ccc9ef0436ed, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.build-date=20251009, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team) Oct 14 06:11:52 localhost nova_compute[297686]: 2025-10-14 10:11:52.516 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:11:52 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:11:52.679 271987 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-10-14T10:11:52Z, description=, device_id=d7f83b60-33c8-46bc-9877-cdf253390e07, 
device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=a20ae385-8386-4015-814f-9819ab49f912, ip_allocation=immediate, mac_address=fa:16:3e:0d:8c:36, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-10-14T10:11:47Z, description=, dns_domain=, id=3e9ef511-c810-4cae-9585-029b9e5f4f80, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-ServerGroup264TestJSON-509392501-network, port_security_enabled=True, project_id=8c98ade2ddf84e768525191c30cded08, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=36010, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=571, status=ACTIVE, subnets=['eeebb122-f6b9-43a7-bbde-0cf4283bb474'], tags=[], tenant_id=8c98ade2ddf84e768525191c30cded08, updated_at=2025-10-14T10:11:48Z, vlan_transparent=None, network_id=3e9ef511-c810-4cae-9585-029b9e5f4f80, port_security_enabled=False, project_id=8c98ade2ddf84e768525191c30cded08, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=578, status=DOWN, tags=[], tenant_id=8c98ade2ddf84e768525191c30cded08, updated_at=2025-10-14T10:11:52Z on network 3e9ef511-c810-4cae-9585-029b9e5f4f80#033[00m Oct 14 06:11:52 localhost dnsmasq[324101]: read /var/lib/neutron/dhcp/3e9ef511-c810-4cae-9585-029b9e5f4f80/addn_hosts - 1 addresses Oct 14 06:11:52 localhost dnsmasq-dhcp[324101]: read /var/lib/neutron/dhcp/3e9ef511-c810-4cae-9585-029b9e5f4f80/host Oct 14 06:11:52 localhost podman[324231]: 2025-10-14 10:11:52.897433708 +0000 UTC m=+0.053676690 container kill f9afd4b5673de68a042e150cd0e844ab370730a0d74ab895aee423053e5aff81 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-3e9ef511-c810-4cae-9585-029b9e5f4f80, tcib_managed=true, 
io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image) Oct 14 06:11:52 localhost dnsmasq-dhcp[324101]: read /var/lib/neutron/dhcp/3e9ef511-c810-4cae-9585-029b9e5f4f80/opts Oct 14 06:11:53 localhost systemd[1]: tmp-crun.10AtmS.mount: Deactivated successfully. Oct 14 06:11:53 localhost dnsmasq[323823]: exiting on receipt of SIGTERM Oct 14 06:11:53 localhost podman[324267]: 2025-10-14 10:11:53.064785459 +0000 UTC m=+0.043611950 container kill 1130219dc0a3ce5264181ad72c68ae1294af021486bd8755969cfaec3a0c1552 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-79feba7d-100b-4e64-b831-3f4a57dbb424, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009) Oct 14 06:11:53 localhost systemd[1]: libpod-1130219dc0a3ce5264181ad72c68ae1294af021486bd8755969cfaec3a0c1552.scope: Deactivated successfully. 
Oct 14 06:11:53 localhost podman[324281]: 2025-10-14 10:11:53.12438489 +0000 UTC m=+0.044440215 container died 1130219dc0a3ce5264181ad72c68ae1294af021486bd8755969cfaec3a0c1552 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-79feba7d-100b-4e64-b831-3f4a57dbb424, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5) Oct 14 06:11:53 localhost systemd[1]: tmp-crun.oYbRy7.mount: Deactivated successfully. Oct 14 06:11:53 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:11:53.145 271987 INFO neutron.agent.dhcp.agent [None req-ff333c1d-c975-476d-9cba-5cf06c68e937 - - - - - -] DHCP configuration for ports {'a20ae385-8386-4015-814f-9819ab49f912'} is completed#033[00m Oct 14 06:11:53 localhost podman[324281]: 2025-10-14 10:11:53.154766683 +0000 UTC m=+0.074821998 container cleanup 1130219dc0a3ce5264181ad72c68ae1294af021486bd8755969cfaec3a0c1552 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-79feba7d-100b-4e64-b831-3f4a57dbb424, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, tcib_managed=true) Oct 14 06:11:53 localhost systemd[1]: libpod-conmon-1130219dc0a3ce5264181ad72c68ae1294af021486bd8755969cfaec3a0c1552.scope: Deactivated successfully. 
Oct 14 06:11:53 localhost podman[324282]: 2025-10-14 10:11:53.202820349 +0000 UTC m=+0.118330205 container remove 1130219dc0a3ce5264181ad72c68ae1294af021486bd8755969cfaec3a0c1552 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-79feba7d-100b-4e64-b831-3f4a57dbb424, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0) Oct 14 06:11:53 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:11:53.229 271987 INFO neutron.agent.dhcp.agent [None req-6c54f2be-8e0a-4158-b04d-f09cebd7de05 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Oct 14 06:11:53 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:11:53.518 271987 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Oct 14 06:11:54 localhost systemd[1]: tmp-crun.ScCc19.mount: Deactivated successfully. Oct 14 06:11:54 localhost systemd[1]: var-lib-containers-storage-overlay-7eeabbf162c87724632e694f0871b1c4b2b0eedcb0891391b8c90b8326e4158d-merged.mount: Deactivated successfully. Oct 14 06:11:54 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-1130219dc0a3ce5264181ad72c68ae1294af021486bd8755969cfaec3a0c1552-userdata-shm.mount: Deactivated successfully. Oct 14 06:11:54 localhost systemd[1]: run-netns-qdhcp\x2d79feba7d\x2d100b\x2d4e64\x2db831\x2d3f4a57dbb424.mount: Deactivated successfully. 
Oct 14 06:11:54 localhost neutron_sriov_agent[264974]: 2025-10-14 10:11:54.087 2 INFO neutron.agent.securitygroups_rpc [None req-cfceca67-8a1d-4506-8406-61d97cd102f7 d6d06f9c969f4b25a388e6b1f8e79df2 4a912863089b4050b50010417538a2b4 - - default default] Security group member updated ['f4a71cc4-401e-4fd9-a76d-664285c1f988']#033[00m Oct 14 06:11:54 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e91 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Oct 14 06:11:54 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:11:54.366 271987 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-10-14T10:11:52Z, description=, device_id=d7f83b60-33c8-46bc-9877-cdf253390e07, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=a20ae385-8386-4015-814f-9819ab49f912, ip_allocation=immediate, mac_address=fa:16:3e:0d:8c:36, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-10-14T10:11:47Z, description=, dns_domain=, id=3e9ef511-c810-4cae-9585-029b9e5f4f80, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-ServerGroup264TestJSON-509392501-network, port_security_enabled=True, project_id=8c98ade2ddf84e768525191c30cded08, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=36010, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=571, status=ACTIVE, subnets=['eeebb122-f6b9-43a7-bbde-0cf4283bb474'], tags=[], tenant_id=8c98ade2ddf84e768525191c30cded08, updated_at=2025-10-14T10:11:48Z, vlan_transparent=None, network_id=3e9ef511-c810-4cae-9585-029b9e5f4f80, port_security_enabled=False, 
project_id=8c98ade2ddf84e768525191c30cded08, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=578, status=DOWN, tags=[], tenant_id=8c98ade2ddf84e768525191c30cded08, updated_at=2025-10-14T10:11:52Z on network 3e9ef511-c810-4cae-9585-029b9e5f4f80#033[00m Oct 14 06:11:54 localhost nova_compute[297686]: 2025-10-14 10:11:54.541 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:11:54 localhost dnsmasq[324101]: read /var/lib/neutron/dhcp/3e9ef511-c810-4cae-9585-029b9e5f4f80/addn_hosts - 1 addresses Oct 14 06:11:54 localhost dnsmasq-dhcp[324101]: read /var/lib/neutron/dhcp/3e9ef511-c810-4cae-9585-029b9e5f4f80/host Oct 14 06:11:54 localhost podman[324324]: 2025-10-14 10:11:54.630000432 +0000 UTC m=+0.069054043 container kill f9afd4b5673de68a042e150cd0e844ab370730a0d74ab895aee423053e5aff81 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-3e9ef511-c810-4cae-9585-029b9e5f4f80, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2) Oct 14 06:11:54 localhost dnsmasq-dhcp[324101]: read /var/lib/neutron/dhcp/3e9ef511-c810-4cae-9585-029b9e5f4f80/opts Oct 14 06:11:54 localhost systemd[1]: tmp-crun.5dyQJi.mount: Deactivated successfully. 
Oct 14 06:11:54 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:11:54.925 271987 INFO neutron.agent.dhcp.agent [None req-84dbf617-907b-453b-a220-2d8abca0c73d - - - - - -] DHCP configuration for ports {'a20ae385-8386-4015-814f-9819ab49f912'} is completed#033[00m Oct 14 06:11:55 localhost neutron_sriov_agent[264974]: 2025-10-14 10:11:55.738 2 INFO neutron.agent.securitygroups_rpc [None req-ab79e25f-ba50-4f71-a6eb-077d18b144c8 d6d06f9c969f4b25a388e6b1f8e79df2 4a912863089b4050b50010417538a2b4 - - default default] Security group member updated ['f4a71cc4-401e-4fd9-a76d-664285c1f988']#033[00m Oct 14 06:11:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad. Oct 14 06:11:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec. Oct 14 06:11:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29. 
Oct 14 06:11:56 localhost ovn_controller[157396]: 2025-10-14T10:11:56Z|00094|binding|INFO|Releasing lport 25c6586a-239c-451b-aac2-e0a3ee5c3145 from this chassis (sb_readonly=0) Oct 14 06:11:56 localhost nova_compute[297686]: 2025-10-14 10:11:56.747 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:11:56 localhost podman[324357]: 2025-10-14 10:11:56.788421467 +0000 UTC m=+0.125231928 container health_status aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Oct 14 06:11:56 localhost podman[324357]: 2025-10-14 10:11:56.797995401 +0000 UTC 
m=+0.134805832 container exec_died aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Oct 14 06:11:56 localhost systemd[1]: aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec.service: Deactivated successfully. 
Oct 14 06:11:56 localhost podman[324355]: 2025-10-14 10:11:56.83833609 +0000 UTC m=+0.179195465 container health_status 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true) Oct 14 06:11:56 localhost podman[324355]: 2025-10-14 10:11:56.847009447 +0000 UTC m=+0.187868822 container exec_died 
02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d) Oct 14 06:11:56 localhost systemd[1]: 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad.service: Deactivated successfully. 
Oct 14 06:11:56 localhost podman[324362]: 2025-10-14 10:11:56.766324919 +0000 UTC m=+0.095193465 container health_status fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, org.label-schema.license=GPLv2, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251009, container_name=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3) Oct 14 06:11:56 localhost dnsmasq[272303]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/addn_hosts - 4 addresses Oct 14 06:11:56 localhost podman[324394]: 2025-10-14 10:11:56.887821491 +0000 UTC m=+0.153222158 container kill 
373c3d29f2bfadc43d54cb6930a55bb623f8f4f5957bebc53e3e7158ee295964 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0145816-4627-44f2-af00-ccc9ef0436ed, org.label-schema.build-date=20251009, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5) Oct 14 06:11:56 localhost dnsmasq-dhcp[272303]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/host Oct 14 06:11:56 localhost dnsmasq-dhcp[272303]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/opts Oct 14 06:11:56 localhost podman[324362]: 2025-10-14 10:11:56.903023997 +0000 UTC m=+0.231892523 container exec_died fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=iscsid, container_name=iscsid, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d) Oct 14 06:11:56 localhost systemd[1]: fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29.service: Deactivated successfully. Oct 14 06:11:57 localhost nova_compute[297686]: 2025-10-14 10:11:57.192 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:11:57 localhost dnsmasq[324101]: read /var/lib/neutron/dhcp/3e9ef511-c810-4cae-9585-029b9e5f4f80/addn_hosts - 0 addresses Oct 14 06:11:57 localhost dnsmasq-dhcp[324101]: read /var/lib/neutron/dhcp/3e9ef511-c810-4cae-9585-029b9e5f4f80/host Oct 14 06:11:57 localhost podman[324459]: 2025-10-14 10:11:57.683206395 +0000 UTC m=+0.057613431 container kill f9afd4b5673de68a042e150cd0e844ab370730a0d74ab895aee423053e5aff81 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-3e9ef511-c810-4cae-9585-029b9e5f4f80, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0) Oct 14 06:11:57 localhost dnsmasq-dhcp[324101]: read /var/lib/neutron/dhcp/3e9ef511-c810-4cae-9585-029b9e5f4f80/opts Oct 14 06:11:57 localhost ovn_metadata_agent[163050]: 2025-10-14 10:11:57.781 163055 DEBUG 
oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Oct 14 06:11:57 localhost ovn_metadata_agent[163050]: 2025-10-14 10:11:57.781 163055 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Oct 14 06:11:57 localhost ovn_metadata_agent[163050]: 2025-10-14 10:11:57.782 163055 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Oct 14 06:11:57 localhost nova_compute[297686]: 2025-10-14 10:11:57.834 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:11:57 localhost ovn_controller[157396]: 2025-10-14T10:11:57Z|00095|binding|INFO|Releasing lport 700880b2-6060-40b9-8de0-d7a947508225 from this chassis (sb_readonly=0) Oct 14 06:11:57 localhost kernel: device tap700880b2-60 left promiscuous mode Oct 14 06:11:57 localhost ovn_controller[157396]: 2025-10-14T10:11:57Z|00096|binding|INFO|Setting lport 700880b2-6060-40b9-8de0-d7a947508225 down in Southbound Oct 14 06:11:57 localhost ovn_metadata_agent[163050]: 2025-10-14 10:11:57.843 163055 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005486733.localdomain'}, parent_port=[], 
requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpe1893814-19d7-5b97-af05-19e2aafa5382-3e9ef511-c810-4cae-9585-029b9e5f4f80', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3e9ef511-c810-4cae-9585-029b9e5f4f80', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8c98ade2ddf84e768525191c30cded08', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005486733.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=044c5ddd-6d9f-4b58-84de-547195ca389f, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=700880b2-6060-40b9-8de0-d7a947508225) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Oct 14 06:11:57 localhost ovn_metadata_agent[163050]: 2025-10-14 10:11:57.845 163055 INFO neutron.agent.ovn.metadata.agent [-] Port 700880b2-6060-40b9-8de0-d7a947508225 in datapath 3e9ef511-c810-4cae-9585-029b9e5f4f80 unbound from our chassis#033[00m Oct 14 06:11:57 localhost ovn_metadata_agent[163050]: 2025-10-14 10:11:57.848 163055 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 3e9ef511-c810-4cae-9585-029b9e5f4f80, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Oct 14 06:11:57 localhost ovn_metadata_agent[163050]: 2025-10-14 10:11:57.849 163159 DEBUG oslo.privsep.daemon [-] privsep: reply[9ceb6f32-5e16-458a-9f35-5aeca152b5d4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Oct 14 06:11:57 localhost nova_compute[297686]: 2025-10-14 10:11:57.860 2 DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:11:58 localhost podman[248187]: time="2025-10-14T10:11:58Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Oct 14 06:11:58 localhost podman[248187]: @ - - [14/Oct/2025:10:11:58 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 149319 "" "Go-http-client/1.1" Oct 14 06:11:58 localhost podman[248187]: @ - - [14/Oct/2025:10:11:58 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 20328 "" "Go-http-client/1.1" Oct 14 06:11:59 localhost ovn_controller[157396]: 2025-10-14T10:11:59Z|00097|binding|INFO|Releasing lport 25c6586a-239c-451b-aac2-e0a3ee5c3145 from this chassis (sb_readonly=0) Oct 14 06:11:59 localhost systemd[1]: tmp-crun.dy2YSl.mount: Deactivated successfully. Oct 14 06:11:59 localhost podman[324500]: 2025-10-14 10:11:59.100805883 +0000 UTC m=+0.047199302 container kill 373c3d29f2bfadc43d54cb6930a55bb623f8f4f5957bebc53e3e7158ee295964 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0145816-4627-44f2-af00-ccc9ef0436ed, tcib_managed=true, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2) Oct 14 06:11:59 localhost dnsmasq[272303]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/addn_hosts - 3 addresses Oct 14 06:11:59 localhost dnsmasq-dhcp[272303]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/host Oct 14 06:11:59 localhost dnsmasq-dhcp[272303]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/opts Oct 14 
06:11:59 localhost nova_compute[297686]: 2025-10-14 10:11:59.110 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:11:59 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e91 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Oct 14 06:11:59 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e92 e92: 6 total, 6 up, 6 in Oct 14 06:11:59 localhost nova_compute[297686]: 2025-10-14 10:11:59.543 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:11:59 localhost dnsmasq[272303]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/addn_hosts - 2 addresses Oct 14 06:11:59 localhost dnsmasq-dhcp[272303]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/host Oct 14 06:11:59 localhost podman[324537]: 2025-10-14 10:11:59.725045949 +0000 UTC m=+0.043133477 container kill 373c3d29f2bfadc43d54cb6930a55bb623f8f4f5957bebc53e3e7158ee295964 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0145816-4627-44f2-af00-ccc9ef0436ed, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS) Oct 14 06:11:59 localhost dnsmasq-dhcp[272303]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/opts Oct 14 06:11:59 localhost ovn_controller[157396]: 2025-10-14T10:11:59Z|00098|binding|INFO|Releasing lport 25c6586a-239c-451b-aac2-e0a3ee5c3145 from this chassis (sb_readonly=0) Oct 14 06:11:59 localhost nova_compute[297686]: 2025-10-14 10:11:59.774 2 DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:12:00 localhost dnsmasq[324101]: exiting on receipt of SIGTERM Oct 14 06:12:00 localhost podman[324573]: 2025-10-14 10:12:00.544132311 +0000 UTC m=+0.048890213 container kill f9afd4b5673de68a042e150cd0e844ab370730a0d74ab895aee423053e5aff81 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-3e9ef511-c810-4cae-9585-029b9e5f4f80, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5) Oct 14 06:12:00 localhost systemd[1]: libpod-f9afd4b5673de68a042e150cd0e844ab370730a0d74ab895aee423053e5aff81.scope: Deactivated successfully. Oct 14 06:12:00 localhost podman[324586]: 2025-10-14 10:12:00.605428574 +0000 UTC m=+0.050989437 container died f9afd4b5673de68a042e150cd0e844ab370730a0d74ab895aee423053e5aff81 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-3e9ef511-c810-4cae-9585-029b9e5f4f80, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251009) Oct 14 06:12:00 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f9afd4b5673de68a042e150cd0e844ab370730a0d74ab895aee423053e5aff81-userdata-shm.mount: Deactivated successfully. 
Oct 14 06:12:00 localhost podman[324586]: 2025-10-14 10:12:00.685967428 +0000 UTC m=+0.131528251 container cleanup f9afd4b5673de68a042e150cd0e844ab370730a0d74ab895aee423053e5aff81 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-3e9ef511-c810-4cae-9585-029b9e5f4f80, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0) Oct 14 06:12:00 localhost systemd[1]: libpod-conmon-f9afd4b5673de68a042e150cd0e844ab370730a0d74ab895aee423053e5aff81.scope: Deactivated successfully. Oct 14 06:12:00 localhost podman[324589]: 2025-10-14 10:12:00.709088068 +0000 UTC m=+0.142522569 container remove f9afd4b5673de68a042e150cd0e844ab370730a0d74ab895aee423053e5aff81 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-3e9ef511-c810-4cae-9585-029b9e5f4f80, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.license=GPLv2) Oct 14 06:12:00 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:12:00.714 271987 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-10-14T10:12:00Z, description=, device_id=82ce39a3-0c7e-4492-9620-2979cac3b1f9, device_owner=network:router_gateway, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], 
id=95b19dca-ed45-42b3-9301-8de2a72d2921, ip_allocation=immediate, mac_address=fa:16:3e:ca:87:8e, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-10-14T08:36:38Z, description=, dns_domain=, id=c0145816-4627-44f2-af00-ccc9ef0436ed, ipv4_address_scope=None, ipv6_address_scope=None, is_default=False, l2_adjacency=True, mtu=1350, name=public, port_security_enabled=True, project_id=41187b090f3d4818a32baa37ce8a3991, provider:network_type=flat, provider:physical_network=datacentre, provider:segmentation_id=None, qos_policy_id=None, revision_number=2, router:external=True, shared=False, standard_attr_id=29, status=ACTIVE, subnets=['bbd40dc1-97d8-4ed3-9d4f-9b3af758b526'], tags=[], tenant_id=41187b090f3d4818a32baa37ce8a3991, updated_at=2025-10-14T08:36:44Z, vlan_transparent=None, network_id=c0145816-4627-44f2-af00-ccc9ef0436ed, port_security_enabled=False, project_id=, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=616, status=DOWN, tags=[], tenant_id=, updated_at=2025-10-14T10:12:00Z on network c0145816-4627-44f2-af00-ccc9ef0436ed#033[00m Oct 14 06:12:00 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:12:00.753 271987 INFO neutron.agent.dhcp.agent [None req-1262df49-da0e-489a-bd8d-9f53f619b638 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Oct 14 06:12:00 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:12:00.811 271987 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Oct 14 06:12:00 localhost dnsmasq[272303]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/addn_hosts - 3 addresses Oct 14 06:12:00 localhost dnsmasq-dhcp[272303]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/host Oct 14 06:12:00 localhost podman[324634]: 2025-10-14 10:12:00.897810115 +0000 UTC m=+0.042698582 container kill 
373c3d29f2bfadc43d54cb6930a55bb623f8f4f5957bebc53e3e7158ee295964 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0145816-4627-44f2-af00-ccc9ef0436ed, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.build-date=20251009, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0) Oct 14 06:12:00 localhost dnsmasq-dhcp[272303]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/opts Oct 14 06:12:01 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:12:01.114 271987 INFO neutron.agent.dhcp.agent [None req-98badeb6-50d1-4559-87ee-06b4e24fa35d - - - - - -] DHCP configuration for ports {'95b19dca-ed45-42b3-9301-8de2a72d2921'} is completed#033[00m Oct 14 06:12:01 localhost systemd[1]: var-lib-containers-storage-overlay-ac115fb0935abbb3ca181d30347e4084bbf570c8b20395f2462cd287aa203687-merged.mount: Deactivated successfully. Oct 14 06:12:01 localhost systemd[1]: run-netns-qdhcp\x2d3e9ef511\x2dc810\x2d4cae\x2d9585\x2d029b9e5f4f80.mount: Deactivated successfully. 
Oct 14 06:12:01 localhost nova_compute[297686]: 2025-10-14 10:12:01.748 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:12:02 localhost nova_compute[297686]: 2025-10-14 10:12:02.198 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:12:02 localhost dnsmasq[272303]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/addn_hosts - 2 addresses Oct 14 06:12:02 localhost dnsmasq-dhcp[272303]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/host Oct 14 06:12:02 localhost podman[324671]: 2025-10-14 10:12:02.892562782 +0000 UTC m=+0.073720465 container kill 373c3d29f2bfadc43d54cb6930a55bb623f8f4f5957bebc53e3e7158ee295964 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0145816-4627-44f2-af00-ccc9ef0436ed, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5) Oct 14 06:12:02 localhost dnsmasq-dhcp[272303]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/opts Oct 14 06:12:03 localhost ovn_controller[157396]: 2025-10-14T10:12:03Z|00099|binding|INFO|Releasing lport 25c6586a-239c-451b-aac2-e0a3ee5c3145 from this chassis (sb_readonly=0) Oct 14 06:12:03 localhost nova_compute[297686]: 2025-10-14 10:12:03.192 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:12:04 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e92 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 
322961408 Oct 14 06:12:04 localhost nova_compute[297686]: 2025-10-14 10:12:04.545 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:12:05 localhost nova_compute[297686]: 2025-10-14 10:12:05.509 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:12:06 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e93 e93: 6 total, 6 up, 6 in Oct 14 06:12:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917. Oct 14 06:12:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d. Oct 14 06:12:06 localhost podman[324692]: 2025-10-14 10:12:06.753735494 +0000 UTC m=+0.086665523 container health_status 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, config_id=edpm, distribution-scope=public, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, version=9.6, name=ubi9-minimal, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, maintainer=Red Hat, Inc., release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.) Oct 14 06:12:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f. 
Oct 14 06:12:06 localhost podman[324692]: 2025-10-14 10:12:06.775503902 +0000 UTC m=+0.108433851 container exec_died 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, release=1755695350, vendor=Red Hat, Inc., version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, vcs-type=git, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., io.openshift.expose-services=, distribution-scope=public, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.) Oct 14 06:12:06 localhost systemd[1]: 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917.service: Deactivated successfully. Oct 14 06:12:06 localhost podman[324693]: 2025-10-14 10:12:06.86558632 +0000 UTC m=+0.193273598 container health_status 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', 
'/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2) Oct 14 06:12:06 localhost podman[324693]: 2025-10-14 10:12:06.876076972 +0000 UTC m=+0.203764210 container exec_died 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251009, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute) Oct 14 06:12:06 localhost systemd[1]: 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d.service: Deactivated successfully. Oct 14 06:12:06 localhost podman[324723]: 2025-10-14 10:12:06.9186406 +0000 UTC m=+0.134583016 container health_status 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_managed=true, io.buildah.version=1.41.3, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}) Oct 14 06:12:06 localhost podman[324723]: 2025-10-14 10:12:06.986210706 +0000 UTC m=+0.202153122 container exec_died 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS) Oct 14 06:12:06 localhost systemd[1]: 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f.service: Deactivated successfully. 
Oct 14 06:12:07 localhost nova_compute[297686]: 2025-10-14 10:12:07.200 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:12:08 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e94 e94: 6 total, 6 up, 6 in Oct 14 06:12:08 localhost openstack_network_exporter[250374]: ERROR 10:12:08 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Oct 14 06:12:08 localhost openstack_network_exporter[250374]: ERROR 10:12:08 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Oct 14 06:12:08 localhost openstack_network_exporter[250374]: ERROR 10:12:08 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Oct 14 06:12:08 localhost openstack_network_exporter[250374]: ERROR 10:12:08 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Oct 14 06:12:08 localhost openstack_network_exporter[250374]: Oct 14 06:12:08 localhost openstack_network_exporter[250374]: ERROR 10:12:08 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Oct 14 06:12:08 localhost openstack_network_exporter[250374]: Oct 14 06:12:09 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e94 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Oct 14 06:12:09 localhost nova_compute[297686]: 2025-10-14 10:12:09.547 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:12:09 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e95 e95: 6 total, 6 up, 6 in Oct 14 06:12:11 localhost ovn_controller[157396]: 2025-10-14T10:12:11Z|00100|binding|INFO|Releasing lport 25c6586a-239c-451b-aac2-e0a3ee5c3145 from this chassis (sb_readonly=0) Oct 14 06:12:11 localhost nova_compute[297686]: 
2025-10-14 10:12:11.785 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:12:11 localhost podman[324770]: 2025-10-14 10:12:11.786322392 +0000 UTC m=+0.089008866 container kill 373c3d29f2bfadc43d54cb6930a55bb623f8f4f5957bebc53e3e7158ee295964 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0145816-4627-44f2-af00-ccc9ef0436ed, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS) Oct 14 06:12:11 localhost dnsmasq[272303]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/addn_hosts - 1 addresses Oct 14 06:12:11 localhost dnsmasq-dhcp[272303]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/host Oct 14 06:12:11 localhost dnsmasq-dhcp[272303]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/opts Oct 14 06:12:11 localhost systemd[1]: tmp-crun.v66fuz.mount: Deactivated successfully. 
Oct 14 06:12:12 localhost nova_compute[297686]: 2025-10-14 10:12:12.204 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:12:14 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e95 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Oct 14 06:12:14 localhost nova_compute[297686]: 2025-10-14 10:12:14.549 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:12:14 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e96 e96: 6 total, 6 up, 6 in Oct 14 06:12:17 localhost nova_compute[297686]: 2025-10-14 10:12:17.235 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:12:18 localhost nova_compute[297686]: 2025-10-14 10:12:18.501 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:12:18 localhost ovn_metadata_agent[163050]: 2025-10-14 10:12:18.501 163055 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=10, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'b6:6b:50', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '6a:59:81:01:bc:8b'}, ipsec=False) old=SB_Global(nb_cfg=9) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Oct 14 06:12:18 localhost ovn_metadata_agent[163050]: 2025-10-14 10:12:18.503 163055 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run 
/usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m Oct 14 06:12:18 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e97 e97: 6 total, 6 up, 6 in Oct 14 06:12:19 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e97 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Oct 14 06:12:19 localhost ceph-mon[317114]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #22. Immutable memtables: 0. Oct 14 06:12:19 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:12:19.551949) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0 Oct 14 06:12:19 localhost ceph-mon[317114]: rocksdb: [db/flush_job.cc:856] [default] [JOB 9] Flushing memtable with next log file: 22 Oct 14 06:12:19 localhost ceph-mon[317114]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760436739551989, "job": 9, "event": "flush_started", "num_memtables": 1, "num_entries": 1786, "num_deletes": 257, "total_data_size": 2159271, "memory_usage": 2193424, "flush_reason": "Manual Compaction"} Oct 14 06:12:19 localhost ceph-mon[317114]: rocksdb: [db/flush_job.cc:885] [default] [JOB 9] Level-0 flush table #23: started Oct 14 06:12:19 localhost nova_compute[297686]: 2025-10-14 10:12:19.551 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:12:19 localhost ceph-mon[317114]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760436739560811, "cf_name": "default", "job": 9, "event": "table_file_creation", "file_number": 23, "file_size": 1408041, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 16506, "largest_seqno": 18287, "table_properties": {"data_size": 1401362, "index_size": 3765, "index_partitions": 0, 
"top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1861, "raw_key_size": 14770, "raw_average_key_size": 20, "raw_value_size": 1387502, "raw_average_value_size": 1895, "num_data_blocks": 166, "num_entries": 732, "num_filter_entries": 732, "num_deletions": 257, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1760436619, "oldest_key_time": 1760436619, "file_creation_time": 1760436739, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "c006d4a7-7ae8-4cd3-a1b5-95f5bf3c427c", "db_session_id": "ZS6W3A0Q266OBGAU6ARP", "orig_file_number": 23, "seqno_to_time_mapping": "N/A"}} Oct 14 06:12:19 localhost ceph-mon[317114]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 9] Flush lasted 8928 microseconds, and 5074 cpu microseconds. Oct 14 06:12:19 localhost ceph-mon[317114]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. 
Oct 14 06:12:19 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:12:19.560871) [db/flush_job.cc:967] [default] [JOB 9] Level-0 flush table #23: 1408041 bytes OK Oct 14 06:12:19 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:12:19.560897) [db/memtable_list.cc:519] [default] Level-0 commit table #23 started Oct 14 06:12:19 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:12:19.564251) [db/memtable_list.cc:722] [default] Level-0 commit table #23: memtable #1 done Oct 14 06:12:19 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:12:19.564282) EVENT_LOG_v1 {"time_micros": 1760436739564274, "job": 9, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0} Oct 14 06:12:19 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:12:19.564308) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25 Oct 14 06:12:19 localhost ceph-mon[317114]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 9] Try to delete WAL files size 2151036, prev total WAL file size 2151360, number of live WAL files 2. Oct 14 06:12:19 localhost ceph-mon[317114]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005486733/store.db/000019.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Oct 14 06:12:19 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:12:19.565091) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0034303134' seq:72057594037927935, type:22 .. 
'6C6F676D0034323635' seq:0, type:0; will stop at (end) Oct 14 06:12:19 localhost ceph-mon[317114]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 10] Compacting 1@0 + 1@6 files to L6, score -1.00 Oct 14 06:12:19 localhost ceph-mon[317114]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 9 Base level 0, inputs: [23(1375KB)], [21(17MB)] Oct 14 06:12:19 localhost ceph-mon[317114]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760436739565169, "job": 10, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [23], "files_L6": [21], "score": -1, "input_data_size": 19859241, "oldest_snapshot_seqno": -1} Oct 14 06:12:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041. Oct 14 06:12:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857. Oct 14 06:12:19 localhost ceph-mon[317114]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 10] Generated table #24: 12484 keys, 19721192 bytes, temperature: kUnknown Oct 14 06:12:19 localhost ceph-mon[317114]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760436739656984, "cf_name": "default", "job": 10, "event": "table_file_creation", "file_number": 24, "file_size": 19721192, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 19648069, "index_size": 40830, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 31237, "raw_key_size": 336814, "raw_average_key_size": 26, "raw_value_size": 19433379, "raw_average_value_size": 1556, "num_data_blocks": 1545, "num_entries": 12484, "num_filter_entries": 12484, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": 
"bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1760436394, "oldest_key_time": 0, "file_creation_time": 1760436739, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "c006d4a7-7ae8-4cd3-a1b5-95f5bf3c427c", "db_session_id": "ZS6W3A0Q266OBGAU6ARP", "orig_file_number": 24, "seqno_to_time_mapping": "N/A"}} Oct 14 06:12:19 localhost ceph-mon[317114]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. Oct 14 06:12:19 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:12:19.657352) [db/compaction/compaction_job.cc:1663] [default] [JOB 10] Compacted 1@0 + 1@6 files to L6 => 19721192 bytes Oct 14 06:12:19 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:12:19.659109) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 216.1 rd, 214.6 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.3, 17.6 +0.0 blob) out(18.8 +0.0 blob), read-write-amplify(28.1) write-amplify(14.0) OK, records in: 13018, records dropped: 534 output_compression: NoCompression Oct 14 06:12:19 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:12:19.659138) EVENT_LOG_v1 {"time_micros": 1760436739659125, "job": 10, "event": "compaction_finished", "compaction_time_micros": 91896, "compaction_time_cpu_micros": 54011, "output_level": 6, "num_output_files": 1, "total_output_size": 19721192, "num_input_records": 13018, "num_output_records": 12484, 
"num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]} Oct 14 06:12:19 localhost ceph-mon[317114]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005486733/store.db/000023.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Oct 14 06:12:19 localhost ceph-mon[317114]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760436739659492, "job": 10, "event": "table_file_deletion", "file_number": 23} Oct 14 06:12:19 localhost ceph-mon[317114]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005486733/store.db/000021.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Oct 14 06:12:19 localhost ceph-mon[317114]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760436739662228, "job": 10, "event": "table_file_deletion", "file_number": 21} Oct 14 06:12:19 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:12:19.564965) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Oct 14 06:12:19 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:12:19.662306) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Oct 14 06:12:19 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:12:19.662312) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Oct 14 06:12:19 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:12:19.662314) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Oct 14 06:12:19 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:12:19.662315) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Oct 14 06:12:19 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:12:19.662316) 
[db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Oct 14 06:12:19 localhost podman[324792]: 2025-10-14 10:12:19.75333576 +0000 UTC m=+0.090386938 container health_status 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Oct 14 06:12:19 localhost podman[324792]: 2025-10-14 10:12:19.789285364 +0000 UTC m=+0.126336612 container exec_died 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': 
['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Oct 14 06:12:19 localhost systemd[1]: 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041.service: Deactivated successfully. Oct 14 06:12:19 localhost podman[324793]: 2025-10-14 10:12:19.869137227 +0000 UTC m=+0.202958846 container health_status 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image) Oct 14 06:12:19 localhost podman[324793]: 2025-10-14 10:12:19.90308153 +0000 UTC m=+0.236903089 container exec_died 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, 
container_name=ovn_metadata_agent, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible) Oct 14 06:12:19 localhost systemd[1]: 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857.service: Deactivated successfully. Oct 14 06:12:20 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:12:20.108 271987 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-10-14T10:12:19Z, description=, device_id=2c392c3e-0432-4315-afb1-592d71fb8e43, device_owner=network:router_gateway, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=608d193e-b40e-42ce-95e2-532a86f20043, ip_allocation=immediate, mac_address=fa:16:3e:b5:07:c5, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-10-14T08:36:38Z, description=, dns_domain=, id=c0145816-4627-44f2-af00-ccc9ef0436ed, ipv4_address_scope=None, ipv6_address_scope=None, is_default=False, l2_adjacency=True, mtu=1350, name=public, port_security_enabled=True, project_id=41187b090f3d4818a32baa37ce8a3991, provider:network_type=flat, provider:physical_network=datacentre, provider:segmentation_id=None, qos_policy_id=None, revision_number=2, router:external=True, shared=False, standard_attr_id=29, status=ACTIVE, subnets=['bbd40dc1-97d8-4ed3-9d4f-9b3af758b526'], tags=[], tenant_id=41187b090f3d4818a32baa37ce8a3991, updated_at=2025-10-14T08:36:44Z, vlan_transparent=None, network_id=c0145816-4627-44f2-af00-ccc9ef0436ed, port_security_enabled=False, project_id=, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=704, status=DOWN, tags=[], tenant_id=, 
updated_at=2025-10-14T10:12:19Z on network c0145816-4627-44f2-af00-ccc9ef0436ed#033[00m Oct 14 06:12:20 localhost podman[324893]: 2025-10-14 10:12:20.33864203 +0000 UTC m=+0.048284784 container kill 373c3d29f2bfadc43d54cb6930a55bb623f8f4f5957bebc53e3e7158ee295964 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0145816-4627-44f2-af00-ccc9ef0436ed, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0) Oct 14 06:12:20 localhost dnsmasq[272303]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/addn_hosts - 2 addresses Oct 14 06:12:20 localhost dnsmasq-dhcp[272303]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/host Oct 14 06:12:20 localhost dnsmasq-dhcp[272303]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/opts Oct 14 06:12:20 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:12:20.594 271987 INFO neutron.agent.dhcp.agent [None req-9978ea40-dddc-4bcb-a5aa-b4e67595c05f - - - - - -] DHCP configuration for ports {'608d193e-b40e-42ce-95e2-532a86f20043'} is completed#033[00m Oct 14 06:12:20 localhost nova_compute[297686]: 2025-10-14 10:12:20.827 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:12:21 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Oct 14 06:12:21 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' Oct 14 06:12:22 localhost nova_compute[297686]: 2025-10-14 10:12:22.269 2 DEBUG ovsdbapp.backend.ovs_idl.vlog 
[-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:12:23 localhost nova_compute[297686]: 2025-10-14 10:12:23.268 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:12:23 localhost nova_compute[297686]: 2025-10-14 10:12:23.268 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:12:23 localhost nova_compute[297686]: 2025-10-14 10:12:23.818 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:12:24 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e97 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Oct 14 06:12:24 localhost nova_compute[297686]: 2025-10-14 10:12:24.554 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:12:24 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e98 e98: 6 total, 6 up, 6 in Oct 14 06:12:25 localhost nova_compute[297686]: 2025-10-14 10:12:25.256 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:12:25 localhost nova_compute[297686]: 2025-10-14 10:12:25.257 2 DEBUG nova.compute.manager [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Starting heal instance info cache _heal_instance_info_cache 
/usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Oct 14 06:12:25 localhost nova_compute[297686]: 2025-10-14 10:12:25.258 2 DEBUG nova.compute.manager [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Oct 14 06:12:25 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' Oct 14 06:12:25 localhost neutron_sriov_agent[264974]: 2025-10-14 10:12:25.393 2 INFO neutron.agent.securitygroups_rpc [None req-5fb6e895-0b72-4fd9-afb2-e468fd4c9d8e a5c8b032521c4660a9f50471da931c3a 67facb686b1a45e4af5a7329836978ce - - default default] Security group rule updated ['c2c1552c-9248-46c1-8391-9c390debaa3c']#033[00m Oct 14 06:12:25 localhost ovn_metadata_agent[163050]: 2025-10-14 10:12:25.505 163055 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=9e4b0f79-1220-4c7d-a18d-fa1a88dab362, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '10'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Oct 14 06:12:25 localhost neutron_sriov_agent[264974]: 2025-10-14 10:12:25.583 2 INFO neutron.agent.securitygroups_rpc [None req-bf8acc07-8506-4dbe-a875-499669ac567e a5c8b032521c4660a9f50471da931c3a 67facb686b1a45e4af5a7329836978ce - - default default] Security group rule updated ['c2c1552c-9248-46c1-8391-9c390debaa3c']#033[00m Oct 14 06:12:26 localhost nova_compute[297686]: 2025-10-14 10:12:26.611 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Acquiring lock "refresh_cache-88c4e366-b765-47a6-96bf-f7677f2ce67c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Oct 14 06:12:26 localhost nova_compute[297686]: 2025-10-14 10:12:26.612 2 DEBUG 
oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Acquired lock "refresh_cache-88c4e366-b765-47a6-96bf-f7677f2ce67c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Oct 14 06:12:26 localhost nova_compute[297686]: 2025-10-14 10:12:26.612 2 DEBUG nova.network.neutron [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] [instance: 88c4e366-b765-47a6-96bf-f7677f2ce67c] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Oct 14 06:12:26 localhost nova_compute[297686]: 2025-10-14 10:12:26.612 2 DEBUG nova.objects.instance [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Lazy-loading 'info_cache' on Instance uuid 88c4e366-b765-47a6-96bf-f7677f2ce67c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Oct 14 06:12:27 localhost nova_compute[297686]: 2025-10-14 10:12:27.314 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:12:27 localhost nova_compute[297686]: 2025-10-14 10:12:27.323 2 DEBUG nova.network.neutron [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] [instance: 88c4e366-b765-47a6-96bf-f7677f2ce67c] Updating instance_info_cache with network_info: [{"id": "3ec9b060-f43d-4698-9c76-6062c70911d5", "address": "fa:16:3e:84:5e:e5", "network": {"id": "7d0cd696-bdd7-4e70-9512-eb0d23640314", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.46", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, 
"tenant_id": "41187b090f3d4818a32baa37ce8a3991", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3ec9b060-f4", "ovs_interfaceid": "3ec9b060-f43d-4698-9c76-6062c70911d5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Oct 14 06:12:27 localhost nova_compute[297686]: 2025-10-14 10:12:27.345 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Releasing lock "refresh_cache-88c4e366-b765-47a6-96bf-f7677f2ce67c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Oct 14 06:12:27 localhost nova_compute[297686]: 2025-10-14 10:12:27.345 2 DEBUG nova.compute.manager [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] [instance: 88c4e366-b765-47a6-96bf-f7677f2ce67c] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Oct 14 06:12:27 localhost nova_compute[297686]: 2025-10-14 10:12:27.346 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:12:27 localhost nova_compute[297686]: 2025-10-14 10:12:27.347 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:12:27 localhost nova_compute[297686]: 2025-10-14 10:12:27.347 2 DEBUG 
nova.compute.manager [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Oct 14 06:12:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad. Oct 14 06:12:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec. Oct 14 06:12:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29. Oct 14 06:12:27 localhost systemd[1]: tmp-crun.YKJHRs.mount: Deactivated successfully. Oct 14 06:12:27 localhost podman[324957]: 2025-10-14 10:12:27.754878089 +0000 UTC m=+0.090127500 container health_status 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, config_id=multipathd) Oct 14 06:12:27 localhost podman[324957]: 2025-10-14 10:12:27.796313662 +0000 UTC m=+0.131563093 container exec_died 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3) Oct 14 06:12:27 localhost podman[324958]: 2025-10-14 10:12:27.798491559 +0000 UTC m=+0.132721668 container health_status aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', 
'/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Oct 14 06:12:27 localhost systemd[1]: 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad.service: Deactivated successfully. Oct 14 06:12:27 localhost podman[324959]: 2025-10-14 10:12:27.847096792 +0000 UTC m=+0.178746413 container health_status fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, org.label-schema.schema-version=1.0, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, container_name=iscsid, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, managed_by=edpm_ansible) Oct 14 06:12:27 localhost podman[324959]: 
2025-10-14 10:12:27.883074637 +0000 UTC m=+0.214724238 container exec_died fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, container_name=iscsid, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2) Oct 14 06:12:27 localhost systemd[1]: fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29.service: Deactivated successfully. 
Oct 14 06:12:27 localhost podman[324958]: 2025-10-14 10:12:27.933856667 +0000 UTC m=+0.268086846 container exec_died aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors ) Oct 14 06:12:27 localhost systemd[1]: aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec.service: Deactivated successfully. 
Oct 14 06:12:28 localhost nova_compute[297686]: 2025-10-14 10:12:28.257 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:12:28 localhost nova_compute[297686]: 2025-10-14 10:12:28.258 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:12:28 localhost podman[248187]: time="2025-10-14T10:12:28Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Oct 14 06:12:28 localhost podman[248187]: @ - - [14/Oct/2025:10:12:28 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 147497 "" "Go-http-client/1.1" Oct 14 06:12:28 localhost podman[248187]: @ - - [14/Oct/2025:10:12:28 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19855 "" "Go-http-client/1.1" Oct 14 06:12:29 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e98 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Oct 14 06:12:29 localhost nova_compute[297686]: 2025-10-14 10:12:29.557 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:12:29 localhost neutron_sriov_agent[264974]: 2025-10-14 10:12:29.971 2 INFO neutron.agent.securitygroups_rpc [req-71942470-9079-4931-bb18-878b256d4354 req-27095b53-a69d-4785-aaad-da6bebb4cf09 a5c8b032521c4660a9f50471da931c3a 67facb686b1a45e4af5a7329836978ce - - default default] Security group member updated ['c2c1552c-9248-46c1-8391-9c390debaa3c']#033[00m Oct 14 06:12:30 localhost 
nova_compute[297686]: 2025-10-14 10:12:30.252 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:12:30 localhost nova_compute[297686]: 2025-10-14 10:12:30.275 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:12:30 localhost nova_compute[297686]: 2025-10-14 10:12:30.295 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Oct 14 06:12:30 localhost nova_compute[297686]: 2025-10-14 10:12:30.295 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Oct 14 06:12:30 localhost nova_compute[297686]: 2025-10-14 10:12:30.295 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Oct 14 06:12:30 localhost nova_compute[297686]: 2025-10-14 10:12:30.296 2 DEBUG nova.compute.resource_tracker [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Auditing locally available compute resources for np0005486733.localdomain (node: 
np0005486733.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Oct 14 06:12:30 localhost nova_compute[297686]: 2025-10-14 10:12:30.296 2 DEBUG oslo_concurrency.processutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Oct 14 06:12:30 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Oct 14 06:12:30 localhost ceph-mon[317114]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/2293324535' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Oct 14 06:12:30 localhost nova_compute[297686]: 2025-10-14 10:12:30.752 2 DEBUG oslo_concurrency.processutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.456s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Oct 14 06:12:30 localhost nova_compute[297686]: 2025-10-14 10:12:30.815 2 DEBUG nova.virt.libvirt.driver [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Oct 14 06:12:30 localhost nova_compute[297686]: 2025-10-14 10:12:30.815 2 DEBUG nova.virt.libvirt.driver [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Oct 14 06:12:31 localhost nova_compute[297686]: 2025-10-14 10:12:31.003 2 WARNING nova.virt.libvirt.driver [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - 
-] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m Oct 14 06:12:31 localhost nova_compute[297686]: 2025-10-14 10:12:31.005 2 DEBUG nova.compute.resource_tracker [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Hypervisor/Node resource view: name=np0005486733.localdomain free_ram=11337MB free_disk=41.83695602416992GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": 
"pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Oct 14 06:12:31 localhost nova_compute[297686]: 2025-10-14 10:12:31.006 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Oct 14 06:12:31 localhost nova_compute[297686]: 2025-10-14 10:12:31.006 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Oct 14 06:12:31 localhost nova_compute[297686]: 2025-10-14 10:12:31.079 2 DEBUG nova.compute.resource_tracker [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Instance 88c4e366-b765-47a6-96bf-f7677f2ce67c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Oct 14 06:12:31 localhost nova_compute[297686]: 2025-10-14 10:12:31.080 2 DEBUG nova.compute.resource_tracker [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Oct 14 06:12:31 localhost nova_compute[297686]: 2025-10-14 10:12:31.080 2 DEBUG nova.compute.resource_tracker [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Final resource view: name=np0005486733.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Oct 14 06:12:31 localhost nova_compute[297686]: 2025-10-14 10:12:31.135 2 DEBUG oslo_concurrency.processutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Oct 14 06:12:31 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e99 e99: 6 total, 6 up, 6 in Oct 14 06:12:31 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Oct 14 06:12:31 localhost ceph-mon[317114]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.108:0/2389991937' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Oct 14 06:12:31 localhost nova_compute[297686]: 2025-10-14 10:12:31.605 2 DEBUG oslo_concurrency.processutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.471s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Oct 14 06:12:31 localhost nova_compute[297686]: 2025-10-14 10:12:31.611 2 DEBUG nova.compute.provider_tree [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Inventory has not changed in ProviderTree for provider: 18c24273-aca2-4f08-be57-3188d558235e update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Oct 14 06:12:31 localhost nova_compute[297686]: 2025-10-14 10:12:31.627 2 DEBUG nova.scheduler.client.report [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Inventory has not changed for provider 18c24273-aca2-4f08-be57-3188d558235e based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Oct 14 06:12:31 localhost nova_compute[297686]: 2025-10-14 10:12:31.652 2 DEBUG nova.compute.resource_tracker [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Compute_service record updated for np0005486733.localdomain:np0005486733.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Oct 14 06:12:31 localhost nova_compute[297686]: 2025-10-14 10:12:31.652 2 DEBUG oslo_concurrency.lockutils [None 
req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.646s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Oct 14 06:12:32 localhost nova_compute[297686]: 2025-10-14 10:12:32.349 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:12:33 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e100 e100: 6 total, 6 up, 6 in Oct 14 06:12:33 localhost nova_compute[297686]: 2025-10-14 10:12:33.633 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:12:34 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e100 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Oct 14 06:12:34 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e101 e101: 6 total, 6 up, 6 in Oct 14 06:12:34 localhost nova_compute[297686]: 2025-10-14 10:12:34.559 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:12:37 localhost nova_compute[297686]: 2025-10-14 10:12:37.382 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:12:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f. Oct 14 06:12:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917. 
Oct 14 06:12:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d. Oct 14 06:12:37 localhost podman[325063]: 2025-10-14 10:12:37.734648631 +0000 UTC m=+0.075147249 container health_status 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, vcs-type=git, config_id=edpm, name=ubi9-minimal, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, release=1755695350, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., distribution-scope=public, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Oct 14 06:12:37 localhost podman[325063]: 2025-10-14 10:12:37.751066196 +0000 UTC m=+0.091564844 container exec_died 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, name=ubi9-minimal, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', 
'/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, vcs-type=git, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, release=1755695350, com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public) Oct 14 06:12:37 localhost systemd[1]: 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917.service: Deactivated successfully. Oct 14 06:12:37 localhost systemd[1]: tmp-crun.YrceDN.mount: Deactivated successfully. 
Oct 14 06:12:37 localhost podman[325064]: 2025-10-14 10:12:37.841818323 +0000 UTC m=+0.172939154 container health_status 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251009, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS) Oct 14 06:12:37 localhost podman[325062]: 2025-10-14 10:12:37.855924836 +0000 UTC m=+0.196697403 container health_status 
1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009) Oct 14 06:12:37 localhost podman[325064]: 2025-10-14 10:12:37.907828331 +0000 UTC m=+0.238949212 container exec_died 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_id=edpm, container_name=ceilometer_agent_compute, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'image': 
'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0) Oct 14 06:12:37 localhost systemd[1]: 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d.service: Deactivated successfully. 
Oct 14 06:12:37 localhost podman[325062]: 2025-10-14 10:12:37.927203747 +0000 UTC m=+0.267976314 container exec_died 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, container_name=ovn_controller) Oct 14 06:12:37 localhost systemd[1]: 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f.service: Deactivated successfully. Oct 14 06:12:38 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e102 e102: 6 total, 6 up, 6 in Oct 14 06:12:38 localhost systemd[1]: tmp-crun.AgSt72.mount: Deactivated successfully. 
Oct 14 06:12:38 localhost openstack_network_exporter[250374]: ERROR 10:12:38 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Oct 14 06:12:38 localhost openstack_network_exporter[250374]: ERROR 10:12:38 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Oct 14 06:12:38 localhost openstack_network_exporter[250374]: ERROR 10:12:38 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Oct 14 06:12:38 localhost openstack_network_exporter[250374]: ERROR 10:12:38 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Oct 14 06:12:38 localhost openstack_network_exporter[250374]: Oct 14 06:12:38 localhost openstack_network_exporter[250374]: ERROR 10:12:38 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Oct 14 06:12:38 localhost openstack_network_exporter[250374]: Oct 14 06:12:39 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e102 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Oct 14 06:12:39 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e103 e103: 6 total, 6 up, 6 in Oct 14 06:12:39 localhost nova_compute[297686]: 2025-10-14 10:12:39.562 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:12:39 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e104 e104: 6 total, 6 up, 6 in Oct 14 06:12:41 localhost ceph-osd[31500]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Oct 14 06:12:41 localhost ceph-osd[31500]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 8400.1 total, 600.0 interval#012Cumulative writes: 7155 writes, 30K keys, 7155 commit groups, 1.0 writes per commit group, ingest: 0.03 GB, 0.00 MB/s#012Cumulative WAL: 7155 writes, 1605 syncs, 4.46 
writes per sync, written: 0.03 GB, 0.00 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 2092 writes, 7814 keys, 2092 commit groups, 1.0 writes per commit group, ingest: 8.19 MB, 0.01 MB/s#012Interval WAL: 2092 writes, 915 syncs, 2.29 writes per sync, written: 0.01 GB, 0.01 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent Oct 14 06:12:41 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e105 e105: 6 total, 6 up, 6 in Oct 14 06:12:42 localhost nova_compute[297686]: 2025-10-14 10:12:42.420 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:12:42 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e106 e106: 6 total, 6 up, 6 in Oct 14 06:12:44 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e106 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Oct 14 06:12:44 localhost nova_compute[297686]: 2025-10-14 10:12:44.565 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:12:44 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e107 e107: 6 total, 6 up, 6 in Oct 14 06:12:45 localhost ceph-osd[32440]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Oct 14 06:12:45 localhost ceph-osd[32440]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 8400.1 total, 600.0 interval#012Cumulative writes: 8854 writes, 36K keys, 8854 commit groups, 1.0 writes per commit group, ingest: 0.03 GB, 0.00 MB/s#012Cumulative WAL: 8853 writes, 2166 syncs, 4.09 writes per sync, written: 0.03 GB, 0.00 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 3178 writes, 11K keys, 3178 commit groups, 1.0 writes per commit group, ingest: 13.32 MB, 0.02 MB/s#012Interval WAL: 3177 writes, 1342 syncs, 2.37 writes per sync, written: 0.01 GB, 0.02 
MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent Oct 14 06:12:47 localhost nova_compute[297686]: 2025-10-14 10:12:47.469 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:12:47 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:12:47.556 271987 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-10-14T10:12:47Z, description=, device_id=cd458f74-59aa-4484-a529-3365c9369c99, device_owner=network:router_gateway, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=945a5e17-199f-4982-8462-284d8f4975ce, ip_allocation=immediate, mac_address=fa:16:3e:04:e6:07, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-10-14T08:36:38Z, description=, dns_domain=, id=c0145816-4627-44f2-af00-ccc9ef0436ed, ipv4_address_scope=None, ipv6_address_scope=None, is_default=False, l2_adjacency=True, mtu=1350, name=public, port_security_enabled=True, project_id=41187b090f3d4818a32baa37ce8a3991, provider:network_type=flat, provider:physical_network=datacentre, provider:segmentation_id=None, qos_policy_id=None, revision_number=2, router:external=True, shared=False, standard_attr_id=29, status=ACTIVE, subnets=['bbd40dc1-97d8-4ed3-9d4f-9b3af758b526'], tags=[], tenant_id=41187b090f3d4818a32baa37ce8a3991, updated_at=2025-10-14T08:36:44Z, vlan_transparent=None, network_id=c0145816-4627-44f2-af00-ccc9ef0436ed, port_security_enabled=False, project_id=, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=862, status=DOWN, tags=[], tenant_id=, updated_at=2025-10-14T10:12:47Z on network c0145816-4627-44f2-af00-ccc9ef0436ed#033[00m Oct 14 06:12:47 localhost 
dnsmasq[272303]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/addn_hosts - 3 addresses Oct 14 06:12:47 localhost dnsmasq-dhcp[272303]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/host Oct 14 06:12:47 localhost dnsmasq-dhcp[272303]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/opts Oct 14 06:12:47 localhost podman[325139]: 2025-10-14 10:12:47.787706913 +0000 UTC m=+0.061283424 container kill 373c3d29f2bfadc43d54cb6930a55bb623f8f4f5957bebc53e3e7158ee295964 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0145816-4627-44f2-af00-ccc9ef0436ed, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251009) Oct 14 06:12:48 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:12:48.089 271987 INFO neutron.agent.dhcp.agent [None req-dfd48282-da5a-40db-a78b-64b302f2cbf0 - - - - - -] DHCP configuration for ports {'945a5e17-199f-4982-8462-284d8f4975ce'} is completed#033[00m Oct 14 06:12:48 localhost nova_compute[297686]: 2025-10-14 10:12:48.657 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:12:49 localhost ceph-mon[317114]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #25. Immutable memtables: 0. 
Oct 14 06:12:49 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:12:49.255821) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0 Oct 14 06:12:49 localhost ceph-mon[317114]: rocksdb: [db/flush_job.cc:856] [default] [JOB 11] Flushing memtable with next log file: 25 Oct 14 06:12:49 localhost ceph-mon[317114]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760436769255885, "job": 11, "event": "flush_started", "num_memtables": 1, "num_entries": 773, "num_deletes": 255, "total_data_size": 851333, "memory_usage": 865352, "flush_reason": "Manual Compaction"} Oct 14 06:12:49 localhost ceph-mon[317114]: rocksdb: [db/flush_job.cc:885] [default] [JOB 11] Level-0 flush table #26: started Oct 14 06:12:49 localhost ceph-mon[317114]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760436769260378, "cf_name": "default", "job": 11, "event": "table_file_creation", "file_number": 26, "file_size": 556409, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 18292, "largest_seqno": 19060, "table_properties": {"data_size": 552776, "index_size": 1424, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1157, "raw_key_size": 9140, "raw_average_key_size": 20, "raw_value_size": 545236, "raw_average_value_size": 1244, "num_data_blocks": 62, "num_entries": 438, "num_filter_entries": 438, "num_deletions": 255, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; 
max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1760436739, "oldest_key_time": 1760436739, "file_creation_time": 1760436769, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "c006d4a7-7ae8-4cd3-a1b5-95f5bf3c427c", "db_session_id": "ZS6W3A0Q266OBGAU6ARP", "orig_file_number": 26, "seqno_to_time_mapping": "N/A"}} Oct 14 06:12:49 localhost ceph-mon[317114]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 11] Flush lasted 4605 microseconds, and 2007 cpu microseconds. Oct 14 06:12:49 localhost ceph-mon[317114]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. Oct 14 06:12:49 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:12:49.260430) [db/flush_job.cc:967] [default] [JOB 11] Level-0 flush table #26: 556409 bytes OK Oct 14 06:12:49 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:12:49.260455) [db/memtable_list.cc:519] [default] Level-0 commit table #26 started Oct 14 06:12:49 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:12:49.262627) [db/memtable_list.cc:722] [default] Level-0 commit table #26: memtable #1 done Oct 14 06:12:49 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:12:49.262643) EVENT_LOG_v1 {"time_micros": 1760436769262638, "job": 11, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0} Oct 14 06:12:49 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:12:49.262665) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25 Oct 14 06:12:49 localhost ceph-mon[317114]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 11] Try to delete WAL files size 847155, prev total WAL file size 
847155, number of live WAL files 2. Oct 14 06:12:49 localhost ceph-mon[317114]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005486733/store.db/000022.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Oct 14 06:12:49 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:12:49.263326) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003132303438' seq:72057594037927935, type:22 .. '7061786F73003132333030' seq:0, type:0; will stop at (end) Oct 14 06:12:49 localhost ceph-mon[317114]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 12] Compacting 1@0 + 1@6 files to L6, score -1.00 Oct 14 06:12:49 localhost ceph-mon[317114]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 11 Base level 0, inputs: [26(543KB)], [24(18MB)] Oct 14 06:12:49 localhost ceph-mon[317114]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760436769263411, "job": 12, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [26], "files_L6": [24], "score": -1, "input_data_size": 20277601, "oldest_snapshot_seqno": -1} Oct 14 06:12:49 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e107 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Oct 14 06:12:49 localhost ceph-mon[317114]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 12] Generated table #27: 12394 keys, 16999382 bytes, temperature: kUnknown Oct 14 06:12:49 localhost ceph-mon[317114]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760436769376556, "cf_name": "default", "job": 12, "event": "table_file_creation", "file_number": 27, "file_size": 16999382, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 16928475, "index_size": 38806, "index_partitions": 0, 
"top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 31045, "raw_key_size": 335575, "raw_average_key_size": 27, "raw_value_size": 16716892, "raw_average_value_size": 1348, "num_data_blocks": 1456, "num_entries": 12394, "num_filter_entries": 12394, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1760436394, "oldest_key_time": 0, "file_creation_time": 1760436769, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "c006d4a7-7ae8-4cd3-a1b5-95f5bf3c427c", "db_session_id": "ZS6W3A0Q266OBGAU6ARP", "orig_file_number": 27, "seqno_to_time_mapping": "N/A"}} Oct 14 06:12:49 localhost ceph-mon[317114]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. 
Oct 14 06:12:49 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:12:49.377039) [db/compaction/compaction_job.cc:1663] [default] [JOB 12] Compacted 1@0 + 1@6 files to L6 => 16999382 bytes Oct 14 06:12:49 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:12:49.378568) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 178.9 rd, 150.0 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.5, 18.8 +0.0 blob) out(16.2 +0.0 blob), read-write-amplify(67.0) write-amplify(30.6) OK, records in: 12922, records dropped: 528 output_compression: NoCompression Oct 14 06:12:49 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:12:49.378605) EVENT_LOG_v1 {"time_micros": 1760436769378586, "job": 12, "event": "compaction_finished", "compaction_time_micros": 113345, "compaction_time_cpu_micros": 47804, "output_level": 6, "num_output_files": 1, "total_output_size": 16999382, "num_input_records": 12922, "num_output_records": 12394, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]} Oct 14 06:12:49 localhost ceph-mon[317114]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005486733/store.db/000026.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Oct 14 06:12:49 localhost ceph-mon[317114]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760436769378897, "job": 12, "event": "table_file_deletion", "file_number": 26} Oct 14 06:12:49 localhost ceph-mon[317114]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005486733/store.db/000024.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Oct 14 06:12:49 localhost ceph-mon[317114]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760436769383942, 
"job": 12, "event": "table_file_deletion", "file_number": 24} Oct 14 06:12:49 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:12:49.263196) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Oct 14 06:12:49 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:12:49.384039) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Oct 14 06:12:49 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:12:49.384049) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Oct 14 06:12:49 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:12:49.384053) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Oct 14 06:12:49 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:12:49.384057) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Oct 14 06:12:49 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:12:49.384060) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Oct 14 06:12:49 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:12:49.493 271987 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-10-14T10:12:49Z, description=, device_id=3d72ab9a-51f6-4cc3-a52f-4cf6d093ccf2, device_owner=network:router_gateway, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=60101be1-7a53-4653-95e9-55bbff000eb5, ip_allocation=immediate, mac_address=fa:16:3e:00:72:0d, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-10-14T08:36:38Z, description=, dns_domain=, id=c0145816-4627-44f2-af00-ccc9ef0436ed, ipv4_address_scope=None, 
ipv6_address_scope=None, is_default=False, l2_adjacency=True, mtu=1350, name=public, port_security_enabled=True, project_id=41187b090f3d4818a32baa37ce8a3991, provider:network_type=flat, provider:physical_network=datacentre, provider:segmentation_id=None, qos_policy_id=None, revision_number=2, router:external=True, shared=False, standard_attr_id=29, status=ACTIVE, subnets=['bbd40dc1-97d8-4ed3-9d4f-9b3af758b526'], tags=[], tenant_id=41187b090f3d4818a32baa37ce8a3991, updated_at=2025-10-14T08:36:44Z, vlan_transparent=None, network_id=c0145816-4627-44f2-af00-ccc9ef0436ed, port_security_enabled=False, project_id=, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=864, status=DOWN, tags=[], tenant_id=, updated_at=2025-10-14T10:12:49Z on network c0145816-4627-44f2-af00-ccc9ef0436ed#033[00m Oct 14 06:12:49 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e108 e108: 6 total, 6 up, 6 in Oct 14 06:12:49 localhost nova_compute[297686]: 2025-10-14 10:12:49.609 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:12:49 localhost dnsmasq[272303]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/addn_hosts - 4 addresses Oct 14 06:12:49 localhost dnsmasq-dhcp[272303]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/host Oct 14 06:12:49 localhost dnsmasq-dhcp[272303]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/opts Oct 14 06:12:49 localhost podman[325176]: 2025-10-14 10:12:49.709916214 +0000 UTC m=+0.046800409 container kill 373c3d29f2bfadc43d54cb6930a55bb623f8f4f5957bebc53e3e7158ee295964 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0145816-4627-44f2-af00-ccc9ef0436ed, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, 
tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS) Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.819 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'name': 'test', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005486733.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '41187b090f3d4818a32baa37ce8a3991', 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'hostId': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228 Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.819 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.828 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.828 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.830 12 ERROR 
oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ab11f779-cbd1-41a7-b8d8-84abb18e51ac', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T10:12:49.819617', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '56a8140c-a8e6-11f0-9707-fa163e99780b', 'monotonic_time': 12786.012295439, 'message_signature': 'fb4783e01a3b0c74f00fc949973689f8c20ec0dce9aa59de01ea2931612c1c7c'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T10:12:49.819617', 'resource_metadata': {'display_name': 'test', 
'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '56a82244-a8e6-11f0-9707-fa163e99780b', 'monotonic_time': 12786.012295439, 'message_signature': '632595d6e7611019bbded72174c6d058beb70afd6db3c29de744bb690394b23e'}]}, 'timestamp': '2025-10-14 10:12:49.829067', '_unique_id': '200ca359ef5e486ea33f0c0e6a33a0ca'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.830 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.830 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.830 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.830 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.830 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 
10:12:49.830 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.830 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.830 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.830 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.830 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.830 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.830 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.830 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.830 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.830 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.830 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 
129, in connect Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.830 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.830 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.830 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.830 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.830 12 ERROR oslo_messaging.notify.messaging Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.830 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.830 12 ERROR oslo_messaging.notify.messaging Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.830 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.830 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.830 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.830 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:12:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.830 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.830 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.830 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.830 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.830 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.830 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.830 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.830 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.830 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.830 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get
Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.830 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.830 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.830 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.830 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.830 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.830 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.830 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.830 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.830 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.830 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.830 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.830 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.830 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.830 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.830 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.830 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.830 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.831 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.834 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.incoming.bytes volume: 8190 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.835 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1a7017e6-b70f-4e84-a27d-b077358608c6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 8190, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T10:12:49.831143', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': '56a8f6ce-a8e6-11f0-9707-fa163e99780b', 'monotonic_time': 12786.023829193, 'message_signature': 'ad1964c829b0f256a0b612f69fa96e8df6cdf827a953b1503c9df0ea2066735f'}]}, 'timestamp': '2025-10-14 10:12:49.834522', '_unique_id': '5483bb1e4da943afa07338738298456a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.835 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.835 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.835 12 ERROR oslo_messaging.notify.messaging yield
Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.835 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.835 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.835 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.835 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.835 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.835 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.835 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.835 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.835 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.835 12 ERROR oslo_messaging.notify.messaging conn.connect()
Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.835 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.835 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.835 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.835 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.835 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.835 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.835 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.835 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.835 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.835 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.835 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.835 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.835 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.835 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.835 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.835 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.835 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.835 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.835 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.835 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.835 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.835 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.835 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.835 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.835 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.835 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.835 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.835 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.835 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.835 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.835 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.835 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.835 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.835 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.835 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.835 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.835 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.835 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.835 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.835 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.835 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.836 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.849 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/memory.usage volume: 51.81640625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.850 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c5f89b3f-ce78-4b3d-9aeb-4135006be221', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.81640625, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'timestamp': '2025-10-14T10:12:49.836166', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '56ab3baa-a8e6-11f0-9707-fa163e99780b', 'monotonic_time': 12786.04165877, 'message_signature': '31b4de5e6fd67c76c3ee4afcc1f159bcdb17b3b56878135a0d4bfbe464bf47ce'}]}, 'timestamp': '2025-10-14 10:12:49.849382', '_unique_id': '647afd7cdc19477a867e9180b767194a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.850 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.850 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.850 12 ERROR oslo_messaging.notify.messaging yield
Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.850 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.850 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.850 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.850 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.850 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.850 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.850 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.850 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.850 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.850 12 ERROR oslo_messaging.notify.messaging conn.connect()
Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.850 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.850 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.850 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.850 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.850 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.850 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.850 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.850 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.850 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.850 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.850 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.850 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.850 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.850 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.850 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.850 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.850 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.850 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.850 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.850 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.850 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.850 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.850 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.850 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.850 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.850 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.850 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.850 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.850 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.850 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.850 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.850 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.850 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.850 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.850 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.850 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.850 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.850 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.850 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.850 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.850 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.851 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.851 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.867 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.write.latency volume: 1178650666 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.867 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.write.latency volume: 22746151 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.868 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a05cf0c0-7625-43a0-a614-77c4760db35d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1178650666, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T10:12:49.851205', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '56ae0ae2-a8e6-11f0-9707-fa163e99780b', 'monotonic_time': 12786.043881919, 'message_signature': 'd1edf4ff10efd18d6a735918dee6010acd930a3682f47dac8f85f0f9b682c3bb'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 22746151, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T10:12:49.851205', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '56ae16ae-a8e6-11f0-9707-fa163e99780b', 'monotonic_time': 12786.043881919, 'message_signature': '5103a07cadc97123d2556f40f406828de37e6ca0db21bec0b1fb3c321d421746'}]}, 'timestamp': '2025-10-14 10:12:49.868079', '_unique_id': 'b3c68071c33340659d7e594418e1ec03'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.868 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.868 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.868 12 ERROR oslo_messaging.notify.messaging yield
Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.868 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.868 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.868 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.868 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.868 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.868 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.868 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.868 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.868 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.868 12 ERROR oslo_messaging.notify.messaging conn.connect()
Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.868 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.868 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.868 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.868 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.868 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.868 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.868 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.868 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.868 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.868 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.868 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.868 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.868 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.868 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.868 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.868 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.868 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.868 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.868 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.868 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.868 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.868 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.868 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.868 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98,
in get Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.868 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.868 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.868 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.868 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.868 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.868 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.868 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.868 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.868 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.868 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:12:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.868 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.868 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.868 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.868 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.868 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.868 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.868 12 ERROR oslo_messaging.notify.messaging Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.869 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.869 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.870 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'f7f6b6a2-e41a-4071-8c9d-56f35172ef66', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T10:12:49.869481', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': '56ae586c-a8e6-11f0-9707-fa163e99780b', 'monotonic_time': 12786.023829193, 'message_signature': '8b9f416b142dc5d4e7733c707d4ae8511ffc653da8d83b749789379bf5601edf'}]}, 'timestamp': '2025-10-14 10:12:49.869803', '_unique_id': '65ec5c02f86340039f9786c3edb73d94'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.870 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:12:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.870 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.870 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.870 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.870 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.870 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.870 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.870 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.870 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.870 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.870 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.870 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.870 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.870 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.870 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.870 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.870 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.870 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.870 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.870 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.870 12 ERROR oslo_messaging.notify.messaging Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.870 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.870 12 ERROR oslo_messaging.notify.messaging Oct 14 06:12:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.870 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.870 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.870 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.870 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.870 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.870 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.870 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.870 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.870 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.870 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.870 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.870 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.870 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.870 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.870 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.870 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.870 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.870 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.870 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.870 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:12:49 
localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.870 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.870 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.870 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.870 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.870 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.870 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.870 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.870 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.870 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.870 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.870 12 ERROR oslo_messaging.notify.messaging Oct 14 06:12:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.871 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.871 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.read.bytes volume: 35560448 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.871 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.872 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f5770303-af7d-4f2f-8672-0d5daa7d3cf7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35560448, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T10:12:49.871163', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '56ae99bc-a8e6-11f0-9707-fa163e99780b', 'monotonic_time': 12786.043881919, 'message_signature': '9e2c4b4aeae3f6aeab7b6a043eb512a1ee9f63306dd5b5b0b3e44b1c3b2df29d'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T10:12:49.871163', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '56aea3f8-a8e6-11f0-9707-fa163e99780b', 'monotonic_time': 12786.043881919, 'message_signature': 'd06133573b5d6297eaf406f0fb332d5e082e9f4eb79b74857f514f35ad004ac3'}]}, 'timestamp': '2025-10-14 10:12:49.871711', '_unique_id': 'd5669fa5ec1344e98f403c95c999669d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.872 12 
ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.872 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.872 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.872 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.872 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.872 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.872 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.872 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.872 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.872 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.872 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:12:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.872 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.872 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.872 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.872 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.872 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.872 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.872 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.872 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.872 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.872 12 ERROR oslo_messaging.notify.messaging Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.872 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 
2025-10-14 10:12:49.872 12 ERROR oslo_messaging.notify.messaging Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.872 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.872 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.872 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.872 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.872 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.872 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.872 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.872 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.872 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.872 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.872 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.872 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.872 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.872 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.872 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.872 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.872 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.872 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.872 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.872 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.872 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.872 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.872 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.872 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.872 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.872 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.872 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.872 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.872 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.872 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:12:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.872 12 ERROR oslo_messaging.notify.messaging Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.872 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.873 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.read.latency volume: 1739521236 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.873 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.read.latency volume: 110324003 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.874 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '62278d16-0df4-40f7-b8c0-b628fda6761b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1739521236, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T10:12:49.873074', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '56aee4ee-a8e6-11f0-9707-fa163e99780b', 'monotonic_time': 12786.043881919, 'message_signature': '7466777b9ee2797995da1f55e09a7180c3c76f4c7d35457235eaf7038ef4b775'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 110324003, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T10:12:49.873074', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '56aeef20-a8e6-11f0-9707-fa163e99780b', 'monotonic_time': 12786.043881919, 'message_signature': 'c8987d93a868a50b5fe57f8dd63d18e27a5e4643fea163d8a0b73f2e7fa0aa5f'}]}, 'timestamp': '2025-10-14 10:12:49.873615', '_unique_id': '1b09074e057b49c7bb78ade9f1f21c50'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.874 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.874 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.874 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.874 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.874 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.874 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.874 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.874 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.874 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.874 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.874 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.874 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.874 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.874 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.874 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.874 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 
06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.874 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.874 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.874 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.874 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.874 12 ERROR oslo_messaging.notify.messaging Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.874 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.874 12 ERROR oslo_messaging.notify.messaging Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.874 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.874 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.874 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.874 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:12:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.874 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.874 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.874 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.874 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.874 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.874 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.874 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.874 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.874 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.874 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.874 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.874 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.874 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.874 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.874 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.874 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.874 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.874 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.874 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.874 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:12:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.874 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.874 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.874 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.874 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.874 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.874 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.874 12 ERROR oslo_messaging.notify.messaging Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.874 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.875 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.875 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.incoming.packets volume: 79 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:12:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.876 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f2072b03-b286-4d13-8492-2fb8e0fd5188', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 79, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T10:12:49.875090', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': '56af3372-a8e6-11f0-9707-fa163e99780b', 'monotonic_time': 12786.023829193, 'message_signature': '7ef94f18574cbd992775037bb4c60b7aba4363b5d497780957cd3810a82dc9d7'}]}, 'timestamp': '2025-10-14 10:12:49.875385', '_unique_id': 'cea67b581f31403890e6ac1ef71ec311'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 
2025-10-14 10:12:49.876 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.876 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.876 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.876 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.876 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.876 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.876 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.876 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.876 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.876 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.876 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:12:49 
localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.876 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.876 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.876 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.876 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.876 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.876 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.876 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.876 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.876 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.876 12 ERROR oslo_messaging.notify.messaging Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.876 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:12:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.876 12 ERROR oslo_messaging.notify.messaging Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.876 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.876 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.876 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.876 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.876 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.876 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.876 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.876 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.876 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.876 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.876 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.876 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.876 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.876 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.876 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.876 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.876 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.876 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.876 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.876 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.876 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.876 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.876 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.876 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.876 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.876 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.876 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.876 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.876 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.876 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 
06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.876 12 ERROR oslo_messaging.notify.messaging Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.876 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.876 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.877 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.877 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '14f36f8f-1848-4359-aa6f-11346281c02c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T10:12:49.876762', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '56af7468-a8e6-11f0-9707-fa163e99780b', 'monotonic_time': 12786.012295439, 'message_signature': '010be95b685520b96ba4f60a0b1668e973dba720a2c59fb8fc1b115420ecb645'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T10:12:49.876762', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 
'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '56af7e86-a8e6-11f0-9707-fa163e99780b', 'monotonic_time': 12786.012295439, 'message_signature': '18263652167a503ffd033eb7197f7ee390403fb2d88f6eb5e857578855a3081a'}]}, 'timestamp': '2025-10-14 10:12:49.877286', '_unique_id': 'e6d0243f7ffb44278a8c1534022ac4ae'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.877 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.877 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.877 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.877 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.877 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.877 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.877 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.877 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.877 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.877 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.877 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.877 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.877 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.877 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.877 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.877 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 06:12:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.877 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.877 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.877 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.877 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.877 12 ERROR oslo_messaging.notify.messaging Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.877 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.877 12 ERROR oslo_messaging.notify.messaging Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.877 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.877 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.877 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.877 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 
10:12:49.877 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.877 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.877 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.877 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.877 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.877 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.877 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.877 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.877 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.877 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 06:12:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.877 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.877 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.877 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.877 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.877 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.877 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.877 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.877 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.877 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.877 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 
10:12:49.877 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.877 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.877 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.877 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.877 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.877 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.877 12 ERROR oslo_messaging.notify.messaging Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.878 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.878 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.879 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '52777cbf-0107-49f0-8fcc-83a622cca856', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T10:12:49.878638', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': '56afbebe-a8e6-11f0-9707-fa163e99780b', 'monotonic_time': 12786.023829193, 'message_signature': '9b1daa7f3d07ac043901baecd6be4e9ff3303ce5911f1818511412a8338a9133'}]}, 'timestamp': '2025-10-14 10:12:49.878951', '_unique_id': 'e6929e42b2c8495b8644d8cbe023357a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.879 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 
2025-10-14 10:12:49.879 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.879 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.879 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.879 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.879 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.879 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.879 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.879 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.879 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.879 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.879 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.879 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.879 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.879 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.879 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.879 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.879 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.879 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.879 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.879 12 ERROR oslo_messaging.notify.messaging Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.879 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.879 12 ERROR oslo_messaging.notify.messaging Oct 14 06:12:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.879 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.879 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.879 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.879 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.879 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.879 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.879 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.879 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.879 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.879 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.879 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.879 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.879 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.879 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.879 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.879 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.879 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.879 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.879 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.879 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:12:49 
localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.879 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.879 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.879 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.879 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.879 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.879 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.879 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.879 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.879 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.879 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.879 12 ERROR oslo_messaging.notify.messaging Oct 14 06:12:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.880 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.880 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.write.requests volume: 53 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.880 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.881 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2263fca5-2dc0-4178-87d3-5e9c9498718a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 53, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T10:12:49.880265', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '56affd48-a8e6-11f0-9707-fa163e99780b', 'monotonic_time': 12786.043881919, 'message_signature': 'cc643abb9970c362451b7e165d728781a30142e7f93f8928f76adc9d72824812'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T10:12:49.880265', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '56b00856-a8e6-11f0-9707-fa163e99780b', 'monotonic_time': 12786.043881919, 'message_signature': 'ab031fe91205fa4244e449a0d5a9058338a429a397fd2ce9886f5bcd19d71a4c'}]}, 'timestamp': '2025-10-14 10:12:49.880817', '_unique_id': 'b1fd901d70204c009b7a62214dd9a118'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.881 12 
ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.881 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.881 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.881 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.881 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.881 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:12:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.881 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.881 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.881 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.881 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.881 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.881 12 ERROR oslo_messaging.notify.messaging Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.881 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 
2025-10-14 10:12:49.881 12 ERROR oslo_messaging.notify.messaging Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.881 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.881 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.881 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.881 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.881 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.881 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.881 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.881 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.881 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.881 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.881 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.881 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.881 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.881 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.881 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.881 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.881 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.881 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:12:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.881 12 ERROR oslo_messaging.notify.messaging Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.882 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.882 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.read.requests volume: 1272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.882 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.883 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'df197046-6eeb-457d-af24-c06123b32b73', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1272, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T10:12:49.882184', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '56b0482a-a8e6-11f0-9707-fa163e99780b', 'monotonic_time': 12786.043881919, 'message_signature': '9429033d1cbc099d2c50e28f480cb00cc6cfb7667ecbea265042e28f099533a3'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T10:12:49.882184', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '56b05234-a8e6-11f0-9707-fa163e99780b', 'monotonic_time': 12786.043881919, 'message_signature': 'e8e090558909fe7199eb6a599295b61d0076ecffd72e2ee823f66b08dfda0377'}]}, 'timestamp': '2025-10-14 10:12:49.882723', '_unique_id': '8d65f033cef740a7b26148ec05d0056b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.883 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.883 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.883 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.883 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.883 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.883 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.883 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.883 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.883 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.883 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.883 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.883 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.883 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.883 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.883 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.883 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 
06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.883 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.883 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.883 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.883 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.883 12 ERROR oslo_messaging.notify.messaging Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.883 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.883 12 ERROR oslo_messaging.notify.messaging Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.883 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.883 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.883 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.883 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:12:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.883 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.883 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.883 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.883 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.883 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.883 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.883 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.883 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.883 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.883 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.883 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.883 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.883 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.883 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.883 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.883 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.883 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.883 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.883 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.883 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:12:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.883 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.883 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.883 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.883 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.883 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.883 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.883 12 ERROR oslo_messaging.notify.messaging Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.883 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.884 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.884 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '4369a68c-6881-4e22-9e90-7390e225c182', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T10:12:49.884045', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': '56b09104-a8e6-11f0-9707-fa163e99780b', 'monotonic_time': 12786.023829193, 'message_signature': '941216c17769721881b2ff950ac7b5c085592d343b261e2afe332ab0baa7a145'}]}, 'timestamp': '2025-10-14 10:12:49.884332', '_unique_id': '76fd274b4e71476683839fbf05533589'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.884 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 
2025-10-14 10:12:49.884 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.884 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.884 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.884 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.884 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.884 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.884 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.884 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.884 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.884 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.884 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.884 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.884 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.884 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.884 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.884 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.884 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.884 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.884 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.884 12 ERROR oslo_messaging.notify.messaging Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.884 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.884 12 ERROR oslo_messaging.notify.messaging Oct 14 06:12:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.884 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.884 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.884 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.884 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.884 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.884 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.884 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.884 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.884 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.884 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.884 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.884 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.884 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.884 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.884 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.884 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.884 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.884 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.884 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.884 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:12:49 
localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.884 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.884 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.884 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.884 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.884 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.884 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.884 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.884 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.884 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.884 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.884 12 ERROR oslo_messaging.notify.messaging Oct 14 06:12:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.885 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.885 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.outgoing.packets volume: 118 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.886 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cddfaac8-533f-4912-94c5-21fe22dd19b6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 118, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T10:12:49.885753', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 
'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': '56b0d3c6-a8e6-11f0-9707-fa163e99780b', 'monotonic_time': 12786.023829193, 'message_signature': '1a68db371874170eefedab65ca387e9a0c854a8fbc27e33ec7f1d9b3dc5f05e4'}]}, 'timestamp': '2025-10-14 10:12:49.886044', '_unique_id': 'd5d1a10b5d2d494c810ffdd57f32534a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.886 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.886 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.886 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.886 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.886 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.886 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.886 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.886 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.886 12 ERROR 
oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.886 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.886 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.886 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.886 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.886 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.886 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.886 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.886 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.886 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.886 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 
2025-10-14 10:12:49.886 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.886 12 ERROR oslo_messaging.notify.messaging Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.886 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.886 12 ERROR oslo_messaging.notify.messaging Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.886 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.886 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.886 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.886 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.886 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.886 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.886 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:12:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.886 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.886 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.886 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.886 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.886 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.886 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.886 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.886 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.886 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.886 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 
06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.886 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.886 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.886 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.886 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.886 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.886 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.886 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.886 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.886 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.886 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.886 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.886 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.886 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.886 12 ERROR oslo_messaging.notify.messaging Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.887 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.887 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.887 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.888 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '13cce040-26a9-4224-a3a2-1158c087e20f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T10:12:49.887446', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': '56b115ac-a8e6-11f0-9707-fa163e99780b', 'monotonic_time': 12786.023829193, 'message_signature': '3f0aab8faf111f01193a1ce2b78d2a07be7ca6c6eccabbaeb600fd3893fa30c3'}]}, 'timestamp': '2025-10-14 10:12:49.887749', '_unique_id': '0d2972881a4e4206b5dba30d25058e35'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.888 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:12:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.888 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.888 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.888 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.888 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.888 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.888 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.888 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.888 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.888 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.888 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.888 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.888 12 ERROR oslo_messaging.notify.messaging Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.888 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.888 12 ERROR oslo_messaging.notify.messaging Oct 14 06:12:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.888 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.888 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.888 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.888 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.888 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.888 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.888 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.888 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.888 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.888 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:12:49 
localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.888 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.888 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.888 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.888 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.888 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.888 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.888 12 ERROR oslo_messaging.notify.messaging Oct 14 06:12:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.888 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.889 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.889 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.890 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e24829bc-203c-4a27-beb6-3094d2913767', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T10:12:49.889137', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': 
{'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': '56b15814-a8e6-11f0-9707-fa163e99780b', 'monotonic_time': 12786.023829193, 'message_signature': 'b1142cf68a643c221c07bfd02bd756e202e0f267c6633e61b0f5d086fc817039'}]}, 'timestamp': '2025-10-14 10:12:49.889448', '_unique_id': 'eb5c11ac3bac4673ac915921b1a7e87d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.890 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.890 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.890 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.890 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.890 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.890 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.890 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:12:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.890 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.890 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.890 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.890 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.890 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.890 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.890 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.890 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.890 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.890 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.890 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.890 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.890 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.890 12 ERROR oslo_messaging.notify.messaging Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.890 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.890 12 ERROR oslo_messaging.notify.messaging Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.890 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.890 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.890 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.890 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.890 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.890 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.890 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.890 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.890 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.890 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.890 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.890 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.890 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.890 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.890 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.890 12 ERROR oslo_messaging.notify.messaging 
File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.890 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.890 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.890 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.890 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.890 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.890 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.890 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.890 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.890 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.890 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in 
__exit__ Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.890 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.890 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.890 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.890 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.890 12 ERROR oslo_messaging.notify.messaging Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.890 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.890 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.891 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'a5131e12-03e7-4647-9834-dd0023e44738', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T10:12:49.890804', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': '56b19900-a8e6-11f0-9707-fa163e99780b', 'monotonic_time': 12786.023829193, 'message_signature': 'f43d0cc6d148fbb360ed45d51eb0fd546ca6d601bac990a59e090319eb1b01e9'}]}, 'timestamp': '2025-10-14 10:12:49.891090', '_unique_id': 'c6e2d1eca83d41b6a3b7402028036744'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.891 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:12:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.891 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.891 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.891 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.891 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.891 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.891 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.891 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.891 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.891 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.891 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.891 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.891 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.891 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.891 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.891 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.891 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.891 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.891 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.891 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.891 12 ERROR oslo_messaging.notify.messaging Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.891 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.891 12 ERROR oslo_messaging.notify.messaging Oct 14 06:12:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.891 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.891 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.891 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.891 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.891 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.891 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.891 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.891 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.891 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.891 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.891 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.891 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.891 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.891 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.891 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.891 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.891 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.891 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.891 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.891 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:12:49 
localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.891 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.891 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.891 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.891 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.891 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.891 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.891 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.891 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.891 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.891 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.891 12 ERROR oslo_messaging.notify.messaging Oct 14 06:12:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.892 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.892 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.outgoing.bytes volume: 10162 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.893 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '871ac68c-30f4-4bbe-a2e6-4bcc46d2bbed', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 10162, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T10:12:49.892373', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 
'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': '56b1d636-a8e6-11f0-9707-fa163e99780b', 'monotonic_time': 12786.023829193, 'message_signature': '9e1df15e44185877500cf090d4cbe6eb73a093a0887d2efa4c7ee56e676184fa'}]}, 'timestamp': '2025-10-14 10:12:49.892657', '_unique_id': 'a4eadf12d0534d5c908d5afa9a3ea815'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.893 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.893 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.893 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.893 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.893 12 ERROR 
oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.893 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.893 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.893 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.893 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.893 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 
2025-10-14 10:12:49.893 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.893 12 ERROR oslo_messaging.notify.messaging Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.893 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.893 12 ERROR oslo_messaging.notify.messaging Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.893 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.893 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.893 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.893 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:12:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.893 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.893 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.893 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.893 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.893 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 
06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.893 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.893 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.893 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.893 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.893 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.893 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.893 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.893 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.893 12 ERROR oslo_messaging.notify.messaging Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.893 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.893 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.894 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.895 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '38f365f5-d266-424f-a124-aa0fb5751806', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T10:12:49.893967', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '56b21470-a8e6-11f0-9707-fa163e99780b', 'monotonic_time': 12786.012295439, 'message_signature': '04b322b9743eeb53ee584c4d0104dc5aebeb08e86627b948f5f0c778e7572199'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T10:12:49.893967', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 
'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '56b21fa6-a8e6-11f0-9707-fa163e99780b', 'monotonic_time': 12786.012295439, 'message_signature': '1f0af32c6b8c74865ad27db755d0ba74d2e1c87a64dbbb5bb73d320c15600124'}]}, 'timestamp': '2025-10-14 10:12:49.894520', '_unique_id': '07be75d6ec6b4665952990a2e6ae267b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.895 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.895 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.895 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.895 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.895 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.895 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.895 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.895 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.895 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.895 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.895 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.895 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.895 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.895 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.895 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.895 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 06:12:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.895 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.895 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.895 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.895 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.895 12 ERROR oslo_messaging.notify.messaging Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.895 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.895 12 ERROR oslo_messaging.notify.messaging Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.895 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.895 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.895 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.895 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 
10:12:49.895 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.895 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.895 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.895 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.895 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.895 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.895 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.895 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.895 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.895 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 06:12:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.895 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.895 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.895 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.895 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.895 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.895 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.895 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.895 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.895 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.895 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 
10:12:49.895 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.895 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.895 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.895 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.895 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.895 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.895 12 ERROR oslo_messaging.notify.messaging Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.895 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.895 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/cpu volume: 15630000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.896 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '034a3a41-fa9c-4dbd-95c8-bdb4b3b2d94f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 15630000000, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'timestamp': '2025-10-14T10:12:49.895853', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '56b25e30-a8e6-11f0-9707-fa163e99780b', 'monotonic_time': 12786.04165877, 'message_signature': '31b3d1736e07457ca91ee683cfd6ead833b1b1de57f89807f8f0e94df57d484e'}]}, 'timestamp': '2025-10-14 10:12:49.896155', '_unique_id': '4ff4c040e96c4d1486e9b686f5b00eb2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.896 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in 
_reraise_as_library_errors Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.896 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.896 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.896 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.896 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.896 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.896 12 
ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.896 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.896 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.896 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.896 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.896 12 ERROR oslo_messaging.notify.messaging Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.896 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.896 12 ERROR oslo_messaging.notify.messaging Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.896 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 
2025-10-14 10:12:49.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.896 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.896 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.896 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.896 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.896 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 
06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.896 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.896 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.896 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.896 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.896 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:12:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.896 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.896 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.896 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.896 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.896 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.896 12 ERROR oslo_messaging.notify.messaging Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.897 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters Oct 14 06:12:49 
localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.897 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.write.bytes volume: 454656 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.897 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.898 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c555f8d9-ab16-40c3-b518-a7cb2611c87b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 454656, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T10:12:49.897460', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 
'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '56b29cd8-a8e6-11f0-9707-fa163e99780b', 'monotonic_time': 12786.043881919, 'message_signature': 'f745e0a700fd72db852df3686583fca3f3b5d1526120a9249475ad6f7c09f4e1'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T10:12:49.897460', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '56b2a804-a8e6-11f0-9707-fa163e99780b', 'monotonic_time': 12786.043881919, 'message_signature': 'bfa09360cd66ce4317ea2e6f19de77581839c78a9223c89fd2d689bf6b900e4b'}]}, 'timestamp': '2025-10-14 10:12:49.898009', '_unique_id': '8762a1c0f50745c8ad743b2c4bc3cbab'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.898 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.898 12 ERROR oslo_messaging.notify.messaging 
File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.898 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.898 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.898 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.898 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.898 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:12:49 
localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.898 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.898 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.898 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.898 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.898 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.898 12 ERROR oslo_messaging.notify.messaging Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.898 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.898 12 ERROR oslo_messaging.notify.messaging Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.898 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call 
last): Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.898 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.898 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.898 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.898 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.898 12 ERROR oslo_messaging.notify.messaging 
return rpc_common.ConnectionContext(self._connection_pool, Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.898 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.898 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.898 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.898 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.898 12 ERROR oslo_messaging.notify.messaging 
self.connection.ensure_connection( Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.898 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.898 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.898 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.898 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.898 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:12:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:12:49.898 12 ERROR oslo_messaging.notify.messaging Oct 14 06:12:49 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:12:49.986 271987 INFO neutron.agent.dhcp.agent [None 
req-d767f182-d9ab-4a9e-86a0-19fa0c9f6832 - - - - - -] DHCP configuration for ports {'60101be1-7a53-4653-95e9-55bbff000eb5'} is completed#033[00m Oct 14 06:12:50 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e109 e109: 6 total, 6 up, 6 in Oct 14 06:12:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041. Oct 14 06:12:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857. Oct 14 06:12:50 localhost podman[325196]: 2025-10-14 10:12:50.728064439 +0000 UTC m=+0.069801565 container health_status 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Oct 14 06:12:50 localhost podman[325196]: 2025-10-14 10:12:50.740099879 +0000 UTC m=+0.081837015 container exec_died 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, 
maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct 14 06:12:50 localhost systemd[1]: 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041.service: Deactivated successfully.
Oct 14 06:12:50 localhost podman[325197]: 2025-10-14 10:12:50.839465402 +0000 UTC m=+0.176340168 container health_status 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro',
'/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251009, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, io.buildah.version=1.41.3, tcib_managed=true)
Oct 14 06:12:50 localhost podman[325197]: 2025-10-14 10:12:50.873517708 +0000 UTC m=+0.210392484 container exec_died 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro',
'/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_id=ovn_metadata_agent, tcib_managed=true)
Oct 14 06:12:50 localhost systemd[1]: 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857.service: Deactivated successfully.
Oct 14 06:12:51 localhost nova_compute[297686]: 2025-10-14 10:12:51.093 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 06:12:51 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e110 e110: 6 total, 6 up, 6 in
Oct 14 06:12:52 localhost nova_compute[297686]: 2025-10-14 10:12:52.470 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 06:12:52 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e111 e111: 6 total, 6 up, 6 in
Oct 14 06:12:53 localhost nova_compute[297686]: 2025-10-14 10:12:53.635 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 06:12:53 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e112 e112: 6 total, 6 up, 6 in
Oct 14 06:12:53 localhost nova_compute[297686]: 2025-10-14 10:12:53.988 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 06:12:54 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e112
_set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct 14 06:12:54 localhost nova_compute[297686]: 2025-10-14 10:12:54.611 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 06:12:54 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e113 e113: 6 total, 6 up, 6 in
Oct 14 06:12:55 localhost neutron_sriov_agent[264974]: 2025-10-14 10:12:55.028 2 INFO neutron.agent.securitygroups_rpc [req-e6959410-d129-480e-a147-8e98f7e24fc8 req-7c458f69-da5c-46ee-8923-ac6067975969 c2de1fcd0fbe455e9592b601274dbbf7 c78e5db87e954cd8b794aa988dac4a81 - - default default] Security group rule updated ['782652d1-5f0f-4241-8596-761d80284e94']
Oct 14 06:12:55 localhost neutron_sriov_agent[264974]: 2025-10-14 10:12:55.257 2 INFO neutron.agent.securitygroups_rpc [req-f02a1572-fe5a-494f-9113-9ece324eca7c req-2fd4338c-91aa-46fe-843e-9bf1a0f505c0 dcb5a2297cd24cb99021d3afeeb30262 13c2d838c66c4141a3a77483b40ab737 - - default default] Security group rule updated ['67264956-a547-41ed-9237-0e5135302f4b']
Oct 14 06:12:55 localhost neutron_sriov_agent[264974]: 2025-10-14 10:12:55.356 2 INFO neutron.agent.securitygroups_rpc [req-c1c2d624-3b37-4668-ac64-58d3f3ff9e40 req-99ad6fa3-187d-42b9-93c5-2b32fe61d110 c2de1fcd0fbe455e9592b601274dbbf7 c78e5db87e954cd8b794aa988dac4a81 - - default default] Security group rule updated ['782652d1-5f0f-4241-8596-761d80284e94']
Oct 14 06:12:55 localhost neutron_sriov_agent[264974]: 2025-10-14 10:12:55.684 2 INFO neutron.agent.securitygroups_rpc [req-ab0fd27c-7932-4aec-9556-2a7b53b74d18 req-9906f85f-3384-41bb-8971-24a6a1b941ed dcb5a2297cd24cb99021d3afeeb30262 13c2d838c66c4141a3a77483b40ab737 - - default default] Security group rule updated ['bb3ad66c-6201-430c-bad3-8abc33700260']
Oct 14 06:12:56 localhost neutron_sriov_agent[264974]: 2025-10-14 10:12:56.383 2 INFO neutron.agent.securitygroups_rpc
[req-668faa43-879e-4706-8728-13d10bbf4f34 req-37f12eb2-7aab-404b-a60d-ef075536d933 dcb5a2297cd24cb99021d3afeeb30262 13c2d838c66c4141a3a77483b40ab737 - - default default] Security group rule updated ['24c17bff-f84d-497f-8b88-2810c752a476']
Oct 14 06:12:56 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e114 e114: 6 total, 6 up, 6 in
Oct 14 06:12:57 localhost neutron_sriov_agent[264974]: 2025-10-14 10:12:57.135 2 INFO neutron.agent.securitygroups_rpc [req-1daaa1a3-1076-4377-ab62-c0d0300597ee req-fd1f3398-f227-4660-b3d0-f5475073bf9e dcb5a2297cd24cb99021d3afeeb30262 13c2d838c66c4141a3a77483b40ab737 - - default default] Security group rule updated ['b785b874-262d-4aab-b7de-c29b99611a13']
Oct 14 06:12:57 localhost neutron_sriov_agent[264974]: 2025-10-14 10:12:57.310 2 INFO neutron.agent.securitygroups_rpc [None req-74f3ff50-6220-41d6-9c41-09243b4b19a0 e149b330d384449aa335bc66ff84b21a 4c1ab5e91446409bbe9b95f0f44fd3af - - default default] Security group member updated ['c19816b0-9715-42c1-a697-9db8e13e1f7e']
Oct 14 06:12:57 localhost ovn_metadata_agent[163050]: 2025-10-14 10:12:57.446 163055 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=11, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'b6:6b:50', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '6a:59:81:01:bc:8b'}, ipsec=False) old=SB_Global(nb_cfg=10) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 14 06:12:57 localhost ovn_metadata_agent[163050]: 2025-10-14 10:12:57.448 163055 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run
/usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Oct 14 06:12:57 localhost nova_compute[297686]: 2025-10-14 10:12:57.512 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 06:12:57 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e115 e115: 6 total, 6 up, 6 in
Oct 14 06:12:57 localhost ovn_metadata_agent[163050]: 2025-10-14 10:12:57.782 163055 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 14 06:12:57 localhost ovn_metadata_agent[163050]: 2025-10-14 10:12:57.783 163055 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 14 06:12:57 localhost ovn_metadata_agent[163050]: 2025-10-14 10:12:57.783 163055 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 14 06:12:58 localhost neutron_sriov_agent[264974]: 2025-10-14 10:12:58.043 2 INFO neutron.agent.securitygroups_rpc [req-b57563d1-e133-4056-b375-dfcfffb3c465 req-ba99f2c8-dfe0-40ed-b8ed-2897051ad36a dcb5a2297cd24cb99021d3afeeb30262 13c2d838c66c4141a3a77483b40ab737 - - default default] Security group rule updated ['7345ec05-011e-4ded-86dc-94bcbcf0917f']
Oct 14 06:12:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad.
Oct 14 06:12:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec.
Oct 14 06:12:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29.
Oct 14 06:12:58 localhost neutron_sriov_agent[264974]: 2025-10-14 10:12:58.111 2 INFO neutron.agent.securitygroups_rpc [None req-5e1e206d-69c6-4a29-b858-d7026e93ccfa e149b330d384449aa335bc66ff84b21a 4c1ab5e91446409bbe9b95f0f44fd3af - - default default] Security group member updated ['c19816b0-9715-42c1-a697-9db8e13e1f7e']
Oct 14 06:12:58 localhost systemd[1]: tmp-crun.ZiAWYy.mount: Deactivated successfully.
Oct 14 06:12:58 localhost podman[325237]: 2025-10-14 10:12:58.135737847 +0000 UTC m=+0.066095971 container health_status 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=multipathd, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro',
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct 14 06:12:58 localhost podman[325237]: 2025-10-14 10:12:58.143945259 +0000 UTC m=+0.074303423 container exec_died 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro',
'/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, tcib_managed=true, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct 14 06:12:58 localhost podman[325238]: 2025-10-14 10:12:58.152523292 +0000 UTC m=+0.077058477 container health_status aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Oct 14 06:12:58
localhost systemd[1]: 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad.service: Deactivated successfully.
Oct 14 06:12:58 localhost podman[325238]: 2025-10-14 10:12:58.162917112 +0000 UTC m=+0.087452277 container exec_died aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Oct 14 06:12:58 localhost systemd[1]: aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec.service: Deactivated successfully.
Oct 14 06:12:58 localhost podman[325244]: 2025-10-14 10:12:58.201525568 +0000 UTC m=+0.120671758 container health_status fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, container_name=iscsid, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Oct 14 06:12:58 localhost podman[325244]: 2025-10-14 10:12:58.213209427 +0000 UTC m=+0.132355617 container exec_died fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29
(image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, container_name=iscsid, io.buildah.version=1.41.3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d)
Oct 14 06:12:58 localhost systemd[1]: fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29.service: Deactivated successfully.
Oct 14 06:12:58 localhost podman[248187]: time="2025-10-14T10:12:58Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 14 06:12:58 localhost podman[248187]: @ - - [14/Oct/2025:10:12:58 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 147497 "" "Go-http-client/1.1"
Oct 14 06:12:58 localhost podman[248187]: @ - - [14/Oct/2025:10:12:58 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19857 "" "Go-http-client/1.1"
Oct 14 06:12:58 localhost neutron_sriov_agent[264974]: 2025-10-14 10:12:58.542 2 INFO neutron.agent.securitygroups_rpc [req-c4643eec-66e8-42c5-8b11-066dcb7f40df req-5fb78f31-841d-432b-ae05-9090fdba3779 dcb5a2297cd24cb99021d3afeeb30262 13c2d838c66c4141a3a77483b40ab737 - - default default] Security group rule updated ['7345ec05-011e-4ded-86dc-94bcbcf0917f']
Oct 14 06:12:58 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e116 e116: 6 total, 6 up, 6 in
Oct 14 06:12:58 localhost neutron_sriov_agent[264974]: 2025-10-14 10:12:58.853 2 INFO neutron.agent.securitygroups_rpc [None req-a7da228c-ba85-4de2-88a7-e29995d36c73 e149b330d384449aa335bc66ff84b21a 4c1ab5e91446409bbe9b95f0f44fd3af - - default default] Security group member updated ['c19816b0-9715-42c1-a697-9db8e13e1f7e']
Oct 14 06:12:58 localhost neutron_sriov_agent[264974]: 2025-10-14 10:12:58.946 2 INFO neutron.agent.securitygroups_rpc [req-83ebe009-585c-406e-8ba0-dd94c07e5ac0 req-514f789a-26da-41eb-9b01-d2407de9e82f dcb5a2297cd24cb99021d3afeeb30262 13c2d838c66c4141a3a77483b40ab737 - - default default] Security group rule updated ['7345ec05-011e-4ded-86dc-94bcbcf0917f']
Oct 14 06:12:59 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e116 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct 14 06:12:59 localhost nova_compute[297686]: 2025-10-14 10:12:59.643 2
DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 06:12:59 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e117 e117: 6 total, 6 up, 6 in
Oct 14 06:13:00 localhost neutron_sriov_agent[264974]: 2025-10-14 10:13:00.327 2 INFO neutron.agent.securitygroups_rpc [None req-a3d8b536-e13a-4da7-87d9-1b956ede87d9 e149b330d384449aa335bc66ff84b21a 4c1ab5e91446409bbe9b95f0f44fd3af - - default default] Security group member updated ['c19816b0-9715-42c1-a697-9db8e13e1f7e']
Oct 14 06:13:00 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e118 e118: 6 total, 6 up, 6 in
Oct 14 06:13:01 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e119 e119: 6 total, 6 up, 6 in
Oct 14 06:13:02 localhost dnsmasq[272303]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/addn_hosts - 3 addresses
Oct 14 06:13:02 localhost dnsmasq-dhcp[272303]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/host
Oct 14 06:13:02 localhost dnsmasq-dhcp[272303]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/opts
Oct 14 06:13:02 localhost podman[325315]: 2025-10-14 10:13:02.380792142 +0000 UTC m=+0.051131962 container kill 373c3d29f2bfadc43d54cb6930a55bb623f8f4f5957bebc53e3e7158ee295964 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0145816-4627-44f2-af00-ccc9ef0436ed, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 14 06:13:02 localhost ovn_controller[157396]: 2025-10-14T10:13:02Z|00101|binding|INFO|Releasing lport 25c6586a-239c-451b-aac2-e0a3ee5c3145 from this chassis (sb_readonly=0)
Oct 14 06:13:02
localhost nova_compute[297686]: 2025-10-14 10:13:02.557 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 06:13:03 localhost ovn_controller[157396]: 2025-10-14T10:13:03Z|00102|binding|INFO|Releasing lport 25c6586a-239c-451b-aac2-e0a3ee5c3145 from this chassis (sb_readonly=0)
Oct 14 06:13:03 localhost nova_compute[297686]: 2025-10-14 10:13:03.542 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 06:13:03 localhost dnsmasq[272303]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/addn_hosts - 2 addresses
Oct 14 06:13:03 localhost dnsmasq-dhcp[272303]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/host
Oct 14 06:13:03 localhost dnsmasq-dhcp[272303]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/opts
Oct 14 06:13:03 localhost podman[325351]: 2025-10-14 10:13:03.595763634 +0000 UTC m=+0.072365194 container kill 373c3d29f2bfadc43d54cb6930a55bb623f8f4f5957bebc53e3e7158ee295964 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0145816-4627-44f2-af00-ccc9ef0436ed, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251009, tcib_managed=true)
Oct 14 06:13:04 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e119 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct 14 06:13:04 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e120 e120: 6 total, 6 up, 6 in
Oct 14 06:13:04 localhost nova_compute[297686]: 2025-10-14 10:13:04.679 2 DEBUG ovsdbapp.backend.ovs_idl.vlog
[-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 06:13:05 localhost ovn_metadata_agent[163050]: 2025-10-14 10:13:05.451 163055 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=9e4b0f79-1220-4c7d-a18d-fa1a88dab362, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '11'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 14 06:13:07 localhost sshd[325373]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 06:13:07 localhost nova_compute[297686]: 2025-10-14 10:13:07.601 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 06:13:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f.
Oct 14 06:13:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917.
Oct 14 06:13:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d.
Oct 14 06:13:08 localhost systemd[1]: tmp-crun.B1LgW8.mount: Deactivated successfully.
Oct 14 06:13:08 localhost podman[325375]: 2025-10-14 10:13:08.751999711 +0000 UTC m=+0.088624303 container health_status 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_id=ovn_controller, io.buildah.version=1.41.3)
Oct 14 06:13:08 localhost openstack_network_exporter[250374]: ERROR 10:13:08 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 14 06:13:08 localhost openstack_network_exporter[250374]: ERROR 10:13:08 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 14 06:13:08 localhost openstack_network_exporter[250374]: ERROR 10:13:08 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db
server Oct 14 06:13:08 localhost openstack_network_exporter[250374]: ERROR 10:13:08 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Oct 14 06:13:08 localhost openstack_network_exporter[250374]: Oct 14 06:13:08 localhost openstack_network_exporter[250374]: ERROR 10:13:08 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Oct 14 06:13:08 localhost openstack_network_exporter[250374]: Oct 14 06:13:08 localhost podman[325376]: 2025-10-14 10:13:08.816099751 +0000 UTC m=+0.151489515 container health_status 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, config_id=edpm, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, architecture=x86_64, container_name=openstack_network_exporter, vcs-type=git, io.buildah.version=1.33.7, io.openshift.expose-services=, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc.) 
Oct 14 06:13:08 localhost podman[325377]: 2025-10-14 10:13:08.856743349 +0000 UTC m=+0.185939143 container health_status 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d) Oct 14 06:13:08 localhost podman[325377]: 2025-10-14 10:13:08.866635783 +0000 UTC m=+0.195831567 container exec_died 
89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=edpm, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team) Oct 14 06:13:08 localhost systemd[1]: 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d.service: Deactivated successfully. 
Oct 14 06:13:08 localhost podman[325376]: 2025-10-14 10:13:08.881449058 +0000 UTC m=+0.216838832 container exec_died 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, distribution-scope=public, vcs-type=git, maintainer=Red Hat, Inc., name=ubi9-minimal, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, 
io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.) Oct 14 06:13:08 localhost systemd[1]: 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917.service: Deactivated successfully. Oct 14 06:13:08 localhost podman[325375]: 2025-10-14 10:13:08.932244938 +0000 UTC m=+0.268869520 container exec_died 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator 
team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251009, config_id=ovn_controller) Oct 14 06:13:08 localhost systemd[1]: 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f.service: Deactivated successfully. Oct 14 06:13:09 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e120 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Oct 14 06:13:09 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e121 e121: 6 total, 6 up, 6 in Oct 14 06:13:09 localhost nova_compute[297686]: 2025-10-14 10:13:09.681 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:13:11 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:13:11.023 271987 INFO neutron.agent.linux.ip_lib [None req-8537f1e6-f585-4d62-87c5-8739f01ebe11 - - - - - -] Device tapdee085e1-01 cannot be used as it has no MAC address#033[00m Oct 14 06:13:11 localhost nova_compute[297686]: 2025-10-14 10:13:11.044 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:13:11 localhost kernel: device tapdee085e1-01 entered promiscuous mode Oct 14 06:13:11 localhost NetworkManager[5977]: [1760436791.0531] manager: (tapdee085e1-01): new Generic device (/org/freedesktop/NetworkManager/Devices/23) Oct 14 06:13:11 localhost ovn_controller[157396]: 2025-10-14T10:13:11Z|00103|binding|INFO|Claiming lport dee085e1-010e-4e7c-944a-6ae3934bccb7 for this chassis. 
Oct 14 06:13:11 localhost ovn_controller[157396]: 2025-10-14T10:13:11Z|00104|binding|INFO|dee085e1-010e-4e7c-944a-6ae3934bccb7: Claiming unknown Oct 14 06:13:11 localhost nova_compute[297686]: 2025-10-14 10:13:11.057 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:13:11 localhost systemd-udevd[325448]: Network interface NamePolicy= disabled on kernel command line. Oct 14 06:13:11 localhost ovn_metadata_agent[163050]: 2025-10-14 10:13:11.071 163055 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005486733.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpe1893814-19d7-5b97-af05-19e2aafa5382-5105abb4-2220-4dcf-9429-73af964fcfcf', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5105abb4-2220-4dcf-9429-73af964fcfcf', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '98b0e20a851d4229a03e25233b4b19d1', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3b5b491d-49e0-4ac2-b12e-aa7584b2aa89, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=dee085e1-010e-4e7c-944a-6ae3934bccb7) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Oct 14 06:13:11 localhost ovn_metadata_agent[163050]: 2025-10-14 10:13:11.072 163055 INFO 
neutron.agent.ovn.metadata.agent [-] Port dee085e1-010e-4e7c-944a-6ae3934bccb7 in datapath 5105abb4-2220-4dcf-9429-73af964fcfcf bound to our chassis#033[00m Oct 14 06:13:11 localhost ovn_metadata_agent[163050]: 2025-10-14 10:13:11.074 163055 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 5105abb4-2220-4dcf-9429-73af964fcfcf or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Oct 14 06:13:11 localhost ovn_metadata_agent[163050]: 2025-10-14 10:13:11.074 163159 DEBUG oslo.privsep.daemon [-] privsep: reply[e11d6be1-c5dc-4efe-b974-d4929a0289d6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Oct 14 06:13:11 localhost journal[237477]: ethtool ioctl error on tapdee085e1-01: No such device Oct 14 06:13:11 localhost journal[237477]: ethtool ioctl error on tapdee085e1-01: No such device Oct 14 06:13:11 localhost ovn_controller[157396]: 2025-10-14T10:13:11Z|00105|binding|INFO|Setting lport dee085e1-010e-4e7c-944a-6ae3934bccb7 ovn-installed in OVS Oct 14 06:13:11 localhost ovn_controller[157396]: 2025-10-14T10:13:11Z|00106|binding|INFO|Setting lport dee085e1-010e-4e7c-944a-6ae3934bccb7 up in Southbound Oct 14 06:13:11 localhost nova_compute[297686]: 2025-10-14 10:13:11.086 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:13:11 localhost journal[237477]: ethtool ioctl error on tapdee085e1-01: No such device Oct 14 06:13:11 localhost journal[237477]: ethtool ioctl error on tapdee085e1-01: No such device Oct 14 06:13:11 localhost journal[237477]: ethtool ioctl error on tapdee085e1-01: No such device Oct 14 06:13:11 localhost journal[237477]: ethtool ioctl error on tapdee085e1-01: No such device Oct 14 06:13:11 localhost journal[237477]: ethtool ioctl error on tapdee085e1-01: No 
such device Oct 14 06:13:11 localhost journal[237477]: ethtool ioctl error on tapdee085e1-01: No such device Oct 14 06:13:11 localhost nova_compute[297686]: 2025-10-14 10:13:11.119 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:13:11 localhost nova_compute[297686]: 2025-10-14 10:13:11.139 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:13:11 localhost neutron_sriov_agent[264974]: 2025-10-14 10:13:11.853 2 INFO neutron.agent.securitygroups_rpc [req-f448f9a7-e24d-43dd-bad6-e86239ce79f3 req-462ba4ec-ef16-49c8-b69f-f514a5e8e6e3 a5c8b032521c4660a9f50471da931c3a 67facb686b1a45e4af5a7329836978ce - - default default] Security group member updated ['c2c1552c-9248-46c1-8391-9c390debaa3c']#033[00m Oct 14 06:13:11 localhost podman[325519]: Oct 14 06:13:11 localhost podman[325519]: 2025-10-14 10:13:11.90466067 +0000 UTC m=+0.089151170 container create b56ac32e11c19003e94e601420bacf63a41184e6198d220ce7aab41a932a8ab9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5105abb4-2220-4dcf-9429-73af964fcfcf, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, io.buildah.version=1.41.3) Oct 14 06:13:11 localhost systemd[1]: Started libpod-conmon-b56ac32e11c19003e94e601420bacf63a41184e6198d220ce7aab41a932a8ab9.scope. Oct 14 06:13:11 localhost podman[325519]: 2025-10-14 10:13:11.859542834 +0000 UTC m=+0.044033404 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Oct 14 06:13:11 localhost systemd[1]: Started libcrun container. 
Oct 14 06:13:11 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0d5c80661321bdf5f0aa2987b744edc467c8894c3e72d3dfa565600fc5f08c39/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Oct 14 06:13:11 localhost podman[325519]: 2025-10-14 10:13:11.990152086 +0000 UTC m=+0.174642586 container init b56ac32e11c19003e94e601420bacf63a41184e6198d220ce7aab41a932a8ab9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5105abb4-2220-4dcf-9429-73af964fcfcf, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image) Oct 14 06:13:12 localhost systemd[1]: tmp-crun.Gv6XQY.mount: Deactivated successfully. Oct 14 06:13:12 localhost podman[325519]: 2025-10-14 10:13:12.001430893 +0000 UTC m=+0.185921393 container start b56ac32e11c19003e94e601420bacf63a41184e6198d220ce7aab41a932a8ab9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5105abb4-2220-4dcf-9429-73af964fcfcf, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.build-date=20251009) Oct 14 06:13:12 localhost dnsmasq[325537]: started, version 2.85 cachesize 150 Oct 14 06:13:12 localhost dnsmasq[325537]: DNS service limited to local subnets Oct 14 06:13:12 localhost dnsmasq[325537]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify 
dumpfile Oct 14 06:13:12 localhost dnsmasq[325537]: warning: no upstream servers configured Oct 14 06:13:12 localhost dnsmasq-dhcp[325537]: DHCP, static leases only on 10.100.0.0, lease time 1d Oct 14 06:13:12 localhost dnsmasq[325537]: read /var/lib/neutron/dhcp/5105abb4-2220-4dcf-9429-73af964fcfcf/addn_hosts - 0 addresses Oct 14 06:13:12 localhost dnsmasq-dhcp[325537]: read /var/lib/neutron/dhcp/5105abb4-2220-4dcf-9429-73af964fcfcf/host Oct 14 06:13:12 localhost dnsmasq-dhcp[325537]: read /var/lib/neutron/dhcp/5105abb4-2220-4dcf-9429-73af964fcfcf/opts Oct 14 06:13:12 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:13:12.254 271987 INFO neutron.agent.dhcp.agent [None req-20248d08-11be-4966-b20f-7b21826c5e7f - - - - - -] DHCP configuration for ports {'e8c48610-b85a-4d2e-9dc3-8628e5f58e6e'} is completed#033[00m Oct 14 06:13:12 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e122 e122: 6 total, 6 up, 6 in Oct 14 06:13:12 localhost nova_compute[297686]: 2025-10-14 10:13:12.637 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:13:13 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:13:13.286 271987 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-10-14T10:13:12Z, description=, device_id=49fbd0f5-475f-4594-9e77-7ed11ceb655b, device_owner=network:router_gateway, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=438a787a-6d89-44af-b1ac-67ef2d230541, ip_allocation=immediate, mac_address=fa:16:3e:86:f3:77, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-10-14T08:36:38Z, description=, dns_domain=, id=c0145816-4627-44f2-af00-ccc9ef0436ed, ipv4_address_scope=None, ipv6_address_scope=None, 
is_default=False, l2_adjacency=True, mtu=1350, name=public, port_security_enabled=True, project_id=41187b090f3d4818a32baa37ce8a3991, provider:network_type=flat, provider:physical_network=datacentre, provider:segmentation_id=None, qos_policy_id=None, revision_number=2, router:external=True, shared=False, standard_attr_id=29, status=ACTIVE, subnets=['bbd40dc1-97d8-4ed3-9d4f-9b3af758b526'], tags=[], tenant_id=41187b090f3d4818a32baa37ce8a3991, updated_at=2025-10-14T08:36:44Z, vlan_transparent=None, network_id=c0145816-4627-44f2-af00-ccc9ef0436ed, port_security_enabled=False, project_id=, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1042, status=DOWN, tags=[], tenant_id=, updated_at=2025-10-14T10:13:13Z on network c0145816-4627-44f2-af00-ccc9ef0436ed#033[00m Oct 14 06:13:13 localhost neutron_sriov_agent[264974]: 2025-10-14 10:13:13.397 2 INFO neutron.agent.securitygroups_rpc [None req-b162a273-2c5c-4b82-977d-185565862f6c e654b0e5afc74f6c8660c559a7d225d2 a2d4f9e7e0df4c00a4b53d184050c204 - - default default] Security group member updated ['e76cc9ce-8b06-463e-9791-181ca08926cd']#033[00m Oct 14 06:13:13 localhost dnsmasq[272303]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/addn_hosts - 3 addresses Oct 14 06:13:13 localhost systemd[1]: tmp-crun.9S98NV.mount: Deactivated successfully. 
Oct 14 06:13:13 localhost dnsmasq-dhcp[272303]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/host Oct 14 06:13:13 localhost podman[325555]: 2025-10-14 10:13:13.483538092 +0000 UTC m=+0.052542245 container kill 373c3d29f2bfadc43d54cb6930a55bb623f8f4f5957bebc53e3e7158ee295964 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0145816-4627-44f2-af00-ccc9ef0436ed, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true) Oct 14 06:13:13 localhost dnsmasq-dhcp[272303]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/opts Oct 14 06:13:13 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:13:13.707 271987 INFO neutron.agent.dhcp.agent [None req-b9316a26-de22-46e7-b7c9-06ef7d897ca2 - - - - - -] DHCP configuration for ports {'438a787a-6d89-44af-b1ac-67ef2d230541'} is completed#033[00m Oct 14 06:13:14 localhost neutron_sriov_agent[264974]: 2025-10-14 10:13:14.153 2 INFO neutron.agent.securitygroups_rpc [None req-424a51ba-7455-4cd3-8a57-ddbc38b4f9ef e654b0e5afc74f6c8660c559a7d225d2 a2d4f9e7e0df4c00a4b53d184050c204 - - default default] Security group member updated ['e76cc9ce-8b06-463e-9791-181ca08926cd']#033[00m Oct 14 06:13:14 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e122 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Oct 14 06:13:14 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:13:14.436 271987 INFO neutron.agent.linux.ip_lib [None req-23433998-60fb-4abc-9db5-142332339af1 - - - - - -] Device tap54476a46-32 cannot be used as it has no MAC address#033[00m Oct 14 06:13:14 localhost nova_compute[297686]: 2025-10-14 10:13:14.471 2 
DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:13:14 localhost kernel: device tap54476a46-32 entered promiscuous mode Oct 14 06:13:14 localhost NetworkManager[5977]: [1760436794.4809] manager: (tap54476a46-32): new Generic device (/org/freedesktop/NetworkManager/Devices/24) Oct 14 06:13:14 localhost ovn_controller[157396]: 2025-10-14T10:13:14Z|00107|binding|INFO|Claiming lport 54476a46-32a2-4a2a-aa3e-5703e42659b5 for this chassis. Oct 14 06:13:14 localhost ovn_controller[157396]: 2025-10-14T10:13:14Z|00108|binding|INFO|54476a46-32a2-4a2a-aa3e-5703e42659b5: Claiming unknown Oct 14 06:13:14 localhost systemd-udevd[325587]: Network interface NamePolicy= disabled on kernel command line. Oct 14 06:13:14 localhost nova_compute[297686]: 2025-10-14 10:13:14.483 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:13:14 localhost ovn_controller[157396]: 2025-10-14T10:13:14Z|00109|binding|INFO|Releasing lport a4b30293-434d-4d8b-b6ad-840c82777955 from this chassis (sb_readonly=1) Oct 14 06:13:14 localhost ovn_controller[157396]: 2025-10-14T10:13:14Z|00110|if_status|INFO|Not setting lport a4b30293-434d-4d8b-b6ad-840c82777955 down as sb is readonly Oct 14 06:13:14 localhost ovn_controller[157396]: 2025-10-14T10:13:14Z|00111|binding|INFO|Setting lport a4b30293-434d-4d8b-b6ad-840c82777955 down in Southbound Oct 14 06:13:14 localhost ovn_metadata_agent[163050]: 2025-10-14 10:13:14.491 163055 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005486733.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], 
external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpe1893814-19d7-5b97-af05-19e2aafa5382-2e89b669-126b-40c1-acd1-6f67bd63ad7c', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2e89b669-126b-40c1-acd1-6f67bd63ad7c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'aadbca62f85049bbb5689b00ddbce91d', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=020c7cea-6d1a-46a8-a024-3c5fb878cafa, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=54476a46-32a2-4a2a-aa3e-5703e42659b5) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Oct 14 06:13:14 localhost ovn_metadata_agent[163050]: 2025-10-14 10:13:14.492 163055 INFO neutron.agent.ovn.metadata.agent [-] Port 54476a46-32a2-4a2a-aa3e-5703e42659b5 in datapath 2e89b669-126b-40c1-acd1-6f67bd63ad7c bound to our chassis#033[00m Oct 14 06:13:14 localhost ovn_metadata_agent[163050]: 2025-10-14 10:13:14.494 163055 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 2e89b669-126b-40c1-acd1-6f67bd63ad7c or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Oct 14 06:13:14 localhost ovn_metadata_agent[163050]: 2025-10-14 10:13:14.494 163159 DEBUG oslo.privsep.daemon [-] privsep: reply[55b58f1b-9058-415a-9c36-a361680b946f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Oct 14 06:13:14 localhost ovn_metadata_agent[163050]: 2025-10-14 10:13:14.499 163055 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: 
PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005486733.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '192.168.122.172/24', 'neutron:device_id': 'dhcpe1893814-19d7-5b97-af05-19e2aafa5382-c0145816-4627-44f2-af00-ccc9ef0436ed', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c0145816-4627-44f2-af00-ccc9ef0436ed', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '41187b090f3d4818a32baa37ce8a3991', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005486733.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a4a79b2d-2081-4037-8963-a49d853ec2ea, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[], logical_port=a4b30293-434d-4d8b-b6ad-840c82777955) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Oct 14 06:13:14 localhost ovn_metadata_agent[163050]: 2025-10-14 10:13:14.500 163055 INFO neutron.agent.ovn.metadata.agent [-] Port a4b30293-434d-4d8b-b6ad-840c82777955 in datapath c0145816-4627-44f2-af00-ccc9ef0436ed unbound from our chassis#033[00m Oct 14 06:13:14 localhost ovn_metadata_agent[163050]: 2025-10-14 10:13:14.503 163055 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c0145816-4627-44f2-af00-ccc9ef0436ed, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Oct 14 06:13:14 localhost 
ovn_metadata_agent[163050]: 2025-10-14 10:13:14.504 163159 DEBUG oslo.privsep.daemon [-] privsep: reply[752a24f2-b076-4b7f-a1cc-ad254ee5b06a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Oct 14 06:13:14 localhost journal[237477]: ethtool ioctl error on tap54476a46-32: No such device Oct 14 06:13:14 localhost journal[237477]: ethtool ioctl error on tap54476a46-32: No such device Oct 14 06:13:14 localhost nova_compute[297686]: 2025-10-14 10:13:14.520 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:13:14 localhost kernel: device tapa4b30293-43 left promiscuous mode Oct 14 06:13:14 localhost journal[237477]: ethtool ioctl error on tap54476a46-32: No such device Oct 14 06:13:14 localhost journal[237477]: ethtool ioctl error on tap54476a46-32: No such device Oct 14 06:13:14 localhost journal[237477]: ethtool ioctl error on tap54476a46-32: No such device Oct 14 06:13:14 localhost journal[237477]: ethtool ioctl error on tap54476a46-32: No such device Oct 14 06:13:14 localhost ovn_controller[157396]: 2025-10-14T10:13:14Z|00112|binding|INFO|Setting lport 54476a46-32a2-4a2a-aa3e-5703e42659b5 ovn-installed in OVS Oct 14 06:13:14 localhost ovn_controller[157396]: 2025-10-14T10:13:14Z|00113|binding|INFO|Setting lport 54476a46-32a2-4a2a-aa3e-5703e42659b5 up in Southbound Oct 14 06:13:14 localhost nova_compute[297686]: 2025-10-14 10:13:14.539 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:13:14 localhost journal[237477]: ethtool ioctl error on tap54476a46-32: No such device Oct 14 06:13:14 localhost journal[237477]: ethtool ioctl error on tap54476a46-32: No such device Oct 14 06:13:14 localhost nova_compute[297686]: 2025-10-14 10:13:14.556 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:13:14 localhost nova_compute[297686]: 2025-10-14 10:13:14.581 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:13:14 localhost nova_compute[297686]: 2025-10-14 10:13:14.682 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:13:14 localhost nova_compute[297686]: 2025-10-14 10:13:14.964 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:13:15 localhost podman[325658]: Oct 14 06:13:15 localhost podman[325658]: 2025-10-14 10:13:15.409854597 +0000 UTC m=+0.098855048 container create 6e8f9e2e4e6b3c8a199f177d2cc7be552aee2ba2f716a0be23fdd47fe58f43d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2e89b669-126b-40c1-acd1-6f67bd63ad7c, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, io.buildah.version=1.41.3, org.label-schema.build-date=20251009) Oct 14 06:13:15 localhost systemd[1]: Started libpod-conmon-6e8f9e2e4e6b3c8a199f177d2cc7be552aee2ba2f716a0be23fdd47fe58f43d5.scope. Oct 14 06:13:15 localhost podman[325658]: 2025-10-14 10:13:15.365247698 +0000 UTC m=+0.054248189 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Oct 14 06:13:15 localhost systemd[1]: Started libcrun container. 
Oct 14 06:13:15 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9f4a1dfb2ee24ebbf2d6109954e4b68135569f88e8788b26d12cfad4d5c14907/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Oct 14 06:13:15 localhost podman[325658]: 2025-10-14 10:13:15.507712393 +0000 UTC m=+0.196712834 container init 6e8f9e2e4e6b3c8a199f177d2cc7be552aee2ba2f716a0be23fdd47fe58f43d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2e89b669-126b-40c1-acd1-6f67bd63ad7c, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251009, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, io.buildah.version=1.41.3) Oct 14 06:13:15 localhost podman[325658]: 2025-10-14 10:13:15.549486657 +0000 UTC m=+0.238487108 container start 6e8f9e2e4e6b3c8a199f177d2cc7be552aee2ba2f716a0be23fdd47fe58f43d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2e89b669-126b-40c1-acd1-6f67bd63ad7c, org.label-schema.build-date=20251009, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true) Oct 14 06:13:15 localhost dnsmasq[325676]: started, version 2.85 cachesize 150 Oct 14 06:13:15 localhost dnsmasq[325676]: DNS service limited to local subnets Oct 14 06:13:15 localhost dnsmasq[325676]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Oct 14 06:13:15 localhost dnsmasq[325676]: warning: no upstream servers 
configured Oct 14 06:13:15 localhost dnsmasq-dhcp[325676]: DHCPv6, static leases only on 2001:db8::, lease time 1d Oct 14 06:13:15 localhost dnsmasq[325676]: read /var/lib/neutron/dhcp/2e89b669-126b-40c1-acd1-6f67bd63ad7c/addn_hosts - 0 addresses Oct 14 06:13:15 localhost dnsmasq-dhcp[325676]: read /var/lib/neutron/dhcp/2e89b669-126b-40c1-acd1-6f67bd63ad7c/host Oct 14 06:13:15 localhost dnsmasq-dhcp[325676]: read /var/lib/neutron/dhcp/2e89b669-126b-40c1-acd1-6f67bd63ad7c/opts Oct 14 06:13:15 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:13:15.651 271987 INFO neutron.agent.dhcp.agent [None req-d0649544-52eb-4675-a61e-8fe859e8f3b1 - - - - - -] DHCP configuration for ports {'8a32c87c-e62d-40d8-96ae-fbf3883e8170'} is completed#033[00m Oct 14 06:13:16 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:13:16.146 271987 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-10-14T10:13:15Z, description=, device_id=a63064e1-a9dd-4084-806d-4aae647ddd4a, device_owner=network:router_gateway, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=8ba38fef-8b1e-43e1-8142-83dca184daf5, ip_allocation=immediate, mac_address=fa:16:3e:17:d1:73, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-10-14T08:36:38Z, description=, dns_domain=, id=c0145816-4627-44f2-af00-ccc9ef0436ed, ipv4_address_scope=None, ipv6_address_scope=None, is_default=False, l2_adjacency=True, mtu=1350, name=public, port_security_enabled=True, project_id=41187b090f3d4818a32baa37ce8a3991, provider:network_type=flat, provider:physical_network=datacentre, provider:segmentation_id=None, qos_policy_id=None, revision_number=2, router:external=True, shared=False, standard_attr_id=29, status=ACTIVE, subnets=['bbd40dc1-97d8-4ed3-9d4f-9b3af758b526'], 
tags=[], tenant_id=41187b090f3d4818a32baa37ce8a3991, updated_at=2025-10-14T08:36:44Z, vlan_transparent=None, network_id=c0145816-4627-44f2-af00-ccc9ef0436ed, port_security_enabled=False, project_id=, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1050, status=DOWN, tags=[], tenant_id=, updated_at=2025-10-14T10:13:15Z on network c0145816-4627-44f2-af00-ccc9ef0436ed#033[00m Oct 14 06:13:16 localhost dnsmasq[272303]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/addn_hosts - 4 addresses Oct 14 06:13:16 localhost dnsmasq-dhcp[272303]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/host Oct 14 06:13:16 localhost podman[325692]: 2025-10-14 10:13:16.370764686 +0000 UTC m=+0.060515520 container kill 373c3d29f2bfadc43d54cb6930a55bb623f8f4f5957bebc53e3e7158ee295964 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0145816-4627-44f2-af00-ccc9ef0436ed, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251009, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true) Oct 14 06:13:16 localhost dnsmasq-dhcp[272303]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/opts Oct 14 06:13:16 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:13:16.400 271987 ERROR neutron.agent.dhcp.agent [None req-9981d268-e390-4266-98ae-65978595a31c - - - - - -] Unable to reload_allocations dhcp for c0145816-4627-44f2-af00-ccc9ef0436ed.: neutron.privileged.agent.linux.ip_lib.NetworkInterfaceNotFound: Network interface tapa4b30293-43 not found in namespace qdhcp-c0145816-4627-44f2-af00-ccc9ef0436ed. 
Oct 14 06:13:16 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:13:16.400 271987 ERROR neutron.agent.dhcp.agent Traceback (most recent call last): Oct 14 06:13:16 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:13:16.400 271987 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/dhcp/agent.py", line 264, in _call_driver Oct 14 06:13:16 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:13:16.400 271987 ERROR neutron.agent.dhcp.agent rv = getattr(driver, action)(**action_kwargs) Oct 14 06:13:16 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:13:16.400 271987 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 673, in reload_allocations Oct 14 06:13:16 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:13:16.400 271987 ERROR neutron.agent.dhcp.agent self.device_manager.update(self.network, self.interface_name) Oct 14 06:13:16 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:13:16.400 271987 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1899, in update Oct 14 06:13:16 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:13:16.400 271987 ERROR neutron.agent.dhcp.agent self._set_default_route(network, device_name) Oct 14 06:13:16 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:13:16.400 271987 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1610, in _set_default_route Oct 14 06:13:16 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:13:16.400 271987 ERROR neutron.agent.dhcp.agent self._set_default_route_ip_version(network, device_name, Oct 14 06:13:16 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:13:16.400 271987 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1539, in _set_default_route_ip_version Oct 14 06:13:16 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:13:16.400 271987 ERROR 
neutron.agent.dhcp.agent gateway = device.route.get_gateway(ip_version=ip_version) Oct 14 06:13:16 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:13:16.400 271987 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 671, in get_gateway Oct 14 06:13:16 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:13:16.400 271987 ERROR neutron.agent.dhcp.agent routes = self.list_routes(ip_version, scope=scope, table=table) Oct 14 06:13:16 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:13:16.400 271987 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 656, in list_routes Oct 14 06:13:16 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:13:16.400 271987 ERROR neutron.agent.dhcp.agent return list_ip_routes(self._parent.namespace, ip_version, scope=scope, Oct 14 06:13:16 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:13:16.400 271987 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 1611, in list_ip_routes Oct 14 06:13:16 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:13:16.400 271987 ERROR neutron.agent.dhcp.agent routes = privileged.list_ip_routes(namespace, ip_version, device=device, Oct 14 06:13:16 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:13:16.400 271987 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 333, in wrapped_f Oct 14 06:13:16 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:13:16.400 271987 ERROR neutron.agent.dhcp.agent return self(f, *args, **kw) Oct 14 06:13:16 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:13:16.400 271987 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 423, in __call__ Oct 14 06:13:16 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:13:16.400 271987 ERROR neutron.agent.dhcp.agent do = self.iter(retry_state=retry_state) Oct 14 06:13:16 
localhost neutron_dhcp_agent[271983]: 2025-10-14 10:13:16.400 271987 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 360, in iter Oct 14 06:13:16 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:13:16.400 271987 ERROR neutron.agent.dhcp.agent return fut.result() Oct 14 06:13:16 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:13:16.400 271987 ERROR neutron.agent.dhcp.agent File "/usr/lib64/python3.9/concurrent/futures/_base.py", line 439, in result Oct 14 06:13:16 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:13:16.400 271987 ERROR neutron.agent.dhcp.agent return self.__get_result() Oct 14 06:13:16 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:13:16.400 271987 ERROR neutron.agent.dhcp.agent File "/usr/lib64/python3.9/concurrent/futures/_base.py", line 391, in __get_result Oct 14 06:13:16 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:13:16.400 271987 ERROR neutron.agent.dhcp.agent raise self._exception Oct 14 06:13:16 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:13:16.400 271987 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 426, in __call__ Oct 14 06:13:16 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:13:16.400 271987 ERROR neutron.agent.dhcp.agent result = fn(*args, **kwargs) Oct 14 06:13:16 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:13:16.400 271987 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/oslo_privsep/priv_context.py", line 271, in _wrap Oct 14 06:13:16 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:13:16.400 271987 ERROR neutron.agent.dhcp.agent return self.channel.remote_call(name, args, kwargs, Oct 14 06:13:16 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:13:16.400 271987 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/oslo_privsep/daemon.py", line 215, in remote_call Oct 14 06:13:16 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:13:16.400 271987 
ERROR neutron.agent.dhcp.agent raise exc_type(*result[2]) Oct 14 06:13:16 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:13:16.400 271987 ERROR neutron.agent.dhcp.agent neutron.privileged.agent.linux.ip_lib.NetworkInterfaceNotFound: Network interface tapa4b30293-43 not found in namespace qdhcp-c0145816-4627-44f2-af00-ccc9ef0436ed. Oct 14 06:13:16 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:13:16.400 271987 ERROR neutron.agent.dhcp.agent #033[00m Oct 14 06:13:16 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:13:16.407 271987 INFO neutron.agent.dhcp.agent [-] Synchronizing state#033[00m Oct 14 06:13:16 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:13:16.531 271987 INFO neutron.agent.dhcp.agent [None req-867c68e0-0c94-4a47-9faa-f2c18b2c732f - - - - - -] DHCP configuration for ports {'8ba38fef-8b1e-43e1-8142-83dca184daf5'} is completed#033[00m Oct 14 06:13:16 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:13:16.616 271987 INFO neutron.agent.dhcp.agent [None req-2a0051eb-f11e-4e7f-bbd5-8b3e93a7ec2e - - - - - -] All active networks have been fetched through RPC.#033[00m Oct 14 06:13:16 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:13:16.618 271987 INFO neutron.agent.dhcp.agent [-] Starting network c0145816-4627-44f2-af00-ccc9ef0436ed dhcp configuration#033[00m Oct 14 06:13:16 localhost dnsmasq[272303]: exiting on receipt of SIGTERM Oct 14 06:13:16 localhost podman[325724]: 2025-10-14 10:13:16.816634023 +0000 UTC m=+0.069540967 container kill 373c3d29f2bfadc43d54cb6930a55bb623f8f4f5957bebc53e3e7158ee295964 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0145816-4627-44f2-af00-ccc9ef0436ed, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, 
org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0) Oct 14 06:13:16 localhost systemd[1]: libpod-373c3d29f2bfadc43d54cb6930a55bb623f8f4f5957bebc53e3e7158ee295964.scope: Deactivated successfully. Oct 14 06:13:16 localhost podman[325738]: 2025-10-14 10:13:16.891291086 +0000 UTC m=+0.058387365 container died 373c3d29f2bfadc43d54cb6930a55bb623f8f4f5957bebc53e3e7158ee295964 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0145816-4627-44f2-af00-ccc9ef0436ed, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251009) Oct 14 06:13:16 localhost podman[325738]: 2025-10-14 10:13:16.927482728 +0000 UTC m=+0.094579017 container cleanup 373c3d29f2bfadc43d54cb6930a55bb623f8f4f5957bebc53e3e7158ee295964 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0145816-4627-44f2-af00-ccc9ef0436ed, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.license=GPLv2, tcib_managed=true) Oct 14 06:13:16 localhost systemd[1]: libpod-conmon-373c3d29f2bfadc43d54cb6930a55bb623f8f4f5957bebc53e3e7158ee295964.scope: Deactivated successfully. 
Oct 14 06:13:16 localhost podman[325739]: 2025-10-14 10:13:16.976251526 +0000 UTC m=+0.139174396 container remove 373c3d29f2bfadc43d54cb6930a55bb623f8f4f5957bebc53e3e7158ee295964 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0145816-4627-44f2-af00-ccc9ef0436ed, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, io.buildah.version=1.41.3) Oct 14 06:13:17 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:13:17.024 271987 INFO neutron.agent.linux.ip_lib [-] Device tapa4b30293-43 cannot be used as it has no MAC address#033[00m Oct 14 06:13:17 localhost nova_compute[297686]: 2025-10-14 10:13:17.087 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:13:17 localhost kernel: device tapa4b30293-43 entered promiscuous mode Oct 14 06:13:17 localhost NetworkManager[5977]: [1760436797.0966] manager: (tapa4b30293-43): new Generic device (/org/freedesktop/NetworkManager/Devices/25) Oct 14 06:13:17 localhost ovn_controller[157396]: 2025-10-14T10:13:17Z|00114|binding|INFO|Claiming lport a4b30293-434d-4d8b-b6ad-840c82777955 for this chassis. 
Oct 14 06:13:17 localhost ovn_controller[157396]: 2025-10-14T10:13:17Z|00115|binding|INFO|a4b30293-434d-4d8b-b6ad-840c82777955: Claiming unknown Oct 14 06:13:17 localhost nova_compute[297686]: 2025-10-14 10:13:17.098 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:13:17 localhost ovn_metadata_agent[163050]: 2025-10-14 10:13:17.105 163055 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005486733.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '192.168.122.172/24', 'neutron:device_id': 'dhcpe1893814-19d7-5b97-af05-19e2aafa5382-c0145816-4627-44f2-af00-ccc9ef0436ed', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c0145816-4627-44f2-af00-ccc9ef0436ed', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '41187b090f3d4818a32baa37ce8a3991', 'neutron:revision_number': '4', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a4a79b2d-2081-4037-8963-a49d853ec2ea, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[], logical_port=a4b30293-434d-4d8b-b6ad-840c82777955) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Oct 14 06:13:17 localhost ovn_controller[157396]: 2025-10-14T10:13:17Z|00116|binding|INFO|Setting lport a4b30293-434d-4d8b-b6ad-840c82777955 ovn-installed in OVS Oct 14 06:13:17 localhost ovn_controller[157396]: 
2025-10-14T10:13:17Z|00117|binding|INFO|Setting lport a4b30293-434d-4d8b-b6ad-840c82777955 up in Southbound Oct 14 06:13:17 localhost ovn_metadata_agent[163050]: 2025-10-14 10:13:17.107 163055 INFO neutron.agent.ovn.metadata.agent [-] Port a4b30293-434d-4d8b-b6ad-840c82777955 in datapath c0145816-4627-44f2-af00-ccc9ef0436ed bound to our chassis#033[00m Oct 14 06:13:17 localhost nova_compute[297686]: 2025-10-14 10:13:17.110 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:13:17 localhost ovn_metadata_agent[163050]: 2025-10-14 10:13:17.116 163055 DEBUG neutron.agent.ovn.metadata.agent [-] Port 87ba2fb9-204a-489c-af46-1632ef587df4 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Oct 14 06:13:17 localhost ovn_metadata_agent[163050]: 2025-10-14 10:13:17.116 163055 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c0145816-4627-44f2-af00-ccc9ef0436ed, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Oct 14 06:13:17 localhost ovn_metadata_agent[163050]: 2025-10-14 10:13:17.117 163159 DEBUG oslo.privsep.daemon [-] privsep: reply[b134759d-14ae-459d-ac50-e76952f0e8ec]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Oct 14 06:13:17 localhost nova_compute[297686]: 2025-10-14 10:13:17.126 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:13:17 localhost nova_compute[297686]: 2025-10-14 10:13:17.130 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:13:17 localhost nova_compute[297686]: 2025-10-14 10:13:17.168 2 
DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:13:17 localhost nova_compute[297686]: 2025-10-14 10:13:17.195 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:13:17 localhost systemd[1]: var-lib-containers-storage-overlay-b2a8e08ec2da19ae228015c891a79ab4164456cccb64d0ef1eafa1149b2a4dca-merged.mount: Deactivated successfully. Oct 14 06:13:17 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-373c3d29f2bfadc43d54cb6930a55bb623f8f4f5957bebc53e3e7158ee295964-userdata-shm.mount: Deactivated successfully. Oct 14 06:13:17 localhost nova_compute[297686]: 2025-10-14 10:13:17.639 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:13:18 localhost podman[325819]: Oct 14 06:13:18 localhost podman[325819]: 2025-10-14 10:13:18.012975934 +0000 UTC m=+0.090989107 container create 27e23fba1f10c3118d508631d350793857acd97085c399f85eed727cd1eab7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0145816-4627-44f2-af00-ccc9ef0436ed, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS) Oct 14 06:13:18 localhost systemd[1]: Started libpod-conmon-27e23fba1f10c3118d508631d350793857acd97085c399f85eed727cd1eab7c3.scope. 
Oct 14 06:13:18 localhost podman[325819]: 2025-10-14 10:13:17.969380364 +0000 UTC m=+0.047393547 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Oct 14 06:13:18 localhost systemd[1]: Started libcrun container. Oct 14 06:13:18 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/12d496126b87e0c1fc7393c34a8dd7a8ab14d98e754a7460c6f3bb3a7ab8388b/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Oct 14 06:13:18 localhost podman[325819]: 2025-10-14 10:13:18.102981769 +0000 UTC m=+0.180994922 container init 27e23fba1f10c3118d508631d350793857acd97085c399f85eed727cd1eab7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0145816-4627-44f2-af00-ccc9ef0436ed, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2) Oct 14 06:13:18 localhost podman[325819]: 2025-10-14 10:13:18.11506884 +0000 UTC m=+0.193081993 container start 27e23fba1f10c3118d508631d350793857acd97085c399f85eed727cd1eab7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0145816-4627-44f2-af00-ccc9ef0436ed, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.build-date=20251009) Oct 14 06:13:18 localhost dnsmasq[325837]: started, version 2.85 cachesize 150 Oct 14 06:13:18 localhost dnsmasq[325837]: DNS service limited to local subnets Oct 14 06:13:18 localhost 
dnsmasq[325837]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Oct 14 06:13:18 localhost dnsmasq[325837]: warning: no upstream servers configured Oct 14 06:13:18 localhost dnsmasq-dhcp[325837]: DHCP, static leases only on 192.168.122.0, lease time 1d Oct 14 06:13:18 localhost dnsmasq[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/addn_hosts - 4 addresses Oct 14 06:13:18 localhost dnsmasq-dhcp[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/host Oct 14 06:13:18 localhost dnsmasq-dhcp[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/opts Oct 14 06:13:18 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:13:18.175 271987 INFO neutron.agent.dhcp.agent [None req-fe06890b-74a3-4e3b-925f-fd8ce0a4e2d4 - - - - - -] Finished network c0145816-4627-44f2-af00-ccc9ef0436ed dhcp configuration#033[00m Oct 14 06:13:18 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:13:18.176 271987 INFO neutron.agent.dhcp.agent [None req-2a0051eb-f11e-4e7f-bbd5-8b3e93a7ec2e - - - - - -] Synchronizing state complete#033[00m Oct 14 06:13:18 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:13:18.177 271987 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-10-14T10:13:16Z, description=, device_id=49fbd0f5-475f-4594-9e77-7ed11ceb655b, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=920a4e04-d60f-4bc9-848e-835fee97bfa5, ip_allocation=immediate, mac_address=fa:16:3e:b2:25:aa, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-10-14T10:13:08Z, description=, dns_domain=, 
id=5105abb4-2220-4dcf-9429-73af964fcfcf, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-ServersTestFqdnHostnames-2093914441-network, port_security_enabled=True, project_id=98b0e20a851d4229a03e25233b4b19d1, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=54560, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1006, status=ACTIVE, subnets=['1169e8f2-68b5-424c-b9f3-c03e2c6b5bba'], tags=[], tenant_id=98b0e20a851d4229a03e25233b4b19d1, updated_at=2025-10-14T10:13:09Z, vlan_transparent=None, network_id=5105abb4-2220-4dcf-9429-73af964fcfcf, port_security_enabled=False, project_id=98b0e20a851d4229a03e25233b4b19d1, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1051, status=DOWN, tags=[], tenant_id=98b0e20a851d4229a03e25233b4b19d1, updated_at=2025-10-14T10:13:16Z on network 5105abb4-2220-4dcf-9429-73af964fcfcf#033[00m Oct 14 06:13:18 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e123 e123: 6 total, 6 up, 6 in Oct 14 06:13:18 localhost dnsmasq[325537]: read /var/lib/neutron/dhcp/5105abb4-2220-4dcf-9429-73af964fcfcf/addn_hosts - 1 addresses Oct 14 06:13:18 localhost podman[325855]: 2025-10-14 10:13:18.403584363 +0000 UTC m=+0.059964803 container kill b56ac32e11c19003e94e601420bacf63a41184e6198d220ce7aab41a932a8ab9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5105abb4-2220-4dcf-9429-73af964fcfcf, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5) Oct 14 06:13:18 localhost dnsmasq-dhcp[325537]: read 
/var/lib/neutron/dhcp/5105abb4-2220-4dcf-9429-73af964fcfcf/host Oct 14 06:13:18 localhost dnsmasq-dhcp[325537]: read /var/lib/neutron/dhcp/5105abb4-2220-4dcf-9429-73af964fcfcf/opts Oct 14 06:13:18 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:13:18.462 271987 INFO neutron.agent.dhcp.agent [None req-691dd217-f78c-46ed-961d-9f8d23d78e6c - - - - - -] DHCP configuration for ports {'c5061e05-fbdf-4d81-b1d8-4bfaaa73263c', '608d193e-b40e-42ce-95e2-532a86f20043', 'a4b30293-434d-4d8b-b6ad-840c82777955', 'c8a1e507-d02b-46f2-ba97-01ab899e151c', '438a787a-6d89-44af-b1ac-67ef2d230541', '62f47f8a-76e6-4e1f-aab4-b3ec4b9f5cf9', '8ba38fef-8b1e-43e1-8142-83dca184daf5'} is completed#033[00m Oct 14 06:13:18 localhost nova_compute[297686]: 2025-10-14 10:13:18.517 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:13:18 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:13:18.706 271987 INFO neutron.agent.dhcp.agent [None req-ac946ea1-fdc2-45e7-b6c6-ff5125d1581d - - - - - -] DHCP configuration for ports {'920a4e04-d60f-4bc9-848e-835fee97bfa5'} is completed#033[00m Oct 14 06:13:18 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:13:18.754 271987 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.dhcp_release_cmd', '--privsep_sock_path', '/tmp/tmpx__tb9w0/privsep.sock']#033[00m Oct 14 06:13:19 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e124 e124: 6 total, 6 up, 6 in Oct 14 06:13:19 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e124 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Oct 14 06:13:19 localhost ovn_controller[157396]: 2025-10-14T10:13:19Z|00118|binding|INFO|Releasing lport 
25c6586a-239c-451b-aac2-e0a3ee5c3145 from this chassis (sb_readonly=0) Oct 14 06:13:19 localhost nova_compute[297686]: 2025-10-14 10:13:19.381 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:13:19 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:13:19.389 271987 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap#033[00m Oct 14 06:13:19 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:13:19.276 325880 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m Oct 14 06:13:19 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:13:19.281 325880 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m Oct 14 06:13:19 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:13:19.284 325880 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_NET_ADMIN|CAP_SYS_ADMIN/none#033[00m Oct 14 06:13:19 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:13:19.284 325880 INFO oslo.privsep.daemon [-] privsep daemon running as pid 325880#033[00m Oct 14 06:13:19 localhost dnsmasq-dhcp[325837]: DHCPRELEASE(tapa4b30293-43) 192.168.122.176 fa:16:3e:b5:07:c5 Oct 14 06:13:19 localhost nova_compute[297686]: 2025-10-14 10:13:19.726 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:13:19 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:13:19.993 271987 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-10-14T10:13:16Z, description=, device_id=49fbd0f5-475f-4594-9e77-7ed11ceb655b, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], 
id=920a4e04-d60f-4bc9-848e-835fee97bfa5, ip_allocation=immediate, mac_address=fa:16:3e:b2:25:aa, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-10-14T10:13:08Z, description=, dns_domain=, id=5105abb4-2220-4dcf-9429-73af964fcfcf, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-ServersTestFqdnHostnames-2093914441-network, port_security_enabled=True, project_id=98b0e20a851d4229a03e25233b4b19d1, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=54560, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1006, status=ACTIVE, subnets=['1169e8f2-68b5-424c-b9f3-c03e2c6b5bba'], tags=[], tenant_id=98b0e20a851d4229a03e25233b4b19d1, updated_at=2025-10-14T10:13:09Z, vlan_transparent=None, network_id=5105abb4-2220-4dcf-9429-73af964fcfcf, port_security_enabled=False, project_id=98b0e20a851d4229a03e25233b4b19d1, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1051, status=DOWN, tags=[], tenant_id=98b0e20a851d4229a03e25233b4b19d1, updated_at=2025-10-14T10:13:16Z on network 5105abb4-2220-4dcf-9429-73af964fcfcf#033[00m Oct 14 06:13:20 localhost dnsmasq[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/addn_hosts - 3 addresses Oct 14 06:13:20 localhost podman[325916]: 2025-10-14 10:13:20.180639012 +0000 UTC m=+0.036379658 container kill 27e23fba1f10c3118d508631d350793857acd97085c399f85eed727cd1eab7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0145816-4627-44f2-af00-ccc9ef0436ed, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, 
org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0) Oct 14 06:13:20 localhost dnsmasq-dhcp[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/host Oct 14 06:13:20 localhost dnsmasq-dhcp[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/opts Oct 14 06:13:20 localhost dnsmasq[325537]: read /var/lib/neutron/dhcp/5105abb4-2220-4dcf-9429-73af964fcfcf/addn_hosts - 1 addresses Oct 14 06:13:20 localhost dnsmasq-dhcp[325537]: read /var/lib/neutron/dhcp/5105abb4-2220-4dcf-9429-73af964fcfcf/host Oct 14 06:13:20 localhost podman[325918]: 2025-10-14 10:13:20.243444511 +0000 UTC m=+0.094602987 container kill b56ac32e11c19003e94e601420bacf63a41184e6198d220ce7aab41a932a8ab9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5105abb4-2220-4dcf-9429-73af964fcfcf, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team) Oct 14 06:13:20 localhost dnsmasq-dhcp[325537]: read /var/lib/neutron/dhcp/5105abb4-2220-4dcf-9429-73af964fcfcf/opts Oct 14 06:13:20 localhost systemd[1]: tmp-crun.L8wrd6.mount: Deactivated successfully. Oct 14 06:13:20 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:13:20.453 271987 INFO neutron.agent.dhcp.agent [None req-aedb0677-4326-47b8-88ae-bb4f69796449 - - - - - -] DHCP configuration for ports {'920a4e04-d60f-4bc9-848e-835fee97bfa5'} is completed#033[00m Oct 14 06:13:20 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e125 e125: 6 total, 6 up, 6 in Oct 14 06:13:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041. 
Oct 14 06:13:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857. Oct 14 06:13:21 localhost podman[325980]: 2025-10-14 10:13:21.287581497 +0000 UTC m=+0.088530371 container health_status 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Oct 14 06:13:21 localhost podman[325980]: 2025-10-14 10:13:21.300120352 +0000 UTC m=+0.101069216 container exec_died 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': 
'/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter) Oct 14 06:13:21 localhost systemd[1]: 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041.service: Deactivated successfully. Oct 14 06:13:21 localhost podman[325981]: 2025-10-14 10:13:21.34465986 +0000 UTC m=+0.144307333 container health_status 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, org.label-schema.build-date=20251009, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS) Oct 14 06:13:21 localhost podman[325981]: 2025-10-14 10:13:21.378073597 +0000 UTC m=+0.177721010 container exec_died 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, 
org.label-schema.license=GPLv2, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3) Oct 14 06:13:21 localhost systemd[1]: 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857.service: Deactivated successfully. Oct 14 06:13:22 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Oct 14 06:13:22 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' Oct 14 06:13:22 localhost nova_compute[297686]: 2025-10-14 10:13:22.644 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:13:24 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:13:24.244 271987 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-10-14T10:13:23Z, description=, device_id=4369264a-030a-48a3-848d-dcf7553c1988, device_owner=network:router_gateway, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=5530b0f1-b185-441f-8939-34b7fcd71ffe, ip_allocation=immediate, mac_address=fa:16:3e:ac:97:d1, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-10-14T08:36:38Z, description=, dns_domain=, id=c0145816-4627-44f2-af00-ccc9ef0436ed, ipv4_address_scope=None, ipv6_address_scope=None, is_default=False, l2_adjacency=True, mtu=1350, name=public, port_security_enabled=True, project_id=41187b090f3d4818a32baa37ce8a3991, provider:network_type=flat, provider:physical_network=datacentre, provider:segmentation_id=None, 
qos_policy_id=None, revision_number=2, router:external=True, shared=False, standard_attr_id=29, status=ACTIVE, subnets=['bbd40dc1-97d8-4ed3-9d4f-9b3af758b526'], tags=[], tenant_id=41187b090f3d4818a32baa37ce8a3991, updated_at=2025-10-14T08:36:44Z, vlan_transparent=None, network_id=c0145816-4627-44f2-af00-ccc9ef0436ed, port_security_enabled=False, project_id=, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1086, status=DOWN, tags=[], tenant_id=, updated_at=2025-10-14T10:13:23Z on network c0145816-4627-44f2-af00-ccc9ef0436ed#033[00m Oct 14 06:13:24 localhost nova_compute[297686]: 2025-10-14 10:13:24.252 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:13:24 localhost nova_compute[297686]: 2025-10-14 10:13:24.255 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:13:24 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e125 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Oct 14 06:13:24 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' Oct 14 06:13:24 localhost podman[326103]: 2025-10-14 10:13:24.468496552 +0000 UTC m=+0.056663072 container kill 27e23fba1f10c3118d508631d350793857acd97085c399f85eed727cd1eab7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0145816-4627-44f2-af00-ccc9ef0436ed, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, 
org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image) Oct 14 06:13:24 localhost dnsmasq[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/addn_hosts - 4 addresses Oct 14 06:13:24 localhost dnsmasq-dhcp[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/host Oct 14 06:13:24 localhost dnsmasq-dhcp[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/opts Oct 14 06:13:24 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e126 e126: 6 total, 6 up, 6 in Oct 14 06:13:24 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:13:24.755 271987 INFO neutron.agent.dhcp.agent [None req-3977f523-a97a-42cb-9db9-a2df80e8a5b0 - - - - - -] DHCP configuration for ports {'5530b0f1-b185-441f-8939-34b7fcd71ffe'} is completed#033[00m Oct 14 06:13:24 localhost nova_compute[297686]: 2025-10-14 10:13:24.758 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:13:24 localhost dnsmasq[325537]: read /var/lib/neutron/dhcp/5105abb4-2220-4dcf-9429-73af964fcfcf/addn_hosts - 0 addresses Oct 14 06:13:24 localhost dnsmasq-dhcp[325537]: read /var/lib/neutron/dhcp/5105abb4-2220-4dcf-9429-73af964fcfcf/host Oct 14 06:13:24 localhost dnsmasq-dhcp[325537]: read /var/lib/neutron/dhcp/5105abb4-2220-4dcf-9429-73af964fcfcf/opts Oct 14 06:13:24 localhost podman[326140]: 2025-10-14 10:13:24.837434796 +0000 UTC m=+0.063764440 container kill b56ac32e11c19003e94e601420bacf63a41184e6198d220ce7aab41a932a8ab9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5105abb4-2220-4dcf-9429-73af964fcfcf, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base 
Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2) Oct 14 06:13:24 localhost nova_compute[297686]: 2025-10-14 10:13:24.875 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:13:25 localhost nova_compute[297686]: 2025-10-14 10:13:25.032 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:13:25 localhost ovn_controller[157396]: 2025-10-14T10:13:25Z|00119|binding|INFO|Releasing lport dee085e1-010e-4e7c-944a-6ae3934bccb7 from this chassis (sb_readonly=0) Oct 14 06:13:25 localhost kernel: device tapdee085e1-01 left promiscuous mode Oct 14 06:13:25 localhost ovn_controller[157396]: 2025-10-14T10:13:25Z|00120|binding|INFO|Setting lport dee085e1-010e-4e7c-944a-6ae3934bccb7 down in Southbound Oct 14 06:13:25 localhost ovn_metadata_agent[163050]: 2025-10-14 10:13:25.047 163055 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005486733.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpe1893814-19d7-5b97-af05-19e2aafa5382-5105abb4-2220-4dcf-9429-73af964fcfcf', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5105abb4-2220-4dcf-9429-73af964fcfcf', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '98b0e20a851d4229a03e25233b4b19d1', 'neutron:revision_number': '3', 
'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005486733.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3b5b491d-49e0-4ac2-b12e-aa7584b2aa89, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=dee085e1-010e-4e7c-944a-6ae3934bccb7) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Oct 14 06:13:25 localhost ovn_metadata_agent[163050]: 2025-10-14 10:13:25.049 163055 INFO neutron.agent.ovn.metadata.agent [-] Port dee085e1-010e-4e7c-944a-6ae3934bccb7 in datapath 5105abb4-2220-4dcf-9429-73af964fcfcf unbound from our chassis#033[00m Oct 14 06:13:25 localhost nova_compute[297686]: 2025-10-14 10:13:25.053 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:13:25 localhost ovn_metadata_agent[163050]: 2025-10-14 10:13:25.053 163055 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 5105abb4-2220-4dcf-9429-73af964fcfcf, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Oct 14 06:13:25 localhost ovn_metadata_agent[163050]: 2025-10-14 10:13:25.054 163159 DEBUG oslo.privsep.daemon [-] privsep: reply[e0c88c5f-7611-4d12-93b1-7270ace66248]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Oct 14 06:13:25 localhost nova_compute[297686]: 2025-10-14 10:13:25.255 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:13:25 localhost nova_compute[297686]: 
2025-10-14 10:13:25.256 2 DEBUG nova.compute.manager [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Oct 14 06:13:25 localhost nova_compute[297686]: 2025-10-14 10:13:25.256 2 DEBUG nova.compute.manager [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Oct 14 06:13:25 localhost nova_compute[297686]: 2025-10-14 10:13:25.375 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Acquiring lock "refresh_cache-88c4e366-b765-47a6-96bf-f7677f2ce67c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Oct 14 06:13:25 localhost nova_compute[297686]: 2025-10-14 10:13:25.376 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Acquired lock "refresh_cache-88c4e366-b765-47a6-96bf-f7677f2ce67c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Oct 14 06:13:25 localhost nova_compute[297686]: 2025-10-14 10:13:25.376 2 DEBUG nova.network.neutron [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] [instance: 88c4e366-b765-47a6-96bf-f7677f2ce67c] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Oct 14 06:13:25 localhost nova_compute[297686]: 2025-10-14 10:13:25.376 2 DEBUG nova.objects.instance [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Lazy-loading 'info_cache' on Instance uuid 88c4e366-b765-47a6-96bf-f7677f2ce67c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Oct 14 06:13:26 localhost nova_compute[297686]: 2025-10-14 10:13:26.462 2 DEBUG nova.network.neutron [None 
req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] [instance: 88c4e366-b765-47a6-96bf-f7677f2ce67c] Updating instance_info_cache with network_info: [{"id": "3ec9b060-f43d-4698-9c76-6062c70911d5", "address": "fa:16:3e:84:5e:e5", "network": {"id": "7d0cd696-bdd7-4e70-9512-eb0d23640314", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.46", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "41187b090f3d4818a32baa37ce8a3991", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3ec9b060-f4", "ovs_interfaceid": "3ec9b060-f43d-4698-9c76-6062c70911d5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Oct 14 06:13:26 localhost nova_compute[297686]: 2025-10-14 10:13:26.486 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Releasing lock "refresh_cache-88c4e366-b765-47a6-96bf-f7677f2ce67c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Oct 14 06:13:26 localhost nova_compute[297686]: 2025-10-14 10:13:26.487 2 DEBUG nova.compute.manager [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] [instance: 88c4e366-b765-47a6-96bf-f7677f2ce67c] Updated the network info_cache for instance _heal_instance_info_cache 
/usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Oct 14 06:13:26 localhost nova_compute[297686]: 2025-10-14 10:13:26.488 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:13:26 localhost podman[326178]: 2025-10-14 10:13:26.704594134 +0000 UTC m=+0.055604949 container kill 27e23fba1f10c3118d508631d350793857acd97085c399f85eed727cd1eab7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0145816-4627-44f2-af00-ccc9ef0436ed, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2) Oct 14 06:13:26 localhost systemd[1]: tmp-crun.AosOy9.mount: Deactivated successfully. 
Oct 14 06:13:26 localhost dnsmasq[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/addn_hosts - 3 addresses Oct 14 06:13:26 localhost dnsmasq-dhcp[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/host Oct 14 06:13:26 localhost dnsmasq-dhcp[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/opts Oct 14 06:13:27 localhost nova_compute[297686]: 2025-10-14 10:13:27.256 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:13:27 localhost nova_compute[297686]: 2025-10-14 10:13:27.257 2 DEBUG nova.compute.manager [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Oct 14 06:13:27 localhost dnsmasq-dhcp[325837]: DHCPRELEASE(tapa4b30293-43) 192.168.122.178 fa:16:3e:86:f3:77 Oct 14 06:13:27 localhost ovn_controller[157396]: 2025-10-14T10:13:27Z|00121|binding|INFO|Releasing lport 25c6586a-239c-451b-aac2-e0a3ee5c3145 from this chassis (sb_readonly=0) Oct 14 06:13:27 localhost nova_compute[297686]: 2025-10-14 10:13:27.668 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:13:28 localhost dnsmasq[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/addn_hosts - 2 addresses Oct 14 06:13:28 localhost dnsmasq-dhcp[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/host Oct 14 06:13:28 localhost podman[326241]: 2025-10-14 10:13:28.102265131 +0000 UTC m=+0.084630921 container kill 27e23fba1f10c3118d508631d350793857acd97085c399f85eed727cd1eab7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, 
name=neutron-dnsmasq-qdhcp-c0145816-4627-44f2-af00-ccc9ef0436ed, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5) Oct 14 06:13:28 localhost dnsmasq-dhcp[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/opts Oct 14 06:13:28 localhost dnsmasq[325537]: exiting on receipt of SIGTERM Oct 14 06:13:28 localhost podman[326273]: 2025-10-14 10:13:28.14910901 +0000 UTC m=+0.063561434 container kill b56ac32e11c19003e94e601420bacf63a41184e6198d220ce7aab41a932a8ab9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5105abb4-2220-4dcf-9429-73af964fcfcf, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2) Oct 14 06:13:28 localhost systemd[1]: libpod-b56ac32e11c19003e94e601420bacf63a41184e6198d220ce7aab41a932a8ab9.scope: Deactivated successfully. 
Oct 14 06:13:28 localhost podman[326259]: 2025-10-14 10:13:28.207963018 +0000 UTC m=+0.154702133 container kill 6e8f9e2e4e6b3c8a199f177d2cc7be552aee2ba2f716a0be23fdd47fe58f43d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2e89b669-126b-40c1-acd1-6f67bd63ad7c, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct 14 06:13:28 localhost dnsmasq[325676]: exiting on receipt of SIGTERM
Oct 14 06:13:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad.
Oct 14 06:13:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec.
Oct 14 06:13:28 localhost systemd[1]: libpod-6e8f9e2e4e6b3c8a199f177d2cc7be552aee2ba2f716a0be23fdd47fe58f43d5.scope: Deactivated successfully.
Oct 14 06:13:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29.
Oct 14 06:13:28 localhost podman[326294]: 2025-10-14 10:13:28.235865895 +0000 UTC m=+0.071708654 container died b56ac32e11c19003e94e601420bacf63a41184e6198d220ce7aab41a932a8ab9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5105abb4-2220-4dcf-9429-73af964fcfcf, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Oct 14 06:13:28 localhost nova_compute[297686]: 2025-10-14 10:13:28.257 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 14 06:13:28 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:13:28.269 271987 INFO neutron.agent.dhcp.agent [None req-6b71f290-d9ef-4b01-bba7-31ea11cd48f4 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-10-14T10:13:27Z, description=, device_id=3ec8c604-bb33-4345-822a-c4a3b433799a, device_owner=network:router_gateway, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=b3970207-a24f-46b2-98ad-8106454387d4, ip_allocation=immediate, mac_address=fa:16:3e:55:f5:60, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-10-14T08:36:38Z, description=, dns_domain=, id=c0145816-4627-44f2-af00-ccc9ef0436ed, ipv4_address_scope=None, ipv6_address_scope=None, is_default=False, l2_adjacency=True, mtu=1350, name=public, port_security_enabled=True, project_id=41187b090f3d4818a32baa37ce8a3991, provider:network_type=flat, provider:physical_network=datacentre, provider:segmentation_id=None, qos_policy_id=None, revision_number=2, router:external=True, shared=False, standard_attr_id=29, status=ACTIVE, subnets=['bbd40dc1-97d8-4ed3-9d4f-9b3af758b526'], tags=[], tenant_id=41187b090f3d4818a32baa37ce8a3991, updated_at=2025-10-14T08:36:44Z, vlan_transparent=None, network_id=c0145816-4627-44f2-af00-ccc9ef0436ed, port_security_enabled=False, project_id=, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1103, status=DOWN, tags=[], tenant_id=, updated_at=2025-10-14T10:13:27Z on network c0145816-4627-44f2-af00-ccc9ef0436ed#033[00m
Oct 14 06:13:28 localhost podman[248187]: time="2025-10-14T10:13:28Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 14 06:13:28 localhost podman[326324]: 2025-10-14 10:13:28.325756696 +0000 UTC m=+0.094662109 container health_status 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, config_id=multipathd, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d)
Oct 14 06:13:28 localhost podman[326294]: 2025-10-14 10:13:28.336258259 +0000 UTC m=+0.172100998 container cleanup b56ac32e11c19003e94e601420bacf63a41184e6198d220ce7aab41a932a8ab9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5105abb4-2220-4dcf-9429-73af964fcfcf, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5)
Oct 14 06:13:28 localhost systemd[1]: libpod-conmon-b56ac32e11c19003e94e601420bacf63a41184e6198d220ce7aab41a932a8ab9.scope: Deactivated successfully.
Oct 14 06:13:28 localhost podman[326332]: 2025-10-14 10:13:28.362368931 +0000 UTC m=+0.110970550 container died 6e8f9e2e4e6b3c8a199f177d2cc7be552aee2ba2f716a0be23fdd47fe58f43d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2e89b669-126b-40c1-acd1-6f67bd63ad7c, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Oct 14 06:13:28 localhost podman[326296]: 2025-10-14 10:13:28.408168328 +0000 UTC m=+0.237682952 container remove b56ac32e11c19003e94e601420bacf63a41184e6198d220ce7aab41a932a8ab9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5105abb4-2220-4dcf-9429-73af964fcfcf, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251009, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true)
Oct 14 06:13:28 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:13:28.440 271987 INFO neutron.agent.dhcp.agent [None req-9cbca1cc-536f-4190-859a-796190812d10 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m
Oct 14 06:13:28 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:13:28.440 271987 INFO neutron.agent.dhcp.agent [None req-9cbca1cc-536f-4190-859a-796190812d10 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m
Oct 14 06:13:28 localhost podman[326331]: 2025-10-14 10:13:28.314723667 +0000 UTC m=+0.080288047 container health_status fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, container_name=iscsid, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=iscsid, org.label-schema.license=GPLv2, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']})
Oct 14 06:13:28 localhost podman[248187]: @ - - [14/Oct/2025:10:13:28 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 151129 "" "Go-http-client/1.1"
Oct 14 06:13:28 localhost podman[326324]: 2025-10-14 10:13:28.487610599 +0000 UTC m=+0.256515972 container exec_died 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, container_name=multipathd, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 14 06:13:28 localhost podman[326325]: 2025-10-14 10:13:28.442910065 +0000 UTC m=+0.207794024 container health_status aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Oct 14 06:13:28 localhost systemd[1]: 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad.service: Deactivated successfully.
Oct 14 06:13:28 localhost podman[326331]: 2025-10-14 10:13:28.499929307 +0000 UTC m=+0.265493677 container exec_died fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, tcib_managed=true, config_id=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, org.label-schema.build-date=20251009)
Oct 14 06:13:28 localhost systemd[1]: fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29.service: Deactivated successfully.
Oct 14 06:13:28 localhost dnsmasq[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/addn_hosts - 3 addresses
Oct 14 06:13:28 localhost dnsmasq-dhcp[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/host
Oct 14 06:13:28 localhost podman[326428]: 2025-10-14 10:13:28.520756487 +0000 UTC m=+0.038129383 container kill 27e23fba1f10c3118d508631d350793857acd97085c399f85eed727cd1eab7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0145816-4627-44f2-af00-ccc9ef0436ed, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 14 06:13:28 localhost dnsmasq-dhcp[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/opts
Oct 14 06:13:28 localhost podman[326332]: 2025-10-14 10:13:28.566281665 +0000 UTC m=+0.314883274 container remove 6e8f9e2e4e6b3c8a199f177d2cc7be552aee2ba2f716a0be23fdd47fe58f43d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2e89b669-126b-40c1-acd1-6f67bd63ad7c, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Oct 14 06:13:28 localhost podman[326325]: 2025-10-14 10:13:28.572565568 +0000 UTC m=+0.337449577 container exec_died aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible)
Oct 14 06:13:28 localhost systemd[1]: libpod-conmon-6e8f9e2e4e6b3c8a199f177d2cc7be552aee2ba2f716a0be23fdd47fe58f43d5.scope: Deactivated successfully.
Oct 14 06:13:28 localhost systemd[1]: aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec.service: Deactivated successfully.
Oct 14 06:13:28 localhost ovn_controller[157396]: 2025-10-14T10:13:28Z|00122|binding|INFO|Releasing lport 54476a46-32a2-4a2a-aa3e-5703e42659b5 from this chassis (sb_readonly=0)
Oct 14 06:13:28 localhost ovn_controller[157396]: 2025-10-14T10:13:28Z|00123|binding|INFO|Setting lport 54476a46-32a2-4a2a-aa3e-5703e42659b5 down in Southbound
Oct 14 06:13:28 localhost nova_compute[297686]: 2025-10-14 10:13:28.614 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 14 06:13:28 localhost kernel: device tap54476a46-32 left promiscuous mode
Oct 14 06:13:28 localhost ovn_metadata_agent[163050]: 2025-10-14 10:13:28.623 163055 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005486733.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpe1893814-19d7-5b97-af05-19e2aafa5382-2e89b669-126b-40c1-acd1-6f67bd63ad7c', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2e89b669-126b-40c1-acd1-6f67bd63ad7c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'aadbca62f85049bbb5689b00ddbce91d', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005486733.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=020c7cea-6d1a-46a8-a024-3c5fb878cafa, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=54476a46-32a2-4a2a-aa3e-5703e42659b5) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct 14 06:13:28 localhost ovn_metadata_agent[163050]: 2025-10-14 10:13:28.624 163055 INFO neutron.agent.ovn.metadata.agent [-] Port 54476a46-32a2-4a2a-aa3e-5703e42659b5 in datapath 2e89b669-126b-40c1-acd1-6f67bd63ad7c unbound from our chassis#033[00m
Oct 14 06:13:28 localhost ovn_metadata_agent[163050]: 2025-10-14 10:13:28.625 163055 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 2e89b669-126b-40c1-acd1-6f67bd63ad7c or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m
Oct 14 06:13:28 localhost ovn_metadata_agent[163050]: 2025-10-14 10:13:28.625 163159 DEBUG oslo.privsep.daemon [-] privsep: reply[50770b80-d9bb-4a9f-8f3b-1dac64f02d15]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct 14 06:13:28 localhost nova_compute[297686]: 2025-10-14 10:13:28.630 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 14 06:13:28 localhost podman[248187]: @ - - [14/Oct/2025:10:13:28 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19862 "" "Go-http-client/1.1"
Oct 14 06:13:28 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:13:28.765 271987 INFO neutron.agent.dhcp.agent [None req-06e49fc8-3b2c-441c-925a-2f504b7de440 - - - - - -] DHCP configuration for ports {'b3970207-a24f-46b2-98ad-8106454387d4'} is completed#033[00m
Oct 14 06:13:28 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:13:28.851 271987 INFO neutron.agent.dhcp.agent [None req-6e9261a5-e6c3-45f0-8e4f-237dc735ec93 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m
Oct 14 06:13:28 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:13:28.851 271987 INFO neutron.agent.dhcp.agent [None req-6e9261a5-e6c3-45f0-8e4f-237dc735ec93 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m
Oct 14 06:13:29 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:13:29.015 271987 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m
Oct 14 06:13:29 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e127 e127: 6 total, 6 up, 6 in
Oct 14 06:13:29 localhost systemd[1]: var-lib-containers-storage-overlay-9f4a1dfb2ee24ebbf2d6109954e4b68135569f88e8788b26d12cfad4d5c14907-merged.mount: Deactivated successfully.
Oct 14 06:13:29 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-6e8f9e2e4e6b3c8a199f177d2cc7be552aee2ba2f716a0be23fdd47fe58f43d5-userdata-shm.mount: Deactivated successfully.
Oct 14 06:13:29 localhost systemd[1]: run-netns-qdhcp\x2d2e89b669\x2d126b\x2d40c1\x2dacd1\x2d6f67bd63ad7c.mount: Deactivated successfully.
Oct 14 06:13:29 localhost systemd[1]: var-lib-containers-storage-overlay-0d5c80661321bdf5f0aa2987b744edc467c8894c3e72d3dfa565600fc5f08c39-merged.mount: Deactivated successfully.
Oct 14 06:13:29 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b56ac32e11c19003e94e601420bacf63a41184e6198d220ce7aab41a932a8ab9-userdata-shm.mount: Deactivated successfully.
Oct 14 06:13:29 localhost systemd[1]: run-netns-qdhcp\x2d5105abb4\x2d2220\x2d4dcf\x2d9429\x2d73af964fcfcf.mount: Deactivated successfully.
Oct 14 06:13:29 localhost ovn_controller[157396]: 2025-10-14T10:13:29Z|00124|binding|INFO|Releasing lport 25c6586a-239c-451b-aac2-e0a3ee5c3145 from this chassis (sb_readonly=0)
Oct 14 06:13:29 localhost nova_compute[297686]: 2025-10-14 10:13:29.239 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 14 06:13:29 localhost nova_compute[297686]: 2025-10-14 10:13:29.255 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 14 06:13:29 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e127 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct 14 06:13:29 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e128 e128: 6 total, 6 up, 6 in
Oct 14 06:13:29 localhost nova_compute[297686]: 2025-10-14 10:13:29.793 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 14 06:13:30 localhost nova_compute[297686]: 2025-10-14 10:13:30.184 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 14 06:13:30 localhost nova_compute[297686]: 2025-10-14 10:13:30.255 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 14 06:13:30 localhost nova_compute[297686]: 2025-10-14 10:13:30.296 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct 14 06:13:30 localhost nova_compute[297686]: 2025-10-14 10:13:30.296 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct 14 06:13:30 localhost nova_compute[297686]: 2025-10-14 10:13:30.297 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct 14 06:13:30 localhost nova_compute[297686]: 2025-10-14 10:13:30.297 2 DEBUG nova.compute.resource_tracker [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Auditing locally available compute resources for np0005486733.localdomain (node: np0005486733.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct 14 06:13:30 localhost nova_compute[297686]: 2025-10-14 10:13:30.298 2 DEBUG oslo_concurrency.processutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct 14 06:13:30 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e129 e129: 6 total, 6 up, 6 in
Oct 14 06:13:30 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Oct 14 06:13:30 localhost ceph-mon[317114]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/673047331' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Oct 14 06:13:30 localhost nova_compute[297686]: 2025-10-14 10:13:30.763 2 DEBUG oslo_concurrency.processutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.465s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct 14 06:13:30 localhost nova_compute[297686]: 2025-10-14 10:13:30.849 2 DEBUG nova.virt.libvirt.driver [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Oct 14 06:13:30 localhost nova_compute[297686]: 2025-10-14 10:13:30.850 2 DEBUG nova.virt.libvirt.driver [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Oct 14 06:13:31 localhost nova_compute[297686]: 2025-10-14 10:13:31.051 2 WARNING nova.virt.libvirt.driver [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct 14 06:13:31 localhost nova_compute[297686]: 2025-10-14 10:13:31.053 2 DEBUG nova.compute.resource_tracker [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Hypervisor/Node resource view: name=np0005486733.localdomain free_ram=11278MB free_disk=41.83695602416992GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct 14 06:13:31 localhost nova_compute[297686]: 2025-10-14 10:13:31.054 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct 14 06:13:31 localhost nova_compute[297686]: 2025-10-14 10:13:31.054 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct 14 06:13:31 localhost nova_compute[297686]: 2025-10-14 10:13:31.159 2 DEBUG nova.compute.resource_tracker [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Instance 88c4e366-b765-47a6-96bf-f7677f2ce67c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct 14 06:13:31 localhost nova_compute[297686]: 2025-10-14 10:13:31.160 2 DEBUG nova.compute.resource_tracker [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct 14 06:13:31 localhost nova_compute[297686]: 2025-10-14 10:13:31.160 2 DEBUG nova.compute.resource_tracker [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Final resource view: name=np0005486733.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct 14 06:13:31 localhost nova_compute[297686]: 2025-10-14 10:13:31.230 2 DEBUG oslo_concurrency.processutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct 14 06:13:31 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:13:31.569 271987 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-10-14T10:13:31Z, description=, device_id=70bd5818-add3-400c-a56e-63ceccc1ec75, device_owner=network:router_gateway, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=a14eaf5a-efe9-4ad7-a8ad-4262f62c6b5c, ip_allocation=immediate, mac_address=fa:16:3e:5e:57:3f, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-10-14T08:36:38Z, description=, dns_domain=, id=c0145816-4627-44f2-af00-ccc9ef0436ed, ipv4_address_scope=None, ipv6_address_scope=None, is_default=False, l2_adjacency=True, mtu=1350, name=public, port_security_enabled=True, project_id=41187b090f3d4818a32baa37ce8a3991, provider:network_type=flat, provider:physical_network=datacentre, provider:segmentation_id=None, qos_policy_id=None, revision_number=2, router:external=True, shared=False, standard_attr_id=29, status=ACTIVE, subnets=['bbd40dc1-97d8-4ed3-9d4f-9b3af758b526'], tags=[], tenant_id=41187b090f3d4818a32baa37ce8a3991, updated_at=2025-10-14T08:36:44Z, vlan_transparent=None, network_id=c0145816-4627-44f2-af00-ccc9ef0436ed, port_security_enabled=False, project_id=, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1110, status=DOWN, tags=[], tenant_id=, updated_at=2025-10-14T10:13:31Z on network c0145816-4627-44f2-af00-ccc9ef0436ed#033[00m
Oct 14 06:13:31 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Oct 14 06:13:31 localhost ceph-mon[317114]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/2918743247' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Oct 14 06:13:31 localhost nova_compute[297686]: 2025-10-14 10:13:31.723 2 DEBUG oslo_concurrency.processutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.494s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct 14 06:13:31 localhost nova_compute[297686]: 2025-10-14 10:13:31.730 2 DEBUG nova.compute.provider_tree [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Inventory has not changed in ProviderTree for provider: 18c24273-aca2-4f08-be57-3188d558235e update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct 14 06:13:31 localhost nova_compute[297686]: 2025-10-14 10:13:31.750 2 DEBUG nova.scheduler.client.report [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Inventory has not changed for provider 18c24273-aca2-4f08-be57-3188d558235e based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct 14 06:13:31 localhost nova_compute[297686]: 2025-10-14 10:13:31.753 2 DEBUG nova.compute.resource_tracker [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Compute_service record updated for np0005486733.localdomain:np0005486733.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct 14 06:13:31 localhost nova_compute[297686]: 2025-10-14 10:13:31.754 2 DEBUG oslo_concurrency.lockutils [None
req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.699s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Oct 14 06:13:31 localhost podman[326515]: 2025-10-14 10:13:31.798368074 +0000 UTC m=+0.063403739 container kill 27e23fba1f10c3118d508631d350793857acd97085c399f85eed727cd1eab7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0145816-4627-44f2-af00-ccc9ef0436ed, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, io.buildah.version=1.41.3) Oct 14 06:13:31 localhost dnsmasq[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/addn_hosts - 4 addresses Oct 14 06:13:31 localhost dnsmasq-dhcp[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/host Oct 14 06:13:31 localhost dnsmasq-dhcp[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/opts Oct 14 06:13:32 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:13:32.011 271987 INFO neutron.agent.dhcp.agent [None req-b76c2c2d-6fb6-4f9b-927f-08193bb2f609 - - - - - -] DHCP configuration for ports {'a14eaf5a-efe9-4ad7-a8ad-4262f62c6b5c'} is completed#033[00m Oct 14 06:13:32 localhost dnsmasq-dhcp[325837]: DHCPRELEASE(tapa4b30293-43) 192.168.122.227 fa:16:3e:17:d1:73 Oct 14 06:13:32 localhost nova_compute[297686]: 2025-10-14 10:13:32.367 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:13:32 localhost podman[326555]: 2025-10-14 10:13:32.670081632 +0000 UTC m=+0.051189173 container kill 
27e23fba1f10c3118d508631d350793857acd97085c399f85eed727cd1eab7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0145816-4627-44f2-af00-ccc9ef0436ed, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS) Oct 14 06:13:32 localhost dnsmasq[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/addn_hosts - 3 addresses Oct 14 06:13:32 localhost dnsmasq-dhcp[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/host Oct 14 06:13:32 localhost dnsmasq-dhcp[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/opts Oct 14 06:13:32 localhost nova_compute[297686]: 2025-10-14 10:13:32.673 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:13:32 localhost ovn_controller[157396]: 2025-10-14T10:13:32Z|00125|binding|INFO|Releasing lport 25c6586a-239c-451b-aac2-e0a3ee5c3145 from this chassis (sb_readonly=0) Oct 14 06:13:32 localhost nova_compute[297686]: 2025-10-14 10:13:32.783 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:13:33 localhost systemd[1]: tmp-crun.GEljP7.mount: Deactivated successfully. 
Oct 14 06:13:33 localhost dnsmasq[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/addn_hosts - 2 addresses Oct 14 06:13:33 localhost dnsmasq-dhcp[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/host Oct 14 06:13:33 localhost dnsmasq-dhcp[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/opts Oct 14 06:13:33 localhost podman[326593]: 2025-10-14 10:13:33.991215026 +0000 UTC m=+0.053478554 container kill 27e23fba1f10c3118d508631d350793857acd97085c399f85eed727cd1eab7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0145816-4627-44f2-af00-ccc9ef0436ed, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251009, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0) Oct 14 06:13:34 localhost ovn_controller[157396]: 2025-10-14T10:13:34Z|00126|binding|INFO|Releasing lport 25c6586a-239c-451b-aac2-e0a3ee5c3145 from this chassis (sb_readonly=0) Oct 14 06:13:34 localhost nova_compute[297686]: 2025-10-14 10:13:34.083 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:13:34 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Oct 14 06:13:34 localhost nova_compute[297686]: 2025-10-14 10:13:34.754 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:13:34 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:13:34.766 271987 
INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-10-14T10:13:34Z, description=, device_id=1c2c02fc-1a5b-40c2-9734-0c378885441f, device_owner=network:router_gateway, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=f6826d99-adb0-4351-9aab-22eed1eb8a53, ip_allocation=immediate, mac_address=fa:16:3e:c6:6f:5d, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-10-14T08:36:38Z, description=, dns_domain=, id=c0145816-4627-44f2-af00-ccc9ef0436ed, ipv4_address_scope=None, ipv6_address_scope=None, is_default=False, l2_adjacency=True, mtu=1350, name=public, port_security_enabled=True, project_id=41187b090f3d4818a32baa37ce8a3991, provider:network_type=flat, provider:physical_network=datacentre, provider:segmentation_id=None, qos_policy_id=None, revision_number=2, router:external=True, shared=False, standard_attr_id=29, status=ACTIVE, subnets=['bbd40dc1-97d8-4ed3-9d4f-9b3af758b526'], tags=[], tenant_id=41187b090f3d4818a32baa37ce8a3991, updated_at=2025-10-14T08:36:44Z, vlan_transparent=None, network_id=c0145816-4627-44f2-af00-ccc9ef0436ed, port_security_enabled=False, project_id=, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1124, status=DOWN, tags=[], tenant_id=, updated_at=2025-10-14T10:13:34Z on network c0145816-4627-44f2-af00-ccc9ef0436ed#033[00m Oct 14 06:13:34 localhost nova_compute[297686]: 2025-10-14 10:13:34.829 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:13:34 localhost dnsmasq[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/addn_hosts - 3 addresses Oct 14 06:13:34 localhost 
dnsmasq-dhcp[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/host Oct 14 06:13:34 localhost dnsmasq-dhcp[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/opts Oct 14 06:13:34 localhost podman[326629]: 2025-10-14 10:13:34.961606597 +0000 UTC m=+0.042799116 container kill 27e23fba1f10c3118d508631d350793857acd97085c399f85eed727cd1eab7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0145816-4627-44f2-af00-ccc9ef0436ed, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true) Oct 14 06:13:35 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:13:35.338 271987 INFO neutron.agent.dhcp.agent [None req-6975364d-ed5a-4cd7-b311-d0688abd94ad - - - - - -] DHCP configuration for ports {'f6826d99-adb0-4351-9aab-22eed1eb8a53'} is completed#033[00m Oct 14 06:13:35 localhost systemd[1]: tmp-crun.f0pm6u.mount: Deactivated successfully. 
Oct 14 06:13:35 localhost dnsmasq[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/addn_hosts - 2 addresses Oct 14 06:13:35 localhost dnsmasq-dhcp[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/host Oct 14 06:13:35 localhost podman[326667]: 2025-10-14 10:13:35.89631347 +0000 UTC m=+0.071318972 container kill 27e23fba1f10c3118d508631d350793857acd97085c399f85eed727cd1eab7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0145816-4627-44f2-af00-ccc9ef0436ed, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS) Oct 14 06:13:35 localhost dnsmasq-dhcp[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/opts Oct 14 06:13:36 localhost nova_compute[297686]: 2025-10-14 10:13:36.327 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:13:36 localhost ovn_metadata_agent[163050]: 2025-10-14 10:13:36.329 163055 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=12, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'b6:6b:50', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '6a:59:81:01:bc:8b'}, ipsec=False) old=SB_Global(nb_cfg=11) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Oct 14 06:13:36 localhost ovn_metadata_agent[163050]: 2025-10-14 
10:13:36.330 163055 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m Oct 14 06:13:36 localhost ovn_controller[157396]: 2025-10-14T10:13:36Z|00127|binding|INFO|Releasing lport 25c6586a-239c-451b-aac2-e0a3ee5c3145 from this chassis (sb_readonly=0) Oct 14 06:13:36 localhost nova_compute[297686]: 2025-10-14 10:13:36.512 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:13:37 localhost nova_compute[297686]: 2025-10-14 10:13:37.097 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:13:37 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:13:37.313 271987 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-10-14T10:13:36Z, description=, device_id=77a779d0-6973-4a94-b166-f3a0a2bf564a, device_owner=network:router_gateway, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=1ccba242-0ba4-488d-8034-b4b6627e734f, ip_allocation=immediate, mac_address=fa:16:3e:af:4f:b3, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-10-14T08:36:38Z, description=, dns_domain=, id=c0145816-4627-44f2-af00-ccc9ef0436ed, ipv4_address_scope=None, ipv6_address_scope=None, is_default=False, l2_adjacency=True, mtu=1350, name=public, port_security_enabled=True, project_id=41187b090f3d4818a32baa37ce8a3991, provider:network_type=flat, provider:physical_network=datacentre, provider:segmentation_id=None, qos_policy_id=None, revision_number=2, router:external=True, shared=False, standard_attr_id=29, status=ACTIVE, 
subnets=['bbd40dc1-97d8-4ed3-9d4f-9b3af758b526'], tags=[], tenant_id=41187b090f3d4818a32baa37ce8a3991, updated_at=2025-10-14T08:36:44Z, vlan_transparent=None, network_id=c0145816-4627-44f2-af00-ccc9ef0436ed, port_security_enabled=False, project_id=, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1144, status=DOWN, tags=[], tenant_id=, updated_at=2025-10-14T10:13:37Z on network c0145816-4627-44f2-af00-ccc9ef0436ed#033[00m Oct 14 06:13:37 localhost podman[326704]: 2025-10-14 10:13:37.548854145 +0000 UTC m=+0.060438727 container kill 27e23fba1f10c3118d508631d350793857acd97085c399f85eed727cd1eab7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0145816-4627-44f2-af00-ccc9ef0436ed, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, io.buildah.version=1.41.3) Oct 14 06:13:37 localhost dnsmasq[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/addn_hosts - 3 addresses Oct 14 06:13:37 localhost dnsmasq-dhcp[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/host Oct 14 06:13:37 localhost dnsmasq-dhcp[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/opts Oct 14 06:13:37 localhost nova_compute[297686]: 2025-10-14 10:13:37.675 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:13:37 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:13:37.832 271987 INFO neutron.agent.dhcp.agent [None req-f66df7a5-284a-4d2b-b553-17f3dcd55c2d - - - - - -] DHCP configuration for ports {'1ccba242-0ba4-488d-8034-b4b6627e734f'} is 
completed#033[00m Oct 14 06:13:38 localhost openstack_network_exporter[250374]: ERROR 10:13:38 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Oct 14 06:13:38 localhost openstack_network_exporter[250374]: ERROR 10:13:38 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Oct 14 06:13:38 localhost openstack_network_exporter[250374]: ERROR 10:13:38 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Oct 14 06:13:38 localhost openstack_network_exporter[250374]: ERROR 10:13:38 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Oct 14 06:13:38 localhost openstack_network_exporter[250374]: Oct 14 06:13:38 localhost openstack_network_exporter[250374]: ERROR 10:13:38 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Oct 14 06:13:38 localhost openstack_network_exporter[250374]: Oct 14 06:13:39 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Oct 14 06:13:39 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:13:39.489 271987 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-10-14T10:13:39Z, description=, device_id=e1110706-a6b5-47be-9b6a-a422545c8c3d, device_owner=network:router_gateway, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=9467cdb7-08d3-4436-99ac-41b44d72c960, ip_allocation=immediate, mac_address=fa:16:3e:fb:5d:8f, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-10-14T08:36:38Z, description=, dns_domain=, id=c0145816-4627-44f2-af00-ccc9ef0436ed, 
ipv4_address_scope=None, ipv6_address_scope=None, is_default=False, l2_adjacency=True, mtu=1350, name=public, port_security_enabled=True, project_id=41187b090f3d4818a32baa37ce8a3991, provider:network_type=flat, provider:physical_network=datacentre, provider:segmentation_id=None, qos_policy_id=None, revision_number=2, router:external=True, shared=False, standard_attr_id=29, status=ACTIVE, subnets=['bbd40dc1-97d8-4ed3-9d4f-9b3af758b526'], tags=[], tenant_id=41187b090f3d4818a32baa37ce8a3991, updated_at=2025-10-14T08:36:44Z, vlan_transparent=None, network_id=c0145816-4627-44f2-af00-ccc9ef0436ed, port_security_enabled=False, project_id=, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1152, status=DOWN, tags=[], tenant_id=, updated_at=2025-10-14T10:13:39Z on network c0145816-4627-44f2-af00-ccc9ef0436ed#033[00m Oct 14 06:13:39 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e130 e130: 6 total, 6 up, 6 in Oct 14 06:13:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f. Oct 14 06:13:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917. Oct 14 06:13:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d. Oct 14 06:13:39 localhost ceph-mon[317114]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #28. Immutable memtables: 0. 
Oct 14 06:13:39 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:13:39.672082) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0 Oct 14 06:13:39 localhost ceph-mon[317114]: rocksdb: [db/flush_job.cc:856] [default] [JOB 13] Flushing memtable with next log file: 28 Oct 14 06:13:39 localhost ceph-mon[317114]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760436819672164, "job": 13, "event": "flush_started", "num_memtables": 1, "num_entries": 1151, "num_deletes": 263, "total_data_size": 1468953, "memory_usage": 1498768, "flush_reason": "Manual Compaction"} Oct 14 06:13:39 localhost ceph-mon[317114]: rocksdb: [db/flush_job.cc:885] [default] [JOB 13] Level-0 flush table #29: started Oct 14 06:13:39 localhost ceph-mon[317114]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760436819679977, "cf_name": "default", "job": 13, "event": "table_file_creation", "file_number": 29, "file_size": 799631, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 19065, "largest_seqno": 20211, "table_properties": {"data_size": 795204, "index_size": 2026, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1349, "raw_key_size": 11524, "raw_average_key_size": 21, "raw_value_size": 785735, "raw_average_value_size": 1482, "num_data_blocks": 87, "num_entries": 530, "num_filter_entries": 530, "num_deletions": 263, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; 
max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1760436769, "oldest_key_time": 1760436769, "file_creation_time": 1760436819, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "c006d4a7-7ae8-4cd3-a1b5-95f5bf3c427c", "db_session_id": "ZS6W3A0Q266OBGAU6ARP", "orig_file_number": 29, "seqno_to_time_mapping": "N/A"}} Oct 14 06:13:39 localhost ceph-mon[317114]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 13] Flush lasted 8022 microseconds, and 4774 cpu microseconds. Oct 14 06:13:39 localhost ceph-mon[317114]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. Oct 14 06:13:39 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:13:39.680109) [db/flush_job.cc:967] [default] [JOB 13] Level-0 flush table #29: 799631 bytes OK Oct 14 06:13:39 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:13:39.680181) [db/memtable_list.cc:519] [default] Level-0 commit table #29 started Oct 14 06:13:39 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:13:39.682617) [db/memtable_list.cc:722] [default] Level-0 commit table #29: memtable #1 done Oct 14 06:13:39 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:13:39.682648) EVENT_LOG_v1 {"time_micros": 1760436819682638, "job": 13, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0} Oct 14 06:13:39 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:13:39.682708) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25 Oct 14 06:13:39 localhost ceph-mon[317114]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 13] Try to delete WAL files size 1463189, prev total WAL file 
size 1463189, number of live WAL files 2. Oct 14 06:13:39 localhost ceph-mon[317114]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005486733/store.db/000025.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Oct 14 06:13:39 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:13:39.683872) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740034303035' seq:72057594037927935, type:22 .. '6D6772737461740034323538' seq:0, type:0; will stop at (end) Oct 14 06:13:39 localhost ceph-mon[317114]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 14] Compacting 1@0 + 1@6 files to L6, score -1.00 Oct 14 06:13:39 localhost ceph-mon[317114]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 13 Base level 0, inputs: [29(780KB)], [27(16MB)] Oct 14 06:13:39 localhost ceph-mon[317114]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760436819683940, "job": 14, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [29], "files_L6": [27], "score": -1, "input_data_size": 17799013, "oldest_snapshot_seqno": -1} Oct 14 06:13:39 localhost podman[326740]: 2025-10-14 10:13:39.706809647 +0000 UTC m=+0.077178693 container kill 27e23fba1f10c3118d508631d350793857acd97085c399f85eed727cd1eab7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0145816-4627-44f2-af00-ccc9ef0436ed, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true) Oct 14 06:13:39 localhost dnsmasq[325837]: read 
/var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/addn_hosts - 4 addresses Oct 14 06:13:39 localhost dnsmasq-dhcp[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/host Oct 14 06:13:39 localhost dnsmasq-dhcp[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/opts Oct 14 06:13:39 localhost ceph-mon[317114]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 14] Generated table #30: 12404 keys, 15800231 bytes, temperature: kUnknown Oct 14 06:13:39 localhost ceph-mon[317114]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760436819748943, "cf_name": "default", "job": 14, "event": "table_file_creation", "file_number": 30, "file_size": 15800231, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 15732884, "index_size": 35246, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 31045, "raw_key_size": 336156, "raw_average_key_size": 27, "raw_value_size": 15524703, "raw_average_value_size": 1251, "num_data_blocks": 1305, "num_entries": 12404, "num_filter_entries": 12404, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1760436394, "oldest_key_time": 0, "file_creation_time": 1760436819, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "c006d4a7-7ae8-4cd3-a1b5-95f5bf3c427c", "db_session_id": "ZS6W3A0Q266OBGAU6ARP", 
"orig_file_number": 30, "seqno_to_time_mapping": "N/A"}} Oct 14 06:13:39 localhost ceph-mon[317114]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. Oct 14 06:13:39 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:13:39.749380) [db/compaction/compaction_job.cc:1663] [default] [JOB 14] Compacted 1@0 + 1@6 files to L6 => 15800231 bytes Oct 14 06:13:39 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:13:39.751508) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 273.0 rd, 242.3 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.8, 16.2 +0.0 blob) out(15.1 +0.0 blob), read-write-amplify(42.0) write-amplify(19.8) OK, records in: 12924, records dropped: 520 output_compression: NoCompression Oct 14 06:13:39 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:13:39.751551) EVENT_LOG_v1 {"time_micros": 1760436819751533, "job": 14, "event": "compaction_finished", "compaction_time_micros": 65207, "compaction_time_cpu_micros": 25727, "output_level": 6, "num_output_files": 1, "total_output_size": 15800231, "num_input_records": 12924, "num_output_records": 12404, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]} Oct 14 06:13:39 localhost ceph-mon[317114]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005486733/store.db/000029.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Oct 14 06:13:39 localhost ceph-mon[317114]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760436819751901, "job": 14, "event": "table_file_deletion", "file_number": 29} Oct 14 06:13:39 localhost ceph-mon[317114]: rocksdb: [file/delete_scheduler.cc:74] Deleted file 
/var/lib/ceph/mon/ceph-np0005486733/store.db/000027.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Oct 14 06:13:39 localhost ceph-mon[317114]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760436819754961, "job": 14, "event": "table_file_deletion", "file_number": 27} Oct 14 06:13:39 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:13:39.683825) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Oct 14 06:13:39 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:13:39.755051) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Oct 14 06:13:39 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:13:39.755056) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Oct 14 06:13:39 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:13:39.755058) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Oct 14 06:13:39 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:13:39.755059) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Oct 14 06:13:39 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:13:39.755060) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Oct 14 06:13:39 localhost systemd[1]: tmp-crun.ppC8Hr.mount: Deactivated successfully. 
Oct 14 06:13:39 localhost podman[326750]: 2025-10-14 10:13:39.818550279 +0000 UTC m=+0.148805993 container health_status 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, distribution-scope=public, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., release=1755695350, managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, io.buildah.version=1.33.7, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Oct 14 06:13:39 localhost podman[326751]: 2025-10-14 10:13:39.789790935 +0000 UTC m=+0.125742974 container health_status 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, tcib_managed=true, container_name=ceilometer_agent_compute, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, 
config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS) Oct 14 06:13:39 localhost nova_compute[297686]: 2025-10-14 10:13:39.874 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:13:39 localhost podman[326749]: 2025-10-14 10:13:39.880379899 +0000 UTC m=+0.217622247 container health_status 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': 
['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible) Oct 14 06:13:39 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:13:39.889 271987 INFO neutron.agent.dhcp.agent [None req-329de63e-c2fc-4502-98bd-01eb19b5cadf - - - - - -] DHCP configuration for ports {'9467cdb7-08d3-4436-99ac-41b44d72c960'} is completed#033[00m Oct 14 06:13:39 localhost podman[326751]: 2025-10-14 10:13:39.907092109 +0000 UTC m=+0.243044128 container exec_died 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251009, config_id=edpm, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 
'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true) Oct 14 06:13:39 localhost systemd[1]: 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d.service: Deactivated successfully. 
Oct 14 06:13:39 localhost podman[326750]: 2025-10-14 10:13:39.925967759 +0000 UTC m=+0.256223473 container exec_died 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1755695350, config_id=edpm, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, vendor=Red Hat, Inc., io.buildah.version=1.33.7, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, architecture=x86_64, managed_by=edpm_ansible, vcs-type=git, distribution-scope=public, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal) Oct 14 06:13:39 localhost systemd[1]: 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917.service: Deactivated successfully. 
Oct 14 06:13:39 localhost podman[326749]: 2025-10-14 10:13:39.963530042 +0000 UTC m=+0.300772440 container exec_died 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5) Oct 14 06:13:39 localhost systemd[1]: 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f.service: Deactivated successfully. 
Oct 14 06:13:40 localhost ovn_metadata_agent[163050]: 2025-10-14 10:13:40.332 163055 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=9e4b0f79-1220-4c7d-a18d-fa1a88dab362, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '12'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Oct 14 06:13:41 localhost nova_compute[297686]: 2025-10-14 10:13:41.502 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:13:41 localhost nova_compute[297686]: 2025-10-14 10:13:41.798 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:13:42 localhost neutron_sriov_agent[264974]: 2025-10-14 10:13:42.182 2 INFO neutron.agent.securitygroups_rpc [None req-0527c7af-f9c5-4631-a32d-7bf7bcdb7d05 879681508c614a5bb4766b7d8eed5096 570c1aeb24aa4b61a40c43f31c4e20b7 - - default default] Security group rule updated ['18fa6a68-e215-4844-8d40-fbc027948c6c']#033[00m Oct 14 06:13:42 localhost neutron_sriov_agent[264974]: 2025-10-14 10:13:42.459 2 INFO neutron.agent.securitygroups_rpc [None req-083bb6b5-9205-407f-ba9b-c4c8032fb0d4 879681508c614a5bb4766b7d8eed5096 570c1aeb24aa4b61a40c43f31c4e20b7 - - default default] Security group rule updated ['18fa6a68-e215-4844-8d40-fbc027948c6c']#033[00m Oct 14 06:13:42 localhost nova_compute[297686]: 2025-10-14 10:13:42.678 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:13:43 localhost dnsmasq[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/addn_hosts - 3 addresses Oct 14 06:13:43 localhost dnsmasq-dhcp[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/host Oct 14 06:13:43 localhost 
podman[326836]: 2025-10-14 10:13:43.000507487 +0000 UTC m=+0.062599324 container kill 27e23fba1f10c3118d508631d350793857acd97085c399f85eed727cd1eab7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0145816-4627-44f2-af00-ccc9ef0436ed, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS) Oct 14 06:13:43 localhost dnsmasq-dhcp[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/opts Oct 14 06:13:44 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e130 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Oct 14 06:13:44 localhost neutron_sriov_agent[264974]: 2025-10-14 10:13:44.422 2 INFO neutron.agent.securitygroups_rpc [None req-db5556c0-4291-4c69-9d50-e157ec56caa1 879681508c614a5bb4766b7d8eed5096 570c1aeb24aa4b61a40c43f31c4e20b7 - - default default] Security group rule updated ['d59bbc3e-1ac0-4cfe-b59e-52dd8a190279']#033[00m Oct 14 06:13:44 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:13:44.782 271987 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-10-14T10:13:44Z, description=, device_id=afb7ed85-68ad-4caf-ac27-afb0c98c0756, device_owner=network:router_gateway, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=32d8cb09-9935-40c6-ac41-c17044d74ae1, ip_allocation=immediate, mac_address=fa:16:3e:94:1f:14, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], 
created_at=2025-10-14T08:36:38Z, description=, dns_domain=, id=c0145816-4627-44f2-af00-ccc9ef0436ed, ipv4_address_scope=None, ipv6_address_scope=None, is_default=False, l2_adjacency=True, mtu=1350, name=public, port_security_enabled=True, project_id=41187b090f3d4818a32baa37ce8a3991, provider:network_type=flat, provider:physical_network=datacentre, provider:segmentation_id=None, qos_policy_id=None, revision_number=2, router:external=True, shared=False, standard_attr_id=29, status=ACTIVE, subnets=['bbd40dc1-97d8-4ed3-9d4f-9b3af758b526'], tags=[], tenant_id=41187b090f3d4818a32baa37ce8a3991, updated_at=2025-10-14T08:36:44Z, vlan_transparent=None, network_id=c0145816-4627-44f2-af00-ccc9ef0436ed, port_security_enabled=False, project_id=, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1194, status=DOWN, tags=[], tenant_id=, updated_at=2025-10-14T10:13:44Z on network c0145816-4627-44f2-af00-ccc9ef0436ed#033[00m Oct 14 06:13:44 localhost neutron_sriov_agent[264974]: 2025-10-14 10:13:44.861 2 INFO neutron.agent.securitygroups_rpc [None req-f2ac292c-ef84-4b5c-a694-d09fa1b3803e 879681508c614a5bb4766b7d8eed5096 570c1aeb24aa4b61a40c43f31c4e20b7 - - default default] Security group rule updated ['d59bbc3e-1ac0-4cfe-b59e-52dd8a190279']#033[00m Oct 14 06:13:44 localhost nova_compute[297686]: 2025-10-14 10:13:44.876 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:13:45 localhost dnsmasq[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/addn_hosts - 4 addresses Oct 14 06:13:45 localhost dnsmasq-dhcp[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/host Oct 14 06:13:45 localhost dnsmasq-dhcp[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/opts Oct 14 06:13:45 localhost systemd[1]: tmp-crun.AM1rUP.mount: Deactivated successfully. 
Oct 14 06:13:45 localhost podman[326876]: 2025-10-14 10:13:45.005790898 +0000 UTC m=+0.054333840 container kill 27e23fba1f10c3118d508631d350793857acd97085c399f85eed727cd1eab7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0145816-4627-44f2-af00-ccc9ef0436ed, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5) Oct 14 06:13:45 localhost neutron_sriov_agent[264974]: 2025-10-14 10:13:45.204 2 INFO neutron.agent.securitygroups_rpc [None req-57fbedb8-b01f-4a0e-8c7a-92eb222baaa6 879681508c614a5bb4766b7d8eed5096 570c1aeb24aa4b61a40c43f31c4e20b7 - - default default] Security group rule updated ['d59bbc3e-1ac0-4cfe-b59e-52dd8a190279']#033[00m Oct 14 06:13:45 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:13:45.270 271987 INFO neutron.agent.dhcp.agent [None req-4073fff3-fd53-41df-b44f-7ef8b9faabac - - - - - -] DHCP configuration for ports {'32d8cb09-9935-40c6-ac41-c17044d74ae1'} is completed#033[00m Oct 14 06:13:45 localhost neutron_sriov_agent[264974]: 2025-10-14 10:13:45.971 2 INFO neutron.agent.securitygroups_rpc [None req-1ff370d2-800b-4ca9-851e-65244add5b48 879681508c614a5bb4766b7d8eed5096 570c1aeb24aa4b61a40c43f31c4e20b7 - - default default] Security group rule updated ['d59bbc3e-1ac0-4cfe-b59e-52dd8a190279']#033[00m Oct 14 06:13:46 localhost nova_compute[297686]: 2025-10-14 10:13:46.409 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:13:46 localhost neutron_sriov_agent[264974]: 2025-10-14 10:13:46.435 2 INFO neutron.agent.securitygroups_rpc [None req-486abca4-76a9-447a-ba2e-d070f1a46a73 879681508c614a5bb4766b7d8eed5096 
570c1aeb24aa4b61a40c43f31c4e20b7 - - default default] Security group rule updated ['d59bbc3e-1ac0-4cfe-b59e-52dd8a190279']#033[00m Oct 14 06:13:46 localhost neutron_sriov_agent[264974]: 2025-10-14 10:13:46.791 2 INFO neutron.agent.securitygroups_rpc [None req-178455f2-b48f-405f-a703-3d79838eab5f 879681508c614a5bb4766b7d8eed5096 570c1aeb24aa4b61a40c43f31c4e20b7 - - default default] Security group rule updated ['d59bbc3e-1ac0-4cfe-b59e-52dd8a190279']#033[00m Oct 14 06:13:47 localhost neutron_sriov_agent[264974]: 2025-10-14 10:13:47.191 2 INFO neutron.agent.securitygroups_rpc [None req-a733e5ee-97cf-4111-870d-1addde4a2c97 879681508c614a5bb4766b7d8eed5096 570c1aeb24aa4b61a40c43f31c4e20b7 - - default default] Security group rule updated ['d59bbc3e-1ac0-4cfe-b59e-52dd8a190279']#033[00m Oct 14 06:13:47 localhost nova_compute[297686]: 2025-10-14 10:13:47.708 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:13:48 localhost neutron_sriov_agent[264974]: 2025-10-14 10:13:48.030 2 INFO neutron.agent.securitygroups_rpc [None req-551f5b9f-9bf9-4c95-94f1-8a21fc183e6a 879681508c614a5bb4766b7d8eed5096 570c1aeb24aa4b61a40c43f31c4e20b7 - - default default] Security group rule updated ['d59bbc3e-1ac0-4cfe-b59e-52dd8a190279']#033[00m Oct 14 06:13:48 localhost systemd[1]: tmp-crun.xT6PIV.mount: Deactivated successfully. 
Oct 14 06:13:48 localhost dnsmasq[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/addn_hosts - 3 addresses Oct 14 06:13:48 localhost dnsmasq-dhcp[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/host Oct 14 06:13:48 localhost dnsmasq-dhcp[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/opts Oct 14 06:13:48 localhost podman[326915]: 2025-10-14 10:13:48.376149003 +0000 UTC m=+0.062863881 container kill 27e23fba1f10c3118d508631d350793857acd97085c399f85eed727cd1eab7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0145816-4627-44f2-af00-ccc9ef0436ed, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5) Oct 14 06:13:48 localhost neutron_sriov_agent[264974]: 2025-10-14 10:13:48.493 2 INFO neutron.agent.securitygroups_rpc [None req-9ddcdf24-da6e-45e8-989e-284a653acb68 879681508c614a5bb4766b7d8eed5096 570c1aeb24aa4b61a40c43f31c4e20b7 - - default default] Security group rule updated ['d59bbc3e-1ac0-4cfe-b59e-52dd8a190279']#033[00m Oct 14 06:13:48 localhost ovn_controller[157396]: 2025-10-14T10:13:48Z|00128|binding|INFO|Releasing lport 25c6586a-239c-451b-aac2-e0a3ee5c3145 from this chassis (sb_readonly=0) Oct 14 06:13:48 localhost nova_compute[297686]: 2025-10-14 10:13:48.555 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:13:48 localhost dnsmasq[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/addn_hosts - 2 addresses Oct 14 06:13:48 localhost podman[326951]: 2025-10-14 10:13:48.755827187 +0000 UTC m=+0.070468425 container kill 
27e23fba1f10c3118d508631d350793857acd97085c399f85eed727cd1eab7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0145816-4627-44f2-af00-ccc9ef0436ed, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.build-date=20251009, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS) Oct 14 06:13:48 localhost dnsmasq-dhcp[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/host Oct 14 06:13:48 localhost dnsmasq-dhcp[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/opts Oct 14 06:13:48 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Oct 14 06:13:48 localhost ceph-mon[317114]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3094652079' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Oct 14 06:13:48 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Oct 14 06:13:48 localhost ceph-mon[317114]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.32:0/3094652079' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Oct 14 06:13:48 localhost neutron_sriov_agent[264974]: 2025-10-14 10:13:48.778 2 INFO neutron.agent.securitygroups_rpc [None req-6ab7816b-72de-46f1-aca3-41e8306cf7b8 879681508c614a5bb4766b7d8eed5096 570c1aeb24aa4b61a40c43f31c4e20b7 - - default default] Security group rule updated ['d59bbc3e-1ac0-4cfe-b59e-52dd8a190279']#033[00m Oct 14 06:13:49 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e130 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Oct 14 06:13:49 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:13:49.589 271987 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-10-14T10:13:49Z, description=, device_id=6ea45190-19c0-49f1-84c4-f893fcf66678, device_owner=network:router_gateway, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=71e0a901-d37a-45fa-8edb-3dad0dc7e685, ip_allocation=immediate, mac_address=fa:16:3e:65:57:4a, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-10-14T08:36:38Z, description=, dns_domain=, id=c0145816-4627-44f2-af00-ccc9ef0436ed, ipv4_address_scope=None, ipv6_address_scope=None, is_default=False, l2_adjacency=True, mtu=1350, name=public, port_security_enabled=True, project_id=41187b090f3d4818a32baa37ce8a3991, provider:network_type=flat, provider:physical_network=datacentre, provider:segmentation_id=None, qos_policy_id=None, revision_number=2, router:external=True, shared=False, standard_attr_id=29, status=ACTIVE, subnets=['bbd40dc1-97d8-4ed3-9d4f-9b3af758b526'], tags=[], tenant_id=41187b090f3d4818a32baa37ce8a3991, updated_at=2025-10-14T08:36:44Z, 
vlan_transparent=None, network_id=c0145816-4627-44f2-af00-ccc9ef0436ed, port_security_enabled=False, project_id=, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1210, status=DOWN, tags=[], tenant_id=, updated_at=2025-10-14T10:13:49Z on network c0145816-4627-44f2-af00-ccc9ef0436ed#033[00m Oct 14 06:13:49 localhost dnsmasq[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/addn_hosts - 3 addresses Oct 14 06:13:49 localhost dnsmasq-dhcp[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/host Oct 14 06:13:49 localhost podman[326987]: 2025-10-14 10:13:49.79371666 +0000 UTC m=+0.053822094 container kill 27e23fba1f10c3118d508631d350793857acd97085c399f85eed727cd1eab7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0145816-4627-44f2-af00-ccc9ef0436ed, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5) Oct 14 06:13:49 localhost dnsmasq-dhcp[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/opts Oct 14 06:13:49 localhost neutron_sriov_agent[264974]: 2025-10-14 10:13:49.850 2 INFO neutron.agent.securitygroups_rpc [None req-af3d35a0-3d99-4797-93fe-ba3c0c8619d5 879681508c614a5bb4766b7d8eed5096 570c1aeb24aa4b61a40c43f31c4e20b7 - - default default] Security group rule updated ['486a2e86-116d-4ae9-86f7-271e7452bc24']#033[00m Oct 14 06:13:49 localhost nova_compute[297686]: 2025-10-14 10:13:49.877 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:13:50 localhost neutron_dhcp_agent[271983]: 2025-10-14 
10:13:50.043 271987 INFO neutron.agent.dhcp.agent [None req-db30f52c-4ad6-416d-a2a0-f33e96dabca1 - - - - - -] DHCP configuration for ports {'71e0a901-d37a-45fa-8edb-3dad0dc7e685'} is completed#033[00m Oct 14 06:13:51 localhost neutron_sriov_agent[264974]: 2025-10-14 10:13:51.428 2 INFO neutron.agent.securitygroups_rpc [None req-f6fca79c-9728-44c4-882c-2bc25ce11c6c 879681508c614a5bb4766b7d8eed5096 570c1aeb24aa4b61a40c43f31c4e20b7 - - default default] Security group rule updated ['e38a69d3-2bbe-4b7f-80be-eb189b5e362a']#033[00m Oct 14 06:13:51 localhost neutron_sriov_agent[264974]: 2025-10-14 10:13:51.547 2 INFO neutron.agent.securitygroups_rpc [None req-16c62dab-88ad-4bfd-9aae-b337240e45da 879681508c614a5bb4766b7d8eed5096 570c1aeb24aa4b61a40c43f31c4e20b7 - - default default] Security group rule updated ['e38a69d3-2bbe-4b7f-80be-eb189b5e362a']#033[00m Oct 14 06:13:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041. Oct 14 06:13:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857. 
Oct 14 06:13:51 localhost podman[327007]: 2025-10-14 10:13:51.748794669 +0000 UTC m=+0.084485677 container health_status 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Oct 14 06:13:51 localhost podman[327007]: 2025-10-14 10:13:51.757942699 +0000 UTC m=+0.093633657 container exec_died 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', 
'/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Oct 14 06:13:51 localhost systemd[1]: 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041.service: Deactivated successfully. Oct 14 06:13:51 localhost podman[327008]: 2025-10-14 10:13:51.845792279 +0000 UTC m=+0.181418055 container health_status 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}) Oct 14 06:13:51 localhost podman[327008]: 2025-10-14 10:13:51.881218326 +0000 UTC m=+0.216844102 container exec_died 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, 
tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent) Oct 14 06:13:51 localhost systemd[1]: 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857.service: Deactivated successfully. Oct 14 06:13:52 localhost dnsmasq[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/addn_hosts - 2 addresses Oct 14 06:13:52 localhost dnsmasq-dhcp[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/host Oct 14 06:13:52 localhost dnsmasq-dhcp[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/opts Oct 14 06:13:52 localhost podman[327066]: 2025-10-14 10:13:52.179597735 +0000 UTC m=+0.059047700 container kill 27e23fba1f10c3118d508631d350793857acd97085c399f85eed727cd1eab7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0145816-4627-44f2-af00-ccc9ef0436ed, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3) Oct 14 06:13:52 localhost nova_compute[297686]: 2025-10-14 10:13:52.710 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:13:52 localhost neutron_sriov_agent[264974]: 2025-10-14 10:13:52.769 2 INFO neutron.agent.securitygroups_rpc [None req-1e03a068-cc38-49cd-a31d-8f40ce72f97f 879681508c614a5bb4766b7d8eed5096 570c1aeb24aa4b61a40c43f31c4e20b7 - - default default] Security group rule updated ['940479e4-7012-482c-a23a-4a0abd9edbc1']#033[00m Oct 14 06:13:52 localhost neutron_sriov_agent[264974]: 2025-10-14 10:13:52.993 2 INFO 
neutron.agent.securitygroups_rpc [None req-0c91c8fa-6f7f-4f82-8b4e-7cc1fcb4eb51 879681508c614a5bb4766b7d8eed5096 570c1aeb24aa4b61a40c43f31c4e20b7 - - default default] Security group rule updated ['940479e4-7012-482c-a23a-4a0abd9edbc1']#033[00m Oct 14 06:13:53 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:13:53.440 271987 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-10-14T10:13:52Z, description=, device_id=527f2ca1-1639-47ef-8030-ede4d031607c, device_owner=network:router_gateway, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=92294608-b5b6-434d-a273-79e842bf227c, ip_allocation=immediate, mac_address=fa:16:3e:01:2b:4b, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-10-14T08:36:38Z, description=, dns_domain=, id=c0145816-4627-44f2-af00-ccc9ef0436ed, ipv4_address_scope=None, ipv6_address_scope=None, is_default=False, l2_adjacency=True, mtu=1350, name=public, port_security_enabled=True, project_id=41187b090f3d4818a32baa37ce8a3991, provider:network_type=flat, provider:physical_network=datacentre, provider:segmentation_id=None, qos_policy_id=None, revision_number=2, router:external=True, shared=False, standard_attr_id=29, status=ACTIVE, subnets=['bbd40dc1-97d8-4ed3-9d4f-9b3af758b526'], tags=[], tenant_id=41187b090f3d4818a32baa37ce8a3991, updated_at=2025-10-14T08:36:44Z, vlan_transparent=None, network_id=c0145816-4627-44f2-af00-ccc9ef0436ed, port_security_enabled=False, project_id=, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1242, status=DOWN, tags=[], tenant_id=, updated_at=2025-10-14T10:13:53Z on network c0145816-4627-44f2-af00-ccc9ef0436ed#033[00m Oct 14 06:13:53 localhost dnsmasq[325837]: read 
/var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/addn_hosts - 3 addresses Oct 14 06:13:53 localhost podman[327101]: 2025-10-14 10:13:53.688098776 +0000 UTC m=+0.074999163 container kill 27e23fba1f10c3118d508631d350793857acd97085c399f85eed727cd1eab7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0145816-4627-44f2-af00-ccc9ef0436ed, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team) Oct 14 06:13:53 localhost dnsmasq-dhcp[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/host Oct 14 06:13:53 localhost dnsmasq-dhcp[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/opts Oct 14 06:13:53 localhost ovn_controller[157396]: 2025-10-14T10:13:53Z|00129|binding|INFO|Releasing lport 25c6586a-239c-451b-aac2-e0a3ee5c3145 from this chassis (sb_readonly=0) Oct 14 06:13:53 localhost nova_compute[297686]: 2025-10-14 10:13:53.875 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:13:53 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:13:53.910 271987 INFO neutron.agent.dhcp.agent [None req-3dc6249c-eda4-4875-a8ef-8e2f3d776d65 - - - - - -] DHCP configuration for ports {'92294608-b5b6-434d-a273-79e842bf227c'} is completed#033[00m Oct 14 06:13:54 localhost dnsmasq[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/addn_hosts - 2 addresses Oct 14 06:13:54 localhost dnsmasq-dhcp[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/host Oct 14 06:13:54 localhost dnsmasq-dhcp[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/opts Oct 
14 06:13:54 localhost podman[327140]: 2025-10-14 10:13:54.053738792 +0000 UTC m=+0.066741588 container kill 27e23fba1f10c3118d508631d350793857acd97085c399f85eed727cd1eab7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0145816-4627-44f2-af00-ccc9ef0436ed, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.schema-version=1.0) Oct 14 06:13:54 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e130 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Oct 14 06:13:54 localhost neutron_sriov_agent[264974]: 2025-10-14 10:13:54.406 2 INFO neutron.agent.securitygroups_rpc [None req-edfa6b41-f8e8-4b3a-b07b-6ba628b8b1f8 879681508c614a5bb4766b7d8eed5096 570c1aeb24aa4b61a40c43f31c4e20b7 - - default default] Security group rule updated ['b7f5a6b8-0995-4d3f-8fb3-d87e109ba0e1']#033[00m Oct 14 06:13:54 localhost nova_compute[297686]: 2025-10-14 10:13:54.877 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:13:55 localhost neutron_sriov_agent[264974]: 2025-10-14 10:13:55.191 2 INFO neutron.agent.securitygroups_rpc [None req-0772cae2-029e-4ff4-98c8-8d86758930be 879681508c614a5bb4766b7d8eed5096 570c1aeb24aa4b61a40c43f31c4e20b7 - - default default] Security group rule updated ['b7f5a6b8-0995-4d3f-8fb3-d87e109ba0e1']#033[00m Oct 14 06:13:55 localhost neutron_sriov_agent[264974]: 2025-10-14 10:13:55.471 2 INFO neutron.agent.securitygroups_rpc [None req-a1ca05b4-df34-4178-a855-6749f59c06d9 879681508c614a5bb4766b7d8eed5096 570c1aeb24aa4b61a40c43f31c4e20b7 - - default default] Security group rule updated 
['b7f5a6b8-0995-4d3f-8fb3-d87e109ba0e1']#033[00m Oct 14 06:13:55 localhost neutron_sriov_agent[264974]: 2025-10-14 10:13:55.684 2 INFO neutron.agent.securitygroups_rpc [None req-3365dbb0-8534-4e33-9dda-3f8fb1a87a68 879681508c614a5bb4766b7d8eed5096 570c1aeb24aa4b61a40c43f31c4e20b7 - - default default] Security group rule updated ['b7f5a6b8-0995-4d3f-8fb3-d87e109ba0e1']#033[00m Oct 14 06:13:55 localhost dnsmasq[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/addn_hosts - 1 addresses Oct 14 06:13:55 localhost podman[327178]: 2025-10-14 10:13:55.764741336 +0000 UTC m=+0.039205765 container kill 27e23fba1f10c3118d508631d350793857acd97085c399f85eed727cd1eab7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0145816-4627-44f2-af00-ccc9ef0436ed, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image) Oct 14 06:13:55 localhost dnsmasq-dhcp[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/host Oct 14 06:13:55 localhost dnsmasq-dhcp[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/opts Oct 14 06:13:56 localhost neutron_sriov_agent[264974]: 2025-10-14 10:13:56.110 2 INFO neutron.agent.securitygroups_rpc [None req-16528115-3ff4-4682-9514-f888a1e69c41 879681508c614a5bb4766b7d8eed5096 570c1aeb24aa4b61a40c43f31c4e20b7 - - default default] Security group rule updated ['b7f5a6b8-0995-4d3f-8fb3-d87e109ba0e1']#033[00m Oct 14 06:13:56 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:13:56.367 271987 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, 
binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-10-14T10:13:56Z, description=, device_id=bd6fcc58-4820-4a78-bb5b-e812cab290c3, device_owner=network:router_gateway, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=f0c3972a-27c5-4816-98a4-c73fe4408983, ip_allocation=immediate, mac_address=fa:16:3e:86:06:1f, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-10-14T08:36:38Z, description=, dns_domain=, id=c0145816-4627-44f2-af00-ccc9ef0436ed, ipv4_address_scope=None, ipv6_address_scope=None, is_default=False, l2_adjacency=True, mtu=1350, name=public, port_security_enabled=True, project_id=41187b090f3d4818a32baa37ce8a3991, provider:network_type=flat, provider:physical_network=datacentre, provider:segmentation_id=None, qos_policy_id=None, revision_number=2, router:external=True, shared=False, standard_attr_id=29, status=ACTIVE, subnets=['bbd40dc1-97d8-4ed3-9d4f-9b3af758b526'], tags=[], tenant_id=41187b090f3d4818a32baa37ce8a3991, updated_at=2025-10-14T08:36:44Z, vlan_transparent=None, network_id=c0145816-4627-44f2-af00-ccc9ef0436ed, port_security_enabled=False, project_id=, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1265, status=DOWN, tags=[], tenant_id=, updated_at=2025-10-14T10:13:56Z on network c0145816-4627-44f2-af00-ccc9ef0436ed#033[00m Oct 14 06:13:56 localhost neutron_sriov_agent[264974]: 2025-10-14 10:13:56.373 2 INFO neutron.agent.securitygroups_rpc [None req-adcc8454-1ba4-4bed-a465-5ee863584558 879681508c614a5bb4766b7d8eed5096 570c1aeb24aa4b61a40c43f31c4e20b7 - - default default] Security group rule updated ['b7f5a6b8-0995-4d3f-8fb3-d87e109ba0e1']#033[00m Oct 14 06:13:56 localhost dnsmasq[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/addn_hosts - 2 addresses Oct 14 06:13:56 localhost dnsmasq-dhcp[325837]: read 
/var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/host Oct 14 06:13:56 localhost dnsmasq-dhcp[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/opts Oct 14 06:13:56 localhost podman[327214]: 2025-10-14 10:13:56.536614373 +0000 UTC m=+0.040896757 container kill 27e23fba1f10c3118d508631d350793857acd97085c399f85eed727cd1eab7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0145816-4627-44f2-af00-ccc9ef0436ed, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5) Oct 14 06:13:56 localhost systemd[1]: tmp-crun.mj5S2p.mount: Deactivated successfully. Oct 14 06:13:56 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:13:56.803 271987 INFO neutron.agent.dhcp.agent [None req-2a1e2db1-333c-460a-b416-ce5ab7b0731a - - - - - -] DHCP configuration for ports {'f0c3972a-27c5-4816-98a4-c73fe4408983'} is completed#033[00m Oct 14 06:13:57 localhost neutron_sriov_agent[264974]: 2025-10-14 10:13:57.112 2 INFO neutron.agent.securitygroups_rpc [None req-4043034a-33c2-49f3-ae52-f643d5505d9f 879681508c614a5bb4766b7d8eed5096 570c1aeb24aa4b61a40c43f31c4e20b7 - - default default] Security group rule updated ['f8522dc9-f9f0-4f2e-9ce4-94b34244a5fd']#033[00m Oct 14 06:13:57 localhost nova_compute[297686]: 2025-10-14 10:13:57.770 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:13:57 localhost ovn_metadata_agent[163050]: 2025-10-14 10:13:57.782 163055 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner 
/usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Oct 14 06:13:57 localhost ovn_metadata_agent[163050]: 2025-10-14 10:13:57.783 163055 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Oct 14 06:13:57 localhost ovn_metadata_agent[163050]: 2025-10-14 10:13:57.783 163055 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Oct 14 06:13:57 localhost dnsmasq[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/addn_hosts - 1 addresses Oct 14 06:13:57 localhost dnsmasq-dhcp[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/host Oct 14 06:13:57 localhost dnsmasq-dhcp[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/opts Oct 14 06:13:57 localhost podman[327252]: 2025-10-14 10:13:57.959814893 +0000 UTC m=+0.052439354 container kill 27e23fba1f10c3118d508631d350793857acd97085c399f85eed727cd1eab7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0145816-4627-44f2-af00-ccc9ef0436ed, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5) Oct 14 06:13:58 localhost podman[248187]: time="2025-10-14T10:13:58Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Oct 14 06:13:58 localhost podman[248187]: @ - - 
[14/Oct/2025:10:13:58 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 147497 "" "Go-http-client/1.1" Oct 14 06:13:58 localhost podman[248187]: @ - - [14/Oct/2025:10:13:58 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19863 "" "Go-http-client/1.1" Oct 14 06:13:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad. Oct 14 06:13:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec. Oct 14 06:13:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29. Oct 14 06:13:58 localhost podman[327274]: 2025-10-14 10:13:58.746657294 +0000 UTC m=+0.084144507 container health_status fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, container_name=iscsid, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS) Oct 14 06:13:58 localhost podman[327272]: 2025-10-14 10:13:58.777102037 +0000 UTC m=+0.119436321 container health_status 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, config_id=multipathd) Oct 14 06:13:58 localhost podman[327272]: 2025-10-14 10:13:58.787003684 +0000 UTC m=+0.129337958 container exec_died 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, tcib_managed=true, 
org.label-schema.vendor=CentOS, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team) Oct 14 06:13:58 localhost systemd[1]: 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad.service: Deactivated successfully. Oct 14 06:13:58 localhost podman[327274]: 2025-10-14 10:13:58.830027826 +0000 UTC m=+0.167515049 container exec_died fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, config_id=iscsid, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', 
'/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, org.label-schema.schema-version=1.0) Oct 14 06:13:58 localhost systemd[1]: fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29.service: Deactivated successfully. Oct 14 06:13:58 localhost podman[327273]: 2025-10-14 10:13:58.850867361 +0000 UTC m=+0.190086247 container health_status aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm) Oct 14 06:13:58 localhost podman[327273]: 2025-10-14 10:13:58.857895539 +0000 UTC m=+0.197114415 container exec_died aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec 
(image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter) Oct 14 06:13:58 localhost systemd[1]: aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec.service: Deactivated successfully. 
Oct 14 06:13:59 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e130 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Oct 14 06:13:59 localhost nova_compute[297686]: 2025-10-14 10:13:59.904 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:14:02 localhost nova_compute[297686]: 2025-10-14 10:14:02.820 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:14:04 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e130 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Oct 14 06:14:04 localhost nova_compute[297686]: 2025-10-14 10:14:04.946 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:14:07 localhost nova_compute[297686]: 2025-10-14 10:14:07.824 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:14:08 localhost sshd[327332]: main: sshd: ssh-rsa algorithm is disabled Oct 14 06:14:08 localhost openstack_network_exporter[250374]: ERROR 10:14:08 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Oct 14 06:14:08 localhost openstack_network_exporter[250374]: ERROR 10:14:08 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Oct 14 06:14:08 localhost openstack_network_exporter[250374]: ERROR 10:14:08 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Oct 14 06:14:08 localhost openstack_network_exporter[250374]: ERROR 10:14:08 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Oct 14 06:14:08 localhost 
openstack_network_exporter[250374]: Oct 14 06:14:08 localhost openstack_network_exporter[250374]: ERROR 10:14:08 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Oct 14 06:14:08 localhost openstack_network_exporter[250374]: Oct 14 06:14:09 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e130 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Oct 14 06:14:10 localhost nova_compute[297686]: 2025-10-14 10:14:10.015 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:14:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f. Oct 14 06:14:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917. Oct 14 06:14:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d. Oct 14 06:14:10 localhost systemd[1]: tmp-crun.MoBQJr.mount: Deactivated successfully. 
Oct 14 06:14:10 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:14:10.351 271987 INFO neutron.agent.linux.ip_lib [None req-088a7c3d-db6c-41e0-bf99-b28e4d5f1084 - - - - - -] Device tapc15d7099-2f cannot be used as it has no MAC address#033[00m Oct 14 06:14:10 localhost podman[327344]: 2025-10-14 10:14:10.367744512 +0000 UTC m=+0.094207469 container health_status 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.build-date=20251009, container_name=ceilometer_agent_compute, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.vendor=CentOS, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, org.label-schema.schema-version=1.0, 
managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image) Oct 14 06:14:10 localhost nova_compute[297686]: 2025-10-14 10:14:10.379 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:14:10 localhost podman[327344]: 2025-10-14 10:14:10.380040693 +0000 UTC m=+0.106503650 container exec_died 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, 
io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS) Oct 14 06:14:10 localhost kernel: device tapc15d7099-2f entered promiscuous mode Oct 14 06:14:10 localhost ovn_controller[157396]: 2025-10-14T10:14:10Z|00130|binding|INFO|Claiming lport c15d7099-2fc6-4909-a860-4177c78e3521 for this chassis. Oct 14 06:14:10 localhost ovn_controller[157396]: 2025-10-14T10:14:10Z|00131|binding|INFO|c15d7099-2fc6-4909-a860-4177c78e3521: Claiming unknown Oct 14 06:14:10 localhost NetworkManager[5977]: [1760436850.3889] manager: (tapc15d7099-2f): new Generic device (/org/freedesktop/NetworkManager/Devices/26) Oct 14 06:14:10 localhost nova_compute[297686]: 2025-10-14 10:14:10.389 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:14:10 localhost systemd-udevd[327394]: Network interface NamePolicy= disabled on kernel command line. 
Oct 14 06:14:10 localhost ovn_metadata_agent[163050]: 2025-10-14 10:14:10.400 163055 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005486733.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpe1893814-19d7-5b97-af05-19e2aafa5382-facf8644-ef1a-475d-ac27-20217b05a00d', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-facf8644-ef1a-475d-ac27-20217b05a00d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e549874548c54dd8b3b10588bdd2eec9', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d3b66c3b-a47b-46d3-bf07-94622ef97a0f, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=c15d7099-2fc6-4909-a860-4177c78e3521) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Oct 14 06:14:10 localhost ovn_metadata_agent[163050]: 2025-10-14 10:14:10.402 163055 INFO neutron.agent.ovn.metadata.agent [-] Port c15d7099-2fc6-4909-a860-4177c78e3521 in datapath facf8644-ef1a-475d-ac27-20217b05a00d bound to our chassis#033[00m Oct 14 06:14:10 localhost ovn_controller[157396]: 2025-10-14T10:14:10Z|00132|binding|INFO|Setting lport c15d7099-2fc6-4909-a860-4177c78e3521 ovn-installed in OVS Oct 14 06:14:10 localhost ovn_controller[157396]: 2025-10-14T10:14:10Z|00133|binding|INFO|Setting lport c15d7099-2fc6-4909-a860-4177c78e3521 up in Southbound 
Oct 14 06:14:10 localhost ovn_metadata_agent[163050]: 2025-10-14 10:14:10.408 163055 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network facf8644-ef1a-475d-ac27-20217b05a00d or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Oct 14 06:14:10 localhost nova_compute[297686]: 2025-10-14 10:14:10.408 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:14:10 localhost ovn_metadata_agent[163050]: 2025-10-14 10:14:10.409 163159 DEBUG oslo.privsep.daemon [-] privsep: reply[9167bbee-07ef-4021-9c13-dee9146784ba]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Oct 14 06:14:10 localhost systemd[1]: 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d.service: Deactivated successfully. Oct 14 06:14:10 localhost nova_compute[297686]: 2025-10-14 10:14:10.411 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:14:10 localhost journal[237477]: ethtool ioctl error on tapc15d7099-2f: No such device Oct 14 06:14:10 localhost nova_compute[297686]: 2025-10-14 10:14:10.415 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:14:10 localhost journal[237477]: ethtool ioctl error on tapc15d7099-2f: No such device Oct 14 06:14:10 localhost journal[237477]: ethtool ioctl error on tapc15d7099-2f: No such device Oct 14 06:14:10 localhost journal[237477]: ethtool ioctl error on tapc15d7099-2f: No such device Oct 14 06:14:10 localhost journal[237477]: ethtool ioctl error on tapc15d7099-2f: No such device Oct 14 06:14:10 localhost journal[237477]: ethtool ioctl error on tapc15d7099-2f: No such device Oct 14 
06:14:10 localhost journal[237477]: ethtool ioctl error on tapc15d7099-2f: No such device Oct 14 06:14:10 localhost nova_compute[297686]: 2025-10-14 10:14:10.442 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:14:10 localhost journal[237477]: ethtool ioctl error on tapc15d7099-2f: No such device Oct 14 06:14:10 localhost podman[327336]: 2025-10-14 10:14:10.350741894 +0000 UTC m=+0.090538004 container health_status 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, managed_by=edpm_ansible, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5) Oct 14 06:14:10 localhost podman[327337]: 2025-10-14 10:14:10.458643937 +0000 UTC m=+0.189521391 container health_status 
799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, vcs-type=git, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, managed_by=edpm_ansible, vendor=Red Hat, Inc., version=9.6, distribution-scope=public, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', 
'/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.) Oct 14 06:14:10 localhost nova_compute[297686]: 2025-10-14 10:14:10.468 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:14:10 localhost podman[327336]: 2025-10-14 10:14:10.482159875 +0000 UTC m=+0.221955965 container exec_died 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team) Oct 14 06:14:10 localhost systemd[1]: 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f.service: Deactivated successfully. Oct 14 06:14:10 localhost podman[327337]: 2025-10-14 10:14:10.495207469 +0000 UTC m=+0.226084913 container exec_died 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, version=9.6, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, release=1755695350, managed_by=edpm_ansible, 
distribution-scope=public, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, config_id=edpm, io.buildah.version=1.33.7, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b) Oct 14 06:14:10 localhost systemd[1]: 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917.service: Deactivated successfully. 
Oct 14 06:14:11 localhost podman[327475]: Oct 14 06:14:11 localhost podman[327475]: 2025-10-14 10:14:11.405454493 +0000 UTC m=+0.094840659 container create db7cd5133cd02b0f297aafd5f57c2c4521c6ac60b5d104a79efcb54a1d9cd939 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-facf8644-ef1a-475d-ac27-20217b05a00d, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251009, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0) Oct 14 06:14:11 localhost systemd[1]: Started libpod-conmon-db7cd5133cd02b0f297aafd5f57c2c4521c6ac60b5d104a79efcb54a1d9cd939.scope. Oct 14 06:14:11 localhost podman[327475]: 2025-10-14 10:14:11.359534401 +0000 UTC m=+0.048920587 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Oct 14 06:14:11 localhost systemd[1]: tmp-crun.FSKfJD.mount: Deactivated successfully. Oct 14 06:14:11 localhost systemd[1]: Started libcrun container. 
Oct 14 06:14:11 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bd81577dd3c6bf8ccc6b483aa6db60f256dcb754f9805520a56b3cbc77c15683/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Oct 14 06:14:11 localhost podman[327475]: 2025-10-14 10:14:11.496940537 +0000 UTC m=+0.186326693 container init db7cd5133cd02b0f297aafd5f57c2c4521c6ac60b5d104a79efcb54a1d9cd939 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-facf8644-ef1a-475d-ac27-20217b05a00d, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20251009, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image) Oct 14 06:14:11 localhost podman[327475]: 2025-10-14 10:14:11.506302726 +0000 UTC m=+0.195688882 container start db7cd5133cd02b0f297aafd5f57c2c4521c6ac60b5d104a79efcb54a1d9cd939 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-facf8644-ef1a-475d-ac27-20217b05a00d, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251009, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5) Oct 14 06:14:11 localhost dnsmasq[327493]: started, version 2.85 cachesize 150 Oct 14 06:14:11 localhost dnsmasq[327493]: DNS service limited to local subnets Oct 14 06:14:11 localhost dnsmasq[327493]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Oct 14 06:14:11 localhost dnsmasq[327493]: warning: no upstream servers 
configured Oct 14 06:14:11 localhost dnsmasq-dhcp[327493]: DHCP, static leases only on 10.100.0.0, lease time 1d Oct 14 06:14:11 localhost dnsmasq[327493]: read /var/lib/neutron/dhcp/facf8644-ef1a-475d-ac27-20217b05a00d/addn_hosts - 0 addresses Oct 14 06:14:11 localhost dnsmasq-dhcp[327493]: read /var/lib/neutron/dhcp/facf8644-ef1a-475d-ac27-20217b05a00d/host Oct 14 06:14:11 localhost dnsmasq-dhcp[327493]: read /var/lib/neutron/dhcp/facf8644-ef1a-475d-ac27-20217b05a00d/opts Oct 14 06:14:11 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:14:11.691 271987 INFO neutron.agent.dhcp.agent [None req-c4d61b6b-4121-4779-9af8-c4357fbe5131 - - - - - -] DHCP configuration for ports {'0035c44e-ef81-4156-be97-3c5fe8b40daf'} is completed#033[00m Oct 14 06:14:12 localhost neutron_sriov_agent[264974]: 2025-10-14 10:14:12.448 2 INFO neutron.agent.securitygroups_rpc [None req-a59b2844-7b34-4d6c-877c-dea7c23306b9 a119d95f2fc3446290208c405f40fc06 e549874548c54dd8b3b10588bdd2eec9 - - default default] Security group member updated ['f94f4b4b-b4b4-4fbc-9c6e-a13e840806df']#033[00m Oct 14 06:14:12 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:14:12.478 271987 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-10-14T10:14:12Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=7d6217b6-0a0f-477e-acbd-e3becdaa7a09, ip_allocation=immediate, mac_address=fa:16:3e:16:ff:9e, name=tempest-AllowedAddressPairTestJSON-1755245480, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-10-14T10:14:08Z, description=, dns_domain=, id=facf8644-ef1a-475d-ac27-20217b05a00d, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, 
name=tempest-AllowedAddressPairTestJSON-test-network-365348510, port_security_enabled=True, project_id=e549874548c54dd8b3b10588bdd2eec9, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=35284, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1317, status=ACTIVE, subnets=['4b0d4865-dc69-4840-b6af-c1e697a5fd12'], tags=[], tenant_id=e549874548c54dd8b3b10588bdd2eec9, updated_at=2025-10-14T10:14:09Z, vlan_transparent=None, network_id=facf8644-ef1a-475d-ac27-20217b05a00d, port_security_enabled=True, project_id=e549874548c54dd8b3b10588bdd2eec9, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['f94f4b4b-b4b4-4fbc-9c6e-a13e840806df'], standard_attr_id=1356, status=DOWN, tags=[], tenant_id=e549874548c54dd8b3b10588bdd2eec9, updated_at=2025-10-14T10:14:12Z on network facf8644-ef1a-475d-ac27-20217b05a00d#033[00m Oct 14 06:14:12 localhost dnsmasq[327493]: read /var/lib/neutron/dhcp/facf8644-ef1a-475d-ac27-20217b05a00d/addn_hosts - 1 addresses Oct 14 06:14:12 localhost systemd[1]: tmp-crun.z2LGvj.mount: Deactivated successfully. 
Oct 14 06:14:12 localhost dnsmasq-dhcp[327493]: read /var/lib/neutron/dhcp/facf8644-ef1a-475d-ac27-20217b05a00d/host Oct 14 06:14:12 localhost dnsmasq-dhcp[327493]: read /var/lib/neutron/dhcp/facf8644-ef1a-475d-ac27-20217b05a00d/opts Oct 14 06:14:12 localhost podman[327511]: 2025-10-14 10:14:12.688925875 +0000 UTC m=+0.059963727 container kill db7cd5133cd02b0f297aafd5f57c2c4521c6ac60b5d104a79efcb54a1d9cd939 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-facf8644-ef1a-475d-ac27-20217b05a00d, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true) Oct 14 06:14:12 localhost nova_compute[297686]: 2025-10-14 10:14:12.864 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:14:13 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:14:13.020 271987 INFO neutron.agent.dhcp.agent [None req-8f078efe-2dc7-42e7-b9a9-147c0e74aeac - - - - - -] DHCP configuration for ports {'7d6217b6-0a0f-477e-acbd-e3becdaa7a09'} is completed#033[00m Oct 14 06:14:13 localhost neutron_sriov_agent[264974]: 2025-10-14 10:14:13.152 2 INFO neutron.agent.securitygroups_rpc [None req-eae30e61-3493-4ccf-8e72-0db56a05d689 a119d95f2fc3446290208c405f40fc06 e549874548c54dd8b3b10588bdd2eec9 - - default default] Security group member updated ['f94f4b4b-b4b4-4fbc-9c6e-a13e840806df']#033[00m Oct 14 06:14:13 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:14:13.205 271987 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, 
binding:vnic_type=normal, created_at=2025-10-14T10:14:12Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=304a3301-e2e0-4f55-968c-2fb38c434091, ip_allocation=immediate, mac_address=fa:16:3e:ca:a6:80, name=tempest-AllowedAddressPairTestJSON-2091787617, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-10-14T10:14:08Z, description=, dns_domain=, id=facf8644-ef1a-475d-ac27-20217b05a00d, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-AllowedAddressPairTestJSON-test-network-365348510, port_security_enabled=True, project_id=e549874548c54dd8b3b10588bdd2eec9, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=35284, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1317, status=ACTIVE, subnets=['4b0d4865-dc69-4840-b6af-c1e697a5fd12'], tags=[], tenant_id=e549874548c54dd8b3b10588bdd2eec9, updated_at=2025-10-14T10:14:09Z, vlan_transparent=None, network_id=facf8644-ef1a-475d-ac27-20217b05a00d, port_security_enabled=True, project_id=e549874548c54dd8b3b10588bdd2eec9, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['f94f4b4b-b4b4-4fbc-9c6e-a13e840806df'], standard_attr_id=1360, status=DOWN, tags=[], tenant_id=e549874548c54dd8b3b10588bdd2eec9, updated_at=2025-10-14T10:14:12Z on network facf8644-ef1a-475d-ac27-20217b05a00d#033[00m Oct 14 06:14:13 localhost dnsmasq[327493]: read /var/lib/neutron/dhcp/facf8644-ef1a-475d-ac27-20217b05a00d/addn_hosts - 2 addresses Oct 14 06:14:13 localhost dnsmasq-dhcp[327493]: read /var/lib/neutron/dhcp/facf8644-ef1a-475d-ac27-20217b05a00d/host Oct 14 06:14:13 localhost dnsmasq-dhcp[327493]: read /var/lib/neutron/dhcp/facf8644-ef1a-475d-ac27-20217b05a00d/opts Oct 14 06:14:13 localhost podman[327549]: 2025-10-14 10:14:13.402067104 +0000 UTC m=+0.045178600 
container kill db7cd5133cd02b0f297aafd5f57c2c4521c6ac60b5d104a79efcb54a1d9cd939 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-facf8644-ef1a-475d-ac27-20217b05a00d, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, io.buildah.version=1.41.3, org.label-schema.build-date=20251009) Oct 14 06:14:13 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:14:13.658 271987 INFO neutron.agent.dhcp.agent [None req-86cd9351-68e4-4709-9dec-1a89fc5a42a5 - - - - - -] DHCP configuration for ports {'304a3301-e2e0-4f55-968c-2fb38c434091'} is completed#033[00m Oct 14 06:14:14 localhost neutron_sriov_agent[264974]: 2025-10-14 10:14:14.023 2 INFO neutron.agent.securitygroups_rpc [None req-21c1a1e4-ef12-49e3-b011-67f8db036333 a119d95f2fc3446290208c405f40fc06 e549874548c54dd8b3b10588bdd2eec9 - - default default] Security group member updated ['f94f4b4b-b4b4-4fbc-9c6e-a13e840806df']#033[00m Oct 14 06:14:14 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e130 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Oct 14 06:14:14 localhost dnsmasq[327493]: read /var/lib/neutron/dhcp/facf8644-ef1a-475d-ac27-20217b05a00d/addn_hosts - 1 addresses Oct 14 06:14:14 localhost podman[327587]: 2025-10-14 10:14:14.394587495 +0000 UTC m=+0.078246175 container kill db7cd5133cd02b0f297aafd5f57c2c4521c6ac60b5d104a79efcb54a1d9cd939 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-facf8644-ef1a-475d-ac27-20217b05a00d, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, 
tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image) Oct 14 06:14:14 localhost systemd[1]: tmp-crun.ByF60p.mount: Deactivated successfully. Oct 14 06:14:14 localhost dnsmasq-dhcp[327493]: read /var/lib/neutron/dhcp/facf8644-ef1a-475d-ac27-20217b05a00d/host Oct 14 06:14:14 localhost dnsmasq-dhcp[327493]: read /var/lib/neutron/dhcp/facf8644-ef1a-475d-ac27-20217b05a00d/opts Oct 14 06:14:14 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:14:14.541 271987 INFO neutron.agent.linux.ip_lib [None req-4bef602d-9592-476f-8f0b-324ebe6a6763 - - - - - -] Device tap9b189250-37 cannot be used as it has no MAC address#033[00m Oct 14 06:14:14 localhost nova_compute[297686]: 2025-10-14 10:14:14.567 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:14:14 localhost kernel: device tap9b189250-37 entered promiscuous mode Oct 14 06:14:14 localhost ovn_controller[157396]: 2025-10-14T10:14:14Z|00134|binding|INFO|Claiming lport 9b189250-3719-4004-8776-c0c51f8bb3d9 for this chassis. Oct 14 06:14:14 localhost ovn_controller[157396]: 2025-10-14T10:14:14Z|00135|binding|INFO|9b189250-3719-4004-8776-c0c51f8bb3d9: Claiming unknown Oct 14 06:14:14 localhost NetworkManager[5977]: [1760436854.5761] manager: (tap9b189250-37): new Generic device (/org/freedesktop/NetworkManager/Devices/27) Oct 14 06:14:14 localhost nova_compute[297686]: 2025-10-14 10:14:14.576 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:14:14 localhost systemd-udevd[327619]: Network interface NamePolicy= disabled on kernel command line. 
Oct 14 06:14:14 localhost ovn_metadata_agent[163050]: 2025-10-14 10:14:14.584 163055 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005486733.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpe1893814-19d7-5b97-af05-19e2aafa5382-704cdb9a-d532-4e6e-ad10-bf557871ac2d', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-704cdb9a-d532-4e6e-ad10-bf557871ac2d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '74bb29c117814a7892a70c60930de045', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5c4adad9-17db-4db7-a3bb-c888ac6452d9, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=9b189250-3719-4004-8776-c0c51f8bb3d9) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Oct 14 06:14:14 localhost ovn_metadata_agent[163050]: 2025-10-14 10:14:14.586 163055 INFO neutron.agent.ovn.metadata.agent [-] Port 9b189250-3719-4004-8776-c0c51f8bb3d9 in datapath 704cdb9a-d532-4e6e-ad10-bf557871ac2d bound to our chassis#033[00m Oct 14 06:14:14 localhost ovn_metadata_agent[163050]: 2025-10-14 10:14:14.589 163055 DEBUG neutron.agent.ovn.metadata.agent [-] Port f2ca5731-8738-46ac-afcf-a72db7b98e83 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips 
/usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Oct 14 06:14:14 localhost ovn_metadata_agent[163050]: 2025-10-14 10:14:14.589 163055 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 704cdb9a-d532-4e6e-ad10-bf557871ac2d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Oct 14 06:14:14 localhost ovn_metadata_agent[163050]: 2025-10-14 10:14:14.590 163159 DEBUG oslo.privsep.daemon [-] privsep: reply[9e22d5b5-82b4-4fa2-bda8-6f3c65c61958]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Oct 14 06:14:14 localhost journal[237477]: ethtool ioctl error on tap9b189250-37: No such device Oct 14 06:14:14 localhost ovn_controller[157396]: 2025-10-14T10:14:14Z|00136|binding|INFO|Setting lport 9b189250-3719-4004-8776-c0c51f8bb3d9 ovn-installed in OVS Oct 14 06:14:14 localhost ovn_controller[157396]: 2025-10-14T10:14:14Z|00137|binding|INFO|Setting lport 9b189250-3719-4004-8776-c0c51f8bb3d9 up in Southbound Oct 14 06:14:14 localhost journal[237477]: ethtool ioctl error on tap9b189250-37: No such device Oct 14 06:14:14 localhost nova_compute[297686]: 2025-10-14 10:14:14.616 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:14:14 localhost journal[237477]: ethtool ioctl error on tap9b189250-37: No such device Oct 14 06:14:14 localhost journal[237477]: ethtool ioctl error on tap9b189250-37: No such device Oct 14 06:14:14 localhost journal[237477]: ethtool ioctl error on tap9b189250-37: No such device Oct 14 06:14:14 localhost journal[237477]: ethtool ioctl error on tap9b189250-37: No such device Oct 14 06:14:14 localhost journal[237477]: ethtool ioctl error on tap9b189250-37: No such device Oct 14 06:14:14 localhost journal[237477]: ethtool ioctl error on tap9b189250-37: No such device Oct 14 
06:14:14 localhost nova_compute[297686]: 2025-10-14 10:14:14.658 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:14:14 localhost nova_compute[297686]: 2025-10-14 10:14:14.702 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:14:14 localhost neutron_sriov_agent[264974]: 2025-10-14 10:14:14.710 2 INFO neutron.agent.securitygroups_rpc [None req-215ed238-aca1-4993-9200-eaa35ce32f87 a119d95f2fc3446290208c405f40fc06 e549874548c54dd8b3b10588bdd2eec9 - - default default] Security group member updated ['f94f4b4b-b4b4-4fbc-9c6e-a13e840806df']#033[00m Oct 14 06:14:14 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:14:14.748 271987 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-10-14T10:14:14Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=554caba5-1d0c-4f7d-9cd8-390e1e6ac51c, ip_allocation=immediate, mac_address=fa:16:3e:c9:d5:11, name=tempest-AllowedAddressPairTestJSON-1781892166, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-10-14T10:14:08Z, description=, dns_domain=, id=facf8644-ef1a-475d-ac27-20217b05a00d, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-AllowedAddressPairTestJSON-test-network-365348510, port_security_enabled=True, project_id=e549874548c54dd8b3b10588bdd2eec9, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=35284, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1317, status=ACTIVE, 
subnets=['4b0d4865-dc69-4840-b6af-c1e697a5fd12'], tags=[], tenant_id=e549874548c54dd8b3b10588bdd2eec9, updated_at=2025-10-14T10:14:09Z, vlan_transparent=None, network_id=facf8644-ef1a-475d-ac27-20217b05a00d, port_security_enabled=True, project_id=e549874548c54dd8b3b10588bdd2eec9, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['f94f4b4b-b4b4-4fbc-9c6e-a13e840806df'], standard_attr_id=1366, status=DOWN, tags=[], tenant_id=e549874548c54dd8b3b10588bdd2eec9, updated_at=2025-10-14T10:14:14Z on network facf8644-ef1a-475d-ac27-20217b05a00d#033[00m Oct 14 06:14:14 localhost podman[327670]: 2025-10-14 10:14:14.987321954 +0000 UTC m=+0.062711274 container kill db7cd5133cd02b0f297aafd5f57c2c4521c6ac60b5d104a79efcb54a1d9cd939 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-facf8644-ef1a-475d-ac27-20217b05a00d, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.name=CentOS Stream 9 Base Image) Oct 14 06:14:14 localhost dnsmasq[327493]: read /var/lib/neutron/dhcp/facf8644-ef1a-475d-ac27-20217b05a00d/addn_hosts - 2 addresses Oct 14 06:14:14 localhost dnsmasq-dhcp[327493]: read /var/lib/neutron/dhcp/facf8644-ef1a-475d-ac27-20217b05a00d/host Oct 14 06:14:14 localhost dnsmasq-dhcp[327493]: read /var/lib/neutron/dhcp/facf8644-ef1a-475d-ac27-20217b05a00d/opts Oct 14 06:14:15 localhost nova_compute[297686]: 2025-10-14 10:14:15.017 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:14:15 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:14:15.326 271987 INFO neutron.agent.dhcp.agent [None 
req-620b1d96-0904-4ae0-a77e-f5f4c0589019 - - - - - -] DHCP configuration for ports {'554caba5-1d0c-4f7d-9cd8-390e1e6ac51c'} is completed#033[00m Oct 14 06:14:15 localhost podman[327730]: Oct 14 06:14:15 localhost podman[327730]: 2025-10-14 10:14:15.666989595 +0000 UTC m=+0.094091255 container create 58dcbdc90baaa0df1cc02662506de03107ae60de781f7a48453944cac9bbb13f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-704cdb9a-d532-4e6e-ad10-bf557871ac2d, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251009) Oct 14 06:14:15 localhost podman[327730]: 2025-10-14 10:14:15.620479395 +0000 UTC m=+0.047581125 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Oct 14 06:14:15 localhost systemd[1]: Started libpod-conmon-58dcbdc90baaa0df1cc02662506de03107ae60de781f7a48453944cac9bbb13f.scope. Oct 14 06:14:15 localhost systemd[1]: Started libcrun container. 
Oct 14 06:14:15 localhost neutron_sriov_agent[264974]: 2025-10-14 10:14:15.751 2 INFO neutron.agent.securitygroups_rpc [None req-d4a75c8c-e3f1-4a31-8229-40ae2dee1d6f a119d95f2fc3446290208c405f40fc06 e549874548c54dd8b3b10588bdd2eec9 - - default default] Security group member updated ['f94f4b4b-b4b4-4fbc-9c6e-a13e840806df']#033[00m Oct 14 06:14:15 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/886dcf92520db09767dd9dd1a5e0f4387308c67d233af327aab2a1c68c94e5d5/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Oct 14 06:14:15 localhost podman[327730]: 2025-10-14 10:14:15.764578318 +0000 UTC m=+0.191679968 container init 58dcbdc90baaa0df1cc02662506de03107ae60de781f7a48453944cac9bbb13f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-704cdb9a-d532-4e6e-ad10-bf557871ac2d, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true) Oct 14 06:14:15 localhost podman[327730]: 2025-10-14 10:14:15.773780942 +0000 UTC m=+0.200882592 container start 58dcbdc90baaa0df1cc02662506de03107ae60de781f7a48453944cac9bbb13f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-704cdb9a-d532-4e6e-ad10-bf557871ac2d, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3) Oct 14 06:14:15 localhost dnsmasq[327748]: started, version 2.85 cachesize 150 Oct 14 06:14:15 localhost 
dnsmasq[327748]: DNS service limited to local subnets Oct 14 06:14:15 localhost dnsmasq[327748]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Oct 14 06:14:15 localhost dnsmasq[327748]: warning: no upstream servers configured Oct 14 06:14:15 localhost dnsmasq-dhcp[327748]: DHCP, static leases only on 10.100.0.0, lease time 1d Oct 14 06:14:15 localhost dnsmasq[327748]: read /var/lib/neutron/dhcp/704cdb9a-d532-4e6e-ad10-bf557871ac2d/addn_hosts - 0 addresses Oct 14 06:14:15 localhost dnsmasq-dhcp[327748]: read /var/lib/neutron/dhcp/704cdb9a-d532-4e6e-ad10-bf557871ac2d/host Oct 14 06:14:15 localhost dnsmasq-dhcp[327748]: read /var/lib/neutron/dhcp/704cdb9a-d532-4e6e-ad10-bf557871ac2d/opts Oct 14 06:14:15 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:14:15.835 271987 INFO neutron.agent.dhcp.agent [None req-80dfdbc7-ef59-4b16-987c-52bf4d1e5de3 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-10-14T10:14:14Z, description=, device_id=ce6f3c9b-b135-4061-92ba-08549ea6a00b, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=df0c92b6-f569-4379-90f2-db80320e905e, ip_allocation=immediate, mac_address=fa:16:3e:a0:cf:92, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-10-14T10:14:11Z, description=, dns_domain=, id=704cdb9a-d532-4e6e-ad10-bf557871ac2d, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-FloatingIPNegativeTestJSON-test-network-591550981, port_security_enabled=True, project_id=74bb29c117814a7892a70c60930de045, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=13037, 
qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1353, status=ACTIVE, subnets=['86ace020-2958-4ba5-8a5a-4e5957749612'], tags=[], tenant_id=74bb29c117814a7892a70c60930de045, updated_at=2025-10-14T10:14:12Z, vlan_transparent=None, network_id=704cdb9a-d532-4e6e-ad10-bf557871ac2d, port_security_enabled=False, project_id=74bb29c117814a7892a70c60930de045, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1365, status=DOWN, tags=[], tenant_id=74bb29c117814a7892a70c60930de045, updated_at=2025-10-14T10:14:14Z on network 704cdb9a-d532-4e6e-ad10-bf557871ac2d#033[00m Oct 14 06:14:16 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:14:16.001 271987 INFO neutron.agent.dhcp.agent [None req-809b1531-6efd-417d-932c-0181ba1070ca - - - - - -] DHCP configuration for ports {'c55b2fa6-d426-4c06-9377-3b0539101fe4'} is completed#033[00m Oct 14 06:14:16 localhost dnsmasq[327493]: read /var/lib/neutron/dhcp/facf8644-ef1a-475d-ac27-20217b05a00d/addn_hosts - 1 addresses Oct 14 06:14:16 localhost dnsmasq-dhcp[327493]: read /var/lib/neutron/dhcp/facf8644-ef1a-475d-ac27-20217b05a00d/host Oct 14 06:14:16 localhost dnsmasq-dhcp[327493]: read /var/lib/neutron/dhcp/facf8644-ef1a-475d-ac27-20217b05a00d/opts Oct 14 06:14:16 localhost podman[327777]: 2025-10-14 10:14:16.025651193 +0000 UTC m=+0.075289593 container kill db7cd5133cd02b0f297aafd5f57c2c4521c6ac60b5d104a79efcb54a1d9cd939 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-facf8644-ef1a-475d-ac27-20217b05a00d, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251009, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.license=GPLv2) Oct 14 06:14:16 
localhost dnsmasq[327748]: read /var/lib/neutron/dhcp/704cdb9a-d532-4e6e-ad10-bf557871ac2d/addn_hosts - 1 addresses Oct 14 06:14:16 localhost dnsmasq-dhcp[327748]: read /var/lib/neutron/dhcp/704cdb9a-d532-4e6e-ad10-bf557871ac2d/host Oct 14 06:14:16 localhost dnsmasq-dhcp[327748]: read /var/lib/neutron/dhcp/704cdb9a-d532-4e6e-ad10-bf557871ac2d/opts Oct 14 06:14:16 localhost podman[327795]: 2025-10-14 10:14:16.065993294 +0000 UTC m=+0.052009663 container kill 58dcbdc90baaa0df1cc02662506de03107ae60de781f7a48453944cac9bbb13f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-704cdb9a-d532-4e6e-ad10-bf557871ac2d, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2) Oct 14 06:14:16 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:14:16.254 271987 INFO neutron.agent.dhcp.agent [None req-0be55dce-5e00-49fe-bec3-c11e3da38303 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-10-14T10:14:14Z, description=, device_id=ce6f3c9b-b135-4061-92ba-08549ea6a00b, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=df0c92b6-f569-4379-90f2-db80320e905e, ip_allocation=immediate, mac_address=fa:16:3e:a0:cf:92, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-10-14T10:14:11Z, description=, dns_domain=, id=704cdb9a-d532-4e6e-ad10-bf557871ac2d, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, 
name=tempest-FloatingIPNegativeTestJSON-test-network-591550981, port_security_enabled=True, project_id=74bb29c117814a7892a70c60930de045, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=13037, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1353, status=ACTIVE, subnets=['86ace020-2958-4ba5-8a5a-4e5957749612'], tags=[], tenant_id=74bb29c117814a7892a70c60930de045, updated_at=2025-10-14T10:14:12Z, vlan_transparent=None, network_id=704cdb9a-d532-4e6e-ad10-bf557871ac2d, port_security_enabled=False, project_id=74bb29c117814a7892a70c60930de045, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1365, status=DOWN, tags=[], tenant_id=74bb29c117814a7892a70c60930de045, updated_at=2025-10-14T10:14:14Z on network 704cdb9a-d532-4e6e-ad10-bf557871ac2d#033[00m Oct 14 06:14:16 localhost neutron_sriov_agent[264974]: 2025-10-14 10:14:16.258 2 INFO neutron.agent.securitygroups_rpc [None req-4ee0c8d5-ef25-4663-b5ce-0f405265f491 2bf00e4bfd1e4117ae57dbbe3abd93b3 74bb29c117814a7892a70c60930de045 - - default default] Security group member updated ['c8cf527d-e0a1-47be-bc6f-70f653cc7616']#033[00m Oct 14 06:14:16 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:14:16.361 271987 INFO neutron.agent.dhcp.agent [None req-c0157b3e-31a7-40ef-b894-6491ac06e90b - - - - - -] DHCP configuration for ports {'df0c92b6-f569-4379-90f2-db80320e905e'} is completed#033[00m Oct 14 06:14:16 localhost neutron_sriov_agent[264974]: 2025-10-14 10:14:16.404 2 INFO neutron.agent.securitygroups_rpc [None req-d7dd8a2a-b9a1-4843-8dc0-74e7154c035a a119d95f2fc3446290208c405f40fc06 e549874548c54dd8b3b10588bdd2eec9 - - default default] Security group member updated ['f94f4b4b-b4b4-4fbc-9c6e-a13e840806df']#033[00m Oct 14 06:14:16 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:14:16.442 271987 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port 
admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-10-14T10:14:16Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=d776bd7a-0bde-451f-ae6f-5da34a3c7a66, ip_allocation=immediate, mac_address=fa:16:3e:cd:e8:73, name=tempest-AllowedAddressPairTestJSON-553068624, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-10-14T10:14:08Z, description=, dns_domain=, id=facf8644-ef1a-475d-ac27-20217b05a00d, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-AllowedAddressPairTestJSON-test-network-365348510, port_security_enabled=True, project_id=e549874548c54dd8b3b10588bdd2eec9, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=35284, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1317, status=ACTIVE, subnets=['4b0d4865-dc69-4840-b6af-c1e697a5fd12'], tags=[], tenant_id=e549874548c54dd8b3b10588bdd2eec9, updated_at=2025-10-14T10:14:09Z, vlan_transparent=None, network_id=facf8644-ef1a-475d-ac27-20217b05a00d, port_security_enabled=True, project_id=e549874548c54dd8b3b10588bdd2eec9, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['f94f4b4b-b4b4-4fbc-9c6e-a13e840806df'], standard_attr_id=1368, status=DOWN, tags=[], tenant_id=e549874548c54dd8b3b10588bdd2eec9, updated_at=2025-10-14T10:14:16Z on network facf8644-ef1a-475d-ac27-20217b05a00d#033[00m Oct 14 06:14:16 localhost dnsmasq[327748]: read /var/lib/neutron/dhcp/704cdb9a-d532-4e6e-ad10-bf557871ac2d/addn_hosts - 1 addresses Oct 14 06:14:16 localhost dnsmasq-dhcp[327748]: read /var/lib/neutron/dhcp/704cdb9a-d532-4e6e-ad10-bf557871ac2d/host Oct 14 06:14:16 localhost dnsmasq-dhcp[327748]: read 
/var/lib/neutron/dhcp/704cdb9a-d532-4e6e-ad10-bf557871ac2d/opts Oct 14 06:14:16 localhost podman[327842]: 2025-10-14 10:14:16.474710423 +0000 UTC m=+0.051796916 container kill 58dcbdc90baaa0df1cc02662506de03107ae60de781f7a48453944cac9bbb13f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-704cdb9a-d532-4e6e-ad10-bf557871ac2d, org.label-schema.build-date=20251009, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5) Oct 14 06:14:16 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:14:16.633 271987 INFO neutron.agent.dhcp.agent [None req-97ba2deb-030b-46cd-8f71-d206df2e9c4d - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-10-14T10:14:16Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=eaa4e3fa-645f-4b07-ac8b-aa804ce75e54, ip_allocation=immediate, mac_address=fa:16:3e:7d:3e:e1, name=tempest-FloatingIPNegativeTestJSON-180315526, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-10-14T10:14:11Z, description=, dns_domain=, id=704cdb9a-d532-4e6e-ad10-bf557871ac2d, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-FloatingIPNegativeTestJSON-test-network-591550981, port_security_enabled=True, project_id=74bb29c117814a7892a70c60930de045, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=13037, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1353, status=ACTIVE, 
subnets=['86ace020-2958-4ba5-8a5a-4e5957749612'], tags=[], tenant_id=74bb29c117814a7892a70c60930de045, updated_at=2025-10-14T10:14:12Z, vlan_transparent=None, network_id=704cdb9a-d532-4e6e-ad10-bf557871ac2d, port_security_enabled=True, project_id=74bb29c117814a7892a70c60930de045, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['c8cf527d-e0a1-47be-bc6f-70f653cc7616'], standard_attr_id=1367, status=DOWN, tags=[], tenant_id=74bb29c117814a7892a70c60930de045, updated_at=2025-10-14T10:14:16Z on network 704cdb9a-d532-4e6e-ad10-bf557871ac2d#033[00m Oct 14 06:14:16 localhost dnsmasq[327493]: read /var/lib/neutron/dhcp/facf8644-ef1a-475d-ac27-20217b05a00d/addn_hosts - 2 addresses Oct 14 06:14:16 localhost podman[327877]: 2025-10-14 10:14:16.666470062 +0000 UTC m=+0.051439134 container kill db7cd5133cd02b0f297aafd5f57c2c4521c6ac60b5d104a79efcb54a1d9cd939 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-facf8644-ef1a-475d-ac27-20217b05a00d, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3) Oct 14 06:14:16 localhost dnsmasq-dhcp[327493]: read /var/lib/neutron/dhcp/facf8644-ef1a-475d-ac27-20217b05a00d/host Oct 14 06:14:16 localhost dnsmasq-dhcp[327493]: read /var/lib/neutron/dhcp/facf8644-ef1a-475d-ac27-20217b05a00d/opts Oct 14 06:14:16 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:14:16.707 271987 INFO neutron.agent.dhcp.agent [None req-45bbd5b2-3a76-49a7-a229-1ff91ed83b3f - - - - - -] DHCP configuration for ports {'df0c92b6-f569-4379-90f2-db80320e905e'} is completed#033[00m Oct 14 06:14:16 localhost podman[327914]: 2025-10-14 10:14:16.872130351 +0000 UTC m=+0.059141962 
container kill 58dcbdc90baaa0df1cc02662506de03107ae60de781f7a48453944cac9bbb13f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-704cdb9a-d532-4e6e-ad10-bf557871ac2d, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5) Oct 14 06:14:16 localhost dnsmasq[327748]: read /var/lib/neutron/dhcp/704cdb9a-d532-4e6e-ad10-bf557871ac2d/addn_hosts - 2 addresses Oct 14 06:14:16 localhost dnsmasq-dhcp[327748]: read /var/lib/neutron/dhcp/704cdb9a-d532-4e6e-ad10-bf557871ac2d/host Oct 14 06:14:16 localhost dnsmasq-dhcp[327748]: read /var/lib/neutron/dhcp/704cdb9a-d532-4e6e-ad10-bf557871ac2d/opts Oct 14 06:14:16 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:14:16.970 271987 INFO neutron.agent.dhcp.agent [None req-3a481c6b-9ce5-4f41-b43c-ec25cad3f484 - - - - - -] DHCP configuration for ports {'d776bd7a-0bde-451f-ae6f-5da34a3c7a66'} is completed#033[00m Oct 14 06:14:17 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:14:17.128 271987 INFO neutron.agent.dhcp.agent [None req-7487b9e0-eb0c-4fa6-8a4b-344a969dc268 - - - - - -] DHCP configuration for ports {'eaa4e3fa-645f-4b07-ac8b-aa804ce75e54'} is completed#033[00m Oct 14 06:14:17 localhost neutron_sriov_agent[264974]: 2025-10-14 10:14:17.441 2 INFO neutron.agent.securitygroups_rpc [None req-f5346351-0242-4565-bdf4-882c4ddf9389 a119d95f2fc3446290208c405f40fc06 e549874548c54dd8b3b10588bdd2eec9 - - default default] Security group member updated ['f94f4b4b-b4b4-4fbc-9c6e-a13e840806df']#033[00m Oct 14 06:14:17 localhost dnsmasq[327493]: read /var/lib/neutron/dhcp/facf8644-ef1a-475d-ac27-20217b05a00d/addn_hosts - 1 addresses Oct 14 06:14:17 localhost dnsmasq-dhcp[327493]: read 
/var/lib/neutron/dhcp/facf8644-ef1a-475d-ac27-20217b05a00d/host Oct 14 06:14:17 localhost dnsmasq-dhcp[327493]: read /var/lib/neutron/dhcp/facf8644-ef1a-475d-ac27-20217b05a00d/opts Oct 14 06:14:17 localhost podman[327952]: 2025-10-14 10:14:17.683189832 +0000 UTC m=+0.060409132 container kill db7cd5133cd02b0f297aafd5f57c2c4521c6ac60b5d104a79efcb54a1d9cd939 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-facf8644-ef1a-475d-ac27-20217b05a00d, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team) Oct 14 06:14:17 localhost nova_compute[297686]: 2025-10-14 10:14:17.891 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:14:17 localhost neutron_sriov_agent[264974]: 2025-10-14 10:14:17.926 2 INFO neutron.agent.securitygroups_rpc [None req-65a90c9a-34f3-4407-8f07-fa618ceea691 a119d95f2fc3446290208c405f40fc06 e549874548c54dd8b3b10588bdd2eec9 - - default default] Security group member updated ['f94f4b4b-b4b4-4fbc-9c6e-a13e840806df']#033[00m Oct 14 06:14:17 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:14:17.961 271987 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-10-14T10:14:17Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=ccb0ebe2-d698-4caa-ac2f-8ae6936e8c7f, ip_allocation=immediate, mac_address=fa:16:3e:c0:57:c9, name=tempest-AllowedAddressPairTestJSON-2139443907, 
network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-10-14T10:14:08Z, description=, dns_domain=, id=facf8644-ef1a-475d-ac27-20217b05a00d, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-AllowedAddressPairTestJSON-test-network-365348510, port_security_enabled=True, project_id=e549874548c54dd8b3b10588bdd2eec9, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=35284, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1317, status=ACTIVE, subnets=['4b0d4865-dc69-4840-b6af-c1e697a5fd12'], tags=[], tenant_id=e549874548c54dd8b3b10588bdd2eec9, updated_at=2025-10-14T10:14:09Z, vlan_transparent=None, network_id=facf8644-ef1a-475d-ac27-20217b05a00d, port_security_enabled=True, project_id=e549874548c54dd8b3b10588bdd2eec9, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['f94f4b4b-b4b4-4fbc-9c6e-a13e840806df'], standard_attr_id=1371, status=DOWN, tags=[], tenant_id=e549874548c54dd8b3b10588bdd2eec9, updated_at=2025-10-14T10:14:17Z on network facf8644-ef1a-475d-ac27-20217b05a00d#033[00m Oct 14 06:14:18 localhost dnsmasq[327493]: read /var/lib/neutron/dhcp/facf8644-ef1a-475d-ac27-20217b05a00d/addn_hosts - 2 addresses Oct 14 06:14:18 localhost dnsmasq-dhcp[327493]: read /var/lib/neutron/dhcp/facf8644-ef1a-475d-ac27-20217b05a00d/host Oct 14 06:14:18 localhost dnsmasq-dhcp[327493]: read /var/lib/neutron/dhcp/facf8644-ef1a-475d-ac27-20217b05a00d/opts Oct 14 06:14:18 localhost podman[327988]: 2025-10-14 10:14:18.182400434 +0000 UTC m=+0.047073769 container kill db7cd5133cd02b0f297aafd5f57c2c4521c6ac60b5d104a79efcb54a1d9cd939 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-facf8644-ef1a-475d-ac27-20217b05a00d, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.build-date=20251009, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team) Oct 14 06:14:18 localhost neutron_sriov_agent[264974]: 2025-10-14 10:14:18.377 2 INFO neutron.agent.securitygroups_rpc [None req-29c296f1-5053-45b3-a790-9ffaf0558f46 a119d95f2fc3446290208c405f40fc06 e549874548c54dd8b3b10588bdd2eec9 - - default default] Security group member updated ['f94f4b4b-b4b4-4fbc-9c6e-a13e840806df']#033[00m Oct 14 06:14:18 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:14:18.410 271987 INFO neutron.agent.dhcp.agent [None req-5c0bb8dd-f834-4a50-9041-83052d8eafe6 - - - - - -] DHCP configuration for ports {'ccb0ebe2-d698-4caa-ac2f-8ae6936e8c7f'} is completed#033[00m Oct 14 06:14:18 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:14:18.436 271987 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-10-14T10:14:18Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=312accf9-fe88-4610-81dc-aea2ea04fb66, ip_allocation=immediate, mac_address=fa:16:3e:61:a6:7d, name=tempest-AllowedAddressPairTestJSON-1568377816, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-10-14T10:14:08Z, description=, dns_domain=, id=facf8644-ef1a-475d-ac27-20217b05a00d, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-AllowedAddressPairTestJSON-test-network-365348510, port_security_enabled=True, project_id=e549874548c54dd8b3b10588bdd2eec9, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=35284, qos_policy_id=None, 
revision_number=2, router:external=False, shared=False, standard_attr_id=1317, status=ACTIVE, subnets=['4b0d4865-dc69-4840-b6af-c1e697a5fd12'], tags=[], tenant_id=e549874548c54dd8b3b10588bdd2eec9, updated_at=2025-10-14T10:14:09Z, vlan_transparent=None, network_id=facf8644-ef1a-475d-ac27-20217b05a00d, port_security_enabled=True, project_id=e549874548c54dd8b3b10588bdd2eec9, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['f94f4b4b-b4b4-4fbc-9c6e-a13e840806df'], standard_attr_id=1372, status=DOWN, tags=[], tenant_id=e549874548c54dd8b3b10588bdd2eec9, updated_at=2025-10-14T10:14:18Z on network facf8644-ef1a-475d-ac27-20217b05a00d#033[00m Oct 14 06:14:18 localhost dnsmasq[327493]: read /var/lib/neutron/dhcp/facf8644-ef1a-475d-ac27-20217b05a00d/addn_hosts - 3 addresses Oct 14 06:14:18 localhost dnsmasq-dhcp[327493]: read /var/lib/neutron/dhcp/facf8644-ef1a-475d-ac27-20217b05a00d/host Oct 14 06:14:18 localhost dnsmasq-dhcp[327493]: read /var/lib/neutron/dhcp/facf8644-ef1a-475d-ac27-20217b05a00d/opts Oct 14 06:14:18 localhost podman[328025]: 2025-10-14 10:14:18.686924401 +0000 UTC m=+0.067964526 container kill db7cd5133cd02b0f297aafd5f57c2c4521c6ac60b5d104a79efcb54a1d9cd939 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-facf8644-ef1a-475d-ac27-20217b05a00d, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2) Oct 14 06:14:18 localhost nova_compute[297686]: 2025-10-14 10:14:18.712 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:14:18 localhost ovn_metadata_agent[163050]: 2025-10-14 
10:14:18.718 163055 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=13, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'b6:6b:50', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '6a:59:81:01:bc:8b'}, ipsec=False) old=SB_Global(nb_cfg=12) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Oct 14 06:14:18 localhost ovn_metadata_agent[163050]: 2025-10-14 10:14:18.719 163055 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m Oct 14 06:14:18 localhost ovn_metadata_agent[163050]: 2025-10-14 10:14:18.721 163055 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=9e4b0f79-1220-4c7d-a18d-fa1a88dab362, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '13'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Oct 14 06:14:19 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:14:19.034 271987 INFO neutron.agent.dhcp.agent [None req-d5702bf3-5cac-4103-8abc-749552eb17bb - - - - - -] DHCP configuration for ports {'312accf9-fe88-4610-81dc-aea2ea04fb66'} is completed#033[00m Oct 14 06:14:19 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e130 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Oct 14 06:14:19 localhost neutron_sriov_agent[264974]: 2025-10-14 10:14:19.665 2 INFO neutron.agent.securitygroups_rpc [None req-7d1da377-3db0-465e-8a58-6eacec09542a 2bf00e4bfd1e4117ae57dbbe3abd93b3 
74bb29c117814a7892a70c60930de045 - - default default] Security group member updated ['c8cf527d-e0a1-47be-bc6f-70f653cc7616']#033[00m Oct 14 06:14:19 localhost podman[328064]: 2025-10-14 10:14:19.893276095 +0000 UTC m=+0.060693951 container kill 58dcbdc90baaa0df1cc02662506de03107ae60de781f7a48453944cac9bbb13f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-704cdb9a-d532-4e6e-ad10-bf557871ac2d, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, io.buildah.version=1.41.3, org.label-schema.build-date=20251009) Oct 14 06:14:19 localhost dnsmasq[327748]: read /var/lib/neutron/dhcp/704cdb9a-d532-4e6e-ad10-bf557871ac2d/addn_hosts - 1 addresses Oct 14 06:14:19 localhost dnsmasq-dhcp[327748]: read /var/lib/neutron/dhcp/704cdb9a-d532-4e6e-ad10-bf557871ac2d/host Oct 14 06:14:19 localhost dnsmasq-dhcp[327748]: read /var/lib/neutron/dhcp/704cdb9a-d532-4e6e-ad10-bf557871ac2d/opts Oct 14 06:14:19 localhost neutron_sriov_agent[264974]: 2025-10-14 10:14:19.921 2 INFO neutron.agent.securitygroups_rpc [None req-1429dfc9-965f-4083-a883-4b569e676e00 a119d95f2fc3446290208c405f40fc06 e549874548c54dd8b3b10588bdd2eec9 - - default default] Security group member updated ['f94f4b4b-b4b4-4fbc-9c6e-a13e840806df']#033[00m Oct 14 06:14:20 localhost nova_compute[297686]: 2025-10-14 10:14:20.018 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:14:20 localhost dnsmasq[327493]: read /var/lib/neutron/dhcp/facf8644-ef1a-475d-ac27-20217b05a00d/addn_hosts - 2 addresses Oct 14 06:14:20 localhost podman[328103]: 2025-10-14 10:14:20.183893327 +0000 UTC m=+0.066378388 container kill 
db7cd5133cd02b0f297aafd5f57c2c4521c6ac60b5d104a79efcb54a1d9cd939 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-facf8644-ef1a-475d-ac27-20217b05a00d, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, io.buildah.version=1.41.3, org.label-schema.license=GPLv2) Oct 14 06:14:20 localhost dnsmasq-dhcp[327493]: read /var/lib/neutron/dhcp/facf8644-ef1a-475d-ac27-20217b05a00d/host Oct 14 06:14:20 localhost dnsmasq-dhcp[327493]: read /var/lib/neutron/dhcp/facf8644-ef1a-475d-ac27-20217b05a00d/opts Oct 14 06:14:20 localhost neutron_sriov_agent[264974]: 2025-10-14 10:14:20.398 2 INFO neutron.agent.securitygroups_rpc [None req-c4020e51-22f0-42ab-a8f1-5443a3e98fd7 a119d95f2fc3446290208c405f40fc06 e549874548c54dd8b3b10588bdd2eec9 - - default default] Security group member updated ['f94f4b4b-b4b4-4fbc-9c6e-a13e840806df']#033[00m Oct 14 06:14:20 localhost dnsmasq[327493]: read /var/lib/neutron/dhcp/facf8644-ef1a-475d-ac27-20217b05a00d/addn_hosts - 1 addresses Oct 14 06:14:20 localhost podman[328143]: 2025-10-14 10:14:20.635166074 +0000 UTC m=+0.060381112 container kill db7cd5133cd02b0f297aafd5f57c2c4521c6ac60b5d104a79efcb54a1d9cd939 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-facf8644-ef1a-475d-ac27-20217b05a00d, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5) Oct 14 06:14:20 localhost dnsmasq-dhcp[327493]: read 
/var/lib/neutron/dhcp/facf8644-ef1a-475d-ac27-20217b05a00d/host Oct 14 06:14:20 localhost dnsmasq-dhcp[327493]: read /var/lib/neutron/dhcp/facf8644-ef1a-475d-ac27-20217b05a00d/opts Oct 14 06:14:20 localhost podman[328171]: 2025-10-14 10:14:20.795040016 +0000 UTC m=+0.071392213 container kill 58dcbdc90baaa0df1cc02662506de03107ae60de781f7a48453944cac9bbb13f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-704cdb9a-d532-4e6e-ad10-bf557871ac2d, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251009, io.buildah.version=1.41.3) Oct 14 06:14:20 localhost dnsmasq[327748]: read /var/lib/neutron/dhcp/704cdb9a-d532-4e6e-ad10-bf557871ac2d/addn_hosts - 0 addresses Oct 14 06:14:20 localhost dnsmasq-dhcp[327748]: read /var/lib/neutron/dhcp/704cdb9a-d532-4e6e-ad10-bf557871ac2d/host Oct 14 06:14:20 localhost dnsmasq-dhcp[327748]: read /var/lib/neutron/dhcp/704cdb9a-d532-4e6e-ad10-bf557871ac2d/opts Oct 14 06:14:20 localhost neutron_sriov_agent[264974]: 2025-10-14 10:14:20.945 2 INFO neutron.agent.securitygroups_rpc [None req-f019390f-eadc-409e-adae-a8c86aad14a7 a119d95f2fc3446290208c405f40fc06 e549874548c54dd8b3b10588bdd2eec9 - - default default] Security group member updated ['f94f4b4b-b4b4-4fbc-9c6e-a13e840806df']#033[00m Oct 14 06:14:21 localhost nova_compute[297686]: 2025-10-14 10:14:21.020 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:14:21 localhost kernel: device tap9b189250-37 left promiscuous mode Oct 14 06:14:21 localhost ovn_controller[157396]: 2025-10-14T10:14:21Z|00138|binding|INFO|Releasing lport 9b189250-3719-4004-8776-c0c51f8bb3d9 from this chassis 
(sb_readonly=0) Oct 14 06:14:21 localhost ovn_controller[157396]: 2025-10-14T10:14:21Z|00139|binding|INFO|Setting lport 9b189250-3719-4004-8776-c0c51f8bb3d9 down in Southbound Oct 14 06:14:21 localhost nova_compute[297686]: 2025-10-14 10:14:21.040 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:14:21 localhost ovn_metadata_agent[163050]: 2025-10-14 10:14:21.051 163055 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005486733.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpe1893814-19d7-5b97-af05-19e2aafa5382-704cdb9a-d532-4e6e-ad10-bf557871ac2d', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-704cdb9a-d532-4e6e-ad10-bf557871ac2d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '74bb29c117814a7892a70c60930de045', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005486733.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5c4adad9-17db-4db7-a3bb-c888ac6452d9, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=9b189250-3719-4004-8776-c0c51f8bb3d9) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Oct 14 06:14:21 localhost ovn_metadata_agent[163050]: 2025-10-14 10:14:21.053 163055 INFO neutron.agent.ovn.metadata.agent [-] 
Port 9b189250-3719-4004-8776-c0c51f8bb3d9 in datapath 704cdb9a-d532-4e6e-ad10-bf557871ac2d unbound from our chassis#033[00m Oct 14 06:14:21 localhost ovn_metadata_agent[163050]: 2025-10-14 10:14:21.055 163055 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 704cdb9a-d532-4e6e-ad10-bf557871ac2d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Oct 14 06:14:21 localhost ovn_metadata_agent[163050]: 2025-10-14 10:14:21.056 163159 DEBUG oslo.privsep.daemon [-] privsep: reply[af2a5790-267e-4111-90e1-406226572534]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Oct 14 06:14:21 localhost dnsmasq[327493]: read /var/lib/neutron/dhcp/facf8644-ef1a-475d-ac27-20217b05a00d/addn_hosts - 0 addresses Oct 14 06:14:21 localhost podman[328216]: 2025-10-14 10:14:21.292711469 +0000 UTC m=+0.061950909 container kill db7cd5133cd02b0f297aafd5f57c2c4521c6ac60b5d104a79efcb54a1d9cd939 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-facf8644-ef1a-475d-ac27-20217b05a00d, org.label-schema.license=GPLv2, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251009, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3) Oct 14 06:14:21 localhost dnsmasq-dhcp[327493]: read /var/lib/neutron/dhcp/facf8644-ef1a-475d-ac27-20217b05a00d/host Oct 14 06:14:21 localhost dnsmasq-dhcp[327493]: read /var/lib/neutron/dhcp/facf8644-ef1a-475d-ac27-20217b05a00d/opts Oct 14 06:14:22 localhost ovn_controller[157396]: 2025-10-14T10:14:22Z|00140|binding|INFO|Removing iface tapc15d7099-2f ovn-installed in OVS Oct 14 06:14:22 localhost ovn_controller[157396]: 2025-10-14T10:14:22Z|00141|binding|INFO|Removing 
lport c15d7099-2fc6-4909-a860-4177c78e3521 ovn-installed in OVS Oct 14 06:14:22 localhost ovn_metadata_agent[163050]: 2025-10-14 10:14:22.090 163055 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port c6de2d81-45b5-4689-bb8d-07c8869a6223 with type ""#033[00m Oct 14 06:14:22 localhost ovn_metadata_agent[163050]: 2025-10-14 10:14:22.091 163055 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005486733.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpe1893814-19d7-5b97-af05-19e2aafa5382-facf8644-ef1a-475d-ac27-20217b05a00d', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-facf8644-ef1a-475d-ac27-20217b05a00d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e549874548c54dd8b3b10588bdd2eec9', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005486733.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d3b66c3b-a47b-46d3-bf07-94622ef97a0f, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=c15d7099-2fc6-4909-a860-4177c78e3521) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Oct 14 06:14:22 localhost ovn_metadata_agent[163050]: 2025-10-14 10:14:22.093 163055 INFO neutron.agent.ovn.metadata.agent [-] Port c15d7099-2fc6-4909-a860-4177c78e3521 in datapath facf8644-ef1a-475d-ac27-20217b05a00d unbound from our chassis#033[00m Oct 14 
06:14:22 localhost ovn_metadata_agent[163050]: 2025-10-14 10:14:22.096 163055 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network facf8644-ef1a-475d-ac27-20217b05a00d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Oct 14 06:14:22 localhost ovn_metadata_agent[163050]: 2025-10-14 10:14:22.097 163159 DEBUG oslo.privsep.daemon [-] privsep: reply[ec01f3a5-501d-4f4f-ad12-5cf4166f088b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Oct 14 06:14:22 localhost systemd[1]: tmp-crun.59LCqV.mount: Deactivated successfully. Oct 14 06:14:22 localhost nova_compute[297686]: 2025-10-14 10:14:22.126 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:14:22 localhost podman[328271]: 2025-10-14 10:14:22.129800557 +0000 UTC m=+0.114236180 container kill 58dcbdc90baaa0df1cc02662506de03107ae60de781f7a48453944cac9bbb13f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-704cdb9a-d532-4e6e-ad10-bf557871ac2d, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image) Oct 14 06:14:22 localhost dnsmasq[327748]: exiting on receipt of SIGTERM Oct 14 06:14:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041. Oct 14 06:14:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857. 
Oct 14 06:14:22 localhost systemd[1]: libpod-58dcbdc90baaa0df1cc02662506de03107ae60de781f7a48453944cac9bbb13f.scope: Deactivated successfully. Oct 14 06:14:22 localhost dnsmasq[327493]: exiting on receipt of SIGTERM Oct 14 06:14:22 localhost podman[328284]: 2025-10-14 10:14:22.169848187 +0000 UTC m=+0.093430224 container kill db7cd5133cd02b0f297aafd5f57c2c4521c6ac60b5d104a79efcb54a1d9cd939 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-facf8644-ef1a-475d-ac27-20217b05a00d, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image) Oct 14 06:14:22 localhost systemd[1]: libpod-db7cd5133cd02b0f297aafd5f57c2c4521c6ac60b5d104a79efcb54a1d9cd939.scope: Deactivated successfully. 
Oct 14 06:14:22 localhost podman[328299]: 2025-10-14 10:14:22.197957299 +0000 UTC m=+0.044722687 container died 58dcbdc90baaa0df1cc02662506de03107ae60de781f7a48453944cac9bbb13f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-704cdb9a-d532-4e6e-ad10-bf557871ac2d, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0) Oct 14 06:14:22 localhost podman[328333]: 2025-10-14 10:14:22.239303879 +0000 UTC m=+0.048535484 container died db7cd5133cd02b0f297aafd5f57c2c4521c6ac60b5d104a79efcb54a1d9cd939 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-facf8644-ef1a-475d-ac27-20217b05a00d, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0) Oct 14 06:14:22 localhost podman[328333]: 2025-10-14 10:14:22.264937402 +0000 UTC m=+0.074168977 container cleanup db7cd5133cd02b0f297aafd5f57c2c4521c6ac60b5d104a79efcb54a1d9cd939 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-facf8644-ef1a-475d-ac27-20217b05a00d, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5) Oct 14 
06:14:22 localhost systemd[1]: libpod-conmon-db7cd5133cd02b0f297aafd5f57c2c4521c6ac60b5d104a79efcb54a1d9cd939.scope: Deactivated successfully. Oct 14 06:14:22 localhost podman[328297]: 2025-10-14 10:14:22.281389442 +0000 UTC m=+0.134021322 container health_status 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Oct 14 06:14:22 localhost systemd[1]: var-lib-containers-storage-overlay-bd81577dd3c6bf8ccc6b483aa6db60f256dcb754f9805520a56b3cbc77c15683-merged.mount: Deactivated successfully. Oct 14 06:14:22 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-db7cd5133cd02b0f297aafd5f57c2c4521c6ac60b5d104a79efcb54a1d9cd939-userdata-shm.mount: Deactivated successfully. 
Oct 14 06:14:22 localhost podman[328335]: 2025-10-14 10:14:22.335585351 +0000 UTC m=+0.128462520 container remove db7cd5133cd02b0f297aafd5f57c2c4521c6ac60b5d104a79efcb54a1d9cd939 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-facf8644-ef1a-475d-ac27-20217b05a00d, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image) Oct 14 06:14:22 localhost kernel: device tapc15d7099-2f left promiscuous mode Oct 14 06:14:22 localhost podman[328297]: 2025-10-14 10:14:22.351498133 +0000 UTC m=+0.204130023 container exec_died 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Oct 14 06:14:22 localhost nova_compute[297686]: 2025-10-14 10:14:22.349 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 
14 06:14:22 localhost nova_compute[297686]: 2025-10-14 10:14:22.361 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:14:22 localhost systemd[1]: 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041.service: Deactivated successfully. Oct 14 06:14:22 localhost systemd[1]: run-netns-qdhcp\x2dfacf8644\x2def1a\x2d475d\x2dac27\x2d20217b05a00d.mount: Deactivated successfully. Oct 14 06:14:22 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:14:22.397 271987 INFO neutron.agent.dhcp.agent [None req-716e37d0-3451-4bcd-a400-ce155a1daeea - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Oct 14 06:14:22 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:14:22.398 271987 INFO neutron.agent.dhcp.agent [None req-716e37d0-3451-4bcd-a400-ce155a1daeea - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Oct 14 06:14:22 localhost podman[328298]: 2025-10-14 10:14:22.403500455 +0000 UTC m=+0.252554444 container health_status 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, org.label-schema.build-date=20251009, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 
'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, tcib_managed=true) Oct 14 06:14:22 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-58dcbdc90baaa0df1cc02662506de03107ae60de781f7a48453944cac9bbb13f-userdata-shm.mount: Deactivated successfully. 
Oct 14 06:14:22 localhost podman[328298]: 2025-10-14 10:14:22.414941219 +0000 UTC m=+0.263995198 container exec_died 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team) Oct 14 06:14:22 localhost systemd[1]: 
var-lib-containers-storage-overlay-886dcf92520db09767dd9dd1a5e0f4387308c67d233af327aab2a1c68c94e5d5-merged.mount: Deactivated successfully. Oct 14 06:14:22 localhost systemd[1]: 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857.service: Deactivated successfully. Oct 14 06:14:22 localhost podman[328299]: 2025-10-14 10:14:22.441319475 +0000 UTC m=+0.288084873 container remove 58dcbdc90baaa0df1cc02662506de03107ae60de781f7a48453944cac9bbb13f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-704cdb9a-d532-4e6e-ad10-bf557871ac2d, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5) Oct 14 06:14:22 localhost systemd[1]: libpod-conmon-58dcbdc90baaa0df1cc02662506de03107ae60de781f7a48453944cac9bbb13f.scope: Deactivated successfully. 
Oct 14 06:14:22 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:14:22.467 271987 INFO neutron.agent.dhcp.agent [None req-a6fbf417-a8a2-4aa1-b8bb-ab2260b7fd5f - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Oct 14 06:14:22 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:14:22.490 271987 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Oct 14 06:14:22 localhost ovn_controller[157396]: 2025-10-14T10:14:22Z|00142|binding|INFO|Releasing lport 25c6586a-239c-451b-aac2-e0a3ee5c3145 from this chassis (sb_readonly=0)
Oct 14 06:14:22 localhost nova_compute[297686]: 2025-10-14 10:14:22.579 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 06:14:22 localhost nova_compute[297686]: 2025-10-14 10:14:22.894 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 06:14:23 localhost systemd[1]: run-netns-qdhcp\x2d704cdb9a\x2dd532\x2d4e6e\x2dad10\x2dbf557871ac2d.mount: Deactivated successfully.
Oct 14 06:14:23 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Oct 14 06:14:23 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz'
Oct 14 06:14:24 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e130 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct 14 06:14:24 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz'
Oct 14 06:14:25 localhost nova_compute[297686]: 2025-10-14 10:14:25.054 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 06:14:25 localhost nova_compute[297686]: 2025-10-14 10:14:25.251 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 06:14:25 localhost nova_compute[297686]: 2025-10-14 10:14:25.254 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 06:14:26 localhost nova_compute[297686]: 2025-10-14 10:14:26.255 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 06:14:26 localhost nova_compute[297686]: 2025-10-14 10:14:26.255 2 DEBUG nova.compute.manager [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Oct 14 06:14:26 localhost nova_compute[297686]: 2025-10-14 10:14:26.255 2 DEBUG nova.compute.manager [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Oct 14 06:14:26 localhost nova_compute[297686]: 2025-10-14 10:14:26.344 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Acquiring lock "refresh_cache-88c4e366-b765-47a6-96bf-f7677f2ce67c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct 14 06:14:26 localhost nova_compute[297686]: 2025-10-14 10:14:26.345 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Acquired lock "refresh_cache-88c4e366-b765-47a6-96bf-f7677f2ce67c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct 14 06:14:26 localhost nova_compute[297686]: 2025-10-14 10:14:26.345 2 DEBUG nova.network.neutron [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] [instance: 88c4e366-b765-47a6-96bf-f7677f2ce67c] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Oct 14 06:14:26 localhost nova_compute[297686]: 2025-10-14 10:14:26.345 2 DEBUG nova.objects.instance [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Lazy-loading 'info_cache' on Instance uuid 88c4e366-b765-47a6-96bf-f7677f2ce67c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 14 06:14:27 localhost nova_compute[297686]: 2025-10-14 10:14:27.139 2 DEBUG nova.network.neutron [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] [instance: 88c4e366-b765-47a6-96bf-f7677f2ce67c] Updating instance_info_cache with network_info: [{"id": "3ec9b060-f43d-4698-9c76-6062c70911d5", "address": "fa:16:3e:84:5e:e5", "network": {"id": "7d0cd696-bdd7-4e70-9512-eb0d23640314", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.46", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "41187b090f3d4818a32baa37ce8a3991", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3ec9b060-f4", "ovs_interfaceid": "3ec9b060-f43d-4698-9c76-6062c70911d5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 14 06:14:27 localhost nova_compute[297686]: 2025-10-14 10:14:27.160 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Releasing lock "refresh_cache-88c4e366-b765-47a6-96bf-f7677f2ce67c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct 14 06:14:27 localhost nova_compute[297686]: 2025-10-14 10:14:27.160 2 DEBUG nova.compute.manager [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] [instance: 88c4e366-b765-47a6-96bf-f7677f2ce67c] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Oct 14 06:14:27 localhost neutron_sriov_agent[264974]: 2025-10-14 10:14:27.622 2 INFO neutron.agent.securitygroups_rpc [None req-b77ddf35-4402-4af1-947b-e773082d3901 9cebd1ad9225424eb253dc6a7d396af9 96887d9c06a243c291a1dca4b8c2b18b - - default default] Security group member updated ['47646898-ac45-4242-8cca-db8d39176af7']
Oct 14 06:14:27 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:14:27.668 271987 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-10-14T10:14:27Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=da7f454e-7253-4b58-93db-359e10806406, ip_allocation=immediate, mac_address=fa:16:3e:7d:1f:8b, name=tempest-RoutersAdminNegativeTest-391885298, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-10-14T08:36:38Z, description=, dns_domain=, id=c0145816-4627-44f2-af00-ccc9ef0436ed, ipv4_address_scope=None, ipv6_address_scope=None, is_default=False, l2_adjacency=True, mtu=1350, name=public, port_security_enabled=True, project_id=41187b090f3d4818a32baa37ce8a3991, provider:network_type=flat, provider:physical_network=datacentre, provider:segmentation_id=None, qos_policy_id=None, revision_number=2, router:external=True, shared=False, standard_attr_id=29, status=ACTIVE, subnets=['bbd40dc1-97d8-4ed3-9d4f-9b3af758b526'], tags=[], tenant_id=41187b090f3d4818a32baa37ce8a3991, updated_at=2025-10-14T08:36:44Z, vlan_transparent=None, network_id=c0145816-4627-44f2-af00-ccc9ef0436ed, port_security_enabled=True, project_id=96887d9c06a243c291a1dca4b8c2b18b, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['47646898-ac45-4242-8cca-db8d39176af7'], standard_attr_id=1404, status=DOWN, tags=[], tenant_id=96887d9c06a243c291a1dca4b8c2b18b, updated_at=2025-10-14T10:14:27Z on network c0145816-4627-44f2-af00-ccc9ef0436ed
Oct 14 06:14:27 localhost nova_compute[297686]: 2025-10-14 10:14:27.896 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 06:14:27 localhost dnsmasq[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/addn_hosts - 2 addresses
Oct 14 06:14:27 localhost dnsmasq-dhcp[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/host
Oct 14 06:14:27 localhost podman[328490]: 2025-10-14 10:14:27.910760399 +0000 UTC m=+0.053345553 container kill 27e23fba1f10c3118d508631d350793857acd97085c399f85eed727cd1eab7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0145816-4627-44f2-af00-ccc9ef0436ed, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Oct 14 06:14:27 localhost dnsmasq-dhcp[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/opts
Oct 14 06:14:28 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:14:28.137 271987 INFO neutron.agent.dhcp.agent [None req-73b0234b-c2f1-4f95-9e3f-8ee8d065acc2 - - - - - -] DHCP configuration for ports {'da7f454e-7253-4b58-93db-359e10806406'} is completed
Oct 14 06:14:28 localhost nova_compute[297686]: 2025-10-14 10:14:28.255 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 06:14:28 localhost nova_compute[297686]: 2025-10-14 10:14:28.257 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 06:14:28 localhost nova_compute[297686]: 2025-10-14 10:14:28.259 2 DEBUG nova.compute.manager [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Oct 14 06:14:28 localhost podman[248187]: time="2025-10-14T10:14:28Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 14 06:14:28 localhost podman[248187]: @ - - [14/Oct/2025:10:14:28 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 147497 "" "Go-http-client/1.1"
Oct 14 06:14:28 localhost podman[248187]: @ - - [14/Oct/2025:10:14:28 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19859 "" "Go-http-client/1.1"
Oct 14 06:14:28 localhost neutron_sriov_agent[264974]: 2025-10-14 10:14:28.544 2 INFO neutron.agent.securitygroups_rpc [None req-ec768987-bbd6-4723-9163-14a7c5f70314 9cebd1ad9225424eb253dc6a7d396af9 96887d9c06a243c291a1dca4b8c2b18b - - default default] Security group member updated ['47646898-ac45-4242-8cca-db8d39176af7']
Oct 14 06:14:28 localhost dnsmasq[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/addn_hosts - 1 addresses
Oct 14 06:14:28 localhost dnsmasq-dhcp[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/host
Oct 14 06:14:28 localhost podman[328528]: 2025-10-14 10:14:28.828782362 +0000 UTC m=+0.067491412 container kill 27e23fba1f10c3118d508631d350793857acd97085c399f85eed727cd1eab7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0145816-4627-44f2-af00-ccc9ef0436ed, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2)
Oct 14 06:14:28 localhost dnsmasq-dhcp[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/opts
Oct 14 06:14:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad.
Oct 14 06:14:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29.
Oct 14 06:14:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec.
Oct 14 06:14:28 localhost systemd[1]: tmp-crun.kCdSeV.mount: Deactivated successfully.
Oct 14 06:14:28 localhost podman[328543]: 2025-10-14 10:14:28.986001522 +0000 UTC m=+0.117260053 container health_status fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=iscsid, org.label-schema.vendor=CentOS, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team)
Oct 14 06:14:28 localhost podman[328542]: 2025-10-14 10:14:28.947400166 +0000 UTC m=+0.087029096 container health_status 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=multipathd, container_name=multipathd, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, io.buildah.version=1.41.3, org.label-schema.build-date=20251009)
Oct 14 06:14:29 localhost podman[328542]: 2025-10-14 10:14:29.026594709 +0000 UTC m=+0.166223539 container exec_died 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true, container_name=multipathd, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251009)
Oct 14 06:14:29 localhost systemd[1]: 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad.service: Deactivated successfully.
Oct 14 06:14:29 localhost podman[328576]: 2025-10-14 10:14:29.045164955 +0000 UTC m=+0.086957355 container health_status aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Oct 14 06:14:29 localhost podman[328576]: 2025-10-14 10:14:29.054636368 +0000 UTC m=+0.096428758 container exec_died aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Oct 14 06:14:29 localhost systemd[1]: aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec.service: Deactivated successfully.
Oct 14 06:14:29 localhost podman[328543]: 2025-10-14 10:14:29.074053059 +0000 UTC m=+0.205312100 container exec_died fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_id=iscsid, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=iscsid, org.label-schema.license=GPLv2)
Oct 14 06:14:29 localhost systemd[1]: fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29.service: Deactivated successfully.
Oct 14 06:14:29 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e130 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct 14 06:14:30 localhost nova_compute[297686]: 2025-10-14 10:14:30.086 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 06:14:30 localhost nova_compute[297686]: 2025-10-14 10:14:30.255 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 06:14:30 localhost nova_compute[297686]: 2025-10-14 10:14:30.286 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 06:14:30 localhost nova_compute[297686]: 2025-10-14 10:14:30.287 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 06:14:30 localhost nova_compute[297686]: 2025-10-14 10:14:30.309 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 14 06:14:30 localhost nova_compute[297686]: 2025-10-14 10:14:30.310 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 14 06:14:30 localhost nova_compute[297686]: 2025-10-14 10:14:30.310 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 14 06:14:30 localhost nova_compute[297686]: 2025-10-14 10:14:30.310 2 DEBUG nova.compute.resource_tracker [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Auditing locally available compute resources for np0005486733.localdomain (node: np0005486733.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Oct 14 06:14:30 localhost nova_compute[297686]: 2025-10-14 10:14:30.311 2 DEBUG oslo_concurrency.processutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 14 06:14:30 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Oct 14 06:14:30 localhost ceph-mon[317114]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/1683431900' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Oct 14 06:14:30 localhost nova_compute[297686]: 2025-10-14 10:14:30.739 2 DEBUG oslo_concurrency.processutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.428s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 14 06:14:30 localhost nova_compute[297686]: 2025-10-14 10:14:30.860 2 DEBUG nova.virt.libvirt.driver [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Oct 14 06:14:30 localhost nova_compute[297686]: 2025-10-14 10:14:30.860 2 DEBUG nova.virt.libvirt.driver [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Oct 14 06:14:31 localhost nova_compute[297686]: 2025-10-14 10:14:31.083 2 WARNING nova.virt.libvirt.driver [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 14 06:14:31 localhost nova_compute[297686]: 2025-10-14 10:14:31.085 2 DEBUG nova.compute.resource_tracker [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Hypervisor/Node resource view: name=np0005486733.localdomain free_ram=11243MB free_disk=41.83695602416992GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Oct 14 06:14:31 localhost nova_compute[297686]: 2025-10-14 10:14:31.085 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 14 06:14:31 localhost nova_compute[297686]: 2025-10-14 10:14:31.085 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 14 06:14:31 localhost nova_compute[297686]: 2025-10-14 10:14:31.202 2 DEBUG nova.compute.resource_tracker [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Instance 88c4e366-b765-47a6-96bf-f7677f2ce67c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Oct 14 06:14:31 localhost nova_compute[297686]: 2025-10-14 10:14:31.202 2 DEBUG nova.compute.resource_tracker [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Oct 14 06:14:31 localhost nova_compute[297686]: 2025-10-14 10:14:31.203 2 DEBUG nova.compute.resource_tracker [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Final resource view: name=np0005486733.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Oct 14 06:14:31 localhost nova_compute[297686]: 2025-10-14 10:14:31.264 2 DEBUG oslo_concurrency.processutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 14 06:14:31 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Oct 14 06:14:31 localhost ceph-mon[317114]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/4101492730' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Oct 14 06:14:31 localhost nova_compute[297686]: 2025-10-14 10:14:31.733 2 DEBUG oslo_concurrency.processutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.470s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 14 06:14:31 localhost nova_compute[297686]: 2025-10-14 10:14:31.741 2 DEBUG nova.compute.provider_tree [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Inventory has not changed in ProviderTree for provider: 18c24273-aca2-4f08-be57-3188d558235e update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 14 06:14:31 localhost nova_compute[297686]: 2025-10-14 10:14:31.761 2 DEBUG nova.scheduler.client.report [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Inventory has not changed for provider 18c24273-aca2-4f08-be57-3188d558235e based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 14 06:14:31 localhost nova_compute[297686]: 2025-10-14 10:14:31.766 2 DEBUG nova.compute.resource_tracker [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Compute_service record updated for np0005486733.localdomain:np0005486733.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Oct 14 06:14:31 localhost nova_compute[297686]: 2025-10-14 10:14:31.767 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.681s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 14 06:14:32 localhost nova_compute[297686]: 2025-10-14 10:14:32.737 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 06:14:32 localhost nova_compute[297686]: 2025-10-14 10:14:32.921 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 06:14:32 localhost neutron_sriov_agent[264974]: 2025-10-14 10:14:32.977 2 INFO neutron.agent.securitygroups_rpc [None req-493a3d89-0790-46f5-867f-08353c1442dd 30647d4700b846dba79efd27fad03f3d a840994a70374548889747682f4c0fa3 - - default default] Security group member updated ['59283390-a499-4358-9f49-155fd8075ea9']
Oct 14 06:14:34 localhost neutron_sriov_agent[264974]: 2025-10-14 10:14:34.300 2 INFO neutron.agent.securitygroups_rpc [None req-ccc8d08d-5e27-4933-991c-cca7acd585e0 30647d4700b846dba79efd27fad03f3d a840994a70374548889747682f4c0fa3 - - default default] Security group member updated ['59283390-a499-4358-9f49-155fd8075ea9']
Oct 14 06:14:34 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e130 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct 14 06:14:34 localhost neutron_sriov_agent[264974]: 2025-10-14 10:14:34.398 2 INFO neutron.agent.securitygroups_rpc [None req-ccc8d08d-5e27-4933-991c-cca7acd585e0 30647d4700b846dba79efd27fad03f3d a840994a70374548889747682f4c0fa3 - - default default] Security group member updated ['59283390-a499-4358-9f49-155fd8075ea9']
Oct 14 06:14:34 localhost 
neutron_sriov_agent[264974]: 2025-10-14 10:14:34.699 2 INFO neutron.agent.securitygroups_rpc [None req-c31bf8ce-e2f4-42b1-9816-f26162218d36 30647d4700b846dba79efd27fad03f3d a840994a70374548889747682f4c0fa3 - - default default] Security group member updated ['59283390-a499-4358-9f49-155fd8075ea9']#033[00m Oct 14 06:14:35 localhost nova_compute[297686]: 2025-10-14 10:14:35.117 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:14:35 localhost nova_compute[297686]: 2025-10-14 10:14:35.255 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:14:35 localhost neutron_sriov_agent[264974]: 2025-10-14 10:14:35.261 2 INFO neutron.agent.securitygroups_rpc [None req-af1630c5-9e5e-4607-9e63-1d078ad7f844 30647d4700b846dba79efd27fad03f3d a840994a70374548889747682f4c0fa3 - - default default] Security group member updated ['59283390-a499-4358-9f49-155fd8075ea9']#033[00m Oct 14 06:14:37 localhost nova_compute[297686]: 2025-10-14 10:14:37.966 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:14:38 localhost neutron_sriov_agent[264974]: 2025-10-14 10:14:38.571 2 INFO neutron.agent.securitygroups_rpc [None req-930c3719-0315-4802-aedb-d477db85cbdd 30647d4700b846dba79efd27fad03f3d a840994a70374548889747682f4c0fa3 - - default default] Security group member updated ['59283390-a499-4358-9f49-155fd8075ea9']#033[00m Oct 14 06:14:38 localhost openstack_network_exporter[250374]: ERROR 10:14:38 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Oct 14 06:14:38 localhost openstack_network_exporter[250374]: ERROR 10:14:38 appctl.go:144: Failed to get PID for ovn-northd: no 
control socket files found for ovn-northd Oct 14 06:14:38 localhost openstack_network_exporter[250374]: ERROR 10:14:38 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Oct 14 06:14:38 localhost openstack_network_exporter[250374]: ERROR 10:14:38 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Oct 14 06:14:38 localhost openstack_network_exporter[250374]: Oct 14 06:14:38 localhost openstack_network_exporter[250374]: ERROR 10:14:38 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Oct 14 06:14:38 localhost openstack_network_exporter[250374]: Oct 14 06:14:39 localhost neutron_sriov_agent[264974]: 2025-10-14 10:14:39.233 2 INFO neutron.agent.securitygroups_rpc [None req-1c8b7a71-aa4f-4514-bc27-ccf377daaaa0 30647d4700b846dba79efd27fad03f3d a840994a70374548889747682f4c0fa3 - - default default] Security group member updated ['59283390-a499-4358-9f49-155fd8075ea9']#033[00m Oct 14 06:14:39 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e130 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Oct 14 06:14:40 localhost nova_compute[297686]: 2025-10-14 10:14:40.145 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:14:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f. Oct 14 06:14:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917. Oct 14 06:14:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d. 
Oct 14 06:14:40 localhost podman[328656]: 2025-10-14 10:14:40.750967205 +0000 UTC m=+0.077961736 container health_status 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, vendor=Red Hat, Inc., vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, version=9.6, maintainer=Red Hat, Inc., release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, 
managed_by=edpm_ansible, distribution-scope=public, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.) Oct 14 06:14:40 localhost podman[328656]: 2025-10-14 10:14:40.767060963 +0000 UTC m=+0.094055474 container exec_died 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, config_id=edpm, name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, release=1755695350, vcs-type=git, version=9.6) Oct 14 06:14:40 localhost systemd[1]: 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917.service: Deactivated successfully. Oct 14 06:14:40 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:14:40.814 271987 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-10-14T10:14:40Z, description=, device_id=e8bf66bd-02f5-4c5f-b2a3-44f4cc67ba50, device_owner=network:router_gateway, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=a2139300-599b-4258-9c5a-18368cb4df56, ip_allocation=immediate, mac_address=fa:16:3e:9b:86:48, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-10-14T08:36:38Z, description=, dns_domain=, id=c0145816-4627-44f2-af00-ccc9ef0436ed, ipv4_address_scope=None, ipv6_address_scope=None, is_default=False, l2_adjacency=True, mtu=1350, name=public, port_security_enabled=True, project_id=41187b090f3d4818a32baa37ce8a3991, provider:network_type=flat, provider:physical_network=datacentre, provider:segmentation_id=None, qos_policy_id=None, revision_number=2, router:external=True, shared=False, standard_attr_id=29, status=ACTIVE, subnets=['bbd40dc1-97d8-4ed3-9d4f-9b3af758b526'], tags=[], tenant_id=41187b090f3d4818a32baa37ce8a3991, updated_at=2025-10-14T08:36:44Z, vlan_transparent=None, network_id=c0145816-4627-44f2-af00-ccc9ef0436ed, port_security_enabled=False, project_id=, qos_network_policy_id=None, 
qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1489, status=DOWN, tags=[], tenant_id=, updated_at=2025-10-14T10:14:40Z on network c0145816-4627-44f2-af00-ccc9ef0436ed#033[00m Oct 14 06:14:40 localhost podman[328657]: 2025-10-14 10:14:40.81828691 +0000 UTC m=+0.141658478 container health_status 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.build-date=20251009, container_name=ceilometer_agent_compute, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_id=edpm, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team) Oct 14 06:14:40 localhost systemd[1]: tmp-crun.RUb1et.mount: Deactivated successfully. Oct 14 06:14:40 localhost podman[328655]: 2025-10-14 10:14:40.862769797 +0000 UTC m=+0.191139670 container health_status 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true) Oct 14 06:14:40 localhost podman[328657]: 2025-10-14 10:14:40.883236421 +0000 UTC m=+0.206607979 container exec_died 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_id=edpm, 
container_name=ceilometer_agent_compute, org.label-schema.build-date=20251009, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0) Oct 14 06:14:40 localhost systemd[1]: 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d.service: Deactivated successfully. 
Oct 14 06:14:40 localhost podman[328655]: 2025-10-14 10:14:40.933169438 +0000 UTC m=+0.261539241 container exec_died 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller) Oct 14 06:14:40 localhost systemd[1]: 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f.service: Deactivated successfully. 
Oct 14 06:14:41 localhost dnsmasq[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/addn_hosts - 2 addresses Oct 14 06:14:41 localhost dnsmasq-dhcp[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/host Oct 14 06:14:41 localhost podman[328731]: 2025-10-14 10:14:41.049604265 +0000 UTC m=+0.057435911 container kill 27e23fba1f10c3118d508631d350793857acd97085c399f85eed727cd1eab7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0145816-4627-44f2-af00-ccc9ef0436ed, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true) Oct 14 06:14:41 localhost dnsmasq-dhcp[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/opts Oct 14 06:14:41 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:14:41.344 271987 INFO neutron.agent.dhcp.agent [None req-58e78749-d450-4427-9e2e-7cb8e820ac00 - - - - - -] DHCP configuration for ports {'a2139300-599b-4258-9c5a-18368cb4df56'} is completed#033[00m Oct 14 06:14:43 localhost nova_compute[297686]: 2025-10-14 10:14:43.008 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:14:44 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e130 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Oct 14 06:14:44 localhost neutron_sriov_agent[264974]: 2025-10-14 10:14:44.857 2 INFO neutron.agent.securitygroups_rpc [None req-087b4d3a-425d-4e9f-a312-99c3e3dbbf2b b11f5b75a52243ed86cd4fe28898caef eff4d352999d485c9bd9a3b3cbf0c569 - - default default] Security group member updated 
['25c1f9f0-ea5d-4940-9d8c-34da45a09b5d']#033[00m Oct 14 06:14:45 localhost nova_compute[297686]: 2025-10-14 10:14:45.177 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:14:45 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:14:45.283 271987 INFO neutron.agent.linux.ip_lib [None req-919f0472-878a-4c05-b2fe-2e85e0c329b1 - - - - - -] Device tap4b650fb3-c6 cannot be used as it has no MAC address#033[00m Oct 14 06:14:45 localhost nova_compute[297686]: 2025-10-14 10:14:45.308 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:14:45 localhost kernel: device tap4b650fb3-c6 entered promiscuous mode Oct 14 06:14:45 localhost NetworkManager[5977]: [1760436885.3152] manager: (tap4b650fb3-c6): new Generic device (/org/freedesktop/NetworkManager/Devices/28) Oct 14 06:14:45 localhost systemd-udevd[328761]: Network interface NamePolicy= disabled on kernel command line. Oct 14 06:14:45 localhost nova_compute[297686]: 2025-10-14 10:14:45.320 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:14:45 localhost ovn_controller[157396]: 2025-10-14T10:14:45Z|00143|binding|INFO|Claiming lport 4b650fb3-c6b4-440a-a92e-14d19513cdc0 for this chassis. 
Oct 14 06:14:45 localhost ovn_controller[157396]: 2025-10-14T10:14:45Z|00144|binding|INFO|4b650fb3-c6b4-440a-a92e-14d19513cdc0: Claiming unknown Oct 14 06:14:45 localhost ovn_metadata_agent[163050]: 2025-10-14 10:14:45.340 163055 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005486733.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpe1893814-19d7-5b97-af05-19e2aafa5382-d3f378a7-479f-4d46-b0a6-dfc40fc2a677', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d3f378a7-479f-4d46-b0a6-dfc40fc2a677', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '523b510232d6453589e91d54706d0036', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=da478d7f-fb03-4a3e-a201-949626e55857, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=4b650fb3-c6b4-440a-a92e-14d19513cdc0) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Oct 14 06:14:45 localhost ovn_metadata_agent[163050]: 2025-10-14 10:14:45.343 163055 INFO neutron.agent.ovn.metadata.agent [-] Port 4b650fb3-c6b4-440a-a92e-14d19513cdc0 in datapath d3f378a7-479f-4d46-b0a6-dfc40fc2a677 bound to our chassis#033[00m Oct 14 06:14:45 localhost ovn_metadata_agent[163050]: 2025-10-14 10:14:45.345 163055 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 
d3f378a7-479f-4d46-b0a6-dfc40fc2a677 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Oct 14 06:14:45 localhost journal[237477]: ethtool ioctl error on tap4b650fb3-c6: No such device Oct 14 06:14:45 localhost ovn_metadata_agent[163050]: 2025-10-14 10:14:45.346 163159 DEBUG oslo.privsep.daemon [-] privsep: reply[e403acaf-0167-4327-b210-e43c55c7e585]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Oct 14 06:14:45 localhost journal[237477]: ethtool ioctl error on tap4b650fb3-c6: No such device Oct 14 06:14:45 localhost ovn_controller[157396]: 2025-10-14T10:14:45Z|00145|binding|INFO|Setting lport 4b650fb3-c6b4-440a-a92e-14d19513cdc0 ovn-installed in OVS Oct 14 06:14:45 localhost ovn_controller[157396]: 2025-10-14T10:14:45Z|00146|binding|INFO|Setting lport 4b650fb3-c6b4-440a-a92e-14d19513cdc0 up in Southbound Oct 14 06:14:45 localhost journal[237477]: ethtool ioctl error on tap4b650fb3-c6: No such device Oct 14 06:14:45 localhost nova_compute[297686]: 2025-10-14 10:14:45.357 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:14:45 localhost journal[237477]: ethtool ioctl error on tap4b650fb3-c6: No such device Oct 14 06:14:45 localhost nova_compute[297686]: 2025-10-14 10:14:45.362 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:14:45 localhost journal[237477]: ethtool ioctl error on tap4b650fb3-c6: No such device Oct 14 06:14:45 localhost journal[237477]: ethtool ioctl error on tap4b650fb3-c6: No such device Oct 14 06:14:45 localhost journal[237477]: ethtool ioctl error on tap4b650fb3-c6: No such device Oct 14 06:14:45 localhost journal[237477]: ethtool ioctl error on tap4b650fb3-c6: No such device Oct 14 
06:14:45 localhost nova_compute[297686]: 2025-10-14 10:14:45.391 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:14:45 localhost nova_compute[297686]: 2025-10-14 10:14:45.414 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:14:45 localhost neutron_sriov_agent[264974]: 2025-10-14 10:14:45.661 2 INFO neutron.agent.securitygroups_rpc [None req-9c4b57ab-1247-47ee-8b0d-b94e292a0e87 b11f5b75a52243ed86cd4fe28898caef eff4d352999d485c9bd9a3b3cbf0c569 - - default default] Security group member updated ['25c1f9f0-ea5d-4940-9d8c-34da45a09b5d']#033[00m Oct 14 06:14:46 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:14:46.281 271987 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-10-14T10:14:46Z, description=, device_id=18f6f7f3-e902-49be-9587-f2d8a33844e0, device_owner=network:router_gateway, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=d3bc3340-8402-4a8c-8ea9-30f8ca05029a, ip_allocation=immediate, mac_address=fa:16:3e:43:3f:c8, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-10-14T08:36:38Z, description=, dns_domain=, id=c0145816-4627-44f2-af00-ccc9ef0436ed, ipv4_address_scope=None, ipv6_address_scope=None, is_default=False, l2_adjacency=True, mtu=1350, name=public, port_security_enabled=True, project_id=41187b090f3d4818a32baa37ce8a3991, provider:network_type=flat, provider:physical_network=datacentre, provider:segmentation_id=None, qos_policy_id=None, revision_number=2, router:external=True, shared=False, standard_attr_id=29, status=ACTIVE, subnets=['bbd40dc1-97d8-4ed3-9d4f-9b3af758b526'], tags=[], 
tenant_id=41187b090f3d4818a32baa37ce8a3991, updated_at=2025-10-14T08:36:44Z, vlan_transparent=None, network_id=c0145816-4627-44f2-af00-ccc9ef0436ed, port_security_enabled=False, project_id=, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1521, status=DOWN, tags=[], tenant_id=, updated_at=2025-10-14T10:14:46Z on network c0145816-4627-44f2-af00-ccc9ef0436ed#033[00m Oct 14 06:14:46 localhost podman[328832]: Oct 14 06:14:46 localhost podman[328832]: 2025-10-14 10:14:46.392536781 +0000 UTC m=+0.086955615 container create d47e9f07f8ff27d6e2072890b5d84ea9db63fd2f4eb9c99d384c1bf1be3cc0ef (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d3f378a7-479f-4d46-b0a6-dfc40fc2a677, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2) Oct 14 06:14:46 localhost systemd[1]: Started libpod-conmon-d47e9f07f8ff27d6e2072890b5d84ea9db63fd2f4eb9c99d384c1bf1be3cc0ef.scope. Oct 14 06:14:46 localhost podman[328832]: 2025-10-14 10:14:46.341546661 +0000 UTC m=+0.035965515 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Oct 14 06:14:46 localhost systemd[1]: Started libcrun container. 
Oct 14 06:14:46 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1e9479996f603ff3e34b71c17ea01243896aa9ce4778c865298cabd070d4c698/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Oct 14 06:14:46 localhost podman[328832]: 2025-10-14 10:14:46.47452218 +0000 UTC m=+0.168940984 container init d47e9f07f8ff27d6e2072890b5d84ea9db63fd2f4eb9c99d384c1bf1be3cc0ef (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d3f378a7-479f-4d46-b0a6-dfc40fc2a677, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5) Oct 14 06:14:46 localhost podman[328832]: 2025-10-14 10:14:46.486667106 +0000 UTC m=+0.181085900 container start d47e9f07f8ff27d6e2072890b5d84ea9db63fd2f4eb9c99d384c1bf1be3cc0ef (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d3f378a7-479f-4d46-b0a6-dfc40fc2a677, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2) Oct 14 06:14:46 localhost dnsmasq[328878]: started, version 2.85 cachesize 150 Oct 14 06:14:46 localhost dnsmasq[328878]: DNS service limited to local subnets Oct 14 06:14:46 localhost dnsmasq[328878]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Oct 14 06:14:46 localhost dnsmasq[328878]: warning: no upstream servers 
configured Oct 14 06:14:46 localhost dnsmasq-dhcp[328878]: DHCP, static leases only on 10.100.0.0, lease time 1d Oct 14 06:14:46 localhost dnsmasq[328878]: read /var/lib/neutron/dhcp/d3f378a7-479f-4d46-b0a6-dfc40fc2a677/addn_hosts - 0 addresses Oct 14 06:14:46 localhost dnsmasq-dhcp[328878]: read /var/lib/neutron/dhcp/d3f378a7-479f-4d46-b0a6-dfc40fc2a677/host Oct 14 06:14:46 localhost dnsmasq-dhcp[328878]: read /var/lib/neutron/dhcp/d3f378a7-479f-4d46-b0a6-dfc40fc2a677/opts Oct 14 06:14:46 localhost podman[328865]: 2025-10-14 10:14:46.499175593 +0000 UTC m=+0.050968599 container kill 27e23fba1f10c3118d508631d350793857acd97085c399f85eed727cd1eab7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0145816-4627-44f2-af00-ccc9ef0436ed, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, org.label-schema.vendor=CentOS) Oct 14 06:14:46 localhost dnsmasq[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/addn_hosts - 3 addresses Oct 14 06:14:46 localhost dnsmasq-dhcp[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/host Oct 14 06:14:46 localhost dnsmasq-dhcp[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/opts Oct 14 06:14:46 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:14:46.633 271987 INFO neutron.agent.dhcp.agent [None req-98c56167-5c80-4083-a491-44a9f76ce1fe - - - - - -] DHCP configuration for ports {'0e47744b-364e-4162-89c0-32d04e05ab07'} is completed#033[00m Oct 14 06:14:46 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:14:46.791 271987 INFO neutron.agent.dhcp.agent [None req-0befd4c9-7ef5-46a1-b51e-f104ea316397 - - - - - -] DHCP configuration for ports 
{'d3bc3340-8402-4a8c-8ea9-30f8ca05029a'} is completed#033[00m Oct 14 06:14:47 localhost nova_compute[297686]: 2025-10-14 10:14:47.167 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:14:47 localhost systemd[1]: tmp-crun.MqCWzt.mount: Deactivated successfully. Oct 14 06:14:48 localhost nova_compute[297686]: 2025-10-14 10:14:48.034 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:14:48 localhost neutron_sriov_agent[264974]: 2025-10-14 10:14:48.075 2 INFO neutron.agent.securitygroups_rpc [None req-e913d1a3-5246-4050-a905-42a51970b588 30647d4700b846dba79efd27fad03f3d a840994a70374548889747682f4c0fa3 - - default default] Security group member updated ['59283390-a499-4358-9f49-155fd8075ea9']#033[00m Oct 14 06:14:48 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:14:48.203 271987 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-10-14T10:14:47Z, description=, device_id=18f6f7f3-e902-49be-9587-f2d8a33844e0, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=a4ba8821-e1be-4e5c-87d2-5bc2468f4e8d, ip_allocation=immediate, mac_address=fa:16:3e:3e:84:be, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-10-14T10:14:42Z, description=, dns_domain=, id=d3f378a7-479f-4d46-b0a6-dfc40fc2a677, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-VolumesActionsTest-963730324-network, port_security_enabled=True, project_id=523b510232d6453589e91d54706d0036, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=44435, 
qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1503, status=ACTIVE, subnets=['56aa0981-fb58-41ae-8192-844865eb7564'], tags=[], tenant_id=523b510232d6453589e91d54706d0036, updated_at=2025-10-14T10:14:44Z, vlan_transparent=None, network_id=d3f378a7-479f-4d46-b0a6-dfc40fc2a677, port_security_enabled=False, project_id=523b510232d6453589e91d54706d0036, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1529, status=DOWN, tags=[], tenant_id=523b510232d6453589e91d54706d0036, updated_at=2025-10-14T10:14:47Z on network d3f378a7-479f-4d46-b0a6-dfc40fc2a677#033[00m Oct 14 06:14:48 localhost podman[328907]: 2025-10-14 10:14:48.439320795 +0000 UTC m=+0.064728736 container kill d47e9f07f8ff27d6e2072890b5d84ea9db63fd2f4eb9c99d384c1bf1be3cc0ef (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d3f378a7-479f-4d46-b0a6-dfc40fc2a677, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS) Oct 14 06:14:48 localhost dnsmasq[328878]: read /var/lib/neutron/dhcp/d3f378a7-479f-4d46-b0a6-dfc40fc2a677/addn_hosts - 1 addresses Oct 14 06:14:48 localhost dnsmasq-dhcp[328878]: read /var/lib/neutron/dhcp/d3f378a7-479f-4d46-b0a6-dfc40fc2a677/host Oct 14 06:14:48 localhost dnsmasq-dhcp[328878]: read /var/lib/neutron/dhcp/d3f378a7-479f-4d46-b0a6-dfc40fc2a677/opts Oct 14 06:14:48 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:14:48.764 271987 INFO neutron.agent.dhcp.agent [None req-18baaeee-0f9b-4e81-93d1-dff774e2c683 - - - - - -] DHCP configuration for ports {'a4ba8821-e1be-4e5c-87d2-5bc2468f4e8d'} is completed#033[00m Oct 14 06:14:48 
localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Oct 14 06:14:48 localhost ceph-mon[317114]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1056481191' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Oct 14 06:14:48 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Oct 14 06:14:48 localhost ceph-mon[317114]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1056481191' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Oct 14 06:14:49 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e130 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Oct 14 06:14:49 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:14:49.634 271987 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-10-14T10:14:47Z, description=, device_id=18f6f7f3-e902-49be-9587-f2d8a33844e0, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=a4ba8821-e1be-4e5c-87d2-5bc2468f4e8d, ip_allocation=immediate, mac_address=fa:16:3e:3e:84:be, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-10-14T10:14:42Z, description=, dns_domain=, id=d3f378a7-479f-4d46-b0a6-dfc40fc2a677, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-VolumesActionsTest-963730324-network, port_security_enabled=True, project_id=523b510232d6453589e91d54706d0036, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=44435, 
qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1503, status=ACTIVE, subnets=['56aa0981-fb58-41ae-8192-844865eb7564'], tags=[], tenant_id=523b510232d6453589e91d54706d0036, updated_at=2025-10-14T10:14:44Z, vlan_transparent=None, network_id=d3f378a7-479f-4d46-b0a6-dfc40fc2a677, port_security_enabled=False, project_id=523b510232d6453589e91d54706d0036, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1529, status=DOWN, tags=[], tenant_id=523b510232d6453589e91d54706d0036, updated_at=2025-10-14T10:14:47Z on network d3f378a7-479f-4d46-b0a6-dfc40fc2a677#033[00m Oct 14 06:14:49 localhost neutron_sriov_agent[264974]: 2025-10-14 10:14:49.645 2 INFO neutron.agent.securitygroups_rpc [None req-a9a6ddc4-4af7-41ae-9f40-e52cdf5d095d 30647d4700b846dba79efd27fad03f3d a840994a70374548889747682f4c0fa3 - - default default] Security group member updated ['59283390-a499-4358-9f49-155fd8075ea9']#033[00m Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.821 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'name': 'test', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005486733.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '41187b090f3d4818a32baa37ce8a3991', 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'hostId': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228 Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.822 12 DEBUG 
ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.822 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.846 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.read.requests volume: 1272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.847 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.849 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'f4ae9134-df73-4f38-a2f9-05e5b000b9da', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1272, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T10:14:49.822886', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '9e316dfa-a8e6-11f0-9707-fa163e99780b', 'monotonic_time': 12906.015600595, 'message_signature': 'c2a8e4ff4fc6f11c8babad012dba7e1693fd42b4130c2687df4e2953f793303d'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T10:14:49.822886', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '9e318448-a8e6-11f0-9707-fa163e99780b', 'monotonic_time': 12906.015600595, 'message_signature': 'e8459994ed96bc2217e79aab5e2445b181be864283c0c27a51fb5f73780e0764'}]}, 'timestamp': '2025-10-14 10:14:49.847777', '_unique_id': '80665f7cc44e4f439effc1738b04d2d0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.849 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.849 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.849 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.849 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.849 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.849 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.849 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.849 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.849 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.849 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.849 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.849 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.849 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.849 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.849 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.849 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 
06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.849 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.849 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.849 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.849 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.849 12 ERROR oslo_messaging.notify.messaging Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.849 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.849 12 ERROR oslo_messaging.notify.messaging Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.849 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.849 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.849 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.849 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:14:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.849 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.849 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.849 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.849 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.849 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.849 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.849 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.849 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.849 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.849 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.849 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.849 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.849 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.849 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.849 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.849 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.849 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.849 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.849 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.849 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:14:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.849 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.849 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.849 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.849 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.849 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.849 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.849 12 ERROR oslo_messaging.notify.messaging Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.850 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.851 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.863 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:14:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.864 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.866 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '23aacaa2-b590-4a6c-be8c-a9dfeb96ebca', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T10:14:49.851258', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '9e340e98-a8e6-11f0-9707-fa163e99780b', 'monotonic_time': 12906.044020005, 'message_signature': 'bcba9e7defbce3311ef580483310dbcd87fbfafe2dafa0eeb5cf850c6ec1db96'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 
'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T10:14:49.851258', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '9e342496-a8e6-11f0-9707-fa163e99780b', 'monotonic_time': 12906.044020005, 'message_signature': 'd74f153084304826098e0e5b7636961e57bbeb2ffce4b4632fc6501c31cd2df3'}]}, 'timestamp': '2025-10-14 10:14:49.864946', '_unique_id': '525d56661229484cbfd70bd179a0c248'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.866 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.866 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.866 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 
10:14:49.866 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.866 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.866 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.866 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.866 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.866 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.866 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.866 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.866 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.866 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.866 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.866 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.866 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.866 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.866 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.866 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.866 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.866 12 ERROR oslo_messaging.notify.messaging Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.866 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.866 12 ERROR oslo_messaging.notify.messaging Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.866 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.866 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:14:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.866 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.866 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.866 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.866 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.866 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.866 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.866 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.866 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.866 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.866 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.866 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.866 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.866 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.866 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.866 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.866 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.866 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.866 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.866 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.866 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.866 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.866 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.866 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.866 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.866 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.866 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.866 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.866 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.866 12 ERROR oslo_messaging.notify.messaging Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.867 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.872 12 DEBUG 
ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.874 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1a806e47-100a-4bc3-8fa6-ffde02f5a8c0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T10:14:49.868024', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': '9e3555dc-a8e6-11f0-9707-fa163e99780b', 'monotonic_time': 12906.060786784, 'message_signature': 
'3d685ba0e71036558ae194af9812dc134c15e1adfe5fd5abf016dd66dfbf5201'}]}, 'timestamp': '2025-10-14 10:14:49.872895', '_unique_id': 'fb2fb0a4b67c45659027f20138dca367'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.874 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.874 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.874 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.874 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.874 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.874 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.874 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.874 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.874 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.874 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.874 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.874 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.874 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.874 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.874 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.874 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.874 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.874 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.874 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.874 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:14:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.874 12 ERROR oslo_messaging.notify.messaging Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.874 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.874 12 ERROR oslo_messaging.notify.messaging Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.874 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.874 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.874 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.874 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.874 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.874 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.874 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.874 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", 
line 653, in _send Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.874 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.874 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.874 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.874 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.874 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.874 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.874 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.874 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.874 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.874 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.874 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.874 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.874 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.874 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.874 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.874 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.874 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.874 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.874 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.874 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:14:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.874 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.874 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.874 12 ERROR oslo_messaging.notify.messaging Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.875 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.875 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.read.latency volume: 1739521236 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.876 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.read.latency volume: 110324003 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.878 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '2be1cd7c-1ce4-482e-b0d1-7e825e677cf8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1739521236, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T10:14:49.875777', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '9e35df52-a8e6-11f0-9707-fa163e99780b', 'monotonic_time': 12906.015600595, 'message_signature': '4b78d66cd70dc5b1196273c240cbaf3314e16368d1c6310b4820f4f0cd78defd'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 110324003, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T10:14:49.875777', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '9e35fc1c-a8e6-11f0-9707-fa163e99780b', 'monotonic_time': 12906.015600595, 'message_signature': '1e26d98e0b2b43159024ad666163d5a0fbb521099321ef160fc30ff1eab9eaec'}]}, 'timestamp': '2025-10-14 10:14:49.877048', '_unique_id': 'f1cee6d4aa1f495c9221114cba9a3814'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.878 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.878 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.878 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.878 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.878 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.878 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.878 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.878 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.878 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 
06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.878 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.878 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.878 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.878 12 ERROR oslo_messaging.notify.messaging Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.878 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.878 12 ERROR oslo_messaging.notify.messaging Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.878 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.878 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:14:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.878 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.878 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.878 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.878 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.878 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.878 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.878 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.878 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.878 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.878 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:14:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.878 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.878 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.878 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.878 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.878 12 ERROR oslo_messaging.notify.messaging Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.879 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.879 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.879 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:14:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.881 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '277ac876-3361-4e03-8932-cf380fab89bf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T10:14:49.879279', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '9e366706-a8e6-11f0-9707-fa163e99780b', 'monotonic_time': 12906.044020005, 'message_signature': '23e1a18730c38b8f4d31447ec8deefc4001c3da9791802cd9d7c8d6c3b39cf6c'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': 
'2025-10-14T10:14:49.879279', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '9e367ba6-a8e6-11f0-9707-fa163e99780b', 'monotonic_time': 12906.044020005, 'message_signature': '5840984a71991cc333f3863de97799d41d63b503ece1aaf590c6664ff1e69fc4'}]}, 'timestamp': '2025-10-14 10:14:49.880267', '_unique_id': '80ffd705480b407380e8576f954c50c2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.881 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.881 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.881 12 ERROR oslo_messaging.notify.messaging return retry_over_time( 
Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.881 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.881 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.881 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.881 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.881 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.881 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.881 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.881 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.881 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.881 12 ERROR oslo_messaging.notify.messaging Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.881 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.881 12 ERROR oslo_messaging.notify.messaging Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.881 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.881 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.881 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.881 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.881 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.881 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.881 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.881 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.881 
12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.881 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.881 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.881 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.881 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.881 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.881 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.881 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.881 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.881 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.881 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.881 12 ERROR oslo_messaging.notify.messaging Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.882 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.882 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.write.bytes volume: 454656 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.883 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.write.bytes volume: 512 
_stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.886 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '00ff04c5-6a80-4f49-899d-ed2d0349a156', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 454656, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T10:14:49.882577', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '9e36e9d8-a8e6-11f0-9707-fa163e99780b', 'monotonic_time': 12906.015600595, 'message_signature': 'a0d68e3de1aa3ffef1c81f0719b85e7712dbf47efda0229f7f9873818325ca6b'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': 
'41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T10:14:49.882577', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '9e36fad6-a8e6-11f0-9707-fa163e99780b', 'monotonic_time': 12906.015600595, 'message_signature': 'bd0b442eda0bb52bf7ac429e10f19fc446817798f274324b8631ab05ce14180d'}]}, 'timestamp': '2025-10-14 10:14:49.883516', '_unique_id': '416279ab76934997a37ccea3ac8ff40f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.886 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.886 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.886 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.886 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:14:49 
localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.886 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.886 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.886 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.886 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.886 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.886 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.886 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.886 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.886 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.886 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.886 12 ERROR oslo_messaging.notify.messaging 
self.transport.connect() Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.886 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.886 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.886 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.886 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.886 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.886 12 ERROR oslo_messaging.notify.messaging Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.886 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.886 12 ERROR oslo_messaging.notify.messaging Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.886 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.886 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.886 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:14:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.886 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.886 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.886 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.886 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.886 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.886 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.886 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.886 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.886 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.886 12 ERROR oslo_messaging.notify.messaging self.connection = 
connection_pool.get(retry=retry) Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.886 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.886 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.886 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.886 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.886 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.886 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.886 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.886 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.886 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.886 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:14:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.886 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.886 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.886 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.886 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.886 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.886 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.886 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.886 12 ERROR oslo_messaging.notify.messaging Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.887 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.888 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.888 12 DEBUG 
ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.outgoing.packets volume: 118 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.889 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '881046cc-2ee7-4d74-b017-c7feb2f4d266', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 118, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T10:14:49.888270', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': '9e37c718-a8e6-11f0-9707-fa163e99780b', 'monotonic_time': 12906.060786784, 'message_signature': 
'f93b4db66d971f69727cc3116e1c2920970d5bd93611f4b9dd78dba4a3b183b5'}]}, 'timestamp': '2025-10-14 10:14:49.888861', '_unique_id': 'e31def44cc104cabb948818b5f5cded7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.889 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.889 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.889 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.889 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.889 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.889 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.889 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.889 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.889 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.889 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.889 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.889 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.889 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.889 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.889 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.889 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.889 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.889 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.889 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.889 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:14:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.889 12 ERROR oslo_messaging.notify.messaging Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.889 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.889 12 ERROR oslo_messaging.notify.messaging Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.889 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.889 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.889 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.889 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.889 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.889 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.889 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.889 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", 
line 653, in _send Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.889 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.889 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.889 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.889 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.889 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.889 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.889 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.889 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.889 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.889 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.889 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.889 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.889 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.889 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.889 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.889 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.889 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.889 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.889 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.889 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:14:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.889 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.889 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.889 12 ERROR oslo_messaging.notify.messaging Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.890 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.891 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.write.requests volume: 53 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.891 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.892 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'e418ff8b-e763-4467-b3f2-6df438bbfad6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 53, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T10:14:49.891096', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '9e3833d8-a8e6-11f0-9707-fa163e99780b', 'monotonic_time': 12906.015600595, 'message_signature': '61799e655dfd0a4679407bad711b0f63f985f280f42baa017056688d382825d9'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T10:14:49.891096', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '9e38451c-a8e6-11f0-9707-fa163e99780b', 'monotonic_time': 12906.015600595, 'message_signature': '185e468beed9ee52b17542e3ad0948d9a279a6e287b49a9b77ccd143c9b74baf'}]}, 'timestamp': '2025-10-14 10:14:49.892008', '_unique_id': '3639f747a531479c9ac565ca83a4483f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.892 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.892 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.892 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.892 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.892 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.892 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.892 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.892 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.892 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.892 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.892 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.892 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.892 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.892 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.892 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.892 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 
06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.892 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.892 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.892 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.892 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.892 12 ERROR oslo_messaging.notify.messaging Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.892 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.892 12 ERROR oslo_messaging.notify.messaging Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.892 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.892 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.892 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.892 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:14:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.892 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.892 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.892 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.892 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.892 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.892 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.892 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.892 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.892 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.892 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.892 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.892 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.892 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.892 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.892 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.892 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.892 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.892 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.892 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.892 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:14:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.892 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.892 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.892 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.892 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.892 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.892 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.892 12 ERROR oslo_messaging.notify.messaging Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.894 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.894 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.894 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.incoming.bytes volume: 8190 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:14:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.895 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c9e96c78-7362-4758-ac17-5054f0f9cece', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 8190, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T10:14:49.894323', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': '9e38b1be-a8e6-11f0-9707-fa163e99780b', 'monotonic_time': 12906.060786784, 'message_signature': '04376822ae2065b2dc607e627e61b48dced868b284b85ec3f6abc1607c195a18'}]}, 'timestamp': '2025-10-14 10:14:49.894817', '_unique_id': '6efaafe6b9284a62877180524584357a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 
2025-10-14 10:14:49.895 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.895 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.895 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.895 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.895 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.895 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.895 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.895 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.895 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.895 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.895 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:14:49 
localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.895 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.895 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.895 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.895 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.895 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.895 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.895 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.895 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.895 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.895 12 ERROR oslo_messaging.notify.messaging Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.895 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:14:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.895 12 ERROR oslo_messaging.notify.messaging Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.895 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.895 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.895 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.895 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.895 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.895 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.895 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.895 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.895 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.895 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.895 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.895 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.895 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.895 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.895 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.895 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.895 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.895 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.895 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.895 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.895 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.895 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.895 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.895 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.895 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.895 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:14:49 localhost dnsmasq[328878]: read /var/lib/neutron/dhcp/d3f378a7-479f-4d46-b0a6-dfc40fc2a677/addn_hosts - 1 addresses Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.895 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:14:49 localhost dnsmasq-dhcp[328878]: read /var/lib/neutron/dhcp/d3f378a7-479f-4d46-b0a6-dfc40fc2a677/host Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.895 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:14:49 localhost dnsmasq-dhcp[328878]: read /var/lib/neutron/dhcp/d3f378a7-479f-4d46-b0a6-dfc40fc2a677/opts 
Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.895 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.895 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.895 12 ERROR oslo_messaging.notify.messaging Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.896 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.897 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.write.latency volume: 1178650666 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.897 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.write.latency volume: 22746151 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:14:49 localhost systemd[1]: tmp-crun.2zct1C.mount: Deactivated successfully. Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.898 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '9f5aa65b-a702-403e-8ac8-3891daf9de68', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1178650666, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T10:14:49.897072', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '9e391cf8-a8e6-11f0-9707-fa163e99780b', 'monotonic_time': 12906.015600595, 'message_signature': '156a2e90bae8b6fb193c82e5dabd6aa2b28ee7b416820db0523f9a79da922a0e'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 22746151, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T10:14:49.897072', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '9e392e28-a8e6-11f0-9707-fa163e99780b', 'monotonic_time': 12906.015600595, 'message_signature': 'f296c6d3bc3a10ddcb36808433d9d25b3018b92f2365eea4cdfd2eedbfbfda12'}]}, 'timestamp': '2025-10-14 10:14:49.897939', '_unique_id': 'b50c19a1b01a4573bd9223425fc0fbe2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.898 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.898 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.898 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.898 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.898 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.898 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.898 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.898 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.898 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 
06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.898 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.898 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.898 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.898 12 ERROR oslo_messaging.notify.messaging Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.898 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.898 12 ERROR oslo_messaging.notify.messaging Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.898 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.898 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:14:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.898 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.898 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.898 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.898 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.898 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.898 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.898 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.898 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.898 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.898 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:14:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.898 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.898 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.898 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.898 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.898 12 ERROR oslo_messaging.notify.messaging Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.900 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.900 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.900 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:14:49 
localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.902 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4b97a099-97c9-4eb7-a1d9-435cf7708365', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T10:14:49.900192', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '9e399818-a8e6-11f0-9707-fa163e99780b', 'monotonic_time': 12906.044020005, 'message_signature': '4d01184b07a883dc60dc04df257426f6be0868fbc9a3a80694d929a154f82739'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': 
'2025-10-14T10:14:49.900192', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '9e39a966-a8e6-11f0-9707-fa163e99780b', 'monotonic_time': 12906.044020005, 'message_signature': '845afa816896c1a654a48ee4350504fb0a64f2a8002c8366d1daed0def33ff1f'}]}, 'timestamp': '2025-10-14 10:14:49.901102', '_unique_id': '187d84cddb3c4102810462db80904194'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.902 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.902 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.902 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.902 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.902 12 ERROR oslo_messaging.notify.messaging return retry_over_time( 
Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.902 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.902 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.902 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.902 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.902 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.902 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.902 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.902 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.902 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.902 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.902 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.902 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.902 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.902 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.902 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.902 12 ERROR oslo_messaging.notify.messaging Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.902 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.902 12 ERROR oslo_messaging.notify.messaging Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.902 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.902 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.902 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.902 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.902 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.902 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.902 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.902 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.902 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.902 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.902 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.902 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.902 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.902 
12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.902 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.902 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.902 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.902 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.902 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.902 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.902 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.902 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.902 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.902 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.902 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.902 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.902 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.902 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.902 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.902 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.902 12 ERROR oslo_messaging.notify.messaging Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.903 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.903 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.incoming.packets volume: 79 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.904 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'b1481d5a-e059-4c71-b40d-ce4fb9ea6128', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 79, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T10:14:49.903238', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': '9e3a0e10-a8e6-11f0-9707-fa163e99780b', 'monotonic_time': 12906.060786784, 'message_signature': 'f552718e15af60476753fed75c1201bc579b62e4557fb82b78ab25e2e2222fba'}]}, 'timestamp': '2025-10-14 10:14:49.903735', '_unique_id': '0279bc8f31a6448c8d0c064af8624edb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.904 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:14:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.904 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.904 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.904 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.904 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.904 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.904 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.904 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.904 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.904 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.904 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.904 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.904 12 ERROR oslo_messaging.notify.messaging Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.904 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.904 12 ERROR oslo_messaging.notify.messaging Oct 14 06:14:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.904 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.904 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.904 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.904 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.904 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.904 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.904 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.904 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.904 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.904 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:14:49 
localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.904 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.904 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.904 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.904 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.904 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.904 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.904 12 ERROR oslo_messaging.notify.messaging Oct 14 06:14:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.905 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.906 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:14:49 localhost podman[328946]: 2025-10-14 10:14:49.90462584 +0000 UTC m=+0.084773597 container kill d47e9f07f8ff27d6e2072890b5d84ea9db63fd2f4eb9c99d384c1bf1be3cc0ef (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d3f378a7-479f-4d46-b0a6-dfc40fc2a677, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2) Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.907 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '5bccd3eb-37d3-48fd-869a-625e57a29912', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T10:14:49.905978', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': '9e3a7904-a8e6-11f0-9707-fa163e99780b', 'monotonic_time': 12906.060786784, 'message_signature': '23c1518a5f61ce253424d25ff1d59b829eaec9911866f650ed440cc1d9a23add'}]}, 'timestamp': '2025-10-14 10:14:49.906455', '_unique_id': '738e9aee05d047d287aa485b2d3c5999'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.907 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 
2025-10-14 10:14:49.907 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.907 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.907 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.907 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.907 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.907 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.907 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.907 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.907 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.907 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.907 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.907 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.907 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.907 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.907 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.907 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.907 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.907 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.907 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.907 12 ERROR oslo_messaging.notify.messaging Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.907 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.907 12 ERROR oslo_messaging.notify.messaging Oct 14 06:14:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.907 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.907 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.907 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.907 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.907 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.907 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.907 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.907 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.907 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.907 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.907 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.907 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.907 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.907 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.907 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.907 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.907 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.907 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.907 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.907 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:14:49 
localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.907 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.907 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.907 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.907 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.907 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.907 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.907 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.907 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.907 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.907 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.907 12 ERROR oslo_messaging.notify.messaging Oct 14 06:14:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.908 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.926 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/cpu volume: 16250000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.929 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f8ccca55-3b2d-4b61-8190-7f793a8018e9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 16250000000, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'timestamp': '2025-10-14T10:14:49.908667', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '9e3dae44-a8e6-11f0-9707-fa163e99780b', 'monotonic_time': 12906.119383299, 
'message_signature': '42484bc3142abe49a5aef7689d2349ad99584131e99196c2aa76551488978c68'}]}, 'timestamp': '2025-10-14 10:14:49.927566', '_unique_id': '77c1c98572174d4193bdec53ee600c41'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.929 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.929 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.929 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.929 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.929 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.929 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.929 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.929 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.929 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.929 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.929 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.929 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:14:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.929 12 ERROR oslo_messaging.notify.messaging Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.929 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.929 12 ERROR oslo_messaging.notify.messaging Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.929 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.929 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.929 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.929 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", 
line 653, in _send Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.929 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.929 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.929 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.929 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.929 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.929 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.929 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.929 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.929 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.929 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.929 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:14:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.929 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.929 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.929 12 ERROR oslo_messaging.notify.messaging Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.930 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.930 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.outgoing.bytes volume: 10162 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.931 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '170ce6e6-e29c-4aa2-a496-225d937b5115', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 10162, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T10:14:49.930418', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': '9e3e3486-a8e6-11f0-9707-fa163e99780b', 'monotonic_time': 12906.060786784, 'message_signature': '542809c8589ababf236da44bfb54460c5469b0a70820982aafd955e00106ef1b'}]}, 'timestamp': '2025-10-14 10:14:49.930976', '_unique_id': '4db25fc2f5564d8bbc97f5827097aea5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.931 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:14:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.931 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.931 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.931 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.931 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.931 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.931 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.931 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.931 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.931 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.931 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.931 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.931 12 ERROR oslo_messaging.notify.messaging Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.931 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.931 12 ERROR oslo_messaging.notify.messaging Oct 14 06:14:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.931 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.931 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.931 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.931 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.931 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.931 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.931 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.931 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.931 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.931 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:14:49 
localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.931 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.931 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.931 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.931 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.931 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.931 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.931 12 ERROR oslo_messaging.notify.messaging Oct 14 06:14:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.933 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.933 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.934 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1c9678e8-cb0b-4357-b38f-18d12ad61124', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T10:14:49.933161', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 
'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': '9e3e9f2a-a8e6-11f0-9707-fa163e99780b', 'monotonic_time': 12906.060786784, 'message_signature': 'aa4d2b6ac397923ffde2cc56fd43f6c0293ddf578cdc34244ef03e91c66056b6'}]}, 'timestamp': '2025-10-14 10:14:49.933741', '_unique_id': 'd65dfc1aad1d4a3dbfcfd19ddb806c3d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.934 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.934 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.934 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.934 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.934 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.934 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.934 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.934 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 
10:14:49.934 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.934 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.934 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.934 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.934 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.934 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.934 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.934 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.934 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.934 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.934 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:14:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.934 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.934 12 ERROR oslo_messaging.notify.messaging Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.934 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.934 12 ERROR oslo_messaging.notify.messaging Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.934 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.934 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.934 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.934 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.934 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.934 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.934 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:14:49 
localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.934 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.934 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.934 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.934 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.934 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.934 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.934 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.934 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.934 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.934 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) 
Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.934 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.934 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.934 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.934 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.934 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.934 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.934 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.934 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.934 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.934 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.934 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.934 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.934 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.934 12 ERROR oslo_messaging.notify.messaging Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.935 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.936 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.937 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'aa1595ba-8ac6-401b-956d-02ddf20e378e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T10:14:49.935991', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': '9e3f0d52-a8e6-11f0-9707-fa163e99780b', 'monotonic_time': 12906.060786784, 'message_signature': '50a3de5d5ef22b60bcc0c4ee5f2454312cc0654e5e79b3b4935d2b64b5b6522d'}]}, 'timestamp': '2025-10-14 10:14:49.936450', '_unique_id': '4ea91f5d5c324a6e9c218a072569b4ae'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.937 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:14:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.937 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.937 12 ERROR oslo_messaging.notify.messaging yield
Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.937 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.937 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.937 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.937 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.937 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.937 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.937 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.937 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.937 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.937 12 ERROR oslo_messaging.notify.messaging conn.connect()
Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.937 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.937 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.937 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.937 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.937 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.937 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.937 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.937 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.937 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.937 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.937 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.937 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.937 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.937 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.937 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.937 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.937 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.937 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.937 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.937 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.937 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.937 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.937 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.937 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.937 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.937 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.937 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.937 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.937 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.937 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.937 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.937 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.937 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.937 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.937 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.937 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.937 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.937 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.937 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.937 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.937 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.938 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.938 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.read.bytes volume: 35560448 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.939 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.940 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5f80b529-04fc-4e8e-9660-6bb727ec241c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35560448, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T10:14:49.938542', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '9e3f722e-a8e6-11f0-9707-fa163e99780b', 'monotonic_time': 12906.015600595, 'message_signature': 'e3b5af50e1281c34aea0592cb99db3a7d851f2e1226e41b67a45e50be7652d23'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T10:14:49.938542', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '9e3f8296-a8e6-11f0-9707-fa163e99780b', 'monotonic_time': 12906.015600595, 'message_signature': '0f2b57c36f5f6ea52a4e0b1cc5926aceaec2ab277758a24e38638c8987cda0d5'}]}, 'timestamp': '2025-10-14 10:14:49.939419', '_unique_id': '1efd63e95e29450e872b39fd00286019'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.940 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.940 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.940 12 ERROR oslo_messaging.notify.messaging yield
Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.940 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.940 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.940 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.940 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.940 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.940 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.940 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.940 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.940 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.940 12 ERROR oslo_messaging.notify.messaging conn.connect()
Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.940 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.940 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.940 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.940 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.940 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.940 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.940 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.940 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.940 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.940 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.940 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.940 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.940 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.940 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.940 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.940 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.940 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.940 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.940 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.940 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.940 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.940 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.940 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.940 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.940 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.940 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.940 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.940 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.940 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.940 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.940 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.940 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.940 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.940 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.940 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.940 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.940 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.940 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.940 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.940 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.940 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.941 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.941 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.943 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '32ec729c-a58d-4d39-a321-5dbf4fa5073e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T10:14:49.941721', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': '9e3fee0c-a8e6-11f0-9707-fa163e99780b', 'monotonic_time': 12906.060786784, 'message_signature': '72c91031d894c2b252c090e2030a39f5d78551a605dcb7497fc14ff26039ab61'}]}, 'timestamp': '2025-10-14 10:14:49.942204', '_unique_id': 'fd6fd5652ffe447bb1418cbbdc245935'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.943 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.943 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.943 12 ERROR oslo_messaging.notify.messaging yield
Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.943 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.943 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.943 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.943 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.943 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.943 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.943 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.943 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.943 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.943 12 ERROR oslo_messaging.notify.messaging conn.connect()
Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.943 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.943 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.943 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.943 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.943 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.943 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.943 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.943 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.943 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.943 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.943 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.943 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.943 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.943 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.943 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.943 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.943 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.943 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.943 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.943 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.943 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.943 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.943 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.943 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.943 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.943 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.943 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.943 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.943 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.943 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.943 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.943 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.943 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.943 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.943 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.943 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.943 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.943 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.943 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.943 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.943 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.944 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.944 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.945 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '10b47767-d948-4dd8-b342-7d0032cb599e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T10:14:49.944394', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': '9e405298-a8e6-11f0-9707-fa163e99780b', 'monotonic_time': 12906.060786784, 'message_signature': 'e0bb765e8e0053196c28213830afe8c650fadc4363cfc48be040457841078709'}]}, 'timestamp': '2025-10-14 10:14:49.944711', '_unique_id': '3afa746d800841da9e70723bb28d53b2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.945 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.945 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.945 12 ERROR oslo_messaging.notify.messaging yield
Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.945 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.945 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.945 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.945 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.945 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.945 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.945 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.945 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.945 12 ERROR oslo_messaging.notify.messaging File
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.945 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.945 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.945 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.945 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.945 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.945 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.945 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.945 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.945 12 ERROR oslo_messaging.notify.messaging Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.945 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.945 12 ERROR oslo_messaging.notify.messaging Oct 14 06:14:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.945 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.945 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.945 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.945 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.945 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.945 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.945 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.945 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.945 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.945 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.945 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.945 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.945 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.945 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.945 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.945 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.945 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.945 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.945 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.945 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:14:49 
localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.945 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.945 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.945 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.945 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.945 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.945 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.945 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.945 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.945 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.945 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.945 12 ERROR oslo_messaging.notify.messaging Oct 14 06:14:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.945 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.946 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/memory.usage volume: 51.81640625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.946 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c9790c2a-5578-4148-b58c-5e4c41142db6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.81640625, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'timestamp': '2025-10-14T10:14:49.946027', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '9e409262-a8e6-11f0-9707-fa163e99780b', 'monotonic_time': 12906.119383299, 
'message_signature': '93d6c3c2abdc5ac444b88384b8ea81832ee7ff40d5378e0a945f0d7d1b0b4774'}]}, 'timestamp': '2025-10-14 10:14:49.946311', '_unique_id': '76444362723e4048b5f7600253094262'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.946 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.946 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.946 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.946 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.946 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.946 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.946 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.946 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.946 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.946 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.946 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.946 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.946 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.946 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.946 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.946 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.946 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.946 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.946 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.946 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:14:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.946 12 ERROR oslo_messaging.notify.messaging Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.946 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.946 12 ERROR oslo_messaging.notify.messaging Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.946 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.946 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.946 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.946 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.946 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.946 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.946 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.946 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", 
line 653, in _send Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.946 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.946 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.946 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.946 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.946 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.946 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.946 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.946 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.946 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.946 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.946 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.946 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.946 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.946 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.946 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.946 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.946 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.946 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.946 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.946 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:14:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.946 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.946 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:14:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:14:49.946 12 ERROR oslo_messaging.notify.messaging Oct 14 06:14:50 localhost nova_compute[297686]: 2025-10-14 10:14:50.227 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:14:50 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:14:50.278 271987 INFO neutron.agent.dhcp.agent [None req-a8523f09-a029-4570-b4f9-83d831f844b6 - - - - - -] DHCP configuration for ports {'a4ba8821-e1be-4e5c-87d2-5bc2468f4e8d'} is completed#033[00m Oct 14 06:14:51 localhost neutron_sriov_agent[264974]: 2025-10-14 10:14:51.479 2 INFO neutron.agent.securitygroups_rpc [None req-b7e36e4d-3b04-467d-a942-726901765f67 30647d4700b846dba79efd27fad03f3d a840994a70374548889747682f4c0fa3 - - default default] Security group member updated ['59283390-a499-4358-9f49-155fd8075ea9']#033[00m Oct 14 06:14:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041. Oct 14 06:14:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857. 
Oct 14 06:14:52 localhost podman[328969]: 2025-10-14 10:14:52.76647358 +0000 UTC m=+0.096488729 container health_status 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent) Oct 14 06:14:52 localhost podman[328969]: 2025-10-14 10:14:52.775007874 +0000 UTC 
m=+0.105023053 container exec_died 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_id=ovn_metadata_agent, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible) Oct 14 06:14:52 localhost systemd[1]: 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857.service: Deactivated successfully. 
Oct 14 06:14:52 localhost neutron_sriov_agent[264974]: 2025-10-14 10:14:52.851 2 INFO neutron.agent.securitygroups_rpc [None req-660da2bb-66db-4c5b-9cda-836400286a3e 30647d4700b846dba79efd27fad03f3d a840994a70374548889747682f4c0fa3 - - default default] Security group member updated ['59283390-a499-4358-9f49-155fd8075ea9']
Oct 14 06:14:52 localhost podman[328968]: 2025-10-14 10:14:52.864363582 +0000 UTC m=+0.195313540 container health_status 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi )
Oct 14 06:14:52 localhost podman[328968]: 2025-10-14 10:14:52.876092256 +0000 UTC m=+0.207042264 container exec_died 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi )
Oct 14 06:14:52 localhost systemd[1]: 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041.service: Deactivated successfully.
Oct 14 06:14:53 localhost nova_compute[297686]: 2025-10-14 10:14:53.088 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 06:14:54 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e130 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct 14 06:14:54 localhost neutron_sriov_agent[264974]: 2025-10-14 10:14:54.571 2 INFO neutron.agent.securitygroups_rpc [None req-567f1889-5571-45a9-97ac-31082d51ba82 30647d4700b846dba79efd27fad03f3d a840994a70374548889747682f4c0fa3 - - default default] Security group member updated ['59283390-a499-4358-9f49-155fd8075ea9']
Oct 14 06:14:54 localhost neutron_sriov_agent[264974]: 2025-10-14 10:14:54.817 2 INFO neutron.agent.securitygroups_rpc [None req-e928fd5e-95df-4c94-b4fd-7975f86e3b55 73c3910059834cd0998d3459c50cd69d 82fc7afce38344ffb7eda3bb0fabdb5b - - default default] Security group member updated ['10f25aec-a6f2-40dd-837d-8812e1c0ebb8']
Oct 14 06:14:55 localhost nova_compute[297686]: 2025-10-14 10:14:55.229 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 06:14:56 localhost neutron_sriov_agent[264974]: 2025-10-14 10:14:56.145 2 INFO neutron.agent.securitygroups_rpc [None req-0d57410a-0cad-4e48-94db-70e77579a117 73c3910059834cd0998d3459c50cd69d 82fc7afce38344ffb7eda3bb0fabdb5b - - default default] Security group member updated ['10f25aec-a6f2-40dd-837d-8812e1c0ebb8']
Oct 14 06:14:56 localhost neutron_sriov_agent[264974]: 2025-10-14 10:14:56.189 2 INFO neutron.agent.securitygroups_rpc [None req-1976456b-372a-4c63-9d45-b4047db9b826 30647d4700b846dba79efd27fad03f3d a840994a70374548889747682f4c0fa3 - - default default] Security group member updated ['59283390-a499-4358-9f49-155fd8075ea9']
Oct 14 06:14:57 localhost ovn_metadata_agent[163050]: 2025-10-14 10:14:57.783 163055 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 14 06:14:57 localhost ovn_metadata_agent[163050]: 2025-10-14 10:14:57.784 163055 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 14 06:14:57 localhost ovn_metadata_agent[163050]: 2025-10-14 10:14:57.785 163055 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 14 06:14:58 localhost nova_compute[297686]: 2025-10-14 10:14:58.112 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 06:14:58 localhost podman[248187]: time="2025-10-14T10:14:58Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 14 06:14:58 localhost podman[248187]: @ - - [14/Oct/2025:10:14:58 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 149321 "" "Go-http-client/1.1"
Oct 14 06:14:58 localhost podman[248187]: @ - - [14/Oct/2025:10:14:58 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 20327 "" "Go-http-client/1.1"
Oct 14 06:14:58 localhost neutron_sriov_agent[264974]: 2025-10-14 10:14:58.568 2 INFO neutron.agent.securitygroups_rpc [None req-02e008f2-a233-4874-8613-ded4911feb59 b11f5b75a52243ed86cd4fe28898caef eff4d352999d485c9bd9a3b3cbf0c569 - - default default] Security group member updated ['25c1f9f0-ea5d-4940-9d8c-34da45a09b5d']
Oct 14 06:14:58 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:14:58.929 271987 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-10-14T10:14:57Z, description=, device_id=6bbcb3d8-8bed-436a-b383-8ae8842802b2, device_owner=network:router_gateway, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=5d699238-9ac2-40a8-8b4d-dcc75e070714, ip_allocation=immediate, mac_address=fa:16:3e:4e:17:dc, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-10-14T08:36:38Z, description=, dns_domain=, id=c0145816-4627-44f2-af00-ccc9ef0436ed, ipv4_address_scope=None, ipv6_address_scope=None, is_default=False, l2_adjacency=True, mtu=1350, name=public, port_security_enabled=True, project_id=41187b090f3d4818a32baa37ce8a3991, provider:network_type=flat, provider:physical_network=datacentre, provider:segmentation_id=None, qos_policy_id=None, revision_number=2, router:external=True, shared=False, standard_attr_id=29, status=ACTIVE, subnets=['bbd40dc1-97d8-4ed3-9d4f-9b3af758b526'], tags=[], tenant_id=41187b090f3d4818a32baa37ce8a3991, updated_at=2025-10-14T08:36:44Z, vlan_transparent=None, network_id=c0145816-4627-44f2-af00-ccc9ef0436ed, port_security_enabled=False, project_id=, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1606, status=DOWN, tags=[], tenant_id=, updated_at=2025-10-14T10:14:57Z on network c0145816-4627-44f2-af00-ccc9ef0436ed
Oct 14 06:14:59 localhost dnsmasq[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/addn_hosts - 4 addresses
Oct 14 06:14:59 localhost dnsmasq-dhcp[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/host
Oct 14 06:14:59 localhost podman[329026]: 2025-10-14 10:14:59.126158218 +0000 UTC m=+0.059023549 container kill 27e23fba1f10c3118d508631d350793857acd97085c399f85eed727cd1eab7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0145816-4627-44f2-af00-ccc9ef0436ed, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 14 06:14:59 localhost dnsmasq-dhcp[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/opts
Oct 14 06:14:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad.
Oct 14 06:14:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec.
Oct 14 06:14:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29.
Oct 14 06:14:59 localhost podman[329039]: 2025-10-14 10:14:59.244970158 +0000 UTC m=+0.088768981 container health_status 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, config_id=multipathd, container_name=multipathd, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251009, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Oct 14 06:14:59 localhost podman[329039]: 2025-10-14 10:14:59.259041044 +0000 UTC m=+0.102839867 container exec_died 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_id=multipathd, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible)
Oct 14 06:14:59 localhost systemd[1]: 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad.service: Deactivated successfully.
Oct 14 06:14:59 localhost podman[329047]: 2025-10-14 10:14:59.301950722 +0000 UTC m=+0.135044273 container health_status fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, org.label-schema.vendor=CentOS, config_id=iscsid, container_name=iscsid, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3)
Oct 14 06:14:59 localhost podman[329047]: 2025-10-14 10:14:59.313137359 +0000 UTC m=+0.146230860 container exec_died fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, config_id=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, org.label-schema.vendor=CentOS)
Oct 14 06:14:59 localhost systemd[1]: fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29.service: Deactivated successfully.
Oct 14 06:14:59 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e130 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct 14 06:14:59 localhost systemd[1]: tmp-crun.ycIncM.mount: Deactivated successfully.
Oct 14 06:14:59 localhost podman[329040]: 2025-10-14 10:14:59.362710625 +0000 UTC m=+0.196946161 container health_status aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Oct 14 06:14:59 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:14:59.363 271987 INFO neutron.agent.dhcp.agent [None req-d444c6b6-a33b-4665-b91c-a148cd0e8797 - - - - - -] DHCP configuration for ports {'5d699238-9ac2-40a8-8b4d-dcc75e070714'} is completed
Oct 14 06:14:59 localhost podman[329040]: 2025-10-14 10:14:59.403123756 +0000 UTC m=+0.237359282 container exec_died aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Oct 14 06:14:59 localhost systemd[1]: aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec.service: Deactivated successfully.
Oct 14 06:14:59 localhost neutron_sriov_agent[264974]: 2025-10-14 10:14:59.452 2 INFO neutron.agent.securitygroups_rpc [None req-6190d315-fba4-49e4-99ef-d93a305a28b6 73c3910059834cd0998d3459c50cd69d 82fc7afce38344ffb7eda3bb0fabdb5b - - default default] Security group member updated ['10f25aec-a6f2-40dd-837d-8812e1c0ebb8']
Oct 14 06:15:00 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:15:00.185 271987 INFO neutron.agent.linux.ip_lib [None req-962eb2bd-68e5-4f6b-b92f-c44cdc5df703 - - - - - -] Device tapbb0ea87f-a2 cannot be used as it has no MAC address
Oct 14 06:15:00 localhost nova_compute[297686]: 2025-10-14 10:15:00.212 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 06:15:00 localhost kernel: device tapbb0ea87f-a2 entered promiscuous mode
Oct 14 06:15:00 localhost NetworkManager[5977]: [1760436900.2212] manager: (tapbb0ea87f-a2): new Generic device (/org/freedesktop/NetworkManager/Devices/29)
Oct 14 06:15:00 localhost ovn_controller[157396]: 2025-10-14T10:15:00Z|00147|binding|INFO|Claiming lport bb0ea87f-a209-4f63-9311-afd72effdbf4 for this chassis.
Oct 14 06:15:00 localhost ovn_controller[157396]: 2025-10-14T10:15:00Z|00148|binding|INFO|bb0ea87f-a209-4f63-9311-afd72effdbf4: Claiming unknown
Oct 14 06:15:00 localhost systemd-udevd[329116]: Network interface NamePolicy= disabled on kernel command line.
Oct 14 06:15:00 localhost nova_compute[297686]: 2025-10-14 10:15:00.227 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 06:15:00 localhost nova_compute[297686]: 2025-10-14 10:15:00.234 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 06:15:00 localhost ovn_metadata_agent[163050]: 2025-10-14 10:15:00.245 163055 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005486733.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpe1893814-19d7-5b97-af05-19e2aafa5382-6fe657c6-cd0c-4c3b-b5f6-71455139f6b0', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6fe657c6-cd0c-4c3b-b5f6-71455139f6b0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a840994a70374548889747682f4c0fa3', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f8e4911e-185b-4913-9de9-7434dfd2cae7, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=bb0ea87f-a209-4f63-9311-afd72effdbf4) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 14 06:15:00 localhost ovn_metadata_agent[163050]: 2025-10-14 10:15:00.247 163055 INFO neutron.agent.ovn.metadata.agent [-] Port bb0ea87f-a209-4f63-9311-afd72effdbf4 in datapath 6fe657c6-cd0c-4c3b-b5f6-71455139f6b0 bound to our chassis
Oct 14 06:15:00 localhost journal[237477]: ethtool ioctl error on tapbb0ea87f-a2: No such device
Oct 14 06:15:00 localhost ovn_metadata_agent[163050]: 2025-10-14 10:15:00.249 163055 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 6fe657c6-cd0c-4c3b-b5f6-71455139f6b0 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Oct 14 06:15:00 localhost ovn_metadata_agent[163050]: 2025-10-14 10:15:00.251 163159 DEBUG oslo.privsep.daemon [-] privsep: reply[4f0846a4-0b1c-4674-a0c3-294e66c4f0f6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 14 06:15:00 localhost journal[237477]: ethtool ioctl error on tapbb0ea87f-a2: No such device
Oct 14 06:15:00 localhost ovn_controller[157396]: 2025-10-14T10:15:00Z|00149|binding|INFO|Setting lport bb0ea87f-a209-4f63-9311-afd72effdbf4 ovn-installed in OVS
Oct 14 06:15:00 localhost ovn_controller[157396]: 2025-10-14T10:15:00Z|00150|binding|INFO|Setting lport bb0ea87f-a209-4f63-9311-afd72effdbf4 up in Southbound
Oct 14 06:15:00 localhost nova_compute[297686]: 2025-10-14 10:15:00.258 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 06:15:00 localhost journal[237477]: ethtool ioctl error on tapbb0ea87f-a2: No such device
Oct 14 06:15:00 localhost journal[237477]: ethtool ioctl error on tapbb0ea87f-a2: No such device
Oct 14 06:15:00 localhost journal[237477]: ethtool ioctl error on tapbb0ea87f-a2: No such device
Oct 14 06:15:00 localhost journal[237477]: ethtool ioctl error on tapbb0ea87f-a2: No such device
Oct 14 06:15:00 localhost journal[237477]: ethtool ioctl error on tapbb0ea87f-a2: No such device
Oct 14 06:15:00 localhost journal[237477]: ethtool ioctl error on tapbb0ea87f-a2: No such device
Oct 14 06:15:00 localhost nova_compute[297686]: 2025-10-14 10:15:00.299 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 06:15:00 localhost nova_compute[297686]: 2025-10-14 10:15:00.325 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 06:15:01 localhost podman[329187]:
Oct 14 06:15:01 localhost podman[329187]: 2025-10-14 10:15:01.066924919 +0000 UTC m=+0.078700829 container create 403cc1dee880c6c7654336636c6356f5bc4a6111b053c22dd7b1f39e87dcda18 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6fe657c6-cd0c-4c3b-b5f6-71455139f6b0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251009, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Oct 14 06:15:01 localhost systemd[1]: Started libpod-conmon-403cc1dee880c6c7654336636c6356f5bc4a6111b053c22dd7b1f39e87dcda18.scope.
Oct 14 06:15:01 localhost systemd[1]: Started libcrun container.
Oct 14 06:15:01 localhost podman[329187]: 2025-10-14 10:15:01.033523174 +0000 UTC m=+0.045299054 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Oct 14 06:15:01 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cf1c73937df2b64a9b269436c3c7027707915b7d487f8d6bf2c6f349b2191277/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct 14 06:15:01 localhost podman[329187]: 2025-10-14 10:15:01.144167522 +0000 UTC m=+0.155943442 container init 403cc1dee880c6c7654336636c6356f5bc4a6111b053c22dd7b1f39e87dcda18 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6fe657c6-cd0c-4c3b-b5f6-71455139f6b0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251009, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct 14 06:15:01 localhost podman[329187]: 2025-10-14 10:15:01.153919763 +0000 UTC m=+0.165695673 container start 403cc1dee880c6c7654336636c6356f5bc4a6111b053c22dd7b1f39e87dcda18 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6fe657c6-cd0c-4c3b-b5f6-71455139f6b0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251009, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true)
Oct 14 06:15:01 localhost dnsmasq[329204]: started, version 2.85 cachesize 150
Oct 14 06:15:01 localhost dnsmasq[329204]: DNS service limited to local subnets
Oct 14 06:15:01 localhost dnsmasq[329204]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Oct 14 06:15:01 localhost dnsmasq[329204]: warning: no upstream servers configured
Oct 14 06:15:01 localhost dnsmasq-dhcp[329204]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Oct 14 06:15:01 localhost dnsmasq[329204]: read /var/lib/neutron/dhcp/6fe657c6-cd0c-4c3b-b5f6-71455139f6b0/addn_hosts - 0 addresses
Oct 14 06:15:01 localhost dnsmasq-dhcp[329204]: read /var/lib/neutron/dhcp/6fe657c6-cd0c-4c3b-b5f6-71455139f6b0/host
Oct 14 06:15:01 localhost dnsmasq-dhcp[329204]: read /var/lib/neutron/dhcp/6fe657c6-cd0c-4c3b-b5f6-71455139f6b0/opts
Oct 14 06:15:01 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:15:01.348 271987 INFO neutron.agent.dhcp.agent [None req-26cb3820-6eb6-4332-ae01-9780e54d6a1b - - - - - -] DHCP configuration for ports {'35ebd5dd-acef-4403-863f-aafed124cfd9'} is completed
Oct 14 06:15:01 localhost dnsmasq[329204]: exiting on receipt of SIGTERM
Oct 14 06:15:01 localhost podman[329223]: 2025-10-14 10:15:01.590912808 +0000 UTC m=+0.067941035 container kill 403cc1dee880c6c7654336636c6356f5bc4a6111b053c22dd7b1f39e87dcda18 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6fe657c6-cd0c-4c3b-b5f6-71455139f6b0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2)
Oct 14 06:15:01 localhost systemd[1]: tmp-crun.aWp1BJ.mount: Deactivated successfully.
Oct 14 06:15:01 localhost systemd[1]: libpod-403cc1dee880c6c7654336636c6356f5bc4a6111b053c22dd7b1f39e87dcda18.scope: Deactivated successfully.
Oct 14 06:15:01 localhost podman[329235]: 2025-10-14 10:15:01.64230241 +0000 UTC m=+0.041985581 container died 403cc1dee880c6c7654336636c6356f5bc4a6111b053c22dd7b1f39e87dcda18 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6fe657c6-cd0c-4c3b-b5f6-71455139f6b0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5)
Oct 14 06:15:01 localhost podman[329235]: 2025-10-14 10:15:01.671655409 +0000 UTC m=+0.071338580 container cleanup 403cc1dee880c6c7654336636c6356f5bc4a6111b053c22dd7b1f39e87dcda18 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6fe657c6-cd0c-4c3b-b5f6-71455139f6b0, org.label-schema.build-date=20251009, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, io.buildah.version=1.41.3)
Oct 14 06:15:01 localhost systemd[1]: libpod-conmon-403cc1dee880c6c7654336636c6356f5bc4a6111b053c22dd7b1f39e87dcda18.scope: Deactivated successfully.
Oct 14 06:15:01 localhost neutron_sriov_agent[264974]: 2025-10-14 10:15:01.689 2 INFO neutron.agent.securitygroups_rpc [None req-da490536-bf82-49ef-911b-729ed0c16fa6 73c3910059834cd0998d3459c50cd69d 82fc7afce38344ffb7eda3bb0fabdb5b - - default default] Security group member updated ['10f25aec-a6f2-40dd-837d-8812e1c0ebb8']
Oct 14 06:15:01 localhost podman[329242]: 2025-10-14 10:15:01.742901916 +0000 UTC m=+0.128485760 container remove 403cc1dee880c6c7654336636c6356f5bc4a6111b053c22dd7b1f39e87dcda18 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6fe657c6-cd0c-4c3b-b5f6-71455139f6b0, tcib_managed=true, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS)
Oct 14 06:15:01 localhost neutron_sriov_agent[264974]: 2025-10-14 10:15:01.933 2 INFO neutron.agent.securitygroups_rpc [None req-b3b606fd-fef6-4f13-b26d-debe6e3790e2 30647d4700b846dba79efd27fad03f3d a840994a70374548889747682f4c0fa3 - - default default] Security group member updated ['59283390-a499-4358-9f49-155fd8075ea9']
Oct 14 06:15:02 localhost systemd[1]: var-lib-containers-storage-overlay-cf1c73937df2b64a9b269436c3c7027707915b7d487f8d6bf2c6f349b2191277-merged.mount: Deactivated successfully.
Oct 14 06:15:02 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-403cc1dee880c6c7654336636c6356f5bc4a6111b053c22dd7b1f39e87dcda18-userdata-shm.mount: Deactivated successfully.
Oct 14 06:15:02 localhost nova_compute[297686]: 2025-10-14 10:15:02.185 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 06:15:03 localhost nova_compute[297686]: 2025-10-14 10:15:03.144 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 06:15:03 localhost neutron_sriov_agent[264974]: 2025-10-14 10:15:03.273 2 INFO neutron.agent.securitygroups_rpc [None req-9387dae5-72b9-4186-9a6c-a7f3a1f00f20 476187b4066141bb9d0e00e94ed7295c 7bf1be3a6a454996a4414fad306906f1 - - default default] Security group member updated ['a0f73c72-581b-41a5-a47e-a3f1b6149df7']
Oct 14 06:15:03 localhost podman[329316]:
Oct 14 06:15:03 localhost podman[329316]: 2025-10-14 10:15:03.550153122 +0000 UTC m=+0.082207898 container create d5699aa08ab4d0628501dc3a60388b19f024879394e351481de2c274d0c2a8df (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6fe657c6-cd0c-4c3b-b5f6-71455139f6b0, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3)
Oct 14 06:15:03 localhost systemd[1]: Started libpod-conmon-d5699aa08ab4d0628501dc3a60388b19f024879394e351481de2c274d0c2a8df.scope.
Oct 14 06:15:03 localhost systemd[1]: Started libcrun container.
Oct 14 06:15:03 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0edac1287a8d71f4e279a7cb3db6fec57c57593c7754809a15ff553748226f01/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Oct 14 06:15:03 localhost podman[329316]: 2025-10-14 10:15:03.614283988 +0000 UTC m=+0.146338794 container init d5699aa08ab4d0628501dc3a60388b19f024879394e351481de2c274d0c2a8df (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6fe657c6-cd0c-4c3b-b5f6-71455139f6b0, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5) Oct 14 06:15:03 localhost podman[329316]: 2025-10-14 10:15:03.515290542 +0000 UTC m=+0.047345378 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Oct 14 06:15:03 localhost neutron_sriov_agent[264974]: 2025-10-14 10:15:03.620 2 INFO neutron.agent.securitygroups_rpc [None req-28321eb3-6e5c-4c40-82a9-856981146355 30647d4700b846dba79efd27fad03f3d a840994a70374548889747682f4c0fa3 - - default default] Security group member updated ['59283390-a499-4358-9f49-155fd8075ea9']#033[00m Oct 14 06:15:03 localhost podman[329316]: 2025-10-14 10:15:03.625595619 +0000 UTC m=+0.157650425 container start d5699aa08ab4d0628501dc3a60388b19f024879394e351481de2c274d0c2a8df (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6fe657c6-cd0c-4c3b-b5f6-71455139f6b0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, tcib_managed=true, org.label-schema.schema-version=1.0, 
org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS) Oct 14 06:15:03 localhost dnsmasq[329336]: started, version 2.85 cachesize 150 Oct 14 06:15:03 localhost dnsmasq[329336]: DNS service limited to local subnets Oct 14 06:15:03 localhost dnsmasq[329336]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Oct 14 06:15:03 localhost dnsmasq[329336]: warning: no upstream servers configured Oct 14 06:15:03 localhost dnsmasq-dhcp[329336]: DHCPv6, static leases only on 2001:db8:0:1::, lease time 1d Oct 14 06:15:03 localhost dnsmasq-dhcp[329336]: DHCPv6, static leases only on 2001:db8::, lease time 1d Oct 14 06:15:03 localhost dnsmasq[329336]: read /var/lib/neutron/dhcp/6fe657c6-cd0c-4c3b-b5f6-71455139f6b0/addn_hosts - 0 addresses Oct 14 06:15:03 localhost dnsmasq-dhcp[329336]: read /var/lib/neutron/dhcp/6fe657c6-cd0c-4c3b-b5f6-71455139f6b0/host Oct 14 06:15:03 localhost dnsmasq-dhcp[329336]: read /var/lib/neutron/dhcp/6fe657c6-cd0c-4c3b-b5f6-71455139f6b0/opts Oct 14 06:15:03 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:15:03.682 271987 INFO neutron.agent.dhcp.agent [None req-899129a5-bb80-4296-b300-9e70f40a22d2 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-10-14T10:15:01Z, description=, device_id=, device_owner=, dns_assignment=[, ], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[, ], id=7d94a934-8333-4570-bef0-18cceaa0bad0, ip_allocation=immediate, mac_address=fa:16:3e:19:64:40, name=tempest-PortsIpV6TestJSON-1477107513, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-10-14T10:14:56Z, description=, dns_domain=, id=6fe657c6-cd0c-4c3b-b5f6-71455139f6b0, ipv4_address_scope=None, ipv6_address_scope=None, 
l2_adjacency=True, mtu=1442, name=tempest-PortsIpV6TestJSON-905463279, port_security_enabled=True, project_id=a840994a70374548889747682f4c0fa3, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=14109, qos_policy_id=None, revision_number=3, router:external=False, shared=False, standard_attr_id=1596, status=ACTIVE, subnets=['1afb542b-77b6-4c73-86e9-34b124c227f3', '41a71247-d739-4a1e-88b3-b949087391d1'], tags=[], tenant_id=a840994a70374548889747682f4c0fa3, updated_at=2025-10-14T10:15:01Z, vlan_transparent=None, network_id=6fe657c6-cd0c-4c3b-b5f6-71455139f6b0, port_security_enabled=True, project_id=a840994a70374548889747682f4c0fa3, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['59283390-a499-4358-9f49-155fd8075ea9'], standard_attr_id=1621, status=DOWN, tags=[], tenant_id=a840994a70374548889747682f4c0fa3, updated_at=2025-10-14T10:15:01Z on network 6fe657c6-cd0c-4c3b-b5f6-71455139f6b0#033[00m Oct 14 06:15:03 localhost dnsmasq[329336]: read /var/lib/neutron/dhcp/6fe657c6-cd0c-4c3b-b5f6-71455139f6b0/addn_hosts - 2 addresses Oct 14 06:15:03 localhost podman[329355]: 2025-10-14 10:15:03.869276846 +0000 UTC m=+0.051829007 container kill d5699aa08ab4d0628501dc3a60388b19f024879394e351481de2c274d0c2a8df (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6fe657c6-cd0c-4c3b-b5f6-71455139f6b0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.build-date=20251009) Oct 14 06:15:03 localhost dnsmasq-dhcp[329336]: read /var/lib/neutron/dhcp/6fe657c6-cd0c-4c3b-b5f6-71455139f6b0/host Oct 14 06:15:03 localhost dnsmasq-dhcp[329336]: read 
/var/lib/neutron/dhcp/6fe657c6-cd0c-4c3b-b5f6-71455139f6b0/opts Oct 14 06:15:03 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:15:03.931 271987 INFO neutron.agent.dhcp.agent [None req-4726d476-8592-4539-824f-f272c118e7bd - - - - - -] DHCP configuration for ports {'35ebd5dd-acef-4403-863f-aafed124cfd9', 'bb0ea87f-a209-4f63-9311-afd72effdbf4'} is completed#033[00m Oct 14 06:15:04 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:15:04.023 271987 INFO neutron.agent.dhcp.agent [None req-899129a5-bb80-4296-b300-9e70f40a22d2 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-10-14T10:15:01Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=7d94a934-8333-4570-bef0-18cceaa0bad0, ip_allocation=immediate, mac_address=fa:16:3e:19:64:40, name=tempest-PortsIpV6TestJSON-1477107513, network_id=6fe657c6-cd0c-4c3b-b5f6-71455139f6b0, port_security_enabled=True, project_id=a840994a70374548889747682f4c0fa3, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=2, security_groups=['59283390-a499-4358-9f49-155fd8075ea9'], standard_attr_id=1621, status=DOWN, tags=[], tenant_id=a840994a70374548889747682f4c0fa3, updated_at=2025-10-14T10:15:03Z on network 6fe657c6-cd0c-4c3b-b5f6-71455139f6b0#033[00m Oct 14 06:15:04 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:15:04.114 271987 INFO neutron.agent.dhcp.agent [None req-bd01e8f4-a3f5-4252-99c6-4e2ff0916a6e - - - - - -] DHCP configuration for ports {'7d94a934-8333-4570-bef0-18cceaa0bad0'} is completed#033[00m Oct 14 06:15:04 localhost dnsmasq[329336]: read /var/lib/neutron/dhcp/6fe657c6-cd0c-4c3b-b5f6-71455139f6b0/addn_hosts - 1 addresses Oct 14 06:15:04 localhost podman[329393]: 2025-10-14 10:15:04.210107362 +0000 UTC m=+0.059043410 container kill 
d5699aa08ab4d0628501dc3a60388b19f024879394e351481de2c274d0c2a8df (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6fe657c6-cd0c-4c3b-b5f6-71455139f6b0, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS) Oct 14 06:15:04 localhost dnsmasq-dhcp[329336]: read /var/lib/neutron/dhcp/6fe657c6-cd0c-4c3b-b5f6-71455139f6b0/host Oct 14 06:15:04 localhost dnsmasq-dhcp[329336]: read /var/lib/neutron/dhcp/6fe657c6-cd0c-4c3b-b5f6-71455139f6b0/opts Oct 14 06:15:04 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e130 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Oct 14 06:15:04 localhost neutron_sriov_agent[264974]: 2025-10-14 10:15:04.373 2 INFO neutron.agent.securitygroups_rpc [None req-19708c1d-1742-44a2-b489-e8ac0413b03a b11f5b75a52243ed86cd4fe28898caef eff4d352999d485c9bd9a3b3cbf0c569 - - default default] Security group member updated ['25c1f9f0-ea5d-4940-9d8c-34da45a09b5d']#033[00m Oct 14 06:15:04 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:15:04.483 271987 INFO neutron.agent.dhcp.agent [None req-6f2e7b36-69be-46d4-bafe-e964319e7ffc - - - - - -] DHCP configuration for ports {'7d94a934-8333-4570-bef0-18cceaa0bad0'} is completed#033[00m Oct 14 06:15:04 localhost neutron_sriov_agent[264974]: 2025-10-14 10:15:04.689 2 INFO neutron.agent.securitygroups_rpc [None req-89ed6019-5c44-4ebe-ad04-5bc0e8cf21a8 30647d4700b846dba79efd27fad03f3d a840994a70374548889747682f4c0fa3 - - default default] Security group member updated ['59283390-a499-4358-9f49-155fd8075ea9']#033[00m Oct 14 06:15:04 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:15:04.745 271987 INFO 
neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-10-14T10:15:01Z, description=, device_id=, device_owner=, dns_assignment=[, ], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[, ], id=7d94a934-8333-4570-bef0-18cceaa0bad0, ip_allocation=immediate, mac_address=fa:16:3e:19:64:40, name=tempest-PortsIpV6TestJSON-1477107513, network_id=6fe657c6-cd0c-4c3b-b5f6-71455139f6b0, port_security_enabled=True, project_id=a840994a70374548889747682f4c0fa3, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=3, security_groups=['59283390-a499-4358-9f49-155fd8075ea9'], standard_attr_id=1621, status=DOWN, tags=[], tenant_id=a840994a70374548889747682f4c0fa3, updated_at=2025-10-14T10:15:04Z on network 6fe657c6-cd0c-4c3b-b5f6-71455139f6b0#033[00m Oct 14 06:15:04 localhost neutron_sriov_agent[264974]: 2025-10-14 10:15:04.783 2 INFO neutron.agent.securitygroups_rpc [None req-a01b87f3-ce06-413b-afd6-6cfe8241bf52 73c3910059834cd0998d3459c50cd69d 82fc7afce38344ffb7eda3bb0fabdb5b - - default default] Security group member updated ['10f25aec-a6f2-40dd-837d-8812e1c0ebb8']#033[00m Oct 14 06:15:04 localhost dnsmasq[329336]: read /var/lib/neutron/dhcp/6fe657c6-cd0c-4c3b-b5f6-71455139f6b0/addn_hosts - 2 addresses Oct 14 06:15:04 localhost dnsmasq-dhcp[329336]: read /var/lib/neutron/dhcp/6fe657c6-cd0c-4c3b-b5f6-71455139f6b0/host Oct 14 06:15:04 localhost dnsmasq-dhcp[329336]: read /var/lib/neutron/dhcp/6fe657c6-cd0c-4c3b-b5f6-71455139f6b0/opts Oct 14 06:15:04 localhost systemd[1]: tmp-crun.BnB6Z2.mount: Deactivated successfully. 
Oct 14 06:15:04 localhost podman[329432]: 2025-10-14 10:15:04.942846837 +0000 UTC m=+0.062360122 container kill d5699aa08ab4d0628501dc3a60388b19f024879394e351481de2c274d0c2a8df (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6fe657c6-cd0c-4c3b-b5f6-71455139f6b0, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251009) Oct 14 06:15:05 localhost nova_compute[297686]: 2025-10-14 10:15:05.234 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:15:05 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:15:05.239 271987 INFO neutron.agent.dhcp.agent [None req-8c512b11-029d-466a-8c7c-8b27f32ff98a - - - - - -] DHCP configuration for ports {'7d94a934-8333-4570-bef0-18cceaa0bad0'} is completed#033[00m Oct 14 06:15:06 localhost neutron_sriov_agent[264974]: 2025-10-14 10:15:06.087 2 INFO neutron.agent.securitygroups_rpc [None req-9eac4a9e-ddab-4fe7-9be2-7771638459ee 73c3910059834cd0998d3459c50cd69d 82fc7afce38344ffb7eda3bb0fabdb5b - - default default] Security group member updated ['10f25aec-a6f2-40dd-837d-8812e1c0ebb8']#033[00m Oct 14 06:15:06 localhost neutron_sriov_agent[264974]: 2025-10-14 10:15:06.125 2 INFO neutron.agent.securitygroups_rpc [None req-de093b06-e4ad-4bac-9ddf-80da81ca26d6 30647d4700b846dba79efd27fad03f3d a840994a70374548889747682f4c0fa3 - - default default] Security group member updated ['59283390-a499-4358-9f49-155fd8075ea9']#033[00m Oct 14 06:15:06 localhost dnsmasq[329336]: read /var/lib/neutron/dhcp/6fe657c6-cd0c-4c3b-b5f6-71455139f6b0/addn_hosts - 0 addresses Oct 14 06:15:06 localhost dnsmasq-dhcp[329336]: read 
/var/lib/neutron/dhcp/6fe657c6-cd0c-4c3b-b5f6-71455139f6b0/host Oct 14 06:15:06 localhost dnsmasq-dhcp[329336]: read /var/lib/neutron/dhcp/6fe657c6-cd0c-4c3b-b5f6-71455139f6b0/opts Oct 14 06:15:06 localhost podman[329469]: 2025-10-14 10:15:06.346770761 +0000 UTC m=+0.068928596 container kill d5699aa08ab4d0628501dc3a60388b19f024879394e351481de2c274d0c2a8df (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6fe657c6-cd0c-4c3b-b5f6-71455139f6b0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS) Oct 14 06:15:06 localhost systemd[1]: tmp-crun.jCJ7tY.mount: Deactivated successfully. Oct 14 06:15:07 localhost dnsmasq[329336]: exiting on receipt of SIGTERM Oct 14 06:15:07 localhost podman[329506]: 2025-10-14 10:15:07.761344604 +0000 UTC m=+0.061748203 container kill d5699aa08ab4d0628501dc3a60388b19f024879394e351481de2c274d0c2a8df (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6fe657c6-cd0c-4c3b-b5f6-71455139f6b0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3) Oct 14 06:15:07 localhost systemd[1]: libpod-d5699aa08ab4d0628501dc3a60388b19f024879394e351481de2c274d0c2a8df.scope: Deactivated successfully. 
Oct 14 06:15:07 localhost podman[329520]: 2025-10-14 10:15:07.812126107 +0000 UTC m=+0.037200193 container died d5699aa08ab4d0628501dc3a60388b19f024879394e351481de2c274d0c2a8df (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6fe657c6-cd0c-4c3b-b5f6-71455139f6b0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, io.buildah.version=1.41.3) Oct 14 06:15:07 localhost systemd[1]: tmp-crun.vo6otK.mount: Deactivated successfully. Oct 14 06:15:07 localhost podman[329520]: 2025-10-14 10:15:07.853185119 +0000 UTC m=+0.078259165 container cleanup d5699aa08ab4d0628501dc3a60388b19f024879394e351481de2c274d0c2a8df (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6fe657c6-cd0c-4c3b-b5f6-71455139f6b0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS) Oct 14 06:15:07 localhost systemd[1]: libpod-conmon-d5699aa08ab4d0628501dc3a60388b19f024879394e351481de2c274d0c2a8df.scope: Deactivated successfully. 
Oct 14 06:15:07 localhost podman[329521]: 2025-10-14 10:15:07.876426209 +0000 UTC m=+0.094198939 container remove d5699aa08ab4d0628501dc3a60388b19f024879394e351481de2c274d0c2a8df (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6fe657c6-cd0c-4c3b-b5f6-71455139f6b0, org.label-schema.build-date=20251009, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5) Oct 14 06:15:08 localhost nova_compute[297686]: 2025-10-14 10:15:08.217 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:15:08 localhost neutron_sriov_agent[264974]: 2025-10-14 10:15:08.655 2 INFO neutron.agent.securitygroups_rpc [None req-2ac4edd3-ef06-4603-af78-165f86524b6c b11f5b75a52243ed86cd4fe28898caef eff4d352999d485c9bd9a3b3cbf0c569 - - default default] Security group member updated ['25c1f9f0-ea5d-4940-9d8c-34da45a09b5d']#033[00m Oct 14 06:15:08 localhost podman[329599]: Oct 14 06:15:08 localhost podman[329599]: 2025-10-14 10:15:08.744090423 +0000 UTC m=+0.086211711 container create c6a1bcab99992661e0a41006683bc30fd93f3b7067b4afebaef004f328a7afa0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6fe657c6-cd0c-4c3b-b5f6-71455139f6b0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251009, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3) Oct 14 06:15:08 localhost neutron_sriov_agent[264974]: 2025-10-14 10:15:08.752 2 INFO 
neutron.agent.securitygroups_rpc [None req-6ff06281-5378-4836-8ecf-7135bb6770f8 476187b4066141bb9d0e00e94ed7295c 7bf1be3a6a454996a4414fad306906f1 - - default default] Security group member updated ['a0f73c72-581b-41a5-a47e-a3f1b6149df7']#033[00m Oct 14 06:15:08 localhost systemd[1]: var-lib-containers-storage-overlay-0edac1287a8d71f4e279a7cb3db6fec57c57593c7754809a15ff553748226f01-merged.mount: Deactivated successfully. Oct 14 06:15:08 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d5699aa08ab4d0628501dc3a60388b19f024879394e351481de2c274d0c2a8df-userdata-shm.mount: Deactivated successfully. Oct 14 06:15:08 localhost openstack_network_exporter[250374]: ERROR 10:15:08 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Oct 14 06:15:08 localhost openstack_network_exporter[250374]: ERROR 10:15:08 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Oct 14 06:15:08 localhost openstack_network_exporter[250374]: ERROR 10:15:08 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Oct 14 06:15:08 localhost openstack_network_exporter[250374]: ERROR 10:15:08 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Oct 14 06:15:08 localhost openstack_network_exporter[250374]: Oct 14 06:15:08 localhost openstack_network_exporter[250374]: ERROR 10:15:08 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Oct 14 06:15:08 localhost openstack_network_exporter[250374]: Oct 14 06:15:08 localhost systemd[1]: Started libpod-conmon-c6a1bcab99992661e0a41006683bc30fd93f3b7067b4afebaef004f328a7afa0.scope. Oct 14 06:15:08 localhost systemd[1]: tmp-crun.sxf1b2.mount: Deactivated successfully. 
Oct 14 06:15:08 localhost podman[329599]: 2025-10-14 10:15:08.70592414 +0000 UTC m=+0.048045428 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Oct 14 06:15:08 localhost systemd[1]: Started libcrun container. Oct 14 06:15:08 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2ac353d38321e642e73b64f3518e2250de7feb23b0c397a3f5f830eeb520e328/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Oct 14 06:15:08 localhost podman[329599]: 2025-10-14 10:15:08.826904617 +0000 UTC m=+0.169025905 container init c6a1bcab99992661e0a41006683bc30fd93f3b7067b4afebaef004f328a7afa0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6fe657c6-cd0c-4c3b-b5f6-71455139f6b0, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251009, tcib_managed=true, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5) Oct 14 06:15:08 localhost podman[329599]: 2025-10-14 10:15:08.837446204 +0000 UTC m=+0.179567492 container start c6a1bcab99992661e0a41006683bc30fd93f3b7067b4afebaef004f328a7afa0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6fe657c6-cd0c-4c3b-b5f6-71455139f6b0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251009) Oct 14 06:15:08 localhost dnsmasq[329618]: started, version 2.85 cachesize 150 Oct 14 06:15:08 localhost dnsmasq[329618]: DNS service limited to local subnets Oct 14 06:15:08 localhost 
dnsmasq[329618]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Oct 14 06:15:08 localhost dnsmasq[329618]: warning: no upstream servers configured Oct 14 06:15:08 localhost dnsmasq-dhcp[329618]: DHCPv6, static leases only on 2001:db8::, lease time 1d Oct 14 06:15:08 localhost dnsmasq[329618]: read /var/lib/neutron/dhcp/6fe657c6-cd0c-4c3b-b5f6-71455139f6b0/addn_hosts - 0 addresses Oct 14 06:15:08 localhost dnsmasq-dhcp[329618]: read /var/lib/neutron/dhcp/6fe657c6-cd0c-4c3b-b5f6-71455139f6b0/host Oct 14 06:15:08 localhost dnsmasq-dhcp[329618]: read /var/lib/neutron/dhcp/6fe657c6-cd0c-4c3b-b5f6-71455139f6b0/opts Oct 14 06:15:08 localhost ovn_controller[157396]: 2025-10-14T10:15:08Z|00151|binding|INFO|Removing iface tapbb0ea87f-a2 ovn-installed in OVS Oct 14 06:15:08 localhost ovn_metadata_agent[163050]: 2025-10-14 10:15:08.853 163055 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port 33457956-f757-4524-b331-924c5414cf96 with type ""#033[00m Oct 14 06:15:08 localhost ovn_controller[157396]: 2025-10-14T10:15:08Z|00152|binding|INFO|Removing lport bb0ea87f-a209-4f63-9311-afd72effdbf4 ovn-installed in OVS Oct 14 06:15:08 localhost ovn_metadata_agent[163050]: 2025-10-14 10:15:08.855 163055 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005486733.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:1::2/64 2001:db8::2/64', 'neutron:device_id': 'dhcpe1893814-19d7-5b97-af05-19e2aafa5382-6fe657c6-cd0c-4c3b-b5f6-71455139f6b0', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 
'neutron:network_name': 'neutron-6fe657c6-cd0c-4c3b-b5f6-71455139f6b0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a840994a70374548889747682f4c0fa3', 'neutron:revision_number': '4', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005486733.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f8e4911e-185b-4913-9de9-7434dfd2cae7, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=bb0ea87f-a209-4f63-9311-afd72effdbf4) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Oct 14 06:15:08 localhost nova_compute[297686]: 2025-10-14 10:15:08.857 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:15:08 localhost ovn_metadata_agent[163050]: 2025-10-14 10:15:08.858 163055 INFO neutron.agent.ovn.metadata.agent [-] Port bb0ea87f-a209-4f63-9311-afd72effdbf4 in datapath 6fe657c6-cd0c-4c3b-b5f6-71455139f6b0 unbound from our chassis#033[00m Oct 14 06:15:08 localhost ovn_metadata_agent[163050]: 2025-10-14 10:15:08.860 163055 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 6fe657c6-cd0c-4c3b-b5f6-71455139f6b0 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Oct 14 06:15:08 localhost ovn_metadata_agent[163050]: 2025-10-14 10:15:08.862 163159 DEBUG oslo.privsep.daemon [-] privsep: reply[d11f199e-27d5-4950-9653-905dd8e82e18]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Oct 14 06:15:08 localhost nova_compute[297686]: 2025-10-14 10:15:08.863 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:15:09 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:15:08.998 271987 INFO neutron.agent.dhcp.agent [None req-b20e4107-13d3-436d-b66f-29a0936757b7 - - - - - -] DHCP configuration for ports {'35ebd5dd-acef-4403-863f-aafed124cfd9', 'bb0ea87f-a209-4f63-9311-afd72effdbf4'} is completed#033[00m Oct 14 06:15:09 localhost dnsmasq[329618]: exiting on receipt of SIGTERM Oct 14 06:15:09 localhost podman[329634]: 2025-10-14 10:15:09.155892178 +0000 UTC m=+0.064579281 container kill c6a1bcab99992661e0a41006683bc30fd93f3b7067b4afebaef004f328a7afa0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6fe657c6-cd0c-4c3b-b5f6-71455139f6b0, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true) Oct 14 06:15:09 localhost systemd[1]: libpod-c6a1bcab99992661e0a41006683bc30fd93f3b7067b4afebaef004f328a7afa0.scope: Deactivated successfully. 
Oct 14 06:15:09 localhost podman[329650]: 2025-10-14 10:15:09.237142384 +0000 UTC m=+0.055324625 container died c6a1bcab99992661e0a41006683bc30fd93f3b7067b4afebaef004f328a7afa0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6fe657c6-cd0c-4c3b-b5f6-71455139f6b0, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2) Oct 14 06:15:09 localhost podman[329650]: 2025-10-14 10:15:09.279359572 +0000 UTC m=+0.097541793 container remove c6a1bcab99992661e0a41006683bc30fd93f3b7067b4afebaef004f328a7afa0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6fe657c6-cd0c-4c3b-b5f6-71455139f6b0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2) Oct 14 06:15:09 localhost systemd[1]: libpod-conmon-c6a1bcab99992661e0a41006683bc30fd93f3b7067b4afebaef004f328a7afa0.scope: Deactivated successfully. 
Oct 14 06:15:09 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e130 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Oct 14 06:15:09 localhost kernel: device tapbb0ea87f-a2 left promiscuous mode Oct 14 06:15:09 localhost nova_compute[297686]: 2025-10-14 10:15:09.362 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:15:09 localhost nova_compute[297686]: 2025-10-14 10:15:09.387 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:15:09 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:15:09.404 271987 INFO neutron.agent.dhcp.agent [None req-edbc6225-b92a-47e7-9aa8-e428770f8d2a - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Oct 14 06:15:09 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:15:09.608 271987 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Oct 14 06:15:09 localhost neutron_sriov_agent[264974]: 2025-10-14 10:15:09.615 2 INFO neutron.agent.securitygroups_rpc [None req-33756dd7-b8a3-4f2a-bb6b-acf9d13d094f 73c3910059834cd0998d3459c50cd69d 82fc7afce38344ffb7eda3bb0fabdb5b - - default default] Security group member updated ['10f25aec-a6f2-40dd-837d-8812e1c0ebb8']#033[00m Oct 14 06:15:09 localhost systemd[1]: var-lib-containers-storage-overlay-2ac353d38321e642e73b64f3518e2250de7feb23b0c397a3f5f830eeb520e328-merged.mount: Deactivated successfully. Oct 14 06:15:09 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c6a1bcab99992661e0a41006683bc30fd93f3b7067b4afebaef004f328a7afa0-userdata-shm.mount: Deactivated successfully. Oct 14 06:15:09 localhost systemd[1]: run-netns-qdhcp\x2d6fe657c6\x2dcd0c\x2d4c3b\x2db5f6\x2d71455139f6b0.mount: Deactivated successfully. 
Oct 14 06:15:10 localhost nova_compute[297686]: 2025-10-14 10:15:10.237 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:15:10 localhost ovn_controller[157396]: 2025-10-14T10:15:10Z|00153|binding|INFO|Releasing lport 25c6586a-239c-451b-aac2-e0a3ee5c3145 from this chassis (sb_readonly=0) Oct 14 06:15:10 localhost nova_compute[297686]: 2025-10-14 10:15:10.484 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:15:10 localhost nova_compute[297686]: 2025-10-14 10:15:10.927 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:15:11 localhost neutron_sriov_agent[264974]: 2025-10-14 10:15:11.029 2 INFO neutron.agent.securitygroups_rpc [None req-1db2801f-7f14-475b-9034-d870e9398832 73c3910059834cd0998d3459c50cd69d 82fc7afce38344ffb7eda3bb0fabdb5b - - default default] Security group member updated ['10f25aec-a6f2-40dd-837d-8812e1c0ebb8']#033[00m Oct 14 06:15:11 localhost neutron_sriov_agent[264974]: 2025-10-14 10:15:11.613 2 INFO neutron.agent.securitygroups_rpc [None req-b3891885-72c1-4c57-baf6-f44e7a046970 b11f5b75a52243ed86cd4fe28898caef eff4d352999d485c9bd9a3b3cbf0c569 - - default default] Security group member updated ['25c1f9f0-ea5d-4940-9d8c-34da45a09b5d']#033[00m Oct 14 06:15:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f. Oct 14 06:15:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917. Oct 14 06:15:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d. 
Oct 14 06:15:11 localhost podman[329679]: 2025-10-14 10:15:11.744592918 +0000 UTC m=+0.085715907 container health_status 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, container_name=ovn_controller, managed_by=edpm_ansible)
Oct 14 06:15:11 localhost podman[329679]: 2025-10-14 10:15:11.827232707 +0000 UTC m=+0.168355746 container exec_died 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5)
Oct 14 06:15:11 localhost systemd[1]: tmp-crun.fLaHJ0.mount: Deactivated successfully.
Oct 14 06:15:11 localhost systemd[1]: 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f.service: Deactivated successfully.
Oct 14 06:15:11 localhost podman[329681]: 2025-10-14 10:15:11.84057687 +0000 UTC m=+0.173208066 container health_status 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_id=edpm, managed_by=edpm_ansible)
Oct 14 06:15:11 localhost podman[329681]: 2025-10-14 10:15:11.850000803 +0000 UTC m=+0.182631989 container exec_died 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, tcib_managed=true, container_name=ceilometer_agent_compute)
Oct 14 06:15:11 localhost systemd[1]: 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d.service: Deactivated successfully.
Oct 14 06:15:11 localhost podman[329680]: 2025-10-14 10:15:11.890868898 +0000 UTC m=+0.229421307 container health_status 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, version=9.6, io.buildah.version=1.33.7, release=1755695350, architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, io.openshift.expose-services=, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Oct 14 06:15:11 localhost podman[329680]: 2025-10-14 10:15:11.92352486 +0000 UTC m=+0.262077109 container exec_died 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, managed_by=edpm_ansible, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, config_id=edpm, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, architecture=x86_64, io.openshift.expose-services=)
Oct 14 06:15:11 localhost systemd[1]: 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917.service: Deactivated successfully.
Oct 14 06:15:13 localhost ovn_metadata_agent[163050]: 2025-10-14 10:15:13.006 163055 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=14, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'b6:6b:50', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '6a:59:81:01:bc:8b'}, ipsec=False) old=SB_Global(nb_cfg=13) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 14 06:15:13 localhost ovn_metadata_agent[163050]: 2025-10-14 10:15:13.007 163055 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Oct 14 06:15:13 localhost nova_compute[297686]: 2025-10-14 10:15:13.050 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 06:15:13 localhost nova_compute[297686]: 2025-10-14 10:15:13.219 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 06:15:13 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e131 e131: 6 total, 6 up, 6 in
Oct 14 06:15:13 localhost neutron_sriov_agent[264974]: 2025-10-14 10:15:13.433 2 INFO neutron.agent.securitygroups_rpc [None req-c2f5f097-bba7-437f-a590-36885716465b 73c3910059834cd0998d3459c50cd69d 82fc7afce38344ffb7eda3bb0fabdb5b - - default default] Security group member updated ['10f25aec-a6f2-40dd-837d-8812e1c0ebb8']
Oct 14 06:15:13 localhost neutron_sriov_agent[264974]: 2025-10-14 10:15:13.806 2 INFO neutron.agent.securitygroups_rpc [None req-cd2f7380-4784-4a84-ad74-fc6a911c5024 30647d4700b846dba79efd27fad03f3d a840994a70374548889747682f4c0fa3 - - default default] Security group member updated ['59283390-a499-4358-9f49-155fd8075ea9']
Oct 14 06:15:14 localhost neutron_sriov_agent[264974]: 2025-10-14 10:15:14.078 2 INFO neutron.agent.securitygroups_rpc [None req-434a8d80-a33b-48b3-84be-9de6425cb325 73c3910059834cd0998d3459c50cd69d 82fc7afce38344ffb7eda3bb0fabdb5b - - default default] Security group member updated ['10f25aec-a6f2-40dd-837d-8812e1c0ebb8']
Oct 14 06:15:14 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e131 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct 14 06:15:14 localhost neutron_sriov_agent[264974]: 2025-10-14 10:15:14.363 2 INFO neutron.agent.securitygroups_rpc [None req-204ba6d8-4885-47e0-aefa-27975646d087 30647d4700b846dba79efd27fad03f3d a840994a70374548889747682f4c0fa3 - - default default] Security group member updated ['59283390-a499-4358-9f49-155fd8075ea9']
Oct 14 06:15:14 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e132 e132: 6 total, 6 up, 6 in
Oct 14 06:15:15 localhost neutron_sriov_agent[264974]: 2025-10-14 10:15:15.235 2 INFO neutron.agent.securitygroups_rpc [None req-091340ae-b286-4614-8179-3bb6eeb30cfa 30647d4700b846dba79efd27fad03f3d a840994a70374548889747682f4c0fa3 - - default default] Security group member updated ['59283390-a499-4358-9f49-155fd8075ea9']
Oct 14 06:15:15 localhost nova_compute[297686]: 2025-10-14 10:15:15.239 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 06:15:15 localhost neutron_sriov_agent[264974]: 2025-10-14 10:15:15.854 2 INFO neutron.agent.securitygroups_rpc [None req-22ce8ae9-860b-42ef-93b2-044fcbfa7c60 73c3910059834cd0998d3459c50cd69d 82fc7afce38344ffb7eda3bb0fabdb5b - - default default] Security group member updated ['10f25aec-a6f2-40dd-837d-8812e1c0ebb8']
Oct 14 06:15:15 localhost neutron_sriov_agent[264974]: 2025-10-14 10:15:15.909 2 INFO neutron.agent.securitygroups_rpc [None req-c06e5ff4-6a19-4ed9-a361-5ee68eed5c1d 30647d4700b846dba79efd27fad03f3d a840994a70374548889747682f4c0fa3 - - default default] Security group member updated ['59283390-a499-4358-9f49-155fd8075ea9']
Oct 14 06:15:16 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Oct 14 06:15:16 localhost ceph-mon[317114]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1762252878' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Oct 14 06:15:16 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Oct 14 06:15:16 localhost ceph-mon[317114]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1762252878' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Oct 14 06:15:16 localhost neutron_sriov_agent[264974]: 2025-10-14 10:15:16.905 2 INFO neutron.agent.securitygroups_rpc [None req-a4deacba-ad68-44d1-8932-5e1de4d92966 73c3910059834cd0998d3459c50cd69d 82fc7afce38344ffb7eda3bb0fabdb5b - - default default] Security group member updated ['10f25aec-a6f2-40dd-837d-8812e1c0ebb8']
Oct 14 06:15:18 localhost nova_compute[297686]: 2025-10-14 10:15:18.263 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 06:15:19 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Oct 14 06:15:19 localhost ceph-mon[317114]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3769006567' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Oct 14 06:15:19 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Oct 14 06:15:19 localhost ceph-mon[317114]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3769006567' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Oct 14 06:15:19 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct 14 06:15:19 localhost neutron_sriov_agent[264974]: 2025-10-14 10:15:19.606 2 INFO neutron.agent.securitygroups_rpc [None req-2e0a1545-8dce-4395-9eda-96b0915ada53 73c3910059834cd0998d3459c50cd69d 82fc7afce38344ffb7eda3bb0fabdb5b - - default default] Security group member updated ['10f25aec-a6f2-40dd-837d-8812e1c0ebb8']
Oct 14 06:15:19 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:15:19.668 271987 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-10-14T10:15:19Z, description=, device_id=c7ca6471-9d82-498f-a2bf-8ebb0001668a, device_owner=network:router_gateway, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=49e64c4a-dce7-4637-8797-621976810eab, ip_allocation=immediate, mac_address=fa:16:3e:2a:48:8c, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-10-14T08:36:38Z, description=, dns_domain=, id=c0145816-4627-44f2-af00-ccc9ef0436ed, ipv4_address_scope=None, ipv6_address_scope=None, is_default=False, l2_adjacency=True, mtu=1350, name=public, port_security_enabled=True, project_id=41187b090f3d4818a32baa37ce8a3991, provider:network_type=flat, provider:physical_network=datacentre, provider:segmentation_id=None, qos_policy_id=None, revision_number=2, router:external=True, shared=False, standard_attr_id=29, status=ACTIVE, subnets=['bbd40dc1-97d8-4ed3-9d4f-9b3af758b526'], tags=[], tenant_id=41187b090f3d4818a32baa37ce8a3991, updated_at=2025-10-14T08:36:44Z, vlan_transparent=None, network_id=c0145816-4627-44f2-af00-ccc9ef0436ed, port_security_enabled=False, project_id=, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1717, status=DOWN, tags=[], tenant_id=, updated_at=2025-10-14T10:15:19Z on network c0145816-4627-44f2-af00-ccc9ef0436ed
Oct 14 06:15:19 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e133 e133: 6 total, 6 up, 6 in
Oct 14 06:15:19 localhost dnsmasq[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/addn_hosts - 5 addresses
Oct 14 06:15:19 localhost dnsmasq-dhcp[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/host
Oct 14 06:15:19 localhost dnsmasq-dhcp[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/opts
Oct 14 06:15:19 localhost podman[329761]: 2025-10-14 10:15:19.868266962 +0000 UTC m=+0.053624592 container kill 27e23fba1f10c3118d508631d350793857acd97085c399f85eed727cd1eab7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0145816-4627-44f2-af00-ccc9ef0436ed, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3)
Oct 14 06:15:20 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:15:20.195 271987 INFO neutron.agent.dhcp.agent [None req-878bd311-c5ce-4be8-b6fd-8ef354d5f366 - - - - - -] DHCP configuration for ports {'49e64c4a-dce7-4637-8797-621976810eab'} is completed
Oct 14 06:15:20 localhost nova_compute[297686]: 2025-10-14 10:15:20.241 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 06:15:20 localhost neutron_sriov_agent[264974]: 2025-10-14 10:15:20.881 2 INFO neutron.agent.securitygroups_rpc [None req-36685efb-a3c2-46cc-8647-4994e9dc66ed 73c3910059834cd0998d3459c50cd69d 82fc7afce38344ffb7eda3bb0fabdb5b - - default default] Security group member updated ['10f25aec-a6f2-40dd-837d-8812e1c0ebb8']
Oct 14 06:15:22 localhost ovn_metadata_agent[163050]: 2025-10-14 10:15:22.008 163055 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=9e4b0f79-1220-4c7d-a18d-fa1a88dab362, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '14'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 14 06:15:22 localhost neutron_sriov_agent[264974]: 2025-10-14 10:15:22.049 2 INFO neutron.agent.securitygroups_rpc [None req-c443f879-fd39-41c0-b2ca-b90cd4f8dede 30647d4700b846dba79efd27fad03f3d a840994a70374548889747682f4c0fa3 - - default default] Security group member updated ['59283390-a499-4358-9f49-155fd8075ea9']
Oct 14 06:15:23 localhost nova_compute[297686]: 2025-10-14 10:15:23.317 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 06:15:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041.
Oct 14 06:15:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857.
Oct 14 06:15:23 localhost systemd[1]: tmp-crun.D4j8ll.mount: Deactivated successfully.
Oct 14 06:15:23 localhost podman[329799]: 2025-10-14 10:15:23.735349276 +0000 UTC m=+0.100904526 container health_status 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct 14 06:15:23 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:15:23.742 271987 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-10-14T10:15:22Z, description=, device_id=f5fa89e3-0e1b-479a-8d14-1f17eeb1b2e7, device_owner=network:router_gateway, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=352cfb2a-d847-489b-bb0f-1cd26704bf96, ip_allocation=immediate, mac_address=fa:16:3e:42:b1:24, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-10-14T08:36:38Z, description=, dns_domain=, id=c0145816-4627-44f2-af00-ccc9ef0436ed, ipv4_address_scope=None, ipv6_address_scope=None, is_default=False, l2_adjacency=True, mtu=1350, name=public, port_security_enabled=True, project_id=41187b090f3d4818a32baa37ce8a3991, provider:network_type=flat, provider:physical_network=datacentre, provider:segmentation_id=None, qos_policy_id=None, revision_number=2, router:external=True, shared=False, standard_attr_id=29, status=ACTIVE, subnets=['bbd40dc1-97d8-4ed3-9d4f-9b3af758b526'], tags=[], tenant_id=41187b090f3d4818a32baa37ce8a3991, updated_at=2025-10-14T08:36:44Z, vlan_transparent=None, network_id=c0145816-4627-44f2-af00-ccc9ef0436ed, port_security_enabled=False, project_id=, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1741, status=DOWN, tags=[], tenant_id=, updated_at=2025-10-14T10:15:23Z on network c0145816-4627-44f2-af00-ccc9ef0436ed
Oct 14 06:15:23 localhost podman[329799]: 2025-10-14 10:15:23.752347643 +0000 UTC m=+0.117902923 container exec_died 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct 14 06:15:23 localhost systemd[1]: 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041.service: Deactivated successfully.
Oct 14 06:15:23 localhost neutron_sriov_agent[264974]: 2025-10-14 10:15:23.808 2 INFO neutron.agent.securitygroups_rpc [None req-4115f1a8-5df7-4a09-a802-bec7f8a3f06c b11f5b75a52243ed86cd4fe28898caef eff4d352999d485c9bd9a3b3cbf0c569 - - default default] Security group member updated ['25c1f9f0-ea5d-4940-9d8c-34da45a09b5d']
Oct 14 06:15:23 localhost podman[329800]: 2025-10-14 10:15:23.824359093 +0000 UTC m=+0.189456019 container health_status 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Oct 14 06:15:23 localhost podman[329800]: 2025-10-14 10:15:23.908015085 +0000 UTC m=+0.273112031 container exec_died 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251009, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS)
Oct 14 06:15:23 localhost systemd[1]: 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857.service: Deactivated successfully.
Oct 14 06:15:23 localhost dnsmasq[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/addn_hosts - 6 addresses
Oct 14 06:15:23 localhost dnsmasq-dhcp[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/host
Oct 14 06:15:23 localhost dnsmasq-dhcp[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/opts
Oct 14 06:15:23 localhost podman[329876]: 2025-10-14 10:15:23.973821662 +0000 UTC m=+0.095442157 container kill 27e23fba1f10c3118d508631d350793857acd97085c399f85eed727cd1eab7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0145816-4627-44f2-af00-ccc9ef0436ed, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Oct 14 06:15:24 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:15:24.204 271987 INFO neutron.agent.dhcp.agent [None req-a3d29d03-d22d-4113-a2d3-a91dacb07da5 - - - - - -] DHCP configuration for ports {'352cfb2a-d847-489b-bb0f-1cd26704bf96'} is completed
Oct 14 06:15:24 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e133 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct 14 06:15:24 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Oct 14 06:15:24 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz'
Oct 14 06:15:24 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:15:24.567 271987 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-10-14T10:15:23Z, description=, device_id=e4087971-46b9-47a9-bed6-ec82f44e073a, device_owner=network:router_gateway, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=4ad96f14-eaf9-442d-bc31-bc3cb6fa4cc8, ip_allocation=immediate, mac_address=fa:16:3e:52:33:cf, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-10-14T08:36:38Z, description=, dns_domain=, id=c0145816-4627-44f2-af00-ccc9ef0436ed, ipv4_address_scope=None, ipv6_address_scope=None, is_default=False, l2_adjacency=True, mtu=1350, name=public, port_security_enabled=True, project_id=41187b090f3d4818a32baa37ce8a3991, provider:network_type=flat, provider:physical_network=datacentre, provider:segmentation_id=None, qos_policy_id=None, revision_number=2, router:external=True, shared=False, standard_attr_id=29, status=ACTIVE, subnets=['bbd40dc1-97d8-4ed3-9d4f-9b3af758b526'], tags=[], tenant_id=41187b090f3d4818a32baa37ce8a3991, updated_at=2025-10-14T08:36:44Z, vlan_transparent=None, network_id=c0145816-4627-44f2-af00-ccc9ef0436ed, port_security_enabled=False, project_id=, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1745, status=DOWN, tags=[], tenant_id=, updated_at=2025-10-14T10:15:24Z on network c0145816-4627-44f2-af00-ccc9ef0436ed
Oct 14 06:15:24 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Oct 14 06:15:24 localhost ceph-mon[317114]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/4110961416' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Oct 14 06:15:24 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Oct 14 06:15:24 localhost ceph-mon[317114]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/4110961416' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Oct 14 06:15:24 localhost dnsmasq[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/addn_hosts - 7 addresses Oct 14 06:15:24 localhost dnsmasq-dhcp[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/host Oct 14 06:15:24 localhost dnsmasq-dhcp[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/opts Oct 14 06:15:24 localhost podman[329949]: 2025-10-14 10:15:24.801138537 +0000 UTC m=+0.059638508 container kill 27e23fba1f10c3118d508631d350793857acd97085c399f85eed727cd1eab7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0145816-4627-44f2-af00-ccc9ef0436ed, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image) Oct 14 06:15:25 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:15:25.091 271987 INFO neutron.agent.dhcp.agent [None req-ca60d172-b04d-47f4-acd7-d273186a5db7 - - - - - -] DHCP configuration for ports {'4ad96f14-eaf9-442d-bc31-bc3cb6fa4cc8'} 
is completed#033[00m Oct 14 06:15:25 localhost nova_compute[297686]: 2025-10-14 10:15:25.243 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:15:25 localhost nova_compute[297686]: 2025-10-14 10:15:25.255 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:15:25 localhost neutron_sriov_agent[264974]: 2025-10-14 10:15:25.320 2 INFO neutron.agent.securitygroups_rpc [None req-0c22be32-228c-44ef-8a0f-96c9a8ac58ba b11f5b75a52243ed86cd4fe28898caef eff4d352999d485c9bd9a3b3cbf0c569 - - default default] Security group member updated ['25c1f9f0-ea5d-4940-9d8c-34da45a09b5d']#033[00m Oct 14 06:15:27 localhost nova_compute[297686]: 2025-10-14 10:15:27.251 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:15:27 localhost nova_compute[297686]: 2025-10-14 10:15:27.255 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:15:27 localhost nova_compute[297686]: 2025-10-14 10:15:27.255 2 DEBUG nova.compute.manager [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Oct 14 06:15:27 localhost nova_compute[297686]: 2025-10-14 10:15:27.255 2 DEBUG nova.compute.manager [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Rebuilding the 
list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Oct 14 06:15:27 localhost nova_compute[297686]: 2025-10-14 10:15:27.417 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Acquiring lock "refresh_cache-88c4e366-b765-47a6-96bf-f7677f2ce67c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Oct 14 06:15:27 localhost nova_compute[297686]: 2025-10-14 10:15:27.417 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Acquired lock "refresh_cache-88c4e366-b765-47a6-96bf-f7677f2ce67c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Oct 14 06:15:27 localhost nova_compute[297686]: 2025-10-14 10:15:27.417 2 DEBUG nova.network.neutron [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] [instance: 88c4e366-b765-47a6-96bf-f7677f2ce67c] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Oct 14 06:15:27 localhost nova_compute[297686]: 2025-10-14 10:15:27.418 2 DEBUG nova.objects.instance [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Lazy-loading 'info_cache' on Instance uuid 88c4e366-b765-47a6-96bf-f7677f2ce67c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Oct 14 06:15:28 localhost podman[248187]: time="2025-10-14T10:15:28Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Oct 14 06:15:28 localhost neutron_sriov_agent[264974]: 2025-10-14 10:15:28.327 2 INFO neutron.agent.securitygroups_rpc [None req-faa6c76b-98b5-4cd2-b87b-72ca8f02394e 30647d4700b846dba79efd27fad03f3d a840994a70374548889747682f4c0fa3 - - default default] Security group member updated ['59283390-a499-4358-9f49-155fd8075ea9']#033[00m Oct 14 06:15:28 localhost podman[248187]: @ - - [14/Oct/2025:10:15:28 +0000] "GET 
/v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 149321 "" "Go-http-client/1.1" Oct 14 06:15:28 localhost nova_compute[297686]: 2025-10-14 10:15:28.350 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:15:28 localhost podman[248187]: @ - - [14/Oct/2025:10:15:28 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 20336 "" "Go-http-client/1.1" Oct 14 06:15:28 localhost dnsmasq[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/addn_hosts - 6 addresses Oct 14 06:15:28 localhost podman[329995]: 2025-10-14 10:15:28.683840645 +0000 UTC m=+0.060780954 container kill 27e23fba1f10c3118d508631d350793857acd97085c399f85eed727cd1eab7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0145816-4627-44f2-af00-ccc9ef0436ed, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0) Oct 14 06:15:28 localhost dnsmasq-dhcp[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/host Oct 14 06:15:28 localhost dnsmasq-dhcp[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/opts Oct 14 06:15:28 localhost nova_compute[297686]: 2025-10-14 10:15:28.869 2 DEBUG nova.network.neutron [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] [instance: 88c4e366-b765-47a6-96bf-f7677f2ce67c] Updating instance_info_cache with network_info: [{"id": "3ec9b060-f43d-4698-9c76-6062c70911d5", "address": "fa:16:3e:84:5e:e5", "network": {"id": "7d0cd696-bdd7-4e70-9512-eb0d23640314", "bridge": "br-int", "label": 
"private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.46", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "41187b090f3d4818a32baa37ce8a3991", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3ec9b060-f4", "ovs_interfaceid": "3ec9b060-f43d-4698-9c76-6062c70911d5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Oct 14 06:15:28 localhost nova_compute[297686]: 2025-10-14 10:15:28.889 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Releasing lock "refresh_cache-88c4e366-b765-47a6-96bf-f7677f2ce67c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Oct 14 06:15:28 localhost nova_compute[297686]: 2025-10-14 10:15:28.889 2 DEBUG nova.compute.manager [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] [instance: 88c4e366-b765-47a6-96bf-f7677f2ce67c] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Oct 14 06:15:28 localhost nova_compute[297686]: 2025-10-14 10:15:28.890 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks 
/usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:15:28 localhost nova_compute[297686]: 2025-10-14 10:15:28.891 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:15:28 localhost nova_compute[297686]: 2025-10-14 10:15:28.891 2 DEBUG nova.compute.manager [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Oct 14 06:15:29 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e133 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Oct 14 06:15:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad. Oct 14 06:15:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec. Oct 14 06:15:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29. 
Oct 14 06:15:29 localhost podman[330017]: 2025-10-14 10:15:29.761936886 +0000 UTC m=+0.095650413 container health_status 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_id=multipathd, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image) Oct 14 06:15:29 localhost podman[330017]: 2025-10-14 10:15:29.773399702 +0000 UTC m=+0.107113149 container exec_died 
02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, io.buildah.version=1.41.3, config_id=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}) Oct 14 06:15:29 localhost systemd[1]: 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad.service: Deactivated successfully. 
Oct 14 06:15:29 localhost podman[330019]: 2025-10-14 10:15:29.830399427 +0000 UTC m=+0.154916069 container health_status fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, config_id=iscsid, org.label-schema.license=GPLv2, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=iscsid, org.label-schema.build-date=20251009, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}) Oct 14 06:15:29 localhost dnsmasq[328878]: read /var/lib/neutron/dhcp/d3f378a7-479f-4d46-b0a6-dfc40fc2a677/addn_hosts - 0 addresses Oct 14 06:15:29 localhost podman[330083]: 2025-10-14 10:15:29.932922323 +0000 UTC m=+0.069493134 container kill 
d47e9f07f8ff27d6e2072890b5d84ea9db63fd2f4eb9c99d384c1bf1be3cc0ef (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d3f378a7-479f-4d46-b0a6-dfc40fc2a677, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009) Oct 14 06:15:29 localhost dnsmasq-dhcp[328878]: read /var/lib/neutron/dhcp/d3f378a7-479f-4d46-b0a6-dfc40fc2a677/host Oct 14 06:15:29 localhost dnsmasq-dhcp[328878]: read /var/lib/neutron/dhcp/d3f378a7-479f-4d46-b0a6-dfc40fc2a677/opts Oct 14 06:15:29 localhost podman[330019]: 2025-10-14 10:15:29.966150221 +0000 UTC m=+0.290666803 container exec_died fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, org.label-schema.vendor=CentOS, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3) Oct 14 06:15:29 localhost systemd[1]: fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29.service: Deactivated successfully. Oct 14 06:15:30 localhost podman[330018]: 2025-10-14 10:15:29.915822262 +0000 UTC m=+0.244214194 container health_status aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': 
'/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Oct 14 06:15:30 localhost podman[330018]: 2025-10-14 10:15:30.050568207 +0000 UTC m=+0.378960129 container exec_died aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm) Oct 14 06:15:30 localhost systemd[1]: aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec.service: Deactivated successfully. 
Oct 14 06:15:30 localhost nova_compute[297686]: 2025-10-14 10:15:30.113 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:15:30 localhost kernel: device tap4b650fb3-c6 left promiscuous mode Oct 14 06:15:30 localhost ovn_controller[157396]: 2025-10-14T10:15:30Z|00154|binding|INFO|Releasing lport 4b650fb3-c6b4-440a-a92e-14d19513cdc0 from this chassis (sb_readonly=0) Oct 14 06:15:30 localhost ovn_controller[157396]: 2025-10-14T10:15:30Z|00155|binding|INFO|Setting lport 4b650fb3-c6b4-440a-a92e-14d19513cdc0 down in Southbound Oct 14 06:15:30 localhost ovn_metadata_agent[163050]: 2025-10-14 10:15:30.123 163055 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005486733.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpe1893814-19d7-5b97-af05-19e2aafa5382-d3f378a7-479f-4d46-b0a6-dfc40fc2a677', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d3f378a7-479f-4d46-b0a6-dfc40fc2a677', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '523b510232d6453589e91d54706d0036', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005486733.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=da478d7f-fb03-4a3e-a201-949626e55857, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=4b650fb3-c6b4-440a-a92e-14d19513cdc0) 
old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Oct 14 06:15:30 localhost ovn_metadata_agent[163050]: 2025-10-14 10:15:30.124 163055 INFO neutron.agent.ovn.metadata.agent [-] Port 4b650fb3-c6b4-440a-a92e-14d19513cdc0 in datapath d3f378a7-479f-4d46-b0a6-dfc40fc2a677 unbound from our chassis#033[00m Oct 14 06:15:30 localhost ovn_metadata_agent[163050]: 2025-10-14 10:15:30.126 163055 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d3f378a7-479f-4d46-b0a6-dfc40fc2a677, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Oct 14 06:15:30 localhost ovn_metadata_agent[163050]: 2025-10-14 10:15:30.127 163159 DEBUG oslo.privsep.daemon [-] privsep: reply[98abe0ab-de0f-47d8-852e-a814af14d80e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Oct 14 06:15:30 localhost nova_compute[297686]: 2025-10-14 10:15:30.136 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:15:30 localhost nova_compute[297686]: 2025-10-14 10:15:30.245 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:15:30 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' Oct 14 06:15:30 localhost dnsmasq[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/addn_hosts - 5 addresses Oct 14 06:15:30 localhost dnsmasq-dhcp[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/host Oct 14 06:15:30 localhost podman[330135]: 2025-10-14 10:15:30.394134417 +0000 UTC m=+0.071210066 container kill 27e23fba1f10c3118d508631d350793857acd97085c399f85eed727cd1eab7c3 
(image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0145816-4627-44f2-af00-ccc9ef0436ed, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, io.buildah.version=1.41.3) Oct 14 06:15:30 localhost dnsmasq-dhcp[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/opts Oct 14 06:15:32 localhost nova_compute[297686]: 2025-10-14 10:15:32.254 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:15:32 localhost nova_compute[297686]: 2025-10-14 10:15:32.255 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:15:32 localhost nova_compute[297686]: 2025-10-14 10:15:32.295 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Oct 14 06:15:32 localhost nova_compute[297686]: 2025-10-14 10:15:32.295 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Oct 14 06:15:32 
localhost nova_compute[297686]: 2025-10-14 10:15:32.296 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Oct 14 06:15:32 localhost nova_compute[297686]: 2025-10-14 10:15:32.296 2 DEBUG nova.compute.resource_tracker [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Auditing locally available compute resources for np0005486733.localdomain (node: np0005486733.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Oct 14 06:15:32 localhost nova_compute[297686]: 2025-10-14 10:15:32.297 2 DEBUG oslo_concurrency.processutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Oct 14 06:15:32 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Oct 14 06:15:32 localhost ceph-mon[317114]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.108:0/1150212276' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Oct 14 06:15:32 localhost nova_compute[297686]: 2025-10-14 10:15:32.749 2 DEBUG oslo_concurrency.processutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.452s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Oct 14 06:15:33 localhost nova_compute[297686]: 2025-10-14 10:15:33.067 2 DEBUG nova.virt.libvirt.driver [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Oct 14 06:15:33 localhost nova_compute[297686]: 2025-10-14 10:15:33.069 2 DEBUG nova.virt.libvirt.driver [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Oct 14 06:15:33 localhost nova_compute[297686]: 2025-10-14 10:15:33.289 2 WARNING nova.virt.libvirt.driver [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Oct 14 06:15:33 localhost nova_compute[297686]: 2025-10-14 10:15:33.291 2 DEBUG nova.compute.resource_tracker [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Hypervisor/Node resource view: name=np0005486733.localdomain free_ram=11256MB free_disk=41.83695602416992GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": 
"1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Oct 14 06:15:33 localhost nova_compute[297686]: 2025-10-14 10:15:33.292 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Oct 14 06:15:33 localhost nova_compute[297686]: 2025-10-14 10:15:33.292 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Oct 14 06:15:33 localhost nova_compute[297686]: 2025-10-14 10:15:33.375 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:15:34 localhost nova_compute[297686]: 2025-10-14 10:15:34.041 2 DEBUG nova.compute.resource_tracker [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Instance 88c4e366-b765-47a6-96bf-f7677f2ce67c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Oct 14 06:15:34 localhost nova_compute[297686]: 2025-10-14 10:15:34.042 2 DEBUG nova.compute.resource_tracker [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Oct 14 06:15:34 localhost nova_compute[297686]: 2025-10-14 10:15:34.042 2 DEBUG nova.compute.resource_tracker [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Final resource view: name=np0005486733.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Oct 14 06:15:34 localhost nova_compute[297686]: 2025-10-14 10:15:34.262 2 DEBUG oslo_concurrency.processutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Oct 14 06:15:34 localhost neutron_sriov_agent[264974]: 2025-10-14 10:15:34.287 2 INFO neutron.agent.securitygroups_rpc [None req-64f861fe-a14d-4cf4-a2d9-d8ab32f40bf6 daa37e9562ff4164ba297586fd32a970 8e6e5d2b322d4a35bd40e5b22dbee82d - - default default] Security group member updated ['5738ce03-d625-43e9-892b-9c4d671a952f']#033[00m Oct 14 06:15:34 localhost neutron_sriov_agent[264974]: 2025-10-14 10:15:34.304 2 INFO neutron.agent.securitygroups_rpc [None req-325610bf-d019-45ec-92d9-9f436fb13e27 b11f5b75a52243ed86cd4fe28898caef eff4d352999d485c9bd9a3b3cbf0c569 - - default default] Security group member updated ['25c1f9f0-ea5d-4940-9d8c-34da45a09b5d']#033[00m Oct 14 06:15:34 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e133 _set_new_cache_sizes 
cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Oct 14 06:15:34 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Oct 14 06:15:34 localhost ceph-mon[317114]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/931453789' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Oct 14 06:15:34 localhost nova_compute[297686]: 2025-10-14 10:15:34.734 2 DEBUG oslo_concurrency.processutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.472s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Oct 14 06:15:34 localhost nova_compute[297686]: 2025-10-14 10:15:34.740 2 DEBUG nova.compute.provider_tree [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Inventory has not changed in ProviderTree for provider: 18c24273-aca2-4f08-be57-3188d558235e update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Oct 14 06:15:34 localhost nova_compute[297686]: 2025-10-14 10:15:34.815 2 DEBUG nova.scheduler.client.report [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Inventory has not changed for provider 18c24273-aca2-4f08-be57-3188d558235e based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Oct 14 06:15:34 localhost nova_compute[297686]: 2025-10-14 10:15:34.817 2 DEBUG nova.compute.resource_tracker [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - 
- - -] Compute_service record updated for np0005486733.localdomain:np0005486733.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Oct 14 06:15:34 localhost nova_compute[297686]: 2025-10-14 10:15:34.817 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.526s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Oct 14 06:15:35 localhost dnsmasq[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/addn_hosts - 4 addresses Oct 14 06:15:35 localhost nova_compute[297686]: 2025-10-14 10:15:35.246 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:15:35 localhost dnsmasq-dhcp[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/host Oct 14 06:15:35 localhost dnsmasq-dhcp[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/opts Oct 14 06:15:35 localhost podman[330219]: 2025-10-14 10:15:35.247966243 +0000 UTC m=+0.040995601 container kill 27e23fba1f10c3118d508631d350793857acd97085c399f85eed727cd1eab7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0145816-4627-44f2-af00-ccc9ef0436ed, tcib_managed=true, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5) Oct 14 06:15:35 localhost ovn_controller[157396]: 2025-10-14T10:15:35Z|00156|binding|INFO|Releasing lport 25c6586a-239c-451b-aac2-e0a3ee5c3145 from this chassis (sb_readonly=0) Oct 14 06:15:35 
localhost nova_compute[297686]: 2025-10-14 10:15:35.511 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:15:35 localhost neutron_sriov_agent[264974]: 2025-10-14 10:15:35.649 2 INFO neutron.agent.securitygroups_rpc [None req-a8fd097b-722c-431b-846d-9b7a91a5b6ed b11f5b75a52243ed86cd4fe28898caef eff4d352999d485c9bd9a3b3cbf0c569 - - default default] Security group member updated ['25c1f9f0-ea5d-4940-9d8c-34da45a09b5d']#033[00m Oct 14 06:15:35 localhost nova_compute[297686]: 2025-10-14 10:15:35.819 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:15:35 localhost nova_compute[297686]: 2025-10-14 10:15:35.819 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:15:36 localhost ceph-mon[317114]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #31. Immutable memtables: 0. 
Oct 14 06:15:36 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:15:36.474151) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0 Oct 14 06:15:36 localhost ceph-mon[317114]: rocksdb: [db/flush_job.cc:856] [default] [JOB 15] Flushing memtable with next log file: 31 Oct 14 06:15:36 localhost ceph-mon[317114]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760436936474191, "job": 15, "event": "flush_started", "num_memtables": 1, "num_entries": 1647, "num_deletes": 251, "total_data_size": 1925846, "memory_usage": 1961056, "flush_reason": "Manual Compaction"} Oct 14 06:15:36 localhost ceph-mon[317114]: rocksdb: [db/flush_job.cc:885] [default] [JOB 15] Level-0 flush table #32: started Oct 14 06:15:36 localhost ceph-mon[317114]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760436936481940, "cf_name": "default", "job": 15, "event": "table_file_creation", "file_number": 32, "file_size": 1244054, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 20216, "largest_seqno": 21858, "table_properties": {"data_size": 1238021, "index_size": 3247, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1733, "raw_key_size": 13884, "raw_average_key_size": 20, "raw_value_size": 1225419, "raw_average_value_size": 1815, "num_data_blocks": 144, "num_entries": 675, "num_filter_entries": 675, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; 
max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1760436820, "oldest_key_time": 1760436820, "file_creation_time": 1760436936, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "c006d4a7-7ae8-4cd3-a1b5-95f5bf3c427c", "db_session_id": "ZS6W3A0Q266OBGAU6ARP", "orig_file_number": 32, "seqno_to_time_mapping": "N/A"}} Oct 14 06:15:36 localhost ceph-mon[317114]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 15] Flush lasted 7838 microseconds, and 4138 cpu microseconds. Oct 14 06:15:36 localhost ceph-mon[317114]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. Oct 14 06:15:36 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:15:36.481987) [db/flush_job.cc:967] [default] [JOB 15] Level-0 flush table #32: 1244054 bytes OK Oct 14 06:15:36 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:15:36.482013) [db/memtable_list.cc:519] [default] Level-0 commit table #32 started Oct 14 06:15:36 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:15:36.483595) [db/memtable_list.cc:722] [default] Level-0 commit table #32: memtable #1 done Oct 14 06:15:36 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:15:36.483616) EVENT_LOG_v1 {"time_micros": 1760436936483610, "job": 15, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0} Oct 14 06:15:36 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:15:36.483637) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25 Oct 14 06:15:36 localhost ceph-mon[317114]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 15] Try to delete WAL files size 1918242, prev total WAL file 
size 1918242, number of live WAL files 2. Oct 14 06:15:36 localhost ceph-mon[317114]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005486733/store.db/000028.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Oct 14 06:15:36 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:15:36.484488) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003132323939' seq:72057594037927935, type:22 .. '7061786F73003132353531' seq:0, type:0; will stop at (end) Oct 14 06:15:36 localhost ceph-mon[317114]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 16] Compacting 1@0 + 1@6 files to L6, score -1.00 Oct 14 06:15:36 localhost ceph-mon[317114]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 15 Base level 0, inputs: [32(1214KB)], [30(15MB)] Oct 14 06:15:36 localhost ceph-mon[317114]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760436936484524, "job": 16, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [32], "files_L6": [30], "score": -1, "input_data_size": 17044285, "oldest_snapshot_seqno": -1} Oct 14 06:15:36 localhost ceph-mon[317114]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 16] Generated table #33: 12556 keys, 15106434 bytes, temperature: kUnknown Oct 14 06:15:36 localhost ceph-mon[317114]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760436936553096, "cf_name": "default", "job": 16, "event": "table_file_creation", "file_number": 33, "file_size": 15106434, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 15037912, "index_size": 36037, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 31429, "raw_key_size": 339919, "raw_average_key_size": 27, "raw_value_size": 
14826820, "raw_average_value_size": 1180, "num_data_blocks": 1336, "num_entries": 12556, "num_filter_entries": 12556, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1760436394, "oldest_key_time": 0, "file_creation_time": 1760436936, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "c006d4a7-7ae8-4cd3-a1b5-95f5bf3c427c", "db_session_id": "ZS6W3A0Q266OBGAU6ARP", "orig_file_number": 33, "seqno_to_time_mapping": "N/A"}} Oct 14 06:15:36 localhost ceph-mon[317114]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. 
Oct 14 06:15:36 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:15:36.553393) [db/compaction/compaction_job.cc:1663] [default] [JOB 16] Compacted 1@0 + 1@6 files to L6 => 15106434 bytes Oct 14 06:15:36 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:15:36.555127) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 248.3 rd, 220.0 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.2, 15.1 +0.0 blob) out(14.4 +0.0 blob), read-write-amplify(25.8) write-amplify(12.1) OK, records in: 13079, records dropped: 523 output_compression: NoCompression Oct 14 06:15:36 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:15:36.555160) EVENT_LOG_v1 {"time_micros": 1760436936555146, "job": 16, "event": "compaction_finished", "compaction_time_micros": 68651, "compaction_time_cpu_micros": 44796, "output_level": 6, "num_output_files": 1, "total_output_size": 15106434, "num_input_records": 13079, "num_output_records": 12556, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]} Oct 14 06:15:36 localhost ceph-mon[317114]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005486733/store.db/000032.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Oct 14 06:15:36 localhost ceph-mon[317114]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760436936555471, "job": 16, "event": "table_file_deletion", "file_number": 32} Oct 14 06:15:36 localhost ceph-mon[317114]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005486733/store.db/000030.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Oct 14 06:15:36 localhost ceph-mon[317114]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760436936557487, 
"job": 16, "event": "table_file_deletion", "file_number": 30} Oct 14 06:15:36 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:15:36.484352) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Oct 14 06:15:36 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:15:36.557530) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Oct 14 06:15:36 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:15:36.557536) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Oct 14 06:15:36 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:15:36.557540) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Oct 14 06:15:36 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:15:36.557543) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Oct 14 06:15:36 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:15:36.557547) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Oct 14 06:15:36 localhost neutron_sriov_agent[264974]: 2025-10-14 10:15:36.623 2 INFO neutron.agent.securitygroups_rpc [None req-08f7afd9-ff65-4858-8bd4-f34f7ffad2b2 30647d4700b846dba79efd27fad03f3d a840994a70374548889747682f4c0fa3 - - default default] Security group member updated ['971079f2-c850-495f-833b-6314800b21a7']#033[00m Oct 14 06:15:36 localhost systemd[1]: tmp-crun.fwg1VA.mount: Deactivated successfully. 
Oct 14 06:15:36 localhost dnsmasq[328878]: exiting on receipt of SIGTERM Oct 14 06:15:36 localhost podman[330254]: 2025-10-14 10:15:36.980171694 +0000 UTC m=+0.063401504 container kill d47e9f07f8ff27d6e2072890b5d84ea9db63fd2f4eb9c99d384c1bf1be3cc0ef (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d3f378a7-479f-4d46-b0a6-dfc40fc2a677, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251009, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3) Oct 14 06:15:36 localhost systemd[1]: libpod-d47e9f07f8ff27d6e2072890b5d84ea9db63fd2f4eb9c99d384c1bf1be3cc0ef.scope: Deactivated successfully. Oct 14 06:15:37 localhost podman[330267]: 2025-10-14 10:15:37.05397589 +0000 UTC m=+0.056783680 container died d47e9f07f8ff27d6e2072890b5d84ea9db63fd2f4eb9c99d384c1bf1be3cc0ef (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d3f378a7-479f-4d46-b0a6-dfc40fc2a677, org.label-schema.build-date=20251009, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS) Oct 14 06:15:37 localhost podman[330267]: 2025-10-14 10:15:37.087334363 +0000 UTC m=+0.090142103 container cleanup d47e9f07f8ff27d6e2072890b5d84ea9db63fd2f4eb9c99d384c1bf1be3cc0ef (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d3f378a7-479f-4d46-b0a6-dfc40fc2a677, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, 
tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0) Oct 14 06:15:37 localhost systemd[1]: libpod-conmon-d47e9f07f8ff27d6e2072890b5d84ea9db63fd2f4eb9c99d384c1bf1be3cc0ef.scope: Deactivated successfully. Oct 14 06:15:37 localhost podman[330268]: 2025-10-14 10:15:37.134830254 +0000 UTC m=+0.133833025 container remove d47e9f07f8ff27d6e2072890b5d84ea9db63fd2f4eb9c99d384c1bf1be3cc0ef (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d3f378a7-479f-4d46-b0a6-dfc40fc2a677, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.build-date=20251009, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true) Oct 14 06:15:37 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:15:37.514 271987 INFO neutron.agent.dhcp.agent [None req-5638cd73-170a-4989-b7e3-6b7effef536a - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Oct 14 06:15:37 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:15:37.544 271987 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Oct 14 06:15:37 localhost systemd[1]: var-lib-containers-storage-overlay-1e9479996f603ff3e34b71c17ea01243896aa9ce4778c865298cabd070d4c698-merged.mount: Deactivated successfully. Oct 14 06:15:37 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d47e9f07f8ff27d6e2072890b5d84ea9db63fd2f4eb9c99d384c1bf1be3cc0ef-userdata-shm.mount: Deactivated successfully. Oct 14 06:15:37 localhost systemd[1]: run-netns-qdhcp\x2dd3f378a7\x2d479f\x2d4d46\x2db0a6\x2ddfc40fc2a677.mount: Deactivated successfully. 
Oct 14 06:15:38 localhost nova_compute[297686]: 2025-10-14 10:15:38.418 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:15:38 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:15:38.691 271987 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Oct 14 06:15:38 localhost openstack_network_exporter[250374]: ERROR 10:15:38 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Oct 14 06:15:38 localhost openstack_network_exporter[250374]: ERROR 10:15:38 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Oct 14 06:15:38 localhost openstack_network_exporter[250374]: ERROR 10:15:38 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Oct 14 06:15:38 localhost openstack_network_exporter[250374]: ERROR 10:15:38 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Oct 14 06:15:38 localhost openstack_network_exporter[250374]: Oct 14 06:15:38 localhost openstack_network_exporter[250374]: ERROR 10:15:38 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Oct 14 06:15:38 localhost openstack_network_exporter[250374]: Oct 14 06:15:39 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e133 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Oct 14 06:15:39 localhost neutron_sriov_agent[264974]: 2025-10-14 10:15:39.709 2 INFO neutron.agent.securitygroups_rpc [None req-ae8cfac3-21ba-4f3b-a5c6-b8d0c88fe156 30647d4700b846dba79efd27fad03f3d a840994a70374548889747682f4c0fa3 - - default default] Security group member updated ['971079f2-c850-495f-833b-6314800b21a7', '45b76fa0-7c48-480e-a89f-69be4691f61d']#033[00m Oct 14 06:15:40 localhost nova_compute[297686]: 2025-10-14 10:15:40.247 2 
DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:15:40 localhost dnsmasq[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/addn_hosts - 3 addresses Oct 14 06:15:40 localhost podman[330312]: 2025-10-14 10:15:40.489537268 +0000 UTC m=+0.069836724 container kill 27e23fba1f10c3118d508631d350793857acd97085c399f85eed727cd1eab7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0145816-4627-44f2-af00-ccc9ef0436ed, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5) Oct 14 06:15:40 localhost dnsmasq-dhcp[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/host Oct 14 06:15:40 localhost dnsmasq-dhcp[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/opts Oct 14 06:15:40 localhost neutron_sriov_agent[264974]: 2025-10-14 10:15:40.559 2 INFO neutron.agent.securitygroups_rpc [None req-a8d73588-3b29-4fe7-973c-8b131610fa4b daa37e9562ff4164ba297586fd32a970 8e6e5d2b322d4a35bd40e5b22dbee82d - - default default] Security group member updated ['5738ce03-d625-43e9-892b-9c4d671a952f']#033[00m Oct 14 06:15:40 localhost neutron_sriov_agent[264974]: 2025-10-14 10:15:40.918 2 INFO neutron.agent.securitygroups_rpc [None req-0a8d036a-6be5-4605-bfe1-862cd29794d4 30647d4700b846dba79efd27fad03f3d a840994a70374548889747682f4c0fa3 - - default default] Security group member updated ['45b76fa0-7c48-480e-a89f-69be4691f61d']#033[00m Oct 14 06:15:41 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:15:41.557 271987 INFO neutron.agent.linux.ip_lib [None req-2971e3f1-8c6d-4c67-b808-2fc8990cf3a0 - - - - - -] 
Device tap41eb1a2e-0d cannot be used as it has no MAC address#033[00m Oct 14 06:15:41 localhost nova_compute[297686]: 2025-10-14 10:15:41.585 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:15:41 localhost kernel: device tap41eb1a2e-0d entered promiscuous mode Oct 14 06:15:41 localhost NetworkManager[5977]: [1760436941.5949] manager: (tap41eb1a2e-0d): new Generic device (/org/freedesktop/NetworkManager/Devices/30) Oct 14 06:15:41 localhost nova_compute[297686]: 2025-10-14 10:15:41.595 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:15:41 localhost ovn_controller[157396]: 2025-10-14T10:15:41Z|00157|binding|INFO|Claiming lport 41eb1a2e-0d25-4c86-bd1a-2707aa4a3634 for this chassis. Oct 14 06:15:41 localhost ovn_controller[157396]: 2025-10-14T10:15:41Z|00158|binding|INFO|41eb1a2e-0d25-4c86-bd1a-2707aa4a3634: Claiming unknown Oct 14 06:15:41 localhost systemd-udevd[330343]: Network interface NamePolicy= disabled on kernel command line. 
Oct 14 06:15:41 localhost ovn_metadata_agent[163050]: 2025-10-14 10:15:41.607 163055 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005486733.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::3/64', 'neutron:device_id': 'dhcpe1893814-19d7-5b97-af05-19e2aafa5382-53cf8a57-fb08-40f1-9bfb-eacee579a079', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-53cf8a57-fb08-40f1-9bfb-eacee579a079', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8ca5e1d577fe463aa89a13e320c6dd5f', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d9b0716a-f607-43a0-af10-76b8ffde2c13, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=41eb1a2e-0d25-4c86-bd1a-2707aa4a3634) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Oct 14 06:15:41 localhost ovn_metadata_agent[163050]: 2025-10-14 10:15:41.610 163055 INFO neutron.agent.ovn.metadata.agent [-] Port 41eb1a2e-0d25-4c86-bd1a-2707aa4a3634 in datapath 53cf8a57-fb08-40f1-9bfb-eacee579a079 bound to our chassis#033[00m Oct 14 06:15:41 localhost ovn_metadata_agent[163050]: 2025-10-14 10:15:41.614 163055 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 53cf8a57-fb08-40f1-9bfb-eacee579a079 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params 
/usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Oct 14 06:15:41 localhost ovn_metadata_agent[163050]: 2025-10-14 10:15:41.615 163159 DEBUG oslo.privsep.daemon [-] privsep: reply[8134dba0-72c8-4de2-891d-9cf5ac51c11a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Oct 14 06:15:41 localhost journal[237477]: ethtool ioctl error on tap41eb1a2e-0d: No such device Oct 14 06:15:41 localhost nova_compute[297686]: 2025-10-14 10:15:41.643 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:15:41 localhost ovn_controller[157396]: 2025-10-14T10:15:41Z|00159|binding|INFO|Setting lport 41eb1a2e-0d25-4c86-bd1a-2707aa4a3634 ovn-installed in OVS Oct 14 06:15:41 localhost ovn_controller[157396]: 2025-10-14T10:15:41Z|00160|binding|INFO|Setting lport 41eb1a2e-0d25-4c86-bd1a-2707aa4a3634 up in Southbound Oct 14 06:15:41 localhost journal[237477]: ethtool ioctl error on tap41eb1a2e-0d: No such device Oct 14 06:15:41 localhost journal[237477]: ethtool ioctl error on tap41eb1a2e-0d: No such device Oct 14 06:15:41 localhost journal[237477]: ethtool ioctl error on tap41eb1a2e-0d: No such device Oct 14 06:15:41 localhost journal[237477]: ethtool ioctl error on tap41eb1a2e-0d: No such device Oct 14 06:15:41 localhost journal[237477]: ethtool ioctl error on tap41eb1a2e-0d: No such device Oct 14 06:15:41 localhost journal[237477]: ethtool ioctl error on tap41eb1a2e-0d: No such device Oct 14 06:15:41 localhost journal[237477]: ethtool ioctl error on tap41eb1a2e-0d: No such device Oct 14 06:15:41 localhost nova_compute[297686]: 2025-10-14 10:15:41.687 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:15:41 localhost nova_compute[297686]: 2025-10-14 10:15:41.724 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:15:42 localhost ovn_controller[157396]: 2025-10-14T10:15:42Z|00161|binding|INFO|Removing iface tap41eb1a2e-0d ovn-installed in OVS Oct 14 06:15:42 localhost ovn_metadata_agent[163050]: 2025-10-14 10:15:42.270 163055 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port 40944834-801b-47a9-9b3c-9a0c976f516f with type ""#033[00m Oct 14 06:15:42 localhost ovn_metadata_agent[163050]: 2025-10-14 10:15:42.271 163055 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005486733.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::3/64', 'neutron:device_id': 'dhcpe1893814-19d7-5b97-af05-19e2aafa5382-53cf8a57-fb08-40f1-9bfb-eacee579a079', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-53cf8a57-fb08-40f1-9bfb-eacee579a079', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8ca5e1d577fe463aa89a13e320c6dd5f', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d9b0716a-f607-43a0-af10-76b8ffde2c13, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=41eb1a2e-0d25-4c86-bd1a-2707aa4a3634) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Oct 14 06:15:42 localhost ovn_metadata_agent[163050]: 2025-10-14 10:15:42.273 163055 INFO neutron.agent.ovn.metadata.agent [-] Port 41eb1a2e-0d25-4c86-bd1a-2707aa4a3634 
in datapath 53cf8a57-fb08-40f1-9bfb-eacee579a079 unbound from our chassis#033[00m Oct 14 06:15:42 localhost ovn_metadata_agent[163050]: 2025-10-14 10:15:42.275 163055 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 53cf8a57-fb08-40f1-9bfb-eacee579a079 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Oct 14 06:15:42 localhost ovn_metadata_agent[163050]: 2025-10-14 10:15:42.276 163159 DEBUG oslo.privsep.daemon [-] privsep: reply[c9e6b742-e34d-4592-8d00-76c72a6fbc25]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Oct 14 06:15:42 localhost ovn_controller[157396]: 2025-10-14T10:15:42Z|00162|binding|INFO|Removing lport 41eb1a2e-0d25-4c86-bd1a-2707aa4a3634 ovn-installed in OVS Oct 14 06:15:42 localhost nova_compute[297686]: 2025-10-14 10:15:42.315 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:15:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f. Oct 14 06:15:42 localhost podman[330413]: Oct 14 06:15:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917. Oct 14 06:15:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d. 
Oct 14 06:15:42 localhost podman[330413]: 2025-10-14 10:15:42.667458114 +0000 UTC m=+0.080562445 container create 9243eb9893a7d4fd12831c0fc89abbbcddf4ac82b3773bacac6d76c881cbb93e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-53cf8a57-fb08-40f1-9bfb-eacee579a079, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0) Oct 14 06:15:42 localhost systemd[1]: Started libpod-conmon-9243eb9893a7d4fd12831c0fc89abbbcddf4ac82b3773bacac6d76c881cbb93e.scope. Oct 14 06:15:42 localhost systemd[1]: tmp-crun.hE0Lq1.mount: Deactivated successfully. Oct 14 06:15:42 localhost podman[330413]: 2025-10-14 10:15:42.623896235 +0000 UTC m=+0.037000596 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Oct 14 06:15:42 localhost systemd[1]: Started libcrun container. 
Oct 14 06:15:42 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c5b53c021aed25dfb86770e0aa0cf730fe8233b79a34973f8127d2f303655398/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Oct 14 06:15:42 localhost podman[330413]: 2025-10-14 10:15:42.748110752 +0000 UTC m=+0.161215073 container init 9243eb9893a7d4fd12831c0fc89abbbcddf4ac82b3773bacac6d76c881cbb93e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-53cf8a57-fb08-40f1-9bfb-eacee579a079, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251009, io.buildah.version=1.41.3, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, maintainer=OpenStack Kubernetes Operator team) Oct 14 06:15:42 localhost podman[330436]: 2025-10-14 10:15:42.758830994 +0000 UTC m=+0.095682994 container health_status 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, managed_by=edpm_ansible, org.label-schema.build-date=20251009, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, container_name=ovn_controller, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 
'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}) Oct 14 06:15:42 localhost podman[330413]: 2025-10-14 10:15:42.766933076 +0000 UTC m=+0.180037397 container start 9243eb9893a7d4fd12831c0fc89abbbcddf4ac82b3773bacac6d76c881cbb93e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-53cf8a57-fb08-40f1-9bfb-eacee579a079, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team) Oct 14 06:15:42 localhost dnsmasq[330494]: started, version 2.85 cachesize 150 Oct 14 06:15:42 localhost dnsmasq[330494]: DNS service limited to local subnets Oct 14 06:15:42 localhost dnsmasq[330494]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Oct 14 06:15:42 localhost dnsmasq[330494]: warning: no upstream servers configured Oct 14 06:15:42 localhost dnsmasq-dhcp[330494]: DHCPv6, static leases only on 2001:db8::, lease time 1d Oct 14 06:15:42 localhost dnsmasq[330494]: read /var/lib/neutron/dhcp/53cf8a57-fb08-40f1-9bfb-eacee579a079/addn_hosts - 0 addresses Oct 14 06:15:42 localhost dnsmasq-dhcp[330494]: read /var/lib/neutron/dhcp/53cf8a57-fb08-40f1-9bfb-eacee579a079/host Oct 14 06:15:42 localhost dnsmasq-dhcp[330494]: read /var/lib/neutron/dhcp/53cf8a57-fb08-40f1-9bfb-eacee579a079/opts Oct 14 06:15:42 localhost dnsmasq[325837]: 
read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/addn_hosts - 2 addresses Oct 14 06:15:42 localhost dnsmasq-dhcp[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/host Oct 14 06:15:42 localhost dnsmasq-dhcp[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/opts Oct 14 06:15:42 localhost podman[330475]: 2025-10-14 10:15:42.790506195 +0000 UTC m=+0.050270638 container kill 27e23fba1f10c3118d508631d350793857acd97085c399f85eed727cd1eab7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0145816-4627-44f2-af00-ccc9ef0436ed, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0) Oct 14 06:15:42 localhost podman[330440]: 2025-10-14 10:15:42.814760507 +0000 UTC m=+0.146626772 container health_status 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': 
['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, vcs-type=git, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, io.buildah.version=1.33.7, version=9.6, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm) Oct 14 06:15:42 localhost podman[330436]: 2025-10-14 10:15:42.821030991 +0000 UTC m=+0.157882981 container exec_died 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible) Oct 14 06:15:42 localhost podman[330440]: 2025-10-14 10:15:42.82841739 +0000 UTC m=+0.160283655 container 
exec_died 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, vendor=Red Hat, Inc., container_name=openstack_network_exporter, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, vcs-type=git, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, io.buildah.version=1.33.7, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, 
build-date=2025-08-20T13:12:41, distribution-scope=public, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9) Oct 14 06:15:42 localhost systemd[1]: 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f.service: Deactivated successfully. Oct 14 06:15:42 localhost systemd[1]: 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917.service: Deactivated successfully. Oct 14 06:15:42 localhost nova_compute[297686]: 2025-10-14 10:15:42.865 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:15:42 localhost kernel: device tap41eb1a2e-0d left promiscuous mode Oct 14 06:15:42 localhost nova_compute[297686]: 2025-10-14 10:15:42.890 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:15:42 localhost podman[330441]: 2025-10-14 10:15:42.917640504 +0000 UTC m=+0.245841426 container health_status 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, config_id=edpm, container_name=ceilometer_agent_compute, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true, 
managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}) Oct 14 06:15:42 localhost podman[330441]: 2025-10-14 10:15:42.930499951 +0000 UTC m=+0.258700923 container exec_died 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=edpm, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009) Oct 14 06:15:42 localhost systemd[1]: 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d.service: Deactivated successfully. 
Oct 14 06:15:42 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:15:42.944 271987 INFO neutron.agent.dhcp.agent [None req-98bbc3b6-5e6d-42c1-8b1b-07b2b5ce0794 - - - - - -] DHCP configuration for ports {'1946cbaa-52d6-40fd-a68e-b3af839ed414'} is completed#033[00m Oct 14 06:15:43 localhost dnsmasq[330494]: read /var/lib/neutron/dhcp/53cf8a57-fb08-40f1-9bfb-eacee579a079/addn_hosts - 0 addresses Oct 14 06:15:43 localhost dnsmasq-dhcp[330494]: read /var/lib/neutron/dhcp/53cf8a57-fb08-40f1-9bfb-eacee579a079/host Oct 14 06:15:43 localhost podman[330549]: 2025-10-14 10:15:43.102351765 +0000 UTC m=+0.052032823 container kill 9243eb9893a7d4fd12831c0fc89abbbcddf4ac82b3773bacac6d76c881cbb93e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-53cf8a57-fb08-40f1-9bfb-eacee579a079, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true) Oct 14 06:15:43 localhost dnsmasq-dhcp[330494]: read /var/lib/neutron/dhcp/53cf8a57-fb08-40f1-9bfb-eacee579a079/opts Oct 14 06:15:43 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:15:43.128 271987 ERROR neutron.agent.dhcp.agent [-] Unable to reload_allocations dhcp for 53cf8a57-fb08-40f1-9bfb-eacee579a079.: neutron.privileged.agent.linux.ip_lib.NetworkInterfaceNotFound: Network interface tap41eb1a2e-0d not found in namespace qdhcp-53cf8a57-fb08-40f1-9bfb-eacee579a079. 
Oct 14 06:15:43 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:15:43.128 271987 ERROR neutron.agent.dhcp.agent Traceback (most recent call last): Oct 14 06:15:43 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:15:43.128 271987 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/dhcp/agent.py", line 264, in _call_driver Oct 14 06:15:43 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:15:43.128 271987 ERROR neutron.agent.dhcp.agent rv = getattr(driver, action)(**action_kwargs) Oct 14 06:15:43 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:15:43.128 271987 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 673, in reload_allocations Oct 14 06:15:43 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:15:43.128 271987 ERROR neutron.agent.dhcp.agent self.device_manager.update(self.network, self.interface_name) Oct 14 06:15:43 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:15:43.128 271987 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1899, in update Oct 14 06:15:43 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:15:43.128 271987 ERROR neutron.agent.dhcp.agent self._set_default_route(network, device_name) Oct 14 06:15:43 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:15:43.128 271987 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1610, in _set_default_route Oct 14 06:15:43 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:15:43.128 271987 ERROR neutron.agent.dhcp.agent self._set_default_route_ip_version(network, device_name, Oct 14 06:15:43 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:15:43.128 271987 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1539, in _set_default_route_ip_version Oct 14 06:15:43 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:15:43.128 271987 ERROR 
neutron.agent.dhcp.agent gateway = device.route.get_gateway(ip_version=ip_version) Oct 14 06:15:43 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:15:43.128 271987 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 671, in get_gateway Oct 14 06:15:43 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:15:43.128 271987 ERROR neutron.agent.dhcp.agent routes = self.list_routes(ip_version, scope=scope, table=table) Oct 14 06:15:43 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:15:43.128 271987 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 656, in list_routes Oct 14 06:15:43 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:15:43.128 271987 ERROR neutron.agent.dhcp.agent return list_ip_routes(self._parent.namespace, ip_version, scope=scope, Oct 14 06:15:43 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:15:43.128 271987 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 1611, in list_ip_routes Oct 14 06:15:43 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:15:43.128 271987 ERROR neutron.agent.dhcp.agent routes = privileged.list_ip_routes(namespace, ip_version, device=device, Oct 14 06:15:43 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:15:43.128 271987 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 333, in wrapped_f Oct 14 06:15:43 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:15:43.128 271987 ERROR neutron.agent.dhcp.agent return self(f, *args, **kw) Oct 14 06:15:43 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:15:43.128 271987 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 423, in __call__ Oct 14 06:15:43 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:15:43.128 271987 ERROR neutron.agent.dhcp.agent do = self.iter(retry_state=retry_state) Oct 14 06:15:43 
localhost neutron_dhcp_agent[271983]: 2025-10-14 10:15:43.128 271987 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 360, in iter Oct 14 06:15:43 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:15:43.128 271987 ERROR neutron.agent.dhcp.agent return fut.result() Oct 14 06:15:43 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:15:43.128 271987 ERROR neutron.agent.dhcp.agent File "/usr/lib64/python3.9/concurrent/futures/_base.py", line 439, in result Oct 14 06:15:43 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:15:43.128 271987 ERROR neutron.agent.dhcp.agent return self.__get_result() Oct 14 06:15:43 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:15:43.128 271987 ERROR neutron.agent.dhcp.agent File "/usr/lib64/python3.9/concurrent/futures/_base.py", line 391, in __get_result Oct 14 06:15:43 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:15:43.128 271987 ERROR neutron.agent.dhcp.agent raise self._exception Oct 14 06:15:43 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:15:43.128 271987 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 426, in __call__ Oct 14 06:15:43 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:15:43.128 271987 ERROR neutron.agent.dhcp.agent result = fn(*args, **kwargs) Oct 14 06:15:43 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:15:43.128 271987 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/oslo_privsep/priv_context.py", line 271, in _wrap Oct 14 06:15:43 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:15:43.128 271987 ERROR neutron.agent.dhcp.agent return self.channel.remote_call(name, args, kwargs, Oct 14 06:15:43 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:15:43.128 271987 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/oslo_privsep/daemon.py", line 215, in remote_call Oct 14 06:15:43 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:15:43.128 271987 
ERROR neutron.agent.dhcp.agent raise exc_type(*result[2]) Oct 14 06:15:43 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:15:43.128 271987 ERROR neutron.agent.dhcp.agent neutron.privileged.agent.linux.ip_lib.NetworkInterfaceNotFound: Network interface tap41eb1a2e-0d not found in namespace qdhcp-53cf8a57-fb08-40f1-9bfb-eacee579a079. Oct 14 06:15:43 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:15:43.128 271987 ERROR neutron.agent.dhcp.agent #033[00m Oct 14 06:15:43 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:15:43.131 271987 INFO neutron.agent.dhcp.agent [None req-2a0051eb-f11e-4e7f-bbd5-8b3e93a7ec2e - - - - - -] Synchronizing state#033[00m Oct 14 06:15:43 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:15:43.346 271987 INFO neutron.agent.dhcp.agent [None req-c0f5770e-474a-4256-8cc7-3305d6f80f30 - - - - - -] All active networks have been fetched through RPC.#033[00m Oct 14 06:15:43 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:15:43.348 271987 INFO neutron.agent.dhcp.agent [-] Starting network 53cf8a57-fb08-40f1-9bfb-eacee579a079 dhcp configuration#033[00m Oct 14 06:15:43 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:15:43.349 271987 INFO neutron.agent.dhcp.agent [-] Finished network 53cf8a57-fb08-40f1-9bfb-eacee579a079 dhcp configuration#033[00m Oct 14 06:15:43 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:15:43.349 271987 INFO neutron.agent.dhcp.agent [None req-c0f5770e-474a-4256-8cc7-3305d6f80f30 - - - - - -] Synchronizing state complete#033[00m Oct 14 06:15:43 localhost nova_compute[297686]: 2025-10-14 10:15:43.465 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:15:43 localhost dnsmasq[330494]: exiting on receipt of SIGTERM Oct 14 06:15:43 localhost podman[330580]: 2025-10-14 10:15:43.598820282 +0000 UTC m=+0.040834786 container kill 9243eb9893a7d4fd12831c0fc89abbbcddf4ac82b3773bacac6d76c881cbb93e 
(image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-53cf8a57-fb08-40f1-9bfb-eacee579a079, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team) Oct 14 06:15:43 localhost systemd[1]: libpod-9243eb9893a7d4fd12831c0fc89abbbcddf4ac82b3773bacac6d76c881cbb93e.scope: Deactivated successfully. Oct 14 06:15:43 localhost podman[330596]: 2025-10-14 10:15:43.657996274 +0000 UTC m=+0.042590880 container died 9243eb9893a7d4fd12831c0fc89abbbcddf4ac82b3773bacac6d76c881cbb93e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-53cf8a57-fb08-40f1-9bfb-eacee579a079, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251009, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image) Oct 14 06:15:43 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-9243eb9893a7d4fd12831c0fc89abbbcddf4ac82b3773bacac6d76c881cbb93e-userdata-shm.mount: Deactivated successfully. Oct 14 06:15:43 localhost systemd[1]: var-lib-containers-storage-overlay-c5b53c021aed25dfb86770e0aa0cf730fe8233b79a34973f8127d2f303655398-merged.mount: Deactivated successfully. 
Oct 14 06:15:43 localhost podman[330596]: 2025-10-14 10:15:43.697781836 +0000 UTC m=+0.082376492 container remove 9243eb9893a7d4fd12831c0fc89abbbcddf4ac82b3773bacac6d76c881cbb93e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-53cf8a57-fb08-40f1-9bfb-eacee579a079, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image) Oct 14 06:15:43 localhost ovn_controller[157396]: 2025-10-14T10:15:43Z|00163|binding|INFO|Releasing lport 25c6586a-239c-451b-aac2-e0a3ee5c3145 from this chassis (sb_readonly=0) Oct 14 06:15:43 localhost systemd[1]: libpod-conmon-9243eb9893a7d4fd12831c0fc89abbbcddf4ac82b3773bacac6d76c881cbb93e.scope: Deactivated successfully. Oct 14 06:15:43 localhost systemd[1]: run-netns-qdhcp\x2d53cf8a57\x2dfb08\x2d40f1\x2d9bfb\x2deacee579a079.mount: Deactivated successfully. Oct 14 06:15:43 localhost nova_compute[297686]: 2025-10-14 10:15:43.728 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:15:44 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e133 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Oct 14 06:15:44 localhost dnsmasq[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/addn_hosts - 1 addresses Oct 14 06:15:44 localhost dnsmasq-dhcp[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/host Oct 14 06:15:44 localhost systemd[1]: tmp-crun.cOBo2W.mount: Deactivated successfully. 
Oct 14 06:15:44 localhost dnsmasq-dhcp[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/opts Oct 14 06:15:44 localhost podman[330639]: 2025-10-14 10:15:44.847805746 +0000 UTC m=+0.067016066 container kill 27e23fba1f10c3118d508631d350793857acd97085c399f85eed727cd1eab7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0145816-4627-44f2-af00-ccc9ef0436ed, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.vendor=CentOS) Oct 14 06:15:44 localhost ovn_controller[157396]: 2025-10-14T10:15:44Z|00164|binding|INFO|Releasing lport 25c6586a-239c-451b-aac2-e0a3ee5c3145 from this chassis (sb_readonly=0) Oct 14 06:15:45 localhost nova_compute[297686]: 2025-10-14 10:15:45.007 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:15:45 localhost nova_compute[297686]: 2025-10-14 10:15:45.249 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:15:46 localhost neutron_sriov_agent[264974]: 2025-10-14 10:15:46.343 2 INFO neutron.agent.securitygroups_rpc [None req-502c9346-9c9b-4520-9303-69df7912cd6d 30647d4700b846dba79efd27fad03f3d a840994a70374548889747682f4c0fa3 - - default default] Security group member updated ['c7522168-79ac-4334-a811-1abcc722b92a']#033[00m Oct 14 06:15:48 localhost nova_compute[297686]: 2025-10-14 10:15:48.499 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:15:49 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e133 
_set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Oct 14 06:15:49 localhost neutron_sriov_agent[264974]: 2025-10-14 10:15:49.744 2 INFO neutron.agent.securitygroups_rpc [None req-4ce9037b-3a14-4802-8951-61a892782857 30647d4700b846dba79efd27fad03f3d a840994a70374548889747682f4c0fa3 - - default default] Security group member updated ['04a003a3-9634-4c19-bd44-c2ff00c6dace', 'c7522168-79ac-4334-a811-1abcc722b92a', '64c7cb4a-1e23-4a29-b5a6-11af05e1b20e']#033[00m Oct 14 06:15:50 localhost nova_compute[297686]: 2025-10-14 10:15:50.251 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:15:50 localhost neutron_sriov_agent[264974]: 2025-10-14 10:15:50.699 2 INFO neutron.agent.securitygroups_rpc [None req-ead15db7-821b-4ac4-934e-f2fb9da96f57 30647d4700b846dba79efd27fad03f3d a840994a70374548889747682f4c0fa3 - - default default] Security group member updated ['04a003a3-9634-4c19-bd44-c2ff00c6dace', '64c7cb4a-1e23-4a29-b5a6-11af05e1b20e']#033[00m Oct 14 06:15:50 localhost neutron_sriov_agent[264974]: 2025-10-14 10:15:50.934 2 INFO neutron.agent.securitygroups_rpc [None req-096ef0b6-80e0-4a9f-bb4f-2c49854ab138 da88dc55c7044cbba38f975c7e0b048b ad642aabc86d4ac1b3d38b6fe087eb44 - - default default] Security group rule updated ['3dc93998-b54b-4d14-b147-1dfdbe73ed61']#033[00m Oct 14 06:15:51 localhost ovn_metadata_agent[163050]: 2025-10-14 10:15:51.008 163055 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:63:b4:89 10.100.0.2 2001:db8::f816:3eff:fe63:b489'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': 
'10.100.0.2/28 2001:db8::f816:3eff:fe63:b489/64', 'neutron:device_id': 'ovnmeta-74049e43-4aa7-4318-9233-a58980c3495b', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-74049e43-4aa7-4318-9233-a58980c3495b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '82fc7afce38344ffb7eda3bb0fabdb5b', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0f1f1366-6307-4914-922e-2b4f9757811b, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=bb90059a-750e-43da-ba16-03b3dce8c155) old=Port_Binding(mac=['fa:16:3e:63:b4:89 2001:db8::f816:3eff:fe63:b489'], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe63:b489/64', 'neutron:device_id': 'ovnmeta-74049e43-4aa7-4318-9233-a58980c3495b', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-74049e43-4aa7-4318-9233-a58980c3495b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '82fc7afce38344ffb7eda3bb0fabdb5b', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Oct 14 06:15:51 localhost ovn_metadata_agent[163050]: 2025-10-14 10:15:51.010 163055 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port bb90059a-750e-43da-ba16-03b3dce8c155 in datapath 74049e43-4aa7-4318-9233-a58980c3495b updated#033[00m Oct 14 06:15:51 localhost ovn_metadata_agent[163050]: 2025-10-14 10:15:51.013 163055 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 74049e43-4aa7-4318-9233-a58980c3495b, tearing the namespace down 
if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Oct 14 06:15:51 localhost ovn_metadata_agent[163050]: 2025-10-14 10:15:51.014 163159 DEBUG oslo.privsep.daemon [-] privsep: reply[43f28615-2ef9-4f65-a772-e2c6b7af483e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Oct 14 06:15:51 localhost neutron_sriov_agent[264974]: 2025-10-14 10:15:51.344 2 INFO neutron.agent.securitygroups_rpc [None req-b689f1db-8f1b-40c9-a2dc-453b39103118 da88dc55c7044cbba38f975c7e0b048b ad642aabc86d4ac1b3d38b6fe087eb44 - - default default] Security group rule updated ['3dc93998-b54b-4d14-b147-1dfdbe73ed61']#033[00m Oct 14 06:15:51 localhost neutron_sriov_agent[264974]: 2025-10-14 10:15:51.874 2 INFO neutron.agent.securitygroups_rpc [None req-33978405-9b76-48c0-9344-3eec93a44254 73c3910059834cd0998d3459c50cd69d 82fc7afce38344ffb7eda3bb0fabdb5b - - default default] Security group member updated ['10f25aec-a6f2-40dd-837d-8812e1c0ebb8']#033[00m Oct 14 06:15:53 localhost neutron_sriov_agent[264974]: 2025-10-14 10:15:53.158 2 INFO neutron.agent.securitygroups_rpc [None req-24b63eb0-8d95-4362-967a-7a391c8c282f da88dc55c7044cbba38f975c7e0b048b ad642aabc86d4ac1b3d38b6fe087eb44 - - default default] Security group rule updated ['d81eaca5-41d5-465a-ae37-475fd17fd0b7']#033[00m Oct 14 06:15:53 localhost neutron_sriov_agent[264974]: 2025-10-14 10:15:53.232 2 INFO neutron.agent.securitygroups_rpc [None req-f2964234-381e-4863-ab6a-4afc28efe02d 73c3910059834cd0998d3459c50cd69d 82fc7afce38344ffb7eda3bb0fabdb5b - - default default] Security group member updated ['10f25aec-a6f2-40dd-837d-8812e1c0ebb8']#033[00m Oct 14 06:15:53 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:15:53.302 271987 INFO neutron.agent.linux.ip_lib [None req-9ad1e35d-1eb0-4891-8dcc-a92c19997490 - - - - - -] Device tap1df01de3-c1 cannot be used as it has no MAC address#033[00m Oct 14 06:15:53 localhost nova_compute[297686]: 
2025-10-14 10:15:53.368 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:15:53 localhost kernel: device tap1df01de3-c1 entered promiscuous mode Oct 14 06:15:53 localhost nova_compute[297686]: 2025-10-14 10:15:53.379 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:15:53 localhost NetworkManager[5977]: [1760436953.3802] manager: (tap1df01de3-c1): new Generic device (/org/freedesktop/NetworkManager/Devices/31) Oct 14 06:15:53 localhost ovn_controller[157396]: 2025-10-14T10:15:53Z|00165|binding|INFO|Claiming lport 1df01de3-c11c-40e3-8802-ea137fe51f0c for this chassis. Oct 14 06:15:53 localhost ovn_controller[157396]: 2025-10-14T10:15:53Z|00166|binding|INFO|1df01de3-c11c-40e3-8802-ea137fe51f0c: Claiming unknown Oct 14 06:15:53 localhost systemd-udevd[330672]: Network interface NamePolicy= disabled on kernel command line. 
Oct 14 06:15:53 localhost ovn_metadata_agent[163050]: 2025-10-14 10:15:53.391 163055 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005486733.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpe1893814-19d7-5b97-af05-19e2aafa5382-5d8fe93a-c65a-4669-ba1e-66d52ee61c6a', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5d8fe93a-c65a-4669-ba1e-66d52ee61c6a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7bf1be3a6a454996a4414fad306906f1', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=64127a4b-0baf-4336-8658-a60a67ebf24c, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=1df01de3-c11c-40e3-8802-ea137fe51f0c) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Oct 14 06:15:53 localhost ovn_metadata_agent[163050]: 2025-10-14 10:15:53.392 163055 INFO neutron.agent.ovn.metadata.agent [-] Port 1df01de3-c11c-40e3-8802-ea137fe51f0c in datapath 5d8fe93a-c65a-4669-ba1e-66d52ee61c6a bound to our chassis#033[00m Oct 14 06:15:53 localhost ovn_metadata_agent[163050]: 2025-10-14 10:15:53.394 163055 DEBUG neutron.agent.ovn.metadata.agent [-] Port 348ef64a-06c7-4ee5-ab07-863a81a0452a IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips 
/usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Oct 14 06:15:53 localhost ovn_metadata_agent[163050]: 2025-10-14 10:15:53.395 163055 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 5d8fe93a-c65a-4669-ba1e-66d52ee61c6a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Oct 14 06:15:53 localhost ovn_metadata_agent[163050]: 2025-10-14 10:15:53.396 163159 DEBUG oslo.privsep.daemon [-] privsep: reply[5c0d1159-9ed4-4316-a0a1-13805fabab03]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Oct 14 06:15:53 localhost journal[237477]: ethtool ioctl error on tap1df01de3-c1: No such device Oct 14 06:15:53 localhost nova_compute[297686]: 2025-10-14 10:15:53.409 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:15:53 localhost ovn_controller[157396]: 2025-10-14T10:15:53Z|00167|binding|INFO|Setting lport 1df01de3-c11c-40e3-8802-ea137fe51f0c ovn-installed in OVS Oct 14 06:15:53 localhost ovn_controller[157396]: 2025-10-14T10:15:53Z|00168|binding|INFO|Setting lport 1df01de3-c11c-40e3-8802-ea137fe51f0c up in Southbound Oct 14 06:15:53 localhost journal[237477]: ethtool ioctl error on tap1df01de3-c1: No such device Oct 14 06:15:53 localhost journal[237477]: ethtool ioctl error on tap1df01de3-c1: No such device Oct 14 06:15:53 localhost journal[237477]: ethtool ioctl error on tap1df01de3-c1: No such device Oct 14 06:15:53 localhost journal[237477]: ethtool ioctl error on tap1df01de3-c1: No such device Oct 14 06:15:53 localhost journal[237477]: ethtool ioctl error on tap1df01de3-c1: No such device Oct 14 06:15:53 localhost journal[237477]: ethtool ioctl error on tap1df01de3-c1: No such device Oct 14 06:15:53 localhost journal[237477]: ethtool ioctl error on tap1df01de3-c1: No such device Oct 14 
06:15:53 localhost nova_compute[297686]: 2025-10-14 10:15:53.451 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:15:53 localhost nova_compute[297686]: 2025-10-14 10:15:53.480 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:15:53 localhost nova_compute[297686]: 2025-10-14 10:15:53.503 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:15:53 localhost neutron_sriov_agent[264974]: 2025-10-14 10:15:53.573 2 INFO neutron.agent.securitygroups_rpc [None req-498638c2-b9d1-4b78-9943-dc4bba565a59 da88dc55c7044cbba38f975c7e0b048b ad642aabc86d4ac1b3d38b6fe087eb44 - - default default] Security group rule updated ['d81eaca5-41d5-465a-ae37-475fd17fd0b7']#033[00m Oct 14 06:15:53 localhost neutron_sriov_agent[264974]: 2025-10-14 10:15:53.860 2 INFO neutron.agent.securitygroups_rpc [None req-72716ee2-7b94-43d2-a8a0-d7da6ac1202b da88dc55c7044cbba38f975c7e0b048b ad642aabc86d4ac1b3d38b6fe087eb44 - - default default] Security group rule updated ['d81eaca5-41d5-465a-ae37-475fd17fd0b7']#033[00m Oct 14 06:15:54 localhost neutron_sriov_agent[264974]: 2025-10-14 10:15:54.344 2 INFO neutron.agent.securitygroups_rpc [None req-2da02acd-ddd6-4284-9158-a3cd85597c4e da88dc55c7044cbba38f975c7e0b048b ad642aabc86d4ac1b3d38b6fe087eb44 - - default default] Security group rule updated ['d81eaca5-41d5-465a-ae37-475fd17fd0b7']#033[00m Oct 14 06:15:54 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e133 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Oct 14 06:15:54 localhost podman[330743]: Oct 14 06:15:54 localhost podman[330743]: 2025-10-14 10:15:54.440161189 +0000 UTC m=+0.092716262 container create 
89dfa1199fae71aa63652c8a36dc3f105ce717d902c728e7fe46816e26b2992b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5d8fe93a-c65a-4669-ba1e-66d52ee61c6a, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.build-date=20251009, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true) Oct 14 06:15:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041. Oct 14 06:15:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857. Oct 14 06:15:54 localhost systemd[1]: Started libpod-conmon-89dfa1199fae71aa63652c8a36dc3f105ce717d902c728e7fe46816e26b2992b.scope. Oct 14 06:15:54 localhost systemd[1]: Started libcrun container. 
Oct 14 06:15:54 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0e7c7447734ddb5f9fd6704c31b2990fd25a9066ea4c8faf16955ae90df9fd8b/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Oct 14 06:15:54 localhost podman[330743]: 2025-10-14 10:15:54.400437909 +0000 UTC m=+0.052992992 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Oct 14 06:15:54 localhost podman[330743]: 2025-10-14 10:15:54.505014188 +0000 UTC m=+0.157569271 container init 89dfa1199fae71aa63652c8a36dc3f105ce717d902c728e7fe46816e26b2992b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5d8fe93a-c65a-4669-ba1e-66d52ee61c6a, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2) Oct 14 06:15:54 localhost podman[330743]: 2025-10-14 10:15:54.516454292 +0000 UTC m=+0.169009365 container start 89dfa1199fae71aa63652c8a36dc3f105ce717d902c728e7fe46816e26b2992b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5d8fe93a-c65a-4669-ba1e-66d52ee61c6a, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true) Oct 14 06:15:54 localhost dnsmasq[330782]: started, version 2.85 cachesize 150 Oct 14 06:15:54 localhost dnsmasq[330782]: DNS service limited to local subnets Oct 14 06:15:54 localhost dnsmasq[330782]: compile time options: IPv6 GNU-getopt DBus no-UBus 
no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Oct 14 06:15:54 localhost dnsmasq[330782]: warning: no upstream servers configured Oct 14 06:15:54 localhost dnsmasq-dhcp[330782]: DHCP, static leases only on 10.100.0.0, lease time 1d Oct 14 06:15:54 localhost dnsmasq[330782]: read /var/lib/neutron/dhcp/5d8fe93a-c65a-4669-ba1e-66d52ee61c6a/addn_hosts - 0 addresses Oct 14 06:15:54 localhost dnsmasq-dhcp[330782]: read /var/lib/neutron/dhcp/5d8fe93a-c65a-4669-ba1e-66d52ee61c6a/host Oct 14 06:15:54 localhost dnsmasq-dhcp[330782]: read /var/lib/neutron/dhcp/5d8fe93a-c65a-4669-ba1e-66d52ee61c6a/opts Oct 14 06:15:54 localhost podman[330758]: 2025-10-14 10:15:54.559796154 +0000 UTC m=+0.082015601 container health_status 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20251009, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team) Oct 14 06:15:54 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:15:54.568 271987 INFO neutron.agent.dhcp.agent [None req-d2d77d39-735f-42dd-832b-0cd0e46f8a5e - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-10-14T10:15:53Z, description=, device_id=e051df2a-6c99-40d0-bcd5-cf988a4b8298, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=67ff3608-5241-432b-888f-4b5d5f55b2de, ip_allocation=immediate, mac_address=fa:16:3e:ce:78:c3, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-10-14T10:15:49Z, description=, dns_domain=, id=5d8fe93a-c65a-4669-ba1e-66d52ee61c6a, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersTest-1825085830, port_security_enabled=True, project_id=7bf1be3a6a454996a4414fad306906f1, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=21621, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1859, status=ACTIVE, subnets=['b4298a7c-7bfc-4cf2-9139-a25be14cde8e'], tags=[], tenant_id=7bf1be3a6a454996a4414fad306906f1, 
updated_at=2025-10-14T10:15:50Z, vlan_transparent=None, network_id=5d8fe93a-c65a-4669-ba1e-66d52ee61c6a, port_security_enabled=False, project_id=7bf1be3a6a454996a4414fad306906f1, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1894, status=DOWN, tags=[], tenant_id=7bf1be3a6a454996a4414fad306906f1, updated_at=2025-10-14T10:15:53Z on network 5d8fe93a-c65a-4669-ba1e-66d52ee61c6a#033[00m Oct 14 06:15:54 localhost podman[330758]: 2025-10-14 10:15:54.589804874 +0000 UTC m=+0.112024331 container exec_died 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20251009, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2) Oct 14 06:15:54 localhost systemd[1]: 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857.service: Deactivated successfully. Oct 14 06:15:54 localhost podman[330756]: 2025-10-14 10:15:54.607285225 +0000 UTC m=+0.132495964 container health_status 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter) Oct 14 06:15:54 localhost podman[330756]: 2025-10-14 10:15:54.621104333 +0000 UTC m=+0.146315042 container exec_died 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, 
config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Oct 14 06:15:54 localhost systemd[1]: 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041.service: Deactivated successfully. Oct 14 06:15:54 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:15:54.695 271987 INFO neutron.agent.dhcp.agent [None req-1e81693f-a512-4a69-987a-a4e5e1d1ea95 - - - - - -] DHCP configuration for ports {'06441965-bc61-4b4e-b2e2-01d52f9229e8'} is completed#033[00m Oct 14 06:15:54 localhost dnsmasq[330782]: read /var/lib/neutron/dhcp/5d8fe93a-c65a-4669-ba1e-66d52ee61c6a/addn_hosts - 1 addresses Oct 14 06:15:54 localhost dnsmasq-dhcp[330782]: read /var/lib/neutron/dhcp/5d8fe93a-c65a-4669-ba1e-66d52ee61c6a/host Oct 14 06:15:54 localhost podman[330820]: 2025-10-14 10:15:54.809432807 +0000 UTC m=+0.066895683 container kill 89dfa1199fae71aa63652c8a36dc3f105ce717d902c728e7fe46816e26b2992b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5d8fe93a-c65a-4669-ba1e-66d52ee61c6a, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, 
org.label-schema.schema-version=1.0) Oct 14 06:15:54 localhost dnsmasq-dhcp[330782]: read /var/lib/neutron/dhcp/5d8fe93a-c65a-4669-ba1e-66d52ee61c6a/opts Oct 14 06:15:55 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:15:55.245 271987 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-10-14T10:15:53Z, description=, device_id=e051df2a-6c99-40d0-bcd5-cf988a4b8298, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=67ff3608-5241-432b-888f-4b5d5f55b2de, ip_allocation=immediate, mac_address=fa:16:3e:ce:78:c3, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-10-14T10:15:49Z, description=, dns_domain=, id=5d8fe93a-c65a-4669-ba1e-66d52ee61c6a, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersTest-1825085830, port_security_enabled=True, project_id=7bf1be3a6a454996a4414fad306906f1, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=21621, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1859, status=ACTIVE, subnets=['b4298a7c-7bfc-4cf2-9139-a25be14cde8e'], tags=[], tenant_id=7bf1be3a6a454996a4414fad306906f1, updated_at=2025-10-14T10:15:50Z, vlan_transparent=None, network_id=5d8fe93a-c65a-4669-ba1e-66d52ee61c6a, port_security_enabled=False, project_id=7bf1be3a6a454996a4414fad306906f1, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1894, status=DOWN, tags=[], tenant_id=7bf1be3a6a454996a4414fad306906f1, updated_at=2025-10-14T10:15:53Z on network 5d8fe93a-c65a-4669-ba1e-66d52ee61c6a#033[00m Oct 14 06:15:55 localhost nova_compute[297686]: 2025-10-14 10:15:55.254 
2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:15:55 localhost neutron_sriov_agent[264974]: 2025-10-14 10:15:55.332 2 INFO neutron.agent.securitygroups_rpc [None req-d60dd72d-6d8e-4c14-abe5-931d143b618a da88dc55c7044cbba38f975c7e0b048b ad642aabc86d4ac1b3d38b6fe087eb44 - - default default] Security group rule updated ['d81eaca5-41d5-465a-ae37-475fd17fd0b7']#033[00m Oct 14 06:15:55 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:15:55.357 271987 INFO neutron.agent.dhcp.agent [None req-c7758190-b730-4051-a499-f24f6693ced8 - - - - - -] DHCP configuration for ports {'67ff3608-5241-432b-888f-4b5d5f55b2de'} is completed#033[00m Oct 14 06:15:55 localhost dnsmasq[330782]: read /var/lib/neutron/dhcp/5d8fe93a-c65a-4669-ba1e-66d52ee61c6a/addn_hosts - 1 addresses Oct 14 06:15:55 localhost dnsmasq-dhcp[330782]: read /var/lib/neutron/dhcp/5d8fe93a-c65a-4669-ba1e-66d52ee61c6a/host Oct 14 06:15:55 localhost dnsmasq-dhcp[330782]: read /var/lib/neutron/dhcp/5d8fe93a-c65a-4669-ba1e-66d52ee61c6a/opts Oct 14 06:15:55 localhost podman[330859]: 2025-10-14 10:15:55.535804344 +0000 UTC m=+0.062899929 container kill 89dfa1199fae71aa63652c8a36dc3f105ce717d902c728e7fe46816e26b2992b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5d8fe93a-c65a-4669-ba1e-66d52ee61c6a, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251009) Oct 14 06:15:55 localhost nova_compute[297686]: 2025-10-14 10:15:55.541 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:15:55 localhost 
neutron_sriov_agent[264974]: 2025-10-14 10:15:55.546 2 INFO neutron.agent.securitygroups_rpc [None req-eeb0ba3d-bd8e-48e3-811c-a42098bbbefd 30647d4700b846dba79efd27fad03f3d a840994a70374548889747682f4c0fa3 - - default default] Security group member updated ['59283390-a499-4358-9f49-155fd8075ea9']#033[00m Oct 14 06:15:55 localhost neutron_sriov_agent[264974]: 2025-10-14 10:15:55.711 2 INFO neutron.agent.securitygroups_rpc [None req-2e7436e6-7cca-4505-95cf-dd9c675a1826 da88dc55c7044cbba38f975c7e0b048b ad642aabc86d4ac1b3d38b6fe087eb44 - - default default] Security group rule updated ['d81eaca5-41d5-465a-ae37-475fd17fd0b7']#033[00m Oct 14 06:15:55 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:15:55.825 271987 INFO neutron.agent.dhcp.agent [None req-787657ee-c079-4650-a3b9-8e1bd383a497 - - - - - -] DHCP configuration for ports {'67ff3608-5241-432b-888f-4b5d5f55b2de'} is completed#033[00m Oct 14 06:15:56 localhost neutron_sriov_agent[264974]: 2025-10-14 10:15:56.192 2 INFO neutron.agent.securitygroups_rpc [None req-ad06b835-86f2-46d5-a890-f849c23f39ec da88dc55c7044cbba38f975c7e0b048b ad642aabc86d4ac1b3d38b6fe087eb44 - - default default] Security group rule updated ['d81eaca5-41d5-465a-ae37-475fd17fd0b7']#033[00m Oct 14 06:15:56 localhost neutron_sriov_agent[264974]: 2025-10-14 10:15:56.539 2 INFO neutron.agent.securitygroups_rpc [None req-9a0507c0-d895-4954-b467-687beeb58a32 da88dc55c7044cbba38f975c7e0b048b ad642aabc86d4ac1b3d38b6fe087eb44 - - default default] Security group rule updated ['d81eaca5-41d5-465a-ae37-475fd17fd0b7']#033[00m Oct 14 06:15:56 localhost ovn_metadata_agent[163050]: 2025-10-14 10:15:56.749 163055 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:63:b4:89 2001:db8::f816:3eff:fe63:b489'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], 
options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe63:b489/64', 'neutron:device_id': 'ovnmeta-74049e43-4aa7-4318-9233-a58980c3495b', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-74049e43-4aa7-4318-9233-a58980c3495b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '82fc7afce38344ffb7eda3bb0fabdb5b', 'neutron:revision_number': '6', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0f1f1366-6307-4914-922e-2b4f9757811b, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=bb90059a-750e-43da-ba16-03b3dce8c155) old=Port_Binding(mac=['fa:16:3e:63:b4:89 10.100.0.2 2001:db8::f816:3eff:fe63:b489'], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fe63:b489/64', 'neutron:device_id': 'ovnmeta-74049e43-4aa7-4318-9233-a58980c3495b', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-74049e43-4aa7-4318-9233-a58980c3495b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '82fc7afce38344ffb7eda3bb0fabdb5b', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Oct 14 06:15:56 localhost ovn_metadata_agent[163050]: 2025-10-14 10:15:56.751 163055 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port bb90059a-750e-43da-ba16-03b3dce8c155 in datapath 74049e43-4aa7-4318-9233-a58980c3495b updated#033[00m Oct 14 06:15:56 localhost ovn_metadata_agent[163050]: 2025-10-14 10:15:56.755 163055 
DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 74049e43-4aa7-4318-9233-a58980c3495b, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Oct 14 06:15:56 localhost ovn_metadata_agent[163050]: 2025-10-14 10:15:56.756 163159 DEBUG oslo.privsep.daemon [-] privsep: reply[42b7dbfe-edaf-4e1d-9357-568c89704974]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Oct 14 06:15:56 localhost neutron_sriov_agent[264974]: 2025-10-14 10:15:56.965 2 INFO neutron.agent.securitygroups_rpc [None req-03cd8842-580c-49e4-9ac0-37b2fed98c6e da88dc55c7044cbba38f975c7e0b048b ad642aabc86d4ac1b3d38b6fe087eb44 - - default default] Security group rule updated ['d81eaca5-41d5-465a-ae37-475fd17fd0b7']#033[00m Oct 14 06:15:57 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:15:57.136 271987 INFO neutron.agent.dhcp.agent [None req-c0f5770e-474a-4256-8cc7-3305d6f80f30 - - - - - -] Synchronizing state#033[00m Oct 14 06:15:57 localhost neutron_sriov_agent[264974]: 2025-10-14 10:15:57.261 2 INFO neutron.agent.securitygroups_rpc [None req-f8d4a5b8-3724-4c86-9983-a6aa33844a19 da88dc55c7044cbba38f975c7e0b048b ad642aabc86d4ac1b3d38b6fe087eb44 - - default default] Security group rule updated ['d81eaca5-41d5-465a-ae37-475fd17fd0b7']#033[00m Oct 14 06:15:57 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:15:57.322 271987 INFO neutron.agent.dhcp.agent [None req-743c6502-1246-4b94-9d82-8486207e486c - - - - - -] All active networks have been fetched through RPC.#033[00m Oct 14 06:15:57 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:15:57.323 271987 INFO neutron.agent.dhcp.agent [-] Starting network ec49d466-3f9d-443d-a4d1-90a54d0e5427 dhcp configuration#033[00m Oct 14 06:15:57 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:15:57.323 271987 INFO neutron.agent.dhcp.agent [-] Finished network ec49d466-3f9d-443d-a4d1-90a54d0e5427 dhcp 
configuration#033[00m Oct 14 06:15:57 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:15:57.324 271987 INFO neutron.agent.dhcp.agent [None req-743c6502-1246-4b94-9d82-8486207e486c - - - - - -] Synchronizing state complete#033[00m Oct 14 06:15:57 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:15:57.324 271987 INFO neutron.agent.dhcp.agent [None req-afab1e7a-4ff0-4382-a3b5-3b5ca305ee25 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Oct 14 06:15:57 localhost ovn_metadata_agent[163050]: 2025-10-14 10:15:57.784 163055 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Oct 14 06:15:57 localhost ovn_metadata_agent[163050]: 2025-10-14 10:15:57.785 163055 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Oct 14 06:15:57 localhost ovn_metadata_agent[163050]: 2025-10-14 10:15:57.785 163055 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Oct 14 06:15:57 localhost neutron_sriov_agent[264974]: 2025-10-14 10:15:57.917 2 INFO neutron.agent.securitygroups_rpc [None req-19a13449-e088-4f71-a3bc-b16d734311de da88dc55c7044cbba38f975c7e0b048b ad642aabc86d4ac1b3d38b6fe087eb44 - - default default] Security group rule updated ['bd99b9ee-6283-4002-9bd9-0f280baab2b9']#033[00m Oct 14 06:15:58 localhost podman[248187]: time="2025-10-14T10:15:58Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Oct 14 06:15:58 localhost 
podman[248187]: @ - - [14/Oct/2025:10:15:58 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 149321 "" "Go-http-client/1.1" Oct 14 06:15:58 localhost podman[248187]: @ - - [14/Oct/2025:10:15:58 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 20330 "" "Go-http-client/1.1" Oct 14 06:15:58 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:15:58.436 271987 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Oct 14 06:15:58 localhost nova_compute[297686]: 2025-10-14 10:15:58.506 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:15:58 localhost ovn_metadata_agent[163050]: 2025-10-14 10:15:58.586 163055 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:63:b4:89 10.100.0.2 2001:db8::f816:3eff:fe63:b489'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fe63:b489/64', 'neutron:device_id': 'ovnmeta-74049e43-4aa7-4318-9233-a58980c3495b', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-74049e43-4aa7-4318-9233-a58980c3495b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '82fc7afce38344ffb7eda3bb0fabdb5b', 'neutron:revision_number': '7', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], 
datapath=0f1f1366-6307-4914-922e-2b4f9757811b, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=bb90059a-750e-43da-ba16-03b3dce8c155) old=Port_Binding(mac=['fa:16:3e:63:b4:89 2001:db8::f816:3eff:fe63:b489'], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe63:b489/64', 'neutron:device_id': 'ovnmeta-74049e43-4aa7-4318-9233-a58980c3495b', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-74049e43-4aa7-4318-9233-a58980c3495b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '82fc7afce38344ffb7eda3bb0fabdb5b', 'neutron:revision_number': '6', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Oct 14 06:15:58 localhost ovn_metadata_agent[163050]: 2025-10-14 10:15:58.587 163055 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port bb90059a-750e-43da-ba16-03b3dce8c155 in datapath 74049e43-4aa7-4318-9233-a58980c3495b updated#033[00m Oct 14 06:15:58 localhost ovn_metadata_agent[163050]: 2025-10-14 10:15:58.589 163055 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 74049e43-4aa7-4318-9233-a58980c3495b, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Oct 14 06:15:58 localhost ovn_metadata_agent[163050]: 2025-10-14 10:15:58.590 163159 DEBUG oslo.privsep.daemon [-] privsep: reply[1873008e-1c9f-43e9-a819-58f5dbdb169a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Oct 14 06:15:59 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e133 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Oct 14 06:15:59 localhost neutron_sriov_agent[264974]: 2025-10-14 
10:15:59.462 2 INFO neutron.agent.securitygroups_rpc [None req-748b12f7-75b2-42a7-bed1-ec166bf1086f bcbb7ceb87a845dd957d390724b3aa7b 260dac1713714ac8bb2b6f2a6df5daab - - default default] Security group member updated ['04031ec2-60f0-4ddf-a977-de00155ea50e']#033[00m Oct 14 06:15:59 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:15:59.517 271987 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-10-14T10:15:59Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=48b44cac-a3db-4d8f-ad37-dffcfdeacb5e, ip_allocation=immediate, mac_address=fa:16:3e:44:b9:72, name=tempest-RoutersAdminNegativeIpV6Test-1702458018, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-10-14T08:36:38Z, description=, dns_domain=, id=c0145816-4627-44f2-af00-ccc9ef0436ed, ipv4_address_scope=None, ipv6_address_scope=None, is_default=False, l2_adjacency=True, mtu=1350, name=public, port_security_enabled=True, project_id=41187b090f3d4818a32baa37ce8a3991, provider:network_type=flat, provider:physical_network=datacentre, provider:segmentation_id=None, qos_policy_id=None, revision_number=2, router:external=True, shared=False, standard_attr_id=29, status=ACTIVE, subnets=['bbd40dc1-97d8-4ed3-9d4f-9b3af758b526'], tags=[], tenant_id=41187b090f3d4818a32baa37ce8a3991, updated_at=2025-10-14T08:36:44Z, vlan_transparent=None, network_id=c0145816-4627-44f2-af00-ccc9ef0436ed, port_security_enabled=True, project_id=260dac1713714ac8bb2b6f2a6df5daab, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['04031ec2-60f0-4ddf-a977-de00155ea50e'], standard_attr_id=1945, status=DOWN, tags=[], tenant_id=260dac1713714ac8bb2b6f2a6df5daab, 
updated_at=2025-10-14T10:15:59Z on network c0145816-4627-44f2-af00-ccc9ef0436ed#033[00m Oct 14 06:15:59 localhost podman[330898]: 2025-10-14 10:15:59.698481834 +0000 UTC m=+0.038221265 container kill 27e23fba1f10c3118d508631d350793857acd97085c399f85eed727cd1eab7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0145816-4627-44f2-af00-ccc9ef0436ed, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, io.buildah.version=1.41.3, tcib_managed=true) Oct 14 06:15:59 localhost dnsmasq[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/addn_hosts - 2 addresses Oct 14 06:15:59 localhost dnsmasq-dhcp[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/host Oct 14 06:15:59 localhost dnsmasq-dhcp[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/opts Oct 14 06:15:59 localhost neutron_sriov_agent[264974]: 2025-10-14 10:15:59.733 2 INFO neutron.agent.securitygroups_rpc [None req-a482116e-ce6a-49ae-9cd1-b4bde06df35b 73c3910059834cd0998d3459c50cd69d 82fc7afce38344ffb7eda3bb0fabdb5b - - default default] Security group member updated ['10f25aec-a6f2-40dd-837d-8812e1c0ebb8']#033[00m Oct 14 06:15:59 localhost neutron_sriov_agent[264974]: 2025-10-14 10:15:59.941 2 INFO neutron.agent.securitygroups_rpc [None req-9f851c30-a877-40be-a2bf-a08637c72ba4 da88dc55c7044cbba38f975c7e0b048b ad642aabc86d4ac1b3d38b6fe087eb44 - - default default] Security group rule updated ['647ac1cf-251c-49bd-bd44-f4aca2680cd7']#033[00m Oct 14 06:16:00 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:16:00.002 271987 INFO neutron.agent.dhcp.agent [None req-5841f161-97bb-4827-ae17-450ae1ed5386 - - - - - -] DHCP configuration for ports 
{'48b44cac-a3db-4d8f-ad37-dffcfdeacb5e'} is completed#033[00m Oct 14 06:16:00 localhost neutron_sriov_agent[264974]: 2025-10-14 10:16:00.160 2 INFO neutron.agent.securitygroups_rpc [None req-7f0d63ad-e0c7-474e-b522-3c3bf7ffc024 1bd6c282bd5f479c9ccbe1c6315d2b30 144ffc90564548b79f70d01b768b605c - - default default] Security group member updated ['e1fabd25-5362-4883-952c-8d61e716234f']#033[00m Oct 14 06:16:00 localhost neutron_sriov_agent[264974]: 2025-10-14 10:16:00.165 2 INFO neutron.agent.securitygroups_rpc [None req-796fb526-3ce5-4f0a-baa4-d73f37bca3e3 da88dc55c7044cbba38f975c7e0b048b ad642aabc86d4ac1b3d38b6fe087eb44 - - default default] Security group rule updated ['647ac1cf-251c-49bd-bd44-f4aca2680cd7']#033[00m Oct 14 06:16:00 localhost nova_compute[297686]: 2025-10-14 10:16:00.257 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:16:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad. Oct 14 06:16:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec. Oct 14 06:16:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29. 
Oct 14 06:16:00 localhost podman[330921]: 2025-10-14 10:16:00.734318748 +0000 UTC m=+0.069980459 container health_status fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, org.label-schema.build-date=20251009, managed_by=edpm_ansible, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS) Oct 14 06:16:00 localhost podman[330921]: 2025-10-14 10:16:00.766908546 +0000 UTC m=+0.102570267 container exec_died fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 
(image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, container_name=iscsid, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, config_id=iscsid) Oct 14 06:16:00 localhost neutron_sriov_agent[264974]: 2025-10-14 10:16:00.772 2 INFO neutron.agent.securitygroups_rpc [None req-db8e9931-3790-4c5a-8b7e-409a585f1b0b 1bd6c282bd5f479c9ccbe1c6315d2b30 144ffc90564548b79f70d01b768b605c - - default default] Security group member updated ['e1fabd25-5362-4883-952c-8d61e716234f']#033[00m Oct 14 06:16:00 localhost podman[330919]: 2025-10-14 10:16:00.722387907 +0000 UTC m=+0.066939804 container health_status 
02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, container_name=multipathd) Oct 14 06:16:00 localhost systemd[1]: tmp-crun.GjQMk2.mount: Deactivated successfully. 
Oct 14 06:16:00 localhost podman[330920]: 2025-10-14 10:16:00.783624784 +0000 UTC m=+0.125079825 container health_status aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm) Oct 14 06:16:00 localhost systemd[1]: fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29.service: Deactivated successfully. 
Oct 14 06:16:00 localhost podman[330920]: 2025-10-14 10:16:00.787782774 +0000 UTC m=+0.129237755 container exec_died aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Oct 14 06:16:00 localhost podman[330919]: 2025-10-14 10:16:00.803416417 +0000 UTC m=+0.147968254 container exec_died 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, 
config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009) Oct 14 06:16:00 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:16:00.808 271987 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Oct 14 06:16:00 localhost systemd[1]: aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec.service: Deactivated successfully. Oct 14 06:16:00 localhost systemd[1]: 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad.service: Deactivated successfully. 
Oct 14 06:16:02 localhost neutron_sriov_agent[264974]: 2025-10-14 10:16:02.021 2 INFO neutron.agent.securitygroups_rpc [None req-aad7157c-9e35-462e-8976-eab83204c7ad 73c3910059834cd0998d3459c50cd69d 82fc7afce38344ffb7eda3bb0fabdb5b - - default default] Security group member updated ['10f25aec-a6f2-40dd-837d-8812e1c0ebb8']#033[00m Oct 14 06:16:02 localhost nova_compute[297686]: 2025-10-14 10:16:02.102 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:16:02 localhost neutron_sriov_agent[264974]: 2025-10-14 10:16:02.111 2 INFO neutron.agent.securitygroups_rpc [None req-5266bc6a-e668-45a5-ad9f-f03190787ee1 bcbb7ceb87a845dd957d390724b3aa7b 260dac1713714ac8bb2b6f2a6df5daab - - default default] Security group member updated ['04031ec2-60f0-4ddf-a977-de00155ea50e']#033[00m Oct 14 06:16:02 localhost neutron_sriov_agent[264974]: 2025-10-14 10:16:02.267 2 INFO neutron.agent.securitygroups_rpc [None req-eba3964c-6077-4898-9ae9-96c06b6bd458 da88dc55c7044cbba38f975c7e0b048b ad642aabc86d4ac1b3d38b6fe087eb44 - - default default] Security group rule updated ['411e7128-6eb3-4bfe-814c-1d1cb5173c3b']#033[00m Oct 14 06:16:02 localhost dnsmasq[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/addn_hosts - 1 addresses Oct 14 06:16:02 localhost dnsmasq-dhcp[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/host Oct 14 06:16:02 localhost dnsmasq-dhcp[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/opts Oct 14 06:16:02 localhost podman[330996]: 2025-10-14 10:16:02.325860612 +0000 UTC m=+0.046891734 container kill 27e23fba1f10c3118d508631d350793857acd97085c399f85eed727cd1eab7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0145816-4627-44f2-af00-ccc9ef0436ed, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, 
tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, io.buildah.version=1.41.3) Oct 14 06:16:02 localhost neutron_sriov_agent[264974]: 2025-10-14 10:16:02.800 2 INFO neutron.agent.securitygroups_rpc [None req-c6d26a90-39b6-4ac3-8245-cf5b676729dc da88dc55c7044cbba38f975c7e0b048b ad642aabc86d4ac1b3d38b6fe087eb44 - - default default] Security group rule updated ['411e7128-6eb3-4bfe-814c-1d1cb5173c3b']#033[00m Oct 14 06:16:03 localhost nova_compute[297686]: 2025-10-14 10:16:03.510 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:16:03 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Oct 14 06:16:03 localhost ceph-mon[317114]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/4283205683' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Oct 14 06:16:03 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Oct 14 06:16:03 localhost ceph-mon[317114]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.32:0/4283205683' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Oct 14 06:16:04 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e133 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Oct 14 06:16:04 localhost neutron_sriov_agent[264974]: 2025-10-14 10:16:04.551 2 INFO neutron.agent.securitygroups_rpc [None req-a9ac4c2b-3396-40cc-a43d-86ae8a282899 da88dc55c7044cbba38f975c7e0b048b ad642aabc86d4ac1b3d38b6fe087eb44 - - default default] Security group rule updated ['357eb12a-bd5c-457e-b498-fb7d07e886ba']#033[00m Oct 14 06:16:05 localhost neutron_sriov_agent[264974]: 2025-10-14 10:16:05.032 2 INFO neutron.agent.securitygroups_rpc [None req-3c1ae6ba-b890-432a-83bd-25fd553cee0e da88dc55c7044cbba38f975c7e0b048b ad642aabc86d4ac1b3d38b6fe087eb44 - - default default] Security group rule updated ['357eb12a-bd5c-457e-b498-fb7d07e886ba']#033[00m Oct 14 06:16:05 localhost nova_compute[297686]: 2025-10-14 10:16:05.260 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:16:05 localhost neutron_sriov_agent[264974]: 2025-10-14 10:16:05.600 2 INFO neutron.agent.securitygroups_rpc [None req-f190560d-7f11-4e1c-a5d5-63828c94af63 da88dc55c7044cbba38f975c7e0b048b ad642aabc86d4ac1b3d38b6fe087eb44 - - default default] Security group rule updated ['357eb12a-bd5c-457e-b498-fb7d07e886ba']#033[00m Oct 14 06:16:05 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:16:05.616 271987 INFO neutron.agent.linux.ip_lib [None req-e1a55453-8563-4429-977c-19e164ac76e0 - - - - - -] Device tap7a9cfa63-0b cannot be used as it has no MAC address#033[00m Oct 14 06:16:05 localhost nova_compute[297686]: 2025-10-14 10:16:05.637 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:16:05 localhost 
kernel: device tap7a9cfa63-0b entered promiscuous mode Oct 14 06:16:05 localhost ovn_controller[157396]: 2025-10-14T10:16:05Z|00169|binding|INFO|Claiming lport 7a9cfa63-0bc5-4880-b1be-364e29d25876 for this chassis. Oct 14 06:16:05 localhost NetworkManager[5977]: [1760436965.6452] manager: (tap7a9cfa63-0b): new Generic device (/org/freedesktop/NetworkManager/Devices/32) Oct 14 06:16:05 localhost ovn_controller[157396]: 2025-10-14T10:16:05Z|00170|binding|INFO|7a9cfa63-0bc5-4880-b1be-364e29d25876: Claiming unknown Oct 14 06:16:05 localhost nova_compute[297686]: 2025-10-14 10:16:05.646 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:16:05 localhost systemd-udevd[331027]: Network interface NamePolicy= disabled on kernel command line. Oct 14 06:16:05 localhost ovn_metadata_agent[163050]: 2025-10-14 10:16:05.656 163055 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005486733.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.102.0.2/28', 'neutron:device_id': 'dhcpe1893814-19d7-5b97-af05-19e2aafa5382-61834488-97da-47c3-8374-b9fea3c2b7e5', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-61834488-97da-47c3-8374-b9fea3c2b7e5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7bf1be3a6a454996a4414fad306906f1', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], 
datapath=6b1ab247-44e5-4002-a4db-e97bbb68024f, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=7a9cfa63-0bc5-4880-b1be-364e29d25876) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Oct 14 06:16:05 localhost ovn_metadata_agent[163050]: 2025-10-14 10:16:05.657 163055 INFO neutron.agent.ovn.metadata.agent [-] Port 7a9cfa63-0bc5-4880-b1be-364e29d25876 in datapath 61834488-97da-47c3-8374-b9fea3c2b7e5 bound to our chassis#033[00m Oct 14 06:16:05 localhost ovn_metadata_agent[163050]: 2025-10-14 10:16:05.660 163055 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 61834488-97da-47c3-8374-b9fea3c2b7e5 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Oct 14 06:16:05 localhost ovn_controller[157396]: 2025-10-14T10:16:05Z|00171|binding|INFO|Setting lport 7a9cfa63-0bc5-4880-b1be-364e29d25876 ovn-installed in OVS Oct 14 06:16:05 localhost ovn_controller[157396]: 2025-10-14T10:16:05Z|00172|binding|INFO|Setting lport 7a9cfa63-0bc5-4880-b1be-364e29d25876 up in Southbound Oct 14 06:16:05 localhost ovn_metadata_agent[163050]: 2025-10-14 10:16:05.660 163159 DEBUG oslo.privsep.daemon [-] privsep: reply[d81d952e-84cf-4b17-8a59-810259e3d344]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Oct 14 06:16:05 localhost nova_compute[297686]: 2025-10-14 10:16:05.663 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:16:05 localhost nova_compute[297686]: 2025-10-14 10:16:05.684 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:16:05 localhost ovn_metadata_agent[163050]: 2025-10-14 10:16:05.706 163055 DEBUG 
ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:63:b4:89 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-74049e43-4aa7-4318-9233-a58980c3495b', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-74049e43-4aa7-4318-9233-a58980c3495b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '82fc7afce38344ffb7eda3bb0fabdb5b', 'neutron:revision_number': '10', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0f1f1366-6307-4914-922e-2b4f9757811b, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=bb90059a-750e-43da-ba16-03b3dce8c155) old=Port_Binding(mac=['fa:16:3e:63:b4:89 10.100.0.2 2001:db8::f816:3eff:fe63:b489'], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fe63:b489/64', 'neutron:device_id': 'ovnmeta-74049e43-4aa7-4318-9233-a58980c3495b', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-74049e43-4aa7-4318-9233-a58980c3495b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '82fc7afce38344ffb7eda3bb0fabdb5b', 'neutron:revision_number': '7', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Oct 14 06:16:05 localhost 
ovn_metadata_agent[163050]: 2025-10-14 10:16:05.707 163055 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port bb90059a-750e-43da-ba16-03b3dce8c155 in datapath 74049e43-4aa7-4318-9233-a58980c3495b updated#033[00m Oct 14 06:16:05 localhost ovn_metadata_agent[163050]: 2025-10-14 10:16:05.710 163055 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 74049e43-4aa7-4318-9233-a58980c3495b, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Oct 14 06:16:05 localhost ovn_metadata_agent[163050]: 2025-10-14 10:16:05.711 163159 DEBUG oslo.privsep.daemon [-] privsep: reply[c24f9a23-be56-4b33-8071-22ccc1b20a40]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Oct 14 06:16:05 localhost nova_compute[297686]: 2025-10-14 10:16:05.727 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:16:05 localhost nova_compute[297686]: 2025-10-14 10:16:05.750 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:16:06 localhost neutron_sriov_agent[264974]: 2025-10-14 10:16:06.161 2 INFO neutron.agent.securitygroups_rpc [None req-64203ab7-c72d-4d68-9ffa-dd58b94ea77b da88dc55c7044cbba38f975c7e0b048b ad642aabc86d4ac1b3d38b6fe087eb44 - - default default] Security group rule updated ['357eb12a-bd5c-457e-b498-fb7d07e886ba']#033[00m Oct 14 06:16:06 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:16:06.454 271987 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Oct 14 06:16:06 localhost podman[331083]: Oct 14 06:16:06 localhost podman[331083]: 2025-10-14 10:16:06.557126925 +0000 UTC m=+0.090541875 container create b4f79c6c0696cadc08af71d02cf1da3031c5b335900e291967d1a4f682c13113 
(image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-61834488-97da-47c3-8374-b9fea3c2b7e5, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251009, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5) Oct 14 06:16:06 localhost systemd[1]: Started libpod-conmon-b4f79c6c0696cadc08af71d02cf1da3031c5b335900e291967d1a4f682c13113.scope. Oct 14 06:16:06 localhost podman[331083]: 2025-10-14 10:16:06.510861962 +0000 UTC m=+0.044276942 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Oct 14 06:16:06 localhost systemd[1]: Started libcrun container. Oct 14 06:16:06 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0b5b7fe8b1bc557af7c5bfe195bb44ff3a8bacccf93d39f1e928d229dad49f0d/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Oct 14 06:16:06 localhost podman[331083]: 2025-10-14 10:16:06.627817524 +0000 UTC m=+0.161232424 container init b4f79c6c0696cadc08af71d02cf1da3031c5b335900e291967d1a4f682c13113 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-61834488-97da-47c3-8374-b9fea3c2b7e5, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251009, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3) Oct 14 06:16:06 localhost podman[331083]: 2025-10-14 10:16:06.637401632 +0000 UTC m=+0.170816572 container start b4f79c6c0696cadc08af71d02cf1da3031c5b335900e291967d1a4f682c13113 
(image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-61834488-97da-47c3-8374-b9fea3c2b7e5, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team) Oct 14 06:16:06 localhost neutron_sriov_agent[264974]: 2025-10-14 10:16:06.637 2 INFO neutron.agent.securitygroups_rpc [None req-e7d6730b-f256-42fe-acb2-b345ff83af61 da88dc55c7044cbba38f975c7e0b048b ad642aabc86d4ac1b3d38b6fe087eb44 - - default default] Security group rule updated ['357eb12a-bd5c-457e-b498-fb7d07e886ba']#033[00m Oct 14 06:16:06 localhost dnsmasq[331101]: started, version 2.85 cachesize 150 Oct 14 06:16:06 localhost dnsmasq[331101]: DNS service limited to local subnets Oct 14 06:16:06 localhost dnsmasq[331101]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Oct 14 06:16:06 localhost dnsmasq[331101]: warning: no upstream servers configured Oct 14 06:16:06 localhost dnsmasq-dhcp[331101]: DHCP, static leases only on 10.102.0.0, lease time 1d Oct 14 06:16:06 localhost dnsmasq[331101]: read /var/lib/neutron/dhcp/61834488-97da-47c3-8374-b9fea3c2b7e5/addn_hosts - 0 addresses Oct 14 06:16:06 localhost dnsmasq-dhcp[331101]: read /var/lib/neutron/dhcp/61834488-97da-47c3-8374-b9fea3c2b7e5/host Oct 14 06:16:06 localhost dnsmasq-dhcp[331101]: read /var/lib/neutron/dhcp/61834488-97da-47c3-8374-b9fea3c2b7e5/opts Oct 14 06:16:06 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:16:06.786 271987 INFO neutron.agent.dhcp.agent [None req-163867f9-af39-441b-9620-3405a11d255b - - - - - -] DHCP configuration for ports {'b58c3110-2daf-44b1-8b3a-860c46a16c7a'} is completed#033[00m Oct 14 
06:16:07 localhost neutron_sriov_agent[264974]: 2025-10-14 10:16:07.871 2 INFO neutron.agent.securitygroups_rpc [None req-16103675-3f14-40f2-b81a-2b89b2207f51 da88dc55c7044cbba38f975c7e0b048b ad642aabc86d4ac1b3d38b6fe087eb44 - - default default] Security group rule updated ['357eb12a-bd5c-457e-b498-fb7d07e886ba']#033[00m Oct 14 06:16:08 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:16:08.217 271987 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Oct 14 06:16:08 localhost nova_compute[297686]: 2025-10-14 10:16:08.561 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:16:08 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:16:08.638 271987 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-10-14T10:16:08Z, description=, device_id=e051df2a-6c99-40d0-bcd5-cf988a4b8298, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=7b4347b9-4f0e-4d11-a0a3-8b2674ff69b2, ip_allocation=immediate, mac_address=fa:16:3e:72:d9:ea, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-10-14T10:16:02Z, description=, dns_domain=, id=61834488-97da-47c3-8374-b9fea3c2b7e5, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersTest-594589185, port_security_enabled=True, project_id=7bf1be3a6a454996a4414fad306906f1, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=11756, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1961, status=ACTIVE, subnets=['ff1fa59e-0148-429f-946c-c78e0e8a255a'], tags=[], 
tenant_id=7bf1be3a6a454996a4414fad306906f1, updated_at=2025-10-14T10:16:04Z, vlan_transparent=None, network_id=61834488-97da-47c3-8374-b9fea3c2b7e5, port_security_enabled=False, project_id=7bf1be3a6a454996a4414fad306906f1, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2010, status=DOWN, tags=[], tenant_id=7bf1be3a6a454996a4414fad306906f1, updated_at=2025-10-14T10:16:08Z on network 61834488-97da-47c3-8374-b9fea3c2b7e5#033[00m Oct 14 06:16:08 localhost openstack_network_exporter[250374]: ERROR 10:16:08 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Oct 14 06:16:08 localhost openstack_network_exporter[250374]: ERROR 10:16:08 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Oct 14 06:16:08 localhost openstack_network_exporter[250374]: ERROR 10:16:08 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Oct 14 06:16:08 localhost openstack_network_exporter[250374]: ERROR 10:16:08 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Oct 14 06:16:08 localhost openstack_network_exporter[250374]: Oct 14 06:16:08 localhost openstack_network_exporter[250374]: ERROR 10:16:08 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Oct 14 06:16:08 localhost openstack_network_exporter[250374]: Oct 14 06:16:08 localhost dnsmasq[331101]: read /var/lib/neutron/dhcp/61834488-97da-47c3-8374-b9fea3c2b7e5/addn_hosts - 1 addresses Oct 14 06:16:08 localhost dnsmasq-dhcp[331101]: read /var/lib/neutron/dhcp/61834488-97da-47c3-8374-b9fea3c2b7e5/host Oct 14 06:16:08 localhost podman[331119]: 2025-10-14 10:16:08.849963541 +0000 UTC m=+0.044916713 container kill b4f79c6c0696cadc08af71d02cf1da3031c5b335900e291967d1a4f682c13113 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, 
name=neutron-dnsmasq-qdhcp-61834488-97da-47c3-8374-b9fea3c2b7e5, org.label-schema.license=GPLv2, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true) Oct 14 06:16:08 localhost dnsmasq-dhcp[331101]: read /var/lib/neutron/dhcp/61834488-97da-47c3-8374-b9fea3c2b7e5/opts Oct 14 06:16:09 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:16:09.078 271987 INFO neutron.agent.dhcp.agent [None req-c9575d95-cdfb-48bc-aeb3-4ed8808e97c6 - - - - - -] DHCP configuration for ports {'7b4347b9-4f0e-4d11-a0a3-8b2674ff69b2'} is completed#033[00m Oct 14 06:16:09 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e133 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Oct 14 06:16:09 localhost ovn_metadata_agent[163050]: 2025-10-14 10:16:09.576 163055 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:63:b4:89 10.100.0.2 2001:db8::f816:3eff:fe63:b489'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fe63:b489/64', 'neutron:device_id': 'ovnmeta-74049e43-4aa7-4318-9233-a58980c3495b', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-74049e43-4aa7-4318-9233-a58980c3495b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '82fc7afce38344ffb7eda3bb0fabdb5b', 'neutron:revision_number': '11', 'neutron:security_group_ids': '', 
'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0f1f1366-6307-4914-922e-2b4f9757811b, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=bb90059a-750e-43da-ba16-03b3dce8c155) old=Port_Binding(mac=['fa:16:3e:63:b4:89 10.100.0.2'], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-74049e43-4aa7-4318-9233-a58980c3495b', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-74049e43-4aa7-4318-9233-a58980c3495b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '82fc7afce38344ffb7eda3bb0fabdb5b', 'neutron:revision_number': '10', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Oct 14 06:16:09 localhost ovn_metadata_agent[163050]: 2025-10-14 10:16:09.577 163055 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port bb90059a-750e-43da-ba16-03b3dce8c155 in datapath 74049e43-4aa7-4318-9233-a58980c3495b updated#033[00m Oct 14 06:16:09 localhost ovn_metadata_agent[163050]: 2025-10-14 10:16:09.578 163055 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 74049e43-4aa7-4318-9233-a58980c3495b, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Oct 14 06:16:09 localhost ovn_metadata_agent[163050]: 2025-10-14 10:16:09.579 163159 DEBUG oslo.privsep.daemon [-] privsep: reply[981fb7e1-45c0-4ff6-966c-c1664662bc24]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Oct 14 06:16:09 localhost neutron_sriov_agent[264974]: 2025-10-14 10:16:09.814 2 INFO 
neutron.agent.securitygroups_rpc [None req-c1a66779-28ca-41ab-8a3a-9fc04bd41773 da88dc55c7044cbba38f975c7e0b048b ad642aabc86d4ac1b3d38b6fe087eb44 - - default default] Security group rule updated ['1b366e00-8855-4b43-9b4b-e7499389da43']#033[00m Oct 14 06:16:10 localhost nova_compute[297686]: 2025-10-14 10:16:10.263 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:16:11 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:16:11.242 271987 INFO neutron.agent.linux.ip_lib [None req-e37be96b-b1d9-412e-9d86-a3d513ea5e71 - - - - - -] Device tapda27e3e0-13 cannot be used as it has no MAC address#033[00m Oct 14 06:16:11 localhost nova_compute[297686]: 2025-10-14 10:16:11.264 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:16:11 localhost kernel: device tapda27e3e0-13 entered promiscuous mode Oct 14 06:16:11 localhost NetworkManager[5977]: [1760436971.2738] manager: (tapda27e3e0-13): new Generic device (/org/freedesktop/NetworkManager/Devices/33) Oct 14 06:16:11 localhost nova_compute[297686]: 2025-10-14 10:16:11.274 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:16:11 localhost systemd-udevd[331150]: Network interface NamePolicy= disabled on kernel command line. Oct 14 06:16:11 localhost ovn_controller[157396]: 2025-10-14T10:16:11Z|00173|binding|INFO|Claiming lport da27e3e0-13a0-4974-aa0d-030ac455a157 for this chassis. 
Oct 14 06:16:11 localhost ovn_controller[157396]: 2025-10-14T10:16:11Z|00174|binding|INFO|da27e3e0-13a0-4974-aa0d-030ac455a157: Claiming unknown Oct 14 06:16:11 localhost journal[237477]: ethtool ioctl error on tapda27e3e0-13: No such device Oct 14 06:16:11 localhost journal[237477]: ethtool ioctl error on tapda27e3e0-13: No such device Oct 14 06:16:11 localhost ovn_metadata_agent[163050]: 2025-10-14 10:16:11.314 163055 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005486733.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpe1893814-19d7-5b97-af05-19e2aafa5382-1df4417c-0ac5-4fd2-bddd-03b727bc48b3', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1df4417c-0ac5-4fd2-bddd-03b727bc48b3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8ca5e1d577fe463aa89a13e320c6dd5f', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c4b6a919-b709-4bb9-9806-dcfc94f77b17, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=da27e3e0-13a0-4974-aa0d-030ac455a157) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Oct 14 06:16:11 localhost ovn_metadata_agent[163050]: 2025-10-14 10:16:11.316 163055 INFO neutron.agent.ovn.metadata.agent [-] Port da27e3e0-13a0-4974-aa0d-030ac455a157 in datapath 1df4417c-0ac5-4fd2-bddd-03b727bc48b3 bound to 
our chassis#033[00m Oct 14 06:16:11 localhost ovn_metadata_agent[163050]: 2025-10-14 10:16:11.318 163055 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 1df4417c-0ac5-4fd2-bddd-03b727bc48b3 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Oct 14 06:16:11 localhost ovn_metadata_agent[163050]: 2025-10-14 10:16:11.319 163159 DEBUG oslo.privsep.daemon [-] privsep: reply[1acabf0f-855f-4a32-aef2-d5e99d384064]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Oct 14 06:16:11 localhost journal[237477]: ethtool ioctl error on tapda27e3e0-13: No such device Oct 14 06:16:11 localhost nova_compute[297686]: 2025-10-14 10:16:11.362 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:16:11 localhost journal[237477]: ethtool ioctl error on tapda27e3e0-13: No such device Oct 14 06:16:11 localhost ovn_controller[157396]: 2025-10-14T10:16:11Z|00175|binding|INFO|Setting lport da27e3e0-13a0-4974-aa0d-030ac455a157 ovn-installed in OVS Oct 14 06:16:11 localhost ovn_controller[157396]: 2025-10-14T10:16:11Z|00176|binding|INFO|Setting lport da27e3e0-13a0-4974-aa0d-030ac455a157 up in Southbound Oct 14 06:16:11 localhost nova_compute[297686]: 2025-10-14 10:16:11.368 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:16:11 localhost journal[237477]: ethtool ioctl error on tapda27e3e0-13: No such device Oct 14 06:16:11 localhost journal[237477]: ethtool ioctl error on tapda27e3e0-13: No such device Oct 14 06:16:11 localhost journal[237477]: ethtool ioctl error on tapda27e3e0-13: No such device Oct 14 06:16:11 localhost journal[237477]: ethtool ioctl error on tapda27e3e0-13: No such device Oct 14 06:16:11 
localhost nova_compute[297686]: 2025-10-14 10:16:11.394 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:16:11 localhost nova_compute[297686]: 2025-10-14 10:16:11.418 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:16:11 localhost neutron_sriov_agent[264974]: 2025-10-14 10:16:11.577 2 INFO neutron.agent.securitygroups_rpc [None req-61a21130-b43a-4480-a1a6-dff95d56ba6f 73c3910059834cd0998d3459c50cd69d 82fc7afce38344ffb7eda3bb0fabdb5b - - default default] Security group member updated ['10f25aec-a6f2-40dd-837d-8812e1c0ebb8']#033[00m Oct 14 06:16:12 localhost podman[331221]: Oct 14 06:16:12 localhost podman[331221]: 2025-10-14 10:16:12.155885024 +0000 UTC m=+0.080706591 container create fee8dbc26bdd5969adcde9fecdbe46207baa99be5805e25b06a409a785e1419f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1df4417c-0ac5-4fd2-bddd-03b727bc48b3, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20251009, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.name=CentOS Stream 9 Base Image) Oct 14 06:16:12 localhost systemd[1]: Started libpod-conmon-fee8dbc26bdd5969adcde9fecdbe46207baa99be5805e25b06a409a785e1419f.scope. Oct 14 06:16:12 localhost systemd[1]: Started libcrun container. 
Oct 14 06:16:12 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/38a9048af44fc3ec298294d1acc77e4f9544621724d603cdfe91ed6211abff87/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Oct 14 06:16:12 localhost podman[331221]: 2025-10-14 10:16:12.118578438 +0000 UTC m=+0.043399975 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Oct 14 06:16:12 localhost podman[331221]: 2025-10-14 10:16:12.222530728 +0000 UTC m=+0.147352325 container init fee8dbc26bdd5969adcde9fecdbe46207baa99be5805e25b06a409a785e1419f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1df4417c-0ac5-4fd2-bddd-03b727bc48b3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true) Oct 14 06:16:12 localhost podman[331221]: 2025-10-14 10:16:12.233496048 +0000 UTC m=+0.158317605 container start fee8dbc26bdd5969adcde9fecdbe46207baa99be5805e25b06a409a785e1419f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1df4417c-0ac5-4fd2-bddd-03b727bc48b3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251009, io.buildah.version=1.41.3) Oct 14 06:16:12 localhost dnsmasq[331240]: started, version 2.85 cachesize 150 Oct 14 06:16:12 localhost dnsmasq[331240]: DNS service limited to local subnets Oct 14 06:16:12 localhost dnsmasq[331240]: compile time options: IPv6 GNU-getopt DBus no-UBus 
no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Oct 14 06:16:12 localhost dnsmasq[331240]: warning: no upstream servers configured Oct 14 06:16:12 localhost dnsmasq-dhcp[331240]: DHCPv6, static leases only on 2001:db8::, lease time 1d Oct 14 06:16:12 localhost dnsmasq[331240]: read /var/lib/neutron/dhcp/1df4417c-0ac5-4fd2-bddd-03b727bc48b3/addn_hosts - 0 addresses Oct 14 06:16:12 localhost dnsmasq-dhcp[331240]: read /var/lib/neutron/dhcp/1df4417c-0ac5-4fd2-bddd-03b727bc48b3/host Oct 14 06:16:12 localhost dnsmasq-dhcp[331240]: read /var/lib/neutron/dhcp/1df4417c-0ac5-4fd2-bddd-03b727bc48b3/opts Oct 14 06:16:12 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:16:12.370 271987 INFO neutron.agent.dhcp.agent [None req-6434f37f-771e-4801-9b7f-a46eda0ff749 - - - - - -] DHCP configuration for ports {'fc9a85cf-3953-4532-a6d5-45df55798050'} is completed#033[00m Oct 14 06:16:12 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:16:12.396 271987 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-10-14T10:16:08Z, description=, device_id=e051df2a-6c99-40d0-bcd5-cf988a4b8298, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=7b4347b9-4f0e-4d11-a0a3-8b2674ff69b2, ip_allocation=immediate, mac_address=fa:16:3e:72:d9:ea, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-10-14T10:16:02Z, description=, dns_domain=, id=61834488-97da-47c3-8374-b9fea3c2b7e5, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersTest-594589185, port_security_enabled=True, project_id=7bf1be3a6a454996a4414fad306906f1, provider:network_type=geneve, provider:physical_network=None, 
provider:segmentation_id=11756, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1961, status=ACTIVE, subnets=['ff1fa59e-0148-429f-946c-c78e0e8a255a'], tags=[], tenant_id=7bf1be3a6a454996a4414fad306906f1, updated_at=2025-10-14T10:16:04Z, vlan_transparent=None, network_id=61834488-97da-47c3-8374-b9fea3c2b7e5, port_security_enabled=False, project_id=7bf1be3a6a454996a4414fad306906f1, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2010, status=DOWN, tags=[], tenant_id=7bf1be3a6a454996a4414fad306906f1, updated_at=2025-10-14T10:16:08Z on network 61834488-97da-47c3-8374-b9fea3c2b7e5#033[00m Oct 14 06:16:12 localhost dnsmasq[331101]: read /var/lib/neutron/dhcp/61834488-97da-47c3-8374-b9fea3c2b7e5/addn_hosts - 1 addresses Oct 14 06:16:12 localhost dnsmasq-dhcp[331101]: read /var/lib/neutron/dhcp/61834488-97da-47c3-8374-b9fea3c2b7e5/host Oct 14 06:16:12 localhost dnsmasq-dhcp[331101]: read /var/lib/neutron/dhcp/61834488-97da-47c3-8374-b9fea3c2b7e5/opts Oct 14 06:16:12 localhost podman[331277]: 2025-10-14 10:16:12.553750697 +0000 UTC m=+0.034826310 container kill b4f79c6c0696cadc08af71d02cf1da3031c5b335900e291967d1a4f682c13113 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-61834488-97da-47c3-8374-b9fea3c2b7e5, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251009, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team) Oct 14 06:16:12 localhost dnsmasq[331240]: read /var/lib/neutron/dhcp/1df4417c-0ac5-4fd2-bddd-03b727bc48b3/addn_hosts - 0 addresses Oct 14 06:16:12 localhost dnsmasq-dhcp[331240]: read /var/lib/neutron/dhcp/1df4417c-0ac5-4fd2-bddd-03b727bc48b3/host Oct 14 
06:16:12 localhost podman[331270]: 2025-10-14 10:16:12.605462489 +0000 UTC m=+0.098851423 container kill fee8dbc26bdd5969adcde9fecdbe46207baa99be5805e25b06a409a785e1419f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1df4417c-0ac5-4fd2-bddd-03b727bc48b3, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true) Oct 14 06:16:12 localhost dnsmasq-dhcp[331240]: read /var/lib/neutron/dhcp/1df4417c-0ac5-4fd2-bddd-03b727bc48b3/opts Oct 14 06:16:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f. Oct 14 06:16:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917. Oct 14 06:16:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d. Oct 14 06:16:13 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:16:13.212 271987 INFO neutron.agent.dhcp.agent [None req-d37948f9-8c8f-453a-a676-f8323d6c60e1 - - - - - -] DHCP configuration for ports {'fc9a85cf-3953-4532-a6d5-45df55798050', '7b4347b9-4f0e-4d11-a0a3-8b2674ff69b2', 'da27e3e0-13a0-4974-aa0d-030ac455a157'} is completed#033[00m Oct 14 06:16:13 localhost systemd[1]: tmp-crun.wlQA7o.mount: Deactivated successfully. 
Oct 14 06:16:13 localhost podman[331316]: 2025-10-14 10:16:13.268484955 +0000 UTC m=+0.103909899 container health_status 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, build-date=2025-08-20T13:12:41, distribution-scope=public, version=9.6, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, release=1755695350, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-type=git, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., managed_by=edpm_ansible) Oct 14 06:16:13 localhost podman[331317]: 2025-10-14 10:16:13.306649716 +0000 UTC m=+0.137236911 container health_status 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS) Oct 14 06:16:13 localhost podman[331316]: 2025-10-14 10:16:13.313368715 +0000 UTC m=+0.148793629 container exec_died 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, distribution-scope=public, vcs-type=git, name=ubi9-minimal, config_id=edpm, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, architecture=x86_64, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=) Oct 14 06:16:13 localhost systemd[1]: 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917.service: Deactivated successfully. Oct 14 06:16:13 localhost podman[331315]: 2025-10-14 10:16:13.366325135 +0000 UTC m=+0.198142138 container health_status 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3) Oct 14 06:16:13 localhost podman[331317]: 2025-10-14 10:16:13.418554362 +0000 UTC m=+0.249141657 container exec_died 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, 
name=ceilometer_agent_compute, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team) Oct 14 06:16:13 localhost systemd[1]: 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d.service: Deactivated successfully. 
Oct 14 06:16:13 localhost podman[331315]: 2025-10-14 10:16:13.462391511 +0000 UTC m=+0.294208524 container exec_died 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, io.buildah.version=1.41.3, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller) Oct 14 06:16:13 localhost systemd[1]: 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f.service: Deactivated successfully. 
Oct 14 06:16:13 localhost nova_compute[297686]: 2025-10-14 10:16:13.563 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:16:14 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:16:14.194 271987 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-10-14T10:16:13Z, description=, device_id=f8c9747b-a127-4e36-a58b-69c927d75a26, device_owner=network:router_gateway, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=4fa022fe-9c15-4b4e-8855-0836836f91e9, ip_allocation=immediate, mac_address=fa:16:3e:af:c5:44, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-10-14T08:36:38Z, description=, dns_domain=, id=c0145816-4627-44f2-af00-ccc9ef0436ed, ipv4_address_scope=None, ipv6_address_scope=None, is_default=False, l2_adjacency=True, mtu=1350, name=public, port_security_enabled=True, project_id=41187b090f3d4818a32baa37ce8a3991, provider:network_type=flat, provider:physical_network=datacentre, provider:segmentation_id=None, qos_policy_id=None, revision_number=2, router:external=True, shared=False, standard_attr_id=29, status=ACTIVE, subnets=['bbd40dc1-97d8-4ed3-9d4f-9b3af758b526'], tags=[], tenant_id=41187b090f3d4818a32baa37ce8a3991, updated_at=2025-10-14T08:36:44Z, vlan_transparent=None, network_id=c0145816-4627-44f2-af00-ccc9ef0436ed, port_security_enabled=False, project_id=, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2038, status=DOWN, tags=[], tenant_id=, updated_at=2025-10-14T10:16:13Z on network c0145816-4627-44f2-af00-ccc9ef0436ed#033[00m Oct 14 06:16:14 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e133 
_set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Oct 14 06:16:14 localhost dnsmasq[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/addn_hosts - 2 addresses Oct 14 06:16:14 localhost dnsmasq-dhcp[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/host Oct 14 06:16:14 localhost dnsmasq-dhcp[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/opts Oct 14 06:16:14 localhost podman[331395]: 2025-10-14 10:16:14.445596983 +0000 UTC m=+0.049049510 container kill 27e23fba1f10c3118d508631d350793857acd97085c399f85eed727cd1eab7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0145816-4627-44f2-af00-ccc9ef0436ed, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251009, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.schema-version=1.0) Oct 14 06:16:14 localhost neutron_sriov_agent[264974]: 2025-10-14 10:16:14.499 2 INFO neutron.agent.securitygroups_rpc [None req-4e3d39ad-b026-4772-bd9e-9dbb5b15581b 73c3910059834cd0998d3459c50cd69d 82fc7afce38344ffb7eda3bb0fabdb5b - - default default] Security group member updated ['10f25aec-a6f2-40dd-837d-8812e1c0ebb8']#033[00m Oct 14 06:16:14 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:16:14.760 271987 INFO neutron.agent.dhcp.agent [None req-adb4dccf-c175-4484-90c3-89c30c11959f - - - - - -] DHCP configuration for ports {'4fa022fe-9c15-4b4e-8855-0836836f91e9'} is completed#033[00m Oct 14 06:16:15 localhost ovn_controller[157396]: 2025-10-14T10:16:15Z|00177|binding|INFO|Removing iface tapda27e3e0-13 ovn-installed in OVS Oct 14 06:16:15 localhost ovn_controller[157396]: 2025-10-14T10:16:15Z|00178|binding|INFO|Removing lport 
da27e3e0-13a0-4974-aa0d-030ac455a157 ovn-installed in OVS Oct 14 06:16:15 localhost ovn_metadata_agent[163050]: 2025-10-14 10:16:15.162 163055 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port 30d598bf-6e00-47d6-9772-7f349adc0947 with type ""#033[00m Oct 14 06:16:15 localhost ovn_metadata_agent[163050]: 2025-10-14 10:16:15.166 163055 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005486733.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpe1893814-19d7-5b97-af05-19e2aafa5382-1df4417c-0ac5-4fd2-bddd-03b727bc48b3', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1df4417c-0ac5-4fd2-bddd-03b727bc48b3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8ca5e1d577fe463aa89a13e320c6dd5f', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005486733.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c4b6a919-b709-4bb9-9806-dcfc94f77b17, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=da27e3e0-13a0-4974-aa0d-030ac455a157) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Oct 14 06:16:15 localhost ovn_metadata_agent[163050]: 2025-10-14 10:16:15.167 163055 INFO neutron.agent.ovn.metadata.agent [-] Port da27e3e0-13a0-4974-aa0d-030ac455a157 in datapath 1df4417c-0ac5-4fd2-bddd-03b727bc48b3 unbound from our chassis#033[00m Oct 14 06:16:15 
localhost ovn_metadata_agent[163050]: 2025-10-14 10:16:15.171 163055 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 1df4417c-0ac5-4fd2-bddd-03b727bc48b3, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Oct 14 06:16:15 localhost ovn_metadata_agent[163050]: 2025-10-14 10:16:15.172 163159 DEBUG oslo.privsep.daemon [-] privsep: reply[e5c2240b-0376-4c08-8588-bb213c0de5cd]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Oct 14 06:16:15 localhost nova_compute[297686]: 2025-10-14 10:16:15.198 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:16:15 localhost nova_compute[297686]: 2025-10-14 10:16:15.264 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:16:15 localhost podman[331434]: 2025-10-14 10:16:15.281136122 +0000 UTC m=+0.048836033 container kill fee8dbc26bdd5969adcde9fecdbe46207baa99be5805e25b06a409a785e1419f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1df4417c-0ac5-4fd2-bddd-03b727bc48b3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, io.buildah.version=1.41.3, org.label-schema.license=GPLv2) Oct 14 06:16:15 localhost dnsmasq[331240]: exiting on receipt of SIGTERM Oct 14 06:16:15 localhost systemd[1]: libpod-fee8dbc26bdd5969adcde9fecdbe46207baa99be5805e25b06a409a785e1419f.scope: Deactivated successfully. 
Oct 14 06:16:15 localhost podman[331449]: 2025-10-14 10:16:15.35405029 +0000 UTC m=+0.056489910 container died fee8dbc26bdd5969adcde9fecdbe46207baa99be5805e25b06a409a785e1419f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1df4417c-0ac5-4fd2-bddd-03b727bc48b3, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009) Oct 14 06:16:15 localhost systemd[1]: tmp-crun.m5DavE.mount: Deactivated successfully. Oct 14 06:16:15 localhost podman[331449]: 2025-10-14 10:16:15.398800006 +0000 UTC m=+0.101239576 container cleanup fee8dbc26bdd5969adcde9fecdbe46207baa99be5805e25b06a409a785e1419f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1df4417c-0ac5-4fd2-bddd-03b727bc48b3, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image) Oct 14 06:16:15 localhost systemd[1]: libpod-conmon-fee8dbc26bdd5969adcde9fecdbe46207baa99be5805e25b06a409a785e1419f.scope: Deactivated successfully. 
Oct 14 06:16:15 localhost podman[331450]: 2025-10-14 10:16:15.440538009 +0000 UTC m=+0.136498919 container remove fee8dbc26bdd5969adcde9fecdbe46207baa99be5805e25b06a409a785e1419f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1df4417c-0ac5-4fd2-bddd-03b727bc48b3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team) Oct 14 06:16:15 localhost nova_compute[297686]: 2025-10-14 10:16:15.454 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:16:15 localhost kernel: device tapda27e3e0-13 left promiscuous mode Oct 14 06:16:15 localhost nova_compute[297686]: 2025-10-14 10:16:15.465 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:16:15 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:16:15.489 271987 INFO neutron.agent.dhcp.agent [None req-743c6502-1246-4b94-9d82-8486207e486c - - - - - -] Synchronizing state#033[00m Oct 14 06:16:15 localhost nova_compute[297686]: 2025-10-14 10:16:15.671 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:16:15 localhost ovn_metadata_agent[163050]: 2025-10-14 10:16:15.675 163055 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=15, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 
'mac_prefix': 'b6:6b:50', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '6a:59:81:01:bc:8b'}, ipsec=False) old=SB_Global(nb_cfg=14) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Oct 14 06:16:15 localhost ovn_metadata_agent[163050]: 2025-10-14 10:16:15.676 163055 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m Oct 14 06:16:15 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:16:15.872 271987 INFO neutron.agent.dhcp.agent [None req-5ce02df7-327c-4a01-9b4a-02a59e26b994 - - - - - -] All active networks have been fetched through RPC.#033[00m Oct 14 06:16:15 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:16:15.873 271987 INFO neutron.agent.dhcp.agent [-] Starting network 1df4417c-0ac5-4fd2-bddd-03b727bc48b3 dhcp configuration#033[00m Oct 14 06:16:15 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:16:15.874 271987 INFO neutron.agent.dhcp.agent [-] Finished network 1df4417c-0ac5-4fd2-bddd-03b727bc48b3 dhcp configuration#033[00m Oct 14 06:16:15 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:16:15.874 271987 INFO neutron.agent.dhcp.agent [None req-5ce02df7-327c-4a01-9b4a-02a59e26b994 - - - - - -] Synchronizing state complete#033[00m Oct 14 06:16:15 localhost ovn_controller[157396]: 2025-10-14T10:16:15Z|00179|binding|INFO|Releasing lport 25c6586a-239c-451b-aac2-e0a3ee5c3145 from this chassis (sb_readonly=0) Oct 14 06:16:15 localhost nova_compute[297686]: 2025-10-14 10:16:15.939 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:16:16 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:16:16.075 271987 INFO neutron.agent.dhcp.agent [None req-3a67295f-238b-4aa3-b1bb-e3e17093efe3 - - - - - -] DHCP configuration for ports {'fc9a85cf-3953-4532-a6d5-45df55798050'} 
is completed#033[00m Oct 14 06:16:16 localhost systemd[1]: var-lib-containers-storage-overlay-38a9048af44fc3ec298294d1acc77e4f9544621724d603cdfe91ed6211abff87-merged.mount: Deactivated successfully. Oct 14 06:16:16 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-fee8dbc26bdd5969adcde9fecdbe46207baa99be5805e25b06a409a785e1419f-userdata-shm.mount: Deactivated successfully. Oct 14 06:16:16 localhost systemd[1]: run-netns-qdhcp\x2d1df4417c\x2d0ac5\x2d4fd2\x2dbddd\x2d03b727bc48b3.mount: Deactivated successfully. Oct 14 06:16:18 localhost dnsmasq[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/addn_hosts - 1 addresses Oct 14 06:16:18 localhost dnsmasq-dhcp[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/host Oct 14 06:16:18 localhost podman[331494]: 2025-10-14 10:16:18.294517866 +0000 UTC m=+0.051640961 container kill 27e23fba1f10c3118d508631d350793857acd97085c399f85eed727cd1eab7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0145816-4627-44f2-af00-ccc9ef0436ed, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251009) Oct 14 06:16:18 localhost dnsmasq-dhcp[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/opts Oct 14 06:16:18 localhost systemd[1]: tmp-crun.DFUxjq.mount: Deactivated successfully. 
Oct 14 06:16:18 localhost ovn_metadata_agent[163050]: 2025-10-14 10:16:18.410 163055 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:63:b4:89 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-74049e43-4aa7-4318-9233-a58980c3495b', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-74049e43-4aa7-4318-9233-a58980c3495b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '82fc7afce38344ffb7eda3bb0fabdb5b', 'neutron:revision_number': '14', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0f1f1366-6307-4914-922e-2b4f9757811b, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=bb90059a-750e-43da-ba16-03b3dce8c155) old=Port_Binding(mac=['fa:16:3e:63:b4:89 10.100.0.2 2001:db8::f816:3eff:fe63:b489'], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fe63:b489/64', 'neutron:device_id': 'ovnmeta-74049e43-4aa7-4318-9233-a58980c3495b', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-74049e43-4aa7-4318-9233-a58980c3495b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '82fc7afce38344ffb7eda3bb0fabdb5b', 'neutron:revision_number': '11', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches 
/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Oct 14 06:16:18 localhost ovn_metadata_agent[163050]: 2025-10-14 10:16:18.414 163055 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port bb90059a-750e-43da-ba16-03b3dce8c155 in datapath 74049e43-4aa7-4318-9233-a58980c3495b updated#033[00m Oct 14 06:16:18 localhost ovn_metadata_agent[163050]: 2025-10-14 10:16:18.421 163055 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 74049e43-4aa7-4318-9233-a58980c3495b, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Oct 14 06:16:18 localhost ovn_metadata_agent[163050]: 2025-10-14 10:16:18.422 163159 DEBUG oslo.privsep.daemon [-] privsep: reply[db4a5ee2-ca6b-4e87-800f-aa682eaa3893]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Oct 14 06:16:18 localhost nova_compute[297686]: 2025-10-14 10:16:18.566 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:16:19 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e133 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Oct 14 06:16:20 localhost nova_compute[297686]: 2025-10-14 10:16:20.282 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:16:20 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:16:20.674 271987 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-10-14T10:16:19Z, description=, device_id=1c451acf-4b7e-4704-a2f8-81f5f0c7da77, device_owner=network:router_gateway, dns_assignment=[], dns_domain=, 
dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=2a20d5b7-3c41-4c31-b8ad-80d4ac7849a3, ip_allocation=immediate, mac_address=fa:16:3e:b0:ff:a9, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-10-14T08:36:38Z, description=, dns_domain=, id=c0145816-4627-44f2-af00-ccc9ef0436ed, ipv4_address_scope=None, ipv6_address_scope=None, is_default=False, l2_adjacency=True, mtu=1350, name=public, port_security_enabled=True, project_id=41187b090f3d4818a32baa37ce8a3991, provider:network_type=flat, provider:physical_network=datacentre, provider:segmentation_id=None, qos_policy_id=None, revision_number=2, router:external=True, shared=False, standard_attr_id=29, status=ACTIVE, subnets=['bbd40dc1-97d8-4ed3-9d4f-9b3af758b526'], tags=[], tenant_id=41187b090f3d4818a32baa37ce8a3991, updated_at=2025-10-14T08:36:44Z, vlan_transparent=None, network_id=c0145816-4627-44f2-af00-ccc9ef0436ed, port_security_enabled=False, project_id=, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2070, status=DOWN, tags=[], tenant_id=, updated_at=2025-10-14T10:16:20Z on network c0145816-4627-44f2-af00-ccc9ef0436ed#033[00m Oct 14 06:16:20 localhost ovn_metadata_agent[163050]: 2025-10-14 10:16:20.829 163055 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:63:b4:89 10.100.0.2 2001:db8::f816:3eff:fe63:b489'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fe63:b489/64', 'neutron:device_id': 'ovnmeta-74049e43-4aa7-4318-9233-a58980c3495b', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 
'neutron:network_name': 'neutron-74049e43-4aa7-4318-9233-a58980c3495b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '82fc7afce38344ffb7eda3bb0fabdb5b', 'neutron:revision_number': '15', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0f1f1366-6307-4914-922e-2b4f9757811b, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=bb90059a-750e-43da-ba16-03b3dce8c155) old=Port_Binding(mac=['fa:16:3e:63:b4:89 10.100.0.2'], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-74049e43-4aa7-4318-9233-a58980c3495b', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-74049e43-4aa7-4318-9233-a58980c3495b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '82fc7afce38344ffb7eda3bb0fabdb5b', 'neutron:revision_number': '14', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Oct 14 06:16:20 localhost ovn_metadata_agent[163050]: 2025-10-14 10:16:20.832 163055 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port bb90059a-750e-43da-ba16-03b3dce8c155 in datapath 74049e43-4aa7-4318-9233-a58980c3495b updated#033[00m Oct 14 06:16:20 localhost ovn_metadata_agent[163050]: 2025-10-14 10:16:20.837 163055 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 74049e43-4aa7-4318-9233-a58980c3495b, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Oct 14 06:16:20 localhost ovn_metadata_agent[163050]: 2025-10-14 10:16:20.839 163159 DEBUG 
oslo.privsep.daemon [-] privsep: reply[ec33397c-4913-40dd-9df0-4d70b28d64e6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Oct 14 06:16:20 localhost dnsmasq[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/addn_hosts - 2 addresses Oct 14 06:16:20 localhost dnsmasq-dhcp[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/host Oct 14 06:16:20 localhost dnsmasq-dhcp[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/opts Oct 14 06:16:20 localhost podman[331533]: 2025-10-14 10:16:20.880764229 +0000 UTC m=+0.069690980 container kill 27e23fba1f10c3118d508631d350793857acd97085c399f85eed727cd1eab7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0145816-4627-44f2-af00-ccc9ef0436ed, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, io.buildah.version=1.41.3, org.label-schema.license=GPLv2) Oct 14 06:16:20 localhost systemd[1]: tmp-crun.kRwT8c.mount: Deactivated successfully. 
Oct 14 06:16:21 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:16:21.157 271987 INFO neutron.agent.dhcp.agent [None req-fde7273d-75e9-4f40-8436-d233d077d45b - - - - - -] DHCP configuration for ports {'2a20d5b7-3c41-4c31-b8ad-80d4ac7849a3'} is completed#033[00m Oct 14 06:16:21 localhost ovn_metadata_agent[163050]: 2025-10-14 10:16:21.678 163055 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=9e4b0f79-1220-4c7d-a18d-fa1a88dab362, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '15'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Oct 14 06:16:23 localhost nova_compute[297686]: 2025-10-14 10:16:23.609 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:16:23 localhost neutron_sriov_agent[264974]: 2025-10-14 10:16:23.719 2 INFO neutron.agent.securitygroups_rpc [None req-71ba91b5-684f-48d7-9c14-69fcb9a93319 73c3910059834cd0998d3459c50cd69d 82fc7afce38344ffb7eda3bb0fabdb5b - - default default] Security group member updated ['10f25aec-a6f2-40dd-837d-8812e1c0ebb8']#033[00m Oct 14 06:16:24 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:16:24.003 271987 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-10-14T10:16:23Z, description=, device_id=79b7f41d-abf6-47ac-b611-6aadc3e7cc62, device_owner=network:router_gateway, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=7f456452-ab1e-4ddb-8cc9-55657971b49c, ip_allocation=immediate, mac_address=fa:16:3e:7a:05:fe, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-10-14T08:36:38Z, description=, 
dns_domain=, id=c0145816-4627-44f2-af00-ccc9ef0436ed, ipv4_address_scope=None, ipv6_address_scope=None, is_default=False, l2_adjacency=True, mtu=1350, name=public, port_security_enabled=True, project_id=41187b090f3d4818a32baa37ce8a3991, provider:network_type=flat, provider:physical_network=datacentre, provider:segmentation_id=None, qos_policy_id=None, revision_number=2, router:external=True, shared=False, standard_attr_id=29, status=ACTIVE, subnets=['bbd40dc1-97d8-4ed3-9d4f-9b3af758b526'], tags=[], tenant_id=41187b090f3d4818a32baa37ce8a3991, updated_at=2025-10-14T08:36:44Z, vlan_transparent=None, network_id=c0145816-4627-44f2-af00-ccc9ef0436ed, port_security_enabled=False, project_id=, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2099, status=DOWN, tags=[], tenant_id=, updated_at=2025-10-14T10:16:23Z on network c0145816-4627-44f2-af00-ccc9ef0436ed#033[00m Oct 14 06:16:24 localhost dnsmasq[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/addn_hosts - 3 addresses Oct 14 06:16:24 localhost podman[331572]: 2025-10-14 10:16:24.194426082 +0000 UTC m=+0.042801436 container kill 27e23fba1f10c3118d508631d350793857acd97085c399f85eed727cd1eab7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0145816-4627-44f2-af00-ccc9ef0436ed, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true) Oct 14 06:16:24 localhost dnsmasq-dhcp[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/host Oct 14 06:16:24 localhost dnsmasq-dhcp[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/opts Oct 14 06:16:24 
localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e133 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Oct 14 06:16:24 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:16:24.443 271987 INFO neutron.agent.dhcp.agent [None req-47b0c7c9-923c-49e3-980b-74aa65e96c6b - - - - - -] DHCP configuration for ports {'7f456452-ab1e-4ddb-8cc9-55657971b49c'} is completed#033[00m Oct 14 06:16:24 localhost nova_compute[297686]: 2025-10-14 10:16:24.502 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:16:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041. Oct 14 06:16:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857. Oct 14 06:16:24 localhost systemd[1]: tmp-crun.q8TIGn.mount: Deactivated successfully. 
Oct 14 06:16:24 localhost nova_compute[297686]: 2025-10-14 10:16:24.765 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:16:24 localhost podman[331593]: 2025-10-14 10:16:24.7978016 +0000 UTC m=+0.135953121 container health_status 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251009, 
tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0) Oct 14 06:16:24 localhost podman[331593]: 2025-10-14 10:16:24.809291176 +0000 UTC m=+0.147442657 container exec_died 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, 
org.label-schema.build-date=20251009) Oct 14 06:16:24 localhost systemd[1]: 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857.service: Deactivated successfully. Oct 14 06:16:24 localhost podman[331592]: 2025-10-14 10:16:24.773913321 +0000 UTC m=+0.115176759 container health_status 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Oct 14 06:16:24 localhost podman[331592]: 2025-10-14 10:16:24.86010112 +0000 UTC m=+0.201364558 container exec_died 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 
'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm) Oct 14 06:16:24 localhost systemd[1]: 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041.service: Deactivated successfully. Oct 14 06:16:25 localhost nova_compute[297686]: 2025-10-14 10:16:25.039 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:16:25 localhost nova_compute[297686]: 2025-10-14 10:16:25.284 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:16:25 localhost neutron_sriov_agent[264974]: 2025-10-14 10:16:25.573 2 INFO neutron.agent.securitygroups_rpc [None req-c5aa3340-9fd7-4904-bb68-292ec0b04a2a f13b53fbf22a4c35bd774e0276dc1885 c1b284821e574367bb6352caf7327da5 - - default default] Security group member updated ['c10bbd65-342a-46b6-95d2-96fbac5e8435']#033[00m Oct 14 06:16:25 localhost neutron_sriov_agent[264974]: 2025-10-14 10:16:25.693 2 INFO neutron.agent.securitygroups_rpc [None req-9dacf8d0-f231-40b3-b91a-c67bf08625fb 73c3910059834cd0998d3459c50cd69d 82fc7afce38344ffb7eda3bb0fabdb5b - - default default] Security group member updated ['10f25aec-a6f2-40dd-837d-8812e1c0ebb8']#033[00m Oct 14 06:16:25 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Oct 14 06:16:26 localhost neutron_sriov_agent[264974]: 2025-10-14 10:16:26.322 2 INFO neutron.agent.securitygroups_rpc [None req-c5aa3340-9fd7-4904-bb68-292ec0b04a2a f13b53fbf22a4c35bd774e0276dc1885 c1b284821e574367bb6352caf7327da5 - - default default] Security group member updated 
['c10bbd65-342a-46b6-95d2-96fbac5e8435']#033[00m Oct 14 06:16:26 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:16:26.557 271987 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-10-14T10:16:26Z, description=, device_id=8dfa4a0c-62d8-4b31-b0c7-aa2ed8514011, device_owner=network:router_gateway, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=e964f1c0-f2d1-4bc5-9fe7-77ab943cf6a6, ip_allocation=immediate, mac_address=fa:16:3e:2c:40:93, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-10-14T08:36:38Z, description=, dns_domain=, id=c0145816-4627-44f2-af00-ccc9ef0436ed, ipv4_address_scope=None, ipv6_address_scope=None, is_default=False, l2_adjacency=True, mtu=1350, name=public, port_security_enabled=True, project_id=41187b090f3d4818a32baa37ce8a3991, provider:network_type=flat, provider:physical_network=datacentre, provider:segmentation_id=None, qos_policy_id=None, revision_number=2, router:external=True, shared=False, standard_attr_id=29, status=ACTIVE, subnets=['bbd40dc1-97d8-4ed3-9d4f-9b3af758b526'], tags=[], tenant_id=41187b090f3d4818a32baa37ce8a3991, updated_at=2025-10-14T08:36:44Z, vlan_transparent=None, network_id=c0145816-4627-44f2-af00-ccc9ef0436ed, port_security_enabled=False, project_id=, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2103, status=DOWN, tags=[], tenant_id=, updated_at=2025-10-14T10:16:26Z on network c0145816-4627-44f2-af00-ccc9ef0436ed#033[00m Oct 14 06:16:26 localhost systemd[1]: tmp-crun.ZVYgkd.mount: Deactivated successfully. 
Oct 14 06:16:26 localhost dnsmasq[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/addn_hosts - 4 addresses Oct 14 06:16:26 localhost dnsmasq-dhcp[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/host Oct 14 06:16:26 localhost dnsmasq-dhcp[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/opts Oct 14 06:16:26 localhost podman[331737]: 2025-10-14 10:16:26.776475846 +0000 UTC m=+0.070540427 container kill 27e23fba1f10c3118d508631d350793857acd97085c399f85eed727cd1eab7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0145816-4627-44f2-af00-ccc9ef0436ed, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5) Oct 14 06:16:26 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' Oct 14 06:16:27 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:16:27.007 271987 INFO neutron.agent.dhcp.agent [None req-60950066-87a6-47db-a1de-667a48d72088 - - - - - -] DHCP configuration for ports {'e964f1c0-f2d1-4bc5-9fe7-77ab943cf6a6'} is completed#033[00m Oct 14 06:16:27 localhost nova_compute[297686]: 2025-10-14 10:16:27.255 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:16:27 localhost nova_compute[297686]: 2025-10-14 10:16:27.256 2 DEBUG nova.compute.manager [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Starting heal instance info cache _heal_instance_info_cache 
/usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Oct 14 06:16:27 localhost nova_compute[297686]: 2025-10-14 10:16:27.256 2 DEBUG nova.compute.manager [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Oct 14 06:16:27 localhost nova_compute[297686]: 2025-10-14 10:16:27.580 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Acquiring lock "refresh_cache-88c4e366-b765-47a6-96bf-f7677f2ce67c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Oct 14 06:16:27 localhost nova_compute[297686]: 2025-10-14 10:16:27.580 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Acquired lock "refresh_cache-88c4e366-b765-47a6-96bf-f7677f2ce67c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Oct 14 06:16:27 localhost nova_compute[297686]: 2025-10-14 10:16:27.580 2 DEBUG nova.network.neutron [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] [instance: 88c4e366-b765-47a6-96bf-f7677f2ce67c] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Oct 14 06:16:27 localhost nova_compute[297686]: 2025-10-14 10:16:27.581 2 DEBUG nova.objects.instance [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Lazy-loading 'info_cache' on Instance uuid 88c4e366-b765-47a6-96bf-f7677f2ce67c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Oct 14 06:16:28 localhost nova_compute[297686]: 2025-10-14 10:16:28.008 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:16:28 localhost podman[248187]: time="2025-10-14T10:16:28Z" level=info msg="List containers: received 
`last` parameter - overwriting `limit`" Oct 14 06:16:28 localhost sshd[331759]: main: sshd: ssh-rsa algorithm is disabled Oct 14 06:16:28 localhost podman[248187]: @ - - [14/Oct/2025:10:16:28 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 151145 "" "Go-http-client/1.1" Oct 14 06:16:28 localhost podman[248187]: @ - - [14/Oct/2025:10:16:28 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 20814 "" "Go-http-client/1.1" Oct 14 06:16:28 localhost nova_compute[297686]: 2025-10-14 10:16:28.646 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:16:29 localhost neutron_sriov_agent[264974]: 2025-10-14 10:16:29.177 2 INFO neutron.agent.securitygroups_rpc [None req-c4c00a8c-e700-4f72-9a8b-3220edd3eb7f f13b53fbf22a4c35bd774e0276dc1885 c1b284821e574367bb6352caf7327da5 - - default default] Security group member updated ['c10bbd65-342a-46b6-95d2-96fbac5e8435']#033[00m Oct 14 06:16:29 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:16:29.338 271987 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Oct 14 06:16:29 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e133 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Oct 14 06:16:30 localhost nova_compute[297686]: 2025-10-14 10:16:30.264 2 DEBUG nova.network.neutron [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] [instance: 88c4e366-b765-47a6-96bf-f7677f2ce67c] Updating instance_info_cache with network_info: [{"id": "3ec9b060-f43d-4698-9c76-6062c70911d5", "address": "fa:16:3e:84:5e:e5", "network": {"id": "7d0cd696-bdd7-4e70-9512-eb0d23640314", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", 
"version": 4, "meta": {}}, "ips": [{"address": "192.168.0.46", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "41187b090f3d4818a32baa37ce8a3991", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3ec9b060-f4", "ovs_interfaceid": "3ec9b060-f43d-4698-9c76-6062c70911d5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Oct 14 06:16:30 localhost nova_compute[297686]: 2025-10-14 10:16:30.284 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Releasing lock "refresh_cache-88c4e366-b765-47a6-96bf-f7677f2ce67c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Oct 14 06:16:30 localhost nova_compute[297686]: 2025-10-14 10:16:30.285 2 DEBUG nova.compute.manager [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] [instance: 88c4e366-b765-47a6-96bf-f7677f2ce67c] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Oct 14 06:16:30 localhost nova_compute[297686]: 2025-10-14 10:16:30.286 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:16:30 localhost nova_compute[297686]: 2025-10-14 10:16:30.286 2 DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:16:30 localhost nova_compute[297686]: 2025-10-14 10:16:30.287 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:16:30 localhost nova_compute[297686]: 2025-10-14 10:16:30.288 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:16:30 localhost nova_compute[297686]: 2025-10-14 10:16:30.289 2 DEBUG nova.compute.manager [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Oct 14 06:16:30 localhost nova_compute[297686]: 2025-10-14 10:16:30.289 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:16:30 localhost nova_compute[297686]: 2025-10-14 10:16:30.290 2 DEBUG nova.compute.manager [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Cleaning up deleted instances with incomplete migration _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m Oct 14 06:16:30 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' Oct 14 06:16:30 localhost neutron_sriov_agent[264974]: 2025-10-14 10:16:30.769 2 INFO neutron.agent.securitygroups_rpc [None 
req-e7264db5-7468-4a06-8b47-30dfce2eff1c f13b53fbf22a4c35bd774e0276dc1885 c1b284821e574367bb6352caf7327da5 - - default default] Security group member updated ['c10bbd65-342a-46b6-95d2-96fbac5e8435']#033[00m Oct 14 06:16:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad. Oct 14 06:16:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec. Oct 14 06:16:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29. Oct 14 06:16:31 localhost podman[331775]: 2025-10-14 10:16:31.744435237 +0000 UTC m=+0.079531164 container health_status aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': 
'/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Oct 14 06:16:31 localhost dnsmasq[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/addn_hosts - 3 addresses Oct 14 06:16:31 localhost dnsmasq-dhcp[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/host Oct 14 06:16:31 localhost podman[331801]: 2025-10-14 10:16:31.779430011 +0000 UTC m=+0.063350894 container kill 27e23fba1f10c3118d508631d350793857acd97085c399f85eed727cd1eab7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0145816-4627-44f2-af00-ccc9ef0436ed, org.label-schema.license=GPLv2, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0) Oct 14 06:16:31 localhost dnsmasq-dhcp[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/opts Oct 14 06:16:31 localhost sshd[331829]: main: sshd: ssh-rsa algorithm is disabled Oct 14 06:16:31 localhost podman[331776]: 2025-10-14 10:16:31.826030424 +0000 UTC m=+0.154444595 container health_status fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, config_id=iscsid, container_name=iscsid, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 
'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true) Oct 14 06:16:31 localhost podman[331775]: 2025-10-14 10:16:31.832162673 +0000 UTC m=+0.167258650 container exec_died aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', 
'--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter) Oct 14 06:16:31 localhost systemd[1]: aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec.service: Deactivated successfully. Oct 14 06:16:31 localhost podman[331776]: 2025-10-14 10:16:31.863196405 +0000 UTC m=+0.191610576 container exec_died fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', 
'/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, org.label-schema.build-date=20251009, tcib_managed=true, config_id=iscsid, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible) Oct 14 06:16:31 localhost systemd[1]: fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29.service: Deactivated successfully. Oct 14 06:16:31 localhost podman[331774]: 2025-10-14 10:16:31.905904778 +0000 UTC m=+0.241831862 container health_status 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', 
'/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true) Oct 14 06:16:31 localhost podman[331774]: 2025-10-14 10:16:31.916431173 +0000 UTC m=+0.252358267 container exec_died 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', 
'/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=multipathd, container_name=multipathd, org.label-schema.vendor=CentOS, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d) Oct 14 06:16:31 localhost systemd[1]: 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad.service: Deactivated successfully. Oct 14 06:16:32 localhost nova_compute[297686]: 2025-10-14 10:16:32.311 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:16:32 localhost nova_compute[297686]: 2025-10-14 10:16:32.311 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:16:32 localhost systemd[1]: tmp-crun.7dMoIO.mount: Deactivated successfully. 
Oct 14 06:16:33 localhost nova_compute[297686]: 2025-10-14 10:16:33.254 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:16:33 localhost nova_compute[297686]: 2025-10-14 10:16:33.255 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:16:33 localhost nova_compute[297686]: 2025-10-14 10:16:33.648 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:16:33 localhost ovn_controller[157396]: 2025-10-14T10:16:33Z|00180|binding|INFO|Releasing lport 25c6586a-239c-451b-aac2-e0a3ee5c3145 from this chassis (sb_readonly=0) Oct 14 06:16:33 localhost nova_compute[297686]: 2025-10-14 10:16:33.703 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:16:34 localhost ceph-mon[317114]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Oct 14 06:16:34 localhost ceph-mon[317114]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 600.0 total, 600.0 interval#012Cumulative writes: 2164 writes, 22K keys, 2164 commit groups, 1.0 writes per commit group, ingest: 0.03 GB, 0.06 MB/s#012Cumulative WAL: 2164 writes, 2164 syncs, 1.00 writes per sync, written: 0.03 GB, 0.06 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 2164 writes, 22K keys, 2164 commit groups, 1.0 writes per commit group, ingest: 35.68 MB, 0.06 MB/s#012Interval WAL: 2164 writes, 2164 syncs, 1.00 writes per sync, written: 0.03 GB, 0.06 MB/s#012Interval stall: 
00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 L0 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 190.4 0.13 0.06 8 0.017 0 0 0.0 0.0#012 L6 1/0 14.41 MB 0.0 0.1 0.0 0.1 0.1 0.0 0.0 4.6 213.0 194.7 0.61 0.32 7 0.087 90K 3417 0.0 0.0#012 Sum 1/0 14.41 MB 0.0 0.1 0.0 0.1 0.1 0.0 0.0 5.6 174.5 193.9 0.74 0.38 15 0.049 90K 3417 0.0 0.0#012 Int 0/0 0.00 KB 0.0 0.1 0.0 0.1 0.1 0.0 0.0 5.6 175.2 194.7 0.74 0.38 14 0.053 90K 3417 0.0 0.0#012#012** Compaction Stats [default] **#012Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Low 0/0 0.00 KB 0.0 0.1 0.0 0.1 0.1 0.0 0.0 0.0 213.0 194.7 0.61 0.32 7 0.087 90K 3417 0.0 0.0#012High 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 194.8 0.13 0.06 7 0.019 0 0 0.0 0.0#012User 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.6 0.00 0.00 1 0.003 0 0 0.0 0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 600.0 total, 600.0 interval#012Flush(GB): cumulative 0.025, interval 0.025#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.14 GB write, 0.24 MB/s write, 0.13 GB read, 0.22 
MB/s read, 0.7 seconds#012Interval compaction: 0.14 GB write, 0.24 MB/s write, 0.13 GB read, 0.22 MB/s read, 0.7 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55c2d5d93350#2 capacity: 308.00 MB usage: 11.85 MB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 0 last_secs: 9.9e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(526,11.26 MB,3.65564%) FilterBlock(15,266.30 KB,0.0844336%) IndexBlock(15,338.52 KB,0.107332%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] ** Oct 14 06:16:34 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e133 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Oct 14 06:16:34 localhost nova_compute[297686]: 2025-10-14 10:16:34.454 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Oct 14 06:16:34 localhost nova_compute[297686]: 2025-10-14 10:16:34.454 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Oct 14 06:16:34 localhost nova_compute[297686]: 2025-10-14 10:16:34.455 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Lock "compute_resources" "released" by 
"nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Oct 14 06:16:34 localhost nova_compute[297686]: 2025-10-14 10:16:34.455 2 DEBUG nova.compute.resource_tracker [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Auditing locally available compute resources for np0005486733.localdomain (node: np0005486733.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Oct 14 06:16:34 localhost nova_compute[297686]: 2025-10-14 10:16:34.456 2 DEBUG oslo_concurrency.processutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Oct 14 06:16:34 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Oct 14 06:16:34 localhost ceph-mon[317114]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.108:0/2640468034' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Oct 14 06:16:34 localhost nova_compute[297686]: 2025-10-14 10:16:34.916 2 DEBUG oslo_concurrency.processutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.461s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Oct 14 06:16:35 localhost nova_compute[297686]: 2025-10-14 10:16:35.000 2 DEBUG nova.virt.libvirt.driver [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Oct 14 06:16:35 localhost nova_compute[297686]: 2025-10-14 10:16:35.001 2 DEBUG nova.virt.libvirt.driver [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Oct 14 06:16:35 localhost nova_compute[297686]: 2025-10-14 10:16:35.221 2 WARNING nova.virt.libvirt.driver [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Oct 14 06:16:35 localhost nova_compute[297686]: 2025-10-14 10:16:35.222 2 DEBUG nova.compute.resource_tracker [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Hypervisor/Node resource view: name=np0005486733.localdomain free_ram=11233MB free_disk=41.83695602416992GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": 
"1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Oct 14 06:16:35 localhost nova_compute[297686]: 2025-10-14 10:16:35.223 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Oct 14 06:16:35 localhost nova_compute[297686]: 2025-10-14 10:16:35.223 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Oct 14 06:16:35 localhost nova_compute[297686]: 2025-10-14 10:16:35.288 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:16:35 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Oct 14 06:16:35 localhost ceph-mon[317114]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/955387673' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Oct 14 06:16:35 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Oct 14 06:16:35 localhost ceph-mon[317114]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.32:0/955387673' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Oct 14 06:16:35 localhost nova_compute[297686]: 2025-10-14 10:16:35.581 2 DEBUG nova.compute.resource_tracker [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Instance 88c4e366-b765-47a6-96bf-f7677f2ce67c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Oct 14 06:16:35 localhost nova_compute[297686]: 2025-10-14 10:16:35.582 2 DEBUG nova.compute.resource_tracker [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Oct 14 06:16:35 localhost nova_compute[297686]: 2025-10-14 10:16:35.582 2 DEBUG nova.compute.resource_tracker [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Final resource view: name=np0005486733.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Oct 14 06:16:36 localhost nova_compute[297686]: 2025-10-14 10:16:36.081 2 DEBUG nova.scheduler.client.report [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Refreshing inventories for resource provider 18c24273-aca2-4f08-be57-3188d558235e _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m Oct 14 06:16:36 localhost nova_compute[297686]: 2025-10-14 10:16:36.355 2 DEBUG nova.scheduler.client.report [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Updating ProviderTree inventory for provider 18c24273-aca2-4f08-be57-3188d558235e from _refresh_and_get_inventory using data: 
{'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m Oct 14 06:16:36 localhost nova_compute[297686]: 2025-10-14 10:16:36.355 2 DEBUG nova.compute.provider_tree [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Updating inventory in ProviderTree for provider 18c24273-aca2-4f08-be57-3188d558235e with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m Oct 14 06:16:36 localhost nova_compute[297686]: 2025-10-14 10:16:36.379 2 DEBUG nova.scheduler.client.report [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Refreshing aggregate associations for resource provider 18c24273-aca2-4f08-be57-3188d558235e, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m Oct 14 06:16:36 localhost nova_compute[297686]: 2025-10-14 10:16:36.422 2 DEBUG nova.scheduler.client.report [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Refreshing trait associations for resource provider 18c24273-aca2-4f08-be57-3188d558235e, traits: 
COMPUTE_VOLUME_EXTEND,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE,HW_CPU_X86_BMI,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_MMX,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_SECURITY_TPM_2_0,HW_CPU_X86_FMA3,COMPUTE_GRAPHICS_MODEL_BOCHS,HW_CPU_X86_AVX2,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_SECURITY_TPM_1_2,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_AMD_SVM,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_ACCELERATORS,HW_CPU_X86_AESNI,COMPUTE_STORAGE_BUS_FDC,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_F16C,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SVM,COMPUTE_STORAGE_BUS_IDE,HW_CPU_X86_SSE4A,HW_CPU_X86_SSSE3,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NODE,HW_CPU_X86_BMI2,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_TRUSTED_CERTS,COMPUTE_STORAGE_BUS_SATA,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_STORAGE_BUS_USB,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_ABM,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_SSE2,HW_CPU_X86_AVX,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_CLMUL,HW_CPU_X86_SHA,COMPUTE_IMAGE_TYPE_RAW _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m Oct 14 06:16:36 localhost nova_compute[297686]: 2025-10-14 10:16:36.468 2 DEBUG oslo_concurrency.processutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Oct 14 06:16:36 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:16:36.735 271987 INFO neutron.agent.linux.ip_lib [None 
req-9f6d147c-f819-4b70-8f34-8911ff04a778 - - - - - -] Device tap9f966a39-b1 cannot be used as it has no MAC address#033[00m Oct 14 06:16:36 localhost nova_compute[297686]: 2025-10-14 10:16:36.799 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:16:36 localhost kernel: device tap9f966a39-b1 entered promiscuous mode Oct 14 06:16:36 localhost NetworkManager[5977]: [1760436996.8112] manager: (tap9f966a39-b1): new Generic device (/org/freedesktop/NetworkManager/Devices/34) Oct 14 06:16:36 localhost nova_compute[297686]: 2025-10-14 10:16:36.813 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:16:36 localhost ovn_controller[157396]: 2025-10-14T10:16:36Z|00181|binding|INFO|Claiming lport 9f966a39-b1ce-460f-a5a4-a5c5f80bc734 for this chassis. Oct 14 06:16:36 localhost ovn_controller[157396]: 2025-10-14T10:16:36Z|00182|binding|INFO|9f966a39-b1ce-460f-a5a4-a5c5f80bc734: Claiming unknown Oct 14 06:16:36 localhost systemd-udevd[331911]: Network interface NamePolicy= disabled on kernel command line. 
Oct 14 06:16:36 localhost ovn_metadata_agent[163050]: 2025-10-14 10:16:36.823 163055 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005486733.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::3/64', 'neutron:device_id': 'dhcpe1893814-19d7-5b97-af05-19e2aafa5382-9387e2f1-60ce-4498-8771-400624ac9c85', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9387e2f1-60ce-4498-8771-400624ac9c85', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8ca5e1d577fe463aa89a13e320c6dd5f', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=77ece7cd-4593-4f08-abb1-3cdeb3872c1c, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=9f966a39-b1ce-460f-a5a4-a5c5f80bc734) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Oct 14 06:16:36 localhost ovn_metadata_agent[163050]: 2025-10-14 10:16:36.824 163055 INFO neutron.agent.ovn.metadata.agent [-] Port 9f966a39-b1ce-460f-a5a4-a5c5f80bc734 in datapath 9387e2f1-60ce-4498-8771-400624ac9c85 bound to our chassis#033[00m Oct 14 06:16:36 localhost ovn_metadata_agent[163050]: 2025-10-14 10:16:36.828 163055 DEBUG neutron.agent.ovn.metadata.agent [-] Port 00c9b832-2a59-4c07-b01d-7030f0103d88 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips 
/usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Oct 14 06:16:36 localhost ovn_metadata_agent[163050]: 2025-10-14 10:16:36.828 163055 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 9387e2f1-60ce-4498-8771-400624ac9c85, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Oct 14 06:16:36 localhost ovn_metadata_agent[163050]: 2025-10-14 10:16:36.829 163159 DEBUG oslo.privsep.daemon [-] privsep: reply[4d32cbb9-0ba8-4662-85a2-f3c34bd8ad78]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Oct 14 06:16:36 localhost nova_compute[297686]: 2025-10-14 10:16:36.833 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:16:36 localhost ovn_controller[157396]: 2025-10-14T10:16:36Z|00183|binding|INFO|Setting lport 9f966a39-b1ce-460f-a5a4-a5c5f80bc734 ovn-installed in OVS Oct 14 06:16:36 localhost ovn_controller[157396]: 2025-10-14T10:16:36Z|00184|binding|INFO|Setting lport 9f966a39-b1ce-460f-a5a4-a5c5f80bc734 up in Southbound Oct 14 06:16:36 localhost nova_compute[297686]: 2025-10-14 10:16:36.857 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:16:36 localhost nova_compute[297686]: 2025-10-14 10:16:36.893 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:16:36 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Oct 14 06:16:36 localhost ceph-mon[317114]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.108:0/467890996' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Oct 14 06:16:36 localhost nova_compute[297686]: 2025-10-14 10:16:36.922 2 DEBUG oslo_concurrency.processutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.454s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Oct 14 06:16:36 localhost nova_compute[297686]: 2025-10-14 10:16:36.925 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:16:36 localhost nova_compute[297686]: 2025-10-14 10:16:36.930 2 DEBUG nova.compute.provider_tree [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Inventory has not changed in ProviderTree for provider: 18c24273-aca2-4f08-be57-3188d558235e update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Oct 14 06:16:36 localhost nova_compute[297686]: 2025-10-14 10:16:36.948 2 DEBUG nova.scheduler.client.report [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Inventory has not changed for provider 18c24273-aca2-4f08-be57-3188d558235e based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Oct 14 06:16:36 localhost nova_compute[297686]: 2025-10-14 10:16:36.951 2 DEBUG nova.compute.resource_tracker [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Compute_service record updated for np0005486733.localdomain:np0005486733.localdomain 
_update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Oct 14 06:16:36 localhost nova_compute[297686]: 2025-10-14 10:16:36.952 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.729s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Oct 14 06:16:36 localhost nova_compute[297686]: 2025-10-14 10:16:36.953 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:16:36 localhost nova_compute[297686]: 2025-10-14 10:16:36.953 2 DEBUG nova.compute.manager [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m Oct 14 06:16:36 localhost nova_compute[297686]: 2025-10-14 10:16:36.974 2 DEBUG nova.compute.manager [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m Oct 14 06:16:37 localhost ovn_metadata_agent[163050]: 2025-10-14 10:16:37.492 163055 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port 00c9b832-2a59-4c07-b01d-7030f0103d88 with type ""#033[00m Oct 14 06:16:37 localhost ovn_metadata_agent[163050]: 2025-10-14 10:16:37.493 163055 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 
'np0005486733.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::3/64', 'neutron:device_id': 'dhcpe1893814-19d7-5b97-af05-19e2aafa5382-9387e2f1-60ce-4498-8771-400624ac9c85', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9387e2f1-60ce-4498-8771-400624ac9c85', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8ca5e1d577fe463aa89a13e320c6dd5f', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=77ece7cd-4593-4f08-abb1-3cdeb3872c1c, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=9f966a39-b1ce-460f-a5a4-a5c5f80bc734) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Oct 14 06:16:37 localhost ovn_controller[157396]: 2025-10-14T10:16:37Z|00185|binding|INFO|Removing iface tap9f966a39-b1 ovn-installed in OVS Oct 14 06:16:37 localhost ovn_controller[157396]: 2025-10-14T10:16:37Z|00186|binding|INFO|Removing lport 9f966a39-b1ce-460f-a5a4-a5c5f80bc734 ovn-installed in OVS Oct 14 06:16:37 localhost nova_compute[297686]: 2025-10-14 10:16:37.494 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:16:37 localhost ovn_metadata_agent[163050]: 2025-10-14 10:16:37.497 163055 INFO neutron.agent.ovn.metadata.agent [-] Port 9f966a39-b1ce-460f-a5a4-a5c5f80bc734 in datapath 9387e2f1-60ce-4498-8771-400624ac9c85 unbound from our chassis#033[00m Oct 14 06:16:37 localhost nova_compute[297686]: 2025-10-14 10:16:37.501 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:16:37 
localhost ovn_metadata_agent[163050]: 2025-10-14 10:16:37.502 163055 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 9387e2f1-60ce-4498-8771-400624ac9c85, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Oct 14 06:16:37 localhost ovn_metadata_agent[163050]: 2025-10-14 10:16:37.503 163159 DEBUG oslo.privsep.daemon [-] privsep: reply[826be9df-bd0d-4396-9240-5f76d38f8a16]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Oct 14 06:16:37 localhost dnsmasq[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/addn_hosts - 2 addresses Oct 14 06:16:37 localhost dnsmasq-dhcp[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/host Oct 14 06:16:37 localhost podman[331963]: 2025-10-14 10:16:37.634940992 +0000 UTC m=+0.053591351 container kill 27e23fba1f10c3118d508631d350793857acd97085c399f85eed727cd1eab7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0145816-4627-44f2-af00-ccc9ef0436ed, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3) Oct 14 06:16:37 localhost dnsmasq-dhcp[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/opts Oct 14 06:16:37 localhost podman[332004]: Oct 14 06:16:37 localhost podman[332004]: 2025-10-14 10:16:37.798945641 +0000 UTC m=+0.063667352 container create 4d228318ec3bfe122d1edde9cdc72400eba23f5c02a98b582bb727a855709243 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9387e2f1-60ce-4498-8771-400624ac9c85, 
org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251009) Oct 14 06:16:37 localhost systemd[1]: Started libpod-conmon-4d228318ec3bfe122d1edde9cdc72400eba23f5c02a98b582bb727a855709243.scope. Oct 14 06:16:37 localhost systemd[1]: Started libcrun container. Oct 14 06:16:37 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/324f84cb74d5345345b15f55868efb50ed6b856ae5ff40d523684dc453598c68/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Oct 14 06:16:37 localhost podman[332004]: 2025-10-14 10:16:37.868421703 +0000 UTC m=+0.133143424 container init 4d228318ec3bfe122d1edde9cdc72400eba23f5c02a98b582bb727a855709243 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9387e2f1-60ce-4498-8771-400624ac9c85, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true) Oct 14 06:16:37 localhost podman[332004]: 2025-10-14 10:16:37.769701876 +0000 UTC m=+0.034423647 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Oct 14 06:16:37 localhost podman[332004]: 2025-10-14 10:16:37.878417223 +0000 UTC m=+0.143138974 container start 4d228318ec3bfe122d1edde9cdc72400eba23f5c02a98b582bb727a855709243 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9387e2f1-60ce-4498-8771-400624ac9c85, org.label-schema.vendor=CentOS, 
org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251009, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, io.buildah.version=1.41.3) Oct 14 06:16:37 localhost dnsmasq[332025]: started, version 2.85 cachesize 150 Oct 14 06:16:37 localhost dnsmasq[332025]: DNS service limited to local subnets Oct 14 06:16:37 localhost dnsmasq[332025]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Oct 14 06:16:37 localhost dnsmasq[332025]: warning: no upstream servers configured Oct 14 06:16:37 localhost dnsmasq-dhcp[332025]: DHCPv6, static leases only on 2001:db8::, lease time 1d Oct 14 06:16:37 localhost dnsmasq[332025]: read /var/lib/neutron/dhcp/9387e2f1-60ce-4498-8771-400624ac9c85/addn_hosts - 0 addresses Oct 14 06:16:37 localhost dnsmasq-dhcp[332025]: read /var/lib/neutron/dhcp/9387e2f1-60ce-4498-8771-400624ac9c85/host Oct 14 06:16:37 localhost dnsmasq-dhcp[332025]: read /var/lib/neutron/dhcp/9387e2f1-60ce-4498-8771-400624ac9c85/opts Oct 14 06:16:37 localhost nova_compute[297686]: 2025-10-14 10:16:37.975 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:16:37 localhost nova_compute[297686]: 2025-10-14 10:16:37.976 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:16:37 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:16:37.995 271987 INFO neutron.agent.dhcp.agent [None 
req-a5f50197-b851-4997-a906-9aee59692389 - - - - - -] DHCP configuration for ports {'e104dde8-c385-4c99-921b-9f029d33137a'} is completed#033[00m Oct 14 06:16:38 localhost dnsmasq[332025]: read /var/lib/neutron/dhcp/9387e2f1-60ce-4498-8771-400624ac9c85/addn_hosts - 0 addresses Oct 14 06:16:38 localhost dnsmasq-dhcp[332025]: read /var/lib/neutron/dhcp/9387e2f1-60ce-4498-8771-400624ac9c85/host Oct 14 06:16:38 localhost dnsmasq-dhcp[332025]: read /var/lib/neutron/dhcp/9387e2f1-60ce-4498-8771-400624ac9c85/opts Oct 14 06:16:38 localhost podman[332041]: 2025-10-14 10:16:38.188631951 +0000 UTC m=+0.055662854 container kill 4d228318ec3bfe122d1edde9cdc72400eba23f5c02a98b582bb727a855709243 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9387e2f1-60ce-4498-8771-400624ac9c85, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251009, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, io.buildah.version=1.41.3) Oct 14 06:16:38 localhost ovn_controller[157396]: 2025-10-14T10:16:38Z|00187|binding|INFO|Releasing lport 25c6586a-239c-451b-aac2-e0a3ee5c3145 from this chassis (sb_readonly=0) Oct 14 06:16:38 localhost nova_compute[297686]: 2025-10-14 10:16:38.505 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:16:38 localhost dnsmasq[332025]: exiting on receipt of SIGTERM Oct 14 06:16:38 localhost podman[332076]: 2025-10-14 10:16:38.594296686 +0000 UTC m=+0.066632535 container kill 4d228318ec3bfe122d1edde9cdc72400eba23f5c02a98b582bb727a855709243 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9387e2f1-60ce-4498-8771-400624ac9c85, org.label-schema.schema-version=1.0, 
tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS) Oct 14 06:16:38 localhost systemd[1]: libpod-4d228318ec3bfe122d1edde9cdc72400eba23f5c02a98b582bb727a855709243.scope: Deactivated successfully. Oct 14 06:16:38 localhost nova_compute[297686]: 2025-10-14 10:16:38.650 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:16:38 localhost podman[332092]: 2025-10-14 10:16:38.673968344 +0000 UTC m=+0.057680878 container died 4d228318ec3bfe122d1edde9cdc72400eba23f5c02a98b582bb727a855709243 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9387e2f1-60ce-4498-8771-400624ac9c85, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3) Oct 14 06:16:38 localhost podman[332092]: 2025-10-14 10:16:38.720192085 +0000 UTC m=+0.103904579 container remove 4d228318ec3bfe122d1edde9cdc72400eba23f5c02a98b582bb727a855709243 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9387e2f1-60ce-4498-8771-400624ac9c85, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS) Oct 14 06:16:38 localhost 
systemd[1]: libpod-conmon-4d228318ec3bfe122d1edde9cdc72400eba23f5c02a98b582bb727a855709243.scope: Deactivated successfully. Oct 14 06:16:38 localhost nova_compute[297686]: 2025-10-14 10:16:38.732 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:16:38 localhost kernel: device tap9f966a39-b1 left promiscuous mode Oct 14 06:16:38 localhost nova_compute[297686]: 2025-10-14 10:16:38.748 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:16:38 localhost openstack_network_exporter[250374]: ERROR 10:16:38 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Oct 14 06:16:38 localhost openstack_network_exporter[250374]: ERROR 10:16:38 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Oct 14 06:16:38 localhost openstack_network_exporter[250374]: ERROR 10:16:38 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Oct 14 06:16:38 localhost openstack_network_exporter[250374]: ERROR 10:16:38 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Oct 14 06:16:38 localhost openstack_network_exporter[250374]: Oct 14 06:16:38 localhost openstack_network_exporter[250374]: ERROR 10:16:38 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Oct 14 06:16:38 localhost openstack_network_exporter[250374]: Oct 14 06:16:38 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:16:38.784 271987 INFO neutron.agent.dhcp.agent [None req-4a1a517a-e990-4e62-8ebe-998ed6ce6811 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Oct 14 06:16:38 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:16:38.785 271987 INFO neutron.agent.dhcp.agent [None req-4a1a517a-e990-4e62-8ebe-998ed6ce6811 - - - - 
- -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Oct 14 06:16:38 localhost systemd[1]: var-lib-containers-storage-overlay-324f84cb74d5345345b15f55868efb50ed6b856ae5ff40d523684dc453598c68-merged.mount: Deactivated successfully. Oct 14 06:16:38 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-4d228318ec3bfe122d1edde9cdc72400eba23f5c02a98b582bb727a855709243-userdata-shm.mount: Deactivated successfully. Oct 14 06:16:38 localhost systemd[1]: run-netns-qdhcp\x2d9387e2f1\x2d60ce\x2d4498\x2d8771\x2d400624ac9c85.mount: Deactivated successfully. Oct 14 06:16:38 localhost ovn_metadata_agent[163050]: 2025-10-14 10:16:38.850 163055 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:63:b4:89 2001:db8::f816:3eff:fe63:b489'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe63:b489/64', 'neutron:device_id': 'ovnmeta-74049e43-4aa7-4318-9233-a58980c3495b', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-74049e43-4aa7-4318-9233-a58980c3495b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '82fc7afce38344ffb7eda3bb0fabdb5b', 'neutron:revision_number': '18', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0f1f1366-6307-4914-922e-2b4f9757811b, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=bb90059a-750e-43da-ba16-03b3dce8c155) old=Port_Binding(mac=['fa:16:3e:63:b4:89 10.100.0.2 
2001:db8::f816:3eff:fe63:b489'], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fe63:b489/64', 'neutron:device_id': 'ovnmeta-74049e43-4aa7-4318-9233-a58980c3495b', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-74049e43-4aa7-4318-9233-a58980c3495b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '82fc7afce38344ffb7eda3bb0fabdb5b', 'neutron:revision_number': '15', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Oct 14 06:16:38 localhost ovn_metadata_agent[163050]: 2025-10-14 10:16:38.851 163055 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port bb90059a-750e-43da-ba16-03b3dce8c155 in datapath 74049e43-4aa7-4318-9233-a58980c3495b updated#033[00m Oct 14 06:16:38 localhost ovn_metadata_agent[163050]: 2025-10-14 10:16:38.854 163055 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 74049e43-4aa7-4318-9233-a58980c3495b, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Oct 14 06:16:38 localhost ovn_metadata_agent[163050]: 2025-10-14 10:16:38.855 163159 DEBUG oslo.privsep.daemon [-] privsep: reply[399879de-bd8b-4fca-a8dc-fa4a5710fa43]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Oct 14 06:16:38 localhost ovn_controller[157396]: 2025-10-14T10:16:38Z|00188|binding|INFO|Releasing lport 25c6586a-239c-451b-aac2-e0a3ee5c3145 from this chassis (sb_readonly=0) Oct 14 06:16:38 localhost nova_compute[297686]: 2025-10-14 10:16:38.951 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:16:39 localhost ceph-mon[317114]: 
mon.np0005486733@2(peon).osd e133 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Oct 14 06:16:40 localhost nova_compute[297686]: 2025-10-14 10:16:40.291 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:16:40 localhost nova_compute[297686]: 2025-10-14 10:16:40.335 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:16:40 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:16:40.753 271987 INFO neutron.agent.linux.ip_lib [None req-80fb0eb3-52ee-4d7d-b7f1-81fbf20d7869 - - - - - -] Device tap12b93085-f2 cannot be used as it has no MAC address#033[00m Oct 14 06:16:40 localhost nova_compute[297686]: 2025-10-14 10:16:40.772 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:16:40 localhost kernel: device tap12b93085-f2 entered promiscuous mode Oct 14 06:16:40 localhost NetworkManager[5977]: [1760437000.7804] manager: (tap12b93085-f2): new Generic device (/org/freedesktop/NetworkManager/Devices/35) Oct 14 06:16:40 localhost ovn_controller[157396]: 2025-10-14T10:16:40Z|00189|binding|INFO|Claiming lport 12b93085-f28b-40a4-a1de-e1b0bdcbf988 for this chassis. Oct 14 06:16:40 localhost nova_compute[297686]: 2025-10-14 10:16:40.783 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:16:40 localhost ovn_controller[157396]: 2025-10-14T10:16:40Z|00190|binding|INFO|12b93085-f28b-40a4-a1de-e1b0bdcbf988: Claiming unknown Oct 14 06:16:40 localhost systemd-udevd[332148]: Network interface NamePolicy= disabled on kernel command line. 
Oct 14 06:16:40 localhost ovn_metadata_agent[163050]: 2025-10-14 10:16:40.793 163055 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005486733.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::1/64', 'neutron:device_id': 'dhcpe1893814-19d7-5b97-af05-19e2aafa5382-85199d23-7ccc-452a-9609-9618b87eb30e', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-85199d23-7ccc-452a-9609-9618b87eb30e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c1b284821e574367bb6352caf7327da5', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=630d99f6-7d2f-4678-9659-1c47823048c7, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=12b93085-f28b-40a4-a1de-e1b0bdcbf988) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Oct 14 06:16:40 localhost ovn_metadata_agent[163050]: 2025-10-14 10:16:40.795 163055 INFO neutron.agent.ovn.metadata.agent [-] Port 12b93085-f28b-40a4-a1de-e1b0bdcbf988 in datapath 85199d23-7ccc-452a-9609-9618b87eb30e bound to our chassis#033[00m Oct 14 06:16:40 localhost ovn_metadata_agent[163050]: 2025-10-14 10:16:40.796 163055 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 85199d23-7ccc-452a-9609-9618b87eb30e or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params 
/usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Oct 14 06:16:40 localhost ovn_metadata_agent[163050]: 2025-10-14 10:16:40.797 163159 DEBUG oslo.privsep.daemon [-] privsep: reply[c1f9fc22-ac12-4eb1-a6c6-1ba5b6fb37e3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Oct 14 06:16:40 localhost ovn_controller[157396]: 2025-10-14T10:16:40Z|00191|binding|INFO|Setting lport 12b93085-f28b-40a4-a1de-e1b0bdcbf988 ovn-installed in OVS Oct 14 06:16:40 localhost ovn_controller[157396]: 2025-10-14T10:16:40Z|00192|binding|INFO|Setting lport 12b93085-f28b-40a4-a1de-e1b0bdcbf988 up in Southbound Oct 14 06:16:40 localhost nova_compute[297686]: 2025-10-14 10:16:40.804 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:16:40 localhost nova_compute[297686]: 2025-10-14 10:16:40.816 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:16:40 localhost nova_compute[297686]: 2025-10-14 10:16:40.819 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:16:40 localhost journal[237477]: ethtool ioctl error on tap12b93085-f2: No such device Oct 14 06:16:40 localhost journal[237477]: ethtool ioctl error on tap12b93085-f2: No such device Oct 14 06:16:40 localhost journal[237477]: ethtool ioctl error on tap12b93085-f2: No such device Oct 14 06:16:40 localhost journal[237477]: ethtool ioctl error on tap12b93085-f2: No such device Oct 14 06:16:40 localhost journal[237477]: ethtool ioctl error on tap12b93085-f2: No such device Oct 14 06:16:40 localhost journal[237477]: ethtool ioctl error on tap12b93085-f2: No such device Oct 14 06:16:40 localhost journal[237477]: ethtool ioctl error on tap12b93085-f2: No such device Oct 14 06:16:40 localhost journal[237477]: 
ethtool ioctl error on tap12b93085-f2: No such device Oct 14 06:16:40 localhost nova_compute[297686]: 2025-10-14 10:16:40.863 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:16:40 localhost dnsmasq[331101]: read /var/lib/neutron/dhcp/61834488-97da-47c3-8374-b9fea3c2b7e5/addn_hosts - 0 addresses Oct 14 06:16:40 localhost dnsmasq-dhcp[331101]: read /var/lib/neutron/dhcp/61834488-97da-47c3-8374-b9fea3c2b7e5/host Oct 14 06:16:40 localhost podman[332140]: 2025-10-14 10:16:40.86646036 +0000 UTC m=+0.087711857 container kill b4f79c6c0696cadc08af71d02cf1da3031c5b335900e291967d1a4f682c13113 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-61834488-97da-47c3-8374-b9fea3c2b7e5, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true) Oct 14 06:16:40 localhost dnsmasq-dhcp[331101]: read /var/lib/neutron/dhcp/61834488-97da-47c3-8374-b9fea3c2b7e5/opts Oct 14 06:16:40 localhost nova_compute[297686]: 2025-10-14 10:16:40.892 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:16:40 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:16:40.987 271987 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-10-14T10:16:40Z, description=, device_id=98f2bc2f-f96b-4fd5-9cfe-13c9bcec3a8c, device_owner=network:router_gateway, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], 
fixed_ips=[], id=6f4933ef-d79d-4c6a-ba4c-54777c1ddb39, ip_allocation=immediate, mac_address=fa:16:3e:e3:31:4b, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-10-14T08:36:38Z, description=, dns_domain=, id=c0145816-4627-44f2-af00-ccc9ef0436ed, ipv4_address_scope=None, ipv6_address_scope=None, is_default=False, l2_adjacency=True, mtu=1350, name=public, port_security_enabled=True, project_id=41187b090f3d4818a32baa37ce8a3991, provider:network_type=flat, provider:physical_network=datacentre, provider:segmentation_id=None, qos_policy_id=None, revision_number=2, router:external=True, shared=False, standard_attr_id=29, status=ACTIVE, subnets=['bbd40dc1-97d8-4ed3-9d4f-9b3af758b526'], tags=[], tenant_id=41187b090f3d4818a32baa37ce8a3991, updated_at=2025-10-14T08:36:44Z, vlan_transparent=None, network_id=c0145816-4627-44f2-af00-ccc9ef0436ed, port_security_enabled=False, project_id=, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2134, status=DOWN, tags=[], tenant_id=, updated_at=2025-10-14T10:16:40Z on network c0145816-4627-44f2-af00-ccc9ef0436ed#033[00m Oct 14 06:16:41 localhost neutron_sriov_agent[264974]: 2025-10-14 10:16:41.021 2 INFO neutron.agent.securitygroups_rpc [None req-0a5df930-2184-4323-aca9-39b249cdad65 73c3910059834cd0998d3459c50cd69d 82fc7afce38344ffb7eda3bb0fabdb5b - - default default] Security group member updated ['10f25aec-a6f2-40dd-837d-8812e1c0ebb8']#033[00m Oct 14 06:16:41 localhost ovn_controller[157396]: 2025-10-14T10:16:41Z|00193|binding|INFO|Releasing lport 7a9cfa63-0bc5-4880-b1be-364e29d25876 from this chassis (sb_readonly=0) Oct 14 06:16:41 localhost ovn_controller[157396]: 2025-10-14T10:16:41Z|00194|binding|INFO|Setting lport 7a9cfa63-0bc5-4880-b1be-364e29d25876 down in Southbound Oct 14 06:16:41 localhost kernel: device tap7a9cfa63-0b left promiscuous mode Oct 14 06:16:41 localhost nova_compute[297686]: 
2025-10-14 10:16:41.030 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:16:41 localhost ovn_metadata_agent[163050]: 2025-10-14 10:16:41.042 163055 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005486733.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.102.0.2/28', 'neutron:device_id': 'dhcpe1893814-19d7-5b97-af05-19e2aafa5382-61834488-97da-47c3-8374-b9fea3c2b7e5', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-61834488-97da-47c3-8374-b9fea3c2b7e5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7bf1be3a6a454996a4414fad306906f1', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005486733.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6b1ab247-44e5-4002-a4db-e97bbb68024f, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=7a9cfa63-0bc5-4880-b1be-364e29d25876) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Oct 14 06:16:41 localhost ovn_metadata_agent[163050]: 2025-10-14 10:16:41.043 163055 INFO neutron.agent.ovn.metadata.agent [-] Port 7a9cfa63-0bc5-4880-b1be-364e29d25876 in datapath 61834488-97da-47c3-8374-b9fea3c2b7e5 unbound from our chassis#033[00m Oct 14 06:16:41 localhost ovn_metadata_agent[163050]: 2025-10-14 10:16:41.045 163055 DEBUG 
neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 61834488-97da-47c3-8374-b9fea3c2b7e5, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Oct 14 06:16:41 localhost ovn_metadata_agent[163050]: 2025-10-14 10:16:41.046 163159 DEBUG oslo.privsep.daemon [-] privsep: reply[99df8ff3-e675-4fee-a5cd-74f694e3ae72]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Oct 14 06:16:41 localhost nova_compute[297686]: 2025-10-14 10:16:41.049 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:16:41 localhost dnsmasq[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/addn_hosts - 3 addresses Oct 14 06:16:41 localhost dnsmasq-dhcp[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/host Oct 14 06:16:41 localhost dnsmasq-dhcp[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/opts Oct 14 06:16:41 localhost podman[332219]: 2025-10-14 10:16:41.164714899 +0000 UTC m=+0.034680986 container kill 27e23fba1f10c3118d508631d350793857acd97085c399f85eed727cd1eab7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0145816-4627-44f2-af00-ccc9ef0436ed, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251009) Oct 14 06:16:41 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:16:41.463 271987 INFO neutron.agent.dhcp.agent [None req-49ba85f9-8cfd-4dd0-a893-41f1bf5941fa - - - - - -] DHCP configuration for ports {'6f4933ef-d79d-4c6a-ba4c-54777c1ddb39'} is 
completed#033[00m Oct 14 06:16:41 localhost dnsmasq[331101]: exiting on receipt of SIGTERM Oct 14 06:16:41 localhost systemd[1]: tmp-crun.YUnkS8.mount: Deactivated successfully. Oct 14 06:16:41 localhost podman[332283]: 2025-10-14 10:16:41.742438632 +0000 UTC m=+0.109442481 container kill b4f79c6c0696cadc08af71d02cf1da3031c5b335900e291967d1a4f682c13113 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-61834488-97da-47c3-8374-b9fea3c2b7e5, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0) Oct 14 06:16:41 localhost systemd[1]: libpod-b4f79c6c0696cadc08af71d02cf1da3031c5b335900e291967d1a4f682c13113.scope: Deactivated successfully. Oct 14 06:16:41 localhost podman[332299]: Oct 14 06:16:41 localhost podman[332299]: 2025-10-14 10:16:41.783023629 +0000 UTC m=+0.087155311 container create f03446df9777f406f2e5d4bc98c12b7c550868daff46bfa320cbb8eed3aaf9c8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-85199d23-7ccc-452a-9609-9618b87eb30e, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0) Oct 14 06:16:41 localhost podman[332314]: 2025-10-14 10:16:41.813523593 +0000 UTC m=+0.050821025 container died b4f79c6c0696cadc08af71d02cf1da3031c5b335900e291967d1a4f682c13113 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, 
name=neutron-dnsmasq-qdhcp-61834488-97da-47c3-8374-b9fea3c2b7e5, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image) Oct 14 06:16:41 localhost systemd[1]: Started libpod-conmon-f03446df9777f406f2e5d4bc98c12b7c550868daff46bfa320cbb8eed3aaf9c8.scope. Oct 14 06:16:41 localhost podman[332299]: 2025-10-14 10:16:41.728116299 +0000 UTC m=+0.032248011 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Oct 14 06:16:41 localhost systemd[1]: Started libcrun container. Oct 14 06:16:41 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/aa2718c439f8e7de4e63295d40210892869846b702c5ff447ce1ded7f6aec563/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Oct 14 06:16:41 localhost podman[332314]: 2025-10-14 10:16:41.857774464 +0000 UTC m=+0.095071846 container remove b4f79c6c0696cadc08af71d02cf1da3031c5b335900e291967d1a4f682c13113 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-61834488-97da-47c3-8374-b9fea3c2b7e5, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image) Oct 14 06:16:41 localhost systemd[1]: libpod-conmon-b4f79c6c0696cadc08af71d02cf1da3031c5b335900e291967d1a4f682c13113.scope: Deactivated successfully. 
Oct 14 06:16:41 localhost podman[332299]: 2025-10-14 10:16:41.900317432 +0000 UTC m=+0.204449064 container init f03446df9777f406f2e5d4bc98c12b7c550868daff46bfa320cbb8eed3aaf9c8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-85199d23-7ccc-452a-9609-9618b87eb30e, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251009, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS) Oct 14 06:16:41 localhost podman[332299]: 2025-10-14 10:16:41.906324398 +0000 UTC m=+0.210456030 container start f03446df9777f406f2e5d4bc98c12b7c550868daff46bfa320cbb8eed3aaf9c8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-85199d23-7ccc-452a-9609-9618b87eb30e, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5) Oct 14 06:16:41 localhost dnsmasq[332346]: started, version 2.85 cachesize 150 Oct 14 06:16:41 localhost dnsmasq[332346]: DNS service limited to local subnets Oct 14 06:16:41 localhost dnsmasq[332346]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Oct 14 06:16:41 localhost dnsmasq[332346]: warning: no upstream servers configured Oct 14 06:16:41 localhost dnsmasq-dhcp[332346]: DHCPv6, static leases only on 2001:db8::, lease time 1d Oct 14 06:16:41 localhost dnsmasq[332346]: read /var/lib/neutron/dhcp/85199d23-7ccc-452a-9609-9618b87eb30e/addn_hosts - 0 
addresses Oct 14 06:16:41 localhost dnsmasq-dhcp[332346]: read /var/lib/neutron/dhcp/85199d23-7ccc-452a-9609-9618b87eb30e/host Oct 14 06:16:41 localhost dnsmasq-dhcp[332346]: read /var/lib/neutron/dhcp/85199d23-7ccc-452a-9609-9618b87eb30e/opts Oct 14 06:16:42 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:16:42.095 271987 INFO neutron.agent.dhcp.agent [None req-8fd52e11-f2be-4fc3-b598-20b4241fddb1 - - - - - -] DHCP configuration for ports {'54f8b41b-aa9a-4a03-a96f-1866dd5b9ff7'} is completed#033[00m Oct 14 06:16:42 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:16:42.152 271987 INFO neutron.agent.dhcp.agent [None req-6c3c8ddd-c59d-471c-b560-2b77cc989472 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Oct 14 06:16:42 localhost dnsmasq[332346]: exiting on receipt of SIGTERM Oct 14 06:16:42 localhost podman[332362]: 2025-10-14 10:16:42.215536466 +0000 UTC m=+0.033057975 container kill f03446df9777f406f2e5d4bc98c12b7c550868daff46bfa320cbb8eed3aaf9c8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-85199d23-7ccc-452a-9609-9618b87eb30e, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5) Oct 14 06:16:42 localhost systemd[1]: libpod-f03446df9777f406f2e5d4bc98c12b7c550868daff46bfa320cbb8eed3aaf9c8.scope: Deactivated successfully. 
Oct 14 06:16:42 localhost podman[332377]: 2025-10-14 10:16:42.264990818 +0000 UTC m=+0.033121277 container died f03446df9777f406f2e5d4bc98c12b7c550868daff46bfa320cbb8eed3aaf9c8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-85199d23-7ccc-452a-9609-9618b87eb30e, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5) Oct 14 06:16:42 localhost neutron_sriov_agent[264974]: 2025-10-14 10:16:42.281 2 INFO neutron.agent.securitygroups_rpc [None req-5c76f8fb-dc30-4efd-ac5a-17d64524fe3e 73c3910059834cd0998d3459c50cd69d 82fc7afce38344ffb7eda3bb0fabdb5b - - default default] Security group member updated ['10f25aec-a6f2-40dd-837d-8812e1c0ebb8']#033[00m Oct 14 06:16:42 localhost podman[332377]: 2025-10-14 10:16:42.302133718 +0000 UTC m=+0.070264157 container remove f03446df9777f406f2e5d4bc98c12b7c550868daff46bfa320cbb8eed3aaf9c8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-85199d23-7ccc-452a-9609-9618b87eb30e, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2) Oct 14 06:16:42 localhost ovn_controller[157396]: 2025-10-14T10:16:42Z|00195|binding|INFO|Releasing lport 12b93085-f28b-40a4-a1de-e1b0bdcbf988 from this chassis (sb_readonly=0) Oct 14 06:16:42 localhost ovn_controller[157396]: 2025-10-14T10:16:42Z|00196|binding|INFO|Setting lport 12b93085-f28b-40a4-a1de-e1b0bdcbf988 down in Southbound Oct 14 
06:16:42 localhost kernel: device tap12b93085-f2 left promiscuous mode Oct 14 06:16:42 localhost nova_compute[297686]: 2025-10-14 10:16:42.316 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:16:42 localhost nova_compute[297686]: 2025-10-14 10:16:42.327 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:16:42 localhost ovn_metadata_agent[163050]: 2025-10-14 10:16:42.330 163055 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005486733.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'dhcpe1893814-19d7-5b97-af05-19e2aafa5382-85199d23-7ccc-452a-9609-9618b87eb30e', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-85199d23-7ccc-452a-9609-9618b87eb30e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c1b284821e574367bb6352caf7327da5', 'neutron:revision_number': '4', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005486733.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=630d99f6-7d2f-4678-9659-1c47823048c7, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=12b93085-f28b-40a4-a1de-e1b0bdcbf988) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Oct 14 06:16:42 localhost 
ovn_metadata_agent[163050]: 2025-10-14 10:16:42.331 163055 INFO neutron.agent.ovn.metadata.agent [-] Port 12b93085-f28b-40a4-a1de-e1b0bdcbf988 in datapath 85199d23-7ccc-452a-9609-9618b87eb30e unbound from our chassis#033[00m Oct 14 06:16:42 localhost ovn_metadata_agent[163050]: 2025-10-14 10:16:42.333 163055 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 85199d23-7ccc-452a-9609-9618b87eb30e or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Oct 14 06:16:42 localhost ovn_metadata_agent[163050]: 2025-10-14 10:16:42.333 163159 DEBUG oslo.privsep.daemon [-] privsep: reply[8f560de2-3faf-4f19-a39f-b0554374272b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Oct 14 06:16:42 localhost systemd[1]: libpod-conmon-f03446df9777f406f2e5d4bc98c12b7c550868daff46bfa320cbb8eed3aaf9c8.scope: Deactivated successfully. Oct 14 06:16:42 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:16:42.396 271987 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Oct 14 06:16:42 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:16:42.600 271987 INFO neutron.agent.dhcp.agent [None req-cc113db2-d32e-499b-9c51-9cf27a72fb13 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Oct 14 06:16:42 localhost systemd[1]: var-lib-containers-storage-overlay-aa2718c439f8e7de4e63295d40210892869846b702c5ff447ce1ded7f6aec563-merged.mount: Deactivated successfully. Oct 14 06:16:42 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f03446df9777f406f2e5d4bc98c12b7c550868daff46bfa320cbb8eed3aaf9c8-userdata-shm.mount: Deactivated successfully. Oct 14 06:16:42 localhost systemd[1]: run-netns-qdhcp\x2d85199d23\x2d7ccc\x2d452a\x2d9609\x2d9618b87eb30e.mount: Deactivated successfully. 
Oct 14 06:16:42 localhost systemd[1]: var-lib-containers-storage-overlay-0b5b7fe8b1bc557af7c5bfe195bb44ff3a8bacccf93d39f1e928d229dad49f0d-merged.mount: Deactivated successfully.
Oct 14 06:16:42 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b4f79c6c0696cadc08af71d02cf1da3031c5b335900e291967d1a4f682c13113-userdata-shm.mount: Deactivated successfully.
Oct 14 06:16:42 localhost systemd[1]: run-netns-qdhcp\x2d61834488\x2d97da\x2d47c3\x2d8374\x2db9fea3c2b7e5.mount: Deactivated successfully.
Oct 14 06:16:43 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:16:43.078 271987 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m
Oct 14 06:16:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f.
Oct 14 06:16:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917.
Oct 14 06:16:43 localhost nova_compute[297686]: 2025-10-14 10:16:43.684 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 14 06:16:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d.
Oct 14 06:16:43 localhost podman[332406]: 2025-10-14 10:16:43.741280503 +0000 UTC m=+0.082956881 container health_status 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_managed=true, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2) Oct 14 06:16:43 localhost podman[332406]: 2025-10-14 10:16:43.811248319 +0000 UTC m=+0.152924647 container exec_died 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.vendor=CentOS, config_id=ovn_controller, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2) Oct 14 06:16:43 localhost systemd[1]: 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f.service: Deactivated successfully. Oct 14 06:16:43 localhost podman[332418]: 2025-10-14 10:16:43.825815471 +0000 UTC m=+0.134450105 container health_status 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, version=9.6, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, release=1755695350, com.redhat.component=ubi9-minimal-container, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public) Oct 14 06:16:43 localhost podman[332418]: 2025-10-14 
10:16:43.909157642 +0000 UTC m=+0.217792366 container exec_died 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, release=1755695350, version=9.6, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b) Oct 14 06:16:43 localhost systemd[1]: 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917.service: Deactivated successfully. 
Oct 14 06:16:43 localhost podman[332419]: 2025-10-14 10:16:43.996053843 +0000 UTC m=+0.295248855 container health_status 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, tcib_managed=true, org.label-schema.vendor=CentOS, config_id=edpm, managed_by=edpm_ansible, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d) Oct 14 06:16:44 localhost podman[332419]: 2025-10-14 10:16:44.007632343 +0000 UTC m=+0.306827345 container exec_died 
89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm) Oct 14 06:16:44 localhost systemd[1]: 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d.service: Deactivated successfully. 
Oct 14 06:16:44 localhost systemd-journald[47488]: Data hash table of /run/log/journal/8e1d5208cffec42b50976967e1d1cfd0/system.journal has a fill level at 75.0 (53724 of 71630 items, 25165824 file size, 468 bytes per hash table item), suggesting rotation. Oct 14 06:16:44 localhost systemd-journald[47488]: /run/log/journal/8e1d5208cffec42b50976967e1d1cfd0/system.journal: Journal header limits reached or header out-of-date, rotating. Oct 14 06:16:44 localhost rsyslogd[759]: imjournal: journal files changed, reloading... [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ] Oct 14 06:16:44 localhost rsyslogd[759]: imjournal: journal files changed, reloading... [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ] Oct 14 06:16:44 localhost ovn_controller[157396]: 2025-10-14T10:16:44Z|00197|binding|INFO|Releasing lport 25c6586a-239c-451b-aac2-e0a3ee5c3145 from this chassis (sb_readonly=0) Oct 14 06:16:44 localhost neutron_sriov_agent[264974]: 2025-10-14 10:16:44.248 2 INFO neutron.agent.securitygroups_rpc [None req-14a4c602-5a85-4c5c-8a40-437ac242b70f 73c3910059834cd0998d3459c50cd69d 82fc7afce38344ffb7eda3bb0fabdb5b - - default default] Security group member updated ['10f25aec-a6f2-40dd-837d-8812e1c0ebb8']#033[00m Oct 14 06:16:44 localhost nova_compute[297686]: 2025-10-14 10:16:44.267 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:16:44 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e133 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Oct 14 06:16:44 localhost neutron_sriov_agent[264974]: 2025-10-14 10:16:44.726 2 INFO neutron.agent.securitygroups_rpc [None req-5a983a26-4b0e-471b-bc37-40bd24908aed 73c3910059834cd0998d3459c50cd69d 82fc7afce38344ffb7eda3bb0fabdb5b - - default default] Security group member updated ['10f25aec-a6f2-40dd-837d-8812e1c0ebb8']#033[00m Oct 14 06:16:45 localhost nova_compute[297686]: 
2025-10-14 10:16:45.255 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:16:45 localhost nova_compute[297686]: 2025-10-14 10:16:45.293 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:16:45 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:16:45.356 271987 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Oct 14 06:16:47 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:16:47.926 271987 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Oct 14 06:16:48 localhost ovn_controller[157396]: 2025-10-14T10:16:48Z|00198|binding|INFO|Releasing lport 25c6586a-239c-451b-aac2-e0a3ee5c3145 from this chassis (sb_readonly=0) Oct 14 06:16:48 localhost dnsmasq[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/addn_hosts - 2 addresses Oct 14 06:16:48 localhost podman[332487]: 2025-10-14 10:16:48.244460979 +0000 UTC m=+0.051827997 container kill 27e23fba1f10c3118d508631d350793857acd97085c399f85eed727cd1eab7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0145816-4627-44f2-af00-ccc9ef0436ed, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS) Oct 14 06:16:48 localhost dnsmasq-dhcp[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/host Oct 14 06:16:48 localhost 
dnsmasq-dhcp[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/opts Oct 14 06:16:48 localhost nova_compute[297686]: 2025-10-14 10:16:48.257 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:16:48 localhost neutron_sriov_agent[264974]: 2025-10-14 10:16:48.501 2 INFO neutron.agent.securitygroups_rpc [None req-8e3d5cd7-e138-4dba-9b9e-24fb34a7ff8a 73c3910059834cd0998d3459c50cd69d 82fc7afce38344ffb7eda3bb0fabdb5b - - default default] Security group member updated ['10f25aec-a6f2-40dd-837d-8812e1c0ebb8']#033[00m Oct 14 06:16:48 localhost nova_compute[297686]: 2025-10-14 10:16:48.691 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:16:48 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Oct 14 06:16:48 localhost ceph-mon[317114]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3437966422' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Oct 14 06:16:48 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Oct 14 06:16:48 localhost ceph-mon[317114]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.32:0/3437966422' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Oct 14 06:16:49 localhost neutron_sriov_agent[264974]: 2025-10-14 10:16:49.363 2 INFO neutron.agent.securitygroups_rpc [None req-fb81f250-1cad-43a0-86a6-1fe40cd0df92 73c3910059834cd0998d3459c50cd69d 82fc7afce38344ffb7eda3bb0fabdb5b - - default default] Security group member updated ['10f25aec-a6f2-40dd-837d-8812e1c0ebb8']#033[00m Oct 14 06:16:49 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e133 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Oct 14 06:16:49 localhost podman[332525]: 2025-10-14 10:16:49.636933638 +0000 UTC m=+0.064896801 container kill 89dfa1199fae71aa63652c8a36dc3f105ce717d902c728e7fe46816e26b2992b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5d8fe93a-c65a-4669-ba1e-66d52ee61c6a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image) Oct 14 06:16:49 localhost dnsmasq[330782]: read /var/lib/neutron/dhcp/5d8fe93a-c65a-4669-ba1e-66d52ee61c6a/addn_hosts - 0 addresses Oct 14 06:16:49 localhost dnsmasq-dhcp[330782]: read /var/lib/neutron/dhcp/5d8fe93a-c65a-4669-ba1e-66d52ee61c6a/host Oct 14 06:16:49 localhost dnsmasq-dhcp[330782]: read /var/lib/neutron/dhcp/5d8fe93a-c65a-4669-ba1e-66d52ee61c6a/opts Oct 14 06:16:49 localhost systemd[1]: tmp-crun.bsrHFQ.mount: Deactivated successfully. 
Oct 14 06:16:49 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:16:49.670 271987 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-10-14T10:16:49Z, description=, device_id=2e4988ae-ab62-4ac4-9479-73fd6a40d2e8, device_owner=network:router_gateway, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=88a7a1ba-684d-4a2c-8649-0d048b7c3280, ip_allocation=immediate, mac_address=fa:16:3e:ab:d3:4c, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-10-14T08:36:38Z, description=, dns_domain=, id=c0145816-4627-44f2-af00-ccc9ef0436ed, ipv4_address_scope=None, ipv6_address_scope=None, is_default=False, l2_adjacency=True, mtu=1350, name=public, port_security_enabled=True, project_id=41187b090f3d4818a32baa37ce8a3991, provider:network_type=flat, provider:physical_network=datacentre, provider:segmentation_id=None, qos_policy_id=None, revision_number=2, router:external=True, shared=False, standard_attr_id=29, status=ACTIVE, subnets=['bbd40dc1-97d8-4ed3-9d4f-9b3af758b526'], tags=[], tenant_id=41187b090f3d4818a32baa37ce8a3991, updated_at=2025-10-14T08:36:44Z, vlan_transparent=None, network_id=c0145816-4627-44f2-af00-ccc9ef0436ed, port_security_enabled=False, project_id=, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2154, status=DOWN, tags=[], tenant_id=, updated_at=2025-10-14T10:16:49Z on network c0145816-4627-44f2-af00-ccc9ef0436ed#033[00m Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.823 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'name': 'test', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 
'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005486733.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '41187b090f3d4818a32baa37ce8a3991', 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'hostId': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228 Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.824 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters Oct 14 06:16:49 localhost ovn_controller[157396]: 2025-10-14T10:16:49Z|00199|binding|INFO|Releasing lport 1df01de3-c11c-40e3-8802-ea137fe51f0c from this chassis (sb_readonly=0) Oct 14 06:16:49 localhost ovn_controller[157396]: 2025-10-14T10:16:49Z|00200|binding|INFO|Setting lport 1df01de3-c11c-40e3-8802-ea137fe51f0c down in Southbound Oct 14 06:16:49 localhost kernel: device tap1df01de3-c1 left promiscuous mode Oct 14 06:16:49 localhost nova_compute[297686]: 2025-10-14 10:16:49.874 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:16:49 localhost ovn_metadata_agent[163050]: 2025-10-14 10:16:49.881 163055 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005486733.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 
'dhcpe1893814-19d7-5b97-af05-19e2aafa5382-5d8fe93a-c65a-4669-ba1e-66d52ee61c6a', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5d8fe93a-c65a-4669-ba1e-66d52ee61c6a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7bf1be3a6a454996a4414fad306906f1', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005486733.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=64127a4b-0baf-4336-8658-a60a67ebf24c, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=1df01de3-c11c-40e3-8802-ea137fe51f0c) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Oct 14 06:16:49 localhost ovn_metadata_agent[163050]: 2025-10-14 10:16:49.882 163055 INFO neutron.agent.ovn.metadata.agent [-] Port 1df01de3-c11c-40e3-8802-ea137fe51f0c in datapath 5d8fe93a-c65a-4669-ba1e-66d52ee61c6a unbound from our chassis#033[00m Oct 14 06:16:49 localhost ovn_metadata_agent[163050]: 2025-10-14 10:16:49.884 163055 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 5d8fe93a-c65a-4669-ba1e-66d52ee61c6a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.885 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/cpu volume: 16860000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:16:49 localhost ovn_metadata_agent[163050]: 2025-10-14 10:16:49.885 163159 DEBUG oslo.privsep.daemon [-] privsep: reply[5aa0c1e6-5955-4a0b-a1f3-b981f60cc73c]: (4, False) _call_back 
/usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.887 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c7112d09-302d-45f1-8251-5b74d2a6dcbb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 16860000000, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'timestamp': '2025-10-14T10:16:49.824661', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': 'e5bde54a-a8e6-11f0-9707-fa163e99780b', 'monotonic_time': 13026.077942105, 'message_signature': '4015844b6e5fff2a4430ad325abfc8c141571b2e6c261b7731c57e3e2a2cc201'}]}, 'timestamp': '2025-10-14 10:16:49.886049', '_unique_id': '1942cad1c20d44468d6b3e48715131f0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.887 12 ERROR oslo_messaging.notify.messaging 
Traceback (most recent call last): Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.887 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.887 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.887 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.887 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.887 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.887 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.887 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.887 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.887 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.887 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.887 
12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.887 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.887 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.887 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.887 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.887 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.887 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.887 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.887 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.887 12 ERROR oslo_messaging.notify.messaging Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.887 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.887 12 ERROR 
oslo_messaging.notify.messaging Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.887 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.887 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.887 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.887 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.887 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.887 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.887 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.887 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.887 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.887 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.887 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.887 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.887 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.887 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.887 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.887 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.887 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.887 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.887 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.887 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.887 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.887 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.887 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.887 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.887 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.887 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.887 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.887 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.887 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.887 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:16:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.887 12 ERROR oslo_messaging.notify.messaging Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.888 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters Oct 14 06:16:49 localhost nova_compute[297686]: 2025-10-14 10:16:49.898 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.910 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.read.bytes volume: 35560448 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.911 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.912 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'c2dc1afd-0e77-4cca-b355-f6c5f30d1f87', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35560448, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T10:16:49.888975', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'e5c1c6ec-a8e6-11f0-9707-fa163e99780b', 'monotonic_time': 13026.081697461, 'message_signature': '3bd2d6e7fa8450522a3005c80ab5b6e881b7c2ccb8417f56635a43b63c4b16a1'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T10:16:49.888975', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'e5c1d682-a8e6-11f0-9707-fa163e99780b', 'monotonic_time': 13026.081697461, 'message_signature': 'c831a8ef47f7dd65ce4a5b5c3c11cfc4631a55766077368d9f98dd15fd57f639'}]}, 'timestamp': '2025-10-14 10:16:49.911784', '_unique_id': '7478d063b5484c6ebc861ece9ea3d0fc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.912 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.912 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.912 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.912 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.912 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.912 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.912 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.912 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.912 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.912 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.912 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.912 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.912 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.912 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.912 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.912 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 
06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.912 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.912 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.912 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.912 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.912 12 ERROR oslo_messaging.notify.messaging Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.912 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.912 12 ERROR oslo_messaging.notify.messaging Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.912 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.912 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.912 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.912 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:16:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.912 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.912 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.912 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.912 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.912 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.912 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.912 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.912 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.912 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.912 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.912 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.912 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.912 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.912 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.912 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.912 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.912 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.912 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.912 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.912 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:16:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.912 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.912 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.912 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.912 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.912 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.912 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.912 12 ERROR oslo_messaging.notify.messaging Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.914 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.914 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.write.latency volume: 1178650666 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.914 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.write.latency volume: 22746151 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:16:49 
localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.915 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c481249e-a276-4045-b05d-fadbfa126a2d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1178650666, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T10:16:49.914204', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'e5c2445a-a8e6-11f0-9707-fa163e99780b', 'monotonic_time': 13026.081697461, 'message_signature': '6fbe29293327111d9c1ac61864c0b3cf329f02b9ded352d6b8b4ef964f3d2b56'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 22746151, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 
'88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T10:16:49.914204', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'e5c25058-a8e6-11f0-9707-fa163e99780b', 'monotonic_time': 13026.081697461, 'message_signature': '64c726000e8f29ca63f7317b14b45e56f6479ecc0e394b92f6b3734cc31a563d'}]}, 'timestamp': '2025-10-14 10:16:49.914828', '_unique_id': '849978bf5d894c799a127331c3b035ea'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.915 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.915 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.915 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.915 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.915 12 
ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.915 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.915 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.915 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.915 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.915 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.915 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.915 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.915 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.915 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.915 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:16:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.915 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.915 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.915 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.915 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.915 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.915 12 ERROR oslo_messaging.notify.messaging Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.915 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.915 12 ERROR oslo_messaging.notify.messaging Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.915 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.915 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.915 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.915 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.915 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.915 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.915 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.915 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.915 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.915 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.915 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.915 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.915 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:16:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.915 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.915 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.915 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.915 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.915 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.915 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.915 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.915 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.915 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.915 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.915 
12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.915 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.915 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.915 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.915 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.915 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.915 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.915 12 ERROR oslo_messaging.notify.messaging Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.916 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.918 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.919 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to 
notifications. Payload={'message_id': '7f8c0e0e-8db4-461b-aa10-1dee5d7e82f7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T10:16:49.916353', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': 'e5c2edf6-a8e6-11f0-9707-fa163e99780b', 'monotonic_time': 13026.109057388, 'message_signature': '197832f616ffac32d27933ae230a28617b19615b620ab7038c266a01ca3a380f'}]}, 'timestamp': '2025-10-14 10:16:49.918883', '_unique_id': '91e42875cbd14b44b964136be49ec3ee'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.919 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:16:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.919 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.919 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.919 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.919 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.919 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.919 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.919 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.919 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.919 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.919 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.919 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.919 12 ERROR oslo_messaging.notify.messaging Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.919 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.919 12 ERROR oslo_messaging.notify.messaging Oct 14 06:16:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.919 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.919 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.919 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.919 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.919 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.919 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.919 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.919 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.919 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.919 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:16:49 
localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.919 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.919 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.919 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.919 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.919 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.919 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.919 12 ERROR oslo_messaging.notify.messaging Oct 14 06:16:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.920 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.920 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.write.bytes volume: 454656 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.920 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.921 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '84957e31-afb2-4238-bf82-dea307990d88', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 454656, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T10:16:49.920369', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'e5c33450-a8e6-11f0-9707-fa163e99780b', 'monotonic_time': 13026.081697461, 'message_signature': 'e85d2c4f86e40e7b4614a89c12cc2ebe03f7ac49d7861c5664b7def1df451cbb'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T10:16:49.920369', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'e5c3401c-a8e6-11f0-9707-fa163e99780b', 'monotonic_time': 13026.081697461, 'message_signature': 'c312f9bf329c45e775d27e0954087cc427690c49b00ec3034b77882dfdd96584'}]}, 'timestamp': '2025-10-14 10:16:49.920958', '_unique_id': '34bd972f02d342498bc1d8c519babe1d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.921 12 ERROR 
oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.921 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.921 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.921 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.921 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.921 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:16:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.921 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.921 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.921 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.921 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.921 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.921 12 ERROR oslo_messaging.notify.messaging Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.921 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 
2025-10-14 10:16:49.921 12 ERROR oslo_messaging.notify.messaging Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.921 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.921 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.921 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.921 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.921 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.921 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.921 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.921 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.921 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.921 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.921 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.921 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.921 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.921 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.921 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.921 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.921 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.921 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:16:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.921 12 ERROR oslo_messaging.notify.messaging Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.922 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.922 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.923 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '070bf992-12bc-405d-8c8a-39952bfe3f4a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T10:16:49.922448', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 
'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': 'e5c385cc-a8e6-11f0-9707-fa163e99780b', 'monotonic_time': 13026.109057388, 'message_signature': '7789109370f868d47994b49029747d229be43c2c7e4a55ee63d4bd154ef58451'}]}, 'timestamp': '2025-10-14 10:16:49.922790', '_unique_id': '0834f3c038894d7e8923c38b40b87548'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.923 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.923 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.923 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.923 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", 
line 877, in _connection_factory Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.923 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.923 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.923 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.923 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.923 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.923 12 
ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.923 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.923 12 ERROR oslo_messaging.notify.messaging Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.923 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.923 12 ERROR oslo_messaging.notify.messaging Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.923 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.923 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.923 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.923 12 ERROR 
oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.923 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.923 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.923 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.923 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.923 12 ERROR 
oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.923 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.923 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.923 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.923 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.923 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:16:49 
localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.923 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.923 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.923 12 ERROR oslo_messaging.notify.messaging Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.924 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.924 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.read.requests volume: 1272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.924 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.925 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'bae42d19-656a-46db-b7f0-d3b84fcc13f3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1272, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T10:16:49.924278', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'e5c3cdde-a8e6-11f0-9707-fa163e99780b', 'monotonic_time': 13026.081697461, 'message_signature': '6f877b9addca705d241fe5dea5d67048baab684e277cdc0a48dbbc0df0e709fc'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T10:16:49.924278', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'e5c3dde2-a8e6-11f0-9707-fa163e99780b', 'monotonic_time': 13026.081697461, 'message_signature': '632a4876e0650395b96ea43e6ff203989997c74964a19535a7971131a49b398c'}]}, 'timestamp': '2025-10-14 10:16:49.925058', '_unique_id': 'ab38219b55664a6b927118c305316a97'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.925 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.925 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.925 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.925 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.925 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.925 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.925 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.925 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.925 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.925 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.925 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.925 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.925 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.925 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.925 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.925 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 
06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.925 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.925 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.925 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.925 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.925 12 ERROR oslo_messaging.notify.messaging Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.925 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.925 12 ERROR oslo_messaging.notify.messaging Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.925 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.925 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.925 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.925 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:16:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.925 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.925 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.925 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.925 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.925 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.925 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.925 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.925 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.925 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.925 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.925 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.925 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.925 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.925 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.925 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.925 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.925 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.925 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.925 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.925 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:16:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.925 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.925 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.925 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.925 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.925 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.925 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.925 12 ERROR oslo_messaging.notify.messaging Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.927 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.927 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.incoming.packets volume: 79 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.928 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'e0c78d4f-69e0-4b31-926c-838c5b477f9b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 79, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T10:16:49.927168', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': 'e5c44200-a8e6-11f0-9707-fa163e99780b', 'monotonic_time': 13026.109057388, 'message_signature': '22575f0dbda6f079ce04144004bf9d671e60befd9918b35676bb05e889928c6e'}]}, 'timestamp': '2025-10-14 10:16:49.927661', '_unique_id': 'b81dd9f6f51b42a6a5db4b19c7c27250'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.928 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:16:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.928 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.928 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.928 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.928 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.928 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.928 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.928 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.928 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.928 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.928 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.928 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.928 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.928 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.928 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.928 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.928 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.928 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.928 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.928 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.928 12 ERROR oslo_messaging.notify.messaging Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.928 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.928 12 ERROR oslo_messaging.notify.messaging Oct 14 06:16:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.928 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.928 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.928 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.928 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.928 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.928 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.928 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.928 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.928 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.928 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.928 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.928 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.928 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.928 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.928 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.928 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.928 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.928 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.928 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.928 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:16:49 
localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.928 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.928 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.928 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.928 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.928 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.928 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.928 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.928 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.928 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.928 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.928 12 ERROR oslo_messaging.notify.messaging Oct 14 06:16:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.929 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.929 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.931 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9ac30130-5dbb-4673-9226-e6c587af8e09', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T10:16:49.929791', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 
'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': 'e5c4a7ea-a8e6-11f0-9707-fa163e99780b', 'monotonic_time': 13026.109057388, 'message_signature': '6ea1540f309ff2aefa5f811a476239d504392a78a1f2f625fb00be62211f8e6e'}]}, 'timestamp': '2025-10-14 10:16:49.930264', '_unique_id': 'd1926a2dff8c467e939f00375187e69e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.931 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.931 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.931 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.931 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 
10:16:49.931 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.931 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.931 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.931 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.931 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.931 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:16:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.931 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.931 12 ERROR oslo_messaging.notify.messaging Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.931 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.931 12 ERROR oslo_messaging.notify.messaging Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.931 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.931 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.931 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.931 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:16:49 
localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.931 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.931 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.931 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.931 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.931 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.931 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.931 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.931 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.931 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.931 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.931 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.931 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.931 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.932 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.932 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 06:16:49 localhost systemd[1]: tmp-crun.7rHm4b.mount: Deactivated successfully.
Oct 14 06:16:49 localhost dnsmasq[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/addn_hosts - 3 addresses
Oct 14 06:16:49 localhost dnsmasq-dhcp[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/host
Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.933 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cc85565d-95c3-4668-b02a-9c9d0d085953', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T10:16:49.932320', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': 'e5c50992-a8e6-11f0-9707-fa163e99780b', 'monotonic_time': 13026.109057388, 'message_signature': '9324bfbdbc5a0e12a16989c84468c64e42771acaba1aab9da4f8a748a3c339c6'}]}, 'timestamp': '2025-10-14 10:16:49.932784', '_unique_id': 'ec5e558bc0594c83bdddf6af83c03ae5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 06:16:49 localhost dnsmasq-dhcp[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/opts
Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.933 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.933 12 ERROR oslo_messaging.notify.messaging yield
Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.933 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.933 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.933 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.933 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 06:16:49 localhost podman[332563]: 2025-10-14 10:16:49.934630578 +0000 UTC m=+0.099006637 container kill 27e23fba1f10c3118d508631d350793857acd97085c399f85eed727cd1eab7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0145816-4627-44f2-af00-ccc9ef0436ed, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5)
Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.933 12 ERROR oslo_messaging.notify.messaging conn.connect()
Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.933 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.933 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.933 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.933 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.933 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.933 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.933 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.933 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.933 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.933 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.933 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.933 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.933 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.933 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.933 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.933 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.933 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.933 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.933 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.933 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.933 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.933 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.933 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.933 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.934 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.944 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.945 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.947 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7c4430ea-3f54-44c7-a2a6-287e91bb0e9f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T10:16:49.934812', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'e5c6fd9c-a8e6-11f0-9707-fa163e99780b', 'monotonic_time': 13026.12753, 'message_signature': 'cbcd44b504278fbc246e5a8b606883f24c237482296fd5b770c417459281ee63'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T10:16:49.934812', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'e5c71200-a8e6-11f0-9707-fa163e99780b', 'monotonic_time': 13026.12753, 'message_signature': 'a40173cf2ec696bb3dc4fd1251f69669efcb76c00e77fe2081e2049696f5b72f'}]}, 'timestamp': '2025-10-14 10:16:49.946078', '_unique_id': '87f966573d664cde8dc7a00b61fd5389'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.947 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.947 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.947 12 ERROR oslo_messaging.notify.messaging yield
Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.947 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.947 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.947 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.947 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.947 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.947 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.947 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.947 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.947 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.947 12 ERROR oslo_messaging.notify.messaging conn.connect()
Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.947 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.947 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.947 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.947 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.947 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.947 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.947 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.947 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.947 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.947 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.947 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.947 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.947 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.947 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.947 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.947 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.947 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.947 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.947 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.947 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.947 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.947 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.947 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.947 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.947 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.947 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.947 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.947 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.947 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.947 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.947 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.947 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.947 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.947 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.947 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.947 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.947 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.947 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.947 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.947 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.947 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.948 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.948 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.949 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2ea1525c-51a0-4811-9754-04e257f1e1fe', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T10:16:49.948449', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': 'e5c77df8-a8e6-11f0-9707-fa163e99780b', 'monotonic_time': 13026.109057388, 'message_signature': '2e0653e40b655b2e924d8531a308dddaa0edd8bf1e0b69a9a7f5064e8a9b0cba'}]}, 'timestamp': '2025-10-14 10:16:49.948818', '_unique_id': 'e3a61cf92663453188b7d44be5494f48'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.949 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.949 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.949 12 ERROR oslo_messaging.notify.messaging yield
Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.949 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.949 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.949 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.949 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.949 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.949 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.949 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.949 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.949 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.949 12 ERROR oslo_messaging.notify.messaging conn.connect()
Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.949 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.949 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.949 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.949 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.949 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.949 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.949 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.949 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.949 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.949 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.949 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.949 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.949 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.949 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.949 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.949 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.949 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.949 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.949 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.949 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.949 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.949 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.949 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.949 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.949 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.949 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.949 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.949 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.949 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.949 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:16:49 
localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.949 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.949 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.949 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.949 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.949 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.949 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.949 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.949 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.949 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.949 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.949 12 ERROR oslo_messaging.notify.messaging Oct 14 06:16:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.950 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.950 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.read.latency volume: 1739521236 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.950 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.read.latency volume: 110324003 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.952 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2406a0f3-3337-4279-ad7c-7709b7ced3aa', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1739521236, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T10:16:49.950560', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': 
{'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'e5c7d1e0-a8e6-11f0-9707-fa163e99780b', 'monotonic_time': 13026.081697461, 'message_signature': '362311d08751ef62b6f63555a783d881597d5b978fbf6e1029d27b092e15e61f'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 110324003, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T10:16:49.950560', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'e5c7dd70-a8e6-11f0-9707-fa163e99780b', 'monotonic_time': 13026.081697461, 'message_signature': 'e460a6b86ea700f147526cf01ddef1bf5cc9a6a19ea9e8537d90ce793980f99d'}]}, 'timestamp': '2025-10-14 10:16:49.951268', '_unique_id': 'b15ad042bf1343ca8cc81075ec8d3585'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 
10:16:49.952 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.952 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.952 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.952 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.952 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.952 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.952 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.952 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.952 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.952 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.952 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:16:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.952 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.952 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.952 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.952 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.952 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.952 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.952 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.952 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.952 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.952 12 ERROR oslo_messaging.notify.messaging Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.952 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 
2025-10-14 10:16:49.952 12 ERROR oslo_messaging.notify.messaging Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.952 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.952 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.952 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.952 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.952 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.952 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.952 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.952 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.952 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.952 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.952 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.952 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.952 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.952 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.952 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.952 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.952 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.952 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.952 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.952 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.952 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.952 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.952 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.952 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.952 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.952 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.952 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.952 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.952 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.952 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:16:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.952 12 ERROR oslo_messaging.notify.messaging Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.953 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.953 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.outgoing.packets volume: 118 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.954 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'de4de9b5-52bd-4058-af2b-b8f001f87d71', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 118, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T10:16:49.953349', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 
'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': 'e5c83d06-a8e6-11f0-9707-fa163e99780b', 'monotonic_time': 13026.109057388, 'message_signature': '563917653fc0b9427c5e7efbed781cf14c13f5279c9818cd36a05582457b4cbc'}]}, 'timestamp': '2025-10-14 10:16:49.953698', '_unique_id': 'e8b90cd48e8f44a8be87d997c2accd1d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.954 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.954 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.954 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.954 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.954 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.954 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.954 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.954 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", 
line 877, in _connection_factory Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.954 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.954 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.954 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.954 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.954 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.954 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.954 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.954 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.954 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.954 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.954 12 
ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.954 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.954 12 ERROR oslo_messaging.notify.messaging Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.954 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.954 12 ERROR oslo_messaging.notify.messaging Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.954 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.954 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.954 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.954 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.954 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.954 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.954 12 ERROR 
oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.954 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.954 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.954 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.954 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.954 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.954 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.954 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.954 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.954 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.954 12 ERROR 
oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.954 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.954 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.954 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.954 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.954 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.954 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.954 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.954 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.954 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.954 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:16:49 
localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.954 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.954 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.954 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.954 12 ERROR oslo_messaging.notify.messaging Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.955 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.955 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.955 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.955 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.955 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.incoming.bytes volume: 8190 _stats_to_sample 
/usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.956 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5b158635-278c-4040-8ccf-37d72785d674', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 8190, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T10:16:49.955545', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': 'e5c893b4-a8e6-11f0-9707-fa163e99780b', 'monotonic_time': 13026.109057388, 'message_signature': '331dfe106b73f12e1bed61d162bed884214d4743a6ea5b0bfbd59cb48ff338b6'}]}, 'timestamp': '2025-10-14 10:16:49.955892', '_unique_id': '482c8162719c4809a27c198488795c04'}: 
kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.956 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.956 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.956 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.956 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.956 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.956 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.956 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.956 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.956 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.956 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 
2025-10-14 10:16:49.956 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.956 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.956 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.956 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.956 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.956 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.956 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.956 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.956 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.956 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.956 12 ERROR oslo_messaging.notify.messaging Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.956 12 ERROR 
oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.956 12 ERROR oslo_messaging.notify.messaging Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.956 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.956 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.956 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.956 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.956 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.956 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.956 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.956 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.956 12 ERROR oslo_messaging.notify.messaging with 
self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.956 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.956 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.956 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.956 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.956 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.956 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.956 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.956 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.956 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.956 12 ERROR 
oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.956 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.956 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.956 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.956 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.956 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.956 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.956 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.956 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.956 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.956 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:16:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.956 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.956 12 ERROR oslo_messaging.notify.messaging Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.957 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.957 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.957 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.957 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.958 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'e41eaa27-d18b-4b7f-9f4f-d2d069b12a7d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T10:16:49.957595', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'e5c8e5f8-a8e6-11f0-9707-fa163e99780b', 'monotonic_time': 13026.12753, 'message_signature': '0473f2682cae08b2f1f9432860bea72984f35b88c7d2fdc3a077a0dda5696a6b'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T10:16:49.957595', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 
'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'e5c8f16a-a8e6-11f0-9707-fa163e99780b', 'monotonic_time': 13026.12753, 'message_signature': '5f24f6b15def98c7c5968cfec94991d2fab7298af2a2e52a953ff9ca8c571add'}]}, 'timestamp': '2025-10-14 10:16:49.958266', '_unique_id': 'e85b931eff884a6d83a73799986bad28'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.958 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.958 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.958 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.958 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.958 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.958 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.958 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.958 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.958 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.958 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.958 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.958 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.958 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.958 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.958 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.958 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 06:16:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.958 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.958 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.958 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.958 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.958 12 ERROR oslo_messaging.notify.messaging Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.958 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.958 12 ERROR oslo_messaging.notify.messaging Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.958 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.958 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.958 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.958 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 
10:16:49.958 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.958 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.958 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.958 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.958 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.958 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.958 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.958 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.958 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.958 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 06:16:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.958 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.958 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.958 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.958 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.958 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.958 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.958 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.958 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.958 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.958 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 
10:16:49.958 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.958 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.958 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.958 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.958 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.958 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.958 12 ERROR oslo_messaging.notify.messaging Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.959 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.959 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.outgoing.bytes volume: 10162 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.960 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'db3759b2-65b9-4a56-9cbe-07614731e964', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 10162, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T10:16:49.959721', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': 'e5c9358a-a8e6-11f0-9707-fa163e99780b', 'monotonic_time': 13026.109057388, 'message_signature': '75d5a88c07b2cbd36b68e918c5d4ff68c3cee604dfae6d515348804103c44bd8'}]}, 'timestamp': '2025-10-14 10:16:49.960030', '_unique_id': '053b27792b7a41799f6f01ab0d041798'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.960 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:16:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.960 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.960 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.960 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.960 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.960 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.960 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.960 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.960 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.960 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.960 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.960 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.960 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.960 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.960 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.960 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.960 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.960 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.960 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.960 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.960 12 ERROR oslo_messaging.notify.messaging Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.960 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.960 12 ERROR oslo_messaging.notify.messaging Oct 14 06:16:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.960 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.960 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.960 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.960 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.960 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.960 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.960 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.960 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.960 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.960 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.960 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.960 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.960 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.960 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.960 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.960 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.960 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.960 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.960 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.960 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:16:49 
localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.960 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.960 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.960 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.960 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.960 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.960 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.960 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.960 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.960 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.960 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.960 12 ERROR oslo_messaging.notify.messaging Oct 14 06:16:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.961 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.961 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.961 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.963 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6bced197-8891-481e-8a84-53170c6c304f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T10:16:49.961556', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'e5c97ee6-a8e6-11f0-9707-fa163e99780b', 'monotonic_time': 13026.12753, 'message_signature': '142268652b11b7ab3d9e3d8af3714bce8f54b859946c10221b6365002977c3ab'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T10:16:49.961556', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'e5c989a4-a8e6-11f0-9707-fa163e99780b', 'monotonic_time': 13026.12753, 'message_signature': '28c33bc50fd0da1ad13a7e9792ecf516094b90f1e6fbe38b3fb3193485807610'}]}, 'timestamp': '2025-10-14 10:16:49.962159', '_unique_id': 'b6dcaf38f3b24bc6bf959ef394bb2581'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.963 12 ERROR 
oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.963 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.963 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.963 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.963 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.963 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.963 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.963 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.963 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.963 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.963 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:16:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.963 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.963 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.963 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.963 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.963 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.963 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.963 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.963 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.963 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.963 12 ERROR oslo_messaging.notify.messaging Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.963 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 
2025-10-14 10:16:49.963 12 ERROR oslo_messaging.notify.messaging Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.963 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.963 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.963 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.963 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.963 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.963 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.963 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.963 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.963 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.963 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.963 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.963 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.963 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.963 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.963 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.963 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.963 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.963 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.963 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.963 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.963 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.963 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.963 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.963 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.963 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.963 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.963 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.963 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.963 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.963 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:16:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.963 12 ERROR oslo_messaging.notify.messaging Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.963 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.964 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/memory.usage volume: 51.81640625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.964 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '73937d85-f30f-4187-856d-33ccee66b217', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.81640625, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'timestamp': '2025-10-14T10:16:49.964081', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 
'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': 'e5c9e03e-a8e6-11f0-9707-fa163e99780b', 'monotonic_time': 13026.077942105, 'message_signature': '8fb9db5d56f4aab762f593ccbc2ef748b5a36f02bcb425935e8d294e80c1a378'}]}, 'timestamp': '2025-10-14 10:16:49.964390', '_unique_id': '2869f03057be4dacbe47dcf74d8ca0bb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.964 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.964 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.964 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.964 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.964 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.964 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.964 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.964 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.964 12 ERROR oslo_messaging.notify.messaging self._connection = 
self._establish_connection() Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.964 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.964 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.964 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.964 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.964 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.964 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.964 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.964 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.964 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.964 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.964 12 ERROR 
oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.964 12 ERROR oslo_messaging.notify.messaging Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.964 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.964 12 ERROR oslo_messaging.notify.messaging Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.964 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.964 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.964 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.964 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.964 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.964 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.964 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.964 12 
ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.964 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.964 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.964 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.964 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.964 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.964 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.964 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.964 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.964 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 
10:16:49.964 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.964 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.964 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.964 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.964 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.964 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.964 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.964 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.964 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.964 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.964 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 
450, in _reraise_as_library_errors Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.964 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.964 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.964 12 ERROR oslo_messaging.notify.messaging Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.965 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.965 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.966 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'a2011700-629b-446a-8747-4e490aa7f716', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T10:16:49.965810', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': 'e5ca2350-a8e6-11f0-9707-fa163e99780b', 'monotonic_time': 13026.109057388, 'message_signature': '24cb0c45ebbf298fb1ea66f4d930e1c83abc00fc25e456e7f8e95e737af0929f'}]}, 'timestamp': '2025-10-14 10:16:49.966116', '_unique_id': '96e32e88b5f14e64b9ffa1bd293ac3f9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.966 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:16:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.966 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.966 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.966 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.966 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.966 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.966 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.966 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.966 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.966 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.966 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.966 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.966 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.966 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.966 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.966 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.966 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.966 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.966 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.966 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.966 12 ERROR oslo_messaging.notify.messaging Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.966 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.966 12 ERROR oslo_messaging.notify.messaging Oct 14 06:16:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.966 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.966 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.966 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.966 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.966 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.966 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.966 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.966 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.966 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.966 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.966 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.966 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.966 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.966 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.966 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.966 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.966 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.966 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.966 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.966 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:16:49 
localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.966 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.966 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.966 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.966 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.966 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.966 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.966 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.966 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.966 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.966 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.966 12 ERROR oslo_messaging.notify.messaging Oct 14 06:16:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.967 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.967 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.write.requests volume: 53 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.967 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.968 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1c5b4217-0de3-470c-a600-cec695380ba5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 53, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T10:16:49.967495', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'e5ca64f0-a8e6-11f0-9707-fa163e99780b', 'monotonic_time': 13026.081697461, 'message_signature': 'c985aa9048b78465f81fdb8367542f8cf3291c519b816835f8006eea8ceeeaec'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T10:16:49.967495', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'e5ca7080-a8e6-11f0-9707-fa163e99780b', 'monotonic_time': 13026.081697461, 'message_signature': 'ba71172237d9a303927981716a490d078f0b106a3bd0b0f9422f2f1f8c91b113'}]}, 'timestamp': '2025-10-14 10:16:49.968072', '_unique_id': 'ad92211e5801464c9e310240cf0b795b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.968 12 
ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.968 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.968 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.968 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.968 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.968 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.968 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.968 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.968 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.968 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.968 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:16:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.968 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.968 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.968 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.968 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.968 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.968 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.968 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.968 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.968 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.968 12 ERROR oslo_messaging.notify.messaging Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.968 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 
2025-10-14 10:16:49.968 12 ERROR oslo_messaging.notify.messaging Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.968 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.968 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.968 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.968 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.968 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.968 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.968 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.968 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.968 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.968 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.968 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.968 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.968 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.968 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.968 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.968 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.968 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.968 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.968 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.968 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.968 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.968 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.968 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.968 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.968 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.968 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.968 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.968 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.968 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:16:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.968 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:16:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:16:49.968 12 ERROR oslo_messaging.notify.messaging Oct 14 06:16:50 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:16:50.209 271987 INFO neutron.agent.dhcp.agent [None req-c0c525af-9b05-4922-b97e-ba74c1d64b5a - - - - - -] DHCP configuration for ports {'88a7a1ba-684d-4a2c-8649-0d048b7c3280'} is completed#033[00m Oct 14 06:16:50 localhost nova_compute[297686]: 2025-10-14 10:16:50.295 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:16:50 localhost dnsmasq[330782]: exiting on receipt of SIGTERM Oct 14 06:16:50 localhost podman[332602]: 2025-10-14 10:16:50.574136286 +0000 UTC m=+0.045288745 container kill 89dfa1199fae71aa63652c8a36dc3f105ce717d902c728e7fe46816e26b2992b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5d8fe93a-c65a-4669-ba1e-66d52ee61c6a, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.license=GPLv2, tcib_managed=true) Oct 14 06:16:50 localhost systemd[1]: libpod-89dfa1199fae71aa63652c8a36dc3f105ce717d902c728e7fe46816e26b2992b.scope: Deactivated successfully. 
Oct 14 06:16:50 localhost podman[332616]: 2025-10-14 10:16:50.624924619 +0000 UTC m=+0.041633191 container died 89dfa1199fae71aa63652c8a36dc3f105ce717d902c728e7fe46816e26b2992b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5d8fe93a-c65a-4669-ba1e-66d52ee61c6a, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image) Oct 14 06:16:50 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-89dfa1199fae71aa63652c8a36dc3f105ce717d902c728e7fe46816e26b2992b-userdata-shm.mount: Deactivated successfully. Oct 14 06:16:50 localhost podman[332616]: 2025-10-14 10:16:50.653556125 +0000 UTC m=+0.070264657 container cleanup 89dfa1199fae71aa63652c8a36dc3f105ce717d902c728e7fe46816e26b2992b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5d8fe93a-c65a-4669-ba1e-66d52ee61c6a, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team) Oct 14 06:16:50 localhost systemd[1]: libpod-conmon-89dfa1199fae71aa63652c8a36dc3f105ce717d902c728e7fe46816e26b2992b.scope: Deactivated successfully. 
Oct 14 06:16:50 localhost podman[332618]: 2025-10-14 10:16:50.693401999 +0000 UTC m=+0.104426065 container remove 89dfa1199fae71aa63652c8a36dc3f105ce717d902c728e7fe46816e26b2992b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5d8fe93a-c65a-4669-ba1e-66d52ee61c6a, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2) Oct 14 06:16:51 localhost systemd[1]: var-lib-containers-storage-overlay-0e7c7447734ddb5f9fd6704c31b2990fd25a9066ea4c8faf16955ae90df9fd8b-merged.mount: Deactivated successfully. Oct 14 06:16:52 localhost systemd[1]: run-netns-qdhcp\x2d5d8fe93a\x2dc65a\x2d4669\x2dba1e\x2d66d52ee61c6a.mount: Deactivated successfully. Oct 14 06:16:52 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:16:52.130 271987 INFO neutron.agent.dhcp.agent [None req-7c1590bc-5772-40ac-8860-f7f826c3c2e0 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Oct 14 06:16:52 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:16:52.131 271987 INFO neutron.agent.dhcp.agent [None req-7c1590bc-5772-40ac-8860-f7f826c3c2e0 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Oct 14 06:16:52 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:16:52.869 271987 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Oct 14 06:16:53 localhost ovn_controller[157396]: 2025-10-14T10:16:53Z|00201|binding|INFO|Releasing lport 25c6586a-239c-451b-aac2-e0a3ee5c3145 from this chassis (sb_readonly=0) Oct 14 06:16:53 localhost nova_compute[297686]: 2025-10-14 10:16:53.329 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:16:53 localhost nova_compute[297686]: 2025-10-14 10:16:53.731 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:16:54 localhost neutron_sriov_agent[264974]: 2025-10-14 10:16:54.317 2 INFO neutron.agent.securitygroups_rpc [None req-01a27e63-fa41-4775-b8b4-10aba7c88838 73c3910059834cd0998d3459c50cd69d 82fc7afce38344ffb7eda3bb0fabdb5b - - default default] Security group member updated ['10f25aec-a6f2-40dd-837d-8812e1c0ebb8']#033[00m Oct 14 06:16:54 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e133 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Oct 14 06:16:54 localhost podman[332664]: 2025-10-14 10:16:54.696752014 +0000 UTC m=+0.053218620 container kill 27e23fba1f10c3118d508631d350793857acd97085c399f85eed727cd1eab7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0145816-4627-44f2-af00-ccc9ef0436ed, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.build-date=20251009, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS) Oct 14 06:16:54 localhost dnsmasq[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/addn_hosts - 2 addresses Oct 14 06:16:54 localhost dnsmasq-dhcp[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/host Oct 14 06:16:54 localhost dnsmasq-dhcp[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/opts Oct 14 06:16:54 localhost neutron_sriov_agent[264974]: 2025-10-14 10:16:54.837 2 INFO neutron.agent.securitygroups_rpc [None req-1c0d527f-ae60-410e-b28d-7969e7a7b040 
73c3910059834cd0998d3459c50cd69d 82fc7afce38344ffb7eda3bb0fabdb5b - - default default] Security group member updated ['10f25aec-a6f2-40dd-837d-8812e1c0ebb8']#033[00m Oct 14 06:16:55 localhost nova_compute[297686]: 2025-10-14 10:16:55.296 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:16:55 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e134 e134: 6 total, 6 up, 6 in Oct 14 06:16:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041. Oct 14 06:16:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857. Oct 14 06:16:55 localhost podman[332685]: 2025-10-14 10:16:55.738238632 +0000 UTC m=+0.073370093 container health_status 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 
'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}) Oct 14 06:16:55 localhost podman[332684]: 2025-10-14 10:16:55.802138911 +0000 UTC m=+0.139044598 container health_status 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter) Oct 14 06:16:55 localhost podman[332685]: 2025-10-14 10:16:55.822600195 +0000 UTC m=+0.157731616 container exec_died 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 
(image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, config_id=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, managed_by=edpm_ansible) Oct 14 06:16:55 localhost systemd[1]: 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857.service: Deactivated successfully. 
Oct 14 06:16:55 localhost podman[332684]: 2025-10-14 10:16:55.838458736 +0000 UTC m=+0.175364423 container exec_died 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter) Oct 14 06:16:55 localhost systemd[1]: 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041.service: Deactivated successfully. 
Oct 14 06:16:56 localhost neutron_sriov_agent[264974]: 2025-10-14 10:16:56.375 2 INFO neutron.agent.securitygroups_rpc [None req-c7cb125f-c945-411f-8038-bc7c7c3f8065 829aecbebfd54f24a9393e430b83d97d 5b0b6727285f4a5bbb8c9712a0e1046a - - default default] Security group rule updated ['475ead66-a9a3-40ac-9223-caee62f16474']#033[00m Oct 14 06:16:56 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e135 e135: 6 total, 6 up, 6 in Oct 14 06:16:56 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:16:56.650 271987 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-10-14T10:16:55Z, description=, device_id=419db686-9114-45be-bd67-9d13bdd9e3a0, device_owner=network:router_gateway, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=fc72c784-e3a8-4720-87b4-84349eca67e1, ip_allocation=immediate, mac_address=fa:16:3e:f7:d1:7f, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-10-14T08:36:38Z, description=, dns_domain=, id=c0145816-4627-44f2-af00-ccc9ef0436ed, ipv4_address_scope=None, ipv6_address_scope=None, is_default=False, l2_adjacency=True, mtu=1350, name=public, port_security_enabled=True, project_id=41187b090f3d4818a32baa37ce8a3991, provider:network_type=flat, provider:physical_network=datacentre, provider:segmentation_id=None, qos_policy_id=None, revision_number=2, router:external=True, shared=False, standard_attr_id=29, status=ACTIVE, subnets=['bbd40dc1-97d8-4ed3-9d4f-9b3af758b526'], tags=[], tenant_id=41187b090f3d4818a32baa37ce8a3991, updated_at=2025-10-14T08:36:44Z, vlan_transparent=None, network_id=c0145816-4627-44f2-af00-ccc9ef0436ed, port_security_enabled=False, project_id=, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], 
standard_attr_id=2178, status=DOWN, tags=[], tenant_id=, updated_at=2025-10-14T10:16:56Z on network c0145816-4627-44f2-af00-ccc9ef0436ed#033[00m Oct 14 06:16:56 localhost podman[332744]: 2025-10-14 10:16:56.959730445 +0000 UTC m=+0.045019516 container kill 27e23fba1f10c3118d508631d350793857acd97085c399f85eed727cd1eab7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0145816-4627-44f2-af00-ccc9ef0436ed, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251009) Oct 14 06:16:56 localhost dnsmasq[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/addn_hosts - 3 addresses Oct 14 06:16:56 localhost dnsmasq-dhcp[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/host Oct 14 06:16:56 localhost dnsmasq-dhcp[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/opts Oct 14 06:16:57 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:16:57.203 271987 INFO neutron.agent.dhcp.agent [None req-b1eef56b-eac9-4b05-bfc7-c6a390198e23 - - - - - -] DHCP configuration for ports {'fc72c784-e3a8-4720-87b4-84349eca67e1'} is completed#033[00m Oct 14 06:16:57 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e136 e136: 6 total, 6 up, 6 in Oct 14 06:16:57 localhost ovn_metadata_agent[163050]: 2025-10-14 10:16:57.785 163055 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Oct 14 06:16:57 localhost ovn_metadata_agent[163050]: 2025-10-14 10:16:57.785 163055 DEBUG oslo_concurrency.lockutils [-] Lock 
"_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Oct 14 06:16:57 localhost ovn_metadata_agent[163050]: 2025-10-14 10:16:57.786 163055 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Oct 14 06:16:58 localhost neutron_sriov_agent[264974]: 2025-10-14 10:16:58.244 2 INFO neutron.agent.securitygroups_rpc [None req-51870dd7-af9a-4a53-8ac4-a21928aff9e5 73c3910059834cd0998d3459c50cd69d 82fc7afce38344ffb7eda3bb0fabdb5b - - default default] Security group member updated ['10f25aec-a6f2-40dd-837d-8812e1c0ebb8']#033[00m Oct 14 06:16:58 localhost podman[248187]: time="2025-10-14T10:16:58Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Oct 14 06:16:58 localhost podman[248187]: @ - - [14/Oct/2025:10:16:58 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 147497 "" "Go-http-client/1.1" Oct 14 06:16:58 localhost podman[248187]: @ - - [14/Oct/2025:10:16:58 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19859 "" "Go-http-client/1.1" Oct 14 06:16:58 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e137 e137: 6 total, 6 up, 6 in Oct 14 06:16:58 localhost nova_compute[297686]: 2025-10-14 10:16:58.763 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:16:58 localhost neutron_sriov_agent[264974]: 2025-10-14 10:16:58.781 2 INFO neutron.agent.securitygroups_rpc [None req-e2de72de-d41d-42ae-a801-e4c63c4ba793 73c3910059834cd0998d3459c50cd69d 82fc7afce38344ffb7eda3bb0fabdb5b - - 
default default] Security group member updated ['10f25aec-a6f2-40dd-837d-8812e1c0ebb8']#033[00m Oct 14 06:16:59 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Oct 14 06:16:59 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:16:59.448 271987 INFO neutron.agent.linux.ip_lib [None req-c815b02e-f429-42ba-b4fd-1bb56ca2a2c6 - - - - - -] Device tapdbfd4483-51 cannot be used as it has no MAC address#033[00m Oct 14 06:16:59 localhost nova_compute[297686]: 2025-10-14 10:16:59.472 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:16:59 localhost kernel: device tapdbfd4483-51 entered promiscuous mode Oct 14 06:16:59 localhost nova_compute[297686]: 2025-10-14 10:16:59.479 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:16:59 localhost ovn_controller[157396]: 2025-10-14T10:16:59Z|00202|binding|INFO|Claiming lport dbfd4483-51d1-4494-b3b7-3dc214088810 for this chassis. Oct 14 06:16:59 localhost ovn_controller[157396]: 2025-10-14T10:16:59Z|00203|binding|INFO|dbfd4483-51d1-4494-b3b7-3dc214088810: Claiming unknown Oct 14 06:16:59 localhost NetworkManager[5977]: [1760437019.4837] manager: (tapdbfd4483-51): new Generic device (/org/freedesktop/NetworkManager/Devices/36) Oct 14 06:16:59 localhost systemd-udevd[332775]: Network interface NamePolicy= disabled on kernel command line. 
Oct 14 06:16:59 localhost ovn_metadata_agent[163050]: 2025-10-14 10:16:59.498 163055 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005486733.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:ffff::2/64', 'neutron:device_id': 'dhcpe1893814-19d7-5b97-af05-19e2aafa5382-3f117742-ee6e-4e4d-b21a-e67620a12d9d', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3f117742-ee6e-4e4d-b21a-e67620a12d9d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'bc139a195b1a4766b00c4bbfdffdb9e3', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=142a08df-27d2-41fb-a032-4b8199e0d461, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=dbfd4483-51d1-4494-b3b7-3dc214088810) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Oct 14 06:16:59 localhost ovn_metadata_agent[163050]: 2025-10-14 10:16:59.500 163055 INFO neutron.agent.ovn.metadata.agent [-] Port dbfd4483-51d1-4494-b3b7-3dc214088810 in datapath 3f117742-ee6e-4e4d-b21a-e67620a12d9d bound to our chassis#033[00m Oct 14 06:16:59 localhost ovn_metadata_agent[163050]: 2025-10-14 10:16:59.502 163055 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 3f117742-ee6e-4e4d-b21a-e67620a12d9d or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params 
/usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Oct 14 06:16:59 localhost ovn_metadata_agent[163050]: 2025-10-14 10:16:59.503 163159 DEBUG oslo.privsep.daemon [-] privsep: reply[55205cc2-f956-453c-8ee4-3c0f99e4b4d7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Oct 14 06:16:59 localhost journal[237477]: ethtool ioctl error on tapdbfd4483-51: No such device Oct 14 06:16:59 localhost journal[237477]: ethtool ioctl error on tapdbfd4483-51: No such device Oct 14 06:16:59 localhost ovn_controller[157396]: 2025-10-14T10:16:59Z|00204|binding|INFO|Setting lport dbfd4483-51d1-4494-b3b7-3dc214088810 ovn-installed in OVS Oct 14 06:16:59 localhost ovn_controller[157396]: 2025-10-14T10:16:59Z|00205|binding|INFO|Setting lport dbfd4483-51d1-4494-b3b7-3dc214088810 up in Southbound Oct 14 06:16:59 localhost nova_compute[297686]: 2025-10-14 10:16:59.520 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:16:59 localhost journal[237477]: ethtool ioctl error on tapdbfd4483-51: No such device Oct 14 06:16:59 localhost journal[237477]: ethtool ioctl error on tapdbfd4483-51: No such device Oct 14 06:16:59 localhost journal[237477]: ethtool ioctl error on tapdbfd4483-51: No such device Oct 14 06:16:59 localhost journal[237477]: ethtool ioctl error on tapdbfd4483-51: No such device Oct 14 06:16:59 localhost journal[237477]: ethtool ioctl error on tapdbfd4483-51: No such device Oct 14 06:16:59 localhost journal[237477]: ethtool ioctl error on tapdbfd4483-51: No such device Oct 14 06:16:59 localhost nova_compute[297686]: 2025-10-14 10:16:59.550 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:16:59 localhost nova_compute[297686]: 2025-10-14 10:16:59.581 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:17:00 localhost nova_compute[297686]: 2025-10-14 10:17:00.299 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:17:00 localhost podman[332846]: Oct 14 06:17:00 localhost podman[332846]: 2025-10-14 10:17:00.440121912 +0000 UTC m=+0.060290288 container create c154b95220c5573694845f19551c76518a3d0e2cd70e89ccefd8420f7fae0b96 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-3f117742-ee6e-4e4d-b21a-e67620a12d9d, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.build-date=20251009) Oct 14 06:17:00 localhost systemd[1]: Started libpod-conmon-c154b95220c5573694845f19551c76518a3d0e2cd70e89ccefd8420f7fae0b96.scope. Oct 14 06:17:00 localhost systemd[1]: Started libcrun container. 
Oct 14 06:17:00 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f2adb2e8eac7a34a5f1b648873fbfc06fc844adda8d9a22f66a885f2ea48315d/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Oct 14 06:17:00 localhost podman[332846]: 2025-10-14 10:17:00.50528422 +0000 UTC m=+0.125452617 container init c154b95220c5573694845f19551c76518a3d0e2cd70e89ccefd8420f7fae0b96 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-3f117742-ee6e-4e4d-b21a-e67620a12d9d, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251009, io.buildah.version=1.41.3, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.license=GPLv2) Oct 14 06:17:00 localhost podman[332846]: 2025-10-14 10:17:00.407665187 +0000 UTC m=+0.027833573 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Oct 14 06:17:00 localhost podman[332846]: 2025-10-14 10:17:00.516682623 +0000 UTC m=+0.136850989 container start c154b95220c5573694845f19551c76518a3d0e2cd70e89ccefd8420f7fae0b96 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-3f117742-ee6e-4e4d-b21a-e67620a12d9d, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251009, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS) Oct 14 06:17:00 localhost dnsmasq[332865]: started, version 2.85 cachesize 150 Oct 14 06:17:00 localhost dnsmasq[332865]: DNS service limited to local subnets Oct 14 06:17:00 localhost dnsmasq[332865]: compile time options: IPv6 GNU-getopt DBus no-UBus 
no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Oct 14 06:17:00 localhost dnsmasq[332865]: warning: no upstream servers configured Oct 14 06:17:00 localhost dnsmasq-dhcp[332865]: DHCPv6, static leases only on 2001:db8:0:ffff::, lease time 1d Oct 14 06:17:00 localhost dnsmasq[332865]: read /var/lib/neutron/dhcp/3f117742-ee6e-4e4d-b21a-e67620a12d9d/addn_hosts - 0 addresses Oct 14 06:17:00 localhost dnsmasq-dhcp[332865]: read /var/lib/neutron/dhcp/3f117742-ee6e-4e4d-b21a-e67620a12d9d/host Oct 14 06:17:00 localhost dnsmasq-dhcp[332865]: read /var/lib/neutron/dhcp/3f117742-ee6e-4e4d-b21a-e67620a12d9d/opts Oct 14 06:17:00 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:17:00.679 271987 INFO neutron.agent.dhcp.agent [None req-a0907699-fa21-425f-ac4b-4ed7738443fd - - - - - -] DHCP configuration for ports {'013dda66-1dfa-4f44-b0a4-fc9853999d5f'} is completed#033[00m Oct 14 06:17:00 localhost dnsmasq[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/addn_hosts - 2 addresses Oct 14 06:17:00 localhost dnsmasq-dhcp[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/host Oct 14 06:17:00 localhost podman[332883]: 2025-10-14 10:17:00.866638993 +0000 UTC m=+0.063270221 container kill 27e23fba1f10c3118d508631d350793857acd97085c399f85eed727cd1eab7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0145816-4627-44f2-af00-ccc9ef0436ed, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251009, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true) Oct 14 06:17:00 localhost dnsmasq-dhcp[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/opts Oct 14 06:17:01 localhost 
ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Oct 14 06:17:01 localhost ceph-mon[317114]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3993231898' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Oct 14 06:17:01 localhost neutron_sriov_agent[264974]: 2025-10-14 10:17:01.641 2 INFO neutron.agent.securitygroups_rpc [None req-11f572e0-ac9f-415d-bc0f-f3f90ef6adc7 23c87f3e6fcf4e92b503a3545c69b885 bc139a195b1a4766b00c4bbfdffdb9e3 - - default default] Security group member updated ['a20ff476-7b51-48c6-a80f-bb88f6adeae7']#033[00m Oct 14 06:17:01 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:17:01.776 271987 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-10-14T10:17:01Z, description=, device_id=cfe24122-b3fc-4591-8341-3771d2c5b029, device_owner=network:router_gateway, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=c2609e26-39a6-4996-890b-ed100843ce06, ip_allocation=immediate, mac_address=fa:16:3e:f1:fa:ab, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-10-14T08:36:38Z, description=, dns_domain=, id=c0145816-4627-44f2-af00-ccc9ef0436ed, ipv4_address_scope=None, ipv6_address_scope=None, is_default=False, l2_adjacency=True, mtu=1350, name=public, port_security_enabled=True, project_id=41187b090f3d4818a32baa37ce8a3991, provider:network_type=flat, provider:physical_network=datacentre, provider:segmentation_id=None, qos_policy_id=None, revision_number=2, router:external=True, shared=False, standard_attr_id=29, status=ACTIVE, subnets=['bbd40dc1-97d8-4ed3-9d4f-9b3af758b526'], tags=[], tenant_id=41187b090f3d4818a32baa37ce8a3991, updated_at=2025-10-14T08:36:44Z, 
vlan_transparent=None, network_id=c0145816-4627-44f2-af00-ccc9ef0436ed, port_security_enabled=False, project_id=, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2218, status=DOWN, tags=[], tenant_id=, updated_at=2025-10-14T10:17:01Z on network c0145816-4627-44f2-af00-ccc9ef0436ed#033[00m Oct 14 06:17:01 localhost neutron_sriov_agent[264974]: 2025-10-14 10:17:01.871 2 INFO neutron.agent.securitygroups_rpc [None req-0ccc5aba-1212-45b4-a10a-2b1f47c48b5c 73c3910059834cd0998d3459c50cd69d 82fc7afce38344ffb7eda3bb0fabdb5b - - default default] Security group member updated ['10f25aec-a6f2-40dd-837d-8812e1c0ebb8']#033[00m Oct 14 06:17:01 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:17:01.929 271987 INFO neutron.agent.linux.ip_lib [None req-a74ea0e4-b651-4dee-a40c-65de2930264a - - - - - -] Device tapf8b925c9-2f cannot be used as it has no MAC address#033[00m Oct 14 06:17:01 localhost nova_compute[297686]: 2025-10-14 10:17:01.955 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:17:01 localhost kernel: device tapf8b925c9-2f entered promiscuous mode Oct 14 06:17:01 localhost nova_compute[297686]: 2025-10-14 10:17:01.962 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:17:01 localhost ovn_controller[157396]: 2025-10-14T10:17:01Z|00206|binding|INFO|Claiming lport f8b925c9-2f2c-453b-979b-663ca7ae67af for this chassis. 
Oct 14 06:17:01 localhost ovn_controller[157396]: 2025-10-14T10:17:01Z|00207|binding|INFO|f8b925c9-2f2c-453b-979b-663ca7ae67af: Claiming unknown Oct 14 06:17:01 localhost NetworkManager[5977]: [1760437021.9634] manager: (tapf8b925c9-2f): new Generic device (/org/freedesktop/NetworkManager/Devices/37) Oct 14 06:17:01 localhost ovn_metadata_agent[163050]: 2025-10-14 10:17:01.977 163055 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005486733.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe76:f4a7/64', 'neutron:device_id': 'dhcpe1893814-19d7-5b97-af05-19e2aafa5382-2f908ad2-fe64-458f-8a0e-8503bdef5bc9', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2f908ad2-fe64-458f-8a0e-8503bdef5bc9', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'bc139a195b1a4766b00c4bbfdffdb9e3', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3eced354-3663-47b9-9bfe-0ecb7e2e6a6e, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=f8b925c9-2f2c-453b-979b-663ca7ae67af) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Oct 14 06:17:01 localhost ovn_metadata_agent[163050]: 2025-10-14 10:17:01.979 163055 INFO neutron.agent.ovn.metadata.agent [-] Port f8b925c9-2f2c-453b-979b-663ca7ae67af in datapath 2f908ad2-fe64-458f-8a0e-8503bdef5bc9 bound to our chassis#033[00m 
Oct 14 06:17:01 localhost ovn_metadata_agent[163050]: 2025-10-14 10:17:01.982 163055 DEBUG neutron.agent.ovn.metadata.agent [-] Port cb1ca13a-aa22-4fc7-b4c1-3436a7949263 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Oct 14 06:17:01 localhost nova_compute[297686]: 2025-10-14 10:17:01.982 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:17:01 localhost ovn_metadata_agent[163050]: 2025-10-14 10:17:01.982 163055 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 2f908ad2-fe64-458f-8a0e-8503bdef5bc9, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Oct 14 06:17:01 localhost ovn_metadata_agent[163050]: 2025-10-14 10:17:01.983 163159 DEBUG oslo.privsep.daemon [-] privsep: reply[338f33d2-2f42-4394-b0ee-b164f48ecc1e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Oct 14 06:17:01 localhost ovn_controller[157396]: 2025-10-14T10:17:01Z|00208|binding|INFO|Setting lport f8b925c9-2f2c-453b-979b-663ca7ae67af ovn-installed in OVS Oct 14 06:17:01 localhost ovn_controller[157396]: 2025-10-14T10:17:01Z|00209|binding|INFO|Setting lport f8b925c9-2f2c-453b-979b-663ca7ae67af up in Southbound Oct 14 06:17:01 localhost nova_compute[297686]: 2025-10-14 10:17:01.986 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:17:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad. Oct 14 06:17:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec. 
Oct 14 06:17:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29. Oct 14 06:17:02 localhost podman[332926]: 2025-10-14 10:17:02.007352493 +0000 UTC m=+0.067685897 container kill 27e23fba1f10c3118d508631d350793857acd97085c399f85eed727cd1eab7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0145816-4627-44f2-af00-ccc9ef0436ed, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251009) Oct 14 06:17:02 localhost dnsmasq[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/addn_hosts - 3 addresses Oct 14 06:17:02 localhost dnsmasq-dhcp[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/host Oct 14 06:17:02 localhost dnsmasq-dhcp[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/opts Oct 14 06:17:02 localhost nova_compute[297686]: 2025-10-14 10:17:02.010 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:17:02 localhost nova_compute[297686]: 2025-10-14 10:17:02.064 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:17:02 localhost podman[332940]: 2025-10-14 10:17:02.089990272 +0000 UTC m=+0.090118272 container health_status aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors , 
managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm) Oct 14 06:17:02 localhost nova_compute[297686]: 2025-10-14 10:17:02.092 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:17:02 localhost podman[332940]: 2025-10-14 10:17:02.125014407 +0000 UTC m=+0.125142437 container exec_died aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', 
'--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors ) Oct 14 06:17:02 localhost podman[332941]: 2025-10-14 10:17:02.139713462 +0000 UTC m=+0.121496323 container health_status fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_id=iscsid) Oct 14 06:17:02 localhost systemd[1]: aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec.service: Deactivated successfully. Oct 14 06:17:02 localhost podman[332941]: 2025-10-14 10:17:02.144358517 +0000 UTC m=+0.126141418 container exec_died fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, config_id=iscsid, container_name=iscsid, org.label-schema.schema-version=1.0, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.build-date=20251009, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image) Oct 14 06:17:02 localhost systemd[1]: fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29.service: Deactivated successfully. Oct 14 06:17:02 localhost podman[332939]: 2025-10-14 10:17:02.200556247 +0000 UTC m=+0.194556187 container health_status 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2) Oct 14 06:17:02 localhost podman[332939]: 2025-10-14 10:17:02.21195701 +0000 UTC m=+0.205956920 container exec_died 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, container_name=multipathd, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible) Oct 14 06:17:02 localhost systemd[1]: 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad.service: Deactivated successfully. Oct 14 06:17:02 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:17:02.261 271987 INFO neutron.agent.dhcp.agent [None req-04ece16e-38af-4b4c-82ef-e15c084a7df6 - - - - - -] DHCP configuration for ports {'c2609e26-39a6-4996-890b-ed100843ce06'} is completed#033[00m Oct 14 06:17:02 localhost neutron_sriov_agent[264974]: 2025-10-14 10:17:02.533 2 INFO neutron.agent.securitygroups_rpc [None req-2ef3d09c-5701-40ee-9138-ed88eafd4701 73c3910059834cd0998d3459c50cd69d 82fc7afce38344ffb7eda3bb0fabdb5b - - default default] Security group member updated ['10f25aec-a6f2-40dd-837d-8812e1c0ebb8']#033[00m Oct 14 06:17:02 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e138 e138: 6 total, 6 up, 6 in Oct 14 06:17:02 localhost podman[333061]: Oct 14 06:17:03 localhost podman[333061]: 2025-10-14 10:17:03.004116786 +0000 UTC m=+0.091529346 container create 49d533029eab0b8c8fe9e3b000b6c320336d2dd9c63e7eeb72cc1b08be02d853 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2f908ad2-fe64-458f-8a0e-8503bdef5bc9, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251009, 
org.label-schema.vendor=CentOS) Oct 14 06:17:03 localhost systemd[1]: Started libpod-conmon-49d533029eab0b8c8fe9e3b000b6c320336d2dd9c63e7eeb72cc1b08be02d853.scope. Oct 14 06:17:03 localhost systemd[1]: tmp-crun.4eZhBC.mount: Deactivated successfully. Oct 14 06:17:03 localhost podman[333061]: 2025-10-14 10:17:02.964104086 +0000 UTC m=+0.051516676 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Oct 14 06:17:03 localhost systemd[1]: Started libcrun container. Oct 14 06:17:03 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e2293f1c2583a32be6bd8735068bb8b90779723d2413cf672e9321b3f8c8f290/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Oct 14 06:17:03 localhost podman[333061]: 2025-10-14 10:17:03.084592888 +0000 UTC m=+0.172005428 container init 49d533029eab0b8c8fe9e3b000b6c320336d2dd9c63e7eeb72cc1b08be02d853 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2f908ad2-fe64-458f-8a0e-8503bdef5bc9, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3) Oct 14 06:17:03 localhost podman[333061]: 2025-10-14 10:17:03.093818574 +0000 UTC m=+0.181231104 container start 49d533029eab0b8c8fe9e3b000b6c320336d2dd9c63e7eeb72cc1b08be02d853 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2f908ad2-fe64-458f-8a0e-8503bdef5bc9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, 
io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team) Oct 14 06:17:03 localhost dnsmasq[333079]: started, version 2.85 cachesize 150 Oct 14 06:17:03 localhost dnsmasq[333079]: DNS service limited to local subnets Oct 14 06:17:03 localhost dnsmasq[333079]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Oct 14 06:17:03 localhost dnsmasq[333079]: warning: no upstream servers configured Oct 14 06:17:03 localhost dnsmasq[333079]: read /var/lib/neutron/dhcp/2f908ad2-fe64-458f-8a0e-8503bdef5bc9/addn_hosts - 0 addresses Oct 14 06:17:03 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:17:03.232 271987 INFO neutron.agent.dhcp.agent [None req-cd4f6c36-0e98-436d-802c-01ff2110427d - - - - - -] DHCP configuration for ports {'db8645e5-e5fc-45f7-9ee1-21ee82391e4d'} is completed#033[00m Oct 14 06:17:03 localhost dnsmasq[333079]: exiting on receipt of SIGTERM Oct 14 06:17:03 localhost podman[333096]: 2025-10-14 10:17:03.39038062 +0000 UTC m=+0.050113424 container kill 49d533029eab0b8c8fe9e3b000b6c320336d2dd9c63e7eeb72cc1b08be02d853 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2f908ad2-fe64-458f-8a0e-8503bdef5bc9, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS) Oct 14 06:17:03 localhost systemd[1]: libpod-49d533029eab0b8c8fe9e3b000b6c320336d2dd9c63e7eeb72cc1b08be02d853.scope: Deactivated successfully. 
Oct 14 06:17:03 localhost podman[333109]: 2025-10-14 10:17:03.470036466 +0000 UTC m=+0.065701385 container died 49d533029eab0b8c8fe9e3b000b6c320336d2dd9c63e7eeb72cc1b08be02d853 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2f908ad2-fe64-458f-8a0e-8503bdef5bc9, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image) Oct 14 06:17:03 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-49d533029eab0b8c8fe9e3b000b6c320336d2dd9c63e7eeb72cc1b08be02d853-userdata-shm.mount: Deactivated successfully. Oct 14 06:17:03 localhost podman[333109]: 2025-10-14 10:17:03.507387483 +0000 UTC m=+0.103052352 container cleanup 49d533029eab0b8c8fe9e3b000b6c320336d2dd9c63e7eeb72cc1b08be02d853 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2f908ad2-fe64-458f-8a0e-8503bdef5bc9, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, io.buildah.version=1.41.3) Oct 14 06:17:03 localhost systemd[1]: libpod-conmon-49d533029eab0b8c8fe9e3b000b6c320336d2dd9c63e7eeb72cc1b08be02d853.scope: Deactivated successfully. 
Oct 14 06:17:03 localhost podman[333111]: 2025-10-14 10:17:03.548646091 +0000 UTC m=+0.135594090 container remove 49d533029eab0b8c8fe9e3b000b6c320336d2dd9c63e7eeb72cc1b08be02d853 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2f908ad2-fe64-458f-8a0e-8503bdef5bc9, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, org.label-schema.vendor=CentOS) Oct 14 06:17:03 localhost nova_compute[297686]: 2025-10-14 10:17:03.562 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:17:03 localhost ovn_controller[157396]: 2025-10-14T10:17:03Z|00210|binding|INFO|Releasing lport f8b925c9-2f2c-453b-979b-663ca7ae67af from this chassis (sb_readonly=0) Oct 14 06:17:03 localhost ovn_controller[157396]: 2025-10-14T10:17:03Z|00211|binding|INFO|Setting lport f8b925c9-2f2c-453b-979b-663ca7ae67af down in Southbound Oct 14 06:17:03 localhost kernel: device tapf8b925c9-2f left promiscuous mode Oct 14 06:17:03 localhost ovn_metadata_agent[163050]: 2025-10-14 10:17:03.574 163055 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005486733.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe76:f4a7/64', 'neutron:device_id': 'dhcpe1893814-19d7-5b97-af05-19e2aafa5382-2f908ad2-fe64-458f-8a0e-8503bdef5bc9', 'neutron:device_owner': 'network:dhcp', 
'neutron:mtu': '', 'neutron:network_name': 'neutron-2f908ad2-fe64-458f-8a0e-8503bdef5bc9', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'bc139a195b1a4766b00c4bbfdffdb9e3', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005486733.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3eced354-3663-47b9-9bfe-0ecb7e2e6a6e, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=f8b925c9-2f2c-453b-979b-663ca7ae67af) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Oct 14 06:17:03 localhost ovn_metadata_agent[163050]: 2025-10-14 10:17:03.576 163055 INFO neutron.agent.ovn.metadata.agent [-] Port f8b925c9-2f2c-453b-979b-663ca7ae67af in datapath 2f908ad2-fe64-458f-8a0e-8503bdef5bc9 unbound from our chassis#033[00m Oct 14 06:17:03 localhost ovn_metadata_agent[163050]: 2025-10-14 10:17:03.579 163055 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 2f908ad2-fe64-458f-8a0e-8503bdef5bc9, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Oct 14 06:17:03 localhost ovn_metadata_agent[163050]: 2025-10-14 10:17:03.580 163159 DEBUG oslo.privsep.daemon [-] privsep: reply[0985ad90-b772-4f7f-bc05-999eaf76b425]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Oct 14 06:17:03 localhost nova_compute[297686]: 2025-10-14 10:17:03.585 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:17:03 localhost nova_compute[297686]: 2025-10-14 10:17:03.767 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 
__log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:17:03 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:17:03.915 271987 INFO neutron.agent.dhcp.agent [None req-a8303c98-d35e-41a6-a9d4-df8c2f5f5b82 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Oct 14 06:17:04 localhost dnsmasq[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/addn_hosts - 2 addresses Oct 14 06:17:04 localhost podman[333156]: 2025-10-14 10:17:04.195160305 +0000 UTC m=+0.044787758 container kill 27e23fba1f10c3118d508631d350793857acd97085c399f85eed727cd1eab7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0145816-4627-44f2-af00-ccc9ef0436ed, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5) Oct 14 06:17:04 localhost dnsmasq-dhcp[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/host Oct 14 06:17:04 localhost dnsmasq-dhcp[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/opts Oct 14 06:17:04 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Oct 14 06:17:04 localhost systemd[1]: var-lib-containers-storage-overlay-e2293f1c2583a32be6bd8735068bb8b90779723d2413cf672e9321b3f8c8f290-merged.mount: Deactivated successfully. Oct 14 06:17:04 localhost systemd[1]: run-netns-qdhcp\x2d2f908ad2\x2dfe64\x2d458f\x2d8a0e\x2d8503bdef5bc9.mount: Deactivated successfully. 
Oct 14 06:17:04 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e139 e139: 6 total, 6 up, 6 in Oct 14 06:17:04 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e140 e140: 6 total, 6 up, 6 in Oct 14 06:17:04 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:17:04.964 271987 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-10-14T10:17:04Z, description=, device_id=ddc88ef5-4a63-42dc-abdf-966921a9e52b, device_owner=network:router_gateway, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=d35621b7-0324-47c0-89e9-7e06bc87fd43, ip_allocation=immediate, mac_address=fa:16:3e:c7:4e:78, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-10-14T08:36:38Z, description=, dns_domain=, id=c0145816-4627-44f2-af00-ccc9ef0436ed, ipv4_address_scope=None, ipv6_address_scope=None, is_default=False, l2_adjacency=True, mtu=1350, name=public, port_security_enabled=True, project_id=41187b090f3d4818a32baa37ce8a3991, provider:network_type=flat, provider:physical_network=datacentre, provider:segmentation_id=None, qos_policy_id=None, revision_number=2, router:external=True, shared=False, standard_attr_id=29, status=ACTIVE, subnets=['bbd40dc1-97d8-4ed3-9d4f-9b3af758b526'], tags=[], tenant_id=41187b090f3d4818a32baa37ce8a3991, updated_at=2025-10-14T08:36:44Z, vlan_transparent=None, network_id=c0145816-4627-44f2-af00-ccc9ef0436ed, port_security_enabled=False, project_id=, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2243, status=DOWN, tags=[], tenant_id=, updated_at=2025-10-14T10:17:04Z on network c0145816-4627-44f2-af00-ccc9ef0436ed#033[00m Oct 14 06:17:05 localhost systemd[1]: tmp-crun.sxqcyZ.mount: Deactivated successfully. 
Oct 14 06:17:05 localhost dnsmasq[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/addn_hosts - 3 addresses Oct 14 06:17:05 localhost podman[333195]: 2025-10-14 10:17:05.206408796 +0000 UTC m=+0.060673080 container kill 27e23fba1f10c3118d508631d350793857acd97085c399f85eed727cd1eab7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0145816-4627-44f2-af00-ccc9ef0436ed, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS) Oct 14 06:17:05 localhost dnsmasq-dhcp[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/host Oct 14 06:17:05 localhost dnsmasq-dhcp[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/opts Oct 14 06:17:05 localhost nova_compute[297686]: 2025-10-14 10:17:05.301 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:17:05 localhost neutron_sriov_agent[264974]: 2025-10-14 10:17:05.464 2 INFO neutron.agent.securitygroups_rpc [None req-575abab5-2783-4770-9528-99d22d30e6e1 23c87f3e6fcf4e92b503a3545c69b885 bc139a195b1a4766b00c4bbfdffdb9e3 - - default default] Security group member updated ['a20ff476-7b51-48c6-a80f-bb88f6adeae7']#033[00m Oct 14 06:17:05 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:17:05.545 271987 INFO neutron.agent.dhcp.agent [None req-f0480257-07d2-4a06-a941-ecbc9e5ae53c - - - - - -] DHCP configuration for ports {'d35621b7-0324-47c0-89e9-7e06bc87fd43'} is completed#033[00m Oct 14 06:17:06 localhost neutron_sriov_agent[264974]: 2025-10-14 10:17:06.639 2 INFO neutron.agent.securitygroups_rpc [None req-6897e8e4-d926-40ba-b8d6-d86b71a1c265 
73c3910059834cd0998d3459c50cd69d 82fc7afce38344ffb7eda3bb0fabdb5b - - default default] Security group member updated ['10f25aec-a6f2-40dd-837d-8812e1c0ebb8']#033[00m Oct 14 06:17:06 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e141 e141: 6 total, 6 up, 6 in Oct 14 06:17:07 localhost neutron_sriov_agent[264974]: 2025-10-14 10:17:07.250 2 INFO neutron.agent.securitygroups_rpc [None req-24eae660-5697-45be-9e02-e6810e5b8e17 73c3910059834cd0998d3459c50cd69d 82fc7afce38344ffb7eda3bb0fabdb5b - - default default] Security group member updated ['10f25aec-a6f2-40dd-837d-8812e1c0ebb8']#033[00m Oct 14 06:17:07 localhost dnsmasq[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/addn_hosts - 2 addresses Oct 14 06:17:07 localhost podman[333234]: 2025-10-14 10:17:07.33677185 +0000 UTC m=+0.070385441 container kill 27e23fba1f10c3118d508631d350793857acd97085c399f85eed727cd1eab7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0145816-4627-44f2-af00-ccc9ef0436ed, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, io.buildah.version=1.41.3) Oct 14 06:17:07 localhost dnsmasq-dhcp[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/host Oct 14 06:17:07 localhost dnsmasq-dhcp[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/opts Oct 14 06:17:07 localhost systemd[1]: tmp-crun.58SsUS.mount: Deactivated successfully. 
Oct 14 06:17:07 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e142 e142: 6 total, 6 up, 6 in Oct 14 06:17:08 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Oct 14 06:17:08 localhost ceph-mon[317114]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3696337932' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Oct 14 06:17:08 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Oct 14 06:17:08 localhost ceph-mon[317114]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3696337932' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Oct 14 06:17:08 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:17:08.599 271987 INFO neutron.agent.dhcp.agent [None req-5ce02df7-327c-4a01-9b4a-02a59e26b994 - - - - - -] Synchronizing state#033[00m Oct 14 06:17:08 localhost nova_compute[297686]: 2025-10-14 10:17:08.771 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:17:08 localhost openstack_network_exporter[250374]: ERROR 10:17:08 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Oct 14 06:17:08 localhost openstack_network_exporter[250374]: ERROR 10:17:08 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Oct 14 06:17:08 localhost openstack_network_exporter[250374]: ERROR 10:17:08 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Oct 14 06:17:08 localhost openstack_network_exporter[250374]: ERROR 10:17:08 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Oct 14 06:17:08 localhost openstack_network_exporter[250374]: Oct 14 
06:17:08 localhost openstack_network_exporter[250374]: ERROR 10:17:08 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Oct 14 06:17:08 localhost openstack_network_exporter[250374]: Oct 14 06:17:08 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:17:08.960 271987 INFO neutron.agent.dhcp.agent [None req-a330ed12-4c69-43ca-bb15-7f3378a0c737 - - - - - -] All active networks have been fetched through RPC.#033[00m Oct 14 06:17:08 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:17:08.962 271987 INFO neutron.agent.dhcp.agent [-] Starting network 2f908ad2-fe64-458f-8a0e-8503bdef5bc9 dhcp configuration#033[00m Oct 14 06:17:08 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:17:08.963 271987 INFO neutron.agent.dhcp.agent [-] Finished network 2f908ad2-fe64-458f-8a0e-8503bdef5bc9 dhcp configuration#033[00m Oct 14 06:17:08 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:17:08.963 271987 INFO neutron.agent.dhcp.agent [-] Starting network 8d762092-0d6c-43ae-a63a-3cdc9c02347b dhcp configuration#033[00m Oct 14 06:17:08 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:17:08.964 271987 INFO neutron.agent.dhcp.agent [-] Finished network 8d762092-0d6c-43ae-a63a-3cdc9c02347b dhcp configuration#033[00m Oct 14 06:17:08 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:17:08.964 271987 INFO neutron.agent.dhcp.agent [None req-a330ed12-4c69-43ca-bb15-7f3378a0c737 - - - - - -] Synchronizing state complete#033[00m Oct 14 06:17:08 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:17:08.965 271987 INFO neutron.agent.dhcp.agent [None req-95636577-f7b9-4728-bfcb-b7fc4b2809bc - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Oct 14 06:17:09 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:17:09.368 271987 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Oct 14 06:17:09 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e142 _set_new_cache_sizes 
cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Oct 14 06:17:09 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e143 e143: 6 total, 6 up, 6 in Oct 14 06:17:10 localhost nova_compute[297686]: 2025-10-14 10:17:10.302 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:17:10 localhost neutron_sriov_agent[264974]: 2025-10-14 10:17:10.865 2 INFO neutron.agent.securitygroups_rpc [None req-20821d15-e078-46e7-8405-8376b39a40c2 4c194ea59b244432a9ec5417b8101ebe 5ac8b4aa702a449b8bf4a8039f977fc5 - - default default] Security group rule updated ['8fe43e8a-a14a-430f-ba7d-c6a0fef96a1b']#033[00m Oct 14 06:17:11 localhost neutron_sriov_agent[264974]: 2025-10-14 10:17:11.066 2 INFO neutron.agent.securitygroups_rpc [None req-e3edfc3e-0d01-4121-a932-2e3facb8956f 4c194ea59b244432a9ec5417b8101ebe 5ac8b4aa702a449b8bf4a8039f977fc5 - - default default] Security group rule updated ['8fe43e8a-a14a-430f-ba7d-c6a0fef96a1b']#033[00m Oct 14 06:17:11 localhost neutron_sriov_agent[264974]: 2025-10-14 10:17:11.250 2 INFO neutron.agent.securitygroups_rpc [None req-0e84d79c-299b-48cc-b1f2-14fae809fee1 89ecba9e60ab4ed4b2a8f801d81075be 0a8ee99608b94600b463f14d4902f3b7 - - default default] Security group member updated ['1e825526-ca45-4d75-b345-f72249726766']#033[00m Oct 14 06:17:12 localhost ovn_metadata_agent[163050]: 2025-10-14 10:17:12.217 163055 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:63:b4:89 2001:db8:0:1:f816:3eff:fe63:b489'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:1:f816:3eff:fe63:b489/64', 
'neutron:device_id': 'ovnmeta-74049e43-4aa7-4318-9233-a58980c3495b', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-74049e43-4aa7-4318-9233-a58980c3495b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '82fc7afce38344ffb7eda3bb0fabdb5b', 'neutron:revision_number': '30', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0f1f1366-6307-4914-922e-2b4f9757811b, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=bb90059a-750e-43da-ba16-03b3dce8c155) old=Port_Binding(mac=['fa:16:3e:63:b4:89 2001:db8::f816:3eff:fe63:b489'], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe63:b489/64', 'neutron:device_id': 'ovnmeta-74049e43-4aa7-4318-9233-a58980c3495b', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-74049e43-4aa7-4318-9233-a58980c3495b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '82fc7afce38344ffb7eda3bb0fabdb5b', 'neutron:revision_number': '28', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Oct 14 06:17:12 localhost ovn_metadata_agent[163050]: 2025-10-14 10:17:12.219 163055 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port bb90059a-750e-43da-ba16-03b3dce8c155 in datapath 74049e43-4aa7-4318-9233-a58980c3495b updated#033[00m Oct 14 06:17:12 localhost ovn_metadata_agent[163050]: 2025-10-14 10:17:12.222 163055 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 74049e43-4aa7-4318-9233-a58980c3495b, tearing the namespace down if needed _get_provision_params 
/usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Oct 14 06:17:12 localhost ovn_metadata_agent[163050]: 2025-10-14 10:17:12.223 163159 DEBUG oslo.privsep.daemon [-] privsep: reply[06f90852-d67f-4f83-814a-b270047a248c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Oct 14 06:17:12 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Oct 14 06:17:12 localhost ceph-mon[317114]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3030581968' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Oct 14 06:17:12 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Oct 14 06:17:12 localhost ceph-mon[317114]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3030581968' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Oct 14 06:17:13 localhost neutron_sriov_agent[264974]: 2025-10-14 10:17:13.259 2 INFO neutron.agent.securitygroups_rpc [None req-922b1e21-7d02-46c0-8ac9-6ab363b151e5 73c3910059834cd0998d3459c50cd69d 82fc7afce38344ffb7eda3bb0fabdb5b - - default default] Security group member updated ['10f25aec-a6f2-40dd-837d-8812e1c0ebb8']#033[00m Oct 14 06:17:13 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e144 e144: 6 total, 6 up, 6 in Oct 14 06:17:13 localhost nova_compute[297686]: 2025-10-14 10:17:13.775 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:17:14 localhost neutron_sriov_agent[264974]: 2025-10-14 10:17:14.072 2 INFO neutron.agent.securitygroups_rpc [None req-994ba1d8-96d9-429a-b3ec-c8df771bdf05 73c3910059834cd0998d3459c50cd69d 82fc7afce38344ffb7eda3bb0fabdb5b - - default default] Security group member updated 
['10f25aec-a6f2-40dd-837d-8812e1c0ebb8']#033[00m Oct 14 06:17:14 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Oct 14 06:17:14 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e145 e145: 6 total, 6 up, 6 in Oct 14 06:17:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f. Oct 14 06:17:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917. Oct 14 06:17:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d. Oct 14 06:17:14 localhost systemd[1]: tmp-crun.8TaUtC.mount: Deactivated successfully. Oct 14 06:17:14 localhost podman[333256]: 2025-10-14 10:17:14.746158449 +0000 UTC m=+0.087103279 container health_status 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, name=ubi9-minimal, io.buildah.version=1.33.7, managed_by=edpm_ansible, architecture=x86_64, build-date=2025-08-20T13:12:41, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.) 
Oct 14 06:17:14 localhost podman[333257]: 2025-10-14 10:17:14.781774482 +0000 UTC m=+0.121249356 container health_status 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_id=edpm, container_name=ceilometer_agent_compute, tcib_managed=true, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, io.buildah.version=1.41.3) Oct 14 06:17:14 localhost podman[333257]: 2025-10-14 10:17:14.791128031 +0000 UTC m=+0.130602925 container exec_died 
89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true, org.label-schema.schema-version=1.0) Oct 14 06:17:14 localhost systemd[1]: 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d.service: Deactivated successfully. 
Oct 14 06:17:14 localhost podman[333256]: 2025-10-14 10:17:14.810490601 +0000 UTC m=+0.151435401 container exec_died 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, release=1755695350, 
build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, io.openshift.expose-services=, distribution-scope=public, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers) Oct 14 06:17:14 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:17:14.827 271987 INFO neutron.agent.linux.ip_lib [None req-9e7c5bc9-d28d-4fa6-9d8f-3eceda41cfc6 - - - - - -] Device tap0726f454-42 cannot be used as it has no MAC address#033[00m Oct 14 06:17:14 localhost systemd[1]: 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917.service: Deactivated successfully. 
Oct 14 06:17:14 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e146 e146: 6 total, 6 up, 6 in Oct 14 06:17:14 localhost nova_compute[297686]: 2025-10-14 10:17:14.881 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:17:14 localhost nova_compute[297686]: 2025-10-14 10:17:14.885 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:17:14 localhost kernel: device tap0726f454-42 entered promiscuous mode Oct 14 06:17:14 localhost NetworkManager[5977]: [1760437034.8922] manager: (tap0726f454-42): new Generic device (/org/freedesktop/NetworkManager/Devices/38) Oct 14 06:17:14 localhost ovn_controller[157396]: 2025-10-14T10:17:14Z|00212|binding|INFO|Claiming lport 0726f454-42dc-4a01-abe4-189c35f07b40 for this chassis. Oct 14 06:17:14 localhost nova_compute[297686]: 2025-10-14 10:17:14.893 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:17:14 localhost ovn_controller[157396]: 2025-10-14T10:17:14Z|00213|binding|INFO|0726f454-42dc-4a01-abe4-189c35f07b40: Claiming unknown Oct 14 06:17:14 localhost systemd-udevd[333317]: Network interface NamePolicy= disabled on kernel command line. 
Oct 14 06:17:14 localhost ovn_metadata_agent[163050]: 2025-10-14 10:17:14.902 163055 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005486733.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe7d:7764/64', 'neutron:device_id': 'dhcpe1893814-19d7-5b97-af05-19e2aafa5382-2368ec2a-3ef0-4db5-8150-84b9c0201704', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2368ec2a-3ef0-4db5-8150-84b9c0201704', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'bc139a195b1a4766b00c4bbfdffdb9e3', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e0659c90-23a7-49ee-8573-2526a6731106, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=0726f454-42dc-4a01-abe4-189c35f07b40) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Oct 14 06:17:14 localhost ovn_metadata_agent[163050]: 2025-10-14 10:17:14.905 163055 INFO neutron.agent.ovn.metadata.agent [-] Port 0726f454-42dc-4a01-abe4-189c35f07b40 in datapath 2368ec2a-3ef0-4db5-8150-84b9c0201704 bound to our chassis#033[00m Oct 14 06:17:14 localhost nova_compute[297686]: 2025-10-14 10:17:14.907 2 DEBUG nova.compute.manager [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Triggering sync for uuid 88c4e366-b765-47a6-96bf-f7677f2ce67c _sync_power_states 
/usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m Oct 14 06:17:14 localhost ovn_metadata_agent[163050]: 2025-10-14 10:17:14.908 163055 DEBUG neutron.agent.ovn.metadata.agent [-] Port 7ed3bfde-396d-45b8-b48d-d8d40cf5cf73 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Oct 14 06:17:14 localhost nova_compute[297686]: 2025-10-14 10:17:14.908 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Acquiring lock "88c4e366-b765-47a6-96bf-f7677f2ce67c" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Oct 14 06:17:14 localhost ovn_metadata_agent[163050]: 2025-10-14 10:17:14.908 163055 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 2368ec2a-3ef0-4db5-8150-84b9c0201704, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Oct 14 06:17:14 localhost ovn_metadata_agent[163050]: 2025-10-14 10:17:14.908 163159 DEBUG oslo.privsep.daemon [-] privsep: reply[7f67fe25-b251-4601-a502-bfd35a1e29e4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Oct 14 06:17:14 localhost nova_compute[297686]: 2025-10-14 10:17:14.909 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Lock "88c4e366-b765-47a6-96bf-f7677f2ce67c" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Oct 14 06:17:14 localhost journal[237477]: ethtool ioctl error on tap0726f454-42: No such device Oct 14 06:17:14 localhost ovn_controller[157396]: 
2025-10-14T10:17:14Z|00214|binding|INFO|Setting lport 0726f454-42dc-4a01-abe4-189c35f07b40 ovn-installed in OVS Oct 14 06:17:14 localhost ovn_controller[157396]: 2025-10-14T10:17:14Z|00215|binding|INFO|Setting lport 0726f454-42dc-4a01-abe4-189c35f07b40 up in Southbound Oct 14 06:17:14 localhost podman[333255]: 2025-10-14 10:17:14.930430976 +0000 UTC m=+0.270026374 container health_status 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_controller, config_id=ovn_controller) Oct 14 06:17:14 localhost nova_compute[297686]: 2025-10-14 10:17:14.932 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:17:14 localhost journal[237477]: 
ethtool ioctl error on tap0726f454-42: No such device Oct 14 06:17:14 localhost nova_compute[297686]: 2025-10-14 10:17:14.943 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Lock "88c4e366-b765-47a6-96bf-f7677f2ce67c" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.034s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Oct 14 06:17:14 localhost journal[237477]: ethtool ioctl error on tap0726f454-42: No such device Oct 14 06:17:14 localhost journal[237477]: ethtool ioctl error on tap0726f454-42: No such device Oct 14 06:17:14 localhost journal[237477]: ethtool ioctl error on tap0726f454-42: No such device Oct 14 06:17:14 localhost journal[237477]: ethtool ioctl error on tap0726f454-42: No such device Oct 14 06:17:14 localhost journal[237477]: ethtool ioctl error on tap0726f454-42: No such device Oct 14 06:17:14 localhost journal[237477]: ethtool ioctl error on tap0726f454-42: No such device Oct 14 06:17:14 localhost nova_compute[297686]: 2025-10-14 10:17:14.964 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:17:14 localhost nova_compute[297686]: 2025-10-14 10:17:14.994 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:17:15 localhost podman[333255]: 2025-10-14 10:17:15.001051773 +0000 UTC m=+0.340647161 container exec_died 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_data={'depends_on': ['openvswitch.service'], 'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_controller, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller) Oct 14 06:17:15 localhost systemd[1]: 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f.service: Deactivated successfully. 
Oct 14 06:17:15 localhost nova_compute[297686]: 2025-10-14 10:17:15.303 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:17:15 localhost podman[333398]: Oct 14 06:17:15 localhost podman[333398]: 2025-10-14 10:17:15.773012454 +0000 UTC m=+0.098383988 container create a15797aca0a26355715591ab5cb2d5ef3213e4fc81a115f2c28b17d107b2dfc0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2368ec2a-3ef0-4db5-8150-84b9c0201704, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3) Oct 14 06:17:15 localhost systemd[1]: Started libpod-conmon-a15797aca0a26355715591ab5cb2d5ef3213e4fc81a115f2c28b17d107b2dfc0.scope. Oct 14 06:17:15 localhost podman[333398]: 2025-10-14 10:17:15.725470931 +0000 UTC m=+0.050842415 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Oct 14 06:17:15 localhost systemd[1]: Started libcrun container. 
Oct 14 06:17:15 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c47eeb8f346421ac05dc7e8dd286a2150361f271144b7e8a9ca20b0d17e0b7df/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Oct 14 06:17:15 localhost podman[333398]: 2025-10-14 10:17:15.850773531 +0000 UTC m=+0.176144985 container init a15797aca0a26355715591ab5cb2d5ef3213e4fc81a115f2c28b17d107b2dfc0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2368ec2a-3ef0-4db5-8150-84b9c0201704, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2) Oct 14 06:17:15 localhost podman[333398]: 2025-10-14 10:17:15.859831703 +0000 UTC m=+0.185203157 container start a15797aca0a26355715591ab5cb2d5ef3213e4fc81a115f2c28b17d107b2dfc0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2368ec2a-3ef0-4db5-8150-84b9c0201704, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5) Oct 14 06:17:15 localhost dnsmasq[333416]: started, version 2.85 cachesize 150 Oct 14 06:17:15 localhost dnsmasq[333416]: DNS service limited to local subnets Oct 14 06:17:15 localhost dnsmasq[333416]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Oct 14 06:17:15 localhost dnsmasq[333416]: warning: no upstream servers 
configured Oct 14 06:17:15 localhost dnsmasq-dhcp[333416]: DHCPv6, static leases only on 2001:db8::, lease time 1d Oct 14 06:17:15 localhost dnsmasq[333416]: read /var/lib/neutron/dhcp/2368ec2a-3ef0-4db5-8150-84b9c0201704/addn_hosts - 0 addresses Oct 14 06:17:15 localhost dnsmasq-dhcp[333416]: read /var/lib/neutron/dhcp/2368ec2a-3ef0-4db5-8150-84b9c0201704/host Oct 14 06:17:15 localhost dnsmasq-dhcp[333416]: read /var/lib/neutron/dhcp/2368ec2a-3ef0-4db5-8150-84b9c0201704/opts Oct 14 06:17:15 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:17:15.985 271987 INFO neutron.agent.dhcp.agent [None req-36f20518-aa34-468a-b6d8-31a2bc607351 - - - - - -] DHCP configuration for ports {'3841f300-c7d7-44b6-a328-a2560a316687'} is completed#033[00m Oct 14 06:17:16 localhost podman[333434]: 2025-10-14 10:17:16.21071247 +0000 UTC m=+0.061382492 container kill a15797aca0a26355715591ab5cb2d5ef3213e4fc81a115f2c28b17d107b2dfc0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2368ec2a-3ef0-4db5-8150-84b9c0201704, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0) Oct 14 06:17:16 localhost dnsmasq[333416]: exiting on receipt of SIGTERM Oct 14 06:17:16 localhost systemd[1]: libpod-a15797aca0a26355715591ab5cb2d5ef3213e4fc81a115f2c28b17d107b2dfc0.scope: Deactivated successfully. 
Oct 14 06:17:16 localhost podman[333448]: 2025-10-14 10:17:16.265541789 +0000 UTC m=+0.039589978 container died a15797aca0a26355715591ab5cb2d5ef3213e4fc81a115f2c28b17d107b2dfc0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2368ec2a-3ef0-4db5-8150-84b9c0201704, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team) Oct 14 06:17:16 localhost podman[333448]: 2025-10-14 10:17:16.304722303 +0000 UTC m=+0.078770472 container remove a15797aca0a26355715591ab5cb2d5ef3213e4fc81a115f2c28b17d107b2dfc0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2368ec2a-3ef0-4db5-8150-84b9c0201704, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2) Oct 14 06:17:16 localhost nova_compute[297686]: 2025-10-14 10:17:16.343 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:17:16 localhost kernel: device tap0726f454-42 left promiscuous mode Oct 14 06:17:16 localhost ovn_controller[157396]: 2025-10-14T10:17:16Z|00216|binding|INFO|Releasing lport 0726f454-42dc-4a01-abe4-189c35f07b40 from this chassis (sb_readonly=0) Oct 14 06:17:16 localhost ovn_controller[157396]: 2025-10-14T10:17:16Z|00217|binding|INFO|Setting lport 0726f454-42dc-4a01-abe4-189c35f07b40 down in Southbound Oct 14 06:17:16 localhost systemd[1]: 
libpod-conmon-a15797aca0a26355715591ab5cb2d5ef3213e4fc81a115f2c28b17d107b2dfc0.scope: Deactivated successfully. Oct 14 06:17:16 localhost ovn_metadata_agent[163050]: 2025-10-14 10:17:16.358 163055 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005486733.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe7d:7764/64', 'neutron:device_id': 'dhcpe1893814-19d7-5b97-af05-19e2aafa5382-2368ec2a-3ef0-4db5-8150-84b9c0201704', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2368ec2a-3ef0-4db5-8150-84b9c0201704', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'bc139a195b1a4766b00c4bbfdffdb9e3', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005486733.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e0659c90-23a7-49ee-8573-2526a6731106, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=0726f454-42dc-4a01-abe4-189c35f07b40) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Oct 14 06:17:16 localhost ovn_metadata_agent[163050]: 2025-10-14 10:17:16.360 163055 INFO neutron.agent.ovn.metadata.agent [-] Port 0726f454-42dc-4a01-abe4-189c35f07b40 in datapath 2368ec2a-3ef0-4db5-8150-84b9c0201704 unbound from our chassis#033[00m Oct 14 06:17:16 localhost ovn_metadata_agent[163050]: 2025-10-14 10:17:16.361 163055 DEBUG neutron.agent.ovn.metadata.agent [-] 
There is no metadata port for network 2368ec2a-3ef0-4db5-8150-84b9c0201704 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Oct 14 06:17:16 localhost ovn_metadata_agent[163050]: 2025-10-14 10:17:16.361 163159 DEBUG oslo.privsep.daemon [-] privsep: reply[f799fe24-bb68-49d6-9bb2-3b4d0b62576c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Oct 14 06:17:16 localhost nova_compute[297686]: 2025-10-14 10:17:16.363 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:17:16 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e147 e147: 6 total, 6 up, 6 in Oct 14 06:17:16 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:17:16.561 271987 INFO neutron.agent.dhcp.agent [None req-0e438470-afff-4d34-bee1-4eaf26a38173 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Oct 14 06:17:16 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:17:16.562 271987 INFO neutron.agent.dhcp.agent [None req-0e438470-afff-4d34-bee1-4eaf26a38173 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Oct 14 06:17:16 localhost nova_compute[297686]: 2025-10-14 10:17:16.571 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:17:16 localhost ovn_metadata_agent[163050]: 2025-10-14 10:17:16.572 163055 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=16, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'b6:6b:50', 'max_tunid': 
'16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '6a:59:81:01:bc:8b'}, ipsec=False) old=SB_Global(nb_cfg=15) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Oct 14 06:17:16 localhost ovn_metadata_agent[163050]: 2025-10-14 10:17:16.572 163055 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m Oct 14 06:17:16 localhost systemd[1]: var-lib-containers-storage-overlay-c47eeb8f346421ac05dc7e8dd286a2150361f271144b7e8a9ca20b0d17e0b7df-merged.mount: Deactivated successfully. Oct 14 06:17:16 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a15797aca0a26355715591ab5cb2d5ef3213e4fc81a115f2c28b17d107b2dfc0-userdata-shm.mount: Deactivated successfully. Oct 14 06:17:16 localhost systemd[1]: run-netns-qdhcp\x2d2368ec2a\x2d3ef0\x2d4db5\x2d8150\x2d84b9c0201704.mount: Deactivated successfully. 
Oct 14 06:17:16 localhost ovn_controller[157396]: 2025-10-14T10:17:16Z|00218|binding|INFO|Releasing lport 25c6586a-239c-451b-aac2-e0a3ee5c3145 from this chassis (sb_readonly=0) Oct 14 06:17:16 localhost nova_compute[297686]: 2025-10-14 10:17:16.856 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:17:17 localhost neutron_sriov_agent[264974]: 2025-10-14 10:17:17.159 2 INFO neutron.agent.securitygroups_rpc [None req-2f4a518e-a759-4d76-98c3-9a8a58ba34a2 89ecba9e60ab4ed4b2a8f801d81075be 0a8ee99608b94600b463f14d4902f3b7 - - default default] Security group member updated ['1e825526-ca45-4d75-b345-f72249726766']#033[00m Oct 14 06:17:17 localhost neutron_sriov_agent[264974]: 2025-10-14 10:17:17.496 2 INFO neutron.agent.securitygroups_rpc [None req-350544ed-e804-411b-b4c0-7c1dc91acd03 23c87f3e6fcf4e92b503a3545c69b885 bc139a195b1a4766b00c4bbfdffdb9e3 - - default default] Security group member updated ['a20ff476-7b51-48c6-a80f-bb88f6adeae7']#033[00m Oct 14 06:17:17 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e148 e148: 6 total, 6 up, 6 in Oct 14 06:17:18 localhost neutron_sriov_agent[264974]: 2025-10-14 10:17:18.396 2 INFO neutron.agent.securitygroups_rpc [None req-781d2756-8989-48c7-8197-f77dae3ec589 10b55ef66b7942fbb887281b08c1c2c4 64a4f7cc952f4010aeadd1288d8b2d40 - - default default] Security group member updated ['82f65abf-851e-40c1-af7d-0dc1d45ee116']#033[00m Oct 14 06:17:18 localhost neutron_sriov_agent[264974]: 2025-10-14 10:17:18.554 2 INFO neutron.agent.securitygroups_rpc [req-53d0fc0c-66d1-42d3-89fc-c2ce28d642ac req-d20a620d-b8b9-4f7d-adb0-dd2566f4346e 4c194ea59b244432a9ec5417b8101ebe 5ac8b4aa702a449b8bf4a8039f977fc5 - - default default] Security group member updated ['8fe43e8a-a14a-430f-ba7d-c6a0fef96a1b']#033[00m Oct 14 06:17:18 localhost nova_compute[297686]: 2025-10-14 10:17:18.816 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:17:19 localhost neutron_sriov_agent[264974]: 2025-10-14 10:17:19.191 2 INFO neutron.agent.securitygroups_rpc [None req-b7e6bbed-7f3f-4a3c-80c3-50a29e36be94 23c87f3e6fcf4e92b503a3545c69b885 bc139a195b1a4766b00c4bbfdffdb9e3 - - default default] Security group member updated ['a20ff476-7b51-48c6-a80f-bb88f6adeae7']#033[00m Oct 14 06:17:19 localhost neutron_sriov_agent[264974]: 2025-10-14 10:17:19.219 2 INFO neutron.agent.securitygroups_rpc [None req-130ebb04-cb2a-433a-974b-eb9d9513b4eb 73c3910059834cd0998d3459c50cd69d 82fc7afce38344ffb7eda3bb0fabdb5b - - default default] Security group member updated ['10f25aec-a6f2-40dd-837d-8812e1c0ebb8']#033[00m Oct 14 06:17:19 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:17:19.235 271987 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Oct 14 06:17:19 localhost neutron_sriov_agent[264974]: 2025-10-14 10:17:19.277 2 INFO neutron.agent.securitygroups_rpc [None req-ed13e3b4-7a5d-4979-8f87-6fc73b0d5f68 10b55ef66b7942fbb887281b08c1c2c4 64a4f7cc952f4010aeadd1288d8b2d40 - - default default] Security group member updated ['82f65abf-851e-40c1-af7d-0dc1d45ee116']#033[00m Oct 14 06:17:19 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e148 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Oct 14 06:17:19 localhost ovn_metadata_agent[163050]: 2025-10-14 10:17:19.574 163055 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=9e4b0f79-1220-4c7d-a18d-fa1a88dab362, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '16'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Oct 14 06:17:19 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:17:19.938 271987 INFO neutron.agent.dhcp.agent [-] Network 
not present, action: clean_devices, action_kwargs: {}#033[00m Oct 14 06:17:19 localhost neutron_sriov_agent[264974]: 2025-10-14 10:17:19.989 2 INFO neutron.agent.securitygroups_rpc [None req-2ec4d1c4-e5f7-4c7a-9600-8dd539f820f9 73c3910059834cd0998d3459c50cd69d 82fc7afce38344ffb7eda3bb0fabdb5b - - default default] Security group member updated ['10f25aec-a6f2-40dd-837d-8812e1c0ebb8']#033[00m Oct 14 06:17:20 localhost neutron_sriov_agent[264974]: 2025-10-14 10:17:20.063 2 INFO neutron.agent.securitygroups_rpc [None req-13b9aea2-710f-4d66-97e3-a7afd3b9cd17 10b55ef66b7942fbb887281b08c1c2c4 64a4f7cc952f4010aeadd1288d8b2d40 - - default default] Security group member updated ['82f65abf-851e-40c1-af7d-0dc1d45ee116']#033[00m Oct 14 06:17:20 localhost nova_compute[297686]: 2025-10-14 10:17:20.306 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:17:20 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e149 e149: 6 total, 6 up, 6 in Oct 14 06:17:21 localhost neutron_sriov_agent[264974]: 2025-10-14 10:17:21.169 2 INFO neutron.agent.securitygroups_rpc [None req-761d8405-b4ba-472e-86ed-7284e6cf7df2 10b55ef66b7942fbb887281b08c1c2c4 64a4f7cc952f4010aeadd1288d8b2d40 - - default default] Security group member updated ['82f65abf-851e-40c1-af7d-0dc1d45ee116']#033[00m Oct 14 06:17:21 localhost dnsmasq[332865]: exiting on receipt of SIGTERM Oct 14 06:17:21 localhost podman[333492]: 2025-10-14 10:17:21.525825354 +0000 UTC m=+0.063254550 container kill c154b95220c5573694845f19551c76518a3d0e2cd70e89ccefd8420f7fae0b96 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-3f117742-ee6e-4e4d-b21a-e67620a12d9d, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, 
tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS) Oct 14 06:17:21 localhost systemd[1]: libpod-c154b95220c5573694845f19551c76518a3d0e2cd70e89ccefd8420f7fae0b96.scope: Deactivated successfully. Oct 14 06:17:21 localhost podman[333506]: 2025-10-14 10:17:21.606012878 +0000 UTC m=+0.057821563 container died c154b95220c5573694845f19551c76518a3d0e2cd70e89ccefd8420f7fae0b96 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-3f117742-ee6e-4e4d-b21a-e67620a12d9d, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5) Oct 14 06:17:21 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c154b95220c5573694845f19551c76518a3d0e2cd70e89ccefd8420f7fae0b96-userdata-shm.mount: Deactivated successfully. Oct 14 06:17:21 localhost systemd[1]: var-lib-containers-storage-overlay-f2adb2e8eac7a34a5f1b648873fbfc06fc844adda8d9a22f66a885f2ea48315d-merged.mount: Deactivated successfully. 
Oct 14 06:17:21 localhost podman[333506]: 2025-10-14 10:17:21.648881245 +0000 UTC m=+0.100689880 container remove c154b95220c5573694845f19551c76518a3d0e2cd70e89ccefd8420f7fae0b96 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-3f117742-ee6e-4e4d-b21a-e67620a12d9d, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS) Oct 14 06:17:21 localhost systemd[1]: libpod-conmon-c154b95220c5573694845f19551c76518a3d0e2cd70e89ccefd8420f7fae0b96.scope: Deactivated successfully. Oct 14 06:17:21 localhost nova_compute[297686]: 2025-10-14 10:17:21.667 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:17:21 localhost ovn_controller[157396]: 2025-10-14T10:17:21Z|00219|binding|INFO|Releasing lport dbfd4483-51d1-4494-b3b7-3dc214088810 from this chassis (sb_readonly=0) Oct 14 06:17:21 localhost ovn_controller[157396]: 2025-10-14T10:17:21Z|00220|binding|INFO|Setting lport dbfd4483-51d1-4494-b3b7-3dc214088810 down in Southbound Oct 14 06:17:21 localhost kernel: device tapdbfd4483-51 left promiscuous mode Oct 14 06:17:21 localhost ovn_metadata_agent[163050]: 2025-10-14 10:17:21.677 163055 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005486733.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:ffff::2/64', 
'neutron:device_id': 'dhcpe1893814-19d7-5b97-af05-19e2aafa5382-3f117742-ee6e-4e4d-b21a-e67620a12d9d', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3f117742-ee6e-4e4d-b21a-e67620a12d9d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'bc139a195b1a4766b00c4bbfdffdb9e3', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005486733.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=142a08df-27d2-41fb-a032-4b8199e0d461, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=dbfd4483-51d1-4494-b3b7-3dc214088810) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Oct 14 06:17:21 localhost ovn_metadata_agent[163050]: 2025-10-14 10:17:21.680 163055 INFO neutron.agent.ovn.metadata.agent [-] Port dbfd4483-51d1-4494-b3b7-3dc214088810 in datapath 3f117742-ee6e-4e4d-b21a-e67620a12d9d unbound from our chassis#033[00m Oct 14 06:17:21 localhost ovn_metadata_agent[163050]: 2025-10-14 10:17:21.682 163055 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 3f117742-ee6e-4e4d-b21a-e67620a12d9d or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Oct 14 06:17:21 localhost ovn_metadata_agent[163050]: 2025-10-14 10:17:21.683 163159 DEBUG oslo.privsep.daemon [-] privsep: reply[025450bc-d186-45ef-b660-5f8b1830b856]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Oct 14 06:17:21 localhost nova_compute[297686]: 2025-10-14 10:17:21.688 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:17:21 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:17:21.865 271987 INFO neutron.agent.dhcp.agent [None req-66720a25-2b4b-40f7-b6cc-838c2f76ef42 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Oct 14 06:17:21 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:17:21.865 271987 INFO neutron.agent.dhcp.agent [None req-66720a25-2b4b-40f7-b6cc-838c2f76ef42 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Oct 14 06:17:21 localhost systemd[1]: run-netns-qdhcp\x2d3f117742\x2dee6e\x2d4e4d\x2db21a\x2de67620a12d9d.mount: Deactivated successfully. Oct 14 06:17:22 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:17:22.081 271987 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Oct 14 06:17:22 localhost ovn_controller[157396]: 2025-10-14T10:17:22Z|00221|binding|INFO|Releasing lport 25c6586a-239c-451b-aac2-e0a3ee5c3145 from this chassis (sb_readonly=0) Oct 14 06:17:22 localhost nova_compute[297686]: 2025-10-14 10:17:22.522 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:17:22 localhost neutron_sriov_agent[264974]: 2025-10-14 10:17:22.574 2 INFO neutron.agent.securitygroups_rpc [None req-3b1fd47a-b385-45e6-a18b-613582c496ea 10b55ef66b7942fbb887281b08c1c2c4 64a4f7cc952f4010aeadd1288d8b2d40 - - default default] Security group member updated ['82f65abf-851e-40c1-af7d-0dc1d45ee116']#033[00m Oct 14 06:17:23 localhost neutron_sriov_agent[264974]: 2025-10-14 10:17:23.033 2 INFO neutron.agent.securitygroups_rpc [None req-25897af1-4107-422e-ad6d-8107d81487b2 10b55ef66b7942fbb887281b08c1c2c4 64a4f7cc952f4010aeadd1288d8b2d40 - - default default] Security group member updated ['82f65abf-851e-40c1-af7d-0dc1d45ee116']#033[00m Oct 14 06:17:23 localhost nova_compute[297686]: 2025-10-14 10:17:23.849 2 
DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:17:24 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e149 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Oct 14 06:17:24 localhost neutron_sriov_agent[264974]: 2025-10-14 10:17:24.641 2 INFO neutron.agent.securitygroups_rpc [None req-2ea822f2-eb65-4bfd-b6c2-8220108a0ad2 10b55ef66b7942fbb887281b08c1c2c4 64a4f7cc952f4010aeadd1288d8b2d40 - - default default] Security group member updated ['82f65abf-851e-40c1-af7d-0dc1d45ee116']#033[00m Oct 14 06:17:24 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e150 e150: 6 total, 6 up, 6 in Oct 14 06:17:25 localhost neutron_sriov_agent[264974]: 2025-10-14 10:17:25.131 2 INFO neutron.agent.securitygroups_rpc [None req-3095d98a-70fd-4677-894b-50b46d66e5f2 10b55ef66b7942fbb887281b08c1c2c4 64a4f7cc952f4010aeadd1288d8b2d40 - - default default] Security group member updated ['82f65abf-851e-40c1-af7d-0dc1d45ee116']#033[00m Oct 14 06:17:25 localhost nova_compute[297686]: 2025-10-14 10:17:25.308 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:17:25 localhost neutron_sriov_agent[264974]: 2025-10-14 10:17:25.578 2 INFO neutron.agent.securitygroups_rpc [None req-fd2f30ab-7af5-4689-be38-54c72b0865c4 10b55ef66b7942fbb887281b08c1c2c4 64a4f7cc952f4010aeadd1288d8b2d40 - - default default] Security group member updated ['82f65abf-851e-40c1-af7d-0dc1d45ee116']#033[00m Oct 14 06:17:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041. Oct 14 06:17:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857. 
Oct 14 06:17:26 localhost podman[333550]: 2025-10-14 10:17:26.267465296 +0000 UTC m=+0.079565056 container health_status 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team) Oct 14 06:17:26 localhost podman[333550]: 2025-10-14 10:17:26.273778291 +0000 UTC 
m=+0.085878091 container exec_died 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.build-date=20251009, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, managed_by=edpm_ansible) Oct 14 06:17:26 localhost systemd[1]: 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857.service: Deactivated successfully. 
Oct 14 06:17:26 localhost systemd[1]: tmp-crun.fCBmYw.mount: Deactivated successfully. Oct 14 06:17:26 localhost podman[333549]: 2025-10-14 10:17:26.335287586 +0000 UTC m=+0.152811813 container health_status 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Oct 14 06:17:26 localhost podman[333549]: 2025-10-14 10:17:26.373120739 +0000 UTC m=+0.190644936 container exec_died 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck 
podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Oct 14 06:17:26 localhost systemd[1]: 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041.service: Deactivated successfully. Oct 14 06:17:27 localhost neutron_sriov_agent[264974]: 2025-10-14 10:17:27.047 2 INFO neutron.agent.securitygroups_rpc [None req-d9ed0561-18b0-464c-bcbd-f3234b4677b2 10b55ef66b7942fbb887281b08c1c2c4 64a4f7cc952f4010aeadd1288d8b2d40 - - default default] Security group member updated ['82f65abf-851e-40c1-af7d-0dc1d45ee116']#033[00m Oct 14 06:17:27 localhost nova_compute[297686]: 2025-10-14 10:17:27.284 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:17:27 localhost nova_compute[297686]: 2025-10-14 10:17:27.284 2 DEBUG nova.compute.manager [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Oct 14 06:17:27 localhost nova_compute[297686]: 2025-10-14 10:17:27.285 2 DEBUG nova.compute.manager [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Oct 14 06:17:27 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' Oct 14 06:17:27 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' Oct 14 06:17:27 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' Oct 14 06:17:27 localhost 
ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' Oct 14 06:17:27 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' Oct 14 06:17:27 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' Oct 14 06:17:27 localhost nova_compute[297686]: 2025-10-14 10:17:27.970 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Acquiring lock "refresh_cache-88c4e366-b765-47a6-96bf-f7677f2ce67c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Oct 14 06:17:27 localhost nova_compute[297686]: 2025-10-14 10:17:27.970 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Acquired lock "refresh_cache-88c4e366-b765-47a6-96bf-f7677f2ce67c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Oct 14 06:17:27 localhost nova_compute[297686]: 2025-10-14 10:17:27.970 2 DEBUG nova.network.neutron [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] [instance: 88c4e366-b765-47a6-96bf-f7677f2ce67c] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Oct 14 06:17:27 localhost nova_compute[297686]: 2025-10-14 10:17:27.971 2 DEBUG nova.objects.instance [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Lazy-loading 'info_cache' on Instance uuid 88c4e366-b765-47a6-96bf-f7677f2ce67c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Oct 14 06:17:28 localhost neutron_sriov_agent[264974]: 2025-10-14 10:17:28.057 2 INFO neutron.agent.securitygroups_rpc [None req-621b59c3-21d0-4a62-9715-c2a3ba627dfc 10b55ef66b7942fbb887281b08c1c2c4 64a4f7cc952f4010aeadd1288d8b2d40 - - default default] Security group member updated ['82f65abf-851e-40c1-af7d-0dc1d45ee116']#033[00m Oct 14 06:17:28 
localhost podman[248187]: time="2025-10-14T10:17:28Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Oct 14 06:17:28 localhost podman[248187]: @ - - [14/Oct/2025:10:17:28 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 147497 "" "Go-http-client/1.1" Oct 14 06:17:28 localhost podman[248187]: @ - - [14/Oct/2025:10:17:28 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19873 "" "Go-http-client/1.1" Oct 14 06:17:28 localhost neutron_sriov_agent[264974]: 2025-10-14 10:17:28.578 2 INFO neutron.agent.securitygroups_rpc [None req-2fe7dd40-e5e5-426d-9c90-91a47b056eee 10b55ef66b7942fbb887281b08c1c2c4 64a4f7cc952f4010aeadd1288d8b2d40 - - default default] Security group member updated ['82f65abf-851e-40c1-af7d-0dc1d45ee116']#033[00m Oct 14 06:17:28 localhost nova_compute[297686]: 2025-10-14 10:17:28.620 2 DEBUG nova.network.neutron [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] [instance: 88c4e366-b765-47a6-96bf-f7677f2ce67c] Updating instance_info_cache with network_info: [{"id": "3ec9b060-f43d-4698-9c76-6062c70911d5", "address": "fa:16:3e:84:5e:e5", "network": {"id": "7d0cd696-bdd7-4e70-9512-eb0d23640314", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.46", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "41187b090f3d4818a32baa37ce8a3991", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": 
"tap3ec9b060-f4", "ovs_interfaceid": "3ec9b060-f43d-4698-9c76-6062c70911d5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Oct 14 06:17:28 localhost nova_compute[297686]: 2025-10-14 10:17:28.644 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Releasing lock "refresh_cache-88c4e366-b765-47a6-96bf-f7677f2ce67c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Oct 14 06:17:28 localhost nova_compute[297686]: 2025-10-14 10:17:28.645 2 DEBUG nova.compute.manager [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] [instance: 88c4e366-b765-47a6-96bf-f7677f2ce67c] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Oct 14 06:17:28 localhost nova_compute[297686]: 2025-10-14 10:17:28.645 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:17:28 localhost nova_compute[297686]: 2025-10-14 10:17:28.896 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:17:28 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Oct 14 06:17:28 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' Oct 14 06:17:29 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e151 e151: 6 total, 6 up, 6 in Oct 14 06:17:29 localhost ceph-mon[317114]: 
mon.np0005486733@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Oct 14 06:17:29 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Oct 14 06:17:29 localhost ceph-mon[317114]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1673699193' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Oct 14 06:17:29 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Oct 14 06:17:29 localhost ceph-mon[317114]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1673699193' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Oct 14 06:17:30 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' Oct 14 06:17:30 localhost nova_compute[297686]: 2025-10-14 10:17:30.255 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:17:30 localhost nova_compute[297686]: 2025-10-14 10:17:30.256 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:17:30 localhost nova_compute[297686]: 2025-10-14 10:17:30.311 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:17:30 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Oct 14 06:17:30 
localhost ceph-mon[317114]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/524971037' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Oct 14 06:17:30 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Oct 14 06:17:30 localhost ceph-mon[317114]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/524971037' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Oct 14 06:17:31 localhost nova_compute[297686]: 2025-10-14 10:17:31.255 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:17:31 localhost nova_compute[297686]: 2025-10-14 10:17:31.256 2 DEBUG nova.compute.manager [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Oct 14 06:17:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad. Oct 14 06:17:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec. Oct 14 06:17:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29. 
Oct 14 06:17:32 localhost podman[333714]: 2025-10-14 10:17:32.753879307 +0000 UTC m=+0.090916216 container health_status 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, tcib_managed=true, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, container_name=multipathd) Oct 14 06:17:32 localhost podman[333715]: 2025-10-14 10:17:32.803740762 +0000 UTC m=+0.140424561 container health_status 
aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm) Oct 14 06:17:32 localhost podman[333714]: 2025-10-14 10:17:32.818728136 +0000 UTC m=+0.155765055 container exec_died 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_managed=true, container_name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 
'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team) Oct 14 06:17:32 localhost systemd[1]: 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad.service: Deactivated successfully. 
Oct 14 06:17:32 localhost podman[333715]: 2025-10-14 10:17:32.839455858 +0000 UTC m=+0.176139647 container exec_died aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Oct 14 06:17:32 localhost systemd[1]: aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec.service: Deactivated successfully. 
Oct 14 06:17:32 localhost podman[333716]: 2025-10-14 10:17:32.914907265 +0000 UTC m=+0.244769493 container health_status fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251009, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, org.label-schema.vendor=CentOS, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true) Oct 14 06:17:32 localhost podman[333716]: 2025-10-14 10:17:32.930160027 +0000 UTC m=+0.260022235 container exec_died fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 
(image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=iscsid, container_name=iscsid, managed_by=edpm_ansible) Oct 14 06:17:32 localhost systemd[1]: fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29.service: Deactivated successfully. 
Oct 14 06:17:33 localhost nova_compute[297686]: 2025-10-14 10:17:33.901 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:17:34 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Oct 14 06:17:34 localhost ceph-mon[317114]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1454941788' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Oct 14 06:17:34 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Oct 14 06:17:34 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e152 e152: 6 total, 6 up, 6 in Oct 14 06:17:35 localhost nova_compute[297686]: 2025-10-14 10:17:35.255 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:17:35 localhost nova_compute[297686]: 2025-10-14 10:17:35.255 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:17:35 localhost nova_compute[297686]: 2025-10-14 10:17:35.255 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:17:35 localhost nova_compute[297686]: 2025-10-14 10:17:35.275 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Acquiring lock 
"compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Oct 14 06:17:35 localhost nova_compute[297686]: 2025-10-14 10:17:35.275 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Oct 14 06:17:35 localhost nova_compute[297686]: 2025-10-14 10:17:35.275 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Oct 14 06:17:35 localhost nova_compute[297686]: 2025-10-14 10:17:35.275 2 DEBUG nova.compute.resource_tracker [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Auditing locally available compute resources for np0005486733.localdomain (node: np0005486733.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Oct 14 06:17:35 localhost nova_compute[297686]: 2025-10-14 10:17:35.276 2 DEBUG oslo_concurrency.processutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Oct 14 06:17:35 localhost nova_compute[297686]: 2025-10-14 10:17:35.313 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:17:35 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0) 
Oct 14 06:17:35 localhost ceph-mon[317114]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/2112118603' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Oct 14 06:17:35 localhost nova_compute[297686]: 2025-10-14 10:17:35.717 2 DEBUG oslo_concurrency.processutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.442s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Oct 14 06:17:35 localhost nova_compute[297686]: 2025-10-14 10:17:35.809 2 DEBUG nova.virt.libvirt.driver [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Oct 14 06:17:35 localhost nova_compute[297686]: 2025-10-14 10:17:35.810 2 DEBUG nova.virt.libvirt.driver [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Oct 14 06:17:35 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e153 e153: 6 total, 6 up, 6 in Oct 14 06:17:35 localhost nova_compute[297686]: 2025-10-14 10:17:35.996 2 WARNING nova.virt.libvirt.driver [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Oct 14 06:17:35 localhost nova_compute[297686]: 2025-10-14 10:17:35.998 2 DEBUG nova.compute.resource_tracker [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Hypervisor/Node resource view: name=np0005486733.localdomain free_ram=11233MB free_disk=41.77419662475586GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": 
"1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Oct 14 06:17:35 localhost nova_compute[297686]: 2025-10-14 10:17:35.998 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Oct 14 06:17:35 localhost nova_compute[297686]: 2025-10-14 10:17:35.998 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Oct 14 06:17:36 localhost nova_compute[297686]: 2025-10-14 10:17:36.096 2 DEBUG nova.compute.resource_tracker [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Instance 88c4e366-b765-47a6-96bf-f7677f2ce67c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Oct 14 06:17:36 localhost nova_compute[297686]: 2025-10-14 10:17:36.097 2 DEBUG nova.compute.resource_tracker [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Oct 14 06:17:36 localhost nova_compute[297686]: 2025-10-14 10:17:36.097 2 DEBUG nova.compute.resource_tracker [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Final resource view: name=np0005486733.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Oct 14 06:17:36 localhost nova_compute[297686]: 2025-10-14 10:17:36.161 2 DEBUG oslo_concurrency.processutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Oct 14 06:17:36 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Oct 14 06:17:36 localhost ceph-mon[317114]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.108:0/648500355' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Oct 14 06:17:36 localhost nova_compute[297686]: 2025-10-14 10:17:36.550 2 DEBUG oslo_concurrency.processutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.389s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Oct 14 06:17:36 localhost nova_compute[297686]: 2025-10-14 10:17:36.555 2 DEBUG nova.compute.provider_tree [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Inventory has not changed in ProviderTree for provider: 18c24273-aca2-4f08-be57-3188d558235e update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Oct 14 06:17:36 localhost nova_compute[297686]: 2025-10-14 10:17:36.568 2 DEBUG nova.scheduler.client.report [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Inventory has not changed for provider 18c24273-aca2-4f08-be57-3188d558235e based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Oct 14 06:17:36 localhost nova_compute[297686]: 2025-10-14 10:17:36.586 2 DEBUG nova.compute.resource_tracker [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Compute_service record updated for np0005486733.localdomain:np0005486733.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Oct 14 06:17:36 localhost nova_compute[297686]: 2025-10-14 10:17:36.586 2 DEBUG oslo_concurrency.lockutils [None 
req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.588s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Oct 14 06:17:37 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e154 e154: 6 total, 6 up, 6 in Oct 14 06:17:37 localhost nova_compute[297686]: 2025-10-14 10:17:37.587 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:17:37 localhost neutron_sriov_agent[264974]: 2025-10-14 10:17:37.681 2 INFO neutron.agent.securitygroups_rpc [None req-c53ba612-2959-49b9-907f-50100eb8726b 2e7cd4bda92349ddb9cbf7425b92390f d9d0afbea79e447cb971eaabb8beabe0 - - default default] Security group member updated ['1c1b1ebb-7217-404a-a5ad-52e80abb7fe1']#033[00m Oct 14 06:17:38 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:17:38.107 271987 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Oct 14 06:17:38 localhost neutron_sriov_agent[264974]: 2025-10-14 10:17:38.499 2 INFO neutron.agent.securitygroups_rpc [None req-35368cb8-158c-4436-9630-caf5139e6dbf 2e7cd4bda92349ddb9cbf7425b92390f d9d0afbea79e447cb971eaabb8beabe0 - - default default] Security group member updated ['1c1b1ebb-7217-404a-a5ad-52e80abb7fe1']#033[00m Oct 14 06:17:38 localhost openstack_network_exporter[250374]: ERROR 10:17:38 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Oct 14 06:17:38 localhost openstack_network_exporter[250374]: ERROR 10:17:38 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Oct 14 06:17:38 localhost openstack_network_exporter[250374]: ERROR 10:17:38 appctl.go:144: Failed to get 
PID for ovn-northd: no control socket files found for ovn-northd Oct 14 06:17:38 localhost openstack_network_exporter[250374]: ERROR 10:17:38 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Oct 14 06:17:38 localhost openstack_network_exporter[250374]: Oct 14 06:17:38 localhost openstack_network_exporter[250374]: ERROR 10:17:38 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Oct 14 06:17:38 localhost openstack_network_exporter[250374]: Oct 14 06:17:38 localhost nova_compute[297686]: 2025-10-14 10:17:38.905 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:17:38 localhost neutron_sriov_agent[264974]: 2025-10-14 10:17:38.959 2 INFO neutron.agent.securitygroups_rpc [None req-365d8d66-363a-4a0b-a787-22dccf61b748 2e7cd4bda92349ddb9cbf7425b92390f d9d0afbea79e447cb971eaabb8beabe0 - - default default] Security group member updated ['1c1b1ebb-7217-404a-a5ad-52e80abb7fe1']#033[00m Oct 14 06:17:39 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e155 e155: 6 total, 6 up, 6 in Oct 14 06:17:39 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e155 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Oct 14 06:17:39 localhost neutron_sriov_agent[264974]: 2025-10-14 10:17:39.844 2 INFO neutron.agent.securitygroups_rpc [None req-25b80d9e-e1da-4936-8be0-6b0a4ba835b7 2e7cd4bda92349ddb9cbf7425b92390f d9d0afbea79e447cb971eaabb8beabe0 - - default default] Security group member updated ['1c1b1ebb-7217-404a-a5ad-52e80abb7fe1']#033[00m Oct 14 06:17:40 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Oct 14 06:17:40 localhost ceph-mon[317114]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.32:0/3215842217' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Oct 14 06:17:40 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Oct 14 06:17:40 localhost ceph-mon[317114]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3215842217' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Oct 14 06:17:40 localhost nova_compute[297686]: 2025-10-14 10:17:40.316 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:17:41 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e156 e156: 6 total, 6 up, 6 in Oct 14 06:17:43 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Oct 14 06:17:43 localhost ceph-mon[317114]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3088225111' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Oct 14 06:17:43 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Oct 14 06:17:43 localhost ceph-mon[317114]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.32:0/3088225111' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Oct 14 06:17:43 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:17:43.205 271987 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-10-14T10:17:42Z, description=, device_id=69cdd940-7f67-4d17-9238-32b2d9fba8b8, device_owner=network:router_gateway, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=6a671cb9-0c3d-4377-b39f-c64e10e37f4d, ip_allocation=immediate, mac_address=fa:16:3e:94:c9:80, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-10-14T08:36:38Z, description=, dns_domain=, id=c0145816-4627-44f2-af00-ccc9ef0436ed, ipv4_address_scope=None, ipv6_address_scope=None, is_default=False, l2_adjacency=True, mtu=1350, name=public, port_security_enabled=True, project_id=41187b090f3d4818a32baa37ce8a3991, provider:network_type=flat, provider:physical_network=datacentre, provider:segmentation_id=None, qos_policy_id=None, revision_number=2, router:external=True, shared=False, standard_attr_id=29, status=ACTIVE, subnets=['bbd40dc1-97d8-4ed3-9d4f-9b3af758b526'], tags=[], tenant_id=41187b090f3d4818a32baa37ce8a3991, updated_at=2025-10-14T08:36:44Z, vlan_transparent=None, network_id=c0145816-4627-44f2-af00-ccc9ef0436ed, port_security_enabled=False, project_id=, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2460, status=DOWN, tags=[], tenant_id=, updated_at=2025-10-14T10:17:42Z on network c0145816-4627-44f2-af00-ccc9ef0436ed#033[00m Oct 14 06:17:43 localhost systemd[1]: tmp-crun.DqvicK.mount: Deactivated successfully. 
Oct 14 06:17:43 localhost dnsmasq[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/addn_hosts - 3 addresses Oct 14 06:17:43 localhost dnsmasq-dhcp[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/host Oct 14 06:17:43 localhost dnsmasq-dhcp[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/opts Oct 14 06:17:43 localhost podman[333835]: 2025-10-14 10:17:43.440836892 +0000 UTC m=+0.072886638 container kill 27e23fba1f10c3118d508631d350793857acd97085c399f85eed727cd1eab7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0145816-4627-44f2-af00-ccc9ef0436ed, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251009, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3) Oct 14 06:17:43 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:17:43.703 271987 INFO neutron.agent.dhcp.agent [None req-c4298e00-fc1e-4161-9e47-92bdb5351a9a - - - - - -] DHCP configuration for ports {'6a671cb9-0c3d-4377-b39f-c64e10e37f4d'} is completed#033[00m Oct 14 06:17:43 localhost nova_compute[297686]: 2025-10-14 10:17:43.939 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:17:44 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Oct 14 06:17:44 localhost ceph-mon[317114]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.32:0/1269762711' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Oct 14 06:17:44 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Oct 14 06:17:44 localhost ceph-mon[317114]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1269762711' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Oct 14 06:17:44 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e156 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Oct 14 06:17:44 localhost neutron_sriov_agent[264974]: 2025-10-14 10:17:44.546 2 INFO neutron.agent.securitygroups_rpc [None req-e35969f1-ada6-4389-8aa2-3aee82b24fd0 4abcf2207306448e9582b15f96b7ebff 3ea6a4a53034479f90ec8161c8b6ce29 - - default default] Security group member updated ['f8556b9e-ea71-4aa8-9e6b-de955a348819']#033[00m Oct 14 06:17:44 localhost neutron_sriov_agent[264974]: 2025-10-14 10:17:44.797 2 INFO neutron.agent.securitygroups_rpc [None req-e35969f1-ada6-4389-8aa2-3aee82b24fd0 4abcf2207306448e9582b15f96b7ebff 3ea6a4a53034479f90ec8161c8b6ce29 - - default default] Security group member updated ['f8556b9e-ea71-4aa8-9e6b-de955a348819']#033[00m Oct 14 06:17:44 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e157 e157: 6 total, 6 up, 6 in Oct 14 06:17:45 localhost nova_compute[297686]: 2025-10-14 10:17:45.147 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:17:45 localhost nova_compute[297686]: 2025-10-14 10:17:45.318 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:17:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run 
1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f. Oct 14 06:17:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917. Oct 14 06:17:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d. Oct 14 06:17:45 localhost podman[333855]: 2025-10-14 10:17:45.760033484 +0000 UTC m=+0.090627818 container health_status 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20251009) Oct 14 06:17:45 localhost podman[333855]: 2025-10-14 10:17:45.802012864 +0000 UTC m=+0.132607128 container exec_died 
1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251009, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, container_name=ovn_controller, org.label-schema.license=GPLv2, config_id=ovn_controller) Oct 14 06:17:45 localhost podman[333857]: 2025-10-14 10:17:45.818768443 +0000 UTC m=+0.143226997 container health_status 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, 
tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_id=edpm, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}) Oct 14 06:17:45 localhost systemd[1]: 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f.service: Deactivated successfully. 
Oct 14 06:17:45 localhost podman[333857]: 2025-10-14 10:17:45.831037813 +0000 UTC m=+0.155496367 container exec_died 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3) Oct 14 06:17:45 localhost systemd[1]: 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d.service: Deactivated successfully. 
Oct 14 06:17:45 localhost podman[333856]: 2025-10-14 10:17:45.877015257 +0000 UTC m=+0.205290640 container health_status 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, config_id=edpm, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public) Oct 14 06:17:45 localhost podman[333856]: 2025-10-14 10:17:45.915018274 +0000 UTC m=+0.243293637 container exec_died 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck 
openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, io.buildah.version=1.33.7, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, managed_by=edpm_ansible, config_id=edpm, version=9.6, architecture=x86_64, com.redhat.component=ubi9-minimal-container) Oct 14 06:17:45 localhost systemd[1]: 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917.service: Deactivated successfully. 
Oct 14 06:17:46 localhost neutron_sriov_agent[264974]: 2025-10-14 10:17:46.499 2 INFO neutron.agent.securitygroups_rpc [None req-4edfd190-8677-4ed2-97b2-53949f840179 4abcf2207306448e9582b15f96b7ebff 3ea6a4a53034479f90ec8161c8b6ce29 - - default default] Security group member updated ['f8556b9e-ea71-4aa8-9e6b-de955a348819']#033[00m Oct 14 06:17:46 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:17:46.568 271987 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Oct 14 06:17:46 localhost neutron_sriov_agent[264974]: 2025-10-14 10:17:46.883 2 INFO neutron.agent.securitygroups_rpc [None req-75011e6b-f866-4f68-81e1-8496b330b115 4abcf2207306448e9582b15f96b7ebff 3ea6a4a53034479f90ec8161c8b6ce29 - - default default] Security group member updated ['f8556b9e-ea71-4aa8-9e6b-de955a348819']#033[00m Oct 14 06:17:46 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e158 e158: 6 total, 6 up, 6 in Oct 14 06:17:48 localhost nova_compute[297686]: 2025-10-14 10:17:48.972 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:17:49 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Oct 14 06:17:49 localhost ceph-mon[317114]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/345011905' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Oct 14 06:17:49 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Oct 14 06:17:49 localhost ceph-mon[317114]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.32:0/345011905' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Oct 14 06:17:49 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e158 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Oct 14 06:17:50 localhost nova_compute[297686]: 2025-10-14 10:17:50.327 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:17:54 localhost nova_compute[297686]: 2025-10-14 10:17:54.026 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:17:54 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:17:54.138 271987 INFO neutron.agent.linux.ip_lib [None req-87c1570a-7872-45ae-afcc-05bc39b2380a - - - - - -] Device tap3d0c5b2b-50 cannot be used as it has no MAC address#033[00m Oct 14 06:17:54 localhost nova_compute[297686]: 2025-10-14 10:17:54.160 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:17:54 localhost kernel: device tap3d0c5b2b-50 entered promiscuous mode Oct 14 06:17:54 localhost NetworkManager[5977]: [1760437074.1703] manager: (tap3d0c5b2b-50): new Generic device (/org/freedesktop/NetworkManager/Devices/39) Oct 14 06:17:54 localhost ovn_controller[157396]: 2025-10-14T10:17:54Z|00222|binding|INFO|Claiming lport 3d0c5b2b-505b-4bb8-b578-bc082710b7de for this chassis. 
Oct 14 06:17:54 localhost ovn_controller[157396]: 2025-10-14T10:17:54Z|00223|binding|INFO|3d0c5b2b-505b-4bb8-b578-bc082710b7de: Claiming unknown Oct 14 06:17:54 localhost nova_compute[297686]: 2025-10-14 10:17:54.171 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:17:54 localhost systemd-udevd[333931]: Network interface NamePolicy= disabled on kernel command line. Oct 14 06:17:54 localhost ovn_metadata_agent[163050]: 2025-10-14 10:17:54.182 163055 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005486733.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpe1893814-19d7-5b97-af05-19e2aafa5382-6d9021b1-3bc9-49bd-b9ec-a47d4f16dce3', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6d9021b1-3bc9-49bd-b9ec-a47d4f16dce3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3ea6a4a53034479f90ec8161c8b6ce29', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=64c232ac-673e-49c9-bfcb-164d769b5098, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=3d0c5b2b-505b-4bb8-b578-bc082710b7de) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Oct 14 06:17:54 localhost ovn_metadata_agent[163050]: 2025-10-14 10:17:54.184 163055 INFO 
neutron.agent.ovn.metadata.agent [-] Port 3d0c5b2b-505b-4bb8-b578-bc082710b7de in datapath 6d9021b1-3bc9-49bd-b9ec-a47d4f16dce3 bound to our chassis#033[00m Oct 14 06:17:54 localhost ovn_metadata_agent[163050]: 2025-10-14 10:17:54.187 163055 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 6d9021b1-3bc9-49bd-b9ec-a47d4f16dce3 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Oct 14 06:17:54 localhost ovn_metadata_agent[163050]: 2025-10-14 10:17:54.188 163159 DEBUG oslo.privsep.daemon [-] privsep: reply[9c126979-c79c-4b6f-a022-02c10219a1b0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Oct 14 06:17:54 localhost ovn_controller[157396]: 2025-10-14T10:17:54Z|00224|binding|INFO|Setting lport 3d0c5b2b-505b-4bb8-b578-bc082710b7de ovn-installed in OVS Oct 14 06:17:54 localhost ovn_controller[157396]: 2025-10-14T10:17:54Z|00225|binding|INFO|Setting lport 3d0c5b2b-505b-4bb8-b578-bc082710b7de up in Southbound Oct 14 06:17:54 localhost nova_compute[297686]: 2025-10-14 10:17:54.202 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:17:54 localhost nova_compute[297686]: 2025-10-14 10:17:54.204 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:17:54 localhost journal[237477]: ethtool ioctl error on tap3d0c5b2b-50: No such device Oct 14 06:17:54 localhost journal[237477]: ethtool ioctl error on tap3d0c5b2b-50: No such device Oct 14 06:17:54 localhost journal[237477]: ethtool ioctl error on tap3d0c5b2b-50: No such device Oct 14 06:17:54 localhost journal[237477]: ethtool ioctl error on tap3d0c5b2b-50: No such device Oct 14 06:17:54 localhost journal[237477]: ethtool ioctl error on 
tap3d0c5b2b-50: No such device Oct 14 06:17:54 localhost journal[237477]: ethtool ioctl error on tap3d0c5b2b-50: No such device Oct 14 06:17:54 localhost journal[237477]: ethtool ioctl error on tap3d0c5b2b-50: No such device Oct 14 06:17:54 localhost journal[237477]: ethtool ioctl error on tap3d0c5b2b-50: No such device Oct 14 06:17:54 localhost nova_compute[297686]: 2025-10-14 10:17:54.247 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:17:54 localhost nova_compute[297686]: 2025-10-14 10:17:54.274 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:17:54 localhost nova_compute[297686]: 2025-10-14 10:17:54.320 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:17:54 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e158 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Oct 14 06:17:55 localhost podman[334002]: Oct 14 06:17:55 localhost podman[334002]: 2025-10-14 10:17:55.151376129 +0000 UTC m=+0.096798478 container create 977b88429e73197cb9b0e37fd83e934bd2fa5ae7f56ddc604a99ecc490cdf9a3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6d9021b1-3bc9-49bd-b9ec-a47d4f16dce3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251009, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, maintainer=OpenStack Kubernetes Operator team) Oct 14 06:17:55 localhost systemd[1]: Started libpod-conmon-977b88429e73197cb9b0e37fd83e934bd2fa5ae7f56ddc604a99ecc490cdf9a3.scope. 
Oct 14 06:17:55 localhost podman[334002]: 2025-10-14 10:17:55.107730268 +0000 UTC m=+0.053152687 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Oct 14 06:17:55 localhost systemd[1]: Started libcrun container.
Oct 14 06:17:55 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0e659abb4e915bc5df7ed410f2c71d353239cd91c2fc558775eb95e1870f0cc3/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct 14 06:17:55 localhost podman[334002]: 2025-10-14 10:17:55.232359698 +0000 UTC m=+0.177782047 container init 977b88429e73197cb9b0e37fd83e934bd2fa5ae7f56ddc604a99ecc490cdf9a3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6d9021b1-3bc9-49bd-b9ec-a47d4f16dce3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Oct 14 06:17:55 localhost podman[334002]: 2025-10-14 10:17:55.242581785 +0000 UTC m=+0.188004124 container start 977b88429e73197cb9b0e37fd83e934bd2fa5ae7f56ddc604a99ecc490cdf9a3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6d9021b1-3bc9-49bd-b9ec-a47d4f16dce3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3)
Oct 14 06:17:55 localhost dnsmasq[334021]: started, version 2.85 cachesize 150
Oct 14 06:17:55 localhost dnsmasq[334021]: DNS service limited to local subnets
Oct 14 06:17:55 localhost dnsmasq[334021]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Oct 14 06:17:55 localhost dnsmasq[334021]: warning: no upstream servers configured
Oct 14 06:17:55 localhost dnsmasq-dhcp[334021]: DHCP, static leases only on 10.100.0.0, lease time 1d
Oct 14 06:17:55 localhost dnsmasq[334021]: read /var/lib/neutron/dhcp/6d9021b1-3bc9-49bd-b9ec-a47d4f16dce3/addn_hosts - 0 addresses
Oct 14 06:17:55 localhost dnsmasq-dhcp[334021]: read /var/lib/neutron/dhcp/6d9021b1-3bc9-49bd-b9ec-a47d4f16dce3/host
Oct 14 06:17:55 localhost dnsmasq-dhcp[334021]: read /var/lib/neutron/dhcp/6d9021b1-3bc9-49bd-b9ec-a47d4f16dce3/opts
Oct 14 06:17:55 localhost nova_compute[297686]: 2025-10-14 10:17:55.329 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 06:17:55 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:17:55.397 271987 INFO neutron.agent.dhcp.agent [None req-9ea20c8d-e1e3-4e1d-924c-66789c4336da - - - - - -] DHCP configuration for ports {'2bc241ba-7e22-4894-8fa4-9215ba0cceba'} is completed
Oct 14 06:17:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041.
Oct 14 06:17:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857.
Oct 14 06:17:56 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:17:56.681 271987 INFO neutron.agent.linux.ip_lib [None req-3dd21daa-ff8e-49bb-a76b-e05315ba0de0 - - - - - -] Device tapcc4f5a4e-f4 cannot be used as it has no MAC address
Oct 14 06:17:56 localhost podman[334024]: 2025-10-14 10:17:56.707720944 +0000 UTC m=+0.130791392 container health_status 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct 14 06:17:56 localhost podman[334024]: 2025-10-14 10:17:56.715429743 +0000 UTC m=+0.138500191 container exec_died 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible)
Oct 14 06:17:56 localhost nova_compute[297686]: 2025-10-14 10:17:56.722 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 06:17:56 localhost kernel: device tapcc4f5a4e-f4 entered promiscuous mode
Oct 14 06:17:56 localhost NetworkManager[5977]: [1760437076.7284] manager: (tapcc4f5a4e-f4): new Generic device (/org/freedesktop/NetworkManager/Devices/40)
Oct 14 06:17:56 localhost ovn_controller[157396]: 2025-10-14T10:17:56Z|00226|binding|INFO|Claiming lport cc4f5a4e-f42a-454a-a1f3-afee9341cc14 for this chassis.
Oct 14 06:17:56 localhost ovn_controller[157396]: 2025-10-14T10:17:56Z|00227|binding|INFO|cc4f5a4e-f42a-454a-a1f3-afee9341cc14: Claiming unknown
Oct 14 06:17:56 localhost nova_compute[297686]: 2025-10-14 10:17:56.729 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 06:17:56 localhost systemd[1]: 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041.service: Deactivated successfully.
Oct 14 06:17:56 localhost ovn_metadata_agent[163050]: 2025-10-14 10:17:56.740 163055 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005486733.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpe1893814-19d7-5b97-af05-19e2aafa5382-8693f660-bfcd-4c6f-b962-a77807ed6dd7', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8693f660-bfcd-4c6f-b962-a77807ed6dd7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6b8394de28c74b2e99420d1b07ba3637', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bc86b47d-d1c1-44b9-8e54-1b030e53219f, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=cc4f5a4e-f42a-454a-a1f3-afee9341cc14) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 14 06:17:56 localhost ovn_metadata_agent[163050]: 2025-10-14 10:17:56.741 163055 INFO neutron.agent.ovn.metadata.agent [-] Port cc4f5a4e-f42a-454a-a1f3-afee9341cc14 in datapath 8693f660-bfcd-4c6f-b962-a77807ed6dd7 bound to our chassis
Oct 14 06:17:56 localhost ovn_metadata_agent[163050]: 2025-10-14 10:17:56.743 163055 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 8693f660-bfcd-4c6f-b962-a77807ed6dd7 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Oct 14 06:17:56 localhost ovn_metadata_agent[163050]: 2025-10-14 10:17:56.745 163159 DEBUG oslo.privsep.daemon [-] privsep: reply[b256a3d1-d23f-4a53-b83c-d83affc446ef]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 14 06:17:56 localhost journal[237477]: ethtool ioctl error on tapcc4f5a4e-f4: No such device
Oct 14 06:17:56 localhost journal[237477]: ethtool ioctl error on tapcc4f5a4e-f4: No such device
Oct 14 06:17:56 localhost nova_compute[297686]: 2025-10-14 10:17:56.764 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 06:17:56 localhost ovn_controller[157396]: 2025-10-14T10:17:56Z|00228|binding|INFO|Setting lport cc4f5a4e-f42a-454a-a1f3-afee9341cc14 ovn-installed in OVS
Oct 14 06:17:56 localhost ovn_controller[157396]: 2025-10-14T10:17:56Z|00229|binding|INFO|Setting lport cc4f5a4e-f42a-454a-a1f3-afee9341cc14 up in Southbound
Oct 14 06:17:56 localhost nova_compute[297686]: 2025-10-14 10:17:56.767 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 06:17:56 localhost journal[237477]: ethtool ioctl error on tapcc4f5a4e-f4: No such device
Oct 14 06:17:56 localhost journal[237477]: ethtool ioctl error on tapcc4f5a4e-f4: No such device
Oct 14 06:17:56 localhost journal[237477]: ethtool ioctl error on tapcc4f5a4e-f4: No such device
Oct 14 06:17:56 localhost journal[237477]: ethtool ioctl error on tapcc4f5a4e-f4: No such device
Oct 14 06:17:56 localhost journal[237477]: ethtool ioctl error on tapcc4f5a4e-f4: No such device
Oct 14 06:17:56 localhost journal[237477]: ethtool ioctl error on tapcc4f5a4e-f4: No such device
Oct 14 06:17:56 localhost podman[334025]: 2025-10-14 10:17:56.799043263 +0000 UTC m=+0.216998003 container health_status 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_id=ovn_metadata_agent, org.label-schema.build-date=20251009, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct 14 06:17:56 localhost neutron_sriov_agent[264974]: 2025-10-14 10:17:56.798 2 INFO neutron.agent.securitygroups_rpc [None req-e740ef6c-0d76-4410-beb6-5f76db4534e4 72fde6d55cf34982a256eb50b9f6d56d 6b8394de28c74b2e99420d1b07ba3637 - - default default] Security group member updated ['c032904b-0f74-49ea-92f0-78e8713215a7']
Oct 14 06:17:56 localhost nova_compute[297686]: 2025-10-14 10:17:56.800 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 06:17:56 localhost nova_compute[297686]: 2025-10-14 10:17:56.825 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 06:17:56 localhost podman[334025]: 2025-10-14 10:17:56.832954783 +0000 UTC m=+0.250909533 container exec_died 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20251009, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 14 06:17:56 localhost systemd[1]: 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857.service: Deactivated successfully.
Oct 14 06:17:57 localhost podman[334139]:
Oct 14 06:17:57 localhost podman[334139]: 2025-10-14 10:17:57.679473192 +0000 UTC m=+0.091340240 container create 115876dc02824948f282d79ecb14a77c8b9e6a5f7d092f79257e2812f072f00a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8693f660-bfcd-4c6f-b962-a77807ed6dd7, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251009, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct 14 06:17:57 localhost systemd[1]: Started libpod-conmon-115876dc02824948f282d79ecb14a77c8b9e6a5f7d092f79257e2812f072f00a.scope.
Oct 14 06:17:57 localhost systemd[1]: Started libcrun container.
Oct 14 06:17:57 localhost podman[334139]: 2025-10-14 10:17:57.637929275 +0000 UTC m=+0.049796373 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Oct 14 06:17:57 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/868a34a18bf647a9d7ade25aa7e1d12e742bd535611d4f27ee4885fa47b65a70/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct 14 06:17:57 localhost podman[334139]: 2025-10-14 10:17:57.746423806 +0000 UTC m=+0.158290874 container init 115876dc02824948f282d79ecb14a77c8b9e6a5f7d092f79257e2812f072f00a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8693f660-bfcd-4c6f-b962-a77807ed6dd7, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0)
Oct 14 06:17:57 localhost podman[334139]: 2025-10-14 10:17:57.755323891 +0000 UTC m=+0.167190959 container start 115876dc02824948f282d79ecb14a77c8b9e6a5f7d092f79257e2812f072f00a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8693f660-bfcd-4c6f-b962-a77807ed6dd7, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251009, tcib_managed=true, org.label-schema.schema-version=1.0)
Oct 14 06:17:57 localhost dnsmasq[334158]: started, version 2.85 cachesize 150
Oct 14 06:17:57 localhost dnsmasq[334158]: DNS service limited to local subnets
Oct 14 06:17:57 localhost dnsmasq[334158]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Oct 14 06:17:57 localhost dnsmasq[334158]: warning: no upstream servers configured
Oct 14 06:17:57 localhost dnsmasq-dhcp[334158]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Oct 14 06:17:57 localhost dnsmasq[334158]: read /var/lib/neutron/dhcp/8693f660-bfcd-4c6f-b962-a77807ed6dd7/addn_hosts - 0 addresses
Oct 14 06:17:57 localhost dnsmasq-dhcp[334158]: read /var/lib/neutron/dhcp/8693f660-bfcd-4c6f-b962-a77807ed6dd7/host
Oct 14 06:17:57 localhost dnsmasq-dhcp[334158]: read /var/lib/neutron/dhcp/8693f660-bfcd-4c6f-b962-a77807ed6dd7/opts
Oct 14 06:17:57 localhost ovn_metadata_agent[163050]: 2025-10-14 10:17:57.786 163055 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 14 06:17:57 localhost ovn_metadata_agent[163050]: 2025-10-14 10:17:57.786 163055 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 14 06:17:57 localhost ovn_metadata_agent[163050]: 2025-10-14 10:17:57.787 163055 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 14 06:17:57 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:17:57.806 271987 INFO neutron.agent.dhcp.agent [None req-3dd21daa-ff8e-49bb-a76b-e05315ba0de0 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-10-14T10:17:56Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=4d17818e-0d8d-48d5-a71f-9b372f4b98ae, ip_allocation=immediate, mac_address=fa:16:3e:45:71:b0, name=tempest-RoutersIpV6Test-1611212768, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-10-14T10:17:54Z, description=, dns_domain=, id=8693f660-bfcd-4c6f-b962-a77807ed6dd7, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersIpV6Test-795572567, port_security_enabled=True, project_id=6b8394de28c74b2e99420d1b07ba3637, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=1355, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2496, status=ACTIVE, subnets=['b7df9130-9195-4a1b-9e62-9dd8dae985c9'], tags=[], tenant_id=6b8394de28c74b2e99420d1b07ba3637, updated_at=2025-10-14T10:17:55Z, vlan_transparent=None, network_id=8693f660-bfcd-4c6f-b962-a77807ed6dd7, port_security_enabled=True, project_id=6b8394de28c74b2e99420d1b07ba3637, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['c032904b-0f74-49ea-92f0-78e8713215a7'], standard_attr_id=2505, status=DOWN, tags=[], tenant_id=6b8394de28c74b2e99420d1b07ba3637, updated_at=2025-10-14T10:17:56Z on network 8693f660-bfcd-4c6f-b962-a77807ed6dd7
Oct 14 06:17:57 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:17:57.874 271987 INFO neutron.agent.dhcp.agent [None req-9afa3f1b-8686-4cea-92f6-00c7ccd18948 - - - - - -] DHCP configuration for ports {'0b74f6cd-829d-4014-9c16-6f4e476ec104'} is completed
Oct 14 06:17:57 localhost dnsmasq[334158]: read /var/lib/neutron/dhcp/8693f660-bfcd-4c6f-b962-a77807ed6dd7/addn_hosts - 1 addresses
Oct 14 06:17:57 localhost dnsmasq-dhcp[334158]: read /var/lib/neutron/dhcp/8693f660-bfcd-4c6f-b962-a77807ed6dd7/host
Oct 14 06:17:57 localhost dnsmasq-dhcp[334158]: read /var/lib/neutron/dhcp/8693f660-bfcd-4c6f-b962-a77807ed6dd7/opts
Oct 14 06:17:57 localhost podman[334177]: 2025-10-14 10:17:57.990674721 +0000 UTC m=+0.056821061 container kill 115876dc02824948f282d79ecb14a77c8b9e6a5f7d092f79257e2812f072f00a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8693f660-bfcd-4c6f-b962-a77807ed6dd7, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Oct 14 06:17:58 localhost dnsmasq[334021]: exiting on receipt of SIGTERM
Oct 14 06:17:58 localhost podman[334214]: 2025-10-14 10:17:58.197411524 +0000 UTC m=+0.063900860 container kill 977b88429e73197cb9b0e37fd83e934bd2fa5ae7f56ddc604a99ecc490cdf9a3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6d9021b1-3bc9-49bd-b9ec-a47d4f16dce3, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Oct 14 06:17:58 localhost systemd[1]: libpod-977b88429e73197cb9b0e37fd83e934bd2fa5ae7f56ddc604a99ecc490cdf9a3.scope: Deactivated successfully.
Oct 14 06:17:58 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:17:58.221 271987 INFO neutron.agent.dhcp.agent [None req-7f7e3cfc-cfac-4216-a8b3-0aa2e509f7f4 - - - - - -] DHCP configuration for ports {'4d17818e-0d8d-48d5-a71f-9b372f4b98ae'} is completed
Oct 14 06:17:58 localhost podman[334230]: 2025-10-14 10:17:58.262888162 +0000 UTC m=+0.043989014 container died 977b88429e73197cb9b0e37fd83e934bd2fa5ae7f56ddc604a99ecc490cdf9a3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6d9021b1-3bc9-49bd-b9ec-a47d4f16dce3, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3)
Oct 14 06:17:58 localhost podman[334230]: 2025-10-14 10:17:58.30606621 +0000 UTC m=+0.087167022 container remove 977b88429e73197cb9b0e37fd83e934bd2fa5ae7f56ddc604a99ecc490cdf9a3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6d9021b1-3bc9-49bd-b9ec-a47d4f16dce3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, org.label-schema.vendor=CentOS)
Oct 14 06:17:58 localhost podman[248187]: time="2025-10-14T10:17:58Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 14 06:17:58 localhost nova_compute[297686]: 2025-10-14 10:17:58.325 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 06:17:58 localhost ovn_controller[157396]: 2025-10-14T10:17:58Z|00230|binding|INFO|Releasing lport 3d0c5b2b-505b-4bb8-b578-bc082710b7de from this chassis (sb_readonly=0)
Oct 14 06:17:58 localhost ovn_controller[157396]: 2025-10-14T10:17:58Z|00231|binding|INFO|Setting lport 3d0c5b2b-505b-4bb8-b578-bc082710b7de down in Southbound
Oct 14 06:17:58 localhost podman[248187]: @ - - [14/Oct/2025:10:17:58 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 149314 "" "Go-http-client/1.1"
Oct 14 06:17:58 localhost ovn_metadata_agent[163050]: 2025-10-14 10:17:58.333 163055 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005486733.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpe1893814-19d7-5b97-af05-19e2aafa5382-6d9021b1-3bc9-49bd-b9ec-a47d4f16dce3', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6d9021b1-3bc9-49bd-b9ec-a47d4f16dce3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3ea6a4a53034479f90ec8161c8b6ce29', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005486733.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=64c232ac-673e-49c9-bfcb-164d769b5098, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=3d0c5b2b-505b-4bb8-b578-bc082710b7de) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 14 06:17:58 localhost kernel: device tap3d0c5b2b-50 left promiscuous mode
Oct 14 06:17:58 localhost ovn_metadata_agent[163050]: 2025-10-14 10:17:58.336 163055 INFO neutron.agent.ovn.metadata.agent [-] Port 3d0c5b2b-505b-4bb8-b578-bc082710b7de in datapath 6d9021b1-3bc9-49bd-b9ec-a47d4f16dce3 unbound from our chassis
Oct 14 06:17:58 localhost nova_compute[297686]: 2025-10-14 10:17:58.338 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 06:17:58 localhost ovn_metadata_agent[163050]: 2025-10-14 10:17:58.341 163055 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 6d9021b1-3bc9-49bd-b9ec-a47d4f16dce3, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Oct 14 06:17:58 localhost ovn_metadata_agent[163050]: 2025-10-14 10:17:58.342 163159 DEBUG oslo.privsep.daemon [-] privsep: reply[c9c3956a-e787-4632-84ff-c1b8560aadfe]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 14 06:17:58 localhost nova_compute[297686]: 2025-10-14 10:17:58.357 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 06:17:58 localhost systemd[1]: libpod-conmon-977b88429e73197cb9b0e37fd83e934bd2fa5ae7f56ddc604a99ecc490cdf9a3.scope: Deactivated successfully.
Oct 14 06:17:58 localhost podman[248187]: @ - - [14/Oct/2025:10:17:58 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 20343 "" "Go-http-client/1.1"
Oct 14 06:17:58 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:17:58.581 271987 INFO neutron.agent.dhcp.agent [None req-4763be8e-1b63-44f4-962c-5667ead06495 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Oct 14 06:17:58 localhost systemd[1]: var-lib-containers-storage-overlay-0e659abb4e915bc5df7ed410f2c71d353239cd91c2fc558775eb95e1870f0cc3-merged.mount: Deactivated successfully.
Oct 14 06:17:58 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-977b88429e73197cb9b0e37fd83e934bd2fa5ae7f56ddc604a99ecc490cdf9a3-userdata-shm.mount: Deactivated successfully.
Oct 14 06:17:58 localhost systemd[1]: run-netns-qdhcp\x2d6d9021b1\x2d3bc9\x2d49bd\x2db9ec\x2da47d4f16dce3.mount: Deactivated successfully.
Oct 14 06:17:58 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e159 e159: 6 total, 6 up, 6 in
Oct 14 06:17:59 localhost nova_compute[297686]: 2025-10-14 10:17:59.064 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 06:17:59 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e159 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct 14 06:18:00 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:18:00.157 271987 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-10-14T10:17:56Z, description=, device_id=88f25e70-c3a9-4d55-8027-32197f74506b, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=4d17818e-0d8d-48d5-a71f-9b372f4b98ae, ip_allocation=immediate, mac_address=fa:16:3e:45:71:b0, name=tempest-RoutersIpV6Test-1611212768, network_id=8693f660-bfcd-4c6f-b962-a77807ed6dd7, port_security_enabled=True, project_id=6b8394de28c74b2e99420d1b07ba3637, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=3, security_groups=['c032904b-0f74-49ea-92f0-78e8713215a7'], standard_attr_id=2505, status=ACTIVE, tags=[], tenant_id=6b8394de28c74b2e99420d1b07ba3637, updated_at=2025-10-14T10:17:58Z on network 8693f660-bfcd-4c6f-b962-a77807ed6dd7
Oct 14 06:18:00 localhost nova_compute[297686]: 2025-10-14 10:18:00.332 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 06:18:00 localhost podman[334274]: 2025-10-14 10:18:00.379943254 +0000 UTC m=+0.061554058 container kill 115876dc02824948f282d79ecb14a77c8b9e6a5f7d092f79257e2812f072f00a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8693f660-bfcd-4c6f-b962-a77807ed6dd7, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0)
Oct 14 06:18:00 localhost dnsmasq[334158]: read /var/lib/neutron/dhcp/8693f660-bfcd-4c6f-b962-a77807ed6dd7/addn_hosts - 1 addresses
Oct 14 06:18:00 localhost dnsmasq-dhcp[334158]: read /var/lib/neutron/dhcp/8693f660-bfcd-4c6f-b962-a77807ed6dd7/host
Oct 14 06:18:00 localhost dnsmasq-dhcp[334158]: read /var/lib/neutron/dhcp/8693f660-bfcd-4c6f-b962-a77807ed6dd7/opts
Oct 14 06:18:00 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:18:00.509 271987 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Oct 14 06:18:00 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:18:00.580 271987 INFO neutron.agent.dhcp.agent [None req-cddc4b59-738f-432d-829e-2540de2ffb48 - - - - - -] DHCP configuration for ports {'4d17818e-0d8d-48d5-a71f-9b372f4b98ae'} is completed
Oct 14 06:18:00 localhost ovn_controller[157396]: 2025-10-14T10:18:00Z|00232|binding|INFO|Releasing lport 25c6586a-239c-451b-aac2-e0a3ee5c3145 from this chassis (sb_readonly=0)
Oct 14 06:18:00 localhost nova_compute[297686]: 2025-10-14 10:18:00.767 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 06:18:01 localhost neutron_sriov_agent[264974]: 2025-10-14 10:18:01.378 2 INFO neutron.agent.securitygroups_rpc [None req-b53395c4-e365-498d-b8c4-80ffc4a91847 72fde6d55cf34982a256eb50b9f6d56d 6b8394de28c74b2e99420d1b07ba3637 - - default default] Security group member updated ['c032904b-0f74-49ea-92f0-78e8713215a7']
Oct 14 06:18:01 localhost podman[334314]: 2025-10-14 10:18:01.553136802 +0000 UTC m=+0.054390717 container kill 115876dc02824948f282d79ecb14a77c8b9e6a5f7d092f79257e2812f072f00a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8693f660-bfcd-4c6f-b962-a77807ed6dd7, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251009, tcib_managed=true)
Oct 14 06:18:01 localhost dnsmasq[334158]: read /var/lib/neutron/dhcp/8693f660-bfcd-4c6f-b962-a77807ed6dd7/addn_hosts - 0 addresses
Oct 14 06:18:01 localhost dnsmasq-dhcp[334158]: read /var/lib/neutron/dhcp/8693f660-bfcd-4c6f-b962-a77807ed6dd7/host
Oct 14 06:18:01 localhost dnsmasq-dhcp[334158]: read /var/lib/neutron/dhcp/8693f660-bfcd-4c6f-b962-a77807ed6dd7/opts
Oct 14 06:18:01 localhost nova_compute[297686]: 2025-10-14 10:18:01.748 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 06:18:01 localhost kernel: device tapcc4f5a4e-f4 left promiscuous mode
Oct 14 06:18:01 localhost ovn_controller[157396]: 2025-10-14T10:18:01Z|00233|binding|INFO|Releasing lport cc4f5a4e-f42a-454a-a1f3-afee9341cc14 from this chassis (sb_readonly=0)
Oct 14 06:18:01 localhost ovn_controller[157396]: 2025-10-14T10:18:01Z|00234|binding|INFO|Setting lport cc4f5a4e-f42a-454a-a1f3-afee9341cc14 down in Southbound
Oct 14 06:18:01 localhost ovn_metadata_agent[163050]: 2025-10-14 10:18:01.762 163055 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005486733.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpe1893814-19d7-5b97-af05-19e2aafa5382-8693f660-bfcd-4c6f-b962-a77807ed6dd7', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8693f660-bfcd-4c6f-b962-a77807ed6dd7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6b8394de28c74b2e99420d1b07ba3637', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005486733.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bc86b47d-d1c1-44b9-8e54-1b030e53219f, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[],
logical_port=cc4f5a4e-f42a-454a-a1f3-afee9341cc14) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Oct 14 06:18:01 localhost ovn_metadata_agent[163050]: 2025-10-14 10:18:01.764 163055 INFO neutron.agent.ovn.metadata.agent [-] Port cc4f5a4e-f42a-454a-a1f3-afee9341cc14 in datapath 8693f660-bfcd-4c6f-b962-a77807ed6dd7 unbound from our chassis#033[00m Oct 14 06:18:01 localhost ovn_metadata_agent[163050]: 2025-10-14 10:18:01.766 163055 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 8693f660-bfcd-4c6f-b962-a77807ed6dd7 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Oct 14 06:18:01 localhost ovn_metadata_agent[163050]: 2025-10-14 10:18:01.767 163159 DEBUG oslo.privsep.daemon [-] privsep: reply[c248e332-e82a-4e9e-b844-8cf39865b25c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Oct 14 06:18:01 localhost nova_compute[297686]: 2025-10-14 10:18:01.775 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:18:01 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:18:01.779 271987 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Oct 14 06:18:02 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e160 e160: 6 total, 6 up, 6 in Oct 14 06:18:03 localhost dnsmasq[334158]: exiting on receipt of SIGTERM Oct 14 06:18:03 localhost systemd[1]: libpod-115876dc02824948f282d79ecb14a77c8b9e6a5f7d092f79257e2812f072f00a.scope: Deactivated successfully. 
Oct 14 06:18:03 localhost podman[334354]: 2025-10-14 10:18:03.064649987 +0000 UTC m=+0.050449404 container kill 115876dc02824948f282d79ecb14a77c8b9e6a5f7d092f79257e2812f072f00a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8693f660-bfcd-4c6f-b962-a77807ed6dd7, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team) Oct 14 06:18:03 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e161 e161: 6 total, 6 up, 6 in Oct 14 06:18:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad. Oct 14 06:18:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec. Oct 14 06:18:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29. Oct 14 06:18:03 localhost podman[334375]: 2025-10-14 10:18:03.16998651 +0000 UTC m=+0.068728781 container died 115876dc02824948f282d79ecb14a77c8b9e6a5f7d092f79257e2812f072f00a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8693f660-bfcd-4c6f-b962-a77807ed6dd7, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2) Oct 14 06:18:03 localhost systemd[1]: tmp-crun.VuKsed.mount: Deactivated successfully. 
Oct 14 06:18:03 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-115876dc02824948f282d79ecb14a77c8b9e6a5f7d092f79257e2812f072f00a-userdata-shm.mount: Deactivated successfully. Oct 14 06:18:03 localhost podman[334374]: 2025-10-14 10:18:03.251942858 +0000 UTC m=+0.155814418 container health_status 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, 
org.label-schema.build-date=20251009, tcib_managed=true) Oct 14 06:18:03 localhost podman[334374]: 2025-10-14 10:18:03.264079723 +0000 UTC m=+0.167951263 container exec_died 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=multipathd, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, org.label-schema.build-date=20251009, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}) Oct 14 06:18:03 localhost podman[334375]: 2025-10-14 10:18:03.293804714 +0000 UTC m=+0.192546975 container remove 
115876dc02824948f282d79ecb14a77c8b9e6a5f7d092f79257e2812f072f00a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8693f660-bfcd-4c6f-b962-a77807ed6dd7, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2) Oct 14 06:18:03 localhost systemd[1]: 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad.service: Deactivated successfully. Oct 14 06:18:03 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:18:03.345 271987 INFO neutron.agent.dhcp.agent [None req-4ba0ead9-5fac-4d82-a23e-5f39b185a34b - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Oct 14 06:18:03 localhost podman[334377]: 2025-10-14 10:18:03.352840323 +0000 UTC m=+0.241714988 container health_status fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, io.buildah.version=1.41.3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, container_name=iscsid, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS) Oct 14 06:18:03 localhost podman[334377]: 2025-10-14 10:18:03.362935096 +0000 UTC m=+0.251809691 container exec_died fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=iscsid, org.label-schema.build-date=20251009, maintainer=OpenStack Kubernetes Operator team, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, tcib_managed=true) Oct 14 06:18:03 localhost systemd[1]: fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29.service: Deactivated successfully. Oct 14 06:18:03 localhost systemd[1]: libpod-conmon-115876dc02824948f282d79ecb14a77c8b9e6a5f7d092f79257e2812f072f00a.scope: Deactivated successfully. Oct 14 06:18:03 localhost podman[334376]: 2025-10-14 10:18:03.453776539 +0000 UTC m=+0.352543080 container health_status aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': 
'/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Oct 14 06:18:03 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:18:03.463 271987 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Oct 14 06:18:03 localhost podman[334376]: 2025-10-14 10:18:03.466019398 +0000 UTC m=+0.364785939 container exec_died aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) 
Oct 14 06:18:03 localhost systemd[1]: aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec.service: Deactivated successfully. Oct 14 06:18:03 localhost ovn_controller[157396]: 2025-10-14T10:18:03Z|00235|binding|INFO|Releasing lport 25c6586a-239c-451b-aac2-e0a3ee5c3145 from this chassis (sb_readonly=0) Oct 14 06:18:03 localhost ceph-osd[32440]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #44. Immutable memtables: 1. Oct 14 06:18:03 localhost nova_compute[297686]: 2025-10-14 10:18:03.647 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:18:04 localhost systemd[1]: var-lib-containers-storage-overlay-868a34a18bf647a9d7ade25aa7e1d12e742bd535611d4f27ee4885fa47b65a70-merged.mount: Deactivated successfully. Oct 14 06:18:04 localhost systemd[1]: run-netns-qdhcp\x2d8693f660\x2dbfcd\x2d4c6f\x2db962\x2da77807ed6dd7.mount: Deactivated successfully. Oct 14 06:18:04 localhost nova_compute[297686]: 2025-10-14 10:18:04.103 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:18:04 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e162 e162: 6 total, 6 up, 6 in Oct 14 06:18:04 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e162 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Oct 14 06:18:05 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e163 e163: 6 total, 6 up, 6 in Oct 14 06:18:05 localhost nova_compute[297686]: 2025-10-14 10:18:05.333 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:18:06 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:18:06.808 271987 INFO neutron.agent.linux.ip_lib [None req-706dc263-5342-4069-987e-3a21e5eec18e - - - - - -] Device tap17aa81f4-8f 
cannot be used as it has no MAC address#033[00m Oct 14 06:18:06 localhost nova_compute[297686]: 2025-10-14 10:18:06.833 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:18:06 localhost kernel: device tap17aa81f4-8f entered promiscuous mode Oct 14 06:18:06 localhost NetworkManager[5977]: [1760437086.8403] manager: (tap17aa81f4-8f): new Generic device (/org/freedesktop/NetworkManager/Devices/41) Oct 14 06:18:06 localhost ovn_controller[157396]: 2025-10-14T10:18:06Z|00236|binding|INFO|Claiming lport 17aa81f4-8f6f-4da3-a21b-e02dd7e1660b for this chassis. Oct 14 06:18:06 localhost ovn_controller[157396]: 2025-10-14T10:18:06Z|00237|binding|INFO|17aa81f4-8f6f-4da3-a21b-e02dd7e1660b: Claiming unknown Oct 14 06:18:06 localhost nova_compute[297686]: 2025-10-14 10:18:06.842 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:18:06 localhost systemd-udevd[334465]: Network interface NamePolicy= disabled on kernel command line. 
Oct 14 06:18:06 localhost ovn_metadata_agent[163050]: 2025-10-14 10:18:06.852 163055 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005486733.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.1/28', 'neutron:device_id': 'dhcpe1893814-19d7-5b97-af05-19e2aafa5382-c77058f6-072f-458e-97c1-048312622bcb', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c77058f6-072f-458e-97c1-048312622bcb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3c4e628039e94868b41efbbdc1307f19', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a04ee7af-52c4-47bb-887e-c1b26c4f4bac, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=17aa81f4-8f6f-4da3-a21b-e02dd7e1660b) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Oct 14 06:18:06 localhost ovn_metadata_agent[163050]: 2025-10-14 10:18:06.854 163055 INFO neutron.agent.ovn.metadata.agent [-] Port 17aa81f4-8f6f-4da3-a21b-e02dd7e1660b in datapath c77058f6-072f-458e-97c1-048312622bcb bound to our chassis#033[00m Oct 14 06:18:06 localhost nova_compute[297686]: 2025-10-14 10:18:06.854 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:18:06 localhost ovn_metadata_agent[163050]: 2025-10-14 10:18:06.856 163055 DEBUG 
neutron.agent.ovn.metadata.agent [-] There is no metadata port for network c77058f6-072f-458e-97c1-048312622bcb or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Oct 14 06:18:06 localhost ovn_controller[157396]: 2025-10-14T10:18:06Z|00238|binding|INFO|Setting lport 17aa81f4-8f6f-4da3-a21b-e02dd7e1660b ovn-installed in OVS Oct 14 06:18:06 localhost ovn_controller[157396]: 2025-10-14T10:18:06Z|00239|binding|INFO|Setting lport 17aa81f4-8f6f-4da3-a21b-e02dd7e1660b up in Southbound Oct 14 06:18:06 localhost ovn_metadata_agent[163050]: 2025-10-14 10:18:06.857 163159 DEBUG oslo.privsep.daemon [-] privsep: reply[470995fe-42ae-449d-8f25-9051b4929de3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Oct 14 06:18:06 localhost nova_compute[297686]: 2025-10-14 10:18:06.858 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:18:06 localhost nova_compute[297686]: 2025-10-14 10:18:06.881 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:18:06 localhost nova_compute[297686]: 2025-10-14 10:18:06.917 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:18:06 localhost nova_compute[297686]: 2025-10-14 10:18:06.950 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:18:07 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e164 e164: 6 total, 6 up, 6 in Oct 14 06:18:07 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Oct 14 06:18:07 localhost ceph-mon[317114]: 
log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/984465380' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Oct 14 06:18:07 localhost podman[334518]: Oct 14 06:18:07 localhost podman[334518]: 2025-10-14 10:18:07.810722486 +0000 UTC m=+0.068744600 container create 39b28ecb286bda72c820d2fdd68872f5b4c0635fd2e668ba516e36aaf2d81f48 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c77058f6-072f-458e-97c1-048312622bcb, org.label-schema.build-date=20251009, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3) Oct 14 06:18:07 localhost systemd[1]: Started libpod-conmon-39b28ecb286bda72c820d2fdd68872f5b4c0635fd2e668ba516e36aaf2d81f48.scope. Oct 14 06:18:07 localhost systemd[1]: Started libcrun container. 
Oct 14 06:18:07 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/42fe08056e04ac912b42d03dbc2b05e0b93e66ffa133f99c0b8fe054ce467246/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Oct 14 06:18:07 localhost podman[334518]: 2025-10-14 10:18:07.878367201 +0000 UTC m=+0.136389315 container init 39b28ecb286bda72c820d2fdd68872f5b4c0635fd2e668ba516e36aaf2d81f48 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c77058f6-072f-458e-97c1-048312622bcb, org.label-schema.license=GPLv2, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3) Oct 14 06:18:07 localhost podman[334518]: 2025-10-14 10:18:07.784655279 +0000 UTC m=+0.042677413 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Oct 14 06:18:07 localhost podman[334518]: 2025-10-14 10:18:07.887901387 +0000 UTC m=+0.145923501 container start 39b28ecb286bda72c820d2fdd68872f5b4c0635fd2e668ba516e36aaf2d81f48 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c77058f6-072f-458e-97c1-048312622bcb, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0) Oct 14 06:18:07 localhost dnsmasq[334536]: started, version 2.85 cachesize 150 Oct 14 06:18:07 localhost dnsmasq[334536]: DNS service limited to local subnets Oct 14 06:18:07 localhost dnsmasq[334536]: compile time options: IPv6 GNU-getopt DBus no-UBus 
no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Oct 14 06:18:07 localhost dnsmasq[334536]: warning: no upstream servers configured Oct 14 06:18:07 localhost dnsmasq-dhcp[334536]: DHCP, static leases only on 10.100.0.0, lease time 1d Oct 14 06:18:07 localhost dnsmasq[334536]: read /var/lib/neutron/dhcp/c77058f6-072f-458e-97c1-048312622bcb/addn_hosts - 0 addresses Oct 14 06:18:07 localhost dnsmasq-dhcp[334536]: read /var/lib/neutron/dhcp/c77058f6-072f-458e-97c1-048312622bcb/host Oct 14 06:18:07 localhost dnsmasq-dhcp[334536]: read /var/lib/neutron/dhcp/c77058f6-072f-458e-97c1-048312622bcb/opts Oct 14 06:18:08 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:18:08.086 271987 INFO neutron.agent.dhcp.agent [None req-99caaade-a049-425f-9e8f-b576616e9c05 - - - - - -] DHCP configuration for ports {'83334cfd-1d55-4d7f-878e-55d4c0092e44'} is completed#033[00m Oct 14 06:18:08 localhost nova_compute[297686]: 2025-10-14 10:18:08.194 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:18:08 localhost kernel: device tap17aa81f4-8f left promiscuous mode Oct 14 06:18:08 localhost ovn_controller[157396]: 2025-10-14T10:18:08Z|00240|binding|INFO|Releasing lport 17aa81f4-8f6f-4da3-a21b-e02dd7e1660b from this chassis (sb_readonly=0) Oct 14 06:18:08 localhost ovn_controller[157396]: 2025-10-14T10:18:08Z|00241|binding|INFO|Setting lport 17aa81f4-8f6f-4da3-a21b-e02dd7e1660b down in Southbound Oct 14 06:18:08 localhost ovn_metadata_agent[163050]: 2025-10-14 10:18:08.197 163055 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port d54f4b13-8c08-4fd3-9e38-46e0e41cc62e with type ""#033[00m Oct 14 06:18:08 localhost ovn_metadata_agent[163050]: 2025-10-14 10:18:08.199 163055 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, 
old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005486733.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.1/28', 'neutron:device_id': 'dhcpe1893814-19d7-5b97-af05-19e2aafa5382-c77058f6-072f-458e-97c1-048312622bcb', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c77058f6-072f-458e-97c1-048312622bcb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3c4e628039e94868b41efbbdc1307f19', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a04ee7af-52c4-47bb-887e-c1b26c4f4bac, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=17aa81f4-8f6f-4da3-a21b-e02dd7e1660b) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Oct 14 06:18:08 localhost ovn_metadata_agent[163050]: 2025-10-14 10:18:08.201 163055 INFO neutron.agent.ovn.metadata.agent [-] Port 17aa81f4-8f6f-4da3-a21b-e02dd7e1660b in datapath c77058f6-072f-458e-97c1-048312622bcb unbound from our chassis#033[00m Oct 14 06:18:08 localhost ovn_metadata_agent[163050]: 2025-10-14 10:18:08.204 163055 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c77058f6-072f-458e-97c1-048312622bcb, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Oct 14 06:18:08 localhost ovn_metadata_agent[163050]: 2025-10-14 10:18:08.205 163159 DEBUG oslo.privsep.daemon [-] privsep: reply[c791150f-3399-4047-9751-79a5b0114e14]: (4, False) _call_back 
/usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Oct 14 06:18:08 localhost nova_compute[297686]: 2025-10-14 10:18:08.216 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:18:08 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e165 e165: 6 total, 6 up, 6 in Oct 14 06:18:08 localhost ceph-mon[317114]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #34. Immutable memtables: 0. Oct 14 06:18:08 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:18:08.338032) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0 Oct 14 06:18:08 localhost ceph-mon[317114]: rocksdb: [db/flush_job.cc:856] [default] [JOB 17] Flushing memtable with next log file: 34 Oct 14 06:18:08 localhost ceph-mon[317114]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760437088338276, "job": 17, "event": "flush_started", "num_memtables": 1, "num_entries": 2613, "num_deletes": 260, "total_data_size": 4816909, "memory_usage": 5005792, "flush_reason": "Manual Compaction"} Oct 14 06:18:08 localhost ceph-mon[317114]: rocksdb: [db/flush_job.cc:885] [default] [JOB 17] Level-0 flush table #35: started Oct 14 06:18:08 localhost ceph-mon[317114]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760437088352118, "cf_name": "default", "job": 17, "event": "table_file_creation", "file_number": 35, "file_size": 3151582, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 21863, "largest_seqno": 24471, "table_properties": {"data_size": 3141600, "index_size": 6359, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2565, "raw_key_size": 22130, "raw_average_key_size": 21, "raw_value_size": 3121100, 
"raw_average_value_size": 3059, "num_data_blocks": 269, "num_entries": 1020, "num_filter_entries": 1020, "num_deletions": 260, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1760436937, "oldest_key_time": 1760436937, "file_creation_time": 1760437088, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "c006d4a7-7ae8-4cd3-a1b5-95f5bf3c427c", "db_session_id": "ZS6W3A0Q266OBGAU6ARP", "orig_file_number": 35, "seqno_to_time_mapping": "N/A"}} Oct 14 06:18:08 localhost ceph-mon[317114]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 17] Flush lasted 14135 microseconds, and 7153 cpu microseconds. Oct 14 06:18:08 localhost ceph-mon[317114]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. 
Oct 14 06:18:08 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:18:08.352174) [db/flush_job.cc:967] [default] [JOB 17] Level-0 flush table #35: 3151582 bytes OK Oct 14 06:18:08 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:18:08.352199) [db/memtable_list.cc:519] [default] Level-0 commit table #35 started Oct 14 06:18:08 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:18:08.354514) [db/memtable_list.cc:722] [default] Level-0 commit table #35: memtable #1 done Oct 14 06:18:08 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:18:08.354534) EVENT_LOG_v1 {"time_micros": 1760437088354528, "job": 17, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0} Oct 14 06:18:08 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:18:08.354556) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25 Oct 14 06:18:08 localhost ceph-mon[317114]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 17] Try to delete WAL files size 4805138, prev total WAL file size 4805138, number of live WAL files 2. Oct 14 06:18:08 localhost ceph-mon[317114]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005486733/store.db/000031.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Oct 14 06:18:08 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:18:08.355638) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003132353530' seq:72057594037927935, type:22 .. 
'7061786F73003132383032' seq:0, type:0; will stop at (end) Oct 14 06:18:08 localhost ceph-mon[317114]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 18] Compacting 1@0 + 1@6 files to L6, score -1.00 Oct 14 06:18:08 localhost ceph-mon[317114]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 17 Base level 0, inputs: [35(3077KB)], [33(14MB)] Oct 14 06:18:08 localhost ceph-mon[317114]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760437088355743, "job": 18, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [35], "files_L6": [33], "score": -1, "input_data_size": 18258016, "oldest_snapshot_seqno": -1} Oct 14 06:18:08 localhost ceph-mon[317114]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 18] Generated table #36: 13034 keys, 17093690 bytes, temperature: kUnknown Oct 14 06:18:08 localhost ceph-mon[317114]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760437088445933, "cf_name": "default", "job": 18, "event": "table_file_creation", "file_number": 36, "file_size": 17093690, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 17020211, "index_size": 39778, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 32645, "raw_key_size": 350969, "raw_average_key_size": 26, "raw_value_size": 16799010, "raw_average_value_size": 1288, "num_data_blocks": 1486, "num_entries": 13034, "num_filter_entries": 13034, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; 
strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1760436394, "oldest_key_time": 0, "file_creation_time": 1760437088, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "c006d4a7-7ae8-4cd3-a1b5-95f5bf3c427c", "db_session_id": "ZS6W3A0Q266OBGAU6ARP", "orig_file_number": 36, "seqno_to_time_mapping": "N/A"}} Oct 14 06:18:08 localhost ceph-mon[317114]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. Oct 14 06:18:08 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:18:08.446341) [db/compaction/compaction_job.cc:1663] [default] [JOB 18] Compacted 1@0 + 1@6 files to L6 => 17093690 bytes Oct 14 06:18:08 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:18:08.448092) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 202.1 rd, 189.2 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.0, 14.4 +0.0 blob) out(16.3 +0.0 blob), read-write-amplify(11.2) write-amplify(5.4) OK, records in: 13576, records dropped: 542 output_compression: NoCompression Oct 14 06:18:08 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:18:08.448122) EVENT_LOG_v1 {"time_micros": 1760437088448110, "job": 18, "event": "compaction_finished", "compaction_time_micros": 90328, "compaction_time_cpu_micros": 48848, "output_level": 6, "num_output_files": 1, "total_output_size": 17093690, "num_input_records": 13576, "num_output_records": 13034, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]} Oct 14 06:18:08 localhost ceph-mon[317114]: rocksdb: [file/delete_scheduler.cc:74] Deleted file 
/var/lib/ceph/mon/ceph-np0005486733/store.db/000035.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Oct 14 06:18:08 localhost ceph-mon[317114]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760437088448640, "job": 18, "event": "table_file_deletion", "file_number": 35} Oct 14 06:18:08 localhost ceph-mon[317114]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005486733/store.db/000033.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Oct 14 06:18:08 localhost ceph-mon[317114]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760437088451165, "job": 18, "event": "table_file_deletion", "file_number": 33} Oct 14 06:18:08 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:18:08.355568) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Oct 14 06:18:08 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:18:08.451227) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Oct 14 06:18:08 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:18:08.451234) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Oct 14 06:18:08 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:18:08.451238) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Oct 14 06:18:08 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:18:08.451241) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Oct 14 06:18:08 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:18:08.451243) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Oct 14 06:18:08 localhost openstack_network_exporter[250374]: ERROR 10:18:08 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Oct 14 06:18:08 
localhost openstack_network_exporter[250374]: ERROR 10:18:08 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Oct 14 06:18:08 localhost openstack_network_exporter[250374]: ERROR 10:18:08 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Oct 14 06:18:08 localhost openstack_network_exporter[250374]: Oct 14 06:18:08 localhost openstack_network_exporter[250374]: ERROR 10:18:08 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Oct 14 06:18:08 localhost openstack_network_exporter[250374]: Oct 14 06:18:08 localhost openstack_network_exporter[250374]: ERROR 10:18:08 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Oct 14 06:18:08 localhost dnsmasq[334536]: read /var/lib/neutron/dhcp/c77058f6-072f-458e-97c1-048312622bcb/addn_hosts - 0 addresses Oct 14 06:18:08 localhost dnsmasq-dhcp[334536]: read /var/lib/neutron/dhcp/c77058f6-072f-458e-97c1-048312622bcb/host Oct 14 06:18:08 localhost dnsmasq-dhcp[334536]: read /var/lib/neutron/dhcp/c77058f6-072f-458e-97c1-048312622bcb/opts Oct 14 06:18:08 localhost podman[334554]: 2025-10-14 10:18:08.855792495 +0000 UTC m=+0.040907787 container kill 39b28ecb286bda72c820d2fdd68872f5b4c0635fd2e668ba516e36aaf2d81f48 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c77058f6-072f-458e-97c1-048312622bcb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true) Oct 14 06:18:08 localhost systemd[1]: tmp-crun.wUqlam.mount: Deactivated successfully. 
Oct 14 06:18:08 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:18:08.880 271987 ERROR neutron.agent.dhcp.agent [None req-eae8345b-a49f-4d4a-8c78-cb9ee31c3bb3 - - - - - -] Unable to reload_allocations dhcp for c77058f6-072f-458e-97c1-048312622bcb.: neutron.privileged.agent.linux.ip_lib.NetworkInterfaceNotFound: Network interface tap17aa81f4-8f not found in namespace qdhcp-c77058f6-072f-458e-97c1-048312622bcb. Oct 14 06:18:08 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:18:08.880 271987 ERROR neutron.agent.dhcp.agent Traceback (most recent call last): Oct 14 06:18:08 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:18:08.880 271987 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/dhcp/agent.py", line 264, in _call_driver Oct 14 06:18:08 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:18:08.880 271987 ERROR neutron.agent.dhcp.agent rv = getattr(driver, action)(**action_kwargs) Oct 14 06:18:08 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:18:08.880 271987 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 673, in reload_allocations Oct 14 06:18:08 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:18:08.880 271987 ERROR neutron.agent.dhcp.agent self.device_manager.update(self.network, self.interface_name) Oct 14 06:18:08 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:18:08.880 271987 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1899, in update Oct 14 06:18:08 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:18:08.880 271987 ERROR neutron.agent.dhcp.agent self._set_default_route(network, device_name) Oct 14 06:18:08 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:18:08.880 271987 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1610, in _set_default_route Oct 14 06:18:08 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:18:08.880 
271987 ERROR neutron.agent.dhcp.agent self._set_default_route_ip_version(network, device_name, Oct 14 06:18:08 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:18:08.880 271987 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1539, in _set_default_route_ip_version Oct 14 06:18:08 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:18:08.880 271987 ERROR neutron.agent.dhcp.agent gateway = device.route.get_gateway(ip_version=ip_version) Oct 14 06:18:08 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:18:08.880 271987 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 671, in get_gateway Oct 14 06:18:08 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:18:08.880 271987 ERROR neutron.agent.dhcp.agent routes = self.list_routes(ip_version, scope=scope, table=table) Oct 14 06:18:08 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:18:08.880 271987 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 656, in list_routes Oct 14 06:18:08 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:18:08.880 271987 ERROR neutron.agent.dhcp.agent return list_ip_routes(self._parent.namespace, ip_version, scope=scope, Oct 14 06:18:08 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:18:08.880 271987 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 1611, in list_ip_routes Oct 14 06:18:08 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:18:08.880 271987 ERROR neutron.agent.dhcp.agent routes = privileged.list_ip_routes(namespace, ip_version, device=device, Oct 14 06:18:08 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:18:08.880 271987 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 333, in wrapped_f Oct 14 06:18:08 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:18:08.880 271987 ERROR 
neutron.agent.dhcp.agent return self(f, *args, **kw) Oct 14 06:18:08 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:18:08.880 271987 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 423, in __call__ Oct 14 06:18:08 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:18:08.880 271987 ERROR neutron.agent.dhcp.agent do = self.iter(retry_state=retry_state) Oct 14 06:18:08 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:18:08.880 271987 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 360, in iter Oct 14 06:18:08 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:18:08.880 271987 ERROR neutron.agent.dhcp.agent return fut.result() Oct 14 06:18:08 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:18:08.880 271987 ERROR neutron.agent.dhcp.agent File "/usr/lib64/python3.9/concurrent/futures/_base.py", line 439, in result Oct 14 06:18:08 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:18:08.880 271987 ERROR neutron.agent.dhcp.agent return self.__get_result() Oct 14 06:18:08 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:18:08.880 271987 ERROR neutron.agent.dhcp.agent File "/usr/lib64/python3.9/concurrent/futures/_base.py", line 391, in __get_result Oct 14 06:18:08 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:18:08.880 271987 ERROR neutron.agent.dhcp.agent raise self._exception Oct 14 06:18:08 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:18:08.880 271987 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 426, in __call__ Oct 14 06:18:08 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:18:08.880 271987 ERROR neutron.agent.dhcp.agent result = fn(*args, **kwargs) Oct 14 06:18:08 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:18:08.880 271987 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/oslo_privsep/priv_context.py", line 271, in _wrap Oct 14 06:18:08 localhost 
neutron_dhcp_agent[271983]: 2025-10-14 10:18:08.880 271987 ERROR neutron.agent.dhcp.agent return self.channel.remote_call(name, args, kwargs, Oct 14 06:18:08 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:18:08.880 271987 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/oslo_privsep/daemon.py", line 215, in remote_call Oct 14 06:18:08 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:18:08.880 271987 ERROR neutron.agent.dhcp.agent raise exc_type(*result[2]) Oct 14 06:18:08 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:18:08.880 271987 ERROR neutron.agent.dhcp.agent neutron.privileged.agent.linux.ip_lib.NetworkInterfaceNotFound: Network interface tap17aa81f4-8f not found in namespace qdhcp-c77058f6-072f-458e-97c1-048312622bcb. Oct 14 06:18:08 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:18:08.880 271987 ERROR neutron.agent.dhcp.agent #033[00m Oct 14 06:18:08 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:18:08.882 271987 INFO neutron.agent.dhcp.agent [None req-a330ed12-4c69-43ca-bb15-7f3378a0c737 - - - - - -] Synchronizing state#033[00m Oct 14 06:18:08 localhost ovn_controller[157396]: 2025-10-14T10:18:08Z|00242|binding|INFO|Releasing lport 25c6586a-239c-451b-aac2-e0a3ee5c3145 from this chassis (sb_readonly=0) Oct 14 06:18:08 localhost nova_compute[297686]: 2025-10-14 10:18:08.989 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:18:09 localhost nova_compute[297686]: 2025-10-14 10:18:09.153 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:18:09 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:18:09.155 271987 INFO neutron.agent.dhcp.agent [None req-54337c0a-4c7c-4346-8011-05a0439551a6 - - - - - -] All active networks have been fetched through RPC.#033[00m Oct 14 06:18:09 localhost podman[334584]: 2025-10-14 10:18:09.303790501 +0000 
UTC m=+0.065799439 container kill 39b28ecb286bda72c820d2fdd68872f5b4c0635fd2e668ba516e36aaf2d81f48 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c77058f6-072f-458e-97c1-048312622bcb, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image) Oct 14 06:18:09 localhost dnsmasq[334536]: exiting on receipt of SIGTERM Oct 14 06:18:09 localhost systemd[1]: libpod-39b28ecb286bda72c820d2fdd68872f5b4c0635fd2e668ba516e36aaf2d81f48.scope: Deactivated successfully. Oct 14 06:18:09 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e166 e166: 6 total, 6 up, 6 in Oct 14 06:18:09 localhost podman[334600]: 2025-10-14 10:18:09.377915277 +0000 UTC m=+0.053918331 container died 39b28ecb286bda72c820d2fdd68872f5b4c0635fd2e668ba516e36aaf2d81f48 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c77058f6-072f-458e-97c1-048312622bcb, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009) Oct 14 06:18:09 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e166 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Oct 14 06:18:09 localhost podman[334600]: 2025-10-14 10:18:09.431092074 +0000 UTC m=+0.107095078 container remove 39b28ecb286bda72c820d2fdd68872f5b4c0635fd2e668ba516e36aaf2d81f48 
(image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c77058f6-072f-458e-97c1-048312622bcb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, org.label-schema.build-date=20251009) Oct 14 06:18:09 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:18:09.465 271987 INFO neutron.agent.dhcp.agent [None req-14bc63ee-10da-41c7-bcc2-438264b13d3e - - - - - -] Synchronizing state complete#033[00m Oct 14 06:18:09 localhost systemd[1]: libpod-conmon-39b28ecb286bda72c820d2fdd68872f5b4c0635fd2e668ba516e36aaf2d81f48.scope: Deactivated successfully. Oct 14 06:18:09 localhost systemd[1]: tmp-crun.4N5SV3.mount: Deactivated successfully. Oct 14 06:18:09 localhost systemd[1]: var-lib-containers-storage-overlay-42fe08056e04ac912b42d03dbc2b05e0b93e66ffa133f99c0b8fe054ce467246-merged.mount: Deactivated successfully. Oct 14 06:18:09 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-39b28ecb286bda72c820d2fdd68872f5b4c0635fd2e668ba516e36aaf2d81f48-userdata-shm.mount: Deactivated successfully. Oct 14 06:18:09 localhost systemd[1]: run-netns-qdhcp\x2dc77058f6\x2d072f\x2d458e\x2d97c1\x2d048312622bcb.mount: Deactivated successfully. 
Oct 14 06:18:09 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e167 e167: 6 total, 6 up, 6 in Oct 14 06:18:10 localhost nova_compute[297686]: 2025-10-14 10:18:10.334 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:18:10 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:18:10.821 271987 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-10-14T10:18:10Z, description=, device_id=f28fe744-82cd-4626-b944-e98d3314fee9, device_owner=network:router_gateway, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=d6b2b8ab-36de-461a-b476-f7067b58c7d2, ip_allocation=immediate, mac_address=fa:16:3e:70:ec:34, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-10-14T08:36:38Z, description=, dns_domain=, id=c0145816-4627-44f2-af00-ccc9ef0436ed, ipv4_address_scope=None, ipv6_address_scope=None, is_default=False, l2_adjacency=True, mtu=1350, name=public, port_security_enabled=True, project_id=41187b090f3d4818a32baa37ce8a3991, provider:network_type=flat, provider:physical_network=datacentre, provider:segmentation_id=None, qos_policy_id=None, revision_number=2, router:external=True, shared=False, standard_attr_id=29, status=ACTIVE, subnets=['bbd40dc1-97d8-4ed3-9d4f-9b3af758b526'], tags=[], tenant_id=41187b090f3d4818a32baa37ce8a3991, updated_at=2025-10-14T08:36:44Z, vlan_transparent=None, network_id=c0145816-4627-44f2-af00-ccc9ef0436ed, port_security_enabled=False, project_id=, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2558, status=DOWN, tags=[], tenant_id=, updated_at=2025-10-14T10:18:10Z on network 
c0145816-4627-44f2-af00-ccc9ef0436ed#033[00m Oct 14 06:18:11 localhost podman[334643]: 2025-10-14 10:18:11.15573843 +0000 UTC m=+0.072125034 container kill 27e23fba1f10c3118d508631d350793857acd97085c399f85eed727cd1eab7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0145816-4627-44f2-af00-ccc9ef0436ed, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251009) Oct 14 06:18:11 localhost dnsmasq[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/addn_hosts - 4 addresses Oct 14 06:18:11 localhost dnsmasq-dhcp[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/host Oct 14 06:18:11 localhost dnsmasq-dhcp[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/opts Oct 14 06:18:11 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e168 e168: 6 total, 6 up, 6 in Oct 14 06:18:11 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:18:11.552 271987 INFO neutron.agent.dhcp.agent [None req-dedbe9f5-56dc-4004-bcc9-57c0f79151d4 - - - - - -] DHCP configuration for ports {'d6b2b8ab-36de-461a-b476-f7067b58c7d2'} is completed#033[00m Oct 14 06:18:13 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e169 e169: 6 total, 6 up, 6 in Oct 14 06:18:14 localhost nova_compute[297686]: 2025-10-14 10:18:14.197 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:18:14 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e169 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Oct 14 06:18:14 localhost systemd[1]: tmp-crun.ntMyIT.mount: 
Deactivated successfully. Oct 14 06:18:14 localhost podman[334682]: 2025-10-14 10:18:14.400071206 +0000 UTC m=+0.062879559 container kill 27e23fba1f10c3118d508631d350793857acd97085c399f85eed727cd1eab7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0145816-4627-44f2-af00-ccc9ef0436ed, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0) Oct 14 06:18:14 localhost dnsmasq[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/addn_hosts - 3 addresses Oct 14 06:18:14 localhost dnsmasq-dhcp[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/host Oct 14 06:18:14 localhost dnsmasq-dhcp[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/opts Oct 14 06:18:14 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e170 e170: 6 total, 6 up, 6 in Oct 14 06:18:14 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e171 e171: 6 total, 6 up, 6 in Oct 14 06:18:15 localhost nova_compute[297686]: 2025-10-14 10:18:15.336 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:18:15 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e172 e172: 6 total, 6 up, 6 in Oct 14 06:18:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f. Oct 14 06:18:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917. 
Oct 14 06:18:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d. Oct 14 06:18:16 localhost podman[334706]: 2025-10-14 10:18:16.743788707 +0000 UTC m=+0.077570874 container health_status 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, 
org.label-schema.build-date=20251009) Oct 14 06:18:16 localhost podman[334706]: 2025-10-14 10:18:16.752068004 +0000 UTC m=+0.085850141 container exec_died 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0) Oct 14 06:18:16 localhost systemd[1]: 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d.service: Deactivated 
successfully. Oct 14 06:18:16 localhost podman[334704]: 2025-10-14 10:18:16.809349028 +0000 UTC m=+0.148261083 container health_status 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible) Oct 14 06:18:16 localhost podman[334705]: 2025-10-14 10:18:16.839521052 +0000 UTC m=+0.173972839 container health_status 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, vcs-type=git, name=ubi9-minimal, io.buildah.version=1.33.7, container_name=openstack_network_exporter, io.k8s.description=The 
Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, release=1755695350, build-date=2025-08-20T13:12:41, architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, distribution-scope=public, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.) Oct 14 06:18:16 localhost podman[334704]: 2025-10-14 10:18:16.878079667 +0000 UTC m=+0.216991662 container exec_died 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0) Oct 14 06:18:16 localhost systemd[1]: 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f.service: Deactivated successfully. 
Oct 14 06:18:16 localhost podman[334705]: 2025-10-14 10:18:16.929329284 +0000 UTC m=+0.263781081 container exec_died 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, release=1755695350, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc.) Oct 14 06:18:16 localhost systemd[1]: 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917.service: Deactivated successfully. Oct 14 06:18:17 localhost ovn_metadata_agent[163050]: 2025-10-14 10:18:17.727 163055 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=17, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'b6:6b:50', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '6a:59:81:01:bc:8b'}, ipsec=False) old=SB_Global(nb_cfg=16) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Oct 14 06:18:17 localhost nova_compute[297686]: 2025-10-14 10:18:17.727 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:18:17 localhost ovn_metadata_agent[163050]: 2025-10-14 10:18:17.729 163055 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m Oct 14 06:18:17 localhost 
ovn_metadata_agent[163050]: 2025-10-14 10:18:17.730 163055 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=9e4b0f79-1220-4c7d-a18d-fa1a88dab362, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '17'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Oct 14 06:18:18 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:18:18.082 271987 INFO neutron.agent.linux.ip_lib [None req-58f3a08b-55c9-48ef-a871-38a57f48b55c - - - - - -] Device tap3ab19c1a-ab cannot be used as it has no MAC address#033[00m Oct 14 06:18:18 localhost nova_compute[297686]: 2025-10-14 10:18:18.106 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:18:18 localhost kernel: device tap3ab19c1a-ab entered promiscuous mode Oct 14 06:18:18 localhost NetworkManager[5977]: [1760437098.1151] manager: (tap3ab19c1a-ab): new Generic device (/org/freedesktop/NetworkManager/Devices/42) Oct 14 06:18:18 localhost nova_compute[297686]: 2025-10-14 10:18:18.114 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:18:18 localhost ovn_controller[157396]: 2025-10-14T10:18:18Z|00243|binding|INFO|Claiming lport 3ab19c1a-ab07-40d8-8ae7-57d77d174c53 for this chassis. Oct 14 06:18:18 localhost ovn_controller[157396]: 2025-10-14T10:18:18Z|00244|binding|INFO|3ab19c1a-ab07-40d8-8ae7-57d77d174c53: Claiming unknown Oct 14 06:18:18 localhost systemd-udevd[334778]: Network interface NamePolicy= disabled on kernel command line. 
Oct 14 06:18:18 localhost ovn_metadata_agent[163050]: 2025-10-14 10:18:18.127 163055 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005486733.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'dhcpe1893814-19d7-5b97-af05-19e2aafa5382-2273b28d-0a33-44ab-9d86-9a84d474205a', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2273b28d-0a33-44ab-9d86-9a84d474205a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3c4e628039e94868b41efbbdc1307f19', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6c618862-c69f-414d-a3fd-b33e88645964, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=3ab19c1a-ab07-40d8-8ae7-57d77d174c53) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Oct 14 06:18:18 localhost ovn_metadata_agent[163050]: 2025-10-14 10:18:18.130 163055 INFO neutron.agent.ovn.metadata.agent [-] Port 3ab19c1a-ab07-40d8-8ae7-57d77d174c53 in datapath 2273b28d-0a33-44ab-9d86-9a84d474205a bound to our chassis#033[00m Oct 14 06:18:18 localhost ovn_metadata_agent[163050]: 2025-10-14 10:18:18.132 163055 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 2273b28d-0a33-44ab-9d86-9a84d474205a or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params 
/usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Oct 14 06:18:18 localhost ovn_metadata_agent[163050]: 2025-10-14 10:18:18.133 163159 DEBUG oslo.privsep.daemon [-] privsep: reply[ff2e23ad-7dab-4e1a-89d8-76be53c935b9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Oct 14 06:18:18 localhost journal[237477]: ethtool ioctl error on tap3ab19c1a-ab: No such device Oct 14 06:18:18 localhost ovn_controller[157396]: 2025-10-14T10:18:18Z|00245|binding|INFO|Setting lport 3ab19c1a-ab07-40d8-8ae7-57d77d174c53 ovn-installed in OVS Oct 14 06:18:18 localhost ovn_controller[157396]: 2025-10-14T10:18:18Z|00246|binding|INFO|Setting lport 3ab19c1a-ab07-40d8-8ae7-57d77d174c53 up in Southbound Oct 14 06:18:18 localhost nova_compute[297686]: 2025-10-14 10:18:18.150 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:18:18 localhost journal[237477]: ethtool ioctl error on tap3ab19c1a-ab: No such device Oct 14 06:18:18 localhost journal[237477]: ethtool ioctl error on tap3ab19c1a-ab: No such device Oct 14 06:18:18 localhost journal[237477]: ethtool ioctl error on tap3ab19c1a-ab: No such device Oct 14 06:18:18 localhost journal[237477]: ethtool ioctl error on tap3ab19c1a-ab: No such device Oct 14 06:18:18 localhost journal[237477]: ethtool ioctl error on tap3ab19c1a-ab: No such device Oct 14 06:18:18 localhost journal[237477]: ethtool ioctl error on tap3ab19c1a-ab: No such device Oct 14 06:18:18 localhost journal[237477]: ethtool ioctl error on tap3ab19c1a-ab: No such device Oct 14 06:18:18 localhost nova_compute[297686]: 2025-10-14 10:18:18.183 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:18:18 localhost nova_compute[297686]: 2025-10-14 10:18:18.211 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:18:18 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e173 e173: 6 total, 6 up, 6 in Oct 14 06:18:18 localhost ovn_metadata_agent[163050]: 2025-10-14 10:18:18.742 163055 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port 10622303-99bf-491f-9596-cc740001f8df with type ""#033[00m Oct 14 06:18:18 localhost ovn_metadata_agent[163050]: 2025-10-14 10:18:18.743 163055 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005486733.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'dhcpe1893814-19d7-5b97-af05-19e2aafa5382-2273b28d-0a33-44ab-9d86-9a84d474205a', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2273b28d-0a33-44ab-9d86-9a84d474205a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3c4e628039e94868b41efbbdc1307f19', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6c618862-c69f-414d-a3fd-b33e88645964, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=3ab19c1a-ab07-40d8-8ae7-57d77d174c53) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Oct 14 06:18:18 localhost ovn_controller[157396]: 2025-10-14T10:18:18Z|00247|binding|INFO|Removing iface tap3ab19c1a-ab ovn-installed in OVS Oct 14 06:18:18 localhost ovn_controller[157396]: 
2025-10-14T10:18:18Z|00248|binding|INFO|Removing lport 3ab19c1a-ab07-40d8-8ae7-57d77d174c53 ovn-installed in OVS Oct 14 06:18:18 localhost ovn_metadata_agent[163050]: 2025-10-14 10:18:18.746 163055 INFO neutron.agent.ovn.metadata.agent [-] Port 3ab19c1a-ab07-40d8-8ae7-57d77d174c53 in datapath 2273b28d-0a33-44ab-9d86-9a84d474205a unbound from our chassis#033[00m Oct 14 06:18:18 localhost ovn_metadata_agent[163050]: 2025-10-14 10:18:18.747 163055 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 2273b28d-0a33-44ab-9d86-9a84d474205a or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Oct 14 06:18:18 localhost ovn_metadata_agent[163050]: 2025-10-14 10:18:18.748 163159 DEBUG oslo.privsep.daemon [-] privsep: reply[1a88936f-eaaa-474f-b508-153bb8ff3dab]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Oct 14 06:18:18 localhost nova_compute[297686]: 2025-10-14 10:18:18.752 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:18:18 localhost ovn_controller[157396]: 2025-10-14T10:18:18Z|00249|binding|INFO|Releasing lport 25c6586a-239c-451b-aac2-e0a3ee5c3145 from this chassis (sb_readonly=0) Oct 14 06:18:18 localhost nova_compute[297686]: 2025-10-14 10:18:18.922 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:18:19 localhost podman[334849]: Oct 14 06:18:19 localhost podman[334849]: 2025-10-14 10:18:19.022997761 +0000 UTC m=+0.080132423 container create d9ac14beeab714c28d30b2d79525b15cf306d1beb41dd82e0c8f8236eec1c48f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2273b28d-0a33-44ab-9d86-9a84d474205a, 
org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.license=GPLv2, org.label-schema.build-date=20251009, io.buildah.version=1.41.3) Oct 14 06:18:19 localhost systemd[1]: Started libpod-conmon-d9ac14beeab714c28d30b2d79525b15cf306d1beb41dd82e0c8f8236eec1c48f.scope. Oct 14 06:18:19 localhost systemd[1]: Started libcrun container. Oct 14 06:18:19 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/06c631136e7471032fe184a90436b7a733b1a1bc9b4515b97a4214c66a89dc18/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Oct 14 06:18:19 localhost podman[334849]: 2025-10-14 10:18:19.077431037 +0000 UTC m=+0.134565699 container init d9ac14beeab714c28d30b2d79525b15cf306d1beb41dd82e0c8f8236eec1c48f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2273b28d-0a33-44ab-9d86-9a84d474205a, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true) Oct 14 06:18:19 localhost podman[334849]: 2025-10-14 10:18:18.983666853 +0000 UTC m=+0.040801565 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Oct 14 06:18:19 localhost podman[334849]: 2025-10-14 10:18:19.08495089 +0000 UTC m=+0.142085572 container start d9ac14beeab714c28d30b2d79525b15cf306d1beb41dd82e0c8f8236eec1c48f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2273b28d-0a33-44ab-9d86-9a84d474205a, io.buildah.version=1.41.3, 
tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251009, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true) Oct 14 06:18:19 localhost dnsmasq[334867]: started, version 2.85 cachesize 150 Oct 14 06:18:19 localhost dnsmasq[334867]: DNS service limited to local subnets Oct 14 06:18:19 localhost dnsmasq[334867]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Oct 14 06:18:19 localhost dnsmasq[334867]: warning: no upstream servers configured Oct 14 06:18:19 localhost dnsmasq-dhcp[334867]: DHCP, static leases only on 10.100.0.0, lease time 1d Oct 14 06:18:19 localhost dnsmasq[334867]: read /var/lib/neutron/dhcp/2273b28d-0a33-44ab-9d86-9a84d474205a/addn_hosts - 0 addresses Oct 14 06:18:19 localhost dnsmasq-dhcp[334867]: read /var/lib/neutron/dhcp/2273b28d-0a33-44ab-9d86-9a84d474205a/host Oct 14 06:18:19 localhost dnsmasq-dhcp[334867]: read /var/lib/neutron/dhcp/2273b28d-0a33-44ab-9d86-9a84d474205a/opts Oct 14 06:18:19 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:18:19.221 271987 INFO neutron.agent.dhcp.agent [None req-6cfcf1c6-e933-4717-b998-b8d434e34e1c - - - - - -] DHCP configuration for ports {'e856f4db-4050-4197-8e3b-0b1a0121c1bf'} is completed#033[00m Oct 14 06:18:19 localhost nova_compute[297686]: 2025-10-14 10:18:19.237 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:18:19 localhost dnsmasq[334867]: exiting on receipt of SIGTERM Oct 14 06:18:19 localhost podman[334884]: 2025-10-14 10:18:19.310276338 +0000 UTC m=+0.058467902 container kill d9ac14beeab714c28d30b2d79525b15cf306d1beb41dd82e0c8f8236eec1c48f 
(image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2273b28d-0a33-44ab-9d86-9a84d474205a, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true) Oct 14 06:18:19 localhost systemd[1]: libpod-d9ac14beeab714c28d30b2d79525b15cf306d1beb41dd82e0c8f8236eec1c48f.scope: Deactivated successfully. Oct 14 06:18:19 localhost podman[334897]: 2025-10-14 10:18:19.380336368 +0000 UTC m=+0.057874803 container died d9ac14beeab714c28d30b2d79525b15cf306d1beb41dd82e0c8f8236eec1c48f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2273b28d-0a33-44ab-9d86-9a84d474205a, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20251009, io.buildah.version=1.41.3) Oct 14 06:18:19 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e173 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Oct 14 06:18:19 localhost podman[334897]: 2025-10-14 10:18:19.405325732 +0000 UTC m=+0.082864147 container cleanup d9ac14beeab714c28d30b2d79525b15cf306d1beb41dd82e0c8f8236eec1c48f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2273b28d-0a33-44ab-9d86-9a84d474205a, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2) Oct 14 06:18:19 localhost systemd[1]: libpod-conmon-d9ac14beeab714c28d30b2d79525b15cf306d1beb41dd82e0c8f8236eec1c48f.scope: Deactivated successfully. Oct 14 06:18:19 localhost podman[334899]: 2025-10-14 10:18:19.515575838 +0000 UTC m=+0.186522999 container remove d9ac14beeab714c28d30b2d79525b15cf306d1beb41dd82e0c8f8236eec1c48f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2273b28d-0a33-44ab-9d86-9a84d474205a, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251009) Oct 14 06:18:19 localhost kernel: device tap3ab19c1a-ab left promiscuous mode Oct 14 06:18:19 localhost nova_compute[297686]: 2025-10-14 10:18:19.531 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:18:19 localhost nova_compute[297686]: 2025-10-14 10:18:19.542 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:18:19 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:18:19.568 271987 INFO neutron.agent.dhcp.agent [None req-bca53892-e354-425d-acba-6a4b596952e7 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Oct 14 06:18:19 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:18:19.569 271987 INFO neutron.agent.dhcp.agent [None req-bca53892-e354-425d-acba-6a4b596952e7 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Oct 14 06:18:19 localhost ceph-mon[317114]: 
mon.np0005486733@2(peon).osd e174 e174: 6 total, 6 up, 6 in Oct 14 06:18:20 localhost systemd[1]: tmp-crun.f4r8o4.mount: Deactivated successfully. Oct 14 06:18:20 localhost systemd[1]: var-lib-containers-storage-overlay-06c631136e7471032fe184a90436b7a733b1a1bc9b4515b97a4214c66a89dc18-merged.mount: Deactivated successfully. Oct 14 06:18:20 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d9ac14beeab714c28d30b2d79525b15cf306d1beb41dd82e0c8f8236eec1c48f-userdata-shm.mount: Deactivated successfully. Oct 14 06:18:20 localhost systemd[1]: run-netns-qdhcp\x2d2273b28d\x2d0a33\x2d44ab\x2d9d86\x2d9a84d474205a.mount: Deactivated successfully. Oct 14 06:18:20 localhost nova_compute[297686]: 2025-10-14 10:18:20.344 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:18:24 localhost nova_compute[297686]: 2025-10-14 10:18:24.266 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:18:24 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e174 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Oct 14 06:18:24 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e175 e175: 6 total, 6 up, 6 in Oct 14 06:18:25 localhost nova_compute[297686]: 2025-10-14 10:18:25.356 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:18:25 localhost neutron_sriov_agent[264974]: 2025-10-14 10:18:25.614 2 INFO neutron.agent.securitygroups_rpc [req-546c2793-12af-430c-80b1-7cb6124afeaa req-bf258e6c-e801-49be-9d6e-83494c9ce496 4c194ea59b244432a9ec5417b8101ebe 5ac8b4aa702a449b8bf4a8039f977fc5 - - default default] Security group member updated ['8fe43e8a-a14a-430f-ba7d-c6a0fef96a1b']#033[00m Oct 14 06:18:27 localhost systemd[1]: Started /usr/bin/podman 
healthcheck run 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041. Oct 14 06:18:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857. Oct 14 06:18:27 localhost podman[334927]: 2025-10-14 10:18:27.768306185 +0000 UTC m=+0.104536469 container health_status 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Oct 14 06:18:27 localhost podman[334927]: 2025-10-14 10:18:27.777346466 +0000 UTC m=+0.113576760 container exec_died 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': 
{'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Oct 14 06:18:27 localhost systemd[1]: 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041.service: Deactivated successfully. Oct 14 06:18:27 localhost podman[334928]: 2025-10-14 10:18:27.861495042 +0000 UTC m=+0.193153374 container health_status 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251009) Oct 14 06:18:27 localhost podman[334928]: 2025-10-14 10:18:27.896154825 +0000 UTC m=+0.227813087 container exec_died 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3) Oct 14 06:18:27 localhost systemd[1]: 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857.service: Deactivated successfully. Oct 14 06:18:28 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:18:28.210 271987 INFO neutron.agent.linux.ip_lib [None req-8ce2a506-73af-448b-91ca-6efaeb07f21f - - - - - -] Device tap056917ad-aa cannot be used as it has no MAC address#033[00m Oct 14 06:18:28 localhost nova_compute[297686]: 2025-10-14 10:18:28.235 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:18:28 localhost kernel: device tap056917ad-aa entered promiscuous mode Oct 14 06:18:28 localhost NetworkManager[5977]: [1760437108.2432] manager: (tap056917ad-aa): new Generic device (/org/freedesktop/NetworkManager/Devices/43) Oct 14 06:18:28 localhost systemd-udevd[334978]: Network interface NamePolicy= disabled on kernel command line. Oct 14 06:18:28 localhost nova_compute[297686]: 2025-10-14 10:18:28.246 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:18:28 localhost ovn_controller[157396]: 2025-10-14T10:18:28Z|00250|binding|INFO|Claiming lport 056917ad-aaac-48b4-94b4-d0cac0b1667f for this chassis. 
Oct 14 06:18:28 localhost ovn_controller[157396]: 2025-10-14T10:18:28Z|00251|binding|INFO|056917ad-aaac-48b4-94b4-d0cac0b1667f: Claiming unknown Oct 14 06:18:28 localhost ovn_metadata_agent[163050]: 2025-10-14 10:18:28.258 163055 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005486733.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpe1893814-19d7-5b97-af05-19e2aafa5382-4d5123fe-702c-4afa-b997-e84defb3dbb9', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4d5123fe-702c-4afa-b997-e84defb3dbb9', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3c4e628039e94868b41efbbdc1307f19', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c9b470b4-a26a-4c56-82e3-64146bbd6304, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=056917ad-aaac-48b4-94b4-d0cac0b1667f) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Oct 14 06:18:28 localhost ovn_metadata_agent[163050]: 2025-10-14 10:18:28.259 163055 INFO neutron.agent.ovn.metadata.agent [-] Port 056917ad-aaac-48b4-94b4-d0cac0b1667f in datapath 4d5123fe-702c-4afa-b997-e84defb3dbb9 bound to our chassis#033[00m Oct 14 06:18:28 localhost ovn_metadata_agent[163050]: 2025-10-14 10:18:28.264 163055 DEBUG neutron.agent.ovn.metadata.agent [-] Port 0b5096bc-72fb-41b2-8fb0-a9f4cffc8ef0 IP 
addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Oct 14 06:18:28 localhost ovn_metadata_agent[163050]: 2025-10-14 10:18:28.264 163055 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 4d5123fe-702c-4afa-b997-e84defb3dbb9, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Oct 14 06:18:28 localhost ovn_metadata_agent[163050]: 2025-10-14 10:18:28.265 163159 DEBUG oslo.privsep.daemon [-] privsep: reply[228a9512-34fe-4b93-98c7-c9d49bc79ed7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Oct 14 06:18:28 localhost journal[237477]: ethtool ioctl error on tap056917ad-aa: No such device Oct 14 06:18:28 localhost journal[237477]: ethtool ioctl error on tap056917ad-aa: No such device Oct 14 06:18:28 localhost journal[237477]: ethtool ioctl error on tap056917ad-aa: No such device Oct 14 06:18:28 localhost ovn_controller[157396]: 2025-10-14T10:18:28Z|00252|binding|INFO|Setting lport 056917ad-aaac-48b4-94b4-d0cac0b1667f ovn-installed in OVS Oct 14 06:18:28 localhost ovn_controller[157396]: 2025-10-14T10:18:28Z|00253|binding|INFO|Setting lport 056917ad-aaac-48b4-94b4-d0cac0b1667f up in Southbound Oct 14 06:18:28 localhost nova_compute[297686]: 2025-10-14 10:18:28.286 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:18:28 localhost journal[237477]: ethtool ioctl error on tap056917ad-aa: No such device Oct 14 06:18:28 localhost journal[237477]: ethtool ioctl error on tap056917ad-aa: No such device Oct 14 06:18:28 localhost journal[237477]: ethtool ioctl error on tap056917ad-aa: No such device Oct 14 06:18:28 localhost journal[237477]: ethtool ioctl error on tap056917ad-aa: No such device Oct 14 06:18:28 
localhost podman[248187]: time="2025-10-14T10:18:28Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Oct 14 06:18:28 localhost nova_compute[297686]: 2025-10-14 10:18:28.312 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:18:28 localhost journal[237477]: ethtool ioctl error on tap056917ad-aa: No such device Oct 14 06:18:28 localhost podman[248187]: @ - - [14/Oct/2025:10:18:28 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 147497 "" "Go-http-client/1.1" Oct 14 06:18:28 localhost nova_compute[297686]: 2025-10-14 10:18:28.342 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:18:28 localhost podman[248187]: @ - - [14/Oct/2025:10:18:28 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19874 "" "Go-http-client/1.1" Oct 14 06:18:29 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e176 e176: 6 total, 6 up, 6 in Oct 14 06:18:29 localhost nova_compute[297686]: 2025-10-14 10:18:29.255 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:18:29 localhost nova_compute[297686]: 2025-10-14 10:18:29.257 2 DEBUG nova.compute.manager [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Oct 14 06:18:29 localhost nova_compute[297686]: 2025-10-14 10:18:29.257 2 DEBUG nova.compute.manager [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Rebuilding the list of instances to heal 
_heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Oct 14 06:18:29 localhost nova_compute[297686]: 2025-10-14 10:18:29.316 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:18:29 localhost nova_compute[297686]: 2025-10-14 10:18:29.328 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Acquiring lock "refresh_cache-88c4e366-b765-47a6-96bf-f7677f2ce67c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Oct 14 06:18:29 localhost nova_compute[297686]: 2025-10-14 10:18:29.328 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Acquired lock "refresh_cache-88c4e366-b765-47a6-96bf-f7677f2ce67c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Oct 14 06:18:29 localhost nova_compute[297686]: 2025-10-14 10:18:29.329 2 DEBUG nova.network.neutron [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] [instance: 88c4e366-b765-47a6-96bf-f7677f2ce67c] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Oct 14 06:18:29 localhost nova_compute[297686]: 2025-10-14 10:18:29.329 2 DEBUG nova.objects.instance [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Lazy-loading 'info_cache' on Instance uuid 88c4e366-b765-47a6-96bf-f7677f2ce67c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Oct 14 06:18:29 localhost podman[335155]: Oct 14 06:18:29 localhost podman[335155]: 2025-10-14 10:18:29.368115876 +0000 UTC m=+0.095326763 container create 45e6ac5fcb0696020766a69c1dc47aded2f1f796f4b0816b1d81a964a51b26e5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4d5123fe-702c-4afa-b997-e84defb3dbb9, 
maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5) Oct 14 06:18:29 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e176 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Oct 14 06:18:29 localhost systemd[1]: Started libpod-conmon-45e6ac5fcb0696020766a69c1dc47aded2f1f796f4b0816b1d81a964a51b26e5.scope. Oct 14 06:18:29 localhost podman[335155]: 2025-10-14 10:18:29.336036462 +0000 UTC m=+0.063247319 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Oct 14 06:18:29 localhost podman[335167]: 2025-10-14 10:18:29.440065404 +0000 UTC m=+0.113312060 container exec b19a91314e045cbe5465f6abdeb6b420108fcda7860b498b01dcd4e65236d2c8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-crash-np0005486733, name=rhceph, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.expose-services=, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, version=7, io.openshift.tags=rhceph ceph, build-date=2025-09-24T08:57:55, vcs-type=git, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, release=553, 
distribution-scope=public, vendor=Red Hat, Inc., GIT_CLEAN=True, architecture=x86_64, maintainer=Guillaume Abrioux , vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d) Oct 14 06:18:29 localhost systemd[1]: Started libcrun container. Oct 14 06:18:29 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f2b46dd501dbc0fcc67c2ec83a944175a66dab934e8d7c692ef81aae8f2e667a/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Oct 14 06:18:29 localhost podman[335155]: 2025-10-14 10:18:29.459340541 +0000 UTC m=+0.186551418 container init 45e6ac5fcb0696020766a69c1dc47aded2f1f796f4b0816b1d81a964a51b26e5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4d5123fe-702c-4afa-b997-e84defb3dbb9, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, maintainer=OpenStack Kubernetes Operator team) Oct 14 06:18:29 localhost podman[335155]: 2025-10-14 10:18:29.466527744 +0000 UTC m=+0.193738601 container start 45e6ac5fcb0696020766a69c1dc47aded2f1f796f4b0816b1d81a964a51b26e5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4d5123fe-702c-4afa-b997-e84defb3dbb9, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true) Oct 14 06:18:29 localhost dnsmasq[335194]: started, version 2.85 cachesize 150 Oct 14 06:18:29 localhost dnsmasq[335194]: DNS service limited to local subnets Oct 14 06:18:29 localhost dnsmasq[335194]: 
compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Oct 14 06:18:29 localhost dnsmasq[335194]: warning: no upstream servers configured Oct 14 06:18:29 localhost dnsmasq-dhcp[335194]: DHCP, static leases only on 10.100.0.0, lease time 1d Oct 14 06:18:29 localhost dnsmasq[335194]: read /var/lib/neutron/dhcp/4d5123fe-702c-4afa-b997-e84defb3dbb9/addn_hosts - 0 addresses Oct 14 06:18:29 localhost dnsmasq-dhcp[335194]: read /var/lib/neutron/dhcp/4d5123fe-702c-4afa-b997-e84defb3dbb9/host Oct 14 06:18:29 localhost dnsmasq-dhcp[335194]: read /var/lib/neutron/dhcp/4d5123fe-702c-4afa-b997-e84defb3dbb9/opts Oct 14 06:18:29 localhost podman[335167]: 2025-10-14 10:18:29.57165208 +0000 UTC m=+0.244898716 container exec_died b19a91314e045cbe5465f6abdeb6b420108fcda7860b498b01dcd4e65236d2c8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-fcadf6e2-9176-5818-a8d0-37b19acf8eaf-crash-np0005486733, io.openshift.tags=rhceph ceph, RELEASE=main, maintainer=Guillaume Abrioux , GIT_BRANCH=main, description=Red Hat Ceph Storage 7, distribution-scope=public, build-date=2025-09-24T08:57:55, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, name=rhceph, architecture=x86_64, io.buildah.version=1.33.12, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, GIT_CLEAN=True, ceph=True, version=7, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, release=553, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully 
featured and supported base image.) Oct 14 06:18:29 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:18:29.731 271987 INFO neutron.agent.dhcp.agent [None req-2c193211-1c85-4156-9e3d-2946d1fdaeb9 - - - - - -] DHCP configuration for ports {'54f00441-c926-476a-b5a2-3e2bc7b67b4f'} is completed#033[00m Oct 14 06:18:29 localhost ceph-mon[317114]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #37. Immutable memtables: 0. Oct 14 06:18:29 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:18:29.906455) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0 Oct 14 06:18:29 localhost ceph-mon[317114]: rocksdb: [db/flush_job.cc:856] [default] [JOB 19] Flushing memtable with next log file: 37 Oct 14 06:18:29 localhost ceph-mon[317114]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760437109906500, "job": 19, "event": "flush_started", "num_memtables": 1, "num_entries": 660, "num_deletes": 263, "total_data_size": 633060, "memory_usage": 645416, "flush_reason": "Manual Compaction"} Oct 14 06:18:29 localhost ceph-mon[317114]: rocksdb: [db/flush_job.cc:885] [default] [JOB 19] Level-0 flush table #38: started Oct 14 06:18:29 localhost ceph-mon[317114]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760437109909830, "cf_name": "default", "job": 19, "event": "table_file_creation", "file_number": 38, "file_size": 413188, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 24476, "largest_seqno": 25131, "table_properties": {"data_size": 409968, "index_size": 1139, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1029, "raw_key_size": 7984, "raw_average_key_size": 19, "raw_value_size": 403209, "raw_average_value_size": 998, "num_data_blocks": 50, "num_entries": 404, 
"num_filter_entries": 404, "num_deletions": 263, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1760437089, "oldest_key_time": 1760437089, "file_creation_time": 1760437109, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "c006d4a7-7ae8-4cd3-a1b5-95f5bf3c427c", "db_session_id": "ZS6W3A0Q266OBGAU6ARP", "orig_file_number": 38, "seqno_to_time_mapping": "N/A"}} Oct 14 06:18:29 localhost ceph-mon[317114]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 19] Flush lasted 3405 microseconds, and 1164 cpu microseconds. Oct 14 06:18:29 localhost ceph-mon[317114]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. 
Oct 14 06:18:29 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:18:29.909860) [db/flush_job.cc:967] [default] [JOB 19] Level-0 flush table #38: 413188 bytes OK Oct 14 06:18:29 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:18:29.909874) [db/memtable_list.cc:519] [default] Level-0 commit table #38 started Oct 14 06:18:29 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:18:29.911526) [db/memtable_list.cc:722] [default] Level-0 commit table #38: memtable #1 done Oct 14 06:18:29 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:18:29.911539) EVENT_LOG_v1 {"time_micros": 1760437109911535, "job": 19, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0} Oct 14 06:18:29 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:18:29.911552) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25 Oct 14 06:18:29 localhost ceph-mon[317114]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 19] Try to delete WAL files size 629317, prev total WAL file size 629317, number of live WAL files 2. Oct 14 06:18:29 localhost ceph-mon[317114]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005486733/store.db/000034.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Oct 14 06:18:29 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:18:29.911952) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0034323634' seq:72057594037927935, type:22 .. 
'6C6F676D0034353137' seq:0, type:0; will stop at (end) Oct 14 06:18:29 localhost ceph-mon[317114]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 20] Compacting 1@0 + 1@6 files to L6, score -1.00 Oct 14 06:18:29 localhost ceph-mon[317114]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 19 Base level 0, inputs: [38(403KB)], [36(16MB)] Oct 14 06:18:29 localhost ceph-mon[317114]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760437109911982, "job": 20, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [38], "files_L6": [36], "score": -1, "input_data_size": 17506878, "oldest_snapshot_seqno": -1} Oct 14 06:18:29 localhost dnsmasq[335194]: exiting on receipt of SIGTERM Oct 14 06:18:29 localhost podman[335273]: 2025-10-14 10:18:29.950454702 +0000 UTC m=+0.037676197 container kill 45e6ac5fcb0696020766a69c1dc47aded2f1f796f4b0816b1d81a964a51b26e5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4d5123fe-702c-4afa-b997-e84defb3dbb9, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, maintainer=OpenStack Kubernetes Operator team) Oct 14 06:18:29 localhost systemd[1]: libpod-45e6ac5fcb0696020766a69c1dc47aded2f1f796f4b0816b1d81a964a51b26e5.scope: Deactivated successfully. 
Oct 14 06:18:29 localhost ceph-mon[317114]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 20] Generated table #39: 12899 keys, 16869147 bytes, temperature: kUnknown Oct 14 06:18:29 localhost ceph-mon[317114]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760437109976419, "cf_name": "default", "job": 20, "event": "table_file_creation", "file_number": 39, "file_size": 16869147, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 16797195, "index_size": 38635, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 32261, "raw_key_size": 349352, "raw_average_key_size": 27, "raw_value_size": 16578795, "raw_average_value_size": 1285, "num_data_blocks": 1426, "num_entries": 12899, "num_filter_entries": 12899, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1760436394, "oldest_key_time": 0, "file_creation_time": 1760437109, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "c006d4a7-7ae8-4cd3-a1b5-95f5bf3c427c", "db_session_id": "ZS6W3A0Q266OBGAU6ARP", "orig_file_number": 39, "seqno_to_time_mapping": "N/A"}} Oct 14 06:18:29 localhost ceph-mon[317114]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. 
Oct 14 06:18:29 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:18:29.976715) [db/compaction/compaction_job.cc:1663] [default] [JOB 20] Compacted 1@0 + 1@6 files to L6 => 16869147 bytes Oct 14 06:18:29 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:18:29.978393) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 271.4 rd, 261.5 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.4, 16.3 +0.0 blob) out(16.1 +0.0 blob), read-write-amplify(83.2) write-amplify(40.8) OK, records in: 13438, records dropped: 539 output_compression: NoCompression Oct 14 06:18:29 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:18:29.978411) EVENT_LOG_v1 {"time_micros": 1760437109978401, "job": 20, "event": "compaction_finished", "compaction_time_micros": 64509, "compaction_time_cpu_micros": 25070, "output_level": 6, "num_output_files": 1, "total_output_size": 16869147, "num_input_records": 13438, "num_output_records": 12899, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]} Oct 14 06:18:29 localhost ceph-mon[317114]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005486733/store.db/000038.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Oct 14 06:18:29 localhost ceph-mon[317114]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760437109978539, "job": 20, "event": "table_file_deletion", "file_number": 38} Oct 14 06:18:29 localhost ceph-mon[317114]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005486733/store.db/000036.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Oct 14 06:18:29 localhost ceph-mon[317114]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760437109980017, 
"job": 20, "event": "table_file_deletion", "file_number": 36} Oct 14 06:18:29 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:18:29.911901) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Oct 14 06:18:29 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:18:29.980048) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Oct 14 06:18:29 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:18:29.980052) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Oct 14 06:18:29 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:18:29.980060) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Oct 14 06:18:29 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:18:29.980062) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Oct 14 06:18:29 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:18:29.980063) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Oct 14 06:18:30 localhost podman[335286]: 2025-10-14 10:18:30.018351366 +0000 UTC m=+0.055110758 container died 45e6ac5fcb0696020766a69c1dc47aded2f1f796f4b0816b1d81a964a51b26e5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4d5123fe-702c-4afa-b997-e84defb3dbb9, org.label-schema.build-date=20251009, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image) Oct 14 06:18:30 localhost podman[335286]: 2025-10-14 10:18:30.048104137 +0000 UTC m=+0.084863509 container cleanup 
45e6ac5fcb0696020766a69c1dc47aded2f1f796f4b0816b1d81a964a51b26e5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4d5123fe-702c-4afa-b997-e84defb3dbb9, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251009) Oct 14 06:18:30 localhost systemd[1]: libpod-conmon-45e6ac5fcb0696020766a69c1dc47aded2f1f796f4b0816b1d81a964a51b26e5.scope: Deactivated successfully. Oct 14 06:18:30 localhost podman[335288]: 2025-10-14 10:18:30.097235679 +0000 UTC m=+0.125945322 container remove 45e6ac5fcb0696020766a69c1dc47aded2f1f796f4b0816b1d81a964a51b26e5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4d5123fe-702c-4afa-b997-e84defb3dbb9, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, maintainer=OpenStack Kubernetes Operator team) Oct 14 06:18:30 localhost nova_compute[297686]: 2025-10-14 10:18:30.113 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:18:30 localhost kernel: device tap056917ad-aa left promiscuous mode Oct 14 06:18:30 localhost ovn_controller[157396]: 2025-10-14T10:18:30Z|00254|binding|INFO|Releasing lport 056917ad-aaac-48b4-94b4-d0cac0b1667f from this chassis (sb_readonly=0) Oct 14 06:18:30 localhost ovn_controller[157396]: 2025-10-14T10:18:30Z|00255|binding|INFO|Setting lport 056917ad-aaac-48b4-94b4-d0cac0b1667f down in Southbound Oct 14 06:18:30 
localhost ovn_metadata_agent[163050]: 2025-10-14 10:18:30.132 163055 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port 0b5096bc-72fb-41b2-8fb0-a9f4cffc8ef0 with type ""#033[00m Oct 14 06:18:30 localhost ovn_metadata_agent[163050]: 2025-10-14 10:18:30.134 163055 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005486733.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpe1893814-19d7-5b97-af05-19e2aafa5382-4d5123fe-702c-4afa-b997-e84defb3dbb9', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4d5123fe-702c-4afa-b997-e84defb3dbb9', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3c4e628039e94868b41efbbdc1307f19', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c9b470b4-a26a-4c56-82e3-64146bbd6304, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=056917ad-aaac-48b4-94b4-d0cac0b1667f) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Oct 14 06:18:30 localhost nova_compute[297686]: 2025-10-14 10:18:30.136 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:18:30 localhost ovn_metadata_agent[163050]: 2025-10-14 10:18:30.135 163055 INFO neutron.agent.ovn.metadata.agent [-] Port 056917ad-aaac-48b4-94b4-d0cac0b1667f in datapath 
4d5123fe-702c-4afa-b997-e84defb3dbb9 unbound from our chassis#033[00m Oct 14 06:18:30 localhost ovn_metadata_agent[163050]: 2025-10-14 10:18:30.137 163055 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 4d5123fe-702c-4afa-b997-e84defb3dbb9 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Oct 14 06:18:30 localhost ovn_metadata_agent[163050]: 2025-10-14 10:18:30.138 163159 DEBUG oslo.privsep.daemon [-] privsep: reply[9312e81b-bcfa-4461-a95e-ad6baa0d8c1e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Oct 14 06:18:30 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:18:30.156 271987 INFO neutron.agent.dhcp.agent [None req-f2e78a70-eca7-45a4-b85d-bcd9685db605 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Oct 14 06:18:30 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:18:30.157 271987 INFO neutron.agent.dhcp.agent [None req-f2e78a70-eca7-45a4-b85d-bcd9685db605 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Oct 14 06:18:30 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e177 e177: 6 total, 6 up, 6 in Oct 14 06:18:30 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' Oct 14 06:18:30 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' Oct 14 06:18:30 localhost nova_compute[297686]: 2025-10-14 10:18:30.358 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:18:30 localhost systemd[1]: var-lib-containers-storage-overlay-f2b46dd501dbc0fcc67c2ec83a944175a66dab934e8d7c692ef81aae8f2e667a-merged.mount: Deactivated successfully. 
Oct 14 06:18:30 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-45e6ac5fcb0696020766a69c1dc47aded2f1f796f4b0816b1d81a964a51b26e5-userdata-shm.mount: Deactivated successfully. Oct 14 06:18:30 localhost systemd[1]: run-netns-qdhcp\x2d4d5123fe\x2d702c\x2d4afa\x2db997\x2de84defb3dbb9.mount: Deactivated successfully. Oct 14 06:18:30 localhost nova_compute[297686]: 2025-10-14 10:18:30.613 2 DEBUG nova.network.neutron [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] [instance: 88c4e366-b765-47a6-96bf-f7677f2ce67c] Updating instance_info_cache with network_info: [{"id": "3ec9b060-f43d-4698-9c76-6062c70911d5", "address": "fa:16:3e:84:5e:e5", "network": {"id": "7d0cd696-bdd7-4e70-9512-eb0d23640314", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.46", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "41187b090f3d4818a32baa37ce8a3991", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3ec9b060-f4", "ovs_interfaceid": "3ec9b060-f43d-4698-9c76-6062c70911d5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Oct 14 06:18:30 localhost ovn_controller[157396]: 2025-10-14T10:18:30Z|00256|binding|INFO|Releasing lport 25c6586a-239c-451b-aac2-e0a3ee5c3145 from this chassis (sb_readonly=0) Oct 14 06:18:30 localhost 
nova_compute[297686]: 2025-10-14 10:18:30.656 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Releasing lock "refresh_cache-88c4e366-b765-47a6-96bf-f7677f2ce67c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Oct 14 06:18:30 localhost nova_compute[297686]: 2025-10-14 10:18:30.657 2 DEBUG nova.compute.manager [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] [instance: 88c4e366-b765-47a6-96bf-f7677f2ce67c] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Oct 14 06:18:30 localhost nova_compute[297686]: 2025-10-14 10:18:30.657 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:18:30 localhost nova_compute[297686]: 2025-10-14 10:18:30.679 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:18:31 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' Oct 14 06:18:31 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' Oct 14 06:18:31 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' Oct 14 06:18:31 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' Oct 14 06:18:31 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch Oct 14 06:18:31 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "config rm", 
"who": "osd.5", "name": "osd_memory_target"} : dispatch Oct 14 06:18:31 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e178 e178: 6 total, 6 up, 6 in Oct 14 06:18:32 localhost nova_compute[297686]: 2025-10-14 10:18:32.255 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:18:32 localhost nova_compute[297686]: 2025-10-14 10:18:32.255 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:18:32 localhost ceph-mon[317114]: Adjusting osd_memory_target on np0005486732.localdomain to 836.6M Oct 14 06:18:32 localhost ceph-mon[317114]: Unable to set osd_memory_target on np0005486732.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096 Oct 14 06:18:32 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch Oct 14 06:18:32 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch Oct 14 06:18:32 localhost ceph-mon[317114]: Adjusting osd_memory_target on np0005486733.localdomain to 836.6M Oct 14 06:18:32 localhost ceph-mon[317114]: Unable to set osd_memory_target on np0005486733.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096 Oct 14 06:18:32 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : 
dispatch Oct 14 06:18:32 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch Oct 14 06:18:32 localhost ceph-mon[317114]: Adjusting osd_memory_target on np0005486731.localdomain to 836.6M Oct 14 06:18:32 localhost ceph-mon[317114]: Unable to set osd_memory_target on np0005486731.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096 Oct 14 06:18:32 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Oct 14 06:18:32 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' Oct 14 06:18:32 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Oct 14 06:18:32 localhost ceph-mon[317114]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3170999203' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Oct 14 06:18:32 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Oct 14 06:18:32 localhost ceph-mon[317114]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.32:0/3170999203' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Oct 14 06:18:33 localhost nova_compute[297686]: 2025-10-14 10:18:33.254 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:18:33 localhost nova_compute[297686]: 2025-10-14 10:18:33.255 2 DEBUG nova.compute.manager [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Oct 14 06:18:33 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e179 e179: 6 total, 6 up, 6 in Oct 14 06:18:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad. Oct 14 06:18:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec. Oct 14 06:18:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29. 
Oct 14 06:18:33 localhost podman[335445]: 2025-10-14 10:18:33.763940618 +0000 UTC m=+0.089883055 container health_status fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, container_name=iscsid, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d) Oct 14 06:18:33 localhost podman[335445]: 2025-10-14 10:18:33.803256946 +0000 UTC m=+0.129199333 container exec_died fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 
(image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_id=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.build-date=20251009, container_name=iscsid, io.buildah.version=1.41.3, managed_by=edpm_ansible) Oct 14 06:18:33 localhost systemd[1]: fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29.service: Deactivated successfully. 
Oct 14 06:18:33 localhost podman[335443]: 2025-10-14 10:18:33.821453699 +0000 UTC m=+0.152784013 container health_status 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=multipathd, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=multipathd, managed_by=edpm_ansible, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team) Oct 14 06:18:33 localhost podman[335444]: 2025-10-14 10:18:33.880435245 +0000 UTC m=+0.211155420 container health_status 
aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Oct 14 06:18:33 localhost podman[335444]: 2025-10-14 10:18:33.924371717 +0000 UTC m=+0.255091822 container exec_died aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': 
['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter) Oct 14 06:18:33 localhost systemd[1]: aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec.service: Deactivated successfully. 
Oct 14 06:18:33 localhost podman[335443]: 2025-10-14 10:18:33.93838036 +0000 UTC m=+0.269710684 container exec_died 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, config_id=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d) Oct 14 06:18:33 localhost systemd[1]: 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad.service: Deactivated successfully. 
Oct 14 06:18:34 localhost nova_compute[297686]: 2025-10-14 10:18:34.251 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:18:34 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:18:34.295 271987 INFO neutron.agent.linux.ip_lib [None req-acfac3e0-f7b9-4ed8-b2cf-fa64711ec621 - - - - - -] Device tapf66ba1ef-91 cannot be used as it has no MAC address#033[00m Oct 14 06:18:34 localhost nova_compute[297686]: 2025-10-14 10:18:34.361 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:18:34 localhost kernel: device tapf66ba1ef-91 entered promiscuous mode Oct 14 06:18:34 localhost NetworkManager[5977]: [1760437114.3723] manager: (tapf66ba1ef-91): new Generic device (/org/freedesktop/NetworkManager/Devices/44) Oct 14 06:18:34 localhost nova_compute[297686]: 2025-10-14 10:18:34.372 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:18:34 localhost ovn_controller[157396]: 2025-10-14T10:18:34Z|00257|binding|INFO|Claiming lport f66ba1ef-910f-412f-b0fc-a83c2f8f9d3d for this chassis. Oct 14 06:18:34 localhost ovn_controller[157396]: 2025-10-14T10:18:34Z|00258|binding|INFO|f66ba1ef-910f-412f-b0fc-a83c2f8f9d3d: Claiming unknown Oct 14 06:18:34 localhost systemd-udevd[335515]: Network interface NamePolicy= disabled on kernel command line. 
Oct 14 06:18:34 localhost ovn_metadata_agent[163050]: 2025-10-14 10:18:34.390 163055 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005486733.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpe1893814-19d7-5b97-af05-19e2aafa5382-6ade2f36-65d2-4259-87c6-e70580caa788', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6ade2f36-65d2-4259-87c6-e70580caa788', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3c4e628039e94868b41efbbdc1307f19', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3da61b68-100d-4288-9e1c-ae899d2aa400, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=f66ba1ef-910f-412f-b0fc-a83c2f8f9d3d) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Oct 14 06:18:34 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e179 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104 Oct 14 06:18:34 localhost ovn_metadata_agent[163050]: 2025-10-14 10:18:34.393 163055 INFO neutron.agent.ovn.metadata.agent [-] Port f66ba1ef-910f-412f-b0fc-a83c2f8f9d3d in datapath 6ade2f36-65d2-4259-87c6-e70580caa788 bound to our chassis#033[00m Oct 14 06:18:34 localhost ovn_metadata_agent[163050]: 2025-10-14 10:18:34.398 163055 DEBUG neutron.agent.ovn.metadata.agent [-] Port 
c0cfa5ce-9e2f-4d94-8dc1-73fa2734b8d9 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Oct 14 06:18:34 localhost ovn_metadata_agent[163050]: 2025-10-14 10:18:34.398 163055 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 6ade2f36-65d2-4259-87c6-e70580caa788, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Oct 14 06:18:34 localhost ovn_metadata_agent[163050]: 2025-10-14 10:18:34.401 163159 DEBUG oslo.privsep.daemon [-] privsep: reply[eca9c593-6bc3-4d56-a8ea-146fc1d6198e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Oct 14 06:18:34 localhost journal[237477]: ethtool ioctl error on tapf66ba1ef-91: No such device Oct 14 06:18:34 localhost ovn_controller[157396]: 2025-10-14T10:18:34Z|00259|binding|INFO|Setting lport f66ba1ef-910f-412f-b0fc-a83c2f8f9d3d ovn-installed in OVS Oct 14 06:18:34 localhost ovn_controller[157396]: 2025-10-14T10:18:34Z|00260|binding|INFO|Setting lport f66ba1ef-910f-412f-b0fc-a83c2f8f9d3d up in Southbound Oct 14 06:18:34 localhost nova_compute[297686]: 2025-10-14 10:18:34.414 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:18:34 localhost journal[237477]: ethtool ioctl error on tapf66ba1ef-91: No such device Oct 14 06:18:34 localhost journal[237477]: ethtool ioctl error on tapf66ba1ef-91: No such device Oct 14 06:18:34 localhost journal[237477]: ethtool ioctl error on tapf66ba1ef-91: No such device Oct 14 06:18:34 localhost journal[237477]: ethtool ioctl error on tapf66ba1ef-91: No such device Oct 14 06:18:34 localhost journal[237477]: ethtool ioctl error on tapf66ba1ef-91: No such device Oct 14 06:18:34 localhost journal[237477]: ethtool ioctl error on 
tapf66ba1ef-91: No such device Oct 14 06:18:34 localhost journal[237477]: ethtool ioctl error on tapf66ba1ef-91: No such device Oct 14 06:18:34 localhost nova_compute[297686]: 2025-10-14 10:18:34.456 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:18:34 localhost nova_compute[297686]: 2025-10-14 10:18:34.493 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:18:34 localhost ovn_controller[157396]: 2025-10-14T10:18:34Z|00261|binding|INFO|Removing iface tapf66ba1ef-91 ovn-installed in OVS Oct 14 06:18:34 localhost ovn_metadata_agent[163050]: 2025-10-14 10:18:34.713 163055 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port c0cfa5ce-9e2f-4d94-8dc1-73fa2734b8d9 with type ""#033[00m Oct 14 06:18:34 localhost ovn_controller[157396]: 2025-10-14T10:18:34Z|00262|binding|INFO|Removing lport f66ba1ef-910f-412f-b0fc-a83c2f8f9d3d ovn-installed in OVS Oct 14 06:18:34 localhost ovn_metadata_agent[163050]: 2025-10-14 10:18:34.715 163055 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005486733.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpe1893814-19d7-5b97-af05-19e2aafa5382-6ade2f36-65d2-4259-87c6-e70580caa788', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6ade2f36-65d2-4259-87c6-e70580caa788', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3c4e628039e94868b41efbbdc1307f19', 'neutron:revision_number': '1', 
'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3da61b68-100d-4288-9e1c-ae899d2aa400, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=f66ba1ef-910f-412f-b0fc-a83c2f8f9d3d) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Oct 14 06:18:34 localhost nova_compute[297686]: 2025-10-14 10:18:34.715 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:18:34 localhost ovn_metadata_agent[163050]: 2025-10-14 10:18:34.717 163055 INFO neutron.agent.ovn.metadata.agent [-] Port f66ba1ef-910f-412f-b0fc-a83c2f8f9d3d in datapath 6ade2f36-65d2-4259-87c6-e70580caa788 unbound from our chassis#033[00m Oct 14 06:18:34 localhost ovn_metadata_agent[163050]: 2025-10-14 10:18:34.722 163055 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 6ade2f36-65d2-4259-87c6-e70580caa788, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Oct 14 06:18:34 localhost ovn_metadata_agent[163050]: 2025-10-14 10:18:34.723 163159 DEBUG oslo.privsep.daemon [-] privsep: reply[72cef0ec-735a-4527-bf3b-36c74375500d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Oct 14 06:18:34 localhost nova_compute[297686]: 2025-10-14 10:18:34.724 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:18:34 localhost systemd[1]: tmp-crun.25d85c.mount: Deactivated successfully. 
Oct 14 06:18:34 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e180 e180: 6 total, 6 up, 6 in Oct 14 06:18:35 localhost nova_compute[297686]: 2025-10-14 10:18:35.360 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:18:35 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' Oct 14 06:18:35 localhost podman[335586]: Oct 14 06:18:35 localhost podman[335586]: 2025-10-14 10:18:35.557982564 +0000 UTC m=+0.092037122 container create 2a1561c0d62796efb35046ee03c4b38cb9de2c46685cc05fd6d57f7f7ace70c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6ade2f36-65d2-4259-87c6-e70580caa788, org.label-schema.build-date=20251009, tcib_managed=true, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS) Oct 14 06:18:35 localhost systemd[1]: Started libpod-conmon-2a1561c0d62796efb35046ee03c4b38cb9de2c46685cc05fd6d57f7f7ace70c3.scope. Oct 14 06:18:35 localhost systemd[1]: Started libcrun container. 
Oct 14 06:18:35 localhost podman[335586]: 2025-10-14 10:18:35.515470008 +0000 UTC m=+0.049524596 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Oct 14 06:18:35 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fffb7ef99773cf5599c3079211372e72754c804b93cb05e83bcd5fd5a1ab0b36/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Oct 14 06:18:35 localhost podman[335586]: 2025-10-14 10:18:35.630466689 +0000 UTC m=+0.164521247 container init 2a1561c0d62796efb35046ee03c4b38cb9de2c46685cc05fd6d57f7f7ace70c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6ade2f36-65d2-4259-87c6-e70580caa788, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251009) Oct 14 06:18:35 localhost podman[335586]: 2025-10-14 10:18:35.639613542 +0000 UTC m=+0.173668100 container start 2a1561c0d62796efb35046ee03c4b38cb9de2c46685cc05fd6d57f7f7ace70c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6ade2f36-65d2-4259-87c6-e70580caa788, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2) Oct 14 06:18:35 localhost dnsmasq[335604]: started, version 2.85 cachesize 150 Oct 14 06:18:35 localhost dnsmasq[335604]: DNS service limited to local subnets Oct 14 06:18:35 localhost dnsmasq[335604]: compile time options: IPv6 GNU-getopt DBus no-UBus 
no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Oct 14 06:18:35 localhost dnsmasq[335604]: warning: no upstream servers configured Oct 14 06:18:35 localhost dnsmasq-dhcp[335604]: DHCP, static leases only on 10.100.0.0, lease time 1d Oct 14 06:18:35 localhost dnsmasq[335604]: read /var/lib/neutron/dhcp/6ade2f36-65d2-4259-87c6-e70580caa788/addn_hosts - 0 addresses Oct 14 06:18:35 localhost dnsmasq-dhcp[335604]: read /var/lib/neutron/dhcp/6ade2f36-65d2-4259-87c6-e70580caa788/host Oct 14 06:18:35 localhost dnsmasq-dhcp[335604]: read /var/lib/neutron/dhcp/6ade2f36-65d2-4259-87c6-e70580caa788/opts Oct 14 06:18:35 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:18:35.806 271987 INFO neutron.agent.dhcp.agent [None req-ca40dfae-f3e2-47f5-baac-eb84e389a100 - - - - - -] DHCP configuration for ports {'1b988e7b-4758-4797-b751-0f5e7922deaf'} is completed#033[00m Oct 14 06:18:35 localhost dnsmasq[335604]: exiting on receipt of SIGTERM Oct 14 06:18:35 localhost podman[335621]: 2025-10-14 10:18:35.981967786 +0000 UTC m=+0.061146184 container kill 2a1561c0d62796efb35046ee03c4b38cb9de2c46685cc05fd6d57f7f7ace70c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6ade2f36-65d2-4259-87c6-e70580caa788, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2) Oct 14 06:18:35 localhost systemd[1]: tmp-crun.EggnF3.mount: Deactivated successfully. Oct 14 06:18:35 localhost systemd[1]: libpod-2a1561c0d62796efb35046ee03c4b38cb9de2c46685cc05fd6d57f7f7ace70c3.scope: Deactivated successfully. 
Oct 14 06:18:35 localhost ovn_controller[157396]: 2025-10-14T10:18:35Z|00263|binding|INFO|Releasing lport 25c6586a-239c-451b-aac2-e0a3ee5c3145 from this chassis (sb_readonly=0) Oct 14 06:18:36 localhost nova_compute[297686]: 2025-10-14 10:18:36.019 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:18:36 localhost podman[335635]: 2025-10-14 10:18:36.041000405 +0000 UTC m=+0.043250501 container died 2a1561c0d62796efb35046ee03c4b38cb9de2c46685cc05fd6d57f7f7ace70c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6ade2f36-65d2-4259-87c6-e70580caa788, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3) Oct 14 06:18:36 localhost podman[335635]: 2025-10-14 10:18:36.140732393 +0000 UTC m=+0.142982469 container cleanup 2a1561c0d62796efb35046ee03c4b38cb9de2c46685cc05fd6d57f7f7ace70c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6ade2f36-65d2-4259-87c6-e70580caa788, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5) Oct 14 06:18:36 localhost systemd[1]: libpod-conmon-2a1561c0d62796efb35046ee03c4b38cb9de2c46685cc05fd6d57f7f7ace70c3.scope: Deactivated successfully. 
Oct 14 06:18:36 localhost podman[335636]: 2025-10-14 10:18:36.20163742 +0000 UTC m=+0.203083061 container remove 2a1561c0d62796efb35046ee03c4b38cb9de2c46685cc05fd6d57f7f7ace70c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6ade2f36-65d2-4259-87c6-e70580caa788, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0) Oct 14 06:18:36 localhost nova_compute[297686]: 2025-10-14 10:18:36.212 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:18:36 localhost kernel: device tapf66ba1ef-91 left promiscuous mode Oct 14 06:18:36 localhost nova_compute[297686]: 2025-10-14 10:18:36.223 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:18:36 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:18:36.287 271987 INFO neutron.agent.dhcp.agent [None req-fabb0908-2bf2-4f24-b2cc-2d49ef110a36 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Oct 14 06:18:36 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:18:36.288 271987 INFO neutron.agent.dhcp.agent [None req-fabb0908-2bf2-4f24-b2cc-2d49ef110a36 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Oct 14 06:18:36 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e181 e181: 6 total, 6 up, 6 in Oct 14 06:18:36 localhost systemd[1]: var-lib-containers-storage-overlay-fffb7ef99773cf5599c3079211372e72754c804b93cb05e83bcd5fd5a1ab0b36-merged.mount: Deactivated successfully. 
Oct 14 06:18:36 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-2a1561c0d62796efb35046ee03c4b38cb9de2c46685cc05fd6d57f7f7ace70c3-userdata-shm.mount: Deactivated successfully. Oct 14 06:18:36 localhost systemd[1]: run-netns-qdhcp\x2d6ade2f36\x2d65d2\x2d4259\x2d87c6\x2de70580caa788.mount: Deactivated successfully. Oct 14 06:18:37 localhost dnsmasq[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/addn_hosts - 2 addresses Oct 14 06:18:37 localhost dnsmasq-dhcp[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/host Oct 14 06:18:37 localhost podman[335681]: 2025-10-14 10:18:37.172471859 +0000 UTC m=+0.063690713 container kill 27e23fba1f10c3118d508631d350793857acd97085c399f85eed727cd1eab7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0145816-4627-44f2-af00-ccc9ef0436ed, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.build-date=20251009, maintainer=OpenStack Kubernetes Operator team) Oct 14 06:18:37 localhost dnsmasq-dhcp[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/opts Oct 14 06:18:37 localhost nova_compute[297686]: 2025-10-14 10:18:37.255 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:18:37 localhost nova_compute[297686]: 2025-10-14 10:18:37.256 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks 
/usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:18:37 localhost nova_compute[297686]: 2025-10-14 10:18:37.257 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:18:37 localhost nova_compute[297686]: 2025-10-14 10:18:37.258 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:18:37 localhost nova_compute[297686]: 2025-10-14 10:18:37.286 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Oct 14 06:18:37 localhost nova_compute[297686]: 2025-10-14 10:18:37.287 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Oct 14 06:18:37 localhost nova_compute[297686]: 2025-10-14 10:18:37.288 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Oct 14 06:18:37 localhost nova_compute[297686]: 2025-10-14 10:18:37.288 2 DEBUG nova.compute.resource_tracker [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] 
Auditing locally available compute resources for np0005486733.localdomain (node: np0005486733.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Oct 14 06:18:37 localhost nova_compute[297686]: 2025-10-14 10:18:37.289 2 DEBUG oslo_concurrency.processutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Oct 14 06:18:37 localhost ovn_controller[157396]: 2025-10-14T10:18:37Z|00264|binding|INFO|Releasing lport 25c6586a-239c-451b-aac2-e0a3ee5c3145 from this chassis (sb_readonly=0) Oct 14 06:18:37 localhost nova_compute[297686]: 2025-10-14 10:18:37.383 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:18:37 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e182 e182: 6 total, 6 up, 6 in Oct 14 06:18:37 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Oct 14 06:18:37 localhost ceph-mon[317114]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.108:0/630200685' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Oct 14 06:18:37 localhost nova_compute[297686]: 2025-10-14 10:18:37.792 2 DEBUG oslo_concurrency.processutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.504s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Oct 14 06:18:37 localhost nova_compute[297686]: 2025-10-14 10:18:37.859 2 DEBUG nova.virt.libvirt.driver [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Oct 14 06:18:37 localhost nova_compute[297686]: 2025-10-14 10:18:37.860 2 DEBUG nova.virt.libvirt.driver [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Oct 14 06:18:38 localhost nova_compute[297686]: 2025-10-14 10:18:38.056 2 WARNING nova.virt.libvirt.driver [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Oct 14 06:18:38 localhost nova_compute[297686]: 2025-10-14 10:18:38.057 2 DEBUG nova.compute.resource_tracker [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Hypervisor/Node resource view: name=np0005486733.localdomain free_ram=11219MB free_disk=41.83695602416992GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": 
"1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Oct 14 06:18:38 localhost nova_compute[297686]: 2025-10-14 10:18:38.058 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Oct 14 06:18:38 localhost nova_compute[297686]: 2025-10-14 10:18:38.058 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Oct 14 06:18:38 localhost nova_compute[297686]: 2025-10-14 10:18:38.150 2 DEBUG nova.compute.resource_tracker [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Instance 88c4e366-b765-47a6-96bf-f7677f2ce67c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Oct 14 06:18:38 localhost nova_compute[297686]: 2025-10-14 10:18:38.151 2 DEBUG nova.compute.resource_tracker [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Oct 14 06:18:38 localhost nova_compute[297686]: 2025-10-14 10:18:38.151 2 DEBUG nova.compute.resource_tracker [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Final resource view: name=np0005486733.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Oct 14 06:18:38 localhost nova_compute[297686]: 2025-10-14 10:18:38.214 2 DEBUG oslo_concurrency.processutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Oct 14 06:18:38 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Oct 14 06:18:38 localhost ceph-mon[317114]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.108:0/2226452693' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Oct 14 06:18:38 localhost nova_compute[297686]: 2025-10-14 10:18:38.644 2 DEBUG oslo_concurrency.processutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.430s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Oct 14 06:18:38 localhost nova_compute[297686]: 2025-10-14 10:18:38.650 2 DEBUG nova.compute.provider_tree [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Inventory has not changed in ProviderTree for provider: 18c24273-aca2-4f08-be57-3188d558235e update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Oct 14 06:18:38 localhost nova_compute[297686]: 2025-10-14 10:18:38.662 2 DEBUG nova.scheduler.client.report [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Inventory has not changed for provider 18c24273-aca2-4f08-be57-3188d558235e based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Oct 14 06:18:38 localhost nova_compute[297686]: 2025-10-14 10:18:38.691 2 DEBUG nova.compute.resource_tracker [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Compute_service record updated for np0005486733.localdomain:np0005486733.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Oct 14 06:18:38 localhost nova_compute[297686]: 2025-10-14 10:18:38.691 2 DEBUG oslo_concurrency.lockutils [None 
req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.633s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Oct 14 06:18:38 localhost openstack_network_exporter[250374]: ERROR 10:18:38 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Oct 14 06:18:38 localhost openstack_network_exporter[250374]: ERROR 10:18:38 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Oct 14 06:18:38 localhost openstack_network_exporter[250374]: ERROR 10:18:38 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Oct 14 06:18:38 localhost openstack_network_exporter[250374]: ERROR 10:18:38 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Oct 14 06:18:38 localhost openstack_network_exporter[250374]: Oct 14 06:18:38 localhost openstack_network_exporter[250374]: ERROR 10:18:38 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Oct 14 06:18:38 localhost openstack_network_exporter[250374]: Oct 14 06:18:39 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e182 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104 Oct 14 06:18:39 localhost nova_compute[297686]: 2025-10-14 10:18:39.409 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:18:39 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e183 e183: 6 total, 6 up, 6 in Oct 14 06:18:40 localhost nova_compute[297686]: 2025-10-14 10:18:40.362 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:18:42 localhost neutron_dhcp_agent[271983]: 2025-10-14 
10:18:42.353 271987 INFO neutron.agent.linux.ip_lib [None req-ba61c091-1015-4101-8e99-976461bf88a6 - - - - - -] Device tapd033b972-71 cannot be used as it has no MAC address#033[00m Oct 14 06:18:42 localhost nova_compute[297686]: 2025-10-14 10:18:42.367 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:18:42 localhost kernel: device tapd033b972-71 entered promiscuous mode Oct 14 06:18:42 localhost ovn_controller[157396]: 2025-10-14T10:18:42Z|00265|binding|INFO|Claiming lport d033b972-713b-4051-bed1-752e92d3ddd9 for this chassis. Oct 14 06:18:42 localhost ovn_controller[157396]: 2025-10-14T10:18:42Z|00266|binding|INFO|d033b972-713b-4051-bed1-752e92d3ddd9: Claiming unknown Oct 14 06:18:42 localhost NetworkManager[5977]: [1760437122.3742] manager: (tapd033b972-71): new Generic device (/org/freedesktop/NetworkManager/Devices/45) Oct 14 06:18:42 localhost nova_compute[297686]: 2025-10-14 10:18:42.374 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:18:42 localhost systemd-udevd[335755]: Network interface NamePolicy= disabled on kernel command line. 
Oct 14 06:18:42 localhost ovn_metadata_agent[163050]: 2025-10-14 10:18:42.383 163055 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005486733.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpe1893814-19d7-5b97-af05-19e2aafa5382-0c2db83b-8b79-46f0-a8b3-1588ce43657f', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0c2db83b-8b79-46f0-a8b3-1588ce43657f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3c4e628039e94868b41efbbdc1307f19', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=380a463c-fc4f-4a2d-afa1-2390efa644c3, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=d033b972-713b-4051-bed1-752e92d3ddd9) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Oct 14 06:18:42 localhost ovn_metadata_agent[163050]: 2025-10-14 10:18:42.385 163055 INFO neutron.agent.ovn.metadata.agent [-] Port d033b972-713b-4051-bed1-752e92d3ddd9 in datapath 0c2db83b-8b79-46f0-a8b3-1588ce43657f bound to our chassis#033[00m Oct 14 06:18:42 localhost ovn_metadata_agent[163050]: 2025-10-14 10:18:42.386 163055 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 0c2db83b-8b79-46f0-a8b3-1588ce43657f or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params 
/usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Oct 14 06:18:42 localhost ovn_metadata_agent[163050]: 2025-10-14 10:18:42.387 163159 DEBUG oslo.privsep.daemon [-] privsep: reply[8e8ee7f4-a1de-4534-8ecb-3d40e1de546e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Oct 14 06:18:42 localhost journal[237477]: ethtool ioctl error on tapd033b972-71: No such device Oct 14 06:18:42 localhost ovn_controller[157396]: 2025-10-14T10:18:42Z|00267|binding|INFO|Setting lport d033b972-713b-4051-bed1-752e92d3ddd9 ovn-installed in OVS Oct 14 06:18:42 localhost ovn_controller[157396]: 2025-10-14T10:18:42Z|00268|binding|INFO|Setting lport d033b972-713b-4051-bed1-752e92d3ddd9 up in Southbound Oct 14 06:18:42 localhost journal[237477]: ethtool ioctl error on tapd033b972-71: No such device Oct 14 06:18:42 localhost nova_compute[297686]: 2025-10-14 10:18:42.410 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:18:42 localhost journal[237477]: ethtool ioctl error on tapd033b972-71: No such device Oct 14 06:18:42 localhost journal[237477]: ethtool ioctl error on tapd033b972-71: No such device Oct 14 06:18:42 localhost journal[237477]: ethtool ioctl error on tapd033b972-71: No such device Oct 14 06:18:42 localhost journal[237477]: ethtool ioctl error on tapd033b972-71: No such device Oct 14 06:18:42 localhost journal[237477]: ethtool ioctl error on tapd033b972-71: No such device Oct 14 06:18:42 localhost journal[237477]: ethtool ioctl error on tapd033b972-71: No such device Oct 14 06:18:42 localhost nova_compute[297686]: 2025-10-14 10:18:42.441 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:18:42 localhost nova_compute[297686]: 2025-10-14 10:18:42.464 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:18:43 localhost podman[335826]: Oct 14 06:18:43 localhost podman[335826]: 2025-10-14 10:18:43.342541234 +0000 UTC m=+0.093396814 container create 47524237d6dcc819d27fb09cd5cb2d7bc56542fb26ff2357d89d69cb40b10c83 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0c2db83b-8b79-46f0-a8b3-1588ce43657f, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true) Oct 14 06:18:43 localhost systemd[1]: Started libpod-conmon-47524237d6dcc819d27fb09cd5cb2d7bc56542fb26ff2357d89d69cb40b10c83.scope. Oct 14 06:18:43 localhost systemd[1]: Started libcrun container. Oct 14 06:18:43 localhost podman[335826]: 2025-10-14 10:18:43.298283303 +0000 UTC m=+0.049138913 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Oct 14 06:18:43 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fdab197bd75ff423d42ec2fd5406104cb38eace19cc6c156aa47c2ccd9d506a3/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Oct 14 06:18:43 localhost podman[335826]: 2025-10-14 10:18:43.408490226 +0000 UTC m=+0.159345806 container init 47524237d6dcc819d27fb09cd5cb2d7bc56542fb26ff2357d89d69cb40b10c83 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0c2db83b-8b79-46f0-a8b3-1588ce43657f, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, 
org.label-schema.name=CentOS Stream 9 Base Image) Oct 14 06:18:43 localhost podman[335826]: 2025-10-14 10:18:43.417065662 +0000 UTC m=+0.167921262 container start 47524237d6dcc819d27fb09cd5cb2d7bc56542fb26ff2357d89d69cb40b10c83 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0c2db83b-8b79-46f0-a8b3-1588ce43657f, tcib_managed=true, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3) Oct 14 06:18:43 localhost dnsmasq[335844]: started, version 2.85 cachesize 150 Oct 14 06:18:43 localhost dnsmasq[335844]: DNS service limited to local subnets Oct 14 06:18:43 localhost dnsmasq[335844]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Oct 14 06:18:43 localhost dnsmasq[335844]: warning: no upstream servers configured Oct 14 06:18:43 localhost dnsmasq-dhcp[335844]: DHCP, static leases only on 10.100.0.0, lease time 1d Oct 14 06:18:43 localhost dnsmasq[335844]: read /var/lib/neutron/dhcp/0c2db83b-8b79-46f0-a8b3-1588ce43657f/addn_hosts - 0 addresses Oct 14 06:18:43 localhost dnsmasq-dhcp[335844]: read /var/lib/neutron/dhcp/0c2db83b-8b79-46f0-a8b3-1588ce43657f/host Oct 14 06:18:43 localhost dnsmasq-dhcp[335844]: read /var/lib/neutron/dhcp/0c2db83b-8b79-46f0-a8b3-1588ce43657f/opts Oct 14 06:18:43 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:18:43.604 271987 INFO neutron.agent.dhcp.agent [None req-69c722fb-9cdb-4e2b-a28a-6a8fbe11e1c3 - - - - - -] DHCP configuration for ports {'e44310a8-820d-455c-a031-1a4cfb24d9e9'} is completed#033[00m Oct 14 06:18:44 localhost ovn_controller[157396]: 2025-10-14T10:18:44Z|00269|binding|INFO|Releasing lport 
25c6586a-239c-451b-aac2-e0a3ee5c3145 from this chassis (sb_readonly=0) Oct 14 06:18:44 localhost nova_compute[297686]: 2025-10-14 10:18:44.107 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:18:44 localhost dnsmasq[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/addn_hosts - 1 addresses Oct 14 06:18:44 localhost dnsmasq-dhcp[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/host Oct 14 06:18:44 localhost podman[335860]: 2025-10-14 10:18:44.138825297 +0000 UTC m=+0.048950337 container kill 27e23fba1f10c3118d508631d350793857acd97085c399f85eed727cd1eab7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0145816-4627-44f2-af00-ccc9ef0436ed, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5) Oct 14 06:18:44 localhost dnsmasq-dhcp[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/opts Oct 14 06:18:44 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e183 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104 Oct 14 06:18:44 localhost nova_compute[297686]: 2025-10-14 10:18:44.451 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:18:44 localhost ovn_controller[157396]: 2025-10-14T10:18:44Z|00270|binding|INFO|Removing iface tapd033b972-71 ovn-installed in OVS Oct 14 06:18:44 localhost ovn_controller[157396]: 2025-10-14T10:18:44Z|00271|binding|INFO|Removing lport d033b972-713b-4051-bed1-752e92d3ddd9 ovn-installed in OVS 
Oct 14 06:18:44 localhost ovn_metadata_agent[163050]: 2025-10-14 10:18:44.590 163055 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port e4f0a92e-37eb-403a-9792-315be43a737d with type ""#033[00m Oct 14 06:18:44 localhost nova_compute[297686]: 2025-10-14 10:18:44.591 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:18:44 localhost ovn_metadata_agent[163050]: 2025-10-14 10:18:44.591 163055 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005486733.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpe1893814-19d7-5b97-af05-19e2aafa5382-0c2db83b-8b79-46f0-a8b3-1588ce43657f', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0c2db83b-8b79-46f0-a8b3-1588ce43657f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3c4e628039e94868b41efbbdc1307f19', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005486733.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=380a463c-fc4f-4a2d-afa1-2390efa644c3, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=d033b972-713b-4051-bed1-752e92d3ddd9) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Oct 14 06:18:44 localhost ovn_metadata_agent[163050]: 2025-10-14 10:18:44.594 163055 INFO 
neutron.agent.ovn.metadata.agent [-] Port d033b972-713b-4051-bed1-752e92d3ddd9 in datapath 0c2db83b-8b79-46f0-a8b3-1588ce43657f unbound from our chassis#033[00m Oct 14 06:18:44 localhost ovn_metadata_agent[163050]: 2025-10-14 10:18:44.597 163055 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 0c2db83b-8b79-46f0-a8b3-1588ce43657f, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Oct 14 06:18:44 localhost ovn_metadata_agent[163050]: 2025-10-14 10:18:44.598 163159 DEBUG oslo.privsep.daemon [-] privsep: reply[1b134c7c-2326-41d5-9505-c199228a193d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Oct 14 06:18:44 localhost nova_compute[297686]: 2025-10-14 10:18:44.598 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:18:44 localhost kernel: device tapd033b972-71 left promiscuous mode Oct 14 06:18:44 localhost nova_compute[297686]: 2025-10-14 10:18:44.620 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:18:44 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e184 e184: 6 total, 6 up, 6 in Oct 14 06:18:45 localhost dnsmasq[335844]: read /var/lib/neutron/dhcp/0c2db83b-8b79-46f0-a8b3-1588ce43657f/addn_hosts - 0 addresses Oct 14 06:18:45 localhost dnsmasq-dhcp[335844]: read /var/lib/neutron/dhcp/0c2db83b-8b79-46f0-a8b3-1588ce43657f/host Oct 14 06:18:45 localhost dnsmasq-dhcp[335844]: read /var/lib/neutron/dhcp/0c2db83b-8b79-46f0-a8b3-1588ce43657f/opts Oct 14 06:18:45 localhost podman[335900]: 2025-10-14 10:18:45.159624044 +0000 UTC m=+0.067742920 container kill 47524237d6dcc819d27fb09cd5cb2d7bc56542fb26ff2357d89d69cb40b10c83 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, 
name=neutron-dnsmasq-qdhcp-0c2db83b-8b79-46f0-a8b3-1588ce43657f, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0) Oct 14 06:18:45 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:18:45.192 271987 ERROR neutron.agent.dhcp.agent [None req-4fecfac4-7fdf-4c4f-97f9-42776e26ad20 - - - - - -] Unable to reload_allocations dhcp for 0c2db83b-8b79-46f0-a8b3-1588ce43657f.: neutron.privileged.agent.linux.ip_lib.NetworkInterfaceNotFound: Network interface tapd033b972-71 not found in namespace qdhcp-0c2db83b-8b79-46f0-a8b3-1588ce43657f. Oct 14 06:18:45 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:18:45.192 271987 ERROR neutron.agent.dhcp.agent Traceback (most recent call last): Oct 14 06:18:45 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:18:45.192 271987 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/dhcp/agent.py", line 264, in _call_driver Oct 14 06:18:45 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:18:45.192 271987 ERROR neutron.agent.dhcp.agent rv = getattr(driver, action)(**action_kwargs) Oct 14 06:18:45 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:18:45.192 271987 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 673, in reload_allocations Oct 14 06:18:45 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:18:45.192 271987 ERROR neutron.agent.dhcp.agent self.device_manager.update(self.network, self.interface_name) Oct 14 06:18:45 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:18:45.192 271987 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1899, in update Oct 14 06:18:45 localhost neutron_dhcp_agent[271983]: 
2025-10-14 10:18:45.192 271987 ERROR neutron.agent.dhcp.agent self._set_default_route(network, device_name) Oct 14 06:18:45 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:18:45.192 271987 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1610, in _set_default_route Oct 14 06:18:45 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:18:45.192 271987 ERROR neutron.agent.dhcp.agent self._set_default_route_ip_version(network, device_name, Oct 14 06:18:45 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:18:45.192 271987 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1539, in _set_default_route_ip_version Oct 14 06:18:45 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:18:45.192 271987 ERROR neutron.agent.dhcp.agent gateway = device.route.get_gateway(ip_version=ip_version) Oct 14 06:18:45 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:18:45.192 271987 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 671, in get_gateway Oct 14 06:18:45 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:18:45.192 271987 ERROR neutron.agent.dhcp.agent routes = self.list_routes(ip_version, scope=scope, table=table) Oct 14 06:18:45 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:18:45.192 271987 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 656, in list_routes Oct 14 06:18:45 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:18:45.192 271987 ERROR neutron.agent.dhcp.agent return list_ip_routes(self._parent.namespace, ip_version, scope=scope, Oct 14 06:18:45 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:18:45.192 271987 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 1611, in list_ip_routes Oct 14 06:18:45 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:18:45.192 271987 ERROR 
neutron.agent.dhcp.agent routes = privileged.list_ip_routes(namespace, ip_version, device=device, Oct 14 06:18:45 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:18:45.192 271987 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 333, in wrapped_f Oct 14 06:18:45 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:18:45.192 271987 ERROR neutron.agent.dhcp.agent return self(f, *args, **kw) Oct 14 06:18:45 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:18:45.192 271987 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 423, in __call__ Oct 14 06:18:45 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:18:45.192 271987 ERROR neutron.agent.dhcp.agent do = self.iter(retry_state=retry_state) Oct 14 06:18:45 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:18:45.192 271987 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 360, in iter Oct 14 06:18:45 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:18:45.192 271987 ERROR neutron.agent.dhcp.agent return fut.result() Oct 14 06:18:45 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:18:45.192 271987 ERROR neutron.agent.dhcp.agent File "/usr/lib64/python3.9/concurrent/futures/_base.py", line 439, in result Oct 14 06:18:45 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:18:45.192 271987 ERROR neutron.agent.dhcp.agent return self.__get_result() Oct 14 06:18:45 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:18:45.192 271987 ERROR neutron.agent.dhcp.agent File "/usr/lib64/python3.9/concurrent/futures/_base.py", line 391, in __get_result Oct 14 06:18:45 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:18:45.192 271987 ERROR neutron.agent.dhcp.agent raise self._exception Oct 14 06:18:45 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:18:45.192 271987 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 426, in 
__call__ Oct 14 06:18:45 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:18:45.192 271987 ERROR neutron.agent.dhcp.agent result = fn(*args, **kwargs) Oct 14 06:18:45 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:18:45.192 271987 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/oslo_privsep/priv_context.py", line 271, in _wrap Oct 14 06:18:45 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:18:45.192 271987 ERROR neutron.agent.dhcp.agent return self.channel.remote_call(name, args, kwargs, Oct 14 06:18:45 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:18:45.192 271987 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/oslo_privsep/daemon.py", line 215, in remote_call Oct 14 06:18:45 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:18:45.192 271987 ERROR neutron.agent.dhcp.agent raise exc_type(*result[2]) Oct 14 06:18:45 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:18:45.192 271987 ERROR neutron.agent.dhcp.agent neutron.privileged.agent.linux.ip_lib.NetworkInterfaceNotFound: Network interface tapd033b972-71 not found in namespace qdhcp-0c2db83b-8b79-46f0-a8b3-1588ce43657f. 
Oct 14 06:18:45 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:18:45.192 271987 ERROR neutron.agent.dhcp.agent #033[00m Oct 14 06:18:45 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:18:45.195 271987 INFO neutron.agent.dhcp.agent [None req-14bc63ee-10da-41c7-bcc2-438264b13d3e - - - - - -] Synchronizing state#033[00m Oct 14 06:18:45 localhost nova_compute[297686]: 2025-10-14 10:18:45.364 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:18:45 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:18:45.423 271987 INFO neutron.agent.dhcp.agent [None req-1b3116d6-0ac8-4851-aef8-6ccac0b652b9 - - - - - -] All active networks have been fetched through RPC.#033[00m Oct 14 06:18:45 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:18:45.424 271987 INFO neutron.agent.dhcp.agent [-] Starting network 0c2db83b-8b79-46f0-a8b3-1588ce43657f dhcp configuration#033[00m Oct 14 06:18:45 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:18:45.425 271987 INFO neutron.agent.dhcp.agent [-] Finished network 0c2db83b-8b79-46f0-a8b3-1588ce43657f dhcp configuration#033[00m Oct 14 06:18:45 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:18:45.426 271987 INFO neutron.agent.dhcp.agent [None req-1b3116d6-0ac8-4851-aef8-6ccac0b652b9 - - - - - -] Synchronizing state complete#033[00m Oct 14 06:18:45 localhost ovn_controller[157396]: 2025-10-14T10:18:45Z|00272|binding|INFO|Releasing lport 25c6586a-239c-451b-aac2-e0a3ee5c3145 from this chassis (sb_readonly=0) Oct 14 06:18:45 localhost nova_compute[297686]: 2025-10-14 10:18:45.575 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:18:45 localhost dnsmasq[335844]: exiting on receipt of SIGTERM Oct 14 06:18:45 localhost systemd[1]: tmp-crun.mJl3tI.mount: Deactivated successfully. 
Oct 14 06:18:45 localhost podman[335930]: 2025-10-14 10:18:45.718472623 +0000 UTC m=+0.063812927 container kill 47524237d6dcc819d27fb09cd5cb2d7bc56542fb26ff2357d89d69cb40b10c83 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0c2db83b-8b79-46f0-a8b3-1588ce43657f, tcib_managed=true, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team) Oct 14 06:18:45 localhost systemd[1]: libpod-47524237d6dcc819d27fb09cd5cb2d7bc56542fb26ff2357d89d69cb40b10c83.scope: Deactivated successfully. Oct 14 06:18:45 localhost podman[335945]: 2025-10-14 10:18:45.785432887 +0000 UTC m=+0.049719431 container died 47524237d6dcc819d27fb09cd5cb2d7bc56542fb26ff2357d89d69cb40b10c83 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0c2db83b-8b79-46f0-a8b3-1588ce43657f, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true) Oct 14 06:18:45 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-47524237d6dcc819d27fb09cd5cb2d7bc56542fb26ff2357d89d69cb40b10c83-userdata-shm.mount: Deactivated successfully. Oct 14 06:18:45 localhost systemd[1]: var-lib-containers-storage-overlay-fdab197bd75ff423d42ec2fd5406104cb38eace19cc6c156aa47c2ccd9d506a3-merged.mount: Deactivated successfully. 
Oct 14 06:18:45 localhost podman[335945]: 2025-10-14 10:18:45.83457786 +0000 UTC m=+0.098864364 container remove 47524237d6dcc819d27fb09cd5cb2d7bc56542fb26ff2357d89d69cb40b10c83 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0c2db83b-8b79-46f0-a8b3-1588ce43657f, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team) Oct 14 06:18:45 localhost systemd[1]: libpod-conmon-47524237d6dcc819d27fb09cd5cb2d7bc56542fb26ff2357d89d69cb40b10c83.scope: Deactivated successfully. Oct 14 06:18:46 localhost systemd[1]: run-netns-qdhcp\x2d0c2db83b\x2d8b79\x2d46f0\x2da8b3\x2d1588ce43657f.mount: Deactivated successfully. Oct 14 06:18:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f. Oct 14 06:18:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917. Oct 14 06:18:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d. Oct 14 06:18:47 localhost systemd[1]: tmp-crun.RQmieM.mount: Deactivated successfully. 
Oct 14 06:18:47 localhost podman[335971]: 2025-10-14 10:18:47.751879682 +0000 UTC m=+0.088709159 container health_status 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251009, container_name=ceilometer_agent_compute, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}) Oct 14 06:18:47 localhost podman[335971]: 2025-10-14 10:18:47.791230361 +0000 UTC m=+0.128059838 container exec_died 
89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute) Oct 14 06:18:47 localhost systemd[1]: 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d.service: Deactivated successfully. 
Oct 14 06:18:47 localhost podman[335970]: 2025-10-14 10:18:47.797497695 +0000 UTC m=+0.134487897 container health_status 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.openshift.tags=minimal rhel9, release=1755695350, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, architecture=x86_64, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, version=9.6, vcs-type=git, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': 
['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.) Oct 14 06:18:47 localhost podman[335969]: 2025-10-14 10:18:47.870582038 +0000 UTC m=+0.207963732 container health_status 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, 
tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251009, tcib_managed=true) Oct 14 06:18:47 localhost podman[335970]: 2025-10-14 10:18:47.884810529 +0000 UTC m=+0.221800761 container exec_died 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, vcs-type=git, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.) Oct 14 06:18:47 localhost systemd[1]: 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917.service: Deactivated successfully. 
Oct 14 06:18:47 localhost podman[335969]: 2025-10-14 10:18:47.909967268 +0000 UTC m=+0.247348952 container exec_died 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team) Oct 14 06:18:48 localhost systemd[1]: 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f.service: Deactivated successfully. Oct 14 06:18:48 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Oct 14 06:18:48 localhost ceph-mon[317114]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.32:0/3390786556' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Oct 14 06:18:48 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Oct 14 06:18:48 localhost ceph-mon[317114]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3390786556' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Oct 14 06:18:49 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e184 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104 Oct 14 06:18:49 localhost nova_compute[297686]: 2025-10-14 10:18:49.489 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.824 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'name': 'test', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005486733.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '41187b090f3d4818a32baa37ce8a3991', 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'hostId': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228 Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.825 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters Oct 14 06:18:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.831 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.834 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bbb560db-9022-4ed9-8fff-1b24d8466c17', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T10:18:49.825275', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': '2d3c36ce-a8e7-11f0-9707-fa163e99780b', 'monotonic_time': 13146.017979027, 
'message_signature': 'ece046100afbe975e5764c287df0ca7fbf52c544f1eb4063a99fb5a8f92898f5'}]}, 'timestamp': '2025-10-14 10:18:49.832268', '_unique_id': '39d6759923b74359a0b8129258516fcd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.834 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.834 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.834 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.834 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.834 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.834 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.834 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.834 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.834 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.834 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.834 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.834 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.834 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.834 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.834 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.834 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.834 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.834 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.834 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.834 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:18:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.834 12 ERROR oslo_messaging.notify.messaging Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.834 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.834 12 ERROR oslo_messaging.notify.messaging Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.834 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.834 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.834 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.834 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.834 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.834 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.834 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.834 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", 
line 653, in _send Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.834 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.834 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.834 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.834 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.834 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.834 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.834 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.834 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.834 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.834 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.834 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.834 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.834 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.834 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.834 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.834 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.834 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.834 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.834 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.834 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:18:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.834 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.834 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.834 12 ERROR oslo_messaging.notify.messaging Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.836 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.836 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.838 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'e18f69d4-a7fe-445f-8c09-d5261dff7e27', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T10:18:49.836211', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': '2d3cf028-a8e7-11f0-9707-fa163e99780b', 'monotonic_time': 13146.017979027, 'message_signature': 'c37f89d1de4ad4fb05c1d71a3a5664efffdaecce3251b2c012f5465df568c5b2'}]}, 'timestamp': '2025-10-14 10:18:49.836967', '_unique_id': 'e3c481dbcc3e42e885a90196ee12220a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.838 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 
2025-10-14 10:18:49.838 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.838 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.838 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.838 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.838 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.838 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.838 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.838 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.838 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.838 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.838 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.838 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.838 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.838 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.838 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.838 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.838 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.838 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.838 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.838 12 ERROR oslo_messaging.notify.messaging Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.838 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.838 12 ERROR oslo_messaging.notify.messaging Oct 14 06:18:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.838 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.838 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.838 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.838 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.838 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.838 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.838 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.838 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.838 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.838 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.838 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.838 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.838 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.838 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.838 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.838 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.838 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.838 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.838 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.838 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:18:49 
localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.838 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.838 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.838 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.838 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.838 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.838 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.838 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.838 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.838 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.838 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.838 12 ERROR oslo_messaging.notify.messaging Oct 14 06:18:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.840 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.853 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.854 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.856 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1c18dbe8-6e56-431c-a8b4-651bc7575d92', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T10:18:49.840639', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '2d3f9b16-a8e7-11f0-9707-fa163e99780b', 'monotonic_time': 13146.033450896, 'message_signature': '7590567d6ae9eac370db6f455ba4bbcbcec2044861d06a91928a35d482e98d16'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T10:18:49.840639', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '2d3fb70e-a8e7-11f0-9707-fa163e99780b', 'monotonic_time': 13146.033450896, 'message_signature': '72aa82ffdcd2c9555395cba6a4caa1bfc0100304eeaeab2a3aed6cb47967620c'}]}, 'timestamp': '2025-10-14 10:18:49.855080', '_unique_id': '8e245cf766684c838ad76c6a3c623aef'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.856 12 ERROR 
oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.856 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.856 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.856 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.856 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.856 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.856 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.856 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.856 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.856 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.856 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.856 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.856 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.856 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.856 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.856 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.856 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.856 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.856 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.856 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.856 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.856 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.856 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.856 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.856 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.856 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.856 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.856 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.856 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.856 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.856 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.856 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.856 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.856 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.856 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.856 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.856 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.856 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.856 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.856 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.856 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.856 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.856 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.856 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.856 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.856 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.856 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.856 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.856 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.856 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.856 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.856 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.856 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.856 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.857 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.858 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.860 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'da691eb9-7492-440e-94b3-9e49f3cce01d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T10:18:49.858138', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64',
'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': '2d404944-a8e7-11f0-9707-fa163e99780b', 'monotonic_time': 13146.017979027, 'message_signature': '4f231b560e1281e95376f71833f5966e098de96ab591e25f5b744cbc78eedd94'}]}, 'timestamp': '2025-10-14 10:18:49.858880', '_unique_id': '19ff7d50c1284d56b06c99a85a8a3edf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.860 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.860 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.860 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.860 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.860 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.860 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.860 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.860 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.860 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.860 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.860 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.860 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.860 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.860 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.860 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.860 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.860 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.860 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.860 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.860 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.860 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.860 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.860 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.860 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.860 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.860 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.860 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.860 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.860 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.860 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.860 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.861 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.862 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.outgoing.packets volume: 118 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.864 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications.
Payload={'message_id': '0775e34d-e9f1-4fa2-b554-ca7cdc3dc83e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 118, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T10:18:49.862120', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': '2d40e534-a8e7-11f0-9707-fa163e99780b', 'monotonic_time': 13146.017979027, 'message_signature': '3dacb492ec1f59142e5ad44a61f312ea665f5cb5583d6a75ba3b085e7349c820'}]}, 'timestamp': '2025-10-14 10:18:49.862896', '_unique_id': 'a15ebd62717b478db682277c44c11a23'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.864 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.864 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.864 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.864 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.864 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.864 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.864 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.864 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.864 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.864 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.864 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.864 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.864 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.864 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.864 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.864 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.864 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.864 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.864 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.864 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.864 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.864 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.864 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.864 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.864 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.864 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.864 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.864 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.864 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.864 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.864 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.864 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.864 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.864 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.864 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.864 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.864 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.864 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.864 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.864 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.864 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.864 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.864 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.864 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.864 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.864 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.864 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.864 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.864 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.864 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.864 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.864 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.864 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.864 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:18:49 localhost
ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.865 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.865 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.incoming.packets volume: 79 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.868 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '865df35b-5e85-400c-93cb-077cddef8e86', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 79, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T10:18:49.865895', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': '2d417e90-a8e7-11f0-9707-fa163e99780b', 'monotonic_time': 13146.017979027, 'message_signature': '1bb8acba19230a88e6e832d4dbe4abef05665985156358709b53921c10f48234'}]}, 'timestamp': '2025-10-14 10:18:49.866823', '_unique_id': 'c47a38b8ea9344738f53d6860fe105a4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.868 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.868 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.868 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.868 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.868 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.868 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.868 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.868 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.868 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.868 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.868 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.868 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.868 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.868 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.868 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.868 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.868 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.868 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.868 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]:
2025-10-14 10:18:49.868 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.868 12 ERROR oslo_messaging.notify.messaging Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.868 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.868 12 ERROR oslo_messaging.notify.messaging Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.868 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.868 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.868 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.868 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.868 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.868 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.868 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:18:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.868 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.868 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.868 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.868 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.868 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.868 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.868 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.868 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.868 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.868 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 
06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.868 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.868 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.868 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.868 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.868 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.868 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.868 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.868 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.868 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.868 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.868 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.868 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.868 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.868 12 ERROR oslo_messaging.notify.messaging Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.869 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.870 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.870 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.891 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.write.requests volume: 53 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.892 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.894 12 
ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '589de498-7c98-42d5-8db2-5f4a27362e24', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 53, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T10:18:49.870638', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '2d456e42-a8e7-11f0-9707-fa163e99780b', 'monotonic_time': 13146.063429614, 'message_signature': 'db9a3ea5ba0d6397bfd1768a59f0f689983c8a4df10c599615108ed584754feb'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T10:18:49.870638', 'resource_metadata': 
{'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '2d4589c2-a8e7-11f0-9707-fa163e99780b', 'monotonic_time': 13146.063429614, 'message_signature': 'cba5cdb199b8b1be5395a0f8ad8d9a05b9e05d6754a9b7bb1ae44ce03f5c0580'}]}, 'timestamp': '2025-10-14 10:18:49.893269', '_unique_id': '0051e6deb5314f62a99aa635c9207fb5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.894 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.894 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.894 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:18:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.894 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.894 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.894 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.894 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.894 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.894 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.894 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.894 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.894 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.894 12 ERROR oslo_messaging.notify.messaging Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.894 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.894 12 ERROR oslo_messaging.notify.messaging Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.894 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.894 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 
134, in _send_notification Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.894 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.894 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.894 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.894 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.894 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.894 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.894 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.894 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.894 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.894 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.894 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", 
line 433, in _ensure_connection Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.894 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.894 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.894 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.894 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.894 12 ERROR oslo_messaging.notify.messaging Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.896 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.896 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.read.bytes volume: 35560448 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.896 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.read.bytes volume: 2154496 _stats_to_sample 
/usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.898 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0f11dabd-9979-4be6-bc06-61d85e107587', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35560448, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T10:18:49.896360', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '2d461a40-a8e7-11f0-9707-fa163e99780b', 'monotonic_time': 13146.063429614, 'message_signature': 'd4ce9de9ccfed0b9622b6a5dc4945d30c7a0f487082e21eaf717d6a176655937'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': 
'41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T10:18:49.896360', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '2d462dd2-a8e7-11f0-9707-fa163e99780b', 'monotonic_time': 13146.063429614, 'message_signature': 'edfc5ede688254a0dbd291fd1a6c5562123ab8d98b9449542e9df2581b09b0dc'}]}, 'timestamp': '2025-10-14 10:18:49.897322', '_unique_id': '921eab9bf97a435792aba5678727c75e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.898 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.898 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:18:49 
localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.898 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.898 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.898 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.898 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.898 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.898 12 ERROR oslo_messaging.notify.messaging 
self.transport.connect() Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.898 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.898 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.898 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.898 12 ERROR oslo_messaging.notify.messaging Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.898 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.898 12 ERROR oslo_messaging.notify.messaging Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.898 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.898 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:18:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.898 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.898 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.898 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.898 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.898 12 ERROR oslo_messaging.notify.messaging self.connection = 
connection_pool.get(retry=retry) Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.898 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.898 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.898 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.898 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.898 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:18:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.898 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.898 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.898 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.898 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.898 12 ERROR oslo_messaging.notify.messaging Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.899 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.900 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.incoming.bytes volume: 8190 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.901 12 ERROR 
oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f8a4090c-6e59-4431-b148-f6bcc0628b7a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 8190, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T10:18:49.899944', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': '2d46a960-a8e7-11f0-9707-fa163e99780b', 'monotonic_time': 13146.017979027, 'message_signature': 'e456e4ab7bd67fb45b793a104f763006270627c0aba1cc0d15b7a3e792e6d972'}]}, 'timestamp': '2025-10-14 10:18:49.900520', '_unique_id': '9d8fd790da8e45c99a79184ef7aded95'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.901 12 ERROR oslo_messaging.notify.messaging Traceback 
(most recent call last): Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.901 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.901 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.901 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.901 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.901 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.901 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.901 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.901 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.901 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.901 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.901 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.901 12 ERROR oslo_messaging.notify.messaging Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.901 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.901 12 ERROR oslo_messaging.notify.messaging Oct 
14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.901 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.901 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.901 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.901 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.901 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 
605, in _get_connection Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.901 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.901 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.901 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.901 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.901 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in 
ensure_connection Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.901 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.901 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.901 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.901 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.901 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.901 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.901 12 ERROR oslo_messaging.notify.messaging Oct 14 
06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.902 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.902 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.write.bytes volume: 454656 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.903 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.905 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9ca6845e-44cf-46a4-aed0-9c262d29e4d8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 454656, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T10:18:49.902852', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': 
{'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '2d4718b4-a8e7-11f0-9707-fa163e99780b', 'monotonic_time': 13146.063429614, 'message_signature': '8a132a11953894946cd62380aa5c7c193216c39d22b71765e5ce279113e7238b'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T10:18:49.902852', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '2d472ffc-a8e7-11f0-9707-fa163e99780b', 'monotonic_time': 13146.063429614, 'message_signature': 'cae12d04c1ba9d588a3b99d9d5cb4b9780a17f0571c6d70aabd730a015b42c73'}]}, 'timestamp': '2025-10-14 10:18:49.904027', '_unique_id': '9e25368deca74985b01d497b813c7754'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.905 12 
ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.905 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.905 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.905 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.905 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.905 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.905 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.905 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.905 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.905 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.905 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:18:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.905 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.905 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.905 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.905 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.905 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.905 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.905 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.905 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.905 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.905 12 ERROR oslo_messaging.notify.messaging Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.905 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 
2025-10-14 10:18:49.905 12 ERROR oslo_messaging.notify.messaging Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.905 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.905 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.905 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.905 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.905 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.905 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.905 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.905 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.905 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.905 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.905 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.905 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.905 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.905 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.905 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.905 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.905 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.905 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.905 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.905 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.905 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.905 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.905 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.905 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.905 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.905 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.905 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.905 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.905 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.905 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:18:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.905 12 ERROR oslo_messaging.notify.messaging Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.906 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.906 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.read.latency volume: 1739521236 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.907 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.read.latency volume: 110324003 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.908 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'bd93d16d-6774-450c-b878-8230eb8a9c37', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1739521236, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T10:18:49.906530', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '2d47a9dc-a8e7-11f0-9707-fa163e99780b', 'monotonic_time': 13146.063429614, 'message_signature': '89fb9915163c63a2859ae14235103abe26f00045a7636a689d4472442471377b'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 110324003, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T10:18:49.906530', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '2d47bada-a8e7-11f0-9707-fa163e99780b', 'monotonic_time': 13146.063429614, 'message_signature': '0a0203d4aee713e7ef07c3a0de173d385cc5172bedf57f642b21fc713d214d9d'}]}, 'timestamp': '2025-10-14 10:18:49.907568', '_unique_id': '2863e06eb13a40d1a7d0098e1bede955'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.908 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.908 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.908 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.908 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.908 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.908 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.908 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.908 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.908 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 
06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.908 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.908 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.908 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.908 12 ERROR oslo_messaging.notify.messaging Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.908 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.908 12 ERROR oslo_messaging.notify.messaging Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.908 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.908 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:18:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.908 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.908 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.908 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.908 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.908 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.908 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.908 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.908 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.908 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.908 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:18:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.908 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.908 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.908 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.908 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.908 12 ERROR oslo_messaging.notify.messaging Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.910 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.910 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.912 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '9b68cc9e-b166-4741-8f86-344ebed5a34d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T10:18:49.910373', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': '2d48416c-a8e7-11f0-9707-fa163e99780b', 'monotonic_time': 13146.017979027, 'message_signature': 'e928301ce884b6c93ae8b295f0dc95dddf3110b7388540c1135f26de46ba5ae5'}]}, 'timestamp': '2025-10-14 10:18:49.911053', '_unique_id': 'f8dd19037bdd4e77abe142c9050d9c8c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.912 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:18:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.912 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.912 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.912 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.912 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.912 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.912 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.912 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.912 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.912 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.912 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.912 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.912 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.912 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.912 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.912 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.912 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.912 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.912 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.912 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.912 12 ERROR oslo_messaging.notify.messaging Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.912 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.912 12 ERROR oslo_messaging.notify.messaging Oct 14 06:18:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.912 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.912 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.912 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.912 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.912 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.912 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.912 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.912 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.912 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.912 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.912 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.912 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.912 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.912 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.912 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.912 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.912 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.912 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.912 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.912 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:18:49 
localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.912 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.912 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.912 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.912 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.912 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.912 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.912 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.912 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.912 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.912 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.912 12 ERROR oslo_messaging.notify.messaging Oct 14 06:18:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.913 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.914 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.914 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.read.requests volume: 1272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.914 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.916 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '2d6e8fed-f9b0-485c-9715-1ea51cbcd8c5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1272, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T10:18:49.914337', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '2d48dab4-a8e7-11f0-9707-fa163e99780b', 'monotonic_time': 13146.063429614, 'message_signature': '42ade4754d8058b5618e6f889f4714d68c8fba3da63b44069a39bc5932a6debb'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T10:18:49.914337', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '2d48f3a0-a8e7-11f0-9707-fa163e99780b', 'monotonic_time': 13146.063429614, 'message_signature': '4fb7fd3602670a28d36aeb585c0d28d97a61f8c85ed70518d364ba07a6321f73'}]}, 'timestamp': '2025-10-14 10:18:49.915572', '_unique_id': '25f3d0aba87e4aea8913b25e3a72e478'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.916 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.916 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.916 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.916 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.916 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.916 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.916 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.916 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.916 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 
06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.916 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.916 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.916 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.916 12 ERROR oslo_messaging.notify.messaging Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.916 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.916 12 ERROR oslo_messaging.notify.messaging Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.916 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.916 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:18:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.916 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.916 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.916 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.916 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.916 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.916 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.916 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.916 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.916 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.916 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:18:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.916 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.916 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.916 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.916 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.916 12 ERROR oslo_messaging.notify.messaging Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.918 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.918 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.920 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '0bf9bc15-e5b1-4620-bcfc-efd9c498bc76', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T10:18:49.918585', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': '2d49837e-a8e7-11f0-9707-fa163e99780b', 'monotonic_time': 13146.017979027, 'message_signature': '92c2eeee514fd081a1dfbb6e6391ad167cc23598f3b2a5b29605e16b5f995813'}]}, 'timestamp': '2025-10-14 10:18:49.919319', '_unique_id': '9a35071ac58c4f5aa3fe53c23265b02d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.920 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 
2025-10-14 10:18:49.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.920 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.920 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.920 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.920 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.920 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.920 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.920 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.920 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.920 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.920 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.920 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.920 12 ERROR oslo_messaging.notify.messaging Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.920 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.920 12 ERROR oslo_messaging.notify.messaging Oct 14 06:18:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.920 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.920 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.920 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.920 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.920 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.920 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.920 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.920 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.920 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.920 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:18:49 
localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.920 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.920 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.920 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.920 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.920 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.920 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.920 12 ERROR oslo_messaging.notify.messaging Oct 14 06:18:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.922 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.946 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/memory.usage volume: 51.81640625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.948 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'aa3dc981-cd9c-4d29-9bf1-b98f0efe0583', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.81640625, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'timestamp': '2025-10-14T10:18:49.922456', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '2d4dc9ca-a8e7-11f0-9707-fa163e99780b', 'monotonic_time': 13146.139207151, 
'message_signature': 'be3d9ffda5ec4577b59cf700fb22118a8a71019643dd10bf2e623c21bfb8f27a'}]}, 'timestamp': '2025-10-14 10:18:49.947219', '_unique_id': 'be7823c5c9bc4264a2ccbd8a5db9d5e9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.948 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.948 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.948 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.948 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.948 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.948 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.948 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.948 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.948 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.948 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.948 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.948 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.948 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.948 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.948 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.948 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.948 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.948 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.948 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.948 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:18:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.948 12 ERROR oslo_messaging.notify.messaging Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.948 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.948 12 ERROR oslo_messaging.notify.messaging Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.948 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.948 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.948 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.948 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.948 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.948 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.948 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.948 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", 
line 653, in _send Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.948 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.948 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.948 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.948 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.948 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.948 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.948 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.948 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.948 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.948 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.948 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.948 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.948 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.948 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.948 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.948 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.948 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.948 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.948 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.948 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:18:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.948 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.948 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.948 12 ERROR oslo_messaging.notify.messaging Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.949 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.949 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.949 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.950 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.950 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'b0494f79-d8e6-4839-965f-513c08536e10', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T10:18:49.949575', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '2d4e3842-a8e7-11f0-9707-fa163e99780b', 'monotonic_time': 13146.033450896, 'message_signature': 'f618ced45a4cbe3e625a1b3d6599351f2abb27c6d807b6c89ba23c49c3352b79'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T10:18:49.949575', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 
'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '2d4e4760-a8e7-11f0-9707-fa163e99780b', 'monotonic_time': 13146.033450896, 'message_signature': 'dd7da9b622d58230044b3e81887a11acfe58bfbf73c8a3b93b72707ad79f3e6c'}]}, 'timestamp': '2025-10-14 10:18:49.950333', '_unique_id': '9798a92713454e90a751e0484ec48a8e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.950 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.950 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.950 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.950 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.950 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.950 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.950 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.950 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.950 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.950 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.950 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.950 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.950 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.950 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.950 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.950 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 06:18:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.950 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.950 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.950 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.950 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.950 12 ERROR oslo_messaging.notify.messaging Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.950 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.950 12 ERROR oslo_messaging.notify.messaging Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.950 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.950 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.950 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.950 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 
10:18:49.950 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.950 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.950 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.950 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.950 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.950 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.950 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.950 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.950 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.950 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 06:18:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.950 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.950 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.950 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.950 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.950 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.950 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.950 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.950 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.950 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.950 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 
10:18:49.950 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.950 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.950 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.950 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.950 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.950 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.950 12 ERROR oslo_messaging.notify.messaging Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.951 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.951 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.outgoing.bytes volume: 10162 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.952 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '5e5547d2-862b-464b-a784-7dc1b09a8565', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 10162, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T10:18:49.951817', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': '2d4e8ce8-a8e7-11f0-9707-fa163e99780b', 'monotonic_time': 13146.017979027, 'message_signature': '48301806e25189980be98e7e40470cbe2d97c1a5d4d66618a3b34c7c04f2b7d1'}]}, 'timestamp': '2025-10-14 10:18:49.952124', '_unique_id': 'bf2f38937f5547af8a2affc38d4c99e7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.952 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:18:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.952 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.952 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.952 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.952 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.952 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.952 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.952 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.952 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.952 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.952 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.952 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.952 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.952 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.952 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.952 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.952 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.952 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.952 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.952 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.952 12 ERROR oslo_messaging.notify.messaging Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.952 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.952 12 ERROR oslo_messaging.notify.messaging Oct 14 06:18:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.952 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.952 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.952 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.952 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.952 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.952 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.952 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.952 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.952 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.952 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.952 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.952 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.952 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.952 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.952 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.952 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.952 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.952 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.952 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.952 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:18:49 
localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.952 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.952 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.952 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.952 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.952 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.952 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.952 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.952 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.952 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.952 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.952 12 ERROR oslo_messaging.notify.messaging Oct 14 06:18:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.953 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.953 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/cpu volume: 17480000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.954 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '15a0b259-36c2-4e8b-a75e-b1cb69376021', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 17480000000, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'timestamp': '2025-10-14T10:18:49.953475', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '2d4ed04a-a8e7-11f0-9707-fa163e99780b', 'monotonic_time': 13146.139207151, 
'message_signature': '6c37888b87af7234dee70ada8a25fe5e8d3c1ce465bfd90bb9e9066c1df7a12e'}]}, 'timestamp': '2025-10-14 10:18:49.953843', '_unique_id': '56fcb44f0c344323a6dc2bcafe5473e4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.954 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.954 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.954 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.954 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.954 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.954 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.954 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.954 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.954 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.954 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.954 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.954 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.954 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.954 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.954 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.954 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.954 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.954 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.954 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.954 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:18:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.954 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.954 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.954 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.954 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.954 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.954 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.954 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.954 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.954 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.954 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.954 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.954 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.954 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.954 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.954 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.954 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.954 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.954 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.954 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.954 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.955 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.955 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.956 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '71e92b57-4698-4b7e-940d-65264a8a25e4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T10:18:49.955237', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': '2d4f1244-a8e7-11f0-9707-fa163e99780b', 'monotonic_time': 13146.017979027, 'message_signature': '463e8671387fe908fdb5a7c1ecd36199200400f2ce5155e445e9108f3726712c'}]}, 'timestamp': '2025-10-14 10:18:49.955537', '_unique_id': '03230ffd22414f36acf4fce0f98565ac'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.956 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.956 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.956 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.956 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.956 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.956 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.956 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.956 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.956 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.956 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.956 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.956 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.956 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.956 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.956 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.956 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.956 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.956 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.956 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.956 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.956 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.956 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.956 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.956 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.956 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.956 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.956 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.956 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.956 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.956 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.956 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.956 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.956 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.957 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.958 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b258819e-bd4e-4cdd-9adf-84981eae6e8c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T10:18:49.956891', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '2d4f52a4-a8e7-11f0-9707-fa163e99780b', 'monotonic_time': 13146.033450896, 'message_signature': '5db2b13206936c3956a62eb54ed8ad1f9e381142f5d41f1789cb6f89d6d5a01b'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T10:18:49.956891', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '2d4f5d1c-a8e7-11f0-9707-fa163e99780b', 'monotonic_time': 13146.033450896, 'message_signature': 'd779a7ffaa09820a66c7029ff27c04cc3cbb3075932dfe762b82f0d2c4965380'}]}, 'timestamp': '2025-10-14 10:18:49.957435', '_unique_id': 'e02e837237194bd480d8d415b11d2841'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.958 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.958 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.958 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.958 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.958 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.958 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.958 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.958 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.958 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.958 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.958 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.958 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.958 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.958 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.958 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.958 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.958 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.958 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.958 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.958 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.958 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.958 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.958 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.958 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.958 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.958 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.958 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.958 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.958 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.958 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.958 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.958 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.959 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.write.latency volume: 1178650666 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.959 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.write.latency volume: 22746151 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.960 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bf284b55-4619-4372-97e0-100b59f87109', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1178650666, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T10:18:49.958997', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '2d4fa506-a8e7-11f0-9707-fa163e99780b', 'monotonic_time': 13146.063429614, 'message_signature': 'c40374fc81429953eca6c3f62d933cbd26369219a6cb10b2536c8339b81bf39f'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 22746151, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T10:18:49.958997', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '2d4faf7e-a8e7-11f0-9707-fa163e99780b', 'monotonic_time': 13146.063429614, 'message_signature': '5fc7877d80b1496665becf4203aa221592693e9eed087082d68402b0b7e9e187'}]}, 'timestamp': '2025-10-14 10:18:49.959544', '_unique_id': 'c4c3440f114e46cca71d72e2da590684'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.960 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.960 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.960 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.960 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.960 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.960 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.960 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.960 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.960 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.960 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.960 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.960 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.960 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.960 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.960 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.960 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14
06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.960 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.960 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.960 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.960 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.960 12 ERROR oslo_messaging.notify.messaging Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.960 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.960 12 ERROR oslo_messaging.notify.messaging Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.960 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.960 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.960 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.960 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:18:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.960 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.960 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.960 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.960 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.960 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.960 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.960 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.960 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.960 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.960 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.960 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.960 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.960 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.960 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.960 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.960 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.960 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.960 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.960 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.960 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:18:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.960 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.960 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.960 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.960 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.960 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.960 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:18:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:18:49.960 12 ERROR oslo_messaging.notify.messaging Oct 14 06:18:50 localhost nova_compute[297686]: 2025-10-14 10:18:50.364 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:18:51 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e185 e185: 6 total, 6 up, 6 in Oct 14 06:18:52 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e186 e186: 6 total, 6 up, 6 in Oct 14 06:18:54 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e186 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104 Oct 14 06:18:54 localhost nova_compute[297686]: 2025-10-14 10:18:54.525 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 
06:18:54 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Oct 14 06:18:54 localhost ceph-mon[317114]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/318614661' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Oct 14 06:18:54 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Oct 14 06:18:54 localhost ceph-mon[317114]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/318614661' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Oct 14 06:18:55 localhost nova_compute[297686]: 2025-10-14 10:18:55.366 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:18:55 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Oct 14 06:18:55 localhost ceph-mon[317114]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/998062430' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Oct 14 06:18:55 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Oct 14 06:18:55 localhost ceph-mon[317114]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.32:0/998062430' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Oct 14 06:18:57 localhost ovn_metadata_agent[163050]: 2025-10-14 10:18:57.786 163055 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Oct 14 06:18:57 localhost ovn_metadata_agent[163050]: 2025-10-14 10:18:57.787 163055 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Oct 14 06:18:57 localhost ovn_metadata_agent[163050]: 2025-10-14 10:18:57.787 163055 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Oct 14 06:18:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041. Oct 14 06:18:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857. 
Oct 14 06:18:58 localhost podman[336030]: 2025-10-14 10:18:58.248809929 +0000 UTC m=+0.082628990 container health_status 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Oct 14 06:18:58 localhost podman[336030]: 2025-10-14 10:18:58.262214985 +0000 UTC m=+0.096034046 container exec_died 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': 
['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Oct 14 06:18:58 localhost systemd[1]: 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041.service: Deactivated successfully. Oct 14 06:18:58 localhost podman[248187]: time="2025-10-14T10:18:58Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Oct 14 06:18:58 localhost podman[336031]: 2025-10-14 10:18:58.317845918 +0000 UTC m=+0.148392607 container health_status 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251009, 
org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5) Oct 14 06:18:58 localhost podman[248187]: @ - - [14/Oct/2025:10:18:58 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 147497 "" "Go-http-client/1.1" Oct 14 06:18:58 localhost podman[336031]: 2025-10-14 10:18:58.453234551 +0000 UTC m=+0.283781250 container exec_died 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251009, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5) Oct 14 06:18:58 localhost systemd[1]: 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857.service: Deactivated successfully. Oct 14 06:18:58 localhost podman[248187]: @ - - [14/Oct/2025:10:18:58 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19863 "" "Go-http-client/1.1" Oct 14 06:18:59 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e186 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104 Oct 14 06:18:59 localhost nova_compute[297686]: 2025-10-14 10:18:59.567 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:18:59 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e187 e187: 6 total, 6 up, 6 in Oct 14 06:19:00 localhost nova_compute[297686]: 2025-10-14 10:19:00.368 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:19:01 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-1069349725", "format": "json"} : dispatch Oct 14 06:19:01 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-1069349725", "caps": ["mds", "allow rw 
path=/volumes/_nogroup/49fe1b10-39aa-4c6d-b588-71b3f9300e29/d37910f0-cb97-4839-8350-ba32cb1ee48b", "osd", "allow rw pool=manila_data namespace=fsvolumens_49fe1b10-39aa-4c6d-b588-71b3f9300e29", "mon", "allow r"], "format": "json"} : dispatch Oct 14 06:19:01 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-1069349725", "caps": ["mds", "allow rw path=/volumes/_nogroup/49fe1b10-39aa-4c6d-b588-71b3f9300e29/d37910f0-cb97-4839-8350-ba32cb1ee48b", "osd", "allow rw pool=manila_data namespace=fsvolumens_49fe1b10-39aa-4c6d-b588-71b3f9300e29", "mon", "allow r"], "format": "json"}]': finished Oct 14 06:19:01 localhost neutron_sriov_agent[264974]: 2025-10-14 10:19:01.935 2 INFO neutron.agent.securitygroups_rpc [None req-4d772f3b-ace0-4ef8-a89a-962408816e43 c858e15b48804013a3e03a1551996d0b f51b17b0ed0a40019c4fcd777d26b72d - - default default] Security group rule updated ['d9ec2c86-56aa-409c-8be6-91e4f9464bbb']#033[00m Oct 14 06:19:02 localhost neutron_sriov_agent[264974]: 2025-10-14 10:19:02.862 2 INFO neutron.agent.securitygroups_rpc [None req-68296946-0c64-4977-938d-88b8a7ab90fb bbbcd088abe94518b01a8b1085998690 2cffabfb0ecf4b5d91a7a63dd17a370a - - default default] Security group member updated ['29ffe3b6-a0bd-4faa-ab66-a0c74c1b5533']#033[00m Oct 14 06:19:04 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e187 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104 Oct 14 06:19:04 localhost neutron_sriov_agent[264974]: 2025-10-14 10:19:04.403 2 INFO neutron.agent.securitygroups_rpc [None req-3a636727-ac42-4163-b371-317e98c4e3cd bbbcd088abe94518b01a8b1085998690 2cffabfb0ecf4b5d91a7a63dd17a370a - - default default] Security group member updated ['29ffe3b6-a0bd-4faa-ab66-a0c74c1b5533']#033[00m Oct 14 06:19:04 localhost neutron_sriov_agent[264974]: 2025-10-14 10:19:04.491 2 INFO 
neutron.agent.securitygroups_rpc [None req-3a636727-ac42-4163-b371-317e98c4e3cd bbbcd088abe94518b01a8b1085998690 2cffabfb0ecf4b5d91a7a63dd17a370a - - default default] Security group member updated ['29ffe3b6-a0bd-4faa-ab66-a0c74c1b5533']#033[00m Oct 14 06:19:04 localhost nova_compute[297686]: 2025-10-14 10:19:04.597 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:19:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad. Oct 14 06:19:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec. Oct 14 06:19:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29. Oct 14 06:19:04 localhost podman[336072]: 2025-10-14 10:19:04.765034245 +0000 UTC m=+0.093193237 container health_status fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, config_id=iscsid, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0) Oct 14 06:19:04 localhost neutron_sriov_agent[264974]: 2025-10-14 10:19:04.800 2 INFO neutron.agent.securitygroups_rpc [None req-2b0cc61c-bbef-4d46-aa4f-f455510212c2 bbbcd088abe94518b01a8b1085998690 2cffabfb0ecf4b5d91a7a63dd17a370a - - default default] Security group member updated ['29ffe3b6-a0bd-4faa-ab66-a0c74c1b5533']#033[00m Oct 14 06:19:04 localhost podman[336072]: 2025-10-14 10:19:04.802172065 +0000 UTC m=+0.130331147 container exec_died fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, org.label-schema.license=GPLv2, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, container_name=iscsid, managed_by=edpm_ansible, config_id=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS) Oct 14 06:19:04 localhost systemd[1]: fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29.service: Deactivated successfully. Oct 14 06:19:04 localhost podman[336071]: 2025-10-14 10:19:04.860247514 +0000 UTC m=+0.191912464 container health_status aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', 
'--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm) Oct 14 06:19:04 localhost podman[336070]: 2025-10-14 10:19:04.809165672 +0000 UTC m=+0.142319719 container health_status 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base 
Image, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS) Oct 14 06:19:04 localhost podman[336071]: 2025-10-14 10:19:04.867926062 +0000 UTC m=+0.199591052 container exec_died aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter) Oct 14 06:19:04 localhost systemd[1]: aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec.service: Deactivated successfully. 
Oct 14 06:19:04 localhost podman[336070]: 2025-10-14 10:19:04.944332499 +0000 UTC m=+0.277486556 container exec_died 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, container_name=multipathd, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS) Oct 14 06:19:04 localhost systemd[1]: 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad.service: Deactivated successfully. 
Oct 14 06:19:05 localhost neutron_sriov_agent[264974]: 2025-10-14 10:19:05.179 2 INFO neutron.agent.securitygroups_rpc [None req-2944b2cf-abf2-4c66-9fd7-e0cbff7f7cb3 bbbcd088abe94518b01a8b1085998690 2cffabfb0ecf4b5d91a7a63dd17a370a - - default default] Security group member updated ['29ffe3b6-a0bd-4faa-ab66-a0c74c1b5533']#033[00m Oct 14 06:19:05 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:19:05.196 271987 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Oct 14 06:19:05 localhost nova_compute[297686]: 2025-10-14 10:19:05.370 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:19:05 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-1069349725", "format": "json"} : dispatch Oct 14 06:19:05 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-1069349725"} : dispatch Oct 14 06:19:05 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-1069349725"}]': finished Oct 14 06:19:05 localhost systemd[1]: tmp-crun.ymWZlI.mount: Deactivated successfully. Oct 14 06:19:06 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Oct 14 06:19:06 localhost ceph-mon[317114]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.32:0/31555708' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Oct 14 06:19:06 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Oct 14 06:19:06 localhost ceph-mon[317114]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/31555708' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Oct 14 06:19:07 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Oct 14 06:19:07 localhost ceph-mon[317114]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1376040352' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Oct 14 06:19:07 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Oct 14 06:19:07 localhost ceph-mon[317114]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.32:0/1376040352' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Oct 14 06:19:07 localhost neutron_sriov_agent[264974]: 2025-10-14 10:19:07.878 2 INFO neutron.agent.securitygroups_rpc [None req-95f9645e-0b06-4e03-a9e5-66080516d29e bbbcd088abe94518b01a8b1085998690 2cffabfb0ecf4b5d91a7a63dd17a370a - - default default] Security group member updated ['29ffe3b6-a0bd-4faa-ab66-a0c74c1b5533']#033[00m Oct 14 06:19:08 localhost neutron_sriov_agent[264974]: 2025-10-14 10:19:08.353 2 INFO neutron.agent.securitygroups_rpc [None req-e4805ede-8e41-4486-ae97-356dc9144e24 bbbcd088abe94518b01a8b1085998690 2cffabfb0ecf4b5d91a7a63dd17a370a - - default default] Security group member updated ['29ffe3b6-a0bd-4faa-ab66-a0c74c1b5533']#033[00m Oct 14 06:19:08 localhost openstack_network_exporter[250374]: ERROR 10:19:08 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Oct 14 06:19:08 localhost openstack_network_exporter[250374]: ERROR 10:19:08 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Oct 14 06:19:08 localhost openstack_network_exporter[250374]: ERROR 10:19:08 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Oct 14 06:19:08 localhost openstack_network_exporter[250374]: ERROR 10:19:08 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Oct 14 06:19:08 localhost openstack_network_exporter[250374]: Oct 14 06:19:08 localhost openstack_network_exporter[250374]: ERROR 10:19:08 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Oct 14 06:19:08 localhost openstack_network_exporter[250374]: Oct 14 06:19:08 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Oct 14 06:19:08 localhost ceph-mon[317114]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.32:0/1236076424' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Oct 14 06:19:08 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Oct 14 06:19:08 localhost ceph-mon[317114]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1236076424' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Oct 14 06:19:09 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e187 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104 Oct 14 06:19:09 localhost nova_compute[297686]: 2025-10-14 10:19:09.639 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:19:10 localhost nova_compute[297686]: 2025-10-14 10:19:10.372 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:19:11 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:19:11.097 271987 INFO neutron.agent.linux.ip_lib [None req-3783229f-1947-4b8e-b9ad-3e6608c8eb68 - - - - - -] Device tap143376a3-76 cannot be used as it has no MAC address#033[00m Oct 14 06:19:11 localhost nova_compute[297686]: 2025-10-14 10:19:11.172 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:19:11 localhost kernel: device tap143376a3-76 entered promiscuous mode Oct 14 06:19:11 localhost NetworkManager[5977]: [1760437151.1830] manager: (tap143376a3-76): new Generic device (/org/freedesktop/NetworkManager/Devices/46) Oct 14 06:19:11 localhost nova_compute[297686]: 2025-10-14 10:19:11.183 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 
06:19:11 localhost ovn_controller[157396]: 2025-10-14T10:19:11Z|00273|binding|INFO|Claiming lport 143376a3-7666-4627-9c39-2d35bc2a2886 for this chassis. Oct 14 06:19:11 localhost ovn_controller[157396]: 2025-10-14T10:19:11Z|00274|binding|INFO|143376a3-7666-4627-9c39-2d35bc2a2886: Claiming unknown Oct 14 06:19:11 localhost systemd-udevd[336141]: Network interface NamePolicy= disabled on kernel command line. Oct 14 06:19:11 localhost ovn_controller[157396]: 2025-10-14T10:19:11Z|00275|binding|INFO|Setting lport 143376a3-7666-4627-9c39-2d35bc2a2886 ovn-installed in OVS Oct 14 06:19:11 localhost ovn_controller[157396]: 2025-10-14T10:19:11Z|00276|binding|INFO|Setting lport 143376a3-7666-4627-9c39-2d35bc2a2886 up in Southbound Oct 14 06:19:11 localhost nova_compute[297686]: 2025-10-14 10:19:11.221 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:19:11 localhost ovn_metadata_agent[163050]: 2025-10-14 10:19:11.214 163055 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005486733.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpe1893814-19d7-5b97-af05-19e2aafa5382-ee71b50b-d7c3-42f1-9021-62ecf9939b16', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ee71b50b-d7c3-42f1-9021-62ecf9939b16', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2cffabfb0ecf4b5d91a7a63dd17a370a', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 
'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=05d5863d-9ce2-4d00-9e1e-04ed24a26225, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=143376a3-7666-4627-9c39-2d35bc2a2886) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Oct 14 06:19:11 localhost ovn_metadata_agent[163050]: 2025-10-14 10:19:11.218 163055 INFO neutron.agent.ovn.metadata.agent [-] Port 143376a3-7666-4627-9c39-2d35bc2a2886 in datapath ee71b50b-d7c3-42f1-9021-62ecf9939b16 bound to our chassis#033[00m Oct 14 06:19:11 localhost ovn_metadata_agent[163050]: 2025-10-14 10:19:11.220 163055 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network ee71b50b-d7c3-42f1-9021-62ecf9939b16 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Oct 14 06:19:11 localhost journal[237477]: ethtool ioctl error on tap143376a3-76: No such device Oct 14 06:19:11 localhost ovn_metadata_agent[163050]: 2025-10-14 10:19:11.222 163159 DEBUG oslo.privsep.daemon [-] privsep: reply[4743a88f-1a2d-4e15-9e99-f2a40c951fb8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Oct 14 06:19:11 localhost journal[237477]: ethtool ioctl error on tap143376a3-76: No such device Oct 14 06:19:11 localhost journal[237477]: ethtool ioctl error on tap143376a3-76: No such device Oct 14 06:19:11 localhost journal[237477]: ethtool ioctl error on tap143376a3-76: No such device Oct 14 06:19:11 localhost journal[237477]: ethtool ioctl error on tap143376a3-76: No such device Oct 14 06:19:11 localhost journal[237477]: ethtool ioctl error on tap143376a3-76: No such device Oct 14 06:19:11 localhost journal[237477]: ethtool ioctl error on tap143376a3-76: No such device Oct 14 06:19:11 localhost journal[237477]: ethtool 
ioctl error on tap143376a3-76: No such device Oct 14 06:19:11 localhost nova_compute[297686]: 2025-10-14 10:19:11.266 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:19:11 localhost nova_compute[297686]: 2025-10-14 10:19:11.299 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:19:11 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-1069349725", "format": "json"} : dispatch Oct 14 06:19:11 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-1069349725", "caps": ["mds", "allow rw path=/volumes/_nogroup/d87b2b90-a5f1-4681-ae04-3fa3af31d3cb/8dddc9a5-74bf-4406-b386-a72917ae6624", "osd", "allow rw pool=manila_data namespace=fsvolumens_d87b2b90-a5f1-4681-ae04-3fa3af31d3cb", "mon", "allow r"], "format": "json"} : dispatch Oct 14 06:19:11 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-1069349725", "caps": ["mds", "allow rw path=/volumes/_nogroup/d87b2b90-a5f1-4681-ae04-3fa3af31d3cb/8dddc9a5-74bf-4406-b386-a72917ae6624", "osd", "allow rw pool=manila_data namespace=fsvolumens_d87b2b90-a5f1-4681-ae04-3fa3af31d3cb", "mon", "allow r"], "format": "json"}]': finished Oct 14 06:19:12 localhost podman[336212]: Oct 14 06:19:12 localhost podman[336212]: 2025-10-14 10:19:12.28242673 +0000 UTC m=+0.091620048 container create 4fd876d396ee8a8d7b45b17b20aa4056063a4579ca663d4cf319fe6ddd6f586e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, 
name=neutron-dnsmasq-qdhcp-ee71b50b-d7c3-42f1-9021-62ecf9939b16, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS) Oct 14 06:19:12 localhost systemd[1]: Started libpod-conmon-4fd876d396ee8a8d7b45b17b20aa4056063a4579ca663d4cf319fe6ddd6f586e.scope. Oct 14 06:19:12 localhost systemd[1]: tmp-crun.gBxVbO.mount: Deactivated successfully. Oct 14 06:19:12 localhost systemd[1]: Started libcrun container. Oct 14 06:19:12 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/88c556450a247ea73c22fe61ed6238a007744c15704e6d07dd0e0abe81f28a8a/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Oct 14 06:19:12 localhost podman[336212]: 2025-10-14 10:19:12.244086263 +0000 UTC m=+0.053279691 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Oct 14 06:19:12 localhost podman[336212]: 2025-10-14 10:19:12.344350099 +0000 UTC m=+0.153543417 container init 4fd876d396ee8a8d7b45b17b20aa4056063a4579ca663d4cf319fe6ddd6f586e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ee71b50b-d7c3-42f1-9021-62ecf9939b16, tcib_managed=true, org.label-schema.build-date=20251009, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS) Oct 14 06:19:12 localhost podman[336212]: 2025-10-14 10:19:12.35441609 +0000 UTC m=+0.163609408 container start 4fd876d396ee8a8d7b45b17b20aa4056063a4579ca663d4cf319fe6ddd6f586e 
(image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ee71b50b-d7c3-42f1-9021-62ecf9939b16, org.label-schema.build-date=20251009, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image) Oct 14 06:19:12 localhost dnsmasq[336231]: started, version 2.85 cachesize 150 Oct 14 06:19:12 localhost dnsmasq[336231]: DNS service limited to local subnets Oct 14 06:19:12 localhost dnsmasq[336231]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Oct 14 06:19:12 localhost dnsmasq[336231]: warning: no upstream servers configured Oct 14 06:19:12 localhost dnsmasq-dhcp[336231]: DHCP, static leases only on 10.100.0.0, lease time 1d Oct 14 06:19:12 localhost dnsmasq[336231]: read /var/lib/neutron/dhcp/ee71b50b-d7c3-42f1-9021-62ecf9939b16/addn_hosts - 0 addresses Oct 14 06:19:12 localhost dnsmasq-dhcp[336231]: read /var/lib/neutron/dhcp/ee71b50b-d7c3-42f1-9021-62ecf9939b16/host Oct 14 06:19:12 localhost dnsmasq-dhcp[336231]: read /var/lib/neutron/dhcp/ee71b50b-d7c3-42f1-9021-62ecf9939b16/opts Oct 14 06:19:12 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:19:12.477 271987 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-10-14T10:19:12Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=8369068c-56c4-45a8-874e-cbb881d83bf7, ip_allocation=immediate, mac_address=fa:16:3e:3e:04:eb, name=tempest-PortsTestJSON-1536091557, 
network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-10-14T10:19:09Z, description=, dns_domain=, id=ee71b50b-d7c3-42f1-9021-62ecf9939b16, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-PortsTestJSON-167826647, port_security_enabled=True, project_id=2cffabfb0ecf4b5d91a7a63dd17a370a, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=58923, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2836, status=ACTIVE, subnets=['00cc5581-26f1-4a9c-b864-989ddce22f62'], tags=[], tenant_id=2cffabfb0ecf4b5d91a7a63dd17a370a, updated_at=2025-10-14T10:19:10Z, vlan_transparent=None, network_id=ee71b50b-d7c3-42f1-9021-62ecf9939b16, port_security_enabled=True, project_id=2cffabfb0ecf4b5d91a7a63dd17a370a, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2861, status=DOWN, tags=[], tenant_id=2cffabfb0ecf4b5d91a7a63dd17a370a, updated_at=2025-10-14T10:19:12Z on network ee71b50b-d7c3-42f1-9021-62ecf9939b16#033[00m Oct 14 06:19:12 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:19:12.545 271987 INFO neutron.agent.dhcp.agent [None req-2ec6fe84-792d-4ac0-8b90-b2641a69a4d7 - - - - - -] DHCP configuration for ports {'f90a6917-d9b0-45ab-a6ab-edeb20a6689d'} is completed#033[00m Oct 14 06:19:12 localhost podman[336249]: 2025-10-14 10:19:12.727895928 +0000 UTC m=+0.058617106 container kill 4fd876d396ee8a8d7b45b17b20aa4056063a4579ca663d4cf319fe6ddd6f586e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ee71b50b-d7c3-42f1-9021-62ecf9939b16, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack 
Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true) Oct 14 06:19:12 localhost dnsmasq[336231]: read /var/lib/neutron/dhcp/ee71b50b-d7c3-42f1-9021-62ecf9939b16/addn_hosts - 1 addresses Oct 14 06:19:12 localhost dnsmasq-dhcp[336231]: read /var/lib/neutron/dhcp/ee71b50b-d7c3-42f1-9021-62ecf9939b16/host Oct 14 06:19:12 localhost dnsmasq-dhcp[336231]: read /var/lib/neutron/dhcp/ee71b50b-d7c3-42f1-9021-62ecf9939b16/opts Oct 14 06:19:13 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:19:13.004 271987 INFO neutron.agent.dhcp.agent [None req-3e630f9a-352a-4d65-a70f-6114faae74b3 - - - - - -] DHCP configuration for ports {'8369068c-56c4-45a8-874e-cbb881d83bf7'} is completed#033[00m Oct 14 06:19:13 localhost dnsmasq[336231]: read /var/lib/neutron/dhcp/ee71b50b-d7c3-42f1-9021-62ecf9939b16/addn_hosts - 0 addresses Oct 14 06:19:13 localhost dnsmasq-dhcp[336231]: read /var/lib/neutron/dhcp/ee71b50b-d7c3-42f1-9021-62ecf9939b16/host Oct 14 06:19:13 localhost podman[336287]: 2025-10-14 10:19:13.479798097 +0000 UTC m=+0.052492567 container kill 4fd876d396ee8a8d7b45b17b20aa4056063a4579ca663d4cf319fe6ddd6f586e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ee71b50b-d7c3-42f1-9021-62ecf9939b16, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5) Oct 14 06:19:13 localhost dnsmasq-dhcp[336231]: read /var/lib/neutron/dhcp/ee71b50b-d7c3-42f1-9021-62ecf9939b16/opts Oct 14 06:19:13 localhost dnsmasq[336231]: exiting on receipt of SIGTERM Oct 14 06:19:13 localhost podman[336324]: 2025-10-14 10:19:13.862738658 +0000 UTC m=+0.040615849 container kill 4fd876d396ee8a8d7b45b17b20aa4056063a4579ca663d4cf319fe6ddd6f586e 
(image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ee71b50b-d7c3-42f1-9021-62ecf9939b16, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2) Oct 14 06:19:13 localhost systemd[1]: libpod-4fd876d396ee8a8d7b45b17b20aa4056063a4579ca663d4cf319fe6ddd6f586e.scope: Deactivated successfully. Oct 14 06:19:13 localhost podman[336337]: 2025-10-14 10:19:13.923930333 +0000 UTC m=+0.048715280 container died 4fd876d396ee8a8d7b45b17b20aa4056063a4579ca663d4cf319fe6ddd6f586e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ee71b50b-d7c3-42f1-9021-62ecf9939b16, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0) Oct 14 06:19:13 localhost ovn_controller[157396]: 2025-10-14T10:19:13Z|00277|binding|INFO|Removing iface tap143376a3-76 ovn-installed in OVS Oct 14 06:19:13 localhost ovn_controller[157396]: 2025-10-14T10:19:13Z|00278|binding|INFO|Removing lport 143376a3-7666-4627-9c39-2d35bc2a2886 ovn-installed in OVS Oct 14 06:19:13 localhost nova_compute[297686]: 2025-10-14 10:19:13.934 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:19:13 localhost ovn_metadata_agent[163050]: 2025-10-14 10:19:13.935 163055 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port 73bb262c-d05f-410f-8b14-c51335bc9263 
with type ""#033[00m Oct 14 06:19:13 localhost ovn_metadata_agent[163050]: 2025-10-14 10:19:13.937 163055 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005486733.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpe1893814-19d7-5b97-af05-19e2aafa5382-ee71b50b-d7c3-42f1-9021-62ecf9939b16', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ee71b50b-d7c3-42f1-9021-62ecf9939b16', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2cffabfb0ecf4b5d91a7a63dd17a370a', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005486733.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=05d5863d-9ce2-4d00-9e1e-04ed24a26225, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=143376a3-7666-4627-9c39-2d35bc2a2886) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Oct 14 06:19:13 localhost ovn_metadata_agent[163050]: 2025-10-14 10:19:13.940 163055 INFO neutron.agent.ovn.metadata.agent [-] Port 143376a3-7666-4627-9c39-2d35bc2a2886 in datapath ee71b50b-d7c3-42f1-9021-62ecf9939b16 unbound from our chassis#033[00m Oct 14 06:19:13 localhost nova_compute[297686]: 2025-10-14 10:19:13.942 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:19:13 localhost ovn_metadata_agent[163050]: 2025-10-14 
10:19:13.943 163055 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network ee71b50b-d7c3-42f1-9021-62ecf9939b16, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Oct 14 06:19:13 localhost ovn_metadata_agent[163050]: 2025-10-14 10:19:13.944 163159 DEBUG oslo.privsep.daemon [-] privsep: reply[3fbaf80b-2f15-4aaf-a36c-b4ae801a1473]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Oct 14 06:19:13 localhost podman[336337]: 2025-10-14 10:19:13.959496665 +0000 UTC m=+0.084281622 container cleanup 4fd876d396ee8a8d7b45b17b20aa4056063a4579ca663d4cf319fe6ddd6f586e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ee71b50b-d7c3-42f1-9021-62ecf9939b16, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2) Oct 14 06:19:13 localhost systemd[1]: libpod-conmon-4fd876d396ee8a8d7b45b17b20aa4056063a4579ca663d4cf319fe6ddd6f586e.scope: Deactivated successfully. 
Oct 14 06:19:14 localhost podman[336339]: 2025-10-14 10:19:14.007589734 +0000 UTC m=+0.126540050 container remove 4fd876d396ee8a8d7b45b17b20aa4056063a4579ca663d4cf319fe6ddd6f586e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ee71b50b-d7c3-42f1-9021-62ecf9939b16, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image) Oct 14 06:19:14 localhost nova_compute[297686]: 2025-10-14 10:19:14.019 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:19:14 localhost kernel: device tap143376a3-76 left promiscuous mode Oct 14 06:19:14 localhost nova_compute[297686]: 2025-10-14 10:19:14.040 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:19:14 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:19:14.080 271987 INFO neutron.agent.dhcp.agent [None req-ed41a107-8b3f-4168-8db2-2f2c0be9f6b6 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Oct 14 06:19:14 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:19:14.091 271987 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Oct 14 06:19:14 localhost systemd[1]: var-lib-containers-storage-overlay-88c556450a247ea73c22fe61ed6238a007744c15704e6d07dd0e0abe81f28a8a-merged.mount: Deactivated successfully. Oct 14 06:19:14 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-4fd876d396ee8a8d7b45b17b20aa4056063a4579ca663d4cf319fe6ddd6f586e-userdata-shm.mount: Deactivated successfully. 
Oct 14 06:19:14 localhost systemd[1]: run-netns-qdhcp\x2dee71b50b\x2dd7c3\x2d42f1\x2d9021\x2d62ecf9939b16.mount: Deactivated successfully. Oct 14 06:19:14 localhost ovn_controller[157396]: 2025-10-14T10:19:14Z|00279|binding|INFO|Releasing lport 25c6586a-239c-451b-aac2-e0a3ee5c3145 from this chassis (sb_readonly=0) Oct 14 06:19:14 localhost nova_compute[297686]: 2025-10-14 10:19:14.348 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:19:14 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e187 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104 Oct 14 06:19:14 localhost neutron_sriov_agent[264974]: 2025-10-14 10:19:14.511 2 INFO neutron.agent.securitygroups_rpc [None req-88adcdd5-d424-4851-9b34-ed74fafb8707 bbbcd088abe94518b01a8b1085998690 2cffabfb0ecf4b5d91a7a63dd17a370a - - default default] Security group member updated ['29ffe3b6-a0bd-4faa-ab66-a0c74c1b5533']#033[00m Oct 14 06:19:14 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-1069349725", "format": "json"} : dispatch Oct 14 06:19:14 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-1069349725"} : dispatch Oct 14 06:19:14 localhost nova_compute[297686]: 2025-10-14 10:19:14.683 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:19:14 localhost neutron_sriov_agent[264974]: 2025-10-14 10:19:14.979 2 INFO neutron.agent.securitygroups_rpc [None req-e74f8eb6-5583-4f88-ba36-feb9c4fb0b0b bbbcd088abe94518b01a8b1085998690 2cffabfb0ecf4b5d91a7a63dd17a370a - - default default] Security group member updated 
['29ffe3b6-a0bd-4faa-ab66-a0c74c1b5533']#033[00m Oct 14 06:19:15 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:19:15.010 271987 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Oct 14 06:19:15 localhost neutron_sriov_agent[264974]: 2025-10-14 10:19:15.278 2 INFO neutron.agent.securitygroups_rpc [None req-718d9441-87bf-4ffb-9555-9941a2a03947 bbbcd088abe94518b01a8b1085998690 2cffabfb0ecf4b5d91a7a63dd17a370a - - default default] Security group member updated ['29ffe3b6-a0bd-4faa-ab66-a0c74c1b5533']#033[00m Oct 14 06:19:15 localhost nova_compute[297686]: 2025-10-14 10:19:15.375 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:19:15 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-1069349725"}]': finished Oct 14 06:19:15 localhost neutron_sriov_agent[264974]: 2025-10-14 10:19:15.591 2 INFO neutron.agent.securitygroups_rpc [None req-0e2311c8-85a8-441e-8500-ba38920b937a bbbcd088abe94518b01a8b1085998690 2cffabfb0ecf4b5d91a7a63dd17a370a - - default default] Security group member updated ['29ffe3b6-a0bd-4faa-ab66-a0c74c1b5533']#033[00m Oct 14 06:19:15 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:19:15.618 271987 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Oct 14 06:19:15 localhost neutron_sriov_agent[264974]: 2025-10-14 10:19:15.862 2 INFO neutron.agent.securitygroups_rpc [None req-8fbdee58-d50f-4ab3-9f88-3eacdf0ebeaa bbbcd088abe94518b01a8b1085998690 2cffabfb0ecf4b5d91a7a63dd17a370a - - default default] Security group member updated ['29ffe3b6-a0bd-4faa-ab66-a0c74c1b5533']#033[00m Oct 14 06:19:16 localhost neutron_sriov_agent[264974]: 2025-10-14 10:19:16.480 2 INFO neutron.agent.securitygroups_rpc [None 
req-7d0fd557-94a4-4212-93b1-9fe908adc855 bbbcd088abe94518b01a8b1085998690 2cffabfb0ecf4b5d91a7a63dd17a370a - - default default] Security group member updated ['29ffe3b6-a0bd-4faa-ab66-a0c74c1b5533']#033[00m Oct 14 06:19:16 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:19:16.502 271987 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Oct 14 06:19:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f. Oct 14 06:19:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917. Oct 14 06:19:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d. Oct 14 06:19:18 localhost ovn_metadata_agent[163050]: 2025-10-14 10:19:18.682 163055 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=18, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'b6:6b:50', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '6a:59:81:01:bc:8b'}, ipsec=False) old=SB_Global(nb_cfg=17) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Oct 14 06:19:18 localhost ovn_metadata_agent[163050]: 2025-10-14 10:19:18.684 163055 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m Oct 14 06:19:18 localhost nova_compute[297686]: 2025-10-14 10:19:18.684 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 
14 06:19:18 localhost systemd[1]: tmp-crun.zxlDG8.mount: Deactivated successfully. Oct 14 06:19:18 localhost podman[336367]: 2025-10-14 10:19:18.762176838 +0000 UTC m=+0.094071125 container health_status 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, name=ubi9-minimal, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.buildah.version=1.33.7, release=1755695350, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, distribution-scope=public, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal 
is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, vcs-type=git, managed_by=edpm_ansible, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.) Oct 14 06:19:18 localhost podman[336366]: 2025-10-14 10:19:18.769374831 +0000 UTC m=+0.100653759 container health_status 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, 
io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.vendor=CentOS) Oct 14 06:19:18 localhost podman[336367]: 2025-10-14 10:19:18.779208106 +0000 UTC m=+0.111102393 container exec_died 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, release=1755695350, config_id=edpm, vendor=Red Hat, Inc., architecture=x86_64, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, maintainer=Red Hat, Inc., name=ubi9-minimal, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}) Oct 14 06:19:18 localhost systemd[1]: 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917.service: Deactivated successfully. 
Oct 14 06:19:18 localhost podman[336366]: 2025-10-14 10:19:18.808110671 +0000 UTC m=+0.139389589 container exec_died 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, tcib_managed=true) Oct 14 06:19:18 localhost podman[336368]: 2025-10-14 10:19:18.82263644 +0000 UTC m=+0.145028283 container health_status 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d) Oct 14 06:19:18 localhost systemd[1]: 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f.service: Deactivated successfully. 
Oct 14 06:19:18 localhost podman[336368]: 2025-10-14 10:19:18.8371611 +0000 UTC m=+0.159552963 container exec_died 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0) Oct 14 06:19:18 localhost systemd[1]: 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d.service: Deactivated successfully. 
Oct 14 06:19:19 localhost ovn_metadata_agent[163050]: 2025-10-14 10:19:19.040 163055 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9f:18:bc 10.100.0.18 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.18/28 10.100.0.2/28', 'neutron:device_id': 'ovnmeta-9cb5f697-e34c-42db-aca7-5e486551dd6a', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9cb5f697-e34c-42db-aca7-5e486551dd6a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2cffabfb0ecf4b5d91a7a63dd17a370a', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=dcfe5f47-18ab-4556-b7e8-874d7a7daff0, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=9c03674c-50ca-4ed4-9dac-c94e9dd6244c) old=Port_Binding(mac=['fa:16:3e:9f:18:bc 10.100.0.2'], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-9cb5f697-e34c-42db-aca7-5e486551dd6a', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9cb5f697-e34c-42db-aca7-5e486551dd6a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2cffabfb0ecf4b5d91a7a63dd17a370a', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches 
/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Oct 14 06:19:19 localhost ovn_metadata_agent[163050]: 2025-10-14 10:19:19.042 163055 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 9c03674c-50ca-4ed4-9dac-c94e9dd6244c in datapath 9cb5f697-e34c-42db-aca7-5e486551dd6a updated#033[00m Oct 14 06:19:19 localhost ovn_metadata_agent[163050]: 2025-10-14 10:19:19.044 163055 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 9cb5f697-e34c-42db-aca7-5e486551dd6a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Oct 14 06:19:19 localhost ovn_metadata_agent[163050]: 2025-10-14 10:19:19.045 163159 DEBUG oslo.privsep.daemon [-] privsep: reply[12147c52-34da-4b3c-90b9-cc7a9dee316f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Oct 14 06:19:19 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e187 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104 Oct 14 06:19:19 localhost neutron_sriov_agent[264974]: 2025-10-14 10:19:19.706 2 INFO neutron.agent.securitygroups_rpc [None req-a27717d6-07e5-4b5e-baa0-f1dd7df1cd15 bbbcd088abe94518b01a8b1085998690 2cffabfb0ecf4b5d91a7a63dd17a370a - - default default] Security group member updated ['29ffe3b6-a0bd-4faa-ab66-a0c74c1b5533']#033[00m Oct 14 06:19:19 localhost nova_compute[297686]: 2025-10-14 10:19:19.736 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:19:20 localhost neutron_sriov_agent[264974]: 2025-10-14 10:19:20.311 2 INFO neutron.agent.securitygroups_rpc [None req-1cd2d25e-90be-4ec8-893b-f4c70d148c82 bbbcd088abe94518b01a8b1085998690 2cffabfb0ecf4b5d91a7a63dd17a370a - - default default] Security group member updated ['29ffe3b6-a0bd-4faa-ab66-a0c74c1b5533']#033[00m Oct 14 06:19:20 
localhost nova_compute[297686]: 2025-10-14 10:19:20.377 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:19:20 localhost neutron_sriov_agent[264974]: 2025-10-14 10:19:20.873 2 INFO neutron.agent.securitygroups_rpc [None req-1e9de31c-96f9-4ac3-afa0-468e18743a4e bbbcd088abe94518b01a8b1085998690 2cffabfb0ecf4b5d91a7a63dd17a370a - - default default] Security group member updated ['29ffe3b6-a0bd-4faa-ab66-a0c74c1b5533']#033[00m Oct 14 06:19:20 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e188 e188: 6 total, 6 up, 6 in Oct 14 06:19:21 localhost neutron_sriov_agent[264974]: 2025-10-14 10:19:21.217 2 INFO neutron.agent.securitygroups_rpc [None req-1e8fc2ef-2179-4fb2-8c12-8f02c2ba1726 bbbcd088abe94518b01a8b1085998690 2cffabfb0ecf4b5d91a7a63dd17a370a - - default default] Security group member updated ['29ffe3b6-a0bd-4faa-ab66-a0c74c1b5533']#033[00m Oct 14 06:19:21 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-1069349725", "format": "json"} : dispatch Oct 14 06:19:21 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-1069349725", "caps": ["mds", "allow rw path=/volumes/_nogroup/3e440449-fe29-4222-837b-3115389bed13/42adabe8-acdf-4e4b-8d5e-5994bafb229a", "osd", "allow rw pool=manila_data namespace=fsvolumens_3e440449-fe29-4222-837b-3115389bed13", "mon", "allow r"], "format": "json"} : dispatch Oct 14 06:19:21 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-1069349725", "caps": ["mds", "allow rw path=/volumes/_nogroup/3e440449-fe29-4222-837b-3115389bed13/42adabe8-acdf-4e4b-8d5e-5994bafb229a", "osd", "allow 
rw pool=manila_data namespace=fsvolumens_3e440449-fe29-4222-837b-3115389bed13", "mon", "allow r"], "format": "json"}]': finished Oct 14 06:19:21 localhost ovn_metadata_agent[163050]: 2025-10-14 10:19:21.685 163055 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=9e4b0f79-1220-4c7d-a18d-fa1a88dab362, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '18'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Oct 14 06:19:22 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Oct 14 06:19:22 localhost ceph-mon[317114]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/616165160' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Oct 14 06:19:22 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Oct 14 06:19:22 localhost ceph-mon[317114]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.32:0/616165160' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Oct 14 06:19:24 localhost neutron_sriov_agent[264974]: 2025-10-14 10:19:24.075 2 INFO neutron.agent.securitygroups_rpc [None req-26a85657-c795-47cc-9b5a-acc54e4fee3f bbbcd088abe94518b01a8b1085998690 2cffabfb0ecf4b5d91a7a63dd17a370a - - default default] Security group member updated ['29ffe3b6-a0bd-4faa-ab66-a0c74c1b5533']#033[00m Oct 14 06:19:24 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e188 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104 Oct 14 06:19:24 localhost neutron_sriov_agent[264974]: 2025-10-14 10:19:24.525 2 INFO neutron.agent.securitygroups_rpc [None req-8c653634-5465-440b-8a86-5c6019dd6e5b bbbcd088abe94518b01a8b1085998690 2cffabfb0ecf4b5d91a7a63dd17a370a - - default default] Security group member updated ['29ffe3b6-a0bd-4faa-ab66-a0c74c1b5533']#033[00m Oct 14 06:19:24 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-1069349725", "format": "json"} : dispatch Oct 14 06:19:24 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-1069349725"} : dispatch Oct 14 06:19:24 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-1069349725"}]': finished Oct 14 06:19:24 localhost nova_compute[297686]: 2025-10-14 10:19:24.783 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:19:25 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Oct 14 06:19:25 localhost 
ceph-mon[317114]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3468620068' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Oct 14 06:19:25 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Oct 14 06:19:25 localhost ceph-mon[317114]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3468620068' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Oct 14 06:19:25 localhost neutron_sriov_agent[264974]: 2025-10-14 10:19:25.065 2 INFO neutron.agent.securitygroups_rpc [None req-7bf340a9-d2d9-4152-b37b-1f681301b926 bbbcd088abe94518b01a8b1085998690 2cffabfb0ecf4b5d91a7a63dd17a370a - - default default] Security group member updated ['29ffe3b6-a0bd-4faa-ab66-a0c74c1b5533']#033[00m Oct 14 06:19:25 localhost nova_compute[297686]: 2025-10-14 10:19:25.380 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:19:25 localhost neutron_sriov_agent[264974]: 2025-10-14 10:19:25.432 2 INFO neutron.agent.securitygroups_rpc [None req-f3aa15d7-c869-4941-8ccc-15166138b527 bbbcd088abe94518b01a8b1085998690 2cffabfb0ecf4b5d91a7a63dd17a370a - - default default] Security group member updated ['29ffe3b6-a0bd-4faa-ab66-a0c74c1b5533']#033[00m Oct 14 06:19:26 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e189 e189: 6 total, 6 up, 6 in Oct 14 06:19:28 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-1069349725", "format": "json"} : dispatch Oct 14 06:19:28 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-1069349725", "caps": ["mds", "allow rw 
path=/volumes/_nogroup/ce754065-58e4-49a9-a603-4440ef4b311b/064df9c5-e20e-4319-95d4-91217c164f07", "osd", "allow rw pool=manila_data namespace=fsvolumens_ce754065-58e4-49a9-a603-4440ef4b311b", "mon", "allow r"], "format": "json"} : dispatch Oct 14 06:19:28 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-1069349725", "caps": ["mds", "allow rw path=/volumes/_nogroup/ce754065-58e4-49a9-a603-4440ef4b311b/064df9c5-e20e-4319-95d4-91217c164f07", "osd", "allow rw pool=manila_data namespace=fsvolumens_ce754065-58e4-49a9-a603-4440ef4b311b", "mon", "allow r"], "format": "json"}]': finished Oct 14 06:19:28 localhost podman[248187]: time="2025-10-14T10:19:28Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Oct 14 06:19:28 localhost podman[248187]: @ - - [14/Oct/2025:10:19:28 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 147497 "" "Go-http-client/1.1" Oct 14 06:19:28 localhost podman[248187]: @ - - [14/Oct/2025:10:19:28 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19866 "" "Go-http-client/1.1" Oct 14 06:19:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041. Oct 14 06:19:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857. Oct 14 06:19:28 localhost systemd[1]: tmp-crun.SeYFN1.mount: Deactivated successfully. 
Oct 14 06:19:28 localhost podman[336425]: 2025-10-14 10:19:28.751738591 +0000 UTC m=+0.093010432 container health_status 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Oct 14 06:19:28 localhost podman[336425]: 2025-10-14 10:19:28.759906354 +0000 UTC m=+0.101178265 container exec_died 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': 
['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm) Oct 14 06:19:28 localhost systemd[1]: 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041.service: Deactivated successfully. Oct 14 06:19:28 localhost podman[336426]: 2025-10-14 10:19:28.845799715 +0000 UTC m=+0.183801345 container health_status 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251009, tcib_managed=true, 
container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_id=ovn_metadata_agent) Oct 14 06:19:28 localhost podman[336426]: 2025-10-14 10:19:28.853036019 +0000 UTC m=+0.191037589 container exec_died 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}) Oct 14 06:19:28 localhost systemd[1]: 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857.service: Deactivated successfully. Oct 14 06:19:28 localhost neutron_sriov_agent[264974]: 2025-10-14 10:19:28.955 2 INFO neutron.agent.securitygroups_rpc [None req-c2f9c371-9d2a-4b37-a033-6edbdd512e3d bbbcd088abe94518b01a8b1085998690 2cffabfb0ecf4b5d91a7a63dd17a370a - - default default] Security group member updated ['29ffe3b6-a0bd-4faa-ab66-a0c74c1b5533']#033[00m Oct 14 06:19:29 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e189 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104 Oct 14 06:19:29 localhost nova_compute[297686]: 2025-10-14 10:19:29.821 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:19:30 localhost nova_compute[297686]: 2025-10-14 10:19:30.384 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:19:30 localhost neutron_sriov_agent[264974]: 2025-10-14 10:19:30.979 2 INFO neutron.agent.securitygroups_rpc [None req-e08f7c5d-e6e9-4662-9389-773b37e0faaa bbbcd088abe94518b01a8b1085998690 2cffabfb0ecf4b5d91a7a63dd17a370a - - default default] Security group member updated ['29ffe3b6-a0bd-4faa-ab66-a0c74c1b5533']#033[00m Oct 14 06:19:31 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-1069349725", "format": "json"} : dispatch Oct 14 06:19:31 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth rm", "entity": 
"client.tempest-cephx-id-1069349725"} : dispatch Oct 14 06:19:31 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-1069349725"}]': finished Oct 14 06:19:31 localhost nova_compute[297686]: 2025-10-14 10:19:31.692 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:19:31 localhost nova_compute[297686]: 2025-10-14 10:19:31.692 2 DEBUG nova.compute.manager [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Oct 14 06:19:31 localhost nova_compute[297686]: 2025-10-14 10:19:31.693 2 DEBUG nova.compute.manager [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Oct 14 06:19:31 localhost nova_compute[297686]: 2025-10-14 10:19:31.751 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Acquiring lock "refresh_cache-88c4e366-b765-47a6-96bf-f7677f2ce67c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Oct 14 06:19:31 localhost nova_compute[297686]: 2025-10-14 10:19:31.752 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Acquired lock "refresh_cache-88c4e366-b765-47a6-96bf-f7677f2ce67c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Oct 14 06:19:31 localhost nova_compute[297686]: 2025-10-14 10:19:31.753 2 DEBUG nova.network.neutron [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] [instance: 
88c4e366-b765-47a6-96bf-f7677f2ce67c] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Oct 14 06:19:31 localhost nova_compute[297686]: 2025-10-14 10:19:31.753 2 DEBUG nova.objects.instance [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Lazy-loading 'info_cache' on Instance uuid 88c4e366-b765-47a6-96bf-f7677f2ce67c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Oct 14 06:19:32 localhost nova_compute[297686]: 2025-10-14 10:19:32.252 2 DEBUG nova.network.neutron [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] [instance: 88c4e366-b765-47a6-96bf-f7677f2ce67c] Updating instance_info_cache with network_info: [{"id": "3ec9b060-f43d-4698-9c76-6062c70911d5", "address": "fa:16:3e:84:5e:e5", "network": {"id": "7d0cd696-bdd7-4e70-9512-eb0d23640314", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.46", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "41187b090f3d4818a32baa37ce8a3991", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3ec9b060-f4", "ovs_interfaceid": "3ec9b060-f43d-4698-9c76-6062c70911d5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Oct 14 06:19:32 localhost 
nova_compute[297686]: 2025-10-14 10:19:32.267 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Releasing lock "refresh_cache-88c4e366-b765-47a6-96bf-f7677f2ce67c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Oct 14 06:19:32 localhost nova_compute[297686]: 2025-10-14 10:19:32.267 2 DEBUG nova.compute.manager [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] [instance: 88c4e366-b765-47a6-96bf-f7677f2ce67c] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Oct 14 06:19:32 localhost nova_compute[297686]: 2025-10-14 10:19:32.268 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:19:32 localhost nova_compute[297686]: 2025-10-14 10:19:32.269 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:19:32 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e190 e190: 6 total, 6 up, 6 in Oct 14 06:19:33 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Oct 14 06:19:33 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' Oct 14 06:19:33 localhost nova_compute[297686]: 2025-10-14 10:19:33.828 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks 
/usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:19:34 localhost nova_compute[297686]: 2025-10-14 10:19:34.255 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:19:34 localhost nova_compute[297686]: 2025-10-14 10:19:34.256 2 DEBUG nova.compute.manager [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Oct 14 06:19:34 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e190 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104 Oct 14 06:19:34 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-1069349725", "format": "json"} : dispatch Oct 14 06:19:34 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-1069349725", "caps": ["mds", "allow rw path=/volumes/_nogroup/ce754065-58e4-49a9-a603-4440ef4b311b/064df9c5-e20e-4319-95d4-91217c164f07", "osd", "allow rw pool=manila_data namespace=fsvolumens_ce754065-58e4-49a9-a603-4440ef4b311b", "mon", "allow r"], "format": "json"} : dispatch Oct 14 06:19:34 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-1069349725", "caps": ["mds", "allow rw path=/volumes/_nogroup/ce754065-58e4-49a9-a603-4440ef4b311b/064df9c5-e20e-4319-95d4-91217c164f07", "osd", "allow rw pool=manila_data 
namespace=fsvolumens_ce754065-58e4-49a9-a603-4440ef4b311b", "mon", "allow r"], "format": "json"}]': finished Oct 14 06:19:34 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' Oct 14 06:19:34 localhost nova_compute[297686]: 2025-10-14 10:19:34.861 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:19:34 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e191 e191: 6 total, 6 up, 6 in Oct 14 06:19:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad. Oct 14 06:19:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec. Oct 14 06:19:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29. Oct 14 06:19:35 localhost neutron_sriov_agent[264974]: 2025-10-14 10:19:35.240 2 INFO neutron.agent.securitygroups_rpc [None req-2474946b-4b63-4e01-8a5d-90b033b7cb5e bbbcd088abe94518b01a8b1085998690 2cffabfb0ecf4b5d91a7a63dd17a370a - - default default] Security group member updated ['aef71ad5-f79f-4ece-9506-eb534e5871f7']#033[00m Oct 14 06:19:35 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:19:35.299 271987 INFO neutron.agent.linux.ip_lib [None req-4711a3ee-3ec9-4607-bfdd-2d75b93de988 - - - - - -] Device tap62dc069b-a0 cannot be used as it has no MAC address#033[00m Oct 14 06:19:35 localhost podman[336559]: 2025-10-14 10:19:35.317977946 +0000 UTC m=+0.091079722 container health_status fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=iscsid, io.buildah.version=1.41.3, tcib_managed=true, 
org.label-schema.vendor=CentOS, config_id=iscsid, org.label-schema.build-date=20251009, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image) Oct 14 06:19:35 localhost nova_compute[297686]: 2025-10-14 10:19:35.324 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:19:35 localhost kernel: device tap62dc069b-a0 entered promiscuous mode Oct 14 06:19:35 localhost NetworkManager[5977]: [1760437175.3355] manager: (tap62dc069b-a0): new Generic device (/org/freedesktop/NetworkManager/Devices/47) Oct 14 06:19:35 localhost ovn_controller[157396]: 2025-10-14T10:19:35Z|00280|binding|INFO|Claiming lport 62dc069b-a018-41ee-acc3-ee816b018cdf for this chassis. 
Oct 14 06:19:35 localhost ovn_controller[157396]: 2025-10-14T10:19:35Z|00281|binding|INFO|62dc069b-a018-41ee-acc3-ee816b018cdf: Claiming unknown Oct 14 06:19:35 localhost nova_compute[297686]: 2025-10-14 10:19:35.336 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:19:35 localhost systemd-udevd[336607]: Network interface NamePolicy= disabled on kernel command line. Oct 14 06:19:35 localhost ovn_metadata_agent[163050]: 2025-10-14 10:19:35.346 163055 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005486733.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpe1893814-19d7-5b97-af05-19e2aafa5382-ac6dfafe-fe19-467d-95fd-e237a973e4b9', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ac6dfafe-fe19-467d-95fd-e237a973e4b9', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2cffabfb0ecf4b5d91a7a63dd17a370a', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=857fac98-c81f-4d46-906d-419db1a00d28, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[], logical_port=62dc069b-a018-41ee-acc3-ee816b018cdf) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Oct 14 06:19:35 localhost ovn_metadata_agent[163050]: 2025-10-14 10:19:35.348 163055 INFO 
neutron.agent.ovn.metadata.agent [-] Port 62dc069b-a018-41ee-acc3-ee816b018cdf in datapath ac6dfafe-fe19-467d-95fd-e237a973e4b9 bound to our chassis#033[00m Oct 14 06:19:35 localhost ovn_metadata_agent[163050]: 2025-10-14 10:19:35.350 163055 DEBUG neutron.agent.ovn.metadata.agent [-] Port 24d7d966-47e9-4ef3-902a-e528c6493ee1 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Oct 14 06:19:35 localhost ovn_metadata_agent[163050]: 2025-10-14 10:19:35.351 163055 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network ac6dfafe-fe19-467d-95fd-e237a973e4b9, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Oct 14 06:19:35 localhost ovn_metadata_agent[163050]: 2025-10-14 10:19:35.351 163159 DEBUG oslo.privsep.daemon [-] privsep: reply[41ce9a34-33a1-4bf9-b8a9-6f2899bd84b3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Oct 14 06:19:35 localhost podman[336559]: 2025-10-14 10:19:35.359043538 +0000 UTC m=+0.132145384 container exec_died fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, config_id=iscsid, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS) Oct 14 06:19:35 localhost systemd[1]: tmp-crun.z7JBNC.mount: Deactivated successfully. Oct 14 06:19:35 localhost podman[336557]: 2025-10-14 10:19:35.37748895 +0000 UTC m=+0.157109247 container health_status aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', 
'--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Oct 14 06:19:35 localhost nova_compute[297686]: 2025-10-14 10:19:35.377 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:19:35 localhost systemd[1]: fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29.service: Deactivated successfully. Oct 14 06:19:35 localhost nova_compute[297686]: 2025-10-14 10:19:35.385 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:19:35 localhost ovn_controller[157396]: 2025-10-14T10:19:35Z|00282|binding|INFO|Setting lport 62dc069b-a018-41ee-acc3-ee816b018cdf ovn-installed in OVS Oct 14 06:19:35 localhost podman[336557]: 2025-10-14 10:19:35.388905303 +0000 UTC m=+0.168525580 container exec_died aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', 
'--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Oct 14 06:19:35 localhost ovn_controller[157396]: 2025-10-14T10:19:35Z|00283|binding|INFO|Setting lport 62dc069b-a018-41ee-acc3-ee816b018cdf up in Southbound Oct 14 06:19:35 localhost nova_compute[297686]: 2025-10-14 10:19:35.392 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:19:35 localhost systemd[1]: aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec.service: Deactivated successfully. 
Oct 14 06:19:35 localhost nova_compute[297686]: 2025-10-14 10:19:35.409 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:19:35 localhost nova_compute[297686]: 2025-10-14 10:19:35.437 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:19:35 localhost podman[336556]: 2025-10-14 10:19:35.391988959 +0000 UTC m=+0.172595077 container health_status 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_id=multipathd, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', 
'/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}) Oct 14 06:19:35 localhost podman[336556]: 2025-10-14 10:19:35.475209936 +0000 UTC m=+0.255816134 container exec_died 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251009, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', 
'/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible) Oct 14 06:19:35 localhost systemd[1]: 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad.service: Deactivated successfully. Oct 14 06:19:36 localhost podman[336677]: Oct 14 06:19:36 localhost podman[336677]: 2025-10-14 10:19:36.240758258 +0000 UTC m=+0.088765611 container create 27c87cadd07cb1f05939781c826e32a3eab02b94d0d1f3b0c977110aecf6f3c6 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ac6dfafe-fe19-467d-95fd-e237a973e4b9, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251009, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS) Oct 14 06:19:36 localhost systemd[1]: Started libpod-conmon-27c87cadd07cb1f05939781c826e32a3eab02b94d0d1f3b0c977110aecf6f3c6.scope. Oct 14 06:19:36 localhost podman[336677]: 2025-10-14 10:19:36.19822182 +0000 UTC m=+0.046229213 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Oct 14 06:19:36 localhost systemd[1]: Started libcrun container. 
Oct 14 06:19:36 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5b7c996a5afa2c891700c5c80c0349e9a0c45b0ccbd8a62b38228ec51031de20/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Oct 14 06:19:36 localhost podman[336677]: 2025-10-14 10:19:36.328117783 +0000 UTC m=+0.176125116 container init 27c87cadd07cb1f05939781c826e32a3eab02b94d0d1f3b0c977110aecf6f3c6 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ac6dfafe-fe19-467d-95fd-e237a973e4b9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0) Oct 14 06:19:36 localhost podman[336677]: 2025-10-14 10:19:36.336172443 +0000 UTC m=+0.184179786 container start 27c87cadd07cb1f05939781c826e32a3eab02b94d0d1f3b0c977110aecf6f3c6 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ac6dfafe-fe19-467d-95fd-e237a973e4b9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251009, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5) Oct 14 06:19:36 localhost dnsmasq[336695]: started, version 2.85 cachesize 150 Oct 14 06:19:36 localhost dnsmasq[336695]: DNS service limited to local subnets Oct 14 06:19:36 localhost dnsmasq[336695]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Oct 14 06:19:36 localhost dnsmasq[336695]: warning: no upstream servers 
configured Oct 14 06:19:36 localhost dnsmasq-dhcp[336695]: DHCP, static leases only on 10.100.0.0, lease time 1d Oct 14 06:19:36 localhost dnsmasq[336695]: read /var/lib/neutron/dhcp/ac6dfafe-fe19-467d-95fd-e237a973e4b9/addn_hosts - 0 addresses Oct 14 06:19:36 localhost dnsmasq-dhcp[336695]: read /var/lib/neutron/dhcp/ac6dfafe-fe19-467d-95fd-e237a973e4b9/host Oct 14 06:19:36 localhost dnsmasq-dhcp[336695]: read /var/lib/neutron/dhcp/ac6dfafe-fe19-467d-95fd-e237a973e4b9/opts Oct 14 06:19:36 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:19:36.580 271987 INFO neutron.agent.dhcp.agent [None req-d8050d7a-66c4-4168-b855-2ee4b99400e4 - - - - - -] DHCP configuration for ports {'508c587a-3455-46b0-9a18-33b955cfbeef', '3c920156-fad7-49d0-b0fe-2d7f43047b0e'} is completed#033[00m Oct 14 06:19:36 localhost dnsmasq[336695]: exiting on receipt of SIGTERM Oct 14 06:19:36 localhost podman[336713]: 2025-10-14 10:19:36.656169894 +0000 UTC m=+0.049044000 container kill 27c87cadd07cb1f05939781c826e32a3eab02b94d0d1f3b0c977110aecf6f3c6 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ac6dfafe-fe19-467d-95fd-e237a973e4b9, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true) Oct 14 06:19:36 localhost systemd[1]: libpod-27c87cadd07cb1f05939781c826e32a3eab02b94d0d1f3b0c977110aecf6f3c6.scope: Deactivated successfully. 
Oct 14 06:19:36 localhost podman[336726]: 2025-10-14 10:19:36.718586127 +0000 UTC m=+0.049973149 container died 27c87cadd07cb1f05939781c826e32a3eab02b94d0d1f3b0c977110aecf6f3c6 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ac6dfafe-fe19-467d-95fd-e237a973e4b9, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2) Oct 14 06:19:36 localhost podman[336726]: 2025-10-14 10:19:36.748260536 +0000 UTC m=+0.079647518 container cleanup 27c87cadd07cb1f05939781c826e32a3eab02b94d0d1f3b0c977110aecf6f3c6 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ac6dfafe-fe19-467d-95fd-e237a973e4b9, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team) Oct 14 06:19:36 localhost systemd[1]: libpod-conmon-27c87cadd07cb1f05939781c826e32a3eab02b94d0d1f3b0c977110aecf6f3c6.scope: Deactivated successfully. 
Oct 14 06:19:36 localhost podman[336733]: 2025-10-14 10:19:36.783097155 +0000 UTC m=+0.105677903 container remove 27c87cadd07cb1f05939781c826e32a3eab02b94d0d1f3b0c977110aecf6f3c6 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ac6dfafe-fe19-467d-95fd-e237a973e4b9, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0) Oct 14 06:19:36 localhost ovn_metadata_agent[163050]: 2025-10-14 10:19:36.922 163055 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f2:35:97 10.100.0.18 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.18/28 10.100.0.2/28', 'neutron:device_id': 'ovnmeta-ac6dfafe-fe19-467d-95fd-e237a973e4b9', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ac6dfafe-fe19-467d-95fd-e237a973e4b9', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2cffabfb0ecf4b5d91a7a63dd17a370a', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=857fac98-c81f-4d46-906d-419db1a00d28, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=3c920156-fad7-49d0-b0fe-2d7f43047b0e) 
old=Port_Binding(mac=['fa:16:3e:f2:35:97 10.100.0.2'], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-ac6dfafe-fe19-467d-95fd-e237a973e4b9', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ac6dfafe-fe19-467d-95fd-e237a973e4b9', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2cffabfb0ecf4b5d91a7a63dd17a370a', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Oct 14 06:19:36 localhost ovn_metadata_agent[163050]: 2025-10-14 10:19:36.924 163055 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 3c920156-fad7-49d0-b0fe-2d7f43047b0e in datapath ac6dfafe-fe19-467d-95fd-e237a973e4b9 updated#033[00m Oct 14 06:19:36 localhost ovn_metadata_agent[163050]: 2025-10-14 10:19:36.928 163055 DEBUG neutron.agent.ovn.metadata.agent [-] Port 24d7d966-47e9-4ef3-902a-e528c6493ee1 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Oct 14 06:19:36 localhost ovn_metadata_agent[163050]: 2025-10-14 10:19:36.928 163055 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network ac6dfafe-fe19-467d-95fd-e237a973e4b9, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Oct 14 06:19:36 localhost ovn_metadata_agent[163050]: 2025-10-14 10:19:36.929 163159 DEBUG oslo.privsep.daemon [-] privsep: reply[cd541a1a-015f-411d-9027-6d981c165834]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Oct 14 06:19:37 localhost systemd[1]: 
var-lib-containers-storage-overlay-5b7c996a5afa2c891700c5c80c0349e9a0c45b0ccbd8a62b38228ec51031de20-merged.mount: Deactivated successfully. Oct 14 06:19:37 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-27c87cadd07cb1f05939781c826e32a3eab02b94d0d1f3b0c977110aecf6f3c6-userdata-shm.mount: Deactivated successfully. Oct 14 06:19:37 localhost nova_compute[297686]: 2025-10-14 10:19:37.255 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:19:37 localhost nova_compute[297686]: 2025-10-14 10:19:37.255 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:19:37 localhost nova_compute[297686]: 2025-10-14 10:19:37.255 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:19:37 localhost nova_compute[297686]: 2025-10-14 10:19:37.286 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Oct 14 06:19:37 localhost nova_compute[297686]: 2025-10-14 10:19:37.286 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner 
/usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Oct 14 06:19:37 localhost nova_compute[297686]: 2025-10-14 10:19:37.286 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Oct 14 06:19:37 localhost nova_compute[297686]: 2025-10-14 10:19:37.286 2 DEBUG nova.compute.resource_tracker [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Auditing locally available compute resources for np0005486733.localdomain (node: np0005486733.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Oct 14 06:19:37 localhost nova_compute[297686]: 2025-10-14 10:19:37.287 2 DEBUG oslo_concurrency.processutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Oct 14 06:19:37 localhost neutron_sriov_agent[264974]: 2025-10-14 10:19:37.466 2 INFO neutron.agent.securitygroups_rpc [None req-64d9ba1f-f4f6-4f94-b760-17fb755a2c7e bbbcd088abe94518b01a8b1085998690 2cffabfb0ecf4b5d91a7a63dd17a370a - - default default] Security group member updated ['aef71ad5-f79f-4ece-9506-eb534e5871f7', '4125c890-e798-4a40-8152-43496831500b']#033[00m Oct 14 06:19:37 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Oct 14 06:19:37 localhost ceph-mon[317114]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.108:0/3763975399' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Oct 14 06:19:37 localhost nova_compute[297686]: 2025-10-14 10:19:37.710 2 DEBUG oslo_concurrency.processutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.423s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Oct 14 06:19:37 localhost nova_compute[297686]: 2025-10-14 10:19:37.779 2 DEBUG nova.virt.libvirt.driver [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Oct 14 06:19:37 localhost nova_compute[297686]: 2025-10-14 10:19:37.780 2 DEBUG nova.virt.libvirt.driver [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Oct 14 06:19:37 localhost neutron_sriov_agent[264974]: 2025-10-14 10:19:37.789 2 INFO neutron.agent.securitygroups_rpc [None req-5640d12e-d220-4be6-9c7b-bac66bab8d2c bbbcd088abe94518b01a8b1085998690 2cffabfb0ecf4b5d91a7a63dd17a370a - - default default] Security group member updated ['4125c890-e798-4a40-8152-43496831500b']#033[00m Oct 14 06:19:37 localhost nova_compute[297686]: 2025-10-14 10:19:37.963 2 WARNING nova.virt.libvirt.driver [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Oct 14 06:19:37 localhost nova_compute[297686]: 2025-10-14 10:19:37.964 2 DEBUG nova.compute.resource_tracker [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Hypervisor/Node resource view: name=np0005486733.localdomain free_ram=11215MB free_disk=41.83695602416992GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": 
"1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Oct 14 06:19:37 localhost nova_compute[297686]: 2025-10-14 10:19:37.965 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Oct 14 06:19:37 localhost nova_compute[297686]: 2025-10-14 10:19:37.965 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Oct 14 06:19:38 localhost podman[336828]: Oct 14 06:19:38 localhost nova_compute[297686]: 2025-10-14 10:19:38.150 2 DEBUG nova.compute.resource_tracker [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Instance 88c4e366-b765-47a6-96bf-f7677f2ce67c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Oct 14 06:19:38 localhost nova_compute[297686]: 2025-10-14 10:19:38.151 2 DEBUG nova.compute.resource_tracker [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Oct 14 06:19:38 localhost podman[336828]: 2025-10-14 10:19:38.151921831 +0000 UTC m=+0.083593860 container create 14c93e5ab2289ad53e5a3dddc224903e65ce44d78e241073187792df6c0645f7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ac6dfafe-fe19-467d-95fd-e237a973e4b9, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image) Oct 14 06:19:38 localhost nova_compute[297686]: 2025-10-14 10:19:38.152 2 DEBUG nova.compute.resource_tracker [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Final resource view: name=np0005486733.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Oct 14 06:19:38 localhost systemd[1]: Started libpod-conmon-14c93e5ab2289ad53e5a3dddc224903e65ce44d78e241073187792df6c0645f7.scope. 
Oct 14 06:19:38 localhost nova_compute[297686]: 2025-10-14 10:19:38.207 2 DEBUG oslo_concurrency.processutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Oct 14 06:19:38 localhost systemd[1]: Started libcrun container. Oct 14 06:19:38 localhost podman[336828]: 2025-10-14 10:19:38.113154481 +0000 UTC m=+0.044826590 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Oct 14 06:19:38 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3cd7124ef0c86dc6625052a4dd82781cf5ca9b0ba1c2a889237e4875d92637b4/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Oct 14 06:19:38 localhost podman[336828]: 2025-10-14 10:19:38.224206371 +0000 UTC m=+0.155878400 container init 14c93e5ab2289ad53e5a3dddc224903e65ce44d78e241073187792df6c0645f7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ac6dfafe-fe19-467d-95fd-e237a973e4b9, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.build-date=20251009, io.buildah.version=1.41.3) Oct 14 06:19:38 localhost podman[336828]: 2025-10-14 10:19:38.233151477 +0000 UTC m=+0.164823486 container start 14c93e5ab2289ad53e5a3dddc224903e65ce44d78e241073187792df6c0645f7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ac6dfafe-fe19-467d-95fd-e237a973e4b9, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, maintainer=OpenStack Kubernetes 
Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251009, tcib_managed=true) Oct 14 06:19:38 localhost dnsmasq[336848]: started, version 2.85 cachesize 150 Oct 14 06:19:38 localhost dnsmasq[336848]: DNS service limited to local subnets Oct 14 06:19:38 localhost dnsmasq[336848]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Oct 14 06:19:38 localhost dnsmasq[336848]: warning: no upstream servers configured Oct 14 06:19:38 localhost dnsmasq-dhcp[336848]: DHCP, static leases only on 10.100.0.16, lease time 1d Oct 14 06:19:38 localhost dnsmasq-dhcp[336848]: DHCP, static leases only on 10.100.0.0, lease time 1d Oct 14 06:19:38 localhost dnsmasq[336848]: read /var/lib/neutron/dhcp/ac6dfafe-fe19-467d-95fd-e237a973e4b9/addn_hosts - 1 addresses Oct 14 06:19:38 localhost dnsmasq-dhcp[336848]: read /var/lib/neutron/dhcp/ac6dfafe-fe19-467d-95fd-e237a973e4b9/host Oct 14 06:19:38 localhost dnsmasq-dhcp[336848]: read /var/lib/neutron/dhcp/ac6dfafe-fe19-467d-95fd-e237a973e4b9/opts Oct 14 06:19:38 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:19:38.295 271987 INFO neutron.agent.dhcp.agent [None req-08947a0b-b909-4cff-82fa-f71c1f716c4a - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-10-14T10:19:35Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=20573bb3-badd-4208-bf0f-fe7abeb7b34f, ip_allocation=immediate, mac_address=fa:16:3e:90:f9:3b, name=tempest-PortsTestJSON-1908830147, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-10-14T10:19:02Z, description=, dns_domain=, 
id=ac6dfafe-fe19-467d-95fd-e237a973e4b9, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-PortsTestJSON-test-network-107039004, port_security_enabled=True, project_id=2cffabfb0ecf4b5d91a7a63dd17a370a, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=25150, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2791, status=ACTIVE, subnets=['2d741058-f669-40c1-aef1-31c71474557c'], tags=[], tenant_id=2cffabfb0ecf4b5d91a7a63dd17a370a, updated_at=2025-10-14T10:19:33Z, vlan_transparent=None, network_id=ac6dfafe-fe19-467d-95fd-e237a973e4b9, port_security_enabled=True, project_id=2cffabfb0ecf4b5d91a7a63dd17a370a, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['aef71ad5-f79f-4ece-9506-eb534e5871f7'], standard_attr_id=2978, status=DOWN, tags=[], tenant_id=2cffabfb0ecf4b5d91a7a63dd17a370a, updated_at=2025-10-14T10:19:35Z on network ac6dfafe-fe19-467d-95fd-e237a973e4b9#033[00m Oct 14 06:19:38 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:19:38.517 271987 INFO neutron.agent.dhcp.agent [None req-c26060de-6bf6-42af-b1e1-f32c7a42d087 - - - - - -] DHCP configuration for ports {'508c587a-3455-46b0-9a18-33b955cfbeef', '20573bb3-badd-4208-bf0f-fe7abeb7b34f', '62dc069b-a018-41ee-acc3-ee816b018cdf', '3c920156-fad7-49d0-b0fe-2d7f43047b0e'} is completed#033[00m Oct 14 06:19:38 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-1069349725", "format": "json"} : dispatch Oct 14 06:19:38 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-1069349725"} : dispatch Oct 14 06:19:38 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' 
cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-1069349725"}]': finished Oct 14 06:19:38 localhost dnsmasq[336848]: read /var/lib/neutron/dhcp/ac6dfafe-fe19-467d-95fd-e237a973e4b9/addn_hosts - 1 addresses Oct 14 06:19:38 localhost dnsmasq-dhcp[336848]: read /var/lib/neutron/dhcp/ac6dfafe-fe19-467d-95fd-e237a973e4b9/host Oct 14 06:19:38 localhost dnsmasq-dhcp[336848]: read /var/lib/neutron/dhcp/ac6dfafe-fe19-467d-95fd-e237a973e4b9/opts Oct 14 06:19:38 localhost podman[336885]: 2025-10-14 10:19:38.564226882 +0000 UTC m=+0.060742143 container kill 14c93e5ab2289ad53e5a3dddc224903e65ce44d78e241073187792df6c0645f7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ac6dfafe-fe19-467d-95fd-e237a973e4b9, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, maintainer=OpenStack Kubernetes Operator team) Oct 14 06:19:38 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Oct 14 06:19:38 localhost ceph-mon[317114]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.108:0/4227843787' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Oct 14 06:19:38 localhost nova_compute[297686]: 2025-10-14 10:19:38.666 2 DEBUG oslo_concurrency.processutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.459s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Oct 14 06:19:38 localhost nova_compute[297686]: 2025-10-14 10:19:38.670 2 DEBUG nova.compute.provider_tree [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Inventory has not changed in ProviderTree for provider: 18c24273-aca2-4f08-be57-3188d558235e update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Oct 14 06:19:38 localhost nova_compute[297686]: 2025-10-14 10:19:38.688 2 DEBUG nova.scheduler.client.report [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Inventory has not changed for provider 18c24273-aca2-4f08-be57-3188d558235e based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Oct 14 06:19:38 localhost nova_compute[297686]: 2025-10-14 10:19:38.707 2 DEBUG nova.compute.resource_tracker [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Compute_service record updated for np0005486733.localdomain:np0005486733.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Oct 14 06:19:38 localhost nova_compute[297686]: 2025-10-14 10:19:38.707 2 DEBUG oslo_concurrency.lockutils [None 
req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.742s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Oct 14 06:19:38 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:19:38.733 271987 INFO neutron.agent.dhcp.agent [None req-e0c36d91-e07b-4a12-a0b6-9070f74451a0 - - - - - -] DHCP configuration for ports {'20573bb3-badd-4208-bf0f-fe7abeb7b34f'} is completed#033[00m Oct 14 06:19:38 localhost openstack_network_exporter[250374]: ERROR 10:19:38 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Oct 14 06:19:38 localhost openstack_network_exporter[250374]: ERROR 10:19:38 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Oct 14 06:19:38 localhost openstack_network_exporter[250374]: ERROR 10:19:38 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Oct 14 06:19:38 localhost openstack_network_exporter[250374]: ERROR 10:19:38 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Oct 14 06:19:38 localhost openstack_network_exporter[250374]: Oct 14 06:19:38 localhost openstack_network_exporter[250374]: ERROR 10:19:38 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Oct 14 06:19:38 localhost openstack_network_exporter[250374]: Oct 14 06:19:38 localhost dnsmasq[336848]: exiting on receipt of SIGTERM Oct 14 06:19:38 localhost podman[336924]: 2025-10-14 10:19:38.928931957 +0000 UTC m=+0.051742313 container kill 14c93e5ab2289ad53e5a3dddc224903e65ce44d78e241073187792df6c0645f7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ac6dfafe-fe19-467d-95fd-e237a973e4b9, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.vendor=CentOS, org.label-schema.build-date=20251009, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image) Oct 14 06:19:38 localhost systemd[1]: libpod-14c93e5ab2289ad53e5a3dddc224903e65ce44d78e241073187792df6c0645f7.scope: Deactivated successfully. Oct 14 06:19:38 localhost podman[336937]: 2025-10-14 10:19:38.999768732 +0000 UTC m=+0.053995484 container died 14c93e5ab2289ad53e5a3dddc224903e65ce44d78e241073187792df6c0645f7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ac6dfafe-fe19-467d-95fd-e237a973e4b9, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true) Oct 14 06:19:39 localhost podman[336937]: 2025-10-14 10:19:39.031916197 +0000 UTC m=+0.086142909 container cleanup 14c93e5ab2289ad53e5a3dddc224903e65ce44d78e241073187792df6c0645f7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ac6dfafe-fe19-467d-95fd-e237a973e4b9, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0) Oct 14 06:19:39 localhost systemd[1]: libpod-conmon-14c93e5ab2289ad53e5a3dddc224903e65ce44d78e241073187792df6c0645f7.scope: Deactivated successfully. 
Oct 14 06:19:39 localhost podman[336938]: 2025-10-14 10:19:39.082831504 +0000 UTC m=+0.133167015 container remove 14c93e5ab2289ad53e5a3dddc224903e65ce44d78e241073187792df6c0645f7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ac6dfafe-fe19-467d-95fd-e237a973e4b9, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, io.buildah.version=1.41.3, tcib_managed=true) Oct 14 06:19:39 localhost systemd[1]: var-lib-containers-storage-overlay-3cd7124ef0c86dc6625052a4dd82781cf5ca9b0ba1c2a889237e4875d92637b4-merged.mount: Deactivated successfully. Oct 14 06:19:39 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-14c93e5ab2289ad53e5a3dddc224903e65ce44d78e241073187792df6c0645f7-userdata-shm.mount: Deactivated successfully. 
Oct 14 06:19:39 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e191 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104 Oct 14 06:19:39 localhost nova_compute[297686]: 2025-10-14 10:19:39.707 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:19:39 localhost nova_compute[297686]: 2025-10-14 10:19:39.899 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:19:39 localhost podman[337016]: Oct 14 06:19:39 localhost podman[337016]: 2025-10-14 10:19:39.961325414 +0000 UTC m=+0.124692363 container create 29a4410411cb7e04cd8917eb4371562950ec484ce73b5e4eee67ee867daa4783 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ac6dfafe-fe19-467d-95fd-e237a973e4b9, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.vendor=CentOS) Oct 14 06:19:39 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e192 e192: 6 total, 6 up, 6 in Oct 14 06:19:40 localhost systemd[1]: Started libpod-conmon-29a4410411cb7e04cd8917eb4371562950ec484ce73b5e4eee67ee867daa4783.scope. Oct 14 06:19:40 localhost podman[337016]: 2025-10-14 10:19:39.919372965 +0000 UTC m=+0.082739844 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Oct 14 06:19:40 localhost systemd[1]: Started libcrun container. 
Oct 14 06:19:40 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e374079a7ff21b92df3cbc0d52622003af620634d77f1306ac8f6d77a7509436/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Oct 14 06:19:40 localhost podman[337016]: 2025-10-14 10:19:40.041457325 +0000 UTC m=+0.204824224 container init 29a4410411cb7e04cd8917eb4371562950ec484ce73b5e4eee67ee867daa4783 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ac6dfafe-fe19-467d-95fd-e237a973e4b9, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team) Oct 14 06:19:40 localhost podman[337016]: 2025-10-14 10:19:40.050604319 +0000 UTC m=+0.213971228 container start 29a4410411cb7e04cd8917eb4371562950ec484ce73b5e4eee67ee867daa4783 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ac6dfafe-fe19-467d-95fd-e237a973e4b9, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251009) Oct 14 06:19:40 localhost dnsmasq[337035]: started, version 2.85 cachesize 150 Oct 14 06:19:40 localhost dnsmasq[337035]: DNS service limited to local subnets Oct 14 06:19:40 localhost dnsmasq[337035]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Oct 14 06:19:40 localhost dnsmasq[337035]: warning: no upstream servers 
configured Oct 14 06:19:40 localhost dnsmasq-dhcp[337035]: DHCP, static leases only on 10.100.0.16, lease time 1d Oct 14 06:19:40 localhost dnsmasq[337035]: read /var/lib/neutron/dhcp/ac6dfafe-fe19-467d-95fd-e237a973e4b9/addn_hosts - 0 addresses Oct 14 06:19:40 localhost dnsmasq-dhcp[337035]: read /var/lib/neutron/dhcp/ac6dfafe-fe19-467d-95fd-e237a973e4b9/host Oct 14 06:19:40 localhost dnsmasq-dhcp[337035]: read /var/lib/neutron/dhcp/ac6dfafe-fe19-467d-95fd-e237a973e4b9/opts Oct 14 06:19:40 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:19:40.322 271987 INFO neutron.agent.dhcp.agent [None req-cda71006-4334-44f1-a006-80bc5a15e2fe - - - - - -] DHCP configuration for ports {'62dc069b-a018-41ee-acc3-ee816b018cdf', '3c920156-fad7-49d0-b0fe-2d7f43047b0e', '508c587a-3455-46b0-9a18-33b955cfbeef'} is completed#033[00m Oct 14 06:19:40 localhost nova_compute[297686]: 2025-10-14 10:19:40.386 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:19:40 localhost dnsmasq[337035]: exiting on receipt of SIGTERM Oct 14 06:19:40 localhost podman[337053]: 2025-10-14 10:19:40.388562646 +0000 UTC m=+0.063259049 container kill 29a4410411cb7e04cd8917eb4371562950ec484ce73b5e4eee67ee867daa4783 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ac6dfafe-fe19-467d-95fd-e237a973e4b9, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, io.buildah.version=1.41.3) Oct 14 06:19:40 localhost systemd[1]: libpod-29a4410411cb7e04cd8917eb4371562950ec484ce73b5e4eee67ee867daa4783.scope: Deactivated successfully. 
Oct 14 06:19:40 localhost podman[337067]: 2025-10-14 10:19:40.444800078 +0000 UTC m=+0.037903734 container died 29a4410411cb7e04cd8917eb4371562950ec484ce73b5e4eee67ee867daa4783 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ac6dfafe-fe19-467d-95fd-e237a973e4b9, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image) Oct 14 06:19:40 localhost systemd[1]: tmp-crun.65zQwC.mount: Deactivated successfully. Oct 14 06:19:40 localhost podman[337067]: 2025-10-14 10:19:40.546142648 +0000 UTC m=+0.139246314 container remove 29a4410411cb7e04cd8917eb4371562950ec484ce73b5e4eee67ee867daa4783 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ac6dfafe-fe19-467d-95fd-e237a973e4b9, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image) Oct 14 06:19:40 localhost systemd[1]: libpod-conmon-29a4410411cb7e04cd8917eb4371562950ec484ce73b5e4eee67ee867daa4783.scope: Deactivated successfully. 
Oct 14 06:19:40 localhost neutron_sriov_agent[264974]: 2025-10-14 10:19:40.846 2 INFO neutron.agent.securitygroups_rpc [None req-07371dd3-5e9f-44d8-b05e-23650af26a99 bbbcd088abe94518b01a8b1085998690 2cffabfb0ecf4b5d91a7a63dd17a370a - - default default] Security group member updated ['20ea9413-8d37-4d17-a266-e02ebbcd4097']#033[00m Oct 14 06:19:41 localhost systemd[1]: var-lib-containers-storage-overlay-e374079a7ff21b92df3cbc0d52622003af620634d77f1306ac8f6d77a7509436-merged.mount: Deactivated successfully. Oct 14 06:19:41 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-29a4410411cb7e04cd8917eb4371562950ec484ce73b5e4eee67ee867daa4783-userdata-shm.mount: Deactivated successfully. Oct 14 06:19:41 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-1069349725", "format": "json"} : dispatch Oct 14 06:19:41 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-1069349725", "caps": ["mds", "allow rw path=/volumes/_nogroup/ce754065-58e4-49a9-a603-4440ef4b311b/064df9c5-e20e-4319-95d4-91217c164f07", "osd", "allow rw pool=manila_data namespace=fsvolumens_ce754065-58e4-49a9-a603-4440ef4b311b", "mon", "allow r"], "format": "json"} : dispatch Oct 14 06:19:41 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-1069349725", "caps": ["mds", "allow rw path=/volumes/_nogroup/ce754065-58e4-49a9-a603-4440ef4b311b/064df9c5-e20e-4319-95d4-91217c164f07", "osd", "allow rw pool=manila_data namespace=fsvolumens_ce754065-58e4-49a9-a603-4440ef4b311b", "mon", "allow r"], "format": "json"}]': finished Oct 14 06:19:41 localhost podman[337144]: Oct 14 06:19:41 localhost podman[337144]: 2025-10-14 10:19:41.821056406 +0000 
UTC m=+0.080956129 container create 3004032bc4ef5dc018ef63bd77996f12dbb83b422a3000be150d40606ce12a59 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ac6dfafe-fe19-467d-95fd-e237a973e4b9, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251009, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS) Oct 14 06:19:41 localhost systemd[1]: Started libpod-conmon-3004032bc4ef5dc018ef63bd77996f12dbb83b422a3000be150d40606ce12a59.scope. Oct 14 06:19:41 localhost systemd[1]: Started libcrun container. Oct 14 06:19:41 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b916e61bf07e1dda854833e02350ddcc68b28d0c502932428543b421d1d5e1b9/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Oct 14 06:19:41 localhost podman[337144]: 2025-10-14 10:19:41.779485368 +0000 UTC m=+0.039385171 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Oct 14 06:19:41 localhost podman[337144]: 2025-10-14 10:19:41.884212501 +0000 UTC m=+0.144112224 container init 3004032bc4ef5dc018ef63bd77996f12dbb83b422a3000be150d40606ce12a59 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ac6dfafe-fe19-467d-95fd-e237a973e4b9, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3) Oct 14 06:19:41 localhost podman[337144]: 2025-10-14 10:19:41.894773809 +0000 UTC m=+0.154673542 container start 
3004032bc4ef5dc018ef63bd77996f12dbb83b422a3000be150d40606ce12a59 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ac6dfafe-fe19-467d-95fd-e237a973e4b9, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251009) Oct 14 06:19:41 localhost dnsmasq[337162]: started, version 2.85 cachesize 150 Oct 14 06:19:41 localhost dnsmasq[337162]: DNS service limited to local subnets Oct 14 06:19:41 localhost dnsmasq[337162]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Oct 14 06:19:41 localhost dnsmasq[337162]: warning: no upstream servers configured Oct 14 06:19:41 localhost dnsmasq-dhcp[337162]: DHCP, static leases only on 10.100.0.0, lease time 1d Oct 14 06:19:41 localhost dnsmasq-dhcp[337162]: DHCP, static leases only on 10.100.0.16, lease time 1d Oct 14 06:19:41 localhost dnsmasq[337162]: read /var/lib/neutron/dhcp/ac6dfafe-fe19-467d-95fd-e237a973e4b9/addn_hosts - 0 addresses Oct 14 06:19:41 localhost dnsmasq-dhcp[337162]: read /var/lib/neutron/dhcp/ac6dfafe-fe19-467d-95fd-e237a973e4b9/host Oct 14 06:19:41 localhost dnsmasq-dhcp[337162]: read /var/lib/neutron/dhcp/ac6dfafe-fe19-467d-95fd-e237a973e4b9/opts Oct 14 06:19:42 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:19:42.205 271987 INFO neutron.agent.dhcp.agent [None req-08247260-bf94-4a0f-b7cb-048ef981c550 - - - - - -] DHCP configuration for ports {'508c587a-3455-46b0-9a18-33b955cfbeef', '62dc069b-a018-41ee-acc3-ee816b018cdf', '3c920156-fad7-49d0-b0fe-2d7f43047b0e'} is completed#033[00m Oct 14 06:19:42 localhost dnsmasq[337162]: exiting on receipt of SIGTERM Oct 14 06:19:42 
localhost podman[337180]: 2025-10-14 10:19:42.244130469 +0000 UTC m=+0.061769294 container kill 3004032bc4ef5dc018ef63bd77996f12dbb83b422a3000be150d40606ce12a59 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ac6dfafe-fe19-467d-95fd-e237a973e4b9, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, io.buildah.version=1.41.3, org.label-schema.license=GPLv2) Oct 14 06:19:42 localhost systemd[1]: tmp-crun.SmINC1.mount: Deactivated successfully. Oct 14 06:19:42 localhost systemd[1]: libpod-3004032bc4ef5dc018ef63bd77996f12dbb83b422a3000be150d40606ce12a59.scope: Deactivated successfully. Oct 14 06:19:42 localhost podman[337194]: 2025-10-14 10:19:42.312976931 +0000 UTC m=+0.054220060 container died 3004032bc4ef5dc018ef63bd77996f12dbb83b422a3000be150d40606ce12a59 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ac6dfafe-fe19-467d-95fd-e237a973e4b9, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS) Oct 14 06:19:42 localhost podman[337194]: 2025-10-14 10:19:42.339342268 +0000 UTC m=+0.080585337 container cleanup 3004032bc4ef5dc018ef63bd77996f12dbb83b422a3000be150d40606ce12a59 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ac6dfafe-fe19-467d-95fd-e237a973e4b9, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251009, tcib_managed=true, 
tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image) Oct 14 06:19:42 localhost systemd[1]: libpod-conmon-3004032bc4ef5dc018ef63bd77996f12dbb83b422a3000be150d40606ce12a59.scope: Deactivated successfully. Oct 14 06:19:42 localhost podman[337196]: 2025-10-14 10:19:42.400904015 +0000 UTC m=+0.134799586 container remove 3004032bc4ef5dc018ef63bd77996f12dbb83b422a3000be150d40606ce12a59 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ac6dfafe-fe19-467d-95fd-e237a973e4b9, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251009, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5) Oct 14 06:19:42 localhost ovn_metadata_agent[163050]: 2025-10-14 10:19:42.512 163055 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f2:35:97 10.100.0.18 10.100.0.2 10.100.0.34'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.18/28 10.100.0.2/28 10.100.0.34/28', 'neutron:device_id': 'ovnmeta-ac6dfafe-fe19-467d-95fd-e237a973e4b9', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ac6dfafe-fe19-467d-95fd-e237a973e4b9', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2cffabfb0ecf4b5d91a7a63dd17a370a', 'neutron:revision_number': '6', 
'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=857fac98-c81f-4d46-906d-419db1a00d28, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=3c920156-fad7-49d0-b0fe-2d7f43047b0e) old=Port_Binding(mac=['fa:16:3e:f2:35:97 10.100.0.18 10.100.0.2'], external_ids={'neutron:cidrs': '10.100.0.18/28 10.100.0.2/28', 'neutron:device_id': 'ovnmeta-ac6dfafe-fe19-467d-95fd-e237a973e4b9', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ac6dfafe-fe19-467d-95fd-e237a973e4b9', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2cffabfb0ecf4b5d91a7a63dd17a370a', 'neutron:revision_number': '5', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Oct 14 06:19:42 localhost ovn_metadata_agent[163050]: 2025-10-14 10:19:42.516 163055 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 3c920156-fad7-49d0-b0fe-2d7f43047b0e in datapath ac6dfafe-fe19-467d-95fd-e237a973e4b9 updated#033[00m Oct 14 06:19:42 localhost ovn_metadata_agent[163050]: 2025-10-14 10:19:42.518 163055 DEBUG neutron.agent.ovn.metadata.agent [-] Port 24d7d966-47e9-4ef3-902a-e528c6493ee1 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Oct 14 06:19:42 localhost ovn_metadata_agent[163050]: 2025-10-14 10:19:42.519 163055 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network ac6dfafe-fe19-467d-95fd-e237a973e4b9, tearing the namespace down if needed _get_provision_params 
/usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Oct 14 06:19:42 localhost ovn_metadata_agent[163050]: 2025-10-14 10:19:42.520 163159 DEBUG oslo.privsep.daemon [-] privsep: reply[d09fe2db-5851-4392-bb5f-83e25dd96e09]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Oct 14 06:19:42 localhost neutron_sriov_agent[264974]: 2025-10-14 10:19:42.889 2 INFO neutron.agent.securitygroups_rpc [None req-55447faf-c0cc-43a9-8209-8c9d04c1af5b bbbcd088abe94518b01a8b1085998690 2cffabfb0ecf4b5d91a7a63dd17a370a - - default default] Security group member updated ['20ea9413-8d37-4d17-a266-e02ebbcd4097', '78dc3557-4401-4e68-b797-8af5d01655e7', 'f475abc8-1ecb-46b5-aee7-314e00187e8e']#033[00m Oct 14 06:19:43 localhost neutron_sriov_agent[264974]: 2025-10-14 10:19:43.180 2 INFO neutron.agent.securitygroups_rpc [None req-43b9eac2-1b77-4acb-89ed-53198092cfa5 bbbcd088abe94518b01a8b1085998690 2cffabfb0ecf4b5d91a7a63dd17a370a - - default default] Security group member updated ['78dc3557-4401-4e68-b797-8af5d01655e7', 'f475abc8-1ecb-46b5-aee7-314e00187e8e']#033[00m Oct 14 06:19:43 localhost systemd[1]: var-lib-containers-storage-overlay-b916e61bf07e1dda854833e02350ddcc68b28d0c502932428543b421d1d5e1b9-merged.mount: Deactivated successfully. Oct 14 06:19:43 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-3004032bc4ef5dc018ef63bd77996f12dbb83b422a3000be150d40606ce12a59-userdata-shm.mount: Deactivated successfully. 
Oct 14 06:19:43 localhost podman[337273]: Oct 14 06:19:43 localhost podman[337273]: 2025-10-14 10:19:43.634788972 +0000 UTC m=+0.081899058 container create 738611d49f53902dbc2ccdbaf2ab170e184082ab306fdf5e108b679fd86b2c67 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ac6dfafe-fe19-467d-95fd-e237a973e4b9, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.vendor=CentOS) Oct 14 06:19:43 localhost systemd[1]: Started libpod-conmon-738611d49f53902dbc2ccdbaf2ab170e184082ab306fdf5e108b679fd86b2c67.scope. Oct 14 06:19:43 localhost podman[337273]: 2025-10-14 10:19:43.595475694 +0000 UTC m=+0.042585790 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Oct 14 06:19:43 localhost systemd[1]: Started libcrun container. 
Oct 14 06:19:43 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7de49d3172f2e06271793a07bd2591bfa51f12fc697ba96c71597ca4274dffef/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Oct 14 06:19:43 localhost podman[337273]: 2025-10-14 10:19:43.711999304 +0000 UTC m=+0.159109390 container init 738611d49f53902dbc2ccdbaf2ab170e184082ab306fdf5e108b679fd86b2c67 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ac6dfafe-fe19-467d-95fd-e237a973e4b9, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image) Oct 14 06:19:43 localhost podman[337273]: 2025-10-14 10:19:43.722745657 +0000 UTC m=+0.169855743 container start 738611d49f53902dbc2ccdbaf2ab170e184082ab306fdf5e108b679fd86b2c67 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ac6dfafe-fe19-467d-95fd-e237a973e4b9, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5) Oct 14 06:19:43 localhost dnsmasq[337291]: started, version 2.85 cachesize 150 Oct 14 06:19:43 localhost dnsmasq[337291]: DNS service limited to local subnets Oct 14 06:19:43 localhost dnsmasq[337291]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Oct 14 06:19:43 localhost dnsmasq[337291]: warning: no upstream servers 
configured Oct 14 06:19:43 localhost dnsmasq-dhcp[337291]: DHCP, static leases only on 10.100.0.0, lease time 1d Oct 14 06:19:43 localhost dnsmasq-dhcp[337291]: DHCP, static leases only on 10.100.0.16, lease time 1d Oct 14 06:19:43 localhost dnsmasq-dhcp[337291]: DHCP, static leases only on 10.100.0.32, lease time 1d Oct 14 06:19:43 localhost dnsmasq[337291]: read /var/lib/neutron/dhcp/ac6dfafe-fe19-467d-95fd-e237a973e4b9/addn_hosts - 1 addresses Oct 14 06:19:43 localhost dnsmasq-dhcp[337291]: read /var/lib/neutron/dhcp/ac6dfafe-fe19-467d-95fd-e237a973e4b9/host Oct 14 06:19:43 localhost dnsmasq-dhcp[337291]: read /var/lib/neutron/dhcp/ac6dfafe-fe19-467d-95fd-e237a973e4b9/opts Oct 14 06:19:43 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:19:43.793 271987 INFO neutron.agent.dhcp.agent [None req-58dcb8ed-5056-4d57-a12a-3d909f869bd8 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-10-14T10:19:40Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=d4af6af1-01f5-470e-b065-cc33e75c04e8, ip_allocation=immediate, mac_address=fa:16:3e:ef:a5:42, name=tempest-PortsTestJSON-1891808519, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-10-14T10:19:02Z, description=, dns_domain=, id=ac6dfafe-fe19-467d-95fd-e237a973e4b9, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-PortsTestJSON-test-network-107039004, port_security_enabled=True, project_id=2cffabfb0ecf4b5d91a7a63dd17a370a, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=25150, qos_policy_id=None, revision_number=5, router:external=False, shared=False, standard_attr_id=2791, status=ACTIVE, subnets=['a9a59b4d-54ba-4c96-a77c-fdea507969d8', 
'fc709217-64f1-4c9e-a14e-8b57b529e28d'], tags=[], tenant_id=2cffabfb0ecf4b5d91a7a63dd17a370a, updated_at=2025-10-14T10:19:38Z, vlan_transparent=None, network_id=ac6dfafe-fe19-467d-95fd-e237a973e4b9, port_security_enabled=True, project_id=2cffabfb0ecf4b5d91a7a63dd17a370a, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['20ea9413-8d37-4d17-a266-e02ebbcd4097'], standard_attr_id=3010, status=DOWN, tags=[], tenant_id=2cffabfb0ecf4b5d91a7a63dd17a370a, updated_at=2025-10-14T10:19:40Z on network ac6dfafe-fe19-467d-95fd-e237a973e4b9#033[00m Oct 14 06:19:44 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:19:44.044 271987 INFO neutron.agent.dhcp.agent [None req-f94410a0-22e9-4f63-b0b2-aefb6facdfa9 - - - - - -] DHCP configuration for ports {'d4af6af1-01f5-470e-b065-cc33e75c04e8', '508c587a-3455-46b0-9a18-33b955cfbeef', '3c920156-fad7-49d0-b0fe-2d7f43047b0e', '62dc069b-a018-41ee-acc3-ee816b018cdf'} is completed#033[00m Oct 14 06:19:44 localhost dnsmasq[337291]: read /var/lib/neutron/dhcp/ac6dfafe-fe19-467d-95fd-e237a973e4b9/addn_hosts - 1 addresses Oct 14 06:19:44 localhost dnsmasq-dhcp[337291]: read /var/lib/neutron/dhcp/ac6dfafe-fe19-467d-95fd-e237a973e4b9/host Oct 14 06:19:44 localhost podman[337309]: 2025-10-14 10:19:44.177520102 +0000 UTC m=+0.054297152 container kill 738611d49f53902dbc2ccdbaf2ab170e184082ab306fdf5e108b679fd86b2c67 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ac6dfafe-fe19-467d-95fd-e237a973e4b9, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS) Oct 14 06:19:44 localhost dnsmasq-dhcp[337291]: read 
/var/lib/neutron/dhcp/ac6dfafe-fe19-467d-95fd-e237a973e4b9/opts Oct 14 06:19:44 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:19:44.343 271987 INFO neutron.agent.dhcp.agent [None req-4fbfd83d-47c8-4beb-b217-fede53edca4e - - - - - -] Trigger reload_allocations for port admin_state_up=False, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-10-14T10:19:35Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=20573bb3-badd-4208-bf0f-fe7abeb7b34f, ip_allocation=immediate, mac_address=fa:16:3e:90:f9:3b, name=tempest-PortsTestJSON-864937480, network_id=ac6dfafe-fe19-467d-95fd-e237a973e4b9, port_security_enabled=True, project_id=2cffabfb0ecf4b5d91a7a63dd17a370a, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=2, security_groups=['4125c890-e798-4a40-8152-43496831500b'], standard_attr_id=2978, status=DOWN, tags=[], tenant_id=2cffabfb0ecf4b5d91a7a63dd17a370a, updated_at=2025-10-14T10:19:37Z on network ac6dfafe-fe19-467d-95fd-e237a973e4b9#033[00m Oct 14 06:19:44 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:19:44.394 271987 INFO neutron.agent.dhcp.agent [None req-242d8ccc-994d-47b1-853b-a58eefa67c09 - - - - - -] DHCP configuration for ports {'d4af6af1-01f5-470e-b065-cc33e75c04e8'} is completed#033[00m Oct 14 06:19:44 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e192 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104 Oct 14 06:19:44 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-1069349725", "format": "json"} : dispatch Oct 14 06:19:44 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth rm", "entity": 
"client.tempest-cephx-id-1069349725"} : dispatch Oct 14 06:19:44 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-1069349725"}]': finished Oct 14 06:19:44 localhost dnsmasq[337291]: read /var/lib/neutron/dhcp/ac6dfafe-fe19-467d-95fd-e237a973e4b9/addn_hosts - 2 addresses Oct 14 06:19:44 localhost dnsmasq-dhcp[337291]: read /var/lib/neutron/dhcp/ac6dfafe-fe19-467d-95fd-e237a973e4b9/host Oct 14 06:19:44 localhost dnsmasq-dhcp[337291]: read /var/lib/neutron/dhcp/ac6dfafe-fe19-467d-95fd-e237a973e4b9/opts Oct 14 06:19:44 localhost podman[337346]: 2025-10-14 10:19:44.665027692 +0000 UTC m=+0.063418226 container kill 738611d49f53902dbc2ccdbaf2ab170e184082ab306fdf5e108b679fd86b2c67 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ac6dfafe-fe19-467d-95fd-e237a973e4b9, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS) Oct 14 06:19:44 localhost nova_compute[297686]: 2025-10-14 10:19:44.924 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:19:44 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:19:44.936 271987 INFO neutron.agent.dhcp.agent [None req-33378b2a-dca9-4849-939b-db9289aa67c8 - - - - - -] DHCP configuration for ports {'20573bb3-badd-4208-bf0f-fe7abeb7b34f'} is completed#033[00m Oct 14 06:19:45 localhost dnsmasq[337291]: read /var/lib/neutron/dhcp/ac6dfafe-fe19-467d-95fd-e237a973e4b9/addn_hosts - 1 addresses Oct 14 06:19:45 localhost dnsmasq-dhcp[337291]: read 
/var/lib/neutron/dhcp/ac6dfafe-fe19-467d-95fd-e237a973e4b9/host Oct 14 06:19:45 localhost podman[337384]: 2025-10-14 10:19:45.162703677 +0000 UTC m=+0.060213397 container kill 738611d49f53902dbc2ccdbaf2ab170e184082ab306fdf5e108b679fd86b2c67 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ac6dfafe-fe19-467d-95fd-e237a973e4b9, io.buildah.version=1.41.3, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team) Oct 14 06:19:45 localhost dnsmasq-dhcp[337291]: read /var/lib/neutron/dhcp/ac6dfafe-fe19-467d-95fd-e237a973e4b9/opts Oct 14 06:19:45 localhost systemd[1]: tmp-crun.SlZX7f.mount: Deactivated successfully. Oct 14 06:19:45 localhost nova_compute[297686]: 2025-10-14 10:19:45.387 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:19:45 localhost podman[337419]: 2025-10-14 10:19:45.731510354 +0000 UTC m=+0.071243428 container kill 738611d49f53902dbc2ccdbaf2ab170e184082ab306fdf5e108b679fd86b2c67 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ac6dfafe-fe19-467d-95fd-e237a973e4b9, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, tcib_managed=true) Oct 14 06:19:45 localhost dnsmasq[337291]: exiting on receipt of SIGTERM Oct 14 06:19:45 localhost systemd[1]: libpod-738611d49f53902dbc2ccdbaf2ab170e184082ab306fdf5e108b679fd86b2c67.scope: Deactivated 
successfully. Oct 14 06:19:45 localhost podman[337433]: 2025-10-14 10:19:45.80758074 +0000 UTC m=+0.063289351 container died 738611d49f53902dbc2ccdbaf2ab170e184082ab306fdf5e108b679fd86b2c67 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ac6dfafe-fe19-467d-95fd-e237a973e4b9, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image) Oct 14 06:19:45 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-738611d49f53902dbc2ccdbaf2ab170e184082ab306fdf5e108b679fd86b2c67-userdata-shm.mount: Deactivated successfully. Oct 14 06:19:45 localhost podman[337433]: 2025-10-14 10:19:45.843157851 +0000 UTC m=+0.098866432 container cleanup 738611d49f53902dbc2ccdbaf2ab170e184082ab306fdf5e108b679fd86b2c67 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ac6dfafe-fe19-467d-95fd-e237a973e4b9, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS) Oct 14 06:19:45 localhost systemd[1]: libpod-conmon-738611d49f53902dbc2ccdbaf2ab170e184082ab306fdf5e108b679fd86b2c67.scope: Deactivated successfully. 
Oct 14 06:19:45 localhost podman[337440]: 2025-10-14 10:19:45.887103653 +0000 UTC m=+0.129588545 container remove 738611d49f53902dbc2ccdbaf2ab170e184082ab306fdf5e108b679fd86b2c67 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ac6dfafe-fe19-467d-95fd-e237a973e4b9, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5) Oct 14 06:19:46 localhost neutron_sriov_agent[264974]: 2025-10-14 10:19:46.319 2 INFO neutron.agent.securitygroups_rpc [None req-33c7cc6c-c9a0-4ce3-9ef1-1e7086da595b bbbcd088abe94518b01a8b1085998690 2cffabfb0ecf4b5d91a7a63dd17a370a - - default default] Security group member updated ['29ffe3b6-a0bd-4faa-ab66-a0c74c1b5533']#033[00m Oct 14 06:19:46 localhost systemd[1]: var-lib-containers-storage-overlay-7de49d3172f2e06271793a07bd2591bfa51f12fc697ba96c71597ca4274dffef-merged.mount: Deactivated successfully. 
Oct 14 06:19:46 localhost ovn_controller[157396]: 2025-10-14T10:19:46Z|00284|binding|INFO|Removing iface tap62dc069b-a0 ovn-installed in OVS Oct 14 06:19:46 localhost ovn_metadata_agent[163050]: 2025-10-14 10:19:46.755 163055 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port 24d7d966-47e9-4ef3-902a-e528c6493ee1 with type ""#033[00m Oct 14 06:19:46 localhost ovn_controller[157396]: 2025-10-14T10:19:46Z|00285|binding|INFO|Removing lport 62dc069b-a018-41ee-acc3-ee816b018cdf ovn-installed in OVS Oct 14 06:19:46 localhost ovn_metadata_agent[163050]: 2025-10-14 10:19:46.756 163055 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005486733.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.19/28 10.100.0.3/28 10.100.0.35/28', 'neutron:device_id': 'dhcpe1893814-19d7-5b97-af05-19e2aafa5382-ac6dfafe-fe19-467d-95fd-e237a973e4b9', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ac6dfafe-fe19-467d-95fd-e237a973e4b9', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2cffabfb0ecf4b5d91a7a63dd17a370a', 'neutron:revision_number': '7', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005486733.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=857fac98-c81f-4d46-906d-419db1a00d28, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[], logical_port=62dc069b-a018-41ee-acc3-ee816b018cdf) old= matches 
/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Oct 14 06:19:46 localhost nova_compute[297686]: 2025-10-14 10:19:46.758 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:19:46 localhost ovn_metadata_agent[163050]: 2025-10-14 10:19:46.760 163055 INFO neutron.agent.ovn.metadata.agent [-] Port 62dc069b-a018-41ee-acc3-ee816b018cdf in datapath ac6dfafe-fe19-467d-95fd-e237a973e4b9 unbound from our chassis#033[00m Oct 14 06:19:46 localhost ovn_metadata_agent[163050]: 2025-10-14 10:19:46.762 163055 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network ac6dfafe-fe19-467d-95fd-e237a973e4b9, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Oct 14 06:19:46 localhost ovn_metadata_agent[163050]: 2025-10-14 10:19:46.763 163159 DEBUG oslo.privsep.daemon [-] privsep: reply[a96b52ee-d821-450b-a2fe-eece9642167b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Oct 14 06:19:46 localhost nova_compute[297686]: 2025-10-14 10:19:46.765 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:19:46 localhost podman[337515]: Oct 14 06:19:46 localhost podman[337515]: 2025-10-14 10:19:46.97378374 +0000 UTC m=+0.104880719 container create d64ee64b5963018f9889d76a58ef95db4673f09c0208e26e048561249d00fc29 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ac6dfafe-fe19-467d-95fd-e237a973e4b9, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.name=CentOS 
Stream 9 Base Image, org.label-schema.schema-version=1.0) Oct 14 06:19:47 localhost systemd[1]: Started libpod-conmon-d64ee64b5963018f9889d76a58ef95db4673f09c0208e26e048561249d00fc29.scope. Oct 14 06:19:47 localhost podman[337515]: 2025-10-14 10:19:46.924424591 +0000 UTC m=+0.055521620 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Oct 14 06:19:47 localhost systemd[1]: Started libcrun container. Oct 14 06:19:47 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/be87e225933a0e249006119e1c58a867b268a74f0154267e4b9a40edffb2a6cd/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Oct 14 06:19:47 localhost ovn_controller[157396]: 2025-10-14T10:19:47Z|00286|binding|INFO|Releasing lport 25c6586a-239c-451b-aac2-e0a3ee5c3145 from this chassis (sb_readonly=0) Oct 14 06:19:47 localhost podman[337515]: 2025-10-14 10:19:47.039819526 +0000 UTC m=+0.170916515 container init d64ee64b5963018f9889d76a58ef95db4673f09c0208e26e048561249d00fc29 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ac6dfafe-fe19-467d-95fd-e237a973e4b9, org.label-schema.build-date=20251009, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true) Oct 14 06:19:47 localhost podman[337515]: 2025-10-14 10:19:47.048314869 +0000 UTC m=+0.179411858 container start d64ee64b5963018f9889d76a58ef95db4673f09c0208e26e048561249d00fc29 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ac6dfafe-fe19-467d-95fd-e237a973e4b9, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251009, 
tcib_managed=true, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image) Oct 14 06:19:47 localhost dnsmasq[337533]: started, version 2.85 cachesize 150 Oct 14 06:19:47 localhost dnsmasq[337533]: DNS service limited to local subnets Oct 14 06:19:47 localhost dnsmasq[337533]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Oct 14 06:19:47 localhost dnsmasq[337533]: warning: no upstream servers configured Oct 14 06:19:47 localhost dnsmasq-dhcp[337533]: DHCP, static leases only on 10.100.0.16, lease time 1d Oct 14 06:19:47 localhost dnsmasq[337533]: read /var/lib/neutron/dhcp/ac6dfafe-fe19-467d-95fd-e237a973e4b9/addn_hosts - 0 addresses Oct 14 06:19:47 localhost dnsmasq-dhcp[337533]: read /var/lib/neutron/dhcp/ac6dfafe-fe19-467d-95fd-e237a973e4b9/host Oct 14 06:19:47 localhost dnsmasq-dhcp[337533]: read /var/lib/neutron/dhcp/ac6dfafe-fe19-467d-95fd-e237a973e4b9/opts Oct 14 06:19:47 localhost nova_compute[297686]: 2025-10-14 10:19:47.064 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:19:47 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:19:47.181 271987 INFO neutron.agent.dhcp.agent [None req-d5a36555-db6b-4cb3-aa4f-5e41d676502e - - - - - -] DHCP configuration for ports {'508c587a-3455-46b0-9a18-33b955cfbeef', '3c920156-fad7-49d0-b0fe-2d7f43047b0e', '62dc069b-a018-41ee-acc3-ee816b018cdf'} is completed#033[00m Oct 14 06:19:47 localhost dnsmasq[337533]: exiting on receipt of SIGTERM Oct 14 06:19:47 localhost podman[337551]: 2025-10-14 10:19:47.280610804 +0000 UTC m=+0.060957319 container kill d64ee64b5963018f9889d76a58ef95db4673f09c0208e26e048561249d00fc29 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, 
name=neutron-dnsmasq-qdhcp-ac6dfafe-fe19-467d-95fd-e237a973e4b9, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team) Oct 14 06:19:47 localhost systemd[1]: libpod-d64ee64b5963018f9889d76a58ef95db4673f09c0208e26e048561249d00fc29.scope: Deactivated successfully. Oct 14 06:19:47 localhost podman[337565]: 2025-10-14 10:19:47.353411339 +0000 UTC m=+0.057953046 container died d64ee64b5963018f9889d76a58ef95db4673f09c0208e26e048561249d00fc29 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ac6dfafe-fe19-467d-95fd-e237a973e4b9, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251009) Oct 14 06:19:47 localhost podman[337565]: 2025-10-14 10:19:47.382289623 +0000 UTC m=+0.086831290 container cleanup d64ee64b5963018f9889d76a58ef95db4673f09c0208e26e048561249d00fc29 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ac6dfafe-fe19-467d-95fd-e237a973e4b9, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5) Oct 14 06:19:47 localhost systemd[1]: libpod-conmon-d64ee64b5963018f9889d76a58ef95db4673f09c0208e26e048561249d00fc29.scope: Deactivated 
successfully. Oct 14 06:19:47 localhost podman[337568]: 2025-10-14 10:19:47.483266711 +0000 UTC m=+0.175707244 container remove d64ee64b5963018f9889d76a58ef95db4673f09c0208e26e048561249d00fc29 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ac6dfafe-fe19-467d-95fd-e237a973e4b9, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0) Oct 14 06:19:47 localhost nova_compute[297686]: 2025-10-14 10:19:47.496 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:19:47 localhost kernel: device tap62dc069b-a0 left promiscuous mode Oct 14 06:19:47 localhost nova_compute[297686]: 2025-10-14 10:19:47.516 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:19:47 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:19:47.611 271987 INFO neutron.agent.dhcp.agent [None req-9ce2e34a-8fb8-4227-bd91-4dc1d71c5238 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Oct 14 06:19:47 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:19:47.613 271987 INFO neutron.agent.dhcp.agent [None req-9ce2e34a-8fb8-4227-bd91-4dc1d71c5238 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Oct 14 06:19:47 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:19:47.614 271987 INFO neutron.agent.dhcp.agent [None req-9ce2e34a-8fb8-4227-bd91-4dc1d71c5238 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Oct 14 06:19:47 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:19:47.614 271987 INFO 
neutron.agent.dhcp.agent [None req-9ce2e34a-8fb8-4227-bd91-4dc1d71c5238 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Oct 14 06:19:47 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-1069349725", "format": "json"} : dispatch Oct 14 06:19:47 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-1069349725", "caps": ["mds", "allow rw path=/volumes/_nogroup/ce754065-58e4-49a9-a603-4440ef4b311b/064df9c5-e20e-4319-95d4-91217c164f07", "osd", "allow rw pool=manila_data namespace=fsvolumens_ce754065-58e4-49a9-a603-4440ef4b311b", "mon", "allow r"], "format": "json"} : dispatch Oct 14 06:19:47 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-1069349725", "caps": ["mds", "allow rw path=/volumes/_nogroup/ce754065-58e4-49a9-a603-4440ef4b311b/064df9c5-e20e-4319-95d4-91217c164f07", "osd", "allow rw pool=manila_data namespace=fsvolumens_ce754065-58e4-49a9-a603-4440ef4b311b", "mon", "allow r"], "format": "json"}]': finished Oct 14 06:19:47 localhost systemd[1]: var-lib-containers-storage-overlay-be87e225933a0e249006119e1c58a867b268a74f0154267e4b9a40edffb2a6cd-merged.mount: Deactivated successfully. Oct 14 06:19:47 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d64ee64b5963018f9889d76a58ef95db4673f09c0208e26e048561249d00fc29-userdata-shm.mount: Deactivated successfully. Oct 14 06:19:47 localhost systemd[1]: run-netns-qdhcp\x2dac6dfafe\x2dfe19\x2d467d\x2d95fd\x2de237a973e4b9.mount: Deactivated successfully. 
Oct 14 06:19:48 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Oct 14 06:19:48 localhost ceph-mon[317114]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1208771827' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Oct 14 06:19:48 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Oct 14 06:19:48 localhost ceph-mon[317114]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1208771827' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Oct 14 06:19:49 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e192 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104 Oct 14 06:19:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f. Oct 14 06:19:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917. Oct 14 06:19:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d. 
Oct 14 06:19:49 localhost podman[337592]: 2025-10-14 10:19:49.760318688 +0000 UTC m=+0.097525572 container health_status 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_id=ovn_controller, org.label-schema.build-date=20251009, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.name=CentOS Stream 9 Base Image) Oct 14 06:19:49 localhost systemd[1]: tmp-crun.8DNdpo.mount: Deactivated successfully. 
Oct 14 06:19:49 localhost podman[337593]: 2025-10-14 10:19:49.88114823 +0000 UTC m=+0.214662760 container health_status 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, io.openshift.expose-services=, name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, distribution-scope=public, maintainer=Red Hat, Inc., version=9.6, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, 
io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, architecture=x86_64, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., config_id=edpm, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350) Oct 14 06:19:49 localhost podman[337592]: 2025-10-14 10:19:49.893145182 +0000 UTC m=+0.230352096 container exec_died 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251009, managed_by=edpm_ansible, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team) Oct 14 06:19:49 localhost systemd[1]: 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f.service: Deactivated successfully. Oct 14 06:19:49 localhost nova_compute[297686]: 2025-10-14 10:19:49.923 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:19:49 localhost podman[337593]: 2025-10-14 10:19:49.944806132 +0000 UTC m=+0.278320702 container exec_died 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, maintainer=Red Hat, Inc., config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, version=9.6, architecture=x86_64, com.redhat.component=ubi9-minimal-container, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., io.openshift.expose-services=, io.buildah.version=1.33.7, release=1755695350, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter) Oct 14 06:19:49 localhost systemd[1]: 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917.service: Deactivated successfully. 
Oct 14 06:19:50 localhost podman[337594]: 2025-10-14 10:19:49.846379783 +0000 UTC m=+0.175607240 container health_status 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, tcib_managed=true, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=edpm, managed_by=edpm_ansible) Oct 14 06:19:50 localhost podman[337594]: 2025-10-14 10:19:50.031268879 +0000 UTC m=+0.360496336 container exec_died 
89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_managed=true, config_id=edpm, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d) Oct 14 06:19:50 localhost systemd[1]: 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d.service: Deactivated successfully. 
Oct 14 06:19:50 localhost nova_compute[297686]: 2025-10-14 10:19:50.389 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:19:51 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-1069349725", "format": "json"} : dispatch Oct 14 06:19:51 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-1069349725"} : dispatch Oct 14 06:19:51 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-1069349725"}]': finished Oct 14 06:19:54 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e192 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104 Oct 14 06:19:54 localhost nova_compute[297686]: 2025-10-14 10:19:54.962 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:19:55 localhost nova_compute[297686]: 2025-10-14 10:19:55.391 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:19:56 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-134128418", "format": "json"} : dispatch Oct 14 06:19:56 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-134128418", "caps": ["mds", "allow rw path=/volumes/_nogroup/7464b515-4b76-441d-b135-e760267664f0/d006bc5a-0512-431c-9db5-8c6c5ce619a9", "osd", 
"allow rw pool=manila_data namespace=fsvolumens_7464b515-4b76-441d-b135-e760267664f0", "mon", "allow r"], "format": "json"} : dispatch Oct 14 06:19:56 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-134128418", "caps": ["mds", "allow rw path=/volumes/_nogroup/7464b515-4b76-441d-b135-e760267664f0/d006bc5a-0512-431c-9db5-8c6c5ce619a9", "osd", "allow rw pool=manila_data namespace=fsvolumens_7464b515-4b76-441d-b135-e760267664f0", "mon", "allow r"], "format": "json"}]': finished Oct 14 06:19:56 localhost ceph-osd[31500]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #44. Immutable memtables: 1. Oct 14 06:19:57 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-134128418", "format": "json"} : dispatch Oct 14 06:19:57 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-134128418"} : dispatch Oct 14 06:19:57 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-134128418"}]': finished Oct 14 06:19:57 localhost ovn_metadata_agent[163050]: 2025-10-14 10:19:57.787 163055 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Oct 14 06:19:57 localhost ovn_metadata_agent[163050]: 2025-10-14 10:19:57.787 163055 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner 
/usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Oct 14 06:19:57 localhost ovn_metadata_agent[163050]: 2025-10-14 10:19:57.789 163055 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Oct 14 06:19:58 localhost podman[248187]: time="2025-10-14T10:19:58Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Oct 14 06:19:58 localhost podman[248187]: @ - - [14/Oct/2025:10:19:58 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 147497 "" "Go-http-client/1.1" Oct 14 06:19:58 localhost podman[248187]: @ - - [14/Oct/2025:10:19:58 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19869 "" "Go-http-client/1.1" Oct 14 06:19:59 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Oct 14 06:19:59 localhost ceph-mon[317114]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/809224712' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Oct 14 06:19:59 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Oct 14 06:19:59 localhost ceph-mon[317114]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.32:0/809224712' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Oct 14 06:19:59 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e192 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104 Oct 14 06:19:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041. Oct 14 06:19:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857. Oct 14 06:19:59 localhost systemd[1]: tmp-crun.c5tLqh.mount: Deactivated successfully. Oct 14 06:19:59 localhost podman[337657]: 2025-10-14 10:19:59.731726298 +0000 UTC m=+0.064311563 container health_status 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', 
'/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2) Oct 14 06:19:59 localhost podman[337657]: 2025-10-14 10:19:59.736263939 +0000 UTC m=+0.068849274 container exec_died 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2) Oct 14 06:19:59 localhost podman[337656]: 2025-10-14 10:19:59.750600683 +0000 UTC m=+0.084589911 container health_status 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Oct 14 06:19:59 localhost systemd[1]: 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857.service: Deactivated successfully. 
Oct 14 06:19:59 localhost podman[337656]: 2025-10-14 10:19:59.785239145 +0000 UTC m=+0.119228353 container exec_died 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter) Oct 14 06:19:59 localhost systemd[1]: 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041.service: Deactivated successfully. 
Oct 14 06:20:00 localhost nova_compute[297686]: 2025-10-14 10:19:59.997 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:20:00 localhost ceph-mon[317114]: overall HEALTH_OK Oct 14 06:20:00 localhost nova_compute[297686]: 2025-10-14 10:20:00.392 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:20:01 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e193 e193: 6 total, 6 up, 6 in Oct 14 06:20:04 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e193 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104 Oct 14 06:20:05 localhost nova_compute[297686]: 2025-10-14 10:20:05.051 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:20:05 localhost nova_compute[297686]: 2025-10-14 10:20:05.394 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:20:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad. Oct 14 06:20:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec. Oct 14 06:20:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29. 
Oct 14 06:20:05 localhost podman[337700]: 2025-10-14 10:20:05.731143636 +0000 UTC m=+0.069006419 container health_status fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=iscsid, org.label-schema.build-date=20251009, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, io.buildah.version=1.41.3) Oct 14 06:20:05 localhost podman[337700]: 2025-10-14 10:20:05.740818215 +0000 UTC m=+0.078681008 container exec_died fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 
(image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, container_name=iscsid, org.label-schema.schema-version=1.0) Oct 14 06:20:05 localhost systemd[1]: fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29.service: Deactivated successfully. 
Oct 14 06:20:05 localhost podman[337698]: 2025-10-14 10:20:05.794909371 +0000 UTC m=+0.134448935 container health_status 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, org.label-schema.build-date=20251009) Oct 14 06:20:05 localhost podman[337698]: 2025-10-14 10:20:05.808114989 +0000 UTC m=+0.147654583 container exec_died 
02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, org.label-schema.license=GPLv2, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team) Oct 14 06:20:05 localhost systemd[1]: 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad.service: Deactivated successfully. 
Oct 14 06:20:05 localhost podman[337699]: 2025-10-14 10:20:05.899860741 +0000 UTC m=+0.237054824 container health_status aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Oct 14 06:20:05 localhost podman[337699]: 2025-10-14 10:20:05.909104887 +0000 UTC m=+0.246298930 container exec_died aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 
'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm) Oct 14 06:20:05 localhost systemd[1]: aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec.service: Deactivated successfully. Oct 14 06:20:06 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e194 e194: 6 total, 6 up, 6 in Oct 14 06:20:08 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Oct 14 06:20:08 localhost ceph-mon[317114]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2706501934' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Oct 14 06:20:08 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Oct 14 06:20:08 localhost ceph-mon[317114]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.32:0/2706501934' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Oct 14 06:20:08 localhost openstack_network_exporter[250374]: ERROR 10:20:08 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Oct 14 06:20:08 localhost openstack_network_exporter[250374]: ERROR 10:20:08 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Oct 14 06:20:08 localhost openstack_network_exporter[250374]: ERROR 10:20:08 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Oct 14 06:20:08 localhost openstack_network_exporter[250374]: Oct 14 06:20:08 localhost openstack_network_exporter[250374]: ERROR 10:20:08 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Oct 14 06:20:08 localhost openstack_network_exporter[250374]: ERROR 10:20:08 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Oct 14 06:20:08 localhost openstack_network_exporter[250374]: Oct 14 06:20:09 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e194 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104 Oct 14 06:20:10 localhost nova_compute[297686]: 2025-10-14 10:20:10.097 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:20:10 localhost nova_compute[297686]: 2025-10-14 10:20:10.394 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:20:11 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Oct 14 06:20:11 localhost ceph-mon[317114]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.32:0/1205374245' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Oct 14 06:20:11 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Oct 14 06:20:11 localhost ceph-mon[317114]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1205374245' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Oct 14 06:20:12 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e195 e195: 6 total, 6 up, 6 in Oct 14 06:20:14 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e195 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104 Oct 14 06:20:15 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e196 e196: 6 total, 6 up, 6 in Oct 14 06:20:15 localhost ceph-mon[317114]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #40. Immutable memtables: 0. 
Oct 14 06:20:15 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:20:15.034784) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0 Oct 14 06:20:15 localhost ceph-mon[317114]: rocksdb: [db/flush_job.cc:856] [default] [JOB 21] Flushing memtable with next log file: 40 Oct 14 06:20:15 localhost ceph-mon[317114]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760437215034825, "job": 21, "event": "flush_started", "num_memtables": 1, "num_entries": 2377, "num_deletes": 261, "total_data_size": 2966442, "memory_usage": 3015936, "flush_reason": "Manual Compaction"} Oct 14 06:20:15 localhost ceph-mon[317114]: rocksdb: [db/flush_job.cc:885] [default] [JOB 21] Level-0 flush table #41: started Oct 14 06:20:15 localhost ceph-mon[317114]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760437215044144, "cf_name": "default", "job": 21, "event": "table_file_creation", "file_number": 41, "file_size": 1926252, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 25136, "largest_seqno": 27508, "table_properties": {"data_size": 1916576, "index_size": 5929, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2693, "raw_key_size": 24152, "raw_average_key_size": 22, "raw_value_size": 1895830, "raw_average_value_size": 1768, "num_data_blocks": 252, "num_entries": 1072, "num_filter_entries": 1072, "num_deletions": 261, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; 
max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1760437109, "oldest_key_time": 1760437109, "file_creation_time": 1760437215, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "c006d4a7-7ae8-4cd3-a1b5-95f5bf3c427c", "db_session_id": "ZS6W3A0Q266OBGAU6ARP", "orig_file_number": 41, "seqno_to_time_mapping": "N/A"}} Oct 14 06:20:15 localhost ceph-mon[317114]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 21] Flush lasted 9400 microseconds, and 3410 cpu microseconds. Oct 14 06:20:15 localhost ceph-mon[317114]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. Oct 14 06:20:15 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:20:15.044184) [db/flush_job.cc:967] [default] [JOB 21] Level-0 flush table #41: 1926252 bytes OK Oct 14 06:20:15 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:20:15.044204) [db/memtable_list.cc:519] [default] Level-0 commit table #41 started Oct 14 06:20:15 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:20:15.050035) [db/memtable_list.cc:722] [default] Level-0 commit table #41: memtable #1 done Oct 14 06:20:15 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:20:15.050119) EVENT_LOG_v1 {"time_micros": 1760437215050102, "job": 21, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0} Oct 14 06:20:15 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:20:15.050161) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25 Oct 14 06:20:15 localhost ceph-mon[317114]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 21] Try to delete WAL files size 2955098, prev total WAL file 
size 2955098, number of live WAL files 2. Oct 14 06:20:15 localhost ceph-mon[317114]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005486733/store.db/000037.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Oct 14 06:20:15 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:20:15.051272) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003132383031' seq:72057594037927935, type:22 .. '7061786F73003133303533' seq:0, type:0; will stop at (end) Oct 14 06:20:15 localhost ceph-mon[317114]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 22] Compacting 1@0 + 1@6 files to L6, score -1.00 Oct 14 06:20:15 localhost ceph-mon[317114]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 21 Base level 0, inputs: [41(1881KB)], [39(16MB)] Oct 14 06:20:15 localhost ceph-mon[317114]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760437215051363, "job": 22, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [41], "files_L6": [39], "score": -1, "input_data_size": 18795399, "oldest_snapshot_seqno": -1} Oct 14 06:20:15 localhost ceph-mon[317114]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 22] Generated table #42: 13423 keys, 17618431 bytes, temperature: kUnknown Oct 14 06:20:15 localhost ceph-mon[317114]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760437215134089, "cf_name": "default", "job": 22, "event": "table_file_creation", "file_number": 42, "file_size": 17618431, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 17541743, "index_size": 42027, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 33605, "raw_key_size": 362144, "raw_average_key_size": 26, "raw_value_size": 
17313119, "raw_average_value_size": 1289, "num_data_blocks": 1562, "num_entries": 13423, "num_filter_entries": 13423, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1760436394, "oldest_key_time": 0, "file_creation_time": 1760437215, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "c006d4a7-7ae8-4cd3-a1b5-95f5bf3c427c", "db_session_id": "ZS6W3A0Q266OBGAU6ARP", "orig_file_number": 42, "seqno_to_time_mapping": "N/A"}} Oct 14 06:20:15 localhost ceph-mon[317114]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. 
Oct 14 06:20:15 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:20:15.134369) [db/compaction/compaction_job.cc:1663] [default] [JOB 22] Compacted 1@0 + 1@6 files to L6 => 17618431 bytes
Oct 14 06:20:15 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:20:15.135911) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 226.9 rd, 212.7 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.8, 16.1 +0.0 blob) out(16.8 +0.0 blob), read-write-amplify(18.9) write-amplify(9.1) OK, records in: 13971, records dropped: 548 output_compression: NoCompression
Oct 14 06:20:15 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:20:15.135930) EVENT_LOG_v1 {"time_micros": 1760437215135918, "job": 22, "event": "compaction_finished", "compaction_time_micros": 82824, "compaction_time_cpu_micros": 46459, "output_level": 6, "num_output_files": 1, "total_output_size": 17618431, "num_input_records": 13971, "num_output_records": 13423, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Oct 14 06:20:15 localhost ceph-mon[317114]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005486733/store.db/000041.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct 14 06:20:15 localhost ceph-mon[317114]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760437215136146, "job": 22, "event": "table_file_deletion", "file_number": 41}
Oct 14 06:20:15 localhost ceph-mon[317114]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005486733/store.db/000039.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct 14 06:20:15 localhost ceph-mon[317114]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760437215137184, "job": 22, "event": "table_file_deletion", "file_number": 39}
Oct 14 06:20:15 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:20:15.051175) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 14 06:20:15 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:20:15.137283) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 14 06:20:15 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:20:15.137291) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 14 06:20:15 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:20:15.137292) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 14 06:20:15 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:20:15.137294) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 14 06:20:15 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:20:15.137296) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 14 06:20:15 localhost nova_compute[297686]: 2025-10-14 10:20:15.145 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 14 06:20:15 localhost nova_compute[297686]: 2025-10-14 10:20:15.396 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 14 06:20:16 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get", "entity": "client.Joe", "format": "json"} : dispatch
Oct 14 06:20:16 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get-or-create", "entity": "client.Joe", "caps": ["mds", "allow rw path=/volumes/_nogroup/c9f8ac4b-082a-4fd7-a4ce-57524c5e0629/b0a90cce-d7c2-42b6-a80f-3807b51966b8", "osd", "allow rw pool=manila_data namespace=fsvolumens_c9f8ac4b-082a-4fd7-a4ce-57524c5e0629", "mon", "allow r"], "format": "json"} : dispatch
Oct 14 06:20:16 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd='[{"prefix": "auth get-or-create", "entity": "client.Joe", "caps": ["mds", "allow rw path=/volumes/_nogroup/c9f8ac4b-082a-4fd7-a4ce-57524c5e0629/b0a90cce-d7c2-42b6-a80f-3807b51966b8", "osd", "allow rw pool=manila_data namespace=fsvolumens_c9f8ac4b-082a-4fd7-a4ce-57524c5e0629", "mon", "allow r"], "format": "json"}]': finished
Oct 14 06:20:17 localhost ovn_controller[157396]: 2025-10-14T10:20:17Z|00287|memory_trim|INFO|Detected inactivity (last active 30004 ms ago): trimming memory
Oct 14 06:20:18 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e197 e197: 6 total, 6 up, 6 in
Oct 14 06:20:19 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e197 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct 14 06:20:19 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e198 e198: 6 total, 6 up, 6 in
Oct 14 06:20:20 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e199 e199: 6 total, 6 up, 6 in
Oct 14 06:20:20 localhost nova_compute[297686]: 2025-10-14 10:20:20.176 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 14 06:20:20 localhost nova_compute[297686]: 2025-10-14 10:20:20.397 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 14 06:20:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f.
Oct 14 06:20:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917.
Oct 14 06:20:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d.
Oct 14 06:20:20 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get", "entity": "client.eve49", "format": "json"} : dispatch
Oct 14 06:20:20 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get-or-create", "entity": "client.eve49", "caps": ["mds", "allow rw path=/volumes/_nogroup/0d9f7a76-cbf2-46e5-ab0a-bc125ebb379f/a86d1791-a890-4116-b06d-8abaae958634", "osd", "allow rw pool=manila_data namespace=fsvolumens_0d9f7a76-cbf2-46e5-ab0a-bc125ebb379f", "mon", "allow r"], "format": "json"} : dispatch
Oct 14 06:20:20 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd='[{"prefix": "auth get-or-create", "entity": "client.eve49", "caps": ["mds", "allow rw path=/volumes/_nogroup/0d9f7a76-cbf2-46e5-ab0a-bc125ebb379f/a86d1791-a890-4116-b06d-8abaae958634", "osd", "allow rw pool=manila_data namespace=fsvolumens_0d9f7a76-cbf2-46e5-ab0a-bc125ebb379f", "mon", "allow r"], "format": "json"}]': finished
Oct 14 06:20:20 localhost podman[337755]: 2025-10-14 10:20:20.761226927 +0000 UTC m=+0.100672709 container health_status 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS)
Oct 14 06:20:20 localhost podman[337757]: 2025-10-14 10:20:20.814400384 +0000 UTC m=+0.150971047 container health_status 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Oct 14 06:20:20 localhost podman[337755]: 2025-10-14 10:20:20.819567744 +0000 UTC m=+0.159013506 container exec_died 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2)
Oct 14 06:20:20 localhost systemd[1]: 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f.service: Deactivated successfully.
Oct 14 06:20:20 localhost podman[337757]: 2025-10-14 10:20:20.847340415 +0000 UTC m=+0.183911058 container exec_died 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_id=edpm, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251009)
Oct 14 06:20:20 localhost systemd[1]: tmp-crun.INQWOe.mount: Deactivated successfully.
Oct 14 06:20:20 localhost systemd[1]: 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d.service: Deactivated successfully.
Oct 14 06:20:20 localhost podman[337756]: 2025-10-14 10:20:20.869709338 +0000 UTC m=+0.207278051 container health_status 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, config_id=edpm, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, vcs-type=git, vendor=Red Hat, Inc., distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, io.buildah.version=1.33.7, architecture=x86_64, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Oct 14 06:20:20 localhost podman[337756]: 2025-10-14 10:20:20.908289873 +0000 UTC m=+0.245858646 container exec_died 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, name=ubi9-minimal, maintainer=Red Hat, Inc., architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, io.openshift.tags=minimal rhel9, config_id=edpm, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, release=1755695350, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public)
Oct 14 06:20:20 localhost systemd[1]: 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917.service: Deactivated successfully.
Oct 14 06:20:21 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e200 e200: 6 total, 6 up, 6 in
Oct 14 06:20:22 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get", "entity": "client.Joe", "format": "json"} : dispatch
Oct 14 06:20:22 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e201 e201: 6 total, 6 up, 6 in
Oct 14 06:20:23 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e202 e202: 6 total, 6 up, 6 in
Oct 14 06:20:24 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e202 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct 14 06:20:24 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get", "entity": "client.eve48", "format": "json"} : dispatch
Oct 14 06:20:24 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get-or-create", "entity": "client.eve48", "caps": ["mds", "allow rw path=/volumes/_nogroup/0d9f7a76-cbf2-46e5-ab0a-bc125ebb379f/a86d1791-a890-4116-b06d-8abaae958634", "osd", "allow rw pool=manila_data namespace=fsvolumens_0d9f7a76-cbf2-46e5-ab0a-bc125ebb379f", "mon", "allow r"], "format": "json"} : dispatch
Oct 14 06:20:24 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd='[{"prefix": "auth get-or-create", "entity": "client.eve48", "caps": ["mds", "allow rw path=/volumes/_nogroup/0d9f7a76-cbf2-46e5-ab0a-bc125ebb379f/a86d1791-a890-4116-b06d-8abaae958634", "osd", "allow rw pool=manila_data namespace=fsvolumens_0d9f7a76-cbf2-46e5-ab0a-bc125ebb379f", "mon", "allow r"], "format": "json"}]': finished
Oct 14 06:20:25 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e203 e203: 6 total, 6 up, 6 in
Oct 14 06:20:25 localhost nova_compute[297686]: 2025-10-14 10:20:25.180 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 14 06:20:25 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Oct 14 06:20:25 localhost ceph-mon[317114]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1039955162' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Oct 14 06:20:25 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Oct 14 06:20:25 localhost ceph-mon[317114]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1039955162' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Oct 14 06:20:25 localhost nova_compute[297686]: 2025-10-14 10:20:25.398 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 14 06:20:26 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-1765384422", "format": "json"} : dispatch
Oct 14 06:20:26 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-1765384422", "caps": ["mds", "allow rw path=/volumes/_nogroup/0eb88f0a-5596-479e-940f-8e6a2102a41a/ecf8dc76-3b91-46d6-9446-c258a56234b9", "osd", "allow rw pool=manila_data namespace=fsvolumens_0eb88f0a-5596-479e-940f-8e6a2102a41a", "mon", "allow r"], "format": "json"} : dispatch
Oct 14 06:20:26 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-1765384422", "caps": ["mds", "allow rw path=/volumes/_nogroup/0eb88f0a-5596-479e-940f-8e6a2102a41a/ecf8dc76-3b91-46d6-9446-c258a56234b9", "osd", "allow rw pool=manila_data namespace=fsvolumens_0eb88f0a-5596-479e-940f-8e6a2102a41a", "mon", "allow r"], "format": "json"}]': finished
Oct 14 06:20:26 localhost ovn_metadata_agent[163050]: 2025-10-14 10:20:26.355 163055 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=19, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'b6:6b:50', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '6a:59:81:01:bc:8b'}, ipsec=False) old=SB_Global(nb_cfg=18) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct 14 06:20:26 localhost ovn_metadata_agent[163050]: 2025-10-14 10:20:26.356 163055 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct 14 06:20:26 localhost nova_compute[297686]: 2025-10-14 10:20:26.399 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 14 06:20:28 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get", "entity": "client.eve48", "format": "json"} : dispatch
Oct 14 06:20:28 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth rm", "entity": "client.eve48"} : dispatch
Oct 14 06:20:28 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd='[{"prefix": "auth rm", "entity": "client.eve48"}]': finished
Oct 14 06:20:28 localhost podman[248187]: time="2025-10-14T10:20:28Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 14 06:20:28 localhost podman[248187]: @ - - [14/Oct/2025:10:20:28 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 147497 "" "Go-http-client/1.1"
Oct 14 06:20:28 localhost podman[248187]: @ - - [14/Oct/2025:10:20:28 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19870 "" "Go-http-client/1.1"
Oct 14 06:20:29 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e204 e204: 6 total, 6 up, 6 in
Oct 14 06:20:29 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e204 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct 14 06:20:30 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e205 e205: 6 total, 6 up, 6 in
Oct 14 06:20:30 localhost nova_compute[297686]: 2025-10-14 10:20:30.204 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 14 06:20:30 localhost ovn_metadata_agent[163050]: 2025-10-14 10:20:30.359 163055 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=9e4b0f79-1220-4c7d-a18d-fa1a88dab362, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '19'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct 14 06:20:30 localhost nova_compute[297686]: 2025-10-14 10:20:30.400 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 14 06:20:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041.
Oct 14 06:20:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857.
Oct 14 06:20:30 localhost podman[337817]: 2025-10-14 10:20:30.743175367 +0000 UTC m=+0.084681573 container health_status 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi )
Oct 14 06:20:30 localhost podman[337818]: 2025-10-14 10:20:30.812913207 +0000 UTC m=+0.148174530 container health_status 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, managed_by=edpm_ansible, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 14 06:20:30 localhost podman[337817]: 2025-10-14 10:20:30.829707248 +0000 UTC m=+0.171213484 container exec_died 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi )
Oct 14 06:20:30 localhost systemd[1]: 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041.service: Deactivated successfully.
Oct 14 06:20:30 localhost podman[337818]: 2025-10-14 10:20:30.847199749 +0000 UTC m=+0.182461082 container exec_died 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251009, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Oct 14 06:20:30 localhost systemd[1]: 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857.service: Deactivated successfully.
Oct 14 06:20:31 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get", "entity": "client.eve47", "format": "json"} : dispatch
Oct 14 06:20:31 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get-or-create", "entity": "client.eve47", "caps": ["mds", "allow rw path=/volumes/_nogroup/0d9f7a76-cbf2-46e5-ab0a-bc125ebb379f/a86d1791-a890-4116-b06d-8abaae958634", "osd", "allow rw pool=manila_data namespace=fsvolumens_0d9f7a76-cbf2-46e5-ab0a-bc125ebb379f", "mon", "allow r"], "format": "json"} : dispatch
Oct 14 06:20:31 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd='[{"prefix": "auth get-or-create", "entity": "client.eve47", "caps": ["mds", "allow rw path=/volumes/_nogroup/0d9f7a76-cbf2-46e5-ab0a-bc125ebb379f/a86d1791-a890-4116-b06d-8abaae958634", "osd", "allow rw pool=manila_data namespace=fsvolumens_0d9f7a76-cbf2-46e5-ab0a-bc125ebb379f", "mon", "allow r"], "format": "json"}]': finished
Oct 14 06:20:31 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e206 e206: 6 total, 6 up, 6 in
Oct 14 06:20:32 localhost nova_compute[297686]: 2025-10-14 10:20:32.256 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 14 06:20:32 localhost nova_compute[297686]: 2025-10-14 10:20:32.257 2 DEBUG nova.compute.manager [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct 14 06:20:32 localhost nova_compute[297686]: 2025-10-14 10:20:32.257 2 DEBUG nova.compute.manager [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct 14 06:20:32 localhost nova_compute[297686]: 2025-10-14 10:20:32.367 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Acquiring lock "refresh_cache-88c4e366-b765-47a6-96bf-f7677f2ce67c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct 14 06:20:32 localhost nova_compute[297686]: 2025-10-14 10:20:32.367 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Acquired lock "refresh_cache-88c4e366-b765-47a6-96bf-f7677f2ce67c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct 14 06:20:32 localhost nova_compute[297686]: 2025-10-14 10:20:32.368 2 DEBUG nova.network.neutron [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] [instance: 88c4e366-b765-47a6-96bf-f7677f2ce67c] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Oct 14 06:20:32 localhost nova_compute[297686]: 2025-10-14 10:20:32.368 2 DEBUG nova.objects.instance [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Lazy-loading 'info_cache' on Instance uuid 88c4e366-b765-47a6-96bf-f7677f2ce67c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct 14 06:20:32 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:20:32.393 271987 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-10-14T10:20:32Z, description=, device_id=33f3780b-b723-420e-9562-e93dddfcf9be, device_owner=network:router_gateway, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=0c4b2107-07fc-4150-8702-7593f3871b2d, ip_allocation=immediate, mac_address=fa:16:3e:8b:14:1f, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-10-14T08:36:38Z, description=, dns_domain=, id=c0145816-4627-44f2-af00-ccc9ef0436ed, ipv4_address_scope=None, ipv6_address_scope=None, is_default=False, l2_adjacency=True, mtu=1350, name=public, port_security_enabled=True, project_id=41187b090f3d4818a32baa37ce8a3991, provider:network_type=flat, provider:physical_network=datacentre, provider:segmentation_id=None, qos_policy_id=None, revision_number=2, router:external=True, shared=False, standard_attr_id=29, status=ACTIVE, subnets=['bbd40dc1-97d8-4ed3-9d4f-9b3af758b526'], tags=[], tenant_id=41187b090f3d4818a32baa37ce8a3991, updated_at=2025-10-14T08:36:44Z, vlan_transparent=None, network_id=c0145816-4627-44f2-af00-ccc9ef0436ed, port_security_enabled=False, project_id=, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=3175, status=DOWN, tags=[], tenant_id=, updated_at=2025-10-14T10:20:32Z on network c0145816-4627-44f2-af00-ccc9ef0436ed#033[00m
Oct 14 06:20:32 localhost podman[337873]: 2025-10-14 10:20:32.6068571 +0000 UTC m=+0.052917779 container kill 27e23fba1f10c3118d508631d350793857acd97085c399f85eed727cd1eab7c3
(image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0145816-4627-44f2-af00-ccc9ef0436ed, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.build-date=20251009) Oct 14 06:20:32 localhost dnsmasq[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/addn_hosts - 2 addresses Oct 14 06:20:32 localhost dnsmasq-dhcp[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/host Oct 14 06:20:32 localhost dnsmasq-dhcp[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/opts Oct 14 06:20:32 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:20:32.894 271987 INFO neutron.agent.dhcp.agent [None req-fe11a1af-f82a-49c2-886d-25ff11b3741a - - - - - -] DHCP configuration for ports {'0c4b2107-07fc-4150-8702-7593f3871b2d'} is completed#033[00m Oct 14 06:20:33 localhost nova_compute[297686]: 2025-10-14 10:20:33.106 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:20:33 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-1765384422", "format": "json"} : dispatch Oct 14 06:20:33 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-1765384422"} : dispatch Oct 14 06:20:33 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-1765384422"}]': finished Oct 14 06:20:33 localhost 
nova_compute[297686]: 2025-10-14 10:20:33.179 2 DEBUG nova.network.neutron [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] [instance: 88c4e366-b765-47a6-96bf-f7677f2ce67c] Updating instance_info_cache with network_info: [{"id": "3ec9b060-f43d-4698-9c76-6062c70911d5", "address": "fa:16:3e:84:5e:e5", "network": {"id": "7d0cd696-bdd7-4e70-9512-eb0d23640314", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.46", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "41187b090f3d4818a32baa37ce8a3991", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3ec9b060-f4", "ovs_interfaceid": "3ec9b060-f43d-4698-9c76-6062c70911d5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Oct 14 06:20:33 localhost nova_compute[297686]: 2025-10-14 10:20:33.202 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Releasing lock "refresh_cache-88c4e366-b765-47a6-96bf-f7677f2ce67c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Oct 14 06:20:33 localhost nova_compute[297686]: 2025-10-14 10:20:33.202 2 DEBUG nova.compute.manager [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] [instance: 88c4e366-b765-47a6-96bf-f7677f2ce67c] Updated the network info_cache for instance 
_heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Oct 14 06:20:33 localhost nova_compute[297686]: 2025-10-14 10:20:33.202 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:20:33 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Oct 14 06:20:33 localhost ceph-mon[317114]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3631508493' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Oct 14 06:20:33 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Oct 14 06:20:33 localhost ceph-mon[317114]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.32:0/3631508493' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Oct 14 06:20:34 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Oct 14 06:20:34 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' Oct 14 06:20:34 localhost nova_compute[297686]: 2025-10-14 10:20:34.255 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:20:34 localhost nova_compute[297686]: 2025-10-14 10:20:34.256 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:20:34 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e206 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Oct 14 06:20:35 localhost nova_compute[297686]: 2025-10-14 10:20:35.206 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:20:35 localhost nova_compute[297686]: 2025-10-14 10:20:35.251 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:20:35 localhost nova_compute[297686]: 2025-10-14 10:20:35.401 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:20:35 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' Oct 14 06:20:35 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get", "entity": "client.eve47", "format": "json"} : dispatch Oct 14 06:20:35 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth rm", "entity": "client.eve47"} : dispatch Oct 14 06:20:35 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd='[{"prefix": "auth rm", "entity": "client.eve47"}]': finished Oct 14 06:20:36 localhost nova_compute[297686]: 2025-10-14 10:20:36.256 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:20:36 localhost nova_compute[297686]: 2025-10-14 10:20:36.256 2 DEBUG nova.compute.manager [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... 
_reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Oct 14 06:20:36 localhost nova_compute[297686]: 2025-10-14 10:20:36.464 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:20:36 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get", "entity": "client.Joe", "format": "json"} : dispatch Oct 14 06:20:36 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth rm", "entity": "client.Joe"} : dispatch Oct 14 06:20:36 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd='[{"prefix": "auth rm", "entity": "client.Joe"}]': finished Oct 14 06:20:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad. Oct 14 06:20:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec. Oct 14 06:20:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29. 
Oct 14 06:20:36 localhost podman[337982]: 2025-10-14 10:20:36.733378911 +0000 UTC m=+0.071519876 container health_status fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true, config_id=iscsid, container_name=iscsid, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image) Oct 14 06:20:36 localhost podman[337982]: 2025-10-14 10:20:36.746191767 +0000 UTC m=+0.084332662 container exec_died fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 
(image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, tcib_managed=true, config_id=iscsid, container_name=iscsid, org.label-schema.schema-version=1.0, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251009) Oct 14 06:20:36 localhost systemd[1]: fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29.service: Deactivated successfully. 
Oct 14 06:20:36 localhost podman[337980]: 2025-10-14 10:20:36.796605139 +0000 UTC m=+0.140869914 container health_status 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, config_id=multipathd) Oct 14 06:20:36 localhost podman[337980]: 2025-10-14 10:20:36.809939672 +0000 UTC m=+0.154204447 container exec_died 
02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, container_name=multipathd, io.buildah.version=1.41.3, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS) Oct 14 06:20:36 localhost systemd[1]: 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad.service: Deactivated successfully. 
Oct 14 06:20:36 localhost podman[337981]: 2025-10-14 10:20:36.849605671 +0000 UTC m=+0.185164536 container health_status aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm) Oct 14 06:20:36 localhost podman[337981]: 2025-10-14 10:20:36.857721292 +0000 UTC m=+0.193280247 container exec_died aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 
'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm) Oct 14 06:20:36 localhost systemd[1]: aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec.service: Deactivated successfully. 
Oct 14 06:20:37 localhost nova_compute[297686]: 2025-10-14 10:20:37.255 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:20:37 localhost nova_compute[297686]: 2025-10-14 10:20:37.255 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:20:37 localhost nova_compute[297686]: 2025-10-14 10:20:37.279 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Oct 14 06:20:37 localhost nova_compute[297686]: 2025-10-14 10:20:37.279 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Oct 14 06:20:37 localhost nova_compute[297686]: 2025-10-14 10:20:37.279 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Oct 14 06:20:37 localhost nova_compute[297686]: 2025-10-14 10:20:37.280 2 DEBUG nova.compute.resource_tracker [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Auditing locally available compute resources for 
np0005486733.localdomain (node: np0005486733.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Oct 14 06:20:37 localhost nova_compute[297686]: 2025-10-14 10:20:37.280 2 DEBUG oslo_concurrency.processutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Oct 14 06:20:37 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e207 e207: 6 total, 6 up, 6 in Oct 14 06:20:37 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Oct 14 06:20:37 localhost ceph-mon[317114]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/70856006' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Oct 14 06:20:37 localhost nova_compute[297686]: 2025-10-14 10:20:37.757 2 DEBUG oslo_concurrency.processutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.477s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Oct 14 06:20:37 localhost nova_compute[297686]: 2025-10-14 10:20:37.820 2 DEBUG nova.virt.libvirt.driver [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Oct 14 06:20:37 localhost nova_compute[297686]: 2025-10-14 10:20:37.821 2 DEBUG nova.virt.libvirt.driver [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Oct 14 06:20:38 localhost 
nova_compute[297686]: 2025-10-14 10:20:38.001 2 WARNING nova.virt.libvirt.driver [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m Oct 14 06:20:38 localhost nova_compute[297686]: 2025-10-14 10:20:38.003 2 DEBUG nova.compute.resource_tracker [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Hypervisor/Node resource view: name=np0005486733.localdomain free_ram=11182MB free_disk=41.83695602416992GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": 
"0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Oct 14 06:20:38 localhost nova_compute[297686]: 2025-10-14 10:20:38.003 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Oct 14 06:20:38 localhost nova_compute[297686]: 2025-10-14 10:20:38.003 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Oct 14 06:20:38 localhost nova_compute[297686]: 2025-10-14 10:20:38.090 2 DEBUG nova.compute.resource_tracker [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Instance 88c4e366-b765-47a6-96bf-f7677f2ce67c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Oct 14 06:20:38 localhost nova_compute[297686]: 2025-10-14 10:20:38.090 2 DEBUG nova.compute.resource_tracker [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Oct 14 06:20:38 localhost nova_compute[297686]: 2025-10-14 10:20:38.092 2 DEBUG nova.compute.resource_tracker [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Final resource view: name=np0005486733.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Oct 14 06:20:38 localhost nova_compute[297686]: 2025-10-14 10:20:38.164 2 DEBUG oslo_concurrency.processutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Oct 14 06:20:38 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e208 e208: 6 total, 6 up, 6 in Oct 14 06:20:38 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Oct 14 06:20:38 localhost ceph-mon[317114]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.108:0/2928004055' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Oct 14 06:20:38 localhost nova_compute[297686]: 2025-10-14 10:20:38.598 2 DEBUG oslo_concurrency.processutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.434s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Oct 14 06:20:38 localhost nova_compute[297686]: 2025-10-14 10:20:38.606 2 DEBUG nova.compute.provider_tree [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Inventory has not changed in ProviderTree for provider: 18c24273-aca2-4f08-be57-3188d558235e update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Oct 14 06:20:38 localhost nova_compute[297686]: 2025-10-14 10:20:38.630 2 DEBUG nova.scheduler.client.report [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Inventory has not changed for provider 18c24273-aca2-4f08-be57-3188d558235e based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Oct 14 06:20:38 localhost nova_compute[297686]: 2025-10-14 10:20:38.633 2 DEBUG nova.compute.resource_tracker [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Compute_service record updated for np0005486733.localdomain:np0005486733.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Oct 14 06:20:38 localhost nova_compute[297686]: 2025-10-14 10:20:38.634 2 DEBUG oslo_concurrency.lockutils [None 
req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.630s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Oct 14 06:20:38 localhost openstack_network_exporter[250374]: ERROR 10:20:38 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Oct 14 06:20:38 localhost openstack_network_exporter[250374]: ERROR 10:20:38 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Oct 14 06:20:38 localhost openstack_network_exporter[250374]: ERROR 10:20:38 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Oct 14 06:20:38 localhost openstack_network_exporter[250374]: ERROR 10:20:38 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Oct 14 06:20:38 localhost openstack_network_exporter[250374]: Oct 14 06:20:38 localhost openstack_network_exporter[250374]: ERROR 10:20:38 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Oct 14 06:20:38 localhost openstack_network_exporter[250374]: Oct 14 06:20:39 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e208 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Oct 14 06:20:39 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get", "entity": "client.eve49", "format": "json"} : dispatch Oct 14 06:20:39 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth rm", "entity": "client.eve49"} : dispatch Oct 14 06:20:39 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd='[{"prefix": "auth rm", "entity": "client.eve49"}]': finished Oct 14 
06:20:39 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get", "entity": "client.admin", "format": "json"} : dispatch Oct 14 06:20:39 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e209 e209: 6 total, 6 up, 6 in Oct 14 06:20:40 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e210 e210: 6 total, 6 up, 6 in Oct 14 06:20:40 localhost nova_compute[297686]: 2025-10-14 10:20:40.210 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:20:40 localhost nova_compute[297686]: 2025-10-14 10:20:40.403 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:20:40 localhost nova_compute[297686]: 2025-10-14 10:20:40.634 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:20:40 localhost nova_compute[297686]: 2025-10-14 10:20:40.635 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:20:42 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get", "entity": "client.david", "format": "json"} : dispatch Oct 14 06:20:42 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get-or-create", "entity": "client.david", "caps": ["mds", "allow rw path=/volumes/_nogroup/343c69e2-b2b0-4638-b91f-68171be807ee/d81f744d-ecd5-450e-b4f4-9060c1e362ff", "osd", "allow rw pool=manila_data 
namespace=fsvolumens_343c69e2-b2b0-4638-b91f-68171be807ee", "mon", "allow r"], "format": "json"} : dispatch Oct 14 06:20:42 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd='[{"prefix": "auth get-or-create", "entity": "client.david", "caps": ["mds", "allow rw path=/volumes/_nogroup/343c69e2-b2b0-4638-b91f-68171be807ee/d81f744d-ecd5-450e-b4f4-9060c1e362ff", "osd", "allow rw pool=manila_data namespace=fsvolumens_343c69e2-b2b0-4638-b91f-68171be807ee", "mon", "allow r"], "format": "json"}]': finished Oct 14 06:20:44 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e210 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Oct 14 06:20:45 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e211 e211: 6 total, 6 up, 6 in Oct 14 06:20:45 localhost nova_compute[297686]: 2025-10-14 10:20:45.235 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:20:45 localhost nova_compute[297686]: 2025-10-14 10:20:45.405 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:20:46 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e212 e212: 6 total, 6 up, 6 in Oct 14 06:20:47 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e213 e213: 6 total, 6 up, 6 in Oct 14 06:20:48 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e214 e214: 6 total, 6 up, 6 in Oct 14 06:20:49 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get", "entity": "client.david", "format": "json"} : dispatch Oct 14 06:20:49 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e214 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Oct 14 06:20:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.824 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'name': 'test', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005486733.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '41187b090f3d4818a32baa37ce8a3991', 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'hostId': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228 Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.825 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.846 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/memory.usage volume: 51.81640625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.849 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '6c215a67-20ac-405a-ada3-d9426d152b6b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.81640625, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'timestamp': '2025-10-14T10:20:49.825905', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '74c51af6-a8e7-11f0-9707-fa163e99780b', 'monotonic_time': 13266.039028481, 'message_signature': 'cbe39666eb9a84215d39030504464cdf902885c666d843c4aeaf63297906f58d'}]}, 'timestamp': '2025-10-14 10:20:49.847487', '_unique_id': 'e4b57ec6438a4450bc960b548ced904c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.849 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.849 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in 
_reraise_as_library_errors Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.849 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.849 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.849 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.849 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.849 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.849 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.849 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.849 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.849 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.849 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.849 12 
ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.849 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.849 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.849 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.849 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.849 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.849 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.849 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.849 12 ERROR oslo_messaging.notify.messaging Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.849 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.849 12 ERROR oslo_messaging.notify.messaging Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.849 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 
2025-10-14 10:20:49.849 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.849 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.849 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.849 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.849 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.849 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.849 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.849 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.849 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.849 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 
06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.849 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.849 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.849 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.849 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.849 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.849 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.849 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.849 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.849 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.849 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:20:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.849 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.849 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.849 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.849 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.849 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.849 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.849 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.849 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.849 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.849 12 ERROR oslo_messaging.notify.messaging Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.850 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters Oct 14 06:20:49 
localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.854 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.outgoing.bytes volume: 10162 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.855 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f0905769-6ae7-4dde-a076-9cda8ba0c97c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 10162, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T10:20:49.850558', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': '74c635ee-a8e7-11f0-9707-fa163e99780b', 'monotonic_time': 13266.043272253, 
'message_signature': '48bbddaed62c6f50308d1de897eaa0722f9427845a475c0c43edfb4e941adea1'}]}, 'timestamp': '2025-10-14 10:20:49.854661', '_unique_id': '8d41a1f8f21743b0a2e362f97e069aa4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.855 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.855 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.855 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.855 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.855 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.855 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.855 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.855 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.855 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.855 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.855 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.855 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.855 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.855 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.855 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.855 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.855 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.855 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.855 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.855 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:20:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.855 12 ERROR oslo_messaging.notify.messaging Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.855 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.855 12 ERROR oslo_messaging.notify.messaging Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.855 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.855 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.855 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.855 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.855 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.855 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.855 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.855 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", 
line 653, in _send
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.855 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.855 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.855 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.855 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.855 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.855 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.855 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.855 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.855 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.855 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.855 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.855 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.855 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.855 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.855 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.855 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.855 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.855 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.855 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.855 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.855 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.855 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.855 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.857 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.857 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.857 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.incoming.packets volume: 79 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.858 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a4a9a492-0560-413b-a987-b1a75f6853e9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 79, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T10:20:49.857419', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': '74c6b744-a8e7-11f0-9707-fa163e99780b', 'monotonic_time': 13266.043272253, 'message_signature': 'ac5def0622648b7e4882a100570002db76433d84c3a734b776d10bd69bd434c7'}]}, 'timestamp': '2025-10-14 10:20:49.858009', '_unique_id': 'b8fbbc8494e14eda99164ccc6a747ce9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.858 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.858 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.858 12 ERROR oslo_messaging.notify.messaging yield
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.858 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.858 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.858 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.858 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.858 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.858 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.858 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.858 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.858 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.858 12 ERROR oslo_messaging.notify.messaging conn.connect()
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.858 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.858 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.858 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.858 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.858 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.858 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.858 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.858 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.858 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.858 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.858 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.858 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.858 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.858 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.858 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.858 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.858 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.858 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.858 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.858 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.858 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.858 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.858 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.858 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.858 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.858 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.858 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.858 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.858 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.858 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.858 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.858 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.858 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.858 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.858 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.858 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.858 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.858 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.858 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.858 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.858 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.860 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.860 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.860 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.outgoing.packets volume: 118 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.862 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2e883102-b009-41c5-bc2f-f4269802a160', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 118, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T10:20:49.860654', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': '74c736ba-a8e7-11f0-9707-fa163e99780b', 'monotonic_time': 13266.043272253, 'message_signature': 'bda8534b4522b6b1a5967655c36c968e1b1a3005b7ad544b335cd7829435cb28'}]}, 'timestamp': '2025-10-14 10:20:49.861283', '_unique_id': 'aed49cb45d1241988cb7afae9ac9ac9a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.862 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.862 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.862 12 ERROR oslo_messaging.notify.messaging yield
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.862 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.862 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.862 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.862 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.862 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.862 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.862 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.862 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.862 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.862 12 ERROR oslo_messaging.notify.messaging conn.connect()
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.862 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.862 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.862 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.862 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.862 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.862 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.862 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.862 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.862 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.862 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.862 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.862 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.862 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.862 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.862 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.862 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.862 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.862 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.862 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.862 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.862 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.862 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.862 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.862 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.862 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.862 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.862 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.862 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.862 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.862 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.862 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.862 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.862 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.862 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.862 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.862 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.862 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.862 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.862 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.862 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.862 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.863 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.863 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.864 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6e705905-aec9-404f-95e2-b8d5734aae26', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T10:20:49.863513', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': '74c7a3de-a8e7-11f0-9707-fa163e99780b', 'monotonic_time': 13266.043272253, 'message_signature': 'bdc9c8d7e1084a5d93a0eefe07e2f2969fb27d8469e246c6ebd58bac81ea4464'}]}, 'timestamp': '2025-10-14 10:20:49.864021', '_unique_id': '47f684b61fd14450a6b7da69b827f84c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.864 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.864 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.864 12 ERROR oslo_messaging.notify.messaging yield
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.864 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.864 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.864 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.864 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.864 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.864 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.864 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.864 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.864 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.864 12 ERROR oslo_messaging.notify.messaging conn.connect()
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.864 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.864 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.864 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.864 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.864 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.864 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.864 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.864 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.864 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.864 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.864 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.864 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.864 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.864 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.864 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.864 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.864 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.864 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.864 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.864 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.864 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.864 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.864 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.864 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.864 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.864 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.864 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.864 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.864 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.864 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:20:49 
localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.864 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.864 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.864 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.864 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.864 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.864 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.864 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.864 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.864 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.864 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.864 12 ERROR oslo_messaging.notify.messaging Oct 14 06:20:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.866 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.866 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/cpu volume: 18120000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.867 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c675f920-11d2-4842-91ad-fc54e5aa0d47', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 18120000000, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'timestamp': '2025-10-14T10:20:49.866286', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '74c80f5e-a8e7-11f0-9707-fa163e99780b', 'monotonic_time': 13266.039028481, 
'message_signature': '80f02c4c61c569d76c7952258d97727107799c5748eff32d686114ef4cef5001'}]}, 'timestamp': '2025-10-14 10:20:49.866789', '_unique_id': '9552e587c0784a77a98be51ce38af5e3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.867 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.867 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.867 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.867 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.867 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.867 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.867 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.867 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.867 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.867 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.867 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.867 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.867 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.867 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.867 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.867 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.867 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.867 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.867 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.867 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:20:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.867 12 ERROR oslo_messaging.notify.messaging Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.867 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.867 12 ERROR oslo_messaging.notify.messaging Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.867 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.867 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.867 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.867 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.867 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.867 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.867 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.867 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", 
line 653, in _send Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.867 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.867 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.867 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.867 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.867 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.867 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.867 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.867 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.867 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.867 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.867 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.867 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.867 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.867 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.867 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.867 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.867 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.867 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.867 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.867 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:20:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.867 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.867 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.867 12 ERROR oslo_messaging.notify.messaging Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.868 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.868 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.880 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.880 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.882 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '095bef3a-f3c5-4ebc-ad9d-b14e0de31c1c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T10:20:49.869081', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '74ca2e6a-a8e7-11f0-9707-fa163e99780b', 'monotonic_time': 13266.061790167, 'message_signature': '2eb6aba2e902d9079bfd3d4df9b6adfc957bae64b1793fe73f19ee441ec1eb5d'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T10:20:49.869081', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 
'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '74ca413e-a8e7-11f0-9707-fa163e99780b', 'monotonic_time': 13266.061790167, 'message_signature': 'e3357d14c0a5d9d78002ce6a5bd1b0b0afc68d2b0a65f7d21d425a9e3d7c2a54'}]}, 'timestamp': '2025-10-14 10:20:49.881126', '_unique_id': 'b2a34c622fbf4e98a37bfa3c85f5b8b4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.882 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.882 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.882 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.882 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.882 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.882 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.882 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.882 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.882 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.882 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.882 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.882 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.882 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.882 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.882 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.882 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 06:20:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.882 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.882 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.882 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.882 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.882 12 ERROR oslo_messaging.notify.messaging Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.882 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.882 12 ERROR oslo_messaging.notify.messaging Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.882 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.882 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.882 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.882 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 
10:20:49.882 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.882 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.882 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.882 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.882 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.882 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.882 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.882 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.882 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.882 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 06:20:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.882 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.882 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.882 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.882 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.882 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.882 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.882 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.882 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.882 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.882 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.882 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.882 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.882 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.882 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.882 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.882 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.882 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.883 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.883 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.incoming.bytes volume: 8190 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.885 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4e3bd555-f7f4-458c-bdf6-1bc3520694dd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 8190, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T10:20:49.883524', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': '74cab2fe-a8e7-11f0-9707-fa163e99780b', 'monotonic_time': 13266.043272253, 'message_signature': 'a1b69e1de896cac832208a98e0f68427fbf44d4b863b2563d973a11e53964a41'}]}, 'timestamp': '2025-10-14 10:20:49.884078', '_unique_id': '7869a4d9002b41efb9d5f3796f3918ba'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.885 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.885 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.885 12 ERROR oslo_messaging.notify.messaging yield
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.885 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.885 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.885 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.885 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.885 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.885 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.885 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.885 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.885 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.885 12 ERROR oslo_messaging.notify.messaging conn.connect()
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.885 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.885 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.885 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.885 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.885 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.885 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.885 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.885 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.885 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.885 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.885 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.885 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.885 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.885 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.885 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.885 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.885 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.885 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.885 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.885 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.885 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.885 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.885 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.885 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.885 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.885 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.885 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.885 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.885 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.885 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.885 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.885 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.885 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.885 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.885 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.885 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.885 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.885 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.885 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.885 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.885 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.886 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.886 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.887 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3b06fabf-a329-4d7a-a9d4-8003c174eb49', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T10:20:49.886490', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': '74cb269e-a8e7-11f0-9707-fa163e99780b', 'monotonic_time': 13266.043272253, 'message_signature': 'ddda4b5f8596285f47cbbcad9c6a89ac31486680569aa0098d7b4f994cba71a7'}]}, 'timestamp': '2025-10-14 10:20:49.887029', '_unique_id': '51b728baebf646f69051b225fae6c277'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.887 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.887 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.887 12 ERROR oslo_messaging.notify.messaging yield
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.887 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.887 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.887 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.887 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.887 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.887 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.887 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.887 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.887 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.887 12 ERROR oslo_messaging.notify.messaging conn.connect()
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.887 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.887 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.887 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.887 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.887 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.887 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.887 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.887 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.887 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.887 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.887 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.887 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.887 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.887 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.887 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.887 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.887 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.887 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.887 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.887 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.887 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.887 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.887 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.887 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.887 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.887 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.887 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.887 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.887 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.887 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.887 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.887 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.887 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.887 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.887 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.887 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.887 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.887 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.887 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.887 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.887 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.889 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.908 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.write.bytes volume: 454656 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.909 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.911 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0dd4f601-b078-4ca1-9dee-90eacd8a0cf7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 454656, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T10:20:49.889308', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '74ce9040-a8e7-11f0-9707-fa163e99780b', 'monotonic_time': 13266.082021213, 'message_signature': 'cb7b0b5e1b58a0a5d43aefbfaf0082e05cecba09ebfa8304e26b1751adfae164'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T10:20:49.889308', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '74cea2a6-a8e7-11f0-9707-fa163e99780b', 'monotonic_time': 13266.082021213, 'message_signature': 'dc9ac2c46bd0eb56486dbe5e38767174c7de1c2ffaf721ff41ff5cf54fb64112'}]}, 'timestamp': '2025-10-14 10:20:49.909910', '_unique_id': '82d1e91f73f84ef6ad61a5a148221fca'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.911 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.911 12 ERROR oslo_messaging.notify.messaging yield
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.911 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.911 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.911 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.911 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.911 12 ERROR oslo_messaging.notify.messaging conn.connect()
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.911 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.911 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.911 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.911 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.911 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.911 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.911 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.911 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.911 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.911 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.911 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.911 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.911 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.911 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98,
in get Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.911 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.911 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.911 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.911 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.911 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:20:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.911 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.911 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.911 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.911 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.911 12 ERROR oslo_messaging.notify.messaging Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.912 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.912 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.913 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '97eb227c-c518-427c-84c6-4b07262125b3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T10:20:49.912319', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': '74cf151a-a8e7-11f0-9707-fa163e99780b', 'monotonic_time': 13266.043272253, 'message_signature': 'fd6057b51a79e7b94d3b9491eb9026e88af246ce64750fbe4091dcd7a4a7868f'}]}, 'timestamp': '2025-10-14 10:20:49.912823', '_unique_id': '70731e340c254405bc47d982026efb69'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.913 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 
2025-10-14 10:20:49.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.913 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.913 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.913 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.913 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.913 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.913 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.913 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.913 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.913 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.913 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.913 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.913 12 ERROR oslo_messaging.notify.messaging Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.913 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.913 12 ERROR oslo_messaging.notify.messaging Oct 14 06:20:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.913 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.913 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.913 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.913 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.913 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.913 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.913 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.913 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.913 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.913 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:20:49 
localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.913 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.913 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.913 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.913 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.913 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.913 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.913 12 ERROR oslo_messaging.notify.messaging Oct 14 06:20:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.915 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.915 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.916 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c0bacaca-2372-430f-9bd7-8dadacf3d705', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T10:20:49.915368', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 
'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': '74cf8c0c-a8e7-11f0-9707-fa163e99780b', 'monotonic_time': 13266.043272253, 'message_signature': 'b3eb9ff2ba25960ff78031100a6e0844ee6fe6b451726c7554765c671159a9fd'}]}, 'timestamp': '2025-10-14 10:20:49.915869', '_unique_id': 'cf4e5ca8da3e47ecb573247759efcec6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.916 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.916 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.916 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.916 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 
10:20:49.916 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.916 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.916 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.916 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.916 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.916 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:20:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.916 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.916 12 ERROR oslo_messaging.notify.messaging Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.916 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.916 12 ERROR oslo_messaging.notify.messaging Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.916 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.916 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.916 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.916 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:20:49 
localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.916 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.916 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.916 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.916 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.916 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) 
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.916 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.916 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.916 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.916 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.916 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.916 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.916 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.916 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.916 12 ERROR oslo_messaging.notify.messaging Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.918 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.918 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.read.latency volume: 1739521236 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.918 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.read.latency volume: 110324003 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.920 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '28ae5357-0e05-4f2f-b478-92162885e9ba', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1739521236, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T10:20:49.918278', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '74cffe58-a8e7-11f0-9707-fa163e99780b', 'monotonic_time': 13266.082021213, 'message_signature': 'b0b31a735e6926e7225f8476b3ab8eebc783fff907bda93e847344814bc4d844'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 110324003, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T10:20:49.918278', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '74d012b2-a8e7-11f0-9707-fa163e99780b', 'monotonic_time': 13266.082021213, 'message_signature': 'df39deb400837d12a855dd6e31b7da9c7b9ab6876cc2f3a489aab6383cbe8088'}]}, 'timestamp': '2025-10-14 10:20:49.919284', '_unique_id': '12154ae65edb41cbb94418cde2b38e85'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.920 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.920 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.920 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.920 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.920 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.920 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.920 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.920 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.920 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 
06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.920 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.920 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.920 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.920 12 ERROR oslo_messaging.notify.messaging Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.920 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.920 12 ERROR oslo_messaging.notify.messaging Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.920 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.920 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:20:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.920 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.920 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.920 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.920 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.920 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.920 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.920 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.920 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.920 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.920 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:20:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.920 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.920 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.920 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.920 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.920 12 ERROR oslo_messaging.notify.messaging Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.921 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.921 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.read.bytes volume: 35560448 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.921 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:20:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.923 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0bfd605e-448b-4e7f-8941-ab0cccd00553', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35560448, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T10:20:49.921468', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '74d07b9e-a8e7-11f0-9707-fa163e99780b', 'monotonic_time': 13266.082021213, 'message_signature': '03de8f04328c70d5796f7e131ac7aeed17953ce012eab3e7f8cbbb837d0f7909'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': 
'2025-10-14T10:20:49.921468', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '74d09020-a8e7-11f0-9707-fa163e99780b', 'monotonic_time': 13266.082021213, 'message_signature': 'b85adfda8e761540fe8fee4c05712c10cc52fb729eaad9998795123e0eba85b4'}]}, 'timestamp': '2025-10-14 10:20:49.922483', '_unique_id': 'bc2822351bd241d3ba08f43445e2f969'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.923 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.923 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.923 12 ERROR oslo_messaging.notify.messaging return retry_over_time( 
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.923 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.923 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.923 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.923 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.923 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.923 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.923 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.923 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.923 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.923 12 ERROR oslo_messaging.notify.messaging Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.923 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.923 12 ERROR oslo_messaging.notify.messaging Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.923 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.923 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.923 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.923 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.923 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.923 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.923 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.923 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.923 
12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.923 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.923 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.923 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.923 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.923 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.923 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.923 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.923 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.923 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.923 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.923 12 ERROR oslo_messaging.notify.messaging Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.924 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.924 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.925 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.capacity volume: 1073741824 
_stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.926 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f7ce8f75-ec46-4ad3-a2ab-179536fd5cab', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T10:20:49.924763', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '74d0faec-a8e7-11f0-9707-fa163e99780b', 'monotonic_time': 13266.061790167, 'message_signature': 'c7fb7c22ca912908af337a5e314784dcb0ffaee85a8b78538fe725991c92c04f'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': 
'41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T10:20:49.924763', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '74d10b54-a8e7-11f0-9707-fa163e99780b', 'monotonic_time': 13266.061790167, 'message_signature': '35fa21cdf4819a1847ffbf27c43c5d36743c7b26e0029718f1faccd11a5ae416'}]}, 'timestamp': '2025-10-14 10:20:49.925616', '_unique_id': 'cbd61fc1d2bd4579b7bd1ab3c319e77e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.926 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.926 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:20:49 
localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.926 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.926 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.926 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.926 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.926 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.926 12 ERROR oslo_messaging.notify.messaging 
self.transport.connect() Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.926 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.926 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.926 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.926 12 ERROR oslo_messaging.notify.messaging Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.926 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.926 12 ERROR oslo_messaging.notify.messaging Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.926 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.926 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:20:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.926 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.926 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.926 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.926 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.926 12 ERROR oslo_messaging.notify.messaging self.connection = 
connection_pool.get(retry=retry) Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.926 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.926 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.926 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.926 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.926 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:20:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.926 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.926 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.926 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.926 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.926 12 ERROR oslo_messaging.notify.messaging Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.927 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.927 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.write.requests volume: 53 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.928 12 DEBUG 
ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.929 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b6000cde-9357-4eba-bdcc-752a058282e9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 53, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T10:20:49.927879', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '74d174cc-a8e7-11f0-9707-fa163e99780b', 'monotonic_time': 13266.082021213, 'message_signature': '1de9fc9f60dcb75a1523a9723f090df678c961251e76f8e67cfd67a452a985cb'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 
'counter_volume': 1, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T10:20:49.927879', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '74d18502-a8e7-11f0-9707-fa163e99780b', 'monotonic_time': 13266.082021213, 'message_signature': 'a66a7d3f26cbada84482dd4176a26718dbce864538ab6303358dad08dc55fe56'}]}, 'timestamp': '2025-10-14 10:20:49.928760', '_unique_id': '1cbcc5afab424e9488f88b76f8852f4a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.929 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.929 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.929 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.929 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.929 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.929 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.929 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.929 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:20:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.929 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.929 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.929 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.929 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.929 12 ERROR oslo_messaging.notify.messaging Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.929 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.929 12 ERROR oslo_messaging.notify.messaging Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.929 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.929 12 ERROR oslo_messaging.notify.messaging 
self.transport._send_notification(target, ctxt, message, Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.929 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.929 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.929 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.929 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 
10:20:49.929 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.929 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.929 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.929 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.929 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.929 12 ERROR 
oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.929 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.929 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.929 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.929 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.929 12 ERROR oslo_messaging.notify.messaging Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.931 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.931 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.read.requests volume: 1272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 
06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.931 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.933 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '655de627-79be-476f-ad60-3b7c140cb825', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1272, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T10:20:49.931242', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '74d1fa00-a8e7-11f0-9707-fa163e99780b', 'monotonic_time': 13266.082021213, 'message_signature': 'c3b99bbe2b1fcdc77f527a1a523d0fc92bca1d67069023fb4e4faabd8a2a8e92'}, {'source': 'openstack', 'counter_name': 
'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T10:20:49.931242', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '74d20f9a-a8e7-11f0-9707-fa163e99780b', 'monotonic_time': 13266.082021213, 'message_signature': 'cd68ccdb4095662b21a1f7b399f379a7c8d6ba65c1e44cf3fcc266e4804e707c'}]}, 'timestamp': '2025-10-14 10:20:49.932294', '_unique_id': 'db0cb6f4b88a43338d857cfd9c3f683e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.933 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.933 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:20:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.933 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.933 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.933 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.933 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.933 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.933 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.933 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.933 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.933 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.933 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.933 12 ERROR oslo_messaging.notify.messaging Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.933 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.933 12 ERROR oslo_messaging.notify.messaging Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.933 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:20:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.933 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.933 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.933 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.933 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.933 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.933 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.933 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.933 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.933 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.933 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.933 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.933 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.933 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.933 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.933 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.933 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.933 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.933 12 ERROR oslo_messaging.notify.messaging Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.934 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.934 12 DEBUG ceilometer.compute.pollsters [-] 
88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.935 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.935 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e6dd20fa-3956-4244-ba2d-584a86d3cca6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T10:20:49.934517', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 
'74d27a52-a8e7-11f0-9707-fa163e99780b', 'monotonic_time': 13266.061790167, 'message_signature': '8beddb1443c4ef68c6f05312753b6e1cae4445315c85cc97c9e844795121f13b'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T10:20:49.934517', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '74d2888a-a8e7-11f0-9707-fa163e99780b', 'monotonic_time': 13266.061790167, 'message_signature': '907a591297da878b906b5f72e16bc890368eed60dc57384ee7ee091a02b661d1'}]}, 'timestamp': '2025-10-14 10:20:49.935296', '_unique_id': 'fee226338653440ea6d818a607ffe882'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.935 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.935 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in 
_reraise_as_library_errors Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.935 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.935 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.935 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.935 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.935 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.935 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.935 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.935 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.935 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.935 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.935 12 
ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.935 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.935 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.935 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.935 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.935 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.935 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.935 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.935 12 ERROR oslo_messaging.notify.messaging Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.935 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.935 12 ERROR oslo_messaging.notify.messaging Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.935 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 
2025-10-14 10:20:49.935 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.935 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.935 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.935 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.935 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.935 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.935 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.935 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.935 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.935 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 
06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.935 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.935 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.935 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.935 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.935 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.935 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.935 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.935 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.935 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.935 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:20:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.935 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.935 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.935 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.935 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.935 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.935 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.935 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.935 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.935 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.935 12 ERROR oslo_messaging.notify.messaging Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.936 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters Oct 14 06:20:49 
localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.936 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.write.latency volume: 1178650666 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.936 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.write.latency volume: 22746151 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.937 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a60cd625-a44f-4095-8fea-22c079fa2540', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1178650666, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T10:20:49.936632', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 
'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '74d2c7fa-a8e7-11f0-9707-fa163e99780b', 'monotonic_time': 13266.082021213, 'message_signature': '4d954e4570e6f3b79844822c527fbf713b44655167223a1ae7aa588a1e1b3896'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 22746151, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T10:20:49.936632', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '74d2d1be-a8e7-11f0-9707-fa163e99780b', 'monotonic_time': 13266.082021213, 'message_signature': '71164bc5dbfbc238891a3d4588848e973f026441f68521b9bb9a5a6e75a3958e'}]}, 'timestamp': '2025-10-14 10:20:49.937169', '_unique_id': '78a364361d724c6ea51cbf571f760561'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.937 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.937 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.937 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.937 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.937 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.937 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.937 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.937 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.937 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.937 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.937 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.937 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in 
establish_connection Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.937 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.937 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.937 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.937 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.937 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.937 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.937 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.937 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.937 12 ERROR oslo_messaging.notify.messaging Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.937 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.937 12 ERROR oslo_messaging.notify.messaging Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.937 12 ERROR 
oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.937 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.937 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.937 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.937 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.937 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.937 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.937 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.937 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.937 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 
2025-10-14 10:20:49.937 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.937 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.937 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.937 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.937 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.937 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.937 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.937 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.937 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.937 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 
10:20:49.937 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.937 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.937 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.937 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.937 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.937 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.937 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.937 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.937 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.937 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.937 12 ERROR oslo_messaging.notify.messaging Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.938 12 INFO 
ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.938 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.939 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f1630821-f52b-402a-865c-1b69885b4b75', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T10:20:49.938532', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 
'vnic_name': 'tap3ec9b060-f4'}, 'message_id': '74d31250-a8e7-11f0-9707-fa163e99780b', 'monotonic_time': 13266.043272253, 'message_signature': '883e770d401a89523717ff03e963dfe6ba747c5e664720859bc3f43358d23b3b'}]}, 'timestamp': '2025-10-14 10:20:49.938859', '_unique_id': '70ef794e849c450e8a38631c2f7e3a14'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.939 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.939 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.939 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.939 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.939 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.939 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.939 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.939 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.939 12 ERROR oslo_messaging.notify.messaging self._connection = 
self._establish_connection() Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.939 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.939 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.939 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.939 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.939 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.939 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.939 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.939 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.939 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.939 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.939 12 ERROR 
oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.939 12 ERROR oslo_messaging.notify.messaging Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.939 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.939 12 ERROR oslo_messaging.notify.messaging Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.939 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.939 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.939 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.939 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.939 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.939 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.939 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.939 12 
ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.939 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.939 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.939 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.939 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.939 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.939 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.939 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.939 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.939 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 
10:20:49.939 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.939 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.939 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.939 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.939 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.939 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.939 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.939 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.939 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.939 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.939 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 
450, in _reraise_as_library_errors Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.939 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.939 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.939 12 ERROR oslo_messaging.notify.messaging Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.940 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.940 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.941 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '4539ac33-3e24-4147-9b18-3dbc2b159726', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T10:20:49.940166', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': '74d35152-a8e7-11f0-9707-fa163e99780b', 'monotonic_time': 13266.043272253, 'message_signature': '270590f325c995e450adf7874989b53adceb0a3cfd68347d133edc6ff639d93c'}]}, 'timestamp': '2025-10-14 10:20:49.940456', '_unique_id': 'a8051ab7aa3d4f678a0218004b0cda0a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.941 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 
2025-10-14 10:20:49.941 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.941 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.941 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.941 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.941 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.941 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.941 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.941 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.941 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.941 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.941 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.941 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.941 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.941 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.941 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.941 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.941 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.941 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.941 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.941 12 ERROR oslo_messaging.notify.messaging Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.941 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.941 12 ERROR oslo_messaging.notify.messaging Oct 14 06:20:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.941 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.941 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.941 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.941 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.941 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.941 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.941 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.941 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.941 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.941 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.941 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.941 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.941 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.941 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.941 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.941 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.941 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.941 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.941 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.941 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:20:49 
localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.941 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.941 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.941 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.941 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.941 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.941 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.941 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.941 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.941 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.941 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:20:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.941 12 ERROR oslo_messaging.notify.messaging Oct 14 06:20:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:20:49.941 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Oct 14 06:20:50 localhost nova_compute[297686]: 2025-10-14 10:20:50.283 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:20:50 localhost nova_compute[297686]: 2025-10-14 10:20:50.407 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:20:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f. Oct 14 06:20:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917. Oct 14 06:20:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d. 
Oct 14 06:20:51 localhost podman[338087]: 2025-10-14 10:20:51.744082805 +0000 UTC m=+0.080941878 container health_status 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3) Oct 14 06:20:51 localhost systemd[1]: tmp-crun.wiAtmd.mount: Deactivated successfully. 
Oct 14 06:20:51 localhost podman[338088]: 2025-10-14 10:20:51.834193906 +0000 UTC m=+0.163305849 container health_status 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.openshift.expose-services=, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', 
'/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, vendor=Red Hat, Inc., version=9.6, architecture=x86_64, io.buildah.version=1.33.7, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, managed_by=edpm_ansible, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, config_id=edpm) Oct 14 06:20:51 localhost podman[338088]: 2025-10-14 10:20:51.841044007 +0000 UTC m=+0.170155940 container exec_died 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9-minimal, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, architecture=x86_64, vendor=Red Hat, Inc., version=9.6, io.buildah.version=1.33.7, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, config_id=edpm, distribution-scope=public, managed_by=edpm_ansible, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers) Oct 14 06:20:51 localhost systemd[1]: 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917.service: Deactivated successfully. 
Oct 14 06:20:51 localhost podman[338087]: 2025-10-14 10:20:51.85401783 +0000 UTC m=+0.190876883 container exec_died 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251009, maintainer=OpenStack Kubernetes Operator team) Oct 14 06:20:51 localhost systemd[1]: 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f.service: Deactivated successfully. 
Oct 14 06:20:51 localhost podman[338089]: 2025-10-14 10:20:51.906388142 +0000 UTC m=+0.236428485 container health_status 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible) Oct 14 06:20:51 localhost podman[338089]: 2025-10-14 10:20:51.914442422 +0000 UTC m=+0.244482765 container exec_died 
89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.license=GPLv2, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true, org.label-schema.build-date=20251009, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS) Oct 14 06:20:51 localhost systemd[1]: 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d.service: Deactivated successfully. 
Oct 14 06:20:52 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch Oct 14 06:20:52 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/6e43484d-3647-4085-8cab-db9b4f4530f7/09344f24-aa16-4ae2-bf7b-5dc05ab244e1", "osd", "allow rw pool=manila_data namespace=fsvolumens_6e43484d-3647-4085-8cab-db9b4f4530f7", "mon", "allow r"], "format": "json"} : dispatch Oct 14 06:20:52 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/6e43484d-3647-4085-8cab-db9b4f4530f7/09344f24-aa16-4ae2-bf7b-5dc05ab244e1", "osd", "allow rw pool=manila_data namespace=fsvolumens_6e43484d-3647-4085-8cab-db9b4f4530f7", "mon", "allow r"], "format": "json"}]': finished Oct 14 06:20:54 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e214 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Oct 14 06:20:55 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e215 e215: 6 total, 6 up, 6 in Oct 14 06:20:55 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Oct 14 06:20:55 localhost ceph-mon[317114]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1068749509' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Oct 14 06:20:55 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Oct 14 06:20:55 localhost ceph-mon[317114]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.32:0/1068749509' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Oct 14 06:20:55 localhost nova_compute[297686]: 2025-10-14 10:20:55.288 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:20:55 localhost nova_compute[297686]: 2025-10-14 10:20:55.408 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:20:55 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch Oct 14 06:20:55 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch Oct 14 06:20:55 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd='[{"prefix": "auth rm", "entity": "client.alice"}]': finished Oct 14 06:20:55 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Oct 14 06:20:55 localhost ceph-mon[317114]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1108498055' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Oct 14 06:20:55 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Oct 14 06:20:55 localhost ceph-mon[317114]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.32:0/1108498055' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Oct 14 06:20:56 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get", "entity": "client.david", "format": "json"} : dispatch Oct 14 06:20:56 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth rm", "entity": "client.david"} : dispatch Oct 14 06:20:56 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd='[{"prefix": "auth rm", "entity": "client.david"}]': finished Oct 14 06:20:56 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e216 e216: 6 total, 6 up, 6 in Oct 14 06:20:57 localhost ovn_metadata_agent[163050]: 2025-10-14 10:20:57.787 163055 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Oct 14 06:20:57 localhost ovn_metadata_agent[163050]: 2025-10-14 10:20:57.788 163055 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Oct 14 06:20:57 localhost ovn_metadata_agent[163050]: 2025-10-14 10:20:57.789 163055 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Oct 14 06:20:58 localhost podman[248187]: time="2025-10-14T10:20:58Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Oct 14 06:20:58 localhost 
podman[248187]: @ - - [14/Oct/2025:10:20:58 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 147497 "" "Go-http-client/1.1" Oct 14 06:20:58 localhost podman[248187]: @ - - [14/Oct/2025:10:20:58 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19871 "" "Go-http-client/1.1" Oct 14 06:20:58 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e217 e217: 6 total, 6 up, 6 in Oct 14 06:20:58 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch Oct 14 06:20:58 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/6e43484d-3647-4085-8cab-db9b4f4530f7/09344f24-aa16-4ae2-bf7b-5dc05ab244e1", "osd", "allow r pool=manila_data namespace=fsvolumens_6e43484d-3647-4085-8cab-db9b4f4530f7", "mon", "allow r"], "format": "json"} : dispatch Oct 14 06:20:58 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/6e43484d-3647-4085-8cab-db9b4f4530f7/09344f24-aa16-4ae2-bf7b-5dc05ab244e1", "osd", "allow r pool=manila_data namespace=fsvolumens_6e43484d-3647-4085-8cab-db9b4f4530f7", "mon", "allow r"], "format": "json"}]': finished Oct 14 06:20:59 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e217 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Oct 14 06:21:00 localhost nova_compute[297686]: 2025-10-14 10:21:00.291 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 
06:21:00 localhost nova_compute[297686]: 2025-10-14 10:21:00.409 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:21:01 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e218 e218: 6 total, 6 up, 6 in Oct 14 06:21:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041. Oct 14 06:21:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857. Oct 14 06:21:01 localhost systemd[1]: tmp-crun.8efz1f.mount: Deactivated successfully. Oct 14 06:21:01 localhost podman[338151]: 2025-10-14 10:21:01.742580046 +0000 UTC m=+0.088211263 container health_status 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true) Oct 14 06:21:01 localhost podman[338151]: 2025-10-14 10:21:01.776093624 +0000 UTC m=+0.121724791 container exec_died 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ovn_metadata_agent, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true) Oct 14 06:21:01 localhost systemd[1]: 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857.service: Deactivated successfully. Oct 14 06:21:01 localhost podman[338150]: 2025-10-14 10:21:01.828856659 +0000 UTC m=+0.170754020 container health_status 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Oct 14 06:21:01 localhost podman[338150]: 2025-10-14 10:21:01.8379542 +0000 UTC m=+0.179851541 container exec_died 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041 
(image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Oct 14 06:21:01 localhost systemd[1]: 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041.service: Deactivated successfully. 
Oct 14 06:21:01 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:21:01.952 271987 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-10-14T10:21:01Z, description=, device_id=3cf1c66f-ba39-44a2-95bc-f46e66200037, device_owner=network:router_gateway, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=ba980171-0a97-4860-aac9-0f5a11558c1d, ip_allocation=immediate, mac_address=fa:16:3e:0f:6b:84, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-10-14T08:36:38Z, description=, dns_domain=, id=c0145816-4627-44f2-af00-ccc9ef0436ed, ipv4_address_scope=None, ipv6_address_scope=None, is_default=False, l2_adjacency=True, mtu=1350, name=public, port_security_enabled=True, project_id=41187b090f3d4818a32baa37ce8a3991, provider:network_type=flat, provider:physical_network=datacentre, provider:segmentation_id=None, qos_policy_id=None, revision_number=2, router:external=True, shared=False, standard_attr_id=29, status=ACTIVE, subnets=['bbd40dc1-97d8-4ed3-9d4f-9b3af758b526'], tags=[], tenant_id=41187b090f3d4818a32baa37ce8a3991, updated_at=2025-10-14T08:36:44Z, vlan_transparent=None, network_id=c0145816-4627-44f2-af00-ccc9ef0436ed, port_security_enabled=False, project_id=, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=3294, status=DOWN, tags=[], tenant_id=, updated_at=2025-10-14T10:21:01Z on network c0145816-4627-44f2-af00-ccc9ef0436ed#033[00m Oct 14 06:21:02 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch Oct 14 06:21:02 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' 
entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch Oct 14 06:21:02 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd='[{"prefix": "auth rm", "entity": "client.alice"}]': finished Oct 14 06:21:02 localhost podman[338206]: 2025-10-14 10:21:02.15660115 +0000 UTC m=+0.056642796 container kill 27e23fba1f10c3118d508631d350793857acd97085c399f85eed727cd1eab7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0145816-4627-44f2-af00-ccc9ef0436ed, io.buildah.version=1.41.3, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009) Oct 14 06:21:02 localhost dnsmasq[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/addn_hosts - 3 addresses Oct 14 06:21:02 localhost dnsmasq-dhcp[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/host Oct 14 06:21:02 localhost dnsmasq-dhcp[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/opts Oct 14 06:21:02 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:21:02.431 271987 INFO neutron.agent.dhcp.agent [None req-dc09c49c-ad85-42af-aead-5ba54f39c9ba - - - - - -] DHCP configuration for ports {'ba980171-0a97-4860-aac9-0f5a11558c1d'} is completed#033[00m Oct 14 06:21:03 localhost nova_compute[297686]: 2025-10-14 10:21:03.067 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:21:04 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e219 e219: 6 total, 6 up, 6 in Oct 14 06:21:04 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e219 _set_new_cache_sizes 
cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Oct 14 06:21:05 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e220 e220: 6 total, 6 up, 6 in Oct 14 06:21:05 localhost nova_compute[297686]: 2025-10-14 10:21:05.304 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:21:05 localhost nova_compute[297686]: 2025-10-14 10:21:05.410 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:21:05 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch Oct 14 06:21:05 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/6e43484d-3647-4085-8cab-db9b4f4530f7/09344f24-aa16-4ae2-bf7b-5dc05ab244e1", "osd", "allow rw pool=manila_data namespace=fsvolumens_6e43484d-3647-4085-8cab-db9b4f4530f7", "mon", "allow r"], "format": "json"} : dispatch Oct 14 06:21:05 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/6e43484d-3647-4085-8cab-db9b4f4530f7/09344f24-aa16-4ae2-bf7b-5dc05ab244e1", "osd", "allow rw pool=manila_data namespace=fsvolumens_6e43484d-3647-4085-8cab-db9b4f4530f7", "mon", "allow r"], "format": "json"}]': finished Oct 14 06:21:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad. 
Oct 14 06:21:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec. Oct 14 06:21:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29. Oct 14 06:21:07 localhost podman[338228]: 2025-10-14 10:21:07.757594138 +0000 UTC m=+0.090591987 container health_status aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors ) Oct 14 06:21:07 localhost podman[338228]: 2025-10-14 10:21:07.76509966 +0000 UTC m=+0.098097479 container exec_died aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec 
(image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Oct 14 06:21:07 localhost systemd[1]: aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec.service: Deactivated successfully. Oct 14 06:21:07 localhost systemd[1]: tmp-crun.drXe4D.mount: Deactivated successfully. 
Oct 14 06:21:07 localhost podman[338227]: 2025-10-14 10:21:07.802777967 +0000 UTC m=+0.139334807 container health_status 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, config_id=multipathd, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d) Oct 14 06:21:07 localhost podman[338227]: 2025-10-14 10:21:07.813170369 +0000 UTC m=+0.149727189 container exec_died 
02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=multipathd) Oct 14 06:21:07 localhost systemd[1]: 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad.service: Deactivated successfully. 
Oct 14 06:21:07 localhost podman[338229]: 2025-10-14 10:21:07.855343605 +0000 UTC m=+0.183979859 container health_status fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20251009, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, config_id=iscsid) Oct 14 06:21:07 localhost podman[338229]: 2025-10-14 10:21:07.870263357 +0000 UTC m=+0.198899651 container exec_died fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 
(image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, config_id=iscsid, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=iscsid, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team) Oct 14 06:21:07 localhost systemd[1]: fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29.service: Deactivated successfully. 
Oct 14 06:21:08 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch Oct 14 06:21:08 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch Oct 14 06:21:08 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd='[{"prefix": "auth rm", "entity": "client.alice_bob"}]': finished Oct 14 06:21:08 localhost openstack_network_exporter[250374]: ERROR 10:21:08 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Oct 14 06:21:08 localhost openstack_network_exporter[250374]: ERROR 10:21:08 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Oct 14 06:21:08 localhost openstack_network_exporter[250374]: ERROR 10:21:08 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Oct 14 06:21:08 localhost openstack_network_exporter[250374]: ERROR 10:21:08 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Oct 14 06:21:08 localhost openstack_network_exporter[250374]: Oct 14 06:21:08 localhost openstack_network_exporter[250374]: ERROR 10:21:08 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Oct 14 06:21:08 localhost openstack_network_exporter[250374]: Oct 14 06:21:09 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e220 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Oct 14 06:21:09 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e221 e221: 6 total, 6 up, 6 in Oct 14 06:21:10 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e222 e222: 6 total, 6 up, 6 in Oct 14 06:21:10 localhost nova_compute[297686]: 
2025-10-14 10:21:10.309 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:21:10 localhost nova_compute[297686]: 2025-10-14 10:21:10.412 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:21:11 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch Oct 14 06:21:11 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/6e43484d-3647-4085-8cab-db9b4f4530f7/09344f24-aa16-4ae2-bf7b-5dc05ab244e1", "osd", "allow r pool=manila_data namespace=fsvolumens_6e43484d-3647-4085-8cab-db9b4f4530f7", "mon", "allow r"], "format": "json"} : dispatch Oct 14 06:21:11 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/6e43484d-3647-4085-8cab-db9b4f4530f7/09344f24-aa16-4ae2-bf7b-5dc05ab244e1", "osd", "allow r pool=manila_data namespace=fsvolumens_6e43484d-3647-4085-8cab-db9b4f4530f7", "mon", "allow r"], "format": "json"}]': finished Oct 14 06:21:14 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e222 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Oct 14 06:21:14 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e223 e223: 6 total, 6 up, 6 in Oct 14 06:21:15 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e224 e224: 6 total, 6 up, 6 in Oct 14 06:21:15 localhost nova_compute[297686]: 2025-10-14 10:21:15.313 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 
20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:21:15 localhost nova_compute[297686]: 2025-10-14 10:21:15.413 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:21:15 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch Oct 14 06:21:15 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch Oct 14 06:21:15 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd='[{"prefix": "auth rm", "entity": "client.alice_bob"}]': finished Oct 14 06:21:17 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e225 e225: 6 total, 6 up, 6 in Oct 14 06:21:19 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e225 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Oct 14 06:21:19 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch Oct 14 06:21:19 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/6e43484d-3647-4085-8cab-db9b4f4530f7/09344f24-aa16-4ae2-bf7b-5dc05ab244e1", "osd", "allow rw pool=manila_data namespace=fsvolumens_6e43484d-3647-4085-8cab-db9b4f4530f7", "mon", "allow r"], "format": "json"} : dispatch Oct 14 06:21:19 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice 
bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/6e43484d-3647-4085-8cab-db9b4f4530f7/09344f24-aa16-4ae2-bf7b-5dc05ab244e1", "osd", "allow rw pool=manila_data namespace=fsvolumens_6e43484d-3647-4085-8cab-db9b4f4530f7", "mon", "allow r"], "format": "json"}]': finished Oct 14 06:21:19 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e226 e226: 6 total, 6 up, 6 in Oct 14 06:21:20 localhost nova_compute[297686]: 2025-10-14 10:21:20.324 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:21:20 localhost nova_compute[297686]: 2025-10-14 10:21:20.415 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:21:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f. Oct 14 06:21:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917. Oct 14 06:21:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d. 
Oct 14 06:21:22 localhost podman[338290]: 2025-10-14 10:21:22.762253376 +0000 UTC m=+0.096987665 container health_status 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251009, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team) Oct 14 06:21:22 localhost podman[338292]: 2025-10-14 10:21:22.862083148 +0000 UTC m=+0.186311402 container health_status 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 
'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_id=edpm, container_name=ceilometer_agent_compute) Oct 14 06:21:22 localhost podman[338291]: 2025-10-14 10:21:22.828618432 +0000 UTC m=+0.156992354 container health_status 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, 
managed_by=edpm_ansible, container_name=openstack_network_exporter, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, release=1755695350, vcs-type=git, io.buildah.version=1.33.7, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, 
maintainer=Red Hat, Inc.) Oct 14 06:21:22 localhost podman[338290]: 2025-10-14 10:21:22.890151938 +0000 UTC m=+0.224886247 container exec_died 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_controller) Oct 14 06:21:22 localhost systemd[1]: 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f.service: Deactivated successfully. 
Oct 14 06:21:22 localhost podman[338291]: 2025-10-14 10:21:22.912078797 +0000 UTC m=+0.240452679 container exec_died 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, container_name=openstack_network_exporter, io.buildah.version=1.33.7, release=1755695350, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, vcs-type=git, com.redhat.component=ubi9-minimal-container) Oct 14 06:21:22 localhost systemd[1]: 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917.service: Deactivated successfully. Oct 14 06:21:22 localhost podman[338292]: 2025-10-14 10:21:22.945788311 +0000 UTC m=+0.270016585 container exec_died 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', 
'/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d) Oct 14 06:21:22 localhost systemd[1]: 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d.service: Deactivated successfully. Oct 14 06:21:23 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch Oct 14 06:21:23 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch Oct 14 06:21:23 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd='[{"prefix": "auth rm", "entity": "client.alice bob"}]': finished Oct 14 06:21:23 localhost ovn_controller[157396]: 2025-10-14T10:21:23Z|00288|binding|INFO|Releasing lport 25c6586a-239c-451b-aac2-e0a3ee5c3145 from this chassis (sb_readonly=0) Oct 14 06:21:23 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e227 e227: 6 total, 6 up, 6 in Oct 14 06:21:23 localhost nova_compute[297686]: 2025-10-14 10:21:23.714 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:21:23 
localhost podman[338372]: 2025-10-14 10:21:23.749178263 +0000 UTC m=+0.061625520 container kill 27e23fba1f10c3118d508631d350793857acd97085c399f85eed727cd1eab7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0145816-4627-44f2-af00-ccc9ef0436ed, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true) Oct 14 06:21:23 localhost dnsmasq[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/addn_hosts - 2 addresses Oct 14 06:21:23 localhost dnsmasq-dhcp[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/host Oct 14 06:21:23 localhost dnsmasq-dhcp[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/opts Oct 14 06:21:24 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Oct 14 06:21:24 localhost ceph-mon[317114]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1264590158' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Oct 14 06:21:24 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Oct 14 06:21:24 localhost ceph-mon[317114]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.32:0/1264590158' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Oct 14 06:21:24 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e227 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Oct 14 06:21:25 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e228 e228: 6 total, 6 up, 6 in Oct 14 06:21:25 localhost nova_compute[297686]: 2025-10-14 10:21:25.328 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:21:25 localhost nova_compute[297686]: 2025-10-14 10:21:25.416 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:21:26 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e229 e229: 6 total, 6 up, 6 in Oct 14 06:21:27 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch Oct 14 06:21:27 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/6e43484d-3647-4085-8cab-db9b4f4530f7/09344f24-aa16-4ae2-bf7b-5dc05ab244e1", "osd", "allow r pool=manila_data namespace=fsvolumens_6e43484d-3647-4085-8cab-db9b4f4530f7", "mon", "allow r"], "format": "json"} : dispatch Oct 14 06:21:27 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/6e43484d-3647-4085-8cab-db9b4f4530f7/09344f24-aa16-4ae2-bf7b-5dc05ab244e1", "osd", "allow r pool=manila_data 
namespace=fsvolumens_6e43484d-3647-4085-8cab-db9b4f4530f7", "mon", "allow r"], "format": "json"}]': finished Oct 14 06:21:27 localhost ovn_controller[157396]: 2025-10-14T10:21:27Z|00289|binding|INFO|Releasing lport 25c6586a-239c-451b-aac2-e0a3ee5c3145 from this chassis (sb_readonly=0) Oct 14 06:21:27 localhost dnsmasq[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/addn_hosts - 1 addresses Oct 14 06:21:27 localhost dnsmasq-dhcp[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/host Oct 14 06:21:27 localhost dnsmasq-dhcp[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/opts Oct 14 06:21:27 localhost podman[338409]: 2025-10-14 10:21:27.282294745 +0000 UTC m=+0.049972039 container kill 27e23fba1f10c3118d508631d350793857acd97085c399f85eed727cd1eab7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0145816-4627-44f2-af00-ccc9ef0436ed, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team) Oct 14 06:21:27 localhost nova_compute[297686]: 2025-10-14 10:21:27.334 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:21:28 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e230 e230: 6 total, 6 up, 6 in Oct 14 06:21:28 localhost podman[248187]: time="2025-10-14T10:21:28Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Oct 14 06:21:28 localhost podman[248187]: @ - - [14/Oct/2025:10:21:28 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 147497 "" 
"Go-http-client/1.1" Oct 14 06:21:28 localhost podman[248187]: @ - - [14/Oct/2025:10:21:28 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19872 "" "Go-http-client/1.1" Oct 14 06:21:29 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e230 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Oct 14 06:21:30 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch Oct 14 06:21:30 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch Oct 14 06:21:30 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd='[{"prefix": "auth rm", "entity": "client.alice bob"}]': finished Oct 14 06:21:30 localhost nova_compute[297686]: 2025-10-14 10:21:30.329 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:21:30 localhost nova_compute[297686]: 2025-10-14 10:21:30.417 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:21:31 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e231 e231: 6 total, 6 up, 6 in Oct 14 06:21:32 localhost nova_compute[297686]: 2025-10-14 10:21:32.254 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:21:32 localhost nova_compute[297686]: 2025-10-14 10:21:32.255 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running 
periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:21:32 localhost nova_compute[297686]: 2025-10-14 10:21:32.255 2 DEBUG nova.compute.manager [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Cleaning up deleted instances with incomplete migration _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m Oct 14 06:21:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041. Oct 14 06:21:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857. Oct 14 06:21:32 localhost ovn_metadata_agent[163050]: 2025-10-14 10:21:32.729 163055 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=20, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'b6:6b:50', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '6a:59:81:01:bc:8b'}, ipsec=False) old=SB_Global(nb_cfg=19) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Oct 14 06:21:32 localhost ovn_metadata_agent[163050]: 2025-10-14 10:21:32.730 163055 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m Oct 14 06:21:32 localhost ovn_metadata_agent[163050]: 2025-10-14 10:21:32.731 163055 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=9e4b0f79-1220-4c7d-a18d-fa1a88dab362, 
col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '20'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Oct 14 06:21:32 localhost systemd[1]: tmp-crun.wKdlAh.mount: Deactivated successfully. Oct 14 06:21:32 localhost nova_compute[297686]: 2025-10-14 10:21:32.754 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:21:32 localhost podman[338430]: 2025-10-14 10:21:32.762608262 +0000 UTC m=+0.104331832 container health_status 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image) Oct 14 06:21:32 localhost podman[338430]: 2025-10-14 10:21:32.791906449 +0000 UTC m=+0.133629979 container exec_died 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5) Oct 14 06:21:32 localhost systemd[1]: 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857.service: Deactivated successfully. Oct 14 06:21:32 localhost podman[338429]: 2025-10-14 10:21:32.763703596 +0000 UTC m=+0.104933461 container health_status 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Oct 14 06:21:32 localhost podman[338429]: 2025-10-14 10:21:32.843481437 +0000 UTC m=+0.184711282 container exec_died 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 
'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Oct 14 06:21:32 localhost systemd[1]: 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041.service: Deactivated successfully. Oct 14 06:21:33 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e232 e232: 6 total, 6 up, 6 in Oct 14 06:21:33 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch Oct 14 06:21:34 localhost nova_compute[297686]: 2025-10-14 10:21:34.272 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:21:34 localhost nova_compute[297686]: 2025-10-14 10:21:34.273 2 DEBUG nova.compute.manager [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Oct 14 06:21:34 localhost nova_compute[297686]: 2025-10-14 10:21:34.273 2 DEBUG nova.compute.manager [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache 
/usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Oct 14 06:21:34 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/6e43484d-3647-4085-8cab-db9b4f4530f7/09344f24-aa16-4ae2-bf7b-5dc05ab244e1", "osd", "allow rw pool=manila_data namespace=fsvolumens_6e43484d-3647-4085-8cab-db9b4f4530f7", "mon", "allow r"], "format": "json"} : dispatch Oct 14 06:21:34 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/6e43484d-3647-4085-8cab-db9b4f4530f7/09344f24-aa16-4ae2-bf7b-5dc05ab244e1", "osd", "allow rw pool=manila_data namespace=fsvolumens_6e43484d-3647-4085-8cab-db9b4f4530f7", "mon", "allow r"], "format": "json"}]': finished Oct 14 06:21:34 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Oct 14 06:21:34 localhost nova_compute[297686]: 2025-10-14 10:21:34.558 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Acquiring lock "refresh_cache-88c4e366-b765-47a6-96bf-f7677f2ce67c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Oct 14 06:21:34 localhost nova_compute[297686]: 2025-10-14 10:21:34.558 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Acquired lock "refresh_cache-88c4e366-b765-47a6-96bf-f7677f2ce67c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Oct 14 06:21:34 localhost nova_compute[297686]: 2025-10-14 10:21:34.559 2 DEBUG nova.network.neutron [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] [instance: 88c4e366-b765-47a6-96bf-f7677f2ce67c] 
Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Oct 14 06:21:34 localhost nova_compute[297686]: 2025-10-14 10:21:34.559 2 DEBUG nova.objects.instance [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Lazy-loading 'info_cache' on Instance uuid 88c4e366-b765-47a6-96bf-f7677f2ce67c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Oct 14 06:21:34 localhost nova_compute[297686]: 2025-10-14 10:21:34.974 2 DEBUG nova.network.neutron [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] [instance: 88c4e366-b765-47a6-96bf-f7677f2ce67c] Updating instance_info_cache with network_info: [{"id": "3ec9b060-f43d-4698-9c76-6062c70911d5", "address": "fa:16:3e:84:5e:e5", "network": {"id": "7d0cd696-bdd7-4e70-9512-eb0d23640314", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.46", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "41187b090f3d4818a32baa37ce8a3991", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3ec9b060-f4", "ovs_interfaceid": "3ec9b060-f43d-4698-9c76-6062c70911d5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Oct 14 06:21:34 localhost nova_compute[297686]: 2025-10-14 10:21:34.991 
2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Releasing lock "refresh_cache-88c4e366-b765-47a6-96bf-f7677f2ce67c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Oct 14 06:21:34 localhost nova_compute[297686]: 2025-10-14 10:21:34.991 2 DEBUG nova.compute.manager [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] [instance: 88c4e366-b765-47a6-96bf-f7677f2ce67c] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Oct 14 06:21:34 localhost nova_compute[297686]: 2025-10-14 10:21:34.992 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:21:35 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e233 e233: 6 total, 6 up, 6 in Oct 14 06:21:35 localhost nova_compute[297686]: 2025-10-14 10:21:35.254 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:21:35 localhost nova_compute[297686]: 2025-10-14 10:21:35.255 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:21:35 localhost nova_compute[297686]: 2025-10-14 10:21:35.255 2 DEBUG nova.compute.manager [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m Oct 14 06:21:35 localhost nova_compute[297686]: 
2025-10-14 10:21:35.270 2 DEBUG nova.compute.manager [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m Oct 14 06:21:35 localhost nova_compute[297686]: 2025-10-14 10:21:35.332 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:21:35 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Oct 14 06:21:35 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' Oct 14 06:21:35 localhost nova_compute[297686]: 2025-10-14 10:21:35.419 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:21:36 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e234 e234: 6 total, 6 up, 6 in Oct 14 06:21:37 localhost nova_compute[297686]: 2025-10-14 10:21:37.271 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:21:37 localhost nova_compute[297686]: 2025-10-14 10:21:37.271 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:21:37 localhost nova_compute[297686]: 2025-10-14 10:21:37.271 2 DEBUG nova.compute.manager [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... 
_reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Oct 14 06:21:37 localhost nova_compute[297686]: 2025-10-14 10:21:37.272 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:21:37 localhost nova_compute[297686]: 2025-10-14 10:21:37.297 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Oct 14 06:21:37 localhost nova_compute[297686]: 2025-10-14 10:21:37.297 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Oct 14 06:21:37 localhost nova_compute[297686]: 2025-10-14 10:21:37.298 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Oct 14 06:21:37 localhost nova_compute[297686]: 2025-10-14 10:21:37.298 2 DEBUG nova.compute.resource_tracker [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Auditing locally available compute resources for np0005486733.localdomain (node: np0005486733.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Oct 14 06:21:37 localhost nova_compute[297686]: 2025-10-14 10:21:37.298 2 DEBUG 
oslo_concurrency.processutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Oct 14 06:21:37 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch Oct 14 06:21:37 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch Oct 14 06:21:37 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd='[{"prefix": "auth rm", "entity": "client.alice"}]': finished Oct 14 06:21:37 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e235 e235: 6 total, 6 up, 6 in Oct 14 06:21:37 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Oct 14 06:21:37 localhost ceph-mon[317114]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3664459616' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Oct 14 06:21:37 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Oct 14 06:21:37 localhost ceph-mon[317114]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3664459616' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Oct 14 06:21:37 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Oct 14 06:21:37 localhost ceph-mon[317114]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.108:0/3873837423' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Oct 14 06:21:37 localhost nova_compute[297686]: 2025-10-14 10:21:37.772 2 DEBUG oslo_concurrency.processutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.474s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Oct 14 06:21:37 localhost nova_compute[297686]: 2025-10-14 10:21:37.847 2 DEBUG nova.virt.libvirt.driver [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Oct 14 06:21:37 localhost nova_compute[297686]: 2025-10-14 10:21:37.847 2 DEBUG nova.virt.libvirt.driver [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Oct 14 06:21:38 localhost nova_compute[297686]: 2025-10-14 10:21:38.084 2 WARNING nova.virt.libvirt.driver [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Oct 14 06:21:38 localhost nova_compute[297686]: 2025-10-14 10:21:38.086 2 DEBUG nova.compute.resource_tracker [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Hypervisor/Node resource view: name=np0005486733.localdomain free_ram=11159MB free_disk=41.83695602416992GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": 
"1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Oct 14 06:21:38 localhost nova_compute[297686]: 2025-10-14 10:21:38.087 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Oct 14 06:21:38 localhost nova_compute[297686]: 2025-10-14 10:21:38.088 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Oct 14 06:21:38 localhost nova_compute[297686]: 2025-10-14 10:21:38.410 2 DEBUG nova.compute.resource_tracker [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Instance 88c4e366-b765-47a6-96bf-f7677f2ce67c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Oct 14 06:21:38 localhost nova_compute[297686]: 2025-10-14 10:21:38.412 2 DEBUG nova.compute.resource_tracker [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Oct 14 06:21:38 localhost nova_compute[297686]: 2025-10-14 10:21:38.413 2 DEBUG nova.compute.resource_tracker [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Final resource view: name=np0005486733.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Oct 14 06:21:38 localhost nova_compute[297686]: 2025-10-14 10:21:38.592 2 DEBUG nova.scheduler.client.report [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Refreshing inventories for resource provider 18c24273-aca2-4f08-be57-3188d558235e _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m Oct 14 06:21:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad. Oct 14 06:21:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec. Oct 14 06:21:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29. 
Oct 14 06:21:38 localhost podman[338576]: 2025-10-14 10:21:38.750816954 +0000 UTC m=+0.079622378 container health_status 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009) Oct 14 06:21:38 localhost nova_compute[297686]: 2025-10-14 10:21:38.752 2 DEBUG nova.scheduler.client.report [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce 
- - - - - -] Updating ProviderTree inventory for provider 18c24273-aca2-4f08-be57-3188d558235e from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m Oct 14 06:21:38 localhost nova_compute[297686]: 2025-10-14 10:21:38.753 2 DEBUG nova.compute.provider_tree [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Updating inventory in ProviderTree for provider 18c24273-aca2-4f08-be57-3188d558235e with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m Oct 14 06:21:38 localhost podman[338576]: 2025-10-14 10:21:38.764309541 +0000 UTC m=+0.093114975 container exec_died 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_managed=true, managed_by=edpm_ansible, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, config_id=multipathd) Oct 14 06:21:38 localhost nova_compute[297686]: 2025-10-14 10:21:38.772 2 DEBUG nova.scheduler.client.report [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Refreshing aggregate associations for resource provider 18c24273-aca2-4f08-be57-3188d558235e, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m Oct 14 06:21:38 localhost openstack_network_exporter[250374]: ERROR 10:21:38 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Oct 14 06:21:38 localhost openstack_network_exporter[250374]: ERROR 10:21:38 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Oct 14 06:21:38 localhost openstack_network_exporter[250374]: ERROR 10:21:38 appctl.go:144: Failed to get PID for ovn-northd: no 
control socket files found for ovn-northd Oct 14 06:21:38 localhost openstack_network_exporter[250374]: ERROR 10:21:38 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Oct 14 06:21:38 localhost openstack_network_exporter[250374]: Oct 14 06:21:38 localhost openstack_network_exporter[250374]: ERROR 10:21:38 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Oct 14 06:21:38 localhost openstack_network_exporter[250374]: Oct 14 06:21:38 localhost systemd[1]: 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad.service: Deactivated successfully. Oct 14 06:21:38 localhost nova_compute[297686]: 2025-10-14 10:21:38.800 2 DEBUG nova.scheduler.client.report [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Refreshing trait associations for resource provider 18c24273-aca2-4f08-be57-3188d558235e, traits: COMPUTE_VOLUME_EXTEND,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE,HW_CPU_X86_BMI,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_MMX,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_SECURITY_TPM_2_0,HW_CPU_X86_FMA3,COMPUTE_GRAPHICS_MODEL_BOCHS,HW_CPU_X86_AVX2,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_SECURITY_TPM_1_2,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_AMD_SVM,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_ACCELERATORS,HW_CPU_X86_AESNI,COMPUTE_STORAGE_BUS_FDC,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_F16C,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SVM,COMPUTE_STORAGE_BUS_IDE,HW_CPU_X86_SSE4A,HW_CPU_X86_SSSE3,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NODE,HW_CPU_X86_BMI2,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_TRUSTED_CERTS,COMPUTE_STORAGE_BUS_SATA,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_GRAPHICS_MODEL_
CIRRUS,COMPUTE_STORAGE_BUS_USB,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_ABM,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_SSE2,HW_CPU_X86_AVX,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_CLMUL,HW_CPU_X86_SHA,COMPUTE_IMAGE_TYPE_RAW _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m Oct 14 06:21:38 localhost systemd[1]: tmp-crun.13VtC0.mount: Deactivated successfully. Oct 14 06:21:38 localhost podman[338577]: 2025-10-14 10:21:38.821942896 +0000 UTC m=+0.150603085 container health_status aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus 
Authors , managed_by=edpm_ansible) Oct 14 06:21:38 localhost podman[338577]: 2025-10-14 10:21:38.833539675 +0000 UTC m=+0.162199854 container exec_died aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Oct 14 06:21:38 localhost systemd[1]: aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec.service: Deactivated successfully. 
Oct 14 06:21:38 localhost nova_compute[297686]: 2025-10-14 10:21:38.850 2 DEBUG oslo_concurrency.processutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Oct 14 06:21:38 localhost systemd[1]: tmp-crun.JtdFOM.mount: Deactivated successfully. Oct 14 06:21:38 localhost podman[338578]: 2025-10-14 10:21:38.928327111 +0000 UTC m=+0.249305452 container health_status fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=iscsid, container_name=iscsid, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', 
'/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team) Oct 14 06:21:38 localhost podman[338578]: 2025-10-14 10:21:38.943427839 +0000 UTC m=+0.264406130 container exec_died fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, org.label-schema.vendor=CentOS, config_id=iscsid, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251009, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2) Oct 14 06:21:38 localhost systemd[1]: 
fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29.service: Deactivated successfully. Oct 14 06:21:39 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Oct 14 06:21:39 localhost ceph-mon[317114]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/686964912' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Oct 14 06:21:39 localhost nova_compute[297686]: 2025-10-14 10:21:39.279 2 DEBUG oslo_concurrency.processutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.429s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Oct 14 06:21:39 localhost nova_compute[297686]: 2025-10-14 10:21:39.286 2 DEBUG nova.compute.provider_tree [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Inventory has not changed in ProviderTree for provider: 18c24273-aca2-4f08-be57-3188d558235e update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Oct 14 06:21:39 localhost nova_compute[297686]: 2025-10-14 10:21:39.302 2 DEBUG nova.scheduler.client.report [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Inventory has not changed for provider 18c24273-aca2-4f08-be57-3188d558235e based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Oct 14 06:21:39 localhost nova_compute[297686]: 2025-10-14 10:21:39.304 2 DEBUG nova.compute.resource_tracker [None 
req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Compute_service record updated for np0005486733.localdomain:np0005486733.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Oct 14 06:21:39 localhost nova_compute[297686]: 2025-10-14 10:21:39.305 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.217s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Oct 14 06:21:39 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e235 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Oct 14 06:21:39 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' Oct 14 06:21:40 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e236 e236: 6 total, 6 up, 6 in Oct 14 06:21:40 localhost nova_compute[297686]: 2025-10-14 10:21:40.336 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:21:40 localhost nova_compute[297686]: 2025-10-14 10:21:40.421 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:21:40 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch Oct 14 06:21:40 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/6e43484d-3647-4085-8cab-db9b4f4530f7/09344f24-aa16-4ae2-bf7b-5dc05ab244e1", "osd", "allow r pool=manila_data 
namespace=fsvolumens_6e43484d-3647-4085-8cab-db9b4f4530f7", "mon", "allow r"], "format": "json"} : dispatch Oct 14 06:21:40 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/6e43484d-3647-4085-8cab-db9b4f4530f7/09344f24-aa16-4ae2-bf7b-5dc05ab244e1", "osd", "allow r pool=manila_data namespace=fsvolumens_6e43484d-3647-4085-8cab-db9b4f4530f7", "mon", "allow r"], "format": "json"}]': finished Oct 14 06:21:41 localhost nova_compute[297686]: 2025-10-14 10:21:41.290 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:21:41 localhost nova_compute[297686]: 2025-10-14 10:21:41.290 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:21:42 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e237 e237: 6 total, 6 up, 6 in Oct 14 06:21:43 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch Oct 14 06:21:43 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch Oct 14 06:21:43 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd='[{"prefix": "auth rm", "entity": "client.alice"}]': finished Oct 14 06:21:44 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e237 _set_new_cache_sizes cache_size:1020054731 
inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Oct 14 06:21:45 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e238 e238: 6 total, 6 up, 6 in Oct 14 06:21:45 localhost nova_compute[297686]: 2025-10-14 10:21:45.338 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:21:45 localhost nova_compute[297686]: 2025-10-14 10:21:45.423 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:21:47 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch Oct 14 06:21:47 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/6e43484d-3647-4085-8cab-db9b4f4530f7/09344f24-aa16-4ae2-bf7b-5dc05ab244e1", "osd", "allow rw pool=manila_data namespace=fsvolumens_6e43484d-3647-4085-8cab-db9b4f4530f7", "mon", "allow r"], "format": "json"} : dispatch Oct 14 06:21:47 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/6e43484d-3647-4085-8cab-db9b4f4530f7/09344f24-aa16-4ae2-bf7b-5dc05ab244e1", "osd", "allow rw pool=manila_data namespace=fsvolumens_6e43484d-3647-4085-8cab-db9b4f4530f7", "mon", "allow r"], "format": "json"}]': finished Oct 14 06:21:48 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Oct 14 06:21:48 localhost ceph-mon[317114]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.32:0/1329755738' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Oct 14 06:21:48 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Oct 14 06:21:48 localhost ceph-mon[317114]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1329755738' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Oct 14 06:21:49 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e238 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Oct 14 06:21:50 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e239 e239: 6 total, 6 up, 6 in Oct 14 06:21:50 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch Oct 14 06:21:50 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch Oct 14 06:21:50 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd='[{"prefix": "auth rm", "entity": "client.alice_bob"}]': finished Oct 14 06:21:50 localhost nova_compute[297686]: 2025-10-14 10:21:50.340 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:21:50 localhost nova_compute[297686]: 2025-10-14 10:21:50.425 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:21:53 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch Oct 14 
06:21:53 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/6e43484d-3647-4085-8cab-db9b4f4530f7/09344f24-aa16-4ae2-bf7b-5dc05ab244e1", "osd", "allow r pool=manila_data namespace=fsvolumens_6e43484d-3647-4085-8cab-db9b4f4530f7", "mon", "allow r"], "format": "json"} : dispatch Oct 14 06:21:53 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/6e43484d-3647-4085-8cab-db9b4f4530f7/09344f24-aa16-4ae2-bf7b-5dc05ab244e1", "osd", "allow r pool=manila_data namespace=fsvolumens_6e43484d-3647-4085-8cab-db9b4f4530f7", "mon", "allow r"], "format": "json"}]': finished Oct 14 06:21:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f. Oct 14 06:21:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917. Oct 14 06:21:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d. 
Oct 14 06:21:53 localhost podman[338663]: 2025-10-14 10:21:53.745105023 +0000 UTC m=+0.076136454 container health_status 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251009, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2) Oct 14 06:21:53 localhost podman[338663]: 2025-10-14 10:21:53.753979675 +0000 UTC m=+0.085011086 container exec_died 
89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, config_id=edpm, container_name=ceilometer_agent_compute, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS) Oct 14 06:21:53 localhost systemd[1]: 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d.service: Deactivated successfully. 
Oct 14 06:21:53 localhost podman[338662]: 2025-10-14 10:21:53.805877095 +0000 UTC m=+0.135813493 container health_status 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, version=9.6, container_name=openstack_network_exporter, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, vcs-type=git, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., name=ubi9-minimal, config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.) 
Oct 14 06:21:53 localhost podman[338662]: 2025-10-14 10:21:53.817100199 +0000 UTC m=+0.147036607 container exec_died 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, config_id=edpm, io.openshift.expose-services=, io.buildah.version=1.33.7, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, vcs-type=git, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., distribution-scope=public, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.) Oct 14 06:21:53 localhost systemd[1]: 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917.service: Deactivated successfully. Oct 14 06:21:53 localhost systemd[1]: tmp-crun.7aXZQd.mount: Deactivated successfully. 
Oct 14 06:21:53 localhost podman[338661]: 2025-10-14 10:21:53.890468187 +0000 UTC m=+0.226862782 container health_status 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, container_name=ovn_controller, org.label-schema.schema-version=1.0) Oct 14 06:21:53 localhost podman[338661]: 2025-10-14 10:21:53.953130237 +0000 UTC m=+0.289524862 container exec_died 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 
'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image) Oct 14 06:21:53 localhost systemd[1]: 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f.service: Deactivated successfully. 
Oct 14 06:21:54 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e239 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct 14 06:21:55 localhost nova_compute[297686]: 2025-10-14 10:21:55.343 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 14 06:21:55 localhost nova_compute[297686]: 2025-10-14 10:21:55.426 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 14 06:21:57 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch
Oct 14 06:21:57 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch
Oct 14 06:21:57 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd='[{"prefix": "auth rm", "entity": "client.alice_bob"}]': finished
Oct 14 06:21:57 localhost ovn_metadata_agent[163050]: 2025-10-14 10:21:57.788 163055 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct 14 06:21:57 localhost ovn_metadata_agent[163050]: 2025-10-14 10:21:57.789 163055 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct 14 06:21:57 localhost ovn_metadata_agent[163050]: 2025-10-14 10:21:57.790 163055 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct 14 06:21:58 localhost ovn_controller[157396]: 2025-10-14T10:21:58Z|00290|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Oct 14 06:21:58 localhost podman[248187]: time="2025-10-14T10:21:58Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 14 06:21:58 localhost podman[248187]: @ - - [14/Oct/2025:10:21:58 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 147497 "" "Go-http-client/1.1"
Oct 14 06:21:58 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e240 e240: 6 total, 6 up, 6 in
Oct 14 06:21:58 localhost podman[248187]: @ - - [14/Oct/2025:10:21:58 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19865 "" "Go-http-client/1.1"
Oct 14 06:21:59 localhost nova_compute[297686]: 2025-10-14 10:21:59.255 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 14 06:21:59 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e241 e241: 6 total, 6 up, 6 in
Oct 14 06:21:59 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e241 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct 14 06:22:00 localhost nova_compute[297686]: 2025-10-14 10:22:00.345 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 14 06:22:00 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch
Oct 14 06:22:00 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/6e43484d-3647-4085-8cab-db9b4f4530f7/09344f24-aa16-4ae2-bf7b-5dc05ab244e1", "osd", "allow rw pool=manila_data namespace=fsvolumens_6e43484d-3647-4085-8cab-db9b4f4530f7", "mon", "allow r"], "format": "json"} : dispatch
Oct 14 06:22:00 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/6e43484d-3647-4085-8cab-db9b4f4530f7/09344f24-aa16-4ae2-bf7b-5dc05ab244e1", "osd", "allow rw pool=manila_data namespace=fsvolumens_6e43484d-3647-4085-8cab-db9b4f4530f7", "mon", "allow r"], "format": "json"}]': finished
Oct 14 06:22:00 localhost nova_compute[297686]: 2025-10-14 10:22:00.427 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 14 06:22:02 localhost ceph-mon[317114]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #43. Immutable memtables: 0.
Oct 14 06:22:02 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:22:02.432130) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Oct 14 06:22:02 localhost ceph-mon[317114]: rocksdb: [db/flush_job.cc:856] [default] [JOB 23] Flushing memtable with next log file: 43
Oct 14 06:22:02 localhost ceph-mon[317114]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760437322432218, "job": 23, "event": "flush_started", "num_memtables": 1, "num_entries": 2953, "num_deletes": 272, "total_data_size": 3833382, "memory_usage": 3918592, "flush_reason": "Manual Compaction"}
Oct 14 06:22:02 localhost ceph-mon[317114]: rocksdb: [db/flush_job.cc:885] [default] [JOB 23] Level-0 flush table #44: started
Oct 14 06:22:02 localhost ceph-mon[317114]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760437322447167, "cf_name": "default", "job": 23, "event": "table_file_creation", "file_number": 44, "file_size": 2491575, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 27513, "largest_seqno": 30461, "table_properties": {"data_size": 2479605, "index_size": 7579, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 3333, "raw_key_size": 29880, "raw_average_key_size": 22, "raw_value_size": 2454012, "raw_average_value_size": 1867, "num_data_blocks": 323, "num_entries": 1314, "num_filter_entries": 1314, "num_deletions": 272, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1760437215, "oldest_key_time": 1760437215, "file_creation_time": 1760437322, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "c006d4a7-7ae8-4cd3-a1b5-95f5bf3c427c", "db_session_id": "ZS6W3A0Q266OBGAU6ARP", "orig_file_number": 44, "seqno_to_time_mapping": "N/A"}}
Oct 14 06:22:02 localhost ceph-mon[317114]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 23] Flush lasted 15113 microseconds, and 8810 cpu microseconds.
Oct 14 06:22:02 localhost ceph-mon[317114]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct 14 06:22:02 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:22:02.447243) [db/flush_job.cc:967] [default] [JOB 23] Level-0 flush table #44: 2491575 bytes OK
Oct 14 06:22:02 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:22:02.447278) [db/memtable_list.cc:519] [default] Level-0 commit table #44 started
Oct 14 06:22:02 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:22:02.449831) [db/memtable_list.cc:722] [default] Level-0 commit table #44: memtable #1 done
Oct 14 06:22:02 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:22:02.449857) EVENT_LOG_v1 {"time_micros": 1760437322449848, "job": 23, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Oct 14 06:22:02 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:22:02.449890) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Oct 14 06:22:02 localhost ceph-mon[317114]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 23] Try to delete WAL files size 3819255, prev total WAL file size 3819255, number of live WAL files 2.
Oct 14 06:22:02 localhost ceph-mon[317114]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005486733/store.db/000040.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct 14 06:22:02 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:22:02.451073) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003133303532' seq:72057594037927935, type:22 .. '7061786F73003133333034' seq:0, type:0; will stop at (end)
Oct 14 06:22:02 localhost ceph-mon[317114]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 24] Compacting 1@0 + 1@6 files to L6, score -1.00
Oct 14 06:22:02 localhost ceph-mon[317114]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 23 Base level 0, inputs: [44(2433KB)], [42(16MB)]
Oct 14 06:22:02 localhost ceph-mon[317114]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760437322451126, "job": 24, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [44], "files_L6": [42], "score": -1, "input_data_size": 20110006, "oldest_snapshot_seqno": -1}
Oct 14 06:22:02 localhost ceph-mon[317114]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 24] Generated table #45: 14186 keys, 18486453 bytes, temperature: kUnknown
Oct 14 06:22:02 localhost ceph-mon[317114]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760437322539667, "cf_name": "default", "job": 24, "event": "table_file_creation", "file_number": 45, "file_size": 18486453, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 18402608, "index_size": 47264, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 35525, "raw_key_size": 380678, "raw_average_key_size": 26, "raw_value_size": 18158570, "raw_average_value_size": 1280, "num_data_blocks": 1769, "num_entries": 14186, "num_filter_entries": 14186, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1760436394, "oldest_key_time": 0, "file_creation_time": 1760437322, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "c006d4a7-7ae8-4cd3-a1b5-95f5bf3c427c", "db_session_id": "ZS6W3A0Q266OBGAU6ARP", "orig_file_number": 45, "seqno_to_time_mapping": "N/A"}}
Oct 14 06:22:02 localhost ceph-mon[317114]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct 14 06:22:02 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:22:02.540168) [db/compaction/compaction_job.cc:1663] [default] [JOB 24] Compacted 1@0 + 1@6 files to L6 => 18486453 bytes
Oct 14 06:22:02 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:22:02.542323) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 226.6 rd, 208.3 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.4, 16.8 +0.0 blob) out(17.6 +0.0 blob), read-write-amplify(15.5) write-amplify(7.4) OK, records in: 14737, records dropped: 551 output_compression: NoCompression
Oct 14 06:22:02 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:22:02.542358) EVENT_LOG_v1 {"time_micros": 1760437322542343, "job": 24, "event": "compaction_finished", "compaction_time_micros": 88758, "compaction_time_cpu_micros": 54911, "output_level": 6, "num_output_files": 1, "total_output_size": 18486453, "num_input_records": 14737, "num_output_records": 14186, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Oct 14 06:22:02 localhost ceph-mon[317114]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005486733/store.db/000044.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct 14 06:22:02 localhost ceph-mon[317114]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760437322543043, "job": 24, "event": "table_file_deletion", "file_number": 44}
Oct 14 06:22:02 localhost ceph-mon[317114]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005486733/store.db/000042.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct 14 06:22:02 localhost ceph-mon[317114]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760437322546775, "job": 24, "event": "table_file_deletion", "file_number": 42}
Oct 14 06:22:02 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:22:02.450969) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 14 06:22:02 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:22:02.546914) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 14 06:22:02 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:22:02.546927) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 14 06:22:02 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:22:02.546932) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 14 06:22:02 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:22:02.546937) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 14 06:22:02 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:22:02.546941) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 14 06:22:02 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Oct 14 06:22:02 localhost ceph-mon[317114]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3898862865' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Oct 14 06:22:03 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch
Oct 14 06:22:03 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch
Oct 14 06:22:03 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd='[{"prefix": "auth rm", "entity": "client.alice bob"}]': finished
Oct 14 06:22:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041.
Oct 14 06:22:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857.
Oct 14 06:22:03 localhost podman[338727]: 2025-10-14 10:22:03.72589054 +0000 UTC m=+0.065443747 container health_status 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.vendor=CentOS)
Oct 14 06:22:03 localhost podman[338726]: 2025-10-14 10:22:03.781398021 +0000 UTC m=+0.125445485 container health_status 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct 14 06:22:03 localhost podman[338726]: 2025-10-14 10:22:03.793038228 +0000 UTC m=+0.137085682 container exec_died 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible)
Oct 14 06:22:03 localhost systemd[1]: 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041.service: Deactivated successfully.
Oct 14 06:22:03 localhost podman[338727]: 2025-10-14 10:22:03.809923955 +0000 UTC m=+0.149477172 container exec_died 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251009, managed_by=edpm_ansible, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Oct 14 06:22:03 localhost systemd[1]: 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857.service: Deactivated successfully.
Oct 14 06:22:04 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e241 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct 14 06:22:04 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e242 e242: 6 total, 6 up, 6 in
Oct 14 06:22:05 localhost ceph-mon[317114]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #46. Immutable memtables: 0.
Oct 14 06:22:05 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:22:05.118254) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Oct 14 06:22:05 localhost ceph-mon[317114]: rocksdb: [db/flush_job.cc:856] [default] [JOB 25] Flushing memtable with next log file: 46
Oct 14 06:22:05 localhost ceph-mon[317114]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760437325118335, "job": 25, "event": "flush_started", "num_memtables": 1, "num_entries": 308, "num_deletes": 252, "total_data_size": 90760, "memory_usage": 96000, "flush_reason": "Manual Compaction"}
Oct 14 06:22:05 localhost ceph-mon[317114]: rocksdb: [db/flush_job.cc:885] [default] [JOB 25] Level-0 flush table #47: started
Oct 14 06:22:05 localhost ceph-mon[317114]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760437325122099, "cf_name": "default", "job": 25, "event": "table_file_creation", "file_number": 47, "file_size": 58735, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 30462, "largest_seqno": 30769, "table_properties": {"data_size": 56723, "index_size": 187, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 773, "raw_key_size": 5856, "raw_average_key_size": 20, "raw_value_size": 52587, "raw_average_value_size": 183, "num_data_blocks": 8, "num_entries": 286, "num_filter_entries": 286, "num_deletions": 252, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1760437322, "oldest_key_time": 1760437322, "file_creation_time": 1760437325, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "c006d4a7-7ae8-4cd3-a1b5-95f5bf3c427c", "db_session_id": "ZS6W3A0Q266OBGAU6ARP", "orig_file_number": 47, "seqno_to_time_mapping": "N/A"}}
Oct 14 06:22:05 localhost ceph-mon[317114]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 25] Flush lasted 3914 microseconds, and 1322 cpu microseconds.
Oct 14 06:22:05 localhost ceph-mon[317114]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct 14 06:22:05 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:22:05.122177) [db/flush_job.cc:967] [default] [JOB 25] Level-0 flush table #47: 58735 bytes OK
Oct 14 06:22:05 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:22:05.122200) [db/memtable_list.cc:519] [default] Level-0 commit table #47 started
Oct 14 06:22:05 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:22:05.124144) [db/memtable_list.cc:722] [default] Level-0 commit table #47: memtable #1 done
Oct 14 06:22:05 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:22:05.124170) EVENT_LOG_v1 {"time_micros": 1760437325124161, "job": 25, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Oct 14 06:22:05 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:22:05.124194) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Oct 14 06:22:05 localhost ceph-mon[317114]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 25] Try to delete WAL files size 88520, prev total WAL file size 88844, number of live WAL files 2.
Oct 14 06:22:05 localhost ceph-mon[317114]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005486733/store.db/000043.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct 14 06:22:05 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:22:05.125913) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740034323537' seq:72057594037927935, type:22 .. '6D6772737461740034353130' seq:0, type:0; will stop at (end)
Oct 14 06:22:05 localhost ceph-mon[317114]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 26] Compacting 1@0 + 1@6 files to L6, score -1.00
Oct 14 06:22:05 localhost ceph-mon[317114]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 25 Base level 0, inputs: [47(57KB)], [45(17MB)]
Oct 14 06:22:05 localhost ceph-mon[317114]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760437325125975, "job": 26, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [47], "files_L6": [45], "score": -1, "input_data_size": 18545188, "oldest_snapshot_seqno": -1}
Oct 14 06:22:05 localhost ceph-mon[317114]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 26] Generated table #48: 13953 keys, 16422552 bytes, temperature: kUnknown
Oct 14 06:22:05 localhost ceph-mon[317114]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760437325216916, "cf_name": "default", "job": 26, "event": "table_file_creation", "file_number": 48, "file_size": 16422552, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 16345134, "index_size": 41413, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 34949, "raw_key_size": 376053, "raw_average_key_size": 26, "raw_value_size": 16109943, "raw_average_value_size": 1154, "num_data_blocks": 1523, "num_entries": 13953, "num_filter_entries": 13953, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1760436394, "oldest_key_time": 0, "file_creation_time": 1760437325, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "c006d4a7-7ae8-4cd3-a1b5-95f5bf3c427c", "db_session_id": "ZS6W3A0Q266OBGAU6ARP", "orig_file_number": 48, "seqno_to_time_mapping": "N/A"}}
Oct 14 06:22:05 localhost ceph-mon[317114]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct 14 06:22:05 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:22:05.217179) [db/compaction/compaction_job.cc:1663] [default] [JOB 26] Compacted 1@0 + 1@6 files to L6 => 16422552 bytes
Oct 14 06:22:05 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:22:05.218833) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 203.7 rd, 180.4 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.1, 17.6 +0.0 blob) out(15.7 +0.0 blob), read-write-amplify(595.3) write-amplify(279.6) OK, records in: 14472, records dropped: 519 output_compression: NoCompression
Oct 14 06:22:05 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:22:05.218852) EVENT_LOG_v1 {"time_micros": 1760437325218844, "job": 26, "event": "compaction_finished", "compaction_time_micros": 91023, "compaction_time_cpu_micros": 48662, "output_level": 6, "num_output_files": 1, "total_output_size": 16422552, "num_input_records": 14472, "num_output_records": 13953, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Oct 14 06:22:05 localhost ceph-mon[317114]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005486733/store.db/000047.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct 14 06:22:05 localhost ceph-mon[317114]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760437325218963, "job": 26, "event": "table_file_deletion", "file_number": 47}
Oct 14 06:22:05 localhost ceph-mon[317114]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005486733/store.db/000045.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct 14 06:22:05 localhost ceph-mon[317114]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760437325220486, "job": 26, "event": "table_file_deletion", "file_number": 45}
Oct 14 06:22:05 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:22:05.125813) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 14 06:22:05 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:22:05.220650) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 14 06:22:05 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:22:05.220658) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 14 06:22:05 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:22:05.220661) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 14 06:22:05 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:22:05.220664) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 14 06:22:05 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:22:05.220667) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 14 06:22:05 localhost nova_compute[297686]: 2025-10-14 10:22:05.348 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 14 06:22:05 localhost nova_compute[297686]: 2025-10-14 10:22:05.429 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 14 06:22:06 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch
Oct 14 06:22:06 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/6e43484d-3647-4085-8cab-db9b4f4530f7/09344f24-aa16-4ae2-bf7b-5dc05ab244e1", "osd", "allow r pool=manila_data namespace=fsvolumens_6e43484d-3647-4085-8cab-db9b4f4530f7", "mon", "allow r"], "format": "json"} : dispatch
Oct 14 06:22:06 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/6e43484d-3647-4085-8cab-db9b4f4530f7/09344f24-aa16-4ae2-bf7b-5dc05ab244e1", "osd", "allow r pool=manila_data namespace=fsvolumens_6e43484d-3647-4085-8cab-db9b4f4530f7", "mon", "allow r"], "format": "json"}]': finished
Oct 14 06:22:06 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e243 e243: 6 total, 6 up, 6 in
Oct 14 06:22:07 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e244 e244: 6 total, 6 up, 6 in
Oct 14 06:22:08 localhost openstack_network_exporter[250374]: ERROR 10:22:08 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 14 06:22:08 localhost openstack_network_exporter[250374]: ERROR 10:22:08 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 14 06:22:08 localhost openstack_network_exporter[250374]: ERROR 10:22:08 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 14 06:22:08 localhost openstack_network_exporter[250374]:
Oct 14 06:22:08 localhost openstack_network_exporter[250374]: ERROR 10:22:08 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 14 06:22:08 localhost openstack_network_exporter[250374]:
Oct 14 06:22:08 localhost openstack_network_exporter[250374]: ERROR 10:22:08 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 14 06:22:09 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e244 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct 14 06:22:09 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e245 e245: 6 total, 6 up, 6 in
Oct 14 06:22:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad.
Oct 14 06:22:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec.
Oct 14 06:22:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29.
Oct 14 06:22:09 localhost podman[338773]: 2025-10-14 10:22:09.78087696 +0000 UTC m=+0.096035333 container health_status fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=iscsid, org.label-schema.build-date=20251009, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, container_name=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team) Oct 14 06:22:09 localhost podman[338773]: 2025-10-14 10:22:09.792944741 +0000 UTC m=+0.108103104 container exec_died fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 
(image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_id=iscsid, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, io.buildah.version=1.41.3) Oct 14 06:22:09 localhost podman[338766]: 2025-10-14 10:22:09.748158029 +0000 UTC m=+0.079687614 container health_status 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.vendor=CentOS, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, org.label-schema.schema-version=1.0, tcib_managed=true, 
config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=multipathd, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3) Oct 14 06:22:09 localhost systemd[1]: fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29.service: Deactivated successfully. 
Oct 14 06:22:09 localhost podman[338766]: 2025-10-14 10:22:09.834092761 +0000 UTC m=+0.165622386 container exec_died 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=multipathd, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2) Oct 14 06:22:09 localhost systemd[1]: 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad.service: Deactivated successfully. 
Oct 14 06:22:09 localhost podman[338767]: 2025-10-14 10:22:09.874608593 +0000 UTC m=+0.197442962 container health_status aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm) Oct 14 06:22:09 localhost podman[338767]: 2025-10-14 10:22:09.884040062 +0000 UTC m=+0.206874431 container exec_died aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 
'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors ) Oct 14 06:22:09 localhost systemd[1]: aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec.service: Deactivated successfully. Oct 14 06:22:10 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Oct 14 06:22:10 localhost ceph-mon[317114]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2690712529' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Oct 14 06:22:10 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Oct 14 06:22:10 localhost ceph-mon[317114]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.32:0/2690712529' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Oct 14 06:22:10 localhost nova_compute[297686]: 2025-10-14 10:22:10.386 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:22:10 localhost nova_compute[297686]: 2025-10-14 10:22:10.430 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:22:10 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch Oct 14 06:22:10 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch Oct 14 06:22:10 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd='[{"prefix": "auth rm", "entity": "client.alice bob"}]': finished Oct 14 06:22:12 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Oct 14 06:22:12 localhost ceph-mon[317114]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.32:0/1229104801' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Oct 14 06:22:12 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e246 e246: 6 total, 6 up, 6 in Oct 14 06:22:13 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch Oct 14 06:22:13 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/6e43484d-3647-4085-8cab-db9b4f4530f7/09344f24-aa16-4ae2-bf7b-5dc05ab244e1", "osd", "allow rw pool=manila_data namespace=fsvolumens_6e43484d-3647-4085-8cab-db9b4f4530f7", "mon", "allow r"], "format": "json"} : dispatch Oct 14 06:22:13 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/6e43484d-3647-4085-8cab-db9b4f4530f7/09344f24-aa16-4ae2-bf7b-5dc05ab244e1", "osd", "allow rw pool=manila_data namespace=fsvolumens_6e43484d-3647-4085-8cab-db9b4f4530f7", "mon", "allow r"], "format": "json"}]': finished Oct 14 06:22:13 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e247 e247: 6 total, 6 up, 6 in Oct 14 06:22:14 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e247 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Oct 14 06:22:15 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e248 e248: 6 total, 6 up, 6 in Oct 14 06:22:15 localhost nova_compute[297686]: 2025-10-14 10:22:15.429 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:22:15 localhost nova_compute[297686]: 2025-10-14 10:22:15.433 2 DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:22:17 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch Oct 14 06:22:17 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch Oct 14 06:22:17 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd='[{"prefix": "auth rm", "entity": "client.alice"}]': finished Oct 14 06:22:17 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e249 e249: 6 total, 6 up, 6 in Oct 14 06:22:19 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e249 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Oct 14 06:22:20 localhost nova_compute[297686]: 2025-10-14 10:22:20.430 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:22:20 localhost nova_compute[297686]: 2025-10-14 10:22:20.434 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:22:20 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch Oct 14 06:22:20 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/6e43484d-3647-4085-8cab-db9b4f4530f7/09344f24-aa16-4ae2-bf7b-5dc05ab244e1", "osd", "allow r pool=manila_data namespace=fsvolumens_6e43484d-3647-4085-8cab-db9b4f4530f7", 
"mon", "allow r"], "format": "json"} : dispatch Oct 14 06:22:20 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/6e43484d-3647-4085-8cab-db9b4f4530f7/09344f24-aa16-4ae2-bf7b-5dc05ab244e1", "osd", "allow r pool=manila_data namespace=fsvolumens_6e43484d-3647-4085-8cab-db9b4f4530f7", "mon", "allow r"], "format": "json"}]': finished Oct 14 06:22:22 localhost ovn_metadata_agent[163050]: 2025-10-14 10:22:22.463 163055 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=21, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'b6:6b:50', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '6a:59:81:01:bc:8b'}, ipsec=False) old=SB_Global(nb_cfg=20) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Oct 14 06:22:22 localhost ovn_metadata_agent[163050]: 2025-10-14 10:22:22.464 163055 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m Oct 14 06:22:22 localhost nova_compute[297686]: 2025-10-14 10:22:22.502 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:22:22 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:22:22.709 271987 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, 
created_at=2025-10-14T10:22:22Z, description=, device_id=b735d9be-4d74-4d80-acd6-e997ff1f9336, device_owner=network:router_gateway, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=011a981d-2335-42b8-a54a-18ddb42a63f4, ip_allocation=immediate, mac_address=fa:16:3e:79:50:49, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-10-14T08:36:38Z, description=, dns_domain=, id=c0145816-4627-44f2-af00-ccc9ef0436ed, ipv4_address_scope=None, ipv6_address_scope=None, is_default=False, l2_adjacency=True, mtu=1350, name=public, port_security_enabled=True, project_id=41187b090f3d4818a32baa37ce8a3991, provider:network_type=flat, provider:physical_network=datacentre, provider:segmentation_id=None, qos_policy_id=None, revision_number=2, router:external=True, shared=False, standard_attr_id=29, status=ACTIVE, subnets=['bbd40dc1-97d8-4ed3-9d4f-9b3af758b526'], tags=[], tenant_id=41187b090f3d4818a32baa37ce8a3991, updated_at=2025-10-14T08:36:44Z, vlan_transparent=None, network_id=c0145816-4627-44f2-af00-ccc9ef0436ed, port_security_enabled=False, project_id=, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=3558, status=DOWN, tags=[], tenant_id=, updated_at=2025-10-14T10:22:22Z on network c0145816-4627-44f2-af00-ccc9ef0436ed#033[00m Oct 14 06:22:22 localhost dnsmasq[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/addn_hosts - 2 addresses Oct 14 06:22:22 localhost podman[338842]: 2025-10-14 10:22:22.935173472 +0000 UTC m=+0.066011595 container kill 27e23fba1f10c3118d508631d350793857acd97085c399f85eed727cd1eab7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0145816-4627-44f2-af00-ccc9ef0436ed, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, 
org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, io.buildah.version=1.41.3) Oct 14 06:22:22 localhost dnsmasq-dhcp[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/host Oct 14 06:22:22 localhost dnsmasq-dhcp[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/opts Oct 14 06:22:22 localhost systemd[1]: tmp-crun.3Ff6Xj.mount: Deactivated successfully. Oct 14 06:22:23 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:22:23.211 271987 INFO neutron.agent.dhcp.agent [None req-eaa556ec-4c75-420b-a624-56b71207e191 - - - - - -] DHCP configuration for ports {'011a981d-2335-42b8-a54a-18ddb42a63f4'} is completed#033[00m Oct 14 06:22:23 localhost nova_compute[297686]: 2025-10-14 10:22:23.470 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:22:23 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch Oct 14 06:22:23 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch Oct 14 06:22:23 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd='[{"prefix": "auth rm", "entity": "client.alice"}]': finished Oct 14 06:22:24 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e249 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Oct 14 06:22:24 localhost ovn_metadata_agent[163050]: 2025-10-14 10:22:24.465 163055 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=9e4b0f79-1220-4c7d-a18d-fa1a88dab362, 
col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '21'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Oct 14 06:22:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f. Oct 14 06:22:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917. Oct 14 06:22:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d. Oct 14 06:22:24 localhost systemd[1]: tmp-crun.ly0xXo.mount: Deactivated successfully. Oct 14 06:22:24 localhost podman[338864]: 2025-10-14 10:22:24.765034603 +0000 UTC m=+0.102606546 container health_status 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_id=ovn_controller, container_name=ovn_controller, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251009, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true) Oct 14 06:22:24 localhost podman[338865]: 2025-10-14 10:22:24.739660295 +0000 UTC m=+0.073523174 container health_status 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, distribution-scope=public, name=ubi9-minimal, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, architecture=x86_64, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': 
'/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, config_id=edpm) Oct 14 06:22:24 localhost podman[338871]: 2025-10-14 10:22:24.842137246 +0000 UTC m=+0.164605845 container health_status 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', 
'/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d) Oct 14 06:22:24 localhost podman[338864]: 2025-10-14 10:22:24.850059948 +0000 UTC m=+0.187631861 container exec_died 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, 
config_id=ovn_controller, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.build-date=20251009) Oct 14 06:22:24 localhost podman[338871]: 2025-10-14 10:22:24.858192787 +0000 UTC m=+0.180661376 container exec_died 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, tcib_managed=true, 
org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d) Oct 14 06:22:24 localhost systemd[1]: 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f.service: Deactivated successfully. Oct 14 06:22:24 localhost systemd[1]: 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d.service: Deactivated successfully. Oct 14 06:22:24 localhost podman[338865]: 2025-10-14 10:22:24.909033155 +0000 UTC m=+0.242896044 container exec_died 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': 
['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, vendor=Red Hat, Inc., io.openshift.expose-services=, config_id=edpm, architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, managed_by=edpm_ansible, release=1755695350, vcs-type=git, io.openshift.tags=minimal rhel9, version=9.6, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.) Oct 14 06:22:24 localhost systemd[1]: 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917.service: Deactivated successfully. 
Oct 14 06:22:25 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e250 e250: 6 total, 6 up, 6 in Oct 14 06:22:25 localhost nova_compute[297686]: 2025-10-14 10:22:25.434 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:22:26 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch Oct 14 06:22:26 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/6e43484d-3647-4085-8cab-db9b4f4530f7/09344f24-aa16-4ae2-bf7b-5dc05ab244e1", "osd", "allow rw pool=manila_data namespace=fsvolumens_6e43484d-3647-4085-8cab-db9b4f4530f7", "mon", "allow r"], "format": "json"} : dispatch Oct 14 06:22:26 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/6e43484d-3647-4085-8cab-db9b4f4530f7/09344f24-aa16-4ae2-bf7b-5dc05ab244e1", "osd", "allow rw pool=manila_data namespace=fsvolumens_6e43484d-3647-4085-8cab-db9b4f4530f7", "mon", "allow r"], "format": "json"}]': finished Oct 14 06:22:26 localhost nova_compute[297686]: 2025-10-14 10:22:26.923 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:22:28 localhost podman[248187]: time="2025-10-14T10:22:28Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Oct 14 06:22:28 localhost podman[248187]: @ - - [14/Oct/2025:10:22:28 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 147497 "" 
"Go-http-client/1.1" Oct 14 06:22:28 localhost podman[248187]: @ - - [14/Oct/2025:10:22:28 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19870 "" "Go-http-client/1.1" Oct 14 06:22:29 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e250 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Oct 14 06:22:29 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch Oct 14 06:22:29 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch Oct 14 06:22:29 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd='[{"prefix": "auth rm", "entity": "client.alice_bob"}]': finished Oct 14 06:22:30 localhost nova_compute[297686]: 2025-10-14 10:22:30.478 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:22:32 localhost nova_compute[297686]: 2025-10-14 10:22:32.270 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:22:32 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:22:32.338 271987 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-10-14T10:22:32Z, description=, device_id=e15859bf-e697-477d-933b-0ed8a123bd13, device_owner=network:router_gateway, dns_assignment=[], dns_domain=, dns_name=, 
extra_dhcp_opts=[], fixed_ips=[], id=355b3e6c-aabe-4074-9855-e8d5a997f667, ip_allocation=immediate, mac_address=fa:16:3e:b0:69:6e, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-10-14T08:36:38Z, description=, dns_domain=, id=c0145816-4627-44f2-af00-ccc9ef0436ed, ipv4_address_scope=None, ipv6_address_scope=None, is_default=False, l2_adjacency=True, mtu=1350, name=public, port_security_enabled=True, project_id=41187b090f3d4818a32baa37ce8a3991, provider:network_type=flat, provider:physical_network=datacentre, provider:segmentation_id=None, qos_policy_id=None, revision_number=2, router:external=True, shared=False, standard_attr_id=29, status=ACTIVE, subnets=['bbd40dc1-97d8-4ed3-9d4f-9b3af758b526'], tags=[], tenant_id=41187b090f3d4818a32baa37ce8a3991, updated_at=2025-10-14T08:36:44Z, vlan_transparent=None, network_id=c0145816-4627-44f2-af00-ccc9ef0436ed, port_security_enabled=False, project_id=, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=3578, status=DOWN, tags=[], tenant_id=, updated_at=2025-10-14T10:22:32Z on network c0145816-4627-44f2-af00-ccc9ef0436ed#033[00m Oct 14 06:22:32 localhost systemd[1]: tmp-crun.w7YVwc.mount: Deactivated successfully. 
Oct 14 06:22:32 localhost dnsmasq[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/addn_hosts - 3 addresses Oct 14 06:22:32 localhost dnsmasq-dhcp[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/host Oct 14 06:22:32 localhost podman[338946]: 2025-10-14 10:22:32.544776103 +0000 UTC m=+0.059893806 container kill 27e23fba1f10c3118d508631d350793857acd97085c399f85eed727cd1eab7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0145816-4627-44f2-af00-ccc9ef0436ed, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team) Oct 14 06:22:32 localhost dnsmasq-dhcp[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/opts Oct 14 06:22:32 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:22:32.799 271987 INFO neutron.agent.dhcp.agent [None req-3e2c8577-ad89-4f46-9240-9c5958494591 - - - - - -] DHCP configuration for ports {'355b3e6c-aabe-4074-9855-e8d5a997f667'} is completed#033[00m Oct 14 06:22:33 localhost nova_compute[297686]: 2025-10-14 10:22:33.207 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:22:33 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch Oct 14 06:22:33 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r 
path=/volumes/_nogroup/6e43484d-3647-4085-8cab-db9b4f4530f7/09344f24-aa16-4ae2-bf7b-5dc05ab244e1", "osd", "allow r pool=manila_data namespace=fsvolumens_6e43484d-3647-4085-8cab-db9b4f4530f7", "mon", "allow r"], "format": "json"} : dispatch Oct 14 06:22:33 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/6e43484d-3647-4085-8cab-db9b4f4530f7/09344f24-aa16-4ae2-bf7b-5dc05ab244e1", "osd", "allow r pool=manila_data namespace=fsvolumens_6e43484d-3647-4085-8cab-db9b4f4530f7", "mon", "allow r"], "format": "json"}]': finished Oct 14 06:22:34 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e250 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Oct 14 06:22:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041. Oct 14 06:22:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857. Oct 14 06:22:34 localhost systemd[1]: tmp-crun.3auBby.mount: Deactivated successfully. 
Oct 14 06:22:34 localhost podman[338970]: 2025-10-14 10:22:34.727125261 +0000 UTC m=+0.068081254 container health_status 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20251009, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.schema-version=1.0) Oct 14 06:22:34 localhost podman[338969]: 2025-10-14 10:22:34.742812744 +0000 UTC 
m=+0.083643644 container health_status 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter) Oct 14 06:22:34 localhost podman[338969]: 2025-10-14 10:22:34.753003306 +0000 UTC m=+0.093834216 container exec_died 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, 
config_id=edpm, container_name=podman_exporter) Oct 14 06:22:34 localhost systemd[1]: 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041.service: Deactivated successfully. Oct 14 06:22:34 localhost podman[338970]: 2025-10-14 10:22:34.807939426 +0000 UTC m=+0.148895469 container exec_died 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator 
team, managed_by=edpm_ansible, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5) Oct 14 06:22:34 localhost systemd[1]: 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857.service: Deactivated successfully. Oct 14 06:22:35 localhost nova_compute[297686]: 2025-10-14 10:22:35.255 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:22:35 localhost nova_compute[297686]: 2025-10-14 10:22:35.481 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:22:36 localhost nova_compute[297686]: 2025-10-14 10:22:36.121 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:22:36 localhost nova_compute[297686]: 2025-10-14 10:22:36.255 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:22:36 localhost nova_compute[297686]: 2025-10-14 10:22:36.255 2 DEBUG nova.compute.manager [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Oct 14 06:22:36 localhost nova_compute[297686]: 2025-10-14 10:22:36.256 2 DEBUG nova.compute.manager [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Oct 14 06:22:37 localhost nova_compute[297686]: 2025-10-14 10:22:37.158 2 DEBUG oslo_concurrency.lockutils [None 
req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Acquiring lock "refresh_cache-88c4e366-b765-47a6-96bf-f7677f2ce67c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Oct 14 06:22:37 localhost nova_compute[297686]: 2025-10-14 10:22:37.159 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Acquired lock "refresh_cache-88c4e366-b765-47a6-96bf-f7677f2ce67c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Oct 14 06:22:37 localhost nova_compute[297686]: 2025-10-14 10:22:37.159 2 DEBUG nova.network.neutron [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] [instance: 88c4e366-b765-47a6-96bf-f7677f2ce67c] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Oct 14 06:22:37 localhost nova_compute[297686]: 2025-10-14 10:22:37.159 2 DEBUG nova.objects.instance [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Lazy-loading 'info_cache' on Instance uuid 88c4e366-b765-47a6-96bf-f7677f2ce67c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Oct 14 06:22:37 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch Oct 14 06:22:37 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch Oct 14 06:22:37 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Oct 14 06:22:37 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd='[{"prefix": "auth rm", "entity": "client.alice_bob"}]': finished Oct 14 06:22:37 localhost 
ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' Oct 14 06:22:37 localhost nova_compute[297686]: 2025-10-14 10:22:37.613 2 DEBUG nova.network.neutron [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] [instance: 88c4e366-b765-47a6-96bf-f7677f2ce67c] Updating instance_info_cache with network_info: [{"id": "3ec9b060-f43d-4698-9c76-6062c70911d5", "address": "fa:16:3e:84:5e:e5", "network": {"id": "7d0cd696-bdd7-4e70-9512-eb0d23640314", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.46", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "41187b090f3d4818a32baa37ce8a3991", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3ec9b060-f4", "ovs_interfaceid": "3ec9b060-f43d-4698-9c76-6062c70911d5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Oct 14 06:22:37 localhost nova_compute[297686]: 2025-10-14 10:22:37.630 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Releasing lock "refresh_cache-88c4e366-b765-47a6-96bf-f7677f2ce67c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Oct 14 06:22:37 localhost nova_compute[297686]: 2025-10-14 10:22:37.630 2 DEBUG nova.compute.manager [None 
req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] [instance: 88c4e366-b765-47a6-96bf-f7677f2ce67c] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Oct 14 06:22:37 localhost nova_compute[297686]: 2025-10-14 10:22:37.631 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:22:37 localhost nova_compute[297686]: 2025-10-14 10:22:37.648 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Oct 14 06:22:37 localhost nova_compute[297686]: 2025-10-14 10:22:37.648 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Oct 14 06:22:37 localhost nova_compute[297686]: 2025-10-14 10:22:37.649 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Oct 14 06:22:37 localhost nova_compute[297686]: 2025-10-14 10:22:37.649 2 DEBUG nova.compute.resource_tracker [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Auditing locally available compute resources for np0005486733.localdomain (node: np0005486733.localdomain) update_available_resource 
/usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Oct 14 06:22:37 localhost nova_compute[297686]: 2025-10-14 10:22:37.649 2 DEBUG oslo_concurrency.processutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Oct 14 06:22:38 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Oct 14 06:22:38 localhost ceph-mon[317114]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/1579317652' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Oct 14 06:22:38 localhost nova_compute[297686]: 2025-10-14 10:22:38.077 2 DEBUG oslo_concurrency.processutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.428s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Oct 14 06:22:38 localhost nova_compute[297686]: 2025-10-14 10:22:38.149 2 DEBUG nova.virt.libvirt.driver [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Oct 14 06:22:38 localhost nova_compute[297686]: 2025-10-14 10:22:38.149 2 DEBUG nova.virt.libvirt.driver [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Oct 14 06:22:38 localhost nova_compute[297686]: 2025-10-14 10:22:38.377 2 WARNING nova.virt.libvirt.driver [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] This host appears to have multiple sockets per 
NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m Oct 14 06:22:38 localhost nova_compute[297686]: 2025-10-14 10:22:38.380 2 DEBUG nova.compute.resource_tracker [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Hypervisor/Node resource view: name=np0005486733.localdomain free_ram=11136MB free_disk=41.83695602416992GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", 
"product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Oct 14 06:22:38 localhost nova_compute[297686]: 2025-10-14 10:22:38.380 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Oct 14 06:22:38 localhost nova_compute[297686]: 2025-10-14 10:22:38.381 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Oct 14 06:22:38 localhost nova_compute[297686]: 2025-10-14 10:22:38.470 2 DEBUG nova.compute.resource_tracker [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Instance 88c4e366-b765-47a6-96bf-f7677f2ce67c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Oct 14 06:22:38 localhost nova_compute[297686]: 2025-10-14 10:22:38.471 2 DEBUG nova.compute.resource_tracker [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Oct 14 06:22:38 localhost nova_compute[297686]: 2025-10-14 10:22:38.471 2 DEBUG nova.compute.resource_tracker [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Final resource view: name=np0005486733.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Oct 14 06:22:38 localhost nova_compute[297686]: 2025-10-14 10:22:38.528 2 DEBUG oslo_concurrency.processutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Oct 14 06:22:38 localhost openstack_network_exporter[250374]: ERROR 10:22:38 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Oct 14 06:22:38 localhost openstack_network_exporter[250374]: ERROR 10:22:38 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Oct 14 06:22:38 localhost openstack_network_exporter[250374]: ERROR 10:22:38 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Oct 14 06:22:38 localhost openstack_network_exporter[250374]: ERROR 10:22:38 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Oct 14 06:22:38 localhost openstack_network_exporter[250374]: Oct 14 06:22:38 localhost 
openstack_network_exporter[250374]: ERROR 10:22:38 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Oct 14 06:22:38 localhost openstack_network_exporter[250374]: Oct 14 06:22:38 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Oct 14 06:22:38 localhost ceph-mon[317114]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/3169870055' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Oct 14 06:22:38 localhost nova_compute[297686]: 2025-10-14 10:22:38.992 2 DEBUG oslo_concurrency.processutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.464s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Oct 14 06:22:38 localhost nova_compute[297686]: 2025-10-14 10:22:38.997 2 DEBUG nova.compute.provider_tree [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Inventory has not changed in ProviderTree for provider: 18c24273-aca2-4f08-be57-3188d558235e update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Oct 14 06:22:39 localhost nova_compute[297686]: 2025-10-14 10:22:39.045 2 DEBUG nova.scheduler.client.report [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Inventory has not changed for provider 18c24273-aca2-4f08-be57-3188d558235e based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Oct 14 06:22:39 localhost nova_compute[297686]: 
2025-10-14 10:22:39.047 2 DEBUG nova.compute.resource_tracker [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Compute_service record updated for np0005486733.localdomain:np0005486733.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Oct 14 06:22:39 localhost nova_compute[297686]: 2025-10-14 10:22:39.047 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.666s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Oct 14 06:22:39 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e250 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Oct 14 06:22:39 localhost nova_compute[297686]: 2025-10-14 10:22:39.671 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:22:39 localhost nova_compute[297686]: 2025-10-14 10:22:39.672 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:22:39 localhost nova_compute[297686]: 2025-10-14 10:22:39.672 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:22:39 localhost nova_compute[297686]: 2025-10-14 10:22:39.672 2 DEBUG nova.compute.manager [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - 
- -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Oct 14 06:22:40 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch Oct 14 06:22:40 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/6e43484d-3647-4085-8cab-db9b4f4530f7/09344f24-aa16-4ae2-bf7b-5dc05ab244e1", "osd", "allow rw pool=manila_data namespace=fsvolumens_6e43484d-3647-4085-8cab-db9b4f4530f7", "mon", "allow r"], "format": "json"} : dispatch Oct 14 06:22:40 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/6e43484d-3647-4085-8cab-db9b4f4530f7/09344f24-aa16-4ae2-bf7b-5dc05ab244e1", "osd", "allow rw pool=manila_data namespace=fsvolumens_6e43484d-3647-4085-8cab-db9b4f4530f7", "mon", "allow r"], "format": "json"}]': finished Oct 14 06:22:40 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' Oct 14 06:22:40 localhost nova_compute[297686]: 2025-10-14 10:22:40.251 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:22:40 localhost nova_compute[297686]: 2025-10-14 10:22:40.278 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks 
/usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:22:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad. Oct 14 06:22:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec. Oct 14 06:22:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29. Oct 14 06:22:40 localhost podman[339142]: 2025-10-14 10:22:40.440176476 +0000 UTC m=+0.093445265 container health_status aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The 
Prometheus Authors , managed_by=edpm_ansible) Oct 14 06:22:40 localhost podman[339142]: 2025-10-14 10:22:40.45559942 +0000 UTC m=+0.108868279 container exec_died aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm) Oct 14 06:22:40 localhost systemd[1]: aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec.service: Deactivated successfully. 
Oct 14 06:22:40 localhost nova_compute[297686]: 2025-10-14 10:22:40.487 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:22:40 localhost podman[339143]: 2025-10-14 10:22:40.552826911 +0000 UTC m=+0.202631333 container health_status fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, managed_by=edpm_ansible) Oct 14 06:22:40 localhost podman[339143]: 
2025-10-14 10:22:40.595468822 +0000 UTC m=+0.245273254 container exec_died fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, container_name=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, config_id=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2) Oct 14 06:22:40 localhost ovn_controller[157396]: 2025-10-14T10:22:40Z|00291|binding|INFO|Releasing lport 25c6586a-239c-451b-aac2-e0a3ee5c3145 from this chassis (sb_readonly=0) Oct 14 06:22:40 localhost systemd[1]: fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29.service: Deactivated successfully. 
Oct 14 06:22:40 localhost podman[339141]: 2025-10-14 10:22:40.618983664 +0000 UTC m=+0.273244553 container health_status 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d) Oct 14 06:22:40 localhost nova_compute[297686]: 2025-10-14 10:22:40.651 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:22:40 localhost podman[339141]: 2025-10-14 10:22:40.709642053 +0000 UTC m=+0.363902972 container exec_died 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.vendor=CentOS, container_name=multipathd, managed_by=edpm_ansible, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}) Oct 14 06:22:40 localhost systemd[1]: 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad.service: 
Deactivated successfully. Oct 14 06:22:40 localhost dnsmasq[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/addn_hosts - 2 addresses Oct 14 06:22:40 localhost dnsmasq-dhcp[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/host Oct 14 06:22:40 localhost podman[339213]: 2025-10-14 10:22:40.758598838 +0000 UTC m=+0.072075038 container kill 27e23fba1f10c3118d508631d350793857acd97085c399f85eed727cd1eab7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0145816-4627-44f2-af00-ccc9ef0436ed, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3) Oct 14 06:22:40 localhost dnsmasq-dhcp[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/opts Oct 14 06:22:41 localhost nova_compute[297686]: 2025-10-14 10:22:41.254 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:22:41 localhost ceph-osd[31500]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Oct 14 06:22:41 localhost ceph-osd[31500]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 9000.1 total, 600.0 interval#012Cumulative writes: 19K writes, 73K keys, 19K commit groups, 1.0 writes per commit group, ingest: 0.06 GB, 0.01 MB/s#012Cumulative WAL: 19K writes, 7094 syncs, 2.72 writes per sync, written: 0.06 GB, 0.01 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 12K writes, 42K keys, 12K commit groups, 1.0 writes per commit group, ingest: 29.25 MB, 
0.05 MB/s#012Interval WAL: 12K writes, 5489 syncs, 2.22 writes per sync, written: 0.03 GB, 0.05 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent Oct 14 06:22:41 localhost systemd[1]: tmp-crun.UJ9eWA.mount: Deactivated successfully. Oct 14 06:22:43 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch Oct 14 06:22:43 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch Oct 14 06:22:43 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd='[{"prefix": "auth rm", "entity": "client.alice bob"}]': finished Oct 14 06:22:43 localhost ovn_controller[157396]: 2025-10-14T10:22:43Z|00292|binding|INFO|Releasing lport 25c6586a-239c-451b-aac2-e0a3ee5c3145 from this chassis (sb_readonly=0) Oct 14 06:22:43 localhost nova_compute[297686]: 2025-10-14 10:22:43.901 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:22:43 localhost dnsmasq[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/addn_hosts - 1 addresses Oct 14 06:22:43 localhost dnsmasq-dhcp[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/host Oct 14 06:22:43 localhost podman[339256]: 2025-10-14 10:22:43.94908298 +0000 UTC m=+0.064556746 container kill 27e23fba1f10c3118d508631d350793857acd97085c399f85eed727cd1eab7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0145816-4627-44f2-af00-ccc9ef0436ed, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5) Oct 14 06:22:43 localhost dnsmasq-dhcp[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/opts Oct 14 06:22:44 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e250 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Oct 14 06:22:45 localhost nova_compute[297686]: 2025-10-14 10:22:45.490 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:22:45 localhost ceph-osd[32440]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Oct 14 06:22:45 localhost ceph-osd[32440]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 9000.1 total, 600.0 interval#012Cumulative writes: 21K writes, 84K keys, 21K commit groups, 1.0 writes per commit group, ingest: 0.06 GB, 0.01 MB/s#012Cumulative WAL: 21K writes, 7338 syncs, 2.89 writes per sync, written: 0.06 GB, 0.01 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 12K writes, 48K keys, 12K commit groups, 1.0 writes per commit group, ingest: 30.61 MB, 0.05 MB/s#012Interval WAL: 12K writes, 5172 syncs, 2.39 writes per sync, written: 0.03 GB, 0.05 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent Oct 14 06:22:46 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch Oct 14 06:22:46 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/6e43484d-3647-4085-8cab-db9b4f4530f7/09344f24-aa16-4ae2-bf7b-5dc05ab244e1", "osd", "allow r pool=manila_data 
namespace=fsvolumens_6e43484d-3647-4085-8cab-db9b4f4530f7", "mon", "allow r"], "format": "json"} : dispatch Oct 14 06:22:46 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/6e43484d-3647-4085-8cab-db9b4f4530f7/09344f24-aa16-4ae2-bf7b-5dc05ab244e1", "osd", "allow r pool=manila_data namespace=fsvolumens_6e43484d-3647-4085-8cab-db9b4f4530f7", "mon", "allow r"], "format": "json"}]': finished Oct 14 06:22:49 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e250 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Oct 14 06:22:49 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch Oct 14 06:22:49 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch Oct 14 06:22:49 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd='[{"prefix": "auth rm", "entity": "client.alice bob"}]': finished Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.825 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'name': 'test', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005486733.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '41187b090f3d4818a32baa37ce8a3991', 'user_id': 
'9d85e6ce130c46ec855f37147dbb08b4', 'hostId': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228 Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.826 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.846 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.read.bytes volume: 35560448 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.847 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.849 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'be1c5bc6-aaa0-4a51-9653-ebc6122327c4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35560448, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T10:22:49.826325', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'bc4b9936-a8e7-11f0-9707-fa163e99780b', 'monotonic_time': 13386.019030601, 'message_signature': 'cdab0d09fdcf924f5836e17605bc46b33a93b2e73eb60eef2924eb32d39a7d52'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T10:22:49.826325', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'bc4bac82-a8e7-11f0-9707-fa163e99780b', 'monotonic_time': 13386.019030601, 'message_signature': 'b9f72eda13504ea02c762dbd37537477a5995bcce23fbd75e3bc92551b2a5260'}]}, 'timestamp': '2025-10-14 10:22:49.847564', '_unique_id': '8625cc4fec3e4eceb42702802c597082'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.849 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.849 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.849 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.849 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.849 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.849 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.849 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.849 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.849 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.849 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.849 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.849 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.849 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.849 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.849 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.849 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.849 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.849 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.849 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.849 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.849 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.849 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.849 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.849 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.849 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.849 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.849 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.849 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.849 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.849 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.849 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.849 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.849 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.849 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.849 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.849 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.849 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.849 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.849 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.849 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.849 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.849 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.849 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.849 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.849 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.849 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.849 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.849 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.849 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.849 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.849 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.849 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.849 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.849 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.850 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.850 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.write.requests volume: 53 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.850 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.852 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b353b2c7-6101-45a0-9d6c-62909d6d36ed', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 53, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T10:22:49.850418', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'bc4c2ec8-a8e7-11f0-9707-fa163e99780b', 'monotonic_time': 13386.019030601, 'message_signature': '5aef37f73a817fd6afe90b9c0413adb057940883e061a91b3aae8bd065018ee9'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T10:22:49.850418', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'bc4c40fc-a8e7-11f0-9707-fa163e99780b', 'monotonic_time': 13386.019030601, 'message_signature': '8dcba2406745484a7462e451ae6052092f3ddbe76be97ce8d100a77b9952750c'}]}, 'timestamp': '2025-10-14 10:22:49.851315', '_unique_id': '5bf4ee9946f5446fb536d7cb5468322a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.852 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.852 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.852 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.852 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.852 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.852 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.852 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.852 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.852 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.852 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.852 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.852 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.852 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.852 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.852 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.852 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.852 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.852 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.852 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.852 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.852 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.852 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.852 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.852 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.852 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.852 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.852 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.852 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.852 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.852 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.852 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.852 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.852 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.852 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.852 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.852 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.852 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.852 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.852 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.852 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.852 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.852 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.852 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.852 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.852 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.852 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.852 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.852 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.852 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.852 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.852 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.852 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.852 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.852 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.853 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.858 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.858 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '08ef0d41-83a2-4ab7-93c6-769677474d26', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T10:22:49.853917', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': 'bc4d5c94-a8e7-11f0-9707-fa163e99780b', 'monotonic_time': 13386.04664086, 'message_signature': 'f53c078b404b163413164ec539673ce99a6392dfed21a7642aa1d3e86c7f8b8e'}]}, 'timestamp': '2025-10-14 10:22:49.858489', '_unique_id': 'a6d1d52173cc4672b448090d583bc374'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.858 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.858 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.858 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.858 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.858 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.858 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.858 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.858 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.858 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.858 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.858 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.858 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.858 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.858 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.858 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.858 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.858 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.858 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.858 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.858 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.858 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.858 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.858 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.858 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.858 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.858 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.858 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.858 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.858 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.858 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.858 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.859 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.859 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.outgoing.packets volume: 118 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.860 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '902be25b-aaa9-4e90-ba49-25c6f24a6e20', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 118, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T10:22:49.859570', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': 'bc4d90d8-a8e7-11f0-9707-fa163e99780b', 'monotonic_time': 13386.04664086, 'message_signature': '7dadaf11b16ed4f71e1561cfa0dbcfce9e3a290965dc13920cd150c84de18f54'}]}, 'timestamp': '2025-10-14 10:22:49.859820', '_unique_id': '77e57422b3bf443aa7e55dabae685568'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.860 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.860 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.860 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.860 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.860 12 ERROR
oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.860 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.860 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.860 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.860 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.860 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.860 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.860 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.860 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.860 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.860 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 
2025-10-14 10:22:49.860 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.860 12 ERROR oslo_messaging.notify.messaging Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.860 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.860 12 ERROR oslo_messaging.notify.messaging Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.860 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.860 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.860 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.860 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.860 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.860 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.860 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:22:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.860 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.860 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.860 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.860 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.860 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.860 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.860 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.860 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.860 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.860 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 
06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.860 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.860 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.860 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.860 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.860 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.860 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.860 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.860 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.860 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.860 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.860 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.860 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.860 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.860 12 ERROR oslo_messaging.notify.messaging Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.860 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.860 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.write.bytes volume: 454656 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.861 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.861 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '9565bb9f-c9a1-4d42-9cb5-56059327d297', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 454656, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T10:22:49.860828', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'bc4dc0bc-a8e7-11f0-9707-fa163e99780b', 'monotonic_time': 13386.019030601, 'message_signature': '05d92d4ac667fde1be8bb1fe1ac644c926561e1d7f2d275992052e4179892618'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T10:22:49.860828', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'bc4dc828-a8e7-11f0-9707-fa163e99780b', 'monotonic_time': 13386.019030601, 'message_signature': '429a47ad9fcda63ed064b550f08eb54e801a00cd720eaba993f0de5f2917ad92'}]}, 'timestamp': '2025-10-14 10:22:49.861232', '_unique_id': 'fe9bc3cc1bca446a8a21d72376870e5b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.861 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.861 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.861 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.861 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.861 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.861 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.861 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.861 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.861 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.861 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.861 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.861 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.861 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.861 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.861 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.861 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 
06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.861 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.861 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.861 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.861 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.861 12 ERROR oslo_messaging.notify.messaging Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.861 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.861 12 ERROR oslo_messaging.notify.messaging Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.861 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.861 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.861 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.861 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:22:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.861 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.861 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.861 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.861 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.861 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.861 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.861 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.861 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.861 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.861 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.861 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.861 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.861 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.861 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.861 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.861 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.861 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.861 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.861 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.861 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:22:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.861 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.861 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.861 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.861 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.861 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.861 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.861 12 ERROR oslo_messaging.notify.messaging Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.862 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.862 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.862 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '14b7f02e-ebf0-4d48-bf05-0e1617efa15b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T10:22:49.862232', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': 'bc4df7b2-a8e7-11f0-9707-fa163e99780b', 'monotonic_time': 13386.04664086, 'message_signature': 'bc15dcb9315ae31a1f50d87dc86bba9f55da1c48e4a65f01f98c175b9e637a9a'}]}, 'timestamp': '2025-10-14 10:22:49.862450', '_unique_id': '78bfcf2da6874dc5b7709cf9f71e16f8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.862 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 
2025-10-14 10:22:49.862 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.862 12 ERROR oslo_messaging.notify.messaging yield
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.862 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.862 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.862 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.862 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.862 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.862 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.862 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.862 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.862 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.862 12 ERROR oslo_messaging.notify.messaging conn.connect()
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.862 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.862 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.862 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.862 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.862 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.862 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.862 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.862 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.862 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.862 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.862 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.862 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.862 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.862 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.862 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.862 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.862 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.862 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.862 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.862 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.862 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.862 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.862 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.862 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.862 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.862 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.862 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.862 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.862 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.862 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.862 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.862 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.862 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.862 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.862 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.862 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.862 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.862 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.862 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.862 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.862 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.863 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.863 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.864 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '89f785e9-0cd7-4c67-9067-51b108affce2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T10:22:49.863443', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': 'bc4e27aa-a8e7-11f0-9707-fa163e99780b', 'monotonic_time': 13386.04664086, 'message_signature': '0e0146d052dbde0461d658887268af9c8bc7cdf985cdfa08322973da5f577952'}]}, 'timestamp': '2025-10-14 10:22:49.863800', '_unique_id': '4257f56736bc42da8a1b2e64a90b54e5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.864 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.864 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.864 12 ERROR oslo_messaging.notify.messaging yield
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.864 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.864 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.864 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.864 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.864 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.864 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.864 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.864 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.864 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.864 12 ERROR oslo_messaging.notify.messaging conn.connect()
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.864 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.864 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.864 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.864 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.864 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.864 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.864 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.864 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.864 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.864 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.864 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.864 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.864 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.864 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.864 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.864 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.864 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.864 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.864 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.864 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.864 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.864 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.864 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.864 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.864 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.864 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.864 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.864 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.864 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.864 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.864 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.864 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.864 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.864 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.864 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.864 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.864 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.864 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.864 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.864 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.864 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.864 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.864 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.879 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/memory.usage volume: 51.81640625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.880 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2534119d-32e5-4179-a2e5-9e0ed00b22dc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.81640625, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'timestamp': '2025-10-14T10:22:49.864866', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': 'bc50a1c4-a8e7-11f0-9707-fa163e99780b', 'monotonic_time': 13386.072208936, 'message_signature': 'e49aaa164f35a009127da38f8adb63476754240929f34286fa43d4e1f9104606'}]}, 'timestamp': '2025-10-14 10:22:49.879947', '_unique_id': '3ef16d05f59d43cc97002237c3057c92'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.880 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.880 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.880 12 ERROR oslo_messaging.notify.messaging yield
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.880 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.880 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.880 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.880 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.880 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.880 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.880 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.880 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.880 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.880 12 ERROR oslo_messaging.notify.messaging conn.connect()
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.880 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.880 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.880 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.880 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.880 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.880 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.880 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.880 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.880 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.880 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.880 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.880 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.880 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.880 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.880 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.880 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.880 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.880 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.880 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.880 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.880 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.880 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.880 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.880 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.880 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.880 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.880 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.880 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.880 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.880 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.880 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.880 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.880 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.880 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.880 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.880 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.880 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.880 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.880 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.880 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.880 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.881 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.881 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.incoming.bytes volume: 8190 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.881 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd5a810b6-93c8-4100-95fa-5b7ed276aad3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 8190, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T10:22:49.881249', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': 'bc50df22-a8e7-11f0-9707-fa163e99780b', 'monotonic_time': 13386.04664086, 'message_signature': '120a35e45abb32829e90e16b569c553560ec11fa656bdd1b7578fdbc1bc60fd5'}]}, 'timestamp': '2025-10-14 10:22:49.881503', '_unique_id': 'd0de7235fdfb4684ac5aa4fab42f7bee'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.881 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.881 12 ERROR oslo_messaging.notify.messaging yield
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.881 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.881 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.881 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.881 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.881 12 ERROR oslo_messaging.notify.messaging conn.connect()
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.881 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.881 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.881 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.881 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 06:22:49 localhost
ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.881 12 ERROR oslo_messaging.notify.messaging Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.881 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.881 12 ERROR oslo_messaging.notify.messaging Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.881 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.881 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.881 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.881 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", 
line 653, in _send Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.881 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.881 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.881 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.881 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.881 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.881 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.881 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.881 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.881 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.881 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.881 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:22:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.881 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.881 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.881 12 ERROR oslo_messaging.notify.messaging Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.882 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.882 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.incoming.packets volume: 79 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.883 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '1917ab48-d705-4b4a-bb71-ffe0c813d8ad', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 79, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T10:22:49.882690', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': 'bc511776-a8e7-11f0-9707-fa163e99780b', 'monotonic_time': 13386.04664086, 'message_signature': 'c30e1dcf62af3af4a5f1f5344fdd41b508041f35f8bd1fdde35418cad1f4f6fa'}]}, 'timestamp': '2025-10-14 10:22:49.882945', '_unique_id': '262e110084034b66857babae3ea15e69'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.883 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:22:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.883 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.883 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.883 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.883 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.883 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.883 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.883 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.883 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.883 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.883 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.883 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.883 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.883 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.883 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.883 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.883 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.883 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.883 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.883 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.883 12 ERROR oslo_messaging.notify.messaging Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.883 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.883 12 ERROR oslo_messaging.notify.messaging Oct 14 06:22:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.883 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.883 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.883 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.883 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.883 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.883 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.883 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.883 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.883 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.883 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.883 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.883 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.883 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.883 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.883 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.883 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.883 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.883 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.883 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.883 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:22:49 
localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.883 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.883 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.883 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.883 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.883 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.883 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.883 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.883 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.883 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.883 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.883 12 ERROR oslo_messaging.notify.messaging Oct 14 06:22:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.884 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.884 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.write.latency volume: 1178650666 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.884 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.write.latency volume: 22746151 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.885 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e370af2a-34a5-416f-b7b4-437bcc646f54', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1178650666, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T10:22:49.884197', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': 
{'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'bc5152b8-a8e7-11f0-9707-fa163e99780b', 'monotonic_time': 13386.019030601, 'message_signature': '369269a39d798024ce2d6b59c09ba6f0e7f057d2b8d905898fee7b7217857af5'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 22746151, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T10:22:49.884197', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'bc515cc2-a8e7-11f0-9707-fa163e99780b', 'monotonic_time': 13386.019030601, 'message_signature': 'c48873fd738b652d892eeff0e22ddb19f4aa454e3590d4cdd0b2a6a0a2cfead7'}]}, 'timestamp': '2025-10-14 10:22:49.884734', '_unique_id': 'e52b03a41ff943cb8236402f9113aaab'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 
10:22:49.885 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.885 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.885 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.885 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.885 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.885 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.885 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.885 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.885 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.885 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.885 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:22:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.885 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.885 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.885 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.885 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.885 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.885 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.885 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.885 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.885 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.885 12 ERROR oslo_messaging.notify.messaging Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.885 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 
2025-10-14 10:22:49.885 12 ERROR oslo_messaging.notify.messaging Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.885 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.885 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.885 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.885 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.885 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.885 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.885 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.885 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.885 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.885 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.885 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.885 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.885 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.885 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.885 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.885 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.885 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.885 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.885 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.885 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.885 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.885 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.885 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.885 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.885 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.885 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.885 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.885 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.885 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.885 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:22:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.885 12 ERROR oslo_messaging.notify.messaging Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.886 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.886 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.outgoing.bytes volume: 10162 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.886 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '18ebe57c-1bc4-4a65-9c8f-217c180ae48c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 10162, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T10:22:49.886082', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 
'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': 'bc519c28-a8e7-11f0-9707-fa163e99780b', 'monotonic_time': 13386.04664086, 'message_signature': '3b8f34b141a6b8a565d79b1c1b1afcbb113fedb4b8e94792323ce1cca92ba7f1'}]}, 'timestamp': '2025-10-14 10:22:49.886341', '_unique_id': '4b73c3a178fb44f0a8b9b0875f2e455a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.886 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.886 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.886 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.886 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.886 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.886 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.886 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.886 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, 
in _connection_factory Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.886 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.886 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.886 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.886 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.886 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.886 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.886 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.886 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.886 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.886 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.886 12 ERROR 
oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.886 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.886 12 ERROR oslo_messaging.notify.messaging Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.886 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.886 12 ERROR oslo_messaging.notify.messaging Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.886 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.886 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.886 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.886 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.886 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.886 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.886 12 ERROR 
oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.886 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.886 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.886 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.886 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.886 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.886 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.886 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.886 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.886 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.886 12 ERROR 
oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.886 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.886 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.886 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.886 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.886 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.886 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.886 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.886 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.886 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.886 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:22:49 
localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.886 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.886 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.886 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.886 12 ERROR oslo_messaging.notify.messaging Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.887 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.896 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.897 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.898 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '949edb45-29f0-48e1-a109-e4c947238933', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T10:22:49.887495', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'bc53419a-a8e7-11f0-9707-fa163e99780b', 'monotonic_time': 13386.080172581, 'message_signature': '809b50993076754b9c46fe448de49289bb2877572332d334797639fa81134512'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T10:22:49.887495', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 
'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'bc534d20-a8e7-11f0-9707-fa163e99780b', 'monotonic_time': 13386.080172581, 'message_signature': 'a523710e39fbd856945fe803fb7c5fc524d9eb1435d616f43925554af27862ad'}]}, 'timestamp': '2025-10-14 10:22:49.897434', '_unique_id': '823dcd2604b548ecadb1960e759b4fad'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.898 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.898 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.898 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.898 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.898 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.898 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.898 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.898 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.898 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 06:22:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.898 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.898 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.898 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.898 12 ERROR oslo_messaging.notify.messaging Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.898 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.898 12 ERROR oslo_messaging.notify.messaging Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.898 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.898 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 
10:22:49.898 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.898 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.898 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.898 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.898 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 06:22:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.898 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.898 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.898 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.898 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.898 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 
10:22:49.898 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.898 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.898 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.898 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.898 12 ERROR oslo_messaging.notify.messaging Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.899 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.899 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.899 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 
10:22:49.900 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '03e71b4c-12c9-4c20-b9f8-696959462a22', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T10:22:49.899090', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'bc5398fc-a8e7-11f0-9707-fa163e99780b', 'monotonic_time': 13386.080172581, 'message_signature': '377ffe011df127fe55283eb89f36effb6f31cd5f0d14e6ea4bd04b44354b81fa'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T10:22:49.899090', 'resource_metadata': 
{'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'bc53a356-a8e7-11f0-9707-fa163e99780b', 'monotonic_time': 13386.080172581, 'message_signature': '25c05b746e743aa3efc18c8940c3c66a068f6566cbf18fe197da26c1f3cc0337'}]}, 'timestamp': '2025-10-14 10:22:49.899655', '_unique_id': '670267c638164d7d952cbe8d106a6721'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.900 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.900 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.900 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.900 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.900 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:22:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.900 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.900 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.900 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.900 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.900 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.900 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.900 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.900 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.900 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.900 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.900 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.900 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.900 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.900 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.900 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.900 12 ERROR oslo_messaging.notify.messaging Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.900 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.900 12 ERROR oslo_messaging.notify.messaging Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.900 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.900 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.900 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.900 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 
134, in _send_notification Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.900 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.900 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.900 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.900 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.900 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.900 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.900 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.900 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.900 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.900 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.900 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.900 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.900 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.900 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.900 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.900 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.900 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.900 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.900 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.900 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", 
line 433, in _ensure_connection Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.900 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.900 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.900 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.900 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.900 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.900 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.900 12 ERROR oslo_messaging.notify.messaging Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.900 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.901 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.901 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.incoming.packets.drop volume: 0 _stats_to_sample 
/usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.901 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7cbb6a4f-a6e4-455e-b314-8c0db55a415b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T10:22:49.901093', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': 'bc53e758-a8e7-11f0-9707-fa163e99780b', 'monotonic_time': 13386.04664086, 'message_signature': 'd9133da6a7fef31a868b23221198862f0a1c08912d2a556111b8329827671f0e'}]}, 'timestamp': '2025-10-14 10:22:49.901391', '_unique_id': 'd2772ec328614dd78f9e00bb16f03c2f'}: 
kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.901 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.901 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.901 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.901 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.901 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 
2025-10-14 10:22:49.901 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.901 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.901 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.901 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.901 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.901 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.901 12 ERROR oslo_messaging.notify.messaging Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.901 12 ERROR 
oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.901 12 ERROR oslo_messaging.notify.messaging Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.901 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.901 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.901 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.901 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.901 12 ERROR oslo_messaging.notify.messaging with 
self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.901 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.901 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.901 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.901 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.901 12 ERROR 
oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.901 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.901 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.901 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.901 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.901 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:22:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.901 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.901 12 ERROR oslo_messaging.notify.messaging Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.902 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.902 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.902 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/cpu volume: 18750000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.903 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '806f3489-d842-426c-8568-e63e2bffad1b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 18750000000, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'timestamp': '2025-10-14T10:22:49.902773', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': 'bc542830-a8e7-11f0-9707-fa163e99780b', 'monotonic_time': 13386.072208936, 'message_signature': '24268e9f3881bc6be629e2a4eb79de4aa78ae1b01de7bdf2e71884b5ec6a5dce'}]}, 'timestamp': '2025-10-14 10:22:49.903038', '_unique_id': '07b6d4a22b854b4598417051383c3131'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.903 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.903 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in 
_reraise_as_library_errors Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.903 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.903 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.903 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.903 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.903 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.903 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.903 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.903 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.903 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.903 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.903 12 
ERROR oslo_messaging.notify.messaging conn.connect()
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.903 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.903 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.903 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.903 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.903 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.903 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.903 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.903 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.903 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.903 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.903 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.903 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.903 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.903 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.903 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.903 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.903 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.903 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.903 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.903 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.903 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.903 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.903 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.903 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.903 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.903 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.903 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.903 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.903 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.903 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.903 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.903 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.903 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.903 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.903 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.903 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.903 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.903 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.903 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.903 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.903 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.904 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.904 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.904 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.905 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '541da1c7-6b2a-4c90-a950-2b31d89d4ba0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T10:22:49.904322', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'bc54649e-a8e7-11f0-9707-fa163e99780b', 'monotonic_time': 13386.080172581, 'message_signature': '0560aef659ccc356ab1cd80c6073ac85edcf9b4c46a26d869d889d7dd0884ea0'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T10:22:49.904322', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'bc546e76-a8e7-11f0-9707-fa163e99780b', 'monotonic_time': 13386.080172581, 'message_signature': 'a87c905ad9f908a72f62963f6daa4b0e857c88c409d0ed8681d39f9e345040ad'}]}, 'timestamp': '2025-10-14 10:22:49.904826', '_unique_id': 'c0b8e67a9709484b8508ccee9430786a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.905 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.905 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.905 12 ERROR oslo_messaging.notify.messaging yield
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.905 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.905 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.905 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.905 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.905 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.905 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.905 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.905 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.905 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.905 12 ERROR oslo_messaging.notify.messaging conn.connect()
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.905 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.905 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.905 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.905 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.905 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.905 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.905 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.905 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.905 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.905 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.905 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.905 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.905 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.905 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.905 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.905 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.905 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.905 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.905 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.905 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.905 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.905 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.905 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.905 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.905 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.905 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.905 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.905 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.905 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.905 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.905 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.905 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.905 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.905 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.905 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.905 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.905 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.905 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.905 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.905 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.905 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.906 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.906 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.906 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6b26d395-2af2-446e-9144-ae0c082266c9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T10:22:49.906132', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': 'bc54abd4-a8e7-11f0-9707-fa163e99780b', 'monotonic_time': 13386.04664086, 'message_signature': 'e66bca80691192105ad351d91365ccb451d1944cc78ddab1e04c636d4e591afd'}]}, 'timestamp': '2025-10-14 10:22:49.906420', '_unique_id': 'c79dc5767d9c40239906001ffedda4bf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.906 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.906 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.906 12 ERROR oslo_messaging.notify.messaging yield
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.906 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.906 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.906 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.906 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.906 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.906 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.906 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.906 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.906 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.906 12 ERROR oslo_messaging.notify.messaging conn.connect()
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.906 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.906 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.906 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.906 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.906 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.906 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.906 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.906 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.906 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.906 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.906 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.906 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.906 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.906 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.906 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.906 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.906 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.906 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.906 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.906 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.906 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.906 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.906 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.906 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.906 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.906 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.906 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.906 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.906 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.906 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.906 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.906 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.906 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.906 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.906 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.906 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.906 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.906 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.906 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.906 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.906 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.907 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.907 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.read.requests volume: 1272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.907 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.908 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'af6dc1d8-ab0d-4053-aab6-f54899667b35', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1272, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T10:22:49.907713', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'bc54e946-a8e7-11f0-9707-fa163e99780b', 'monotonic_time': 13386.019030601, 'message_signature': '83c6ebfbab81a1284338e25bf78969957ace88872160e3947772c317f42bffcb'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T10:22:49.907713', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'bc54f328-a8e7-11f0-9707-fa163e99780b', 'monotonic_time': 13386.019030601, 'message_signature': 'b58d8290a8de3d0918f65e9e6ca52bd0b09be0b6738a21dd9491720579b93efd'}]}, 'timestamp': '2025-10-14 10:22:49.908227', '_unique_id': 'd80f120ac49f4520aed62dd53c78f236'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.908 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.908 12 ERROR oslo_messaging.notify.messaging yield
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.908 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.908 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.908 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.908 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.908 12 ERROR oslo_messaging.notify.messaging conn.connect()
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.908 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14
06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.908 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.908 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.908 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.908 12 ERROR oslo_messaging.notify.messaging Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.908 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.908 12 ERROR oslo_messaging.notify.messaging Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.908 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.908 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:22:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.908 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.908 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.908 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.908 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.908 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.908 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.908 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.908 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.908 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.908 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:22:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.908 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.908 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.908 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.908 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.908 12 ERROR oslo_messaging.notify.messaging Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.909 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.909 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.910 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '1ae9c3f8-b1b8-44b7-bf7f-1aa9ca29fc3c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T10:22:49.909549', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': 'bc553284-a8e7-11f0-9707-fa163e99780b', 'monotonic_time': 13386.04664086, 'message_signature': 'ee0a03f48a3eb67c28e129be13497a1d039fe5b9ed9cd95c5ca3637fd9c6d897'}]}, 'timestamp': '2025-10-14 10:22:49.909872', '_unique_id': 'd4296493d0ff454c8e8ef50575df57f1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.910 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:22:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.910 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.910 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.910 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.910 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.910 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.910 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.910 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.910 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.910 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.910 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.910 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.910 12 ERROR oslo_messaging.notify.messaging Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.910 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.910 12 ERROR oslo_messaging.notify.messaging Oct 14 06:22:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.910 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.910 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.910 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.910 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.910 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.910 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.910 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.910 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.910 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.910 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:22:49 
localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.910 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.910 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.910 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.910 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.910 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.910 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.910 12 ERROR oslo_messaging.notify.messaging Oct 14 06:22:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.911 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.911 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.read.latency volume: 1739521236 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.911 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.read.latency volume: 110324003 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.912 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1d642bcc-9c73-464d-a95a-178a28db7efb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1739521236, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T10:22:49.911423', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': 
{'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'bc557a00-a8e7-11f0-9707-fa163e99780b', 'monotonic_time': 13386.019030601, 'message_signature': '988a8ce34fe4e7234a550ab5283ed39806c4a53d7246d3c10520cf8db64350a3'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 110324003, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T10:22:49.911423', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'bc55848c-a8e7-11f0-9707-fa163e99780b', 'monotonic_time': 13386.019030601, 'message_signature': 'a0976c5a7dbcf679d2a98c78cb95cf493e8bcfe81b9f7f2b8aef709954d45f91'}]}, 'timestamp': '2025-10-14 10:22:49.911916', '_unique_id': 'e68911d3b4e74a24bcfadaa1269a5d43'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 
10:22:49.912 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.912 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.912 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.912 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.912 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.912 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.912 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.912 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.912 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.912 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.912 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:22:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.912 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.912 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.912 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.912 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.912 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.912 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.912 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.912 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.912 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.912 12 ERROR oslo_messaging.notify.messaging Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.912 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 
2025-10-14 10:22:49.912 12 ERROR oslo_messaging.notify.messaging Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.912 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.912 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.912 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.912 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.912 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.912 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.912 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.912 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.912 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.912 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.912 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.912 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.912 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.912 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.912 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.912 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.912 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.912 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.912 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.912 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.912 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.912 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.912 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.912 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.912 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.912 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.912 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.912 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.912 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.912 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:22:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.912 12 ERROR oslo_messaging.notify.messaging Oct 14 06:22:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:22:49.912 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Oct 14 06:22:50 localhost nova_compute[297686]: 2025-10-14 10:22:50.493 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:22:53 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch Oct 14 06:22:53 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/6e43484d-3647-4085-8cab-db9b4f4530f7/09344f24-aa16-4ae2-bf7b-5dc05ab244e1", "osd", "allow rw pool=manila_data namespace=fsvolumens_6e43484d-3647-4085-8cab-db9b4f4530f7", "mon", "allow r"], "format": "json"} : dispatch Oct 14 06:22:53 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/6e43484d-3647-4085-8cab-db9b4f4530f7/09344f24-aa16-4ae2-bf7b-5dc05ab244e1", "osd", "allow rw pool=manila_data namespace=fsvolumens_6e43484d-3647-4085-8cab-db9b4f4530f7", "mon", "allow r"], "format": "json"}]': finished Oct 14 06:22:54 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e250 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Oct 14 06:22:55 localhost nova_compute[297686]: 2025-10-14 10:22:55.496 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout 
__log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Oct 14 06:22:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f. Oct 14 06:22:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917. Oct 14 06:22:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d. Oct 14 06:22:55 localhost systemd[1]: tmp-crun.QtEz8Y.mount: Deactivated successfully. Oct 14 06:22:55 localhost podman[339278]: 2025-10-14 10:22:55.770106275 +0000 UTC m=+0.103823633 container health_status 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, vcs-type=git, config_id=edpm, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, vendor=Red Hat, Inc., version=9.6, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b) Oct 14 06:22:55 localhost podman[339283]: 2025-10-14 10:22:55.776880053 +0000 UTC m=+0.107378743 container health_status 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack 
Kubernetes Operator team, org.label-schema.schema-version=1.0) Oct 14 06:22:55 localhost podman[339278]: 2025-10-14 10:22:55.782794345 +0000 UTC m=+0.116511683 container exec_died 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, name=ubi9-minimal, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, io.openshift.expose-services=, maintainer=Red Hat, Inc., vcs-type=git, architecture=x86_64, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.) 
Oct 14 06:22:55 localhost podman[339277]: 2025-10-14 10:22:55.742018412 +0000 UTC m=+0.086098679 container health_status 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251009, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team) Oct 14 06:22:55 localhost systemd[1]: 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917.service: Deactivated successfully. 
Oct 14 06:22:55 localhost podman[339283]: 2025-10-14 10:22:55.810950481 +0000 UTC m=+0.141449141 container exec_died 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, managed_by=edpm_ansible, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_id=edpm, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute) Oct 14 06:22:55 localhost podman[339277]: 2025-10-14 10:22:55.825133257 +0000 UTC m=+0.169213504 container exec_died 
1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}) Oct 14 06:22:55 localhost systemd[1]: 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d.service: Deactivated successfully. Oct 14 06:22:55 localhost systemd[1]: 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f.service: Deactivated successfully. 
Oct 14 06:22:57 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch Oct 14 06:22:57 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch Oct 14 06:22:57 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd='[{"prefix": "auth rm", "entity": "client.alice"}]': finished Oct 14 06:22:57 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Oct 14 06:22:57 localhost ceph-mon[317114]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1899281395' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Oct 14 06:22:57 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Oct 14 06:22:57 localhost ceph-mon[317114]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.32:0/1899281395' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Oct 14 06:22:57 localhost ovn_metadata_agent[163050]: 2025-10-14 10:22:57.790 163055 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Oct 14 06:22:57 localhost ovn_metadata_agent[163050]: 2025-10-14 10:22:57.790 163055 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Oct 14 06:22:57 localhost ovn_metadata_agent[163050]: 2025-10-14 10:22:57.791 163055 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Oct 14 06:22:58 localhost podman[248187]: time="2025-10-14T10:22:58Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Oct 14 06:22:58 localhost podman[248187]: @ - - [14/Oct/2025:10:22:58 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 147497 "" "Go-http-client/1.1" Oct 14 06:22:58 localhost podman[248187]: @ - - [14/Oct/2025:10:22:58 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19873 "" "Go-http-client/1.1" Oct 14 06:22:59 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e250 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Oct 14 06:23:00 localhost nova_compute[297686]: 2025-10-14 10:23:00.498 2 DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Oct 14 06:23:00 localhost nova_compute[297686]: 2025-10-14 10:23:00.500 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Oct 14 06:23:00 localhost nova_compute[297686]: 2025-10-14 10:23:00.501 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Oct 14 06:23:00 localhost nova_compute[297686]: 2025-10-14 10:23:00.501 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Oct 14 06:23:00 localhost nova_compute[297686]: 2025-10-14 10:23:00.503 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:23:00 localhost nova_compute[297686]: 2025-10-14 10:23:00.504 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Oct 14 06:23:00 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch Oct 14 06:23:00 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/6e43484d-3647-4085-8cab-db9b4f4530f7/09344f24-aa16-4ae2-bf7b-5dc05ab244e1", "osd", "allow r pool=manila_data namespace=fsvolumens_6e43484d-3647-4085-8cab-db9b4f4530f7", "mon", "allow r"], "format": "json"} : dispatch Oct 14 06:23:00 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' 
cmd='[{"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/6e43484d-3647-4085-8cab-db9b4f4530f7/09344f24-aa16-4ae2-bf7b-5dc05ab244e1", "osd", "allow r pool=manila_data namespace=fsvolumens_6e43484d-3647-4085-8cab-db9b4f4530f7", "mon", "allow r"], "format": "json"}]': finished Oct 14 06:23:03 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch Oct 14 06:23:03 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch Oct 14 06:23:03 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd='[{"prefix": "auth rm", "entity": "client.alice"}]': finished Oct 14 06:23:04 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e250 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Oct 14 06:23:05 localhost nova_compute[297686]: 2025-10-14 10:23:05.505 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Oct 14 06:23:05 localhost nova_compute[297686]: 2025-10-14 10:23:05.507 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Oct 14 06:23:05 localhost nova_compute[297686]: 2025-10-14 10:23:05.507 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Oct 14 06:23:05 localhost nova_compute[297686]: 2025-10-14 10:23:05.507 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Oct 14 06:23:05 localhost 
nova_compute[297686]: 2025-10-14 10:23:05.522 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:23:05 localhost nova_compute[297686]: 2025-10-14 10:23:05.523 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Oct 14 06:23:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041. Oct 14 06:23:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857. Oct 14 06:23:05 localhost podman[339340]: 2025-10-14 10:23:05.746753283 +0000 UTC m=+0.085277053 container health_status 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', 
'/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible) Oct 14 06:23:05 localhost podman[339340]: 2025-10-14 10:23:05.751770857 +0000 UTC m=+0.090294637 container exec_died 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2) Oct 14 06:23:05 localhost systemd[1]: 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857.service: Deactivated successfully. Oct 14 06:23:05 localhost podman[339339]: 2025-10-14 10:23:05.797556245 +0000 UTC m=+0.138036806 container health_status 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Oct 14 06:23:05 localhost podman[339339]: 2025-10-14 10:23:05.811087251 +0000 UTC m=+0.151567842 container exec_died 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041 
(image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Oct 14 06:23:05 localhost systemd[1]: 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041.service: Deactivated successfully. 
Oct 14 06:23:06 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch Oct 14 06:23:06 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/6e43484d-3647-4085-8cab-db9b4f4530f7/09344f24-aa16-4ae2-bf7b-5dc05ab244e1", "osd", "allow rw pool=manila_data namespace=fsvolumens_6e43484d-3647-4085-8cab-db9b4f4530f7", "mon", "allow r"], "format": "json"} : dispatch Oct 14 06:23:06 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/6e43484d-3647-4085-8cab-db9b4f4530f7/09344f24-aa16-4ae2-bf7b-5dc05ab244e1", "osd", "allow rw pool=manila_data namespace=fsvolumens_6e43484d-3647-4085-8cab-db9b4f4530f7", "mon", "allow r"], "format": "json"}]': finished Oct 14 06:23:08 localhost openstack_network_exporter[250374]: ERROR 10:23:08 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Oct 14 06:23:08 localhost openstack_network_exporter[250374]: ERROR 10:23:08 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Oct 14 06:23:08 localhost openstack_network_exporter[250374]: ERROR 10:23:08 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Oct 14 06:23:08 localhost openstack_network_exporter[250374]: ERROR 10:23:08 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Oct 14 06:23:08 localhost openstack_network_exporter[250374]: Oct 14 06:23:08 localhost openstack_network_exporter[250374]: ERROR 10:23:08 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please 
specify an existing datapath Oct 14 06:23:08 localhost openstack_network_exporter[250374]: Oct 14 06:23:09 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e250 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Oct 14 06:23:09 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch Oct 14 06:23:09 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch Oct 14 06:23:09 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd='[{"prefix": "auth rm", "entity": "client.alice_bob"}]': finished Oct 14 06:23:10 localhost nova_compute[297686]: 2025-10-14 10:23:10.523 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Oct 14 06:23:10 localhost nova_compute[297686]: 2025-10-14 10:23:10.525 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Oct 14 06:23:10 localhost nova_compute[297686]: 2025-10-14 10:23:10.525 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Oct 14 06:23:10 localhost nova_compute[297686]: 2025-10-14 10:23:10.526 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Oct 14 06:23:10 localhost nova_compute[297686]: 2025-10-14 10:23:10.526 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Oct 14 06:23:10 localhost 
nova_compute[297686]: 2025-10-14 10:23:10.528 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Oct 14 06:23:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec. Oct 14 06:23:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29. Oct 14 06:23:10 localhost systemd[1]: tmp-crun.IV9GKP.mount: Deactivated successfully. Oct 14 06:23:10 localhost podman[339380]: 2025-10-14 10:23:10.75390399 +0000 UTC m=+0.089646148 container health_status fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=iscsid, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=iscsid, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}) Oct 14 06:23:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad. Oct 14 06:23:10 localhost podman[339380]: 2025-10-14 10:23:10.791578929 +0000 UTC m=+0.127321097 container exec_died fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, managed_by=edpm_ansible, config_id=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=iscsid, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0) Oct 14 06:23:10 localhost systemd[1]: tmp-crun.aSuOI2.mount: Deactivated successfully. Oct 14 06:23:10 localhost systemd[1]: fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29.service: Deactivated successfully. Oct 14 06:23:10 localhost podman[339379]: 2025-10-14 10:23:10.809446938 +0000 UTC m=+0.147362442 container health_status aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter) Oct 14 06:23:10 localhost podman[339379]: 2025-10-14 
10:23:10.816200597 +0000 UTC m=+0.154116101 container exec_died aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Oct 14 06:23:10 localhost systemd[1]: aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec.service: Deactivated successfully. 
Oct 14 06:23:10 localhost podman[339411]: 2025-10-14 10:23:10.880796423 +0000 UTC m=+0.095563389 container health_status 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251009) Oct 14 06:23:10 localhost podman[339411]: 2025-10-14 10:23:10.916325435 +0000 UTC m=+0.131092401 container exec_died 
02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=multipathd, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}) Oct 14 06:23:10 localhost systemd[1]: 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad.service: Deactivated successfully. 
Oct 14 06:23:13 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch Oct 14 06:23:13 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/6e43484d-3647-4085-8cab-db9b4f4530f7/09344f24-aa16-4ae2-bf7b-5dc05ab244e1", "osd", "allow r pool=manila_data namespace=fsvolumens_6e43484d-3647-4085-8cab-db9b4f4530f7", "mon", "allow r"], "format": "json"} : dispatch Oct 14 06:23:13 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/6e43484d-3647-4085-8cab-db9b4f4530f7/09344f24-aa16-4ae2-bf7b-5dc05ab244e1", "osd", "allow r pool=manila_data namespace=fsvolumens_6e43484d-3647-4085-8cab-db9b4f4530f7", "mon", "allow r"], "format": "json"}]': finished Oct 14 06:23:14 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e250 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Oct 14 06:23:14 localhost ovn_controller[157396]: 2025-10-14T10:23:14Z|00293|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory Oct 14 06:23:15 localhost ceph-mon[317114]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #49. Immutable memtables: 0. 
Oct 14 06:23:15 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:23:15.177565) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0 Oct 14 06:23:15 localhost ceph-mon[317114]: rocksdb: [db/flush_job.cc:856] [default] [JOB 27] Flushing memtable with next log file: 49 Oct 14 06:23:15 localhost ceph-mon[317114]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760437395177733, "job": 27, "event": "flush_started", "num_memtables": 1, "num_entries": 1619, "num_deletes": 265, "total_data_size": 1697219, "memory_usage": 1729656, "flush_reason": "Manual Compaction"} Oct 14 06:23:15 localhost ceph-mon[317114]: rocksdb: [db/flush_job.cc:885] [default] [JOB 27] Level-0 flush table #50: started Oct 14 06:23:15 localhost ceph-mon[317114]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760437395186519, "cf_name": "default", "job": 27, "event": "table_file_creation", "file_number": 50, "file_size": 1105428, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 30774, "largest_seqno": 32388, "table_properties": {"data_size": 1098930, "index_size": 3456, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1989, "raw_key_size": 16203, "raw_average_key_size": 21, "raw_value_size": 1084881, "raw_average_value_size": 1408, "num_data_blocks": 151, "num_entries": 770, "num_filter_entries": 770, "num_deletions": 265, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; 
max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1760437325, "oldest_key_time": 1760437325, "file_creation_time": 1760437395, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "c006d4a7-7ae8-4cd3-a1b5-95f5bf3c427c", "db_session_id": "ZS6W3A0Q266OBGAU6ARP", "orig_file_number": 50, "seqno_to_time_mapping": "N/A"}} Oct 14 06:23:15 localhost ceph-mon[317114]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 27] Flush lasted 8989 microseconds, and 4868 cpu microseconds. Oct 14 06:23:15 localhost ceph-mon[317114]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. Oct 14 06:23:15 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:23:15.186570) [db/flush_job.cc:967] [default] [JOB 27] Level-0 flush table #50: 1105428 bytes OK Oct 14 06:23:15 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:23:15.186604) [db/memtable_list.cc:519] [default] Level-0 commit table #50 started Oct 14 06:23:15 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:23:15.189518) [db/memtable_list.cc:722] [default] Level-0 commit table #50: memtable #1 done Oct 14 06:23:15 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:23:15.189537) EVENT_LOG_v1 {"time_micros": 1760437395189531, "job": 27, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0} Oct 14 06:23:15 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:23:15.189567) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25 Oct 14 06:23:15 localhost ceph-mon[317114]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 27] Try to delete WAL files size 1689197, prev total WAL file 
size 1689521, number of live WAL files 2. Oct 14 06:23:15 localhost ceph-mon[317114]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005486733/store.db/000046.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Oct 14 06:23:15 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:23:15.190344) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0034353136' seq:72057594037927935, type:22 .. '6C6F676D0034373731' seq:0, type:0; will stop at (end) Oct 14 06:23:15 localhost ceph-mon[317114]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 28] Compacting 1@0 + 1@6 files to L6, score -1.00 Oct 14 06:23:15 localhost ceph-mon[317114]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 27 Base level 0, inputs: [50(1079KB)], [48(15MB)] Oct 14 06:23:15 localhost ceph-mon[317114]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760437395190385, "job": 28, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [50], "files_L6": [48], "score": -1, "input_data_size": 17527980, "oldest_snapshot_seqno": -1} Oct 14 06:23:15 localhost ceph-mon[317114]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 28] Generated table #51: 14171 keys, 17267546 bytes, temperature: kUnknown Oct 14 06:23:15 localhost ceph-mon[317114]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760437395251290, "cf_name": "default", "job": 28, "event": "table_file_creation", "file_number": 51, "file_size": 17267546, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 17187033, "index_size": 43996, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 35461, "raw_key_size": 382298, "raw_average_key_size": 26, "raw_value_size": 16946505, 
"raw_average_value_size": 1195, "num_data_blocks": 1625, "num_entries": 14171, "num_filter_entries": 14171, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1760436394, "oldest_key_time": 0, "file_creation_time": 1760437395, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "c006d4a7-7ae8-4cd3-a1b5-95f5bf3c427c", "db_session_id": "ZS6W3A0Q266OBGAU6ARP", "orig_file_number": 51, "seqno_to_time_mapping": "N/A"}} Oct 14 06:23:15 localhost ceph-mon[317114]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. 
Oct 14 06:23:15 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:23:15.251546) [db/compaction/compaction_job.cc:1663] [default] [JOB 28] Compacted 1@0 + 1@6 files to L6 => 17267546 bytes Oct 14 06:23:15 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:23:15.254339) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 287.3 rd, 283.0 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.1, 15.7 +0.0 blob) out(16.5 +0.0 blob), read-write-amplify(31.5) write-amplify(15.6) OK, records in: 14723, records dropped: 552 output_compression: NoCompression Oct 14 06:23:15 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:23:15.254360) EVENT_LOG_v1 {"time_micros": 1760437395254350, "job": 28, "event": "compaction_finished", "compaction_time_micros": 61008, "compaction_time_cpu_micros": 33886, "output_level": 6, "num_output_files": 1, "total_output_size": 17267546, "num_input_records": 14723, "num_output_records": 14171, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]} Oct 14 06:23:15 localhost ceph-mon[317114]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005486733/store.db/000050.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Oct 14 06:23:15 localhost ceph-mon[317114]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760437395254587, "job": 28, "event": "table_file_deletion", "file_number": 50} Oct 14 06:23:15 localhost ceph-mon[317114]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005486733/store.db/000048.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Oct 14 06:23:15 localhost ceph-mon[317114]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760437395256269, 
"job": 28, "event": "table_file_deletion", "file_number": 48} Oct 14 06:23:15 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:23:15.190263) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Oct 14 06:23:15 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:23:15.256370) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Oct 14 06:23:15 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:23:15.256374) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Oct 14 06:23:15 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:23:15.256376) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Oct 14 06:23:15 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:23:15.256378) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Oct 14 06:23:15 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:23:15.256380) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Oct 14 06:23:15 localhost nova_compute[297686]: 2025-10-14 10:23:15.529 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:23:16 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch Oct 14 06:23:16 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch Oct 14 06:23:16 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd='[{"prefix": "auth rm", "entity": "client.alice_bob"}]': finished Oct 14 06:23:19 localhost ceph-mon[317114]: 
mon.np0005486733@2(peon).osd e250 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Oct 14 06:23:19 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch Oct 14 06:23:19 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/6e43484d-3647-4085-8cab-db9b4f4530f7/09344f24-aa16-4ae2-bf7b-5dc05ab244e1", "osd", "allow rw pool=manila_data namespace=fsvolumens_6e43484d-3647-4085-8cab-db9b4f4530f7", "mon", "allow r"], "format": "json"} : dispatch Oct 14 06:23:19 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/6e43484d-3647-4085-8cab-db9b4f4530f7/09344f24-aa16-4ae2-bf7b-5dc05ab244e1", "osd", "allow rw pool=manila_data namespace=fsvolumens_6e43484d-3647-4085-8cab-db9b4f4530f7", "mon", "allow r"], "format": "json"}]': finished Oct 14 06:23:20 localhost nova_compute[297686]: 2025-10-14 10:23:20.532 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Oct 14 06:23:22 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch Oct 14 06:23:22 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch Oct 14 06:23:22 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd='[{"prefix": "auth 
rm", "entity": "client.alice bob"}]': finished Oct 14 06:23:24 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e250 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Oct 14 06:23:25 localhost nova_compute[297686]: 2025-10-14 10:23:25.535 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Oct 14 06:23:25 localhost nova_compute[297686]: 2025-10-14 10:23:25.537 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Oct 14 06:23:25 localhost nova_compute[297686]: 2025-10-14 10:23:25.537 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Oct 14 06:23:25 localhost nova_compute[297686]: 2025-10-14 10:23:25.537 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Oct 14 06:23:25 localhost nova_compute[297686]: 2025-10-14 10:23:25.564 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:23:25 localhost nova_compute[297686]: 2025-10-14 10:23:25.565 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Oct 14 06:23:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f. Oct 14 06:23:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917. Oct 14 06:23:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d. 
Oct 14 06:23:26 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch Oct 14 06:23:26 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/6e43484d-3647-4085-8cab-db9b4f4530f7/09344f24-aa16-4ae2-bf7b-5dc05ab244e1", "osd", "allow r pool=manila_data namespace=fsvolumens_6e43484d-3647-4085-8cab-db9b4f4530f7", "mon", "allow r"], "format": "json"} : dispatch Oct 14 06:23:26 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/6e43484d-3647-4085-8cab-db9b4f4530f7/09344f24-aa16-4ae2-bf7b-5dc05ab244e1", "osd", "allow r pool=manila_data namespace=fsvolumens_6e43484d-3647-4085-8cab-db9b4f4530f7", "mon", "allow r"], "format": "json"}]': finished Oct 14 06:23:26 localhost podman[339441]: 2025-10-14 10:23:26.756520156 +0000 UTC m=+0.090639739 container health_status 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, distribution-scope=public, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, release=1755695350, architecture=x86_64, config_id=edpm, version=9.6, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=) Oct 14 06:23:26 localhost podman[339441]: 2025-10-14 10:23:26.7911481 +0000 UTC m=+0.125267643 container exec_died 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, managed_by=edpm_ansible, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, name=ubi9-minimal, architecture=x86_64, config_id=edpm, vcs-type=git, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7) Oct 14 06:23:26 localhost systemd[1]: 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917.service: Deactivated successfully. 
Oct 14 06:23:26 localhost podman[339440]: 2025-10-14 10:23:26.811082593 +0000 UTC m=+0.146233688 container health_status 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_managed=true, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}) Oct 14 06:23:26 localhost podman[339440]: 2025-10-14 10:23:26.856091727 +0000 UTC m=+0.191242862 container exec_died 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, container_name=ovn_controller, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, 
org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3) Oct 14 06:23:26 localhost systemd[1]: 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f.service: Deactivated successfully. 
Oct 14 06:23:26 localhost podman[339442]: 2025-10-14 10:23:26.864799075 +0000 UTC m=+0.192457309 container health_status 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.schema-version=1.0, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2) Oct 14 06:23:26 localhost podman[339442]: 2025-10-14 10:23:26.906344753 +0000 UTC m=+0.234003037 container exec_died 
89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute) Oct 14 06:23:26 localhost systemd[1]: 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d.service: Deactivated successfully. 
Oct 14 06:23:28 localhost podman[248187]: time="2025-10-14T10:23:28Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Oct 14 06:23:28 localhost podman[248187]: @ - - [14/Oct/2025:10:23:28 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 147497 "" "Go-http-client/1.1" Oct 14 06:23:28 localhost podman[248187]: @ - - [14/Oct/2025:10:23:28 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19863 "" "Go-http-client/1.1" Oct 14 06:23:29 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e250 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Oct 14 06:23:29 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch Oct 14 06:23:29 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch Oct 14 06:23:29 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd='[{"prefix": "auth rm", "entity": "client.alice bob"}]': finished Oct 14 06:23:30 localhost nova_compute[297686]: 2025-10-14 10:23:30.565 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Oct 14 06:23:30 localhost nova_compute[297686]: 2025-10-14 10:23:30.567 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Oct 14 06:23:30 localhost nova_compute[297686]: 2025-10-14 10:23:30.568 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Oct 14 06:23:30 
localhost nova_compute[297686]: 2025-10-14 10:23:30.568 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Oct 14 06:23:30 localhost nova_compute[297686]: 2025-10-14 10:23:30.615 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:23:30 localhost nova_compute[297686]: 2025-10-14 10:23:30.616 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Oct 14 06:23:33 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get", "entity": "client.bob", "format": "json"} : dispatch Oct 14 06:23:33 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get-or-create", "entity": "client.bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/6e43484d-3647-4085-8cab-db9b4f4530f7/09344f24-aa16-4ae2-bf7b-5dc05ab244e1", "osd", "allow rw pool=manila_data namespace=fsvolumens_6e43484d-3647-4085-8cab-db9b4f4530f7", "mon", "allow r"], "format": "json"} : dispatch Oct 14 06:23:33 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd='[{"prefix": "auth get-or-create", "entity": "client.bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/6e43484d-3647-4085-8cab-db9b4f4530f7/09344f24-aa16-4ae2-bf7b-5dc05ab244e1", "osd", "allow rw pool=manila_data namespace=fsvolumens_6e43484d-3647-4085-8cab-db9b4f4530f7", "mon", "allow r"], "format": "json"}]': finished Oct 14 06:23:33 localhost nova_compute[297686]: 2025-10-14 10:23:33.255 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks 
/usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:23:34 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e250 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Oct 14 06:23:35 localhost nova_compute[297686]: 2025-10-14 10:23:35.616 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Oct 14 06:23:35 localhost nova_compute[297686]: 2025-10-14 10:23:35.619 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:23:36 localhost nova_compute[297686]: 2025-10-14 10:23:36.256 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:23:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041. Oct 14 06:23:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857. 
Oct 14 06:23:36 localhost podman[339510]: 2025-10-14 10:23:36.760948468 +0000 UTC m=+0.100193223 container health_status 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Oct 14 06:23:36 localhost podman[339510]: 2025-10-14 10:23:36.776219067 +0000 UTC m=+0.115463802 container exec_died 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', 
'/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Oct 14 06:23:36 localhost systemd[1]: 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041.service: Deactivated successfully. Oct 14 06:23:36 localhost podman[339516]: 2025-10-14 10:23:36.875223812 +0000 UTC m=+0.208711960 container health_status 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.build-date=20251009, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image) Oct 14 06:23:36 localhost podman[339516]: 2025-10-14 10:23:36.886523099 +0000 UTC m=+0.220011247 container exec_died 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, 
tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, managed_by=edpm_ansible) Oct 14 06:23:36 localhost systemd[1]: 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857.service: Deactivated successfully. Oct 14 06:23:37 localhost nova_compute[297686]: 2025-10-14 10:23:37.256 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:23:37 localhost nova_compute[297686]: 2025-10-14 10:23:37.256 2 DEBUG nova.compute.manager [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Oct 14 06:23:37 localhost nova_compute[297686]: 2025-10-14 10:23:37.256 2 DEBUG nova.compute.manager [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Oct 14 06:23:37 localhost nova_compute[297686]: 2025-10-14 10:23:37.321 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Acquiring lock "refresh_cache-88c4e366-b765-47a6-96bf-f7677f2ce67c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Oct 14 06:23:37 localhost nova_compute[297686]: 2025-10-14 10:23:37.321 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Acquired lock "refresh_cache-88c4e366-b765-47a6-96bf-f7677f2ce67c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Oct 14 06:23:37 localhost nova_compute[297686]: 2025-10-14 10:23:37.322 2 DEBUG nova.network.neutron [None 
req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] [instance: 88c4e366-b765-47a6-96bf-f7677f2ce67c] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Oct 14 06:23:37 localhost nova_compute[297686]: 2025-10-14 10:23:37.322 2 DEBUG nova.objects.instance [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Lazy-loading 'info_cache' on Instance uuid 88c4e366-b765-47a6-96bf-f7677f2ce67c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Oct 14 06:23:37 localhost nova_compute[297686]: 2025-10-14 10:23:37.767 2 DEBUG nova.network.neutron [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] [instance: 88c4e366-b765-47a6-96bf-f7677f2ce67c] Updating instance_info_cache with network_info: [{"id": "3ec9b060-f43d-4698-9c76-6062c70911d5", "address": "fa:16:3e:84:5e:e5", "network": {"id": "7d0cd696-bdd7-4e70-9512-eb0d23640314", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.46", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "41187b090f3d4818a32baa37ce8a3991", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3ec9b060-f4", "ovs_interfaceid": "3ec9b060-f43d-4698-9c76-6062c70911d5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info 
/usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Oct 14 06:23:37 localhost nova_compute[297686]: 2025-10-14 10:23:37.785 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Releasing lock "refresh_cache-88c4e366-b765-47a6-96bf-f7677f2ce67c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Oct 14 06:23:37 localhost nova_compute[297686]: 2025-10-14 10:23:37.786 2 DEBUG nova.compute.manager [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] [instance: 88c4e366-b765-47a6-96bf-f7677f2ce67c] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Oct 14 06:23:38 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Oct 14 06:23:38 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' Oct 14 06:23:38 localhost openstack_network_exporter[250374]: ERROR 10:23:38 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Oct 14 06:23:38 localhost openstack_network_exporter[250374]: ERROR 10:23:38 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Oct 14 06:23:38 localhost openstack_network_exporter[250374]: ERROR 10:23:38 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Oct 14 06:23:38 localhost openstack_network_exporter[250374]: ERROR 10:23:38 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Oct 14 06:23:38 localhost openstack_network_exporter[250374]: Oct 14 06:23:38 localhost openstack_network_exporter[250374]: ERROR 10:23:38 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Oct 14 06:23:38 localhost 
openstack_network_exporter[250374]: Oct 14 06:23:38 localhost nova_compute[297686]: 2025-10-14 10:23:38.781 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:23:39 localhost nova_compute[297686]: 2025-10-14 10:23:39.255 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:23:39 localhost nova_compute[297686]: 2025-10-14 10:23:39.256 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:23:39 localhost nova_compute[297686]: 2025-10-14 10:23:39.274 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Oct 14 06:23:39 localhost nova_compute[297686]: 2025-10-14 10:23:39.274 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Oct 14 06:23:39 localhost nova_compute[297686]: 2025-10-14 10:23:39.275 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Lock "compute_resources" "released" by 
"nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Oct 14 06:23:39 localhost nova_compute[297686]: 2025-10-14 10:23:39.275 2 DEBUG nova.compute.resource_tracker [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Auditing locally available compute resources for np0005486733.localdomain (node: np0005486733.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Oct 14 06:23:39 localhost nova_compute[297686]: 2025-10-14 10:23:39.276 2 DEBUG oslo_concurrency.processutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Oct 14 06:23:39 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e250 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Oct 14 06:23:39 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Oct 14 06:23:39 localhost ceph-mon[317114]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.108:0/2704901518' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Oct 14 06:23:39 localhost nova_compute[297686]: 2025-10-14 10:23:39.758 2 DEBUG oslo_concurrency.processutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.482s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Oct 14 06:23:39 localhost nova_compute[297686]: 2025-10-14 10:23:39.893 2 DEBUG nova.virt.libvirt.driver [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Oct 14 06:23:39 localhost nova_compute[297686]: 2025-10-14 10:23:39.894 2 DEBUG nova.virt.libvirt.driver [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Oct 14 06:23:40 localhost nova_compute[297686]: 2025-10-14 10:23:40.109 2 WARNING nova.virt.libvirt.driver [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Oct 14 06:23:40 localhost nova_compute[297686]: 2025-10-14 10:23:40.111 2 DEBUG nova.compute.resource_tracker [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Hypervisor/Node resource view: name=np0005486733.localdomain free_ram=11126MB free_disk=41.83695602416992GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": 
"1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Oct 14 06:23:40 localhost nova_compute[297686]: 2025-10-14 10:23:40.112 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Oct 14 06:23:40 localhost nova_compute[297686]: 2025-10-14 10:23:40.112 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Oct 14 06:23:40 localhost nova_compute[297686]: 2025-10-14 10:23:40.182 2 DEBUG nova.compute.resource_tracker [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Instance 88c4e366-b765-47a6-96bf-f7677f2ce67c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Oct 14 06:23:40 localhost nova_compute[297686]: 2025-10-14 10:23:40.183 2 DEBUG nova.compute.resource_tracker [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Oct 14 06:23:40 localhost nova_compute[297686]: 2025-10-14 10:23:40.183 2 DEBUG nova.compute.resource_tracker [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Final resource view: name=np0005486733.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Oct 14 06:23:40 localhost nova_compute[297686]: 2025-10-14 10:23:40.224 2 DEBUG oslo_concurrency.processutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Oct 14 06:23:40 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' Oct 14 06:23:40 localhost nova_compute[297686]: 2025-10-14 10:23:40.621 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Oct 14 06:23:40 localhost nova_compute[297686]: 2025-10-14 10:23:40.623 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Oct 14 06:23:40 localhost nova_compute[297686]: 2025-10-14 10:23:40.624 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Oct 14 06:23:40 
localhost nova_compute[297686]: 2025-10-14 10:23:40.624 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Oct 14 06:23:40 localhost nova_compute[297686]: 2025-10-14 10:23:40.653 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:23:40 localhost nova_compute[297686]: 2025-10-14 10:23:40.653 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Oct 14 06:23:40 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Oct 14 06:23:40 localhost ceph-mon[317114]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/3343387646' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Oct 14 06:23:40 localhost nova_compute[297686]: 2025-10-14 10:23:40.716 2 DEBUG oslo_concurrency.processutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.492s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Oct 14 06:23:40 localhost nova_compute[297686]: 2025-10-14 10:23:40.722 2 DEBUG nova.compute.provider_tree [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Inventory has not changed in ProviderTree for provider: 18c24273-aca2-4f08-be57-3188d558235e update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Oct 14 06:23:40 localhost nova_compute[297686]: 2025-10-14 10:23:40.739 2 DEBUG nova.scheduler.client.report [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Inventory has not changed for provider 18c24273-aca2-4f08-be57-3188d558235e based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 
'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Oct 14 06:23:40 localhost nova_compute[297686]: 2025-10-14 10:23:40.742 2 DEBUG nova.compute.resource_tracker [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Compute_service record updated for np0005486733.localdomain:np0005486733.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Oct 14 06:23:40 localhost nova_compute[297686]: 2025-10-14 10:23:40.743 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.631s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Oct 14 06:23:41 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get", "entity": "client.bob", "format": "json"} : dispatch Oct 14 06:23:41 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth caps", "entity": "client.bob", "caps": ["mon", "allow r", "mds", "allow rw path=/volumes/_nogroup/6e43484d-3647-4085-8cab-db9b4f4530f7/09344f24-aa16-4ae2-bf7b-5dc05ab244e1,allow rw path=/volumes/_nogroup/958b7b65-e598-4225-a741-d122ebd39d66/c7800ee6-8cf6-4dc0-8d0e-2ed45939ec7d", "osd", "allow rw pool=manila_data namespace=fsvolumens_6e43484d-3647-4085-8cab-db9b4f4530f7,allow rw pool=manila_data namespace=fsvolumens_958b7b65-e598-4225-a741-d122ebd39d66"]} : dispatch Oct 14 06:23:41 localhost ceph-mon[317114]: 
from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd='[{"prefix": "auth caps", "entity": "client.bob", "caps": ["mon", "allow r", "mds", "allow rw path=/volumes/_nogroup/6e43484d-3647-4085-8cab-db9b4f4530f7/09344f24-aa16-4ae2-bf7b-5dc05ab244e1,allow rw path=/volumes/_nogroup/958b7b65-e598-4225-a741-d122ebd39d66/c7800ee6-8cf6-4dc0-8d0e-2ed45939ec7d", "osd", "allow rw pool=manila_data namespace=fsvolumens_6e43484d-3647-4085-8cab-db9b4f4530f7,allow rw pool=manila_data namespace=fsvolumens_958b7b65-e598-4225-a741-d122ebd39d66"]}]': finished Oct 14 06:23:41 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get", "entity": "client.bob", "format": "json"} : dispatch Oct 14 06:23:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad. Oct 14 06:23:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec. Oct 14 06:23:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29. Oct 14 06:23:41 localhost systemd[1]: tmp-crun.DTjxlN.mount: Deactivated successfully. 
Oct 14 06:23:41 localhost podman[339671]: 2025-10-14 10:23:41.750809963 +0000 UTC m=+0.087297535 container health_status aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter) Oct 14 06:23:41 localhost podman[339671]: 2025-10-14 10:23:41.755488217 +0000 UTC m=+0.091975789 container exec_died aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 
'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Oct 14 06:23:41 localhost systemd[1]: aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec.service: Deactivated successfully. 
Oct 14 06:23:41 localhost podman[339670]: 2025-10-14 10:23:41.813383978 +0000 UTC m=+0.152385467 container health_status 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.build-date=20251009, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible) Oct 14 06:23:41 localhost podman[339672]: 2025-10-14 10:23:41.859375382 +0000 UTC m=+0.191924563 container health_status 
fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251009, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=iscsid, container_name=iscsid, tcib_managed=true, managed_by=edpm_ansible) Oct 14 06:23:41 localhost podman[339670]: 2025-10-14 10:23:41.8778341 +0000 UTC m=+0.216835579 container exec_died 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=multipathd, 
maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}) Oct 14 06:23:41 localhost systemd[1]: 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad.service: Deactivated successfully. 
Oct 14 06:23:41 localhost podman[339672]: 2025-10-14 10:23:41.89798759 +0000 UTC m=+0.230536741 container exec_died fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, config_id=iscsid, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, org.label-schema.build-date=20251009, managed_by=edpm_ansible, container_name=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, tcib_managed=true) Oct 14 06:23:41 localhost systemd[1]: fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29.service: Deactivated successfully. 
Oct 14 06:23:42 localhost nova_compute[297686]: 2025-10-14 10:23:42.743 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:23:42 localhost nova_compute[297686]: 2025-10-14 10:23:42.743 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:23:42 localhost nova_compute[297686]: 2025-10-14 10:23:42.743 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:23:42 localhost nova_compute[297686]: 2025-10-14 10:23:42.744 2 DEBUG nova.compute.manager [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... 
_reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Oct 14 06:23:44 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e250 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Oct 14 06:23:44 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get", "entity": "client.bob", "format": "json"} : dispatch Oct 14 06:23:44 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth caps", "entity": "client.bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/6e43484d-3647-4085-8cab-db9b4f4530f7/09344f24-aa16-4ae2-bf7b-5dc05ab244e1", "osd", "allow rw pool=manila_data namespace=fsvolumens_6e43484d-3647-4085-8cab-db9b4f4530f7"]} : dispatch Oct 14 06:23:44 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd='[{"prefix": "auth caps", "entity": "client.bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/6e43484d-3647-4085-8cab-db9b4f4530f7/09344f24-aa16-4ae2-bf7b-5dc05ab244e1", "osd", "allow rw pool=manila_data namespace=fsvolumens_6e43484d-3647-4085-8cab-db9b4f4530f7"]}]': finished Oct 14 06:23:45 localhost nova_compute[297686]: 2025-10-14 10:23:45.655 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Oct 14 06:23:45 localhost nova_compute[297686]: 2025-10-14 10:23:45.656 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:23:45 localhost nova_compute[297686]: 2025-10-14 10:23:45.656 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Oct 14 06:23:45 localhost nova_compute[297686]: 2025-10-14 
10:23:45.656 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Oct 14 06:23:45 localhost nova_compute[297686]: 2025-10-14 10:23:45.657 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Oct 14 06:23:45 localhost nova_compute[297686]: 2025-10-14 10:23:45.661 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:23:48 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get", "entity": "client.bob", "format": "json"} : dispatch Oct 14 06:23:48 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth rm", "entity": "client.bob"} : dispatch Oct 14 06:23:48 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd='[{"prefix": "auth rm", "entity": "client.bob"}]': finished Oct 14 06:23:49 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e250 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Oct 14 06:23:50 localhost nova_compute[297686]: 2025-10-14 10:23:50.662 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Oct 14 06:23:50 localhost nova_compute[297686]: 2025-10-14 10:23:50.664 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Oct 14 06:23:50 localhost nova_compute[297686]: 2025-10-14 10:23:50.664 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Oct 14 
06:23:50 localhost nova_compute[297686]: 2025-10-14 10:23:50.664 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Oct 14 06:23:50 localhost nova_compute[297686]: 2025-10-14 10:23:50.696 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:23:50 localhost nova_compute[297686]: 2025-10-14 10:23:50.697 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Oct 14 06:23:54 localhost ovn_metadata_agent[163050]: 2025-10-14 10:23:54.137 163055 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=22, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'b6:6b:50', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '6a:59:81:01:bc:8b'}, ipsec=False) old=SB_Global(nb_cfg=21) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Oct 14 06:23:54 localhost nova_compute[297686]: 2025-10-14 10:23:54.138 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:23:54 localhost ovn_metadata_agent[163050]: 2025-10-14 10:23:54.139 163055 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m Oct 14 06:23:54 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e250 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 
318767104 Oct 14 06:23:55 localhost nova_compute[297686]: 2025-10-14 10:23:55.737 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:23:55 localhost nova_compute[297686]: 2025-10-14 10:23:55.740 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:23:56 localhost ceph-osd[32440]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #45. Immutable memtables: 2. Oct 14 06:23:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f. Oct 14 06:23:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917. Oct 14 06:23:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d. Oct 14 06:23:57 localhost systemd[1]: tmp-crun.SJaIMl.mount: Deactivated successfully. Oct 14 06:23:57 localhost podman[339733]: 2025-10-14 10:23:57.754742069 +0000 UTC m=+0.090598767 container health_status 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, managed_by=edpm_ansible, vendor=Red Hat, Inc., architecture=x86_64, config_id=edpm, container_name=openstack_network_exporter, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, version=9.6, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, vcs-type=git) Oct 14 06:23:57 localhost ovn_metadata_agent[163050]: 2025-10-14 10:23:57.791 163055 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Oct 14 06:23:57 localhost ovn_metadata_agent[163050]: 2025-10-14 10:23:57.791 163055 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Oct 14 06:23:57 localhost ovn_metadata_agent[163050]: 2025-10-14 10:23:57.791 163055 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Oct 14 06:23:57 localhost podman[339734]: 2025-10-14 10:23:57.800069243 +0000 UTC m=+0.132431834 container health_status 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 
'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_managed=true, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=edpm, managed_by=edpm_ansible) Oct 14 06:23:57 localhost podman[339734]: 2025-10-14 10:23:57.813256558 +0000 UTC m=+0.145619089 container exec_died 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', 
'/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team) Oct 14 06:23:57 localhost podman[339733]: 2025-10-14 10:23:57.824094742 +0000 UTC m=+0.159951450 container exec_died 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, architecture=x86_64, distribution-scope=public, build-date=2025-08-20T13:12:41, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, name=ubi9-minimal, container_name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, version=9.6, config_id=edpm, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., io.buildah.version=1.33.7) Oct 14 06:23:57 localhost systemd[1]: 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d.service: Deactivated successfully. Oct 14 06:23:57 localhost systemd[1]: 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917.service: Deactivated successfully. 
Oct 14 06:23:57 localhost podman[339732]: 2025-10-14 10:23:57.903436651 +0000 UTC m=+0.243318373 container health_status 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, tcib_managed=true, container_name=ovn_controller, config_id=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image) Oct 14 06:23:57 localhost podman[339732]: 2025-10-14 10:23:57.993209302 +0000 UTC m=+0.333091064 container exec_died 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.build-date=20251009, tcib_managed=true, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 
'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team) Oct 14 06:23:58 localhost systemd[1]: 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f.service: Deactivated successfully. 
Oct 14 06:23:58 localhost podman[248187]: time="2025-10-14T10:23:58Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Oct 14 06:23:58 localhost podman[248187]: @ - - [14/Oct/2025:10:23:58 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 147497 "" "Go-http-client/1.1" Oct 14 06:23:58 localhost podman[248187]: @ - - [14/Oct/2025:10:23:58 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19878 "" "Go-http-client/1.1" Oct 14 06:23:59 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e250 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Oct 14 06:24:00 localhost ovn_metadata_agent[163050]: 2025-10-14 10:24:00.141 163055 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=9e4b0f79-1220-4c7d-a18d-fa1a88dab362, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '22'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Oct 14 06:24:00 localhost nova_compute[297686]: 2025-10-14 10:24:00.779 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Oct 14 06:24:00 localhost nova_compute[297686]: 2025-10-14 10:24:00.781 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Oct 14 06:24:00 localhost nova_compute[297686]: 2025-10-14 10:24:00.782 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5041 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Oct 14 06:24:00 localhost nova_compute[297686]: 2025-10-14 10:24:00.782 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE 
_transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Oct 14 06:24:00 localhost nova_compute[297686]: 2025-10-14 10:24:00.782 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Oct 14 06:24:00 localhost nova_compute[297686]: 2025-10-14 10:24:00.785 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:24:04 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e250 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Oct 14 06:24:05 localhost nova_compute[297686]: 2025-10-14 10:24:05.786 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Oct 14 06:24:05 localhost nova_compute[297686]: 2025-10-14 10:24:05.788 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Oct 14 06:24:05 localhost nova_compute[297686]: 2025-10-14 10:24:05.788 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Oct 14 06:24:05 localhost nova_compute[297686]: 2025-10-14 10:24:05.788 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Oct 14 06:24:05 localhost nova_compute[297686]: 2025-10-14 10:24:05.817 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:24:05 localhost nova_compute[297686]: 2025-10-14 10:24:05.818 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Oct 
14 06:24:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041. Oct 14 06:24:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857. Oct 14 06:24:07 localhost podman[339795]: 2025-10-14 10:24:07.739328731 +0000 UTC m=+0.083095456 container health_status 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter) Oct 14 06:24:07 localhost podman[339795]: 2025-10-14 10:24:07.747202773 +0000 UTC m=+0.090969418 container exec_died 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 
'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Oct 14 06:24:07 localhost systemd[1]: 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041.service: Deactivated successfully. Oct 14 06:24:07 localhost podman[339796]: 2025-10-14 10:24:07.792823115 +0000 UTC m=+0.133227797 container health_status 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_id=ovn_metadata_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible) Oct 14 06:24:07 localhost podman[339796]: 2025-10-14 10:24:07.82093914 +0000 UTC m=+0.161343862 container exec_died 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251009, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3) Oct 14 06:24:07 localhost systemd[1]: 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857.service: Deactivated successfully. Oct 14 06:24:08 localhost openstack_network_exporter[250374]: ERROR 10:24:08 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Oct 14 06:24:08 localhost openstack_network_exporter[250374]: ERROR 10:24:08 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Oct 14 06:24:08 localhost openstack_network_exporter[250374]: ERROR 10:24:08 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Oct 14 06:24:08 localhost openstack_network_exporter[250374]: ERROR 10:24:08 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Oct 14 06:24:08 localhost openstack_network_exporter[250374]: Oct 14 06:24:08 localhost openstack_network_exporter[250374]: ERROR 10:24:08 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Oct 14 06:24:08 localhost openstack_network_exporter[250374]: Oct 14 06:24:09 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e250 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Oct 14 06:24:10 localhost nova_compute[297686]: 2025-10-14 10:24:10.818 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:24:10 localhost nova_compute[297686]: 
2025-10-14 10:24:10.819 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:24:12 localhost sshd[339835]: main: sshd: ssh-rsa algorithm is disabled Oct 14 06:24:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad. Oct 14 06:24:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec. Oct 14 06:24:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29. Oct 14 06:24:12 localhost sshd[339870]: main: sshd: ssh-rsa algorithm is disabled Oct 14 06:24:12 localhost podman[339836]: 2025-10-14 10:24:12.763545774 +0000 UTC m=+0.090057100 container health_status 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_managed=true, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS) Oct 14 06:24:12 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e251 e251: 6 total, 6 up, 6 in Oct 14 06:24:12 localhost podman[339836]: 2025-10-14 10:24:12.775961705 +0000 UTC m=+0.102473041 container exec_died 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251009, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}) Oct 14 06:24:12 localhost systemd[1]: 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad.service: Deactivated successfully. Oct 14 06:24:12 localhost podman[339838]: 2025-10-14 10:24:12.872489754 +0000 UTC m=+0.192707687 container health_status fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, tcib_managed=true, config_id=iscsid, managed_by=edpm_ansible) Oct 14 06:24:12 localhost podman[339838]: 2025-10-14 10:24:12.908107149 +0000 UTC m=+0.228325092 container exec_died fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, org.label-schema.schema-version=1.0, container_name=iscsid, org.label-schema.license=GPLv2, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, 
tcib_managed=true, org.label-schema.vendor=CentOS, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d) Oct 14 06:24:12 localhost systemd[1]: tmp-crun.cneZlo.mount: Deactivated successfully. Oct 14 06:24:12 localhost podman[339837]: 2025-10-14 10:24:12.916444745 +0000 UTC m=+0.239865077 container health_status aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm) Oct 14 06:24:12 localhost systemd[1]: fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29.service: Deactivated successfully. 
Oct 14 06:24:12 localhost podman[339837]: 2025-10-14 10:24:12.926072462 +0000 UTC m=+0.249492784 container exec_died aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Oct 14 06:24:12 localhost systemd[1]: aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec.service: Deactivated successfully. Oct 14 06:24:14 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e251 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Oct 14 06:24:14 localhost ceph-mon[317114]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #52. Immutable memtables: 0. 
Oct 14 06:24:14 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:24:14.823434) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0 Oct 14 06:24:14 localhost ceph-mon[317114]: rocksdb: [db/flush_job.cc:856] [default] [JOB 29] Flushing memtable with next log file: 52 Oct 14 06:24:14 localhost ceph-mon[317114]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760437454823529, "job": 29, "event": "flush_started", "num_memtables": 1, "num_entries": 1277, "num_deletes": 251, "total_data_size": 1296952, "memory_usage": 1327960, "flush_reason": "Manual Compaction"} Oct 14 06:24:14 localhost ceph-mon[317114]: rocksdb: [db/flush_job.cc:885] [default] [JOB 29] Level-0 flush table #53: started Oct 14 06:24:14 localhost ceph-mon[317114]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760437454830497, "cf_name": "default", "job": 29, "event": "table_file_creation", "file_number": 53, "file_size": 845507, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 32393, "largest_seqno": 33665, "table_properties": {"data_size": 840126, "index_size": 2660, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1669, "raw_key_size": 13930, "raw_average_key_size": 21, "raw_value_size": 828523, "raw_average_value_size": 1270, "num_data_blocks": 116, "num_entries": 652, "num_filter_entries": 652, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; 
max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1760437395, "oldest_key_time": 1760437395, "file_creation_time": 1760437454, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "c006d4a7-7ae8-4cd3-a1b5-95f5bf3c427c", "db_session_id": "ZS6W3A0Q266OBGAU6ARP", "orig_file_number": 53, "seqno_to_time_mapping": "N/A"}} Oct 14 06:24:14 localhost ceph-mon[317114]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 29] Flush lasted 7119 microseconds, and 3687 cpu microseconds. Oct 14 06:24:14 localhost ceph-mon[317114]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. Oct 14 06:24:14 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:24:14.830567) [db/flush_job.cc:967] [default] [JOB 29] Level-0 flush table #53: 845507 bytes OK Oct 14 06:24:14 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:24:14.830589) [db/memtable_list.cc:519] [default] Level-0 commit table #53 started Oct 14 06:24:14 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:24:14.832412) [db/memtable_list.cc:722] [default] Level-0 commit table #53: memtable #1 done Oct 14 06:24:14 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:24:14.832431) EVENT_LOG_v1 {"time_micros": 1760437454832426, "job": 29, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0} Oct 14 06:24:14 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:24:14.832455) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25 Oct 14 06:24:14 localhost ceph-mon[317114]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 29] Try to delete WAL files size 1290486, prev total WAL file 
size 1290486, number of live WAL files 2. Oct 14 06:24:14 localhost ceph-mon[317114]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005486733/store.db/000049.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Oct 14 06:24:14 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:24:14.833100) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003133333033' seq:72057594037927935, type:22 .. '7061786F73003133353535' seq:0, type:0; will stop at (end) Oct 14 06:24:14 localhost ceph-mon[317114]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 30] Compacting 1@0 + 1@6 files to L6, score -1.00 Oct 14 06:24:14 localhost ceph-mon[317114]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 29 Base level 0, inputs: [53(825KB)], [51(16MB)] Oct 14 06:24:14 localhost ceph-mon[317114]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760437454833156, "job": 30, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [53], "files_L6": [51], "score": -1, "input_data_size": 18113053, "oldest_snapshot_seqno": -1} Oct 14 06:24:14 localhost ceph-mon[317114]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 30] Generated table #54: 14296 keys, 16892446 bytes, temperature: kUnknown Oct 14 06:24:14 localhost ceph-mon[317114]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760437454942742, "cf_name": "default", "job": 30, "event": "table_file_creation", "file_number": 54, "file_size": 16892446, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 16811980, "index_size": 43629, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 35781, "raw_key_size": 386010, "raw_average_key_size": 27, "raw_value_size": 
16570095, "raw_average_value_size": 1159, "num_data_blocks": 1604, "num_entries": 14296, "num_filter_entries": 14296, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1760436394, "oldest_key_time": 0, "file_creation_time": 1760437454, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "c006d4a7-7ae8-4cd3-a1b5-95f5bf3c427c", "db_session_id": "ZS6W3A0Q266OBGAU6ARP", "orig_file_number": 54, "seqno_to_time_mapping": "N/A"}} Oct 14 06:24:14 localhost ceph-mon[317114]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. 
Oct 14 06:24:14 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:24:14.943199) [db/compaction/compaction_job.cc:1663] [default] [JOB 30] Compacted 1@0 + 1@6 files to L6 => 16892446 bytes Oct 14 06:24:14 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:24:14.944864) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 165.1 rd, 154.0 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.8, 16.5 +0.0 blob) out(16.1 +0.0 blob), read-write-amplify(41.4) write-amplify(20.0) OK, records in: 14823, records dropped: 527 output_compression: NoCompression Oct 14 06:24:14 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:24:14.944899) EVENT_LOG_v1 {"time_micros": 1760437454944881, "job": 30, "event": "compaction_finished", "compaction_time_micros": 109720, "compaction_time_cpu_micros": 48189, "output_level": 6, "num_output_files": 1, "total_output_size": 16892446, "num_input_records": 14823, "num_output_records": 14296, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]} Oct 14 06:24:14 localhost ceph-mon[317114]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005486733/store.db/000053.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Oct 14 06:24:14 localhost ceph-mon[317114]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760437454945166, "job": 30, "event": "table_file_deletion", "file_number": 53} Oct 14 06:24:14 localhost ceph-mon[317114]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005486733/store.db/000051.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Oct 14 06:24:14 localhost ceph-mon[317114]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760437454947892, 
"job": 30, "event": "table_file_deletion", "file_number": 51} Oct 14 06:24:14 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:24:14.832997) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Oct 14 06:24:14 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:24:14.948009) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Oct 14 06:24:14 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:24:14.948016) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Oct 14 06:24:14 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:24:14.948018) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Oct 14 06:24:14 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:24:14.948020) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Oct 14 06:24:14 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:24:14.948022) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Oct 14 06:24:15 localhost nova_compute[297686]: 2025-10-14 10:24:15.820 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:24:15 localhost nova_compute[297686]: 2025-10-14 10:24:15.822 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:24:17 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e252 e252: 6 total, 6 up, 6 in Oct 14 06:24:19 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e252 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Oct 14 06:24:20 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e253 e253: 6 total, 6 up, 6 in Oct 14 06:24:20 localhost 
nova_compute[297686]: 2025-10-14 10:24:20.824 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Oct 14 06:24:20 localhost nova_compute[297686]: 2025-10-14 10:24:20.826 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:24:20 localhost nova_compute[297686]: 2025-10-14 10:24:20.826 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Oct 14 06:24:20 localhost nova_compute[297686]: 2025-10-14 10:24:20.826 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Oct 14 06:24:20 localhost nova_compute[297686]: 2025-10-14 10:24:20.827 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Oct 14 06:24:20 localhost nova_compute[297686]: 2025-10-14 10:24:20.830 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:24:22 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e254 e254: 6 total, 6 up, 6 in Oct 14 06:24:24 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e254 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Oct 14 06:24:25 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e255 e255: 6 total, 6 up, 6 in Oct 14 06:24:25 localhost ceph-osd[31500]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #45. Immutable memtables: 2. 
Oct 14 06:24:25 localhost nova_compute[297686]: 2025-10-14 10:24:25.830 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:24:28 localhost podman[248187]: time="2025-10-14T10:24:28Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Oct 14 06:24:28 localhost podman[248187]: @ - - [14/Oct/2025:10:24:28 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 147497 "" "Go-http-client/1.1" Oct 14 06:24:28 localhost podman[248187]: @ - - [14/Oct/2025:10:24:28 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19875 "" "Go-http-client/1.1" Oct 14 06:24:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f. Oct 14 06:24:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917. Oct 14 06:24:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d. 
Oct 14 06:24:28 localhost podman[339898]: 2025-10-14 10:24:28.752046334 +0000 UTC m=+0.087294295 container health_status 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.vendor=CentOS) Oct 14 06:24:28 localhost podman[339899]: 2025-10-14 10:24:28.808310824 +0000 UTC m=+0.137185710 container health_status 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, vcs-type=git, version=9.6, summary=Provides the latest 
release of the minimal Red Hat Universal Base Image 9., config_id=edpm, container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, managed_by=edpm_ansible, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b) Oct 14 06:24:28 localhost podman[339899]: 2025-10-14 10:24:28.821019955 +0000 UTC m=+0.149894861 container exec_died 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, architecture=x86_64, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, config_id=edpm, vcs-type=git, distribution-scope=public, build-date=2025-08-20T13:12:41, name=ubi9-minimal, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc.) Oct 14 06:24:28 localhost systemd[1]: 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917.service: Deactivated successfully. 
Oct 14 06:24:28 localhost podman[339900]: 2025-10-14 10:24:28.86864608 +0000 UTC m=+0.195037619 container health_status 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.license=GPLv2, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_id=edpm, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ceilometer_agent_compute) Oct 14 06:24:28 localhost podman[339900]: 2025-10-14 10:24:28.883100644 +0000 UTC m=+0.209492183 container exec_died 
89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}) Oct 14 06:24:28 localhost podman[339898]: 2025-10-14 10:24:28.892771751 +0000 UTC m=+0.228019712 container exec_died 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, 
name=ovn_controller, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team) Oct 14 06:24:28 localhost systemd[1]: 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d.service: Deactivated successfully. Oct 14 06:24:28 localhost systemd[1]: 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f.service: Deactivated successfully. 
Oct 14 06:24:29 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e255 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Oct 14 06:24:30 localhost nova_compute[297686]: 2025-10-14 10:24:30.832 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:24:32 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e256 e256: 6 total, 6 up, 6 in Oct 14 06:24:34 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e256 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Oct 14 06:24:35 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e257 e257: 6 total, 6 up, 6 in Oct 14 06:24:35 localhost nova_compute[297686]: 2025-10-14 10:24:35.257 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:24:35 localhost nova_compute[297686]: 2025-10-14 10:24:35.835 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:24:36 localhost nova_compute[297686]: 2025-10-14 10:24:36.256 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:24:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041. Oct 14 06:24:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857. 
Oct 14 06:24:38 localhost systemd[1]: tmp-crun.krFHqO.mount: Deactivated successfully. Oct 14 06:24:38 localhost podman[339979]: 2025-10-14 10:24:38.058227023 +0000 UTC m=+0.090158644 container health_status 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251009, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}) Oct 14 06:24:38 localhost podman[339979]: 2025-10-14 10:24:38.067106615 +0000 UTC m=+0.099038306 container exec_died 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251009, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5) Oct 14 06:24:38 localhost systemd[1]: 
28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857.service: Deactivated successfully. Oct 14 06:24:38 localhost podman[339978]: 2025-10-14 10:24:38.145255509 +0000 UTC m=+0.180862913 container health_status 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter) Oct 14 06:24:38 localhost podman[339978]: 2025-10-14 10:24:38.152273165 +0000 UTC m=+0.187880619 container exec_died 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': 
'/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Oct 14 06:24:38 localhost systemd[1]: 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041.service: Deactivated successfully. Oct 14 06:24:38 localhost nova_compute[297686]: 2025-10-14 10:24:38.255 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:24:38 localhost nova_compute[297686]: 2025-10-14 10:24:38.255 2 DEBUG nova.compute.manager [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Oct 14 06:24:38 localhost nova_compute[297686]: 2025-10-14 10:24:38.255 2 DEBUG nova.compute.manager [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Oct 14 06:24:38 localhost nova_compute[297686]: 2025-10-14 10:24:38.328 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Acquiring lock "refresh_cache-88c4e366-b765-47a6-96bf-f7677f2ce67c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Oct 14 06:24:38 localhost nova_compute[297686]: 2025-10-14 10:24:38.328 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Acquired lock "refresh_cache-88c4e366-b765-47a6-96bf-f7677f2ce67c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Oct 14 06:24:38 localhost nova_compute[297686]: 2025-10-14 10:24:38.328 2 DEBUG 
nova.network.neutron [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] [instance: 88c4e366-b765-47a6-96bf-f7677f2ce67c] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Oct 14 06:24:38 localhost nova_compute[297686]: 2025-10-14 10:24:38.329 2 DEBUG nova.objects.instance [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Lazy-loading 'info_cache' on Instance uuid 88c4e366-b765-47a6-96bf-f7677f2ce67c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Oct 14 06:24:38 localhost openstack_network_exporter[250374]: ERROR 10:24:38 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Oct 14 06:24:38 localhost openstack_network_exporter[250374]: ERROR 10:24:38 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Oct 14 06:24:38 localhost openstack_network_exporter[250374]: ERROR 10:24:38 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Oct 14 06:24:38 localhost openstack_network_exporter[250374]: ERROR 10:24:38 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Oct 14 06:24:38 localhost openstack_network_exporter[250374]: Oct 14 06:24:38 localhost openstack_network_exporter[250374]: ERROR 10:24:38 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Oct 14 06:24:38 localhost openstack_network_exporter[250374]: Oct 14 06:24:39 localhost nova_compute[297686]: 2025-10-14 10:24:39.268 2 DEBUG nova.network.neutron [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] [instance: 88c4e366-b765-47a6-96bf-f7677f2ce67c] Updating instance_info_cache with network_info: [{"id": "3ec9b060-f43d-4698-9c76-6062c70911d5", "address": "fa:16:3e:84:5e:e5", "network": {"id": "7d0cd696-bdd7-4e70-9512-eb0d23640314", "bridge": "br-int", "label": 
"private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.46", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "41187b090f3d4818a32baa37ce8a3991", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3ec9b060-f4", "ovs_interfaceid": "3ec9b060-f43d-4698-9c76-6062c70911d5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Oct 14 06:24:39 localhost nova_compute[297686]: 2025-10-14 10:24:39.284 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Releasing lock "refresh_cache-88c4e366-b765-47a6-96bf-f7677f2ce67c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Oct 14 06:24:39 localhost nova_compute[297686]: 2025-10-14 10:24:39.284 2 DEBUG nova.compute.manager [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] [instance: 88c4e366-b765-47a6-96bf-f7677f2ce67c] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Oct 14 06:24:39 localhost nova_compute[297686]: 2025-10-14 10:24:39.285 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks 
/usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:24:39 localhost nova_compute[297686]: 2025-10-14 10:24:39.302 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Oct 14 06:24:39 localhost nova_compute[297686]: 2025-10-14 10:24:39.303 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Oct 14 06:24:39 localhost nova_compute[297686]: 2025-10-14 10:24:39.303 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Oct 14 06:24:39 localhost nova_compute[297686]: 2025-10-14 10:24:39.303 2 DEBUG nova.compute.resource_tracker [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Auditing locally available compute resources for np0005486733.localdomain (node: np0005486733.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Oct 14 06:24:39 localhost nova_compute[297686]: 2025-10-14 10:24:39.304 2 DEBUG oslo_concurrency.processutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Oct 14 06:24:39 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e257 
_set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Oct 14 06:24:39 localhost podman[340169]: Oct 14 06:24:39 localhost podman[340169]: 2025-10-14 10:24:39.635427264 +0000 UTC m=+0.073081648 container create 7bb91d642cdb9aca6623b0ac1ed1b48f955a1e01866223fdac2b36d71613d328 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=dreamy_jones, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-09-24T08:57:55, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, name=rhceph, vcs-type=git, version=7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, distribution-scope=public, GIT_CLEAN=True, maintainer=Guillaume Abrioux , io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat Ceph Storage 7, release=553) Oct 14 06:24:39 localhost systemd[1]: Started libpod-conmon-7bb91d642cdb9aca6623b0ac1ed1b48f955a1e01866223fdac2b36d71613d328.scope. Oct 14 06:24:39 localhost systemd[1]: Started libcrun container. 
Oct 14 06:24:39 localhost podman[340169]: 2025-10-14 10:24:39.600666035 +0000 UTC m=+0.038320469 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Oct 14 06:24:39 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Oct 14 06:24:39 localhost ceph-mon[317114]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/524502989' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Oct 14 06:24:39 localhost podman[340169]: 2025-10-14 10:24:39.711157413 +0000 UTC m=+0.148811817 container init 7bb91d642cdb9aca6623b0ac1ed1b48f955a1e01866223fdac2b36d71613d328 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=dreamy_jones, ceph=True, distribution-scope=public, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, release=553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, io.openshift.expose-services=, vcs-type=git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_BRANCH=main, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux , build-date=2025-09-24T08:57:55, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, GIT_CLEAN=True, RELEASE=main) Oct 14 06:24:39 localhost nova_compute[297686]: 2025-10-14 10:24:39.718 2 DEBUG oslo_concurrency.processutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" 
returned: 0 in 0.414s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Oct 14 06:24:39 localhost podman[340169]: 2025-10-14 10:24:39.726813044 +0000 UTC m=+0.164467438 container start 7bb91d642cdb9aca6623b0ac1ed1b48f955a1e01866223fdac2b36d71613d328 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=dreamy_jones, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux , CEPH_POINT_RELEASE=, vcs-type=git, GIT_BRANCH=main, release=553, name=rhceph, ceph=True, GIT_CLEAN=True, build-date=2025-09-24T08:57:55, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, version=7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, description=Red Hat Ceph Storage 7, distribution-scope=public, architecture=x86_64, com.redhat.component=rhceph-container, io.openshift.expose-services=, RELEASE=main) Oct 14 06:24:39 localhost podman[340169]: 2025-10-14 10:24:39.727075362 +0000 UTC m=+0.164729806 container attach 7bb91d642cdb9aca6623b0ac1ed1b48f955a1e01866223fdac2b36d71613d328 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=dreamy_jones, com.redhat.component=rhceph-container, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, maintainer=Guillaume Abrioux , io.buildah.version=1.33.12, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, 
distribution-scope=public, io.openshift.tags=rhceph ceph, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, architecture=x86_64, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/agreements, release=553, name=rhceph, vendor=Red Hat, Inc., GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-type=git, description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3) Oct 14 06:24:39 localhost dreamy_jones[340184]: 167 167 Oct 14 06:24:39 localhost systemd[1]: libpod-7bb91d642cdb9aca6623b0ac1ed1b48f955a1e01866223fdac2b36d71613d328.scope: Deactivated successfully. Oct 14 06:24:39 localhost podman[340169]: 2025-10-14 10:24:39.733267773 +0000 UTC m=+0.170922157 container died 7bb91d642cdb9aca6623b0ac1ed1b48f955a1e01866223fdac2b36d71613d328 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=dreamy_jones, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, io.buildah.version=1.33.12, description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, version=7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, distribution-scope=public, vendor=Red Hat, Inc., release=553, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, com.redhat.component=rhceph-container, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, CEPH_POINT_RELEASE=, ceph=True, 
com.redhat.license_terms=https://www.redhat.com/agreements, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux ) Oct 14 06:24:39 localhost nova_compute[297686]: 2025-10-14 10:24:39.789 2 DEBUG nova.virt.libvirt.driver [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Oct 14 06:24:39 localhost nova_compute[297686]: 2025-10-14 10:24:39.790 2 DEBUG nova.virt.libvirt.driver [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Oct 14 06:24:39 localhost podman[340191]: 2025-10-14 10:24:39.829375088 +0000 UTC m=+0.086211143 container remove 7bb91d642cdb9aca6623b0ac1ed1b48f955a1e01866223fdac2b36d71613d328 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=dreamy_jones, version=7, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, distribution-scope=public, architecture=x86_64, maintainer=Guillaume Abrioux , summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, release=553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, build-date=2025-09-24T08:57:55, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., GIT_BRANCH=main, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, ceph=True, io.buildah.version=1.33.12, vcs-type=git, 
GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, description=Red Hat Ceph Storage 7) Oct 14 06:24:39 localhost systemd[1]: libpod-conmon-7bb91d642cdb9aca6623b0ac1ed1b48f955a1e01866223fdac2b36d71613d328.scope: Deactivated successfully. Oct 14 06:24:39 localhost nova_compute[297686]: 2025-10-14 10:24:39.986 2 WARNING nova.virt.libvirt.driver [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m Oct 14 06:24:39 localhost nova_compute[297686]: 2025-10-14 10:24:39.988 2 DEBUG nova.compute.resource_tracker [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Hypervisor/Node resource view: name=np0005486733.localdomain free_ram=11061MB free_disk=41.83695602416992GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": 
null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Oct 14 06:24:39 localhost nova_compute[297686]: 2025-10-14 10:24:39.988 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Oct 14 06:24:39 localhost nova_compute[297686]: 2025-10-14 10:24:39.988 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Oct 14 06:24:40 localhost podman[340211]: Oct 14 06:24:40 localhost podman[340211]: 2025-10-14 10:24:40.039456568 +0000 UTC m=+0.055867369 container create e94fc3db1b827761f5e6bd69db2cc99b129149e0a48e0b694876f73c519806b8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vigorous_dijkstra, vendor=Red Hat, Inc., version=7, name=rhceph, maintainer=Guillaume Abrioux , 
io.openshift.tags=rhceph ceph, build-date=2025-09-24T08:57:55, RELEASE=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=553, CEPH_POINT_RELEASE=, io.openshift.expose-services=, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, com.redhat.component=rhceph-container, ceph=True, GIT_CLEAN=True, vcs-type=git, GIT_BRANCH=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7) Oct 14 06:24:40 localhost nova_compute[297686]: 2025-10-14 10:24:40.053 2 DEBUG nova.compute.resource_tracker [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Instance 88c4e366-b765-47a6-96bf-f7677f2ce67c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Oct 14 06:24:40 localhost nova_compute[297686]: 2025-10-14 10:24:40.054 2 DEBUG nova.compute.resource_tracker [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Oct 14 06:24:40 localhost nova_compute[297686]: 2025-10-14 10:24:40.054 2 DEBUG nova.compute.resource_tracker [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Final resource view: name=np0005486733.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Oct 14 06:24:40 localhost systemd[1]: Started libpod-conmon-e94fc3db1b827761f5e6bd69db2cc99b129149e0a48e0b694876f73c519806b8.scope. Oct 14 06:24:40 localhost systemd[1]: Started libcrun container. 
Oct 14 06:24:40 localhost nova_compute[297686]: 2025-10-14 10:24:40.098 2 DEBUG oslo_concurrency.processutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Oct 14 06:24:40 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7eb8e1e132e94d0cef8a42e410e4d7c266abe1c20793cf30f5a6a93a80c5fa01/merged/rootfs supports timestamps until 2038 (0x7fffffff) Oct 14 06:24:40 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7eb8e1e132e94d0cef8a42e410e4d7c266abe1c20793cf30f5a6a93a80c5fa01/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff) Oct 14 06:24:40 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7eb8e1e132e94d0cef8a42e410e4d7c266abe1c20793cf30f5a6a93a80c5fa01/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff) Oct 14 06:24:40 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7eb8e1e132e94d0cef8a42e410e4d7c266abe1c20793cf30f5a6a93a80c5fa01/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff) Oct 14 06:24:40 localhost podman[340211]: 2025-10-14 10:24:40.110667119 +0000 UTC m=+0.127077880 container init e94fc3db1b827761f5e6bd69db2cc99b129149e0a48e0b694876f73c519806b8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vigorous_dijkstra, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, release=553, RELEASE=main, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, 
org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux , build-date=2025-09-24T08:57:55, name=rhceph, architecture=x86_64, ceph=True, description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., distribution-scope=public) Oct 14 06:24:40 localhost podman[340211]: 2025-10-14 10:24:40.014937014 +0000 UTC m=+0.031348085 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Oct 14 06:24:40 localhost podman[340211]: 2025-10-14 10:24:40.119701296 +0000 UTC m=+0.136112037 container start e94fc3db1b827761f5e6bd69db2cc99b129149e0a48e0b694876f73c519806b8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vigorous_dijkstra, maintainer=Guillaume Abrioux , build-date=2025-09-24T08:57:55, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.33.12, distribution-scope=public, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, release=553, io.openshift.expose-services=, architecture=x86_64, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., ceph=True, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.tags=rhceph ceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, GIT_CLEAN=True, GIT_BRANCH=main, 
com.redhat.license_terms=https://www.redhat.com/agreements) Oct 14 06:24:40 localhost podman[340211]: 2025-10-14 10:24:40.119934613 +0000 UTC m=+0.136345414 container attach e94fc3db1b827761f5e6bd69db2cc99b129149e0a48e0b694876f73c519806b8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vigorous_dijkstra, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, io.openshift.expose-services=, name=rhceph, RELEASE=main, CEPH_POINT_RELEASE=, version=7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.buildah.version=1.33.12, release=553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., distribution-scope=public, build-date=2025-09-24T08:57:55, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/agreements, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, vcs-type=git, architecture=x86_64, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux ) Oct 14 06:24:40 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Oct 14 06:24:40 localhost ceph-mon[317114]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.108:0/1441824848' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Oct 14 06:24:40 localhost nova_compute[297686]: 2025-10-14 10:24:40.529 2 DEBUG oslo_concurrency.processutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.431s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Oct 14 06:24:40 localhost nova_compute[297686]: 2025-10-14 10:24:40.538 2 DEBUG nova.compute.provider_tree [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Inventory has not changed in ProviderTree for provider: 18c24273-aca2-4f08-be57-3188d558235e update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Oct 14 06:24:40 localhost nova_compute[297686]: 2025-10-14 10:24:40.554 2 DEBUG nova.scheduler.client.report [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Inventory has not changed for provider 18c24273-aca2-4f08-be57-3188d558235e based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Oct 14 06:24:40 localhost nova_compute[297686]: 2025-10-14 10:24:40.557 2 DEBUG nova.compute.resource_tracker [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Compute_service record updated for np0005486733.localdomain:np0005486733.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Oct 14 06:24:40 localhost nova_compute[297686]: 2025-10-14 10:24:40.557 2 DEBUG oslo_concurrency.lockutils [None 
req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.569s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Oct 14 06:24:40 localhost systemd[1]: tmp-crun.YrM9rY.mount: Deactivated successfully. Oct 14 06:24:40 localhost systemd[1]: var-lib-containers-storage-overlay-0555dcbd7417d077740ad87bd6e37c994787381f12183dd4f95ad3ad0099a672-merged.mount: Deactivated successfully. Oct 14 06:24:40 localhost nova_compute[297686]: 2025-10-14 10:24:40.838 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:24:40 localhost sshd[341071]: main: sshd: ssh-rsa algorithm is disabled Oct 14 06:24:41 localhost vigorous_dijkstra[340226]: [ Oct 14 06:24:41 localhost vigorous_dijkstra[340226]: { Oct 14 06:24:41 localhost vigorous_dijkstra[340226]: "available": false, Oct 14 06:24:41 localhost vigorous_dijkstra[340226]: "ceph_device": false, Oct 14 06:24:41 localhost vigorous_dijkstra[340226]: "device_id": "QEMU_DVD-ROM_QM00001", Oct 14 06:24:41 localhost vigorous_dijkstra[340226]: "lsm_data": {}, Oct 14 06:24:41 localhost vigorous_dijkstra[340226]: "lvs": [], Oct 14 06:24:41 localhost vigorous_dijkstra[340226]: "path": "/dev/sr0", Oct 14 06:24:41 localhost vigorous_dijkstra[340226]: "rejected_reasons": [ Oct 14 06:24:41 localhost vigorous_dijkstra[340226]: "Insufficient space (<5GB)", Oct 14 06:24:41 localhost vigorous_dijkstra[340226]: "Has a FileSystem" Oct 14 06:24:41 localhost vigorous_dijkstra[340226]: ], Oct 14 06:24:41 localhost vigorous_dijkstra[340226]: "sys_api": { Oct 14 06:24:41 localhost vigorous_dijkstra[340226]: "actuators": null, Oct 14 06:24:41 localhost vigorous_dijkstra[340226]: "device_nodes": "sr0", Oct 14 06:24:41 localhost vigorous_dijkstra[340226]: "human_readable_size": "482.00 KB", Oct 14 06:24:41 localhost 
vigorous_dijkstra[340226]: "id_bus": "ata", Oct 14 06:24:41 localhost vigorous_dijkstra[340226]: "model": "QEMU DVD-ROM", Oct 14 06:24:41 localhost vigorous_dijkstra[340226]: "nr_requests": "2", Oct 14 06:24:41 localhost vigorous_dijkstra[340226]: "partitions": {}, Oct 14 06:24:41 localhost vigorous_dijkstra[340226]: "path": "/dev/sr0", Oct 14 06:24:41 localhost vigorous_dijkstra[340226]: "removable": "1", Oct 14 06:24:41 localhost vigorous_dijkstra[340226]: "rev": "2.5+", Oct 14 06:24:41 localhost vigorous_dijkstra[340226]: "ro": "0", Oct 14 06:24:41 localhost vigorous_dijkstra[340226]: "rotational": "1", Oct 14 06:24:41 localhost vigorous_dijkstra[340226]: "sas_address": "", Oct 14 06:24:41 localhost vigorous_dijkstra[340226]: "sas_device_handle": "", Oct 14 06:24:41 localhost vigorous_dijkstra[340226]: "scheduler_mode": "mq-deadline", Oct 14 06:24:41 localhost vigorous_dijkstra[340226]: "sectors": 0, Oct 14 06:24:41 localhost vigorous_dijkstra[340226]: "sectorsize": "2048", Oct 14 06:24:41 localhost vigorous_dijkstra[340226]: "size": 493568.0, Oct 14 06:24:41 localhost vigorous_dijkstra[340226]: "support_discard": "0", Oct 14 06:24:41 localhost vigorous_dijkstra[340226]: "type": "disk", Oct 14 06:24:41 localhost vigorous_dijkstra[340226]: "vendor": "QEMU" Oct 14 06:24:41 localhost vigorous_dijkstra[340226]: } Oct 14 06:24:41 localhost vigorous_dijkstra[340226]: } Oct 14 06:24:41 localhost vigorous_dijkstra[340226]: ] Oct 14 06:24:41 localhost systemd[1]: libpod-e94fc3db1b827761f5e6bd69db2cc99b129149e0a48e0b694876f73c519806b8.scope: Deactivated successfully. 
Oct 14 06:24:41 localhost podman[342164]: 2025-10-14 10:24:41.117008925 +0000 UTC m=+0.038866116 container died e94fc3db1b827761f5e6bd69db2cc99b129149e0a48e0b694876f73c519806b8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vigorous_dijkstra, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, com.redhat.component=rhceph-container, version=7, io.openshift.expose-services=, release=553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, RELEASE=main, name=rhceph, io.buildah.version=1.33.12, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, build-date=2025-09-24T08:57:55, maintainer=Guillaume Abrioux , io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., architecture=x86_64, ceph=True, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-type=git) Oct 14 06:24:41 localhost systemd[1]: var-lib-containers-storage-overlay-7eb8e1e132e94d0cef8a42e410e4d7c266abe1c20793cf30f5a6a93a80c5fa01-merged.mount: Deactivated successfully. 
Oct 14 06:24:41 localhost podman[342164]: 2025-10-14 10:24:41.16075062 +0000 UTC m=+0.082607761 container remove e94fc3db1b827761f5e6bd69db2cc99b129149e0a48e0b694876f73c519806b8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vigorous_dijkstra, vcs-type=git, io.openshift.expose-services=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, release=553, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, maintainer=Guillaume Abrioux , distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat Ceph Storage 7, name=rhceph, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, ceph=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.component=rhceph-container, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-09-24T08:57:55, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, architecture=x86_64, io.buildah.version=1.33.12, RELEASE=main) Oct 14 06:24:41 localhost systemd[1]: libpod-conmon-e94fc3db1b827761f5e6bd69db2cc99b129149e0a48e0b694876f73c519806b8.scope: Deactivated successfully. 
Oct 14 06:24:41 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' Oct 14 06:24:41 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' Oct 14 06:24:41 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' Oct 14 06:24:41 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' Oct 14 06:24:41 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' Oct 14 06:24:41 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' Oct 14 06:24:41 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Oct 14 06:24:41 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' Oct 14 06:24:42 localhost nova_compute[297686]: 2025-10-14 10:24:42.527 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:24:42 localhost nova_compute[297686]: 2025-10-14 10:24:42.528 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:24:42 localhost nova_compute[297686]: 2025-10-14 10:24:42.528 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 
14 06:24:42 localhost nova_compute[297686]: 2025-10-14 10:24:42.528 2 DEBUG nova.compute.manager [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Oct 14 06:24:43 localhost nova_compute[297686]: 2025-10-14 10:24:43.251 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:24:43 localhost nova_compute[297686]: 2025-10-14 10:24:43.277 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:24:43 localhost nova_compute[297686]: 2025-10-14 10:24:43.277 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:24:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad. Oct 14 06:24:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec. Oct 14 06:24:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29. 
Oct 14 06:24:43 localhost podman[342196]: 2025-10-14 10:24:43.776101666 +0000 UTC m=+0.102982198 container health_status 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible) Oct 14 06:24:43 localhost podman[342196]: 2025-10-14 10:24:43.81980449 +0000 UTC m=+0.146685032 container exec_died 
02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2) Oct 14 06:24:43 localhost podman[342198]: 2025-10-14 10:24:43.826029141 +0000 UTC m=+0.150124228 container health_status fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, 
org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=iscsid, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.vendor=CentOS) Oct 14 06:24:43 localhost systemd[1]: 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad.service: Deactivated successfully. 
Oct 14 06:24:43 localhost podman[342198]: 2025-10-14 10:24:43.835718439 +0000 UTC m=+0.159813506 container exec_died fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, config_id=iscsid, container_name=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible) Oct 14 06:24:43 localhost systemd[1]: fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29.service: Deactivated successfully. 
Oct 14 06:24:43 localhost nova_compute[297686]: 2025-10-14 10:24:43.857 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:24:43 localhost ovn_metadata_agent[163050]: 2025-10-14 10:24:43.857 163055 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=23, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'b6:6b:50', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '6a:59:81:01:bc:8b'}, ipsec=False) old=SB_Global(nb_cfg=22) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Oct 14 06:24:43 localhost ovn_metadata_agent[163050]: 2025-10-14 10:24:43.858 163055 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m Oct 14 06:24:43 localhost podman[342197]: 2025-10-14 10:24:43.913091578 +0000 UTC m=+0.238745412 container health_status aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', 
'--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Oct 14 06:24:43 localhost podman[342197]: 2025-10-14 10:24:43.92613259 +0000 UTC m=+0.251786494 container exec_died aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': 
['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors ) Oct 14 06:24:43 localhost systemd[1]: aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec.service: Deactivated successfully. Oct 14 06:24:44 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:24:44.289 271987 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-10-14T10:24:44Z, description=, device_id=44f0344d-8a72-431a-aa71-708e800153b9, device_owner=network:router_gateway, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=d0497429-466d-437a-9099-e1d3ce6f83cc, ip_allocation=immediate, mac_address=fa:16:3e:ee:c2:38, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-10-14T08:36:38Z, description=, dns_domain=, id=c0145816-4627-44f2-af00-ccc9ef0436ed, ipv4_address_scope=None, ipv6_address_scope=None, is_default=False, l2_adjacency=True, mtu=1350, name=public, port_security_enabled=True, project_id=41187b090f3d4818a32baa37ce8a3991, provider:network_type=flat, provider:physical_network=datacentre, provider:segmentation_id=None, qos_policy_id=None, revision_number=2, router:external=True, shared=False, standard_attr_id=29, status=ACTIVE, subnets=['bbd40dc1-97d8-4ed3-9d4f-9b3af758b526'], tags=[], tenant_id=41187b090f3d4818a32baa37ce8a3991, updated_at=2025-10-14T08:36:44Z, vlan_transparent=None, network_id=c0145816-4627-44f2-af00-ccc9ef0436ed, port_security_enabled=False, project_id=, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=3849, status=DOWN, tags=[], tenant_id=, 
updated_at=2025-10-14T10:24:44Z on network c0145816-4627-44f2-af00-ccc9ef0436ed#033[00m Oct 14 06:24:44 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e257 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Oct 14 06:24:44 localhost dnsmasq[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/addn_hosts - 2 addresses Oct 14 06:24:44 localhost dnsmasq-dhcp[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/host Oct 14 06:24:44 localhost dnsmasq-dhcp[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/opts Oct 14 06:24:44 localhost podman[342273]: 2025-10-14 10:24:44.515745141 +0000 UTC m=+0.060937065 container kill 27e23fba1f10c3118d508631d350793857acd97085c399f85eed727cd1eab7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0145816-4627-44f2-af00-ccc9ef0436ed, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0) Oct 14 06:24:44 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' Oct 14 06:24:44 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:24:44.767 271987 INFO neutron.agent.dhcp.agent [None req-2389823f-6f0f-47d5-9b83-1d276048140e - - - - - -] DHCP configuration for ports {'d0497429-466d-437a-9099-e1d3ce6f83cc'} is completed#033[00m Oct 14 06:24:45 localhost nova_compute[297686]: 2025-10-14 10:24:45.181 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:24:45 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e258 e258: 6 total, 6 up, 6 in Oct 
14 06:24:45 localhost nova_compute[297686]: 2025-10-14 10:24:45.843 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:24:48 localhost nova_compute[297686]: 2025-10-14 10:24:48.005 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:24:49 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e258 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.825 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'name': 'test', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005486733.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '41187b090f3d4818a32baa37ce8a3991', 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'hostId': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228 Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.826 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.849 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.read.latency volume: 1739521236 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:24:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.851 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.read.latency volume: 110324003 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.853 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cb7bf739-8cba-41a1-bb6e-0e9048301d28', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1739521236, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T10:24:49.827153', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '03d2bfe6-a8e8-11f0-9707-fa163e99780b', 'monotonic_time': 13506.019856072, 'message_signature': '82a76f11b95c182224defddcb5a7c72a1cdc819b1da64d029c5b3b06341a6ec9'}, {'source': 'openstack', 'counter_name': 
'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 110324003, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T10:24:49.827153', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '03d2db52-a8e8-11f0-9707-fa163e99780b', 'monotonic_time': 13506.019856072, 'message_signature': '5ea1718fb4b2046115cf266b6dd59843196bd16411c2b9e3327cc17b70791d69'}]}, 'timestamp': '2025-10-14 10:24:49.851811', '_unique_id': '8957d7f98842447295a5495f920167c0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.853 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.853 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.853 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:24:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.853 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.853 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.853 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.853 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.853 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.853 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.853 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.853 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.853 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.853 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.853 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.853 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.853 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.853 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.853 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.853 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.853 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.853 12 ERROR oslo_messaging.notify.messaging Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.853 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.853 12 ERROR oslo_messaging.notify.messaging Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.853 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.853 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:24:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.853 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.853 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.853 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.853 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.853 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.853 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.853 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.853 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.853 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.853 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.853 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.853 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.853 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.853 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.853 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.853 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.853 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.853 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.853 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.853 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.853 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.853 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.853 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.853 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.853 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.853 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.853 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.853 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.853 12 ERROR oslo_messaging.notify.messaging Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.855 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.855 12 DEBUG ceilometer.compute.pollsters [-] 
88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.read.bytes volume: 35560448 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.856 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.857 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a9ffe46a-34f2-4069-9ba4-b9bf103a6f5e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35560448, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T10:24:49.855587', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 
'03d388a4-a8e8-11f0-9707-fa163e99780b', 'monotonic_time': 13506.019856072, 'message_signature': '592789859a226d87248d39931fd88ec100bf96cf24eb4302423db4cf22707309'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T10:24:49.855587', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '03d39c7c-a8e8-11f0-9707-fa163e99780b', 'monotonic_time': 13506.019856072, 'message_signature': 'cf00b8336c240db6ca013a6e388d8a1d310a22cf9d2b4fde58b0a10e4d86c3e6'}]}, 'timestamp': '2025-10-14 10:24:49.856699', '_unique_id': '9b2d9313cb1447379adf35190e3308c5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.857 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.857 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in 
_reraise_as_library_errors Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.857 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.857 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.857 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.857 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.857 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.857 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.857 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.857 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.857 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.857 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.857 12 
ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.857 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.857 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.857 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.857 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.857 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.857 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.857 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.857 12 ERROR oslo_messaging.notify.messaging Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.857 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.857 12 ERROR oslo_messaging.notify.messaging Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.857 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 
2025-10-14 10:24:49.857 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.857 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.857 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.857 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.857 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.857 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.857 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.857 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.857 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.857 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 
06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.857 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.857 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.857 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.857 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.857 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.857 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.857 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.857 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.857 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.857 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:24:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.857 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.857 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.857 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.857 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.857 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.857 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.857 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.857 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.857 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.857 12 ERROR oslo_messaging.notify.messaging Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.859 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new resources found this cycle 
poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.859 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.863 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.865 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0eaf6839-6394-4163-8883-51aca5a50410', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T10:24:49.860042', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 
'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': '03d4c872-a8e8-11f0-9707-fa163e99780b', 'monotonic_time': 13506.052796005, 'message_signature': '66e21552e288ef6b7747c02e144522f6326ed530ab960c58935043f397aafe20'}]}, 'timestamp': '2025-10-14 10:24:49.864469', '_unique_id': 'debc8ed290b54e0ca88b40d4105a03d0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.865 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.865 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.865 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.865 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.865 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.865 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.865 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.865 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, 
in _connection_factory Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.865 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.865 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.865 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.865 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.865 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.865 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.865 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.865 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.865 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.865 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.865 12 ERROR 
oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.865 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.865 12 ERROR oslo_messaging.notify.messaging Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.865 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.865 12 ERROR oslo_messaging.notify.messaging Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.865 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.865 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.865 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.865 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.865 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.865 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.865 12 ERROR 
oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.865 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.865 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.865 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.865 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.865 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.865 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.865 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.865 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.865 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.865 12 ERROR 
oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.865 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.865 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.865 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.865 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.865 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.865 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.865 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.865 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.865 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.865 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:24:49 
localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.865 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.865 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.865 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.865 12 ERROR oslo_messaging.notify.messaging Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.866 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.867 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.outgoing.packets volume: 118 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.868 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'aac8b916-6213-474e-9335-6ff1afcab305', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 118, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T10:24:49.867001', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': '03d542ac-a8e8-11f0-9707-fa163e99780b', 'monotonic_time': 13506.052796005, 'message_signature': '418710a6ef2d7127e188cf3c9191dfe3d7d155828894954d918cb859219d5ec4'}]}, 'timestamp': '2025-10-14 10:24:49.867470', '_unique_id': '727b60e5e9d7452b8f0dc309b015d089'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.868 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:24:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.868 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.868 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.868 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.868 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.868 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.868 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.868 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.868 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.868 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.868 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.868 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.868 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.868 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.868 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.868 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.868 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.868 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.868 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.868 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.868 12 ERROR oslo_messaging.notify.messaging Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.868 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.868 12 ERROR oslo_messaging.notify.messaging Oct 14 06:24:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.868 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.868 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.868 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.868 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.868 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.868 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.868 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.868 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.868 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.868 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.868 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.868 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.868 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.868 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.868 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.868 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.868 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.868 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.868 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.868 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:24:49 
localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.868 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.868 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.868 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.868 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.868 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.868 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.868 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.868 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.868 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.868 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.868 12 ERROR oslo_messaging.notify.messaging Oct 14 06:24:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.869 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.869 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.869 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.872 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ce56a02b-3f69-455f-9902-e145c043e661', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T10:24:49.869770', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': '03d5d8ac-a8e8-11f0-9707-fa163e99780b', 'monotonic_time': 13506.052796005, 'message_signature': '5f5da9c1e7a2e7a8e56dac7db22ac8e85c5ab53f764b42ca361297659f6f405b'}]}, 'timestamp': '2025-10-14 10:24:49.871410', '_unique_id': 'eb4af080e9854b5ab12c1f437e9fb5aa'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.872 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.872 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.872 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.872 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.872 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.872 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.872 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:24:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.872 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.872 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.872 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.872 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.872 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.872 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.872 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.872 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.872 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.872 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.872 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.872 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.872 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.872 12 ERROR oslo_messaging.notify.messaging Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.872 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.872 12 ERROR oslo_messaging.notify.messaging Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.872 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.872 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.872 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.872 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.872 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.872 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.872 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.872 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.872 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.872 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.872 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.872 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.872 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.872 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.872 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.872 12 ERROR oslo_messaging.notify.messaging 
File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.872 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.872 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.872 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.872 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.872 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.872 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.872 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.872 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.872 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.872 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in 
__exit__
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.872 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.872 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.872 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.872 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.872 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.874 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.874 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.write.bytes volume: 454656 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.874 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.876 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ab912d9c-bb2d-446e-b8f6-ddc863327716', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 454656, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T10:24:49.874237', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '03d65dd6-a8e8-11f0-9707-fa163e99780b', 'monotonic_time': 13506.019856072, 'message_signature': 'f8aa0d567ae810b149b0700493c76443b006f8be45aafc0507dd46b04562e2ed'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T10:24:49.874237', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '03d679d8-a8e8-11f0-9707-fa163e99780b', 'monotonic_time': 13506.019856072, 'message_signature': 'e78a6c1c72afcc7f244e39a2495ca9f6040b8a5d0d54af972c2e5aece4bae672'}]}, 'timestamp': '2025-10-14 10:24:49.875406', '_unique_id': 'b197ad12895442f1b22c7c0575f1d4c9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.876 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.876 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.876 12 ERROR oslo_messaging.notify.messaging yield
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.876 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.876 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.876 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.876 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.876 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.876 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.876 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.876 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.876 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.876 12 ERROR oslo_messaging.notify.messaging conn.connect()
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.876 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.876 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.876 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.876 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.876 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.876 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.876 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.876 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.876 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.876 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.876 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.876 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.876 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.876 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.876 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.876 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.876 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.876 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.876 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.876 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.876 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.876 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.876 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.876 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.876 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.876 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.876 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.876 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.876 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.876 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.876 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.876 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.876 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.876 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.876 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.876 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.876 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.876 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.876 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.876 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.876 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.877 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.887 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.887 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.889 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '00f1e577-f501-47ed-ae1c-c5867c9a5862', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T10:24:49.877745', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '03d85b04-a8e8-11f0-9707-fa163e99780b', 'monotonic_time': 13506.070469538, 'message_signature': 'f42588efc52a523e2c6b67061950da233de24bfdfcb42f09f97295bde7c7dc7f'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T10:24:49.877745', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '03d86d38-a8e8-11f0-9707-fa163e99780b', 'monotonic_time': 13506.070469538, 'message_signature': 'cd7f5dd0e9c9801a35ca8f5e5ccbd89ba3991acdf53a829228b3e4ed9e600f34'}]}, 'timestamp': '2025-10-14 10:24:49.888182', '_unique_id': 'e57d21d876374dd686b7f506aa7e06ad'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.889 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.889 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.889 12 ERROR oslo_messaging.notify.messaging yield
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.889 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.889 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.889 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.889 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.889 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.889 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.889 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.889 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.889 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.889 12 ERROR oslo_messaging.notify.messaging conn.connect()
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.889 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.889 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.889 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.889 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.889 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.889 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.889 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.889 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.889 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.889 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.889 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.889 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.889 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.889 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.889 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.889 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.889 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.889 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.889 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.889 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.889 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.889 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.889 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.889 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.889 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.889 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.889 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.889 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.889 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.889 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.889 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.889 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.889 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.889 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.889 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.889 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.889 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.889 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.889 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.889 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.889 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.890 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.906 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/memory.usage volume: 51.81640625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.908 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9e754485-bf2a-4192-9571-acea0468759c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.81640625, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'timestamp': '2025-10-14T10:24:49.890416', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '03db3f0e-a8e8-11f0-9707-fa163e99780b', 'monotonic_time': 13506.098666195, 'message_signature': '9378afba8cde627a217386728a7c955bf4c64be1aff8b39bfe698862560dcc01'}]}, 'timestamp': '2025-10-14 10:24:49.906830', '_unique_id': '7f203bed165a45cab265833677193cf8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.908 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.908 12 ERROR oslo_messaging.notify.messaging yield
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.908 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.908 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.908 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.908 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.908 12 ERROR oslo_messaging.notify.messaging conn.connect()
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.908 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.908 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.908 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.908 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.908 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.908 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.908 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.908 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.908 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.908 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.908 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.908 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.908 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.908 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.908 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.908 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.908 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.908 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.908 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.908 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.908 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.908 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.908 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.908 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.909 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Oct 14 06:24:49
localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.909 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.911 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7e5750df-53f2-4a5c-ab2c-3505b0bb4691', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T10:24:49.909831', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': '03dbcd48-a8e8-11f0-9707-fa163e99780b', 'monotonic_time': 13506.052796005, 
'message_signature': '03ee7a5efe5eac160a68eeea83a52f18c98a17aa46cbe4c1ad55608f5de431b5'}]}, 'timestamp': '2025-10-14 10:24:49.910358', '_unique_id': '94fe28f5c0a94ab8aeaa236d55472cf8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.911 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.911 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.911 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.911 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.911 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.911 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.911 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.911 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.911 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.911 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.911 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.911 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:24:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.911 12 ERROR oslo_messaging.notify.messaging Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.911 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.911 12 ERROR oslo_messaging.notify.messaging Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.911 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.911 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.911 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.911 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", 
line 653, in _send Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.911 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.911 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.911 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.911 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.911 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.911 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.911 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.911 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.911 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.911 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.911 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:24:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.911 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.911 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.911 12 ERROR oslo_messaging.notify.messaging Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.912 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.912 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.912 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.outgoing.bytes volume: 10162 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.914 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '882b40b2-59b7-49f0-b281-6cce2404305c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 10162, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T10:24:49.912727', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': '03dc3d28-a8e8-11f0-9707-fa163e99780b', 'monotonic_time': 13506.052796005, 'message_signature': '10e7750b91339759bfe5ac710d0e56c1bb761228b2cd015a18d5cbd832cc10a9'}]}, 'timestamp': '2025-10-14 10:24:49.913203', '_unique_id': '59ddaa149ec04ed1a8429aabf6e1c1c1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.914 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:24:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.914 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.914 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.914 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.914 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.914 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.914 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.914 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.914 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.914 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.914 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.914 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.914 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.914 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.914 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.914 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.914 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.914 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.914 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.914 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.914 12 ERROR oslo_messaging.notify.messaging Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.914 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.914 12 ERROR oslo_messaging.notify.messaging Oct 14 06:24:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.914 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.914 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.914 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.914 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.914 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.914 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.914 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.914 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.914 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.914 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.914 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.914 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.914 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.914 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.914 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.914 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.914 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.914 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.914 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.914 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:24:49 
localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.914 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.914 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.914 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.914 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.914 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.914 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.914 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.914 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.914 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.914 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.914 12 ERROR oslo_messaging.notify.messaging Oct 14 06:24:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.915 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.915 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/cpu volume: 19370000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.917 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e80e6059-fc5a-47b5-a03b-ad683fecc6b0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 19370000000, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'timestamp': '2025-10-14T10:24:49.915572', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '03dcae66-a8e8-11f0-9707-fa163e99780b', 'monotonic_time': 13506.098666195, 
'message_signature': '956653d18abc459cfefce27bd988434e6bdc4acb2643fa7e0a87d3a722f4ec7b'}]}, 'timestamp': '2025-10-14 10:24:49.916095', '_unique_id': 'e8cb76bfb43f46eab2b34115172349f4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.917 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.917 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.917 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.917 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.917 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.917 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.917 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.917 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.917 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.917 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.917 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.917 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.917 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.917 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.917 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.917 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.917 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.917 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.917 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.917 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:24:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.917 12 ERROR oslo_messaging.notify.messaging Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.917 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.917 12 ERROR oslo_messaging.notify.messaging Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.917 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.917 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.917 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.917 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.917 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.917 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.917 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.917 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", 
line 653, in _send Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.917 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.917 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.917 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.917 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.917 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.917 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.917 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.917 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.917 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.917 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.917 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.917 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.917 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.917 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.917 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.917 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.917 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.917 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.917 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.917 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:24:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.917 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.917 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.917 12 ERROR oslo_messaging.notify.messaging Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.918 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.918 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.read.requests volume: 1272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.918 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.920 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '9ca85937-573b-4439-a4ad-165b8ff5291b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1272, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T10:24:49.918260', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '03dd1554-a8e8-11f0-9707-fa163e99780b', 'monotonic_time': 13506.019856072, 'message_signature': 'eae5e343f62ef0d7a135bd87c1ebf026af2ce6dbc36ea734723afdc55fd8c87e'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T10:24:49.918260', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '03dd2ca6-a8e8-11f0-9707-fa163e99780b', 'monotonic_time': 13506.019856072, 'message_signature': 'f8c6f399ccd4a0bb973278ab6ca3540634fa728ff0f73c1b1491868d7decaf22'}]}, 'timestamp': '2025-10-14 10:24:49.919318', '_unique_id': 'da8fbabacac841ab9feb16ed46b1143b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.920 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.920 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.920 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.920 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.920 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.920 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.920 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.920 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.920 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 
06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.920 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.920 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.920 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.920 12 ERROR oslo_messaging.notify.messaging Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.920 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.920 12 ERROR oslo_messaging.notify.messaging Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.920 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.920 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:24:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.920 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.920 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.920 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.920 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.920 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.920 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.920 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.920 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.920 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.920 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:24:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.920 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.920 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.920 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.920 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.920 12 ERROR oslo_messaging.notify.messaging Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.921 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.921 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.923 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'd2746317-cd53-42b1-8999-dfb70e766b7b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T10:24:49.921515', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': '03dd9664-a8e8-11f0-9707-fa163e99780b', 'monotonic_time': 13506.052796005, 'message_signature': '954ec8aff0e7523d1d239461c38e3bd7011a13a3af14b7cd17ab737c614e509b'}]}, 'timestamp': '2025-10-14 10:24:49.922036', '_unique_id': '0c5bcacaa7e64e1c84197ade35f2e6fa'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.923 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:24:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.923 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.923 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.923 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.923 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.923 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.923 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.923 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.923 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.923 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.923 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.923 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.923 12 ERROR oslo_messaging.notify.messaging Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.923 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.923 12 ERROR oslo_messaging.notify.messaging Oct 14 06:24:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.923 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.923 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.923 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.923 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.923 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.923 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.923 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.923 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.923 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.923 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:24:49 
localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.923 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.923 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.923 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.923 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.923 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.923 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.923 12 ERROR oslo_messaging.notify.messaging Oct 14 06:24:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.924 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.924 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.write.requests volume: 53 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.924 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.926 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'eed9d3f3-0800-432c-b220-f82ca5774667', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 53, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T10:24:49.924370', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '03de07b6-a8e8-11f0-9707-fa163e99780b', 'monotonic_time': 13506.019856072, 'message_signature': '49fa65be32f045be26fd811ff711d686a306d08b715be6c7f2f9d130381ac7d8'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T10:24:49.924370', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '03de183c-a8e8-11f0-9707-fa163e99780b', 'monotonic_time': 13506.019856072, 'message_signature': 'cdba5047bd8ae3a189d1b34cfd710c2af9ab0cfd54b27fa2642bced6a7919e05'}]}, 'timestamp': '2025-10-14 10:24:49.925329', '_unique_id': 'a1160157203b4b35974fd3b756526121'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.926 12 
ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.926 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.926 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.926 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.926 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.926 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:24:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.926 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.926 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.926 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.926 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.926 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.926 12 ERROR oslo_messaging.notify.messaging Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.926 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 
2025-10-14 10:24:49.926 12 ERROR oslo_messaging.notify.messaging Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.926 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.926 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.926 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.926 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.926 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.926 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.926 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.926 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.926 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.926 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.926 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.926 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.926 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.926 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.926 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.926 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.926 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.926 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:24:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.926 12 ERROR oslo_messaging.notify.messaging Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.927 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.927 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.incoming.bytes volume: 8190 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.928 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9cde698e-6f4b-48e7-97c8-de6c64d8ad32', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 8190, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T10:24:49.927547', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 
'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': '03de80d8-a8e8-11f0-9707-fa163e99780b', 'monotonic_time': 13506.052796005, 'message_signature': '60e47de5e2e527f5a80b0b1b6a65186c98d1b3f36733b327ecef1dfb4c9f0c42'}]}, 'timestamp': '2025-10-14 10:24:49.928040', '_unique_id': 'f3a934ad2a5c4e26a5c10803fafc4b50'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.928 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.928 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.928 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.928 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.928 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.928 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.928 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.928 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, 
in _connection_factory Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.928 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.928 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.928 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.928 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.928 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.928 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.928 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.928 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.928 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.928 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.928 12 ERROR 
oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.928 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.928 12 ERROR oslo_messaging.notify.messaging Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.928 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.928 12 ERROR oslo_messaging.notify.messaging Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.928 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.928 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.928 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.928 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.928 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.928 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.928 12 ERROR 
oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.928 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.928 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.928 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.928 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.928 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.928 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.928 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.928 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.928 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.928 12 ERROR 
oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.928 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.928 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.928 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.928 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.928 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.928 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.928 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.928 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.928 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.928 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:24:49 
localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.928 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.928 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.928 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.928 12 ERROR oslo_messaging.notify.messaging Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.930 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.930 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.incoming.packets volume: 79 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.932 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '5fb9813a-54f2-4626-a4d1-560118585924', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 79, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T10:24:49.930446', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': '03def0a4-a8e8-11f0-9707-fa163e99780b', 'monotonic_time': 13506.052796005, 'message_signature': 'd3bc9758124e0b111e6ec14c653381a521560439c9c8e8dda5e96646d7ef992a'}]}, 'timestamp': '2025-10-14 10:24:49.931133', '_unique_id': 'e31c38ad132f4e1fb87b933ad2b39217'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.932 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:24:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.932 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.932 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.932 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.932 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.932 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.932 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.932 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.932 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.932 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.932 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.932 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.932 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.932 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.932 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.932 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.932 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.932 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.932 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.932 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.932 12 ERROR oslo_messaging.notify.messaging Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.932 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.932 12 ERROR oslo_messaging.notify.messaging Oct 14 06:24:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.932 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.932 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.932 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.932 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.932 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.932 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.932 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.932 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.932 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.932 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.932 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.932 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.932 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.932 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.932 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.932 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.932 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.932 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.932 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.932 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:24:49 
localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.932 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.932 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.932 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.932 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.932 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.932 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.932 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.932 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.932 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.932 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.932 12 ERROR oslo_messaging.notify.messaging Oct 14 06:24:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.933 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.933 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.933 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.935 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b4dd08d4-206e-46d3-bdae-96bf3a284de7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T10:24:49.933198', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '03df5be8-a8e8-11f0-9707-fa163e99780b', 'monotonic_time': 13506.070469538, 'message_signature': '5f66d7826cfa30615ca3e4e76b07d81beb9f9b6caca6180cbd0fcffb7fde2e30'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T10:24:49.933198', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '03df6e3a-a8e8-11f0-9707-fa163e99780b', 'monotonic_time': 13506.070469538, 'message_signature': '5781efdd92b3ac1afc9aadce9451c8bfb2c6fef40bf5decaa0b85ae6a35699ca'}]}, 'timestamp': '2025-10-14 10:24:49.934088', '_unique_id': 'b4d1a187346946deb8858c4e4ca397d7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.935 12 ERROR 
oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.935 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.935 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.935 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.935 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.935 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.935 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.935 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.935 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.935 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.935 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:24:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.935 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.935 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.935 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.935 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.935 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.935 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.935 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.935 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.935 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.935 12 ERROR oslo_messaging.notify.messaging Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.935 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 
2025-10-14 10:24:49.935 12 ERROR oslo_messaging.notify.messaging Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.935 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.935 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.935 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.935 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.935 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.935 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.935 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.935 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.935 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.935 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.935 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.935 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.935 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.935 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.935 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.935 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.935 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.935 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.935 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.935 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.935 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.935 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.935 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.935 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.935 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.935 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.935 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.935 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.935 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.935 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:24:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.935 12 ERROR oslo_messaging.notify.messaging Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.936 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.936 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.936 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.938 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'eba6e9d6-bbed-4ff9-8717-0456246f95e4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T10:24:49.936476', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': '03dfdece-a8e8-11f0-9707-fa163e99780b', 'monotonic_time': 13506.052796005, 'message_signature': '35bfddab946b475305edcede35ebf577aab33e6f0363f26642b236349358a7a2'}]}, 'timestamp': '2025-10-14 10:24:49.937071', '_unique_id': '6c8b2a253e5d414a87b18fd51510a91b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.938 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 
2025-10-14 10:24:49.938 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.938 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.938 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.938 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.938 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.938 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.938 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.938 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.938 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.938 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.938 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.938 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.938 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.938 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.938 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.938 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.938 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.938 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.938 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.938 12 ERROR oslo_messaging.notify.messaging Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.938 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.938 12 ERROR oslo_messaging.notify.messaging Oct 14 06:24:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.938 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.938 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.938 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.938 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.938 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.938 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.938 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.938 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.938 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.938 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.938 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.938 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.938 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.938 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.938 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.938 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.938 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.938 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.938 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.938 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.938 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.938 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.938 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.938 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.938 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.938 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.938 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.938 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.938 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.938 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.938 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.939 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.939 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.939 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.941 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7c8dd25b-5551-45da-a8f4-c41126e77a73', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T10:24:49.939335', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '03e04b8e-a8e8-11f0-9707-fa163e99780b', 'monotonic_time': 13506.070469538, 'message_signature': '1cb25ddacd8f4b01a1a7f14fe0bba3241db202586afea859bdd62f952ac01662'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T10:24:49.939335', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '03e05d0e-a8e8-11f0-9707-fa163e99780b', 'monotonic_time': 13506.070469538, 'message_signature': 'a9a2e60f23c7df59dd5bb5ad246343eb69ee8f3a24559f82737061dfc9c7df5d'}]}, 'timestamp': '2025-10-14 10:24:49.940197', '_unique_id': 'f1fd20c4d70947f4bed0949ca3b1eaff'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.941 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.941 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.941 12 ERROR oslo_messaging.notify.messaging yield
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.941 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.941 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.941 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.941 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.941 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.941 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.941 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.941 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.941 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.941 12 ERROR oslo_messaging.notify.messaging conn.connect()
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.941 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.941 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.941 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.941 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.941 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.941 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.941 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.941 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.941 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.941 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.941 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.941 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.941 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.941 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.941 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.941 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.941 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.941 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.941 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.941 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.941 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.941 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.941 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.941 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.941 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.941 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.941 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.941 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.941 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.941 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.941 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.941 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.941 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.941 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.941 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.941 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.941 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.941 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.941 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.941 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.941 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.942 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.942 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.write.latency volume: 1178650666 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.942 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.write.latency volume: 22746151 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.944 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b5df9501-8855-494c-8e09-fa2f7b0fca8c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1178650666, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T10:24:49.942309', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '03e0bfa6-a8e8-11f0-9707-fa163e99780b', 'monotonic_time': 13506.019856072, 'message_signature': 'e3d663a392b06b3c649c6e5a8b0c5b0a04f1593aa9e8f150729ee2442f612366'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 22746151, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T10:24:49.942309', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '03e0d298-a8e8-11f0-9707-fa163e99780b', 'monotonic_time': 13506.019856072, 'message_signature': '3528d835b5ec6b0848f00abc4409dabca809a4865c54219e46ec90d617be8408'}]}, 'timestamp': '2025-10-14 10:24:49.943209', '_unique_id': '6af2c678fa53456885530186633fd331'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.944 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.944 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.944 12 ERROR oslo_messaging.notify.messaging yield
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.944 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.944 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.944 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.944 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.944 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.944 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.944 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.944 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.944 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.944 12 ERROR oslo_messaging.notify.messaging conn.connect()
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.944 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.944 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.944 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.944 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.944 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.944 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.944 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.944 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.944 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.944 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.944 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.944 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.944 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.944 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.944 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.944 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.944 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.944 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.944 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.944 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.944 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.944 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.944 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.944 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.944 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.944 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.944 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.944 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.944 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.944 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.944 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.944 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.944 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.944 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.944 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.944 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.944 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.944 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.944 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.944 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.944 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.945 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.945 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.946 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd3d003b5-57a7-4296-b0c4-954ce63d0fe3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T10:24:49.945422', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': '03e13ec2-a8e8-11f0-9707-fa163e99780b', 'monotonic_time': 13506.052796005, 'message_signature': '0c5a7088d5a32ce56ad919a41dc541075f27e94f476357e493db110ba682e86e'}]}, 'timestamp': '2025-10-14 10:24:49.945950', '_unique_id': '14812f5e324641a586d2af7396e2dfca'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.946 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.946 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.946 12 ERROR oslo_messaging.notify.messaging yield
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.946 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.946 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.946 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.946 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.946 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.946 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.946 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.946 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.946 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.946 12 ERROR oslo_messaging.notify.messaging conn.connect()
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.946 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.946 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.946 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.946 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.946 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.946 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.946 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.946 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.946 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.946 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:24:49 localhost
ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.946 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.946 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.946 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.946 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.946 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.946 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.946 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.946 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.946 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.946 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.946 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.946 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.946 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.946 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.946 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.946 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.946 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.946 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.946 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.946 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:24:49 
localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.946 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.946 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.946 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.946 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.946 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.946 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.946 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.946 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.946 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.946 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:24:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:24:49.946 12 ERROR oslo_messaging.notify.messaging Oct 14 06:24:50 localhost 
nova_compute[297686]: 2025-10-14 10:24:50.846 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:24:50 localhost ovn_metadata_agent[163050]: 2025-10-14 10:24:50.859 163055 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=9e4b0f79-1220-4c7d-a18d-fa1a88dab362, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '23'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Oct 14 06:24:53 localhost ovn_controller[157396]: 2025-10-14T10:24:53Z|00294|binding|INFO|Releasing lport 25c6586a-239c-451b-aac2-e0a3ee5c3145 from this chassis (sb_readonly=0) Oct 14 06:24:53 localhost nova_compute[297686]: 2025-10-14 10:24:53.726 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:24:53 localhost dnsmasq[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/addn_hosts - 1 addresses Oct 14 06:24:53 localhost dnsmasq-dhcp[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/host Oct 14 06:24:53 localhost podman[342306]: 2025-10-14 10:24:53.757787278 +0000 UTC m=+0.063144692 container kill 27e23fba1f10c3118d508631d350793857acd97085c399f85eed727cd1eab7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0145816-4627-44f2-af00-ccc9ef0436ed, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0) Oct 14 06:24:53 localhost dnsmasq-dhcp[325837]: read 
/var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/opts Oct 14 06:24:54 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e258 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Oct 14 06:24:55 localhost nova_compute[297686]: 2025-10-14 10:24:55.849 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:24:57 localhost ovn_metadata_agent[163050]: 2025-10-14 10:24:57.792 163055 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Oct 14 06:24:57 localhost ovn_metadata_agent[163050]: 2025-10-14 10:24:57.793 163055 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Oct 14 06:24:57 localhost ovn_metadata_agent[163050]: 2025-10-14 10:24:57.793 163055 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Oct 14 06:24:58 localhost podman[248187]: time="2025-10-14T10:24:58Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Oct 14 06:24:58 localhost podman[248187]: @ - - [14/Oct/2025:10:24:58 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 147497 "" "Go-http-client/1.1" Oct 14 06:24:58 localhost podman[248187]: @ - - [14/Oct/2025:10:24:58 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 
19877 "" "Go-http-client/1.1" Oct 14 06:24:59 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e258 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Oct 14 06:24:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f. Oct 14 06:24:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917. Oct 14 06:24:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d. Oct 14 06:24:59 localhost podman[342329]: 2025-10-14 10:24:59.741119425 +0000 UTC m=+0.076310778 container health_status 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, 
maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, io.buildah.version=1.41.3) Oct 14 06:24:59 localhost podman[342330]: 2025-10-14 10:24:59.804642008 +0000 UTC m=+0.136321263 container health_status 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.openshift.tags=minimal rhel9, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, release=1755695350, vendor=Red Hat, Inc., name=ubi9-minimal, container_name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image 
Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers) Oct 14 06:24:59 localhost podman[342329]: 2025-10-14 10:24:59.837763507 +0000 UTC m=+0.172954820 container exec_died 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}) Oct 14 06:24:59 localhost systemd[1]: 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f.service: Deactivated successfully. Oct 14 06:24:59 localhost podman[342331]: 2025-10-14 10:24:59.859896867 +0000 UTC m=+0.187360432 container health_status 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, 
io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true, config_id=edpm, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0) Oct 14 06:24:59 localhost podman[342331]: 2025-10-14 10:24:59.875120495 +0000 UTC m=+0.202584110 container exec_died 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, 
org.label-schema.vendor=CentOS, tcib_managed=true, config_id=edpm, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image) Oct 14 06:24:59 localhost systemd[1]: 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d.service: Deactivated successfully. Oct 14 06:24:59 localhost podman[342330]: 2025-10-14 10:24:59.893727998 +0000 UTC m=+0.225407273 container exec_died 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, vcs-type=git, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., distribution-scope=public, architecture=x86_64, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, release=1755695350, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, vendor=Red Hat, Inc., managed_by=edpm_ansible, config_id=edpm) Oct 14 06:24:59 localhost systemd[1]: 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917.service: Deactivated successfully. 
Oct 14 06:25:00 localhost nova_compute[297686]: 2025-10-14 10:25:00.852 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Oct 14 06:25:00 localhost nova_compute[297686]: 2025-10-14 10:25:00.853 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Oct 14 06:25:00 localhost nova_compute[297686]: 2025-10-14 10:25:00.854 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Oct 14 06:25:00 localhost nova_compute[297686]: 2025-10-14 10:25:00.854 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Oct 14 06:25:00 localhost nova_compute[297686]: 2025-10-14 10:25:00.870 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:25:00 localhost nova_compute[297686]: 2025-10-14 10:25:00.870 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Oct 14 06:25:04 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e258 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Oct 14 06:25:05 localhost nova_compute[297686]: 2025-10-14 10:25:05.871 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Oct 14 06:25:05 localhost nova_compute[297686]: 2025-10-14 10:25:05.873 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:25:05 localhost nova_compute[297686]: 2025-10-14 10:25:05.873 2 DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Oct 14 06:25:05 localhost nova_compute[297686]: 2025-10-14 10:25:05.873 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Oct 14 06:25:05 localhost nova_compute[297686]: 2025-10-14 10:25:05.874 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Oct 14 06:25:05 localhost nova_compute[297686]: 2025-10-14 10:25:05.877 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:25:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041. Oct 14 06:25:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857. 
Oct 14 06:25:08 localhost podman[342394]: 2025-10-14 10:25:08.741768489 +0000 UTC m=+0.084734127 container health_status 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm) Oct 14 06:25:08 localhost openstack_network_exporter[250374]: ERROR 10:25:08 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Oct 14 06:25:08 localhost openstack_network_exporter[250374]: ERROR 10:25:08 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Oct 14 06:25:08 localhost openstack_network_exporter[250374]: ERROR 10:25:08 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Oct 14 06:25:08 localhost openstack_network_exporter[250374]: ERROR 10:25:08 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Oct 14 06:25:08 localhost openstack_network_exporter[250374]: Oct 14 06:25:08 localhost openstack_network_exporter[250374]: ERROR 10:25:08 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Oct 14 06:25:08 
localhost openstack_network_exporter[250374]: Oct 14 06:25:08 localhost podman[342394]: 2025-10-14 10:25:08.778715305 +0000 UTC m=+0.121680893 container exec_died 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Oct 14 06:25:08 localhost systemd[1]: 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041.service: Deactivated successfully. 
Oct 14 06:25:08 localhost podman[342395]: 2025-10-14 10:25:08.86244675 +0000 UTC m=+0.201511958 container health_status 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true) Oct 14 06:25:08 localhost podman[342395]: 2025-10-14 10:25:08.871067575 +0000 UTC 
m=+0.210132773 container exec_died 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251009, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, container_name=ovn_metadata_agent, tcib_managed=true) Oct 14 06:25:08 localhost systemd[1]: 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857.service: Deactivated successfully. 
Oct 14 06:25:09 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e258 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Oct 14 06:25:10 localhost nova_compute[297686]: 2025-10-14 10:25:10.876 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:25:14 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e258 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Oct 14 06:25:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad. Oct 14 06:25:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec. Oct 14 06:25:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29. 
Oct 14 06:25:14 localhost podman[342434]: 2025-10-14 10:25:14.747363631 +0000 UTC m=+0.083058966 container health_status 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251009, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true) Oct 14 06:25:14 localhost podman[342434]: 2025-10-14 10:25:14.786178184 +0000 UTC m=+0.121873509 container exec_died 
02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, container_name=multipathd, config_id=multipathd, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image) Oct 14 06:25:14 localhost podman[342435]: 2025-10-14 10:25:14.812639688 +0000 UTC m=+0.145268949 container health_status aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, 
name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter) Oct 14 06:25:14 localhost podman[342435]: 2025-10-14 10:25:14.850228853 +0000 UTC m=+0.182858104 container exec_died aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', 
'--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm) Oct 14 06:25:14 localhost systemd[1]: aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec.service: Deactivated successfully. Oct 14 06:25:14 localhost podman[342436]: 2025-10-14 10:25:14.870129496 +0000 UTC m=+0.197542076 container health_status fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=iscsid, container_name=iscsid, managed_by=edpm_ansible, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true) Oct 14 06:25:14 localhost systemd[1]: 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad.service: Deactivated successfully. Oct 14 06:25:14 localhost podman[342436]: 2025-10-14 10:25:14.903946966 +0000 UTC m=+0.231359556 container exec_died fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251009) Oct 14 06:25:14 localhost systemd[1]: fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29.service: Deactivated successfully. Oct 14 06:25:15 localhost nova_compute[297686]: 2025-10-14 10:25:15.880 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:25:19 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e258 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Oct 14 06:25:20 localhost nova_compute[297686]: 2025-10-14 10:25:20.883 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Oct 14 06:25:24 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e258 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Oct 14 06:25:24 localhost ovn_controller[157396]: 2025-10-14T10:25:24Z|00295|memory_trim|INFO|Detected inactivity (last active 30011 ms ago): trimming memory Oct 14 06:25:25 localhost nova_compute[297686]: 2025-10-14 10:25:25.885 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:25:27 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e259 e259: 6 total, 6 up, 6 in Oct 14 06:25:28 localhost podman[248187]: 
time="2025-10-14T10:25:28Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Oct 14 06:25:28 localhost podman[248187]: @ - - [14/Oct/2025:10:25:28 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 147497 "" "Go-http-client/1.1" Oct 14 06:25:28 localhost podman[248187]: @ - - [14/Oct/2025:10:25:28 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19872 "" "Go-http-client/1.1" Oct 14 06:25:29 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e259 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Oct 14 06:25:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f. Oct 14 06:25:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917. Oct 14 06:25:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d. 
Oct 14 06:25:30 localhost podman[342498]: 2025-10-14 10:25:30.750042357 +0000 UTC m=+0.086322516 container health_status 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, io.buildah.version=1.33.7, name=ubi9-minimal, vendor=Red Hat, Inc., distribution-scope=public, architecture=x86_64, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, release=1755695350, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Oct 14 06:25:30 localhost podman[342498]: 2025-10-14 10:25:30.761247242 +0000 UTC m=+0.097527391 container exec_died 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, name=ubi9-minimal, vcs-type=git, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, config_id=edpm, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, container_name=openstack_network_exporter, distribution-scope=public, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1755695350, com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b) Oct 14 06:25:30 localhost podman[342499]: 2025-10-14 10:25:30.797198647 +0000 UTC m=+0.130446963 container health_status 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0) Oct 14 06:25:30 localhost podman[342499]: 2025-10-14 10:25:30.810174577 +0000 UTC m=+0.143422863 container exec_died 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible, container_name=ceilometer_agent_compute, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 
'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS) Oct 14 06:25:30 localhost systemd[1]: 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d.service: Deactivated successfully. Oct 14 06:25:30 localhost systemd[1]: 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917.service: Deactivated successfully. Oct 14 06:25:30 localhost nova_compute[297686]: 2025-10-14 10:25:30.888 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:25:30 localhost systemd[1]: tmp-crun.DUT29l.mount: Deactivated successfully. 
Oct 14 06:25:30 localhost podman[342497]: 2025-10-14 10:25:30.961841781 +0000 UTC m=+0.301844414 container health_status 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.build-date=20251009, managed_by=edpm_ansible, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS) Oct 14 06:25:31 localhost podman[342497]: 2025-10-14 10:25:31.030195912 +0000 UTC m=+0.370198555 container exec_died 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, 
org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team) Oct 14 06:25:31 localhost systemd[1]: 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f.service: Deactivated successfully. 
Oct 14 06:25:32 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:25:32.855 271987 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-10-14T10:25:32Z, description=, device_id=fd4024eb-1485-429b-9a20-2d8c4acfec9e, device_owner=network:router_gateway, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=f163b4ae-4bc4-49c4-bcc6-9b869a6b94b0, ip_allocation=immediate, mac_address=fa:16:3e:33:5b:0d, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-10-14T08:36:38Z, description=, dns_domain=, id=c0145816-4627-44f2-af00-ccc9ef0436ed, ipv4_address_scope=None, ipv6_address_scope=None, is_default=False, l2_adjacency=True, mtu=1350, name=public, port_security_enabled=True, project_id=41187b090f3d4818a32baa37ce8a3991, provider:network_type=flat, provider:physical_network=datacentre, provider:segmentation_id=None, qos_policy_id=None, revision_number=2, router:external=True, shared=False, standard_attr_id=29, status=ACTIVE, subnets=['bbd40dc1-97d8-4ed3-9d4f-9b3af758b526'], tags=[], tenant_id=41187b090f3d4818a32baa37ce8a3991, updated_at=2025-10-14T08:36:44Z, vlan_transparent=None, network_id=c0145816-4627-44f2-af00-ccc9ef0436ed, port_security_enabled=False, project_id=, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=3918, status=DOWN, tags=[], tenant_id=, updated_at=2025-10-14T10:25:32Z on network c0145816-4627-44f2-af00-ccc9ef0436ed#033[00m Oct 14 06:25:33 localhost dnsmasq[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/addn_hosts - 2 addresses Oct 14 06:25:33 localhost dnsmasq-dhcp[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/host Oct 14 06:25:33 localhost podman[342580]: 
2025-10-14 10:25:33.08435173 +0000 UTC m=+0.066229307 container kill 27e23fba1f10c3118d508631d350793857acd97085c399f85eed727cd1eab7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0145816-4627-44f2-af00-ccc9ef0436ed, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image) Oct 14 06:25:33 localhost dnsmasq-dhcp[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/opts Oct 14 06:25:33 localhost neutron_dhcp_agent[271983]: 2025-10-14 10:25:33.409 271987 INFO neutron.agent.dhcp.agent [None req-6dd2ae52-0fc6-431d-b4ac-cceb0f758d64 - - - - - -] DHCP configuration for ports {'f163b4ae-4bc4-49c4-bcc6-9b869a6b94b0'} is completed#033[00m Oct 14 06:25:33 localhost nova_compute[297686]: 2025-10-14 10:25:33.626 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:25:34 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e259 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Oct 14 06:25:35 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e260 e260: 6 total, 6 up, 6 in Oct 14 06:25:35 localhost nova_compute[297686]: 2025-10-14 10:25:35.764 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:25:35 localhost nova_compute[297686]: 2025-10-14 10:25:35.891 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:25:36 localhost nova_compute[297686]: 2025-10-14 10:25:36.255 2 DEBUG 
oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:25:37 localhost nova_compute[297686]: 2025-10-14 10:25:37.255 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:25:38 localhost nova_compute[297686]: 2025-10-14 10:25:38.255 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:25:38 localhost nova_compute[297686]: 2025-10-14 10:25:38.256 2 DEBUG nova.compute.manager [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Oct 14 06:25:38 localhost nova_compute[297686]: 2025-10-14 10:25:38.256 2 DEBUG nova.compute.manager [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Oct 14 06:25:38 localhost nova_compute[297686]: 2025-10-14 10:25:38.355 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Acquiring lock "refresh_cache-88c4e366-b765-47a6-96bf-f7677f2ce67c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Oct 14 06:25:38 localhost nova_compute[297686]: 2025-10-14 10:25:38.356 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Acquired lock 
"refresh_cache-88c4e366-b765-47a6-96bf-f7677f2ce67c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Oct 14 06:25:38 localhost nova_compute[297686]: 2025-10-14 10:25:38.356 2 DEBUG nova.network.neutron [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] [instance: 88c4e366-b765-47a6-96bf-f7677f2ce67c] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Oct 14 06:25:38 localhost nova_compute[297686]: 2025-10-14 10:25:38.357 2 DEBUG nova.objects.instance [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Lazy-loading 'info_cache' on Instance uuid 88c4e366-b765-47a6-96bf-f7677f2ce67c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Oct 14 06:25:38 localhost nova_compute[297686]: 2025-10-14 10:25:38.690 2 DEBUG nova.network.neutron [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] [instance: 88c4e366-b765-47a6-96bf-f7677f2ce67c] Updating instance_info_cache with network_info: [{"id": "3ec9b060-f43d-4698-9c76-6062c70911d5", "address": "fa:16:3e:84:5e:e5", "network": {"id": "7d0cd696-bdd7-4e70-9512-eb0d23640314", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.46", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "41187b090f3d4818a32baa37ce8a3991", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3ec9b060-f4", "ovs_interfaceid": 
"3ec9b060-f43d-4698-9c76-6062c70911d5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Oct 14 06:25:38 localhost nova_compute[297686]: 2025-10-14 10:25:38.704 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Releasing lock "refresh_cache-88c4e366-b765-47a6-96bf-f7677f2ce67c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Oct 14 06:25:38 localhost nova_compute[297686]: 2025-10-14 10:25:38.704 2 DEBUG nova.compute.manager [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] [instance: 88c4e366-b765-47a6-96bf-f7677f2ce67c] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Oct 14 06:25:38 localhost openstack_network_exporter[250374]: ERROR 10:25:38 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Oct 14 06:25:38 localhost openstack_network_exporter[250374]: ERROR 10:25:38 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Oct 14 06:25:38 localhost openstack_network_exporter[250374]: ERROR 10:25:38 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Oct 14 06:25:38 localhost openstack_network_exporter[250374]: ERROR 10:25:38 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Oct 14 06:25:38 localhost openstack_network_exporter[250374]: Oct 14 06:25:38 localhost openstack_network_exporter[250374]: ERROR 10:25:38 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Oct 14 06:25:38 localhost openstack_network_exporter[250374]: Oct 14 06:25:38 localhost ovn_controller[157396]: 
2025-10-14T10:25:38Z|00296|binding|INFO|Releasing lport 25c6586a-239c-451b-aac2-e0a3ee5c3145 from this chassis (sb_readonly=0) Oct 14 06:25:38 localhost nova_compute[297686]: 2025-10-14 10:25:38.961 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:25:39 localhost systemd[1]: tmp-crun.5Tbbxt.mount: Deactivated successfully. Oct 14 06:25:39 localhost dnsmasq[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/addn_hosts - 1 addresses Oct 14 06:25:39 localhost podman[342617]: 2025-10-14 10:25:39.002864224 +0000 UTC m=+0.069564051 container kill 27e23fba1f10c3118d508631d350793857acd97085c399f85eed727cd1eab7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0145816-4627-44f2-af00-ccc9ef0436ed, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2) Oct 14 06:25:39 localhost dnsmasq-dhcp[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/host Oct 14 06:25:39 localhost dnsmasq-dhcp[325837]: read /var/lib/neutron/dhcp/c0145816-4627-44f2-af00-ccc9ef0436ed/opts Oct 14 06:25:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041. Oct 14 06:25:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857. 
Oct 14 06:25:39 localhost podman[342630]: 2025-10-14 10:25:39.103967993 +0000 UTC m=+0.078048891 container health_status 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter) Oct 14 06:25:39 localhost podman[342630]: 2025-10-14 10:25:39.109481172 +0000 UTC m=+0.083562080 container exec_died 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': 
['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Oct 14 06:25:39 localhost systemd[1]: 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041.service: Deactivated successfully. Oct 14 06:25:39 localhost podman[342631]: 2025-10-14 10:25:39.160998987 +0000 UTC m=+0.131100772 container health_status 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true) Oct 14 06:25:39 localhost podman[342631]: 2025-10-14 10:25:39.191318479 +0000 UTC m=+0.161420264 container exec_died 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, 
tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0) Oct 14 06:25:39 localhost systemd[1]: 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857.service: Deactivated successfully. Oct 14 06:25:39 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e260 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Oct 14 06:25:40 localhost nova_compute[297686]: 2025-10-14 10:25:40.255 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:25:40 localhost nova_compute[297686]: 2025-10-14 10:25:40.280 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Oct 14 06:25:40 localhost nova_compute[297686]: 2025-10-14 10:25:40.281 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Oct 14 06:25:40 localhost nova_compute[297686]: 2025-10-14 10:25:40.281 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Oct 14 06:25:40 localhost nova_compute[297686]: 2025-10-14 
10:25:40.281 2 DEBUG nova.compute.resource_tracker [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Auditing locally available compute resources for np0005486733.localdomain (node: np0005486733.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Oct 14 06:25:40 localhost nova_compute[297686]: 2025-10-14 10:25:40.282 2 DEBUG oslo_concurrency.processutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Oct 14 06:25:40 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Oct 14 06:25:40 localhost ceph-mon[317114]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/909385724' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Oct 14 06:25:40 localhost nova_compute[297686]: 2025-10-14 10:25:40.709 2 DEBUG oslo_concurrency.processutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.427s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Oct 14 06:25:40 localhost nova_compute[297686]: 2025-10-14 10:25:40.784 2 DEBUG nova.virt.libvirt.driver [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Oct 14 06:25:40 localhost nova_compute[297686]: 2025-10-14 10:25:40.785 2 DEBUG nova.virt.libvirt.driver [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config 
/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Oct 14 06:25:40 localhost nova_compute[297686]: 2025-10-14 10:25:40.893 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:25:40 localhost nova_compute[297686]: 2025-10-14 10:25:40.897 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:25:40 localhost nova_compute[297686]: 2025-10-14 10:25:40.985 2 WARNING nova.virt.libvirt.driver [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m Oct 14 06:25:40 localhost nova_compute[297686]: 2025-10-14 10:25:40.988 2 DEBUG nova.compute.resource_tracker [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Hypervisor/Node resource view: name=np0005486733.localdomain free_ram=11116MB free_disk=41.83695602416992GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": 
"0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Oct 14 06:25:40 localhost nova_compute[297686]: 2025-10-14 10:25:40.988 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Oct 14 06:25:40 localhost nova_compute[297686]: 2025-10-14 10:25:40.989 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Oct 14 06:25:41 localhost nova_compute[297686]: 2025-10-14 10:25:41.081 2 DEBUG nova.compute.resource_tracker [None 
req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Instance 88c4e366-b765-47a6-96bf-f7677f2ce67c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Oct 14 06:25:41 localhost nova_compute[297686]: 2025-10-14 10:25:41.082 2 DEBUG nova.compute.resource_tracker [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Oct 14 06:25:41 localhost nova_compute[297686]: 2025-10-14 10:25:41.082 2 DEBUG nova.compute.resource_tracker [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Final resource view: name=np0005486733.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Oct 14 06:25:41 localhost nova_compute[297686]: 2025-10-14 10:25:41.125 2 DEBUG oslo_concurrency.processutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Oct 14 06:25:41 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Oct 14 06:25:41 localhost ceph-mon[317114]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.108:0/1267917837' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Oct 14 06:25:41 localhost nova_compute[297686]: 2025-10-14 10:25:41.582 2 DEBUG oslo_concurrency.processutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.457s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Oct 14 06:25:41 localhost nova_compute[297686]: 2025-10-14 10:25:41.589 2 DEBUG nova.compute.provider_tree [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Inventory has not changed in ProviderTree for provider: 18c24273-aca2-4f08-be57-3188d558235e update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Oct 14 06:25:41 localhost nova_compute[297686]: 2025-10-14 10:25:41.616 2 DEBUG nova.scheduler.client.report [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Inventory has not changed for provider 18c24273-aca2-4f08-be57-3188d558235e based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Oct 14 06:25:41 localhost nova_compute[297686]: 2025-10-14 10:25:41.618 2 DEBUG nova.compute.resource_tracker [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Compute_service record updated for np0005486733.localdomain:np0005486733.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Oct 14 06:25:41 localhost nova_compute[297686]: 2025-10-14 10:25:41.619 2 DEBUG oslo_concurrency.lockutils [None 
req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.630s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Oct 14 06:25:42 localhost nova_compute[297686]: 2025-10-14 10:25:42.615 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:25:42 localhost nova_compute[297686]: 2025-10-14 10:25:42.616 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:25:42 localhost nova_compute[297686]: 2025-10-14 10:25:42.617 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:25:42 localhost nova_compute[297686]: 2025-10-14 10:25:42.617 2 DEBUG nova.compute.manager [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... 
_reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Oct 14 06:25:42 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Oct 14 06:25:42 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' Oct 14 06:25:42 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e261 e261: 6 total, 6 up, 6 in Oct 14 06:25:44 localhost nova_compute[297686]: 2025-10-14 10:25:44.255 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:25:44 localhost nova_compute[297686]: 2025-10-14 10:25:44.256 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:25:44 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e261 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Oct 14 06:25:45 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' Oct 14 06:25:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad. Oct 14 06:25:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec. Oct 14 06:25:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29. 
Oct 14 06:25:45 localhost podman[342804]: 2025-10-14 10:25:45.739686832 +0000 UTC m=+0.076815603 container health_status 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.schema-version=1.0, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, org.label-schema.vendor=CentOS) Oct 14 06:25:45 localhost podman[342804]: 2025-10-14 10:25:45.78029137 +0000 UTC m=+0.117420131 container exec_died 
02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251009) Oct 14 06:25:45 localhost systemd[1]: tmp-crun.s6HuHZ.mount: Deactivated successfully. Oct 14 06:25:45 localhost systemd[1]: 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad.service: Deactivated successfully. 
Oct 14 06:25:45 localhost podman[342805]: 2025-10-14 10:25:45.806640341 +0000 UTC m=+0.142651598 container health_status aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter) Oct 14 06:25:45 localhost podman[342805]: 2025-10-14 10:25:45.817038981 +0000 UTC m=+0.153050228 container exec_died aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 
'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Oct 14 06:25:45 localhost systemd[1]: aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec.service: Deactivated successfully. 
Oct 14 06:25:45 localhost nova_compute[297686]: 2025-10-14 10:25:45.897 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:25:45 localhost podman[342806]: 2025-10-14 10:25:45.904275874 +0000 UTC m=+0.239565029 container health_status fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, container_name=iscsid, org.label-schema.license=GPLv2, managed_by=edpm_ansible, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251009, maintainer=OpenStack Kubernetes Operator team) Oct 14 06:25:45 localhost podman[342806]: 
2025-10-14 10:25:45.942321283 +0000 UTC m=+0.277610418 container exec_died fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=iscsid, org.label-schema.build-date=20251009, io.buildah.version=1.41.3, tcib_managed=true, config_id=iscsid, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}) Oct 14 06:25:45 localhost systemd[1]: fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29.service: Deactivated successfully. 
Oct 14 06:25:48 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Oct 14 06:25:48 localhost ceph-mon[317114]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2554885512' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Oct 14 06:25:48 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Oct 14 06:25:48 localhost ceph-mon[317114]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2554885512' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Oct 14 06:25:49 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e261 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Oct 14 06:25:50 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e262 e262: 6 total, 6 up, 6 in Oct 14 06:25:50 localhost nova_compute[297686]: 2025-10-14 10:25:50.913 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Oct 14 06:25:50 localhost nova_compute[297686]: 2025-10-14 10:25:50.916 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Oct 14 06:25:50 localhost nova_compute[297686]: 2025-10-14 10:25:50.917 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5013 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Oct 14 06:25:50 localhost nova_compute[297686]: 2025-10-14 10:25:50.917 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Oct 14 06:25:50 localhost nova_compute[297686]: 2025-10-14 10:25:50.918 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 
tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Oct 14 06:25:50 localhost nova_compute[297686]: 2025-10-14 10:25:50.921 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:25:51 localhost sshd[342865]: main: sshd: ssh-rsa algorithm is disabled Oct 14 06:25:51 localhost sshd[342866]: main: sshd: ssh-rsa algorithm is disabled Oct 14 06:25:54 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Oct 14 06:25:55 localhost nova_compute[297686]: 2025-10-14 10:25:55.918 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:25:57 localhost ovn_metadata_agent[163050]: 2025-10-14 10:25:57.795 163055 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Oct 14 06:25:57 localhost ovn_metadata_agent[163050]: 2025-10-14 10:25:57.796 163055 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Oct 14 06:25:57 localhost ovn_metadata_agent[163050]: 2025-10-14 10:25:57.796 163055 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Oct 14 06:25:58 localhost podman[248187]: time="2025-10-14T10:25:58Z" level=info msg="List containers: received `last` 
parameter - overwriting `limit`" Oct 14 06:25:58 localhost podman[248187]: @ - - [14/Oct/2025:10:25:58 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 147497 "" "Go-http-client/1.1" Oct 14 06:25:58 localhost podman[248187]: @ - - [14/Oct/2025:10:25:58 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19870 "" "Go-http-client/1.1" Oct 14 06:25:59 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Oct 14 06:26:00 localhost nova_compute[297686]: 2025-10-14 10:26:00.922 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:26:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f. Oct 14 06:26:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917. Oct 14 06:26:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d. Oct 14 06:26:01 localhost systemd[1]: tmp-crun.VuSQIg.mount: Deactivated successfully. 
Oct 14 06:26:01 localhost podman[342869]: 2025-10-14 10:26:01.750302163 +0000 UTC m=+0.086010496 container health_status 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, config_id=edpm, vendor=Red Hat, Inc., version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, 
name=ubi9-minimal, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, maintainer=Red Hat, Inc., release=1755695350, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=) Oct 14 06:26:01 localhost podman[342869]: 2025-10-14 10:26:01.79314083 +0000 UTC m=+0.128849073 container exec_died 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, distribution-scope=public, version=9.6, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, vcs-type=git, vendor=Red Hat, Inc., managed_by=edpm_ansible, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers) Oct 14 06:26:01 localhost systemd[1]: 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917.service: Deactivated successfully. Oct 14 06:26:01 localhost podman[342870]: 2025-10-14 10:26:01.774051794 +0000 UTC m=+0.102044359 container health_status 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, org.label-schema.build-date=20251009, 
io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image) Oct 14 06:26:01 localhost podman[342870]: 2025-10-14 10:26:01.853218699 +0000 UTC m=+0.181211194 container exec_died 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, org.label-schema.build-date=20251009, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, 
container_name=ceilometer_agent_compute, managed_by=edpm_ansible) Oct 14 06:26:01 localhost systemd[1]: 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d.service: Deactivated successfully. Oct 14 06:26:01 localhost podman[342868]: 2025-10-14 10:26:01.803490939 +0000 UTC m=+0.142935386 container health_status 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team) Oct 14 06:26:01 localhost podman[342868]: 2025-10-14 10:26:01.938201672 +0000 UTC m=+0.277646089 container exec_died 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_managed=true, 
container_name=ovn_controller, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}) Oct 14 06:26:01 localhost systemd[1]: 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f.service: Deactivated successfully. 
Oct 14 06:26:04 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Oct 14 06:26:05 localhost nova_compute[297686]: 2025-10-14 10:26:05.925 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:26:08 localhost openstack_network_exporter[250374]: ERROR 10:26:08 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Oct 14 06:26:08 localhost openstack_network_exporter[250374]: ERROR 10:26:08 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Oct 14 06:26:08 localhost openstack_network_exporter[250374]: ERROR 10:26:08 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Oct 14 06:26:08 localhost openstack_network_exporter[250374]: ERROR 10:26:08 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Oct 14 06:26:08 localhost openstack_network_exporter[250374]: Oct 14 06:26:08 localhost openstack_network_exporter[250374]: ERROR 10:26:08 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Oct 14 06:26:08 localhost openstack_network_exporter[250374]: Oct 14 06:26:09 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Oct 14 06:26:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041. Oct 14 06:26:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857. 
Oct 14 06:26:09 localhost ovn_controller[157396]: 2025-10-14T10:26:09Z|00297|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory Oct 14 06:26:09 localhost podman[342930]: 2025-10-14 10:26:09.735568173 +0000 UTC m=+0.076677260 container health_status 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Oct 14 06:26:09 localhost podman[342930]: 2025-10-14 10:26:09.747034396 +0000 UTC m=+0.088143473 container exec_died 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': 
'/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Oct 14 06:26:09 localhost systemd[1]: 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041.service: Deactivated successfully. Oct 14 06:26:09 localhost podman[342931]: 2025-10-14 10:26:09.793541235 +0000 UTC m=+0.131980049 container health_status 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS) Oct 14 06:26:09 localhost podman[342931]: 2025-10-14 10:26:09.8291458 +0000 UTC m=+0.167584594 container exec_died 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251009, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent) Oct 14 06:26:09 localhost systemd[1]: 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857.service: Deactivated successfully. Oct 14 06:26:10 localhost nova_compute[297686]: 2025-10-14 10:26:10.927 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:26:12 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e263 e263: 6 total, 6 up, 6 in Oct 14 06:26:14 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e263 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Oct 14 06:26:15 localhost nova_compute[297686]: 2025-10-14 10:26:15.931 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:26:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad. Oct 14 06:26:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec. Oct 14 06:26:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29. Oct 14 06:26:16 localhost systemd[1]: tmp-crun.yjLnLY.mount: Deactivated successfully. 
Oct 14 06:26:16 localhost podman[342969]: 2025-10-14 10:26:16.761909174 +0000 UTC m=+0.093911969 container health_status aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter) Oct 14 06:26:16 localhost podman[342969]: 2025-10-14 10:26:16.801024547 +0000 UTC m=+0.133027302 container exec_died aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 
'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm) Oct 14 06:26:16 localhost systemd[1]: aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec.service: Deactivated successfully. 
Oct 14 06:26:16 localhost podman[342968]: 2025-10-14 10:26:16.806164365 +0000 UTC m=+0.142769161 container health_status 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=multipathd, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251009, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible) Oct 14 06:26:16 localhost podman[342970]: 2025-10-14 10:26:16.857169133 +0000 UTC m=+0.186235918 container health_status 
fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_managed=true, config_id=iscsid, container_name=iscsid, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d) Oct 14 06:26:16 localhost podman[342970]: 2025-10-14 10:26:16.866061537 +0000 UTC m=+0.195128322 container exec_died fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, config_id=iscsid, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, 
managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team) Oct 14 06:26:16 localhost systemd[1]: fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29.service: Deactivated successfully. 
Oct 14 06:26:16 localhost podman[342968]: 2025-10-14 10:26:16.939268268 +0000 UTC m=+0.275873054 container exec_died 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.build-date=20251009, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true) Oct 14 06:26:16 localhost systemd[1]: 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad.service: Deactivated successfully. 
Oct 14 06:26:17 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e264 e264: 6 total, 6 up, 6 in 
Oct 14 06:26:19 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e264 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 
Oct 14 06:26:20 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e265 e265: 6 total, 6 up, 6 in 
Oct 14 06:26:20 localhost nova_compute[297686]: 2025-10-14 10:26:20.934 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m 
Oct 14 06:26:24 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e265 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 
Oct 14 06:26:25 localhost nova_compute[297686]: 2025-10-14 10:26:25.937 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m 
Oct 14 06:26:27 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e266 e266: 6 total, 6 up, 6 in 
Oct 14 06:26:28 localhost podman[248187]: time="2025-10-14T10:26:28Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" 
Oct 14 06:26:28 localhost podman[248187]: @ - - [14/Oct/2025:10:26:28 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 147497 "" "Go-http-client/1.1" 
Oct 14 06:26:28 localhost podman[248187]: @ - - [14/Oct/2025:10:26:28 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19873 "" "Go-http-client/1.1" 
Oct 14 06:26:29 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e266 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 
Oct 14 06:26:30 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e267 e267: 6 total, 6 up, 6 in 
Oct 14 06:26:30 localhost nova_compute[297686]: 2025-10-14 10:26:30.942 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m 
Oct 14 06:26:30 localhost nova_compute[297686]: 2025-10-14 10:26:30.943 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m 
Oct 14 06:26:30 localhost nova_compute[297686]: 2025-10-14 10:26:30.943 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m 
Oct 14 06:26:30 localhost nova_compute[297686]: 2025-10-14 10:26:30.943 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m 
Oct 14 06:26:30 localhost nova_compute[297686]: 2025-10-14 10:26:30.943 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m 
Oct 14 06:26:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f. 
Oct 14 06:26:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917. 
Oct 14 06:26:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d. 
Oct 14 06:26:32 localhost systemd[1]: tmp-crun.iyNWVL.mount: Deactivated successfully. 
Oct 14 06:26:32 localhost podman[343029]: 2025-10-14 10:26:32.736898969 +0000 UTC m=+0.078412492 container health_status 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, managed_by=edpm_ansible, config_id=ovn_controller, io.buildah.version=1.41.3, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}) Oct 14 06:26:32 localhost podman[343030]: 2025-10-14 10:26:32.795551083 +0000 UTC m=+0.129405890 container health_status 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, config_id=edpm, maintainer=Red Hat, Inc., name=ubi9-minimal, version=9.6, vcs-type=git, vendor=Red Hat, Inc., io.buildah.version=1.33.7, 
io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, build-date=2025-08-20T13:12:41) Oct 14 06:26:32 localhost podman[343030]: 2025-10-14 10:26:32.80942106 +0000 UTC m=+0.143275937 container exec_died 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', 
'/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, vcs-type=git, release=1755695350, architecture=x86_64, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, distribution-scope=public, container_name=openstack_network_exporter, maintainer=Red Hat, Inc.) Oct 14 06:26:32 localhost systemd[1]: 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917.service: Deactivated successfully. 
Oct 14 06:26:32 localhost podman[343036]: 2025-10-14 10:26:32.845218751 +0000 UTC m=+0.172977011 container health_status 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm) Oct 14 06:26:32 localhost podman[343036]: 2025-10-14 10:26:32.886039486 +0000 UTC m=+0.213797716 container exec_died 
89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.build-date=20251009, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, tcib_managed=true, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0) Oct 14 06:26:32 localhost systemd[1]: 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d.service: Deactivated successfully. 
Oct 14 06:26:32 localhost podman[343029]: 2025-10-14 10:26:32.898292523 +0000 UTC m=+0.239806076 container exec_died 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.schema-version=1.0) Oct 14 06:26:32 localhost systemd[1]: 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f.service: Deactivated successfully. Oct 14 06:26:33 localhost systemd[1]: tmp-crun.cEpjHe.mount: Deactivated successfully. 
Oct 14 06:26:34 localhost ceph-mon[317114]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- 
Oct 14 06:26:34 localhost ceph-mon[317114]: rocksdb: [db/db_impl/db_impl.cc:1111] 
** DB Stats **
Uptime(secs): 1200.0 total, 600.0 interval
Cumulative writes: 4738 writes, 35K keys, 4738 commit groups, 1.0 writes per commit group, ingest: 0.05 GB, 0.04 MB/s
Cumulative WAL: 4738 writes, 4738 syncs, 1.00 writes per sync, written: 0.05 GB, 0.04 MB/s
Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
Interval writes: 2574 writes, 13K keys, 2574 commit groups, 1.0 writes per commit group, ingest: 17.79 MB, 0.03 MB/s
Interval WAL: 2574 writes, 2574 syncs, 1.00 writes per sync, written: 0.02 GB, 0.03 MB/s
Interval stall: 00:00:0.000 H:M:S, 0.0 percent

** Compaction Stats [default] **
Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
 L0 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 178.7 0.20 0.09 15 0.013 0 0 0.0 0.0
 L6 1/0 16.11 MB 0.0 0.2 0.0 0.2 0.2 0.0 0.0 6.7 211.0 195.2 1.20 0.63 14 0.085 189K 7195 0.0 0.0
 Sum 1/0 16.11 MB 0.0 0.2 0.0 0.2 0.3 0.0 0.0 7.7 181.3 192.8 1.39 0.72 29 0.048 189K 7195 0.0 0.0
 Int 0/0 0.00 KB 0.0 0.1 0.0 0.1 0.1 0.0 0.0 13.1 189.0 191.6 0.65 0.34 14 0.046 99K 3778 0.0 0.0

** Compaction Stats [default] **
Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
 Low 0/0 0.00 KB 0.0 0.2 0.0 0.2 0.2 0.0 0.0 0.0 211.0 195.2 1.20 0.63 14 0.085 189K 7195 0.0 0.0
High 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 181.5 0.19 0.09 14 0.014 0 0 0.0 0.0
User 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.6 0.00 0.00 1 0.003 0 0 0.0 0.0

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 1200.0 total, 600.0 interval
Flush(GB): cumulative 0.034, interval 0.009
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.26 GB write, 0.22 MB/s write, 0.25 GB read, 0.21 MB/s read, 1.4 seconds
Interval compaction: 0.12 GB write, 0.21 MB/s write, 0.12 GB read, 0.20 MB/s read, 0.7 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
Block cache BinnedLRUCache@0x55c2d5d93350#2 capacity: 304.00 MB usage: 20.02 MB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 0 last_secs: 0.000257 secs_since: 0
Block cache entry stats(count,size,portion): DataBlock(1132,18.81 MB,6.18824%) FilterBlock(29,545.98 KB,0.175391%) IndexBlock(29,688.17 KB,0.221067%) Misc(1,0.00 KB,0%)

** File Read Latency Histogram By Level [default] ** 
Oct 14 06:26:34 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e267 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 
Oct 14 06:26:35 localhost nova_compute[297686]: 2025-10-14 
10:26:35.946 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Oct 14 06:26:35 localhost nova_compute[297686]: 2025-10-14 10:26:35.947 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Oct 14 06:26:35 localhost nova_compute[297686]: 2025-10-14 10:26:35.947 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Oct 14 06:26:35 localhost nova_compute[297686]: 2025-10-14 10:26:35.948 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Oct 14 06:26:35 localhost nova_compute[297686]: 2025-10-14 10:26:35.964 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:26:35 localhost nova_compute[297686]: 2025-10-14 10:26:35.965 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Oct 14 06:26:37 localhost nova_compute[297686]: 2025-10-14 10:26:37.256 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:26:37 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e268 e268: 6 total, 6 up, 6 in Oct 14 06:26:38 localhost nova_compute[297686]: 2025-10-14 10:26:38.255 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 
06:26:38 localhost openstack_network_exporter[250374]: ERROR 10:26:38 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Oct 14 06:26:38 localhost openstack_network_exporter[250374]: ERROR 10:26:38 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Oct 14 06:26:38 localhost openstack_network_exporter[250374]: ERROR 10:26:38 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Oct 14 06:26:38 localhost openstack_network_exporter[250374]: ERROR 10:26:38 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Oct 14 06:26:38 localhost openstack_network_exporter[250374]: Oct 14 06:26:38 localhost openstack_network_exporter[250374]: ERROR 10:26:38 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Oct 14 06:26:38 localhost openstack_network_exporter[250374]: Oct 14 06:26:39 localhost nova_compute[297686]: 2025-10-14 10:26:39.255 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:26:39 localhost nova_compute[297686]: 2025-10-14 10:26:39.256 2 DEBUG nova.compute.manager [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Cleaning up deleted instances with incomplete migration _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m Oct 14 06:26:39 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e268 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Oct 14 06:26:40 localhost nova_compute[297686]: 2025-10-14 10:26:40.271 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task 
ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:26:40 localhost nova_compute[297686]: 2025-10-14 10:26:40.271 2 DEBUG nova.compute.manager [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Oct 14 06:26:40 localhost nova_compute[297686]: 2025-10-14 10:26:40.271 2 DEBUG nova.compute.manager [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Oct 14 06:26:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041. Oct 14 06:26:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857. 
Oct 14 06:26:40 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e269 e269: 6 total, 6 up, 6 in
Oct 14 06:26:40 localhost podman[343094]: 2025-10-14 10:26:40.483090027 +0000 UTC m=+0.137571932 container health_status 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi )
Oct 14 06:26:40 localhost podman[343094]: 2025-10-14 10:26:40.494029293 +0000 UTC m=+0.148511238 container exec_died 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible)
Oct 14 06:26:40 localhost systemd[1]: 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041.service: Deactivated successfully.
Oct 14 06:26:40 localhost podman[343095]: 2025-10-14 10:26:40.461944747 +0000 UTC m=+0.115165202 container health_status 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible)
Oct 14 06:26:40 localhost podman[343095]: 2025-10-14 10:26:40.546167147 +0000 UTC m=+0.199387592 container exec_died 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct 14 06:26:40 localhost systemd[1]: 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857.service: Deactivated successfully.
Oct 14 06:26:40 localhost nova_compute[297686]: 2025-10-14 10:26:40.565 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Acquiring lock "refresh_cache-88c4e366-b765-47a6-96bf-f7677f2ce67c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct 14 06:26:40 localhost nova_compute[297686]: 2025-10-14 10:26:40.566 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Acquired lock "refresh_cache-88c4e366-b765-47a6-96bf-f7677f2ce67c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct 14 06:26:40 localhost nova_compute[297686]: 2025-10-14 10:26:40.566 2 DEBUG nova.network.neutron [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] [instance: 88c4e366-b765-47a6-96bf-f7677f2ce67c] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Oct 14 06:26:40 localhost nova_compute[297686]: 2025-10-14 10:26:40.566 2 DEBUG nova.objects.instance [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Lazy-loading 'info_cache' on Instance uuid 88c4e366-b765-47a6-96bf-f7677f2ce67c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct 14 06:26:40 localhost nova_compute[297686]: 2025-10-14 10:26:40.965 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 14 06:26:41 localhost nova_compute[297686]: 2025-10-14 10:26:41.206 2 DEBUG nova.network.neutron [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] [instance: 88c4e366-b765-47a6-96bf-f7677f2ce67c] Updating instance_info_cache with network_info: [{"id": "3ec9b060-f43d-4698-9c76-6062c70911d5", "address": "fa:16:3e:84:5e:e5", "network": {"id": "7d0cd696-bdd7-4e70-9512-eb0d23640314", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.46", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "41187b090f3d4818a32baa37ce8a3991", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3ec9b060-f4", "ovs_interfaceid": "3ec9b060-f43d-4698-9c76-6062c70911d5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct 14 06:26:41 localhost nova_compute[297686]: 2025-10-14 10:26:41.221 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Releasing lock "refresh_cache-88c4e366-b765-47a6-96bf-f7677f2ce67c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct 14 06:26:41 localhost nova_compute[297686]: 2025-10-14 10:26:41.222 2 DEBUG nova.compute.manager [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] [instance: 88c4e366-b765-47a6-96bf-f7677f2ce67c] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Oct 14 06:26:41 localhost nova_compute[297686]: 2025-10-14 10:26:41.223 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 14 06:26:41 localhost nova_compute[297686]: 2025-10-14 10:26:41.223 2 DEBUG nova.compute.manager [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Oct 14 06:26:41 localhost nova_compute[297686]: 2025-10-14 10:26:41.237 2 DEBUG nova.compute.manager [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Oct 14 06:26:41 localhost nova_compute[297686]: 2025-10-14 10:26:41.270 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 14 06:26:41 localhost nova_compute[297686]: 2025-10-14 10:26:41.271 2 DEBUG nova.compute.manager [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct 14 06:26:41 localhost nova_compute[297686]: 2025-10-14 10:26:41.272 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 14 06:26:41 localhost nova_compute[297686]: 2025-10-14 10:26:41.294 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct 14 06:26:41 localhost nova_compute[297686]: 2025-10-14 10:26:41.295 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct 14 06:26:41 localhost nova_compute[297686]: 2025-10-14 10:26:41.295 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct 14 06:26:41 localhost nova_compute[297686]: 2025-10-14 10:26:41.295 2 DEBUG nova.compute.resource_tracker [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Auditing locally available compute resources for np0005486733.localdomain (node: np0005486733.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct 14 06:26:41 localhost nova_compute[297686]: 2025-10-14 10:26:41.296 2 DEBUG oslo_concurrency.processutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct 14 06:26:41 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e270 e270: 6 total, 6 up, 6 in
Oct 14 06:26:41 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Oct 14 06:26:41 localhost ceph-mon[317114]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/468826846' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Oct 14 06:26:41 localhost nova_compute[297686]: 2025-10-14 10:26:41.736 2 DEBUG oslo_concurrency.processutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.440s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct 14 06:26:41 localhost nova_compute[297686]: 2025-10-14 10:26:41.823 2 DEBUG nova.virt.libvirt.driver [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Oct 14 06:26:41 localhost nova_compute[297686]: 2025-10-14 10:26:41.824 2 DEBUG nova.virt.libvirt.driver [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Oct 14 06:26:42 localhost nova_compute[297686]: 2025-10-14 10:26:42.022 2 WARNING nova.virt.libvirt.driver [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct 14 06:26:42 localhost nova_compute[297686]: 2025-10-14 10:26:42.023 2 DEBUG nova.compute.resource_tracker [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Hypervisor/Node resource view: name=np0005486733.localdomain free_ram=11102MB free_disk=41.83695602416992GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct 14 06:26:42 localhost nova_compute[297686]: 2025-10-14 10:26:42.024 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct 14 06:26:42 localhost nova_compute[297686]: 2025-10-14 10:26:42.024 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct 14 06:26:42 localhost nova_compute[297686]: 2025-10-14 10:26:42.351 2 DEBUG nova.compute.resource_tracker [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Instance 88c4e366-b765-47a6-96bf-f7677f2ce67c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct 14 06:26:42 localhost nova_compute[297686]: 2025-10-14 10:26:42.351 2 DEBUG nova.compute.resource_tracker [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct 14 06:26:42 localhost nova_compute[297686]: 2025-10-14 10:26:42.352 2 DEBUG nova.compute.resource_tracker [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Final resource view: name=np0005486733.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct 14 06:26:42 localhost nova_compute[297686]: 2025-10-14 10:26:42.563 2 DEBUG nova.scheduler.client.report [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Refreshing inventories for resource provider 18c24273-aca2-4f08-be57-3188d558235e _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Oct 14 06:26:42 localhost nova_compute[297686]: 2025-10-14 10:26:42.876 2 DEBUG nova.scheduler.client.report [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Updating ProviderTree inventory for provider 18c24273-aca2-4f08-be57-3188d558235e from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Oct 14 06:26:42 localhost nova_compute[297686]: 2025-10-14 10:26:42.877 2 DEBUG nova.compute.provider_tree [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Updating inventory in ProviderTree for provider 18c24273-aca2-4f08-be57-3188d558235e with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Oct 14 06:26:42 localhost nova_compute[297686]: 2025-10-14 10:26:42.891 2 DEBUG nova.scheduler.client.report [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Refreshing aggregate associations for resource provider 18c24273-aca2-4f08-be57-3188d558235e, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Oct 14 06:26:42 localhost nova_compute[297686]: 2025-10-14 10:26:42.910 2 DEBUG nova.scheduler.client.report [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Refreshing trait associations for resource provider 18c24273-aca2-4f08-be57-3188d558235e, traits: COMPUTE_VOLUME_EXTEND,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE,HW_CPU_X86_BMI,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_MMX,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_SECURITY_TPM_2_0,HW_CPU_X86_FMA3,COMPUTE_GRAPHICS_MODEL_BOCHS,HW_CPU_X86_AVX2,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_SECURITY_TPM_1_2,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_AMD_SVM,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_ACCELERATORS,HW_CPU_X86_AESNI,COMPUTE_STORAGE_BUS_FDC,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_F16C,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SVM,COMPUTE_STORAGE_BUS_IDE,HW_CPU_X86_SSE4A,HW_CPU_X86_SSSE3,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NODE,HW_CPU_X86_BMI2,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_TRUSTED_CERTS,COMPUTE_STORAGE_BUS_SATA,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_STORAGE_BUS_USB,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_ABM,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_SSE2,HW_CPU_X86_AVX,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_CLMUL,HW_CPU_X86_SHA,COMPUTE_IMAGE_TYPE_RAW _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Oct 14 06:26:42 localhost nova_compute[297686]: 2025-10-14 10:26:42.956 2 DEBUG oslo_concurrency.processutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct 14 06:26:43 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Oct 14 06:26:43 localhost ceph-mon[317114]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/735838871' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Oct 14 06:26:43 localhost nova_compute[297686]: 2025-10-14 10:26:43.396 2 DEBUG oslo_concurrency.processutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.440s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct 14 06:26:43 localhost nova_compute[297686]: 2025-10-14 10:26:43.404 2 DEBUG nova.compute.provider_tree [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Inventory has not changed in ProviderTree for provider: 18c24273-aca2-4f08-be57-3188d558235e update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct 14 06:26:43 localhost nova_compute[297686]: 2025-10-14 10:26:43.425 2 DEBUG nova.scheduler.client.report [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Inventory has not changed for provider 18c24273-aca2-4f08-be57-3188d558235e based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct 14 06:26:43 localhost nova_compute[297686]: 2025-10-14 10:26:43.428 2 DEBUG nova.compute.resource_tracker [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Compute_service record updated for np0005486733.localdomain:np0005486733.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct 14 06:26:43 localhost nova_compute[297686]: 2025-10-14 10:26:43.429 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.405s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct 14 06:26:44 localhost nova_compute[297686]: 2025-10-14 10:26:44.415 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 14 06:26:44 localhost nova_compute[297686]: 2025-10-14 10:26:44.416 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 14 06:26:44 localhost nova_compute[297686]: 2025-10-14 10:26:44.434 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 14 06:26:44 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e270 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct 14 06:26:44 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Oct 14 06:26:44 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz'
Oct 14 06:26:44 localhost ceph-mon[317114]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #55. Immutable memtables: 0.
Oct 14 06:26:44 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:26:44.616536) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0 Oct 14 06:26:44 localhost ceph-mon[317114]: rocksdb: [db/flush_job.cc:856] [default] [JOB 31] Flushing memtable with next log file: 55 Oct 14 06:26:44 localhost ceph-mon[317114]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760437604616642, "job": 31, "event": "flush_started", "num_memtables": 1, "num_entries": 2594, "num_deletes": 260, "total_data_size": 4637064, "memory_usage": 4690368, "flush_reason": "Manual Compaction"} Oct 14 06:26:44 localhost ceph-mon[317114]: rocksdb: [db/flush_job.cc:885] [default] [JOB 31] Level-0 flush table #56: started Oct 14 06:26:44 localhost ceph-mon[317114]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760437604640574, "cf_name": "default", "job": 31, "event": "table_file_creation", "file_number": 56, "file_size": 3023266, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 33670, "largest_seqno": 36259, "table_properties": {"data_size": 3013296, "index_size": 6219, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2693, "raw_key_size": 23139, "raw_average_key_size": 21, "raw_value_size": 2992579, "raw_average_value_size": 2836, "num_data_blocks": 264, "num_entries": 1055, "num_filter_entries": 1055, "num_deletions": 260, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; 
max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1760437455, "oldest_key_time": 1760437455, "file_creation_time": 1760437604, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "c006d4a7-7ae8-4cd3-a1b5-95f5bf3c427c", "db_session_id": "ZS6W3A0Q266OBGAU6ARP", "orig_file_number": 56, "seqno_to_time_mapping": "N/A"}} Oct 14 06:26:44 localhost ceph-mon[317114]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 31] Flush lasted 24170 microseconds, and 9707 cpu microseconds. Oct 14 06:26:44 localhost ceph-mon[317114]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. Oct 14 06:26:44 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:26:44.640665) [db/flush_job.cc:967] [default] [JOB 31] Level-0 flush table #56: 3023266 bytes OK Oct 14 06:26:44 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:26:44.640755) [db/memtable_list.cc:519] [default] Level-0 commit table #56 started Oct 14 06:26:44 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:26:44.642961) [db/memtable_list.cc:722] [default] Level-0 commit table #56: memtable #1 done Oct 14 06:26:44 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:26:44.642985) EVENT_LOG_v1 {"time_micros": 1760437604642978, "job": 31, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0} Oct 14 06:26:44 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:26:44.643019) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25 Oct 14 06:26:44 localhost ceph-mon[317114]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 31] Try to delete WAL files size 4625258, prev total WAL file 
size 4625258, number of live WAL files 2. Oct 14 06:26:44 localhost ceph-mon[317114]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005486733/store.db/000052.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Oct 14 06:26:44 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:26:44.644270) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003133353534' seq:72057594037927935, type:22 .. '7061786F73003133383036' seq:0, type:0; will stop at (end) Oct 14 06:26:44 localhost ceph-mon[317114]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 32] Compacting 1@0 + 1@6 files to L6, score -1.00 Oct 14 06:26:44 localhost ceph-mon[317114]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 31 Base level 0, inputs: [56(2952KB)], [54(16MB)] Oct 14 06:26:44 localhost ceph-mon[317114]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760437604644324, "job": 32, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [56], "files_L6": [54], "score": -1, "input_data_size": 19915712, "oldest_snapshot_seqno": -1} Oct 14 06:26:44 localhost ceph-mon[317114]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 32] Generated table #57: 14809 keys, 18590008 bytes, temperature: kUnknown Oct 14 06:26:44 localhost ceph-mon[317114]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760437604728349, "cf_name": "default", "job": 32, "event": "table_file_creation", "file_number": 57, "file_size": 18590008, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 18505339, "index_size": 46552, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 37061, "raw_key_size": 398069, "raw_average_key_size": 26, "raw_value_size": 
18253810, "raw_average_value_size": 1232, "num_data_blocks": 1720, "num_entries": 14809, "num_filter_entries": 14809, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1760436394, "oldest_key_time": 0, "file_creation_time": 1760437604, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "c006d4a7-7ae8-4cd3-a1b5-95f5bf3c427c", "db_session_id": "ZS6W3A0Q266OBGAU6ARP", "orig_file_number": 57, "seqno_to_time_mapping": "N/A"}} Oct 14 06:26:44 localhost ceph-mon[317114]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. 
Oct 14 06:26:44 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:26:44.728947) [db/compaction/compaction_job.cc:1663] [default] [JOB 32] Compacted 1@0 + 1@6 files to L6 => 18590008 bytes Oct 14 06:26:44 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:26:44.730830) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 236.5 rd, 220.7 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.9, 16.1 +0.0 blob) out(17.7 +0.0 blob), read-write-amplify(12.7) write-amplify(6.1) OK, records in: 15351, records dropped: 542 output_compression: NoCompression Oct 14 06:26:44 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:26:44.730863) EVENT_LOG_v1 {"time_micros": 1760437604730848, "job": 32, "event": "compaction_finished", "compaction_time_micros": 84220, "compaction_time_cpu_micros": 52337, "output_level": 6, "num_output_files": 1, "total_output_size": 18590008, "num_input_records": 15351, "num_output_records": 14809, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]} Oct 14 06:26:44 localhost ceph-mon[317114]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005486733/store.db/000056.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Oct 14 06:26:44 localhost ceph-mon[317114]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760437604731522, "job": 32, "event": "table_file_deletion", "file_number": 56} Oct 14 06:26:44 localhost ceph-mon[317114]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005486733/store.db/000054.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Oct 14 06:26:44 localhost ceph-mon[317114]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760437604734370, 
"job": 32, "event": "table_file_deletion", "file_number": 54} Oct 14 06:26:44 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:26:44.644170) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Oct 14 06:26:44 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:26:44.734487) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Oct 14 06:26:44 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:26:44.734494) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Oct 14 06:26:44 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:26:44.734498) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Oct 14 06:26:44 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:26:44.734501) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Oct 14 06:26:44 localhost ceph-mon[317114]: rocksdb: (Original Log Time 2025/10/14-10:26:44.734504) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Oct 14 06:26:45 localhost nova_compute[297686]: 2025-10-14 10:26:45.255 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:26:45 localhost nova_compute[297686]: 2025-10-14 10:26:45.255 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:26:45 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' Oct 14 06:26:45 localhost nova_compute[297686]: 2025-10-14 10:26:45.967 2 DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Oct 14 06:26:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad. Oct 14 06:26:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec. Oct 14 06:26:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29. Oct 14 06:26:47 localhost podman[343265]: 2025-10-14 10:26:47.739461522 +0000 UTC m=+0.073450470 container health_status aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, 
container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Oct 14 06:26:47 localhost systemd[1]: tmp-crun.Nr0owV.mount: Deactivated successfully. Oct 14 06:26:47 localhost podman[343264]: 2025-10-14 10:26:47.8047555 +0000 UTC m=+0.137504570 container health_status 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=multipathd, managed_by=edpm_ansible, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, org.label-schema.schema-version=1.0, 
org.label-schema.vendor=CentOS) Oct 14 06:26:47 localhost podman[343264]: 2025-10-14 10:26:47.814077926 +0000 UTC m=+0.146826986 container exec_died 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251009) Oct 14 06:26:47 localhost podman[343265]: 2025-10-14 10:26:47.827166979 +0000 UTC m=+0.161155917 container exec_died 
aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Oct 14 06:26:47 localhost systemd[1]: 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad.service: Deactivated successfully. Oct 14 06:26:47 localhost systemd[1]: aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec.service: Deactivated successfully. 
Oct 14 06:26:47 localhost podman[343266]: 2025-10-14 10:26:47.908389147 +0000 UTC m=+0.234784662 container health_status fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_id=iscsid, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2) Oct 14 06:26:47 localhost podman[343266]: 2025-10-14 10:26:47.921005994 +0000 UTC m=+0.247401509 container exec_died fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 
(image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, org.label-schema.schema-version=1.0, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_managed=true, container_name=iscsid) Oct 14 06:26:47 localhost systemd[1]: fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29.service: Deactivated successfully. Oct 14 06:26:48 localhost systemd[1]: tmp-crun.gekErY.mount: Deactivated successfully. 
Oct 14 06:26:49 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e270 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.830 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'name': 'test', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005486733.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '41187b090f3d4818a32baa37ce8a3991', 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'hostId': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228 Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.831 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.854 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.read.requests volume: 1272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.854 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.856 12 ERROR oslo_messaging.notify.messaging [-] Could not send 
notification to notifications. Payload={'message_id': 'a5692b24-0c84-49a7-ab75-17475ce0f72e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1272, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T10:26:49.831547', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '4b59e2c2-a8e8-11f0-9707-fa163e99780b', 'monotonic_time': 13626.024247962, 'message_signature': 'a944cada667558c97d6032fe39b41256e6459adeae43470a5f5b7569f49319fc'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T10:26:49.831547', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 
'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '4b59f7c6-a8e8-11f0-9707-fa163e99780b', 'monotonic_time': 13626.024247962, 'message_signature': '9ba9347cd9c40330b734365e636d973c7f8f5097e0153c7f46b02a88c1bc7793'}]}, 'timestamp': '2025-10-14 10:26:49.855396', '_unique_id': '258c4261d1a9430d9f5b33367c8f320d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.856 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.856 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.856 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.856 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.856 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.856 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.856 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.856 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.856 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.856 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.856 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.856 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.856 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.856 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.856 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.856 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 
06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.856 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.856 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.856 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.856 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.856 12 ERROR oslo_messaging.notify.messaging Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.856 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.856 12 ERROR oslo_messaging.notify.messaging Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.856 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.856 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.856 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.856 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:26:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.856 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.856 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.856 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.856 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.856 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.856 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.856 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.856 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.856 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.856 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.856 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.856 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.856 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.856 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.856 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.856 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.856 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.856 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.856 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.856 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:26:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.856 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.856 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.856 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.856 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.856 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.856 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.856 12 ERROR oslo_messaging.notify.messaging Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.858 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.876 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/cpu volume: 20000000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.878 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'c3fad918-5518-49a8-9135-b7652c876f37', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 20000000000, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'timestamp': '2025-10-14T10:26:49.858325', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '4b5d5632-a8e8-11f0-9707-fa163e99780b', 'monotonic_time': 13626.069076872, 'message_signature': '10359b85b791a4b716f2237306826f21f93ce86af22723c70c5c36e5f9d38c9f'}]}, 'timestamp': '2025-10-14 10:26:49.877537', '_unique_id': 'a348e453abef4c7aa0bc20b820b7c447'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.878 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in 
_reraise_as_library_errors Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.878 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.878 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.878 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.878 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.878 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.878 12 
ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.878 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.878 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.878 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.878 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.878 12 ERROR oslo_messaging.notify.messaging Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.878 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.878 12 ERROR oslo_messaging.notify.messaging Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.878 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 
2025-10-14 10:26:49.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.878 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.878 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.878 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.878 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.878 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 
06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.878 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.878 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.878 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.878 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.878 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:26:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.878 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.878 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.878 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.878 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.878 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.878 12 ERROR oslo_messaging.notify.messaging Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.880 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters Oct 14 06:26:49 
localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.883 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.incoming.bytes volume: 8190 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.885 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '67afc89e-25a3-4aca-ab8d-eb89ef8e3f5c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 8190, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T10:26:49.880588', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': '4b5e648c-a8e8-11f0-9707-fa163e99780b', 'monotonic_time': 13626.073333942, 
'message_signature': 'd328c4761d545a015a1f41648f53d6a7488c78079f5326725e956f466514e15d'}]}, 'timestamp': '2025-10-14 10:26:49.884418', '_unique_id': '3eb922833b2b41739d95cc8d4d2b2efc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.885 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.885 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.885 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.885 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.885 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.885 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.885 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.885 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.885 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.885 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.885 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.885 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.885 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.885 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.885 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.885 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.885 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.885 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.885 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.885 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:26:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.885 12 ERROR oslo_messaging.notify.messaging Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.885 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.885 12 ERROR oslo_messaging.notify.messaging Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.885 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.885 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.885 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.885 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.885 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.885 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.885 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.885 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", 
line 653, in _send Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.885 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.885 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.885 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.885 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.885 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.885 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.885 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.885 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.885 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.885 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.885 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.885 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.885 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.885 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.885 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.885 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.885 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.885 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.885 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.885 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:26:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.885 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.885 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.885 12 ERROR oslo_messaging.notify.messaging Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.886 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.887 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.outgoing.bytes volume: 10162 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.888 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '281b148f-2d7d-4897-ba87-e30408c8791a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 10162, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T10:26:49.886998', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': '4b5edbce-a8e8-11f0-9707-fa163e99780b', 'monotonic_time': 13626.073333942, 'message_signature': '42bac5c263e8ad1118ef252082f9dce3899facce2646fff686eaf18ed5620c6c'}]}, 'timestamp': '2025-10-14 10:26:49.887464', '_unique_id': '40ee01c0eeb94d228e6f6bab05c6008e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.888 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:26:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.888 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.888 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.888 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.888 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.888 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.888 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.888 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.888 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.888 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.888 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.888 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.888 12 ERROR oslo_messaging.notify.messaging Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.888 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.888 12 ERROR oslo_messaging.notify.messaging Oct 14 06:26:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.888 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.888 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.888 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.888 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.888 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.888 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.888 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.888 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.888 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.888 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:26:49 
localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.888 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.888 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.888 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.888 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.888 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.888 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.888 12 ERROR oslo_messaging.notify.messaging Oct 14 06:26:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.889 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.889 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.incoming.packets volume: 79 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.891 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd287a2fb-77cf-48a8-bfb5-7f6775f90842', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 79, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T10:26:49.889561', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 
'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': '4b5f43de-a8e8-11f0-9707-fa163e99780b', 'monotonic_time': 13626.073333942, 'message_signature': 'a7d5857abd2d998fe8fb5bcf9b029a17c8d57413b52aa25f3600e5d75f5e07f1'}]}, 'timestamp': '2025-10-14 10:26:49.890143', '_unique_id': '635a9c414dca4363ae8f2056f15b43a1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.891 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.891 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.891 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.891 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.891 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.891 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.891 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.891 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.891 12 ERROR 
oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.891 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.891 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.891 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.891 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.891 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.891 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.891 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.891 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.891 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.891 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 
2025-10-14 10:26:49.891 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.891 12 ERROR oslo_messaging.notify.messaging Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.891 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.891 12 ERROR oslo_messaging.notify.messaging Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.891 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.891 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.891 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.891 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.891 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.891 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.891 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:26:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.891 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.891 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.891 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.891 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.891 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.891 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.891 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.891 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.891 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.891 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 
06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.891 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.891 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.891 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.891 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.891 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.891 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.891 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.891 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.891 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.891 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.891 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.891 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.891 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.891 12 ERROR oslo_messaging.notify.messaging Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.892 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.892 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.893 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '5400b644-a8f1-4f34-8465-0350e9788927', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T10:26:49.892237', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': '4b5fa84c-a8e8-11f0-9707-fa163e99780b', 'monotonic_time': 13626.073333942, 'message_signature': '0390d983d7bb579f51b4d1294bf05f074924186fddce93f140cfdfe6907bcbeb'}]}, 'timestamp': '2025-10-14 10:26:49.892725', '_unique_id': 'c104173f1b214fe980d48045a90e5bd1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.893 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:26:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.893 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.893 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.893 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.893 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.893 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.893 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.893 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.893 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.893 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.893 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.893 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.893 12 ERROR oslo_messaging.notify.messaging Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.893 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.893 12 ERROR oslo_messaging.notify.messaging Oct 14 06:26:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.893 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.893 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.893 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.893 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.893 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.893 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.893 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.893 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.893 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.893 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:26:49 
localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.893 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.893 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.893 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.893 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.893 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.893 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.893 12 ERROR oslo_messaging.notify.messaging Oct 14 06:26:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.894 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.894 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.896 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'de8dbb4b-53ba-4de1-93dc-e9979b1b13bd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T10:26:49.894824', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 
'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': '4b600d1e-a8e8-11f0-9707-fa163e99780b', 'monotonic_time': 13626.073333942, 'message_signature': 'ddd2149240b3db5ea6ad1d32eccc6e1b2de564be659165717faa030154e6a775'}]}, 'timestamp': '2025-10-14 10:26:49.895276', '_unique_id': 'a704aaf53cb2414cbdd6c8d7adcd45bd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.896 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.896 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.896 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.896 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 
10:26:49.896 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.896 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.896 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.896 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.896 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.896 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:26:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.896 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.896 12 ERROR oslo_messaging.notify.messaging Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.896 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.896 12 ERROR oslo_messaging.notify.messaging Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.896 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.896 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.896 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.896 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:26:49 
localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.896 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.896 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.896 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.896 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.896 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) 
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.896 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.896 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.896 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.896 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.896 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.896 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.896 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.896 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.896 12 ERROR oslo_messaging.notify.messaging Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.897 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.897 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.write.latency volume: 1178650666 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.897 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.write.latency volume: 22746151 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.899 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'ebafb1d1-1695-4419-8303-b6cf948aca4e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1178650666, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T10:26:49.897327', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '4b606f16-a8e8-11f0-9707-fa163e99780b', 'monotonic_time': 13626.024247962, 'message_signature': '2336bae89c51a7ee75a96c9ce5b385505bc1bb00826dc932788502e47c811e03'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 22746151, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T10:26:49.897327', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '4b6082b2-a8e8-11f0-9707-fa163e99780b', 'monotonic_time': 13626.024247962, 'message_signature': 'd368c3bd0ae6ef72c83c11014f69ac75e9e9aa78a655654e158d4b0e030a5b24'}]}, 'timestamp': '2025-10-14 10:26:49.898258', '_unique_id': 'a304964b77944dfea6abb0f2d46f8675'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.899 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.899 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.899 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.899 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.899 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.899 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.899 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.899 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.899 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 
06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.899 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.899 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.899 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.899 12 ERROR oslo_messaging.notify.messaging Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.899 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.899 12 ERROR oslo_messaging.notify.messaging Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.899 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.899 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:26:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.899 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.899 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.899 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.899 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.899 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.899 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.899 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.899 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.899 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.899 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:26:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.899 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.899 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.899 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.899 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.899 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.900 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.900 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.write.requests volume: 53 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.900 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.902 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '19cb40d0-ea05-4614-9adb-774717bb22da', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 53, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T10:26:49.900464', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '4b60ea54-a8e8-11f0-9707-fa163e99780b', 'monotonic_time': 13626.024247962, 'message_signature': '69b4726808e9cc903c05eee930733b7c9994c007a994330403504ea3f9ad0f1d'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T10:26:49.900464', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '4b60fa8a-a8e8-11f0-9707-fa163e99780b', 'monotonic_time': 13626.024247962, 'message_signature': 'acae3d203e6546ce0c192ec2d0f7c1ebe2d5c8d219ee9a002eb57dc88764c439'}]}, 'timestamp': '2025-10-14 10:26:49.901322', '_unique_id': '599e68ca99e24fc7969de69df1d0fc40'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.902 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.902 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.902 12 ERROR oslo_messaging.notify.messaging yield
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.902 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.902 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.902 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.902 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.902 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.902 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.902 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.902 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.902 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.902 12 ERROR oslo_messaging.notify.messaging conn.connect()
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.902 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.902 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.902 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.902 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.902 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.902 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.902 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.902 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.902 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.902 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.902 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.902 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.902 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.902 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.902 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.902 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.902 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.902 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.902 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.902 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.902 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.902 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.902 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.902 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.902 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.902 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.902 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.902 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.902 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.902 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.902 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.902 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.902 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.902 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.902 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.902 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.902 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.902 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.902 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.902 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.902 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.904 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.904 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.906 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c6fe0f84-f6a6-4be9-8f36-da24fa264611', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T10:26:49.904452', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': '4b6186c6-a8e8-11f0-9707-fa163e99780b', 'monotonic_time': 13626.073333942, 'message_signature': '21278bf218e0a07c40a4b0741d3050ba8f7dede07c60d26776aa6b6ae9ddba41'}]}, 'timestamp': '2025-10-14 10:26:49.904955', '_unique_id': '9a8e91a7757a403fa05c9fd015b0f208'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.906 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.906 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.906 12 ERROR oslo_messaging.notify.messaging yield
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.906 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.906 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.906 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.906 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.906 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.906 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.906 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.906 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.906 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.906 12 ERROR oslo_messaging.notify.messaging conn.connect()
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.906 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.906 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.906 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.906 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.906 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.906 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.906 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.906 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.906 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.906 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.906 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.906 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.906 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.906 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.906 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.906 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.906 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.906 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.906 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.906 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.906 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.906 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.906 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.906 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.906 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.906 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.906 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.906 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.906 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.906 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.906 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.906 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.906 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.906 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.906 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.906 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.906 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.906 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.906 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.906 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.906 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.907 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.908 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.outgoing.packets volume: 118 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.909 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5ca5f81c-e56d-4fe2-a8ce-cf8f05f982a3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 118, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T10:26:49.907988', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': '4b6213fc-a8e8-11f0-9707-fa163e99780b', 'monotonic_time': 13626.073333942, 'message_signature': 'd7554acc4d0776774fc3fd8068de5bdb4823d81df04a48ebccd0bdb16bb52fd6'}]}, 'timestamp': '2025-10-14 10:26:49.908715', '_unique_id': '7ec13bf0c2d14fb5be508fadd98d44ce'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.909 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.909 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.909 12 ERROR oslo_messaging.notify.messaging yield
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.909 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.909 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.909 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.909 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.909 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.909 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.909 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.909 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.909 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.909 12 ERROR oslo_messaging.notify.messaging conn.connect()
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.909 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.909 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.909 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.909 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.909 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.909 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.909 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.909 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.909 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.909 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.909 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.909 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.909 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.909 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.909 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.909 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.909 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.909 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.909 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.909 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.909 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.909 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.909 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.909 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.909 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.909 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.909 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.909 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.909 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.909 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.909 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.909 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.909 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.909 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.909 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.909 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.909 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.909 12 ERROR oslo_messaging.notify.messaging File
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.909 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.909 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.909 12 ERROR oslo_messaging.notify.messaging Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.911 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.911 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.923 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.924 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.927 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'b25c620c-31a4-43f3-a960-d73532cc85e0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T10:26:49.912040', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '4b648678-a8e8-11f0-9707-fa163e99780b', 'monotonic_time': 13626.104777489, 'message_signature': '16acec9744e97af34d749f6741faf468f1e28211dd25c99907608806c687a5a2'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T10:26:49.912040', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 
'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '4b64a0c2-a8e8-11f0-9707-fa163e99780b', 'monotonic_time': 13626.104777489, 'message_signature': '223b74e79abc499b74758da73bf9c630b006c75900b76e829b47063e22eee15a'}]}, 'timestamp': '2025-10-14 10:26:49.925344', '_unique_id': 'e9459e2d272445f49c9811fd43b6f03b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.927 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.927 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.927 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.927 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.927 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.927 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.927 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.927 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.927 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 06:26:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.927 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.927 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.927 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.927 12 ERROR oslo_messaging.notify.messaging Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.927 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.927 12 ERROR oslo_messaging.notify.messaging Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.927 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.927 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 
10:26:49.927 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.927 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.927 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.927 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.927 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 06:26:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.927 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.927 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.927 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.927 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.927 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 
10:26:49.927 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.927 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.927 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.927 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.927 12 ERROR oslo_messaging.notify.messaging Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.928 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.928 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.929 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.931 
12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '511ee1a6-98db-4448-85a3-469f8e4a7467', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T10:26:49.928855', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '4b65419e-a8e8-11f0-9707-fa163e99780b', 'monotonic_time': 13626.104777489, 'message_signature': 'e520d6eb0218a3941f765b94c6ab1cad9914749fc57eab3e2d5aa4170de847c8'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T10:26:49.928855', 'resource_metadata': {'display_name': 'test', 
'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '4b655b48-a8e8-11f0-9707-fa163e99780b', 'monotonic_time': 13626.104777489, 'message_signature': 'afd7ecc84e1bcf3e2a866abc6322fa0825a19a61920f9f642a894b634888b2fc'}]}, 'timestamp': '2025-10-14 10:26:49.930105', '_unique_id': 'dd2a002f31534f67912294ec4256fbde'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.931 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.931 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.931 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 
10:26:49.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.931 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.931 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.931 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.931 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.931 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 
129, in connect Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.931 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.931 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.931 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.931 12 ERROR oslo_messaging.notify.messaging Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.931 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.931 12 ERROR oslo_messaging.notify.messaging Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.931 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.931 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:26:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.931 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.931 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.931 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.931 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.931 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.931 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.931 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.931 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.931 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.931 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:26:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.931 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.931 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.931 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.931 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.931 12 ERROR oslo_messaging.notify.messaging Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.932 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.933 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.933 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:26:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.935 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '63a530c3-5f5e-4267-9c1d-e1354b5fee07', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T10:26:49.933209', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': '4b65ec5c-a8e8-11f0-9707-fa163e99780b', 'monotonic_time': 13626.073333942, 'message_signature': 'e7f2021f4cf4e5bbee8420bd8f11af137d025e715e6593779831f3f644dd6fc2'}]}, 'timestamp': '2025-10-14 10:26:49.933939', '_unique_id': 'fcde4bd55d0f4271b9cfb86bf5d79e23'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.935 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.935 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.935 12 ERROR oslo_messaging.notify.messaging yield
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.935 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.935 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.935 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.935 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.935 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.935 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.935 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.935 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.935 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.935 12 ERROR oslo_messaging.notify.messaging conn.connect()
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.935 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.935 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.935 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.935 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.935 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.935 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.935 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.935 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.935 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.935 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.935 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.935 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.935 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.935 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.935 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.935 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.935 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.935 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.935 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.935 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.935 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.935 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.935 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.935 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.935 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.935 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.935 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.935 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.935 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.935 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.935 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.935 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.935 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.935 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.935 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.935 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.935 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.935 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.935 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.935 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.935 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.936 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.936 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.937 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.939 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bc710023-9ed7-41e9-bf06-ac3146405c75', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T10:26:49.936743', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '4b66758c-a8e8-11f0-9707-fa163e99780b', 'monotonic_time': 13626.104777489, 'message_signature': 'd55e76d431852f8ad53bf6135943e045a03e3011c4af6696b65f8526be896239'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T10:26:49.936743', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '4b668bda-a8e8-11f0-9707-fa163e99780b', 'monotonic_time': 13626.104777489, 'message_signature': '56779fe543b2a2c5809cb0a5a4f7498f790da3814d4037a696b9bb3543af6568'}]}, 'timestamp': '2025-10-14 10:26:49.937979', '_unique_id': '85faaf33d5f64170ab1f05b1dd368092'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.939 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.939 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.939 12 ERROR oslo_messaging.notify.messaging yield
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.939 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.939 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.939 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.939 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.939 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.939 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.939 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.939 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.939 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.939 12 ERROR oslo_messaging.notify.messaging conn.connect()
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.939 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.939 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.939 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.939 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.939 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.939 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.939 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.939 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.939 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.939 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.939 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.939 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.939 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.939 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.939 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.939 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.939 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.939 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.939 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.939 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.939 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.939 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.939 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.939 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.939 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.939 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.939 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.939 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.939 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.939 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.939 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.939 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.939 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.939 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.939 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.939 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.939 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.939 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.939 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.939 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.939 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.940 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.940 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/memory.usage volume: 51.81640625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.942 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9112665f-2c66-43a1-af42-4ad8bddddd63', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.81640625, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'timestamp': '2025-10-14T10:26:49.940583', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '4b670aec-a8e8-11f0-9707-fa163e99780b', 'monotonic_time': 13626.069076872, 'message_signature': '3eb307360f01e9e5711001f48673cfe4aff456bf73f4259a80dc88a7c5f4eb4b'}]}, 'timestamp': '2025-10-14 10:26:49.941082', '_unique_id': '404e40a392ef45428aad1aafe0c96202'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.942 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.942 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.942 12 ERROR oslo_messaging.notify.messaging yield
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.942 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.942 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.942 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.942 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.942 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.942 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.942 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.942 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.942 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.942 12 ERROR oslo_messaging.notify.messaging conn.connect()
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.942 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.942 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.942 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.942 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.942 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.942 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.942 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.942 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.942 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.942 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.942 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.942 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.942 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.942 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.942 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.942 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.942 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.942 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.942 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.942 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.942 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.942 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.942 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.942 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.942 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.942 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.942 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.942 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.942 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.942 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.942 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.942 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.942 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.942 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.942 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.942 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.942 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.942 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.942 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.942 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.942 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.943 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Oct 14 06:26:49
localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.943 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.write.bytes volume: 454656 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.943 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.945 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd38f69ad-6a77-4af1-a59f-d43caca65b0d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 454656, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T10:26:49.943275', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 
'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '4b677220-a8e8-11f0-9707-fa163e99780b', 'monotonic_time': 13626.024247962, 'message_signature': '44149dcddc6239c608e6ca0542f6042f4baf0fa4e94ba3539b8d06bc8f083b93'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T10:26:49.943275', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '4b678530-a8e8-11f0-9707-fa163e99780b', 'monotonic_time': 13626.024247962, 'message_signature': '4fa8ee203d5e75db12e0eaee37480583a857c31460672ecaf0decd87f06827ab'}]}, 'timestamp': '2025-10-14 10:26:49.944198', '_unique_id': '25e4e4e469c542139f4ef5001026a8ad'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.945 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.945 12 ERROR oslo_messaging.notify.messaging 
File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.945 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.945 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.945 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.945 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.945 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.945 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.945 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.945 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.945 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.945 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:26:49 
localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.945 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.945 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.945 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.945 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.945 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.945 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.945 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.945 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.945 12 ERROR oslo_messaging.notify.messaging Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.945 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.945 12 ERROR oslo_messaging.notify.messaging Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.945 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call 
last): Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.945 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.945 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.945 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.945 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.945 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.945 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.945 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.945 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.945 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.945 12 ERROR oslo_messaging.notify.messaging 
return rpc_common.ConnectionContext(self._connection_pool, Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.945 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.945 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.945 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.945 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.945 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.945 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.945 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.945 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.945 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.945 12 ERROR oslo_messaging.notify.messaging 
self.connection.ensure_connection( Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.945 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.945 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.945 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.945 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.945 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.945 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.945 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.945 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.945 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.945 12 ERROR oslo_messaging.notify.messaging Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.946 12 DEBUG ceilometer.polling.manager [-] Skip pollster 
disk.device.latency, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.946 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.946 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.948 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a70268a4-123e-477b-afab-10153da41711', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T10:26:49.946706', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 
'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': '4b67f902-a8e8-11f0-9707-fa163e99780b', 'monotonic_time': 13626.073333942, 'message_signature': 'f2959ad7d20a4917cbb901802d8a9a5eeff54e3104127c9c25e439314cb0f35a'}]}, 'timestamp': '2025-10-14 10:26:49.947189', '_unique_id': '97eb663d070f41b2b5c565e3ad31b1bc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.948 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.948 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.948 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.948 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.948 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.948 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.948 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.948 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.948 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.948 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.948 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.948 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.948 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.948 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.948 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.948 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.948 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.948 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:26:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.948 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.948 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.948 12 ERROR oslo_messaging.notify.messaging Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.948 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.948 12 ERROR oslo_messaging.notify.messaging Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.948 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.948 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.948 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.948 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.948 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.948 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:26:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.948 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.948 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.948 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.948 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.948 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.948 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.948 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.948 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.948 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.948 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:26:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.948 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.948 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.948 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.948 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.948 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.948 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.948 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.948 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.948 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.948 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.948 12 ERROR 
oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.948 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.948 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.948 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.948 12 ERROR oslo_messaging.notify.messaging Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.949 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.949 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.949 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.read.bytes volume: 35560448 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.949 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.951 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to 
notifications. Payload={'message_id': '53dd9ba3-6211-4831-a217-5326488479c2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35560448, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T10:26:49.949452', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '4b686478-a8e8-11f0-9707-fa163e99780b', 'monotonic_time': 13626.024247962, 'message_signature': 'd624f839991390127721b951d33499a5af5a04ff9fb26f207793443c8bec175e'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T10:26:49.949452', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '4b6875e4-a8e8-11f0-9707-fa163e99780b', 'monotonic_time': 13626.024247962, 'message_signature': 'f75becb0f07f81684f6c4d23db18e303d3a9d0b37ceb2e4f909b188892816460'}]}, 'timestamp': '2025-10-14 10:26:49.950356', '_unique_id': '8d11c69af39f463eb539c61424718add'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.951 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.951 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.951 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.951 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.951 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.951 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.951 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.951 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.951 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.951 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.951 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.951 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.951 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.951 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.951 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.951 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 
06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.951 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.951 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.951 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.951 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.951 12 ERROR oslo_messaging.notify.messaging Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.951 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.951 12 ERROR oslo_messaging.notify.messaging Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.951 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.951 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.951 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.951 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:26:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.951 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.951 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.951 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.951 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.951 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.951 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.951 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.951 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.951 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.951 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.951 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.951 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.951 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.951 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.951 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.951 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.951 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.951 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.951 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.951 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:26:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.951 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.951 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.951 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.951 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.951 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.951 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.951 12 ERROR oslo_messaging.notify.messaging Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.951 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.951 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.read.latency volume: 1739521236 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.952 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/disk.device.read.latency volume: 110324003 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:26:49 
localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.952 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8f4c0319-237c-48f6-adbb-f50fdaf650ee', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1739521236, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c-vda', 'timestamp': '2025-10-14T10:26:49.951833', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '4b68bca2-a8e8-11f0-9707-fa163e99780b', 'monotonic_time': 13626.024247962, 'message_signature': 'c450b4d09c65c5857a5adb271ff955fde460f49e3bbb3fef3f396aaa8ab49e73'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 110324003, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 
'88c4e366-b765-47a6-96bf-f7677f2ce67c-vdb', 'timestamp': '2025-10-14T10:26:49.951833', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '4b68c67a-a8e8-11f0-9707-fa163e99780b', 'monotonic_time': 13626.024247962, 'message_signature': '9ae0d0081d8b4358f0974763012c7b0324dbf7c9d6ba2c61e11a4e84df306d9b'}]}, 'timestamp': '2025-10-14 10:26:49.952359', '_unique_id': '3d44451a61644dcca6de533bd17fbe06'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.952 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.952 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.952 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.952 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.952 12 
ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.952 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.952 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.952 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.952 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.952 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.952 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.952 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.952 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.952 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.952 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:26:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.952 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.952 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.952 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.952 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.952 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.952 12 ERROR oslo_messaging.notify.messaging Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.952 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.952 12 ERROR oslo_messaging.notify.messaging Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.952 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.952 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.952 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.952 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.952 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.952 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.952 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.952 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.952 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.952 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.952 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.952 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.952 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Oct 14 06:26:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.952 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.952 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.952 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.952 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.952 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.952 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.952 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.952 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.952 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.952 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.952 
12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.952 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.952 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.952 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.952 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.952 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.952 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.952 12 ERROR oslo_messaging.notify.messaging Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.953 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.953 12 DEBUG ceilometer.compute.pollsters [-] 88c4e366-b765-47a6-96bf-f7677f2ce67c/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.954 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification 
to notifications. Payload={'message_id': 'e0795f2f-e869-45d2-9e34-0f975d6ab9fe', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9d85e6ce130c46ec855f37147dbb08b4', 'user_name': None, 'project_id': '41187b090f3d4818a32baa37ce8a3991', 'project_name': None, 'resource_id': 'instance-00000002-88c4e366-b765-47a6-96bf-f7677f2ce67c-tap3ec9b060-f4', 'timestamp': '2025-10-14T10:26:49.953690', 'resource_metadata': {'display_name': 'test', 'name': 'tap3ec9b060-f4', 'instance_id': '88c4e366-b765-47a6-96bf-f7677f2ce67c', 'instance_type': 'm1.small', 'host': '4ab9546306a0749c4b6417f31330622670eacba865483f316929dd2e', 'instance_host': 'np0005486733.localdomain', 'flavor': {'id': '36e4c2a8-ca99-4c45-8719-dd5129265531', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d'}, 'image_ref': '0c25fd0b-0cde-472d-a2dc-9e548eac7c4d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:84:5e:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ec9b060-f4'}, 'message_id': '4b6908a6-a8e8-11f0-9707-fa163e99780b', 'monotonic_time': 13626.073333942, 'message_signature': '36b4dd98d8396b04693c2b1fb127bd884500721909acb9e8d62b5a8062f9a19e'}]}, 'timestamp': '2025-10-14 10:26:49.954060', '_unique_id': 'd985234a0b854741ac7e8f05651e02c8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.954 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Oct 14 06:26:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.954 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.954 12 ERROR oslo_messaging.notify.messaging yield Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.954 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.954 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.954 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.954 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.954 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.954 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.954 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.954 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.954 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.954 12 ERROR oslo_messaging.notify.messaging conn.connect() Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.954 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.954 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.954 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.954 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.954 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.954 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.954 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.954 12 ERROR oslo_messaging.notify.messaging Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.954 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.954 12 ERROR oslo_messaging.notify.messaging Oct 14 06:26:49 localhost 
ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.954 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.954 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.954 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.954 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.954 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.954 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.954 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.954 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.954 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.954 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.954 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.954 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.954 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.954 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.954 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.954 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.954 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.954 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.954 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.954 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.954 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.954 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.954 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.954 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.954 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.954 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.954 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.954 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.954 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.954 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 06:26:49 localhost ceilometer_agent_compute[245517]: 2025-10-14 10:26:49.954 12 ERROR oslo_messaging.notify.messaging
Oct 14 06:26:50 localhost ceph-mon[317114]:
mon.np0005486733@2(peon).osd e271 e271: 6 total, 6 up, 6 in Oct 14 06:26:50 localhost nova_compute[297686]: 2025-10-14 10:26:50.972 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Oct 14 06:26:50 localhost nova_compute[297686]: 2025-10-14 10:26:50.974 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Oct 14 06:26:50 localhost nova_compute[297686]: 2025-10-14 10:26:50.974 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Oct 14 06:26:50 localhost nova_compute[297686]: 2025-10-14 10:26:50.974 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Oct 14 06:26:51 localhost nova_compute[297686]: 2025-10-14 10:26:51.019 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:26:51 localhost nova_compute[297686]: 2025-10-14 10:26:51.019 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Oct 14 06:26:51 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e272 e272: 6 total, 6 up, 6 in Oct 14 06:26:53 localhost nova_compute[297686]: 2025-10-14 10:26:53.105 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:26:53 localhost ovn_metadata_agent[163050]: 2025-10-14 10:26:53.105 163055 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=24, ssl=[], 
options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'b6:6b:50', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '6a:59:81:01:bc:8b'}, ipsec=False) old=SB_Global(nb_cfg=23) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Oct 14 06:26:53 localhost ovn_metadata_agent[163050]: 2025-10-14 10:26:53.107 163055 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m Oct 14 06:26:53 localhost ovn_metadata_agent[163050]: 2025-10-14 10:26:53.109 163055 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=9e4b0f79-1220-4c7d-a18d-fa1a88dab362, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '24'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Oct 14 06:26:54 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e272 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Oct 14 06:26:56 localhost nova_compute[297686]: 2025-10-14 10:26:56.020 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:26:57 localhost ovn_metadata_agent[163050]: 2025-10-14 10:26:57.797 163055 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Oct 14 06:26:57 localhost ovn_metadata_agent[163050]: 2025-10-14 10:26:57.798 163055 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by 
"neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Oct 14 06:26:57 localhost ovn_metadata_agent[163050]: 2025-10-14 10:26:57.798 163055 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Oct 14 06:26:58 localhost podman[248187]: time="2025-10-14T10:26:58Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Oct 14 06:26:58 localhost podman[248187]: @ - - [14/Oct/2025:10:26:58 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 147497 "" "Go-http-client/1.1" Oct 14 06:26:58 localhost podman[248187]: @ - - [14/Oct/2025:10:26:58 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19881 "" "Go-http-client/1.1" Oct 14 06:26:59 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e272 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Oct 14 06:27:00 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e273 e273: 6 total, 6 up, 6 in Oct 14 06:27:01 localhost nova_compute[297686]: 2025-10-14 10:27:01.022 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:27:01 localhost nova_compute[297686]: 2025-10-14 10:27:01.024 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:27:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f. 
Oct 14 06:27:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917. Oct 14 06:27:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d. Oct 14 06:27:03 localhost systemd[1]: tmp-crun.8CenWk.mount: Deactivated successfully. Oct 14 06:27:03 localhost podman[343325]: 2025-10-14 10:27:03.754361655 +0000 UTC m=+0.088033118 container health_status 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., version=9.6, build-date=2025-08-20T13:12:41, distribution-scope=public, name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, config_id=edpm, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, vendor=Red Hat, Inc., io.openshift.expose-services=, maintainer=Red Hat, Inc., architecture=x86_64, com.redhat.component=ubi9-minimal-container) Oct 14 06:27:03 localhost podman[343325]: 2025-10-14 10:27:03.761704441 +0000 UTC m=+0.095375904 container exec_died 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, config_data={'image': 
'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, architecture=x86_64, container_name=openstack_network_exporter, managed_by=edpm_ansible, distribution-scope=public, name=ubi9-minimal, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, config_id=edpm) Oct 14 06:27:03 localhost systemd[1]: 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917.service: Deactivated successfully. Oct 14 06:27:03 localhost systemd[1]: tmp-crun.3aRvZ9.mount: Deactivated successfully. Oct 14 06:27:03 localhost podman[343326]: 2025-10-14 10:27:03.805214209 +0000 UTC m=+0.132100704 container health_status 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', 
'/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true) Oct 14 06:27:03 localhost podman[343326]: 2025-10-14 10:27:03.813911657 +0000 UTC m=+0.140798142 container exec_died 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.license=GPLv2, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', 
'/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, managed_by=edpm_ansible, tcib_managed=true) Oct 14 06:27:03 localhost systemd[1]: 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d.service: Deactivated successfully. Oct 14 06:27:03 localhost podman[343324]: 2025-10-14 10:27:03.891828283 +0000 UTC m=+0.224943859 container health_status 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}) Oct 14 06:27:03 localhost podman[343324]: 2025-10-14 10:27:03.941927823 +0000 UTC m=+0.275043369 container exec_died 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, config_id=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2) Oct 14 06:27:03 localhost systemd[1]: 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f.service: Deactivated successfully. 
Oct 14 06:27:04 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e273 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct 14 06:27:06 localhost nova_compute[297686]: 2025-10-14 10:27:06.025 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 14 06:27:06 localhost nova_compute[297686]: 2025-10-14 10:27:06.255 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 14 06:27:08 localhost openstack_network_exporter[250374]: ERROR 10:27:08 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 14 06:27:08 localhost openstack_network_exporter[250374]: ERROR 10:27:08 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 14 06:27:08 localhost openstack_network_exporter[250374]: ERROR 10:27:08 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 14 06:27:08 localhost openstack_network_exporter[250374]: ERROR 10:27:08 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 14 06:27:08 localhost openstack_network_exporter[250374]:
Oct 14 06:27:08 localhost openstack_network_exporter[250374]: ERROR 10:27:08 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 14 06:27:08 localhost openstack_network_exporter[250374]:
Oct 14 06:27:08 localhost nova_compute[297686]: 2025-10-14 10:27:08.889 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks
/usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:27:09 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e273 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Oct 14 06:27:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041. Oct 14 06:27:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857. Oct 14 06:27:10 localhost podman[343389]: 2025-10-14 10:27:10.73791375 +0000 UTC m=+0.079505196 container health_status 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter) Oct 14 06:27:10 localhost podman[343389]: 2025-10-14 10:27:10.771195674 +0000 UTC m=+0.112787150 container exec_died 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 
'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Oct 14 06:27:10 localhost podman[343390]: 2025-10-14 10:27:10.781644805 +0000 UTC m=+0.120073204 container health_status 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', 
'/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3) Oct 14 06:27:10 localhost systemd[1]: 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041.service: Deactivated successfully. Oct 14 06:27:10 localhost podman[343390]: 2025-10-14 10:27:10.789354522 +0000 UTC m=+0.127782921 container exec_died 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251009, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, container_name=ovn_metadata_agent, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', 
'/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0) Oct 14 06:27:10 localhost systemd[1]: 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857.service: Deactivated successfully. Oct 14 06:27:11 localhost nova_compute[297686]: 2025-10-14 10:27:11.029 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:27:12 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e274 e274: 6 total, 6 up, 6 in Oct 14 06:27:14 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e274 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Oct 14 06:27:16 localhost nova_compute[297686]: 2025-10-14 10:27:16.032 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Oct 14 06:27:16 localhost nova_compute[297686]: 2025-10-14 10:27:16.033 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Oct 14 06:27:16 localhost nova_compute[297686]: 2025-10-14 10:27:16.034 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run 
/usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Oct 14 06:27:16 localhost nova_compute[297686]: 2025-10-14 10:27:16.034 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Oct 14 06:27:16 localhost nova_compute[297686]: 2025-10-14 10:27:16.072 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:27:16 localhost nova_compute[297686]: 2025-10-14 10:27:16.073 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Oct 14 06:27:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad. Oct 14 06:27:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec. Oct 14 06:27:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29. 
Oct 14 06:27:18 localhost podman[343429]: 2025-10-14 10:27:18.738728077 +0000 UTC m=+0.080511086 container health_status 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, config_id=multipathd, tcib_managed=true) Oct 14 06:27:18 localhost podman[343429]: 2025-10-14 10:27:18.754282776 +0000 UTC m=+0.096065745 container exec_died 
02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=multipathd, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS) Oct 14 06:27:18 localhost systemd[1]: 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad.service: Deactivated successfully. 
Oct 14 06:27:18 localhost podman[343436]: 2025-10-14 10:27:18.801486458 +0000 UTC m=+0.129615288 container health_status fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20251009, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team) Oct 14 06:27:18 localhost podman[343436]: 2025-10-14 10:27:18.808140052 +0000 UTC m=+0.136268892 container exec_died fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 
(image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, config_id=iscsid, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2) Oct 14 06:27:18 localhost systemd[1]: fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29.service: Deactivated successfully. 
Oct 14 06:27:18 localhost podman[343430]: 2025-10-14 10:27:18.851775004 +0000 UTC m=+0.185549857 container health_status aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Oct 14 06:27:18 localhost podman[343430]: 2025-10-14 10:27:18.886995507 +0000 UTC m=+0.220770360 container exec_died aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 
'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors ) Oct 14 06:27:18 localhost systemd[1]: aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec.service: Deactivated successfully. 
Oct 14 06:27:19 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e274 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Oct 14 06:27:20 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e275 e275: 6 total, 6 up, 6 in Oct 14 06:27:21 localhost nova_compute[297686]: 2025-10-14 10:27:21.074 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Oct 14 06:27:21 localhost nova_compute[297686]: 2025-10-14 10:27:21.076 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Oct 14 06:27:21 localhost nova_compute[297686]: 2025-10-14 10:27:21.076 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Oct 14 06:27:21 localhost nova_compute[297686]: 2025-10-14 10:27:21.077 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Oct 14 06:27:21 localhost nova_compute[297686]: 2025-10-14 10:27:21.104 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:27:21 localhost nova_compute[297686]: 2025-10-14 10:27:21.104 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Oct 14 06:27:24 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e275 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Oct 14 06:27:26 localhost nova_compute[297686]: 2025-10-14 10:27:26.106 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Oct 14 06:27:26 
localhost nova_compute[297686]: 2025-10-14 10:27:26.111 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Oct 14 06:27:26 localhost nova_compute[297686]: 2025-10-14 10:27:26.112 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5007 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Oct 14 06:27:26 localhost nova_compute[297686]: 2025-10-14 10:27:26.112 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Oct 14 06:27:26 localhost nova_compute[297686]: 2025-10-14 10:27:26.137 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:27:26 localhost nova_compute[297686]: 2025-10-14 10:27:26.138 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Oct 14 06:27:28 localhost podman[248187]: time="2025-10-14T10:27:28Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Oct 14 06:27:28 localhost podman[248187]: @ - - [14/Oct/2025:10:27:28 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 147497 "" "Go-http-client/1.1" Oct 14 06:27:28 localhost podman[248187]: @ - - [14/Oct/2025:10:27:28 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19882 "" "Go-http-client/1.1" Oct 14 06:27:29 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e275 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Oct 14 06:27:31 localhost nova_compute[297686]: 2025-10-14 10:27:31.139 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Oct 14 06:27:31 localhost nova_compute[297686]: 2025-10-14 10:27:31.141 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Oct 14 06:27:31 localhost nova_compute[297686]: 2025-10-14 10:27:31.141 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Oct 14 06:27:31 localhost nova_compute[297686]: 2025-10-14 10:27:31.141 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Oct 14 06:27:31 localhost nova_compute[297686]: 2025-10-14 10:27:31.181 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:27:31 localhost nova_compute[297686]: 2025-10-14 10:27:31.181 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Oct 14 06:27:31 localhost ovn_metadata_agent[163050]: 2025-10-14 10:27:31.566 163055 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=25, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'b6:6b:50', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '6a:59:81:01:bc:8b'}, ipsec=False) old=SB_Global(nb_cfg=24) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Oct 14 06:27:31 localhost nova_compute[297686]: 2025-10-14 10:27:31.567 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 
__log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:27:31 localhost ovn_metadata_agent[163050]: 2025-10-14 10:27:31.568 163055 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m Oct 14 06:27:32 localhost nova_compute[297686]: 2025-10-14 10:27:32.878 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:27:32 localhost nova_compute[297686]: 2025-10-14 10:27:32.898 2 DEBUG nova.compute.manager [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Triggering sync for uuid 88c4e366-b765-47a6-96bf-f7677f2ce67c _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m Oct 14 06:27:32 localhost nova_compute[297686]: 2025-10-14 10:27:32.898 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Acquiring lock "88c4e366-b765-47a6-96bf-f7677f2ce67c" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Oct 14 06:27:32 localhost nova_compute[297686]: 2025-10-14 10:27:32.899 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Lock "88c4e366-b765-47a6-96bf-f7677f2ce67c" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Oct 14 06:27:32 localhost nova_compute[297686]: 2025-10-14 10:27:32.921 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Lock "88c4e366-b765-47a6-96bf-f7677f2ce67c" 
"released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.022s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Oct 14 06:27:34 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e275 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Oct 14 06:27:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f. Oct 14 06:27:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917. Oct 14 06:27:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d. Oct 14 06:27:34 localhost systemd[1]: tmp-crun.PpLrta.mount: Deactivated successfully. Oct 14 06:27:34 localhost podman[343491]: 2025-10-14 10:27:34.731161329 +0000 UTC m=+0.069869829 container health_status 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, version=9.6, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': 
['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, distribution-scope=public, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., managed_by=edpm_ansible, vendor=Red Hat, Inc., name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.) 
Oct 14 06:27:34 localhost podman[343490]: 2025-10-14 10:27:34.800910854 +0000 UTC m=+0.138783429 container health_status 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5) Oct 14 06:27:34 localhost podman[343491]: 2025-10-14 10:27:34.812368347 +0000 UTC m=+0.151076957 container exec_died 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.openshift.tags=minimal rhel9, config_data={'image': 
'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, container_name=openstack_network_exporter, release=1755695350, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, name=ubi9-minimal, architecture=x86_64) Oct 14 06:27:34 localhost systemd[1]: 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917.service: Deactivated successfully. Oct 14 06:27:34 localhost podman[343490]: 2025-10-14 10:27:34.862904461 +0000 UTC m=+0.200777026 container exec_died 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, 
org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.vendor=CentOS, config_id=ovn_controller) Oct 14 06:27:34 localhost podman[343492]: 2025-10-14 10:27:34.765718862 +0000 UTC m=+0.098136409 container health_status 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_id=edpm, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251009, 
org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team) Oct 14 06:27:34 localhost podman[343492]: 2025-10-14 10:27:34.900104444 +0000 UTC m=+0.232521951 container exec_died 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.license=GPLv2, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute) Oct 14 06:27:34 localhost systemd[1]: 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d.service: Deactivated successfully. Oct 14 06:27:34 localhost systemd[1]: 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f.service: Deactivated successfully. Oct 14 06:27:35 localhost systemd[1]: tmp-crun.B8vARF.mount: Deactivated successfully. Oct 14 06:27:36 localhost nova_compute[297686]: 2025-10-14 10:27:36.216 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:27:37 localhost ovn_metadata_agent[163050]: 2025-10-14 10:27:37.571 163055 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=9e4b0f79-1220-4c7d-a18d-fa1a88dab362, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '25'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Oct 14 06:27:38 localhost nova_compute[297686]: 2025-10-14 10:27:38.277 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:27:38 localhost openstack_network_exporter[250374]: ERROR 10:27:38 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Oct 14 06:27:38 localhost openstack_network_exporter[250374]: ERROR 10:27:38 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Oct 14 06:27:38 localhost openstack_network_exporter[250374]: ERROR 10:27:38 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Oct 14 06:27:38 
localhost openstack_network_exporter[250374]: ERROR 10:27:38 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Oct 14 06:27:38 localhost openstack_network_exporter[250374]: Oct 14 06:27:38 localhost openstack_network_exporter[250374]: ERROR 10:27:38 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Oct 14 06:27:38 localhost openstack_network_exporter[250374]: Oct 14 06:27:39 localhost nova_compute[297686]: 2025-10-14 10:27:39.255 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:27:39 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e275 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Oct 14 06:27:41 localhost nova_compute[297686]: 2025-10-14 10:27:41.218 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Oct 14 06:27:41 localhost nova_compute[297686]: 2025-10-14 10:27:41.219 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Oct 14 06:27:41 localhost nova_compute[297686]: 2025-10-14 10:27:41.220 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Oct 14 06:27:41 localhost nova_compute[297686]: 2025-10-14 10:27:41.220 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Oct 14 06:27:41 localhost nova_compute[297686]: 2025-10-14 10:27:41.255 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task 
ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:27:41 localhost nova_compute[297686]: 2025-10-14 10:27:41.262 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:27:41 localhost nova_compute[297686]: 2025-10-14 10:27:41.262 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Oct 14 06:27:41 localhost nova_compute[297686]: 2025-10-14 10:27:41.283 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Oct 14 06:27:41 localhost nova_compute[297686]: 2025-10-14 10:27:41.284 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Oct 14 06:27:41 localhost nova_compute[297686]: 2025-10-14 10:27:41.284 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Oct 14 06:27:41 localhost nova_compute[297686]: 2025-10-14 10:27:41.284 2 DEBUG nova.compute.resource_tracker [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Auditing locally available compute resources for np0005486733.localdomain (node: np0005486733.localdomain) update_available_resource 
/usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Oct 14 06:27:41 localhost nova_compute[297686]: 2025-10-14 10:27:41.285 2 DEBUG oslo_concurrency.processutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Oct 14 06:27:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041. Oct 14 06:27:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857. Oct 14 06:27:41 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Oct 14 06:27:41 localhost ceph-mon[317114]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/3655426957' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Oct 14 06:27:41 localhost nova_compute[297686]: 2025-10-14 10:27:41.720 2 DEBUG oslo_concurrency.processutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.436s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Oct 14 06:27:41 localhost systemd[1]: tmp-crun.oNwEZO.mount: Deactivated successfully. 
Oct 14 06:27:41 localhost podman[343576]: 2025-10-14 10:27:41.759936535 +0000 UTC m=+0.098578723 container health_status 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.build-date=20251009, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0) Oct 14 06:27:41 localhost podman[343576]: 2025-10-14 10:27:41.794009092 +0000 UTC 
m=+0.132651230 container exec_died 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent) Oct 14 06:27:41 localhost nova_compute[297686]: 2025-10-14 10:27:41.798 2 DEBUG nova.virt.libvirt.driver [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] skipping disk for 
instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Oct 14 06:27:41 localhost nova_compute[297686]: 2025-10-14 10:27:41.799 2 DEBUG nova.virt.libvirt.driver [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Oct 14 06:27:41 localhost podman[343575]: 2025-10-14 10:27:41.803920868 +0000 UTC m=+0.144889207 container health_status 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm) Oct 14 06:27:41 localhost systemd[1]: 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857.service: Deactivated successfully. 
Oct 14 06:27:41 localhost podman[343575]: 2025-10-14 10:27:41.810306424 +0000 UTC m=+0.151274813 container exec_died 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Oct 14 06:27:41 localhost systemd[1]: 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041.service: Deactivated successfully. Oct 14 06:27:42 localhost nova_compute[297686]: 2025-10-14 10:27:42.017 2 WARNING nova.virt.libvirt.driver [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Oct 14 06:27:42 localhost nova_compute[297686]: 2025-10-14 10:27:42.019 2 DEBUG nova.compute.resource_tracker [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Hypervisor/Node resource view: name=np0005486733.localdomain free_ram=11059MB free_disk=41.83695602416992GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": 
"1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Oct 14 06:27:42 localhost nova_compute[297686]: 2025-10-14 10:27:42.020 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Oct 14 06:27:42 localhost nova_compute[297686]: 2025-10-14 10:27:42.020 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Oct 14 06:27:42 localhost nova_compute[297686]: 2025-10-14 10:27:42.123 2 DEBUG nova.compute.resource_tracker [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Instance 88c4e366-b765-47a6-96bf-f7677f2ce67c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Oct 14 06:27:42 localhost nova_compute[297686]: 2025-10-14 10:27:42.123 2 DEBUG nova.compute.resource_tracker [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Oct 14 06:27:42 localhost nova_compute[297686]: 2025-10-14 10:27:42.124 2 DEBUG nova.compute.resource_tracker [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Final resource view: name=np0005486733.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Oct 14 06:27:42 localhost nova_compute[297686]: 2025-10-14 10:27:42.188 2 DEBUG oslo_concurrency.processutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Oct 14 06:27:42 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Oct 14 06:27:42 localhost ceph-mon[317114]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.108:0/558245742' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Oct 14 06:27:42 localhost nova_compute[297686]: 2025-10-14 10:27:42.611 2 DEBUG oslo_concurrency.processutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.423s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Oct 14 06:27:42 localhost nova_compute[297686]: 2025-10-14 10:27:42.618 2 DEBUG nova.compute.provider_tree [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Inventory has not changed in ProviderTree for provider: 18c24273-aca2-4f08-be57-3188d558235e update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Oct 14 06:27:42 localhost nova_compute[297686]: 2025-10-14 10:27:42.633 2 DEBUG nova.scheduler.client.report [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Inventory has not changed for provider 18c24273-aca2-4f08-be57-3188d558235e based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Oct 14 06:27:42 localhost nova_compute[297686]: 2025-10-14 10:27:42.635 2 DEBUG nova.compute.resource_tracker [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Compute_service record updated for np0005486733.localdomain:np0005486733.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Oct 14 06:27:42 localhost nova_compute[297686]: 2025-10-14 10:27:42.636 2 DEBUG oslo_concurrency.lockutils [None 
req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.616s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Oct 14 06:27:43 localhost nova_compute[297686]: 2025-10-14 10:27:43.636 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:27:43 localhost nova_compute[297686]: 2025-10-14 10:27:43.637 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:27:43 localhost nova_compute[297686]: 2025-10-14 10:27:43.637 2 DEBUG nova.compute.manager [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Oct 14 06:27:43 localhost nova_compute[297686]: 2025-10-14 10:27:43.638 2 DEBUG nova.compute.manager [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Oct 14 06:27:44 localhost nova_compute[297686]: 2025-10-14 10:27:44.408 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Acquiring lock "refresh_cache-88c4e366-b765-47a6-96bf-f7677f2ce67c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Oct 14 06:27:44 localhost nova_compute[297686]: 2025-10-14 10:27:44.409 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Acquired lock 
"refresh_cache-88c4e366-b765-47a6-96bf-f7677f2ce67c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Oct 14 06:27:44 localhost nova_compute[297686]: 2025-10-14 10:27:44.409 2 DEBUG nova.network.neutron [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] [instance: 88c4e366-b765-47a6-96bf-f7677f2ce67c] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Oct 14 06:27:44 localhost nova_compute[297686]: 2025-10-14 10:27:44.409 2 DEBUG nova.objects.instance [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Lazy-loading 'info_cache' on Instance uuid 88c4e366-b765-47a6-96bf-f7677f2ce67c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Oct 14 06:27:44 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e275 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Oct 14 06:27:45 localhost nova_compute[297686]: 2025-10-14 10:27:45.144 2 DEBUG nova.network.neutron [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] [instance: 88c4e366-b765-47a6-96bf-f7677f2ce67c] Updating instance_info_cache with network_info: [{"id": "3ec9b060-f43d-4698-9c76-6062c70911d5", "address": "fa:16:3e:84:5e:e5", "network": {"id": "7d0cd696-bdd7-4e70-9512-eb0d23640314", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.46", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "41187b090f3d4818a32baa37ce8a3991", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": 
{"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3ec9b060-f4", "ovs_interfaceid": "3ec9b060-f43d-4698-9c76-6062c70911d5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Oct 14 06:27:45 localhost nova_compute[297686]: 2025-10-14 10:27:45.161 2 DEBUG oslo_concurrency.lockutils [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Releasing lock "refresh_cache-88c4e366-b765-47a6-96bf-f7677f2ce67c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Oct 14 06:27:45 localhost nova_compute[297686]: 2025-10-14 10:27:45.162 2 DEBUG nova.compute.manager [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] [instance: 88c4e366-b765-47a6-96bf-f7677f2ce67c] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Oct 14 06:27:45 localhost nova_compute[297686]: 2025-10-14 10:27:45.162 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:27:45 localhost nova_compute[297686]: 2025-10-14 10:27:45.163 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:27:45 localhost nova_compute[297686]: 2025-10-14 10:27:45.163 2 DEBUG nova.compute.manager [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... 
_reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Oct 14 06:27:45 localhost nova_compute[297686]: 2025-10-14 10:27:45.256 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:27:45 localhost nova_compute[297686]: 2025-10-14 10:27:45.256 2 DEBUG oslo_service.periodic_task [None req-73790bc8-c422-4190-8b1f-0d1b272a82ce - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Oct 14 06:27:45 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' Oct 14 06:27:45 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' Oct 14 06:27:45 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' Oct 14 06:27:45 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' Oct 14 06:27:45 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' Oct 14 06:27:45 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' Oct 14 06:27:45 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Oct 14 06:27:45 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' Oct 14 06:27:46 localhost nova_compute[297686]: 2025-10-14 10:27:46.264 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Oct 14 06:27:46 localhost 
nova_compute[297686]: 2025-10-14 10:27:46.266 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Oct 14 06:27:46 localhost nova_compute[297686]: 2025-10-14 10:27:46.266 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Oct 14 06:27:46 localhost nova_compute[297686]: 2025-10-14 10:27:46.266 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Oct 14 06:27:46 localhost nova_compute[297686]: 2025-10-14 10:27:46.303 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:27:46 localhost nova_compute[297686]: 2025-10-14 10:27:46.303 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Oct 14 06:27:46 localhost sshd[343785]: main: sshd: ssh-rsa algorithm is disabled Oct 14 06:27:48 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Oct 14 06:27:48 localhost ceph-mon[317114]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3059880668' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Oct 14 06:27:48 localhost ceph-mon[317114]: mon.np0005486733@2(peon) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Oct 14 06:27:48 localhost ceph-mon[317114]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.32:0/3059880668' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Oct 14 06:27:49 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e275 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Oct 14 06:27:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad. Oct 14 06:27:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec. Oct 14 06:27:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29. Oct 14 06:27:49 localhost ceph-mon[317114]: from='mgr.44286 172.18.0.106:0/3162921916' entity='mgr.np0005486731.swasqz' Oct 14 06:27:49 localhost podman[343788]: 2025-10-14 10:27:49.76620588 +0000 UTC m=+0.096761456 container health_status 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2) Oct 14 06:27:49 localhost podman[343789]: 2025-10-14 10:27:49.810468661 +0000 UTC m=+0.137891361 container health_status aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 
'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Oct 14 06:27:49 localhost podman[343789]: 2025-10-14 10:27:49.823202292 +0000 UTC m=+0.150625032 container exec_died aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm) Oct 14 06:27:49 localhost systemd[1]: aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec.service: Deactivated successfully. 
Oct 14 06:27:49 localhost podman[343788]: 2025-10-14 10:27:49.864166162 +0000 UTC m=+0.194721768 container exec_died 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251009) Oct 14 06:27:49 localhost podman[343790]: 2025-10-14 10:27:49.872299982 +0000 UTC m=+0.196920386 container health_status 
fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=iscsid, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, container_name=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}) Oct 14 06:27:49 localhost systemd[1]: 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad.service: Deactivated successfully. 
Oct 14 06:27:49 localhost podman[343790]: 2025-10-14 10:27:49.886157879 +0000 UTC m=+0.210778283 container exec_died fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, container_name=iscsid, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true) Oct 14 06:27:49 localhost systemd[1]: fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29.service: Deactivated successfully. 
Oct 14 06:27:51 localhost nova_compute[297686]: 2025-10-14 10:27:51.304 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Oct 14 06:27:51 localhost nova_compute[297686]: 2025-10-14 10:27:51.306 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Oct 14 06:27:51 localhost nova_compute[297686]: 2025-10-14 10:27:51.306 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Oct 14 06:27:51 localhost nova_compute[297686]: 2025-10-14 10:27:51.306 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Oct 14 06:27:51 localhost nova_compute[297686]: 2025-10-14 10:27:51.338 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:27:51 localhost nova_compute[297686]: 2025-10-14 10:27:51.338 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Oct 14 06:27:54 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e275 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Oct 14 06:27:56 localhost nova_compute[297686]: 2025-10-14 10:27:56.339 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Oct 14 06:27:56 localhost nova_compute[297686]: 2025-10-14 10:27:56.341 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Oct 14 06:27:56 localhost nova_compute[297686]: 2025-10-14 10:27:56.341 2 DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Oct 14 06:27:56 localhost nova_compute[297686]: 2025-10-14 10:27:56.341 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Oct 14 06:27:56 localhost nova_compute[297686]: 2025-10-14 10:27:56.374 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:27:56 localhost nova_compute[297686]: 2025-10-14 10:27:56.375 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Oct 14 06:27:57 localhost sshd[343847]: main: sshd: ssh-rsa algorithm is disabled Oct 14 06:27:57 localhost ovn_metadata_agent[163050]: 2025-10-14 10:27:57.799 163055 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Oct 14 06:27:57 localhost ovn_metadata_agent[163050]: 2025-10-14 10:27:57.800 163055 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Oct 14 06:27:57 localhost ovn_metadata_agent[163050]: 2025-10-14 10:27:57.800 163055 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Oct 14 06:27:57 localhost systemd-logind[760]: New session 74 of user zuul. 
Oct 14 06:27:57 localhost systemd[1]: Started Session 74 of User zuul. Oct 14 06:27:58 localhost python3[343869]: ansible-ansible.legacy.command Invoked with _raw_params=subscription-manager unregister#012 _uses_shell=True zuul_log_id=fa163ef9-e89a-2d7c-d7b0-00000000000c-1-overcloudnovacompute2 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Oct 14 06:27:58 localhost podman[248187]: time="2025-10-14T10:27:58Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Oct 14 06:27:58 localhost podman[248187]: @ - - [14/Oct/2025:10:27:58 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 147497 "" "Go-http-client/1.1" Oct 14 06:27:58 localhost podman[248187]: @ - - [14/Oct/2025:10:27:58 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19882 "" "Go-http-client/1.1" Oct 14 06:27:59 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e275 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Oct 14 06:28:01 localhost nova_compute[297686]: 2025-10-14 10:28:01.376 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Oct 14 06:28:01 localhost nova_compute[297686]: 2025-10-14 10:28:01.379 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Oct 14 06:28:01 localhost nova_compute[297686]: 2025-10-14 10:28:01.379 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Oct 14 06:28:01 localhost nova_compute[297686]: 2025-10-14 10:28:01.379 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: 
entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Oct 14 06:28:01 localhost nova_compute[297686]: 2025-10-14 10:28:01.413 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Oct 14 06:28:01 localhost nova_compute[297686]: 2025-10-14 10:28:01.414 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Oct 14 06:28:02 localhost systemd[1]: session-74.scope: Deactivated successfully. Oct 14 06:28:02 localhost systemd-logind[760]: Session 74 logged out. Waiting for processes to exit. Oct 14 06:28:02 localhost systemd-logind[760]: Removed session 74. Oct 14 06:28:04 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e275 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Oct 14 06:28:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f. Oct 14 06:28:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917. Oct 14 06:28:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d. Oct 14 06:28:05 localhost systemd[1]: tmp-crun.MCfJ7r.mount: Deactivated successfully. 
Oct 14 06:28:05 localhost podman[343873]: 2025-10-14 10:28:05.75400031 +0000 UTC m=+0.094828187 container health_status 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, io.buildah.version=1.41.3, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, container_name=ovn_controller, org.label-schema.license=GPLv2) Oct 14 06:28:05 localhost systemd[1]: tmp-crun.LDEJAX.mount: Deactivated successfully. 
Oct 14 06:28:05 localhost podman[343874]: 2025-10-14 10:28:05.776641766 +0000 UTC m=+0.109374705 container health_status 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, container_name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', 
'/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, io.buildah.version=1.33.7, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., name=ubi9-minimal, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, release=1755695350) Oct 14 06:28:05 localhost podman[343878]: 2025-10-14 10:28:05.831008348 +0000 UTC m=+0.157561147 container health_status 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=edpm, managed_by=edpm_ansible, org.label-schema.build-date=20251009, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', 
'/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0) Oct 14 06:28:05 localhost podman[343874]: 2025-10-14 10:28:05.841338865 +0000 UTC m=+0.174071724 container exec_died 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, distribution-scope=public, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, io.openshift.expose-services=, managed_by=edpm_ansible, version=9.6, container_name=openstack_network_exporter, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., architecture=x86_64, config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, name=ubi9-minimal, build-date=2025-08-20T13:12:41) Oct 14 06:28:05 localhost systemd[1]: 799ae74eb47b49e56b15e6d8a77dd3eeeca8208e775a683b2f1fad385e760917.service: Deactivated successfully. 
Oct 14 06:28:05 localhost podman[343873]: 2025-10-14 10:28:05.855106549 +0000 UTC m=+0.195934426 container exec_died 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible) Oct 14 06:28:05 localhost systemd[1]: 1db9ee4492090c899b9513063971bd6265aee39a42e22af85f73d01746cfc25f.service: Deactivated successfully. 
Oct 14 06:28:05 localhost podman[343878]: 2025-10-14 10:28:05.871279136 +0000 UTC m=+0.197831915 container exec_died 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_id=edpm, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image) Oct 14 06:28:05 localhost systemd[1]: 89f60abc3ba3b55bdd6ab3292515e6dc47ab9bc2707f457a841da8dc00ea527d.service: Deactivated successfully. 
Oct 14 06:28:06 localhost nova_compute[297686]: 2025-10-14 10:28:06.413 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 14 06:28:06 localhost nova_compute[297686]: 2025-10-14 10:28:06.417 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 14 06:28:08 localhost openstack_network_exporter[250374]: ERROR 10:28:08 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 14 06:28:08 localhost openstack_network_exporter[250374]: ERROR 10:28:08 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 14 06:28:08 localhost openstack_network_exporter[250374]: ERROR 10:28:08 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 14 06:28:08 localhost openstack_network_exporter[250374]: ERROR 10:28:08 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 14 06:28:08 localhost openstack_network_exporter[250374]:
Oct 14 06:28:08 localhost openstack_network_exporter[250374]: ERROR 10:28:08 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 14 06:28:08 localhost openstack_network_exporter[250374]:
Oct 14 06:28:09 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e275 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct 14 06:28:11 localhost nova_compute[297686]: 2025-10-14 10:28:11.419 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct 14 06:28:11 localhost nova_compute[297686]: 2025-10-14 10:28:11.420 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct 14 06:28:11 localhost nova_compute[297686]: 2025-10-14 10:28:11.421 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Oct 14 06:28:11 localhost nova_compute[297686]: 2025-10-14 10:28:11.421 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Oct 14 06:28:11 localhost nova_compute[297686]: 2025-10-14 10:28:11.447 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 14 06:28:11 localhost nova_compute[297686]: 2025-10-14 10:28:11.448 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Oct 14 06:28:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041.
Oct 14 06:28:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857.
Oct 14 06:28:12 localhost podman[343934]: 2025-10-14 10:28:12.753393662 +0000 UTC m=+0.089368079 container health_status 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct 14 06:28:12 localhost podman[343934]: 2025-10-14 10:28:12.762250705 +0000 UTC m=+0.098225212 container exec_died 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible)
Oct 14 06:28:12 localhost systemd[1]: 0fe94e4154ea83c90207f45a570fbe942906f662cc34350d9807e3e9a2de5041.service: Deactivated successfully.
Oct 14 06:28:12 localhost podman[343935]: 2025-10-14 10:28:12.798980064 +0000 UTC m=+0.130986969 container health_status 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team)
Oct 14 06:28:12 localhost podman[343935]: 2025-10-14 10:28:12.809509808 +0000 UTC m=+0.141516753 container exec_died 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Oct 14 06:28:12 localhost systemd[1]: 28c543a246be826aa024b2cfeec10e1cfee11fa9dc7fb2480cd6d7d513514857.service: Deactivated successfully.
Oct 14 06:28:14 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e275 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct 14 06:28:16 localhost nova_compute[297686]: 2025-10-14 10:28:16.449 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct 14 06:28:16 localhost nova_compute[297686]: 2025-10-14 10:28:16.451 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct 14 06:28:16 localhost nova_compute[297686]: 2025-10-14 10:28:16.451 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Oct 14 06:28:16 localhost nova_compute[297686]: 2025-10-14 10:28:16.452 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Oct 14 06:28:16 localhost nova_compute[297686]: 2025-10-14 10:28:16.489 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 14 06:28:16 localhost nova_compute[297686]: 2025-10-14 10:28:16.489 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Oct 14 06:28:19 localhost ceph-mon[317114]: mon.np0005486733@2(peon).osd e275 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct 14 06:28:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad.
Oct 14 06:28:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec.
Oct 14 06:28:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29.
Oct 14 06:28:20 localhost podman[343978]: 2025-10-14 10:28:20.744251533 +0000 UTC m=+0.079165895 container health_status aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Oct 14 06:28:20 localhost podman[343977]: 2025-10-14 10:28:20.800496302 +0000 UTC m=+0.138087787 container health_status 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, org.label-schema.build-date=20251009, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, container_name=multipathd, managed_by=edpm_ansible, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Oct 14 06:28:20 localhost podman[343977]: 2025-10-14 10:28:20.81309261 +0000 UTC m=+0.150684065 container exec_died 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_id=multipathd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team)
Oct 14 06:28:20 localhost systemd[1]: 02158cd72558131fa0e89bbb3b46a9807bed5c3e943b8cbc4d82fc8b46c842ad.service: Deactivated successfully.
Oct 14 06:28:20 localhost podman[343978]: 2025-10-14 10:28:20.832828007 +0000 UTC m=+0.167742429 container exec_died aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Oct 14 06:28:20 localhost systemd[1]: aaae23a066b4da9e009040084e049c0d04466d3ed32f932d6b8cd26dc04f44ec.service: Deactivated successfully.
Oct 14 06:28:20 localhost podman[343979]: 2025-10-14 10:28:20.892988337 +0000 UTC m=+0.225621929 container health_status fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, config_id=iscsid, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Oct 14 06:28:20 localhost podman[343979]: 2025-10-14 10:28:20.906067649 +0000 UTC m=+0.238701251 container exec_died fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, container_name=iscsid, managed_by=edpm_ansible, config_id=iscsid, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d)
Oct 14 06:28:20 localhost systemd[1]: fe891796c3843bedd071595927ad054fe034c4b8110ef87d9d39065d90cb8b29.service: Deactivated successfully.
Oct 14 06:28:21 localhost nova_compute[297686]: 2025-10-14 10:28:21.491 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct 14 06:28:21 localhost nova_compute[297686]: 2025-10-14 10:28:21.527 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct 14 06:28:21 localhost nova_compute[297686]: 2025-10-14 10:28:21.528 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5038 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Oct 14 06:28:21 localhost nova_compute[297686]: 2025-10-14 10:28:21.528 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Oct 14 06:28:21 localhost nova_compute[297686]: 2025-10-14 10:28:21.530 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 14 06:28:21 localhost nova_compute[297686]: 2025-10-14 10:28:21.530 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Oct 14 06:28:21 localhost sshd[344038]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 06:28:21 localhost systemd-logind[760]: New session 75 of user zuul.
Oct 14 06:28:21 localhost systemd[1]: Started Session 75 of User zuul.